From mboxrd@z Thu Jan 1 00:00:00 1970
From: "Gary Lin" <glin@suse.com>
To: "Feng, Bob C"
Cc: "devel@edk2.groups.io", Liming Gao, Steven Shi
Subject: Re: [edk2-devel] [Patch 04/10 V8] BaseTools: Decouple AutoGen Objects
Date: Thu, 22 Aug 2019 08:26:43 +0000
Message-ID: <20190822082630.GH2052@GaryWorkstation>
In-Reply-To: <08650203BA1BD64D8AD9B6D5D74A85D161527696@SHSMSX104.ccr.corp.intel.com>
References: <20190807042537.11928-1-bob.c.feng@intel.com>
 <20190807042537.11928-5-bob.c.feng@intel.com>
 <20190822080437.GF2052@GaryWorkstation>
 <08650203BA1BD64D8AD9B6D5D74A85D161527696@SHSMSX104.ccr.corp.intel.com>

On Thu, Aug 22, 2019 at 08:15:03AM +0000, Feng, Bob C wrote:
> Hi Gary,
>
> https://edk2.groups.io/g/devel/message/46196
>
> This patch is under review and it can fix this regression issue.
>
Oh, thanks! Will try the patch.
Gary Lin

> Thanks,
> Bob
>
> -----Original Message-----
> From: devel@edk2.groups.io [mailto:devel@edk2.groups.io] On Behalf Of Gary Lin
> Sent: Thursday, August 22, 2019 4:05 PM
> To: devel@edk2.groups.io; Feng, Bob C
> Cc: Gao, Liming; Shi, Steven
> Subject: Re: [edk2-devel] [Patch 04/10 V8] BaseTools: Decouple AutoGen Objects
>
> On Wed, Aug 07, 2019 at 12:25:31PM +0800, Bob Feng wrote:
> > BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1875
> >
> > 1. Separate AutoGen.py into 3 small py files.
> >    One is for the AutoGen base class, one for the WorkspaceAutoGen
> >    and PlatformAutoGen classes, and one for the ModuleAutoGen class.
> > 2. Create a new class DataPipe to store the Platform scope settings.
> >    Create a new class PlatformInfo to provide the same interface
> >    as PlatformAutoGen. The PlatformInfo class is initialized by a
> >    DataPipe instance.
> >    Create a new class WorkspaceInfo to provide the same interface
> >    as WorkspaceAutoGen. The WorkspaceInfo class is initialized by a
> >    DataPipe instance.
> > 3. Change ModuleAutoGen to depend on DataPipe, PlatformInfo and
> >    WorkspaceInfo. Remove the dependency of ModuleAutoGen on PlatformAutoGen.
> >
> I found a regression in the dependency check.
>
> When adding a driver, e.g. OpalPasswordPei, to OVMF without setting the
> library dependencies correctly, before this patch, 'build' would show the
> missing library and stop immediately. Now, 'build' doesn't complain at all
> and just starts compiling the code. It ends up with some strange errors due
> to the missing libraries.
>
> It's easy to reproduce the bug with the following patch:
>
> diff --git a/OvmfPkg/OvmfPkgX64.dsc b/OvmfPkg/OvmfPkgX64.dsc
> index 68073ef55b4d..7d67706612d1 100644
> --- a/OvmfPkg/OvmfPkgX64.dsc
> +++ b/OvmfPkg/OvmfPkgX64.dsc
> @@ -636,6 +636,7 @@ [Components]
>        NULL|SecurityPkg/Library/HashInstanceLibSha384/HashInstanceLibSha384.inf
>        NULL|SecurityPkg/Library/HashInstanceLibSha512/HashInstanceLibSha512.inf
>    }
> +  SecurityPkg/Tcg/Opal/OpalPassword/OpalPasswordPei.inf
>  !if $(TPM2_CONFIG_ENABLE) == TRUE
>    SecurityPkg/Tcg/Tcg2Config/Tcg2ConfigDxe.inf
>  !endif
>
> Gary Lin
>
> > Cc: Liming Gao
> > Cc: Steven Shi
> > Signed-off-by: Bob Feng
> > ---
> >  BaseTools/Source/Python/AutoGen/AutoGen.py    | 4264 +----------------
> >  BaseTools/Source/Python/AutoGen/DataPipe.py   |  147 +
> >  BaseTools/Source/Python/AutoGen/GenC.py       |    2 +-
> >  .../Source/Python/AutoGen/ModuleAutoGen.py    | 1908 ++++++++
> >  .../Python/AutoGen/ModuleAutoGenHelper.py     |  619 +++
> >  .../Source/Python/AutoGen/PlatformAutoGen.py  | 1505 ++++++
> >  .../Source/Python/AutoGen/WorkspaceAutoGen.py |  904 ++++
> >  BaseTools/Source/Python/Common/Misc.py        |    1 -
> >  .../Python/PatchPcdValue/PatchPcdValue.py     |    1 -
> >  .../Source/Python/Workspace/DscBuildData.py   |   10 +-
> >  .../Source/Python/Workspace/InfBuildData.py   |   29 +
> >  .../Python/Workspace/WorkspaceCommon.py       |    4 +
> >  .../Python/Workspace/WorkspaceDatabase.py     |    3 +
> >  BaseTools/Source/Python/build/BuildReport.py  |    4 +-
> >  BaseTools/Source/Python/build/build.py        |   51 +-
> >  15 files changed, 5204 insertions(+), 4248 deletions(-)
> >  create mode 100644 BaseTools/Source/Python/AutoGen/DataPipe.py
> >  create mode 100644 BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
> >  create mode 100644 BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py
> >  create mode 100644 BaseTools/Source/Python/AutoGen/PlatformAutoGen.py
> >  create mode 100644 BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py
> >
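
For illustration, below is a minimal, hypothetical sketch of the decoupling described in points 2 and 3 above; it is not the actual DataPipe/PlatformInfo code from this series. The idea it shows: platform-scope settings are snapshotted into a plain data container, and a thin wrapper exposes them through PlatformAutoGen-style attributes, so ModuleAutoGen can be constructed from the snapshot instead of holding a live PlatformAutoGen reference. All class and attribute names here are illustrative.

# Hypothetical sketch only; the class names are illustrative and do not
# reproduce the real DataPipe/PlatformInfo implementation from this patch.
class DataPipeSketch(object):
    def __init__(self, platform_autogen):
        # Capture plain, platform-wide settings from a PlatformAutoGen object.
        self._data = {
            "ToolChain":   platform_autogen.ToolChain,
            "BuildTarget": platform_autogen.BuildTarget,
            "Arch":        platform_autogen.Arch,
        }

    def Get(self, key, default=None):
        return self._data.get(key, default)

class PlatformInfoSketch(object):
    # Exposes the attribute names ModuleAutoGen already expects, but is
    # backed by the DataPipe snapshot rather than a live PlatformAutoGen.
    def __init__(self, data_pipe):
        self._pipe = data_pipe

    @property
    def ToolChain(self):
        return self._pipe.Get("ToolChain")

    @property
    def BuildTarget(self):
        return self._pipe.Get("BuildTarget")

    @property
    def Arch(self):
        return self._pipe.Get("Arch")
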
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/So= urce/Python/AutoGen/AutoGen.py > > index bb0da46d74a9..d9ee699d8f30 100644 > > --- a/BaseTools/Source/Python/AutoGen/AutoGen.py > > +++ b/BaseTools/Source/Python/AutoGen/AutoGen.py > > @@ -10,226 +10,11 @@ > > > > ## Import Modules > > # > > from __future__ import print_function > > from __future__ import absolute_import > > -import Common.LongFilePathOs as os > > -import re > > -import os.path as path > > -import copy > > -import uuid > > - > > -from . import GenC > > -from . import GenMake > > -from . import GenDepex > > -from io import BytesIO > > - > > -from .StrGather import * > > -from .BuildEngine import BuildRuleObj as BuildRule > > -from .BuildEngine import gDefaultBuildRuleFile,AutoGenReqBuildRuleVer= Num > > -import shutil > > -from Common.LongFilePathSupport import CopyLongFilePath > > -from Common.BuildToolError import * > > -from Common.DataType import * > > -from Common.Misc import * > > -from Common.StringUtils import * > > -import Common.GlobalData as GlobalData > > -from GenFds.FdfParser import * > > -from CommonDataClass.CommonClass import SkuInfoClass > > -from GenPatchPcdTable.GenPatchPcdTable import parsePcdInfoFromMapFile > > -import Common.VpdInfoFile as VpdInfoFile > > -from .GenPcdDb import CreatePcdDatabaseCode > > -from Workspace.MetaFileCommentParser import UsageList > > -from Workspace.WorkspaceCommon import GetModuleLibInstances > > -from Common.MultipleWorkspace import MultipleWorkspace as mws > > -from . import InfSectionParser > > -import datetime > > -import hashlib > > -from .GenVar import VariableMgr, var_info > > -from collections import OrderedDict > > -from collections import defaultdict > > -from Workspace.WorkspaceCommon import OrderedListDict > > -from Common.ToolDefClassObject import gDefaultToolsDefFile > > - > > -from Common.caching import cached_property, cached_class_function > > - > > -## Regular expression for splitting Dependency Expression string into= tokens > > -gDepexTokenPattern =3D re.compile("(\(|\)|\w+| \S+\.inf)") > > - > > -## Regular expression for match: PCD(xxxx.yyy) > > -gPCDAsGuidPattern =3D re.compile(r"^PCD\(.+\..+\)$") > > - > > -# > > -# Regular expression for finding Include Directories, the difference = between MSFT and INTEL/GCC/RVCT > > -# is the former use /I , the Latter used -I to specify include direct= ories > > -# > > -gBuildOptIncludePatternMsft =3D re.compile(r"(?:.*?)/I[ \t]*([^ ]*)",= re.MULTILINE | re.DOTALL) > > -gBuildOptIncludePatternOther =3D re.compile(r"(?:.*?)-I[ \t]*([^ ]*)"= , re.MULTILINE | re.DOTALL) > > - > > -# > > -# Match name =3D variable > > -# > > -gEfiVarStoreNamePattern =3D re.compile("\s*name\s*=3D\s*(\w+)") > > -# > > -# The format of guid in efivarstore statement likes following and mus= t be correct: > > -# guid =3D {0xA04A27f4, 0xDF00, 0x4D42, {0xB5, 0x52, 0x39, 0x51, 0x13= , 0x02, 0x11, 0x3D}} > > -# > > -gEfiVarStoreGuidPattern =3D re.compile("\s*guid\s*=3D\s*({.*?{.*?}\s*= })") > > - > > -## Mapping Makefile type > > -gMakeTypeMap =3D {TAB_COMPILER_MSFT:"nmake", "GCC":"gmake"} > > - > > - > > -## default file name for AutoGen > > -gAutoGenCodeFileName =3D "AutoGen.c" > > -gAutoGenHeaderFileName =3D "AutoGen.h" > > -gAutoGenStringFileName =3D "%(module_name)sStrDefs.h" > > -gAutoGenStringFormFileName =3D "%(module_name)sStrDefs.hpk" > > -gAutoGenDepexFileName =3D "%(module_name)s.depex" > > -gAutoGenImageDefFileName =3D "%(module_name)sImgDefs.h" > > -gAutoGenIdfFileName =3D "%(module_name)sIdf.hpk" > > 
-gInfSpecVersion =3D "0x00010017" > > - > > -# > > -# Template string to generic AsBuilt INF > > -# > > -gAsBuiltInfHeaderString =3D TemplateString("""${header_comments} > > - > > -# DO NOT EDIT > > -# FILE auto-generated > > - > > -[Defines] > > - INF_VERSION =3D ${module_inf_version} > > - BASE_NAME =3D ${module_name} > > - FILE_GUID =3D ${module_guid} > > - MODULE_TYPE =3D ${module_module_type}${BEGIN} > > - VERSION_STRING =3D ${module_version_string}${END}${BEGI= N} > > - PCD_IS_DRIVER =3D ${pcd_is_driver_string}${END}${BEGIN= } > > - UEFI_SPECIFICATION_VERSION =3D ${module_uefi_specification_version}= ${END}${BEGIN} > > - PI_SPECIFICATION_VERSION =3D ${module_pi_specification_version}${= END}${BEGIN} > > - ENTRY_POINT =3D ${module_entry_point}${END}${BEGIN} > > - UNLOAD_IMAGE =3D ${module_unload_image}${END}${BEGIN} > > - CONSTRUCTOR =3D ${module_constructor}${END}${BEGIN} > > - DESTRUCTOR =3D ${module_destructor}${END}${BEGIN} > > - SHADOW =3D ${module_shadow}${END}${BEGIN} > > - PCI_VENDOR_ID =3D ${module_pci_vendor_id}${END}${BEGIN= } > > - PCI_DEVICE_ID =3D ${module_pci_device_id}${END}${BEGIN= } > > - PCI_CLASS_CODE =3D ${module_pci_class_code}${END}${BEGI= N} > > - PCI_REVISION =3D ${module_pci_revision}${END}${BEGIN} > > - BUILD_NUMBER =3D ${module_build_number}${END}${BEGIN} > > - SPEC =3D ${module_spec}${END}${BEGIN} > > - UEFI_HII_RESOURCE_SECTION =3D ${module_uefi_hii_resource_section}$= {END}${BEGIN} > > - MODULE_UNI_FILE =3D ${module_uni_file}${END} > > - > > -[Packages.${module_arch}]${BEGIN} > > - ${package_item}${END} > > - > > -[Binaries.${module_arch}]${BEGIN} > > - ${binary_item}${END} > > - > > -[PatchPcd.${module_arch}]${BEGIN} > > - ${patchablepcd_item} > > -${END} > > - > > -[Protocols.${module_arch}]${BEGIN} > > - ${protocol_item} > > -${END} > > - > > -[Ppis.${module_arch}]${BEGIN} > > - ${ppi_item} > > -${END} > > - > > -[Guids.${module_arch}]${BEGIN} > > - ${guid_item} > > -${END} > > - > > -[PcdEx.${module_arch}]${BEGIN} > > - ${pcd_item} > > -${END} > > - > > -[LibraryClasses.${module_arch}] > > -## @LIB_INSTANCES${BEGIN} > > -# ${libraryclasses_item}${END} > > - > > -${depexsection_item} > > - > > -${userextension_tianocore_item} > > - > > -${tail_comments} > > - > > -[BuildOptions.${module_arch}] > > -## @AsBuilt${BEGIN} > > -## ${flags_item}${END} > > -""") > > -## Split command line option string to list > > -# > > -# subprocess.Popen needs the args to be a sequence. 
Otherwise there's= problem > > -# in non-windows platform to launch command > > -# > > -def _SplitOption(OptionString): > > - OptionList =3D [] > > - LastChar =3D " " > > - OptionStart =3D 0 > > - QuotationMark =3D "" > > - for Index in range(0, len(OptionString)): > > - CurrentChar =3D OptionString[Index] > > - if CurrentChar in ['"', "'"]: > > - if QuotationMark =3D=3D CurrentChar: > > - QuotationMark =3D "" > > - elif QuotationMark =3D=3D "": > > - QuotationMark =3D CurrentChar > > - continue > > - elif QuotationMark: > > - continue > > - > > - if CurrentChar in ["/", "-"] and LastChar in [" ", "\t", "\r"= , "\n"]: > > - if Index > OptionStart: > > - OptionList.append(OptionString[OptionStart:Index - 1]= ) > > - OptionStart =3D Index > > - LastChar =3D CurrentChar > > - OptionList.append(OptionString[OptionStart:]) > > - return OptionList > > - > > -# > > -# Convert string to C format array > > -# > > -def _ConvertStringToByteArray(Value): > > - Value =3D Value.strip() > > - if not Value: > > - return None > > - if Value[0] =3D=3D '{': > > - if not Value.endswith('}'): > > - return None > > - Value =3D Value.replace(' ', '').replace('{', '').replace('}'= , '') > > - ValFields =3D Value.split(',') > > - try: > > - for Index in range(len(ValFields)): > > - ValFields[Index] =3D str(int(ValFields[Index], 0)) > > - except ValueError: > > - return None > > - Value =3D '{' + ','.join(ValFields) + '}' > > - return Value > > - > > - Unicode =3D False > > - if Value.startswith('L"'): > > - if not Value.endswith('"'): > > - return None > > - Value =3D Value[1:] > > - Unicode =3D True > > - elif not Value.startswith('"') or not Value.endswith('"'): > > - return None > > - > > - Value =3D eval(Value) # translate escape character > > - NewValue =3D '{' > > - for Index in range(0, len(Value)): > > - if Unicode: > > - NewValue =3D NewValue + str(ord(Value[Index]) % 0x10000) = + ',' > > - else: > > - NewValue =3D NewValue + str(ord(Value[Index]) % 0x100) + = ',' > > - Value =3D NewValue + '0}' > > - return Value > > - > > +from Common.DataType import TAB_STAR > > ## Base class for AutoGen > > # > > # This class just implements the cache mechanism of AutoGen objects= . > > # > > class AutoGen(object): > > @@ -246,10 +31,11 @@ class AutoGen(object): > > # @param Toolchain Tool chain name > > # @param Arch Target arch > > # @param *args The specific class related parameters > > # @param **kwargs The specific class related dict param= eters > > # > > + > > def __new__(cls, Workspace, MetaFile, Target, Toolchain, Arch, *a= rgs, **kwargs): > > # check if the object has been created > > Key =3D (Target, Toolchain, Arch, MetaFile) > > if Key in cls.__ObjectCache: > > # if it exists, just return it directly > > @@ -279,4007 +65,49 @@ class AutoGen(object): > > > > ## "=3D=3D" operator > > def __eq__(self, Other): > > return Other and self.MetaFile =3D=3D Other > > > > -## Workspace AutoGen class > > -# > > -# This class is used mainly to control the whole platform build for= different > > -# architecture. This class will generate top level makefile. 
> > -# > > -class WorkspaceAutoGen(AutoGen): > > - # call super().__init__ then call the worker function with differ= ent parameter count > > - def __init__(self, Workspace, MetaFile, Target, Toolchain, Arch, = *args, **kwargs): > > - if not hasattr(self, "_Init"): > > - self._InitWorker(Workspace, MetaFile, Target, Toolchain, = Arch, *args, **kwargs) > > - self._Init =3D True > > - > > - ## Initialize WorkspaceAutoGen > > - # > > - # @param WorkspaceDir Root directory of workspace > > - # @param ActivePlatform Meta-file of active platform > > - # @param Target Build target > > - # @param Toolchain Tool chain name > > - # @param ArchList List of architecture of curre= nt build > > - # @param MetaFileDb Database containing meta-file= s > > - # @param BuildConfig Configuration of build > > - # @param ToolDefinition Tool chain definitions > > - # @param FlashDefinitionFile File of flash definition > > - # @param Fds FD list to be generated > > - # @param Fvs FV list to be generated > > - # @param Caps Capsule list to be generated > > - # @param SkuId SKU id from command line > > - # > > - def _InitWorker(self, WorkspaceDir, ActivePlatform, Target, Toolc= hain, ArchList, MetaFileDb, > > - BuildConfig, ToolDefinition, FlashDefinitionFile=3D'', = Fds=3DNone, Fvs=3DNone, Caps=3DNone, SkuId=3D'', UniFlag=3DNone, > > - Progress=3DNone, BuildModule=3DNone): > > - self.BuildDatabase =3D MetaFileDb > > - self.MetaFile =3D ActivePlatform > > - self.WorkspaceDir =3D WorkspaceDir > > - self.Platform =3D self.BuildDatabase[self.MetaFile, TAB= _ARCH_COMMON, Target, Toolchain] > > - GlobalData.gActivePlatform =3D self.Platform > > - self.BuildTarget =3D Target > > - self.ToolChain =3D Toolchain > > - self.ArchList =3D ArchList > > - self.SkuId =3D SkuId > > - self.UniFlag =3D UniFlag > > - > > - self.TargetTxt =3D BuildConfig > > - self.ToolDef =3D ToolDefinition > > - self.FdfFile =3D FlashDefinitionFile > > - self.FdTargetList =3D Fds if Fds else [] > > - self.FvTargetList =3D Fvs if Fvs else [] > > - self.CapTargetList =3D Caps if Caps else [] > > - self.AutoGenObjectList =3D [] > > - self._GuidDict =3D {} > > - > > - # there's many relative directory operations, so ... > > - os.chdir(self.WorkspaceDir) > > - > > - self.MergeArch() > > - self.ValidateBuildTarget() > > - > > - EdkLogger.info("") > > - if self.ArchList: > > - EdkLogger.info('%-16s =3D %s' % ("Architecture(s)", ' '.j= oin(self.ArchList))) > > - EdkLogger.info('%-16s =3D %s' % ("Build target", self.BuildTa= rget)) > > - EdkLogger.info('%-16s =3D %s' % ("Toolchain", self.ToolChain)= ) > > - > > - EdkLogger.info('\n%-24s =3D %s' % ("Active Platform", self.Pl= atform)) > > - if BuildModule: > > - EdkLogger.info('%-24s =3D %s' % ("Active Module", BuildMo= dule)) > > - > > - if self.FdfFile: > > - EdkLogger.info('%-24s =3D %s' % ("Flash Image Definition"= , self.FdfFile)) > > - > > - EdkLogger.verbose("\nFLASH_DEFINITION =3D %s" % self.FdfFile) > > - > > - if Progress: > > - Progress.Start("\nProcessing meta-data") > > - # > > - # Mark now build in AutoGen Phase > > - # > > - GlobalData.gAutoGenPhase =3D True > > - self.ProcessModuleFromPdf() > > - self.ProcessPcdType() > > - self.ProcessMixedPcd() > > - self.GetPcdsFromFDF() > > - self.CollectAllPcds() > > - self.GeneratePkgLevelHash() > > - # > > - # Check PCDs token value conflict in each DEC file. 
> > - # > > - self._CheckAllPcdsTokenValueConflict() > > - # > > - # Check PCD type and definition between DSC and DEC > > - # > > - self._CheckPcdDefineAndType() > > - > > - self.CreateBuildOptionsFile() > > - self.CreatePcdTokenNumberFile() > > - self.CreateModuleHashInfo() > > - GlobalData.gAutoGenPhase =3D False > > - > > - # > > - # Merge Arch > > - # > > - def MergeArch(self): > > - if not self.ArchList: > > - ArchList =3D set(self.Platform.SupArchList) > > - else: > > - ArchList =3D set(self.ArchList) & set(self.Platform.SupAr= chList) > > - if not ArchList: > > - EdkLogger.error("build", PARAMETER_INVALID, > > - ExtraData =3D "Invalid ARCH specified. [V= alid ARCH: %s]" % (" ".join(self.Platform.SupArchList))) > > - elif self.ArchList and len(ArchList) !=3D len(self.ArchList): > > - SkippedArchList =3D set(self.ArchList).symmetric_differen= ce(set(self.Platform.SupArchList)) > > - EdkLogger.verbose("\nArch [%s] is ignored because the pla= tform supports [%s] only!" > > - % (" ".join(SkippedArchList), " ".join(= self.Platform.SupArchList))) > > - self.ArchList =3D tuple(ArchList) > > - > > - # Validate build target > > - def ValidateBuildTarget(self): > > - if self.BuildTarget not in self.Platform.BuildTargets: > > - EdkLogger.error("build", PARAMETER_INVALID, > > - ExtraData=3D"Build target [%s] is not sup= ported by the platform. [Valid target: %s]" > > - % (self.BuildTarget, " ".join(s= elf.Platform.BuildTargets))) > > - @cached_property > > - def FdfProfile(self): > > - if not self.FdfFile: > > - self.FdfFile =3D self.Platform.FlashDefinition > > - > > - FdfProfile =3D None > > - if self.FdfFile: > > - Fdf =3D FdfParser(self.FdfFile.Path) > > - Fdf.ParseFile() > > - GlobalData.gFdfParser =3D Fdf > > - if Fdf.CurrentFdName and Fdf.CurrentFdName in Fdf.Profile= .FdDict: > > - FdDict =3D Fdf.Profile.FdDict[Fdf.CurrentFdName] > > - for FdRegion in FdDict.RegionList: > > - if str(FdRegion.RegionType) is 'FILE' and self.Pl= atform.VpdToolGuid in str(FdRegion.RegionDataList): > > - if int(FdRegion.Offset) % 8 !=3D 0: > > - EdkLogger.error("build", FORMAT_INVALID, = 'The VPD Base Address %s must be 8-byte aligned.' % (FdRegion.Offset)) > > - FdfProfile =3D Fdf.Profile > > - else: > > - if self.FdTargetList: > > - EdkLogger.info("No flash definition file found. FD [%= s] will be ignored." % " ".join(self.FdTargetList)) > > - self.FdTargetList =3D [] > > - if self.FvTargetList: > > - EdkLogger.info("No flash definition file found. FV [%= s] will be ignored." % " ".join(self.FvTargetList)) > > - self.FvTargetList =3D [] > > - if self.CapTargetList: > > - EdkLogger.info("No flash definition file found. Capsu= le [%s] will be ignored." 
% " ".join(self.CapTargetList)) > > - self.CapTargetList =3D [] > > - > > - return FdfProfile > > - > > - def ProcessModuleFromPdf(self): > > - > > - if self.FdfProfile: > > - for fvname in self.FvTargetList: > > - if fvname.upper() not in self.FdfProfile.FvDict: > > - EdkLogger.error("build", OPTION_VALUE_INVALID, > > - "No such an FV in FDF file: %s" %= fvname) > > - > > - # In DSC file may use FILE_GUID to override the module, t= hen in the Platform.Modules use FILE_GUIDmodule.inf as key, > > - # but the path (self.MetaFile.Path) is the real path > > - for key in self.FdfProfile.InfDict: > > - if key =3D=3D 'ArchTBD': > > - MetaFile_cache =3D defaultdict(set) > > - for Arch in self.ArchList: > > - Current_Platform_cache =3D self.BuildDatabase= [self.MetaFile, Arch, self.BuildTarget, self.ToolChain] > > - for Pkey in Current_Platform_cache.Modules: > > - MetaFile_cache[Arch].add(Current_Platform= _cache.Modules[Pkey].MetaFile) > > - for Inf in self.FdfProfile.InfDict[key]: > > - ModuleFile =3D PathClass(NormPath(Inf), Globa= lData.gWorkspace, Arch) > > - for Arch in self.ArchList: > > - if ModuleFile in MetaFile_cache[Arch]: > > - break > > - else: > > - ModuleData =3D self.BuildDatabase[ModuleF= ile, Arch, self.BuildTarget, self.ToolChain] > > - if not ModuleData.IsBinaryModule: > > - EdkLogger.error('build', PARSER_ERROR= , "Module %s NOT found in DSC file; Is it really a binary module?" % Module= File) > > - > > - else: > > - for Arch in self.ArchList: > > - if Arch =3D=3D key: > > - Platform =3D self.BuildDatabase[self.Meta= File, Arch, self.BuildTarget, self.ToolChain] > > - MetaFileList =3D set() > > - for Pkey in Platform.Modules: > > - MetaFileList.add(Platform.Modules[Pke= y].MetaFile) > > - for Inf in self.FdfProfile.InfDict[key]: > > - ModuleFile =3D PathClass(NormPath(Inf= ), GlobalData.gWorkspace, Arch) > > - if ModuleFile in MetaFileList: > > - continue > > - ModuleData =3D self.BuildDatabase[Mod= uleFile, Arch, self.BuildTarget, self.ToolChain] > > - if not ModuleData.IsBinaryModule: > > - EdkLogger.error('build', PARSER_E= RROR, "Module %s NOT found in DSC file; Is it really a binary module?" 
% Mo= duleFile) > > - > > - > > - > > - # parse FDF file to get PCDs in it, if any > > - def GetPcdsFromFDF(self): > > - > > - if self.FdfProfile: > > - PcdSet =3D self.FdfProfile.PcdDict > > - # handle the mixed pcd in FDF file > > - for key in PcdSet: > > - if key in GlobalData.MixedPcd: > > - Value =3D PcdSet[key] > > - del PcdSet[key] > > - for item in GlobalData.MixedPcd[key]: > > - PcdSet[item] =3D Value > > - self.VerifyPcdDeclearation(PcdSet) > > - > > - def ProcessPcdType(self): > > - for Arch in self.ArchList: > > - Platform =3D self.BuildDatabase[self.MetaFile, Arch, self= .BuildTarget, self.ToolChain] > > - Platform.Pcds > > - # generate the SourcePcdDict and BinaryPcdDict > > - PGen =3D PlatformAutoGen(self, self.MetaFile, self.BuildT= arget, self.ToolChain, Arch) > > - for BuildData in list(PGen.BuildDatabase._CACHE_.values()= ): > > - if BuildData.Arch !=3D Arch: > > - continue > > - if BuildData.MetaFile.Ext =3D=3D '.inf': > > - for key in BuildData.Pcds: > > - if BuildData.Pcds[key].Pending: > > - if key in Platform.Pcds: > > - PcdInPlatform =3D Platform.Pcds[key] > > - if PcdInPlatform.Type: > > - BuildData.Pcds[key].Type =3D PcdI= nPlatform.Type > > - BuildData.Pcds[key].Pending =3D F= alse > > - > > - if BuildData.MetaFile in Platform.Modules= : > > - PlatformModule =3D Platform.Modules[s= tr(BuildData.MetaFile)] > > - if key in PlatformModule.Pcds: > > - PcdInPlatform =3D PlatformModule.= Pcds[key] > > - if PcdInPlatform.Type: > > - BuildData.Pcds[key].Type =3D = PcdInPlatform.Type > > - BuildData.Pcds[key].Pending = =3D False > > - else: > > - #Pcd used in Library, Pcd Type from r= eference module if Pcd Type is Pending > > - if BuildData.Pcds[key].Pending: > > - MGen =3D ModuleAutoGen(self, Buil= dData.MetaFile, self.BuildTarget, self.ToolChain, Arch, self.MetaFile) > > - if MGen and MGen.IsLibrary: > > - if MGen in PGen.LibraryAutoGe= nList: > > - ReferenceModules =3D MGen= .ReferenceModules > > - for ReferenceModule in Re= ferenceModules: > > - if ReferenceModule.Me= taFile in Platform.Modules: > > - RefPlatformModule= =3D Platform.Modules[str(ReferenceModule.MetaFile)] > > - if key in RefPlat= formModule.Pcds: > > - PcdInReferenc= eModule =3D RefPlatformModule.Pcds[key] > > - if PcdInRefer= enceModule.Type: > > - BuildData= .Pcds[key].Type =3D PcdInReferenceModule.Type > > - BuildData= .Pcds[key].Pending =3D False > > - break > > - > > - def ProcessMixedPcd(self): > > - for Arch in self.ArchList: > > - SourcePcdDict =3D {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PA= TCHABLE_IN_MODULE:set(),TAB_PCDS_DYNAMIC:set(),TAB_PCDS_FIXED_AT_BUILD:set(= )} > > - BinaryPcdDict =3D {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PA= TCHABLE_IN_MODULE:set()} > > - SourcePcdDict_Keys =3D SourcePcdDict.keys() > > - BinaryPcdDict_Keys =3D BinaryPcdDict.keys() > > - > > - # generate the SourcePcdDict and BinaryPcdDict > > - PGen =3D PlatformAutoGen(self, self.MetaFile, self.BuildT= arget, self.ToolChain, Arch) > > - for BuildData in list(PGen.BuildDatabase._CACHE_.values()= ): > > - if BuildData.Arch !=3D Arch: > > - continue > > - if BuildData.MetaFile.Ext =3D=3D '.inf': > > - for key in BuildData.Pcds: > > - if TAB_PCDS_DYNAMIC_EX in BuildData.Pcds[key]= .Type: > > - if BuildData.IsBinaryModule: > > - BinaryPcdDict[TAB_PCDS_DYNAMIC_EX].ad= d((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName)= ) > > - else: > > - SourcePcdDict[TAB_PCDS_DYNAMIC_EX].ad= d((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName)= ) > > - > > - elif TAB_PCDS_PATCHABLE_IN_MODULE 
in BuildDat= a.Pcds[key].Type: > > - if BuildData.MetaFile.Ext =3D=3D '.inf': > > - if BuildData.IsBinaryModule: > > - BinaryPcdDict[TAB_PCDS_PATCHABLE_= IN_MODULE].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSp= aceGuidCName)) > > - else: > > - SourcePcdDict[TAB_PCDS_PATCHABLE_= IN_MODULE].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSp= aceGuidCName)) > > - > > - elif TAB_PCDS_DYNAMIC in BuildData.Pcds[key].= Type: > > - SourcePcdDict[TAB_PCDS_DYNAMIC].add((Buil= dData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName)) > > - elif TAB_PCDS_FIXED_AT_BUILD in BuildData.Pcd= s[key].Type: > > - SourcePcdDict[TAB_PCDS_FIXED_AT_BUILD].ad= d((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName)= ) > > - > > - # > > - # A PCD can only use one type for all source modules > > - # > > - for i in SourcePcdDict_Keys: > > - for j in SourcePcdDict_Keys: > > - if i !=3D j: > > - Intersections =3D SourcePcdDict[i].intersecti= on(SourcePcdDict[j]) > > - if len(Intersections) > 0: > > - EdkLogger.error( > > - 'build', > > - FORMAT_INVALID, > > - "Building modules from source INFs, follo= wing PCD use %s and %s access method. It must be corrected to use only one = access method." % (i, j), > > - ExtraData=3D'\n\t'.join(str(P[1]+'.'+P[0]= ) for P in Intersections) > > - ) > > - > > - # > > - # intersection the BinaryPCD for Mixed PCD > > - # > > - for i in BinaryPcdDict_Keys: > > - for j in BinaryPcdDict_Keys: > > - if i !=3D j: > > - Intersections =3D BinaryPcdDict[i].intersecti= on(BinaryPcdDict[j]) > > - for item in Intersections: > > - NewPcd1 =3D (item[0] + '_' + i, item[1]) > > - NewPcd2 =3D (item[0] + '_' + j, item[1]) > > - if item not in GlobalData.MixedPcd: > > - GlobalData.MixedPcd[item] =3D [NewPcd= 1, NewPcd2] > > - else: > > - if NewPcd1 not in GlobalData.MixedPcd= [item]: > > - GlobalData.MixedPcd[item].append(= NewPcd1) > > - if NewPcd2 not in GlobalData.MixedPcd= [item]: > > - GlobalData.MixedPcd[item].append(= NewPcd2) > > - > > - # > > - # intersection the SourcePCD and BinaryPCD for Mixed PCD > > - # > > - for i in SourcePcdDict_Keys: > > - for j in BinaryPcdDict_Keys: > > - if i !=3D j: > > - Intersections =3D SourcePcdDict[i].intersecti= on(BinaryPcdDict[j]) > > - for item in Intersections: > > - NewPcd1 =3D (item[0] + '_' + i, item[1]) > > - NewPcd2 =3D (item[0] + '_' + j, item[1]) > > - if item not in GlobalData.MixedPcd: > > - GlobalData.MixedPcd[item] =3D [NewPcd= 1, NewPcd2] > > - else: > > - if NewPcd1 not in GlobalData.MixedPcd= [item]: > > - GlobalData.MixedPcd[item].append(= NewPcd1) > > - if NewPcd2 not in GlobalData.MixedPcd= [item]: > > - GlobalData.MixedPcd[item].append(= NewPcd2) > > - > > - for BuildData in list(PGen.BuildDatabase._CACHE_.values()= ): > > - if BuildData.Arch !=3D Arch: > > - continue > > - for key in BuildData.Pcds: > > - for SinglePcd in GlobalData.MixedPcd: > > - if (BuildData.Pcds[key].TokenCName, BuildData= .Pcds[key].TokenSpaceGuidCName) =3D=3D SinglePcd: > > - for item in GlobalData.MixedPcd[SinglePcd= ]: > > - Pcd_Type =3D item[0].split('_')[-1] > > - if (Pcd_Type =3D=3D BuildData.Pcds[ke= y].Type) or (Pcd_Type =3D=3D TAB_PCDS_DYNAMIC_EX and BuildData.Pcds[key].Ty= pe in PCD_DYNAMIC_EX_TYPE_SET) or \ > > - (Pcd_Type =3D=3D TAB_PCDS_DYNAMIC = and BuildData.Pcds[key].Type in PCD_DYNAMIC_TYPE_SET): > > - Value =3D BuildData.Pcds[key] > > - Value.TokenCName =3D BuildData.Pc= ds[key].TokenCName + '_' + Pcd_Type > > - if len(key) =3D=3D 2: > > - newkey =3D (Value.TokenCName,= key[1]) > > 
- elif len(key) =3D=3D 3: > > - newkey =3D (Value.TokenCName,= key[1], key[2]) > > - del BuildData.Pcds[key] > > - BuildData.Pcds[newkey] =3D Value > > - break > > - break > > - > > - #Collect package set information from INF of FDF > > - @cached_property > > - def PkgSet(self): > > - if not self.FdfFile: > > - self.FdfFile =3D self.Platform.FlashDefinition > > - > > - if self.FdfFile: > > - ModuleList =3D self.FdfProfile.InfList > > - else: > > - ModuleList =3D [] > > - Pkgs =3D {} > > - for Arch in self.ArchList: > > - Platform =3D self.BuildDatabase[self.MetaFile, Arch, self= .BuildTarget, self.ToolChain] > > - PGen =3D PlatformAutoGen(self, self.MetaFile, self.BuildT= arget, self.ToolChain, Arch) > > - PkgSet =3D set() > > - for Inf in ModuleList: > > - ModuleFile =3D PathClass(NormPath(Inf), GlobalData.gW= orkspace, Arch) > > - if ModuleFile in Platform.Modules: > > - continue > > - ModuleData =3D self.BuildDatabase[ModuleFile, Arch, s= elf.BuildTarget, self.ToolChain] > > - PkgSet.update(ModuleData.Packages) > > - Pkgs[Arch] =3D list(PkgSet) + list(PGen.PackageList) > > - return Pkgs > > - > > - def VerifyPcdDeclearation(self,PcdSet): > > - for Arch in self.ArchList: > > - Platform =3D self.BuildDatabase[self.MetaFile, Arch, self= .BuildTarget, self.ToolChain] > > - Pkgs =3D self.PkgSet[Arch] > > - DecPcds =3D set() > > - DecPcdsKey =3D set() > > - for Pkg in Pkgs: > > - for Pcd in Pkg.Pcds: > > - DecPcds.add((Pcd[0], Pcd[1])) > > - DecPcdsKey.add((Pcd[0], Pcd[1], Pcd[2])) > > - > > - Platform.SkuName =3D self.SkuId > > - for Name, Guid,Fileds in PcdSet: > > - if (Name, Guid) not in DecPcds: > > - EdkLogger.error( > > - 'build', > > - PARSER_ERROR, > > - "PCD (%s.%s) used in FDF is not declared in D= EC files." % (Guid, Name), > > - File =3D self.FdfProfile.PcdFileLineDict[Name= , Guid, Fileds][0], > > - Line =3D self.FdfProfile.PcdFileLineDict[Name= , Guid, Fileds][1] > > - ) > > - else: > > - # Check whether Dynamic or DynamicEx PCD used in = FDF file. If used, build break and give a error message. > > - if (Name, Guid, TAB_PCDS_FIXED_AT_BUILD) in DecPc= dsKey \ > > - or (Name, Guid, TAB_PCDS_PATCHABLE_IN_MODULE)= in DecPcdsKey \ > > - or (Name, Guid, TAB_PCDS_FEATURE_FLAG) in Dec= PcdsKey: > > - continue > > - elif (Name, Guid, TAB_PCDS_DYNAMIC) in DecPcdsKey= or (Name, Guid, TAB_PCDS_DYNAMIC_EX) in DecPcdsKey: > > - EdkLogger.error( > > - 'build', > > - PARSER_ERROR, > > - "Using Dynamic or DynamicEx type of P= CD [%s.%s] in FDF file is not allowed." % (Guid, Name), > > - File =3D self.FdfProfile.PcdFileLineD= ict[Name, Guid, Fileds][0], > > - Line =3D self.FdfProfile.PcdFileLineD= ict[Name, Guid, Fileds][1] > > - ) > > - def CollectAllPcds(self): > > - > > - for Arch in self.ArchList: > > - Pa =3D PlatformAutoGen(self, self.MetaFile, self.BuildTar= get, self.ToolChain, Arch) > > - # > > - # Explicitly collect platform's dynamic PCDs > > - # > > - Pa.CollectPlatformDynamicPcds() > > - Pa.CollectFixedAtBuildPcds() > > - self.AutoGenObjectList.append(Pa) > > - > > - # > > - # Generate Package level hash value > > - # > > - def GeneratePkgLevelHash(self): > > - for Arch in self.ArchList: > > - GlobalData.gPackageHash =3D {} > > - if GlobalData.gUseHashCache: > > - for Pkg in self.PkgSet[Arch]: > > - self._GenPkgLevelHash(Pkg) > > - > > - > > - def CreateBuildOptionsFile(self): > > - # > > - # Create BuildOptions Macro & PCD metafile, also add the Acti= ve Platform and FDF file. 
> > - # > > - content =3D 'gCommandLineDefines: ' > > - content +=3D str(GlobalData.gCommandLineDefines) > > - content +=3D TAB_LINE_BREAK > > - content +=3D 'BuildOptionPcd: ' > > - content +=3D str(GlobalData.BuildOptionPcd) > > - content +=3D TAB_LINE_BREAK > > - content +=3D 'Active Platform: ' > > - content +=3D str(self.Platform) > > - content +=3D TAB_LINE_BREAK > > - if self.FdfFile: > > - content +=3D 'Flash Image Definition: ' > > - content +=3D str(self.FdfFile) > > - content +=3D TAB_LINE_BREAK > > - SaveFileOnChange(os.path.join(self.BuildDir, 'BuildOptions'),= content, False) > > - > > - def CreatePcdTokenNumberFile(self): > > - # > > - # Create PcdToken Number file for Dynamic/DynamicEx Pcd. > > - # > > - PcdTokenNumber =3D 'PcdTokenNumber: ' > > - Pa =3D self.AutoGenObjectList[0] > > - if Pa.PcdTokenNumber: > > - if Pa.DynamicPcdList: > > - for Pcd in Pa.DynamicPcdList: > > - PcdTokenNumber +=3D TAB_LINE_BREAK > > - PcdTokenNumber +=3D str((Pcd.TokenCName, Pcd.Toke= nSpaceGuidCName)) > > - PcdTokenNumber +=3D ' : ' > > - PcdTokenNumber +=3D str(Pa.PcdTokenNumber[Pcd.Tok= enCName, Pcd.TokenSpaceGuidCName]) > > - SaveFileOnChange(os.path.join(self.BuildDir, 'PcdTokenNumber'= ), PcdTokenNumber, False) > > - > > - def CreateModuleHashInfo(self): > > - # > > - # Get set of workspace metafiles > > - # > > - AllWorkSpaceMetaFiles =3D self._GetMetaFiles(self.BuildTarget= , self.ToolChain) > > - > > - # > > - # Retrieve latest modified time of all metafiles > > - # > > - SrcTimeStamp =3D 0 > > - for f in AllWorkSpaceMetaFiles: > > - if os.stat(f)[8] > SrcTimeStamp: > > - SrcTimeStamp =3D os.stat(f)[8] > > - self._SrcTimeStamp =3D SrcTimeStamp > > - > > - if GlobalData.gUseHashCache: > > - m =3D hashlib.md5() > > - for files in AllWorkSpaceMetaFiles: > > - if files.endswith('.dec'): > > - continue > > - f =3D open(files, 'rb') > > - Content =3D f.read() > > - f.close() > > - m.update(Content) > > - SaveFileOnChange(os.path.join(self.BuildDir, 'AutoGen.has= h'), m.hexdigest(), False) > > - GlobalData.gPlatformHash =3D m.hexdigest() > > - > > - # > > - # Write metafile list to build directory > > - # > > - AutoGenFilePath =3D os.path.join(self.BuildDir, 'AutoGen') > > - if os.path.exists (AutoGenFilePath): > > - os.remove(AutoGenFilePath) > > - if not os.path.exists(self.BuildDir): > > - os.makedirs(self.BuildDir) > > - with open(os.path.join(self.BuildDir, 'AutoGen'), 'w+') as fi= le: > > - for f in AllWorkSpaceMetaFiles: > > - print(f, file=3Dfile) > > - return True > > - > > - def _GenPkgLevelHash(self, Pkg): > > - if Pkg.PackageName in GlobalData.gPackageHash: > > - return > > - > > - PkgDir =3D os.path.join(self.BuildDir, Pkg.Arch, Pkg.PackageN= ame) > > - CreateDirectory(PkgDir) > > - HashFile =3D os.path.join(PkgDir, Pkg.PackageName + '.hash') > > - m =3D hashlib.md5() > > - # Get .dec file's hash value > > - f =3D open(Pkg.MetaFile.Path, 'rb') > > - Content =3D f.read() > > - f.close() > > - m.update(Content) > > - # Get include files hash value > > - if Pkg.Includes: > > - for inc in sorted(Pkg.Includes, key=3Dlambda x: str(x)): > > - for Root, Dirs, Files in os.walk(str(inc)): > > - for File in sorted(Files): > > - File_Path =3D os.path.join(Root, File) > > - f =3D open(File_Path, 'rb') > > - Content =3D f.read() > > - f.close() > > - m.update(Content) > > - SaveFileOnChange(HashFile, m.hexdigest(), False) > > - GlobalData.gPackageHash[Pkg.PackageName] =3D m.hexdigest() > > - > > - def _GetMetaFiles(self, Target, Toolchain): > > - AllWorkSpaceMetaFiles =3D set() > > - # > > - 
# add fdf > > - # > > - if self.FdfFile: > > - AllWorkSpaceMetaFiles.add (self.FdfFile.Path) > > - for f in GlobalData.gFdfParser.GetAllIncludedFile(): > > - AllWorkSpaceMetaFiles.add (f.FileName) > > - # > > - # add dsc > > - # > > - AllWorkSpaceMetaFiles.add(self.MetaFile.Path) > > - > > - # > > - # add build_rule.txt & tools_def.txt > > - # > > - AllWorkSpaceMetaFiles.add(os.path.join(GlobalData.gConfDirect= ory, gDefaultBuildRuleFile)) > > - AllWorkSpaceMetaFiles.add(os.path.join(GlobalData.gConfDirect= ory, gDefaultToolsDefFile)) > > - > > - # add BuildOption metafile > > - # > > - AllWorkSpaceMetaFiles.add(os.path.join(self.BuildDir, 'BuildO= ptions')) > > - > > - # add PcdToken Number file for Dynamic/DynamicEx Pcd > > - # > > - AllWorkSpaceMetaFiles.add(os.path.join(self.BuildDir, 'PcdTok= enNumber')) > > - > > - for Pa in self.AutoGenObjectList: > > - AllWorkSpaceMetaFiles.add(Pa.ToolDefinitionFile) > > - > > - for Arch in self.ArchList: > > - # > > - # add dec > > - # > > - for Package in PlatformAutoGen(self, self.MetaFile, Targe= t, Toolchain, Arch).PackageList: > > - AllWorkSpaceMetaFiles.add(Package.MetaFile.Path) > > - > > - # > > - # add included dsc > > - # > > - for filePath in self.BuildDatabase[self.MetaFile, Arch, T= arget, Toolchain]._RawData.IncludedFiles: > > - AllWorkSpaceMetaFiles.add(filePath.Path) > > - > > - return AllWorkSpaceMetaFiles > > - > > - def _CheckPcdDefineAndType(self): > > - PcdTypeSet =3D {TAB_PCDS_FIXED_AT_BUILD, > > - TAB_PCDS_PATCHABLE_IN_MODULE, > > - TAB_PCDS_FEATURE_FLAG, > > - TAB_PCDS_DYNAMIC, > > - TAB_PCDS_DYNAMIC_EX} > > - > > - # This dict store PCDs which are not used by any modules with= specified arches > > - UnusedPcd =3D OrderedDict() > > - for Pa in self.AutoGenObjectList: > > - # Key of DSC's Pcds dictionary is PcdCName, TokenSpaceGui= d > > - for Pcd in Pa.Platform.Pcds: > > - PcdType =3D Pa.Platform.Pcds[Pcd].Type > > - > > - # If no PCD type, this PCD comes from FDF > > - if not PcdType: > > - continue > > - > > - # Try to remove Hii and Vpd suffix > > - if PcdType.startswith(TAB_PCDS_DYNAMIC_EX): > > - PcdType =3D TAB_PCDS_DYNAMIC_EX > > - elif PcdType.startswith(TAB_PCDS_DYNAMIC): > > - PcdType =3D TAB_PCDS_DYNAMIC > > - > > - for Package in Pa.PackageList: > > - # Key of DEC's Pcds dictionary is PcdCName, Token= SpaceGuid, PcdType > > - if (Pcd[0], Pcd[1], PcdType) in Package.Pcds: > > - break > > - for Type in PcdTypeSet: > > - if (Pcd[0], Pcd[1], Type) in Package.Pcds: > > - EdkLogger.error( > > - 'build', > > - FORMAT_INVALID, > > - "Type [%s] of PCD [%s.%s] in DSC file= doesn't match the type [%s] defined in DEC file." 
\ > > - % (Pa.Platform.Pcds[Pcd].Type, Pcd[1]= , Pcd[0], Type), > > - ExtraData=3DNone > > - ) > > - return > > - else: > > - UnusedPcd.setdefault(Pcd, []).append(Pa.Arch) > > - > > - for Pcd in UnusedPcd: > > - EdkLogger.warn( > > - 'build', > > - "The PCD was not specified by any INF module in the p= latform for the given architecture.\n" > > - "\tPCD: [%s.%s]\n\tPlatform: [%s]\n\tArch: %s" > > - % (Pcd[1], Pcd[0], os.path.basename(str(self.MetaFile= )), str(UnusedPcd[Pcd])), > > - ExtraData=3DNone > > - ) > > - > > - def __repr__(self): > > - return "%s [%s]" % (self.MetaFile, ", ".join(self.ArchList)) > > - > > - ## Return the directory to store FV files > > - @cached_property > > - def FvDir(self): > > - return path.join(self.BuildDir, TAB_FV_DIRECTORY) > > - > > - ## Return the directory to store all intermediate and final files= built > > - @cached_property > > - def BuildDir(self): > > - return self.AutoGenObjectList[0].BuildDir > > - > > - ## Return the build output directory platform specifies > > - @cached_property > > - def OutputDir(self): > > - return self.Platform.OutputDirectory > > - > > - ## Return platform name > > - @cached_property > > - def Name(self): > > - return self.Platform.PlatformName > > - > > - ## Return meta-file GUID > > - @cached_property > > - def Guid(self): > > - return self.Platform.Guid > > - > > - ## Return platform version > > - @cached_property > > - def Version(self): > > - return self.Platform.Version > > - > > - ## Return paths of tools > > - @cached_property > > - def ToolDefinition(self): > > - return self.AutoGenObjectList[0].ToolDefinition > > - > > - ## Return directory of platform makefile > > - # > > - # @retval string Makefile directory > > - # > > - @cached_property > > - def MakeFileDir(self): > > - return self.BuildDir > > - > > - ## Return build command string > > - # > > - # @retval string Build command string > > - # > > - @cached_property > > - def BuildCommand(self): > > - # BuildCommand should be all the same. So just get one from p= latform AutoGen > > - return self.AutoGenObjectList[0].BuildCommand > > - > > - ## Check the PCDs token value conflict in each DEC file. > > - # > > - # Will cause build break and raise error message while two PCDs c= onflict. 
> > - # > > - # @return None > > - # > > - def _CheckAllPcdsTokenValueConflict(self): > > - for Pa in self.AutoGenObjectList: > > - for Package in Pa.PackageList: > > - PcdList =3D list(Package.Pcds.values()) > > - PcdList.sort(key=3Dlambda x: int(x.TokenValue, 0)) > > - Count =3D 0 > > - while (Count < len(PcdList) - 1) : > > - Item =3D PcdList[Count] > > - ItemNext =3D PcdList[Count + 1] > > - # > > - # Make sure in the same token space the TokenValu= e should be unique > > - # > > - if (int(Item.TokenValue, 0) =3D=3D int(ItemNext.T= okenValue, 0)): > > - SameTokenValuePcdList =3D [] > > - SameTokenValuePcdList.append(Item) > > - SameTokenValuePcdList.append(ItemNext) > > - RemainPcdListLength =3D len(PcdList) - Count = - 2 > > - for ValueSameCount in range(RemainPcdListLeng= th): > > - if int(PcdList[len(PcdList) - RemainPcdLi= stLength + ValueSameCount].TokenValue, 0) =3D=3D int(Item.TokenValue, 0): > > - SameTokenValuePcdList.append(PcdList[= len(PcdList) - RemainPcdListLength + ValueSameCount]) > > - else: > > - break; > > - # > > - # Sort same token value PCD list with TokenGu= id and TokenCName > > - # > > - SameTokenValuePcdList.sort(key=3Dlambda x: "%= s.%s" % (x.TokenSpaceGuidCName, x.TokenCName)) > > - SameTokenValuePcdListCount =3D 0 > > - while (SameTokenValuePcdListCount < len(SameT= okenValuePcdList) - 1): > > - Flag =3D False > > - TemListItem =3D SameTokenValuePcdList[Sam= eTokenValuePcdListCount] > > - TemListItemNext =3D SameTokenValuePcdList= [SameTokenValuePcdListCount + 1] > > - > > - if (TemListItem.TokenSpaceGuidCName =3D= =3D TemListItemNext.TokenSpaceGuidCName) and (TemListItem.TokenCName !=3D = TemListItemNext.TokenCName): > > - for PcdItem in GlobalData.MixedPcd: > > - if (TemListItem.TokenCName, TemLi= stItem.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem] or \ > > - (TemListItemNext.TokenCName, = TemListItemNext.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]: > > - Flag =3D True > > - if not Flag: > > - EdkLogger.error( > > - 'build', > > - FORMAT_INVALID, > > - "The TokenValue [%s] = of PCD [%s.%s] is conflict with: [%s.%s] in %s"\ > > - % (TemListItem.TokenV= alue, TemListItem.TokenSpaceGuidCName, TemListItem.TokenCName, TemListItemN= ext.TokenSpaceGuidCName, TemListItemNext.TokenCName, Package), > > - ExtraData=3DNone > > - ) > > - SameTokenValuePcdListCount +=3D 1 > > - Count +=3D SameTokenValuePcdListCount > > - Count +=3D 1 > > - > > - PcdList =3D list(Package.Pcds.values()) > > - PcdList.sort(key=3Dlambda x: "%s.%s" % (x.TokenSpaceG= uidCName, x.TokenCName)) > > - Count =3D 0 > > - while (Count < len(PcdList) - 1) : > > - Item =3D PcdList[Count] > > - ItemNext =3D PcdList[Count + 1] > > - # > > - # Check PCDs with same TokenSpaceGuidCName.TokenC= Name have same token value as well. 
> > - # > > - if (Item.TokenSpaceGuidCName =3D=3D ItemNext.Toke= nSpaceGuidCName) and (Item.TokenCName =3D=3D ItemNext.TokenCName) and (int(= Item.TokenValue, 0) !=3D int(ItemNext.TokenValue, 0)): > > - EdkLogger.error( > > - 'build', > > - FORMAT_INVALID, > > - "The TokenValue [%s] of PCD [%s.%= s] in %s defined in two places should be same as well."\ > > - % (Item.TokenValue, Item.TokenSpa= ceGuidCName, Item.TokenCName, Package), > > - ExtraData=3DNone > > - ) > > - Count +=3D 1 > > - ## Generate fds command > > - @property > > - def GenFdsCommand(self): > > - return (GenMake.TopLevelMakefile(self)._TEMPLATE_.Replace(Gen= Make.TopLevelMakefile(self)._TemplateDict)).strip() > > - > > - @property > > - def GenFdsCommandDict(self): > > - FdsCommandDict =3D {} > > - LogLevel =3D EdkLogger.GetLevel() > > - if LogLevel =3D=3D EdkLogger.VERBOSE: > > - FdsCommandDict["verbose"] =3D True > > - elif LogLevel <=3D EdkLogger.DEBUG_9: > > - FdsCommandDict["debug"] =3D LogLevel - 1 > > - elif LogLevel =3D=3D EdkLogger.QUIET: > > - FdsCommandDict["quiet"] =3D True > > - > > - if GlobalData.gEnableGenfdsMultiThread: > > - FdsCommandDict["GenfdsMultiThread"] =3D True > > - if GlobalData.gIgnoreSource: > > - FdsCommandDict["IgnoreSources"] =3D True > > - > > - FdsCommandDict["OptionPcd"] =3D [] > > - for pcd in GlobalData.BuildOptionPcd: > > - if pcd[2]: > > - pcdname =3D '.'.join(pcd[0:3]) > > - else: > > - pcdname =3D '.'.join(pcd[0:2]) > > - if pcd[3].startswith('{'): > > - FdsCommandDict["OptionPcd"].append(pcdname + '=3D' + = 'H' + '"' + pcd[3] + '"') > > - else: > > - FdsCommandDict["OptionPcd"].append(pcdname + '=3D' + = pcd[3]) > > - > > - MacroList =3D [] > > - # macros passed to GenFds > > - MacroDict =3D {} > > - MacroDict.update(GlobalData.gGlobalDefines) > > - MacroDict.update(GlobalData.gCommandLineDefines) > > - for MacroName in MacroDict: > > - if MacroDict[MacroName] !=3D "": > > - MacroList.append('"%s=3D%s"' % (MacroName, MacroDict[= MacroName].replace('\\', '\\\\'))) > > - else: > > - MacroList.append('"%s"' % MacroName) > > - FdsCommandDict["macro"] =3D MacroList > > - > > - FdsCommandDict["fdf_file"] =3D [self.FdfFile] > > - FdsCommandDict["build_target"] =3D self.BuildTarget > > - FdsCommandDict["toolchain_tag"] =3D self.ToolChain > > - FdsCommandDict["active_platform"] =3D str(self) > > - > > - FdsCommandDict["conf_directory"] =3D GlobalData.gConfDirector= y > > - FdsCommandDict["build_architecture_list"] =3D ','.join(self.A= rchList) > > - FdsCommandDict["platform_build_directory"] =3D self.BuildDir > > - > > - FdsCommandDict["fd"] =3D self.FdTargetList > > - FdsCommandDict["fv"] =3D self.FvTargetList > > - FdsCommandDict["cap"] =3D self.CapTargetList > > - return FdsCommandDict > > - > > - ## Create makefile for the platform and modules in it > > - # > > - # @param CreateDepsMakeFile Flag indicating if the ma= kefile for > > - # modules will be created a= s well > > - # > > - def CreateMakeFile(self, CreateDepsMakeFile=3DFalse): > > - if not CreateDepsMakeFile: > > - return > > - for Pa in self.AutoGenObjectList: > > - Pa.CreateMakeFile(True) > > - > > - ## Create autogen code for platform and modules > > - # > > - # Since there's no autogen code for platform, this method will d= o nothing > > - # if CreateModuleCodeFile is set to False. 
> > - # > > - # @param CreateDepsCodeFile Flag indicating if creati= ng module's > > - # autogen code file or not > > - # > > - def CreateCodeFile(self, CreateDepsCodeFile=3DFalse): > > - if not CreateDepsCodeFile: > > - return > > - for Pa in self.AutoGenObjectList: > > - Pa.CreateCodeFile(True) > > - > > - ## Create AsBuilt INF file the platform > > - # > > - def CreateAsBuiltInf(self): > > - return > > - > > - > > -## AutoGen class for platform > > -# > > -# PlatformAutoGen class will process the original information in pla= tform > > -# file in order to generate makefile for platform. > > -# > > -class PlatformAutoGen(AutoGen): > > - # call super().__init__ then call the worker function with differ= ent parameter count > > - def __init__(self, Workspace, MetaFile, Target, Toolchain, Arch, = *args, **kwargs): > > - if not hasattr(self, "_Init"): > > - self._InitWorker(Workspace, MetaFile, Target, Toolchain, = Arch) > > - self._Init =3D True > > - # > > - # Used to store all PCDs for both PEI and DXE phase, in order to = generate > > - # correct PCD database > > - # > > - _DynaPcdList_ =3D [] > > - _NonDynaPcdList_ =3D [] > > - _PlatformPcds =3D {} > > - > > - # > > - # The priority list while override build option > > - # > > - PrioList =3D {"0x11111" : 16, # TARGET_TOOLCHAIN_ARCH_COMMA= NDTYPE_ATTRIBUTE (Highest) > > - "0x01111" : 15, # ******_TOOLCHAIN_ARCH_COMMAND= TYPE_ATTRIBUTE > > - "0x10111" : 14, # TARGET_*********_ARCH_COMMAND= TYPE_ATTRIBUTE > > - "0x00111" : 13, # ******_*********_ARCH_COMMAND= TYPE_ATTRIBUTE > > - "0x11011" : 12, # TARGET_TOOLCHAIN_****_COMMAND= TYPE_ATTRIBUTE > > - "0x01011" : 11, # ******_TOOLCHAIN_****_COMMAND= TYPE_ATTRIBUTE > > - "0x10011" : 10, # TARGET_*********_****_COMMAND= TYPE_ATTRIBUTE > > - "0x00011" : 9, # ******_*********_****_COMMAND= TYPE_ATTRIBUTE > > - "0x11101" : 8, # TARGET_TOOLCHAIN_ARCH_*******= ****_ATTRIBUTE > > - "0x01101" : 7, # ******_TOOLCHAIN_ARCH_*******= ****_ATTRIBUTE > > - "0x10101" : 6, # TARGET_*********_ARCH_*******= ****_ATTRIBUTE > > - "0x00101" : 5, # ******_*********_ARCH_*******= ****_ATTRIBUTE > > - "0x11001" : 4, # TARGET_TOOLCHAIN_****_*******= ****_ATTRIBUTE > > - "0x01001" : 3, # ******_TOOLCHAIN_****_*******= ****_ATTRIBUTE > > - "0x10001" : 2, # TARGET_*********_****_*******= ****_ATTRIBUTE > > - "0x00001" : 1} # ******_*********_****_*******= ****_ATTRIBUTE (Lowest) > > - > > - ## Initialize PlatformAutoGen > > - # > > - # > > - # @param Workspace WorkspaceAutoGen object > > - # @param PlatformFile Platform file (DSC file) > > - # @param Target Build target (DEBUG, RELEASE) > > - # @param Toolchain Name of tool chain > > - # @param Arch arch of the platform supports > > - # > > - def _InitWorker(self, Workspace, PlatformFile, Target, Toolchain,= Arch): > > - EdkLogger.debug(EdkLogger.DEBUG_9, "AutoGen platform [%s] [%s= ]" % (PlatformFile, Arch)) > > - GlobalData.gProcessingFile =3D "%s [%s, %s, %s]" % (PlatformF= ile, Arch, Toolchain, Target) > > - > > - self.MetaFile =3D PlatformFile > > - self.Workspace =3D Workspace > > - self.WorkspaceDir =3D Workspace.WorkspaceDir > > - self.ToolChain =3D Toolchain > > - self.BuildTarget =3D Target > > - self.Arch =3D Arch > > - self.SourceDir =3D PlatformFile.SubDir > > - self.FdTargetList =3D self.Workspace.FdTargetList > > - self.FvTargetList =3D self.Workspace.FvTargetList > > - # get the original module/package/platform objects > > - self.BuildDatabase =3D Workspace.BuildDatabase > > - self.DscBuildDataObj =3D Workspace.Platform > > - > > - # flag indicating if the 
makefile/C-code file has been create= d or not > > - self.IsMakeFileCreated =3D False > > - > > - self._DynamicPcdList =3D None # [(TokenCName1, TokenSpaceG= uidCName1), (TokenCName2, TokenSpaceGuidCName2), ...] > > - self._NonDynamicPcdList =3D None # [(TokenCName1, TokenSpaceG= uidCName1), (TokenCName2, TokenSpaceGuidCName2), ...] > > - > > - self._AsBuildInfList =3D [] > > - self._AsBuildModuleList =3D [] > > - > > - self.VariableInfo =3D None > > - > > - if GlobalData.gFdfParser is not None: > > - self._AsBuildInfList =3D GlobalData.gFdfParser.Profile.In= fList > > - for Inf in self._AsBuildInfList: > > - InfClass =3D PathClass(NormPath(Inf), GlobalData.gWor= kspace, self.Arch) > > - M =3D self.BuildDatabase[InfClass, self.Arch, self.Bu= ildTarget, self.ToolChain] > > - if not M.IsBinaryModule: > > - continue > > - self._AsBuildModuleList.append(InfClass) > > - # get library/modules for build > > - self.LibraryBuildDirectoryList =3D [] > > - self.ModuleBuildDirectoryList =3D [] > > - > > - return True > > - > > - ## hash() operator of PlatformAutoGen > > - # > > - # The platform file path and arch string will be used to represe= nt > > - # hash value of this object > > - # > > - # @retval int Hash value of the platform file path and arch > > - # > > - @cached_class_function > > - def __hash__(self): > > - return hash((self.MetaFile, self.Arch)) > > - > > - @cached_class_function > > - def __repr__(self): > > - return "%s [%s]" % (self.MetaFile, self.Arch) > > - > > - ## Create autogen code for platform and modules > > - # > > - # Since there's no autogen code for platform, this method will d= o nothing > > - # if CreateModuleCodeFile is set to False. > > - # > > - # @param CreateModuleCodeFile Flag indicating if creati= ng module's > > - # autogen code file or not > > - # > > - @cached_class_function > > - def CreateCodeFile(self, CreateModuleCodeFile=3DFalse): > > - # only module has code to be created, so do nothing if Create= ModuleCodeFile is False > > - if not CreateModuleCodeFile: > > - return > > - > > - for Ma in self.ModuleAutoGenList: > > - Ma.CreateCodeFile(True) > > - > > - ## Generate Fds Command > > - @cached_property > > - def GenFdsCommand(self): > > - return self.Workspace.GenFdsCommand > > - > > - ## Create makefile for the platform and modules in it > > - # > > - # @param CreateModuleMakeFile Flag indicating if the ma= kefile for > > - # modules will be created a= s well > > - # > > - def CreateMakeFile(self, CreateModuleMakeFile=3DFalse, FfsCommand= =3D {}): > > - if CreateModuleMakeFile: > > - for Ma in self._MaList: > > - key =3D (Ma.MetaFile.File, self.Arch) > > - if key in FfsCommand: > > - Ma.CreateMakeFile(True, FfsCommand[key]) > > - else: > > - Ma.CreateMakeFile(True) > > - > > - # no need to create makefile for the platform more than once > > - if self.IsMakeFileCreated: > > - return > > - > > - # create library/module build dirs for platform > > - Makefile =3D GenMake.PlatformMakefile(self) > > - self.LibraryBuildDirectoryList =3D Makefile.GetLibraryBuildDi= rectoryList() > > - self.ModuleBuildDirectoryList =3D Makefile.GetModuleBuildDire= ctoryList() > > - > > - self.IsMakeFileCreated =3D True > > - > > - @property > > - def AllPcdList(self): > > - return self.DynamicPcdList + self.NonDynamicPcdList > > - ## Deal with Shared FixedAtBuild Pcds > > - # > > - def CollectFixedAtBuildPcds(self): > > - for LibAuto in self.LibraryAutoGenList: > > - FixedAtBuildPcds =3D {} > > - ShareFixedAtBuildPcdsSameValue =3D {} > > - for Module in LibAuto.ReferenceModules: > 
> - for Pcd in set(Module.FixedAtBuildPcds + LibAuto.Fixe= dAtBuildPcds): > > - DefaultValue =3D Pcd.DefaultValue > > - # Cover the case: DSC component override the Pcd = value and the Pcd only used in one Lib > > - if Pcd in Module.LibraryPcdList: > > - Index =3D Module.LibraryPcdList.index(Pcd) > > - DefaultValue =3D Module.LibraryPcdList[Index]= .DefaultValue > > - key =3D ".".join((Pcd.TokenSpaceGuidCName, Pcd.To= kenCName)) > > - if key not in FixedAtBuildPcds: > > - ShareFixedAtBuildPcdsSameValue[key] =3D True > > - FixedAtBuildPcds[key] =3D DefaultValue > > - else: > > - if FixedAtBuildPcds[key] !=3D DefaultValue: > > - ShareFixedAtBuildPcdsSameValue[key] =3D F= alse > > - for Pcd in LibAuto.FixedAtBuildPcds: > > - key =3D ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenC= Name)) > > - if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) not in s= elf.NonDynamicPcdDict: > > - continue > > - else: > > - DscPcd =3D self.NonDynamicPcdDict[(Pcd.TokenCName= , Pcd.TokenSpaceGuidCName)] > > - if DscPcd.Type !=3D TAB_PCDS_FIXED_AT_BUILD: > > - continue > > - if key in ShareFixedAtBuildPcdsSameValue and ShareFix= edAtBuildPcdsSameValue[key]: > > - LibAuto.ConstPcd[key] =3D FixedAtBuildPcds[key] > > - > > - def CollectVariables(self, DynamicPcdSet): > > - VpdRegionSize =3D 0 > > - VpdRegionBase =3D 0 > > - if self.Workspace.FdfFile: > > - FdDict =3D self.Workspace.FdfProfile.FdDict[GlobalData.gF= dfParser.CurrentFdName] > > - for FdRegion in FdDict.RegionList: > > - for item in FdRegion.RegionDataList: > > - if self.Platform.VpdToolGuid.strip() and self.Pla= tform.VpdToolGuid in item: > > - VpdRegionSize =3D FdRegion.Size > > - VpdRegionBase =3D FdRegion.Offset > > - break > > - > > - VariableInfo =3D VariableMgr(self.DscBuildDataObj._GetDefault= Stores(), self.DscBuildDataObj.SkuIds) > > - VariableInfo.SetVpdRegionMaxSize(VpdRegionSize) > > - VariableInfo.SetVpdRegionOffset(VpdRegionBase) > > - Index =3D 0 > > - for Pcd in DynamicPcdSet: > > - pcdname =3D ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenC= Name)) > > - for SkuName in Pcd.SkuInfoList: > > - Sku =3D Pcd.SkuInfoList[SkuName] > > - SkuId =3D Sku.SkuId > > - if SkuId is None or SkuId =3D=3D '': > > - continue > > - if len(Sku.VariableName) > 0: > > - if Sku.VariableAttribute and 'NV' not in Sku.Vari= ableAttribute: > > - continue > > - VariableGuidStructure =3D Sku.VariableGuidValue > > - VariableGuid =3D GuidStructureStringToGuidString(= VariableGuidStructure) > > - for StorageName in Sku.DefaultStoreDict: > > - VariableInfo.append_variable(var_info(Index, = pcdname, StorageName, SkuName, StringToArray(Sku.VariableName), VariableGui= d, Sku.VariableOffset, Sku.VariableAttribute, Sku.HiiDefaultValue, Sku.Defa= ultStoreDict[StorageName] if Pcd.DatumType in TAB_PCD_NUMERIC_TYPES else St= ringToArray(Sku.DefaultStoreDict[StorageName]), Pcd.DatumType, Pcd.CustomAt= tribute['DscPosition'], Pcd.CustomAttribute.get('IsStru',False))) > > - Index +=3D 1 > > - return VariableInfo > > - > > - def UpdateNVStoreMaxSize(self, OrgVpdFile): > > - if self.VariableInfo: > > - VpdMapFilePath =3D os.path.join(self.BuildDir, TAB_FV_DIR= ECTORY, "%s.map" % self.Platform.VpdToolGuid) > > - PcdNvStoreDfBuffer =3D [item for item in self._DynamicPcd= List if item.TokenCName =3D=3D "PcdNvStoreDefaultValueBuffer" and item.Toke= nSpaceGuidCName =3D=3D "gEfiMdeModulePkgTokenSpaceGuid"] > > - > > - if PcdNvStoreDfBuffer: > > - if os.path.exists(VpdMapFilePath): > > - OrgVpdFile.Read(VpdMapFilePath) > > - PcdItems =3D OrgVpdFile.GetOffset(PcdNvStoreDfBuf= fer[0]) > > - 
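(Side note while reading the quoted hunk, not a review finding.) If I read CollectVariables() above correctly, a SKU only contributes a var_info entry when it actually names a UEFI variable, has a usable SkuId, and is not marked volatile. A rough sketch of that filter, with attribute names taken from the quoted code but otherwise illustrative, not the BaseTools implementation:

    # Sketch only: which SKUs of a dynamic PCD end up in the variable database.
    def nv_backed_skus(sku_info_list):
        for sku_name, sku in sku_info_list.items():
            if sku.SkuId is None or sku.SkuId == '':
                continue                                  # no SKU id -> skip
            if not sku.VariableName:
                continue                                  # not an HII/variable-backed PCD
            if sku.VariableAttribute and 'NV' not in sku.VariableAttribute:
                continue                                  # volatile variable -> skip
            yield sku_name, sku
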
NvStoreOffset =3D list(PcdItems.values())[0].stri= p() if PcdItems else '0' > > - else: > > - EdkLogger.error("build", FILE_READ_FAILURE, "Can = not find VPD map file %s to fix up VPD offset." % VpdMapFilePath) > > - > > - NvStoreOffset =3D int(NvStoreOffset, 16) if NvStoreOf= fset.upper().startswith("0X") else int(NvStoreOffset) > > - default_skuobj =3D PcdNvStoreDfBuffer[0].SkuInfoList.= get(TAB_DEFAULT) > > - maxsize =3D self.VariableInfo.VpdRegionSize - NvStor= eOffset if self.VariableInfo.VpdRegionSize else len(default_skuobj.DefaultV= alue.split(",")) > > - var_data =3D self.VariableInfo.PatchNVStoreDefaultMax= Size(maxsize) > > - > > - if var_data and default_skuobj: > > - default_skuobj.DefaultValue =3D var_data > > - PcdNvStoreDfBuffer[0].DefaultValue =3D var_data > > - PcdNvStoreDfBuffer[0].SkuInfoList.clear() > > - PcdNvStoreDfBuffer[0].SkuInfoList[TAB_DEFAULT] = =3D default_skuobj > > - PcdNvStoreDfBuffer[0].MaxDatumSize =3D str(len(de= fault_skuobj.DefaultValue.split(","))) > > - > > - return OrgVpdFile > > - > > - ## Collect dynamic PCDs > > - # > > - # Gather dynamic PCDs list from each module and their settings f= rom platform > > - # This interface should be invoked explicitly when platform acti= on is created. > > - # > > - def CollectPlatformDynamicPcds(self): > > - for key in self.Platform.Pcds: > > - for SinglePcd in GlobalData.MixedPcd: > > - if (self.Platform.Pcds[key].TokenCName, self.Platform= .Pcds[key].TokenSpaceGuidCName) =3D=3D SinglePcd: > > - for item in GlobalData.MixedPcd[SinglePcd]: > > - Pcd_Type =3D item[0].split('_')[-1] > > - if (Pcd_Type =3D=3D self.Platform.Pcds[key].T= ype) or (Pcd_Type =3D=3D TAB_PCDS_DYNAMIC_EX and self.Platform.Pcds[key].Ty= pe in PCD_DYNAMIC_EX_TYPE_SET) or \ > > - (Pcd_Type =3D=3D TAB_PCDS_DYNAMIC and self= .Platform.Pcds[key].Type in PCD_DYNAMIC_TYPE_SET): > > - Value =3D self.Platform.Pcds[key] > > - Value.TokenCName =3D self.Platform.Pcds[k= ey].TokenCName + '_' + Pcd_Type > > - if len(key) =3D=3D 2: > > - newkey =3D (Value.TokenCName, key[1]) > > - elif len(key) =3D=3D 3: > > - newkey =3D (Value.TokenCName, key[1],= key[2]) > > - del self.Platform.Pcds[key] > > - self.Platform.Pcds[newkey] =3D Value > > - break > > - break > > - > > - # for gathering error information > > - NoDatumTypePcdList =3D set() > > - FdfModuleList =3D [] > > - for InfName in self._AsBuildInfList: > > - InfName =3D mws.join(self.WorkspaceDir, InfName) > > - FdfModuleList.append(os.path.normpath(InfName)) > > - for M in self._MaList: > > -# F is the Module for which M is the module autogen > > - for PcdFromModule in M.ModulePcdList + M.LibraryPcdList: > > - # make sure that the "VOID*" kind of datum has MaxDat= umSize set > > - if PcdFromModule.DatumType =3D=3D TAB_VOID and not Pc= dFromModule.MaxDatumSize: > > - NoDatumTypePcdList.add("%s.%s [%s]" % (PcdFromMod= ule.TokenSpaceGuidCName, PcdFromModule.TokenCName, M.MetaFile)) > > - > > - # Check the PCD from Binary INF or Source INF > > - if M.IsBinaryModule =3D=3D True: > > - PcdFromModule.IsFromBinaryInf =3D True > > - > > - # Check the PCD from DSC or not > > - PcdFromModule.IsFromDsc =3D (PcdFromModule.TokenCName= , PcdFromModule.TokenSpaceGuidCName) in self.Platform.Pcds > > - > > - if PcdFromModule.Type in PCD_DYNAMIC_TYPE_SET or PcdF= romModule.Type in PCD_DYNAMIC_EX_TYPE_SET: > > - if M.MetaFile.Path not in FdfModuleList: > > - # If one of the Source built modules listed i= n the DSC is not listed > > - # in FDF modules, and the INF lists a PCD can= only use the PcdsDynamic > > - # access 
method (it is only listed in the DEC= file that declares the > > - # PCD as PcdsDynamic), then build tool will r= eport warning message > > - # notify the PI that they are attempting to b= uild a module that must > > - # be included in a flash image in order to be= functional. These Dynamic > > - # PCD will not be added into the Database unl= ess it is used by other > > - # modules that are included in the FDF file. > > - if PcdFromModule.Type in PCD_DYNAMIC_TYPE_SET= and \ > > - PcdFromModule.IsFromBinaryInf =3D=3D Fals= e: > > - # Print warning message to let the develo= per make a determine. > > - continue > > - # If one of the Source built modules listed i= n the DSC is not listed in > > - # FDF modules, and the INF lists a PCD can on= ly use the PcdsDynamicEx > > - # access method (it is only listed in the DEC= file that declares the > > - # PCD as PcdsDynamicEx), then DO NOT break th= e build; DO NOT add the > > - # PCD to the Platform's PCD Database. > > - if PcdFromModule.Type in PCD_DYNAMIC_EX_TYPE_= SET: > > - continue > > - # > > - # If a dynamic PCD used by a PEM module/PEI modul= e & DXE module, > > - # it should be stored in Pcd PEI database, If a d= ynamic only > > - # used by DXE module, it should be stored in DXE = PCD database. > > - # The default Phase is DXE > > - # > > - if M.ModuleType in SUP_MODULE_SET_PEI: > > - PcdFromModule.Phase =3D "PEI" > > - if PcdFromModule not in self._DynaPcdList_: > > - self._DynaPcdList_.append(PcdFromModule) > > - elif PcdFromModule.Phase =3D=3D 'PEI': > > - # overwrite any the same PCD existing, if Pha= se is PEI > > - Index =3D self._DynaPcdList_.index(PcdFromMod= ule) > > - self._DynaPcdList_[Index] =3D PcdFromModule > > - elif PcdFromModule not in self._NonDynaPcdList_: > > - self._NonDynaPcdList_.append(PcdFromModule) > > - elif PcdFromModule in self._NonDynaPcdList_ and PcdFr= omModule.IsFromBinaryInf =3D=3D True: > > - Index =3D self._NonDynaPcdList_.index(PcdFromModu= le) > > - if self._NonDynaPcdList_[Index].IsFromBinaryInf = =3D=3D False: > > - #The PCD from Binary INF will override the sa= me one from source INF > > - self._NonDynaPcdList_.remove (self._NonDynaPc= dList_[Index]) > > - PcdFromModule.Pending =3D False > > - self._NonDynaPcdList_.append (PcdFromModule) > > - DscModuleSet =3D {os.path.normpath(ModuleInf.Path) for Module= Inf in self.Platform.Modules} > > - # add the PCD from modules that listed in FDF but not in DSC = to Database > > - for InfName in FdfModuleList: > > - if InfName not in DscModuleSet: > > - InfClass =3D PathClass(InfName) > > - M =3D self.BuildDatabase[InfClass, self.Arch, self.Bu= ildTarget, self.ToolChain] > > - # If a module INF in FDF but not in current arch's DS= C module list, it must be module (either binary or source) > > - # for different Arch. PCDs in source module for diffe= rent Arch is already added before, so skip the source module here. > > - # For binary module, if in current arch, we need to l= ist the PCDs into database. 
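To make sure I follow the rule set above for modules that are not listed in the FDF: a source-only Dynamic PCD is dropped (warning case), a DynamicEx PCD is dropped silently, and anything coming from a PEI-class module is promoted to the PEI phase. A compact restatement of the "keep or skip" decision (type names are illustrative, not the real TAB_* constants):

    # Sketch only: should a dynamic PCD from a module that is NOT listed
    # in the FDF be added to the platform PCD database?
    DYNAMIC_TYPES    = {'Dynamic', 'DynamicDefault', 'DynamicHii', 'DynamicVpd'}
    DYNAMIC_EX_TYPES = {'DynamicEx', 'DynamicExDefault', 'DynamicExHii', 'DynamicExVpd'}

    def keep_in_database(pcd_type, is_from_binary_inf):
        if pcd_type in DYNAMIC_TYPES and not is_from_binary_inf:
            return False   # source-only Dynamic PCD: warn and skip
        if pcd_type in DYNAMIC_EX_TYPES:
            return False   # DynamicEx: do not break the build, just skip
        return True
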
> > - if not M.IsBinaryModule: > > - continue > > - # Override the module PCD setting by platform setting > > - ModulePcdList =3D self.ApplyPcdSetting(M, M.Pcds) > > - for PcdFromModule in ModulePcdList: > > - PcdFromModule.IsFromBinaryInf =3D True > > - PcdFromModule.IsFromDsc =3D False > > - # Only allow the DynamicEx and Patchable PCD in A= sBuild INF > > - if PcdFromModule.Type not in PCD_DYNAMIC_EX_TYPE_= SET and PcdFromModule.Type not in TAB_PCDS_PATCHABLE_IN_MODULE: > > - EdkLogger.error("build", AUTOGEN_ERROR, "PCD = setting error", > > - File=3Dself.MetaFile, > > - ExtraData=3D"\n\tExisted %s P= CD %s in:\n\t\t%s\n" > > - % (PcdFromModule.Type, PcdFro= mModule.TokenCName, InfName)) > > - # make sure that the "VOID*" kind of datum has Ma= xDatumSize set > > - if PcdFromModule.DatumType =3D=3D TAB_VOID and no= t PcdFromModule.MaxDatumSize: > > - NoDatumTypePcdList.add("%s.%s [%s]" % (PcdFro= mModule.TokenSpaceGuidCName, PcdFromModule.TokenCName, InfName)) > > - if M.ModuleType in SUP_MODULE_SET_PEI: > > - PcdFromModule.Phase =3D "PEI" > > - if PcdFromModule not in self._DynaPcdList_ and Pc= dFromModule.Type in PCD_DYNAMIC_EX_TYPE_SET: > > - self._DynaPcdList_.append(PcdFromModule) > > - elif PcdFromModule not in self._NonDynaPcdList_ a= nd PcdFromModule.Type in TAB_PCDS_PATCHABLE_IN_MODULE: > > - self._NonDynaPcdList_.append(PcdFromModule) > > - if PcdFromModule in self._DynaPcdList_ and PcdFro= mModule.Phase =3D=3D 'PEI' and PcdFromModule.Type in PCD_DYNAMIC_EX_TYPE_SE= T: > > - # Overwrite the phase of any the same PCD exi= sting, if Phase is PEI. > > - # It is to solve the case that a dynamic PCD = used by a PEM module/PEI > > - # module & DXE module at a same time. > > - # Overwrite the type of the PCDs in source IN= F by the type of AsBuild > > - # INF file as DynamicEx. > > - Index =3D self._DynaPcdList_.index(PcdFromMod= ule) > > - self._DynaPcdList_[Index].Phase =3D PcdFromMo= dule.Phase > > - self._DynaPcdList_[Index].Type =3D PcdFromMod= ule.Type > > - for PcdFromModule in self._NonDynaPcdList_: > > - # If a PCD is not listed in the DSC file, but binary INF = files used by > > - # this platform all (that use this PCD) list the PCD in a= [PatchPcds] > > - # section, AND all source INF files used by this platform= the build > > - # that use the PCD list the PCD in either a [Pcds] or [Pa= tchPcds] > > - # section, then the tools must NOT add the PCD to the Pla= tform's PCD > > - # Database; the build must assign the access method for t= his PCD as > > - # PcdsPatchableInModule. 
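The as-built (binary) INF path above is stricter: only DynamicEx and PatchableInModule PCDs are accepted, and anything else stops the build with an AUTOGEN_ERROR. Roughly, under the same naming assumptions as the previous sketch:

    # Sketch only: mirrors the check on PCDs coming from a binary INF that
    # is listed in the FDF but not in the DSC (type names illustrative).
    ALLOWED_IN_ASBUILT_INF = {'DynamicEx', 'PatchableInModule'}

    def check_asbuilt_pcd(pcd_type, token_cname, inf_name):
        if pcd_type not in ALLOWED_IN_ASBUILT_INF:
            raise ValueError('PCD %s of type %s is not allowed in as-built INF %s'
                             % (token_cname, pcd_type, inf_name))
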
> > - if PcdFromModule not in self._DynaPcdList_: > > - continue > > - Index =3D self._DynaPcdList_.index(PcdFromModule) > > - if PcdFromModule.IsFromDsc =3D=3D False and \ > > - PcdFromModule.Type in TAB_PCDS_PATCHABLE_IN_MODULE an= d \ > > - PcdFromModule.IsFromBinaryInf =3D=3D True and \ > > - self._DynaPcdList_[Index].IsFromBinaryInf =3D=3D Fals= e: > > - Index =3D self._DynaPcdList_.index(PcdFromModule) > > - self._DynaPcdList_.remove (self._DynaPcdList_[Index]) > > - > > - # print out error information and break the build, if error f= ound > > - if len(NoDatumTypePcdList) > 0: > > - NoDatumTypePcdListString =3D "\n\t\t".join(NoDatumTypePcd= List) > > - EdkLogger.error("build", AUTOGEN_ERROR, "PCD setting erro= r", > > - File=3Dself.MetaFile, > > - ExtraData=3D"\n\tPCD(s) without MaxDatumS= ize:\n\t\t%s\n" > > - % NoDatumTypePcdListString) > > - self._NonDynamicPcdList =3D self._NonDynaPcdList_ > > - self._DynamicPcdList =3D self._DynaPcdList_ > > - # > > - # Sort dynamic PCD list to: > > - # 1) If PCD's datum type is VOID* and value is unicode string= which starts with L, the PCD item should > > - # try to be put header of dynamicd List > > - # 2) If PCD is HII type, the PCD item should be put after uni= code type PCD > > - # > > - # The reason of sorting is make sure the unicode string is in= double-byte alignment in string table. > > - # > > - UnicodePcdArray =3D set() > > - HiiPcdArray =3D set() > > - OtherPcdArray =3D set() > > - VpdPcdDict =3D {} > > - VpdFile =3D VpdInfoFile.VpdInfoFile() > > - NeedProcessVpdMapFile =3D False > > - > > - for pcd in self.Platform.Pcds: > > - if pcd not in self._PlatformPcds: > > - self._PlatformPcds[pcd] =3D self.Platform.Pcds[pcd] > > - > > - for item in self._PlatformPcds: > > - if self._PlatformPcds[item].DatumType and self._PlatformP= cds[item].DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, = TAB_VOID, "BOOLEAN"]: > > - self._PlatformPcds[item].DatumType =3D TAB_VOID > > - > > - if (self.Workspace.ArchList[-1] =3D=3D self.Arch): > > - for Pcd in self._DynamicPcdList: > > - # just pick the a value to determine whether is unico= de string type > > - Sku =3D Pcd.SkuInfoList.get(TAB_DEFAULT) > > - Sku.VpdOffset =3D Sku.VpdOffset.strip() > > - > > - if Pcd.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_U= INT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]: > > - Pcd.DatumType =3D TAB_VOID > > - > > - # if found PCD which datum value is unicode strin= g the insert to left size of UnicodeIndex > > - # if found HII type PCD then insert to right of U= nicodeIndex > > - if Pcd.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMI= C_EX_VPD]: > > - VpdPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCNa= me)] =3D Pcd > > - > > - #Collect DynamicHii PCD values and assign it to DynamicEx= Vpd PCD gEfiMdeModulePkgTokenSpaceGuid.PcdNvStoreDefaultValueBuffer > > - PcdNvStoreDfBuffer =3D VpdPcdDict.get(("PcdNvStoreDefault= ValueBuffer", "gEfiMdeModulePkgTokenSpaceGuid")) > > - if PcdNvStoreDfBuffer: > > - self.VariableInfo =3D self.CollectVariables(self._Dyn= amicPcdList) > > - vardump =3D self.VariableInfo.dump() > > - if vardump: > > - # > > - #According to PCD_DATABASE_INIT in edk2\MdeModule= Pkg\Include\Guid\PcdDataBaseSignatureGuid.h, > > - #the max size for string PCD should not exceed US= HRT_MAX 65535(0xffff). > > - #typedef UINT16 SIZE_INFO; > > - #//SIZE_INFO SizeTable[]; > > - if len(vardump.split(",")) > 0xffff: > > - EdkLogger.error("build", RESOURCE_OVERFLOW, '= The current length of PCD %s value is %d, it exceeds to the max size of Str= ing PCD.' 
%(".".join([PcdNvStoreDfBuffer.TokenSpaceGuidCName,PcdNvStoreDfBu= ffer.TokenCName]) ,len(vardump.split(",")))) > > - PcdNvStoreDfBuffer.DefaultValue =3D vardump > > - for skuname in PcdNvStoreDfBuffer.SkuInfoList: > > - PcdNvStoreDfBuffer.SkuInfoList[skuname].Defau= ltValue =3D vardump > > - PcdNvStoreDfBuffer.MaxDatumSize =3D str(len(v= ardump.split(","))) > > - else: > > - #If the end user define [DefaultStores] and [XXX.Menu= facturing] in DSC, but forget to configure PcdNvStoreDefaultValueBuffer to = PcdsDynamicVpd > > - if [Pcd for Pcd in self._DynamicPcdList if Pcd.UserDe= finedDefaultStoresFlag]: > > - EdkLogger.warn("build", "PcdNvStoreDefaultValueBu= ffer should be defined as PcdsDynamicExVpd in dsc file since the DefaultSto= res is enabled for this platform.\n%s" %self.Platform.MetaFile.Path) > > - PlatformPcds =3D sorted(self._PlatformPcds.keys()) > > - # > > - # Add VPD type PCD into VpdFile and determine whether the= VPD PCD need to be fixed up. > > - # > > - VpdSkuMap =3D {} > > - for PcdKey in PlatformPcds: > > - Pcd =3D self._PlatformPcds[PcdKey] > > - if Pcd.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMI= C_EX_VPD] and \ > > - PcdKey in VpdPcdDict: > > - Pcd =3D VpdPcdDict[PcdKey] > > - SkuValueMap =3D {} > > - DefaultSku =3D Pcd.SkuInfoList.get(TAB_DEFAULT) > > - if DefaultSku: > > - PcdValue =3D DefaultSku.DefaultValue > > - if PcdValue not in SkuValueMap: > > - SkuValueMap[PcdValue] =3D [] > > - VpdFile.Add(Pcd, TAB_DEFAULT, DefaultSku.= VpdOffset) > > - SkuValueMap[PcdValue].append(DefaultSku) > > - > > - for (SkuName, Sku) in Pcd.SkuInfoList.items(): > > - Sku.VpdOffset =3D Sku.VpdOffset.strip() > > - PcdValue =3D Sku.DefaultValue > > - if PcdValue =3D=3D "": > > - PcdValue =3D Pcd.DefaultValue > > - if Sku.VpdOffset !=3D TAB_STAR: > > - if PcdValue.startswith("{"): > > - Alignment =3D 8 > > - elif PcdValue.startswith("L"): > > - Alignment =3D 2 > > - else: > > - Alignment =3D 1 > > - try: > > - VpdOffset =3D int(Sku.VpdOffset) > > - except: > > - try: > > - VpdOffset =3D int(Sku.VpdOffset, = 16) > > - except: > > - EdkLogger.error("build", FORMAT_I= NVALID, "Invalid offset value %s for PCD %s.%s." % (Sku.VpdOffset, Pcd.Toke= nSpaceGuidCName, Pcd.TokenCName)) > > - if VpdOffset % Alignment !=3D 0: > > - if PcdValue.startswith("{"): > > - EdkLogger.warn("build", "The offs= et value of PCD %s.%s is not 8-byte aligned!" %(Pcd.TokenSpaceGuidCName, Pc= d.TokenCName), File=3Dself.MetaFile) > > - else: > > - EdkLogger.error("build", FORMAT_I= NVALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' % (Pcd.T= okenSpaceGuidCName, Pcd.TokenCName, Alignment)) > > - if PcdValue not in SkuValueMap: > > - SkuValueMap[PcdValue] =3D [] > > - VpdFile.Add(Pcd, SkuName, Sku.VpdOffset) > > - SkuValueMap[PcdValue].append(Sku) > > - # if the offset of a VPD is *, then it need t= o be fixed up by third party tool. > > - if not NeedProcessVpdMapFile and Sku.VpdOffse= t =3D=3D TAB_STAR: > > - NeedProcessVpdMapFile =3D True > > - if self.Platform.VpdToolGuid is None or s= elf.Platform.VpdToolGuid =3D=3D '': > > - EdkLogger.error("Build", FILE_NOT_FOU= ND, \ > > - "Fail to find third-p= arty BPDG tool to process VPD PCDs. BPDG Guid tool need to be defined in to= ols_def.txt and VPD_TOOL_GUID need to be provided in DSC file.") > > - > > - VpdSkuMap[PcdKey] =3D SkuValueMap > > - # > > - # Fix the PCDs define in VPD PCD section that never refer= enced by module. > > - # An example is PCD for signature usage. 
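The VPD bookkeeping above follows three small rules worth spelling out: a value written as a byte array ('{...}') is expected to be 8-byte aligned (misalignment only warns), an L'' string must be 2-byte aligned, everything else 1-byte; the offset itself may be decimal or hex; and an offset of '*' just flags the PCD for the external BPDG tool to place later. A sketch under those assumptions, not the BaseTools code itself:

    # Sketch only: alignment and offset handling for VPD PCDs.
    def vpd_alignment(pcd_value):
        if pcd_value.startswith('{'):
            return 8          # byte array: misalignment is only a warning
        if pcd_value.startswith('L'):
            return 2          # UCS-2 string
        return 1

    def parse_vpd_offset(text):
        if text == '*':
            return None       # defer to the BPDG tool
        try:
            return int(text)  # decimal
        except ValueError:
            return int(text, 16)   # hex such as '0x1000'; anything else raises

    # e.g. parse_vpd_offset('0x7') % vpd_alignment('L"abc"') != 0  -> rejected
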
> > - # > > - for DscPcd in PlatformPcds: > > - DscPcdEntry =3D self._PlatformPcds[DscPcd] > > - if DscPcdEntry.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCD= S_DYNAMIC_EX_VPD]: > > - if not (self.Platform.VpdToolGuid is None or self= .Platform.VpdToolGuid =3D=3D ''): > > - FoundFlag =3D False > > - for VpdPcd in VpdFile._VpdArray: > > - # This PCD has been referenced by module > > - if (VpdPcd.TokenSpaceGuidCName =3D=3D Dsc= PcdEntry.TokenSpaceGuidCName) and \ > > - (VpdPcd.TokenCName =3D=3D DscPcdEntry.= TokenCName): > > - FoundFlag =3D True > > - > > - # Not found, it should be signature > > - if not FoundFlag : > > - # just pick the a value to determine whet= her is unicode string type > > - SkuValueMap =3D {} > > - SkuObjList =3D list(DscPcdEntry.SkuInfoLi= st.items()) > > - DefaultSku =3D DscPcdEntry.SkuInfoList.ge= t(TAB_DEFAULT) > > - if DefaultSku: > > - defaultindex =3D SkuObjList.index((TA= B_DEFAULT, DefaultSku)) > > - SkuObjList[0], SkuObjList[defaultinde= x] =3D SkuObjList[defaultindex], SkuObjList[0] > > - for (SkuName, Sku) in SkuObjList: > > - Sku.VpdOffset =3D Sku.VpdOffset.strip= () > > - > > - # Need to iterate DEC pcd information= to get the value & datumtype > > - for eachDec in self.PackageList: > > - for DecPcd in eachDec.Pcds: > > - DecPcdEntry =3D eachDec.Pcds[= DecPcd] > > - if (DecPcdEntry.TokenSpaceGui= dCName =3D=3D DscPcdEntry.TokenSpaceGuidCName) and \ > > - (DecPcdEntry.TokenCName = =3D=3D DscPcdEntry.TokenCName): > > - # Print warning message t= o let the developer make a determine. > > - EdkLogger.warn("build", "= Unreferenced vpd pcd used!", > > - File=3Dse= lf.MetaFile, \ > > - ExtraData= =3D "PCD: %s.%s used in the DSC file %s is unreferenced." \ > > - %(DscPcdE= ntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName, self.Platform.MetaFile.Pa= th)) > > - > > - DscPcdEntry.DatumType = = =3D DecPcdEntry.DatumType > > - DscPcdEntry.DefaultValue = = =3D DecPcdEntry.DefaultValue > > - DscPcdEntry.TokenValue = =3D DecPcdEntry.TokenValue > > - DscPcdEntry.TokenSpaceGui= dValue =3D eachDec.Guids[DecPcdEntry.TokenSpaceGuidCName] > > - # Only fix the value whil= e no value provided in DSC file. > > - if not Sku.DefaultValue: > > - DscPcdEntry.SkuInfoLi= st[list(DscPcdEntry.SkuInfoList.keys())[0]].DefaultValue =3D DecPcdEntry.De= faultValue > > - > > - if DscPcdEntry not in self._DynamicPc= dList: > > - self._DynamicPcdList.append(DscPc= dEntry) > > - Sku.VpdOffset =3D Sku.VpdOffset.strip= () > > - PcdValue =3D Sku.DefaultValue > > - if PcdValue =3D=3D "": > > - PcdValue =3D DscPcdEntry.Default= Value > > - if Sku.VpdOffset !=3D TAB_STAR: > > - if PcdValue.startswith("{"): > > - Alignment =3D 8 > > - elif PcdValue.startswith("L"): > > - Alignment =3D 2 > > - else: > > - Alignment =3D 1 > > - try: > > - VpdOffset =3D int(Sku.VpdOffs= et) > > - except: > > - try: > > - VpdOffset =3D int(Sku.Vpd= Offset, 16) > > - except: > > - EdkLogger.error("build", = FORMAT_INVALID, "Invalid offset value %s for PCD %s.%s." % (Sku.VpdOffset, = DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName)) > > - if VpdOffset % Alignment !=3D 0: > > - if PcdValue.startswith("{"): > > - EdkLogger.warn("build", "= The offset value of PCD %s.%s is not 8-byte aligned!" %(DscPcdEntry.TokenSp= aceGuidCName, DscPcdEntry.TokenCName), File=3Dself.MetaFile) > > - else: > > - EdkLogger.error("build", = FORMAT_INVALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' 
= % (DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName, Alignment)) > > - if PcdValue not in SkuValueMap: > > - SkuValueMap[PcdValue] =3D [] > > - VpdFile.Add(DscPcdEntry, SkuName,= Sku.VpdOffset) > > - SkuValueMap[PcdValue].append(Sku) > > - if not NeedProcessVpdMapFile and Sku.= VpdOffset =3D=3D TAB_STAR: > > - NeedProcessVpdMapFile =3D True > > - if DscPcdEntry.DatumType =3D=3D TAB_VOID = and PcdValue.startswith("L"): > > - UnicodePcdArray.add(DscPcdEntry) > > - elif len(Sku.VariableName) > 0: > > - HiiPcdArray.add(DscPcdEntry) > > - else: > > - OtherPcdArray.add(DscPcdEntry) > > - > > - # if the offset of a VPD is *, then i= t need to be fixed up by third party tool. > > - VpdSkuMap[DscPcd] =3D SkuValueMap > > - if (self.Platform.FlashDefinition is None or self.Platfor= m.FlashDefinition =3D=3D '') and \ > > - VpdFile.GetCount() !=3D 0: > > - EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE, > > - "Fail to get FLASH_DEFINITION definit= ion in DSC file %s which is required when DSC contains VPD PCD." % str(self= .Platform.MetaFile)) > > - > > - if VpdFile.GetCount() !=3D 0: > > - > > - self.FixVpdOffset(VpdFile) > > - > > - self.FixVpdOffset(self.UpdateNVStoreMaxSize(VpdFile)) > > - PcdNvStoreDfBuffer =3D [item for item in self._Dynami= cPcdList if item.TokenCName =3D=3D "PcdNvStoreDefaultValueBuffer" and item.= TokenSpaceGuidCName =3D=3D "gEfiMdeModulePkgTokenSpaceGuid"] > > - if PcdNvStoreDfBuffer: > > - PcdName,PcdGuid =3D PcdNvStoreDfBuffer[0].TokenCN= ame, PcdNvStoreDfBuffer[0].TokenSpaceGuidCName > > - if (PcdName,PcdGuid) in VpdSkuMap: > > - DefaultSku =3D PcdNvStoreDfBuffer[0].SkuInfoL= ist.get(TAB_DEFAULT) > > - VpdSkuMap[(PcdName,PcdGuid)] =3D {DefaultSku.= DefaultValue:[SkuObj for SkuObj in PcdNvStoreDfBuffer[0].SkuInfoList.values= () ]} > > - > > - # Process VPD map file generated by third party BPDG = tool > > - if NeedProcessVpdMapFile: > > - VpdMapFilePath =3D os.path.join(self.BuildDir, TA= B_FV_DIRECTORY, "%s.map" % self.Platform.VpdToolGuid) > > - if os.path.exists(VpdMapFilePath): > > - VpdFile.Read(VpdMapFilePath) > > - > > - # Fixup TAB_STAR offset > > - for pcd in VpdSkuMap: > > - vpdinfo =3D VpdFile.GetVpdInfo(pcd) > > - if vpdinfo is None: > > - # just pick the a value to determine whet= her is unicode string type > > - continue > > - for pcdvalue in VpdSkuMap[pcd]: > > - for sku in VpdSkuMap[pcd][pcdvalue]: > > - for item in vpdinfo: > > - if item[2] =3D=3D pcdvalue: > > - sku.VpdOffset =3D item[1] > > - else: > > - EdkLogger.error("build", FILE_READ_FAILURE, "= Can not find VPD map file %s to fix up VPD offset." 
% VpdMapFilePath) > > - > > - # Delete the DynamicPcdList At the last time enter into t= his function > > - for Pcd in self._DynamicPcdList: > > - # just pick the a value to determine whether is unico= de string type > > - Sku =3D Pcd.SkuInfoList.get(TAB_DEFAULT) > > - Sku.VpdOffset =3D Sku.VpdOffset.strip() > > - > > - if Pcd.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_U= INT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]: > > - Pcd.DatumType =3D TAB_VOID > > - > > - PcdValue =3D Sku.DefaultValue > > - if Pcd.DatumType =3D=3D TAB_VOID and PcdValue.startsw= ith("L"): > > - # if found PCD which datum value is unicode strin= g the insert to left size of UnicodeIndex > > - UnicodePcdArray.add(Pcd) > > - elif len(Sku.VariableName) > 0: > > - # if found HII type PCD then insert to right of U= nicodeIndex > > - HiiPcdArray.add(Pcd) > > - else: > > - OtherPcdArray.add(Pcd) > > - del self._DynamicPcdList[:] > > - self._DynamicPcdList.extend(list(UnicodePcdArray)) > > - self._DynamicPcdList.extend(list(HiiPcdArray)) > > - self._DynamicPcdList.extend(list(OtherPcdArray)) > > - allskuset =3D [(SkuName, Sku.SkuId) for pcd in self._DynamicP= cdList for (SkuName, Sku) in pcd.SkuInfoList.items()] > > - for pcd in self._DynamicPcdList: > > - if len(pcd.SkuInfoList) =3D=3D 1: > > - for (SkuName, SkuId) in allskuset: > > - if isinstance(SkuId, str) and eval(SkuId) =3D=3D = 0 or SkuId =3D=3D 0: > > - continue > > - pcd.SkuInfoList[SkuName] =3D copy.deepcopy(pcd.Sk= uInfoList[TAB_DEFAULT]) > > - pcd.SkuInfoList[SkuName].SkuId =3D SkuId > > - pcd.SkuInfoList[SkuName].SkuIdName =3D SkuName > > - > > - def FixVpdOffset(self, VpdFile ): > > - FvPath =3D os.path.join(self.BuildDir, TAB_FV_DIRECTORY) > > - if not os.path.exists(FvPath): > > - try: > > - os.makedirs(FvPath) > > - except: > > - EdkLogger.error("build", FILE_WRITE_FAILURE, "Fail to= create FV folder under %s" % self.BuildDir) > > - > > - VpdFilePath =3D os.path.join(FvPath, "%s.txt" % self.Platform= .VpdToolGuid) > > - > > - if VpdFile.Write(VpdFilePath): > > - # retrieve BPDG tool's path from tool_def.txt according t= o VPD_TOOL_GUID defined in DSC file. > > - BPDGToolName =3D None > > - for ToolDef in self.ToolDefinition.values(): > > - if TAB_GUID in ToolDef and ToolDef[TAB_GUID] =3D=3D s= elf.Platform.VpdToolGuid: > > - if "PATH" not in ToolDef: > > - EdkLogger.error("build", ATTRIBUTE_NOT_AVAILA= BLE, "PATH attribute was not provided for BPDG guid tool %s in tools_def.tx= t" % self.Platform.VpdToolGuid) > > - BPDGToolName =3D ToolDef["PATH"] > > - break > > - # Call third party GUID BPDG tool. > > - if BPDGToolName is not None: > > - VpdInfoFile.CallExtenalBPDGTool(BPDGToolName, VpdFile= Path) > > - else: > > - EdkLogger.error("Build", FILE_NOT_FOUND, "Fail to fin= d third-party BPDG tool to process VPD PCDs. 
BPDG Guid tool need to be defi= ned in tools_def.txt and VPD_TOOL_GUID need to be provided in DSC file.") > > - > > - ## Return the platform build data object > > - @cached_property > > - def Platform(self): > > - return self.BuildDatabase[self.MetaFile, self.Arch, self.Buil= dTarget, self.ToolChain] > > - > > - ## Return platform name > > - @cached_property > > - def Name(self): > > - return self.Platform.PlatformName > > - > > - ## Return the meta file GUID > > - @cached_property > > - def Guid(self): > > - return self.Platform.Guid > > - > > - ## Return the platform version > > - @cached_property > > - def Version(self): > > - return self.Platform.Version > > - > > - ## Return the FDF file name > > - @cached_property > > - def FdfFile(self): > > - if self.Workspace.FdfFile: > > - RetVal=3D mws.join(self.WorkspaceDir, self.Workspace.FdfF= ile) > > - else: > > - RetVal =3D '' > > - return RetVal > > - > > - ## Return the build output directory platform specifies > > - @cached_property > > - def OutputDir(self): > > - return self.Platform.OutputDirectory > > - > > - ## Return the directory to store all intermediate and final files= built > > - @cached_property > > - def BuildDir(self): > > - if os.path.isabs(self.OutputDir): > > - GlobalData.gBuildDirectory =3D RetVal =3D path.join( > > - path.abspath(self.OutputDir), > > - self.BuildTarget + "_" + self= .ToolChain, > > - ) > > - else: > > - GlobalData.gBuildDirectory =3D RetVal =3D path.join( > > - self.WorkspaceDir, > > - self.OutputDir, > > - self.BuildTarget + "_" + self= .ToolChain, > > - ) > > - return RetVal > > - > > - ## Return directory of platform makefile > > - # > > - # @retval string Makefile directory > > - # > > - @cached_property > > - def MakeFileDir(self): > > - return path.join(self.BuildDir, self.Arch) > > - > > - ## Return build command string > > - # > > - # @retval string Build command string > > - # > > - @cached_property > > - def BuildCommand(self): > > - RetVal =3D [] > > - if "MAKE" in self.ToolDefinition and "PATH" in self.ToolDefin= ition["MAKE"]: > > - RetVal +=3D _SplitOption(self.ToolDefinition["MAKE"]["PAT= H"]) > > - if "FLAGS" in self.ToolDefinition["MAKE"]: > > - NewOption =3D self.ToolDefinition["MAKE"]["FLAGS"].st= rip() > > - if NewOption !=3D '': > > - RetVal +=3D _SplitOption(NewOption) > > - if "MAKE" in self.EdkIIBuildOption: > > - if "FLAGS" in self.EdkIIBuildOption["MAKE"]: > > - Flags =3D self.EdkIIBuildOption["MAKE"]["FLAGS"] > > - if Flags.startswith('=3D'): > > - RetVal =3D [RetVal[0]] + [Flags[1:]] > > - else: > > - RetVal.append(Flags) > > - return RetVal > > - > > - ## Get tool chain definition > > - # > > - # Get each tool definition for given tool chain from tools_def.t= xt and platform > > - # > > - @cached_property > > - def ToolDefinition(self): > > - ToolDefinition =3D self.Workspace.ToolDef.ToolsDefTxtDictiona= ry > > - if TAB_TOD_DEFINES_COMMAND_TYPE not in self.Workspace.ToolDef= .ToolsDefTxtDatabase: > > - EdkLogger.error('build', RESOURCE_NOT_AVAILABLE, "No tool= s found in configuration", > > - ExtraData=3D"[%s]" % self.MetaFile) > > - RetVal =3D OrderedDict() > > - DllPathList =3D set() > > - for Def in ToolDefinition: > > - Target, Tag, Arch, Tool, Attr =3D Def.split("_") > > - if Target !=3D self.BuildTarget or Tag !=3D self.ToolChai= n or Arch !=3D self.Arch: > > - continue > > - > > - Value =3D ToolDefinition[Def] > > - # don't record the DLL > > - if Attr =3D=3D "DLL": > > - DllPathList.add(Value) > > - continue > > - > > - if Tool not in RetVal: > > - RetVal[Tool] =3D 
OrderedDict() > > - RetVal[Tool][Attr] =3D Value > > - > > - ToolsDef =3D '' > > - if GlobalData.gOptions.SilentMode and "MAKE" in RetVal: > > - if "FLAGS" not in RetVal["MAKE"]: > > - RetVal["MAKE"]["FLAGS"] =3D "" > > - RetVal["MAKE"]["FLAGS"] +=3D " -s" > > - MakeFlags =3D '' > > - for Tool in RetVal: > > - for Attr in RetVal[Tool]: > > - Value =3D RetVal[Tool][Attr] > > - if Tool in self._BuildOptionWithToolDef(RetVal) and A= ttr in self._BuildOptionWithToolDef(RetVal)[Tool]: > > - # check if override is indicated > > - if self._BuildOptionWithToolDef(RetVal)[Tool][Att= r].startswith('=3D'): > > - Value =3D self._BuildOptionWithToolDef(RetVal= )[Tool][Attr][1:] > > - else: > > - if Attr !=3D 'PATH': > > - Value +=3D " " + self._BuildOptionWithToo= lDef(RetVal)[Tool][Attr] > > - else: > > - Value =3D self._BuildOptionWithToolDef(Re= tVal)[Tool][Attr] > > - > > - if Attr =3D=3D "PATH": > > - # Don't put MAKE definition in the file > > - if Tool !=3D "MAKE": > > - ToolsDef +=3D "%s =3D %s\n" % (Tool, Value) > > - elif Attr !=3D "DLL": > > - # Don't put MAKE definition in the file > > - if Tool =3D=3D "MAKE": > > - if Attr =3D=3D "FLAGS": > > - MakeFlags =3D Value > > - else: > > - ToolsDef +=3D "%s_%s =3D %s\n" % (Tool, Attr,= Value) > > - ToolsDef +=3D "\n" > > - tool_def_file =3D os.path.join(self.MakeFileDir, "TOOLS_DEF."= + self.Arch) > > - SaveFileOnChange(tool_def_file, ToolsDef, False) > > - for DllPath in DllPathList: > > - os.environ["PATH"] =3D DllPath + os.pathsep + os.environ[= "PATH"] > > - os.environ["MAKE_FLAGS"] =3D MakeFlags > > - > > - return RetVal > > - > > - ## Return the paths of tools > > - @cached_property > > - def ToolDefinitionFile(self): > > - tool_def_file =3D os.path.join(self.MakeFileDir, "TOOLS_DEF."= + self.Arch) > > - if not os.path.exists(tool_def_file): > > - self.ToolDefinition > > - return tool_def_file > > - > > - ## Retrieve the toolchain family of given toolchain tag. Default = to 'MSFT'. > > - @cached_property > > - def ToolChainFamily(self): > > - ToolDefinition =3D self.Workspace.ToolDef.ToolsDefTxtDatabase > > - if TAB_TOD_DEFINES_FAMILY not in ToolDefinition \ > > - or self.ToolChain not in ToolDefinition[TAB_TOD_DEFINES_FA= MILY] \ > > - or not ToolDefinition[TAB_TOD_DEFINES_FAMILY][self.ToolCha= in]: > > - EdkLogger.verbose("No tool chain family found in configur= ation for %s. Default to MSFT." \ > > - % self.ToolChain) > > - RetVal =3D TAB_COMPILER_MSFT > > - else: > > - RetVal =3D ToolDefinition[TAB_TOD_DEFINES_FAMILY][self.To= olChain] > > - return RetVal > > - > > - @cached_property > > - def BuildRuleFamily(self): > > - ToolDefinition =3D self.Workspace.ToolDef.ToolsDefTxtDatabase > > - if TAB_TOD_DEFINES_BUILDRULEFAMILY not in ToolDefinition \ > > - or self.ToolChain not in ToolDefinition[TAB_TOD_DEFINES_BU= ILDRULEFAMILY] \ > > - or not ToolDefinition[TAB_TOD_DEFINES_BUILDRULEFAMILY][sel= f.ToolChain]: > > - EdkLogger.verbose("No tool chain family found in configur= ation for %s. Default to MSFT." 
\ > > - % self.ToolChain) > > - return TAB_COMPILER_MSFT > > - > > - return ToolDefinition[TAB_TOD_DEFINES_BUILDRULEFAMILY][self.T= oolChain] > > - > > - ## Return the build options specific for all modules in this plat= form > > - @cached_property > > - def BuildOption(self): > > - return self._ExpandBuildOption(self.Platform.BuildOptions) > > - > > - def _BuildOptionWithToolDef(self, ToolDef): > > - return self._ExpandBuildOption(self.Platform.BuildOptions, To= olDef=3DToolDef) > > - > > - ## Return the build options specific for EDK modules in this plat= form > > - @cached_property > > - def EdkBuildOption(self): > > - return self._ExpandBuildOption(self.Platform.BuildOptions, ED= K_NAME) > > - > > - ## Return the build options specific for EDKII modules in this pl= atform > > - @cached_property > > - def EdkIIBuildOption(self): > > - return self._ExpandBuildOption(self.Platform.BuildOptions, ED= KII_NAME) > > - > > - ## Summarize the packages used by modules in this platform > > - @cached_property > > - def PackageList(self): > > - RetVal =3D set() > > - for La in self.LibraryAutoGenList: > > - RetVal.update(La.DependentPackageList) > > - for Ma in self.ModuleAutoGenList: > > - RetVal.update(Ma.DependentPackageList) > > - #Collect package set information from INF of FDF > > - for ModuleFile in self._AsBuildModuleList: > > - if ModuleFile in self.Platform.Modules: > > - continue > > - ModuleData =3D self.BuildDatabase[ModuleFile, self.Arch, = self.BuildTarget, self.ToolChain] > > - RetVal.update(ModuleData.Packages) > > - return list(RetVal) > > - > > - @cached_property > > - def NonDynamicPcdDict(self): > > - return {(Pcd.TokenCName, Pcd.TokenSpaceGuidCName):Pcd for Pcd= in self.NonDynamicPcdList} > > - > > - ## Get list of non-dynamic PCDs > > - @property > > - def NonDynamicPcdList(self): > > - if not self._NonDynamicPcdList: > > - self.CollectPlatformDynamicPcds() > > - return self._NonDynamicPcdList > > - > > - ## Get list of dynamic PCDs > > - @property > > - def DynamicPcdList(self): > > - if not self._DynamicPcdList: > > - self.CollectPlatformDynamicPcds() > > - return self._DynamicPcdList > > - > > - ## Generate Token Number for all PCD > > - @cached_property > > - def PcdTokenNumber(self): > > - RetVal =3D OrderedDict() > > - TokenNumber =3D 1 > > - # > > - # Make the Dynamic and DynamicEx PCD use within different Tok= enNumber area. 
> > - # Such as: > > - # > > - # Dynamic PCD: > > - # TokenNumber 0 ~ 10 > > - # DynamicEx PCD: > > - # TokeNumber 11 ~ 20 > > - # > > - for Pcd in self.DynamicPcdList: > > - if Pcd.Phase =3D=3D "PEI" and Pcd.Type in PCD_DYNAMIC_TYP= E_SET: > > - EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d"= % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber)) > > - RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] =3D T= okenNumber > > - TokenNumber +=3D 1 > > - > > - for Pcd in self.DynamicPcdList: > > - if Pcd.Phase =3D=3D "PEI" and Pcd.Type in PCD_DYNAMIC_EX_= TYPE_SET: > > - EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d"= % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber)) > > - RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] =3D T= okenNumber > > - TokenNumber +=3D 1 > > - > > - for Pcd in self.DynamicPcdList: > > - if Pcd.Phase =3D=3D "DXE" and Pcd.Type in PCD_DYNAMIC_TYP= E_SET: > > - EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d"= % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber)) > > - RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] =3D T= okenNumber > > - TokenNumber +=3D 1 > > - > > - for Pcd in self.DynamicPcdList: > > - if Pcd.Phase =3D=3D "DXE" and Pcd.Type in PCD_DYNAMIC_EX_= TYPE_SET: > > - EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d"= % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber)) > > - RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] =3D T= okenNumber > > - TokenNumber +=3D 1 > > - > > - for Pcd in self.NonDynamicPcdList: > > - RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] =3D Token= Number > > - TokenNumber +=3D 1 > > - return RetVal > > - > > - @cached_property > > - def _MaList(self): > > - for ModuleFile in self.Platform.Modules: > > - Ma =3D ModuleAutoGen( > > - self.Workspace, > > - ModuleFile, > > - self.BuildTarget, > > - self.ToolChain, > > - self.Arch, > > - self.MetaFile > > - ) > > - self.Platform.Modules[ModuleFile].M =3D Ma > > - return [x.M for x in self.Platform.Modules.values()] > > - > > - ## Summarize ModuleAutoGen objects of all modules to be built for= this platform > > - @cached_property > > - def ModuleAutoGenList(self): > > - RetVal =3D [] > > - for Ma in self._MaList: > > - if Ma not in RetVal: > > - RetVal.append(Ma) > > - return RetVal > > - > > - ## Summarize ModuleAutoGen objects of all libraries to be built f= or this platform > > - @cached_property > > - def LibraryAutoGenList(self): > > - RetVal =3D [] > > - for Ma in self._MaList: > > - for La in Ma.LibraryAutoGenList: > > - if La not in RetVal: > > - RetVal.append(La) > > - if Ma not in La.ReferenceModules: > > - La.ReferenceModules.append(Ma) > > - return RetVal > > - > > - ## Test if a module is supported by the platform > > - # > > - # An error will be raised directly if the module or its arch is = not supported > > - # by the platform or current configuration > > - # > > - def ValidModule(self, Module): > > - return Module in self.Platform.Modules or Module in self.Plat= form.LibraryInstances \ > > - or Module in self._AsBuildModuleList > > - > > - ## Resolve the library classes in a module to library instances > > - # > > - # This method will not only resolve library classes but also sort= the library > > - # instances according to the dependency-ship. 
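On the PcdTokenNumber property above: the four passes amount to numbering the dynamic PCDs in the order PEI/Dynamic, PEI/DynamicEx, DXE/Dynamic, DXE/DynamicEx (token numbers start at 1), with non-dynamic PCDs appended last. Assuming the default phase is DXE, as the comment in CollectPlatformDynamicPcds says, a stable sort gives the same result; sketch only:

    # Sketch only: equivalent ordering for token number assignment.
    def token_numbers(dynamic_pcds, non_dynamic_pcds, dynamic_types):
        def key(pcd):
            phase_rank = 0 if pcd.Phase == 'PEI' else 1        # PEI before DXE
            kind_rank = 0 if pcd.Type in dynamic_types else 1  # Dynamic before DynamicEx
            return (phase_rank, kind_rank)
        table, number = {}, 1
        for pcd in sorted(dynamic_pcds, key=key):              # sorted() is stable
            table[(pcd.TokenCName, pcd.TokenSpaceGuidCName)] = number
            number += 1
        for pcd in non_dynamic_pcds:
            table[(pcd.TokenCName, pcd.TokenSpaceGuidCName)] = number
            number += 1
        return table
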
> > - # > > - # @param Module The module from which the library classes= will be resolved > > - # > > - # @retval library_list List of library instances sorted > > - # > > - def ApplyLibraryInstance(self, Module): > > - # Cover the case that the binary INF file is list in the FDF = file but not DSC file, return empty list directly > > - if str(Module) not in self.Platform.Modules: > > - return [] > > - > > - return GetModuleLibInstances(Module, > > - self.Platform, > > - self.BuildDatabase, > > - self.Arch, > > - self.BuildTarget, > > - self.ToolChain, > > - self.MetaFile, > > - EdkLogger) > > - > > - ## Override PCD setting (type, value, ...) > > - # > > - # @param ToPcd The PCD to be overridden > > - # @param FromPcd The PCD overriding from > > - # > > - def _OverridePcd(self, ToPcd, FromPcd, Module=3D"", Msg=3D"", Lib= rary=3D""): > > - # > > - # in case there's PCDs coming from FDF file, which have no ty= pe given. > > - # at this point, ToPcd.Type has the type found from dependent > > - # package > > - # > > - TokenCName =3D ToPcd.TokenCName > > - for PcdItem in GlobalData.MixedPcd: > > - if (ToPcd.TokenCName, ToPcd.TokenSpaceGuidCName) in Globa= lData.MixedPcd[PcdItem]: > > - TokenCName =3D PcdItem[0] > > - break > > - if FromPcd is not None: > > - if ToPcd.Pending and FromPcd.Type: > > - ToPcd.Type =3D FromPcd.Type > > - elif ToPcd.Type and FromPcd.Type\ > > - and ToPcd.Type !=3D FromPcd.Type and ToPcd.Type in Fr= omPcd.Type: > > - if ToPcd.Type.strip() =3D=3D TAB_PCDS_DYNAMIC_EX: > > - ToPcd.Type =3D FromPcd.Type > > - elif ToPcd.Type and FromPcd.Type \ > > - and ToPcd.Type !=3D FromPcd.Type: > > - if Library: > > - Module =3D str(Module) + " 's library file (" + s= tr(Library) + ")" > > - EdkLogger.error("build", OPTION_CONFLICT, "Mismatched= PCD type", > > - ExtraData=3D"%s.%s is used as [%s] in= module %s, but as [%s] in %s."\ > > - % (ToPcd.TokenSpaceGuidCNam= e, TokenCName, > > - ToPcd.Type, Module, From= Pcd.Type, Msg), > > - File=3Dself.MetaFile) > > - > > - if FromPcd.MaxDatumSize: > > - ToPcd.MaxDatumSize =3D FromPcd.MaxDatumSize > > - ToPcd.MaxSizeUserSet =3D FromPcd.MaxDatumSize > > - if FromPcd.DefaultValue: > > - ToPcd.DefaultValue =3D FromPcd.DefaultValue > > - if FromPcd.TokenValue: > > - ToPcd.TokenValue =3D FromPcd.TokenValue > > - if FromPcd.DatumType: > > - ToPcd.DatumType =3D FromPcd.DatumType > > - if FromPcd.SkuInfoList: > > - ToPcd.SkuInfoList =3D FromPcd.SkuInfoList > > - if FromPcd.UserDefinedDefaultStoresFlag: > > - ToPcd.UserDefinedDefaultStoresFlag =3D FromPcd.UserDe= finedDefaultStoresFlag > > - # Add Flexible PCD format parse > > - if ToPcd.DefaultValue: > > - try: > > - ToPcd.DefaultValue =3D ValueExpressionEx(ToPcd.De= faultValue, ToPcd.DatumType, self.Workspace._GuidDict)(True) > > - except BadExpression as Value: > > - EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%= s.%s] Value "%s", %s' %(ToPcd.TokenSpaceGuidCName, ToPcd.TokenCName, ToPcd.= DefaultValue, Value), > > - File=3Dself.MetaFile) > > - > > - # check the validation of datum > > - IsValid, Cause =3D CheckPcdDatum(ToPcd.DatumType, ToPcd.D= efaultValue) > > - if not IsValid: > > - EdkLogger.error('build', FORMAT_INVALID, Cause, File= =3Dself.MetaFile, > > - ExtraData=3D"%s.%s" % (ToPcd.TokenSpa= ceGuidCName, TokenCName)) > > - ToPcd.validateranges =3D FromPcd.validateranges > > - ToPcd.validlists =3D FromPcd.validlists > > - ToPcd.expressions =3D FromPcd.expressions > > - ToPcd.CustomAttribute =3D FromPcd.CustomAttribute > > - > > - if FromPcd is not None and ToPcd.DatumType =3D=3D 
TAB_VOID an= d not ToPcd.MaxDatumSize: > > - EdkLogger.debug(EdkLogger.DEBUG_9, "No MaxDatumSize speci= fied for PCD %s.%s" \ > > - % (ToPcd.TokenSpaceGuidCName, TokenCName)= ) > > - Value =3D ToPcd.DefaultValue > > - if not Value: > > - ToPcd.MaxDatumSize =3D '1' > > - elif Value[0] =3D=3D 'L': > > - ToPcd.MaxDatumSize =3D str((len(Value) - 2) * 2) > > - elif Value[0] =3D=3D '{': > > - ToPcd.MaxDatumSize =3D str(len(Value.split(','))) > > - else: > > - ToPcd.MaxDatumSize =3D str(len(Value) - 1) > > - > > - # apply default SKU for dynamic PCDS if specified one is not = available > > - if (ToPcd.Type in PCD_DYNAMIC_TYPE_SET or ToPcd.Type in PCD_D= YNAMIC_EX_TYPE_SET) \ > > - and not ToPcd.SkuInfoList: > > - if self.Platform.SkuName in self.Platform.SkuIds: > > - SkuName =3D self.Platform.SkuName > > - else: > > - SkuName =3D TAB_DEFAULT > > - ToPcd.SkuInfoList =3D { > > - SkuName : SkuInfoClass(SkuName, self.Platform.SkuIds[= SkuName][0], '', '', '', '', '', ToPcd.DefaultValue) > > - } > > - > > - ## Apply PCD setting defined platform to a module > > - # > > - # @param Module The module from which the PCD setting will be= overridden > > - # > > - # @retval PCD_list The list PCDs with settings from platform > > - # > > - def ApplyPcdSetting(self, Module, Pcds, Library=3D""): > > - # for each PCD in module > > - for Name, Guid in Pcds: > > - PcdInModule =3D Pcds[Name, Guid] > > - # find out the PCD setting in platform > > - if (Name, Guid) in self.Platform.Pcds: > > - PcdInPlatform =3D self.Platform.Pcds[Name, Guid] > > - else: > > - PcdInPlatform =3D None > > - # then override the settings if any > > - self._OverridePcd(PcdInModule, PcdInPlatform, Module, Msg= = =3D"DSC PCD sections", Library=3DLibrary) > > - # resolve the VariableGuid value > > - for SkuId in PcdInModule.SkuInfoList: > > - Sku =3D PcdInModule.SkuInfoList[SkuId] > > - if Sku.VariableGuid =3D=3D '': continue > > - Sku.VariableGuidValue =3D GuidValue(Sku.VariableGuid,= self.PackageList, self.MetaFile.Path) > > - if Sku.VariableGuidValue is None: > > - PackageList =3D "\n\t".join(str(P) for P in self.= PackageList) > > - EdkLogger.error( > > - 'build', > > - RESOURCE_NOT_AVAILABLE, > > - "Value of GUID [%s] is not found in" = % Sku.VariableGuid, > > - ExtraData=3DPackageList + "\n\t(used = with %s.%s from module %s)" \ > > - % (Guid, Name= , str(Module)), > > - File=3Dself.MetaFile > > - ) > > - > > - # override PCD settings with module specific setting > > - if Module in self.Platform.Modules: > > - PlatformModule =3D self.Platform.Modules[str(Module)] > > - for Key in PlatformModule.Pcds: > > - if GlobalData.BuildOptionPcd: > > - for pcd in GlobalData.BuildOptionPcd: > > - (TokenSpaceGuidCName, TokenCName, FieldName, = pcdvalue, _) =3D pcd > > - if (TokenCName, TokenSpaceGuidCName) =3D=3D K= ey and FieldName =3D=3D"": > > - PlatformModule.Pcds[Key].DefaultValue =3D= pcdvalue > > - PlatformModule.Pcds[Key].PcdValueFromComm= =3D pcdvalue > > - break > > - Flag =3D False > > - if Key in Pcds: > > - ToPcd =3D Pcds[Key] > > - Flag =3D True > > - elif Key in GlobalData.MixedPcd: > > - for PcdItem in GlobalData.MixedPcd[Key]: > > - if PcdItem in Pcds: > > - ToPcd =3D Pcds[PcdItem] > > - Flag =3D True > > - break > > - if Flag: > > - self._OverridePcd(ToPcd, PlatformModule.Pcds[Key]= , Module, Msg=3D"DSC Components Module scoped PCD section", Library=3DLibra= ry) > > - # use PCD value to calculate the MaxDatumSize when it is not = specified > > - for Name, Guid in Pcds: > > - Pcd =3D Pcds[Name, Guid] > > - if Pcd.DatumType =3D=3D 
TAB_VOID and not Pcd.MaxDatumSize= : > > - Pcd.MaxSizeUserSet =3D None > > - Value =3D Pcd.DefaultValue > > - if not Value: > > - Pcd.MaxDatumSize =3D '1' > > - elif Value[0] =3D=3D 'L': > > - Pcd.MaxDatumSize =3D str((len(Value) - 2) * 2) > > - elif Value[0] =3D=3D '{': > > - Pcd.MaxDatumSize =3D str(len(Value.split(','))) > > - else: > > - Pcd.MaxDatumSize =3D str(len(Value) - 1) > > - return list(Pcds.values()) > > - > > - > > - > > - ## Calculate the priority value of the build option > > - # > > - # @param Key Build option definition contain: TARGET_TOOLCH= AIN_ARCH_COMMANDTYPE_ATTRIBUTE > > - # > > - # @retval Value Priority value based on the priority list. > > - # > > - def CalculatePriorityValue(self, Key): > > - Target, ToolChain, Arch, CommandType, Attr =3D Key.split('_') > > - PriorityValue =3D 0x11111 > > - if Target =3D=3D TAB_STAR: > > - PriorityValue &=3D 0x01111 > > - if ToolChain =3D=3D TAB_STAR: > > - PriorityValue &=3D 0x10111 > > - if Arch =3D=3D TAB_STAR: > > - PriorityValue &=3D 0x11011 > > - if CommandType =3D=3D TAB_STAR: > > - PriorityValue &=3D 0x11101 > > - if Attr =3D=3D TAB_STAR: > > - PriorityValue &=3D 0x11110 > > - > > - return self.PrioList["0x%0.5x" % PriorityValue] > > - > > - > > - ## Expand * in build option key > > - # > > - # @param Options Options to be expanded > > - # @param ToolDef Use specified ToolDef instead of full ver= sion. > > - # This is needed during initialization to p= revent > > - # infinite recursion betweeh BuildOptions, > > - # ToolDefinition, and this function. > > - # > > - # @retval options Options expanded > > - # > > - def _ExpandBuildOption(self, Options, ModuleStyle=3DNone, ToolDef= = =3DNone): > > - if not ToolDef: > > - ToolDef =3D self.ToolDefinition > > - BuildOptions =3D {} > > - FamilyMatch =3D False > > - FamilyIsNull =3D True > > - > > - OverrideList =3D {} > > - # > > - # Construct a list contain the build options which need overr= ide. > > - # > > - for Key in Options: > > - # > > - # Key[0] -- tool family > > - # Key[1] -- TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE > > - # > > - if (Key[0] =3D=3D self.BuildRuleFamily and > > - (ModuleStyle is None or len(Key) < 3 or (len(Key) > 2= and Key[2] =3D=3D ModuleStyle))): > > - Target, ToolChain, Arch, CommandType, Attr =3D Key[1]= .split('_') > > - if (Target =3D=3D self.BuildTarget or Target =3D=3D T= AB_STAR) and\ > > - (ToolChain =3D=3D self.ToolChain or ToolChain =3D= = =3D TAB_STAR) and\ > > - (Arch =3D=3D self.Arch or Arch =3D=3D TAB_STAR) a= nd\ > > - Options[Key].startswith("=3D"): > > - > > - if OverrideList.get(Key[1]) is not None: > > - OverrideList.pop(Key[1]) > > - OverrideList[Key[1]] =3D Options[Key] > > - > > - # > > - # Use the highest priority value. 
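CalculatePriorityValue() above encodes the five key fields (TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE) as one hex digit each and clears a digit whenever that field is '*', so a fully specified key keeps 0x11111 and every wildcard lowers the mask; the mask is then looked up in PrioList to get the final priority. A worked illustration of just the mask part:

    # Sketch only: the wildcard mask behind the priority lookup.
    def priority_mask(key):
        value = 0x11111
        masks = (0x01111, 0x10111, 0x11011, 0x11101, 0x11110)
        for field, mask in zip(key.split('_'), masks):
            if field == '*':
                value &= mask
        return value

    assert priority_mask('DEBUG_GCC5_X64_CC_FLAGS') == 0x11111   # most specific
    assert priority_mask('*_*_*_CC_FLAGS') == 0x00011            # three wildcards
    assert priority_mask('*_*_*_*_*') == 0x00000                 # least specific
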
> > - # > > - if (len(OverrideList) >=3D 2): > > - KeyList =3D list(OverrideList.keys()) > > - for Index in range(len(KeyList)): > > - NowKey =3D KeyList[Index] > > - Target1, ToolChain1, Arch1, CommandType1, Attr1 =3D N= owKey.split("_") > > - for Index1 in range(len(KeyList) - Index - 1): > > - NextKey =3D KeyList[Index1 + Index + 1] > > - # > > - # Compare two Key, if one is included by another,= choose the higher priority one > > - # > > - Target2, ToolChain2, Arch2, CommandType2, Attr2 = =3D NextKey.split("_") > > - if (Target1 =3D=3D Target2 or Target1 =3D=3D TAB_= STAR or Target2 =3D=3D TAB_STAR) and\ > > - (ToolChain1 =3D=3D ToolChain2 or ToolChain1 = =3D=3D TAB_STAR or ToolChain2 =3D=3D TAB_STAR) and\ > > - (Arch1 =3D=3D Arch2 or Arch1 =3D=3D TAB_STAR = or Arch2 =3D=3D TAB_STAR) and\ > > - (CommandType1 =3D=3D CommandType2 or CommandT= ype1 =3D=3D TAB_STAR or CommandType2 =3D=3D TAB_STAR) and\ > > - (Attr1 =3D=3D Attr2 or Attr1 =3D=3D TAB_STAR = or Attr2 =3D=3D TAB_STAR): > > - > > - if self.CalculatePriorityValue(NowKey) > self= .CalculatePriorityValue(NextKey): > > - if Options.get((self.BuildRuleFamily, Nex= tKey)) is not None: > > - Options.pop((self.BuildRuleFamily, Ne= xtKey)) > > - else: > > - if Options.get((self.BuildRuleFamily, Now= Key)) is not None: > > - Options.pop((self.BuildRuleFamily, No= wKey)) > > - > > - for Key in Options: > > - if ModuleStyle is not None and len (Key) > 2: > > - # Check Module style is EDK or EDKII. > > - # Only append build option for the matched style modu= le. > > - if ModuleStyle =3D=3D EDK_NAME and Key[2] !=3D EDK_NA= ME: > > - continue > > - elif ModuleStyle =3D=3D EDKII_NAME and Key[2] !=3D ED= KII_NAME: > > - continue > > - Family =3D Key[0] > > - Target, Tag, Arch, Tool, Attr =3D Key[1].split("_") > > - # if tool chain family doesn't match, skip it > > - if Tool in ToolDef and Family !=3D "": > > - FamilyIsNull =3D False > > - if ToolDef[Tool].get(TAB_TOD_DEFINES_BUILDRULEFAMILY,= "") !=3D "": > > - if Family !=3D ToolDef[Tool][TAB_TOD_DEFINES_BUIL= DRULEFAMILY]: > > - continue > > - elif Family !=3D ToolDef[Tool][TAB_TOD_DEFINES_FAMILY= ]: > > - continue > > - FamilyMatch =3D True > > - # expand any wildcard > > - if Target =3D=3D TAB_STAR or Target =3D=3D self.BuildTarg= et: > > - if Tag =3D=3D TAB_STAR or Tag =3D=3D self.ToolChain: > > - if Arch =3D=3D TAB_STAR or Arch =3D=3D self.Arch: > > - if Tool not in BuildOptions: > > - BuildOptions[Tool] =3D {} > > - if Attr !=3D "FLAGS" or Attr not in BuildOpti= ons[Tool] or Options[Key].startswith('=3D'): > > - BuildOptions[Tool][Attr] =3D Options[Key] > > - else: > > - # append options for the same tool except= PATH > > - if Attr !=3D 'PATH': > > - BuildOptions[Tool][Attr] +=3D " " + O= ptions[Key] > > - else: > > - BuildOptions[Tool][Attr] =3D Options[= Key] > > - # Build Option Family has been checked, which need't to be ch= ecked again for family. > > - if FamilyMatch or FamilyIsNull: > > - return BuildOptions > > - > > - for Key in Options: > > - if ModuleStyle is not None and len (Key) > 2: > > - # Check Module style is EDK or EDKII. > > - # Only append build option for the matched style modu= le. 
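The expansion loop above also fixes the merge semantics for repeated options: a key applies when each of TARGET/TAG/ARCH is either '*' or the current value, FLAGS from multiple matching keys are appended with a space, a value starting with '=' replaces whatever was accumulated, and PATH (like every non-FLAGS attribute) always takes the replace branch. Restated as a sketch, not the real code:

    # Sketch only: wildcard match plus option merge, as I read the loop above.
    def key_matches(target, tag, arch, build_target, toolchain, build_arch):
        return (target in ('*', build_target)
                and tag in ('*', toolchain)
                and arch in ('*', build_arch))

    def merge_option(build_options, tool, attr, value):
        opts = build_options.setdefault(tool, {})
        if attr != 'FLAGS' or attr not in opts or value.startswith('='):
            opts[attr] = value         # first value, non-FLAGS attribute, or '=' override
        else:
            opts[attr] += ' ' + value  # FLAGS accumulate across matching keys
        return build_options
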
> > - if ModuleStyle =3D=3D EDK_NAME and Key[2] !=3D EDK_NA= ME: > > - continue > > - elif ModuleStyle =3D=3D EDKII_NAME and Key[2] !=3D ED= KII_NAME: > > - continue > > - Family =3D Key[0] > > - Target, Tag, Arch, Tool, Attr =3D Key[1].split("_") > > - # if tool chain family doesn't match, skip it > > - if Tool not in ToolDef or Family =3D=3D "": > > - continue > > - # option has been added before > > - if Family !=3D ToolDef[Tool][TAB_TOD_DEFINES_FAMILY]: > > - continue > > - > > - # expand any wildcard > > - if Target =3D=3D TAB_STAR or Target =3D=3D self.BuildTarg= et: > > - if Tag =3D=3D TAB_STAR or Tag =3D=3D self.ToolChain: > > - if Arch =3D=3D TAB_STAR or Arch =3D=3D self.Arch: > > - if Tool not in BuildOptions: > > - BuildOptions[Tool] =3D {} > > - if Attr !=3D "FLAGS" or Attr not in BuildOpti= ons[Tool] or Options[Key].startswith('=3D'): > > - BuildOptions[Tool][Attr] =3D Options[Key] > > - else: > > - # append options for the same tool except= PATH > > - if Attr !=3D 'PATH': > > - BuildOptions[Tool][Attr] +=3D " " + O= ptions[Key] > > - else: > > - BuildOptions[Tool][Attr] =3D Options[= Key] > > - return BuildOptions > > - def GetGlobalBuildOptions(self,Module): > > - ModuleTypeOptions =3D self.Platform.GetBuildOptionsByPkg(Modu= le, Module.ModuleType) > > - ModuleTypeOptions =3D self._ExpandBuildOption(ModuleTypeOptio= ns) > > - if Module in self.Platform.Modules: > > - PlatformModule =3D self.Platform.Modules[str(Module)] > > - PlatformModuleOptions =3D self._ExpandBuildOption(Platfor= mModule.BuildOptions) > > - else: > > - PlatformModuleOptions =3D {} > > - return ModuleTypeOptions, PlatformModuleOptions > > - ## Append build options in platform to a module > > - # > > - # @param Module The module to which the build options will be= appended > > - # > > - # @retval options The options appended with build options i= n platform > > - # > > - def ApplyBuildOption(self, Module): > > - # Get the different options for the different style module > > - PlatformOptions =3D self.EdkIIBuildOption > > - ModuleTypeOptions =3D self.Platform.GetBuildOptionsByModuleTy= pe(EDKII_NAME, Module.ModuleType) > > - ModuleTypeOptions =3D self._ExpandBuildOption(ModuleTypeOptio= ns) > > - ModuleOptions =3D self._ExpandBuildOption(Module.BuildOptions= ) > > - if Module in self.Platform.Modules: > > - PlatformModule =3D self.Platform.Modules[str(Module)] > > - PlatformModuleOptions =3D self._ExpandBuildOption(Platfor= mModule.BuildOptions) > > - else: > > - PlatformModuleOptions =3D {} > > - > > - BuildRuleOrder =3D None > > - for Options in [self.ToolDefinition, ModuleOptions, PlatformO= ptions, ModuleTypeOptions, PlatformModuleOptions]: > > - for Tool in Options: > > - for Attr in Options[Tool]: > > - if Attr =3D=3D TAB_TOD_DEFINES_BUILDRULEORDER: > > - BuildRuleOrder =3D Options[Tool][Attr] > > - > > - AllTools =3D set(list(ModuleOptions.keys()) + list(PlatformOp= tions.keys()) + > > - list(PlatformModuleOptions.keys()) + list(Modu= leTypeOptions.keys()) + > > - list(self.ToolDefinition.keys())) > > - BuildOptions =3D defaultdict(lambda: defaultdict(str)) > > - for Tool in AllTools: > > - for Options in [self.ToolDefinition, ModuleOptions, Platf= ormOptions, ModuleTypeOptions, PlatformModuleOptions]: > > - if Tool not in Options: > > - continue > > - for Attr in Options[Tool]: > > - # > > - # Do not generate it in Makefile > > - # > > - if Attr =3D=3D TAB_TOD_DEFINES_BUILDRULEORDER: > > - continue > > - Value =3D Options[Tool][Attr] > > - # check if override is indicated > > - if 
Value.startswith('=3D'): > > - BuildOptions[Tool][Attr] =3D mws.handleWsMacr= o(Value[1:]) > > - else: > > - if Attr !=3D 'PATH': > > - BuildOptions[Tool][Attr] +=3D " " + mws.h= andleWsMacro(Value) > > - else: > > - BuildOptions[Tool][Attr] =3D mws.handleWs= Macro(Value) > > - > > - return BuildOptions, BuildRuleOrder > > - > > -# > > -# extend lists contained in a dictionary with lists stored in another= dictionary > > -# if CopyToDict is not derived from DefaultDict(list) then this may r= aise exception > > -# > > -def ExtendCopyDictionaryLists(CopyToDict, CopyFromDict): > > - for Key in CopyFromDict: > > - CopyToDict[Key].extend(CopyFromDict[Key]) > > - > > -# Create a directory specified by a set of path elements and return t= he full path > > -def _MakeDir(PathList): > > - RetVal =3D path.join(*PathList) > > - CreateDirectory(RetVal) > > - return RetVal > > - > > -## ModuleAutoGen class > > -# > > -# This class encapsules the AutoGen behaviors for the build tools. In= addition to > > -# the generation of AutoGen.h and AutoGen.c, it will generate *.depex= file according > > -# to the [depex] section in module's inf file. > > -# > > -class ModuleAutoGen(AutoGen): > > - # call super().__init__ then call the worker function with differ= ent parameter count > > - def __init__(self, Workspace, MetaFile, Target, Toolchain, Arch, = *args, **kwargs): > > - if not hasattr(self, "_Init"): > > - self._InitWorker(Workspace, MetaFile, Target, Toolchain, = Arch, *args) > > - self._Init =3D True > > - > > - ## Cache the timestamps of metafiles of every module in a class a= ttribute > > - # > > - TimeDict =3D {} > > - > > - def __new__(cls, Workspace, MetaFile, Target, Toolchain, Arch, *a= rgs, **kwargs): > > - # check if this module is employed by active platform > > - if not PlatformAutoGen(Workspace, args[0], Target, Toolchain,= Arch).ValidModule(MetaFile): > > - EdkLogger.verbose("Module [%s] for [%s] is not employed b= y active platform\n" \ > > - % (MetaFile, Arch)) > > - return None > > - return super(ModuleAutoGen, cls).__new__(cls, Workspace, Meta= File, Target, Toolchain, Arch, *args, **kwargs) > > - > > - ## Initialize ModuleAutoGen > > - # > > - # @param Workspace EdkIIWorkspaceBuild object > > - # @param ModuleFile The path of module file > > - # @param Target Build target (DEBUG, RELEASE) > > - # @param Toolchain Name of tool chain > > - # @param Arch The arch the module supports > > - # @param PlatformFile Platform meta-file > > - # > > - def _InitWorker(self, Workspace, ModuleFile, Target, Toolchain, A= rch, PlatformFile): > > - EdkLogger.debug(EdkLogger.DEBUG_9, "AutoGen module [%s] [%s]"= % (ModuleFile, Arch)) > > - GlobalData.gProcessingFile =3D "%s [%s, %s, %s]" % (ModuleFil= e, Arch, Toolchain, Target) > > - > > - self.Workspace =3D Workspace > > - self.WorkspaceDir =3D Workspace.WorkspaceDir > > - self.MetaFile =3D ModuleFile > > - self.PlatformInfo =3D PlatformAutoGen(Workspace, PlatformFile= , Target, Toolchain, Arch) > > - > > - self.SourceDir =3D self.MetaFile.SubDir > > - self.SourceDir =3D mws.relpath(self.SourceDir, self.Workspace= Dir) > > - > > - self.ToolChain =3D Toolchain > > - self.BuildTarget =3D Target > > - self.Arch =3D Arch > > - self.ToolChainFamily =3D self.PlatformInfo.ToolChainFamily > > - self.BuildRuleFamily =3D self.PlatformInfo.BuildRuleFamily > > - > > - self.IsCodeFileCreated =3D False > > - self.IsAsBuiltInfCreated =3D False > > - self.DepexGenerated =3D False > > - > > - self.BuildDatabase =3D self.Workspace.BuildDatabase > > - self.BuildRuleOrder 
=3D None > > - self.BuildTime =3D 0 > > - > > - self._PcdComments =3D OrderedListDict() > > - self._GuidComments =3D OrderedListDict() > > - self._ProtocolComments =3D OrderedListDict() > > - self._PpiComments =3D OrderedListDict() > > - self._BuildTargets =3D None > > - self._IntroBuildTargetList =3D None > > - self._FinalBuildTargetList =3D None > > - self._FileTypes =3D None > > - > > - self.AutoGenDepSet =3D set() > > - self.ReferenceModules =3D [] > > - self.ConstPcd =3D {} > > - > > - ## hash() operator of ModuleAutoGen > > - # > > - # The module file path and arch string will be used to represent > > - # hash value of this object > > - # > > - # @retval int Hash value of the module file path and arch > > - # > > - @cached_class_function > > - def __hash__(self): > > - return hash((self.MetaFile, self.Arch)) > > - > > - def __repr__(self): > > - return "%s [%s]" % (self.MetaFile, self.Arch) > > - > > - # Get FixedAtBuild Pcds of this Module > > - @cached_property > > - def FixedAtBuildPcds(self): > > - RetVal =3D [] > > - for Pcd in self.ModulePcdList: > > - if Pcd.Type !=3D TAB_PCDS_FIXED_AT_BUILD: > > - continue > > - if Pcd not in RetVal: > > - RetVal.append(Pcd) > > - return RetVal > > - > > - @cached_property > > - def FixedVoidTypePcds(self): > > - RetVal =3D {} > > - for Pcd in self.FixedAtBuildPcds: > > - if Pcd.DatumType =3D=3D TAB_VOID: > > - if '{}.{}'.format(Pcd.TokenSpaceGuidCName, Pcd.TokenC= Name) not in RetVal: > > - RetVal['{}.{}'.format(Pcd.TokenSpaceGuidCName, Pc= d.TokenCName)] =3D Pcd.DefaultValue > > - return RetVal > > - > > - @property > > - def UniqueBaseName(self): > > - BaseName =3D self.Name > > - for Module in self.PlatformInfo.ModuleAutoGenList: > > - if Module.MetaFile =3D=3D self.MetaFile: > > - continue > > - if Module.Name =3D=3D self.Name: > > - if uuid.UUID(Module.Guid) =3D=3D uuid.UUID(self.Guid)= : > > - EdkLogger.error("build", FILE_DUPLICATED, 'Module= s have same BaseName and FILE_GUID:\n' > > - ' %s\n %s' % (Module.MetaFile, = self.MetaFile)) > > - BaseName =3D '%s_%s' % (self.Name, self.Guid) > > - return BaseName > > - > > - # Macros could be used in build_rule.txt (also Makefile) > > - @cached_property > > - def Macros(self): > > - return OrderedDict(( > > - ("WORKSPACE" ,self.WorkspaceDir), > > - ("MODULE_NAME" ,self.Name), > > - ("MODULE_NAME_GUID" ,self.UniqueBaseName), > > - ("MODULE_GUID" ,self.Guid), > > - ("MODULE_VERSION" ,self.Version), > > - ("MODULE_TYPE" ,self.ModuleType), > > - ("MODULE_FILE" ,str(self.MetaFile)), > > - ("MODULE_FILE_BASE_NAME" ,self.MetaFile.BaseName), > > - ("MODULE_RELATIVE_DIR" ,self.SourceDir), > > - ("MODULE_DIR" ,self.SourceDir), > > - ("BASE_NAME" ,self.Name), > > - ("ARCH" ,self.Arch), > > - ("TOOLCHAIN" ,self.ToolChain), > > - ("TOOLCHAIN_TAG" ,self.ToolChain), > > - ("TOOL_CHAIN_TAG" ,self.ToolChain), > > - ("TARGET" ,self.BuildTarget), > > - ("BUILD_DIR" ,self.PlatformInfo.BuildDir), > > - ("BIN_DIR" ,os.path.join(self.PlatformInfo.BuildDir, self= .Arch)), > > - ("LIB_DIR" ,os.path.join(self.PlatformInfo.BuildDir, self= .Arch)), > > - ("MODULE_BUILD_DIR" ,self.BuildDir), > > - ("OUTPUT_DIR" ,self.OutputDir), > > - ("DEBUG_DIR" ,self.DebugDir), > > - ("DEST_DIR_OUTPUT" ,self.OutputDir), > > - ("DEST_DIR_DEBUG" ,self.DebugDir), > > - ("PLATFORM_NAME" ,self.PlatformInfo.Name), > > - ("PLATFORM_GUID" ,self.PlatformInfo.Guid), > > - ("PLATFORM_VERSION" ,self.PlatformInfo.Version), > > - ("PLATFORM_RELATIVE_DIR" ,self.PlatformInfo.SourceDir), > > - ("PLATFORM_DIR" ,mws.join(self.WorkspaceDir, 
self.Platfor= mInfo.SourceDir)), > > - ("PLATFORM_OUTPUT_DIR" ,self.PlatformInfo.OutputDir), > > - ("FFS_OUTPUT_DIR" ,self.FfsOutputDir) > > - )) > > - > > - ## Return the module build data object > > - @cached_property > > - def Module(self): > > - return self.BuildDatabase[self.MetaFile, self.Arch, self.Buil= dTarget, self.ToolChain] > > - > > - ## Return the module name > > - @cached_property > > - def Name(self): > > - return self.Module.BaseName > > - > > - ## Return the module DxsFile if exist > > - @cached_property > > - def DxsFile(self): > > - return self.Module.DxsFile > > - > > - ## Return the module meta-file GUID > > - @cached_property > > - def Guid(self): > > - # > > - # To build same module more than once, the module path with F= ILE_GUID overridden has > > - # the file name FILE_GUIDmodule.inf, but the relative path (s= elf.MetaFile.File) is the real path > > - # in DSC. The overridden GUID can be retrieved from file name > > - # > > - if os.path.basename(self.MetaFile.File) !=3D os.path.basename= (self.MetaFile.Path): > > - # > > - # Length of GUID is 36 > > - # > > - return os.path.basename(self.MetaFile.Path)[:36] > > - return self.Module.Guid > > - > > - ## Return the module version > > - @cached_property > > - def Version(self): > > - return self.Module.Version > > - > > - ## Return the module type > > - @cached_property > > - def ModuleType(self): > > - return self.Module.ModuleType > > - > > - ## Return the component type (for Edk.x style of module) > > - @cached_property > > - def ComponentType(self): > > - return self.Module.ComponentType > > - > > - ## Return the build type > > - @cached_property > > - def BuildType(self): > > - return self.Module.BuildType > > - > > - ## Return the PCD_IS_DRIVER setting > > - @cached_property > > - def PcdIsDriver(self): > > - return self.Module.PcdIsDriver > > - > > - ## Return the autogen version, i.e. 
module meta-file version > > - @cached_property > > - def AutoGenVersion(self): > > - return self.Module.AutoGenVersion > > - > > - ## Check if the module is library or not > > - @cached_property > > - def IsLibrary(self): > > - return bool(self.Module.LibraryClass) > > - > > - ## Check if the module is binary module or not > > - @cached_property > > - def IsBinaryModule(self): > > - return self.Module.IsBinaryModule > > - > > - ## Return the directory to store intermediate files of the module > > - @cached_property > > - def BuildDir(self): > > - return _MakeDir(( > > - self.PlatformInfo.BuildDir, > > - self.Arch, > > - self.SourceDir, > > - self.MetaFile.BaseName > > - )) > > - > > - ## Return the directory to store the intermediate object files of= the module > > - @cached_property > > - def OutputDir(self): > > - return _MakeDir((self.BuildDir, "OUTPUT")) > > - > > - ## Return the directory path to store ffs file > > - @cached_property > > - def FfsOutputDir(self): > > - if GlobalData.gFdfParser: > > - return path.join(self.PlatformInfo.BuildDir, TAB_FV_DIREC= TORY, "Ffs", self.Guid + self.Name) > > - return '' > > - > > - ## Return the directory to store auto-gened source files of the m= odule > > - @cached_property > > - def DebugDir(self): > > - return _MakeDir((self.BuildDir, "DEBUG")) > > - > > - ## Return the path of custom file > > - @cached_property > > - def CustomMakefile(self): > > - RetVal =3D {} > > - for Type in self.Module.CustomMakefile: > > - MakeType =3D gMakeTypeMap[Type] if Type in gMakeTypeMap e= lse 'nmake' > > - File =3D os.path.join(self.SourceDir, self.Module.CustomM= akefile[Type]) > > - RetVal[MakeType] =3D File > > - return RetVal > > - > > - ## Return the directory of the makefile > > - # > > - # @retval string The directory string of module's makefile > > - # > > - @cached_property > > - def MakeFileDir(self): > > - return self.BuildDir > > - > > - ## Return build command string > > - # > > - # @retval string Build command string > > - # > > - @cached_property > > - def BuildCommand(self): > > - return self.PlatformInfo.BuildCommand > > - > > - ## Get object list of all packages the module and its dependent l= ibraries belong to > > - # > > - # @retval list The list of package object > > - # > > - @cached_property > > - def DerivedPackageList(self): > > - PackageList =3D [] > > - for M in [self.Module] + self.DependentLibraryList: > > - for Package in M.Packages: > > - if Package in PackageList: > > - continue > > - PackageList.append(Package) > > - return PackageList > > - > > - ## Get the depex string > > - # > > - # @return : a string contain all depex expression. > > - def _GetDepexExpresionString(self): > > - DepexStr =3D '' > > - DepexList =3D [] > > - ## DPX_SOURCE IN Define section. > > - if self.Module.DxsFile: > > - return DepexStr > > - for M in [self.Module] + self.DependentLibraryList: > > - Filename =3D M.MetaFile.Path > > - InfObj =3D InfSectionParser.InfSectionParser(Filename) > > - DepexExpressionList =3D InfObj.GetDepexExpresionList() > > - for DepexExpression in DepexExpressionList: > > - for key in DepexExpression: > > - Arch, ModuleType =3D key > > - DepexExpr =3D [x for x in DepexExpression[key] if= not str(x).startswith('#')] > > - # the type of build module is USER_DEFINED. 
> > - # All different DEPEX section tags would be copie= d into the As Built INF file > > - # and there would be separate DEPEX section tags > > - if self.ModuleType.upper() =3D=3D SUP_MODULE_USER= _DEFINED or self.ModuleType.upper() =3D=3D SUP_MODULE_HOST_APPLICATION: > > - if (Arch.upper() =3D=3D self.Arch.upper()) an= d (ModuleType.upper() !=3D TAB_ARCH_COMMON): > > - DepexList.append({(Arch, ModuleType): Dep= exExpr}) > > - else: > > - if Arch.upper() =3D=3D TAB_ARCH_COMMON or \ > > - (Arch.upper() =3D=3D self.Arch.upper() and = \ > > - ModuleType.upper() in [TAB_ARCH_COMMON, sel= f.ModuleType.upper()]): > > - DepexList.append({(Arch, ModuleType): Dep= exExpr}) > > - > > - #the type of build module is USER_DEFINED. > > - if self.ModuleType.upper() =3D=3D SUP_MODULE_USER_DEFINED or = self.ModuleType.upper() =3D=3D SUP_MODULE_HOST_APPLICATION: > > - for Depex in DepexList: > > - for key in Depex: > > - DepexStr +=3D '[Depex.%s.%s]\n' % key > > - DepexStr +=3D '\n'.join('# '+ val for val in Depe= x[key]) > > - DepexStr +=3D '\n\n' > > - if not DepexStr: > > - return '[Depex.%s]\n' % self.Arch > > - return DepexStr > > - > > - #the type of build module not is USER_DEFINED. > > - Count =3D 0 > > - for Depex in DepexList: > > - Count +=3D 1 > > - if DepexStr !=3D '': > > - DepexStr +=3D ' AND ' > > - DepexStr +=3D '(' > > - for D in Depex.values(): > > - DepexStr +=3D ' '.join(val for val in D) > > - Index =3D DepexStr.find('END') > > - if Index > -1 and Index =3D=3D len(DepexStr) - 3: > > - DepexStr =3D DepexStr[:-3] > > - DepexStr =3D DepexStr.strip() > > - DepexStr +=3D ')' > > - if Count =3D=3D 1: > > - DepexStr =3D DepexStr.lstrip('(').rstrip(')').strip() > > - if not DepexStr: > > - return '[Depex.%s]\n' % self.Arch > > - return '[Depex.%s]\n# ' % self.Arch + DepexStr > > - > > - ## Merge dependency expression > > - # > > - # @retval list The token list of the dependency expressi= on after parsed > > - # > > - @cached_property > > - def DepexList(self): > > - if self.DxsFile or self.IsLibrary or TAB_DEPENDENCY_EXPRESSIO= N_FILE in self.FileTypes: > > - return {} > > - > > - DepexList =3D [] > > - # > > - # Append depex from dependent libraries, if not "BEFORE", "AF= TER" expression > > - # > > - for M in [self.Module] + self.DependentLibraryList: > > - Inherited =3D False > > - for D in M.Depex[self.Arch, self.ModuleType]: > > - if DepexList !=3D []: > > - DepexList.append('AND') > > - DepexList.append('(') > > - #replace D with value if D is FixedAtBuild PCD > > - NewList =3D [] > > - for item in D: > > - if '.' 
not in item: > > - NewList.append(item) > > - else: > > - FixedVoidTypePcds =3D {} > > - if item in self.FixedVoidTypePcds: > > - FixedVoidTypePcds =3D self.FixedVoidTypeP= cds > > - elif M in self.PlatformInfo.LibraryAutoGenLis= t: > > - Index =3D self.PlatformInfo.LibraryAutoGe= nList.index(M) > > - FixedVoidTypePcds =3D self.PlatformInfo.L= ibraryAutoGenList[Index].FixedVoidTypePcds > > - if item not in FixedVoidTypePcds: > > - EdkLogger.error("build", FORMAT_INVALID, = "{} used in [Depex] section should be used as FixedAtBuild type and VOID* d= atum type in the module.".format(item)) > > - else: > > - Value =3D FixedVoidTypePcds[item] > > - if len(Value.split(',')) !=3D 16: > > - EdkLogger.error("build", FORMAT_INVAL= ID, > > - "{} used in [Depex] s= ection should be used as FixedAtBuild type and VOID* datum type and 16 byte= s in the module.".format(item)) > > - NewList.append(Value) > > - DepexList.extend(NewList) > > - if DepexList[-1] =3D=3D 'END': # no need of a END at= this time > > - DepexList.pop() > > - DepexList.append(')') > > - Inherited =3D True > > - if Inherited: > > - EdkLogger.verbose("DEPEX[%s] (+%s) =3D %s" % (self.Na= me, M.BaseName, DepexList)) > > - if 'BEFORE' in DepexList or 'AFTER' in DepexList: > > - break > > - if len(DepexList) > 0: > > - EdkLogger.verbose('') > > - return {self.ModuleType:DepexList} > > - > > - ## Merge dependency expression > > - # > > - # @retval list The token list of the dependency expressi= on after parsed > > - # > > - @cached_property > > - def DepexExpressionDict(self): > > - if self.DxsFile or self.IsLibrary or TAB_DEPENDENCY_EXPRESSIO= N_FILE in self.FileTypes: > > - return {} > > - > > - DepexExpressionString =3D '' > > - # > > - # Append depex from dependent libraries, if not "BEFORE", "AF= TER" expresion > > - # > > - for M in [self.Module] + self.DependentLibraryList: > > - Inherited =3D False > > - for D in M.DepexExpression[self.Arch, self.ModuleType]: > > - if DepexExpressionString !=3D '': > > - DepexExpressionString +=3D ' AND ' > > - DepexExpressionString +=3D '(' > > - DepexExpressionString +=3D D > > - DepexExpressionString =3D DepexExpressionString.rstri= p('END').strip() > > - DepexExpressionString +=3D ')' > > - Inherited =3D True > > - if Inherited: > > - EdkLogger.verbose("DEPEX[%s] (+%s) =3D %s" % (self.Na= me, M.BaseName, DepexExpressionString)) > > - if 'BEFORE' in DepexExpressionString or 'AFTER' in DepexE= xpressionString: > > - break > > - if len(DepexExpressionString) > 0: > > - EdkLogger.verbose('') > > - > > - return {self.ModuleType:DepexExpressionString} > > - > > - # Get the tiano core user extension, it is contain dependent libr= ary. > > - # @retval: a list contain tiano core userextension. 
> > - # > > - def _GetTianoCoreUserExtensionList(self): > > - TianoCoreUserExtentionList =3D [] > > - for M in [self.Module] + self.DependentLibraryList: > > - Filename =3D M.MetaFile.Path > > - InfObj =3D InfSectionParser.InfSectionParser(Filename) > > - TianoCoreUserExtenList =3D InfObj.GetUserExtensionTianoCo= re() > > - for TianoCoreUserExtent in TianoCoreUserExtenList: > > - for Section in TianoCoreUserExtent: > > - ItemList =3D Section.split(TAB_SPLIT) > > - Arch =3D self.Arch > > - if len(ItemList) =3D=3D 4: > > - Arch =3D ItemList[3] > > - if Arch.upper() =3D=3D TAB_ARCH_COMMON or Arch.up= per() =3D=3D self.Arch.upper(): > > - TianoCoreList =3D [] > > - TianoCoreList.extend([TAB_SECTION_START + Sec= tion + TAB_SECTION_END]) > > - TianoCoreList.extend(TianoCoreUserExtent[Sect= ion][:]) > > - TianoCoreList.append('\n') > > - TianoCoreUserExtentionList.append(TianoCoreLi= st) > > - > > - return TianoCoreUserExtentionList > > - > > - ## Return the list of specification version required for the modu= le > > - # > > - # @retval list The list of specification defined in modu= le file > > - # > > - @cached_property > > - def Specification(self): > > - return self.Module.Specification > > - > > - ## Tool option for the module build > > - # > > - # @param PlatformInfo The object of PlatformBuildInfo > > - # @retval dict The dict containing valid options > > - # > > - @cached_property > > - def BuildOption(self): > > - RetVal, self.BuildRuleOrder =3D self.PlatformInfo.ApplyBuildO= ption(self.Module) > > - if self.BuildRuleOrder: > > - self.BuildRuleOrder =3D ['.%s' % Ext for Ext in self.Buil= dRuleOrder.split()] > > - return RetVal > > - > > - ## Get include path list from tool option for the module build > > - # > > - # @retval list The include path list > > - # > > - @cached_property > > - def BuildOptionIncPathList(self): > > - # > > - # Regular expression for finding Include Directories, the dif= ference between MSFT and INTEL/GCC/RVCT > > - # is the former use /I , the Latter used -I to specify includ= e directories > > - # > > - if self.PlatformInfo.ToolChainFamily in (TAB_COMPILER_MSFT): > > - BuildOptIncludeRegEx =3D gBuildOptIncludePatternMsft > > - elif self.PlatformInfo.ToolChainFamily in ('INTEL', 'GCC', 'R= VCT'): > > - BuildOptIncludeRegEx =3D gBuildOptIncludePatternOther > > - else: > > - # > > - # New ToolChainFamily, don't known whether there is optio= n to specify include directories > > - # > > - return [] > > - > > - RetVal =3D [] > > - for Tool in ('CC', 'PP', 'VFRPP', 'ASLPP', 'ASLCC', 'APP', 'A= SM'): > > - try: > > - FlagOption =3D self.BuildOption[Tool]['FLAGS'] > > - except KeyError: > > - FlagOption =3D '' > > - > > - if self.ToolChainFamily !=3D 'RVCT': > > - IncPathList =3D [NormPath(Path, self.Macros) for Path= in BuildOptIncludeRegEx.findall(FlagOption)] > > - else: > > - # > > - # RVCT may specify a list of directory separated by c= ommas > > - # > > - IncPathList =3D [] > > - for Path in BuildOptIncludeRegEx.findall(FlagOption): > > - PathList =3D GetSplitList(Path, TAB_COMMA_SPLIT) > > - IncPathList.extend(NormPath(PathEntry, self.Macro= s) for PathEntry in PathList) > > - > > - # > > - # EDK II modules must not reference header files outside = of the packages they depend on or > > - # within the module's directory tree. Report error if vio= lation. 
> > - # > > - if GlobalData.gDisableIncludePathCheck =3D=3D False: > > - for Path in IncPathList: > > - if (Path not in self.IncludePathList) and (Common= Path([Path, self.MetaFile.Dir]) !=3D self.MetaFile.Dir): > > - ErrMsg =3D "The include directory for the EDK= II module in this line is invalid %s specified in %s FLAGS '%s'" % (Path, = Tool, FlagOption) > > - EdkLogger.error("build", > > - PARAMETER_INVALID, > > - ExtraData=3DErrMsg, > > - File=3Dstr(self.MetaFile)) > > - RetVal +=3D IncPathList > > - return RetVal > > - > > - ## Return a list of files which can be built from source > > - # > > - # What kind of files can be built is determined by build rules i= n > > - # $(CONF_DIRECTORY)/build_rule.txt and toolchain family. > > - # > > - @cached_property > > - def SourceFileList(self): > > - RetVal =3D [] > > - ToolChainTagSet =3D {"", TAB_STAR, self.ToolChain} > > - ToolChainFamilySet =3D {"", TAB_STAR, self.ToolChainFamily, s= elf.BuildRuleFamily} > > - for F in self.Module.Sources: > > - # match tool chain > > - if F.TagName not in ToolChainTagSet: > > - EdkLogger.debug(EdkLogger.DEBUG_9, "The toolchain [%s= ] for processing file [%s] is found, " > > - "but [%s] is currently used" % (F.Tag= Name, str(F), self.ToolChain)) > > - continue > > - # match tool chain family or build rule family > > - if F.ToolChainFamily not in ToolChainFamilySet: > > - EdkLogger.debug( > > - EdkLogger.DEBUG_0, > > - "The file [%s] must be built by tools of = [%s], " \ > > - "but current toolchain family is [%s], bu= ildrule family is [%s]" \ > > - % (str(F), F.ToolChainFamily, self.To= olChainFamily, self.BuildRuleFamily)) > > - continue > > - > > - # add the file path into search path list for file includ= ing > > - if F.Dir not in self.IncludePathList: > > - self.IncludePathList.insert(0, F.Dir) > > - RetVal.append(F) > > - > > - self._MatchBuildRuleOrder(RetVal) > > - > > - for F in RetVal: > > - self._ApplyBuildRule(F, TAB_UNKNOWN_FILE) > > - return RetVal > > - > > - def _MatchBuildRuleOrder(self, FileList): > > - Order_Dict =3D {} > > - self.BuildOption > > - for SingleFile in FileList: > > - if self.BuildRuleOrder and SingleFile.Ext in self.BuildRu= leOrder and SingleFile.Ext in self.BuildRules: > > - key =3D SingleFile.Path.rsplit(SingleFile.Ext,1)[0] > > - if key in Order_Dict: > > - Order_Dict[key].append(SingleFile.Ext) > > - else: > > - Order_Dict[key] =3D [SingleFile.Ext] > > - > > - RemoveList =3D [] > > - for F in Order_Dict: > > - if len(Order_Dict[F]) > 1: > > - Order_Dict[F].sort(key=3Dlambda i: self.BuildRuleOrde= r.index(i)) > > - for Ext in Order_Dict[F][1:]: > > - RemoveList.append(F + Ext) > > - > > - for item in RemoveList: > > - FileList.remove(item) > > - > > - return FileList > > - > > - ## Return the list of unicode files > > - @cached_property > > - def UnicodeFileList(self): > > - return self.FileTypes.get(TAB_UNICODE_FILE,[]) > > - > > - ## Return the list of vfr files > > - @cached_property > > - def VfrFileList(self): > > - return self.FileTypes.get(TAB_VFR_FILE, []) > > - > > - ## Return the list of Image Definition files > > - @cached_property > > - def IdfFileList(self): > > - return self.FileTypes.get(TAB_IMAGE_FILE,[]) > > - > > - ## Return a list of files which can be built from binary > > - # > > - # "Build" binary files are just to copy them to build directory. 
> > - # > > - # @retval list The list of files which can be bu= ilt later > > - # > > - @cached_property > > - def BinaryFileList(self): > > - RetVal =3D [] > > - for F in self.Module.Binaries: > > - if F.Target not in [TAB_ARCH_COMMON, TAB_STAR] and F.Targ= et !=3D self.BuildTarget: > > - continue > > - RetVal.append(F) > > - self._ApplyBuildRule(F, F.Type, BinaryFileList=3DRetVal) > > - return RetVal > > - > > - @cached_property > > - def BuildRules(self): > > - RetVal =3D {} > > - BuildRuleDatabase =3D BuildRule > > - for Type in BuildRuleDatabase.FileTypeList: > > - #first try getting build rule by BuildRuleFamily > > - RuleObject =3D BuildRuleDatabase[Type, self.BuildType, se= lf.Arch, self.BuildRuleFamily] > > - if not RuleObject: > > - # build type is always module type, but ... > > - if self.ModuleType !=3D self.BuildType: > > - RuleObject =3D BuildRuleDatabase[Type, self.Modul= eType, self.Arch, self.BuildRuleFamily] > > - #second try getting build rule by ToolChainFamily > > - if not RuleObject: > > - RuleObject =3D BuildRuleDatabase[Type, self.BuildType= , self.Arch, self.ToolChainFamily] > > - if not RuleObject: > > - # build type is always module type, but ... > > - if self.ModuleType !=3D self.BuildType: > > - RuleObject =3D BuildRuleDatabase[Type, self.M= oduleType, self.Arch, self.ToolChainFamily] > > - if not RuleObject: > > - continue > > - RuleObject =3D RuleObject.Instantiate(self.Macros) > > - RetVal[Type] =3D RuleObject > > - for Ext in RuleObject.SourceFileExtList: > > - RetVal[Ext] =3D RuleObject > > - return RetVal > > - > > - def _ApplyBuildRule(self, File, FileType, BinaryFileList=3DNone): > > - if self._BuildTargets is None: > > - self._IntroBuildTargetList =3D set() > > - self._FinalBuildTargetList =3D set() > > - self._BuildTargets =3D defaultdict(set) > > - self._FileTypes =3D defaultdict(set) > > - > > - if not BinaryFileList: > > - BinaryFileList =3D self.BinaryFileList > > - > > - SubDirectory =3D os.path.join(self.OutputDir, File.SubDir) > > - if not os.path.exists(SubDirectory): > > - CreateDirectory(SubDirectory) > > - LastTarget =3D None > > - RuleChain =3D set() > > - SourceList =3D [File] > > - Index =3D 0 > > - # > > - # Make sure to get build rule order value > > - # > > - self.BuildOption > > - > > - while Index < len(SourceList): > > - Source =3D SourceList[Index] > > - Index =3D Index + 1 > > - > > - if Source !=3D File: > > - CreateDirectory(Source.Dir) > > - > > - if File.IsBinary and File =3D=3D Source and File in Binar= yFileList: > > - # Skip all files that are not binary libraries > > - if not self.IsLibrary: > > - continue > > - RuleObject =3D self.BuildRules[TAB_DEFAULT_BINARY_FIL= E] > > - elif FileType in self.BuildRules: > > - RuleObject =3D self.BuildRules[FileType] > > - elif Source.Ext in self.BuildRules: > > - RuleObject =3D self.BuildRules[Source.Ext] > > - else: > > - # stop at no more rules > > - if LastTarget: > > - self._FinalBuildTargetList.add(LastTarget) > > - break > > - > > - FileType =3D RuleObject.SourceFileType > > - self._FileTypes[FileType].add(Source) > > - > > - # stop at STATIC_LIBRARY for library > > - if self.IsLibrary and FileType =3D=3D TAB_STATIC_LIBRARY: > > - if LastTarget: > > - self._FinalBuildTargetList.add(LastTarget) > > - break > > - > > - Target =3D RuleObject.Apply(Source, self.BuildRuleOrder) > > - if not Target: > > - if LastTarget: > > - self._FinalBuildTargetList.add(LastTarget) > > - break > > - elif not Target.Outputs: > > - # Only do build for target with outputs > > - 
self._FinalBuildTargetList.add(Target) > > - > > - self._BuildTargets[FileType].add(Target) > > - > > - if not Source.IsBinary and Source =3D=3D File: > > - self._IntroBuildTargetList.add(Target) > > - > > - # to avoid cyclic rule > > - if FileType in RuleChain: > > - break > > - > > - RuleChain.add(FileType) > > - SourceList.extend(Target.Outputs) > > - LastTarget =3D Target > > - FileType =3D TAB_UNKNOWN_FILE > > - > > - @cached_property > > - def Targets(self): > > - if self._BuildTargets is None: > > - self._IntroBuildTargetList =3D set() > > - self._FinalBuildTargetList =3D set() > > - self._BuildTargets =3D defaultdict(set) > > - self._FileTypes =3D defaultdict(set) > > - > > - #TRICK: call SourceFileList property to apply build rule for = source files > > - self.SourceFileList > > - > > - #TRICK: call _GetBinaryFileList to apply build rule for binar= y files > > - self.BinaryFileList > > - > > - return self._BuildTargets > > - > > - @cached_property > > - def IntroTargetList(self): > > - self.Targets > > - return self._IntroBuildTargetList > > - > > - @cached_property > > - def CodaTargetList(self): > > - self.Targets > > - return self._FinalBuildTargetList > > - > > - @cached_property > > - def FileTypes(self): > > - self.Targets > > - return self._FileTypes > > - > > - ## Get the list of package object the module depends on > > - # > > - # @retval list The package object list > > - # > > - @cached_property > > - def DependentPackageList(self): > > - return self.Module.Packages > > - > > - ## Return the list of auto-generated code file > > - # > > - # @retval list The list of auto-generated file > > - # > > - @cached_property > > - def AutoGenFileList(self): > > - AutoGenUniIdf =3D self.BuildType !=3D 'UEFI_HII' > > - UniStringBinBuffer =3D BytesIO() > > - IdfGenBinBuffer =3D BytesIO() > > - RetVal =3D {} > > - AutoGenC =3D TemplateString() > > - AutoGenH =3D TemplateString() > > - StringH =3D TemplateString() > > - StringIdf =3D TemplateString() > > - GenC.CreateCode(self, AutoGenC, AutoGenH, StringH, AutoGenUni= Idf, UniStringBinBuffer, StringIdf, AutoGenUniIdf, IdfGenBinBuffer) > > - # > > - # AutoGen.c is generated if there are library classes in inf,= or there are object files > > - # > > - if str(AutoGenC) !=3D "" and (len(self.Module.LibraryClasses)= > 0 > > - or TAB_OBJECT_FILE in self.FileTy= pes): > > - AutoFile =3D PathClass(gAutoGenCodeFileName, self.DebugDi= r) > > - RetVal[AutoFile] =3D str(AutoGenC) > > - self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE) > > - if str(AutoGenH) !=3D "": > > - AutoFile =3D PathClass(gAutoGenHeaderFileName, self.Debug= Dir) > > - RetVal[AutoFile] =3D str(AutoGenH) > > - self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE) > > - if str(StringH) !=3D "": > > - AutoFile =3D PathClass(gAutoGenStringFileName % {"module_= name":self.Name}, self.DebugDir) > > - RetVal[AutoFile] =3D str(StringH) > > - self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE) > > - if UniStringBinBuffer is not None and UniStringBinBuffer.getv= alue() !=3D b"": > > - AutoFile =3D PathClass(gAutoGenStringFormFileName % {"mod= ule_name":self.Name}, self.OutputDir) > > - RetVal[AutoFile] =3D UniStringBinBuffer.getvalue() > > - AutoFile.IsBinary =3D True > > - self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE) > > - if UniStringBinBuffer is not None: > > - UniStringBinBuffer.close() > > - if str(StringIdf) !=3D "": > > - AutoFile =3D PathClass(gAutoGenImageDefFileName % {"modul= e_name":self.Name}, self.DebugDir) > > - RetVal[AutoFile] =3D str(StringIdf) > > - 
self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE) > > - if IdfGenBinBuffer is not None and IdfGenBinBuffer.getvalue()= !=3D b"": > > - AutoFile =3D PathClass(gAutoGenIdfFileName % {"module_nam= e":self.Name}, self.OutputDir) > > - RetVal[AutoFile] =3D IdfGenBinBuffer.getvalue() > > - AutoFile.IsBinary =3D True > > - self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE) > > - if IdfGenBinBuffer is not None: > > - IdfGenBinBuffer.close() > > - return RetVal > > - > > - ## Return the list of library modules explicitly or implicitly us= ed by this module > > - @cached_property > > - def DependentLibraryList(self): > > - # only merge library classes and PCD for non-library module > > - if self.IsLibrary: > > - return [] > > - return self.PlatformInfo.ApplyLibraryInstance(self.Module) > > - > > - ## Get the list of PCDs from current module > > - # > > - # @retval list The list of PCD > > - # > > - @cached_property > > - def ModulePcdList(self): > > - # apply PCD settings from platform > > - RetVal =3D self.PlatformInfo.ApplyPcdSetting(self.Module, sel= f.Module.Pcds) > > - ExtendCopyDictionaryLists(self._PcdComments, self.Module.PcdC= omments) > > - return RetVal > > - > > - ## Get the list of PCDs from dependent libraries > > - # > > - # @retval list The list of PCD > > - # > > - @cached_property > > - def LibraryPcdList(self): > > - if self.IsLibrary: > > - return [] > > - RetVal =3D [] > > - Pcds =3D set() > > - # get PCDs from dependent libraries > > - for Library in self.DependentLibraryList: > > - PcdsInLibrary =3D OrderedDict() > > - ExtendCopyDictionaryLists(self._PcdComments, Library.PcdC= omments) > > - for Key in Library.Pcds: > > - # skip duplicated PCDs > > - if Key in self.Module.Pcds or Key in Pcds: > > - continue > > - Pcds.add(Key) > > - PcdsInLibrary[Key] =3D copy.copy(Library.Pcds[Key]) > > - RetVal.extend(self.PlatformInfo.ApplyPcdSetting(self.Modu= le, PcdsInLibrary, Library=3DLibrary)) > > - return RetVal > > - > > - ## Get the GUID value mapping > > - # > > - # @retval dict The mapping between GUID cname and its va= lue > > - # > > - @cached_property > > - def GuidList(self): > > - RetVal =3D OrderedDict(self.Module.Guids) > > - for Library in self.DependentLibraryList: > > - RetVal.update(Library.Guids) > > - ExtendCopyDictionaryLists(self._GuidComments, Library.Gui= dComments) > > - ExtendCopyDictionaryLists(self._GuidComments, self.Module.Gui= dComments) > > - return RetVal > > - > > - @cached_property > > - def GetGuidsUsedByPcd(self): > > - RetVal =3D OrderedDict(self.Module.GetGuidsUsedByPcd()) > > - for Library in self.DependentLibraryList: > > - RetVal.update(Library.GetGuidsUsedByPcd()) > > - return RetVal > > - ## Get the protocol value mapping > > - # > > - # @retval dict The mapping between protocol cname and it= s value > > - # > > - @cached_property > > - def ProtocolList(self): > > - RetVal =3D OrderedDict(self.Module.Protocols) > > - for Library in self.DependentLibraryList: > > - RetVal.update(Library.Protocols) > > - ExtendCopyDictionaryLists(self._ProtocolComments, Library= .ProtocolComments) > > - ExtendCopyDictionaryLists(self._ProtocolComments, self.Module= .ProtocolComments) > > - return RetVal > > - > > - ## Get the PPI value mapping > > - # > > - # @retval dict The mapping between PPI cname and its val= ue > > - # > > - @cached_property > > - def PpiList(self): > > - RetVal =3D OrderedDict(self.Module.Ppis) > > - for Library in self.DependentLibraryList: > > - RetVal.update(Library.Ppis) > > - ExtendCopyDictionaryLists(self._PpiComments, Library.PpiC= 
omments) > > - ExtendCopyDictionaryLists(self._PpiComments, self.Module.PpiC= omments) > > - return RetVal > > - > > - ## Get the list of include search path > > - # > > - # @retval list The list path > > - # > > - @cached_property > > - def IncludePathList(self): > > - RetVal =3D [] > > - RetVal.append(self.MetaFile.Dir) > > - RetVal.append(self.DebugDir) > > - > > - for Package in self.Module.Packages: > > - PackageDir =3D mws.join(self.WorkspaceDir, Package.MetaFi= le.Dir) > > - if PackageDir not in RetVal: > > - RetVal.append(PackageDir) > > - IncludesList =3D Package.Includes > > - if Package._PrivateIncludes: > > - if not self.MetaFile.OriginalPath.Path.startswith(Pac= kageDir): > > - IncludesList =3D list(set(Package.Includes).diffe= rence(set(Package._PrivateIncludes))) > > - for Inc in IncludesList: > > - if Inc not in RetVal: > > - RetVal.append(str(Inc)) > > - return RetVal > > - > > - @cached_property > > - def IncludePathLength(self): > > - return sum(len(inc)+1 for inc in self.IncludePathList) > > - > > - ## Get HII EX PCDs which maybe used by VFR > > - # > > - # efivarstore used by VFR may relate with HII EX PCDs > > - # Get the variable name and GUID from efivarstore and HII EX PCD > > - # List the HII EX PCDs in As Built INF if both name and GUID mat= ch. > > - # > > - # @retval list HII EX PCDs > > - # > > - def _GetPcdsMaybeUsedByVfr(self): > > - if not self.SourceFileList: > > - return [] > > - > > - NameGuids =3D set() > > - for SrcFile in self.SourceFileList: > > - if SrcFile.Ext.lower() !=3D '.vfr': > > - continue > > - Vfri =3D os.path.join(self.OutputDir, SrcFile.BaseName + = '.i') > > - if not os.path.exists(Vfri): > > - continue > > - VfriFile =3D open(Vfri, 'r') > > - Content =3D VfriFile.read() > > - VfriFile.close() > > - Pos =3D Content.find('efivarstore') > > - while Pos !=3D -1: > > - # > > - # Make sure 'efivarstore' is the start of efivarstore= statement > > - # In case of the value of 'name' (name =3D efivarstor= e) is equal to 'efivarstore' > > - # > > - Index =3D Pos - 1 > > - while Index >=3D 0 and Content[Index] in ' \t\r\n': > > - Index -=3D 1 > > - if Index >=3D 0 and Content[Index] !=3D ';': > > - Pos =3D Content.find('efivarstore', Pos + len('ef= ivarstore')) > > - continue > > - # > > - # 'efivarstore' must be followed by name and guid > > - # > > - Name =3D gEfiVarStoreNamePattern.search(Content, Pos) > > - if not Name: > > - break > > - Guid =3D gEfiVarStoreGuidPattern.search(Content, Pos) > > - if not Guid: > > - break > > - NameArray =3D _ConvertStringToByteArray('L"' + Name.g= roup(1) + '"') > > - NameGuids.add((NameArray, GuidStructureStringToGuidSt= ring(Guid.group(1)))) > > - Pos =3D Content.find('efivarstore', Name.end()) > > - if not NameGuids: > > - return [] > > - HiiExPcds =3D [] > > - for Pcd in self.PlatformInfo.Platform.Pcds.values(): > > - if Pcd.Type !=3D TAB_PCDS_DYNAMIC_EX_HII: > > - continue > > - for SkuInfo in Pcd.SkuInfoList.values(): > > - Value =3D GuidValue(SkuInfo.VariableGuid, self.Platfo= rmInfo.PackageList, self.MetaFile.Path) > > - if not Value: > > - continue > > - Name =3D _ConvertStringToByteArray(SkuInfo.VariableNa= me) > > - Guid =3D GuidStructureStringToGuidString(Value) > > - if (Name, Guid) in NameGuids and Pcd not in HiiExPcds= : > > - HiiExPcds.append(Pcd) > > - break > > - > > - return HiiExPcds > > - > > - def _GenOffsetBin(self): > > - VfrUniBaseName =3D {} > > - for SourceFile in self.Module.Sources: > > - if SourceFile.Type.upper() =3D=3D ".VFR" : > > - # > > - # search the .map file to find the 
offset of vfr bina= ry in the PE32+/TE file. > > - # > > - VfrUniBaseName[SourceFile.BaseName] =3D (SourceFile.B= aseName + "Bin") > > - elif SourceFile.Type.upper() =3D=3D ".UNI" : > > - # > > - # search the .map file to find the offset of Uni stri= ngs binary in the PE32+/TE file. > > - # > > - VfrUniBaseName["UniOffsetName"] =3D (self.Name + "Str= ings") > > - > > - if not VfrUniBaseName: > > - return None > > - MapFileName =3D os.path.join(self.OutputDir, self.Name + ".ma= p") > > - EfiFileName =3D os.path.join(self.OutputDir, self.Name + ".ef= i") > > - VfrUniOffsetList =3D GetVariableOffset(MapFileName, EfiFileNa= me, list(VfrUniBaseName.values())) > > - if not VfrUniOffsetList: > > - return None > > - > > - OutputName =3D '%sOffset.bin' % self.Name > > - UniVfrOffsetFileName =3D os.path.join( self.OutputDir, Ou= tputName) > > - > > - try: > > - fInputfile =3D open(UniVfrOffsetFileName, "wb+", 0) > > - except: > > - EdkLogger.error("build", FILE_OPEN_FAILURE, "File open fa= iled for %s" % UniVfrOffsetFileName, None) > > - > > - # Use a instance of BytesIO to cache data > > - fStringIO =3D BytesIO() > > - > > - for Item in VfrUniOffsetList: > > - if (Item[0].find("Strings") !=3D -1): > > - # > > - # UNI offset in image. > > - # GUID + Offset > > - # { 0x8913c5e0, 0x33f6, 0x4d86, { 0x9b, 0xf1, 0x43, 0= xef, 0x89, 0xfc, 0x6, 0x66 } } > > - # > > - UniGuid =3D b'\xe0\xc5\x13\x89\xf63\x86M\x9b\xf1C\xef= \x89\xfc\x06f' > > - fStringIO.write(UniGuid) > > - UniValue =3D pack ('Q', int (Item[1], 16)) > > - fStringIO.write (UniValue) > > - else: > > - # > > - # VFR binary offset in image. > > - # GUID + Offset > > - # { 0xd0bc7cb4, 0x6a47, 0x495f, { 0xaa, 0x11, 0x71, 0= x7, 0x46, 0xda, 0x6, 0xa2 } }; > > - # > > - VfrGuid =3D b'\xb4|\xbc\xd0Gj_I\xaa\x11q\x07F\xda\x06= \xa2' > > - fStringIO.write(VfrGuid) > > - VfrValue =3D pack ('Q', int (Item[1], 16)) > > - fStringIO.write (VfrValue) > > - # > > - # write data into file. > > - # > > - try : > > - fInputfile.write (fStringIO.getvalue()) > > - except: > > - EdkLogger.error("build", FILE_WRITE_FAILURE, "Write data = to file %s failed, please check whether the " > > - "file been locked or using by other appli= cations." 
%UniVfrOffsetFileName, None) > > - > > - fStringIO.close () > > - fInputfile.close () > > - return OutputName > > - > > - @cached_property > > - def OutputFile(self): > > - retVal =3D set() > > - OutputDir =3D self.OutputDir.replace('\\', '/').strip('/') > > - DebugDir =3D self.DebugDir.replace('\\', '/').strip('/') > > - for Item in self.CodaTargetList: > > - File =3D Item.Target.Path.replace('\\', '/').strip('/').r= eplace(DebugDir, '').replace(OutputDir, '').strip('/') > > - retVal.add(File) > > - if self.DepexGenerated: > > - retVal.add(self.Name + '.depex') > > - > > - Bin =3D self._GenOffsetBin() > > - if Bin: > > - retVal.add(Bin) > > - > > - for Root, Dirs, Files in os.walk(OutputDir): > > - for File in Files: > > - if File.lower().endswith('.pdb'): > > - retVal.add(File) > > - > > - return retVal > > - > > - ## Create AsBuilt INF file the module > > - # > > - def CreateAsBuiltInf(self): > > - > > - if self.IsAsBuiltInfCreated: > > - return > > - > > - # Skip INF file generation for libraries > > - if self.IsLibrary: > > - return > > - > > - # Skip the following code for modules with no source files > > - if not self.SourceFileList: > > - return > > - > > - # Skip the following code for modules without any binary file= s > > - if self.BinaryFileList: > > - return > > - > > - ### TODO: How to handles mixed source and binary modules > > - > > - # Find all DynamicEx and PatchableInModule PCDs used by this = module and dependent libraries > > - # Also find all packages that the DynamicEx PCDs depend on > > - Pcds =3D [] > > - PatchablePcds =3D [] > > - Packages =3D [] > > - PcdCheckList =3D [] > > - PcdTokenSpaceList =3D [] > > - for Pcd in self.ModulePcdList + self.LibraryPcdList: > > - if Pcd.Type =3D=3D TAB_PCDS_PATCHABLE_IN_MODULE: > > - PatchablePcds.append(Pcd) > > - PcdCheckList.append((Pcd.TokenCName, Pcd.TokenSpaceGu= idCName, TAB_PCDS_PATCHABLE_IN_MODULE)) > > - elif Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET: > > - if Pcd not in Pcds: > > - Pcds.append(Pcd) > > - PcdCheckList.append((Pcd.TokenCName, Pcd.TokenSpa= ceGuidCName, TAB_PCDS_DYNAMIC_EX)) > > - PcdCheckList.append((Pcd.TokenCName, Pcd.TokenSpa= ceGuidCName, TAB_PCDS_DYNAMIC)) > > - PcdTokenSpaceList.append(Pcd.TokenSpaceGuidCName) > > - GuidList =3D OrderedDict(self.GuidList) > > - for TokenSpace in self.GetGuidsUsedByPcd: > > - # If token space is not referred by patch PCD or Ex PCD, = remove the GUID from GUID list > > - # The GUIDs in GUIDs section should really be the GUIDs i= n source INF or referred by Ex an patch PCDs > > - if TokenSpace not in PcdTokenSpaceList and TokenSpace in = GuidList: > > - GuidList.pop(TokenSpace) > > - CheckList =3D (GuidList, self.PpiList, self.ProtocolList, Pcd= CheckList) > > - for Package in self.DerivedPackageList: > > - if Package in Packages: > > - continue > > - BeChecked =3D (Package.Guids, Package.Ppis, Package.Proto= cols, Package.Pcds) > > - Found =3D False > > - for Index in range(len(BeChecked)): > > - for Item in CheckList[Index]: > > - if Item in BeChecked[Index]: > > - Packages.append(Package) > > - Found =3D True > > - break > > - if Found: > > - break > > - > > - VfrPcds =3D self._GetPcdsMaybeUsedByVfr() > > - for Pkg in self.PlatformInfo.PackageList: > > - if Pkg in Packages: > > - continue > > - for VfrPcd in VfrPcds: > > - if ((VfrPcd.TokenCName, VfrPcd.TokenSpaceGuidCName, T= AB_PCDS_DYNAMIC_EX) in Pkg.Pcds or > > - (VfrPcd.TokenCName, VfrPcd.TokenSpaceGuidCName, T= AB_PCDS_DYNAMIC) in Pkg.Pcds): > > - Packages.append(Pkg) > > - break > > - > > - ModuleType =3D 
SUP_MODULE_DXE_DRIVER if self.ModuleType =3D= =3D SUP_MODULE_UEFI_DRIVER and self.DepexGenerated else self.ModuleType > > - DriverType =3D self.PcdIsDriver if self.PcdIsDriver else '' > > - Guid =3D self.Guid > > - MDefs =3D self.Module.Defines > > - > > - AsBuiltInfDict =3D { > > - 'module_name' : self.Name, > > - 'module_guid' : Guid, > > - 'module_module_type' : ModuleType, > > - 'module_version_string' : [MDefs['VERSION_STRIN= G']] if 'VERSION_STRING' in MDefs else [], > > - 'pcd_is_driver_string' : [], > > - 'module_uefi_specification_version' : [], > > - 'module_pi_specification_version' : [], > > - 'module_entry_point' : self.Module.ModuleEnt= ryPointList, > > - 'module_unload_image' : self.Module.ModuleUnl= oadImageList, > > - 'module_constructor' : self.Module.Construct= orList, > > - 'module_destructor' : self.Module.Destructo= rList, > > - 'module_shadow' : [MDefs['SHADOW']] if = 'SHADOW' in MDefs else [], > > - 'module_pci_vendor_id' : [MDefs['PCI_VENDOR_ID= ']] if 'PCI_VENDOR_ID' in MDefs else [], > > - 'module_pci_device_id' : [MDefs['PCI_DEVICE_ID= ']] if 'PCI_DEVICE_ID' in MDefs else [], > > - 'module_pci_class_code' : [MDefs['PCI_CLASS_COD= E']] if 'PCI_CLASS_CODE' in MDefs else [], > > - 'module_pci_revision' : [MDefs['PCI_REVISION'= ]] if 'PCI_REVISION' in MDefs else [], > > - 'module_build_number' : [MDefs['BUILD_NUMBER'= ]] if 'BUILD_NUMBER' in MDefs else [], > > - 'module_spec' : [MDefs['SPEC']] if 'S= PEC' in MDefs else [], > > - 'module_uefi_hii_resource_section' : [MDefs['UEFI_HII_RESO= URCE_SECTION']] if 'UEFI_HII_RESOURCE_SECTION' in MDefs else [], > > - 'module_uni_file' : [MDefs['MODULE_UNI_FI= LE']] if 'MODULE_UNI_FILE' in MDefs else [], > > - 'module_arch' : self.Arch, > > - 'package_item' : [Package.MetaFile.Fil= e.replace('\\', '/') for Package in Packages], > > - 'binary_item' : [], > > - 'patchablepcd_item' : [], > > - 'pcd_item' : [], > > - 'protocol_item' : [], > > - 'ppi_item' : [], > > - 'guid_item' : [], > > - 'flags_item' : [], > > - 'libraryclasses_item' : [] > > - } > > - > > - if 'MODULE_UNI_FILE' in MDefs: > > - UNIFile =3D os.path.join(self.MetaFile.Dir, MDefs['MODULE= _UNI_FILE']) > > - if os.path.isfile(UNIFile): > > - shutil.copy2(UNIFile, self.OutputDir) > > - > > - if self.AutoGenVersion > int(gInfSpecVersion, 0): > > - AsBuiltInfDict['module_inf_version'] =3D '0x%08x' % self.= AutoGenVersion > > - else: > > - AsBuiltInfDict['module_inf_version'] =3D gInfSpecVersion > > - > > - if DriverType: > > - AsBuiltInfDict['pcd_is_driver_string'].append(DriverType) > > - > > - if 'UEFI_SPECIFICATION_VERSION' in self.Specification: > > - AsBuiltInfDict['module_uefi_specification_version'].appen= d(self.Specification['UEFI_SPECIFICATION_VERSION']) > > - if 'PI_SPECIFICATION_VERSION' in self.Specification: > > - AsBuiltInfDict['module_pi_specification_version'].append(= self.Specification['PI_SPECIFICATION_VERSION']) > > - > > - OutputDir =3D self.OutputDir.replace('\\', '/').strip('/') > > - DebugDir =3D self.DebugDir.replace('\\', '/').strip('/') > > - for Item in self.CodaTargetList: > > - File =3D Item.Target.Path.replace('\\', '/').strip('/').r= eplace(DebugDir, '').replace(OutputDir, '').strip('/') > > - if os.path.isabs(File): > > - File =3D File.replace('\\', '/').strip('/').replace(O= utputDir, '').strip('/') > > - if Item.Target.Ext.lower() =3D=3D '.aml': > > - AsBuiltInfDict['binary_item'].append('ASL|' + File) > > - elif Item.Target.Ext.lower() =3D=3D '.acpi': > > - AsBuiltInfDict['binary_item'].append('ACPI|' + File) > > - elif 
Item.Target.Ext.lower() =3D=3D '.efi': > > - AsBuiltInfDict['binary_item'].append('PE32|' + self.N= ame + '.efi') > > - else: > > - AsBuiltInfDict['binary_item'].append('BIN|' + File) > > - if not self.DepexGenerated: > > - DepexFile =3D os.path.join(self.OutputDir, self.Name + '.= depex') > > - if os.path.exists(DepexFile): > > - self.DepexGenerated =3D True > > - if self.DepexGenerated: > > - if self.ModuleType in [SUP_MODULE_PEIM]: > > - AsBuiltInfDict['binary_item'].append('PEI_DEPEX|' + s= elf.Name + '.depex') > > - elif self.ModuleType in [SUP_MODULE_DXE_DRIVER, SUP_MODUL= E_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_UEFI_DRIVER]: > > - AsBuiltInfDict['binary_item'].append('DXE_DEPEX|' + s= elf.Name + '.depex') > > - elif self.ModuleType in [SUP_MODULE_DXE_SMM_DRIVER]: > > - AsBuiltInfDict['binary_item'].append('SMM_DEPEX|' + s= elf.Name + '.depex') > > - > > - Bin =3D self._GenOffsetBin() > > - if Bin: > > - AsBuiltInfDict['binary_item'].append('BIN|%s' % Bin) > > - > > - for Root, Dirs, Files in os.walk(OutputDir): > > - for File in Files: > > - if File.lower().endswith('.pdb'): > > - AsBuiltInfDict['binary_item'].append('DISPOSABLE|= ' + File) > > - HeaderComments =3D self.Module.HeaderComments > > - StartPos =3D 0 > > - for Index in range(len(HeaderComments)): > > - if HeaderComments[Index].find('@BinaryHeader') !=3D -1: > > - HeaderComments[Index] =3D HeaderComments[Index].repla= ce('@BinaryHeader', '@file') > > - StartPos =3D Index > > - break > > - AsBuiltInfDict['header_comments'] =3D '\n'.join(HeaderComment= s[StartPos:]).replace(':#', '://') > > - AsBuiltInfDict['tail_comments'] =3D '\n'.join(self.Module.Tai= lComments) > > - > > - GenList =3D [ > > - (self.ProtocolList, self._ProtocolComments, 'protocol_ite= m'), > > - (self.PpiList, self._PpiComments, 'ppi_item'), > > - (GuidList, self._GuidComments, 'guid_item') > > - ] > > - for Item in GenList: > > - for CName in Item[0]: > > - Comments =3D '\n '.join(Item[1][CName]) if CName in = Item[1] else '' > > - Entry =3D Comments + '\n ' + CName if Comments else = CName > > - AsBuiltInfDict[Item[2]].append(Entry) > > - PatchList =3D parsePcdInfoFromMapFile( > > - os.path.join(self.OutputDir, self.Name + = '.map'), > > - os.path.join(self.OutputDir, self.Name + = '.efi') > > - ) > > - if PatchList: > > - for Pcd in PatchablePcds: > > - TokenCName =3D Pcd.TokenCName > > - for PcdItem in GlobalData.MixedPcd: > > - if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) in G= lobalData.MixedPcd[PcdItem]: > > - TokenCName =3D PcdItem[0] > > - break > > - for PatchPcd in PatchList: > > - if TokenCName =3D=3D PatchPcd[0]: > > - break > > - else: > > - continue > > - PcdValue =3D '' > > - if Pcd.DatumType =3D=3D 'BOOLEAN': > > - BoolValue =3D Pcd.DefaultValue.upper() > > - if BoolValue =3D=3D 'TRUE': > > - Pcd.DefaultValue =3D '1' > > - elif BoolValue =3D=3D 'FALSE': > > - Pcd.DefaultValue =3D '0' > > - > > - if Pcd.DatumType in TAB_PCD_NUMERIC_TYPES: > > - HexFormat =3D '0x%02x' > > - if Pcd.DatumType =3D=3D TAB_UINT16: > > - HexFormat =3D '0x%04x' > > - elif Pcd.DatumType =3D=3D TAB_UINT32: > > - HexFormat =3D '0x%08x' > > - elif Pcd.DatumType =3D=3D TAB_UINT64: > > - HexFormat =3D '0x%016x' > > - PcdValue =3D HexFormat % int(Pcd.DefaultValue, 0) > > - else: > > - if Pcd.MaxDatumSize is None or Pcd.MaxDatumSize = =3D=3D '': > > - EdkLogger.error("build", AUTOGEN_ERROR, > > - "Unknown [MaxDatumSize] of PC= D [%s.%s]" % (Pcd.TokenSpaceGuidCName, TokenCName) > > - ) > > - ArraySize =3D int(Pcd.MaxDatumSize, 0) > > - PcdValue =3D 
Pcd.DefaultValue > > - if PcdValue[0] !=3D '{': > > - Unicode =3D False > > - if PcdValue[0] =3D=3D 'L': > > - Unicode =3D True > > - PcdValue =3D PcdValue.lstrip('L') > > - PcdValue =3D eval(PcdValue) > > - NewValue =3D '{' > > - for Index in range(0, len(PcdValue)): > > - if Unicode: > > - CharVal =3D ord(PcdValue[Index]) > > - NewValue =3D NewValue + '0x%02x' % (C= harVal & 0x00FF) + ', ' \ > > - + '0x%02x' % (CharVal >> 8) += ', ' > > - else: > > - NewValue =3D NewValue + '0x%02x' % (o= rd(PcdValue[Index]) % 0x100) + ', ' > > - Padding =3D '0x00, ' > > - if Unicode: > > - Padding =3D Padding * 2 > > - ArraySize =3D ArraySize // 2 > > - if ArraySize < (len(PcdValue) + 1): > > - if Pcd.MaxSizeUserSet: > > - EdkLogger.error("build", AUTOGEN_ERRO= R, > > - "The maximum size of VOID= * type PCD '%s.%s' is less than its actual size occupied." % (Pcd.TokenSpac= eGuidCName, TokenCName) > > - ) > > - else: > > - ArraySize =3D len(PcdValue) + 1 > > - if ArraySize > len(PcdValue) + 1: > > - NewValue =3D NewValue + Padding * (ArrayS= ize - len(PcdValue) - 1) > > - PcdValue =3D NewValue + Padding.strip().rstri= p(',') + '}' > > - elif len(PcdValue.split(',')) <=3D ArraySize: > > - PcdValue =3D PcdValue.rstrip('}') + ', 0x00' = * (ArraySize - len(PcdValue.split(','))) > > - PcdValue +=3D '}' > > - else: > > - if Pcd.MaxSizeUserSet: > > - EdkLogger.error("build", AUTOGEN_ERROR, > > - "The maximum size of VOID* ty= pe PCD '%s.%s' is less than its actual size occupied." % (Pcd.TokenSpaceGui= dCName, TokenCName) > > - ) > > - else: > > - ArraySize =3D len(PcdValue) + 1 > > - PcdItem =3D '%s.%s|%s|0x%X' % \ > > - (Pcd.TokenSpaceGuidCName, TokenCName, PcdValue, P= atchPcd[1]) > > - PcdComments =3D '' > > - if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) in self.= _PcdComments: > > - PcdComments =3D '\n '.join(self._PcdComments[Pcd= .TokenSpaceGuidCName, Pcd.TokenCName]) > > - if PcdComments: > > - PcdItem =3D PcdComments + '\n ' + PcdItem > > - AsBuiltInfDict['patchablepcd_item'].append(PcdItem) > > - > > - for Pcd in Pcds + VfrPcds: > > - PcdCommentList =3D [] > > - HiiInfo =3D '' > > - TokenCName =3D Pcd.TokenCName > > - for PcdItem in GlobalData.MixedPcd: > > - if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) in Globa= lData.MixedPcd[PcdItem]: > > - TokenCName =3D PcdItem[0] > > - break > > - if Pcd.Type =3D=3D TAB_PCDS_DYNAMIC_EX_HII: > > - for SkuName in Pcd.SkuInfoList: > > - SkuInfo =3D Pcd.SkuInfoList[SkuName] > > - HiiInfo =3D '## %s|%s|%s' % (SkuInfo.VariableName= , SkuInfo.VariableGuid, SkuInfo.VariableOffset) > > - break > > - if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) in self._Pcd= Comments: > > - PcdCommentList =3D self._PcdComments[Pcd.TokenSpaceGu= idCName, Pcd.TokenCName][:] > > - if HiiInfo: > > - UsageIndex =3D -1 > > - UsageStr =3D '' > > - for Index, Comment in enumerate(PcdCommentList): > > - for Usage in UsageList: > > - if Comment.find(Usage) !=3D -1: > > - UsageStr =3D Usage > > - UsageIndex =3D Index > > - break > > - if UsageIndex !=3D -1: > > - PcdCommentList[UsageIndex] =3D '## %s %s %s' % (U= sageStr, HiiInfo, PcdCommentList[UsageIndex].replace(UsageStr, '')) > > - else: > > - PcdCommentList.append('## UNDEFINED ' + HiiInfo) > > - PcdComments =3D '\n '.join(PcdCommentList) > > - PcdEntry =3D Pcd.TokenSpaceGuidCName + '.' 
+ TokenCName > > - if PcdComments: > > - PcdEntry =3D PcdComments + '\n ' + PcdEntry > > - AsBuiltInfDict['pcd_item'].append(PcdEntry) > > - for Item in self.BuildOption: > > - if 'FLAGS' in self.BuildOption[Item]: > > - AsBuiltInfDict['flags_item'].append('%s:%s_%s_%s_%s_F= LAGS =3D %s' % (self.ToolChainFamily, self.BuildTarget, self.ToolChain, sel= f.Arch, Item, self.BuildOption[Item]['FLAGS'].strip())) > > - > > - # Generated LibraryClasses section in comments. > > - for Library in self.LibraryAutoGenList: > > - AsBuiltInfDict['libraryclasses_item'].append(Library.Meta= File.File.replace('\\', '/')) > > - > > - # Generated UserExtensions TianoCore section. > > - # All tianocore user extensions are copied. > > - UserExtStr =3D '' > > - for TianoCore in self._GetTianoCoreUserExtensionList(): > > - UserExtStr +=3D '\n'.join(TianoCore) > > - ExtensionFile =3D os.path.join(self.MetaFile.Dir, TianoCo= re[1]) > > - if os.path.isfile(ExtensionFile): > > - shutil.copy2(ExtensionFile, self.OutputDir) > > - AsBuiltInfDict['userextension_tianocore_item'] =3D UserExtStr > > - > > - # Generated depex expression section in comments. > > - DepexExpression =3D self._GetDepexExpresionString() > > - AsBuiltInfDict['depexsection_item'] =3D DepexExpression if De= pexExpression else '' > > - > > - AsBuiltInf =3D TemplateString() > > - AsBuiltInf.Append(gAsBuiltInfHeaderString.Replace(AsBuiltInfD= ict)) > > - > > - SaveFileOnChange(os.path.join(self.OutputDir, self.Name + '.i= nf'), str(AsBuiltInf), False) > > - > > - self.IsAsBuiltInfCreated =3D True > > - > > - def CopyModuleToCache(self): > > - FileDir =3D path.join(GlobalData.gBinCacheDest, self.Platform= Info.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.So= urceDir, self.MetaFile.BaseName) > > - CreateDirectory (FileDir) > > - HashFile =3D path.join(self.BuildDir, self.Name + '.hash') > > - if os.path.exists(HashFile): > > - CopyFileOnChange(HashFile, FileDir) > > - ModuleFile =3D path.join(self.OutputDir, self.Name + '.inf') > > - if os.path.exists(ModuleFile): > > - CopyFileOnChange(ModuleFile, FileDir) > > - > > - if not self.OutputFile: > > - Ma =3D self.BuildDatabase[self.MetaFile, self.Arch, self.= BuildTarget, self.ToolChain] > > - self.OutputFile =3D Ma.Binaries > > - > > - for File in self.OutputFile: > > - File =3D str(File) > > - if not os.path.isabs(File): > > - File =3D os.path.join(self.OutputDir, File) > > - if os.path.exists(File): > > - sub_dir =3D os.path.relpath(File, self.OutputDir) > > - destination_file =3D os.path.join(FileDir, sub_dir) > > - destination_dir =3D os.path.dirname(destination_file) > > - CreateDirectory(destination_dir) > > - CopyFileOnChange(File, destination_dir) > > - > > - def AttemptModuleCacheCopy(self): > > - # If library or Module is binary do not skip by hash > > - if self.IsBinaryModule: > > - return False > > - # .inc is contains binary information so do not skip by hash = as well > > - for f_ext in self.SourceFileList: > > - if '.inc' in str(f_ext): > > - return False > > - FileDir =3D path.join(GlobalData.gBinCacheSource, self.Platfo= rmInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.= SourceDir, self.MetaFile.BaseName) > > - HashFile =3D path.join(FileDir, self.Name + '.hash') > > - if os.path.exists(HashFile): > > - f =3D open(HashFile, 'r') > > - CacheHash =3D f.read() > > - f.close() > > - self.GenModuleHash() > > - if GlobalData.gModuleHash[self.Arch][self.Name]: > > - if CacheHash =3D=3D GlobalData.gModuleHash[self.Arch]= [self.Name]: > > - for root, 
dir, files in os.walk(FileDir): > > - for f in files: > > - if self.Name + '.hash' in f: > > - CopyFileOnChange(HashFile, self.Build= Dir) > > - else: > > - File =3D path.join(root, f) > > - sub_dir =3D os.path.relpath(File, Fil= eDir) > > - destination_file =3D os.path.join(sel= f.OutputDir, sub_dir) > > - destination_dir =3D os.path.dirname(d= estination_file) > > - CreateDirectory(destination_dir) > > - CopyFileOnChange(File, destination_di= r) > > - if self.Name =3D=3D "PcdPeim" or self.Name =3D=3D= "PcdDxe": > > - CreatePcdDatabaseCode(self, TemplateString(),= TemplateString()) > > - return True > > - return False > > - > > - ## Create makefile for the module and its dependent libraries > > - # > > - # @param CreateLibraryMakeFile Flag indicating if or not= the makefiles of > > - # dependent libraries will = be created > > - # > > - @cached_class_function > > - def CreateMakeFile(self, CreateLibraryMakeFile=3DTrue, GenFfsList= =3D []): > > - # nest this function inside its only caller. > > - def CreateTimeStamp(): > > - FileSet =3D {self.MetaFile.Path} > > - > > - for SourceFile in self.Module.Sources: > > - FileSet.add (SourceFile.Path) > > - > > - for Lib in self.DependentLibraryList: > > - FileSet.add (Lib.MetaFile.Path) > > - > > - for f in self.AutoGenDepSet: > > - FileSet.add (f.Path) > > - > > - if os.path.exists (self.TimeStampPath): > > - os.remove (self.TimeStampPath) > > - with open(self.TimeStampPath, 'w+') as file: > > - for f in FileSet: > > - print(f, file=3Dfile) > > - > > - # Ignore generating makefile when it is a binary module > > - if self.IsBinaryModule: > > - return > > - > > - self.GenFfsList =3D GenFfsList > > - if not self.IsLibrary and CreateLibraryMakeFile: > > - for LibraryAutoGen in self.LibraryAutoGenList: > > - LibraryAutoGen.CreateMakeFile() > > - > > - # Don't enable if hash feature enabled, CanSkip uses timestam= ps to determine build skipping > > - if not GlobalData.gUseHashCache and self.CanSkip(): > > - return > > - > > - if len(self.CustomMakefile) =3D=3D 0: > > - Makefile =3D GenMake.ModuleMakefile(self) > > - else: > > - Makefile =3D GenMake.CustomMakefile(self) > > - if Makefile.Generate(): > > - EdkLogger.debug(EdkLogger.DEBUG_9, "Generated makefile fo= r module %s [%s]" % > > - (self.Name, self.Arch)) > > - else: > > - EdkLogger.debug(EdkLogger.DEBUG_9, "Skipped the generatio= n of makefile for module %s [%s]" % > > - (self.Name, self.Arch)) > > - > > - CreateTimeStamp() > > - > > - def CopyBinaryFiles(self): > > - for File in self.Module.Binaries: > > - SrcPath =3D File.Path > > - DstPath =3D os.path.join(self.OutputDir, os.path.basename= (SrcPath)) > > - CopyLongFilePath(SrcPath, DstPath) > > - ## Create autogen code for the module and its dependent libraries > > - # > > - # @param CreateLibraryCodeFile Flag indicating if or not= the code of > > - # dependent libraries will = be created > > - # > > - def CreateCodeFile(self, CreateLibraryCodeFile=3DTrue): > > - if self.IsCodeFileCreated: > > - return > > - > > - # Need to generate PcdDatabase even PcdDriver is binarymodule > > - if self.IsBinaryModule and self.PcdIsDriver !=3D '': > > - CreatePcdDatabaseCode(self, TemplateString(), TemplateStr= ing()) > > - return > > - if self.IsBinaryModule: > > - if self.IsLibrary: > > - self.CopyBinaryFiles() > > - return > > - > > - if not self.IsLibrary and CreateLibraryCodeFile: > > - for LibraryAutoGen in self.LibraryAutoGenList: > > - LibraryAutoGen.CreateCodeFile() > > - > > - # Don't enable if hash feature enabled, CanSkip uses timestam= ps to 
determine build skipping > > - if not GlobalData.gUseHashCache and self.CanSkip(): > > - return > > - > > - AutoGenList =3D [] > > - IgoredAutoGenList =3D [] > > - > > - for File in self.AutoGenFileList: > > - if GenC.Generate(File.Path, self.AutoGenFileList[File], F= ile.IsBinary): > > - AutoGenList.append(str(File)) > > - else: > > - IgoredAutoGenList.append(str(File)) > > - > > - > > - for ModuleType in self.DepexList: > > - # Ignore empty [depex] section or [depex] section for SUP= _MODULE_USER_DEFINED module > > - if len(self.DepexList[ModuleType]) =3D=3D 0 or ModuleType= =3D=3D SUP_MODULE_USER_DEFINED or ModuleType =3D=3D SUP_MODULE_HOST_APPLIC= ATION: > > - continue > > - > > - Dpx =3D GenDepex.DependencyExpression(self.DepexList[Modu= leType], ModuleType, True) > > - DpxFile =3D gAutoGenDepexFileName % {"module_name" : self= .Name} > > - > > - if len(Dpx.PostfixNotation) !=3D 0: > > - self.DepexGenerated =3D True > > - > > - if Dpx.Generate(path.join(self.OutputDir, DpxFile)): > > - AutoGenList.append(str(DpxFile)) > > - else: > > - IgoredAutoGenList.append(str(DpxFile)) > > - > > - if IgoredAutoGenList =3D=3D []: > > - EdkLogger.debug(EdkLogger.DEBUG_9, "Generated [%s] files = for module %s [%s]" % > > - (" ".join(AutoGenList), self.Name, self.A= rch)) > > - elif AutoGenList =3D=3D []: > > - EdkLogger.debug(EdkLogger.DEBUG_9, "Skipped the generatio= n of [%s] files for module %s [%s]" % > > - (" ".join(IgoredAutoGenList), self.Name, = self.Arch)) > > - else: > > - EdkLogger.debug(EdkLogger.DEBUG_9, "Generated [%s] (skipp= ed %s) files for module %s [%s]" % > > - (" ".join(AutoGenList), " ".join(IgoredAu= toGenList), self.Name, self.Arch)) > > - > > - self.IsCodeFileCreated =3D True > > - return AutoGenList > > - > > - ## Summarize the ModuleAutoGen objects of all libraries used by t= his module > > - @cached_property > > - def LibraryAutoGenList(self): > > - RetVal =3D [] > > - for Library in self.DependentLibraryList: > > - La =3D ModuleAutoGen( > > - self.Workspace, > > - Library.MetaFile, > > - self.BuildTarget, > > - self.ToolChain, > > - self.Arch, > > - self.PlatformInfo.MetaFile > > - ) > > - if La not in RetVal: > > - RetVal.append(La) > > - for Lib in La.CodaTargetList: > > - self._ApplyBuildRule(Lib.Target, TAB_UNKNOWN_FILE= ) > > - return RetVal > > - > > - def GenModuleHash(self): > > - # Initialize a dictionary for each arch type > > - if self.Arch not in GlobalData.gModuleHash: > > - GlobalData.gModuleHash[self.Arch] =3D {} > > - > > - # Early exit if module or library has been hashed and is in m= emory > > - if self.Name in GlobalData.gModuleHash[self.Arch]: > > - return GlobalData.gModuleHash[self.Arch][self.Name].encod= e('utf-8') > > - > > - # Initialze hash object > > - m =3D hashlib.md5() > > - > > - # Add Platform level hash > > - m.update(GlobalData.gPlatformHash.encode('utf-8')) > > - > > - # Add Package level hash > > - if self.DependentPackageList: > > - for Pkg in sorted(self.DependentPackageList, key=3Dlambda= x: x.PackageName): > > - if Pkg.PackageName in GlobalData.gPackageHash: > > - m.update(GlobalData.gPackageHash[Pkg.PackageName]= .encode('utf-8')) > > - > > - # Add Library hash > > - if self.LibraryAutoGenList: > > - for Lib in sorted(self.LibraryAutoGenList, key=3Dlambda x= : x.Name): > > - if Lib.Name not in GlobalData.gModuleHash[self.Arch]: > > - Lib.GenModuleHash() > > - m.update(GlobalData.gModuleHash[self.Arch][Lib.Name].= encode('utf-8')) > > - > > - # Add Module self > > - f =3D open(str(self.MetaFile), 'rb') > > - Content =3D f.read() > 
> - f.close() > > - m.update(Content) > > - > > - # Add Module's source files > > - if self.SourceFileList: > > - for File in sorted(self.SourceFileList, key=3Dlambda x: s= tr(x)): > > - f =3D open(str(File), 'rb') > > - Content =3D f.read() > > - f.close() > > - m.update(Content) > > - > > - GlobalData.gModuleHash[self.Arch][self.Name] =3D m.hexdigest(= ) > > - > > - return GlobalData.gModuleHash[self.Arch][self.Name].encode('u= tf-8') > > - > > - ## Decide whether we can skip the ModuleAutoGen process > > - def CanSkipbyHash(self): > > - # Hashing feature is off > > - if not GlobalData.gUseHashCache: > > - return False > > - > > - # Initialize a dictionary for each arch type > > - if self.Arch not in GlobalData.gBuildHashSkipTracking: > > - GlobalData.gBuildHashSkipTracking[self.Arch] =3D dict() > > - > > - # If library or Module is binary do not skip by hash > > - if self.IsBinaryModule: > > - return False > > - > > - # .inc is contains binary information so do not skip by hash = as well > > - for f_ext in self.SourceFileList: > > - if '.inc' in str(f_ext): > > - return False > > - > > - # Use Cache, if exists and if Module has a copy in cache > > - if GlobalData.gBinCacheSource and self.AttemptModuleCacheCopy= (): > > - return True > > - > > - # Early exit for libraries that haven't yet finished building > > - HashFile =3D path.join(self.BuildDir, self.Name + ".hash") > > - if self.IsLibrary and not os.path.exists(HashFile): > > - return False > > - > > - # Return a Boolean based on if can skip by hash, either from = memory or from IO. > > - if self.Name not in GlobalData.gBuildHashSkipTracking[self.Ar= ch]: > > - # If hashes are the same, SaveFileOnChange() will return = False. > > - GlobalData.gBuildHashSkipTracking[self.Arch][self.Name] = =3D not SaveFileOnChange(HashFile, self.GenModuleHash(), True) > > - return GlobalData.gBuildHashSkipTracking[self.Arch][self.= Name] > > - else: > > - return GlobalData.gBuildHashSkipTracking[self.Arch][self.= Name] > > - > > - ## Decide whether we can skip the ModuleAutoGen process > > - # If any source file is newer than the module than we cannot ski= p > > - # > > - def CanSkip(self): > > - if self.MakeFileDir in GlobalData.gSikpAutoGenCache: > > - return True > > - if not os.path.exists(self.TimeStampPath): > > - return False > > - #last creation time of the module > > - DstTimeStamp =3D os.stat(self.TimeStampPath)[8] > > - > > - SrcTimeStamp =3D self.Workspace._SrcTimeStamp > > - if SrcTimeStamp > DstTimeStamp: > > - return False > > - > > - with open(self.TimeStampPath,'r') as f: > > - for source in f: > > - source =3D source.rstrip('\n') > > - if not os.path.exists(source): > > - return False > > - if source not in ModuleAutoGen.TimeDict : > > - ModuleAutoGen.TimeDict[source] =3D os.stat(source= )[8] > > - if ModuleAutoGen.TimeDict[source] > DstTimeStamp: > > - return False > > - GlobalData.gSikpAutoGenCache.add(self.MakeFileDir) > > - return True > > - > > - @cached_property > > - def TimeStampPath(self): > > - return os.path.join(self.MakeFileDir, 'AutoGenTimeStamp') > > + @classmethod > > + def Cache(cls): > > + return cls.__ObjectCache > > + > > +# > > +# The priority list while override build option > > +# > > +PrioList =3D {"0x11111" : 16, # TARGET_TOOLCHAIN_ARCH_COMMANDTY= PE_ATTRIBUTE (Highest) > > + "0x01111" : 15, # ******_TOOLCHAIN_ARCH_COMMANDTYPE= _ATTRIBUTE > > + "0x10111" : 14, # TARGET_*********_ARCH_COMMANDTYPE= _ATTRIBUTE > > + "0x00111" : 13, # ******_*********_ARCH_COMMANDTYPE= _ATTRIBUTE > > + "0x11011" : 12, # 
TARGET_TOOLCHAIN_****_COMMANDTYPE= _ATTRIBUTE > > + "0x01011" : 11, # ******_TOOLCHAIN_****_COMMANDTYPE= _ATTRIBUTE > > + "0x10011" : 10, # TARGET_*********_****_COMMANDTYPE= _ATTRIBUTE > > + "0x00011" : 9, # ******_*********_****_COMMANDTYPE= _ATTRIBUTE > > + "0x11101" : 8, # TARGET_TOOLCHAIN_ARCH_***********= _ATTRIBUTE > > + "0x01101" : 7, # ******_TOOLCHAIN_ARCH_***********= _ATTRIBUTE > > + "0x10101" : 6, # TARGET_*********_ARCH_***********= _ATTRIBUTE > > + "0x00101" : 5, # ******_*********_ARCH_***********= _ATTRIBUTE > > + "0x11001" : 4, # TARGET_TOOLCHAIN_****_***********= _ATTRIBUTE > > + "0x01001" : 3, # ******_TOOLCHAIN_****_***********= _ATTRIBUTE > > + "0x10001" : 2, # TARGET_*********_****_***********= _ATTRIBUTE > > + "0x00001" : 1} # ******_*********_****_***********= _ATTRIBUTE (Lowest) > > +## Calculate the priority value of the build option > > +# > > +# @param Key Build option definition contain: TARGET_TOOLCHAIN_= ARCH_COMMANDTYPE_ATTRIBUTE > > +# > > +# @retval Value Priority value based on the priority list. > > +# > > +def CalculatePriorityValue(Key): > > + Target, ToolChain, Arch, CommandType, Attr =3D Key.split('_') > > + PriorityValue =3D 0x11111 > > + if Target =3D=3D TAB_STAR: > > + PriorityValue &=3D 0x01111 > > + if ToolChain =3D=3D TAB_STAR: > > + PriorityValue &=3D 0x10111 > > + if Arch =3D=3D TAB_STAR: > > + PriorityValue &=3D 0x11011 > > + if CommandType =3D=3D TAB_STAR: > > + PriorityValue &=3D 0x11101 > > + if Attr =3D=3D TAB_STAR: > > + PriorityValue &=3D 0x11110 > > + > > + return PrioList["0x%0.5x" % PriorityValue] > > diff --git a/BaseTools/Source/Python/AutoGen/DataPipe.py b/BaseTools/S= ource/Python/AutoGen/DataPipe.py > > new file mode 100644 > > index 000000000000..5bcc39bd380d > > --- /dev/null > > +++ b/BaseTools/Source/Python/AutoGen/DataPipe.py > > @@ -0,0 +1,147 @@ > > +## @file > > +# Create makefile for MS nmake and GNU make > > +# > > +# Copyright (c) 2019, Intel Corporation. All rights reserved.
> > +# SPDX-License-Identifier: BSD-2-Clause-Patent > > +# > > +from __future__ import absolute_import > > +from Workspace.WorkspaceDatabase import BuildDB > > +from Workspace.WorkspaceCommon import GetModuleLibInstances > > +import Common.GlobalData as GlobalData > > +import os > > +import pickle > > +from pickle import HIGHEST_PROTOCOL > > + > > +class PCD_DATA(): > > + def __init__(self,TokenCName,TokenSpaceGuidCName,Type,DatumType,S= kuInfoList,DefaultValue, > > + MaxDatumSize,UserDefinedDefaultStoresFlag,validatera= nges, > > + validlists,expressions,CustomAttribute,TokenValue): > > + self.TokenCName =3D TokenCName > > + self.TokenSpaceGuidCName =3D TokenSpaceGuidCName > > + self.Type =3D Type > > + self.DatumType =3D DatumType > > + self.SkuInfoList =3D SkuInfoList > > + self.DefaultValue =3D DefaultValue > > + self.MaxDatumSize =3D MaxDatumSize > > + self.UserDefinedDefaultStoresFlag =3D UserDefinedDefaultStore= sFlag > > + self.validateranges =3D validateranges > > + self.validlists =3D validlists > > + self.expressions =3D expressions > > + self.CustomAttribute =3D CustomAttribute > > + self.TokenValue =3D TokenValue > > + > > +class DataPipe(object): > > + def __init__(self, BuildDir=3DNone): > > + self.data_container =3D {} > > + self.BuildDir =3D BuildDir > > + > > +class MemoryDataPipe(DataPipe): > > + > > + def Get(self,key): > > + return self.data_container.get(key) > > + > > + def dump(self,file_path): > > + with open(file_path,'wb') as fd: > > + pickle.dump(self.data_container,fd,pickle.HIGHEST_PROTOCO= L) > > + > > + def load(self,file_path): > > + with open(file_path,'rb') as fd: > > + self.data_container =3D pickle.load(fd) > > + > > + @property > > + def DataContainer(self): > > + return self.data_container > > + @DataContainer.setter > > + def DataContainer(self,data): > > + self.data_container.update(data) > > + > > + def FillData(self,PlatformInfo): > > + #Platform Pcds > > + self.DataContainer =3D { > > + "PLA_PCD" : [PCD_DATA( > > + pcd.TokenCName,pcd.TokenSpaceGuidCName,pcd.Type, > > + pcd.DatumType,pcd.SkuInfoList,pcd.DefaultValue, > > + pcd.MaxDatumSize,pcd.UserDefinedDefaultStoresFlag,pcd.val= idateranges, > > + pcd.validlists,pcd.expressions,pcd.CustomAttribute,p= cd.TokenValue) > > + for pcd in PlatformInfo.Platform.Pcds.values()] > > + } > > + > > + #Platform Module Pcds > > + ModulePcds =3D {} > > + for m in PlatformInfo.Platform.Modules: > > + m_pcds =3D PlatformInfo.Platform.Modules[m].Pcds > > + if m_pcds: > > + ModulePcds[(m.File,m.Root)] =3D [PCD_DATA( > > + pcd.TokenCName,pcd.TokenSpaceGuidCName,pcd.Type, > > + pcd.DatumType,pcd.SkuInfoList,pcd.DefaultValue, > > + pcd.MaxDatumSize,pcd.UserDefinedDefaultStoresFlag,pcd.val= idateranges, > > + pcd.validlists,pcd.expressions,pcd.CustomAttribute,p= cd.TokenValue) > > + for pcd in PlatformInfo.Platform.Modules[m].Pcds.values()= ] > > + > > + > > + self.DataContainer =3D {"MOL_PCDS":ModulePcds} > > + > > + #Module's Library Instance > > + ModuleLibs =3D {} > > + for m in PlatformInfo.Platform.Modules: > > + module_obj =3D BuildDB.BuildObject[m,PlatformInfo.Arch,Pl= atformInfo.BuildTarget,PlatformInfo.ToolChain] > > + Libs =3D GetModuleLibInstances(module_obj, PlatformInfo.P= latform, BuildDB.BuildObject, PlatformInfo.Arch,PlatformInfo.BuildTarget,Pl= atformInfo.ToolChain) > > + ModuleLibs[(m.File,m.Root,module_obj.Arch)] =3D [(l.MetaF= ile.File,l.MetaFile.Root,l.Arch) for l in Libs] > > + self.DataContainer =3D {"DEPS":ModuleLibs} > > + > > + #Platform BuildOptions > > + > > + platform_build_opt =3D 
PlatformInfo.EdkIIBuildOption > > + > > + ToolDefinition =3D PlatformInfo.ToolDefinition > > + module_build_opt =3D {} > > + for m in PlatformInfo.Platform.Modules: > > + ModuleTypeOptions, PlatformModuleOptions =3D PlatformInfo= .GetGlobalBuildOptions(BuildDB.BuildObject[m,PlatformInfo.Arch,PlatformInfo= .BuildTarget,PlatformInfo.ToolChain]) > > + if ModuleTypeOptions or PlatformModuleOptions: > > + module_build_opt.update({(m.File,m.Root): {"ModuleTyp= eOptions":ModuleTypeOptions, "PlatformModuleOptions":PlatformModuleOptions}= }) > > + > > + self.DataContainer =3D {"PLA_BO":platform_build_opt, > > + "TOOLDEF":ToolDefinition, > > + "MOL_BO":module_build_opt > > + } > > + > > + > > + > > + #Platform Info > > + PInfo =3D { > > + "WorkspaceDir":PlatformInfo.Workspace.WorkspaceDir, > > + "Target":PlatformInfo.BuildTarget, > > + "ToolChain":PlatformInfo.Workspace.ToolChain, > > + "BuildRuleFile":PlatformInfo.BuildRule, > > + "Arch": PlatformInfo.Arch, > > + "ArchList":PlatformInfo.Workspace.ArchList, > > + "ActivePlatform":PlatformInfo.MetaFile > > + } > > + self.DataContainer =3D {'P_Info':PInfo} > > + > > + self.DataContainer =3D {'M_Name':PlatformInfo.UniqueBaseName} > > + > > + self.DataContainer =3D {"ToolChainFamily": PlatformInfo.ToolC= hainFamily} > > + > > + self.DataContainer =3D {"BuildRuleFamily": PlatformInfo.Build= RuleFamily} > > + > > + self.DataContainer =3D {"MixedPcd":GlobalData.MixedPcd} > > + > > + self.DataContainer =3D {"BuildOptPcd":GlobalData.BuildOptionP= cd} > > + > > + self.DataContainer =3D {"BuildCommand": PlatformInfo.BuildCom= mand} > > + > > + self.DataContainer =3D {"AsBuildModuleList": PlatformInfo._As= BuildModuleList} > > + > > + self.DataContainer =3D {"G_defines": GlobalData.gGlobalDefine= s} > > + > > + self.DataContainer =3D {"CL_defines": GlobalData.gCommandLine= Defines} > > + > > + self.DataContainer =3D {"Env_Var": {k:v for k, v in os.enviro= n.items()}} > > + > > + self.DataContainer =3D {"PackageList": [(dec.MetaFile,dec.Arc= h) for dec in PlatformInfo.PackageList]} > > + > > + self.DataContainer =3D {"GuidDict": PlatformInfo.Platform._Gu= idDict} > > + > > + self.DataContainer =3D {"FdfParser": True if GlobalData.gFdfP= arser else False} > > + > > diff --git a/BaseTools/Source/Python/AutoGen/GenC.py b/BaseTools/Sourc= e/Python/AutoGen/GenC.py > > index 4cb776206e90..4c3f4e3e55ae 100644 > > --- a/BaseTools/Source/Python/AutoGen/GenC.py > > +++ b/BaseTools/Source/Python/AutoGen/GenC.py > > @@ -1627,11 +1627,11 @@ def CreatePcdCode(Info, AutoGenC, AutoGenH): > > TokenSpaceList =3D [] > > for Pcd in Info.ModulePcdList: > > if Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET and Pcd.TokenSpaceGuid= CName not in TokenSpaceList: > > TokenSpaceList.append(Pcd.TokenSpaceGuidCName) > > > > - SkuMgr =3D Info.Workspace.Platform.SkuIdMgr > > + SkuMgr =3D Info.PlatformInfo.Platform.SkuIdMgr > > AutoGenH.Append("\n// Definition of SkuId Array\n") > > AutoGenH.Append("extern UINT64 _gPcd_SkuId_Array[];\n") > > # Add extern declarations to AutoGen.h if one or more Token Space= GUIDs were found > > if TokenSpaceList: > > AutoGenH.Append("\n// Definition of PCD Token Space GUIDs use= d in this module\n\n") > > diff --git a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py b/BaseTo= ols/Source/Python/AutoGen/ModuleAutoGen.py > > new file mode 100644 > > index 000000000000..d19c03862094 > > --- /dev/null > > +++ b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py > > @@ -0,0 +1,1908 @@ > > +## @file > > +# Create makefile for MS nmake and GNU make > > +# > > +# Copyright (c) 2019, 
Intel Corporation. All rights reserved.
> > +# SPDX-License-Identifier: BSD-2-Clause-Patent > > +# > > +from __future__ import absolute_import > > +from AutoGen.AutoGen import AutoGen > > +from Common.LongFilePathSupport import CopyLongFilePath > > +from Common.BuildToolError import * > > +from Common.DataType import * > > +from Common.Misc import * > > +from Common.StringUtils import NormPath,GetSplitList > > +from collections import defaultdict > > +from Workspace.WorkspaceCommon import OrderedListDict > > +import os.path as path > > +import copy > > +import hashlib > > +from . import InfSectionParser > > +from . import GenC > > +from . import GenMake > > +from . import GenDepex > > +from io import BytesIO > > +from GenPatchPcdTable.GenPatchPcdTable import parsePcdInfoFromMapFile > > +from Workspace.MetaFileCommentParser import UsageList > > +from .GenPcdDb import CreatePcdDatabaseCode > > +from Common.caching import cached_class_function > > +from AutoGen.ModuleAutoGenHelper import PlatformInfo,WorkSpaceInfo > > + > > +## Mapping Makefile type > > +gMakeTypeMap =3D {TAB_COMPILER_MSFT:"nmake", "GCC":"gmake"} > > +# > > +# Regular expression for finding Include Directories, the difference = between MSFT and INTEL/GCC/RVCT > > +# is the former use /I , the Latter used -I to specify include direct= ories > > +# > > +gBuildOptIncludePatternMsft =3D re.compile(r"(?:.*?)/I[ \t]*([^ ]*)",= re.MULTILINE | re.DOTALL) > > +gBuildOptIncludePatternOther =3D re.compile(r"(?:.*?)-I[ \t]*([^ ]*)"= , re.MULTILINE | re.DOTALL) > > + > > +## default file name for AutoGen > > +gAutoGenCodeFileName =3D "AutoGen.c" > > +gAutoGenHeaderFileName =3D "AutoGen.h" > > +gAutoGenStringFileName =3D "%(module_name)sStrDefs.h" > > +gAutoGenStringFormFileName =3D "%(module_name)sStrDefs.hpk" > > +gAutoGenDepexFileName =3D "%(module_name)s.depex" > > +gAutoGenImageDefFileName =3D "%(module_name)sImgDefs.h" > > +gAutoGenIdfFileName =3D "%(module_name)sIdf.hpk" > > +gInfSpecVersion =3D "0x00010017" > > + > > +# > > +# Match name =3D variable > > +# > > +gEfiVarStoreNamePattern =3D re.compile("\s*name\s*=3D\s*(\w+)") > > +# > > +# The format of guid in efivarstore statement likes following and mus= t be correct: > > +# guid =3D {0xA04A27f4, 0xDF00, 0x4D42, {0xB5, 0x52, 0x39, 0x51, 0x13= , 0x02, 0x11, 0x3D}} > > +# > > +gEfiVarStoreGuidPattern =3D re.compile("\s*guid\s*=3D\s*({.*?{.*?}\s*= })") > > + > > +# > > +# Template string to generic AsBuilt INF > > +# > > +gAsBuiltInfHeaderString =3D TemplateString("""${header_comments} > > + > > +# DO NOT EDIT > > +# FILE auto-generated > > + > > +[Defines] > > + INF_VERSION =3D ${module_inf_version} > > + BASE_NAME =3D ${module_name} > > + FILE_GUID =3D ${module_guid} > > + MODULE_TYPE =3D ${module_module_type}${BEGIN} > > + VERSION_STRING =3D ${module_version_string}${END}${BEGI= N} > > + PCD_IS_DRIVER =3D ${pcd_is_driver_string}${END}${BEGIN= } > > + UEFI_SPECIFICATION_VERSION =3D ${module_uefi_specification_version}= ${END}${BEGIN} > > + PI_SPECIFICATION_VERSION =3D ${module_pi_specification_version}${= END}${BEGIN} > > + ENTRY_POINT =3D ${module_entry_point}${END}${BEGIN} > > + UNLOAD_IMAGE =3D ${module_unload_image}${END}${BEGIN} > > + CONSTRUCTOR =3D ${module_constructor}${END}${BEGIN} > > + DESTRUCTOR =3D ${module_destructor}${END}${BEGIN} > > + SHADOW =3D ${module_shadow}${END}${BEGIN} > > + PCI_VENDOR_ID =3D ${module_pci_vendor_id}${END}${BEGIN= } > > + PCI_DEVICE_ID =3D ${module_pci_device_id}${END}${BEGIN= } > > + PCI_CLASS_CODE =3D ${module_pci_class_code}${END}${BEGI= N} > > + PCI_REVISION =3D 
${module_pci_revision}${END}${BEGIN} > > + BUILD_NUMBER =3D ${module_build_number}${END}${BEGIN} > > + SPEC =3D ${module_spec}${END}${BEGIN} > > + UEFI_HII_RESOURCE_SECTION =3D ${module_uefi_hii_resource_section}$= {END}${BEGIN} > > + MODULE_UNI_FILE =3D ${module_uni_file}${END} > > + > > +[Packages.${module_arch}]${BEGIN} > > + ${package_item}${END} > > + > > +[Binaries.${module_arch}]${BEGIN} > > + ${binary_item}${END} > > + > > +[PatchPcd.${module_arch}]${BEGIN} > > + ${patchablepcd_item} > > +${END} > > + > > +[Protocols.${module_arch}]${BEGIN} > > + ${protocol_item} > > +${END} > > + > > +[Ppis.${module_arch}]${BEGIN} > > + ${ppi_item} > > +${END} > > + > > +[Guids.${module_arch}]${BEGIN} > > + ${guid_item} > > +${END} > > + > > +[PcdEx.${module_arch}]${BEGIN} > > + ${pcd_item} > > +${END} > > + > > +[LibraryClasses.${module_arch}] > > +## @LIB_INSTANCES${BEGIN} > > +# ${libraryclasses_item}${END} > > + > > +${depexsection_item} > > + > > +${userextension_tianocore_item} > > + > > +${tail_comments} > > + > > +[BuildOptions.${module_arch}] > > +## @AsBuilt${BEGIN} > > +## ${flags_item}${END} > > +""") > > +# > > +# extend lists contained in a dictionary with lists stored in another= dictionary > > +# if CopyToDict is not derived from DefaultDict(list) then this may r= aise exception > > +# > > +def ExtendCopyDictionaryLists(CopyToDict, CopyFromDict): > > + for Key in CopyFromDict: > > + CopyToDict[Key].extend(CopyFromDict[Key]) > > + > > +# Create a directory specified by a set of path elements and return t= he full path > > +def _MakeDir(PathList): > > + RetVal =3D path.join(*PathList) > > + CreateDirectory(RetVal) > > + return RetVal > > + > > +# > > +# Convert string to C format array > > +# > > +def _ConvertStringToByteArray(Value): > > + Value =3D Value.strip() > > + if not Value: > > + return None > > + if Value[0] =3D=3D '{': > > + if not Value.endswith('}'): > > + return None > > + Value =3D Value.replace(' ', '').replace('{', '').replace('}'= , '') > > + ValFields =3D Value.split(',') > > + try: > > + for Index in range(len(ValFields)): > > + ValFields[Index] =3D str(int(ValFields[Index], 0)) > > + except ValueError: > > + return None > > + Value =3D '{' + ','.join(ValFields) + '}' > > + return Value > > + > > + Unicode =3D False > > + if Value.startswith('L"'): > > + if not Value.endswith('"'): > > + return None > > + Value =3D Value[1:] > > + Unicode =3D True > > + elif not Value.startswith('"') or not Value.endswith('"'): > > + return None > > + > > + Value =3D eval(Value) # translate escape character > > + NewValue =3D '{' > > + for Index in range(0, len(Value)): > > + if Unicode: > > + NewValue =3D NewValue + str(ord(Value[Index]) % 0x10000) = + ',' > > + else: > > + NewValue =3D NewValue + str(ord(Value[Index]) % 0x100) + = ',' > > + Value =3D NewValue + '0}' > > + return Value > > + > > +## ModuleAutoGen class > > +# > > +# This class encapsules the AutoGen behaviors for the build tools. In= addition to > > +# the generation of AutoGen.h and AutoGen.c, it will generate *.depex= file according > > +# to the [depex] section in module's inf file. 
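
Not part of the patch, just an aside: as I read _ConvertStringToByteArray above, for quoted strings it emits one decimal code point per character plus a trailing 0 (the real helper also handles escape sequences via eval, L"..." wide strings and {...} literals). A simplified stand-alone sketch of the plain-string case, with a made-up function name:

    def ascii_string_to_c_array(value):
        # Expects a C-style quoted string such as '"abc"'; escapes are not handled here.
        assert value.startswith('"') and value.endswith('"')
        return '{' + ','.join(str(ord(ch)) for ch in value[1:-1]) + ',0}'

    print(ascii_string_to_c_array('"abc"'))   # prints {97,98,99,0}
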
> > +# > > +class ModuleAutoGen(AutoGen): > > + # call super().__init__ then call the worker function with differ= ent parameter count > > + def __init__(self, Workspace, MetaFile, Target, Toolchain, Arch, = *args, **kwargs): > > + if not hasattr(self, "_Init"): > > + self._InitWorker(Workspace, MetaFile, Target, Toolchain, = Arch, *args) > > + self._Init =3D True > > + > > + ## Cache the timestamps of metafiles of every module in a class a= ttribute > > + # > > + TimeDict =3D {} > > + > > + def __new__(cls, Workspace, MetaFile, Target, Toolchain, Arch, *a= rgs, **kwargs): > > +# check if this module is employed by active platform > > + if not PlatformInfo(Workspace, args[0], Target, Toolchain, Ar= ch,args[-1]).ValidModule(MetaFile): > > + EdkLogger.verbose("Module [%s] for [%s] is not employed b= y active platform\n" \ > > + % (MetaFile, Arch)) > > + return None > > + return super(ModuleAutoGen, cls).__new__(cls, Workspace, Meta= File, Target, Toolchain, Arch, *args, **kwargs) > > + > > + ## Initialize ModuleAutoGen > > + # > > + # @param Workspace EdkIIWorkspaceBuild object > > + # @param ModuleFile The path of module file > > + # @param Target Build target (DEBUG, RELEASE) > > + # @param Toolchain Name of tool chain > > + # @param Arch The arch the module supports > > + # @param PlatformFile Platform meta-file > > + # > > + def _InitWorker(self, Workspace, ModuleFile, Target, Toolchain, A= rch, PlatformFile,DataPipe): > > + EdkLogger.debug(EdkLogger.DEBUG_9, "AutoGen module [%s] [%s]"= % (ModuleFile, Arch)) > > + GlobalData.gProcessingFile =3D "%s [%s, %s, %s]" % (ModuleFil= e, Arch, Toolchain, Target) > > + > > + self.Workspace =3D None > > + self.WorkspaceDir =3D "" > > + self.PlatformInfo =3D None > > + self.DataPipe =3D DataPipe > > + self.__init_platform_info__() > > + self.MetaFile =3D ModuleFile > > + self.SourceDir =3D self.MetaFile.SubDir > > + self.SourceDir =3D mws.relpath(self.SourceDir, self.Workspace= Dir) > > + > > + self.ToolChain =3D Toolchain > > + self.BuildTarget =3D Target > > + self.Arch =3D Arch > > + self.ToolChainFamily =3D self.PlatformInfo.ToolChainFamily > > + self.BuildRuleFamily =3D self.PlatformInfo.BuildRuleFamily > > + > > + self.IsCodeFileCreated =3D False > > + self.IsAsBuiltInfCreated =3D False > > + self.DepexGenerated =3D False > > + > > + self.BuildDatabase =3D self.Workspace.BuildDatabase > > + self.BuildRuleOrder =3D None > > + self.BuildTime =3D 0 > > + > > + self._GuidComments =3D OrderedListDict() > > + self._ProtocolComments =3D OrderedListDict() > > + self._PpiComments =3D OrderedListDict() > > + self._BuildTargets =3D None > > + self._IntroBuildTargetList =3D None > > + self._FinalBuildTargetList =3D None > > + self._FileTypes =3D None > > + > > + self.AutoGenDepSet =3D set() > > + self.ReferenceModules =3D [] > > + self.ConstPcd =3D {} > > + > > + def __init_platform_info__(self): > > + pinfo =3D self.DataPipe.Get("P_Info") > > + self.Workspace =3D WorkSpaceInfo(pinfo.get("WorkspaceDir"),pi= nfo.get("ActivePlatform"),pinfo.get("Target"),pinfo.get("ToolChain"),pinfo.= get("ArchList")) > > + self.WorkspaceDir =3D pinfo.get("WorkspaceDir") > > + self.PlatformInfo =3D PlatformInfo(self.Workspace,pinfo.get("= ActivePlatform"),pinfo.get("Target"),pinfo.get("ToolChain"),pinfo.get("Arch= "),self.DataPipe) > > + ## hash() operator of ModuleAutoGen > > + # > > + # The module file path and arch string will be used to represent > > + # hash value of this object > > + # > > + # @retval int Hash value of the module file path and arch > > + # > > + 
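
One Python detail in the __new__ above that is easy to miss: when __new__ returns None rather than an instance, __init__ is never run and the constructor expression itself evaluates to None, which is how callers can tell that a module is not employed by the active platform. A minimal sketch of the idiom (illustrative class, not from the patch):

    class Widget:
        def __new__(cls, enabled):
            if not enabled:
                return None              # caller sees None, __init__ is skipped
            return super().__new__(cls)

        def __init__(self, enabled):
            self.enabled = enabled

    assert Widget(False) is None
    assert isinstance(Widget(True), Widget)
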
@cached_class_function > > + def __hash__(self): > > + return hash((self.MetaFile, self.Arch)) > > + def __repr__(self): > > + return "%s [%s]" % (self.MetaFile, self.Arch) > > + > > + # Get FixedAtBuild Pcds of this Module > > + @cached_property > > + def FixedAtBuildPcds(self): > > + RetVal =3D [] > > + for Pcd in self.ModulePcdList: > > + if Pcd.Type !=3D TAB_PCDS_FIXED_AT_BUILD: > > + continue > > + if Pcd not in RetVal: > > + RetVal.append(Pcd) > > + return RetVal > > + > > + @cached_property > > + def FixedVoidTypePcds(self): > > + RetVal =3D {} > > + for Pcd in self.FixedAtBuildPcds: > > + if Pcd.DatumType =3D=3D TAB_VOID: > > + if '.'.join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName)= ) not in RetVal: > > + RetVal['.'.join((Pcd.TokenSpaceGuidCName, Pcd.Tok= enCName))] =3D Pcd.DefaultValue > > + return RetVal > > + > > + @property > > + def UniqueBaseName(self): > > + ModuleNames =3D self.DataPipe.Get("M_Name") > > + if not ModuleNames: > > + return self.Name > > + return ModuleNames.get(self.Name,self.Name) > > + > > + # Macros could be used in build_rule.txt (also Makefile) > > + @cached_property > > + def Macros(self): > > + return OrderedDict(( > > + ("WORKSPACE" ,self.WorkspaceDir), > > + ("MODULE_NAME" ,self.Name), > > + ("MODULE_NAME_GUID" ,self.UniqueBaseName), > > + ("MODULE_GUID" ,self.Guid), > > + ("MODULE_VERSION" ,self.Version), > > + ("MODULE_TYPE" ,self.ModuleType), > > + ("MODULE_FILE" ,str(self.MetaFile)), > > + ("MODULE_FILE_BASE_NAME" ,self.MetaFile.BaseName), > > + ("MODULE_RELATIVE_DIR" ,self.SourceDir), > > + ("MODULE_DIR" ,self.SourceDir), > > + ("BASE_NAME" ,self.Name), > > + ("ARCH" ,self.Arch), > > + ("TOOLCHAIN" ,self.ToolChain), > > + ("TOOLCHAIN_TAG" ,self.ToolChain), > > + ("TOOL_CHAIN_TAG" ,self.ToolChain), > > + ("TARGET" ,self.BuildTarget), > > + ("BUILD_DIR" ,self.PlatformInfo.BuildDir), > > + ("BIN_DIR" ,os.path.join(self.PlatformInfo.BuildDir, self= .Arch)), > > + ("LIB_DIR" ,os.path.join(self.PlatformInfo.BuildDir, self= .Arch)), > > + ("MODULE_BUILD_DIR" ,self.BuildDir), > > + ("OUTPUT_DIR" ,self.OutputDir), > > + ("DEBUG_DIR" ,self.DebugDir), > > + ("DEST_DIR_OUTPUT" ,self.OutputDir), > > + ("DEST_DIR_DEBUG" ,self.DebugDir), > > + ("PLATFORM_NAME" ,self.PlatformInfo.Name), > > + ("PLATFORM_GUID" ,self.PlatformInfo.Guid), > > + ("PLATFORM_VERSION" ,self.PlatformInfo.Version), > > + ("PLATFORM_RELATIVE_DIR" ,self.PlatformInfo.SourceDir), > > + ("PLATFORM_DIR" ,mws.join(self.WorkspaceDir, self.Platfor= mInfo.SourceDir)), > > + ("PLATFORM_OUTPUT_DIR" ,self.PlatformInfo.OutputDir), > > + ("FFS_OUTPUT_DIR" ,self.FfsOutputDir) > > + )) > > + > > + ## Return the module build data object > > + @cached_property > > + def Module(self): > > + return self.BuildDatabase[self.MetaFile, self.Arch, self.Buil= dTarget, self.ToolChain] > > + > > + ## Return the module name > > + @cached_property > > + def Name(self): > > + return self.Module.BaseName > > + > > + ## Return the module DxsFile if exist > > + @cached_property > > + def DxsFile(self): > > + return self.Module.DxsFile > > + > > + ## Return the module meta-file GUID > > + @cached_property > > + def Guid(self): > > + # > > + # To build same module more than once, the module path with F= ILE_GUID overridden has > > + # the file name FILE_GUIDmodule.inf, but the relative path (s= elf.MetaFile.File) is the real path > > + # in DSC. 
The overridden GUID can be retrieved from file name > > + # > > + if os.path.basename(self.MetaFile.File) !=3D os.path.basename= (self.MetaFile.Path): > > + # > > + # Length of GUID is 36 > > + # > > + return os.path.basename(self.MetaFile.Path)[:36] > > + return self.Module.Guid > > + > > + ## Return the module version > > + @cached_property > > + def Version(self): > > + return self.Module.Version > > + > > + ## Return the module type > > + @cached_property > > + def ModuleType(self): > > + return self.Module.ModuleType > > + > > + ## Return the component type (for Edk.x style of module) > > + @cached_property > > + def ComponentType(self): > > + return self.Module.ComponentType > > + > > + ## Return the build type > > + @cached_property > > + def BuildType(self): > > + return self.Module.BuildType > > + > > + ## Return the PCD_IS_DRIVER setting > > + @cached_property > > + def PcdIsDriver(self): > > + return self.Module.PcdIsDriver > > + > > + ## Return the autogen version, i.e. module meta-file version > > + @cached_property > > + def AutoGenVersion(self): > > + return self.Module.AutoGenVersion > > + > > + ## Check if the module is library or not > > + @cached_property > > + def IsLibrary(self): > > + return bool(self.Module.LibraryClass) > > + > > + ## Check if the module is binary module or not > > + @cached_property > > + def IsBinaryModule(self): > > + return self.Module.IsBinaryModule > > + > > + ## Return the directory to store intermediate files of the module > > + @cached_property > > + def BuildDir(self): > > + return _MakeDir(( > > + self.PlatformInfo.BuildDir, > > + self.Arch, > > + self.SourceDir, > > + self.MetaFile.BaseName > > + )) > > + > > + ## Return the directory to store the intermediate object files of= the module > > + @cached_property > > + def OutputDir(self): > > + return _MakeDir((self.BuildDir, "OUTPUT")) > > + > > + ## Return the directory path to store ffs file > > + @cached_property > > + def FfsOutputDir(self): > > + if GlobalData.gFdfParser: > > + return path.join(self.PlatformInfo.BuildDir, TAB_FV_DIREC= TORY, "Ffs", self.Guid + self.Name) > > + return '' > > + > > + ## Return the directory to store auto-gened source files of the m= odule > > + @cached_property > > + def DebugDir(self): > > + return _MakeDir((self.BuildDir, "DEBUG")) > > + > > + ## Return the path of custom file > > + @cached_property > > + def CustomMakefile(self): > > + RetVal =3D {} > > + for Type in self.Module.CustomMakefile: > > + MakeType =3D gMakeTypeMap[Type] if Type in gMakeTypeMap e= lse 'nmake' > > + File =3D os.path.join(self.SourceDir, self.Module.CustomM= akefile[Type]) > > + RetVal[MakeType] =3D File > > + return RetVal > > + > > + ## Return the directory of the makefile > > + # > > + # @retval string The directory string of module's makefile > > + # > > + @cached_property > > + def MakeFileDir(self): > > + return self.BuildDir > > + > > + ## Return build command string > > + # > > + # @retval string Build command string > > + # > > + @cached_property > > + def BuildCommand(self): > > + return self.PlatformInfo.BuildCommand > > + > > + ## Get object list of all packages the module and its dependent l= ibraries belong to > > + # > > + # @retval list The list of package object > > + # > > + @cached_property > > + def DerivedPackageList(self): > > + PackageList =3D [] > > + for M in [self.Module] + self.DependentLibraryList: > > + for Package in M.Packages: > > + if Package in PackageList: > > + continue > > + PackageList.append(Package) > > + return PackageList > > + > > 
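
On the Guid property quoted a bit earlier: when a DSC overrides FILE_GUID, the module is built from a copy named <FILE_GUID>module.inf, so the property recovers the override by taking the first 36 characters (the length of a GUID string) of the base name. A quick illustration with a made-up path:

    import os

    # Hypothetical path of a module built with a FILE_GUID override.
    MetaFilePath = "Build/Overrides/11111111-2222-3333-4444-555555555555module.inf"
    print(os.path.basename(MetaFilePath)[:36])
    # -> 11111111-2222-3333-4444-555555555555
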
+ ## Get the depex string > > + # > > + # @return : a string contain all depex expression. > > + def _GetDepexExpresionString(self): > > + DepexStr =3D '' > > + DepexList =3D [] > > + ## DPX_SOURCE IN Define section. > > + if self.Module.DxsFile: > > + return DepexStr > > + for M in [self.Module] + self.DependentLibraryList: > > + Filename =3D M.MetaFile.Path > > + InfObj =3D InfSectionParser.InfSectionParser(Filename) > > + DepexExpressionList =3D InfObj.GetDepexExpresionList() > > + for DepexExpression in DepexExpressionList: > > + for key in DepexExpression: > > + Arch, ModuleType =3D key > > + DepexExpr =3D [x for x in DepexExpression[key] if= not str(x).startswith('#')] > > + # the type of build module is USER_DEFINED. > > + # All different DEPEX section tags would be copie= d into the As Built INF file > > + # and there would be separate DEPEX section tags > > + if self.ModuleType.upper() =3D=3D SUP_MODULE_USER= _DEFINED or self.ModuleType.upper() =3D=3D SUP_MODULE_HOST_APPLICATION: > > + if (Arch.upper() =3D=3D self.Arch.upper()) an= d (ModuleType.upper() !=3D TAB_ARCH_COMMON): > > + DepexList.append({(Arch, ModuleType): Dep= exExpr}) > > + else: > > + if Arch.upper() =3D=3D TAB_ARCH_COMMON or \ > > + (Arch.upper() =3D=3D self.Arch.upper() and = \ > > + ModuleType.upper() in [TAB_ARCH_COMMON, sel= f.ModuleType.upper()]): > > + DepexList.append({(Arch, ModuleType): Dep= exExpr}) > > + > > + #the type of build module is USER_DEFINED. > > + if self.ModuleType.upper() =3D=3D SUP_MODULE_USER_DEFINED or = self.ModuleType.upper() =3D=3D SUP_MODULE_HOST_APPLICATION: > > + for Depex in DepexList: > > + for key in Depex: > > + DepexStr +=3D '[Depex.%s.%s]\n' % key > > + DepexStr +=3D '\n'.join('# '+ val for val in Depe= x[key]) > > + DepexStr +=3D '\n\n' > > + if not DepexStr: > > + return '[Depex.%s]\n' % self.Arch > > + return DepexStr > > + > > + #the type of build module not is USER_DEFINED. > > + Count =3D 0 > > + for Depex in DepexList: > > + Count +=3D 1 > > + if DepexStr !=3D '': > > + DepexStr +=3D ' AND ' > > + DepexStr +=3D '(' > > + for D in Depex.values(): > > + DepexStr +=3D ' '.join(val for val in D) > > + Index =3D DepexStr.find('END') > > + if Index > -1 and Index =3D=3D len(DepexStr) - 3: > > + DepexStr =3D DepexStr[:-3] > > + DepexStr =3D DepexStr.strip() > > + DepexStr +=3D ')' > > + if Count =3D=3D 1: > > + DepexStr =3D DepexStr.lstrip('(').rstrip(')').strip() > > + if not DepexStr: > > + return '[Depex.%s]\n' % self.Arch > > + return '[Depex.%s]\n# ' % self.Arch + DepexStr > > + > > + ## Merge dependency expression > > + # > > + # @retval list The token list of the dependency expressi= on after parsed > > + # > > + @cached_property > > + def DepexList(self): > > + if self.DxsFile or self.IsLibrary or TAB_DEPENDENCY_EXPRESSIO= N_FILE in self.FileTypes: > > + return {} > > + > > + DepexList =3D [] > > + # > > + # Append depex from dependent libraries, if not "BEFORE", "AF= TER" expression > > + # > > + FixedVoidTypePcds =3D {} > > + for M in [self] + self.LibraryAutoGenList: > > + FixedVoidTypePcds.update(M.FixedVoidTypePcds) > > + for M in [self] + self.LibraryAutoGenList: > > + Inherited =3D False > > + for D in M.Module.Depex[self.Arch, self.ModuleType]: > > + if DepexList !=3D []: > > + DepexList.append('AND') > > + DepexList.append('(') > > + #replace D with value if D is FixedAtBuild PCD > > + NewList =3D [] > > + for item in D: > > + if '.' 
not in item: > > + NewList.append(item) > > + else: > > + try: > > + Value =3D FixedVoidTypePcds[item] > > + if len(Value.split(',')) !=3D 16: > > + EdkLogger.error("build", FORMAT_INVAL= ID, > > + "{} used in [Depex] s= ection should be used as FixedAtBuild type and VOID* datum type and 16 byte= s in the module.".format(item)) > > + NewList.append(Value) > > + except: > > + EdkLogger.error("build", FORMAT_INVALID, = "{} used in [Depex] section should be used as FixedAtBuild type and VOID* d= atum type in the module.".format(item)) > > + > > + DepexList.extend(NewList) > > + if DepexList[-1] =3D=3D 'END': # no need of a END at= this time > > + DepexList.pop() > > + DepexList.append(')') > > + Inherited =3D True > > + if Inherited: > > + EdkLogger.verbose("DEPEX[%s] (+%s) =3D %s" % (self.Na= me, M.Module.BaseName, DepexList)) > > + if 'BEFORE' in DepexList or 'AFTER' in DepexList: > > + break > > + if len(DepexList) > 0: > > + EdkLogger.verbose('') > > + return {self.ModuleType:DepexList} > > + > > + ## Merge dependency expression > > + # > > + # @retval list The token list of the dependency expressi= on after parsed > > + # > > + @cached_property > > + def DepexExpressionDict(self): > > + if self.DxsFile or self.IsLibrary or TAB_DEPENDENCY_EXPRESSIO= N_FILE in self.FileTypes: > > + return {} > > + > > + DepexExpressionString =3D '' > > + # > > + # Append depex from dependent libraries, if not "BEFORE", "AF= TER" expresion > > + # > > + for M in [self.Module] + self.DependentLibraryList: > > + Inherited =3D False > > + for D in M.DepexExpression[self.Arch, self.ModuleType]: > > + if DepexExpressionString !=3D '': > > + DepexExpressionString +=3D ' AND ' > > + DepexExpressionString +=3D '(' > > + DepexExpressionString +=3D D > > + DepexExpressionString =3D DepexExpressionString.rstri= p('END').strip() > > + DepexExpressionString +=3D ')' > > + Inherited =3D True > > + if Inherited: > > + EdkLogger.verbose("DEPEX[%s] (+%s) =3D %s" % (self.Na= me, M.BaseName, DepexExpressionString)) > > + if 'BEFORE' in DepexExpressionString or 'AFTER' in DepexE= xpressionString: > > + break > > + if len(DepexExpressionString) > 0: > > + EdkLogger.verbose('') > > + > > + return {self.ModuleType:DepexExpressionString} > > + > > + # Get the tiano core user extension, it is contain dependent libr= ary. > > + # @retval: a list contain tiano core userextension. 
> > + # > > + def _GetTianoCoreUserExtensionList(self): > > + TianoCoreUserExtentionList =3D [] > > + for M in [self.Module] + self.DependentLibraryList: > > + Filename =3D M.MetaFile.Path > > + InfObj =3D InfSectionParser.InfSectionParser(Filename) > > + TianoCoreUserExtenList =3D InfObj.GetUserExtensionTianoCo= re() > > + for TianoCoreUserExtent in TianoCoreUserExtenList: > > + for Section in TianoCoreUserExtent: > > + ItemList =3D Section.split(TAB_SPLIT) > > + Arch =3D self.Arch > > + if len(ItemList) =3D=3D 4: > > + Arch =3D ItemList[3] > > + if Arch.upper() =3D=3D TAB_ARCH_COMMON or Arch.up= per() =3D=3D self.Arch.upper(): > > + TianoCoreList =3D [] > > + TianoCoreList.extend([TAB_SECTION_START + Sec= tion + TAB_SECTION_END]) > > + TianoCoreList.extend(TianoCoreUserExtent[Sect= ion][:]) > > + TianoCoreList.append('\n') > > + TianoCoreUserExtentionList.append(TianoCoreLi= st) > > + > > + return TianoCoreUserExtentionList > > + > > + ## Return the list of specification version required for the modu= le > > + # > > + # @retval list The list of specification defined in modu= le file > > + # > > + @cached_property > > + def Specification(self): > > + return self.Module.Specification > > + > > + ## Tool option for the module build > > + # > > + # @param PlatformInfo The object of PlatformBuildInfo > > + # @retval dict The dict containing valid options > > + # > > + @cached_property > > + def BuildOption(self): > > + RetVal, self.BuildRuleOrder =3D self.PlatformInfo.ApplyBuildO= ption(self.Module) > > + if self.BuildRuleOrder: > > + self.BuildRuleOrder =3D ['.%s' % Ext for Ext in self.Buil= dRuleOrder.split()] > > + return RetVal > > + > > + ## Get include path list from tool option for the module build > > + # > > + # @retval list The include path list > > + # > > + @cached_property > > + def BuildOptionIncPathList(self): > > + # > > + # Regular expression for finding Include Directories, the dif= ference between MSFT and INTEL/GCC/RVCT > > + # is the former use /I , the Latter used -I to specify includ= e directories > > + # > > + if self.PlatformInfo.ToolChainFamily in (TAB_COMPILER_MSFT): > > + BuildOptIncludeRegEx =3D gBuildOptIncludePatternMsft > > + elif self.PlatformInfo.ToolChainFamily in ('INTEL', 'GCC', 'R= VCT'): > > + BuildOptIncludeRegEx =3D gBuildOptIncludePatternOther > > + else: > > + # > > + # New ToolChainFamily, don't known whether there is optio= n to specify include directories > > + # > > + return [] > > + > > + RetVal =3D [] > > + for Tool in ('CC', 'PP', 'VFRPP', 'ASLPP', 'ASLCC', 'APP', 'A= SM'): > > + try: > > + FlagOption =3D self.BuildOption[Tool]['FLAGS'] > > + except KeyError: > > + FlagOption =3D '' > > + > > + if self.ToolChainFamily !=3D 'RVCT': > > + IncPathList =3D [NormPath(Path, self.Macros) for Path= in BuildOptIncludeRegEx.findall(FlagOption)] > > + else: > > + # > > + # RVCT may specify a list of directory seperated by c= ommas > > + # > > + IncPathList =3D [] > > + for Path in BuildOptIncludeRegEx.findall(FlagOption): > > + PathList =3D GetSplitList(Path, TAB_COMMA_SPLIT) > > + IncPathList.extend(NormPath(PathEntry, self.Macro= s) for PathEntry in PathList) > > + > > + # > > + # EDK II modules must not reference header files outside = of the packages they depend on or > > + # within the module's directory tree. Report error if vio= lation. 
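
For what it is worth, the /I vs -I extraction that BuildOptionIncPathList relies on can be exercised on its own with the same regular expressions the patch defines near the top of the file; the FLAGS strings below are made up:

    import re

    # Same patterns as gBuildOptIncludePatternMsft / gBuildOptIncludePatternOther.
    msft_inc  = re.compile(r"(?:.*?)/I[ \t]*([^ ]*)", re.MULTILINE | re.DOTALL)
    other_inc = re.compile(r"(?:.*?)-I[ \t]*([^ ]*)", re.MULTILINE | re.DOTALL)

    print(msft_inc.findall(r"/nologo /W4 /IC:\include /I C:\other"))
    # -> ['C:\\include', 'C:\\other']
    print(other_inc.findall("-O2 -I/usr/include -I /opt/include"))
    # -> ['/usr/include', '/opt/include']
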
> > + # > > + if GlobalData.gDisableIncludePathCheck =3D=3D False: > > + for Path in IncPathList: > > + if (Path not in self.IncludePathList) and (Common= Path([Path, self.MetaFile.Dir]) !=3D self.MetaFile.Dir): > > + ErrMsg =3D "The include directory for the EDK= II module in this line is invalid %s specified in %s FLAGS '%s'" % (Path, = Tool, FlagOption) > > + EdkLogger.error("build", > > + PARAMETER_INVALID, > > + ExtraData=3DErrMsg, > > + File=3Dstr(self.MetaFile)) > > + RetVal +=3D IncPathList > > + return RetVal > > + > > + ## Return a list of files which can be built from source > > + # > > + # What kind of files can be built is determined by build rules i= n > > + # $(CONF_DIRECTORY)/build_rule.txt and toolchain family. > > + # > > + @cached_property > > + def SourceFileList(self): > > + RetVal =3D [] > > + ToolChainTagSet =3D {"", TAB_STAR, self.ToolChain} > > + ToolChainFamilySet =3D {"", TAB_STAR, self.ToolChainFamily, s= elf.BuildRuleFamily} > > + for F in self.Module.Sources: > > + # match tool chain > > + if F.TagName not in ToolChainTagSet: > > + EdkLogger.debug(EdkLogger.DEBUG_9, "The toolchain [%s= ] for processing file [%s] is found, " > > + "but [%s] is currently used" % (F.Tag= Name, str(F), self.ToolChain)) > > + continue > > + # match tool chain family or build rule family > > + if F.ToolChainFamily not in ToolChainFamilySet: > > + EdkLogger.debug( > > + EdkLogger.DEBUG_0, > > + "The file [%s] must be built by tools of = [%s], " \ > > + "but current toolchain family is [%s], bu= ildrule family is [%s]" \ > > + % (str(F), F.ToolChainFamily, self.To= olChainFamily, self.BuildRuleFamily)) > > + continue > > + > > + # add the file path into search path list for file includ= ing > > + if F.Dir not in self.IncludePathList: > > + self.IncludePathList.insert(0, F.Dir) > > + RetVal.append(F) > > + > > + self._MatchBuildRuleOrder(RetVal) > > + > > + for F in RetVal: > > + self._ApplyBuildRule(F, TAB_UNKNOWN_FILE) > > + return RetVal > > + > > + def _MatchBuildRuleOrder(self, FileList): > > + Order_Dict =3D {} > > + self.BuildOption > > + for SingleFile in FileList: > > + if self.BuildRuleOrder and SingleFile.Ext in self.BuildRu= leOrder and SingleFile.Ext in self.BuildRules: > > + key =3D SingleFile.Path.rsplit(SingleFile.Ext,1)[0] > > + if key in Order_Dict: > > + Order_Dict[key].append(SingleFile.Ext) > > + else: > > + Order_Dict[key] =3D [SingleFile.Ext] > > + > > + RemoveList =3D [] > > + for F in Order_Dict: > > + if len(Order_Dict[F]) > 1: > > + Order_Dict[F].sort(key=3Dlambda i: self.BuildRuleOrde= r.index(i)) > > + for Ext in Order_Dict[F][1:]: > > + RemoveList.append(F + Ext) > > + > > + for item in RemoveList: > > + FileList.remove(item) > > + > > + return FileList > > + > > + ## Return the list of unicode files > > + @cached_property > > + def UnicodeFileList(self): > > + return self.FileTypes.get(TAB_UNICODE_FILE,[]) > > + > > + ## Return the list of vfr files > > + @cached_property > > + def VfrFileList(self): > > + return self.FileTypes.get(TAB_VFR_FILE, []) > > + > > + ## Return the list of Image Definition files > > + @cached_property > > + def IdfFileList(self): > > + return self.FileTypes.get(TAB_IMAGE_FILE,[]) > > + > > + ## Return a list of files which can be built from binary > > + # > > + # "Build" binary files are just to copy them to build directory. 
> > + # > > + # @retval list The list of files which can be bu= ilt later > > + # > > + @cached_property > > + def BinaryFileList(self): > > + RetVal =3D [] > > + for F in self.Module.Binaries: > > + if F.Target not in [TAB_ARCH_COMMON, TAB_STAR] and F.Targ= et !=3D self.BuildTarget: > > + continue > > + RetVal.append(F) > > + self._ApplyBuildRule(F, F.Type, BinaryFileList=3DRetVal) > > + return RetVal > > + > > + @cached_property > > + def BuildRules(self): > > + RetVal =3D {} > > + BuildRuleDatabase =3D self.PlatformInfo.BuildRule > > + for Type in BuildRuleDatabase.FileTypeList: > > + #first try getting build rule by BuildRuleFamily > > + RuleObject =3D BuildRuleDatabase[Type, self.BuildType, se= lf.Arch, self.BuildRuleFamily] > > + if not RuleObject: > > + # build type is always module type, but ... > > + if self.ModuleType !=3D self.BuildType: > > + RuleObject =3D BuildRuleDatabase[Type, self.Modul= eType, self.Arch, self.BuildRuleFamily] > > + #second try getting build rule by ToolChainFamily > > + if not RuleObject: > > + RuleObject =3D BuildRuleDatabase[Type, self.BuildType= , self.Arch, self.ToolChainFamily] > > + if not RuleObject: > > + # build type is always module type, but ... > > + if self.ModuleType !=3D self.BuildType: > > + RuleObject =3D BuildRuleDatabase[Type, self.M= oduleType, self.Arch, self.ToolChainFamily] > > + if not RuleObject: > > + continue > > + RuleObject =3D RuleObject.Instantiate(self.Macros) > > + RetVal[Type] =3D RuleObject > > + for Ext in RuleObject.SourceFileExtList: > > + RetVal[Ext] =3D RuleObject > > + return RetVal > > + > > + def _ApplyBuildRule(self, File, FileType, BinaryFileList=3DNone): > > + if self._BuildTargets is None: > > + self._IntroBuildTargetList =3D set() > > + self._FinalBuildTargetList =3D set() > > + self._BuildTargets =3D defaultdict(set) > > + self._FileTypes =3D defaultdict(set) > > + > > + if not BinaryFileList: > > + BinaryFileList =3D self.BinaryFileList > > + > > + SubDirectory =3D os.path.join(self.OutputDir, File.SubDir) > > + if not os.path.exists(SubDirectory): > > + CreateDirectory(SubDirectory) > > + LastTarget =3D None > > + RuleChain =3D set() > > + SourceList =3D [File] > > + Index =3D 0 > > + # > > + # Make sure to get build rule order value > > + # > > + self.BuildOption > > + > > + while Index < len(SourceList): > > + Source =3D SourceList[Index] > > + Index =3D Index + 1 > > + > > + if Source !=3D File: > > + CreateDirectory(Source.Dir) > > + > > + if File.IsBinary and File =3D=3D Source and File in Binar= yFileList: > > + # Skip all files that are not binary libraries > > + if not self.IsLibrary: > > + continue > > + RuleObject =3D self.BuildRules[TAB_DEFAULT_BINARY_FIL= E] > > + elif FileType in self.BuildRules: > > + RuleObject =3D self.BuildRules[FileType] > > + elif Source.Ext in self.BuildRules: > > + RuleObject =3D self.BuildRules[Source.Ext] > > + else: > > + # stop at no more rules > > + if LastTarget: > > + self._FinalBuildTargetList.add(LastTarget) > > + break > > + > > + FileType =3D RuleObject.SourceFileType > > + self._FileTypes[FileType].add(Source) > > + > > + # stop at STATIC_LIBRARY for library > > + if self.IsLibrary and FileType =3D=3D TAB_STATIC_LIBRARY: > > + if LastTarget: > > + self._FinalBuildTargetList.add(LastTarget) > > + break > > + > > + Target =3D RuleObject.Apply(Source, self.BuildRuleOrder) > > + if not Target: > > + if LastTarget: > > + self._FinalBuildTargetList.add(LastTarget) > > + break > > + elif not Target.Outputs: > > + # Only do build for target with outputs > > + 
self._FinalBuildTargetList.add(Target) > > + > > + self._BuildTargets[FileType].add(Target) > > + > > + if not Source.IsBinary and Source =3D=3D File: > > + self._IntroBuildTargetList.add(Target) > > + > > + # to avoid cyclic rule > > + if FileType in RuleChain: > > + break > > + > > + RuleChain.add(FileType) > > + SourceList.extend(Target.Outputs) > > + LastTarget =3D Target > > + FileType =3D TAB_UNKNOWN_FILE > > + > > + @cached_property > > + def Targets(self): > > + if self._BuildTargets is None: > > + self._IntroBuildTargetList =3D set() > > + self._FinalBuildTargetList =3D set() > > + self._BuildTargets =3D defaultdict(set) > > + self._FileTypes =3D defaultdict(set) > > + > > + #TRICK: call SourceFileList property to apply build rule for = source files > > + self.SourceFileList > > + > > + #TRICK: call _GetBinaryFileList to apply build rule for binar= y files > > + self.BinaryFileList > > + > > + return self._BuildTargets > > + > > + @cached_property > > + def IntroTargetList(self): > > + self.Targets > > + return self._IntroBuildTargetList > > + > > + @cached_property > > + def CodaTargetList(self): > > + self.Targets > > + return self._FinalBuildTargetList > > + > > + @cached_property > > + def FileTypes(self): > > + self.Targets > > + return self._FileTypes > > + > > + ## Get the list of package object the module depends on > > + # > > + # @retval list The package object list > > + # > > + @cached_property > > + def DependentPackageList(self): > > + return self.Module.Packages > > + > > + ## Return the list of auto-generated code file > > + # > > + # @retval list The list of auto-generated file > > + # > > + @cached_property > > + def AutoGenFileList(self): > > + AutoGenUniIdf =3D self.BuildType !=3D 'UEFI_HII' > > + UniStringBinBuffer =3D BytesIO() > > + IdfGenBinBuffer =3D BytesIO() > > + RetVal =3D {} > > + AutoGenC =3D TemplateString() > > + AutoGenH =3D TemplateString() > > + StringH =3D TemplateString() > > + StringIdf =3D TemplateString() > > + GenC.CreateCode(self, AutoGenC, AutoGenH, StringH, AutoGenUni= Idf, UniStringBinBuffer, StringIdf, AutoGenUniIdf, IdfGenBinBuffer) > > + # > > + # AutoGen.c is generated if there are library classes in inf,= or there are object files > > + # > > + if str(AutoGenC) !=3D "" and (len(self.Module.LibraryClasses)= > 0 > > + or TAB_OBJECT_FILE in self.FileTy= pes): > > + AutoFile =3D PathClass(gAutoGenCodeFileName, self.DebugDi= r) > > + RetVal[AutoFile] =3D str(AutoGenC) > > + self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE) > > + if str(AutoGenH) !=3D "": > > + AutoFile =3D PathClass(gAutoGenHeaderFileName, self.Debug= Dir) > > + RetVal[AutoFile] =3D str(AutoGenH) > > + self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE) > > + if str(StringH) !=3D "": > > + AutoFile =3D PathClass(gAutoGenStringFileName % {"module_= name":self.Name}, self.DebugDir) > > + RetVal[AutoFile] =3D str(StringH) > > + self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE) > > + if UniStringBinBuffer is not None and UniStringBinBuffer.getv= alue() !=3D b"": > > + AutoFile =3D PathClass(gAutoGenStringFormFileName % {"mod= ule_name":self.Name}, self.OutputDir) > > + RetVal[AutoFile] =3D UniStringBinBuffer.getvalue() > > + AutoFile.IsBinary =3D True > > + self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE) > > + if UniStringBinBuffer is not None: > > + UniStringBinBuffer.close() > > + if str(StringIdf) !=3D "": > > + AutoFile =3D PathClass(gAutoGenImageDefFileName % {"modul= e_name":self.Name}, self.DebugDir) > > + RetVal[AutoFile] =3D str(StringIdf) > > + 
self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE) > > + if IdfGenBinBuffer is not None and IdfGenBinBuffer.getvalue()= !=3D b"": > > + AutoFile =3D PathClass(gAutoGenIdfFileName % {"module_nam= e":self.Name}, self.OutputDir) > > + RetVal[AutoFile] =3D IdfGenBinBuffer.getvalue() > > + AutoFile.IsBinary =3D True > > + self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE) > > + if IdfGenBinBuffer is not None: > > + IdfGenBinBuffer.close() > > + return RetVal > > + > > + ## Return the list of library modules explicitly or implicitly us= ed by this module > > + @cached_property > > + def DependentLibraryList(self): > > + # only merge library classes and PCD for non-library module > > + if self.IsLibrary: > > + return [] > > + return self.PlatformInfo.ApplyLibraryInstance(self.Module) > > + > > + ## Get the list of PCDs from current module > > + # > > + # @retval list The list of PCD > > + # > > + @cached_property > > + def ModulePcdList(self): > > + # apply PCD settings from platform > > + RetVal =3D self.PlatformInfo.ApplyPcdSetting(self.Module, sel= f.Module.Pcds) > > + > > + return RetVal > > + @cached_property > > + def _PcdComments(self): > > + ReVal =3D OrderedListDict() > > + ExtendCopyDictionaryLists(ReVal, self.Module.PcdComments) > > + if not self.IsLibrary: > > + for Library in self.DependentLibraryList: > > + ExtendCopyDictionaryLists(ReVal, Library.PcdComments) > > + return ReVal > > + > > + ## Get the list of PCDs from dependent libraries > > + # > > + # @retval list The list of PCD > > + # > > + @cached_property > > + def LibraryPcdList(self): > > + if self.IsLibrary: > > + return [] > > + RetVal =3D [] > > + Pcds =3D set() > > + # get PCDs from dependent libraries > > + for Library in self.DependentLibraryList: > > + PcdsInLibrary =3D OrderedDict() > > + for Key in Library.Pcds: > > + # skip duplicated PCDs > > + if Key in self.Module.Pcds or Key in Pcds: > > + continue > > + Pcds.add(Key) > > + PcdsInLibrary[Key] =3D copy.copy(Library.Pcds[Key]) > > + RetVal.extend(self.PlatformInfo.ApplyPcdSetting(self.Modu= le, PcdsInLibrary, Library=3DLibrary)) > > + return RetVal > > + > > + ## Get the GUID value mapping > > + # > > + # @retval dict The mapping between GUID cname and its va= lue > > + # > > + @cached_property > > + def GuidList(self): > > + RetVal =3D self.Module.Guids > > + for Library in self.DependentLibraryList: > > + RetVal.update(Library.Guids) > > + ExtendCopyDictionaryLists(self._GuidComments, Library.Gui= dComments) > > + ExtendCopyDictionaryLists(self._GuidComments, self.Module.Gui= dComments) > > + return RetVal > > + > > + @cached_property > > + def GetGuidsUsedByPcd(self): > > + RetVal =3D OrderedDict(self.Module.GetGuidsUsedByPcd()) > > + for Library in self.DependentLibraryList: > > + RetVal.update(Library.GetGuidsUsedByPcd()) > > + return RetVal > > + ## Get the protocol value mapping > > + # > > + # @retval dict The mapping between protocol cname and it= s value > > + # > > + @cached_property > > + def ProtocolList(self): > > + RetVal =3D OrderedDict(self.Module.Protocols) > > + for Library in self.DependentLibraryList: > > + RetVal.update(Library.Protocols) > > + ExtendCopyDictionaryLists(self._ProtocolComments, Library= .ProtocolComments) > > + ExtendCopyDictionaryLists(self._ProtocolComments, self.Module= .ProtocolComments) > > + return RetVal > > + > > + ## Get the PPI value mapping > > + # > > + # @retval dict The mapping between PPI cname and its val= ue > > + # > > + @cached_property > > + def PpiList(self): > > + RetVal =3D OrderedDict(self.Module.Ppis) > 
> + for Library in self.DependentLibraryList: > > + RetVal.update(Library.Ppis) > > + ExtendCopyDictionaryLists(self._PpiComments, Library.PpiC= omments) > > + ExtendCopyDictionaryLists(self._PpiComments, self.Module.PpiC= omments) > > + return RetVal > > + > > + ## Get the list of include search path > > + # > > + # @retval list The list path > > + # > > + @cached_property > > + def IncludePathList(self): > > + RetVal =3D [] > > + RetVal.append(self.MetaFile.Dir) > > + RetVal.append(self.DebugDir) > > + > > + for Package in self.Module.Packages: > > + PackageDir =3D mws.join(self.WorkspaceDir, Package.MetaFi= le.Dir) > > + if PackageDir not in RetVal: > > + RetVal.append(PackageDir) > > + IncludesList =3D Package.Includes > > + if Package._PrivateIncludes: > > + if not self.MetaFile.OriginalPath.Path.startswith(Pac= kageDir): > > + IncludesList =3D list(set(Package.Includes).diffe= rence(set(Package._PrivateIncludes))) > > + for Inc in IncludesList: > > + if Inc not in RetVal: > > + RetVal.append(str(Inc)) > > + return RetVal > > + > > + @cached_property > > + def IncludePathLength(self): > > + return sum(len(inc)+1 for inc in self.IncludePathList) > > + > > + ## Get HII EX PCDs which maybe used by VFR > > + # > > + # efivarstore used by VFR may relate with HII EX PCDs > > + # Get the variable name and GUID from efivarstore and HII EX PCD > > + # List the HII EX PCDs in As Built INF if both name and GUID mat= ch. > > + # > > + # @retval list HII EX PCDs > > + # > > + def _GetPcdsMaybeUsedByVfr(self): > > + if not self.SourceFileList: > > + return [] > > + > > + NameGuids =3D set() > > + for SrcFile in self.SourceFileList: > > + if SrcFile.Ext.lower() !=3D '.vfr': > > + continue > > + Vfri =3D os.path.join(self.OutputDir, SrcFile.BaseName + = '.i') > > + if not os.path.exists(Vfri): > > + continue > > + VfriFile =3D open(Vfri, 'r') > > + Content =3D VfriFile.read() > > + VfriFile.close() > > + Pos =3D Content.find('efivarstore') > > + while Pos !=3D -1: > > + # > > + # Make sure 'efivarstore' is the start of efivarstore= statement > > + # In case of the value of 'name' (name =3D efivarstor= e) is equal to 'efivarstore' > > + # > > + Index =3D Pos - 1 > > + while Index >=3D 0 and Content[Index] in ' \t\r\n': > > + Index -=3D 1 > > + if Index >=3D 0 and Content[Index] !=3D ';': > > + Pos =3D Content.find('efivarstore', Pos + len('ef= ivarstore')) > > + continue > > + # > > + # 'efivarstore' must be followed by name and guid > > + # > > + Name =3D gEfiVarStoreNamePattern.search(Content, Pos) > > + if not Name: > > + break > > + Guid =3D gEfiVarStoreGuidPattern.search(Content, Pos) > > + if not Guid: > > + break > > + NameArray =3D _ConvertStringToByteArray('L"' + Name.g= roup(1) + '"') > > + NameGuids.add((NameArray, GuidStructureStringToGuidSt= ring(Guid.group(1)))) > > + Pos =3D Content.find('efivarstore', Name.end()) > > + if not NameGuids: > > + return [] > > + HiiExPcds =3D [] > > + for Pcd in self.PlatformInfo.Pcds.values(): > > + if Pcd.Type !=3D TAB_PCDS_DYNAMIC_EX_HII: > > + continue > > + for SkuInfo in Pcd.SkuInfoList.values(): > > + Value =3D GuidValue(SkuInfo.VariableGuid, self.Platfo= rmInfo.PackageList, self.MetaFile.Path) > > + if not Value: > > + continue > > + Name =3D _ConvertStringToByteArray(SkuInfo.VariableNa= me) > > + Guid =3D GuidStructureStringToGuidString(Value) > > + if (Name, Guid) in NameGuids and Pcd not in HiiExPcds= : > > + HiiExPcds.append(Pcd) > > + break > > + > > + return HiiExPcds > > + > > + def _GenOffsetBin(self): > > + VfrUniBaseName =3D {} > > + for 
SourceFile in self.Module.Sources: > > + if SourceFile.Type.upper() =3D=3D ".VFR" : > > + # > > + # search the .map file to find the offset of vfr bina= ry in the PE32+/TE file. > > + # > > + VfrUniBaseName[SourceFile.BaseName] =3D (SourceFile.B= aseName + "Bin") > > + elif SourceFile.Type.upper() =3D=3D ".UNI" : > > + # > > + # search the .map file to find the offset of Uni stri= ngs binary in the PE32+/TE file. > > + # > > + VfrUniBaseName["UniOffsetName"] =3D (self.Name + "Str= ings") > > + > > + if not VfrUniBaseName: > > + return None > > + MapFileName =3D os.path.join(self.OutputDir, self.Name + ".ma= p") > > + EfiFileName =3D os.path.join(self.OutputDir, self.Name + ".ef= i") > > + VfrUniOffsetList =3D GetVariableOffset(MapFileName, EfiFileNa= me, list(VfrUniBaseName.values())) > > + if not VfrUniOffsetList: > > + return None > > + > > + OutputName =3D '%sOffset.bin' % self.Name > > + UniVfrOffsetFileName =3D os.path.join( self.OutputDir, Ou= tputName) > > + > > + try: > > + fInputfile =3D open(UniVfrOffsetFileName, "wb+", 0) > > + except: > > + EdkLogger.error("build", FILE_OPEN_FAILURE, "File open fa= iled for %s" % UniVfrOffsetFileName, None) > > + > > + # Use a instance of BytesIO to cache data > > + fStringIO =3D BytesIO() > > + > > + for Item in VfrUniOffsetList: > > + if (Item[0].find("Strings") !=3D -1): > > + # > > + # UNI offset in image. > > + # GUID + Offset > > + # { 0x8913c5e0, 0x33f6, 0x4d86, { 0x9b, 0xf1, 0x43, 0= xef, 0x89, 0xfc, 0x6, 0x66 } } > > + # > > + UniGuid =3D b'\xe0\xc5\x13\x89\xf63\x86M\x9b\xf1C\xef= \x89\xfc\x06f' > > + fStringIO.write(UniGuid) > > + UniValue =3D pack ('Q', int (Item[1], 16)) > > + fStringIO.write (UniValue) > > + else: > > + # > > + # VFR binary offset in image. > > + # GUID + Offset > > + # { 0xd0bc7cb4, 0x6a47, 0x495f, { 0xaa, 0x11, 0x71, 0= x7, 0x46, 0xda, 0x6, 0xa2 } }; > > + # > > + VfrGuid =3D b'\xb4|\xbc\xd0Gj_I\xaa\x11q\x07F\xda\x06= \xa2' > > + fStringIO.write(VfrGuid) > > + VfrValue =3D pack ('Q', int (Item[1], 16)) > > + fStringIO.write (VfrValue) > > + # > > + # write data into file. > > + # > > + try : > > + fInputfile.write (fStringIO.getvalue()) > > + except: > > + EdkLogger.error("build", FILE_WRITE_FAILURE, "Write data = to file %s failed, please check whether the " > > + "file been locked or using by other appli= cations." 
%UniVfrOffsetFileName, None) > > + > > + fStringIO.close () > > + fInputfile.close () > > + return OutputName > > + @cached_property > > + def OutputFile(self): > > + retVal =3D set() > > + OutputDir =3D self.OutputDir.replace('\\', '/').strip('/') > > + DebugDir =3D self.DebugDir.replace('\\', '/').strip('/') > > + for Item in self.CodaTargetList: > > + File =3D Item.Target.Path.replace('\\', '/').strip('/').r= eplace(DebugDir, '').replace(OutputDir, '').strip('/') > > + retVal.add(File) > > + if self.DepexGenerated: > > + retVal.add(self.Name + '.depex') > > + > > + Bin =3D self._GenOffsetBin() > > + if Bin: > > + retVal.add(Bin) > > + > > + for Root, Dirs, Files in os.walk(OutputDir): > > + for File in Files: > > + if File.lower().endswith('.pdb'): > > + retVal.add(File) > > + > > + return retVal > > + > > + ## Create AsBuilt INF file the module > > + # > > + def CreateAsBuiltInf(self): > > + > > + if self.IsAsBuiltInfCreated: > > + return > > + > > + # Skip INF file generation for libraries > > + if self.IsLibrary: > > + return > > + > > + # Skip the following code for modules with no source files > > + if not self.SourceFileList: > > + return > > + > > + # Skip the following code for modules without any binary file= s > > + if self.BinaryFileList: > > + return > > + > > + ### TODO: How to handles mixed source and binary modules > > + > > + # Find all DynamicEx and PatchableInModule PCDs used by this = module and dependent libraries > > + # Also find all packages that the DynamicEx PCDs depend on > > + Pcds =3D [] > > + PatchablePcds =3D [] > > + Packages =3D [] > > + PcdCheckList =3D [] > > + PcdTokenSpaceList =3D [] > > + for Pcd in self.ModulePcdList + self.LibraryPcdList: > > + if Pcd.Type =3D=3D TAB_PCDS_PATCHABLE_IN_MODULE: > > + PatchablePcds.append(Pcd) > > + PcdCheckList.append((Pcd.TokenCName, Pcd.TokenSpaceGu= idCName, TAB_PCDS_PATCHABLE_IN_MODULE)) > > + elif Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET: > > + if Pcd not in Pcds: > > + Pcds.append(Pcd) > > + PcdCheckList.append((Pcd.TokenCName, Pcd.TokenSpa= ceGuidCName, TAB_PCDS_DYNAMIC_EX)) > > + PcdCheckList.append((Pcd.TokenCName, Pcd.TokenSpa= ceGuidCName, TAB_PCDS_DYNAMIC)) > > + PcdTokenSpaceList.append(Pcd.TokenSpaceGuidCName) > > + GuidList =3D OrderedDict(self.GuidList) > > + for TokenSpace in self.GetGuidsUsedByPcd: > > + # If token space is not referred by patch PCD or Ex PCD, = remove the GUID from GUID list > > + # The GUIDs in GUIDs section should really be the GUIDs i= n source INF or referred by Ex an patch PCDs > > + if TokenSpace not in PcdTokenSpaceList and TokenSpace in = GuidList: > > + GuidList.pop(TokenSpace) > > + CheckList =3D (GuidList, self.PpiList, self.ProtocolList, Pcd= CheckList) > > + for Package in self.DerivedPackageList: > > + if Package in Packages: > > + continue > > + BeChecked =3D (Package.Guids, Package.Ppis, Package.Proto= cols, Package.Pcds) > > + Found =3D False > > + for Index in range(len(BeChecked)): > > + for Item in CheckList[Index]: > > + if Item in BeChecked[Index]: > > + Packages.append(Package) > > + Found =3D True > > + break > > + if Found: > > + break > > + > > + VfrPcds =3D self._GetPcdsMaybeUsedByVfr() > > + for Pkg in self.PlatformInfo.PackageList: > > + if Pkg in Packages: > > + continue > > + for VfrPcd in VfrPcds: > > + if ((VfrPcd.TokenCName, VfrPcd.TokenSpaceGuidCName, T= AB_PCDS_DYNAMIC_EX) in Pkg.Pcds or > > + (VfrPcd.TokenCName, VfrPcd.TokenSpaceGuidCName, T= AB_PCDS_DYNAMIC) in Pkg.Pcds): > > + Packages.append(Pkg) > > + break > > + > > + ModuleType =3D 
SUP_MODULE_DXE_DRIVER if self.ModuleType =3D= =3D SUP_MODULE_UEFI_DRIVER and self.DepexGenerated else self.ModuleType > > + DriverType =3D self.PcdIsDriver if self.PcdIsDriver else '' > > + Guid =3D self.Guid > > + MDefs =3D self.Module.Defines > > + > > + AsBuiltInfDict =3D { > > + 'module_name' : self.Name, > > + 'module_guid' : Guid, > > + 'module_module_type' : ModuleType, > > + 'module_version_string' : [MDefs['VERSION_STRIN= G']] if 'VERSION_STRING' in MDefs else [], > > + 'pcd_is_driver_string' : [], > > + 'module_uefi_specification_version' : [], > > + 'module_pi_specification_version' : [], > > + 'module_entry_point' : self.Module.ModuleEnt= ryPointList, > > + 'module_unload_image' : self.Module.ModuleUnl= oadImageList, > > + 'module_constructor' : self.Module.Construct= orList, > > + 'module_destructor' : self.Module.Destructo= rList, > > + 'module_shadow' : [MDefs['SHADOW']] if = 'SHADOW' in MDefs else [], > > + 'module_pci_vendor_id' : [MDefs['PCI_VENDOR_ID= ']] if 'PCI_VENDOR_ID' in MDefs else [], > > + 'module_pci_device_id' : [MDefs['PCI_DEVICE_ID= ']] if 'PCI_DEVICE_ID' in MDefs else [], > > + 'module_pci_class_code' : [MDefs['PCI_CLASS_COD= E']] if 'PCI_CLASS_CODE' in MDefs else [], > > + 'module_pci_revision' : [MDefs['PCI_REVISION'= ]] if 'PCI_REVISION' in MDefs else [], > > + 'module_build_number' : [MDefs['BUILD_NUMBER'= ]] if 'BUILD_NUMBER' in MDefs else [], > > + 'module_spec' : [MDefs['SPEC']] if 'S= PEC' in MDefs else [], > > + 'module_uefi_hii_resource_section' : [MDefs['UEFI_HII_RESO= URCE_SECTION']] if 'UEFI_HII_RESOURCE_SECTION' in MDefs else [], > > + 'module_uni_file' : [MDefs['MODULE_UNI_FI= LE']] if 'MODULE_UNI_FILE' in MDefs else [], > > + 'module_arch' : self.Arch, > > + 'package_item' : [Package.MetaFile.Fil= e.replace('\\', '/') for Package in Packages], > > + 'binary_item' : [], > > + 'patchablepcd_item' : [], > > + 'pcd_item' : [], > > + 'protocol_item' : [], > > + 'ppi_item' : [], > > + 'guid_item' : [], > > + 'flags_item' : [], > > + 'libraryclasses_item' : [] > > + } > > + > > + if 'MODULE_UNI_FILE' in MDefs: > > + UNIFile =3D os.path.join(self.MetaFile.Dir, MDefs['MODULE= _UNI_FILE']) > > + if os.path.isfile(UNIFile): > > + shutil.copy2(UNIFile, self.OutputDir) > > + > > + if self.AutoGenVersion > int(gInfSpecVersion, 0): > > + AsBuiltInfDict['module_inf_version'] =3D '0x%08x' % self.= AutoGenVersion > > + else: > > + AsBuiltInfDict['module_inf_version'] =3D gInfSpecVersion > > + > > + if DriverType: > > + AsBuiltInfDict['pcd_is_driver_string'].append(DriverType) > > + > > + if 'UEFI_SPECIFICATION_VERSION' in self.Specification: > > + AsBuiltInfDict['module_uefi_specification_version'].appen= d(self.Specification['UEFI_SPECIFICATION_VERSION']) > > + if 'PI_SPECIFICATION_VERSION' in self.Specification: > > + AsBuiltInfDict['module_pi_specification_version'].append(= self.Specification['PI_SPECIFICATION_VERSION']) > > + > > + OutputDir =3D self.OutputDir.replace('\\', '/').strip('/') > > + DebugDir =3D self.DebugDir.replace('\\', '/').strip('/') > > + for Item in self.CodaTargetList: > > + File =3D Item.Target.Path.replace('\\', '/').strip('/').r= eplace(DebugDir, '').replace(OutputDir, '').strip('/') > > + if os.path.isabs(File): > > + File =3D File.replace('\\', '/').strip('/').replace(O= utputDir, '').strip('/') > > + if Item.Target.Ext.lower() =3D=3D '.aml': > > + AsBuiltInfDict['binary_item'].append('ASL|' + File) > > + elif Item.Target.Ext.lower() =3D=3D '.acpi': > > + AsBuiltInfDict['binary_item'].append('ACPI|' + File) > > + elif 
Item.Target.Ext.lower() =3D=3D '.efi': > > + AsBuiltInfDict['binary_item'].append('PE32|' + self.N= ame + '.efi') > > + else: > > + AsBuiltInfDict['binary_item'].append('BIN|' + File) > > + if not self.DepexGenerated: > > + DepexFile =3D os.path.join(self.OutputDir, self.Name + '.= depex') > > + if os.path.exists(DepexFile): > > + self.DepexGenerated =3D True > > + if self.DepexGenerated: > > + if self.ModuleType in [SUP_MODULE_PEIM]: > > + AsBuiltInfDict['binary_item'].append('PEI_DEPEX|' + s= elf.Name + '.depex') > > + elif self.ModuleType in [SUP_MODULE_DXE_DRIVER, SUP_MODUL= E_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_UEFI_DRIVER]: > > + AsBuiltInfDict['binary_item'].append('DXE_DEPEX|' + s= elf.Name + '.depex') > > + elif self.ModuleType in [SUP_MODULE_DXE_SMM_DRIVER]: > > + AsBuiltInfDict['binary_item'].append('SMM_DEPEX|' + s= elf.Name + '.depex') > > + > > + Bin =3D self._GenOffsetBin() > > + if Bin: > > + AsBuiltInfDict['binary_item'].append('BIN|%s' % Bin) > > + > > + for Root, Dirs, Files in os.walk(OutputDir): > > + for File in Files: > > + if File.lower().endswith('.pdb'): > > + AsBuiltInfDict['binary_item'].append('DISPOSABLE|= ' + File) > > + HeaderComments =3D self.Module.HeaderComments > > + StartPos =3D 0 > > + for Index in range(len(HeaderComments)): > > + if HeaderComments[Index].find('@BinaryHeader') !=3D -1: > > + HeaderComments[Index] =3D HeaderComments[Index].repla= ce('@BinaryHeader', '@file') > > + StartPos =3D Index > > + break > > + AsBuiltInfDict['header_comments'] =3D '\n'.join(HeaderComment= s[StartPos:]).replace(':#', '://') > > + AsBuiltInfDict['tail_comments'] =3D '\n'.join(self.Module.Tai= lComments) > > + > > + GenList =3D [ > > + (self.ProtocolList, self._ProtocolComments, 'protocol_ite= m'), > > + (self.PpiList, self._PpiComments, 'ppi_item'), > > + (GuidList, self._GuidComments, 'guid_item') > > + ] > > + for Item in GenList: > > + for CName in Item[0]: > > + Comments =3D '\n '.join(Item[1][CName]) if CName in = Item[1] else '' > > + Entry =3D Comments + '\n ' + CName if Comments else = CName > > + AsBuiltInfDict[Item[2]].append(Entry) > > + PatchList =3D parsePcdInfoFromMapFile( > > + os.path.join(self.OutputDir, self.Name + = '.map'), > > + os.path.join(self.OutputDir, self.Name + = '.efi') > > + ) > > + if PatchList: > > + for Pcd in PatchablePcds: > > + TokenCName =3D Pcd.TokenCName > > + for PcdItem in GlobalData.MixedPcd: > > + if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) in G= lobalData.MixedPcd[PcdItem]: > > + TokenCName =3D PcdItem[0] > > + break > > + for PatchPcd in PatchList: > > + if TokenCName =3D=3D PatchPcd[0]: > > + break > > + else: > > + continue > > + PcdValue =3D '' > > + if Pcd.DatumType =3D=3D 'BOOLEAN': > > + BoolValue =3D Pcd.DefaultValue.upper() > > + if BoolValue =3D=3D 'TRUE': > > + Pcd.DefaultValue =3D '1' > > + elif BoolValue =3D=3D 'FALSE': > > + Pcd.DefaultValue =3D '0' > > + > > + if Pcd.DatumType in TAB_PCD_NUMERIC_TYPES: > > + HexFormat =3D '0x%02x' > > + if Pcd.DatumType =3D=3D TAB_UINT16: > > + HexFormat =3D '0x%04x' > > + elif Pcd.DatumType =3D=3D TAB_UINT32: > > + HexFormat =3D '0x%08x' > > + elif Pcd.DatumType =3D=3D TAB_UINT64: > > + HexFormat =3D '0x%016x' > > + PcdValue =3D HexFormat % int(Pcd.DefaultValue, 0) > > + else: > > + if Pcd.MaxDatumSize is None or Pcd.MaxDatumSize = =3D=3D '': > > + EdkLogger.error("build", AUTOGEN_ERROR, > > + "Unknown [MaxDatumSize] of PC= D [%s.%s]" % (Pcd.TokenSpaceGuidCName, TokenCName) > > + ) > > + ArraySize =3D int(Pcd.MaxDatumSize, 0) > > + PcdValue =3D 
Pcd.DefaultValue > > + if PcdValue[0] !=3D '{': > > + Unicode =3D False > > + if PcdValue[0] =3D=3D 'L': > > + Unicode =3D True > > + PcdValue =3D PcdValue.lstrip('L') > > + PcdValue =3D eval(PcdValue) > > + NewValue =3D '{' > > + for Index in range(0, len(PcdValue)): > > + if Unicode: > > + CharVal =3D ord(PcdValue[Index]) > > + NewValue =3D NewValue + '0x%02x' % (C= harVal & 0x00FF) + ', ' \ > > + + '0x%02x' % (CharVal >> 8) += ', ' > > + else: > > + NewValue =3D NewValue + '0x%02x' % (o= rd(PcdValue[Index]) % 0x100) + ', ' > > + Padding =3D '0x00, ' > > + if Unicode: > > + Padding =3D Padding * 2 > > + ArraySize =3D ArraySize // 2 > > + if ArraySize < (len(PcdValue) + 1): > > + if Pcd.MaxSizeUserSet: > > + EdkLogger.error("build", AUTOGEN_ERRO= R, > > + "The maximum size of VOID= * type PCD '%s.%s' is less than its actual size occupied." % (Pcd.TokenSpac= eGuidCName, TokenCName) > > + ) > > + else: > > + ArraySize =3D len(PcdValue) + 1 > > + if ArraySize > len(PcdValue) + 1: > > + NewValue =3D NewValue + Padding * (ArrayS= ize - len(PcdValue) - 1) > > + PcdValue =3D NewValue + Padding.strip().rstri= p(',') + '}' > > + elif len(PcdValue.split(',')) <=3D ArraySize: > > + PcdValue =3D PcdValue.rstrip('}') + ', 0x00' = * (ArraySize - len(PcdValue.split(','))) > > + PcdValue +=3D '}' > > + else: > > + if Pcd.MaxSizeUserSet: > > + EdkLogger.error("build", AUTOGEN_ERROR, > > + "The maximum size of VOID* ty= pe PCD '%s.%s' is less than its actual size occupied." % (Pcd.TokenSpaceGui= dCName, TokenCName) > > + ) > > + else: > > + ArraySize =3D len(PcdValue) + 1 > > + PcdItem =3D '%s.%s|%s|0x%X' % \ > > + (Pcd.TokenSpaceGuidCName, TokenCName, PcdValue, P= atchPcd[1]) > > + PcdComments =3D '' > > + if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) in self.= _PcdComments: > > + PcdComments =3D '\n '.join(self._PcdComments[Pcd= .TokenSpaceGuidCName, Pcd.TokenCName]) > > + if PcdComments: > > + PcdItem =3D PcdComments + '\n ' + PcdItem > > + AsBuiltInfDict['patchablepcd_item'].append(PcdItem) > > + > > + for Pcd in Pcds + VfrPcds: > > + PcdCommentList =3D [] > > + HiiInfo =3D '' > > + TokenCName =3D Pcd.TokenCName > > + for PcdItem in GlobalData.MixedPcd: > > + if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) in Globa= lData.MixedPcd[PcdItem]: > > + TokenCName =3D PcdItem[0] > > + break > > + if Pcd.Type =3D=3D TAB_PCDS_DYNAMIC_EX_HII: > > + for SkuName in Pcd.SkuInfoList: > > + SkuInfo =3D Pcd.SkuInfoList[SkuName] > > + HiiInfo =3D '## %s|%s|%s' % (SkuInfo.VariableName= , SkuInfo.VariableGuid, SkuInfo.VariableOffset) > > + break > > + if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) in self._Pcd= Comments: > > + PcdCommentList =3D self._PcdComments[Pcd.TokenSpaceGu= idCName, Pcd.TokenCName][:] > > + if HiiInfo: > > + UsageIndex =3D -1 > > + UsageStr =3D '' > > + for Index, Comment in enumerate(PcdCommentList): > > + for Usage in UsageList: > > + if Comment.find(Usage) !=3D -1: > > + UsageStr =3D Usage > > + UsageIndex =3D Index > > + break > > + if UsageIndex !=3D -1: > > + PcdCommentList[UsageIndex] =3D '## %s %s %s' % (U= sageStr, HiiInfo, PcdCommentList[UsageIndex].replace(UsageStr, '')) > > + else: > > + PcdCommentList.append('## UNDEFINED ' + HiiInfo) > > + PcdComments =3D '\n '.join(PcdCommentList) > > + PcdEntry =3D Pcd.TokenSpaceGuidCName + '.' 
+ TokenCName > > + if PcdComments: > > + PcdEntry =3D PcdComments + '\n ' + PcdEntry > > + AsBuiltInfDict['pcd_item'].append(PcdEntry) > > + for Item in self.BuildOption: > > + if 'FLAGS' in self.BuildOption[Item]: > > + AsBuiltInfDict['flags_item'].append('%s:%s_%s_%s_%s_F= LAGS =3D %s' % (self.ToolChainFamily, self.BuildTarget, self.ToolChain, sel= f.Arch, Item, self.BuildOption[Item]['FLAGS'].strip())) > > + > > + # Generated LibraryClasses section in comments. > > + for Library in self.LibraryAutoGenList: > > + AsBuiltInfDict['libraryclasses_item'].append(Library.Meta= File.File.replace('\\', '/')) > > + > > + # Generated UserExtensions TianoCore section. > > + # All tianocore user extensions are copied. > > + UserExtStr =3D '' > > + for TianoCore in self._GetTianoCoreUserExtensionList(): > > + UserExtStr +=3D '\n'.join(TianoCore) > > + ExtensionFile =3D os.path.join(self.MetaFile.Dir, TianoCo= re[1]) > > + if os.path.isfile(ExtensionFile): > > + shutil.copy2(ExtensionFile, self.OutputDir) > > + AsBuiltInfDict['userextension_tianocore_item'] =3D UserExtStr > > + > > + # Generated depex expression section in comments. > > + DepexExpression =3D self._GetDepexExpresionString() > > + AsBuiltInfDict['depexsection_item'] =3D DepexExpression if De= pexExpression else '' > > + > > + AsBuiltInf =3D TemplateString() > > + AsBuiltInf.Append(gAsBuiltInfHeaderString.Replace(AsBuiltInfD= ict)) > > + > > + SaveFileOnChange(os.path.join(self.OutputDir, self.Name + '.i= nf'), str(AsBuiltInf), False) > > + > > + self.IsAsBuiltInfCreated =3D True > > + > > + def CopyModuleToCache(self): > > + FileDir =3D path.join(GlobalData.gBinCacheDest, self.Platform= Info.Name, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceD= ir, self.MetaFile.BaseName) > > + CreateDirectory (FileDir) > > + HashFile =3D path.join(self.BuildDir, self.Name + '.hash') > > + if os.path.exists(HashFile): > > + CopyFileOnChange(HashFile, FileDir) > > + ModuleFile =3D path.join(self.OutputDir, self.Name + '.inf') > > + if os.path.exists(ModuleFile): > > + CopyFileOnChange(ModuleFile, FileDir) > > + if not self.OutputFile: > > + Ma =3D self.BuildDatabase[self.MetaFile, self.Arch, self.= BuildTarget, self.ToolChain] > > + self.OutputFile =3D Ma.Binaries > > + for File in self.OutputFile: > > + File =3D str(File) > > + if not os.path.isabs(File): > > + File =3D os.path.join(self.OutputDir, File) > > + if os.path.exists(File): > > + sub_dir =3D os.path.relpath(File, self.OutputDir) > > + destination_file =3D os.path.join(FileDir, sub_dir) > > + destination_dir =3D os.path.dirname(destination_file) > > + CreateDirectory(destination_dir) > > + CopyFileOnChange(File, destination_dir) > > + > > + def AttemptModuleCacheCopy(self): > > + # If library or Module is binary do not skip by hash > > + if self.IsBinaryModule: > > + return False > > + # .inc is contains binary information so do not skip by hash = as well > > + for f_ext in self.SourceFileList: > > + if '.inc' in str(f_ext): > > + return False > > + FileDir =3D path.join(GlobalData.gBinCacheSource, self.Platfo= rmInfo.Name, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.Sourc= eDir, self.MetaFile.BaseName) > > + HashFile =3D path.join(FileDir, self.Name + '.hash') > > + if os.path.exists(HashFile): > > + f =3D open(HashFile, 'r') > > + CacheHash =3D f.read() > > + f.close() > > + self.GenModuleHash() > > + if GlobalData.gModuleHash[self.Arch][self.Name]: > > + if CacheHash =3D=3D GlobalData.gModuleHash[self.Arch]= [self.Name]: > > + for root, dir, files in 
os.walk(FileDir): > > + for f in files: > > + if self.Name + '.hash' in f: > > + CopyFileOnChange(HashFile, self.Build= Dir) > > + else: > > + File =3D path.join(root, f) > > + sub_dir =3D os.path.relpath(File, Fil= eDir) > > + destination_file =3D os.path.join(sel= f.OutputDir, sub_dir) > > + destination_dir =3D os.path.dirname(d= estination_file) > > + CreateDirectory(destination_dir) > > + CopyFileOnChange(File, destination_di= r) > > + if self.Name =3D=3D "PcdPeim" or self.Name =3D=3D= "PcdDxe": > > + CreatePcdDatabaseCode(self, TemplateString(),= TemplateString()) > > + return True > > + return False > > + > > + ## Create makefile for the module and its dependent libraries > > + # > > + # @param CreateLibraryMakeFile Flag indicating if or not= the makefiles of > > + # dependent libraries will = be created > > + # > > + @cached_class_function > > + def CreateMakeFile(self, CreateLibraryMakeFile=3DTrue, GenFfsList= =3D []): > > + # nest this function inside it's only caller. > > + def CreateTimeStamp(): > > + FileSet =3D {self.MetaFile.Path} > > + > > + for SourceFile in self.Module.Sources: > > + FileSet.add (SourceFile.Path) > > + > > + for Lib in self.DependentLibraryList: > > + FileSet.add (Lib.MetaFile.Path) > > + > > + for f in self.AutoGenDepSet: > > + FileSet.add (f.Path) > > + > > + if os.path.exists (self.TimeStampPath): > > + os.remove (self.TimeStampPath) > > + with open(self.TimeStampPath, 'w+') as fd: > > + for f in FileSet: > > + fd.write(f) > > + fd.write("\n") > > + > > + # Ignore generating makefile when it is a binary module > > + if self.IsBinaryModule: > > + return > > + > > + self.GenFfsList =3D GenFfsList > > + > > + if not self.IsLibrary and CreateLibraryMakeFile: > > + for LibraryAutoGen in self.LibraryAutoGenList: > > + LibraryAutoGen.CreateMakeFile() > > + # Don't enable if hash feature enabled, CanSkip uses timestam= ps to determine build skipping > > + if not GlobalData.gUseHashCache and self.CanSkip(): > > + return > > + > > + if len(self.CustomMakefile) =3D=3D 0: > > + Makefile =3D GenMake.ModuleMakefile(self) > > + else: > > + Makefile =3D GenMake.CustomMakefile(self) > > + if Makefile.Generate(): > > + EdkLogger.debug(EdkLogger.DEBUG_9, "Generated makefile fo= r module %s [%s]" % > > + (self.Name, self.Arch)) > > + else: > > + EdkLogger.debug(EdkLogger.DEBUG_9, "Skipped the generatio= n of makefile for module %s [%s]" % > > + (self.Name, self.Arch)) > > + > > + CreateTimeStamp() > > + > > + def CopyBinaryFiles(self): > > + for File in self.Module.Binaries: > > + SrcPath =3D File.Path > > + DstPath =3D os.path.join(self.OutputDir, os.path.basename= (SrcPath)) > > + CopyLongFilePath(SrcPath, DstPath) > > + ## Create autogen code for the module and its dependent libraries > > + # > > + # @param CreateLibraryCodeFile Flag indicating if or not= the code of > > + # dependent libraries will = be created > > + # > > + def CreateCodeFile(self, CreateLibraryCodeFile=3DTrue): > > + if self.IsCodeFileCreated: > > + return > > + > > + # Need to generate PcdDatabase even PcdDriver is binarymodule > > + if self.IsBinaryModule and self.PcdIsDriver !=3D '': > > + CreatePcdDatabaseCode(self, TemplateString(), TemplateStr= ing()) > > + return > > + if self.IsBinaryModule: > > + if self.IsLibrary: > > + self.CopyBinaryFiles() > > + return > > + > > + if not self.IsLibrary and CreateLibraryCodeFile: > > + for LibraryAutoGen in self.LibraryAutoGenList: > > + LibraryAutoGen.CreateCodeFile() > > + > > + # Don't enable if hash feature enabled, CanSkip uses timestam= ps to 
determine build skipping > > + if not GlobalData.gUseHashCache and self.CanSkip(): > > + return > > + > > + AutoGenList =3D [] > > + IgoredAutoGenList =3D [] > > + > > + for File in self.AutoGenFileList: > > + if GenC.Generate(File.Path, self.AutoGenFileList[File], F= ile.IsBinary): > > + AutoGenList.append(str(File)) > > + else: > > + IgoredAutoGenList.append(str(File)) > > + > > + > > + for ModuleType in self.DepexList: > > + # Ignore empty [depex] section or [depex] section for SUP= _MODULE_USER_DEFINED module > > + if len(self.DepexList[ModuleType]) =3D=3D 0 or ModuleType= =3D=3D SUP_MODULE_USER_DEFINED or ModuleType =3D=3D SUP_MODULE_HOST_APPLIC= ATION: > > + continue > > + > > + Dpx =3D GenDepex.DependencyExpression(self.DepexList[Modu= leType], ModuleType, True) > > + DpxFile =3D gAutoGenDepexFileName % {"module_name" : self= .Name} > > + > > + if len(Dpx.PostfixNotation) !=3D 0: > > + self.DepexGenerated =3D True > > + > > + if Dpx.Generate(path.join(self.OutputDir, DpxFile)): > > + AutoGenList.append(str(DpxFile)) > > + else: > > + IgoredAutoGenList.append(str(DpxFile)) > > + > > + if IgoredAutoGenList =3D=3D []: > > + EdkLogger.debug(EdkLogger.DEBUG_9, "Generated [%s] files = for module %s [%s]" % > > + (" ".join(AutoGenList), self.Name, self.A= rch)) > > + elif AutoGenList =3D=3D []: > > + EdkLogger.debug(EdkLogger.DEBUG_9, "Skipped the generatio= n of [%s] files for module %s [%s]" % > > + (" ".join(IgoredAutoGenList), self.Name, = self.Arch)) > > + else: > > + EdkLogger.debug(EdkLogger.DEBUG_9, "Generated [%s] (skipp= ed %s) files for module %s [%s]" % > > + (" ".join(AutoGenList), " ".join(IgoredAu= toGenList), self.Name, self.Arch)) > > + > > + self.IsCodeFileCreated =3D True > > + return AutoGenList > > + > > + ## Summarize the ModuleAutoGen objects of all libraries used by t= his module > > + @cached_property > > + def LibraryAutoGenList(self): > > + RetVal =3D [] > > + for Library in self.DependentLibraryList: > > + La =3D ModuleAutoGen( > > + self.Workspace, > > + Library.MetaFile, > > + self.BuildTarget, > > + self.ToolChain, > > + self.Arch, > > + self.PlatformInfo.MetaFile, > > + self.DataPipe > > + ) > > + La.IsLibrary =3D True > > + if La not in RetVal: > > + RetVal.append(La) > > + for Lib in La.CodaTargetList: > > + self._ApplyBuildRule(Lib.Target, TAB_UNKNOWN_FILE= ) > > + return RetVal > > + > > + def GenModuleHash(self): > > + # Initialize a dictionary for each arch type > > + if self.Arch not in GlobalData.gModuleHash: > > + GlobalData.gModuleHash[self.Arch] =3D {} > > + > > + # Early exit if module or library has been hashed and is in m= emory > > + if self.Name in GlobalData.gModuleHash[self.Arch]: > > + return GlobalData.gModuleHash[self.Arch][self.Name].encod= e('utf-8') > > + > > + # Initialze hash object > > + m =3D hashlib.md5() > > + > > + # Add Platform level hash > > + m.update(GlobalData.gPlatformHash.encode('utf-8')) > > + > > + # Add Package level hash > > + if self.DependentPackageList: > > + for Pkg in sorted(self.DependentPackageList, key=3Dlambda= x: x.PackageName): > > + if Pkg.PackageName in GlobalData.gPackageHash: > > + m.update(GlobalData.gPackageHash[Pkg.PackageName]= .encode('utf-8')) > > + > > + # Add Library hash > > + if self.LibraryAutoGenList: > > + for Lib in sorted(self.LibraryAutoGenList, key=3Dlambda x= : x.Name): > > + if Lib.Name not in GlobalData.gModuleHash[self.Arch]: > > + Lib.GenModuleHash() > > + m.update(GlobalData.gModuleHash[self.Arch][Lib.Name].= encode('utf-8')) > > + > > + # Add Module self > > + f =3D 
open(str(self.MetaFile), 'rb') > > + Content =3D f.read() > > + f.close() > > + m.update(Content) > > + > > + # Add Module's source files > > + if self.SourceFileList: > > + for File in sorted(self.SourceFileList, key=3Dlambda x: s= tr(x)): > > + f =3D open(str(File), 'rb') > > + Content =3D f.read() > > + f.close() > > + m.update(Content) > > + > > + GlobalData.gModuleHash[self.Arch][self.Name] =3D m.hexdigest(= ) > > + > > + return GlobalData.gModuleHash[self.Arch][self.Name].encode('u= tf-8') > > + > > + ## Decide whether we can skip the ModuleAutoGen process > > + def CanSkipbyHash(self): > > + # Hashing feature is off > > + if not GlobalData.gUseHashCache: > > + return False > > + > > + # Initialize a dictionary for each arch type > > + if self.Arch not in GlobalData.gBuildHashSkipTracking: > > + GlobalData.gBuildHashSkipTracking[self.Arch] =3D dict() > > + > > + # If library or Module is binary do not skip by hash > > + if self.IsBinaryModule: > > + return False > > + > > + # .inc is contains binary information so do not skip by hash = as well > > + for f_ext in self.SourceFileList: > > + if '.inc' in str(f_ext): > > + return False > > + > > + # Use Cache, if exists and if Module has a copy in cache > > + if GlobalData.gBinCacheSource and self.AttemptModuleCacheCopy= (): > > + return True > > + > > + # Early exit for libraries that haven't yet finished building > > + HashFile =3D path.join(self.BuildDir, self.Name + ".hash") > > + if self.IsLibrary and not os.path.exists(HashFile): > > + return False > > + > > + # Return a Boolean based on if can skip by hash, either from = memory or from IO. > > + if self.Name not in GlobalData.gBuildHashSkipTracking[self.Ar= ch]: > > + # If hashes are the same, SaveFileOnChange() will return = False. > > + GlobalData.gBuildHashSkipTracking[self.Arch][self.Name] = =3D not SaveFileOnChange(HashFile, self.GenModuleHash(), True) > > + return GlobalData.gBuildHashSkipTracking[self.Arch][self.= Name] > > + else: > > + return GlobalData.gBuildHashSkipTracking[self.Arch][self.= Name] > > + > > + ## Decide whether we can skip the ModuleAutoGen process > > + # If any source file is newer than the module than we cannot ski= p > > + # > > + def CanSkip(self): > > + if self.MakeFileDir in GlobalData.gSikpAutoGenCache: > > + return True > > + if not os.path.exists(self.TimeStampPath): > > + return False > > + #last creation time of the module > > + DstTimeStamp =3D os.stat(self.TimeStampPath)[8] > > + > > + SrcTimeStamp =3D self.Workspace._SrcTimeStamp > > + if SrcTimeStamp > DstTimeStamp: > > + return False > > + > > + with open(self.TimeStampPath,'r') as f: > > + for source in f: > > + source =3D source.rstrip('\n') > > + if not os.path.exists(source): > > + return False > > + if source not in ModuleAutoGen.TimeDict : > > + ModuleAutoGen.TimeDict[source] =3D os.stat(source= )[8] > > + if ModuleAutoGen.TimeDict[source] > DstTimeStamp: > > + return False > > + GlobalData.gSikpAutoGenCache.add(self.MakeFileDir) > > + return True > > + > > + @cached_property > > + def TimeStampPath(self): > > + return os.path.join(self.MakeFileDir, 'AutoGenTimeStamp') > > diff --git a/BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py b/= BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py > > new file mode 100644 > > index 000000000000..c7591253debd > > --- /dev/null > > +++ b/BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py > > @@ -0,0 +1,619 @@ > > +## @file > > +# Create makefile for MS nmake and GNU make > > +# > > +# Copyright (c) 2019, Intel Corporation. 
All rights reserved.<BR>
> > +# SPDX-License-Identifier: BSD-2-Clause-Patent > > +# > > +from __future__ import absolute_import > > +from Workspace.WorkspaceDatabase import WorkspaceDatabase,BuildDB > > +from Common.caching import cached_property > > +from AutoGen.BuildEngine import BuildRule,AutoGenReqBuildRuleVerNum > > +from AutoGen.AutoGen import CalculatePriorityValue > > +from Common.Misc import CheckPcdDatum,GuidValue > > +from Common.Expression import ValueExpressionEx > > +from Common.DataType import * > > +from CommonDataClass.Exceptions import * > > +from CommonDataClass.CommonClass import SkuInfoClass > > +import Common.EdkLogger as EdkLogger > > +from Common.BuildToolError import OPTION_CONFLICT,FORMAT_INVALID,RESO= URCE_NOT_AVAILABLE > > +from Common.MultipleWorkspace import MultipleWorkspace as mws > > +from collections import defaultdict > > +from Common.Misc import PathClass > > +import os > > + > > + > > +# > > +# The priority list while override build option > > +# > > +PrioList =3D {"0x11111" : 16, # TARGET_TOOLCHAIN_ARCH_COMMANDTY= PE_ATTRIBUTE (Highest) > > + "0x01111" : 15, # ******_TOOLCHAIN_ARCH_COMMANDTYPE= _ATTRIBUTE > > + "0x10111" : 14, # TARGET_*********_ARCH_COMMANDTYPE= _ATTRIBUTE > > + "0x00111" : 13, # ******_*********_ARCH_COMMANDTYPE= _ATTRIBUTE > > + "0x11011" : 12, # TARGET_TOOLCHAIN_****_COMMANDTYPE= _ATTRIBUTE > > + "0x01011" : 11, # ******_TOOLCHAIN_****_COMMANDTYPE= _ATTRIBUTE > > + "0x10011" : 10, # TARGET_*********_****_COMMANDTYPE= _ATTRIBUTE > > + "0x00011" : 9, # ******_*********_****_COMMANDTYPE= _ATTRIBUTE > > + "0x11101" : 8, # TARGET_TOOLCHAIN_ARCH_***********= _ATTRIBUTE > > + "0x01101" : 7, # ******_TOOLCHAIN_ARCH_***********= _ATTRIBUTE > > + "0x10101" : 6, # TARGET_*********_ARCH_***********= _ATTRIBUTE > > + "0x00101" : 5, # ******_*********_ARCH_***********= _ATTRIBUTE > > + "0x11001" : 4, # TARGET_TOOLCHAIN_****_***********= _ATTRIBUTE > > + "0x01001" : 3, # ******_TOOLCHAIN_****_***********= _ATTRIBUTE > > + "0x10001" : 2, # TARGET_*********_****_***********= _ATTRIBUTE > > + "0x00001" : 1} # ******_*********_****_***********= _ATTRIBUTE (Lowest) > > +## Base class for AutoGen > > +# > > +# This class just implements the cache mechanism of AutoGen objects= . > > +# > > +class AutoGenInfo(object): > > + # database to maintain the objects in each child class > > + __ObjectCache =3D {} # (BuildTarget, ToolChain, ARCH, platform= file): AutoGen object > > + > > + ## Factory method > > + # > > + # @param Class class object of real AutoGen class > > + # (WorkspaceAutoGen, ModuleAutoGen or P= latformAutoGen) > > + # @param Workspace Workspace directory or WorkspaceAutoG= en object > > + # @param MetaFile The path of meta file > > + # @param Target Build target > > + # @param Toolchain Tool chain name > > + # @param Arch Target arch > > + # @param *args The specific class related parameters > > + # @param **kwargs The specific class related dict param= eters > > + # > > + @classmethod > > + def GetCache(cls): > > + return cls.__ObjectCache > > + def __new__(cls, Workspace, MetaFile, Target, Toolchain, Arch, *a= rgs, **kwargs): > > + # check if the object has been created > > + Key =3D (Target, Toolchain, Arch, MetaFile) > > + if Key in cls.__ObjectCache: > > + # if it exists, just return it directly > > + return cls.__ObjectCache[Key] > > + # it didnt exist. 
create it, cache it, then return it > > + RetVal =3D cls.__ObjectCache[Key] =3D super(AutoGenInfo, cls)= .__new__(cls) > > + return RetVal > > + > > + > > + ## hash() operator > > + # > > + # The file path of platform file will be used to represent hash = value of this object > > + # > > + # @retval int Hash value of the file path of platform file > > + # > > + def __hash__(self): > > + return hash(self.MetaFile) > > + > > + ## str() operator > > + # > > + # The file path of platform file will be used to represent this = object > > + # > > + # @retval string String of platform file path > > + # > > + def __str__(self): > > + return str(self.MetaFile) > > + > > + ## "=3D=3D" operator > > + def __eq__(self, Other): > > + return Other and self.MetaFile =3D=3D Other > > + > > + ## Expand * in build option key > > + # > > + # @param Options Options to be expanded > > + # @param ToolDef Use specified ToolDef instead of full ver= sion. > > + # This is needed during initialization to p= revent > > + # infinite recursion betweeh BuildOptions, > > + # ToolDefinition, and this function. > > + # > > + # @retval options Options expanded > > + # > > + def _ExpandBuildOption(self, Options, ModuleStyle=3DNone, ToolDef= = =3DNone): > > + if not ToolDef: > > + ToolDef =3D self.ToolDefinition > > + BuildOptions =3D {} > > + FamilyMatch =3D False > > + FamilyIsNull =3D True > > + > > + OverrideList =3D {} > > + # > > + # Construct a list contain the build options which need overr= ide. > > + # > > + for Key in Options: > > + # > > + # Key[0] -- tool family > > + # Key[1] -- TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE > > + # > > + if (Key[0] =3D=3D self.BuildRuleFamily and > > + (ModuleStyle is None or len(Key) < 3 or (len(Key) > 2= and Key[2] =3D=3D ModuleStyle))): > > + Target, ToolChain, Arch, CommandType, Attr =3D Key[1]= .split('_') > > + if (Target =3D=3D self.BuildTarget or Target =3D=3D T= AB_STAR) and\ > > + (ToolChain =3D=3D self.ToolChain or ToolChain =3D= = =3D TAB_STAR) and\ > > + (Arch =3D=3D self.Arch or Arch =3D=3D TAB_STAR) a= nd\ > > + Options[Key].startswith("=3D"): > > + > > + if OverrideList.get(Key[1]) is not None: > > + OverrideList.pop(Key[1]) > > + OverrideList[Key[1]] =3D Options[Key] > > + > > + # > > + # Use the highest priority value. 
> > + # > > + if (len(OverrideList) >=3D 2): > > + KeyList =3D list(OverrideList.keys()) > > + for Index in range(len(KeyList)): > > + NowKey =3D KeyList[Index] > > + Target1, ToolChain1, Arch1, CommandType1, Attr1 =3D N= owKey.split("_") > > + for Index1 in range(len(KeyList) - Index - 1): > > + NextKey =3D KeyList[Index1 + Index + 1] > > + # > > + # Compare two Key, if one is included by another,= choose the higher priority one > > + # > > + Target2, ToolChain2, Arch2, CommandType2, Attr2 = =3D NextKey.split("_") > > + if (Target1 =3D=3D Target2 or Target1 =3D=3D TAB_= STAR or Target2 =3D=3D TAB_STAR) and\ > > + (ToolChain1 =3D=3D ToolChain2 or ToolChain1 = =3D=3D TAB_STAR or ToolChain2 =3D=3D TAB_STAR) and\ > > + (Arch1 =3D=3D Arch2 or Arch1 =3D=3D TAB_STAR = or Arch2 =3D=3D TAB_STAR) and\ > > + (CommandType1 =3D=3D CommandType2 or CommandT= ype1 =3D=3D TAB_STAR or CommandType2 =3D=3D TAB_STAR) and\ > > + (Attr1 =3D=3D Attr2 or Attr1 =3D=3D TAB_STAR = or Attr2 =3D=3D TAB_STAR): > > + > > + if CalculatePriorityValue(NowKey) > Calculate= PriorityValue(NextKey): > > + if Options.get((self.BuildRuleFamily, Nex= tKey)) is not None: > > + Options.pop((self.BuildRuleFamily, Ne= xtKey)) > > + else: > > + if Options.get((self.BuildRuleFamily, Now= Key)) is not None: > > + Options.pop((self.BuildRuleFamily, No= wKey)) > > + > > + for Key in Options: > > + if ModuleStyle is not None and len (Key) > 2: > > + # Check Module style is EDK or EDKII. > > + # Only append build option for the matched style modu= le. > > + if ModuleStyle =3D=3D EDK_NAME and Key[2] !=3D EDK_NA= ME: > > + continue > > + elif ModuleStyle =3D=3D EDKII_NAME and Key[2] !=3D ED= KII_NAME: > > + continue > > + Family =3D Key[0] > > + Target, Tag, Arch, Tool, Attr =3D Key[1].split("_") > > + # if tool chain family doesn't match, skip it > > + if Tool in ToolDef and Family !=3D "": > > + FamilyIsNull =3D False > > + if ToolDef[Tool].get(TAB_TOD_DEFINES_BUILDRULEFAMILY,= "") !=3D "": > > + if Family !=3D ToolDef[Tool][TAB_TOD_DEFINES_BUIL= DRULEFAMILY]: > > + continue > > + elif Family !=3D ToolDef[Tool][TAB_TOD_DEFINES_FAMILY= ]: > > + continue > > + FamilyMatch =3D True > > + # expand any wildcard > > + if Target =3D=3D TAB_STAR or Target =3D=3D self.BuildTarg= et: > > + if Tag =3D=3D TAB_STAR or Tag =3D=3D self.ToolChain: > > + if Arch =3D=3D TAB_STAR or Arch =3D=3D self.Arch: > > + if Tool not in BuildOptions: > > + BuildOptions[Tool] =3D {} > > + if Attr !=3D "FLAGS" or Attr not in BuildOpti= ons[Tool] or Options[Key].startswith('=3D'): > > + BuildOptions[Tool][Attr] =3D Options[Key] > > + else: > > + # append options for the same tool except= PATH > > + if Attr !=3D 'PATH': > > + BuildOptions[Tool][Attr] +=3D " " + O= ptions[Key] > > + else: > > + BuildOptions[Tool][Attr] =3D Options[= Key] > > + # Build Option Family has been checked, which need't to be ch= ecked again for family. > > + if FamilyMatch or FamilyIsNull: > > + return BuildOptions > > + > > + for Key in Options: > > + if ModuleStyle is not None and len (Key) > 2: > > + # Check Module style is EDK or EDKII. > > + # Only append build option for the matched style modu= le. 
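The PrioList table near the top of ModuleAutoGenHelper.py encodes this precedence by mapping each of the five key fields (TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE) to 1 for a concrete value and 0 for the "*" wildcard, so more fully qualified keys win. A minimal sketch of that idea follows, with a hypothetical calculate_priority() standing in for the imported CalculatePriorityValue (assumed behavior, not copied from BaseTools):

PRIO_LIST = {"0x11111": 16, "0x01111": 15, "0x10111": 14, "0x00111": 13,
             "0x11011": 12, "0x01011": 11, "0x10011": 10, "0x00011": 9,
             "0x11101": 8,  "0x01101": 7,  "0x10101": 6,  "0x00101": 5,
             "0x11001": 4,  "0x01001": 3,  "0x10001": 2,  "0x00001": 1}

def calculate_priority(key):
    """Return the precedence of a TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE key."""
    fields = key.split('_')
    # Concrete field -> '1', '*' wildcard -> '0'; the bit string indexes PRIO_LIST.
    bits = "".join("0" if field == "*" else "1" for field in fields)
    return PRIO_LIST.get("0x" + bits, 0)

# A fully qualified key outranks one that wildcards the target:
assert calculate_priority("DEBUG_GCC5_X64_CC_FLAGS") > calculate_priority("*_GCC5_X64_CC_FLAGS")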
> > + if ModuleStyle =3D=3D EDK_NAME and Key[2] !=3D EDK_NA= ME: > > + continue > > + elif ModuleStyle =3D=3D EDKII_NAME and Key[2] !=3D ED= KII_NAME: > > + continue > > + Family =3D Key[0] > > + Target, Tag, Arch, Tool, Attr =3D Key[1].split("_") > > + # if tool chain family doesn't match, skip it > > + if Tool not in ToolDef or Family =3D=3D "": > > + continue > > + # option has been added before > > + if Family !=3D ToolDef[Tool][TAB_TOD_DEFINES_FAMILY]: > > + continue > > + > > + # expand any wildcard > > + if Target =3D=3D TAB_STAR or Target =3D=3D self.BuildTarg= et: > > + if Tag =3D=3D TAB_STAR or Tag =3D=3D self.ToolChain: > > + if Arch =3D=3D TAB_STAR or Arch =3D=3D self.Arch: > > + if Tool not in BuildOptions: > > + BuildOptions[Tool] =3D {} > > + if Attr !=3D "FLAGS" or Attr not in BuildOpti= ons[Tool] or Options[Key].startswith('=3D'): > > + BuildOptions[Tool][Attr] =3D Options[Key] > > + else: > > + # append options for the same tool except= PATH > > + if Attr !=3D 'PATH': > > + BuildOptions[Tool][Attr] +=3D " " + O= ptions[Key] > > + else: > > + BuildOptions[Tool][Attr] =3D Options[= Key] > > + return BuildOptions > > +# > > +#This class is the pruned WorkSpaceAutoGen for ModuleAutoGen in multi= ple thread > > +# > > +class WorkSpaceInfo(AutoGenInfo): > > + def __init__(self,Workspace, MetaFile, Target, ToolChain, Arch): > > + self._SrcTimeStamp =3D 0 > > + self.Db =3D BuildDB > > + self.BuildDatabase =3D self.Db.BuildObject > > + self.Target =3D Target > > + self.ToolChain =3D ToolChain > > + self.WorkspaceDir =3D Workspace > > + self.ActivePlatform =3D MetaFile > > + self.ArchList =3D Arch > > + > > + > > +class PlatformInfo(AutoGenInfo): > > + def __init__(self, Workspace, MetaFile, Target, ToolChain, Arch,D= ataPipe): > > + self.Wa =3D Workspace > > + self.WorkspaceDir =3D self.Wa.WorkspaceDir > > + self.MetaFile =3D MetaFile > > + self.Arch =3D Arch > > + self.Target =3D Target > > + self.BuildTarget =3D Target > > + self.ToolChain =3D ToolChain > > + self.Platform =3D self.Wa.BuildDatabase[self.MetaFile, self.A= rch, self.Target, self.ToolChain] > > + > > + self.SourceDir =3D MetaFile.SubDir > > + self.DataPipe =3D DataPipe > > + @cached_property > > + def _AsBuildModuleList(self): > > + retVal =3D self.DataPipe.Get("AsBuildModuleList") > > + if retVal is None: > > + retVal =3D {} > > + return retVal > > + > > + ## Test if a module is supported by the platform > > + # > > + # An error will be raised directly if the module or its arch is = not supported > > + # by the platform or current configuration > > + # > > + def ValidModule(self, Module): > > + return Module in self.Platform.Modules or Module in self.Plat= form.LibraryInstances \ > > + or Module in self._AsBuildModuleList > > + > > + @cached_property > > + def ToolChainFamily(self): > > + retVal =3D self.DataPipe.Get("ToolChainFamily") > > + if retVal is None: > > + retVal =3D {} > > + return retVal > > + > > + @cached_property > > + def BuildRuleFamily(self): > > + retVal =3D self.DataPipe.Get("BuildRuleFamily") > > + if retVal is None: > > + retVal =3D {} > > + return retVal > > + > > + @cached_property > > + def _MbList(self): > > + return [self.Wa.BuildDatabase[m, self.Arch, self.BuildTarget,= self.ToolChain] for m in self.Platform.Modules] > > + > > + @cached_property > > + def PackageList(self): > > + RetVal =3D set() > > + for dec_file,Arch in self.DataPipe.Get("PackageList"): > > + RetVal.add(self.Wa.BuildDatabase[dec_file,Arch,self.Build= Target, self.ToolChain]) > > + return list(RetVal) > > + > > + ## 
Return the directory to store all intermediate and final files= built > > + @cached_property > > + def BuildDir(self): > > + if os.path.isabs(self.OutputDir): > > + RetVal =3D os.path.join( > > + os.path.abspath(self.OutputDir), > > + self.Target + "_" + self.ToolChain, > > + ) > > + else: > > + RetVal =3D os.path.join( > > + self.WorkspaceDir, > > + self.OutputDir, > > + self.Target + "_" + self.ToolChain, > > + ) > > + return RetVal > > + > > + ## Return the build output directory platform specifies > > + @cached_property > > + def OutputDir(self): > > + return self.Platform.OutputDirectory > > + > > + ## Return platform name > > + @cached_property > > + def Name(self): > > + return self.Platform.PlatformName > > + > > + ## Return meta-file GUID > > + @cached_property > > + def Guid(self): > > + return self.Platform.Guid > > + > > + ## Return platform version > > + @cached_property > > + def Version(self): > > + return self.Platform.Version > > + > > + ## Return paths of tools > > + @cached_property > > + def ToolDefinition(self): > > + retVal =3D self.DataPipe.Get("TOOLDEF") > > + if retVal is None: > > + retVal =3D {} > > + return retVal > > + > > + ## Return build command string > > + # > > + # @retval string Build command string > > + # > > + @cached_property > > + def BuildCommand(self): > > + retVal =3D self.DataPipe.Get("BuildCommand") > > + if retVal is None: > > + retVal =3D [] > > + return retVal > > + > > + @cached_property > > + def PcdTokenNumber(self): > > + retVal =3D self.DataPipe.Get("PCD_TNUM") > > + if retVal is None: > > + retVal =3D {} > > + return retVal > > + > > + ## Override PCD setting (type, value, ...) > > + # > > + # @param ToPcd The PCD to be overridden > > + # @param FromPcd The PCD overriding from > > + # > > + def _OverridePcd(self, ToPcd, FromPcd, Module=3D"", Msg=3D"", Lib= rary=3D""): > > + # > > + # in case there's PCDs coming from FDF file, which have no ty= pe given. 
> > + # at this point, ToPcd.Type has the type found from dependent > > + # package > > + # > > + TokenCName =3D ToPcd.TokenCName > > + for PcdItem in self.MixedPcd: > > + if (ToPcd.TokenCName, ToPcd.TokenSpaceGuidCName) in self.= MixedPcd[PcdItem]: > > + TokenCName =3D PcdItem[0] > > + break > > + if FromPcd is not None: > > + if ToPcd.Pending and FromPcd.Type: > > + ToPcd.Type =3D FromPcd.Type > > + elif ToPcd.Type and FromPcd.Type\ > > + and ToPcd.Type !=3D FromPcd.Type and ToPcd.Type in Fr= omPcd.Type: > > + if ToPcd.Type.strip() =3D=3D TAB_PCDS_DYNAMIC_EX: > > + ToPcd.Type =3D FromPcd.Type > > + elif ToPcd.Type and FromPcd.Type \ > > + and ToPcd.Type !=3D FromPcd.Type: > > + if Library: > > + Module =3D str(Module) + " 's library file (" + s= tr(Library) + ")" > > + EdkLogger.error("build", OPTION_CONFLICT, "Mismatched= PCD type", > > + ExtraData=3D"%s.%s is used as [%s] in= module %s, but as [%s] in %s."\ > > + % (ToPcd.TokenSpaceGuidCNam= e, TokenCName, > > + ToPcd.Type, Module, From= Pcd.Type, Msg), > > + File=3Dself.MetaFile) > > + > > + if FromPcd.MaxDatumSize: > > + ToPcd.MaxDatumSize =3D FromPcd.MaxDatumSize > > + ToPcd.MaxSizeUserSet =3D FromPcd.MaxDatumSize > > + if FromPcd.DefaultValue: > > + ToPcd.DefaultValue =3D FromPcd.DefaultValue > > + if FromPcd.TokenValue: > > + ToPcd.TokenValue =3D FromPcd.TokenValue > > + if FromPcd.DatumType: > > + ToPcd.DatumType =3D FromPcd.DatumType > > + if FromPcd.SkuInfoList: > > + ToPcd.SkuInfoList =3D FromPcd.SkuInfoList > > + if FromPcd.UserDefinedDefaultStoresFlag: > > + ToPcd.UserDefinedDefaultStoresFlag =3D FromPcd.UserDe= finedDefaultStoresFlag > > + # Add Flexible PCD format parse > > + if ToPcd.DefaultValue: > > + try: > > + ToPcd.DefaultValue =3D ValueExpressionEx(ToPcd.De= faultValue, ToPcd.DatumType, self._GuidDict)(True) > > + except BadExpression as Value: > > + EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%= s.%s] Value "%s", %s' %(ToPcd.TokenSpaceGuidCName, ToPcd.TokenCName, ToPcd.= DefaultValue, Value), > > + File=3Dself.MetaFile) > > + > > + # check the validation of datum > > + IsValid, Cause =3D CheckPcdDatum(ToPcd.DatumType, ToPcd.D= efaultValue) > > + if not IsValid: > > + EdkLogger.error('build', FORMAT_INVALID, Cause, File= =3Dself.MetaFile, > > + ExtraData=3D"%s.%s" % (ToPcd.TokenSpa= ceGuidCName, TokenCName)) > > + ToPcd.validateranges =3D FromPcd.validateranges > > + ToPcd.validlists =3D FromPcd.validlists > > + ToPcd.expressions =3D FromPcd.expressions > > + ToPcd.CustomAttribute =3D FromPcd.CustomAttribute > > + > > + if FromPcd is not None and ToPcd.DatumType =3D=3D TAB_VOID an= d not ToPcd.MaxDatumSize: > > + EdkLogger.debug(EdkLogger.DEBUG_9, "No MaxDatumSize speci= fied for PCD %s.%s" \ > > + % (ToPcd.TokenSpaceGuidCName, TokenCName)= ) > > + Value =3D ToPcd.DefaultValue > > + if not Value: > > + ToPcd.MaxDatumSize =3D '1' > > + elif Value[0] =3D=3D 'L': > > + ToPcd.MaxDatumSize =3D str((len(Value) - 2) * 2) > > + elif Value[0] =3D=3D '{': > > + ToPcd.MaxDatumSize =3D str(len(Value.split(','))) > > + else: > > + ToPcd.MaxDatumSize =3D str(len(Value) - 1) > > + > > + # apply default SKU for dynamic PCDS if specified one is not = available > > + if (ToPcd.Type in PCD_DYNAMIC_TYPE_SET or ToPcd.Type in PCD_D= YNAMIC_EX_TYPE_SET) \ > > + and not ToPcd.SkuInfoList: > > + if self.Platform.SkuName in self.Platform.SkuIds: > > + SkuName =3D self.Platform.SkuName > > + else: > > + SkuName =3D TAB_DEFAULT > > + ToPcd.SkuInfoList =3D { > > + SkuName : SkuInfoClass(SkuName, self.Platform.SkuIds[= SkuName][0], '', '', 
'', '', '', ToPcd.DefaultValue) > > + } > > + > > + def ApplyPcdSetting(self, Module, Pcds, Library=3D""): > > + # for each PCD in module > > + for Name, Guid in Pcds: > > + PcdInModule =3D Pcds[Name, Guid] > > + # find out the PCD setting in platform > > + if (Name, Guid) in self.Pcds: > > + PcdInPlatform =3D self.Pcds[Name, Guid] > > + else: > > + PcdInPlatform =3D None > > + # then override the settings if any > > + self._OverridePcd(PcdInModule, PcdInPlatform, Module, Msg= = =3D"DSC PCD sections", Library=3DLibrary) > > + # resolve the VariableGuid value > > + for SkuId in PcdInModule.SkuInfoList: > > + Sku =3D PcdInModule.SkuInfoList[SkuId] > > + if Sku.VariableGuid =3D=3D '': continue > > + Sku.VariableGuidValue =3D GuidValue(Sku.VariableGuid,= self.PackageList, self.MetaFile.Path) > > + if Sku.VariableGuidValue is None: > > + PackageList =3D "\n\t".join(str(P) for P in self.= PackageList) > > + EdkLogger.error( > > + 'build', > > + RESOURCE_NOT_AVAILABLE, > > + "Value of GUID [%s] is not found in" = % Sku.VariableGuid, > > + ExtraData=3DPackageList + "\n\t(used = with %s.%s from module %s)" \ > > + % (Guid, Name= , str(Module)), > > + File=3Dself.MetaFile > > + ) > > + > > + # override PCD settings with module specific setting > > + if Module in self.Platform.Modules: > > + PlatformModule =3D self.Platform.Modules[str(Module)] > > + for Key in PlatformModule.Pcds: > > + if self.BuildOptionPcd: > > + for pcd in self.BuildOptionPcd: > > + (TokenSpaceGuidCName, TokenCName, FieldName, = pcdvalue, _) =3D pcd > > + if (TokenCName, TokenSpaceGuidCName) =3D=3D K= ey and FieldName =3D=3D"": > > + PlatformModule.Pcds[Key].DefaultValue =3D= pcdvalue > > + PlatformModule.Pcds[Key].PcdValueFromComm= =3D pcdvalue > > + break > > + Flag =3D False > > + if Key in Pcds: > > + ToPcd =3D Pcds[Key] > > + Flag =3D True > > + elif Key in self.MixedPcd: > > + for PcdItem in self.MixedPcd[Key]: > > + if PcdItem in Pcds: > > + ToPcd =3D Pcds[PcdItem] > > + Flag =3D True > > + break > > + if Flag: > > + self._OverridePcd(ToPcd, PlatformModule.Pcds[Key]= , Module, Msg=3D"DSC Components Module scoped PCD section", Library=3DLibra= ry) > > + # use PCD value to calculate the MaxDatumSize when it is not = specified > > + for Name, Guid in Pcds: > > + Pcd =3D Pcds[Name, Guid] > > + if Pcd.DatumType =3D=3D TAB_VOID and not Pcd.MaxDatumSize= : > > + Pcd.MaxSizeUserSet =3D None > > + Value =3D Pcd.DefaultValue > > + if not Value: > > + Pcd.MaxDatumSize =3D '1' > > + elif Value[0] =3D=3D 'L': > > + Pcd.MaxDatumSize =3D str((len(Value) - 2) * 2) > > + elif Value[0] =3D=3D '{': > > + Pcd.MaxDatumSize =3D str(len(Value.split(','))) > > + else: > > + Pcd.MaxDatumSize =3D str(len(Value) - 1) > > + return list(Pcds.values()) > > + > > + @cached_property > > + def Pcds(self): > > + PlatformPcdData =3D self.DataPipe.Get("PLA_PCD") > > +# for pcd in PlatformPcdData: > > +# for skuid in pcd.SkuInfoList: > > +# pcd.SkuInfoList[skuid] =3D self.CreateSkuInfoFromDi= ct(pcd.SkuInfoList[skuid]) > > + return {(pcddata.TokenCName,pcddata.TokenSpaceGuidCName):pcdd= ata for pcddata in PlatformPcdData} > > + > > + def CreateSkuInfoFromDict(self,SkuInfoDict): > > + return SkuInfoClass( > > + SkuInfoDict.get("SkuIdName"), > > + SkuInfoDict.get("SkuId"), > > + SkuInfoDict.get("VariableName"), > > + SkuInfoDict.get("VariableGuid"), > > + SkuInfoDict.get("VariableOffset"), > > + SkuInfoDict.get("HiiDefaultValue"), > > + SkuInfoDict.get("VpdOffset"), > > + SkuInfoDict.get("DefaultValue"), > > + SkuInfoDict.get("VariableGuidValue"), > > + 
SkuInfoDict.get("VariableAttribute",""), > > + SkuInfoDict.get("DefaultStore",None) > > + ) > > + @cached_property > > + def MixedPcd(self): > > + return self.DataPipe.Get("MixedPcd") > > + @cached_property > > + def _GuidDict(self): > > + RetVal =3D self.DataPipe.Get("GuidDict") > > + if RetVal is None: > > + RetVal =3D {} > > + return RetVal > > + @cached_property > > + def BuildOptionPcd(self): > > + return self.DataPipe.Get("BuildOptPcd") > > + def ApplyBuildOption(self,module): > > + PlatformOptions =3D self.DataPipe.Get("PLA_BO") > > + ModuleBuildOptions =3D self.DataPipe.Get("MOL_BO") > > + ModuleOptionFromDsc =3D ModuleBuildOptions.get((module.MetaFi= le.File,module.MetaFile.Root)) > > + if ModuleOptionFromDsc: > > + ModuleTypeOptions, PlatformModuleOptions =3D ModuleOption= FromDsc["ModuleTypeOptions"],ModuleOptionFromDsc["PlatformModuleOptions"] > > + else: > > + ModuleTypeOptions, PlatformModuleOptions =3D {}, {} > > + ToolDefinition =3D self.DataPipe.Get("TOOLDEF") > > + ModuleOptions =3D self._ExpandBuildOption(module.BuildOptions= ) > > + BuildRuleOrder =3D None > > + for Options in [ToolDefinition, ModuleOptions, PlatformOption= s, ModuleTypeOptions, PlatformModuleOptions]: > > + for Tool in Options: > > + for Attr in Options[Tool]: > > + if Attr =3D=3D TAB_TOD_DEFINES_BUILDRULEORDER: > > + BuildRuleOrder =3D Options[Tool][Attr] > > + > > + AllTools =3D set(list(ModuleOptions.keys()) + list(PlatformOp= tions.keys()) + > > + list(PlatformModuleOptions.keys()) + list(Modu= leTypeOptions.keys()) + > > + list(ToolDefinition.keys())) > > + BuildOptions =3D defaultdict(lambda: defaultdict(str)) > > + for Tool in AllTools: > > + for Options in [ToolDefinition, ModuleOptions, PlatformOp= tions, ModuleTypeOptions, PlatformModuleOptions]: > > + if Tool not in Options: > > + continue > > + for Attr in Options[Tool]: > > + # > > + # Do not generate it in Makefile > > + # > > + if Attr =3D=3D TAB_TOD_DEFINES_BUILDRULEORDER: > > + continue > > + Value =3D Options[Tool][Attr] > > + # check if override is indicated > > + if Value.startswith('=3D'): > > + BuildOptions[Tool][Attr] =3D mws.handleWsMacr= o(Value[1:]) > > + else: > > + if Attr !=3D 'PATH': > > + BuildOptions[Tool][Attr] +=3D " " + mws.h= andleWsMacro(Value) > > + else: > > + BuildOptions[Tool][Attr] =3D mws.handleWs= Macro(Value) > > + > > + return BuildOptions, BuildRuleOrder > > + > > + def ApplyLibraryInstance(self,module): > > + alldeps =3D self.DataPipe.Get("DEPS") > > + if alldeps is None: > > + alldeps =3D {} > > + mod_libs =3D alldeps.get((module.MetaFile.File,module.MetaFil= e.Root,module.Arch,module.MetaFile.Path),[]) > > + retVal =3D [] > > + for (file_path,root,arch,abs_path) in mod_libs: > > + libMetaFile =3D PathClass(file_path,root) > > + libMetaFile.OriginalPath =3D PathClass(file_path,root) > > + libMetaFile.Path =3D abs_path > > + retVal.append(self.Wa.BuildDatabase[libMetaFile, arch, se= lf.Target,self.ToolChain]) > > + return retVal > > + > > + ## Parse build_rule.txt in Conf Directory. 
> > + #
> > + #   @retval     BuildRule object
> > + #
> > +    @cached_property
> > +    def BuildRule(self):
> > +        WInfo = self.DataPipe.Get("P_Info")
> > +        RetVal = WInfo.get("BuildRuleFile")
> > +        if RetVal._FileVersion == "":
> > +            RetVal._FileVersion = AutoGenReqBuildRuleVerNum
> > +        return RetVal
> > diff --git a/BaseTools/Source/Python/AutoGen/PlatformAutoGen.py b/BaseTools/Source/Python/AutoGen/PlatformAutoGen.py
> > new file mode 100644
> > index 000000000000..6c947eca2b57
> > --- /dev/null
> > +++ b/BaseTools/Source/Python/AutoGen/PlatformAutoGen.py
> > @@ -0,0 +1,1505 @@
> > +## @file
> > +# Create makefile for MS nmake and GNU make
> > +#
> > +# Copyright (c) 2019, Intel Corporation. All rights reserved.
> > +# SPDX-License-Identifier: BSD-2-Clause-Patent > > +# > > + > > +## Import Modules > > +# > > +from __future__ import print_function > > +from __future__ import absolute_import > > +import os.path as path > > +import copy > > +from collections import defaultdict > > + > > +from .BuildEngine import BuildRule,gDefaultBuildRuleFile,AutoGenReqBu= ildRuleVerNum > > +from .GenVar import VariableMgr, var_info > > +from . import GenMake > > +from AutoGen.DataPipe import MemoryDataPipe > > +from AutoGen.ModuleAutoGen import ModuleAutoGen > > +from AutoGen.AutoGen import AutoGen > > +from AutoGen.AutoGen import CalculatePriorityValue > > +from Workspace.WorkspaceCommon import GetModuleLibInstances > > +from CommonDataClass.CommonClass import SkuInfoClass > > +from Common.caching import cached_class_function > > +from Common.Expression import ValueExpressionEx > > +from Common.StringUtils import StringToArray,NormPath > > +from Common.BuildToolError import * > > +from Common.DataType import * > > +from Common.Misc import * > > +import Common.VpdInfoFile as VpdInfoFile > > + > > +## Split command line option string to list > > +# > > +# subprocess.Popen needs the args to be a sequence. Otherwise there's= problem > > +# in non-windows platform to launch command > > +# > > +def _SplitOption(OptionString): > > + OptionList =3D [] > > + LastChar =3D " " > > + OptionStart =3D 0 > > + QuotationMark =3D "" > > + for Index in range(0, len(OptionString)): > > + CurrentChar =3D OptionString[Index] > > + if CurrentChar in ['"', "'"]: > > + if QuotationMark =3D=3D CurrentChar: > > + QuotationMark =3D "" > > + elif QuotationMark =3D=3D "": > > + QuotationMark =3D CurrentChar > > + continue > > + elif QuotationMark: > > + continue > > + > > + if CurrentChar in ["/", "-"] and LastChar in [" ", "\t", "\r"= , "\n"]: > > + if Index > OptionStart: > > + OptionList.append(OptionString[OptionStart:Index - 1]= ) > > + OptionStart =3D Index > > + LastChar =3D CurrentChar > > + OptionList.append(OptionString[OptionStart:]) > > + return OptionList > > + > > +## AutoGen class for platform > > +# > > +# PlatformAutoGen class will process the original information in pla= tform > > +# file in order to generate makefile for platform. 
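
Side note for anyone following the option handling: the quote-aware splitter above exists because subprocess.Popen wants the arguments as a sequence on non-Windows hosts. The standalone sketch below is my own toy reproduction (the name split_make_options is made up, this is not the BaseTools code), just to show the behaviour I expect from it on an nmake-style command line:

    def split_make_options(option_string):
        # Keep quoted regions intact; start a new argument at a '/' or '-'
        # that directly follows whitespace.
        options, start, quote, last = [], 0, "", " "
        for index, char in enumerate(option_string):
            if char in ('"', "'"):
                if quote == char:
                    quote = ""        # closing quote
                elif not quote:
                    quote = char      # opening quote
                continue
            if quote:
                continue              # ignore separators inside quotes
            if char in ("/", "-") and last in (" ", "\t", "\r", "\n"):
                if index > start:
                    options.append(option_string[start:index - 1])
                start = index
            last = char
        options.append(option_string[start:])
        return options

    print(split_make_options("nmake /nologo /s /f Makefile"))
    # ['nmake', '/nologo', '/s', '/f Makefile']
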
> > +# > > +class PlatformAutoGen(AutoGen): > > + # call super().__init__ then call the worker function with differ= ent parameter count > > + def __init__(self, Workspace, MetaFile, Target, Toolchain, Arch, = *args, **kwargs): > > + if not hasattr(self, "_Init"): > > + self._InitWorker(Workspace, MetaFile, Target, Toolchain, = Arch) > > + self._Init =3D True > > + # > > + # Used to store all PCDs for both PEI and DXE phase, in order to = generate > > + # correct PCD database > > + # > > + _DynaPcdList_ =3D [] > > + _NonDynaPcdList_ =3D [] > > + _PlatformPcds =3D {} > > + > > + > > + > > + ## Initialize PlatformAutoGen > > + # > > + # > > + # @param Workspace WorkspaceAutoGen object > > + # @param PlatformFile Platform file (DSC file) > > + # @param Target Build target (DEBUG, RELEASE) > > + # @param Toolchain Name of tool chain > > + # @param Arch arch of the platform supports > > + # > > + def _InitWorker(self, Workspace, PlatformFile, Target, Toolchain,= Arch): > > + EdkLogger.debug(EdkLogger.DEBUG_9, "AutoGen platform [%s] [%s= ]" % (PlatformFile, Arch)) > > + GlobalData.gProcessingFile =3D "%s [%s, %s, %s]" % (PlatformF= ile, Arch, Toolchain, Target) > > + > > + self.MetaFile =3D PlatformFile > > + self.Workspace =3D Workspace > > + self.WorkspaceDir =3D Workspace.WorkspaceDir > > + self.ToolChain =3D Toolchain > > + self.BuildTarget =3D Target > > + self.Arch =3D Arch > > + self.SourceDir =3D PlatformFile.SubDir > > + self.FdTargetList =3D self.Workspace.FdTargetList > > + self.FvTargetList =3D self.Workspace.FvTargetList > > + # get the original module/package/platform objects > > + self.BuildDatabase =3D Workspace.BuildDatabase > > + self.DscBuildDataObj =3D Workspace.Platform > > + > > + # flag indicating if the makefile/C-code file has been create= d or not > > + self.IsMakeFileCreated =3D False > > + > > + self._DynamicPcdList =3D None # [(TokenCName1, TokenSpaceG= uidCName1), (TokenCName2, TokenSpaceGuidCName2), ...] > > + self._NonDynamicPcdList =3D None # [(TokenCName1, TokenSpaceG= uidCName1), (TokenCName2, TokenSpaceGuidCName2), ...] > > + > > + self._AsBuildInfList =3D [] > > + self._AsBuildModuleList =3D [] > > + > > + self.VariableInfo =3D None > > + > > + if GlobalData.gFdfParser is not None: > > + self._AsBuildInfList =3D GlobalData.gFdfParser.Profile.In= fList > > + for Inf in self._AsBuildInfList: > > + InfClass =3D PathClass(NormPath(Inf), GlobalData.gWor= kspace, self.Arch) > > + M =3D self.BuildDatabase[InfClass, self.Arch, self.Bu= ildTarget, self.ToolChain] > > + if not M.IsBinaryModule: > > + continue > > + self._AsBuildModuleList.append(InfClass) > > + # get library/modules for build > > + self.LibraryBuildDirectoryList =3D [] > > + self.ModuleBuildDirectoryList =3D [] > > + > > + self.DataPipe =3D MemoryDataPipe(self.BuildDir) > > + self.DataPipe.FillData(self) > > + > > + return True > > + ## hash() operator of PlatformAutoGen > > + # > > + # The platform file path and arch string will be used to represe= nt > > + # hash value of this object > > + # > > + # @retval int Hash value of the platform file path and arch > > + # > > + @cached_class_function > > + def __hash__(self): > > + return hash((self.MetaFile, self.Arch)) > > + @cached_class_function > > + def __repr__(self): > > + return "%s [%s]" % (self.MetaFile, self.Arch) > > + > > + ## Create autogen code for platform and modules > > + # > > + # Since there's no autogen code for platform, this method will d= o nothing > > + # if CreateModuleCodeFile is set to False. 
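
One remark on the construction pattern at the top of this class: __init__ only runs _InitWorker when the "_Init" attribute is missing, which I read as relying on the AutoGen base class handing back a cached instance per (platform file, target, toolchain, arch) key. That base class is not part of this hunk, so the sketch below is only my assumption of the idea, with invented names (CachedAutoGen, PlatformAutoGenSketch):

    class CachedAutoGen(object):
        # Hypothetical stand-in for the AutoGen base class: one instance per key.
        _instances = {}

        def __new__(cls, workspace, meta_file, target, toolchain, arch, *args, **kwargs):
            key = (cls, meta_file, target, toolchain, arch)
            if key not in cls._instances:
                cls._instances[key] = super(CachedAutoGen, cls).__new__(cls)
            return cls._instances[key]

    class PlatformAutoGenSketch(CachedAutoGen):
        def __init__(self, workspace, meta_file, target, toolchain, arch, *args, **kwargs):
            # __new__ may hand back an already-initialized object, so only run
            # the expensive worker once per unique instance.
            if not hasattr(self, "_Init"):
                self._init_worker(workspace, meta_file, target, toolchain, arch)
                self._Init = True

        def _init_worker(self, workspace, meta_file, target, toolchain, arch):
            self.MetaFile, self.Arch = meta_file, arch
            self.BuildTarget, self.ToolChain = target, toolchain

        def __hash__(self):
            # Platform file path plus arch identify the object.
            return hash((self.MetaFile, self.Arch))

        def __repr__(self):
            return "%s [%s]" % (self.MetaFile, self.Arch)

    a = PlatformAutoGenSketch("Wa", "Platform.dsc", "DEBUG", "GCC5", "X64")
    b = PlatformAutoGenSketch("Wa", "Platform.dsc", "DEBUG", "GCC5", "X64")
    assert a is b   # same key -> same object, worker ran only once
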
> > + # > > + # @param CreateModuleCodeFile Flag indicating if creati= ng module's > > + # autogen code file or not > > + # > > + @cached_class_function > > + def CreateCodeFile(self, CreateModuleCodeFile=3DFalse): > > + # only module has code to be created, so do nothing if Create= ModuleCodeFile is False > > + if not CreateModuleCodeFile: > > + return > > + > > + for Ma in self.ModuleAutoGenList: > > + Ma.CreateCodeFile(True) > > + > > + ## Generate Fds Command > > + @cached_property > > + def GenFdsCommand(self): > > + return self.Workspace.GenFdsCommand > > + > > + ## Create makefile for the platform and modules in it > > + # > > + # @param CreateModuleMakeFile Flag indicating if the ma= kefile for > > + # modules will be created a= s well > > + # > > + def CreateMakeFile(self, CreateModuleMakeFile=3DFalse, FfsCommand= =3D {}): > > + if CreateModuleMakeFile: > > + for Ma in self._MaList: > > + key =3D (Ma.MetaFile.File, self.Arch) > > + if key in FfsCommand: > > + Ma.CreateMakeFile(True, FfsCommand[key]) > > + else: > > + Ma.CreateMakeFile(True) > > + > > + # no need to create makefile for the platform more than once > > + if self.IsMakeFileCreated: > > + return > > + > > + # create library/module build dirs for platform > > + Makefile =3D GenMake.PlatformMakefile(self) > > + self.LibraryBuildDirectoryList =3D Makefile.GetLibraryBuildDi= rectoryList() > > + self.ModuleBuildDirectoryList =3D Makefile.GetModuleBuildDire= ctoryList() > > + > > + self.IsMakeFileCreated =3D True > > + > > + @property > > + def AllPcdList(self): > > + return self.DynamicPcdList + self.NonDynamicPcdList > > + ## Deal with Shared FixedAtBuild Pcds > > + # > > + def CollectFixedAtBuildPcds(self): > > + for LibAuto in self.LibraryAutoGenList: > > + FixedAtBuildPcds =3D {} > > + ShareFixedAtBuildPcdsSameValue =3D {} > > + for Module in LibAuto.ReferenceModules: > > + for Pcd in set(Module.FixedAtBuildPcds + LibAuto.Fixe= dAtBuildPcds): > > + DefaultValue =3D Pcd.DefaultValue > > + # Cover the case: DSC component override the Pcd = value and the Pcd only used in one Lib > > + if Pcd in Module.LibraryPcdList: > > + Index =3D Module.LibraryPcdList.index(Pcd) > > + DefaultValue =3D Module.LibraryPcdList[Index]= .DefaultValue > > + key =3D ".".join((Pcd.TokenSpaceGuidCName, Pcd.To= kenCName)) > > + if key not in FixedAtBuildPcds: > > + ShareFixedAtBuildPcdsSameValue[key] =3D True > > + FixedAtBuildPcds[key] =3D DefaultValue > > + else: > > + if FixedAtBuildPcds[key] !=3D DefaultValue: > > + ShareFixedAtBuildPcdsSameValue[key] =3D F= alse > > + for Pcd in LibAuto.FixedAtBuildPcds: > > + key =3D ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenC= Name)) > > + if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) not in s= elf.NonDynamicPcdDict: > > + continue > > + else: > > + DscPcd =3D self.NonDynamicPcdDict[(Pcd.TokenCName= , Pcd.TokenSpaceGuidCName)] > > + if DscPcd.Type !=3D TAB_PCDS_FIXED_AT_BUILD: > > + continue > > + if key in ShareFixedAtBuildPcdsSameValue and ShareFix= edAtBuildPcdsSameValue[key]: > > + LibAuto.ConstPcd[key] =3D FixedAtBuildPcds[key] > > + > > + def CollectVariables(self, DynamicPcdSet): > > + VpdRegionSize =3D 0 > > + VpdRegionBase =3D 0 > > + if self.Workspace.FdfFile: > > + FdDict =3D self.Workspace.FdfProfile.FdDict[GlobalData.gF= dfParser.CurrentFdName] > > + for FdRegion in FdDict.RegionList: > > + for item in FdRegion.RegionDataList: > > + if self.Platform.VpdToolGuid.strip() and self.Pla= tform.VpdToolGuid in item: > > + VpdRegionSize =3D FdRegion.Size > > + VpdRegionBase =3D FdRegion.Offset > > + 
break > > + > > + VariableInfo =3D VariableMgr(self.DscBuildDataObj._GetDefault= Stores(), self.DscBuildDataObj.SkuIds) > > + VariableInfo.SetVpdRegionMaxSize(VpdRegionSize) > > + VariableInfo.SetVpdRegionOffset(VpdRegionBase) > > + Index =3D 0 > > + for Pcd in DynamicPcdSet: > > + pcdname =3D ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenC= Name)) > > + for SkuName in Pcd.SkuInfoList: > > + Sku =3D Pcd.SkuInfoList[SkuName] > > + SkuId =3D Sku.SkuId > > + if SkuId is None or SkuId =3D=3D '': > > + continue > > + if len(Sku.VariableName) > 0: > > + if Sku.VariableAttribute and 'NV' not in Sku.Vari= ableAttribute: > > + continue > > + VariableGuidStructure =3D Sku.VariableGuidValue > > + VariableGuid =3D GuidStructureStringToGuidString(= VariableGuidStructure) > > + for StorageName in Sku.DefaultStoreDict: > > + VariableInfo.append_variable(var_info(Index, = pcdname, StorageName, SkuName, StringToArray(Sku.VariableName), VariableGui= d, Sku.VariableOffset, Sku.VariableAttribute, Sku.HiiDefaultValue, Sku.Defa= ultStoreDict[StorageName] if Pcd.DatumType in TAB_PCD_NUMERIC_TYPES else St= ringToArray(Sku.DefaultStoreDict[StorageName]), Pcd.DatumType, Pcd.CustomAt= tribute['DscPosition'], Pcd.CustomAttribute.get('IsStru',False))) > > + Index +=3D 1 > > + return VariableInfo > > + > > + def UpdateNVStoreMaxSize(self, OrgVpdFile): > > + if self.VariableInfo: > > + VpdMapFilePath =3D os.path.join(self.BuildDir, TAB_FV_DIR= ECTORY, "%s.map" % self.Platform.VpdToolGuid) > > + PcdNvStoreDfBuffer =3D [item for item in self._DynamicPcd= List if item.TokenCName =3D=3D "PcdNvStoreDefaultValueBuffer" and item.Toke= nSpaceGuidCName =3D=3D "gEfiMdeModulePkgTokenSpaceGuid"] > > + > > + if PcdNvStoreDfBuffer: > > + if os.path.exists(VpdMapFilePath): > > + OrgVpdFile.Read(VpdMapFilePath) > > + PcdItems =3D OrgVpdFile.GetOffset(PcdNvStoreDfBuf= fer[0]) > > + NvStoreOffset =3D list(PcdItems.values())[0].stri= p() if PcdItems else '0' > > + else: > > + EdkLogger.error("build", FILE_READ_FAILURE, "Can = not find VPD map file %s to fix up VPD offset." % VpdMapFilePath) > > + > > + NvStoreOffset =3D int(NvStoreOffset, 16) if NvStoreOf= fset.upper().startswith("0X") else int(NvStoreOffset) > > + default_skuobj =3D PcdNvStoreDfBuffer[0].SkuInfoList.= get(TAB_DEFAULT) > > + maxsize =3D self.VariableInfo.VpdRegionSize - NvStor= eOffset if self.VariableInfo.VpdRegionSize else len(default_skuobj.DefaultV= alue.split(",")) > > + var_data =3D self.VariableInfo.PatchNVStoreDefaultMax= Size(maxsize) > > + > > + if var_data and default_skuobj: > > + default_skuobj.DefaultValue =3D var_data > > + PcdNvStoreDfBuffer[0].DefaultValue =3D var_data > > + PcdNvStoreDfBuffer[0].SkuInfoList.clear() > > + PcdNvStoreDfBuffer[0].SkuInfoList[TAB_DEFAULT] = =3D default_skuobj > > + PcdNvStoreDfBuffer[0].MaxDatumSize =3D str(len(de= fault_skuobj.DefaultValue.split(","))) > > + > > + return OrgVpdFile > > + > > + ## Collect dynamic PCDs > > + # > > + # Gather dynamic PCDs list from each module and their settings f= rom platform > > + # This interface should be invoked explicitly when platform acti= on is created. 
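
The size math in UpdateNVStoreMaxSize above boils down to "whatever is left of the VPD region after the NV store buffer's offset", with a fallback to the current byte count of the default-SKU value when no VPD region is declared. A minimal sketch (the helper name nv_store_max_size and the sample numbers are mine):

    def nv_store_max_size(vpd_region_size, nv_store_offset_text, default_value):
        # Offsets read back from the VPD map file may be hex ("0x...") or decimal.
        if nv_store_offset_text.upper().startswith("0X"):
            offset = int(nv_store_offset_text, 16)
        else:
            offset = int(nv_store_offset_text)
        if vpd_region_size:
            # Space left in the VPD region after the buffer's offset.
            return vpd_region_size - offset
        # No VPD region in the FDF: fall back to the byte count of the
        # default-SKU value, e.g. "0x01, 0x02, 0x03" -> 3.
        return len(default_value.split(","))

    print(nv_store_max_size(0x10000, "0x100", "{0x00}"))   # 65280
    print(nv_store_max_size(0, "0", "0x01, 0x02, 0x03"))   # 3
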
> > + # > > + def CollectPlatformDynamicPcds(self): > > + self.CategoryPcds() > > + self.SortDynamicPcd() > > + > > + def CategoryPcds(self): > > + # Category Pcds into DynamicPcds and NonDynamicPcds > > + # for gathering error information > > + NoDatumTypePcdList =3D set() > > + FdfModuleList =3D [] > > + for InfName in self._AsBuildInfList: > > + InfName =3D mws.join(self.WorkspaceDir, InfName) > > + FdfModuleList.append(os.path.normpath(InfName)) > > + for M in self._MbList: > > +# F is the Module for which M is the module autogen > > + ModPcdList =3D self.ApplyPcdSetting(M, M.ModulePcdList) > > + LibPcdList =3D [] > > + for lib in M.LibraryPcdList: > > + LibPcdList.extend(self.ApplyPcdSetting(M, M.LibraryPc= dList[lib], lib)) > > + for PcdFromModule in ModPcdList + LibPcdList: > > + > > + # make sure that the "VOID*" kind of datum has MaxDat= umSize set > > + if PcdFromModule.DatumType =3D=3D TAB_VOID and not Pc= dFromModule.MaxDatumSize: > > + NoDatumTypePcdList.add("%s.%s [%s]" % (PcdFromMod= ule.TokenSpaceGuidCName, PcdFromModule.TokenCName, M.MetaFile)) > > + > > + # Check the PCD from Binary INF or Source INF > > + if M.IsBinaryModule =3D=3D True: > > + PcdFromModule.IsFromBinaryInf =3D True > > + > > + # Check the PCD from DSC or not > > + PcdFromModule.IsFromDsc =3D (PcdFromModule.TokenCName= , PcdFromModule.TokenSpaceGuidCName) in self.Platform.Pcds > > + > > + if PcdFromModule.Type in PCD_DYNAMIC_TYPE_SET or PcdF= romModule.Type in PCD_DYNAMIC_EX_TYPE_SET: > > + if M.MetaFile.Path not in FdfModuleList: > > + # If one of the Source built modules listed i= n the DSC is not listed > > + # in FDF modules, and the INF lists a PCD can= only use the PcdsDynamic > > + # access method (it is only listed in the DEC= file that declares the > > + # PCD as PcdsDynamic), then build tool will r= eport warning message > > + # notify the PI that they are attempting to b= uild a module that must > > + # be included in a flash image in order to be= functional. These Dynamic > > + # PCD will not be added into the Database unl= ess it is used by other > > + # modules that are included in the FDF file. > > + if PcdFromModule.Type in PCD_DYNAMIC_TYPE_SET= and \ > > + PcdFromModule.IsFromBinaryInf =3D=3D Fals= e: > > + # Print warning message to let the develo= per make a determine. > > + continue > > + # If one of the Source built modules listed i= n the DSC is not listed in > > + # FDF modules, and the INF lists a PCD can on= ly use the PcdsDynamicEx > > + # access method (it is only listed in the DEC= file that declares the > > + # PCD as PcdsDynamicEx), then DO NOT break th= e build; DO NOT add the > > + # PCD to the Platform's PCD Database. > > + if PcdFromModule.Type in PCD_DYNAMIC_EX_TYPE_= SET: > > + continue > > + # > > + # If a dynamic PCD used by a PEM module/PEI modul= e & DXE module, > > + # it should be stored in Pcd PEI database, If a d= ynamic only > > + # used by DXE module, it should be stored in DXE = PCD database. 
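
If it helps other reviewers, this is how I read the bucketing rules in the loop above for a PCD reported by a module that is not reachable through the FDF. The function below is only my condensed decision table; the two type sets are abbreviated stand-ins, not the real PCD_DYNAMIC_TYPE_SET/PCD_DYNAMIC_EX_TYPE_SET constants:

    PCD_DYNAMIC = {"Dynamic", "DynamicDefault", "DynamicHii", "DynamicVpd"}
    PCD_DYNAMIC_EX = {"DynamicEx", "DynamicExDefault", "DynamicExHii", "DynamicExVpd"}

    def classify_module_pcd(pcd_type, module_in_fdf, from_binary_inf):
        # Decide whether a module's PCD ends up in the platform PCD database
        # ("dynamic"), in the non-dynamic list, or is silently dropped.
        if pcd_type in PCD_DYNAMIC or pcd_type in PCD_DYNAMIC_EX:
            if not module_in_fdf:
                if pcd_type in PCD_DYNAMIC and not from_binary_inf:
                    return "skipped"    # source-only Dynamic: warn and drop
                if pcd_type in PCD_DYNAMIC_EX:
                    return "skipped"    # DynamicEx: drop, do not break the build
            return "dynamic"
        return "non-dynamic"

    print(classify_module_pcd("DynamicEx", module_in_fdf=False, from_binary_inf=False))  # skipped
    print(classify_module_pcd("Dynamic", module_in_fdf=True, from_binary_inf=False))     # dynamic
    print(classify_module_pcd("FixedAtBuild", True, False))                              # non-dynamic
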
> > + # The default Phase is DXE > > + # > > + if M.ModuleType in SUP_MODULE_SET_PEI: > > + PcdFromModule.Phase =3D "PEI" > > + if PcdFromModule not in self._DynaPcdList_: > > + self._DynaPcdList_.append(PcdFromModule) > > + elif PcdFromModule.Phase =3D=3D 'PEI': > > + # overwrite any the same PCD existing, if Pha= se is PEI > > + Index =3D self._DynaPcdList_.index(PcdFromMod= ule) > > + self._DynaPcdList_[Index] =3D PcdFromModule > > + elif PcdFromModule not in self._NonDynaPcdList_: > > + self._NonDynaPcdList_.append(PcdFromModule) > > + elif PcdFromModule in self._NonDynaPcdList_ and PcdFr= omModule.IsFromBinaryInf =3D=3D True: > > + Index =3D self._NonDynaPcdList_.index(PcdFromModu= le) > > + if self._NonDynaPcdList_[Index].IsFromBinaryInf = =3D=3D False: > > + #The PCD from Binary INF will override the sa= me one from source INF > > + self._NonDynaPcdList_.remove (self._NonDynaPc= dList_[Index]) > > + PcdFromModule.Pending =3D False > > + self._NonDynaPcdList_.append (PcdFromModule) > > + DscModuleSet =3D {os.path.normpath(ModuleInf.Path) for Module= Inf in self.Platform.Modules} > > + # add the PCD from modules that listed in FDF but not in DSC = to Database > > + for InfName in FdfModuleList: > > + if InfName not in DscModuleSet: > > + InfClass =3D PathClass(InfName) > > + M =3D self.BuildDatabase[InfClass, self.Arch, self.Bu= ildTarget, self.ToolChain] > > + # If a module INF in FDF but not in current arch's DS= C module list, it must be module (either binary or source) > > + # for different Arch. PCDs in source module for diffe= rent Arch is already added before, so skip the source module here. > > + # For binary module, if in current arch, we need to l= ist the PCDs into database. > > + if not M.IsBinaryModule: > > + continue > > + # Override the module PCD setting by platform setting > > + ModulePcdList =3D self.ApplyPcdSetting(M, M.Pcds) > > + for PcdFromModule in ModulePcdList: > > + PcdFromModule.IsFromBinaryInf =3D True > > + PcdFromModule.IsFromDsc =3D False > > + # Only allow the DynamicEx and Patchable PCD in A= sBuild INF > > + if PcdFromModule.Type not in PCD_DYNAMIC_EX_TYPE_= SET and PcdFromModule.Type not in TAB_PCDS_PATCHABLE_IN_MODULE: > > + EdkLogger.error("build", AUTOGEN_ERROR, "PCD = setting error", > > + File=3Dself.MetaFile, > > + ExtraData=3D"\n\tExisted %s P= CD %s in:\n\t\t%s\n" > > + % (PcdFromModule.Type, PcdFro= mModule.TokenCName, InfName)) > > + # make sure that the "VOID*" kind of datum has Ma= xDatumSize set > > + if PcdFromModule.DatumType =3D=3D TAB_VOID and no= t PcdFromModule.MaxDatumSize: > > + NoDatumTypePcdList.add("%s.%s [%s]" % (PcdFro= mModule.TokenSpaceGuidCName, PcdFromModule.TokenCName, InfName)) > > + if M.ModuleType in SUP_MODULE_SET_PEI: > > + PcdFromModule.Phase =3D "PEI" > > + if PcdFromModule not in self._DynaPcdList_ and Pc= dFromModule.Type in PCD_DYNAMIC_EX_TYPE_SET: > > + self._DynaPcdList_.append(PcdFromModule) > > + elif PcdFromModule not in self._NonDynaPcdList_ a= nd PcdFromModule.Type in TAB_PCDS_PATCHABLE_IN_MODULE: > > + self._NonDynaPcdList_.append(PcdFromModule) > > + if PcdFromModule in self._DynaPcdList_ and PcdFro= mModule.Phase =3D=3D 'PEI' and PcdFromModule.Type in PCD_DYNAMIC_EX_TYPE_SE= T: > > + # Overwrite the phase of any the same PCD exi= sting, if Phase is PEI. > > + # It is to solve the case that a dynamic PCD = used by a PEM module/PEI > > + # module & DXE module at a same time. > > + # Overwrite the type of the PCDs in source IN= F by the type of AsBuild > > + # INF file as DynamicEx. 
> > + Index =3D self._DynaPcdList_.index(PcdFromMod= ule) > > + self._DynaPcdList_[Index].Phase =3D PcdFromMo= dule.Phase > > + self._DynaPcdList_[Index].Type =3D PcdFromMod= ule.Type > > + for PcdFromModule in self._NonDynaPcdList_: > > + # If a PCD is not listed in the DSC file, but binary INF = files used by > > + # this platform all (that use this PCD) list the PCD in a= [PatchPcds] > > + # section, AND all source INF files used by this platform= the build > > + # that use the PCD list the PCD in either a [Pcds] or [Pa= tchPcds] > > + # section, then the tools must NOT add the PCD to the Pla= tform's PCD > > + # Database; the build must assign the access method for t= his PCD as > > + # PcdsPatchableInModule. > > + if PcdFromModule not in self._DynaPcdList_: > > + continue > > + Index =3D self._DynaPcdList_.index(PcdFromModule) > > + if PcdFromModule.IsFromDsc =3D=3D False and \ > > + PcdFromModule.Type in TAB_PCDS_PATCHABLE_IN_MODULE an= d \ > > + PcdFromModule.IsFromBinaryInf =3D=3D True and \ > > + self._DynaPcdList_[Index].IsFromBinaryInf =3D=3D Fals= e: > > + Index =3D self._DynaPcdList_.index(PcdFromModule) > > + self._DynaPcdList_.remove (self._DynaPcdList_[Index]) > > + > > + # print out error information and break the build, if error f= ound > > + if len(NoDatumTypePcdList) > 0: > > + NoDatumTypePcdListString =3D "\n\t\t".join(NoDatumTypePcd= List) > > + EdkLogger.error("build", AUTOGEN_ERROR, "PCD setting erro= r", > > + File=3Dself.MetaFile, > > + ExtraData=3D"\n\tPCD(s) without MaxDatumS= ize:\n\t\t%s\n" > > + % NoDatumTypePcdListString) > > + self._NonDynamicPcdList =3D self._NonDynaPcdList_ > > + self._DynamicPcdList =3D self._DynaPcdList_ > > + > > + def SortDynamicPcd(self): > > + # > > + # Sort dynamic PCD list to: > > + # 1) If PCD's datum type is VOID* and value is unicode string= which starts with L, the PCD item should > > + # try to be put header of dynamicd List > > + # 2) If PCD is HII type, the PCD item should be put after uni= code type PCD > > + # > > + # The reason of sorting is make sure the unicode string is in= double-byte alignment in string table. 
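
The sort that follows effectively partitions the dynamic list into three buckets: unicode-string VOID* PCDs first so they stay 2-byte aligned in the generated string table, HII (variable-backed) PCDs next, and everything else last. A standalone sketch of that ordering, using plain tuples instead of the real PCD objects:

    def sort_dynamic_pcds(pcds):
        # pcds: list of (datum_type, default_value, variable_name) tuples,
        # purely for illustration.
        unicode_pcds, hii_pcds, other_pcds = [], [], []
        for datum_type, default_value, variable_name in pcds:
            if datum_type == "VOID*" and default_value.startswith("L"):
                unicode_pcds.append((datum_type, default_value, variable_name))
            elif variable_name:
                hii_pcds.append((datum_type, default_value, variable_name))
            else:
                other_pcds.append((datum_type, default_value, variable_name))
        return unicode_pcds + hii_pcds + other_pcds

    sample = [("UINT32", "0", ""), ("VOID*", 'L"en-US"', ""), ("VOID*", "{0}", "Setup")]
    print(sort_dynamic_pcds(sample))
    # unicode string first, HII ("Setup" variable) second, plain UINT32 last
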
> > + # > > + UnicodePcdArray =3D set() > > + HiiPcdArray =3D set() > > + OtherPcdArray =3D set() > > + VpdPcdDict =3D {} > > + VpdFile =3D VpdInfoFile.VpdInfoFile() > > + NeedProcessVpdMapFile =3D False > > + > > + for pcd in self.Platform.Pcds: > > + if pcd not in self._PlatformPcds: > > + self._PlatformPcds[pcd] =3D self.Platform.Pcds[pcd] > > + > > + for item in self._PlatformPcds: > > + if self._PlatformPcds[item].DatumType and self._PlatformP= cds[item].DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, = TAB_VOID, "BOOLEAN"]: > > + self._PlatformPcds[item].DatumType =3D TAB_VOID > > + > > + if (self.Workspace.ArchList[-1] =3D=3D self.Arch): > > + for Pcd in self._DynamicPcdList: > > + # just pick the a value to determine whether is unico= de string type > > + Sku =3D Pcd.SkuInfoList.get(TAB_DEFAULT) > > + Sku.VpdOffset =3D Sku.VpdOffset.strip() > > + > > + if Pcd.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_U= INT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]: > > + Pcd.DatumType =3D TAB_VOID > > + > > + # if found PCD which datum value is unicode strin= g the insert to left size of UnicodeIndex > > + # if found HII type PCD then insert to right of U= nicodeIndex > > + if Pcd.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMI= C_EX_VPD]: > > + VpdPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCNa= me)] =3D Pcd > > + > > + #Collect DynamicHii PCD values and assign it to DynamicEx= Vpd PCD gEfiMdeModulePkgTokenSpaceGuid.PcdNvStoreDefaultValueBuffer > > + PcdNvStoreDfBuffer =3D VpdPcdDict.get(("PcdNvStoreDefault= ValueBuffer", "gEfiMdeModulePkgTokenSpaceGuid")) > > + if PcdNvStoreDfBuffer: > > + self.VariableInfo =3D self.CollectVariables(self._Dyn= amicPcdList) > > + vardump =3D self.VariableInfo.dump() > > + if vardump: > > + # > > + #According to PCD_DATABASE_INIT in edk2\MdeModule= Pkg\Include\Guid\PcdDataBaseSignatureGuid.h, > > + #the max size for string PCD should not exceed US= HRT_MAX 65535(0xffff). > > + #typedef UINT16 SIZE_INFO; > > + #//SIZE_INFO SizeTable[]; > > + if len(vardump.split(",")) > 0xffff: > > + EdkLogger.error("build", RESOURCE_OVERFLOW, '= The current length of PCD %s value is %d, it exceeds to the max size of Str= ing PCD.' %(".".join([PcdNvStoreDfBuffer.TokenSpaceGuidCName,PcdNvStoreDfBu= ffer.TokenCName]) ,len(vardump.split(",")))) > > + PcdNvStoreDfBuffer.DefaultValue =3D vardump > > + for skuname in PcdNvStoreDfBuffer.SkuInfoList: > > + PcdNvStoreDfBuffer.SkuInfoList[skuname].Defau= ltValue =3D vardump > > + PcdNvStoreDfBuffer.MaxDatumSize =3D str(len(v= ardump.split(","))) > > + else: > > + #If the end user define [DefaultStores] and [XXX.Menu= facturing] in DSC, but forget to configure PcdNvStoreDefaultValueBuffer to = PcdsDynamicVpd > > + if [Pcd for Pcd in self._DynamicPcdList if Pcd.UserDe= finedDefaultStoresFlag]: > > + EdkLogger.warn("build", "PcdNvStoreDefaultValueBu= ffer should be defined as PcdsDynamicExVpd in dsc file since the DefaultSto= res is enabled for this platform.\n%s" %self.Platform.MetaFile.Path) > > + PlatformPcds =3D sorted(self._PlatformPcds.keys()) > > + # > > + # Add VPD type PCD into VpdFile and determine whether the= VPD PCD need to be fixed up. 
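
On the USHRT_MAX check above: the generated PCD database stores sizes in a UINT16 SIZE_INFO entry, so the comma-separated byte dump assigned to PcdNvStoreDefaultValueBuffer must not exceed 0xffff bytes. Roughly (the function name and the sample value are mine):

    USHRT_MAX = 0xffff   # SIZE_INFO in PcdDataBaseSignatureGuid.h is a UINT16

    def check_nv_store_buffer(value_dump):
        # value_dump is a comma-separated byte dump such as "0x47,0x55,0x49".
        # Its element count becomes the string PCD's MaxDatumSize, which must
        # still fit into a UINT16 size-table entry.
        size = len(value_dump.split(","))
        if size > USHRT_MAX:
            raise ValueError("NV store default buffer is %d bytes, exceeds 0xffff" % size)
        return str(size)   # what gets stored into Pcd.MaxDatumSize

    print(check_nv_store_buffer("0x01,0x02,0x03"))   # "3"
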
> > + # > > + VpdSkuMap =3D {} > > + for PcdKey in PlatformPcds: > > + Pcd =3D self._PlatformPcds[PcdKey] > > + if Pcd.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMI= C_EX_VPD] and \ > > + PcdKey in VpdPcdDict: > > + Pcd =3D VpdPcdDict[PcdKey] > > + SkuValueMap =3D {} > > + DefaultSku =3D Pcd.SkuInfoList.get(TAB_DEFAULT) > > + if DefaultSku: > > + PcdValue =3D DefaultSku.DefaultValue > > + if PcdValue not in SkuValueMap: > > + SkuValueMap[PcdValue] =3D [] > > + VpdFile.Add(Pcd, TAB_DEFAULT, DefaultSku.= VpdOffset) > > + SkuValueMap[PcdValue].append(DefaultSku) > > + > > + for (SkuName, Sku) in Pcd.SkuInfoList.items(): > > + Sku.VpdOffset =3D Sku.VpdOffset.strip() > > + PcdValue =3D Sku.DefaultValue > > + if PcdValue =3D=3D "": > > + PcdValue =3D Pcd.DefaultValue > > + if Sku.VpdOffset !=3D TAB_STAR: > > + if PcdValue.startswith("{"): > > + Alignment =3D 8 > > + elif PcdValue.startswith("L"): > > + Alignment =3D 2 > > + else: > > + Alignment =3D 1 > > + try: > > + VpdOffset =3D int(Sku.VpdOffset) > > + except: > > + try: > > + VpdOffset =3D int(Sku.VpdOffset, = 16) > > + except: > > + EdkLogger.error("build", FORMAT_I= NVALID, "Invalid offset value %s for PCD %s.%s." % (Sku.VpdOffset, Pcd.Toke= nSpaceGuidCName, Pcd.TokenCName)) > > + if VpdOffset % Alignment !=3D 0: > > + if PcdValue.startswith("{"): > > + EdkLogger.warn("build", "The offs= et value of PCD %s.%s is not 8-byte aligned!" %(Pcd.TokenSpaceGuidCName, Pc= d.TokenCName), File=3Dself.MetaFile) > > + else: > > + EdkLogger.error("build", FORMAT_I= NVALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' % (Pcd.T= okenSpaceGuidCName, Pcd.TokenCName, Alignment)) > > + if PcdValue not in SkuValueMap: > > + SkuValueMap[PcdValue] =3D [] > > + VpdFile.Add(Pcd, SkuName, Sku.VpdOffset) > > + SkuValueMap[PcdValue].append(Sku) > > + # if the offset of a VPD is *, then it need t= o be fixed up by third party tool. > > + if not NeedProcessVpdMapFile and Sku.VpdOffse= t =3D=3D TAB_STAR: > > + NeedProcessVpdMapFile =3D True > > + if self.Platform.VpdToolGuid is None or s= elf.Platform.VpdToolGuid =3D=3D '': > > + EdkLogger.error("Build", FILE_NOT_FOU= ND, \ > > + "Fail to find third-p= arty BPDG tool to process VPD PCDs. BPDG Guid tool need to be defined in to= ols_def.txt and VPD_TOOL_GUID need to be provided in DSC file.") > > + > > + VpdSkuMap[PcdKey] =3D SkuValueMap > > + # > > + # Fix the PCDs define in VPD PCD section that never refer= enced by module. > > + # An example is PCD for signature usage. 
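
For reference, the VPD offset validation in this hunk reduces to the small rule below: byte-array values ({...}) want 8-byte alignment (warning only when violated), L"..." strings need 2-byte alignment, plain ASCII values are byte-aligned, and a '*' offset defers to the external BPDG tool. This is only my restatement, with a made-up helper name:

    def check_vpd_offset(pcd_value, vpd_offset_text):
        if vpd_offset_text == "*":
            return "needs-bpdg-fixup"      # BPDG picks the offset later
        if pcd_value.startswith("{"):
            alignment = 8
        elif pcd_value.startswith("L"):
            alignment = 2
        else:
            alignment = 1
        try:
            offset = int(vpd_offset_text)
        except ValueError:
            offset = int(vpd_offset_text, 16)   # second chance: hex offset
        if offset % alignment:
            return "misaligned (needs %d-byte alignment)" % alignment
        return "ok"

    print(check_vpd_offset('L"PlatformLang"', "0x3"))   # misaligned (needs 2-byte alignment)
    print(check_vpd_offset("{0x00, 0x01}", "*"))        # needs-bpdg-fixup
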
> > + # > > + for DscPcd in PlatformPcds: > > + DscPcdEntry =3D self._PlatformPcds[DscPcd] > > + if DscPcdEntry.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCD= S_DYNAMIC_EX_VPD]: > > + if not (self.Platform.VpdToolGuid is None or self= .Platform.VpdToolGuid =3D=3D ''): > > + FoundFlag =3D False > > + for VpdPcd in VpdFile._VpdArray: > > + # This PCD has been referenced by module > > + if (VpdPcd.TokenSpaceGuidCName =3D=3D Dsc= PcdEntry.TokenSpaceGuidCName) and \ > > + (VpdPcd.TokenCName =3D=3D DscPcdEntry.= TokenCName): > > + FoundFlag =3D True > > + > > + # Not found, it should be signature > > + if not FoundFlag : > > + # just pick the a value to determine whet= her is unicode string type > > + SkuValueMap =3D {} > > + SkuObjList =3D list(DscPcdEntry.SkuInfoLi= st.items()) > > + DefaultSku =3D DscPcdEntry.SkuInfoList.ge= t(TAB_DEFAULT) > > + if DefaultSku: > > + defaultindex =3D SkuObjList.index((TA= B_DEFAULT, DefaultSku)) > > + SkuObjList[0], SkuObjList[defaultinde= x] =3D SkuObjList[defaultindex], SkuObjList[0] > > + for (SkuName, Sku) in SkuObjList: > > + Sku.VpdOffset =3D Sku.VpdOffset.strip= () > > + > > + # Need to iterate DEC pcd information= to get the value & datumtype > > + for eachDec in self.PackageList: > > + for DecPcd in eachDec.Pcds: > > + DecPcdEntry =3D eachDec.Pcds[= DecPcd] > > + if (DecPcdEntry.TokenSpaceGui= dCName =3D=3D DscPcdEntry.TokenSpaceGuidCName) and \ > > + (DecPcdEntry.TokenCName = =3D=3D DscPcdEntry.TokenCName): > > + # Print warning message t= o let the developer make a determine. > > + EdkLogger.warn("build", "= Unreferenced vpd pcd used!", > > + File=3Dse= lf.MetaFile, \ > > + ExtraData= =3D "PCD: %s.%s used in the DSC file %s is unreferenced." \ > > + %(DscPcdE= ntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName, self.Platform.MetaFile.Pa= th)) > > + > > + DscPcdEntry.DatumType = = =3D DecPcdEntry.DatumType > > + DscPcdEntry.DefaultValue = = =3D DecPcdEntry.DefaultValue > > + DscPcdEntry.TokenValue = =3D DecPcdEntry.TokenValue > > + DscPcdEntry.TokenSpaceGui= dValue =3D eachDec.Guids[DecPcdEntry.TokenSpaceGuidCName] > > + # Only fix the value whil= e no value provided in DSC file. > > + if not Sku.DefaultValue: > > + DscPcdEntry.SkuInfoLi= st[list(DscPcdEntry.SkuInfoList.keys())[0]].DefaultValue =3D DecPcdEntry.De= faultValue > > + > > + if DscPcdEntry not in self._DynamicPc= dList: > > + self._DynamicPcdList.append(DscPc= dEntry) > > + Sku.VpdOffset =3D Sku.VpdOffset.strip= () > > + PcdValue =3D Sku.DefaultValue > > + if PcdValue =3D=3D "": > > + PcdValue =3D DscPcdEntry.Default= Value > > + if Sku.VpdOffset !=3D TAB_STAR: > > + if PcdValue.startswith("{"): > > + Alignment =3D 8 > > + elif PcdValue.startswith("L"): > > + Alignment =3D 2 > > + else: > > + Alignment =3D 1 > > + try: > > + VpdOffset =3D int(Sku.VpdOffs= et) > > + except: > > + try: > > + VpdOffset =3D int(Sku.Vpd= Offset, 16) > > + except: > > + EdkLogger.error("build", = FORMAT_INVALID, "Invalid offset value %s for PCD %s.%s." % (Sku.VpdOffset, = DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName)) > > + if VpdOffset % Alignment !=3D 0: > > + if PcdValue.startswith("{"): > > + EdkLogger.warn("build", "= The offset value of PCD %s.%s is not 8-byte aligned!" %(DscPcdEntry.TokenSp= aceGuidCName, DscPcdEntry.TokenCName), File=3Dself.MetaFile) > > + else: > > + EdkLogger.error("build", = FORMAT_INVALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' 
= % (DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName, Alignment)) > > + if PcdValue not in SkuValueMap: > > + SkuValueMap[PcdValue] =3D [] > > + VpdFile.Add(DscPcdEntry, SkuName,= Sku.VpdOffset) > > + SkuValueMap[PcdValue].append(Sku) > > + if not NeedProcessVpdMapFile and Sku.= VpdOffset =3D=3D TAB_STAR: > > + NeedProcessVpdMapFile =3D True > > + if DscPcdEntry.DatumType =3D=3D TAB_VOID = and PcdValue.startswith("L"): > > + UnicodePcdArray.add(DscPcdEntry) > > + elif len(Sku.VariableName) > 0: > > + HiiPcdArray.add(DscPcdEntry) > > + else: > > + OtherPcdArray.add(DscPcdEntry) > > + > > + # if the offset of a VPD is *, then i= t need to be fixed up by third party tool. > > + VpdSkuMap[DscPcd] =3D SkuValueMap > > + if (self.Platform.FlashDefinition is None or self.Platfor= m.FlashDefinition =3D=3D '') and \ > > + VpdFile.GetCount() !=3D 0: > > + EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE, > > + "Fail to get FLASH_DEFINITION definit= ion in DSC file %s which is required when DSC contains VPD PCD." % str(self= .Platform.MetaFile)) > > + > > + if VpdFile.GetCount() !=3D 0: > > + > > + self.FixVpdOffset(VpdFile) > > + > > + self.FixVpdOffset(self.UpdateNVStoreMaxSize(VpdFile)) > > + PcdNvStoreDfBuffer =3D [item for item in self._Dynami= cPcdList if item.TokenCName =3D=3D "PcdNvStoreDefaultValueBuffer" and item.= TokenSpaceGuidCName =3D=3D "gEfiMdeModulePkgTokenSpaceGuid"] > > + if PcdNvStoreDfBuffer: > > + PcdName,PcdGuid =3D PcdNvStoreDfBuffer[0].TokenCN= ame, PcdNvStoreDfBuffer[0].TokenSpaceGuidCName > > + if (PcdName,PcdGuid) in VpdSkuMap: > > + DefaultSku =3D PcdNvStoreDfBuffer[0].SkuInfoL= ist.get(TAB_DEFAULT) > > + VpdSkuMap[(PcdName,PcdGuid)] =3D {DefaultSku.= DefaultValue:[SkuObj for SkuObj in PcdNvStoreDfBuffer[0].SkuInfoList.values= () ]} > > + > > + # Process VPD map file generated by third party BPDG = tool > > + if NeedProcessVpdMapFile: > > + VpdMapFilePath =3D os.path.join(self.BuildDir, TA= B_FV_DIRECTORY, "%s.map" % self.Platform.VpdToolGuid) > > + if os.path.exists(VpdMapFilePath): > > + VpdFile.Read(VpdMapFilePath) > > + > > + # Fixup TAB_STAR offset > > + for pcd in VpdSkuMap: > > + vpdinfo =3D VpdFile.GetVpdInfo(pcd) > > + if vpdinfo is None: > > + # just pick the a value to determine whet= her is unicode string type > > + continue > > + for pcdvalue in VpdSkuMap[pcd]: > > + for sku in VpdSkuMap[pcd][pcdvalue]: > > + for item in vpdinfo: > > + if item[2] =3D=3D pcdvalue: > > + sku.VpdOffset =3D item[1] > > + else: > > + EdkLogger.error("build", FILE_READ_FAILURE, "= Can not find VPD map file %s to fix up VPD offset." 
% VpdMapFilePath) > > + > > + # Delete the DynamicPcdList At the last time enter into t= his function > > + for Pcd in self._DynamicPcdList: > > + # just pick the a value to determine whether is unico= de string type > > + Sku =3D Pcd.SkuInfoList.get(TAB_DEFAULT) > > + Sku.VpdOffset =3D Sku.VpdOffset.strip() > > + > > + if Pcd.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_U= INT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]: > > + Pcd.DatumType =3D TAB_VOID > > + > > + PcdValue =3D Sku.DefaultValue > > + if Pcd.DatumType =3D=3D TAB_VOID and PcdValue.startsw= ith("L"): > > + # if found PCD which datum value is unicode strin= g the insert to left size of UnicodeIndex > > + UnicodePcdArray.add(Pcd) > > + elif len(Sku.VariableName) > 0: > > + # if found HII type PCD then insert to right of U= nicodeIndex > > + HiiPcdArray.add(Pcd) > > + else: > > + OtherPcdArray.add(Pcd) > > + del self._DynamicPcdList[:] > > + self._DynamicPcdList.extend(list(UnicodePcdArray)) > > + self._DynamicPcdList.extend(list(HiiPcdArray)) > > + self._DynamicPcdList.extend(list(OtherPcdArray)) > > + allskuset =3D [(SkuName, Sku.SkuId) for pcd in self._DynamicP= cdList for (SkuName, Sku) in pcd.SkuInfoList.items()] > > + for pcd in self._DynamicPcdList: > > + if len(pcd.SkuInfoList) =3D=3D 1: > > + for (SkuName, SkuId) in allskuset: > > + if isinstance(SkuId, str) and eval(SkuId) =3D=3D = 0 or SkuId =3D=3D 0: > > + continue > > + pcd.SkuInfoList[SkuName] =3D copy.deepcopy(pcd.Sk= uInfoList[TAB_DEFAULT]) > > + pcd.SkuInfoList[SkuName].SkuId =3D SkuId > > + pcd.SkuInfoList[SkuName].SkuIdName =3D SkuName > > + > > + def FixVpdOffset(self, VpdFile ): > > + FvPath =3D os.path.join(self.BuildDir, TAB_FV_DIRECTORY) > > + if not os.path.exists(FvPath): > > + try: > > + os.makedirs(FvPath) > > + except: > > + EdkLogger.error("build", FILE_WRITE_FAILURE, "Fail to= create FV folder under %s" % self.BuildDir) > > + > > + VpdFilePath =3D os.path.join(FvPath, "%s.txt" % self.Platform= .VpdToolGuid) > > + > > + if VpdFile.Write(VpdFilePath): > > + # retrieve BPDG tool's path from tool_def.txt according t= o VPD_TOOL_GUID defined in DSC file. > > + BPDGToolName =3D None > > + for ToolDef in self.ToolDefinition.values(): > > + if TAB_GUID in ToolDef and ToolDef[TAB_GUID] =3D=3D s= elf.Platform.VpdToolGuid: > > + if "PATH" not in ToolDef: > > + EdkLogger.error("build", ATTRIBUTE_NOT_AVAILA= BLE, "PATH attribute was not provided for BPDG guid tool %s in tools_def.tx= t" % self.Platform.VpdToolGuid) > > + BPDGToolName =3D ToolDef["PATH"] > > + break > > + # Call third party GUID BPDG tool. > > + if BPDGToolName is not None: > > + VpdInfoFile.CallExtenalBPDGTool(BPDGToolName, VpdFile= Path) > > + else: > > + EdkLogger.error("Build", FILE_NOT_FOUND, "Fail to fin= d third-party BPDG tool to process VPD PCDs. 
BPDG Guid tool need to be defi= ned in tools_def.txt and VPD_TOOL_GUID need to be provided in DSC file.") > > + > > + ## Return the platform build data object > > + @cached_property > > + def Platform(self): > > + return self.BuildDatabase[self.MetaFile, self.Arch, self.Buil= dTarget, self.ToolChain] > > + > > + ## Return platform name > > + @cached_property > > + def Name(self): > > + return self.Platform.PlatformName > > + > > + ## Return the meta file GUID > > + @cached_property > > + def Guid(self): > > + return self.Platform.Guid > > + > > + ## Return the platform version > > + @cached_property > > + def Version(self): > > + return self.Platform.Version > > + > > + ## Return the FDF file name > > + @cached_property > > + def FdfFile(self): > > + if self.Workspace.FdfFile: > > + RetVal=3D mws.join(self.WorkspaceDir, self.Workspace.FdfF= ile) > > + else: > > + RetVal =3D '' > > + return RetVal > > + > > + ## Return the build output directory platform specifies > > + @cached_property > > + def OutputDir(self): > > + return self.Platform.OutputDirectory > > + > > + ## Return the directory to store all intermediate and final files= built > > + @cached_property > > + def BuildDir(self): > > + if os.path.isabs(self.OutputDir): > > + GlobalData.gBuildDirectory =3D RetVal =3D path.join( > > + path.abspath(self.OutputDir), > > + self.BuildTarget + "_" + self= .ToolChain, > > + ) > > + else: > > + GlobalData.gBuildDirectory =3D RetVal =3D path.join( > > + self.WorkspaceDir, > > + self.OutputDir, > > + self.BuildTarget + "_" + self= .ToolChain, > > + ) > > + return RetVal > > + > > + ## Return directory of platform makefile > > + # > > + # @retval string Makefile directory > > + # > > + @cached_property > > + def MakeFileDir(self): > > + return path.join(self.BuildDir, self.Arch) > > + > > + ## Return build command string > > + # > > + # @retval string Build command string > > + # > > + @cached_property > > + def BuildCommand(self): > > + RetVal =3D [] > > + if "MAKE" in self.ToolDefinition and "PATH" in self.ToolDefin= ition["MAKE"]: > > + RetVal +=3D _SplitOption(self.ToolDefinition["MAKE"]["PAT= H"]) > > + if "FLAGS" in self.ToolDefinition["MAKE"]: > > + NewOption =3D self.ToolDefinition["MAKE"]["FLAGS"].st= rip() > > + if NewOption !=3D '': > > + RetVal +=3D _SplitOption(NewOption) > > + if "MAKE" in self.EdkIIBuildOption: > > + if "FLAGS" in self.EdkIIBuildOption["MAKE"]: > > + Flags =3D self.EdkIIBuildOption["MAKE"]["FLAGS"] > > + if Flags.startswith('=3D'): > > + RetVal =3D [RetVal[0]] + [Flags[1:]] > > + else: > > + RetVal.append(Flags) > > + return RetVal > > + > > + ## Get tool chain definition > > + # > > + # Get each tool definition for given tool chain from tools_def.t= xt and platform > > + # > > + @cached_property > > + def ToolDefinition(self): > > + ToolDefinition =3D self.Workspace.ToolDef.ToolsDefTxtDictiona= ry > > + if TAB_TOD_DEFINES_COMMAND_TYPE not in self.Workspace.ToolDef= .ToolsDefTxtDatabase: > > + EdkLogger.error('build', RESOURCE_NOT_AVAILABLE, "No tool= s found in configuration", > > + ExtraData=3D"[%s]" % self.MetaFile) > > + RetVal =3D OrderedDict() > > + DllPathList =3D set() > > + for Def in ToolDefinition: > > + Target, Tag, Arch, Tool, Attr =3D Def.split("_") > > + if Target !=3D self.BuildTarget or Tag !=3D self.ToolChai= n or Arch !=3D self.Arch: > > + continue > > + > > + Value =3D ToolDefinition[Def] > > + # don't record the DLL > > + if Attr =3D=3D "DLL": > > + DllPathList.add(Value) > > + continue > > + > > + if Tool not in RetVal: > > + RetVal[Tool] =3D 
OrderedDict() > > + RetVal[Tool][Attr] =3D Value > > + > > + ToolsDef =3D '' > > + if GlobalData.gOptions.SilentMode and "MAKE" in RetVal: > > + if "FLAGS" not in RetVal["MAKE"]: > > + RetVal["MAKE"]["FLAGS"] =3D "" > > + RetVal["MAKE"]["FLAGS"] +=3D " -s" > > + MakeFlags =3D '' > > + for Tool in RetVal: > > + for Attr in RetVal[Tool]: > > + Value =3D RetVal[Tool][Attr] > > + if Tool in self._BuildOptionWithToolDef(RetVal) and A= ttr in self._BuildOptionWithToolDef(RetVal)[Tool]: > > + # check if override is indicated > > + if self._BuildOptionWithToolDef(RetVal)[Tool][Att= r].startswith('=3D'): > > + Value =3D self._BuildOptionWithToolDef(RetVal= )[Tool][Attr][1:] > > + else: > > + if Attr !=3D 'PATH': > > + Value +=3D " " + self._BuildOptionWithToo= lDef(RetVal)[Tool][Attr] > > + else: > > + Value =3D self._BuildOptionWithToolDef(Re= tVal)[Tool][Attr] > > + > > + if Attr =3D=3D "PATH": > > + # Don't put MAKE definition in the file > > + if Tool !=3D "MAKE": > > + ToolsDef +=3D "%s =3D %s\n" % (Tool, Value) > > + elif Attr !=3D "DLL": > > + # Don't put MAKE definition in the file > > + if Tool =3D=3D "MAKE": > > + if Attr =3D=3D "FLAGS": > > + MakeFlags =3D Value > > + else: > > + ToolsDef +=3D "%s_%s =3D %s\n" % (Tool, Attr,= Value) > > + ToolsDef +=3D "\n" > > + > > + tool_def_file =3D os.path.join(self.MakeFileDir, "TOOLS_DEF."= + self.Arch) > > + SaveFileOnChange(tool_def_file, ToolsDef, False) > > + for DllPath in DllPathList: > > + os.environ["PATH"] =3D DllPath + os.pathsep + os.environ[= "PATH"] > > + os.environ["MAKE_FLAGS"] =3D MakeFlags > > + > > + return RetVal > > + > > + ## Return the paths of tools > > + @cached_property > > + def ToolDefinitionFile(self): > > + tool_def_file =3D os.path.join(self.MakeFileDir, "TOOLS_DEF."= + self.Arch) > > + if not os.path.exists(tool_def_file): > > + self.ToolDefinition > > + return tool_def_file > > + > > + ## Retrieve the toolchain family of given toolchain tag. Default = to 'MSFT'. > > + @cached_property > > + def ToolChainFamily(self): > > + ToolDefinition =3D self.Workspace.ToolDef.ToolsDefTxtDatabase > > + if TAB_TOD_DEFINES_FAMILY not in ToolDefinition \ > > + or self.ToolChain not in ToolDefinition[TAB_TOD_DEFINES_FA= MILY] \ > > + or not ToolDefinition[TAB_TOD_DEFINES_FAMILY][self.ToolCha= in]: > > + EdkLogger.verbose("No tool chain family found in configur= ation for %s. Default to MSFT." \ > > + % self.ToolChain) > > + RetVal =3D TAB_COMPILER_MSFT > > + else: > > + RetVal =3D ToolDefinition[TAB_TOD_DEFINES_FAMILY][self.To= olChain] > > + return RetVal > > + > > + @cached_property > > + def BuildRuleFamily(self): > > + ToolDefinition =3D self.Workspace.ToolDef.ToolsDefTxtDatabase > > + if TAB_TOD_DEFINES_BUILDRULEFAMILY not in ToolDefinition \ > > + or self.ToolChain not in ToolDefinition[TAB_TOD_DEFINES_BU= ILDRULEFAMILY] \ > > + or not ToolDefinition[TAB_TOD_DEFINES_BUILDRULEFAMILY][sel= f.ToolChain]: > > + EdkLogger.verbose("No tool chain family found in configur= ation for %s. Default to MSFT." 
\ > > + % self.ToolChain) > > + return TAB_COMPILER_MSFT > > + > > + return ToolDefinition[TAB_TOD_DEFINES_BUILDRULEFAMILY][self.T= oolChain] > > + > > + ## Return the build options specific for all modules in this plat= form > > + @cached_property > > + def BuildOption(self): > > + return self._ExpandBuildOption(self.Platform.BuildOptions) > > + > > + def _BuildOptionWithToolDef(self, ToolDef): > > + return self._ExpandBuildOption(self.Platform.BuildOptions, To= olDef=3DToolDef) > > + > > + ## Return the build options specific for EDK modules in this plat= form > > + @cached_property > > + def EdkBuildOption(self): > > + return self._ExpandBuildOption(self.Platform.BuildOptions, ED= K_NAME) > > + > > + ## Return the build options specific for EDKII modules in this pl= atform > > + @cached_property > > + def EdkIIBuildOption(self): > > + return self._ExpandBuildOption(self.Platform.BuildOptions, ED= KII_NAME) > > + > > + ## Parse build_rule.txt in Conf Directory. > > + # > > + # @retval BuildRule object > > + # > > + @cached_property > > + def BuildRule(self): > > + BuildRuleFile =3D None > > + if TAB_TAT_DEFINES_BUILD_RULE_CONF in self.Workspace.TargetTx= t.TargetTxtDictionary: > > + BuildRuleFile =3D self.Workspace.TargetTxt.TargetTxtDicti= onary[TAB_TAT_DEFINES_BUILD_RULE_CONF] > > + if not BuildRuleFile: > > + BuildRuleFile =3D gDefaultBuildRuleFile > > + RetVal =3D BuildRule(BuildRuleFile) > > + if RetVal._FileVersion =3D=3D "": > > + RetVal._FileVersion =3D AutoGenReqBuildRuleVerNum > > + else: > > + if RetVal._FileVersion < AutoGenReqBuildRuleVerNum : > > + # If Build Rule's version is less than the version nu= mber required by the tools, halting the build. > > + EdkLogger.error("build", AUTOGEN_ERROR, > > + ExtraData=3D"The version number [%s] = of build_rule.txt is less than the version number required by the AutoGen.(= the minimum required version number is [%s])"\ > > + % (RetVal._FileVersion, AutoGenReqBu= ildRuleVerNum)) > > + return RetVal > > + > > + ## Summarize the packages used by modules in this platform > > + @cached_property > > + def PackageList(self): > > + RetVal =3D set() > > + for Mb in self._MbList: > > + RetVal.update(Mb.Packages) > > + for lb in Mb.LibInstances: > > + RetVal.update(lb.Packages) > > + #Collect package set information from INF of FDF > > + for ModuleFile in self._AsBuildModuleList: > > + if ModuleFile in self.Platform.Modules: > > + continue > > + ModuleData =3D self.BuildDatabase[ModuleFile, self.Arch, = self.BuildTarget, self.ToolChain] > > + RetVal.update(ModuleData.Packages) > > + return list(RetVal) > > + > > + @cached_property > > + def NonDynamicPcdDict(self): > > + return {(Pcd.TokenCName, Pcd.TokenSpaceGuidCName):Pcd for Pcd= in self.NonDynamicPcdList} > > + > > + ## Get list of non-dynamic PCDs > > + @property > > + def NonDynamicPcdList(self): > > + if not self._NonDynamicPcdList: > > + self.CollectPlatformDynamicPcds() > > + return self._NonDynamicPcdList > > + > > + ## Get list of dynamic PCDs > > + @property > > + def DynamicPcdList(self): > > + if not self._DynamicPcdList: > > + self.CollectPlatformDynamicPcds() > > + return self._DynamicPcdList > > + > > + ## Generate Token Number for all PCD > > + @cached_property > > + def PcdTokenNumber(self): > > + RetVal =3D OrderedDict() > > + TokenNumber =3D 1 > > + # > > + # Make the Dynamic and DynamicEx PCD use within different Tok= enNumber area. 
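
The token numbering that follows keeps each (phase, dynamic kind) group in its own contiguous range: PEI Dynamic, then PEI DynamicEx, then DXE Dynamic, then DXE DynamicEx, with non-dynamic PCDs numbered last. A small standalone sketch of that scheme, where tuples stand in for the real PCD objects:

    from collections import OrderedDict

    def assign_token_numbers(dynamic_pcds, non_dynamic_pcds):
        # dynamic_pcds: list of (name, phase, is_dynamic_ex) tuples.
        tokens = OrderedDict()
        number = 1
        for phase in ("PEI", "DXE"):
            for want_ex in (False, True):
                for name, pcd_phase, is_ex in dynamic_pcds:
                    if pcd_phase == phase and is_ex == want_ex:
                        tokens[name] = number
                        number += 1
        for name in non_dynamic_pcds:
            tokens[name] = number
            number += 1
        return tokens

    print(assign_token_numbers(
        [("PcdA", "DXE", False), ("PcdB", "PEI", True), ("PcdC", "PEI", False)],
        ["PcdFixed"]))
    # PcdC -> 1, PcdB -> 2, PcdA -> 3, PcdFixed -> 4
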
> > + # Such as: > > + # > > + # Dynamic PCD: > > + # TokenNumber 0 ~ 10 > > + # DynamicEx PCD: > > + # TokeNumber 11 ~ 20 > > + # > > + for Pcd in self.DynamicPcdList: > > + if Pcd.Phase =3D=3D "PEI" and Pcd.Type in PCD_DYNAMIC_TYP= E_SET: > > + EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d"= % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber)) > > + RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] =3D T= okenNumber > > + TokenNumber +=3D 1 > > + > > + for Pcd in self.DynamicPcdList: > > + if Pcd.Phase =3D=3D "PEI" and Pcd.Type in PCD_DYNAMIC_EX_= TYPE_SET: > > + EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d"= % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber)) > > + RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] =3D T= okenNumber > > + TokenNumber +=3D 1 > > + > > + for Pcd in self.DynamicPcdList: > > + if Pcd.Phase =3D=3D "DXE" and Pcd.Type in PCD_DYNAMIC_TYP= E_SET: > > + EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d"= % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber)) > > + RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] =3D T= okenNumber > > + TokenNumber +=3D 1 > > + > > + for Pcd in self.DynamicPcdList: > > + if Pcd.Phase =3D=3D "DXE" and Pcd.Type in PCD_DYNAMIC_EX_= TYPE_SET: > > + EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d"= % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber)) > > + RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] =3D T= okenNumber > > + TokenNumber +=3D 1 > > + > > + for Pcd in self.NonDynamicPcdList: > > + RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] =3D Token= Number > > + TokenNumber +=3D 1 > > + return RetVal > > + > > + @cached_property > > + def _MbList(self): > > + return [self.BuildDatabase[m, self.Arch, self.BuildTarget, se= lf.ToolChain] for m in self.Platform.Modules] > > + > > + @cached_property > > + def _MaList(self): > > + for ModuleFile in self.Platform.Modules: > > + Ma =3D ModuleAutoGen( > > + self.Workspace, > > + ModuleFile, > > + self.BuildTarget, > > + self.ToolChain, > > + self.Arch, > > + self.MetaFile, > > + self.DataPipe > > + ) > > + self.Platform.Modules[ModuleFile].M =3D Ma > > + return [x.M for x in self.Platform.Modules.values()] > > + > > + ## Summarize ModuleAutoGen objects of all modules to be built for= this platform > > + @cached_property > > + def ModuleAutoGenList(self): > > + RetVal =3D [] > > + for Ma in self._MaList: > > + if Ma not in RetVal: > > + RetVal.append(Ma) > > + return RetVal > > + > > + ## Summarize ModuleAutoGen objects of all libraries to be built f= or this platform > > + @cached_property > > + def LibraryAutoGenList(self): > > + RetVal =3D [] > > + for Ma in self._MaList: > > + for La in Ma.LibraryAutoGenList: > > + if La not in RetVal: > > + RetVal.append(La) > > + if Ma not in La.ReferenceModules: > > + La.ReferenceModules.append(Ma) > > + return RetVal > > + > > + ## Test if a module is supported by the platform > > + # > > + # An error will be raised directly if the module or its arch is = not supported > > + # by the platform or current configuration > > + # > > + def ValidModule(self, Module): > > + return Module in self.Platform.Modules or Module in self.Plat= form.LibraryInstances \ > > + or Module in self._AsBuildModuleList > > + @cached_property > > + def GetAllModuleInfo(self,WithoutPcd=3DTrue): > > + ModuleLibs =3D set() > > + for m in self.Platform.Modules: > > + module_obj =3D self.BuildDatabase[m,self.Arch,self.BuildT= arget,self.ToolChain] > > + if not bool(module_obj.LibraryClass): > > + 
Libs =3D GetModuleLibInstances(module_obj, self.Platf= orm, self.BuildDatabase, self.Arch,self.BuildTarget,self.ToolChain) > > + else: > > + Libs =3D [] > > + ModuleLibs.update( set([(l.MetaFile.File,l.MetaFile.Root,= l.Arch,True) for l in Libs])) > > + if WithoutPcd and module_obj.PcdIsDriver: > > + continue > > + ModuleLibs.add((m.File,m.Root,module_obj.Arch,False)) > > + > > + return ModuleLibs > > + > > + ## Resolve the library classes in a module to library instances > > + # > > + # This method will not only resolve library classes but also sort= the library > > + # instances according to the dependency-ship. > > + # > > + # @param Module The module from which the library classes= will be resolved > > + # > > + # @retval library_list List of library instances sorted > > + # > > + def ApplyLibraryInstance(self, Module): > > + # Cover the case that the binary INF file is list in the FDF = file but not DSC file, return empty list directly > > + if str(Module) not in self.Platform.Modules: > > + return [] > > + > > + return GetModuleLibInstances(Module, > > + self.Platform, > > + self.BuildDatabase, > > + self.Arch, > > + self.BuildTarget, > > + self.ToolChain, > > + self.MetaFile, > > + EdkLogger) > > + > > + ## Override PCD setting (type, value, ...) > > + # > > + # @param ToPcd The PCD to be overridden > > + # @param FromPcd The PCD overriding from > > + # > > + def _OverridePcd(self, ToPcd, FromPcd, Module=3D"", Msg=3D"", Lib= rary=3D""): > > + # > > + # in case there's PCDs coming from FDF file, which have no ty= pe given. > > + # at this point, ToPcd.Type has the type found from dependent > > + # package > > + # > > + TokenCName =3D ToPcd.TokenCName > > + for PcdItem in GlobalData.MixedPcd: > > + if (ToPcd.TokenCName, ToPcd.TokenSpaceGuidCName) in Globa= lData.MixedPcd[PcdItem]: > > + TokenCName =3D PcdItem[0] > > + break > > + if FromPcd is not None: > > + if ToPcd.Pending and FromPcd.Type: > > + ToPcd.Type =3D FromPcd.Type > > + elif ToPcd.Type and FromPcd.Type\ > > + and ToPcd.Type !=3D FromPcd.Type and ToPcd.Type in Fr= omPcd.Type: > > + if ToPcd.Type.strip() =3D=3D TAB_PCDS_DYNAMIC_EX: > > + ToPcd.Type =3D FromPcd.Type > > + elif ToPcd.Type and FromPcd.Type \ > > + and ToPcd.Type !=3D FromPcd.Type: > > + if Library: > > + Module =3D str(Module) + " 's library file (" + s= tr(Library) + ")" > > + EdkLogger.error("build", OPTION_CONFLICT, "Mismatched= PCD type", > > + ExtraData=3D"%s.%s is used as [%s] in= module %s, but as [%s] in %s."\ > > + % (ToPcd.TokenSpaceGuidCNam= e, TokenCName, > > + ToPcd.Type, Module, From= Pcd.Type, Msg), > > + File=3Dself.MetaFile) > > + > > + if FromPcd.MaxDatumSize: > > + ToPcd.MaxDatumSize =3D FromPcd.MaxDatumSize > > + ToPcd.MaxSizeUserSet =3D FromPcd.MaxDatumSize > > + if FromPcd.DefaultValue: > > + ToPcd.DefaultValue =3D FromPcd.DefaultValue > > + if FromPcd.TokenValue: > > + ToPcd.TokenValue =3D FromPcd.TokenValue > > + if FromPcd.DatumType: > > + ToPcd.DatumType =3D FromPcd.DatumType > > + if FromPcd.SkuInfoList: > > + ToPcd.SkuInfoList =3D FromPcd.SkuInfoList > > + if FromPcd.UserDefinedDefaultStoresFlag: > > + ToPcd.UserDefinedDefaultStoresFlag =3D FromPcd.UserDe= finedDefaultStoresFlag > > + # Add Flexible PCD format parse > > + if ToPcd.DefaultValue: > > + try: > > + ToPcd.DefaultValue =3D ValueExpressionEx(ToPcd.De= faultValue, ToPcd.DatumType, self.Platform._GuidDict)(True) > > + except BadExpression as Value: > > + EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%= s.%s] Value "%s", %s' %(ToPcd.TokenSpaceGuidCName, ToPcd.TokenCName, 
ToPcd.= DefaultValue, Value), > > + File=3Dself.MetaFile) > > + > > + # check the validation of datum > > + IsValid, Cause =3D CheckPcdDatum(ToPcd.DatumType, ToPcd.D= efaultValue) > > + if not IsValid: > > + EdkLogger.error('build', FORMAT_INVALID, Cause, File= =3Dself.MetaFile, > > + ExtraData=3D"%s.%s" % (ToPcd.TokenSpa= ceGuidCName, TokenCName)) > > + ToPcd.validateranges =3D FromPcd.validateranges > > + ToPcd.validlists =3D FromPcd.validlists > > + ToPcd.expressions =3D FromPcd.expressions > > + ToPcd.CustomAttribute =3D FromPcd.CustomAttribute > > + > > + if FromPcd is not None and ToPcd.DatumType =3D=3D TAB_VOID an= d not ToPcd.MaxDatumSize: > > + EdkLogger.debug(EdkLogger.DEBUG_9, "No MaxDatumSize speci= fied for PCD %s.%s" \ > > + % (ToPcd.TokenSpaceGuidCName, TokenCName)= ) > > + Value =3D ToPcd.DefaultValue > > + if not Value: > > + ToPcd.MaxDatumSize =3D '1' > > + elif Value[0] =3D=3D 'L': > > + ToPcd.MaxDatumSize =3D str((len(Value) - 2) * 2) > > + elif Value[0] =3D=3D '{': > > + ToPcd.MaxDatumSize =3D str(len(Value.split(','))) > > + else: > > + ToPcd.MaxDatumSize =3D str(len(Value) - 1) > > + > > + # apply default SKU for dynamic PCDS if specified one is not = available > > + if (ToPcd.Type in PCD_DYNAMIC_TYPE_SET or ToPcd.Type in PCD_D= YNAMIC_EX_TYPE_SET) \ > > + and not ToPcd.SkuInfoList: > > + if self.Platform.SkuName in self.Platform.SkuIds: > > + SkuName =3D self.Platform.SkuName > > + else: > > + SkuName =3D TAB_DEFAULT > > + ToPcd.SkuInfoList =3D { > > + SkuName : SkuInfoClass(SkuName, self.Platform.SkuIds[= SkuName][0], '', '', '', '', '', ToPcd.DefaultValue) > > + } > > + > > + ## Apply PCD setting defined platform to a module > > + # > > + # @param Module The module from which the PCD setting will be= overridden > > + # > > + # @retval PCD_list The list PCDs with settings from platform > > + # > > + def ApplyPcdSetting(self, Module, Pcds, Library=3D""): > > + # for each PCD in module > > + for Name, Guid in Pcds: > > + PcdInModule =3D Pcds[Name, Guid] > > + # find out the PCD setting in platform > > + if (Name, Guid) in self.Platform.Pcds: > > + PcdInPlatform =3D self.Platform.Pcds[Name, Guid] > > + else: > > + PcdInPlatform =3D None > > + # then override the settings if any > > + self._OverridePcd(PcdInModule, PcdInPlatform, Module, Msg= = =3D"DSC PCD sections", Library=3DLibrary) > > + # resolve the VariableGuid value > > + for SkuId in PcdInModule.SkuInfoList: > > + Sku =3D PcdInModule.SkuInfoList[SkuId] > > + if Sku.VariableGuid =3D=3D '': continue > > + Sku.VariableGuidValue =3D GuidValue(Sku.VariableGuid,= self.PackageList, self.MetaFile.Path) > > + if Sku.VariableGuidValue is None: > > + PackageList =3D "\n\t".join(str(P) for P in self.= PackageList) > > + EdkLogger.error( > > + 'build', > > + RESOURCE_NOT_AVAILABLE, > > + "Value of GUID [%s] is not found in" = % Sku.VariableGuid, > > + ExtraData=3DPackageList + "\n\t(used = with %s.%s from module %s)" \ > > + % (Guid, Name= , str(Module)), > > + File=3Dself.MetaFile > > + ) > > + > > + # override PCD settings with module specific setting > > + if Module in self.Platform.Modules: > > + PlatformModule =3D self.Platform.Modules[str(Module)] > > + for Key in PlatformModule.Pcds: > > + if GlobalData.BuildOptionPcd: > > + for pcd in GlobalData.BuildOptionPcd: > > + (TokenSpaceGuidCName, TokenCName, FieldName, = pcdvalue, _) =3D pcd > > + if (TokenCName, TokenSpaceGuidCName) =3D=3D K= ey and FieldName =3D=3D"": > > + PlatformModule.Pcds[Key].DefaultValue =3D= pcdvalue > > + 
PlatformModule.Pcds[Key].PcdValueFromComm= =3D pcdvalue > > + break > > + Flag =3D False > > + if Key in Pcds: > > + ToPcd =3D Pcds[Key] > > + Flag =3D True > > + elif Key in GlobalData.MixedPcd: > > + for PcdItem in GlobalData.MixedPcd[Key]: > > + if PcdItem in Pcds: > > + ToPcd =3D Pcds[PcdItem] > > + Flag =3D True > > + break > > + if Flag: > > + self._OverridePcd(ToPcd, PlatformModule.Pcds[Key]= , Module, Msg=3D"DSC Components Module scoped PCD section", Library=3DLibra= ry) > > + # use PCD value to calculate the MaxDatumSize when it is not = specified > > + for Name, Guid in Pcds: > > + Pcd =3D Pcds[Name, Guid] > > + if Pcd.DatumType =3D=3D TAB_VOID and not Pcd.MaxDatumSize= : > > + Pcd.MaxSizeUserSet =3D None > > + Value =3D Pcd.DefaultValue > > + if not Value: > > + Pcd.MaxDatumSize =3D '1' > > + elif Value[0] =3D=3D 'L': > > + Pcd.MaxDatumSize =3D str((len(Value) - 2) * 2) > > + elif Value[0] =3D=3D '{': > > + Pcd.MaxDatumSize =3D str(len(Value.split(','))) > > + else: > > + Pcd.MaxDatumSize =3D str(len(Value) - 1) > > + return list(Pcds.values()) > > + > > + ## Append build options in platform to a module > > + # > > + # @param Module The module to which the build options will be= appended > > + # > > + # @retval options The options appended with build options i= n platform > > + # > > + def ApplyBuildOption(self, Module): > > + # Get the different options for the different style module > > + PlatformOptions =3D self.EdkIIBuildOption > > + ModuleTypeOptions =3D self.Platform.GetBuildOptionsByModuleTy= pe(EDKII_NAME, Module.ModuleType) > > + ModuleTypeOptions =3D self._ExpandBuildOption(ModuleTypeOptio= ns) > > + ModuleOptions =3D self._ExpandBuildOption(Module.BuildOptions= ) > > + if Module in self.Platform.Modules: > > + PlatformModule =3D self.Platform.Modules[str(Module)] > > + PlatformModuleOptions =3D self._ExpandBuildOption(Platfor= mModule.BuildOptions) > > + else: > > + PlatformModuleOptions =3D {} > > + > > + BuildRuleOrder =3D None > > + for Options in [self.ToolDefinition, ModuleOptions, PlatformO= ptions, ModuleTypeOptions, PlatformModuleOptions]: > > + for Tool in Options: > > + for Attr in Options[Tool]: > > + if Attr =3D=3D TAB_TOD_DEFINES_BUILDRULEORDER: > > + BuildRuleOrder =3D Options[Tool][Attr] > > + > > + AllTools =3D set(list(ModuleOptions.keys()) + list(PlatformOp= tions.keys()) + > > + list(PlatformModuleOptions.keys()) + list(Modu= leTypeOptions.keys()) + > > + list(self.ToolDefinition.keys())) > > + BuildOptions =3D defaultdict(lambda: defaultdict(str)) > > + for Tool in AllTools: > > + for Options in [self.ToolDefinition, ModuleOptions, Platf= ormOptions, ModuleTypeOptions, PlatformModuleOptions]: > > + if Tool not in Options: > > + continue > > + for Attr in Options[Tool]: > > + # > > + # Do not generate it in Makefile > > + # > > + if Attr =3D=3D TAB_TOD_DEFINES_BUILDRULEORDER: > > + continue > > + Value =3D Options[Tool][Attr] > > + # check if override is indicated > > + if Value.startswith('=3D'): > > + BuildOptions[Tool][Attr] =3D mws.handleWsMacr= o(Value[1:]) > > + else: > > + if Attr !=3D 'PATH': > > + BuildOptions[Tool][Attr] +=3D " " + mws.h= andleWsMacro(Value) > > + else: > > + BuildOptions[Tool][Attr] =3D mws.handleWs= Macro(Value) > > + > > + return BuildOptions, BuildRuleOrder > > + > > + > > + def GetGlobalBuildOptions(self,Module): > > + ModuleTypeOptions =3D self.Platform.GetBuildOptionsByModuleTy= pe(EDKII_NAME, Module.ModuleType) > > + ModuleTypeOptions =3D self._ExpandBuildOption(ModuleTypeOptio= ns) > > + > > + if Module in 
self.Platform.Modules: > > + PlatformModule =3D self.Platform.Modules[str(Module)] > > + PlatformModuleOptions =3D self._ExpandBuildOption(Platfor= mModule.BuildOptions) > > + else: > > + PlatformModuleOptions =3D {} > > + > > + return ModuleTypeOptions,PlatformModuleOptions > > + def ModuleGuid(self,Module): > > + if os.path.basename(Module.MetaFile.File) !=3D os.path.basena= me(Module.MetaFile.Path): > > + # > > + # Length of GUID is 36 > > + # > > + return os.path.basename(Module.MetaFile.Path)[:36] > > + return Module.Guid > > + @cached_property > > + def UniqueBaseName(self): > > + retVal =3D{} > > + ModuleNameDict =3D {} > > + UniqueName =3D {} > > + for Module in self._MbList: > > + unique_base_name =3D '%s_%s' % (Module.BaseName,self.Modu= leGuid(Module)) > > + if unique_base_name not in ModuleNameDict: > > + ModuleNameDict[unique_base_name] =3D [] > > + ModuleNameDict[unique_base_name].append(Module.MetaFile) > > + if Module.BaseName not in UniqueName: > > + UniqueName[Module.BaseName] =3D set() > > + UniqueName[Module.BaseName].add((self.ModuleGuid(Module),= Module.MetaFile)) > > + for module_paths in ModuleNameDict.values(): > > + if len(module_paths) > 1 and len(set(module_paths))>1: > > + samemodules =3D list(set(module_paths)) > > + EdkLogger.error("build", FILE_DUPLICATED, 'Modules ha= ve same BaseName and FILE_GUID:\n' > > + ' %s\n %s' % (samemodules[0], s= amemodules[1])) > > + for name in UniqueName: > > + Guid_Path =3D UniqueName[name] > > + if len(Guid_Path) > 1: > > + retVal[name] =3D '%s_%s' % (name,Guid_Path.pop()[0]) > > + return retVal > > + ## Expand * in build option key > > + # > > + # @param Options Options to be expanded > > + # @param ToolDef Use specified ToolDef instead of full ver= sion. > > + # This is needed during initialization to p= revent > > + # infinite recursion betweeh BuildOptions, > > + # ToolDefinition, and this function. > > + # > > + # @retval options Options expanded > > + # > > + def _ExpandBuildOption(self, Options, ModuleStyle=3DNone, ToolDef= = =3DNone): > > + if not ToolDef: > > + ToolDef =3D self.ToolDefinition > > + BuildOptions =3D {} > > + FamilyMatch =3D False > > + FamilyIsNull =3D True > > + > > + OverrideList =3D {} > > + # > > + # Construct a list contain the build options which need overr= ide. > > + # > > + for Key in Options: > > + # > > + # Key[0] -- tool family > > + # Key[1] -- TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE > > + # > > + if (Key[0] =3D=3D self.BuildRuleFamily and > > + (ModuleStyle is None or len(Key) < 3 or (len(Key) > 2= and Key[2] =3D=3D ModuleStyle))): > > + Target, ToolChain, Arch, CommandType, Attr =3D Key[1]= .split('_') > > + if (Target =3D=3D self.BuildTarget or Target =3D=3D T= AB_STAR) and\ > > + (ToolChain =3D=3D self.ToolChain or ToolChain =3D= = =3D TAB_STAR) and\ > > + (Arch =3D=3D self.Arch or Arch =3D=3D TAB_STAR) a= nd\ > > + Options[Key].startswith("=3D"): > > + > > + if OverrideList.get(Key[1]) is not None: > > + OverrideList.pop(Key[1]) > > + OverrideList[Key[1]] =3D Options[Key] > > + > > + # > > + # Use the highest priority value. 
> > + # > > + if (len(OverrideList) >=3D 2): > > + KeyList =3D list(OverrideList.keys()) > > + for Index in range(len(KeyList)): > > + NowKey =3D KeyList[Index] > > + Target1, ToolChain1, Arch1, CommandType1, Attr1 =3D N= owKey.split("_") > > + for Index1 in range(len(KeyList) - Index - 1): > > + NextKey =3D KeyList[Index1 + Index + 1] > > + # > > + # Compare two Key, if one is included by another,= choose the higher priority one > > + # > > + Target2, ToolChain2, Arch2, CommandType2, Attr2 = =3D NextKey.split("_") > > + if (Target1 =3D=3D Target2 or Target1 =3D=3D TAB_= STAR or Target2 =3D=3D TAB_STAR) and\ > > + (ToolChain1 =3D=3D ToolChain2 or ToolChain1 = =3D=3D TAB_STAR or ToolChain2 =3D=3D TAB_STAR) and\ > > + (Arch1 =3D=3D Arch2 or Arch1 =3D=3D TAB_STAR = or Arch2 =3D=3D TAB_STAR) and\ > > + (CommandType1 =3D=3D CommandType2 or CommandT= ype1 =3D=3D TAB_STAR or CommandType2 =3D=3D TAB_STAR) and\ > > + (Attr1 =3D=3D Attr2 or Attr1 =3D=3D TAB_STAR = or Attr2 =3D=3D TAB_STAR): > > + > > + if CalculatePriorityValue(NowKey) > Calculate= PriorityValue(NextKey): > > + if Options.get((self.BuildRuleFamily, Nex= tKey)) is not None: > > + Options.pop((self.BuildRuleFamily, Ne= xtKey)) > > + else: > > + if Options.get((self.BuildRuleFamily, Now= Key)) is not None: > > + Options.pop((self.BuildRuleFamily, No= wKey)) > > + > > + for Key in Options: > > + if ModuleStyle is not None and len (Key) > 2: > > + # Check Module style is EDK or EDKII. > > + # Only append build option for the matched style modu= le. > > + if ModuleStyle =3D=3D EDK_NAME and Key[2] !=3D EDK_NA= ME: > > + continue > > + elif ModuleStyle =3D=3D EDKII_NAME and Key[2] !=3D ED= KII_NAME: > > + continue > > + Family =3D Key[0] > > + Target, Tag, Arch, Tool, Attr =3D Key[1].split("_") > > + # if tool chain family doesn't match, skip it > > + if Tool in ToolDef and Family !=3D "": > > + FamilyIsNull =3D False > > + if ToolDef[Tool].get(TAB_TOD_DEFINES_BUILDRULEFAMILY,= "") !=3D "": > > + if Family !=3D ToolDef[Tool][TAB_TOD_DEFINES_BUIL= DRULEFAMILY]: > > + continue > > + elif Family !=3D ToolDef[Tool][TAB_TOD_DEFINES_FAMILY= ]: > > + continue > > + FamilyMatch =3D True > > + # expand any wildcard > > + if Target =3D=3D TAB_STAR or Target =3D=3D self.BuildTarg= et: > > + if Tag =3D=3D TAB_STAR or Tag =3D=3D self.ToolChain: > > + if Arch =3D=3D TAB_STAR or Arch =3D=3D self.Arch: > > + if Tool not in BuildOptions: > > + BuildOptions[Tool] =3D {} > > + if Attr !=3D "FLAGS" or Attr not in BuildOpti= ons[Tool] or Options[Key].startswith('=3D'): > > + BuildOptions[Tool][Attr] =3D Options[Key] > > + else: > > + # append options for the same tool except= PATH > > + if Attr !=3D 'PATH': > > + BuildOptions[Tool][Attr] +=3D " " + O= ptions[Key] > > + else: > > + BuildOptions[Tool][Attr] =3D Options[= Key] > > + # Build Option Family has been checked, which need't to be ch= ecked again for family. > > + if FamilyMatch or FamilyIsNull: > > + return BuildOptions > > + > > + for Key in Options: > > + if ModuleStyle is not None and len (Key) > 2: > > + # Check Module style is EDK or EDKII. > > + # Only append build option for the matched style modu= le. 
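
Side note on the override-priority block above: I am not sure CalculatePriorityValue() scores keys exactly this way, but as a mental model for why an explicit TARGET_TOOLCHAIN_ARCH_CMD_ATTR key beats a '*' wildcard (and why the loser gets popped from Options), this toy sketch is what I have in mind -- all names below are mine, not from the patch:

    TAB_STAR = '*'

    def priority_value(key):
        # One bit per field: 1 if the field is explicit, 0 if it is '*',
        # so a fully explicit key always outranks a partly wildcarded one.
        score = 0
        for part in key.split('_'):        # TARGET_TOOLCHAIN_ARCH_CMD_ATTR
            score = (score << 1) | (0 if part == TAB_STAR else 1)
        return score

    def pick_override(options, target, toolchain, arch):
        # Keep only keys matching the current build, then take the
        # highest-scoring one.
        best_key, best_score = None, -1
        for key in options:
            t, tc, a, _cmd, _attr = key.split('_')
            if (t in (TAB_STAR, target) and tc in (TAB_STAR, toolchain)
                    and a in (TAB_STAR, arch)):
                score = priority_value(key)
                if score > best_score:
                    best_key, best_score = key, score
        return best_key, options.get(best_key)

    opts = {
        '*_*_*_CC_FLAGS': '=-Os',
        'DEBUG_GCC5_X64_CC_FLAGS': '=-O0 -g',
    }
    print(pick_override(opts, 'DEBUG', 'GCC5', 'X64'))
    # ('DEBUG_GCC5_X64_CC_FLAGS', '=-O0 -g')

If the real scoring differs, please ignore -- the sketch is only to check my reading of this loop.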
> > + if ModuleStyle =3D=3D EDK_NAME and Key[2] !=3D EDK_NA= ME: > > + continue > > + elif ModuleStyle =3D=3D EDKII_NAME and Key[2] !=3D ED= KII_NAME: > > + continue > > + Family =3D Key[0] > > + Target, Tag, Arch, Tool, Attr =3D Key[1].split("_") > > + # if tool chain family doesn't match, skip it > > + if Tool not in ToolDef or Family =3D=3D "": > > + continue > > + # option has been added before > > + if Family !=3D ToolDef[Tool][TAB_TOD_DEFINES_FAMILY]: > > + continue > > + > > + # expand any wildcard > > + if Target =3D=3D TAB_STAR or Target =3D=3D self.BuildTarg= et: > > + if Tag =3D=3D TAB_STAR or Tag =3D=3D self.ToolChain: > > + if Arch =3D=3D TAB_STAR or Arch =3D=3D self.Arch: > > + if Tool not in BuildOptions: > > + BuildOptions[Tool] =3D {} > > + if Attr !=3D "FLAGS" or Attr not in BuildOpti= ons[Tool] or Options[Key].startswith('=3D'): > > + BuildOptions[Tool][Attr] =3D Options[Key] > > + else: > > + # append options for the same tool except= PATH > > + if Attr !=3D 'PATH': > > + BuildOptions[Tool][Attr] +=3D " " + O= ptions[Key] > > + else: > > + BuildOptions[Tool][Attr] =3D Options[= Key] > > + return BuildOptions > > diff --git a/BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py b/Bas= eTools/Source/Python/AutoGen/WorkspaceAutoGen.py > > new file mode 100644 > > index 000000000000..22a7d996fd3b > > --- /dev/null > > +++ b/BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py > > @@ -0,0 +1,904 @@ > > +## @file > > +# Create makefile for MS nmake and GNU make > > +# > > +# Copyright (c) 2019, Intel Corporation. All rights reserved.
> > +# SPDX-License-Identifier: BSD-2-Clause-Patent > > +# > > + > > +## Import Modules > > +# > > +from __future__ import print_function > > +from __future__ import absolute_import > > +import os.path as path > > +import hashlib > > +from collections import defaultdict > > +from GenFds.FdfParser import FdfParser > > +from Workspace.WorkspaceCommon import GetModuleLibInstances > > +from AutoGen import GenMake > > +from AutoGen.AutoGen import AutoGen > > +from AutoGen.PlatformAutoGen import PlatformAutoGen > > +from AutoGen.BuildEngine import gDefaultBuildRuleFile > > +from Common.ToolDefClassObject import gDefaultToolsDefFile > > +from Common.StringUtils import NormPath > > +from Common.BuildToolError import * > > +from Common.DataType import * > > +from Common.Misc import * > > + > > +## Regular expression for splitting Dependency Expression string into= tokens > > +gDepexTokenPattern =3D re.compile("(\(|\)|\w+| \S+\.inf)") > > + > > +## Regular expression for match: PCD(xxxx.yyy) > > +gPCDAsGuidPattern =3D re.compile(r"^PCD\(.+\..+\)$") > > + > > +## Workspace AutoGen class > > +# > > +# This class is used mainly to control the whole platform build for= different > > +# architecture. This class will generate top level makefile. > > +# > > +class WorkspaceAutoGen(AutoGen): > > + # call super().__init__ then call the worker function with differ= ent parameter count > > + def __init__(self, Workspace, MetaFile, Target, Toolchain, Arch, = *args, **kwargs): > > + if not hasattr(self, "_Init"): > > + self._InitWorker(Workspace, MetaFile, Target, Toolchain, = Arch, *args, **kwargs) > > + self._Init =3D True > > + > > + ## Initialize WorkspaceAutoGen > > + # > > + # @param WorkspaceDir Root directory of workspace > > + # @param ActivePlatform Meta-file of active platform > > + # @param Target Build target > > + # @param Toolchain Tool chain name > > + # @param ArchList List of architecture of curre= nt build > > + # @param MetaFileDb Database containing meta-file= s > > + # @param BuildConfig Configuration of build > > + # @param ToolDefinition Tool chain definitions > > + # @param FlashDefinitionFile File of flash definition > > + # @param Fds FD list to be generated > > + # @param Fvs FV list to be generated > > + # @param Caps Capsule list to be generated > > + # @param SkuId SKU id from command line > > + # > > + def _InitWorker(self, WorkspaceDir, ActivePlatform, Target, Toolc= hain, ArchList, MetaFileDb, > > + BuildConfig, ToolDefinition, FlashDefinitionFile=3D'', = Fds=3DNone, Fvs=3DNone, Caps=3DNone, SkuId=3D'', UniFlag=3DNone, > > + Progress=3DNone, BuildModule=3DNone): > > + self.BuildDatabase =3D MetaFileDb > > + self.MetaFile =3D ActivePlatform > > + self.WorkspaceDir =3D WorkspaceDir > > + self.Platform =3D self.BuildDatabase[self.MetaFile, TAB= _ARCH_COMMON, Target, Toolchain] > > + GlobalData.gActivePlatform =3D self.Platform > > + self.BuildTarget =3D Target > > + self.ToolChain =3D Toolchain > > + self.ArchList =3D ArchList > > + self.SkuId =3D SkuId > > + self.UniFlag =3D UniFlag > > + > > + self.TargetTxt =3D BuildConfig > > + self.ToolDef =3D ToolDefinition > > + self.FdfFile =3D FlashDefinitionFile > > + self.FdTargetList =3D Fds if Fds else [] > > + self.FvTargetList =3D Fvs if Fvs else [] > > + self.CapTargetList =3D Caps if Caps else [] > > + self.AutoGenObjectList =3D [] > > + self._GuidDict =3D {} > > + > > + # there's many relative directory operations, so ... 
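
A quick note while I am here: the __init__/_InitWorker split with the hasattr(self, "_Init") guard reads to me as "the base class may hand back an already constructed instance, so only do the heavy setup once". Purely as a sketch of that pattern -- toy names of my own, and I am assuming the AutoGen base class caches instances by key in __new__, which I have not re-checked:

    class CachedAutoGen:
        _cache = {}

        def __new__(cls, key):
            # Same key -> same object, so __init__ may run more than
            # once on a single instance.
            if key not in cls._cache:
                cls._cache[key] = super().__new__(cls)
            return cls._cache[key]

        def __init__(self, key):
            if hasattr(self, '_Init'):
                return                  # heavy setup already done
            self._Init = True
            self.key = key              # stand-in for the real worker

    a = CachedAutoGen(('DEBUG', 'GCC5', 'X64'))
    b = CachedAutoGen(('DEBUG', 'GCC5', 'X64'))
    assert a is b

If that is indeed the intent, a short comment next to the guard would help future readers.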
> > + os.chdir(self.WorkspaceDir) > > + > > + self.MergeArch() > > + self.ValidateBuildTarget() > > + > > + EdkLogger.info("") > > + if self.ArchList: > > + EdkLogger.info('%-16s =3D %s' % ("Architecture(s)", ' '.j= oin(self.ArchList))) > > + EdkLogger.info('%-16s =3D %s' % ("Build target", self.BuildTa= rget)) > > + EdkLogger.info('%-16s =3D %s' % ("Toolchain", self.ToolChain)= ) > > + > > + EdkLogger.info('\n%-24s =3D %s' % ("Active Platform", self.Pl= atform)) > > + if BuildModule: > > + EdkLogger.info('%-24s =3D %s' % ("Active Module", BuildMo= dule)) > > + > > + if self.FdfFile: > > + EdkLogger.info('%-24s =3D %s' % ("Flash Image Definition"= , self.FdfFile)) > > + > > + EdkLogger.verbose("\nFLASH_DEFINITION =3D %s" % self.FdfFile) > > + > > + if Progress: > > + Progress.Start("\nProcessing meta-data") > > + # > > + # Mark now build in AutoGen Phase > > + # > > + GlobalData.gAutoGenPhase =3D True > > + self.ProcessModuleFromPdf() > > + self.ProcessPcdType() > > + self.ProcessMixedPcd() > > + self.VerifyPcdsFromFDF() > > + self.CollectAllPcds() > > + self.GeneratePkgLevelHash() > > + # > > + # Check PCDs token value conflict in each DEC file. > > + # > > + self._CheckAllPcdsTokenValueConflict() > > + # > > + # Check PCD type and definition between DSC and DEC > > + # > > + self._CheckPcdDefineAndType() > > + > > + self.CreateBuildOptionsFile() > > + self.CreatePcdTokenNumberFile() > > + self.CreateModuleHashInfo() > > + GlobalData.gAutoGenPhase =3D False > > + > > + # > > + # Merge Arch > > + # > > + def MergeArch(self): > > + if not self.ArchList: > > + ArchList =3D set(self.Platform.SupArchList) > > + else: > > + ArchList =3D set(self.ArchList) & set(self.Platform.SupAr= chList) > > + if not ArchList: > > + EdkLogger.error("build", PARAMETER_INVALID, > > + ExtraData =3D "Invalid ARCH specified. [V= alid ARCH: %s]" % (" ".join(self.Platform.SupArchList))) > > + elif self.ArchList and len(ArchList) !=3D len(self.ArchList): > > + SkippedArchList =3D set(self.ArchList).symmetric_differen= ce(set(self.Platform.SupArchList)) > > + EdkLogger.verbose("\nArch [%s] is ignored because the pla= tform supports [%s] only!" > > + % (" ".join(SkippedArchList), " ".join(= self.Platform.SupArchList))) > > + self.ArchList =3D tuple(ArchList) > > + > > + # Validate build target > > + def ValidateBuildTarget(self): > > + if self.BuildTarget not in self.Platform.BuildTargets: > > + EdkLogger.error("build", PARAMETER_INVALID, > > + ExtraData=3D"Build target [%s] is not sup= ported by the platform. [Valid target: %s]" > > + % (self.BuildTarget, " ".join(s= elf.Platform.BuildTargets))) > > + @cached_property > > + def FdfProfile(self): > > + if not self.FdfFile: > > + self.FdfFile =3D self.Platform.FlashDefinition > > + > > + FdfProfile =3D None > > + if self.FdfFile: > > + Fdf =3D FdfParser(self.FdfFile.Path) > > + Fdf.ParseFile() > > + GlobalData.gFdfParser =3D Fdf > > + if Fdf.CurrentFdName and Fdf.CurrentFdName in Fdf.Profile= .FdDict: > > + FdDict =3D Fdf.Profile.FdDict[Fdf.CurrentFdName] > > + for FdRegion in FdDict.RegionList: > > + if str(FdRegion.RegionType) is 'FILE' and self.Pl= atform.VpdToolGuid in str(FdRegion.RegionDataList): > > + if int(FdRegion.Offset) % 8 !=3D 0: > > + EdkLogger.error("build", FORMAT_INVALID, = 'The VPD Base Address %s must be 8-byte aligned.' % (FdRegion.Offset)) > > + FdfProfile =3D Fdf.Profile > > + else: > > + if self.FdTargetList: > > + EdkLogger.info("No flash definition file found. FD [%= s] will be ignored." 
% " ".join(self.FdTargetList)) > > + self.FdTargetList =3D [] > > + if self.FvTargetList: > > + EdkLogger.info("No flash definition file found. FV [%= s] will be ignored." % " ".join(self.FvTargetList)) > > + self.FvTargetList =3D [] > > + if self.CapTargetList: > > + EdkLogger.info("No flash definition file found. Capsu= le [%s] will be ignored." % " ".join(self.CapTargetList)) > > + self.CapTargetList =3D [] > > + > > + return FdfProfile > > + > > + def ProcessModuleFromPdf(self): > > + > > + if self.FdfProfile: > > + for fvname in self.FvTargetList: > > + if fvname.upper() not in self.FdfProfile.FvDict: > > + EdkLogger.error("build", OPTION_VALUE_INVALID, > > + "No such an FV in FDF file: %s" %= fvname) > > + > > + # In DSC file may use FILE_GUID to override the module, t= hen in the Platform.Modules use FILE_GUIDmodule.inf as key, > > + # but the path (self.MetaFile.Path) is the real path > > + for key in self.FdfProfile.InfDict: > > + if key =3D=3D 'ArchTBD': > > + MetaFile_cache =3D defaultdict(set) > > + for Arch in self.ArchList: > > + Current_Platform_cache =3D self.BuildDatabase= [self.MetaFile, Arch, self.BuildTarget, self.ToolChain] > > + for Pkey in Current_Platform_cache.Modules: > > + MetaFile_cache[Arch].add(Current_Platform= _cache.Modules[Pkey].MetaFile) > > + for Inf in self.FdfProfile.InfDict[key]: > > + ModuleFile =3D PathClass(NormPath(Inf), Globa= lData.gWorkspace, Arch) > > + for Arch in self.ArchList: > > + if ModuleFile in MetaFile_cache[Arch]: > > + break > > + else: > > + ModuleData =3D self.BuildDatabase[ModuleF= ile, Arch, self.BuildTarget, self.ToolChain] > > + if not ModuleData.IsBinaryModule: > > + EdkLogger.error('build', PARSER_ERROR= , "Module %s NOT found in DSC file; Is it really a binary module?" % Module= File) > > + > > + else: > > + for Arch in self.ArchList: > > + if Arch =3D=3D key: > > + Platform =3D self.BuildDatabase[self.Meta= File, Arch, self.BuildTarget, self.ToolChain] > > + MetaFileList =3D set() > > + for Pkey in Platform.Modules: > > + MetaFileList.add(Platform.Modules[Pke= y].MetaFile) > > + for Inf in self.FdfProfile.InfDict[key]: > > + ModuleFile =3D PathClass(NormPath(Inf= ), GlobalData.gWorkspace, Arch) > > + if ModuleFile in MetaFileList: > > + continue > > + ModuleData =3D self.BuildDatabase[Mod= uleFile, Arch, self.BuildTarget, self.ToolChain] > > + if not ModuleData.IsBinaryModule: > > + EdkLogger.error('build', PARSER_E= RROR, "Module %s NOT found in DSC file; Is it really a binary module?" 
% Mo= duleFile) > > + > > + > > + > > + # parse FDF file to get PCDs in it, if any > > + def VerifyPcdsFromFDF(self): > > + > > + if self.FdfProfile: > > + PcdSet =3D self.FdfProfile.PcdDict > > + self.VerifyPcdDeclearation(PcdSet) > > + > > + def ProcessPcdType(self): > > + for Arch in self.ArchList: > > + Platform =3D self.BuildDatabase[self.MetaFile, Arch, self= .BuildTarget, self.ToolChain] > > + Platform.Pcds > > + # generate the SourcePcdDict and BinaryPcdDict > > + Libs =3D [] > > + for BuildData in list(self.BuildDatabase._CACHE_.values()= ): > > + if BuildData.Arch !=3D Arch: > > + continue > > + if BuildData.MetaFile.Ext =3D=3D '.inf' and str(Build= Data) in Platform.Modules : > > + Libs.extend(GetModuleLibInstances(BuildData, Plat= form, > > + self.BuildDatabase, > > + Arch, > > + self.BuildTarget, > > + self.ToolChain > > + )) > > + for BuildData in list(self.BuildDatabase._CACHE_.values()= ): > > + if BuildData.Arch !=3D Arch: > > + continue > > + if BuildData.MetaFile.Ext =3D=3D '.inf': > > + for key in BuildData.Pcds: > > + if BuildData.Pcds[key].Pending: > > + if key in Platform.Pcds: > > + PcdInPlatform =3D Platform.Pcds[key] > > + if PcdInPlatform.Type: > > + BuildData.Pcds[key].Type =3D PcdI= nPlatform.Type > > + BuildData.Pcds[key].Pending =3D F= alse > > + > > + if BuildData.MetaFile in Platform.Modules= : > > + PlatformModule =3D Platform.Modules[s= tr(BuildData.MetaFile)] > > + if key in PlatformModule.Pcds: > > + PcdInPlatform =3D PlatformModule.= Pcds[key] > > + if PcdInPlatform.Type: > > + BuildData.Pcds[key].Type =3D = PcdInPlatform.Type > > + BuildData.Pcds[key].Pending = =3D False > > + else: > > + #Pcd used in Library, Pcd Type from r= eference module if Pcd Type is Pending > > + if BuildData.Pcds[key].Pending: > > + if bool(BuildData.LibraryClass): > > + if BuildData in set(Libs): > > + ReferenceModules =3D Buil= dData.ReferenceModules > > + for ReferenceModule in Re= ferenceModules: > > + if ReferenceModule.Me= taFile in Platform.Modules: > > + RefPlatformModule= =3D Platform.Modules[str(ReferenceModule.MetaFile)] > > + if key in RefPlat= formModule.Pcds: > > + PcdInReferenc= eModule =3D RefPlatformModule.Pcds[key] > > + if PcdInRefer= enceModule.Type: > > + BuildData= .Pcds[key].Type =3D PcdInReferenceModule.Type > > + BuildData= .Pcds[key].Pending =3D False > > + break > > + > > + def ProcessMixedPcd(self): > > + for Arch in self.ArchList: > > + SourcePcdDict =3D {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PA= TCHABLE_IN_MODULE:set(),TAB_PCDS_DYNAMIC:set(),TAB_PCDS_FIXED_AT_BUILD:set(= )} > > + BinaryPcdDict =3D {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PA= TCHABLE_IN_MODULE:set()} > > + SourcePcdDict_Keys =3D SourcePcdDict.keys() > > + BinaryPcdDict_Keys =3D BinaryPcdDict.keys() > > + > > + # generate the SourcePcdDict and BinaryPcdDict > > + > > + for BuildData in list(self.BuildDatabase._CACHE_.values()= ): > > + if BuildData.Arch !=3D Arch: > > + continue > > + if BuildData.MetaFile.Ext =3D=3D '.inf': > > + for key in BuildData.Pcds: > > + if TAB_PCDS_DYNAMIC_EX in BuildData.Pcds[key]= .Type: > > + if BuildData.IsBinaryModule: > > + BinaryPcdDict[TAB_PCDS_DYNAMIC_EX].ad= d((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName)= ) > > + else: > > + SourcePcdDict[TAB_PCDS_DYNAMIC_EX].ad= d((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName)= ) > > + > > + elif TAB_PCDS_PATCHABLE_IN_MODULE in BuildDat= a.Pcds[key].Type: > > + if BuildData.MetaFile.Ext =3D=3D '.inf': > > + if BuildData.IsBinaryModule: > > + 
BinaryPcdDict[TAB_PCDS_PATCHABLE_= IN_MODULE].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSp= aceGuidCName)) > > + else: > > + SourcePcdDict[TAB_PCDS_PATCHABLE_= IN_MODULE].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSp= aceGuidCName)) > > + > > + elif TAB_PCDS_DYNAMIC in BuildData.Pcds[key].= Type: > > + SourcePcdDict[TAB_PCDS_DYNAMIC].add((Buil= dData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName)) > > + elif TAB_PCDS_FIXED_AT_BUILD in BuildData.Pcd= s[key].Type: > > + SourcePcdDict[TAB_PCDS_FIXED_AT_BUILD].ad= d((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName)= ) > > + > > + # > > + # A PCD can only use one type for all source modules > > + # > > + for i in SourcePcdDict_Keys: > > + for j in SourcePcdDict_Keys: > > + if i !=3D j: > > + Intersections =3D SourcePcdDict[i].intersecti= on(SourcePcdDict[j]) > > + if len(Intersections) > 0: > > + EdkLogger.error( > > + 'build', > > + FORMAT_INVALID, > > + "Building modules from source INFs, follo= wing PCD use %s and %s access method. It must be corrected to use only one = access method." % (i, j), > > + ExtraData=3D'\n\t'.join(str(P[1]+'.'+P[0]= ) for P in Intersections) > > + ) > > + > > + # > > + # intersection the BinaryPCD for Mixed PCD > > + # > > + for i in BinaryPcdDict_Keys: > > + for j in BinaryPcdDict_Keys: > > + if i !=3D j: > > + Intersections =3D BinaryPcdDict[i].intersecti= on(BinaryPcdDict[j]) > > + for item in Intersections: > > + NewPcd1 =3D (item[0] + '_' + i, item[1]) > > + NewPcd2 =3D (item[0] + '_' + j, item[1]) > > + if item not in GlobalData.MixedPcd: > > + GlobalData.MixedPcd[item] =3D [NewPcd= 1, NewPcd2] > > + else: > > + if NewPcd1 not in GlobalData.MixedPcd= [item]: > > + GlobalData.MixedPcd[item].append(= NewPcd1) > > + if NewPcd2 not in GlobalData.MixedPcd= [item]: > > + GlobalData.MixedPcd[item].append(= NewPcd2) > > + > > + # > > + # intersection the SourcePCD and BinaryPCD for Mixed PCD > > + # > > + for i in SourcePcdDict_Keys: > > + for j in BinaryPcdDict_Keys: > > + if i !=3D j: > > + Intersections =3D SourcePcdDict[i].intersecti= on(BinaryPcdDict[j]) > > + for item in Intersections: > > + NewPcd1 =3D (item[0] + '_' + i, item[1]) > > + NewPcd2 =3D (item[0] + '_' + j, item[1]) > > + if item not in GlobalData.MixedPcd: > > + GlobalData.MixedPcd[item] =3D [NewPcd= 1, NewPcd2] > > + else: > > + if NewPcd1 not in GlobalData.MixedPcd= [item]: > > + GlobalData.MixedPcd[item].append(= NewPcd1) > > + if NewPcd2 not in GlobalData.MixedPcd= [item]: > > + GlobalData.MixedPcd[item].append(= NewPcd2) > > + > > + BuildData =3D self.BuildDatabase[self.MetaFile, Arch, sel= f.BuildTarget, self.ToolChain] > > + for key in BuildData.Pcds: > > + for SinglePcd in GlobalData.MixedPcd: > > + if (BuildData.Pcds[key].TokenCName, BuildData.Pcd= s[key].TokenSpaceGuidCName) =3D=3D SinglePcd: > > + for item in GlobalData.MixedPcd[SinglePcd]: > > + Pcd_Type =3D item[0].split('_')[-1] > > + if (Pcd_Type =3D=3D BuildData.Pcds[key].T= ype) or (Pcd_Type =3D=3D TAB_PCDS_DYNAMIC_EX and BuildData.Pcds[key].Type i= n PCD_DYNAMIC_EX_TYPE_SET) or \ > > + (Pcd_Type =3D=3D TAB_PCDS_DYNAMIC and = BuildData.Pcds[key].Type in PCD_DYNAMIC_TYPE_SET): > > + Value =3D BuildData.Pcds[key] > > + Value.TokenCName =3D BuildData.Pcds[k= ey].TokenCName + '_' + Pcd_Type > > + if len(key) =3D=3D 2: > > + newkey =3D (Value.TokenCName, key= [1]) > > + elif len(key) =3D=3D 3: > > + newkey =3D (Value.TokenCName, key= [1], key[2]) > > + del BuildData.Pcds[key] > > + BuildData.Pcds[newkey] =3D 
Value > > + break > > + break > > + > > + if self.FdfProfile: > > + PcdSet =3D self.FdfProfile.PcdDict > > + # handle the mixed pcd in FDF file > > + for key in PcdSet: > > + if key in GlobalData.MixedPcd: > > + Value =3D PcdSet[key] > > + del PcdSet[key] > > + for item in GlobalData.MixedPcd[key]: > > + PcdSet[item] =3D Value > > + > > + #Collect package set information from INF of FDF > > + @cached_property > > + def PkgSet(self): > > + if not self.FdfFile: > > + self.FdfFile =3D self.Platform.FlashDefinition > > + > > + if self.FdfFile: > > + ModuleList =3D self.FdfProfile.InfList > > + else: > > + ModuleList =3D [] > > + Pkgs =3D {} > > + for Arch in self.ArchList: > > + Platform =3D self.BuildDatabase[self.MetaFile, Arch, self= .BuildTarget, self.ToolChain] > > + PkgSet =3D set() > > + for mb in [self.BuildDatabase[m, Arch, self.BuildTarget, = self.ToolChain] for m in Platform.Modules]: > > + PkgSet.update(mb.Packages) > > + for Inf in ModuleList: > > + ModuleFile =3D PathClass(NormPath(Inf), GlobalData.gW= orkspace, Arch) > > + if ModuleFile in Platform.Modules: > > + continue > > + ModuleData =3D self.BuildDatabase[ModuleFile, Arch, s= elf.BuildTarget, self.ToolChain] > > + PkgSet.update(ModuleData.Packages) > > + Pkgs[Arch] =3D list(PkgSet) > > + return Pkgs > > + > > + def VerifyPcdDeclearation(self,PcdSet): > > + for Arch in self.ArchList: > > + Platform =3D self.BuildDatabase[self.MetaFile, Arch, self= .BuildTarget, self.ToolChain] > > + Pkgs =3D self.PkgSet[Arch] > > + DecPcds =3D set() > > + DecPcdsKey =3D set() > > + for Pkg in Pkgs: > > + for Pcd in Pkg.Pcds: > > + DecPcds.add((Pcd[0], Pcd[1])) > > + DecPcdsKey.add((Pcd[0], Pcd[1], Pcd[2])) > > + > > + Platform.SkuName =3D self.SkuId > > + for Name, Guid,Fileds in PcdSet: > > + if (Name, Guid) not in DecPcds: > > + EdkLogger.error( > > + 'build', > > + PARSER_ERROR, > > + "PCD (%s.%s) used in FDF is not declared in D= EC files." % (Guid, Name), > > + File =3D self.FdfProfile.PcdFileLineDict[Name= , Guid, Fileds][0], > > + Line =3D self.FdfProfile.PcdFileLineDict[Name= , Guid, Fileds][1] > > + ) > > + else: > > + # Check whether Dynamic or DynamicEx PCD used in = FDF file. If used, build break and give a error message. > > + if (Name, Guid, TAB_PCDS_FIXED_AT_BUILD) in DecPc= dsKey \ > > + or (Name, Guid, TAB_PCDS_PATCHABLE_IN_MODULE)= in DecPcdsKey \ > > + or (Name, Guid, TAB_PCDS_FEATURE_FLAG) in Dec= PcdsKey: > > + continue > > + elif (Name, Guid, TAB_PCDS_DYNAMIC) in DecPcdsKey= or (Name, Guid, TAB_PCDS_DYNAMIC_EX) in DecPcdsKey: > > + EdkLogger.error( > > + 'build', > > + PARSER_ERROR, > > + "Using Dynamic or DynamicEx type of P= CD [%s.%s] in FDF file is not allowed." % (Guid, Name), > > + File =3D self.FdfProfile.PcdFileLineD= ict[Name, Guid, Fileds][0], > > + Line =3D self.FdfProfile.PcdFileLineD= ict[Name, Guid, Fileds][1] > > + ) > > + def CollectAllPcds(self): > > + > > + for Arch in self.ArchList: > > + Pa =3D PlatformAutoGen(self, self.MetaFile, self.BuildTar= get, self.ToolChain, Arch) > > + # > > + # Explicitly collect platform's dynamic PCDs > > + # > > + Pa.CollectPlatformDynamicPcds() > > + Pa.CollectFixedAtBuildPcds() > > + self.AutoGenObjectList.append(Pa) > > + # We need to calculate the PcdTokenNumber after all Arch Pcds= are collected. 
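
On the comment just above: my reading (and only my reading -- the names below are mine) is that dynamic PCDs can only get stable token numbers once every arch has contributed its PCD list, and the resulting dict is then pushed into each module's DataPipe. Roughly:

    def assign_token_numbers(dynamic_pcds_per_arch):
        # Number each (TokenCName, TokenSpaceGuidCName) pair exactly
        # once, no matter how many arches report it.
        token_numbers = {}
        next_token = 1
        for arch in sorted(dynamic_pcds_per_arch):
            for cname, guid in dynamic_pcds_per_arch[arch]:
                if (cname, guid) not in token_numbers:
                    token_numbers[(cname, guid)] = next_token
                    next_token += 1
        return token_numbers

    pcds = {
        'X64':  [('PcdFoo', 'gMyTokenSpaceGuid'), ('PcdBar', 'gMyTokenSpaceGuid')],
        'IA32': [('PcdFoo', 'gMyTokenSpaceGuid')],   # shared PCD, same number
    }
    print(assign_token_numbers(pcds))

I have not checked how PcdTokenNumber actually orders things, so treat this strictly as a restatement of the comment, not of the implementation.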
> > + for Arch in self.ArchList: > > + #Pcd TokenNumber > > + Pa =3D PlatformAutoGen(self, self.MetaFile, self.BuildTar= get, self.ToolChain, Arch) > > + self.UpdateModuleDataPipe(Arch, {"PCD_TNUM":Pa.PcdTokenN= umber}) > > + > > + def UpdateModuleDataPipe(self,arch, attr_dict): > > + for (Target, Toolchain, Arch, MetaFile) in AutoGen.Cache(): > > + if Arch !=3D arch: > > + continue > > + try: > > + AutoGen.Cache()[(Target, Toolchain, Arch, MetaFile)].= DataPipe.DataContainer =3D attr_dict > > + except Exception: > > + pass > > + # > > + # Generate Package level hash value > > + # > > + def GeneratePkgLevelHash(self): > > + for Arch in self.ArchList: > > + GlobalData.gPackageHash =3D {} > > + if GlobalData.gUseHashCache: > > + for Pkg in self.PkgSet[Arch]: > > + self._GenPkgLevelHash(Pkg) > > + > > + > > + def CreateBuildOptionsFile(self): > > + # > > + # Create BuildOptions Macro & PCD metafile, also add the Acti= ve Platform and FDF file. > > + # > > + content =3D 'gCommandLineDefines: ' > > + content +=3D str(GlobalData.gCommandLineDefines) > > + content +=3D TAB_LINE_BREAK > > + content +=3D 'BuildOptionPcd: ' > > + content +=3D str(GlobalData.BuildOptionPcd) > > + content +=3D TAB_LINE_BREAK > > + content +=3D 'Active Platform: ' > > + content +=3D str(self.Platform) > > + content +=3D TAB_LINE_BREAK > > + if self.FdfFile: > > + content +=3D 'Flash Image Definition: ' > > + content +=3D str(self.FdfFile) > > + content +=3D TAB_LINE_BREAK > > + SaveFileOnChange(os.path.join(self.BuildDir, 'BuildOptions'),= content, False) > > + > > + def CreatePcdTokenNumberFile(self): > > + # > > + # Create PcdToken Number file for Dynamic/DynamicEx Pcd. > > + # > > + PcdTokenNumber =3D 'PcdTokenNumber: ' > > + Pa =3D self.AutoGenObjectList[0] > > + if Pa.PcdTokenNumber: > > + if Pa.DynamicPcdList: > > + for Pcd in Pa.DynamicPcdList: > > + PcdTokenNumber +=3D TAB_LINE_BREAK > > + PcdTokenNumber +=3D str((Pcd.TokenCName, Pcd.Toke= nSpaceGuidCName)) > > + PcdTokenNumber +=3D ' : ' > > + PcdTokenNumber +=3D str(Pa.PcdTokenNumber[Pcd.Tok= enCName, Pcd.TokenSpaceGuidCName]) > > + SaveFileOnChange(os.path.join(self.BuildDir, 'PcdTokenNumber'= ), PcdTokenNumber, False) > > + > > + def CreateModuleHashInfo(self): > > + # > > + # Get set of workspace metafiles > > + # > > + AllWorkSpaceMetaFiles =3D self._GetMetaFiles(self.BuildTarget= , self.ToolChain) > > + > > + # > > + # Retrieve latest modified time of all metafiles > > + # > > + SrcTimeStamp =3D 0 > > + for f in AllWorkSpaceMetaFiles: > > + if os.stat(f)[8] > SrcTimeStamp: > > + SrcTimeStamp =3D os.stat(f)[8] > > + self._SrcTimeStamp =3D SrcTimeStamp > > + > > + if GlobalData.gUseHashCache: > > + m =3D hashlib.md5() > > + for files in AllWorkSpaceMetaFiles: > > + if files.endswith('.dec'): > > + continue > > + f =3D open(files, 'rb') > > + Content =3D f.read() > > + f.close() > > + m.update(Content) > > + SaveFileOnChange(os.path.join(self.BuildDir, 'AutoGen.has= h'), m.hexdigest(), False) > > + GlobalData.gPlatformHash =3D m.hexdigest() > > + > > + # > > + # Write metafile list to build directory > > + # > > + AutoGenFilePath =3D os.path.join(self.BuildDir, 'AutoGen') > > + if os.path.exists (AutoGenFilePath): > > + os.remove(AutoGenFilePath) > > + if not os.path.exists(self.BuildDir): > > + os.makedirs(self.BuildDir) > > + with open(os.path.join(self.BuildDir, 'AutoGen'), 'w+') as fi= le: > > + for f in AllWorkSpaceMetaFiles: > > + print(f, file=3Dfile) > > + return True > > + > > + def _GenPkgLevelHash(self, Pkg): > > + if Pkg.PackageName in 
GlobalData.gPackageHash: > > + return > > + > > + PkgDir =3D os.path.join(self.BuildDir, Pkg.Arch, Pkg.PackageN= ame) > > + CreateDirectory(PkgDir) > > + HashFile =3D os.path.join(PkgDir, Pkg.PackageName + '.hash') > > + m =3D hashlib.md5() > > + # Get .dec file's hash value > > + f =3D open(Pkg.MetaFile.Path, 'rb') > > + Content =3D f.read() > > + f.close() > > + m.update(Content) > > + # Get include files hash value > > + if Pkg.Includes: > > + for inc in sorted(Pkg.Includes, key=3Dlambda x: str(x)): > > + for Root, Dirs, Files in os.walk(str(inc)): > > + for File in sorted(Files): > > + File_Path =3D os.path.join(Root, File) > > + f =3D open(File_Path, 'rb') > > + Content =3D f.read() > > + f.close() > > + m.update(Content) > > + SaveFileOnChange(HashFile, m.hexdigest(), False) > > + GlobalData.gPackageHash[Pkg.PackageName] =3D m.hexdigest() > > + > > + def _GetMetaFiles(self, Target, Toolchain): > > + AllWorkSpaceMetaFiles =3D set() > > + # > > + # add fdf > > + # > > + if self.FdfFile: > > + AllWorkSpaceMetaFiles.add (self.FdfFile.Path) > > + for f in GlobalData.gFdfParser.GetAllIncludedFile(): > > + AllWorkSpaceMetaFiles.add (f.FileName) > > + # > > + # add dsc > > + # > > + AllWorkSpaceMetaFiles.add(self.MetaFile.Path) > > + > > + # > > + # add build_rule.txt & tools_def.txt > > + # > > + AllWorkSpaceMetaFiles.add(os.path.join(GlobalData.gConfDirect= ory, gDefaultBuildRuleFile)) > > + AllWorkSpaceMetaFiles.add(os.path.join(GlobalData.gConfDirect= ory, gDefaultToolsDefFile)) > > + > > + # add BuildOption metafile > > + # > > + AllWorkSpaceMetaFiles.add(os.path.join(self.BuildDir, 'BuildO= ptions')) > > + > > + # add PcdToken Number file for Dynamic/DynamicEx Pcd > > + # > > + AllWorkSpaceMetaFiles.add(os.path.join(self.BuildDir, 'PcdTok= enNumber')) > > + > > + for Pa in self.AutoGenObjectList: > > + AllWorkSpaceMetaFiles.add(Pa.ToolDefinitionFile) > > + > > + for Arch in self.ArchList: > > + # > > + # add dec > > + # > > + for Package in PlatformAutoGen(self, self.MetaFile, Targe= t, Toolchain, Arch).PackageList: > > + AllWorkSpaceMetaFiles.add(Package.MetaFile.Path) > > + > > + # > > + # add included dsc > > + # > > + for filePath in self.BuildDatabase[self.MetaFile, Arch, T= arget, Toolchain]._RawData.IncludedFiles: > > + AllWorkSpaceMetaFiles.add(filePath.Path) > > + > > + return AllWorkSpaceMetaFiles > > + > > + def _CheckPcdDefineAndType(self): > > + PcdTypeSet =3D {TAB_PCDS_FIXED_AT_BUILD, > > + TAB_PCDS_PATCHABLE_IN_MODULE, > > + TAB_PCDS_FEATURE_FLAG, > > + TAB_PCDS_DYNAMIC, > > + TAB_PCDS_DYNAMIC_EX} > > + > > + # This dict store PCDs which are not used by any modules with= specified arches > > + UnusedPcd =3D OrderedDict() > > + for Pa in self.AutoGenObjectList: > > + # Key of DSC's Pcds dictionary is PcdCName, TokenSpaceGui= d > > + for Pcd in Pa.Platform.Pcds: > > + PcdType =3D Pa.Platform.Pcds[Pcd].Type > > + > > + # If no PCD type, this PCD comes from FDF > > + if not PcdType: > > + continue > > + > > + # Try to remove Hii and Vpd suffix > > + if PcdType.startswith(TAB_PCDS_DYNAMIC_EX): > > + PcdType =3D TAB_PCDS_DYNAMIC_EX > > + elif PcdType.startswith(TAB_PCDS_DYNAMIC): > > + PcdType =3D TAB_PCDS_DYNAMIC > > + > > + for Package in Pa.PackageList: > > + # Key of DEC's Pcds dictionary is PcdCName, Token= SpaceGuid, PcdType > > + if (Pcd[0], Pcd[1], PcdType) in Package.Pcds: > > + break > > + for Type in PcdTypeSet: > > + if (Pcd[0], Pcd[1], Type) in Package.Pcds: > > + EdkLogger.error( > > + 'build', > > + FORMAT_INVALID, > > + "Type [%s] of PCD [%s.%s] in DSC file= 
doesn't match the type [%s] defined in DEC file." \ > > + % (Pa.Platform.Pcds[Pcd].Type, Pcd[1]= , Pcd[0], Type), > > + ExtraData=3DNone > > + ) > > + return > > + else: > > + UnusedPcd.setdefault(Pcd, []).append(Pa.Arch) > > + > > + for Pcd in UnusedPcd: > > + EdkLogger.warn( > > + 'build', > > + "The PCD was not specified by any INF module in the p= latform for the given architecture.\n" > > + "\tPCD: [%s.%s]\n\tPlatform: [%s]\n\tArch: %s" > > + % (Pcd[1], Pcd[0], os.path.basename(str(self.MetaFile= )), str(UnusedPcd[Pcd])), > > + ExtraData=3DNone > > + ) > > + > > + def __repr__(self): > > + return "%s [%s]" % (self.MetaFile, ", ".join(self.ArchList)) > > + > > + ## Return the directory to store FV files > > + @cached_property > > + def FvDir(self): > > + return path.join(self.BuildDir, TAB_FV_DIRECTORY) > > + > > + ## Return the directory to store all intermediate and final files= built > > + @cached_property > > + def BuildDir(self): > > + return self.AutoGenObjectList[0].BuildDir > > + > > + ## Return the build output directory platform specifies > > + @cached_property > > + def OutputDir(self): > > + return self.Platform.OutputDirectory > > + > > + ## Return platform name > > + @cached_property > > + def Name(self): > > + return self.Platform.PlatformName > > + > > + ## Return meta-file GUID > > + @cached_property > > + def Guid(self): > > + return self.Platform.Guid > > + > > + ## Return platform version > > + @cached_property > > + def Version(self): > > + return self.Platform.Version > > + > > + ## Return paths of tools > > + @cached_property > > + def ToolDefinition(self): > > + return self.AutoGenObjectList[0].ToolDefinition > > + > > + ## Return directory of platform makefile > > + # > > + # @retval string Makefile directory > > + # > > + @cached_property > > + def MakeFileDir(self): > > + return self.BuildDir > > + > > + ## Return build command string > > + # > > + # @retval string Build command string > > + # > > + @cached_property > > + def BuildCommand(self): > > + # BuildCommand should be all the same. So just get one from p= latform AutoGen > > + return self.AutoGenObjectList[0].BuildCommand > > + > > + ## Check the PCDs token value conflict in each DEC file. > > + # > > + # Will cause build break and raise error message while two PCDs c= onflict. 
> > + # > > + # @return None > > + # > > + def _CheckAllPcdsTokenValueConflict(self): > > + for Pa in self.AutoGenObjectList: > > + for Package in Pa.PackageList: > > + PcdList =3D list(Package.Pcds.values()) > > + PcdList.sort(key=3Dlambda x: int(x.TokenValue, 0)) > > + Count =3D 0 > > + while (Count < len(PcdList) - 1) : > > + Item =3D PcdList[Count] > > + ItemNext =3D PcdList[Count + 1] > > + # > > + # Make sure in the same token space the TokenValu= e should be unique > > + # > > + if (int(Item.TokenValue, 0) =3D=3D int(ItemNext.T= okenValue, 0)): > > + SameTokenValuePcdList =3D [] > > + SameTokenValuePcdList.append(Item) > > + SameTokenValuePcdList.append(ItemNext) > > + RemainPcdListLength =3D len(PcdList) - Count = - 2 > > + for ValueSameCount in range(RemainPcdListLeng= th): > > + if int(PcdList[len(PcdList) - RemainPcdLi= stLength + ValueSameCount].TokenValue, 0) =3D=3D int(Item.TokenValue, 0): > > + SameTokenValuePcdList.append(PcdList[= len(PcdList) - RemainPcdListLength + ValueSameCount]) > > + else: > > + break; > > + # > > + # Sort same token value PCD list with TokenGu= id and TokenCName > > + # > > + SameTokenValuePcdList.sort(key=3Dlambda x: "%= s.%s" % (x.TokenSpaceGuidCName, x.TokenCName)) > > + SameTokenValuePcdListCount =3D 0 > > + while (SameTokenValuePcdListCount < len(SameT= okenValuePcdList) - 1): > > + Flag =3D False > > + TemListItem =3D SameTokenValuePcdList[Sam= eTokenValuePcdListCount] > > + TemListItemNext =3D SameTokenValuePcdList= [SameTokenValuePcdListCount + 1] > > + > > + if (TemListItem.TokenSpaceGuidCName =3D= =3D TemListItemNext.TokenSpaceGuidCName) and (TemListItem.TokenCName !=3D = TemListItemNext.TokenCName): > > + for PcdItem in GlobalData.MixedPcd: > > + if (TemListItem.TokenCName, TemLi= stItem.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem] or \ > > + (TemListItemNext.TokenCName, = TemListItemNext.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]: > > + Flag =3D True > > + if not Flag: > > + EdkLogger.error( > > + 'build', > > + FORMAT_INVALID, > > + "The TokenValue [%s] = of PCD [%s.%s] is conflict with: [%s.%s] in %s"\ > > + % (TemListItem.TokenV= alue, TemListItem.TokenSpaceGuidCName, TemListItem.TokenCName, TemListItemN= ext.TokenSpaceGuidCName, TemListItemNext.TokenCName, Package), > > + ExtraData=3DNone > > + ) > > + SameTokenValuePcdListCount +=3D 1 > > + Count +=3D SameTokenValuePcdListCount > > + Count +=3D 1 > > + > > + PcdList =3D list(Package.Pcds.values()) > > + PcdList.sort(key=3Dlambda x: "%s.%s" % (x.TokenSpaceG= uidCName, x.TokenCName)) > > + Count =3D 0 > > + while (Count < len(PcdList) - 1) : > > + Item =3D PcdList[Count] > > + ItemNext =3D PcdList[Count + 1] > > + # > > + # Check PCDs with same TokenSpaceGuidCName.TokenC= Name have same token value as well. 
> > + # > > + if (Item.TokenSpaceGuidCName =3D=3D ItemNext.Toke= nSpaceGuidCName) and (Item.TokenCName =3D=3D ItemNext.TokenCName) and (int(= Item.TokenValue, 0) !=3D int(ItemNext.TokenValue, 0)): > > + EdkLogger.error( > > + 'build', > > + FORMAT_INVALID, > > + "The TokenValue [%s] of PCD [%s.%= s] in %s defined in two places should be same as well."\ > > + % (Item.TokenValue, Item.TokenSpa= ceGuidCName, Item.TokenCName, Package), > > + ExtraData=3DNone > > + ) > > + Count +=3D 1 > > + ## Generate fds command > > + @property > > + def GenFdsCommand(self): > > + return (GenMake.TopLevelMakefile(self)._TEMPLATE_.Replace(Gen= Make.TopLevelMakefile(self)._TemplateDict)).strip() > > + > > + @property > > + def GenFdsCommandDict(self): > > + FdsCommandDict =3D {} > > + LogLevel =3D EdkLogger.GetLevel() > > + if LogLevel =3D=3D EdkLogger.VERBOSE: > > + FdsCommandDict["verbose"] =3D True > > + elif LogLevel <=3D EdkLogger.DEBUG_9: > > + FdsCommandDict["debug"] =3D LogLevel - 1 > > + elif LogLevel =3D=3D EdkLogger.QUIET: > > + FdsCommandDict["quiet"] =3D True > > + > > + if GlobalData.gEnableGenfdsMultiThread: > > + FdsCommandDict["GenfdsMultiThread"] =3D True > > + if GlobalData.gIgnoreSource: > > + FdsCommandDict["IgnoreSources"] =3D True > > + > > + FdsCommandDict["OptionPcd"] =3D [] > > + for pcd in GlobalData.BuildOptionPcd: > > + if pcd[2]: > > + pcdname =3D '.'.join(pcd[0:3]) > > + else: > > + pcdname =3D '.'.join(pcd[0:2]) > > + if pcd[3].startswith('{'): > > + FdsCommandDict["OptionPcd"].append(pcdname + '=3D' + = 'H' + '"' + pcd[3] + '"') > > + else: > > + FdsCommandDict["OptionPcd"].append(pcdname + '=3D' + = pcd[3]) > > + > > + MacroList =3D [] > > + # macros passed to GenFds > > + MacroDict =3D {} > > + MacroDict.update(GlobalData.gGlobalDefines) > > + MacroDict.update(GlobalData.gCommandLineDefines) > > + for MacroName in MacroDict: > > + if MacroDict[MacroName] !=3D "": > > + MacroList.append('"%s=3D%s"' % (MacroName, MacroDict[= MacroName].replace('\\', '\\\\'))) > > + else: > > + MacroList.append('"%s"' % MacroName) > > + FdsCommandDict["macro"] =3D MacroList > > + > > + FdsCommandDict["fdf_file"] =3D [self.FdfFile] > > + FdsCommandDict["build_target"] =3D self.BuildTarget > > + FdsCommandDict["toolchain_tag"] =3D self.ToolChain > > + FdsCommandDict["active_platform"] =3D str(self) > > + > > + FdsCommandDict["conf_directory"] =3D GlobalData.gConfDirector= y > > + FdsCommandDict["build_architecture_list"] =3D ','.join(self.A= rchList) > > + FdsCommandDict["platform_build_directory"] =3D self.BuildDir > > + > > + FdsCommandDict["fd"] =3D self.FdTargetList > > + FdsCommandDict["fv"] =3D self.FvTargetList > > + FdsCommandDict["cap"] =3D self.CapTargetList > > + return FdsCommandDict > > + > > + ## Create makefile for the platform and modules in it > > + # > > + # @param CreateDepsMakeFile Flag indicating if the ma= kefile for > > + # modules will be created a= s well > > + # > > + def CreateMakeFile(self, CreateDepsMakeFile=3DFalse): > > + if not CreateDepsMakeFile: > > + return > > + for Pa in self.AutoGenObjectList: > > + Pa.CreateMakeFile(True) > > + > > + ## Create autogen code for platform and modules > > + # > > + # Since there's no autogen code for platform, this method will d= o nothing > > + # if CreateModuleCodeFile is set to False. 
> > + # > > + # @param CreateDepsCodeFile Flag indicating if creati= ng module's > > + # autogen code file or not > > + # > > + def CreateCodeFile(self, CreateDepsCodeFile=3DFalse): > > + if not CreateDepsCodeFile: > > + return > > + for Pa in self.AutoGenObjectList: > > + Pa.CreateCodeFile(True) > > + > > + ## Create AsBuilt INF file the platform > > + # > > + def CreateAsBuiltInf(self): > > + return > > + > > diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source= /Python/Common/Misc.py > > index 1caa184eb923..26d149c27040 100644 > > --- a/BaseTools/Source/Python/Common/Misc.py > > +++ b/BaseTools/Source/Python/Common/Misc.py > > @@ -647,11 +647,10 @@ def GuidValue(CName, PackageList, Inffile =3D No= ne): > > if not Inffile.startswith(P.MetaFile.Dir): > > GuidKeys =3D [x for x in P.Guids if x not in P._Priva= teGuids] > > if CName in GuidKeys: > > return P.Guids[CName] > > return None > > - return None > > > > ## A string template class > > # > > # This class implements a template for string replacement. A string = template > > # looks like following > > diff --git a/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py b/= BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py > > index 02735e165ca1..d35cd792704c 100644 > > --- a/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py > > +++ b/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py > > @@ -9,11 +9,10 @@ > > # Import Modules > > # > > import Common.LongFilePathOs as os > > from Common.LongFilePathSupport import OpenLongFilePath as open > > import sys > > -import re > > > > from optparse import OptionParser > > from optparse import make_option > > from Common.BuildToolError import * > > import Common.EdkLogger as EdkLogger > > diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseT= ools/Source/Python/Workspace/DscBuildData.py > > index fa41e57c4f45..383aeaaa15c3 100644 > > --- a/BaseTools/Source/Python/Workspace/DscBuildData.py > > +++ b/BaseTools/Source/Python/Workspace/DscBuildData.py > > @@ -1371,15 +1371,15 @@ class DscBuildData(PlatformBuildClassObject): > > if PcdInDec.Type in [self._PCD_TYPE_STRING_[MODEL= _PCD_FIXED_AT_BUILD], > > self._PCD_TYPE_STRING_[MODEL_= PCD_PATCHABLE_IN_MODULE], > > self._PCD_TYPE_STRING_[MODEL_= PCD_FEATURE_FLAG], > > self._PCD_TYPE_STRING_[MODEL_= PCD_DYNAMIC], > > self._PCD_TYPE_STRING_[MODEL_= PCD_DYNAMIC_EX]]: > > - self.Pcds[Name, Guid] =3D copy.deepcopy(PcdIn= Dec) > > - self.Pcds[Name, Guid].DefaultValue =3D NoFile= dValues[( Guid, Name)][0] > > + self._Pcds[Name, Guid] =3D copy.deepcopy(PcdI= nDec) > > + self._Pcds[Name, Guid].DefaultValue =3D NoFil= edValues[( Guid, Name)][0] > > if PcdInDec.Type in [self._PCD_TYPE_STRING_[MODEL= _PCD_DYNAMIC], > > self._PCD_TYPE_STRING_[MODEL_= PCD_DYNAMIC_EX]]: > > - self.Pcds[Name, Guid].SkuInfoList =3D {TAB_DE= FAULT:SkuInfoClass(TAB_DEFAULT, self.SkuIds[TAB_DEFAULT][0], '', '', '', ''= , '', NoFiledValues[( Guid, Name)][0])} > > + self._Pcds[Name, Guid].SkuInfoList =3D {TAB_D= EFAULT:SkuInfoClass(TAB_DEFAULT, self.SkuIds[TAB_DEFAULT][0], '', '', '', '= ', '', NoFiledValues[( Guid, Name)][0])} > > return AllPcds > > > > def OverrideByFdfOverAll(self,AllPcds): > > > > if GlobalData.gFdfParser is None: > > @@ -1417,12 +1417,12 @@ class DscBuildData(PlatformBuildClassObject): > > if PcdInDec: > > PcdInDec.PcdValueFromFdf =3D Value > > if PcdInDec.Type in [self._PCD_TYPE_STRING_[MODEL= _PCD_FIXED_AT_BUILD], > > self._PCD_TYPE_STRING_[MODEL_= PCD_PATCHABLE_IN_MODULE], > > self._PCD_TYPE_STRING_[MODEL_= 
PCD_FEATURE_FLAG]]: > > - self.Pcds[Name, Guid] =3D copy.deepcopy(PcdIn= Dec) > > - self.Pcds[Name, Guid].DefaultValue =3D Value > > + self._Pcds[Name, Guid] =3D copy.deepcopy(PcdI= nDec) > > + self._Pcds[Name, Guid].DefaultValue =3D Value > > return AllPcds > > > > def ParsePcdNameStruct(self,NamePart1,NamePart2): > > TokenSpaceCName =3D PcdCName =3D DimensionAttr =3D Field =3D = "" > > if "." in NamePart1: > > diff --git a/BaseTools/Source/Python/Workspace/InfBuildData.py b/BaseT= ools/Source/Python/Workspace/InfBuildData.py > > index da35391d3aff..e63246b03b6e 100644 > > --- a/BaseTools/Source/Python/Workspace/InfBuildData.py > > +++ b/BaseTools/Source/Python/Workspace/InfBuildData.py > > @@ -152,10 +152,17 @@ class InfBuildData(ModuleBuildClassObject): > > self._GuidsUsedByPcd =3D OrderedDict() > > self._GuidComments =3D None > > self._PcdComments =3D None > > self._BuildOptions =3D None > > self._DependencyFileList =3D None > > + self.LibInstances =3D [] > > + self.ReferenceModules =3D set() > > + self.Guids > > + self.Pcds > > + def SetReferenceModule(self,Module): > > + self.ReferenceModules.add(Module) > > + return self > > > > ## XXX[key] =3D value > > def __setitem__(self, key, value): > > self.__dict__[self._PROPERTY_[key]] =3D value > > > > @@ -703,10 +710,29 @@ class InfBuildData(ModuleBuildClassObject): > > RetVal.update(self._GetPcd(MODEL_PCD_DYNAMIC)) > > RetVal.update(self._GetPcd(MODEL_PCD_DYNAMIC_EX)) > > return RetVal > > > > @cached_property > > + def ModulePcdList(self): > > + RetVal =3D self.Pcds > > + return RetVal > > + @cached_property > > + def LibraryPcdList(self): > > + if bool(self.LibraryClass): > > + return [] > > + RetVal =3D {} > > + Pcds =3D set() > > + for Library in self.LibInstances: > > + PcdsInLibrary =3D OrderedDict() > > + for Key in Library.Pcds: > > + if Key in self.Pcds or Key in Pcds: > > + continue > > + Pcds.add(Key) > > + PcdsInLibrary[Key] =3D copy.copy(Library.Pcds[Key]) > > + RetVal[Library] =3D PcdsInLibrary > > + return RetVal > > + @cached_property > > def PcdsName(self): > > PcdsName =3D set() > > for Type in (MODEL_PCD_FIXED_AT_BUILD,MODEL_PCD_PATCHABLE_IN_= MODULE,MODEL_PCD_FEATURE_FLAG,MODEL_PCD_DYNAMIC,MODEL_PCD_DYNAMIC_EX): > > RecordList =3D self._RawData[Type, self._Arch, self._Plat= form] > > for TokenSpaceGuid, PcdCName, _, _, _, _, _ in RecordList= : > > @@ -1028,5 +1054,8 @@ class InfBuildData(ModuleBuildClassObject): > > @property > > def IsBinaryModule(self): > > if (self.Binaries and not self.Sources) or GlobalData.gIgnore= Source: > > return True > > return False > > +def ExtendCopyDictionaryLists(CopyToDict, CopyFromDict): > > + for Key in CopyFromDict: > > + CopyToDict[Key].extend(CopyFromDict[Key]) > > diff --git a/BaseTools/Source/Python/Workspace/WorkspaceCommon.py b/Ba= seTools/Source/Python/Workspace/WorkspaceCommon.py > > index 41ae684d3ee9..76583f46e500 100644 > > --- a/BaseTools/Source/Python/Workspace/WorkspaceCommon.py > > +++ b/BaseTools/Source/Python/Workspace/WorkspaceCommon.py > > @@ -86,10 +86,12 @@ def GetDeclaredPcd(Platform, BuildDatabase, Arch, = Target, Toolchain, additionalP > > # > > def GetLiabraryInstances(Module, Platform, BuildDatabase, Arch, Targe= t, Toolchain): > > return GetModuleLibInstances(Module, Platform, BuildDatabase, Arc= h, Target, Toolchain) > > > > def GetModuleLibInstances(Module, Platform, BuildDatabase, Arch, Targ= et, Toolchain, FileName =3D '', EdkLogger =3D None): > > + if Module.LibInstances: > > + return Module.LibInstances > > ModuleType =3D Module.ModuleType > > > > # 
add forced library instances (specified under LibraryClasses se= ctions) > > # > > # If a module has a MODULE_TYPE of USER_DEFINED, > > @@ -244,6 +246,8 @@ def GetModuleLibInstances(Module, Platform, BuildD= atabase, Arch, Target, Toolcha > > # > > # Build the list of constructor and destructor names > > # The DAG Topo sort produces the destructor order, so the list of= constructors must generated in the reverse order > > # > > SortedLibraryList.reverse() > > + Module.LibInstances =3D SortedLibraryList > > + SortedLibraryList =3D [lib.SetReferenceModule(Module) for lib in = SortedLibraryList] > > return SortedLibraryList > > diff --git a/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py b/= BaseTools/Source/Python/Workspace/WorkspaceDatabase.py > > index 28a975f54e51..ab7b4506c1c1 100644 > > --- a/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py > > +++ b/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py > > @@ -60,10 +60,12 @@ class WorkspaceDatabase(object): > > MODEL_FILE_DEC : DecBuildData, > > MODEL_FILE_DSC : DscBuildData, > > } > > > > _CACHE_ =3D {} # (FilePath, Arch) : > > + def GetCache(self): > > + return self._CACHE_ > > > > # constructor > > def __init__(self, WorkspaceDb): > > self.WorkspaceDb =3D WorkspaceDb > > > > @@ -201,10 +203,11 @@ class WorkspaceDatabase(object): > > Platform =3D self.BuildObject[PathClass(Dscfile), TAB_COMMON] > > if Platform is None: > > EdkLogger.error('build', PARSER_ERROR, "Failed to parser = DSC file: %s" % Dscfile) > > return Platform > > > > +BuildDB =3D WorkspaceDatabase() > > ## > > # > > # This acts like the main() function for the script, unless it is 'im= port'ed into another > > # script. > > # > > diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/= Source/Python/build/BuildReport.py > > index b4189240e127..9c12c01d2a2a 100644 > > --- a/BaseTools/Source/Python/build/BuildReport.py > > +++ b/BaseTools/Source/Python/build/BuildReport.py > > @@ -32,11 +32,11 @@ from Common.BuildToolError import CODE_ERROR > > from Common.BuildToolError import COMMAND_FAILURE > > from Common.BuildToolError import FORMAT_INVALID > > from Common.LongFilePathSupport import OpenLongFilePath as open > > from Common.MultipleWorkspace import MultipleWorkspace as mws > > import Common.GlobalData as GlobalData > > -from AutoGen.AutoGen import ModuleAutoGen > > +from AutoGen.ModuleAutoGen import ModuleAutoGen > > from Common.Misc import PathClass > > from Common.StringUtils import NormPath > > from Common.DataType import * > > import collections > > from Common.Expression import * > > @@ -2140,11 +2140,11 @@ class PlatformReport(object): > > if GlobalData.gFdfParser is not None: > > if Pa.Arch in GlobalData.gFdfParser.Profile.InfDi= ct: > > INFList =3D GlobalData.gFdfParser.Profile.Inf= Dict[Pa.Arch] > > for InfName in INFList: > > InfClass =3D PathClass(NormPath(InfName),= Wa.WorkspaceDir, Pa.Arch) > > - Ma =3D ModuleAutoGen(Wa, InfClass, Pa.Bui= ldTarget, Pa.ToolChain, Pa.Arch, Wa.MetaFile) > > + Ma =3D ModuleAutoGen(Wa, InfClass, Pa.Bui= ldTarget, Pa.ToolChain, Pa.Arch, Wa.MetaFile,Pa.DataPile) > > if Ma is None: > > continue > > if Ma not in ModuleAutoGenList: > > ModuleAutoGenList.append(Ma) > > for MGen in ModuleAutoGenList: > > diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source= /Python/build/build.py > > index 07693b97359e..3d083f4eaade 100644 > > --- a/BaseTools/Source/Python/build/build.py > > +++ b/BaseTools/Source/Python/build/build.py > > @@ -10,46 +10,49 @@ > > > > ## > > # Import Modules > > # > > 
from __future__ import print_function > > -import Common.LongFilePathOs as os > > -import re > > +from __future__ import absolute_import > > +import os.path as path > > import sys > > +import os > > +import re > > import glob > > import time > > import platform > > import traceback > > -import encodings.ascii > > import multiprocessing > > - > > -from struct import * > > -from threading import * > > +from threading import Thread,Event,BoundedSemaphore > > import threading > > +from subprocess import Popen,PIPE > > +from collections import OrderedDict, defaultdict > > from optparse import OptionParser > > -from subprocess import * > > +from AutoGen.PlatformAutoGen import PlatformAutoGen > > +from AutoGen.ModuleAutoGen import ModuleAutoGen > > +from AutoGen.WorkspaceAutoGen import WorkspaceAutoGen > > +from AutoGen import GenMake > > from Common import Misc as Utils > > > > -from Common.LongFilePathSupport import OpenLongFilePath as open > > from Common.TargetTxtClassObject import TargetTxt > > from Common.ToolDefClassObject import ToolDef > > +from Common.Misc import PathClass,SaveFileOnChange,RemoveDirectory > > +from Common.StringUtils import NormPath > > +from Common.MultipleWorkspace import MultipleWorkspace as mws > > +from Common.BuildToolError import * > > from Common.DataType import * > > +import Common.EdkLogger as EdkLogger > > from Common.BuildVersion import gBUILD_VERSION > > -from AutoGen.AutoGen import * > > -from Common.BuildToolError import * > > -from Workspace.WorkspaceDatabase import WorkspaceDatabase > > -from Common.MultipleWorkspace import MultipleWorkspace as mws > > +from Workspace.WorkspaceDatabase import BuildDB > > > > from BuildReport import BuildReport > > -from GenPatchPcdTable.GenPatchPcdTable import * > > -from PatchPcdValue.PatchPcdValue import * > > +from GenPatchPcdTable.GenPatchPcdTable import PeImageClass,parsePcdIn= foFromMapFile > > +from PatchPcdValue.PatchPcdValue import PatchBinaryFile > > > > -import Common.EdkLogger > > import Common.GlobalData as GlobalData > > from GenFds.GenFds import GenFds, GenFdsApi > > > > -from collections import OrderedDict, defaultdict > > > > # Version and Copyright > > VersionNumber =3D "0.60" + ' ' + gBUILD_VERSION > > __version__ =3D "%prog Version " + VersionNumber > > __copyright__ =3D "Copyright (c) 2007 - 2018, Intel Corporation All = rights reserved." 
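
Between these two hunks: "from Workspace.WorkspaceDatabase import BuildDB" together with the module-level "BuildDB = WorkspaceDatabase()" effectively turns the workspace database into a process-wide singleton. A toy version of the shape every importer now shares (my names, just to spell it out):

    # workspace_db.py
    class WorkspaceDatabase:
        def __init__(self):
            self.build_objects = {}    # shared cache for all importers

    BuildDB = WorkspaceDatabase()      # constructed once, at import time

    # consumer.py
    # from workspace_db import BuildDB
    # self.Db = BuildDB               # no per-Build() instance any more

That is probably exactly what the decoupling wants, but it does mean any state left in BuildDB survives for the lifetime of the process -- worth keeping in mind if one process ever ends up driving more than one build.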
> > @@ -773,11 +776,11 @@ class Build(): > > ConfDirectoryPath =3D mws.join(self.WorkspaceDir, 'Co= nf') > > GlobalData.gConfDirectory =3D ConfDirectoryPath > > GlobalData.gDatabasePath =3D os.path.normpath(os.path.join(Co= nfDirectoryPath, GlobalData.gDatabasePath)) > > if not os.path.exists(os.path.join(GlobalData.gConfDirectory,= '.cache')): > > os.makedirs(os.path.join(GlobalData.gConfDirectory, '.cac= he')) > > - self.Db =3D WorkspaceDatabase() > > + self.Db =3D BuildDB > > self.BuildDatabase =3D self.Db.BuildObject > > self.Platform =3D None > > self.ToolChainFamily =3D None > > self.LoadFixAddress =3D 0 > > self.UniFlag =3D BuildOptions.Flag > > @@ -1698,17 +1701,21 @@ class Build(): > > CmdListDict =3D {} > > if GlobalData.gEnableGenfdsMultiThread and self.Fdf: > > CmdListDict =3D self._GenFfsCmd(Wa.ArchList) > > > > for Arch in Wa.ArchList: > > + PcdMaList =3D [] > > GlobalData.gGlobalDefines['ARCH'] =3D Arch > > Pa =3D PlatformAutoGen(Wa, self.PlatformFile, Bui= ldTarget, ToolChain, Arch) > > for Module in Pa.Platform.Modules: > > # Get ModuleAutoGen object to generate C code= file and makefile > > - Ma =3D ModuleAutoGen(Wa, Module, BuildTarget,= ToolChain, Arch, self.PlatformFile) > > + Ma =3D ModuleAutoGen(Wa, Module, BuildTarget,= ToolChain, Arch, self.PlatformFile,Pa.DataPipe) > > if Ma is None: > > continue > > + if Ma.PcdIsDriver: > > + Ma.PlatformInfo =3D Pa > > + PcdMaList.append(Ma) > > self.BuildModules.append(Ma) > > self._BuildPa(self.Target, Pa, FfsCommand=3DCmdLi= stDict) > > > > # Create MAP file when Load Fix Address is enabled. > > if self.Target in ["", "all", "fds"]: > > @@ -1800,11 +1807,11 @@ class Build(): > > AutoGenStart =3D time.time() > > GlobalData.gGlobalDefines['ARCH'] =3D Arch > > Pa =3D PlatformAutoGen(Wa, self.PlatformFile, Bui= ldTarget, ToolChain, Arch) > > for Module in Pa.Platform.Modules: > > if self.ModuleFile.Dir =3D=3D Module.Dir and = self.ModuleFile.Name =3D=3D Module.Name: > > - Ma =3D ModuleAutoGen(Wa, Module, BuildTar= get, ToolChain, Arch, self.PlatformFile) > > + Ma =3D ModuleAutoGen(Wa, Module, BuildTar= get, ToolChain, Arch, self.PlatformFile,Pa.DataPipe) > > if Ma is None: > > continue > > MaList.append(Ma) > > if Ma.CanSkipbyHash(): > > self.HashSkipModules.append(Ma) > > @@ -1980,10 +1987,11 @@ class Build(): > > # multi-thread exit flag > > ExitFlag =3D threading.Event() > > ExitFlag.clear() > > self.AutoGenTime +=3D int(round((time.time() - Worksp= aceAutoGenTime))) > > for Arch in Wa.ArchList: > > + PcdMaList =3D [] > > AutoGenStart =3D time.time() > > GlobalData.gGlobalDefines['ARCH'] =3D Arch > > Pa =3D PlatformAutoGen(Wa, self.PlatformFile, Bui= ldTarget, ToolChain, Arch) > > if Pa is None: > > continue > > @@ -1997,14 +2005,17 @@ class Build(): > > if Inf in Pa.Platform.Modules: > > continue > > ModuleList.append(Inf) > > for Module in ModuleList: > > # Get ModuleAutoGen object to generate C code= file and makefile > > - Ma =3D ModuleAutoGen(Wa, Module, BuildTarget,= ToolChain, Arch, self.PlatformFile) > > + Ma =3D ModuleAutoGen(Wa, Module, BuildTarget,= ToolChain, Arch, self.PlatformFile,Pa.DataPipe) > > > > if Ma is None: > > continue > > + if Ma.PcdIsDriver: > > + Ma.PlatformInfo =3D Pa > > + PcdMaList.append(Ma) > > if Ma.CanSkipbyHash(): > > self.HashSkipModules.append(Ma) > > if GlobalData.gBinCacheSource: > > EdkLogger.quiet("cache hit: %s[%s]" %= (Ma.MetaFile.Path, Ma.Arch)) > > continue > > --=20 > > 2.20.1.windows.1 > >=20 > >=20 > >=20 > >=20 > >=20 >=20 >=20 >=20 >=20
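
One last observation on the PCD parts further up: _OverridePcd and ApplyPcdSetting both carry the same VOID* MaxDatumSize defaulting. Restating it for my own benefit -- this is my paraphrase, not code taken from the patch:

    def default_max_datum_size(value):
        # Default size of a VOID* PCD when no MaxDatumSize is given.
        if not value:
            return '1'
        if value[0] == 'L':            # L"unicode": 2 bytes per char incl. NUL
            return str((len(value) - 2) * 2)
        if value[0] == '{':            # {0x01, 0x02}: one byte per item
            return str(len(value.split(',')))
        return str(len(value) - 1)     # "ascii": chars plus NUL

    assert default_max_datum_size('L"ab"') == '6'
    assert default_max_datum_size('{0x1, 0x2}') == '2'
    assert default_max_datum_size('"abc"') == '4'

Not asking for a change in this patch -- as far as I can tell the duplication is pre-existing and only moved here -- just noting it in case a shared helper makes sense in a follow-up.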