From: Jaben Carsey <jaben.carsey@intel.com>
To: edk2-devel@lists.01.org
Cc: Liming Gao <liming.gao@intel.com>, Yonghong Zhu <yonghong.zhu@intel.com>
Subject: [PATCH v1 10/11] BaseTools: change to set for membership testing
Date: Mon, 14 May 2018 11:09:19 -0700
Message-ID: <fab416e6059bf2a012b5cc4a67336d76a0f206ba.1526321053.git.jaben.carsey@intel.com>
In-Reply-To: <cover.1526321052.git.jaben.carsey@intel.com>

When testing for membership in a list or a tuple, use a set instead.
When order matters (e.g. when iterating in a for loop), keep the list or tuple.
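
A minimal sketch of the pattern (hypothetical names, not code taken from
BaseTools): a set gives an average O(1) hash lookup for "in" tests, while a
list or tuple is scanned linearly; a list remains the right choice when the
iteration order must be preserved.

    # Membership testing: prefer a set; "x in some_set" is an average
    # O(1) hash lookup, while "x in some_list" scans the whole list.
    SUPPORTED_TYPES = {'UINT8', 'UINT16', 'UINT32', 'UINT64', 'BOOLEAN'}

    def is_supported(datum_type):
        return datum_type in SUPPORTED_TYPES

    # Ordered iteration: keep a list (or tuple) when loop order matters,
    # since set iteration order is not guaranteed.
    TOOLS_IN_ORDER = ['CC', 'PP', 'VFRPP', 'ASLPP', 'ASLCC', 'APP', 'ASM']

    if __name__ == '__main__':
        print(is_supported('UINT32'))   # True
        print(is_supported('VOID*'))    # False
        for tool in TOOLS_IN_ORDER:
            print(tool)                 # prints the tools in declared order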

Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
 BaseTools/Source/Python/AutoGen/AutoGen.py                      |  49 +++----
 BaseTools/Source/Python/AutoGen/GenC.py                         |  68 ++++-----
 BaseTools/Source/Python/AutoGen/GenDepex.py                     |  27 ++--
 BaseTools/Source/Python/AutoGen/GenMake.py                      |   9 +-
 BaseTools/Source/Python/AutoGen/GenPcdDb.py                     |  52 +++----
 BaseTools/Source/Python/AutoGen/GenVar.py                       |   4 +-
 BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py      |   4 +-
 BaseTools/Source/Python/BPDG/BPDG.py                            |   2 +-
 BaseTools/Source/Python/BPDG/GenVpd.py                          |   6 +-
 BaseTools/Source/Python/Common/DataType.py                      |  36 +++--
 BaseTools/Source/Python/Common/Expression.py                    |   8 +-
 BaseTools/Source/Python/Common/Misc.py                          |  24 ++--
 BaseTools/Source/Python/Common/Parsing.py                       |   2 +-
 BaseTools/Source/Python/Common/RangeExpression.py               |  10 +-
 BaseTools/Source/Python/Common/TargetTxtClassObject.py          |   8 +-
 BaseTools/Source/Python/Ecc/Check.py                            | 121 ++--------------
 BaseTools/Source/Python/Ecc/MetaDataParser.py                   |   4 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py |  48 +++----
 BaseTools/Source/Python/Ecc/c.py                                |  27 ++--
 BaseTools/Source/Python/Eot/Parser.py                           |   4 +-
 BaseTools/Source/Python/Eot/Report.py                           |  13 +-
 BaseTools/Source/Python/Eot/c.py                                |   2 +-
 BaseTools/Source/Python/GenFds/DataSection.py                   |   2 +-
 BaseTools/Source/Python/GenFds/DepexSection.py                  |   5 +-
 BaseTools/Source/Python/GenFds/FdfParser.py                     | 152 ++++++++++----------
 BaseTools/Source/Python/GenFds/FfsInfStatement.py               |   4 +-
 BaseTools/Source/Python/GenFds/Fv.py                            |   4 +-
 BaseTools/Source/Python/GenFds/GenFds.py                        |   4 +-
 BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py          |  16 +--
 BaseTools/Source/Python/GenFds/GuidSection.py                   |  14 +-
 BaseTools/Source/Python/GenFds/OptRomInfStatement.py            |   4 +-
 BaseTools/Source/Python/GenFds/Region.py                        |   2 +-
 BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py          |  20 +--
 BaseTools/Source/Python/Trim/Trim.py                            |   7 +-
 BaseTools/Source/Python/Workspace/DecBuildData.py               |   2 +-
 BaseTools/Source/Python/Workspace/DscBuildData.py               |  86 +++++------
 BaseTools/Source/Python/Workspace/InfBuildData.py               |  26 ++--
 BaseTools/Source/Python/Workspace/MetaFileCommentParser.py      |   4 +-
 BaseTools/Source/Python/Workspace/MetaFileParser.py             |  66 ++++-----
 BaseTools/Source/Python/build/BuildReport.py                    |  34 ++---
 BaseTools/Source/Python/build/build.py                          |  32 ++---
 41 files changed, 462 insertions(+), 550 deletions(-)

diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 4ccb50a0a0af..dcad8b4f32f6 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -1540,7 +1540,7 @@ class PlatformAutoGen(AutoGen):
                 self._PlatformPcds[pcd] = self.Platform.Pcds[pcd]
 
         for item in self._PlatformPcds:
-            if self._PlatformPcds[item].DatumType and self._PlatformPcds[item].DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
+            if self._PlatformPcds[item].DatumType and self._PlatformPcds[item].DatumType not in TAB_PCD_NUMERIC_TYPES_VOID:
                 self._PlatformPcds[item].DatumType = TAB_VOID
 
         if (self.Workspace.ArchList[-1] == self.Arch): 
@@ -1549,12 +1549,12 @@ class PlatformAutoGen(AutoGen):
                 Sku = Pcd.SkuInfoList.values()[0]
                 Sku.VpdOffset = Sku.VpdOffset.strip()
 
-                if Pcd.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
+                if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES_VOID:
                     Pcd.DatumType = TAB_VOID
 
                     # if found PCD which datum value is unicode string the insert to left size of UnicodeIndex
                     # if found HII type PCD then insert to right of UnicodeIndex
-                if Pcd.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD]:
+                if Pcd.Type in {TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD}:
                     VpdPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)] = Pcd
 
             #Collect DynamicHii PCD values and assign it to DynamicExVpd PCD gEfiMdeModulePkgTokenSpaceGuid.PcdNvStoreDefaultValueBuffer
@@ -1576,7 +1576,7 @@ class PlatformAutoGen(AutoGen):
             VpdSkuMap = {}
             for PcdKey in PlatformPcds:
                 Pcd = self._PlatformPcds[PcdKey]
-                if Pcd.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD] and \
+                if Pcd.Type in {TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD} and \
                    PcdKey in VpdPcdDict:
                     Pcd = VpdPcdDict[PcdKey]
                     SkuValueMap = {}
@@ -1630,7 +1630,7 @@ class PlatformAutoGen(AutoGen):
             #            
             for DscPcd in PlatformPcds:
                 DscPcdEntry = self._PlatformPcds[DscPcd]
-                if DscPcdEntry.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD]:
+                if DscPcdEntry.Type in {TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD}:
                     if not (self.Platform.VpdToolGuid is None or self.Platform.VpdToolGuid == ''):
                         FoundFlag = False
                         for VpdPcd in VpdFile._VpdArray:
@@ -1748,7 +1748,7 @@ class PlatformAutoGen(AutoGen):
                 Sku = Pcd.SkuInfoList.values()[0]
                 Sku.VpdOffset = Sku.VpdOffset.strip()
 
-                if Pcd.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
+                if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES_VOID:
                     Pcd.DatumType = TAB_VOID
 
                 PcdValue = Sku.DefaultValue
@@ -1768,7 +1768,7 @@ class PlatformAutoGen(AutoGen):
         for pcd in self._DynamicPcdList:
             if len(pcd.SkuInfoList) == 1:
                 for (SkuName,SkuId) in allskuset:
-                    if type(SkuId) in (str,unicode) and eval(SkuId) == 0 or SkuId == 0:
+                    if type(SkuId) in {str,unicode} and eval(SkuId) == 0 or SkuId == 0:
                         continue
                     pcd.SkuInfoList[SkuName] = copy.deepcopy(pcd.SkuInfoList[TAB_DEFAULT])
                     pcd.SkuInfoList[SkuName].SkuId = SkuId
@@ -3086,13 +3086,10 @@ class ModuleAutoGen(AutoGen):
     #   @retval     list    The list of package object
     #
     def _GetDerivedPackageList(self):
-        PackageList = []
+        PackageSet = set()
         for M in [self.Module] + self.DependentLibraryList:
-            for Package in M.Packages:
-                if Package in PackageList:
-                    continue
-                PackageList.append(Package)
-        return PackageList
+            PackageSet = PackageSet.union(M.Packages)
+        return list(PackageSet)
     
     ## Get the depex string
     #
@@ -3120,7 +3117,7 @@ class ModuleAutoGen(AutoGen):
                     else:
                         if Arch.upper() == TAB_ARCH_COMMON or \
                           (Arch.upper() == self.Arch.upper() and \
-                          ModuleType.upper() in [TAB_ARCH_COMMON, self.ModuleType.upper()]):
+                          ModuleType.upper() in {TAB_ARCH_COMMON, self.ModuleType.upper()}):
                             DepexList.append({(Arch, ModuleType): DepexExpr})
         
         #the type of build module is USER_DEFINED.
@@ -3279,9 +3276,9 @@ class ModuleAutoGen(AutoGen):
             # Regular expression for finding Include Directories, the difference between MSFT and INTEL/GCC/RVCT
             # is the former use /I , the Latter used -I to specify include directories
             #
-            if self.PlatformInfo.ToolChainFamily in ('MSFT'):
+            if self.PlatformInfo.ToolChainFamily == 'MSFT':
                 BuildOptIncludeRegEx = gBuildOptIncludePatternMsft
-            elif self.PlatformInfo.ToolChainFamily in ('INTEL', 'GCC', 'RVCT'):
+            elif self.PlatformInfo.ToolChainFamily in {'INTEL', 'GCC', 'RVCT'}:
                 BuildOptIncludeRegEx = gBuildOptIncludePatternOther
             else:
                 #
@@ -3291,7 +3288,7 @@ class ModuleAutoGen(AutoGen):
                 return self._BuildOptionIncPathList
             
             BuildOptionIncPathList = []
-            for Tool in ('CC', 'PP', 'VFRPP', 'ASLPP', 'ASLCC', 'APP', 'ASM'):
+            for Tool in ['CC', 'PP', 'VFRPP', 'ASLPP', 'ASLCC', 'APP', 'ASM']:
                 Attr = 'FLAGS'
                 try:
                     FlagOption = self.BuildOption[Tool][Attr]
@@ -3339,12 +3336,12 @@ class ModuleAutoGen(AutoGen):
             self._SourceFileList = []
             for F in self.Module.Sources:
                 # match tool chain
-                if F.TagName not in ("", "*", self.ToolChain):
+                if F.TagName not in {"", "*", self.ToolChain}:
                     EdkLogger.debug(EdkLogger.DEBUG_9, "The toolchain [%s] for processing file [%s] is found, "
                                     "but [%s] is needed" % (F.TagName, str(F), self.ToolChain))
                     continue
                 # match tool chain family or build rule family
-                if F.ToolChainFamily not in ("", "*", self.ToolChainFamily, self.BuildRuleFamily):
+                if F.ToolChainFamily not in {"", "*", self.ToolChainFamily, self.BuildRuleFamily}:
                     EdkLogger.debug(
                                 EdkLogger.DEBUG_0,
                                 "The file [%s] must be built by tools of [%s], " \
@@ -3423,7 +3420,7 @@ class ModuleAutoGen(AutoGen):
         if self._BinaryFileList is None:
             self._BinaryFileList = []
             for F in self.Module.Binaries:
-                if F.Target not in [TAB_ARCH_COMMON, '*'] and F.Target != self.BuildTarget:
+                if F.Target not in {TAB_ARCH_COMMON, '*'} and F.Target != self.BuildTarget:
                     continue
                 self._BinaryFileList.append(F)
                 self._ApplyBuildRule(F, F.Type)
@@ -4049,11 +4046,11 @@ class ModuleAutoGen(AutoGen):
                 AsBuiltInfDict['binary_item'] += ['BIN|' + File]
         if self.DepexGenerated:
             self.OutputFile.add(self.Name + '.depex')
-            if self.ModuleType in [SUP_MODULE_PEIM]:
+            if self.ModuleType == SUP_MODULE_PEIM:
                 AsBuiltInfDict['binary_item'] += ['PEI_DEPEX|' + self.Name + '.depex']
-            if self.ModuleType in [SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_UEFI_DRIVER]:
+            if self.ModuleType in {SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_UEFI_DRIVER}:
                 AsBuiltInfDict['binary_item'] += ['DXE_DEPEX|' + self.Name + '.depex']
-            if self.ModuleType in [SUP_MODULE_DXE_SMM_DRIVER]:
+            if self.ModuleType == SUP_MODULE_DXE_SMM_DRIVER:
                 AsBuiltInfDict['binary_item'] += ['SMM_DEPEX|' + self.Name + '.depex']
 
         Bin = self._GenOffsetBin()
@@ -4107,11 +4104,11 @@ class ModuleAutoGen(AutoGen):
                 else:
                     continue
                 PcdValue = ''
-                if Pcd.DatumType == 'BOOLEAN':
+                if Pcd.DatumType == TAB_BOOLEAN:
                     BoolValue = Pcd.DefaultValue.upper()
-                    if BoolValue == 'TRUE':
+                    if BoolValue == TAB_TRUE_1:
                         Pcd.DefaultValue = '1'
-                    elif BoolValue == 'FALSE':
+                    elif BoolValue == TAB_FALSE_1:
                         Pcd.DefaultValue = '0'
 
                 if Pcd.DatumType in TAB_PCD_NUMERIC_TYPES:
diff --git a/BaseTools/Source/Python/AutoGen/GenC.py b/BaseTools/Source/Python/AutoGen/GenC.py
index e73d83395255..60066e47bbce 100644
--- a/BaseTools/Source/Python/AutoGen/GenC.py
+++ b/BaseTools/Source/Python/AutoGen/GenC.py
@@ -43,9 +43,9 @@ gItemTypeStringDatabase  = {
 
 
 ## Datum size
-gDatumSizeStringDatabase = {TAB_UINT8:'8',TAB_UINT16:'16',TAB_UINT32:'32',TAB_UINT64:'64','BOOLEAN':'BOOLEAN',TAB_VOID:'8'}
-gDatumSizeStringDatabaseH = {TAB_UINT8:'8',TAB_UINT16:'16',TAB_UINT32:'32',TAB_UINT64:'64','BOOLEAN':'BOOL',TAB_VOID:'PTR'}
-gDatumSizeStringDatabaseLib = {TAB_UINT8:'8',TAB_UINT16:'16',TAB_UINT32:'32',TAB_UINT64:'64','BOOLEAN':'Bool',TAB_VOID:'Ptr'}
+gDatumSizeStringDatabase = {TAB_UINT8:'8',TAB_UINT16:'16',TAB_UINT32:'32',TAB_UINT64:'64',TAB_BOOLEAN:TAB_BOOLEAN,TAB_VOID:'8'}
+gDatumSizeStringDatabaseH = {TAB_UINT8:'8',TAB_UINT16:'16',TAB_UINT32:'32',TAB_UINT64:'64',TAB_BOOLEAN:'BOOL',TAB_VOID:'PTR'}
+gDatumSizeStringDatabaseLib = {TAB_UINT8:'8',TAB_UINT16:'16',TAB_UINT32:'32',TAB_UINT64:'64',TAB_BOOLEAN:'Bool',TAB_VOID:'Ptr'}
 
 ## AutoGen File Header Templates
 gAutoGenHeaderString = TemplateString("""\
@@ -996,11 +996,11 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
         Unicode = False
         ValueNumber = 0
 
-        if Pcd.DatumType == 'BOOLEAN':
+        if Pcd.DatumType == TAB_BOOLEAN:
             BoolValue = Value.upper()
-            if BoolValue == 'TRUE' or BoolValue == '1':
+            if BoolValue == TAB_TRUE_1 or BoolValue == '1':
                 Value = '1U'
-            elif BoolValue == 'FALSE' or BoolValue == '0':
+            elif BoolValue == TAB_FALSE_1 or BoolValue == '0':
                 Value = '0U'
 
         if Pcd.DatumType in TAB_PCD_CLEAN_NUMERIC_TYPES:
@@ -1367,17 +1367,17 @@ def CreateLibraryConstructorCode(Info, AutoGenC, AutoGenH):
         if len(Lib.ConstructorList) <= 0:
             continue
         Dict = {'Function':Lib.ConstructorList}
-        if Lib.ModuleType in [SUP_MODULE_BASE, SUP_MODULE_SEC]:
+        if Lib.ModuleType in {SUP_MODULE_BASE, SUP_MODULE_SEC}:
             ConstructorPrototypeString.Append(gLibraryStructorPrototype[SUP_MODULE_BASE].Replace(Dict))
             ConstructorCallingString.Append(gLibraryStructorCall[SUP_MODULE_BASE].Replace(Dict))
         elif Lib.ModuleType in SUP_MODULE_SET_PEI:
             ConstructorPrototypeString.Append(gLibraryStructorPrototype['PEI'].Replace(Dict))
             ConstructorCallingString.Append(gLibraryStructorCall['PEI'].Replace(Dict))
-        elif Lib.ModuleType in [SUP_MODULE_DXE_CORE,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SMM_DRIVER,SUP_MODULE_DXE_RUNTIME_DRIVER,
-                                SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER,SUP_MODULE_UEFI_APPLICATION,SUP_MODULE_SMM_CORE]:
+        elif Lib.ModuleType in {SUP_MODULE_DXE_CORE,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SMM_DRIVER,SUP_MODULE_DXE_RUNTIME_DRIVER,
+                                SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER,SUP_MODULE_UEFI_APPLICATION,SUP_MODULE_SMM_CORE}:
             ConstructorPrototypeString.Append(gLibraryStructorPrototype['DXE'].Replace(Dict))
             ConstructorCallingString.Append(gLibraryStructorCall['DXE'].Replace(Dict))
-        elif Lib.ModuleType in [SUP_MODULE_MM_STANDALONE,SUP_MODULE_MM_CORE_STANDALONE]:
+        elif Lib.ModuleType in {SUP_MODULE_MM_STANDALONE,SUP_MODULE_MM_CORE_STANDALONE}:
             ConstructorPrototypeString.Append(gLibraryStructorPrototype['MM'].Replace(Dict))
             ConstructorCallingString.Append(gLibraryStructorCall['MM'].Replace(Dict))
 
@@ -1398,14 +1398,14 @@ def CreateLibraryConstructorCode(Info, AutoGenC, AutoGenH):
     if Info.IsLibrary:
         AutoGenH.Append("${BEGIN}${FunctionPrototype}${END}", Dict)
     else:
-        if Info.ModuleType in [SUP_MODULE_BASE, SUP_MODULE_SEC]:
+        if Info.ModuleType in {SUP_MODULE_BASE, SUP_MODULE_SEC}:
             AutoGenC.Append(gLibraryString[SUP_MODULE_BASE].Replace(Dict))
         elif Info.ModuleType in SUP_MODULE_SET_PEI:
             AutoGenC.Append(gLibraryString['PEI'].Replace(Dict))
-        elif Info.ModuleType in [SUP_MODULE_DXE_CORE,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SMM_DRIVER,SUP_MODULE_DXE_RUNTIME_DRIVER,
-                                 SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER,SUP_MODULE_UEFI_APPLICATION,SUP_MODULE_SMM_CORE]:
+        elif Info.ModuleType in {SUP_MODULE_DXE_CORE,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SMM_DRIVER,SUP_MODULE_DXE_RUNTIME_DRIVER,
+                                 SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER,SUP_MODULE_UEFI_APPLICATION,SUP_MODULE_SMM_CORE}:
             AutoGenC.Append(gLibraryString['DXE'].Replace(Dict))
-        elif Info.ModuleType in [SUP_MODULE_MM_STANDALONE,SUP_MODULE_MM_CORE_STANDALONE]:
+        elif Info.ModuleType in {SUP_MODULE_MM_STANDALONE,SUP_MODULE_MM_CORE_STANDALONE}:
             AutoGenC.Append(gLibraryString['MM'].Replace(Dict))
 
 ## Create code for library destructor
@@ -1429,17 +1429,17 @@ def CreateLibraryDestructorCode(Info, AutoGenC, AutoGenH):
         if len(Lib.DestructorList) <= 0:
             continue
         Dict = {'Function':Lib.DestructorList}
-        if Lib.ModuleType in [SUP_MODULE_BASE, SUP_MODULE_SEC]:
+        if Lib.ModuleType in {SUP_MODULE_BASE, SUP_MODULE_SEC}:
             DestructorPrototypeString.Append(gLibraryStructorPrototype[SUP_MODULE_BASE].Replace(Dict))
             DestructorCallingString.Append(gLibraryStructorCall[SUP_MODULE_BASE].Replace(Dict))
         elif Lib.ModuleType in SUP_MODULE_SET_PEI:
             DestructorPrototypeString.Append(gLibraryStructorPrototype['PEI'].Replace(Dict))
             DestructorCallingString.Append(gLibraryStructorCall['PEI'].Replace(Dict))
-        elif Lib.ModuleType in [SUP_MODULE_DXE_CORE,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SMM_DRIVER,SUP_MODULE_DXE_RUNTIME_DRIVER,
-                                SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER,SUP_MODULE_UEFI_APPLICATION, SUP_MODULE_SMM_CORE]:
+        elif Lib.ModuleType in {SUP_MODULE_DXE_CORE,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SMM_DRIVER,SUP_MODULE_DXE_RUNTIME_DRIVER,
+                                SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER,SUP_MODULE_UEFI_APPLICATION, SUP_MODULE_SMM_CORE}:
             DestructorPrototypeString.Append(gLibraryStructorPrototype['DXE'].Replace(Dict))
             DestructorCallingString.Append(gLibraryStructorCall['DXE'].Replace(Dict))
-        elif Lib.ModuleType in [SUP_MODULE_MM_STANDALONE,SUP_MODULE_MM_CORE_STANDALONE]:
+        elif Lib.ModuleType in {SUP_MODULE_MM_STANDALONE,SUP_MODULE_MM_CORE_STANDALONE}:
             DestructorPrototypeString.Append(gLibraryStructorPrototype['MM'].Replace(Dict))
             DestructorCallingString.Append(gLibraryStructorCall['MM'].Replace(Dict))
 
@@ -1460,14 +1460,14 @@ def CreateLibraryDestructorCode(Info, AutoGenC, AutoGenH):
     if Info.IsLibrary:
         AutoGenH.Append("${BEGIN}${FunctionPrototype}${END}", Dict)
     else:
-        if Info.ModuleType in [SUP_MODULE_BASE, SUP_MODULE_SEC]:
+        if Info.ModuleType in {SUP_MODULE_BASE, SUP_MODULE_SEC}:
             AutoGenC.Append(gLibraryString[SUP_MODULE_BASE].Replace(Dict))
         elif Info.ModuleType in SUP_MODULE_SET_PEI:
             AutoGenC.Append(gLibraryString['PEI'].Replace(Dict))
-        elif Info.ModuleType in [SUP_MODULE_DXE_CORE,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SMM_DRIVER,SUP_MODULE_DXE_RUNTIME_DRIVER,
-                                 SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER,SUP_MODULE_UEFI_APPLICATION,SUP_MODULE_SMM_CORE]:
+        elif Info.ModuleType in {SUP_MODULE_DXE_CORE,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SMM_DRIVER,SUP_MODULE_DXE_RUNTIME_DRIVER,
+                                 SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER,SUP_MODULE_UEFI_APPLICATION,SUP_MODULE_SMM_CORE}:
             AutoGenC.Append(gLibraryString['DXE'].Replace(Dict))
-        elif Info.ModuleType in [SUP_MODULE_MM_STANDALONE,SUP_MODULE_MM_CORE_STANDALONE]:
+        elif Info.ModuleType in {SUP_MODULE_MM_STANDALONE,SUP_MODULE_MM_CORE_STANDALONE}:
             AutoGenC.Append(gLibraryString['MM'].Replace(Dict))
 
 
@@ -1478,7 +1478,7 @@ def CreateLibraryDestructorCode(Info, AutoGenC, AutoGenH):
 #   @param      AutoGenH    The TemplateString object for header file
 #
 def CreateModuleEntryPointCode(Info, AutoGenC, AutoGenH):
-    if Info.IsLibrary or Info.ModuleType in [SUP_MODULE_USER_DEFINED, SUP_MODULE_SEC]:
+    if Info.IsLibrary or Info.ModuleType in {SUP_MODULE_USER_DEFINED, SUP_MODULE_SEC}:
         return
     #
     # Module Entry Points
@@ -1498,7 +1498,7 @@ def CreateModuleEntryPointCode(Info, AutoGenC, AutoGenH):
         'UefiSpecVersion':   UefiSpecVersion + 'U'
     }
 
-    if Info.ModuleType in [SUP_MODULE_PEI_CORE, SUP_MODULE_DXE_CORE, SUP_MODULE_SMM_CORE, SUP_MODULE_MM_CORE_STANDALONE]:
+    if Info.ModuleType in {SUP_MODULE_PEI_CORE, SUP_MODULE_DXE_CORE, SUP_MODULE_SMM_CORE, SUP_MODULE_MM_CORE_STANDALONE}:
         if Info.SourceFileList:
           if NumEntryPoints != 1:
               EdkLogger.error(
@@ -1526,7 +1526,7 @@ def CreateModuleEntryPointCode(Info, AutoGenC, AutoGenH):
         else:
             AutoGenC.Append(gPeimEntryPointString[2].Replace(Dict))
         AutoGenH.Append(gPeimEntryPointPrototype.Replace(Dict))
-    elif Info.ModuleType in [SUP_MODULE_DXE_RUNTIME_DRIVER,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER]:
+    elif Info.ModuleType in {SUP_MODULE_DXE_RUNTIME_DRIVER,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER}:
         if NumEntryPoints < 2:
             AutoGenC.Append(gUefiDriverEntryPointString[NumEntryPoints].Replace(Dict))
         else:
@@ -1558,7 +1558,7 @@ def CreateModuleEntryPointCode(Info, AutoGenC, AutoGenH):
 #   @param      AutoGenH    The TemplateString object for header file
 #
 def CreateModuleUnloadImageCode(Info, AutoGenC, AutoGenH):
-    if Info.IsLibrary or Info.ModuleType in [SUP_MODULE_USER_DEFINED, SUP_MODULE_SEC]:
+    if Info.IsLibrary or Info.ModuleType in {SUP_MODULE_USER_DEFINED, SUP_MODULE_SEC}:
         return
     #
     # Unload Image Handlers
@@ -1578,7 +1578,7 @@ def CreateModuleUnloadImageCode(Info, AutoGenC, AutoGenH):
 #   @param      AutoGenH    The TemplateString object for header file
 #
 def CreateGuidDefinitionCode(Info, AutoGenC, AutoGenH):
-    if Info.ModuleType in [SUP_MODULE_USER_DEFINED, SUP_MODULE_BASE]:
+    if Info.ModuleType in {SUP_MODULE_USER_DEFINED, SUP_MODULE_BASE}:
         GuidType = TAB_GUID
     else:
         GuidType = "EFI_GUID"
@@ -1602,7 +1602,7 @@ def CreateGuidDefinitionCode(Info, AutoGenC, AutoGenH):
 #   @param      AutoGenH    The TemplateString object for header file
 #
 def CreateProtocolDefinitionCode(Info, AutoGenC, AutoGenH):
-    if Info.ModuleType in [SUP_MODULE_USER_DEFINED, SUP_MODULE_BASE]:
+    if Info.ModuleType in {SUP_MODULE_USER_DEFINED, SUP_MODULE_BASE}:
         GuidType = TAB_GUID
     else:
         GuidType = "EFI_GUID"
@@ -1626,7 +1626,7 @@ def CreateProtocolDefinitionCode(Info, AutoGenC, AutoGenH):
 #   @param      AutoGenH    The TemplateString object for header file
 #
 def CreatePpiDefinitionCode(Info, AutoGenC, AutoGenH):
-    if Info.ModuleType in [SUP_MODULE_USER_DEFINED, SUP_MODULE_BASE]:
+    if Info.ModuleType in {SUP_MODULE_USER_DEFINED, SUP_MODULE_BASE}:
         GuidType = TAB_GUID
     else:
         GuidType = "EFI_GUID"
@@ -1663,7 +1663,7 @@ def CreatePcdCode(Info, AutoGenC, AutoGenH):
     # Add extern declarations to AutoGen.h if one or more Token Space GUIDs were found
     if TokenSpaceList:
         AutoGenH.Append("\n// Definition of PCD Token Space GUIDs used in this module\n\n")
-        if Info.ModuleType in [SUP_MODULE_USER_DEFINED, SUP_MODULE_BASE]:
+        if Info.ModuleType in {SUP_MODULE_USER_DEFINED, SUP_MODULE_BASE}:
             GuidType = TAB_GUID
         else:
             GuidType = "EFI_GUID"              
@@ -1782,7 +1782,7 @@ def CreateIdfFileCode(Info, AutoGenC, StringH, IdfGenCFlag, IdfGenBinBuffer):
                     for FileObj in ImageFiles.ImageFilesDict[Idf]:
                         for sourcefile in Info.SourceFileList:
                             if FileObj.FileName == sourcefile.File:
-                                if not sourcefile.Ext.upper() in ['.PNG', '.BMP', '.JPG']:
+                                if not sourcefile.Ext.upper() in {'.PNG', '.BMP', '.JPG'}:
                                     EdkLogger.error("build", AUTOGEN_ERROR, "The %s's postfix must be one of .bmp, .jpg, .png" % (FileObj.FileName), ExtraData="[%s]" % str(Info))
                                 FileObj.File = sourcefile
                                 break
@@ -2107,11 +2107,11 @@ def CreateCode(Info, AutoGenC, AutoGenH, StringH, UniGenCFlag, UniGenBinBuffer,
                 if Pcd.Type == TAB_PCDS_FIXED_AT_BUILD:
                     TokenCName = Pcd.TokenCName
                     Value = Pcd.DefaultValue
-                    if Pcd.DatumType == 'BOOLEAN':
+                    if Pcd.DatumType == TAB_BOOLEAN:
                         BoolValue = Value.upper()
-                        if BoolValue == 'TRUE':
+                        if BoolValue == TAB_TRUE_1:
                             Value = '1'
-                        elif BoolValue == 'FALSE':
+                        elif BoolValue == TAB_FALSE_1:
                             Value = '0'
                     for PcdItem in GlobalData.MixedPcd:
                         if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]:
diff --git a/BaseTools/Source/Python/AutoGen/GenDepex.py b/BaseTools/Source/Python/AutoGen/GenDepex.py
index 873ed6e59300..d4730dd227df 100644
--- a/BaseTools/Source/Python/AutoGen/GenDepex.py
+++ b/BaseTools/Source/Python/AutoGen/GenDepex.py
@@ -114,18 +114,15 @@ class DependencyExpression:
     }
 
     # all supported op codes and operands
-    SupportedOpcode = [DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER, DEPEX_OPCODE_PUSH, DEPEX_OPCODE_AND, DEPEX_OPCODE_OR, DEPEX_OPCODE_NOT, DEPEX_OPCODE_END, DEPEX_OPCODE_SOR]
-    SupportedOperand = [DEPEX_OPCODE_TRUE, DEPEX_OPCODE_FALSE]
-
-    OpcodeWithSingleOperand = [DEPEX_OPCODE_NOT, DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER]
-    OpcodeWithTwoOperand = [DEPEX_OPCODE_AND, DEPEX_OPCODE_OR]
+    SupportedOpcode = {DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER, DEPEX_OPCODE_PUSH, DEPEX_OPCODE_AND, DEPEX_OPCODE_OR, DEPEX_OPCODE_NOT, DEPEX_OPCODE_END, DEPEX_OPCODE_SOR}
+    SupportedOperand = {DEPEX_OPCODE_TRUE, DEPEX_OPCODE_FALSE}
 
     # op code that should not be the last one
-    NonEndingOpcode = [DEPEX_OPCODE_AND, DEPEX_OPCODE_OR, DEPEX_OPCODE_NOT, DEPEX_OPCODE_SOR]
+    NonEndingOpcode = {DEPEX_OPCODE_AND, DEPEX_OPCODE_OR, DEPEX_OPCODE_NOT, DEPEX_OPCODE_SOR}
     # op code must not present at the same time
-    ExclusiveOpcode = [DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER]
+    ExclusiveOpcode = {DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER}
     # op code that should be the first one if it presents
-    AboveAllOpcode = [DEPEX_OPCODE_SOR, DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER]
+    AboveAllOpcode = {DEPEX_OPCODE_SOR, DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER}
 
     #
     # open and close brace must be taken as individual tokens
@@ -177,7 +174,7 @@ class DependencyExpression:
         LastToken = ''
         for Token in self.TokenList:
             if Token == "(":
-                if LastToken not in self.SupportedOpcode + ['(', '', None]:
+                if LastToken not in self.SupportedOpcode.union(['(', '', None]):
                     EdkLogger.error("GenDepex", PARSER_ERROR, "Invalid dependency expression: missing operator before open parentheses",
                                     ExtraData="Near %s" % LastToken)
                 Stack.append(Token)
@@ -185,7 +182,7 @@ class DependencyExpression:
                 if '(' not in Stack:
                     EdkLogger.error("GenDepex", PARSER_ERROR, "Invalid dependency expression: mismatched parentheses",
                                     ExtraData=str(self))
-                elif LastToken in self.SupportedOpcode + ['', None]:
+                elif LastToken in self.SupportedOpcode.union(['', None]):
                     EdkLogger.error("GenDepex", PARSER_ERROR, "Invalid dependency expression: missing operand before close parentheses",
                                     ExtraData="Near %s" % LastToken)
                 while len(Stack) > 0:
@@ -195,10 +192,10 @@ class DependencyExpression:
                     self.PostfixNotation.append(Stack.pop())
             elif Token in self.OpcodePriority:
                 if Token == DEPEX_OPCODE_NOT:
-                    if LastToken not in self.SupportedOpcode + ['(', '', None]:
+                    if LastToken not in self.SupportedOpcode.union(['(', '', None]):
                         EdkLogger.error("GenDepex", PARSER_ERROR, "Invalid dependency expression: missing operator before NOT",
                                         ExtraData="Near %s" % LastToken)
-                elif LastToken in self.SupportedOpcode + ['(', '', None]:
+                elif LastToken in self.SupportedOpcode.union(['(', '', None]):
                         EdkLogger.error("GenDepex", PARSER_ERROR, "Invalid dependency expression: missing operand before " + Token,
                                         ExtraData="Near %s" % LastToken)
 
@@ -211,7 +208,7 @@ class DependencyExpression:
             else:
                 if Token not in self.SupportedOpcode:
                     # not OP, take it as GUID
-                    if LastToken not in self.SupportedOpcode + ['(', '', None]:
+                    if LastToken not in self.SupportedOpcode.union(['(', '', None]):
                         EdkLogger.error("GenDepex", PARSER_ERROR, "Invalid dependency expression: missing operator before %s" % Token,
                                         ExtraData="Near %s" % LastToken)
                     if len(self.OpcodeList) == 0 or self.OpcodeList[-1] not in self.ExclusiveOpcode:
@@ -274,7 +271,7 @@ class DependencyExpression:
           return
         Op = OpcodeSet.pop()
         #if Op isn't either OR or AND, return
-        if Op not in [DEPEX_OPCODE_AND, DEPEX_OPCODE_OR]:
+        if Op not in {DEPEX_OPCODE_AND, DEPEX_OPCODE_OR}:
             return
         NewOperand = []
         AllOperand = set()
@@ -302,7 +299,7 @@ class DependencyExpression:
             return
 
         # don't generate depex if all operands are architecture protocols
-        if self.ModuleType in [SUP_MODULE_UEFI_DRIVER, SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_DXE_SMM_DRIVER, SUP_MODULE_MM_STANDALONE] and \
+        if self.ModuleType in {SUP_MODULE_UEFI_DRIVER, SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_DXE_SMM_DRIVER, SUP_MODULE_MM_STANDALONE} and \
            Op == DEPEX_OPCODE_AND and \
            self.ArchProtocols == set(GuidStructureStringToGuidString(Guid) for Guid in AllOperand):
             self.PostfixNotation = []
diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
index 4ae977ccd400..12e871a8b8d3 100644
--- a/BaseTools/Source/Python/AutoGen/GenMake.py
+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
@@ -23,6 +23,7 @@ from Common.MultipleWorkspace import MultipleWorkspace as mws
 from Common.BuildToolError import *
 from Common.Misc import *
 from Common.String import *
+from Common.DataType import *
 from BuildEngine import *
 import Common.GlobalData as GlobalData
 from collections import OrderedDict
@@ -499,7 +500,7 @@ cleanlib:
 
         PCI_COMPRESS_Flag = False
         for k, v in self._AutoGenObject.Module.Defines.iteritems():
-            if 'PCI_COMPRESS' == k and 'TRUE' == v:
+            if 'PCI_COMPRESS' == k and TAB_TRUE_1 == v:
                 PCI_COMPRESS_Flag = True
 
         # tools definitions
@@ -900,7 +901,7 @@ cleanlib:
             self._AutoGenObject.AutoGenDepSet |= set(self.FileDependency[File])
 
             # skip non-C files
-            if File.Ext not in [".c", ".C"] or File.Name == "AutoGen.c":
+            if File.Ext not in {".c", ".C"} or File.Name == "AutoGen.c":
                 continue
             elif DepSet is None:
                 DepSet = set(self.FileDependency[File])
@@ -917,7 +918,7 @@ cleanlib:
 
         for File in self.FileDependency:
             # skip non-C files
-            if File.Ext not in [".c", ".C"] or File.Name == "AutoGen.c":
+            if File.Ext not in {".c", ".C"} or File.Name == "AutoGen.c":
                 continue
             NewDepSet = set(self.FileDependency[File])
             NewDepSet -= DepSet
@@ -958,7 +959,7 @@ cleanlib:
                 # Use file list macro as dependency
                 if T.GenFileListMacro:
                     Deps.append("$(%s)" % T.FileListMacro)
-                    if Type in [TAB_OBJECT_FILE, TAB_STATIC_LIBRARY]:
+                    if Type in {TAB_OBJECT_FILE, TAB_STATIC_LIBRARY}:
                         Deps.append("$(%s)" % T.ListFileMacro)
 
                 TargetDict = {
diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index d2d42fe9d08e..48b34e6f87e5 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -293,7 +293,7 @@ class DbItemList:
 
         Buffer = ''
         for Datas in self.RawDataList:
-            if type(Datas) in (list, tuple):
+            if type(Datas) in {list, tuple}:
                 for Data in Datas:
                     if PackStr:
                         Buffer += pack(PackStr, GetIntegerValue(Data))
@@ -368,7 +368,7 @@ class DbComItemList (DbItemList):
         Buffer = ''
         for DataList in self.RawDataList:
             for Data in DataList:
-                if type(Data) in (list, tuple):
+                if type(Data) in {list, tuple}:
                     for SingleData in Data:
                         Buffer += pack(PackStr, GetIntegerValue(SingleData))
                 else:
@@ -414,7 +414,7 @@ class DbStringHeadTableItemList(DbItemList):
                 Offset += len(self.RawDataList[ItemIndex])
         else:
             for innerIndex in range(Index):
-                if type(self.RawDataList[innerIndex]) in (list, tuple):
+                if type(self.RawDataList[innerIndex]) in {list, tuple}:
                     Offset += len(self.RawDataList[innerIndex]) * self.ItemSize
                 else:
                     Offset += self.ItemSize
@@ -431,7 +431,7 @@ class DbStringHeadTableItemList(DbItemList):
             self.ListSize = self.GetInterOffset(len(self.RawDataList) - 1) + len(self.RawDataList[len(self.RawDataList)-1])
         else:
             for Datas in self.RawDataList:
-                if type(Datas) in (list, tuple):
+                if type(Datas) in {list, tuple}:
                     self.ListSize += len(Datas) * self.ItemSize
                 else:
                     self.ListSize += self.ItemSize
@@ -783,7 +783,7 @@ def BuildExDataBase(Dict):
     Pad = 0xDA
     
     UninitDataBaseSize  = 0
-    for Item in (DbUnInitValueUint64, DbUnInitValueUint32, DbUnInitValueUint16, DbUnInitValueUint8, DbUnInitValueBoolean):
+    for Item in {DbUnInitValueUint64, DbUnInitValueUint32, DbUnInitValueUint16, DbUnInitValueUint8, DbUnInitValueBoolean}:
         UninitDataBaseSize += Item.GetListSize()
     
     if (DbTotalLength - UninitDataBaseSize) % 8:
@@ -1001,11 +1001,11 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
         'EX_TOKEN_NUMBER'               : '0U',
         'SIZE_TABLE_SIZE'               : '2U',
         'SKU_HEAD_SIZE'                 : '1U',
-        'GUID_TABLE_EMPTY'              : 'TRUE',
-        'STRING_TABLE_EMPTY'            : 'TRUE',
-        'SKUID_TABLE_EMPTY'             : 'TRUE',
-        'DATABASE_EMPTY'                : 'TRUE',
-        'EXMAP_TABLE_EMPTY'             : 'TRUE',
+        'GUID_TABLE_EMPTY'              : TAB_TRUE_1,
+        'STRING_TABLE_EMPTY'            : TAB_TRUE_1,
+        'SKUID_TABLE_EMPTY'             : TAB_TRUE_1,
+        'DATABASE_EMPTY'                : TAB_TRUE_1,
+        'EXMAP_TABLE_EMPTY'             : TAB_TRUE_1,
         'PCD_DATABASE_UNINIT_EMPTY'     : '  UINT8  dummy; /* PCD_DATABASE_UNINIT is emptry */',
         'SYSTEM_SKU_ID'                 : '  SKU_ID             SystemSkuId;',
         'SYSTEM_SKU_ID_VALUE'           : '0U'
@@ -1022,14 +1022,14 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
         Dict['VARDEF_SKUID_' + DatumType] = []
         Dict['VARDEF_VALUE_' + DatumType] = []
         Dict['VARDEF_DB_VALUE_' + DatumType] = []
-        for Init in ['INIT','UNINIT']:
+        for Init in {'INIT','UNINIT'}:
             Dict[Init+'_CNAME_DECL_' + DatumType]   = []
             Dict[Init+'_GUID_DECL_' + DatumType]    = []
             Dict[Init+'_NUMSKUS_DECL_' + DatumType] = []
             Dict[Init+'_VALUE_' + DatumType]        = []
             Dict[Init+'_DB_VALUE_'+DatumType] = []
             
-    for Type in ['STRING_HEAD','VPD_HEAD','VARIABLE_HEAD']:
+    for Type in {'STRING_HEAD','VPD_HEAD','VARIABLE_HEAD'}:
         Dict[Type + '_CNAME_DECL']   = []
         Dict[Type + '_GUID_DECL']    = []
         Dict[Type + '_NUMSKUS_DECL'] = []
@@ -1087,7 +1087,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
     i = 0
     ReorderedDynPcdList = GetOrderedDynamicPcdList(DynamicPcdList, Platform.PcdTokenNumber)
     for item in ReorderedDynPcdList:
-        if item.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
+        if item.DatumType not in TAB_PCD_NUMERIC_TYPES_VOID:
             item.DatumType = TAB_VOID
     for Pcd in ReorderedDynPcdList:
         VoidStarTypeCurrSize = []
@@ -1130,11 +1130,11 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
         Pcd.InitString = 'UNINIT'
 
         if Pcd.DatumType == TAB_VOID:
-            if Pcd.Type not in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD]:
+            if Pcd.Type not in {TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD}:
                 Pcd.TokenTypeList = ['PCD_TYPE_STRING']
             else:
                 Pcd.TokenTypeList = []
-        elif Pcd.DatumType == 'BOOLEAN':
+        elif Pcd.DatumType == TAB_BOOLEAN:
             Pcd.TokenTypeList = ['PCD_DATUM_TYPE_UINT8_BOOLEAN']
         else:
             Pcd.TokenTypeList = ['PCD_DATUM_TYPE_' + Pcd.DatumType]
@@ -1235,10 +1235,10 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
                     
                     if Pcd.DatumType == TAB_UINT64:
                         Dict['VARDEF_VALUE_'+Pcd.DatumType].append(Sku.HiiDefaultValue + "ULL")
-                    elif Pcd.DatumType in (TAB_UINT32, TAB_UINT16, TAB_UINT8):
+                    elif Pcd.DatumType in {TAB_UINT32, TAB_UINT16, TAB_UINT8}:
                         Dict['VARDEF_VALUE_'+Pcd.DatumType].append(Sku.HiiDefaultValue + "U")
-                    elif Pcd.DatumType == "BOOLEAN":
-                        if eval(Sku.HiiDefaultValue) in [1,0]:
+                    elif Pcd.DatumType == TAB_BOOLEAN:
+                        if eval(Sku.HiiDefaultValue) in {1,0}:
                             Dict['VARDEF_VALUE_'+Pcd.DatumType].append(str(eval(Sku.HiiDefaultValue)) + "U")
                     else:
                         Dict['VARDEF_VALUE_'+Pcd.DatumType].append(Sku.HiiDefaultValue)
@@ -1323,7 +1323,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
             else:
                 if "PCD_TYPE_HII" not in Pcd.TokenTypeList:
                     Pcd.TokenTypeList += ['PCD_TYPE_DATA']
-                    if Sku.DefaultValue == 'TRUE':
+                    if Sku.DefaultValue == TAB_TRUE_1:
                         Pcd.InitString = 'INIT'
                     else:
                         Pcd.InitString = Pcd.isinit
@@ -1333,10 +1333,10 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
                 #
                 if Pcd.DatumType == TAB_UINT64:
                     ValueList.append(Sku.DefaultValue + "ULL")
-                elif Pcd.DatumType in (TAB_UINT32, TAB_UINT16, TAB_UINT8):
+                elif Pcd.DatumType in {TAB_UINT32, TAB_UINT16, TAB_UINT8}:
                     ValueList.append(Sku.DefaultValue + "U")
-                elif Pcd.DatumType == "BOOLEAN":
-                    if Sku.DefaultValue in ["1", "0"]:
+                elif Pcd.DatumType == TAB_BOOLEAN:
+                    if Sku.DefaultValue in {"1", "0"}:
                         ValueList.append(Sku.DefaultValue + "U")              
                 else:
                     ValueList.append(Sku.DefaultValue)
@@ -1516,7 +1516,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
             StringTableSize += Dict['PCD_CNAME_LENGTH'][index]
             StringTableIndex += 1
     if GuidList != []:
-        Dict['GUID_TABLE_EMPTY'] = 'FALSE'
+        Dict['GUID_TABLE_EMPTY'] = TAB_FALSE_1
         Dict['GUID_TABLE_SIZE'] = str(len(GuidList)) + 'U'
     else:
         Dict['GUID_STRUCTURE'] = [GuidStringToGuidStructureString('00000000-0000-0000-0000-000000000000')]
@@ -1528,7 +1528,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
         Dict['STRING_TABLE_GUID'].append('')
         Dict['STRING_TABLE_VALUE'].append('{ 0 }')
     else:
-        Dict['STRING_TABLE_EMPTY'] = 'FALSE'
+        Dict['STRING_TABLE_EMPTY'] = TAB_FALSE_1
         Dict['STRING_TABLE_SIZE'] = str(StringTableSize) + 'U'
 
     if Dict['SIZE_TABLE_CNAME'] == []:
@@ -1538,12 +1538,12 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
         Dict['SIZE_TABLE_MAXIMUM_LENGTH'].append('0U')
 
     if NumberOfLocalTokens != 0:
-        Dict['DATABASE_EMPTY']                = 'FALSE'
+        Dict['DATABASE_EMPTY']                = TAB_FALSE_1
         Dict['LOCAL_TOKEN_NUMBER_TABLE_SIZE'] = NumberOfLocalTokens
         Dict['LOCAL_TOKEN_NUMBER']            = NumberOfLocalTokens
 
     if NumberOfExTokens != 0:
-        Dict['EXMAP_TABLE_EMPTY']    = 'FALSE'
+        Dict['EXMAP_TABLE_EMPTY']    = TAB_FALSE_1
         Dict['EXMAPPING_TABLE_SIZE'] = str(NumberOfExTokens) + 'U'
         Dict['EX_TOKEN_NUMBER']      = str(NumberOfExTokens) + 'U'
     else:
diff --git a/BaseTools/Source/Python/AutoGen/GenVar.py b/BaseTools/Source/Python/AutoGen/GenVar.py
index 35f022ac2e19..c01661864c6d 100644
--- a/BaseTools/Source/Python/AutoGen/GenVar.py
+++ b/BaseTools/Source/Python/AutoGen/GenVar.py
@@ -295,8 +295,8 @@ class VariableMgr(object):
                 for value_char in tail.split(","):
                     Buffer += pack("=B",int(value_char,16))
                 data_len += len(tail.split(","))
-        elif data_type == "BOOLEAN":
-            Buffer += pack("=B",True) if var_value.upper() == "TRUE" else pack("=B",False)
+        elif data_type == TAB_BOOLEAN:
+            Buffer += pack("=B",True) if var_value.upper() == TAB_TRUE_1 else pack("=B",False)
             data_len += 1
         elif data_type  == DataType.TAB_UINT8:
             Buffer += pack("=B",GetIntegerValue(var_value))
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index b2a9bb1134ed..63add891e3f1 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -58,7 +58,7 @@ class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
                 itemIndex += 1
                 realLength += 5
                 for v_data in item.data:
-                    if type(v_data) in (int, long):
+                    if type(v_data) in {int, long}:
                         realLength += item.StorageWidth
                     else:
                         realLength += item.StorageWidth
@@ -138,7 +138,7 @@ class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
                 Buffer += b
                 realLength += 1
                 for v_data in item.data:
-                    if type(v_data) in (int, long):
+                    if type(v_data) in {int, long}:
                         b = pack(PACK_CODE_BY_SIZE[item.StorageWidth], v_data)
                         Buffer += b
                         realLength += item.StorageWidth
diff --git a/BaseTools/Source/Python/BPDG/BPDG.py b/BaseTools/Source/Python/BPDG/BPDG.py
index 6c8f89f5d12b..88e12b247c58 100644
--- a/BaseTools/Source/Python/BPDG/BPDG.py
+++ b/BaseTools/Source/Python/BPDG/BPDG.py
@@ -134,7 +134,7 @@ def StartBpdg(InputFileName, MapFileName, VpdFileName, Force):
     if os.path.exists(VpdFileName) and not Force:
         print "\nFile %s already exist, Overwrite(Yes/No)?[Y]: " % VpdFileName
         choice = sys.stdin.readline()
-        if choice.strip().lower() not in ['y', 'yes', '']:
+        if choice.strip().lower() not in {'y', 'yes', ''}:
             return
         
     GenVPD = GenVpd.GenVPD (InputFileName, MapFileName, VpdFileName)
diff --git a/BaseTools/Source/Python/BPDG/GenVpd.py b/BaseTools/Source/Python/BPDG/GenVpd.py
index dba815415f92..7125788b5bfe 100644
--- a/BaseTools/Source/Python/BPDG/GenVpd.py
+++ b/BaseTools/Source/Python/BPDG/GenVpd.py
@@ -72,9 +72,9 @@ class PcdEntry:
     #    
     def _IsBoolean(self, ValueString, Size):
         if (Size == "1"):
-            if ValueString.upper() in ["TRUE", "FALSE"]:
+            if ValueString.upper() in TAB_TRUE_FALSE_SET:
                 return True
-            elif ValueString in ["0", "1", "0x0", "0x1", "0x00", "0x01"]:
+            elif ValueString in {"0", "1", "0x0", "0x1", "0x00", "0x01"}:
                 return True
 
         return False
@@ -101,7 +101,7 @@ class PcdEntry:
     # 
     # 
     def _PackBooleanValue(self, ValueString):
-        if ValueString.upper() == "TRUE" or ValueString in ["1", "0x1", "0x01"]:
+        if ValueString.upper() == TAB_TRUE_1 or ValueString in {"1", "0x1", "0x01"}:
             try:
                 self.PcdValue = pack(_FORMAT_CHAR[1], 1)
             except:
diff --git a/BaseTools/Source/Python/Common/DataType.py b/BaseTools/Source/Python/Common/DataType.py
index 93136dff0db2..b86e403c10fb 100644
--- a/BaseTools/Source/Python/Common/DataType.py
+++ b/BaseTools/Source/Python/Common/DataType.py
@@ -41,10 +41,11 @@ TAB_UINT32 = 'UINT32'
 TAB_UINT64 = 'UINT64'
 TAB_VOID = 'VOID*'
 TAB_GUID = 'GUID'
+TAB_BOOLEAN = 'BOOLEAN'
 
 TAB_PCD_CLEAN_NUMERIC_TYPES = {TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64}
-TAB_PCD_NUMERIC_TYPES = {TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, 'BOOLEAN'}
-TAB_PCD_NUMERIC_TYPES_VOID = {TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, 'BOOLEAN', TAB_VOID}
+TAB_PCD_NUMERIC_TYPES = {TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_BOOLEAN}
+TAB_PCD_NUMERIC_TYPES_VOID = {TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_BOOLEAN, TAB_VOID}
 
 TAB_EDK_SOURCE = '$(EDK_SOURCE)'
 TAB_EFI_SOURCE = '$(EFI_SOURCE)'
@@ -79,10 +80,10 @@ SUP_MODULE_SMM_CORE = 'SMM_CORE'
 SUP_MODULE_MM_STANDALONE = 'MM_STANDALONE'
 SUP_MODULE_MM_CORE_STANDALONE = 'MM_CORE_STANDALONE'
 
-SUP_MODULE_LIST = [SUP_MODULE_BASE, SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM, SUP_MODULE_DXE_CORE, SUP_MODULE_DXE_DRIVER, \
+SUP_MODULE_SET = {SUP_MODULE_BASE, SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM, SUP_MODULE_DXE_CORE, SUP_MODULE_DXE_DRIVER, \
                    SUP_MODULE_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_DXE_SMM_DRIVER, SUP_MODULE_UEFI_DRIVER, \
-                   SUP_MODULE_UEFI_APPLICATION, SUP_MODULE_USER_DEFINED, SUP_MODULE_SMM_CORE, SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE]
-SUP_MODULE_LIST_STRING = TAB_VALUE_SPLIT.join(SUP_MODULE_LIST)
+                   SUP_MODULE_UEFI_APPLICATION, SUP_MODULE_USER_DEFINED, SUP_MODULE_SMM_CORE, SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE}
+SUP_MODULE_LIST_STRING = TAB_VALUE_SPLIT.join(SUP_MODULE_SET)
 SUP_MODULE_SET_PEI = {SUP_MODULE_PEIM, SUP_MODULE_PEI_CORE}
 
 EDK_COMPONENT_TYPE_LIBRARY = 'LIBRARY'
@@ -290,9 +291,23 @@ TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_SET =  {TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_
                                             TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_SMM_PAGE_SIZE}
 
 ## The mapping dictionary from datum type to its maximum number.
-MAX_VAL_TYPE = {"BOOLEAN":0x01, TAB_UINT8:0xFF, TAB_UINT16:0xFFFF, TAB_UINT32:0xFFFFFFFF, TAB_UINT64:0xFFFFFFFFFFFFFFFF}
+MAX_VAL_TYPE = {TAB_BOOLEAN:0x01, TAB_UINT8:0xFF, TAB_UINT16:0xFFFF, TAB_UINT32:0xFFFFFFFF, TAB_UINT64:0xFFFFFFFFFFFFFFFF}
 ## The mapping dictionary from datum type to size string.
-MAX_SIZE_TYPE = {"BOOLEAN":1, TAB_UINT8:1, TAB_UINT16:2, TAB_UINT32:4, TAB_UINT64:8}
+MAX_SIZE_TYPE = {TAB_BOOLEAN:1, TAB_UINT8:1, TAB_UINT16:2, TAB_UINT32:4, TAB_UINT64:8}
+
+TAB_TRUE_1 = 'TRUE'
+TAB_TRUE_2 = 'true'
+TAB_TRUE_3 = 'True'
+
+TAB_FALSE_1 = 'FALSE'
+TAB_FALSE_2 = 'false'
+TAB_FALSE_3 = 'False'
+
+TAB_TRUE_SET = {TAB_TRUE_1,TAB_TRUE_2,TAB_TRUE_3}
+TAB_FALSE_SET = {TAB_FALSE_1,TAB_FALSE_2,TAB_FALSE_3}
+
+TAB_TRUE_FALSE_SET = {TAB_TRUE_1,TAB_FALSE_1}
+
 
 TAB_DEPEX = 'Depex'
 TAB_DEPEX_COMMON = TAB_DEPEX + TAB_SPLIT + TAB_ARCH_COMMON
@@ -500,7 +515,12 @@ DEPEX_OPCODE_TRUE = "TRUE"
 DEPEX_OPCODE_FALSE = "FALSE"
 
 # Dependency Expression
-DEPEX_SUPPORTED_OPCODE_SET = {"BEFORE", "AFTER", "PUSH", "AND", "OR", "NOT", "END", "SOR", "TRUE", "FALSE", '(', ')'}
+DEPEX_SUPPORTED_OPCODE_SET = {DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER,
+                              DEPEX_OPCODE_PUSH, DEPEX_OPCODE_AND, 
+                              DEPEX_OPCODE_OR, DEPEX_OPCODE_NOT, 
+                              DEPEX_OPCODE_END, DEPEX_OPCODE_SOR,
+                              DEPEX_OPCODE_TRUE, DEPEX_OPCODE_FALSE,
+                             '(', ')'}
 
 TAB_STATIC_LIBRARY = "STATIC-LIBRARY-FILE"
 TAB_DYNAMIC_LIBRARY = "DYNAMIC-LIBRARY-FILE"
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index 36f2654fc9cf..3133f610b4a7 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -656,9 +656,9 @@ class ValueExpression(BaseExpression):
 
         if self._Token.startswith('"'):
             self._Token = self._Token[1:-1]
-        elif self._Token in {"FALSE", "false", "False"}:
+        elif self._Token in TAB_FALSE_SET:
             self._Token = False
-        elif self._Token in {"TRUE", "true", "True"}:
+        elif self._Token in TAB_TRUE_SET:
             self._Token = True
         else:
             self.__IsNumberToken()
@@ -1020,9 +1020,9 @@ class ValueExpressionEx(ValueExpression):
                     else:
                         raise  BadExpression("Type: %s, Value: %s, %s"%(self.PcdType, PcdValue, Value))
 
-        if PcdValue == 'True':
+        if PcdValue == TAB_TRUE_3:
             PcdValue = '1'
-        if PcdValue == 'False':
+        if PcdValue == TAB_FALSE_3:
             PcdValue = '0'
 
         if RealValue:
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index bfb6e56a923f..5b8459e5007b 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -1403,9 +1403,9 @@ def ParseFieldValue (Value):
         if Value == 0:
             return 0, 1
         return Value, (Value.bit_length() + 7) / 8
-    if Value.lower() == 'true':
+    if Value.lower() == TAB_TRUE_2:
         return 1, 1
-    if Value.lower() == 'false':
+    if Value.lower() == TAB_FALSE_2:
         return 0, 1
     return Value, 1
 
@@ -1438,7 +1438,7 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=''):
     FieldList = AnalyzePcdExpression(Setting)
 
     IsValid = True
-    if PcdType in (MODEL_PCD_FIXED_AT_BUILD, MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_FEATURE_FLAG):
+    if PcdType in {MODEL_PCD_FIXED_AT_BUILD, MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_FEATURE_FLAG}:
         Value = FieldList[0]
         Size = ''
         if len(FieldList) > 1:
@@ -1461,7 +1461,7 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=''):
                 IsValid = False
                 Size = -1
         return [str(Value), '', str(Size)], IsValid, 0
-    elif PcdType in (MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT):
+    elif PcdType in {MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT}:
         Value = FieldList[0]
         Size = Type = ''
         if len(FieldList) > 1:
@@ -1482,7 +1482,7 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=''):
                 IsValid = False
                 Size = -1
         return [Value, Type, str(Size)], IsValid, 0
-    elif PcdType in (MODEL_PCD_DYNAMIC_VPD, MODEL_PCD_DYNAMIC_EX_VPD):
+    elif PcdType in {MODEL_PCD_DYNAMIC_VPD, MODEL_PCD_DYNAMIC_EX_VPD}:
         VpdOffset = FieldList[0]
         Value = Size = ''
         if not DataType == TAB_VOID:
@@ -1504,7 +1504,7 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=''):
                 IsValid = False
                 Size = -1
         return [VpdOffset, str(Size), Value], IsValid, 2
-    elif PcdType in (MODEL_PCD_DYNAMIC_HII, MODEL_PCD_DYNAMIC_EX_HII):
+    elif PcdType in {MODEL_PCD_DYNAMIC_HII, MODEL_PCD_DYNAMIC_EX_HII}:
         HiiString = FieldList[0]
         Guid = Offset = Value = Attribute = ''
         if len(FieldList) > 1:
@@ -1574,11 +1574,11 @@ def CheckPcdDatum(Type, Value):
                 PrintList = list(Printset)
                 PrintList.sort()
                 return False, "Invalid PCD string value of type [%s]; must be printable chars %s." % (Type, PrintList)
-    elif Type == 'BOOLEAN':
-        if Value not in ['TRUE', 'True', 'true', '0x1', '0x01', '1', 'FALSE', 'False', 'false', '0x0', '0x00', '0']:
+    elif Type == TAB_BOOLEAN:
+        if Value not in {TAB_TRUE_1, TAB_TRUE_2, TAB_TRUE_3, '0x1', '0x01', '1', TAB_FALSE_1, TAB_FALSE_2, TAB_FALSE_3, '0x0', '0x00', '0'}:
             return False, "Invalid value [%s] of type [%s]; must be one of TRUE, True, true, 0x1, 0x01, 1"\
                           ", FALSE, False, false, 0x0, 0x00, 0" % (Value, Type)
-    elif Type in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64]:
+    elif Type in {TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64}:
         try:
             Value = long(Value, 0)
         except:
@@ -1601,7 +1601,7 @@ def SplitOption(OptionString):
     QuotationMark = ""
     for Index in range(0, len(OptionString)):
         CurrentChar = OptionString[Index]
-        if CurrentChar in ['"', "'"]:
+        if CurrentChar in {'"', "'"}:
             if QuotationMark == CurrentChar:
                 QuotationMark = ""
             elif QuotationMark == "":
@@ -1610,7 +1610,7 @@ def SplitOption(OptionString):
         elif QuotationMark:
             continue
 
-        if CurrentChar in ["/", "-"] and LastChar in [" ", "\t", "\r", "\n"]:
+        if CurrentChar in {"/", "-"} and LastChar in {" ", "\t", "\r", "\n"}:
             if Index > OptionStart:
                 OptionList.append(OptionString[OptionStart:Index - 1])
             OptionStart = Index
@@ -2083,7 +2083,7 @@ def PackRegistryFormatGuid(Guid):
 #   @retval     Value    The integer value that the input represents
 #
 def GetIntegerValue(Input):
-    if type(Input) in (int, long):
+    if type(Input) in {int, long}:
         return Input
     String = Input
     if String.endswith("U"):
diff --git a/BaseTools/Source/Python/Common/Parsing.py b/BaseTools/Source/Python/Common/Parsing.py
index 453c2039e3d9..94e73f2b78f9 100644
--- a/BaseTools/Source/Python/Common/Parsing.py
+++ b/BaseTools/Source/Python/Common/Parsing.py
@@ -42,7 +42,7 @@ def ParseDefineMacro2(Table, RecordSets, GlobalMacro):
     #
     # Replace the Macros
     #
-    for Value in (v for v in RecordSets.values() if v):
+    for Value in [v for v in RecordSets.values() if v]:
         for Item in Value:
             Item[0] = ReplaceMacro(Item[0], Macros)
 
diff --git a/BaseTools/Source/Python/Common/RangeExpression.py b/BaseTools/Source/Python/Common/RangeExpression.py
index 7f504d6e310c..fe78bcfd90bb 100644
--- a/BaseTools/Source/Python/Common/RangeExpression.py
+++ b/BaseTools/Source/Python/Common/RangeExpression.py
@@ -327,12 +327,12 @@ class RangeExpression(BaseExpression):
         
     def Eval(self, Operator, Oprand1, Oprand2 = None):
         
-        if Operator in ["!", "NOT", "not"]:
+        if Operator in {"!", "NOT", "not"}:
             if not gGuidPattern.match(Oprand1.strip()):
                 raise BadExpression(ERR_STRING_EXPR % Operator)
             return self.NegtiveRange(Oprand1)
         else:
-            if Operator in ["==", ">=", "<=", ">", "<", '^']:
+            if Operator in {"==", ">=", "<=", ">", "<", '^'}:
                 return self.EvalRange(Operator, Oprand1)
             elif Operator == 'and' :
                 if not gGuidPatternEnd.match(Oprand1.strip()) or not gGuidPatternEnd.match(Oprand2.strip()):
@@ -439,7 +439,7 @@ class RangeExpression(BaseExpression):
         Val = self._RelExpr()
         while self._IsOperator({"!=", "NOT", "not"}):
             Op = self._Token
-            if Op in ["!", "NOT", "not"]:
+            if Op in {"!", "NOT", "not"}:
                 if not self._IsOperator({"IN", "in"}):
                     raise BadExpression(ERR_REL_NOT_IN)
                 Op += ' ' + self._Token
@@ -576,9 +576,9 @@ class RangeExpression(BaseExpression):
 
         if self._Token.startswith('"'):
             self._Token = self._Token[1:-1]
-        elif self._Token in ["FALSE", "false", "False"]:
+        elif self._Token in TAB_FALSE_SET:
             self._Token = False
-        elif self._Token in ["TRUE", "true", "True"]:
+        elif self._Token in TAB_TRUE_SET:
             self._Token = True
         else:
             self.__IsNumberToken()
diff --git a/BaseTools/Source/Python/Common/TargetTxtClassObject.py b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
index 6f5e5f0d173d..ca116ed9b0aa 100644
--- a/BaseTools/Source/Python/Common/TargetTxtClassObject.py
+++ b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
@@ -96,8 +96,8 @@ class TargetTxtClassObject(object):
             else:
                 Value = ""
 
-            if Key in [DataType.TAB_TAT_DEFINES_ACTIVE_PLATFORM, DataType.TAB_TAT_DEFINES_TOOL_CHAIN_CONF, \
-                       DataType.TAB_TAT_DEFINES_ACTIVE_MODULE, DataType.TAB_TAT_DEFINES_BUILD_RULE_CONF]:
+            if Key in {DataType.TAB_TAT_DEFINES_ACTIVE_PLATFORM, DataType.TAB_TAT_DEFINES_TOOL_CHAIN_CONF, \
+                       DataType.TAB_TAT_DEFINES_ACTIVE_MODULE, DataType.TAB_TAT_DEFINES_BUILD_RULE_CONF}:
                 self.TargetTxtDictionary[Key] = Value.replace('\\', '/')
                 if Key == DataType.TAB_TAT_DEFINES_TOOL_CHAIN_CONF and self.TargetTxtDictionary[Key]:
                     if self.TargetTxtDictionary[Key].startswith("Conf/"):
@@ -119,8 +119,8 @@ class TargetTxtClassObject(object):
                         # The File pointed to by BUILD_RULE_CONF is not in a Conf/ directory
                         Build_Rule = os.path.join(self.ConfDirectoryPath, self.TargetTxtDictionary[Key].strip())
                     self.TargetTxtDictionary[Key] = Build_Rule
-            elif Key in [DataType.TAB_TAT_DEFINES_TARGET, DataType.TAB_TAT_DEFINES_TARGET_ARCH, \
-                         DataType.TAB_TAT_DEFINES_TOOL_CHAIN_TAG]:
+            elif Key in {DataType.TAB_TAT_DEFINES_TARGET, DataType.TAB_TAT_DEFINES_TARGET_ARCH, \
+                         DataType.TAB_TAT_DEFINES_TOOL_CHAIN_TAG}:
                 self.TargetTxtDictionary[Key] = Value.split()
             elif Key == DataType.TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER:
                 try:
diff --git a/BaseTools/Source/Python/Ecc/Check.py b/BaseTools/Source/Python/Ecc/Check.py
index e7bd97297538..6bb86f86a706 100644
--- a/BaseTools/Source/Python/Ecc/Check.py
+++ b/BaseTools/Source/Python/Ecc/Check.py
@@ -239,11 +239,6 @@ class Check(object):
         if EccGlobalData.gConfig.CFunctionLayoutCheckReturnType == '1' or EccGlobalData.gConfig.CFunctionLayoutCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking function layout return type ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.c', '.h'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        c.CheckFuncLayoutReturnType(FullName)
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 c.CheckFuncLayoutReturnType(FullName)
 
@@ -252,11 +247,6 @@ class Check(object):
         if EccGlobalData.gConfig.CFunctionLayoutCheckOptionalFunctionalModifier == '1' or EccGlobalData.gConfig.CFunctionLayoutCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking function layout modifier ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.c', '.h'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        c.CheckFuncLayoutModifier(FullName)
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 c.CheckFuncLayoutModifier(FullName)
 
@@ -266,11 +256,6 @@ class Check(object):
         if EccGlobalData.gConfig.CFunctionLayoutCheckFunctionName == '1' or EccGlobalData.gConfig.CFunctionLayoutCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking function layout function name ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.c', '.h'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        c.CheckFuncLayoutName(FullName)
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 c.CheckFuncLayoutName(FullName)
 
@@ -279,12 +264,6 @@ class Check(object):
         if EccGlobalData.gConfig.CFunctionLayoutCheckFunctionPrototype == '1' or EccGlobalData.gConfig.CFunctionLayoutCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking function layout function prototype ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        EdkLogger.quiet("[PROTOTYPE]" + FullName)
-#                        c.CheckFuncLayoutPrototype(FullName)
             for FullName in EccGlobalData.gCFileList:
                 EdkLogger.quiet("[PROTOTYPE]" + FullName)
                 c.CheckFuncLayoutPrototype(FullName)
@@ -294,11 +273,6 @@ class Check(object):
         if EccGlobalData.gConfig.CFunctionLayoutCheckFunctionBody == '1' or EccGlobalData.gConfig.CFunctionLayoutCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking function layout function body ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        c.CheckFuncLayoutBody(FullName)
             for FullName in EccGlobalData.gCFileList:
                 c.CheckFuncLayoutBody(FullName)
 
@@ -309,12 +283,6 @@ class Check(object):
         if EccGlobalData.gConfig.CFunctionLayoutCheckNoInitOfVariable == '1' or EccGlobalData.gConfig.CFunctionLayoutCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking function layout local variables ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        c.CheckFuncLayoutLocalVariable(FullName)
-
             for FullName in EccGlobalData.gCFileList:
                 c.CheckFuncLayoutLocalVariable(FullName)
 
@@ -337,11 +305,6 @@ class Check(object):
         if EccGlobalData.gConfig.DeclarationDataTypeCheckNoUseCType == '1' or EccGlobalData.gConfig.DeclarationDataTypeCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking Declaration No use C type ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.h', '.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        c.CheckDeclNoUseCType(FullName)
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 c.CheckDeclNoUseCType(FullName)
 
@@ -350,11 +313,6 @@ class Check(object):
         if EccGlobalData.gConfig.DeclarationDataTypeCheckInOutModifier == '1' or EccGlobalData.gConfig.DeclarationDataTypeCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking Declaration argument modifier ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.h', '.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        c.CheckDeclArgModifier(FullName)
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 c.CheckDeclArgModifier(FullName)
 
@@ -368,12 +326,6 @@ class Check(object):
         if EccGlobalData.gConfig.DeclarationDataTypeCheckEnumeratedType == '1' or EccGlobalData.gConfig.DeclarationDataTypeCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking Declaration enum typedef ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.h', '.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        EdkLogger.quiet("[ENUM]" + FullName)
-#                        c.CheckDeclEnumTypedef(FullName)
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 EdkLogger.quiet("[ENUM]" + FullName)
                 c.CheckDeclEnumTypedef(FullName)
@@ -383,12 +335,6 @@ class Check(object):
         if EccGlobalData.gConfig.DeclarationDataTypeCheckStructureDeclaration == '1' or EccGlobalData.gConfig.DeclarationDataTypeCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking Declaration struct typedef ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.h', '.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        EdkLogger.quiet("[STRUCT]" + FullName)
-#                        c.CheckDeclStructTypedef(FullName)
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 EdkLogger.quiet("[STRUCT]" + FullName)
                 c.CheckDeclStructTypedef(FullName)
@@ -420,12 +366,6 @@ class Check(object):
         if EccGlobalData.gConfig.DeclarationDataTypeCheckUnionType == '1' or EccGlobalData.gConfig.DeclarationDataTypeCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking Declaration union typedef ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.h', '.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        EdkLogger.quiet("[UNION]" + FullName)
-#                        c.CheckDeclUnionTypedef(FullName)
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 EdkLogger.quiet("[UNION]" + FullName)
                 c.CheckDeclUnionTypedef(FullName)
@@ -441,12 +381,6 @@ class Check(object):
         if EccGlobalData.gConfig.PredicateExpressionCheckBooleanValue == '1' or EccGlobalData.gConfig.PredicateExpressionCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking predicate expression Boolean value ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        EdkLogger.quiet("[BOOLEAN]" + FullName)
-#                        c.CheckBooleanValueComparison(FullName)
             for FullName in EccGlobalData.gCFileList:
                 EdkLogger.quiet("[BOOLEAN]" + FullName)
                 c.CheckBooleanValueComparison(FullName)
@@ -456,12 +390,6 @@ class Check(object):
         if EccGlobalData.gConfig.PredicateExpressionCheckNonBooleanOperator == '1' or EccGlobalData.gConfig.PredicateExpressionCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking predicate expression Non-Boolean variable...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        EdkLogger.quiet("[NON-BOOLEAN]" + FullName)
-#                        c.CheckNonBooleanValueComparison(FullName)
             for FullName in EccGlobalData.gCFileList:
                 EdkLogger.quiet("[NON-BOOLEAN]" + FullName)
                 c.CheckNonBooleanValueComparison(FullName)
@@ -471,12 +399,6 @@ class Check(object):
         if EccGlobalData.gConfig.PredicateExpressionCheckComparisonNullType == '1' or EccGlobalData.gConfig.PredicateExpressionCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking predicate expression NULL pointer ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        EdkLogger.quiet("[POINTER]" + FullName)
-#                        c.CheckPointerNullComparison(FullName)
             for FullName in EccGlobalData.gCFileList:
                 EdkLogger.quiet("[POINTER]" + FullName)
                 c.CheckPointerNullComparison(FullName)
@@ -518,11 +440,6 @@ class Check(object):
         if EccGlobalData.gConfig.IncludeFileCheckIfndefStatement == '1' or EccGlobalData.gConfig.IncludeFileCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking header file ifndef ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.h'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        MsgList = c.CheckHeaderFileIfndef(FullName)
             for FullName in EccGlobalData.gHFileList:
                 MsgList = c.CheckHeaderFileIfndef(FullName)
 
@@ -531,11 +448,6 @@ class Check(object):
         if EccGlobalData.gConfig.IncludeFileCheckData == '1' or EccGlobalData.gConfig.IncludeFileCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking header file data ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.h'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        MsgList = c.CheckHeaderFileData(FullName)
             for FullName in EccGlobalData.gHFileList:
                 MsgList = c.CheckHeaderFileData(FullName)
 
@@ -555,10 +467,10 @@ class Check(object):
             for Dirpath, Dirnames, Filenames in self.WalkTree():
                 for F in Filenames:
                     Ext = os.path.splitext(F)[1]
-                    if Ext in ('.h', '.c'):
+                    if Ext in {'.h', '.c'}:
                         FullName = os.path.join(Dirpath, F)
                         MsgList = c.CheckFileHeaderDoxygenComments(FullName)
-                    elif Ext in ('.inf', '.dec', '.dsc', '.fdf'):
+                    elif Ext in {'.inf', '.dec', '.dsc', '.fdf'}:
                         FullName = os.path.join(Dirpath, F)
                         op = open(FullName).readlines()
                         FileLinesList = op
@@ -642,11 +554,6 @@ class Check(object):
         if EccGlobalData.gConfig.DoxygenCheckFunctionHeader == '1' or EccGlobalData.gConfig.DoxygenCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking Doxygen function header ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.h', '.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        MsgList = c.CheckFuncHeaderDoxygenComments(FullName)
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 MsgList = c.CheckFuncHeaderDoxygenComments(FullName)
 
@@ -662,11 +569,6 @@ class Check(object):
         if EccGlobalData.gConfig.DoxygenCheckCommentFormat == '1' or EccGlobalData.gConfig.DoxygenCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking Doxygen comment ///< ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.h', '.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        MsgList = c.CheckDoxygenTripleForwardSlash(FullName)
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 MsgList = c.CheckDoxygenTripleForwardSlash(FullName)
 
@@ -675,11 +577,6 @@ class Check(object):
         if EccGlobalData.gConfig.DoxygenCheckCommand == '1' or EccGlobalData.gConfig.DoxygenCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking Doxygen command ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.h', '.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        MsgList = c.CheckDoxygenCommand(FullName)
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 MsgList = c.CheckDoxygenCommand(FullName)
 
@@ -1027,11 +924,11 @@ class Check(object):
                     for Record in RecordSet:
                         FunName = Record[0]
                         if not EccGlobalData.gException.IsException(ERROR_META_DATA_FILE_CHECK_PCD_TYPE, FunName):
-                            if Model in [MODEL_PCD_FIXED_AT_BUILD] and not FunName.startswith('FixedPcdGet'):
+                            if Model == MODEL_PCD_FIXED_AT_BUILD and not FunName.startswith('FixedPcdGet'):
                                 EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_TYPE, OtherMsg="The pcd '%s' is defined as a FixPcd but now it is called by c function [%s]" % (PcdName, FunName), BelongsToTable=TblName, BelongsToItem=Record[1])
-                            if Model in [MODEL_PCD_FEATURE_FLAG] and not FunName.startswith(('FeaturePcdGet','FeaturePcdSet')):
+                            if Model == MODEL_PCD_FEATURE_FLAG and not FunName.startswith(('FeaturePcdGet','FeaturePcdSet')):
                                 EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_TYPE, OtherMsg="The pcd '%s' is defined as a FeaturePcd but now it is called by c function [%s]" % (PcdName, FunName), BelongsToTable=TblName, BelongsToItem=Record[1])
-                            if Model in [MODEL_PCD_PATCHABLE_IN_MODULE] and not FunName.startswith(('PatchablePcdGet','PatchablePcdSet')):
+                            if Model == MODEL_PCD_PATCHABLE_IN_MODULE and not FunName.startswith(('PatchablePcdGet','PatchablePcdSet')):
                                 EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_TYPE, OtherMsg="The pcd '%s' is defined as a PatchablePcd but now it is called by c function [%s]" % (PcdName, FunName), BelongsToTable=TblName, BelongsToItem=Record[1])
 
             #ERROR_META_DATA_FILE_CHECK_PCD_TYPE
@@ -1188,12 +1085,12 @@ class Check(object):
                 if Usage.startswith(DT.TAB_SPECIAL_COMMENT):
                     PcdCommentList = Usage[2:].split(DT.TAB_SPECIAL_COMMENT)
                     if len(PcdCommentList) >= 1:
-                        if Model in [MODEL_PCD_FIXED_AT_BUILD, MODEL_PCD_FEATURE_FLAG] \
+                        if Model in {MODEL_PCD_FIXED_AT_BUILD, MODEL_PCD_FEATURE_FLAG} \
                             and not PcdCommentList[0].strip().startswith((DT.TAB_INF_USAGE_SOME_PRO,
                                                                           DT.TAB_INF_USAGE_CON,
                                                                           DT.TAB_INF_USAGE_UNDEFINED)):
                             EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_FORMAT_PCD, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
-                        if Model in [MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_DYNAMIC, MODEL_PCD_DYNAMIC_EX] \
+                        if Model in {MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_DYNAMIC, MODEL_PCD_DYNAMIC_EX} \
                             and not PcdCommentList[0].strip().startswith((DT.TAB_INF_USAGE_PRO,
                                                                           DT.TAB_INF_USAGE_SOME_PRO,
                                                                           DT.TAB_INF_USAGE_CON,
@@ -1259,7 +1156,7 @@ class Check(object):
         or EccGlobalData.gConfig.CheckAll == '1':
             for Dirpath, Dirnames, Filenames in self.WalkTree():
                 for F in Filenames:
-                    if os.path.splitext(F)[1] in ('.h', '.c'):
+                    if os.path.splitext(F)[1] in {'.h', '.c'}:
                         FullName = os.path.join(Dirpath, F)
                         Id = c.GetTableID(FullName)
                         if Id < 0:
@@ -1269,7 +1166,7 @@ class Check(object):
                         self.NamingConventionCheckTypedefStatement(FileTable)
                         self.NamingConventionCheckVariableName(FileTable)
                         self.NamingConventionCheckSingleCharacterVariable(FileTable)
-                        if os.path.splitext(F)[1] in ('.h'):
+                        if os.path.splitext(F)[1] == '.h':
                             self.NamingConventionCheckIfndefStatement(FileTable)
 
         self.NamingConventionCheckPathName()
diff --git a/BaseTools/Source/Python/Ecc/MetaDataParser.py b/BaseTools/Source/Python/Ecc/MetaDataParser.py
index 82ede3eb330c..f3b7b41298bc 100644
--- a/BaseTools/Source/Python/Ecc/MetaDataParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaDataParser.py
@@ -135,8 +135,8 @@ def ParseHeaderCommentSection(CommentList, FileName = None):
         # indication of different block; or in the position that Abstract should be, also keep it
         # as it indicates that no abstract
         #
-        if not Comment and HeaderCommentStage not in [HEADER_COMMENT_LICENSE, \
-                                                      HEADER_COMMENT_DESCRIPTION, HEADER_COMMENT_ABSTRACT]:
+        if not Comment and HeaderCommentStage not in {HEADER_COMMENT_LICENSE, \
+                                                      HEADER_COMMENT_DESCRIPTION, HEADER_COMMENT_ABSTRACT}:
             continue
         
         if HeaderCommentStage == HEADER_COMMENT_NOT_STARTED:
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
index 0f9711ba109e..9baa59f94d9c 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
@@ -482,14 +482,14 @@ class InfParser(MetaFileParser):
         IsFindBlockComment = False
 
         for Index in range(0, len(Content)):
-            if self._SectionType in [MODEL_EFI_GUID,
+            if self._SectionType in {MODEL_EFI_GUID,
                                      MODEL_EFI_PROTOCOL,
                                      MODEL_EFI_PPI,
                                      MODEL_PCD_FIXED_AT_BUILD,
                                      MODEL_PCD_PATCHABLE_IN_MODULE,
                                      MODEL_PCD_FEATURE_FLAG,
                                      MODEL_PCD_DYNAMIC_EX,
-                                     MODEL_PCD_DYNAMIC]:
+                                     MODEL_PCD_DYNAMIC}:
                 Line = Content[Index].strip()
                 if Line.startswith(TAB_SPECIAL_COMMENT):
                     Usage += ' ' + Line[Line.find(TAB_SPECIAL_COMMENT):]
@@ -525,7 +525,7 @@ class InfParser(MetaFileParser):
                 self._SectionHeaderParser()
                 # Check invalid sections
                 if self._Version < 0x00010005:
-                    if self._SectionType in [MODEL_META_DATA_BUILD_OPTION,
+                    if self._SectionType in {MODEL_META_DATA_BUILD_OPTION,
                                              MODEL_EFI_LIBRARY_CLASS,
                                              MODEL_META_DATA_PACKAGE,
                                              MODEL_PCD_FIXED_AT_BUILD,
@@ -536,13 +536,13 @@ class InfParser(MetaFileParser):
                                              MODEL_EFI_GUID,
                                              MODEL_EFI_PROTOCOL,
                                              MODEL_EFI_PPI,
-                                             MODEL_META_DATA_USER_EXTENSION]:
+                                             MODEL_META_DATA_USER_EXTENSION}:
                         EdkLogger.error('Parser', FORMAT_INVALID,
                                         "Section [%s] is not allowed in inf file without version" % (self._SectionName),
                                         ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex+1)
-                elif self._SectionType in [MODEL_EFI_INCLUDE,
+                elif self._SectionType in {MODEL_EFI_INCLUDE,
                                            MODEL_EFI_LIBRARY_INSTANCE,
-                                           MODEL_META_DATA_NMAKE]:
+                                           MODEL_META_DATA_NMAKE}:
                     EdkLogger.error('Parser', FORMAT_INVALID,
                                     "Section [%s] is not allowed in inf file with version 0x%08x" % (self._SectionName, self._Version),
                                     ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex+1)
@@ -694,9 +694,9 @@ class InfParser(MetaFileParser):
         # if value are 'True', 'true', 'TRUE' or 'False', 'false', 'FALSE', replace with integer 1 or 0.
         if self._ValueList[2] != '':
             InfPcdValueList = GetSplitValueList(TokenList[1], TAB_VALUE_SPLIT, 1)
-            if InfPcdValueList[0] in ['True', 'true', 'TRUE']:
+            if InfPcdValueList[0] in TAB_TRUE_SET:
                 self._ValueList[2] = TokenList[1].replace(InfPcdValueList[0], '1', 1);
-            elif InfPcdValueList[0] in ['False', 'false', 'FALSE']:
+            elif InfPcdValueList[0] in TAB_FALSE_SET:
                 self._ValueList[2] = TokenList[1].replace(InfPcdValueList[0], '0', 1);
 
     ## [depex] section parser
@@ -933,7 +933,7 @@ class DscParser(MetaFileParser):
         if DirectiveName not in self.DataType:
             EdkLogger.error("Parser", FORMAT_INVALID, "Unknown directive [%s]" % DirectiveName,
                             File=self.MetaFile, Line=self._LineIndex+1)
-        if DirectiveName in ['!IF', '!IFDEF', '!INCLUDE', '!IFNDEF', '!ELSEIF'] and self._ValueList[1] == '':
+        if DirectiveName in {'!IF', '!IFDEF', '!INCLUDE', '!IFNDEF', '!ELSEIF'} and self._ValueList[1] == '':
             EdkLogger.error("Parser", FORMAT_INVALID, "Missing expression",
                             File=self.MetaFile, Line=self._LineIndex+1,
                             ExtraData=self._CurrentLine)
@@ -944,9 +944,9 @@ class DscParser(MetaFileParser):
             while self._DirectiveStack:
                 # Remove any !else or !elseif
                 DirectiveInfo = self._DirectiveStack.pop()
-                if DirectiveInfo[0] in [MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
+                if DirectiveInfo[0] in {MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
                                         MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF,
-                                        MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF]:
+                                        MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF}:
                     break
             else:
                 EdkLogger.error("Parser", FORMAT_INVALID, "Redundant '!endif'",
@@ -1053,9 +1053,9 @@ class DscParser(MetaFileParser):
                             File=self.MetaFile, Line=self._LineIndex+1)
         # if value are 'True', 'true', 'TRUE' or 'False', 'false', 'FALSE', replace with integer 1 or 0.
         DscPcdValueList = GetSplitValueList(TokenList[1], TAB_VALUE_SPLIT, 1)
-        if DscPcdValueList[0] in ['True', 'true', 'TRUE']:
+        if DscPcdValueList[0] in TAB_TRUE_SET:
             self._ValueList[2] = TokenList[1].replace(DscPcdValueList[0], '1', 1);
-        elif DscPcdValueList[0] in ['False', 'false', 'FALSE']:
+        elif DscPcdValueList[0] in TAB_FALSE_SET:
             self._ValueList[2] = TokenList[1].replace(DscPcdValueList[0], '0', 1);
 
     ## [components] section parser
@@ -1121,7 +1121,7 @@ class DscParser(MetaFileParser):
         Macros.update(GlobalData.gPlatformDefines)
         Macros.update(GlobalData.gCommandLineDefines)
         # PCD cannot be referenced in macro definition
-        if self._ItemType not in [MODEL_META_DATA_DEFINE, MODEL_META_DATA_GLOBAL_DEFINE]:
+        if self._ItemType not in {MODEL_META_DATA_DEFINE, MODEL_META_DATA_GLOBAL_DEFINE}:
             Macros.update(self._Symbols)
         return Macros
 
@@ -1303,8 +1303,8 @@ class DscParser(MetaFileParser):
 
     def __ProcessDirective(self):
         Result = None
-        if self._ItemType in [MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
-                              MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSEIF]:
+        if self._ItemType in {MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
+                              MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSEIF}:
             Macros = self._Macros
             Macros.update(GlobalData.gGlobalDefines)
             try:
@@ -1325,9 +1325,9 @@ class DscParser(MetaFileParser):
                 EdkLogger.debug(EdkLogger.DEBUG_5, str(Exc), self._ValueList[1])
                 Result = False
 
-        if self._ItemType in [MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
+        if self._ItemType in {MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
                               MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF,
-                              MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF]:
+                              MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF}:
             self._DirectiveStack.append(self._ItemType)
             if self._ItemType == MODEL_META_DATA_CONDITIONAL_STATEMENT_IF:
                 Result = bool(Result)
@@ -1350,10 +1350,10 @@ class DscParser(MetaFileParser):
             while self._DirectiveStack:
                 self._DirectiveEvalStack.pop()
                 Directive = self._DirectiveStack.pop()
-                if Directive in [MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
+                if Directive in {MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
                                  MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF,
                                  MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSE,
-                                 MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF]:
+                                 MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF}:
                     break
         elif self._ItemType == MODEL_META_DATA_INCLUDE:
             # The included file must be relative to workspace or same directory as DSC file
@@ -1450,9 +1450,9 @@ class DscParser(MetaFileParser):
             except WrnExpression, Value:
                 ValueList[-1] = Value.result
             
-            if ValueList[-1] == 'True':
+            if ValueList[-1] == TAB_TRUE_3:
                 ValueList[-1] = '1'
-            if ValueList[-1] == 'False':
+            if ValueList[-1] == TAB_FALSE_3:
                 ValueList[-1] = '0'      
 
         self._ValueList[2] = '|'.join(ValueList)
@@ -1880,9 +1880,9 @@ class DecParser(MetaFileParser):
             if self._UniObj:
                 self._UniObj.CheckPcdInfo(TokenList[0])
 
-        if ValueList[0] in ['True', 'true', 'TRUE']:
+        if ValueList[0] in TAB_TRUE_SET:
             ValueList[0] = '1'
-        elif ValueList[0] in ['False', 'false', 'FALSE']:
+        elif ValueList[0] in TAB_FALSE_SET:
             ValueList[0] = '0'
 
         self._ValueList[2] = ValueList[0].strip() + '|' + ValueList[1].strip() + '|' + ValueList[2].strip()
diff --git a/BaseTools/Source/Python/Ecc/c.py b/BaseTools/Source/Python/Ecc/c.py
index 4c49d1ca570f..29e8a81d8ef9 100644
--- a/BaseTools/Source/Python/Ecc/c.py
+++ b/BaseTools/Source/Python/Ecc/c.py
@@ -22,6 +22,7 @@ from Common import EdkLogger
 from EccToolError import *
 import EccGlobalData
 import MetaDataParser
+from Common.DataType import TAB_BOOLEAN
 
 IncludeFileListDict = {}
 AllIncludeFileListDict = {}
@@ -518,7 +519,7 @@ def CollectSourceCodeDataIntoDB(RootDir):
             collector = None
             FullName = os.path.normpath(os.path.join(dirpath, f))
             model = DataClass.MODEL_FILE_OTHERS
-            if os.path.splitext(f)[1] in ('.h', '.c'):
+            if os.path.splitext(f)[1] in {'.h', '.c'}:
                 EdkLogger.info("Parsing " + FullName)
                 model = f.endswith('c') and DataClass.MODEL_FILE_C or DataClass.MODEL_FILE_H
                 collector = CodeFragmentCollector.CodeFragmentCollector(FullName)
@@ -543,7 +544,7 @@ def CollectSourceCodeDataIntoDB(RootDir):
 
     Db = GetDB()
     for file in FileObjList:
-        if file.ExtName.upper() not in ['INF', 'DEC', 'DSC', 'FDF']:
+        if file.ExtName.upper() not in {'INF', 'DEC', 'DSC', 'FDF'}:
             Db.InsertOneFile(file)
 
     Db.UpdateIdentifierBelongsToFunction()
@@ -571,7 +572,7 @@ def GetTableID(FullFileName, ErrorMsgList=None):
     return FileID
 
 def GetIncludeFileList(FullFileName):
-    if os.path.splitext(FullFileName)[1].upper() not in ('.H'):
+    if os.path.splitext(FullFileName)[1].upper() != '.H':
         return []
     IFList = IncludeFileListDict.get(FullFileName)
     if IFList is not None:
@@ -991,7 +992,7 @@ def GetFinalTypeValue(Type, FieldName, TypedefDict, SUDict):
     while LBPos == -1:
         FTList = Value.split()
         for FT in FTList:
-            if FT not in ('struct', 'union'):
+            if FT not in {'struct', 'union'}:
                 Value = TypedefDict.get(FT)
                 if Value is None:
                     Value = SUDict.get(FT)
@@ -1639,7 +1640,7 @@ def CheckMemberVariableFormat(Name, Value, FileTable, TdId, ModelId):
         TokenList = Field.split()
         # Remove pointers before variable
         Token = TokenList[-1]
-        if Token in ['OPTIONAL']:
+        if Token == 'OPTIONAL':
             Token = TokenList[-2]
         if not Pattern.match(Token.lstrip('*')):
             ErrMsgList.append(Token.lstrip('*'))
@@ -2046,18 +2047,18 @@ def CheckNonBooleanValueComparison(FullFileName):
                 if SearchInCache:
                     Type = FuncReturnTypeDict.get(PredVarStr)
                     if Type is not None:
-                        if Type.find('BOOLEAN') == -1:
+                        if Type.find(TAB_BOOLEAN) == -1:
                             PrintErrorMsg(ERROR_PREDICATE_EXPRESSION_CHECK_NO_BOOLEAN_OPERATOR, 'Predicate Expression: %s' % Exp, FileTable, Str[2])
                         continue
 
                     if PredVarStr in FuncReturnTypeDict:
                         continue
-                Type = GetVarInfo(PredVarList, FuncRecord, FullFileName, IsFuncCall, 'BOOLEAN', StarList)
+                Type = GetVarInfo(PredVarList, FuncRecord, FullFileName, IsFuncCall, TAB_BOOLEAN, StarList)
                 if SearchInCache:
                     FuncReturnTypeDict[PredVarStr] = Type
                 if Type is None:
                     continue
-                if Type.find('BOOLEAN') == -1:
+                if Type.find(TAB_BOOLEAN) == -1:
                     PrintErrorMsg(ERROR_PREDICATE_EXPRESSION_CHECK_NO_BOOLEAN_OPERATOR, 'Predicate Expression: %s' % Exp, FileTable, Str[2])
 
 
@@ -2101,7 +2102,7 @@ def CheckBooleanValueComparison(FullFileName):
 
         for Exp in GetPredicateListFromPredicateExpStr(Str[0]):
             PredInfo = SplitPredicateStr(Exp)
-            if PredInfo[1] in ('==', '!=') and PredInfo[0][1] in ('TRUE', 'FALSE'):
+            if PredInfo[1] in {'==', '!='} and PredInfo[0][1] in TAB_TRUE_FALSE_SET:
                 PredVarStr = PredInfo[0][0].strip()
                 IsFuncCall = False
                 SearchInCache = False
@@ -2125,19 +2126,19 @@ def CheckBooleanValueComparison(FullFileName):
                 if SearchInCache:
                     Type = FuncReturnTypeDict.get(PredVarStr)
                     if Type is not None:
-                        if Type.find('BOOLEAN') != -1:
+                        if Type.find(TAB_BOOLEAN) != -1:
                             PrintErrorMsg(ERROR_PREDICATE_EXPRESSION_CHECK_BOOLEAN_VALUE, 'Predicate Expression: %s' % Exp, FileTable, Str[2])
                         continue
 
                     if PredVarStr in FuncReturnTypeDict:
                         continue
 
-                Type = GetVarInfo(PredVarList, FuncRecord, FullFileName, IsFuncCall, 'BOOLEAN', StarList)
+                Type = GetVarInfo(PredVarList, FuncRecord, FullFileName, IsFuncCall, TAB_BOOLEAN, StarList)
                 if SearchInCache:
                     FuncReturnTypeDict[PredVarStr] = Type
                 if Type is None:
                     continue
-                if Type.find('BOOLEAN') != -1:
+                if Type.find(TAB_BOOLEAN) != -1:
                     PrintErrorMsg(ERROR_PREDICATE_EXPRESSION_CHECK_BOOLEAN_VALUE, 'Predicate Expression: %s' % Exp, FileTable, Str[2])
 
 
@@ -2236,7 +2237,7 @@ def CheckDoxygenCommand(FullFileName):
                     continue
                 if not Part.replace('@', '').strip():
                     continue
-                if Part.lstrip('@') in ['{', '}']:
+                if Part.lstrip('@') in {'{', '}'}:
                     continue
                 if Part.lstrip('@').isalpha():
                     if Part.lstrip('@') not in DoxygenCommandList:
diff --git a/BaseTools/Source/Python/Eot/Parser.py b/BaseTools/Source/Python/Eot/Parser.py
index 14c287588a01..f7ce6371e0ea 100644
--- a/BaseTools/Source/Python/Eot/Parser.py
+++ b/BaseTools/Source/Python/Eot/Parser.py
@@ -72,8 +72,8 @@ def PreProcess(Filename, MergeMultipleLines = True, LineNo = -1):
             if IsFindBlockCode and Line[-1] != TAB_SLASH:
                 ReservedLine = (ReservedLine + TAB_SPACE_SPLIT + Line).strip()
                 Lines.append(ReservedLine)
-                for Index in (0, ReservedLineLength):
-                    Lines.append('')
+                Lines.append('')
+                Lines.append('')
                 ReservedLine = ''
                 ReservedLineLength = 0
                 IsFindBlockCode = False
diff --git a/BaseTools/Source/Python/Eot/Report.py b/BaseTools/Source/Python/Eot/Report.py
index 99b8b152180a..0e9d7300f4f2 100644
--- a/BaseTools/Source/Python/Eot/Report.py
+++ b/BaseTools/Source/Python/Eot/Report.py
@@ -17,6 +17,7 @@
 import Common.LongFilePathOs as os
 import EotGlobalData
 from Common.LongFilePathSupport import OpenLongFilePath as open
+from Common.DataType import *
 
 ## Report() class
 #
@@ -138,11 +139,13 @@ class Report(object):
     #  @param DepexString: A DEPEX string needed to be parsed
     #
     def GenerateDepex(self, DepexString):
-        NonGuidList = ['AND', 'OR', 'NOT', 'BEFORE', 'AFTER', 'TRUE', 'FALSE']
+        NonGuidSet = {DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER,
+                      DEPEX_OPCODE_AND, DEPEX_OPCODE_OR, DEPEX_OPCODE_NOT,
+                      DEPEX_OPCODE_TRUE, DEPEX_OPCODE_FALSE}
         ItemList = DepexString.split(' ')
         DepexString = ''
         for Item in ItemList:
-            if Item not in NonGuidList:
+            if Item not in NonGuidSet:
                 SqlCommand = """select DISTINCT GuidName from Report where GuidValue like '%s' and ItemMode = 'Produced' group by GuidName""" % (Item)
                 RecordSet = EotGlobalData.gDb.TblReport.Exec(SqlCommand)
                 if RecordSet != []:
@@ -234,7 +237,7 @@ class Report(object):
     #
     def GenerateFfs(self, FfsObj):
         self.FfsIndex = self.FfsIndex + 1
-        if FfsObj is not None and FfsObj.Type in [0x03, 0x04, 0x05, 0x06, 0x07, 0x08, 0xA]:
+        if FfsObj is not None and FfsObj.Type in {0x03, 0x04, 0x05, 0x06, 0x07, 0x08, 0xA}:
             FfsGuid = FfsObj.Guid
             FfsOffset = FfsObj._OFF_
             FfsName = 'Unknown-Module'
@@ -278,9 +281,9 @@ class Report(object):
         <td colspan="4"><table width="100%%"  border="1">""" % (self.FfsIndex, self.FfsIndex, self.FfsIndex, FfsPath, FfsName, FfsGuid, FfsOffset, FfsType, self.FfsIndex)
             
             if self.DispatchList:
-                if FfsObj.Type in [0x04, 0x06]:
+                if FfsObj.Type in {0x04, 0x06}:
                     self.DispatchList.write("%s %s %s %s\n" % (FfsGuid, "P", FfsName, FfsPath))
-                if FfsObj.Type in [0x05, 0x07, 0x08, 0x0A]:
+                if FfsObj.Type in {0x05, 0x07, 0x08, 0x0A}:
                     self.DispatchList.write("%s %s %s %s\n" % (FfsGuid, "D", FfsName, FfsPath))
                
             self.WriteLn(Content)
diff --git a/BaseTools/Source/Python/Eot/c.py b/BaseTools/Source/Python/Eot/c.py
index 8199ce5ee73e..84a6f0961279 100644
--- a/BaseTools/Source/Python/Eot/c.py
+++ b/BaseTools/Source/Python/Eot/c.py
@@ -345,7 +345,7 @@ def CreateCCodeDB(FileNameList):
     ParseErrorFileList = []
     ParsedFiles = {}
     for FullName in FileNameList:
-        if os.path.splitext(FullName)[1] in ('.h', '.c'):
+        if os.path.splitext(FullName)[1] in {'.h', '.c'}:
             if FullName.lower() in ParsedFiles:
                 continue
             ParsedFiles[FullName.lower()] = 1
diff --git a/BaseTools/Source/Python/GenFds/DataSection.py b/BaseTools/Source/Python/GenFds/DataSection.py
index 71c2796b0b39..cc9c4d5b9aa7 100644
--- a/BaseTools/Source/Python/GenFds/DataSection.py
+++ b/BaseTools/Source/Python/GenFds/DataSection.py
@@ -92,7 +92,7 @@ class DataSection (DataSectionClassObject):
                 self.Alignment = str (ImageObj.SectionAlignment / 0x100000) + 'M'
 
         NoStrip = True
-        if self.SecType in (BINARY_FILE_TYPE_TE, BINARY_FILE_TYPE_PE32):
+        if self.SecType in {BINARY_FILE_TYPE_TE, BINARY_FILE_TYPE_PE32}:
             if self.KeepReloc is not None:
                 NoStrip = self.KeepReloc
 
diff --git a/BaseTools/Source/Python/GenFds/DepexSection.py b/BaseTools/Source/Python/GenFds/DepexSection.py
index 4392b9c62409..0521dd5b8d43 100644
--- a/BaseTools/Source/Python/GenFds/DepexSection.py
+++ b/BaseTools/Source/Python/GenFds/DepexSection.py
@@ -82,7 +82,10 @@ class DepexSection (DepexSectionClassObject):
             ExpList = self.Expression.split()
 
             for Exp in ExpList:
-                if Exp.upper() not in ('AND', 'OR', 'NOT', 'TRUE', 'FALSE', 'SOR', 'BEFORE', 'AFTER', 'END'):
+                if Exp.upper() not in {DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER,
+                        DEPEX_OPCODE_AND, DEPEX_OPCODE_OR, DEPEX_OPCODE_NOT,
+                        DEPEX_OPCODE_END, DEPEX_OPCODE_SOR, DEPEX_OPCODE_TRUE,
+                        DEPEX_OPCODE_FALSE}:
                     GuidStr = self.__FindGuidValue(Exp)
                     if GuidStr is None:
                         EdkLogger.error("GenFds", RESOURCE_NOT_AVAILABLE,
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index 55348083b954..61b31bd36ff2 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -280,7 +280,7 @@ class FdfParser:
         Count = 0
         while not self.__EndOfFile():
             Count += 1
-            if self.__CurrentChar() in (T_CHAR_NULL, T_CHAR_CR, T_CHAR_LF, T_CHAR_SPACE, T_CHAR_TAB):
+            if self.__CurrentChar() in {T_CHAR_NULL, T_CHAR_CR, T_CHAR_LF, T_CHAR_SPACE, T_CHAR_TAB}:
                 self.__SkippedChars += str(self.__CurrentChar())
                 self.__GetOneChar()
 
@@ -423,14 +423,14 @@ class FdfParser:
             return
 
         Offset = StartPos[1]
-        while self.Profile.FileLinesList[StartPos[0]][Offset] not in ('\r', '\n'):
+        while self.Profile.FileLinesList[StartPos[0]][Offset] not in {'\r', '\n'}:
             self.Profile.FileLinesList[StartPos[0]][Offset] = Value
             Offset += 1
 
         Line = StartPos[0]
         while Line < EndPos[0]:
             Offset = 0
-            while self.Profile.FileLinesList[Line][Offset] not in ('\r', '\n'):
+            while self.Profile.FileLinesList[Line][Offset] not in {'\r', '\n'}:
                 self.Profile.FileLinesList[Line][Offset] = Value
                 Offset += 1
             Line += 1
@@ -741,7 +741,7 @@ class FdfParser:
                     PreIndex = 0
                     StartPos = CurLine.find('$(', PreIndex)
                     EndPos = CurLine.find(')', StartPos+2)
-                    while StartPos != -1 and EndPos != -1 and self.__Token not in ['!ifdef', '!ifndef', '!if', '!elseif']:
+                    while StartPos != -1 and EndPos != -1 and self.__Token not in {'!ifdef', '!ifndef', '!if', '!elseif'}:
                         MacroName = CurLine[StartPos+2 : EndPos]
                         MacorValue = self.__GetMacroValue(MacroName)
                         if MacorValue is not None:
@@ -792,7 +792,7 @@ class FdfParser:
                 self.Profile.PcdFileLineDict[PcdPair] = FileLineTuple
 
                 self.__WipeOffArea.append(((SetLine, SetOffset), (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - 1)))
-            elif self.__Token in ('!ifdef', '!ifndef', '!if'):
+            elif self.__Token in {'!ifdef', '!ifndef', '!if'}:
                 IfStartPos = (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - len(self.__Token))
                 IfList.append([IfStartPos, None, None])
 
@@ -810,7 +810,7 @@ class FdfParser:
                 IfList[-1] = [IfList[-1][0], ConditionSatisfied, BranchDetermined]
                 if ConditionSatisfied:
                     self.__WipeOffArea.append((IfList[-1][0], (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - 1)))                 
-            elif self.__Token in ('!elseif', '!else'):
+            elif self.__Token in {'!elseif', '!else'}:
                 ElseStartPos = (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - len(self.__Token))
                 if len(IfList) <= 0:
                     raise Warning("Missing !if statement", self.FileName, self.CurrentLineNumber)
@@ -1001,7 +1001,7 @@ class FdfParser:
     def __GetExpression(self):
         Line = self.Profile.FileLinesList[self.CurrentLineNumber - 1]
         Index = len(Line) - 1
-        while Line[Index] in ['\r', '\n']:
+        while Line[Index] in {'\r', '\n'}:
             Index -= 1
         ExpressionString = self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine:Index+1]
         self.CurrentOffsetWithinLine += len(ExpressionString)
@@ -1489,7 +1489,7 @@ class FdfParser:
 
         while self.__GetTokenStatements(FdObj):
             pass
-        for Attr in ("BaseAddress", "Size", "ErasePolarity"):
+        for Attr in ["BaseAddress", "Size", "ErasePolarity"]:
             if getattr(FdObj, Attr) is None:
                 self.__GetNextToken()
                 raise Warning("Keyword %s missing" % Attr, self.FileName, self.CurrentLineNumber)
@@ -1831,7 +1831,7 @@ class FdfParser:
         if not self.__GetNextWord():
             return True
 
-        if not self.__Token in ("SET", BINARY_FILE_TYPE_FV, "FILE", "DATA", "CAPSULE", "INF"):
+        if self.__Token not in {"SET", BINARY_FILE_TYPE_FV, "FILE", "DATA", "CAPSULE", "INF"}:
             #
             # If next token is a word which is not a valid FV type, it might be part of [PcdOffset[|PcdSize]]
             # Or it might be next region's offset described by an expression which starts with a PCD.
@@ -2134,7 +2134,7 @@ class FdfParser:
                 self.__GetFvExtEntryStatement(FvObj) or self.__GetFvNameString(FvObj)):
                 break
 
-        if FvObj.FvNameString == 'TRUE' and not FvObj.FvNameGuid:
+        if FvObj.FvNameString == TAB_TRUE_1 and not FvObj.FvNameGuid:
             raise Warning("FvNameString found but FvNameGuid was not found", self.FileName, self.CurrentLineNumber)
 
         self.__GetAprioriSection(FvObj, FvObj.DefineVarDict.copy())
@@ -2168,10 +2168,10 @@ class FdfParser:
         if not self.__GetNextToken():
             raise Warning("expected alignment value", self.FileName, self.CurrentLineNumber)
 
-        if self.__Token.upper() not in ("1", "2", "4", "8", "16", "32", "64", "128", "256", "512", \
+        if self.__Token.upper() not in {"1", "2", "4", "8", "16", "32", "64", "128", "256", "512", \
                                         "1K", "2K", "4K", "8K", "16K", "32K", "64K", "128K", "256K", "512K", \
                                         "1M", "2M", "4M", "8M", "16M", "32M", "64M", "128M", "256M", "512M", \
-                                        "1G", "2G"):
+                                        "1G", "2G"}:
             raise Warning("Unknown alignment value '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
         Obj.FvAlignment = self.__Token
         return True
@@ -2221,12 +2221,12 @@ class FdfParser:
         if not self.__GetNextToken():
             raise Warning("expected FvForceRebase value", self.FileName, self.CurrentLineNumber)
 
-        if self.__Token.upper() not in ["TRUE", "FALSE", "0", "0X0", "0X00", "1", "0X1", "0X01"]:
+        if self.__Token.upper() not in {TAB_TRUE_1, TAB_FALSE_1, "0", "0X0", "0X00", "1", "0X1", "0X01"}:
             raise Warning("Unknown FvForceRebase value '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
         
-        if self.__Token.upper() in ["TRUE", "1", "0X1", "0X01"]:
+        if self.__Token.upper() in {TAB_TRUE_1, "1", "0X1", "0X01"}:
             Obj.FvForceRebase = True
-        elif self.__Token.upper() in ["FALSE", "0", "0X0", "0X00"]:
+        elif self.__Token.upper() in {TAB_FALSE_1, "0", "0X0", "0X00"}:
             Obj.FvForceRebase = False
         else:
             Obj.FvForceRebase = None
@@ -2247,19 +2247,19 @@ class FdfParser:
         while self.__GetNextWord():
             IsWordToken = True
             name = self.__Token
-            if name not in ("ERASE_POLARITY", "MEMORY_MAPPED", \
+            if name not in {"ERASE_POLARITY", "MEMORY_MAPPED", \
                            "STICKY_WRITE", "LOCK_CAP", "LOCK_STATUS", "WRITE_ENABLED_CAP", \
                            "WRITE_DISABLED_CAP", "WRITE_STATUS", "READ_ENABLED_CAP", \
                            "READ_DISABLED_CAP", "READ_STATUS", "READ_LOCK_CAP", \
                            "READ_LOCK_STATUS", "WRITE_LOCK_CAP", "WRITE_LOCK_STATUS", \
-                           "WRITE_POLICY_RELIABLE", "WEAK_ALIGNMENT", "FvUsedSizeEnable"):
+                           "WRITE_POLICY_RELIABLE", "WEAK_ALIGNMENT", "FvUsedSizeEnable"}:
                 self.__UndoToken()
                 return False
 
             if not self.__IsToken( "="):
                 raise Warning("expected '='", self.FileName, self.CurrentLineNumber)
 
-            if not self.__GetNextToken() or self.__Token.upper() not in ("TRUE", "FALSE", "1", "0"):
+            if not self.__GetNextToken() or self.__Token.upper() not in {TAB_TRUE_1, TAB_FALSE_1, "1", "0"}:
                 raise Warning("expected TRUE/FALSE (1/0)", self.FileName, self.CurrentLineNumber)
 
             FvObj.FvAttributeDict[name] = self.__Token
@@ -2297,7 +2297,7 @@ class FdfParser:
         if not self.__IsToken( "="):
             raise Warning("expected '='", self.FileName, self.CurrentLineNumber)
 
-        if not self.__GetNextToken() or self.__Token not in ('TRUE', 'FALSE'):
+        if not self.__GetNextToken() or self.__Token not in TAB_TRUE_FALSE_SET:
             raise Warning("expected TRUE or FALSE for FvNameString", self.FileName, self.CurrentLineNumber)
 
         FvObj.FvNameString = self.__Token
@@ -2614,7 +2614,7 @@ class FdfParser:
     #
     @staticmethod
     def __FileCouldHaveRelocFlag (FileType):
-        if FileType in (SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM, 'PEI_DXE_COMBO'):
+        if FileType in {SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM, 'PEI_DXE_COMBO'}:
             return True
         else:
             return False
@@ -2629,7 +2629,7 @@ class FdfParser:
     #
     @staticmethod
     def __SectionCouldHaveRelocFlag (SectionType):
-        if SectionType in (BINARY_FILE_TYPE_TE, BINARY_FILE_TYPE_PE32):
+        if SectionType in {BINARY_FILE_TYPE_TE, BINARY_FILE_TYPE_PE32}:
             return True
         else:
             return False
@@ -2676,7 +2676,7 @@ class FdfParser:
                 raise Warning("expected FD name", self.FileName, self.CurrentLineNumber)
             FfsFileObj.FdName = self.__Token
 
-        elif self.__Token in ("DEFINE", "APRIORI", "SECTION"):
+        elif self.__Token in {"DEFINE", "APRIORI", "SECTION"}:
             self.__UndoToken()
             self.__GetSectionData( FfsFileObj, MacroDict)
 
@@ -2707,8 +2707,8 @@ class FdfParser:
         while True:
             AlignValue = None
             if self.__GetAlignment():
-                if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
-                                        "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
+                if self.__Token not in {"Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                                        "256K", "512K", "1M", "2M", "4M", "8M", "16M"}:
                     raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
                 #For FFS, Auto is default option same to ""
                 if not self.__Token == "Auto":
@@ -2766,8 +2766,8 @@ class FdfParser:
             FfsFileObj.CheckSum = True
 
         if self.__GetAlignment():
-            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
-                                    "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
+            if self.__Token not in {"Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                                    "256K", "512K", "1M", "2M", "4M", "8M", "16M"}:
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             #For FFS, Auto is default option same to ""
             if not self.__Token == "Auto":
@@ -2838,8 +2838,8 @@ class FdfParser:
 
         AlignValue = None
         if self.__GetAlignment():
-            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
-                                    "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
+            if self.__Token not in {"Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                                    "256K", "512K", "1M", "2M", "4M", "8M", "16M"}:
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             AlignValue = self.__Token
 
@@ -2953,8 +2953,8 @@ class FdfParser:
                 self.SetFileBufferPos(OldPos)
                 return False
 
-            if self.__Token not in ("COMPAT16", BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE", "RAW", BINARY_FILE_TYPE_DXE_DEPEX,\
-                               BINARY_FILE_TYPE_UI, "VERSION", BINARY_FILE_TYPE_PEI_DEPEX, "SUBTYPE_GUID", BINARY_FILE_TYPE_SMM_DEPEX):
+            if self.__Token not in {"COMPAT16", BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE", "RAW", BINARY_FILE_TYPE_DXE_DEPEX,\
+                               BINARY_FILE_TYPE_UI, "VERSION", BINARY_FILE_TYPE_PEI_DEPEX, "SUBTYPE_GUID", BINARY_FILE_TYPE_SMM_DEPEX}:
                 raise Warning("Unknown section type '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             if AlignValue == 'Auto'and (not self.__Token == BINARY_FILE_TYPE_PE32) and (not self.__Token == BINARY_FILE_TYPE_TE):
                 raise Warning("Auto alignment can only be used in PE32 or TE section ", self.FileName, self.CurrentLineNumber)
@@ -3102,7 +3102,7 @@ class FdfParser:
                     continue
                 except ValueError:
                     raise Warning("expected Number", self.FileName, self.CurrentLineNumber)
-            elif self.__Token.upper() not in ("TRUE", "FALSE", "1", "0"):
+            elif self.__Token.upper() not in {TAB_TRUE_1, TAB_FALSE_1, "1", "0"}:
                 raise Warning("expected TRUE/FALSE (1/0)", self.FileName, self.CurrentLineNumber)
             AttribDict[AttribKey] = self.__Token
 
@@ -3128,8 +3128,8 @@ class FdfParser:
 
         AlignValue = None
         if self.__GetAlignment():
-            if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
-                                    "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
+            if self.__Token not in {"8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                                    "256K", "512K", "1M", "2M", "4M", "8M", "16M"}:
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             AlignValue = self.__Token
 
@@ -3292,21 +3292,21 @@ class FdfParser:
     def __GetCapsuleTokens(self, Obj):
         if not self.__GetNextToken():
             return False
-        while self.__Token in ("CAPSULE_GUID", "CAPSULE_HEADER_SIZE", "CAPSULE_FLAGS", "OEM_CAPSULE_FLAGS", "CAPSULE_HEADER_INIT_VERSION"):
+        while self.__Token in {"CAPSULE_GUID", "CAPSULE_HEADER_SIZE", "CAPSULE_FLAGS", "OEM_CAPSULE_FLAGS", "CAPSULE_HEADER_INIT_VERSION"}:
             Name = self.__Token.strip()
             if not self.__IsToken("="):
                 raise Warning("expected '='", self.FileName, self.CurrentLineNumber)
             if not self.__GetNextToken():
                 raise Warning("expected value", self.FileName, self.CurrentLineNumber)
             if Name == 'CAPSULE_FLAGS':
-                if not self.__Token in ("PersistAcrossReset", "PopulateSystemTable", "InitiateReset"):
+                if self.__Token not in {"PersistAcrossReset", "PopulateSystemTable", "InitiateReset"}:
                     raise Warning("expected PersistAcrossReset, PopulateSystemTable, or InitiateReset", self.FileName, self.CurrentLineNumber)
                 Value = self.__Token.strip()
                 while self.__IsToken(","):
                     Value += ','
                     if not self.__GetNextToken():
                         raise Warning("expected value", self.FileName, self.CurrentLineNumber)
-                    if not self.__Token in ("PersistAcrossReset", "PopulateSystemTable", "InitiateReset"):
+                    if self.__Token not in {"PersistAcrossReset", "PopulateSystemTable", "InitiateReset"}:
                         raise Warning("expected PersistAcrossReset, PopulateSystemTable, or InitiateReset", self.FileName, self.CurrentLineNumber)
                     Value += self.__Token.strip()
             elif Name == 'OEM_CAPSULE_FLAGS':
@@ -3521,7 +3521,7 @@ class FdfParser:
         AfileName = self.__Token
         AfileBaseName = os.path.basename(AfileName)
         
-        if os.path.splitext(AfileBaseName)[1]  not in [".bin",".BIN",".Bin",".dat",".DAT",".Dat",".data",".DATA",".Data"]:
+        if os.path.splitext(AfileBaseName)[1]  not in {".bin",".BIN",".Bin",".dat",".DAT",".Dat",".data",".DATA",".Data"}:
             raise Warning('invalid binary file type, should be one of "bin",BINARY_FILE_TYPE_BIN,"Bin","dat","DAT","Dat","data","DATA","Data"', \
                           self.FileName, self.CurrentLineNumber)
         
@@ -3614,12 +3614,12 @@ class FdfParser:
 
         if not self.__GetNextWord():
             raise Warning("expected Module type", self.FileName, self.CurrentLineNumber)
-        if self.__Token.upper() not in (SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM, SUP_MODULE_DXE_CORE, \
+        if self.__Token.upper() not in {SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM, SUP_MODULE_DXE_CORE, \
                              SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, \
                              SUP_MODULE_DXE_SMM_DRIVER, SUP_MODULE_DXE_RUNTIME_DRIVER, \
                              SUP_MODULE_UEFI_DRIVER, SUP_MODULE_UEFI_APPLICATION, SUP_MODULE_USER_DEFINED, "DEFAULT", SUP_MODULE_BASE, \
                              EDK_COMPONENT_TYPE_SECURITY_CORE, EDK_COMPONENT_TYPE_COMBINED_PEIM_DRIVER, EDK_COMPONENT_TYPE_PIC_PEIM, EDK_COMPONENT_TYPE_RELOCATABLE_PEIM, \
-                                        "PE32_PEIM", EDK_COMPONENT_TYPE_BS_DRIVER, EDK_COMPONENT_TYPE_RT_DRIVER, EDK_COMPONENT_TYPE_SAL_RT_DRIVER, EDK_COMPONENT_TYPE_APPLICATION, "ACPITABLE", SUP_MODULE_SMM_CORE, SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE):
+                                        "PE32_PEIM", EDK_COMPONENT_TYPE_BS_DRIVER, EDK_COMPONENT_TYPE_RT_DRIVER, EDK_COMPONENT_TYPE_SAL_RT_DRIVER, EDK_COMPONENT_TYPE_APPLICATION, "ACPITABLE", SUP_MODULE_SMM_CORE, SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE}:
             raise Warning("Unknown Module type '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
         return self.__Token
 
@@ -3661,8 +3661,8 @@ class FdfParser:
             raise Warning("expected FFS type", self.FileName, self.CurrentLineNumber)
 
         Type = self.__Token.strip().upper()
-        if Type not in ("RAW", "FREEFORM", SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM,\
-                             "PEI_DXE_COMBO", "DRIVER", SUP_MODULE_DXE_CORE, EDK_COMPONENT_TYPE_APPLICATION, "FV_IMAGE", "SMM", SUP_MODULE_SMM_CORE, SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE):
+        if Type not in {"RAW", "FREEFORM", SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM,\
+                             "PEI_DXE_COMBO", "DRIVER", SUP_MODULE_DXE_CORE, EDK_COMPONENT_TYPE_APPLICATION, "FV_IMAGE", "SMM", SUP_MODULE_SMM_CORE, SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE}:
             raise Warning("Unknown FV type '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
 
         if not self.__IsToken("="):
@@ -3718,8 +3718,8 @@ class FdfParser:
 
         AlignValue = ""
         if self.__GetAlignment():
-            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
-                                    "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
+            if self.__Token not in {"Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                                    "256K", "512K", "1M", "2M", "4M", "8M", "16M"}:
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             #For FFS, Auto is default option same to ""
             if not self.__Token == "Auto":
@@ -3755,8 +3755,8 @@ class FdfParser:
 
             SectionName = self.__Token
 
-            if SectionName not in ("COMPAT16", BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE", "RAW", BINARY_FILE_TYPE_DXE_DEPEX,\
-                                    BINARY_FILE_TYPE_UI, BINARY_FILE_TYPE_PEI_DEPEX, "VERSION", "SUBTYPE_GUID", BINARY_FILE_TYPE_SMM_DEPEX):
+            if SectionName not in {"COMPAT16", BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE", "RAW", BINARY_FILE_TYPE_DXE_DEPEX,\
+                                    BINARY_FILE_TYPE_UI, BINARY_FILE_TYPE_PEI_DEPEX, "VERSION", "SUBTYPE_GUID", BINARY_FILE_TYPE_SMM_DEPEX}:
                 raise Warning("Unknown leaf section name '%s'" % SectionName, self.FileName, self.CurrentLineNumber)
 
 
@@ -3768,8 +3768,8 @@ class FdfParser:
 
             SectAlignment = ""
             if self.__GetAlignment():
-                if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
-                                        "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
+                if self.__Token not in {"Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                                        "256K", "512K", "1M", "2M", "4M", "8M", "16M"}:
                     raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
                 if self.__Token == 'Auto' and (not SectionName == BINARY_FILE_TYPE_PE32) and (not SectionName == BINARY_FILE_TYPE_TE):
                     raise Warning("Auto alignment can only be used in PE32 or TE section ", self.FileName, self.CurrentLineNumber)
@@ -3812,8 +3812,8 @@ class FdfParser:
             return False
         SectionName = self.__Token
 
-        if SectionName not in ("COMPAT16", BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE", "RAW", BINARY_FILE_TYPE_DXE_DEPEX,\
-                               BINARY_FILE_TYPE_UI, "VERSION", BINARY_FILE_TYPE_PEI_DEPEX, BINARY_FILE_TYPE_GUID, BINARY_FILE_TYPE_SMM_DEPEX):
+        if SectionName not in {"COMPAT16", BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE", "RAW", BINARY_FILE_TYPE_DXE_DEPEX,\
+                               BINARY_FILE_TYPE_UI, "VERSION", BINARY_FILE_TYPE_PEI_DEPEX, BINARY_FILE_TYPE_GUID, BINARY_FILE_TYPE_SMM_DEPEX}:
             self.__UndoToken()
             return False
 
@@ -3848,16 +3848,16 @@ class FdfParser:
                 FvImageSectionObj.FvFileType = self.__Token
 
                 if self.__GetAlignment():
-                    if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
-                                            "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
+                    if self.__Token not in {"8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                                            "256K", "512K", "1M", "2M", "4M", "8M", "16M"}:
                         raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
                     FvImageSectionObj.Alignment = self.__Token
 
                 if self.__IsToken('|'):
                     FvImageSectionObj.FvFileExtension = self.__GetFileExtension()
                 elif self.__GetNextToken():
-                    if self.__Token not in ("}", "COMPAT16", BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE", "RAW", BINARY_FILE_TYPE_DXE_DEPEX,\
-                               BINARY_FILE_TYPE_UI, "VERSION", BINARY_FILE_TYPE_PEI_DEPEX, BINARY_FILE_TYPE_GUID, BINARY_FILE_TYPE_SMM_DEPEX):
+                    if self.__Token not in {"}", "COMPAT16", BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE", "RAW", BINARY_FILE_TYPE_DXE_DEPEX,\
+                               BINARY_FILE_TYPE_UI, "VERSION", BINARY_FILE_TYPE_PEI_DEPEX, BINARY_FILE_TYPE_GUID, BINARY_FILE_TYPE_SMM_DEPEX}:
                         FvImageSectionObj.FvFileName = self.__Token
                     else:
                         self.__UndoToken()
@@ -3916,8 +3916,8 @@ class FdfParser:
                 EfiSectionObj.BuildNum = self.__Token
 
         if self.__GetAlignment():
-            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
-                                    "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
+            if self.__Token not in {"Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                                    "256K", "512K", "1M", "2M", "4M", "8M", "16M"}:
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             if self.__Token == 'Auto' and (not SectionName == BINARY_FILE_TYPE_PE32) and (not SectionName == BINARY_FILE_TYPE_TE):
                 raise Warning("Auto alignment can only be used in PE32 or TE section ", self.FileName, self.CurrentLineNumber)
@@ -3938,8 +3938,8 @@ class FdfParser:
         if self.__IsToken('|'):
             EfiSectionObj.FileExtension = self.__GetFileExtension()
         elif self.__GetNextToken():
-            if self.__Token not in ("}", "COMPAT16", BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE", "RAW", BINARY_FILE_TYPE_DXE_DEPEX,\
-                       BINARY_FILE_TYPE_UI, "VERSION", BINARY_FILE_TYPE_PEI_DEPEX, BINARY_FILE_TYPE_GUID, BINARY_FILE_TYPE_SMM_DEPEX):
+            if self.__Token not in {"}", "COMPAT16", BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE", "RAW", BINARY_FILE_TYPE_DXE_DEPEX,\
+                       BINARY_FILE_TYPE_UI, "VERSION", BINARY_FILE_TYPE_PEI_DEPEX, BINARY_FILE_TYPE_GUID, BINARY_FILE_TYPE_SMM_DEPEX}:
                 
                 if self.__Token.startswith('PCD'):
                     self.__UndoToken()
@@ -3973,7 +3973,7 @@ class FdfParser:
     #
     @staticmethod
     def __RuleSectionCouldBeOptional(SectionType):
-        if SectionType in (BINARY_FILE_TYPE_DXE_DEPEX, BINARY_FILE_TYPE_UI, "VERSION", BINARY_FILE_TYPE_PEI_DEPEX, "RAW", BINARY_FILE_TYPE_SMM_DEPEX):
+        if SectionType in {BINARY_FILE_TYPE_DXE_DEPEX, BINARY_FILE_TYPE_UI, "VERSION", BINARY_FILE_TYPE_PEI_DEPEX, "RAW", BINARY_FILE_TYPE_SMM_DEPEX}:
             return True
         else:
             return False
@@ -3988,7 +3988,7 @@ class FdfParser:
     #
     @staticmethod
     def __RuleSectionCouldHaveBuildNum(SectionType):
-        if SectionType in ("VERSION"):
+        if SectionType == "VERSION":
             return True
         else:
             return False
@@ -4003,7 +4003,7 @@ class FdfParser:
     #
     @staticmethod
     def __RuleSectionCouldHaveString(SectionType):
-        if SectionType in (BINARY_FILE_TYPE_UI, "VERSION"):
+        if SectionType in {BINARY_FILE_TYPE_UI, "VERSION"}:
             return True
         else:
             return False
@@ -4018,34 +4018,34 @@ class FdfParser:
     #
     def __CheckRuleSectionFileType(self, SectionType, FileType):
         if SectionType == "COMPAT16":
-            if FileType not in ("COMPAT16", "SEC_COMPAT16"):
+            if FileType not in {"COMPAT16", "SEC_COMPAT16"}:
                 raise Warning("Incorrect section file type '%s'" % FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == BINARY_FILE_TYPE_PE32:
-            if FileType not in (BINARY_FILE_TYPE_PE32, "SEC_PE32"):
+            if FileType not in {BINARY_FILE_TYPE_PE32, "SEC_PE32"}:
                 raise Warning("Incorrect section file type '%s'" % FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == BINARY_FILE_TYPE_PIC:
-            if FileType not in (BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_PIC):
+            if FileType not in {BINARY_FILE_TYPE_PIC}:
                 raise Warning("Incorrect section file type '%s'" % FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == BINARY_FILE_TYPE_TE:
-            if FileType not in (BINARY_FILE_TYPE_TE, "SEC_TE"):
+            if FileType not in {BINARY_FILE_TYPE_TE, "SEC_TE"}:
                 raise Warning("Incorrect section file type '%s'" % FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == "RAW":
-            if FileType not in (BINARY_FILE_TYPE_BIN, "SEC_BIN", "RAW", "ASL", "ACPI"):
+            if FileType not in {BINARY_FILE_TYPE_BIN, "SEC_BIN", "RAW", "ASL", "ACPI"}:
                 raise Warning("Incorrect section file type '%s'" % FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == BINARY_FILE_TYPE_DXE_DEPEX or SectionType == BINARY_FILE_TYPE_SMM_DEPEX:
-            if FileType not in (BINARY_FILE_TYPE_DXE_DEPEX, "SEC_DXE_DEPEX", BINARY_FILE_TYPE_SMM_DEPEX):
+            if FileType not in {BINARY_FILE_TYPE_DXE_DEPEX, "SEC_DXE_DEPEX", BINARY_FILE_TYPE_SMM_DEPEX}:
                 raise Warning("Incorrect section file type '%s'" % FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == BINARY_FILE_TYPE_UI:
-            if FileType not in (BINARY_FILE_TYPE_UI, "SEC_UI"):
+            if FileType not in {BINARY_FILE_TYPE_UI, "SEC_UI"}:
                 raise Warning("Incorrect section file type '%s'" % FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == "VERSION":
-            if FileType not in ("VERSION", "SEC_VERSION"):
+            if FileType not in {"VERSION", "SEC_VERSION"}:
                 raise Warning("Incorrect section file type '%s'" % FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == BINARY_FILE_TYPE_PEI_DEPEX:
-            if FileType not in (BINARY_FILE_TYPE_PEI_DEPEX, "SEC_PEI_DEPEX"):
+            if FileType not in {BINARY_FILE_TYPE_PEI_DEPEX, "SEC_PEI_DEPEX"}:
                 raise Warning("Incorrect section file type '%s'" % FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == BINARY_FILE_TYPE_GUID:
-            if FileType not in (BINARY_FILE_TYPE_PE32, "SEC_GUID"):
+            if FileType not in {BINARY_FILE_TYPE_PE32, "SEC_GUID"}:
                 raise Warning("Incorrect section file type '%s'" % FileType, self.FileName, self.CurrentLineNumber)
 
     ## __GetRuleEncapsulationSection() method
@@ -4147,7 +4147,7 @@ class FdfParser:
             raise Warning("expected '.'", self.FileName, self.CurrentLineNumber)
 
         Arch = self.__SkippedChars.rstrip(".").upper()
-        if Arch not in ("IA32", "X64", "IPF", "ARM", "AARCH64"):
+        if Arch not in {"IA32", "X64", "IPF", "ARM", "AARCH64"}:
             raise Warning("Unknown Arch '%s'" % Arch, self.FileName, self.CurrentLineNumber)
 
         if not self.__GetNextWord():
@@ -4161,7 +4161,7 @@ class FdfParser:
         if self.__IsToken(","):
             if not self.__GetNextWord():
                 raise Warning("expected Arch list", self.FileName, self.CurrentLineNumber)
-            if self.__Token.upper() not in ("IA32", "X64", "IPF", "ARM", "AARCH64"):
+            if self.__Token.upper() not in {"IA32", "X64", "IPF", "ARM", "AARCH64"}:
                 raise Warning("Unknown Arch '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             VtfObj.ArchList = self.__Token.upper()
 
@@ -4224,7 +4224,7 @@ class FdfParser:
                 if not self.__GetNextWord():
                     raise Warning("Expected Region Name", self.FileName, self.CurrentLineNumber)
 
-                if self.__Token not in ("F", "N", "S"):    #, "H", "L", "PH", "PL"): not support
+                if self.__Token not in {"F", "N", "S"}:    #, "H", "L", "PH", "PL"): not support
                     raise Warning("Unknown location type '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
 
                 CompStatementObj.FilePos = self.__Token
@@ -4240,7 +4240,7 @@ class FdfParser:
 
         if not self.__GetNextToken():
             raise Warning("expected Component type", self.FileName, self.CurrentLineNumber)
-        if self.__Token not in ("FIT", "PAL_B", "PAL_A", "OEM"):
+        if self.__Token not in {"FIT", "PAL_B", "PAL_A", "OEM"}:
             if not self.__Token.startswith("0x") or len(self.__Token) < 3 or len(self.__Token) > 4 or \
                 not self.__Token[2] in string.hexdigits or not self.__Token[-1] in string.hexdigits:
                 raise Warning("Unknown location type '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
@@ -4268,7 +4268,7 @@ class FdfParser:
 
         if not self.__GetNextToken():
             raise Warning("expected Component CS", self.FileName, self.CurrentLineNumber)
-        if self.__Token not in ("1", "0"):
+        if self.__Token not in {"1", "0"}:
             raise Warning("Unknown  Component CS '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
         CompStatementObj.CompCs = self.__Token
 
@@ -4456,7 +4456,7 @@ class FdfParser:
                         raise Warning("expected '='", self.FileName, self.CurrentLineNumber)
                     if not self.__GetNextToken():
                         raise Warning("expected TRUE/FALSE for compress", self.FileName, self.CurrentLineNumber)
-                    Overrides.NeedCompress = self.__Token.upper() == 'TRUE'
+                    Overrides.NeedCompress = self.__Token.upper() == TAB_TRUE_1
                     continue
 
                 if self.__IsToken( "}"):
diff --git a/BaseTools/Source/Python/GenFds/FfsInfStatement.py b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
index e4276c3a8c07..f62ee73b1238 100644
--- a/BaseTools/Source/Python/GenFds/FfsInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
@@ -748,7 +748,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
             if SectionType == BINARY_FILE_TYPE_SMM_DEPEX:
                 EdkLogger.error("GenFds", FORMAT_NOT_SUPPORTED, "Framework SMM module doesn't support SMM_DEPEX section type", File=self.InfFileName)
         NoStrip = True
-        if self.ModuleType in (SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM):
+        if self.ModuleType in {SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM}:
             if self.KeepReloc is not None:
                 NoStrip = self.KeepReloc
             elif Rule.KeepReloc is not None:
@@ -902,7 +902,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
     #   @retval string       File name of the generated section file
     #
     def __GenComplexFileSection__(self, Rule, FvChildAddr, FvParentAddr, IsMakefile = False):
-        if self.ModuleType in (SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM):
+        if self.ModuleType in {SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM}:
             if Rule.KeepReloc is not None:
                 self.KeepRelocFromRule = Rule.KeepReloc
         SectFiles = []
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index c672f1d7d8fa..6c90fa3ca9e6 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -297,7 +297,7 @@ class FV (FvClassObject):
         if self.FvAttributeDict:
             for FvAttribute in self.FvAttributeDict.keys() :
                 if FvAttribute == "FvUsedSizeEnable":
-                    if self.FvAttributeDict[FvAttribute].upper() in {'TRUE', '1'}:
+                    if self.FvAttributeDict[FvAttribute].upper() in {TAB_TRUE_1, '1'}:
                         self.UsedSizeEnable = True
                     continue
                 self.FvInfFile.writelines("EFI_{FA} = {VAL}{END}".format(FA=FvAttribute, VAL=self.FvAttributeDict[FvAttribute], END=TAB_LINE_BREAK))
@@ -323,7 +323,7 @@ class FV (FvClassObject):
                 # } EFI_FIRMWARE_VOLUME_EXT_ENTRY_USED_SIZE_TYPE;
                 Buffer += pack('HHL', 8, 3, 0)
 
-            if self.FvNameString == 'TRUE':
+            if self.FvNameString == TAB_TRUE_1:
                 #
                 # Create EXT entry for FV UI name
                 # This GUID is used: A67DF1FA-8DE8-4E98-AF09-4BDF2EFFBC7C
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index 998bd5345c3c..b9167bac7eda 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -212,7 +212,7 @@ def main():
                     else:
                         GlobalData.gCommandLineDefines[List[0].strip()] = List[1].strip()
                 else:
-                    GlobalData.gCommandLineDefines[List[0].strip()] = "TRUE"
+                    GlobalData.gCommandLineDefines[List[0].strip()] = TAB_TRUE_1
         os.environ["WORKSPACE"] = Workspace
 
         # Use the -t and -b option as gGlobalDefines's TOOLCHAIN and TARGET if they are not defined
@@ -432,7 +432,7 @@ def FindExtendTool(KeyStringList, CurrentArchList, NameGuid):
                     List = Key.split('_')
                     if List[Index] == '*':
                         for String in ToolDb[ToolList[Index]]:
-                            if String in [Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]:
+                            if String in {Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag}:
                                 List[Index] = String
                                 NewKey = '%s_%s_%s_%s_%s' % tuple(List)
                                 if NewKey not in BuildOption:
diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
index b840079e7ad4..e7dd212b649e 100644
--- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
+++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
@@ -212,12 +212,12 @@ class GenFdsGlobalVariable:
 
         if not Inf.IsBinaryModule:
             for File in Inf.Sources:
-                if File.TagName in ("", "*", GenFdsGlobalVariable.ToolChainTag) and \
-                    File.ToolChainFamily in ("", "*", GenFdsGlobalVariable.ToolChainFamily):
+                if File.TagName in {"", "*", GenFdsGlobalVariable.ToolChainTag} and \
+                    File.ToolChainFamily in {"", "*", GenFdsGlobalVariable.ToolChainFamily}:
                     FileList.append((File, DataType.TAB_UNKNOWN_FILE))
 
         for File in Inf.Binaries:
-            if File.Target in [DataType.TAB_COMMON, '*', GenFdsGlobalVariable.TargetName]:
+            if File.Target in {DataType.TAB_COMMON, '*', GenFdsGlobalVariable.TargetName}:
                 FileList.append((File, File.Type))
 
         for File, FileType in FileList:
@@ -494,9 +494,9 @@ class GenFdsGlobalVariable:
     def GetAlignment (AlignString):
         if AlignString is None:
             return 0
-        if AlignString in ("1K", "2K", "4K", "8K", "16K", "32K", "64K", "128K", "256K", "512K"):
+        if AlignString in {"1K", "2K", "4K", "8K", "16K", "32K", "64K", "128K", "256K", "512K"}:
             return int (AlignString.rstrip('K')) * 1024
-        elif AlignString in ("1M", "2M", "4M", "8M", "16M"):
+        elif AlignString in {"1M", "2M", "4M", "8M", "16M"}:
             return int (AlignString.rstrip('M')) * 1024 * 1024
         else:
             return int (AlignString)
@@ -551,9 +551,9 @@ class GenFdsGlobalVariable:
             Cmd += ["-r", BaseAddress]
 
         if ForceRebase == False:
-            Cmd += ["-F", "FALSE"]
+            Cmd += ["-F", DataType.TAB_FALSE_1]
         elif ForceRebase == True:
-            Cmd += ["-F", "TRUE"]
+            Cmd += ["-F", DataType.TAB_TRUE_1]
 
         if Capsule:
             Cmd += ["-c"]
@@ -686,7 +686,7 @@ class GenFdsGlobalVariable:
 
     def CallExternalTool (cmd, errorMess, returnValue=[]):
 
-        if type(cmd) not in (tuple, list):
+        if type(cmd) not in {tuple, list}:
             GenFdsGlobalVariable.ErrorLogger("ToolError!  Invalid parameter type in call to CallExternalTool")
 
         if GenFdsGlobalVariable.DebugLevel != -1:
diff --git a/BaseTools/Source/Python/GenFds/GuidSection.py b/BaseTools/Source/Python/GenFds/GuidSection.py
index 28571292f5a6..b36d2868059a 100644
--- a/BaseTools/Source/Python/GenFds/GuidSection.py
+++ b/BaseTools/Source/Python/GenFds/GuidSection.py
@@ -77,7 +77,7 @@ class GuidSection(GuidSectionClassObject) :
         else:
             FvAddrIsSet = False
         
-        if self.ProcessRequired in ("TRUE", "1"):
+        if self.ProcessRequired in {TAB_TRUE_1, "1"}:
             if self.FvAddr != []:
                 #no use FvAddr when the image is processed.
                 self.FvAddr = []
@@ -175,7 +175,7 @@ class GuidSection(GuidSectionClassObject) :
             if ExternalOption is not None:
                 CmdOption = CmdOption + ' ' + ExternalOption
             if not GenFdsGlobalVariable.EnableGenfdsMultiThread:
-                if self.ProcessRequired not in ("TRUE", "1") and self.IncludeFvSection and not FvAddrIsSet and self.FvParentAddr is not None:
+                if self.ProcessRequired not in {TAB_TRUE_1, "1"} and self.IncludeFvSection and not FvAddrIsSet and self.FvParentAddr is not None:
                     #FirstCall is only set for the encapsulated flash FV image without process required attribute.
                     FirstCall = True
                 #
@@ -232,11 +232,11 @@ class GuidSection(GuidSectionClassObject) :
                 #
                 # Call Gensection Add Section Header
                 #
-                if self.ProcessRequired in ("TRUE", "1"):
+                if self.ProcessRequired in {TAB_TRUE_1, "1"}:
                     if 'PROCESSING_REQUIRED' not in Attribute:
                         Attribute.append('PROCESSING_REQUIRED')
 
-                if self.AuthStatusValid in ("TRUE", "1"):
+                if self.AuthStatusValid in {TAB_TRUE_1, "1"}:
                     Attribute.append('AUTH_STATUS_VALID')
                 GenFdsGlobalVariable.GenerateSection(OutputFile, [TempFile], Section.SectionType['GUIDED'],
                                                      Guid=self.NameGuid, GuidAttr=Attribute, GuidHdrLen=HeaderLength)
@@ -248,14 +248,14 @@ class GuidSection(GuidSectionClassObject) :
                 HeaderLength = None
                 if self.ExtraHeaderSize != -1:
                     HeaderLength = str(self.ExtraHeaderSize)
-                if self.AuthStatusValid in ("TRUE", "1"):
+                if self.AuthStatusValid in {TAB_TRUE_1, "1"}:
                     Attribute.append('AUTH_STATUS_VALID')
                 if self.ProcessRequired == "NONE" and HeaderLength is None:
                     GenFdsGlobalVariable.GenerateSection(OutputFile, [TempFile], Section.SectionType['GUIDED'],
                                                          Guid=self.NameGuid, GuidAttr=Attribute,
                                                          GuidHdrLen=HeaderLength, DummyFile=DummyFile, IsMakefile=IsMakefile)
                 else:
-                    if self.ProcessRequired in ("TRUE", "1"):
+                    if self.ProcessRequired in {TAB_TRUE_1, "1"}:
                         if 'PROCESSING_REQUIRED' not in Attribute:
                             Attribute.append('PROCESSING_REQUIRED')
                     GenFdsGlobalVariable.GenerateSection(OutputFile, [TempFile], Section.SectionType['GUIDED'],
@@ -268,7 +268,7 @@ class GuidSection(GuidSectionClassObject) :
                 # reset guided section alignment to none for the processed required guided data
                 self.Alignment = None
                 self.IncludeFvSection = False
-                self.ProcessRequired = "TRUE"
+                self.ProcessRequired = TAB_TRUE_1
             if IsMakefile and self.Alignment is not None and self.Alignment.strip() == '0':
                 self.Alignment = '1'
             return OutputFileList, self.Alignment
diff --git a/BaseTools/Source/Python/GenFds/OptRomInfStatement.py b/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
index 93c4456eb89f..a20c28314894 100644
--- a/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
@@ -52,10 +52,10 @@ class OptRomInfStatement (FfsInfStatement):
         if self.OverrideAttribs.NeedCompress is None:
             self.OverrideAttribs.NeedCompress = self.OptRomDefs.get ('PCI_COMPRESS')
             if self.OverrideAttribs.NeedCompress is not None:
-                if self.OverrideAttribs.NeedCompress.upper() not in ('TRUE', 'FALSE'):
+                if self.OverrideAttribs.NeedCompress.upper() not in TAB_TRUE_FALSE_SET:
                     GenFdsGlobalVariable.ErrorLogger( "Expected TRUE/FALSE for PCI_COMPRESS: %s" %self.InfFileName)
                 self.OverrideAttribs.NeedCompress = \
-                    self.OverrideAttribs.NeedCompress.upper() == 'TRUE'
+                    self.OverrideAttribs.NeedCompress.upper() == TAB_TRUE_1
 
         if self.OverrideAttribs.PciVendorId is None:
             self.OverrideAttribs.PciVendorId = self.OptRomDefs.get ('PCI_VENDOR_ID')
diff --git a/BaseTools/Source/Python/GenFds/Region.py b/BaseTools/Source/Python/GenFds/Region.py
index e67d056cc178..57d5d15c36a5 100644
--- a/BaseTools/Source/Python/GenFds/Region.py
+++ b/BaseTools/Source/Python/GenFds/Region.py
@@ -220,7 +220,7 @@ class Region(RegionClassObject):
             #
             self.PadBuffer(Buffer, ErasePolarity, Size)
 
-        if self.RegionType in ('FILE', 'INF'):
+        if self.RegionType in {'FILE', 'INF'}:
             for RegionData in self.RegionDataList:
                 if self.RegionType == 'INF':
                     RegionData.__InfParse__(None)
diff --git a/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py b/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
index 76fef41176ac..6c85ff4dd073 100644
--- a/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
+++ b/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
@@ -59,18 +59,8 @@ def PatchBinaryFile(FileName, ValueOffset, TypeName, ValueString, MaxSize=0):
     #
     # Get PCD value data length
     #
-    ValueLength = 0
-    if TypeName == 'BOOLEAN':
-        ValueLength = 1
-    elif TypeName == TAB_UINT8:
-        ValueLength = 1
-    elif TypeName == TAB_UINT16:
-        ValueLength = 2
-    elif TypeName == TAB_UINT32:
-        ValueLength = 4
-    elif TypeName == TAB_UINT64:
-        ValueLength = 8
-    elif TypeName == TAB_VOID:
+    ValueLength = MAX_SIZE_TYPE.get(TypeName, 0)
+    if TypeName == TAB_VOID:
         if MaxSize == 0:
             return OPTION_MISSING, "PcdMaxSize is not specified for VOID* type PCD."
         ValueLength = int(MaxSize)
@@ -100,14 +90,14 @@ def PatchBinaryFile(FileName, ValueOffset, TypeName, ValueString, MaxSize=0):
     SavedStr = ValueString
     ValueString = ValueString.upper()
     ValueNumber = 0
-    if TypeName == 'BOOLEAN':
+    if TypeName == TAB_BOOLEAN:
         #
         # Get PCD value for BOOLEAN data type
         #
         try:
-            if ValueString == 'TRUE':
+            if ValueString == TAB_TRUE_1:
                 ValueNumber = 1
-            elif ValueString == 'FALSE':
+            elif ValueString == TAB_FALSE_1:
                 ValueNumber = 0
             ValueNumber = int (ValueString, 0)
             if ValueNumber != 0:
diff --git a/BaseTools/Source/Python/Trim/Trim.py b/BaseTools/Source/Python/Trim/Trim.py
index a92df52979c6..b65c7bead814 100644
--- a/BaseTools/Source/Python/Trim/Trim.py
+++ b/BaseTools/Source/Python/Trim/Trim.py
@@ -300,7 +300,7 @@ def TrimPreprocessedVfr(Source, Target):
             FoundTypedef = False
             TypedefEnd = Index
             # keep all "typedef struct" except to GUID, EFI_PLABEL and PAL_CALL_RETURN
-            if Line.strip("} ;\r\n") in [TAB_GUID, "EFI_PLABEL", "PAL_CALL_RETURN"]:
+            if Line.strip("} ;\r\n") in {TAB_GUID, "EFI_PLABEL", "PAL_CALL_RETURN"}:
                 for i in range(TypedefStart, TypedefEnd+1):
                     Lines[i] = "\n"
 
@@ -357,7 +357,7 @@ def DoInclude(Source, Indent='', IncludePathList=[], LocalSearchPath=None):
         Result = gAslIncludePattern.findall(Line)
         if len(Result) == 0:
             Result = gAslCIncludePattern.findall(Line)
-            if len(Result) == 0 or os.path.splitext(Result[0][1])[1].lower() not in [".asl", ".asi"]:
+            if len(Result) == 0 or os.path.splitext(Result[0][1])[1].lower() not in {".asl", ".asi"}:
                 NewFileContent.append("%s%s" % (Indent, Line))
                 continue
             #
@@ -499,7 +499,8 @@ def TrimEdkSources(Source, Target):
 
             for FileName in Files:
                 Dummy, Ext = os.path.splitext(FileName)
-                if Ext.upper() not in ['.C', '.H']: continue
+                if Ext.upper() not in {'.C', '.H'}:
+                    continue
                 if Target is None or Target == '':
                     TrimEdkSourceCode(
                         os.path.join(CurrentDir, FileName),
diff --git a/BaseTools/Source/Python/Workspace/DecBuildData.py b/BaseTools/Source/Python/Workspace/DecBuildData.py
index 1fbd095f743c..cb6e431b09be 100644
--- a/BaseTools/Source/Python/Workspace/DecBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DecBuildData.py
@@ -453,7 +453,7 @@ class DecBuildData(PackageBuildClassObject):
             Pcds[pcd.TokenCName, pcd.TokenSpaceGuidCName, self._PCD_TYPE_STRING_[Type]] = pcd
         StructPattern = re.compile(r'[_a-zA-Z][0-9A-Za-z_]*$')
         for pcd in Pcds.values():
-            if pcd.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
+            if pcd.DatumType not in TAB_PCD_NUMERIC_TYPES_VOID:
                 if StructPattern.match(pcd.DatumType) is None:
                     EdkLogger.error('build', FORMAT_INVALID, "DatumType only support BOOLEAN, UINT8, UINT16, UINT32, UINT64, VOID* or a valid struct name.", pcd.DefinitionPosition[0],pcd.DefinitionPosition[1])
         for struct_pcd in Pcds.values():
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index 7b062b564da5..7944f7cf4d23 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -483,16 +483,16 @@ class DscBuildData(PlatformBuildClassObject):
         return self._BuildTargets
 
     def _GetPcdInfoFlag(self):
-        if self._PcdInfoFlag is None or self._PcdInfoFlag.upper() == 'FALSE':
+        if self._PcdInfoFlag is None or self._PcdInfoFlag.upper() == TAB_FALSE_1:
             return False
-        elif self._PcdInfoFlag.upper() == 'TRUE':
+        elif self._PcdInfoFlag.upper() == TAB_TRUE_1:
             return True
         else:
             return False
     def _GetVarCheckFlag(self):
-        if self._VarCheckFlag is None or self._VarCheckFlag.upper() == 'FALSE':
+        if self._VarCheckFlag is None or self._VarCheckFlag.upper() == TAB_FALSE_1:
             return False
-        elif self._VarCheckFlag.upper() == 'TRUE':
+        elif self._VarCheckFlag.upper() == TAB_TRUE_1:
             return True
         else:
             return False
@@ -810,7 +810,7 @@ class DscBuildData(PlatformBuildClassObject):
                     EdkLogger.error('build', ErrorCode, File=self.MetaFile, Line=LineNo,
                                     ExtraData=ErrorInfo)
 
-                if ModuleType != TAB_COMMON and ModuleType not in SUP_MODULE_LIST:
+                if ModuleType != TAB_COMMON and ModuleType not in SUP_MODULE_SET:
                     EdkLogger.error('build', OPTION_UNKNOWN, "Unknown module type [%s]" % ModuleType,
                                     File=self.MetaFile, ExtraData=LibraryInstance, Line=LineNo)
                 LibraryClassDict[Arch, ModuleType, LibraryClass] = LibraryInstance
@@ -821,7 +821,7 @@ class DscBuildData(PlatformBuildClassObject):
             self._LibraryClasses = tdict(True)
             for LibraryClass in LibraryClassSet:
                 # try all possible module types
-                for ModuleType in SUP_MODULE_LIST:
+                for ModuleType in SUP_MODULE_SET:
                     LibraryInstance = LibraryClassDict[self._Arch, ModuleType, LibraryClass]
                     if LibraryInstance is None:
                         continue
@@ -873,7 +873,7 @@ class DscBuildData(PlatformBuildClassObject):
                             File=self.MetaFile, Line=LineNo)
         ValueList, IsValid, Index = AnalyzeDscPcd(Setting, PcdType, self._DecPcds[PcdCName, TokenSpaceGuid].DatumType)
         if not IsValid:
-            if PcdType not in [MODEL_PCD_FEATURE_FLAG, MODEL_PCD_FIXED_AT_BUILD]:
+            if PcdType not in {MODEL_PCD_FEATURE_FLAG, MODEL_PCD_FIXED_AT_BUILD}:
                 EdkLogger.error('build', FORMAT_INVALID, "Pcd format incorrect.", File=self.MetaFile, Line=LineNo,
                                 ExtraData="%s.%s|%s" % (TokenSpaceGuid, PcdCName, Setting))
             else:
@@ -907,7 +907,7 @@ class DscBuildData(PlatformBuildClassObject):
             if not Valid:
                 EdkLogger.error('build', FORMAT_INVALID, ErrStr, File=self.MetaFile, Line=LineNo,
                                 ExtraData="%s.%s" % (TokenSpaceGuid, PcdCName))
-            if PcdType in (MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT):
+            if PcdType in {MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT}:
                 if self._DecPcds[PcdCName, TokenSpaceGuid].DatumType.strip() != ValueList[1].strip():
                     EdkLogger.error('build', FORMAT_INVALID, "Pcd datumtype used in DSC file is not the same as its declaration in DEC file." , File=self.MetaFile, Line=LineNo,
                                 ExtraData="%s.%s|%s" % (TokenSpaceGuid, PcdCName, Setting))
@@ -933,7 +933,7 @@ class DscBuildData(PlatformBuildClassObject):
                     Pcds[pcdname].SkuOverrideValues = {skuid:pcd.SkuOverrideValues[skuid] for skuid in pcd.SkuOverrideValues if skuid in available_sku}
         return Pcds
     def CompleteHiiPcdsDefaultStores(self,Pcds):
-        HiiPcd = [Pcds[pcd] for pcd in Pcds if Pcds[pcd].Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]]
+        HiiPcd = [Pcds[pcd] for pcd in Pcds if Pcds[pcd].Type in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]}]
         DefaultStoreMgr = DefaultStore(self.DefaultStores)
         for pcd in HiiPcd:
             for skuid in pcd.SkuInfoList:
@@ -946,10 +946,10 @@ class DscBuildData(PlatformBuildClassObject):
 
     def RecoverCommandLinePcd(self):
         def UpdateCommandLineValue(pcd):
-            if pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
-                                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
+            if pcd.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
+                                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]}:
                 pcd.PcdValueFromComm = pcd.DefaultValue
-            elif pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
+            elif pcd.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]}:
                 pcd.PcdValueFromComm = pcd.SkuInfoList.get(TAB_DEFAULT).HiiDefaultValue
             else:
                 pcd.PcdValueFromComm = pcd.SkuInfoList.get(TAB_DEFAULT).DefaultValue
@@ -1083,9 +1083,9 @@ class DscBuildData(PlatformBuildClassObject):
                 EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s",  %s' %
                                 (TokenSpaceGuidCName, TokenCName, PcdValue, Value))
         else:
-            if PcdValue.upper() == 'FALSE':
+            if PcdValue.upper() == TAB_FALSE_1:
                 PcdValue = str(0)
-            if PcdValue.upper() == 'TRUE':
+            if PcdValue.upper() == TAB_TRUE_1:
                 PcdValue = str(1)
             if not FieldName:
                 if PcdDatumType not in TAB_PCD_NUMERIC_TYPES:
@@ -1142,7 +1142,7 @@ class DscBuildData(PlatformBuildClassObject):
             #
             # Retrieve build option for EDKII and EDK style module
             #
-            for CodeBase in (EDKII_NAME, EDK_NAME):
+            for CodeBase in {EDKII_NAME, EDK_NAME}:
                 RecordList = self._RawData[MODEL_META_DATA_BUILD_OPTION, self._Arch, CodeBase]
                 for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4,Dummy5 in RecordList:
                     if Dummy3.upper() != TAB_COMMON:
@@ -1237,7 +1237,7 @@ class DscBuildData(PlatformBuildClassObject):
                             SkuInfo.HiiDefaultValue = NoFiledValues[(Pcd.TokenSpaceGuidCName,Pcd.TokenCName)][0]
                             for defaultstore in SkuInfo.DefaultStoreDict:
                                 SkuInfo.DefaultStoreDict[defaultstore] = NoFiledValues[(Pcd.TokenSpaceGuidCName,Pcd.TokenCName)][0]
-                    if Pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII]]:
+                    if Pcd.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII]}:
                         if Pcd.DatumType == TAB_VOID:
                             if not Pcd.MaxDatumSize:
                                 Pcd.MaxDatumSize = '0'
@@ -1249,9 +1249,9 @@ class DscBuildData(PlatformBuildClassObject):
                 PcdInDec = self.DecPcds.get((Name,Guid))
                 if PcdInDec:
                     PcdInDec.PcdValueFromComm = NoFiledValues[(Guid,Name)][0]
-                    if PcdInDec.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
+                    if PcdInDec.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
                                         self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE],
-                                        self._PCD_TYPE_STRING_[MODEL_PCD_FEATURE_FLAG]]:
+                                        self._PCD_TYPE_STRING_[MODEL_PCD_FEATURE_FLAG]}:
                         self.Pcds[Name, Guid] = copy.deepcopy(PcdInDec)
                         self.Pcds[Name, Guid].DefaultValue = NoFiledValues[( Guid,Name)][0]
         return AllPcds
@@ -1302,7 +1302,7 @@ class DscBuildData(PlatformBuildClassObject):
                 str_pcd_obj_str.copy(str_pcd_dec)
                 if str_pcd_obj:
                     str_pcd_obj_str.copy(str_pcd_obj)
-                    if str_pcd_obj.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
+                    if str_pcd_obj.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]}:
                         str_pcd_obj_str.DefaultFromDSC = {skuname:{defaultstore: str_pcd_obj.SkuInfoList[skuname].DefaultStoreDict.get(defaultstore, str_pcd_obj.SkuInfoList[skuname].HiiDefaultValue) for defaultstore in DefaultStores} for skuname in str_pcd_obj.SkuInfoList}
                     else:
                         str_pcd_obj_str.DefaultFromDSC = {skuname:{defaultstore: str_pcd_obj.SkuInfoList[skuname].DefaultStoreDict.get(defaultstore, str_pcd_obj.SkuInfoList[skuname].DefaultValue) for defaultstore in DefaultStores} for skuname in str_pcd_obj.SkuInfoList}
@@ -1323,7 +1323,7 @@ class DscBuildData(PlatformBuildClassObject):
                     str_pcd_obj = Pcds.get(Pcd, None)
                     if str_pcd_obj:
                         str_pcd_obj_str.copy(str_pcd_obj)
-                        if str_pcd_obj.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
+                        if str_pcd_obj.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]}:
                             str_pcd_obj_str.DefaultFromDSC = {skuname:{defaultstore: str_pcd_obj.SkuInfoList[skuname].DefaultStoreDict.get(defaultstore, str_pcd_obj.SkuInfoList[skuname].HiiDefaultValue) for defaultstore in DefaultStores} for skuname in str_pcd_obj.SkuInfoList}
                         else:
                             str_pcd_obj_str.DefaultFromDSC = {skuname:{defaultstore: str_pcd_obj.SkuInfoList[skuname].DefaultStoreDict.get(defaultstore, str_pcd_obj.SkuInfoList[skuname].DefaultValue) for defaultstore in DefaultStores} for skuname in str_pcd_obj.SkuInfoList}
@@ -1345,7 +1345,7 @@ class DscBuildData(PlatformBuildClassObject):
                     stru_pcd.SkuOverrideValues[skuid] = copy.deepcopy(stru_pcd.SkuOverrideValues[nextskuid]) if not NoDefault else copy.deepcopy({defaultstorename: stru_pcd.DefaultValues for defaultstorename in DefaultStores} if DefaultStores else {TAB_DEFAULT_STORES_DEFAULT:stru_pcd.DefaultValues})
                     if not NoDefault:
                         stru_pcd.ValueChain.add(skuid,'')
-            if stru_pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
+            if stru_pcd.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]}:
                 for skuid in SkuIds:
                     nextskuid = skuid
                     NoDefault = False
@@ -1372,16 +1372,16 @@ class DscBuildData(PlatformBuildClassObject):
                 if str_pcd_obj is None:
                     print PcdName, PcdGuid
                     raise
-                if str_pcd_obj.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],
-                                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
+                if str_pcd_obj.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],
+                                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]}:
                     if skuname not in str_pcd_obj.SkuInfoList:
                         str_pcd_obj.SkuInfoList[skuname] = SkuInfoClass(SkuIdName=skuname, SkuId=self.SkuIds[skuname][0], HiiDefaultValue=PcdValue, DefaultStore = {StoreName:PcdValue})
                     else:
                         str_pcd_obj.SkuInfoList[skuname].HiiDefaultValue = PcdValue
                         str_pcd_obj.SkuInfoList[skuname].DefaultStoreDict.update({StoreName:PcdValue})
-                elif str_pcd_obj.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
-                                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
-                    if skuname in (self.SkuIdMgr.SystemSkuId, TAB_DEFAULT, TAB_COMMON):
+                elif str_pcd_obj.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
+                                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]}:
+                    if skuname in {self.SkuIdMgr.SystemSkuId, TAB_DEFAULT, TAB_COMMON}:
                         str_pcd_obj.DefaultValue = PcdValue
                 else:
                     if skuname not in str_pcd_obj.SkuInfoList:
@@ -1398,8 +1398,8 @@ class DscBuildData(PlatformBuildClassObject):
                     else:
                         str_pcd_obj.SkuInfoList[skuname].DefaultValue = PcdValue
             for str_pcd_obj in S_pcd_set.values():
-                if str_pcd_obj.Type not in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],
-                                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
+                if str_pcd_obj.Type not in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],
+                                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]}:
                     continue
                 PcdDefaultStoreSet = set(defaultstorename for skuobj in str_pcd_obj.SkuInfoList.values() for defaultstorename in skuobj.DefaultStoreDict)
                 DefaultStoreObj = DefaultStore(self._GetDefaultStores())
@@ -1447,7 +1447,7 @@ class DscBuildData(PlatformBuildClassObject):
             if SkuName not in AvailableSkuIdSet:
                 EdkLogger.error('build ', PARAMETER_INVALID, 'Sku %s is not defined in [SkuIds] section' % SkuName,
                                             File=self.MetaFile, Line=Dummy5)
-            if SkuName in (self.SkuIdMgr.SystemSkuId, TAB_DEFAULT, TAB_COMMON):
+            if SkuName in {self.SkuIdMgr.SystemSkuId, TAB_DEFAULT, TAB_COMMON}:
                 if "." not in TokenSpaceGuid:
                     PcdSet.add((PcdCName, TokenSpaceGuid, SkuName, Dummy5))
                 PcdDict[Arch, PcdCName, TokenSpaceGuid, SkuName] = Setting
@@ -1491,7 +1491,7 @@ class DscBuildData(PlatformBuildClassObject):
 
     def GetStructurePcdMaxSize(self, str_pcd):
         pcd_default_value = str_pcd.DefaultValue
-        sku_values = [skuobj.HiiDefaultValue if str_pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]] else skuobj.DefaultValue for skuobj in str_pcd.SkuInfoList.values()]
+        sku_values = [skuobj.HiiDefaultValue if str_pcd.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]} else skuobj.DefaultValue for skuobj in str_pcd.SkuInfoList.values()]
         sku_values.append(pcd_default_value)
 
         def get_length(value):
@@ -1891,8 +1891,8 @@ class DscBuildData(PlatformBuildClassObject):
             # Assign field values in PCD
             #
             CApp = CApp + DscBuildData.GenerateDefaultValueAssignStatement(Pcd)
-            if Pcd.Type not in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
-                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
+            if Pcd.Type not in {self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
+                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]}:
                 for skuname in self.SkuIdMgr.GetSkuChain(SkuName):
                     storeset = [DefaultStoreName] if DefaultStoreName == TAB_DEFAULT_STORES_DEFAULT else [TAB_DEFAULT_STORES_DEFAULT, DefaultStoreName]
                     for defaultstorenameitem in storeset:
@@ -1940,8 +1940,8 @@ class DscBuildData(PlatformBuildClassObject):
             CApp = CApp + self.GenerateSizeFunction(Pcd)
             CApp = CApp + self.GenerateDefaultValueAssignFunction(Pcd)
             CApp = CApp + self.GenerateCommandLineValue(Pcd)
-            if not Pcd.SkuOverrideValues or Pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
-                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
+            if not Pcd.SkuOverrideValues or Pcd.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
+                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]}:
                 CApp = CApp + self.GenerateInitValueFunction(Pcd,self.SkuIdMgr.SystemSkuId, TAB_DEFAULT_STORES_DEFAULT)
             else:
                 for SkuName in self.SkuIdMgr.SkuOverrideOrder():
@@ -1949,8 +1949,8 @@ class DscBuildData(PlatformBuildClassObject):
                         continue
                     for DefaultStoreName in Pcd.SkuOverrideValues[SkuName]:
                         CApp = CApp + self.GenerateInitValueFunction(Pcd,SkuName,DefaultStoreName)
-            if not Pcd.SkuOverrideValues or Pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
-                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
+            if not Pcd.SkuOverrideValues or Pcd.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
+                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]}:
                 InitByteValue, CApp = self.GenerateInitializeFunc(self.SkuIdMgr.SystemSkuId, TAB_DEFAULT_STORES_DEFAULT, Pcd, InitByteValue, CApp)
             else:
                 for SkuName in self.SkuIdMgr.SkuOverrideOrder():
@@ -1966,7 +1966,7 @@ class DscBuildData(PlatformBuildClassObject):
         CApp = CApp + '  )\n'
         CApp = CApp + '{\n'
         for Pcd in StructuredPcds.values():
-            if not Pcd.SkuOverrideValues or Pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
+            if not Pcd.SkuOverrideValues or Pcd.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]}:
                 CApp = CApp + '  Initialize_%s_%s_%s_%s();\n' % (self.SkuIdMgr.SystemSkuId, TAB_DEFAULT_STORES_DEFAULT, Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
             else:
                 for SkuName in self.SkuIdMgr.SkuOverrideOrder():
@@ -2288,13 +2288,13 @@ class DscBuildData(PlatformBuildClassObject):
     def CopyDscRawValue(self,Pcd):
         if Pcd.DscRawValue is None:
             Pcd.DscRawValue = dict()
-        if Pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD], self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
+        if Pcd.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD], self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]}:
             if self.SkuIdMgr.SystemSkuId not in Pcd.DscRawValue:
                 Pcd.DscRawValue[self.SkuIdMgr.SystemSkuId] = {}
             Pcd.DscRawValue[self.SkuIdMgr.SystemSkuId][TAB_DEFAULT_STORES_DEFAULT] = Pcd.DefaultValue
         for skuname in Pcd.SkuInfoList:
             Pcd.DscRawValue[skuname] = {}
-            if Pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
+            if Pcd.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]}:
                 for defaultstore in Pcd.SkuInfoList[skuname].DefaultStoreDict:
                     Pcd.DscRawValue[skuname][defaultstore] = Pcd.SkuInfoList[skuname].DefaultStoreDict[defaultstore]
             else:
@@ -2307,16 +2307,16 @@ class DscBuildData(PlatformBuildClassObject):
         for PcdCName, TokenSpaceGuid in PcdSet:
             PcdObj = PcdSet[(PcdCName, TokenSpaceGuid)]
             self.CopyDscRawValue(PcdObj)
-            if PcdObj.Type not in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_DEFAULT],
+            if PcdObj.Type not in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_DEFAULT],
                         self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],
                         self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_VPD],
                         self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_DEFAULT],
                         self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII],
-                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_VPD]]:
+                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_VPD]}:
                 Pcds[PcdCName, TokenSpaceGuid]= PcdObj
                 continue
             PcdType = PcdObj.Type
-            if PcdType in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
+            if PcdType in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]}:
                 for skuid in PcdObj.SkuInfoList:
                     skuobj = PcdObj.SkuInfoList[skuid]
                     mindefaultstorename = DefaultStoreObj.GetMin(set(defaultstorename for defaultstorename in skuobj.DefaultStoreDict))
@@ -2332,7 +2332,7 @@ class DscBuildData(PlatformBuildClassObject):
                     PcdObj.SkuInfoList[skuname] = copy.deepcopy(PcdObj.SkuInfoList[nextskuid])
                     PcdObj.SkuInfoList[skuname].SkuId = skuid
                     PcdObj.SkuInfoList[skuname].SkuIdName = skuname
-            if PcdType in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
+            if PcdType in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]}:
                 PcdObj.DefaultValue = PcdObj.SkuInfoList.values()[0].HiiDefaultValue if self.SkuIdMgr.SkuUsageType == self.SkuIdMgr.SINGLE else PcdObj.SkuInfoList[TAB_DEFAULT].HiiDefaultValue
             Pcds[PcdCName, TokenSpaceGuid]= PcdObj
         return Pcds
diff --git a/BaseTools/Source/Python/Workspace/InfBuildData.py b/BaseTools/Source/Python/Workspace/InfBuildData.py
index 3d9391039f4f..ef91df6e612e 100644
--- a/BaseTools/Source/Python/Workspace/InfBuildData.py
+++ b/BaseTools/Source/Python/Workspace/InfBuildData.py
@@ -230,8 +230,8 @@ class InfBuildData(ModuleBuildClassObject):
                 self._Defs[Name] = Value
                 self._Macros[Name] = Value
             # some special items in [Defines] section need special treatment
-            elif Name in ('EFI_SPECIFICATION_VERSION', 'UEFI_SPECIFICATION_VERSION', 'EDK_RELEASE_VERSION', 'PI_SPECIFICATION_VERSION'):
-                if Name in ('EFI_SPECIFICATION_VERSION', 'UEFI_SPECIFICATION_VERSION'):
+            elif Name in {'EFI_SPECIFICATION_VERSION', 'UEFI_SPECIFICATION_VERSION', 'EDK_RELEASE_VERSION', 'PI_SPECIFICATION_VERSION'}:
+                if Name in {'EFI_SPECIFICATION_VERSION', 'UEFI_SPECIFICATION_VERSION'}:
                     Name = 'UEFI_SPECIFICATION_VERSION'
                 if self._Specification is None:
                     self._Specification = OrderedDict()
@@ -248,8 +248,8 @@ class InfBuildData(ModuleBuildClassObject):
                 if len(ValueList) > 1:
                     SupModuleList = GetSplitValueList(ValueList[1], ' ')
                 else:
-                    SupModuleList = SUP_MODULE_LIST
-                self._LibraryClass.append(LibraryClassObject(LibraryClass, SupModuleList))
+                    SupModuleList = SUP_MODULE_SET
+                self._LibraryClass.append(LibraryClassObject(LibraryClass, list(SupModuleList)))
             elif Name == 'ENTRY_POINT':
                 if self._ModuleEntryPointList is None:
                     self._ModuleEntryPointList = []
@@ -280,7 +280,7 @@ class InfBuildData(ModuleBuildClassObject):
                     self._CustomMakefile['MSFT'] = TokenList[0]
                     self._CustomMakefile['GCC'] = TokenList[0]
                 else:
-                    if TokenList[0] not in ['MSFT', 'GCC']:
+                    if TokenList[0] not in {'MSFT', 'GCC'}:
                         EdkLogger.error("build", FORMAT_NOT_SUPPORTED,
                                         "No supported family [%s]" % TokenList[0],
                                         File=self.MetaFile, Line=Record[-1])
@@ -296,7 +296,7 @@ class InfBuildData(ModuleBuildClassObject):
             if not self._ModuleType:
                 EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE,
                                 "MODULE_TYPE is not given", File=self.MetaFile)
-            if self._ModuleType not in SUP_MODULE_LIST:
+            if self._ModuleType not in SUP_MODULE_SET:
                 RecordList = self._RawData[MODEL_META_DATA_HEADER, self._Arch, self._Platform]
                 for Record in RecordList:
                     Name = Record[1]
@@ -304,7 +304,7 @@ class InfBuildData(ModuleBuildClassObject):
                         LineNo = Record[6]
                         break
                 EdkLogger.error("build", FORMAT_NOT_SUPPORTED,
-                                "MODULE_TYPE %s is not supported for EDK II, valid values are:\n %s" % (self._ModuleType, ' '.join(l for l in SUP_MODULE_LIST)),
+                                "MODULE_TYPE %s is not supported for EDK II, valid values are:\n %s" % (self._ModuleType, ' '.join(l for l in SUP_MODULE_SET)),
                                 File=self.MetaFile, Line=LineNo)
             if (self._Specification is None) or (not 'PI_SPECIFICATION_VERSION' in self._Specification) or (int(self._Specification['PI_SPECIFICATION_VERSION'], 16) < 0x0001000A):
                 if self._ModuleType == SUP_MODULE_SMM_CORE:
@@ -318,11 +318,11 @@ class InfBuildData(ModuleBuildClassObject):
                and 'PCI_CLASS_CODE' in self._Defs and 'PCI_REVISION' in self._Defs:
                 self._BuildType = 'UEFI_OPTIONROM'
                 if 'PCI_COMPRESS' in self._Defs:
-                    if self._Defs['PCI_COMPRESS'] not in ('TRUE', 'FALSE'):
+                    if self._Defs['PCI_COMPRESS'] not in TAB_TRUE_FALSE_SET:
                         EdkLogger.error("build", FORMAT_INVALID, "Expected TRUE/FALSE for PCI_COMPRESS: %s" % self.MetaFile)
 
             elif 'UEFI_HII_RESOURCE_SECTION' in self._Defs \
-               and self._Defs['UEFI_HII_RESOURCE_SECTION'] == 'TRUE':
+               and self._Defs['UEFI_HII_RESOURCE_SECTION'] == TAB_TRUE_1:
                 self._BuildType = 'UEFI_HII'
             else:
                 self._BuildType = self._ModuleType.upper()
@@ -345,7 +345,7 @@ class InfBuildData(ModuleBuildClassObject):
             if self._ComponentType in COMPONENT_TO_MODULE_MAP_DICT:
                 self._ModuleType = COMPONENT_TO_MODULE_MAP_DICT[self._ComponentType]
             if self._ComponentType == EDK_COMPONENT_TYPE_LIBRARY:
-                self._LibraryClass = [LibraryClassObject(self._BaseName, SUP_MODULE_LIST)]
+                self._LibraryClass = [LibraryClassObject(self._BaseName, list(SUP_MODULE_SET))]
             # make use some [nmake] section macros
             Macros = self._Macros
             Macros["EDK_SOURCE"] = GlobalData.gEcpSource
@@ -442,7 +442,7 @@ class InfBuildData(ModuleBuildClassObject):
                 self._GetHeaderInfo()
             if self._ModuleType is None:
                 self._ModuleType = SUP_MODULE_BASE
-            if self._ModuleType not in SUP_MODULE_LIST:
+            if self._ModuleType not in SUP_MODULE_SET:
                 self._ModuleType = SUP_MODULE_USER_DEFINED
         return self._ModuleType
 
@@ -496,7 +496,7 @@ class InfBuildData(ModuleBuildClassObject):
         if self._Shadow is None:
             if self._Header_ is None:
                 self._GetHeaderInfo()
-            if self._Shadow is not None and self._Shadow.upper() == 'TRUE':
+            if self._Shadow is not None and self._Shadow.upper() == TAB_TRUE_1:
                 self._Shadow = True
             else:
                 self._Shadow = False
@@ -886,7 +886,7 @@ class InfBuildData(ModuleBuildClassObject):
 
             if len(RecordList) != 0 and self.ModuleType == SUP_MODULE_USER_DEFINED:
                 for Record in RecordList:
-                    if Record[4] not in [SUP_MODULE_PEIM, SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_SMM_DRIVER]:
+                    if Record[4] not in {SUP_MODULE_PEIM, SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_SMM_DRIVER}:
                         EdkLogger.error('build', FORMAT_INVALID,
                                         "'%s' module must specify the type of [Depex] section" % self.ModuleType,
                                         File=self.MetaFile)
diff --git a/BaseTools/Source/Python/Workspace/MetaFileCommentParser.py b/BaseTools/Source/Python/Workspace/MetaFileCommentParser.py
index df1e90faf5a0..8650a51933d6 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileCommentParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileCommentParser.py
@@ -33,9 +33,9 @@ ErrorMsgMap = {
 }
 
 def CheckInfComment(SectionType, Comments, InfFile, LineNo, ValueList):
-    if SectionType in [MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_DYNAMIC_EX, MODEL_PCD_DYNAMIC]:
+    if SectionType in {MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_DYNAMIC_EX, MODEL_PCD_DYNAMIC}:
         CheckUsage(Comments, UsageList, InfFile, LineNo, ValueList[0]+'.'+ValueList[1], ErrorMsgMap[MODEL_PCD_DYNAMIC])
-    elif SectionType in [MODEL_EFI_GUID, MODEL_EFI_PPI]:
+    elif SectionType in {MODEL_EFI_GUID, MODEL_EFI_PPI}:
         CheckUsage(Comments, UsageList, InfFile, LineNo, ValueList[0], ErrorMsgMap[SectionType])
     elif SectionType == MODEL_EFI_PROTOCOL:
         CheckUsage(Comments, UsageList + ("TO_START", "BY_START"), InfFile, LineNo, ValueList[0], ErrorMsgMap[SectionType])
diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTools/Source/Python/Workspace/MetaFileParser.py
index 2c116ddbcb71..50a74bc415ef 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py
@@ -584,7 +584,7 @@ class InfParser(MetaFileParser):
                 self._SectionHeaderParser()
                 # Check invalid sections
                 if self._Version < 0x00010005:
-                    if self._SectionType in [MODEL_META_DATA_BUILD_OPTION,
+                    if self._SectionType in {MODEL_META_DATA_BUILD_OPTION,
                                              MODEL_EFI_LIBRARY_CLASS,
                                              MODEL_META_DATA_PACKAGE,
                                              MODEL_PCD_FIXED_AT_BUILD,
@@ -595,13 +595,13 @@ class InfParser(MetaFileParser):
                                              MODEL_EFI_GUID,
                                              MODEL_EFI_PROTOCOL,
                                              MODEL_EFI_PPI,
-                                             MODEL_META_DATA_USER_EXTENSION]:
+                                             MODEL_META_DATA_USER_EXTENSION}:
                         EdkLogger.error('Parser', FORMAT_INVALID,
                                         "Section [%s] is not allowed in inf file without version" % (self._SectionName),
                                         ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
-                elif self._SectionType in [MODEL_EFI_INCLUDE,
+                elif self._SectionType in {MODEL_EFI_INCLUDE,
                                            MODEL_EFI_LIBRARY_INSTANCE,
-                                           MODEL_META_DATA_NMAKE]:
+                                           MODEL_META_DATA_NMAKE}:
                     EdkLogger.error('Parser', FORMAT_INVALID,
                                     "Section [%s] is not allowed in inf file with version 0x%08x" % (self._SectionName, self._Version),
                                     ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
@@ -764,9 +764,9 @@ class InfParser(MetaFileParser):
         # if value are 'True', 'true', 'TRUE' or 'False', 'false', 'FALSE', replace with integer 1 or 0.
         if self._ValueList[2] != '':
             InfPcdValueList = GetSplitValueList(TokenList[1], TAB_VALUE_SPLIT, 1)
-            if InfPcdValueList[0] in ['True', 'true', 'TRUE']:
+            if InfPcdValueList[0] in TAB_TRUE_SET:
                 self._ValueList[2] = TokenList[1].replace(InfPcdValueList[0], '1', 1);
-            elif InfPcdValueList[0] in ['False', 'false', 'FALSE']:
+            elif InfPcdValueList[0] in TAB_FALSE_SET:
                 self._ValueList[2] = TokenList[1].replace(InfPcdValueList[0], '0', 1);
         if (self._ValueList[0], self._ValueList[1]) not in self.PcdsDict:
             self.PcdsDict[self._ValueList[0], self._ValueList[1]] = self._SectionType
@@ -1017,13 +1017,13 @@ class DscParser(MetaFileParser):
             EdkLogger.error("Parser", FORMAT_INVALID, "Unknown directive [%s]" % DirectiveName,
                             File=self.MetaFile, Line=self._LineIndex + 1)
 
-        if DirectiveName in ['!IF', '!IFDEF', '!IFNDEF']:
+        if DirectiveName in {'!IF', '!IFDEF', '!IFNDEF'}:
             self._InDirective += 1
 
-        if DirectiveName in ['!ENDIF']:
+        if DirectiveName == '!ENDIF':
             self._InDirective -= 1
 
-        if DirectiveName in ['!IF', '!IFDEF', '!INCLUDE', '!IFNDEF', '!ELSEIF'] and self._ValueList[1] == '':
+        if DirectiveName in {'!IF', '!IFDEF', '!INCLUDE', '!IFNDEF', '!ELSEIF'} and self._ValueList[1] == '':
             EdkLogger.error("Parser", FORMAT_INVALID, "Missing expression",
                             File=self.MetaFile, Line=self._LineIndex + 1,
                             ExtraData=self._CurrentLine)
@@ -1037,9 +1037,9 @@ class DscParser(MetaFileParser):
             while self._DirectiveStack:
                 # Remove any !else or !elseif
                 DirectiveInfo = self._DirectiveStack.pop()
-                if DirectiveInfo[0] in [MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
+                if DirectiveInfo[0] in {MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
                                         MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF,
-                                        MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF]:
+                                        MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF}:
                     break
             else:
                 EdkLogger.error("Parser", FORMAT_INVALID, "Redundant '!endif'",
@@ -1104,7 +1104,7 @@ class DscParser(MetaFileParser):
     @ParseMacro
     def _SkuIdParser(self):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
-        if len(TokenList) not in (2,3):
+        if len(TokenList) not in {2,3}:
             EdkLogger.error('Parser', FORMAT_INVALID, "Correct format is '<Number>|<UiName>[|<UiName>]'",
                             ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
         self._ValueList[0:len(TokenList)] = TokenList
@@ -1156,7 +1156,7 @@ class DscParser(MetaFileParser):
             #
             # The PCD values are optional for FIXEDATBUILD, PATCHABLEINMODULE, Dynamic/DynamicEx default
             #
-            if self._SectionType in (MODEL_PCD_FIXED_AT_BUILD, MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT):
+            if self._SectionType in {MODEL_PCD_FIXED_AT_BUILD, MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT}:
                 return
             EdkLogger.error('Parser', FORMAT_INVALID, "No PCD value given",
                             ExtraData=self._CurrentLine + " (<TokenSpaceGuidCName>.<TokenCName>|<PcdValue>)",
@@ -1164,13 +1164,13 @@ class DscParser(MetaFileParser):
 
         # Validate the datum type of Dynamic Defaul PCD and DynamicEx Default PCD
         ValueList = GetSplitValueList(self._ValueList[2])
-        if len(ValueList) > 1 and ValueList[1] in [TAB_UINT8 , TAB_UINT16, TAB_UINT32 , TAB_UINT64] \
-                              and self._ItemType in [MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT]:
+        if len(ValueList) > 1 and ValueList[1] in {TAB_UINT8 , TAB_UINT16, TAB_UINT32 , TAB_UINT64} \
+                              and self._ItemType in {MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT}:
             EdkLogger.error('Parser', FORMAT_INVALID, "The datum type '%s' of PCD is wrong" % ValueList[1],
                             ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
 
         # Validate the VariableName of DynamicHii and DynamicExHii for PCD Entry must not be an empty string
-        if self._ItemType in [MODEL_PCD_DYNAMIC_HII, MODEL_PCD_DYNAMIC_EX_HII]:
+        if self._ItemType in {MODEL_PCD_DYNAMIC_HII, MODEL_PCD_DYNAMIC_EX_HII}:
             DscPcdValueList = GetSplitValueList(TokenList[1], TAB_VALUE_SPLIT, 1)
             if len(DscPcdValueList[0].replace('L','').replace('"','').strip()) == 0:
                 EdkLogger.error('Parser', FORMAT_INVALID, "The VariableName field in the HII format PCD entry must not be an empty string",
@@ -1178,9 +1178,9 @@ class DscParser(MetaFileParser):
 
         # if value are 'True', 'true', 'TRUE' or 'False', 'false', 'FALSE', replace with integer 1 or 0.
         DscPcdValueList = GetSplitValueList(TokenList[1], TAB_VALUE_SPLIT, 1)
-        if DscPcdValueList[0] in ['True', 'true', 'TRUE']:
+        if DscPcdValueList[0] in TAB_TRUE_SET:
             self._ValueList[2] = TokenList[1].replace(DscPcdValueList[0], '1', 1);
-        elif DscPcdValueList[0] in ['False', 'false', 'FALSE']:
+        elif DscPcdValueList[0] in TAB_FALSE_SET:
             self._ValueList[2] = TokenList[1].replace(DscPcdValueList[0], '0', 1);
 
 
@@ -1248,7 +1248,7 @@ class DscParser(MetaFileParser):
         Macros.update(GlobalData.gPlatformDefines)
         Macros.update(GlobalData.gCommandLineDefines)
         # PCD cannot be referenced in macro definition
-        if self._ItemType not in [MODEL_META_DATA_DEFINE, MODEL_META_DATA_GLOBAL_DEFINE]:
+        if self._ItemType not in {MODEL_META_DATA_DEFINE, MODEL_META_DATA_GLOBAL_DEFINE}:
             Macros.update(self._Symbols)
         if GlobalData.BuildOptionPcd:
             for Item in GlobalData.BuildOptionPcd:
@@ -1412,9 +1412,9 @@ class DscParser(MetaFileParser):
     def __RetrievePcdValue(self):
         Content = open(str(self.MetaFile), 'r').readlines()
         GlobalData.gPlatformOtherPcds['DSCFILE'] = str(self.MetaFile)
-        for PcdType in (MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_HII,
+        for PcdType in [MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_HII,
                         MODEL_PCD_DYNAMIC_VPD, MODEL_PCD_DYNAMIC_EX_DEFAULT, MODEL_PCD_DYNAMIC_EX_HII,
-                        MODEL_PCD_DYNAMIC_EX_VPD):
+                        MODEL_PCD_DYNAMIC_EX_VPD]:
             Records = self._RawTable.Query(PcdType, BelongsToItem= -1.0)
             for TokenSpaceGuid, PcdName, Value, Dummy2, Dummy3, Dummy4,ID, Line in Records:
                 Name = TokenSpaceGuid + '.' + PcdName
@@ -1455,8 +1455,8 @@ class DscParser(MetaFileParser):
 
     def __ProcessDirective(self):
         Result = None
-        if self._ItemType in [MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
-                              MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSEIF]:
+        if self._ItemType in {MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
+                              MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSEIF}:
             Macros = self._Macros
             Macros.update(GlobalData.gGlobalDefines)
             try:
@@ -1474,9 +1474,9 @@ class DscParser(MetaFileParser):
                                 Line=self._LineIndex + 1)
                 Result = Excpt.result
 
-        if self._ItemType in [MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
+        if self._ItemType in {MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
                               MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF,
-                              MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF]:
+                              MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF}:
             self._DirectiveStack.append(self._ItemType)
             if self._ItemType == MODEL_META_DATA_CONDITIONAL_STATEMENT_IF:
                 Result = bool(Result)
@@ -1500,9 +1500,9 @@ class DscParser(MetaFileParser):
             while self._DirectiveStack:
                 self._DirectiveEvalStack.pop()
                 Directive = self._DirectiveStack.pop()
-                if Directive in [MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
+                if Directive in {MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
                                  MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF,
-                                 MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF]:
+                                 MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF}:
                     break
         elif self._ItemType == MODEL_META_DATA_INCLUDE:
             # The included file must be relative to workspace or same directory as DSC file
@@ -1600,7 +1600,7 @@ class DscParser(MetaFileParser):
         self._ValueList[1] = ReplaceMacro(self._ValueList[1], self._Macros, RaiseError=True)
 
     def __ProcessPcd(self):
-        if self._ItemType not in [MODEL_PCD_FEATURE_FLAG, MODEL_PCD_FIXED_AT_BUILD]:
+        if self._ItemType not in {MODEL_PCD_FEATURE_FLAG, MODEL_PCD_FIXED_AT_BUILD}:
             self._ValueList[2] = ReplaceMacro(self._ValueList[2], self._Macros, RaiseError=True)
             return
 
@@ -1617,9 +1617,9 @@ class DscParser(MetaFileParser):
             except:
                 pass
 
-        if ValList[Index] == 'True':
+        if ValList[Index] == TAB_TRUE_3:
             ValList[Index] = '1'
-        if ValList[Index] == 'False':
+        if ValList[Index] == TAB_FALSE_3:
             ValList[Index] = '0'
 
         if (not self._DirectiveEvalStack) or (False not in self._DirectiveEvalStack):
@@ -1852,7 +1852,7 @@ class DecParser(MetaFileParser):
             if len(ItemList) > 2:
                 S2 = ItemList[2].upper()
                 # only Includes, GUIDs, PPIs, Protocols section have Private tag
-                if self._SectionName in [TAB_INCLUDES.upper(), TAB_GUIDS.upper(), TAB_PROTOCOLS.upper(), TAB_PPIS.upper()]:
+                if self._SectionName in {TAB_INCLUDES.upper(), TAB_GUIDS.upper(), TAB_PROTOCOLS.upper(), TAB_PPIS.upper()}:
                     if S2 != 'PRIVATE':
                         EdkLogger.error("Parser", FORMAT_INVALID, 'Please use keyword "Private" as section tag modifier.',
                                         File=self.MetaFile, Line=self._LineIndex + 1, ExtraData=self._CurrentLine)
@@ -2030,9 +2030,9 @@ class DecParser(MetaFileParser):
                 self._ValueList[0] = self._CurrentStructurePcdName
                 self._ValueList[1] = ValueList[1].strip()
 
-            if ValueList[0] in ['True', 'true', 'TRUE']:
+            if ValueList[0] in TAB_TRUE_SET:
                 ValueList[0] = '1'
-            elif ValueList[0] in ['False', 'false', 'FALSE']:
+            elif ValueList[0] in TAB_FALSE_SET:
                 ValueList[0] = '0'
 
             # check for duplicate PCD definition
diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
index db9e1ed062fb..478dab3b61b0 100644
--- a/BaseTools/Source/Python/build/BuildReport.py
+++ b/BaseTools/Source/Python/build/BuildReport.py
@@ -124,7 +124,9 @@ gDriverTypeMap = {
   }
 
 ## The look up table of the supported opcode in the dependency expression binaries
-gOpCodeList = ["BEFORE", "AFTER", "PUSH", "AND", "OR", "NOT", "TRUE", "FALSE", "END", "SOR"]
+gOpCodeList = [DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER, DEPEX_OPCODE_PUSH,
+               DEPEX_OPCODE_AND, DEPEX_OPCODE_OR, DEPEX_OPCODE_NOT, DEPEX_OPCODE_TRUE,
+               DEPEX_OPCODE_FALSE, DEPEX_OPCODE_END, DEPEX_OPCODE_SOR]
 
 ##
 # Writes a string to the file object.
@@ -296,7 +298,7 @@ class DepexParser(object):
             OpCode = DepexFile.read(1)
             while OpCode:
                 Statement = gOpCodeList[struct.unpack("B", OpCode)[0]]
-                if Statement in ["BEFORE", "AFTER", "PUSH"]:
+                if Statement in {"BEFORE", "AFTER", "PUSH"}:
                     GuidValue = "%08X-%04X-%04X-%02X%02X-%02X%02X%02X%02X%02X%02X" % \
                                 struct.unpack(PACK_PATTERN_GUID, DepexFile.read(16))
                     GuidString = self._GuidDb.get(GuidValue, GuidValue)
@@ -409,7 +411,7 @@ class DepexReport(object):
         if not ModuleType:
             ModuleType = COMPONENT_TO_MODULE_MAP_DICT.get(M.ComponentType, "")
 
-        if ModuleType in [SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_DXE_CORE, SUP_MODULE_SMM_CORE, SUP_MODULE_MM_CORE_STANDALONE, SUP_MODULE_UEFI_APPLICATION]:
+        if ModuleType in {SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_DXE_CORE, SUP_MODULE_SMM_CORE, SUP_MODULE_MM_CORE_STANDALONE, SUP_MODULE_UEFI_APPLICATION}:
             return
       
         for Source in M.SourceFileList:
@@ -493,25 +495,25 @@ class BuildFlagsReport(object):
         #
         for Source in M.SourceFileList:
             Ext = os.path.splitext(Source.File)[1].lower()
-            if Ext in [".c", ".cc", ".cpp"]:
+            if Ext in {".c", ".cc", ".cpp"}:
                 BuildOptions["CC"] = 1
-            elif Ext in [".s", ".asm"]:
+            elif Ext in {".s", ".asm"}:
                 BuildOptions["PP"] = 1
                 BuildOptions["ASM"] = 1
-            elif Ext in [".vfr"]:
+            elif Ext == ".vfr":
                 BuildOptions["VFRPP"] = 1
                 BuildOptions["VFR"] = 1
-            elif Ext in [".dxs"]:
+            elif Ext == ".dxs":
                 BuildOptions["APP"] = 1
                 BuildOptions["CC"] = 1
-            elif Ext in [".asl"]:
+            elif Ext == ".asl":
                 BuildOptions["ASLPP"] = 1
                 BuildOptions["ASL"] = 1
-            elif Ext in [".aslc"]:
+            elif Ext == ".aslc":
                 BuildOptions["ASLCC"] = 1
                 BuildOptions["ASLDLINK"] = 1
                 BuildOptions["CC"] = 1
-            elif Ext in [".asm16"]:
+            elif Ext == ".asm16":
                 BuildOptions["ASMLINK"] = 1
             BuildOptions["SLINK"] = 1
             BuildOptions["DLINK"] = 1
@@ -1030,11 +1032,11 @@ class PcdReport(object):
                     IsStructure = False
                     if GlobalData.gStructurePcd and (self.Arch in GlobalData.gStructurePcd) and ((Pcd.TokenCName, Pcd.TokenSpaceGuidCName) in GlobalData.gStructurePcd[self.Arch]):
                         IsStructure = True
-                        if TypeName in ('DYNVPD', 'DEXVPD'):
+                        if TypeName in {'DYNVPD', 'DEXVPD'}:
                             SkuInfoList = Pcd.SkuInfoList
                         Pcd = GlobalData.gStructurePcd[self.Arch][(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)]
                         Pcd.DatumType = Pcd.StructName
-                        if TypeName in ('DYNVPD', 'DEXVPD'):
+                        if TypeName in {'DYNVPD', 'DEXVPD'}:
                             Pcd.SkuInfoList = SkuInfoList
                         if Pcd.PcdFieldValueFromComm:
                             BuildOptionMatch = True
@@ -1052,7 +1054,7 @@ class PcdReport(object):
                                 SkuList = sorted(Pcd.SkuInfoList.keys())
                                 for Sku in SkuList:
                                     SkuInfo = Pcd.SkuInfoList[Sku]
-                                    if TypeName in ('DYNHII', 'DEXHII'):
+                                    if TypeName in {'DYNHII', 'DEXHII'}:
                                         if SkuInfo.DefaultStoreDict:
                                             DefaultStoreList = sorted(SkuInfo.DefaultStoreDict.keys())
                                             for DefaultStore in DefaultStoreList:
@@ -1091,7 +1093,7 @@ class PcdReport(object):
                     if ModulePcdSet is None:
                         if IsStructure:
                             continue
-                        if not TypeName in ('PATCH', 'FLAG', 'FIXED'):
+                        if TypeName not in {'PATCH', 'FLAG', 'FIXED'}:
                             continue
                         if not BuildOptionMatch:
                             ModuleOverride = self.ModulePcdOverride.get((Pcd.TokenCName, Pcd.TokenSpaceGuidCName), {})
@@ -1186,7 +1188,7 @@ class PcdReport(object):
             for Sku in SkuList:
                 SkuInfo = Pcd.SkuInfoList[Sku]
                 SkuIdName = SkuInfo.SkuIdName
-                if TypeName in ('DYNHII', 'DEXHII'):
+                if TypeName in {'DYNHII', 'DEXHII'}:
                     if SkuInfo.DefaultStoreDict:
                         DefaultStoreList = sorted(SkuInfo.DefaultStoreDict.keys())
                         for DefaultStore in DefaultStoreList:
@@ -1271,7 +1273,7 @@ class PcdReport(object):
                                 FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, ' ' , TypeName, '(' + Pcd.DatumType + ')', Value))
                             else:
                                 FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, ' ' , TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', Value))
-                    if TypeName in ('DYNVPD', 'DEXVPD'):
+                    if TypeName in {'DYNVPD', 'DEXVPD'}:
                         FileWrite(File, '%*s' % (self.MaxLen + 4, SkuInfo.VpdOffset))
                     if IsStructure:
                         OverrideValues = Pcd.SkuOverrideValues[Sku]
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 1fb8c7985d99..66a97fc8c1cd 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -224,7 +224,7 @@ def NormFile(FilePath, Workspace):
         EdkLogger.error("build", FILE_NOT_FOUND, ExtraData="\t%s (Please give file in absolute path or relative to WORKSPACE)" % FileFullPath)
 
     # remove workspace directory from the beginning part of the file path
-    if Workspace[-1] in ["\\", "/"]:
+    if Workspace[-1] in {"\\", "/"}:
         return FileFullPath[len(Workspace):]
     else:
         return FileFullPath[(len(Workspace) + 1):]
@@ -410,7 +410,7 @@ class ModuleMakeUnit(BuildUnit):
     def __init__(self, Obj, Target):
         Dependency = [ModuleMakeUnit(La, Target) for La in Obj.LibraryAutoGenList]
         BuildUnit.__init__(self, Obj, Obj.BuildCommand, Target, Dependency, Obj.MakeFileDir)
-        if Target in [None, "", "all"]:
+        if Target in {None, "", "all"}:
             self.Target = "tbuild"
 
 ## The smallest platform unit that can be built by nmake/make command in multi-thread build mode
@@ -1228,7 +1228,7 @@ class Build():
             return False
 
         # skip file generation for cleanxxx targets, run and fds target
-        if Target not in ['clean', 'cleanlib', 'cleanall', 'run', 'fds']:
+        if Target not in {'clean', 'cleanlib', 'cleanall', 'run', 'fds'}:
             # for target which must generate AutoGen code and makefile
             if not self.SkipAutoGen or Target == 'genc':
                 self.Progress.Start("Generating code")
@@ -1347,7 +1347,7 @@ class Build():
             return False
 
         # skip file generation for cleanxxx targets, run and fds target
-        if Target not in ['clean', 'cleanlib', 'cleanall', 'run', 'fds']:
+        if Target not in {'clean', 'cleanlib', 'cleanall', 'run', 'fds'}:
             # for target which must generate AutoGen code and makefile
             if not self.SkipAutoGen or Target == 'genc':
                 self.Progress.Start("Generating code")
@@ -1488,7 +1488,7 @@ class Build():
             for SectionHeader in ModuleInfo.Image.SectionHeaderList:
                 if SectionHeader[0] == '.text':
                     TextSectionAddress = SectionHeader[1]
-                elif SectionHeader[0] in ['.data', '.sdata']:
+                elif SectionHeader[0] in {'.data', '.sdata'}:
                     DataSectionAddress = SectionHeader[1]
             if AddrIsOffset:
                 MapBuffer.write('(GUID=%s, .textbaseaddress=-0x%010X, .databaseaddress=-0x%010X)\n' % (ModuleInfo.Guid, 0 - (BaseAddress + TextSectionAddress), 0 - (BaseAddress + DataSectionAddress)))
@@ -1583,19 +1583,19 @@ class Build():
                     if not ImageClass.IsValid:
                         EdkLogger.error("build", FILE_PARSE_FAILURE, ExtraData=ImageClass.ErrorInfo)
                     ImageInfo = PeImageInfo(Module.Name, Module.Guid, Module.Arch, Module.OutputDir, Module.DebugDir, ImageClass)
-                    if Module.ModuleType in [SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM, EDK_COMPONENT_TYPE_COMBINED_PEIM_DRIVER, EDK_COMPONENT_TYPE_PIC_PEIM, EDK_COMPONENT_TYPE_RELOCATABLE_PEIM, SUP_MODULE_DXE_CORE]:
+                    if Module.ModuleType in {SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM, EDK_COMPONENT_TYPE_COMBINED_PEIM_DRIVER, EDK_COMPONENT_TYPE_PIC_PEIM, EDK_COMPONENT_TYPE_RELOCATABLE_PEIM, SUP_MODULE_DXE_CORE}:
                         PeiModuleList[Module.MetaFile] = ImageInfo
                         PeiSize += ImageInfo.Image.Size
-                    elif Module.ModuleType in [EDK_COMPONENT_TYPE_BS_DRIVER, SUP_MODULE_DXE_DRIVER, SUP_MODULE_UEFI_DRIVER]:
+                    elif Module.ModuleType in {EDK_COMPONENT_TYPE_BS_DRIVER, SUP_MODULE_DXE_DRIVER, SUP_MODULE_UEFI_DRIVER}:
                         BtModuleList[Module.MetaFile] = ImageInfo
                         BtSize += ImageInfo.Image.Size
-                    elif Module.ModuleType in [SUP_MODULE_DXE_RUNTIME_DRIVER, EDK_COMPONENT_TYPE_RT_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, EDK_COMPONENT_TYPE_SAL_RT_DRIVER]:
+                    elif Module.ModuleType in {SUP_MODULE_DXE_RUNTIME_DRIVER, EDK_COMPONENT_TYPE_RT_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, EDK_COMPONENT_TYPE_SAL_RT_DRIVER}:
                         RtModuleList[Module.MetaFile] = ImageInfo
                         #IPF runtime driver needs to be at 2 page alignment.
                         if IsIpfPlatform and ImageInfo.Image.Size % 0x2000 != 0:
                             ImageInfo.Image.Size = (ImageInfo.Image.Size / 0x2000 + 1) * 0x2000
                         RtSize += ImageInfo.Image.Size
-                    elif Module.ModuleType in [SUP_MODULE_SMM_CORE, SUP_MODULE_DXE_SMM_DRIVER, SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE]:
+                    elif Module.ModuleType in {SUP_MODULE_SMM_CORE, SUP_MODULE_DXE_SMM_DRIVER, SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE}:
                         SmmModuleList[Module.MetaFile] = ImageInfo
                         SmmSize += ImageInfo.Image.Size
                         if Module.ModuleType == SUP_MODULE_DXE_SMM_DRIVER:
@@ -1757,7 +1757,7 @@ class Build():
                     self._BuildPa(self.Target, Pa, FfsCommand=CmdListDict)
 
                 # Create MAP file when Load Fix Address is enabled.
-                if self.Target in ["", "all", "fds"]:
+                if self.Target in {"", "all", "fds"}:
                     for Arch in Wa.ArchList:
                         GlobalData.gGlobalDefines['ARCH'] = Arch
                         #
@@ -1855,7 +1855,7 @@ class Build():
                                 self.HashSkipModules.append(Ma)
                                 continue
                             # Not to auto-gen for targets 'clean', 'cleanlib', 'cleanall', 'run', 'fds'
-                            if self.Target not in ['clean', 'cleanlib', 'cleanall', 'run', 'fds']:
+                            if self.Target not in {'clean', 'cleanlib', 'cleanall', 'run', 'fds'}:
                                 # for target which must generate AutoGen code and makefile
                                 if not self.SkipAutoGen or self.Target == 'genc':
                                     self.Progress.Start("Generating code")
@@ -2036,7 +2036,7 @@ class Build():
                             continue
 
                         # Not to auto-gen for targets 'clean', 'cleanlib', 'cleanall', 'run', 'fds'
-                        if self.Target not in ['clean', 'cleanlib', 'cleanall', 'run', 'fds']:
+                        if self.Target not in {'clean', 'cleanlib', 'cleanall', 'run', 'fds'}:
                             # for target which must generate AutoGen code and makefile
                             if not self.SkipAutoGen or self.Target == 'genc':
                                 Ma.CreateCodeFile(True)
@@ -2101,7 +2101,7 @@ class Build():
                     EdkLogger.error("build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
 
                 # Create MAP file when Load Fix Address is enabled.
-                if self.Target in ["", "all", "fds"]:
+                if self.Target in {"", "all", "fds"}:
                     for Arch in Wa.ArchList:
                         #
                         # Check whether the set fix address is above 4G for 32bit image.
@@ -2213,7 +2213,7 @@ class Build():
     #
     def Launch(self):
         if not self.ModuleFile:
-            if not self.SpawnMode or self.Target not in ["", "all"]:
+            if not self.SpawnMode or self.Target not in {"", "all"}:
                 self.SpawnMode = False
                 self._BuildPlatform()
             else:
@@ -2274,7 +2274,7 @@ def ParseDefines(DefineList=[]):
                                 ExtraData=DefineTokenList[0])
 
             if len(DefineTokenList) == 1:
-                DefineDict[DefineTokenList[0]] = "TRUE"
+                DefineDict[DefineTokenList[0]] = TAB_TRUE_1
             else:
                 DefineDict[DefineTokenList[0]] = DefineTokenList[1].strip()
     return DefineDict
@@ -2478,7 +2478,7 @@ def Main():
             if ErrorCode != 0:
                 EdkLogger.error("build", ErrorCode, ExtraData=ErrorInfo)
 
-        if Option.Flag is not None and Option.Flag not in ['-c', '-s']:
+        if Option.Flag is not None and Option.Flag not in {'-c', '-s'}:
             EdkLogger.error("build", OPTION_VALUE_INVALID, "UNI flag must be one of -c or -s")
 
         MyBuild = Build(Target, Workspace, Option)
-- 
2.16.2.windows.1
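
For readers skimming the series, a minimal standalone sketch of the pattern this
patch applies throughout BaseTools follows. The constant names mirror ones used
in the diff above (SUP_MODULE_SET, TAB_TRUE_SET, TAB_FALSE_SET); the member
values shown are abbreviated, illustrative examples rather than the full EDK II
definitions.

    # Sets give O(1) average membership tests and signal "unordered collection
    # of allowed values"; lists/tuples remain the right choice when iteration
    # order matters. Values below are illustrative, not the complete sets.
    SUP_MODULE_SET = {"BASE", "SEC", "PEI_CORE", "PEIM", "DXE_DRIVER"}
    TAB_TRUE_SET = {"TRUE", "True", "true"}
    TAB_FALSE_SET = {"FALSE", "False", "false"}

    def is_supported(module_type):
        # Membership test against a set instead of a list or tuple.
        return module_type in SUP_MODULE_SET

    def supported_for_display():
        # When a deterministic order is needed (error messages, for loops),
        # convert back to an ordered sequence explicitly.
        return " ".join(sorted(SUP_MODULE_SET))

    def normalize_bool_token(token):
        # The parsers replace textual booleans with '1'/'0'; a set makes the
        # case-variant comparison a single membership check.
        if token in TAB_TRUE_SET:
            return "1"
        if token in TAB_FALSE_SET:
            return "0"
        return token

    if __name__ == "__main__":
        print(is_supported("PEIM"))          # True
        print(is_supported("USER_DEFINED"))  # False
        print(supported_for_display())
        print(normalize_bool_token("true"))  # 1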




Thread overview: 13+ messages
2018-05-14 18:09 [PATCH v1 00/11] BaseTools refactoring Jaben Carsey
2018-05-14 18:09 ` [PATCH v1 01/11] BaseTools: decorate base classes to prevent instantiation Jaben Carsey
2018-05-14 18:09 ` [PATCH v1 02/11] BaseTools: Workspace - create a base class Jaben Carsey
2018-05-14 18:09 ` [PATCH v1 03/11] BaseTools: remove unused code Jaben Carsey
2018-05-14 18:09 ` [PATCH v1 04/11] BaseTools: remove repeated calls to startswith/endswith Jaben Carsey
2018-05-14 18:09 ` [PATCH v1 05/11] BaseTools: use set presence instead of series of equality Jaben Carsey
2018-05-14 18:09 ` [PATCH v1 06/11] BaseTools: refactor section generation Jaben Carsey
2018-05-14 18:09 ` [PATCH v1 07/11] BaseTools: refactor file opening/writing Jaben Carsey
2018-05-14 18:09 ` [PATCH v1 08/11] BaseTools: refactor to change object types Jaben Carsey
2018-05-14 18:09 ` [PATCH v1 09/11] BaseTools: refactor to stop re-allocating strings Jaben Carsey
2018-05-14 18:09 ` Jaben Carsey [this message]
2018-05-14 18:09 ` [PATCH v1 11/11] BaseTools: remove extra assignment Jaben Carsey
  -- strict thread matches above, loose matches on Subject: below --
2018-06-20 21:08 [PATCH v2 00/11] BaseTools Refactoring Jaben Carsey
2018-06-20 21:08 ` [PATCH v1 10/11] BaseTools: change to set for membership testing Jaben Carsey
