public inbox for devel@edk2.groups.io
 help / color / mirror / Atom feed
* [PATCH v1 00/11] BaseTools refactoring
@ 2018-05-14 18:09 Jaben Carsey
  2018-05-14 18:09 ` [PATCH v1 01/11] BaseTools: decorate base classes to prevent instantiation Jaben Carsey
                   ` (10 more replies)
  0 siblings, 11 replies; 13+ messages in thread
From: Jaben Carsey @ 2018-05-14 18:09 UTC (permalink / raw)
  To: edk2-devel

This patch series cleans up BaseTools.
Classes that are never instantiated directly and are designed purely as
base classes are marked using the abstract base class (ABC) machinery to
prevent instantiation.  This prevents future errors.
Create a new shared base class to centralize code that was identical in
multiple classes.
Clean up to pass tuples to startswith and endswith instead of calling
them multiple times in a single expression.
When a statement used a fixed list or tuple, change it to a set to allow
hashing to speed up membership testing.  If the order is important, make
sure it is a list.  Creating tuples on the fly serves no purpose.
Use with statements for file opening and make sure we don't request more
file privileges than needed.  This also seems to have cleaned up the
previously seen errors around the AutoGenTimeStamp file.
Lots of work to eliminate string concatenation, which performs poorly
since strings are immutable and must be completely reallocated and
copied for each concatenation.
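The kinds of cleanups described above can be sketched as follows (hypothetical snippets for illustration, not code taken from the patches themselves):

```python
# Pass a tuple to startswith instead of calling it repeatedly:
line = "0x1234"
old = line.startswith("0x") or line.startswith("0X")
new = line.startswith(("0x", "0X"))
assert old == new

# Use a set for membership testing instead of an on-the-fly list/tuple
# (sets hash their members, so lookup is O(1) instead of a linear scan):
VALID_ARCHES = {"IA32", "X64", "ARM", "AARCH64"}
assert "X64" in VALID_ARCHES

# Use a with statement so the file is closed even on error, and open
# read-only when no write access is needed:
with open(__file__, "r") as f:
    first_line = f.readline()

# Build strings with join rather than repeated concatenation, since
# each += reallocates and copies the whole string:
parts = ["a", "b", "c"]
assert "".join(parts) == "abc"
```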

Jaben Carsey (11):
  BaseTools: decorate base classes to prevent instantiation
  BaseTools: Workspace - create a base class
  BaseTools: remove unused code
  BaseTools: remove repeated calls to startswith/endswith
  BaseTools: use set presence instead of series of equality
  BaseTools: refactor section generation
  BaseTools: refactor file opening/writing
  BaseTools: refactor to change object types
  BaseTools: refactor to stop re-allocating strings
  BaseTools: change to set for membership testing
  BaseTools: remove extra assignment

 BaseTools/Source/Python/AutoGen/AutoGen.py                      | 248 ++++++++++----------
 BaseTools/Source/Python/AutoGen/GenC.py                         |  75 +++---
 BaseTools/Source/Python/AutoGen/GenDepex.py                     |  42 ++--
 BaseTools/Source/Python/AutoGen/GenMake.py                      |  43 ++--
 BaseTools/Source/Python/AutoGen/GenPcdDb.py                     |  52 ++--
 BaseTools/Source/Python/AutoGen/GenVar.py                       |   6 +-
 BaseTools/Source/Python/AutoGen/IdfClassObject.py               |  21 +-
 BaseTools/Source/Python/AutoGen/StrGather.py                    |  22 +-
 BaseTools/Source/Python/AutoGen/UniClassObject.py               |   8 +-
 BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py      |  32 ++-
 BaseTools/Source/Python/BPDG/BPDG.py                            |   2 +-
 BaseTools/Source/Python/BPDG/GenVpd.py                          |  38 ++-
 BaseTools/Source/Python/Common/DataType.py                      |  37 ++-
 BaseTools/Source/Python/Common/Expression.py                    |  68 +++---
 BaseTools/Source/Python/Common/Misc.py                          |  77 +++---
 BaseTools/Source/Python/Common/Parsing.py                       |   2 +-
 BaseTools/Source/Python/Common/RangeExpression.py               |  10 +-
 BaseTools/Source/Python/Common/String.py                        |   2 +-
 BaseTools/Source/Python/Common/TargetTxtClassObject.py          |  21 +-
 BaseTools/Source/Python/Common/ToolDefClassObject.py            |   4 +-
 BaseTools/Source/Python/Common/VariableAttributes.py            |   6 +-
 BaseTools/Source/Python/Common/VpdInfoFile.py                   |   4 +-
 BaseTools/Source/Python/CommonDataClass/CommonClass.py          |  29 +--
 BaseTools/Source/Python/CommonDataClass/DataClass.py            |  47 +---
 BaseTools/Source/Python/Ecc/Check.py                            | 123 +---------
 BaseTools/Source/Python/Ecc/CodeFragmentCollector.py            |   4 +-
 BaseTools/Source/Python/Ecc/Configuration.py                    |  10 +-
 BaseTools/Source/Python/Ecc/Database.py                         |  76 +-----
 BaseTools/Source/Python/Ecc/Ecc.py                              |   4 +-
 BaseTools/Source/Python/Ecc/MetaDataParser.py                   |   4 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py |  54 +++--
 BaseTools/Source/Python/Ecc/c.py                                |  42 ++--
 BaseTools/Source/Python/Eot/CodeFragmentCollector.py            |   9 +-
 BaseTools/Source/Python/Eot/EotGlobalData.py                    |  14 +-
 BaseTools/Source/Python/Eot/FileProfile.py                      |   8 +-
 BaseTools/Source/Python/Eot/Parser.py                           |   4 +-
 BaseTools/Source/Python/Eot/Report.py                           |  19 +-
 BaseTools/Source/Python/Eot/c.py                                |   2 +-
 BaseTools/Source/Python/GenFds/Capsule.py                       |  26 +-
 BaseTools/Source/Python/GenFds/CapsuleData.py                   |  11 +-
 BaseTools/Source/Python/GenFds/CompressSection.py               |  24 +-
 BaseTools/Source/Python/GenFds/DataSection.py                   |  32 +--
 BaseTools/Source/Python/GenFds/DepexSection.py                  |   8 +-
 BaseTools/Source/Python/GenFds/EfiSection.py                    | 111 ++++-----
 BaseTools/Source/Python/GenFds/FdfParser.py                     | 195 ++++++++-------
 BaseTools/Source/Python/GenFds/Ffs.py                           |  88 +++----
 BaseTools/Source/Python/GenFds/FfsFileStatement.py              |   9 +-
 BaseTools/Source/Python/GenFds/FfsInfStatement.py               |  28 +--
 BaseTools/Source/Python/GenFds/Fv.py                            | 112 +++------
 BaseTools/Source/Python/GenFds/FvImageSection.py                |  38 ++-
 BaseTools/Source/Python/GenFds/GenFds.py                        |   8 +-
 BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py          |  65 ++---
 BaseTools/Source/Python/GenFds/GuidSection.py                   |  71 +++---
 BaseTools/Source/Python/GenFds/OptRomInfStatement.py            |  15 +-
 BaseTools/Source/Python/GenFds/OptionRom.py                     |   2 -
 BaseTools/Source/Python/GenFds/Region.py                        |  17 +-
 BaseTools/Source/Python/GenFds/Section.py                       | 231 ++++++++----------
 BaseTools/Source/Python/GenFds/UiSection.py                     |  22 +-
 BaseTools/Source/Python/GenFds/VerSection.py                    |  26 +-
 BaseTools/Source/Python/GenFds/Vtf.py                           | 108 +++------
 BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py    |  21 +-
 BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py          |  39 +--
 BaseTools/Source/Python/Table/Table.py                          |   6 +-
 BaseTools/Source/Python/Table/TableDataModel.py                 |  11 +-
 BaseTools/Source/Python/Table/TableReport.py                    |  47 ++--
 BaseTools/Source/Python/TargetTool/TargetTool.py                |  97 ++++----
 BaseTools/Source/Python/Trim/Trim.py                            |  76 +++---
 BaseTools/Source/Python/Workspace/BuildClassObject.py           | 149 +++++-------
 BaseTools/Source/Python/Workspace/DecBuildData.py               |   2 +-
 BaseTools/Source/Python/Workspace/DscBuildData.py               | 119 +++++-----
 BaseTools/Source/Python/Workspace/InfBuildData.py               |  32 +--
 BaseTools/Source/Python/Workspace/MetaDataTable.py              |  10 +-
 BaseTools/Source/Python/Workspace/MetaFileCommentParser.py      |   4 +-
 BaseTools/Source/Python/Workspace/MetaFileParser.py             |  74 +++---
 BaseTools/Source/Python/Workspace/WorkspaceCommon.py            |  20 +-
 BaseTools/Source/Python/Workspace/WorkspaceDatabase.py          |  18 +-
 BaseTools/Source/Python/build/BuildReport.py                    | 115 +++++----
 BaseTools/Source/Python/build/build.py                          | 140 ++++++-----
 78 files changed, 1478 insertions(+), 2018 deletions(-)

-- 
2.16.2.windows.1



^ permalink raw reply	[flat|nested] 13+ messages in thread

* [PATCH v1 01/11] BaseTools: decorate base classes to prevent instantiation
  2018-05-14 18:09 [PATCH v1 00/11] BaseTools refactoring Jaben Carsey
@ 2018-05-14 18:09 ` Jaben Carsey
  2018-05-14 18:09 ` [PATCH v1 02/11] BaseTools: Workspace - create a base class Jaben Carsey
                   ` (9 subsequent siblings)
  10 siblings, 0 replies; 13+ messages in thread
From: Jaben Carsey @ 2018-05-14 18:09 UTC (permalink / raw)
  To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu

Use Python's ABC (abstract base class) support to raise a TypeError if we
instantiate classes that were designed to be used only as base classes.
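As a sketch of the mechanism (shown here with the modern metaclass= spelling; the patch itself uses the Python 2 __metaclass__ attribute, and the class names below are illustrative, not the actual BaseTools classes):

```python
from abc import ABCMeta, abstractmethod

class BuildFileBase(object, metaclass=ABCMeta):
    # Marking __init__ abstract makes direct instantiation a TypeError.
    @abstractmethod
    def __init__(self):
        self.file_type = "make"

class NmakeFile(BuildFileBase):
    # Subclasses that override __init__ can be instantiated normally.
    def __init__(self):
        super().__init__()

try:
    BuildFileBase()
except TypeError:
    print("base class cannot be instantiated")

obj = NmakeFile()  # works: the abstract method is overridden
```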

Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
 BaseTools/Source/Python/AutoGen/AutoGen.py                      | 4 ++++
 BaseTools/Source/Python/AutoGen/GenMake.py                      | 4 ++++
 BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py      | 4 ++++
 BaseTools/Source/Python/Common/Expression.py                    | 4 ++++
 BaseTools/Source/Python/Common/VariableAttributes.py            | 6 +++++-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py | 4 ++++
 BaseTools/Source/Python/Table/Table.py                          | 6 +++++-
 BaseTools/Source/Python/Workspace/BuildClassObject.py           | 7 +++++++
 BaseTools/Source/Python/Workspace/MetaFileParser.py             | 4 ++++
 9 files changed, 41 insertions(+), 2 deletions(-)

diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 54f6b1f173b2..619e1e41e32b 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -47,6 +47,7 @@ import hashlib
 from GenVar import VariableMgr,var_info
 from collections import OrderedDict
 from collections import defaultdict
+from abc import ABCMeta, abstractmethod
 
 ## Regular expression for splitting Dependency Expression string into tokens
 gDepexTokenPattern = re.compile("(\(|\)|\w+| \S+\.inf)")
@@ -197,6 +198,9 @@ class AutoGen(object):
             cls.__ObjectCache[Key] = super(AutoGen, cls).__new__(cls)
             return cls.__ObjectCache[Key]
 
+    __metaclass__ = ABCMeta
+    # prevent this class from being accidentally instantiated
+    @abstractmethod
     def __init__ (self, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
         super(AutoGen, self).__init__(self, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs)
 
diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
index a37350742240..68ec9a817133 100644
--- a/BaseTools/Source/Python/AutoGen/GenMake.py
+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
@@ -26,6 +26,7 @@ from Common.String import *
 from BuildEngine import *
 import Common.GlobalData as GlobalData
 from collections import OrderedDict
+from abc import ABCMeta, abstractmethod
 
 ## Regular expression for finding header file inclusions
 gIncludePattern = re.compile(r"^[ \t]*#?[ \t]*include(?:[ \t]*(?:\\(?:\r\n|\r|\n))*[ \t]*)*(?:\(?[\"<]?[ \t]*)([-\w.\\/() \t]+)(?:[ \t]*[\">]?\)?)", re.MULTILINE | re.UNICODE | re.IGNORECASE)
@@ -171,6 +172,9 @@ class BuildFile(object):
     #
     #   @param  AutoGenObject   Object of AutoGen class
     #
+    __metaclass__ = ABCMeta
+    # prevent this class from being accidentally instantiated
+    @abstractmethod
     def __init__(self, AutoGenObject):
         self._AutoGenObject = AutoGenObject
         self._FileType = gMakeType
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index 64d4965e9662..e2b4795129ef 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -20,6 +20,7 @@ from Common.Misc import *
 from StringIO import StringIO
 from struct import pack
 from Common.DataType import *
+from abc import ABCMeta, abstractmethod
 
 class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
     def __init__(self):
@@ -222,6 +223,9 @@ class VAR_CHECK_PCD_VARIABLE_TAB(object):
 
 
 class VAR_CHECK_PCD_VALID_OBJ(object):
+    __metaclass__ = ABCMeta
+    # prevent this class from being accidentally instantiated
+    @abstractmethod
     def __init__(self, VarOffset, data, PcdDataType):
         self.Type = 1
         self.Length = 0  # Length include this header
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index 9e9d9fdc02e7..9fa07c6add16 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -19,6 +19,7 @@ from Misc import GuidStringToGuidStructureString, ParseFieldValue, IsFieldValueA
 import Common.EdkLogger as EdkLogger
 import copy
 from Common.DataType import *
+from abc import ABCMeta, abstractmethod
 
 ERR_STRING_EXPR         = 'This operator cannot be used in string expression: [%s].'
 ERR_SNYTAX              = 'Syntax error, the rest of expression cannot be evaluated: [%s].'
@@ -202,6 +203,9 @@ def IntToStr(Value):
 SupportedInMacroList = ['TARGET', 'TOOL_CHAIN_TAG', 'ARCH', 'FAMILY']
 
 class BaseExpression(object):
+    __metaclass__ = ABCMeta
+    # prevent this class from being accidentally instantiated
+    @abstractmethod
     def __init__(self, *args, **kwargs):
         super(BaseExpression, self).__init__()
 
diff --git a/BaseTools/Source/Python/Common/VariableAttributes.py b/BaseTools/Source/Python/Common/VariableAttributes.py
index a2e22ca0409c..72f64fff3864 100644
--- a/BaseTools/Source/Python/Common/VariableAttributes.py
+++ b/BaseTools/Source/Python/Common/VariableAttributes.py
@@ -3,7 +3,7 @@
 # This file is used to handle the variable attributes and property information
 #
 #
-# Copyright (c) 2015, Intel Corporation. All rights reserved.<BR>
+# Copyright (c) 2015 - 2018, Intel Corporation. All rights reserved.<BR>
 # This program and the accompanying materials
 # are licensed and made available under the terms and conditions of the BSD License
 # which accompanies this distribution.  The full text of the license may be found at
@@ -12,6 +12,7 @@
 # THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS" BASIS,
 # WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 #
+from abc import ABCMeta, abstractmethod
    
 class VariableAttributes(object):
     EFI_VARIABLE_NON_VOLATILE = 0x00000001
@@ -25,6 +26,9 @@ class VariableAttributes(object):
                      "RO":VAR_CHECK_VARIABLE_PROPERTY_READ_ONLY
                      }
     
+    __metaclass__ = ABCMeta
+    # prevent this class from being accidentally instantiated
+    @abstractmethod
     def __init__(self):
         pass
     
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
index 4d61cd1cea91..e5c43b629151 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
@@ -35,6 +35,7 @@ from MetaFileTable import MetaFileStorage
 from GenFds.FdfParser import FdfParser
 from Common.LongFilePathSupport import OpenLongFilePath as open
 from Common.LongFilePathSupport import CodecOpenLongFilePath
+from abc import ABCMeta, abstractmethod
 
 ## A decorator used to parse macro definition
 def ParseMacro(Parser):
@@ -146,6 +147,9 @@ class MetaFileParser(object):
     #   @param      Owner           Owner ID (for sub-section parsing)
     #   @param      From            ID from which the data comes (for !INCLUDE directive)
     #
+    __metaclass__ = ABCMeta
+    # prevent this class from being accidentally instantiated
+    @abstractmethod
     def __init__(self, FilePath, FileType, Table, Owner=-1, From=-1):
         self._Table = Table
         self._RawTable = Table
diff --git a/BaseTools/Source/Python/Table/Table.py b/BaseTools/Source/Python/Table/Table.py
index c311df91c2ec..46bc92ea8377 100644
--- a/BaseTools/Source/Python/Table/Table.py
+++ b/BaseTools/Source/Python/Table/Table.py
@@ -1,7 +1,7 @@
 ## @file
 # This file is used to create/update/query/erase a common table
 #
-# Copyright (c) 2008, Intel Corporation. All rights reserved.<BR>
+# Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
 # This program and the accompanying materials
 # are licensed and made available under the terms and conditions of the BSD License
 # which accompanies this distribution.  The full text of the license may be found at
@@ -15,6 +15,7 @@
 # Import Modules
 #
 import Common.EdkLogger as EdkLogger
+from abc import ABCMeta, abstractmethod
 
 ## TableFile
 #
@@ -26,6 +27,9 @@ import Common.EdkLogger as EdkLogger
 # @param TableName:  Name of the table
 #
 class Table(object):
+    __metaclass__ = ABCMeta
+    # prevent this class from being accidentally instantiated
+    @abstractmethod
     def __init__(self, Cursor):
         self.Cur = Cursor
         self.Table = ''
diff --git a/BaseTools/Source/Python/Workspace/BuildClassObject.py b/BaseTools/Source/Python/Workspace/BuildClassObject.py
index 209315d901b2..5f34e8e0bc69 100644
--- a/BaseTools/Source/Python/Workspace/BuildClassObject.py
+++ b/BaseTools/Source/Python/Workspace/BuildClassObject.py
@@ -18,6 +18,7 @@ from Common.Misc import RealPath2
 from Common.BuildToolError import *
 from Common.DataType import *
 import collections
+from abc import ABCMeta, abstractmethod
 
 ## PcdClassObject
 #
@@ -381,6 +382,9 @@ class ModuleBuildClassObject(object):
 #                       { [(PcdCName, PcdGuidCName)] : PcdClassObject}
 #
 class PackageBuildClassObject(object):
+    __metaclass__ = ABCMeta
+    # prevent this class from being accidentally instantiated
+    @abstractmethod
     def __init__(self):
         self.MetaFile                = ''
         self.PackageName             = ''
@@ -451,6 +455,9 @@ class PackageBuildClassObject(object):
 #                         { [BuildOptionKey] : BuildOptionValue }
 #
 class PlatformBuildClassObject(object):
+    __metaclass__ = ABCMeta
+    # prevent this class from being accidentally instantiated
+    @abstractmethod
     def __init__(self):
         self.MetaFile                = ''
         self.PlatformName            = ''
diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTools/Source/Python/Workspace/MetaFileParser.py
index 36843643ed13..21b20bce4018 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py
@@ -34,6 +34,7 @@ from Common.LongFilePathSupport import OpenLongFilePath as open
 from collections import defaultdict
 from MetaFileTable import MetaFileStorage
 from MetaFileCommentParser import CheckInfComment
+from abc import ABCMeta, abstractmethod
 
 ## RegEx for finding file versions
 hexVersionPattern = re.compile(r'0[xX][\da-f-A-F]{5,8}')
@@ -154,6 +155,9 @@ class MetaFileParser(object):
     #   @param      Owner           Owner ID (for sub-section parsing)
     #   @param      From            ID from which the data comes (for !INCLUDE directive)
     #
+    __metaclass__ = ABCMeta
+    # prevent this class from being accidentally instantiated
+    @abstractmethod
     def __init__(self, FilePath, FileType, Arch, Table, Owner= -1, From= -1):
         self._Table = Table
         self._RawTable = Table
-- 
2.16.2.windows.1



^ permalink raw reply related	[flat|nested] 13+ messages in thread

* [PATCH v1 02/11] BaseTools: Workspace - create a base class
  2018-05-14 18:09 [PATCH v1 00/11] BaseTools refactoring Jaben Carsey
  2018-05-14 18:09 ` [PATCH v1 01/11] BaseTools: decorate base classes to prevent instantiation Jaben Carsey
@ 2018-05-14 18:09 ` Jaben Carsey
  2018-05-14 18:09 ` [PATCH v1 03/11] BaseTools: remove unused code Jaben Carsey
                   ` (8 subsequent siblings)
  10 siblings, 0 replies; 13+ messages in thread
From: Jaben Carsey @ 2018-05-14 18:09 UTC (permalink / raw)
  To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu

Refactor three classes and create a new base class for their shared
functions.
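The shape of the refactor can be sketched like this (illustrative names, not the actual BaseTools classes; the shared dunder methods are keyed on a MetaFile attribute, mirroring the patch):

```python
class MetaFileBackedBase(object):
    """Centralizes __str__/__eq__/__hash__ previously duplicated
    in each subclass."""
    def __str__(self):
        return str(self.MetaFile)

    def __eq__(self, other):
        # As in the patch: compare MetaFile against the other object
        # directly, so comparison against a plain path string works too.
        return other is not None and self.MetaFile == other

    def __hash__(self):
        return hash(self.MetaFile)

class ModuleLike(MetaFileBackedBase):
    def __init__(self, metafile):
        self.MetaFile = metafile

m = ModuleLike("Pkg/Module.inf")
assert str(m) == "Pkg/Module.inf"
assert m == "Pkg/Module.inf"
assert hash(m) == hash("Pkg/Module.inf")
```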

Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
 BaseTools/Source/Python/Workspace/BuildClassObject.py | 140 +++++++-------------
 1 file changed, 50 insertions(+), 90 deletions(-)

diff --git a/BaseTools/Source/Python/Workspace/BuildClassObject.py b/BaseTools/Source/Python/Workspace/BuildClassObject.py
index 5f34e8e0bc69..db9518cdff17 100644
--- a/BaseTools/Source/Python/Workspace/BuildClassObject.py
+++ b/BaseTools/Source/Python/Workspace/BuildClassObject.py
@@ -253,6 +253,47 @@ class LibraryClassObject(object):
         if Type is not None:
             self.SupModList = CleanString(Type).split(DataType.TAB_SPACE_SPLIT)
 
+## BuildClassObjectBase
+#
+# This is a base class for classes later in this file. It simplifies them
+# by not requiring duplication of standard functions.
+# This class is not intended for use outside of being a base class.
+#
+class BuildClassObjectBase(object):
+    __metaclass__ = ABCMeta
+    # prevent this class from being accidentally instantiated
+    @abstractmethod
+    def __init__(self, *a,**k):
+        super(BuildClassObjectBase,self).__init__(*a,**k)
+
+    ## Convert the class to a string
+    #
+    #  Convert member MetaFile of the class to a string
+    #
+    #  @retval string Formatted String
+    #
+    def __str__(self):
+        return str(self.MetaFile)
+
+    ## Override __eq__ function
+    #
+    # Check whether ModuleBuildClassObjects are the same
+    #
+    # @retval False The two ModuleBuildClassObjects are different
+    # @retval True  The two ModuleBuildClassObjects are the same
+    #
+    def __eq__(self, Other):
+        return Other and self.MetaFile == Other
+
+    ## Override __hash__ function
+    #
+    # Use MetaFile as key in hash table
+    #
+    # @retval string Key for hash table
+    #
+    def __hash__(self):
+        return hash(self.MetaFile)
+
 ## ModuleBuildClassObject
 #
 # This Class defines ModuleBuildClass
@@ -297,8 +338,9 @@ class LibraryClassObject(object):
 #                              { [BuildOptionKey] : BuildOptionValue}
 # @var Depex:                  To store value for Depex
 #
-class ModuleBuildClassObject(object):
-    def __init__(self):
+class ModuleBuildClassObject(BuildClassObjectBase):
+    def __init__(self, *a, **k):
+        super(ModuleBuildClassObject,self).__init__(*a,**k)
         self.AutoGenVersion          = 0
         self.MetaFile                = ''
         self.BaseName                = ''
@@ -330,34 +372,6 @@ class ModuleBuildClassObject(object):
         self.BuildOptions            = {}
         self.Depex                   = {}
 
-    ## Convert the class to a string
-    #
-    #  Convert member MetaFile of the class to a string
-    #
-    #  @retval string Formatted String
-    #
-    def __str__(self):
-        return str(self.MetaFile)
-
-    ## Override __eq__ function
-    #
-    # Check whether ModuleBuildClassObjects are the same
-    #
-    # @retval False The two ModuleBuildClassObjects are different
-    # @retval True  The two ModuleBuildClassObjects are the same
-    #
-    def __eq__(self, Other):
-        return self.MetaFile == Other
-
-    ## Override __hash__ function
-    #
-    # Use MetaFile as key in hash table
-    #
-    # @retval string Key for hash table
-    #
-    def __hash__(self):
-        return hash(self.MetaFile)
-
 ## PackageBuildClassObject
 #
 # This Class defines PackageBuildClass
@@ -381,11 +395,12 @@ class ModuleBuildClassObject(object):
 # @var Pcds:            To store value for Pcds, it is a set structure as
 #                       { [(PcdCName, PcdGuidCName)] : PcdClassObject}
 #
-class PackageBuildClassObject(object):
+class PackageBuildClassObject(BuildClassObjectBase):
     __metaclass__ = ABCMeta
     # prevent this class from being accidentally instantiated
     @abstractmethod
-    def __init__(self):
+    def __init__(self, *a, **k):
+        super(PackageBuildClassObject,self).__init__(*a,**k)
         self.MetaFile                = ''
         self.PackageName             = ''
         self.Guid                    = ''
@@ -398,34 +413,6 @@ class PackageBuildClassObject(object):
         self.LibraryClasses          = {}
         self.Pcds                    = {}
 
-    ## Convert the class to a string
-    #
-    #  Convert member MetaFile of the class to a string
-    #
-    #  @retval string Formatted String
-    #
-    def __str__(self):
-        return str(self.MetaFile)
-
-    ## Override __eq__ function
-    #
-    # Check whether PackageBuildClassObjects are the same
-    #
-    # @retval False The two PackageBuildClassObjects are different
-    # @retval True  The two PackageBuildClassObjects are the same
-    #
-    def __eq__(self, Other):
-        return self.MetaFile == Other
-
-    ## Override __hash__ function
-    #
-    # Use MetaFile as key in hash table
-    #
-    # @retval string Key for hash table
-    #
-    def __hash__(self):
-        return hash(self.MetaFile)
-
 ## PlatformBuildClassObject
 #
 # This Class defines PlatformBuildClass
@@ -454,11 +441,12 @@ class PackageBuildClassObject(object):
 # @var BuildOptions:      To store value for BuildOptions, it is a set structure as
 #                         { [BuildOptionKey] : BuildOptionValue }
 #
-class PlatformBuildClassObject(object):
+class PlatformBuildClassObject(BuildClassObjectBase):
     __metaclass__ = ABCMeta
     # prevent this class from being accidentally instantiated
     @abstractmethod
-    def __init__(self):
+    def __init__(self, *a, **k):
+        super(PlatformBuildClassObject,self).__init__(*a,**k)
         self.MetaFile                = ''
         self.PlatformName            = ''
         self.Guid                    = ''
@@ -476,31 +464,3 @@ class PlatformBuildClassObject(object):
         self.Libraries               = {}
         self.Pcds                    = {}
         self.BuildOptions            = {}
-
-    ## Convert the class to a string
-    #
-    #  Convert member MetaFile of the class to a string
-    #
-    #  @retval string Formatted String
-    #
-    def __str__(self):
-        return str(self.MetaFile)
-
-    ## Override __eq__ function
-    #
-    # Check whether PlatformBuildClassObjects are the same
-    #
-    # @retval False The two PlatformBuildClassObjects are different
-    # @retval True  The two PlatformBuildClassObjects are the same
-    #
-    def __eq__(self, Other):
-        return self.MetaFile == Other
-
-    ## Override __hash__ function
-    #
-    # Use MetaFile as key in hash table
-    #
-    # @retval string Key for hash table
-    #
-    def __hash__(self):
-        return hash(self.MetaFile)
-- 
2.16.2.windows.1



^ permalink raw reply related	[flat|nested] 13+ messages in thread

* [PATCH v1 03/11] BaseTools: remove unused code
  2018-05-14 18:09 [PATCH v1 00/11] BaseTools refactoring Jaben Carsey
  2018-05-14 18:09 ` [PATCH v1 01/11] BaseTools: decorate base classes to prevent instantiation Jaben Carsey
  2018-05-14 18:09 ` [PATCH v1 02/11] BaseTools: Workspace - create a base class Jaben Carsey
@ 2018-05-14 18:09 ` Jaben Carsey
  2018-05-14 18:09 ` [PATCH v1 04/11] BaseTools: remove repeated calls to startswith/endswith Jaben Carsey
                   ` (7 subsequent siblings)
  10 siblings, 0 replies; 13+ messages in thread
From: Jaben Carsey @ 2018-05-14 18:09 UTC (permalink / raw)
  To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu

delete commented-out code
delete never-used classes/variables/functions/imports
refactor to remove the Ffs class
don't construct a class just to access a class attribute

Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
 BaseTools/Source/Python/AutoGen/GenMake.py             | 17 ++--
 BaseTools/Source/Python/CommonDataClass/DataClass.py   | 47 +----------
 BaseTools/Source/Python/Ecc/Database.py                | 76 +----------------
 BaseTools/Source/Python/Ecc/c.py                       |  1 -
 BaseTools/Source/Python/GenFds/CapsuleData.py          |  1 -
 BaseTools/Source/Python/GenFds/CompressSection.py      |  6 +-
 BaseTools/Source/Python/GenFds/DataSection.py          |  4 +-
 BaseTools/Source/Python/GenFds/DepexSection.py         |  1 -
 BaseTools/Source/Python/GenFds/EfiSection.py           | 16 ++--
 BaseTools/Source/Python/GenFds/Ffs.py                  | 88 ++++++++------------
 BaseTools/Source/Python/GenFds/FfsFileStatement.py     |  4 +-
 BaseTools/Source/Python/GenFds/FfsInfStatement.py      | 10 +--
 BaseTools/Source/Python/GenFds/Fv.py                   |  1 -
 BaseTools/Source/Python/GenFds/FvImageSection.py       |  6 +-
 BaseTools/Source/Python/GenFds/GuidSection.py          |  4 +-
 BaseTools/Source/Python/GenFds/OptRomInfStatement.py   |  7 +-
 BaseTools/Source/Python/GenFds/UiSection.py            |  6 +-
 BaseTools/Source/Python/GenFds/VerSection.py           |  6 +-
 BaseTools/Source/Python/Workspace/WorkspaceDatabase.py | 18 ++--
 BaseTools/Source/Python/build/build.py                 |  2 +-
 20 files changed, 81 insertions(+), 240 deletions(-)

diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
index 68ec9a817133..1c8ab7fe1ec8 100644
--- a/BaseTools/Source/Python/AutoGen/GenMake.py
+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
@@ -74,8 +74,6 @@ class BuildFile(object):
     ## template used to generate the build file (i.e. makefile if using make)
     _TEMPLATE_ = TemplateString('')
 
-    _DEFAULT_FILE_NAME_ = "Makefile"
-
     ## default file name for each type of build file
     _FILE_NAME_ = {
         "nmake" :   "Makefile",
@@ -151,23 +149,18 @@ class BuildFile(object):
         "gmake" :   "test -f %(Src)s && $(CP) %(Src)s %(Dst)s"
     }
 
-    _CD_TEMPLATE_ = {
-        "nmake" :   'if exist %(dir)s cd %(dir)s',
-        "gmake" :   "test -e %(dir)s && cd %(dir)s"
-    }
-
     _MAKE_TEMPLATE_ = {
         "nmake" :   'if exist %(file)s "$(MAKE)" $(MAKE_FLAGS) -f %(file)s',
         "gmake" :   'test -e %(file)s && "$(MAKE)" $(MAKE_FLAGS) -f %(file)s'
     }
 
-    _INCLUDE_CMD_ = {
-        "nmake" :   '!INCLUDE',
-        "gmake" :   "include"
+    _INC_FLAG_ = {
+        "MSFT" : "/I",
+        "GCC" : "-I",
+        "INTEL" : "-I",
+        "RVCT" : "-I"
     }
 
-    _INC_FLAG_ = {"MSFT" : "/I", "GCC" : "-I", "INTEL" : "-I", "RVCT" : "-I"}
-
     ## Constructor of BuildFile
     #
     #   @param  AutoGenObject   Object of AutoGen class
diff --git a/BaseTools/Source/Python/CommonDataClass/DataClass.py b/BaseTools/Source/Python/CommonDataClass/DataClass.py
index 31ed46c7ec56..ddf3270ad8a5 100644
--- a/BaseTools/Source/Python/CommonDataClass/DataClass.py
+++ b/BaseTools/Source/Python/CommonDataClass/DataClass.py
@@ -1,7 +1,7 @@
 ## @file
 # This file is used to define class for data structure used in ECC
 #
-# Copyright (c) 2008 - 2014, Intel Corporation. All rights reserved.<BR>
+# Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
 # This program and the accompanying materials
 # are licensed and made available under the terms and conditions of the BSD License
 # which accompanies this distribution.    The full text of the license may be found at
@@ -290,51 +290,6 @@ class IdentifierClass(object):
         self.EndLine = EndLine
         self.EndColumn = EndColumn
 
-## PcdClass
-#
-# This class defines a structure of a Pcd
-#
-# @param ID:                   ID of a Pcd
-# @param CName:                CName of a Pcd
-# @param TokenSpaceGuidCName:  TokenSpaceGuidCName of a Pcd
-# @param Token:                Token of a Pcd
-# @param DatumType:            DatumType of a Pcd
-# @param Model:                Model of a Pcd
-# @param BelongsToFile:        The Pcd belongs to which file
-# @param BelongsToFunction:    The Pcd belongs to which function
-# @param StartLine:            StartLine of a Pcd
-# @param StartColumn:          StartColumn of a Pcd
-# @param EndLine:              EndLine of a Pcd
-# @param EndColumn:            EndColumn of a Pcd
-#
-# @var ID:                     ID of a Pcd
-# @var CName:                  CName of a Pcd
-# @var TokenSpaceGuidCName:    TokenSpaceGuidCName of a Pcd
-# @var Token:                  Token of a Pcd
-# @var DatumType:              DatumType of a Pcd
-# @var Model:                  Model of a Pcd
-# @var BelongsToFile:          The Pcd belongs to which file
-# @var BelongsToFunction:      The Pcd belongs to which function
-# @var StartLine:              StartLine of a Pcd
-# @var StartColumn:            StartColumn of a Pcd
-# @var EndLine:                EndLine of a Pcd
-# @var EndColumn:              EndColumn of a Pcd
-#
-class PcdDataClass(object):
-    def __init__(self, ID = -1, CName = '', TokenSpaceGuidCName = '', Token = '', DatumType = '', Model = MODEL_UNKNOWN, \
-                 BelongsToFile = -1, BelongsToFunction = -1, StartLine = -1, StartColumn = -1, EndLine = -1, EndColumn = -1):
-        self.ID = ID
-        self.CName = CName
-        self.TokenSpaceGuidCName = TokenSpaceGuidCName
-        self.Token = Token
-        self.DatumType = DatumType
-        self.BelongsToFile = BelongsToFile
-        self.BelongsToFunction = BelongsToFunction
-        self.StartLine = StartLine
-        self.StartColumn = StartColumn
-        self.EndLine = EndLine
-        self.EndColumn = EndColumn
-
 ## FileClass
 #
 # This class defines a structure of a file
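[Editor's note, not part of the patch] The cover letter mentions decorating purely-base classes to prevent instantiation (patch 01 of this series). A minimal sketch of that pattern follows; the class names here are illustrative, not taken from BaseTools, and the `metaclass=` syntax is Python 3 (BaseTools at this point targets Python 2, where the equivalent is a `__metaclass__ = ABCMeta` class attribute):

```python
from abc import ABCMeta, abstractmethod

class SectionBase(metaclass=ABCMeta):
    # Declaring an abstract method makes the class uninstantiable;
    # only concrete subclasses that override it can be constructed,
    # which turns accidental instantiation into an immediate TypeError.
    @abstractmethod
    def GenSection(self):
        raise NotImplementedError

class UiSectionExample(SectionBase):
    def GenSection(self):
        return '.ui'

# SectionBase() would raise TypeError here; the subclass works:
suffix = UiSectionExample().GenSection()
```

This is exactly the "prevents future errors" benefit the cover letter describes: dead base classes can no longer be created by mistake.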
diff --git a/BaseTools/Source/Python/Ecc/Database.py b/BaseTools/Source/Python/Ecc/Database.py
index 204117512452..dbc699502934 100644
--- a/BaseTools/Source/Python/Ecc/Database.py
+++ b/BaseTools/Source/Python/Ecc/Database.py
@@ -1,7 +1,7 @@
 ## @file
 # This file is used to create a database used by ECC tool
 #
-# Copyright (c) 2007 - 2014, Intel Corporation. All rights reserved.<BR>
+# Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
 # This program and the accompanying materials
 # are licensed and made available under the terms and conditions of the BSD License
 # which accompanies this distribution.  The full text of the license may be found at
@@ -211,59 +211,6 @@ class Database(object):
 
         EdkLogger.verbose("Insert information from file %s ... DONE!" % File.FullPath)
 
-    ## UpdateIdentifierBelongsToFunction
-    #
-    # Update the field "BelongsToFunction" for each Indentifier
-    #
-    #
-    def UpdateIdentifierBelongsToFunction_disabled(self):
-        EdkLogger.verbose("Update 'BelongsToFunction' for Identifiers started ...")
-
-        SqlCommand = """select ID, BelongsToFile, StartLine, EndLine, Model from Identifier"""
-        EdkLogger.debug(4, "SqlCommand: %s" %SqlCommand)
-        self.Cur.execute(SqlCommand)
-        Records = self.Cur.fetchall()
-        for Record in Records:
-            IdentifierID = Record[0]
-            BelongsToFile = Record[1]
-            StartLine = Record[2]
-            EndLine = Record[3]
-            Model = Record[4]
-
-            #
-            # Check whether an identifier belongs to a function
-            #
-            EdkLogger.debug(4, "For common identifiers ... ")
-            SqlCommand = """select ID from Function
-                        where StartLine < %s and EndLine > %s
-                        and BelongsToFile = %s""" % (StartLine, EndLine, BelongsToFile)
-            EdkLogger.debug(4, "SqlCommand: %s" %SqlCommand)
-            self.Cur.execute(SqlCommand)
-            IDs = self.Cur.fetchall()
-            for ID in IDs:
-                SqlCommand = """Update Identifier set BelongsToFunction = %s where ID = %s""" % (ID[0], IdentifierID)
-                EdkLogger.debug(4, "SqlCommand: %s" %SqlCommand)
-                self.Cur.execute(SqlCommand)
-
-            #
-            # Check whether the identifier is a function header
-            #
-            EdkLogger.debug(4, "For function headers ... ")
-            if Model == DataClass.MODEL_IDENTIFIER_COMMENT:
-                SqlCommand = """select ID from Function
-                        where StartLine = %s + 1
-                        and BelongsToFile = %s""" % (EndLine, BelongsToFile)
-                EdkLogger.debug(4, "SqlCommand: %s" %SqlCommand)
-                self.Cur.execute(SqlCommand)
-                IDs = self.Cur.fetchall()
-                for ID in IDs:
-                    SqlCommand = """Update Identifier set BelongsToFunction = %s, Model = %s where ID = %s""" % (ID[0], DataClass.MODEL_IDENTIFIER_FUNCTION_HEADER, IdentifierID)
-                    EdkLogger.debug(4, "SqlCommand: %s" %SqlCommand)
-                    self.Cur.execute(SqlCommand)
-
-        EdkLogger.verbose("Update 'BelongsToFunction' for Identifiers ... DONE")
-
-
     ## UpdateIdentifierBelongsToFunction
     #
     # Update the field "BelongsToFunction" for each Indentifier
@@ -281,8 +228,6 @@ class Database(object):
             BelongsToFile = Record[1]
             StartLine = Record[2]
             EndLine = Record[3]
-            #Data1.append(("'file%s'" % BelongsToFile, FunctionID, BelongsToFile, StartLine, EndLine))
-            #Data2.append(("'file%s'" % BelongsToFile, FunctionID, DataClass.MODEL_IDENTIFIER_FUNCTION_HEADER, BelongsToFile, DataClass.MODEL_IDENTIFIER_COMMENT, StartLine - 1))
 
             SqlCommand = """Update Identifier%s set BelongsToFunction = %s where BelongsToFile = %s and StartLine > %s and EndLine < %s""" % \
                         (BelongsToFile, FunctionID, BelongsToFile, StartLine, EndLine)
@@ -291,25 +236,6 @@ class Database(object):
             SqlCommand = """Update Identifier%s set BelongsToFunction = %s, Model = %s where BelongsToFile = %s and Model = %s and EndLine = %s""" % \
                          (BelongsToFile, FunctionID, DataClass.MODEL_IDENTIFIER_FUNCTION_HEADER, BelongsToFile, DataClass.MODEL_IDENTIFIER_COMMENT, StartLine - 1)
             self.TblIdentifier.Exec(SqlCommand)
-#       #
-#       # Check whether an identifier belongs to a function
-#       #
-#       print Data1
-#       SqlCommand = """Update ? set BelongsToFunction = ? where BelongsToFile = ? and StartLine > ? and EndLine < ?"""
-#       print SqlCommand
-#       EdkLogger.debug(4, "SqlCommand: %s" %SqlCommand)
-#       self.Cur.executemany(SqlCommand, Data1)
-#
-#       #
-#       # Check whether the identifier is a function header
-#       #
-#       EdkLogger.debug(4, "For function headers ... ")
-#       SqlCommand = """Update ? set BelongsToFunction = ?, Model = ? where BelongsToFile = ? and Model = ? and EndLine = ?"""
-#       EdkLogger.debug(4, "SqlCommand: %s" %SqlCommand)
-#       self.Cur.executemany(SqlCommand, Data2)
-#
-#       EdkLogger.verbose("Update 'BelongsToFunction' for Identifiers ... DONE")
-
 
 ##
 #
diff --git a/BaseTools/Source/Python/Ecc/c.py b/BaseTools/Source/Python/Ecc/c.py
index 93ee1990ba28..bc72abdce477 100644
--- a/BaseTools/Source/Python/Ecc/c.py
+++ b/BaseTools/Source/Python/Ecc/c.py
@@ -18,7 +18,6 @@ import string
 import CodeFragmentCollector
 import FileProfile
 from CommonDataClass import DataClass
-import Database
 from Common import EdkLogger
 from EccToolError import *
 import EccGlobalData
diff --git a/BaseTools/Source/Python/GenFds/CapsuleData.py b/BaseTools/Source/Python/GenFds/CapsuleData.py
index b376d6b2e9be..dd4c27bd15c7 100644
--- a/BaseTools/Source/Python/GenFds/CapsuleData.py
+++ b/BaseTools/Source/Python/GenFds/CapsuleData.py
@@ -15,7 +15,6 @@
 ##
 # Import Modules
 #
-import Ffs
 from GenFdsGlobalVariable import GenFdsGlobalVariable
 import StringIO
 from struct import pack
diff --git a/BaseTools/Source/Python/GenFds/CompressSection.py b/BaseTools/Source/Python/GenFds/CompressSection.py
index 4ae14f27b3e1..cdae74c52fd9 100644
--- a/BaseTools/Source/Python/GenFds/CompressSection.py
+++ b/BaseTools/Source/Python/GenFds/CompressSection.py
@@ -1,7 +1,7 @@
 ## @file
 # process compress section generation
 #
-#  Copyright (c) 2007 - 2017, Intel Corporation. All rights reserved.<BR>
+#  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
 #
 #  This program and the accompanying materials
 #  are licensed and made available under the terms and conditions of the BSD License
@@ -15,7 +15,7 @@
 ##
 # Import Modules
 #
-from Ffs import Ffs
+from Ffs import SectionSuffix
 import Section
 import subprocess
 import Common.LongFilePathOs as os
@@ -85,7 +85,7 @@ class CompressSection (CompressSectionClassObject) :
                      ModuleName + \
                      SUP_MODULE_SEC      + \
                      SecNum     + \
-                     Ffs.SectionSuffix['COMPRESS']
+                     SectionSuffix['COMPRESS']
         OutputFile = os.path.normpath(OutputFile)
         DummyFile = OutputFile + '.dummy'
         GenFdsGlobalVariable.GenerateSection(DummyFile, SectFiles, InputAlign=SectAlign, IsMakefile=IsMakefile)
diff --git a/BaseTools/Source/Python/GenFds/DataSection.py b/BaseTools/Source/Python/GenFds/DataSection.py
index 29caa00c0d8d..f0e5efab4178 100644
--- a/BaseTools/Source/Python/GenFds/DataSection.py
+++ b/BaseTools/Source/Python/GenFds/DataSection.py
@@ -18,7 +18,7 @@
 import Section
 from GenFdsGlobalVariable import GenFdsGlobalVariable
 import subprocess
-from Ffs import Ffs
+from Ffs import SectionSuffix
 import Common.LongFilePathOs as os
 from CommonDataClass.FdfClass import DataSectionClassObject
 from Common.Misc import PeImageClass
@@ -120,7 +120,7 @@ class DataSection (DataSectionClassObject):
                 )
             self.SectFileName = TeFile
 
-        OutputFile = os.path.join (OutputPath, ModuleName + SUP_MODULE_SEC + SecNum + Ffs.SectionSuffix.get(self.SecType))
+        OutputFile = os.path.join (OutputPath, ModuleName + 'SEC' + SecNum + SectionSuffix.get(self.SecType))
         OutputFile = os.path.normpath(OutputFile)
         GenFdsGlobalVariable.GenerateSection(OutputFile, [self.SectFileName], Section.Section.SectionType.get(self.SecType), IsMakefile = IsMakefile)
         FileList = [OutputFile]
diff --git a/BaseTools/Source/Python/GenFds/DepexSection.py b/BaseTools/Source/Python/GenFds/DepexSection.py
index f42162d5a27e..6e63cb97e51d 100644
--- a/BaseTools/Source/Python/GenFds/DepexSection.py
+++ b/BaseTools/Source/Python/GenFds/DepexSection.py
@@ -18,7 +18,6 @@
 import Section
 from GenFdsGlobalVariable import GenFdsGlobalVariable
 import subprocess
-from Ffs import Ffs
 import Common.LongFilePathOs as os
 from CommonDataClass.FdfClass import DepexSectionClassObject
 from AutoGen.GenDepex import DependencyExpression
diff --git a/BaseTools/Source/Python/GenFds/EfiSection.py b/BaseTools/Source/Python/GenFds/EfiSection.py
index 5405d0a8da13..0064196a5a4e 100644
--- a/BaseTools/Source/Python/GenFds/EfiSection.py
+++ b/BaseTools/Source/Python/GenFds/EfiSection.py
@@ -19,7 +19,7 @@ from struct import *
 import Section
 from GenFdsGlobalVariable import GenFdsGlobalVariable
 import subprocess
-from Ffs import Ffs
+from Ffs import SectionSuffix
 import Common.LongFilePathOs as os
 from CommonDataClass.FdfClass import EfiSectionClassObject
 from Common import EdkLogger
@@ -123,7 +123,7 @@ class EfiSection (EfiSectionClassObject):
                     BuildNumTuple = tuple()
 
                 Num = SecNum
-                OutputFile = os.path.join( OutputPath, ModuleName + SUP_MODULE_SEC + str(Num) + Ffs.SectionSuffix.get(SectionType))
+                OutputFile = os.path.join( OutputPath, ModuleName + 'SEC' + str(Num) + SectionSuffix.get(SectionType))
                 GenFdsGlobalVariable.GenerateSection(OutputFile, [], 'EFI_SECTION_VERSION',
                                                     #Ui=StringData,
                                                     Ver=BuildNum,
@@ -134,7 +134,7 @@ class EfiSection (EfiSectionClassObject):
                 for File in FileList:
                     Index = Index + 1
                     Num = '%s.%d' %(SecNum , Index)
-                    OutputFile = os.path.join(OutputPath, ModuleName + SUP_MODULE_SEC + Num + Ffs.SectionSuffix.get(SectionType))
+                    OutputFile = os.path.join(OutputPath, ModuleName + 'SEC' + Num + SectionSuffix.get(SectionType))
                     f = open(File, 'r')
                     VerString = f.read()
                     f.close()
@@ -163,7 +163,7 @@ class EfiSection (EfiSectionClassObject):
                     else:
                         EdkLogger.error("GenFds", GENFDS_ERROR, "File: %s miss Version Section value" %InfFileName)
                 Num = SecNum
-                OutputFile = os.path.join( OutputPath, ModuleName + SUP_MODULE_SEC + str(Num) + Ffs.SectionSuffix.get(SectionType))
+                OutputFile = os.path.join( OutputPath, ModuleName + 'SEC' + str(Num) + SectionSuffix.get(SectionType))
                 GenFdsGlobalVariable.GenerateSection(OutputFile, [], 'EFI_SECTION_VERSION',
                                                     #Ui=VerString,
                                                     Ver=BuildNum,
@@ -184,7 +184,7 @@ class EfiSection (EfiSectionClassObject):
                 Num = SecNum
                 if IsMakefile and StringData == ModuleNameStr:
                     StringData = "$(MODULE_NAME)"
-                OutputFile = os.path.join( OutputPath, ModuleName + SUP_MODULE_SEC + str(Num) + Ffs.SectionSuffix.get(SectionType))
+                OutputFile = os.path.join( OutputPath, ModuleName + 'SEC' + str(Num) + SectionSuffix.get(SectionType))
                 GenFdsGlobalVariable.GenerateSection(OutputFile, [], 'EFI_SECTION_USER_INTERFACE',
                                                      Ui=StringData, IsMakefile=IsMakefile)
                 OutputFileList.append(OutputFile)
@@ -193,7 +193,7 @@ class EfiSection (EfiSectionClassObject):
                 for File in FileList:
                     Index = Index + 1
                     Num = '%s.%d' %(SecNum , Index)
-                    OutputFile = os.path.join(OutputPath, ModuleName + SUP_MODULE_SEC + Num + Ffs.SectionSuffix.get(SectionType))
+                    OutputFile = os.path.join(OutputPath, ModuleName + 'SEC' + Num + SectionSuffix.get(SectionType))
                     f = open(File, 'r')
                     UiString = f.read()
                     f.close()
@@ -217,7 +217,7 @@ class EfiSection (EfiSectionClassObject):
                 Num = SecNum
                 if IsMakefile and StringData == ModuleNameStr:
                     StringData = "$(MODULE_NAME)"
-                OutputFile = os.path.join( OutputPath, ModuleName + SUP_MODULE_SEC + str(Num) + Ffs.SectionSuffix.get(SectionType))
+                OutputFile = os.path.join( OutputPath, ModuleName + 'SEC' + str(Num) + SectionSuffix.get(SectionType))
                 GenFdsGlobalVariable.GenerateSection(OutputFile, [], 'EFI_SECTION_USER_INTERFACE',
                                                      Ui=StringData, IsMakefile=IsMakefile)
                 OutputFileList.append(OutputFile)
@@ -238,7 +238,7 @@ class EfiSection (EfiSectionClassObject):
                     """ Copy Map file to FFS output path """
                     Index = Index + 1
                     Num = '%s.%d' %(SecNum , Index)
-                    OutputFile = os.path.join( OutputPath, ModuleName + SUP_MODULE_SEC + Num + Ffs.SectionSuffix.get(SectionType))
+                    OutputFile = os.path.join( OutputPath, ModuleName + 'SEC' + Num + SectionSuffix.get(SectionType))
                     File = GenFdsGlobalVariable.MacroExtend(File, Dict)
                     
                     #Get PE Section alignment when align is set to AUTO
diff --git a/BaseTools/Source/Python/GenFds/Ffs.py b/BaseTools/Source/Python/GenFds/Ffs.py
index df585f3d819b..12d4dfe1b04c 100644
--- a/BaseTools/Source/Python/GenFds/Ffs.py
+++ b/BaseTools/Source/Python/GenFds/Ffs.py
@@ -1,7 +1,7 @@
 ## @file
 # process FFS generation
 #
-#  Copyright (c) 2007-2018, Intel Corporation. All rights reserved.<BR>
+#  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
 #
 #  This program and the accompanying materials
 #  are licensed and made available under the terms and conditions of the BSD License
@@ -12,56 +12,40 @@
 #  WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 #
 
-##
-# Import Modules
-#
-from CommonDataClass.FdfClass import FDClassObject
 from Common.DataType import *
+# mapping between FILE type in FDF and file type for GenFfs
+FdfFvFileTypeToFileType = {
+    SUP_MODULE_SEC               : 'EFI_FV_FILETYPE_SECURITY_CORE',
+    SUP_MODULE_PEI_CORE          : 'EFI_FV_FILETYPE_PEI_CORE',
+    SUP_MODULE_PEIM              : 'EFI_FV_FILETYPE_PEIM',
+    SUP_MODULE_DXE_CORE          : 'EFI_FV_FILETYPE_DXE_CORE',
+    'FREEFORM'          : 'EFI_FV_FILETYPE_FREEFORM',
+    'DRIVER'            : 'EFI_FV_FILETYPE_DRIVER',
+    'APPLICATION'       : 'EFI_FV_FILETYPE_APPLICATION',
+    'FV_IMAGE'          : 'EFI_FV_FILETYPE_FIRMWARE_VOLUME_IMAGE',
+    'RAW'               : 'EFI_FV_FILETYPE_RAW',
+    'PEI_DXE_COMBO'     : 'EFI_FV_FILETYPE_COMBINED_PEIM_DRIVER',
+    'SMM'               : 'EFI_FV_FILETYPE_SMM',
+    SUP_MODULE_SMM_CORE          : 'EFI_FV_FILETYPE_SMM_CORE',
+    SUP_MODULE_MM_STANDALONE     : 'EFI_FV_FILETYPE_MM_STANDALONE',
+    SUP_MODULE_MM_CORE_STANDALONE : 'EFI_FV_FILETYPE_MM_CORE_STANDALONE'
+}
 
-## generate FFS
-#
-#
-class Ffs(FDClassObject):
-    # mapping between FILE type in FDF and file type for GenFfs
-    FdfFvFileTypeToFileType = {
-        SUP_MODULE_SEC               : 'EFI_FV_FILETYPE_SECURITY_CORE',
-        SUP_MODULE_PEI_CORE          : 'EFI_FV_FILETYPE_PEI_CORE',
-        SUP_MODULE_PEIM              : 'EFI_FV_FILETYPE_PEIM',
-        SUP_MODULE_DXE_CORE          : 'EFI_FV_FILETYPE_DXE_CORE',
-        'FREEFORM'          : 'EFI_FV_FILETYPE_FREEFORM',
-        'DRIVER'            : 'EFI_FV_FILETYPE_DRIVER',
-        'APPLICATION'       : 'EFI_FV_FILETYPE_APPLICATION',
-        'FV_IMAGE'          : 'EFI_FV_FILETYPE_FIRMWARE_VOLUME_IMAGE',
-        'RAW'               : 'EFI_FV_FILETYPE_RAW',
-        'PEI_DXE_COMBO'     : 'EFI_FV_FILETYPE_COMBINED_PEIM_DRIVER',
-        'SMM'               : 'EFI_FV_FILETYPE_SMM',
-        SUP_MODULE_SMM_CORE          : 'EFI_FV_FILETYPE_SMM_CORE',
-        SUP_MODULE_MM_STANDALONE     : 'EFI_FV_FILETYPE_MM_STANDALONE',
-        SUP_MODULE_MM_CORE_STANDALONE : 'EFI_FV_FILETYPE_MM_CORE_STANDALONE'
-    }
-    
-    # mapping between section type in FDF and file suffix
-    SectionSuffix = {
-        BINARY_FILE_TYPE_PE32                 : '.pe32',
-        BINARY_FILE_TYPE_PIC                  : '.pic',
-        BINARY_FILE_TYPE_TE                   : '.te',
-        BINARY_FILE_TYPE_DXE_DEPEX            : '.dpx',
-        'VERSION'              : '.ver',
-        BINARY_FILE_TYPE_UI                   : '.ui',
-        'COMPAT16'             : '.com16',
-        'RAW'                  : '.raw',
-        'FREEFORM_SUBTYPE_GUID': '.guid',
-        'SUBTYPE_GUID'         : '.guid',        
-        'FV_IMAGE'             : 'fv.sec',
-        'COMPRESS'             : '.com',
-        'GUIDED'               : '.guided',
-        BINARY_FILE_TYPE_PEI_DEPEX            : '.dpx',
-        BINARY_FILE_TYPE_SMM_DEPEX            : '.dpx'
-    }
-    
-    ## The constructor
-    #
-    #   @param  self        The object pointer
-    #
-    def __init__(self):
-        FfsClassObject.__init__(self)
+# mapping between section type in FDF and file suffix
+SectionSuffix = {
+    BINARY_FILE_TYPE_PE32                 : '.pe32',
+    BINARY_FILE_TYPE_PIC                  : '.pic',
+    BINARY_FILE_TYPE_TE                   : '.te',
+    BINARY_FILE_TYPE_DXE_DEPEX            : '.dpx',
+    'VERSION'              : '.ver',
+    BINARY_FILE_TYPE_UI                   : '.ui',
+    'COMPAT16'             : '.com16',
+    'RAW'                  : '.raw',
+    'FREEFORM_SUBTYPE_GUID': '.guid',
+    'SUBTYPE_GUID'         : '.guid',        
+    'FV_IMAGE'             : 'fv.sec',
+    'COMPRESS'             : '.com',
+    'GUIDED'               : '.guided',
+    BINARY_FILE_TYPE_PEI_DEPEX            : '.dpx',
+    BINARY_FILE_TYPE_SMM_DEPEX            : '.dpx'
+}
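[Editor's note, not part of the patch] Hoisting these two tables to module level removes the `Ffs.Ffs.SectionSuffix` double attribute lookup at every call site, and deletes a class that was never usefully instantiated (its `__init__` even called `FfsClassObject.__init__` although the class derived from `FDClassObject`). A sketch of the resulting lookup, with the dict abbreviated and the file-name shape mirroring the `ModuleName + 'SEC' + Num + suffix` joins in the surrounding hunks:

```python
# After the patch: a plain module-level dict, looked up directly.
SectionSuffix = {
    'VERSION':  '.ver',
    'COMPRESS': '.com',
    'GUIDED':   '.guided',
}

def section_file_name(module_name, sec_num, sec_type):
    # Mirrors the OutputFile construction used throughout GenFds.
    return module_name + 'SEC' + sec_num + SectionSuffix[sec_type]

name = section_file_name('FvbDxe', '1', 'COMPRESS')
```

Module-level constants also make the import sites explicit (`from Ffs import SectionSuffix`) instead of dragging in the whole module for one table.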
diff --git a/BaseTools/Source/Python/GenFds/FfsFileStatement.py b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
index ba8e0465ef34..871499d3d2ad 100644
--- a/BaseTools/Source/Python/GenFds/FfsFileStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
@@ -15,7 +15,7 @@
 ##
 # Import Modules
 #
-import Ffs
+from Ffs import FdfFvFileTypeToFileType
 import Rule
 import Common.LongFilePathOs as os
 import StringIO
@@ -167,7 +167,7 @@ class FileStatement (FileStatementClassObject) :
         #
         FfsFileOutput = os.path.join(OutputDir, self.NameGuid + '.ffs')
         GenFdsGlobalVariable.GenerateFfs(FfsFileOutput, SectionFiles,
-                                         Ffs.Ffs.FdfFvFileTypeToFileType.get(self.FvFileType),
+                                         FdfFvFileTypeToFileType.get(self.FvFileType),
                                          self.NameGuid,
                                          Fixed=self.Fixed,
                                          CheckSum=self.CheckSum,
diff --git a/BaseTools/Source/Python/GenFds/FfsInfStatement.py b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
index c332eee6079d..f76563d736f6 100644
--- a/BaseTools/Source/Python/GenFds/FfsInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
@@ -21,7 +21,7 @@ import Common.LongFilePathOs as os
 import StringIO
 from struct import *
 from GenFdsGlobalVariable import GenFdsGlobalVariable
-import Ffs
+from Ffs import FdfFvFileTypeToFileType,SectionSuffix
 import subprocess
 import sys
 import Section
@@ -761,7 +761,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
 
                 SecNum = '%d' %Index
                 GenSecOutputFile= self.__ExtendMacro__(Rule.NameGuid) + \
-                              Ffs.Ffs.SectionSuffix[SectionType] + SUP_MODULE_SEC + SecNum
+                              SectionSuffix[SectionType] + 'SEC' + SecNum
                 Index = Index + 1
                 OutputFile = os.path.join(self.OutputPath, GenSecOutputFile)
                 File = GenFdsGlobalVariable.MacroExtend(File, Dict, self.CurrentArch)
@@ -804,7 +804,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
         else:
             SecNum = '%d' %Index
             GenSecOutputFile= self.__ExtendMacro__(Rule.NameGuid) + \
-                              Ffs.Ffs.SectionSuffix[SectionType] + SUP_MODULE_SEC + SecNum
+                              SectionSuffix[SectionType] + 'SEC' + SecNum
             OutputFile = os.path.join(self.OutputPath, GenSecOutputFile)
             GenSecInputFile = GenFdsGlobalVariable.MacroExtend(GenSecInputFile, Dict, self.CurrentArch)
 
@@ -883,7 +883,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
             self.ModuleGuid = RegistryGuidStr
 
             GenFdsGlobalVariable.GenerateFfs(FfsOutput, InputSection,
-                                             Ffs.Ffs.FdfFvFileTypeToFileType[Rule.FvFileType],
+                                             FdfFvFileTypeToFileType[Rule.FvFileType],
                                              self.ModuleGuid, Fixed=Rule.Fixed,
                                              CheckSum=Rule.CheckSum, Align=Rule.Alignment,
                                              SectionAlign=SectionAlignments,
@@ -1056,7 +1056,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
 
         FfsOutput = os.path.join( self.OutputPath, self.ModuleGuid + '.ffs')
         GenFdsGlobalVariable.GenerateFfs(FfsOutput, InputFile,
-                                             Ffs.Ffs.FdfFvFileTypeToFileType[Rule.FvFileType],
+                                             FdfFvFileTypeToFileType[Rule.FvFileType],
                                              self.ModuleGuid, Fixed=Rule.Fixed,
                                              CheckSum=Rule.CheckSum, Align=Rule.Alignment,
                                              SectionAlign=Alignments,
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index 6714838f6fc9..29daba5a3a3e 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -20,7 +20,6 @@ import subprocess
 import StringIO
 from struct import *
 
-import Ffs
 import AprioriSection
 import FfsFileStatement
 from GenFdsGlobalVariable import GenFdsGlobalVariable
diff --git a/BaseTools/Source/Python/GenFds/FvImageSection.py b/BaseTools/Source/Python/GenFds/FvImageSection.py
index 57ecea0377bf..380fbe56f1c4 100644
--- a/BaseTools/Source/Python/GenFds/FvImageSection.py
+++ b/BaseTools/Source/Python/GenFds/FvImageSection.py
@@ -17,7 +17,7 @@
 #
 import Section
 import StringIO
-from Ffs import Ffs
+from Ffs import SectionSuffix
 import subprocess
 from GenFdsGlobalVariable import GenFdsGlobalVariable
 import Common.LongFilePathOs as os
@@ -75,7 +75,7 @@ class FvImageSection(FvImageSectionClassObject):
                 if FvAlignmentValue > MaxFvAlignment:
                     MaxFvAlignment = FvAlignmentValue
 
-                OutputFile = os.path.join(OutputPath, ModuleName + SUP_MODULE_SEC + Num + Ffs.SectionSuffix.get("FV_IMAGE"))
+                OutputFile = os.path.join(OutputPath, ModuleName + 'SEC' + Num + SectionSuffix["FV_IMAGE"])
                 GenFdsGlobalVariable.GenerateSection(OutputFile, [FvFileName], 'EFI_SECTION_FIRMWARE_VOLUME_IMAGE', IsMakefile=IsMakefile)
                 OutputFileList.append(OutputFile)
 
@@ -139,7 +139,7 @@ class FvImageSection(FvImageSectionClassObject):
             #
             # Prepare the parameter of GenSection
             #
-            OutputFile = os.path.join(OutputPath, ModuleName + SUP_MODULE_SEC + SecNum + Ffs.SectionSuffix.get("FV_IMAGE"))
+            OutputFile = os.path.join(OutputPath, ModuleName + 'SEC' + SecNum + SectionSuffix["FV_IMAGE"])
             GenFdsGlobalVariable.GenerateSection(OutputFile, [FvFileName], 'EFI_SECTION_FIRMWARE_VOLUME_IMAGE', IsMakefile=IsMakefile)
             OutputFileList.append(OutputFile)
 
diff --git a/BaseTools/Source/Python/GenFds/GuidSection.py b/BaseTools/Source/Python/GenFds/GuidSection.py
index bda185476b95..104650d16781 100644
--- a/BaseTools/Source/Python/GenFds/GuidSection.py
+++ b/BaseTools/Source/Python/GenFds/GuidSection.py
@@ -18,7 +18,7 @@
 #
 import Section
 import subprocess
-from Ffs import Ffs
+from Ffs import SectionSuffix
 import Common.LongFilePathOs as os
 from GenFdsGlobalVariable import GenFdsGlobalVariable
 from CommonDataClass.FdfClass import GuidSectionClassObject
@@ -125,7 +125,7 @@ class GuidSection(GuidSectionClassObject) :
                      ModuleName + \
                      SUP_MODULE_SEC + \
                      SecNum + \
-                     Ffs.SectionSuffix['GUIDED']
+                     SectionSuffix['GUIDED']
         OutputFile = os.path.normpath(OutputFile)
 
         ExternalTool = None
diff --git a/BaseTools/Source/Python/GenFds/OptRomInfStatement.py b/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
index a865ac4436d5..6179bfa181cb 100644
--- a/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
@@ -1,7 +1,7 @@
 ## @file
 # process OptionROM generation from INF statement
 #
-#  Copyright (c) 2007 - 2017, Intel Corporation. All rights reserved.<BR>
+#  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
 #
 #  This program and the accompanying materials
 #  are licensed and made available under the terms and conditions of the BSD License
@@ -69,11 +69,6 @@ class OptRomInfStatement (FfsInfStatement):
         if self.OverrideAttribs.PciRevision is None:
             self.OverrideAttribs.PciRevision = self.OptRomDefs.get ('PCI_REVISION')
         
-#        InfObj = GenFdsGlobalVariable.WorkSpace.BuildObject[self.PathClassObj, self.CurrentArch]  
-#        RecordList = InfObj._RawData[MODEL_META_DATA_HEADER, InfObj._Arch, InfObj._Platform]
-#        for Record in RecordList:
-#            Record = ReplaceMacros(Record, GlobalData.gEdkGlobal, False)
-#            Name = Record[0]  
     ## GenFfs() method
     #
     #   Generate FFS
diff --git a/BaseTools/Source/Python/GenFds/UiSection.py b/BaseTools/Source/Python/GenFds/UiSection.py
index 280500952b63..fe1e026f5edf 100644
--- a/BaseTools/Source/Python/GenFds/UiSection.py
+++ b/BaseTools/Source/Python/GenFds/UiSection.py
@@ -1,7 +1,7 @@
 ## @file
 # process UI section generation
 #
-#  Copyright (c) 2007 - 2017, Intel Corporation. All rights reserved.<BR>
+#  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
 #
 #  This program and the accompanying materials
 #  are licensed and made available under the terms and conditions of the BSD License
@@ -16,7 +16,7 @@
 # Import Modules
 #
 import Section
-from Ffs import Ffs
+from Ffs import SectionSuffix
 import subprocess
 import Common.LongFilePathOs as os
 from GenFdsGlobalVariable import GenFdsGlobalVariable
@@ -58,7 +58,7 @@ class UiSection (UiSectionClassObject):
             self.StringData = FfsInf.__ExtendMacro__(self.StringData)
             self.FileName = FfsInf.__ExtendMacro__(self.FileName)
 
-        OutputFile = os.path.join(OutputPath, ModuleName + SUP_MODULE_SEC + SecNum + Ffs.SectionSuffix.get(BINARY_FILE_TYPE_UI))
+        OutputFile = os.path.join(OutputPath, ModuleName + 'SEC' + SecNum + SectionSuffix['UI'])
 
         if self.StringData is not None :
             NameString = self.StringData
diff --git a/BaseTools/Source/Python/GenFds/VerSection.py b/BaseTools/Source/Python/GenFds/VerSection.py
index 456a430079bb..1bcdc8110d30 100644
--- a/BaseTools/Source/Python/GenFds/VerSection.py
+++ b/BaseTools/Source/Python/GenFds/VerSection.py
@@ -1,7 +1,7 @@
 ## @file
 # process Version section generation
 #
-#  Copyright (c) 2007 - 2017, Intel Corporation. All rights reserved.<BR>
+#  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
 #
 #  This program and the accompanying materials
 #  are licensed and made available under the terms and conditions of the BSD License
@@ -15,7 +15,7 @@
 ##
 # Import Modules
 #
-from Ffs import Ffs
+from Ffs import SectionSuffix
 import Section
 import Common.LongFilePathOs as os
 import subprocess
@@ -60,7 +60,7 @@ class VerSection (VerSectionClassObject):
             self.FileName = FfsInf.__ExtendMacro__(self.FileName)
 
         OutputFile = os.path.join(OutputPath,
-                                  ModuleName + SUP_MODULE_SEC + SecNum + Ffs.SectionSuffix.get('VERSION'))
+                                  ModuleName + 'SEC' + SecNum + SectionSuffix['VERSION'])
         OutputFile = os.path.normpath(OutputFile)
 
         # Get String Data
diff --git a/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py b/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py
index 14dcb1ae8136..7c0949da079d 100644
--- a/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py
+++ b/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py
@@ -133,15 +133,6 @@ class WorkspaceDatabase(object):
             self._CACHE_[Key] = BuildObject
             return BuildObject
 
-    # placeholder for file format conversion
-    class TransformObjectFactory:
-        def __init__(self, WorkspaceDb):
-            self.WorkspaceDb = WorkspaceDb
-
-        # key = FilePath, Arch
-        def __getitem__(self, Key):
-            pass
-
     ## Constructor of WorkspaceDatabase
     #
     # @param DbPath             Path of database file
@@ -182,7 +173,6 @@ class WorkspaceDatabase(object):
 
         # conversion object for build or file format conversion purpose
         self.BuildObject = WorkspaceDatabase.BuildObjectFactory(self)
-        self.TransformObject = WorkspaceDatabase.TransformObjectFactory(self)
 
     ## Check whether workspace database need to be renew.
     #  The renew reason maybe:
@@ -198,10 +188,12 @@ class WorkspaceDatabase(object):
     #
     def _CheckWhetherDbNeedRenew (self, force, DbPath):
         # if database does not exist, we need do nothing
-        if not os.path.exists(DbPath): return False
+        if not os.path.exists(DbPath):
+            return False
             
         # if user force to renew database, then not check whether database is out of date
-        if force: return True
+        if force:
+            return True
         
         #    
         # Check the time of last modified source file or build.exe
@@ -223,7 +215,7 @@ determine whether database file is out of date!\n")
             for root, dirs, files in os.walk (rootPath):
                 for dir in dirs:
                     # bypass source control folder 
-                    if dir.lower() in [".svn", "_svn", "cvs"]:
+                    if dir.lower() in {".svn", "_svn", "cvs", ".git"}:
                         dirs.remove(dir)
                         
                 for file in files:
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 1ef2dc5bfe70..99e4881b3ea4 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -1261,7 +1261,7 @@ class Build():
                                 (AutoGenObject.BuildTarget, AutoGenObject.ToolChain, AutoGenObject.Arch),
                             ExtraData=str(AutoGenObject))
 
-        makefile = GenMake.BuildFile(AutoGenObject)._FILE_NAME_[GenMake.gMakeType]
+        makefile = GenMake.BuildFile._FILE_NAME_[GenMake.gMakeType]
 
         # run
         if Target == 'run':
-- 
2.16.2.windows.1



^ permalink raw reply related	[flat|nested] 13+ messages in thread

* [PATCH v1 04/11] BaseTools: remove repeated calls to startswith/endswith
  2018-05-14 18:09 [PATCH v1 00/11] BaseTools refactoring Jaben Carsey
                   ` (2 preceding siblings ...)
  2018-05-14 18:09 ` [PATCH v1 03/11] BaseTools: remove unused code Jaben Carsey
@ 2018-05-14 18:09 ` Jaben Carsey
  2018-05-14 18:09 ` [PATCH v1 05/11] BaseTools: use set presence instead of series of equality Jaben Carsey
                   ` (6 subsequent siblings)
  10 siblings, 0 replies; 13+ messages in thread
From: Jaben Carsey @ 2018-05-14 18:09 UTC (permalink / raw)
  To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu

Both startswith() and endswith() accept a tuple of strings, so pass a tuple instead of calling the method multiple times in a single expression.
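As a minimal sketch of the pattern this patch applies (the token value below is hypothetical, not taken from the patch), the two spellings are equivalent because str.startswith scans each prefix in the tuple and returns True on the first match:

```python
# Hypothetical token value, chosen only to illustrate the rewrite.
token = "L'hello'"

# Before: two separate method calls in one expression
verbose = token.startswith("L'") or token.startswith("'")

# After: one call with a tuple of candidate prefixes
concise = token.startswith(("L'", "'"))

assert verbose == concise
```

The same applies to endswith, e.g. `OutputFile.endswith(('.ui', '.ver'))` in the GenMake.py hunk below.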

Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
 BaseTools/Source/Python/AutoGen/GenMake.py                      |  2 +-
 BaseTools/Source/Python/AutoGen/StrGather.py                    |  2 +-
 BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py      |  2 +-
 BaseTools/Source/Python/BPDG/GenVpd.py                          |  2 +-
 BaseTools/Source/Python/Common/Expression.py                    | 13 ++++++-------
 BaseTools/Source/Python/Common/Misc.py                          |  6 +++---
 BaseTools/Source/Python/Ecc/Check.py                            |  6 +++---
 BaseTools/Source/Python/Ecc/CodeFragmentCollector.py            |  4 ++--
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py |  2 +-
 BaseTools/Source/Python/Ecc/c.py                                | 14 +++++++-------
 BaseTools/Source/Python/Eot/CodeFragmentCollector.py            |  9 +++------
 BaseTools/Source/Python/GenFds/FdfParser.py                     | 10 ++++------
 BaseTools/Source/Python/Trim/Trim.py                            |  2 +-
 BaseTools/Source/Python/Workspace/BuildClassObject.py           |  2 +-
 BaseTools/Source/Python/Workspace/DscBuildData.py               |  2 +-
 BaseTools/Source/Python/Workspace/InfBuildData.py               |  2 +-
 BaseTools/Source/Python/build/BuildReport.py                    |  2 +-
 17 files changed, 38 insertions(+), 44 deletions(-)

diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
index 1c8ab7fe1ec8..d70c5c26ffc8 100644
--- a/BaseTools/Source/Python/AutoGen/GenMake.py
+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
@@ -741,7 +741,7 @@ cleanlib:
                             index = index + 1
                         if CmdName == 'Trim':
                             SecDepsFileList.append(os.path.join('$(DEBUG_DIR)', os.path.basename(OutputFile).replace('offset', 'efi')))
-                        if OutputFile.endswith('.ui') or OutputFile.endswith('.ver'):
+                        if OutputFile.endswith(('.ui','.ver')):
                             SecDepsFileList.append(os.path.join('$(MODULE_DIR)','$(MODULE_FILE)'))
                         self.FfsOutputFileList.append((OutputFile, ' '.join(SecDepsFileList), SecCmdStr))
                         if len(SecDepsFileList) > 0:
diff --git a/BaseTools/Source/Python/AutoGen/StrGather.py b/BaseTools/Source/Python/AutoGen/StrGather.py
index 73af1214eb0a..e5e4f25efd5d 100644
--- a/BaseTools/Source/Python/AutoGen/StrGather.py
+++ b/BaseTools/Source/Python/AutoGen/StrGather.py
@@ -290,7 +290,7 @@ def GetFilteredLanguage(UniLanguageList, LanguageFilterList):
                 if DefaultTag not in UniLanguageListFiltered:
                     # check whether language code with primary code equivalent with DefaultTag already in the list, if so, use that
                     for UniLanguage in UniLanguageList:
-                        if UniLanguage.startswith('en-') or UniLanguage.startswith('eng-'):
+                        if UniLanguage.startswith(('eng-','en-')):
                             if UniLanguage not in UniLanguageListFiltered:
                                 UniLanguageListFiltered += [UniLanguage]
                             break
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index e2b4795129ef..3b54865000bf 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -254,7 +254,7 @@ class VAR_CHECK_PCD_VALID_LIST(VAR_CHECK_PCD_VALID_OBJ):
         for valid_num in valid_num_list:
             valid_num = valid_num.strip()
 
-            if valid_num.startswith('0x') or valid_num.startswith('0X'):
+            if valid_num.startswith(('0x','0X')):
                 self.data.add(int(valid_num, 16))
             else:
                 self.data.add(int(valid_num))
diff --git a/BaseTools/Source/Python/BPDG/GenVpd.py b/BaseTools/Source/Python/BPDG/GenVpd.py
index 69a9665f5a76..4fa12b7d59de 100644
--- a/BaseTools/Source/Python/BPDG/GenVpd.py
+++ b/BaseTools/Source/Python/BPDG/GenVpd.py
@@ -172,7 +172,7 @@ class PcdEntry:
     #  @param ValueString     The Integer type string for pack.
     #       
     def _PackPtrValue(self, ValueString, Size):
-        if ValueString.startswith('L"') or ValueString.startswith("L'"):
+        if ValueString.startswith(("L'",'L"')):
             self._PackUnicode(ValueString, Size)
         elif ValueString.startswith('{') and ValueString.endswith('}'):
             self._PackByteArray(ValueString, Size)
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index 9fa07c6add16..e5d17e6b4de0 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -130,7 +130,7 @@ def IsValidCName(Str):
 def BuildOptionValue(PcdValue, GuidDict):
     if PcdValue.startswith('H'):
         InputValue = PcdValue[1:]
-    elif PcdValue.startswith("L'") or PcdValue.startswith("'"):
+    elif PcdValue.startswith(("L'", "'")):
         InputValue = PcdValue
     elif PcdValue.startswith('L'):
         InputValue = 'L"' + PcdValue[1:] + '"'
@@ -390,7 +390,7 @@ class ValueExpression(BaseExpression):
             elif not Val:
                 Val = False
                 RealVal = '""'
-            elif not Val.startswith('L"') and not Val.startswith('{') and not Val.startswith("L'"):
+            elif not Val.startswith(('L"',"L'",'{')):
                 Val = True
                 RealVal = '"' + RealVal + '"'
 
@@ -532,7 +532,7 @@ class ValueExpression(BaseExpression):
         Radix = 10
         if self._Token.lower()[0:2] == '0x' and len(self._Token) > 2:
             Radix = 16
-        if self._Token.startswith('"') or self._Token.startswith('L"'):
+        if self._Token.startswith(('"','L"')):
             Flag = 0
             for Index in range(len(self._Token)):
                 if self._Token[Index] in {'"'}:
@@ -541,7 +541,7 @@ class ValueExpression(BaseExpression):
                     Flag += 1
             if Flag == 2 and self._Token.endswith('"'):
                 return True
-        if self._Token.startswith("'") or self._Token.startswith("L'"):
+        if self._Token.startswith(("'","L'")):
             Flag = 0
             for Index in range(len(self._Token)):
                 if self._Token[Index] in {"'"}:
@@ -810,15 +810,14 @@ class ValueExpressionEx(ValueExpression):
         PcdValue = self.PcdValue
         try:
             PcdValue = ValueExpression.__call__(self, RealValue, Depth)
-            if self.PcdType == TAB_VOID and (PcdValue.startswith("'") or PcdValue.startswith("L'")):
+            if self.PcdType == TAB_VOID and PcdValue.startswith(("'","L'")):
                 PcdValue, Size = ParseFieldValue(PcdValue)
                 PcdValueList = []
                 for I in range(Size):
                     PcdValueList.append('0x%02X'%(PcdValue & 0xff))
                     PcdValue = PcdValue >> 8
                 PcdValue = '{' + ','.join(PcdValueList) + '}'
-            elif self.PcdType in TAB_PCD_NUMERIC_TYPES and (PcdValue.startswith("'") or \
-                      PcdValue.startswith('"') or PcdValue.startswith("L'") or PcdValue.startswith('L"') or PcdValue.startswith('{')):
+            elif self.PcdType in TAB_PCD_NUMERIC_TYPES and PcdValue.startswith(("'",'"',"L'",'L"','{')):
                 raise BadExpression
         except WrnExpression, Value:
             PcdValue = Value.result
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index 90350f863826..17907a318944 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -1568,8 +1568,8 @@ def AnalyzePcdData(Setting):
 def CheckPcdDatum(Type, Value):
     if Type == TAB_VOID:
         ValueRe = re.compile(r'\s*L?\".*\"\s*$')
-        if not (((Value.startswith('L"') or Value.startswith('"')) and Value.endswith('"'))
-                or (Value.startswith('{') and Value.endswith('}')) or (Value.startswith("L'") or Value.startswith("'") and Value.endswith("'"))
+        if not ((Value.startswith(('L"','"')) and Value.endswith('"'))
+                or (Value.startswith('{') and Value.endswith('}')) or (Value.startswith(("L'","'")) and Value.endswith("'"))
                ):
             return False, "Invalid value [%s] of type [%s]; must be in the form of {...} for array"\
                           ", \"...\" or \'...\' for string, L\"...\" or L\'...\' for unicode string" % (Value, Type)
@@ -2106,7 +2106,7 @@ def GetIntegerValue(Input):
     if String.endswith("LL"):
         String = String[:-2]
 
-    if String.startswith("0x") or String.startswith("0X"):
+    if String.startswith(("0x","0X")):
         return int(String, 16)
     elif String == '':
         return 0
diff --git a/BaseTools/Source/Python/Ecc/Check.py b/BaseTools/Source/Python/Ecc/Check.py
index dde7d7841082..e7bd97297538 100644
--- a/BaseTools/Source/Python/Ecc/Check.py
+++ b/BaseTools/Source/Python/Ecc/Check.py
@@ -777,7 +777,7 @@ class Check(object):
             SqlCommand = """select ID, Value1, Value2 from Dsc where Model = %s""" % MODEL_EFI_LIBRARY_CLASS
             LibraryClasses = EccGlobalData.gDb.TblDsc.Exec(SqlCommand)
             for LibraryClass in LibraryClasses:
-                if LibraryClass[1].upper() == 'NULL' or LibraryClass[1].startswith('!ifdef') or LibraryClass[1].startswith('!ifndef') or LibraryClass[1].endswith('!endif'):
+                if LibraryClass[1].upper() == 'NULL' or LibraryClass[1].startswith(('!ifdef','!ifndef')) or LibraryClass[1].endswith('!endif'):
                     continue
                 else:
                     LibraryIns = os.path.normpath(mws.join(EccGlobalData.gWorkspace, LibraryClass[2]))
@@ -1029,9 +1029,9 @@ class Check(object):
                         if not EccGlobalData.gException.IsException(ERROR_META_DATA_FILE_CHECK_PCD_TYPE, FunName):
                             if Model in [MODEL_PCD_FIXED_AT_BUILD] and not FunName.startswith('FixedPcdGet'):
                                 EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_TYPE, OtherMsg="The pcd '%s' is defined as a FixPcd but now it is called by c function [%s]" % (PcdName, FunName), BelongsToTable=TblName, BelongsToItem=Record[1])
-                            if Model in [MODEL_PCD_FEATURE_FLAG] and (not FunName.startswith('FeaturePcdGet') and not FunName.startswith('FeaturePcdSet')):
+                            if Model in [MODEL_PCD_FEATURE_FLAG] and not FunName.startswith(('FeaturePcdGet','FeaturePcdSet')):
                                 EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_TYPE, OtherMsg="The pcd '%s' is defined as a FeaturePcd but now it is called by c function [%s]" % (PcdName, FunName), BelongsToTable=TblName, BelongsToItem=Record[1])
-                            if Model in [MODEL_PCD_PATCHABLE_IN_MODULE] and (not FunName.startswith('PatchablePcdGet') and not FunName.startswith('PatchablePcdSet')):
+                            if Model in [MODEL_PCD_PATCHABLE_IN_MODULE] and not FunName.startswith(('PatchablePcdGet','PatchablePcdSet')):
                                 EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_TYPE, OtherMsg="The pcd '%s' is defined as a PatchablePcd but now it is called by c function [%s]" % (PcdName, FunName), BelongsToTable=TblName, BelongsToItem=Record[1])
 
             #ERROR_META_DATA_FILE_CHECK_PCD_TYPE
diff --git a/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py b/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py
index ffa51de7c1bf..a2d2817b73f4 100644
--- a/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py
+++ b/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py
@@ -221,7 +221,7 @@ class CodeFragmentCollector:
         
         if self.Profile.FileLinesList[Line - 1][0] != T_CHAR_HASH:
             BeforeHashPart = str(self.Profile.FileLinesList[Line - 1]).split(T_CHAR_HASH)[0]
-            if BeforeHashPart.rstrip().endswith(T_CHAR_COMMA) or BeforeHashPart.rstrip().endswith(';'):
+            if BeforeHashPart.rstrip().endswith((T_CHAR_COMMA,';')):
                 return
         
         if Line - 2 >= 0 and str(self.Profile.FileLinesList[Line - 2]).rstrip().endswith(','):
@@ -230,7 +230,7 @@ class CodeFragmentCollector:
         if Line - 2 >= 0 and str(self.Profile.FileLinesList[Line - 2]).rstrip().endswith(';'):
             return
         
-        if str(self.Profile.FileLinesList[Line]).lstrip().startswith(',') or str(self.Profile.FileLinesList[Line]).lstrip().startswith(';'):
+        if str(self.Profile.FileLinesList[Line]).lstrip().startswith((',',';')):
             return
         
         self.Profile.FileLinesList[Line - 1].insert(self.CurrentOffsetWithinLine, ',')
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
index e5c43b629151..0f9711ba109e 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
@@ -1712,7 +1712,7 @@ class DecParser(MetaFileParser):
         if len(GuidValueList) == 11:
             for GuidValue in GuidValueList:
                 GuidValue = GuidValue.strip()
-                if GuidValue.startswith('0x') or GuidValue.startswith('0X'):
+                if GuidValue.startswith(('0x','0X')):
                     HexList.append('0x' + str(GuidValue[2:]))
                     Index += 1
                     continue
diff --git a/BaseTools/Source/Python/Ecc/c.py b/BaseTools/Source/Python/Ecc/c.py
index bc72abdce477..4c49d1ca570f 100644
--- a/BaseTools/Source/Python/Ecc/c.py
+++ b/BaseTools/Source/Python/Ecc/c.py
@@ -1372,7 +1372,7 @@ def CheckFuncLayoutName(FullFileName):
                 PrintErrorMsg(ERROR_NAMING_CONVENTION_CHECK_VARIABLE_NAME, 'Parameter [%s] NOT follow naming convention.' % Param.Name, FileTable, Result[1])
             StartLine = Param.StartLine
 
-        if not Result[0].endswith('\n  )') and not Result[0].endswith('\r  )'):
+        if not Result[0].endswith(('\n  )','\r  )')):
             PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_NAME, '\')\' should be on a new line and indented two spaces', FileTable, Result[1])
 
     SqlStatement = """ select Modifier, ID, FunNameStartColumn, Name
@@ -1398,7 +1398,7 @@ def CheckFuncLayoutName(FullFileName):
             if not Pattern.match(Param.Name) and not Param.Name in ParamIgnoreList and not EccGlobalData.gException.IsException(ERROR_NAMING_CONVENTION_CHECK_VARIABLE_NAME, Param.Name):
                 PrintErrorMsg(ERROR_NAMING_CONVENTION_CHECK_VARIABLE_NAME, 'Parameter [%s] NOT follow naming convention.' % Param.Name, FileTable, Result[1])
             StartLine = Param.StartLine
-        if not Result[0].endswith('\n  )') and not Result[0].endswith('\r  )'):
+        if not Result[0].endswith(('\n  )','\r  )')):
             PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_NAME, '\')\' should be on a new line and indented two spaces', 'Function', Result[1])
 
 def CheckFuncLayoutPrototype(FullFileName):
@@ -2193,7 +2193,7 @@ def CheckHeaderFileIfndef(FullFileName):
                    """ % (FileTable, Result[1])
         ResultSet = Db.TblFile.Exec(SqlStatement)
         for Result in ResultSet:
-            if not Result[0].startswith('/*') and not Result[0].startswith('//'):
+            if not Result[0].startswith(('/*','//')):
                 PrintErrorMsg(ERROR_INCLUDE_FILE_CHECK_IFNDEF_STATEMENT_2, '', 'File', FileID)
         break
 
@@ -2203,7 +2203,7 @@ def CheckHeaderFileIfndef(FullFileName):
                    """ % (FileTable, FileTable, DataClass.MODEL_IDENTIFIER_MACRO_ENDIF)
     ResultSet = Db.TblFile.Exec(SqlStatement)
     for Result in ResultSet:
-        if not Result[0].startswith('/*') and not Result[0].startswith('//'):
+        if not Result[0].startswith(('/*','//')):
             PrintErrorMsg(ERROR_INCLUDE_FILE_CHECK_IFNDEF_STATEMENT_3, '', 'File', FileID)
     return ErrorMsgList
 
@@ -2374,7 +2374,7 @@ def CheckFileHeaderDoxygenComments(FullFileName):
                 break
             # Check whether C File header Comment content start with two spaces.
             if EccGlobalData.gConfig.HeaderCheckCFileCommentStartSpacesNum == '1' or EccGlobalData.gConfig.HeaderCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
-                if CommentLine.startswith('/** @file') == False and CommentLine.startswith('**/') == False and CommentLine.strip() and CommentLine.startswith('  ') == False:
+                if not CommentLine.startswith(('/** @file','**/','  ')) and CommentLine.strip():
                     PrintErrorMsg(ERROR_HEADER_CHECK_FILE, 'File header comment content should start with two spaces at each line', FileTable, ID)
             
             CommentLine = CommentLine.strip()
@@ -2401,7 +2401,7 @@ def CheckFileHeaderDoxygenComments(FullFileName):
                     # Check whether C File header Comment's each reference at list should begin with a bullet character.
                     if EccGlobalData.gConfig.HeaderCheckCFileCommentReferenceFormat == '1' or EccGlobalData.gConfig.HeaderCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
                         if RefListFlag == True:
-                            if RefLine.strip() and RefLine.strip().startswith('**/') == False and RefLine.startswith('  -') == False:                            
+                            if RefLine.strip() and not RefLine.strip().startswith(('**/','  -')):                            
                                 PrintErrorMsg(ERROR_HEADER_CHECK_FILE, 'Each reference on a separate line should begin with a bullet character ""-"" ', FileTable, ID)                    
     
     if NoHeaderCommentStartFlag:
@@ -2614,7 +2614,7 @@ def CheckFunctionHeaderConsistentWithDoxygenComment(FuncModifier, FuncHeader, Fu
                 ErrorMsgList.append('Line %d : VOID return type need NO doxygen tags in comment' % CommentStartLine)
                 PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER, 'VOID return type need no doxygen tags in comment ', TableName, CommentId)
         else:
-            if Index < DoxygenTagNumber and not DoxygenStrList[Index].startswith('@retval') and not DoxygenStrList[Index].startswith('@return'):
+            if Index < DoxygenTagNumber and not DoxygenStrList[Index].startswith(('@retval','@return')):
                 ErrorMsgList.append('Line %d : Number of @param doxygen tags in comment does NOT match number of function parameters' % CommentStartLine)
                 PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER, 'Number of @param doxygen tags in comment does NOT match number of function parameters ', TableName, CommentId)
     else:
diff --git a/BaseTools/Source/Python/Eot/CodeFragmentCollector.py b/BaseTools/Source/Python/Eot/CodeFragmentCollector.py
index 87f179206d84..b962ad019161 100644
--- a/BaseTools/Source/Python/Eot/CodeFragmentCollector.py
+++ b/BaseTools/Source/Python/Eot/CodeFragmentCollector.py
@@ -215,16 +215,13 @@ class CodeFragmentCollector:
 
         if self.Profile.FileLinesList[Line - 1][0] != T_CHAR_HASH:
             BeforeHashPart = str(self.Profile.FileLinesList[Line - 1]).split(T_CHAR_HASH)[0]
-            if BeforeHashPart.rstrip().endswith(T_CHAR_COMMA) or BeforeHashPart.rstrip().endswith(';'):
+            if BeforeHashPart.rstrip().endswith((T_CHAR_COMMA,';')):
                 return
 
-        if Line - 2 >= 0 and str(self.Profile.FileLinesList[Line - 2]).rstrip().endswith(','):
+        if Line - 2 >= 0 and str(self.Profile.FileLinesList[Line - 2]).rstrip().endswith((',',';')):
             return
 
-        if Line - 2 >= 0 and str(self.Profile.FileLinesList[Line - 2]).rstrip().endswith(';'):
-            return
-
-        if str(self.Profile.FileLinesList[Line]).lstrip().startswith(',') or str(self.Profile.FileLinesList[Line]).lstrip().startswith(';'):
+        if str(self.Profile.FileLinesList[Line]).lstrip().startswith((',',';')):
             return
 
         self.Profile.FileLinesList[Line - 1].insert(self.CurrentOffsetWithinLine, ',')
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index 8a9296c49d1d..2439d8ab9455 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -1250,7 +1250,7 @@ class FdfParser:
     #   @retval False       Not able to find a string data, file buffer pointer not changed
     #
     def __GetStringData(self):
-        if self.__Token.startswith("\"") or self.__Token.startswith("L\""):
+        if self.__Token.startswith(("\"","L\"")):
             self.__UndoToken()
             self.__SkipToToken("\"")
             currentLineNumber = self.CurrentLineNumber
@@ -1262,7 +1262,7 @@ class FdfParser:
             self.__Token = self.__SkippedChars.rstrip('\"')
             return True
 
-        elif self.__Token.startswith("\'") or self.__Token.startswith("L\'"):
+        elif self.__Token.startswith(("\'","L\'")):
             self.__UndoToken()
             self.__SkipToToken("\'")
             currentLineNumber = self.CurrentLineNumber
@@ -1392,8 +1392,7 @@ class FdfParser:
 
     def SectionParser(self, section):
         S = section.upper()
-        if not S.startswith("[DEFINES") and not S.startswith("[FD.") and not S.startswith("[FV.") and not S.startswith("[CAPSULE.") \
-            and not S.startswith("[VTF.") and not S.startswith("[RULE.") and not S.startswith("[OPTIONROM.") and not S.startswith('[FMPPAYLOAD.'):
+        if not S.startswith(("[DEFINES","[FD.","[FV.","[CAPSULE.","[VTF.","[RULE.","[OPTIONROM.",'[FMPPAYLOAD.')):
             raise Warning("Unknown section or section appear sequence error (The correct sequence should be [DEFINES], [FD.], [FV.], [Capsule.], [VTF.], [Rule.], [OptionRom.], [FMPPAYLOAD.])", self.FileName, self.CurrentLineNumber)
 
     ## __GetDefines() method
@@ -1457,8 +1456,7 @@ class FdfParser:
 
         S = self.__Token.upper()
         if S.startswith("[") and not S.startswith("[FD."):
-            if not S.startswith("[FV.") and not S.startswith('[FMPPAYLOAD.') and not S.startswith("[CAPSULE.") \
-                and not S.startswith("[VTF.") and not S.startswith("[RULE.") and not S.startswith("[OPTIONROM."):
+            if not S.startswith(("[FV.",'[FMPPAYLOAD.',"[CAPSULE.","[VTF.","[RULE.","[OPTIONROM.")):
                 raise Warning("Unknown section", self.FileName, self.CurrentLineNumber)
             self.__UndoToken()
             return False
diff --git a/BaseTools/Source/Python/Trim/Trim.py b/BaseTools/Source/Python/Trim/Trim.py
index a74075859148..d2e6d317676c 100644
--- a/BaseTools/Source/Python/Trim/Trim.py
+++ b/BaseTools/Source/Python/Trim/Trim.py
@@ -409,7 +409,7 @@ def TrimAslFile(Source, Target, IncludePathFile):
             LineNum = 0
             for Line in open(IncludePathFile,'r'):
                 LineNum += 1
-                if Line.startswith("/I") or Line.startswith ("-I"):
+                if Line.startswith(("/I","-I")):
                     IncludePathList.append(Line[2:].strip())
                 else:
                     EdkLogger.warn("Trim", "Invalid include line in include list file.", IncludePathFile, LineNum)
diff --git a/BaseTools/Source/Python/Workspace/BuildClassObject.py b/BaseTools/Source/Python/Workspace/BuildClassObject.py
index db9518cdff17..3e68be3ce34e 100644
--- a/BaseTools/Source/Python/Workspace/BuildClassObject.py
+++ b/BaseTools/Source/Python/Workspace/BuildClassObject.py
@@ -82,7 +82,7 @@ class PcdClassObject(object):
         if self.PcdValueFromComm:
             if self.PcdValueFromComm.startswith("{") and self.PcdValueFromComm.endswith("}"):
                 return max([len(self.PcdValueFromComm.split(",")),MaxSize])
-            elif self.PcdValueFromComm.startswith("\"") or self.PcdValueFromComm.startswith("\'"):
+            elif self.PcdValueFromComm.startswith(("\"","\'")):
                 return max([len(self.PcdValueFromComm)-2+1,MaxSize])
             elif self.PcdValueFromComm.startswith("L\""):
                 return max([2*(len(self.PcdValueFromComm)-3+1),MaxSize])
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index 8476543c5352..2de8a84b9bd7 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -1062,7 +1062,7 @@ class DscBuildData(PlatformBuildClassObject):
             except BadExpression, Value:     
                 EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s",  %s' %
                                 (TokenSpaceGuidCName, TokenCName, PcdValue, Value))
-        elif PcdValue.startswith("L'") or PcdValue.startswith("'"):
+        elif PcdValue.startswith(("L'","'")):
             if FieldName and IsFieldValueAnArray(PcdValue):
                 PcdDatumType = TAB_VOID
                 IsArray = True
diff --git a/BaseTools/Source/Python/Workspace/InfBuildData.py b/BaseTools/Source/Python/Workspace/InfBuildData.py
index bd1c84154123..12d848b5fc41 100644
--- a/BaseTools/Source/Python/Workspace/InfBuildData.py
+++ b/BaseTools/Source/Python/Workspace/InfBuildData.py
@@ -1080,7 +1080,7 @@ class InfBuildData(ModuleBuildClassObject):
                     # Check hexadecimal token value length and format.
                     #
                     ReIsValidPcdTokenValue = re.compile(r"^[0][x|X][0]*[0-9a-fA-F]{1,8}$", re.DOTALL)
-                    if Pcd.TokenValue.startswith("0x") or Pcd.TokenValue.startswith("0X"):
+                    if Pcd.TokenValue.startswith(("0x","0X")):
                         if ReIsValidPcdTokenValue.match(Pcd.TokenValue) is None:
                             EdkLogger.error(
                                     'build',
diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
index cf45ef173498..72a557dfea50 100644
--- a/BaseTools/Source/Python/build/BuildReport.py
+++ b/BaseTools/Source/Python/build/BuildReport.py
@@ -1318,7 +1318,7 @@ class PcdReport(object):
                     return value[1:-1]
                 for ch in value[1:-1].split(','):
                     ch = ch.strip()
-                    if ch.startswith('0x') or ch.startswith('0X'):
+                    if ch.startswith(('0x','0X')):
                         valuelist.append(ch)
                         continue
                     try:
-- 
2.16.2.windows.1




* [PATCH v1 05/11] BaseTools: use set presence instead of series of equality
  2018-05-14 18:09 [PATCH v1 00/11] BaseTools refactoring Jaben Carsey
                   ` (3 preceding siblings ...)
  2018-05-14 18:09 ` [PATCH v1 04/11] BaseTools: remove repeated calls to startswith/endswith Jaben Carsey
@ 2018-05-14 18:09 ` Jaben Carsey
  2018-05-14 18:09 ` [PATCH v1 06/11] BaseTools: refactor section generation Jaben Carsey
                   ` (5 subsequent siblings)
  10 siblings, 0 replies; 13+ messages in thread
From: Jaben Carsey @ 2018-05-14 18:09 UTC (permalink / raw)
  To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu

Instead of testing each equality individually, build a set once and test membership with a single hashed lookup.
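A small sketch of the rewrite (the key name below is hypothetical, mirroring the shape of the Configuration.py change rather than quoting it):

```python
# Hypothetical config key, standing in for List[0] in the patch.
key = 'SkipDirList'

# Before: a chain of separate equality tests, each evaluated in turn
old_style = (key == 'ModifierList' or key == 'SkipDirList'
             or key == 'SkipFileList' or key == 'BinaryExtList'
             or key == 'Copyright')

# After: one membership test against a set literal; sets hash their
# elements, so the lookup does not walk the whole collection
new_style = key in {'ModifierList', 'SkipDirList', 'SkipFileList',
                    'BinaryExtList', 'Copyright'}

assert old_style == new_style
```

This also collapses the five duplicated `GetSplitValueList` branches in the patch into one.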

Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
 BaseTools/Source/Python/Ecc/Configuration.py | 10 +---------
 1 file changed, 1 insertion(+), 9 deletions(-)

diff --git a/BaseTools/Source/Python/Ecc/Configuration.py b/BaseTools/Source/Python/Ecc/Configuration.py
index fee7ecb9703d..d37ce8e19e67 100644
--- a/BaseTools/Source/Python/Ecc/Configuration.py
+++ b/BaseTools/Source/Python/Ecc/Configuration.py
@@ -404,17 +404,9 @@ class Configuration(object):
                     ErrorMsg = "Invalid configuration option '%s' was found" % List[0]
                     EdkLogger.error("Ecc", EdkLogger.ECC_ERROR, ErrorMsg, File = Filepath, Line = LineNo)
                 assert _ConfigFileToInternalTranslation[List[0]] in self.__dict__
-                if List[0] == 'ModifierList':
-                    List[1] = GetSplitValueList(List[1], TAB_COMMA_SPLIT)
                 if List[0] == 'MetaDataFileCheckPathOfGenerateFileList' and List[1] == "":
                     continue
-                if List[0] == 'SkipDirList':
-                    List[1] = GetSplitValueList(List[1], TAB_COMMA_SPLIT)
-                if List[0] == 'SkipFileList':
-                    List[1] = GetSplitValueList(List[1], TAB_COMMA_SPLIT)
-                if List[0] == 'BinaryExtList':
-                    List[1] = GetSplitValueList(List[1], TAB_COMMA_SPLIT)
-                if List[0] == 'Copyright':
+                if List[0] in {'ModifierList','SkipDirList','SkipFileList','BinaryExtList','Copyright'}:
                     List[1] = GetSplitValueList(List[1], TAB_COMMA_SPLIT)
                 self.__dict__[_ConfigFileToInternalTranslation[List[0]]] = List[1]
 
-- 
2.16.2.windows.1



^ permalink raw reply related	[flat|nested] 13+ messages in thread

* [PATCH v1 06/11] BaseTools: refactor section generation
  2018-05-14 18:09 [PATCH v1 00/11] BaseTools refactoring Jaben Carsey
                   ` (4 preceding siblings ...)
  2018-05-14 18:09 ` [PATCH v1 05/11] BaseTools: use set presence instead of series of equality Jaben Carsey
@ 2018-05-14 18:09 ` Jaben Carsey
  2018-05-14 18:09 ` [PATCH v1 07/11] BaseTools: refactor file opening/writing Jaben Carsey
                   ` (4 subsequent siblings)
  10 siblings, 0 replies; 13+ messages in thread
From: Jaben Carsey @ 2018-05-14 18:09 UTC (permalink / raw)
  To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu

Use 'with' statements for opening files.
Remove unneeded variables.
Don't seek to offset 0 in a just-opened file; it is already positioned there.
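
The file-handling change, reduced to a minimal standalone sketch (illustrative paths and variable names, not the actual GenFds code):

```python
import os
import tempfile

# Set up a small file to read, standing in for a version-string file.
path = os.path.join(tempfile.mkdtemp(), 'Version.txt')
with open(path, 'w') as f:
    f.write('1.0')

# Before: manual open/close, plus a redundant seek(0) on a file
# that is already positioned at offset 0 when first opened.
f = open(path, 'r')
f.seek(0)          # unnecessary: a newly opened file starts at offset 0
ver_before = f.read()
f.close()          # skipped entirely if read() raises

# After: 'with' guarantees the file is closed even when read() raises.
with open(path, 'r') as f:
    ver_after = f.read()
```

Both forms read the same data; the `with` form is shorter and cannot leak the file handle on an exception path.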

Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
 BaseTools/Source/Python/Common/DataType.py           |   1 +
 BaseTools/Source/Python/GenFds/CompressSection.py    |  18 +-
 BaseTools/Source/Python/GenFds/DataSection.py        |  28 +--
 BaseTools/Source/Python/GenFds/DepexSection.py       |   2 +-
 BaseTools/Source/Python/GenFds/EfiSection.py         | 109 +++++----
 BaseTools/Source/Python/GenFds/FfsInfStatement.py    |  10 +-
 BaseTools/Source/Python/GenFds/FvImageSection.py     |  28 +--
 BaseTools/Source/Python/GenFds/GuidSection.py        |   8 +-
 BaseTools/Source/Python/GenFds/OptRomInfStatement.py |   4 +-
 BaseTools/Source/Python/GenFds/Section.py            | 231 ++++++++------------
 BaseTools/Source/Python/GenFds/UiSection.py          |  18 +-
 BaseTools/Source/Python/GenFds/VerSection.py         |  22 +-
 12 files changed, 206 insertions(+), 273 deletions(-)

diff --git a/BaseTools/Source/Python/Common/DataType.py b/BaseTools/Source/Python/Common/DataType.py
index a72c7e6f067f..93136dff0db2 100644
--- a/BaseTools/Source/Python/Common/DataType.py
+++ b/BaseTools/Source/Python/Common/DataType.py
@@ -50,6 +50,7 @@ TAB_EDK_SOURCE = '$(EDK_SOURCE)'
 TAB_EFI_SOURCE = '$(EFI_SOURCE)'
 TAB_WORKSPACE = '$(WORKSPACE)'
 TAB_FV_DIRECTORY = 'FV'
+TAB_SEC_DIRECTORY = 'SEC'
 
 TAB_ARCH_NULL = ''
 TAB_ARCH_COMMON = 'COMMON'
diff --git a/BaseTools/Source/Python/GenFds/CompressSection.py b/BaseTools/Source/Python/GenFds/CompressSection.py
index cdae74c52fd9..b7aa72e43992 100644
--- a/BaseTools/Source/Python/GenFds/CompressSection.py
+++ b/BaseTools/Source/Python/GenFds/CompressSection.py
@@ -56,7 +56,7 @@ class CompressSection (CompressSectionClassObject) :
     #
     def GenSection(self, OutputPath, ModuleName, SecNum, KeyStringList, FfsInf = None, Dict = {}, IsMakefile = False):
 
-        if FfsInf is not None:
+        if FfsInf:
             self.CompType = FfsInf.__ExtendMacro__(self.CompType)
             self.Alignment = FfsInf.__ExtendMacro__(self.Alignment)
 
@@ -68,13 +68,13 @@ class CompressSection (CompressSectionClassObject) :
             Index = Index + 1
             SecIndex = '%s.%d' %(SecNum, Index)
             ReturnSectList, AlignValue = Sect.GenSection(OutputPath, ModuleName, SecIndex, KeyStringList, FfsInf, Dict, IsMakefile=IsMakefile)
-            if AlignValue is not None:
-                if MaxAlign is None:
+            if AlignValue:
+                if not MaxAlign:
                     MaxAlign = AlignValue
                 if GenFdsGlobalVariable.GetAlignment (AlignValue) > GenFdsGlobalVariable.GetAlignment (MaxAlign):
                     MaxAlign = AlignValue
             if ReturnSectList != []:
-                if AlignValue is None:
+                if not AlignValue:
                     AlignValue = "1"
                 for FileData in ReturnSectList:
                     SectFiles += (FileData,)
@@ -83,17 +83,13 @@ class CompressSection (CompressSectionClassObject) :
         OutputFile = OutputPath + \
                      os.sep     + \
                      ModuleName + \
-                     SUP_MODULE_SEC      + \
+                     SUP_MODULE_SEC + \
                      SecNum     + \
                      SectionSuffix['COMPRESS']
         OutputFile = os.path.normpath(OutputFile)
         DummyFile = OutputFile + '.dummy'
         GenFdsGlobalVariable.GenerateSection(DummyFile, SectFiles, InputAlign=SectAlign, IsMakefile=IsMakefile)
 
-        GenFdsGlobalVariable.GenerateSection(OutputFile, [DummyFile], Section.Section.SectionType['COMPRESS'],
+        GenFdsGlobalVariable.GenerateSection(OutputFile, [DummyFile], Section.SectionType['COMPRESS'],
                                              CompressionType=self.CompTypeDict[self.CompType], IsMakefile=IsMakefile)
-        OutputFileList = []
-        OutputFileList.append(OutputFile)
-        return OutputFileList, self.Alignment
-
-
+        return [OutputFile], self.Alignment
diff --git a/BaseTools/Source/Python/GenFds/DataSection.py b/BaseTools/Source/Python/GenFds/DataSection.py
index f0e5efab4178..71c2796b0b39 100644
--- a/BaseTools/Source/Python/GenFds/DataSection.py
+++ b/BaseTools/Source/Python/GenFds/DataSection.py
@@ -50,39 +50,39 @@ class DataSection (DataSectionClassObject):
     #   @retval tuple       (Generated file name list, section alignment)
     #
     def GenSection(self, OutputPath, ModuleName, SecNum, keyStringList, FfsFile = None, Dict = {}, IsMakefile = False):
+
+        self.SectFileName = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.SectFileName)
         #
         # Prepare the parameter of GenSection
         #
-        if FfsFile is not None:
-            self.SectFileName = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.SectFileName)
+        if FfsFile:
             self.SectFileName = GenFdsGlobalVariable.MacroExtend(self.SectFileName, Dict, FfsFile.CurrentArch)
         else:
-            self.SectFileName = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.SectFileName)
             self.SectFileName = GenFdsGlobalVariable.MacroExtend(self.SectFileName, Dict)
 
-        """Check Section file exist or not !"""
-
+        #
+        # Check Section file exist or not
+        #
         if not os.path.exists(self.SectFileName):
             self.SectFileName = os.path.join (GenFdsGlobalVariable.WorkSpaceDir,
                                               self.SectFileName)
 
-        """Copy Map file to Ffs output"""
+        #
+        # Copy Map file to Ffs output
+        #
         Filename = GenFdsGlobalVariable.MacroExtend(self.SectFileName)
-        if Filename[(len(Filename)-4):] == '.efi':
+        if Filename.endswith('.efi'):
             MapFile = Filename.replace('.efi', '.map')
             CopyMapFile = os.path.join(OutputPath, ModuleName + '.map')
             if IsMakefile:
-                if GenFdsGlobalVariable.CopyList == []:
-                    GenFdsGlobalVariable.CopyList = [(MapFile, CopyMapFile)]
-                else:
-                    GenFdsGlobalVariable.CopyList.append((MapFile, CopyMapFile))
+                GenFdsGlobalVariable.CopyList.append((MapFile, CopyMapFile))
             else:
                 if os.path.exists(MapFile):
                     if not os.path.exists(CopyMapFile) or (os.path.getmtime(MapFile) > os.path.getmtime(CopyMapFile)):
                         CopyLongFilePath(MapFile, CopyMapFile)
 
         #Get PE Section alignment when align is set to AUTO
-        if self.Alignment == 'Auto' and self.SecType in (BINARY_FILE_TYPE_TE, BINARY_FILE_TYPE_PE32):
+        if self.Alignment == 'Auto' and self.SecType in {BINARY_FILE_TYPE_TE, BINARY_FILE_TYPE_PE32}:
             ImageObj = PeImageClass (Filename)
             if ImageObj.SectionAlignment < 0x400:
                 self.Alignment = str (ImageObj.SectionAlignment)
@@ -120,8 +120,8 @@ class DataSection (DataSectionClassObject):
                 )
             self.SectFileName = TeFile
 
-        OutputFile = os.path.join (OutputPath, ModuleName + 'SEC' + SecNum + SectionSuffix.get(self.SecType))
+        OutputFile = os.path.join (OutputPath, ModuleName + TAB_SEC_DIRECTORY + SecNum + SectionSuffix.get(self.SecType))
         OutputFile = os.path.normpath(OutputFile)
-        GenFdsGlobalVariable.GenerateSection(OutputFile, [self.SectFileName], Section.Section.SectionType.get(self.SecType), IsMakefile = IsMakefile)
+        GenFdsGlobalVariable.GenerateSection(OutputFile, [self.SectFileName], Section.SectionType.get(self.SecType), IsMakefile = IsMakefile)
         FileList = [OutputFile]
         return FileList, self.Alignment
diff --git a/BaseTools/Source/Python/GenFds/DepexSection.py b/BaseTools/Source/Python/GenFds/DepexSection.py
index 6e63cb97e51d..4392b9c62409 100644
--- a/BaseTools/Source/Python/GenFds/DepexSection.py
+++ b/BaseTools/Source/Python/GenFds/DepexSection.py
@@ -114,5 +114,5 @@ class DepexSection (DepexSectionClassObject):
         OutputFile = os.path.join (OutputPath, ModuleName + SUP_MODULE_SEC + SecNum + '.dpx')
         OutputFile = os.path.normpath(OutputFile)
 
-        GenFdsGlobalVariable.GenerateSection(OutputFile, [InputFile], Section.Section.SectionType.get (SecType), IsMakefile=IsMakefile)
+        GenFdsGlobalVariable.GenerateSection(OutputFile, [InputFile], Section.SectionType.get (SecType), IsMakefile=IsMakefile)
         return [OutputFile], self.Alignment
diff --git a/BaseTools/Source/Python/GenFds/EfiSection.py b/BaseTools/Source/Python/GenFds/EfiSection.py
index 0064196a5a4e..5e8379548d27 100644
--- a/BaseTools/Source/Python/GenFds/EfiSection.py
+++ b/BaseTools/Source/Python/GenFds/EfiSection.py
@@ -58,8 +58,11 @@ class EfiSection (EfiSectionClassObject):
         
         if self.FileName is not None and self.FileName.startswith('PCD('):
             self.FileName = GenFdsGlobalVariable.GetPcdValue(self.FileName)
-        """Prepare the parameter of GenSection"""
-        if FfsInf is not None :
+        
+        #
+        # Prepare the parameter of GenSection
+        #
+        if FfsInf:
             InfFileName = FfsInf.InfFileName
             SectionType = FfsInf.__ExtendMacro__(self.SectionType)
             Filename = FfsInf.__ExtendMacro__(self.FileName)
@@ -67,21 +70,23 @@ class EfiSection (EfiSectionClassObject):
             StringData = FfsInf.__ExtendMacro__(self.StringData)
             ModuleNameStr = FfsInf.__ExtendMacro__('$(MODULE_NAME)')
             NoStrip = True
-            if FfsInf.ModuleType in (SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM) and SectionType in (BINARY_FILE_TYPE_TE, BINARY_FILE_TYPE_PE32):
-                if FfsInf.KeepReloc is not None:
+            if FfsInf.ModuleType in {SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM} and SectionType in {BINARY_FILE_TYPE_TE, BINARY_FILE_TYPE_PE32}:
+                if FfsInf.KeepReloc:
                     NoStrip = FfsInf.KeepReloc
-                elif FfsInf.KeepRelocFromRule is not None:
+                elif FfsInf.KeepRelocFromRule:
                     NoStrip = FfsInf.KeepRelocFromRule
-                elif self.KeepReloc is not None:
+                elif self.KeepReloc:
                     NoStrip = self.KeepReloc
-                elif FfsInf.ShadowFromInfFile is not None:
+                elif FfsInf.ShadowFromInfFile:
                     NoStrip = FfsInf.ShadowFromInfFile
         else:
             EdkLogger.error("GenFds", GENFDS_ERROR, "Module %s apply rule for None!" %ModuleName)
 
-        """If the file name was pointed out, add it in FileList"""
+        #
+        # If the file name was pointed out, add it in FileList
+        #
         FileList = []
-        if Filename is not None:
+        if Filename:
             Filename = GenFdsGlobalVariable.MacroExtend(Filename, Dict)
             # check if the path is absolute or relative
             if os.path.isabs(Filename):
@@ -98,34 +103,26 @@ class EfiSection (EfiSectionClassObject):
                 if '.depex' in SuffixMap:
                     FileList.append(Filename)
         else:
-            FileList, IsSect = Section.Section.GetFileList(FfsInf, self.FileType, self.FileExtension, Dict, IsMakefile=IsMakefile)
+            FileList, IsSect = Section.GetSectionFileList(FfsInf, self.FileType, self.FileExtension, Dict, IsMakefile=IsMakefile)
             if IsSect :
                 return FileList, self.Alignment
 
         Index = 0
         Align = self.Alignment
 
-        """ If Section type is 'VERSION'"""
+        #
+        # If Section type is 'VERSION'
+        #
         OutputFileList = []
         if SectionType == 'VERSION':
-
             InfOverrideVerString = False
-            if FfsInf.Version is not None:
-                #StringData = FfsInf.Version
+            if FfsInf.Version:
                 BuildNum = FfsInf.Version
                 InfOverrideVerString = True
 
             if InfOverrideVerString:
-                #VerTuple = ('-n', '"' + StringData + '"')
-                if BuildNum is not None and BuildNum != '':
-                    BuildNumTuple = ('-j', BuildNum)
-                else:
-                    BuildNumTuple = tuple()
-
-                Num = SecNum
-                OutputFile = os.path.join( OutputPath, ModuleName + 'SEC' + str(Num) + SectionSuffix.get(SectionType))
+                OutputFile = os.path.join( OutputPath, ModuleName + TAB_SEC_DIRECTORY + str(SecNum) + SectionSuffix.get(SectionType))
                 GenFdsGlobalVariable.GenerateSection(OutputFile, [], 'EFI_SECTION_VERSION',
-                                                    #Ui=StringData,
                                                     Ver=BuildNum,
                                                     IsMakefile=IsMakefile)
                 OutputFileList.append(OutputFile)
@@ -134,38 +131,31 @@ class EfiSection (EfiSectionClassObject):
                 for File in FileList:
                     Index = Index + 1
                     Num = '%s.%d' %(SecNum , Index)
-                    OutputFile = os.path.join(OutputPath, ModuleName + 'SEC' + Num + SectionSuffix.get(SectionType))
-                    f = open(File, 'r')
-                    VerString = f.read()
-                    f.close()
+                    OutputFile = os.path.join(OutputPath, ModuleName + TAB_SEC_DIRECTORY + Num + SectionSuffix.get(SectionType))
+                    with open(File, 'r') as f:
+                        VerString = f.read()
                     BuildNum = VerString
-                    if BuildNum is not None and BuildNum != '':
-                        BuildNumTuple = ('-j', BuildNum)
                     GenFdsGlobalVariable.GenerateSection(OutputFile, [], 'EFI_SECTION_VERSION',
-                                                        #Ui=VerString,
                                                         Ver=BuildNum,
                                                         IsMakefile=IsMakefile)
                     OutputFileList.append(OutputFile)
 
             else:
                 BuildNum = StringData
-                if BuildNum is not None and BuildNum != '':
+                if BuildNum:
                     BuildNumTuple = ('-j', BuildNum)
                 else:
                     BuildNumTuple = tuple()
                 BuildNumString = ' ' + ' '.join(BuildNumTuple)
 
-                #if VerString == '' and 
                 if BuildNumString == '':
                     if self.Optional == True :
                         GenFdsGlobalVariable.VerboseLogger( "Optional Section don't exist!")
                         return [], None
                     else:
                         EdkLogger.error("GenFds", GENFDS_ERROR, "File: %s miss Version Section value" %InfFileName)
-                Num = SecNum
-                OutputFile = os.path.join( OutputPath, ModuleName + 'SEC' + str(Num) + SectionSuffix.get(SectionType))
+                OutputFile = os.path.join( OutputPath, ModuleName + TAB_SEC_DIRECTORY + str(SecNum) + SectionSuffix.get(SectionType))
                 GenFdsGlobalVariable.GenerateSection(OutputFile, [], 'EFI_SECTION_VERSION',
-                                                    #Ui=VerString,
                                                     Ver=BuildNum,
                                                     IsMakefile=IsMakefile)
                 OutputFileList.append(OutputFile)
@@ -176,15 +166,14 @@ class EfiSection (EfiSectionClassObject):
         elif SectionType == BINARY_FILE_TYPE_UI:
 
             InfOverrideUiString = False
-            if FfsInf.Ui is not None:
+            if FfsInf.Ui:
                 StringData = FfsInf.Ui
                 InfOverrideUiString = True
 
             if InfOverrideUiString:
-                Num = SecNum
                 if IsMakefile and StringData == ModuleNameStr:
                     StringData = "$(MODULE_NAME)"
-                OutputFile = os.path.join( OutputPath, ModuleName + 'SEC' + str(Num) + SectionSuffix.get(SectionType))
+                OutputFile = os.path.join( OutputPath, ModuleName + TAB_SEC_DIRECTORY + str(SecNum) + SectionSuffix.get(SectionType))
                 GenFdsGlobalVariable.GenerateSection(OutputFile, [], 'EFI_SECTION_USER_INTERFACE',
                                                      Ui=StringData, IsMakefile=IsMakefile)
                 OutputFileList.append(OutputFile)
@@ -193,38 +182,34 @@ class EfiSection (EfiSectionClassObject):
                 for File in FileList:
                     Index = Index + 1
                     Num = '%s.%d' %(SecNum , Index)
-                    OutputFile = os.path.join(OutputPath, ModuleName + 'SEC' + Num + SectionSuffix.get(SectionType))
-                    f = open(File, 'r')
-                    UiString = f.read()
-                    f.close()
+                    OutputFile = os.path.join(OutputPath, ModuleName + TAB_SEC_DIRECTORY + Num + SectionSuffix.get(SectionType))
+                    with open(File, 'r') as f:
+                        UiString = f.read()
                     if IsMakefile and UiString == ModuleNameStr:
                         UiString = "$(MODULE_NAME)"
                     GenFdsGlobalVariable.GenerateSection(OutputFile, [], 'EFI_SECTION_USER_INTERFACE',
                                                         Ui=UiString, IsMakefile=IsMakefile)
                     OutputFileList.append(OutputFile)
             else:
-                if StringData is not None and len(StringData) > 0:
-                    UiTuple = ('-n', '"' + StringData + '"')
-                else:
-                    UiTuple = tuple()
-
+                if not StringData:
                     if self.Optional == True :
                         GenFdsGlobalVariable.VerboseLogger( "Optional Section don't exist!")
                         return '', None
                     else:
                         EdkLogger.error("GenFds", GENFDS_ERROR, "File: %s miss UI Section value" %InfFileName)
 
-                Num = SecNum
                 if IsMakefile and StringData == ModuleNameStr:
                     StringData = "$(MODULE_NAME)"
-                OutputFile = os.path.join( OutputPath, ModuleName + 'SEC' + str(Num) + SectionSuffix.get(SectionType))
+                OutputFile = os.path.join( OutputPath, ModuleName + TAB_SEC_DIRECTORY + str(SecNum) + SectionSuffix.get(SectionType))
                 GenFdsGlobalVariable.GenerateSection(OutputFile, [], 'EFI_SECTION_USER_INTERFACE',
                                                      Ui=StringData, IsMakefile=IsMakefile)
                 OutputFileList.append(OutputFile)
 
 
         else:
-            """If File List is empty"""
+            #
+            # If File List is empty
+            #
             if FileList == [] :
                 if self.Optional == True:
                     GenFdsGlobalVariable.VerboseLogger("Optional Section don't exist!")
@@ -233,16 +218,20 @@ class EfiSection (EfiSectionClassObject):
                     EdkLogger.error("GenFds", GENFDS_ERROR, "Output file for %s section could not be found for %s" % (SectionType, InfFileName))
 
             else:
-                """Convert the File to Section file one by one """
+                #
+                # Convert the File to Section file one by one
+                #
                 for File in FileList:
-                    """ Copy Map file to FFS output path """
+                    #
+                    # Copy Map file to FFS output path
+                    #
                     Index = Index + 1
                     Num = '%s.%d' %(SecNum , Index)
-                    OutputFile = os.path.join( OutputPath, ModuleName + 'SEC' + Num + SectionSuffix.get(SectionType))
+                    OutputFile = os.path.join( OutputPath, ModuleName + TAB_SEC_DIRECTORY + Num + SectionSuffix.get(SectionType))
                     File = GenFdsGlobalVariable.MacroExtend(File, Dict)
                     
                     #Get PE Section alignment when align is set to AUTO
-                    if self.Alignment == 'Auto' and (SectionType == BINARY_FILE_TYPE_PE32 or SectionType == BINARY_FILE_TYPE_TE):
+                    if self.Alignment == 'Auto' and SectionType in {BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_TE}:
                         ImageObj = PeImageClass (File)
                         if ImageObj.SectionAlignment < 0x400:
                             Align = str (ImageObj.SectionAlignment)
@@ -251,7 +240,7 @@ class EfiSection (EfiSectionClassObject):
                         else:
                             Align = str (ImageObj.SectionAlignment / 0x100000) + 'M'
 
-                    if File[(len(File)-4):] == '.efi':
+                    if File.endswith('.efi'):
                         MapFile = File.replace('.efi', '.map')
                         CopyMapFile = os.path.join(OutputPath, ModuleName + '.map')
                         if IsMakefile:
@@ -285,7 +274,9 @@ class EfiSection (EfiSectionClassObject):
                             )
                         File = StrippedFile
                     
-                    """For TE Section call GenFw to generate TE image"""
+                    #
+                    # For TE Section call GenFw to generate TE image
+                    #
 
                     if SectionType == BINARY_FILE_TYPE_TE:
                         TeFile = os.path.join( OutputPath, ModuleName + 'Te.raw')
@@ -297,10 +288,12 @@ class EfiSection (EfiSectionClassObject):
                             )
                         File = TeFile
 
-                    """Call GenSection"""
+                    #
+                    # Call GenSection
+                    #
                     GenFdsGlobalVariable.GenerateSection(OutputFile,
                                                         [File],
-                                                        Section.Section.SectionType.get (SectionType),
+                                                        Section.SectionType.get (SectionType),
                                                         IsMakefile=IsMakefile
                                                         )
                     OutputFileList.append(OutputFile)
diff --git a/BaseTools/Source/Python/GenFds/FfsInfStatement.py b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
index f76563d736f6..39426b939b4a 100644
--- a/BaseTools/Source/Python/GenFds/FfsInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
@@ -731,7 +731,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
             else:
                 GenSecInputFile = os.path.normpath(os.path.join(self.EfiOutputPath, GenSecInputFile))
         else:
-            FileList, IsSect = Section.Section.GetFileList(self, '', Rule.FileExtension)
+            FileList, IsSect = Section.GetSectionFileList(self, '', Rule.FileExtension)
 
         Index = 1
         SectionType = Rule.SectionType
@@ -761,7 +761,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
 
                 SecNum = '%d' %Index
                 GenSecOutputFile= self.__ExtendMacro__(Rule.NameGuid) + \
-                              SectionSuffix[SectionType] + 'SEC' + SecNum
+                              SectionSuffix[SectionType] + TAB_SEC_DIRECTORY + SecNum
                 Index = Index + 1
                 OutputFile = os.path.join(self.OutputPath, GenSecOutputFile)
                 File = GenFdsGlobalVariable.MacroExtend(File, Dict, self.CurrentArch)
@@ -799,12 +799,12 @@ class FfsInfStatement(FfsInfStatementClassObject):
                             IsMakefile=IsMakefile
                         )
                     File = TeFile
-                GenFdsGlobalVariable.GenerateSection(OutputFile, [File], Section.Section.SectionType[SectionType], IsMakefile=IsMakefile)
+                GenFdsGlobalVariable.GenerateSection(OutputFile, [File], Section.SectionType[SectionType], IsMakefile=IsMakefile)
                 OutputFileList.append(OutputFile)
         else:
             SecNum = '%d' %Index
             GenSecOutputFile= self.__ExtendMacro__(Rule.NameGuid) + \
-                              SectionSuffix[SectionType] + 'SEC' + SecNum
+                              SectionSuffix[SectionType] + TAB_SEC_DIRECTORY + SecNum
             OutputFile = os.path.join(self.OutputPath, GenSecOutputFile)
             GenSecInputFile = GenFdsGlobalVariable.MacroExtend(GenSecInputFile, Dict, self.CurrentArch)
 
@@ -842,7 +842,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
                         IsMakefile=IsMakefile
                     )
                 GenSecInputFile = TeFile
-            GenFdsGlobalVariable.GenerateSection(OutputFile, [GenSecInputFile], Section.Section.SectionType[SectionType], IsMakefile=IsMakefile)
+            GenFdsGlobalVariable.GenerateSection(OutputFile, [GenSecInputFile], Section.SectionType[SectionType], IsMakefile=IsMakefile)
             OutputFileList.append(OutputFile)
 
         return OutputFileList
diff --git a/BaseTools/Source/Python/GenFds/FvImageSection.py b/BaseTools/Source/Python/GenFds/FvImageSection.py
index 380fbe56f1c4..dc5dcb7f8e0d 100644
--- a/BaseTools/Source/Python/GenFds/FvImageSection.py
+++ b/BaseTools/Source/Python/GenFds/FvImageSection.py
@@ -54,28 +54,24 @@ class FvImageSection(FvImageSectionClassObject):
     def GenSection(self, OutputPath, ModuleName, SecNum, KeyStringList, FfsInf = None, Dict = {}, IsMakefile = False):
 
         OutputFileList = []
-        if self.FvFileType is not None:
-            FileList, IsSect = Section.Section.GetFileList(FfsInf, self.FvFileType, self.FvFileExtension)
+        if self.FvFileType:
+            FileList, IsSect = Section.GetSectionFileList(FfsInf, self.FvFileType, self.FvFileExtension)
             if IsSect :
                 return FileList, self.Alignment
 
-            Num = SecNum
-
             MaxFvAlignment = 0
             for FvFileName in FileList:
                 FvAlignmentValue = 0
                 if os.path.isfile(FvFileName):
-                    FvFileObj = open (FvFileName,'rb')
-                    FvFileObj.seek(0)
-                    # PI FvHeader is 0x48 byte
-                    FvHeaderBuffer = FvFileObj.read(0x48)
-                    # FV alignment position.
-                    FvAlignmentValue = 1 << (ord (FvHeaderBuffer[0x2E]) & 0x1F)
-                    FvFileObj.close()
+                    with open(FvFileName, 'rb') as FvFileObj:
+                        # PI FvHeader is 0x48 byte
+                        FvHeaderBuffer = FvFileObj.read(0x48)
+                        # FV alignment position.
+                        FvAlignmentValue = 1 << (ord(FvHeaderBuffer[0x2E]) & 0x1F)
                 if FvAlignmentValue > MaxFvAlignment:
                     MaxFvAlignment = FvAlignmentValue
 
-                OutputFile = os.path.join(OutputPath, ModuleName + 'SEC' + Num + SectionSuffix["FV_IMAGE"])
+                OutputFile = os.path.join(OutputPath, ModuleName + TAB_SEC_DIRECTORY + SecNum + SectionSuffix["FV_IMAGE"])
                 GenFdsGlobalVariable.GenerateSection(OutputFile, [FvFileName], 'EFI_SECTION_FIRMWARE_VOLUME_IMAGE', IsMakefile=IsMakefile)
                 OutputFileList.append(OutputFile)
 
@@ -97,24 +93,23 @@ class FvImageSection(FvImageSectionClassObject):
         #
         # Generate Fv
         #
-        if self.FvName is not None:
+        if self.FvName:
             Buffer = StringIO.StringIO('')
             Fv = GenFdsGlobalVariable.FdfParser.Profile.FvDict.get(self.FvName)
-            if Fv is not None:
+            if Fv:
                 self.Fv = Fv
                 FvFileName = Fv.AddToBuffer(Buffer, self.FvAddr, MacroDict = Dict, Flag=IsMakefile)
-                if Fv.FvAlignment is not None:
-                    if self.Alignment is None:
+                if Fv.FvAlignment:
+                    if not self.Alignment:
                         self.Alignment = Fv.FvAlignment
                     else:
                         if GenFdsGlobalVariable.GetAlignment (Fv.FvAlignment) > GenFdsGlobalVariable.GetAlignment (self.Alignment):
                             self.Alignment = Fv.FvAlignment
             else:
-                if self.FvFileName is not None:
+                if self.FvFileName:
                     FvFileName = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.FvFileName)
                     if os.path.isfile(FvFileName):
-                        FvFileObj = open (FvFileName,'rb')
-                        FvFileObj.seek(0)
-                        # PI FvHeader is 0x48 byte
-                        FvHeaderBuffer = FvFileObj.read(0x48)
-                        # FV alignment position.
+                        with open(FvFileName, 'rb') as FvFileObj:
+                            # PI FvHeader is 0x48 byte
+                            FvHeaderBuffer = FvFileObj.read(0x48)
+                            # FV alignment position.
@@ -132,14 +127,13 @@ class FvImageSection(FvImageSectionClassObject):
                         else:
                             # FvAlignmentValue is less than 1K
                             self.Alignment = str (FvAlignmentValue)
-                        FvFileObj.close()
                 else:
                     EdkLogger.error("GenFds", GENFDS_ERROR, "FvImageSection Failed! %s NOT found in FDF" % self.FvName)
 
             #
             # Prepare the parameter of GenSection
             #
-            OutputFile = os.path.join(OutputPath, ModuleName + 'SEC' + SecNum + SectionSuffix["FV_IMAGE"])
+            OutputFile = os.path.join(OutputPath, ModuleName + TAB_SEC_DIRECTORY + SecNum + SectionSuffix["FV_IMAGE"])
             GenFdsGlobalVariable.GenerateSection(OutputFile, [FvFileName], 'EFI_SECTION_FIRMWARE_VOLUME_IMAGE', IsMakefile=IsMakefile)
             OutputFileList.append(OutputFile)
 
diff --git a/BaseTools/Source/Python/GenFds/GuidSection.py b/BaseTools/Source/Python/GenFds/GuidSection.py
index 104650d16781..bc95c7cd9d42 100644
--- a/BaseTools/Source/Python/GenFds/GuidSection.py
+++ b/BaseTools/Source/Python/GenFds/GuidSection.py
@@ -139,7 +139,7 @@ class GuidSection(GuidSectionClassObject) :
         #
         if self.NameGuid is None :
             GenFdsGlobalVariable.VerboseLogger("Use GenSection function Generate CRC32 Section")
-            GenFdsGlobalVariable.GenerateSection(OutputFile, SectFile, Section.Section.SectionType[self.SectionType], InputAlign=SectAlign, IsMakefile=IsMakefile)
+            GenFdsGlobalVariable.GenerateSection(OutputFile, SectFile, Section.SectionType[self.SectionType], InputAlign=SectAlign, IsMakefile=IsMakefile)
             OutputFileList = []
             OutputFileList.append(OutputFile)
             return OutputFileList, self.Alignment
@@ -243,7 +243,7 @@ class GuidSection(GuidSectionClassObject) :
 
                 if self.AuthStatusValid in ("TRUE", "1"):
                     Attribute.append('AUTH_STATUS_VALID')
-                GenFdsGlobalVariable.GenerateSection(OutputFile, [TempFile], Section.Section.SectionType['GUIDED'],
+                GenFdsGlobalVariable.GenerateSection(OutputFile, [TempFile], Section.SectionType['GUIDED'],
                                                      Guid=self.NameGuid, GuidAttr=Attribute, GuidHdrLen=HeaderLength)
 
             else:
@@ -256,14 +256,14 @@ class GuidSection(GuidSectionClassObject) :
                 if self.AuthStatusValid in ("TRUE", "1"):
                     Attribute.append('AUTH_STATUS_VALID')
                 if self.ProcessRequired == "NONE" and HeaderLength is None:
-                    GenFdsGlobalVariable.GenerateSection(OutputFile, [TempFile], Section.Section.SectionType['GUIDED'],
+                    GenFdsGlobalVariable.GenerateSection(OutputFile, [TempFile], Section.SectionType['GUIDED'],
                                                          Guid=self.NameGuid, GuidAttr=Attribute,
                                                          GuidHdrLen=HeaderLength, DummyFile=DummyFile, IsMakefile=IsMakefile)
                 else:
                     if self.ProcessRequired in ("TRUE", "1"):
                         if 'PROCESSING_REQUIRED' not in Attribute:
                             Attribute.append('PROCESSING_REQUIRED')
-                    GenFdsGlobalVariable.GenerateSection(OutputFile, [TempFile], Section.Section.SectionType['GUIDED'],
+                    GenFdsGlobalVariable.GenerateSection(OutputFile, [TempFile], Section.SectionType['GUIDED'],
                                                          Guid=self.NameGuid, GuidAttr=Attribute,
                                                          GuidHdrLen=HeaderLength, IsMakefile=IsMakefile)
 
diff --git a/BaseTools/Source/Python/GenFds/OptRomInfStatement.py b/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
index 6179bfa181cb..93c4456eb89f 100644
--- a/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
@@ -119,7 +119,7 @@ class OptRomInfStatement (FfsInfStatement):
             GenSecInputFile = self.__ExtendMacro__(Rule.FileName)
             OutputFileList.append(GenSecInputFile)
         else:
-            OutputFileList, IsSect = Section.Section.GetFileList(self, '', Rule.FileExtension)
+            OutputFileList, IsSect = Section.GetSectionFileList(self, '', Rule.FileExtension)
 
         return OutputFileList
 
@@ -141,7 +141,7 @@ class OptRomInfStatement (FfsInfStatement):
                     GenSecInputFile = self.__ExtendMacro__(Sect.FileName)
                     OutputFileList.append(GenSecInputFile)
                 else:
-                    FileList, IsSect = Section.Section.GetFileList(self, '', Sect.FileExtension)
+                    FileList, IsSect = Section.GetSectionFileList(self, '', Sect.FileExtension)
                     OutputFileList.extend(FileList)    
         
         return OutputFileList
diff --git a/BaseTools/Source/Python/GenFds/Section.py b/BaseTools/Source/Python/GenFds/Section.py
index 5895998158b6..92d49da333eb 100644
--- a/BaseTools/Source/Python/GenFds/Section.py
+++ b/BaseTools/Source/Python/GenFds/Section.py
@@ -22,148 +22,107 @@ from Common import EdkLogger
 from Common.BuildToolError import *
 from Common.DataType import *
 
-## section base class
-#
-#
-class Section (SectionClassObject):
-    SectionType = {
-        'RAW'       : 'EFI_SECTION_RAW',
-        'FREEFORM'  : 'EFI_SECTION_FREEFORM_SUBTYPE_GUID',
-        BINARY_FILE_TYPE_PE32      : 'EFI_SECTION_PE32',
-        BINARY_FILE_TYPE_PIC       : 'EFI_SECTION_PIC',
-        BINARY_FILE_TYPE_TE        : 'EFI_SECTION_TE',
-        'FV_IMAGE'  : 'EFI_SECTION_FIRMWARE_VOLUME_IMAGE',
-        BINARY_FILE_TYPE_DXE_DEPEX : 'EFI_SECTION_DXE_DEPEX',
-        BINARY_FILE_TYPE_PEI_DEPEX : 'EFI_SECTION_PEI_DEPEX',
-        'GUIDED'    : 'EFI_SECTION_GUID_DEFINED',
-        'COMPRESS'  : 'EFI_SECTION_COMPRESSION',
-        BINARY_FILE_TYPE_UI        : 'EFI_SECTION_USER_INTERFACE',
-        BINARY_FILE_TYPE_SMM_DEPEX : 'EFI_SECTION_SMM_DEPEX'
-    }
-
-    BinFileType = {
-        BINARY_FILE_TYPE_GUID          : '.guid',
-        'ACPI'          : '.acpi',
-        'ASL'           : '.asl' ,
-        BINARY_FILE_TYPE_UEFI_APP      : '.app',
-        BINARY_FILE_TYPE_LIB           : '.lib',
-        BINARY_FILE_TYPE_PE32          : '.pe32',
-        BINARY_FILE_TYPE_PIC           : '.pic',
-        BINARY_FILE_TYPE_PEI_DEPEX     : '.depex',
-        'SEC_PEI_DEPEX' : '.depex',
-        BINARY_FILE_TYPE_TE            : '.te',
-        BINARY_FILE_TYPE_UNI_VER       : '.ver',
-        BINARY_FILE_TYPE_VER           : '.ver',
-        BINARY_FILE_TYPE_UNI_UI        : '.ui',
-        BINARY_FILE_TYPE_UI            : '.ui',
-        BINARY_FILE_TYPE_BIN           : '.bin',
-        'RAW'           : '.raw',
-        'COMPAT16'      : '.comp16',
-        BINARY_FILE_TYPE_FV            : '.fv'
-    }
-
-    SectFileType = {
-        'SEC_GUID'      : '.sec' ,
-        'SEC_PE32'      : '.sec' ,
-        'SEC_PIC'       : '.sec',
-        'SEC_TE'        : '.sec',
-        'SEC_VER'       : '.sec',
-        'SEC_UI'        : '.sec',
-        'SEC_COMPAT16'  : '.sec',
-        'SEC_BIN'       : '.sec'
-    }
+SectionType = {
+    'RAW'       : 'EFI_SECTION_RAW',
+    'FREEFORM'  : 'EFI_SECTION_FREEFORM_SUBTYPE_GUID',
+    BINARY_FILE_TYPE_PE32      : 'EFI_SECTION_PE32',
+    BINARY_FILE_TYPE_PIC       : 'EFI_SECTION_PIC',
+    BINARY_FILE_TYPE_TE        : 'EFI_SECTION_TE',
+    'FV_IMAGE'  : 'EFI_SECTION_FIRMWARE_VOLUME_IMAGE',
+    BINARY_FILE_TYPE_DXE_DEPEX : 'EFI_SECTION_DXE_DEPEX',
+    BINARY_FILE_TYPE_PEI_DEPEX : 'EFI_SECTION_PEI_DEPEX',
+    'GUIDED'    : 'EFI_SECTION_GUID_DEFINED',
+    'COMPRESS'  : 'EFI_SECTION_COMPRESSION',
+    BINARY_FILE_TYPE_UI        : 'EFI_SECTION_USER_INTERFACE',
+    BINARY_FILE_TYPE_SMM_DEPEX : 'EFI_SECTION_SMM_DEPEX'
+}
 
-    ToolGuid = {
-        '0xa31280ad-0x481e-0x41b6-0x95e8-0x127f-0x4c984779' : 'TianoCompress',
-        '0xee4e5898-0x3914-0x4259-0x9d6e-0xdc7b-0xd79403cf' : 'LzmaCompress'
-    }
+BinFileType = {
+    BINARY_FILE_TYPE_GUID          : '.guid',
+    'ACPI'          : '.acpi',
+    'ASL'           : '.asl' ,
+    BINARY_FILE_TYPE_UEFI_APP      : '.app',
+    BINARY_FILE_TYPE_LIB           : '.lib',
+    BINARY_FILE_TYPE_PE32          : '.pe32',
+    BINARY_FILE_TYPE_PIC           : '.pic',
+    BINARY_FILE_TYPE_PEI_DEPEX     : '.depex',
+    'SEC_PEI_DEPEX' : '.depex',
+    BINARY_FILE_TYPE_TE            : '.te',
+    BINARY_FILE_TYPE_UNI_VER       : '.ver',
+    BINARY_FILE_TYPE_VER           : '.ver',
+    BINARY_FILE_TYPE_UNI_UI        : '.ui',
+    BINARY_FILE_TYPE_UI            : '.ui',
+    BINARY_FILE_TYPE_BIN           : '.bin',
+    'RAW'           : '.raw',
+    'COMPAT16'      : '.comp16',
+    BINARY_FILE_TYPE_FV            : '.fv'
+}
 
-    ## The constructor
-    #
-    #   @param  self        The object pointer
-    #
-    def __init__(self):
-        SectionClassObject.__init__(self)
+SectValidFileType = {'SEC_GUID', 'SEC_PE32', 'SEC_PIC', 'SEC_TE', 'SEC_VER', 'SEC_UI', 'SEC_COMPAT16', 'SEC_BIN'}
 
-    ## GenSection() method
-    #
-    #   virtual function
-    #
-    #   @param  self        The object pointer
-    #   @param  OutputPath  Where to place output file
-    #   @param  ModuleName  Which module this section belongs to
-    #   @param  SecNum      Index of section
-    #   @param  KeyStringList  Filter for inputs of section generation
-    #   @param  FfsInf      FfsInfStatement object that contains this section data
-    #   @param  Dict        dictionary contains macro and its value
-    #
-    def GenSection(self, OutputPath, GuidName, SecNum, keyStringList, FfsInf = None, Dict = {}):
-        pass
-
-    ## GetFileList() method
-    #
-    #   Generate compressed section
-    #
-    #   @param  self        The object pointer
-    #   @param  FfsInf      FfsInfStatement object that contains file list
-    #   @param  FileType    File type to get
-    #   @param  FileExtension  File extension to get
-    #   @param  Dict        dictionary contains macro and its value
-    #   @retval tuple       (File list, boolean)
-    #
-    def GetFileList(FfsInf, FileType, FileExtension, Dict = {}, IsMakefile=False):
-        IsSect = FileType in Section.SectFileType
+## GetSectionFileList() method
+#
+#   Get the file list for a section
+#
+#   @param  FfsInf      FfsInfStatement object that contains file list
+#   @param  FileType    File type to get
+#   @param  FileExtension  File extension to get
+#   @param  Dict        dictionary containing macros and their values
+#   @param  IsMakefile  True when generating makefile commands
+#   @retval tuple       (File list, boolean)
+#
+def GetSectionFileList(FfsInf, FileType, FileExtension, Dict = {}, IsMakefile=False):
+    IsSect = FileType in SectValidFileType
 
-        if FileExtension is not None:
-            Suffix = FileExtension
-        elif IsSect :
-            Suffix = Section.SectionType.get(FileType)
-        else:
-            Suffix = Section.BinFileType.get(FileType)
-        if FfsInf is None:
-            EdkLogger.error("GenFds", GENFDS_ERROR, 'Inf File does not exist!')
+    if FileExtension:
+        Suffix = FileExtension
+    elif IsSect:
+        Suffix = SectionType.get(FileType)
+    else:
+        Suffix = BinFileType.get(FileType)
+    if FfsInf is None:
+        EdkLogger.error("GenFds", GENFDS_ERROR, 'Inf File does not exist!')
 
-        FileList = []
-        if FileType is not None:
-            for File in FfsInf.BinFileList:
-                if File.Arch == TAB_ARCH_COMMON or FfsInf.CurrentArch == File.Arch:
-                    if File.Type == FileType or (int(FfsInf.PiSpecVersion, 16) >= 0x0001000A \
-                                                 and FileType == 'DXE_DPEX' and File.Type == BINARY_FILE_TYPE_SMM_DEPEX) \
-                                                 or (FileType == BINARY_FILE_TYPE_TE and File.Type == BINARY_FILE_TYPE_PE32):
-                        if '*' in FfsInf.TargetOverrideList or File.Target == '*' or File.Target in FfsInf.TargetOverrideList or FfsInf.TargetOverrideList == []:
-                            FileList.append(FfsInf.PatchEfiFile(File.Path, File.Type))
-                        else:
-                            GenFdsGlobalVariable.InfLogger ("\nBuild Target \'%s\' of File %s is not in the Scope of %s specified by INF %s in FDF" %(File.Target, File.File, FfsInf.TargetOverrideList, FfsInf.InfFileName))
+    FileList = []
+    if FileType is not None:
+        for File in FfsInf.BinFileList:
+            if File.Arch == TAB_ARCH_COMMON or FfsInf.CurrentArch == File.Arch:
+                if File.Type == FileType or (int(FfsInf.PiSpecVersion, 16) >= 0x0001000A \
+                                             and FileType == 'DXE_DPEX' and File.Type == BINARY_FILE_TYPE_SMM_DEPEX) \
+                                             or (FileType == BINARY_FILE_TYPE_TE and File.Type == BINARY_FILE_TYPE_PE32):
+                    if '*' in FfsInf.TargetOverrideList or File.Target == '*' or File.Target in FfsInf.TargetOverrideList or FfsInf.TargetOverrideList == []:
+                        FileList.append(FfsInf.PatchEfiFile(File.Path, File.Type))
                     else:
-                        GenFdsGlobalVariable.VerboseLogger ("\nFile Type \'%s\' of File %s in %s is not same with file type \'%s\' from Rule in FDF" %(File.Type, File.File, FfsInf.InfFileName, FileType))
+                        GenFdsGlobalVariable.InfLogger ("\nBuild Target \'%s\' of File %s is not in the Scope of %s specified by INF %s in FDF" %(File.Target, File.File, FfsInf.TargetOverrideList, FfsInf.InfFileName))
                 else:
-                    GenFdsGlobalVariable.InfLogger ("\nCurrent ARCH \'%s\' of File %s is not in the Support Arch Scope of %s specified by INF %s in FDF" %(FfsInf.CurrentArch, File.File, File.Arch, FfsInf.InfFileName))
+                    GenFdsGlobalVariable.VerboseLogger ("\nFile Type \'%s\' of File %s in %s is not same with file type \'%s\' from Rule in FDF" %(File.Type, File.File, FfsInf.InfFileName, FileType))
+            else:
+                GenFdsGlobalVariable.InfLogger ("\nCurrent ARCH \'%s\' of File %s is not in the Support Arch Scope of %s specified by INF %s in FDF" %(FfsInf.CurrentArch, File.File, File.Arch, FfsInf.InfFileName))
 
-        if (not IsMakefile and Suffix is not None and os.path.exists(FfsInf.EfiOutputPath)) or (IsMakefile and Suffix is not None):
-            #
-            # Get Makefile path and time stamp
-            #
-            MakefileDir = FfsInf.EfiOutputPath[:-len('OUTPUT')]
-            Makefile = os.path.join(MakefileDir, 'Makefile')
-            if not os.path.exists(Makefile):
-                Makefile = os.path.join(MakefileDir, 'GNUmakefile')
-            if os.path.exists(Makefile):
-                # Update to search files with suffix in all sub-dirs.
-                Tuple = os.walk(FfsInf.EfiOutputPath)
-                for Dirpath, Dirnames, Filenames in Tuple:
-                    for F in Filenames:
-                        if os.path.splitext(F)[1] == Suffix:
-                            FullName = os.path.join(Dirpath, F)
-                            if os.path.getmtime(FullName) > os.path.getmtime(Makefile):
-                                FileList.append(FullName)
-            if not FileList:
-                SuffixMap = FfsInf.GetFinalTargetSuffixMap()
-                if Suffix in SuffixMap:
-                    FileList.extend(SuffixMap[Suffix])
-                
-        #Process the file lists is alphabetical for a same section type
-        if len (FileList) > 1:
-            FileList.sort()
+    if (not IsMakefile and Suffix is not None and os.path.exists(FfsInf.EfiOutputPath)) or (IsMakefile and Suffix is not None):
+        #
+        # Get Makefile path and time stamp
+        #
+        MakefileDir = FfsInf.EfiOutputPath[:-len('OUTPUT')]
+        Makefile = os.path.join(MakefileDir, 'Makefile')
+        if not os.path.exists(Makefile):
+            Makefile = os.path.join(MakefileDir, 'GNUmakefile')
+        if os.path.exists(Makefile):
+            # Update to search files with suffix in all sub-dirs.
+            Tuple = os.walk(FfsInf.EfiOutputPath)
+            for Dirpath, Dirnames, Filenames in Tuple:
+                for F in Filenames:
+                    if os.path.splitext(F)[1] == Suffix:
+                        FullName = os.path.join(Dirpath, F)
+                        if os.path.getmtime(FullName) > os.path.getmtime(Makefile):
+                            FileList.append(FullName)
+        if not FileList:
+            SuffixMap = FfsInf.GetFinalTargetSuffixMap()
+            if Suffix in SuffixMap:
+                FileList.extend(SuffixMap[Suffix])
+            
+    # Process the file list alphabetically for files of the same section type
+    if len (FileList) > 1:
+        FileList.sort()
 
-        return FileList, IsSect
-    GetFileList = staticmethod(GetFileList)
+    return FileList, IsSect
diff --git a/BaseTools/Source/Python/GenFds/UiSection.py b/BaseTools/Source/Python/GenFds/UiSection.py
index fe1e026f5edf..f8b5cdbf7cf9 100644
--- a/BaseTools/Source/Python/GenFds/UiSection.py
+++ b/BaseTools/Source/Python/GenFds/UiSection.py
@@ -15,7 +15,6 @@
 ##
 # Import Modules
 #
-import Section
 from Ffs import SectionSuffix
 import subprocess
 import Common.LongFilePathOs as os
@@ -53,25 +52,22 @@ class UiSection (UiSectionClassObject):
         #
         # Prepare the parameter of GenSection
         #
-        if FfsInf is not None:
+        if FfsInf:
             self.Alignment = FfsInf.__ExtendMacro__(self.Alignment)
             self.StringData = FfsInf.__ExtendMacro__(self.StringData)
             self.FileName = FfsInf.__ExtendMacro__(self.FileName)
 
-        OutputFile = os.path.join(OutputPath, ModuleName + 'SEC' + SecNum + SectionSuffix['UI'])
+        OutputFile = os.path.join(OutputPath, ModuleName + TAB_SEC_DIRECTORY + SecNum + SectionSuffix['UI'])
 
-        if self.StringData is not None :
+        if self.StringData:
             NameString = self.StringData
-        elif self.FileName is not None:
+        elif self.FileName:
             FileNameStr = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.FileName)
             FileNameStr = GenFdsGlobalVariable.MacroExtend(FileNameStr, Dict)
-            FileObj = open(FileNameStr, 'r')
-            NameString = FileObj.read()
-            FileObj.close()
+            with open(FileNameStr, 'r') as FileObj:
+                NameString = FileObj.read()
         else:
             NameString = ''
         GenFdsGlobalVariable.GenerateSection(OutputFile, None, 'EFI_SECTION_USER_INTERFACE', Ui=NameString, IsMakefile=IsMakefile)
 
-        OutputFileList = []
-        OutputFileList.append(OutputFile)
-        return OutputFileList, self.Alignment
+        return [OutputFile], self.Alignment
diff --git a/BaseTools/Source/Python/GenFds/VerSection.py b/BaseTools/Source/Python/GenFds/VerSection.py
index 1bcdc8110d30..e6e79cff5be3 100644
--- a/BaseTools/Source/Python/GenFds/VerSection.py
+++ b/BaseTools/Source/Python/GenFds/VerSection.py
@@ -16,13 +16,12 @@
 # Import Modules
 #
 from Ffs import SectionSuffix
-import Section
 import Common.LongFilePathOs as os
 import subprocess
 from GenFdsGlobalVariable import GenFdsGlobalVariable
 from CommonDataClass.FdfClass import VerSectionClassObject
 from Common.LongFilePathSupport import OpenLongFilePath as open
-from Common.DataType import SUP_MODULE_SEC
+from Common.DataType import TAB_SEC_DIRECTORY
 
 ## generate version section
 #
@@ -53,31 +52,26 @@ class VerSection (VerSectionClassObject):
         #
         # Prepare the parameter of GenSection
         #
-        if FfsInf is not None:
+        if FfsInf:
             self.Alignment = FfsInf.__ExtendMacro__(self.Alignment)
             self.BuildNum = FfsInf.__ExtendMacro__(self.BuildNum)
             self.StringData = FfsInf.__ExtendMacro__(self.StringData)
             self.FileName = FfsInf.__ExtendMacro__(self.FileName)
 
         OutputFile = os.path.join(OutputPath,
-                                  ModuleName + 'SEC' + SecNum + SectionSuffix['VERSION'])
+                                  ModuleName + TAB_SEC_DIRECTORY + SecNum + SectionSuffix['VERSION'])
         OutputFile = os.path.normpath(OutputFile)
 
         # Get String Data
-        StringData = ''
-        if self.StringData is not None:
+        if self.StringData:
             StringData = self.StringData
-        elif self.FileName is not None:
+        elif self.FileName:
             FileNameStr = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.FileName)
             FileNameStr = GenFdsGlobalVariable.MacroExtend(FileNameStr, Dict)
-            FileObj = open(FileNameStr, 'r')
-            StringData = FileObj.read()
-            StringData = '"' + StringData + '"'
-            FileObj.close()
+            with open(FileNameStr, 'r') as FileObj:
+                StringData = '"' + FileObj.read() + '"'
         else:
             StringData = ''
         GenFdsGlobalVariable.GenerateSection(OutputFile, [], 'EFI_SECTION_VERSION',
                                              Ver=StringData, BuildNumber=self.BuildNum, IsMakefile=IsMakefile)
-        OutputFileList = []
-        OutputFileList.append(OutputFile)
-        return OutputFileList, self.Alignment
+        return [OutputFile], self.Alignment
-- 
2.16.2.windows.1



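The Section.py hunks above replace a never-instantiated container class with module-level names, so callers drop the redundant `Section.Section.` prefix. A minimal sketch of that pattern, with invented dictionary contents and a toy function body standing in for the real `GetSectionFileList` logic:

```python
# Before: constants and a staticmethod hang off a class that exists only
# as a namespace; callers write Section.Section.SectionType[...].
class Section(object):
    SectionType = {'RAW': 'EFI_SECTION_RAW'}

    @staticmethod
    def GetFileList(FileType):
        return [FileType]

# After: plain module-level definitions; callers import the module and
# write Section.SectionType[...] / Section.GetSectionFileList(...).
SectionType = {'RAW': 'EFI_SECTION_RAW'}

def GetSectionFileList(FileType):
    return [FileType]
```

The two spellings resolve to the same values, so call sites only need the mechanical prefix change shown in the GuidSection.py and OptRomInfStatement.py hunks.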
^ permalink raw reply related	[flat|nested] 13+ messages in thread

* [PATCH v1 07/11] BaseTools: refactor file opening/writing
  2018-05-14 18:09 [PATCH v1 00/11] BaseTools refactoring Jaben Carsey
                   ` (5 preceding siblings ...)
  2018-05-14 18:09 ` [PATCH v1 06/11] BaseTools: refactor section generation Jaben Carsey
@ 2018-05-14 18:09 ` Jaben Carsey
  2018-05-14 18:09 ` [PATCH v1 08/11] BaseTools: refactor to change object types Jaben Carsey
                   ` (3 subsequent siblings)
  10 siblings, 0 replies; 13+ messages in thread
From: Jaben Carsey @ 2018-05-14 18:09 UTC (permalink / raw)
  To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu

change file open modes to request only the minimal needed permissions
change file access to use with statements so handles are always closed
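The pattern applied throughout this patch can be sketched as follows; the file name and content are invented for illustration, and `'rb'` is used for the read so the bytes can feed a hash on either Python version:

```python
import hashlib
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), 'AutoGen')

# Before (the pattern being removed): manual open/close, and a broader
# mode ('w+') than the write-only access actually needed.
f = open(path, 'w+')
f.write('MetaFile.dec\n')
f.close()

# After: the with statement closes the handle even if read() raises,
# and the mode requests only read access.
m = hashlib.md5()
with open(path, 'rb') as f:
    m.update(f.read())
digest = m.hexdigest()
```

Beyond tidiness, the deterministic close is what matters: a handle left open by an exception can keep a file like AutoGenTimeStamp locked on Windows, which matches the error the cover letter says this series cleaned up.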

Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
 BaseTools/Source/Python/AutoGen/AutoGen.py                   |  71 +++++-------
 BaseTools/Source/Python/AutoGen/GenC.py                      |   5 +-
 BaseTools/Source/Python/AutoGen/GenMake.py                   |   5 +-
 BaseTools/Source/Python/AutoGen/IdfClassObject.py            |  21 ++--
 BaseTools/Source/Python/AutoGen/StrGather.py                 |  18 +--
 BaseTools/Source/Python/AutoGen/UniClassObject.py            |   5 +-
 BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py   |   5 +-
 BaseTools/Source/Python/BPDG/GenVpd.py                       |  30 +++--
 BaseTools/Source/Python/Common/Misc.py                       |  39 +++----
 BaseTools/Source/Python/Common/TargetTxtClassObject.py       |  13 +--
 BaseTools/Source/Python/Common/ToolDefClassObject.py         |   4 +-
 BaseTools/Source/Python/Common/VpdInfoFile.py                |   4 +-
 BaseTools/Source/Python/Ecc/Ecc.py                           |   4 +-
 BaseTools/Source/Python/Eot/EotGlobalData.py                 |  14 +--
 BaseTools/Source/Python/Eot/FileProfile.py                   |   8 +-
 BaseTools/Source/Python/Eot/Report.py                        |   6 +-
 BaseTools/Source/Python/GenFds/Capsule.py                    |   7 +-
 BaseTools/Source/Python/GenFds/CapsuleData.py                |  10 +-
 BaseTools/Source/Python/GenFds/FdfParser.py                  |  20 +---
 BaseTools/Source/Python/GenFds/FfsFileStatement.py           |   5 +-
 BaseTools/Source/Python/GenFds/Fv.py                         |  37 +++---
 BaseTools/Source/Python/GenFds/FvImageSection.py             |   8 +-
 BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py       |  53 ++++-----
 BaseTools/Source/Python/GenFds/GuidSection.py                |  45 ++++----
 BaseTools/Source/Python/GenFds/Region.py                     |  15 +--
 BaseTools/Source/Python/GenFds/Vtf.py                        | 122 ++++++++++----------
 BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py |  21 ++--
 BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py       |  19 ++-
 BaseTools/Source/Python/Table/TableReport.py                 |  47 ++++----
 BaseTools/Source/Python/TargetTool/TargetTool.py             |  97 ++++++++--------
 BaseTools/Source/Python/Trim/Trim.py                         |  67 +++++------
 BaseTools/Source/Python/Workspace/DscBuildData.py            |  17 +--
 BaseTools/Source/Python/build/BuildReport.py                 |  81 +++++++------
 BaseTools/Source/Python/build/build.py                       | 106 ++++++++---------
 34 files changed, 453 insertions(+), 576 deletions(-)

diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 619e1e41e32b..009e5c56781d 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -653,10 +653,8 @@ class WorkspaceAutoGen(AutoGen):
             for files in AllWorkSpaceMetaFiles:
                 if files.endswith('.dec'):
                     continue
-                f = open(files, 'r')
-                Content = f.read()
-                f.close()
-                m.update(Content)
+                with open(files, 'r') as f:
+                    m.update(f.read())
             SaveFileOnChange(os.path.join(self.BuildDir, 'AutoGen.hash'), m.hexdigest(), True)
             GlobalData.gPlatformHash = m.hexdigest()
 
@@ -664,11 +662,11 @@ class WorkspaceAutoGen(AutoGen):
         # Write metafile list to build directory
         #
         AutoGenFilePath = os.path.join(self.BuildDir, 'AutoGen')
-        if os.path.exists (AutoGenFilePath):
-            os.remove(AutoGenFilePath)
         if not os.path.exists(self.BuildDir):
             os.makedirs(self.BuildDir)
-        with open(os.path.join(self.BuildDir, 'AutoGen'), 'w+') as file:
+        elif os.path.exists (AutoGenFilePath):
+            os.remove(AutoGenFilePath)
+        with open(AutoGenFilePath, 'w') as file:
             for f in AllWorkSpaceMetaFiles:
                 print >> file, f
         return True
@@ -679,20 +677,16 @@ class WorkspaceAutoGen(AutoGen):
         HashFile = os.path.join(PkgDir, Pkg.PackageName + '.hash')
         m = hashlib.md5()
         # Get .dec file's hash value
-        f = open(Pkg.MetaFile.Path, 'r')
-        Content = f.read()
-        f.close()
-        m.update(Content)
+        with open(Pkg.MetaFile.Path, 'r') as f:
+            m.update(f.read())
         # Get include files hash value
         if Pkg.Includes:
             for inc in Pkg.Includes:
                 for Root, Dirs, Files in os.walk(str(inc)):
                     for File in Files:
                         File_Path = os.path.join(Root, File)
-                        f = open(File_Path, 'r')
-                        Content = f.read()
-                        f.close()
-                        m.update(Content)
+                        with open(File_Path, 'r') as f:
+                            m.update(f.read())
         SaveFileOnChange(HashFile, m.hexdigest(), True)
         if Pkg.PackageName not in GlobalData.gPackageHash[Pkg.Arch]:
             GlobalData.gPackageHash[Pkg.Arch][Pkg.PackageName] = m.hexdigest()
@@ -3779,9 +3773,8 @@ class ModuleAutoGen(AutoGen):
             Vfri = os.path.join(self.OutputDir, SrcFile.BaseName + '.i')
             if not os.path.exists(Vfri):
                 continue
-            VfriFile = open(Vfri, 'r')
-            Content = VfriFile.read()
-            VfriFile.close()
+            with open(Vfri, 'r') as VfriFile:
+                Content = VfriFile.read()
             Pos = Content.find('efivarstore')
             while Pos != -1:
                 #
@@ -3850,11 +3843,6 @@ class ModuleAutoGen(AutoGen):
         OutputName = '%sOffset.bin' % self.Name
         UniVfrOffsetFileName    =  os.path.join( self.OutputDir, OutputName)
 
-        try:
-            fInputfile = open(UniVfrOffsetFileName, "wb+", 0)
-        except:
-            EdkLogger.error("build", FILE_OPEN_FAILURE, "File open failed for %s" % UniVfrOffsetFileName,None)
-
         # Use a instance of StringIO to cache data
         fStringIO = StringIO('')  
 
@@ -3881,17 +3869,19 @@ class ModuleAutoGen(AutoGen):
                 fStringIO.write(''.join(VfrGuid))                   
                 VfrValue = pack ('Q', int (Item[1], 16))
                 fStringIO.write (VfrValue)
-        #
-        # write data into file.
-        #
-        try :  
-            fInputfile.write (fStringIO.getvalue())
+
+        try:
+            with open(UniVfrOffsetFileName, "wb", 0) as fInputfile:
+                # write data into file.
+                try :
+                    fInputfile.write (fStringIO.getvalue())
+                except:
+                    EdkLogger.error("build", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the "
+                                    "file been locked or using by other applications." %UniVfrOffsetFileName,None)
         except:
-            EdkLogger.error("build", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the "
-                            "file been locked or using by other applications." %UniVfrOffsetFileName,None)
+            EdkLogger.error("build", FILE_OPEN_FAILURE, "File open failed for %s" % UniVfrOffsetFileName,None)
 
         fStringIO.close ()
-        fInputfile.close ()
         return OutputName
 
     ## Create AsBuilt INF file the module
@@ -4282,9 +4272,8 @@ class ModuleAutoGen(AutoGen):
         FileDir = path.join(GlobalData.gBinCacheSource, self.Arch, self.SourceDir, self.MetaFile.BaseName)
         HashFile = path.join(FileDir, self.Name + '.hash')
         if os.path.exists(HashFile):
-            f = open(HashFile, 'r')
-            CacheHash = f.read()
-            f.close()
+            with open(HashFile, 'r') as f:
+                CacheHash = f.read()
             if GlobalData.gModuleHash[self.Arch][self.Name]:
                 if CacheHash == GlobalData.gModuleHash[self.Arch][self.Name]:
                     for root, dir, files in os.walk(FileDir):
@@ -4448,17 +4437,13 @@ class ModuleAutoGen(AutoGen):
                 m.update(GlobalData.gModuleHash[self.Arch][Lib.Name])
 
         # Add Module self
-        f = open(str(self.MetaFile), 'r')
-        Content = f.read()
-        f.close()
-        m.update(Content)
+        with open(str(self.MetaFile), 'r') as f:
+            m.update(f.read())
         # Add Module's source files
         if self.SourceFileList:
             for File in self.SourceFileList:
-                f = open(str(File), 'r')
-                Content = f.read()
-                f.close()
-                m.update(Content)
+                with open(str(File), 'r') as f:
+                    m.update(f.read())
 
         ModuleHashFile = path.join(self.BuildDir, self.Name + ".hash")
         if self.Name not in GlobalData.gModuleHash[self.Arch]:
@@ -4519,7 +4504,7 @@ class ModuleAutoGen(AutoGen):
 
         if os.path.exists (self.GetTimeStampPath()):
             os.remove (self.GetTimeStampPath())
-        with open(self.GetTimeStampPath(), 'w+') as file:
+        with open(self.GetTimeStampPath(), 'w') as file:
             for f in FileSet:
                 print >> file, f
 
diff --git a/BaseTools/Source/Python/AutoGen/GenC.py b/BaseTools/Source/Python/AutoGen/GenC.py
index 40a343ca1057..46c7c1c1390b 100644
--- a/BaseTools/Source/Python/AutoGen/GenC.py
+++ b/BaseTools/Source/Python/AutoGen/GenC.py
@@ -1814,9 +1814,8 @@ def CreateIdfFileCode(Info, AutoGenC, StringH, IdfGenCFlag, IdfGenBinBuffer):
                                 Index += 1
                                 continue
 
-                            TmpFile = open(File.Path, 'rb')
-                            Buffer = TmpFile.read()
-                            TmpFile.close()
+                            with open(File.Path, 'rb') as f:
+                                Buffer = f.read()
                             if File.Ext.upper() == '.PNG':
                                 TempBuffer = pack('B', EFI_HII_IIBT_IMAGE_PNG)
                                 TempBuffer += pack('I', len(Buffer))
diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
index d70c5c26ffc8..30280d449f62 100644
--- a/BaseTools/Source/Python/AutoGen/GenMake.py
+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
@@ -1026,12 +1026,11 @@ cleanlib:
                 CurrentFileDependencyList = DepDb[F]
             else:
                 try:
-                    Fd = open(F.Path, 'r')
+                    with open(F.Path, 'r') as f:
+                        FileContent = f.read()
                 except BaseException, X:
                     EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=F.Path + "\n\t" + str(X))
 
-                FileContent = Fd.read()
-                Fd.close()
                 if len(FileContent) == 0:
                     continue
 
diff --git a/BaseTools/Source/Python/AutoGen/IdfClassObject.py b/BaseTools/Source/Python/AutoGen/IdfClassObject.py
index 769790d965b5..8b84806f9f36 100644
--- a/BaseTools/Source/Python/AutoGen/IdfClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/IdfClassObject.py
@@ -1,7 +1,7 @@
 ## @file
 # This file is used to collect all defined strings in Image Definition files
 #
-# Copyright (c) 2016, Intel Corporation. All rights reserved.<BR>
+# Copyright (c) 2016 - 2018, Intel Corporation. All rights reserved.<BR>
 # This program and the accompanying materials
 # are licensed and made available under the terms and conditions of the BSD License
 # which accompanies this distribution.  The full text of the license may be found at
@@ -69,13 +69,12 @@ class IdfFileClassObject(object):
         self.ImageFilesDict = {}
         self.ImageIDList = []
         for File in FileList:
-            if File is None:
+            if not File:
                 EdkLogger.error("Image Definition File Parser", PARSER_ERROR, 'No Image definition file is given.')
 
             try:
-                IdfFile = open(LongFilePath(File.Path), mode='r')
-                FileIn = IdfFile.read()
-                IdfFile.close()
+                with open(LongFilePath(File.Path), mode='r') as f:
+                    FileIn = f.read()
             except:
                 EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=File)
 
@@ -118,12 +117,12 @@ def SearchImageID(ImageFileObject, FileList):
 
     for File in FileList:
         if os.path.isfile(File):
-            Lines = open(File, 'r')
-            for Line in Lines:
-                ImageIdList = IMAGE_TOKEN.findall(Line)
-                for ID in ImageIdList:
-                    EdkLogger.debug(EdkLogger.DEBUG_5, "Found ImageID identifier: " + ID)
-                    ImageFileObject.SetImageIDReferenced(ID)
+            with open(File, 'r') as f:
+                for Line in f:
+                    ImageIdList = IMAGE_TOKEN.findall(Line)
+                    for ID in ImageIdList:
+                        EdkLogger.debug(EdkLogger.DEBUG_5, "Found ImageID identifier: " + ID)
+                        ImageFileObject.SetImageIDReferenced(ID)
 
 class ImageFileObject(object):
     def __init__(self, FileName, ImageID, TransParent = False):
diff --git a/BaseTools/Source/Python/AutoGen/StrGather.py b/BaseTools/Source/Python/AutoGen/StrGather.py
index e5e4f25efd5d..c0a39e4a12f1 100644
--- a/BaseTools/Source/Python/AutoGen/StrGather.py
+++ b/BaseTools/Source/Python/AutoGen/StrGather.py
@@ -529,11 +529,11 @@ def SearchString(UniObjectClass, FileList, IsCompatibleMode):
 
     for File in FileList:
         if os.path.isfile(File):
-            Lines = open(File, 'r')
-            for Line in Lines:
-                for StrName in STRING_TOKEN.findall(Line):
-                    EdkLogger.debug(EdkLogger.DEBUG_5, "Found string identifier: " + StrName)
-                    UniObjectClass.SetStringReferenced(StrName)
+            with open(File, 'r') as f:
+                for Line in f:
+                    for StrName in STRING_TOKEN.findall(Line):
+                        EdkLogger.debug(EdkLogger.DEBUG_5, "Found string identifier: " + StrName)
+                        UniObjectClass.SetStringReferenced(StrName)
 
     UniObjectClass.ReToken()
 
@@ -603,9 +603,9 @@ if __name__ == '__main__':
     SkipList = ['.inf', '.uni']
     BaseName = 'DriverSample'
     (h, c) = GetStringFiles(UniFileList, SrcFileList, IncludeList, SkipList, BaseName, True)
-    hfile = open('unistring.h', 'w')
-    cfile = open('unistring.c', 'w')
-    hfile.write(h)
-    cfile.write(c)
+    with open('unistring.h', 'w') as f:
+        f.write(h)
+    with open('unistring.c', 'w') as f:
+        f.write(c)
 
     EdkLogger.info('end')
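The StrGather `__main__` hunk swaps two never-closed handles for one `with` per output file. A sketch of that shape (paths and contents are invented for the demo):

```python
import os
import tempfile

def write_string_files(h_text, c_text, h_path, c_path):
    # One `with` per output file, as in the refactored __main__ block:
    # each handle is flushed and closed the moment its write finishes.
    with open(h_path, 'w') as f:
        f.write(h_text)
    with open(c_path, 'w') as f:
        f.write(c_text)

out_dir = tempfile.mkdtemp()
h_file = os.path.join(out_dir, 'unistring.h')
c_file = os.path.join(out_dir, 'unistring.c')
write_string_files('/* header */', '/* source */', h_file, c_file)
```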
diff --git a/BaseTools/Source/Python/AutoGen/UniClassObject.py b/BaseTools/Source/Python/AutoGen/UniClassObject.py
index 54b6fb22a08a..bb37fbfd6a0c 100644
--- a/BaseTools/Source/Python/AutoGen/UniClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/UniClassObject.py
@@ -303,9 +303,8 @@ class UniFileClassObject(object):
         # Read file
         #
         try:
-            UniFile = open(FileName, mode='rb')
-            FileIn = UniFile.read()
-            UniFile.close()
+            with open(FileName, mode='rb') as f:
+                FileIn = f.read()
         except:
             EdkLogger.Error("build", FILE_OPEN_FAILURE, ExtraData=File)
 
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index 3b54865000bf..2c6bb8e396a9 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -165,9 +165,8 @@ class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
         
         DbFile = StringIO()
         if Phase == 'DXE' and os.path.exists(BinFilePath):
-            BinFile = open(BinFilePath, "rb")
-            BinBuffer = BinFile.read()
-            BinFile.close()
+            with open(BinFilePath, "rb") as f:
+                BinBuffer = f.read()
             BinBufferSize = len(BinBuffer)
             if (BinBufferSize % 4):
                 for i in range(4 - (BinBufferSize % 4)):
diff --git a/BaseTools/Source/Python/BPDG/GenVpd.py b/BaseTools/Source/Python/BPDG/GenVpd.py
index 4fa12b7d59de..dba815415f92 100644
--- a/BaseTools/Source/Python/BPDG/GenVpd.py
+++ b/BaseTools/Source/Python/BPDG/GenVpd.py
@@ -323,13 +323,11 @@ class GenVPD :
         self.PcdFixedOffsetSizeList  = []
         self.PcdUnknownOffsetList    = []
         try:
-            fInputfile = open(InputFileName, "r", 0)
-            try:
-                self.FileLinesList = fInputfile.readlines()
-            except:
-                EdkLogger.error("BPDG", BuildToolError.FILE_READ_FAILURE, "File read failed for %s" % InputFileName, None)
-            finally:
-                fInputfile.close()
+            with open(InputFileName, "r", 0) as f:
+                try:
+                    self.FileLinesList = f.readlines()
+                except:
+                    EdkLogger.error("BPDG", BuildToolError.FILE_READ_FAILURE, "File read failed for %s" % InputFileName, None)
         except:
             EdkLogger.error("BPDG", BuildToolError.FILE_OPEN_FAILURE, "File open failed for %s" % InputFileName, None)
 
@@ -661,12 +659,6 @@ class GenVPD :
     def GenerateVpdFile (self, MapFileName, BinFileName):
         #Open an VPD file to process
 
-        try:
-            fVpdFile = open(BinFileName, "wb", 0)
-        except:
-            # Open failed
-            EdkLogger.error("BPDG", BuildToolError.FILE_OPEN_FAILURE, "File open failed for %s" % self.VpdFileName, None)
-
         try :
             fMapFile = open(MapFileName, "w", 0)
         except:
@@ -697,12 +689,16 @@ class GenVPD :
             else:
                 fStringIO.write (eachPcd.PcdValue)
 
-        try :
-            fVpdFile.write (fStringIO.getvalue())
+        try:
+            with open(BinFileName, "wb", 0) as fVpdFile:
+                try :
+                    fVpdFile.write (fStringIO.getvalue())
+                except:
+                    EdkLogger.error("BPDG", BuildToolError.FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the file been locked or using by other applications." % self.VpdFileName, None)
         except:
-            EdkLogger.error("BPDG", BuildToolError.FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the file been locked or using by other applications." % self.VpdFileName, None)
+            # Open failed
+            EdkLogger.error("BPDG", BuildToolError.FILE_OPEN_FAILURE, "File open failed for %s" % self.VpdFileName, None)
 
         fStringIO.close ()
-        fVpdFile.close ()
         fMapFile.close ()
         
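The GenVpd hunks keep two distinct error paths — open failure versus read/write failure — by nesting a `try` inside the `with`. A sketch of that structure, with illustrative names and sentinel return values standing in for the two `EdkLogger.error()` calls:

```python
import os
import tempfile

def load_lines(path):
    # Outer except: the file could not be opened at all.
    # Inner except: it opened, but reading it failed.
    try:
        with open(path, 'r') as f:
            try:
                return f.readlines()
            except (IOError, OSError):
                return 'READ_FAILED'
    except (IOError, OSError):
        return 'OPEN_FAILED'

fd, name = tempfile.mkstemp()
os.write(fd, b'a\nb\n')
os.close(fd)
lines = load_lines(name)
os.remove(name)
status = load_lines(name)  # path no longer exists
```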
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index 17907a318944..0bfb26548d9b 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -63,15 +63,14 @@ def GetVariableOffset(mapfilepath, efifilepath, varnames):
     
     @return List whos elements are tuple with variable name and raw offset
     """
-    lines = []
     try:
-        f = open(mapfilepath, 'r')
-        lines = f.readlines()
-        f.close()
+        with open(mapfilepath, 'r') as f:
+            lines = f.readlines()
     except:
         return None
     
-    if len(lines) == 0: return None
+    if len(lines) == 0:
+        return None
     firstline = lines[0].strip()
     if (firstline.startswith("Archive member included ") and
         firstline.endswith(" file (symbol)")):
@@ -471,13 +470,11 @@ def SaveFileOnChange(File, Content, IsBinaryFile=True):
                 if not SaveFileToDisk(File, Content):
                     EdkLogger.error(None, FILE_CREATE_FAILURE, ExtraData=File)
             except:
-                Fd = open(File, "wb")
+                with open(File, "wb") as Fd:
+                    Fd.write(Content)
+        else:
+            with open(File, "wb") as Fd:
                 Fd.write(Content)
-                Fd.close()
-        else:
-            Fd = open(File, "wb")
-            Fd.write(Content)
-            Fd.close()
     except IOError, X:
         EdkLogger.error(None, FILE_CREATE_FAILURE, ExtraData='IOError %s' % X)
 
@@ -489,15 +486,11 @@ def SaveFileOnChange(File, Content, IsBinaryFile=True):
 #   @param      File    The path of file to store the object
 #
 def DataDump(Data, File):
-    Fd = None
     try:
-        Fd = open(File, 'wb')
-        cPickle.dump(Data, Fd, cPickle.HIGHEST_PROTOCOL)
+        with open(File, 'wb') as Fd:
+            cPickle.dump(Data, Fd, cPickle.HIGHEST_PROTOCOL)
     except:
         EdkLogger.error("", FILE_OPEN_FAILURE, ExtraData=File, RaiseError=False)
-    finally:
-        if Fd is not None:
-            Fd.close()
 
 ## Restore a Python object from a file
 #
@@ -507,18 +500,12 @@ def DataDump(Data, File):
 #   @retval     None    If failure in file operation
 #
 def DataRestore(File):
-    Data = None
-    Fd = None
     try:
-        Fd = open(File, 'rb')
-        Data = cPickle.load(Fd)
+        with open(File, 'rb') as Fd:
+            return cPickle.load(Fd)
     except Exception, e:
         EdkLogger.verbose("Failed to load [%s]\n\t%s" % (File, str(e)))
-        Data = None
-    finally:
-        if Fd is not None:
-            Fd.close()
-    return Data
+    return None
 
 ## Retrieve and cache the real path name in file system
 #
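The Misc.py `DataRestore` refactor relies on a `return` inside `with` still running the file's `__exit__`, so both the explicit `finally: Fd.close()` and the `Data` temporary disappear. A sketch using `pickle` (the Python 2 tree uses `cPickle`; names are illustrative):

```python
import os
import pickle
import tempfile

def data_restore(path):
    # Returning from inside `with` closes the file first, so no
    # finally-block bookkeeping is needed; any failure yields None.
    try:
        with open(path, 'rb') as fd:
            return pickle.load(fd)
    except Exception:
        return None

fd, name = tempfile.mkstemp()
os.close(fd)
with open(name, 'wb') as f:
    pickle.dump({'key': 1}, f)
restored = data_restore(name)
os.remove(name)
gone = data_restore(name)
```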
diff --git a/BaseTools/Source/Python/Common/TargetTxtClassObject.py b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
index f8459c892e36..6f5e5f0d173d 100644
--- a/BaseTools/Source/Python/Common/TargetTxtClassObject.py
+++ b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
@@ -77,16 +77,14 @@ class TargetTxtClassObject(object):
     # @retval 1 Open file failed
     #
     def ConvertTextFileToDict(self, FileName, CommentCharacter, KeySplitCharacter):
-        F = None
+        self.ConfDirectoryPath = os.path.dirname(FileName)
         try:
-            F = open(FileName, 'r')
-            self.ConfDirectoryPath = os.path.dirname(FileName)
+            with open(FileName, 'r') as F:
+                Lines = F.readlines()
         except:
             EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=FileName)
-            if F is not None:
-                F.close()
 
-        for Line in F:
+        for Line in Lines:
             Line = Line.strip()
             if Line.startswith(CommentCharacter) or Line == '':
                 continue
@@ -131,10 +129,7 @@ class TargetTxtClassObject(object):
                     EdkLogger.error("build", FORMAT_INVALID, "Invalid number of [%s]: %s." % (Key, Value),
                                     File=FileName)
                 self.TargetTxtDictionary[Key] = Value
-            #elif Key not in GlobalData.gGlobalDefines:
-            #    GlobalData.gGlobalDefines[Key] = Value
 
-        F.close()
         return 0
 
 ## TargetTxtDict
diff --git a/BaseTools/Source/Python/Common/ToolDefClassObject.py b/BaseTools/Source/Python/Common/ToolDefClassObject.py
index 83359586b994..d464330d9f98 100644
--- a/BaseTools/Source/Python/Common/ToolDefClassObject.py
+++ b/BaseTools/Source/Python/Common/ToolDefClassObject.py
@@ -117,8 +117,8 @@ class ToolDefClassObject(object):
         FileContent = []
         if os.path.isfile(FileName):
             try:
-                F = open(FileName, 'r')
-                FileContent = F.readlines()
+                with open(FileName, 'r') as F:
+                    FileContent = F.readlines()
             except:
                 EdkLogger.error("tools_def.txt parser", FILE_OPEN_FAILURE, ExtraData=FileName)
         else:
diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index 32895deb5d0c..7d76f04d6c31 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -153,12 +153,12 @@ class VpdInfoFile:
     #  @param FilePath The full path string for existing VPD PCD info file.
     def Read(self, FilePath):
         try:
-            fd = open(FilePath, "r")
+            with open(FilePath, "r") as fd:
+                Lines = fd.readlines()
         except:
             EdkLogger.error("VpdInfoFile", 
                             BuildToolError.FILE_OPEN_FAILURE, 
                             "Fail to open file %s for written." % FilePath)
-        Lines = fd.readlines()
         for Line in Lines:
             Line = Line.strip()
             if len(Line) == 0 or Line.startswith("#"):
diff --git a/BaseTools/Source/Python/Ecc/Ecc.py b/BaseTools/Source/Python/Ecc/Ecc.py
index 60dfc00260f1..554e29e6e0f8 100644
--- a/BaseTools/Source/Python/Ecc/Ecc.py
+++ b/BaseTools/Source/Python/Ecc/Ecc.py
@@ -1,7 +1,7 @@
 ## @file
 # This file is used to be the main entrance of ECC tool
 #
-# Copyright (c) 2009 - 2016, Intel Corporation. All rights reserved.<BR>
+# Copyright (c) 2009 - 2018, Intel Corporation. All rights reserved.<BR>
 # This program and the accompanying materials
 # are licensed and made available under the terms and conditions of the BSD License
 # which accompanies this distribution.  The full text of the license may be found at
@@ -201,7 +201,7 @@ class Ecc(object):
             for specificDir in SpecificDirs:    
                 ScanFolders.append(os.path.join(EccGlobalData.gTarget, specificDir))
         EdkLogger.quiet("Building database for meta data files ...")
-        Op = open(EccGlobalData.gConfig.MetaDataFileCheckPathOfGenerateFileList, 'w+')
+        Op = open(EccGlobalData.gConfig.MetaDataFileCheckPathOfGenerateFileList, 'w')
         #SkipDirs = Read from config file
         SkipDirs = EccGlobalData.gConfig.SkipDirList
         SkipDirString = string.join(SkipDirs, '|')
diff --git a/BaseTools/Source/Python/Eot/EotGlobalData.py b/BaseTools/Source/Python/Eot/EotGlobalData.py
index a9f51189c1eb..1b561a70e805 100644
--- a/BaseTools/Source/Python/Eot/EotGlobalData.py
+++ b/BaseTools/Source/Python/Eot/EotGlobalData.py
@@ -1,7 +1,7 @@
 ## @file
 # This file is used to save global datas
 #
-# Copyright (c) 2008 - 2014, Intel Corporation. All rights reserved.<BR>
+# Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
 # This program and the accompanying materials
 # are licensed and made available under the terms and conditions of the BSD License
 # which accompanies this distribution.  The full text of the license may be found at
@@ -38,27 +38,27 @@ gMACRO['CAPSULE_INF'] = ''
 
 # Log file for unmatched variables
 gUN_MATCHED_LOG = 'Log_UnMatched.log'
-gOP_UN_MATCHED = open(gUN_MATCHED_LOG, 'w+')
+gOP_UN_MATCHED = open(gUN_MATCHED_LOG, 'w')
 
 # Log file for all INF files
 gINF_FILES = 'Log_Inf_File.log'
-gOP_INF = open(gINF_FILES, 'w+')
+gOP_INF = open(gINF_FILES, 'w')
 
 # Log file for not dispatched PEIM/DRIVER
 gUN_DISPATCHED_LOG = 'Log_UnDispatched.log'
-gOP_UN_DISPATCHED = open(gUN_DISPATCHED_LOG, 'w+')
+gOP_UN_DISPATCHED = open(gUN_DISPATCHED_LOG, 'w')
 
 # Log file for unmatched variables in function calling
 gUN_MATCHED_IN_LIBRARY_CALLING_LOG = 'Log_UnMatchedInLibraryCalling.log'
-gOP_UN_MATCHED_IN_LIBRARY_CALLING = open(gUN_MATCHED_IN_LIBRARY_CALLING_LOG, 'w+')
+gOP_UN_MATCHED_IN_LIBRARY_CALLING = open(gUN_MATCHED_IN_LIBRARY_CALLING_LOG, 'w')
 
 # Log file for order of dispatched PEIM/DRIVER
 gDISPATCH_ORDER_LOG = 'Log_DispatchOrder.log'
-gOP_DISPATCH_ORDER = open(gDISPATCH_ORDER_LOG, 'w+')
+gOP_DISPATCH_ORDER = open(gDISPATCH_ORDER_LOG, 'w')
 
 # Log file for found source files
 gSOURCE_FILES = 'Log_SourceFiles.log'
-gOP_SOURCE_FILES = open(gSOURCE_FILES, 'w+')
+gOP_SOURCE_FILES = open(gSOURCE_FILES, 'w')
 
 # Dict for GUID found in DEC files
 gGuidDict = dict()
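The EotGlobalData hunks narrow `'w+'` (read/write) to `'w'` because these log files are only ever written — the least privilege the code needs. Under Python 3 (an assumption; this tree is Python 2, where the failure mode differs) a read on a write-only handle raises, which this sketch demonstrates:

```python
import io
import os
import tempfile

log_path = os.path.join(tempfile.mkdtemp(), 'Log_UnMatched.log')

# 'w' truncates the file and grants write access only, the minimum
# these log files need; 'w+' would also have allowed reads.
with open(log_path, 'w') as log:
    log.write('unmatched variable: gExample\n')
    try:
        log.seek(0)
        log.read()
        read_allowed = True
    except io.UnsupportedOperation:
        read_allowed = False
```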
diff --git a/BaseTools/Source/Python/Eot/FileProfile.py b/BaseTools/Source/Python/Eot/FileProfile.py
index 0544c0d55b44..bf6a4c054baa 100644
--- a/BaseTools/Source/Python/Eot/FileProfile.py
+++ b/BaseTools/Source/Python/Eot/FileProfile.py
@@ -1,7 +1,7 @@
 ## @file
 # fragments of source file
 #
-#  Copyright (c) 2007 - 2014, Intel Corporation. All rights reserved.<BR>
+#  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
 #
 #  This program and the accompanying materials
 #  are licensed and made available under the terms and conditions of the BSD License
@@ -49,11 +49,7 @@ class FileProfile :
         self.FileLinesList = []
         self.FileLinesListFromFile = []
         try:
-            fsock = open(FileName, "rb", 0)
-            try:
+            with open(FileName, "rb", 0) as fsock:
                 self.FileLinesListFromFile = fsock.readlines()
-            finally:
-                fsock.close()
-
         except IOError:
             raise Warning("Error when opening file %s" % FileName)
diff --git a/BaseTools/Source/Python/Eot/Report.py b/BaseTools/Source/Python/Eot/Report.py
index 7435b4d7c930..99b8b152180a 100644
--- a/BaseTools/Source/Python/Eot/Report.py
+++ b/BaseTools/Source/Python/Eot/Report.py
@@ -1,7 +1,7 @@
 ## @file
 # This file is used to create report for Eot tool
 #
-# Copyright (c) 2008 - 2014, Intel Corporation. All rights reserved.<BR>
+# Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
 # This program and the accompanying materials
 # are licensed and made available under the terms and conditions of the BSD License
 # which accompanies this distribution.  The full text of the license may be found at
@@ -33,10 +33,10 @@ class Report(object):
     #
     def __init__(self, ReportName = 'Report.html', FvObj = None, DispatchName=None):
         self.ReportName = ReportName
-        self.Op = open(ReportName, 'w+')
+        self.Op = open(ReportName, 'w')
         self.DispatchList = None
         if DispatchName:
-            self.DispatchList = open(DispatchName, 'w+')
+            self.DispatchList = open(DispatchName, 'w')
         self.FvObj = FvObj
         self.FfsIndex = 0
         self.PpiIndex = 0
diff --git a/BaseTools/Source/Python/GenFds/Capsule.py b/BaseTools/Source/Python/GenFds/Capsule.py
index fbd48f3c6d76..6aae2fcb7d97 100644
--- a/BaseTools/Source/Python/GenFds/Capsule.py
+++ b/BaseTools/Source/Python/GenFds/Capsule.py
@@ -137,9 +137,8 @@ class Capsule (CapsuleClassObject) :
             FileName = driver.GenCapsuleSubItem()
             FwMgrHdr.write(pack('=Q', PreSize))
             PreSize += os.path.getsize(FileName)
-            File = open(FileName, 'rb')
-            Content.write(File.read())
-            File.close()
+            with open(FileName, 'rb') as File:
+                Content.write(File.read())
         for fmp in self.FmpPayloadList:
             if fmp.Existed:
                 FwMgrHdr.write(pack('=Q', PreSize))
@@ -247,7 +246,7 @@ class Capsule (CapsuleClassObject) :
     def GenCapInf(self):
         self.CapInfFileName = os.path.join(GenFdsGlobalVariable.FvDir,
                                    self.UiCapsuleName +  "_Cap" + '.inf')
-        CapInfFile = StringIO.StringIO() #open (self.CapInfFileName , 'w+')
+        CapInfFile = StringIO.StringIO()
 
         CapInfFile.writelines("[options]" + T_CHAR_LF)
 
diff --git a/BaseTools/Source/Python/GenFds/CapsuleData.py b/BaseTools/Source/Python/GenFds/CapsuleData.py
index dd4c27bd15c7..9916bd4d2627 100644
--- a/BaseTools/Source/Python/GenFds/CapsuleData.py
+++ b/BaseTools/Source/Python/GenFds/CapsuleData.py
@@ -233,12 +233,10 @@ class CapsulePayload(CapsuleData):
         #
         # Append file content to the structure
         #
-        ImageFile = open(self.ImageFile, 'rb')
-        Buffer += ImageFile.read()
-        ImageFile.close()
+        with open(self.ImageFile, 'rb') as ImageFile:
+            Buffer += ImageFile.read()
         if self.VendorCodeFile:
-            VendorFile = open(self.VendorCodeFile, 'rb')
-            Buffer += VendorFile.read()
-            VendorFile.close()
+            with open(self.VendorCodeFile, 'rb') as VendorFile:
+                Buffer += VendorFile.read()
         self.Existed = True
         return Buffer
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index 2439d8ab9455..8d1a4b543f0e 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -155,16 +155,11 @@ class IncludeFileProfile :
         self.FileName = FileName
         self.FileLinesList = []
         try:
-            fsock = open(FileName, "rb", 0)
-            try:
+            with open(FileName, "rb", 0) as fsock:
                 self.FileLinesList = fsock.readlines()
-                for index, line in enumerate(self.FileLinesList):
-                    if not line.endswith('\n'):
-                        self.FileLinesList[index] += '\n'
-
-            finally:
-                fsock.close()
-
+            for index, line in enumerate(self.FileLinesList):
+                if not line.endswith('\n'):
+                    self.FileLinesList[index] += '\n'
         except:
             EdkLogger.error("FdfParser", FILE_OPEN_FAILURE, ExtraData=FileName)
 
@@ -216,16 +211,11 @@ class FileProfile :
     def __init__(self, FileName):
         self.FileLinesList = []
         try:
-            fsock = open(FileName, "rb", 0)
-            try:
+            with open(FileName, "rb", 0) as fsock:
                 self.FileLinesList = fsock.readlines()
-            finally:
-                fsock.close()
-
         except:
             EdkLogger.error("FdfParser", FILE_OPEN_FAILURE, ExtraData=FileName)
 
-
         self.PcdDict = {}
         self.InfList = []
         self.InfDict = {'ArchTBD':[]}
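In the FdfParser `IncludeFileProfile` hunk, the newline fix-up moves out of the `try` so only real I/O errors reach the `except`. The normalization itself, as a standalone sketch (the hunk mutates `self.FileLinesList` in place rather than building a new list):

```python
def ensure_trailing_newlines(lines):
    # A file's last line may lack '\n'; readlines() preserves that,
    # so append one where it is missing.
    return [line if line.endswith('\n') else line + '\n' for line in lines]

raw = ['DEFINE A = 1\n', 'DEFINE B = 2']  # final line unterminated
fixed = ensure_trailing_newlines(raw)
```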
diff --git a/BaseTools/Source/Python/GenFds/FfsFileStatement.py b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
index 871499d3d2ad..1449d363eac3 100644
--- a/BaseTools/Source/Python/GenFds/FfsFileStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
@@ -104,11 +104,10 @@ class FileStatement (FileStatementClassObject) :
                     MaxAlignValue = 1
                     for Index, File in enumerate(self.FileName):
                         try:
-                            f = open(File, 'rb')
+                            with open(File, 'rb') as f:
+                                Content = f.read()
                         except:
                             GenFdsGlobalVariable.ErrorLogger("Error opening RAW file %s." % (File))
-                        Content = f.read()
-                        f.close()
                         AlignValue = 1
                         if self.SubAlignment[Index] is not None:
                             AlignValue = GenFdsGlobalVariable.GetAlignment(self.SubAlignment[Index])
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index 29daba5a3a3e..d4b0611fc55a 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -163,8 +163,8 @@ class FV (FvClassObject):
                 NewFvInfo = open(FvInfoFileName, 'r').read()
             if NewFvInfo is not None and NewFvInfo != OrigFvInfo:
                 FvChildAddr = []
-                AddFileObj = open(FvInfoFileName, 'r')
-                AddrStrings = AddFileObj.readlines()
+                with open(FvInfoFileName, 'r') as AddFileObj:
+                    AddrStrings = AddFileObj.readlines()
                 AddrKeyFound = False
                 for AddrString in AddrStrings:
                     if AddrKeyFound:
@@ -172,7 +172,6 @@ class FV (FvClassObject):
                         FvChildAddr.append (AddrString)
                     elif AddrString.find ("[FV_BASE_ADDRESS]") != -1:
                         AddrKeyFound = True
-                AddFileObj.close()
 
                 if FvChildAddr != []:
                     # Update Ffs again
@@ -195,14 +194,14 @@ class FV (FvClassObject):
             # Write the Fv contents to Buffer
             #
             if os.path.isfile(FvOutputFile):
-                FvFileObj = open(FvOutputFile, 'rb')
+                with open(FvOutputFile, 'rb') as FvFileObj:
+                    Buffer.write(FvFileObj.read())
+                    FvFileObj.seek(0)
+                    # PI FvHeader is 0x48 byte
+                    FvHeaderBuffer = FvFileObj.read(0x48)
+
                 GenFdsGlobalVariable.VerboseLogger("\nGenerate %s FV Successfully" % self.UiFvName)
                 GenFdsGlobalVariable.SharpCounter = 0
-
-                Buffer.write(FvFileObj.read())
-                FvFileObj.seek(0)
-                # PI FvHeader is 0x48 byte
-                FvHeaderBuffer = FvFileObj.read(0x48)
                 # FV alignment position.
                 FvAlignmentValue = 1 << (ord(FvHeaderBuffer[0x2E]) & 0x1F)
                 if FvAlignmentValue >= 0x400:
@@ -217,7 +216,6 @@ class FV (FvClassObject):
                 else:
                     # FvAlignmentValue is less than 1K
                     self.FvAlignment = str (FvAlignmentValue)
-                FvFileObj.close()
                 GenFds.ImageBinDict[self.UiFvName.upper() + 'fv'] = FvOutputFile
                 GenFdsGlobalVariable.LargeFileInFvFlags.pop()
             else:
@@ -378,16 +376,15 @@ class FV (FvClassObject):
                     # check if the file path exists or not
                     if not os.path.isfile(FileFullPath):
                         GenFdsGlobalVariable.ErrorLogger("Error opening FV Extension Header Entry file %s." % (self.FvExtEntryData[Index]))
-                    FvExtFile = open (FileFullPath,'rb')
-                    FvExtFile.seek(0,2)
-                    Size = FvExtFile.tell()
-                    if Size >= 0x10000:
-                        GenFdsGlobalVariable.ErrorLogger("The size of FV Extension Header Entry file %s exceeds 0x10000." % (self.FvExtEntryData[Index]))
-                    TotalSize += (Size + 4)
-                    FvExtFile.seek(0)
-                    Buffer += pack('HH', (Size + 4), int(self.FvExtEntryTypeValue[Index], 16))
-                    Buffer += FvExtFile.read() 
-                    FvExtFile.close()
+                    with open (FileFullPath,'rb') as FvExtFile:
+                        FvExtFile.seek(0,2)
+                        Size = FvExtFile.tell()
+                        if Size >= 0x10000:
+                            GenFdsGlobalVariable.ErrorLogger("The size of FV Extension Header Entry file %s exceeds 0x10000." % (self.FvExtEntryData[Index]))
+                        TotalSize += (Size + 4)
+                        FvExtFile.seek(0)
+                        Buffer += pack('HH', (Size + 4), int(self.FvExtEntryTypeValue[Index], 16))
+                        Buffer += FvExtFile.read()
                 if self.FvExtEntryType[Index] == 'DATA':
                     ByteList = self.FvExtEntryData[Index].split(',')
                     Size = len (ByteList)
diff --git a/BaseTools/Source/Python/GenFds/FvImageSection.py b/BaseTools/Source/Python/GenFds/FvImageSection.py
index dc5dcb7f8e0d..9f808480e787 100644
--- a/BaseTools/Source/Python/GenFds/FvImageSection.py
+++ b/BaseTools/Source/Python/GenFds/FvImageSection.py
@@ -64,8 +64,8 @@ class FvImageSection(FvImageSectionClassObject):
                 FvAlignmentValue = 0
                 if os.path.isfile(FvFileName):
                     with open (FvFileName,'rb') as FvFileObj:
-                    # PI FvHeader is 0x48 byte
-                    FvHeaderBuffer = FvFileObj.read(0x48)
+                        # PI FvHeader is 0x48 byte
+                        FvHeaderBuffer = FvFileObj.read(0x48)
                     # FV alignment position.
                     FvAlignmentValue = 1 << (ord (FvHeaderBuffer[0x2E]) & 0x1F)
                 if FvAlignmentValue > MaxFvAlignment:
@@ -110,8 +110,8 @@ class FvImageSection(FvImageSectionClassObject):
                     FvFileName = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.FvFileName)
                     if os.path.isfile(FvFileName):
                         with open (FvFileName,'rb') as FvFileObj:
-                        # PI FvHeader is 0x48 byte
-                        FvHeaderBuffer = FvFileObj.read(0x48)
+                            # PI FvHeader is 0x48 byte
+                            FvHeaderBuffer = FvFileObj.read(0x48)
                         # FV alignment position.
                         FvAlignmentValue = 1 << (ord (FvHeaderBuffer[0x2E]) & 0x1F)
                         # FvAlignmentValue is larger than or equal to 1K
diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
index 6cf82526efd2..8537800bc2b2 100644
--- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
+++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
@@ -300,31 +300,27 @@ class GenFdsGlobalVariable:
         # Create FV Address inf file
         #
         GenFdsGlobalVariable.FvAddressFileName = os.path.join(GenFdsGlobalVariable.FfsDir, 'FvAddress.inf')
-        FvAddressFile = open(GenFdsGlobalVariable.FvAddressFileName, 'w')
-        #
-        # Add [Options]
-        #
-        FvAddressFile.writelines("[options]" + T_CHAR_LF)
         BsAddress = '0'
         for Arch in ArchList:
             if GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag].BsBaseAddress:
                 BsAddress = GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag].BsBaseAddress
                 break
-
-        FvAddressFile.writelines("EFI_BOOT_DRIVER_BASE_ADDRESS = " + \
-                                       BsAddress + \
-                                       T_CHAR_LF)
-
         RtAddress = '0'
         for Arch in ArchList:
             if GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag].RtBaseAddress:
                 RtAddress = GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag].RtBaseAddress
+        with open(GenFdsGlobalVariable.FvAddressFileName, 'w') as FvAddressFile:
+            #
+            # Add [Options]
+            #
+            FvAddressFile.writelines("[options]" + T_CHAR_LF)
+            FvAddressFile.writelines("EFI_BOOT_DRIVER_BASE_ADDRESS = " + \
+                                           BsAddress + \
+                                           T_CHAR_LF)
+            FvAddressFile.writelines("EFI_RUNTIME_DRIVER_BASE_ADDRESS = " + \
+                                           RtAddress + \
+                                           T_CHAR_LF)
 
-        FvAddressFile.writelines("EFI_RUNTIME_DRIVER_BASE_ADDRESS = " + \
-                                       RtAddress + \
-                                       T_CHAR_LF)
-
-        FvAddressFile.close()
 
     def SetEnv(FdfParser, WorkSpace, ArchList, GlobalData):
         GenFdsGlobalVariable.ModuleFile = WorkSpace.ModuleFile
@@ -361,11 +357,6 @@ class GenFdsGlobalVariable:
         # Create FV Address inf file
         #
         GenFdsGlobalVariable.FvAddressFileName = os.path.join(GenFdsGlobalVariable.FfsDir, 'FvAddress.inf')
-        FvAddressFile = open(GenFdsGlobalVariable.FvAddressFileName, 'w')
-        #
-        # Add [Options]
-        #
-        FvAddressFile.writelines("[options]" + T_CHAR_LF)
         BsAddress = '0'
         for Arch in ArchList:
             BsAddress = GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch,
@@ -373,11 +364,6 @@ class GenFdsGlobalVariable:
                                                                    GlobalData.gGlobalDefines["TOOL_CHAIN_TAG"]].BsBaseAddress
             if BsAddress:
                 break
-
-        FvAddressFile.writelines("EFI_BOOT_DRIVER_BASE_ADDRESS = " + \
-                                 BsAddress + \
-                                 T_CHAR_LF)
-
         RtAddress = '0'
         for Arch in ArchList:
             if GenFdsGlobalVariable.WorkSpace.BuildObject[
@@ -386,12 +372,17 @@ class GenFdsGlobalVariable:
                 RtAddress = GenFdsGlobalVariable.WorkSpace.BuildObject[
                     GenFdsGlobalVariable.ActivePlatform, Arch, GlobalData.gGlobalDefines['TARGET'],
                     GlobalData.gGlobalDefines["TOOL_CHAIN_TAG"]].RtBaseAddress
-
-        FvAddressFile.writelines("EFI_RUNTIME_DRIVER_BASE_ADDRESS = " + \
-                                 RtAddress + \
-                                 T_CHAR_LF)
-
-        FvAddressFile.close()
+        with open(GenFdsGlobalVariable.FvAddressFileName, 'w') as FvAddressFile:
+            #
+            # Add [Options]
+            #
+            FvAddressFile.writelines("[options]" + T_CHAR_LF)
+            FvAddressFile.writelines("EFI_BOOT_DRIVER_BASE_ADDRESS = " + \
+                                     BsAddress + \
+                                     T_CHAR_LF)
+            FvAddressFile.writelines("EFI_RUNTIME_DRIVER_BASE_ADDRESS = " + \
+                                     RtAddress + \
+                                     T_CHAR_LF)
 
     ## ReplaceWorkspaceMacro()
     #
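The GenFdsGlobalVariable hunks above replace a manual open/writelines/close sequence with a `with` block, so the handle is released even if a write raises. A minimal standalone sketch of the pattern (not the actual GenFds code; `T_CHAR_LF`, the path, and the address values are stand-ins):

```python
import os
import tempfile

T_CHAR_LF = '\n'  # stand-in for the BaseTools line-feed constant

def write_fv_address_file(path, bs_address, rt_address):
    # The file handle is closed automatically when the block exits,
    # including on exceptions -- no explicit close() call is needed.
    with open(path, 'w') as fv_address_file:
        fv_address_file.write("[options]" + T_CHAR_LF)
        fv_address_file.write("EFI_BOOT_DRIVER_BASE_ADDRESS = " + bs_address + T_CHAR_LF)
        fv_address_file.write("EFI_RUNTIME_DRIVER_BASE_ADDRESS = " + rt_address + T_CHAR_LF)

path = os.path.join(tempfile.mkdtemp(), 'FvAddress.inf')
write_fv_address_file(path, '0x10000', '0x20000')
with open(path) as f:
    print(f.read().splitlines()[0])  # -> [options]
```

One side effect of the reordering in SetEnv is that the file is now opened once, after both BsAddress and RtAddress are computed, instead of being held open across the two loops.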
diff --git a/BaseTools/Source/Python/GenFds/GuidSection.py b/BaseTools/Source/Python/GenFds/GuidSection.py
index bc95c7cd9d42..28571292f5a6 100644
--- a/BaseTools/Source/Python/GenFds/GuidSection.py
+++ b/BaseTools/Source/Python/GenFds/GuidSection.py
@@ -202,33 +202,28 @@ class GuidSection(GuidSectionClassObject) :
                 if not os.path.exists(TempFile) :
                     EdkLogger.error("GenFds", COMMAND_FAILURE, 'Fail to call %s, no output file was generated' % ExternalTool)
 
-                FileHandleIn = open(DummyFile, 'rb')
-                FileHandleIn.seek(0, 2)
-                InputFileSize = FileHandleIn.tell()
+                with open(DummyFile, 'rb') as FileHandleIn, open(TempFile, 'rb') as FileHandleOut:
+                    FileHandleIn.seek(0, 2)
+                    InputFileSize = FileHandleIn.tell()
+                    FileHandleOut.seek(0, 2)
+                    TempFileSize = FileHandleOut.tell()
 
-                FileHandleOut = open(TempFile, 'rb')
-                FileHandleOut.seek(0, 2)
-                TempFileSize = FileHandleOut.tell()
+                    Attribute = []
+                    HeaderLength = None
+                    if self.ExtraHeaderSize != -1:
+                        HeaderLength = str(self.ExtraHeaderSize)
 
-                Attribute = []
-                HeaderLength = None
-                if self.ExtraHeaderSize != -1:
-                    HeaderLength = str(self.ExtraHeaderSize)
-
-                if self.ProcessRequired == "NONE" and HeaderLength is None:
-                    if TempFileSize > InputFileSize:
-                        FileHandleIn.seek(0)
-                        BufferIn = FileHandleIn.read()
-                        FileHandleOut.seek(0)
-                        BufferOut = FileHandleOut.read()
-                        if BufferIn == BufferOut[TempFileSize - InputFileSize:]:
-                            HeaderLength = str(TempFileSize - InputFileSize)
-                    #auto sec guided attribute with process required
-                    if HeaderLength is None:
-                        Attribute.append('PROCESSING_REQUIRED')
-
-                FileHandleIn.close()
-                FileHandleOut.close()
+                    if self.ProcessRequired == "NONE" and HeaderLength is None:
+                        if TempFileSize > InputFileSize:
+                            FileHandleIn.seek(0)
+                            BufferIn = FileHandleIn.read()
+                            FileHandleOut.seek(0)
+                            BufferOut = FileHandleOut.read()
+                            if BufferIn == BufferOut[TempFileSize - InputFileSize:]:
+                                HeaderLength = str(TempFileSize - InputFileSize)
+                        #auto sec guided attribute with process required
+                        if HeaderLength is None:
+                            Attribute.append('PROCESSING_REQUIRED')
 
                 if FirstCall and 'PROCESSING_REQUIRED' in Attribute:
                     # Guided data by -z option on first call is the process required data. Call the guided tool with the real option.
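The GuidSection hunk opens both files in a single `with` statement (supported since Python 2.7), so the input and output handles share one scope and close together. A simplified sketch of the size/header comparison it performs (the file names and payloads here are invented for illustration):

```python
import os
import tempfile

d = tempfile.mkdtemp()
dummy = os.path.join(d, 'dummy.bin')
temp = os.path.join(d, 'temp.bin')
with open(dummy, 'wb') as f:
    f.write(b'PAYLOAD')        # pretend tool input
with open(temp, 'wb') as f:
    f.write(b'HDRPAYLOAD')     # pretend tool output: 3-byte header + input

# Sizes are taken by seeking to the end, as the patch does; if the output
# ends with an exact copy of the input, the difference is the header length.
with open(dummy, 'rb') as fin, open(temp, 'rb') as fout:
    fin.seek(0, 2)
    in_size = fin.tell()
    fout.seek(0, 2)
    out_size = fout.tell()
    header_len = None
    if out_size > in_size:
        fin.seek(0)
        fout.seek(0)
        if fin.read() == fout.read()[out_size - in_size:]:
            header_len = out_size - in_size

print(header_len)  # -> 3
```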
diff --git a/BaseTools/Source/Python/GenFds/Region.py b/BaseTools/Source/Python/GenFds/Region.py
index 9d632b6321e2..e67d056cc178 100644
--- a/BaseTools/Source/Python/GenFds/Region.py
+++ b/BaseTools/Source/Python/GenFds/Region.py
@@ -159,9 +159,8 @@ class Region(RegionClassObject):
                             EdkLogger.error("GenFds", GENFDS_ERROR,
                                             "Size of FV File (%s) is larger than Region Size 0x%X specified." \
                                             % (RegionData, Size))
-                        BinFile = open(FileName, 'rb')
-                        Buffer.write(BinFile.read())
-                        BinFile.close()
+                        with open(FileName, 'rb') as BinFile:
+                            Buffer.write(BinFile.read())
                         Size = Size - FileLength
             #
             # Pad the left buffer
@@ -213,9 +212,8 @@ class Region(RegionClassObject):
                     EdkLogger.error("GenFds", GENFDS_ERROR,
                                     "Size 0x%X of Capsule File (%s) is larger than Region Size 0x%X specified." \
                                     % (FileLength, RegionData, Size))
-                BinFile = open(FileName, 'rb')
-                Buffer.write(BinFile.read())
-                BinFile.close()
+                with open(FileName, 'rb') as BinFile:
+                    Buffer.write(BinFile.read())
                 Size = Size - FileLength
             #
             # Pad the left buffer
@@ -245,9 +243,8 @@ class Region(RegionClassObject):
                                     "Size of File (%s) is larger than Region Size 0x%X specified." \
                                     % (RegionData, Size))
                 GenFdsGlobalVariable.InfLogger('   Region File Name = %s' % RegionData)
-                BinFile = open(RegionData, 'rb')
-                Buffer.write(BinFile.read())
-                BinFile.close()
+                with open(RegionData, 'rb') as BinFile:
+                    Buffer.write(BinFile.read())
                 Size = Size - FileLength
             #
             # Pad the left buffer
diff --git a/BaseTools/Source/Python/GenFds/Vtf.py b/BaseTools/Source/Python/GenFds/Vtf.py
index 18ea37b9afdd..291070827b78 100644
--- a/BaseTools/Source/Python/GenFds/Vtf.py
+++ b/BaseTools/Source/Python/GenFds/Vtf.py
@@ -1,7 +1,7 @@
 ## @file
 # process VTF generation
 #
-#  Copyright (c) 2007 - 2014, Intel Corporation. All rights reserved.<BR>
+#  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
 #
 #  This program and the accompanying materials
 #  are licensed and made available under the terms and conditions of the BSD License
@@ -67,81 +67,79 @@ class Vtf (VtfClassObject):
     def GenBsfInf (self):
         FvList = self.GetFvList()
         self.BsfInfName = os.path.join(GenFdsGlobalVariable.FvDir, self.UiName + '.inf')
-        BsfInf = open(self.BsfInfName, 'w+')
-        if self.ResetBin is not None:
-            BsfInf.writelines ("[OPTIONS]" + T_CHAR_LF)
-            BsfInf.writelines ("IA32_RST_BIN" + \
-                               " = " + \
-                               GenFdsGlobalVariable.MacroExtend(GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.ResetBin)) + \
-                               T_CHAR_LF)
-            BsfInf.writelines (T_CHAR_LF)
-
-        BsfInf.writelines ("[COMPONENTS]" + T_CHAR_LF)
-
-        for ComponentObj in self.ComponentStatementList :
-            BsfInf.writelines ("COMP_NAME" + \
-                               " = " + \
-                               ComponentObj.CompName + \
-                               T_CHAR_LF)
-            if ComponentObj.CompLoc.upper() == 'NONE':
-                BsfInf.writelines ("COMP_LOC" + \
+        with open(self.BsfInfName, 'w') as BsfInf:
+            if self.ResetBin is not None:
+                BsfInf.writelines ("[OPTIONS]" + T_CHAR_LF)
+                BsfInf.writelines ("IA32_RST_BIN" + \
                                    " = " + \
-                                   'N' + \
+                                   GenFdsGlobalVariable.MacroExtend(GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.ResetBin)) + \
                                    T_CHAR_LF)
+                BsfInf.writelines (T_CHAR_LF)
+
+            BsfInf.writelines ("[COMPONENTS]" + T_CHAR_LF)
 
-            elif ComponentObj.FilePos is not None:
-                BsfInf.writelines ("COMP_LOC" + \
+            for ComponentObj in self.ComponentStatementList :
+                BsfInf.writelines ("COMP_NAME" + \
                                    " = " + \
-                                   ComponentObj.FilePos + \
+                                   ComponentObj.CompName + \
                                    T_CHAR_LF)
-            else:
-                Index = FvList.index(ComponentObj.CompLoc.upper())
-                if Index == 0:
+                if ComponentObj.CompLoc.upper() == 'NONE':
                     BsfInf.writelines ("COMP_LOC" + \
                                        " = " + \
-                                       'F' + \
+                                       'N' + \
                                        T_CHAR_LF)
-                elif Index == 1:
+
+                elif ComponentObj.FilePos is not None:
                     BsfInf.writelines ("COMP_LOC" + \
                                        " = " + \
-                                       'S' + \
+                                       ComponentObj.FilePos + \
                                        T_CHAR_LF)
+                else:
+                    Index = FvList.index(ComponentObj.CompLoc.upper())
+                    if Index == 0:
+                        BsfInf.writelines ("COMP_LOC" + \
+                                           " = " + \
+                                           'F' + \
+                                           T_CHAR_LF)
+                    elif Index == 1:
+                        BsfInf.writelines ("COMP_LOC" + \
+                                           " = " + \
+                                           'S' + \
+                                           T_CHAR_LF)
 
-            BsfInf.writelines ("COMP_TYPE" + \
-                               " = " + \
-                               ComponentObj.CompType + \
-                               T_CHAR_LF)
-            BsfInf.writelines ("COMP_VER" + \
-                               " = " + \
-                               ComponentObj.CompVer + \
-                               T_CHAR_LF)
-            BsfInf.writelines ("COMP_CS" + \
-                               " = " + \
-                               ComponentObj.CompCs + \
-                               T_CHAR_LF)
-
-            BinPath = ComponentObj.CompBin
-            if BinPath != '-':
-                BinPath = GenFdsGlobalVariable.MacroExtend(GenFdsGlobalVariable.ReplaceWorkspaceMacro(BinPath))
-            BsfInf.writelines ("COMP_BIN" + \
-                               " = " + \
-                               BinPath + \
-                               T_CHAR_LF)
+                BsfInf.writelines ("COMP_TYPE" + \
+                                   " = " + \
+                                   ComponentObj.CompType + \
+                                   T_CHAR_LF)
+                BsfInf.writelines ("COMP_VER" + \
+                                   " = " + \
+                                   ComponentObj.CompVer + \
+                                   T_CHAR_LF)
+                BsfInf.writelines ("COMP_CS" + \
+                                   " = " + \
+                                   ComponentObj.CompCs + \
+                                   T_CHAR_LF)
 
-            SymPath = ComponentObj.CompSym
-            if SymPath != '-':
-                SymPath = GenFdsGlobalVariable.MacroExtend(GenFdsGlobalVariable.ReplaceWorkspaceMacro(SymPath))
-            BsfInf.writelines ("COMP_SYM" + \
-                               " = " + \
-                               SymPath + \
-                               T_CHAR_LF)
-            BsfInf.writelines ("COMP_SIZE" + \
-                               " = " + \
-                               ComponentObj.CompSize + \
-                               T_CHAR_LF)
-            BsfInf.writelines (T_CHAR_LF)
+                BinPath = ComponentObj.CompBin
+                if BinPath != '-':
+                    BinPath = GenFdsGlobalVariable.MacroExtend(GenFdsGlobalVariable.ReplaceWorkspaceMacro(BinPath))
+                BsfInf.writelines ("COMP_BIN" + \
+                                   " = " + \
+                                   BinPath + \
+                                   T_CHAR_LF)
 
-        BsfInf.close()
+                SymPath = ComponentObj.CompSym
+                if SymPath != '-':
+                    SymPath = GenFdsGlobalVariable.MacroExtend(GenFdsGlobalVariable.ReplaceWorkspaceMacro(SymPath))
+                BsfInf.writelines ("COMP_SYM" + \
+                                   " = " + \
+                                   SymPath + \
+                                   T_CHAR_LF)
+                BsfInf.writelines ("COMP_SIZE" + \
+                                   " = " + \
+                                   ComponentObj.CompSize + \
+                                   T_CHAR_LF)
+                BsfInf.writelines (T_CHAR_LF)
 
     ## GenFvList() method
     #
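Note that the Vtf.py hunks above still build each line by `"KEY" + " = " + value + T_CHAR_LF` concatenation inside `writelines` calls; only the open/close handling changed. Given the cover letter's goal of reducing string reallocation, a hypothetical follow-up (not part of this patch) could format each line once instead:

```python
T_CHAR_LF = '\n'  # stand-in for the BaseTools line-feed constant

def option_line(key, value):
    # Single format operation per line instead of three concatenations.
    # writelines() accepts a lone string only because strings are iterable;
    # write() states the intent more clearly.
    return "%s = %s%s" % (key, value, T_CHAR_LF)

print(option_line("COMP_NAME", "FV_MAIN"), end='')  # prints: COMP_NAME = FV_MAIN
```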
diff --git a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
index f40c8bd01b23..aa61bc00f277 100644
--- a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
+++ b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
@@ -46,13 +46,13 @@ def parsePcdInfoFromMapFile(mapfilepath, efifilepath):
     """
     lines = []
     try:
-        f = open(mapfilepath, 'r')
-        lines = f.readlines()
-        f.close()
+        with open(mapfilepath, 'r') as f:
+            lines = f.readlines()
     except:
         return None
     
-    if len(lines) == 0: return None
+    if len(lines) == 0: 
+        return None
     firstline = lines[0].strip()
     if (firstline.startswith("Archive member included ") and
         firstline.endswith(" file (symbol)")):
@@ -190,18 +190,13 @@ def _parseGeneral(lines, efifilepath):
     
 def generatePcdTable(list, pcdpath):
     try:
-        f = open(pcdpath, 'w')
+        with open(pcdpath, 'w') as f:
+            f.write('PCD Name                       Offset    Section Name\r\n')
+            for pcditem in list:
+                f.write('%-30s 0x%-08X %-6s\r\n' % (pcditem[0], pcditem[1], pcditem[2]))
     except:
         pass
 
-    f.write('PCD Name                       Offset    Section Name\r\n')
-    
-    for pcditem in list:
-        f.write('%-30s 0x%-08X %-6s\r\n' % (pcditem[0], pcditem[1], pcditem[2]))
-    f.close()
-
-    #print 'Success to generate Binary Patch PCD table at %s!' % pcdpath 
-
 if __name__ == '__main__':
     UsageString = "%prog -m <MapFile> -e <EfiFile> -o <OutFile>"
     AdditionalNotes = "\nPCD table is generated in file name with .BinaryPcdTable.txt postfix"
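The GenPatchPcdTable hunk keeps the original error policy: any failure while reading the map file yields `None` rather than an exception, and the `with` block now guarantees the handle is closed on both paths. A sketch of that shape (the broad `except` mirrors the patch; narrowing it to `(IOError, OSError)` here is this sketch's choice, not the patch's):

```python
def read_map_lines(mapfilepath):
    # Any open/read failure reports None to the caller, as
    # parsePcdInfoFromMapFile does; the handle closes either way.
    try:
        with open(mapfilepath, 'r') as f:
            lines = f.readlines()
    except (IOError, OSError):
        return None
    if len(lines) == 0:
        return None  # an empty map file is also treated as "no data"
    return lines

print(read_map_lines('/nonexistent/path.map'))  # -> None
```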
diff --git a/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py b/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
index cf2fc7c4f70a..76fef41176ac 100644
--- a/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
+++ b/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
@@ -49,10 +49,9 @@ def PatchBinaryFile(FileName, ValueOffset, TypeName, ValueString, MaxSize=0):
     #
     # Length of Binary File
     #
-    FileHandle = open(FileName, 'rb')
-    FileHandle.seek (0, 2)
-    FileLength = FileHandle.tell()
-    FileHandle.close()
+    with open(FileName, 'rb') as FileHandle:
+        FileHandle.seek (0, 2)
+        FileLength = FileHandle.tell()
     #
     # Unify string to upper string
     #
@@ -85,10 +84,9 @@ def PatchBinaryFile(FileName, ValueOffset, TypeName, ValueString, MaxSize=0):
     #
     # Read binary file into array
     #
-    FileHandle = open(FileName, 'rb')
-    ByteArray = array.array('B')
-    ByteArray.fromfile(FileHandle, FileLength)
-    FileHandle.close()
+    with open(FileName, 'rb') as FileHandle:
+        ByteArray = array.array('B')
+        ByteArray.fromfile(FileHandle, FileLength)
     OrigByteList = ByteArray.tolist()
     ByteList = ByteArray.tolist()
     #
@@ -193,9 +191,8 @@ def PatchBinaryFile(FileName, ValueOffset, TypeName, ValueString, MaxSize=0):
     if ByteList != OrigByteList:
         ByteArray = array.array('B')
         ByteArray.fromlist(ByteList)
-        FileHandle = open(FileName, 'wb')
-        ByteArray.tofile(FileHandle)
-        FileHandle.close()
+        with open(FileName, 'wb') as FileHandle:
+            ByteArray.tofile(FileHandle)
     return 0, "Patch Value into File %s successfully." % (FileName)
 
 ## Parse command line options
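The PatchPcdValue hunks split the work into three short-lived `with` blocks: measure the file, load it into a byte array, and write it back only if the patched bytes differ from the original. A self-contained sketch of that flow (the file contents and patch offset are invented):

```python
import array
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), 'image.bin')
with open(path, 'wb') as f:
    f.write(bytes(range(8)))  # pretend 8-byte firmware image

# 1) Length of binary file.
with open(path, 'rb') as f:
    f.seek(0, 2)
    length = f.tell()

# 2) Read binary file into array.
with open(path, 'rb') as f:
    data = array.array('B')
    data.fromfile(f, length)

# 3) Patch in memory; rewrite the file only on an actual change,
#    following the PatchBinaryFile flow.
orig = data.tolist()
data[4] = 0xAA  # hypothetical patch at offset 4
if data.tolist() != orig:
    with open(path, 'wb') as f:
        data.tofile(f)

with open(path, 'rb') as f:
    print(f.read()[4])  # -> 170
```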
diff --git a/BaseTools/Source/Python/Table/TableReport.py b/BaseTools/Source/Python/Table/TableReport.py
index 4af0e98d86b4..145c66c4b415 100644
--- a/BaseTools/Source/Python/Table/TableReport.py
+++ b/BaseTools/Source/Python/Table/TableReport.py
@@ -1,7 +1,7 @@
 ## @file
 # This file is used to create/update/query/erase table for ECC reports
 #
-# Copyright (c) 2008 - 2015, Intel Corporation. All rights reserved.<BR>
+# Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
 # This program and the accompanying materials
 # are licensed and made available under the terms and conditions of the BSD License
 # which accompanies this distribution.  The full text of the license may be found at
@@ -100,31 +100,30 @@ class TableReport(Table):
     #
     def ToCSV(self, Filename='Report.csv'):
         try:
-            File = open(Filename, 'w+')
-            File.write("""No, Error Code, Error Message, File, LineNo, Other Error Message\n""")
-            RecordSet = self.Query()
-            Index = 0
-            for Record in RecordSet:
-                Index = Index + 1
-                ErrorID = Record[1]
-                OtherMsg = Record[2]
-                BelongsToTable = Record[3]
-                BelongsToItem = Record[4]
-                IsCorrected = Record[5]
-                SqlCommand = ''
-                if BelongsToTable == 'File':
-                    SqlCommand = """select 1, FullPath from %s where ID = %s
-                             """ % (BelongsToTable, BelongsToItem)
-                else:
-                    SqlCommand = """select A.StartLine, B.FullPath from %s as A, File as B
-                                    where A.ID = %s and B.ID = A.BelongsToFile
+            with open(Filename, 'w') as File:
+                File.write("""No, Error Code, Error Message, File, LineNo, Other Error Message\n""")
+                RecordSet = self.Query()
+                Index = 0
+                for Record in RecordSet:
+                    Index = Index + 1
+                    ErrorID = Record[1]
+                    OtherMsg = Record[2]
+                    BelongsToTable = Record[3]
+                    BelongsToItem = Record[4]
+                    IsCorrected = Record[5]
+                    SqlCommand = ''
+                    if BelongsToTable == 'File':
+                        SqlCommand = """select 1, FullPath from %s where ID = %s
                                  """ % (BelongsToTable, BelongsToItem)
-                NewRecord = self.Exec(SqlCommand)
-                if NewRecord != []:
-                    File.write("""%s,%s,"%s",%s,%s,"%s"\n""" % (Index, ErrorID, EccToolError.gEccErrorMessage[ErrorID], NewRecord[0][1], NewRecord[0][0], OtherMsg))
-                    EdkLogger.quiet("%s(%s): [%s]%s %s" % (NewRecord[0][1], NewRecord[0][0], ErrorID, EccToolError.gEccErrorMessage[ErrorID], OtherMsg))
+                    else:
+                        SqlCommand = """select A.StartLine, B.FullPath from %s as A, File as B
+                                        where A.ID = %s and B.ID = A.BelongsToFile
+                                     """ % (BelongsToTable, BelongsToItem)
+                    NewRecord = self.Exec(SqlCommand)
+                    if NewRecord != []:
+                        File.write("""%s,%s,"%s",%s,%s,"%s"\n""" % (Index, ErrorID, EccToolError.gEccErrorMessage[ErrorID], NewRecord[0][1], NewRecord[0][0], OtherMsg))
+                        EdkLogger.quiet("%s(%s): [%s]%s %s" % (NewRecord[0][1], NewRecord[0][0], ErrorID, EccToolError.gEccErrorMessage[ErrorID], OtherMsg))
 
-            File.close()
         except IOError:
             NewFilename = 'Report_' + time.strftime("%Y%m%d_%H%M%S.csv", time.localtime())
             EdkLogger.warn("ECC", "The report file %s is locked by other progress, use %s instead!" % (Filename, NewFilename))
diff --git a/BaseTools/Source/Python/TargetTool/TargetTool.py b/BaseTools/Source/Python/TargetTool/TargetTool.py
index ecac316b7a3a..5c463df6bce5 100644
--- a/BaseTools/Source/Python/TargetTool/TargetTool.py
+++ b/BaseTools/Source/Python/TargetTool/TargetTool.py
@@ -58,22 +58,21 @@ class TargetTool():
     def ConvertTextFileToDict(self, FileName, CommentCharacter, KeySplitCharacter):
         """Convert a text file to a dictionary of (name:value) pairs."""
         try:
-            f = open(FileName,'r')
-            for Line in f:
-                if Line.startswith(CommentCharacter) or Line.strip() == '':
-                    continue
-                LineList = Line.split(KeySplitCharacter,1)
-                if len(LineList) >= 2:
-                    Key = LineList[0].strip()
-                    if Key.startswith(CommentCharacter) == False and Key in self.TargetTxtDictionary:
-                        if Key == TAB_TAT_DEFINES_ACTIVE_PLATFORM or Key == TAB_TAT_DEFINES_TOOL_CHAIN_CONF \
-                          or Key == TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER \
-                          or Key == TAB_TAT_DEFINES_ACTIVE_MODULE:
-                            self.TargetTxtDictionary[Key] = LineList[1].replace('\\', '/').strip()
-                        elif Key == TAB_TAT_DEFINES_TARGET or Key == TAB_TAT_DEFINES_TARGET_ARCH \
-                          or Key == TAB_TAT_DEFINES_TOOL_CHAIN_TAG or Key == TAB_TAT_DEFINES_BUILD_RULE_CONF:
-                            self.TargetTxtDictionary[Key] = LineList[1].split()
-            f.close()
+            with open(FileName,'r') as f:
+                for Line in f:
+                    if Line.startswith(CommentCharacter) or Line.strip() == '':
+                        continue
+                    LineList = Line.split(KeySplitCharacter,1)
+                    if len(LineList) >= 2:
+                        Key = LineList[0].strip()
+                        if Key.startswith(CommentCharacter) == False and Key in self.TargetTxtDictionary:
+                            if Key == TAB_TAT_DEFINES_ACTIVE_PLATFORM or Key == TAB_TAT_DEFINES_TOOL_CHAIN_CONF \
+                              or Key == TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER \
+                              or Key == TAB_TAT_DEFINES_ACTIVE_MODULE:
+                                self.TargetTxtDictionary[Key] = LineList[1].replace('\\', '/').strip()
+                            elif Key == TAB_TAT_DEFINES_TARGET or Key == TAB_TAT_DEFINES_TARGET_ARCH \
+                              or Key == TAB_TAT_DEFINES_TOOL_CHAIN_TAG or Key == TAB_TAT_DEFINES_BUILD_RULE_CONF:
+                                self.TargetTxtDictionary[Key] = LineList[1].split()
             return 0
         except:
             last_type, last_value, last_tb = sys.exc_info()
@@ -94,42 +93,38 @@ class TargetTool():
             
     def RWFile(self, CommentCharacter, KeySplitCharacter, Num):
         try:
-            fr = open(self.FileName, 'r')
-            fw = open(os.path.normpath(os.path.join(self.WorkSpace, 'Conf\\targetnew.txt')), 'w')
-
-            existKeys = []
-            for Line in fr:
-                if Line.startswith(CommentCharacter) or Line.strip() == '':
-                    fw.write(Line)
-                else:
-                    LineList = Line.split(KeySplitCharacter,1)
-                    if len(LineList) >= 2:
-                        Key = LineList[0].strip()
-                        if Key.startswith(CommentCharacter) == False and Key in self.TargetTxtDictionary:
-                            if Key not in existKeys:
-                                existKeys.append(Key)
-                            else:
-                                print "Warning: Found duplicate key item in original configuration files!"
-                                
-                            if Num == 0:
-                                Line = "%-30s = \n" % Key
-                            else:
-                                ret = GetConfigureKeyValue(self, Key)
-                                if ret is not None:
-                                    Line = ret
-                            fw.write(Line)
-            for key in self.TargetTxtDictionary:
-                if key not in existKeys:
-                    print "Warning: %s does not exist in original configuration file" % key
-                    Line = GetConfigureKeyValue(self, key)
-                    if Line is None:
-                        Line = "%-30s = " % key
-                    fw.write(Line)
+            with open(self.FileName, 'r') as fr:
+                FileRead = fr.readlines()
+            with open(self.FileName, 'w') as fw:
+                existKeys = []
+                for Line in FileRead:
+                    if Line.startswith(CommentCharacter) or Line.strip() == '':
+                        fw.write(Line)
+                    else:
+                        LineList = Line.split(KeySplitCharacter,1)
+                        if len(LineList) >= 2:
+                            Key = LineList[0].strip()
+                            if Key.startswith(CommentCharacter) == False and Key in self.TargetTxtDictionary:
+                                if Key not in existKeys:
+                                    existKeys.append(Key)
+                                else:
+                                    print "Warning: Found duplicate key item in original configuration files!"
+                                    
+                                if Num == 0:
+                                    Line = "%-30s = \n" % Key
+                                else:
+                                    ret = GetConfigureKeyValue(self, Key)
+                                    if ret is not None:
+                                        Line = ret
+                                fw.write(Line)
+                for key in self.TargetTxtDictionary:
+                    if key not in existKeys:
+                        print "Warning: %s does not exist in original configuration file" % key
+                        Line = GetConfigureKeyValue(self, key)
+                        if Line is None:
+                            Line = "%-30s = " % key
+                        fw.write(Line)
                 
-            fr.close()
-            fw.close()
-            os.remove(self.FileName)
-            os.rename(os.path.normpath(os.path.join(self.WorkSpace, 'Conf\\targetnew.txt')), self.FileName)
             
         except:
             last_type, last_value, last_tb = sys.exc_info()
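The RWFile rework above changes more than the open/close style: the old code streamed the input into `Conf\targetnew.txt` and then renamed it over the original, while the new code reads every line into memory first and reopens the same path for writing. A minimal sketch of that read-then-rewrite-in-place pattern (file name and key are invented):

```python
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), 'target.txt')
with open(path, 'w') as f:
    f.write("# comment\nACTIVE_PLATFORM = Old.dsc\n")

# Read everything first so the same path can safely be reopened in 'w'
# mode; the temp-file-plus-os.rename step is no longer needed.
with open(path, 'r') as fr:
    lines = fr.readlines()
with open(path, 'w') as fw:
    for line in lines:
        if line.startswith('ACTIVE_PLATFORM'):
            line = 'ACTIVE_PLATFORM = New.dsc\n'  # hypothetical updated value
        fw.write(line)

with open(path) as f:
    print('New.dsc' in f.read())  # -> True
```

The trade-off is worth noting: the rename approach was closer to an atomic replace, whereas an in-place rewrite can lose the file if the process dies mid-write; in exchange, the new code drops the Windows-specific `Conf\\targetnew.txt` temp path entirely.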
diff --git a/BaseTools/Source/Python/Trim/Trim.py b/BaseTools/Source/Python/Trim/Trim.py
index d2e6d317676c..a92df52979c6 100644
--- a/BaseTools/Source/Python/Trim/Trim.py
+++ b/BaseTools/Source/Python/Trim/Trim.py
@@ -141,14 +141,12 @@ gIncludedAslFile = []
 def TrimPreprocessedFile(Source, Target, ConvertHex, TrimLong):
     CreateDirectory(os.path.dirname(Target))
     try:
-        f = open (Source, 'r')
+        with open (Source, 'r') as f:
+            # read whole file
+            Lines = f.readlines()
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Source)
 
-    # read whole file
-    Lines = f.readlines()
-    f.close()
-
     PreprocessedFile = ""
     InjectedFile = ""
     LineIndexOfOriginalFile = None
@@ -243,11 +241,10 @@ def TrimPreprocessedFile(Source, Target, ConvertHex, TrimLong):
 
     # save to file
     try:
-        f = open (Target, 'wb')
+        with open (Target, 'wb') as f:
+            f.writelines(NewLines)
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
-    f.writelines(NewLines)
-    f.close()
 
 ## Trim preprocessed VFR file
 #
@@ -261,12 +258,11 @@ def TrimPreprocessedVfr(Source, Target):
     CreateDirectory(os.path.dirname(Target))
     
     try:
-        f = open (Source,'r')
+        with open (Source,'r') as f:
+            # read whole file
+            Lines = f.readlines()
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Source)
-    # read whole file
-    Lines = f.readlines()
-    f.close()
 
     FoundTypedef = False
     Brace = 0
@@ -310,11 +306,10 @@ def TrimPreprocessedVfr(Source, Target):
 
     # save all lines trimmed
     try:
-        f = open (Target,'w')
+        with open (Target,'w') as f:
+            f.writelines(Lines)
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
-    f.writelines(Lines)
-    f.close()
 
 ## Read the content  ASL file, including ASL included, recursively
 #
@@ -340,7 +335,8 @@ def DoInclude(Source, Indent='', IncludePathList=[], LocalSearchPath=None):
         for IncludePath in SearchPathList:
             IncludeFile = os.path.join(IncludePath, Source)
             if os.path.isfile(IncludeFile):
-                F = open(IncludeFile, "r")
+                with open(IncludeFile, "r") as OpenFile:
+                    FileLines = OpenFile.readlines()
                 break
         else:
             EdkLogger.error("Trim", "Failed to find include file %s" % Source)
@@ -356,7 +352,7 @@ def DoInclude(Source, Indent='', IncludePathList=[], LocalSearchPath=None):
         return []
     gIncludedAslFile.append(IncludeFile)
     
-    for Line in F:
+    for Line in FileLines:
         LocalSearchPath = None
         Result = gAslIncludePattern.findall(Line)
         if len(Result) == 0:
@@ -375,7 +371,6 @@ def DoInclude(Source, Indent='', IncludePathList=[], LocalSearchPath=None):
         NewFileContent.append("\n")
 
     gIncludedAslFile.pop()
-    F.close()
 
     return NewFileContent
 
@@ -425,12 +420,11 @@ def TrimAslFile(Source, Target, IncludePathFile):
 
     # save all lines trimmed
     try:
-        f = open (Target,'w')
+        with open (Target,'w') as f:
+            f.writelines(Lines)
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
 
-    f.writelines(Lines)
-    f.close()
 
 def GenerateVfrBinSec(ModuleName, DebugDir, OutputFile):
     VfrNameList = []
@@ -450,11 +444,6 @@ def GenerateVfrBinSec(ModuleName, DebugDir, OutputFile):
     if not VfrUniOffsetList:
         return
 
-    try:
-        fInputfile = open(OutputFile, "wb+", 0)
-    except:
-        EdkLogger.error("Trim", FILE_OPEN_FAILURE, "File open failed for %s" %OutputFile, None)
-
     # Use a instance of StringIO to cache data
     fStringIO = StringIO.StringIO('')
 
@@ -483,16 +472,16 @@ def GenerateVfrBinSec(ModuleName, DebugDir, OutputFile):
             VfrValue = pack ('Q', int (Item[1], 16))
             fStringIO.write (VfrValue)
 
-    #
-    # write data into file.
-    #
-    try :
-        fInputfile.write (fStringIO.getvalue())
+    try:
+        with open(OutputFile, "wb+", 0) as fInputfile:
+            try :
+                fInputfile.write (fStringIO.getvalue())
+            except:
+                EdkLogger.error("Trim", FILE_WRITE_FAILURE, "Write data to file %s failed; check whether the file is locked or in use by another application." %OutputFile, None)
     except:
-        EdkLogger.error("Trim", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the file been locked or using by other applications." %OutputFile, None)
+        EdkLogger.error("Trim", FILE_OPEN_FAILURE, "File open failed for %s" %OutputFile, None)
 
     fStringIO.close ()
-    fInputfile.close ()
 
 ## Trim EDK source code file(s)
 #
@@ -560,12 +549,11 @@ def TrimEdkSourceCode(Source, Target):
     CreateDirectory(os.path.dirname(Target))
 
     try:
-        f = open (Source,'rb')
+        with open (Source,'rb') as f:
+            # read whole file
+            Lines = f.read()
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Source)
-    # read whole file
-    Lines = f.read()
-    f.close()
 
     NewLines = None
     for Re,Repl in gImportCodePatterns:
@@ -579,11 +567,10 @@ def TrimEdkSourceCode(Source, Target):
         return
 
     try:
-        f = open (Target,'wb')
+        with open (Target,'wb') as f:
+            f.write(NewLines)
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
-    f.write(NewLines)
-    f.close()
 
 
 ## Parse command line options
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index 2de8a84b9bd7..d714c781e970 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -118,13 +118,10 @@ def GetDependencyList(FileStack,SearchPathList):
             CurrentFileDependencyList = DepDb[F]
         else:
             try:
-                Fd = open(F, 'r')
-                FileContent = Fd.read()
+                with open(F, 'r') as Fd:
+                    FileContent = Fd.read()
             except BaseException, X:
                 EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=F + "\n\t" + str(X))
-            finally:
-                if "Fd" in dir(locals()):
-                    Fd.close()
 
             if len(FileContent) == 0:
                 continue
@@ -2109,9 +2106,8 @@ class DscBuildData(PlatformBuildClassObject):
         MessageGroup = []
         if returncode <>0:
             CAppBaseFileName = os.path.join(self.OutputPath, PcdValueInitName)
-            File = open (CAppBaseFileName + '.c', 'r')
-            FileData = File.readlines()
-            File.close()
+            with open (CAppBaseFileName + '.c', 'r') as File:
+                FileData = File.readlines()
             for Message in Messages:
                 if " error" in Message or "warning" in Message:
                     FileInfo = Message.strip().split('(')
@@ -2155,9 +2151,8 @@ class DscBuildData(PlatformBuildClassObject):
             if returncode <> 0:
                 EdkLogger.warn('Build', COMMAND_FAILURE, 'Can not collect output from command: %s' % Command)
 
-        File = open (OutputValueFile, 'r')
-        FileBuffer = File.readlines()
-        File.close()
+        with open (OutputValueFile, 'r') as File:
+            FileBuffer = File.readlines()
 
         StructurePcdSet = []
         for Pcd in FileBuffer:
diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
index 72a557dfea50..db9e1ed062fb 100644
--- a/BaseTools/Source/Python/build/BuildReport.py
+++ b/BaseTools/Source/Python/build/BuildReport.py
@@ -291,18 +291,18 @@ class DepexParser(object):
     # @param DepexFileName   The file name of binary dependency expression file.
     #
     def ParseDepexFile(self, DepexFileName):
-        DepexFile = open(DepexFileName, "rb")
         DepexStatement = []
-        OpCode = DepexFile.read(1)
-        while OpCode:
-            Statement = gOpCodeList[struct.unpack("B", OpCode)[0]]
-            if Statement in ["BEFORE", "AFTER", "PUSH"]:
-                GuidValue = "%08X-%04X-%04X-%02X%02X-%02X%02X%02X%02X%02X%02X" % \
-                            struct.unpack(PACK_PATTERN_GUID, DepexFile.read(16))
-                GuidString = self._GuidDb.get(GuidValue, GuidValue)
-                Statement = "%s %s" % (Statement, GuidString)
-            DepexStatement.append(Statement)
+        with open(DepexFileName, "rb") as DepexFile:
             OpCode = DepexFile.read(1)
+            while OpCode:
+                Statement = gOpCodeList[struct.unpack("B", OpCode)[0]]
+                if Statement in ["BEFORE", "AFTER", "PUSH"]:
+                    GuidValue = "%08X-%04X-%04X-%02X%02X-%02X%02X%02X%02X%02X%02X" % \
+                                struct.unpack(PACK_PATTERN_GUID, DepexFile.read(16))
+                    GuidString = self._GuidDb.get(GuidValue, GuidValue)
+                    Statement = "%s %s" % (Statement, GuidString)
+                DepexStatement.append(Statement)
+                OpCode = DepexFile.read(1)
 
         return DepexStatement
     
@@ -629,14 +629,14 @@ class ModuleReport(object):
         FwReportFileName = os.path.join(self._BuildDir, "DEBUG", self.ModuleName + ".txt")
         if os.path.isfile(FwReportFileName):
             try:
-                FileContents = open(FwReportFileName).read()
-                Match = gModuleSizePattern.search(FileContents)
-                if Match:
-                    self.Size = int(Match.group(1))
+                with open(FwReportFileName) as FwReportFile:
+                    FileContents = FwReportFile.read()
+                    Match = gModuleSizePattern.search(FileContents)
+                    if Match:
+                        self.Size = int(Match.group(1))
 
-                Match = gTimeStampPattern.search(FileContents)
-                if Match:
-                    self.BuildTimeStamp = datetime.fromtimestamp(int(Match.group(1)))
+                    Match = gTimeStampPattern.search(FileContents)
+                    if Match:
+                        self.BuildTimeStamp = datetime.fromtimestamp(int(Match.group(1)))
             except IOError:
                 EdkLogger.warn(None, "Fail to read report file", FwReportFileName)
 
@@ -1483,14 +1483,12 @@ class PredictionReport(object):
         GuidList = os.path.join(self._EotDir, "GuidList.txt")
         DispatchList = os.path.join(self._EotDir, "Dispatch.txt")
 
-        TempFile = open(SourceList, "w+")
-        for Item in self._SourceList:
-            FileWrite(TempFile, Item)
-        TempFile.close()
-        TempFile = open(GuidList, "w+")
-        for Key in self._GuidMap:
-            FileWrite(TempFile, "%s %s" % (Key, self._GuidMap[Key]))
-        TempFile.close()
+        with open(SourceList, "w") as TempFile:
+            for Item in self._SourceList:
+                FileWrite(TempFile, Item)
+        with open(GuidList, "w") as TempFile:
+            for Key in self._GuidMap:
+                FileWrite(TempFile, "%s %s" % (Key, self._GuidMap[Key]))
 
         try:
             from Eot.Eot import Eot
@@ -1881,23 +1879,22 @@ class FdReport(object):
                 break
 
         if os.path.isfile(self.VpdFilePath):
-            fd = open(self.VpdFilePath, "r")
-            Lines = fd.readlines()
-            for Line in Lines:
-                Line = Line.strip()
-                if len(Line) == 0 or Line.startswith("#"):
-                    continue
-                try:
-                    PcdName, SkuId, Offset, Size, Value = Line.split("#")[0].split("|")
-                    PcdName, SkuId, Offset, Size, Value = PcdName.strip(), SkuId.strip(), Offset.strip(), Size.strip(), Value.strip()
-                    if Offset.lower().startswith('0x'):
-                        Offset = '0x%08X' % (int(Offset, 16) + self.VPDBaseAddress)
-                    else:
-                        Offset = '0x%08X' % (int(Offset, 10) + self.VPDBaseAddress)
-                    self.VPDInfoList.append("%s | %s | %s | %s | %s" % (PcdName, SkuId, Offset, Size, Value))
-                except:
-                    EdkLogger.error("BuildReport", CODE_ERROR, "Fail to parse VPD information file %s" % self.VpdFilePath)
-            fd.close()
+            with open(self.VpdFilePath, "r") as fd:
+                Lines = fd.readlines()
+                for Line in Lines:
+                    Line = Line.strip()
+                    if len(Line) == 0 or Line.startswith("#"):
+                        continue
+                    try:
+                        PcdName, SkuId, Offset, Size, Value = Line.split("#")[0].split("|")
+                        PcdName, SkuId, Offset, Size, Value = PcdName.strip(), SkuId.strip(), Offset.strip(), Size.strip(), Value.strip()
+                        if Offset.lower().startswith('0x'):
+                            Offset = '0x%08X' % (int(Offset, 16) + self.VPDBaseAddress)
+                        else:
+                            Offset = '0x%08X' % (int(Offset, 10) + self.VPDBaseAddress)
+                        self.VPDInfoList.append("%s | %s | %s | %s | %s" % (PcdName, SkuId, Offset, Size, Value))
+                    except:
+                        EdkLogger.error("BuildReport", CODE_ERROR, "Fail to parse VPD information file %s" % self.VpdFilePath)
 
     ##
     # Generate report for the firmware device.
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 99e4881b3ea4..1fb8c7985d99 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -320,9 +320,8 @@ def LaunchCommand(Command, WorkingDir):
         # print out the Response file and its content when make failure
         RespFile = os.path.join(WorkingDir, 'OUTPUT', 'respfilelist.txt')
         if os.path.isfile(RespFile):
-            f = open(RespFile)
-            RespContent = f.read()
-            f.close()
+            with open(RespFile) as f:
+                RespContent = f.read()
             EdkLogger.info(RespContent)
 
         EdkLogger.error("build", COMMAND_FAILURE, ExtraData="%s [%s]" % (Command, WorkingDir))
@@ -1169,9 +1168,8 @@ class Build():
                 EdkLogger.error("Prebuild", PREBUILD_ERROR, 'Prebuild process is not success!')
 
             if os.path.exists(PrebuildEnvFile):
-                f = open(PrebuildEnvFile)
-                envs = f.readlines()
-                f.close()
+                with open(PrebuildEnvFile) as f:
+                    envs = f.readlines()
                 envs = itertools.imap(lambda l: l.split('=',1), envs)
                 envs = itertools.ifilter(lambda l: len(l) == 2, envs)
                 envs = itertools.imap(lambda l: [i.strip() for i in l], envs)
@@ -1451,29 +1449,28 @@ class Build():
             FunctionList = []
             if os.path.exists(ImageMapTable):
                 OrigImageBaseAddress = 0
-                ImageMap = open(ImageMapTable, 'r')
-                for LinStr in ImageMap:
-                    if len (LinStr.strip()) == 0:
-                        continue
-                    #
-                    # Get the preferred address set on link time.
-                    #
-                    if LinStr.find ('Preferred load address is') != -1:
+                with open(ImageMapTable, 'r') as ImageMap:
+                    for LinStr in ImageMap:
+                        if len (LinStr.strip()) == 0:
+                            continue
+                        #
+                        # Get the preferred address set on link time.
+                        #
+                        if LinStr.find ('Preferred load address is') != -1:
+                            StrList = LinStr.split()
+                            OrigImageBaseAddress = int (StrList[len(StrList) - 1], 16)
+
                         StrList = LinStr.split()
-                        OrigImageBaseAddress = int (StrList[len(StrList) - 1], 16)
-
-                    StrList = LinStr.split()
-                    if len (StrList) > 4:
-                        if StrList[3] == 'f' or StrList[3] == 'F':
-                            Name = StrList[1]
-                            RelativeAddress = int (StrList[2], 16) - OrigImageBaseAddress
-                            FunctionList.append ((Name, RelativeAddress))
-                            if ModuleInfo.Arch == 'IPF' and Name.endswith('_ModuleEntryPoint'):
-                                #
-                                # Get the real entry point address for IPF image.
-                                #
-                                ModuleInfo.Image.EntryPoint = RelativeAddress
-                ImageMap.close()
+                        if len (StrList) > 4:
+                            if StrList[3] == 'f' or StrList[3] == 'F':
+                                Name = StrList[1]
+                                RelativeAddress = int (StrList[2], 16) - OrigImageBaseAddress
+                                FunctionList.append ((Name, RelativeAddress))
+                                if ModuleInfo.Arch == 'IPF' and Name.endswith('_ModuleEntryPoint'):
+                                    #
+                                    # Get the real entry point address for IPF image.
+                                    #
+                                    ModuleInfo.Image.EntryPoint = RelativeAddress
             #
             # Add general information.
             #
@@ -1528,32 +1525,30 @@ class Build():
                 FvMapBuffer = os.path.join(Wa.FvDir, FvName + '.Fv.map')
                 if not os.path.exists(FvMapBuffer):
                     continue
-                FvMap = open(FvMapBuffer, 'r')
-                #skip FV size information
-                FvMap.readline()
-                FvMap.readline()
-                FvMap.readline()
-                FvMap.readline()
-                for Line in FvMap:
-                    MatchGuid = GuidPattern.match(Line)
-                    if MatchGuid is not None:
+                with open(FvMapBuffer, 'r') as FvMap:
+                    #skip FV size information
+                    FvMap.readline()
+                    FvMap.readline()
+                    FvMap.readline()
+                    FvMap.readline()
+                    for Line in FvMap:
+                        MatchGuid = GuidPattern.match(Line)
+                        if MatchGuid is not None:
+                            #
+                            # Replace GUID with module name
+                            #
+                            GuidString = MatchGuid.group()
+                            if GuidString.upper() in ModuleList:
+                                Line = Line.replace(GuidString, ModuleList[GuidString.upper()].Name)
+                        MapBuffer.write('%s' % (Line))
                         #
-                        # Replace GUID with module name
+                        # Add the debug image full path.
                         #
-                        GuidString = MatchGuid.group()
-                        if GuidString.upper() in ModuleList:
-                            Line = Line.replace(GuidString, ModuleList[GuidString.upper()].Name)
-                    MapBuffer.write('%s' % (Line))
-                    #
-                    # Add the debug image full path.
-                    #
-                    MatchGuid = GuidName.match(Line)
-                    if MatchGuid is not None:
-                        GuidString = MatchGuid.group().split("=")[1]
-                        if GuidString.upper() in ModuleList:
-                            MapBuffer.write('(IMAGE=%s)\n' % (os.path.join(ModuleList[GuidString.upper()].DebugDir, ModuleList[GuidString.upper()].Name + '.efi')))
-
-                FvMap.close()
+                        MatchGuid = GuidName.match(Line)
+                        if MatchGuid is not None:
+                            GuidString = MatchGuid.group().split("=")[1]
+                            if GuidString.upper() in ModuleList:
+                                MapBuffer.write('(IMAGE=%s)\n' % (os.path.join(ModuleList[GuidString.upper()].DebugDir, ModuleList[GuidString.upper()].Name + '.efi')))
 
     ## Collect MAP information of all modules
     #
@@ -2193,10 +2188,9 @@ class Build():
 
                     # Write out GuidedSecTools.txt
                     toolsFile = os.path.join(FvDir, 'GuidedSectionTools.txt')
-                    toolsFile = open(toolsFile, 'wt')
-                    for guidedSectionTool in guidAttribs:
-                        print >> toolsFile, ' '.join(guidedSectionTool)
-                    toolsFile.close()
+                    with open(toolsFile, 'w') as ToolsFile:
+                        for guidedSectionTool in guidAttribs:
+                            print >> ToolsFile, ' '.join(guidedSectionTool)
 
     ## Returns the full path of the tool.
     #
-- 
2.16.2.windows.1



^ permalink raw reply related	[flat|nested] 13+ messages in thread

* [PATCH v1 08/11] BaseTools: refactor to change object types
  2018-05-14 18:09 [PATCH v1 00/11] BaseTools refactoring Jaben Carsey
                   ` (6 preceding siblings ...)
  2018-05-14 18:09 ` [PATCH v1 07/11] BaseTools: refactor file opening/writing Jaben Carsey
@ 2018-05-14 18:09 ` Jaben Carsey
  2018-05-14 18:09 ` [PATCH v1 09/11] BaseTools: refactor to stop re-allocating strings Jaben Carsey
                   ` (2 subsequent siblings)
  10 siblings, 0 replies; 13+ messages in thread
From: Jaben Carsey @ 2018-05-14 18:09 UTC (permalink / raw)
  To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu

Change to object types that are closer to the use case.  For example:
when using a list as a double-ended queue, use the built-in deque object.
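A minimal sketch of the pattern this patch applies (the include-file names below are made up for illustration, not taken from the patch):

```python
from collections import deque

# A list used as a double-ended queue pays O(n) for each insert(0, x);
# deque.appendleft is O(1), which is what this patch switches to.
AllIncludeFileList = deque()
for IncFileProfile in ("a.fdf", "b.fdf", "c.fdf"):
    AllIncludeFileList.appendleft(IncFileProfile)  # was: list.insert(0, ...)

# The most recently included file still comes first, as before.
assert list(AllIncludeFileList) == ["c.fdf", "b.fdf", "a.fdf"]
```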

Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
 BaseTools/Source/Python/AutoGen/AutoGen.py           | 30 +++++++++++---------
 BaseTools/Source/Python/GenFds/FdfParser.py          |  5 ++--
 BaseTools/Source/Python/Workspace/WorkspaceCommon.py | 20 ++++++-------
 3 files changed, 29 insertions(+), 26 deletions(-)

diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 009e5c56781d..599331060187 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -45,10 +45,14 @@ import InfSectionParser
 import datetime
 import hashlib
 from GenVar import VariableMgr,var_info
-from collections import OrderedDict
-from collections import defaultdict
+from collections import OrderedDict,defaultdict,deque
 from abc import ABCMeta, abstractmethod
 
+class OrderedListDict(OrderedDict, defaultdict):
+    def __init__(self, *args, **kwargs):
+        super(OrderedListDict, self).__init__(*args, **kwargs)
+        self.default_factory = list
+
 ## Regular expression for splitting Dependency Expression string into tokens
 gDepexTokenPattern = re.compile("(\(|\)|\w+| \S+\.inf)")
 
@@ -2172,8 +2176,8 @@ class PlatformAutoGen(AutoGen):
 
         # EdkII module
         LibraryConsumerList = [Module]
-        Constructor         = []
-        ConsumedByList      = OrderedDict()
+        Constructor         = set()
+        ConsumedByList = OrderedListDict()
         LibraryInstance     = OrderedDict()
 
         EdkLogger.verbose("")
@@ -2219,10 +2223,8 @@ class PlatformAutoGen(AutoGen):
                     continue
 
                 if LibraryModule.ConstructorList != [] and LibraryModule not in Constructor:
-                    Constructor.append(LibraryModule)
+                    Constructor.add(LibraryModule)
 
-                if LibraryModule not in ConsumedByList:
-                    ConsumedByList[LibraryModule] = []
                 # don't add current module itself to consumer list
                 if M != Module:
                     if M in ConsumedByList[LibraryModule]:
@@ -2235,8 +2237,8 @@ class PlatformAutoGen(AutoGen):
         #
         # Q <- Set of all nodes with no incoming edges
         #
-        LibraryList = [] #LibraryInstance.values()
-        Q = []
+        LibraryList = []
+        Q = deque()
         for LibraryClassName in LibraryInstance:
             M = LibraryInstance[LibraryClassName]
             LibraryList.append(M)
@@ -2248,7 +2250,7 @@ class PlatformAutoGen(AutoGen):
         #
         while True:
             EdgeRemoved = True
-            while Q == [] and EdgeRemoved:
+            while not Q and EdgeRemoved:
                 EdgeRemoved = False
                 # for each node Item with a Constructor
                 for Item in LibraryList:
@@ -2263,12 +2265,12 @@ class PlatformAutoGen(AutoGen):
                         EdgeRemoved = True
                         if ConsumedByList[Item] == []:
                             # insert Item into Q
-                            Q.insert(0, Item)
+                            Q.appendleft(Item)
                             break
-                    if Q != []:
+                    if Q:
                         break
             # DAG is done if there's no more incoming edge for all nodes
-            if Q == []:
+            if not Q:
                 break
 
             # remove node from Q
@@ -2286,7 +2288,7 @@ class PlatformAutoGen(AutoGen):
                 if ConsumedByList[Item] != []:
                     continue
                 # insert Item into Q, if Item has no other incoming edges
-                Q.insert(0, Item)
+                Q.appendleft(Item)
 
         #
         # if any remaining node Item in the graph has a constructor and an incoming edge, then the graph has a cycle
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index 8d1a4b543f0e..d511cf4f9d5a 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -43,6 +43,7 @@ import OptionRom
 import OptRomInfStatement
 import OptRomFileStatement
 import string
+from collections import deque
 
 from GenFdsGlobalVariable import GenFdsGlobalVariable
 from Common.BuildToolError import *
@@ -89,7 +90,7 @@ BaseAddrValuePattern = re.compile('^0[xX][0-9a-fA-F]+')
 FileExtensionPattern = re.compile(r'([a-zA-Z][a-zA-Z0-9]*)')
 TokenFindPattern = re.compile(r'([a-zA-Z0-9\-]+|\$\(TARGET\)|\*)_([a-zA-Z0-9\-]+|\$\(TOOL_CHAIN_TAG\)|\*)_([a-zA-Z0-9\-]+|\$\(ARCH\)|\*)')
 
-AllIncludeFileList = []
+AllIncludeFileList = deque()
 
 # Get the closest parent
 def GetParentAtLine (Line):
@@ -685,7 +686,7 @@ class FdfParser:
                     InsertAtLine += 1
 
                 # reversely sorted to better determine error in file
-                AllIncludeFileList.insert(0, IncFileProfile)
+                AllIncludeFileList.appendleft(IncFileProfile)
 
                 # comment out the processed include file statement
                 TempList = list(self.Profile.FileLinesList[IncludeLine - 1])
diff --git a/BaseTools/Source/Python/Workspace/WorkspaceCommon.py b/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
index 573100081815..a793055b6d18 100644
--- a/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
+++ b/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
@@ -11,7 +11,7 @@
 # WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 #
 
-from collections import OrderedDict, defaultdict
+from collections import OrderedDict, defaultdict, deque
 from Common.DataType import SUP_MODULE_USER_DEFINED
 from BuildClassObject import LibraryClassObject
 import Common.GlobalData as GlobalData
@@ -110,7 +110,7 @@ def _GetModuleLibraryInstances(Module, Platform, BuildDatabase, Arch, Target, To
 
     # EdkII module
     LibraryConsumerList = [Module]
-    Constructor = []
+    Constructor = set()
     ConsumedByList = OrderedListDict()
     LibraryInstance = OrderedDict()
 
@@ -148,7 +148,7 @@ def _GetModuleLibraryInstances(Module, Platform, BuildDatabase, Arch, Target, To
                 continue
 
             if LibraryModule.ConstructorList != [] and LibraryModule not in Constructor:
-                Constructor.append(LibraryModule)
+                Constructor.add(LibraryModule)
 
             # don't add current module itself to consumer list
             if M != Module:
@@ -162,8 +162,8 @@ def _GetModuleLibraryInstances(Module, Platform, BuildDatabase, Arch, Target, To
     #
     # Q <- Set of all nodes with no incoming edges
     #
-    LibraryList = [] #LibraryInstance.values()
-    Q = []
+    LibraryList = []
+    Q = deque()
     for LibraryClassName in LibraryInstance:
         M = LibraryInstance[LibraryClassName]
         LibraryList.append(M)
@@ -175,7 +175,7 @@ def _GetModuleLibraryInstances(Module, Platform, BuildDatabase, Arch, Target, To
     #
     while True:
         EdgeRemoved = True
-        while Q == [] and EdgeRemoved:
+        while not Q and EdgeRemoved:
             EdgeRemoved = False
             # for each node Item with a Constructor
             for Item in LibraryList:
@@ -190,12 +190,12 @@ def _GetModuleLibraryInstances(Module, Platform, BuildDatabase, Arch, Target, To
                     EdgeRemoved = True
                     if len(ConsumedByList[Item]) == 0:
                         # insert Item into Q
-                        Q.insert(0, Item)
+                        Q.appendleft(Item)
                         break
-                if Q != []:
+                if Q:
                     break
         # DAG is done if there's no more incoming edge for all nodes
-        if Q == []:
+        if not Q:
             break
 
         # remove node from Q
@@ -213,7 +213,7 @@ def _GetModuleLibraryInstances(Module, Platform, BuildDatabase, Arch, Target, To
             if len(ConsumedByList[Item]) != 0:
                 continue
             # insert Item into Q, if Item has no other incoming edges
-            Q.insert(0, Item)
+            Q.appendleft(Item)
 
     #
     # if any remaining node Item in the graph has a constructor and an incoming edge, then the graph has a cycle
-- 
2.16.2.windows.1



^ permalink raw reply related	[flat|nested] 13+ messages in thread

* [PATCH v1 09/11] BaseTools: refactor to stop re-allocating strings
  2018-05-14 18:09 [PATCH v1 00/11] BaseTools refactoring Jaben Carsey
                   ` (7 preceding siblings ...)
  2018-05-14 18:09 ` [PATCH v1 08/11] BaseTools: refactor to change object types Jaben Carsey
@ 2018-05-14 18:09 ` Jaben Carsey
  2018-05-14 18:09 ` [PATCH v1 10/11] BaseTools: change to set for membership testing Jaben Carsey
  2018-05-14 18:09 ` [PATCH v1 11/11] BaseTools: remove extra assignment Jaben Carsey
  10 siblings, 0 replies; 13+ messages in thread
From: Jaben Carsey @ 2018-05-14 18:09 UTC (permalink / raw)
  To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu

Strings are immutable, so each concatenation reallocates and copies the whole string.  Refactor to build strings with join/format and minimize duplication.
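A small illustration of the refactoring pattern (the PCD name parts here are hypothetical placeholders):

```python
# Repeated "s += part" copies the whole string on every step (O(n^2) overall);
# collecting parts and joining once allocates the result a single time.
parts = []
for token in ("PcdTokenSpaceGuid", ".", "PcdCName"):
    parts.append(token)
PcdName = "".join(parts)
assert PcdName == "PcdTokenSpaceGuid.PcdCName"

# str.format builds the same string in one allocation, as this patch does
# for TokenSpaceGuidCName "." TokenCName lookups:
assert "{TSG}.{CN}".format(TSG="PcdTokenSpaceGuid", CN="PcdCName") == PcdName
```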

Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
 BaseTools/Source/Python/AutoGen/AutoGen.py                 | 94 +++++++++++---------
 BaseTools/Source/Python/AutoGen/GenC.py                    |  2 +-
 BaseTools/Source/Python/AutoGen/GenDepex.py                | 15 ++--
 BaseTools/Source/Python/AutoGen/GenMake.py                 |  6 +-
 BaseTools/Source/Python/AutoGen/GenVar.py                  |  2 +-
 BaseTools/Source/Python/AutoGen/IdfClassObject.py          |  2 +-
 BaseTools/Source/Python/AutoGen/StrGather.py               |  4 +-
 BaseTools/Source/Python/AutoGen/UniClassObject.py          |  3 +-
 BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py | 17 ++--
 BaseTools/Source/Python/Common/Expression.py               | 42 +++++----
 BaseTools/Source/Python/Common/Misc.py                     |  8 +-
 BaseTools/Source/Python/Common/String.py                   |  2 +-
 BaseTools/Source/Python/CommonDataClass/CommonClass.py     | 29 +++---
 BaseTools/Source/Python/GenFds/Capsule.py                  | 19 ++--
 BaseTools/Source/Python/GenFds/FdfParser.py                |  8 +-
 BaseTools/Source/Python/GenFds/FfsInfStatement.py          |  8 +-
 BaseTools/Source/Python/GenFds/Fv.py                       | 72 +++++----------
 BaseTools/Source/Python/GenFds/GenFds.py                   |  4 +-
 BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py     | 24 ++---
 BaseTools/Source/Python/GenFds/OptionRom.py                |  2 -
 BaseTools/Source/Python/GenFds/Vtf.py                      | 76 +++++-----------
 BaseTools/Source/Python/Table/TableDataModel.py            | 11 +--
 BaseTools/Source/Python/Workspace/DscBuildData.py          | 14 +--
 BaseTools/Source/Python/Workspace/InfBuildData.py          |  4 +-
 BaseTools/Source/Python/Workspace/MetaDataTable.py         | 10 +--
 BaseTools/Source/Python/Workspace/MetaFileParser.py        |  4 +-
 26 files changed, 196 insertions(+), 286 deletions(-)

diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 599331060187..4ccb50a0a0af 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -815,42 +815,46 @@ class WorkspaceAutoGen(AutoGen):
                         _PcdName = FfsFile.NameGuid.lstrip("PCD(").rstrip(")")
                         PcdFoundFlag = False
                         for Pa in self.AutoGenObjectList:
-                            if not PcdFoundFlag:
-                                for PcdItem in Pa.AllPcdList:
-                                    if (PcdItem.TokenSpaceGuidCName + "." + PcdItem.TokenCName) == _PcdName:
+                            #
+                            # once found, get out of the loop
+                            #
+                            if PcdFoundFlag:
+                                break
+                            for PcdItem in Pa.AllPcdList:
+                                if "{TSG}.{CN}".format(TSG=PcdItem.TokenSpaceGuidCName, CN=PcdItem.TokenCName) == _PcdName:
+                                    #
+                                    # First convert from CFormatGuid to GUID string
+                                    #
+                                    _PcdGuidString = GuidStructureStringToGuidString(PcdItem.DefaultValue)
+
+                                    if not _PcdGuidString:
                                         #
-                                        # First convert from CFormatGuid to GUID string
+                                        # Then try Byte array.
                                         #
-                                        _PcdGuidString = GuidStructureStringToGuidString(PcdItem.DefaultValue)
-
-                                        if not _PcdGuidString:
-                                            #
-                                            # Then try Byte array.
-                                            #
-                                            _PcdGuidString = GuidStructureByteArrayToGuidString(PcdItem.DefaultValue)
+                                        _PcdGuidString = GuidStructureByteArrayToGuidString(PcdItem.DefaultValue)
 
-                                        if not _PcdGuidString:
-                                            #
-                                            # Not Byte array or CFormat GUID, raise error.
-                                            #
-                                            EdkLogger.error("build",
-                                                            FORMAT_INVALID,
-                                                            "The format of PCD value is incorrect. PCD: %s , Value: %s\n" % (_PcdName, PcdItem.DefaultValue),
-                                                            ExtraData=self.FdfFile)
+                                    if not _PcdGuidString:
+                                        #
+                                        # Not Byte array or CFormat GUID, raise error.
+                                        #
+                                        EdkLogger.error("build",
+                                                        FORMAT_INVALID,
+                                                        "The format of PCD value is incorrect. PCD: %s , Value: %s\n" % (_PcdName, PcdItem.DefaultValue),
+                                                        ExtraData=self.FdfFile)
 
-                                        if _PcdGuidString.upper() not in _GuidDict:
-                                            _GuidDict[_PcdGuidString.upper()] = FfsFile
-                                            PcdFoundFlag = True
-                                            break
-                                        else:
-                                            EdkLogger.error("build",
-                                                            FORMAT_INVALID,
-                                                            "Duplicate GUID found for these lines: Line %d: %s and Line %d: %s. GUID: %s" % (FfsFile.CurrentLineNum,
-                                                                                                                                           FfsFile.CurrentLineContent,
-                                                                                                                                           _GuidDict[_PcdGuidString.upper()].CurrentLineNum,
-                                                                                                                                           _GuidDict[_PcdGuidString.upper()].CurrentLineContent,
-                                                                                                                                           FfsFile.NameGuid.upper()),
-                                                            ExtraData=self.FdfFile)
+                                    if _PcdGuidString.upper() not in _GuidDict:
+                                        _GuidDict[_PcdGuidString.upper()] = FfsFile
+                                        PcdFoundFlag = True
+                                        break
+                                    else:
+                                        EdkLogger.error("build",
+                                                        FORMAT_INVALID,
+                                                        "Duplicate GUID found for these lines: Line %d: %s and Line %d: %s. GUID: %s" % (FfsFile.CurrentLineNum,
+                                                                                                                                       FfsFile.CurrentLineContent,
+                                                                                                                                       _GuidDict[_PcdGuidString.upper()].CurrentLineNum,
+                                                                                                                                       _GuidDict[_PcdGuidString.upper()].CurrentLineContent,
+                                                                                                                                       FfsFile.NameGuid.upper()),
+                                                        ExtraData=self.FdfFile)
 
                     if FfsFile.NameGuid.upper() not in _GuidDict:
                         _GuidDict[FfsFile.NameGuid.upper()] = FfsFile
@@ -1832,13 +1836,13 @@ class PlatformAutoGen(AutoGen):
             if os.path.isabs(self.OutputDir):
                 self._BuildDir = path.join(
                                             path.abspath(self.OutputDir),
-                                            self.BuildTarget + "_" + self.ToolChain,
+                                            "{BT}_{TC}".format(BT=self.BuildTarget, TC=self.ToolChain),
                                             )
             else:
                 self._BuildDir = path.join(
                                             self.WorkspaceDir,
                                             self.OutputDir,
-                                            self.BuildTarget + "_" + self.ToolChain,
+                                            "{BT}_{TC}".format(BT=self.BuildTarget, TC=self.ToolChain),
                                             )
             GlobalData.gBuildDirectory = self._BuildDir
         return self._BuildDir
@@ -1916,7 +1920,7 @@ class PlatformAutoGen(AutoGen):
                             Value = self.BuildOption[Tool][Attr][1:]
                         else:
                             if Attr != 'PATH':
-                                Value += " " + self.BuildOption[Tool][Attr]
+                                Value = "{Val} {At}".format(Val=Value, At=self.BuildOption[Tool][Attr])
                             else:
                                 Value = self.BuildOption[Tool][Attr]
 
@@ -1934,8 +1938,10 @@ class PlatformAutoGen(AutoGen):
                 ToolsDef += "\n"
 
             SaveFileOnChange(self.ToolDefinitionFile, ToolsDef)
-            for DllPath in DllPathList:
-                os.environ["PATH"] = DllPath + os.pathsep + os.environ["PATH"]
+            os.environ["PATH"] = '{new}{sep}{start}'.format(
+                new=os.pathsep.join(reversed(DllPathList)),
+                sep=os.pathsep,
+                start=os.environ["PATH"])
             os.environ["MAKE_FLAGS"] = MakeFlags
 
         return self._ToolDefinitions
@@ -1943,7 +1949,7 @@ class PlatformAutoGen(AutoGen):
     ## Return the paths of tools
     def _GetToolDefFile(self):
         if self._ToolDefFile is None:
-            self._ToolDefFile = os.path.join(self.MakeFileDir, "TOOLS_DEF." + self.Arch)
+            self._ToolDefFile = os.path.join(self.MakeFileDir, "TOOLS_DEF.{ARCH}".format(ARCH=self.Arch))
         return self._ToolDefFile
 
     ## Retrieve the toolchain family of given toolchain tag. Default to 'MSFT'.
@@ -2215,7 +2221,7 @@ class PlatformAutoGen(AutoGen):
 
                     LibraryInstance[LibraryClassName] = LibraryModule
                     LibraryConsumerList.append(LibraryModule)
-                    EdkLogger.verbose("\t" + str(LibraryClassName) + " : " + str(LibraryModule))
+                    EdkLogger.verbose("\t{LCN} : {LM}".format(LCN=str(LibraryClassName), LM=str(LibraryModule)))
                 else:
                     LibraryModule = LibraryInstance[LibraryClassName]
 
@@ -2295,7 +2301,7 @@ class PlatformAutoGen(AutoGen):
         #
         for Item in LibraryList:
             if ConsumedByList[Item] != [] and Item in Constructor and len(Constructor) > 1:
-                ErrorMessage = "\tconsumed by " + "\n\tconsumed by ".join(str(L) for L in ConsumedByList[Item])
+                ErrorMessage = "\tconsumed by {LIST}".format(LIST="\n\tconsumed by ".join(str(L) for L in ConsumedByList[Item]))
                 EdkLogger.error("build", BUILD_ERROR, 'Library [%s] with constructors has a cycle' % str(Item),
                                 ExtraData=ErrorMessage, File=self.MetaFile)
             if Item not in SortedLibraryList:
@@ -2491,7 +2497,7 @@ class PlatformAutoGen(AutoGen):
                 if Library not in LibraryList:
                     LibraryList.append(Library)
                     LibraryConsumerList.append(Library)
-                    EdkLogger.verbose("\t" + LibraryName + " : " + str(Library) + ' ' + str(type(Library)))
+                    EdkLogger.verbose("\t{LN} : {LIB} {TYPE}".format(LN=LibraryName, LIB=str(Library), TYPE=str(type(Library))))
         return LibraryList
 
     ## Calculate the priority value of the build option
@@ -2604,7 +2610,7 @@ class PlatformAutoGen(AutoGen):
                         else:
                             # append options for the same tool except PATH
                             if Attr != 'PATH':
-                                BuildOptions[Tool][Attr] += " " + Options[Key]
+                                BuildOptions[Tool][Attr] = "{ORIG} {NEW}".format(ORIG=BuildOptions[Tool][Attr], NEW=Options[Key])
                             else:
                                 BuildOptions[Tool][Attr] = Options[Key]
         # Build Option Family has been checked, which need't to be checked again for family.
@@ -2639,7 +2645,7 @@ class PlatformAutoGen(AutoGen):
                         else:
                             # append options for the same tool except PATH
                             if Attr != 'PATH':
-                                BuildOptions[Tool][Attr] += " " + Options[Key]
+                                BuildOptions[Tool][Attr] = "{ORIG} {NEW}".format(ORIG=BuildOptions[Tool][Attr], NEW=Options[Key])
                             else:
                                 BuildOptions[Tool][Attr] = Options[Key]
         return BuildOptions
@@ -2693,7 +2699,7 @@ class PlatformAutoGen(AutoGen):
                         BuildOptions[Tool][Attr] = mws.handleWsMacro(Value[1:])
                     else:
                         if Attr != 'PATH':
-                            BuildOptions[Tool][Attr] += " " + mws.handleWsMacro(Value)
+                            BuildOptions[Tool][Attr] = "{ORIG} {NEW}".format(ORIG=BuildOptions[Tool][Attr], NEW=mws.handleWsMacro(Value))
                         else:
                             BuildOptions[Tool][Attr] = mws.handleWsMacro(Value)
 
diff --git a/BaseTools/Source/Python/AutoGen/GenC.py b/BaseTools/Source/Python/AutoGen/GenC.py
index 46c7c1c1390b..e73d83395255 100644
--- a/BaseTools/Source/Python/AutoGen/GenC.py
+++ b/BaseTools/Source/Python/AutoGen/GenC.py
@@ -917,7 +917,7 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
             TokenNumber = PcdTokenNumber[Pcd.TokenCName, Pcd.TokenSpaceGuidCName]
         AutoGenH.Append('\n#define %s  %dU\n' % (PcdTokenName, TokenNumber))
 
-    EdkLogger.debug(EdkLogger.DEBUG_3, "Creating code for " + TokenCName + "." + Pcd.TokenSpaceGuidCName)
+    EdkLogger.debug(EdkLogger.DEBUG_3, "Creating code for {TCN}.{TSG}".format(TCN=TokenCName, TSG=Pcd.TokenSpaceGuidCName))
     if Pcd.Type not in gItemTypeStringDatabase:
         EdkLogger.error("build", AUTOGEN_ERROR,
                         "Unknown PCD type [%s] of PCD %s.%s" % (Pcd.Type, Pcd.TokenSpaceGuidCName, TokenCName),
diff --git a/BaseTools/Source/Python/AutoGen/GenDepex.py b/BaseTools/Source/Python/AutoGen/GenDepex.py
index ed5df2b75440..873ed6e59300 100644
--- a/BaseTools/Source/Python/AutoGen/GenDepex.py
+++ b/BaseTools/Source/Python/AutoGen/GenDepex.py
@@ -156,19 +156,16 @@ class DependencyExpression:
         EdkLogger.debug(EdkLogger.DEBUG_8, repr(self))
         if Optimize:
             self.Optimize()
-            EdkLogger.debug(EdkLogger.DEBUG_8, "\n    Optimized: " + repr(self))
+            EdkLogger.debug(EdkLogger.DEBUG_8, "\n    Optimized: {ME}".format(ME=repr(self)))
 
     def __str__(self):
         return " ".join(self.TokenList)
 
     def __repr__(self):
-        WellForm = ''
-        for Token in self.PostfixNotation:
-            if Token in self.SupportedOpcode:
-                WellForm += "\n    " + Token
-            else:
-                WellForm += ' ' + Token
-        return WellForm
+        return ''.join("{sep}{tok}".format(
+                tok=Token,
+                sep="\n    " if Token in DependencyExpression.SupportedOpcode else ' ')
+            for Token in self.PostfixNotation)
 
     ## Split the expression string into token list
     def GetExpressionTokenList(self):
@@ -359,11 +356,9 @@ class DependencyExpression:
             else:
                 Buffer.write(self.GetGuidValue(Item))
 
-        FilePath = ""
         FileChangeFlag = True
         if File is None:
             sys.stdout.write(Buffer.getvalue())
-            FilePath = "STDOUT"
         else:
             FileChangeFlag = SaveFileOnChange(File, Buffer.getvalue(), True)
 
diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
index 30280d449f62..4ae977ccd400 100644
--- a/BaseTools/Source/Python/AutoGen/GenMake.py
+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
@@ -1029,7 +1029,7 @@ cleanlib:
                     with open(F.Path, 'r') as f:
                         FileContent = f.read()
                 except BaseException, X:
-                    EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=F.Path + "\n\t" + str(X))
+                    EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData="{PATH}\n\t{VAL}".format(PATH=F.Path, VAL=str(X)))
 
                 if len(FileContent) == 0:
                     continue
@@ -1552,9 +1552,9 @@ class TopLevelMakefile(BuildFile):
             else:
                 pcdname = '.'.join(pcd[0:2])
             if pcd[3].startswith('{'):
-                ExtraOption += " --pcd " + pcdname + '=' + 'H' + '"' + pcd[3] + '"'
+                ExtraOption = '{ORIG} --pcd {NAME}=H"{VAL}"'.format(NAME=pcdname, VAL=pcd[3], ORIG=ExtraOption)
             else:
-                ExtraOption += " --pcd " + pcdname + '=' + pcd[3]
+                ExtraOption = "{ORIG} --pcd {NAME}={VAL}".format(NAME=pcdname, VAL=pcd[3], ORIG=ExtraOption)
 
         MakefileName = self._FILE_NAME_[self._FileType]
         SubBuildCommandList = []
diff --git a/BaseTools/Source/Python/AutoGen/GenVar.py b/BaseTools/Source/Python/AutoGen/GenVar.py
index 2eab278d6876..35f022ac2e19 100644
--- a/BaseTools/Source/Python/AutoGen/GenVar.py
+++ b/BaseTools/Source/Python/AutoGen/GenVar.py
@@ -78,7 +78,7 @@ class VariableMgr(object):
                         value_list += [hex(unpack("B",data_byte)[0])]
                 newvalue[int(item.var_offset,16) if item.var_offset.upper().startswith("0X") else int(item.var_offset)] = value_list
             try:
-                newvaluestr = "{" + ",".join(VariableMgr.assemble_variable(newvalue)) +"}"
+                newvaluestr = '{{{mid}}}'.format(mid=",".join(VariableMgr.assemble_variable(newvalue)))
             except:
                 EdkLogger.error("build", AUTOGEN_ERROR, "Variable offset conflict in PCDs: %s \n" % (" and ".join(item.pcdname for item in sku_var_info_offset_list)))
             n = sku_var_info_offset_list[0]
diff --git a/BaseTools/Source/Python/AutoGen/IdfClassObject.py b/BaseTools/Source/Python/AutoGen/IdfClassObject.py
index 8b84806f9f36..8a1f51daf435 100644
--- a/BaseTools/Source/Python/AutoGen/IdfClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/IdfClassObject.py
@@ -121,7 +121,7 @@ def SearchImageID(ImageFileObject, FileList):
                 for Line in f:
                     ImageIdList = IMAGE_TOKEN.findall(Line)
                     for ID in ImageIdList:
-                        EdkLogger.debug(EdkLogger.DEBUG_5, "Found ImageID identifier: " + ID)
+                        EdkLogger.debug(EdkLogger.DEBUG_5, "Found ImageID identifier: {id}".format(id=ID))
                         ImageFileObject.SetImageIDReferenced(ID)
 
 class ImageFileObject(object):
diff --git a/BaseTools/Source/Python/AutoGen/StrGather.py b/BaseTools/Source/Python/AutoGen/StrGather.py
index c0a39e4a12f1..bc5a23e9d920 100644
--- a/BaseTools/Source/Python/AutoGen/StrGather.py
+++ b/BaseTools/Source/Python/AutoGen/StrGather.py
@@ -110,7 +110,7 @@ def DecToHexStr(Dec, Digit = 8):
 #
 def DecToHexList(Dec, Digit = 8):
     Hex = '{0:0{1}X}'.format(Dec,Digit)
-    return ["0x" + Hex[Bit:Bit + 2] for Bit in range(Digit - 2, -1, -2)]
+    return ["0x{HEX}".format(HEX=Hex[Bit:Bit + 2]) for Bit in range(Digit - 2, -1, -2)]
 
 ## Convert a acsii string to a hex list
 #
@@ -532,7 +532,7 @@ def SearchString(UniObjectClass, FileList, IsCompatibleMode):
             with open(File, 'r') as f:
                 for Line in f:
                     for StrName in STRING_TOKEN.findall(Line):
-                        EdkLogger.debug(EdkLogger.DEBUG_5, "Found string identifier: " + StrName)
+                        EdkLogger.debug(EdkLogger.DEBUG_5, "Found string identifier: {NAME}".format(NAME=StrName))
                         UniObjectClass.SetStringReferenced(StrName)
 
     UniObjectClass.ReToken()
diff --git a/BaseTools/Source/Python/AutoGen/UniClassObject.py b/BaseTools/Source/Python/AutoGen/UniClassObject.py
index bb37fbfd6a0c..73ca5b54778f 100644
--- a/BaseTools/Source/Python/AutoGen/UniClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/UniClassObject.py
@@ -161,8 +161,7 @@ class Ucs2Codec(codecs.Codec):
         for Char in input:
             CodePoint = ord(Char)
             if CodePoint >= 0xd800 and CodePoint <= 0xdfff:
-                raise ValueError("Code Point is in range reserved for " +
-                                 "UTF-16 surrogate pairs")
+                raise ValueError("Code Point is in range reserved for UTF-16 surrogate pairs")
             elif CodePoint > 0xffff:
                 raise ValueError("Code Point too large to encode in UCS-2")
         return self.__utf16.encode(input)
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index 2c6bb8e396a9..b2a9bb1134ed 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -186,7 +186,7 @@ class VAR_CHECK_PCD_VARIABLE_TAB(object):
         self.Type = 0
         self.Reserved = 0
         self.Attributes = 0x00000000
-        self.Guid = eval("[" + TokenSpaceGuid.replace("{", "").replace("}", "") + "]")
+        self.Guid = eval("[{GUID}]".format(GUID=TokenSpaceGuid.replace("{", "").replace("}", "")))
         self.Name = PcdCName
         self.validtab = []
 
@@ -258,7 +258,6 @@ class VAR_CHECK_PCD_VALID_LIST(VAR_CHECK_PCD_VALID_OBJ):
             else:
                 self.data.add(int(valid_num))
 
-                
         self.Length = 5 + len(self.data) * self.StorageWidth
         
            
@@ -266,13 +265,10 @@ class VAR_CHECK_PCD_VALID_RANGE(VAR_CHECK_PCD_VALID_OBJ):
     def __init__(self, VarOffset, validrange, PcdDataType):
         super(VAR_CHECK_PCD_VALID_RANGE, self).__init__(VarOffset, validrange, PcdDataType)
         self.Type = 2
-        RangeExpr = ""
-        i = 0
-        for item in self.rawdata:
-            if i == 0:
-                RangeExpr = "( " + item + " )"
-            else:
-                RangeExpr = RangeExpr + "OR ( " + item + " )"
+        if self.rawdata:
+            RangeExpr = "( {ITEM} )".format(ITEM=self.rawdata[-1])
+        else:
+            RangeExpr = ""
         range_result = RangeExpression(RangeExpr, self.PcdDataType)(True)
         for rangelist in range_result:
             for obj in rangelist.pop():
@@ -285,5 +281,4 @@ def GetValidationObject(PcdClass, VarOffset):
         return VAR_CHECK_PCD_VALID_RANGE(VarOffset, PcdClass.validateranges, PcdClass.DatumType)
     if PcdClass.validlists:
         return VAR_CHECK_PCD_VALID_LIST(VarOffset, PcdClass.validlists, PcdClass.DatumType)
-    else:
-        return None
+    return None
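[Editorial illustration, not part of the patch] A quick stand-alone check of the `VAR_CHECK_PCD_VALID_RANGE` rewrite above: the removed loop never incremented `i`, so every iteration took the `i == 0` branch and overwrote `RangeExpr`, leaving only the last entry. `self.rawdata[-1]` therefore preserves the existing behavior exactly (helper names below are illustrative):

```python
# The removed loop, verbatim logic: 'i' is never incremented, so the
# else-branch ("OR ( ... )") is dead code and only the last item survives.
def range_expr_old(rawdata):
    RangeExpr = ""
    i = 0
    for item in rawdata:
        if i == 0:
            RangeExpr = "( " + item + " )"
        else:
            RangeExpr = RangeExpr + "OR ( " + item + " )"
    return RangeExpr

# The rewritten form from the patch: last item or empty string.
def range_expr_new(rawdata):
    return "( {ITEM} )".format(ITEM=rawdata[-1]) if rawdata else ""
```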
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index e5d17e6b4de0..36f2654fc9cf 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -133,7 +133,7 @@ def BuildOptionValue(PcdValue, GuidDict):
     elif PcdValue.startswith(("L'", "'")):
         InputValue = PcdValue
     elif PcdValue.startswith('L'):
-        InputValue = 'L"' + PcdValue[1:] + '"'
+        InputValue = 'L"{VAL}"'.format(VAL=PcdValue[1:])
     else:
         InputValue = PcdValue
     if IsFieldValueAnArray(InputValue):
@@ -178,7 +178,7 @@ def ReplaceExprMacro(String, Macros, ExceptionList = None):
                 # For example: DEFINE ARCH = IA32 X64
                 # $(ARCH) is replaced with "IA32 X64"
                 if ExceptionList and Macro in ExceptionList:
-                    RetStr += '"' + Macros[Macro] + '"'
+                    RetStr = '{ORIG}"{MACRO}"'.format(MACRO=Macros[Macro], ORIG=RetStr)
                 elif Macros[Macro].strip():
                     RetStr += Macros[Macro]
                 else:
@@ -197,7 +197,7 @@ def IntToStr(Value):
     while Value > 0:
         StrList.append(chr(Value & 0xff))
         Value = Value >> 8
-    Value = '"' + ''.join(StrList) + '"'
+    Value = '"{VAL}"'.format(VAL=''.join(StrList))
     return Value
 
 SupportedInMacroList = ['TARGET', 'TOOL_CHAIN_TAG', 'ARCH', 'FAMILY']
@@ -223,17 +223,24 @@ class BaseExpression(object):
 class ValueExpression(BaseExpression):
     # Logical operator mapping
     LogicalOperators = {
-        '&&' : 'and', '||' : 'or',
-        '!'  : 'not', 'AND': 'and',
-        'OR' : 'or' , 'NOT': 'not',
-        'XOR': '^'  , 'xor': '^',
-        'EQ' : '==' , 'NE' : '!=',
-        'GT' : '>'  , 'LT' : '<',
-        'GE' : '>=' , 'LE' : '<=',
+        '&&' : 'and',
+        '||' : 'or',
+        '!'  : 'not',
+        'AND': 'and',
+        'OR' : 'or',
+        'NOT': 'not',
+        'XOR': '^',
+        'xor': '^',
+        'EQ' : '==',
+        'NE' : '!=',
+        'GT' : '>',
+        'LT' : '<',
+        'GE' : '>=',
+        'LE' : '<=',
         'IN' : 'in'
     }
 
-    NonLetterOpLst = ['+', '-', '*', '/', '%', '&', '|', '^', '~', '<<', '>>', '!', '=', '>', '<', '?', ':']
+    NonLetterOpLst = {'+', '-', '*', '/', '%', '&', '|', '^', '~', '<<', '>>', '!', '=', '>', '<', '?', ':'}
 
 
     SymbolPattern = re.compile("("
@@ -710,18 +717,15 @@ class ValueExpression(BaseExpression):
         if Expr.startswith('L"'):
             # Skip L
             self._Idx += 1
-            UStr = self.__GetString()
-            self._Token = 'L"' + UStr + '"'
+            self._Token = 'L"{STR}"'.format(STR=self.__GetString())
             return self._Token
         elif Expr.startswith("L'"):
             # Skip L
             self._Idx += 1
-            UStr = self.__GetString()
-            self._Token = "L'" + UStr + "'"
+            self._Token = "L'{STR}'".format(STR=self.__GetString())
             return self._Token
         elif Expr.startswith("'"):
-            UStr = self.__GetString()
-            self._Token = "'" + UStr + "'"
+            self._Token = "'{STR}'".format(STR=self.__GetString())
             return self._Token
         elif Expr.startswith('UINT'):
             Re = re.compile('(?:UINT8|UINT16|UINT32|UINT64)\((.+)\)')
@@ -758,7 +762,7 @@ class ValueExpression(BaseExpression):
                 return self.__GetString()
             elif Ch == '{':
                 return self.__GetArray()
-            elif Ch == '(' or Ch == ')':
+            elif Ch in {'(', ')'}:
                 self._Idx += 1
                 self._Token = Ch
                 return self._Token
@@ -768,7 +772,7 @@ class ValueExpression(BaseExpression):
     # Parse operator
     def _GetOperator(self):
         self.__SkipWS()
-        LegalOpLst = ['&&', '||', '!=', '==', '>=', '<='] + self.NonLetterOpLst + ['?',':']
+        LegalOpLst = {'&&', '||', '!=', '==', '>=', '<=', '?', ':'}.union(self.NonLetterOpLst)
 
         self._Token = ''
         Expr = self._Expr[self._Idx:]
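[Editorial illustration, not part of the patch] The `NonLetterOpLst` and `LegalOpLst` changes above swap lists for sets because these containers are only ever used for membership tests. A rough timing sketch of the difference (absolute numbers vary by machine; only the relative cost matters):

```python
import timeit

ops_list = ['+', '-', '*', '/', '%', '&', '|', '^', '~', '<<', '>>', '!', '=', '>', '<', '?', ':']
ops_set = set(ops_list)

# ':' is the last element, i.e. the worst case for a linear list scan;
# the set lookup is a constant-time hash probe regardless of position.
list_time = timeit.timeit(lambda: ':' in ops_list, number=100000)
set_time = timeit.timeit(lambda: ':' in ops_set, number=100000)
```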
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index 0bfb26548d9b..bfb6e56a923f 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -713,7 +713,7 @@ class TemplateString(object):
                 self._SubSectionList = [TemplateSection]
 
         def __str__(self):
-            return self._Template + " : " + str(self._PlaceHolderList)
+            return "{TEM} : {LIST}".format(TEM=self._Template, LIST=str(self._PlaceHolderList))
 
         def Instantiate(self, PlaceHolderValues):
             RepeatTime = -1
@@ -894,7 +894,7 @@ class Progressor:
                 TimeUp = self.Interval
             time.sleep(self._CheckInterval)
             TimeUp -= self._CheckInterval
-        sys.stdout.write(" " + self.CodaMessage + "\n")
+        sys.stdout.write(" {MSG}\n".format(MSG=self.CodaMessage))
         sys.stdout.flush()
 
     ## Abort the progress display
@@ -1313,7 +1313,7 @@ def ParseFieldValue (Value):
         if Value[0] == '"' and Value[-1] == '"':
             Value = Value[1:-1]
         try:
-            Value = "'" + uuid.UUID(Value).get_bytes_le() + "'"
+            Value = "'{GUID}'".format(GUID=uuid.UUID(Value).get_bytes_le())
         except ValueError, Message:
             raise BadExpression('%s' % Message)
         Value, Size = ParseFieldValue(Value)
@@ -2050,7 +2050,7 @@ class SkuClass():
                     ArrayStrList.append(hex(int(self.AvailableSkuIds[skuname])))
                     skuname = self.GetNextSkuId(skuname)
                 ArrayStrList.append("0x0")
-            ArrayStr = "{" + ",".join(ArrayStrList) +  "}"
+            ArrayStr = "{{{ARRAY}}}".format(ARRAY=",".join(ArrayStrList))
         return ArrayStr
     def __GetAvailableSkuIds(self):
         return self.AvailableSkuIds
diff --git a/BaseTools/Source/Python/Common/String.py b/BaseTools/Source/Python/Common/String.py
index 34361ecdd58c..1516e6c2ae9c 100644
--- a/BaseTools/Source/Python/Common/String.py
+++ b/BaseTools/Source/Python/Common/String.py
@@ -712,7 +712,7 @@ def RaiseParserError(Line, Section, File, Format='', LineNo= -1):
         LineNo = GetLineNo(open(os.path.normpath(File), 'r').read(), Line)
     ErrorMsg = "Invalid statement '%s' is found in section '%s'" % (Line, Section)
     if Format != '':
-        Format = "Correct format is " + Format
+        Format = "Correct format is {FMT}".format(FMT=Format)
     EdkLogger.error("Parser", PARSER_ERROR, ErrorMsg, File=File, Line=LineNo, ExtraData=Format, RaiseError=EdkLogger.IsRaiseError)
 
 ## WorkspaceFile
diff --git a/BaseTools/Source/Python/CommonDataClass/CommonClass.py b/BaseTools/Source/Python/CommonDataClass/CommonClass.py
index e29f5211d5c7..d7123fe91ee1 100644
--- a/BaseTools/Source/Python/CommonDataClass/CommonClass.py
+++ b/BaseTools/Source/Python/CommonDataClass/CommonClass.py
@@ -35,16 +35,16 @@
 # @var DefaultValue:       To store value for DefaultValue
 #
 class SkuInfoClass(object):
-    def __init__(self, SkuIdName = '', SkuId = '', VariableName = '', VariableGuid = '', VariableOffset = '', 
-                 HiiDefaultValue = '', VpdOffset = '', DefaultValue = '', VariableGuidValue = '', VariableAttribute = '', DefaultStore = None):
+    def __init__(self, SkuIdName = '', SkuId = '', VariableName = '', VariableGuid = '', VariableOffset = '',
+                 HiiDefaultValue = '', VpdOffset = '', DefaultValue = '', VariableGuidValue = '', VariableAttribute = '', DefaultStore = None):
         self.SkuIdName = SkuIdName
         self.SkuId = SkuId
         
         #
         # Used by Hii
         #
         if DefaultStore is None:
             DefaultStore = {}
         self.VariableName = VariableName
         self.VariableGuid = VariableGuid
         self.VariableGuidValue = VariableGuidValue
@@ -68,15 +66,18 @@ class SkuInfoClass(object):
     #  Convert each member of the class to string
     #  Organize to a signle line format string
     #
-    #  @retval Rtn Formatted String
+    #  @retval Formatted String
     #
     def __str__(self):
-        Rtn = 'SkuId = ' + str(self.SkuId) + "," + \
-                    'SkuIdName = ' + str(self.SkuIdName) + "," + \
-                    'VariableName = ' + str(self.VariableName) + "," + \
-                    'VariableGuid = ' + str(self.VariableGuid) + "," + \
-                    'VariableOffset = ' + str(self.VariableOffset) + "," + \
-                    'HiiDefaultValue = ' + str(self.HiiDefaultValue) + "," + \
-                    'VpdOffset = ' + str(self.VpdOffset) + "," + \
-                    'DefaultValue = ' + str(self.DefaultValue) + ","
-        return Rtn
+        return 'SkuId = {SKUID},SkuIdName = {SKUNAME},'\
+               'VariableName = {VARNAME},VariableGuid = {VARGUID},'\
+               'VariableOffset = {VAROFFSET},HiiDefaultValue = {DEF},'\
+               'VpdOffset = {OFFSET},DefaultValue = {DEF2},'.format(
+                    SKUID=self.SkuId,
+                    SKUNAME=self.SkuIdName,
+                    VARNAME=self.VariableName,
+                    VARGUID=self.VariableGuid,
+                    VAROFFSET=self.VariableOffset,
+                    DEF=self.HiiDefaultValue,
+                    OFFSET=self.VpdOffset,
+                    DEF2=self.DefaultValue)
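[Editorial illustration, not part of the patch] One Python detail worth keeping in mind for `__init__` signatures like `SkuInfoClass` above: a mutable default such as `DefaultStore = {}` is evaluated once at function definition and shared by every call that omits the argument, which is why the `DefaultStore = None` guard idiom exists. A minimal stand-alone demonstration (class names are illustrative):

```python
class Shared(object):
    def __init__(self, store={}):          # one dict, shared across instances
        self.store = store

class Guarded(object):
    def __init__(self, store=None):        # fresh dict per instance
        self.store = {} if store is None else store

a, b = Shared(), Shared()
a.store['k'] = 1                           # mutation leaks into b.store
c, d = Guarded(), Guarded()
c.store['k'] = 1                           # d.store stays empty
```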
diff --git a/BaseTools/Source/Python/GenFds/Capsule.py b/BaseTools/Source/Python/GenFds/Capsule.py
index 6aae2fcb7d97..ab5ea9fc0dd0 100644
--- a/BaseTools/Source/Python/GenFds/Capsule.py
+++ b/BaseTools/Source/Python/GenFds/Capsule.py
@@ -28,9 +28,8 @@ from struct import pack
 from GenFds import FindExtendTool
 from Common import EdkLogger
 from Common.BuildToolError import *
+from Common.DataType import TAB_LINE_BREAK
 
-
-T_CHAR_LF = '\n'
 WIN_CERT_REVISION      = 0x0200
 WIN_CERT_TYPE_EFI_GUID = 0x0EF1
 EFI_CERT_TYPE_PKCS7_GUID = uuid.UUID('{4aafd29d-68df-49ee-8aa9-347d375665a7}')
@@ -209,16 +208,14 @@ class Capsule (CapsuleClassObject) :
             return self.GenFmpCapsule()
 
         CapInfFile = self.GenCapInf()
-        CapInfFile.writelines("[files]" + T_CHAR_LF)
+        CapInfFile.writelines("[files]{END}".format(END=TAB_LINE_BREAK))
         CapFileList = []
         for CapsuleDataObj in self.CapsuleDataList :
             CapsuleDataObj.CapsuleName = self.CapsuleName
             FileName = CapsuleDataObj.GenCapsuleSubItem()
             CapsuleDataObj.CapsuleName = None
             CapFileList.append(FileName)
-            CapInfFile.writelines("EFI_FILE_NAME = " + \
-                                   FileName      + \
-                                   T_CHAR_LF)
+            CapInfFile.writelines("EFI_FILE_NAME = {NAME}{END}".format(NAME=FileName, END=TAB_LINE_BREAK))
         SaveFileOnChange(self.CapInfFileName, CapInfFile.getvalue(), False)
         CapInfFile.close()
         #
@@ -245,16 +242,12 @@ class Capsule (CapsuleClassObject) :
     #
     def GenCapInf(self):
         self.CapInfFileName = os.path.join(GenFdsGlobalVariable.FvDir,
-                                   self.UiCapsuleName +  "_Cap" + '.inf')
+                                   "{NAME}_Cap.inf".format(NAME=self.UiCapsuleName))
         CapInfFile = StringIO.StringIO()
 
-        CapInfFile.writelines("[options]" + T_CHAR_LF)
+        CapInfFile.writelines("[options]{END}".format(END=TAB_LINE_BREAK))
 
         for Item in self.TokensDict:
-            CapInfFile.writelines("EFI_"                    + \
-                                  Item                      + \
-                                  ' = '                     + \
-                                  self.TokensDict[Item]     + \
-                                  T_CHAR_LF)
+            CapInfFile.writelines("EFI_{ITEM} = {ENTRY}{END}".format(ITEM=Item, ENTRY=self.TokensDict[Item], END=TAB_LINE_BREAK))
 
         return CapInfFile
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index d511cf4f9d5a..55348083b954 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -2019,9 +2019,9 @@ class FdfParser:
         AllStrLen = len (AllString)
         DataString = ""
         while AllStrLen > 4:
-            DataString = DataString + "0x" + AllString[AllStrLen - 2: AllStrLen] + ","
+            DataString = "{ORIG}0x{VAL},".format(ORIG=DataString, VAL=AllString[AllStrLen - 2: AllStrLen])
             AllStrLen  = AllStrLen - 2
-        DataString = DataString + AllString[:AllStrLen] + ","
+        DataString = "{ORIG}{VAL},".format(ORIG=DataString, VAL=AllString[:AllStrLen])
 
         # byte value array
         if len (self.__Token) <= 4:
@@ -2059,9 +2059,9 @@ class FdfParser:
             AllStrLen = len (AllString)
             DataString = ""
             while AllStrLen > 4:
-                DataString = DataString + "0x" + AllString[AllStrLen - 2: AllStrLen] + ","
+                DataString = "{ORIG}0x{VAL},".format(ORIG=DataString, VAL=AllString[AllStrLen - 2: AllStrLen])
                 AllStrLen  = AllStrLen - 2
-            DataString = DataString + AllString[:AllStrLen] + ","
+            DataString = "{ORIG}{VAL},".format(ORIG=DataString, VAL=AllString[:AllStrLen])
 
             # byte value array
             if len (self.__Token) <= 4:
diff --git a/BaseTools/Source/Python/GenFds/FfsInfStatement.py b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
index 39426b939b4a..e4276c3a8c07 100644
--- a/BaseTools/Source/Python/GenFds/FfsInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
@@ -361,7 +361,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
             os.makedirs(self.OutputPath)
 
         self.EfiOutputPath, self.EfiDebugPath = self.__GetEFIOutPutPath__()
-        GenFdsGlobalVariable.VerboseLogger( "ModuelEFIPath: " + self.EfiOutputPath)
+        GenFdsGlobalVariable.VerboseLogger("ModuleEFIPath: {PATH}".format(PATH=self.EfiOutputPath))
 
     ## PatchEfiFile
     #
@@ -564,7 +564,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
 
             Rule = GenFdsGlobalVariable.FdfParser.Profile.RuleDict.get(RuleName)
             if Rule is not None:
-                GenFdsGlobalVariable.VerboseLogger ("Want To Find Rule Name is : " + RuleName)
+                GenFdsGlobalVariable.VerboseLogger ("Want To Find Rule Name is : {NAME}".format(NAME=RuleName))
                 return Rule
 
         RuleName = 'RULE'      + \
@@ -582,7 +582,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
 
         Rule = GenFdsGlobalVariable.FdfParser.Profile.RuleDict.get(RuleName)
         if Rule is not None:
-            GenFdsGlobalVariable.VerboseLogger ("Want To Find Rule Name is : " + RuleName)
+            GenFdsGlobalVariable.VerboseLogger ("Want To Find Rule Name is : {NAME}".format(NAME=RuleName))
             return Rule
 
         if Rule is None :
@@ -634,7 +634,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
         CurArchList = TargetArchList
         if PlatformArchList != []:
             CurArchList = list(set (TargetArchList) & set (PlatformArchList))
-        GenFdsGlobalVariable.VerboseLogger ("Valid target architecture(s) is : " + " ".join(CurArchList))
+        GenFdsGlobalVariable.VerboseLogger ("Valid target architecture(s) is : {ARCH}".format(ARCH=" ".join(CurArchList)))
 
         ArchList = []
         if self.KeyStringList != []:
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index d4b0611fc55a..c672f1d7d8fa 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -108,9 +108,7 @@ class FV (FvClassObject):
             FfsFileList.append(FileName)
             # Add Apriori file name to Inf file
             if not Flag:
-                self.FvInfFile.writelines("EFI_FILE_NAME = " + \
-                                            FileName          + \
-                                            TAB_LINE_BREAK)
+                self.FvInfFile.writelines("EFI_FILE_NAME = {FN}{END}".format(FN=FileName, END=TAB_LINE_BREAK))
 
         # Process Modules in FfsList
         for FfsFile in self.FfsList :
@@ -122,9 +120,7 @@ class FV (FvClassObject):
             FileName = FfsFile.GenFfs(MacroDict, FvParentAddr=BaseAddress, IsMakefile=Flag, FvName=self.UiFvName)
             FfsFileList.append(FileName)
             if not Flag:
-                self.FvInfFile.writelines("EFI_FILE_NAME = " + \
-                                            FileName          + \
-                                            TAB_LINE_BREAK)
+                self.FvInfFile.writelines("EFI_FILE_NAME = {FN}{END}".format(FN=FileName, END=TAB_LINE_BREAK))
         if not Flag:
             SaveFileOnChange(self.InfFileName, self.FvInfFile.getvalue(), False)
             self.FvInfFile.close()
@@ -267,67 +263,46 @@ class FV (FvClassObject):
         #
         # Add [Options]
         #
-        self.FvInfFile.writelines("[options]" + TAB_LINE_BREAK)
+        self.FvInfFile.writelines("[options]{END}".format(END=TAB_LINE_BREAK))
         if BaseAddress is not None :
-            self.FvInfFile.writelines("EFI_BASE_ADDRESS = " + \
-                                       BaseAddress          + \
-                                       TAB_LINE_BREAK)
+            self.FvInfFile.writelines("EFI_BASE_ADDRESS = {BA}{END}".format(BA=BaseAddress,END=TAB_LINE_BREAK))
 
         if BlockSize is not None:
-            self.FvInfFile.writelines("EFI_BLOCK_SIZE = " + \
-                                      '0x%X' %BlockSize    + \
-                                      TAB_LINE_BREAK)
+            self.FvInfFile.writelines("EFI_BLOCK_SIZE = 0x{BS:X}{END}".format(BS=BlockSize,END=TAB_LINE_BREAK))
             if BlockNum is not None:
-                self.FvInfFile.writelines("EFI_NUM_BLOCKS   = "  + \
-                                      ' 0x%X' %BlockNum    + \
-                                      TAB_LINE_BREAK)
+                self.FvInfFile.writelines("EFI_NUM_BLOCKS   = 0x{BN:X}{END}".format(BN=BlockNum, END=TAB_LINE_BREAK))
         else:
             if self.BlockSizeList == []:
                 if not self._GetBlockSize():
                     #set default block size is 1
-                    self.FvInfFile.writelines("EFI_BLOCK_SIZE  = 0x1" + TAB_LINE_BREAK)
+                    self.FvInfFile.writelines("EFI_BLOCK_SIZE  = 0x1{END}".format(END=TAB_LINE_BREAK))
             
             for BlockSize in self.BlockSizeList :
                 if BlockSize[0] is not None:
-                    self.FvInfFile.writelines("EFI_BLOCK_SIZE  = "  + \
-                                          '0x%X' %BlockSize[0]    + \
-                                          TAB_LINE_BREAK)
+                    self.FvInfFile.writelines("EFI_BLOCK_SIZE  = 0x{BS:X}{END}".format(BS=BlockSize[0], END=TAB_LINE_BREAK))
 
                 if BlockSize[1] is not None:
-                    self.FvInfFile.writelines("EFI_NUM_BLOCKS   = "  + \
-                                          ' 0x%X' %BlockSize[1]    + \
-                                          TAB_LINE_BREAK)
+                    self.FvInfFile.writelines("EFI_NUM_BLOCKS   = 0x{BN:X}{END}".format(BN=BlockSize[1], END=TAB_LINE_BREAK))
 
         if self.BsBaseAddress is not None:
-            self.FvInfFile.writelines('EFI_BOOT_DRIVER_BASE_ADDRESS = ' + \
-                                       '0x%X' %self.BsBaseAddress)
+            self.FvInfFile.writelines('EFI_BOOT_DRIVER_BASE_ADDRESS = 0x{BA:X}'.format(BA=self.BsBaseAddress))
         if self.RtBaseAddress is not None:
-            self.FvInfFile.writelines('EFI_RUNTIME_DRIVER_BASE_ADDRESS = ' + \
-                                      '0x%X' %self.RtBaseAddress)
+            self.FvInfFile.writelines('EFI_RUNTIME_DRIVER_BASE_ADDRESS = 0x{BA:X}'.format(BA=self.RtBaseAddress))
         #
         # Add attribute
         #
-        self.FvInfFile.writelines("[attributes]" + TAB_LINE_BREAK)
+        self.FvInfFile.writelines("[attributes]{END}".format(END=TAB_LINE_BREAK))
 
-        self.FvInfFile.writelines("EFI_ERASE_POLARITY   = "       + \
-                                          ' %s' %ErasePloarity    + \
-                                          TAB_LINE_BREAK)
-        if not (self.FvAttributeDict is None):
+        self.FvInfFile.writelines("EFI_ERASE_POLARITY   =  {EP}{END}".format(EP=ErasePloarity, END=TAB_LINE_BREAK))
+        if self.FvAttributeDict:
             for FvAttribute in self.FvAttributeDict.keys() :
                 if FvAttribute == "FvUsedSizeEnable":
-                    if self.FvAttributeDict[FvAttribute].upper() in ('TRUE', '1') :
+                    if self.FvAttributeDict[FvAttribute].upper() in {'TRUE', '1'}:
                         self.UsedSizeEnable = True
                     continue
-                self.FvInfFile.writelines("EFI_"            + \
-                                          FvAttribute       + \
-                                          ' = '             + \
-                                          self.FvAttributeDict[FvAttribute] + \
-                                          TAB_LINE_BREAK )
-        if self.FvAlignment is not None:
-            self.FvInfFile.writelines("EFI_FVB2_ALIGNMENT_"     + \
-                                       self.FvAlignment.strip() + \
-                                       " = TRUE"                + \
-                                       TAB_LINE_BREAK)
+                self.FvInfFile.writelines("EFI_{FA} = {VAL}{END}".format(FA=FvAttribute, VAL=self.FvAttributeDict[FvAttribute], END=TAB_LINE_BREAK))
+        if self.FvAlignment:
+            self.FvInfFile.writelines("EFI_FVB2_ALIGNMENT_{FA} = TRUE{END}".format(FA=self.FvAlignment.strip(), END=TAB_LINE_BREAK))
                                        
         #
         # Generate FV extension header file
@@ -410,16 +385,11 @@ class FV (FvClassObject):
                 if Changed:
                   if os.path.exists (self.InfFileName):
                     os.remove (self.InfFileName)
-                self.FvInfFile.writelines("EFI_FV_EXT_HEADER_FILE_NAME = "      + \
-                                           FvExtHeaderFileName                  + \
-                                           TAB_LINE_BREAK)
-
+                self.FvInfFile.writelines("EFI_FV_EXT_HEADER_FILE_NAME = {NAME}{END}".format(NAME=FvExtHeaderFileName, END=TAB_LINE_BREAK))
          
         #
         # Add [Files]
         #
-        self.FvInfFile.writelines("[files]" + TAB_LINE_BREAK)
+        self.FvInfFile.writelines("[files]{END}".format(END=TAB_LINE_BREAK))
         if VtfDict and self.UiFvName in VtfDict:
-            self.FvInfFile.writelines("EFI_FILE_NAME = "                   + \
-                                       VtfDict[self.UiFvName]              + \
-                                       TAB_LINE_BREAK)
+            self.FvInfFile.writelines("EFI_FILE_NAME = {NAME}{END}".format(NAME=VtfDict[self.UiFvName], END=TAB_LINE_BREAK))
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index f0b51e25dfa2..998bd5345c3c 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -95,7 +95,7 @@ def main():
             if 'EDK_SOURCE' in os.environ:
                 GenFdsGlobalVariable.EdkSourceDir = os.path.normcase(os.environ['EDK_SOURCE'])
             if (Options.debug):
-                GenFdsGlobalVariable.VerboseLogger("Using Workspace:" + Workspace)
+                GenFdsGlobalVariable.VerboseLogger("Using Workspace: {WKSP}".format(WKSP=Workspace))
             if Options.GenfdsMultiThread:
                 GenFdsGlobalVariable.EnableGenfdsMultiThread = True
         os.chdir(GenFdsGlobalVariable.WorkSpaceDir)
@@ -207,7 +207,7 @@ def main():
                         GlobalData.gEdkSource = List[1].strip()
                         GlobalData.gGlobalDefines["EDK_SOURCE"] = GlobalData.gEdkSource
                         continue
-                    elif List[0].strip() in ["WORKSPACE", "TARGET", "TOOLCHAIN"]:
+                    elif List[0].strip() in {"WORKSPACE", "TARGET", "TOOLCHAIN"}:
                         GlobalData.gGlobalDefines[List[0].strip()] = List[1].strip()
                     else:
                         GlobalData.gCommandLineDefines[List[0].strip()] = List[1].strip()
diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
index 8537800bc2b2..b840079e7ad4 100644
--- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
+++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
@@ -295,7 +295,6 @@ class GenFdsGlobalVariable:
         if not os.path.exists(GenFdsGlobalVariable.FfsDir) :
             os.makedirs(GenFdsGlobalVariable.FfsDir)
 
-        T_CHAR_LF = '\n'
         #
         # Create FV Address inf file
         #
@@ -313,13 +312,9 @@ class GenFdsGlobalVariable:
             #
             # Add [Options]
             #
-            FvAddressFile.writelines("[options]" + T_CHAR_LF)
-            FvAddressFile.writelines("EFI_BOOT_DRIVER_BASE_ADDRESS = " + \
-                                           BsAddress + \
-                                           T_CHAR_LF)
-            FvAddressFile.writelines("EFI_RUNTIME_DRIVER_BASE_ADDRESS = " + \
-                                           RtAddress + \
-                                           T_CHAR_LF)
+            FvAddressFile.writelines("[options]{END}".format(END=DataType.TAB_LINE_BREAK))
+            FvAddressFile.writelines("EFI_BOOT_DRIVER_BASE_ADDRESS = {BS}{END}".format(BS=BsAddress, END=DataType.TAB_LINE_BREAK))
+            FvAddressFile.writelines("EFI_RUNTIME_DRIVER_BASE_ADDRESS = {RT}{END}".format(RT=RtAddress, END=DataType.TAB_LINE_BREAK))
 
 
     def SetEnv(FdfParser, WorkSpace, ArchList, GlobalData):
@@ -352,7 +347,6 @@ class GenFdsGlobalVariable:
         if not os.path.exists(GenFdsGlobalVariable.FfsDir):
             os.makedirs(GenFdsGlobalVariable.FfsDir)
 
-        T_CHAR_LF = '\n'
         #
         # Create FV Address inf file
         #
@@ -376,13 +370,9 @@ class GenFdsGlobalVariable:
             #
             # Add [Options]
             #
-            FvAddressFile.writelines("[options]" + T_CHAR_LF)
-            FvAddressFile.writelines("EFI_BOOT_DRIVER_BASE_ADDRESS = " + \
-                                     BsAddress + \
-                                     T_CHAR_LF)
-            FvAddressFile.writelines("EFI_RUNTIME_DRIVER_BASE_ADDRESS = " + \
-                                     RtAddress + \
-                                     T_CHAR_LF)
+            FvAddressFile.writelines("[options]{END}".format(END=DataType.TAB_LINE_BREAK))
+            FvAddressFile.writelines("EFI_BOOT_DRIVER_BASE_ADDRESS = {BS}{END}".format(BS=BsAddress, END=DataType.TAB_LINE_BREAK))
+            FvAddressFile.writelines("EFI_RUNTIME_DRIVER_BASE_ADDRESS = {RT}{END}".format(RT=RtAddress, END=DataType.TAB_LINE_BREAK))
 
     ## ReplaceWorkspaceMacro()
     #
@@ -692,7 +682,7 @@ class GenFdsGlobalVariable:
             if " ".join(Cmd).strip() not in GenFdsGlobalVariable.SecCmdList:
                 GenFdsGlobalVariable.SecCmdList.append(" ".join(Cmd).strip())
         else:
-            GenFdsGlobalVariable.CallExternalTool(Cmd, "Failed to call " + ToolPath, returnValue)
+            GenFdsGlobalVariable.CallExternalTool(Cmd, "Failed to call {PATH}".format(PATH=ToolPath), returnValue)
 
     def CallExternalTool (cmd, errorMess, returnValue=[]):
 
diff --git a/BaseTools/Source/Python/GenFds/OptionRom.py b/BaseTools/Source/Python/GenFds/OptionRom.py
index b05841529940..0b8e79588ff1 100644
--- a/BaseTools/Source/Python/GenFds/OptionRom.py
+++ b/BaseTools/Source/Python/GenFds/OptionRom.py
@@ -27,8 +27,6 @@ from Common.Misc import SaveFileOnChange
 from Common import EdkLogger
 from Common.BuildToolError import *
 
-T_CHAR_LF = '\n'
-
 ## 
 #
 #
diff --git a/BaseTools/Source/Python/GenFds/Vtf.py b/BaseTools/Source/Python/GenFds/Vtf.py
index 291070827b78..6016b6d94e94 100644
--- a/BaseTools/Source/Python/GenFds/Vtf.py
+++ b/BaseTools/Source/Python/GenFds/Vtf.py
@@ -19,7 +19,7 @@ from GenFdsGlobalVariable import GenFdsGlobalVariable
 import Common.LongFilePathOs as os
 from CommonDataClass.FdfClass import VtfClassObject
 from Common.LongFilePathSupport import OpenLongFilePath as open
-T_CHAR_LF = '\n'
+from Common.DataType import TAB_LINE_BREAK
 
 ## generate VTF
 #
@@ -43,7 +43,7 @@ class Vtf (VtfClassObject):
     #
     def GenVtf(self, FdAddressDict) :
         self.GenBsfInf()
-        OutputFile = os.path.join(GenFdsGlobalVariable.FvDir, self.UiName + '.Vtf')
+        OutputFile = os.path.join(GenFdsGlobalVariable.FvDir, '{NAME}.Vtf'.format(NAME=self.UiName))
         BaseAddArg = self.GetBaseAddressArg(FdAddressDict)
         OutputArg, VtfRawDict = self.GenOutputArg()
         
@@ -69,77 +69,43 @@ class Vtf (VtfClassObject):
         self.BsfInfName = os.path.join(GenFdsGlobalVariable.FvDir, self.UiName + '.inf')
         with open(self.BsfInfName, 'w') as BsfInf:
             if self.ResetBin is not None:
-                BsfInf.writelines ("[OPTIONS]" + T_CHAR_LF)
-                BsfInf.writelines ("IA32_RST_BIN" + \
-                                   " = " + \
-                                   GenFdsGlobalVariable.MacroExtend(GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.ResetBin)) + \
-                                   T_CHAR_LF)
-                BsfInf.writelines (T_CHAR_LF)
+                BsfInf.writelines ("[OPTIONS]{END}".format(END=TAB_LINE_BREAK))
+                BsfInf.writelines ("IA32_RST_BIN = {DATA}{END}".format(
+                    DATA=GenFdsGlobalVariable.MacroExtend(GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.ResetBin)),
+                    END=TAB_LINE_BREAK))
+                BsfInf.writelines (TAB_LINE_BREAK)
 
-            BsfInf.writelines ("[COMPONENTS]" + T_CHAR_LF)
+            BsfInf.writelines ("[COMPONENTS]{END}".format(END=TAB_LINE_BREAK))
 
             for ComponentObj in self.ComponentStatementList :
-                BsfInf.writelines ("COMP_NAME" + \
-                                   " = " + \
-                                   ComponentObj.CompName + \
-                                   T_CHAR_LF)
+                BsfInf.writelines ("COMP_NAME = {DATA}{END}".format(DATA=ComponentObj.CompName,END=TAB_LINE_BREAK))
+
                 if ComponentObj.CompLoc.upper() == 'NONE':
-                    BsfInf.writelines ("COMP_LOC" + \
-                                       " = " + \
-                                       'N' + \
-                                       T_CHAR_LF)
-
+                    BsfInf.writelines ("COMP_LOC = N{END}".format(END=TAB_LINE_BREAK))
                 elif ComponentObj.FilePos is not None:
-                    BsfInf.writelines ("COMP_LOC" + \
-                                       " = " + \
-                                       ComponentObj.FilePos + \
-                                       T_CHAR_LF)
+                    BsfInf.writelines ("COMP_LOC = {POS}{END}".format(POS=ComponentObj.FilePos, END=TAB_LINE_BREAK))
                 else:
                     Index = FvList.index(ComponentObj.CompLoc.upper())
                     if Index == 0:
-                        BsfInf.writelines ("COMP_LOC" + \
-                                           " = " + \
-                                           'F' + \
-                                           T_CHAR_LF)
+                        BsfInf.writelines ("COMP_LOC = F{END}".format(END=TAB_LINE_BREAK))
                     elif Index == 1:
-                        BsfInf.writelines ("COMP_LOC" + \
-                                           " = " + \
-                                           'S' + \
-                                           T_CHAR_LF)
+                        BsfInf.writelines ("COMP_LOC = S{END}".format(END=TAB_LINE_BREAK))
 
-                BsfInf.writelines ("COMP_TYPE" + \
-                                   " = " + \
-                                   ComponentObj.CompType + \
-                                   T_CHAR_LF)
-                BsfInf.writelines ("COMP_VER" + \
-                                   " = " + \
-                                   ComponentObj.CompVer + \
-                                   T_CHAR_LF)
-                BsfInf.writelines ("COMP_CS" + \
-                                   " = " + \
-                                   ComponentObj.CompCs + \
-                                   T_CHAR_LF)
+                BsfInf.writelines ("COMP_TYPE = {DATA}{END}".format(DATA=ComponentObj.CompType,END=TAB_LINE_BREAK))
+                BsfInf.writelines ("COMP_VER = {DATA}{END}".format(DATA=ComponentObj.CompVer,END=TAB_LINE_BREAK))
+                BsfInf.writelines ("COMP_CS = {DATA}{END}".format(DATA=ComponentObj.CompCs,END=TAB_LINE_BREAK))
 
                 BinPath = ComponentObj.CompBin
                 if BinPath != '-':
                     BinPath = GenFdsGlobalVariable.MacroExtend(GenFdsGlobalVariable.ReplaceWorkspaceMacro(BinPath))
-                BsfInf.writelines ("COMP_BIN" + \
-                                   " = " + \
-                                   BinPath + \
-                                   T_CHAR_LF)
+                BsfInf.writelines ("COMP_BIN = {DATA}{END}".format(DATA=BinPath, END=TAB_LINE_BREAK))
 
                 SymPath = ComponentObj.CompSym
                 if SymPath != '-':
                     SymPath = GenFdsGlobalVariable.MacroExtend(GenFdsGlobalVariable.ReplaceWorkspaceMacro(SymPath))
-                BsfInf.writelines ("COMP_SYM" + \
-                                   " = " + \
-                                   SymPath + \
-                                   T_CHAR_LF)
-                BsfInf.writelines ("COMP_SIZE" + \
-                                   " = " + \
-                                   ComponentObj.CompSize + \
-                                   T_CHAR_LF)
-                BsfInf.writelines (T_CHAR_LF)
+                BsfInf.writelines ("COMP_SYM = {DATA}{END}".format(DATA=SymPath, END=TAB_LINE_BREAK))
+                BsfInf.writelines ("COMP_SIZE = {DATA}{END}".format(DATA=ComponentObj.CompSize, END=TAB_LINE_BREAK))
+                BsfInf.writelines (TAB_LINE_BREAK)
 
     ## GenFvList() method
     #
diff --git a/BaseTools/Source/Python/Table/TableDataModel.py b/BaseTools/Source/Python/Table/TableDataModel.py
index 9c3d7bd9345f..ec47b9f37097 100644
--- a/BaseTools/Source/Python/Table/TableDataModel.py
+++ b/BaseTools/Source/Python/Table/TableDataModel.py
@@ -1,7 +1,7 @@
 ## @file
 # This file is used to create/update/query/erase table for data models
 #
-# Copyright (c) 2008, Intel Corporation. All rights reserved.<BR>
+# Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
 # This program and the accompanying materials
 # are licensed and made available under the terms and conditions of the BSD License
 # which accompanies this distribution.  The full text of the license may be found at
@@ -60,7 +60,7 @@ class TableDataModel(Table):
     def Insert(self, CrossIndex, Name, Description):
         self.ID = self.ID + 1
         (Name, Description) = ConvertToSqlString((Name, Description))
-        SqlCommand = """insert into %s values(%s, %s, '%s', '%s')""" % (self.Table, self.ID, CrossIndex, Name, Description)
+        SqlCommand = "insert into %s values(%s, %s, '%s', '%s')" % (self.Table, self.ID, CrossIndex, Name, Description)
         Table.Insert(self, SqlCommand)
         
         return self.ID
@@ -87,9 +87,6 @@ class TableDataModel(Table):
     #
     def GetCrossIndex(self, ModelName):
         CrossIndex = -1
-        SqlCommand = """select CrossIndex from DataModel where name = '""" + ModelName + """'"""
+        SqlCommand = "select CrossIndex from DataModel where name = '{NAME}'".format(NAME=ModelName)
         self.Cur.execute(SqlCommand)
-        for Item in self.Cur:
-            CrossIndex = Item[0]
-        
-        return CrossIndex
+        return self.Cur.fetchall()[-1][0]
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index d714c781e970..7b062b564da5 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -220,7 +220,7 @@ class DscBuildData(PlatformBuildClassObject):
     @property
     def OutputPath(self):
         if os.getenv("WORKSPACE"):
-            return os.path.join(os.getenv("WORKSPACE"), self.OutputDirectory, self._Target + "_" + self._Toolchain,PcdValueInitName)
+            return os.path.join(os.getenv("WORKSPACE"), self.OutputDirectory, "{TGT}_{TC}".format(TGT=self._Target, TC=self._Toolchain),PcdValueInitName)
         else:
             return os.path.dirname(self.DscFile)
 
@@ -762,7 +762,7 @@ class DscBuildData(PlatformBuildClassObject):
                     Module.BuildOptions[ToolChainFamily, ToolChain] = Option
                 else:
                     OptionString = Module.BuildOptions[ToolChainFamily, ToolChain]
-                    Module.BuildOptions[ToolChainFamily, ToolChain] = OptionString + " " + Option
+                    Module.BuildOptions[ToolChainFamily, ToolChain] = "{OPT1} {OPT2}".format(OPT1=OptionString, OPT2=Option)
 
             RecordList = self._RawData[MODEL_META_DATA_HEADER, self._Arch, None, ModuleId]
             if DuplicatedFile and not RecordList:
@@ -1539,7 +1539,7 @@ class DscBuildData(PlatformBuildClassObject):
             if not FieldList:
                 continue
             for FieldName in FieldList:
-                FieldName = "." + FieldName
+                FieldName = ".{ORIG}".format(ORIG=FieldName)
                 IsArray = IsFieldValueAnArray(FieldList[FieldName.strip(".")][0])
                 if IsArray and not (FieldList[FieldName.strip(".")][0].startswith('{GUID') and FieldList[FieldName.strip(".")][0].endswith('}')):
                     try:
@@ -1569,7 +1569,7 @@ class DscBuildData(PlatformBuildClassObject):
                     if not FieldList:
                         continue
                     for FieldName in FieldList:
-                        FieldName = "." + FieldName
+                        FieldName = ".{ORIG}".format(ORIG=FieldName)
                         IsArray = IsFieldValueAnArray(FieldList[FieldName.strip(".")][0])
                         if IsArray and not (FieldList[FieldName.strip(".")][0].startswith('{GUID') and FieldList[FieldName.strip(".")][0].endswith('}')):
                             try:
@@ -1593,7 +1593,7 @@ class DscBuildData(PlatformBuildClassObject):
         if Pcd.PcdFieldValueFromComm:
             CApp = CApp + "// From Command Line \n"
         for FieldName in Pcd.PcdFieldValueFromComm:
-            FieldName = "." + FieldName
+            FieldName = ".{ORIG}".format(ORIG=FieldName)
             IsArray = IsFieldValueAnArray(Pcd.PcdFieldValueFromComm[FieldName.strip(".")][0])
             if IsArray and not (Pcd.PcdFieldValueFromComm[FieldName.strip(".")][0].startswith('{GUID') and Pcd.PcdFieldValueFromComm[FieldName.strip(".")][0].endswith('}')):
                 try:
@@ -2043,7 +2043,7 @@ class DscBuildData(PlatformBuildClassObject):
                         else:
                             # append options for the same tool except PATH
                             if Attr != 'PATH':
-                                BuildOptions[Tool][Attr] += " " + self.BuildOptions[Options]
+                                BuildOptions[Tool][Attr] = "{ORIG} {NEW}".format(ORIG=BuildOptions[Tool][Attr], NEW=self.BuildOptions[Options])
                             else:
                                 BuildOptions[Tool][Attr] = self.BuildOptions[Options]
         if BuildOptions:
@@ -2054,7 +2054,7 @@ class DscBuildData(PlatformBuildClassObject):
                         ValueList = Value.split()
                         if ValueList:
                             for Id, Item in enumerate(ValueList):
-                                if Item in ['-D', '/D', '-U', '/U']:
+                                if Item in {'-D', '/D', '-U', '/U'}:
                                     CC_FLAGS += ' ' + Item
                                     if Id + 1 < len(ValueList):
                                         CC_FLAGS += ' ' + ValueList[Id + 1]
diff --git a/BaseTools/Source/Python/Workspace/InfBuildData.py b/BaseTools/Source/Python/Workspace/InfBuildData.py
index 12d848b5fc41..3d9391039f4f 100644
--- a/BaseTools/Source/Python/Workspace/InfBuildData.py
+++ b/BaseTools/Source/Python/Workspace/InfBuildData.py
@@ -395,7 +395,7 @@ class InfBuildData(ModuleBuildClassObject):
                             self._BuildOptions[ToolChainFamily, ToolChain] = Value
                         else:
                             OptionString = self._BuildOptions[ToolChainFamily, ToolChain]
-                            self._BuildOptions[ToolChainFamily, ToolChain] = OptionString + " " + Value
+                            self._BuildOptions[ToolChainFamily, ToolChain] = "{OPTION} {VAL}".format(OPTION=OptionString, VAL=Value)
         # set _Header to non-None in order to avoid database re-querying
         self._Header_ = 'DUMMY'
 
@@ -863,7 +863,7 @@ class InfBuildData(ModuleBuildClassObject):
                 else:
                     # concatenate the option string if they're for the same tool
                     OptionString = self._BuildOptions[ToolChainFamily, ToolChain]
-                    self._BuildOptions[ToolChainFamily, ToolChain] = OptionString + " " + Option
+                    self._BuildOptions[ToolChainFamily, ToolChain] = "{OPTION} {OPT}".format(OPTION=OptionString, OPT=Option)
         return self._BuildOptions
 
     ## Retrieve dependency expression
diff --git a/BaseTools/Source/Python/Workspace/MetaDataTable.py b/BaseTools/Source/Python/Workspace/MetaDataTable.py
index e37a10c82f8f..83963008ef4c 100644
--- a/BaseTools/Source/Python/Workspace/MetaDataTable.py
+++ b/BaseTools/Source/Python/Workspace/MetaDataTable.py
@@ -22,7 +22,7 @@ from CommonDataClass.DataClass import FileClass
 
 ## Convert to SQL required string format
 def ConvertToSqlString(StringList):
-    return map(lambda s: "'" + s.replace("'", "''") + "'", StringList)
+    return map(lambda s: "'{mid}'".format(mid=s.replace("'", "''")), StringList)
 
 ## TableFile
 #
@@ -329,10 +329,6 @@ class TableDataModel(Table):
     #
     def GetCrossIndex(self, ModelName):
         CrossIndex = -1
-        SqlCommand = """select CrossIndex from DataModel where name = '""" + ModelName + """'"""
+        SqlCommand = "select CrossIndex from DataModel where name = '{NAME}'".format(NAME=ModelName)
         self.Cur.execute(SqlCommand)
-        for Item in self.Cur:
-            CrossIndex = Item[0]
-
-        return CrossIndex
-
+        return self.Cur.fetchall()[-1][0]
diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTools/Source/Python/Workspace/MetaFileParser.py
index 21b20bce4018..2c116ddbcb71 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py
@@ -1924,10 +1924,10 @@ class DecParser(MetaFileParser):
                     return
 
                 if self._include_flag:
-                    self._ValueList[1] = "<HeaderFiles>_" + md5.new(self._CurrentLine).hexdigest()
+                    self._ValueList[1] = "<HeaderFiles>_{MD5}".format(MD5=md5.new(self._CurrentLine).hexdigest())
                     self._ValueList[2] = self._CurrentLine
                 if self._package_flag and "}" != self._CurrentLine:
-                    self._ValueList[1] = "<Packages>_" + md5.new(self._CurrentLine).hexdigest()
+                    self._ValueList[1] = "<Packages>_{MD5}".format(MD5=md5.new(self._CurrentLine).hexdigest())
                     self._ValueList[2] = self._CurrentLine
                 if self._CurrentLine == "}":
                     self._package_flag = False
-- 
2.16.2.windows.1



^ permalink raw reply related	[flat|nested] 13+ messages in thread

* [PATCH v1 10/11] BaseTools: change to set for membership testing
  2018-05-14 18:09 [PATCH v1 00/11] BaseTools refactoring Jaben Carsey
                   ` (8 preceding siblings ...)
  2018-05-14 18:09 ` [PATCH v1 09/11] BaseTools: refactor to stop re-allocating strings Jaben Carsey
@ 2018-05-14 18:09 ` Jaben Carsey
  2018-05-14 18:09 ` [PATCH v1 11/11] BaseTools: remove extra assignment Jaben Carsey
  10 siblings, 0 replies; 13+ messages in thread
From: Jaben Carsey @ 2018-05-14 18:09 UTC (permalink / raw)
  To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu

When testing for membership in a list or tuple, use a set instead, since set lookups hash in O(1).
When order matters (for example, iterating in a for loop), keep a list or tuple.
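
A minimal illustration of the guideline (the module-type strings are examples only, not code from the patch):

```python
import timeit

# Membership testing: a set lookup hashes in O(1) on average, while a
# list or tuple is scanned in O(n). For the small fixed collections in
# this series the win is modest, but it costs nothing and reads the same.
NEEDLE = "DXE_DRIVER"
AS_LIST = ["BASE", "SEC", "PEI_CORE", "PEIM", "DXE_CORE", "DXE_DRIVER"]
AS_SET = set(AS_LIST)

list_time = timeit.timeit(lambda: NEEDLE in AS_LIST, number=100000)
set_time = timeit.timeit(lambda: NEEDLE in AS_SET, number=100000)

# Both spellings answer identically; only the lookup cost differs.
assert (NEEDLE in AS_LIST) and (NEEDLE in AS_SET)
```

One caveat: set elements must be hashable, so collections of lists or other mutable items cannot be converted this way.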

Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
 BaseTools/Source/Python/AutoGen/AutoGen.py                      |  49 +++----
 BaseTools/Source/Python/AutoGen/GenC.py                         |  68 ++++-----
 BaseTools/Source/Python/AutoGen/GenDepex.py                     |  27 ++--
 BaseTools/Source/Python/AutoGen/GenMake.py                      |   9 +-
 BaseTools/Source/Python/AutoGen/GenPcdDb.py                     |  52 +++----
 BaseTools/Source/Python/AutoGen/GenVar.py                       |   4 +-
 BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py      |   4 +-
 BaseTools/Source/Python/BPDG/BPDG.py                            |   2 +-
 BaseTools/Source/Python/BPDG/GenVpd.py                          |   6 +-
 BaseTools/Source/Python/Common/DataType.py                      |  36 +++--
 BaseTools/Source/Python/Common/Expression.py                    |   8 +-
 BaseTools/Source/Python/Common/Misc.py                          |  24 ++--
 BaseTools/Source/Python/Common/Parsing.py                       |   2 +-
 BaseTools/Source/Python/Common/RangeExpression.py               |  10 +-
 BaseTools/Source/Python/Common/TargetTxtClassObject.py          |   8 +-
 BaseTools/Source/Python/Ecc/Check.py                            | 121 ++--------------
 BaseTools/Source/Python/Ecc/MetaDataParser.py                   |   4 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py |  48 +++----
 BaseTools/Source/Python/Ecc/c.py                                |  27 ++--
 BaseTools/Source/Python/Eot/Parser.py                           |   4 +-
 BaseTools/Source/Python/Eot/Report.py                           |  13 +-
 BaseTools/Source/Python/Eot/c.py                                |   2 +-
 BaseTools/Source/Python/GenFds/DataSection.py                   |   2 +-
 BaseTools/Source/Python/GenFds/DepexSection.py                  |   5 +-
 BaseTools/Source/Python/GenFds/FdfParser.py                     | 152 ++++++++++----------
 BaseTools/Source/Python/GenFds/FfsInfStatement.py               |   4 +-
 BaseTools/Source/Python/GenFds/Fv.py                            |   4 +-
 BaseTools/Source/Python/GenFds/GenFds.py                        |   4 +-
 BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py          |  16 +--
 BaseTools/Source/Python/GenFds/GuidSection.py                   |  14 +-
 BaseTools/Source/Python/GenFds/OptRomInfStatement.py            |   4 +-
 BaseTools/Source/Python/GenFds/Region.py                        |   2 +-
 BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py          |  20 +--
 BaseTools/Source/Python/Trim/Trim.py                            |   7 +-
 BaseTools/Source/Python/Workspace/DecBuildData.py               |   2 +-
 BaseTools/Source/Python/Workspace/DscBuildData.py               |  86 +++++------
 BaseTools/Source/Python/Workspace/InfBuildData.py               |  26 ++--
 BaseTools/Source/Python/Workspace/MetaFileCommentParser.py      |   4 +-
 BaseTools/Source/Python/Workspace/MetaFileParser.py             |  66 ++++-----
 BaseTools/Source/Python/build/BuildReport.py                    |  34 ++---
 BaseTools/Source/Python/build/build.py                          |  32 ++---
 41 files changed, 462 insertions(+), 550 deletions(-)

diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 4ccb50a0a0af..dcad8b4f32f6 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -1540,7 +1540,7 @@ class PlatformAutoGen(AutoGen):
                 self._PlatformPcds[pcd] = self.Platform.Pcds[pcd]
 
         for item in self._PlatformPcds:
-            if self._PlatformPcds[item].DatumType and self._PlatformPcds[item].DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
+            if self._PlatformPcds[item].DatumType and self._PlatformPcds[item].DatumType not in TAB_PCD_NUMERIC_TYPES_VOID:
                 self._PlatformPcds[item].DatumType = TAB_VOID
 
         if (self.Workspace.ArchList[-1] == self.Arch): 
@@ -1549,12 +1549,12 @@ class PlatformAutoGen(AutoGen):
                 Sku = Pcd.SkuInfoList.values()[0]
                 Sku.VpdOffset = Sku.VpdOffset.strip()
 
-                if Pcd.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
+                if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES_VOID:
                     Pcd.DatumType = TAB_VOID
 
                     # if found PCD which datum value is unicode string the insert to left size of UnicodeIndex
                     # if found HII type PCD then insert to right of UnicodeIndex
-                if Pcd.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD]:
+                if Pcd.Type in {TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD}:
                     VpdPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)] = Pcd
 
             #Collect DynamicHii PCD values and assign it to DynamicExVpd PCD gEfiMdeModulePkgTokenSpaceGuid.PcdNvStoreDefaultValueBuffer
@@ -1576,7 +1576,7 @@ class PlatformAutoGen(AutoGen):
             VpdSkuMap = {}
             for PcdKey in PlatformPcds:
                 Pcd = self._PlatformPcds[PcdKey]
-                if Pcd.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD] and \
+                if Pcd.Type in {TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD} and \
                    PcdKey in VpdPcdDict:
                     Pcd = VpdPcdDict[PcdKey]
                     SkuValueMap = {}
@@ -1630,7 +1630,7 @@ class PlatformAutoGen(AutoGen):
             #            
             for DscPcd in PlatformPcds:
                 DscPcdEntry = self._PlatformPcds[DscPcd]
-                if DscPcdEntry.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD]:
+                if DscPcdEntry.Type in {TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD}:
                     if not (self.Platform.VpdToolGuid is None or self.Platform.VpdToolGuid == ''):
                         FoundFlag = False
                         for VpdPcd in VpdFile._VpdArray:
@@ -1748,7 +1748,7 @@ class PlatformAutoGen(AutoGen):
                 Sku = Pcd.SkuInfoList.values()[0]
                 Sku.VpdOffset = Sku.VpdOffset.strip()
 
-                if Pcd.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
+                if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES_VOID:
                     Pcd.DatumType = TAB_VOID
 
                 PcdValue = Sku.DefaultValue
@@ -1768,7 +1768,7 @@ class PlatformAutoGen(AutoGen):
         for pcd in self._DynamicPcdList:
             if len(pcd.SkuInfoList) == 1:
                 for (SkuName,SkuId) in allskuset:
-                    if type(SkuId) in (str,unicode) and eval(SkuId) == 0 or SkuId == 0:
+                    if type(SkuId) in {str,unicode} and eval(SkuId) == 0 or SkuId == 0:
                         continue
                     pcd.SkuInfoList[SkuName] = copy.deepcopy(pcd.SkuInfoList[TAB_DEFAULT])
                     pcd.SkuInfoList[SkuName].SkuId = SkuId
@@ -3086,13 +3086,10 @@ class ModuleAutoGen(AutoGen):
     #   @retval     list    The list of package object
     #
     def _GetDerivedPackageList(self):
-        PackageList = []
+        PackageSet = set()
         for M in [self.Module] + self.DependentLibraryList:
-            for Package in M.Packages:
-                if Package in PackageList:
-                    continue
-                PackageList.append(Package)
-        return PackageList
+            PackageSet = PackageSet.union(M.Packages)
+        return list(PackageSet)
     
     ## Get the depex string
     #
@@ -3120,7 +3117,7 @@ class ModuleAutoGen(AutoGen):
                     else:
                         if Arch.upper() == TAB_ARCH_COMMON or \
                           (Arch.upper() == self.Arch.upper() and \
-                          ModuleType.upper() in [TAB_ARCH_COMMON, self.ModuleType.upper()]):
+                          ModuleType.upper() in {TAB_ARCH_COMMON, self.ModuleType.upper()}):
                             DepexList.append({(Arch, ModuleType): DepexExpr})
         
         #the type of build module is USER_DEFINED.
@@ -3279,9 +3276,9 @@ class ModuleAutoGen(AutoGen):
             # Regular expression for finding Include Directories, the difference between MSFT and INTEL/GCC/RVCT
             # is the former use /I , the Latter used -I to specify include directories
             #
-            if self.PlatformInfo.ToolChainFamily in ('MSFT'):
+            if self.PlatformInfo.ToolChainFamily == 'MSFT':
                 BuildOptIncludeRegEx = gBuildOptIncludePatternMsft
-            elif self.PlatformInfo.ToolChainFamily in ('INTEL', 'GCC', 'RVCT'):
+            elif self.PlatformInfo.ToolChainFamily in {'INTEL', 'GCC', 'RVCT'}:
                 BuildOptIncludeRegEx = gBuildOptIncludePatternOther
             else:
                 #
@@ -3291,7 +3288,7 @@ class ModuleAutoGen(AutoGen):
                 return self._BuildOptionIncPathList
             
             BuildOptionIncPathList = []
-            for Tool in ('CC', 'PP', 'VFRPP', 'ASLPP', 'ASLCC', 'APP', 'ASM'):
+            for Tool in ['CC', 'PP', 'VFRPP', 'ASLPP', 'ASLCC', 'APP', 'ASM']:
                 Attr = 'FLAGS'
                 try:
                     FlagOption = self.BuildOption[Tool][Attr]
@@ -3339,12 +3336,12 @@ class ModuleAutoGen(AutoGen):
             self._SourceFileList = []
             for F in self.Module.Sources:
                 # match tool chain
-                if F.TagName not in ("", "*", self.ToolChain):
+                if F.TagName not in {"", "*", self.ToolChain}:
                     EdkLogger.debug(EdkLogger.DEBUG_9, "The toolchain [%s] for processing file [%s] is found, "
                                     "but [%s] is needed" % (F.TagName, str(F), self.ToolChain))
                     continue
                 # match tool chain family or build rule family
-                if F.ToolChainFamily not in ("", "*", self.ToolChainFamily, self.BuildRuleFamily):
+                if F.ToolChainFamily not in {"", "*", self.ToolChainFamily, self.BuildRuleFamily}:
                     EdkLogger.debug(
                                 EdkLogger.DEBUG_0,
                                 "The file [%s] must be built by tools of [%s], " \
@@ -3423,7 +3420,7 @@ class ModuleAutoGen(AutoGen):
         if self._BinaryFileList is None:
             self._BinaryFileList = []
             for F in self.Module.Binaries:
-                if F.Target not in [TAB_ARCH_COMMON, '*'] and F.Target != self.BuildTarget:
+                if F.Target not in {TAB_ARCH_COMMON, '*'} and F.Target != self.BuildTarget:
                     continue
                 self._BinaryFileList.append(F)
                 self._ApplyBuildRule(F, F.Type)
@@ -4049,11 +4046,11 @@ class ModuleAutoGen(AutoGen):
                 AsBuiltInfDict['binary_item'] += ['BIN|' + File]
         if self.DepexGenerated:
             self.OutputFile.add(self.Name + '.depex')
-            if self.ModuleType in [SUP_MODULE_PEIM]:
+            if self.ModuleType == SUP_MODULE_PEIM:
                 AsBuiltInfDict['binary_item'] += ['PEI_DEPEX|' + self.Name + '.depex']
-            if self.ModuleType in [SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_UEFI_DRIVER]:
+            if self.ModuleType in {SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_UEFI_DRIVER}:
                 AsBuiltInfDict['binary_item'] += ['DXE_DEPEX|' + self.Name + '.depex']
-            if self.ModuleType in [SUP_MODULE_DXE_SMM_DRIVER]:
+            if self.ModuleType == SUP_MODULE_DXE_SMM_DRIVER:
                 AsBuiltInfDict['binary_item'] += ['SMM_DEPEX|' + self.Name + '.depex']
 
         Bin = self._GenOffsetBin()
@@ -4107,11 +4104,11 @@ class ModuleAutoGen(AutoGen):
                 else:
                     continue
                 PcdValue = ''
-                if Pcd.DatumType == 'BOOLEAN':
+                if Pcd.DatumType == TAB_BOOLEAN:
                     BoolValue = Pcd.DefaultValue.upper()
-                    if BoolValue == 'TRUE':
+                    if BoolValue == TAB_TRUE_1:
                         Pcd.DefaultValue = '1'
-                    elif BoolValue == 'FALSE':
+                    elif BoolValue == TAB_FALSE_1:
                         Pcd.DefaultValue = '0'
 
                 if Pcd.DatumType in TAB_PCD_NUMERIC_TYPES:
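
The `_GetDerivedPackageList` hunk above also trades an O(n^2) dedup loop for set union. A hedged sketch of the pattern (the `FakeModule` class and package names are invented for illustration):

```python
# Collect unique packages across a module and its libraries with
# set.union instead of an "if not in list: append" scan. Converting
# back with list() discards discovery order, which is acceptable only
# when callers treat the result as an unordered bag.
class FakeModule:
    def __init__(self, packages):
        self.Packages = packages

mod = FakeModule(["MdePkg", "MdeModulePkg"])
libs = [FakeModule(["MdePkg"]), FakeModule(["NetworkPkg"])]

packages = set()
for m in [mod] + libs:
    packages = packages.union(m.Packages)  # duplicates collapse for free

assert packages == {"MdePkg", "MdeModulePkg", "NetworkPkg"}
```

If ordering ever mattered here, an `OrderedDict.fromkeys()` pass would deduplicate while preserving first-seen order.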
diff --git a/BaseTools/Source/Python/AutoGen/GenC.py b/BaseTools/Source/Python/AutoGen/GenC.py
index e73d83395255..60066e47bbce 100644
--- a/BaseTools/Source/Python/AutoGen/GenC.py
+++ b/BaseTools/Source/Python/AutoGen/GenC.py
@@ -43,9 +43,9 @@ gItemTypeStringDatabase  = {
 
 
 ## Datum size
-gDatumSizeStringDatabase = {TAB_UINT8:'8',TAB_UINT16:'16',TAB_UINT32:'32',TAB_UINT64:'64','BOOLEAN':'BOOLEAN',TAB_VOID:'8'}
-gDatumSizeStringDatabaseH = {TAB_UINT8:'8',TAB_UINT16:'16',TAB_UINT32:'32',TAB_UINT64:'64','BOOLEAN':'BOOL',TAB_VOID:'PTR'}
-gDatumSizeStringDatabaseLib = {TAB_UINT8:'8',TAB_UINT16:'16',TAB_UINT32:'32',TAB_UINT64:'64','BOOLEAN':'Bool',TAB_VOID:'Ptr'}
+gDatumSizeStringDatabase = {TAB_UINT8:'8',TAB_UINT16:'16',TAB_UINT32:'32',TAB_UINT64:'64',TAB_BOOLEAN:TAB_BOOLEAN,TAB_VOID:'8'}
+gDatumSizeStringDatabaseH = {TAB_UINT8:'8',TAB_UINT16:'16',TAB_UINT32:'32',TAB_UINT64:'64',TAB_BOOLEAN:'BOOL',TAB_VOID:'PTR'}
+gDatumSizeStringDatabaseLib = {TAB_UINT8:'8',TAB_UINT16:'16',TAB_UINT32:'32',TAB_UINT64:'64',TAB_BOOLEAN:'Bool',TAB_VOID:'Ptr'}
 
 ## AutoGen File Header Templates
 gAutoGenHeaderString = TemplateString("""\
@@ -996,11 +996,11 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
         Unicode = False
         ValueNumber = 0
 
-        if Pcd.DatumType == 'BOOLEAN':
+        if Pcd.DatumType == TAB_BOOLEAN:
             BoolValue = Value.upper()
-            if BoolValue == 'TRUE' or BoolValue == '1':
+            if BoolValue == TAB_TRUE_1 or BoolValue == '1':
                 Value = '1U'
-            elif BoolValue == 'FALSE' or BoolValue == '0':
+            elif BoolValue == TAB_FALSE_1 or BoolValue == '0':
                 Value = '0U'
 
         if Pcd.DatumType in TAB_PCD_CLEAN_NUMERIC_TYPES:
@@ -1367,17 +1367,17 @@ def CreateLibraryConstructorCode(Info, AutoGenC, AutoGenH):
         if len(Lib.ConstructorList) <= 0:
             continue
         Dict = {'Function':Lib.ConstructorList}
-        if Lib.ModuleType in [SUP_MODULE_BASE, SUP_MODULE_SEC]:
+        if Lib.ModuleType in {SUP_MODULE_BASE, SUP_MODULE_SEC}:
             ConstructorPrototypeString.Append(gLibraryStructorPrototype[SUP_MODULE_BASE].Replace(Dict))
             ConstructorCallingString.Append(gLibraryStructorCall[SUP_MODULE_BASE].Replace(Dict))
         elif Lib.ModuleType in SUP_MODULE_SET_PEI:
             ConstructorPrototypeString.Append(gLibraryStructorPrototype['PEI'].Replace(Dict))
             ConstructorCallingString.Append(gLibraryStructorCall['PEI'].Replace(Dict))
-        elif Lib.ModuleType in [SUP_MODULE_DXE_CORE,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SMM_DRIVER,SUP_MODULE_DXE_RUNTIME_DRIVER,
-                                SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER,SUP_MODULE_UEFI_APPLICATION,SUP_MODULE_SMM_CORE]:
+        elif Lib.ModuleType in {SUP_MODULE_DXE_CORE,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SMM_DRIVER,SUP_MODULE_DXE_RUNTIME_DRIVER,
+                                SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER,SUP_MODULE_UEFI_APPLICATION,SUP_MODULE_SMM_CORE}:
             ConstructorPrototypeString.Append(gLibraryStructorPrototype['DXE'].Replace(Dict))
             ConstructorCallingString.Append(gLibraryStructorCall['DXE'].Replace(Dict))
-        elif Lib.ModuleType in [SUP_MODULE_MM_STANDALONE,SUP_MODULE_MM_CORE_STANDALONE]:
+        elif Lib.ModuleType in {SUP_MODULE_MM_STANDALONE,SUP_MODULE_MM_CORE_STANDALONE}:
             ConstructorPrototypeString.Append(gLibraryStructorPrototype['MM'].Replace(Dict))
             ConstructorCallingString.Append(gLibraryStructorCall['MM'].Replace(Dict))
 
@@ -1398,14 +1398,14 @@ def CreateLibraryConstructorCode(Info, AutoGenC, AutoGenH):
     if Info.IsLibrary:
         AutoGenH.Append("${BEGIN}${FunctionPrototype}${END}", Dict)
     else:
-        if Info.ModuleType in [SUP_MODULE_BASE, SUP_MODULE_SEC]:
+        if Info.ModuleType in {SUP_MODULE_BASE, SUP_MODULE_SEC}:
             AutoGenC.Append(gLibraryString[SUP_MODULE_BASE].Replace(Dict))
         elif Info.ModuleType in SUP_MODULE_SET_PEI:
             AutoGenC.Append(gLibraryString['PEI'].Replace(Dict))
-        elif Info.ModuleType in [SUP_MODULE_DXE_CORE,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SMM_DRIVER,SUP_MODULE_DXE_RUNTIME_DRIVER,
-                                 SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER,SUP_MODULE_UEFI_APPLICATION,SUP_MODULE_SMM_CORE]:
+        elif Info.ModuleType in {SUP_MODULE_DXE_CORE,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SMM_DRIVER,SUP_MODULE_DXE_RUNTIME_DRIVER,
+                                 SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER,SUP_MODULE_UEFI_APPLICATION,SUP_MODULE_SMM_CORE}:
             AutoGenC.Append(gLibraryString['DXE'].Replace(Dict))
-        elif Info.ModuleType in [SUP_MODULE_MM_STANDALONE,SUP_MODULE_MM_CORE_STANDALONE]:
+        elif Info.ModuleType in {SUP_MODULE_MM_STANDALONE,SUP_MODULE_MM_CORE_STANDALONE}:
             AutoGenC.Append(gLibraryString['MM'].Replace(Dict))
 
 ## Create code for library destructor
@@ -1429,17 +1429,17 @@ def CreateLibraryDestructorCode(Info, AutoGenC, AutoGenH):
         if len(Lib.DestructorList) <= 0:
             continue
         Dict = {'Function':Lib.DestructorList}
-        if Lib.ModuleType in [SUP_MODULE_BASE, SUP_MODULE_SEC]:
+        if Lib.ModuleType in {SUP_MODULE_BASE, SUP_MODULE_SEC}:
             DestructorPrototypeString.Append(gLibraryStructorPrototype[SUP_MODULE_BASE].Replace(Dict))
             DestructorCallingString.Append(gLibraryStructorCall[SUP_MODULE_BASE].Replace(Dict))
         elif Lib.ModuleType in SUP_MODULE_SET_PEI:
             DestructorPrototypeString.Append(gLibraryStructorPrototype['PEI'].Replace(Dict))
             DestructorCallingString.Append(gLibraryStructorCall['PEI'].Replace(Dict))
-        elif Lib.ModuleType in [SUP_MODULE_DXE_CORE,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SMM_DRIVER,SUP_MODULE_DXE_RUNTIME_DRIVER,
-                                SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER,SUP_MODULE_UEFI_APPLICATION, SUP_MODULE_SMM_CORE]:
+        elif Lib.ModuleType in {SUP_MODULE_DXE_CORE,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SMM_DRIVER,SUP_MODULE_DXE_RUNTIME_DRIVER,
+                                SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER,SUP_MODULE_UEFI_APPLICATION, SUP_MODULE_SMM_CORE}:
             DestructorPrototypeString.Append(gLibraryStructorPrototype['DXE'].Replace(Dict))
             DestructorCallingString.Append(gLibraryStructorCall['DXE'].Replace(Dict))
-        elif Lib.ModuleType in [SUP_MODULE_MM_STANDALONE,SUP_MODULE_MM_CORE_STANDALONE]:
+        elif Lib.ModuleType in {SUP_MODULE_MM_STANDALONE,SUP_MODULE_MM_CORE_STANDALONE}:
             DestructorPrototypeString.Append(gLibraryStructorPrototype['MM'].Replace(Dict))
             DestructorCallingString.Append(gLibraryStructorCall['MM'].Replace(Dict))
 
@@ -1460,14 +1460,14 @@ def CreateLibraryDestructorCode(Info, AutoGenC, AutoGenH):
     if Info.IsLibrary:
         AutoGenH.Append("${BEGIN}${FunctionPrototype}${END}", Dict)
     else:
-        if Info.ModuleType in [SUP_MODULE_BASE, SUP_MODULE_SEC]:
+        if Info.ModuleType in {SUP_MODULE_BASE, SUP_MODULE_SEC}:
             AutoGenC.Append(gLibraryString[SUP_MODULE_BASE].Replace(Dict))
         elif Info.ModuleType in SUP_MODULE_SET_PEI:
             AutoGenC.Append(gLibraryString['PEI'].Replace(Dict))
-        elif Info.ModuleType in [SUP_MODULE_DXE_CORE,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SMM_DRIVER,SUP_MODULE_DXE_RUNTIME_DRIVER,
-                                 SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER,SUP_MODULE_UEFI_APPLICATION,SUP_MODULE_SMM_CORE]:
+        elif Info.ModuleType in {SUP_MODULE_DXE_CORE,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SMM_DRIVER,SUP_MODULE_DXE_RUNTIME_DRIVER,
+                                 SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER,SUP_MODULE_UEFI_APPLICATION,SUP_MODULE_SMM_CORE}:
             AutoGenC.Append(gLibraryString['DXE'].Replace(Dict))
-        elif Info.ModuleType in [SUP_MODULE_MM_STANDALONE,SUP_MODULE_MM_CORE_STANDALONE]:
+        elif Info.ModuleType in {SUP_MODULE_MM_STANDALONE,SUP_MODULE_MM_CORE_STANDALONE}:
             AutoGenC.Append(gLibraryString['MM'].Replace(Dict))
 
 
@@ -1478,7 +1478,7 @@ def CreateLibraryDestructorCode(Info, AutoGenC, AutoGenH):
 #   @param      AutoGenH    The TemplateString object for header file
 #
 def CreateModuleEntryPointCode(Info, AutoGenC, AutoGenH):
-    if Info.IsLibrary or Info.ModuleType in [SUP_MODULE_USER_DEFINED, SUP_MODULE_SEC]:
+    if Info.IsLibrary or Info.ModuleType in {SUP_MODULE_USER_DEFINED, SUP_MODULE_SEC}:
         return
     #
     # Module Entry Points
@@ -1498,7 +1498,7 @@ def CreateModuleEntryPointCode(Info, AutoGenC, AutoGenH):
         'UefiSpecVersion':   UefiSpecVersion + 'U'
     }
 
-    if Info.ModuleType in [SUP_MODULE_PEI_CORE, SUP_MODULE_DXE_CORE, SUP_MODULE_SMM_CORE, SUP_MODULE_MM_CORE_STANDALONE]:
+    if Info.ModuleType in {SUP_MODULE_PEI_CORE, SUP_MODULE_DXE_CORE, SUP_MODULE_SMM_CORE, SUP_MODULE_MM_CORE_STANDALONE}:
         if Info.SourceFileList:
           if NumEntryPoints != 1:
               EdkLogger.error(
@@ -1526,7 +1526,7 @@ def CreateModuleEntryPointCode(Info, AutoGenC, AutoGenH):
         else:
             AutoGenC.Append(gPeimEntryPointString[2].Replace(Dict))
         AutoGenH.Append(gPeimEntryPointPrototype.Replace(Dict))
-    elif Info.ModuleType in [SUP_MODULE_DXE_RUNTIME_DRIVER,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER]:
+    elif Info.ModuleType in {SUP_MODULE_DXE_RUNTIME_DRIVER,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER}:
         if NumEntryPoints < 2:
             AutoGenC.Append(gUefiDriverEntryPointString[NumEntryPoints].Replace(Dict))
         else:
@@ -1558,7 +1558,7 @@ def CreateModuleEntryPointCode(Info, AutoGenC, AutoGenH):
 #   @param      AutoGenH    The TemplateString object for header file
 #
 def CreateModuleUnloadImageCode(Info, AutoGenC, AutoGenH):
-    if Info.IsLibrary or Info.ModuleType in [SUP_MODULE_USER_DEFINED, SUP_MODULE_SEC]:
+    if Info.IsLibrary or Info.ModuleType in {SUP_MODULE_USER_DEFINED, SUP_MODULE_SEC}:
         return
     #
     # Unload Image Handlers
@@ -1578,7 +1578,7 @@ def CreateModuleUnloadImageCode(Info, AutoGenC, AutoGenH):
 #   @param      AutoGenH    The TemplateString object for header file
 #
 def CreateGuidDefinitionCode(Info, AutoGenC, AutoGenH):
-    if Info.ModuleType in [SUP_MODULE_USER_DEFINED, SUP_MODULE_BASE]:
+    if Info.ModuleType in {SUP_MODULE_USER_DEFINED, SUP_MODULE_BASE}:
         GuidType = TAB_GUID
     else:
         GuidType = "EFI_GUID"
@@ -1602,7 +1602,7 @@ def CreateGuidDefinitionCode(Info, AutoGenC, AutoGenH):
 #   @param      AutoGenH    The TemplateString object for header file
 #
 def CreateProtocolDefinitionCode(Info, AutoGenC, AutoGenH):
-    if Info.ModuleType in [SUP_MODULE_USER_DEFINED, SUP_MODULE_BASE]:
+    if Info.ModuleType in {SUP_MODULE_USER_DEFINED, SUP_MODULE_BASE}:
         GuidType = TAB_GUID
     else:
         GuidType = "EFI_GUID"
@@ -1626,7 +1626,7 @@ def CreateProtocolDefinitionCode(Info, AutoGenC, AutoGenH):
 #   @param      AutoGenH    The TemplateString object for header file
 #
 def CreatePpiDefinitionCode(Info, AutoGenC, AutoGenH):
-    if Info.ModuleType in [SUP_MODULE_USER_DEFINED, SUP_MODULE_BASE]:
+    if Info.ModuleType in {SUP_MODULE_USER_DEFINED, SUP_MODULE_BASE}:
         GuidType = TAB_GUID
     else:
         GuidType = "EFI_GUID"
@@ -1663,7 +1663,7 @@ def CreatePcdCode(Info, AutoGenC, AutoGenH):
     # Add extern declarations to AutoGen.h if one or more Token Space GUIDs were found
     if TokenSpaceList:
         AutoGenH.Append("\n// Definition of PCD Token Space GUIDs used in this module\n\n")
-        if Info.ModuleType in [SUP_MODULE_USER_DEFINED, SUP_MODULE_BASE]:
+        if Info.ModuleType in {SUP_MODULE_USER_DEFINED, SUP_MODULE_BASE}:
             GuidType = TAB_GUID
         else:
             GuidType = "EFI_GUID"              
@@ -1782,7 +1782,7 @@ def CreateIdfFileCode(Info, AutoGenC, StringH, IdfGenCFlag, IdfGenBinBuffer):
                     for FileObj in ImageFiles.ImageFilesDict[Idf]:
                         for sourcefile in Info.SourceFileList:
                             if FileObj.FileName == sourcefile.File:
-                                if not sourcefile.Ext.upper() in ['.PNG', '.BMP', '.JPG']:
+                                if not sourcefile.Ext.upper() in {'.PNG', '.BMP', '.JPG'}:
                                     EdkLogger.error("build", AUTOGEN_ERROR, "The %s's postfix must be one of .bmp, .jpg, .png" % (FileObj.FileName), ExtraData="[%s]" % str(Info))
                                 FileObj.File = sourcefile
                                 break
@@ -2107,11 +2107,11 @@ def CreateCode(Info, AutoGenC, AutoGenH, StringH, UniGenCFlag, UniGenBinBuffer,
                 if Pcd.Type == TAB_PCDS_FIXED_AT_BUILD:
                     TokenCName = Pcd.TokenCName
                     Value = Pcd.DefaultValue
-                    if Pcd.DatumType == 'BOOLEAN':
+                    if Pcd.DatumType == TAB_BOOLEAN:
                         BoolValue = Value.upper()
-                        if BoolValue == 'TRUE':
+                        if BoolValue == TAB_TRUE_1:
                             Value = '1'
-                        elif BoolValue == 'FALSE':
+                        elif BoolValue == TAB_FALSE_1:
                             Value = '0'
                     for PcdItem in GlobalData.MixedPcd:
                         if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]:
diff --git a/BaseTools/Source/Python/AutoGen/GenDepex.py b/BaseTools/Source/Python/AutoGen/GenDepex.py
index 873ed6e59300..d4730dd227df 100644
--- a/BaseTools/Source/Python/AutoGen/GenDepex.py
+++ b/BaseTools/Source/Python/AutoGen/GenDepex.py
@@ -114,18 +114,15 @@ class DependencyExpression:
     }
 
     # all supported op codes and operands
-    SupportedOpcode = [DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER, DEPEX_OPCODE_PUSH, DEPEX_OPCODE_AND, DEPEX_OPCODE_OR, DEPEX_OPCODE_NOT, DEPEX_OPCODE_END, DEPEX_OPCODE_SOR]
-    SupportedOperand = [DEPEX_OPCODE_TRUE, DEPEX_OPCODE_FALSE]
-
-    OpcodeWithSingleOperand = [DEPEX_OPCODE_NOT, DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER]
-    OpcodeWithTwoOperand = [DEPEX_OPCODE_AND, DEPEX_OPCODE_OR]
+    SupportedOpcode = {DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER, DEPEX_OPCODE_PUSH, DEPEX_OPCODE_AND, DEPEX_OPCODE_OR, DEPEX_OPCODE_NOT, DEPEX_OPCODE_END, DEPEX_OPCODE_SOR}
+    SupportedOperand = {DEPEX_OPCODE_TRUE, DEPEX_OPCODE_FALSE}
 
     # op code that should not be the last one
-    NonEndingOpcode = [DEPEX_OPCODE_AND, DEPEX_OPCODE_OR, DEPEX_OPCODE_NOT, DEPEX_OPCODE_SOR]
+    NonEndingOpcode = {DEPEX_OPCODE_AND, DEPEX_OPCODE_OR, DEPEX_OPCODE_NOT, DEPEX_OPCODE_SOR}
     # op code must not present at the same time
-    ExclusiveOpcode = [DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER]
+    ExclusiveOpcode = {DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER}
     # op code that should be the first one if it presents
-    AboveAllOpcode = [DEPEX_OPCODE_SOR, DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER]
+    AboveAllOpcode = {DEPEX_OPCODE_SOR, DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER}
 
     #
     # open and close brace must be taken as individual tokens
@@ -177,7 +174,7 @@ class DependencyExpression:
         LastToken = ''
         for Token in self.TokenList:
             if Token == "(":
-                if LastToken not in self.SupportedOpcode + ['(', '', None]:
+                if LastToken not in self.SupportedOpcode.union(['(', '', None]):
                     EdkLogger.error("GenDepex", PARSER_ERROR, "Invalid dependency expression: missing operator before open parentheses",
                                     ExtraData="Near %s" % LastToken)
                 Stack.append(Token)
@@ -185,7 +182,7 @@ class DependencyExpression:
                 if '(' not in Stack:
                     EdkLogger.error("GenDepex", PARSER_ERROR, "Invalid dependency expression: mismatched parentheses",
                                     ExtraData=str(self))
-                elif LastToken in self.SupportedOpcode + ['', None]:
+                elif LastToken in self.SupportedOpcode.union(['', None]):
                     EdkLogger.error("GenDepex", PARSER_ERROR, "Invalid dependency expression: missing operand before close parentheses",
                                     ExtraData="Near %s" % LastToken)
                 while len(Stack) > 0:
@@ -195,10 +192,10 @@ class DependencyExpression:
                     self.PostfixNotation.append(Stack.pop())
             elif Token in self.OpcodePriority:
                 if Token == DEPEX_OPCODE_NOT:
-                    if LastToken not in self.SupportedOpcode + ['(', '', None]:
+                    if LastToken not in self.SupportedOpcode.union(['(', '', None]):
                         EdkLogger.error("GenDepex", PARSER_ERROR, "Invalid dependency expression: missing operator before NOT",
                                         ExtraData="Near %s" % LastToken)
-                elif LastToken in self.SupportedOpcode + ['(', '', None]:
+                elif LastToken in self.SupportedOpcode.union(['(', '', None]):
                         EdkLogger.error("GenDepex", PARSER_ERROR, "Invalid dependency expression: missing operand before " + Token,
                                         ExtraData="Near %s" % LastToken)
 
@@ -211,7 +208,7 @@ class DependencyExpression:
             else:
                 if Token not in self.SupportedOpcode:
                     # not OP, take it as GUID
-                    if LastToken not in self.SupportedOpcode + ['(', '', None]:
+                    if LastToken not in self.SupportedOpcode.union(['(', '', None]):
                         EdkLogger.error("GenDepex", PARSER_ERROR, "Invalid dependency expression: missing operator before %s" % Token,
                                         ExtraData="Near %s" % LastToken)
                     if len(self.OpcodeList) == 0 or self.OpcodeList[-1] not in self.ExclusiveOpcode:
@@ -274,7 +271,7 @@ class DependencyExpression:
           return
         Op = OpcodeSet.pop()
         #if Op isn't either OR or AND, return
-        if Op not in [DEPEX_OPCODE_AND, DEPEX_OPCODE_OR]:
+        if Op not in {DEPEX_OPCODE_AND, DEPEX_OPCODE_OR}:
             return
         NewOperand = []
         AllOperand = set()
@@ -302,7 +299,7 @@ class DependencyExpression:
             return
 
         # don't generate depex if all operands are architecture protocols
-        if self.ModuleType in [SUP_MODULE_UEFI_DRIVER, SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_DXE_SMM_DRIVER, SUP_MODULE_MM_STANDALONE] and \
+        if self.ModuleType in {SUP_MODULE_UEFI_DRIVER, SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_DXE_SMM_DRIVER, SUP_MODULE_MM_STANDALONE} and \
            Op == DEPEX_OPCODE_AND and \
            self.ArchProtocols == set(GuidStructureStringToGuidString(Guid) for Guid in AllOperand):
             self.PostfixNotation = []
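
One pattern in the GenDepex hunks above: the class-level opcode collections become sets, so combining them with extra sentinels uses `set.union` rather than list concatenation. A sketch under that assumption (opcode names mirror the patch but are redefined locally):

```python
# Opcode collections as sets: membership checks hash, and ad-hoc
# combinations use union() instead of rebuilding a list with `+`.
SUPPORTED_OPCODE = {"BEFORE", "AFTER", "PUSH", "AND", "OR", "NOT", "END", "SOR"}

# Tokens that may legally precede an open parenthesis: any opcode,
# another "(", or the start-of-expression sentinels "" and None.
ALLOWED_BEFORE_PAREN = SUPPORTED_OPCODE.union(["(", "", None])

def check_open_paren(last_token):
    """Return True if last_token may precede '(' in a depex expression."""
    return last_token in ALLOWED_BEFORE_PAREN

assert check_open_paren("AND")
assert check_open_paren(None)
assert not check_open_paren("SomeGuid")
```

Hoisting the union into a constant, as done here, avoids the per-token `SupportedOpcode.union([...])` call that the patch performs inside its parsing loop; a `frozenset` would make the intent (immutable class constant) even clearer.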
diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
index 4ae977ccd400..12e871a8b8d3 100644
--- a/BaseTools/Source/Python/AutoGen/GenMake.py
+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
@@ -23,6 +23,7 @@ from Common.MultipleWorkspace import MultipleWorkspace as mws
 from Common.BuildToolError import *
 from Common.Misc import *
 from Common.String import *
+from Common.DataType import *
 from BuildEngine import *
 import Common.GlobalData as GlobalData
 from collections import OrderedDict
@@ -499,7 +500,7 @@ cleanlib:
 
         PCI_COMPRESS_Flag = False
         for k, v in self._AutoGenObject.Module.Defines.iteritems():
-            if 'PCI_COMPRESS' == k and 'TRUE' == v:
+            if 'PCI_COMPRESS' == k and TAB_TRUE_1 == v:
                 PCI_COMPRESS_Flag = True
 
         # tools definitions
@@ -900,7 +901,7 @@ cleanlib:
             self._AutoGenObject.AutoGenDepSet |= set(self.FileDependency[File])
 
             # skip non-C files
-            if File.Ext not in [".c", ".C"] or File.Name == "AutoGen.c":
+            if File.Ext not in {".c", ".C"} or File.Name == "AutoGen.c":
                 continue
             elif DepSet is None:
                 DepSet = set(self.FileDependency[File])
@@ -917,7 +918,7 @@ cleanlib:
 
         for File in self.FileDependency:
             # skip non-C files
-            if File.Ext not in [".c", ".C"] or File.Name == "AutoGen.c":
+            if File.Ext not in {".c", ".C"} or File.Name == "AutoGen.c":
                 continue
             NewDepSet = set(self.FileDependency[File])
             NewDepSet -= DepSet
@@ -958,7 +959,7 @@ cleanlib:
                 # Use file list macro as dependency
                 if T.GenFileListMacro:
                     Deps.append("$(%s)" % T.FileListMacro)
-                    if Type in [TAB_OBJECT_FILE, TAB_STATIC_LIBRARY]:
+                    if Type in {TAB_OBJECT_FILE, TAB_STATIC_LIBRARY}:
                         Deps.append("$(%s)" % T.ListFileMacro)
 
                 TargetDict = {
diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index d2d42fe9d08e..48b34e6f87e5 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -293,7 +293,7 @@ class DbItemList:
 
         Buffer = ''
         for Datas in self.RawDataList:
-            if type(Datas) in (list, tuple):
+            if type(Datas) in {list, tuple}:
                 for Data in Datas:
                     if PackStr:
                         Buffer += pack(PackStr, GetIntegerValue(Data))
@@ -368,7 +368,7 @@ class DbComItemList (DbItemList):
         Buffer = ''
         for DataList in self.RawDataList:
             for Data in DataList:
-                if type(Data) in (list, tuple):
+                if type(Data) in {list, tuple}:
                     for SingleData in Data:
                         Buffer += pack(PackStr, GetIntegerValue(SingleData))
                 else:
@@ -414,7 +414,7 @@ class DbStringHeadTableItemList(DbItemList):
                 Offset += len(self.RawDataList[ItemIndex])
         else:
             for innerIndex in range(Index):
-                if type(self.RawDataList[innerIndex]) in (list, tuple):
+                if type(self.RawDataList[innerIndex]) in {list, tuple}:
                     Offset += len(self.RawDataList[innerIndex]) * self.ItemSize
                 else:
                     Offset += self.ItemSize
@@ -431,7 +431,7 @@ class DbStringHeadTableItemList(DbItemList):
             self.ListSize = self.GetInterOffset(len(self.RawDataList) - 1) + len(self.RawDataList[len(self.RawDataList)-1])
         else:
             for Datas in self.RawDataList:
-                if type(Datas) in (list, tuple):
+                if type(Datas) in {list, tuple}:
                     self.ListSize += len(Datas) * self.ItemSize
                 else:
                     self.ListSize += self.ItemSize
@@ -783,7 +783,7 @@ def BuildExDataBase(Dict):
     Pad = 0xDA
     
     UninitDataBaseSize  = 0
-    for Item in (DbUnInitValueUint64, DbUnInitValueUint32, DbUnInitValueUint16, DbUnInitValueUint8, DbUnInitValueBoolean):
+    for Item in {DbUnInitValueUint64, DbUnInitValueUint32, DbUnInitValueUint16, DbUnInitValueUint8, DbUnInitValueBoolean}:
         UninitDataBaseSize += Item.GetListSize()
     
     if (DbTotalLength - UninitDataBaseSize) % 8:
@@ -1001,11 +1001,11 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
         'EX_TOKEN_NUMBER'               : '0U',
         'SIZE_TABLE_SIZE'               : '2U',
         'SKU_HEAD_SIZE'                 : '1U',
-        'GUID_TABLE_EMPTY'              : 'TRUE',
-        'STRING_TABLE_EMPTY'            : 'TRUE',
-        'SKUID_TABLE_EMPTY'             : 'TRUE',
-        'DATABASE_EMPTY'                : 'TRUE',
-        'EXMAP_TABLE_EMPTY'             : 'TRUE',
+        'GUID_TABLE_EMPTY'              : TAB_TRUE_1,
+        'STRING_TABLE_EMPTY'            : TAB_TRUE_1,
+        'SKUID_TABLE_EMPTY'             : TAB_TRUE_1,
+        'DATABASE_EMPTY'                : TAB_TRUE_1,
+        'EXMAP_TABLE_EMPTY'             : TAB_TRUE_1,
         'PCD_DATABASE_UNINIT_EMPTY'     : '  UINT8  dummy; /* PCD_DATABASE_UNINIT is emptry */',
         'SYSTEM_SKU_ID'                 : '  SKU_ID             SystemSkuId;',
         'SYSTEM_SKU_ID_VALUE'           : '0U'
@@ -1022,14 +1022,14 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
         Dict['VARDEF_SKUID_' + DatumType] = []
         Dict['VARDEF_VALUE_' + DatumType] = []
         Dict['VARDEF_DB_VALUE_' + DatumType] = []
-        for Init in ['INIT','UNINIT']:
+        for Init in {'INIT','UNINIT'}:
             Dict[Init+'_CNAME_DECL_' + DatumType]   = []
             Dict[Init+'_GUID_DECL_' + DatumType]    = []
             Dict[Init+'_NUMSKUS_DECL_' + DatumType] = []
             Dict[Init+'_VALUE_' + DatumType]        = []
             Dict[Init+'_DB_VALUE_'+DatumType] = []
             
-    for Type in ['STRING_HEAD','VPD_HEAD','VARIABLE_HEAD']:
+    for Type in {'STRING_HEAD','VPD_HEAD','VARIABLE_HEAD'}:
         Dict[Type + '_CNAME_DECL']   = []
         Dict[Type + '_GUID_DECL']    = []
         Dict[Type + '_NUMSKUS_DECL'] = []
@@ -1087,7 +1087,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
     i = 0
     ReorderedDynPcdList = GetOrderedDynamicPcdList(DynamicPcdList, Platform.PcdTokenNumber)
     for item in ReorderedDynPcdList:
-        if item.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
+        if item.DatumType not in TAB_PCD_NUMERIC_TYPES_VOID:
             item.DatumType = TAB_VOID
     for Pcd in ReorderedDynPcdList:
         VoidStarTypeCurrSize = []
@@ -1130,11 +1130,11 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
         Pcd.InitString = 'UNINIT'
 
         if Pcd.DatumType == TAB_VOID:
-            if Pcd.Type not in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD]:
+            if Pcd.Type not in {TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD}:
                 Pcd.TokenTypeList = ['PCD_TYPE_STRING']
             else:
                 Pcd.TokenTypeList = []
-        elif Pcd.DatumType == 'BOOLEAN':
+        elif Pcd.DatumType == TAB_BOOLEAN:
             Pcd.TokenTypeList = ['PCD_DATUM_TYPE_UINT8_BOOLEAN']
         else:
             Pcd.TokenTypeList = ['PCD_DATUM_TYPE_' + Pcd.DatumType]
@@ -1235,10 +1235,10 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
                     
                     if Pcd.DatumType == TAB_UINT64:
                         Dict['VARDEF_VALUE_'+Pcd.DatumType].append(Sku.HiiDefaultValue + "ULL")
-                    elif Pcd.DatumType in (TAB_UINT32, TAB_UINT16, TAB_UINT8):
+                    elif Pcd.DatumType in {TAB_UINT32, TAB_UINT16, TAB_UINT8}:
                         Dict['VARDEF_VALUE_'+Pcd.DatumType].append(Sku.HiiDefaultValue + "U")
-                    elif Pcd.DatumType == "BOOLEAN":
-                        if eval(Sku.HiiDefaultValue) in [1,0]:
+                    elif Pcd.DatumType == TAB_BOOLEAN:
+                        if eval(Sku.HiiDefaultValue) in {1,0}:
                             Dict['VARDEF_VALUE_'+Pcd.DatumType].append(str(eval(Sku.HiiDefaultValue)) + "U")
                     else:
                         Dict['VARDEF_VALUE_'+Pcd.DatumType].append(Sku.HiiDefaultValue)
@@ -1323,7 +1323,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
             else:
                 if "PCD_TYPE_HII" not in Pcd.TokenTypeList:
                     Pcd.TokenTypeList += ['PCD_TYPE_DATA']
-                    if Sku.DefaultValue == 'TRUE':
+                    if Sku.DefaultValue == TAB_TRUE_1:
                         Pcd.InitString = 'INIT'
                     else:
                         Pcd.InitString = Pcd.isinit
@@ -1333,10 +1333,10 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
                 #
                 if Pcd.DatumType == TAB_UINT64:
                     ValueList.append(Sku.DefaultValue + "ULL")
-                elif Pcd.DatumType in (TAB_UINT32, TAB_UINT16, TAB_UINT8):
+                elif Pcd.DatumType in {TAB_UINT32, TAB_UINT16, TAB_UINT8}:
                     ValueList.append(Sku.DefaultValue + "U")
-                elif Pcd.DatumType == "BOOLEAN":
-                    if Sku.DefaultValue in ["1", "0"]:
+                elif Pcd.DatumType == TAB_BOOLEAN:
+                    if Sku.DefaultValue in {"1", "0"}:
                         ValueList.append(Sku.DefaultValue + "U")              
                 else:
                     ValueList.append(Sku.DefaultValue)
@@ -1516,7 +1516,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
             StringTableSize += Dict['PCD_CNAME_LENGTH'][index]
             StringTableIndex += 1
     if GuidList != []:
-        Dict['GUID_TABLE_EMPTY'] = 'FALSE'
+        Dict['GUID_TABLE_EMPTY'] = TAB_FALSE_1
         Dict['GUID_TABLE_SIZE'] = str(len(GuidList)) + 'U'
     else:
         Dict['GUID_STRUCTURE'] = [GuidStringToGuidStructureString('00000000-0000-0000-0000-000000000000')]
@@ -1528,7 +1528,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
         Dict['STRING_TABLE_GUID'].append('')
         Dict['STRING_TABLE_VALUE'].append('{ 0 }')
     else:
-        Dict['STRING_TABLE_EMPTY'] = 'FALSE'
+        Dict['STRING_TABLE_EMPTY'] = TAB_FALSE_1
         Dict['STRING_TABLE_SIZE'] = str(StringTableSize) + 'U'
 
     if Dict['SIZE_TABLE_CNAME'] == []:
@@ -1538,12 +1538,12 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
         Dict['SIZE_TABLE_MAXIMUM_LENGTH'].append('0U')
 
     if NumberOfLocalTokens != 0:
-        Dict['DATABASE_EMPTY']                = 'FALSE'
+        Dict['DATABASE_EMPTY']                = TAB_FALSE_1
         Dict['LOCAL_TOKEN_NUMBER_TABLE_SIZE'] = NumberOfLocalTokens
         Dict['LOCAL_TOKEN_NUMBER']            = NumberOfLocalTokens
 
     if NumberOfExTokens != 0:
-        Dict['EXMAP_TABLE_EMPTY']    = 'FALSE'
+        Dict['EXMAP_TABLE_EMPTY']    = TAB_FALSE_1
         Dict['EXMAPPING_TABLE_SIZE'] = str(NumberOfExTokens) + 'U'
         Dict['EX_TOKEN_NUMBER']      = str(NumberOfExTokens) + 'U'
     else:
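[Editorial note] The `type(Datas) in {list, tuple}` checks in the GenPcdDb hunks rely on types themselves being hashable set members. This is an exact-type test; `isinstance` with a tuple of types is the more idiomatic spelling and additionally accepts subclasses. A hedged comparison with illustrative names:

```python
def is_sequence_by_type(obj):
    # Exact-type check, as in the patch: subclasses of list/tuple are rejected.
    return type(obj) in {list, tuple}

def is_sequence_by_isinstance(obj):
    # Idiomatic check: subclasses are accepted as well.
    return isinstance(obj, (list, tuple))

class MyList(list):
    # Hypothetical subclass used only to show the behavioral difference.
    pass

assert is_sequence_by_type([1, 2]) is True
assert is_sequence_by_type(MyList()) is False        # exact type only
assert is_sequence_by_isinstance(MyList()) is True   # subclass accepted
```

For the RawDataList values handled here the two behave the same; the difference only matters if subclasses ever appear.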
diff --git a/BaseTools/Source/Python/AutoGen/GenVar.py b/BaseTools/Source/Python/AutoGen/GenVar.py
index 35f022ac2e19..c01661864c6d 100644
--- a/BaseTools/Source/Python/AutoGen/GenVar.py
+++ b/BaseTools/Source/Python/AutoGen/GenVar.py
@@ -295,8 +295,8 @@ class VariableMgr(object):
                 for value_char in tail.split(","):
                     Buffer += pack("=B",int(value_char,16))
                 data_len += len(tail.split(","))
-        elif data_type == "BOOLEAN":
-            Buffer += pack("=B",True) if var_value.upper() == "TRUE" else pack("=B",False)
+        elif data_type == TAB_BOOLEAN:
+            Buffer += pack("=B",True) if var_value.upper() == TAB_TRUE_1 else pack("=B",False)
             data_len += 1
         elif data_type  == DataType.TAB_UINT8:
             Buffer += pack("=B",GetIntegerValue(var_value))
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index b2a9bb1134ed..63add891e3f1 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -58,7 +58,7 @@ class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
                 itemIndex += 1
                 realLength += 5
                 for v_data in item.data:
-                    if type(v_data) in (int, long):
+                    if type(v_data) in {int, long}:
                         realLength += item.StorageWidth
                     else:
                         realLength += item.StorageWidth
@@ -138,7 +138,7 @@ class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
                 Buffer += b
                 realLength += 1
                 for v_data in item.data:
-                    if type(v_data) in (int, long):
+                    if type(v_data) in {int, long}:
                         b = pack(PACK_CODE_BY_SIZE[item.StorageWidth], v_data)
                         Buffer += b
                         realLength += item.StorageWidth
diff --git a/BaseTools/Source/Python/BPDG/BPDG.py b/BaseTools/Source/Python/BPDG/BPDG.py
index 6c8f89f5d12b..88e12b247c58 100644
--- a/BaseTools/Source/Python/BPDG/BPDG.py
+++ b/BaseTools/Source/Python/BPDG/BPDG.py
@@ -134,7 +134,7 @@ def StartBpdg(InputFileName, MapFileName, VpdFileName, Force):
     if os.path.exists(VpdFileName) and not Force:
         print "\nFile %s already exist, Overwrite(Yes/No)?[Y]: " % VpdFileName
         choice = sys.stdin.readline()
-        if choice.strip().lower() not in ['y', 'yes', '']:
+        if choice.strip().lower() not in {'y', 'yes', ''}:
             return
         
     GenVPD = GenVpd.GenVPD (InputFileName, MapFileName, VpdFileName)
diff --git a/BaseTools/Source/Python/BPDG/GenVpd.py b/BaseTools/Source/Python/BPDG/GenVpd.py
index dba815415f92..7125788b5bfe 100644
--- a/BaseTools/Source/Python/BPDG/GenVpd.py
+++ b/BaseTools/Source/Python/BPDG/GenVpd.py
@@ -72,9 +72,9 @@ class PcdEntry:
     #    
     def _IsBoolean(self, ValueString, Size):
         if (Size == "1"):
-            if ValueString.upper() in ["TRUE", "FALSE"]:
+            if ValueString.upper() in TAB_TRUE_FALSE_SET:
                 return True
-            elif ValueString in ["0", "1", "0x0", "0x1", "0x00", "0x01"]:
+            elif ValueString in {"0", "1", "0x0", "0x1", "0x00", "0x01"}:
                 return True
 
         return False
@@ -101,7 +101,7 @@ class PcdEntry:
     # 
     # 
     def _PackBooleanValue(self, ValueString):
-        if ValueString.upper() == "TRUE" or ValueString in ["1", "0x1", "0x01"]:
+        if ValueString.upper() == TAB_TRUE_1 or ValueString in {"1", "0x1", "0x01"}:
             try:
                 self.PcdValue = pack(_FORMAT_CHAR[1], 1)
             except:
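[Editorial note] `_PackBooleanValue` above packs the accepted TRUE spellings into a single byte via `struct.pack`. A self-contained sketch of the same idea; the `_FORMAT_CHAR` table is BaseTools-specific, so the 1-byte unsigned format `"B"` and the accepted spellings are assumptions here:

```python
import struct

# Assumed spellings that count as TRUE for a 1-byte BOOLEAN PCD.
TRUE_STRINGS = {"TRUE", "1", "0X1", "0X01"}

def pack_boolean(value_string):
    # Pack a BOOLEAN PCD value into a single unsigned byte (0 or 1).
    flag = 1 if value_string.upper() in TRUE_STRINGS else 0
    return struct.pack("B", flag)

assert pack_boolean("TRUE") == b"\x01"
assert pack_boolean("0x01") == b"\x01"
assert pack_boolean("FALSE") == b"\x00"
```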
diff --git a/BaseTools/Source/Python/Common/DataType.py b/BaseTools/Source/Python/Common/DataType.py
index 93136dff0db2..b86e403c10fb 100644
--- a/BaseTools/Source/Python/Common/DataType.py
+++ b/BaseTools/Source/Python/Common/DataType.py
@@ -41,10 +41,11 @@ TAB_UINT32 = 'UINT32'
 TAB_UINT64 = 'UINT64'
 TAB_VOID = 'VOID*'
 TAB_GUID = 'GUID'
+TAB_BOOLEAN = 'BOOLEAN'
 
 TAB_PCD_CLEAN_NUMERIC_TYPES = {TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64}
-TAB_PCD_NUMERIC_TYPES = {TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, 'BOOLEAN'}
-TAB_PCD_NUMERIC_TYPES_VOID = {TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, 'BOOLEAN', TAB_VOID}
+TAB_PCD_NUMERIC_TYPES = {TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_BOOLEAN}
+TAB_PCD_NUMERIC_TYPES_VOID = {TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_BOOLEAN, TAB_VOID}
 
 TAB_EDK_SOURCE = '$(EDK_SOURCE)'
 TAB_EFI_SOURCE = '$(EFI_SOURCE)'
@@ -79,10 +80,10 @@ SUP_MODULE_SMM_CORE = 'SMM_CORE'
 SUP_MODULE_MM_STANDALONE = 'MM_STANDALONE'
 SUP_MODULE_MM_CORE_STANDALONE = 'MM_CORE_STANDALONE'
 
-SUP_MODULE_LIST = [SUP_MODULE_BASE, SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM, SUP_MODULE_DXE_CORE, SUP_MODULE_DXE_DRIVER, \
+SUP_MODULE_SET = {SUP_MODULE_BASE, SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM, SUP_MODULE_DXE_CORE, SUP_MODULE_DXE_DRIVER, \
                    SUP_MODULE_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_DXE_SMM_DRIVER, SUP_MODULE_UEFI_DRIVER, \
-                   SUP_MODULE_UEFI_APPLICATION, SUP_MODULE_USER_DEFINED, SUP_MODULE_SMM_CORE, SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE]
-SUP_MODULE_LIST_STRING = TAB_VALUE_SPLIT.join(SUP_MODULE_LIST)
+                   SUP_MODULE_UEFI_APPLICATION, SUP_MODULE_USER_DEFINED, SUP_MODULE_SMM_CORE, SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE}
+SUP_MODULE_LIST_STRING = TAB_VALUE_SPLIT.join(sorted(SUP_MODULE_SET))
 SUP_MODULE_SET_PEI = {SUP_MODULE_PEIM, SUP_MODULE_PEI_CORE}
 
 EDK_COMPONENT_TYPE_LIBRARY = 'LIBRARY'
@@ -290,9 +291,23 @@ TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_SET =  {TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_
                                             TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_SMM_PAGE_SIZE}
 
 ## The mapping dictionary from datum type to its maximum number.
-MAX_VAL_TYPE = {"BOOLEAN":0x01, TAB_UINT8:0xFF, TAB_UINT16:0xFFFF, TAB_UINT32:0xFFFFFFFF, TAB_UINT64:0xFFFFFFFFFFFFFFFF}
+MAX_VAL_TYPE = {TAB_BOOLEAN:0x01, TAB_UINT8:0xFF, TAB_UINT16:0xFFFF, TAB_UINT32:0xFFFFFFFF, TAB_UINT64:0xFFFFFFFFFFFFFFFF}
 ## The mapping dictionary from datum type to size string.
-MAX_SIZE_TYPE = {"BOOLEAN":1, TAB_UINT8:1, TAB_UINT16:2, TAB_UINT32:4, TAB_UINT64:8}
+MAX_SIZE_TYPE = {TAB_BOOLEAN:1, TAB_UINT8:1, TAB_UINT16:2, TAB_UINT32:4, TAB_UINT64:8}
+
+TAB_TRUE_1 = 'TRUE'
+TAB_TRUE_2 = 'true'
+TAB_TRUE_3 = 'True'
+
+TAB_FALSE_1 = 'FALSE'
+TAB_FALSE_2 = 'false'
+TAB_FALSE_3 = 'False'
+
+TAB_TRUE_SET = {TAB_TRUE_1,TAB_TRUE_2,TAB_TRUE_3}
+TAB_FALSE_SET = {TAB_FALSE_1,TAB_FALSE_2,TAB_FALSE_3}
+
+TAB_TRUE_FALSE_SET = {TAB_TRUE_1,TAB_FALSE_1}
+
 
 TAB_DEPEX = 'Depex'
 TAB_DEPEX_COMMON = TAB_DEPEX + TAB_SPLIT + TAB_ARCH_COMMON
@@ -500,7 +515,12 @@ DEPEX_OPCODE_TRUE = "TRUE"
 DEPEX_OPCODE_FALSE = "FALSE"
 
 # Dependency Expression
-DEPEX_SUPPORTED_OPCODE_SET = {"BEFORE", "AFTER", "PUSH", "AND", "OR", "NOT", "END", "SOR", "TRUE", "FALSE", '(', ')'}
+DEPEX_SUPPORTED_OPCODE_SET = {DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER,
+                              DEPEX_OPCODE_PUSH, DEPEX_OPCODE_AND,
+                              DEPEX_OPCODE_OR, DEPEX_OPCODE_NOT,
+                              DEPEX_OPCODE_END, DEPEX_OPCODE_SOR,
+                              DEPEX_OPCODE_TRUE, DEPEX_OPCODE_FALSE,
+                              '(', ')'}
 
 TAB_STATIC_LIBRARY = "STATIC-LIBRARY-FILE"
 TAB_DYNAMIC_LIBRARY = "DYNAMIC-LIBRARY-FILE"
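[Editorial note] One caveat with the list-to-set conversions in DataType.py: set iteration order is arbitrary, so anything that serializes a set (such as joining module names into a display string) can vary between runs. A small sketch, with a hypothetical subset of module names, showing how `sorted()` restores a deterministic join:

```python
MODULES = {"PEIM", "DXE_DRIVER", "SEC", "BASE"}  # hypothetical subset

# Joining the raw set may emit the elements in any order.
unstable = "|".join(MODULES)

# Sorting first pins the output regardless of hash seed.
stable = "|".join(sorted(MODULES))

assert stable == "BASE|DXE_DRIVER|PEIM|SEC"
assert set(unstable.split("|")) == MODULES  # same elements either way
```

Sets are the right structure for membership tests; sorting (or keeping a separate ordered list) matters only at the point where the collection is rendered to text.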
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index 36f2654fc9cf..3133f610b4a7 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -656,9 +656,9 @@ class ValueExpression(BaseExpression):
 
         if self._Token.startswith('"'):
             self._Token = self._Token[1:-1]
-        elif self._Token in {"FALSE", "false", "False"}:
+        elif self._Token in TAB_FALSE_SET:
             self._Token = False
-        elif self._Token in {"TRUE", "true", "True"}:
+        elif self._Token in TAB_TRUE_SET:
             self._Token = True
         else:
             self.__IsNumberToken()
@@ -1020,9 +1020,9 @@ class ValueExpressionEx(ValueExpression):
                     else:
                         raise  BadExpression("Type: %s, Value: %s, %s"%(self.PcdType, PcdValue, Value))
 
-        if PcdValue == 'True':
+        if PcdValue == TAB_TRUE_3:
             PcdValue = '1'
-        if PcdValue == 'False':
+        if PcdValue == TAB_FALSE_3:
             PcdValue = '0'
 
         if RealValue:
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index bfb6e56a923f..5b8459e5007b 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -1403,9 +1403,9 @@ def ParseFieldValue (Value):
         if Value == 0:
             return 0, 1
         return Value, (Value.bit_length() + 7) / 8
-    if Value.lower() == 'true':
+    if Value.lower() == TAB_TRUE_2:
         return 1, 1
-    if Value.lower() == 'false':
+    if Value.lower() == TAB_FALSE_2:
         return 0, 1
     return Value, 1
 
@@ -1438,7 +1438,7 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=''):
     FieldList = AnalyzePcdExpression(Setting)
 
     IsValid = True
-    if PcdType in (MODEL_PCD_FIXED_AT_BUILD, MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_FEATURE_FLAG):
+    if PcdType in {MODEL_PCD_FIXED_AT_BUILD, MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_FEATURE_FLAG}:
         Value = FieldList[0]
         Size = ''
         if len(FieldList) > 1:
@@ -1461,7 +1461,7 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=''):
                 IsValid = False
                 Size = -1
         return [str(Value), '', str(Size)], IsValid, 0
-    elif PcdType in (MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT):
+    elif PcdType in {MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT}:
         Value = FieldList[0]
         Size = Type = ''
         if len(FieldList) > 1:
@@ -1482,7 +1482,7 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=''):
                 IsValid = False
                 Size = -1
         return [Value, Type, str(Size)], IsValid, 0
-    elif PcdType in (MODEL_PCD_DYNAMIC_VPD, MODEL_PCD_DYNAMIC_EX_VPD):
+    elif PcdType in {MODEL_PCD_DYNAMIC_VPD, MODEL_PCD_DYNAMIC_EX_VPD}:
         VpdOffset = FieldList[0]
         Value = Size = ''
         if not DataType == TAB_VOID:
@@ -1504,7 +1504,7 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=''):
                 IsValid = False
                 Size = -1
         return [VpdOffset, str(Size), Value], IsValid, 2
-    elif PcdType in (MODEL_PCD_DYNAMIC_HII, MODEL_PCD_DYNAMIC_EX_HII):
+    elif PcdType in {MODEL_PCD_DYNAMIC_HII, MODEL_PCD_DYNAMIC_EX_HII}:
         HiiString = FieldList[0]
         Guid = Offset = Value = Attribute = ''
         if len(FieldList) > 1:
@@ -1574,11 +1574,11 @@ def CheckPcdDatum(Type, Value):
                 PrintList = list(Printset)
                 PrintList.sort()
                 return False, "Invalid PCD string value of type [%s]; must be printable chars %s." % (Type, PrintList)
-    elif Type == 'BOOLEAN':
-        if Value not in ['TRUE', 'True', 'true', '0x1', '0x01', '1', 'FALSE', 'False', 'false', '0x0', '0x00', '0']:
+    elif Type == TAB_BOOLEAN:
+        if Value not in {TAB_TRUE_1, TAB_TRUE_2, TAB_TRUE_3, '0x1', '0x01', '1', TAB_FALSE_1, TAB_FALSE_2, TAB_FALSE_3, '0x0', '0x00', '0'}:
             return False, "Invalid value [%s] of type [%s]; must be one of TRUE, True, true, 0x1, 0x01, 1"\
                           ", FALSE, False, false, 0x0, 0x00, 0" % (Value, Type)
-    elif Type in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64]:
+    elif Type in {TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64}:
         try:
             Value = long(Value, 0)
         except:
@@ -1601,7 +1601,7 @@ def SplitOption(OptionString):
     QuotationMark = ""
     for Index in range(0, len(OptionString)):
         CurrentChar = OptionString[Index]
-        if CurrentChar in ['"', "'"]:
+        if CurrentChar in {'"', "'"}:
             if QuotationMark == CurrentChar:
                 QuotationMark = ""
             elif QuotationMark == "":
@@ -1610,7 +1610,7 @@ def SplitOption(OptionString):
         elif QuotationMark:
             continue
 
-        if CurrentChar in ["/", "-"] and LastChar in [" ", "\t", "\r", "\n"]:
+        if CurrentChar in {"/", "-"} and LastChar in {" ", "\t", "\r", "\n"}:
             if Index > OptionStart:
                 OptionList.append(OptionString[OptionStart:Index - 1])
             OptionStart = Index
@@ -2083,7 +2083,7 @@ def PackRegistryFormatGuid(Guid):
 #   @retval     Value    The integer value that the input represents
 #
 def GetIntegerValue(Input):
-    if type(Input) in (int, long):
+    if type(Input) in {int, long}:
         return Input
     String = Input
     if String.endswith("U"):
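[Editorial note] `GetIntegerValue` above strips C-style integer suffixes before converting. A hedged, self-contained sketch of the same pattern (names are illustrative; Python 3's `int` covers the Python 2 `long` case shown in the hunk):

```python
def get_integer_value(value):
    # Already numeric: return as-is, mirroring the type check in the patch.
    if isinstance(value, int):
        return value
    text = value.strip()
    # Strip C-style suffixes such as 10U, 42ULL, 0x10L (longest first so
    # "ULL" is not truncated to "LL").
    for suffix in ("ULL", "LL", "UL", "U", "L"):
        if text.upper().endswith(suffix):
            text = text[:-len(suffix)]
            break
    # Base 0 honours 0x/0o/0b prefixes and plain decimal.
    return int(text, 0)

assert get_integer_value("0x10U") == 16
assert get_integer_value("42ULL") == 42
assert get_integer_value(7) == 7
```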
diff --git a/BaseTools/Source/Python/Common/Parsing.py b/BaseTools/Source/Python/Common/Parsing.py
index 453c2039e3d9..94e73f2b78f9 100644
--- a/BaseTools/Source/Python/Common/Parsing.py
+++ b/BaseTools/Source/Python/Common/Parsing.py
@@ -42,7 +42,7 @@ def ParseDefineMacro2(Table, RecordSets, GlobalMacro):
     #
     # Replace the Macros
     #
-    for Value in (v for v in RecordSets.values() if v):
+    for Value in [v for v in RecordSets.values() if v]:
         for Item in Value:
             Item[0] = ReplaceMacro(Item[0], Macros)
 
diff --git a/BaseTools/Source/Python/Common/RangeExpression.py b/BaseTools/Source/Python/Common/RangeExpression.py
index 7f504d6e310c..fe78bcfd90bb 100644
--- a/BaseTools/Source/Python/Common/RangeExpression.py
+++ b/BaseTools/Source/Python/Common/RangeExpression.py
@@ -327,12 +327,12 @@ class RangeExpression(BaseExpression):
         
     def Eval(self, Operator, Oprand1, Oprand2 = None):
         
-        if Operator in ["!", "NOT", "not"]:
+        if Operator in {"!", "NOT", "not"}:
             if not gGuidPattern.match(Oprand1.strip()):
                 raise BadExpression(ERR_STRING_EXPR % Operator)
             return self.NegtiveRange(Oprand1)
         else:
-            if Operator in ["==", ">=", "<=", ">", "<", '^']:
+            if Operator in {"==", ">=", "<=", ">", "<", '^'}:
                 return self.EvalRange(Operator, Oprand1)
             elif Operator == 'and' :
                 if not gGuidPatternEnd.match(Oprand1.strip()) or not gGuidPatternEnd.match(Oprand2.strip()):
@@ -439,7 +439,7 @@ class RangeExpression(BaseExpression):
         Val = self._RelExpr()
         while self._IsOperator({"!=", "NOT", "not"}):
             Op = self._Token
-            if Op in ["!", "NOT", "not"]:
+            if Op in {"!", "NOT", "not"}:
                 if not self._IsOperator({"IN", "in"}):
                     raise BadExpression(ERR_REL_NOT_IN)
                 Op += ' ' + self._Token
@@ -576,9 +576,9 @@ class RangeExpression(BaseExpression):
 
         if self._Token.startswith('"'):
             self._Token = self._Token[1:-1]
-        elif self._Token in ["FALSE", "false", "False"]:
+        elif self._Token in TAB_FALSE_SET:
             self._Token = False
-        elif self._Token in ["TRUE", "true", "True"]:
+        elif self._Token in TAB_TRUE_SET:
             self._Token = True
         else:
             self.__IsNumberToken()
diff --git a/BaseTools/Source/Python/Common/TargetTxtClassObject.py b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
index 6f5e5f0d173d..ca116ed9b0aa 100644
--- a/BaseTools/Source/Python/Common/TargetTxtClassObject.py
+++ b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
@@ -96,8 +96,8 @@ class TargetTxtClassObject(object):
             else:
                 Value = ""
 
-            if Key in [DataType.TAB_TAT_DEFINES_ACTIVE_PLATFORM, DataType.TAB_TAT_DEFINES_TOOL_CHAIN_CONF, \
-                       DataType.TAB_TAT_DEFINES_ACTIVE_MODULE, DataType.TAB_TAT_DEFINES_BUILD_RULE_CONF]:
+            if Key in {DataType.TAB_TAT_DEFINES_ACTIVE_PLATFORM, DataType.TAB_TAT_DEFINES_TOOL_CHAIN_CONF, \
+                       DataType.TAB_TAT_DEFINES_ACTIVE_MODULE, DataType.TAB_TAT_DEFINES_BUILD_RULE_CONF}:
                 self.TargetTxtDictionary[Key] = Value.replace('\\', '/')
                 if Key == DataType.TAB_TAT_DEFINES_TOOL_CHAIN_CONF and self.TargetTxtDictionary[Key]:
                     if self.TargetTxtDictionary[Key].startswith("Conf/"):
@@ -119,8 +119,8 @@ class TargetTxtClassObject(object):
                         # The File pointed to by BUILD_RULE_CONF is not in a Conf/ directory
                         Build_Rule = os.path.join(self.ConfDirectoryPath, self.TargetTxtDictionary[Key].strip())
                     self.TargetTxtDictionary[Key] = Build_Rule
-            elif Key in [DataType.TAB_TAT_DEFINES_TARGET, DataType.TAB_TAT_DEFINES_TARGET_ARCH, \
-                         DataType.TAB_TAT_DEFINES_TOOL_CHAIN_TAG]:
+            elif Key in {DataType.TAB_TAT_DEFINES_TARGET, DataType.TAB_TAT_DEFINES_TARGET_ARCH, \
+                         DataType.TAB_TAT_DEFINES_TOOL_CHAIN_TAG}:
                 self.TargetTxtDictionary[Key] = Value.split()
             elif Key == DataType.TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER:
                 try:
diff --git a/BaseTools/Source/Python/Ecc/Check.py b/BaseTools/Source/Python/Ecc/Check.py
index e7bd97297538..6bb86f86a706 100644
--- a/BaseTools/Source/Python/Ecc/Check.py
+++ b/BaseTools/Source/Python/Ecc/Check.py
@@ -239,11 +239,6 @@ class Check(object):
         if EccGlobalData.gConfig.CFunctionLayoutCheckReturnType == '1' or EccGlobalData.gConfig.CFunctionLayoutCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking function layout return type ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.c', '.h'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        c.CheckFuncLayoutReturnType(FullName)
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 c.CheckFuncLayoutReturnType(FullName)
 
@@ -252,11 +247,6 @@ class Check(object):
         if EccGlobalData.gConfig.CFunctionLayoutCheckOptionalFunctionalModifier == '1' or EccGlobalData.gConfig.CFunctionLayoutCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking function layout modifier ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.c', '.h'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        c.CheckFuncLayoutModifier(FullName)
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 c.CheckFuncLayoutModifier(FullName)
 
@@ -266,11 +256,6 @@ class Check(object):
         if EccGlobalData.gConfig.CFunctionLayoutCheckFunctionName == '1' or EccGlobalData.gConfig.CFunctionLayoutCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking function layout function name ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.c', '.h'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        c.CheckFuncLayoutName(FullName)
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 c.CheckFuncLayoutName(FullName)
 
@@ -279,12 +264,6 @@ class Check(object):
         if EccGlobalData.gConfig.CFunctionLayoutCheckFunctionPrototype == '1' or EccGlobalData.gConfig.CFunctionLayoutCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking function layout function prototype ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        EdkLogger.quiet("[PROTOTYPE]" + FullName)
-#                        c.CheckFuncLayoutPrototype(FullName)
             for FullName in EccGlobalData.gCFileList:
                 EdkLogger.quiet("[PROTOTYPE]" + FullName)
                 c.CheckFuncLayoutPrototype(FullName)
@@ -294,11 +273,6 @@ class Check(object):
         if EccGlobalData.gConfig.CFunctionLayoutCheckFunctionBody == '1' or EccGlobalData.gConfig.CFunctionLayoutCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking function layout function body ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        c.CheckFuncLayoutBody(FullName)
             for FullName in EccGlobalData.gCFileList:
                 c.CheckFuncLayoutBody(FullName)
 
@@ -309,12 +283,6 @@ class Check(object):
         if EccGlobalData.gConfig.CFunctionLayoutCheckNoInitOfVariable == '1' or EccGlobalData.gConfig.CFunctionLayoutCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking function layout local variables ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        c.CheckFuncLayoutLocalVariable(FullName)
-
             for FullName in EccGlobalData.gCFileList:
                 c.CheckFuncLayoutLocalVariable(FullName)
 
@@ -337,11 +305,6 @@ class Check(object):
         if EccGlobalData.gConfig.DeclarationDataTypeCheckNoUseCType == '1' or EccGlobalData.gConfig.DeclarationDataTypeCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking Declaration No use C type ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.h', '.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        c.CheckDeclNoUseCType(FullName)
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 c.CheckDeclNoUseCType(FullName)
 
@@ -350,11 +313,6 @@ class Check(object):
         if EccGlobalData.gConfig.DeclarationDataTypeCheckInOutModifier == '1' or EccGlobalData.gConfig.DeclarationDataTypeCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking Declaration argument modifier ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.h', '.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        c.CheckDeclArgModifier(FullName)
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 c.CheckDeclArgModifier(FullName)
 
@@ -368,12 +326,6 @@ class Check(object):
         if EccGlobalData.gConfig.DeclarationDataTypeCheckEnumeratedType == '1' or EccGlobalData.gConfig.DeclarationDataTypeCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking Declaration enum typedef ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.h', '.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        EdkLogger.quiet("[ENUM]" + FullName)
-#                        c.CheckDeclEnumTypedef(FullName)
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 EdkLogger.quiet("[ENUM]" + FullName)
                 c.CheckDeclEnumTypedef(FullName)
@@ -383,12 +335,6 @@ class Check(object):
         if EccGlobalData.gConfig.DeclarationDataTypeCheckStructureDeclaration == '1' or EccGlobalData.gConfig.DeclarationDataTypeCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking Declaration struct typedef ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.h', '.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        EdkLogger.quiet("[STRUCT]" + FullName)
-#                        c.CheckDeclStructTypedef(FullName)
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 EdkLogger.quiet("[STRUCT]" + FullName)
                 c.CheckDeclStructTypedef(FullName)
@@ -420,12 +366,6 @@ class Check(object):
         if EccGlobalData.gConfig.DeclarationDataTypeCheckUnionType == '1' or EccGlobalData.gConfig.DeclarationDataTypeCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking Declaration union typedef ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.h', '.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        EdkLogger.quiet("[UNION]" + FullName)
-#                        c.CheckDeclUnionTypedef(FullName)
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 EdkLogger.quiet("[UNION]" + FullName)
                 c.CheckDeclUnionTypedef(FullName)
@@ -441,12 +381,6 @@ class Check(object):
         if EccGlobalData.gConfig.PredicateExpressionCheckBooleanValue == '1' or EccGlobalData.gConfig.PredicateExpressionCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking predicate expression Boolean value ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        EdkLogger.quiet("[BOOLEAN]" + FullName)
-#                        c.CheckBooleanValueComparison(FullName)
             for FullName in EccGlobalData.gCFileList:
                 EdkLogger.quiet("[BOOLEAN]" + FullName)
                 c.CheckBooleanValueComparison(FullName)
@@ -456,12 +390,6 @@ class Check(object):
         if EccGlobalData.gConfig.PredicateExpressionCheckNonBooleanOperator == '1' or EccGlobalData.gConfig.PredicateExpressionCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking predicate expression Non-Boolean variable...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        EdkLogger.quiet("[NON-BOOLEAN]" + FullName)
-#                        c.CheckNonBooleanValueComparison(FullName)
             for FullName in EccGlobalData.gCFileList:
                 EdkLogger.quiet("[NON-BOOLEAN]" + FullName)
                 c.CheckNonBooleanValueComparison(FullName)
@@ -471,12 +399,6 @@ class Check(object):
         if EccGlobalData.gConfig.PredicateExpressionCheckComparisonNullType == '1' or EccGlobalData.gConfig.PredicateExpressionCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking predicate expression NULL pointer ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        EdkLogger.quiet("[POINTER]" + FullName)
-#                        c.CheckPointerNullComparison(FullName)
             for FullName in EccGlobalData.gCFileList:
                 EdkLogger.quiet("[POINTER]" + FullName)
                 c.CheckPointerNullComparison(FullName)
@@ -518,11 +440,6 @@ class Check(object):
         if EccGlobalData.gConfig.IncludeFileCheckIfndefStatement == '1' or EccGlobalData.gConfig.IncludeFileCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking header file ifndef ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.h'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        MsgList = c.CheckHeaderFileIfndef(FullName)
             for FullName in EccGlobalData.gHFileList:
                 MsgList = c.CheckHeaderFileIfndef(FullName)
 
@@ -531,11 +448,6 @@ class Check(object):
         if EccGlobalData.gConfig.IncludeFileCheckData == '1' or EccGlobalData.gConfig.IncludeFileCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking header file data ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.h'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        MsgList = c.CheckHeaderFileData(FullName)
             for FullName in EccGlobalData.gHFileList:
                 MsgList = c.CheckHeaderFileData(FullName)
 
@@ -555,10 +467,10 @@ class Check(object):
             for Dirpath, Dirnames, Filenames in self.WalkTree():
                 for F in Filenames:
                     Ext = os.path.splitext(F)[1]
-                    if Ext in ('.h', '.c'):
+                    if Ext in {'.h', '.c'}:
                         FullName = os.path.join(Dirpath, F)
                         MsgList = c.CheckFileHeaderDoxygenComments(FullName)
-                    elif Ext in ('.inf', '.dec', '.dsc', '.fdf'):
+                    elif Ext in {'.inf', '.dec', '.dsc', '.fdf'}:
                         FullName = os.path.join(Dirpath, F)
                         op = open(FullName).readlines()
                         FileLinesList = op
@@ -642,11 +554,6 @@ class Check(object):
         if EccGlobalData.gConfig.DoxygenCheckFunctionHeader == '1' or EccGlobalData.gConfig.DoxygenCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking Doxygen function header ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.h', '.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        MsgList = c.CheckFuncHeaderDoxygenComments(FullName)
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 MsgList = c.CheckFuncHeaderDoxygenComments(FullName)
 
@@ -662,11 +569,6 @@ class Check(object):
         if EccGlobalData.gConfig.DoxygenCheckCommentFormat == '1' or EccGlobalData.gConfig.DoxygenCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking Doxygen comment ///< ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.h', '.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        MsgList = c.CheckDoxygenTripleForwardSlash(FullName)
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 MsgList = c.CheckDoxygenTripleForwardSlash(FullName)
 
@@ -675,11 +577,6 @@ class Check(object):
         if EccGlobalData.gConfig.DoxygenCheckCommand == '1' or EccGlobalData.gConfig.DoxygenCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking Doxygen command ...")
 
-#            for Dirpath, Dirnames, Filenames in self.WalkTree():
-#                for F in Filenames:
-#                    if os.path.splitext(F)[1] in ('.h', '.c'):
-#                        FullName = os.path.join(Dirpath, F)
-#                        MsgList = c.CheckDoxygenCommand(FullName)
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 MsgList = c.CheckDoxygenCommand(FullName)
 
@@ -1027,11 +924,11 @@ class Check(object):
                     for Record in RecordSet:
                         FunName = Record[0]
                         if not EccGlobalData.gException.IsException(ERROR_META_DATA_FILE_CHECK_PCD_TYPE, FunName):
-                            if Model in [MODEL_PCD_FIXED_AT_BUILD] and not FunName.startswith('FixedPcdGet'):
+                            if Model == MODEL_PCD_FIXED_AT_BUILD and not FunName.startswith('FixedPcdGet'):
                                 EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_TYPE, OtherMsg="The pcd '%s' is defined as a FixPcd but now it is called by c function [%s]" % (PcdName, FunName), BelongsToTable=TblName, BelongsToItem=Record[1])
-                            if Model in [MODEL_PCD_FEATURE_FLAG] and not FunName.startswith(('FeaturePcdGet','FeaturePcdSet')):
+                            if Model == MODEL_PCD_FEATURE_FLAG and not FunName.startswith(('FeaturePcdGet','FeaturePcdSet')):
                                 EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_TYPE, OtherMsg="The pcd '%s' is defined as a FeaturePcd but now it is called by c function [%s]" % (PcdName, FunName), BelongsToTable=TblName, BelongsToItem=Record[1])
-                            if Model in [MODEL_PCD_PATCHABLE_IN_MODULE] and not FunName.startswith(('PatchablePcdGet','PatchablePcdSet')):
+                            if Model == MODEL_PCD_PATCHABLE_IN_MODULE and not FunName.startswith(('PatchablePcdGet','PatchablePcdSet')):
                                 EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_TYPE, OtherMsg="The pcd '%s' is defined as a PatchablePcd but now it is called by c function [%s]" % (PcdName, FunName), BelongsToTable=TblName, BelongsToItem=Record[1])
 
             #ERROR_META_DATA_FILE_CHECK_PCD_TYPE
@@ -1188,12 +1085,12 @@ class Check(object):
                 if Usage.startswith(DT.TAB_SPECIAL_COMMENT):
                     PcdCommentList = Usage[2:].split(DT.TAB_SPECIAL_COMMENT)
                     if len(PcdCommentList) >= 1:
-                        if Model in [MODEL_PCD_FIXED_AT_BUILD, MODEL_PCD_FEATURE_FLAG] \
+                        if Model in {MODEL_PCD_FIXED_AT_BUILD, MODEL_PCD_FEATURE_FLAG} \
                             and not PcdCommentList[0].strip().startswith((DT.TAB_INF_USAGE_SOME_PRO,
                                                                           DT.TAB_INF_USAGE_CON,
                                                                           DT.TAB_INF_USAGE_UNDEFINED)):
                             EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_FORMAT_PCD, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
-                        if Model in [MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_DYNAMIC, MODEL_PCD_DYNAMIC_EX] \
+                        if Model in {MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_DYNAMIC, MODEL_PCD_DYNAMIC_EX} \
                             and not PcdCommentList[0].strip().startswith((DT.TAB_INF_USAGE_PRO,
                                                                           DT.TAB_INF_USAGE_SOME_PRO,
                                                                           DT.TAB_INF_USAGE_CON,
@@ -1259,7 +1156,7 @@ class Check(object):
         or EccGlobalData.gConfig.CheckAll == '1':
             for Dirpath, Dirnames, Filenames in self.WalkTree():
                 for F in Filenames:
-                    if os.path.splitext(F)[1] in ('.h', '.c'):
+                    if os.path.splitext(F)[1] in {'.h', '.c'}:
                         FullName = os.path.join(Dirpath, F)
                         Id = c.GetTableID(FullName)
                         if Id < 0:
@@ -1269,7 +1166,7 @@ class Check(object):
                         self.NamingConventionCheckTypedefStatement(FileTable)
                         self.NamingConventionCheckVariableName(FileTable)
                         self.NamingConventionCheckSingleCharacterVariable(FileTable)
-                        if os.path.splitext(F)[1] in ('.h'):
+                        if os.path.splitext(F)[1] == '.h':
                             self.NamingConventionCheckIfndefStatement(FileTable)
 
         self.NamingConventionCheckPathName()
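The Ecc/Check.py hunks above swap literal tuples for set literals in membership tests. A set hashes its elements, so `x in s` is an average O(1) lookup, while a tuple scans linearly; for these small extension lists the speedup is modest, but the set also states the intent (membership only, order irrelevant). A minimal self-contained sketch of the comparison, using hypothetical extension lists:

```python
import timeit

# Same candidate extensions as a tuple and as a set.
exts_tuple = ('.h', '.c', '.inf', '.dec', '.dsc', '.fdf')
exts_set = {'.h', '.c', '.inf', '.dec', '.dsc', '.fdf'}

# Probe the worst case for the tuple: the last element.
t_tuple = timeit.timeit(lambda: '.fdf' in exts_tuple, number=100_000)
t_set = timeit.timeit(lambda: '.fdf' in exts_set, number=100_000)

# The set lookup does not depend on where the element sits.
print(t_tuple, t_set)
```

Note that the patch keeps lists where iteration order matters (e.g. `gCFileList + gHFileList`); only pure membership tests become sets.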
diff --git a/BaseTools/Source/Python/Ecc/MetaDataParser.py b/BaseTools/Source/Python/Ecc/MetaDataParser.py
index 82ede3eb330c..f3b7b41298bc 100644
--- a/BaseTools/Source/Python/Ecc/MetaDataParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaDataParser.py
@@ -135,8 +135,8 @@ def ParseHeaderCommentSection(CommentList, FileName = None):
         # indication of different block; or in the position that Abstract should be, also keep it
         # as it indicates that no abstract
         #
-        if not Comment and HeaderCommentStage not in [HEADER_COMMENT_LICENSE, \
-                                                      HEADER_COMMENT_DESCRIPTION, HEADER_COMMENT_ABSTRACT]:
+        if not Comment and HeaderCommentStage not in {HEADER_COMMENT_LICENSE, \
+                                                      HEADER_COMMENT_DESCRIPTION, HEADER_COMMENT_ABSTRACT}:
             continue
         
         if HeaderCommentStage == HEADER_COMMENT_NOT_STARTED:
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
index 0f9711ba109e..9baa59f94d9c 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
@@ -482,14 +482,14 @@ class InfParser(MetaFileParser):
         IsFindBlockComment = False
 
         for Index in range(0, len(Content)):
-            if self._SectionType in [MODEL_EFI_GUID,
+            if self._SectionType in {MODEL_EFI_GUID,
                                      MODEL_EFI_PROTOCOL,
                                      MODEL_EFI_PPI,
                                      MODEL_PCD_FIXED_AT_BUILD,
                                      MODEL_PCD_PATCHABLE_IN_MODULE,
                                      MODEL_PCD_FEATURE_FLAG,
                                      MODEL_PCD_DYNAMIC_EX,
-                                     MODEL_PCD_DYNAMIC]:
+                                     MODEL_PCD_DYNAMIC}:
                 Line = Content[Index].strip()
                 if Line.startswith(TAB_SPECIAL_COMMENT):
                     Usage += ' ' + Line[Line.find(TAB_SPECIAL_COMMENT):]
@@ -525,7 +525,7 @@ class InfParser(MetaFileParser):
                 self._SectionHeaderParser()
                 # Check invalid sections
                 if self._Version < 0x00010005:
-                    if self._SectionType in [MODEL_META_DATA_BUILD_OPTION,
+                    if self._SectionType in {MODEL_META_DATA_BUILD_OPTION,
                                              MODEL_EFI_LIBRARY_CLASS,
                                              MODEL_META_DATA_PACKAGE,
                                              MODEL_PCD_FIXED_AT_BUILD,
@@ -536,13 +536,13 @@ class InfParser(MetaFileParser):
                                              MODEL_EFI_GUID,
                                              MODEL_EFI_PROTOCOL,
                                              MODEL_EFI_PPI,
-                                             MODEL_META_DATA_USER_EXTENSION]:
+                                             MODEL_META_DATA_USER_EXTENSION}:
                         EdkLogger.error('Parser', FORMAT_INVALID,
                                         "Section [%s] is not allowed in inf file without version" % (self._SectionName),
                                         ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex+1)
-                elif self._SectionType in [MODEL_EFI_INCLUDE,
+                elif self._SectionType in {MODEL_EFI_INCLUDE,
                                            MODEL_EFI_LIBRARY_INSTANCE,
-                                           MODEL_META_DATA_NMAKE]:
+                                           MODEL_META_DATA_NMAKE}:
                     EdkLogger.error('Parser', FORMAT_INVALID,
                                     "Section [%s] is not allowed in inf file with version 0x%08x" % (self._SectionName, self._Version),
                                     ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex+1)
@@ -694,9 +694,9 @@ class InfParser(MetaFileParser):
         # if value are 'True', 'true', 'TRUE' or 'False', 'false', 'FALSE', replace with integer 1 or 0.
         if self._ValueList[2] != '':
             InfPcdValueList = GetSplitValueList(TokenList[1], TAB_VALUE_SPLIT, 1)
-            if InfPcdValueList[0] in ['True', 'true', 'TRUE']:
+            if InfPcdValueList[0] in TAB_TRUE_SET:
                 self._ValueList[2] = TokenList[1].replace(InfPcdValueList[0], '1', 1);
-            elif InfPcdValueList[0] in ['False', 'false', 'FALSE']:
+            elif InfPcdValueList[0] in TAB_FALSE_SET:
                 self._ValueList[2] = TokenList[1].replace(InfPcdValueList[0], '0', 1);
 
     ## [depex] section parser
@@ -933,7 +933,7 @@ class DscParser(MetaFileParser):
         if DirectiveName not in self.DataType:
             EdkLogger.error("Parser", FORMAT_INVALID, "Unknown directive [%s]" % DirectiveName,
                             File=self.MetaFile, Line=self._LineIndex+1)
-        if DirectiveName in ['!IF', '!IFDEF', '!INCLUDE', '!IFNDEF', '!ELSEIF'] and self._ValueList[1] == '':
+        if DirectiveName in {'!IF', '!IFDEF', '!INCLUDE', '!IFNDEF', '!ELSEIF'} and self._ValueList[1] == '':
             EdkLogger.error("Parser", FORMAT_INVALID, "Missing expression",
                             File=self.MetaFile, Line=self._LineIndex+1,
                             ExtraData=self._CurrentLine)
@@ -944,9 +944,9 @@ class DscParser(MetaFileParser):
             while self._DirectiveStack:
                 # Remove any !else or !elseif
                 DirectiveInfo = self._DirectiveStack.pop()
-                if DirectiveInfo[0] in [MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
+                if DirectiveInfo[0] in {MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
                                         MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF,
-                                        MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF]:
+                                        MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF}:
                     break
             else:
                 EdkLogger.error("Parser", FORMAT_INVALID, "Redundant '!endif'",
@@ -1053,9 +1053,9 @@ class DscParser(MetaFileParser):
                             File=self.MetaFile, Line=self._LineIndex+1)
         # if value are 'True', 'true', 'TRUE' or 'False', 'false', 'FALSE', replace with integer 1 or 0.
         DscPcdValueList = GetSplitValueList(TokenList[1], TAB_VALUE_SPLIT, 1)
-        if DscPcdValueList[0] in ['True', 'true', 'TRUE']:
+        if DscPcdValueList[0] in TAB_TRUE_SET:
             self._ValueList[2] = TokenList[1].replace(DscPcdValueList[0], '1', 1);
-        elif DscPcdValueList[0] in ['False', 'false', 'FALSE']:
+        elif DscPcdValueList[0] in TAB_FALSE_SET:
             self._ValueList[2] = TokenList[1].replace(DscPcdValueList[0], '0', 1);
 
     ## [components] section parser
@@ -1121,7 +1121,7 @@ class DscParser(MetaFileParser):
         Macros.update(GlobalData.gPlatformDefines)
         Macros.update(GlobalData.gCommandLineDefines)
         # PCD cannot be referenced in macro definition
-        if self._ItemType not in [MODEL_META_DATA_DEFINE, MODEL_META_DATA_GLOBAL_DEFINE]:
+        if self._ItemType not in {MODEL_META_DATA_DEFINE, MODEL_META_DATA_GLOBAL_DEFINE}:
             Macros.update(self._Symbols)
         return Macros
 
@@ -1303,8 +1303,8 @@ class DscParser(MetaFileParser):
 
     def __ProcessDirective(self):
         Result = None
-        if self._ItemType in [MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
-                              MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSEIF]:
+        if self._ItemType in {MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
+                              MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSEIF}:
             Macros = self._Macros
             Macros.update(GlobalData.gGlobalDefines)
             try:
@@ -1325,9 +1325,9 @@ class DscParser(MetaFileParser):
                 EdkLogger.debug(EdkLogger.DEBUG_5, str(Exc), self._ValueList[1])
                 Result = False
 
-        if self._ItemType in [MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
+        if self._ItemType in {MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
                               MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF,
-                              MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF]:
+                              MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF}:
             self._DirectiveStack.append(self._ItemType)
             if self._ItemType == MODEL_META_DATA_CONDITIONAL_STATEMENT_IF:
                 Result = bool(Result)
@@ -1350,10 +1350,10 @@ class DscParser(MetaFileParser):
             while self._DirectiveStack:
                 self._DirectiveEvalStack.pop()
                 Directive = self._DirectiveStack.pop()
-                if Directive in [MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
+                if Directive in {MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
                                  MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF,
                                  MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSE,
-                                 MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF]:
+                                 MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF}:
                     break
         elif self._ItemType == MODEL_META_DATA_INCLUDE:
             # The included file must be relative to workspace or same directory as DSC file
@@ -1450,9 +1450,9 @@ class DscParser(MetaFileParser):
             except WrnExpression, Value:
                 ValueList[-1] = Value.result
             
-            if ValueList[-1] == 'True':
+            if ValueList[-1] == TAB_TRUE_3:
                 ValueList[-1] = '1'
-            if ValueList[-1] == 'False':
+            if ValueList[-1] == TAB_FALSE_3:
                 ValueList[-1] = '0'      
 
         self._ValueList[2] = '|'.join(ValueList)
@@ -1880,9 +1880,9 @@ class DecParser(MetaFileParser):
             if self._UniObj:
                 self._UniObj.CheckPcdInfo(TokenList[0])
 
-        if ValueList[0] in ['True', 'true', 'TRUE']:
+        if ValueList[0] in TAB_TRUE_SET:
             ValueList[0] = '1'
-        elif ValueList[0] in ['False', 'false', 'FALSE']:
+        elif ValueList[0] in TAB_FALSE_SET:
             ValueList[0] = '0'
 
         self._ValueList[2] = ValueList[0].strip() + '|' + ValueList[1].strip() + '|' + ValueList[2].strip()
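The MetaFileParser hunks replace repeated `['True', 'true', 'TRUE']` literals with shared `TAB_TRUE_SET` / `TAB_FALSE_SET` constants. Their exact definitions live in Common/DataType.py and are not shown in this patch, so the literals below are an assumption; the normalization logic mirrors what the parser does with PCD values:

```python
# Assumed definitions (the real ones are in Common/DataType.py).
TAB_TRUE_SET = {'True', 'true', 'TRUE'}
TAB_FALSE_SET = {'False', 'false', 'FALSE'}

def normalize_pcd_bool(value):
    """Map textual booleans to '1'/'0'; pass other values through unchanged."""
    if value in TAB_TRUE_SET:
        return '1'
    if value in TAB_FALSE_SET:
        return '0'
    return value

print(normalize_pcd_bool('TRUE'))   # -> 1
print(normalize_pcd_bool('0x10'))   # -> 0x10
```

Centralizing the sets means a future spelling (or a typo fix) is changed in one place instead of in every parser.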
diff --git a/BaseTools/Source/Python/Ecc/c.py b/BaseTools/Source/Python/Ecc/c.py
index 4c49d1ca570f..29e8a81d8ef9 100644
--- a/BaseTools/Source/Python/Ecc/c.py
+++ b/BaseTools/Source/Python/Ecc/c.py
@@ -22,6 +22,7 @@ from Common import EdkLogger
 from EccToolError import *
 import EccGlobalData
 import MetaDataParser
+from Common.DataType import TAB_BOOLEAN
 
 IncludeFileListDict = {}
 AllIncludeFileListDict = {}
@@ -518,7 +519,7 @@ def CollectSourceCodeDataIntoDB(RootDir):
             collector = None
             FullName = os.path.normpath(os.path.join(dirpath, f))
             model = DataClass.MODEL_FILE_OTHERS
-            if os.path.splitext(f)[1] in ('.h', '.c'):
+            if os.path.splitext(f)[1] in {'.h', '.c'}:
                 EdkLogger.info("Parsing " + FullName)
                 model = f.endswith('c') and DataClass.MODEL_FILE_C or DataClass.MODEL_FILE_H
                 collector = CodeFragmentCollector.CodeFragmentCollector(FullName)
@@ -543,7 +544,7 @@ def CollectSourceCodeDataIntoDB(RootDir):
 
     Db = GetDB()
     for file in FileObjList:
-        if file.ExtName.upper() not in ['INF', 'DEC', 'DSC', 'FDF']:
+        if file.ExtName.upper() not in {'INF', 'DEC', 'DSC', 'FDF'}:
             Db.InsertOneFile(file)
 
     Db.UpdateIdentifierBelongsToFunction()
@@ -571,7 +572,7 @@ def GetTableID(FullFileName, ErrorMsgList=None):
     return FileID
 
 def GetIncludeFileList(FullFileName):
-    if os.path.splitext(FullFileName)[1].upper() not in ('.H'):
+    if os.path.splitext(FullFileName)[1].upper() != '.H':
         return []
     IFList = IncludeFileListDict.get(FullFileName)
     if IFList is not None:
@@ -991,7 +992,7 @@ def GetFinalTypeValue(Type, FieldName, TypedefDict, SUDict):
     while LBPos == -1:
         FTList = Value.split()
         for FT in FTList:
-            if FT not in ('struct', 'union'):
+            if FT not in {'struct', 'union'}:
                 Value = TypedefDict.get(FT)
                 if Value is None:
                     Value = SUDict.get(FT)
@@ -1639,7 +1640,7 @@ def CheckMemberVariableFormat(Name, Value, FileTable, TdId, ModelId):
         TokenList = Field.split()
         # Remove pointers before variable
         Token = TokenList[-1]
-        if Token in ['OPTIONAL']:
+        if Token == 'OPTIONAL':
             Token = TokenList[-2]
         if not Pattern.match(Token.lstrip('*')):
             ErrMsgList.append(Token.lstrip('*'))
@@ -2046,18 +2047,18 @@ def CheckNonBooleanValueComparison(FullFileName):
                 if SearchInCache:
                     Type = FuncReturnTypeDict.get(PredVarStr)
                     if Type is not None:
-                        if Type.find('BOOLEAN') == -1:
+                        if Type.find(TAB_BOOLEAN) == -1:
                             PrintErrorMsg(ERROR_PREDICATE_EXPRESSION_CHECK_NO_BOOLEAN_OPERATOR, 'Predicate Expression: %s' % Exp, FileTable, Str[2])
                         continue
 
                     if PredVarStr in FuncReturnTypeDict:
                         continue
-                Type = GetVarInfo(PredVarList, FuncRecord, FullFileName, IsFuncCall, 'BOOLEAN', StarList)
+                Type = GetVarInfo(PredVarList, FuncRecord, FullFileName, IsFuncCall, TAB_BOOLEAN, StarList)
                 if SearchInCache:
                     FuncReturnTypeDict[PredVarStr] = Type
                 if Type is None:
                     continue
-                if Type.find('BOOLEAN') == -1:
+                if Type.find(TAB_BOOLEAN) == -1:
                     PrintErrorMsg(ERROR_PREDICATE_EXPRESSION_CHECK_NO_BOOLEAN_OPERATOR, 'Predicate Expression: %s' % Exp, FileTable, Str[2])
 
 
@@ -2101,7 +2102,7 @@ def CheckBooleanValueComparison(FullFileName):
 
         for Exp in GetPredicateListFromPredicateExpStr(Str[0]):
             PredInfo = SplitPredicateStr(Exp)
-            if PredInfo[1] in ('==', '!=') and PredInfo[0][1] in ('TRUE', 'FALSE'):
+            if PredInfo[1] in {'==', '!='} and PredInfo[0][1] in TAB_TRUE_FALSE_SET:
                 PredVarStr = PredInfo[0][0].strip()
                 IsFuncCall = False
                 SearchInCache = False
@@ -2125,19 +2126,19 @@ def CheckBooleanValueComparison(FullFileName):
                 if SearchInCache:
                     Type = FuncReturnTypeDict.get(PredVarStr)
                     if Type is not None:
-                        if Type.find('BOOLEAN') != -1:
+                        if Type.find(TAB_BOOLEAN) != -1:
                             PrintErrorMsg(ERROR_PREDICATE_EXPRESSION_CHECK_BOOLEAN_VALUE, 'Predicate Expression: %s' % Exp, FileTable, Str[2])
                         continue
 
                     if PredVarStr in FuncReturnTypeDict:
                         continue
 
-                Type = GetVarInfo(PredVarList, FuncRecord, FullFileName, IsFuncCall, 'BOOLEAN', StarList)
+                Type = GetVarInfo(PredVarList, FuncRecord, FullFileName, IsFuncCall, TAB_BOOLEAN, StarList)
                 if SearchInCache:
                     FuncReturnTypeDict[PredVarStr] = Type
                 if Type is None:
                     continue
-                if Type.find('BOOLEAN') != -1:
+                if Type.find(TAB_BOOLEAN) != -1:
                     PrintErrorMsg(ERROR_PREDICATE_EXPRESSION_CHECK_BOOLEAN_VALUE, 'Predicate Expression: %s' % Exp, FileTable, Str[2])
 
 
@@ -2236,7 +2237,7 @@ def CheckDoxygenCommand(FullFileName):
                     continue
                 if not Part.replace('@', '').strip():
                     continue
-                if Part.lstrip('@') in ['{', '}']:
+                if Part.lstrip('@') in {'{', '}'}:
                     continue
                 if Part.lstrip('@').isalpha():
                     if Part.lstrip('@') not in DoxygenCommandList:
diff --git a/BaseTools/Source/Python/Eot/Parser.py b/BaseTools/Source/Python/Eot/Parser.py
index 14c287588a01..f7ce6371e0ea 100644
--- a/BaseTools/Source/Python/Eot/Parser.py
+++ b/BaseTools/Source/Python/Eot/Parser.py
@@ -72,8 +72,8 @@ def PreProcess(Filename, MergeMultipleLines = True, LineNo = -1):
             if IsFindBlockCode and Line[-1] != TAB_SLASH:
                 ReservedLine = (ReservedLine + TAB_SPACE_SPLIT + Line).strip()
                 Lines.append(ReservedLine)
-                for Index in (0, ReservedLineLength):
-                    Lines.append('')
+                Lines.append('')
+                Lines.append('')
                 ReservedLine = ''
                 ReservedLineLength = 0
                 IsFindBlockCode = False
diff --git a/BaseTools/Source/Python/Eot/Report.py b/BaseTools/Source/Python/Eot/Report.py
index 99b8b152180a..0e9d7300f4f2 100644
--- a/BaseTools/Source/Python/Eot/Report.py
+++ b/BaseTools/Source/Python/Eot/Report.py
@@ -17,6 +17,7 @@
 import Common.LongFilePathOs as os
 import EotGlobalData
 from Common.LongFilePathSupport import OpenLongFilePath as open
+from Common.DataType import *
 
 ## Report() class
 #
@@ -138,11 +139,13 @@ class Report(object):
     #  @param DepexString: A DEPEX string needed to be parsed
     #
     def GenerateDepex(self, DepexString):
-        NonGuidList = ['AND', 'OR', 'NOT', 'BEFORE', 'AFTER', 'TRUE', 'FALSE']
+                NonGuidSet = {DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER,
+                      DEPEX_OPCODE_AND, DEPEX_OPCODE_OR, DEPEX_OPCODE_NOT,
+                      DEPEX_OPCODE_TRUE, DEPEX_OPCODE_FALSE}
         ItemList = DepexString.split(' ')
         DepexString = ''
         for Item in ItemList:
-            if Item not in NonGuidList:
+            if Item not in NonGuidSet:
                 SqlCommand = """select DISTINCT GuidName from Report where GuidValue like '%s' and ItemMode = 'Produced' group by GuidName""" % (Item)
                 RecordSet = EotGlobalData.gDb.TblReport.Exec(SqlCommand)
                 if RecordSet != []:
@@ -234,7 +237,7 @@ class Report(object):
     #
     def GenerateFfs(self, FfsObj):
         self.FfsIndex = self.FfsIndex + 1
-        if FfsObj is not None and FfsObj.Type in [0x03, 0x04, 0x05, 0x06, 0x07, 0x08, 0xA]:
+        if FfsObj is not None and FfsObj.Type in {0x03, 0x04, 0x05, 0x06, 0x07, 0x08, 0xA}:
             FfsGuid = FfsObj.Guid
             FfsOffset = FfsObj._OFF_
             FfsName = 'Unknown-Module'
@@ -278,9 +281,9 @@ class Report(object):
         <td colspan="4"><table width="100%%"  border="1">""" % (self.FfsIndex, self.FfsIndex, self.FfsIndex, FfsPath, FfsName, FfsGuid, FfsOffset, FfsType, self.FfsIndex)
             
             if self.DispatchList:
-                if FfsObj.Type in [0x04, 0x06]:
+                if FfsObj.Type in {0x04, 0x06}:
                     self.DispatchList.write("%s %s %s %s\n" % (FfsGuid, "P", FfsName, FfsPath))
-                if FfsObj.Type in [0x05, 0x07, 0x08, 0x0A]:
+                if FfsObj.Type in {0x05, 0x07, 0x08, 0x0A}:
                     self.DispatchList.write("%s %s %s %s\n" % (FfsGuid, "D", FfsName, FfsPath))
                
             self.WriteLn(Content)
diff --git a/BaseTools/Source/Python/Eot/c.py b/BaseTools/Source/Python/Eot/c.py
index 8199ce5ee73e..84a6f0961279 100644
--- a/BaseTools/Source/Python/Eot/c.py
+++ b/BaseTools/Source/Python/Eot/c.py
@@ -345,7 +345,7 @@ def CreateCCodeDB(FileNameList):
     ParseErrorFileList = []
     ParsedFiles = {}
     for FullName in FileNameList:
-        if os.path.splitext(FullName)[1] in ('.h', '.c'):
+        if os.path.splitext(FullName)[1] in {'.h', '.c'}:
             if FullName.lower() in ParsedFiles:
                 continue
             ParsedFiles[FullName.lower()] = 1
diff --git a/BaseTools/Source/Python/GenFds/DataSection.py b/BaseTools/Source/Python/GenFds/DataSection.py
index 71c2796b0b39..cc9c4d5b9aa7 100644
--- a/BaseTools/Source/Python/GenFds/DataSection.py
+++ b/BaseTools/Source/Python/GenFds/DataSection.py
@@ -92,7 +92,7 @@ class DataSection (DataSectionClassObject):
                 self.Alignment = str (ImageObj.SectionAlignment / 0x100000) + 'M'
 
         NoStrip = True
-        if self.SecType in (BINARY_FILE_TYPE_TE, BINARY_FILE_TYPE_PE32):
+        if self.SecType in {BINARY_FILE_TYPE_TE, BINARY_FILE_TYPE_PE32}:
             if self.KeepReloc is not None:
                 NoStrip = self.KeepReloc
 
diff --git a/BaseTools/Source/Python/GenFds/DepexSection.py b/BaseTools/Source/Python/GenFds/DepexSection.py
index 4392b9c62409..0521dd5b8d43 100644
--- a/BaseTools/Source/Python/GenFds/DepexSection.py
+++ b/BaseTools/Source/Python/GenFds/DepexSection.py
@@ -82,7 +82,10 @@ class DepexSection (DepexSectionClassObject):
             ExpList = self.Expression.split()
 
             for Exp in ExpList:
-                if Exp.upper() not in ('AND', 'OR', 'NOT', 'TRUE', 'FALSE', 'SOR', 'BEFORE', 'AFTER', 'END'):
+                if Exp.upper() not in {DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER,
+                        DEPEX_OPCODE_AND, DEPEX_OPCODE_OR, DEPEX_OPCODE_NOT,
+                        DEPEX_OPCODE_END, DEPEX_OPCODE_SOR, DEPEX_OPCODE_TRUE,
+                        DEPEX_OPCODE_FALSE}:
                     GuidStr = self.__FindGuidValue(Exp)
                     if GuidStr is None:
                         EdkLogger.error("GenFds", RESOURCE_NOT_AVAILABLE,
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index 55348083b954..61b31bd36ff2 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -280,7 +280,7 @@ class FdfParser:
         Count = 0
         while not self.__EndOfFile():
             Count += 1
-            if self.__CurrentChar() in (T_CHAR_NULL, T_CHAR_CR, T_CHAR_LF, T_CHAR_SPACE, T_CHAR_TAB):
+            if self.__CurrentChar() in {T_CHAR_NULL, T_CHAR_CR, T_CHAR_LF, T_CHAR_SPACE, T_CHAR_TAB}:
                 self.__SkippedChars += str(self.__CurrentChar())
                 self.__GetOneChar()
 
@@ -423,14 +423,14 @@ class FdfParser:
             return
 
         Offset = StartPos[1]
-        while self.Profile.FileLinesList[StartPos[0]][Offset] not in ('\r', '\n'):
+        while self.Profile.FileLinesList[StartPos[0]][Offset] not in {'\r', '\n'}:
             self.Profile.FileLinesList[StartPos[0]][Offset] = Value
             Offset += 1
 
         Line = StartPos[0]
         while Line < EndPos[0]:
             Offset = 0
-            while self.Profile.FileLinesList[Line][Offset] not in ('\r', '\n'):
+            while self.Profile.FileLinesList[Line][Offset] not in {'\r', '\n'}:
                 self.Profile.FileLinesList[Line][Offset] = Value
                 Offset += 1
             Line += 1
@@ -741,7 +741,7 @@ class FdfParser:
                     PreIndex = 0
                     StartPos = CurLine.find('$(', PreIndex)
                     EndPos = CurLine.find(')', StartPos+2)
-                    while StartPos != -1 and EndPos != -1 and self.__Token not in ['!ifdef', '!ifndef', '!if', '!elseif']:
+                    while StartPos != -1 and EndPos != -1 and self.__Token not in {'!ifdef', '!ifndef', '!if', '!elseif'}:
                         MacroName = CurLine[StartPos+2 : EndPos]
                         MacorValue = self.__GetMacroValue(MacroName)
                         if MacorValue is not None:
@@ -792,7 +792,7 @@ class FdfParser:
                 self.Profile.PcdFileLineDict[PcdPair] = FileLineTuple
 
                 self.__WipeOffArea.append(((SetLine, SetOffset), (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - 1)))
-            elif self.__Token in ('!ifdef', '!ifndef', '!if'):
+            elif self.__Token in {'!ifdef', '!ifndef', '!if'}:
                 IfStartPos = (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - len(self.__Token))
                 IfList.append([IfStartPos, None, None])
 
@@ -810,7 +810,7 @@ class FdfParser:
                 IfList[-1] = [IfList[-1][0], ConditionSatisfied, BranchDetermined]
                 if ConditionSatisfied:
                     self.__WipeOffArea.append((IfList[-1][0], (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - 1)))                 
-            elif self.__Token in ('!elseif', '!else'):
+            elif self.__Token in {'!elseif', '!else'}:
                 ElseStartPos = (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - len(self.__Token))
                 if len(IfList) <= 0:
                     raise Warning("Missing !if statement", self.FileName, self.CurrentLineNumber)
@@ -1001,7 +1001,7 @@ class FdfParser:
     def __GetExpression(self):
         Line = self.Profile.FileLinesList[self.CurrentLineNumber - 1]
         Index = len(Line) - 1
-        while Line[Index] in ['\r', '\n']:
+        while Line[Index] in {'\r', '\n'}:
             Index -= 1
         ExpressionString = self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine:Index+1]
         self.CurrentOffsetWithinLine += len(ExpressionString)
@@ -1489,7 +1489,7 @@ class FdfParser:
 
         while self.__GetTokenStatements(FdObj):
             pass
-        for Attr in ("BaseAddress", "Size", "ErasePolarity"):
+        for Attr in ["BaseAddress", "Size", "ErasePolarity"]:
             if getattr(FdObj, Attr) is None:
                 self.__GetNextToken()
                 raise Warning("Keyword %s missing" % Attr, self.FileName, self.CurrentLineNumber)
@@ -1831,7 +1831,7 @@ class FdfParser:
         if not self.__GetNextWord():
             return True
 
-        if not self.__Token in ("SET", BINARY_FILE_TYPE_FV, "FILE", "DATA", "CAPSULE", "INF"):
+        if self.__Token not in {"SET", BINARY_FILE_TYPE_FV, "FILE", "DATA", "CAPSULE", "INF"}:
             #
             # If next token is a word which is not a valid FV type, it might be part of [PcdOffset[|PcdSize]]
             # Or it might be next region's offset described by an expression which starts with a PCD.
@@ -2134,7 +2134,7 @@ class FdfParser:
                 self.__GetFvExtEntryStatement(FvObj) or self.__GetFvNameString(FvObj)):
                 break
 
-        if FvObj.FvNameString == 'TRUE' and not FvObj.FvNameGuid:
+        if FvObj.FvNameString == TAB_TRUE_1 and not FvObj.FvNameGuid:
             raise Warning("FvNameString found but FvNameGuid was not found", self.FileName, self.CurrentLineNumber)
 
         self.__GetAprioriSection(FvObj, FvObj.DefineVarDict.copy())
@@ -2168,10 +2168,10 @@ class FdfParser:
         if not self.__GetNextToken():
             raise Warning("expected alignment value", self.FileName, self.CurrentLineNumber)
 
-        if self.__Token.upper() not in ("1", "2", "4", "8", "16", "32", "64", "128", "256", "512", \
+        if self.__Token.upper() not in {"1", "2", "4", "8", "16", "32", "64", "128", "256", "512", \
                                         "1K", "2K", "4K", "8K", "16K", "32K", "64K", "128K", "256K", "512K", \
                                         "1M", "2M", "4M", "8M", "16M", "32M", "64M", "128M", "256M", "512M", \
-                                        "1G", "2G"):
+                                        "1G", "2G"}:
             raise Warning("Unknown alignment value '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
         Obj.FvAlignment = self.__Token
         return True
@@ -2221,12 +2221,12 @@ class FdfParser:
         if not self.__GetNextToken():
             raise Warning("expected FvForceRebase value", self.FileName, self.CurrentLineNumber)
 
-        if self.__Token.upper() not in ["TRUE", "FALSE", "0", "0X0", "0X00", "1", "0X1", "0X01"]:
+        if self.__Token.upper() not in {TAB_TRUE_1, TAB_FALSE_1, "0", "0X0", "0X00", "1", "0X1", "0X01"}:
             raise Warning("Unknown FvForceRebase value '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
         
-        if self.__Token.upper() in ["TRUE", "1", "0X1", "0X01"]:
+        if self.__Token.upper() in {TAB_TRUE_1, "1", "0X1", "0X01"}:
             Obj.FvForceRebase = True
-        elif self.__Token.upper() in ["FALSE", "0", "0X0", "0X00"]:
+        elif self.__Token.upper() in {TAB_FALSE_1, "0", "0X0", "0X00"}:
             Obj.FvForceRebase = False
         else:
             Obj.FvForceRebase = None
@@ -2247,19 +2247,19 @@ class FdfParser:
         while self.__GetNextWord():
             IsWordToken = True
             name = self.__Token
-            if name not in ("ERASE_POLARITY", "MEMORY_MAPPED", \
+            if name not in {"ERASE_POLARITY", "MEMORY_MAPPED", \
                            "STICKY_WRITE", "LOCK_CAP", "LOCK_STATUS", "WRITE_ENABLED_CAP", \
                            "WRITE_DISABLED_CAP", "WRITE_STATUS", "READ_ENABLED_CAP", \
                            "READ_DISABLED_CAP", "READ_STATUS", "READ_LOCK_CAP", \
                            "READ_LOCK_STATUS", "WRITE_LOCK_CAP", "WRITE_LOCK_STATUS", \
-                           "WRITE_POLICY_RELIABLE", "WEAK_ALIGNMENT", "FvUsedSizeEnable"):
+                           "WRITE_POLICY_RELIABLE", "WEAK_ALIGNMENT", "FvUsedSizeEnable"}:
                 self.__UndoToken()
                 return False
 
             if not self.__IsToken( "="):
                 raise Warning("expected '='", self.FileName, self.CurrentLineNumber)
 
-            if not self.__GetNextToken() or self.__Token.upper() not in ("TRUE", "FALSE", "1", "0"):
+            if not self.__GetNextToken() or self.__Token.upper() not in {TAB_TRUE_1, TAB_FALSE_1, "1", "0"}:
                 raise Warning("expected TRUE/FALSE (1/0)", self.FileName, self.CurrentLineNumber)
 
             FvObj.FvAttributeDict[name] = self.__Token
@@ -2297,7 +2297,7 @@ class FdfParser:
         if not self.__IsToken( "="):
             raise Warning("expected '='", self.FileName, self.CurrentLineNumber)
 
-        if not self.__GetNextToken() or self.__Token not in ('TRUE', 'FALSE'):
+        if not self.__GetNextToken() or self.__Token not in TAB_TRUE_FALSE_SET:
             raise Warning("expected TRUE or FALSE for FvNameString", self.FileName, self.CurrentLineNumber)
 
         FvObj.FvNameString = self.__Token
@@ -2614,7 +2614,7 @@ class FdfParser:
     #
     @staticmethod
     def __FileCouldHaveRelocFlag (FileType):
-        if FileType in (SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM, 'PEI_DXE_COMBO'):
+        if FileType in {SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM, 'PEI_DXE_COMBO'}:
             return True
         else:
             return False
@@ -2629,7 +2629,7 @@ class FdfParser:
     #
     @staticmethod
     def __SectionCouldHaveRelocFlag (SectionType):
-        if SectionType in (BINARY_FILE_TYPE_TE, BINARY_FILE_TYPE_PE32):
+        if SectionType in {BINARY_FILE_TYPE_TE, BINARY_FILE_TYPE_PE32}:
             return True
         else:
             return False
@@ -2676,7 +2676,7 @@ class FdfParser:
                 raise Warning("expected FD name", self.FileName, self.CurrentLineNumber)
             FfsFileObj.FdName = self.__Token
 
-        elif self.__Token in ("DEFINE", "APRIORI", "SECTION"):
+        elif self.__Token in {"DEFINE", "APRIORI", "SECTION"}:
             self.__UndoToken()
             self.__GetSectionData( FfsFileObj, MacroDict)
 
@@ -2707,8 +2707,8 @@ class FdfParser:
         while True:
             AlignValue = None
             if self.__GetAlignment():
-                if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
-                                        "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
+                if self.__Token not in {"Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                                        "256K", "512K", "1M", "2M", "4M", "8M", "16M"}:
                     raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
                 #For FFS, Auto is default option same to ""
                 if not self.__Token == "Auto":
@@ -2766,8 +2766,8 @@ class FdfParser:
             FfsFileObj.CheckSum = True
 
         if self.__GetAlignment():
-            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
-                                    "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
+            if self.__Token not in {"Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                                    "256K", "512K", "1M", "2M", "4M", "8M", "16M"}:
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             #For FFS, Auto is default option same to ""
             if not self.__Token == "Auto":
@@ -2838,8 +2838,8 @@ class FdfParser:
 
         AlignValue = None
         if self.__GetAlignment():
-            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
-                                    "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
+            if self.__Token not in {"Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                                    "256K", "512K", "1M", "2M", "4M", "8M", "16M"}:
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             AlignValue = self.__Token
 
@@ -2953,8 +2953,8 @@ class FdfParser:
                 self.SetFileBufferPos(OldPos)
                 return False
 
-            if self.__Token not in ("COMPAT16", BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE", "RAW", BINARY_FILE_TYPE_DXE_DEPEX,\
-                               BINARY_FILE_TYPE_UI, "VERSION", BINARY_FILE_TYPE_PEI_DEPEX, "SUBTYPE_GUID", BINARY_FILE_TYPE_SMM_DEPEX):
+            if self.__Token not in {"COMPAT16", BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE", "RAW", BINARY_FILE_TYPE_DXE_DEPEX,\
+                               BINARY_FILE_TYPE_UI, "VERSION", BINARY_FILE_TYPE_PEI_DEPEX, "SUBTYPE_GUID", BINARY_FILE_TYPE_SMM_DEPEX}:
                 raise Warning("Unknown section type '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             if AlignValue == 'Auto'and (not self.__Token == BINARY_FILE_TYPE_PE32) and (not self.__Token == BINARY_FILE_TYPE_TE):
                 raise Warning("Auto alignment can only be used in PE32 or TE section ", self.FileName, self.CurrentLineNumber)
@@ -3102,7 +3102,7 @@ class FdfParser:
                     continue
                 except ValueError:
                     raise Warning("expected Number", self.FileName, self.CurrentLineNumber)
-            elif self.__Token.upper() not in ("TRUE", "FALSE", "1", "0"):
+            elif self.__Token.upper() not in {TAB_TRUE_1, TAB_FALSE_1, "1", "0"}:
                 raise Warning("expected TRUE/FALSE (1/0)", self.FileName, self.CurrentLineNumber)
             AttribDict[AttribKey] = self.__Token
 
@@ -3128,8 +3128,8 @@ class FdfParser:
 
         AlignValue = None
         if self.__GetAlignment():
-            if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
-                                    "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
+            if self.__Token not in {"8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                                    "256K", "512K", "1M", "2M", "4M", "8M", "16M"}:
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             AlignValue = self.__Token
 
@@ -3292,21 +3292,21 @@ class FdfParser:
     def __GetCapsuleTokens(self, Obj):
         if not self.__GetNextToken():
             return False
-        while self.__Token in ("CAPSULE_GUID", "CAPSULE_HEADER_SIZE", "CAPSULE_FLAGS", "OEM_CAPSULE_FLAGS", "CAPSULE_HEADER_INIT_VERSION"):
+        while self.__Token in {"CAPSULE_GUID", "CAPSULE_HEADER_SIZE", "CAPSULE_FLAGS", "OEM_CAPSULE_FLAGS", "CAPSULE_HEADER_INIT_VERSION"}:
             Name = self.__Token.strip()
             if not self.__IsToken("="):
                 raise Warning("expected '='", self.FileName, self.CurrentLineNumber)
             if not self.__GetNextToken():
                 raise Warning("expected value", self.FileName, self.CurrentLineNumber)
             if Name == 'CAPSULE_FLAGS':
-                if not self.__Token in ("PersistAcrossReset", "PopulateSystemTable", "InitiateReset"):
+                if self.__Token not in {"PersistAcrossReset", "PopulateSystemTable", "InitiateReset"}:
                     raise Warning("expected PersistAcrossReset, PopulateSystemTable, or InitiateReset", self.FileName, self.CurrentLineNumber)
                 Value = self.__Token.strip()
                 while self.__IsToken(","):
                     Value += ','
                     if not self.__GetNextToken():
                         raise Warning("expected value", self.FileName, self.CurrentLineNumber)
-                    if not self.__Token in ("PersistAcrossReset", "PopulateSystemTable", "InitiateReset"):
+                    if self.__Token not in {"PersistAcrossReset", "PopulateSystemTable", "InitiateReset"}:
                         raise Warning("expected PersistAcrossReset, PopulateSystemTable, or InitiateReset", self.FileName, self.CurrentLineNumber)
                     Value += self.__Token.strip()
             elif Name == 'OEM_CAPSULE_FLAGS':
@@ -3521,7 +3521,7 @@ class FdfParser:
         AfileName = self.__Token
         AfileBaseName = os.path.basename(AfileName)
         
-        if os.path.splitext(AfileBaseName)[1]  not in [".bin",".BIN",".Bin",".dat",".DAT",".Dat",".data",".DATA",".Data"]:
+        if os.path.splitext(AfileBaseName)[1]  not in {".bin",".BIN",".Bin",".dat",".DAT",".Dat",".data",".DATA",".Data"}:
             raise Warning('invalid binary file type, should be one of "bin",BINARY_FILE_TYPE_BIN,"Bin","dat","DAT","Dat","data","DATA","Data"', \
                           self.FileName, self.CurrentLineNumber)
         
@@ -3614,12 +3614,12 @@ class FdfParser:
 
         if not self.__GetNextWord():
             raise Warning("expected Module type", self.FileName, self.CurrentLineNumber)
-        if self.__Token.upper() not in (SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM, SUP_MODULE_DXE_CORE, \
+        if self.__Token.upper() not in {SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM, SUP_MODULE_DXE_CORE, \
                              SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, \
                              SUP_MODULE_DXE_SMM_DRIVER, SUP_MODULE_DXE_RUNTIME_DRIVER, \
                              SUP_MODULE_UEFI_DRIVER, SUP_MODULE_UEFI_APPLICATION, SUP_MODULE_USER_DEFINED, "DEFAULT", SUP_MODULE_BASE, \
                              EDK_COMPONENT_TYPE_SECURITY_CORE, EDK_COMPONENT_TYPE_COMBINED_PEIM_DRIVER, EDK_COMPONENT_TYPE_PIC_PEIM, EDK_COMPONENT_TYPE_RELOCATABLE_PEIM, \
-                                        "PE32_PEIM", EDK_COMPONENT_TYPE_BS_DRIVER, EDK_COMPONENT_TYPE_RT_DRIVER, EDK_COMPONENT_TYPE_SAL_RT_DRIVER, EDK_COMPONENT_TYPE_APPLICATION, "ACPITABLE", SUP_MODULE_SMM_CORE, SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE):
+                                        "PE32_PEIM", EDK_COMPONENT_TYPE_BS_DRIVER, EDK_COMPONENT_TYPE_RT_DRIVER, EDK_COMPONENT_TYPE_SAL_RT_DRIVER, EDK_COMPONENT_TYPE_APPLICATION, "ACPITABLE", SUP_MODULE_SMM_CORE, SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE}:
             raise Warning("Unknown Module type '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
         return self.__Token
 
@@ -3661,8 +3661,8 @@ class FdfParser:
             raise Warning("expected FFS type", self.FileName, self.CurrentLineNumber)
 
         Type = self.__Token.strip().upper()
-        if Type not in ("RAW", "FREEFORM", SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM,\
-                             "PEI_DXE_COMBO", "DRIVER", SUP_MODULE_DXE_CORE, EDK_COMPONENT_TYPE_APPLICATION, "FV_IMAGE", "SMM", SUP_MODULE_SMM_CORE, SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE):
+        if Type not in {"RAW", "FREEFORM", SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM,\
+                             "PEI_DXE_COMBO", "DRIVER", SUP_MODULE_DXE_CORE, EDK_COMPONENT_TYPE_APPLICATION, "FV_IMAGE", "SMM", SUP_MODULE_SMM_CORE, SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE}:
             raise Warning("Unknown FV type '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
 
         if not self.__IsToken("="):
@@ -3718,8 +3718,8 @@ class FdfParser:
 
         AlignValue = ""
         if self.__GetAlignment():
-            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
-                                    "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
+            if self.__Token not in {"Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                                    "256K", "512K", "1M", "2M", "4M", "8M", "16M"}:
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             #For FFS, Auto is default option same to ""
             if not self.__Token == "Auto":
@@ -3755,8 +3755,8 @@ class FdfParser:
 
             SectionName = self.__Token
 
-            if SectionName not in ("COMPAT16", BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE", "RAW", BINARY_FILE_TYPE_DXE_DEPEX,\
-                                    BINARY_FILE_TYPE_UI, BINARY_FILE_TYPE_PEI_DEPEX, "VERSION", "SUBTYPE_GUID", BINARY_FILE_TYPE_SMM_DEPEX):
+            if SectionName not in {"COMPAT16", BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE", "RAW", BINARY_FILE_TYPE_DXE_DEPEX,\
+                                    BINARY_FILE_TYPE_UI, BINARY_FILE_TYPE_PEI_DEPEX, "VERSION", "SUBTYPE_GUID", BINARY_FILE_TYPE_SMM_DEPEX}:
                 raise Warning("Unknown leaf section name '%s'" % SectionName, self.FileName, self.CurrentLineNumber)
 
 
@@ -3768,8 +3768,8 @@ class FdfParser:
 
             SectAlignment = ""
             if self.__GetAlignment():
-                if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
-                                        "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
+                if self.__Token not in {"Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                                        "256K", "512K", "1M", "2M", "4M", "8M", "16M"}:
                     raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
                 if self.__Token == 'Auto' and (not SectionName == BINARY_FILE_TYPE_PE32) and (not SectionName == BINARY_FILE_TYPE_TE):
                     raise Warning("Auto alignment can only be used in PE32 or TE section ", self.FileName, self.CurrentLineNumber)
@@ -3812,8 +3812,8 @@ class FdfParser:
             return False
         SectionName = self.__Token
 
-        if SectionName not in ("COMPAT16", BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE", "RAW", BINARY_FILE_TYPE_DXE_DEPEX,\
-                               BINARY_FILE_TYPE_UI, "VERSION", BINARY_FILE_TYPE_PEI_DEPEX, BINARY_FILE_TYPE_GUID, BINARY_FILE_TYPE_SMM_DEPEX):
+        if SectionName not in {"COMPAT16", BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE", "RAW", BINARY_FILE_TYPE_DXE_DEPEX,\
+                               BINARY_FILE_TYPE_UI, "VERSION", BINARY_FILE_TYPE_PEI_DEPEX, BINARY_FILE_TYPE_GUID, BINARY_FILE_TYPE_SMM_DEPEX}:
             self.__UndoToken()
             return False
 
@@ -3848,16 +3848,16 @@ class FdfParser:
                 FvImageSectionObj.FvFileType = self.__Token
 
                 if self.__GetAlignment():
-                    if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
-                                            "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
+                    if self.__Token not in {"8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                                            "256K", "512K", "1M", "2M", "4M", "8M", "16M"}:
                         raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
                     FvImageSectionObj.Alignment = self.__Token
 
                 if self.__IsToken('|'):
                     FvImageSectionObj.FvFileExtension = self.__GetFileExtension()
                 elif self.__GetNextToken():
-                    if self.__Token not in ("}", "COMPAT16", BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE", "RAW", BINARY_FILE_TYPE_DXE_DEPEX,\
-                               BINARY_FILE_TYPE_UI, "VERSION", BINARY_FILE_TYPE_PEI_DEPEX, BINARY_FILE_TYPE_GUID, BINARY_FILE_TYPE_SMM_DEPEX):
+                    if self.__Token not in {"}", "COMPAT16", BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE", "RAW", BINARY_FILE_TYPE_DXE_DEPEX,\
+                               BINARY_FILE_TYPE_UI, "VERSION", BINARY_FILE_TYPE_PEI_DEPEX, BINARY_FILE_TYPE_GUID, BINARY_FILE_TYPE_SMM_DEPEX}:
                         FvImageSectionObj.FvFileName = self.__Token
                     else:
                         self.__UndoToken()
@@ -3916,8 +3916,8 @@ class FdfParser:
                 EfiSectionObj.BuildNum = self.__Token
 
         if self.__GetAlignment():
-            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
-                                    "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
+            if self.__Token not in {"Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                                    "256K", "512K", "1M", "2M", "4M", "8M", "16M"}:
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             if self.__Token == 'Auto' and (not SectionName == BINARY_FILE_TYPE_PE32) and (not SectionName == BINARY_FILE_TYPE_TE):
                 raise Warning("Auto alignment can only be used in PE32 or TE section ", self.FileName, self.CurrentLineNumber)
@@ -3938,8 +3938,8 @@ class FdfParser:
         if self.__IsToken('|'):
             EfiSectionObj.FileExtension = self.__GetFileExtension()
         elif self.__GetNextToken():
-            if self.__Token not in ("}", "COMPAT16", BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE", "RAW", BINARY_FILE_TYPE_DXE_DEPEX,\
-                       BINARY_FILE_TYPE_UI, "VERSION", BINARY_FILE_TYPE_PEI_DEPEX, BINARY_FILE_TYPE_GUID, BINARY_FILE_TYPE_SMM_DEPEX):
+            if self.__Token not in {"}", "COMPAT16", BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE", "RAW", BINARY_FILE_TYPE_DXE_DEPEX,\
+                       BINARY_FILE_TYPE_UI, "VERSION", BINARY_FILE_TYPE_PEI_DEPEX, BINARY_FILE_TYPE_GUID, BINARY_FILE_TYPE_SMM_DEPEX}:
                 
                 if self.__Token.startswith('PCD'):
                     self.__UndoToken()
@@ -3973,7 +3973,7 @@ class FdfParser:
     #
     @staticmethod
     def __RuleSectionCouldBeOptional(SectionType):
-        if SectionType in (BINARY_FILE_TYPE_DXE_DEPEX, BINARY_FILE_TYPE_UI, "VERSION", BINARY_FILE_TYPE_PEI_DEPEX, "RAW", BINARY_FILE_TYPE_SMM_DEPEX):
+        if SectionType in {BINARY_FILE_TYPE_DXE_DEPEX, BINARY_FILE_TYPE_UI, "VERSION", BINARY_FILE_TYPE_PEI_DEPEX, "RAW", BINARY_FILE_TYPE_SMM_DEPEX}:
             return True
         else:
             return False
@@ -3988,7 +3988,7 @@ class FdfParser:
     #
     @staticmethod
     def __RuleSectionCouldHaveBuildNum(SectionType):
-        if SectionType in ("VERSION"):
+        if SectionType == "VERSION":
             return True
         else:
             return False
@@ -4003,7 +4003,7 @@ class FdfParser:
     #
     @staticmethod
     def __RuleSectionCouldHaveString(SectionType):
-        if SectionType in (BINARY_FILE_TYPE_UI, "VERSION"):
+        if SectionType in {BINARY_FILE_TYPE_UI, "VERSION"}:
             return True
         else:
             return False
@@ -4018,34 +4018,34 @@ class FdfParser:
     #
     def __CheckRuleSectionFileType(self, SectionType, FileType):
         if SectionType == "COMPAT16":
-            if FileType not in ("COMPAT16", "SEC_COMPAT16"):
+            if FileType not in {"COMPAT16", "SEC_COMPAT16"}:
                 raise Warning("Incorrect section file type '%s'" % FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == BINARY_FILE_TYPE_PE32:
-            if FileType not in (BINARY_FILE_TYPE_PE32, "SEC_PE32"):
+            if FileType not in {BINARY_FILE_TYPE_PE32, "SEC_PE32"}:
                 raise Warning("Incorrect section file type '%s'" % FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == BINARY_FILE_TYPE_PIC:
-            if FileType not in (BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_PIC):
+            if FileType != BINARY_FILE_TYPE_PIC:
                 raise Warning("Incorrect section file type '%s'" % FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == BINARY_FILE_TYPE_TE:
-            if FileType not in (BINARY_FILE_TYPE_TE, "SEC_TE"):
+            if FileType not in {BINARY_FILE_TYPE_TE, "SEC_TE"}:
                 raise Warning("Incorrect section file type '%s'" % FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == "RAW":
-            if FileType not in (BINARY_FILE_TYPE_BIN, "SEC_BIN", "RAW", "ASL", "ACPI"):
+            if FileType not in {BINARY_FILE_TYPE_BIN, "SEC_BIN", "RAW", "ASL", "ACPI"}:
                 raise Warning("Incorrect section file type '%s'" % FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == BINARY_FILE_TYPE_DXE_DEPEX or SectionType == BINARY_FILE_TYPE_SMM_DEPEX:
-            if FileType not in (BINARY_FILE_TYPE_DXE_DEPEX, "SEC_DXE_DEPEX", BINARY_FILE_TYPE_SMM_DEPEX):
+            if FileType not in {BINARY_FILE_TYPE_DXE_DEPEX, "SEC_DXE_DEPEX", BINARY_FILE_TYPE_SMM_DEPEX}:
                 raise Warning("Incorrect section file type '%s'" % FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == BINARY_FILE_TYPE_UI:
-            if FileType not in (BINARY_FILE_TYPE_UI, "SEC_UI"):
+            if FileType not in {BINARY_FILE_TYPE_UI, "SEC_UI"}:
                 raise Warning("Incorrect section file type '%s'" % FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == "VERSION":
-            if FileType not in ("VERSION", "SEC_VERSION"):
+            if FileType not in {"VERSION", "SEC_VERSION"}:
                 raise Warning("Incorrect section file type '%s'" % FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == BINARY_FILE_TYPE_PEI_DEPEX:
-            if FileType not in (BINARY_FILE_TYPE_PEI_DEPEX, "SEC_PEI_DEPEX"):
+            if FileType not in {BINARY_FILE_TYPE_PEI_DEPEX, "SEC_PEI_DEPEX"}:
                 raise Warning("Incorrect section file type '%s'" % FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == BINARY_FILE_TYPE_GUID:
-            if FileType not in (BINARY_FILE_TYPE_PE32, "SEC_GUID"):
+            if FileType not in {BINARY_FILE_TYPE_PE32, "SEC_GUID"}:
                 raise Warning("Incorrect section file type '%s'" % FileType, self.FileName, self.CurrentLineNumber)
 
     ## __GetRuleEncapsulationSection() method
@@ -4147,7 +4147,7 @@ class FdfParser:
             raise Warning("expected '.'", self.FileName, self.CurrentLineNumber)
 
         Arch = self.__SkippedChars.rstrip(".").upper()
-        if Arch not in ("IA32", "X64", "IPF", "ARM", "AARCH64"):
+        if Arch not in {"IA32", "X64", "IPF", "ARM", "AARCH64"}:
             raise Warning("Unknown Arch '%s'" % Arch, self.FileName, self.CurrentLineNumber)
 
         if not self.__GetNextWord():
@@ -4161,7 +4161,7 @@ class FdfParser:
         if self.__IsToken(","):
             if not self.__GetNextWord():
                 raise Warning("expected Arch list", self.FileName, self.CurrentLineNumber)
-            if self.__Token.upper() not in ("IA32", "X64", "IPF", "ARM", "AARCH64"):
+            if self.__Token.upper() not in {"IA32", "X64", "IPF", "ARM", "AARCH64"}:
                 raise Warning("Unknown Arch '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             VtfObj.ArchList = self.__Token.upper()
 
@@ -4224,7 +4224,7 @@ class FdfParser:
                 if not self.__GetNextWord():
                     raise Warning("Expected Region Name", self.FileName, self.CurrentLineNumber)
 
-                if self.__Token not in ("F", "N", "S"):    #, "H", "L", "PH", "PL"): not support
+                if self.__Token not in {"F", "N", "S"}:    #, "H", "L", "PH", "PL"): not support
                     raise Warning("Unknown location type '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
 
                 CompStatementObj.FilePos = self.__Token
@@ -4240,7 +4240,7 @@ class FdfParser:
 
         if not self.__GetNextToken():
             raise Warning("expected Component type", self.FileName, self.CurrentLineNumber)
-        if self.__Token not in ("FIT", "PAL_B", "PAL_A", "OEM"):
+        if self.__Token not in {"FIT", "PAL_B", "PAL_A", "OEM"}:
             if not self.__Token.startswith("0x") or len(self.__Token) < 3 or len(self.__Token) > 4 or \
                 not self.__Token[2] in string.hexdigits or not self.__Token[-1] in string.hexdigits:
                 raise Warning("Unknown location type '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
@@ -4268,7 +4268,7 @@ class FdfParser:
 
         if not self.__GetNextToken():
             raise Warning("expected Component CS", self.FileName, self.CurrentLineNumber)
-        if self.__Token not in ("1", "0"):
+        if self.__Token not in {"1", "0"}:
             raise Warning("Unknown  Component CS '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
         CompStatementObj.CompCs = self.__Token
 
@@ -4456,7 +4456,7 @@ class FdfParser:
                         raise Warning("expected '='", self.FileName, self.CurrentLineNumber)
                     if not self.__GetNextToken():
                         raise Warning("expected TRUE/FALSE for compress", self.FileName, self.CurrentLineNumber)
-                    Overrides.NeedCompress = self.__Token.upper() == 'TRUE'
+                    Overrides.NeedCompress = self.__Token.upper() == TAB_TRUE_1
                     continue
 
                 if self.__IsToken( "}"):
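As a side note on the FdfParser conversions above (illustrative only, not part of the patch): replacing tuple literals with set literals in membership tests turns an O(n) linear scan into an O(1) hash lookup, with identical results for hashable string tokens. A minimal sketch of the pattern, using the alignment token list from the hunk above:

```python
# Valid alignment tokens, as a set literal: membership is a single
# hash lookup instead of an element-by-element tuple scan.
VALID_ALIGNMENTS = {"Auto", "8", "16", "32", "64", "128", "512",
                    "1K", "4K", "32K", "64K", "128K", "256K", "512K",
                    "1M", "2M", "4M", "8M", "16M"}

def is_valid_alignment(token):
    # Hashes the token once; a tuple would compare against each
    # element in turn until a match is found.
    return token in VALID_ALIGNMENTS

print(is_valid_alignment("4K"))   # True
print(is_valid_alignment("3K"))   # False
```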
diff --git a/BaseTools/Source/Python/GenFds/FfsInfStatement.py b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
index e4276c3a8c07..f62ee73b1238 100644
--- a/BaseTools/Source/Python/GenFds/FfsInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
@@ -748,7 +748,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
             if SectionType == BINARY_FILE_TYPE_SMM_DEPEX:
                 EdkLogger.error("GenFds", FORMAT_NOT_SUPPORTED, "Framework SMM module doesn't support SMM_DEPEX section type", File=self.InfFileName)
         NoStrip = True
-        if self.ModuleType in (SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM):
+        if self.ModuleType in {SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM}:
             if self.KeepReloc is not None:
                 NoStrip = self.KeepReloc
             elif Rule.KeepReloc is not None:
@@ -902,7 +902,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
     #   @retval string       File name of the generated section file
     #
     def __GenComplexFileSection__(self, Rule, FvChildAddr, FvParentAddr, IsMakefile = False):
-        if self.ModuleType in (SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM):
+        if self.ModuleType in {SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM}:
             if Rule.KeepReloc is not None:
                 self.KeepRelocFromRule = Rule.KeepReloc
         SectFiles = []
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index c672f1d7d8fa..6c90fa3ca9e6 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -297,7 +297,7 @@ class FV (FvClassObject):
         if self.FvAttributeDict:
             for FvAttribute in self.FvAttributeDict.keys() :
                 if FvAttribute == "FvUsedSizeEnable":
-                    if self.FvAttributeDict[FvAttribute].upper() in {'TRUE', '1'}:
+                    if self.FvAttributeDict[FvAttribute].upper() in {TAB_TRUE_1, '1'}:
                         self.UsedSizeEnable = True
                     continue
                 self.FvInfFile.writelines("EFI_{FA} = {VAL}{END}".format(FA=FvAttribute, VAL=self.FvAttributeDict[FvAttribute], END=TAB_LINE_BREAK))
@@ -323,7 +323,7 @@ class FV (FvClassObject):
                 # } EFI_FIRMWARE_VOLUME_EXT_ENTRY_USED_SIZE_TYPE;
                 Buffer += pack('HHL', 8, 3, 0)
 
-            if self.FvNameString == 'TRUE':
+            if self.FvNameString == TAB_TRUE_1:
                 #
                 # Create EXT entry for FV UI name
                 # This GUID is used: A67DF1FA-8DE8-4E98-AF09-4BDF2EFFBC7C
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index 998bd5345c3c..b9167bac7eda 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -212,7 +212,7 @@ def main():
                     else:
                         GlobalData.gCommandLineDefines[List[0].strip()] = List[1].strip()
                 else:
-                    GlobalData.gCommandLineDefines[List[0].strip()] = "TRUE"
+                    GlobalData.gCommandLineDefines[List[0].strip()] = TAB_TRUE_1
         os.environ["WORKSPACE"] = Workspace
 
         # Use the -t and -b option as gGlobalDefines's TOOLCHAIN and TARGET if they are not defined
@@ -432,7 +432,7 @@ def FindExtendTool(KeyStringList, CurrentArchList, NameGuid):
                     List = Key.split('_')
                     if List[Index] == '*':
                         for String in ToolDb[ToolList[Index]]:
-                            if String in [Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]:
+                            if String in {Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag}:
                                 List[Index] = String
                                 NewKey = '%s_%s_%s_%s_%s' % tuple(List)
                                 if NewKey not in BuildOption:
diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
index b840079e7ad4..e7dd212b649e 100644
--- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
+++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
@@ -212,12 +212,12 @@ class GenFdsGlobalVariable:
 
         if not Inf.IsBinaryModule:
             for File in Inf.Sources:
-                if File.TagName in ("", "*", GenFdsGlobalVariable.ToolChainTag) and \
-                    File.ToolChainFamily in ("", "*", GenFdsGlobalVariable.ToolChainFamily):
+                if File.TagName in {"", "*", GenFdsGlobalVariable.ToolChainTag} and \
+                    File.ToolChainFamily in {"", "*", GenFdsGlobalVariable.ToolChainFamily}:
                     FileList.append((File, DataType.TAB_UNKNOWN_FILE))
 
         for File in Inf.Binaries:
-            if File.Target in [DataType.TAB_COMMON, '*', GenFdsGlobalVariable.TargetName]:
+            if File.Target in {DataType.TAB_COMMON, '*', GenFdsGlobalVariable.TargetName}:
                 FileList.append((File, File.Type))
 
         for File, FileType in FileList:
@@ -494,9 +494,9 @@ class GenFdsGlobalVariable:
     def GetAlignment (AlignString):
         if AlignString is None:
             return 0
-        if AlignString in ("1K", "2K", "4K", "8K", "16K", "32K", "64K", "128K", "256K", "512K"):
+        if AlignString in {"1K", "2K", "4K", "8K", "16K", "32K", "64K", "128K", "256K", "512K"}:
             return int (AlignString.rstrip('K')) * 1024
-        elif AlignString in ("1M", "2M", "4M", "8M", "16M"):
+        elif AlignString in {"1M", "2M", "4M", "8M", "16M"}:
             return int (AlignString.rstrip('M')) * 1024 * 1024
         else:
             return int (AlignString)
@@ -551,9 +551,9 @@ class GenFdsGlobalVariable:
             Cmd += ["-r", BaseAddress]
 
         if ForceRebase == False:
-            Cmd += ["-F", "FALSE"]
+            Cmd += ["-F", DataType.TAB_FALSE_1]
         elif ForceRebase == True:
-            Cmd += ["-F", "TRUE"]
+            Cmd += ["-F", DataType.TAB_TRUE_1]
 
         if Capsule:
             Cmd += ["-c"]
@@ -686,7 +686,7 @@ class GenFdsGlobalVariable:
 
     def CallExternalTool (cmd, errorMess, returnValue=[]):
 
-        if type(cmd) not in (tuple, list):
+        if type(cmd) not in {tuple, list}:
             GenFdsGlobalVariable.ErrorLogger("ToolError!  Invalid parameter type in call to CallExternalTool")
 
         if GenFdsGlobalVariable.DebugLevel != -1:
diff --git a/BaseTools/Source/Python/GenFds/GuidSection.py b/BaseTools/Source/Python/GenFds/GuidSection.py
index 28571292f5a6..b36d2868059a 100644
--- a/BaseTools/Source/Python/GenFds/GuidSection.py
+++ b/BaseTools/Source/Python/GenFds/GuidSection.py
@@ -77,7 +77,7 @@ class GuidSection(GuidSectionClassObject) :
         else:
             FvAddrIsSet = False
         
-        if self.ProcessRequired in ("TRUE", "1"):
+        if self.ProcessRequired in {TAB_TRUE_1, "1"}:
             if self.FvAddr != []:
                 #no use FvAddr when the image is processed.
                 self.FvAddr = []
@@ -175,7 +175,7 @@ class GuidSection(GuidSectionClassObject) :
             if ExternalOption is not None:
                 CmdOption = CmdOption + ' ' + ExternalOption
             if not GenFdsGlobalVariable.EnableGenfdsMultiThread:
-                if self.ProcessRequired not in ("TRUE", "1") and self.IncludeFvSection and not FvAddrIsSet and self.FvParentAddr is not None:
+                if self.ProcessRequired not in {TAB_TRUE_1, "1"} and self.IncludeFvSection and not FvAddrIsSet and self.FvParentAddr is not None:
                     #FirstCall is only set for the encapsulated flash FV image without process required attribute.
                     FirstCall = True
                 #
@@ -232,11 +232,11 @@ class GuidSection(GuidSectionClassObject) :
                 #
                 # Call Gensection Add Section Header
                 #
-                if self.ProcessRequired in ("TRUE", "1"):
+                if self.ProcessRequired in {TAB_TRUE_1, "1"}:
                     if 'PROCESSING_REQUIRED' not in Attribute:
                         Attribute.append('PROCESSING_REQUIRED')
 
-                if self.AuthStatusValid in ("TRUE", "1"):
+                if self.AuthStatusValid in {TAB_TRUE_1, "1"}:
                     Attribute.append('AUTH_STATUS_VALID')
                 GenFdsGlobalVariable.GenerateSection(OutputFile, [TempFile], Section.SectionType['GUIDED'],
                                                      Guid=self.NameGuid, GuidAttr=Attribute, GuidHdrLen=HeaderLength)
@@ -248,14 +248,14 @@ class GuidSection(GuidSectionClassObject) :
                 HeaderLength = None
                 if self.ExtraHeaderSize != -1:
                     HeaderLength = str(self.ExtraHeaderSize)
-                if self.AuthStatusValid in ("TRUE", "1"):
+                if self.AuthStatusValid in {TAB_TRUE_1, "1"}:
                     Attribute.append('AUTH_STATUS_VALID')
                 if self.ProcessRequired == "NONE" and HeaderLength is None:
                     GenFdsGlobalVariable.GenerateSection(OutputFile, [TempFile], Section.SectionType['GUIDED'],
                                                          Guid=self.NameGuid, GuidAttr=Attribute,
                                                          GuidHdrLen=HeaderLength, DummyFile=DummyFile, IsMakefile=IsMakefile)
                 else:
-                    if self.ProcessRequired in ("TRUE", "1"):
+                    if self.ProcessRequired in {TAB_TRUE_1, "1"}:
                         if 'PROCESSING_REQUIRED' not in Attribute:
                             Attribute.append('PROCESSING_REQUIRED')
                     GenFdsGlobalVariable.GenerateSection(OutputFile, [TempFile], Section.SectionType['GUIDED'],
@@ -268,7 +268,7 @@ class GuidSection(GuidSectionClassObject) :
                 # reset guided section alignment to none for the processed required guided data
                 self.Alignment = None
                 self.IncludeFvSection = False
-                self.ProcessRequired = "TRUE"
+                self.ProcessRequired = TAB_TRUE_1
             if IsMakefile and self.Alignment is not None and self.Alignment.strip() == '0':
                 self.Alignment = '1'
             return OutputFileList, self.Alignment
diff --git a/BaseTools/Source/Python/GenFds/OptRomInfStatement.py b/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
index 93c4456eb89f..a20c28314894 100644
--- a/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
@@ -52,10 +52,10 @@ class OptRomInfStatement (FfsInfStatement):
         if self.OverrideAttribs.NeedCompress is None:
             self.OverrideAttribs.NeedCompress = self.OptRomDefs.get ('PCI_COMPRESS')
             if self.OverrideAttribs.NeedCompress is not None:
-                if self.OverrideAttribs.NeedCompress.upper() not in ('TRUE', 'FALSE'):
+                if self.OverrideAttribs.NeedCompress.upper() not in TAB_TRUE_FALSE_SET:
                     GenFdsGlobalVariable.ErrorLogger( "Expected TRUE/FALSE for PCI_COMPRESS: %s" %self.InfFileName)
                 self.OverrideAttribs.NeedCompress = \
-                    self.OverrideAttribs.NeedCompress.upper() == 'TRUE'
+                    self.OverrideAttribs.NeedCompress.upper() == TAB_TRUE_1
 
         if self.OverrideAttribs.PciVendorId is None:
             self.OverrideAttribs.PciVendorId = self.OptRomDefs.get ('PCI_VENDOR_ID')
diff --git a/BaseTools/Source/Python/GenFds/Region.py b/BaseTools/Source/Python/GenFds/Region.py
index e67d056cc178..57d5d15c36a5 100644
--- a/BaseTools/Source/Python/GenFds/Region.py
+++ b/BaseTools/Source/Python/GenFds/Region.py
@@ -220,7 +220,7 @@ class Region(RegionClassObject):
             #
             self.PadBuffer(Buffer, ErasePolarity, Size)
 
-        if self.RegionType in ('FILE', 'INF'):
+        if self.RegionType in {'FILE', 'INF'}:
             for RegionData in self.RegionDataList:
                 if self.RegionType == 'INF':
                     RegionData.__InfParse__(None)
diff --git a/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py b/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
index 76fef41176ac..6c85ff4dd073 100644
--- a/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
+++ b/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
@@ -59,18 +59,8 @@ def PatchBinaryFile(FileName, ValueOffset, TypeName, ValueString, MaxSize=0):
     #
     # Get PCD value data length
     #
-    ValueLength = 0
-    if TypeName == 'BOOLEAN':
-        ValueLength = 1
-    elif TypeName == TAB_UINT8:
-        ValueLength = 1
-    elif TypeName == TAB_UINT16:
-        ValueLength = 2
-    elif TypeName == TAB_UINT32:
-        ValueLength = 4
-    elif TypeName == TAB_UINT64:
-        ValueLength = 8
-    elif TypeName == TAB_VOID:
+    ValueLength = MAX_SIZE_TYPE.get(TypeName, 0)
+    if TypeName == TAB_VOID:
         if MaxSize == 0:
             return OPTION_MISSING, "PcdMaxSize is not specified for VOID* type PCD."
         ValueLength = int(MaxSize)
@@ -100,14 +90,14 @@ def PatchBinaryFile(FileName, ValueOffset, TypeName, ValueString, MaxSize=0):
     SavedStr = ValueString
     ValueString = ValueString.upper()
     ValueNumber = 0
-    if TypeName == 'BOOLEAN':
+    if TypeName == TAB_BOOLEAN:
         #
         # Get PCD value for BOOLEAN data type
         #
         try:
-            if ValueString == 'TRUE':
+            if ValueString == TAB_TRUE_1:
                 ValueNumber = 1
-            elif ValueString == 'FALSE':
+            elif ValueString == TAB_FALSE_1:
                 ValueNumber = 0
             ValueNumber = int (ValueString, 0)
             if ValueNumber != 0:
diff --git a/BaseTools/Source/Python/Trim/Trim.py b/BaseTools/Source/Python/Trim/Trim.py
index a92df52979c6..b65c7bead814 100644
--- a/BaseTools/Source/Python/Trim/Trim.py
+++ b/BaseTools/Source/Python/Trim/Trim.py
@@ -300,7 +300,7 @@ def TrimPreprocessedVfr(Source, Target):
             FoundTypedef = False
             TypedefEnd = Index
             # keep all "typedef struct" except to GUID, EFI_PLABEL and PAL_CALL_RETURN
-            if Line.strip("} ;\r\n") in [TAB_GUID, "EFI_PLABEL", "PAL_CALL_RETURN"]:
+            if Line.strip("} ;\r\n") in {TAB_GUID, "EFI_PLABEL", "PAL_CALL_RETURN"}:
                 for i in range(TypedefStart, TypedefEnd+1):
                     Lines[i] = "\n"
 
@@ -357,7 +357,7 @@ def DoInclude(Source, Indent='', IncludePathList=[], LocalSearchPath=None):
         Result = gAslIncludePattern.findall(Line)
         if len(Result) == 0:
             Result = gAslCIncludePattern.findall(Line)
-            if len(Result) == 0 or os.path.splitext(Result[0][1])[1].lower() not in [".asl", ".asi"]:
+            if len(Result) == 0 or os.path.splitext(Result[0][1])[1].lower() not in {".asl", ".asi"}:
                 NewFileContent.append("%s%s" % (Indent, Line))
                 continue
             #
@@ -499,7 +499,8 @@ def TrimEdkSources(Source, Target):
 
             for FileName in Files:
                 Dummy, Ext = os.path.splitext(FileName)
-                if Ext.upper() not in ['.C', '.H']: continue
+                if Ext.upper() not in {'.C', '.H'}:
+                    continue
                 if Target is None or Target == '':
                     TrimEdkSourceCode(
                         os.path.join(CurrentDir, FileName),
diff --git a/BaseTools/Source/Python/Workspace/DecBuildData.py b/BaseTools/Source/Python/Workspace/DecBuildData.py
index 1fbd095f743c..cb6e431b09be 100644
--- a/BaseTools/Source/Python/Workspace/DecBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DecBuildData.py
@@ -453,7 +453,7 @@ class DecBuildData(PackageBuildClassObject):
             Pcds[pcd.TokenCName, pcd.TokenSpaceGuidCName, self._PCD_TYPE_STRING_[Type]] = pcd
         StructPattern = re.compile(r'[_a-zA-Z][0-9A-Za-z_]*$')
         for pcd in Pcds.values():
-            if pcd.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
+            if pcd.DatumType not in TAB_PCD_NUMERIC_TYPES_VOID:
                 if StructPattern.match(pcd.DatumType) is None:
                     EdkLogger.error('build', FORMAT_INVALID, "DatumType only support BOOLEAN, UINT8, UINT16, UINT32, UINT64, VOID* or a valid struct name.", pcd.DefinitionPosition[0],pcd.DefinitionPosition[1])
         for struct_pcd in Pcds.values():
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index 7b062b564da5..7944f7cf4d23 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -483,16 +483,16 @@ class DscBuildData(PlatformBuildClassObject):
         return self._BuildTargets
 
     def _GetPcdInfoFlag(self):
-        if self._PcdInfoFlag is None or self._PcdInfoFlag.upper() == 'FALSE':
+        if self._PcdInfoFlag is None or self._PcdInfoFlag.upper() == TAB_FALSE_1:
             return False
-        elif self._PcdInfoFlag.upper() == 'TRUE':
+        elif self._PcdInfoFlag.upper() == TAB_TRUE_1:
             return True
         else:
             return False
     def _GetVarCheckFlag(self):
-        if self._VarCheckFlag is None or self._VarCheckFlag.upper() == 'FALSE':
+        if self._VarCheckFlag is None or self._VarCheckFlag.upper() == TAB_FALSE_1:
             return False
-        elif self._VarCheckFlag.upper() == 'TRUE':
+        elif self._VarCheckFlag.upper() == TAB_TRUE_1:
             return True
         else:
             return False
@@ -810,7 +810,7 @@ class DscBuildData(PlatformBuildClassObject):
                     EdkLogger.error('build', ErrorCode, File=self.MetaFile, Line=LineNo,
                                     ExtraData=ErrorInfo)
 
-                if ModuleType != TAB_COMMON and ModuleType not in SUP_MODULE_LIST:
+                if ModuleType != TAB_COMMON and ModuleType not in SUP_MODULE_SET:
                     EdkLogger.error('build', OPTION_UNKNOWN, "Unknown module type [%s]" % ModuleType,
                                     File=self.MetaFile, ExtraData=LibraryInstance, Line=LineNo)
                 LibraryClassDict[Arch, ModuleType, LibraryClass] = LibraryInstance
@@ -821,7 +821,7 @@ class DscBuildData(PlatformBuildClassObject):
             self._LibraryClasses = tdict(True)
             for LibraryClass in LibraryClassSet:
                 # try all possible module types
-                for ModuleType in SUP_MODULE_LIST:
+                for ModuleType in SUP_MODULE_SET:
                     LibraryInstance = LibraryClassDict[self._Arch, ModuleType, LibraryClass]
                     if LibraryInstance is None:
                         continue
@@ -873,7 +873,7 @@ class DscBuildData(PlatformBuildClassObject):
                             File=self.MetaFile, Line=LineNo)
         ValueList, IsValid, Index = AnalyzeDscPcd(Setting, PcdType, self._DecPcds[PcdCName, TokenSpaceGuid].DatumType)
         if not IsValid:
-            if PcdType not in [MODEL_PCD_FEATURE_FLAG, MODEL_PCD_FIXED_AT_BUILD]:
+            if PcdType not in {MODEL_PCD_FEATURE_FLAG, MODEL_PCD_FIXED_AT_BUILD}:
                 EdkLogger.error('build', FORMAT_INVALID, "Pcd format incorrect.", File=self.MetaFile, Line=LineNo,
                                 ExtraData="%s.%s|%s" % (TokenSpaceGuid, PcdCName, Setting))
             else:
@@ -907,7 +907,7 @@ class DscBuildData(PlatformBuildClassObject):
             if not Valid:
                 EdkLogger.error('build', FORMAT_INVALID, ErrStr, File=self.MetaFile, Line=LineNo,
                                 ExtraData="%s.%s" % (TokenSpaceGuid, PcdCName))
-            if PcdType in (MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT):
+            if PcdType in {MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT}:
                 if self._DecPcds[PcdCName, TokenSpaceGuid].DatumType.strip() != ValueList[1].strip():
                     EdkLogger.error('build', FORMAT_INVALID, "Pcd datumtype used in DSC file is not the same as its declaration in DEC file." , File=self.MetaFile, Line=LineNo,
                                 ExtraData="%s.%s|%s" % (TokenSpaceGuid, PcdCName, Setting))
@@ -933,7 +933,7 @@ class DscBuildData(PlatformBuildClassObject):
                     Pcds[pcdname].SkuOverrideValues = {skuid:pcd.SkuOverrideValues[skuid] for skuid in pcd.SkuOverrideValues if skuid in available_sku}
         return Pcds
     def CompleteHiiPcdsDefaultStores(self,Pcds):
-        HiiPcd = [Pcds[pcd] for pcd in Pcds if Pcds[pcd].Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]]
+        HiiPcd = [Pcds[pcd] for pcd in Pcds if Pcds[pcd].Type in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]}]
         DefaultStoreMgr = DefaultStore(self.DefaultStores)
         for pcd in HiiPcd:
             for skuid in pcd.SkuInfoList:
@@ -946,10 +946,10 @@ class DscBuildData(PlatformBuildClassObject):
 
     def RecoverCommandLinePcd(self):
         def UpdateCommandLineValue(pcd):
-            if pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
-                                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
+            if pcd.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
+                                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]}:
                 pcd.PcdValueFromComm = pcd.DefaultValue
-            elif pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
+            elif pcd.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]}:
                 pcd.PcdValueFromComm = pcd.SkuInfoList.get(TAB_DEFAULT).HiiDefaultValue
             else:
                 pcd.PcdValueFromComm = pcd.SkuInfoList.get(TAB_DEFAULT).DefaultValue
@@ -1083,9 +1083,9 @@ class DscBuildData(PlatformBuildClassObject):
                 EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s",  %s' %
                                 (TokenSpaceGuidCName, TokenCName, PcdValue, Value))
         else:
-            if PcdValue.upper() == 'FALSE':
+            if PcdValue.upper() == TAB_FALSE_1:
                 PcdValue = str(0)
-            if PcdValue.upper() == 'TRUE':
+            if PcdValue.upper() == TAB_TRUE_1:
                 PcdValue = str(1)
             if not FieldName:
                 if PcdDatumType not in TAB_PCD_NUMERIC_TYPES:
@@ -1142,7 +1142,7 @@ class DscBuildData(PlatformBuildClassObject):
             #
             # Retrieve build option for EDKII and EDK style module
             #
-            for CodeBase in (EDKII_NAME, EDK_NAME):
+            for CodeBase in {EDKII_NAME, EDK_NAME}:
                 RecordList = self._RawData[MODEL_META_DATA_BUILD_OPTION, self._Arch, CodeBase]
                 for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4,Dummy5 in RecordList:
                     if Dummy3.upper() != TAB_COMMON:
@@ -1237,7 +1237,7 @@ class DscBuildData(PlatformBuildClassObject):
                             SkuInfo.HiiDefaultValue = NoFiledValues[(Pcd.TokenSpaceGuidCName,Pcd.TokenCName)][0]
                             for defaultstore in SkuInfo.DefaultStoreDict:
                                 SkuInfo.DefaultStoreDict[defaultstore] = NoFiledValues[(Pcd.TokenSpaceGuidCName,Pcd.TokenCName)][0]
-                    if Pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII]]:
+                    if Pcd.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII]}:
                         if Pcd.DatumType == TAB_VOID:
                             if not Pcd.MaxDatumSize:
                                 Pcd.MaxDatumSize = '0'
@@ -1249,9 +1249,9 @@ class DscBuildData(PlatformBuildClassObject):
                 PcdInDec = self.DecPcds.get((Name,Guid))
                 if PcdInDec:
                     PcdInDec.PcdValueFromComm = NoFiledValues[(Guid,Name)][0]
-                    if PcdInDec.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
+                    if PcdInDec.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
                                         self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE],
-                                        self._PCD_TYPE_STRING_[MODEL_PCD_FEATURE_FLAG]]:
+                                        self._PCD_TYPE_STRING_[MODEL_PCD_FEATURE_FLAG]}:
                         self.Pcds[Name, Guid] = copy.deepcopy(PcdInDec)
                         self.Pcds[Name, Guid].DefaultValue = NoFiledValues[( Guid,Name)][0]
         return AllPcds
@@ -1302,7 +1302,7 @@ class DscBuildData(PlatformBuildClassObject):
                 str_pcd_obj_str.copy(str_pcd_dec)
                 if str_pcd_obj:
                     str_pcd_obj_str.copy(str_pcd_obj)
-                    if str_pcd_obj.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
+                    if str_pcd_obj.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]}:
                         str_pcd_obj_str.DefaultFromDSC = {skuname:{defaultstore: str_pcd_obj.SkuInfoList[skuname].DefaultStoreDict.get(defaultstore, str_pcd_obj.SkuInfoList[skuname].HiiDefaultValue) for defaultstore in DefaultStores} for skuname in str_pcd_obj.SkuInfoList}
                     else:
                         str_pcd_obj_str.DefaultFromDSC = {skuname:{defaultstore: str_pcd_obj.SkuInfoList[skuname].DefaultStoreDict.get(defaultstore, str_pcd_obj.SkuInfoList[skuname].DefaultValue) for defaultstore in DefaultStores} for skuname in str_pcd_obj.SkuInfoList}
@@ -1323,7 +1323,7 @@ class DscBuildData(PlatformBuildClassObject):
                     str_pcd_obj = Pcds.get(Pcd, None)
                     if str_pcd_obj:
                         str_pcd_obj_str.copy(str_pcd_obj)
-                        if str_pcd_obj.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
+                        if str_pcd_obj.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]}:
                             str_pcd_obj_str.DefaultFromDSC = {skuname:{defaultstore: str_pcd_obj.SkuInfoList[skuname].DefaultStoreDict.get(defaultstore, str_pcd_obj.SkuInfoList[skuname].HiiDefaultValue) for defaultstore in DefaultStores} for skuname in str_pcd_obj.SkuInfoList}
                         else:
                             str_pcd_obj_str.DefaultFromDSC = {skuname:{defaultstore: str_pcd_obj.SkuInfoList[skuname].DefaultStoreDict.get(defaultstore, str_pcd_obj.SkuInfoList[skuname].DefaultValue) for defaultstore in DefaultStores} for skuname in str_pcd_obj.SkuInfoList}
@@ -1345,7 +1345,7 @@ class DscBuildData(PlatformBuildClassObject):
                     stru_pcd.SkuOverrideValues[skuid] = copy.deepcopy(stru_pcd.SkuOverrideValues[nextskuid]) if not NoDefault else copy.deepcopy({defaultstorename: stru_pcd.DefaultValues for defaultstorename in DefaultStores} if DefaultStores else {TAB_DEFAULT_STORES_DEFAULT:stru_pcd.DefaultValues})
                     if not NoDefault:
                         stru_pcd.ValueChain.add(skuid,'')
-            if stru_pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
+            if stru_pcd.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]}:
                 for skuid in SkuIds:
                     nextskuid = skuid
                     NoDefault = False
@@ -1372,16 +1372,16 @@ class DscBuildData(PlatformBuildClassObject):
                 if str_pcd_obj is None:
                     print PcdName, PcdGuid
                     raise
-                if str_pcd_obj.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],
-                                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
+                if str_pcd_obj.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],
+                                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]}:
                     if skuname not in str_pcd_obj.SkuInfoList:
                         str_pcd_obj.SkuInfoList[skuname] = SkuInfoClass(SkuIdName=skuname, SkuId=self.SkuIds[skuname][0], HiiDefaultValue=PcdValue, DefaultStore = {StoreName:PcdValue})
                     else:
                         str_pcd_obj.SkuInfoList[skuname].HiiDefaultValue = PcdValue
                         str_pcd_obj.SkuInfoList[skuname].DefaultStoreDict.update({StoreName:PcdValue})
-                elif str_pcd_obj.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
-                                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
-                    if skuname in (self.SkuIdMgr.SystemSkuId, TAB_DEFAULT, TAB_COMMON):
+                elif str_pcd_obj.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
+                                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]}:
+                    if skuname in {self.SkuIdMgr.SystemSkuId, TAB_DEFAULT, TAB_COMMON}:
                         str_pcd_obj.DefaultValue = PcdValue
                 else:
                     if skuname not in str_pcd_obj.SkuInfoList:
@@ -1398,8 +1398,8 @@ class DscBuildData(PlatformBuildClassObject):
                     else:
                         str_pcd_obj.SkuInfoList[skuname].DefaultValue = PcdValue
             for str_pcd_obj in S_pcd_set.values():
-                if str_pcd_obj.Type not in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],
-                                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
+                if str_pcd_obj.Type not in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],
+                                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]}:
                     continue
                 PcdDefaultStoreSet = set(defaultstorename for skuobj in str_pcd_obj.SkuInfoList.values() for defaultstorename in skuobj.DefaultStoreDict)
                 DefaultStoreObj = DefaultStore(self._GetDefaultStores())
@@ -1447,7 +1447,7 @@ class DscBuildData(PlatformBuildClassObject):
             if SkuName not in AvailableSkuIdSet:
                 EdkLogger.error('build ', PARAMETER_INVALID, 'Sku %s is not defined in [SkuIds] section' % SkuName,
                                             File=self.MetaFile, Line=Dummy5)
-            if SkuName in (self.SkuIdMgr.SystemSkuId, TAB_DEFAULT, TAB_COMMON):
+            if SkuName in {self.SkuIdMgr.SystemSkuId, TAB_DEFAULT, TAB_COMMON}:
                 if "." not in TokenSpaceGuid:
                     PcdSet.add((PcdCName, TokenSpaceGuid, SkuName, Dummy5))
                 PcdDict[Arch, PcdCName, TokenSpaceGuid, SkuName] = Setting
@@ -1491,7 +1491,7 @@ class DscBuildData(PlatformBuildClassObject):
 
     def GetStructurePcdMaxSize(self, str_pcd):
         pcd_default_value = str_pcd.DefaultValue
-        sku_values = [skuobj.HiiDefaultValue if str_pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]] else skuobj.DefaultValue for skuobj in str_pcd.SkuInfoList.values()]
+        sku_values = [skuobj.HiiDefaultValue if str_pcd.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]} else skuobj.DefaultValue for skuobj in str_pcd.SkuInfoList.values()]
         sku_values.append(pcd_default_value)
 
         def get_length(value):
@@ -1891,8 +1891,8 @@ class DscBuildData(PlatformBuildClassObject):
             # Assign field values in PCD
             #
             CApp = CApp + DscBuildData.GenerateDefaultValueAssignStatement(Pcd)
-            if Pcd.Type not in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
-                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
+            if Pcd.Type not in {self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
+                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]}:
                 for skuname in self.SkuIdMgr.GetSkuChain(SkuName):
                     storeset = [DefaultStoreName] if DefaultStoreName == TAB_DEFAULT_STORES_DEFAULT else [TAB_DEFAULT_STORES_DEFAULT, DefaultStoreName]
                     for defaultstorenameitem in storeset:
@@ -1940,8 +1940,8 @@ class DscBuildData(PlatformBuildClassObject):
             CApp = CApp + self.GenerateSizeFunction(Pcd)
             CApp = CApp + self.GenerateDefaultValueAssignFunction(Pcd)
             CApp = CApp + self.GenerateCommandLineValue(Pcd)
-            if not Pcd.SkuOverrideValues or Pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
-                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
+            if not Pcd.SkuOverrideValues or Pcd.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
+                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]}:
                 CApp = CApp + self.GenerateInitValueFunction(Pcd,self.SkuIdMgr.SystemSkuId, TAB_DEFAULT_STORES_DEFAULT)
             else:
                 for SkuName in self.SkuIdMgr.SkuOverrideOrder():
@@ -1949,8 +1949,8 @@ class DscBuildData(PlatformBuildClassObject):
                         continue
                     for DefaultStoreName in Pcd.SkuOverrideValues[SkuName]:
                         CApp = CApp + self.GenerateInitValueFunction(Pcd,SkuName,DefaultStoreName)
-            if not Pcd.SkuOverrideValues or Pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
-                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
+            if not Pcd.SkuOverrideValues or Pcd.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
+                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]}:
                 InitByteValue, CApp = self.GenerateInitializeFunc(self.SkuIdMgr.SystemSkuId, TAB_DEFAULT_STORES_DEFAULT, Pcd, InitByteValue, CApp)
             else:
                 for SkuName in self.SkuIdMgr.SkuOverrideOrder():
@@ -1966,7 +1966,7 @@ class DscBuildData(PlatformBuildClassObject):
         CApp = CApp + '  )\n'
         CApp = CApp + '{\n'
         for Pcd in StructuredPcds.values():
-            if not Pcd.SkuOverrideValues or Pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
+            if not Pcd.SkuOverrideValues or Pcd.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]}:
                 CApp = CApp + '  Initialize_%s_%s_%s_%s();\n' % (self.SkuIdMgr.SystemSkuId, TAB_DEFAULT_STORES_DEFAULT, Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
             else:
                 for SkuName in self.SkuIdMgr.SkuOverrideOrder():
@@ -2288,13 +2288,13 @@ class DscBuildData(PlatformBuildClassObject):
     def CopyDscRawValue(self,Pcd):
         if Pcd.DscRawValue is None:
             Pcd.DscRawValue = dict()
-        if Pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD], self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
+        if Pcd.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD], self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]}:
             if self.SkuIdMgr.SystemSkuId not in Pcd.DscRawValue:
                 Pcd.DscRawValue[self.SkuIdMgr.SystemSkuId] = {}
             Pcd.DscRawValue[self.SkuIdMgr.SystemSkuId][TAB_DEFAULT_STORES_DEFAULT] = Pcd.DefaultValue
         for skuname in Pcd.SkuInfoList:
             Pcd.DscRawValue[skuname] = {}
-            if Pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
+            if Pcd.Type in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]}:
                 for defaultstore in Pcd.SkuInfoList[skuname].DefaultStoreDict:
                     Pcd.DscRawValue[skuname][defaultstore] = Pcd.SkuInfoList[skuname].DefaultStoreDict[defaultstore]
             else:
@@ -2307,16 +2307,16 @@ class DscBuildData(PlatformBuildClassObject):
         for PcdCName, TokenSpaceGuid in PcdSet:
             PcdObj = PcdSet[(PcdCName, TokenSpaceGuid)]
             self.CopyDscRawValue(PcdObj)
-            if PcdObj.Type not in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_DEFAULT],
+            if PcdObj.Type not in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_DEFAULT],
                         self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],
                         self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_VPD],
                         self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_DEFAULT],
                         self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII],
-                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_VPD]]:
+                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_VPD]}:
                 Pcds[PcdCName, TokenSpaceGuid]= PcdObj
                 continue
             PcdType = PcdObj.Type
-            if PcdType in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
+            if PcdType in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]}:
                 for skuid in PcdObj.SkuInfoList:
                     skuobj = PcdObj.SkuInfoList[skuid]
                     mindefaultstorename = DefaultStoreObj.GetMin(set(defaultstorename for defaultstorename in skuobj.DefaultStoreDict))
@@ -2332,7 +2332,7 @@ class DscBuildData(PlatformBuildClassObject):
                     PcdObj.SkuInfoList[skuname] = copy.deepcopy(PcdObj.SkuInfoList[nextskuid])
                     PcdObj.SkuInfoList[skuname].SkuId = skuid
                     PcdObj.SkuInfoList[skuname].SkuIdName = skuname
-            if PcdType in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
+            if PcdType in {self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]}:
                 PcdObj.DefaultValue = PcdObj.SkuInfoList.values()[0].HiiDefaultValue if self.SkuIdMgr.SkuUsageType == self.SkuIdMgr.SINGLE else PcdObj.SkuInfoList[TAB_DEFAULT].HiiDefaultValue
             Pcds[PcdCName, TokenSpaceGuid]= PcdObj
         return Pcds
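The hunks above repeatedly swap a list or tuple literal for a set literal in `in` tests. A minimal sketch of that pattern, with hypothetical type-name strings standing in for the `_PCD_TYPE_STRING_` entries (the names below are illustrative, not the actual edk2 constants): a set literal gives a hashed O(1) lookup instead of a linear scan, and CPython can build a constant frozenset for such literals.

```python
# Illustrative stand-ins for the PCD type-name strings (assumed names,
# not the real _PCD_TYPE_STRING_ values).
PCD_DYNAMIC_HII = 'DynamicHii'
PCD_DYNAMIC_EX_HII = 'DynamicExHii'

# Set literal: membership is a hash lookup, not a sequence scan.
HII_TYPES = {PCD_DYNAMIC_HII, PCD_DYNAMIC_EX_HII}

def is_hii_pcd(pcd_type):
    # Before the refactor this would read:
    #   pcd_type in [PCD_DYNAMIC_HII, PCD_DYNAMIC_EX_HII]
    # which rebuilds the list and scans it on every call.
    return pcd_type in HII_TYPES
```

The gain is small per call but these tests sit inside hot per-PCD loops, which is why the series converts them wholesale.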
diff --git a/BaseTools/Source/Python/Workspace/InfBuildData.py b/BaseTools/Source/Python/Workspace/InfBuildData.py
index 3d9391039f4f..ef91df6e612e 100644
--- a/BaseTools/Source/Python/Workspace/InfBuildData.py
+++ b/BaseTools/Source/Python/Workspace/InfBuildData.py
@@ -230,8 +230,8 @@ class InfBuildData(ModuleBuildClassObject):
                 self._Defs[Name] = Value
                 self._Macros[Name] = Value
             # some special items in [Defines] section need special treatment
-            elif Name in ('EFI_SPECIFICATION_VERSION', 'UEFI_SPECIFICATION_VERSION', 'EDK_RELEASE_VERSION', 'PI_SPECIFICATION_VERSION'):
-                if Name in ('EFI_SPECIFICATION_VERSION', 'UEFI_SPECIFICATION_VERSION'):
+            elif Name in {'EFI_SPECIFICATION_VERSION', 'UEFI_SPECIFICATION_VERSION', 'EDK_RELEASE_VERSION', 'PI_SPECIFICATION_VERSION'}:
+                if Name in {'EFI_SPECIFICATION_VERSION', 'UEFI_SPECIFICATION_VERSION'}:
                     Name = 'UEFI_SPECIFICATION_VERSION'
                 if self._Specification is None:
                     self._Specification = OrderedDict()
@@ -248,8 +248,8 @@ class InfBuildData(ModuleBuildClassObject):
                 if len(ValueList) > 1:
                     SupModuleList = GetSplitValueList(ValueList[1], ' ')
                 else:
-                    SupModuleList = SUP_MODULE_LIST
-                self._LibraryClass.append(LibraryClassObject(LibraryClass, SupModuleList))
+                    SupModuleList = SUP_MODULE_SET
+                self._LibraryClass.append(LibraryClassObject(LibraryClass, list(SupModuleList)))
             elif Name == 'ENTRY_POINT':
                 if self._ModuleEntryPointList is None:
                     self._ModuleEntryPointList = []
@@ -280,7 +280,7 @@ class InfBuildData(ModuleBuildClassObject):
                     self._CustomMakefile['MSFT'] = TokenList[0]
                     self._CustomMakefile['GCC'] = TokenList[0]
                 else:
-                    if TokenList[0] not in ['MSFT', 'GCC']:
+                    if TokenList[0] not in {'MSFT', 'GCC'}:
                         EdkLogger.error("build", FORMAT_NOT_SUPPORTED,
                                         "No supported family [%s]" % TokenList[0],
                                         File=self.MetaFile, Line=Record[-1])
@@ -296,7 +296,7 @@ class InfBuildData(ModuleBuildClassObject):
             if not self._ModuleType:
                 EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE,
                                 "MODULE_TYPE is not given", File=self.MetaFile)
-            if self._ModuleType not in SUP_MODULE_LIST:
+            if self._ModuleType not in SUP_MODULE_SET:
                 RecordList = self._RawData[MODEL_META_DATA_HEADER, self._Arch, self._Platform]
                 for Record in RecordList:
                     Name = Record[1]
@@ -304,7 +304,7 @@ class InfBuildData(ModuleBuildClassObject):
                         LineNo = Record[6]
                         break
                 EdkLogger.error("build", FORMAT_NOT_SUPPORTED,
-                                "MODULE_TYPE %s is not supported for EDK II, valid values are:\n %s" % (self._ModuleType, ' '.join(l for l in SUP_MODULE_LIST)),
+                                "MODULE_TYPE %s is not supported for EDK II, valid values are:\n %s" % (self._ModuleType, ' '.join(l for l in SUP_MODULE_SET)),
                                 File=self.MetaFile, Line=LineNo)
             if (self._Specification is None) or (not 'PI_SPECIFICATION_VERSION' in self._Specification) or (int(self._Specification['PI_SPECIFICATION_VERSION'], 16) < 0x0001000A):
                 if self._ModuleType == SUP_MODULE_SMM_CORE:
@@ -318,11 +318,11 @@ class InfBuildData(ModuleBuildClassObject):
                and 'PCI_CLASS_CODE' in self._Defs and 'PCI_REVISION' in self._Defs:
                 self._BuildType = 'UEFI_OPTIONROM'
                 if 'PCI_COMPRESS' in self._Defs:
-                    if self._Defs['PCI_COMPRESS'] not in ('TRUE', 'FALSE'):
+                    if self._Defs['PCI_COMPRESS'] not in TAB_TRUE_FALSE_SET:
                         EdkLogger.error("build", FORMAT_INVALID, "Expected TRUE/FALSE for PCI_COMPRESS: %s" % self.MetaFile)
 
             elif 'UEFI_HII_RESOURCE_SECTION' in self._Defs \
-               and self._Defs['UEFI_HII_RESOURCE_SECTION'] == 'TRUE':
+               and self._Defs['UEFI_HII_RESOURCE_SECTION'] == TAB_TRUE_1:
                 self._BuildType = 'UEFI_HII'
             else:
                 self._BuildType = self._ModuleType.upper()
@@ -345,7 +345,7 @@ class InfBuildData(ModuleBuildClassObject):
             if self._ComponentType in COMPONENT_TO_MODULE_MAP_DICT:
                 self._ModuleType = COMPONENT_TO_MODULE_MAP_DICT[self._ComponentType]
             if self._ComponentType == EDK_COMPONENT_TYPE_LIBRARY:
-                self._LibraryClass = [LibraryClassObject(self._BaseName, SUP_MODULE_LIST)]
+                self._LibraryClass = [LibraryClassObject(self._BaseName, list(SUP_MODULE_SET))]
             # make use some [nmake] section macros
             Macros = self._Macros
             Macros["EDK_SOURCE"] = GlobalData.gEcpSource
@@ -442,7 +442,7 @@ class InfBuildData(ModuleBuildClassObject):
                 self._GetHeaderInfo()
             if self._ModuleType is None:
                 self._ModuleType = SUP_MODULE_BASE
-            if self._ModuleType not in SUP_MODULE_LIST:
+            if self._ModuleType not in SUP_MODULE_SET:
                 self._ModuleType = SUP_MODULE_USER_DEFINED
         return self._ModuleType
 
@@ -496,7 +496,7 @@ class InfBuildData(ModuleBuildClassObject):
         if self._Shadow is None:
             if self._Header_ is None:
                 self._GetHeaderInfo()
-            if self._Shadow is not None and self._Shadow.upper() == 'TRUE':
+            if self._Shadow is not None and self._Shadow.upper() == TAB_TRUE_1:
                 self._Shadow = True
             else:
                 self._Shadow = False
@@ -886,7 +886,7 @@ class InfBuildData(ModuleBuildClassObject):
 
             if len(RecordList) != 0 and self.ModuleType == SUP_MODULE_USER_DEFINED:
                 for Record in RecordList:
-                    if Record[4] not in [SUP_MODULE_PEIM, SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_SMM_DRIVER]:
+                    if Record[4] not in {SUP_MODULE_PEIM, SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_SMM_DRIVER}:
                         EdkLogger.error('build', FORMAT_INVALID,
                                         "'%s' module must specify the type of [Depex] section" % self.ModuleType,
                                         File=self.MetaFile)
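The InfBuildData hunks show the other half of the pattern: `SUP_MODULE_LIST` becomes `SUP_MODULE_SET` for membership tests, with an explicit `list(...)` conversion only where a sequence is actually required. A hedged sketch, using a small assumed subset of module-type names; note that set iteration order is arbitrary, so a consumer that needs deterministic output (an error message, for example) should sort first:

```python
# Assumed subset of the supported module-type names.
SUP_MODULE_SET = {'BASE', 'PEIM', 'DXE_DRIVER', 'UEFI_APPLICATION'}

def validate_module_type(module_type):
    # Hot-path check: O(1) set membership instead of scanning a list.
    return module_type in SUP_MODULE_SET

def supported_types_message():
    # Sets are unordered; sorting keeps the rendered message stable
    # across runs (a design choice, not something the patch itself does).
    return ' '.join(sorted(SUP_MODULE_SET))
```

Keeping the canonical collection as a set and materializing a list at the few call sites that need one (as the patch does with `list(SUP_MODULE_SET)`) avoids maintaining two parallel constants.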
diff --git a/BaseTools/Source/Python/Workspace/MetaFileCommentParser.py b/BaseTools/Source/Python/Workspace/MetaFileCommentParser.py
index df1e90faf5a0..8650a51933d6 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileCommentParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileCommentParser.py
@@ -33,9 +33,9 @@ ErrorMsgMap = {
 }
 
 def CheckInfComment(SectionType, Comments, InfFile, LineNo, ValueList):
-    if SectionType in [MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_DYNAMIC_EX, MODEL_PCD_DYNAMIC]:
+    if SectionType in {MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_DYNAMIC_EX, MODEL_PCD_DYNAMIC}:
         CheckUsage(Comments, UsageList, InfFile, LineNo, ValueList[0]+'.'+ValueList[1], ErrorMsgMap[MODEL_PCD_DYNAMIC])
-    elif SectionType in [MODEL_EFI_GUID, MODEL_EFI_PPI]:
+    elif SectionType in {MODEL_EFI_GUID, MODEL_EFI_PPI}:
         CheckUsage(Comments, UsageList, InfFile, LineNo, ValueList[0], ErrorMsgMap[SectionType])
     elif SectionType == MODEL_EFI_PROTOCOL:
         CheckUsage(Comments, UsageList + ("TO_START", "BY_START"), InfFile, LineNo, ValueList[0], ErrorMsgMap[SectionType])
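The MetaFileParser hunks below replace the inline `['True', 'true', 'TRUE']` / `['False', 'false', 'FALSE']` literals with shared `TAB_TRUE_SET` / `TAB_FALSE_SET` constants. A self-contained sketch of that normalization, assuming the constants are defined as frozen sets (the actual definitions live elsewhere in the series):

```python
# Assumed definitions of the shared constants referenced by the patch.
TAB_TRUE_SET = frozenset({'True', 'true', 'TRUE'})
TAB_FALSE_SET = frozenset({'False', 'false', 'FALSE'})

def normalize_pcd_value(token):
    # Replace a boolean-looking PCD value token with its integer form,
    # mirroring what the INF/DSC PCD parsers do with the first field.
    if token in TAB_TRUE_SET:
        return '1'
    if token in TAB_FALSE_SET:
        return '0'
    return token
```

Centralizing the accepted spellings in one frozen set means the INF and DSC parsers cannot drift apart on which capitalizations they accept.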
diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTools/Source/Python/Workspace/MetaFileParser.py
index 2c116ddbcb71..50a74bc415ef 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py
@@ -584,7 +584,7 @@ class InfParser(MetaFileParser):
                 self._SectionHeaderParser()
                 # Check invalid sections
                 if self._Version < 0x00010005:
-                    if self._SectionType in [MODEL_META_DATA_BUILD_OPTION,
+                    if self._SectionType in {MODEL_META_DATA_BUILD_OPTION,
                                              MODEL_EFI_LIBRARY_CLASS,
                                              MODEL_META_DATA_PACKAGE,
                                              MODEL_PCD_FIXED_AT_BUILD,
@@ -595,13 +595,13 @@ class InfParser(MetaFileParser):
                                              MODEL_EFI_GUID,
                                              MODEL_EFI_PROTOCOL,
                                              MODEL_EFI_PPI,
-                                             MODEL_META_DATA_USER_EXTENSION]:
+                                             MODEL_META_DATA_USER_EXTENSION}:
                         EdkLogger.error('Parser', FORMAT_INVALID,
                                         "Section [%s] is not allowed in inf file without version" % (self._SectionName),
                                         ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
-                elif self._SectionType in [MODEL_EFI_INCLUDE,
+                elif self._SectionType in {MODEL_EFI_INCLUDE,
                                            MODEL_EFI_LIBRARY_INSTANCE,
-                                           MODEL_META_DATA_NMAKE]:
+                                           MODEL_META_DATA_NMAKE}:
                     EdkLogger.error('Parser', FORMAT_INVALID,
                                     "Section [%s] is not allowed in inf file with version 0x%08x" % (self._SectionName, self._Version),
                                     ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
@@ -764,9 +764,9 @@ class InfParser(MetaFileParser):
         # if value are 'True', 'true', 'TRUE' or 'False', 'false', 'FALSE', replace with integer 1 or 0.
         if self._ValueList[2] != '':
             InfPcdValueList = GetSplitValueList(TokenList[1], TAB_VALUE_SPLIT, 1)
-            if InfPcdValueList[0] in ['True', 'true', 'TRUE']:
+            if InfPcdValueList[0] in TAB_TRUE_SET:
                 self._ValueList[2] = TokenList[1].replace(InfPcdValueList[0], '1', 1);
-            elif InfPcdValueList[0] in ['False', 'false', 'FALSE']:
+            elif InfPcdValueList[0] in TAB_FALSE_SET:
                 self._ValueList[2] = TokenList[1].replace(InfPcdValueList[0], '0', 1);
         if (self._ValueList[0], self._ValueList[1]) not in self.PcdsDict:
             self.PcdsDict[self._ValueList[0], self._ValueList[1]] = self._SectionType
@@ -1017,13 +1017,13 @@ class DscParser(MetaFileParser):
             EdkLogger.error("Parser", FORMAT_INVALID, "Unknown directive [%s]" % DirectiveName,
                             File=self.MetaFile, Line=self._LineIndex + 1)
 
-        if DirectiveName in ['!IF', '!IFDEF', '!IFNDEF']:
+        if DirectiveName in {'!IF', '!IFDEF', '!IFNDEF'}:
             self._InDirective += 1
 
-        if DirectiveName in ['!ENDIF']:
+        if DirectiveName == '!ENDIF':
             self._InDirective -= 1
 
-        if DirectiveName in ['!IF', '!IFDEF', '!INCLUDE', '!IFNDEF', '!ELSEIF'] and self._ValueList[1] == '':
+        if DirectiveName in {'!IF', '!IFDEF', '!INCLUDE', '!IFNDEF', '!ELSEIF'} and self._ValueList[1] == '':
             EdkLogger.error("Parser", FORMAT_INVALID, "Missing expression",
                             File=self.MetaFile, Line=self._LineIndex + 1,
                             ExtraData=self._CurrentLine)
@@ -1037,9 +1037,9 @@ class DscParser(MetaFileParser):
             while self._DirectiveStack:
                 # Remove any !else or !elseif
                 DirectiveInfo = self._DirectiveStack.pop()
-                if DirectiveInfo[0] in [MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
+                if DirectiveInfo[0] in {MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
                                         MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF,
-                                        MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF]:
+                                        MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF}:
                     break
             else:
                 EdkLogger.error("Parser", FORMAT_INVALID, "Redundant '!endif'",
@@ -1104,7 +1104,7 @@ class DscParser(MetaFileParser):
     @ParseMacro
     def _SkuIdParser(self):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
-        if len(TokenList) not in (2,3):
+        if len(TokenList) not in {2,3}:
             EdkLogger.error('Parser', FORMAT_INVALID, "Correct format is '<Number>|<UiName>[|<UiName>]'",
                             ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
         self._ValueList[0:len(TokenList)] = TokenList
@@ -1156,7 +1156,7 @@ class DscParser(MetaFileParser):
             #
             # The PCD values are optional for FIXEDATBUILD, PATCHABLEINMODULE, Dynamic/DynamicEx default
             #
-            if self._SectionType in (MODEL_PCD_FIXED_AT_BUILD, MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT):
+            if self._SectionType in {MODEL_PCD_FIXED_AT_BUILD, MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT}:
                 return
             EdkLogger.error('Parser', FORMAT_INVALID, "No PCD value given",
                             ExtraData=self._CurrentLine + " (<TokenSpaceGuidCName>.<TokenCName>|<PcdValue>)",
@@ -1164,13 +1164,13 @@ class DscParser(MetaFileParser):
 
         # Validate the datum type of Dynamic Defaul PCD and DynamicEx Default PCD
         ValueList = GetSplitValueList(self._ValueList[2])
-        if len(ValueList) > 1 and ValueList[1] in [TAB_UINT8 , TAB_UINT16, TAB_UINT32 , TAB_UINT64] \
-                              and self._ItemType in [MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT]:
+        if len(ValueList) > 1 and ValueList[1] in {TAB_UINT8 , TAB_UINT16, TAB_UINT32 , TAB_UINT64} \
+                              and self._ItemType in {MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT}:
             EdkLogger.error('Parser', FORMAT_INVALID, "The datum type '%s' of PCD is wrong" % ValueList[1],
                             ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
 
         # Validate the VariableName of DynamicHii and DynamicExHii for PCD Entry must not be an empty string
-        if self._ItemType in [MODEL_PCD_DYNAMIC_HII, MODEL_PCD_DYNAMIC_EX_HII]:
+        if self._ItemType in {MODEL_PCD_DYNAMIC_HII, MODEL_PCD_DYNAMIC_EX_HII}:
             DscPcdValueList = GetSplitValueList(TokenList[1], TAB_VALUE_SPLIT, 1)
             if len(DscPcdValueList[0].replace('L','').replace('"','').strip()) == 0:
                 EdkLogger.error('Parser', FORMAT_INVALID, "The VariableName field in the HII format PCD entry must not be an empty string",
@@ -1178,9 +1178,9 @@ class DscParser(MetaFileParser):
 
         # if value are 'True', 'true', 'TRUE' or 'False', 'false', 'FALSE', replace with integer 1 or 0.
         DscPcdValueList = GetSplitValueList(TokenList[1], TAB_VALUE_SPLIT, 1)
-        if DscPcdValueList[0] in ['True', 'true', 'TRUE']:
+        if DscPcdValueList[0] in TAB_TRUE_SET:
             self._ValueList[2] = TokenList[1].replace(DscPcdValueList[0], '1', 1);
-        elif DscPcdValueList[0] in ['False', 'false', 'FALSE']:
+        elif DscPcdValueList[0] in TAB_FALSE_SET:
             self._ValueList[2] = TokenList[1].replace(DscPcdValueList[0], '0', 1);
 
 
@@ -1248,7 +1248,7 @@ class DscParser(MetaFileParser):
         Macros.update(GlobalData.gPlatformDefines)
         Macros.update(GlobalData.gCommandLineDefines)
         # PCD cannot be referenced in macro definition
-        if self._ItemType not in [MODEL_META_DATA_DEFINE, MODEL_META_DATA_GLOBAL_DEFINE]:
+        if self._ItemType not in {MODEL_META_DATA_DEFINE, MODEL_META_DATA_GLOBAL_DEFINE}:
             Macros.update(self._Symbols)
         if GlobalData.BuildOptionPcd:
             for Item in GlobalData.BuildOptionPcd:
@@ -1412,9 +1412,9 @@ class DscParser(MetaFileParser):
     def __RetrievePcdValue(self):
         Content = open(str(self.MetaFile), 'r').readlines()
         GlobalData.gPlatformOtherPcds['DSCFILE'] = str(self.MetaFile)
-        for PcdType in (MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_HII,
+        for PcdType in [MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_HII,
                         MODEL_PCD_DYNAMIC_VPD, MODEL_PCD_DYNAMIC_EX_DEFAULT, MODEL_PCD_DYNAMIC_EX_HII,
-                        MODEL_PCD_DYNAMIC_EX_VPD):
+                        MODEL_PCD_DYNAMIC_EX_VPD]:
             Records = self._RawTable.Query(PcdType, BelongsToItem= -1.0)
             for TokenSpaceGuid, PcdName, Value, Dummy2, Dummy3, Dummy4,ID, Line in Records:
                 Name = TokenSpaceGuid + '.' + PcdName
@@ -1455,8 +1455,8 @@ class DscParser(MetaFileParser):
 
     def __ProcessDirective(self):
         Result = None
-        if self._ItemType in [MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
-                              MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSEIF]:
+        if self._ItemType in {MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
+                              MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSEIF}:
             Macros = self._Macros
             Macros.update(GlobalData.gGlobalDefines)
             try:
@@ -1474,9 +1474,9 @@ class DscParser(MetaFileParser):
                                 Line=self._LineIndex + 1)
                 Result = Excpt.result
 
-        if self._ItemType in [MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
+        if self._ItemType in {MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
                               MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF,
-                              MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF]:
+                              MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF}:
             self._DirectiveStack.append(self._ItemType)
             if self._ItemType == MODEL_META_DATA_CONDITIONAL_STATEMENT_IF:
                 Result = bool(Result)
@@ -1500,9 +1500,9 @@ class DscParser(MetaFileParser):
             while self._DirectiveStack:
                 self._DirectiveEvalStack.pop()
                 Directive = self._DirectiveStack.pop()
-                if Directive in [MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
+                if Directive in {MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
                                  MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF,
-                                 MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF]:
+                                 MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF}:
                     break
         elif self._ItemType == MODEL_META_DATA_INCLUDE:
             # The included file must be relative to workspace or same directory as DSC file
@@ -1600,7 +1600,7 @@ class DscParser(MetaFileParser):
         self._ValueList[1] = ReplaceMacro(self._ValueList[1], self._Macros, RaiseError=True)
 
     def __ProcessPcd(self):
-        if self._ItemType not in [MODEL_PCD_FEATURE_FLAG, MODEL_PCD_FIXED_AT_BUILD]:
+        if self._ItemType not in {MODEL_PCD_FEATURE_FLAG, MODEL_PCD_FIXED_AT_BUILD}:
             self._ValueList[2] = ReplaceMacro(self._ValueList[2], self._Macros, RaiseError=True)
             return
 
@@ -1617,9 +1617,9 @@ class DscParser(MetaFileParser):
             except:
                 pass
 
-        if ValList[Index] == 'True':
+        if ValList[Index] == TAB_TRUE_3:
             ValList[Index] = '1'
-        if ValList[Index] == 'False':
+        if ValList[Index] == TAB_FALSE_3:
             ValList[Index] = '0'
 
         if (not self._DirectiveEvalStack) or (False not in self._DirectiveEvalStack):
@@ -1852,7 +1852,7 @@ class DecParser(MetaFileParser):
             if len(ItemList) > 2:
                 S2 = ItemList[2].upper()
                 # only Includes, GUIDs, PPIs, Protocols section have Private tag
-                if self._SectionName in [TAB_INCLUDES.upper(), TAB_GUIDS.upper(), TAB_PROTOCOLS.upper(), TAB_PPIS.upper()]:
+                if self._SectionName in {TAB_INCLUDES.upper(), TAB_GUIDS.upper(), TAB_PROTOCOLS.upper(), TAB_PPIS.upper()}:
                     if S2 != 'PRIVATE':
                         EdkLogger.error("Parser", FORMAT_INVALID, 'Please use keyword "Private" as section tag modifier.',
                                         File=self.MetaFile, Line=self._LineIndex + 1, ExtraData=self._CurrentLine)
@@ -2030,9 +2030,9 @@ class DecParser(MetaFileParser):
                 self._ValueList[0] = self._CurrentStructurePcdName
                 self._ValueList[1] = ValueList[1].strip()
 
-            if ValueList[0] in ['True', 'true', 'TRUE']:
+            if ValueList[0] in TAB_TRUE_SET:
                 ValueList[0] = '1'
-            elif ValueList[0] in ['False', 'false', 'FALSE']:
+            elif ValueList[0] in TAB_FALSE_SET:
                 ValueList[0] = '0'
 
             # check for duplicate PCD definition
diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
index db9e1ed062fb..478dab3b61b0 100644
--- a/BaseTools/Source/Python/build/BuildReport.py
+++ b/BaseTools/Source/Python/build/BuildReport.py
@@ -124,7 +124,9 @@ gDriverTypeMap = {
   }
 
 ## The look up table of the supported opcode in the dependency expression binaries
-gOpCodeList = ["BEFORE", "AFTER", "PUSH", "AND", "OR", "NOT", "TRUE", "FALSE", "END", "SOR"]
+gOpCodeList = [DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER, DEPEX_OPCODE_PUSH,
+               DEPEX_OPCODE_AND, DEPEX_OPCODE_OR, DEPEX_OPCODE_NOT, DEPEX_OPCODE_TRUE,
+               DEPEX_OPCODE_FALSE, DEPEX_OPCODE_END, DEPEX_OPCODE_SOR]
 
 ##
 # Writes a string to the file object.
@@ -296,7 +298,7 @@ class DepexParser(object):
             OpCode = DepexFile.read(1)
             while OpCode:
                 Statement = gOpCodeList[struct.unpack("B", OpCode)[0]]
-                if Statement in ["BEFORE", "AFTER", "PUSH"]:
+                if Statement in {"BEFORE", "AFTER", "PUSH"}:
                     GuidValue = "%08X-%04X-%04X-%02X%02X-%02X%02X%02X%02X%02X%02X" % \
                                 struct.unpack(PACK_PATTERN_GUID, DepexFile.read(16))
                     GuidString = self._GuidDb.get(GuidValue, GuidValue)
@@ -409,7 +411,7 @@ class DepexReport(object):
         if not ModuleType:
             ModuleType = COMPONENT_TO_MODULE_MAP_DICT.get(M.ComponentType, "")
 
-        if ModuleType in [SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_DXE_CORE, SUP_MODULE_SMM_CORE, SUP_MODULE_MM_CORE_STANDALONE, SUP_MODULE_UEFI_APPLICATION]:
+        if ModuleType in {SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_DXE_CORE, SUP_MODULE_SMM_CORE, SUP_MODULE_MM_CORE_STANDALONE, SUP_MODULE_UEFI_APPLICATION}:
             return
       
         for Source in M.SourceFileList:
@@ -493,25 +495,25 @@ class BuildFlagsReport(object):
         #
         for Source in M.SourceFileList:
             Ext = os.path.splitext(Source.File)[1].lower()
-            if Ext in [".c", ".cc", ".cpp"]:
+            if Ext in {".c", ".cc", ".cpp"}:
                 BuildOptions["CC"] = 1
-            elif Ext in [".s", ".asm"]:
+            elif Ext in {".s", ".asm"}:
                 BuildOptions["PP"] = 1
                 BuildOptions["ASM"] = 1
-            elif Ext in [".vfr"]:
+            elif Ext == ".vfr":
                 BuildOptions["VFRPP"] = 1
                 BuildOptions["VFR"] = 1
-            elif Ext in [".dxs"]:
+            elif Ext == ".dxs":
                 BuildOptions["APP"] = 1
                 BuildOptions["CC"] = 1
-            elif Ext in [".asl"]:
+            elif Ext == ".asl":
                 BuildOptions["ASLPP"] = 1
                 BuildOptions["ASL"] = 1
-            elif Ext in [".aslc"]:
+            elif Ext == ".aslc":
                 BuildOptions["ASLCC"] = 1
                 BuildOptions["ASLDLINK"] = 1
                 BuildOptions["CC"] = 1
-            elif Ext in [".asm16"]:
+            elif Ext == ".asm16":
                 BuildOptions["ASMLINK"] = 1
             BuildOptions["SLINK"] = 1
             BuildOptions["DLINK"] = 1
@@ -1030,11 +1032,11 @@ class PcdReport(object):
                     IsStructure = False
                     if GlobalData.gStructurePcd and (self.Arch in GlobalData.gStructurePcd) and ((Pcd.TokenCName, Pcd.TokenSpaceGuidCName) in GlobalData.gStructurePcd[self.Arch]):
                         IsStructure = True
-                        if TypeName in ('DYNVPD', 'DEXVPD'):
+                        if TypeName in {'DYNVPD', 'DEXVPD'}:
                             SkuInfoList = Pcd.SkuInfoList
                         Pcd = GlobalData.gStructurePcd[self.Arch][(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)]
                         Pcd.DatumType = Pcd.StructName
-                        if TypeName in ('DYNVPD', 'DEXVPD'):
+                        if TypeName in {'DYNVPD', 'DEXVPD'}:
                             Pcd.SkuInfoList = SkuInfoList
                         if Pcd.PcdFieldValueFromComm:
                             BuildOptionMatch = True
@@ -1052,7 +1054,7 @@ class PcdReport(object):
                                 SkuList = sorted(Pcd.SkuInfoList.keys())
                                 for Sku in SkuList:
                                     SkuInfo = Pcd.SkuInfoList[Sku]
-                                    if TypeName in ('DYNHII', 'DEXHII'):
+                                    if TypeName in {'DYNHII', 'DEXHII'}:
                                         if SkuInfo.DefaultStoreDict:
                                             DefaultStoreList = sorted(SkuInfo.DefaultStoreDict.keys())
                                             for DefaultStore in DefaultStoreList:
@@ -1091,7 +1093,7 @@ class PcdReport(object):
                     if ModulePcdSet is None:
                         if IsStructure:
                             continue
-                        if not TypeName in ('PATCH', 'FLAG', 'FIXED'):
+                        if TypeName not in {'PATCH', 'FLAG', 'FIXED'}:
                             continue
                         if not BuildOptionMatch:
                             ModuleOverride = self.ModulePcdOverride.get((Pcd.TokenCName, Pcd.TokenSpaceGuidCName), {})
@@ -1186,7 +1188,7 @@ class PcdReport(object):
             for Sku in SkuList:
                 SkuInfo = Pcd.SkuInfoList[Sku]
                 SkuIdName = SkuInfo.SkuIdName
-                if TypeName in ('DYNHII', 'DEXHII'):
+                if TypeName in {'DYNHII', 'DEXHII'}:
                     if SkuInfo.DefaultStoreDict:
                         DefaultStoreList = sorted(SkuInfo.DefaultStoreDict.keys())
                         for DefaultStore in DefaultStoreList:
@@ -1271,7 +1273,7 @@ class PcdReport(object):
                                 FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, ' ' , TypeName, '(' + Pcd.DatumType + ')', Value))
                             else:
                                 FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, ' ' , TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', Value))
-                    if TypeName in ('DYNVPD', 'DEXVPD'):
+                    if TypeName in {'DYNVPD', 'DEXVPD'}:
                         FileWrite(File, '%*s' % (self.MaxLen + 4, SkuInfo.VpdOffset))
                     if IsStructure:
                         OverrideValues = Pcd.SkuOverrideValues[Sku]
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 1fb8c7985d99..66a97fc8c1cd 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -224,7 +224,7 @@ def NormFile(FilePath, Workspace):
         EdkLogger.error("build", FILE_NOT_FOUND, ExtraData="\t%s (Please give file in absolute path or relative to WORKSPACE)" % FileFullPath)
 
     # remove workspace directory from the beginning part of the file path
-    if Workspace[-1] in ["\\", "/"]:
+    if Workspace[-1] in {"\\", "/"}:
         return FileFullPath[len(Workspace):]
     else:
         return FileFullPath[(len(Workspace) + 1):]
@@ -410,7 +410,7 @@ class ModuleMakeUnit(BuildUnit):
     def __init__(self, Obj, Target):
         Dependency = [ModuleMakeUnit(La, Target) for La in Obj.LibraryAutoGenList]
         BuildUnit.__init__(self, Obj, Obj.BuildCommand, Target, Dependency, Obj.MakeFileDir)
-        if Target in [None, "", "all"]:
+        if Target in {None, "", "all"}:
             self.Target = "tbuild"
 
 ## The smallest platform unit that can be built by nmake/make command in multi-thread build mode
@@ -1228,7 +1228,7 @@ class Build():
             return False
 
         # skip file generation for cleanxxx targets, run and fds target
-        if Target not in ['clean', 'cleanlib', 'cleanall', 'run', 'fds']:
+        if Target not in {'clean', 'cleanlib', 'cleanall', 'run', 'fds'}:
             # for target which must generate AutoGen code and makefile
             if not self.SkipAutoGen or Target == 'genc':
                 self.Progress.Start("Generating code")
@@ -1347,7 +1347,7 @@ class Build():
             return False
 
         # skip file generation for cleanxxx targets, run and fds target
-        if Target not in ['clean', 'cleanlib', 'cleanall', 'run', 'fds']:
+        if Target not in {'clean', 'cleanlib', 'cleanall', 'run', 'fds'}:
             # for target which must generate AutoGen code and makefile
             if not self.SkipAutoGen or Target == 'genc':
                 self.Progress.Start("Generating code")
@@ -1488,7 +1488,7 @@ class Build():
             for SectionHeader in ModuleInfo.Image.SectionHeaderList:
                 if SectionHeader[0] == '.text':
                     TextSectionAddress = SectionHeader[1]
-                elif SectionHeader[0] in ['.data', '.sdata']:
+                elif SectionHeader[0] in {'.data', '.sdata'}:
                     DataSectionAddress = SectionHeader[1]
             if AddrIsOffset:
                 MapBuffer.write('(GUID=%s, .textbaseaddress=-0x%010X, .databaseaddress=-0x%010X)\n' % (ModuleInfo.Guid, 0 - (BaseAddress + TextSectionAddress), 0 - (BaseAddress + DataSectionAddress)))
@@ -1583,19 +1583,19 @@ class Build():
                     if not ImageClass.IsValid:
                         EdkLogger.error("build", FILE_PARSE_FAILURE, ExtraData=ImageClass.ErrorInfo)
                     ImageInfo = PeImageInfo(Module.Name, Module.Guid, Module.Arch, Module.OutputDir, Module.DebugDir, ImageClass)
-                    if Module.ModuleType in [SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM, EDK_COMPONENT_TYPE_COMBINED_PEIM_DRIVER, EDK_COMPONENT_TYPE_PIC_PEIM, EDK_COMPONENT_TYPE_RELOCATABLE_PEIM, SUP_MODULE_DXE_CORE]:
+                    if Module.ModuleType in {SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM, EDK_COMPONENT_TYPE_COMBINED_PEIM_DRIVER, EDK_COMPONENT_TYPE_PIC_PEIM, EDK_COMPONENT_TYPE_RELOCATABLE_PEIM, SUP_MODULE_DXE_CORE}:
                         PeiModuleList[Module.MetaFile] = ImageInfo
                         PeiSize += ImageInfo.Image.Size
-                    elif Module.ModuleType in [EDK_COMPONENT_TYPE_BS_DRIVER, SUP_MODULE_DXE_DRIVER, SUP_MODULE_UEFI_DRIVER]:
+                    elif Module.ModuleType in {EDK_COMPONENT_TYPE_BS_DRIVER, SUP_MODULE_DXE_DRIVER, SUP_MODULE_UEFI_DRIVER}:
                         BtModuleList[Module.MetaFile] = ImageInfo
                         BtSize += ImageInfo.Image.Size
-                    elif Module.ModuleType in [SUP_MODULE_DXE_RUNTIME_DRIVER, EDK_COMPONENT_TYPE_RT_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, EDK_COMPONENT_TYPE_SAL_RT_DRIVER]:
+                    elif Module.ModuleType in {SUP_MODULE_DXE_RUNTIME_DRIVER, EDK_COMPONENT_TYPE_RT_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, EDK_COMPONENT_TYPE_SAL_RT_DRIVER}:
                         RtModuleList[Module.MetaFile] = ImageInfo
                         #IPF runtime driver needs to be at 2 page alignment.
                         if IsIpfPlatform and ImageInfo.Image.Size % 0x2000 != 0:
                             ImageInfo.Image.Size = (ImageInfo.Image.Size / 0x2000 + 1) * 0x2000
                         RtSize += ImageInfo.Image.Size
-                    elif Module.ModuleType in [SUP_MODULE_SMM_CORE, SUP_MODULE_DXE_SMM_DRIVER, SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE]:
+                    elif Module.ModuleType in {SUP_MODULE_SMM_CORE, SUP_MODULE_DXE_SMM_DRIVER, SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE}:
                         SmmModuleList[Module.MetaFile] = ImageInfo
                         SmmSize += ImageInfo.Image.Size
                         if Module.ModuleType == SUP_MODULE_DXE_SMM_DRIVER:
@@ -1757,7 +1757,7 @@ class Build():
                     self._BuildPa(self.Target, Pa, FfsCommand=CmdListDict)
 
                 # Create MAP file when Load Fix Address is enabled.
-                if self.Target in ["", "all", "fds"]:
+                if self.Target in {"", "all", "fds"}:
                     for Arch in Wa.ArchList:
                         GlobalData.gGlobalDefines['ARCH'] = Arch
                         #
@@ -1855,7 +1855,7 @@ class Build():
                                 self.HashSkipModules.append(Ma)
                                 continue
                             # Not to auto-gen for targets 'clean', 'cleanlib', 'cleanall', 'run', 'fds'
-                            if self.Target not in ['clean', 'cleanlib', 'cleanall', 'run', 'fds']:
+                            if self.Target not in {'clean', 'cleanlib', 'cleanall', 'run', 'fds'}:
                                 # for target which must generate AutoGen code and makefile
                                 if not self.SkipAutoGen or self.Target == 'genc':
                                     self.Progress.Start("Generating code")
@@ -2036,7 +2036,7 @@ class Build():
                             continue
 
                         # Not to auto-gen for targets 'clean', 'cleanlib', 'cleanall', 'run', 'fds'
-                        if self.Target not in ['clean', 'cleanlib', 'cleanall', 'run', 'fds']:
+                        if self.Target not in {'clean', 'cleanlib', 'cleanall', 'run', 'fds'}:
                             # for target which must generate AutoGen code and makefile
                             if not self.SkipAutoGen or self.Target == 'genc':
                                 Ma.CreateCodeFile(True)
@@ -2101,7 +2101,7 @@ class Build():
                     EdkLogger.error("build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
 
                 # Create MAP file when Load Fix Address is enabled.
-                if self.Target in ["", "all", "fds"]:
+                if self.Target in {"", "all", "fds"}:
                     for Arch in Wa.ArchList:
                         #
                         # Check whether the set fix address is above 4G for 32bit image.
@@ -2213,7 +2213,7 @@ class Build():
     #
     def Launch(self):
         if not self.ModuleFile:
-            if not self.SpawnMode or self.Target not in ["", "all"]:
+            if not self.SpawnMode or self.Target not in {"", "all"}:
                 self.SpawnMode = False
                 self._BuildPlatform()
             else:
@@ -2274,7 +2274,7 @@ def ParseDefines(DefineList=[]):
                                 ExtraData=DefineTokenList[0])
 
             if len(DefineTokenList) == 1:
-                DefineDict[DefineTokenList[0]] = "TRUE"
+                DefineDict[DefineTokenList[0]] = TAB_TRUE_1
             else:
                 DefineDict[DefineTokenList[0]] = DefineTokenList[1].strip()
     return DefineDict
@@ -2478,7 +2478,7 @@ def Main():
             if ErrorCode != 0:
                 EdkLogger.error("build", ErrorCode, ExtraData=ErrorInfo)
 
-        if Option.Flag is not None and Option.Flag not in ['-c', '-s']:
+        if Option.Flag is not None and Option.Flag not in {'-c', '-s'}:
             EdkLogger.error("build", OPTION_VALUE_INVALID, "UNI flag must be one of -c or -s")
 
         MyBuild = Build(Target, Workspace, Option)
-- 
2.16.2.windows.1




* [PATCH v1 11/11] BaseTools: remove extra assignment
  2018-05-14 18:09 [PATCH v1 00/11] BaseTools refactoring Jaben Carsey
                   ` (9 preceding siblings ...)
  2018-05-14 18:09 ` [PATCH v1 10/11] BaseTools: change to set for membership testing Jaben Carsey
@ 2018-05-14 18:09 ` Jaben Carsey
  10 siblings, 0 replies; 13+ messages in thread
From: Jaben Carsey @ 2018-05-14 18:09 UTC (permalink / raw)
  To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu

There is no use assigning the result back to a variable; just return it directly.
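The change in Expression.py, sketched as a standalone function (names follow the patched `IntToStr`; this is an illustrative reproduction, not the exact BaseTools source):

```python
def int_to_str(value):
    """Convert an integer to a quoted string, one character per byte,
    least-significant byte first (mirrors BaseTools' IntToStr)."""
    chars = []
    while value > 0:
        chars.append(chr(value & 0xFF))
        value >>= 8
    # Return the formatted result directly -- no intermediate
    # reassignment to the parameter, as the patch removes.
    return '"{VAL}"'.format(VAL=''.join(chars))
```

For example, `int_to_str(0x4241)` yields `'"AB"'`: byte `0x41` ('A') comes out first, then `0x42` ('B').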

Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
 BaseTools/Source/Python/Common/Expression.py | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index 3133f610b4a7..3036af058c1d 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -197,8 +197,7 @@ def IntToStr(Value):
     while Value > 0:
         StrList.append(chr(Value & 0xff))
         Value = Value >> 8
-    Value = '"{VAL}"'.format(VAL=''.join(StrList))
-    return Value
+    return '"{VAL}"'.format(VAL=''.join(StrList))
 
 SupportedInMacroList = ['TARGET', 'TOOL_CHAIN_TAG', 'ARCH', 'FAMILY']
 
-- 
2.16.2.windows.1




* [PATCH v1 05/11] BaseTools: use set presence instead of series of equality
  2018-06-20 21:08 [PATCH v2 00/11] BaseTools Refactoring Jaben Carsey
@ 2018-06-20 21:08 ` Jaben Carsey
  0 siblings, 0 replies; 13+ messages in thread
From: Jaben Carsey @ 2018-06-20 21:08 UTC (permalink / raw)
  To: edk2-devel; +Cc: Liming Gao, Yonghong Zhu

Instead of testing each equality individually, just make a set and test once.
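The pattern, sketched outside the Ecc Configuration class (the helper names here are illustrative; the real code splits with `GetSplitValueList(..., TAB_COMMA_SPLIT)` rather than `str.split`):

```python
# The option names that share the comma-split handler, taken from the patch.
_SPLIT_OPTIONS = {'ModifierList', 'SkipDirList', 'SkipFileList',
                  'BinaryExtList', 'Copyright'}

def normalize_option(key, value):
    # One O(1) membership test against a set literal replaces five
    # separate equality checks, each with its own duplicated branch body.
    if key in _SPLIT_OPTIONS:
        value = value.split(',')
    return value
```

Besides removing duplication, the set makes the intent explicit: these five options share a single handler, and adding a sixth is a one-token change.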

Cc: Liming Gao <liming.gao@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Contributed-under: TianoCore Contribution Agreement 1.1
Signed-off-by: Jaben Carsey <jaben.carsey@intel.com>
---
 BaseTools/Source/Python/Ecc/Configuration.py | 10 +---------
 1 file changed, 1 insertion(+), 9 deletions(-)

diff --git a/BaseTools/Source/Python/Ecc/Configuration.py b/BaseTools/Source/Python/Ecc/Configuration.py
index 217b60f4f319..8f4426c20421 100644
--- a/BaseTools/Source/Python/Ecc/Configuration.py
+++ b/BaseTools/Source/Python/Ecc/Configuration.py
@@ -404,17 +404,9 @@ class Configuration(object):
                     ErrorMsg = "Invalid configuration option '%s' was found" % List[0]
                     EdkLogger.error("Ecc", EdkLogger.ECC_ERROR, ErrorMsg, File = Filepath, Line = LineNo)
                 assert _ConfigFileToInternalTranslation[List[0]] in self.__dict__
-                if List[0] == 'ModifierList':
-                    List[1] = GetSplitValueList(List[1], TAB_COMMA_SPLIT)
                 if List[0] == 'MetaDataFileCheckPathOfGenerateFileList' and List[1] == "":
                     continue
-                if List[0] == 'SkipDirList':
-                    List[1] = GetSplitValueList(List[1], TAB_COMMA_SPLIT)
-                if List[0] == 'SkipFileList':
-                    List[1] = GetSplitValueList(List[1], TAB_COMMA_SPLIT)
-                if List[0] == 'BinaryExtList':
-                    List[1] = GetSplitValueList(List[1], TAB_COMMA_SPLIT)
-                if List[0] == 'Copyright':
+                if List[0] in {'ModifierList','SkipDirList','SkipFileList','BinaryExtList','Copyright'}:
                     List[1] = GetSplitValueList(List[1], TAB_COMMA_SPLIT)
                 self.__dict__[_ConfigFileToInternalTranslation[List[0]]] = List[1]
 
-- 
2.16.2.windows.1





Thread overview: 13+ messages
2018-05-14 18:09 [PATCH v1 00/11] BaseTools refactoring Jaben Carsey
2018-05-14 18:09 ` [PATCH v1 01/11] BaseTools: decorate base classes to prevent instantiation Jaben Carsey
2018-05-14 18:09 ` [PATCH v1 02/11] BaseTools: Workspace - create a base class Jaben Carsey
2018-05-14 18:09 ` [PATCH v1 03/11] BaseTools: remove unused code Jaben Carsey
2018-05-14 18:09 ` [PATCH v1 04/11] BaseTools: remove repeated calls to startswith/endswith Jaben Carsey
2018-05-14 18:09 ` [PATCH v1 05/11] BaseTools: use set presence instead of series of equality Jaben Carsey
2018-05-14 18:09 ` [PATCH v1 06/11] BaseTools: refactor section generation Jaben Carsey
2018-05-14 18:09 ` [PATCH v1 07/11] BaseTools: refactor file opening/writing Jaben Carsey
2018-05-14 18:09 ` [PATCH v1 08/11] BaseTools: refactor to change object types Jaben Carsey
2018-05-14 18:09 ` [PATCH v1 09/11] BaseTools: refactor to stop re-allocating strings Jaben Carsey
2018-05-14 18:09 ` [PATCH v1 10/11] BaseTools: change to set for membership testing Jaben Carsey
2018-05-14 18:09 ` [PATCH v1 11/11] BaseTools: remove extra assignment Jaben Carsey
  -- strict thread matches above, loose matches on Subject: below --
2018-06-20 21:08 [PATCH v2 00/11] BaseTools Refactoring Jaben Carsey
2018-06-20 21:08 ` [PATCH v1 05/11] BaseTools: use set presence instead of series of equality Jaben Carsey
