public inbox for devel@edk2.groups.io
* [PATCH v4 00/13] BaseTools: One step toward python3
@ 2018-06-25 10:31 Gary Lin
  2018-06-25 10:31 ` [PATCH v4 01/13] BaseTools: Fix a typo in ini.py Gary Lin
                   ` (14 more replies)
  0 siblings, 15 replies; 16+ messages in thread
From: Gary Lin @ 2018-06-25 10:31 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

v4 changes:
  - Remove the range() patch since it needs python-future
  - Remove the patch to unify long and int since it caused errors on
    Windows.
  - Split out the absolute import patches; they will be introduced later

v3 changes:
  - Rebase to the current git HEAD (2e1083038d9aa74fcaa2db8158fdee7c8b4af3bb)
  - Fix a typo in BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/ini.py
  - Remove the patch for reduce() since it's not used anymore 

v2 changes:
  - Rebase to the current git HEAD (821807bcefb9a36e598d71a8004fae5aab2052a0)
  - Apply "futurize -f libfuturize.fixes.fix_absolute_import" and
    refactor some python scripts to break the circular imports.

This patch series is also available in
https://github.com/lcp/edk2/tree/python3-futurize-v3

Since python2 will reach EOL in 2020, we started to evaluate the impact
of removing python2. As expected, the OVMF build failed the test. This
is actually a task noted in the wiki page:

https://github.com/tianocore/tianocore.github.io/wiki/Tasks-BaseTools-Python3-Support

Maybe it's time to convert the python scripts gradually.

This patchset doesn't make the python scripts in BaseTools compatible
with python3 immediately. It aims to perform the trivial and safe
conversions and replacements that make statements compatible with both
python2 and python3, so that the difficult cases can be dealt with later.

With the help of "futurize" from python-future, it's easier to refactor
these statements. This patchset is basically equivalent to "futurize -1"(*)
plus "StringIO.StringIO => io.BytesIO", minus the absolute import changes.

The patchset was tested with the following command in openSUSE
Tumbleweed:

$ ./OvmfPkg/build.sh -D SECURE_BOOT_ENABLE \
                     -D NETWORK_IP6_ENABLE \
                     -D HTTP_BOOT_ENABLE \
                     -D TLS_ENABLE \
                     -D TPM2_ENABLE

The firmware file was built successfully and I haven't noticed any
errors so far. Testing with other platforms is welcome.

(*) http://python-future.org/automatic_conversion.html#stage-1-safe-fixes

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>

Gary Lin (13):
  BaseTools: Fix a typo in ini.py
  BaseTools: Refactor python except statements
  BaseTools: Refactor python print statements
  BaseTools: Remove the old python "not-equal"
  BaseTools: Remove tuple parameter in python scripts
  BaseTools: Remove the deprecated has_key()
  BaseTools: Replace StandardError with Expression
  BaseTools: Remove types.TypeType
  BaseTools: Refactor python raise statement
  BaseTools: Adjust the spaces around commas and colons
  BaseTools: Migrate to the new octal literal
  BaseTools: Fix old python2 idioms
  BaseTools: Replace StringIO.StringIO with io.BytesIO

 .../Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py   |   5 +-
 BaseTools/Scripts/BinToPcd.py                 |   7 +-
 BaseTools/Scripts/ConvertUni.py               |   5 -
 BaseTools/Scripts/FormatDosFiles.py           |   3 +-
 BaseTools/Scripts/MemoryProfileSymbolGen.py   |  21 +-
 .../PackageDocumentTools/packagedoc_cli.py    |  47 ++--
 .../plugins/EdkPlugins/basemodel/doxygen.py   |  11 +-
 .../plugins/EdkPlugins/basemodel/efibinary.py |  29 +-
 .../plugins/EdkPlugins/basemodel/ini.py       |   4 +-
 .../EdkPlugins/edk2/model/baseobject.py       |   6 +-
 .../EdkPlugins/edk2/model/doxygengen.py       |   2 +-
 .../EdkPlugins/edk2/model/doxygengen_spec.py  |   2 +-
 .../plugins/EdkPlugins/edk2/model/inf.py      |   8 +-
 BaseTools/Scripts/PatchCheck.py               |   2 +-
 BaseTools/Scripts/RunMakefile.py              |   2 +-
 .../Scripts/SmiHandlerProfileSymbolGen.py     |  19 +-
 BaseTools/Scripts/UpdateBuildVersions.py      |  18 +-
 BaseTools/Source/Python/AutoGen/AutoGen.py    |  77 +++---
 .../Source/Python/AutoGen/BuildEngine.py      |  37 +--
 BaseTools/Source/Python/AutoGen/GenC.py       |  72 ++---
 BaseTools/Source/Python/AutoGen/GenDepex.py   |   8 +-
 BaseTools/Source/Python/AutoGen/GenMake.py    |   8 +-
 BaseTools/Source/Python/AutoGen/GenPcdDb.py   | 118 ++++----
 BaseTools/Source/Python/AutoGen/GenVar.py     | 160 +++++------
 .../Source/Python/AutoGen/IdfClassObject.py   |   1 -
 BaseTools/Source/Python/AutoGen/StrGather.py  |   8 +-
 .../Source/Python/AutoGen/UniClassObject.py   |  17 +-
 .../Python/AutoGen/ValidCheckingInfoObject.py |   4 +-
 BaseTools/Source/Python/BPDG/BPDG.py          |   3 +-
 BaseTools/Source/Python/BPDG/GenVpd.py        |  18 +-
 BaseTools/Source/Python/Common/DataType.py    |   4 +-
 BaseTools/Source/Python/Common/Expression.py  |  77 +++---
 .../Source/Python/Common/LongFilePathOs.py    |   2 +-
 BaseTools/Source/Python/Common/Misc.py        |  49 ++--
 .../Source/Python/Common/RangeExpression.py   |  33 +--
 BaseTools/Source/Python/Common/StringUtils.py |   6 +-
 .../Python/Common/TargetTxtClassObject.py     |   7 +-
 .../Python/Common/ToolDefClassObject.py       |   8 +-
 BaseTools/Source/Python/Common/VpdInfoFile.py |  23 +-
 BaseTools/Source/Python/Ecc/CParser.py        | 175 ++++++------
 BaseTools/Source/Python/Ecc/Check.py          |  14 +-
 .../Python/Ecc/CodeFragmentCollector.py       |  69 ++---
 BaseTools/Source/Python/Ecc/Configuration.py  |   5 +-
 BaseTools/Source/Python/Ecc/Exception.py      |   3 +-
 .../Ecc/MetaFileWorkspace/MetaDataTable.py    |   5 +-
 .../Ecc/MetaFileWorkspace/MetaFileParser.py   |  42 +--
 .../Source/Python/Ecc/Xml/XmlRoutines.py      |   9 +-
 BaseTools/Source/Python/Ecc/c.py              |  15 +-
 BaseTools/Source/Python/Eot/CParser.py        | 175 ++++++------
 .../Python/Eot/CodeFragmentCollector.py       |  61 +++--
 BaseTools/Source/Python/Eot/InfParserLite.py  |   7 +-
 BaseTools/Source/Python/Eot/Parser.py         |   2 +-
 BaseTools/Source/Python/Eot/c.py              |  23 +-
 .../Source/Python/GenFds/AprioriSection.py    |   6 +-
 BaseTools/Source/Python/GenFds/Capsule.py     |  10 +-
 BaseTools/Source/Python/GenFds/CapsuleData.py |   6 +-
 BaseTools/Source/Python/GenFds/EfiSection.py  |   6 +-
 BaseTools/Source/Python/GenFds/Fd.py          |  12 +-
 BaseTools/Source/Python/GenFds/FdfParser.py   |  45 +--
 .../Source/Python/GenFds/FfsFileStatement.py  |   4 +-
 .../Source/Python/GenFds/FfsInfStatement.py   |  18 +-
 BaseTools/Source/Python/GenFds/Fv.py          |  10 +-
 .../Source/Python/GenFds/FvImageSection.py    |   8 +-
 BaseTools/Source/Python/GenFds/GenFds.py      |  17 +-
 .../Python/GenFds/GenFdsGlobalVariable.py     |   9 +-
 BaseTools/Source/Python/GenFds/OptionRom.py   |   3 -
 BaseTools/Source/Python/GenFds/Region.py      |  11 +-
 .../GenPatchPcdTable/GenPatchPcdTable.py      |   9 +-
 .../Source/Python/Pkcs7Sign/Pkcs7Sign.py      |  31 ++-
 .../Rsa2048Sha256GenerateKeys.py              |  25 +-
 .../Rsa2048Sha256Sign/Rsa2048Sha256Sign.py    |  35 +--
 .../Source/Python/TargetTool/TargetTool.py    |  39 +--
 BaseTools/Source/Python/Trim/Trim.py          |  24 +-
 .../Source/Python/UPT/Core/DependencyRules.py |  12 +-
 .../UPT/Core/DistributionPackageClass.py      |   4 +-
 BaseTools/Source/Python/UPT/Core/FileHook.py  |   2 +-
 BaseTools/Source/Python/UPT/Core/IpiDb.py     |   6 +-
 .../Source/Python/UPT/Core/PackageFile.py     |  12 +-
 .../Python/UPT/GenMetaFile/GenDecFile.py      |  15 +-
 .../Python/UPT/GenMetaFile/GenInfFile.py      |  37 +--
 BaseTools/Source/Python/UPT/InstallPkg.py     |   2 +-
 BaseTools/Source/Python/UPT/InventoryWs.py    |   2 +-
 .../Python/UPT/Library/CommentParsing.py      |   2 +-
 .../Python/UPT/Library/ExpressionValidate.py  |  11 +-
 BaseTools/Source/Python/UPT/Library/Misc.py   |   6 +-
 .../Python/UPT/Library/ParserValidate.py      |   2 +-
 .../Source/Python/UPT/Library/StringUtils.py  |   4 +-
 .../Python/UPT/Library/UniClassObject.py      |  17 +-
 .../Python/UPT/Library/Xml/XmlRoutines.py     |   4 +-
 BaseTools/Source/Python/UPT/MkPkg.py          |   2 +-
 .../UPT/Object/Parser/InfBinaryObject.py      |   6 +-
 .../UPT/Object/Parser/InfDefineObject.py      |   2 +-
 .../Python/UPT/Object/Parser/InfGuidObject.py |   4 +-
 .../Object/Parser/InfLibraryClassesObject.py  |   2 +-
 .../Python/UPT/Object/Parser/InfMisc.py       |   4 +-
 .../UPT/Object/Parser/InfPackagesObject.py    |   4 +-
 .../Python/UPT/Object/Parser/InfPcdObject.py  |   4 +-
 .../Python/UPT/Object/Parser/InfPpiObject.py  |   4 +-
 .../UPT/Object/Parser/InfProtocolObject.py    |   2 +-
 .../UPT/Object/Parser/InfSoucesObject.py      |   3 +-
 .../Object/Parser/InfUserExtensionObject.py   |   4 +-
 .../Python/UPT/PomAdapter/DecPomAlignment.py  |  56 ++--
 .../Python/UPT/PomAdapter/InfPomAlignment.py  |   3 +-
 .../UPT/PomAdapter/InfPomAlignmentMisc.py     |   3 +-
 BaseTools/Source/Python/UPT/ReplacePkg.py     |   2 +-
 BaseTools/Source/Python/UPT/RmPkg.py          |   2 +-
 BaseTools/Source/Python/UPT/TestInstall.py    |   4 +-
 BaseTools/Source/Python/UPT/UPT.py            |   8 +-
 .../Python/UPT/UnitTest/DecParserTest.py      |   5 +-
 .../UPT/UnitTest/InfBinarySectionTest.py      |   9 +-
 BaseTools/Source/Python/UPT/Xml/CommonXml.py  |   2 +-
 BaseTools/Source/Python/UPT/Xml/XmlParser.py  |  24 +-
 .../Python/Workspace/BuildClassObject.py      |  16 +-
 .../Source/Python/Workspace/DecBuildData.py   |  22 +-
 .../Source/Python/Workspace/DscBuildData.py   | 259 +++++++++---------
 .../Source/Python/Workspace/InfBuildData.py   |   2 +-
 .../Source/Python/Workspace/MetaFileParser.py |  69 ++---
 .../Source/Python/Workspace/MetaFileTable.py  |  10 +-
 .../Python/Workspace/WorkspaceCommon.py       |   2 +-
 BaseTools/Source/Python/build/BuildReport.py  |  21 +-
 BaseTools/Source/Python/build/build.py        |  39 +--
 BaseTools/Tests/CheckPythonSyntax.py          |   2 +-
 BaseTools/Tests/TestTools.py                  |  10 +-
 BaseTools/Tests/TianoCompress.py              |   5 +-
 BaseTools/gcc/mingw-gcc-build.py              | 111 ++++----
 125 files changed, 1376 insertions(+), 1353 deletions(-)

-- 
2.17.1




* [PATCH v4 01/13] BaseTools: Fix a typo in ini.py
  2018-06-25 10:31 [PATCH v4 00/13] BaseTools: One step toward python3 Gary Lin
@ 2018-06-25 10:31 ` Gary Lin
  2018-06-25 10:31 ` [PATCH v4 02/13] BaseTools: Refactor python except statements Gary Lin
                   ` (13 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Gary Lin @ 2018-06-25 10:31 UTC (permalink / raw)
  To: edk2-devel; +Cc: Jaben Carsey, Yonghong Zhu, Liming Gao

"if mis not None:" => "if m is not None:"

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Jaben Carsey <jaben.carsey@intel.com>
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Reviewed-by: Jaben Carsey <jaben.carsey@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/ini.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/ini.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/ini.py
index bf1040d6bac4..ea83327052f2 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/ini.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/ini.py
@@ -122,7 +122,7 @@ class BaseINIFile(object):
                 continue
 
             m = section_re.match(templine)
-            if mis not None: # found a section
+            if m is not None: # found a section
                 inGlobal = False
                 # Finish the latest section first
                 if len(sObjs) != 0:
-- 
2.17.1




* [PATCH v4 02/13] BaseTools: Refactor python except statements
  2018-06-25 10:31 [PATCH v4 00/13] BaseTools: One step toward python3 Gary Lin
  2018-06-25 10:31 ` [PATCH v4 01/13] BaseTools: Fix a typo in ini.py Gary Lin
@ 2018-06-25 10:31 ` Gary Lin
  2018-06-25 10:31 ` [PATCH v4 03/13] BaseTools: Refactor python print statements Gary Lin
                   ` (12 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Gary Lin @ 2018-06-25 10:31 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

Convert "except ... ," to "except ... as" to be compatible with python3.
Based on "futurize -f lib2to3.fixes.fix_except"

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/doxygen.py          |   4 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen.py      |   2 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen_spec.py |   2 +-
 BaseTools/Scripts/UpdateBuildVersions.py                                                |  12 +-
 BaseTools/Source/Python/AutoGen/AutoGen.py                                              |   2 +-
 BaseTools/Source/Python/AutoGen/GenDepex.py                                             |   2 +-
 BaseTools/Source/Python/AutoGen/GenMake.py                                              |   2 +-
 BaseTools/Source/Python/AutoGen/UniClassObject.py                                       |   4 +-
 BaseTools/Source/Python/Common/Expression.py                                            |  22 +--
 BaseTools/Source/Python/Common/Misc.py                                                  |   8 +-
 BaseTools/Source/Python/Common/RangeExpression.py                                       |   6 +-
 BaseTools/Source/Python/Common/VpdInfoFile.py                                           |   2 +-
 BaseTools/Source/Python/Ecc/CParser.py                                                  | 142 ++++++++++----------
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py                          |   2 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py                         |  14 +-
 BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py                                          |   2 +-
 BaseTools/Source/Python/Ecc/c.py                                                        |   2 +-
 BaseTools/Source/Python/Eot/CParser.py                                                  | 142 ++++++++++----------
 BaseTools/Source/Python/GenFds/FdfParser.py                                             |  10 +-
 BaseTools/Source/Python/GenFds/GenFds.py                                                |   4 +-
 BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py                                  |   2 +-
 BaseTools/Source/Python/TargetTool/TargetTool.py                                        |   2 +-
 BaseTools/Source/Python/Trim/Trim.py                                                    |   4 +-
 BaseTools/Source/Python/UPT/Core/DependencyRules.py                                     |   4 +-
 BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py                            |   4 +-
 BaseTools/Source/Python/UPT/Core/IpiDb.py                                               |   2 +-
 BaseTools/Source/Python/UPT/Core/PackageFile.py                                         |  12 +-
 BaseTools/Source/Python/UPT/InstallPkg.py                                               |   2 +-
 BaseTools/Source/Python/UPT/InventoryWs.py                                              |   2 +-
 BaseTools/Source/Python/UPT/Library/CommentParsing.py                                   |   2 +-
 BaseTools/Source/Python/UPT/Library/ExpressionValidate.py                               |   8 +-
 BaseTools/Source/Python/UPT/Library/UniClassObject.py                                   |   8 +-
 BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py                                  |   2 +-
 BaseTools/Source/Python/UPT/MkPkg.py                                                    |   2 +-
 BaseTools/Source/Python/UPT/ReplacePkg.py                                               |   2 +-
 BaseTools/Source/Python/UPT/RmPkg.py                                                    |   2 +-
 BaseTools/Source/Python/UPT/TestInstall.py                                              |   4 +-
 BaseTools/Source/Python/UPT/UPT.py                                                      |   4 +-
 BaseTools/Source/Python/Workspace/DscBuildData.py                                       |  14 +-
 BaseTools/Source/Python/Workspace/InfBuildData.py                                       |   2 +-
 BaseTools/Source/Python/Workspace/MetaFileParser.py                                     |  12 +-
 BaseTools/Source/Python/Workspace/MetaFileTable.py                                      |   4 +-
 BaseTools/Source/Python/build/BuildReport.py                                            |   4 +-
 BaseTools/Source/Python/build/build.py                                                  |  10 +-
 BaseTools/Tests/CheckPythonSyntax.py                                                    |   2 +-
 BaseTools/gcc/mingw-gcc-build.py                                                        |   2 +-
 46 files changed, 254 insertions(+), 250 deletions(-)

diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/doxygen.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/doxygen.py
index 488949f24b6f..a177590af597 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/doxygen.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/doxygen.py
@@ -115,7 +115,7 @@ class DoxygenFile(Page):
             f = open(self.mFilename, 'w')
             f.write('\n'.join(str))
             f.close()
-        except IOError, e:
+        except IOError as e:
             ErrorMsg ('Fail to write file %s' % self.mFilename)
             return False
 
@@ -429,7 +429,7 @@ class DoxygenConfigFile:
             f = open(path, 'w')
             f.write(text)
             f.close()
-        except IOError, e:
+        except IOError as e:
             ErrorMsg ('Fail to generate doxygen config file %s' % path)
             return False
 
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen.py
index 94b6588c0ddf..c22d362ff3e1 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen.py
@@ -1001,7 +1001,7 @@ class PackageDocumentAction(DoxygenAction):
 
         try:
             file = open(path, 'rb')
-        except (IOError, OSError), msg:
+        except (IOError, OSError) as msg:
             return None
 
         t = file.read()
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen_spec.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen_spec.py
index ca55929eda9a..4bae6968a96e 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen_spec.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen_spec.py
@@ -1004,7 +1004,7 @@ class PackageDocumentAction(DoxygenAction):
 
         try:
             file = open(path, 'rb')
-        except (IOError, OSError), msg:
+        except (IOError, OSError) as msg:
             return None
 
         t = file.read()
diff --git a/BaseTools/Scripts/UpdateBuildVersions.py b/BaseTools/Scripts/UpdateBuildVersions.py
index e62030aa9f0f..fb61b89bfb4c 100755
--- a/BaseTools/Scripts/UpdateBuildVersions.py
+++ b/BaseTools/Scripts/UpdateBuildVersions.py
@@ -90,7 +90,8 @@ def ShellCommandResults(CmdLine, Opt):
             sys.stderr.flush()
         returnValue = err_val.returncode
 
-    except IOError as (errno, strerror):
+    except IOError as err_val:
+        (errno, strerror) = err_val.args
         file_list.close()
         if not Opt.silent:
             sys.stderr.write("I/O ERROR : %s : %s\n" % (str(errno), strerror))
@@ -100,7 +101,8 @@ def ShellCommandResults(CmdLine, Opt):
             sys.stderr.flush()
         returnValue = errno
 
-    except OSError as (errno, strerror):
+    except OSError as err_val:
+        (errno, strerror) = err_val.args
         file_list.close()
         if not Opt.silent:
             sys.stderr.write("OS ERROR : %s : %s\n" % (str(errno), strerror))
@@ -210,13 +212,15 @@ def RevertCmd(Filename, Opt):
             sys.stderr.write("Subprocess ERROR : %s\n" % err_val)
             sys.stderr.flush()
 
-    except IOError as (errno, strerror):
+    except IOError as err_val:
+        (errno, strerror) = err_val.args
         if not Opt.silent:
             sys.stderr.write("I/O ERROR : %d : %s\n" % (str(errno), strerror))
             sys.stderr.write("ERROR : this command failed : %s\n" % CmdLine)
             sys.stderr.flush()
 
-    except OSError as (errno, strerror):
+    except OSError as err_val:
+        (errno, strerror) = err_val.args
         if not Opt.silent:
             sys.stderr.write("OS ERROR : %d : %s\n" % (str(errno), strerror))
             sys.stderr.write("ERROR : this command failed : %s\n" % CmdLine)
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index ed0be3bc74f9..72d801df8fd5 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -2220,7 +2220,7 @@ class PlatformAutoGen(AutoGen):
             if ToPcd.DefaultValue:
                 try:
                     ToPcd.DefaultValue = ValueExpressionEx(ToPcd.DefaultValue, ToPcd.DatumType, self._GuidDict)(True)
-                except BadExpression, Value:
+                except BadExpression as Value:
                     EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s", %s' %(ToPcd.TokenSpaceGuidCName, ToPcd.TokenCName, ToPcd.DefaultValue, Value),
                                         File=self.MetaFile)
 
diff --git a/BaseTools/Source/Python/AutoGen/GenDepex.py b/BaseTools/Source/Python/AutoGen/GenDepex.py
index ed5df2b75440..b69788c37e08 100644
--- a/BaseTools/Source/Python/AutoGen/GenDepex.py
+++ b/BaseTools/Source/Python/AutoGen/GenDepex.py
@@ -449,7 +449,7 @@ def Main():
                     os.utime(Option.OutputFile, None)
         else:
             Dpx.Generate()
-    except BaseException, X:
+    except BaseException as X:
         EdkLogger.quiet("")
         if Option is not None and Option.debug is not None:
             EdkLogger.quiet(traceback.format_exc())
diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
index 8541372159a2..48b66c570e0a 100644
--- a/BaseTools/Source/Python/AutoGen/GenMake.py
+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
@@ -1030,7 +1030,7 @@ cleanlib:
             else:
                 try:
                     Fd = open(F.Path, 'r')
-                except BaseException, X:
+                except BaseException as X:
                     EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=F.Path + "\n\t" + str(X))
 
                 FileContent = Fd.read()
diff --git a/BaseTools/Source/Python/AutoGen/UniClassObject.py b/BaseTools/Source/Python/AutoGen/UniClassObject.py
index 5a3c2547783b..06cf3e7d5162 100644
--- a/BaseTools/Source/Python/AutoGen/UniClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/UniClassObject.py
@@ -254,7 +254,7 @@ class UniFileClassObject(object):
         if len(Lang) != 3:
             try:
                 FileIn = UniFileClassObject.OpenUniFile(LongFilePath(File.Path))
-            except UnicodeError, X:
+            except UnicodeError as X:
                 EdkLogger.error("build", FILE_READ_FAILURE, "File read failure: %s" % str(X), ExtraData=File);
             except:
                 EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=File);
@@ -393,7 +393,7 @@ class UniFileClassObject(object):
 
         try:
             FileIn = UniFileClassObject.OpenUniFile(LongFilePath(File.Path))
-        except UnicodeError, X:
+        except UnicodeError as X:
             EdkLogger.error("build", FILE_READ_FAILURE, "File read failure: %s" % str(X), ExtraData=File.Path);
         except:
             EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=File.Path);
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index 9e9d9fdc02e7..7b04dcdb36cc 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -307,7 +307,7 @@ class ValueExpression(BaseExpression):
         }
         try:
             Val = eval(EvalStr, {}, Dict)
-        except Exception, Excpt:
+        except Exception as Excpt:
             raise BadExpression(str(Excpt))
 
         if Operator in {'and', 'or'}:
@@ -425,7 +425,7 @@ class ValueExpression(BaseExpression):
                 continue
             try:
                 Val = self.Eval(Op, Val, EvalFunc())
-            except WrnExpression, Warn:
+            except WrnExpression as Warn:
                 self._WarnExcept = Warn
                 Val = Warn.result
         return Val
@@ -464,7 +464,7 @@ class ValueExpression(BaseExpression):
                 Op += ' ' + self._Token
             try:
                 Val = self.Eval(Op, Val, self._RelExpr())
-            except WrnExpression, Warn:
+            except WrnExpression as Warn:
                 self._WarnExcept = Warn
                 Val = Warn.result
         return Val
@@ -490,14 +490,14 @@ class ValueExpression(BaseExpression):
             Val = self._UnaryExpr()
             try:
                 return self.Eval('not', Val)
-            except WrnExpression, Warn:
+            except WrnExpression as Warn:
                 self._WarnExcept = Warn
                 return Warn.result
         if self._IsOperator({"~"}):
             Val = self._UnaryExpr()
             try:
                 return self.Eval('~', Val)
-            except WrnExpression, Warn:
+            except WrnExpression as Warn:
                 self._WarnExcept = Warn
                 return Warn.result
         return self._IdenExpr()
@@ -816,9 +816,9 @@ class ValueExpressionEx(ValueExpression):
             elif self.PcdType in TAB_PCD_NUMERIC_TYPES and (PcdValue.startswith("'") or \
                       PcdValue.startswith('"') or PcdValue.startswith("L'") or PcdValue.startswith('L"') or PcdValue.startswith('{')):
                 raise BadExpression
-        except WrnExpression, Value:
+        except WrnExpression as Value:
             PcdValue = Value.result
-        except BadExpression, Value:
+        except BadExpression as Value:
             if self.PcdType in TAB_PCD_NUMERIC_TYPES:
                 PcdValue = PcdValue.strip()
                 if PcdValue.startswith('{') and PcdValue.endswith('}'):
@@ -854,7 +854,7 @@ class ValueExpressionEx(ValueExpression):
                                 tmpValue = int(Item, 0)
                                 if tmpValue > 255:
                                     raise BadExpression("Byte  array number %s should less than 0xFF." % Item)
-                            except BadExpression, Value:
+                            except BadExpression as Value:
                                 raise BadExpression(Value)
                             except ValueError:
                                 pass
@@ -870,7 +870,7 @@ class ValueExpressionEx(ValueExpression):
                 else:
                     try:
                         TmpValue, Size = ParseFieldValue(PcdValue)
-                    except BadExpression, Value:
+                    except BadExpression as Value:
                         raise BadExpression("Type: %s, Value: %s, %s" % (self.PcdType, PcdValue, Value))
                 if type(TmpValue) == type(''):
                     try:
@@ -1030,8 +1030,8 @@ if __name__ == '__main__':
         try:
             print ValueExpression(input)(True)
             print ValueExpression(input)(False)
-        except WrnExpression, Ex:
+        except WrnExpression as Ex:
             print Ex.result
             print str(Ex)
-        except Exception, Ex:
+        except Exception as Ex:
             print str(Ex)
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index 24706ebe500f..5197818d3f27 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -478,7 +478,7 @@ def SaveFileOnChange(File, Content, IsBinaryFile=True):
             Fd = open(File, "wb")
             Fd.write(Content)
             Fd.close()
-    except IOError, X:
+    except IOError as X:
         EdkLogger.error(None, FILE_CREATE_FAILURE, ExtraData='IOError %s' % X)
 
     return True
@@ -512,7 +512,7 @@ def DataRestore(File):
     try:
         Fd = open(File, 'rb')
         Data = cPickle.load(Fd)
-    except Exception, e:
+    except Exception as e:
         EdkLogger.verbose("Failed to load [%s]\n\t%s" % (File, str(e)))
         Data = None
     finally:
@@ -1278,7 +1278,7 @@ def ParseDevPathValue (Value):
     try:
         p = subprocess.Popen(Cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
         out, err = p.communicate()
-    except Exception, X:
+    except Exception as X:
         raise BadExpression("DevicePath: %s" % (str(X)) )
     finally:
         subprocess._cleanup()
@@ -1327,7 +1327,7 @@ def ParseFieldValue (Value):
             Value = Value[1:-1]
         try:
             Value = "'" + uuid.UUID(Value).get_bytes_le() + "'"
-        except ValueError, Message:
+        except ValueError as Message:
             raise BadExpression(Message)
         Value, Size = ParseFieldValue(Value)
         return Value, 16
diff --git a/BaseTools/Source/Python/Common/RangeExpression.py b/BaseTools/Source/Python/Common/RangeExpression.py
index 7f504d6e310c..b6f99447057c 100644
--- a/BaseTools/Source/Python/Common/RangeExpression.py
+++ b/BaseTools/Source/Python/Common/RangeExpression.py
@@ -422,7 +422,7 @@ class RangeExpression(BaseExpression):
             Op = self._Token
             try:
                 Val = self.Eval(Op, Val, EvalFunc())
-            except WrnExpression, Warn:
+            except WrnExpression as Warn:
                 self._WarnExcept = Warn
                 Val = Warn.result
         return Val
@@ -445,7 +445,7 @@ class RangeExpression(BaseExpression):
                 Op += ' ' + self._Token
             try:
                 Val = self.Eval(Op, Val, self._RelExpr())
-            except WrnExpression, Warn:
+            except WrnExpression as Warn:
                 self._WarnExcept = Warn
                 Val = Warn.result
         return Val
@@ -457,7 +457,7 @@ class RangeExpression(BaseExpression):
             Val = self._NeExpr()
             try:
                 return self.Eval(Token, Val)
-            except WrnExpression, Warn:
+            except WrnExpression as Warn:
                 self._WarnExcept = Warn
                 return Warn.result
         return self._IdenExpr()
diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index 2b447772eafe..8ff544ed769d 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -245,7 +245,7 @@ def CallExtenalBPDGTool(ToolPath, VpdFileName):
                                         stdout=subprocess.PIPE, 
                                         stderr= subprocess.PIPE,
                                         shell=True)
-    except Exception, X:
+    except Exception as X:
         EdkLogger.error("BPDG", BuildToolError.COMMAND_FAILURE, ExtraData=str(X))
     (out, error) = PopenObject.communicate()
     print out
diff --git a/BaseTools/Source/Python/Ecc/CParser.py b/BaseTools/Source/Python/Ecc/CParser.py
index 94711a9a378a..ddc6cbd506aa 100644
--- a/BaseTools/Source/Python/Ecc/CParser.py
+++ b/BaseTools/Source/Python/Ecc/CParser.py
@@ -173,7 +173,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -532,7 +532,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -809,7 +809,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -964,7 +964,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1092,7 +1092,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1162,7 +1162,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1216,7 +1216,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1263,7 +1263,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1432,7 +1432,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1465,7 +1465,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1589,7 +1589,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1636,7 +1636,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1699,7 +1699,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1742,7 +1742,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1861,7 +1861,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1921,7 +1921,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2003,7 +2003,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2158,7 +2158,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2223,7 +2223,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2275,7 +2275,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2322,7 +2322,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2464,7 +2464,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3056,7 +3056,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3206,7 +3206,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3462,7 +3462,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3528,7 +3528,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3617,7 +3617,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3825,7 +3825,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3881,7 +3881,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3971,7 +3971,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4219,7 +4219,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4570,7 +4570,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4690,7 +4690,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4770,7 +4770,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4835,7 +4835,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4933,7 +4933,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5012,7 +5012,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5103,7 +5103,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5203,7 +5203,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5355,7 +5355,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5583,7 +5583,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5644,7 +5644,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5691,7 +5691,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5789,7 +5789,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5995,7 +5995,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -6065,7 +6065,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -6100,7 +6100,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8135,7 +8135,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8170,7 +8170,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8217,7 +8217,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8285,7 +8285,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8355,7 +8355,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8415,7 +8415,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8475,7 +8475,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8535,7 +8535,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8595,7 +8595,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8669,7 +8669,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8743,7 +8743,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8817,7 +8817,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -9058,7 +9058,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -9155,7 +9155,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -9228,7 +9228,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -9301,7 +9301,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -12467,7 +12467,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -12560,7 +12560,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -14530,7 +14530,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -16251,7 +16251,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -16322,7 +16322,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -16435,7 +16435,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -16586,7 +16586,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -16703,7 +16703,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
index 760f88cc7294..fc65e9a2bd3c 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
@@ -98,7 +98,7 @@ class Table(object):
         SqlCommand = """drop table IF EXISTS %s""" % self.Table
         try:
             self.Cur.execute(SqlCommand)
-        except Exception, e:
+        except Exception as e:
             print "An error occurred when Drop a table:", e.args[0]
 
     ## Get count
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
index 3749f6a2699e..fd96bb9a3c0b 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
@@ -1183,7 +1183,7 @@ class DscParser(MetaFileParser):
 
             try:
                 Processer[self._ItemType]()
-            except EvaluationException, Excpt:
+            except EvaluationException as Excpt:
                 # 
                 # Only catch expression evaluation error here. We need to report
                 # the precise number of line on which the error occurred
@@ -1192,7 +1192,7 @@ class DscParser(MetaFileParser):
 #                 EdkLogger.error('Parser', FORMAT_INVALID, "Invalid expression: %s" % str(Excpt),
 #                                 File=self._FileWithError, ExtraData=' '.join(self._ValueList),
 #                                 Line=self._LineIndex+1)
-            except MacroException, Excpt:
+            except MacroException as Excpt:
                 EdkLogger.error('Parser', FORMAT_INVALID, str(Excpt),
                                 File=self._FileWithError, ExtraData=' '.join(self._ValueList), 
                                 Line=self._LineIndex+1)
@@ -1305,10 +1305,10 @@ class DscParser(MetaFileParser):
             Macros.update(GlobalData.gGlobalDefines)
             try:
                 Result = ValueExpression(self._ValueList[1], Macros)()
-            except SymbolNotFound, Exc:
+            except SymbolNotFound as Exc:
                 EdkLogger.debug(EdkLogger.DEBUG_5, str(Exc), self._ValueList[1])
                 Result = False
-            except WrnExpression, Excpt:
+            except WrnExpression as Excpt:
                 # 
                 # Catch expression evaluation warning here. We need to report
                 # the precise number of line and return the evaluation result
@@ -1317,7 +1317,7 @@ class DscParser(MetaFileParser):
                                 File=self._FileWithError, ExtraData=' '.join(self._ValueList), 
                                 Line=self._LineIndex+1)
                 Result = Excpt.result
-            except BadExpression, Exc:
+            except BadExpression as Exc:
                 EdkLogger.debug(EdkLogger.DEBUG_5, str(Exc), self._ValueList[1])
                 Result = False
 
@@ -1437,13 +1437,13 @@ class DscParser(MetaFileParser):
             PcdValue = ValueList[0]      
             try:
                 ValueList[0] = ValueExpression(PcdValue, self._Macros)(True)
-            except WrnExpression, Value:
+            except WrnExpression as Value:
                 ValueList[0] = Value.result          
         else:
             PcdValue = ValueList[-1]
             try:
                 ValueList[-1] = ValueExpression(PcdValue, self._Macros)(True)
-            except WrnExpression, Value:
+            except WrnExpression as Value:
                 ValueList[-1] = Value.result
             
             if ValueList[-1] == 'True':
diff --git a/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py b/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
index a86f19624c44..d5fb80fcf982 100644
--- a/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
+++ b/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
@@ -214,7 +214,7 @@ def XmlParseFile(FileName):
         Dom = xml.dom.minidom.parse(XmlFile)
         XmlFile.close()
         return Dom
-    except Exception, X:
+    except Exception as X:
         print X
         return ""
 
diff --git a/BaseTools/Source/Python/Ecc/c.py b/BaseTools/Source/Python/Ecc/c.py
index 93ee1990ba28..99b22725e6ba 100644
--- a/BaseTools/Source/Python/Ecc/c.py
+++ b/BaseTools/Source/Python/Ecc/c.py
@@ -2633,7 +2633,7 @@ if __name__ == '__main__':
 #    CollectSourceCodeDataIntoDB(sys.argv[1])
     try:
         test_file = sys.argv[1]
-    except IndexError, v:
+    except IndexError as v:
         print "Usage: %s filename" % sys.argv[0]
         sys.exit(1)
     MsgList = CheckFuncHeaderDoxygenComments(test_file)
diff --git a/BaseTools/Source/Python/Eot/CParser.py b/BaseTools/Source/Python/Eot/CParser.py
index 94711a9a378a..ddc6cbd506aa 100644
--- a/BaseTools/Source/Python/Eot/CParser.py
+++ b/BaseTools/Source/Python/Eot/CParser.py
@@ -173,7 +173,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -532,7 +532,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -809,7 +809,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -964,7 +964,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1092,7 +1092,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1162,7 +1162,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1216,7 +1216,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1263,7 +1263,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1432,7 +1432,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1465,7 +1465,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1589,7 +1589,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1636,7 +1636,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1699,7 +1699,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1742,7 +1742,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1861,7 +1861,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -1921,7 +1921,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2003,7 +2003,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2158,7 +2158,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2223,7 +2223,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2275,7 +2275,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2322,7 +2322,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -2464,7 +2464,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3056,7 +3056,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3206,7 +3206,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3462,7 +3462,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3528,7 +3528,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3617,7 +3617,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3825,7 +3825,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3881,7 +3881,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -3971,7 +3971,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4219,7 +4219,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4570,7 +4570,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4690,7 +4690,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4770,7 +4770,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4835,7 +4835,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -4933,7 +4933,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5012,7 +5012,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5103,7 +5103,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5203,7 +5203,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5355,7 +5355,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5583,7 +5583,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5644,7 +5644,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5691,7 +5691,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5789,7 +5789,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -5995,7 +5995,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -6065,7 +6065,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -6100,7 +6100,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8135,7 +8135,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8170,7 +8170,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8217,7 +8217,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8285,7 +8285,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8355,7 +8355,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8415,7 +8415,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8475,7 +8475,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8535,7 +8535,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8595,7 +8595,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8669,7 +8669,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8743,7 +8743,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -8817,7 +8817,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -9058,7 +9058,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -9155,7 +9155,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -9228,7 +9228,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -9301,7 +9301,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -12467,7 +12467,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -12560,7 +12560,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -14530,7 +14530,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -16251,7 +16251,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -16322,7 +16322,7 @@ class CParser(Parser):
                 retval.stop = self.input.LT(-1)
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -16435,7 +16435,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -16586,7 +16586,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
@@ -16703,7 +16703,7 @@ class CParser(Parser):
 
 
 
-            except RecognitionException, re:
+            except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index ceedcf8a28b9..a99a5e7ac693 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -921,7 +921,7 @@ class FdfParser:
                     return ValueExpression(Expression, MacroPcdDict)(True)
                 else:
                     return ValueExpression(Expression, MacroPcdDict)()
-            except WrnExpression, Excpt:
+            except WrnExpression as Excpt:
                 # 
                 # Catch expression evaluation warning here. We need to report
                 # the precise number of line and return the evaluation result
@@ -930,7 +930,7 @@ class FdfParser:
                                 File=self.FileName, ExtraData=self.__CurrentLine(), 
                                 Line=Line)
                 return Excpt.result
-            except Exception, Excpt:
+            except Exception as Excpt:
                 if hasattr(Excpt, 'Pcd'):
                     if Excpt.Pcd in GlobalData.gPlatformOtherPcds:
                         Info = GlobalData.gPlatformOtherPcds[Excpt.Pcd]
@@ -1368,7 +1368,7 @@ class FdfParser:
             while self.__GetFd() or self.__GetFv() or self.__GetFmp() or self.__GetCapsule() or self.__GetVtf() or self.__GetRule() or self.__GetOptionRom():
                 pass
 
-        except Warning, X:
+        except Warning as X:
             self.__UndoToken()
             #'\n\tGot Token: \"%s\" from File %s\n' % (self.__Token, FileLineTuple[0]) + \
             # At this point, the closest parent would be the included file itself
@@ -4776,7 +4776,7 @@ if __name__ == "__main__":
     import sys
     try:
         test_file = sys.argv[1]
-    except IndexError, v:
+    except IndexError as v:
         print "Usage: %s filename" % sys.argv[0]
         sys.exit(1)
 
@@ -4784,7 +4784,7 @@ if __name__ == "__main__":
     try:
         parser.ParseFile()
         parser.CycleReferenceCheck()
-    except Warning, X:
+    except Warning as X:
         print str(X)
     else:
         print "Success!"
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index 339b99867369..ba3950dacd8a 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -335,10 +335,10 @@ def main():
         """Display FV space info."""
         GenFds.DisplayFvSpaceInfo(FdfParserObj)
 
-    except FdfParser.Warning, X:
+    except FdfParser.Warning as X:
         EdkLogger.error(X.ToolName, FORMAT_INVALID, File=X.FileName, Line=X.LineNumber, ExtraData=X.Message, RaiseError=False)
         ReturnCode = FORMAT_INVALID
-    except FatalError, X:
+    except FatalError as X:
         if Options.debug is not None:
             import traceback
             EdkLogger.quiet(traceback.format_exc())
diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
index c2e82de891d3..c1d656227609 100644
--- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
+++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
@@ -721,7 +721,7 @@ class GenFdsGlobalVariable:
 
         try:
             PopenObject = subprocess.Popen(' '.join(cmd), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
-        except Exception, X:
+        except Exception as X:
             EdkLogger.error("GenFds", COMMAND_FAILURE, ExtraData="%s: %s" % (str(X), cmd[0]))
         (out, error) = PopenObject.communicate()
 
diff --git a/BaseTools/Source/Python/TargetTool/TargetTool.py b/BaseTools/Source/Python/TargetTool/TargetTool.py
index ecac316b7a3a..9fb89549cc29 100644
--- a/BaseTools/Source/Python/TargetTool/TargetTool.py
+++ b/BaseTools/Source/Python/TargetTool/TargetTool.py
@@ -253,7 +253,7 @@ if __name__ == '__main__':
             FileHandle.RWFile('#', '=', 0)
         else:
             FileHandle.RWFile('#', '=', 1)
-    except Exception, e:
+    except Exception as e:
         last_type, last_value, last_tb = sys.exc_info()
         traceback.print_exception(last_type, last_value, last_tb)
 
diff --git a/BaseTools/Source/Python/Trim/Trim.py b/BaseTools/Source/Python/Trim/Trim.py
index a74075859148..b512d15243f8 100644
--- a/BaseTools/Source/Python/Trim/Trim.py
+++ b/BaseTools/Source/Python/Trim/Trim.py
@@ -668,7 +668,7 @@ def Main():
             EdkLogger.SetLevel(CommandOptions.LogLevel + 1)
         else:
             EdkLogger.SetLevel(CommandOptions.LogLevel)
-    except FatalError, X:
+    except FatalError as X:
         return 1
     
     try:
@@ -688,7 +688,7 @@ def Main():
             if CommandOptions.OutputFile is None:
                 CommandOptions.OutputFile = os.path.splitext(InputFile)[0] + '.iii'
             TrimPreprocessedFile(InputFile, CommandOptions.OutputFile, CommandOptions.ConvertHex, CommandOptions.TrimLong)
-    except FatalError, X:
+    except FatalError as X:
         import platform
         import traceback
         if CommandOptions is not None and CommandOptions.LogLevel <= EdkLogger.DEBUG_9:
diff --git a/BaseTools/Source/Python/UPT/Core/DependencyRules.py b/BaseTools/Source/Python/UPT/Core/DependencyRules.py
index 2af847ed2e0b..34f56e7bb487 100644
--- a/BaseTools/Source/Python/UPT/Core/DependencyRules.py
+++ b/BaseTools/Source/Python/UPT/Core/DependencyRules.py
@@ -394,7 +394,7 @@ def VerifyRemoveModuleDep(Path, DpPackagePathList):
                 return False
         else:
             return True
-    except FatalError, ErrCode:
+    except FatalError as ErrCode:
         if ErrCode.message == EDK1_INF_ERROR:
             Logger.Warn("UPT",
                         ST.WRN_EDK1_INF_FOUND%Path)
@@ -446,7 +446,7 @@ def VerifyReplaceModuleDep(Path, DpPackagePathList, OtherPkgList):
                     return False
         else:
             return True
-    except FatalError, ErrCode:
+    except FatalError as ErrCode:
         if ErrCode.message == EDK1_INF_ERROR:
             Logger.Warn("UPT",
                         ST.WRN_EDK1_INF_FOUND%Path)
diff --git a/BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py b/BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py
index 9c55e0ea88a7..81c67fb510a2 100644
--- a/BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py
+++ b/BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py
@@ -155,7 +155,7 @@ class DistributionPackageClass(object):
                                     ModuleObj.GetName(), \
                                     ModuleObj.GetCombinePath())] = ModuleObj
                         PackageObj.SetModuleDict(ModuleDict)
-                    except FatalError, ErrCode:
+                    except FatalError as ErrCode:
                         if ErrCode.message == EDK1_INF_ERROR:
                             Logger.Warn("UPT",
                                         ST.WRN_EDK1_INF_FOUND%Filename)
@@ -181,7 +181,7 @@ class DistributionPackageClass(object):
                                  ModuleObj.GetName(), 
                                  ModuleObj.GetCombinePath())
                     self.ModuleSurfaceArea[ModuleKey] = ModuleObj
-                except FatalError, ErrCode:
+                except FatalError as ErrCode:
                     if ErrCode.message == EDK1_INF_ERROR:
                         Logger.Error("UPT",
                                      EDK1_INF_ERROR,
diff --git a/BaseTools/Source/Python/UPT/Core/IpiDb.py b/BaseTools/Source/Python/UPT/Core/IpiDb.py
index 78d67ab31e1e..97ad47a58dbb 100644
--- a/BaseTools/Source/Python/UPT/Core/IpiDb.py
+++ b/BaseTools/Source/Python/UPT/Core/IpiDb.py
@@ -230,7 +230,7 @@ class IpiDatabase(object):
             self._AddDp(DpObj.Header.GetGuid(), DpObj.Header.GetVersion(), \
                         NewDpPkgFileName, DpPkgFileName, RePackage)
     
-        except sqlite3.IntegrityError, DetailMsg:
+        except sqlite3.IntegrityError as DetailMsg:
             Logger.Error("UPT",
                          UPT_DB_UPDATE_ERROR,
                          ST.ERR_UPT_DB_UPDATE_ERROR,
diff --git a/BaseTools/Source/Python/UPT/Core/PackageFile.py b/BaseTools/Source/Python/UPT/Core/PackageFile.py
index ec6f5503eaad..298d8aa9db3b 100644
--- a/BaseTools/Source/Python/UPT/Core/PackageFile.py
+++ b/BaseTools/Source/Python/UPT/Core/PackageFile.py
@@ -51,7 +51,7 @@ class PackageFile:
             self._Files = {}
             for Filename in self._ZipFile.namelist():
                 self._Files[os.path.normpath(Filename)] = Filename
-        except BaseException, Xstr:
+        except BaseException as Xstr:
             Logger.Error("PackagingTool", FILE_OPEN_FAILURE, 
                             ExtraData="%s (%s)" % (FileName, str(Xstr)))
 
@@ -106,7 +106,7 @@ class PackageFile:
                             ExtraData="[%s] in %s" % (Which, self._FileName))
         try:
             FileContent = self._ZipFile.read(self._Files[Which])
-        except BaseException, Xstr:
+        except BaseException as Xstr:
             Logger.Error("PackagingTool", FILE_DECOMPRESS_FAILURE, 
                             ExtraData="[%s] in %s (%s)" % (Which, \
                                                            self._FileName, \
@@ -119,14 +119,14 @@ class PackageFile:
                 return
             else:
                 ToFile = __FileHookOpen__(ToDest, 'wb')
-        except BaseException, Xstr:
+        except BaseException as Xstr:
             Logger.Error("PackagingTool", FILE_OPEN_FAILURE, 
                             ExtraData="%s (%s)" % (ToDest, str(Xstr)))
 
         try:
             ToFile.write(FileContent)
             ToFile.close()
-        except BaseException, Xstr:
+        except BaseException as Xstr:
             Logger.Error("PackagingTool", FILE_WRITE_FAILURE, 
                             ExtraData="%s (%s)" % (ToDest, str(Xstr)))
 
@@ -228,7 +228,7 @@ class PackageFile:
                     return
             Logger.Info("packing ..." + File)
             self._ZipFile.write(File, ArcName)
-        except BaseException, Xstr:
+        except BaseException as Xstr:
             Logger.Error("PackagingTool", FILE_COMPRESS_FAILURE,
                             ExtraData="%s (%s)" % (File, str(Xstr)))
 
@@ -242,7 +242,7 @@ class PackageFile:
             if os.path.splitext(ArcName)[1].lower() == '.pkg':
                 Data = Data.encode('utf_8')
             self._ZipFile.writestr(ArcName, Data)
-        except BaseException, Xstr:
+        except BaseException as Xstr:
             Logger.Error("PackagingTool", FILE_COMPRESS_FAILURE,
                             ExtraData="%s (%s)" % (ArcName, str(Xstr)))
 
diff --git a/BaseTools/Source/Python/UPT/InstallPkg.py b/BaseTools/Source/Python/UPT/InstallPkg.py
index c0d56b55aacd..dc22ff7e3484 100644
--- a/BaseTools/Source/Python/UPT/InstallPkg.py
+++ b/BaseTools/Source/Python/UPT/InstallPkg.py
@@ -537,7 +537,7 @@ def Main(Options = None):
                       Options, Dep, WorkspaceDir, DataBase)
         ReturnCode = 0
         
-    except FatalError, XExcept:
+    except FatalError as XExcept:
         ReturnCode = XExcept.args[0]
         if Logger.GetLevel() <= Logger.DEBUG_9:
             Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + format_exc())
diff --git a/BaseTools/Source/Python/UPT/InventoryWs.py b/BaseTools/Source/Python/UPT/InventoryWs.py
index 824e1c288947..cd92753a8d4b 100644
--- a/BaseTools/Source/Python/UPT/InventoryWs.py
+++ b/BaseTools/Source/Python/UPT/InventoryWs.py
@@ -92,7 +92,7 @@ def Main(Options = None):
         DataBase = GlobalData.gDB
         InventoryDistInstalled(DataBase)     
         ReturnCode = 0       
-    except FatalError, XExcept:
+    except FatalError as XExcept:
         ReturnCode = XExcept.args[0]
         if Logger.GetLevel() <= Logger.DEBUG_9:
             Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + format_exc())
diff --git a/BaseTools/Source/Python/UPT/Library/CommentParsing.py b/BaseTools/Source/Python/UPT/Library/CommentParsing.py
index 4713614c4a45..8ee788bd7724 100644
--- a/BaseTools/Source/Python/UPT/Library/CommentParsing.py
+++ b/BaseTools/Source/Python/UPT/Library/CommentParsing.py
@@ -217,7 +217,7 @@ def ParsePcdErrorCode (Value = None, ContainerFile = None, LineNum = None):
         # To delete the tailing 'L'
         #
         return hex(ErrorCode)[:-1]
-    except ValueError, XStr:
+    except ValueError as XStr:
         if XStr:
             pass
         Logger.Error('Parser', 
diff --git a/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py b/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py
index 090c7eb95716..ca21e6995217 100644
--- a/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py
+++ b/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py
@@ -297,7 +297,7 @@ class _LogicalExpressionParser(_ExprBase):
         try:
             if self.LogicalExpression() not in [self.ARITH, self.LOGICAL, self.REALLOGICAL, self.STRINGITEM]:
                 return False, ST.ERR_EXPR_LOGICAL % self.Token
-        except _ExprError, XExcept:
+        except _ExprError as XExcept:
             return False, XExcept.Error
         self.SkipWhitespace()
         if self.Index != self.Len:
@@ -327,7 +327,7 @@ class _ValidRangeExpressionParser(_ExprBase):
         try:
             if self.RangeExpression() not in [self.HEX, self.INT]:
                 return False, ST.ERR_EXPR_RANGE % self.Token
-        except _ExprError, XExcept:
+        except _ExprError as XExcept:
             return False, XExcept.Error
         
         self.SkipWhitespace()
@@ -423,7 +423,7 @@ class _ValidListExpressionParser(_ExprBase):
         try:
             if self.ListExpression() not in [self.NUM]:
                 return False, ST.ERR_EXPR_LIST % self.Token
-        except _ExprError, XExcept:
+        except _ExprError as XExcept:
             return False, XExcept.Error
 
         self.SkipWhitespace()
@@ -457,7 +457,7 @@ class _StringTestParser(_ExprBase):
             return False, ST.ERR_EXPR_EMPTY
         try:
             self.StringTest()
-        except _ExprError, XExcept:
+        except _ExprError as XExcept:
             return False, XExcept.Error
         return True, ''
 
diff --git a/BaseTools/Source/Python/UPT/Library/UniClassObject.py b/BaseTools/Source/Python/UPT/Library/UniClassObject.py
index 7dcf0cf6558b..299cd871444b 100644
--- a/BaseTools/Source/Python/UPT/Library/UniClassObject.py
+++ b/BaseTools/Source/Python/UPT/Library/UniClassObject.py
@@ -327,9 +327,9 @@ class UniFileClassObject(object):
         if len(Lang) != 3:
             try:
                 FileIn = codecs.open(File.Path, mode='rb', encoding='utf_8').readlines()
-            except UnicodeError, Xstr:
+            except UnicodeError as Xstr:
                 FileIn = codecs.open(File.Path, mode='rb', encoding='utf_16').readlines()
-            except UnicodeError, Xstr:
+            except UnicodeError as Xstr:
                 FileIn = codecs.open(File.Path, mode='rb', encoding='utf_16_le').readlines()
             except:
                 EdkLogger.Error("Unicode File Parser", 
@@ -436,7 +436,7 @@ class UniFileClassObject(object):
 
         try:
             FileIn = codecs.open(File.Path, mode='rb', encoding='utf_8').readlines()
-        except UnicodeError, Xstr:
+        except UnicodeError as Xstr:
             FileIn = codecs.open(File.Path, mode='rb', encoding='utf_16').readlines()
         except UnicodeError:
             FileIn = codecs.open(File.Path, mode='rb', encoding='utf_16_le').readlines()
@@ -1042,7 +1042,7 @@ class UniFileClassObject(object):
                              ExtraData=FilaPath)
         try:
             FileIn = codecs.open(FilaPath, mode='rb', encoding='utf_8').readlines()
-        except UnicodeError, Xstr:
+        except UnicodeError as Xstr:
             FileIn = codecs.open(FilaPath, mode='rb', encoding='utf_16').readlines()
         except UnicodeError:
             FileIn = codecs.open(FilaPath, mode='rb', encoding='utf_16_le').readlines()
diff --git a/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py b/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py
index f20ae4dfa82f..1096bc5b1849 100644
--- a/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py
+++ b/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py
@@ -224,6 +224,6 @@ def XmlParseFile(FileName):
         Dom = xml.dom.minidom.parse(XmlFile)
         XmlFile.close()
         return Dom
-    except BaseException, XExcept:
+    except BaseException as XExcept:
         XmlFile.close()
         Logger.Error('\nUPT', PARSER_ERROR, XExcept, File=FileName, RaiseError=True)
diff --git a/BaseTools/Source/Python/UPT/MkPkg.py b/BaseTools/Source/Python/UPT/MkPkg.py
index ff9aa7fb117c..e7ec328a78d9 100644
--- a/BaseTools/Source/Python/UPT/MkPkg.py
+++ b/BaseTools/Source/Python/UPT/MkPkg.py
@@ -213,7 +213,7 @@ def Main(Options = None):
         Logger.Quiet(ST.MSG_FINISH)
         ReturnCode = 0
 
-    except FatalError, XExcept:
+    except FatalError as XExcept:
         ReturnCode = XExcept.args[0]        
         if Logger.GetLevel() <= Logger.DEBUG_9:
             Logger.Quiet(ST.MSG_PYTHON_ON % \
diff --git a/BaseTools/Source/Python/UPT/ReplacePkg.py b/BaseTools/Source/Python/UPT/ReplacePkg.py
index efbf68a4ecc6..6f52b4f8f8e8 100644
--- a/BaseTools/Source/Python/UPT/ReplacePkg.py
+++ b/BaseTools/Source/Python/UPT/ReplacePkg.py
@@ -71,7 +71,7 @@ def Main(Options = None):
         InstallDp(DistPkg, DpPkgFileName, ContentZipFile, Options, Dep, WorkspaceDir, DataBase)
         ReturnCode = 0
         
-    except FatalError, XExcept:
+    except FatalError as XExcept:
         ReturnCode = XExcept.args[0]
         if Logger.GetLevel() <= Logger.DEBUG_9:
             Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(),
diff --git a/BaseTools/Source/Python/UPT/RmPkg.py b/BaseTools/Source/Python/UPT/RmPkg.py
index ea842c11859f..6427a8f16c88 100644
--- a/BaseTools/Source/Python/UPT/RmPkg.py
+++ b/BaseTools/Source/Python/UPT/RmPkg.py
@@ -157,7 +157,7 @@ def Main(Options = None):
         
         ReturnCode = 0
         
-    except FatalError, XExcept:
+    except FatalError as XExcept:
         ReturnCode = XExcept.args[0]        
         if Logger.GetLevel() <= Logger.DEBUG_9:
             Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + \
diff --git a/BaseTools/Source/Python/UPT/TestInstall.py b/BaseTools/Source/Python/UPT/TestInstall.py
index 899cae56aa87..d8918737f907 100644
--- a/BaseTools/Source/Python/UPT/TestInstall.py
+++ b/BaseTools/Source/Python/UPT/TestInstall.py
@@ -68,12 +68,12 @@ def Main(Options=None):
         else:
             Logger.Quiet(ST.MSG_TEST_INSTALL_FAIL)
 
-    except TE.FatalError, XExcept:
+    except TE.FatalError as XExcept:
         ReturnCode = XExcept.args[0]
         if Logger.GetLevel() <= Logger.DEBUG_9:
             Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + format_exc())
 
-    except Exception, x:
+    except Exception as x:
         ReturnCode = TE.CODE_ERROR
         Logger.Error(
                     "\nTestInstallPkg",
diff --git a/BaseTools/Source/Python/UPT/UPT.py b/BaseTools/Source/Python/UPT/UPT.py
index 09653cdce95f..2644dbed31e9 100644
--- a/BaseTools/Source/Python/UPT/UPT.py
+++ b/BaseTools/Source/Python/UPT/UPT.py
@@ -179,7 +179,7 @@ def Main():
 
     try:
         GlobalData.gWORKSPACE, GlobalData.gPACKAGE_PATH = GetWorkspace()
-    except FatalError, XExcept:
+    except FatalError as XExcept:
         if Logger.GetLevel() <= Logger.DEBUG_9:
             Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + format_exc())
         return XExcept.args[0]
@@ -294,7 +294,7 @@ def Main():
             return OPTION_MISSING
 
         ReturnCode = RunModule(Opt)
-    except FatalError, XExcept:
+    except FatalError as XExcept:
         ReturnCode = XExcept.args[0]
         if Logger.GetLevel() <= Logger.DEBUG_9:
             Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + \
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index 1ed7eb1c2cf7..a001162e8e3b 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -120,7 +120,7 @@ def GetDependencyList(FileStack,SearchPathList):
             try:
                 Fd = open(F, 'r')
                 FileContent = Fd.read()
-            except BaseException, X:
+            except BaseException as X:
                 EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=F + "\n\t" + str(X))
             finally:
                 if "Fd" in dir(locals()):
@@ -887,11 +887,11 @@ class DscBuildData(PlatformBuildClassObject):
             DatumType = self._DecPcds[PcdCName, TokenSpaceGuid].DatumType
             try:
                 ValueList[Index] = ValueExpressionEx(ValueList[Index], DatumType, self._GuidDict)(True)
-            except BadExpression, Value:
+            except BadExpression as Value:
                 EdkLogger.error('Parser', FORMAT_INVALID, Value, File=self.MetaFile, Line=LineNo,
                                 ExtraData="PCD [%s.%s] Value \"%s\" " % (
                                 TokenSpaceGuid, PcdCName, ValueList[Index]))
-            except EvaluationException, Excpt:
+            except EvaluationException as Excpt:
                 if hasattr(Excpt, 'Pcd'):
                     if Excpt.Pcd in GlobalData.gPlatformOtherPcds:
                         EdkLogger.error('Parser', FORMAT_INVALID, "Cannot use this PCD (%s) in an expression as"
@@ -1059,7 +1059,7 @@ class DscBuildData(PlatformBuildClassObject):
                 return PcdValue
             try:
                 PcdValue = ValueExpressionEx(PcdValue[1:], PcdDatumType, GuidDict)(True)
-            except BadExpression, Value:     
+            except BadExpression as Value:
                 EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s",  %s' %
                                 (TokenSpaceGuidCName, TokenCName, PcdValue, Value))
         elif PcdValue.startswith("L'") or PcdValue.startswith("'"):
@@ -1070,7 +1070,7 @@ class DscBuildData(PlatformBuildClassObject):
                 return PcdValue
             try:
                 PcdValue = ValueExpressionEx(PcdValue, PcdDatumType, GuidDict)(True)
-            except BadExpression, Value:
+            except BadExpression as Value:
                 EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s",  %s' %
                                 (TokenSpaceGuidCName, TokenCName, PcdValue, Value))
         elif PcdValue.startswith('L'):
@@ -1082,7 +1082,7 @@ class DscBuildData(PlatformBuildClassObject):
                 return PcdValue
             try:
                 PcdValue = ValueExpressionEx(PcdValue, PcdDatumType, GuidDict)(True)
-            except BadExpression, Value:
+            except BadExpression as Value:
                 EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s",  %s' %
                                 (TokenSpaceGuidCName, TokenCName, PcdValue, Value))
         else:
@@ -1109,7 +1109,7 @@ class DscBuildData(PlatformBuildClassObject):
                     return PcdValue
             try:
                 PcdValue = ValueExpressionEx(PcdValue, PcdDatumType, GuidDict)(True)
-            except BadExpression, Value:
+            except BadExpression as Value:
                 EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s",  %s' %
                                 (TokenSpaceGuidCName, TokenCName, PcdValue, Value))
         return PcdValue
diff --git a/BaseTools/Source/Python/Workspace/InfBuildData.py b/BaseTools/Source/Python/Workspace/InfBuildData.py
index 836140759f21..165e03f78964 100644
--- a/BaseTools/Source/Python/Workspace/InfBuildData.py
+++ b/BaseTools/Source/Python/Workspace/InfBuildData.py
@@ -1121,7 +1121,7 @@ class InfBuildData(ModuleBuildClassObject):
                     else:
                         try:
                             Pcd.DefaultValue = ValueExpressionEx(Pcd.DefaultValue, Pcd.DatumType, _GuidDict)(True)
-                        except BadExpression, Value:
+                        except BadExpression as Value:
                             EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s", %s' %(TokenSpaceGuid, PcdRealName, Pcd.DefaultValue, Value),
                                             File=self.MetaFile, Line=LineNo)
                     break
diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTools/Source/Python/Workspace/MetaFileParser.py
index 88c7bb374ccc..f1cfa73fd4f2 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py
@@ -1341,7 +1341,7 @@ class DscParser(MetaFileParser):
                 self._InSubsection = False
             try:
                 Processer[self._ItemType]()
-            except EvaluationException, Excpt:
+            except EvaluationException as Excpt:
                 # 
                 # Only catch expression evaluation error here. We need to report
                 # the precise number of line on which the error occurred
@@ -1363,7 +1363,7 @@ class DscParser(MetaFileParser):
                     EdkLogger.error('Parser', FORMAT_INVALID, "Invalid expression: %s" % str(Excpt),
                                     File=self._FileWithError, ExtraData=' '.join(self._ValueList),
                                     Line=self._LineIndex + 1)
-            except MacroException, Excpt:
+            except MacroException as Excpt:
                 EdkLogger.error('Parser', FORMAT_INVALID, str(Excpt),
                                 File=self._FileWithError, ExtraData=' '.join(self._ValueList),
                                 Line=self._LineIndex + 1)
@@ -1465,10 +1465,10 @@ class DscParser(MetaFileParser):
             Macros.update(GlobalData.gGlobalDefines)
             try:
                 Result = ValueExpression(self._ValueList[1], Macros)()
-            except SymbolNotFound, Exc:
+            except SymbolNotFound as Exc:
                 EdkLogger.debug(EdkLogger.DEBUG_5, str(Exc), self._ValueList[1])
                 Result = False
-            except WrnExpression, Excpt:
+            except WrnExpression as Excpt:
                 # 
                 # Catch expression evaluation warning here. We need to report
                 # the precise number of line and return the evaluation result
@@ -1614,7 +1614,7 @@ class DscParser(MetaFileParser):
         if PcdValue and "." not in self._ValueList[0]:
             try:
                 ValList[Index] = ValueExpression(PcdValue, self._Macros)(True)
-            except WrnExpression, Value:
+            except WrnExpression as Value:
                 ValList[Index] = Value.result
             except:
                 pass
@@ -2019,7 +2019,7 @@ class DecParser(MetaFileParser):
                 try:
                     self._GuidDict.update(self._AllPcdDict)
                     ValueList[0] = ValueExpressionEx(ValueList[0], ValueList[1], self._GuidDict)(True)
-                except BadExpression, Value:
+                except BadExpression as Value:
                     EdkLogger.error('Parser', FORMAT_INVALID, Value, ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
             # check format of default value against the datum type
             IsValid, Cause = CheckPcdDatum(ValueList[1], ValueList[0])
diff --git a/BaseTools/Source/Python/Workspace/MetaFileTable.py b/BaseTools/Source/Python/Workspace/MetaFileTable.py
index 3c8dae0e622f..d17487a4409d 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileTable.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileTable.py
@@ -63,7 +63,7 @@ class MetaFileTable(Table):
                 # update the timestamp in database
                 self._FileIndexTable.SetFileTimeStamp(self.IdBase, TimeStamp)
                 return False
-        except Exception, Exc:
+        except Exception as Exc:
             EdkLogger.debug(EdkLogger.DEBUG_5, str(Exc))
             return False
         return True
@@ -250,7 +250,7 @@ class PackageTable(MetaFileTable):
                 if comment.startswith("@Expression"):
                     comment = comment.replace("@Expression", "", 1)
                     expressions.append(comment.split("|")[1].strip())
-        except Exception, Exc:
+        except Exception as Exc:
             ValidType = ""
             if oricomment.startswith("@ValidRange"):
                 ValidType = "@ValidRange"
diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
index 324b6ff6aa76..55222c886d2d 100644
--- a/BaseTools/Source/Python/build/BuildReport.py
+++ b/BaseTools/Source/Python/build/BuildReport.py
@@ -649,7 +649,7 @@ class ModuleReport(object):
                 cmd = ["GenFw", "--rebase", str(0), "-o", Tempfile, DefaultEFIfile]
                 try:
                     PopenObject = subprocess.Popen(' '.join(cmd), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
-                except Exception, X:
+                except Exception as X:
                     EdkLogger.error("GenFw", COMMAND_FAILURE, ExtraData="%s: %s" % (str(X), cmd[0]))
                 EndOfProcedure = threading.Event()
                 EndOfProcedure.clear()
@@ -962,7 +962,7 @@ class PcdReport(object):
                 if DscDefaultValue != DscDefaultValBak:
                     try:
                         DscDefaultValue = ValueExpressionEx(DscDefaultValue, Pcd.DatumType, self._GuidDict)(True)
-                    except BadExpression, DscDefaultValue:
+                    except BadExpression as DscDefaultValue:
                         EdkLogger.error('BuildReport', FORMAT_INVALID, "PCD Value: %s, Type: %s" %(DscDefaultValue, Pcd.DatumType))
 
                 InfDefaultValue = None
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index c16e810fed71..4600c46be1be 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -548,7 +548,7 @@ class BuildTask:
                 EdkLogger.debug(EdkLogger.DEBUG_8, "Threads [%s]" % ", ".join(Th.getName() for Th in threading.enumerate()))
                 # avoid tense loop
                 time.sleep(0.1)
-        except BaseException, X:
+        except BaseException as X:
             #
             # TRICK: hide the output of threads left runing, so that the user can
             #        catch the error message easily
@@ -1324,7 +1324,7 @@ class Build():
             try:
                 #os.rmdir(AutoGenObject.BuildDir)
                 RemoveDirectory(AutoGenObject.BuildDir, True)
-            except WindowsError, X:
+            except WindowsError as X:
                 EdkLogger.error("build", FILE_DELETE_FAILURE, ExtraData=str(X))
         return True
 
@@ -1414,7 +1414,7 @@ class Build():
             try:
                 #os.rmdir(AutoGenObject.BuildDir)
                 RemoveDirectory(AutoGenObject.BuildDir, True)
-            except WindowsError, X:
+            except WindowsError as X:
                 EdkLogger.error("build", FILE_DELETE_FAILURE, ExtraData=str(X))
         return True
 
@@ -2500,14 +2500,14 @@ def Main():
         # All job done, no error found and no exception raised
         #
         BuildError = False
-    except FatalError, X:
+    except FatalError as X:
         if MyBuild is not None:
             # for multi-thread build exits safely
             MyBuild.Relinquish()
         if Option is not None and Option.debug is not None:
             EdkLogger.quiet("(Python %s on %s) " % (platform.python_version(), sys.platform) + traceback.format_exc())
         ReturnCode = X.args[0]
-    except Warning, X:
+    except Warning as X:
         # error from Fdf parser
         if MyBuild is not None:
             # for multi-thread build exits safely
diff --git a/BaseTools/Tests/CheckPythonSyntax.py b/BaseTools/Tests/CheckPythonSyntax.py
index 61a048ad5d05..a55b29de4713 100644
--- a/BaseTools/Tests/CheckPythonSyntax.py
+++ b/BaseTools/Tests/CheckPythonSyntax.py
@@ -29,7 +29,7 @@ class Tests(TestTools.BaseToolsTest):
     def SingleFileTest(self, filename):
         try:
             py_compile.compile(filename, doraise=True)
-        except Exception, e:
+        except Exception as e:
             self.fail('syntax error: %s, Error is %s' % (filename, str(e)))
 
 def MakePythonSyntaxCheckTests():
diff --git a/BaseTools/gcc/mingw-gcc-build.py b/BaseTools/gcc/mingw-gcc-build.py
index 420b3dea80f7..858b4020ef9f 100755
--- a/BaseTools/gcc/mingw-gcc-build.py
+++ b/BaseTools/gcc/mingw-gcc-build.py
@@ -337,7 +337,7 @@ class SourceFiles:
                     print '[KeyboardInterrupt]'
                     return False
 
-                except Exception, e:
+                except Exception as e:
                     print e
 
             if not completed: return False
-- 
2.17.1



^ permalink raw reply related	[flat|nested] 16+ messages in thread

* [PATCH v4 03/13] BaseTools: Refactor python print statements
  2018-06-25 10:31 [PATCH v4 00/20] BaseTools: One step toward python3 Gary Lin
  2018-06-25 10:31 ` [PATCH v4 01/13] BaseTools: Fix a typo in ini.py Gary Lin
  2018-06-25 10:31 ` [PATCH v4 02/13] BaseTools: Refactor python except statements Gary Lin
@ 2018-06-25 10:31 ` Gary Lin
  2018-06-25 10:31 ` [PATCH v4 04/13] BaseTools: Remove the old python "not-equal" Gary Lin
                   ` (11 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Gary Lin @ 2018-06-25 10:31 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

Refactor print statements to be compatible with python 3.
Based on "futurize -f libfuturize.fixes.fix_print_with_import"
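
For illustration only (this snippet is not part of the patch below), a minimal
sketch of the rewrite this fixer performs; the variable is a made-up example:

    from __future__ import print_function

    ModuleName = "MdePkg"   # hypothetical value, for illustration only
    # python2-only statement form rewritten by the fixer:
    #     print "Processing", ModuleName
    # function-call form accepted by both python2 and python3:
    print("Processing", ModuleName)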

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py                                |  3 +-
 BaseTools/Scripts/BinToPcd.py                                                    |  1 +
 BaseTools/Scripts/FormatDosFiles.py                                              |  1 +
 BaseTools/Scripts/MemoryProfileSymbolGen.py                                      | 13 +--
 BaseTools/Scripts/PackageDocumentTools/packagedoc_cli.py                         | 47 +++++-----
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/doxygen.py   |  3 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/efibinary.py | 29 +++---
 BaseTools/Scripts/SmiHandlerProfileSymbolGen.py                                  | 19 ++--
 BaseTools/Source/Python/AutoGen/AutoGen.py                                       |  5 +-
 BaseTools/Source/Python/AutoGen/BuildEngine.py                                   | 31 +++---
 BaseTools/Source/Python/AutoGen/UniClassObject.py                                |  7 +-
 BaseTools/Source/Python/BPDG/BPDG.py                                             |  3 +-
 BaseTools/Source/Python/Common/Expression.py                                     | 11 ++-
 BaseTools/Source/Python/Common/RangeExpression.py                                |  5 +-
 BaseTools/Source/Python/Common/TargetTxtClassObject.py                           |  7 +-
 BaseTools/Source/Python/Common/VpdInfoFile.py                                    |  3 +-
 BaseTools/Source/Python/Ecc/CParser.py                                           |  5 +-
 BaseTools/Source/Python/Ecc/CodeFragmentCollector.py                             | 69 +++++++-------
 BaseTools/Source/Python/Ecc/Configuration.py                                     |  5 +-
 BaseTools/Source/Python/Ecc/Exception.py                                         |  3 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py                   |  3 +-
 BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py                                   |  5 +-
 BaseTools/Source/Python/Ecc/c.py                                                 | 13 +--
 BaseTools/Source/Python/Eot/CParser.py                                           |  5 +-
 BaseTools/Source/Python/Eot/CodeFragmentCollector.py                             | 61 ++++++------
 BaseTools/Source/Python/Eot/InfParserLite.py                                     |  7 +-
 BaseTools/Source/Python/Eot/c.py                                                 |  3 +-
 BaseTools/Source/Python/GenFds/FdfParser.py                                      |  7 +-
 BaseTools/Source/Python/GenFds/GenFds.py                                         |  3 +-
 BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py                           |  3 +-
 BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py                     |  7 +-
 BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py                                   | 23 ++---
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py           | 15 +--
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py                   | 17 ++--
 BaseTools/Source/Python/TargetTool/TargetTool.py                                 | 23 ++---
 BaseTools/Source/Python/UPT/Library/ExpressionValidate.py                        |  3 +-
 BaseTools/Source/Python/UPT/Library/UniClassObject.py                            |  9 +-
 BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py                        | 51 +++++-----
 BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py                            |  5 +-
 BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py                     |  9 +-
 BaseTools/Source/Python/Workspace/DscBuildData.py                                |  5 +-
 BaseTools/Source/Python/Workspace/MetaFileParser.py                              |  3 +-
 BaseTools/Source/Python/build/build.py                                           |  3 +-
 BaseTools/Tests/TestTools.py                                                     |  5 +-
 BaseTools/Tests/TianoCompress.py                                                 |  5 +-
 BaseTools/gcc/mingw-gcc-build.py                                                 | 99 ++++++++++----------
 46 files changed, 354 insertions(+), 308 deletions(-)

diff --git a/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py b/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
index 69fd2d54413e..dd66c7111ac0 100755
--- a/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
+++ b/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
@@ -23,6 +23,7 @@
 #
 # ExceptionList if a tool takes an argument with a / add it to the exception list
 #
+from __future__ import print_function
 import sys
 import os
 import subprocess
@@ -86,7 +87,7 @@ if __name__ == "__main__":
      ret = main(sys.argv[2:])
 
   except:
-    print "exiting: exception from " + sys.argv[0]
+    print("exiting: exception from " + sys.argv[0])
     ret = 2
 
   sys.exit(ret)
diff --git a/BaseTools/Scripts/BinToPcd.py b/BaseTools/Scripts/BinToPcd.py
index b907d3e5e000..10b5043325cc 100644
--- a/BaseTools/Scripts/BinToPcd.py
+++ b/BaseTools/Scripts/BinToPcd.py
@@ -14,6 +14,7 @@
 '''
 BinToPcd
 '''
+from __future__ import print_function
 
 import sys
 import argparse
diff --git a/BaseTools/Scripts/FormatDosFiles.py b/BaseTools/Scripts/FormatDosFiles.py
index 2f2d4d532c76..3b16af5a4413 100644
--- a/BaseTools/Scripts/FormatDosFiles.py
+++ b/BaseTools/Scripts/FormatDosFiles.py
@@ -16,6 +16,7 @@
 #
 # Import Modules
 #
+from __future__ import print_function
 import argparse
 import os
 import os.path
diff --git a/BaseTools/Scripts/MemoryProfileSymbolGen.py b/BaseTools/Scripts/MemoryProfileSymbolGen.py
index 5709ad4641cb..0a41f9d83271 100644
--- a/BaseTools/Scripts/MemoryProfileSymbolGen.py
+++ b/BaseTools/Scripts/MemoryProfileSymbolGen.py
@@ -14,6 +14,7 @@
 #
 ##
 
+from __future__ import print_function
 import os
 import re
 import sys
@@ -58,10 +59,10 @@ class Symbols:
         try:
             nmCommand = "nm"
             nmLineOption = "-l"
-            print "parsing (debug) - " + pdbName
+            print("parsing (debug) - " + pdbName)
             os.system ('%s %s %s > nmDump.line.log' % (nmCommand, nmLineOption, pdbName))
         except :
-            print 'ERROR: nm command not available.  Please verify PATH'
+            print('ERROR: nm command not available.  Please verify PATH')
             return
 
         #
@@ -111,11 +112,11 @@ class Symbols:
             DIA2DumpCommand = "Dia2Dump.exe"
             #DIA2SymbolOption = "-p"
             DIA2LinesOption = "-l"
-            print "parsing (pdb) - " + pdbName
+            print("parsing (pdb) - " + pdbName)
             #os.system ('%s %s %s > DIA2Dump.symbol.log' % (DIA2DumpCommand, DIA2SymbolOption, pdbName))
             os.system ('%s %s %s > DIA2Dump.line.log' % (DIA2DumpCommand, DIA2LinesOption, pdbName))
         except :
-            print 'ERROR: DIA2Dump command not available.  Please verify PATH'
+            print('ERROR: DIA2Dump command not available.  Please verify PATH')
             return
 
         #
@@ -254,12 +255,12 @@ def main():
     try :
         file = open(Options.inputfilename)
     except Exception:
-        print "fail to open " + Options.inputfilename
+        print("fail to open " + Options.inputfilename)
         return 1
     try :
         newfile = open(Options.outputfilename, "w")
     except Exception:
-        print "fail to open " + Options.outputfilename
+        print("fail to open " + Options.outputfilename)
         return 1
 
     try:
diff --git a/BaseTools/Scripts/PackageDocumentTools/packagedoc_cli.py b/BaseTools/Scripts/PackageDocumentTools/packagedoc_cli.py
index 557ffa4505e4..4deeee01a5e8 100644
--- a/BaseTools/Scripts/PackageDocumentTools/packagedoc_cli.py
+++ b/BaseTools/Scripts/PackageDocumentTools/packagedoc_cli.py
@@ -12,6 +12,7 @@
 # WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 #
 
+from __future__ import print_function
 import os, sys, logging, traceback, subprocess
 from optparse import OptionParser
 
@@ -62,7 +63,7 @@ def parseCmdArgs():
     default = "C:\\Program Files\\doxygen\\bin\\doxygen.exe"
     if options.DoxygenPath is None:
         if os.path.exists(default):
-            print "Warning: Assume doxygen tool is installed at %s. If not, please specify via -x" % default
+            print("Warning: Assume doxygen tool is installed at %s. If not, please specify via -x" % default)
             options.DoxygenPath = default
         else:
             errors.append('- Please specify the path of doxygen tool installation via option -x! or install it in default path %s' % default)
@@ -80,7 +81,7 @@ def parseCmdArgs():
         if options.PackagePath is not None and os.path.exists(options.PackagePath):
             dirpath = os.path.dirname(options.PackagePath)
             default = os.path.join (dirpath, "Document")
-            print 'Warning: Assume document output at %s. If not, please specify via option -o' % default
+            print('Warning: Assume document output at %s. If not, please specify via option -o' % default)
             options.OutputPath = default
             if not os.path.exists(default):
                 try:
@@ -92,21 +93,21 @@ def parseCmdArgs():
 
     if options.Arch is None:
         options.Arch = 'ALL'
-        print "Warning: Assume arch is \"ALL\". If not, specify via -a"
+        print("Warning: Assume arch is \"ALL\". If not, specify via -a")
 
     if options.DocumentMode is None:
         options.DocumentMode = "HTML"
-        print "Warning: Assume document mode is \"HTML\". If not, specify via -m"
+        print("Warning: Assume document mode is \"HTML\". If not, specify via -m")
 
     if options.IncludeOnly is None:
         options.IncludeOnly = False
-        print "Warning: Assume generate package document for all package\'s source including publich interfaces and implementation libraries and modules."
+        print("Warning: Assume generate package document for all package\'s source including publich interfaces and implementation libraries and modules.")
 
     if options.DocumentMode.lower() == 'chm':
         default = "C:\\Program Files\\HTML Help Workshop\\hhc.exe"
         if options.HtmlWorkshopPath is None:
             if os.path.exists(default):
-                print 'Warning: Assume the installation path of Microsoft HTML Workshop is %s. If not, specify via option -c.' % default
+                print('Warning: Assume the installation path of Microsoft HTML Workshop is %s. If not, specify via option -c.' % default)
                 options.HtmlWorkshopPath = default
             else:
                 errors.append('- Please specify the installation path of Microsoft HTML Workshop via option -c!')
@@ -114,7 +115,7 @@ def parseCmdArgs():
             errors.append('- The installation path of Microsoft HTML Workshop %s does not exists. ' % options.HtmlWorkshopPath)
 
     if len(errors) != 0:
-        print '\n'
+        print('\n')
         parser.error('Fail to start due to following reasons: \n%s' %'\n'.join(errors))
     return (options.WorkspacePath, options.PackagePath, options.DoxygenPath, options.OutputPath,
             options.Arch, options.DocumentMode, options.IncludeOnly, options.HtmlWorkshopPath)
@@ -130,21 +131,21 @@ def createPackageObject(wsPath, pkgPath):
     return pkgObj
 
 def callbackLogMessage(msg, level):
-    print msg.strip()
+    print(msg.strip())
 
 def callbackCreateDoxygenProcess(doxPath, configPath):
     if sys.platform == 'win32':
         cmd = '"%s" %s' % (doxPath, configPath)
     else:
         cmd = '%s %s' % (doxPath, configPath)
-    print cmd
+    print(cmd)
     subprocess.call(cmd, shell=True)
 
 
 def DocumentFixup(outPath, arch):
     # find BASE_LIBRARY_JUMP_BUFFER structure reference page
 
-    print '\n    >>> Start fixup document \n'
+    print('\n    >>> Start fixup document \n')
 
     for root, dirs, files in os.walk(outPath):
         for dir in dirs:
@@ -172,10 +173,10 @@ def DocumentFixup(outPath, arch):
             if text.find('MdePkg/Include/Library/UefiApplicationEntryPoint.h File Reference') != -1:
                 FixPageUefiApplicationEntryPoint(fullpath, text)
 
-    print '    >>> Finish all document fixing up! \n'
+    print('    >>> Finish all document fixing up! \n')
 
 def FixPageBaseLib(path, text):
-    print '    >>> Fixup BaseLib file page at file %s \n' % path
+    print('    >>> Fixup BaseLib file page at file %s \n' % path)
     lines = text.split('\n')
     lastBaseJumpIndex = -1
     lastIdtGateDescriptor = -1
@@ -211,10 +212,10 @@ def FixPageBaseLib(path, text):
     except:
         logging.getLogger().error("     <<< Fail to fixup file %s\n" % path)
         return
-    print "    <<< Finish to fixup file %s\n" % path
+    print("    <<< Finish to fixup file %s\n" % path)
 
 def FixPageIA32_IDT_GATE_DESCRIPTOR(path, text):
-    print '    >>> Fixup structure reference IA32_IDT_GATE_DESCRIPTOR at file %s \n' % path
+    print('    >>> Fixup structure reference IA32_IDT_GATE_DESCRIPTOR at file %s \n' % path)
     lines = text.split('\n')
     for index in range(len(lines) - 1, -1, -1):
         line = lines[index].strip()
@@ -229,10 +230,10 @@ def FixPageIA32_IDT_GATE_DESCRIPTOR(path, text):
     except:
         logging.getLogger().error("     <<< Fail to fixup file %s\n" % path)
         return
-    print "    <<< Finish to fixup file %s\n" % path
+    print("    <<< Finish to fixup file %s\n" % path)
 
 def FixPageBASE_LIBRARY_JUMP_BUFFER(path, text):
-    print '    >>> Fixup structure reference BASE_LIBRARY_JUMP_BUFFER at file %s \n' % path
+    print('    >>> Fixup structure reference BASE_LIBRARY_JUMP_BUFFER at file %s \n' % path)
     lines = text.split('\n')
     bInDetail = True
     bNeedRemove = False
@@ -266,10 +267,10 @@ def FixPageBASE_LIBRARY_JUMP_BUFFER(path, text):
     except:
         logging.getLogger().error("     <<< Fail to fixup file %s" % path)
         return
-    print "    <<< Finish to fixup file %s\n" % path
+    print("    <<< Finish to fixup file %s\n" % path)
 
 def FixPageUefiDriverEntryPoint(path, text):
-    print '    >>> Fixup file reference MdePkg/Include/Library/UefiDriverEntryPoint.h at file %s \n' % path
+    print('    >>> Fixup file reference MdePkg/Include/Library/UefiDriverEntryPoint.h at file %s \n' % path)
     lines = text.split('\n')
     bInModuleEntry = False
     bInEfiMain     = False
@@ -318,11 +319,11 @@ def FixPageUefiDriverEntryPoint(path, text):
     except:
         logging.getLogger().error("     <<< Fail to fixup file %s" % path)
         return
-    print "    <<< Finish to fixup file %s\n" % path
+    print("    <<< Finish to fixup file %s\n" % path)
 
 
 def FixPageUefiApplicationEntryPoint(path, text):
-    print '    >>> Fixup file reference MdePkg/Include/Library/UefiApplicationEntryPoint.h at file %s \n' % path
+    print('    >>> Fixup file reference MdePkg/Include/Library/UefiApplicationEntryPoint.h at file %s \n' % path)
     lines = text.split('\n')
     bInModuleEntry = False
     bInEfiMain     = False
@@ -371,7 +372,7 @@ def FixPageUefiApplicationEntryPoint(path, text):
     except:
         logging.getLogger().error("     <<< Fail to fixup file %s" % path)
         return
-    print "    <<< Finish to fixup file %s\n" % path
+    print("    <<< Finish to fixup file %s\n" % path)
 
 if __name__ == '__main__':
     wspath, pkgpath, doxpath, outpath, archtag, docmode, isinc, hwpath = parseCmdArgs()
@@ -424,6 +425,6 @@ if __name__ == '__main__':
         else:
             cmd = '%s %s' % (hwpath, indexpath)
         subprocess.call(cmd)
-        print '\nFinish to generate package document! Please open %s for review' % os.path.join(outpath, 'html', 'index.chm')
+        print('\nFinish to generate package document! Please open %s for review' % os.path.join(outpath, 'html', 'index.chm'))
     else:
-        print '\nFinish to generate package document! Please open %s for review' % os.path.join(outpath, 'html', 'index.html')
+        print('\nFinish to generate package document! Please open %s for review' % os.path.join(outpath, 'html', 'index.html'))
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/doxygen.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/doxygen.py
index a177590af597..fe2ba1d8a842 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/doxygen.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/doxygen.py
@@ -11,6 +11,7 @@
 # WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 #
 
+from __future__ import print_function
 import os
 
 from message import *
@@ -446,4 +447,4 @@ if __name__== '__main__':
     p.AddPage(Page('PCD', 'pcds'))
 
     df.Generate()
-    print df
+    print(df)
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/efibinary.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/efibinary.py
index 9db16a63c07a..290287b817e7 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/efibinary.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/efibinary.py
@@ -11,6 +11,7 @@
 # WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 #
 
+from __future__ import print_function
 import array
 import uuid
 import re
@@ -250,12 +251,12 @@ class EfiFirmwareVolumeHeader(BinaryItem):
         return list2int(self._arr.tolist()[48:50])
 
     def Dump(self):
-        print 'Signature: %s' % self.GetSigunature()
-        print 'Attribute: 0x%X' % self.GetAttribute()
-        print 'Header Length: 0x%X' % self.GetHeaderLength()
-        print 'File system Guid: ', self.GetFileSystemGuid()
-        print 'Revision: 0x%X' % self.GetRevision()
-        print 'FvLength: 0x%X' % self.GetFvLength()
+        print('Signature: %s' % self.GetSigunature())
+        print('Attribute: 0x%X' % self.GetAttribute())
+        print('Header Length: 0x%X' % self.GetHeaderLength())
+        print('File system Guid: ', self.GetFileSystemGuid())
+        print('Revision: 0x%X' % self.GetRevision())
+        print('FvLength: 0x%X' % self.GetFvLength())
 
     def GetFileSystemGuid(self):
         list = self._arr.tolist()
@@ -348,7 +349,7 @@ class EfiFfs(object):
                 line.append('0x%X' % int(item))
                 count += 1
             else:
-                print ' '.join(line)
+                print(' '.join(line))
                 count = 0
                 line = []
                 line.append('0x%X' % int(item))
@@ -445,11 +446,11 @@ class EfiFfsHeader(BinaryItem):
         return 'Unknown Ffs State'
 
     def Dump(self):
-        print "FFS name: ", self.GetNameGuid()
-        print "FFS type: ", self.GetType()
-        print "FFS attr: 0x%X" % self.GetAttributes()
-        print "FFS size: 0x%X" % self.GetFfsSize()
-        print "FFS state: 0x%X" % self.GetState()
+        print("FFS name: ", self.GetNameGuid())
+        print("FFS type: ", self.GetType())
+        print("FFS attr: 0x%X" % self.GetAttributes())
+        print("FFS size: 0x%X" % self.GetFfsSize())
+        print("FFS state: 0x%X" % self.GetState())
 
     def GetRawData(self):
         return self._arr.tolist()
@@ -528,8 +529,8 @@ class EfiSectionHeader(BinaryItem):
         return self.section_type_map[type]
 
     def Dump(self):
-        print 'size = 0x%X' % self.GetSectionSize()
-        print 'type = 0x%X' % self.GetType()
+        print('size = 0x%X' % self.GetSectionSize())
+        print('type = 0x%X' % self.GetType())
 
 
 
diff --git a/BaseTools/Scripts/SmiHandlerProfileSymbolGen.py b/BaseTools/Scripts/SmiHandlerProfileSymbolGen.py
index 26c092410386..8ad5d471d052 100644
--- a/BaseTools/Scripts/SmiHandlerProfileSymbolGen.py
+++ b/BaseTools/Scripts/SmiHandlerProfileSymbolGen.py
@@ -14,6 +14,7 @@
 #
 ##
 
+from __future__ import print_function
 import os
 import re
 import sys
@@ -61,10 +62,10 @@ class Symbols:
         try:
             nmCommand = "nm"
             nmLineOption = "-l"
-            print "parsing (debug) - " + pdbName
+            print("parsing (debug) - " + pdbName)
             os.system ('%s %s %s > nmDump.line.log' % (nmCommand, nmLineOption, pdbName))
         except :
-            print 'ERROR: nm command not available.  Please verify PATH'
+            print('ERROR: nm command not available.  Please verify PATH')
             return
 
         #
@@ -103,11 +104,11 @@ class Symbols:
             DIA2DumpCommand = "Dia2Dump.exe"
             #DIA2SymbolOption = "-p"
             DIA2LinesOption = "-l"
-            print "parsing (pdb) - " + pdbName
+            print("parsing (pdb) - " + pdbName)
             #os.system ('%s %s %s > DIA2Dump.symbol.log' % (DIA2DumpCommand, DIA2SymbolOption, pdbName))
             os.system ('%s %s %s > DIA2Dump.line.log' % (DIA2DumpCommand, DIA2LinesOption, pdbName))
         except :
-            print 'ERROR: DIA2Dump command not available.  Please verify PATH'
+            print('ERROR: DIA2Dump command not available.  Please verify PATH')
             return
 
         #
@@ -235,14 +236,14 @@ def main():
     try :
         DOMTree = xml.dom.minidom.parse(Options.inputfilename)
     except Exception:
-        print "fail to open input " + Options.inputfilename
+        print("fail to open input " + Options.inputfilename)
         return 1
 
     if Options.guidreffilename is not None:
         try :
             guidreffile = open(Options.guidreffilename)
         except Exception:
-            print "fail to open guidref" + Options.guidreffilename
+            print("fail to open guidref" + Options.guidreffilename)
             return 1
         genGuidString(guidreffile)
         guidreffile.close()
@@ -277,7 +278,7 @@ def main():
 
                     Handler = smiHandler.getElementsByTagName("Handler")
                     RVA = Handler[0].getElementsByTagName("RVA")
-                    print "    Handler RVA: %s" % RVA[0].childNodes[0].data
+                    print("    Handler RVA: %s" % RVA[0].childNodes[0].data)
 
                     if (len(RVA)) >= 1:
                         rvaName = RVA[0].childNodes[0].data
@@ -289,7 +290,7 @@ def main():
 
                     Caller = smiHandler.getElementsByTagName("Caller")
                     RVA = Caller[0].getElementsByTagName("RVA")
-                    print "    Caller RVA: %s" % RVA[0].childNodes[0].data
+                    print("    Caller RVA: %s" % RVA[0].childNodes[0].data)
 
                     if (len(RVA)) >= 1:
                         rvaName = RVA[0].childNodes[0].data
@@ -302,7 +303,7 @@ def main():
     try :
         newfile = open(Options.outputfilename, "w")
     except Exception:
-        print "fail to open output" + Options.outputfilename
+        print("fail to open output" + Options.outputfilename)
         return 1
 
     newfile.write(DOMTree.toprettyxml(indent = "\t", newl = "\n", encoding = "utf-8"))
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index 72d801df8fd5..e268c4c0a1cf 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -15,6 +15,7 @@
 
 ## Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import re
 import os.path as path
@@ -688,7 +689,7 @@ class WorkspaceAutoGen(AutoGen):
             os.makedirs(self.BuildDir)
         with open(os.path.join(self.BuildDir, 'AutoGen'), 'w+') as file:
             for f in AllWorkSpaceMetaFiles:
-                print >> file, f
+                print(f, file=file)
         return True
 
     def _GenPkgLevelHash(self, Pkg):
@@ -4362,7 +4363,7 @@ class ModuleAutoGen(AutoGen):
             os.remove (self.GetTimeStampPath())
         with open(self.GetTimeStampPath(), 'w+') as file:
             for f in FileSet:
-                print >> file, f
+                print(f, file=file)
 
     Module          = property(_GetModule)
     Name            = property(_GetBaseName)
diff --git a/BaseTools/Source/Python/AutoGen/BuildEngine.py b/BaseTools/Source/Python/AutoGen/BuildEngine.py
index ad1919442e6e..d4daa3093761 100644
--- a/BaseTools/Source/Python/AutoGen/BuildEngine.py
+++ b/BaseTools/Source/Python/AutoGen/BuildEngine.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import re
 import copy
@@ -597,19 +598,19 @@ if __name__ == '__main__':
     EdkLogger.Initialize()
     if len(sys.argv) > 1:
         Br = BuildRule(sys.argv[1])
-        print str(Br[".c", SUP_MODULE_DXE_DRIVER, "IA32", "MSFT"][1])
-        print
-        print str(Br[".c", SUP_MODULE_DXE_DRIVER, "IA32", "INTEL"][1])
-        print
-        print str(Br[".c", SUP_MODULE_DXE_DRIVER, "IA32", "GCC"][1])
-        print
-        print str(Br[".ac", "ACPI_TABLE", "IA32", "MSFT"][1])
-        print
-        print str(Br[".h", "ACPI_TABLE", "IA32", "INTEL"][1])
-        print
-        print str(Br[".ac", "ACPI_TABLE", "IA32", "MSFT"][1])
-        print
-        print str(Br[".s", SUP_MODULE_SEC, "IPF", "COMMON"][1])
-        print
-        print str(Br[".s", SUP_MODULE_SEC][1])
+        print(str(Br[".c", SUP_MODULE_DXE_DRIVER, "IA32", "MSFT"][1]))
+        print()
+        print(str(Br[".c", SUP_MODULE_DXE_DRIVER, "IA32", "INTEL"][1]))
+        print()
+        print(str(Br[".c", SUP_MODULE_DXE_DRIVER, "IA32", "GCC"][1]))
+        print()
+        print(str(Br[".ac", "ACPI_TABLE", "IA32", "MSFT"][1]))
+        print()
+        print(str(Br[".h", "ACPI_TABLE", "IA32", "INTEL"][1]))
+        print()
+        print(str(Br[".ac", "ACPI_TABLE", "IA32", "MSFT"][1]))
+        print()
+        print(str(Br[".s", SUP_MODULE_SEC, "IPF", "COMMON"][1]))
+        print()
+        print(str(Br[".s", SUP_MODULE_SEC][1]))
 
diff --git a/BaseTools/Source/Python/AutoGen/UniClassObject.py b/BaseTools/Source/Python/AutoGen/UniClassObject.py
index 06cf3e7d5162..3a931c6f2766 100644
--- a/BaseTools/Source/Python/AutoGen/UniClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/UniClassObject.py
@@ -16,6 +16,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os, codecs, re
 import distutils.util
 import Common.EdkLogger as EdkLogger
@@ -684,12 +685,12 @@ class UniFileClassObject(object):
     # Show the instance itself
     #
     def ShowMe(self):
-        print self.LanguageDef
+        print(self.LanguageDef)
         #print self.OrderedStringList
         for Item in self.OrderedStringList:
-            print Item
+            print(Item)
             for Member in self.OrderedStringList[Item]:
-                print str(Member)
+                print(str(Member))
 
 # This acts like the main() function for the script, unless it is 'import'ed into another
 # script.
diff --git a/BaseTools/Source/Python/BPDG/BPDG.py b/BaseTools/Source/Python/BPDG/BPDG.py
index 6c8f89f5d12b..86c44abb67a6 100644
--- a/BaseTools/Source/Python/BPDG/BPDG.py
+++ b/BaseTools/Source/Python/BPDG/BPDG.py
@@ -20,6 +20,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import sys
 import encodings.ascii
@@ -132,7 +133,7 @@ def MyOptionParser():
 #
 def StartBpdg(InputFileName, MapFileName, VpdFileName, Force):
     if os.path.exists(VpdFileName) and not Force:
-        print "\nFile %s already exist, Overwrite(Yes/No)?[Y]: " % VpdFileName
+        print("\nFile %s already exist, Overwrite(Yes/No)?[Y]: " % VpdFileName)
         choice = sys.stdin.readline()
         if choice.strip().lower() not in ['y', 'yes', '']:
             return
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index 7b04dcdb36cc..c63030a16e6e 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -12,6 +12,7 @@
 
 ## Import Modules
 #
+from __future__ import print_function
 from Common.GlobalData import *
 from CommonDataClass.Exceptions import BadExpression
 from CommonDataClass.Exceptions import WrnExpression
@@ -1028,10 +1029,10 @@ if __name__ == '__main__':
         if input in 'qQ':
             break
         try:
-            print ValueExpression(input)(True)
-            print ValueExpression(input)(False)
+            print(ValueExpression(input)(True))
+            print(ValueExpression(input)(False))
         except WrnExpression as Ex:
-            print Ex.result
-            print str(Ex)
+            print(Ex.result)
+            print(str(Ex))
         except Exception as Ex:
-            print str(Ex)
+            print(str(Ex))
diff --git a/BaseTools/Source/Python/Common/RangeExpression.py b/BaseTools/Source/Python/Common/RangeExpression.py
index b6f99447057c..4c29bc9ee4bd 100644
--- a/BaseTools/Source/Python/Common/RangeExpression.py
+++ b/BaseTools/Source/Python/Common/RangeExpression.py
@@ -12,6 +12,7 @@
 
 # # Import Modules
 #
+from __future__ import print_function
 from Common.GlobalData import *
 from CommonDataClass.Exceptions import BadExpression
 from CommonDataClass.Exceptions import WrnExpression
@@ -85,11 +86,11 @@ class RangeContainer(object):
         self.__clean__()
         
     def dump(self):
-        print "----------------------"
+        print("----------------------")
         rangelist = ""
         for object in self.rangelist:
             rangelist = rangelist + "[%d , %d]" % (object.start, object.end)
-        print rangelist
+        print(rangelist)
         
         
 class XOROperatorObject(object):   
diff --git a/BaseTools/Source/Python/Common/TargetTxtClassObject.py b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
index f8459c892e36..8ba8dd31a8c5 100644
--- a/BaseTools/Source/Python/Common/TargetTxtClassObject.py
+++ b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import EdkLogger
 import DataType
@@ -158,6 +159,6 @@ def TargetTxtDict(ConfDir):
 if __name__ == '__main__':
     pass
     Target = TargetTxtDict(os.getenv("WORKSPACE"))
-    print Target.TargetTxtDictionary[DataType.TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER]
-    print Target.TargetTxtDictionary[DataType.TAB_TAT_DEFINES_TARGET]
-    print Target.TargetTxtDictionary
+    print(Target.TargetTxtDictionary[DataType.TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER])
+    print(Target.TargetTxtDictionary[DataType.TAB_TAT_DEFINES_TARGET])
+    print(Target.TargetTxtDictionary)
diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index 8ff544ed769d..09b8196faf07 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -15,6 +15,7 @@
 # THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS" BASIS,
 # WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import re
 import Common.EdkLogger as EdkLogger
@@ -248,7 +249,7 @@ def CallExtenalBPDGTool(ToolPath, VpdFileName):
     except Exception as X:
         EdkLogger.error("BPDG", BuildToolError.COMMAND_FAILURE, ExtraData=str(X))
     (out, error) = PopenObject.communicate()
-    print out
+    print(out)
     while PopenObject.returncode is None :
         PopenObject.wait()
     
diff --git a/BaseTools/Source/Python/Ecc/CParser.py b/BaseTools/Source/Python/Ecc/CParser.py
index ddc6cbd506aa..d5fd3a37a167 100644
--- a/BaseTools/Source/Python/Ecc/CParser.py
+++ b/BaseTools/Source/Python/Ecc/CParser.py
@@ -1,5 +1,6 @@
 # $ANTLR 3.0.1 C.g 2010-02-23 09:58:53
 
+from __future__ import print_function
 from antlr3 import *
 from antlr3.compat import set, frozenset
          
@@ -102,8 +103,8 @@ class CParser(Parser):
         self.postfix_expression_stack = []
 
     def printTokenInfo(self, line, offset, tokenText):
-    	print str(line)+ ',' + str(offset) + ':' + str(tokenText)
-        
+        print(str(line)+ ',' + str(offset) + ':' + str(tokenText))
+
     def StorePredicateExpression(self, StartLine, StartOffset, EndLine, EndOffset, Text):
     	PredExp = CodeFragment.PredicateExpression(Text, (StartLine, StartOffset), (EndLine, EndOffset))
     	FileProfile.PredicateExpressionList.append(PredExp)
diff --git a/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py b/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py
index ffa51de7c1bf..2efae2c7c1de 100644
--- a/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py
+++ b/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py
@@ -16,6 +16,7 @@
 # Import Modules
 #
 
+from __future__ import print_function
 import re
 import Common.LongFilePathOs as os
 import sys
@@ -533,58 +534,58 @@ class CodeFragmentCollector:
         
     def PrintFragments(self):
         
-        print '################# ' + self.FileName + '#####################'
+        print('################# ' + self.FileName + '#####################')
         
-        print '/****************************************/'
-        print '/*************** COMMENTS ***************/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/*************** COMMENTS ***************/')
+        print('/****************************************/')
         for comment in FileProfile.CommentList:
-            print str(comment.StartPos) + comment.Content
+            print(str(comment.StartPos) + comment.Content)
         
-        print '/****************************************/'
-        print '/********* PREPROCESS DIRECTIVES ********/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/********* PREPROCESS DIRECTIVES ********/')
+        print('/****************************************/')
         for pp in FileProfile.PPDirectiveList:
-            print str(pp.StartPos) + pp.Content
+            print(str(pp.StartPos) + pp.Content)
         
-        print '/****************************************/'
-        print '/********* VARIABLE DECLARATIONS ********/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/********* VARIABLE DECLARATIONS ********/')
+        print('/****************************************/')
         for var in FileProfile.VariableDeclarationList:
-            print str(var.StartPos) + var.Modifier + ' '+ var.Declarator
+            print(str(var.StartPos) + var.Modifier + ' '+ var.Declarator)
             
-        print '/****************************************/'
-        print '/********* FUNCTION DEFINITIONS *********/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/********* FUNCTION DEFINITIONS *********/')
+        print('/****************************************/')
         for func in FileProfile.FunctionDefinitionList:
-            print str(func.StartPos) + func.Modifier + ' '+ func.Declarator + ' ' + str(func.NamePos)
+            print(str(func.StartPos) + func.Modifier + ' '+ func.Declarator + ' ' + str(func.NamePos))
             
-        print '/****************************************/'
-        print '/************ ENUMERATIONS **************/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/************ ENUMERATIONS **************/')
+        print('/****************************************/')
         for enum in FileProfile.EnumerationDefinitionList:
-            print str(enum.StartPos) + enum.Content
+            print(str(enum.StartPos) + enum.Content)
         
-        print '/****************************************/'
-        print '/*********** STRUCTS/UNIONS *************/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/*********** STRUCTS/UNIONS *************/')
+        print('/****************************************/')
         for su in FileProfile.StructUnionDefinitionList:
-            print str(su.StartPos) + su.Content
+            print(str(su.StartPos) + su.Content)
             
-        print '/****************************************/'
-        print '/********* PREDICATE EXPRESSIONS ********/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/********* PREDICATE EXPRESSIONS ********/')
+        print('/****************************************/')
         for predexp in FileProfile.PredicateExpressionList:
-            print str(predexp.StartPos) + predexp.Content
+            print(str(predexp.StartPos) + predexp.Content)
         
-        print '/****************************************/'    
-        print '/************** TYPEDEFS ****************/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/************** TYPEDEFS ****************/')
+        print('/****************************************/')
         for typedef in FileProfile.TypedefDefinitionList:
-            print str(typedef.StartPos) + typedef.ToType
+            print(str(typedef.StartPos) + typedef.ToType)
         
 if __name__ == "__main__":
     
     collector = CodeFragmentCollector(sys.argv[1])
     collector.PreprocessFile()
-    print "For Test."
+    print("For Test.")
diff --git a/BaseTools/Source/Python/Ecc/Configuration.py b/BaseTools/Source/Python/Ecc/Configuration.py
index 217b60f4f319..4711bbd54fdc 100644
--- a/BaseTools/Source/Python/Ecc/Configuration.py
+++ b/BaseTools/Source/Python/Ecc/Configuration.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import Common.EdkLogger as EdkLogger
 from Common.DataType import *
@@ -419,9 +420,9 @@ class Configuration(object):
                 self.__dict__[_ConfigFileToInternalTranslation[List[0]]] = List[1]
 
     def ShowMe(self):
-        print self.Filename
+        print(self.Filename)
         for Key in self.__dict__.keys():
-            print Key, '=', self.__dict__[Key]
+            print(Key, '=', self.__dict__[Key])
 
 #
 # test that our dict and out class still match in contents.
diff --git a/BaseTools/Source/Python/Ecc/Exception.py b/BaseTools/Source/Python/Ecc/Exception.py
index b0882afa6289..bde41c3a4b57 100644
--- a/BaseTools/Source/Python/Ecc/Exception.py
+++ b/BaseTools/Source/Python/Ecc/Exception.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 from Xml.XmlRoutines import *
 import Common.LongFilePathOs as os
 
@@ -84,4 +85,4 @@ class ExceptionCheck(object):
 #
 if __name__ == '__main__':
     El = ExceptionCheck('C:\\Hess\\Project\\BuildTool\\src\\Ecc\\exception.xml')
-    print El.ExceptionList
+    print(El.ExceptionList)
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
index fc65e9a2bd3c..a056c3759fb1 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 
 import Common.EdkLogger as EdkLogger
@@ -99,7 +100,7 @@ class Table(object):
         try:
             self.Cur.execute(SqlCommand)
         except Exception as e:
-            print "An error occurred when Drop a table:", e.args[0]
+            print("An error occurred when Drop a table:", e.args[0])
 
     ## Get count
     #
diff --git a/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py b/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
index d5fb80fcf982..811106133cb4 100644
--- a/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
+++ b/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
@@ -15,6 +15,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import xml.dom.minidom
 from Common.LongFilePathSupport import OpenLongFilePath as open
 
@@ -215,7 +216,7 @@ def XmlParseFile(FileName):
         XmlFile.close()
         return Dom
     except Exception as X:
-        print X
+        print(X)
         return ""
 
 # This acts like the main() function for the script, unless it is 'import'ed
@@ -225,5 +226,5 @@ if __name__ == '__main__':
     A = CreateXmlElement('AAA', 'CCC',  [['AAA', '111'], ['BBB', '222']], [['A', '1'], ['B', '2']])
     B = CreateXmlElement('ZZZ', 'CCC',  [['XXX', '111'], ['YYY', '222']], [['A', '1'], ['B', '2']])
     C = CreateXmlList('DDD', 'EEE', [A, B], ['FFF', 'GGG'])
-    print C.toprettyxml(indent = " ")
+    print(C.toprettyxml(indent = " "))
     pass
diff --git a/BaseTools/Source/Python/Ecc/c.py b/BaseTools/Source/Python/Ecc/c.py
index 99b22725e6ba..e2a5cc8487fa 100644
--- a/BaseTools/Source/Python/Ecc/c.py
+++ b/BaseTools/Source/Python/Ecc/c.py
@@ -11,6 +11,7 @@
 # WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 #
 
+from __future__ import print_function
 import sys
 import Common.LongFilePathOs as os
 import re
@@ -2285,7 +2286,7 @@ def CheckDoxygenTripleForwardSlash(FullFileName):
         for Result in ResultSet:
             CommentSet.append(Result)
     except:
-        print 'Unrecognized chars in comment of file %s', FullFileName
+        print('Unrecognized chars in comment of file %s', FullFileName)
 
 
     for Result in CommentSet:
@@ -2438,7 +2439,7 @@ def CheckFuncHeaderDoxygenComments(FullFileName):
         for Result in ResultSet:
             CommentSet.append(Result)
     except:
-        print 'Unrecognized chars in comment of file %s', FullFileName
+        print('Unrecognized chars in comment of file %s', FullFileName)
 
     # Func Decl check
     SqlStatement = """ select Modifier, Name, StartLine, ID, Value
@@ -2469,7 +2470,7 @@ def CheckFuncHeaderDoxygenComments(FullFileName):
         for Result in ResultSet:
             CommentSet.append(Result)
     except:
-        print 'Unrecognized chars in comment of file %s', FullFileName
+        print('Unrecognized chars in comment of file %s', FullFileName)
 
     SqlStatement = """ select Modifier, Header, StartLine, ID, Name
                        from Function
@@ -2634,9 +2635,9 @@ if __name__ == '__main__':
     try:
         test_file = sys.argv[1]
     except IndexError as v:
-        print "Usage: %s filename" % sys.argv[0]
+        print("Usage: %s filename" % sys.argv[0])
         sys.exit(1)
     MsgList = CheckFuncHeaderDoxygenComments(test_file)
     for Msg in MsgList:
-        print Msg
-    print 'Done!'
+        print(Msg)
+    print('Done!')
diff --git a/BaseTools/Source/Python/Eot/CParser.py b/BaseTools/Source/Python/Eot/CParser.py
index ddc6cbd506aa..d5fd3a37a167 100644
--- a/BaseTools/Source/Python/Eot/CParser.py
+++ b/BaseTools/Source/Python/Eot/CParser.py
@@ -1,5 +1,6 @@
 # $ANTLR 3.0.1 C.g 2010-02-23 09:58:53
 
+from __future__ import print_function
 from antlr3 import *
 from antlr3.compat import set, frozenset
          
@@ -102,8 +103,8 @@ class CParser(Parser):
         self.postfix_expression_stack = []
 
     def printTokenInfo(self, line, offset, tokenText):
-    	print str(line)+ ',' + str(offset) + ':' + str(tokenText)
-        
+        print(str(line)+ ',' + str(offset) + ':' + str(tokenText))
+
     def StorePredicateExpression(self, StartLine, StartOffset, EndLine, EndOffset, Text):
     	PredExp = CodeFragment.PredicateExpression(Text, (StartLine, StartOffset), (EndLine, EndOffset))
     	FileProfile.PredicateExpressionList.append(PredExp)
diff --git a/BaseTools/Source/Python/Eot/CodeFragmentCollector.py b/BaseTools/Source/Python/Eot/CodeFragmentCollector.py
index 87f179206d84..1e30e2ce62e2 100644
--- a/BaseTools/Source/Python/Eot/CodeFragmentCollector.py
+++ b/BaseTools/Source/Python/Eot/CodeFragmentCollector.py
@@ -15,6 +15,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import re
 import Common.LongFilePathOs as os
 import sys
@@ -379,49 +380,49 @@ class CodeFragmentCollector:
     #
     def PrintFragments(self):
 
-        print '################# ' + self.FileName + '#####################'
+        print('################# ' + self.FileName + '#####################')
 
-        print '/****************************************/'
-        print '/*************** ASSIGNMENTS ***************/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/*************** ASSIGNMENTS ***************/')
+        print('/****************************************/')
         for asign in FileProfile.AssignmentExpressionList:
-            print str(asign.StartPos) + asign.Name + asign.Operator + asign.Value
+            print(str(asign.StartPos) + asign.Name + asign.Operator + asign.Value)
 
-        print '/****************************************/'
-        print '/********* PREPROCESS DIRECTIVES ********/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/********* PREPROCESS DIRECTIVES ********/')
+        print('/****************************************/')
         for pp in FileProfile.PPDirectiveList:
-            print str(pp.StartPos) + pp.Content
+            print(str(pp.StartPos) + pp.Content)
 
-        print '/****************************************/'
-        print '/********* VARIABLE DECLARATIONS ********/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/********* VARIABLE DECLARATIONS ********/')
+        print('/****************************************/')
         for var in FileProfile.VariableDeclarationList:
-            print str(var.StartPos) + var.Modifier + ' '+ var.Declarator
+            print(str(var.StartPos) + var.Modifier + ' '+ var.Declarator)
 
-        print '/****************************************/'
-        print '/********* FUNCTION DEFINITIONS *********/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/********* FUNCTION DEFINITIONS *********/')
+        print('/****************************************/')
         for func in FileProfile.FunctionDefinitionList:
-            print str(func.StartPos) + func.Modifier + ' '+ func.Declarator + ' ' + str(func.NamePos)
+            print(str(func.StartPos) + func.Modifier + ' '+ func.Declarator + ' ' + str(func.NamePos))
 
-        print '/****************************************/'
-        print '/************ ENUMERATIONS **************/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/************ ENUMERATIONS **************/')
+        print('/****************************************/')
         for enum in FileProfile.EnumerationDefinitionList:
-            print str(enum.StartPos) + enum.Content
+            print(str(enum.StartPos) + enum.Content)
 
-        print '/****************************************/'
-        print '/*********** STRUCTS/UNIONS *************/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/*********** STRUCTS/UNIONS *************/')
+        print('/****************************************/')
         for su in FileProfile.StructUnionDefinitionList:
-            print str(su.StartPos) + su.Content
+            print(str(su.StartPos) + su.Content)
 
-        print '/****************************************/'
-        print '/************** TYPEDEFS ****************/'
-        print '/****************************************/'
+        print('/****************************************/')
+        print('/************** TYPEDEFS ****************/')
+        print('/****************************************/')
         for typedef in FileProfile.TypedefDefinitionList:
-            print str(typedef.StartPos) + typedef.ToType
+            print(str(typedef.StartPos) + typedef.ToType)
 
 ##
 #
@@ -430,4 +431,4 @@ class CodeFragmentCollector:
 #
 if __name__ == "__main__":
 
-    print "For Test."
+    print("For Test.")
diff --git a/BaseTools/Source/Python/Eot/InfParserLite.py b/BaseTools/Source/Python/Eot/InfParserLite.py
index 584a95d6f3e4..24f0d50246e5 100644
--- a/BaseTools/Source/Python/Eot/InfParserLite.py
+++ b/BaseTools/Source/Python/Eot/InfParserLite.py
@@ -14,6 +14,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import Common.EdkLogger as EdkLogger
 from Common.DataType import *
@@ -164,8 +165,8 @@ if __name__ == '__main__':
     Db.InitDatabase()
     P = EdkInfParser(os.path.normpath("C:\Framework\Edk\Sample\Platform\Nt32\Dxe\PlatformBds\PlatformBds.inf"), Db, '', '')
     for Inf in P.Sources:
-        print Inf
+        print(Inf)
     for Item in P.Macros:
-        print Item, P.Macros[Item]
+        print(Item, P.Macros[Item])
 
-    Db.Close()
\ No newline at end of file
+    Db.Close()
diff --git a/BaseTools/Source/Python/Eot/c.py b/BaseTools/Source/Python/Eot/c.py
index 8199ce5ee73e..c70f62f393a9 100644
--- a/BaseTools/Source/Python/Eot/c.py
+++ b/BaseTools/Source/Python/Eot/c.py
@@ -15,6 +15,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import sys
 import Common.LongFilePathOs as os
 import re
@@ -384,4 +385,4 @@ if __name__ == '__main__':
     EdkLogger.SetLevel(EdkLogger.QUIET)
     CollectSourceCodeDataIntoDB(sys.argv[1])
 
-    print 'Done!'
+    print('Done!')
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index a99a5e7ac693..6b4f724f6d9c 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -16,6 +16,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import re
 
 import Fd
@@ -4777,7 +4778,7 @@ if __name__ == "__main__":
     try:
         test_file = sys.argv[1]
     except IndexError as v:
-        print "Usage: %s filename" % sys.argv[0]
+        print("Usage: %s filename" % sys.argv[0])
         sys.exit(1)
 
     parser = FdfParser(test_file)
@@ -4785,7 +4786,7 @@ if __name__ == "__main__":
         parser.ParseFile()
         parser.CycleReferenceCheck()
     except Warning as X:
-        print str(X)
+        print(str(X))
     else:
-        print "Success!"
+        print("Success!")
 
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index ba3950dacd8a..1552ab4ee3a8 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -15,6 +15,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 from optparse import OptionParser
 import sys
 import Common.LongFilePathOs as os
@@ -689,7 +690,7 @@ class GenFds :
         ModuleDict = BuildDb.BuildObject[DscFile, TAB_COMMON, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag].Modules
         for Key in ModuleDict:
             ModuleObj = BuildDb.BuildObject[Key, TAB_COMMON, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
-            print ModuleObj.BaseName + ' ' + ModuleObj.ModuleType
+            print(ModuleObj.BaseName + ' ' + ModuleObj.ModuleType)
 
     def GenerateGuidXRefFile(BuildDb, ArchList, FdfParserObj):
         GuidXRefFileName = os.path.join(GenFdsGlobalVariable.FvDir, "Guid.xref")
diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
index c1d656227609..73b52030d929 100644
--- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
+++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
@@ -15,6 +15,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import sys
 import subprocess
@@ -736,7 +737,7 @@ class GenFdsGlobalVariable:
             GenFdsGlobalVariable.InfLogger (out)
             GenFdsGlobalVariable.InfLogger (error)
             if PopenObject.returncode != 0:
-                print "###", cmd
+                print("###", cmd)
                 EdkLogger.error("GenFds", COMMAND_FAILURE, errorMess)
 
     def VerboseLogger (msg):
diff --git a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
index f40c8bd01b23..d7084fbe88da 100644
--- a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
+++ b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
@@ -17,6 +17,7 @@
 #
 
 #======================================  External Libraries ========================================
+from __future__ import print_function
 import optparse
 import Common.LongFilePathOs as os
 import re
@@ -216,7 +217,7 @@ if __name__ == '__main__':
     (options, args) = parser.parse_args()
 
     if options.mapfile is None or options.efifile is None:
-        print parser.get_usage()
+        print(parser.get_usage())
     elif os.path.exists(options.mapfile) and os.path.exists(options.efifile):
         list = parsePcdInfoFromMapFile(options.mapfile, options.efifile)
         if list is not None:
@@ -225,6 +226,6 @@ if __name__ == '__main__':
             else:
                 generatePcdTable(list, options.mapfile.replace('.map', '.BinaryPcdTable.txt'))
         else:
-            print 'Fail to generate Patch PCD Table based on map file and efi file'
+            print('Fail to generate Patch PCD Table based on map file and efi file')
     else:
-        print 'Fail to generate Patch PCD Table for fail to find map file or efi file!'
+        print('Fail to generate Patch PCD Table for fail to find map file or efi file!')
diff --git a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
index de8575676cac..4f79d0f82967 100644
--- a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
+++ b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
@@ -19,6 +19,7 @@
 '''
 Pkcs7Sign
 '''
+from __future__ import print_function
 
 import os
 import sys
@@ -113,14 +114,14 @@ if __name__ == '__main__':
   try:
     Process = subprocess.Popen('%s version' % (OpenSslCommand), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
   except:
-    print 'ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH'
+    print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
     sys.exit(1)
 
   Version = Process.communicate()
   if Process.returncode <> 0:
-    print 'ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH'
+    print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
     sys.exit(Process.returncode)
-  print Version[0]
+  print(Version[0])
 
   #
   # Read input file into a buffer and save input filename
@@ -134,7 +135,7 @@ if __name__ == '__main__':
   #
   OutputDir = os.path.dirname(args.OutputFile)
   if not os.path.exists(OutputDir):
-    print 'ERROR: The output path does not exist: %s' % OutputDir
+    print('ERROR: The output path does not exist: %s' % OutputDir)
     sys.exit(1)
   args.OutputFileName = args.OutputFile
 
@@ -170,7 +171,7 @@ if __name__ == '__main__':
         args.SignerPrivateCertFile = open(args.SignerPrivateCertFileName, 'rb')
         args.SignerPrivateCertFile.close()
       except:
-        print 'ERROR: test signer private cert file %s missing' % (args.SignerPrivateCertFileName)
+        print('ERROR: test signer private cert file %s missing' % (args.SignerPrivateCertFileName))
         sys.exit(1)
 
     #
@@ -196,7 +197,7 @@ if __name__ == '__main__':
         args.OtherPublicCertFile = open(args.OtherPublicCertFileName, 'rb')
         args.OtherPublicCertFile.close()
       except:
-        print 'ERROR: test other public cert file %s missing' % (args.OtherPublicCertFileName)
+        print('ERROR: test other public cert file %s missing' % (args.OtherPublicCertFileName))
         sys.exit(1)
 
     format = "%dsQ" % len(args.InputFileBuffer)
@@ -242,11 +243,11 @@ if __name__ == '__main__':
         args.TrustedPublicCertFile = open(args.TrustedPublicCertFileName, 'rb')
         args.TrustedPublicCertFile.close()
       except:
-        print 'ERROR: test trusted public cert file %s missing' % (args.TrustedPublicCertFileName)
+        print('ERROR: test trusted public cert file %s missing' % (args.TrustedPublicCertFileName))
         sys.exit(1)
 
     if not args.SignatureSizeStr:
-      print "ERROR: please use the option --signature-size to specify the size of the signature data!"
+      print("ERROR: please use the option --signature-size to specify the size of the signature data!")
       sys.exit(1)
     else:
       if args.SignatureSizeStr.upper().startswith('0X'):
@@ -254,10 +255,10 @@ if __name__ == '__main__':
       else:
         SignatureSize = (long)(args.SignatureSizeStr)
     if SignatureSize < 0:
-        print "ERROR: The value of option --signature-size can't be set to negative value!"
+        print("ERROR: The value of option --signature-size can't be set to negative value!")
         sys.exit(1)
     elif SignatureSize > len(args.InputFileBuffer):
-        print "ERROR: The value of option --signature-size is exceed the size of the input file !"
+        print("ERROR: The value of option --signature-size is exceed the size of the input file !")
         sys.exit(1)
 
     args.SignatureBuffer = args.InputFileBuffer[0:SignatureSize]
@@ -277,7 +278,7 @@ if __name__ == '__main__':
     Process = subprocess.Popen('%s smime -verify -inform DER -content %s -CAfile %s' % (OpenSslCommand, args.OutputFileName, args.TrustedPublicCertFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
     Process.communicate(input=args.SignatureBuffer)[0]
     if Process.returncode <> 0:
-      print 'ERROR: Verification failed'
+      print('ERROR: Verification failed')
       os.remove (args.OutputFileName)
       sys.exit(Process.returncode)
 
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
index 9711de8f5c2e..41bcaa0437c5 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
@@ -22,6 +22,7 @@
 '''
 Rsa2048Sha256GenerateKeys
 '''
+from __future__ import print_function
 
 import os
 import sys
@@ -75,14 +76,14 @@ if __name__ == '__main__':
   try:
     Process = subprocess.Popen('%s version' % (OpenSslCommand), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
   except:  
-    print 'ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH'
+    print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
     sys.exit(1)
     
   Version = Process.communicate()
   if Process.returncode <> 0:
-    print 'ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH'
+    print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
     sys.exit(Process.returncode)
-  print Version[0]
+  print(Version[0])
   
   args.PemFileName = []
   
@@ -103,7 +104,7 @@ if __name__ == '__main__':
       Process = subprocess.Popen('%s genrsa -out %s 2048' % (OpenSslCommand, Item.name), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
       Process.communicate()
       if Process.returncode <> 0:
-        print 'ERROR: RSA 2048 key generation failed'
+        print('ERROR: RSA 2048 key generation failed')
         sys.exit(Process.returncode)
       
   #
@@ -125,7 +126,7 @@ if __name__ == '__main__':
     Process = subprocess.Popen('%s rsa -in %s -modulus -noout' % (OpenSslCommand, Item), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
     PublicKeyHexString = Process.communicate()[0].split('=')[1].strip()
     if Process.returncode <> 0:
-      print 'ERROR: Unable to extract public key from private key'
+      print('ERROR: Unable to extract public key from private key')
       sys.exit(Process.returncode)
     PublicKey = ''
     for Index in range (0, len(PublicKeyHexString), 2):
@@ -138,7 +139,7 @@ if __name__ == '__main__':
     Process.stdin.write (PublicKey)
     PublicKeyHash = PublicKeyHash + Process.communicate()[0]
     if Process.returncode <> 0:
-      print 'ERROR: Unable to extract SHA 256 hash of public key'
+      print('ERROR: Unable to extract SHA 256 hash of public key')
       sys.exit(Process.returncode)
 
   #
@@ -171,4 +172,4 @@ if __name__ == '__main__':
   # If verbose is enabled display the public key in C structure format
   #
   if args.Verbose:
-    print 'PublicKeySha256 = ' + PublicKeyHashC    
+    print('PublicKeySha256 = ' + PublicKeyHashC)
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
index d36a14ffb775..2944b634fb7a 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
@@ -17,6 +17,7 @@
 '''
 Rsa2048Sha256Sign
 '''
+from __future__ import print_function
 
 import os
 import sys
@@ -96,14 +97,14 @@ if __name__ == '__main__':
   try:
     Process = subprocess.Popen('%s version' % (OpenSslCommand), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
   except:  
-    print 'ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH'
+    print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
     sys.exit(1)
     
   Version = Process.communicate()
   if Process.returncode <> 0:
-    print 'ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH'
+    print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
     sys.exit(Process.returncode)
-  print Version[0]
+  print(Version[0])
   
   #
   # Read input file into a buffer and save input filename
@@ -117,7 +118,7 @@ if __name__ == '__main__':
   #
   OutputDir = os.path.dirname(args.OutputFile)
   if not os.path.exists(OutputDir):
-    print 'ERROR: The output path does not exist: %s' % OutputDir
+    print('ERROR: The output path does not exist: %s' % OutputDir)
     sys.exit(1)
   args.OutputFileName = args.OutputFile
 
@@ -144,7 +145,7 @@ if __name__ == '__main__':
       args.PrivateKeyFile = open(args.PrivateKeyFileName, 'rb')
       args.PrivateKeyFile.close()
     except:
-      print 'ERROR: test signing private key file %s missing' % (args.PrivateKeyFileName)
+      print('ERROR: test signing private key file %s missing' % (args.PrivateKeyFileName))
       sys.exit(1)
 
   #
@@ -202,14 +203,14 @@ if __name__ == '__main__':
     # Verify that the Hash Type matches the expected SHA256 type
     #
     if uuid.UUID(bytes_le = Header.HashType) <> EFI_HASH_ALGORITHM_SHA256_GUID:
-      print 'ERROR: unsupport hash GUID'
+      print('ERROR: unsupport hash GUID')
       sys.exit(1)
 
     #
     # Verify the public key
     #
     if Header.PublicKey <> PublicKey:
-      print 'ERROR: Public key in input file does not match public key from private key file'
+      print('ERROR: Public key in input file does not match public key from private key file')
       sys.exit(1)
 
     FullInputFileBuffer = args.InputFileBuffer
@@ -228,7 +229,7 @@ if __name__ == '__main__':
     Process = subprocess.Popen('%s dgst -sha256 -prverify "%s" -signature %s' % (OpenSslCommand, args.PrivateKeyFileName, args.OutputFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
     Process.communicate(input=FullInputFileBuffer)
     if Process.returncode <> 0:
-      print 'ERROR: Verification failed'
+      print('ERROR: Verification failed')
       os.remove (args.OutputFileName)
       sys.exit(Process.returncode)
 
diff --git a/BaseTools/Source/Python/TargetTool/TargetTool.py b/BaseTools/Source/Python/TargetTool/TargetTool.py
index 9fb89549cc29..0d4a59198e7b 100644
--- a/BaseTools/Source/Python/TargetTool/TargetTool.py
+++ b/BaseTools/Source/Python/TargetTool/TargetTool.py
@@ -12,6 +12,7 @@
 #  WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 #
 
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import sys
 import traceback
@@ -32,7 +33,7 @@ class TargetTool():
         self.Arg       = args[0]
         self.FileName  = os.path.normpath(os.path.join(self.WorkSpace, 'Conf', 'target.txt'))
         if os.path.isfile(self.FileName) == False:
-            print "%s does not exist." % self.FileName
+            print("%s does not exist." % self.FileName)
             sys.exit(1)
         self.TargetTxtDictionary = {
             TAB_TAT_DEFINES_ACTIVE_PLATFORM                            : None,
@@ -83,14 +84,14 @@ class TargetTool():
         errMsg  = ''
         for Key in self.TargetTxtDictionary:
             if type(self.TargetTxtDictionary[Key]) == type([]):
-                print "%-30s = %s" % (Key, ''.join(elem + ' ' for elem in self.TargetTxtDictionary[Key]))
+                print("%-30s = %s" % (Key, ''.join(elem + ' ' for elem in self.TargetTxtDictionary[Key])))
             elif self.TargetTxtDictionary[Key] is None:
                 errMsg += "  Missing %s configuration information, please use TargetTool to set value!" % Key + os.linesep 
             else:
-                print "%-30s = %s" % (Key, self.TargetTxtDictionary[Key])
+                print("%-30s = %s" % (Key, self.TargetTxtDictionary[Key]))
         
         if errMsg != '':
-            print os.linesep + 'Warning:' + os.linesep + errMsg
+            print(os.linesep + 'Warning:' + os.linesep + errMsg)
             
     def RWFile(self, CommentCharacter, KeySplitCharacter, Num):
         try:
@@ -109,7 +110,7 @@ class TargetTool():
                             if Key not in existKeys:
                                 existKeys.append(Key)
                             else:
-                                print "Warning: Found duplicate key item in original configuration files!"
+                                print("Warning: Found duplicate key item in original configuration files!")
                                 
                             if Num == 0:
                                 Line = "%-30s = \n" % Key
@@ -120,7 +121,7 @@ class TargetTool():
                             fw.write(Line)
             for key in self.TargetTxtDictionary:
                 if key not in existKeys:
-                    print "Warning: %s does not exist in original configuration file" % key
+                    print("Warning: %s does not exist in original configuration file" % key)
                     Line = GetConfigureKeyValue(self, key)
                     if Line is None:
                         Line = "%-30s = " % key
@@ -223,25 +224,25 @@ if __name__ == '__main__':
     EdkLogger.Initialize()
     EdkLogger.SetLevel(EdkLogger.QUIET)
     if os.getenv('WORKSPACE') is None:
-        print "ERROR: WORKSPACE should be specified or edksetup script should be executed before run TargetTool"
+        print("ERROR: WORKSPACE should be specified or edksetup script should be executed before run TargetTool")
         sys.exit(1)
         
     (opt, args) = MyOptionParser()
     if len(args) != 1 or (args[0].lower() != 'print' and args[0].lower() != 'clean' and args[0].lower() != 'set'):
-        print "The number of args isn't 1 or the value of args is invalid."
+        print("The number of args isn't 1 or the value of args is invalid.")
         sys.exit(1)
     if opt.NUM is not None and opt.NUM < 1:
-        print "The MAX_CONCURRENT_THREAD_NUMBER must be larger than 0."
+        print("The MAX_CONCURRENT_THREAD_NUMBER must be larger than 0.")
         sys.exit(1)
     if opt.TARGET is not None and len(opt.TARGET) > 1:
         for elem in opt.TARGET:
             if elem == '0':
-                print "0 will clear the TARGET setting in target.txt and can't combine with other value."
+                print("0 will clear the TARGET setting in target.txt and can't combine with other value.")
                 sys.exit(1)
     if opt.TARGET_ARCH is not None and len(opt.TARGET_ARCH) > 1:
         for elem in opt.TARGET_ARCH:
             if elem == '0':
-                print "0 will clear the TARGET_ARCH setting in target.txt and can't combine with other value."
+                print("0 will clear the TARGET_ARCH setting in target.txt and can't combine with other value.")
                 sys.exit(1)
 
     try:
diff --git a/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py b/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py
index ca21e6995217..afa5b2407ec5 100644
--- a/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py
+++ b/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py
@@ -14,6 +14,7 @@
 '''
 ExpressionValidate
 '''
+from __future__ import print_function
 
 ##
 # Import Modules
@@ -566,7 +567,7 @@ def IsValidFeatureFlagExp(Token, Flag=False):
 
 if __name__ == '__main__':
 #    print IsValidRangeExpr('LT 9')
-    print _LogicalExpressionParser('gCrownBayTokenSpaceGuid.PcdPciDevice1BridgeAddressLE0').IsValidLogicalExpression()
+    print(_LogicalExpressionParser('gCrownBayTokenSpaceGuid.PcdPciDevice1BridgeAddressLE0').IsValidLogicalExpression())
 
 
     
diff --git a/BaseTools/Source/Python/UPT/Library/UniClassObject.py b/BaseTools/Source/Python/UPT/Library/UniClassObject.py
index 299cd871444b..a464cbf702f7 100644
--- a/BaseTools/Source/Python/UPT/Library/UniClassObject.py
+++ b/BaseTools/Source/Python/UPT/Library/UniClassObject.py
@@ -14,6 +14,7 @@
 """
 Collect all defined strings in multiple uni files
 """
+from __future__ import print_function
 
 ##
 # Import Modules
@@ -730,7 +731,7 @@ class UniFileClassObject(object):
                     EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
                 NewLines.append(Line)
             else:
-                print Line
+                print(Line)
                 EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
                     
         if StrName and not StrName.split()[1].startswith(u'STR_'):
@@ -1022,12 +1023,12 @@ class UniFileClassObject(object):
     # Show the instance itself
     #
     def ShowMe(self):
-        print self.LanguageDef
+        print(self.LanguageDef)
         #print self.OrderedStringList
         for Item in self.OrderedStringList:
-            print Item
+            print(Item)
             for Member in self.OrderedStringList[Item]:
-                print str(Member)
+                print(str(Member))
     
     #
     # Read content from '!include' UNI file 
diff --git a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
index 436dc90e6dd3..074aa311f31d 100644
--- a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
+++ b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
@@ -15,6 +15,7 @@
 '''
 DecPomAlignment
 '''
+from __future__ import print_function
 
 ##
 # Import Modules
@@ -902,47 +903,47 @@ class DecPomAlignment(PackageObject):
     # Print all members and their values of Package class
     #
     def ShowPackage(self):
-        print '\nName =', self.GetName()
-        print '\nBaseName =', self.GetBaseName()
-        print '\nVersion =', self.GetVersion() 
-        print '\nGuid =', self.GetGuid()
+        print('\nName =', self.GetName())
+        print('\nBaseName =', self.GetBaseName())
+        print('\nVersion =', self.GetVersion())
+        print('\nGuid =', self.GetGuid())
         
-        print '\nStandardIncludes = %d ' \
-            % len(self.GetStandardIncludeFileList()),
+        print('\nStandardIncludes = %d ' \
+            % len(self.GetStandardIncludeFileList()), end=' ')
         for Item in self.GetStandardIncludeFileList():
-            print Item.GetFilePath(), '  ', Item.GetSupArchList()
-        print '\nPackageIncludes = %d \n' \
-            % len(self.GetPackageIncludeFileList()),
+            print(Item.GetFilePath(), '  ', Item.GetSupArchList())
+        print('\nPackageIncludes = %d \n' \
+            % len(self.GetPackageIncludeFileList()), end=' ')
         for Item in self.GetPackageIncludeFileList():
-            print Item.GetFilePath(), '  ', Item.GetSupArchList()
+            print(Item.GetFilePath(), '  ', Item.GetSupArchList())
              
-        print '\nGuids =', self.GetGuidList()
+        print('\nGuids =', self.GetGuidList())
         for Item in self.GetGuidList():
-            print Item.GetCName(), Item.GetGuid(), Item.GetSupArchList()
-        print '\nProtocols =', self.GetProtocolList()
+            print(Item.GetCName(), Item.GetGuid(), Item.GetSupArchList())
+        print('\nProtocols =', self.GetProtocolList())
         for Item in self.GetProtocolList():
-            print Item.GetCName(), Item.GetGuid(), Item.GetSupArchList()
-        print '\nPpis =', self.GetPpiList()
+            print(Item.GetCName(), Item.GetGuid(), Item.GetSupArchList())
+        print('\nPpis =', self.GetPpiList())
         for Item in self.GetPpiList():
-            print Item.GetCName(), Item.GetGuid(), Item.GetSupArchList()
-        print '\nLibraryClasses =', self.GetLibraryClassList()
+            print(Item.GetCName(), Item.GetGuid(), Item.GetSupArchList())
+        print('\nLibraryClasses =', self.GetLibraryClassList())
         for Item in self.GetLibraryClassList():
-            print Item.GetLibraryClass(), Item.GetRecommendedInstance(), \
-            Item.GetSupArchList()
-        print '\nPcds =', self.GetPcdList()
+            print(Item.GetLibraryClass(), Item.GetRecommendedInstance(), \
+            Item.GetSupArchList())
+        print('\nPcds =', self.GetPcdList())
         for Item in self.GetPcdList():
-            print 'CName=', Item.GetCName(), 'TokenSpaceGuidCName=', \
+            print('CName=', Item.GetCName(), 'TokenSpaceGuidCName=', \
                 Item.GetTokenSpaceGuidCName(), \
                 'DefaultValue=', Item.GetDefaultValue(), \
                 'ValidUsage=', Item.GetValidUsage(), \
                 'SupArchList', Item.GetSupArchList(), \
-                'Token=', Item.GetToken(), 'DatumType=', Item.GetDatumType()
+                'Token=', Item.GetToken(), 'DatumType=', Item.GetDatumType())
  
         for Item in self.GetMiscFileList():
-            print Item.GetName()
+            print(Item.GetName())
             for FileObjectItem in Item.GetFileList():
-                print FileObjectItem.GetURI()
-        print '****************\n'
+                print(FileObjectItem.GetURI())
+        print('****************\n')
 
 ## GenPcdDeclaration
 #
diff --git a/BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py b/BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py
index 8b4ece2617a1..5f0abcafef27 100644
--- a/BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py
+++ b/BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py
@@ -11,6 +11,7 @@
 # THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS" BASIS,
 # WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 
+from __future__ import print_function
 import os
 import unittest
 
@@ -66,7 +67,7 @@ def TestTemplate(TestString, TestFunc):
         # Close file
         f.close()
     except:
-        print 'Can not create temporary file [%s]!' % Path
+        print('Can not create temporary file [%s]!' % Path)
         exit(-1)
 
     # Call test function to test
@@ -279,6 +280,6 @@ if __name__ == '__main__':
     unittest.FunctionTestCase(TestDecPcd).runTest()
     unittest.FunctionTestCase(TestDecUserExtension).runTest()
 
-    print 'All tests passed...'
+    print('All tests passed...')
 
 
diff --git a/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py b/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py
index f3b43ee0bc27..626f17426de7 100644
--- a/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py
+++ b/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py
@@ -11,6 +11,7 @@
 # THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS" BASIS,
 # WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 
+from __future__ import print_function
 import os
 #import Object.Parser.InfObject as InfObject
 from Object.Parser.InfCommonObject import CurrentLine
@@ -271,7 +272,7 @@ def PrepareTest(String):
                 TempFile  = open (FileName, "w")    
                 TempFile.close()
             except:
-                print "File Create Error"
+                print("File Create Error")
         CurrentLine = CurrentLine()
         CurrentLine.SetFileName("Test")
         CurrentLine.SetLineString(Item[0])
@@ -376,11 +377,11 @@ if __name__ == '__main__':
             try:
                 InfBinariesInstance.SetBinary(Ver = Ver, ArchList = ArchList)
             except:
-                print "Test Failed!"
+                print("Test Failed!")
                 AllPassedFlag = False
     
     if AllPassedFlag :
-        print 'All tests passed...'
+        print('All tests passed...')
     else:
-        print 'Some unit test failed!'
+        print('Some unit test failed!')
 
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index a001162e8e3b..7f289c103fb9 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -17,6 +17,7 @@
 #  This class is used to retrieve information stored in database and convert them
 # into PlatformBuildClassObject form for easier use for AutoGen.
 #
+from __future__ import print_function
 from Common.StringUtils import *
 from Common.DataType import *
 from Common.Misc import *
@@ -1373,7 +1374,7 @@ class DscBuildData(PlatformBuildClassObject):
             for (skuname,StoreName,PcdGuid,PcdName,PcdValue) in Str_Pcd_Values:
                 str_pcd_obj = S_pcd_set.get((PcdName, PcdGuid))
                 if str_pcd_obj is None:
-                    print PcdName, PcdGuid
+                    print(PcdName, PcdGuid)
                     raise
                 if str_pcd_obj.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],
                                         self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
@@ -1808,7 +1809,7 @@ class DscBuildData(PlatformBuildClassObject):
                         EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " %
                                         (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[FieldName][2]))
                     except:
-                        print "error"
+                        print("error")
                 try:
                     Value, ValueSize = ParseFieldValue (FieldList[FieldName][0])
                 except Exception:
diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTools/Source/Python/Workspace/MetaFileParser.py
index f1cfa73fd4f2..d5fbf6f095bf 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py
@@ -15,6 +15,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import re
 import time
@@ -1630,7 +1631,7 @@ class DscParser(MetaFileParser):
         try:
             self._ValueList[2] = '|'.join(ValList)
         except Exception:
-            print ValList
+            print(ValList)
 
     def __ProcessComponent(self):
         self._ValueList[0] = ReplaceMacro(self._ValueList[0], self._Macros)
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 4600c46be1be..416aa73549d1 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -16,6 +16,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import Common.LongFilePathOs as os
 import re
 import StringIO
@@ -2195,7 +2196,7 @@ class Build():
                     toolsFile = os.path.join(FvDir, 'GuidedSectionTools.txt')
                     toolsFile = open(toolsFile, 'wt')
                     for guidedSectionTool in guidAttribs:
-                        print >> toolsFile, ' '.join(guidedSectionTool)
+                        print(' '.join(guidedSectionTool), file=toolsFile)
                     toolsFile.close()
 
     ## Returns the full path of the tool.
diff --git a/BaseTools/Tests/TestTools.py b/BaseTools/Tests/TestTools.py
index 27afd79f2094..be7b4ad42856 100644
--- a/BaseTools/Tests/TestTools.py
+++ b/BaseTools/Tests/TestTools.py
@@ -1,3 +1,4 @@
+from __future__ import print_function
 ## @file
 # Utility functions and classes for BaseTools unit tests
 #
@@ -91,9 +92,9 @@ class BaseToolsTest(unittest.TestCase):
             os.remove(path)
 
     def DisplayBinaryData(self, description, data):
-        print description, '(base64 encoded):'
+        print(description, '(base64 encoded):')
         b64data = base64.b64encode(data)
-        print b64data
+        print(b64data)
 
     def DisplayFile(self, fileName):
         sys.stdout.write(self.ReadTmpFile(fileName))
diff --git a/BaseTools/Tests/TianoCompress.py b/BaseTools/Tests/TianoCompress.py
index e14136416211..f6a4a6ae9c5d 100644
--- a/BaseTools/Tests/TianoCompress.py
+++ b/BaseTools/Tests/TianoCompress.py
@@ -15,6 +15,7 @@
 ##
 # Import Modules
 #
+from __future__ import print_function
 import os
 import random
 import sys
@@ -52,8 +53,8 @@ class Tests(TestTools.BaseToolsTest):
         finish = self.ReadTmpFile('output2')
         startEqualsFinish = start == finish
         if not startEqualsFinish:
-            print
-            print 'Original data did not match decompress(compress(data))'
+            print()
+            print('Original data did not match decompress(compress(data))')
             self.DisplayBinaryData('original data', start)
             self.DisplayBinaryData('after compression', self.ReadTmpFile('output1'))
             self.DisplayBinaryData('after decomression', finish)
diff --git a/BaseTools/gcc/mingw-gcc-build.py b/BaseTools/gcc/mingw-gcc-build.py
index 858b4020ef9f..643fec58a457 100755
--- a/BaseTools/gcc/mingw-gcc-build.py
+++ b/BaseTools/gcc/mingw-gcc-build.py
@@ -17,6 +17,7 @@
 #
 
 
+from __future__ import print_function
 from optparse import OptionParser
 import os
 import shutil
@@ -34,7 +35,7 @@ if sys.version_info < (2, 5):
     #
     # This script (and edk2 BaseTools) require Python 2.5 or newer
     #
-    print 'Python version 2.5 or later is required.'
+    print('Python version 2.5 or later is required.')
     sys.exit(-1)
 
 #
@@ -146,37 +147,37 @@ class Config:
         if not self.options.skip_gcc:
             building.append('gcc')
         if len(building) == 0:
-            print "Nothing will be built!"
-            print
-            print "Please try using --help and then change the configuration."
+            print("Nothing will be built!")
+            print()
+            print("Please try using --help and then change the configuration.")
             return False
 
-        print "Current directory:"
-        print "   ", self.base_dir
-        print "Sources download/extraction:", self.Relative(self.src_dir)
-        print "Build directory            :", self.Relative(self.build_dir)
-        print "Prefix (install) directory :", self.Relative(self.prefix)
-        print "Create symlinks directory  :", self.Relative(self.symlinks)
-        print "Building                   :", ', '.join(building)
-        print
+        print("Current directory:")
+        print("   ", self.base_dir)
+        print("Sources download/extraction:", self.Relative(self.src_dir))
+        print("Build directory            :", self.Relative(self.build_dir))
+        print("Prefix (install) directory :", self.Relative(self.prefix))
+        print("Create symlinks directory  :", self.Relative(self.symlinks))
+        print("Building                   :", ', '.join(building))
+        print()
         answer = raw_input("Is this configuration ok? (default = no): ")
         if (answer.lower() not in ('y', 'yes')):
-            print
-            print "Please try using --help and then change the configuration."
+            print()
+            print("Please try using --help and then change the configuration.")
             return False
 
         if self.arch.lower() == 'ipf':
-            print
-            print 'Please note that the IPF compiler built by this script has'
-            print 'not yet been validated!'
-            print
+            print()
+            print('Please note that the IPF compiler built by this script has')
+            print('not yet been validated!')
+            print()
             answer = raw_input("Are you sure you want to build it? (default = no): ")
             if (answer.lower() not in ('y', 'yes')):
-                print
-                print "Please try using --help and then change the configuration."
+                print()
+                print("Please try using --help and then change the configuration.")
                 return False
 
-        print
+        print()
         return True
 
     def Relative(self, path):
@@ -275,7 +276,7 @@ class SourceFiles:
             wDots = (100 * received * blockSize) / fileSize / 10
             if wDots > self.dots:
                 for i in range(wDots - self.dots):
-                    print '.',
+                    print('.', end=' ')
                     sys.stdout.flush()
                     self.dots += 1
 
@@ -286,18 +287,18 @@ class SourceFiles:
                     self.dots = 0
                     local_file = os.path.join(self.config.src_dir, fdata['filename'])
                     url = fdata['url']
-                    print 'Downloading %s:' % fname, url
+                    print('Downloading %s:' % fname, url)
                     if retries > 0:
-                        print '(retry)',
+                        print('(retry)', end=' ')
                     sys.stdout.flush()
 
                     completed = False
                     if os.path.exists(local_file):
                         md5_pass = self.checkHash(fdata)
                         if md5_pass:
-                            print '[md5 match]',
+                            print('[md5 match]', end=' ')
                         else:
-                            print '[md5 mismatch]',
+                            print('[md5 mismatch]', end=' ')
                         sys.stdout.flush()
                         completed = md5_pass
 
@@ -313,32 +314,32 @@ class SourceFiles:
                     if not completed and os.path.exists(local_file):
                         md5_pass = self.checkHash(fdata)
                         if md5_pass:
-                            print '[md5 match]',
+                            print('[md5 match]', end=' ')
                         else:
-                            print '[md5 mismatch]',
+                            print('[md5 mismatch]', end=' ')
                         sys.stdout.flush()
                         completed = md5_pass
 
                     if completed:
-                        print '[done]'
+                        print('[done]')
                         break
                     else:
-                        print '[failed]'
-                        print '  Tried to retrieve', url
-                        print '  to', local_file
-                        print 'Possible fixes:'
-                        print '* If you are behind a web-proxy, try setting the',
-                        print 'http_proxy environment variable'
-                        print '* You can try to download this file separately',
-                        print 'and rerun this script'
+                        print('[failed]')
+                        print('  Tried to retrieve', url)
+                        print('  to', local_file)
+                        print('Possible fixes:')
+                        print('* If you are behind a web-proxy, try setting the', end=' ')
+                        print('http_proxy environment variable')
+                        print('* You can try to download this file separately', end=' ')
+                        print('and rerun this script')
                         raise Exception()
                 
                 except KeyboardInterrupt:
-                    print '[KeyboardInterrupt]'
+                    print('[KeyboardInterrupt]')
                     return False
 
                 except Exception as e:
-                    print e
+                    print(e)
 
             if not completed: return False
 
@@ -396,7 +397,7 @@ class Extracter:
             extractedMd5 = open(extracted).read()
 
         if extractedMd5 != moduleMd5:
-            print 'Extracting %s:' % self.config.Relative(local_file)
+            print('Extracting %s:' % self.config.Relative(local_file))
             tar = tarfile.open(local_file)
             tar.extractall(extractDst)
             open(extracted, 'w').write(moduleMd5)
@@ -480,7 +481,7 @@ class Builder:
 
         os.chdir(base_dir)
 
-        print '%s module is now built and installed' % module
+        print('%s module is now built and installed' % module)
 
     def RunCommand(self, cmd, module, stage, skipable=False):
         if skipable:
@@ -495,13 +496,13 @@ class Builder:
                 stderr=subprocess.STDOUT
                 )
 
-        print '%s [%s] ...' % (module, stage),
+        print('%s [%s] ...' % (module, stage), end=' ')
         sys.stdout.flush()
         p = popen(cmd)
         output = p.stdout.read()
         p.wait()
         if p.returncode != 0:
-            print '[failed!]'
+            print('[failed!]')
             logFile = os.path.join(self.config.build_dir, 'log.txt')
             f = open(logFile, "w")
             f.write(output)
@@ -509,7 +510,7 @@ class Builder:
             raise Exception, 'Failed to %s %s\n' % (stage, module) + \
                 'See output log at %s' % self.config.Relative(logFile)
         else:
-            print '[done]'
+            print('[done]')
 
         if skipable:
             self.MarkBuildStepComplete('%s.%s' % (module, stage))
@@ -526,13 +527,13 @@ class Builder:
             linkdst = os.path.join(links_dir, link)
             if not os.path.lexists(linkdst):
                 if not startPrinted:
-                    print 'Making symlinks in %s:' % self.config.Relative(links_dir),
+                    print('Making symlinks in %s:' % self.config.Relative(links_dir), end=' ')
                     startPrinted = True
-                print link,
+                print(link, end=' ')
                 os.symlink(src, linkdst)
 
         if startPrinted:
-            print '[done]'
+            print('[done]')
 
 class App:
     """class App
@@ -551,9 +552,9 @@ class App:
         sources = SourceFiles(config)
         result = sources.GetAll()
         if result:
-            print 'All files have been downloaded & verified'
+            print('All files have been downloaded & verified')
         else:
-            print 'An error occured while downloading a file'
+            print('An error occured while downloading a file')
             return
 
         Extracter(sources, config).ExtractAll()
-- 
2.17.1




* [PATCH v4 04/13] BaseTools: Remove the old python "not-equal"
  2018-06-25 10:31 [PATCH v4 00/20] BaseTools: One step toward python3 Gary Lin
                   ` (2 preceding siblings ...)
  2018-06-25 10:31 ` [PATCH v4 03/13] BaseTools: Refactor python print statements Gary Lin
@ 2018-06-25 10:31 ` Gary Lin
  2018-06-25 10:31 ` [PATCH v4 05/13] BaseTools: Remove tuple parameter in python scripts Gary Lin
                   ` (10 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Gary Lin @ 2018-06-25 10:31 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

Replace "<>" with "!=" to be compatible with python3.

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Source/Python/AutoGen/AutoGen.py                             |  2 +-
 BaseTools/Source/Python/AutoGen/BuildEngine.py                         |  4 ++--
 BaseTools/Source/Python/AutoGen/GenMake.py                             |  2 +-
 BaseTools/Source/Python/Common/Misc.py                                 |  2 +-
 BaseTools/Source/Python/Ecc/Check.py                                   | 14 +++++++-------
 BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py                         |  6 +++---
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py |  8 ++++----
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py         | 12 ++++++------
 BaseTools/Source/Python/Workspace/DscBuildData.py                      |  2 +-
 9 files changed, 26 insertions(+), 26 deletions(-)

diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index e268c4c0a1cf..d7485909414d 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -4233,7 +4233,7 @@ class ModuleAutoGen(AutoGen):
             Dpx = GenDepex.DependencyExpression(self.DepexList[ModuleType], ModuleType, True)
             DpxFile = gAutoGenDepexFileName % {"module_name" : self.Name}
 
-            if len(Dpx.PostfixNotation) <> 0:
+            if len(Dpx.PostfixNotation) != 0:
                 self.DepexGenerated = True
 
             if Dpx.Generate(path.join(self.OutputDir, DpxFile)):
diff --git a/BaseTools/Source/Python/AutoGen/BuildEngine.py b/BaseTools/Source/Python/AutoGen/BuildEngine.py
index d4daa3093761..cab4c993dc44 100644
--- a/BaseTools/Source/Python/AutoGen/BuildEngine.py
+++ b/BaseTools/Source/Python/AutoGen/BuildEngine.py
@@ -362,8 +362,8 @@ class BuildRule:
             self.RuleContent[Index] = Line
             
             # find the build_rule_version
-            if Line and Line[0] == "#" and Line.find(TAB_BUILD_RULE_VERSION) <> -1:
-                if Line.find("=") <> -1 and Line.find("=") < (len(Line) - 1) and (Line[(Line.find("=") + 1):]).split():
+            if Line and Line[0] == "#" and Line.find(TAB_BUILD_RULE_VERSION) != -1:
+                if Line.find("=") != -1 and Line.find("=") < (len(Line) - 1) and (Line[(Line.find("=") + 1):]).split():
                     self._FileVersion = (Line[(Line.find("=") + 1):]).split()[0]
             # skip empty or comment line
             if Line == "" or Line[0] == "#":
diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
index 48b66c570e0a..ea73de5fa55f 100644
--- a/BaseTools/Source/Python/AutoGen/GenMake.py
+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
@@ -561,7 +561,7 @@ cleanlib:
 
         # convert source files and binary files to build targets
         self.ResultFileList = [str(T.Target) for T in self._AutoGenObject.CodaTargetList]
-        if len(self.ResultFileList) == 0 and len(self._AutoGenObject.SourceFileList) <> 0:
+        if len(self.ResultFileList) == 0 and len(self._AutoGenObject.SourceFileList) != 0:
             EdkLogger.error("build", AUTOGEN_ERROR, "Nothing to build",
                             ExtraData="[%s]" % str(self._AutoGenObject))
 
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index 5197818d3f27..01171adb9b9e 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -1293,7 +1293,7 @@ def ParseDevPathValue (Value):
 def ParseFieldValue (Value):
     if type(Value) == type(0):
         return Value, (Value.bit_length() + 7) / 8
-    if type(Value) <> type(''):
+    if type(Value) != type(''):
         raise BadExpression('Type %s is %s' %(Value, type(Value)))
     Value = Value.strip()
     if Value.startswith(TAB_UINT8) and Value.endswith(')'):
diff --git a/BaseTools/Source/Python/Ecc/Check.py b/BaseTools/Source/Python/Ecc/Check.py
index dde7d7841082..ea739043e0bc 100644
--- a/BaseTools/Source/Python/Ecc/Check.py
+++ b/BaseTools/Source/Python/Ecc/Check.py
@@ -816,8 +816,8 @@ class Check(object):
                     EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_LIBRARY_NO_USE, OtherMsg="The Library Class [%s] is not used in any platform" % (Record[1]), BelongsToTable='Inf', BelongsToItem=Record[0])
             SqlCommand = """
                          select A.ID, A.Value1, A.BelongsToFile, A.StartLine, B.StartLine from Dsc as A left join Dsc as B
-                         where A.Model = %s and B.Model = %s and A.Scope1 = B.Scope1 and A.Scope2 = B.Scope2 and A.ID <> B.ID
-                         and A.Value1 = B.Value1 and A.Value2 <> B.Value2 and A.BelongsToItem = -1 and B.BelongsToItem = -1 and A.StartLine <> B.StartLine and B.BelongsToFile = A.BelongsToFile""" \
+                         where A.Model = %s and B.Model = %s and A.Scope1 = B.Scope1 and A.Scope2 = B.Scope2 and A.ID != B.ID
+                         and A.Value1 = B.Value1 and A.Value2 != B.Value2 and A.BelongsToItem = -1 and B.BelongsToItem = -1 and A.StartLine != B.StartLine and B.BelongsToFile = A.BelongsToFile""" \
                             % (MODEL_EFI_LIBRARY_CLASS, MODEL_EFI_LIBRARY_CLASS)
             RecordSet = EccGlobalData.gDb.TblDsc.Exec(SqlCommand)
             for Record in RecordSet:
@@ -903,7 +903,7 @@ class Check(object):
                          and A.Value1 = B.Value1
                          and A.Value2 = B.Value2
                          and A.Scope1 = B.Scope1
-                         and A.ID <> B.ID
+                         and A.ID != B.ID
                          and A.Model = B.Model
                          and A.Enabled > -1
                          and B.Enabled > -1
@@ -1055,7 +1055,7 @@ class Check(object):
             SqlCommand = """
                          select A.ID, A.Value3, A.BelongsToFile, B.BelongsToFile from %s as A, %s as B
                          where A.Value2 = 'FILE_GUID' and B.Value2 = 'FILE_GUID' and
-                         A.Value3 = B.Value3 and A.ID <> B.ID group by A.ID
+                         A.Value3 = B.Value3 and A.ID != B.ID group by A.ID
                          """ % (Table.Table, Table.Table)
             RecordSet = Table.Exec(SqlCommand)
             for Record in RecordSet:
@@ -1215,7 +1215,7 @@ class Check(object):
         SqlCommand = """
                      select A.ID, A.Value1 from %s as A, %s as B
                      where A.Model = %s and B.Model = %s
-                     and A.Value1 like B.Value1 and A.ID <> B.ID
+                     and A.Value1 like B.Value1 and A.ID != B.ID
                      and A.Scope1 = B.Scope1
                      and A.Enabled > -1
                      and B.Enabled > -1
@@ -1239,8 +1239,8 @@ class Check(object):
         SqlCommand = """
                      select A.ID, A.Value1, A.Value2 from %s as A, %s as B
                      where A.Model = %s and B.Model = %s
-                     and A.Value2 like B.Value2 and A.ID <> B.ID
-                     and A.Scope1 = B.Scope1 and A.Value1 <> B.Value1
+                     and A.Value2 like B.Value2 and A.ID != B.ID
+                     and A.Scope1 = B.Scope1 and A.Value1 != B.Value1
                      group by A.ID
                      """ % (Table.Table, Table.Table, Model, Model)
         RecordSet = Table.Exec(SqlCommand)
diff --git a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
index 4f79d0f82967..11d11700ed99 100644
--- a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
+++ b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
@@ -118,7 +118,7 @@ if __name__ == '__main__':
     sys.exit(1)
 
   Version = Process.communicate()
-  if Process.returncode <> 0:
+  if Process.returncode != 0:
     print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
     sys.exit(Process.returncode)
   print(Version[0])
@@ -208,7 +208,7 @@ if __name__ == '__main__':
     #
     Process = subprocess.Popen('%s smime -sign -binary -signer "%s" -outform DER -md sha256 -certfile "%s"' % (OpenSslCommand, args.SignerPrivateCertFileName, args.OtherPublicCertFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
     Signature = Process.communicate(input=FullInputFileBuffer)[0]
-    if Process.returncode <> 0:
+    if Process.returncode != 0:
       sys.exit(Process.returncode)
 
     #
@@ -277,7 +277,7 @@ if __name__ == '__main__':
     #
     Process = subprocess.Popen('%s smime -verify -inform DER -content %s -CAfile %s' % (OpenSslCommand, args.OutputFileName, args.TrustedPublicCertFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
     Process.communicate(input=args.SignatureBuffer)[0]
-    if Process.returncode <> 0:
+    if Process.returncode != 0:
       print('ERROR: Verification failed')
       os.remove (args.OutputFileName)
       sys.exit(Process.returncode)
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
index 41bcaa0437c5..ca4f64864790 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
@@ -80,7 +80,7 @@ if __name__ == '__main__':
     sys.exit(1)
     
   Version = Process.communicate()
-  if Process.returncode <> 0:
+  if Process.returncode != 0:
     print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
     sys.exit(Process.returncode)
   print(Version[0])
@@ -103,7 +103,7 @@ if __name__ == '__main__':
       #
       Process = subprocess.Popen('%s genrsa -out %s 2048' % (OpenSslCommand, Item.name), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
       Process.communicate()
-      if Process.returncode <> 0:
+      if Process.returncode != 0:
         print('ERROR: RSA 2048 key generation failed')
         sys.exit(Process.returncode)
       
@@ -125,7 +125,7 @@ if __name__ == '__main__':
     #
     Process = subprocess.Popen('%s rsa -in %s -modulus -noout' % (OpenSslCommand, Item), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
     PublicKeyHexString = Process.communicate()[0].split('=')[1].strip()
-    if Process.returncode <> 0:
+    if Process.returncode != 0:
       print('ERROR: Unable to extract public key from private key')
       sys.exit(Process.returncode)
     PublicKey = ''
@@ -138,7 +138,7 @@ if __name__ == '__main__':
     Process = subprocess.Popen('%s dgst -sha256 -binary' % (OpenSslCommand), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
     Process.stdin.write (PublicKey)
     PublicKeyHash = PublicKeyHash + Process.communicate()[0]
-    if Process.returncode <> 0:
+    if Process.returncode != 0:
       print('ERROR: Unable to extract SHA 256 hash of public key')
       sys.exit(Process.returncode)
 
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
index 2944b634fb7a..2e164c4a2da6 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
@@ -101,7 +101,7 @@ if __name__ == '__main__':
     sys.exit(1)
     
   Version = Process.communicate()
-  if Process.returncode <> 0:
+  if Process.returncode != 0:
     print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
     sys.exit(Process.returncode)
   print(Version[0])
@@ -157,7 +157,7 @@ if __name__ == '__main__':
   while len(PublicKeyHexString) > 0:
     PublicKey = PublicKey + chr(int(PublicKeyHexString[0:2],16))
     PublicKeyHexString=PublicKeyHexString[2:]
-  if Process.returncode <> 0:
+  if Process.returncode != 0:
     sys.exit(Process.returncode)
 
   if args.MonotonicCountStr:
@@ -179,7 +179,7 @@ if __name__ == '__main__':
     #
     Process = subprocess.Popen('%s dgst -sha256 -sign "%s"' % (OpenSslCommand, args.PrivateKeyFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
     Signature = Process.communicate(input=FullInputFileBuffer)[0]
-    if Process.returncode <> 0:
+    if Process.returncode != 0:
       sys.exit(Process.returncode)
       
     #
@@ -202,14 +202,14 @@ if __name__ == '__main__':
     #
     # Verify that the Hash Type matches the expected SHA256 type
     #
-    if uuid.UUID(bytes_le = Header.HashType) <> EFI_HASH_ALGORITHM_SHA256_GUID:
+    if uuid.UUID(bytes_le = Header.HashType) != EFI_HASH_ALGORITHM_SHA256_GUID:
       print('ERROR: unsupport hash GUID')
       sys.exit(1)
 
     #
     # Verify the public key
     #
-    if Header.PublicKey <> PublicKey:
+    if Header.PublicKey != PublicKey:
       print('ERROR: Public key in input file does not match public key from private key file')
       sys.exit(1)
 
@@ -228,7 +228,7 @@ if __name__ == '__main__':
     #    
     Process = subprocess.Popen('%s dgst -sha256 -prverify "%s" -signature %s' % (OpenSslCommand, args.PrivateKeyFileName, args.OutputFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
     Process.communicate(input=FullInputFileBuffer)
-    if Process.returncode <> 0:
+    if Process.returncode != 0:
       print('ERROR: Verification failed')
       os.remove (args.OutputFileName)
       sys.exit(Process.returncode)
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index 7f289c103fb9..a80c07bc1e55 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -2153,7 +2153,7 @@ class DscBuildData(PlatformBuildClassObject):
         if DscBuildData.NeedUpdateOutput(OutputValueFile, PcdValueInitExe ,InputValueFile):
             Command = PcdValueInitExe + ' -i %s -o %s' % (InputValueFile, OutputValueFile)
             returncode, StdOut, StdErr = DscBuildData.ExecuteCommand (Command)
-            if returncode <> 0:
+            if returncode != 0:
                 EdkLogger.warn('Build', COMMAND_FAILURE, 'Can not collect output from command: %s' % Command)
 
         File = open (OutputValueFile, 'r')
-- 
2.17.1




* [PATCH v4 05/13] BaseTools: Remove tuple parameter in python scripts
  2018-06-25 10:31 [PATCH v4 00/20] BaseTools: One step toward python3 Gary Lin
                   ` (3 preceding siblings ...)
  2018-06-25 10:31 ` [PATCH v4 04/13] BaseTools: Remove the old python "not-equal" Gary Lin
@ 2018-06-25 10:31 ` Gary Lin
  2018-06-25 10:31 ` [PATCH v4 06/13] BaseTools: Remove the deprecated has_key() Gary Lin
                   ` (9 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Gary Lin @ 2018-06-25 10:31 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

According to PEP 3113, tuple parameter unpacking is removed in python 3.
(PEP3113: https://www.python.org/dev/peps/pep-3113/)
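
A minimal sketch of the rewrite pattern, matching the VpdInfoFile.py hunk in
this patch:

    # python2 only: tuple unpacking in the function signature
    def GetVpdInfo(self, (PcdTokenName, TokenSpaceName)):
        return self._VpdInfo.get((TokenSpaceName, PcdTokenName))

    # python2 and python3: take a single argument and unpack it explicitly
    def GetVpdInfo(self, arg):
        (PcdTokenName, TokenSpaceName) = arg
        return self._VpdInfo.get((TokenSpaceName, PcdTokenName))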

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Source/Python/Common/VpdInfoFile.py | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index 09b8196faf07..435b23f203a0 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -218,7 +218,8 @@ class VpdInfoFile:
             return None
         
         return self._VpdArray[vpd]
-    def GetVpdInfo(self,(PcdTokenName,TokenSpaceName)):
+    def GetVpdInfo(self, arg):
+        (PcdTokenName, TokenSpaceName) = arg
         return self._VpdInfo.get((TokenSpaceName, PcdTokenName))
     
 ## Call external BPDG tool to process VPD file
-- 
2.17.1




* [PATCH v4 06/13] BaseTools: Remove the deprecated has_key()
  2018-06-25 10:31 [PATCH v4 00/20] BaseTools: One step toward python3 Gary Lin
                   ` (4 preceding siblings ...)
  2018-06-25 10:31 ` [PATCH v4 05/13] BaseTools: Remove tuple parameter in python scripts Gary Lin
@ 2018-06-25 10:31 ` Gary Lin
  2018-06-25 10:31 ` [PATCH v4 07/13] BaseTools: Replace StandardError with Expression Gary Lin
                   ` (8 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Gary Lin @ 2018-06-25 10:31 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

Replace "has_key()" with "in" to be compatible with python3.
Based on "futurize -f lib2to3.fixes.fix_has_key"

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/ini.py         |  2 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/baseobject.py |  6 +++---
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/inf.py        |  8 ++++----
 BaseTools/Source/Python/AutoGen/AutoGen.py                                         |  4 ++--
 BaseTools/Source/Python/Common/VpdInfoFile.py                                      |  2 +-
 BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py                              | 16 ++++++++--------
 BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py                       |  6 +++---
 BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py                       |  2 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py                         |  4 ++--
 BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py               |  2 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py                               |  4 ++--
 BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py                     |  4 ++--
 BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py                          |  4 ++--
 BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py                          |  4 ++--
 BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py                     |  2 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py                       |  3 +--
 BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py                |  4 ++--
 BaseTools/Source/Python/build/build.py                                             |  2 +-
 18 files changed, 39 insertions(+), 40 deletions(-)

diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/ini.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/ini.py
index ea83327052f2..ccfef6b6e280 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/ini.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/ini.py
@@ -140,7 +140,7 @@ class BaseINIFile(object):
                     sObj = self.GetSectionInstance(self, name, (len(sname_arr) > 1))
                     sObj._start = index
                     sObjs.append(sObj)
-                    if not self._sections.has_key(name.lower()):
+                    if name.lower() not in self._sections:
                         self._sections[name.lower()] = [sObj]
                     else:
                         self._sections[name.lower()].append(sObj)
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/baseobject.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/baseobject.py
index 7c120d85c255..b49c87c8bdab 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/baseobject.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/baseobject.py
@@ -26,7 +26,7 @@ class SurfaceObject(object):
 
         """
         obj = object.__new__(cls, *args, **kwargs)
-        if not cls._objs.has_key("None"):
+        if "None" not in cls._objs:
             cls._objs["None"] = []
         cls._objs["None"].append(obj)
 
@@ -47,7 +47,7 @@ class SurfaceObject(object):
         self.GetFileObj().Destroy(self)
         del self._fileObj
         # dereference self from _objs arrary
-        assert self._objs.has_key(key), "when destory, object is not in obj list"
+        assert key in self._objs, "when destory, object is not in obj list"
         assert self in self._objs[key], "when destory, object is not in obj list"
         self._objs[key].remove(self)
         if len(self._objs[key]) == 0:
@@ -95,7 +95,7 @@ class SurfaceObject(object):
         if self not in cls._objs["None"]:
             ErrorMsg("Sufrace object does not be create into None list")
         cls._objs["None"].remove(self)
-        if not cls._objs.has_key(relativePath):
+        if relativePath not in cls._objs:
             cls._objs[relativePath] = []
         cls._objs[relativePath].append(self)
 
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/inf.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/inf.py
index 32b26850e766..793e95efedcc 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/inf.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/inf.py
@@ -61,7 +61,7 @@ class INFFile(ini.BaseINIFile):
         classname = self.GetProduceLibraryClass()
         if classname is not None:
             libobjdict = INFFile._libobjs
-            if libobjdict.has_key(classname):
+            if classname in libobjdict:
                 if self not in libobjdict[classname]:
                     libobjdict[classname].append(self)
             else:
@@ -169,7 +169,7 @@ class INFLibraryClassObject(INFSectionObject):
     def Parse(self):
         self._classname = self.GetLineByOffset(self._start).split('#')[0].strip()
         objdict = INFLibraryClassObject._objs
-        if objdict.has_key(self._classname):
+        if self._classname in objdict:
             objdict[self._classname].append(self)
         else:
             objdict[self._classname] = [self]
@@ -241,7 +241,7 @@ class INFSourceObject(INFSectionObject):
 
         self.mFilename = os.path.basename(self.GetSourceFullPath())
         objdict = INFSourceObject._objs
-        if not objdict.has_key(self.mFilename):
+        if self.mFilename not in objdict:
             objdict[self.mFilename] = [self]
         else:
             objdict[self.mFilename].append(self)
@@ -303,7 +303,7 @@ class INFPcdObject(INFSectionObject):
             self.mDefaultValue = arr[1].strip()
 
         objdict = INFPcdObject._objs
-        if objdict.has_key(self.GetName()):
+        if self.GetName() in objdict:
             if self not in objdict[self.GetName()]:
                 objdict[self.GetName()].append(self)
         else:
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index d7485909414d..dbcf662389e4 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -1798,8 +1798,8 @@ class PlatformAutoGen(AutoGen):
             # retrieve BPDG tool's path from tool_def.txt according to VPD_TOOL_GUID defined in DSC file.
             BPDGToolName = None
             for ToolDef in self.ToolDefinition.values():
-                if ToolDef.has_key(TAB_GUID) and ToolDef[TAB_GUID] == self.Platform.VpdToolGuid:
-                    if not ToolDef.has_key("PATH"):
+                if TAB_GUID in ToolDef and ToolDef[TAB_GUID] == self.Platform.VpdToolGuid:
+                    if "PATH" not in ToolDef:
                         EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE, "PATH attribute was not provided for BPDG guid tool %s in tools_def.txt" % self.Platform.VpdToolGuid)
                     BPDGToolName = ToolDef["PATH"]
                     break
diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index 435b23f203a0..ddabe9fb2546 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -211,7 +211,7 @@ class VpdInfoFile:
     #
     #  @param vpd    A given VPD PCD 
     def GetOffset(self, vpd):
-        if not self._VpdArray.has_key(vpd):
+        if vpd not in self._VpdArray:
             return None
         
         if len(self._VpdArray[vpd]) == 0:
diff --git a/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py b/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
index bfd422b196ba..b97b319e0956 100644
--- a/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
+++ b/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
@@ -438,14 +438,14 @@ def GenLibraryClasses(ModuleObject):
                 Statement = '# Guid: ' + LibraryItem.Guid + ' Version: ' + LibraryItem.Version
 
                 if len(BinaryFile.SupArchList) == 0:
-                    if LibraryClassDict.has_key('COMMON') and Statement not in LibraryClassDict['COMMON']:
+                    if 'COMMON' in LibraryClassDict and Statement not in LibraryClassDict['COMMON']:
                         LibraryClassDict['COMMON'].append(Statement)
                     else:
                         LibraryClassDict['COMMON'] = ['## @LIB_INSTANCES']
                         LibraryClassDict['COMMON'].append(Statement)
                 else:
                     for Arch in BinaryFile.SupArchList:
-                        if LibraryClassDict.has_key(Arch):
+                        if Arch in LibraryClassDict:
                             if Statement not in LibraryClassDict[Arch]:
                                 LibraryClassDict[Arch].append(Statement)
                             else:
@@ -917,14 +917,14 @@ def GenAsBuiltPacthPcdSections(ModuleObject):
             if FileNameObjList:
                 ArchList = FileNameObjList[0].GetSupArchList()
             if len(ArchList) == 0:
-                if PatchPcdDict.has_key(DT.TAB_ARCH_COMMON):
+                if DT.TAB_ARCH_COMMON in PatchPcdDict:
                     if Statement not in PatchPcdDict[DT.TAB_ARCH_COMMON]:
                         PatchPcdDict[DT.TAB_ARCH_COMMON].append(Statement)
                 else:
                     PatchPcdDict[DT.TAB_ARCH_COMMON] = [Statement]
             else:
                 for Arch in ArchList:
-                    if PatchPcdDict.has_key(Arch):
+                    if Arch in PatchPcdDict:
                         if Statement not in PatchPcdDict[Arch]:
                             PatchPcdDict[Arch].append(Statement)
                     else:
@@ -967,13 +967,13 @@ def GenAsBuiltPcdExSections(ModuleObject):
                 ArchList = FileNameObjList[0].GetSupArchList()
 
             if len(ArchList) == 0:
-                if PcdExDict.has_key('COMMON'):
+                if 'COMMON' in PcdExDict:
                     PcdExDict['COMMON'].append(Statement)
                 else:
                     PcdExDict['COMMON'] = [Statement]
             else:
                 for Arch in ArchList:
-                    if PcdExDict.has_key(Arch):
+                    if Arch in PcdExDict:
                         if Statement not in PcdExDict[Arch]:
                             PcdExDict[Arch].append(Statement)
                     else:
@@ -1071,7 +1071,7 @@ def GenBuildOptions(ModuleObject):
             for BuilOptionItem in BinaryFile.AsBuiltList[0].BinaryBuildFlagList:
                 Statement = '#' + BuilOptionItem.AsBuiltOptionFlags
                 if len(BinaryFile.SupArchList) == 0:
-                    if BuildOptionDict.has_key('COMMON'):
+                    if 'COMMON' in BuildOptionDict:
                         if Statement not in BuildOptionDict['COMMON']:
                             BuildOptionDict['COMMON'].append(Statement)
                     else:
@@ -1079,7 +1079,7 @@ def GenBuildOptions(ModuleObject):
                         BuildOptionDict['COMMON'].append(Statement)
                 else:
                     for Arch in BinaryFile.SupArchList:
-                        if BuildOptionDict.has_key(Arch):
+                        if Arch in BuildOptionDict:
                             if Statement not in BuildOptionDict[Arch]:
                                 BuildOptionDict[Arch].append(Statement)
                         else:
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py
index 33b142d64e07..cc2fc4905326 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py
@@ -272,7 +272,7 @@ class InfBinariesObject(InfSectionCommonDef):
                                 pass
 
             if InfBianryVerItemObj is not None:
-                if self.Binaries.has_key((InfBianryVerItemObj)):
+                if (InfBianryVerItemObj) in self.Binaries:
                     BinariesList = self.Binaries[InfBianryVerItemObj]
                     BinariesList.append((InfBianryVerItemObj, VerComment))
                     self.Binaries[InfBianryVerItemObj] = BinariesList
@@ -522,7 +522,7 @@ class InfBinariesObject(InfSectionCommonDef):
 #                                pass
 
             if InfBianryCommonItemObj is not None:
-                if self.Binaries.has_key((InfBianryCommonItemObj)):
+                if (InfBianryCommonItemObj) in self.Binaries:
                     BinariesList = self.Binaries[InfBianryCommonItemObj]
                     BinariesList.append((InfBianryCommonItemObj, ItemComment))
                     self.Binaries[InfBianryCommonItemObj] = BinariesList
@@ -673,7 +673,7 @@ class InfBinariesObject(InfSectionCommonDef):
 #                                        pass
 
                     if InfBianryUiItemObj is not None:
-                        if self.Binaries.has_key((InfBianryUiItemObj)):
+                        if (InfBianryUiItemObj) in self.Binaries:
                             BinariesList = self.Binaries[InfBianryUiItemObj]
                             BinariesList.append((InfBianryUiItemObj, UiComment))
                             self.Binaries[InfBianryUiItemObj] = BinariesList
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py
index 3995546593d8..9d27a92cd6b0 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py
@@ -957,7 +957,7 @@ class InfDefObject(InfSectionCommonDef):
                     SpecValue = Name[Name.find("SPEC") + len("SPEC"):].strip()
                     Name = "SPEC"
                     Value = SpecValue + " = " + Value
-                if self.Defines.has_key(ArchListString):
+                if ArchListString in self.Defines:
                     DefineList = self.Defines[ArchListString]                 
                     LineInfo[0] = InfDefMemberObj.CurrentLine.GetFileName()
                     LineInfo[1] = InfDefMemberObj.CurrentLine.GetLineNo()
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py
index fb8d1f5a62ee..4dfe75a2f16a 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py
@@ -338,7 +338,7 @@ class InfGuidObject():
                                 #
                                 pass
                                                 
-            if self.Guids.has_key((InfGuidItemObj)):           
+            if (InfGuidItemObj) in self.Guids:
                 GuidList = self.Guids[InfGuidItemObj]                 
                 GuidList.append(InfGuidItemObj)
                 self.Guids[InfGuidItemObj] = GuidList
@@ -350,4 +350,4 @@ class InfGuidObject():
         return True
     
     def GetGuid(self):
-        return self.Guids
\ No newline at end of file
+        return self.Guids
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py
index e588c6ba66d8..5de1832b71fa 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py
@@ -238,7 +238,7 @@ class InfLibraryClassObject():
                 LibItemObj.SetVersion(LibItem[1])
                 LibItemObj.SetSupArchList(__SupArchList)
 
-            if self.LibraryClasses.has_key((LibItemObj)):
+            if (LibItemObj) in self.LibraryClasses:
                 LibraryList = self.LibraryClasses[LibItemObj]
                 LibraryList.append(LibItemObj)
                 self.LibraryClasses[LibItemObj] = LibraryList
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py b/BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py
index 37f8cb2336bb..4ed739d66fb2 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py
@@ -114,7 +114,7 @@ class InfSpecialCommentObject(InfSectionCommonDef):
            Type == DT.TYPE_EVENT_SECTION or \
            Type == DT.TYPE_BOOTMODE_SECTION:
             for Item in SepcialSectionList:
-                if self.SpecialComments.has_key(Type):           
+                if Type in self.SpecialComments:
                     ObjList = self.SpecialComments[Type]
                     ObjList.append(Item)
                     self.SpecialComments[Type] = ObjList
@@ -145,4 +145,4 @@ def ErrorInInf(Message=None, ErrorCode=None, LineInfo=None, RaiseError=True):
                  File=LineInfo[0], 
                  Line=LineInfo[1],
                  ExtraData=LineInfo[2], 
-                 RaiseError=RaiseError)
\ No newline at end of file
+                 RaiseError=RaiseError)
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py
index 01c854a8470e..bfac2b6b571c 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py
@@ -171,7 +171,7 @@ class InfPackageObject():
                                 #
                                 pass
                                             
-            if self.Packages.has_key((PackageItemObj)):   
+            if (PackageItemObj) in self.Packages:
                 PackageList = self.Packages[PackageItemObj]
                 PackageList.append(PackageItemObj)
                 self.Packages[PackageItemObj] = PackageList
@@ -184,4 +184,4 @@ class InfPackageObject():
     
     def GetPackages(self, Arch = None):
         if Arch is None:
-            return self.Packages
\ No newline at end of file
+            return self.Packages
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py
index 62c1e8409a09..3b9dfaed0c98 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py
@@ -411,7 +411,7 @@ class InfPcdObject():
                 else:
                     PcdItemObj.SetSupportArchList(SupArchList)
 
-                if self.Pcds.has_key((PcdTypeItem, PcdItemObj)):
+                if (PcdTypeItem, PcdItemObj) in self.Pcds:
                     PcdsList = self.Pcds[PcdTypeItem, PcdItemObj]
                     PcdsList.append(PcdItemObj)
                     self.Pcds[PcdTypeItem, PcdItemObj] = PcdsList
@@ -456,7 +456,7 @@ class InfPcdObject():
                                                       PackageInfo)
 
             PcdTypeItem = KeysList[0][0]
-            if self.Pcds.has_key((PcdTypeItem, PcdItemObj)):
+            if (PcdTypeItem, PcdItemObj) in self.Pcds:
                 PcdsList = self.Pcds[PcdTypeItem, PcdItemObj]
                 PcdsList.append(PcdItemObj)
                 self.Pcds[PcdTypeItem, PcdItemObj] = PcdsList
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py
index eb6b6927140b..0f865c569665 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py
@@ -327,7 +327,7 @@ class InfPpiObject():
                                 # 
                                 pass
             
-            if self.Ppis.has_key((InfPpiItemObj)):           
+            if (InfPpiItemObj) in self.Ppis:
                 PpiList = self.Ppis[InfPpiItemObj]
                 PpiList.append(InfPpiItemObj)
                 self.Ppis[InfPpiItemObj] = PpiList
@@ -340,4 +340,4 @@ class InfPpiObject():
         
     
     def GetPpi(self):
-        return self.Ppis
\ No newline at end of file
+        return self.Ppis
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py
index eb03095d6fec..6cadeb5a211c 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py
@@ -296,7 +296,7 @@ class InfProtocolObject():
                                 #
                                 pass      
                                       
-            if self.Protocols.has_key((InfProtocolItemObj)):           
+            if (InfProtocolItemObj) in self.Protocols:
                 ProcotolList = self.Protocols[InfProtocolItemObj]
                 ProcotolList.append(InfProtocolItemObj)
                 self.Protocols[InfProtocolItemObj] = ProcotolList
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py
index 2302dd5b9673..285e89aacbfc 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py
@@ -224,7 +224,7 @@ class InfSourcesObject(InfSectionCommonDef):
                 
             ItemObj.SetSupArchList(__SupArchList) 
                                                                                                       
-            if self.Sources.has_key((ItemObj)):           
+            if (ItemObj) in self.Sources:
                 SourceContent = self.Sources[ItemObj]
                 SourceContent.append(ItemObj)
                 self.Sources[ItemObj] = SourceContent
@@ -237,4 +237,3 @@ class InfSourcesObject(InfSectionCommonDef):
      
     def GetSources(self):
         return self.Sources
-    
\ No newline at end of file
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py
index 27a1c6ad25a0..f9db2944a495 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py
@@ -103,7 +103,7 @@ class InfUserExtensionObject():
 #                            Line=LineNo,
 #                            ExtraData=None)
             
-            if self.UserExtension.has_key(IdContentItem):           
+            if IdContentItem in self.UserExtension:
                 #
                 # Each UserExtensions section header must have a unique set 
                 # of UserId, IdString and Arch values.
@@ -130,4 +130,4 @@ class InfUserExtensionObject():
         return True
         
     def GetUserExtension(self):
-        return self.UserExtension
\ No newline at end of file
+        return self.UserExtension
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 416aa73549d1..344b006bc424 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -77,7 +77,7 @@ TmpTableDict = {}
 #   Otherwise, False is returned
 #
 def IsToolInPath(tool):
-    if os.environ.has_key('PATHEXT'):
+    if 'PATHEXT' in os.environ:
         extns = os.environ['PATHEXT'].split(os.path.pathsep)
     else:
         extns = ('',)
-- 
2.17.1




* [PATCH v4 07/13] BaseTools: Replace StandardError with Exception
  2018-06-25 10:31 [PATCH v4 00/20] BaseTools: One step toward python3 Gary Lin
                   ` (5 preceding siblings ...)
  2018-06-25 10:31 ` [PATCH v4 06/13] BaseTools: Remove the deprecated hash_key() Gary Lin
@ 2018-06-25 10:31 ` Gary Lin
  2018-06-25 10:31 ` [PATCH v4 08/13] BaseTools: Remove types.TypeType Gary Lin
                   ` (7 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Gary Lin @ 2018-06-25 10:31 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

StandardError has been removed in python3. Replace it with Exception,
which is available in both python2 and python3.
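
A minimal, hypothetical sketch of the portable form (not taken from the
patch itself):

    try:
        1 / 0
    except Exception:   # "except StandardError:" raises NameError on python3
        pass            # and Exception also catches these errors on python2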

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Source/Python/UPT/UPT.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/BaseTools/Source/Python/UPT/UPT.py b/BaseTools/Source/Python/UPT/UPT.py
index 2644dbed31e9..0e425828cdfe 100644
--- a/BaseTools/Source/Python/UPT/UPT.py
+++ b/BaseTools/Source/Python/UPT/UPT.py
@@ -309,7 +309,7 @@ def Main():
             else:
                 GlobalData.gDB.Commit()
                 Mgr.commit()
-        except StandardError:
+        except Exception:
             Logger.Quiet(ST.MSG_RECOVER_FAIL)
         GlobalData.gDB.CloseDb()
 
-- 
2.17.1




* [PATCH v4 08/13] BaseTools: Remove types.TypeType
  2018-06-25 10:31 [PATCH v4 00/20] BaseTools: One step toward python3 Gary Lin
                   ` (6 preceding siblings ...)
  2018-06-25 10:31 ` [PATCH v4 07/13] BaseTools: Replace StandardError with Exception Gary Lin
@ 2018-06-25 10:31 ` Gary Lin
  2018-06-25 10:31 ` [PATCH v4 09/13] BaseTools: Refactor python raise statement Gary Lin
                   ` (6 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Gary Lin @ 2018-06-25 10:31 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

In python2, "types.TypeType" is merely an alias of the built-in "type",
and the alias has been removed in python3, so use "type" directly.
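
A minimal, hypothetical sketch of the portable check (not taken from
TestTools.py):

    class Foo(object):
        pass

    for name, item in {'Foo': Foo, 'bar': 42}.items():
        # the built-in "type" replaces the python2-only alias types.TypeType
        if isinstance(item, type):
            print(name + ' is a class')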

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Tests/TestTools.py | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/BaseTools/Tests/TestTools.py b/BaseTools/Tests/TestTools.py
index be7b4ad42856..20a4ea28aa11 100644
--- a/BaseTools/Tests/TestTools.py
+++ b/BaseTools/Tests/TestTools.py
@@ -23,7 +23,6 @@ import random
 import shutil
 import subprocess
 import sys
-import types
 import unittest
 
 TestsDir = os.path.realpath(os.path.split(sys.argv[0])[0])
@@ -42,7 +41,7 @@ if PythonSourceDir not in sys.path:
 def MakeTheTestSuite(localItems):
     tests = []
     for name, item in localItems.iteritems():
-        if isinstance(item, types.TypeType):
+        if isinstance(item, type):
             if issubclass(item, unittest.TestCase):
                 tests.append(unittest.TestLoader().loadTestsFromTestCase(item))
             elif issubclass(item, unittest.TestSuite):
-- 
2.17.1




* [PATCH v4 09/13] BaseTools: Refactor python raise statement
  2018-06-25 10:31 [PATCH v4 00/20] BaseTools: One step toward python3 Gary Lin
                   ` (7 preceding siblings ...)
  2018-06-25 10:31 ` [PATCH v4 08/13] BaseTools: Remove types.TypeType Gary Lin
@ 2018-06-25 10:31 ` Gary Lin
  2018-06-25 10:31 ` [PATCH v4 10/13] BaseTools: Adjust the spaces around commas and colons Gary Lin
                   ` (5 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Gary Lin @ 2018-06-25 10:31 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

Convert the old "raise Exception, args" syntax to the function-call
form "raise Exception(args)", which is accepted by both python2 and
python3.
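
A minimal, hypothetical sketch of the two forms:

    def fail(stage, module):
        # python2-only syntax, rejected by python3 as a SyntaxError:
        #     raise Exception, 'Failed to %s %s' % (stage, module)
        # portable call syntax (as in the hunk below):
        raise Exception('Failed to %s %s' % (stage, module))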

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/gcc/mingw-gcc-build.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/BaseTools/gcc/mingw-gcc-build.py b/BaseTools/gcc/mingw-gcc-build.py
index 643fec58a457..7d7db33be4e4 100755
--- a/BaseTools/gcc/mingw-gcc-build.py
+++ b/BaseTools/gcc/mingw-gcc-build.py
@@ -507,8 +507,8 @@ class Builder:
             f = open(logFile, "w")
             f.write(output)
             f.close()
-            raise Exception, 'Failed to %s %s\n' % (stage, module) + \
-                'See output log at %s' % self.config.Relative(logFile)
+            raise Exception('Failed to %s %s\n' % (stage, module) + \
+                'See output log at %s' % self.config.Relative(logFile))
         else:
             print('[done]')
 
-- 
2.17.1




* [PATCH v4 10/13] BaseTools: Adjust the spaces around commas and colons
  2018-06-25 10:31 [PATCH v4 00/20] BaseTools: One step toward python3 Gary Lin
                   ` (8 preceding siblings ...)
  2018-06-25 10:31 ` [PATCH v4 09/13] BaseTools: Refactor python raise statement Gary Lin
@ 2018-06-25 10:31 ` Gary Lin
  2018-06-25 10:31 ` [PATCH v4 11/13] BaseTools: Migrate to the new octal literal Gary Lin
                   ` (4 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Gary Lin @ 2018-06-25 10:31 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

Based on "futurize -f lib2to3.fixes.fix_ws_comma"
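The fixer drops the stray whitespace before commas and colons and
inserts a space after them when it is missing. A hypothetical sketch
(not taken from the tree):

    # before: valid on python2 and python3, but inconsistently spaced
    pcd = {'TokenCName':'PcdFoo' ,'TokenSpaceGuid':'gTokenSpaceGuid'}
    # after "futurize -f lib2to3.fixes.fix_ws_comma":
    pcd = {'TokenCName': 'PcdFoo', 'TokenSpaceGuid': 'gTokenSpaceGuid'}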

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py                              |   2 +-
 BaseTools/Scripts/BinToPcd.py                                                  |   6 +-
 BaseTools/Scripts/FormatDosFiles.py                                            |   2 +-
 BaseTools/Scripts/MemoryProfileSymbolGen.py                                    |   6 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/doxygen.py |   4 +-
 BaseTools/Scripts/PatchCheck.py                                                |   2 +-
 BaseTools/Scripts/RunMakefile.py                                               |   2 +-
 BaseTools/Source/Python/AutoGen/AutoGen.py                                     |  54 ++---
 BaseTools/Source/Python/AutoGen/GenC.py                                        |  72 +++---
 BaseTools/Source/Python/AutoGen/GenMake.py                                     |   4 +-
 BaseTools/Source/Python/AutoGen/GenPcdDb.py                                    | 114 +++++-----
 BaseTools/Source/Python/AutoGen/GenVar.py                                      | 160 +++++++-------
 BaseTools/Source/Python/AutoGen/StrGather.py                                   |   4 +-
 BaseTools/Source/Python/BPDG/GenVpd.py                                         |  12 +-
 BaseTools/Source/Python/Common/DataType.py                                     |   4 +-
 BaseTools/Source/Python/Common/Expression.py                                   |  10 +-
 BaseTools/Source/Python/Common/Misc.py                                         |  28 +--
 BaseTools/Source/Python/Common/RangeExpression.py                              |   6 +-
 BaseTools/Source/Python/Common/StringUtils.py                                  |   2 +-
 BaseTools/Source/Python/Common/ToolDefClassObject.py                           |   6 +-
 BaseTools/Source/Python/Common/VpdInfoFile.py                                  |  10 +-
 BaseTools/Source/Python/Ecc/CParser.py                                         |  28 +--
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py                |  16 +-
 BaseTools/Source/Python/Eot/CParser.py                                         |  28 +--
 BaseTools/Source/Python/Eot/c.py                                               |  20 +-
 BaseTools/Source/Python/GenFds/AprioriSection.py                               |   2 +-
 BaseTools/Source/Python/GenFds/CapsuleData.py                                  |   2 +-
 BaseTools/Source/Python/GenFds/EfiSection.py                                   |   6 +-
 BaseTools/Source/Python/GenFds/Fd.py                                           |   6 +-
 BaseTools/Source/Python/GenFds/FdfParser.py                                    |  26 +--
 BaseTools/Source/Python/GenFds/FfsInfStatement.py                              |  14 +-
 BaseTools/Source/Python/GenFds/Fv.py                                           |   4 +-
 BaseTools/Source/Python/GenFds/FvImageSection.py                               |   4 +-
 BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py                         |   4 +-
 BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py                   |   2 +-
 BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py                                 |   2 +-
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py         |   2 +-
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py                 |   6 +-
 BaseTools/Source/Python/TargetTool/TargetTool.py                               |  12 +-
 BaseTools/Source/Python/Trim/Trim.py                                           |  14 +-
 BaseTools/Source/Python/UPT/Core/DependencyRules.py                            |   8 +-
 BaseTools/Source/Python/UPT/Core/IpiDb.py                                      |   4 +-
 BaseTools/Source/Python/UPT/Library/StringUtils.py                             |   2 +-
 BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py                      |   2 +-
 BaseTools/Source/Python/UPT/UPT.py                                             |   2 +-
 BaseTools/Source/Python/UPT/Xml/CommonXml.py                                   |   2 +-
 BaseTools/Source/Python/UPT/Xml/XmlParser.py                                   |  24 +-
 BaseTools/Source/Python/Workspace/BuildClassObject.py                          |  14 +-
 BaseTools/Source/Python/Workspace/DecBuildData.py                              |  22 +-
 BaseTools/Source/Python/Workspace/DscBuildData.py                              | 230 ++++++++++----------
 BaseTools/Source/Python/Workspace/MetaFileParser.py                            |  34 +--
 BaseTools/Source/Python/Workspace/MetaFileTable.py                             |   6 +-
 BaseTools/Source/Python/Workspace/WorkspaceCommon.py                           |   2 +-
 BaseTools/Source/Python/build/BuildReport.py                                   |  10 +-
 BaseTools/Source/Python/build/build.py                                         |  12 +-
 BaseTools/Tests/TestTools.py                                                   |   2 +-
 BaseTools/gcc/mingw-gcc-build.py                                               |   2 +-
 57 files changed, 543 insertions(+), 543 deletions(-)

diff --git a/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py b/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
index dd66c7111ac0..b226499e8450 100755
--- a/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
+++ b/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
@@ -48,7 +48,7 @@ def ConvertCygPathToDos(CygPath):
     DosPath = CygPath
   
   # pipes.quote will add the extra \\ for us.
-  return DosPath.replace('/','\\')
+  return DosPath.replace('/', '\\')
 
 
 # we receive our options as a list, but we will be passing them to the shell as a line
diff --git a/BaseTools/Scripts/BinToPcd.py b/BaseTools/Scripts/BinToPcd.py
index 10b5043325cc..c42e37bd119b 100644
--- a/BaseTools/Scripts/BinToPcd.py
+++ b/BaseTools/Scripts/BinToPcd.py
@@ -41,13 +41,13 @@ if __name__ == '__main__':
         return Value
 
     def ValidatePcdName (Argument):
-        if re.split ('[a-zA-Z\_][a-zA-Z0-9\_]*\.[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['','']:
+        if re.split ('[a-zA-Z\_][a-zA-Z0-9\_]*\.[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['', '']:
             Message = '{Argument} is not in the form <PcdTokenSpaceGuidCName>.<PcdCName>'.format (Argument = Argument)
             raise argparse.ArgumentTypeError (Message)
         return Argument
 
     def ValidateGuidName (Argument):
-        if re.split ('[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['','']:
+        if re.split ('[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['', '']:
             Message = '{Argument} is not a valid GUID C name'.format (Argument = Argument)
             raise argparse.ArgumentTypeError (Message)
         return Argument
@@ -84,7 +84,7 @@ if __name__ == '__main__':
                          help = "Output filename for PCD value or PCD statement")
     parser.add_argument ("-p", "--pcd", dest = 'PcdName', type = ValidatePcdName,
                          help = "Name of the PCD in the form <PcdTokenSpaceGuidCName>.<PcdCName>")
-    parser.add_argument ("-t", "--type", dest = 'PcdType', default = None, choices = ['VPD','HII'],
+    parser.add_argument ("-t", "--type", dest = 'PcdType', default = None, choices = ['VPD', 'HII'],
                          help = "PCD statement type (HII or VPD).  Default is standard.")
     parser.add_argument ("-m", "--max-size", dest = 'MaxSize', type = ValidateUnsignedInteger,
                          help = "Maximum size of the PCD.  Ignored with --type HII.")
diff --git a/BaseTools/Scripts/FormatDosFiles.py b/BaseTools/Scripts/FormatDosFiles.py
index 3b16af5a4413..1c6b8e2b0bb2 100644
--- a/BaseTools/Scripts/FormatDosFiles.py
+++ b/BaseTools/Scripts/FormatDosFiles.py
@@ -62,7 +62,7 @@ def FormatFilesInDir(DirPath, ExtList, Args):
         FormatFile(File, Args)
 
 if __name__ == "__main__":
-    parser = argparse.ArgumentParser(prog=__prog__,description=__description__ + __copyright__, conflict_handler = 'resolve')
+    parser = argparse.ArgumentParser(prog=__prog__, description=__description__ + __copyright__, conflict_handler = 'resolve')
 
     parser.add_argument('Path', nargs='+',
                         help='the path for files to be converted.It could be directory or file path.')
diff --git a/BaseTools/Scripts/MemoryProfileSymbolGen.py b/BaseTools/Scripts/MemoryProfileSymbolGen.py
index 0a41f9d83271..1dbb116bba0d 100644
--- a/BaseTools/Scripts/MemoryProfileSymbolGen.py
+++ b/BaseTools/Scripts/MemoryProfileSymbolGen.py
@@ -191,7 +191,7 @@ def processLine(newline):
 
     driverPrefixLen = len("Driver - ")
     # get driver name
-    if cmp(newline[0:driverPrefixLen],"Driver - ") == 0 :
+    if cmp(newline[0:driverPrefixLen], "Driver - ") == 0 :
         driverlineList = newline.split(" ")
         driverName = driverlineList[2]
         #print "Checking : ", driverName
@@ -214,7 +214,7 @@ def processLine(newline):
         else :
             symbolsFile.symbolsTable[driverName].parse_debug_file (driverName, pdbName)
 
-    elif cmp(newline,"") == 0 :
+    elif cmp(newline, "") == 0 :
         driverName = ""
 
     # check entry line
@@ -227,7 +227,7 @@ def processLine(newline):
         rvaName = ""
         symbolName = ""
 
-    if cmp(rvaName,"") == 0 :
+    if cmp(rvaName, "") == 0 :
         return newline
     else :
         return newline + symbolName
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/doxygen.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/doxygen.py
index fe2ba1d8a842..b5ab213cd7f0 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/doxygen.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/doxygen.py
@@ -66,7 +66,7 @@ class Page(BaseDoxygeItem):
 
     def AddSection(self, section):
         self.mSections.append(section)
-        self.mSections.sort(cmp=lambda x,y: cmp(x.mName.lower(), y.mName.lower()))
+        self.mSections.sort(cmp=lambda x, y: cmp(x.mName.lower(), y.mName.lower()))
 
     def Generate(self):
         if self.mIsMainPage:
@@ -91,7 +91,7 @@ class Page(BaseDoxygeItem):
             self.mText.insert(endIndex, '<ul>')
             endIndex += 1
             if self.mIsSort:
-                self.mSubPages.sort(cmp=lambda x,y: cmp(x.mName.lower(), y.mName.lower()))
+                self.mSubPages.sort(cmp=lambda x, y: cmp(x.mName.lower(), y.mName.lower()))
             for page in self.mSubPages:
                 self.mText.insert(endIndex, '<li>\subpage %s \"%s\" </li>' % (page.mTag, page.mName))
                 endIndex += 1
diff --git a/BaseTools/Scripts/PatchCheck.py b/BaseTools/Scripts/PatchCheck.py
index 43bfc2495c6b..7b7fba8b7044 100755
--- a/BaseTools/Scripts/PatchCheck.py
+++ b/BaseTools/Scripts/PatchCheck.py
@@ -285,7 +285,7 @@ class GitDiffCheck:
         if self.state == START:
             if line.startswith('diff --git'):
                 self.state = PRE_PATCH
-                self.filename = line[13:].split(' ',1)[0]
+                self.filename = line[13:].split(' ', 1)[0]
                 self.is_newfile = False
                 self.force_crlf = not self.filename.endswith('.sh')
             elif len(line.rstrip()) != 0:
diff --git a/BaseTools/Scripts/RunMakefile.py b/BaseTools/Scripts/RunMakefile.py
index 48bc198c7671..6d0c4553c9eb 100644
--- a/BaseTools/Scripts/RunMakefile.py
+++ b/BaseTools/Scripts/RunMakefile.py
@@ -149,7 +149,7 @@ if __name__ == '__main__':
     for Item in gArgs.Define:
       if '=' not in Item[0]:
         continue
-      Item = Item[0].split('=',1)
+      Item = Item[0].split('=', 1)
       CommandLine.append('%s="%s"' % (Item[0], Item[1]))
   CommandLine.append('EXTRA_FLAGS="%s"' % (gArgs.Remaining))
   CommandLine.append(gArgs.BuildType)
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index dbcf662389e4..a7e1edb8435c 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -48,7 +48,7 @@ from Common.MultipleWorkspace import MultipleWorkspace as mws
 import InfSectionParser
 import datetime
 import hashlib
-from GenVar import VariableMgr,var_info
+from GenVar import VariableMgr, var_info
 from collections import OrderedDict
 from collections import defaultdict
 from Workspace.WorkspaceCommon import OrderedListDict
@@ -1293,7 +1293,7 @@ class PlatformAutoGen(AutoGen):
             ShareFixedAtBuildPcdsSameValue = {} 
             for Module in LibAuto._ReferenceModules:                
                 for Pcd in Module.FixedAtBuildPcds + LibAuto.FixedAtBuildPcds:
-                    key = ".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCName))  
+                    key = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
                     if key not in FixedAtBuildPcds:
                         ShareFixedAtBuildPcdsSameValue[key] = True
                         FixedAtBuildPcds[key] = Pcd.DefaultValue
@@ -1301,11 +1301,11 @@ class PlatformAutoGen(AutoGen):
                         if FixedAtBuildPcds[key] != Pcd.DefaultValue:
                             ShareFixedAtBuildPcdsSameValue[key] = False      
             for Pcd in LibAuto.FixedAtBuildPcds:
-                key = ".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCName))
-                if (Pcd.TokenCName,Pcd.TokenSpaceGuidCName) not in self.NonDynamicPcdDict:
+                key = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
+                if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) not in self.NonDynamicPcdDict:
                     continue
                 else:
-                    DscPcd = self.NonDynamicPcdDict[(Pcd.TokenCName,Pcd.TokenSpaceGuidCName)]
+                    DscPcd = self.NonDynamicPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)]
                     if DscPcd.Type != TAB_PCDS_FIXED_AT_BUILD:
                         continue
                 if key in ShareFixedAtBuildPcdsSameValue and ShareFixedAtBuildPcdsSameValue[key]:                    
@@ -1325,12 +1325,12 @@ class PlatformAutoGen(AutoGen):
                         break
 
 
-        VariableInfo = VariableMgr(self.DscBuildDataObj._GetDefaultStores(),self.DscBuildDataObj._GetSkuIds())
+        VariableInfo = VariableMgr(self.DscBuildDataObj._GetDefaultStores(), self.DscBuildDataObj._GetSkuIds())
         VariableInfo.SetVpdRegionMaxSize(VpdRegionSize)
         VariableInfo.SetVpdRegionOffset(VpdRegionBase)
         Index = 0
         for Pcd in DynamicPcdSet:
-            pcdname = ".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCName))
+            pcdname = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
             for SkuName in Pcd.SkuInfoList:
                 Sku = Pcd.SkuInfoList[SkuName]
                 SkuId = Sku.SkuId
@@ -1340,11 +1340,11 @@ class PlatformAutoGen(AutoGen):
                     VariableGuidStructure = Sku.VariableGuidValue
                     VariableGuid = GuidStructureStringToGuidString(VariableGuidStructure)
                     for StorageName in Sku.DefaultStoreDict:
-                        VariableInfo.append_variable(var_info(Index,pcdname,StorageName,SkuName, StringToArray(Sku.VariableName),VariableGuid, Sku.VariableOffset, Sku.VariableAttribute , Sku.HiiDefaultValue,Sku.DefaultStoreDict[StorageName],Pcd.DatumType))
+                        VariableInfo.append_variable(var_info(Index, pcdname, StorageName, SkuName, StringToArray(Sku.VariableName), VariableGuid, Sku.VariableOffset, Sku.VariableAttribute, Sku.HiiDefaultValue, Sku.DefaultStoreDict[StorageName], Pcd.DatumType))
             Index += 1
         return VariableInfo
 
-    def UpdateNVStoreMaxSize(self,OrgVpdFile):
+    def UpdateNVStoreMaxSize(self, OrgVpdFile):
         if self.VariableInfo:
             VpdMapFilePath = os.path.join(self.BuildDir, TAB_FV_DIRECTORY, "%s.map" % self.Platform.VpdToolGuid)
             PcdNvStoreDfBuffer = [item for item in self._DynamicPcdList if item.TokenCName == "PcdNvStoreDefaultValueBuffer" and item.TokenSpaceGuidCName == "gEfiMdeModulePkgTokenSpaceGuid"]
@@ -1357,7 +1357,7 @@ class PlatformAutoGen(AutoGen):
                 else:
                     EdkLogger.error("build", FILE_READ_FAILURE, "Can not find VPD map file %s to fix up VPD offset." % VpdMapFilePath)
 
-                NvStoreOffset = int(NvStoreOffset,16) if NvStoreOffset.upper().startswith("0X") else int(NvStoreOffset)
+                NvStoreOffset = int(NvStoreOffset, 16) if NvStoreOffset.upper().startswith("0X") else int(NvStoreOffset)
                 default_skuobj = PcdNvStoreDfBuffer[0].SkuInfoList.get(TAB_DEFAULT)
                 maxsize = self.VariableInfo.VpdRegionSize  - NvStoreOffset if self.VariableInfo.VpdRegionSize else len(default_skuobj.DefaultValue.split(","))
                 var_data = self.VariableInfo.PatchNVStoreDefaultMaxSize(maxsize)
@@ -1569,7 +1569,7 @@ class PlatformAutoGen(AutoGen):
                     VpdPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)] = Pcd
 
             #Collect DynamicHii PCD values and assign it to DynamicExVpd PCD gEfiMdeModulePkgTokenSpaceGuid.PcdNvStoreDefaultValueBuffer
-            PcdNvStoreDfBuffer = VpdPcdDict.get(("PcdNvStoreDefaultValueBuffer","gEfiMdeModulePkgTokenSpaceGuid"))
+            PcdNvStoreDfBuffer = VpdPcdDict.get(("PcdNvStoreDefaultValueBuffer", "gEfiMdeModulePkgTokenSpaceGuid"))
             if PcdNvStoreDfBuffer:
                 self.VariableInfo = self.CollectVariables(self._DynamicPcdList)
                 vardump = self.VariableInfo.dump()
@@ -1595,10 +1595,10 @@ class PlatformAutoGen(AutoGen):
                         PcdValue = DefaultSku.DefaultValue
                         if PcdValue not in SkuValueMap:
                             SkuValueMap[PcdValue] = []
-                            VpdFile.Add(Pcd, TAB_DEFAULT,DefaultSku.VpdOffset)
+                            VpdFile.Add(Pcd, TAB_DEFAULT, DefaultSku.VpdOffset)
                         SkuValueMap[PcdValue].append(DefaultSku)
 
-                    for (SkuName,Sku) in Pcd.SkuInfoList.items():
+                    for (SkuName, Sku) in Pcd.SkuInfoList.items():
                         Sku.VpdOffset = Sku.VpdOffset.strip()
                         PcdValue = Sku.DefaultValue
                         if PcdValue == "":
@@ -1624,7 +1624,7 @@ class PlatformAutoGen(AutoGen):
                                     EdkLogger.error("build", FORMAT_INVALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Alignment))
                         if PcdValue not in SkuValueMap:
                             SkuValueMap[PcdValue] = []
-                            VpdFile.Add(Pcd, SkuName,Sku.VpdOffset)
+                            VpdFile.Add(Pcd, SkuName, Sku.VpdOffset)
                         SkuValueMap[PcdValue].append(Sku)
                         # if the offset of a VPD is *, then it need to be fixed up by third party tool.
                         if not NeedProcessVpdMapFile and Sku.VpdOffset == "*":
@@ -1656,9 +1656,9 @@ class PlatformAutoGen(AutoGen):
                             SkuObjList = DscPcdEntry.SkuInfoList.items()
                             DefaultSku = DscPcdEntry.SkuInfoList.get(TAB_DEFAULT)
                             if DefaultSku:
-                                defaultindex = SkuObjList.index((TAB_DEFAULT,DefaultSku))
-                                SkuObjList[0],SkuObjList[defaultindex] = SkuObjList[defaultindex],SkuObjList[0]
-                            for (SkuName,Sku) in SkuObjList:
+                                defaultindex = SkuObjList.index((TAB_DEFAULT, DefaultSku))
+                                SkuObjList[0], SkuObjList[defaultindex] = SkuObjList[defaultindex], SkuObjList[0]
+                            for (SkuName, Sku) in SkuObjList:
                                 Sku.VpdOffset = Sku.VpdOffset.strip() 
                                 
                                 # Need to iterate DEC pcd information to get the value & datumtype
@@ -1708,7 +1708,7 @@ class PlatformAutoGen(AutoGen):
                                             EdkLogger.error("build", FORMAT_INVALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' % (DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName, Alignment))
                                 if PcdValue not in SkuValueMap:
                                     SkuValueMap[PcdValue] = []
-                                    VpdFile.Add(DscPcdEntry, SkuName,Sku.VpdOffset)
+                                    VpdFile.Add(DscPcdEntry, SkuName, Sku.VpdOffset)
                                 SkuValueMap[PcdValue].append(Sku)
                                 if not NeedProcessVpdMapFile and Sku.VpdOffset == "*":
                                     NeedProcessVpdMapFile = True 
@@ -1774,17 +1774,17 @@ class PlatformAutoGen(AutoGen):
         self._DynamicPcdList.extend(list(UnicodePcdArray))
         self._DynamicPcdList.extend(list(HiiPcdArray))
         self._DynamicPcdList.extend(list(OtherPcdArray))
-        allskuset = [(SkuName,Sku.SkuId) for pcd in self._DynamicPcdList for (SkuName,Sku) in pcd.SkuInfoList.items()]
+        allskuset = [(SkuName, Sku.SkuId) for pcd in self._DynamicPcdList for (SkuName, Sku) in pcd.SkuInfoList.items()]
         for pcd in self._DynamicPcdList:
             if len(pcd.SkuInfoList) == 1:
-                for (SkuName,SkuId) in allskuset:
-                    if type(SkuId) in (str,unicode) and eval(SkuId) == 0 or SkuId == 0:
+                for (SkuName, SkuId) in allskuset:
+                    if type(SkuId) in (str, unicode) and eval(SkuId) == 0 or SkuId == 0:
                         continue
                     pcd.SkuInfoList[SkuName] = copy.deepcopy(pcd.SkuInfoList[TAB_DEFAULT])
                     pcd.SkuInfoList[SkuName].SkuId = SkuId
         self.AllPcdList = self._NonDynamicPcdList + self._DynamicPcdList
 
-    def FixVpdOffset(self,VpdFile ):
+    def FixVpdOffset(self, VpdFile ):
         FvPath = os.path.join(self.BuildDir, TAB_FV_DIRECTORY)
         if not os.path.exists(FvPath):
             try:
@@ -2050,7 +2050,7 @@ class PlatformAutoGen(AutoGen):
         if self._NonDynamicPcdDict:
             return self._NonDynamicPcdDict
         for Pcd in self.NonDynamicPcdList:
-            self._NonDynamicPcdDict[(Pcd.TokenCName,Pcd.TokenSpaceGuidCName)] = Pcd
+            self._NonDynamicPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)] = Pcd
         return self._NonDynamicPcdDict
 
     ## Get list of non-dynamic PCDs
@@ -3711,7 +3711,7 @@ class ModuleAutoGen(AutoGen):
         try:
             fInputfile = open(UniVfrOffsetFileName, "wb+", 0)
         except:
-            EdkLogger.error("build", FILE_OPEN_FAILURE, "File open failed for %s" % UniVfrOffsetFileName,None)
+            EdkLogger.error("build", FILE_OPEN_FAILURE, "File open failed for %s" % UniVfrOffsetFileName, None)
 
         # Use a instance of StringIO to cache data
         fStringIO = StringIO('')  
@@ -3746,7 +3746,7 @@ class ModuleAutoGen(AutoGen):
             fInputfile.write (fStringIO.getvalue())
         except:
             EdkLogger.error("build", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the "
-                            "file been locked or using by other applications." %UniVfrOffsetFileName,None)
+                            "file been locked or using by other applications." %UniVfrOffsetFileName, None)
 
         fStringIO.close ()
         fInputfile.close ()
@@ -4181,7 +4181,7 @@ class ModuleAutoGen(AutoGen):
     def CopyBinaryFiles(self):
         for File in self.Module.Binaries:
             SrcPath = File.Path
-            DstPath = os.path.join(self.OutputDir , os.path.basename(SrcPath))
+            DstPath = os.path.join(self.OutputDir, os.path.basename(SrcPath))
             CopyLongFilePath(SrcPath, DstPath)
     ## Create autogen code for the module and its dependent libraries
     #
@@ -4331,7 +4331,7 @@ class ModuleAutoGen(AutoGen):
         if SrcTimeStamp > DstTimeStamp:
             return False
 
-        with open(self.GetTimeStampPath(),'r') as f:
+        with open(self.GetTimeStampPath(), 'r') as f:
             for source in f:
                 source = source.rstrip('\n')
                 if not os.path.exists(source):
diff --git a/BaseTools/Source/Python/AutoGen/GenC.py b/BaseTools/Source/Python/AutoGen/GenC.py
index 5b24ae5fc464..5c3552a773c0 100644
--- a/BaseTools/Source/Python/AutoGen/GenC.py
+++ b/BaseTools/Source/Python/AutoGen/GenC.py
@@ -949,7 +949,7 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
             AutoGenH.Append('// Disabled the macros, as PcdToken and PcdGet/Set are not allowed in the case that more than one DynamicEx Pcds are different Guids but same CName.\n')
             AutoGenH.Append('// #define %s  %s\n' % (PcdTokenName, PcdExTokenName))
             AutoGenH.Append('// #define %s  LibPcdGetEx%s(&%s, %s)\n' % (GetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
-            AutoGenH.Append('// #define %s  LibPcdGetExSize(&%s, %s)\n' % (GetModeSizeName,Pcd.TokenSpaceGuidCName, PcdTokenName))
+            AutoGenH.Append('// #define %s  LibPcdGetExSize(&%s, %s)\n' % (GetModeSizeName, Pcd.TokenSpaceGuidCName, PcdTokenName))
             if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES:
                 AutoGenH.Append('// #define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%s(&%s, %s, (SizeOfBuffer), (Buffer))\n' % (SetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
                 AutoGenH.Append('// #define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%sS(&%s, %s, (SizeOfBuffer), (Buffer))\n' % (SetModeStatusName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
@@ -959,7 +959,7 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
         else:
             AutoGenH.Append('#define %s  %s\n' % (PcdTokenName, PcdExTokenName))
             AutoGenH.Append('#define %s  LibPcdGetEx%s(&%s, %s)\n' % (GetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
-            AutoGenH.Append('#define %s LibPcdGetExSize(&%s, %s)\n' % (GetModeSizeName,Pcd.TokenSpaceGuidCName, PcdTokenName))
+            AutoGenH.Append('#define %s LibPcdGetExSize(&%s, %s)\n' % (GetModeSizeName, Pcd.TokenSpaceGuidCName, PcdTokenName))
             if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES:
                 AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%s(&%s, %s, (SizeOfBuffer), (Buffer))\n' % (SetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
                 AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%sS(&%s, %s, (SizeOfBuffer), (Buffer))\n' % (SetModeStatusName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
@@ -1073,7 +1073,7 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
                 Value = eval(Value)         # translate escape character
                 ValueSize = len(Value) + 1
                 NewValue = '{'
-                for Index in range(0,len(Value)):
+                for Index in range(0, len(Value)):
                     if Unicode:
                         NewValue = NewValue + str(ord(Value[Index]) % 0x10000) + ', '
                     else:
@@ -1119,14 +1119,14 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
             PcdDataSize = Pcd.GetPcdSize()
             if Pcd.Type == TAB_PCDS_FIXED_AT_BUILD:
                 AutoGenH.Append('#define %s %s\n' % (FixPcdSizeTokenName, PcdDataSize))
-                AutoGenH.Append('#define %s  %s \n' % (GetModeSizeName,FixPcdSizeTokenName))
-                AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED const UINTN %s = %s;\n' % (FixedPcdSizeVariableName,PcdDataSize))
+                AutoGenH.Append('#define %s  %s \n' % (GetModeSizeName, FixPcdSizeTokenName))
+                AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED const UINTN %s = %s;\n' % (FixedPcdSizeVariableName, PcdDataSize))
             if Pcd.Type == TAB_PCDS_PATCHABLE_IN_MODULE:
                 AutoGenH.Append('#define %s %s\n' % (PatchPcdSizeTokenName, Pcd.MaxDatumSize))
-                AutoGenH.Append('#define %s  %s \n' % (GetModeSizeName,PatchPcdSizeVariableName))
+                AutoGenH.Append('#define %s  %s \n' % (GetModeSizeName, PatchPcdSizeVariableName))
                 AutoGenH.Append('extern UINTN %s; \n' % PatchPcdSizeVariableName)
-                AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED UINTN %s = %s;\n' % (PatchPcdSizeVariableName,PcdDataSize))
-                AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED const UINTN %s = %s;\n' % (PatchPcdMaxSizeVariable,Pcd.MaxDatumSize))
+                AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED UINTN %s = %s;\n' % (PatchPcdSizeVariableName, PcdDataSize))
+                AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED const UINTN %s = %s;\n' % (PatchPcdMaxSizeVariable, Pcd.MaxDatumSize))
         elif Pcd.Type == TAB_PCDS_PATCHABLE_IN_MODULE:
             AutoGenH.Append('#define %s  %s\n' %(PcdValueName, Value))
             AutoGenC.Append('volatile %s %s %s = %s;\n' %(Const, Pcd.DatumType, PcdVariableName, PcdValueName))
@@ -1136,13 +1136,13 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
             PcdDataSize = Pcd.GetPcdSize()
             AutoGenH.Append('#define %s %s\n' % (PatchPcdSizeTokenName, PcdDataSize))
             
-            AutoGenH.Append('#define %s  %s \n' % (GetModeSizeName,PatchPcdSizeVariableName))
+            AutoGenH.Append('#define %s  %s \n' % (GetModeSizeName, PatchPcdSizeVariableName))
             AutoGenH.Append('extern UINTN %s; \n' % PatchPcdSizeVariableName)
-            AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED UINTN %s = %s;\n' % (PatchPcdSizeVariableName,PcdDataSize))
+            AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED UINTN %s = %s;\n' % (PatchPcdSizeVariableName, PcdDataSize))
         else:
             PcdDataSize = Pcd.GetPcdSize()
             AutoGenH.Append('#define %s %s\n' % (FixPcdSizeTokenName, PcdDataSize))
-            AutoGenH.Append('#define %s  %s \n' % (GetModeSizeName,FixPcdSizeTokenName))
+            AutoGenH.Append('#define %s  %s \n' % (GetModeSizeName, FixPcdSizeTokenName))
             
             AutoGenH.Append('#define %s  %s\n' %(PcdValueName, Value))
             AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED %s %s %s = %s;\n' %(Const, Pcd.DatumType, PcdVariableName, PcdValueName))
@@ -1249,7 +1249,7 @@ def CreateLibraryPcdCode(Info, AutoGenC, AutoGenH, Pcd):
             AutoGenH.Append('// Disabled the macros, as PcdToken and PcdGet/Set are not allowed in the case that more than one DynamicEx Pcds are different Guids but same CName.\n')
             AutoGenH.Append('// #define %s  %s\n' % (PcdTokenName, PcdExTokenName))
             AutoGenH.Append('// #define %s  LibPcdGetEx%s(&%s, %s)\n' % (GetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
-            AutoGenH.Append('// #define %s  LibPcdGetExSize(&%s, %s)\n' % (GetModeSizeName,Pcd.TokenSpaceGuidCName, PcdTokenName))
+            AutoGenH.Append('// #define %s  LibPcdGetExSize(&%s, %s)\n' % (GetModeSizeName, Pcd.TokenSpaceGuidCName, PcdTokenName))
             if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES:
                 AutoGenH.Append('// #define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%s(&%s, %s, (SizeOfBuffer), (Buffer))\n' % (SetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
                 AutoGenH.Append('// #define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%sS(&%s, %s, (SizeOfBuffer), (Buffer))\n' % (SetModeStatusName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
@@ -1259,7 +1259,7 @@ def CreateLibraryPcdCode(Info, AutoGenC, AutoGenH, Pcd):
         else:
             AutoGenH.Append('#define %s  %s\n' % (PcdTokenName, PcdExTokenName))
             AutoGenH.Append('#define %s  LibPcdGetEx%s(&%s, %s)\n' % (GetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
-            AutoGenH.Append('#define %s LibPcdGetExSize(&%s, %s)\n' % (GetModeSizeName,Pcd.TokenSpaceGuidCName, PcdTokenName))
+            AutoGenH.Append('#define %s LibPcdGetExSize(&%s, %s)\n' % (GetModeSizeName, Pcd.TokenSpaceGuidCName, PcdTokenName))
             if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES:
                 AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%s(&%s, %s, (SizeOfBuffer), (Buffer))\n' % (SetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
                 AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%sS(&%s, %s, (SizeOfBuffer), (Buffer))\n' % (SetModeStatusName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
@@ -1310,11 +1310,11 @@ def CreateLibraryPcdCode(Info, AutoGenC, AutoGenH, Pcd):
             AutoGenH.Append('#define %s(Value)  ((%s = (Value)), RETURN_SUCCESS)\n' % (SetModeStatusName, PcdVariableName))
             AutoGenH.Append('#define %s %s\n' % (PatchPcdSizeTokenName, PcdDataSize))
 
-        AutoGenH.Append('#define %s %s\n' % (GetModeSizeName,PatchPcdSizeVariableName))
+        AutoGenH.Append('#define %s %s\n' % (GetModeSizeName, PatchPcdSizeVariableName))
         AutoGenH.Append('extern UINTN %s; \n' % PatchPcdSizeVariableName)
         
     if PcdItemType == TAB_PCDS_FIXED_AT_BUILD or PcdItemType == TAB_PCDS_FEATURE_FLAG:
-        key = ".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCName))
+        key = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
         PcdVariableName = '_gPcd_' + gItemTypeStringDatabase[Pcd.Type] + '_' + TokenCName
         if DatumType == TAB_VOID and Array == '[]':
             DatumType = [TAB_UINT8, TAB_UINT16][Pcd.DefaultValue[0] == 'L']
@@ -1338,14 +1338,14 @@ def CreateLibraryPcdCode(Info, AutoGenC, AutoGenH, Pcd):
             if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES:
                 if ConstFixedPcd:
                     AutoGenH.Append('#define %s %s\n' % (FixPcdSizeTokenName, PcdDataSize))
-                    AutoGenH.Append('#define %s %s\n' % (GetModeSizeName,FixPcdSizeTokenName))
+                    AutoGenH.Append('#define %s %s\n' % (GetModeSizeName, FixPcdSizeTokenName))
                 else:
-                    AutoGenH.Append('#define %s %s\n' % (GetModeSizeName,FixedPcdSizeVariableName))
-                    AutoGenH.Append('#define %s %s\n' % (FixPcdSizeTokenName,FixedPcdSizeVariableName))
+                    AutoGenH.Append('#define %s %s\n' % (GetModeSizeName, FixedPcdSizeVariableName))
+                    AutoGenH.Append('#define %s %s\n' % (FixPcdSizeTokenName, FixedPcdSizeVariableName))
                     AutoGenH.Append('extern const UINTN %s; \n' % FixedPcdSizeVariableName)
             else:
                 AutoGenH.Append('#define %s %s\n' % (FixPcdSizeTokenName, PcdDataSize))
-                AutoGenH.Append('#define %s %s\n' % (GetModeSizeName,FixPcdSizeTokenName))
+                AutoGenH.Append('#define %s %s\n' % (GetModeSizeName, FixPcdSizeTokenName))
 
 ## Create code for library constructor
 #
@@ -1373,11 +1373,11 @@ def CreateLibraryConstructorCode(Info, AutoGenC, AutoGenH):
         elif Lib.ModuleType in SUP_MODULE_SET_PEI:
             ConstructorPrototypeString.Append(gLibraryStructorPrototype['PEI'].Replace(Dict))
             ConstructorCallingString.Append(gLibraryStructorCall['PEI'].Replace(Dict))
-        elif Lib.ModuleType in [SUP_MODULE_DXE_CORE,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SMM_DRIVER,SUP_MODULE_DXE_RUNTIME_DRIVER,
-                                SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER,SUP_MODULE_UEFI_APPLICATION,SUP_MODULE_SMM_CORE]:
+        elif Lib.ModuleType in [SUP_MODULE_DXE_CORE, SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_SMM_DRIVER, SUP_MODULE_DXE_RUNTIME_DRIVER,
+                                SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_UEFI_DRIVER, SUP_MODULE_UEFI_APPLICATION, SUP_MODULE_SMM_CORE]:
             ConstructorPrototypeString.Append(gLibraryStructorPrototype['DXE'].Replace(Dict))
             ConstructorCallingString.Append(gLibraryStructorCall['DXE'].Replace(Dict))
-        elif Lib.ModuleType in [SUP_MODULE_MM_STANDALONE,SUP_MODULE_MM_CORE_STANDALONE]:
+        elif Lib.ModuleType in [SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE]:
             ConstructorPrototypeString.Append(gLibraryStructorPrototype['MM'].Replace(Dict))
             ConstructorCallingString.Append(gLibraryStructorCall['MM'].Replace(Dict))
 
@@ -1402,10 +1402,10 @@ def CreateLibraryConstructorCode(Info, AutoGenC, AutoGenH):
             AutoGenC.Append(gLibraryString[SUP_MODULE_BASE].Replace(Dict))
         elif Info.ModuleType in SUP_MODULE_SET_PEI:
             AutoGenC.Append(gLibraryString['PEI'].Replace(Dict))
-        elif Info.ModuleType in [SUP_MODULE_DXE_CORE,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SMM_DRIVER,SUP_MODULE_DXE_RUNTIME_DRIVER,
-                                 SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER,SUP_MODULE_UEFI_APPLICATION,SUP_MODULE_SMM_CORE]:
+        elif Info.ModuleType in [SUP_MODULE_DXE_CORE, SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_SMM_DRIVER, SUP_MODULE_DXE_RUNTIME_DRIVER,
+                                 SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_UEFI_DRIVER, SUP_MODULE_UEFI_APPLICATION, SUP_MODULE_SMM_CORE]:
             AutoGenC.Append(gLibraryString['DXE'].Replace(Dict))
-        elif Info.ModuleType in [SUP_MODULE_MM_STANDALONE,SUP_MODULE_MM_CORE_STANDALONE]:
+        elif Info.ModuleType in [SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE]:
             AutoGenC.Append(gLibraryString['MM'].Replace(Dict))
 
 ## Create code for library destructor
@@ -1435,11 +1435,11 @@ def CreateLibraryDestructorCode(Info, AutoGenC, AutoGenH):
         elif Lib.ModuleType in SUP_MODULE_SET_PEI:
             DestructorPrototypeString.Append(gLibraryStructorPrototype['PEI'].Replace(Dict))
             DestructorCallingString.Append(gLibraryStructorCall['PEI'].Replace(Dict))
-        elif Lib.ModuleType in [SUP_MODULE_DXE_CORE,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SMM_DRIVER,SUP_MODULE_DXE_RUNTIME_DRIVER,
-                                SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER,SUP_MODULE_UEFI_APPLICATION, SUP_MODULE_SMM_CORE]:
+        elif Lib.ModuleType in [SUP_MODULE_DXE_CORE, SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_SMM_DRIVER, SUP_MODULE_DXE_RUNTIME_DRIVER,
+                                SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_UEFI_DRIVER, SUP_MODULE_UEFI_APPLICATION, SUP_MODULE_SMM_CORE]:
             DestructorPrototypeString.Append(gLibraryStructorPrototype['DXE'].Replace(Dict))
             DestructorCallingString.Append(gLibraryStructorCall['DXE'].Replace(Dict))
-        elif Lib.ModuleType in [SUP_MODULE_MM_STANDALONE,SUP_MODULE_MM_CORE_STANDALONE]:
+        elif Lib.ModuleType in [SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE]:
             DestructorPrototypeString.Append(gLibraryStructorPrototype['MM'].Replace(Dict))
             DestructorCallingString.Append(gLibraryStructorCall['MM'].Replace(Dict))
 
@@ -1464,10 +1464,10 @@ def CreateLibraryDestructorCode(Info, AutoGenC, AutoGenH):
             AutoGenC.Append(gLibraryString[SUP_MODULE_BASE].Replace(Dict))
         elif Info.ModuleType in SUP_MODULE_SET_PEI:
             AutoGenC.Append(gLibraryString['PEI'].Replace(Dict))
-        elif Info.ModuleType in [SUP_MODULE_DXE_CORE,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SMM_DRIVER,SUP_MODULE_DXE_RUNTIME_DRIVER,
-                                 SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER,SUP_MODULE_UEFI_APPLICATION,SUP_MODULE_SMM_CORE]:
+        elif Info.ModuleType in [SUP_MODULE_DXE_CORE, SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_SMM_DRIVER, SUP_MODULE_DXE_RUNTIME_DRIVER,
+                                 SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_UEFI_DRIVER, SUP_MODULE_UEFI_APPLICATION, SUP_MODULE_SMM_CORE]:
             AutoGenC.Append(gLibraryString['DXE'].Replace(Dict))
-        elif Info.ModuleType in [SUP_MODULE_MM_STANDALONE,SUP_MODULE_MM_CORE_STANDALONE]:
+        elif Info.ModuleType in [SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE]:
             AutoGenC.Append(gLibraryString['MM'].Replace(Dict))
 
 
@@ -1526,7 +1526,7 @@ def CreateModuleEntryPointCode(Info, AutoGenC, AutoGenH):
         else:
             AutoGenC.Append(gPeimEntryPointString[2].Replace(Dict))
         AutoGenH.Append(gPeimEntryPointPrototype.Replace(Dict))
-    elif Info.ModuleType in [SUP_MODULE_DXE_RUNTIME_DRIVER,SUP_MODULE_DXE_DRIVER,SUP_MODULE_DXE_SAL_DRIVER,SUP_MODULE_UEFI_DRIVER]:
+    elif Info.ModuleType in [SUP_MODULE_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_UEFI_DRIVER]:
         if NumEntryPoints < 2:
             AutoGenC.Append(gUefiDriverEntryPointString[NumEntryPoints].Replace(Dict))
         else:
@@ -1925,7 +1925,7 @@ def BmpImageDecoder(File, Buffer, PaletteIndex, TransParent):
     ImageType, = struct.unpack('2s', Buffer[0:2])
     if ImageType!= 'BM': # BMP file type is 'BM'
         EdkLogger.error("build", FILE_TYPE_MISMATCH, "The file %s is not a standard BMP file." % File.Path)
-    BMP_IMAGE_HEADER = collections.namedtuple('BMP_IMAGE_HEADER', ['bfSize','bfReserved1','bfReserved2','bfOffBits','biSize','biWidth','biHeight','biPlanes','biBitCount', 'biCompression', 'biSizeImage','biXPelsPerMeter','biYPelsPerMeter','biClrUsed','biClrImportant'])
+    BMP_IMAGE_HEADER = collections.namedtuple('BMP_IMAGE_HEADER', ['bfSize', 'bfReserved1', 'bfReserved2', 'bfOffBits', 'biSize', 'biWidth', 'biHeight', 'biPlanes', 'biBitCount', 'biCompression', 'biSizeImage', 'biXPelsPerMeter', 'biYPelsPerMeter', 'biClrUsed', 'biClrImportant'])
     BMP_IMAGE_HEADER_STRUCT = struct.Struct('IHHIIIIHHIIIIII')
     BmpHeader = BMP_IMAGE_HEADER._make(BMP_IMAGE_HEADER_STRUCT.unpack_from(Buffer[2:]))
     #
@@ -2009,7 +2009,7 @@ def CreateHeaderCode(Info, AutoGenC, AutoGenH):
     # file header
     AutoGenH.Append(gAutoGenHeaderString.Replace({'FileName':'AutoGen.h'}))
     # header file Prologue
-    AutoGenH.Append(gAutoGenHPrologueString.Replace({'File':'AUTOGENH','Guid':Info.Guid.replace('-','_')}))
+    AutoGenH.Append(gAutoGenHPrologueString.Replace({'File':'AUTOGENH','Guid':Info.Guid.replace('-', '_')}))
     AutoGenH.Append(gAutoGenHCppPrologueString)
     if Info.AutoGenVersion >= 0x00010005:
         # header files includes
@@ -2085,7 +2085,7 @@ def CreateCode(Info, AutoGenC, AutoGenH, StringH, UniGenCFlag, UniGenBinBuffer,
     if Info.UnicodeFileList:
         FileName = "%sStrDefs.h" % Info.Name
         StringH.Append(gAutoGenHeaderString.Replace({'FileName':FileName}))
-        StringH.Append(gAutoGenHPrologueString.Replace({'File':'STRDEFS', 'Guid':Info.Guid.replace('-','_')}))
+        StringH.Append(gAutoGenHPrologueString.Replace({'File':'STRDEFS', 'Guid':Info.Guid.replace('-', '_')}))
         CreateUnicodeStringCode(Info, AutoGenC, StringH, UniGenCFlag, UniGenBinBuffer)
 
         GuidMacros = []
@@ -2131,7 +2131,7 @@ def CreateCode(Info, AutoGenC, AutoGenH, StringH, UniGenCFlag, UniGenBinBuffer,
     if Info.IdfFileList:
         FileName = "%sImgDefs.h" % Info.Name
         StringIdf.Append(gAutoGenHeaderString.Replace({'FileName':FileName}))
-        StringIdf.Append(gAutoGenHPrologueString.Replace({'File':'IMAGEDEFS', 'Guid':Info.Guid.replace('-','_')}))
+        StringIdf.Append(gAutoGenHPrologueString.Replace({'File':'IMAGEDEFS', 'Guid':Info.Guid.replace('-', '_')}))
         CreateIdfFileCode(Info, AutoGenC, StringIdf, IdfGenCFlag, IdfGenBinBuffer)
 
         StringIdf.Append("\n#endif\n")
diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
index ea73de5fa55f..f3b23ed055fb 100644
--- a/BaseTools/Source/Python/AutoGen/GenMake.py
+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
@@ -745,7 +745,7 @@ cleanlib:
                         if CmdName == 'Trim':
                             SecDepsFileList.append(os.path.join('$(DEBUG_DIR)', os.path.basename(OutputFile).replace('offset', 'efi')))
                         if OutputFile.endswith('.ui') or OutputFile.endswith('.ver'):
-                            SecDepsFileList.append(os.path.join('$(MODULE_DIR)','$(MODULE_FILE)'))
+                            SecDepsFileList.append(os.path.join('$(MODULE_DIR)', '$(MODULE_FILE)'))
                         self.FfsOutputFileList.append((OutputFile, ' '.join(SecDepsFileList), SecCmdStr))
                         if len(SecDepsFileList) > 0:
                             self.ParseSecCmd(SecDepsFileList, CmdTuple)
@@ -867,7 +867,7 @@ cleanlib:
                         for Target in BuildTargets:
                             for i, SingleCommand in enumerate(BuildTargets[Target].Commands):
                                 if FlagDict[Flag]['Macro'] in SingleCommand:
-                                    BuildTargets[Target].Commands[i] = SingleCommand.replace('$(INC)','').replace(FlagDict[Flag]['Macro'], RespMacro)
+                                    BuildTargets[Target].Commands[i] = SingleCommand.replace('$(INC)', '').replace(FlagDict[Flag]['Macro'], RespMacro)
         return RespDict
 
     def ProcessBuildTargetList(self):
diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index 25e4f7246e3a..07ba29a158be 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -574,22 +574,22 @@ def StringArrayToList(StringArray):
 #
 def GetTokenTypeValue(TokenType):
     TokenTypeDict = {
-        "PCD_TYPE_SHIFT":28,
-        "PCD_TYPE_DATA":(0x0 << 28),
-        "PCD_TYPE_HII":(0x8 << 28),
-        "PCD_TYPE_VPD":(0x4 << 28),
+        "PCD_TYPE_SHIFT": 28,
+        "PCD_TYPE_DATA": (0x0 << 28),
+        "PCD_TYPE_HII": (0x8 << 28),
+        "PCD_TYPE_VPD": (0x4 << 28),
 #        "PCD_TYPE_SKU_ENABLED":(0x2 << 28),
-        "PCD_TYPE_STRING":(0x1 << 28),
+        "PCD_TYPE_STRING": (0x1 << 28),
 
-        "PCD_DATUM_TYPE_SHIFT":24,
-        "PCD_DATUM_TYPE_POINTER":(0x0 << 24),
-        "PCD_DATUM_TYPE_UINT8":(0x1 << 24),
-        "PCD_DATUM_TYPE_UINT16":(0x2 << 24),
-        "PCD_DATUM_TYPE_UINT32":(0x4 << 24),
-        "PCD_DATUM_TYPE_UINT64":(0x8 << 24),
+        "PCD_DATUM_TYPE_SHIFT": 24,
+        "PCD_DATUM_TYPE_POINTER": (0x0 << 24),
+        "PCD_DATUM_TYPE_UINT8": (0x1 << 24),
+        "PCD_DATUM_TYPE_UINT16": (0x2 << 24),
+        "PCD_DATUM_TYPE_UINT32": (0x4 << 24),
+        "PCD_DATUM_TYPE_UINT64": (0x8 << 24),
 
-        "PCD_DATUM_TYPE_SHIFT2":20,
-        "PCD_DATUM_TYPE_UINT8_BOOLEAN":(0x1 << 20 | 0x1 << 24),
+        "PCD_DATUM_TYPE_SHIFT2": 20,
+        "PCD_DATUM_TYPE_UINT8_BOOLEAN": (0x1 << 20 | 0x1 << 24),
         }
     return eval(TokenType, TokenTypeDict)
 
@@ -643,7 +643,7 @@ def BuildExDataBase(Dict):
     DbPcdCNameTable = DbStringItemList(0, RawDataList = PcdCNameTableValue, LenList = PcdCNameLen)
     
     PcdNameOffsetTable = Dict['PCD_NAME_OFFSET']
-    DbPcdNameOffsetTable = DbItemList(4,RawDataList = PcdNameOffsetTable)
+    DbPcdNameOffsetTable = DbItemList(4, RawDataList = PcdNameOffsetTable)
     
     SizeTableValue = zip(Dict['SIZE_TABLE_MAXIMUM_LENGTH'], Dict['SIZE_TABLE_CURRENT_LENGTH'])
     DbSizeTableValue = DbSizeTableItemList(2, RawDataList = SizeTableValue)
@@ -678,16 +678,16 @@ def BuildExDataBase(Dict):
     PcdTokenNumberMap = Dict['PCD_ORDER_TOKEN_NUMBER_MAP']
  
     DbNameTotle = ["SkuidValue",  "InitValueUint64", "VardefValueUint64", "InitValueUint32", "VardefValueUint32", "VpdHeadValue", "ExMapTable",
-               "LocalTokenNumberTable", "GuidTable", "StringHeadValue",  "PcdNameOffsetTable","VariableTable", "StringTableLen", "PcdTokenTable", "PcdCNameTable",
+               "LocalTokenNumberTable", "GuidTable", "StringHeadValue",  "PcdNameOffsetTable", "VariableTable", "StringTableLen", "PcdTokenTable", "PcdCNameTable",
                "SizeTableValue", "InitValueUint16", "VardefValueUint16", "InitValueUint8", "VardefValueUint8", "InitValueBoolean",
                "VardefValueBoolean", "UnInitValueUint64", "UnInitValueUint32", "UnInitValueUint16", "UnInitValueUint8", "UnInitValueBoolean"]
  
     DbTotal = [SkuidValue,  InitValueUint64, VardefValueUint64, InitValueUint32, VardefValueUint32, VpdHeadValue, ExMapTable,
-               LocalTokenNumberTable, GuidTable, StringHeadValue,  PcdNameOffsetTable,VariableTable, StringTableLen, PcdTokenTable,PcdCNameTable,
+               LocalTokenNumberTable, GuidTable, StringHeadValue,  PcdNameOffsetTable, VariableTable, StringTableLen, PcdTokenTable, PcdCNameTable,
                SizeTableValue, InitValueUint16, VardefValueUint16, InitValueUint8, VardefValueUint8, InitValueBoolean,
                VardefValueBoolean, UnInitValueUint64, UnInitValueUint32, UnInitValueUint16, UnInitValueUint8, UnInitValueBoolean]
     DbItemTotal = [DbSkuidValue,  DbInitValueUint64, DbVardefValueUint64, DbInitValueUint32, DbVardefValueUint32, DbVpdHeadValue, DbExMapTable,
-               DbLocalTokenNumberTable, DbGuidTable, DbStringHeadValue,  DbPcdNameOffsetTable,DbVariableTable, DbStringTableLen, DbPcdTokenTable, DbPcdCNameTable,
+               DbLocalTokenNumberTable, DbGuidTable, DbStringHeadValue,  DbPcdNameOffsetTable, DbVariableTable, DbStringTableLen, DbPcdTokenTable, DbPcdCNameTable,
                DbSizeTableValue, DbInitValueUint16, DbVardefValueUint16, DbInitValueUint8, DbVardefValueUint8, DbInitValueBoolean,
                DbVardefValueBoolean, DbUnInitValueUint64, DbUnInitValueUint32, DbUnInitValueUint16, DbUnInitValueUint8, DbUnInitValueBoolean]
     
@@ -746,7 +746,7 @@ def BuildExDataBase(Dict):
                         DbOffset += (8 - DbOffset % 8)
             else:
                 assert(False)
-            if isinstance(VariableRefTable[0],list):
+            if isinstance(VariableRefTable[0], list):
                 DbOffset += skuindex * 4   
             skuindex += 1
             if DbIndex >= InitTableNum:
@@ -893,54 +893,54 @@ def CreatePcdDatabaseCode (Info, AutoGenC, AutoGenH):
     Changed = SaveFileOnChange(DbFileName, DbFile.getvalue(), True)
 def CreatePcdDataBase(PcdDBData):
     delta = {}
-    for skuname,skuid in PcdDBData:
-        if len(PcdDBData[(skuname,skuid)][1]) != len(PcdDBData[(TAB_DEFAULT,"0")][1]):
+    for skuname, skuid in PcdDBData:
+        if len(PcdDBData[(skuname, skuid)][1]) != len(PcdDBData[(TAB_DEFAULT, "0")][1]):
             EdkLogger.ERROR("The size of each sku in one pcd are not same")
-    for skuname,skuid in PcdDBData:
+    for skuname, skuid in PcdDBData:
         if skuname == TAB_DEFAULT:
             continue
-        delta[(skuname,skuid)] = [(index,data,hex(data)) for index,data in enumerate(PcdDBData[(skuname,skuid)][1]) if PcdDBData[(skuname,skuid)][1][index] != PcdDBData[(TAB_DEFAULT,"0")][1][index]]
-    databasebuff = PcdDBData[(TAB_DEFAULT,"0")][0]
+        delta[(skuname, skuid)] = [(index, data, hex(data)) for index, data in enumerate(PcdDBData[(skuname, skuid)][1]) if PcdDBData[(skuname, skuid)][1][index] != PcdDBData[(TAB_DEFAULT, "0")][1][index]]
+    databasebuff = PcdDBData[(TAB_DEFAULT, "0")][0]
 
-    for skuname,skuid in delta:
+    for skuname, skuid in delta:
         # 8 byte align
         if len(databasebuff) % 8 > 0:
             for i in range(8 - (len(databasebuff) % 8)):
-                databasebuff += pack("=B",0)
+                databasebuff += pack("=B", 0)
         databasebuff += pack('=Q', int(skuid))
         databasebuff += pack('=Q', 0)
-        databasebuff += pack('=L', 8+8+4+4*len(delta[(skuname,skuid)]))
-        for item in delta[(skuname,skuid)]:
-            databasebuff += pack("=L",item[0])
-            databasebuff = databasebuff[:-1] + pack("=B",item[1])
+        databasebuff += pack('=L', 8+8+4+4*len(delta[(skuname, skuid)]))
+        for item in delta[(skuname, skuid)]:
+            databasebuff += pack("=L", item[0])
+            databasebuff = databasebuff[:-1] + pack("=B", item[1])
     totallen = len(databasebuff)
-    totallenbuff = pack("=L",totallen)
+    totallenbuff = pack("=L", totallen)
     newbuffer = databasebuff[:32]
     for i in range(4):
         newbuffer += totallenbuff[i]
-    for i in range(36,totallen):
+    for i in range(36, totallen):
         newbuffer += databasebuff[i]
 
     return newbuffer
 
 def CreateVarCheckBin(VarCheckTab):
-    return VarCheckTab[(TAB_DEFAULT,"0")]
+    return VarCheckTab[(TAB_DEFAULT, "0")]
 
 def CreateAutoGen(PcdDriverAutoGenData):
     autogenC = TemplateString()
-    for skuname,skuid in PcdDriverAutoGenData:
+    for skuname, skuid in PcdDriverAutoGenData:
         autogenC.Append("//SKUID: %s" % skuname)
-        autogenC.Append(PcdDriverAutoGenData[(skuname,skuid)][1].String)
-    return (PcdDriverAutoGenData[(skuname,skuid)][0],autogenC)
-def NewCreatePcdDatabasePhaseSpecificAutoGen(Platform,Phase):
-    def prune_sku(pcd,skuname):
+        autogenC.Append(PcdDriverAutoGenData[(skuname, skuid)][1].String)
+    return (PcdDriverAutoGenData[(skuname, skuid)][0], autogenC)
+def NewCreatePcdDatabasePhaseSpecificAutoGen(Platform, Phase):
+    def prune_sku(pcd, skuname):
         new_pcd = copy.deepcopy(pcd)
         new_pcd.SkuInfoList = {skuname:pcd.SkuInfoList[skuname]}
         new_pcd.isinit = 'INIT'
         if new_pcd.DatumType in TAB_PCD_CLEAN_NUMERIC_TYPES:
             for skuobj in pcd.SkuInfoList.values():
                 if skuobj.DefaultValue:
-                    defaultvalue = int(skuobj.DefaultValue,16) if skuobj.DefaultValue.upper().startswith("0X") else int(skuobj.DefaultValue,10)
+                    defaultvalue = int(skuobj.DefaultValue, 16) if skuobj.DefaultValue.upper().startswith("0X") else int(skuobj.DefaultValue, 10)
                     if defaultvalue  != 0:
                         new_pcd.isinit = "INIT"
                         break
@@ -951,32 +951,32 @@ def NewCreatePcdDatabasePhaseSpecificAutoGen(Platform,Phase):
                 new_pcd.isinit = "UNINIT"
         return new_pcd
     DynamicPcds = Platform.DynamicPcdList
-    DynamicPcdSet_Sku = {(SkuName,skuobj.SkuId):[] for pcd in DynamicPcds for (SkuName,skuobj) in pcd.SkuInfoList.items() }
-    for skuname,skuid in DynamicPcdSet_Sku:
-        DynamicPcdSet_Sku[(skuname,skuid)] = [prune_sku(pcd,skuname) for pcd in DynamicPcds]
+    DynamicPcdSet_Sku = {(SkuName, skuobj.SkuId):[] for pcd in DynamicPcds for (SkuName, skuobj) in pcd.SkuInfoList.items() }
+    for skuname, skuid in DynamicPcdSet_Sku:
+        DynamicPcdSet_Sku[(skuname, skuid)] = [prune_sku(pcd, skuname) for pcd in DynamicPcds]
     PcdDBData = {}
     PcdDriverAutoGenData = {}
     VarCheckTableData = {}
     if DynamicPcdSet_Sku:
-        for skuname,skuid in DynamicPcdSet_Sku:
-            AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer,VarCheckTab = CreatePcdDatabasePhaseSpecificAutoGen (Platform,DynamicPcdSet_Sku[(skuname,skuid)], Phase)
+        for skuname, skuid in DynamicPcdSet_Sku:
+            AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer, VarCheckTab = CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdSet_Sku[(skuname, skuid)], Phase)
             final_data = ()
             for item in PcdDbBuffer:
-                final_data += unpack("B",item)
-            PcdDBData[(skuname,skuid)] = (PcdDbBuffer, final_data)
-            PcdDriverAutoGenData[(skuname,skuid)] = (AdditionalAutoGenH, AdditionalAutoGenC)
-            VarCheckTableData[(skuname,skuid)] = VarCheckTab
+                final_data += unpack("B", item)
+            PcdDBData[(skuname, skuid)] = (PcdDbBuffer, final_data)
+            PcdDriverAutoGenData[(skuname, skuid)] = (AdditionalAutoGenH, AdditionalAutoGenC)
+            VarCheckTableData[(skuname, skuid)] = VarCheckTab
         if Platform.Platform.VarCheckFlag:
             dest = os.path.join(Platform.BuildDir, TAB_FV_DIRECTORY)
             VarCheckTable = CreateVarCheckBin(VarCheckTableData)
             VarCheckTable.dump(dest, Phase)
         AdditionalAutoGenH, AdditionalAutoGenC =  CreateAutoGen(PcdDriverAutoGenData)
     else:
-        AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer,VarCheckTab = CreatePcdDatabasePhaseSpecificAutoGen (Platform,{}, Phase)
+        AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer, VarCheckTab = CreatePcdDatabasePhaseSpecificAutoGen (Platform, {}, Phase)
         final_data = ()
         for item in PcdDbBuffer:
-            final_data += unpack("B",item)
-        PcdDBData[(TAB_DEFAULT,"0")] = (PcdDbBuffer, final_data)
+            final_data += unpack("B", item)
+        PcdDBData[(TAB_DEFAULT, "0")] = (PcdDbBuffer, final_data)
 
     return AdditionalAutoGenH, AdditionalAutoGenC, CreatePcdDataBase(PcdDBData)
 ## Create PCD database in DXE or PEI phase
@@ -1022,14 +1022,14 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
         Dict['VARDEF_SKUID_' + DatumType] = []
         Dict['VARDEF_VALUE_' + DatumType] = []
         Dict['VARDEF_DB_VALUE_' + DatumType] = []
-        for Init in ['INIT','UNINIT']:
+        for Init in ['INIT', 'UNINIT']:
             Dict[Init+'_CNAME_DECL_' + DatumType]   = []
             Dict[Init+'_GUID_DECL_' + DatumType]    = []
             Dict[Init+'_NUMSKUS_DECL_' + DatumType] = []
             Dict[Init+'_VALUE_' + DatumType]        = []
             Dict[Init+'_DB_VALUE_'+DatumType] = []
             
-    for Type in ['STRING_HEAD','VPD_HEAD','VARIABLE_HEAD']:
+    for Type in ['STRING_HEAD', 'VPD_HEAD', 'VARIABLE_HEAD']:
         Dict[Type + '_CNAME_DECL']   = []
         Dict[Type + '_GUID_DECL']    = []
         Dict[Type + '_NUMSKUS_DECL'] = []
@@ -1190,7 +1190,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
                         Dict['STRING_TABLE_INDEX'].append('')
                     else:
                         Dict['STRING_TABLE_INDEX'].append('_%d' % StringTableIndex)
-                    VarNameSize = len(VariableNameStructure.replace(',',' ').split())
+                    VarNameSize = len(VariableNameStructure.replace(',', ' ').split())
                     Dict['STRING_TABLE_LENGTH'].append(VarNameSize )
                     Dict['STRING_TABLE_VALUE'].append(VariableNameStructure)
                     StringHeadOffsetList.append(str(StringTableSize) + 'U')
@@ -1198,7 +1198,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
                     VarStringDbOffsetList.append(StringTableSize)
                     Dict['STRING_DB_VALUE'].append(VarStringDbOffsetList)
                     StringTableIndex += 1
-                    StringTableSize += len(VariableNameStructure.replace(',',' ').split())
+                    StringTableSize += len(VariableNameStructure.replace(',', ' ').split())
                 VariableHeadStringIndex = 0
                 for Index in range(Dict['STRING_TABLE_VALUE'].index(VariableNameStructure)):
                     VariableHeadStringIndex += Dict['STRING_TABLE_LENGTH'][Index]
@@ -1237,7 +1237,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
                     elif Pcd.DatumType in (TAB_UINT32, TAB_UINT16, TAB_UINT8):
                         Dict['VARDEF_VALUE_'+Pcd.DatumType].append(Sku.HiiDefaultValue + "U")
                     elif Pcd.DatumType == "BOOLEAN":
-                        if eval(Sku.HiiDefaultValue) in [1,0]:
+                        if eval(Sku.HiiDefaultValue) in [1, 0]:
                             Dict['VARDEF_VALUE_'+Pcd.DatumType].append(str(eval(Sku.HiiDefaultValue)) + "U")
                     else:
                         Dict['VARDEF_VALUE_'+Pcd.DatumType].append(Sku.HiiDefaultValue)
@@ -1287,7 +1287,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
                         Dict['STRING_TABLE_INDEX'].append('_%d' % StringTableIndex)
                     if Sku.DefaultValue[0] == 'L':
                         DefaultValueBinStructure = StringToArray(Sku.DefaultValue)
-                        Size = len(DefaultValueBinStructure.replace(',',' ').split())
+                        Size = len(DefaultValueBinStructure.replace(',', ' ').split())
                         Dict['STRING_TABLE_VALUE'].append(DefaultValueBinStructure)
                     elif Sku.DefaultValue[0] == '"':
                         DefaultValueBinStructure = StringToArray(Sku.DefaultValue)
@@ -1599,7 +1599,7 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
 
 #     print Phase
     Buffer = BuildExDataBase(Dict)
-    return AutoGenH, AutoGenC, Buffer,VarCheckTab
+    return AutoGenH, AutoGenC, Buffer, VarCheckTab
 
 def GetOrderedDynamicPcdList(DynamicPcdList, PcdTokenNumberList):
     ReorderedDyPcdList = [None for i in range(len(DynamicPcdList))]
diff --git a/BaseTools/Source/Python/AutoGen/GenVar.py b/BaseTools/Source/Python/AutoGen/GenVar.py
index 3675be8de994..8a73c0436788 100644
--- a/BaseTools/Source/Python/AutoGen/GenVar.py
+++ b/BaseTools/Source/Python/AutoGen/GenVar.py
@@ -14,7 +14,7 @@
 # #
 # Import Modules
 #
-from struct import pack,unpack
+from struct import pack, unpack
 import collections
 import copy
 from Common.VariableAttributes import VariableAttributes
@@ -27,7 +27,7 @@ NvStorageHeaderSize = 28
 VariableHeaderSize = 32
 
 class VariableMgr(object):
-    def __init__(self, DefaultStoreMap,SkuIdMap):
+    def __init__(self, DefaultStoreMap, SkuIdMap):
         self.VarInfo = []
         self.DefaultStoreMap = DefaultStoreMap
         self.SkuIdMap = SkuIdMap
@@ -37,19 +37,19 @@ class VariableMgr(object):
         self.VarDefaultBuff = None
         self.VarDeltaBuff = None
 
-    def append_variable(self,uefi_var):
+    def append_variable(self, uefi_var):
         self.VarInfo.append(uefi_var)
 
-    def SetVpdRegionMaxSize(self,maxsize):
+    def SetVpdRegionMaxSize(self, maxsize):
         self.VpdRegionSize = maxsize
 
-    def SetVpdRegionOffset(self,vpdoffset):
+    def SetVpdRegionOffset(self, vpdoffset):
         self.VpdRegionOffset = vpdoffset
 
-    def PatchNVStoreDefaultMaxSize(self,maxsize):
+    def PatchNVStoreDefaultMaxSize(self, maxsize):
         if not self.NVHeaderBuff:
             return ""
-        self.NVHeaderBuff = self.NVHeaderBuff[:8] + pack("=Q",maxsize)
+        self.NVHeaderBuff = self.NVHeaderBuff[:8] + pack("=Q", maxsize)
         default_var_bin = VariableMgr.format_data(self.NVHeaderBuff + self.VarDefaultBuff + self.VarDeltaBuff)
         value_str = "{"
         default_var_bin_strip = [ data.strip("""'""") for data in default_var_bin]
@@ -59,9 +59,9 @@ class VariableMgr(object):
     def combine_variable(self):
         indexedvarinfo = collections.OrderedDict()
         for item in self.VarInfo:
-            if (item.skuname,item.defaultstoragename, item.var_name,item.var_guid) not in indexedvarinfo:
-                indexedvarinfo[(item.skuname,item.defaultstoragename, item.var_name,item.var_guid) ] = []
-            indexedvarinfo[(item.skuname,item.defaultstoragename, item.var_name,item.var_guid)].append(item)
+            if (item.skuname, item.defaultstoragename, item.var_name, item.var_guid) not in indexedvarinfo:
+                indexedvarinfo[(item.skuname, item.defaultstoragename, item.var_name, item.var_guid) ] = []
+            indexedvarinfo[(item.skuname, item.defaultstoragename, item.var_name, item.var_guid)].append(item)
         for key in indexedvarinfo:
             sku_var_info_offset_list = indexedvarinfo[key]
             if len(sku_var_info_offset_list) == 1:
@@ -74,15 +74,15 @@ class VariableMgr(object):
                     data_flag = DataType.PACK_CODE_BY_SIZE[MAX_SIZE_TYPE[data_type]]
                     data = value_list[0]
                     value_list = []
-                    for data_byte in pack(data_flag,int(data,16) if data.upper().startswith('0X') else int(data)):
-                        value_list.append(hex(unpack("B",data_byte)[0]))
-                newvalue[int(item.var_offset,16) if item.var_offset.upper().startswith("0X") else int(item.var_offset)] = value_list
+                    for data_byte in pack(data_flag, int(data, 16) if data.upper().startswith('0X') else int(data)):
+                        value_list.append(hex(unpack("B", data_byte)[0]))
+                newvalue[int(item.var_offset, 16) if item.var_offset.upper().startswith("0X") else int(item.var_offset)] = value_list
             try:
                 newvaluestr = "{" + ",".join(VariableMgr.assemble_variable(newvalue)) +"}"
             except:
                 EdkLogger.error("build", AUTOGEN_ERROR, "Variable offset conflict in PCDs: %s \n" % (" and ".join(item.pcdname for item in sku_var_info_offset_list)))
             n = sku_var_info_offset_list[0]
-            indexedvarinfo[key] =  [var_info(n.pcdindex,n.pcdname,n.defaultstoragename,n.skuname,n.var_name, n.var_guid, "0x00",n.var_attribute,newvaluestr  , newvaluestr , DataType.TAB_VOID)]
+            indexedvarinfo[key] =  [var_info(n.pcdindex, n.pcdname, n.defaultstoragename, n.skuname, n.var_name, n.var_guid, "0x00", n.var_attribute, newvaluestr, newvaluestr, DataType.TAB_VOID)]
         self.VarInfo = [item[0] for item in indexedvarinfo.values()]
 
     @staticmethod
@@ -105,7 +105,7 @@ class VariableMgr(object):
         for item in self.VarInfo:
             if item.pcdindex not in indexedvarinfo:
                 indexedvarinfo[item.pcdindex] = dict()
-            indexedvarinfo[item.pcdindex][(item.skuname,item.defaultstoragename)] = item
+            indexedvarinfo[item.pcdindex][(item.skuname, item.defaultstoragename)] = item
 
         for index in indexedvarinfo:
             sku_var_info = indexedvarinfo[index]
@@ -113,40 +113,40 @@ class VariableMgr(object):
             default_data_buffer = ""
             others_data_buffer = ""
             tail = None
-            default_sku_default = indexedvarinfo[index].get((DataType.TAB_DEFAULT,DataType.TAB_DEFAULT_STORES_DEFAULT))
+            default_sku_default = indexedvarinfo[index].get((DataType.TAB_DEFAULT, DataType.TAB_DEFAULT_STORES_DEFAULT))
 
             if default_sku_default.data_type not in DataType.TAB_PCD_NUMERIC_TYPES:
                 var_max_len = max(len(var_item.default_value.split(",")) for var_item in sku_var_info.values())
                 if len(default_sku_default.default_value.split(",")) < var_max_len:
                     tail = ",".join("0x00" for i in range(var_max_len-len(default_sku_default.default_value.split(","))))
 
-            default_data_buffer = VariableMgr.PACK_VARIABLES_DATA(default_sku_default.default_value,default_sku_default.data_type,tail)
+            default_data_buffer = VariableMgr.PACK_VARIABLES_DATA(default_sku_default.default_value, default_sku_default.data_type, tail)
 
             default_data_array = ()
             for item in default_data_buffer:
-                default_data_array += unpack("B",item)
+                default_data_array += unpack("B", item)
 
-            var_data[(DataType.TAB_DEFAULT,DataType.TAB_DEFAULT_STORES_DEFAULT)][index] = (default_data_buffer,sku_var_info[(DataType.TAB_DEFAULT,DataType.TAB_DEFAULT_STORES_DEFAULT)])
+            var_data[(DataType.TAB_DEFAULT, DataType.TAB_DEFAULT_STORES_DEFAULT)][index] = (default_data_buffer, sku_var_info[(DataType.TAB_DEFAULT, DataType.TAB_DEFAULT_STORES_DEFAULT)])
 
-            for (skuid,defaultstoragename) in indexedvarinfo[index]:
+            for (skuid, defaultstoragename) in indexedvarinfo[index]:
                 tail = None
-                if (skuid,defaultstoragename) == (DataType.TAB_DEFAULT,DataType.TAB_DEFAULT_STORES_DEFAULT):
+                if (skuid, defaultstoragename) == (DataType.TAB_DEFAULT, DataType.TAB_DEFAULT_STORES_DEFAULT):
                     continue
-                other_sku_other = indexedvarinfo[index][(skuid,defaultstoragename)]
+                other_sku_other = indexedvarinfo[index][(skuid, defaultstoragename)]
 
                 if default_sku_default.data_type not in DataType.TAB_PCD_NUMERIC_TYPES:
                     if len(other_sku_other.default_value.split(",")) < var_max_len:
                         tail = ",".join("0x00" for i in range(var_max_len-len(other_sku_other.default_value.split(","))))
 
-                others_data_buffer = VariableMgr.PACK_VARIABLES_DATA(other_sku_other.default_value,other_sku_other.data_type,tail)
+                others_data_buffer = VariableMgr.PACK_VARIABLES_DATA(other_sku_other.default_value, other_sku_other.data_type, tail)
 
                 others_data_array = ()
                 for item in others_data_buffer:
-                    others_data_array += unpack("B",item)
+                    others_data_array += unpack("B", item)
 
                 data_delta = VariableMgr.calculate_delta(default_data_array, others_data_array)
 
-                var_data[(skuid,defaultstoragename)][index] = (data_delta,sku_var_info[(skuid,defaultstoragename)])
+                var_data[(skuid, defaultstoragename)][index] = (data_delta, sku_var_info[(skuid, defaultstoragename)])
         return var_data
 
     def new_process_varinfo(self):
@@ -157,17 +157,17 @@ class VariableMgr(object):
         if not var_data:
             return []
 
-        pcds_default_data = var_data.get((DataType.TAB_DEFAULT,DataType.TAB_DEFAULT_STORES_DEFAULT),{})
+        pcds_default_data = var_data.get((DataType.TAB_DEFAULT, DataType.TAB_DEFAULT_STORES_DEFAULT), {})
         NvStoreDataBuffer = ""
         var_data_offset = collections.OrderedDict()
         offset = NvStorageHeaderSize
-        for default_data,default_info in pcds_default_data.values():
+        for default_data, default_info in pcds_default_data.values():
             var_name_buffer = VariableMgr.PACK_VARIABLE_NAME(default_info.var_name)
 
             vendorguid = default_info.var_guid.split('-')
 
             if default_info.var_attribute:
-                var_attr_value,_ = VariableAttributes.GetVarAttributes(default_info.var_attribute)
+                var_attr_value, _ = VariableAttributes.GetVarAttributes(default_info.var_attribute)
             else:
                 var_attr_value = 0x07
 
@@ -186,22 +186,22 @@ class VariableMgr(object):
         nv_default_part = VariableMgr.AlignData(VariableMgr.PACK_DEFAULT_DATA(0, 0, VariableMgr.unpack_data(variable_storage_header_buffer+NvStoreDataBuffer)), 8)
 
         data_delta_structure_buffer = ""
-        for skuname,defaultstore in var_data:
-            if (skuname,defaultstore) == (DataType.TAB_DEFAULT,DataType.TAB_DEFAULT_STORES_DEFAULT):
+        for skuname, defaultstore in var_data:
+            if (skuname, defaultstore) == (DataType.TAB_DEFAULT, DataType.TAB_DEFAULT_STORES_DEFAULT):
                 continue
-            pcds_sku_data = var_data[(skuname,defaultstore)]
+            pcds_sku_data = var_data[(skuname, defaultstore)]
             delta_data_set = []
             for pcdindex in pcds_sku_data:
                 offset = var_data_offset[pcdindex]
-                delta_data,_ = pcds_sku_data[pcdindex]
+                delta_data, _ = pcds_sku_data[pcdindex]
                 delta_data = [(item[0] + offset, item[1]) for item in delta_data]
                 delta_data_set.extend(delta_data)
 
-            data_delta_structure_buffer += VariableMgr.AlignData(self.PACK_DELTA_DATA(skuname,defaultstore,delta_data_set), 8)
+            data_delta_structure_buffer += VariableMgr.AlignData(self.PACK_DELTA_DATA(skuname, defaultstore, delta_data_set), 8)
 
         size = len(nv_default_part + data_delta_structure_buffer) + 16
         maxsize = self.VpdRegionSize if self.VpdRegionSize else size
-        NV_Store_Default_Header = VariableMgr.PACK_NV_STORE_DEFAULT_HEADER(size,maxsize)
+        NV_Store_Default_Header = VariableMgr.PACK_NV_STORE_DEFAULT_HEADER(size, maxsize)
 
         self.NVHeaderBuff =  NV_Store_Default_Header
         self.VarDefaultBuff =nv_default_part
@@ -217,7 +217,7 @@ class VariableMgr(object):
     def unpack_data(data):
         final_data = ()
         for item in data:
-            final_data += unpack("B",item)
+            final_data += unpack("B", item)
         return final_data
 
     @staticmethod
@@ -227,7 +227,7 @@ class VariableMgr(object):
         data_delta = []
         for i in range(len(default)):
             if default[i] != theother[i]:
-                data_delta.append((i,theother[i]))
+                data_delta.append((i, theother[i]))
         return data_delta
 
     def dump(self):
@@ -248,36 +248,36 @@ class VariableMgr(object):
         Guid = GuidStructureStringToGuidString(Guid)
         GuidBuffer = PackGUID(Guid.split('-'))
 
-        SizeBuffer = pack('=L',size)
-        FormatBuffer = pack('=B',0x5A)
-        StateBuffer = pack('=B',0xFE)
-        reservedBuffer = pack('=H',0)
-        reservedBuffer += pack('=L',0)
+        SizeBuffer = pack('=L', size)
+        FormatBuffer = pack('=B', 0x5A)
+        StateBuffer = pack('=B', 0xFE)
+        reservedBuffer = pack('=H', 0)
+        reservedBuffer += pack('=L', 0)
 
         return GuidBuffer + SizeBuffer + FormatBuffer + StateBuffer + reservedBuffer
 
     @staticmethod
-    def PACK_NV_STORE_DEFAULT_HEADER(size,maxsize):
-        Signature = pack('=B',ord('N'))
-        Signature += pack("=B",ord('S'))
-        Signature += pack("=B",ord('D'))
-        Signature += pack("=B",ord('B'))
+    def PACK_NV_STORE_DEFAULT_HEADER(size, maxsize):
+        Signature = pack('=B', ord('N'))
+        Signature += pack("=B", ord('S'))
+        Signature += pack("=B", ord('D'))
+        Signature += pack("=B", ord('B'))
 
-        SizeBuffer = pack("=L",size)
-        MaxSizeBuffer = pack("=Q",maxsize)
+        SizeBuffer = pack("=L", size)
+        MaxSizeBuffer = pack("=Q", maxsize)
 
         return Signature + SizeBuffer + MaxSizeBuffer
 
     @staticmethod
-    def PACK_VARIABLE_HEADER(attribute,namesize,datasize,vendorguid):
+    def PACK_VARIABLE_HEADER(attribute, namesize, datasize, vendorguid):
 
-        Buffer = pack('=H',0x55AA) # pack StartID
-        Buffer += pack('=B',0x3F)  # pack State
-        Buffer += pack('=B',0)     # pack reserved
+        Buffer = pack('=H', 0x55AA) # pack StartID
+        Buffer += pack('=B', 0x3F)  # pack State
+        Buffer += pack('=B', 0)     # pack reserved
 
-        Buffer += pack('=L',attribute)
-        Buffer += pack('=L',namesize)
-        Buffer += pack('=L',datasize)
+        Buffer += pack('=L', attribute)
+        Buffer += pack('=L', namesize)
+        Buffer += pack('=L', datasize)
 
         Buffer += PackGUID(vendorguid)
 
@@ -289,66 +289,66 @@ class VariableMgr(object):
         data_len = 0
         if data_type == DataType.TAB_VOID:
             for value_char in var_value.strip("{").strip("}").split(","):
-                Buffer += pack("=B",int(value_char,16))
+                Buffer += pack("=B", int(value_char, 16))
             data_len += len(var_value.split(","))
             if tail:
                 for value_char in tail.split(","):
-                    Buffer += pack("=B",int(value_char,16))
+                    Buffer += pack("=B", int(value_char, 16))
                 data_len += len(tail.split(","))
         elif data_type == "BOOLEAN":
-            Buffer += pack("=B",True) if var_value.upper() == "TRUE" else pack("=B",False)
+            Buffer += pack("=B", True) if var_value.upper() == "TRUE" else pack("=B", False)
             data_len += 1
         elif data_type  == DataType.TAB_UINT8:
-            Buffer += pack("=B",GetIntegerValue(var_value))
+            Buffer += pack("=B", GetIntegerValue(var_value))
             data_len += 1
         elif data_type == DataType.TAB_UINT16:
-            Buffer += pack("=H",GetIntegerValue(var_value))
+            Buffer += pack("=H", GetIntegerValue(var_value))
             data_len += 2
         elif data_type == DataType.TAB_UINT32:
-            Buffer += pack("=L",GetIntegerValue(var_value))
+            Buffer += pack("=L", GetIntegerValue(var_value))
             data_len += 4
         elif data_type == DataType.TAB_UINT64:
-            Buffer += pack("=Q",GetIntegerValue(var_value))
+            Buffer += pack("=Q", GetIntegerValue(var_value))
             data_len += 8
 
         return Buffer
 
     @staticmethod
-    def PACK_DEFAULT_DATA(defaultstoragename,skuid,var_value):
+    def PACK_DEFAULT_DATA(defaultstoragename, skuid, var_value):
         Buffer = ""
-        Buffer += pack("=L",4+8+8)
-        Buffer += pack("=Q",int(skuid))
-        Buffer += pack("=Q",int(defaultstoragename))
+        Buffer += pack("=L", 4+8+8)
+        Buffer += pack("=Q", int(skuid))
+        Buffer += pack("=Q", int(defaultstoragename))
 
         for item in var_value:
-            Buffer += pack("=B",item)
+            Buffer += pack("=B", item)
 
-        Buffer = pack("=L",len(Buffer)+4) + Buffer
+        Buffer = pack("=L", len(Buffer)+4) + Buffer
 
         return Buffer
 
-    def GetSkuId(self,skuname):
+    def GetSkuId(self, skuname):
         if skuname not in self.SkuIdMap:
             return None
         return self.SkuIdMap.get(skuname)[0]
 
-    def GetDefaultStoreId(self,dname):
+    def GetDefaultStoreId(self, dname):
         if dname not in self.DefaultStoreMap:
             return None
         return self.DefaultStoreMap.get(dname)[0]
 
-    def PACK_DELTA_DATA(self,skuname,defaultstoragename,delta_list):
+    def PACK_DELTA_DATA(self, skuname, defaultstoragename, delta_list):
         skuid = self.GetSkuId(skuname)
         defaultstorageid = self.GetDefaultStoreId(defaultstoragename)
         Buffer = ""
-        Buffer += pack("=L",4+8+8)
-        Buffer += pack("=Q",int(skuid))
-        Buffer += pack("=Q",int(defaultstorageid))
-        for (delta_offset,value) in delta_list:
-            Buffer += pack("=L",delta_offset)
-            Buffer = Buffer[:-1] + pack("=B",value)
+        Buffer += pack("=L", 4+8+8)
+        Buffer += pack("=Q", int(skuid))
+        Buffer += pack("=Q", int(defaultstorageid))
+        for (delta_offset, value) in delta_list:
+            Buffer += pack("=L", delta_offset)
+            Buffer = Buffer[:-1] + pack("=B", value)
 
-        Buffer = pack("=L",len(Buffer) + 4) + Buffer
+        Buffer = pack("=L", len(Buffer) + 4) + Buffer
 
         return Buffer
 
@@ -357,7 +357,7 @@ class VariableMgr(object):
         mybuffer = data
         if (len(data) % align) > 0:
             for i in range(align - (len(data) % align)):
-                mybuffer += pack("=B",0)
+                mybuffer += pack("=B", 0)
 
         return mybuffer
 
@@ -365,6 +365,6 @@ class VariableMgr(object):
     def PACK_VARIABLE_NAME(var_name):
         Buffer = ""
         for name_char in var_name.strip("{").strip("}").split(","):
-            Buffer += pack("=B",int(name_char,16))
+            Buffer += pack("=B", int(name_char, 16))
 
         return Buffer
diff --git a/BaseTools/Source/Python/AutoGen/StrGather.py b/BaseTools/Source/Python/AutoGen/StrGather.py
index ce8866f480d5..9f70d4e5b717 100644
--- a/BaseTools/Source/Python/AutoGen/StrGather.py
+++ b/BaseTools/Source/Python/AutoGen/StrGather.py
@@ -94,7 +94,7 @@ PRINTABLE_LANGUAGE_NAME_STRING_NAME = '$PRINTABLE_LANGUAGE_NAME'
 # @retval:       The formatted hex string
 #
 def DecToHexStr(Dec, Digit = 8):
-    return '0x{0:0{1}X}'.format(Dec,Digit)
+    return '0x{0:0{1}X}'.format(Dec, Digit)
 
 ## Convert a dec number to a hex list
 #
@@ -109,7 +109,7 @@ def DecToHexStr(Dec, Digit = 8):
 # @retval:       A list for formatted hex string
 #
 def DecToHexList(Dec, Digit = 8):
-    Hex = '{0:0{1}X}'.format(Dec,Digit)
+    Hex = '{0:0{1}X}'.format(Dec, Digit)
     return ["0x" + Hex[Bit:Bit + 2] for Bit in range(Digit - 2, -1, -2)]
 
 ## Convert a acsii string to a hex list
diff --git a/BaseTools/Source/Python/BPDG/GenVpd.py b/BaseTools/Source/Python/BPDG/GenVpd.py
index 69a9665f5a76..807d0fa8d86f 100644
--- a/BaseTools/Source/Python/BPDG/GenVpd.py
+++ b/BaseTools/Source/Python/BPDG/GenVpd.py
@@ -349,7 +349,7 @@ class GenVPD :
                 #
                 # Enhanced for support "|" character in the string.
                 #
-                ValueList = ['', '', '', '','']
+                ValueList = ['', '', '', '', '']
 
                 ValueRe = re.compile(r'\s*L?\".*\|.*\"\s*$')
                 PtrValue = ValueRe.findall(line)
@@ -399,7 +399,7 @@ class GenVPD :
         count = 0
         for line in self.FileLinesList:
             if line is not None :
-                PCD = PcdEntry(line[0], line[1], line[2], line[3], line[4],line[5], self.InputFileName)   
+                PCD = PcdEntry(line[0], line[1], line[2], line[3], line[4], line[5], self.InputFileName)
                 # Strip the space char
                 PCD.PcdCName     = PCD.PcdCName.strip(' ')
                 PCD.SkuId        = PCD.SkuId.strip(' ')
@@ -513,10 +513,10 @@ class GenVPD :
         index =0
         for pcd in self.PcdUnknownOffsetList:
             index += 1
-            if pcd.PcdCName == ".".join(("gEfiMdeModulePkgTokenSpaceGuid","PcdNvStoreDefaultValueBuffer")):
+            if pcd.PcdCName == ".".join(("gEfiMdeModulePkgTokenSpaceGuid", "PcdNvStoreDefaultValueBuffer")):
                 if index != len(self.PcdUnknownOffsetList):
                     for i in range(len(self.PcdUnknownOffsetList) - index):
-                        self.PcdUnknownOffsetList[index+i -1 ] , self.PcdUnknownOffsetList[index+i] = self.PcdUnknownOffsetList[index+i] , self.PcdUnknownOffsetList[index+i -1]
+                        self.PcdUnknownOffsetList[index+i -1 ], self.PcdUnknownOffsetList[index+i] = self.PcdUnknownOffsetList[index+i], self.PcdUnknownOffsetList[index+i -1]
 
         #
         # Process all Offset value are "*"
@@ -597,7 +597,7 @@ class GenVPD :
                                 eachUnfixedPcd.PcdOffset    = str(hex(LastOffset))
                                 eachUnfixedPcd.PcdBinOffset = LastOffset
                                 # Insert this pcd into fixed offset pcd list.
-                                self.PcdFixedOffsetSizeList.insert(FixOffsetSizeListCount,eachUnfixedPcd)
+                                self.PcdFixedOffsetSizeList.insert(FixOffsetSizeListCount, eachUnfixedPcd)
                                 
                                 # Delete the item's offset that has been fixed and added into fixed offset list
                                 self.PcdUnknownOffsetList.pop(countOfUnfixedList)
@@ -685,7 +685,7 @@ class GenVPD :
         for eachPcd in self.PcdFixedOffsetSizeList  :
             # write map file
             try :
-                fMapFile.write("%s | %s | %s | %s | %s  \n" % (eachPcd.PcdCName, eachPcd.SkuId,eachPcd.PcdOffset, eachPcd.PcdSize,eachPcd.PcdUnpackValue))
+                fMapFile.write("%s | %s | %s | %s | %s  \n" % (eachPcd.PcdCName, eachPcd.SkuId, eachPcd.PcdOffset, eachPcd.PcdSize, eachPcd.PcdUnpackValue))
             except:
                 EdkLogger.error("BPDG", BuildToolError.FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the file been locked or using by other applications." % self.MapFileName, None)
 
diff --git a/BaseTools/Source/Python/Common/DataType.py b/BaseTools/Source/Python/Common/DataType.py
index 154245ca317d..473fb7473a5a 100644
--- a/BaseTools/Source/Python/Common/DataType.py
+++ b/BaseTools/Source/Python/Common/DataType.py
@@ -531,8 +531,8 @@ PCDS_DYNAMICEX_DEFAULT = "PcdsDynamicExDefault"
 PCDS_DYNAMICEX_VPD = "PcdsDynamicExVpd"
 PCDS_DYNAMICEX_HII = "PcdsDynamicExHii"
 
-SECTIONS_HAVE_ITEM_PCD_SET = {PCDS_DYNAMIC_DEFAULT.upper(),PCDS_DYNAMIC_VPD.upper(),PCDS_DYNAMIC_HII.upper(), \
-                              PCDS_DYNAMICEX_DEFAULT.upper(),PCDS_DYNAMICEX_VPD.upper(),PCDS_DYNAMICEX_HII.upper()}
+SECTIONS_HAVE_ITEM_PCD_SET = {PCDS_DYNAMIC_DEFAULT.upper(), PCDS_DYNAMIC_VPD.upper(), PCDS_DYNAMIC_HII.upper(), \
+                              PCDS_DYNAMICEX_DEFAULT.upper(), PCDS_DYNAMICEX_VPD.upper(), PCDS_DYNAMICEX_HII.upper()}
 # Section allowed to have items after arch
 SECTIONS_HAVE_ITEM_AFTER_ARCH_SET = {TAB_LIBRARY_CLASSES.upper(), TAB_DEPEX.upper(), TAB_USER_EXTENSIONS.upper(),
                                  PCDS_DYNAMIC_DEFAULT.upper(),
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index c63030a16e6e..9ff4f104256e 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -569,7 +569,7 @@ class ValueExpression(BaseExpression):
         IsArray = IsGuid = False
         if len(Token.split(',')) == 11 and len(Token.split(',{')) == 2 \
             and len(Token.split('},')) == 1:
-            HexLen = [11,6,6,5,4,4,4,4,4,4,6]
+            HexLen = [11, 6, 6, 5, 4, 4, 4, 4, 4, 4, 6]
             HexList= Token.split(',')
             if HexList[3].startswith('{') and \
                 not [Index for Index, Hex in enumerate(HexList) if len(Hex) > HexLen[Index]]:
@@ -765,7 +765,7 @@ class ValueExpression(BaseExpression):
     # Parse operator
     def _GetOperator(self):
         self.__SkipWS()
-        LegalOpLst = ['&&', '||', '!=', '==', '>=', '<='] + self.NonLetterOpLst + ['?',':']
+        LegalOpLst = ['&&', '||', '!=', '==', '>=', '<='] + self.NonLetterOpLst + ['?', ':']
 
         self._Token = ''
         Expr = self._Expr[self._Idx:]
@@ -842,7 +842,7 @@ class ValueExpressionEx(ValueExpression):
                         elif Item.startswith(TAB_UINT64):
                             ItemSize = 8
                             ValueType = TAB_UINT64
-                        elif Item[0] in {'"',"'",'L'}:
+                        elif Item[0] in {'"', "'", 'L'}:
                             ItemSize = 0
                             ValueType = TAB_VOID
                         else:
@@ -946,7 +946,7 @@ class ValueExpressionEx(ValueExpression):
                             # replace each offset, except errors
                             for Offset in OffsetList:
                                 try:
-                                    Item = Item.replace('OFFSET_OF({})'.format(Offset),LabelDict[Offset])
+                                    Item = Item.replace('OFFSET_OF({})'.format(Offset), LabelDict[Offset])
                                 except:
                                     raise BadExpression('%s not defined' % Offset)
 
@@ -999,7 +999,7 @@ class ValueExpressionEx(ValueExpression):
                                 Item = '0x%x' % TmpValue if type(TmpValue) != type('') else TmpValue
                                 if ItemSize == 0:
                                     ItemValue, ItemSize = ParseFieldValue(Item)
-                                    if Item[0] not in {'"','L','{'} and ItemSize > 1:
+                                    if Item[0] not in {'"', 'L', '{'} and ItemSize > 1:
                                         raise BadExpression("Byte  array number %s should less than 0xFF." % Item)
                                 else:
                                     ItemValue = ParseFieldValue(Item)[0]
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index 01171adb9b9e..fd53b6b046c4 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -132,7 +132,7 @@ def _parseForGCC(lines, efifilepath, varnames):
                     if Str:
                         m = pcdPatternGcc.match(Str.strip())
                         if m is not None:
-                            varoffset.append((varname, int(m.groups(0)[0], 16) , int(sections[-1][1], 16), sections[-1][0]))
+                            varoffset.append((varname, int(m.groups(0)[0], 16), int(sections[-1][1], 16), sections[-1][0]))
 
     if not varoffset:
         return []
@@ -1469,7 +1469,7 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=''):
 #         Value, Size = ParseFieldValue(Value)
         if Size:
             try:
-                int(Size,16) if Size.upper().startswith("0X") else int(Size)
+                int(Size, 16) if Size.upper().startswith("0X") else int(Size)
             except:
                 IsValid = False
                 Size = -1
@@ -1490,7 +1490,7 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=''):
 
         if Size:
             try:
-                int(Size,16) if Size.upper().startswith("0X") else int(Size)
+                int(Size, 16) if Size.upper().startswith("0X") else int(Size)
             except:
                 IsValid = False
                 Size = -1
@@ -1512,7 +1512,7 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=''):
             IsValid = (len(FieldList) <= 3)
         if Size:
             try:
-                int(Size,16) if Size.upper().startswith("0X") else int(Size)
+                int(Size, 16) if Size.upper().startswith("0X") else int(Size)
             except:
                 IsValid = False
                 Size = -1
@@ -1670,7 +1670,7 @@ def ConvertStringToByteArray(Value):
 
     Value = eval(Value)         # translate escape character
     NewValue = '{'
-    for Index in range(0,len(Value)):
+    for Index in range(0, len(Value)):
         if Unicode:
             NewValue = NewValue + str(ord(Value[Index]) % 0x10000) + ','
         else:
@@ -1914,28 +1914,28 @@ class PeImageClass():
         return Value
 
 class DefaultStore():
-    def __init__(self,DefaultStores ):
+    def __init__(self, DefaultStores ):
 
         self.DefaultStores = DefaultStores
-    def DefaultStoreID(self,DefaultStoreName):
-        for key,value in self.DefaultStores.items():
+    def DefaultStoreID(self, DefaultStoreName):
+        for key, value in self.DefaultStores.items():
             if value == DefaultStoreName:
                 return key
         return None
     def GetDefaultDefault(self):
         if not self.DefaultStores or "0" in self.DefaultStores:
-            return "0",TAB_DEFAULT_STORES_DEFAULT
+            return "0", TAB_DEFAULT_STORES_DEFAULT
         else:
             minvalue = min(int(value_str) for value_str in self.DefaultStores)
             return (str(minvalue), self.DefaultStores[str(minvalue)])
-    def GetMin(self,DefaultSIdList):
+    def GetMin(self, DefaultSIdList):
         if not DefaultSIdList:
             return TAB_DEFAULT_STORES_DEFAULT
         storeidset = {storeid for storeid, storename in self.DefaultStores.values() if storename in DefaultSIdList}
         if not storeidset:
             return ""
         minid = min(storeidset )
-        for sid,name in self.DefaultStores.values():
+        for sid, name in self.DefaultStores.values():
             if sid == minid:
                 return name
 class SkuClass():
@@ -1950,7 +1950,7 @@ class SkuClass():
 
         for SkuName in SkuIds:
             SkuId = SkuIds[SkuName][0]
-            skuid_num = int(SkuId,16) if SkuId.upper().startswith("0X") else int(SkuId)
+            skuid_num = int(SkuId, 16) if SkuId.upper().startswith("0X") else int(SkuId)
             if skuid_num > 0xFFFFFFFFFFFFFFFF:
                 EdkLogger.error("build", PARAMETER_INVALID,
                             ExtraData = "SKU-ID [%s] value %s exceeds the max value of UINT64"
@@ -2003,9 +2003,9 @@ class SkuClass():
             self.__SkuInherit = {}
             for item in self.SkuData.values():
                 self.__SkuInherit[item[1]]=item[2] if item[2] else "DEFAULT"
-        return self.__SkuInherit.get(skuname,"DEFAULT")
+        return self.__SkuInherit.get(skuname, "DEFAULT")
 
-    def GetSkuChain(self,sku):
+    def GetSkuChain(self, sku):
         if sku == "DEFAULT":
             return ["DEFAULT"]
         skulist = [sku]
diff --git a/BaseTools/Source/Python/Common/RangeExpression.py b/BaseTools/Source/Python/Common/RangeExpression.py
index 4c29bc9ee4bd..1cf975ba7bef 100644
--- a/BaseTools/Source/Python/Common/RangeExpression.py
+++ b/BaseTools/Source/Python/Common/RangeExpression.py
@@ -17,7 +17,7 @@ from Common.GlobalData import *
 from CommonDataClass.Exceptions import BadExpression
 from CommonDataClass.Exceptions import WrnExpression
 import uuid
-from Common.Expression import PcdPattern,BaseExpression
+from Common.Expression import PcdPattern, BaseExpression
 from Common.DataType import *
 
 ERR_STRING_EXPR = 'This operator cannot be used in string expression: [%s].'
@@ -167,7 +167,7 @@ class EQOperatorObject(object):
             raise BadExpression(ERR_SNYTAX % Expr)
         rangeId1 = str(uuid.uuid1())
         rangeContainer = RangeContainer()
-        rangeContainer.push(RangeObject(int(Operand) , int(Operand)))
+        rangeContainer.push(RangeObject(int(Operand), int(Operand)))
         SymbolTable[rangeId1] = rangeContainer
         return rangeId1   
     
@@ -453,7 +453,7 @@ class RangeExpression(BaseExpression):
 
     # [!]*A
     def _RelExpr(self):
-        if self._IsOperator({"NOT" , "LE", "GE", "LT", "GT", "EQ", "XOR"}):
+        if self._IsOperator({"NOT", "LE", "GE", "LT", "GT", "EQ", "XOR"}):
             Token = self._Token
             Val = self._NeExpr()
             try:
diff --git a/BaseTools/Source/Python/Common/StringUtils.py b/BaseTools/Source/Python/Common/StringUtils.py
index 2292a263b985..25dd4b264c2f 100644
--- a/BaseTools/Source/Python/Common/StringUtils.py
+++ b/BaseTools/Source/Python/Common/StringUtils.py
@@ -750,7 +750,7 @@ def SplitString(String):
 # @param StringList:  A list for strings to be converted
 #
 def ConvertToSqlString(StringList):
-    return map(lambda s: s.replace("'", "''") , StringList)
+    return map(lambda s: s.replace("'", "''"), StringList)
 
 ## Convert To Sql String
 #
diff --git a/BaseTools/Source/Python/Common/ToolDefClassObject.py b/BaseTools/Source/Python/Common/ToolDefClassObject.py
index dd985ab30359..fb95a0353cef 100644
--- a/BaseTools/Source/Python/Common/ToolDefClassObject.py
+++ b/BaseTools/Source/Python/Common/ToolDefClassObject.py
@@ -26,9 +26,9 @@ from Common.StringUtils import NormPath
 import Common.GlobalData as GlobalData
 from Common import GlobalData
 from Common.MultipleWorkspace import MultipleWorkspace as mws
-from DataType import TAB_TOD_DEFINES_TARGET,TAB_TOD_DEFINES_TOOL_CHAIN_TAG,\
-                     TAB_TOD_DEFINES_TARGET_ARCH,TAB_TOD_DEFINES_COMMAND_TYPE\
-                     ,TAB_TOD_DEFINES_FAMILY,TAB_TOD_DEFINES_BUILDRULEFAMILY
+from DataType import TAB_TOD_DEFINES_TARGET, TAB_TOD_DEFINES_TOOL_CHAIN_TAG,\
+                     TAB_TOD_DEFINES_TARGET_ARCH, TAB_TOD_DEFINES_COMMAND_TYPE\
+                     , TAB_TOD_DEFINES_FAMILY, TAB_TOD_DEFINES_BUILDRULEFAMILY
 
 
 ##
diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index ddabe9fb2546..93175d41e9f7 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -88,7 +88,7 @@ class VpdInfoFile:
     #
     #  @param offset integer value for VPD's offset in specific SKU.
     #
-    def Add(self, Vpd, skuname,Offset):
+    def Add(self, Vpd, skuname, Offset):
         if (Vpd is None):
             EdkLogger.error("VpdInfoFile", BuildToolError.ATTRIBUTE_UNKNOWN_ERROR, "Invalid VPD PCD entry.")
         
@@ -140,7 +140,7 @@ class VpdInfoFile:
                 if PcdValue == "" :
                     PcdValue  = Pcd.DefaultValue
 
-                Content += "%s.%s|%s|%s|%s|%s  \n" % (Pcd.TokenSpaceGuidCName, PcdTokenCName, skuname,str(self._VpdArray[Pcd][skuname]).strip(), str(Pcd.MaxDatumSize).strip(),PcdValue)
+                Content += "%s.%s|%s|%s|%s|%s  \n" % (Pcd.TokenSpaceGuidCName, PcdTokenCName, skuname, str(self._VpdArray[Pcd][skuname]).strip(), str(Pcd.MaxDatumSize).strip(), PcdValue)
                 i += 1
 
         return SaveFileOnChange(FilePath, Content, False)
@@ -169,8 +169,8 @@ class VpdInfoFile:
             # the line must follow output format defined in BPDG spec.
             #
             try:
-                PcdName, SkuId,Offset, Size, Value = Line.split("#")[0].split("|")
-                PcdName, SkuId,Offset, Size, Value = PcdName.strip(), SkuId.strip(),Offset.strip(), Size.strip(), Value.strip()
+                PcdName, SkuId, Offset, Size, Value = Line.split("#")[0].split("|")
+                PcdName, SkuId, Offset, Size, Value = PcdName.strip(), SkuId.strip(), Offset.strip(), Size.strip(), Value.strip()
                 TokenSpaceName, PcdTokenName = PcdName.split(".")
             except:
                 EdkLogger.error("BPDG", BuildToolError.PARSER_ERROR, "Fail to parse VPD information file %s" % FilePath)
@@ -179,7 +179,7 @@ class VpdInfoFile:
             
             if (TokenSpaceName, PcdTokenName) not in self._VpdInfo:
                 self._VpdInfo[(TokenSpaceName, PcdTokenName)] = []
-            self._VpdInfo[(TokenSpaceName, PcdTokenName)].append((SkuId,Offset, Value))
+            self._VpdInfo[(TokenSpaceName, PcdTokenName)].append((SkuId, Offset, Value))
             for VpdObject in self._VpdArray:
                 VpdObjectTokenCName = VpdObject.TokenCName
                 for PcdItem in GlobalData.MixedPcd:
diff --git a/BaseTools/Source/Python/Ecc/CParser.py b/BaseTools/Source/Python/Ecc/CParser.py
index d5fd3a37a167..d7eff138da57 100644
--- a/BaseTools/Source/Python/Ecc/CParser.py
+++ b/BaseTools/Source/Python/Ecc/CParser.py
@@ -785,10 +785,10 @@ class CParser(Parser):
                 if self.backtracking == 0:
                           
                     if d is not None:
-                      self.function_definition_stack[-1].ModifierText = self.input.toString(d.start,d.stop)
+                      self.function_definition_stack[-1].ModifierText = self.input.toString(d.start, d.stop)
                     else:
                       self.function_definition_stack[-1].ModifierText = ''
-                    self.function_definition_stack[-1].DeclText = self.input.toString(declarator1.start,declarator1.stop)
+                    self.function_definition_stack[-1].DeclText = self.input.toString(declarator1.start, declarator1.stop)
                     self.function_definition_stack[-1].DeclLine = declarator1.start.line
                     self.function_definition_stack[-1].DeclOffset = declarator1.start.charPositionInLine
                     if a is not None:
@@ -922,9 +922,9 @@ class CParser(Parser):
                     if self.backtracking == 0:
                             
                         if b is not None:
-                          self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, self.input.toString(b.start,b.stop), self.input.toString(c.start,c.stop))
+                          self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, self.input.toString(b.start, b.stop), self.input.toString(c.start, c.stop))
                         else:
-                          self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, '', self.input.toString(c.start,c.stop))
+                          self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, '', self.input.toString(c.start, c.stop))
                         	  
 
 
@@ -959,7 +959,7 @@ class CParser(Parser):
                     if self.backtracking == 0:
                           
                         if t is not None:
-                          self.StoreVariableDeclaration(s.start.line, s.start.charPositionInLine, t.start.line, t.start.charPositionInLine, self.input.toString(s.start,s.stop), self.input.toString(t.start,t.stop))
+                          self.StoreVariableDeclaration(s.start.line, s.start.charPositionInLine, t.start.line, t.start.charPositionInLine, self.input.toString(s.start, s.stop), self.input.toString(t.start, t.stop))
                         	
 
 
@@ -1403,7 +1403,7 @@ class CParser(Parser):
                     if self.backtracking == 0:
                           
                         if s.stop is not None:
-                          self.StoreStructUnionDefinition(s.start.line, s.start.charPositionInLine, s.stop.line, s.stop.charPositionInLine, self.input.toString(s.start,s.stop))
+                          self.StoreStructUnionDefinition(s.start.line, s.start.charPositionInLine, s.stop.line, s.stop.charPositionInLine, self.input.toString(s.start, s.stop))
                         	
 
 
@@ -1418,7 +1418,7 @@ class CParser(Parser):
                     if self.backtracking == 0:
                           
                         if e.stop is not None:
-                          self.StoreEnumerationDefinition(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                          self.StoreEnumerationDefinition(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
                         	
 
 
@@ -5401,7 +5401,7 @@ class CParser(Parser):
                 if self.failed:
                     return 
                 if self.backtracking == 0:
-                    self.postfix_expression_stack[-1].FuncCallText += self.input.toString(p.start,p.stop)
+                    self.postfix_expression_stack[-1].FuncCallText += self.input.toString(p.start, p.stop)
 
                 # C.g:407:9: ( '[' expression ']' | '(' a= ')' | '(' c= argument_expression_list b= ')' | '(' macro_parameter_list ')' | '.' x= IDENTIFIER | '*' y= IDENTIFIER | '->' z= IDENTIFIER | '++' | '--' )*
                 while True: #loop65
@@ -5501,7 +5501,7 @@ class CParser(Parser):
                         if self.failed:
                             return 
                         if self.backtracking == 0:
-                            self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, b.line, b.charPositionInLine, self.postfix_expression_stack[-1].FuncCallText, self.input.toString(c.start,c.stop))
+                            self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, b.line, b.charPositionInLine, self.postfix_expression_stack[-1].FuncCallText, self.input.toString(c.start, c.stop))
 
 
 
@@ -8277,7 +8277,7 @@ class CParser(Parser):
                     if self.failed:
                         return 
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
 
 
@@ -16384,7 +16384,7 @@ class CParser(Parser):
                     if self.failed:
                         return 
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
                     self.following.append(self.FOLLOW_statement_in_selection_statement2284)
                     self.statement()
@@ -16503,7 +16503,7 @@ class CParser(Parser):
                     if self.failed:
                         return 
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
 
 
@@ -16535,7 +16535,7 @@ class CParser(Parser):
                     if self.failed:
                         return 
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
 
 
@@ -16582,7 +16582,7 @@ class CParser(Parser):
                     if self.failed:
                         return 
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
 
 
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
index fd96bb9a3c0b..b3b0ede7e8f3 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
@@ -561,7 +561,7 @@ class InfParser(MetaFileParser):
                     NmakeLine = ''
 
             # section content
-            self._ValueList = ['','','']
+            self._ValueList = ['', '', '']
             # parse current line, result will be put in self._ValueList
             self._SectionParser[self._SectionType](self)
             if self._ValueList is None or self._ItemType == MODEL_META_DATA_DEFINE:
@@ -920,7 +920,7 @@ class DscParser(MetaFileParser):
 
     ## Directive statement parser
     def _DirectiveParser(self):
-        self._ValueList = ['','','']
+        self._ValueList = ['', '', '']
         TokenList = GetSplitValueList(self._CurrentLine, ' ', 1)
         self._ValueList[0:len(TokenList)] = TokenList
 
@@ -1110,7 +1110,7 @@ class DscParser(MetaFileParser):
 
     ## Override parent's method since we'll do all macro replacements in parser
     def _GetMacros(self):
-        Macros = dict( [('ARCH','IA32'), ('FAMILY','MSFT'),('TOOL_CHAIN_TAG','VS2008x86'),('TARGET','DEBUG')])
+        Macros = dict( [('ARCH', 'IA32'), ('FAMILY', 'MSFT'), ('TOOL_CHAIN_TAG', 'VS2008x86'), ('TARGET', 'DEBUG')])
         Macros.update(self._FileLocalMacros)
         Macros.update(self._GetApplicableSectionMacro())
         Macros.update(GlobalData.gEdkGlobal)
@@ -1225,7 +1225,7 @@ class DscParser(MetaFileParser):
         self._RawTable.Drop()
         self._Table.Drop()
         for Record in RecordList:
-            EccGlobalData.gDb.TblDsc.Insert(Record[1],Record[2],Record[3],Record[4],Record[5],Record[6],Record[7],Record[8],Record[9],Record[10],Record[11],Record[12],Record[13],Record[14])
+            EccGlobalData.gDb.TblDsc.Insert(Record[1], Record[2], Record[3], Record[4], Record[5], Record[6], Record[7], Record[8], Record[9], Record[10], Record[11], Record[12], Record[13], Record[14])
         GlobalData.gPlatformDefines.update(self._FileLocalMacros)
         self._PostProcessed = True
         self._Content = None
@@ -1246,7 +1246,7 @@ class DscParser(MetaFileParser):
 
     def __RetrievePcdValue(self):
         Records = self._RawTable.Query(MODEL_PCD_FEATURE_FLAG, BelongsToItem=-1.0)
-        for TokenSpaceGuid,PcdName,Value,Dummy2,Dummy3,ID,Line in Records:
+        for TokenSpaceGuid, PcdName, Value, Dummy2, Dummy3, ID, Line in Records:
             Value, DatumType, MaxDatumSize = AnalyzePcdData(Value)
             # Only use PCD whose value is straitforward (no macro and PCD)
             if self.SymbolPattern.findall(Value):
@@ -1259,7 +1259,7 @@ class DscParser(MetaFileParser):
             self._Symbols[Name] = Value
 
         Records = self._RawTable.Query(MODEL_PCD_FIXED_AT_BUILD, BelongsToItem=-1.0)
-        for TokenSpaceGuid,PcdName,Value,Dummy2,Dummy3,ID,Line in Records:
+        for TokenSpaceGuid, PcdName, Value, Dummy2, Dummy3, ID, Line in Records:
             Value, DatumType, MaxDatumSize = AnalyzePcdData(Value)
             # Only use PCD whose value is straitforward (no macro and PCD)
             if self.SymbolPattern.findall(Value):
@@ -1571,7 +1571,7 @@ class DecParser(MetaFileParser):
                 continue
 
             # section content
-            self._ValueList = ['','','']
+            self._ValueList = ['', '', '']
             self._SectionParser[self._SectionType[0]](self)
             if self._ValueList is None or self._ItemType == MODEL_META_DATA_DEFINE:
                 self._ItemType = -1
@@ -1717,7 +1717,7 @@ class DecParser(MetaFileParser):
                         GuidValue = GuidValue.lstrip(' {')
                         HexList.append('0x' + str(GuidValue[2:]))
                         Index += 1
-            self._ValueList[1] = "{ %s, %s, %s, { %s, %s, %s, %s, %s, %s, %s, %s }}" % (HexList[0], HexList[1], HexList[2],HexList[3],HexList[4],HexList[5],HexList[6],HexList[7],HexList[8],HexList[9],HexList[10])
+            self._ValueList[1] = "{ %s, %s, %s, { %s, %s, %s, %s, %s, %s, %s, %s }}" % (HexList[0], HexList[1], HexList[2], HexList[3], HexList[4], HexList[5], HexList[6], HexList[7], HexList[8], HexList[9], HexList[10])
         else:
             EdkLogger.error('Parser', FORMAT_INVALID, "Invalid GUID value format",
                             ExtraData=self._CurrentLine + \
diff --git a/BaseTools/Source/Python/Eot/CParser.py b/BaseTools/Source/Python/Eot/CParser.py
index d5fd3a37a167..d7eff138da57 100644
--- a/BaseTools/Source/Python/Eot/CParser.py
+++ b/BaseTools/Source/Python/Eot/CParser.py
@@ -785,10 +785,10 @@ class CParser(Parser):
                 if self.backtracking == 0:
                           
                     if d is not None:
-                      self.function_definition_stack[-1].ModifierText = self.input.toString(d.start,d.stop)
+                      self.function_definition_stack[-1].ModifierText = self.input.toString(d.start, d.stop)
                     else:
                       self.function_definition_stack[-1].ModifierText = ''
-                    self.function_definition_stack[-1].DeclText = self.input.toString(declarator1.start,declarator1.stop)
+                    self.function_definition_stack[-1].DeclText = self.input.toString(declarator1.start, declarator1.stop)
                     self.function_definition_stack[-1].DeclLine = declarator1.start.line
                     self.function_definition_stack[-1].DeclOffset = declarator1.start.charPositionInLine
                     if a is not None:
@@ -922,9 +922,9 @@ class CParser(Parser):
                     if self.backtracking == 0:
                             
                         if b is not None:
-                          self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, self.input.toString(b.start,b.stop), self.input.toString(c.start,c.stop))
+                          self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, self.input.toString(b.start, b.stop), self.input.toString(c.start, c.stop))
                         else:
-                          self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, '', self.input.toString(c.start,c.stop))
+                          self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, '', self.input.toString(c.start, c.stop))
                         	  
 
 
@@ -959,7 +959,7 @@ class CParser(Parser):
                     if self.backtracking == 0:
                           
                         if t is not None:
-                          self.StoreVariableDeclaration(s.start.line, s.start.charPositionInLine, t.start.line, t.start.charPositionInLine, self.input.toString(s.start,s.stop), self.input.toString(t.start,t.stop))
+                          self.StoreVariableDeclaration(s.start.line, s.start.charPositionInLine, t.start.line, t.start.charPositionInLine, self.input.toString(s.start, s.stop), self.input.toString(t.start, t.stop))
                         	
 
 
@@ -1403,7 +1403,7 @@ class CParser(Parser):
                     if self.backtracking == 0:
                           
                         if s.stop is not None:
-                          self.StoreStructUnionDefinition(s.start.line, s.start.charPositionInLine, s.stop.line, s.stop.charPositionInLine, self.input.toString(s.start,s.stop))
+                          self.StoreStructUnionDefinition(s.start.line, s.start.charPositionInLine, s.stop.line, s.stop.charPositionInLine, self.input.toString(s.start, s.stop))
                         	
 
 
@@ -1418,7 +1418,7 @@ class CParser(Parser):
                     if self.backtracking == 0:
                           
                         if e.stop is not None:
-                          self.StoreEnumerationDefinition(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                          self.StoreEnumerationDefinition(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
                         	
 
 
@@ -5401,7 +5401,7 @@ class CParser(Parser):
                 if self.failed:
                     return 
                 if self.backtracking == 0:
-                    self.postfix_expression_stack[-1].FuncCallText += self.input.toString(p.start,p.stop)
+                    self.postfix_expression_stack[-1].FuncCallText += self.input.toString(p.start, p.stop)
 
                 # C.g:407:9: ( '[' expression ']' | '(' a= ')' | '(' c= argument_expression_list b= ')' | '(' macro_parameter_list ')' | '.' x= IDENTIFIER | '*' y= IDENTIFIER | '->' z= IDENTIFIER | '++' | '--' )*
                 while True: #loop65
@@ -5501,7 +5501,7 @@ class CParser(Parser):
                         if self.failed:
                             return 
                         if self.backtracking == 0:
-                            self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, b.line, b.charPositionInLine, self.postfix_expression_stack[-1].FuncCallText, self.input.toString(c.start,c.stop))
+                            self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, b.line, b.charPositionInLine, self.postfix_expression_stack[-1].FuncCallText, self.input.toString(c.start, c.stop))
 
 
 
@@ -8277,7 +8277,7 @@ class CParser(Parser):
                     if self.failed:
                         return 
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
 
 
@@ -16384,7 +16384,7 @@ class CParser(Parser):
                     if self.failed:
                         return 
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
                     self.following.append(self.FOLLOW_statement_in_selection_statement2284)
                     self.statement()
@@ -16503,7 +16503,7 @@ class CParser(Parser):
                     if self.failed:
                         return 
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
 
 
@@ -16535,7 +16535,7 @@ class CParser(Parser):
                     if self.failed:
                         return 
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
 
 
@@ -16582,7 +16582,7 @@ class CParser(Parser):
                     if self.failed:
                         return 
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start,e.stop))
+                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
 
 
diff --git a/BaseTools/Source/Python/Eot/c.py b/BaseTools/Source/Python/Eot/c.py
index c70f62f393a9..ceefc952237f 100644
--- a/BaseTools/Source/Python/Eot/c.py
+++ b/BaseTools/Source/Python/Eot/c.py
@@ -128,11 +128,11 @@ def GetIdentifierList():
 
     for pp in FileProfile.PPDirectiveList:
         Type = GetIdType(pp.Content)
-        IdPP = DataClass.IdentifierClass(-1, '', '', '', pp.Content, Type, -1, -1, pp.StartPos[0],pp.StartPos[1],pp.EndPos[0],pp.EndPos[1])
+        IdPP = DataClass.IdentifierClass(-1, '', '', '', pp.Content, Type, -1, -1, pp.StartPos[0], pp.StartPos[1], pp.EndPos[0], pp.EndPos[1])
         IdList.append(IdPP)
 
     for ae in FileProfile.AssignmentExpressionList:
-        IdAE = DataClass.IdentifierClass(-1, ae.Operator, '', ae.Name, ae.Value, DataClass.MODEL_IDENTIFIER_ASSIGNMENT_EXPRESSION, -1, -1, ae.StartPos[0],ae.StartPos[1],ae.EndPos[0],ae.EndPos[1])
+        IdAE = DataClass.IdentifierClass(-1, ae.Operator, '', ae.Name, ae.Value, DataClass.MODEL_IDENTIFIER_ASSIGNMENT_EXPRESSION, -1, -1, ae.StartPos[0], ae.StartPos[1], ae.EndPos[0], ae.EndPos[1])
         IdList.append(IdAE)
 
     FuncDeclPattern = GetFuncDeclPattern()
@@ -154,7 +154,7 @@ def GetIdentifierList():
                     var.Modifier += ' ' + FuncNamePartList[Index]
                     var.Declarator = var.Declarator.lstrip().lstrip(FuncNamePartList[Index])
                     Index += 1
-            IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', var.Declarator, '', DataClass.MODEL_IDENTIFIER_FUNCTION_DECLARATION, -1, -1, var.StartPos[0],var.StartPos[1],var.EndPos[0],var.EndPos[1])
+            IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', var.Declarator, '', DataClass.MODEL_IDENTIFIER_FUNCTION_DECLARATION, -1, -1, var.StartPos[0], var.StartPos[1], var.EndPos[0], var.EndPos[1])
             IdList.append(IdVar)
             continue
 
@@ -167,7 +167,7 @@ def GetIdentifierList():
                     var.Modifier += ' ' + Name[LSBPos:]
                     Name = Name[0:LSBPos]
 
-                IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', Name, (len(DeclList) > 1 and [DeclList[1]]or [''])[0], DataClass.MODEL_IDENTIFIER_VARIABLE, -1, -1, var.StartPos[0],var.StartPos[1],var.EndPos[0],var.EndPos[1])
+                IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', Name, (len(DeclList) > 1 and [DeclList[1]]or [''])[0], DataClass.MODEL_IDENTIFIER_VARIABLE, -1, -1, var.StartPos[0], var.StartPos[1], var.EndPos[0], var.EndPos[1])
                 IdList.append(IdVar)
         else:
             DeclList = var.Declarator.split('=')
@@ -176,7 +176,7 @@ def GetIdentifierList():
                 LSBPos = var.Declarator.find('[')
                 var.Modifier += ' ' + Name[LSBPos:]
                 Name = Name[0:LSBPos]
-            IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', Name, (len(DeclList) > 1 and [DeclList[1]]or [''])[0], DataClass.MODEL_IDENTIFIER_VARIABLE, -1, -1, var.StartPos[0],var.StartPos[1],var.EndPos[0],var.EndPos[1])
+            IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', Name, (len(DeclList) > 1 and [DeclList[1]]or [''])[0], DataClass.MODEL_IDENTIFIER_VARIABLE, -1, -1, var.StartPos[0], var.StartPos[1], var.EndPos[0], var.EndPos[1])
             IdList.append(IdVar)
 
     for enum in FileProfile.EnumerationDefinitionList:
@@ -184,7 +184,7 @@ def GetIdentifierList():
         RBPos = enum.Content.find('}')
         Name = enum.Content[4:LBPos].strip()
         Value = enum.Content[LBPos+1:RBPos]
-        IdEnum = DataClass.IdentifierClass(-1, '', '', Name, Value, DataClass.MODEL_IDENTIFIER_ENUMERATE, -1, -1, enum.StartPos[0],enum.StartPos[1],enum.EndPos[0],enum.EndPos[1])
+        IdEnum = DataClass.IdentifierClass(-1, '', '', Name, Value, DataClass.MODEL_IDENTIFIER_ENUMERATE, -1, -1, enum.StartPos[0], enum.StartPos[1], enum.EndPos[0], enum.EndPos[1])
         IdList.append(IdEnum)
 
     for su in FileProfile.StructUnionDefinitionList:
@@ -201,7 +201,7 @@ def GetIdentifierList():
         else:
             Name = su.Content[SkipLen:LBPos].strip()
             Value = su.Content[LBPos+1:RBPos]
-        IdPE = DataClass.IdentifierClass(-1, '', '', Name, Value, Type, -1, -1, su.StartPos[0],su.StartPos[1],su.EndPos[0],su.EndPos[1])
+        IdPE = DataClass.IdentifierClass(-1, '', '', Name, Value, Type, -1, -1, su.StartPos[0], su.StartPos[1], su.EndPos[0], su.EndPos[1])
         IdList.append(IdPE)
 
     TdFuncPointerPattern = GetTypedefFuncPointerPattern()
@@ -224,11 +224,11 @@ def GetIdentifierList():
             Name = TmpStr[0:RBPos]
             Value = 'FP' + TmpStr[RBPos + 1:]
 
-        IdTd = DataClass.IdentifierClass(-1, Modifier, '', Name, Value, DataClass.MODEL_IDENTIFIER_TYPEDEF, -1, -1, td.StartPos[0],td.StartPos[1],td.EndPos[0],td.EndPos[1])
+        IdTd = DataClass.IdentifierClass(-1, Modifier, '', Name, Value, DataClass.MODEL_IDENTIFIER_TYPEDEF, -1, -1, td.StartPos[0], td.StartPos[1], td.EndPos[0], td.EndPos[1])
         IdList.append(IdTd)
 
     for funcCall in FileProfile.FunctionCallingList:
-        IdFC = DataClass.IdentifierClass(-1, '', '', funcCall.FuncName, funcCall.ParamList, DataClass.MODEL_IDENTIFIER_FUNCTION_CALLING, -1, -1, funcCall.StartPos[0],funcCall.StartPos[1],funcCall.EndPos[0],funcCall.EndPos[1])
+        IdFC = DataClass.IdentifierClass(-1, '', '', funcCall.FuncName, funcCall.ParamList, DataClass.MODEL_IDENTIFIER_FUNCTION_CALLING, -1, -1, funcCall.StartPos[0], funcCall.StartPos[1], funcCall.EndPos[0], funcCall.EndPos[1])
         IdList.append(IdFC)
     return IdList
 
@@ -330,7 +330,7 @@ def GetFunctionList():
                 FuncDef.Modifier += ' ' + FuncNamePartList[Index]
                 Index += 1
 
-        FuncObj = DataClass.FunctionClass(-1, FuncDef.Declarator, FuncDef.Modifier, FuncName.strip(), '', FuncDef.StartPos[0],FuncDef.StartPos[1],FuncDef.EndPos[0],FuncDef.EndPos[1], FuncDef.LeftBracePos[0], FuncDef.LeftBracePos[1], -1, ParamIdList, [])
+        FuncObj = DataClass.FunctionClass(-1, FuncDef.Declarator, FuncDef.Modifier, FuncName.strip(), '', FuncDef.StartPos[0], FuncDef.StartPos[1], FuncDef.EndPos[0], FuncDef.EndPos[1], FuncDef.LeftBracePos[0], FuncDef.LeftBracePos[1], -1, ParamIdList, [])
         FuncObjList.append(FuncObj)
 
     return FuncObjList
diff --git a/BaseTools/Source/Python/GenFds/AprioriSection.py b/BaseTools/Source/Python/GenFds/AprioriSection.py
index 6b81b42620d7..3d28c7d778cb 100644
--- a/BaseTools/Source/Python/GenFds/AprioriSection.py
+++ b/BaseTools/Source/Python/GenFds/AprioriSection.py
@@ -22,7 +22,7 @@ import FfsFileStatement
 from GenFdsGlobalVariable import GenFdsGlobalVariable
 from CommonDataClass.FdfClass import AprioriSectionClassObject
 from Common.StringUtils import *
-from Common.Misc import SaveFileOnChange,PathClass
+from Common.Misc import SaveFileOnChange, PathClass
 from Common import EdkLogger
 from Common.BuildToolError import *
 from Common.DataType import TAB_COMMON
diff --git a/BaseTools/Source/Python/GenFds/CapsuleData.py b/BaseTools/Source/Python/GenFds/CapsuleData.py
index b376d6b2e9be..9dc55e5dbf7b 100644
--- a/BaseTools/Source/Python/GenFds/CapsuleData.py
+++ b/BaseTools/Source/Python/GenFds/CapsuleData.py
@@ -207,7 +207,7 @@ class CapsulePayload(CapsuleData):
         #
         Guid = self.ImageTypeId.split('-')
         Buffer = pack('=ILHHBBBBBBBBBBBBIIQ',
-                       int(self.Version,16),
+                       int(self.Version, 16),
                        int(Guid[0], 16), 
                        int(Guid[1], 16), 
                        int(Guid[2], 16), 
diff --git a/BaseTools/Source/Python/GenFds/EfiSection.py b/BaseTools/Source/Python/GenFds/EfiSection.py
index 5405d0a8da13..8ac37dd96b9b 100644
--- a/BaseTools/Source/Python/GenFds/EfiSection.py
+++ b/BaseTools/Source/Python/GenFds/EfiSection.py
@@ -133,7 +133,7 @@ class EfiSection (EfiSectionClassObject):
             elif FileList != []:
                 for File in FileList:
                     Index = Index + 1
-                    Num = '%s.%d' %(SecNum , Index)
+                    Num = '%s.%d' %(SecNum, Index)
                     OutputFile = os.path.join(OutputPath, ModuleName + SUP_MODULE_SEC + Num + Ffs.SectionSuffix.get(SectionType))
                     f = open(File, 'r')
                     VerString = f.read()
@@ -192,7 +192,7 @@ class EfiSection (EfiSectionClassObject):
             elif FileList != []:
                 for File in FileList:
                     Index = Index + 1
-                    Num = '%s.%d' %(SecNum , Index)
+                    Num = '%s.%d' %(SecNum, Index)
                     OutputFile = os.path.join(OutputPath, ModuleName + SUP_MODULE_SEC + Num + Ffs.SectionSuffix.get(SectionType))
                     f = open(File, 'r')
                     UiString = f.read()
@@ -237,7 +237,7 @@ class EfiSection (EfiSectionClassObject):
                 for File in FileList:
                     """ Copy Map file to FFS output path """
                     Index = Index + 1
-                    Num = '%s.%d' %(SecNum , Index)
+                    Num = '%s.%d' %(SecNum, Index)
                     OutputFile = os.path.join( OutputPath, ModuleName + SUP_MODULE_SEC + Num + Ffs.SectionSuffix.get(SectionType))
                     File = GenFdsGlobalVariable.MacroExtend(File, Dict)
                     
diff --git a/BaseTools/Source/Python/GenFds/Fd.py b/BaseTools/Source/Python/GenFds/Fd.py
index 188ca28cd7ce..b2a14a1e1313 100644
--- a/BaseTools/Source/Python/GenFds/Fd.py
+++ b/BaseTools/Source/Python/GenFds/Fd.py
@@ -137,7 +137,7 @@ class FD(FDClassObject):
             # Call each region's AddToBuffer function
             #
             GenFdsGlobalVariable.VerboseLogger('Call each region\'s AddToBuffer function')
-            RegionObj.AddToBuffer (FdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFds.ImageBinDict, self.vtfRawDict, self.DefineVarDict,Flag=Flag)
+            RegionObj.AddToBuffer (FdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFds.ImageBinDict, self.vtfRawDict, self.DefineVarDict, Flag=Flag)
         #
         # Write the buffer contents to Fd file
         #
@@ -163,7 +163,7 @@ class FD(FDClassObject):
                 if len(RegionObj.RegionDataList) == 1:
                     RegionData = RegionObj.RegionDataList[0]
                     FvList.append(RegionData.upper())
-                    FvAddDict[RegionData.upper()] = (int(self.BaseAddress,16) + \
+                    FvAddDict[RegionData.upper()] = (int(self.BaseAddress, 16) + \
                                                 RegionObj.Offset, RegionObj.Size)
                 else:
                     Offset = RegionObj.Offset
@@ -178,7 +178,7 @@ class FD(FDClassObject):
                             Size = 0
                             for blockStatement in FvObj.BlockSizeList:
                                 Size = Size + blockStatement[0] * blockStatement[1]
-                            FvAddDict[RegionData.upper()] = (int(self.BaseAddress,16) + \
+                            FvAddDict[RegionData.upper()] = (int(self.BaseAddress, 16) + \
                                                              Offset, Size)
                             Offset = Offset + Size
         #
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index 6b4f724f6d9c..74785e0a93fe 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -1820,7 +1820,7 @@ class FdfParser:
             return long(
                 ValueExpression(Expr,
                                 self.__CollectMacroPcd()
-                                )(True),0)
+                                )(True), 0)
         except Exception:
             self.SetFileBufferPos(StartPos)
             return None
@@ -2730,7 +2730,7 @@ class FdfParser:
         while True:
             AlignValue = None
             if self.__GetAlignment():
-                if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
                                         "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
                     raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
                 #For FFS, Auto is default option same to ""
@@ -2789,7 +2789,7 @@ class FdfParser:
             FfsFileObj.CheckSum = True
 
         if self.__GetAlignment():
-            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
                                     "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             #For FFS, Auto is default option same to ""
@@ -2861,7 +2861,7 @@ class FdfParser:
 
         AlignValue = None
         if self.__GetAlignment():
-            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
                                     "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             AlignValue = self.__Token
@@ -3151,7 +3151,7 @@ class FdfParser:
 
         AlignValue = None
         if self.__GetAlignment():
-            if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+            if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
                                     "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             AlignValue = self.__Token
@@ -3544,7 +3544,7 @@ class FdfParser:
         AfileName = self.__Token
         AfileBaseName = os.path.basename(AfileName)
         
-        if os.path.splitext(AfileBaseName)[1]  not in [".bin",".BIN",".Bin",".dat",".DAT",".Dat",".data",".DATA",".Data"]:
+        if os.path.splitext(AfileBaseName)[1]  not in [".bin", ".BIN", ".Bin", ".dat", ".DAT", ".Dat", ".data", ".DATA", ".Data"]:
             raise Warning('invalid binary file type, should be one of "bin",BINARY_FILE_TYPE_BIN,"Bin","dat","DAT","Dat","data","DATA","Data"', \
                           self.FileName, self.CurrentLineNumber)
         
@@ -3741,7 +3741,7 @@ class FdfParser:
 
         AlignValue = ""
         if self.__GetAlignment():
-            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
                                     "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             #For FFS, Auto is default option same to ""
@@ -3791,7 +3791,7 @@ class FdfParser:
 
             SectAlignment = ""
             if self.__GetAlignment():
-                if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
                                         "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
                     raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
                 if self.__Token == 'Auto' and (not SectionName == BINARY_FILE_TYPE_PE32) and (not SectionName == BINARY_FILE_TYPE_TE):
@@ -3871,7 +3871,7 @@ class FdfParser:
                 FvImageSectionObj.FvFileType = self.__Token
 
                 if self.__GetAlignment():
-                    if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+                    if self.__Token not in ("8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
                                             "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
                         raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
                     FvImageSectionObj.Alignment = self.__Token
@@ -3939,7 +3939,7 @@ class FdfParser:
                 EfiSectionObj.BuildNum = self.__Token
 
         if self.__GetAlignment():
-            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K" ,"64K", "128K",
+            if self.__Token not in ("Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
                                     "256K", "512K", "1M", "2M", "4M", "8M", "16M"):
                 raise Warning("Incorrect alignment '%s'" % self.__Token, self.FileName, self.CurrentLineNumber)
             if self.__Token == 'Auto' and (not SectionName == BINARY_FILE_TYPE_PE32) and (not SectionName == BINARY_FILE_TYPE_TE):
@@ -4679,7 +4679,7 @@ class FdfParser:
                     FvInFdList = self.__GetFvInFd(RefFdName)
                     if FvInFdList != []:
                         for FvNameInFd in FvInFdList:
-                            LogStr += "FD %s contains FV %s\n" % (RefFdName,FvNameInFd)
+                            LogStr += "FD %s contains FV %s\n" % (RefFdName, FvNameInFd)
                             if FvNameInFd not in RefFvStack:
                                 RefFvStack.append(FvNameInFd)
 
@@ -4735,7 +4735,7 @@ class FdfParser:
                         CapInFdList = self.__GetCapInFd(RefFdName)
                         if CapInFdList != []:
                             for CapNameInFd in CapInFdList:
-                                LogStr += "FD %s contains Capsule %s\n" % (RefFdName,CapNameInFd)
+                                LogStr += "FD %s contains Capsule %s\n" % (RefFdName, CapNameInFd)
                                 if CapNameInFd not in RefCapStack:
                                     RefCapStack.append(CapNameInFd)
 
@@ -4746,7 +4746,7 @@ class FdfParser:
                         FvInFdList = self.__GetFvInFd(RefFdName)
                         if FvInFdList != []:
                             for FvNameInFd in FvInFdList:
-                                LogStr += "FD %s contains FV %s\n" % (RefFdName,FvNameInFd)
+                                LogStr += "FD %s contains FV %s\n" % (RefFdName, FvNameInFd)
                                 if FvNameInFd not in RefFvList:
                                     RefFvList.append(FvNameInFd)
 
diff --git a/BaseTools/Source/Python/GenFds/FfsInfStatement.py b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
index b26821b29052..9eb99d659bfd 100644
--- a/BaseTools/Source/Python/GenFds/FfsInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
@@ -293,7 +293,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
                 try:
                     Pcd.InfDefaultValue = ValueExpressionEx(Pcd.InfDefaultValue, Pcd.DatumType, Platform._GuidDict)(True)
                 except BadExpression:
-                    EdkLogger.error("GenFds", GENFDS_ERROR, 'PCD [%s.%s] Value "%s"' %(Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.DefaultValue),File=self.InfFileName)
+                    EdkLogger.error("GenFds", GENFDS_ERROR, 'PCD [%s.%s] Value "%s"' %(Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.DefaultValue), File=self.InfFileName)
 
             # Check value, if value are equal, no need to patch
             if Pcd.DatumType == TAB_VOID:
@@ -446,7 +446,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
 
         self.__InfParse__(Dict)
         Arch = self.GetCurrentArch()
-        SrcFile = mws.join( GenFdsGlobalVariable.WorkSpaceDir , self.InfFileName);
+        SrcFile = mws.join( GenFdsGlobalVariable.WorkSpaceDir, self.InfFileName);
         DestFile = os.path.join( self.OutputPath, self.ModuleGuid + '.ffs')
         
         SrcFileDir = "."
@@ -694,13 +694,13 @@ class FfsInfStatement(FfsInfStatementClassObject):
             Arch = self.CurrentArch
 
         OutputPath = os.path.join(GenFdsGlobalVariable.OutputDirDict[Arch],
-                                  Arch ,
+                                  Arch,
                                   ModulePath,
                                   FileName,
                                   'OUTPUT'
                                   )
         DebugPath = os.path.join(GenFdsGlobalVariable.OutputDirDict[Arch],
-                                  Arch ,
+                                  Arch,
                                   ModulePath,
                                   FileName,
                                   'DEBUG'
@@ -962,9 +962,9 @@ class FfsInfStatement(FfsInfStatementClassObject):
                 Sect.FvParentAddr = FvParentAddr
             
             if Rule.KeyStringList != []:
-                SectList, Align = Sect.GenSection(self.OutputPath , self.ModuleGuid, SecIndex, Rule.KeyStringList, self, IsMakefile = IsMakefile)
+                SectList, Align = Sect.GenSection(self.OutputPath, self.ModuleGuid, SecIndex, Rule.KeyStringList, self, IsMakefile = IsMakefile)
             else :
-                SectList, Align = Sect.GenSection(self.OutputPath , self.ModuleGuid, SecIndex, self.KeyStringList, self, IsMakefile = IsMakefile)
+                SectList, Align = Sect.GenSection(self.OutputPath, self.ModuleGuid, SecIndex, self.KeyStringList, self, IsMakefile = IsMakefile)
             
             if not HasGeneratedFlag:
                 UniVfrOffsetFileSection = ""    
@@ -1121,7 +1121,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
         try :
             SaveFileOnChange(UniVfrOffsetFileName, fStringIO.getvalue())
         except:
-            EdkLogger.error("GenFds", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the file been locked or using by other applications." %UniVfrOffsetFileName,None)
+            EdkLogger.error("GenFds", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the file been locked or using by other applications." %UniVfrOffsetFileName, None)
         
         fStringIO.close ()
 
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index 6714838f6fc9..fb82634ccd7e 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -379,8 +379,8 @@ class FV (FvClassObject):
                     # check if the file path exists or not
                     if not os.path.isfile(FileFullPath):
                         GenFdsGlobalVariable.ErrorLogger("Error opening FV Extension Header Entry file %s." % (self.FvExtEntryData[Index]))
-                    FvExtFile = open (FileFullPath,'rb')
-                    FvExtFile.seek(0,2)
+                    FvExtFile = open (FileFullPath, 'rb')
+                    FvExtFile.seek(0, 2)
                     Size = FvExtFile.tell()
                     if Size >= 0x10000:
                         GenFdsGlobalVariable.ErrorLogger("The size of FV Extension Header Entry file %s exceeds 0x10000." % (self.FvExtEntryData[Index]))
diff --git a/BaseTools/Source/Python/GenFds/FvImageSection.py b/BaseTools/Source/Python/GenFds/FvImageSection.py
index 3a4d8fb91b70..77bf6a700623 100644
--- a/BaseTools/Source/Python/GenFds/FvImageSection.py
+++ b/BaseTools/Source/Python/GenFds/FvImageSection.py
@@ -65,7 +65,7 @@ class FvImageSection(FvImageSectionClassObject):
             for FvFileName in FileList:
                 FvAlignmentValue = 0
                 if os.path.isfile(FvFileName):
-                    FvFileObj = open (FvFileName,'rb')
+                    FvFileObj = open (FvFileName, 'rb')
                     FvFileObj.seek(0)
                     # PI FvHeader is 0x48 byte
                     FvHeaderBuffer = FvFileObj.read(0x48)
@@ -113,7 +113,7 @@ class FvImageSection(FvImageSectionClassObject):
                 if self.FvFileName is not None:
                     FvFileName = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.FvFileName)
                     if os.path.isfile(FvFileName):
-                        FvFileObj = open (FvFileName,'rb')
+                        FvFileObj = open (FvFileName, 'rb')
                         FvFileObj.seek(0)
                         # PI FvHeader is 0x48 byte
                         FvHeaderBuffer = FvFileObj.read(0x48)
diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
index 73b52030d929..6eb1201cee49 100644
--- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
+++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
@@ -341,7 +341,7 @@ class GenFdsGlobalVariable:
         for Arch in ArchList:
             GenFdsGlobalVariable.OutputDirDict[Arch] = os.path.normpath(
                 os.path.join(GlobalData.gWorkspace,
-                             WorkSpace.Db.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch,GlobalData.gGlobalDefines['TARGET'],
+                             WorkSpace.Db.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, GlobalData.gGlobalDefines['TARGET'],
                              GlobalData.gGlobalDefines['TOOLCHAIN']].OutputDirectory,
                              GlobalData.gGlobalDefines['TARGET'] +'_' + GlobalData.gGlobalDefines['TOOLCHAIN']))
             GenFdsGlobalVariable.OutputDirFromDscDict[Arch] = os.path.normpath(
@@ -547,7 +547,7 @@ class GenFdsGlobalVariable:
 
         GenFdsGlobalVariable.DebugLogger(EdkLogger.DEBUG_5, "%s needs update because of newer %s" % (Output, Input))
         if MakefilePath:
-            if (tuple(Cmd),tuple(GenFdsGlobalVariable.SecCmdList),tuple(GenFdsGlobalVariable.CopyList)) not in GenFdsGlobalVariable.FfsCmdDict:
+            if (tuple(Cmd), tuple(GenFdsGlobalVariable.SecCmdList), tuple(GenFdsGlobalVariable.CopyList)) not in GenFdsGlobalVariable.FfsCmdDict:
                 GenFdsGlobalVariable.FfsCmdDict[tuple(Cmd), tuple(GenFdsGlobalVariable.SecCmdList), tuple(GenFdsGlobalVariable.CopyList)] = MakefilePath
             GenFdsGlobalVariable.SecCmdList = []
             GenFdsGlobalVariable.CopyList = []
diff --git a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
index d7084fbe88da..9645e9b08db4 100644
--- a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
+++ b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
@@ -110,7 +110,7 @@ def _parseForGCC(lines, efifilepath):
                     PcdName = m.groups(0)[0]
                     m = pcdPatternGcc.match(lines[index + 1].strip())
                     if m is not None:
-                        bpcds.append((PcdName, int(m.groups(0)[0], 16) , int(sections[-1][1], 16), sections[-1][0]))
+                        bpcds.append((PcdName, int(m.groups(0)[0], 16), int(sections[-1][1], 16), sections[-1][0]))
                 
     # get section information from efi file
     efisecs = PeImageClass(efifilepath).SectionHeaderList
diff --git a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
index 11d11700ed99..a44781f2e839 100644
--- a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
+++ b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
@@ -88,7 +88,7 @@ if __name__ == '__main__':
   parser.add_argument("--signature-size", dest='SignatureSizeStr', type=str, help="specify the signature size for decode process.")
   parser.add_argument("-v", "--verbose", dest='Verbose', action="store_true", help="increase output messages")
   parser.add_argument("-q", "--quiet", dest='Quiet', action="store_true", help="reduce output messages")
-  parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=range(0,10), default=0, help="set debug level")
+  parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=range(0, 10), default=0, help="set debug level")
   parser.add_argument(metavar="input_file", dest='InputFile', type=argparse.FileType('rb'), help="specify the input filename")
 
   #
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
index ca4f64864790..d8048d49a2ae 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
@@ -51,7 +51,7 @@ if __name__ == '__main__':
   parser.add_argument("--public-key-hash-c", dest='PublicKeyHashCFile', type=argparse.FileType('wb'), help="specify the public key hash filename that is SHA 256 hash of 2048 bit RSA public key in C structure format")
   parser.add_argument("-v", "--verbose", dest='Verbose', action="store_true", help="increase output messages")
   parser.add_argument("-q", "--quiet", dest='Quiet', action="store_true", help="reduce output messages")
-  parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=range(0,10), default=0, help="set debug level")
+  parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=range(0, 10), default=0, help="set debug level")
 
   #
   # Parse command line arguments
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
index 2e164c4a2da6..807772daff81 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
@@ -50,7 +50,7 @@ EFI_HASH_ALGORITHM_SHA256_GUID = uuid.UUID('{51aa59de-fdf2-4ea3-bc63-875fb7842ee
 #     UINT8 Signature[256];
 #   } EFI_CERT_BLOCK_RSA_2048_SHA256;
 #
-EFI_CERT_BLOCK_RSA_2048_SHA256        = collections.namedtuple('EFI_CERT_BLOCK_RSA_2048_SHA256', ['HashType','PublicKey','Signature'])
+EFI_CERT_BLOCK_RSA_2048_SHA256        = collections.namedtuple('EFI_CERT_BLOCK_RSA_2048_SHA256', ['HashType', 'PublicKey', 'Signature'])
 EFI_CERT_BLOCK_RSA_2048_SHA256_STRUCT = struct.Struct('16s256s256s')
 
 #
@@ -71,7 +71,7 @@ if __name__ == '__main__':
   parser.add_argument("--private-key", dest='PrivateKeyFile', type=argparse.FileType('rb'), help="specify the private key filename.  If not specified, a test signing key is used.")
   parser.add_argument("-v", "--verbose", dest='Verbose', action="store_true", help="increase output messages")
   parser.add_argument("-q", "--quiet", dest='Quiet', action="store_true", help="reduce output messages")
-  parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=range(0,10), default=0, help="set debug level")
+  parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=range(0, 10), default=0, help="set debug level")
   parser.add_argument(metavar="input_file", dest='InputFile', type=argparse.FileType('rb'), help="specify the input filename")
 
   #
@@ -155,7 +155,7 @@ if __name__ == '__main__':
   PublicKeyHexString = Process.communicate()[0].split('=')[1].strip()
   PublicKey = ''
   while len(PublicKeyHexString) > 0:
-    PublicKey = PublicKey + chr(int(PublicKeyHexString[0:2],16))
+    PublicKey = PublicKey + chr(int(PublicKeyHexString[0:2], 16))
     PublicKeyHexString=PublicKeyHexString[2:]
   if Process.returncode != 0:
     sys.exit(Process.returncode)
diff --git a/BaseTools/Source/Python/TargetTool/TargetTool.py b/BaseTools/Source/Python/TargetTool/TargetTool.py
index 0d4a59198e7b..ed567b870816 100644
--- a/BaseTools/Source/Python/TargetTool/TargetTool.py
+++ b/BaseTools/Source/Python/TargetTool/TargetTool.py
@@ -59,11 +59,11 @@ class TargetTool():
     def ConvertTextFileToDict(self, FileName, CommentCharacter, KeySplitCharacter):
         """Convert a text file to a dictionary of (name:value) pairs."""
         try:
-            f = open(FileName,'r')
+            f = open(FileName, 'r')
             for Line in f:
                 if Line.startswith(CommentCharacter) or Line.strip() == '':
                     continue
-                LineList = Line.split(KeySplitCharacter,1)
+                LineList = Line.split(KeySplitCharacter, 1)
                 if len(LineList) >= 2:
                     Key = LineList[0].strip()
                     if Key.startswith(CommentCharacter) == False and Key in self.TargetTxtDictionary:
@@ -103,7 +103,7 @@ class TargetTool():
                 if Line.startswith(CommentCharacter) or Line.strip() == '':
                     fw.write(Line)
                 else:
-                    LineList = Line.split(KeySplitCharacter,1)
+                    LineList = Line.split(KeySplitCharacter, 1)
                     if len(LineList) >= 2:
                         Key = LineList[0].strip()
                         if Key.startswith(CommentCharacter) == False and Key in self.TargetTxtDictionary:
@@ -202,14 +202,14 @@ def RangeCheckCallback(option, opt_str, value, parser):
         parser.error("Option %s only allows one instance in command line!" % option)
         
 def MyOptionParser():
-    parser = OptionParser(version=__version__,prog="TargetTool.exe",usage=__usage__,description=__copyright__)
-    parser.add_option("-a", "--arch", action="append", type="choice", choices=['IA32','X64','IPF','EBC', 'ARM', 'AARCH64','0'], dest="TARGET_ARCH",
+    parser = OptionParser(version=__version__, prog="TargetTool.exe", usage=__usage__, description=__copyright__)
+    parser.add_option("-a", "--arch", action="append", type="choice", choices=['IA32', 'X64', 'IPF', 'EBC', 'ARM', 'AARCH64', '0'], dest="TARGET_ARCH",
         help="ARCHS is one of list: IA32, X64, IPF, ARM, AARCH64 or EBC, which replaces target.txt's TARGET_ARCH definition. To specify more archs, please repeat this option. 0 will clear this setting in target.txt and can't combine with other value.")
     parser.add_option("-p", "--platform", action="callback", type="string", dest="DSCFILE", callback=SingleCheckCallback,
         help="Specify a DSC file, which replace target.txt's ACTIVE_PLATFORM definition. 0 will clear this setting in target.txt and can't combine with other value.")
     parser.add_option("-c", "--tooldef", action="callback", type="string", dest="TOOL_DEFINITION_FILE", callback=SingleCheckCallback,
         help="Specify the WORKSPACE relative path of tool_def.txt file, which replace target.txt's TOOL_CHAIN_CONF definition. 0 will clear this setting in target.txt and can't combine with other value.")
-    parser.add_option("-t", "--target", action="append", type="choice", choices=['DEBUG','RELEASE','0'], dest="TARGET",
+    parser.add_option("-t", "--target", action="append", type="choice", choices=['DEBUG', 'RELEASE', '0'], dest="TARGET",
         help="TARGET is one of list: DEBUG, RELEASE, which replaces target.txt's TARGET definition. To specify more TARGET, please repeat this option. 0 will clear this setting in target.txt and can't combine with other value.")
     parser.add_option("-n", "--tagname", action="callback", type="string", dest="TOOL_CHAIN_TAG", callback=SingleCheckCallback,
         help="Specify the Tool Chain Tagname, which replaces target.txt's TOOL_CHAIN_TAG definition. 0 will clear this setting in target.txt and can't combine with other value.")
diff --git a/BaseTools/Source/Python/Trim/Trim.py b/BaseTools/Source/Python/Trim/Trim.py
index b512d15243f8..97f4e87587ee 100644
--- a/BaseTools/Source/Python/Trim/Trim.py
+++ b/BaseTools/Source/Python/Trim/Trim.py
@@ -261,7 +261,7 @@ def TrimPreprocessedVfr(Source, Target):
     CreateDirectory(os.path.dirname(Target))
     
     try:
-        f = open (Source,'r')
+        f = open (Source, 'r')
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Source)
     # read whole file
@@ -310,7 +310,7 @@ def TrimPreprocessedVfr(Source, Target):
 
     # save all lines trimmed
     try:
-        f = open (Target,'w')
+        f = open (Target, 'w')
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
     f.writelines(Lines)
@@ -407,7 +407,7 @@ def TrimAslFile(Source, Target, IncludePathFile):
     if IncludePathFile:
         try:
             LineNum = 0
-            for Line in open(IncludePathFile,'r'):
+            for Line in open(IncludePathFile, 'r'):
                 LineNum += 1
                 if Line.startswith("/I") or Line.startswith ("-I"):
                     IncludePathList.append(Line[2:].strip())
@@ -425,7 +425,7 @@ def TrimAslFile(Source, Target, IncludePathFile):
 
     # save all lines trimmed
     try:
-        f = open (Target,'w')
+        f = open (Target, 'w')
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
 
@@ -560,7 +560,7 @@ def TrimEdkSourceCode(Source, Target):
     CreateDirectory(os.path.dirname(Target))
 
     try:
-        f = open (Source,'rb')
+        f = open (Source, 'rb')
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Source)
     # read whole file
@@ -568,7 +568,7 @@ def TrimEdkSourceCode(Source, Target):
     f.close()
 
     NewLines = None
-    for Re,Repl in gImportCodePatterns:
+    for Re, Repl in gImportCodePatterns:
         if NewLines is None:
             NewLines = Re.sub(Repl, Lines)
         else:
@@ -579,7 +579,7 @@ def TrimEdkSourceCode(Source, Target):
         return
 
     try:
-        f = open (Target,'wb')
+        f = open (Target, 'wb')
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
     f.write(NewLines)
diff --git a/BaseTools/Source/Python/UPT/Core/DependencyRules.py b/BaseTools/Source/Python/UPT/Core/DependencyRules.py
index 34f56e7bb487..406a8a7e92de 100644
--- a/BaseTools/Source/Python/UPT/Core/DependencyRules.py
+++ b/BaseTools/Source/Python/UPT/Core/DependencyRules.py
@@ -285,8 +285,8 @@ class DependencyRules(object):
                 pass
             DecPath = dirname(DecFile)
             if DecPath.find(WorkSP) > -1:
-                InstallPath = GetRelativePath(DecPath,WorkSP)
-                DecFileRelaPath = GetRelativePath(DecFile,WorkSP)
+                InstallPath = GetRelativePath(DecPath, WorkSP)
+                DecFileRelaPath = GetRelativePath(DecFile, WorkSP)
             else:
                 InstallPath = DecPath
                 DecFileRelaPath = DecFile
@@ -348,8 +348,8 @@ class DependencyRules(object):
                 pass
             DecPath = dirname(DecFile)
             if DecPath.find(WorkSP) > -1:
-                InstallPath = GetRelativePath(DecPath,WorkSP)
-                DecFileRelaPath = GetRelativePath(DecFile,WorkSP)
+                InstallPath = GetRelativePath(DecPath, WorkSP)
+                DecFileRelaPath = GetRelativePath(DecFile, WorkSP)
             else:
                 InstallPath = DecPath
                 DecFileRelaPath = DecFile
diff --git a/BaseTools/Source/Python/UPT/Core/IpiDb.py b/BaseTools/Source/Python/UPT/Core/IpiDb.py
index 97ad47a58dbb..3bce33748198 100644
--- a/BaseTools/Source/Python/UPT/Core/IpiDb.py
+++ b/BaseTools/Source/Python/UPT/Core/IpiDb.py
@@ -459,7 +459,7 @@ class IpiDatabase(object):
             (select InstallPath from ModInPkgInfo where 
             ModInPkgInfo.PackageGuid ='%s' 
             and ModInPkgInfo.PackageVersion = '%s')""" \
-                            % (Pkg[0], Pkg[1], Pkg[0], Pkg[1], Pkg[0], Pkg[1],Pkg[0], Pkg[1])
+                            % (Pkg[0], Pkg[1], Pkg[0], Pkg[1], Pkg[0], Pkg[1], Pkg[0], Pkg[1])
             
             self.Cur.execute(SqlCommand)
         #
@@ -921,7 +921,7 @@ class IpiDatabase(object):
     def __ConvertToSqlString(self, StringList):
         if self.DpTable:
             pass
-        return map(lambda s: s.replace("'", "''") , StringList)
+        return map(lambda s: s.replace("'", "''"), StringList)
 
 
 
diff --git a/BaseTools/Source/Python/UPT/Library/StringUtils.py b/BaseTools/Source/Python/UPT/Library/StringUtils.py
index a7a7b8667143..bd2cbe612037 100644
--- a/BaseTools/Source/Python/UPT/Library/StringUtils.py
+++ b/BaseTools/Source/Python/UPT/Library/StringUtils.py
@@ -632,7 +632,7 @@ def SplitString(String):
 # @param StringList:  A list for strings to be converted
 #
 def ConvertToSqlString(StringList):
-    return map(lambda s: s.replace("'", "''") , StringList)
+    return map(lambda s: s.replace("'", "''"), StringList)
 
 ## Convert To Sql String
 #
diff --git a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
index 074aa311f31d..e2908bcda98b 100644
--- a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
+++ b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
@@ -648,7 +648,7 @@ class DecPomAlignment(PackageObject):
                         ContainerFile,
                         (Item.TokenSpaceGuidCName, Item.TokenCName,
                         Item.DefaultValue, Item.DatumType, Item.TokenValue,
-                        Type, Item.GetHeadComment(), Item.GetTailComment(),''),
+                        Type, Item.GetHeadComment(), Item.GetTailComment(), ''),
                         Language,
                         self.DecParser.GetDefineSectionMacro()
                         )
diff --git a/BaseTools/Source/Python/UPT/UPT.py b/BaseTools/Source/Python/UPT/UPT.py
index 0e425828cdfe..772974199f1f 100644
--- a/BaseTools/Source/Python/UPT/UPT.py
+++ b/BaseTools/Source/Python/UPT/UPT.py
@@ -314,7 +314,7 @@ def Main():
         GlobalData.gDB.CloseDb()
 
         if pf.system() == 'Windows':
-            os.system('subst %s /D' % GlobalData.gWORKSPACE.replace('\\',''))
+            os.system('subst %s /D' % GlobalData.gWORKSPACE.replace('\\', ''))
 
     return ReturnCode
 
diff --git a/BaseTools/Source/Python/UPT/Xml/CommonXml.py b/BaseTools/Source/Python/UPT/Xml/CommonXml.py
index 805310de4e46..8a8cce169626 100644
--- a/BaseTools/Source/Python/UPT/Xml/CommonXml.py
+++ b/BaseTools/Source/Python/UPT/Xml/CommonXml.py
@@ -355,7 +355,7 @@ class PackageHeaderXml(object):
     def FromXml(self, Item, Key, PackageObject2):
         if not Item:
             XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea']
-            CheckDict = {'PackageHeader':None, }
+            CheckDict = {'PackageHeader': None, }
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
         self.PackagePath = XmlElement(Item, '%s/PackagePath' % Key)
         self.Header.FromXml(Item, Key)
diff --git a/BaseTools/Source/Python/UPT/Xml/XmlParser.py b/BaseTools/Source/Python/UPT/Xml/XmlParser.py
index dba3b7f5892c..dfc81567aed6 100644
--- a/BaseTools/Source/Python/UPT/Xml/XmlParser.py
+++ b/BaseTools/Source/Python/UPT/Xml/XmlParser.py
@@ -103,7 +103,7 @@ class DistributionPackageXml(object):
                 IsRequiredItemListNull(CheckDict, XmlTreeLevel)
             else:
                 XmlTreeLevel = ['DistributionPackage', 'DistributionHeader']
-                CheckDict = CheckDict = {'DistributionHeader':'', }
+                CheckDict = CheckDict = {'DistributionHeader': '', }
                 IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
             #
@@ -123,16 +123,16 @@ class DistributionPackageXml(object):
             #
             if self.DistP.Tools:
                 XmlTreeLevel = ['DistributionPackage', 'Tools', 'Header']
-                CheckDict = {'Name':self.DistP.Tools.GetName(), }
+                CheckDict = {'Name': self.DistP.Tools.GetName(), }
                 IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
                 if not self.DistP.Tools.GetFileList():
                     XmlTreeLevel = ['DistributionPackage', 'Tools']
-                    CheckDict = {'FileName':None, }
+                    CheckDict = {'FileName': None, }
                     IsRequiredItemListNull(CheckDict, XmlTreeLevel)
                 for Item in self.DistP.Tools.GetFileList():
                     XmlTreeLevel = ['DistributionPackage', 'Tools']
-                    CheckDict = {'FileName':Item.GetURI(), }
+                    CheckDict = {'FileName': Item.GetURI(), }
                     IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
             #
@@ -140,16 +140,16 @@ class DistributionPackageXml(object):
             #
             if self.DistP.MiscellaneousFiles:
                 XmlTreeLevel = ['DistributionPackage', 'MiscellaneousFiles', 'Header']
-                CheckDict = {'Name':self.DistP.MiscellaneousFiles.GetName(), }
+                CheckDict = {'Name': self.DistP.MiscellaneousFiles.GetName(), }
                 IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
                 if not self.DistP.MiscellaneousFiles.GetFileList():
                     XmlTreeLevel = ['DistributionPackage', 'MiscellaneousFiles']
-                    CheckDict = {'FileName':None, }
+                    CheckDict = {'FileName': None, }
                     IsRequiredItemListNull(CheckDict, XmlTreeLevel)
                 for Item in self.DistP.MiscellaneousFiles.GetFileList():
                     XmlTreeLevel = ['DistributionPackage', 'MiscellaneousFiles']
-                    CheckDict = {'FileName':Item.GetURI(), }
+                    CheckDict = {'FileName': Item.GetURI(), }
                     IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
             #
@@ -157,7 +157,7 @@ class DistributionPackageXml(object):
             #
             for Item in self.DistP.UserExtensions:
                 XmlTreeLevel = ['DistributionPackage', 'UserExtensions']
-                CheckDict = {'UserId':Item.GetUserID(), }
+                CheckDict = {'UserId': Item.GetUserID(), }
                 IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
 
@@ -449,10 +449,10 @@ def ValidateMS1(Module, TopXmlTreeLevel):
     XmlTreeLevel = TopXmlTreeLevel + ['MiscellaneousFiles']
     for Item in Module.GetMiscFileList():
         if not Item.GetFileList():
-            CheckDict = {'Filename':'', }
+            CheckDict = {'Filename': '', }
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
         for File in Item.GetFileList():
-            CheckDict = {'Filename':File.GetURI(), }
+            CheckDict = {'Filename': File.GetURI(), }
 
 ## ValidateMS2
 #
@@ -915,10 +915,10 @@ def ValidatePS2(Package):
     XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea', 'MiscellaneousFiles']
     for Item in Package.GetMiscFileList():
         if not Item.GetFileList():
-            CheckDict = {'Filename':'', }
+            CheckDict = {'Filename': '', }
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
         for File in Item.GetFileList():
-            CheckDict = {'Filename':File.GetURI(), }
+            CheckDict = {'Filename': File.GetURI(), }
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
 ## ValidatePackageSurfaceArea
diff --git a/BaseTools/Source/Python/Workspace/BuildClassObject.py b/BaseTools/Source/Python/Workspace/BuildClassObject.py
index 209315d901b2..2569235fb875 100644
--- a/BaseTools/Source/Python/Workspace/BuildClassObject.py
+++ b/BaseTools/Source/Python/Workspace/BuildClassObject.py
@@ -70,23 +70,23 @@ class PcdClassObject(object):
         if IsDsc:
             self.DscDefaultValue = Value
         self.PcdValueFromComm = ""
-        self.DefinitionPosition = ("","")
+        self.DefinitionPosition = ("", "")
 
     ## Get the maximum number of bytes
     def GetPcdMaxSize(self):
         if self.DatumType in TAB_PCD_NUMERIC_TYPES:
             return MAX_SIZE_TYPE[self.DatumType]
 
-        MaxSize = int(self.MaxDatumSize,10) if self.MaxDatumSize else 0
+        MaxSize = int(self.MaxDatumSize, 10) if self.MaxDatumSize else 0
         if self.PcdValueFromComm:
             if self.PcdValueFromComm.startswith("{") and self.PcdValueFromComm.endswith("}"):
-                return max([len(self.PcdValueFromComm.split(",")),MaxSize])
+                return max([len(self.PcdValueFromComm.split(",")), MaxSize])
             elif self.PcdValueFromComm.startswith("\"") or self.PcdValueFromComm.startswith("\'"):
-                return max([len(self.PcdValueFromComm)-2+1,MaxSize])
+                return max([len(self.PcdValueFromComm)-2+1, MaxSize])
             elif self.PcdValueFromComm.startswith("L\""):
-                return max([2*(len(self.PcdValueFromComm)-3+1),MaxSize])
+                return max([2*(len(self.PcdValueFromComm)-3+1), MaxSize])
             else:
-                return max([len(self.PcdValueFromComm),MaxSize])
+                return max([len(self.PcdValueFromComm), MaxSize])
         return MaxSize
 
     ## Get the number of bytes
@@ -178,7 +178,7 @@ class StructurePcd(PcdClassObject):
         self.DefaultValues[FieldName] = [Value.strip(), FileName, LineNo]
         return self.DefaultValues[FieldName]
 
-    def SetDecDefaultValue(self,DefaultValue):
+    def SetDecDefaultValue(self, DefaultValue):
         self.DefaultValueFromDec = DefaultValue
     def AddOverrideValue (self, FieldName, Value, SkuName, DefaultStoreName, FileName="", LineNo=0):
         if SkuName not in self.SkuOverrideValues:
diff --git a/BaseTools/Source/Python/Workspace/DecBuildData.py b/BaseTools/Source/Python/Workspace/DecBuildData.py
index 99257d08147b..7eeca9524529 100644
--- a/BaseTools/Source/Python/Workspace/DecBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DecBuildData.py
@@ -369,16 +369,16 @@ class DecBuildData(PackageBuildClassObject):
 
     def ProcessStructurePcd(self, StructurePcdRawDataSet):
         s_pcd_set = OrderedDict()
-        for s_pcd,LineNo in StructurePcdRawDataSet:
+        for s_pcd, LineNo in StructurePcdRawDataSet:
             if s_pcd.TokenSpaceGuidCName not in s_pcd_set:
                 s_pcd_set[s_pcd.TokenSpaceGuidCName] = []
-            s_pcd_set[s_pcd.TokenSpaceGuidCName].append((s_pcd,LineNo))
+            s_pcd_set[s_pcd.TokenSpaceGuidCName].append((s_pcd, LineNo))
 
         str_pcd_set = []
         for pcdname in s_pcd_set:
             dep_pkgs = []
             struct_pcd = StructurePcd()
-            for item,LineNo in s_pcd_set[pcdname]:
+            for item, LineNo in s_pcd_set[pcdname]:
                 if "<HeaderFiles>" in item.TokenCName:
                     struct_pcd.StructuredPcdIncludeFile.append(item.DefaultValue)
                 elif "<Packages>" in item.TokenCName:
@@ -391,7 +391,7 @@ class DecBuildData(PackageBuildClassObject):
                     struct_pcd.PkgPath = self.MetaFile.File
                     struct_pcd.SetDecDefaultValue(item.DefaultValue)
                 else:
-                    struct_pcd.AddDefaultValue(item.TokenCName, item.DefaultValue,self.MetaFile.File,LineNo)
+                    struct_pcd.AddDefaultValue(item.TokenCName, item.DefaultValue, self.MetaFile.File, LineNo)
 
             struct_pcd.PackageDecs = dep_pkgs
             str_pcd_set.append(struct_pcd)
@@ -412,7 +412,7 @@ class DecBuildData(PackageBuildClassObject):
         StrPcdSet = []
         RecordList = self._RawData[Type, self._Arch]
         for TokenSpaceGuid, PcdCName, Setting, Arch, PrivateFlag, Dummy1, Dummy2 in RecordList:
-            PcdDict[Arch, PcdCName, TokenSpaceGuid] = (Setting,Dummy2)
+            PcdDict[Arch, PcdCName, TokenSpaceGuid] = (Setting, Dummy2)
             if not (PcdCName, TokenSpaceGuid) in PcdSet:
                 PcdSet.append((PcdCName, TokenSpaceGuid))
 
@@ -421,7 +421,7 @@ class DecBuildData(PackageBuildClassObject):
             # limit the ARCH to self._Arch, if no self._Arch found, tdict
             # will automatically turn to 'common' ARCH and try again
             #
-            Setting,LineNo = PcdDict[self._Arch, PcdCName, TokenSpaceGuid]
+            Setting, LineNo = PcdDict[self._Arch, PcdCName, TokenSpaceGuid]
             if Setting is None:
                 continue
 
@@ -442,9 +442,9 @@ class DecBuildData(PackageBuildClassObject):
                                         list(validlists),
                                         list(expressions)
                                         )
-            PcdObj.DefinitionPosition = (self.MetaFile.File,LineNo)
+            PcdObj.DefinitionPosition = (self.MetaFile.File, LineNo)
             if "." in TokenSpaceGuid:
-                StrPcdSet.append((PcdObj,LineNo))
+                StrPcdSet.append((PcdObj, LineNo))
             else:
                 Pcds[PcdCName, TokenSpaceGuid, self._PCD_TYPE_STRING_[Type]] = PcdObj
 
@@ -455,10 +455,10 @@ class DecBuildData(PackageBuildClassObject):
         for pcd in Pcds.values():
             if pcd.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
                 if StructPattern.match(pcd.DatumType) is None:
-                    EdkLogger.error('build', FORMAT_INVALID, "DatumType only support BOOLEAN, UINT8, UINT16, UINT32, UINT64, VOID* or a valid struct name.", pcd.DefinitionPosition[0],pcd.DefinitionPosition[1])
+                    EdkLogger.error('build', FORMAT_INVALID, "DatumType only support BOOLEAN, UINT8, UINT16, UINT32, UINT64, VOID* or a valid struct name.", pcd.DefinitionPosition[0], pcd.DefinitionPosition[1])
         for struct_pcd in Pcds.values():
-            if isinstance(struct_pcd,StructurePcd) and not struct_pcd.StructuredPcdIncludeFile:
-                EdkLogger.error("build", PCD_STRUCTURE_PCD_ERROR, "The structure Pcd %s.%s header file is not found in %s line %s \n" % (struct_pcd.TokenSpaceGuidCName, struct_pcd.TokenCName,struct_pcd.DefinitionPosition[0],struct_pcd.DefinitionPosition[1] ))
+            if isinstance(struct_pcd, StructurePcd) and not struct_pcd.StructuredPcdIncludeFile:
+                EdkLogger.error("build", PCD_STRUCTURE_PCD_ERROR, "The structure Pcd %s.%s header file is not found in %s line %s \n" % (struct_pcd.TokenSpaceGuidCName, struct_pcd.TokenCName, struct_pcd.DefinitionPosition[0], struct_pcd.DefinitionPosition[1] ))
 
         return Pcds
     @property
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index a80c07bc1e55..9e7b8a18c28b 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -40,7 +40,7 @@ import Common.GlobalData as GlobalData
 import subprocess
 from Common.Misc import SaveFileOnChange
 from Workspace.BuildClassObject import PlatformBuildClassObject, StructurePcd, PcdClassObject, ModuleBuildClassObject
-from collections import OrderedDict,defaultdict
+from collections import OrderedDict, defaultdict
 
 PcdValueInitName = 'PcdValueInit'
 
@@ -108,7 +108,7 @@ from AutoGen.GenMake import gIncludePattern
 #
 #   @retval     list            The list of files the given source file depends on
 #
-def GetDependencyList(FileStack,SearchPathList):
+def GetDependencyList(FileStack, SearchPathList):
     DepDb = dict()
     DependencySet = set(FileStack)
     while len(FileStack) > 0:
@@ -224,7 +224,7 @@ class DscBuildData(PlatformBuildClassObject):
     @property
     def OutputPath(self):
         if os.getenv("WORKSPACE"):
-            return os.path.join(os.getenv("WORKSPACE"), self.OutputDirectory, self._Target + "_" + self._Toolchain,PcdValueInitName)
+            return os.path.join(os.getenv("WORKSPACE"), self.OutputDirectory, self._Target + "_" + self._Toolchain, PcdValueInitName)
         else:
             return os.path.dirname(self.DscFile)
 
@@ -657,7 +657,7 @@ class DscBuildData(PlatformBuildClassObject):
 
     @staticmethod
     def ToInt(intstr):
-        return int(intstr,16) if intstr.upper().startswith("0X") else int(intstr)
+        return int(intstr, 16) if intstr.upper().startswith("0X") else int(intstr)
 
     def _GetDefaultStores(self):
         if self.DefaultStores is None:
@@ -676,9 +676,9 @@ class DscBuildData(PlatformBuildClassObject):
                 if not IsValidWord(Record[1]):
                     EdkLogger.error('build', FORMAT_INVALID, "The format of the DefaultStores ID name is invalid. The correct format is '(a-zA-Z0-9_)(a-zA-Z0-9_-.)*'",
                                     File=self.MetaFile, Line=Record[-1])
-                self.DefaultStores[Record[1].upper()] = (DscBuildData.ToInt(Record[0]),Record[1].upper())
+                self.DefaultStores[Record[1].upper()] = (DscBuildData.ToInt(Record[0]), Record[1].upper())
             if TAB_DEFAULT_STORES_DEFAULT not in self.DefaultStores:
-                self.DefaultStores[TAB_DEFAULT_STORES_DEFAULT] = (0,TAB_DEFAULT_STORES_DEFAULT)
+                self.DefaultStores[TAB_DEFAULT_STORES_DEFAULT] = (0, TAB_DEFAULT_STORES_DEFAULT)
             GlobalData.gDefaultStores = sorted(self.DefaultStores.keys())
         return self.DefaultStores
 
@@ -736,7 +736,7 @@ class DscBuildData(PlatformBuildClassObject):
             for Type in [MODEL_PCD_FIXED_AT_BUILD, MODEL_PCD_PATCHABLE_IN_MODULE, \
                          MODEL_PCD_FEATURE_FLAG, MODEL_PCD_DYNAMIC, MODEL_PCD_DYNAMIC_EX]:
                 RecordList = self._RawData[Type, self._Arch, None, ModuleId]
-                for TokenSpaceGuid, PcdCName, Setting, Dummy1, Dummy2, Dummy3, Dummy4,Dummy5 in RecordList:
+                for TokenSpaceGuid, PcdCName, Setting, Dummy1, Dummy2, Dummy3, Dummy4, Dummy5 in RecordList:
                     TokenList = GetSplitValueList(Setting)
                     DefaultValue = TokenList[0]
                     # the format is PcdName| Value | VOID* | MaxDatumSize
@@ -761,7 +761,7 @@ class DscBuildData(PlatformBuildClassObject):
 
             # get module private build options
             RecordList = self._RawData[MODEL_META_DATA_BUILD_OPTION, self._Arch, None, ModuleId]
-            for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4,Dummy5 in RecordList:
+            for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4, Dummy5 in RecordList:
                 if (ToolChainFamily, ToolChain) not in Module.BuildOptions:
                     Module.BuildOptions[ToolChainFamily, ToolChain] = Option
                 else:
@@ -801,7 +801,7 @@ class DscBuildData(PlatformBuildClassObject):
             RecordList = self._RawData[MODEL_EFI_LIBRARY_CLASS, self._Arch, None, -1]
             Macros = self._Macros
             for Record in RecordList:
-                LibraryClass, LibraryInstance, Dummy, Arch, ModuleType, Dummy,Dummy, LineNo = Record
+                LibraryClass, LibraryInstance, Dummy, Arch, ModuleType, Dummy, Dummy, LineNo = Record
                 if LibraryClass == '' or LibraryClass == 'NULL':
                     self._NullLibraryNumber += 1
                     LibraryClass = 'NULL%d' % self._NullLibraryNumber
@@ -868,7 +868,7 @@ class DscBuildData(PlatformBuildClassObject):
                 ModuleData = self._Bdb[ModuleFile, self._Arch, self._Target, self._Toolchain]
                 PkgSet.update(ModuleData.Packages)
 
-            self._DecPcds, self._GuidDict = GetDeclaredPcd(self, self._Bdb, self._Arch, self._Target, self._Toolchain,PkgSet)
+            self._DecPcds, self._GuidDict = GetDeclaredPcd(self, self._Bdb, self._Arch, self._Target, self._Toolchain, PkgSet)
             self._GuidDict.update(GlobalData.gPlatformPcds)
 
         if (PcdCName, TokenSpaceGuid) not in self._DecPcds:
@@ -913,14 +913,14 @@ class DscBuildData(PlatformBuildClassObject):
                                 ExtraData="%s.%s" % (TokenSpaceGuid, PcdCName))
             if PcdType in (MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT):
                 if self._DecPcds[PcdCName, TokenSpaceGuid].DatumType.strip() != ValueList[1].strip():
-                    EdkLogger.error('build', FORMAT_INVALID, "Pcd datumtype used in DSC file is not the same as its declaration in DEC file." , File=self.MetaFile, Line=LineNo,
+                    EdkLogger.error('build', FORMAT_INVALID, "Pcd datumtype used in DSC file is not the same as its declaration in DEC file.", File=self.MetaFile, Line=LineNo,
                                 ExtraData="%s.%s|%s" % (TokenSpaceGuid, PcdCName, Setting))
         if (TokenSpaceGuid + '.' + PcdCName) in GlobalData.gPlatformPcds:
             if GlobalData.gPlatformPcds[TokenSpaceGuid + '.' + PcdCName] != ValueList[Index]:
                 GlobalData.gPlatformPcds[TokenSpaceGuid + '.' + PcdCName] = ValueList[Index]
         return ValueList
 
-    def _FilterPcdBySkuUsage(self,Pcds):
+    def _FilterPcdBySkuUsage(self, Pcds):
         available_sku = self.SkuIdMgr.AvailableSkuIdSet
         sku_usage = self.SkuIdMgr.SkuUsageType
         if sku_usage == SkuClass.SINGLE:
@@ -936,7 +936,7 @@ class DscBuildData(PlatformBuildClassObject):
                 if type(pcd) is StructurePcd and pcd.SkuOverrideValues:
                     Pcds[pcdname].SkuOverrideValues = {skuid:pcd.SkuOverrideValues[skuid] for skuid in pcd.SkuOverrideValues if skuid in available_sku}
         return Pcds
-    def CompleteHiiPcdsDefaultStores(self,Pcds):
+    def CompleteHiiPcdsDefaultStores(self, Pcds):
         HiiPcd = [Pcds[pcd] for pcd in Pcds if Pcds[pcd].Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]]
         DefaultStoreMgr = DefaultStore(self.DefaultStores)
         for pcd in HiiPcd:
@@ -958,7 +958,7 @@ class DscBuildData(PlatformBuildClassObject):
             else:
                 pcd.PcdValueFromComm = pcd.SkuInfoList.get(TAB_DEFAULT).DefaultValue
         for pcd in self._Pcds:
-            if isinstance(self._Pcds[pcd],StructurePcd) and (self._Pcds[pcd].PcdValueFromComm or self._Pcds[pcd].PcdFieldValueFromComm):
+            if isinstance(self._Pcds[pcd], StructurePcd) and (self._Pcds[pcd].PcdValueFromComm or self._Pcds[pcd].PcdFieldValueFromComm):
                 UpdateCommandLineValue(self._Pcds[pcd])
 
     def __ParsePcdFromCommandLine(self):
@@ -970,10 +970,10 @@ class DscBuildData(PlatformBuildClassObject):
                 if not pcdvalue:
                     EdkLogger.error('build', AUTOGEN_ERROR, "No Value specified for the PCD %s." % (pcdname))
                 if '.' in pcdname:
-                    (Name1, Name2) = pcdname.split('.',1)
+                    (Name1, Name2) = pcdname.split('.', 1)
                     if "." in Name2:
-                        (Name3, FieldName) = Name2.split(".",1)
-                        if ((Name3,Name1)) in self.DecPcds:
+                        (Name3, FieldName) = Name2.split(".", 1)
+                        if ((Name3, Name1)) in self.DecPcds:
                             HasTokenSpace = True
                             TokenCName = Name3
                             TokenSpaceGuidCName = Name1
@@ -983,7 +983,7 @@ class DscBuildData(PlatformBuildClassObject):
                             TokenSpaceGuidCName = ''
                             HasTokenSpace = False
                     else:
-                        if ((Name2,Name1)) in self.DecPcds:
+                        if ((Name2, Name1)) in self.DecPcds:
                             HasTokenSpace = True
                             TokenCName = Name2
                             TokenSpaceGuidCName = Name1
@@ -1037,7 +1037,7 @@ class DscBuildData(PlatformBuildClassObject):
                     IsValid, Cause = CheckPcdDatum(PcdDatumType, pcdvalue)
                     if not IsValid:
                         EdkLogger.error("build", FORMAT_INVALID, Cause, ExtraData="%s.%s" % (TokenSpaceGuidCName, TokenCName))
-                GlobalData.BuildOptionPcd[i] = (TokenSpaceGuidCName, TokenCName, FieldName, pcdvalue,("build command options",1))
+                GlobalData.BuildOptionPcd[i] = (TokenSpaceGuidCName, TokenCName, FieldName, pcdvalue, ("build command options", 1))
 
                 for BuildData in self._Bdb._CACHE_.values():
                     if BuildData.MetaFile.Ext == '.dec' or BuildData.MetaFile.Ext == '.dsc':
@@ -1148,7 +1148,7 @@ class DscBuildData(PlatformBuildClassObject):
             #
             for CodeBase in (EDKII_NAME, EDK_NAME):
                 RecordList = self._RawData[MODEL_META_DATA_BUILD_OPTION, self._Arch, CodeBase]
-                for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4,Dummy5 in RecordList:
+                for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4, Dummy5 in RecordList:
                     if Dummy3.upper() != TAB_COMMON:
                         continue
                     CurKey = (ToolChainFamily, ToolChain, CodeBase)
@@ -1171,7 +1171,7 @@ class DscBuildData(PlatformBuildClassObject):
             DriverType = '%s.%s' % (Edk, ModuleType)
             CommonDriverType = '%s.%s' % (TAB_COMMON, ModuleType)
             RecordList = self._RawData[MODEL_META_DATA_BUILD_OPTION, self._Arch]
-            for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4,Dummy5 in RecordList:
+            for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4, Dummy5 in RecordList:
                 Type = Dummy2 + '.' + Dummy3
                 if Type.upper() == DriverType.upper() or Type.upper() == CommonDriverType.upper():
                     Key = (ToolChainFamily, ToolChain, Edk)
@@ -1186,7 +1186,7 @@ class DscBuildData(PlatformBuildClassObject):
     def GetStructurePcdInfo(PcdSet):
         structure_pcd_data = defaultdict(list)
         for item in PcdSet:
-            structure_pcd_data[(item[0],item[1])].append(item)
+            structure_pcd_data[(item[0], item[1])].append(item)
 
         return structure_pcd_data
 
@@ -1194,25 +1194,25 @@ class DscBuildData(PlatformBuildClassObject):
     def OverrideByFdfComm(StruPcds):
         StructurePcdInCom = OrderedDict()
         for item in GlobalData.BuildOptionPcd:
-            if len(item) == 5 and (item[1],item[0]) in StruPcds:
-                StructurePcdInCom[(item[0],item[1],item[2] )] = (item[3],item[4])
-        GlobalPcds = {(item[0],item[1]) for item in StructurePcdInCom}
+            if len(item) == 5 and (item[1], item[0]) in StruPcds:
+                StructurePcdInCom[(item[0], item[1], item[2] )] = (item[3], item[4])
+        GlobalPcds = {(item[0], item[1]) for item in StructurePcdInCom}
         for Pcd in StruPcds.values():
-            if (Pcd.TokenSpaceGuidCName,Pcd.TokenCName) not in GlobalPcds:
+            if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) not in GlobalPcds:
                 continue
             FieldValues = OrderedDict()
             for item in StructurePcdInCom:
-                if (Pcd.TokenSpaceGuidCName,Pcd.TokenCName) == (item[0],item[1]) and item[2]:
+                if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) == (item[0], item[1]) and item[2]:
                     FieldValues[item[2]] = StructurePcdInCom[item]
             for field in FieldValues:
                 if field not in Pcd.PcdFieldValueFromComm:
-                    Pcd.PcdFieldValueFromComm[field] = ["","",""]
+                    Pcd.PcdFieldValueFromComm[field] = ["", "", ""]
                 Pcd.PcdFieldValueFromComm[field][0] = FieldValues[field][0]
                 Pcd.PcdFieldValueFromComm[field][1] = FieldValues[field][1][0]
                 Pcd.PcdFieldValueFromComm[field][2] = FieldValues[field][1][1]
         return StruPcds
 
-    def OverrideByFdfCommOverAll(self,AllPcds):
+    def OverrideByFdfCommOverAll(self, AllPcds):
         def CheckStructureInComm(commpcds):
             if not commpcds:
                 return False
@@ -1221,43 +1221,43 @@ class DscBuildData(PlatformBuildClassObject):
             return False
 
         if CheckStructureInComm(GlobalData.BuildOptionPcd):
-            StructurePcdInCom = {(item[0],item[1],item[2] ):(item[3],item[4]) for item in GlobalData.BuildOptionPcd } if GlobalData.BuildOptionPcd else {}
-            NoFiledValues = {(item[0],item[1]):StructurePcdInCom[item] for item in StructurePcdInCom if not item[2]}
+            StructurePcdInCom = {(item[0], item[1], item[2] ):(item[3], item[4]) for item in GlobalData.BuildOptionPcd } if GlobalData.BuildOptionPcd else {}
+            NoFiledValues = {(item[0], item[1]):StructurePcdInCom[item] for item in StructurePcdInCom if not item[2]}
         else:
-            NoFiledValues = {(item[0],item[1]):[item[2]] for item in GlobalData.BuildOptionPcd}
-        for Guid,Name in NoFiledValues:
-            if (Name,Guid) in AllPcds:
-                Pcd = AllPcds.get((Name,Guid))
-                if isinstance(self._DecPcds.get((Pcd.TokenCName,Pcd.TokenSpaceGuidCName), None),StructurePcd):
-                    self._DecPcds.get((Pcd.TokenCName,Pcd.TokenSpaceGuidCName)).PcdValueFromComm = NoFiledValues[(Pcd.TokenSpaceGuidCName,Pcd.TokenCName)][0]
+            NoFiledValues = {(item[0], item[1]):[item[2]] for item in GlobalData.BuildOptionPcd}
+        for Guid, Name in NoFiledValues:
+            if (Name, Guid) in AllPcds:
+                Pcd = AllPcds.get((Name, Guid))
+                if isinstance(self._DecPcds.get((Pcd.TokenCName, Pcd.TokenSpaceGuidCName), None), StructurePcd):
+                    self._DecPcds.get((Pcd.TokenCName, Pcd.TokenSpaceGuidCName)).PcdValueFromComm = NoFiledValues[(Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0]
                 else:
-                    Pcd.PcdValueFromComm = NoFiledValues[(Pcd.TokenSpaceGuidCName,Pcd.TokenCName)][0]
-                    Pcd.DefaultValue = NoFiledValues[(Pcd.TokenSpaceGuidCName,Pcd.TokenCName)][0]
+                    Pcd.PcdValueFromComm = NoFiledValues[(Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0]
+                    Pcd.DefaultValue = NoFiledValues[(Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0]
                     for sku in Pcd.SkuInfoList:
                         SkuInfo = Pcd.SkuInfoList[sku]
                         if SkuInfo.DefaultValue:
-                            SkuInfo.DefaultValue = NoFiledValues[(Pcd.TokenSpaceGuidCName,Pcd.TokenCName)][0]
+                            SkuInfo.DefaultValue = NoFiledValues[(Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0]
                         else:
-                            SkuInfo.HiiDefaultValue = NoFiledValues[(Pcd.TokenSpaceGuidCName,Pcd.TokenCName)][0]
+                            SkuInfo.HiiDefaultValue = NoFiledValues[(Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0]
                             for defaultstore in SkuInfo.DefaultStoreDict:
-                                SkuInfo.DefaultStoreDict[defaultstore] = NoFiledValues[(Pcd.TokenSpaceGuidCName,Pcd.TokenCName)][0]
+                                SkuInfo.DefaultStoreDict[defaultstore] = NoFiledValues[(Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0]
                     if Pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII]]:
                         if Pcd.DatumType == TAB_VOID:
                             if not Pcd.MaxDatumSize:
                                 Pcd.MaxDatumSize = '0'
-                            CurrentSize = int(Pcd.MaxDatumSize,16) if Pcd.MaxDatumSize.upper().startswith("0X") else int(Pcd.MaxDatumSize)
+                            CurrentSize = int(Pcd.MaxDatumSize, 16) if Pcd.MaxDatumSize.upper().startswith("0X") else int(Pcd.MaxDatumSize)
                             OptionSize = len((StringToArray(Pcd.PcdValueFromComm)).split(","))
                             MaxSize = max(CurrentSize, OptionSize)
                             Pcd.MaxDatumSize = str(MaxSize)
             else:
-                PcdInDec = self.DecPcds.get((Name,Guid))
+                PcdInDec = self.DecPcds.get((Name, Guid))
                 if PcdInDec:
-                    PcdInDec.PcdValueFromComm = NoFiledValues[(Guid,Name)][0]
+                    PcdInDec.PcdValueFromComm = NoFiledValues[(Guid, Name)][0]
                     if PcdInDec.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
                                         self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE],
                                         self._PCD_TYPE_STRING_[MODEL_PCD_FEATURE_FLAG]]:
                         self.Pcds[Name, Guid] = copy.deepcopy(PcdInDec)
-                        self.Pcds[Name, Guid].DefaultValue = NoFiledValues[( Guid,Name)][0]
+                        self.Pcds[Name, Guid].DefaultValue = NoFiledValues[( Guid, Name)][0]
         return AllPcds
     def UpdateStructuredPcds(self, TypeList, AllPcds):
 
@@ -1281,7 +1281,7 @@ class DscBuildData(PlatformBuildClassObject):
         for Type in TypeList:
             RecordList.extend(self._RawData[Type, self._Arch])
 
-        for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, default_store, Dummy4,Dummy5 in RecordList:
+        for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, default_store, Dummy4, Dummy5 in RecordList:
             SkuName = SkuName.upper()
             default_store = default_store.upper()
             SkuName = TAB_DEFAULT if SkuName == TAB_COMMON else SkuName
@@ -1289,7 +1289,7 @@ class DscBuildData(PlatformBuildClassObject):
                 continue
 
             if SkuName in SkuIds and "." in TokenSpaceGuid:
-                S_PcdSet.append([ TokenSpaceGuid.split(".")[0],TokenSpaceGuid.split(".")[1], PcdCName,SkuName, default_store,Dummy5, AnalyzePcdExpression(Setting)[0]])
+                S_PcdSet.append([ TokenSpaceGuid.split(".")[0], TokenSpaceGuid.split(".")[1], PcdCName, SkuName, default_store, Dummy5, AnalyzePcdExpression(Setting)[0]])
 
         # handle pcd value override
         StrPcdSet = DscBuildData.GetStructurePcdInfo(S_PcdSet)
@@ -1300,7 +1300,7 @@ class DscBuildData(PlatformBuildClassObject):
             if not isinstance (str_pcd_dec, StructurePcd):
                 EdkLogger.error('build', PARSER_ERROR,
                             "Pcd (%s.%s) is not declared as Structure PCD in DEC files. Arch: ['%s']" % (str_pcd[0], str_pcd[1], self._Arch),
-                            File=self.MetaFile,Line = StrPcdSet[str_pcd][0][5])
+                            File=self.MetaFile, Line = StrPcdSet[str_pcd][0][5])
             if str_pcd_dec:
                 str_pcd_obj_str = StructurePcd()
                 str_pcd_obj_str.copy(str_pcd_dec)
@@ -1312,12 +1312,12 @@ class DscBuildData(PlatformBuildClassObject):
                         str_pcd_obj_str.DefaultFromDSC = {skuname:{defaultstore: str_pcd_obj.SkuInfoList[skuname].DefaultStoreDict.get(defaultstore, str_pcd_obj.SkuInfoList[skuname].DefaultValue) for defaultstore in DefaultStores} for skuname in str_pcd_obj.SkuInfoList}
                 for str_pcd_data in StrPcdSet[str_pcd]:
                     if str_pcd_data[3] in SkuIds:
-                        str_pcd_obj_str.AddOverrideValue(str_pcd_data[2], str(str_pcd_data[6]), TAB_DEFAULT if str_pcd_data[3] == TAB_COMMON else str_pcd_data[3],TAB_DEFAULT_STORES_DEFAULT if str_pcd_data[4] == TAB_COMMON else str_pcd_data[4], self.MetaFile.File if self.WorkspaceDir not in self.MetaFile.File else self.MetaFile.File[len(self.WorkspaceDir) if self.WorkspaceDir.endswith(os.path.sep) else len(self.WorkspaceDir)+1:],LineNo=str_pcd_data[5])
+                        str_pcd_obj_str.AddOverrideValue(str_pcd_data[2], str(str_pcd_data[6]), TAB_DEFAULT if str_pcd_data[3] == TAB_COMMON else str_pcd_data[3], TAB_DEFAULT_STORES_DEFAULT if str_pcd_data[4] == TAB_COMMON else str_pcd_data[4], self.MetaFile.File if self.WorkspaceDir not in self.MetaFile.File else self.MetaFile.File[len(self.WorkspaceDir) if self.WorkspaceDir.endswith(os.path.sep) else len(self.WorkspaceDir)+1:], LineNo=str_pcd_data[5])
                 S_pcd_set[str_pcd[1], str_pcd[0]] = str_pcd_obj_str
             else:
                 EdkLogger.error('build', PARSER_ERROR,
                             "Pcd (%s.%s) defined in DSC is not declared in DEC files. Arch: ['%s']" % (str_pcd[0], str_pcd[1], self._Arch),
-                            File=self.MetaFile,Line = StrPcdSet[str_pcd][0][5])
+                            File=self.MetaFile, Line = StrPcdSet[str_pcd][0][5])
         # Add the Structure PCD that only defined in DEC, don't have override in DSC file
         for Pcd in self.DecPcds:
             if type (self._DecPcds[Pcd]) is StructurePcd:
@@ -1348,7 +1348,7 @@ class DscBuildData(PlatformBuildClassObject):
                         nextskuid = self.SkuIdMgr.GetNextSkuId(nextskuid)
                     stru_pcd.SkuOverrideValues[skuid] = copy.deepcopy(stru_pcd.SkuOverrideValues[nextskuid]) if not NoDefault else copy.deepcopy({defaultstorename: stru_pcd.DefaultValues for defaultstorename in DefaultStores} if DefaultStores else {TAB_DEFAULT_STORES_DEFAULT:stru_pcd.DefaultValues})
                     if not NoDefault:
-                        stru_pcd.ValueChain.add((skuid,''))
+                        stru_pcd.ValueChain.add((skuid, ''))
             if stru_pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
                 for skuid in SkuIds:
                     nextskuid = skuid
@@ -1367,11 +1367,11 @@ class DscBuildData(PlatformBuildClassObject):
                     for defaultstoreid in DefaultStores:
                         if defaultstoreid not in stru_pcd.SkuOverrideValues[skuid]:
                             stru_pcd.SkuOverrideValues[skuid][defaultstoreid] = copy.deepcopy(stru_pcd.SkuOverrideValues[nextskuid][mindefaultstorename])
-                            stru_pcd.ValueChain.add((skuid,defaultstoreid))
+                            stru_pcd.ValueChain.add((skuid, defaultstoreid))
         S_pcd_set = DscBuildData.OverrideByFdfComm(S_pcd_set)
         Str_Pcd_Values = self.GenerateByteArrayValue(S_pcd_set)
         if Str_Pcd_Values:
-            for (skuname,StoreName,PcdGuid,PcdName,PcdValue) in Str_Pcd_Values:
+            for (skuname, StoreName, PcdGuid, PcdName, PcdValue) in Str_Pcd_Values:
                 str_pcd_obj = S_pcd_set.get((PcdName, PcdGuid))
                 if str_pcd_obj is None:
                     print(PcdName, PcdGuid)
@@ -1423,7 +1423,7 @@ class DscBuildData(PlatformBuildClassObject):
                 elif TAB_DEFAULT in pcd.SkuInfoList and TAB_COMMON in pcd.SkuInfoList:
                     del pcd.SkuInfoList[TAB_COMMON]
 
-        map(self.FilterSkuSettings,[Pcds[pcdkey] for pcdkey in Pcds if Pcds[pcdkey].Type in DynamicPcdType])
+        map(self.FilterSkuSettings, [Pcds[pcdkey] for pcdkey in Pcds if Pcds[pcdkey].Type in DynamicPcdType])
         return Pcds
 
     ## Retrieve non-dynamic PCD settings
@@ -1445,7 +1445,7 @@ class DscBuildData(PlatformBuildClassObject):
         # Find out all possible PCD candidates for self._Arch
         RecordList = self._RawData[Type, self._Arch]
         PcdValueDict = OrderedDict()
-        for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dummy4,Dummy5 in RecordList:
+        for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dummy4, Dummy5 in RecordList:
             SkuName = SkuName.upper()
             SkuName = TAB_DEFAULT if SkuName == TAB_COMMON else SkuName
             if SkuName not in AvailableSkuIdSet:
@@ -1466,7 +1466,7 @@ class DscBuildData(PlatformBuildClassObject):
             else:
                 PcdValueDict[PcdCName, TokenSpaceGuid] = {SkuName:(PcdValue, DatumType, MaxDatumSize)}
 
-        for ((PcdCName,TokenSpaceGuid),PcdSetting) in PcdValueDict.iteritems():
+        for ((PcdCName, TokenSpaceGuid), PcdSetting) in PcdValueDict.iteritems():
             PcdValue = None
             DatumType = None
             MaxDatumSize = None
@@ -1536,7 +1536,7 @@ class DscBuildData(PlatformBuildClassObject):
         Result = Result + '"'
         return Result
 
-    def GenerateSizeFunction(self,Pcd):
+    def GenerateSizeFunction(self, Pcd):
         CApp = "// Default Value in Dec \n"
         CApp = CApp + "void Cal_%s_%s_Size(UINT32 *Size){\n" % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
         for FieldList in [Pcd.DefaultValues]:
@@ -1618,7 +1618,7 @@ class DscBuildData(PlatformBuildClassObject):
                 while '[' in FieldName:
                     FieldName = FieldName.rsplit('[', 1)[0]
                     CApp = CApp + '  __FLEXIBLE_SIZE(*Size, %s, %s, %d); // From %s Line %d Value %s \n' % (Pcd.DatumType, FieldName.strip("."), ArrayIndex + 1, Pcd.PcdFieldValueFromComm[FieldName_ori][1], Pcd.PcdFieldValueFromComm[FieldName_ori][2], Pcd.PcdFieldValueFromComm[FieldName_ori][0])
-        CApp = CApp + "  *Size = (%d > *Size ? %d : *Size); // The Pcd maxsize is %d \n" % (Pcd.GetPcdMaxSize(),Pcd.GetPcdMaxSize(),Pcd.GetPcdMaxSize())
+        CApp = CApp + "  *Size = (%d > *Size ? %d : *Size); // The Pcd maxsize is %d \n" % (Pcd.GetPcdMaxSize(), Pcd.GetPcdMaxSize(), Pcd.GetPcdMaxSize())
         CApp = CApp + "}\n"
         return CApp
 
@@ -1628,9 +1628,9 @@ class DscBuildData(PlatformBuildClassObject):
         CApp = CApp + '  Cal_%s_%s_Size(&Size);\n' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
         return CApp
 
-    def GenerateDefaultValueAssignFunction(self,Pcd):
+    def GenerateDefaultValueAssignFunction(self, Pcd):
         CApp = "// Default value in Dec \n"
-        CApp = CApp + "void Assign_%s_%s_Default_Value(%s *Pcd){\n" % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName,Pcd.DatumType)
+        CApp = CApp + "void Assign_%s_%s_Default_Value(%s *Pcd){\n" % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.DatumType)
         CApp = CApp + '  UINT32  FieldSize;\n'
         CApp = CApp + '  CHAR8   *Value;\n'
         DefaultValueFromDec = Pcd.DefaultValueFromDec
@@ -1661,12 +1661,12 @@ class DscBuildData(PlatformBuildClassObject):
                         FieldList[FieldName][0] = ValueExpressionEx(FieldList[FieldName][0], TAB_VOID, self._GuidDict)(True)
                     except BadExpression:
                         EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " %
-                                        (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName)), FieldList[FieldName][1],FieldList[FieldName][2]))
+                                        (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[FieldName][2]))
 
                 try:
                     Value, ValueSize = ParseFieldValue (FieldList[FieldName][0])
                 except Exception:
-                    EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " % (".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCName,FieldName)),FieldList[FieldName][1], FieldList[FieldName][2]))
+                    EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " % (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[FieldName][2]))
                 if isinstance(Value, str):
                     CApp = CApp + '  Pcd->%s = %s; // From %s Line %d Value %s\n' % (FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
                 elif IsArray:
@@ -1689,22 +1689,22 @@ class DscBuildData(PlatformBuildClassObject):
         CApp = '  Assign_%s_%s_Default_Value(Pcd);\n' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
         return CApp
 
-    def GenerateInitValueFunction(self,Pcd,SkuName,DefaultStoreName):
-        CApp = "// Value in Dsc for Sku: %s, DefaultStore %s\n" % (SkuName,DefaultStoreName)
-        CApp = CApp + "void Assign_%s_%s_%s_%s_Value(%s *Pcd){\n" % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName,SkuName,DefaultStoreName,Pcd.DatumType)
+    def GenerateInitValueFunction(self, Pcd, SkuName, DefaultStoreName):
+        CApp = "// Value in Dsc for Sku: %s, DefaultStore %s\n" % (SkuName, DefaultStoreName)
+        CApp = CApp + "void Assign_%s_%s_%s_%s_Value(%s *Pcd){\n" % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, SkuName, DefaultStoreName, Pcd.DatumType)
         CApp = CApp + '  UINT32  FieldSize;\n'
         CApp = CApp + '  CHAR8   *Value;\n'
 
         CApp = CApp + "// SkuName: %s,  DefaultStoreName: %s \n" % (TAB_DEFAULT, TAB_DEFAULT_STORES_DEFAULT)
         inherit_OverrideValues = Pcd.SkuOverrideValues[SkuName]
-        if (SkuName,DefaultStoreName) == (TAB_DEFAULT,TAB_DEFAULT_STORES_DEFAULT):
-            pcddefaultvalue = Pcd.DefaultFromDSC.get(TAB_DEFAULT,{}).get(TAB_DEFAULT_STORES_DEFAULT, Pcd.DefaultValue) if Pcd.DefaultFromDSC else Pcd.DefaultValue
+        if (SkuName, DefaultStoreName) == (TAB_DEFAULT, TAB_DEFAULT_STORES_DEFAULT):
+            pcddefaultvalue = Pcd.DefaultFromDSC.get(TAB_DEFAULT, {}).get(TAB_DEFAULT_STORES_DEFAULT, Pcd.DefaultValue) if Pcd.DefaultFromDSC else Pcd.DefaultValue
         else:
             if not Pcd.DscRawValue:
                 # handle the case that structure pcd is not appear in DSC
                 self.CopyDscRawValue(Pcd)
-            pcddefaultvalue = Pcd.DscRawValue.get(SkuName,{}).get(DefaultStoreName)
-        for FieldList in [pcddefaultvalue,inherit_OverrideValues.get(DefaultStoreName)]:
+            pcddefaultvalue = Pcd.DscRawValue.get(SkuName, {}).get(DefaultStoreName)
+        for FieldList in [pcddefaultvalue, inherit_OverrideValues.get(DefaultStoreName)]:
             if not FieldList:
                 continue
             if pcddefaultvalue and FieldList == pcddefaultvalue:
@@ -1717,26 +1717,26 @@ class DscBuildData(PlatformBuildClassObject):
                                         (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldList))
                 Value, ValueSize = ParseFieldValue (FieldList)
 
-                if (SkuName,DefaultStoreName) == (TAB_DEFAULT,TAB_DEFAULT_STORES_DEFAULT):
+                if (SkuName, DefaultStoreName) == (TAB_DEFAULT, TAB_DEFAULT_STORES_DEFAULT):
                     if isinstance(Value, str):
-                        CApp = CApp + '  Pcd = %s; // From DSC Default Value %s\n' % (Value, Pcd.DefaultFromDSC.get(TAB_DEFAULT,{}).get(TAB_DEFAULT_STORES_DEFAULT, Pcd.DefaultValue) if Pcd.DefaultFromDSC else Pcd.DefaultValue)
+                        CApp = CApp + '  Pcd = %s; // From DSC Default Value %s\n' % (Value, Pcd.DefaultFromDSC.get(TAB_DEFAULT, {}).get(TAB_DEFAULT_STORES_DEFAULT, Pcd.DefaultValue) if Pcd.DefaultFromDSC else Pcd.DefaultValue)
                     elif IsArray:
                     #
                     # Use memcpy() to copy value into field
                     #
-                        CApp = CApp + '  Value     = %s; // From DSC Default Value %s\n' % (DscBuildData.IntToCString(Value, ValueSize), Pcd.DefaultFromDSC.get(TAB_DEFAULT,{}).get(TAB_DEFAULT_STORES_DEFAULT, Pcd.DefaultValue) if Pcd.DefaultFromDSC else Pcd.DefaultValue)
+                        CApp = CApp + '  Value     = %s; // From DSC Default Value %s\n' % (DscBuildData.IntToCString(Value, ValueSize), Pcd.DefaultFromDSC.get(TAB_DEFAULT, {}).get(TAB_DEFAULT_STORES_DEFAULT, Pcd.DefaultValue) if Pcd.DefaultFromDSC else Pcd.DefaultValue)
                         CApp = CApp + '  memcpy (Pcd, Value, %d);\n' % (ValueSize)
                 else:
                     if isinstance(Value, str):
-                        CApp = CApp + '  Pcd = %s; // From DSC Default Value %s\n' % (Value, Pcd.DscRawValue.get(SkuName,{}).get(DefaultStoreName))
+                        CApp = CApp + '  Pcd = %s; // From DSC Default Value %s\n' % (Value, Pcd.DscRawValue.get(SkuName, {}).get(DefaultStoreName))
                     elif IsArray:
                     #
                     # Use memcpy() to copy value into field
                     #
-                        CApp = CApp + '  Value     = %s; // From DSC Default Value %s\n' % (DscBuildData.IntToCString(Value, ValueSize), Pcd.DscRawValue.get(SkuName,{}).get(DefaultStoreName))
+                        CApp = CApp + '  Value     = %s; // From DSC Default Value %s\n' % (DscBuildData.IntToCString(Value, ValueSize), Pcd.DscRawValue.get(SkuName, {}).get(DefaultStoreName))
                         CApp = CApp + '  memcpy (Pcd, Value, %d);\n' % (ValueSize)
                 continue
-            if (SkuName,DefaultStoreName) == (TAB_DEFAULT,TAB_DEFAULT_STORES_DEFAULT) or (( (SkuName,'') not in Pcd.ValueChain) and ( (SkuName,DefaultStoreName) not in Pcd.ValueChain )):
+            if (SkuName, DefaultStoreName) == (TAB_DEFAULT, TAB_DEFAULT_STORES_DEFAULT) or (( (SkuName, '') not in Pcd.ValueChain) and ( (SkuName, DefaultStoreName) not in Pcd.ValueChain )):
                 for FieldName in FieldList:
                     IsArray = IsFieldValueAnArray(FieldList[FieldName][0])
                     if IsArray:
@@ -1748,7 +1748,7 @@ class DscBuildData(PlatformBuildClassObject):
                     try:
                         Value, ValueSize = ParseFieldValue (FieldList[FieldName][0])
                     except Exception:
-                        EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " % (".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCName,FieldName)),FieldList[FieldName][1], FieldList[FieldName][2]))
+                        EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " % (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[FieldName][2]))
                     if isinstance(Value, str):
                         CApp = CApp + '  Pcd->%s = %s; // From %s Line %d Value %s\n' % (FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
                     elif IsArray:
@@ -1767,18 +1767,18 @@ class DscBuildData(PlatformBuildClassObject):
         return CApp
 
     @staticmethod
-    def GenerateInitValueStatement(Pcd,SkuName,DefaultStoreName):
-        CApp = '  Assign_%s_%s_%s_%s_Value(Pcd);\n' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName,SkuName,DefaultStoreName)
+    def GenerateInitValueStatement(Pcd, SkuName, DefaultStoreName):
+        CApp = '  Assign_%s_%s_%s_%s_Value(Pcd);\n' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, SkuName, DefaultStoreName)
         return CApp
 
-    def GenerateCommandLineValue(self,Pcd):
+    def GenerateCommandLineValue(self, Pcd):
         CApp = "// Value in CommandLine\n"
-        CApp = CApp + "void Assign_%s_%s_CommandLine_Value(%s *Pcd){\n" % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName,Pcd.DatumType)
+        CApp = CApp + "void Assign_%s_%s_CommandLine_Value(%s *Pcd){\n" % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.DatumType)
         CApp = CApp + '  UINT32  FieldSize;\n'
         CApp = CApp + '  CHAR8   *Value;\n'
 
         pcddefaultvalue = Pcd.PcdValueFromComm
-        for FieldList in [pcddefaultvalue,Pcd.PcdFieldValueFromComm]:
+        for FieldList in [pcddefaultvalue, Pcd.PcdFieldValueFromComm]:
             if not FieldList:
                 continue
             if pcddefaultvalue and FieldList == pcddefaultvalue:
@@ -1813,7 +1813,7 @@ class DscBuildData(PlatformBuildClassObject):
                 try:
                     Value, ValueSize = ParseFieldValue (FieldList[FieldName][0])
                 except Exception:
-                    EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " % (".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCName,FieldName)),FieldList[FieldName][1], FieldList[FieldName][2]))
+                    EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " % (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[FieldName][2]))
                 if isinstance(Value, str):
                     CApp = CApp + '  Pcd->%s = %s; // From %s Line %d Value %s\n' % (FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
                 elif IsArray:
@@ -1855,7 +1855,7 @@ class DscBuildData(PlatformBuildClassObject):
             CApp = CApp + '\n'
 
             if SkuName in Pcd.SkuInfoList:
-                DefaultValue = Pcd.SkuInfoList[SkuName].DefaultStoreDict.get(DefaultStoreName,Pcd.SkuInfoList[SkuName].HiiDefaultValue if Pcd.SkuInfoList[SkuName].HiiDefaultValue  else Pcd.SkuInfoList[SkuName].DefaultValue)
+                DefaultValue = Pcd.SkuInfoList[SkuName].DefaultStoreDict.get(DefaultStoreName, Pcd.SkuInfoList[SkuName].HiiDefaultValue if Pcd.SkuInfoList[SkuName].HiiDefaultValue  else Pcd.SkuInfoList[SkuName].DefaultValue)
             else:
                 DefaultValue = Pcd.DefaultValue
             PcdDefaultValue = StringToArray(DefaultValue.strip())
@@ -1901,12 +1901,12 @@ class DscBuildData(PlatformBuildClassObject):
                     storeset = [DefaultStoreName] if DefaultStoreName == TAB_DEFAULT_STORES_DEFAULT else [TAB_DEFAULT_STORES_DEFAULT, DefaultStoreName]
                     for defaultstorenameitem in storeset:
                         CApp = CApp + "// SkuName: %s,  DefaultStoreName: %s \n" % (skuname, defaultstorenameitem)
-                        CApp = CApp + DscBuildData.GenerateInitValueStatement(Pcd,skuname,defaultstorenameitem)
+                        CApp = CApp + DscBuildData.GenerateInitValueStatement(Pcd, skuname, defaultstorenameitem)
                     if skuname == SkuName:
                         break
             else:
                 CApp = CApp + "// SkuName: %s,  DefaultStoreName: STANDARD \n" % self.SkuIdMgr.SystemSkuId
-                CApp = CApp + DscBuildData.GenerateInitValueStatement(Pcd,self.SkuIdMgr.SystemSkuId,TAB_DEFAULT_STORES_DEFAULT)
+                CApp = CApp + DscBuildData.GenerateInitValueStatement(Pcd, self.SkuIdMgr.SystemSkuId, TAB_DEFAULT_STORES_DEFAULT)
             CApp = CApp + DscBuildData.GenerateCommandLineValueStatement(Pcd)
             #
             # Set new PCD value and size
@@ -1946,13 +1946,13 @@ class DscBuildData(PlatformBuildClassObject):
             CApp = CApp + self.GenerateCommandLineValue(Pcd)
             if not Pcd.SkuOverrideValues or Pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
                         self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
-                CApp = CApp + self.GenerateInitValueFunction(Pcd,self.SkuIdMgr.SystemSkuId, TAB_DEFAULT_STORES_DEFAULT)
+                CApp = CApp + self.GenerateInitValueFunction(Pcd, self.SkuIdMgr.SystemSkuId, TAB_DEFAULT_STORES_DEFAULT)
             else:
                 for SkuName in self.SkuIdMgr.SkuOverrideOrder():
                     if SkuName not in Pcd.SkuOverrideValues:
                         continue
                     for DefaultStoreName in Pcd.SkuOverrideValues[SkuName]:
-                        CApp = CApp + self.GenerateInitValueFunction(Pcd,SkuName,DefaultStoreName)
+                        CApp = CApp + self.GenerateInitValueFunction(Pcd, SkuName, DefaultStoreName)
             if not Pcd.SkuOverrideValues or Pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
                         self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
                 InitByteValue, CApp = self.GenerateInitializeFunc(self.SkuIdMgr.SystemSkuId, TAB_DEFAULT_STORES_DEFAULT, Pcd, InitByteValue, CApp)
@@ -1970,7 +1970,7 @@ class DscBuildData(PlatformBuildClassObject):
         CApp = CApp + '  )\n'
         CApp = CApp + '{\n'
         for Pcd in StructuredPcds.values():
-            if not Pcd.SkuOverrideValues or Pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
+            if not Pcd.SkuOverrideValues or Pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD], self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
                 CApp = CApp + '  Initialize_%s_%s_%s_%s();\n' % (self.SkuIdMgr.SystemSkuId, TAB_DEFAULT_STORES_DEFAULT, Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
             else:
                 for SkuName in self.SkuIdMgr.SkuOverrideOrder():
@@ -2072,7 +2072,7 @@ class DscBuildData(PlatformBuildClassObject):
         IncludeFileFullPaths = []
         for includefile in IncludeFiles:
             for includepath in IncSearchList:
-                includefullpath = os.path.join(str(includepath),includefile)
+                includefullpath = os.path.join(str(includepath), includefile)
                 if os.path.exists(includefullpath):
                     IncludeFileFullPaths.append(os.path.normpath(includefullpath))
                     break
@@ -2080,7 +2080,7 @@ class DscBuildData(PlatformBuildClassObject):
         SearchPathList.append(os.path.normpath(mws.join(GlobalData.gWorkspace, "BaseTools/Source/C/Include")))
         SearchPathList.append(os.path.normpath(mws.join(GlobalData.gWorkspace, "BaseTools/Source/C/Common")))
         SearchPathList.extend(str(item) for item in IncSearchList)
-        IncFileList = GetDependencyList(IncludeFileFullPaths,SearchPathList)
+        IncFileList = GetDependencyList(IncludeFileFullPaths, SearchPathList)
         for include_file in IncFileList:
             MakeApp += "$(OBJECTS) : %s\n" % include_file
         MakeFileName = os.path.join(self.OutputPath, 'Makefile')
@@ -2126,7 +2126,7 @@ class DscBuildData(PlatformBuildClassObject):
                     if FileLine.isdigit():
                         error_line = FileData[int (FileLine) - 1]
                         if r"//" in error_line:
-                            c_line,dsc_line = error_line.split(r"//")
+                            c_line, dsc_line = error_line.split(r"//")
                         else:
                             dsc_line = error_line
                         message_itmes = Message.split(":")
@@ -2150,7 +2150,7 @@ class DscBuildData(PlatformBuildClassObject):
             else:
                 EdkLogger.error('Build', COMMAND_FAILURE, 'Can not execute command: %s' % MakeCommand)
 
-        if DscBuildData.NeedUpdateOutput(OutputValueFile, PcdValueInitExe ,InputValueFile):
+        if DscBuildData.NeedUpdateOutput(OutputValueFile, PcdValueInitExe, InputValueFile):
             Command = PcdValueInitExe + ' -i %s -o %s' % (InputValueFile, OutputValueFile)
             returncode, StdOut, StdErr = DscBuildData.ExecuteCommand (Command)
             if returncode != 0:
@@ -2164,7 +2164,7 @@ class DscBuildData(PlatformBuildClassObject):
         for Pcd in FileBuffer:
             PcdValue = Pcd.split ('|')
             PcdInfo = PcdValue[0].split ('.')
-            StructurePcdSet.append((PcdInfo[0],PcdInfo[1], PcdInfo[2], PcdInfo[3], PcdValue[2].strip()))
+            StructurePcdSet.append((PcdInfo[0], PcdInfo[1], PcdInfo[2], PcdInfo[3], PcdValue[2].strip()))
         return StructurePcdSet
 
     @staticmethod
@@ -2198,7 +2198,7 @@ class DscBuildData(PlatformBuildClassObject):
         AvailableSkuIdSet = copy.copy(self.SkuIds)
 
 
-        for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dummy4,Dummy5 in RecordList:
+        for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dummy4, Dummy5 in RecordList:
             SkuName = SkuName.upper()
             SkuName = TAB_DEFAULT if SkuName == TAB_COMMON else SkuName
             if SkuName not in AvailableSkuIdSet:
@@ -2260,7 +2260,7 @@ class DscBuildData(PlatformBuildClassObject):
             elif TAB_DEFAULT in pcd.SkuInfoList and TAB_COMMON in pcd.SkuInfoList:
                 del pcd.SkuInfoList[TAB_COMMON]
 
-        map(self.FilterSkuSettings,Pcds.values())
+        map(self.FilterSkuSettings, Pcds.values())
 
         return Pcds
 
@@ -2291,7 +2291,7 @@ class DscBuildData(PlatformBuildClassObject):
         else:
             return False
 
-    def CopyDscRawValue(self,Pcd):
+    def CopyDscRawValue(self, Pcd):
         if Pcd.DscRawValue is None:
             Pcd.DscRawValue = dict()
         if Pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD], self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
@@ -2305,10 +2305,10 @@ class DscBuildData(PlatformBuildClassObject):
                     Pcd.DscRawValue[skuname][defaultstore] = Pcd.SkuInfoList[skuname].DefaultStoreDict[defaultstore]
             else:
                 Pcd.DscRawValue[skuname][TAB_DEFAULT_STORES_DEFAULT] = Pcd.SkuInfoList[skuname].DefaultValue
-    def CompletePcdValues(self,PcdSet):
+    def CompletePcdValues(self, PcdSet):
         Pcds = {}
         DefaultStoreObj = DefaultStore(self._GetDefaultStores())
-        SkuIds = {skuname:skuid for skuname,skuid in self.SkuIdMgr.AvailableSkuIdSet.items() if skuname != TAB_COMMON}
+        SkuIds = {skuname:skuid for skuname, skuid in self.SkuIdMgr.AvailableSkuIdSet.items() if skuname != TAB_COMMON}
         DefaultStores = set(storename for pcdobj in PcdSet.values() for skuobj in pcdobj.SkuInfoList.values() for storename in skuobj.DefaultStoreDict)
         for PcdCName, TokenSpaceGuid in PcdSet:
             PcdObj = PcdSet[(PcdCName, TokenSpaceGuid)]
@@ -2330,7 +2330,7 @@ class DscBuildData(PlatformBuildClassObject):
                         if defaultstorename not in skuobj.DefaultStoreDict:
                             skuobj.DefaultStoreDict[defaultstorename] = copy.deepcopy(skuobj.DefaultStoreDict[mindefaultstorename])
                     skuobj.HiiDefaultValue = skuobj.DefaultStoreDict[mindefaultstorename]
-            for skuname,skuid in SkuIds.items():
+            for skuname, skuid in SkuIds.items():
                 if skuname not in PcdObj.SkuInfoList:
                     nextskuid = self.SkuIdMgr.GetNextSkuId(skuname)
                     while nextskuid not in PcdObj.SkuInfoList:
@@ -2364,7 +2364,7 @@ class DscBuildData(PlatformBuildClassObject):
         AvailableSkuIdSet = copy.copy(self.SkuIds)
         DefaultStoresDefine = self._GetDefaultStores()
 
-        for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, DefaultStore, Dummy4,Dummy5 in RecordList:
+        for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, DefaultStore, Dummy4, Dummy5 in RecordList:
             SkuName = SkuName.upper()
             SkuName = TAB_DEFAULT if SkuName == TAB_COMMON else SkuName
             DefaultStore = DefaultStore.upper()
@@ -2377,14 +2377,14 @@ class DscBuildData(PlatformBuildClassObject):
                 EdkLogger.error('build', PARAMETER_INVALID, 'DefaultStores %s is not defined in [DefaultStores] section' % DefaultStore,
                                             File=self.MetaFile, Line=Dummy5)
             if "." not in TokenSpaceGuid:
-                PcdSet.add((PcdCName, TokenSpaceGuid, SkuName,DefaultStore, Dummy5))
-            PcdDict[Arch, SkuName, PcdCName, TokenSpaceGuid,DefaultStore] = Setting
+                PcdSet.add((PcdCName, TokenSpaceGuid, SkuName, DefaultStore, Dummy5))
+            PcdDict[Arch, SkuName, PcdCName, TokenSpaceGuid, DefaultStore] = Setting
 
 
         # Remove redundant PCD candidates, per the ARCH and SKU
-        for PcdCName, TokenSpaceGuid, SkuName,DefaultStore, Dummy4 in PcdSet:
+        for PcdCName, TokenSpaceGuid, SkuName, DefaultStore, Dummy4 in PcdSet:
 
-            Setting = PcdDict[self._Arch, SkuName, PcdCName, TokenSpaceGuid,DefaultStore]
+            Setting = PcdDict[self._Arch, SkuName, PcdCName, TokenSpaceGuid, DefaultStore]
             if Setting is None:
                 continue
             VariableName, VariableGuid, VariableOffset, DefaultValue, VarAttribute = self._ValidatePcd(PcdCName, TokenSpaceGuid, Setting, Type, Dummy4)
@@ -2428,10 +2428,10 @@ class DscBuildData(PlatformBuildClassObject):
                     Skuitem = pcdObject.SkuInfoList[SkuName]
                     Skuitem.DefaultStoreDict.update({DefaultStore:DefaultValue})
                 else:
-                    SkuInfo = SkuInfoClass(SkuName, self.SkuIds[SkuName][0], VariableName, VariableGuid, VariableOffset, DefaultValue, VariableAttribute=VarAttribute,DefaultStore={DefaultStore:DefaultValue})
+                    SkuInfo = SkuInfoClass(SkuName, self.SkuIds[SkuName][0], VariableName, VariableGuid, VariableOffset, DefaultValue, VariableAttribute=VarAttribute, DefaultStore={DefaultStore:DefaultValue})
                     pcdObject.SkuInfoList[SkuName] = SkuInfo
             else:
-                SkuInfo = SkuInfoClass(SkuName, self.SkuIds[SkuName][0], VariableName, VariableGuid, VariableOffset, DefaultValue, VariableAttribute=VarAttribute,DefaultStore={DefaultStore:DefaultValue})
+                SkuInfo = SkuInfoClass(SkuName, self.SkuIds[SkuName][0], VariableName, VariableGuid, VariableOffset, DefaultValue, VariableAttribute=VarAttribute, DefaultStore={DefaultStore:DefaultValue})
                 Pcds[PcdCName, TokenSpaceGuid] = PcdClassObject(
                                                 PcdCName,
                                                 TokenSpaceGuid,
@@ -2462,7 +2462,7 @@ class DscBuildData(PlatformBuildClassObject):
                     pcd.DefaultValue = pcdDecObject.DefaultValue
             if TAB_DEFAULT not in pcd.SkuInfoList and TAB_COMMON not in pcd.SkuInfoList:
                 valuefromDec = pcdDecObject.DefaultValue
-                SkuInfo = SkuInfoClass(TAB_DEFAULT, '0', SkuInfoObj.VariableName, SkuInfoObj.VariableGuid, SkuInfoObj.VariableOffset, valuefromDec,VariableAttribute=SkuInfoObj.VariableAttribute,DefaultStore={DefaultStore:valuefromDec})
+                SkuInfo = SkuInfoClass(TAB_DEFAULT, '0', SkuInfoObj.VariableName, SkuInfoObj.VariableGuid, SkuInfoObj.VariableOffset, valuefromDec, VariableAttribute=SkuInfoObj.VariableAttribute, DefaultStore={DefaultStore:valuefromDec})
                 pcd.SkuInfoList[TAB_DEFAULT] = SkuInfo
             elif TAB_DEFAULT not in pcd.SkuInfoList and TAB_COMMON in pcd.SkuInfoList:
                 pcd.SkuInfoList[TAB_DEFAULT] = pcd.SkuInfoList[TAB_COMMON]
@@ -2490,7 +2490,7 @@ class DscBuildData(PlatformBuildClassObject):
             invalidpcd = ",".join(invalidhii)
             EdkLogger.error('build', PCD_VARIABLE_INFO_ERROR, Message='The same HII PCD must map to the same EFI variable for all SKUs', File=self.MetaFile, ExtraData=invalidpcd)
 
-        map(self.FilterSkuSettings,Pcds.values())
+        map(self.FilterSkuSettings, Pcds.values())
 
         return Pcds
 
@@ -2499,11 +2499,11 @@ class DscBuildData(PlatformBuildClassObject):
         invalidhii = []
         for pcdname in Pcds:
             pcd = Pcds[pcdname]
-            varnameset = set(sku.VariableName for (skuid,sku) in pcd.SkuInfoList.items())
+            varnameset = set(sku.VariableName for (skuid, sku) in pcd.SkuInfoList.items())
             if len(varnameset) > 1:
-                invalidhii.append(".".join((pcdname[1],pcdname[0])))
+                invalidhii.append(".".join((pcdname[1], pcdname[0])))
         if len(invalidhii):
-            return False,invalidhii
+            return False, invalidhii
         else:
             return True, []
     ## Retrieve dynamic VPD PCD settings
@@ -2527,7 +2527,7 @@ class DscBuildData(PlatformBuildClassObject):
         RecordList = self._RawData[Type, self._Arch]
         AvailableSkuIdSet = copy.copy(self.SkuIds)
 
-        for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dummy4,Dummy5 in RecordList:
+        for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dummy4, Dummy5 in RecordList:
             SkuName = SkuName.upper()
             SkuName = TAB_DEFAULT if SkuName == TAB_COMMON else SkuName
             if SkuName not in AvailableSkuIdSet:
@@ -2595,7 +2595,7 @@ class DscBuildData(PlatformBuildClassObject):
                 del pcd.SkuInfoList[TAB_COMMON]
 
 
-        map(self.FilterSkuSettings,Pcds.values())
+        map(self.FilterSkuSettings, Pcds.values())
         return Pcds
 
     ## Add external modules
@@ -2660,7 +2660,7 @@ class DscBuildData(PlatformBuildClassObject):
                     continue
                 ModuleData = self._Bdb[ModuleFile, self._Arch, self._Target, self._Toolchain]
                 PkgSet.update(ModuleData.Packages)
-            self._DecPcds, self._GuidDict = GetDeclaredPcd(self, self._Bdb, self._Arch, self._Target, self._Toolchain,PkgSet)
+            self._DecPcds, self._GuidDict = GetDeclaredPcd(self, self._Bdb, self._Arch, self._Target, self._Toolchain, PkgSet)
         return self._DecPcds
     _Macros             = property(_GetMacros)
     Arch                = property(_GetArch, _SetArch)
diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTools/Source/Python/Workspace/MetaFileParser.py
index d5fbf6f095bf..4ab3c137dd7a 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py
@@ -302,7 +302,7 @@ class MetaFileParser(object):
         for Item in GetSplitValueList(self._CurrentLine[1:-1], TAB_COMMA_SPLIT):
             if Item == '':
                 continue
-            ItemList = GetSplitValueList(Item, TAB_SPLIT,3)
+            ItemList = GetSplitValueList(Item, TAB_SPLIT, 3)
             # different section should not mix in one section
             if self._SectionName != '' and self._SectionName != ItemList[0].upper():
                 EdkLogger.error('Parser', FORMAT_INVALID, "Different section names in the same section",
@@ -420,7 +420,7 @@ class MetaFileParser(object):
 
     ## Construct section Macro dict 
     def _ConstructSectionMacroDict(self, Name, Value):
-        ScopeKey = [(Scope[0], Scope[1],Scope[2]) for Scope in self._Scope]
+        ScopeKey = [(Scope[0], Scope[1], Scope[2]) for Scope in self._Scope]
         ScopeKey = tuple(ScopeKey)
         #
         # DecParser SectionType is a list, will contain more than one item only in Pcd Section
@@ -451,15 +451,15 @@ class MetaFileParser(object):
                 continue
 
             for ActiveScope in self._Scope:
-                Scope0, Scope1 ,Scope2= ActiveScope[0], ActiveScope[1],ActiveScope[2]
-                if(Scope0, Scope1,Scope2) not in Scope:
+                Scope0, Scope1, Scope2= ActiveScope[0], ActiveScope[1], ActiveScope[2]
+                if(Scope0, Scope1, Scope2) not in Scope:
                     break
             else:
                 SpeSpeMacroDict.update(self._SectionsMacroDict[(SectionType, Scope)])
 
             for ActiveScope in self._Scope:
-                Scope0, Scope1,Scope2 = ActiveScope[0], ActiveScope[1],ActiveScope[2]
-                if(Scope0, Scope1,Scope2) not in Scope and (Scope0, TAB_COMMON, TAB_COMMON) not in Scope and (TAB_COMMON, Scope1, TAB_COMMON) not in Scope:
+                Scope0, Scope1, Scope2 = ActiveScope[0], ActiveScope[1], ActiveScope[2]
+                if(Scope0, Scope1, Scope2) not in Scope and (Scope0, TAB_COMMON, TAB_COMMON) not in Scope and (TAB_COMMON, Scope1, TAB_COMMON) not in Scope:
                     break
             else:
                 ComSpeMacroDict.update(self._SectionsMacroDict[(SectionType, Scope)])
@@ -636,7 +636,7 @@ class InfParser(MetaFileParser):
             # Model, Value1, Value2, Value3, Arch, Platform, BelongsToItem=-1,
             # LineBegin=-1, ColumnBegin=-1, LineEnd=-1, ColumnEnd=-1, Enabled=-1
             #
-            for Arch, Platform,_ in self._Scope:
+            for Arch, Platform, _ in self._Scope:
                 LastItem = self._Store(self._SectionType,
                             self._ValueList[0],
                             self._ValueList[1],
@@ -947,7 +947,7 @@ class DscParser(MetaFileParser):
                 self._DirectiveParser()
                 continue
             if Line[0] == TAB_OPTION_START and not self._InSubsection:
-                EdkLogger.error("Parser", FILE_READ_FAILURE, "Missing the '{' before %s in Line %s" % (Line, Index+1),ExtraData=self.MetaFile)
+                EdkLogger.error("Parser", FILE_READ_FAILURE, "Missing the '{' before %s in Line %s" % (Line, Index+1), ExtraData=self.MetaFile)
 
             if self._InSubsection:
                 SectionType = self._SubsectionType
@@ -1104,7 +1104,7 @@ class DscParser(MetaFileParser):
     @ParseMacro
     def _SkuIdParser(self):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
-        if len(TokenList) not in (2,3):
+        if len(TokenList) not in (2, 3):
             EdkLogger.error('Parser', FORMAT_INVALID, "Correct format is '<Number>|<UiName>[|<UiName>]'",
                             ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
         self._ValueList[0:len(TokenList)] = TokenList
@@ -1164,7 +1164,7 @@ class DscParser(MetaFileParser):
 
         # Validate the datum type of Dynamic Defaul PCD and DynamicEx Default PCD
         ValueList = GetSplitValueList(self._ValueList[2])
-        if len(ValueList) > 1 and ValueList[1] in [TAB_UINT8 , TAB_UINT16, TAB_UINT32 , TAB_UINT64] \
+        if len(ValueList) > 1 and ValueList[1] in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64] \
                               and self._ItemType in [MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT]:
             EdkLogger.error('Parser', FORMAT_INVALID, "The datum type '%s' of PCD is wrong" % ValueList[1],
                             ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
@@ -1172,7 +1172,7 @@ class DscParser(MetaFileParser):
         # Validate the VariableName of DynamicHii and DynamicExHii for PCD Entry must not be an empty string
         if self._ItemType in [MODEL_PCD_DYNAMIC_HII, MODEL_PCD_DYNAMIC_EX_HII]:
             DscPcdValueList = GetSplitValueList(TokenList[1], TAB_VALUE_SPLIT, 1)
-            if len(DscPcdValueList[0].replace('L','').replace('"','').strip()) == 0:
+            if len(DscPcdValueList[0].replace('L', '').replace('"', '').strip()) == 0:
                 EdkLogger.error('Parser', FORMAT_INVALID, "The VariableName field in the HII format PCD entry must not be an empty string",
                             ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
 
@@ -1309,7 +1309,7 @@ class DscParser(MetaFileParser):
         self._ContentIndex = 0
         self._InSubsection = False
         while self._ContentIndex < len(self._Content) :
-            Id, self._ItemType, V1, V2, V3, S1, S2, S3,Owner, self._From, \
+            Id, self._ItemType, V1, V2, V3, S1, S2, S3, Owner, self._From, \
                 LineStart, ColStart, LineEnd, ColEnd, Enabled = self._Content[self._ContentIndex]
 
             if self._From < 0:
@@ -1327,8 +1327,8 @@ class DscParser(MetaFileParser):
                     break
                 Record = self._Content[self._ContentIndex]
                 if LineStart == Record[10] and LineEnd == Record[12]:
-                    if [Record[5], Record[6],Record[7]] not in self._Scope:
-                        self._Scope.append([Record[5], Record[6],Record[7]])
+                    if [Record[5], Record[6], Record[7]] not in self._Scope:
+                        self._Scope.append([Record[5], Record[6], Record[7]])
                     self._ContentIndex += 1
                 else:
                     break
@@ -1421,7 +1421,7 @@ class DscParser(MetaFileParser):
                         MODEL_PCD_DYNAMIC_VPD, MODEL_PCD_DYNAMIC_EX_DEFAULT, MODEL_PCD_DYNAMIC_EX_HII,
                         MODEL_PCD_DYNAMIC_EX_VPD):
             Records = self._RawTable.Query(PcdType, BelongsToItem= -1.0)
-            for TokenSpaceGuid, PcdName, Value, Dummy2, Dummy3, Dummy4,ID, Line in Records:
+            for TokenSpaceGuid, PcdName, Value, Dummy2, Dummy3, Dummy4, ID, Line in Records:
                 Name = TokenSpaceGuid + '.' + PcdName
                 if Name not in GlobalData.gPlatformOtherPcds:
                     PcdLine = Line
@@ -1800,7 +1800,7 @@ class DecParser(MetaFileParser):
         if self._DefinesCount > 1:
             EdkLogger.error('Parser', FORMAT_INVALID, 'Multiple [Defines] section is exist.', self.MetaFile )
         if self._DefinesCount == 0:
-            EdkLogger.error('Parser', FORMAT_INVALID, 'No [Defines] section exist.',self.MetaFile)
+            EdkLogger.error('Parser', FORMAT_INVALID, 'No [Defines] section exist.', self.MetaFile)
         self._Done()
 
 
@@ -1944,7 +1944,7 @@ class DecParser(MetaFileParser):
                     self._CurrentStructurePcdName = ""
                 else:
                     if self._CurrentStructurePcdName != TAB_SPLIT.join(PcdNames[:2]):
-                        EdkLogger.error('Parser', FORMAT_INVALID, "Pcd Name does not match: %s and %s " % (self._CurrentStructurePcdName , TAB_SPLIT.join(PcdNames[:2])),
+                        EdkLogger.error('Parser', FORMAT_INVALID, "Pcd Name does not match: %s and %s " % (self._CurrentStructurePcdName, TAB_SPLIT.join(PcdNames[:2])),
                                 File=self.MetaFile, Line=self._LineIndex + 1)
                     self._ValueList[1] = TAB_SPLIT.join(PcdNames[2:])
                     self._ValueList[2] = PcdTockens[1]
diff --git a/BaseTools/Source/Python/Workspace/MetaFileTable.py b/BaseTools/Source/Python/Workspace/MetaFileTable.py
index d17487a4409d..69b2c40e7f1a 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileTable.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileTable.py
@@ -258,8 +258,8 @@ class PackageTable(MetaFileTable):
                 ValidType = "@ValidList"
             if oricomment.startswith("@Expression"):
                 ValidType = "@Expression"
-            EdkLogger.error('Parser', FORMAT_INVALID, "The syntax for %s of PCD %s.%s is incorrect" % (ValidType,TokenSpaceGuid, PcdCName),
-                            ExtraData=oricomment,File=self.MetaFile, Line=LineNum)
+            EdkLogger.error('Parser', FORMAT_INVALID, "The syntax for %s of PCD %s.%s is incorrect" % (ValidType, TokenSpaceGuid, PcdCName),
+                            ExtraData=oricomment, File=self.MetaFile, Line=LineNum)
             return set(), set(), set()
         return set(validateranges), set(validlists), set(expressions)
 ## Python class representation of table storing platform data
@@ -308,7 +308,7 @@ class PlatformTable(MetaFileTable):
     #
     def Insert(self, Model, Value1, Value2, Value3, Scope1=TAB_ARCH_COMMON, Scope2=TAB_COMMON, Scope3=TAB_DEFAULT_STORES_DEFAULT,BelongsToItem=-1,
                FromItem=-1, StartLine=-1, StartColumn=-1, EndLine=-1, EndColumn=-1, Enabled=1):
-        (Value1, Value2, Value3, Scope1, Scope2,Scope3) = ConvertToSqlString((Value1, Value2, Value3, Scope1, Scope2,Scope3))
+        (Value1, Value2, Value3, Scope1, Scope2, Scope3) = ConvertToSqlString((Value1, Value2, Value3, Scope1, Scope2, Scope3))
         return Table.Insert(
                         self, 
                         Model, 
diff --git a/BaseTools/Source/Python/Workspace/WorkspaceCommon.py b/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
index 713c1ddbddc9..e8f159b26204 100644
--- a/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
+++ b/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
@@ -53,7 +53,7 @@ def GetPackageList(Platform, BuildDatabase, Arch, Target, Toolchain):
 #  @retval: A dictionary contains instances of PcdClassObject with key (PcdCName, TokenSpaceGuid)
 #  @retval: A dictionary contains real GUIDs of TokenSpaceGuid
 #
-def GetDeclaredPcd(Platform, BuildDatabase, Arch, Target, Toolchain,additionalPkgs):
+def GetDeclaredPcd(Platform, BuildDatabase, Arch, Target, Toolchain, additionalPkgs):
     PkgList = GetPackageList(Platform, BuildDatabase, Arch, Target, Toolchain)
     PkgList = set(PkgList)
     PkgList |= additionalPkgs
diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
index 55222c886d2d..454ea7d088b4 100644
--- a/BaseTools/Source/Python/build/BuildReport.py
+++ b/BaseTools/Source/Python/build/BuildReport.py
@@ -1190,7 +1190,7 @@ class PcdReport(object):
                     FileWrite(File, Array)
             else:
                 if Pcd.DatumType in TAB_PCD_CLEAN_NUMERIC_TYPES:
-                    if Value.startswith(('0x','0X')):
+                    if Value.startswith(('0x', '0X')):
                         Value = '{} ({:d})'.format(Value, int(Value, 0))
                     else:
                         Value = "0x{:X} ({})".format(int(Value, 0), Value)
@@ -1300,9 +1300,9 @@ class PcdReport(object):
                     else:
                         if IsByteArray:
                             if self.SkuSingle:
-                                FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, ' ' , TypeName, '(' + Pcd.DatumType + ')', "{"))
+                                FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', "{"))
                             else:
-                                FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, ' ' , TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', "{"))
+                                FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', "{"))
                             for Array in ArrayList:
                                 FileWrite(File, Array)
                         else:
@@ -1312,9 +1312,9 @@ class PcdReport(object):
                                 else:
                                     Value = "0x{:X} ({})".format(int(Value, 0), Value)
                             if self.SkuSingle:
-                                FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, ' ' , TypeName, '(' + Pcd.DatumType + ')', Value))
+                                FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', Value))
                             else:
-                                FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, ' ' , TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', Value))
+                                FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', Value))
                     if TypeName in ('DYNVPD', 'DEXVPD'):
                         FileWrite(File, '%*s' % (self.MaxLen + 4, SkuInfo.VpdOffset))
                     if IsStructure:
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 344b006bc424..49869d9ee4e6 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -54,7 +54,7 @@ import Common.EdkLogger
 import Common.GlobalData as GlobalData
 from GenFds.GenFds import GenFds
 
-from collections import OrderedDict,defaultdict
+from collections import OrderedDict, defaultdict
 
 # Version and Copyright
 VersionNumber = "0.60" + ' ' + gBUILD_VERSION
@@ -526,7 +526,7 @@ class BuildTask:
                     BuildTask._Thread.acquire(True)
 
                     # start a new build thread
-                    Bo,Bt = BuildTask._ReadyQueue.popitem()
+                    Bo, Bt = BuildTask._ReadyQueue.popitem()
 
                     # move into running queue
                     BuildTask._RunningQueueLock.acquire()
@@ -840,7 +840,7 @@ class Build():
         self.HashSkipModules = []
         self.Db_Flag = False
         self.LaunchPrebuildFlag = False
-        self.PlatformBuildPath = os.path.join(GlobalData.gConfDirectory,'.cache', '.PlatformBuild')
+        self.PlatformBuildPath = os.path.join(GlobalData.gConfDirectory, '.cache', '.PlatformBuild')
         if BuildOptions.CommandLength:
             GlobalData.gCommandMaxLength = BuildOptions.CommandLength
 
@@ -1133,7 +1133,7 @@ class Build():
             # and preserve them for the rest of the main build step, because the child process environment will
             # evaporate as soon as it exits, we cannot get it in build step.
             #
-            PrebuildEnvFile = os.path.join(GlobalData.gConfDirectory,'.cache','.PrebuildEnv')
+            PrebuildEnvFile = os.path.join(GlobalData.gConfDirectory, '.cache', '.PrebuildEnv')
             if os.path.isfile(PrebuildEnvFile):
                 os.remove(PrebuildEnvFile)
             if os.path.isfile(self.PlatformBuildPath):
@@ -1173,7 +1173,7 @@ class Build():
                 f = open(PrebuildEnvFile)
                 envs = f.readlines()
                 f.close()
-                envs = itertools.imap(lambda l: l.split('=',1), envs)
+                envs = itertools.imap(lambda l: l.split('=', 1), envs)
                 envs = itertools.ifilter(lambda l: len(l) == 2, envs)
                 envs = itertools.imap(lambda l: [i.strip() for i in l], envs)
                 os.environ.update(dict(envs))
@@ -2358,7 +2358,7 @@ def MyOptionParser():
     Parser.add_option("-D", "--define", action="append", type="string", dest="Macros", help="Macro: \"Name [= Value]\".")
 
     Parser.add_option("-y", "--report-file", action="store", dest="ReportFile", help="Create/overwrite the report to the specified filename.")
-    Parser.add_option("-Y", "--report-type", action="append", type="choice", choices=['PCD','LIBRARY','FLASH','DEPEX','BUILD_FLAGS','FIXED_ADDRESS','HASH','EXECUTION_ORDER'], dest="ReportType", default=[],
+    Parser.add_option("-Y", "--report-type", action="append", type="choice", choices=['PCD', 'LIBRARY', 'FLASH', 'DEPEX', 'BUILD_FLAGS', 'FIXED_ADDRESS', 'HASH', 'EXECUTION_ORDER'], dest="ReportType", default=[],
         help="Flags that control the type of build report to generate.  Must be one of: [PCD, LIBRARY, FLASH, DEPEX, BUILD_FLAGS, FIXED_ADDRESS, HASH, EXECUTION_ORDER].  "\
              "To specify more than one flag, repeat this option on the command line and the default flag set is [PCD, LIBRARY, FLASH, DEPEX, HASH, BUILD_FLAGS, FIXED_ADDRESS]")
     Parser.add_option("-F", "--flag", action="store", type="string", dest="Flag",
diff --git a/BaseTools/Tests/TestTools.py b/BaseTools/Tests/TestTools.py
index 20a4ea28aa11..d3a42ff42652 100644
--- a/BaseTools/Tests/TestTools.py
+++ b/BaseTools/Tests/TestTools.py
@@ -160,7 +160,7 @@ class BaseToolsTest(unittest.TestCase):
         if minlen is None: minlen = 1024
         if maxlen is None: maxlen = minlen
         return ''.join(
-            [chr(random.randint(0,255))
+            [chr(random.randint(0, 255))
              for x in xrange(random.randint(minlen, maxlen))
             ])
 
diff --git a/BaseTools/gcc/mingw-gcc-build.py b/BaseTools/gcc/mingw-gcc-build.py
index 7d7db33be4e4..6de6ff43138e 100755
--- a/BaseTools/gcc/mingw-gcc-build.py
+++ b/BaseTools/gcc/mingw-gcc-build.py
@@ -186,7 +186,7 @@ class Config:
         return path
 
     def MakeDirs(self):
-        for path in (self.src_dir, self.build_dir,self.prefix, self.symlinks):
+        for path in (self.src_dir, self.build_dir, self.prefix, self.symlinks):
             if not os.path.exists(path):
                 os.makedirs(path)
 
-- 
2.17.1



^ permalink raw reply related	[flat|nested] 16+ messages in thread

* [PATCH v4 11/13] BaseTools: Migrate to the new octal literal
  2018-06-25 10:31 [PATCH v4 00/20] BaseTools: One step toward python3 Gary Lin
                   ` (9 preceding siblings ...)
  2018-06-25 10:31 ` [PATCH v4 10/13] BaseTools: Adjust the spaces around commas and colons Gary Lin
@ 2018-06-25 10:31 ` Gary Lin
  2018-06-25 10:31 ` [PATCH v4 12/13] BaseTools: Fix old python2 idioms Gary Lin
                   ` (3 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Gary Lin @ 2018-06-25 10:31 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

Change the octal literals according to PEP 3127:
https://www.python.org/dev/peps/pep-3127/
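
For example, a minimal before/after sketch (illustrative only, not code
taken from the patch below):

    mode = 0777    # python2-only octal literal, rejected by python3
    mode = 0o777   # new form, accepted by python 2.6+ and python3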

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Source/Python/Common/LongFilePathOs.py | 2 +-
 BaseTools/Source/Python/UPT/Core/FileHook.py     | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/BaseTools/Source/Python/Common/LongFilePathOs.py b/BaseTools/Source/Python/Common/LongFilePathOs.py
index 2cebbb276b19..4939a8bc733c 100644
--- a/BaseTools/Source/Python/Common/LongFilePathOs.py
+++ b/BaseTools/Source/Python/Common/LongFilePathOs.py
@@ -41,7 +41,7 @@ def rmdir(path):
 def mkdir(path):
     return os.mkdir(LongFilePath(path))
 
-def makedirs(name, mode=0777):
+def makedirs(name, mode=0o777):
     return os.makedirs(LongFilePath(name), mode)
 
 def rename(old, new):
diff --git a/BaseTools/Source/Python/UPT/Core/FileHook.py b/BaseTools/Source/Python/UPT/Core/FileHook.py
index d8736a872366..67e86f4f7454 100644
--- a/BaseTools/Source/Python/UPT/Core/FileHook.py
+++ b/BaseTools/Source/Python/UPT/Core/FileHook.py
@@ -166,7 +166,7 @@ def _hookrm(path):
     else:
         __built_in_remove__(path)
 
-def _hookmkdir(path, mode=0777):
+def _hookmkdir(path, mode=0o777):
     if GlobalData.gRECOVERMGR:
         GlobalData.gRECOVERMGR.bkmkdir(path, mode)
     else:
-- 
2.17.1



^ permalink raw reply related	[flat|nested] 16+ messages in thread

* [PATCH v4 12/13] BaseTools: Fix old python2 idioms
  2018-06-25 10:31 [PATCH v4 00/20] BaseTools: One step toward python3 Gary Lin
                   ` (10 preceding siblings ...)
  2018-06-25 10:31 ` [PATCH v4 11/13] BaseTools: Migrate to the new octal literal Gary Lin
@ 2018-06-25 10:31 ` Gary Lin
  2018-06-25 10:31 ` [PATCH v4 13/13] BaseTools: Replace StringIO.StringIO with io.BytesIO Gary Lin
                   ` (2 subsequent siblings)
  14 siblings, 0 replies; 16+ messages in thread
From: Gary Lin @ 2018-06-25 10:31 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

Based on "futurize -f lib2to3.fixes.fix_idioms"

* Change some type comparisons to isinstance() calls:
    type(x) == T -> isinstance(x, T)
    type(x) is T -> isinstance(x, T)
    type(x) != T -> not isinstance(x, T)
    type(x) is not T -> not isinstance(x, T)

* Change "while 1:" into "while True:".

* Change both

    v = list(EXPR)
    v.sort()
    foo(v)

and the more general

    v = EXPR
    v.sort()
    foo(v)

into

    v = sorted(EXPR)
    foo(v)
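
As a concrete illustration (a hypothetical snippet, not taken from the
patches below), the first and third rewrites look like:

    # 'value' and 'process' are hypothetical names, for illustration only
    # before
    if type(value) is dict:
        keys = list(value)
        keys.sort()
        process(keys)
    # after
    if isinstance(value, dict):
        keys = sorted(value)
        process(keys)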

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Scripts/MemoryProfileSymbolGen.py                     |  2 +-
 BaseTools/Scripts/UpdateBuildVersions.py                        |  6 ++--
 BaseTools/Source/Python/AutoGen/BuildEngine.py                  |  2 +-
 BaseTools/Source/Python/AutoGen/GenDepex.py                     |  2 +-
 BaseTools/Source/Python/Common/Expression.py                    | 34 ++++++++++----------
 BaseTools/Source/Python/Common/Misc.py                          | 13 ++++----
 BaseTools/Source/Python/Common/RangeExpression.py               | 16 ++++-----
 BaseTools/Source/Python/Common/StringUtils.py                   |  4 +--
 BaseTools/Source/Python/Common/ToolDefClassObject.py            |  2 +-
 BaseTools/Source/Python/Common/VpdInfoFile.py                   |  3 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py | 12 +++----
 BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py                  |  2 +-
 BaseTools/Source/Python/Eot/Parser.py                           |  2 +-
 BaseTools/Source/Python/GenFds/FdfParser.py                     |  2 +-
 BaseTools/Source/Python/GenFds/GenFds.py                        |  2 +-
 BaseTools/Source/Python/TargetTool/TargetTool.py                |  2 +-
 BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py           | 15 +++------
 BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py           | 21 ++++--------
 BaseTools/Source/Python/UPT/Library/Misc.py                     |  6 ++--
 BaseTools/Source/Python/UPT/Library/ParserValidate.py           |  2 +-
 BaseTools/Source/Python/UPT/Library/StringUtils.py              |  2 +-
 BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py          |  2 +-
 BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py       |  3 +-
 BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py       |  3 +-
 BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py   |  3 +-
 BaseTools/Source/Python/Workspace/BuildClassObject.py           |  2 +-
 BaseTools/Source/Python/Workspace/DscBuildData.py               |  8 ++---
 BaseTools/Source/Python/Workspace/MetaFileParser.py             | 20 ++++++------
 BaseTools/Source/Python/build/BuildReport.py                    |  3 +-
 BaseTools/Source/Python/build/build.py                          |  4 +--
 BaseTools/gcc/mingw-gcc-build.py                                |  4 +--
 31 files changed, 93 insertions(+), 111 deletions(-)

diff --git a/BaseTools/Scripts/MemoryProfileSymbolGen.py b/BaseTools/Scripts/MemoryProfileSymbolGen.py
index 1dbb116bba0d..eaae0283c51d 100644
--- a/BaseTools/Scripts/MemoryProfileSymbolGen.py
+++ b/BaseTools/Scripts/MemoryProfileSymbolGen.py
@@ -264,7 +264,7 @@ def main():
         return 1
 
     try:
-        while 1:
+        while True:
             line = file.readline()
             if not line:
                 break
diff --git a/BaseTools/Scripts/UpdateBuildVersions.py b/BaseTools/Scripts/UpdateBuildVersions.py
index fb61b89bfb4c..269435bfa4cb 100755
--- a/BaseTools/Scripts/UpdateBuildVersions.py
+++ b/BaseTools/Scripts/UpdateBuildVersions.py
@@ -253,7 +253,7 @@ def GetSvnRevision(opts):
     StatusCmd = "svn st -v --depth infinity --non-interactive"
     contents = ShellCommandResults(StatusCmd, opts)
     os.chdir(Cwd)
-    if type(contents) is ListType:
+    if isinstance(contents, ListType):
         for line in contents:
             if line.startswith("M "):
                 Modified = True
@@ -263,7 +263,7 @@ def GetSvnRevision(opts):
     InfoCmd = "svn info %s" % SrcPath.replace("\\", "/").strip()
     Revision = 0
     contents = ShellCommandResults(InfoCmd, opts)
-    if type(contents) is IntType:
+    if isinstance(contents, IntType):
         return 0, Modified
     for line in contents:
         line = line.strip()
@@ -284,7 +284,7 @@ def CheckSvn(opts):
     VerCmd = "svn --version"
     contents = ShellCommandResults(VerCmd, opts)
     opts.silent = OriginalSilent
-    if type(contents) is IntType:
+    if isinstance(contents, IntType):
         if opts.verbose:
             sys.stdout.write("SVN does not appear to be available.\n")
             sys.stdout.flush()
diff --git a/BaseTools/Source/Python/AutoGen/BuildEngine.py b/BaseTools/Source/Python/AutoGen/BuildEngine.py
index cab4c993dc44..e205589c6bd1 100644
--- a/BaseTools/Source/Python/AutoGen/BuildEngine.py
+++ b/BaseTools/Source/Python/AutoGen/BuildEngine.py
@@ -68,7 +68,7 @@ class TargetDescBlock(object):
         return hash(self.Target.Path)
 
     def __eq__(self, Other):
-        if type(Other) == type(self):
+        if isinstance(Other, type(self)):
             return Other.Target.Path == self.Target.Path
         else:
             return str(Other) == self.Target.Path
diff --git a/BaseTools/Source/Python/AutoGen/GenDepex.py b/BaseTools/Source/Python/AutoGen/GenDepex.py
index b69788c37e08..e89191a72b9f 100644
--- a/BaseTools/Source/Python/AutoGen/GenDepex.py
+++ b/BaseTools/Source/Python/AutoGen/GenDepex.py
@@ -140,7 +140,7 @@ class DependencyExpression:
     def __init__(self, Expression, ModuleType, Optimize=False):
         self.ModuleType = ModuleType
         self.Phase = gType2Phase[ModuleType]
-        if type(Expression) == type([]):
+        if isinstance(Expression, type([])):
             self.ExpressionString = " ".join(Expression)
             self.TokenList = Expression
         else:
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index 9ff4f104256e..e1a2c155b7f3 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -245,12 +245,12 @@ class ValueExpression(BaseExpression):
         WrnExp = None
 
         if Operator not in {"==", "!=", ">=", "<=", ">", "<", "in", "not in"} and \
-            (type(Oprand1) == type('') or type(Oprand2) == type('')):
+            (isinstance(Oprand1, type('')) or isinstance(Oprand2, type(''))):
             raise BadExpression(ERR_STRING_EXPR % Operator)
         if Operator in {'in', 'not in'}:
-            if type(Oprand1) != type(''):
+            if not isinstance(Oprand1, type('')):
                 Oprand1 = IntToStr(Oprand1)
-            if type(Oprand2) != type(''):
+            if not isinstance(Oprand2, type('')):
                 Oprand2 = IntToStr(Oprand2)
         TypeDict = {
             type(0)  : 0,
@@ -261,18 +261,18 @@ class ValueExpression(BaseExpression):
 
         EvalStr = ''
         if Operator in {"!", "NOT", "not"}:
-            if type(Oprand1) == type(''):
+            if isinstance(Oprand1, type('')):
                 raise BadExpression(ERR_STRING_EXPR % Operator)
             EvalStr = 'not Oprand1'
         elif Operator in {"~"}:
-            if type(Oprand1) == type(''):
+            if isinstance(Oprand1, type('')):
                 raise BadExpression(ERR_STRING_EXPR % Operator)
             EvalStr = '~ Oprand1'
         else:
             if Operator in {"+", "-"} and (type(True) in {type(Oprand1), type(Oprand2)}):
                 # Boolean in '+'/'-' will be evaluated but raise warning
                 WrnExp = WrnExpression(WRN_BOOL_EXPR)
-            elif type('') in {type(Oprand1), type(Oprand2)} and type(Oprand1)!= type(Oprand2):
+            elif type('') in {type(Oprand1), type(Oprand2)} and not isinstance(Oprand1, type(Oprand2)):
                 # == between string and number/boolean will always return False, != return True
                 if Operator == "==":
                     WrnExp = WrnExpression(WRN_EQCMP_STR_OTHERS)
@@ -293,11 +293,11 @@ class ValueExpression(BaseExpression):
                     pass
                 else:
                     raise BadExpression(ERR_EXPR_TYPE)
-            if type(Oprand1) == type('') and type(Oprand2) == type(''):
+            if isinstance(Oprand1, type('')) and isinstance(Oprand2, type('')):
                 if (Oprand1.startswith('L"') and not Oprand2.startswith('L"')) or \
                     (not Oprand1.startswith('L"') and Oprand2.startswith('L"')):
                     raise BadExpression(ERR_STRING_CMP % (Oprand1, Operator, Oprand2))
-            if 'in' in Operator and type(Oprand2) == type(''):
+            if 'in' in Operator and isinstance(Oprand2, type('')):
                 Oprand2 = Oprand2.split()
             EvalStr = 'Oprand1 ' + Operator + ' Oprand2'
 
@@ -325,7 +325,7 @@ class ValueExpression(BaseExpression):
     def __init__(self, Expression, SymbolTable={}):
         super(ValueExpression, self).__init__(self, Expression, SymbolTable)
         self._NoProcess = False
-        if type(Expression) != type(''):
+        if not isinstance(Expression, type('')):
             self._Expr = Expression
             self._NoProcess = True
             return
@@ -373,7 +373,7 @@ class ValueExpression(BaseExpression):
                 Token = self._GetToken()
             except BadExpression:
                 pass
-            if type(Token) == type('') and Token.startswith('{') and Token.endswith('}') and self._Idx >= self._Len:
+            if isinstance(Token, type('')) and Token.startswith('{') and Token.endswith('}') and self._Idx >= self._Len:
                 return self._Expr
 
             self._Idx = 0
@@ -381,7 +381,7 @@ class ValueExpression(BaseExpression):
 
         Val = self._ConExpr()
         RealVal = Val
-        if type(Val) == type(''):
+        if isinstance(Val, type('')):
             if Val == 'L""':
                 Val = False
             elif not Val:
@@ -640,7 +640,7 @@ class ValueExpression(BaseExpression):
                 Ex.Pcd = self._Token
                 raise Ex
             self._Token = ValueExpression(self._Symb[self._Token], self._Symb)(True, self._Depth+1)
-            if type(self._Token) != type(''):
+            if not isinstance(self._Token, type('')):
                 self._LiteralToken = hex(self._Token)
                 return
 
@@ -735,7 +735,7 @@ class ValueExpression(BaseExpression):
                 if Ch == ')':
                     TmpValue = self._Expr[Idx :self._Idx - 1]
                     TmpValue = ValueExpression(TmpValue)(True)
-                    TmpValue = '0x%x' % int(TmpValue) if type(TmpValue) != type('') else TmpValue
+                    TmpValue = '0x%x' % int(TmpValue) if not isinstance(TmpValue, type('')) else TmpValue
                     break
             self._Token, Size = ParseFieldValue(Prefix + '(' + TmpValue + ')')
             return  self._Token
@@ -824,7 +824,7 @@ class ValueExpressionEx(ValueExpression):
                 PcdValue = PcdValue.strip()
                 if PcdValue.startswith('{') and PcdValue.endswith('}'):
                     PcdValue = SplitPcdValueString(PcdValue[1:-1])
-                if type(PcdValue) == type([]):
+                if isinstance(PcdValue, type([])):
                     TmpValue = 0
                     Size = 0
                     ValueType = ''
@@ -863,7 +863,7 @@ class ValueExpressionEx(ValueExpression):
                         else:
                             ItemValue = ParseFieldValue(Item)[0]
 
-                        if type(ItemValue) == type(''):
+                        if isinstance(ItemValue, type('')):
                             ItemValue = int(ItemValue, 0)
 
                         TmpValue = (ItemValue << (Size * 8)) | TmpValue
@@ -873,7 +873,7 @@ class ValueExpressionEx(ValueExpression):
                         TmpValue, Size = ParseFieldValue(PcdValue)
                     except BadExpression as Value:
                         raise BadExpression("Type: %s, Value: %s, %s" % (self.PcdType, PcdValue, Value))
-                if type(TmpValue) == type(''):
+                if isinstance(TmpValue, type('')):
                     try:
                         TmpValue = int(TmpValue)
                     except:
@@ -996,7 +996,7 @@ class ValueExpressionEx(ValueExpression):
                                     TmpValue = ValueExpressionEx(Item, ValueType, self._Symb)(True)
                                 else:
                                     TmpValue = ValueExpressionEx(Item, self.PcdType, self._Symb)(True)
-                                Item = '0x%x' % TmpValue if type(TmpValue) != type('') else TmpValue
+                                Item = '0x%x' % TmpValue if not isinstance(TmpValue, type('')) else TmpValue
                                 if ItemSize == 0:
                                     ItemValue, ItemSize = ParseFieldValue(Item)
                                     if Item[0] not in {'"', 'L', '{'} and ItemSize > 1:
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index fd53b6b046c4..55e3c6f2281b 100644
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -1291,9 +1291,9 @@ def ParseDevPathValue (Value):
     return '{' + out + '}', Size
 
 def ParseFieldValue (Value):
-    if type(Value) == type(0):
+    if isinstance(Value, type(0)):
         return Value, (Value.bit_length() + 7) / 8
-    if type(Value) != type(''):
+    if not isinstance(Value, type('')):
         raise BadExpression('Type %s is %s' %(Value, type(Value)))
     Value = Value.strip()
     if Value.startswith(TAB_UINT8) and Value.endswith(')'):
@@ -1584,8 +1584,7 @@ def CheckPcdDatum(Type, Value):
             Printset.add(TAB_PRINTCHAR_BS)
             Printset.add(TAB_PRINTCHAR_NUL)
             if not set(Value).issubset(Printset):
-                PrintList = list(Printset)
-                PrintList.sort()
+                PrintList = sorted(Printset)
                 return False, "Invalid PCD string value of type [%s]; must be printable chars %s." % (Type, PrintList)
     elif Type == 'BOOLEAN':
         if Value not in ['TRUE', 'True', 'true', '0x1', '0x01', '1', 'FALSE', 'False', 'false', '0x0', '0x00', '0']:
@@ -1747,7 +1746,7 @@ class PathClass(object):
     # @retval True  The two PathClass are the same
     #
     def __eq__(self, Other):
-        if type(Other) == type(self):
+        if isinstance(Other, type(self)):
             return self.Path == Other.Path
         else:
             return self.Path == str(Other)
@@ -1760,7 +1759,7 @@ class PathClass(object):
     # @retval -1    The first PathClass is less than the second PathClass
     # @retval 1     The first PathClass is Bigger than the second PathClass
     def __cmp__(self, Other):
-        if type(Other) == type(self):
+        if isinstance(Other, type(self)):
             OtherKey = Other.Path
         else:
             OtherKey = str(Other)
@@ -2010,7 +2009,7 @@ class SkuClass():
             return ["DEFAULT"]
         skulist = [sku]
         nextsku = sku
-        while 1:
+        while True:
             nextsku = self.GetNextSkuId(nextsku)
             skulist.append(nextsku)
             if nextsku == "DEFAULT":
diff --git a/BaseTools/Source/Python/Common/RangeExpression.py b/BaseTools/Source/Python/Common/RangeExpression.py
index 1cf975ba7bef..014c75b8cebd 100644
--- a/BaseTools/Source/Python/Common/RangeExpression.py
+++ b/BaseTools/Source/Python/Common/RangeExpression.py
@@ -97,7 +97,7 @@ class XOROperatorObject(object):
     def __init__(self):     
         pass
     def Calculate(self, Operand, DataType, SymbolTable): 
-        if type(Operand) == type('') and not Operand.isalnum():
+        if isinstance(Operand, type('')) and not Operand.isalnum():
             Expr = "XOR ..."
             raise BadExpression(ERR_SNYTAX % Expr)
         rangeId = str(uuid.uuid1())
@@ -111,7 +111,7 @@ class LEOperatorObject(object):
     def __init__(self):     
         pass
     def Calculate(self, Operand, DataType, SymbolTable): 
-        if type(Operand) == type('') and not Operand.isalnum():
+        if isinstance(Operand, type('')) and not Operand.isalnum():
             Expr = "LE ..."
             raise BadExpression(ERR_SNYTAX % Expr)
         rangeId1 = str(uuid.uuid1())
@@ -123,7 +123,7 @@ class LTOperatorObject(object):
     def __init__(self):     
         pass
     def Calculate(self, Operand, DataType, SymbolTable):
-        if type(Operand) == type('') and not Operand.isalnum():
+        if isinstance(Operand, type('')) and not Operand.isalnum():
             Expr = "LT ..." 
             raise BadExpression(ERR_SNYTAX % Expr) 
         rangeId1 = str(uuid.uuid1())
@@ -136,7 +136,7 @@ class GEOperatorObject(object):
     def __init__(self):     
         pass
     def Calculate(self, Operand, DataType, SymbolTable): 
-        if type(Operand) == type('') and not Operand.isalnum():
+        if isinstance(Operand, type('')) and not Operand.isalnum():
             Expr = "GE ..."
             raise BadExpression(ERR_SNYTAX % Expr)
         rangeId1 = str(uuid.uuid1())
@@ -149,7 +149,7 @@ class GTOperatorObject(object):
     def __init__(self):     
         pass
     def Calculate(self, Operand, DataType, SymbolTable): 
-        if type(Operand) == type('') and not Operand.isalnum():
+        if isinstance(Operand, type('')) and not Operand.isalnum():
             Expr = "GT ..."
             raise BadExpression(ERR_SNYTAX % Expr)
         rangeId1 = str(uuid.uuid1())
@@ -162,7 +162,7 @@ class EQOperatorObject(object):
     def __init__(self):     
         pass
     def Calculate(self, Operand, DataType, SymbolTable): 
-        if type(Operand) == type('') and not Operand.isalnum():
+        if isinstance(Operand, type('')) and not Operand.isalnum():
             Expr = "EQ ..."
             raise BadExpression(ERR_SNYTAX % Expr)
         rangeId1 = str(uuid.uuid1())
@@ -350,7 +350,7 @@ class RangeExpression(BaseExpression):
     def __init__(self, Expression, PcdDataType, SymbolTable = {}):
         super(RangeExpression, self).__init__(self, Expression, PcdDataType, SymbolTable)
         self._NoProcess = False
-        if type(Expression) != type(''):
+        if not isinstance(Expression, type('')):
             self._Expr = Expression
             self._NoProcess = True
             return
@@ -571,7 +571,7 @@ class RangeExpression(BaseExpression):
                 Ex.Pcd = self._Token
                 raise Ex
             self._Token = RangeExpression(self._Symb[self._Token], self._Symb)(True, self._Depth + 1)
-            if type(self._Token) != type(''):
+            if not isinstance(self._Token, type('')):
                 self._LiteralToken = hex(self._Token)
                 return
 
diff --git a/BaseTools/Source/Python/Common/StringUtils.py b/BaseTools/Source/Python/Common/StringUtils.py
index 25dd4b264c2f..3f6bae3bdc39 100644
--- a/BaseTools/Source/Python/Common/StringUtils.py
+++ b/BaseTools/Source/Python/Common/StringUtils.py
@@ -251,7 +251,7 @@ def SplitModuleType(Key):
 def ReplaceMacros(StringList, MacroDefinitions={}, SelfReplacement=False):
     NewList = []
     for String in StringList:
-        if type(String) == type(''):
+        if isinstance(String, type('')):
             NewList.append(ReplaceMacro(String, MacroDefinitions, SelfReplacement))
         else:
             NewList.append(String)
@@ -793,7 +793,7 @@ def RemoveBlockComment(Lines):
 # Get String of a List
 #
 def GetStringOfList(List, Split=' '):
-    if type(List) != type([]):
+    if not isinstance(List, type([])):
         return List
     Str = ''
     for Item in List:
diff --git a/BaseTools/Source/Python/Common/ToolDefClassObject.py b/BaseTools/Source/Python/Common/ToolDefClassObject.py
index fb95a0353cef..7cc7e22839e1 100644
--- a/BaseTools/Source/Python/Common/ToolDefClassObject.py
+++ b/BaseTools/Source/Python/Common/ToolDefClassObject.py
@@ -158,7 +158,7 @@ class ToolDefClassObject(object):
                             if ErrorCode != 0:
                                 EdkLogger.error("tools_def.txt parser", FILE_NOT_FOUND, ExtraData=IncFile)
 
-                    if type(IncFileTmp) is PathClass:
+                    if isinstance(IncFileTmp, PathClass):
                         IncFile = IncFileTmp.Path
                     else:
                         IncFile = IncFileTmp
diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index 93175d41e9f7..b98c021b5777 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -127,8 +127,7 @@ class VpdInfoFile:
                             "Invalid parameter FilePath: %s." % FilePath)        
 
         Content = FILE_COMMENT_TEMPLATE
-        Pcds = self._VpdArray.keys()
-        Pcds.sort()
+        Pcds = sorted(self._VpdArray.keys())
         for Pcd in Pcds:
             i = 0
             PcdTokenCName = Pcd.TokenCName
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
index b3b0ede7e8f3..a41223f285ff 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
@@ -68,7 +68,7 @@ def ParseMacro(Parser):
         self._ItemType = MODEL_META_DATA_DEFINE
         # DEFINE defined macros
         if Type == TAB_DSC_DEFINES_DEFINE:
-            if type(self) == DecParser:
+            if isinstance(self, DecParser):
                 if MODEL_META_DATA_HEADER in self._SectionType:
                     self._FileLocalMacros[Name] = Value
                 else:
@@ -83,7 +83,7 @@ def ParseMacro(Parser):
                 SectionLocalMacros = self._SectionsMacroDict[SectionDictKey]
                 SectionLocalMacros[Name] = Value
         # EDK_GLOBAL defined macros
-        elif type(self) != DscParser:
+        elif not isinstance(self, DscParser):
             EdkLogger.error('Parser', FORMAT_INVALID, "EDK_GLOBAL can only be used in .dsc file",
                             ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex+1)
         elif self._SectionType != MODEL_META_DATA_HEADER:
@@ -215,7 +215,7 @@ class MetaFileParser(object):
     #   DataInfo = [data_type, scope1(arch), scope2(platform/moduletype)]
     #
     def __getitem__(self, DataInfo):
-        if type(DataInfo) != type(()):
+        if not isinstance(DataInfo, type(())):
             DataInfo = (DataInfo,)
 
         # Parse the file first, if necessary
@@ -257,7 +257,7 @@ class MetaFileParser(object):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
         self._ValueList[0:len(TokenList)] = TokenList
         # Don't do macro replacement for dsc file at this point
-        if type(self) != DscParser:
+        if not isinstance(self, DscParser):
             Macros = self._Macros
             self._ValueList = [ReplaceMacro(Value, Macros) for Value in self._ValueList]
 
@@ -355,7 +355,7 @@ class MetaFileParser(object):
             if os.path.exists(UniFile):
                 self._UniObj = UniParser(UniFile, IsExtraUni=False, IsModuleUni=False)
         
-        if type(self) == InfParser and self._Version < 0x00010005:
+        if isinstance(self, InfParser) and self._Version < 0x00010005:
             # EDK module allows using defines as macros
             self._FileLocalMacros[Name] = Value
         self._Defines[Name] = Value
@@ -370,7 +370,7 @@ class MetaFileParser(object):
             self._ValueList[1] = TokenList2[1]              # keys
         else:
             self._ValueList[1] = TokenList[0]
-        if len(TokenList) == 2 and type(self) != DscParser: # value
+        if len(TokenList) == 2 and not isinstance(self, DscParser): # value
             self._ValueList[2] = ReplaceMacro(TokenList[1], self._Macros)
 
         if self._ValueList[1].count('_') != 4:
diff --git a/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py b/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
index 811106133cb4..1e45806fa657 100644
--- a/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
+++ b/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
@@ -35,7 +35,7 @@ def CreateXmlElement(Name, String, NodeList, AttributeList):
         Element.appendChild(Doc.createTextNode(String))
     
     for Item in NodeList:
-        if type(Item) == type([]):
+        if isinstance(Item, type([])):
             Key = Item[0]
             Value = Item[1]
             if Key != '' and Key is not None and Value != '' and Value is not None:
diff --git a/BaseTools/Source/Python/Eot/Parser.py b/BaseTools/Source/Python/Eot/Parser.py
index ff88e957ad0d..0b720d5b2187 100644
--- a/BaseTools/Source/Python/Eot/Parser.py
+++ b/BaseTools/Source/Python/Eot/Parser.py
@@ -730,7 +730,7 @@ def GetParameter(Parameter, Index = 1):
 #  @return: The name of parameter
 #
 def GetParameterName(Parameter):
-    if type(Parameter) == type('') and Parameter.startswith('&'):
+    if isinstance(Parameter, type('')) and Parameter.startswith('&'):
         return Parameter[1:].replace('{', '').replace('}', '').replace('\r', '').replace('\n', '').strip()
     else:
         return Parameter.strip()
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index 74785e0a93fe..b57ffc778f5e 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -905,7 +905,7 @@ class FdfParser:
         MacroDict.update(GlobalData.gCommandLineDefines)
         if GlobalData.BuildOptionPcd:
             for Item in GlobalData.BuildOptionPcd:
-                if type(Item) is tuple:
+                if isinstance(Item, tuple):
                     continue
                 PcdName, TmpValue = Item.split("=")
                 TmpValue = BuildOptionValue(TmpValue, {})
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index 1552ab4ee3a8..912e6c58f402 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -785,7 +785,7 @@ class GenFds :
                         if not Name:
                             continue
 
-                        Name = ' '.join(Name) if type(Name) == type([]) else Name
+                        Name = ' '.join(Name) if isinstance(Name, type([])) else Name
                         GuidXRefFile.write("%s %s\n" %(FileStatementGuid, Name))
 
        # Append GUIDs, Protocols, and PPIs to the Xref file
diff --git a/BaseTools/Source/Python/TargetTool/TargetTool.py b/BaseTools/Source/Python/TargetTool/TargetTool.py
index ed567b870816..26d2bb9ebfce 100644
--- a/BaseTools/Source/Python/TargetTool/TargetTool.py
+++ b/BaseTools/Source/Python/TargetTool/TargetTool.py
@@ -83,7 +83,7 @@ class TargetTool():
     def Print(self):
         errMsg  = ''
         for Key in self.TargetTxtDictionary:
-            if type(self.TargetTxtDictionary[Key]) == type([]):
+            if isinstance(self.TargetTxtDictionary[Key], type([])):
                 print("%-30s = %s" % (Key, ''.join(elem + ' ' for elem in self.TargetTxtDictionary[Key])))
             elif self.TargetTxtDictionary[Key] is None:
                 errMsg += "  Missing %s configuration information, please use TargetTool to set value!" % Key + os.linesep 
diff --git a/BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py b/BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py
index 9397359367e7..a1a9d38087ee 100644
--- a/BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py
+++ b/BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py
@@ -123,8 +123,7 @@ def GenPcd(Package, Content):
         if Pcd.GetSupModuleList():
             Statement += GenDecTailComment(Pcd.GetSupModuleList())
 
-        ArchList = Pcd.GetSupArchList()
-        ArchList.sort()
+        ArchList = sorted(Pcd.GetSupArchList())
         SortedArch = ' '.join(ArchList)
         if SortedArch in NewSectionDict:
             NewSectionDict[SortedArch] = \
@@ -205,8 +204,7 @@ def GenGuidProtocolPpi(Package, Content):
         #
         if Guid.GetSupModuleList():
             Statement += GenDecTailComment(Guid.GetSupModuleList())     
-        ArchList = Guid.GetSupArchList()
-        ArchList.sort()
+        ArchList = sorted(Guid.GetSupArchList())
         SortedArch = ' '.join(ArchList)
         if SortedArch in NewSectionDict:
             NewSectionDict[SortedArch] = \
@@ -246,8 +244,7 @@ def GenGuidProtocolPpi(Package, Content):
         #
         if Protocol.GetSupModuleList():
             Statement += GenDecTailComment(Protocol.GetSupModuleList())
-        ArchList = Protocol.GetSupArchList()
-        ArchList.sort()
+        ArchList = sorted(Protocol.GetSupArchList())
         SortedArch = ' '.join(ArchList)
         if SortedArch in NewSectionDict:
             NewSectionDict[SortedArch] = \
@@ -287,8 +284,7 @@ def GenGuidProtocolPpi(Package, Content):
         #
         if Ppi.GetSupModuleList():
             Statement += GenDecTailComment(Ppi.GetSupModuleList())
-        ArchList = Ppi.GetSupArchList()
-        ArchList.sort()
+        ArchList = sorted(Ppi.GetSupArchList())
         SortedArch = ' '.join(ArchList)
         if SortedArch in NewSectionDict:
             NewSectionDict[SortedArch] = \
@@ -463,8 +459,7 @@ def PackageToDec(Package, DistHeader = None):
         if LibraryClass.GetSupModuleList():
             Statement += \
             GenDecTailComment(LibraryClass.GetSupModuleList())
-        ArchList = LibraryClass.GetSupArchList()
-        ArchList.sort()
+        ArchList = sorted(LibraryClass.GetSupArchList())
         SortedArch = ' '.join(ArchList)
         if SortedArch in NewSectionDict:
             NewSectionDict[SortedArch] = \
diff --git a/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py b/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
index b97b319e0956..9457f851f4ec 100644
--- a/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
+++ b/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
@@ -493,8 +493,7 @@ def GenPackages(ModuleObject):
         Statement += RelaPath.replace('\\', '/')
         if FFE:
             Statement += '|' + FFE
-        ArchList = PackageDependency.GetSupArchList()
-        ArchList.sort()
+        ArchList = sorted(PackageDependency.GetSupArchList())
         SortedArch = ' '.join(ArchList)
         if SortedArch in NewSectionDict:
             NewSectionDict[SortedArch] = NewSectionDict[SortedArch] + [Statement]
@@ -513,8 +512,7 @@ def GenSources(ModuleObject):
         SourceFile = Source.GetSourceFile()
         Family = Source.GetFamily()
         FeatureFlag = Source.GetFeatureFlag()
-        SupArchList = Source.GetSupArchList()
-        SupArchList.sort()
+        SupArchList = sorted(Source.GetSupArchList())
         SortedArch = ' '.join(SupArchList)
         Statement = GenSourceStatement(ConvertPath(SourceFile), Family, FeatureFlag)
         if SortedArch in NewSectionDict:
@@ -722,8 +720,7 @@ def GenGuidSections(GuidObjList):
         #
         # merge duplicate items
         #
-        ArchList = Guid.GetSupArchList()
-        ArchList.sort()
+        ArchList = sorted(Guid.GetSupArchList())
         SortedArch = ' '.join(ArchList)
         if (Statement, SortedArch) in GuidDict:
             PreviousComment = GuidDict[Statement, SortedArch]
@@ -782,8 +779,7 @@ def GenProtocolPPiSections(ObjList, IsProtocol):
         #
         # merge duplicate items
         #
-        ArchList = Object.GetSupArchList()
-        ArchList.sort()
+        ArchList = sorted(Object.GetSupArchList())
         SortedArch = ' '.join(ArchList)
         if (Statement, SortedArch) in Dict:
             PreviousComment = Dict[Statement, SortedArch]
@@ -857,8 +853,7 @@ def GenPcdSections(ModuleObject):
             #
             # Merge duplicate entries
             #
-            ArchList = Pcd.GetSupArchList()
-            ArchList.sort()
+            ArchList = sorted(Pcd.GetSupArchList())
             SortedArch = ' '.join(ArchList)
             if (Statement, SortedArch) in Dict:
                 PreviousComment = Dict[Statement, SortedArch]
@@ -1025,8 +1020,7 @@ def GenSpecialSections(ObjectList, SectionName, UserExtensionsContent=''):
         if CommentStr and not CommentStr.endswith('\n#\n'):
             CommentStr = CommentStr + '#\n'
         NewStateMent = CommentStr + Statement
-        SupArch = Obj.GetSupArchList()
-        SupArch.sort()
+        SupArch = sorted(Obj.GetSupArchList())
         SortedArch = ' '.join(SupArch)
         if SortedArch in NewSectionDict:
             NewSectionDict[SortedArch] = NewSectionDict[SortedArch] + [NewStateMent]
@@ -1104,8 +1098,7 @@ def GenBinaries(ModuleObject):
             FileName = ConvertPath(FileNameObj.GetFilename())
             FileType = FileNameObj.GetFileType()
             FFE = FileNameObj.GetFeatureFlag()
-            ArchList = FileNameObj.GetSupArchList()
-            ArchList.sort()
+            ArchList = sorted(FileNameObj.GetSupArchList())
             SortedArch = ' '.join(ArchList)
             Key = (FileName, FileType, FFE, SortedArch)
             if Key in BinariesDict:
diff --git a/BaseTools/Source/Python/UPT/Library/Misc.py b/BaseTools/Source/Python/UPT/Library/Misc.py
index e16d309ef883..28471d81238a 100644
--- a/BaseTools/Source/Python/UPT/Library/Misc.py
+++ b/BaseTools/Source/Python/UPT/Library/Misc.py
@@ -514,7 +514,7 @@ class PathClass(object):
     # Check whether PathClass are the same
     #
     def __eq__(self, Other):
-        if type(Other) == type(self):
+        if isinstance(Other, type(self)):
             return self.Path == Other.Path
         else:
             return self.Path == str(Other)
@@ -819,11 +819,11 @@ def ConvertArchList(ArchList):
     if not ArchList:
         return NewArchList
 
-    if type(ArchList) == list:
+    if isinstance(ArchList, list):
         for Arch in ArchList:
             Arch = Arch.upper()
             NewArchList.append(Arch)
-    elif type(ArchList) == str:
+    elif isinstance(ArchList, str):
         ArchList = ArchList.upper()
         NewArchList.append(ArchList)
 
diff --git a/BaseTools/Source/Python/UPT/Library/ParserValidate.py b/BaseTools/Source/Python/UPT/Library/ParserValidate.py
index 3f8ca9d609ae..dc93cedd917e 100644
--- a/BaseTools/Source/Python/UPT/Library/ParserValidate.py
+++ b/BaseTools/Source/Python/UPT/Library/ParserValidate.py
@@ -341,7 +341,7 @@ def IsValidCFormatGuid(Guid):
                 #
                 # Index may out of bound
                 #
-                if type(List[Index]) != type(1) or \
+                if not isinstance(List[Index], type(1)) or \
                    len(Value) > List[Index] or len(Value) < 3:
                     return False
                 
diff --git a/BaseTools/Source/Python/UPT/Library/StringUtils.py b/BaseTools/Source/Python/UPT/Library/StringUtils.py
index bd2cbe612037..2be382fa1797 100644
--- a/BaseTools/Source/Python/UPT/Library/StringUtils.py
+++ b/BaseTools/Source/Python/UPT/Library/StringUtils.py
@@ -651,7 +651,7 @@ def ConvertToSqlString2(String):
 # @param Split: split character
 #
 def GetStringOfList(List, Split=' '):
-    if type(List) != type([]):
+    if not isinstance(List, type([])):
         return List
     Str = ''
     for Item in List:
diff --git a/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py b/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py
index 1096bc5b1849..dbaee678af45 100644
--- a/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py
+++ b/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py
@@ -40,7 +40,7 @@ def CreateXmlElement(Name, String, NodeList, AttributeList):
         Element.appendChild(Doc.createTextNode(String))
 
     for Item in NodeList:
-        if type(Item) == type([]):
+        if isinstance(Item, type([])):
             Key = Item[0]
             Value = Item[1]
             if Key != '' and Key is not None and Value != '' and Value is not None:
diff --git a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
index e2908bcda98b..941dd4a39891 100644
--- a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
+++ b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
@@ -409,8 +409,7 @@ class DecPomAlignment(PackageObject):
         # 
         PackagePath = os.path.split(self.GetFullPath())[0]
         IncludePathList = \
-            [os.path.normpath(Path) + sep for Path in IncludesDict.keys()]
-        IncludePathList.sort()
+            sorted([os.path.normpath(Path) + sep for Path in IncludesDict.keys()])
         
         #
         # get a non-overlap set of include path, IncludePathList should be 
diff --git a/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py b/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py
index a5929e15de2d..84f0d43f015e 100644
--- a/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py
+++ b/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py
@@ -611,8 +611,7 @@ class InfPomAlignment(ModuleObject):
                 SourceFile = Item.GetSourceFileName()
                 Family = Item.GetFamily()
                 FeatureFlag = Item.GetFeatureFlagExp()
-                SupArchList = ConvertArchList(Item.GetSupArchList())
-                SupArchList.sort()
+                SupArchList = sorted(ConvertArchList(Item.GetSupArchList()))
                 Source = SourceFileObject()
                 Source.SetSourceFile(SourceFile)
                 Source.SetFamily(Family)
diff --git a/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py b/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py
index cee42516231c..3bb506bea660 100644
--- a/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py
+++ b/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py
@@ -194,8 +194,7 @@ def GenBinaryData(BinaryData, BinaryObj, BinariesDict, AsBuildIns, BinaryFileObj
         # can be used for the attribute.
         # If both not have VALID_ARCHITECTURE comment and no architecturie specified, then keep it empty.
         #        
-        SupArchList = ConvertArchList(ItemObj.GetSupArchList())
-        SupArchList.sort()
+        SupArchList = sorted(ConvertArchList(ItemObj.GetSupArchList()))
         if len(SupArchList) == 1 and SupArchList[0] == 'COMMON':
             if not (len(OriSupArchList) == 1 or OriSupArchList[0] == 'COMMON'):
                 SupArchList = OriSupArchList
diff --git a/BaseTools/Source/Python/Workspace/BuildClassObject.py b/BaseTools/Source/Python/Workspace/BuildClassObject.py
index 2569235fb875..c188b47534f8 100644
--- a/BaseTools/Source/Python/Workspace/BuildClassObject.py
+++ b/BaseTools/Source/Python/Workspace/BuildClassObject.py
@@ -217,7 +217,7 @@ class StructurePcd(PcdClassObject):
         self.DscRawValue = PcdObject.DscRawValue if PcdObject.DscRawValue else self.DscRawValue
         self.PcdValueFromComm = PcdObject.PcdValueFromComm if PcdObject.PcdValueFromComm else self.PcdValueFromComm
         self.DefinitionPosition = PcdObject.DefinitionPosition if PcdObject.DefinitionPosition else self.DefinitionPosition
-        if type(PcdObject) is StructurePcd:
+        if isinstance(PcdObject, StructurePcd):
             self.StructuredPcdIncludeFile = PcdObject.StructuredPcdIncludeFile if PcdObject.StructuredPcdIncludeFile else self.StructuredPcdIncludeFile
             self.PackageDecs = PcdObject.PackageDecs if PcdObject.PackageDecs else self.PackageDecs
             self.DefaultValues = PcdObject.DefaultValues if PcdObject.DefaultValues else self.DefaultValues
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index 9e7b8a18c28b..06732e6fade4 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -927,13 +927,13 @@ class DscBuildData(PlatformBuildClassObject):
             for pcdname in Pcds:
                 pcd = Pcds[pcdname]
                 Pcds[pcdname].SkuInfoList = {TAB_DEFAULT:pcd.SkuInfoList[skuid] for skuid in pcd.SkuInfoList if skuid in available_sku}
-                if type(pcd) is StructurePcd and pcd.SkuOverrideValues:
+                if isinstance(pcd, StructurePcd) and pcd.SkuOverrideValues:
                     Pcds[pcdname].SkuOverrideValues = {TAB_DEFAULT:pcd.SkuOverrideValues[skuid] for skuid in pcd.SkuOverrideValues if skuid in available_sku}
         else:
             for pcdname in Pcds:
                 pcd = Pcds[pcdname]
                 Pcds[pcdname].SkuInfoList = {skuid:pcd.SkuInfoList[skuid] for skuid in pcd.SkuInfoList if skuid in available_sku}
-                if type(pcd) is StructurePcd and pcd.SkuOverrideValues:
+                if isinstance(pcd, StructurePcd) and pcd.SkuOverrideValues:
                     Pcds[pcdname].SkuOverrideValues = {skuid:pcd.SkuOverrideValues[skuid] for skuid in pcd.SkuOverrideValues if skuid in available_sku}
         return Pcds
     def CompleteHiiPcdsDefaultStores(self, Pcds):
@@ -964,7 +964,7 @@ class DscBuildData(PlatformBuildClassObject):
     def __ParsePcdFromCommandLine(self):
         if GlobalData.BuildOptionPcd:
             for i, pcd in enumerate(GlobalData.BuildOptionPcd):
-                if type(pcd) is tuple:
+                if isinstance(pcd, tuple):
                     continue
                 (pcdname, pcdvalue) = pcd.split('=')
                 if not pcdvalue:
@@ -1320,7 +1320,7 @@ class DscBuildData(PlatformBuildClassObject):
                             File=self.MetaFile, Line = StrPcdSet[str_pcd][0][5])
         # Add the Structure PCD that only defined in DEC, don't have override in DSC file
         for Pcd in self.DecPcds:
-            if type (self._DecPcds[Pcd]) is StructurePcd:
+            if isinstance(self._DecPcds[Pcd], StructurePcd):
                 if Pcd not in S_pcd_set:
                     str_pcd_obj_str = StructurePcd()
                     str_pcd_obj_str.copy(self._DecPcds[Pcd])
diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTools/Source/Python/Workspace/MetaFileParser.py
index 4ab3c137dd7a..8c860d594b4f 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py
@@ -78,10 +78,10 @@ def ParseMacro(Parser):
             #
             # First judge whether this DEFINE is in conditional directive statements or not.
             #
-            if type(self) == DscParser and self._InDirective > -1:
+            if isinstance(self, DscParser) and self._InDirective > -1:
                 pass
             else:
-                if type(self) == DecParser:
+                if isinstance(self, DecParser):
                     if MODEL_META_DATA_HEADER in self._SectionType:
                         self._FileLocalMacros[Name] = Value
                     else:
@@ -92,7 +92,7 @@ def ParseMacro(Parser):
                     self._ConstructSectionMacroDict(Name, Value)
 
         # EDK_GLOBAL defined macros
-        elif type(self) != DscParser:
+        elif not isinstance(self, DscParser):
             EdkLogger.error('Parser', FORMAT_INVALID, "EDK_GLOBAL can only be used in .dsc file",
                             ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
         elif self._SectionType != MODEL_META_DATA_HEADER:
@@ -233,7 +233,7 @@ class MetaFileParser(object):
     #   DataInfo = [data_type, scope1(arch), scope2(platform/moduletype)]
     #
     def __getitem__(self, DataInfo):
-        if type(DataInfo) != type(()):
+        if not isinstance(DataInfo, type(())):
             DataInfo = (DataInfo,)
 
         # Parse the file first, if necessary
@@ -275,7 +275,7 @@ class MetaFileParser(object):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
         self._ValueList[0:len(TokenList)] = TokenList
         # Don't do macro replacement for dsc file at this point
-        if type(self) != DscParser:
+        if not isinstance(self, DscParser):
             Macros = self._Macros
             self._ValueList = [ReplaceMacro(Value, Macros) for Value in self._ValueList]
 
@@ -382,7 +382,7 @@ class MetaFileParser(object):
                 EdkLogger.error('Parser', FORMAT_INVALID, "Invalid version number",
                                 ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
 
-        if type(self) == InfParser and self._Version < 0x00010005:
+        if isinstance(self, InfParser) and self._Version < 0x00010005:
             # EDK module allows using defines as macros
             self._FileLocalMacros[Name] = Value
         self._Defines[Name] = Value
@@ -398,7 +398,7 @@ class MetaFileParser(object):
             self._ValueList[1] = TokenList2[1]              # keys
         else:
             self._ValueList[1] = TokenList[0]
-        if len(TokenList) == 2 and type(self) != DscParser: # value
+        if len(TokenList) == 2 and not isinstance(self, DscParser): # value
             self._ValueList[2] = ReplaceMacro(TokenList[1], self._Macros)
 
         if self._ValueList[1].count('_') != 4:
@@ -426,7 +426,7 @@ class MetaFileParser(object):
         # DecParser SectionType is a list, will contain more than one item only in Pcd Section
         # As Pcd section macro usage is not alllowed, so here it is safe
         #
-        if type(self) == DecParser:
+        if isinstance(self, DecParser):
             SectionDictKey = self._SectionType[0], ScopeKey
         else:
             SectionDictKey = self._SectionType, ScopeKey
@@ -443,7 +443,7 @@ class MetaFileParser(object):
         SpeSpeMacroDict = {}
 
         ActiveSectionType = self._SectionType
-        if type(self) == DecParser:
+        if isinstance(self, DecParser):
             ActiveSectionType = self._SectionType[0]
 
         for (SectionType, Scope) in self._SectionsMacroDict:
@@ -1252,7 +1252,7 @@ class DscParser(MetaFileParser):
             Macros.update(self._Symbols)
         if GlobalData.BuildOptionPcd:
             for Item in GlobalData.BuildOptionPcd:
-                if type(Item) is tuple:
+                if isinstance(Item, tuple):
                     continue
                 PcdName, TmpValue = Item.split("=")
                 TmpValue = BuildOptionValue(TmpValue, self._GuidDict)
diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
index 454ea7d088b4..c9648a9299dd 100644
--- a/BaseTools/Source/Python/build/BuildReport.py
+++ b/BaseTools/Source/Python/build/BuildReport.py
@@ -1865,8 +1865,7 @@ class FdRegionReport(object):
                 for Match in gOffsetGuidPattern.finditer(FvReport):
                     Guid = Match.group(2).upper()
                     OffsetInfo[Match.group(1)] = self._GuidsDb.get(Guid, Guid)
-                OffsetList = OffsetInfo.keys()
-                OffsetList.sort()
+                OffsetList = sorted(OffsetInfo.keys())
                 for Offset in OffsetList:
                     FileWrite (File, "%s %s" % (Offset, OffsetInfo[Offset]))
             except IOError:
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 49869d9ee4e6..bf1f853d56be 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -305,7 +305,7 @@ def LaunchCommand(Command, WorkingDir):
         if EndOfProcedure is not None:
             EndOfProcedure.set()
         if Proc is None:
-            if type(Command) != type(""):
+            if not isinstance(Command, type("")):
                 Command = " ".join(Command)
             EdkLogger.error("build", COMMAND_FAILURE, "Failed to start command", ExtraData="%s [%s]" % (Command, WorkingDir))
 
@@ -316,7 +316,7 @@ def LaunchCommand(Command, WorkingDir):
 
     # check the return code of the program
     if Proc.returncode != 0:
-        if type(Command) != type(""):
+        if not isinstance(Command, type("")):
             Command = " ".join(Command)
         # print out the Response file and its content when make failure
         RespFile = os.path.join(WorkingDir, 'OUTPUT', 'respfilelist.txt')
diff --git a/BaseTools/gcc/mingw-gcc-build.py b/BaseTools/gcc/mingw-gcc-build.py
index 6de6ff43138e..97839d4d03a1 100755
--- a/BaseTools/gcc/mingw-gcc-build.py
+++ b/BaseTools/gcc/mingw-gcc-build.py
@@ -257,9 +257,9 @@ class SourceFiles:
             replaceables = ('extract-dir', 'filename', 'url')
             for replaceItem in fdata:
                 if replaceItem in replaceables: continue
-                if type(fdata[replaceItem]) != str: continue
+                if not isinstance(fdata[replaceItem], str): continue
                 for replaceable in replaceables:
-                    if type(fdata[replaceable]) != str: continue
+                    if not isinstance(fdata[replaceable], str): continue
                     if replaceable in fdata:
                         fdata[replaceable] = \
                             fdata[replaceable].replace(
-- 
2.17.1



^ permalink raw reply related	[flat|nested] 16+ messages in thread

* [PATCH v4 13/13] BaseTools: Replace StringIO.StringIO with io.BytesIO
  2018-06-25 10:31 [PATCH v4 00/20] BaseTools: One step toward python3 Gary Lin
                   ` (11 preceding siblings ...)
  2018-06-25 10:31 ` [PATCH v4 12/13] BaseTools: Fix old python2 idioms Gary Lin
@ 2018-06-25 10:31 ` Gary Lin
  2018-06-26  3:40 ` [PATCH v4 00/20] BaseTools: One step toward python3 Gary Lin
  2018-06-26  4:48 ` Zhu, Yonghong
  14 siblings, 0 replies; 16+ messages in thread
From: Gary Lin @ 2018-06-25 10:31 UTC (permalink / raw)
  To: edk2-devel; +Cc: Yonghong Zhu, Liming Gao

Replace StringIO.StringIO with io.BytesIO to be compatible with python3.
This commit also removes "import StringIO" from the python scripts that
import the module but never actually use it.
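
For reference, the py2/py3-compatible pattern this series converges on looks
roughly like the minimal sketch below (the buffer name and the bytes written
are illustrative only, not taken from the tree). io.BytesIO exists in both
python2 and python3 and only deals in byte strings:

    from io import BytesIO

    # Collect binary output in memory instead of writing straight to a file.
    Buffer = BytesIO()
    Buffer.write(b'\x4d\x5a')       # write bytes, never unicode strings
    Content = Buffer.getvalue()     # everything written so far, as bytes
    Size = len(Content)
    Buffer.close()

Taking len() of getvalue() also hints at why Region.py drops "FvBuffer.len"
in favor of "len(FvBuffer.getvalue())": the non-standard ".len" attribute of
the old StringIO objects has no io.BytesIO counterpart.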

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>
---
 BaseTools/Scripts/ConvertUni.py                            |  5 -----
 BaseTools/Source/Python/AutoGen/AutoGen.py                 | 10 +++++-----
 BaseTools/Source/Python/AutoGen/GenDepex.py                |  4 ++--
 BaseTools/Source/Python/AutoGen/GenPcdDb.py                |  4 ++--
 BaseTools/Source/Python/AutoGen/IdfClassObject.py          |  1 -
 BaseTools/Source/Python/AutoGen/StrGather.py               |  4 ++--
 BaseTools/Source/Python/AutoGen/UniClassObject.py          |  6 +++---
 BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py |  4 ++--
 BaseTools/Source/Python/BPDG/GenVpd.py                     |  6 +++---
 BaseTools/Source/Python/GenFds/AprioriSection.py           |  4 ++--
 BaseTools/Source/Python/GenFds/Capsule.py                  | 10 +++++-----
 BaseTools/Source/Python/GenFds/CapsuleData.py              |  4 ++--
 BaseTools/Source/Python/GenFds/Fd.py                       |  6 +++---
 BaseTools/Source/Python/GenFds/FfsFileStatement.py         |  4 ++--
 BaseTools/Source/Python/GenFds/FfsInfStatement.py          |  4 ++--
 BaseTools/Source/Python/GenFds/Fv.py                       |  6 +++---
 BaseTools/Source/Python/GenFds/FvImageSection.py           |  4 ++--
 BaseTools/Source/Python/GenFds/GenFds.py                   |  8 ++++----
 BaseTools/Source/Python/GenFds/OptionRom.py                |  3 ---
 BaseTools/Source/Python/GenFds/Region.py                   | 11 ++++++-----
 BaseTools/Source/Python/Trim/Trim.py                       |  6 +++---
 BaseTools/Source/Python/build/BuildReport.py               |  4 ++--
 BaseTools/Source/Python/build/build.py                     |  8 ++++----
 23 files changed, 59 insertions(+), 67 deletions(-)

diff --git a/BaseTools/Scripts/ConvertUni.py b/BaseTools/Scripts/ConvertUni.py
index 2af55dfc6702..67bbe41b1f18 100755
--- a/BaseTools/Scripts/ConvertUni.py
+++ b/BaseTools/Scripts/ConvertUni.py
@@ -23,11 +23,6 @@ import codecs
 import os
 import sys
 
-try:
-    from io import StringIO
-except ImportError:
-    from StringIO import StringIO
-
 class ConvertOneArg:
     """Converts utf-16 to utf-8 for one command line argument.
 
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index a7e1edb8435c..202245400433 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -25,7 +25,7 @@ import uuid
 import GenC
 import GenMake
 import GenDepex
-from StringIO import StringIO
+from io import BytesIO
 
 from StrGather import *
 from BuildEngine import BuildRule
@@ -3437,8 +3437,8 @@ class ModuleAutoGen(AutoGen):
     def _GetAutoGenFileList(self):
         UniStringAutoGenC = True
         IdfStringAutoGenC = True
-        UniStringBinBuffer = StringIO()
-        IdfGenBinBuffer = StringIO()
+        UniStringBinBuffer = BytesIO()
+        IdfGenBinBuffer = BytesIO()
         if self.BuildType == 'UEFI_HII':
             UniStringAutoGenC = False
             IdfStringAutoGenC = False
@@ -3713,8 +3713,8 @@ class ModuleAutoGen(AutoGen):
         except:
             EdkLogger.error("build", FILE_OPEN_FAILURE, "File open failed for %s" % UniVfrOffsetFileName, None)
 
-        # Use a instance of StringIO to cache data
-        fStringIO = StringIO('')  
+        # Use a instance of BytesIO to cache data
+        fStringIO = BytesIO('')
 
         for Item in VfrUniOffsetList:
             if (Item[0].find("Strings") != -1):
diff --git a/BaseTools/Source/Python/AutoGen/GenDepex.py b/BaseTools/Source/Python/AutoGen/GenDepex.py
index e89191a72b9f..d3b1eae181c2 100644
--- a/BaseTools/Source/Python/AutoGen/GenDepex.py
+++ b/BaseTools/Source/Python/AutoGen/GenDepex.py
@@ -17,7 +17,7 @@ import Common.LongFilePathOs as os
 import re
 import traceback
 from Common.LongFilePathSupport import OpenLongFilePath as open
-from StringIO import StringIO
+from io import BytesIO
 from struct import pack
 from Common.BuildToolError import *
 from Common.Misc import SaveFileOnChange
@@ -345,7 +345,7 @@ class DependencyExpression:
     #   @retval False   If file exists and is not changed.
     #
     def Generate(self, File=None):
-        Buffer = StringIO()
+        Buffer = BytesIO()
         if len(self.PostfixNotation) == 0:
             return False
 
diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index 07ba29a158be..9fcd7fcc97a9 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -10,7 +10,7 @@
 # THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS" BASIS,
 # WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
 #
-from StringIO import StringIO
+from io import BytesIO
 from Common.Misc import *
 from Common.StringUtils import StringToArray
 from struct import pack
@@ -888,7 +888,7 @@ def CreatePcdDatabaseCode (Info, AutoGenC, AutoGenH):
         DbFileName = os.path.join(Info.PlatformInfo.BuildDir, TAB_FV_DIRECTORY, Phase + "PcdDataBase.raw")
     else:
         DbFileName = os.path.join(Info.OutputDir, Phase + "PcdDataBase.raw")
-    DbFile = StringIO()
+    DbFile = BytesIO()
     DbFile.write(PcdDbBuffer)
     Changed = SaveFileOnChange(DbFileName, DbFile.getvalue(), True)
 def CreatePcdDataBase(PcdDBData):
diff --git a/BaseTools/Source/Python/AutoGen/IdfClassObject.py b/BaseTools/Source/Python/AutoGen/IdfClassObject.py
index e5b933c2036f..b656bd83e3ba 100644
--- a/BaseTools/Source/Python/AutoGen/IdfClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/IdfClassObject.py
@@ -14,7 +14,6 @@
 # Import Modules
 #
 import Common.EdkLogger as EdkLogger
-import StringIO
 from Common.BuildToolError import *
 from Common.StringUtils import GetLineNo
 from Common.Misc import PathClass
diff --git a/BaseTools/Source/Python/AutoGen/StrGather.py b/BaseTools/Source/Python/AutoGen/StrGather.py
index 9f70d4e5b717..0e74f3bfb7cb 100644
--- a/BaseTools/Source/Python/AutoGen/StrGather.py
+++ b/BaseTools/Source/Python/AutoGen/StrGather.py
@@ -18,7 +18,7 @@ import re
 import Common.EdkLogger as EdkLogger
 from Common.BuildToolError import *
 from UniClassObject import *
-from StringIO import StringIO
+from io import BytesIO
 from struct import pack, unpack
 from Common.LongFilePathSupport import OpenLongFilePath as open
 
@@ -341,7 +341,7 @@ def CreateCFileContent(BaseName, UniObjectClass, IsCompatibleMode, UniBinBuffer,
         if Language not in UniLanguageListFiltered:
             continue
         
-        StringBuffer = StringIO()
+        StringBuffer = BytesIO()
         StrStringValue = ''
         ArrayLength = 0
         NumberOfUseOtherLangDef = 0
diff --git a/BaseTools/Source/Python/AutoGen/UniClassObject.py b/BaseTools/Source/Python/AutoGen/UniClassObject.py
index 3a931c6f2766..88810f1ccc0d 100644
--- a/BaseTools/Source/Python/AutoGen/UniClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/UniClassObject.py
@@ -20,7 +20,7 @@ from __future__ import print_function
 import Common.LongFilePathOs as os, codecs, re
 import distutils.util
 import Common.EdkLogger as EdkLogger
-import StringIO
+from io import BytesIO
 from Common.BuildToolError import *
 from Common.StringUtils import GetLineNo
 from Common.Misc import PathClass
@@ -320,7 +320,7 @@ class UniFileClassObject(object):
 
         UniFileClassObject.VerifyUcs2Data(FileIn, FileName, Encoding)
 
-        UniFile = StringIO.StringIO(FileIn)
+        UniFile = BytesIO(FileIn)
         Info = codecs.lookup(Encoding)
         (Reader, Writer) = (Info.streamreader, Info.streamwriter)
         return codecs.StreamReaderWriter(UniFile, Reader, Writer)
@@ -335,7 +335,7 @@ class UniFileClassObject(object):
             FileDecoded = codecs.decode(FileIn, Encoding)
             Ucs2Info.encode(FileDecoded)
         except:
-            UniFile = StringIO.StringIO(FileIn)
+            UniFile = BytesIO(FileIn)
             Info = codecs.lookup(Encoding)
             (Reader, Writer) = (Info.streamreader, Info.streamwriter)
             File = codecs.StreamReaderWriter(UniFile, Reader, Writer)
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index 64d4965e9662..49fbdf3246a5 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -17,7 +17,7 @@
 import os
 from Common.RangeExpression import RangeExpression
 from Common.Misc import *
-from StringIO import StringIO
+from io import BytesIO
 from struct import pack
 from Common.DataType import *
 
@@ -162,7 +162,7 @@ class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
                             Buffer += b
                             realLength += 1
         
-        DbFile = StringIO()
+        DbFile = BytesIO()
         if Phase == 'DXE' and os.path.exists(BinFilePath):
             BinFile = open(BinFilePath, "rb")
             BinBuffer = BinFile.read()
diff --git a/BaseTools/Source/Python/BPDG/GenVpd.py b/BaseTools/Source/Python/BPDG/GenVpd.py
index 807d0fa8d86f..3bae803467a8 100644
--- a/BaseTools/Source/Python/BPDG/GenVpd.py
+++ b/BaseTools/Source/Python/BPDG/GenVpd.py
@@ -14,7 +14,7 @@
 #
 
 import Common.LongFilePathOs as os
-import StringIO
+from io import BytesIO
 import StringTable as st
 import array
 import re
@@ -673,8 +673,8 @@ class GenVPD :
             # Open failed
             EdkLogger.error("BPDG", BuildToolError.FILE_OPEN_FAILURE, "File open failed for %s" % self.MapFileName, None)
 
-        # Use a instance of StringIO to cache data
-        fStringIO = StringIO.StringIO('')
+        # Use a instance of BytesIO to cache data
+        fStringIO = BytesIO('')
 
         # Write the header of map file.
         try :
diff --git a/BaseTools/Source/Python/GenFds/AprioriSection.py b/BaseTools/Source/Python/GenFds/AprioriSection.py
index 3d28c7d778cb..b3e7b5fc64a3 100644
--- a/BaseTools/Source/Python/GenFds/AprioriSection.py
+++ b/BaseTools/Source/Python/GenFds/AprioriSection.py
@@ -17,7 +17,7 @@
 #
 from struct import *
 import Common.LongFilePathOs as os
-import StringIO
+from io import BytesIO
 import FfsFileStatement
 from GenFdsGlobalVariable import GenFdsGlobalVariable
 from CommonDataClass.FdfClass import AprioriSectionClassObject
@@ -51,7 +51,7 @@ class AprioriSection (AprioriSectionClassObject):
     def GenFfs (self, FvName, Dict = {}, IsMakefile = False):
         DXE_GUID = "FC510EE7-FFDC-11D4-BD41-0080C73C8881"
         PEI_GUID = "1B45CC0A-156A-428A-AF62-49864DA0E6E6"
-        Buffer = StringIO.StringIO('')
+        Buffer = BytesIO('')
         AprioriFileGuid = DXE_GUID
         if self.AprioriType == "PEI":
             AprioriFileGuid = PEI_GUID
diff --git a/BaseTools/Source/Python/GenFds/Capsule.py b/BaseTools/Source/Python/GenFds/Capsule.py
index fbd48f3c6d76..b02661d99855 100644
--- a/BaseTools/Source/Python/GenFds/Capsule.py
+++ b/BaseTools/Source/Python/GenFds/Capsule.py
@@ -19,7 +19,7 @@ from GenFdsGlobalVariable import GenFdsGlobalVariable
 from CommonDataClass.FdfClass import CapsuleClassObject
 import Common.LongFilePathOs as os
 import subprocess
-import StringIO
+from io import BytesIO
 from Common.Misc import SaveFileOnChange
 from GenFds import GenFds
 from Common.Misc import PackRegistryFormatGuid
@@ -66,7 +66,7 @@ class Capsule (CapsuleClassObject) :
         #     UINT32            CapsuleImageSize;
         # } EFI_CAPSULE_HEADER;
         #
-        Header = StringIO.StringIO()
+        Header = BytesIO()
         #
         # Use FMP capsule GUID: 6DCBD5ED-E82D-4C44-BDA1-7194199AD92A
         #
@@ -97,7 +97,7 @@ class Capsule (CapsuleClassObject) :
         #     // UINT64 ItemOffsetList[];
         # } EFI_FIRMWARE_MANAGEMENT_CAPSULE_HEADER;
         #
-        FwMgrHdr = StringIO.StringIO()
+        FwMgrHdr = BytesIO()
         if 'CAPSULE_HEADER_INIT_VERSION' in self.TokensDict:
             FwMgrHdr.write(pack('=I', int(self.TokensDict['CAPSULE_HEADER_INIT_VERSION'], 16)))
         else:
@@ -132,7 +132,7 @@ class Capsule (CapsuleClassObject) :
         #
 
         PreSize = FwMgrHdrSize
-        Content = StringIO.StringIO()
+        Content = BytesIO()
         for driver in self.CapsuleDataList:
             FileName = driver.GenCapsuleSubItem()
             FwMgrHdr.write(pack('=Q', PreSize))
@@ -247,7 +247,7 @@ class Capsule (CapsuleClassObject) :
     def GenCapInf(self):
         self.CapInfFileName = os.path.join(GenFdsGlobalVariable.FvDir,
                                    self.UiCapsuleName +  "_Cap" + '.inf')
-        CapInfFile = StringIO.StringIO() #open (self.CapInfFileName , 'w+')
+        CapInfFile = BytesIO() #open (self.CapInfFileName , 'w+')
 
         CapInfFile.writelines("[options]" + T_CHAR_LF)
 
diff --git a/BaseTools/Source/Python/GenFds/CapsuleData.py b/BaseTools/Source/Python/GenFds/CapsuleData.py
index 9dc55e5dbf7b..83b2731110bc 100644
--- a/BaseTools/Source/Python/GenFds/CapsuleData.py
+++ b/BaseTools/Source/Python/GenFds/CapsuleData.py
@@ -17,7 +17,7 @@
 #
 import Ffs
 from GenFdsGlobalVariable import GenFdsGlobalVariable
-import StringIO
+from io import BytesIO
 from struct import pack
 import os
 from Common.Misc import SaveFileOnChange
@@ -82,7 +82,7 @@ class CapsuleFv (CapsuleData):
         if self.FvName.find('.fv') == -1:
             if self.FvName.upper() in GenFdsGlobalVariable.FdfParser.Profile.FvDict:
                 FvObj = GenFdsGlobalVariable.FdfParser.Profile.FvDict[self.FvName.upper()]
-                FdBuffer = StringIO.StringIO('')
+                FdBuffer = BytesIO('')
                 FvObj.CapsuleName = self.CapsuleName
                 FvFile = FvObj.AddToBuffer(FdBuffer)
                 FvObj.CapsuleName = None
diff --git a/BaseTools/Source/Python/GenFds/Fd.py b/BaseTools/Source/Python/GenFds/Fd.py
index b2a14a1e1313..3305a470edfa 100644
--- a/BaseTools/Source/Python/GenFds/Fd.py
+++ b/BaseTools/Source/Python/GenFds/Fd.py
@@ -18,7 +18,7 @@
 import Region
 import Fv
 import Common.LongFilePathOs as os
-import StringIO
+from io import BytesIO
 import sys
 from struct import *
 from GenFdsGlobalVariable import GenFdsGlobalVariable
@@ -75,7 +75,7 @@ class FD(FDClassObject):
                 HasCapsuleRegion = True
                 break
         if HasCapsuleRegion:
-            TempFdBuffer = StringIO.StringIO('')
+            TempFdBuffer = BytesIO('')
             PreviousRegionStart = -1
             PreviousRegionSize = 1
 
@@ -104,7 +104,7 @@ class FD(FDClassObject):
                 GenFdsGlobalVariable.VerboseLogger('Call each region\'s AddToBuffer function')
                 RegionObj.AddToBuffer (TempFdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFds.ImageBinDict, self.vtfRawDict, self.DefineVarDict)
         
-        FdBuffer = StringIO.StringIO('')
+        FdBuffer = BytesIO('')
         PreviousRegionStart = -1
         PreviousRegionSize = 1
         for RegionObj in self.RegionList :
diff --git a/BaseTools/Source/Python/GenFds/FfsFileStatement.py b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
index ba8e0465ef34..f5de57d0ac82 100644
--- a/BaseTools/Source/Python/GenFds/FfsFileStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
@@ -18,7 +18,7 @@
 import Ffs
 import Rule
 import Common.LongFilePathOs as os
-import StringIO
+from io import BytesIO
 import subprocess
 
 from GenFdsGlobalVariable import GenFdsGlobalVariable
@@ -82,7 +82,7 @@ class FileStatement (FileStatementClassObject) :
         Dict.update(self.DefineVarDict)
         SectionAlignments = None
         if self.FvName is not None :
-            Buffer = StringIO.StringIO('')
+            Buffer = BytesIO('')
             if self.FvName.upper() not in GenFdsGlobalVariable.FdfParser.Profile.FvDict:
                 EdkLogger.error("GenFds", GENFDS_ERROR, "FV (%s) is NOT described in FDF file!" % (self.FvName))
             Fv = GenFdsGlobalVariable.FdfParser.Profile.FvDict.get(self.FvName.upper())
diff --git a/BaseTools/Source/Python/GenFds/FfsInfStatement.py b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
index 9eb99d659bfd..ef34dbf00754 100644
--- a/BaseTools/Source/Python/GenFds/FfsInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
@@ -18,7 +18,7 @@
 #
 import Rule
 import Common.LongFilePathOs as os
-import StringIO
+from io import BytesIO
 from struct import *
 from GenFdsGlobalVariable import GenFdsGlobalVariable
 import Ffs
@@ -1088,7 +1088,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
     def __GenUniVfrOffsetFile(VfrUniOffsetList, UniVfrOffsetFileName):
 
         # Use a instance of StringIO to cache data
-        fStringIO = StringIO.StringIO('')  
+        fStringIO = BytesIO('')
         
         for Item in VfrUniOffsetList:
             if (Item[0].find("Strings") != -1):
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index fb82634ccd7e..d980020680f8 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -17,7 +17,7 @@
 #
 import Common.LongFilePathOs as os
 import subprocess
-import StringIO
+from io import BytesIO
 from struct import *
 
 import Ffs
@@ -265,7 +265,7 @@ class FV (FvClassObject):
         #
         self.InfFileName = os.path.join(GenFdsGlobalVariable.FvDir,
                                    self.UiFvName + '.inf')
-        self.FvInfFile = StringIO.StringIO()
+        self.FvInfFile = BytesIO()
 
         #
         # Add [Options]
@@ -407,7 +407,7 @@ class FV (FvClassObject):
             #
             if TotalSize > 0:
                 FvExtHeaderFileName = os.path.join(GenFdsGlobalVariable.FvDir, self.UiFvName + '.ext')
-                FvExtHeaderFile = StringIO.StringIO()
+                FvExtHeaderFile = BytesIO()
                 FvExtHeaderFile.write(Buffer)
                 Changed = SaveFileOnChange(FvExtHeaderFileName, FvExtHeaderFile.getvalue(), True)
                 FvExtHeaderFile.close()
diff --git a/BaseTools/Source/Python/GenFds/FvImageSection.py b/BaseTools/Source/Python/GenFds/FvImageSection.py
index 77bf6a700623..b4f1f3340e99 100644
--- a/BaseTools/Source/Python/GenFds/FvImageSection.py
+++ b/BaseTools/Source/Python/GenFds/FvImageSection.py
@@ -16,7 +16,7 @@
 # Import Modules
 #
 import Section
-import StringIO
+from io import BytesIO
 from Ffs import Ffs
 import subprocess
 from GenFdsGlobalVariable import GenFdsGlobalVariable
@@ -98,7 +98,7 @@ class FvImageSection(FvImageSectionClassObject):
         # Generate Fv
         #
         if self.FvName is not None:
-            Buffer = StringIO.StringIO('')
+            Buffer = BytesIO('')
             Fv = GenFdsGlobalVariable.FdfParser.Profile.FvDict.get(self.FvName)
             if Fv is not None:
                 self.Fv = Fv
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index 912e6c58f402..acd19e527626 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -27,7 +27,7 @@ from Workspace.WorkspaceDatabase import WorkspaceDatabase
 from Workspace.BuildClassObject import PcdClassObject
 import RuleComplexFile
 from EfiSection import EfiSection
-import StringIO
+from io import BytesIO
 import Common.TargetTxtClassObject as TargetTxtClassObject
 import Common.ToolDefClassObject as ToolDefClassObject
 from Common.DataType import *
@@ -542,13 +542,13 @@ class GenFds :
         if GenFds.OnlyGenerateThisFv is not None and GenFds.OnlyGenerateThisFv.upper() in GenFdsGlobalVariable.FdfParser.Profile.FvDict:
             FvObj = GenFdsGlobalVariable.FdfParser.Profile.FvDict[GenFds.OnlyGenerateThisFv.upper()]
             if FvObj is not None:
-                Buffer = StringIO.StringIO()
+                Buffer = BytesIO()
                 FvObj.AddToBuffer(Buffer)
                 Buffer.close()
                 return
         elif GenFds.OnlyGenerateThisFv is None:
             for FvObj in GenFdsGlobalVariable.FdfParser.Profile.FvDict.values():
-                Buffer = StringIO.StringIO('')
+                Buffer = BytesIO('')
                 FvObj.AddToBuffer(Buffer)
                 Buffer.close()
         
@@ -694,7 +694,7 @@ class GenFds :
 
     def GenerateGuidXRefFile(BuildDb, ArchList, FdfParserObj):
         GuidXRefFileName = os.path.join(GenFdsGlobalVariable.FvDir, "Guid.xref")
-        GuidXRefFile = StringIO.StringIO('')
+        GuidXRefFile = BytesIO('')
         GuidDict = {}
         ModuleList = []
         FileGuidList = []
diff --git a/BaseTools/Source/Python/GenFds/OptionRom.py b/BaseTools/Source/Python/GenFds/OptionRom.py
index b05841529940..755eb01da7e1 100644
--- a/BaseTools/Source/Python/GenFds/OptionRom.py
+++ b/BaseTools/Source/Python/GenFds/OptionRom.py
@@ -17,7 +17,6 @@
 #
 import Common.LongFilePathOs as os
 import subprocess
-import StringIO
 
 import OptRomInfStatement
 from GenFdsGlobalVariable import GenFdsGlobalVariable
@@ -138,5 +137,3 @@ class OverrideAttribs:
         self.PciDeviceId = None
         self.PciRevision = None
         self.NeedCompress = None
-        
-        
\ No newline at end of file
diff --git a/BaseTools/Source/Python/GenFds/Region.py b/BaseTools/Source/Python/GenFds/Region.py
index 9d632b6321e2..3b7e30ec8592 100644
--- a/BaseTools/Source/Python/GenFds/Region.py
+++ b/BaseTools/Source/Python/GenFds/Region.py
@@ -17,7 +17,7 @@
 #
 from struct import *
 from GenFdsGlobalVariable import GenFdsGlobalVariable
-import StringIO
+from io import BytesIO
 import string
 from CommonDataClass.FdfClass import RegionClassObject
 import Common.LongFilePathOs as os
@@ -127,7 +127,7 @@ class Region(RegionClassObject):
                         if self.FvAddress % FvAlignValue != 0:
                             EdkLogger.error("GenFds", GENFDS_ERROR,
                                             "FV (%s) is NOT %s Aligned!" % (FvObj.UiFvName, FvObj.FvAlignment))
-                        FvBuffer = StringIO.StringIO('')
+                        FvBuffer = BytesIO('')
                         FvBaseAddress = '0x%X' % self.FvAddress
                         BlockSize = None
                         BlockNum = None
@@ -135,7 +135,8 @@ class Region(RegionClassObject):
                         if Flag:
                             continue
 
-                        if FvBuffer.len > Size:
+                        FvBufferLen = len(FvBuffer.getvalue())
+                        if FvBufferLen > Size:
                             FvBuffer.close()
                             EdkLogger.error("GenFds", GENFDS_ERROR,
                                             "Size of FV (%s) is larger than Region Size 0x%X specified." % (RegionData, Size))
@@ -144,8 +145,8 @@ class Region(RegionClassObject):
                         #
                         Buffer.write(FvBuffer.getvalue())
                         FvBuffer.close()
-                        FvOffset = FvOffset + FvBuffer.len
-                        Size = Size - FvBuffer.len
+                        FvOffset = FvOffset + FvBufferLen
+                        Size = Size - FvBufferLen
                         continue
                     else:
                         EdkLogger.error("GenFds", GENFDS_ERROR, "FV (%s) is NOT described in FDF file!" % (RegionData))
diff --git a/BaseTools/Source/Python/Trim/Trim.py b/BaseTools/Source/Python/Trim/Trim.py
index 97f4e87587ee..76944c0e25b3 100644
--- a/BaseTools/Source/Python/Trim/Trim.py
+++ b/BaseTools/Source/Python/Trim/Trim.py
@@ -17,7 +17,7 @@
 import Common.LongFilePathOs as os
 import sys
 import re
-import StringIO
+from io import BytesIO
 
 from optparse import OptionParser
 from optparse import make_option
@@ -455,8 +455,8 @@ def GenerateVfrBinSec(ModuleName, DebugDir, OutputFile):
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, "File open failed for %s" %OutputFile, None)
 
-    # Use a instance of StringIO to cache data
-    fStringIO = StringIO.StringIO('')
+    # Use a instance of BytesIO to cache data
+    fStringIO = BytesIO('')
 
     for Item in VfrUniOffsetList:
         if (Item[0].find("Strings") != -1):
diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
index c9648a9299dd..897167cd11d6 100644
--- a/BaseTools/Source/Python/build/BuildReport.py
+++ b/BaseTools/Source/Python/build/BuildReport.py
@@ -28,7 +28,7 @@ import hashlib
 import subprocess
 import threading
 from datetime import datetime
-from StringIO import StringIO
+from io import BytesIO
 from Common import EdkLogger
 from Common.Misc import SaveFileOnChange
 from Common.Misc import GuidStructureByteArrayToGuidString
@@ -2169,7 +2169,7 @@ class BuildReport(object):
     def GenerateReport(self, BuildDuration, AutoGenTime, MakeTime, GenFdsTime):
         if self.ReportFile:
             try:
-                File = StringIO('')
+                File = BytesIO('')
                 for (Wa, MaList) in self.ReportList:
                     PlatformReport(Wa, MaList, self.ReportType).GenerateReport(File, BuildDuration, AutoGenTime, MakeTime, GenFdsTime, self.ReportType)
                 Content = FileLinesSplit(File.getvalue(), gLineMaxLength)
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index bf1f853d56be..08e81016de8b 100644
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -19,7 +19,7 @@
 from __future__ import print_function
 import Common.LongFilePathOs as os
 import re
-import StringIO
+from io import BytesIO
 import sys
 import glob
 import time
@@ -1782,7 +1782,7 @@ class Build():
                             if not Ma.IsLibrary:
                                 ModuleList[Ma.Guid.upper()] = Ma
 
-                    MapBuffer = StringIO('')
+                    MapBuffer = BytesIO('')
                     if self.LoadFixAddress != 0:
                         #
                         # Rebase module to the preferred memory address before GenFds
@@ -1940,7 +1940,7 @@ class Build():
                             if not Ma.IsLibrary:
                                 ModuleList[Ma.Guid.upper()] = Ma
 
-                    MapBuffer = StringIO('')
+                    MapBuffer = BytesIO('')
                     if self.LoadFixAddress != 0:
                         #
                         # Rebase module to the preferred memory address before GenFds
@@ -2127,7 +2127,7 @@ class Build():
                     #
                     # Rebase module to the preferred memory address before GenFds
                     #
-                    MapBuffer = StringIO('')
+                    MapBuffer = BytesIO('')
                     if self.LoadFixAddress != 0:
                         self._CollectModuleMapBuffer(MapBuffer, ModuleList)
 
-- 
2.17.1
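
The hunks above share one migration pattern: io.BytesIO objects do not expose the .len attribute that the old StringIO.StringIO objects had, so the buffer size is now taken from len(buf.getvalue()) instead. The sketch below is illustrative only and not part of the patch; FvBuffer, Size and the literal values are made up:

from io import BytesIO

FvBuffer = BytesIO()            # BytesIO() with no argument works on python2
                                # and python3; BytesIO('') only works on
                                # python2, where '' is already a bytes string.
FvBuffer.write(b'\x00' * 16)    # BytesIO stores bytes, not unicode text

Size = 0x1000
FvBufferLen = len(FvBuffer.getvalue())   # replaces the old FvBuffer.len
if FvBufferLen > Size:
    raise ValueError("FV image 0x%X is larger than region size 0x%X"
                     % (FvBufferLen, Size))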



^ permalink raw reply related	[flat|nested] 16+ messages in thread

* Re: [PATCH v4 00/20] BaseTools: One step toward python3
  2018-06-25 10:31 [PATCH v4 00/20] BaseTools: One step toward python3 Gary Lin
                   ` (12 preceding siblings ...)
  2018-06-25 10:31 ` [PATCH v4 13/13] BaseTools: Replace StringIO.StringIO with io.BytesIO Gary Lin
@ 2018-06-26  3:40 ` Gary Lin
  2018-06-26  4:48 ` Zhu, Yonghong
  14 siblings, 0 replies; 16+ messages in thread
From: Gary Lin @ 2018-06-26  3:40 UTC (permalink / raw)
  To: edk2-devel; +Cc: Liming Gao

On Mon, Jun 25, 2018 at 06:31:23PM +0800, Gary Lin wrote:
> v4 changes:
v4 is rebased to 3b03b5e990f8bb347dfdb91926d8ef015d0b607e

>   - Remove the range() patch since it needs python-future
>   - Remove the patch to unify long and int since it caused error in
>     windows.
>   - Split the absolute import patches and will introduce them later
> 
> v3 changes:
>   - Rebase to the current git HEAD (2e1083038d9aa74fcaa2db8158fdee7c8b4af3bb)
>   - Fix a typo in BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/ini.py
>   - Remove the patch for reduce() since it's not used anymore 
> 
> v2 changes:
>   - Rebase to the current git HEAD (821807bcefb9a36e598d71a8004fae5aab2052a0)
>   - Apply "futurize -f libfuturize.fixes.fix_absolute_import" and
>     refactor some python scripts to break the circular imports.
> 
> This patch series is also available in
> https://github.com/lcp/edk2/tree/python3-futurize-v3
I forgot to update the URL. It should be

https://github.com/lcp/edk2/tree/python3-futurize-v4

Gary Lin

> 
> Since python2 will be EOL in 2020, we start to evaluate the impact of
> the python2 removal. As expected, OVMF building failed the test. It's
> actually a task noted in the wiki page:
> 
> https://github.com/tianocore/tianocore.github.io/wiki/Tasks-BaseTools-Python3-Support
> 
> Maybe it's time to convert the python scripts gradually.
> 
> This patchset doesn't make the python scripts in BaseTools compatible
> with python3 immediately. It aims to do the trivial and safe conversion
> and replacement to make some statements compatible with both python2 and
> python3, so we can deal with the difficult cases later.
> 
> With the help of "futurize" from python-future, it's easier to refactor
> the statements. This patchset is basically equivalent to "futurize -1"(*)
> plus "StringIO.StringIO => io.BytesIO" and minus the absolute import.
> 
> The patchset was tested with the following command in openSUSE
> Tumbleweed:
> 
> $ ./OvmfPkg/build.sh -D SECURE_BOOT_ENABLE \
>                      -D NETWORK_IP6_ENABLE \
>                      -D HTTP_BOOT_ENABLE \
>                      -D TLS_ENABLE \
>                      -D TPM2_ENABLE
> 
> The firmware file was built successfully and I didn't notice any error
> so far. Testing with other platforms is welcome.
> 
> (*) http://python-future.org/automatic_conversion.html#stage-1-safe-fixes
> 
> Contributed-under: TianoCore Contribution Agreement 1.1
> Cc: Yonghong Zhu <yonghong.zhu@intel.com>
> Cc: Liming Gao <liming.gao@intel.com>
> Signed-off-by: Gary Lin <glin@suse.com>
> 
> Gary Lin (13):
>   BaseTools: Fix a typo in ini.py
>   BaseTools: Refactor python except statements
>   BaseTools: Refactor python print statements
>   BaseTools: Remove the old python "not-equal"
>   BaseTools: Remove tuple parameter in python scripts
>   BaseTools: Remove the deprecated hash_key()
>   BaseTools: Replace StandardError with Expression
>   BaseTools: Remove types.TypeType
>   BaseTools: Refactor python raise statement
>   BaseTools: Adjust the spaces around commas and colons
>   BaseTools: Migrate to the new octal literal
>   BaseTools: Fix old python2 idioms
>   BaseTools: Replace StringIO.StringIO with io.BytesIO
> 
>  .../Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py   |   5 +-
>  BaseTools/Scripts/BinToPcd.py                 |   7 +-
>  BaseTools/Scripts/ConvertUni.py               |   5 -
>  BaseTools/Scripts/FormatDosFiles.py           |   3 +-
>  BaseTools/Scripts/MemoryProfileSymbolGen.py   |  21 +-
>  .../PackageDocumentTools/packagedoc_cli.py    |  47 ++--
>  .../plugins/EdkPlugins/basemodel/doxygen.py   |  11 +-
>  .../plugins/EdkPlugins/basemodel/efibinary.py |  29 +-
>  .../plugins/EdkPlugins/basemodel/ini.py       |   4 +-
>  .../EdkPlugins/edk2/model/baseobject.py       |   6 +-
>  .../EdkPlugins/edk2/model/doxygengen.py       |   2 +-
>  .../EdkPlugins/edk2/model/doxygengen_spec.py  |   2 +-
>  .../plugins/EdkPlugins/edk2/model/inf.py      |   8 +-
>  BaseTools/Scripts/PatchCheck.py               |   2 +-
>  BaseTools/Scripts/RunMakefile.py              |   2 +-
>  .../Scripts/SmiHandlerProfileSymbolGen.py     |  19 +-
>  BaseTools/Scripts/UpdateBuildVersions.py      |  18 +-
>  BaseTools/Source/Python/AutoGen/AutoGen.py    |  77 +++---
>  .../Source/Python/AutoGen/BuildEngine.py      |  37 +--
>  BaseTools/Source/Python/AutoGen/GenC.py       |  72 ++---
>  BaseTools/Source/Python/AutoGen/GenDepex.py   |   8 +-
>  BaseTools/Source/Python/AutoGen/GenMake.py    |   8 +-
>  BaseTools/Source/Python/AutoGen/GenPcdDb.py   | 118 ++++----
>  BaseTools/Source/Python/AutoGen/GenVar.py     | 160 +++++------
>  .../Source/Python/AutoGen/IdfClassObject.py   |   1 -
>  BaseTools/Source/Python/AutoGen/StrGather.py  |   8 +-
>  .../Source/Python/AutoGen/UniClassObject.py   |  17 +-
>  .../Python/AutoGen/ValidCheckingInfoObject.py |   4 +-
>  BaseTools/Source/Python/BPDG/BPDG.py          |   3 +-
>  BaseTools/Source/Python/BPDG/GenVpd.py        |  18 +-
>  BaseTools/Source/Python/Common/DataType.py    |   4 +-
>  BaseTools/Source/Python/Common/Expression.py  |  77 +++---
>  .../Source/Python/Common/LongFilePathOs.py    |   2 +-
>  BaseTools/Source/Python/Common/Misc.py        |  49 ++--
>  .../Source/Python/Common/RangeExpression.py   |  33 +--
>  BaseTools/Source/Python/Common/StringUtils.py |   6 +-
>  .../Python/Common/TargetTxtClassObject.py     |   7 +-
>  .../Python/Common/ToolDefClassObject.py       |   8 +-
>  BaseTools/Source/Python/Common/VpdInfoFile.py |  23 +-
>  BaseTools/Source/Python/Ecc/CParser.py        | 175 ++++++------
>  BaseTools/Source/Python/Ecc/Check.py          |  14 +-
>  .../Python/Ecc/CodeFragmentCollector.py       |  69 ++---
>  BaseTools/Source/Python/Ecc/Configuration.py  |   5 +-
>  BaseTools/Source/Python/Ecc/Exception.py      |   3 +-
>  .../Ecc/MetaFileWorkspace/MetaDataTable.py    |   5 +-
>  .../Ecc/MetaFileWorkspace/MetaFileParser.py   |  42 +--
>  .../Source/Python/Ecc/Xml/XmlRoutines.py      |   9 +-
>  BaseTools/Source/Python/Ecc/c.py              |  15 +-
>  BaseTools/Source/Python/Eot/CParser.py        | 175 ++++++------
>  .../Python/Eot/CodeFragmentCollector.py       |  61 +++--
>  BaseTools/Source/Python/Eot/InfParserLite.py  |   7 +-
>  BaseTools/Source/Python/Eot/Parser.py         |   2 +-
>  BaseTools/Source/Python/Eot/c.py              |  23 +-
>  .../Source/Python/GenFds/AprioriSection.py    |   6 +-
>  BaseTools/Source/Python/GenFds/Capsule.py     |  10 +-
>  BaseTools/Source/Python/GenFds/CapsuleData.py |   6 +-
>  BaseTools/Source/Python/GenFds/EfiSection.py  |   6 +-
>  BaseTools/Source/Python/GenFds/Fd.py          |  12 +-
>  BaseTools/Source/Python/GenFds/FdfParser.py   |  45 +--
>  .../Source/Python/GenFds/FfsFileStatement.py  |   4 +-
>  .../Source/Python/GenFds/FfsInfStatement.py   |  18 +-
>  BaseTools/Source/Python/GenFds/Fv.py          |  10 +-
>  .../Source/Python/GenFds/FvImageSection.py    |   8 +-
>  BaseTools/Source/Python/GenFds/GenFds.py      |  17 +-
>  .../Python/GenFds/GenFdsGlobalVariable.py     |   9 +-
>  BaseTools/Source/Python/GenFds/OptionRom.py   |   3 -
>  BaseTools/Source/Python/GenFds/Region.py      |  11 +-
>  .../GenPatchPcdTable/GenPatchPcdTable.py      |   9 +-
>  .../Source/Python/Pkcs7Sign/Pkcs7Sign.py      |  31 ++-
>  .../Rsa2048Sha256GenerateKeys.py              |  25 +-
>  .../Rsa2048Sha256Sign/Rsa2048Sha256Sign.py    |  35 +--
>  .../Source/Python/TargetTool/TargetTool.py    |  39 +--
>  BaseTools/Source/Python/Trim/Trim.py          |  24 +-
>  .../Source/Python/UPT/Core/DependencyRules.py |  12 +-
>  .../UPT/Core/DistributionPackageClass.py      |   4 +-
>  BaseTools/Source/Python/UPT/Core/FileHook.py  |   2 +-
>  BaseTools/Source/Python/UPT/Core/IpiDb.py     |   6 +-
>  .../Source/Python/UPT/Core/PackageFile.py     |  12 +-
>  .../Python/UPT/GenMetaFile/GenDecFile.py      |  15 +-
>  .../Python/UPT/GenMetaFile/GenInfFile.py      |  37 +--
>  BaseTools/Source/Python/UPT/InstallPkg.py     |   2 +-
>  BaseTools/Source/Python/UPT/InventoryWs.py    |   2 +-
>  .../Python/UPT/Library/CommentParsing.py      |   2 +-
>  .../Python/UPT/Library/ExpressionValidate.py  |  11 +-
>  BaseTools/Source/Python/UPT/Library/Misc.py   |   6 +-
>  .../Python/UPT/Library/ParserValidate.py      |   2 +-
>  .../Source/Python/UPT/Library/StringUtils.py  |   4 +-
>  .../Python/UPT/Library/UniClassObject.py      |  17 +-
>  .../Python/UPT/Library/Xml/XmlRoutines.py     |   4 +-
>  BaseTools/Source/Python/UPT/MkPkg.py          |   2 +-
>  .../UPT/Object/Parser/InfBinaryObject.py      |   6 +-
>  .../UPT/Object/Parser/InfDefineObject.py      |   2 +-
>  .../Python/UPT/Object/Parser/InfGuidObject.py |   4 +-
>  .../Object/Parser/InfLibraryClassesObject.py  |   2 +-
>  .../Python/UPT/Object/Parser/InfMisc.py       |   4 +-
>  .../UPT/Object/Parser/InfPackagesObject.py    |   4 +-
>  .../Python/UPT/Object/Parser/InfPcdObject.py  |   4 +-
>  .../Python/UPT/Object/Parser/InfPpiObject.py  |   4 +-
>  .../UPT/Object/Parser/InfProtocolObject.py    |   2 +-
>  .../UPT/Object/Parser/InfSoucesObject.py      |   3 +-
>  .../Object/Parser/InfUserExtensionObject.py   |   4 +-
>  .../Python/UPT/PomAdapter/DecPomAlignment.py  |  56 ++--
>  .../Python/UPT/PomAdapter/InfPomAlignment.py  |   3 +-
>  .../UPT/PomAdapter/InfPomAlignmentMisc.py     |   3 +-
>  BaseTools/Source/Python/UPT/ReplacePkg.py     |   2 +-
>  BaseTools/Source/Python/UPT/RmPkg.py          |   2 +-
>  BaseTools/Source/Python/UPT/TestInstall.py    |   4 +-
>  BaseTools/Source/Python/UPT/UPT.py            |   8 +-
>  .../Python/UPT/UnitTest/DecParserTest.py      |   5 +-
>  .../UPT/UnitTest/InfBinarySectionTest.py      |   9 +-
>  BaseTools/Source/Python/UPT/Xml/CommonXml.py  |   2 +-
>  BaseTools/Source/Python/UPT/Xml/XmlParser.py  |  24 +-
>  .../Python/Workspace/BuildClassObject.py      |  16 +-
>  .../Source/Python/Workspace/DecBuildData.py   |  22 +-
>  .../Source/Python/Workspace/DscBuildData.py   | 259 +++++++++---------
>  .../Source/Python/Workspace/InfBuildData.py   |   2 +-
>  .../Source/Python/Workspace/MetaFileParser.py |  69 ++---
>  .../Source/Python/Workspace/MetaFileTable.py  |  10 +-
>  .../Python/Workspace/WorkspaceCommon.py       |   2 +-
>  BaseTools/Source/Python/build/BuildReport.py  |  21 +-
>  BaseTools/Source/Python/build/build.py        |  39 +--
>  BaseTools/Tests/CheckPythonSyntax.py          |   2 +-
>  BaseTools/Tests/TestTools.py                  |  10 +-
>  BaseTools/Tests/TianoCompress.py              |   5 +-
>  BaseTools/gcc/mingw-gcc-build.py              | 111 ++++----
>  125 files changed, 1376 insertions(+), 1353 deletions(-)
> 
> -- 
> 2.17.1
> 
> _______________________________________________
> edk2-devel mailing list
> edk2-devel@lists.01.org
> https://lists.01.org/mailman/listinfo/edk2-devel
> 


^ permalink raw reply	[flat|nested] 16+ messages in thread

* Re: [PATCH v4 00/20] BaseTools: One step toward python3
  2018-06-25 10:31 [PATCH v4 00/20] BaseTools: One step toward python3 Gary Lin
                   ` (13 preceding siblings ...)
  2018-06-26  3:40 ` [PATCH v4 00/20] BaseTools: One step toward python3 Gary Lin
@ 2018-06-26  4:48 ` Zhu, Yonghong
  14 siblings, 0 replies; 16+ messages in thread
From: Zhu, Yonghong @ 2018-06-26  4:48 UTC (permalink / raw)
  To: Gary Lin, edk2-devel@lists.01.org; +Cc: Gao, Liming, Zhu, Yonghong

Hi Gary,

The V4 patches look good to me. I will push this series in the coming days if there are no other comments from the community.
Reviewed-by: Yonghong Zhu <yonghong.zhu@intel.com> 

Best Regards,
Zhu Yonghong


-----Original Message-----
From: Gary Lin [mailto:glin@suse.com] 
Sent: Monday, June 25, 2018 6:31 PM
To: edk2-devel@lists.01.org
Cc: Zhu, Yonghong <yonghong.zhu@intel.com>; Gao, Liming <liming.gao@intel.com>
Subject: [PATCH v4 00/20] BaseTools: One step toward python3

v4 changes:
  - Remove the range() patch since it needs python-future
  - Remove the patch to unify long and int since it caused error in
    windows.
  - Split the absolute import patches and will introduce them later

v3 changes:
  - Rebase to the current git HEAD (2e1083038d9aa74fcaa2db8158fdee7c8b4af3bb)
  - Fix a typo in BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/ini.py
  - Remove the patch for reduce() since it's not used anymore 

v2 changes:
  - Rebase to the current git HEAD (821807bcefb9a36e598d71a8004fae5aab2052a0)
  - Apply "futurize -f libfuturize.fixes.fix_absolute_import" and
    refactor some python scripts to break the circular imports.

This patch series is also available in
https://github.com/lcp/edk2/tree/python3-futurize-v3

Since python2 will be EOL in 2020, we start to evaluate the impact of the python2 removal. As expected, OVMF building failed the test. It's actually a task noted in the wiki page:

https://github.com/tianocore/tianocore.github.io/wiki/Tasks-BaseTools-Python3-Support

Maybe it's time to convert the python scripts gradually.

This patchset doesn't make the python scripts in BaseTools compatible with python3 immediately. It aims to do the trivial and safe conversion and replacement to make some statements compatible with both python2 and python3, so we can deal with the difficult cases later.

With the help of "futurize" from python-future, it's easier to refactor the statements. This patchset is basically equivalent to "futurize -1"(*) plus "StringIO.StringIO => io.BytesIO" and minus the absolute import.
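
As a rough illustration (not code from the BaseTools tree; the function and names below are made up), the stage-1 style rewrites applied by this series replace python2-only syntax with forms that both interpreters accept:

from __future__ import print_function
from io import BytesIO

def demo(path):
    # print statement -> print() function
    print("processing", path)
    try:
        fd = open(path, "rb")
    except IOError as err:       # "except IOError, err" -> "except IOError as err"
        # "raise RuntimeError, msg" -> "raise RuntimeError(msg)"
        raise RuntimeError("cannot open %s: %s" % (path, err))
    mode = 0o644                 # octal literal 0644 -> 0o644
    buf = BytesIO()              # StringIO.StringIO('') -> io.BytesIO()
    if mode != 0:                # old "<>" operator -> "!="
        buf.write(fd.read())
    fd.close()
    return buf.getvalue()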

The patchset was tested with the following command in openSUSE
Tumbleweed:

$ ./OvmfPkg/build.sh -D SECURE_BOOT_ENABLE \
                     -D NETWORK_IP6_ENABLE \
                     -D HTTP_BOOT_ENABLE \
                     -D TLS_ENABLE \
                     -D TPM2_ENABLE

The firmware file was built successfully and I didn't notice any errors so far. Testing with other platforms is welcome.

(*) http://python-future.org/automatic_conversion.html#stage-1-safe-fixes

Contributed-under: TianoCore Contribution Agreement 1.1
Cc: Yonghong Zhu <yonghong.zhu@intel.com>
Cc: Liming Gao <liming.gao@intel.com>
Signed-off-by: Gary Lin <glin@suse.com>

Gary Lin (13):
  BaseTools: Fix a typo in ini.py
  BaseTools: Refactor python except statements
  BaseTools: Refactor python print statements
  BaseTools: Remove the old python "not-equal"
  BaseTools: Remove tuple parameter in python scripts
  BaseTools: Remove the deprecated hash_key()
  BaseTools: Replace StandardError with Expression
  BaseTools: Remove types.TypeType
  BaseTools: Refactor python raise statement
  BaseTools: Adjust the spaces around commas and colons
  BaseTools: Migrate to the new octal literal
  BaseTools: Fix old python2 idioms
  BaseTools: Replace StringIO.StringIO with io.BytesIO

 .../Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py   |   5 +-
 BaseTools/Scripts/BinToPcd.py                 |   7 +-
 BaseTools/Scripts/ConvertUni.py               |   5 -
 BaseTools/Scripts/FormatDosFiles.py           |   3 +-
 BaseTools/Scripts/MemoryProfileSymbolGen.py   |  21 +-
 .../PackageDocumentTools/packagedoc_cli.py    |  47 ++--
 .../plugins/EdkPlugins/basemodel/doxygen.py   |  11 +-
 .../plugins/EdkPlugins/basemodel/efibinary.py |  29 +-
 .../plugins/EdkPlugins/basemodel/ini.py       |   4 +-
 .../EdkPlugins/edk2/model/baseobject.py       |   6 +-
 .../EdkPlugins/edk2/model/doxygengen.py       |   2 +-
 .../EdkPlugins/edk2/model/doxygengen_spec.py  |   2 +-
 .../plugins/EdkPlugins/edk2/model/inf.py      |   8 +-
 BaseTools/Scripts/PatchCheck.py               |   2 +-
 BaseTools/Scripts/RunMakefile.py              |   2 +-
 .../Scripts/SmiHandlerProfileSymbolGen.py     |  19 +-
 BaseTools/Scripts/UpdateBuildVersions.py      |  18 +-
 BaseTools/Source/Python/AutoGen/AutoGen.py    |  77 +++---
 .../Source/Python/AutoGen/BuildEngine.py      |  37 +--
 BaseTools/Source/Python/AutoGen/GenC.py       |  72 ++---
 BaseTools/Source/Python/AutoGen/GenDepex.py   |   8 +-
 BaseTools/Source/Python/AutoGen/GenMake.py    |   8 +-
 BaseTools/Source/Python/AutoGen/GenPcdDb.py   | 118 ++++----
 BaseTools/Source/Python/AutoGen/GenVar.py     | 160 +++++------
 .../Source/Python/AutoGen/IdfClassObject.py   |   1 -
 BaseTools/Source/Python/AutoGen/StrGather.py  |   8 +-
 .../Source/Python/AutoGen/UniClassObject.py   |  17 +-
 .../Python/AutoGen/ValidCheckingInfoObject.py |   4 +-
 BaseTools/Source/Python/BPDG/BPDG.py          |   3 +-
 BaseTools/Source/Python/BPDG/GenVpd.py        |  18 +-
 BaseTools/Source/Python/Common/DataType.py    |   4 +-
 BaseTools/Source/Python/Common/Expression.py  |  77 +++---
 .../Source/Python/Common/LongFilePathOs.py    |   2 +-
 BaseTools/Source/Python/Common/Misc.py        |  49 ++--
 .../Source/Python/Common/RangeExpression.py   |  33 +--
 BaseTools/Source/Python/Common/StringUtils.py |   6 +-
 .../Python/Common/TargetTxtClassObject.py     |   7 +-
 .../Python/Common/ToolDefClassObject.py       |   8 +-
 BaseTools/Source/Python/Common/VpdInfoFile.py |  23 +-
 BaseTools/Source/Python/Ecc/CParser.py        | 175 ++++++------
 BaseTools/Source/Python/Ecc/Check.py          |  14 +-
 .../Python/Ecc/CodeFragmentCollector.py       |  69 ++---
 BaseTools/Source/Python/Ecc/Configuration.py  |   5 +-
 BaseTools/Source/Python/Ecc/Exception.py      |   3 +-
 .../Ecc/MetaFileWorkspace/MetaDataTable.py    |   5 +-
 .../Ecc/MetaFileWorkspace/MetaFileParser.py   |  42 +--
 .../Source/Python/Ecc/Xml/XmlRoutines.py      |   9 +-
 BaseTools/Source/Python/Ecc/c.py              |  15 +-
 BaseTools/Source/Python/Eot/CParser.py        | 175 ++++++------
 .../Python/Eot/CodeFragmentCollector.py       |  61 +++--
 BaseTools/Source/Python/Eot/InfParserLite.py  |   7 +-
 BaseTools/Source/Python/Eot/Parser.py         |   2 +-
 BaseTools/Source/Python/Eot/c.py              |  23 +-
 .../Source/Python/GenFds/AprioriSection.py    |   6 +-
 BaseTools/Source/Python/GenFds/Capsule.py     |  10 +-
 BaseTools/Source/Python/GenFds/CapsuleData.py |   6 +-
 BaseTools/Source/Python/GenFds/EfiSection.py  |   6 +-
 BaseTools/Source/Python/GenFds/Fd.py          |  12 +-
 BaseTools/Source/Python/GenFds/FdfParser.py   |  45 +--
 .../Source/Python/GenFds/FfsFileStatement.py  |   4 +-
 .../Source/Python/GenFds/FfsInfStatement.py   |  18 +-
 BaseTools/Source/Python/GenFds/Fv.py          |  10 +-
 .../Source/Python/GenFds/FvImageSection.py    |   8 +-
 BaseTools/Source/Python/GenFds/GenFds.py      |  17 +-
 .../Python/GenFds/GenFdsGlobalVariable.py     |   9 +-
 BaseTools/Source/Python/GenFds/OptionRom.py   |   3 -
 BaseTools/Source/Python/GenFds/Region.py      |  11 +-
 .../GenPatchPcdTable/GenPatchPcdTable.py      |   9 +-
 .../Source/Python/Pkcs7Sign/Pkcs7Sign.py      |  31 ++-
 .../Rsa2048Sha256GenerateKeys.py              |  25 +-
 .../Rsa2048Sha256Sign/Rsa2048Sha256Sign.py    |  35 +--
 .../Source/Python/TargetTool/TargetTool.py    |  39 +--
 BaseTools/Source/Python/Trim/Trim.py          |  24 +-
 .../Source/Python/UPT/Core/DependencyRules.py |  12 +-
 .../UPT/Core/DistributionPackageClass.py      |   4 +-
 BaseTools/Source/Python/UPT/Core/FileHook.py  |   2 +-
 BaseTools/Source/Python/UPT/Core/IpiDb.py     |   6 +-
 .../Source/Python/UPT/Core/PackageFile.py     |  12 +-
 .../Python/UPT/GenMetaFile/GenDecFile.py      |  15 +-
 .../Python/UPT/GenMetaFile/GenInfFile.py      |  37 +--
 BaseTools/Source/Python/UPT/InstallPkg.py     |   2 +-
 BaseTools/Source/Python/UPT/InventoryWs.py    |   2 +-
 .../Python/UPT/Library/CommentParsing.py      |   2 +-
 .../Python/UPT/Library/ExpressionValidate.py  |  11 +-
 BaseTools/Source/Python/UPT/Library/Misc.py   |   6 +-
 .../Python/UPT/Library/ParserValidate.py      |   2 +-
 .../Source/Python/UPT/Library/StringUtils.py  |   4 +-
 .../Python/UPT/Library/UniClassObject.py      |  17 +-
 .../Python/UPT/Library/Xml/XmlRoutines.py     |   4 +-
 BaseTools/Source/Python/UPT/MkPkg.py          |   2 +-
 .../UPT/Object/Parser/InfBinaryObject.py      |   6 +-
 .../UPT/Object/Parser/InfDefineObject.py      |   2 +-
 .../Python/UPT/Object/Parser/InfGuidObject.py |   4 +-
 .../Object/Parser/InfLibraryClassesObject.py  |   2 +-
 .../Python/UPT/Object/Parser/InfMisc.py       |   4 +-
 .../UPT/Object/Parser/InfPackagesObject.py    |   4 +-
 .../Python/UPT/Object/Parser/InfPcdObject.py  |   4 +-
 .../Python/UPT/Object/Parser/InfPpiObject.py  |   4 +-
 .../UPT/Object/Parser/InfProtocolObject.py    |   2 +-
 .../UPT/Object/Parser/InfSoucesObject.py      |   3 +-
 .../Object/Parser/InfUserExtensionObject.py   |   4 +-
 .../Python/UPT/PomAdapter/DecPomAlignment.py  |  56 ++--
 .../Python/UPT/PomAdapter/InfPomAlignment.py  |   3 +-
 .../UPT/PomAdapter/InfPomAlignmentMisc.py     |   3 +-
 BaseTools/Source/Python/UPT/ReplacePkg.py     |   2 +-
 BaseTools/Source/Python/UPT/RmPkg.py          |   2 +-
 BaseTools/Source/Python/UPT/TestInstall.py    |   4 +-
 BaseTools/Source/Python/UPT/UPT.py            |   8 +-
 .../Python/UPT/UnitTest/DecParserTest.py      |   5 +-
 .../UPT/UnitTest/InfBinarySectionTest.py      |   9 +-
 BaseTools/Source/Python/UPT/Xml/CommonXml.py  |   2 +-
 BaseTools/Source/Python/UPT/Xml/XmlParser.py  |  24 +-
 .../Python/Workspace/BuildClassObject.py      |  16 +-
 .../Source/Python/Workspace/DecBuildData.py   |  22 +-
 .../Source/Python/Workspace/DscBuildData.py   | 259 +++++++++---------
 .../Source/Python/Workspace/InfBuildData.py   |   2 +-
 .../Source/Python/Workspace/MetaFileParser.py |  69 ++---
 .../Source/Python/Workspace/MetaFileTable.py  |  10 +-
 .../Python/Workspace/WorkspaceCommon.py       |   2 +-
 BaseTools/Source/Python/build/BuildReport.py  |  21 +-
 BaseTools/Source/Python/build/build.py        |  39 +--
 BaseTools/Tests/CheckPythonSyntax.py          |   2 +-
 BaseTools/Tests/TestTools.py                  |  10 +-
 BaseTools/Tests/TianoCompress.py              |   5 +-
 BaseTools/gcc/mingw-gcc-build.py              | 111 ++++----
 125 files changed, 1376 insertions(+), 1353 deletions(-)

--
2.17.1



^ permalink raw reply	[flat|nested] 16+ messages in thread

end of thread, other threads:[~2018-06-26  4:48 UTC | newest]

Thread overview: 16+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2018-06-25 10:31 [PATCH v4 00/20] BaseTools: One step toward python3 Gary Lin
2018-06-25 10:31 ` [PATCH v4 01/13] BaseTools: Fix a typo in ini.py Gary Lin
2018-06-25 10:31 ` [PATCH v4 02/13] BaseTools: Refactor python except statements Gary Lin
2018-06-25 10:31 ` [PATCH v4 03/13] BaseTools: Refactor python print statements Gary Lin
2018-06-25 10:31 ` [PATCH v4 04/13] BaseTools: Remove the old python "not-equal" Gary Lin
2018-06-25 10:31 ` [PATCH v4 05/13] BaseTools: Remove tuple parameter in python scripts Gary Lin
2018-06-25 10:31 ` [PATCH v4 06/13] BaseTools: Remove the deprecated hash_key() Gary Lin
2018-06-25 10:31 ` [PATCH v4 07/13] BaseTools: Replace StandardError with Expression Gary Lin
2018-06-25 10:31 ` [PATCH v4 08/13] BaseTools: Remove types.TypeType Gary Lin
2018-06-25 10:31 ` [PATCH v4 09/13] BaseTools: Refactor python raise statement Gary Lin
2018-06-25 10:31 ` [PATCH v4 10/13] BaseTools: Adjust the spaces around commas and colons Gary Lin
2018-06-25 10:31 ` [PATCH v4 11/13] BaseTools: Migrate to the new octal literal Gary Lin
2018-06-25 10:31 ` [PATCH v4 12/13] BaseTools: Fix old python2 idioms Gary Lin
2018-06-25 10:31 ` [PATCH v4 13/13] BaseTools: Replace StringIO.StringIO with io.BytesIO Gary Lin
2018-06-26  3:40 ` [PATCH v4 00/20] BaseTools: One step toward python3 Gary Lin
2018-06-26  4:48 ` Zhu, Yonghong

This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox