public inbox for devel@edk2.groups.io
* [PATCH v1 0/1] BaseTools: Fix Python Formatting
@ 2022-10-10 20:05 Ayush Singh
  2022-10-10 20:05 ` [PATCH v1 1/1] Format BaseTools python files using autopep8 Ayush Singh
  2022-10-12  4:56 ` [PATCH v1 0/1] BaseTools: Fix Python Formatting Ayush Singh
  0 siblings, 2 replies; 5+ messages in thread
From: Ayush Singh @ 2022-10-10 20:05 UTC (permalink / raw)
  To: devel

Fix the formatting of the Python files in BaseTools to conform to PEP 8 using
autopep8. This does not fix all of the warnings/errors reported by flake8; I
wanted to get this patch reviewed first to find out whether ignoring those
warnings is deliberate.

The complete code can be found at: https://github.com/Ayush1325/edk2/tree/formatting
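
For anyone who wants to preview what remains, a rough sketch (assuming autopep8
and flake8 are installed, e.g. from PyPI) is:

$ find BaseTools -name '*.py' -exec autopep8 --diff '{}' \;
$ flake8 BaseTools --statistics --count

On an unpatched tree the first command prints the changes this patch applies;
the second summarizes the flake8 warnings that are intentionally left
untouched for now.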

Ayush Singh (1):
  Format BaseTools python files using autopep8

 BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py                                       |   74 +-
 BaseTools/Edk2ToolsBuild.py                                                             |    4 +-
 BaseTools/Plugin/BuildToolsReport/BuildToolsReportGenerator.py                          |   15 +-
 BaseTools/Plugin/LinuxGcc5ToolChain/LinuxGcc5ToolChain.py                               |    6 +-
 BaseTools/Plugin/WindowsResourceCompiler/WinRcPath.py                                   |    8 +-
 BaseTools/Scripts/BinToPcd.py                                                           |  185 +-
 BaseTools/Scripts/ConvertFceToStructurePcd.py                                           | 1312 ++--
 BaseTools/Scripts/ConvertMasmToNasm.py                                                  |    7 +-
 BaseTools/Scripts/ConvertUni.py                                                         |   14 +-
 BaseTools/Scripts/DetectNotUsedItem.py                                                  |   23 +-
 BaseTools/Scripts/FormatDosFiles.py                                                     |   25 +-
 BaseTools/Scripts/GetMaintainer.py                                                      |   19 +-
 BaseTools/Scripts/GetUtcDateTime.py                                                     |   18 +-
 BaseTools/Scripts/MemoryProfileSymbolGen.py                                             |  162 +-
 BaseTools/Scripts/PackageDocumentTools/__init__.py                                      |    2 +-
 BaseTools/Scripts/PackageDocumentTools/packagedoc_cli.py                                |  138 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/__init__.py                   |    2 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/__init__.py         |    2 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/doxygen.py          |   79 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/efibinary.py        |   96 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/ini.py              |   92 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/inidocview.py       |    3 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/message.py          |   12 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/__init__.py              |    2 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/__init__.py        |    2 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/baseobject.py      |  165 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/dec.py             |   41 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen.py      |  374 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen_spec.py |  372 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/dsc.py             |   25 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/inf.py             |   59 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/__init__.py                              |    2 +-
 BaseTools/Scripts/PatchCheck.py                                                         |   90 +-
 BaseTools/Scripts/RunMakefile.py                                                        |  258 +-
 BaseTools/Scripts/SetupGit.py                                                           |   23 +-
 BaseTools/Scripts/SmiHandlerProfileSymbolGen.py                                         |  165 +-
 BaseTools/Scripts/UpdateBuildVersions.py                                                |   64 +-
 BaseTools/Scripts/efi_debugging.py                                                      |    4 +-
 BaseTools/Scripts/efi_gdb.py                                                            |    1 +
 BaseTools/Source/C/Makefiles/NmakeSubdirs.py                                            |   43 +-
 BaseTools/Source/C/PyEfiCompressor/setup.py                                             |   16 +-
 BaseTools/Source/Python/AmlToC/AmlToC.py                                                |   29 +-
 BaseTools/Source/Python/AutoGen/AutoGen.py                                              |   58 +-
 BaseTools/Source/Python/AutoGen/AutoGenWorker.py                                        |  145 +-
 BaseTools/Source/Python/AutoGen/BuildEngine.py                                          |  158 +-
 BaseTools/Source/Python/AutoGen/DataPipe.py                                             |  152 +-
 BaseTools/Source/Python/AutoGen/GenC.py                                                 |  944 ++-
 BaseTools/Source/Python/AutoGen/GenDepex.py                                             |  211 +-
 BaseTools/Source/Python/AutoGen/GenMake.py                                              |  738 +-
 BaseTools/Source/Python/AutoGen/GenPcdDb.py                                             |  533 +-
 BaseTools/Source/Python/AutoGen/GenVar.py                                               |  182 +-
 BaseTools/Source/Python/AutoGen/IdfClassObject.py                                       |   97 +-
 BaseTools/Source/Python/AutoGen/IncludesAutoGen.py                                      |  109 +-
 BaseTools/Source/Python/AutoGen/InfSectionParser.py                                     |   47 +-
 BaseTools/Source/Python/AutoGen/ModuleAutoGen.py                                        |  945 ++-
 BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py                                  |  255 +-
 BaseTools/Source/Python/AutoGen/PlatformAutoGen.py                                      |  554 +-
 BaseTools/Source/Python/AutoGen/StrGather.py                                            |  221 +-
 BaseTools/Source/Python/AutoGen/UniClassObject.py                                       |  261 +-
 BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py                              |   23 +-
 BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py                                     |  405 +-
 BaseTools/Source/Python/AutoGen/__init__.py                                             |    2 +-
 BaseTools/Source/Python/BPDG/BPDG.py                                                    |   37 +-
 BaseTools/Source/Python/BPDG/GenVpd.py                                                  |  348 +-
 BaseTools/Source/Python/BPDG/StringTable.py                                             |   47 +-
 BaseTools/Source/Python/BPDG/__init__.py                                                |    2 +-
 BaseTools/Source/Python/Capsule/GenerateCapsule.py                                      | 1329 ++--
 BaseTools/Source/Python/Capsule/GenerateWindowsDriver.py                                |  119 +-
 BaseTools/Source/Python/Capsule/WindowsCapsuleSupportHelper.py                          |   83 +-
 BaseTools/Source/Python/Common/BuildToolError.py                                        |  109 +-
 BaseTools/Source/Python/Common/BuildVersion.py                                          |    2 +-
 BaseTools/Source/Python/Common/DataType.py                                              |  187 +-
 BaseTools/Source/Python/Common/Edk2/Capsule/FmpPayloadHeader.py                         |   84 +-
 BaseTools/Source/Python/Common/Edk2/Capsule/__init__.py                                 |    2 +-
 BaseTools/Source/Python/Common/Edk2/__init__.py                                         |    2 +-
 BaseTools/Source/Python/Common/EdkLogger.py                                             |  112 +-
 BaseTools/Source/Python/Common/Expression.py                                            |  243 +-
 BaseTools/Source/Python/Common/GlobalData.py                                            |   24 +-
 BaseTools/Source/Python/Common/LongFilePathOs.py                                        |   31 +-
 BaseTools/Source/Python/Common/LongFilePathOsPath.py                                    |   10 +-
 BaseTools/Source/Python/Common/LongFilePathSupport.py                                   |   11 +-
 BaseTools/Source/Python/Common/Misc.py                                                  |  479 +-
 BaseTools/Source/Python/Common/MultipleWorkspace.py                                     |   32 +-
 BaseTools/Source/Python/Common/Parsing.py                                               |  332 +-
 BaseTools/Source/Python/Common/RangeExpression.py                                       |   89 +-
 BaseTools/Source/Python/Common/StringUtils.py                                           |  194 +-
 BaseTools/Source/Python/Common/TargetTxtClassObject.py                                  |   73 +-
 BaseTools/Source/Python/Common/ToolDefClassObject.py                                    |   89 +-
 BaseTools/Source/Python/Common/Uefi/Capsule/CapsuleDependency.py                        |  394 +-
 BaseTools/Source/Python/Common/Uefi/Capsule/FmpAuthHeader.py                            |  117 +-
 BaseTools/Source/Python/Common/Uefi/Capsule/FmpCapsuleHeader.py                         |  286 +-
 BaseTools/Source/Python/Common/Uefi/Capsule/UefiCapsuleHeader.py                        |  110 +-
 BaseTools/Source/Python/Common/Uefi/Capsule/__init__.py                                 |    2 +-
 BaseTools/Source/Python/Common/Uefi/__init__.py                                         |    2 +-
 BaseTools/Source/Python/Common/VariableAttributes.py                                    |   14 +-
 BaseTools/Source/Python/Common/VpdInfoFile.py                                           |   96 +-
 BaseTools/Source/Python/Common/__init__.py                                              |    2 +-
 BaseTools/Source/Python/Common/caching.py                                               |   29 +-
 BaseTools/Source/Python/CommonDataClass/CommonClass.py                                  |   29 +-
 BaseTools/Source/Python/CommonDataClass/DataClass.py                                    |   88 +-
 BaseTools/Source/Python/CommonDataClass/Exceptions.py                                   |   12 +-
 BaseTools/Source/Python/CommonDataClass/FdfClass.py                                     |  129 +-
 BaseTools/Source/Python/CommonDataClass/__init__.py                                     |    2 +-
 BaseTools/Source/Python/Ecc/CParser3/CLexer.py                                          | 1908 ++---
 BaseTools/Source/Python/Ecc/CParser3/CParser.py                                         | 7876 +++++++++-----------
 BaseTools/Source/Python/Ecc/CParser4/CLexer.py                                          |  140 +-
 BaseTools/Source/Python/Ecc/CParser4/CListener.py                                       |  359 +-
 BaseTools/Source/Python/Ecc/CParser4/CParser.py                                         | 2451 +++---
 BaseTools/Source/Python/Ecc/Check.py                                                    |  404 +-
 BaseTools/Source/Python/Ecc/CodeFragment.py                                             |   68 +-
 BaseTools/Source/Python/Ecc/CodeFragmentCollector.py                                    |  151 +-
 BaseTools/Source/Python/Ecc/Configuration.py                                            |  245 +-
 BaseTools/Source/Python/Ecc/Database.py                                                 |  111 +-
 BaseTools/Source/Python/Ecc/EccGlobalData.py                                            |    2 +-
 BaseTools/Source/Python/Ecc/EccMain.py                                                  |  144 +-
 BaseTools/Source/Python/Ecc/EccToolError.py                                             |  177 +-
 BaseTools/Source/Python/Ecc/Exception.py                                                |   16 +-
 BaseTools/Source/Python/Ecc/FileProfile.py                                              |   10 +-
 BaseTools/Source/Python/Ecc/MetaDataParser.py                                           |   62 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py                          |   43 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py                         |  827 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileTable.py                          |  181 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/__init__.py                               |    2 +-
 BaseTools/Source/Python/Ecc/ParserWarning.py                                            |    8 +-
 BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py                                          |   39 +-
 BaseTools/Source/Python/Ecc/Xml/__init__.py                                             |    2 +-
 BaseTools/Source/Python/Ecc/__init__.py                                                 |    2 +-
 BaseTools/Source/Python/Ecc/c.py                                                        |  512 +-
 BaseTools/Source/Python/Eot/CParser3/CLexer.py                                          | 1908 ++---
 BaseTools/Source/Python/Eot/CParser3/CParser.py                                         | 7876 +++++++++-----------
 BaseTools/Source/Python/Eot/CParser4/CLexer.py                                          |  139 +-
 BaseTools/Source/Python/Eot/CParser4/CListener.py                                       |  358 +-
 BaseTools/Source/Python/Eot/CParser4/CParser.py                                         | 2451 +++---
 BaseTools/Source/Python/Eot/CodeFragment.py                                             |   78 +-
 BaseTools/Source/Python/Eot/CodeFragmentCollector.py                                    |  119 +-
 BaseTools/Source/Python/Eot/Database.py                                                 |   77 +-
 BaseTools/Source/Python/Eot/EotGlobalData.py                                            |    5 +-
 BaseTools/Source/Python/Eot/EotMain.py                                                  |  544 +-
 BaseTools/Source/Python/Eot/EotToolError.py                                             |    7 +-
 BaseTools/Source/Python/Eot/FileProfile.py                                              |   10 +-
 BaseTools/Source/Python/Eot/Identification.py                                           |   11 +-
 BaseTools/Source/Python/Eot/InfParserLite.py                                            |   52 +-
 BaseTools/Source/Python/Eot/Parser.py                                                   |  244 +-
 BaseTools/Source/Python/Eot/ParserWarning.py                                            |    6 +-
 BaseTools/Source/Python/Eot/Report.py                                                   |   63 +-
 BaseTools/Source/Python/Eot/__init__.py                                                 |    2 +-
 BaseTools/Source/Python/Eot/c.py                                                        |  100 +-
 BaseTools/Source/Python/FMMT/FMMT.py                                                    |   57 +-
 BaseTools/Source/Python/FMMT/__init__.py                                                |    4 +-
 BaseTools/Source/Python/FMMT/core/BinaryFactoryProduct.py                               |  151 +-
 BaseTools/Source/Python/FMMT/core/BiosTree.py                                           |   47 +-
 BaseTools/Source/Python/FMMT/core/BiosTreeNode.py                                       |   77 +-
 BaseTools/Source/Python/FMMT/core/FMMTOperation.py                                      |   40 +-
 BaseTools/Source/Python/FMMT/core/FMMTParser.py                                         |   30 +-
 BaseTools/Source/Python/FMMT/core/FvHandler.py                                          |  201 +-
 BaseTools/Source/Python/FMMT/core/GuidTools.py                                          |   47 +-
 BaseTools/Source/Python/FMMT/utils/FmmtLogger.py                                        |   10 +-
 BaseTools/Source/Python/FMMT/utils/FvLayoutPrint.py                                     |   29 +-
 BaseTools/Source/Python/FirmwareStorageFormat/Common.py                                 |   20 +-
 BaseTools/Source/Python/FirmwareStorageFormat/FfsFileHeader.py                          |    5 +-
 BaseTools/Source/Python/FirmwareStorageFormat/FvHeader.py                               |   31 +-
 BaseTools/Source/Python/FirmwareStorageFormat/SectionHeader.py                          |    9 +-
 BaseTools/Source/Python/FirmwareStorageFormat/__init__.py                               |    4 +-
 BaseTools/Source/Python/GenFds/AprioriSection.py                                        |   48 +-
 BaseTools/Source/Python/GenFds/Capsule.py                                               |  103 +-
 BaseTools/Source/Python/GenFds/CapsuleData.py                                           |  111 +-
 BaseTools/Source/Python/GenFds/CompressSection.py                                       |   42 +-
 BaseTools/Source/Python/GenFds/DataSection.py                                           |   63 +-
 BaseTools/Source/Python/GenFds/DepexSection.py                                          |   40 +-
 BaseTools/Source/Python/GenFds/EfiSection.py                                            |  167 +-
 BaseTools/Source/Python/GenFds/Fd.py                                                    |   80 +-
 BaseTools/Source/Python/GenFds/FdfParser.py                                             | 1680 +++--
 BaseTools/Source/Python/GenFds/Ffs.py                                                   |   58 +-
 BaseTools/Source/Python/GenFds/FfsFileStatement.py                                      |   67 +-
 BaseTools/Source/Python/GenFds/FfsInfStatement.py                                       |  572 +-
 BaseTools/Source/Python/GenFds/Fv.py                                                    |  240 +-
 BaseTools/Source/Python/GenFds/FvImageSection.py                                        |   77 +-
 BaseTools/Source/Python/GenFds/GenFds.py                                                |  353 +-
 BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py                                  |  355 +-
 BaseTools/Source/Python/GenFds/GuidSection.py                                           |   90 +-
 BaseTools/Source/Python/GenFds/OptRomFileStatement.py                                   |   16 +-
 BaseTools/Source/Python/GenFds/OptRomInfStatement.py                                    |   56 +-
 BaseTools/Source/Python/GenFds/OptionRom.py                                             |   46 +-
 BaseTools/Source/Python/GenFds/Region.py                                                |  113 +-
 BaseTools/Source/Python/GenFds/Rule.py                                                  |    8 +-
 BaseTools/Source/Python/GenFds/RuleComplexFile.py                                       |   12 +-
 BaseTools/Source/Python/GenFds/RuleSimpleFile.py                                        |   10 +-
 BaseTools/Source/Python/GenFds/Section.py                                               |  121 +-
 BaseTools/Source/Python/GenFds/UiSection.py                                             |   23 +-
 BaseTools/Source/Python/GenFds/VerSection.py                                            |   15 +-
 BaseTools/Source/Python/GenFds/__init__.py                                              |    2 +-
 BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py                            |   61 +-
 BaseTools/Source/Python/GenPatchPcdTable/__init__.py                                    |    2 +-
 BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py                                  |   49 +-
 BaseTools/Source/Python/PatchPcdValue/__init__.py                                       |    2 +-
 BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py                                          |  408 +-
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py                  |  259 +-
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py                          |  350 +-
 BaseTools/Source/Python/Split/Split.py                                                  |   17 +-
 BaseTools/Source/Python/Table/Table.py                                                  |   25 +-
 BaseTools/Source/Python/Table/TableDataModel.py                                         |   17 +-
 BaseTools/Source/Python/Table/TableDec.py                                               |   15 +-
 BaseTools/Source/Python/Table/TableDsc.py                                               |   15 +-
 BaseTools/Source/Python/Table/TableEotReport.py                                         |   19 +-
 BaseTools/Source/Python/Table/TableFdf.py                                               |   15 +-
 BaseTools/Source/Python/Table/TableFile.py                                              |   26 +-
 BaseTools/Source/Python/Table/TableFunction.py                                          |   15 +-
 BaseTools/Source/Python/Table/TableIdentifier.py                                        |   15 +-
 BaseTools/Source/Python/Table/TableInf.py                                               |   15 +-
 BaseTools/Source/Python/Table/TablePcd.py                                               |   15 +-
 BaseTools/Source/Python/Table/TableQuery.py                                             |   12 +-
 BaseTools/Source/Python/Table/TableReport.py                                            |   37 +-
 BaseTools/Source/Python/Table/__init__.py                                               |    2 +-
 BaseTools/Source/Python/TargetTool/TargetTool.py                                        |  107 +-
 BaseTools/Source/Python/TargetTool/__init__.py                                          |    2 +-
 BaseTools/Source/Python/Trim/Trim.py                                                    |  224 +-
 BaseTools/Source/Python/UPT/BuildVersion.py                                             |    2 +-
 BaseTools/Source/Python/UPT/Core/DependencyRules.py                                     |   82 +-
 BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py                            |   78 +-
 BaseTools/Source/Python/UPT/Core/FileHook.py                                            |   50 +-
 BaseTools/Source/Python/UPT/Core/IpiDb.py                                               |  230 +-
 BaseTools/Source/Python/UPT/Core/PackageFile.py                                         |   61 +-
 BaseTools/Source/Python/UPT/Core/__init__.py                                            |    2 +-
 BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py                                   |  149 +-
 BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py                                   |  181 +-
 BaseTools/Source/Python/UPT/GenMetaFile/GenMetaFileMisc.py                              |   40 +-
 BaseTools/Source/Python/UPT/GenMetaFile/GenXmlFile.py                                   |    2 +-
 BaseTools/Source/Python/UPT/GenMetaFile/__init__.py                                     |    2 +-
 BaseTools/Source/Python/UPT/InstallPkg.py                                               |  295 +-
 BaseTools/Source/Python/UPT/InventoryWs.py                                              |   45 +-
 BaseTools/Source/Python/UPT/Library/CommentGenerating.py                                |   88 +-
 BaseTools/Source/Python/UPT/Library/CommentParsing.py                                   |  177 +-
 BaseTools/Source/Python/UPT/Library/DataType.py                                         |  379 +-
 BaseTools/Source/Python/UPT/Library/ExpressionValidate.py                               |  147 +-
 BaseTools/Source/Python/UPT/Library/GlobalData.py                                       |    2 +-
 BaseTools/Source/Python/UPT/Library/Misc.py                                             |  227 +-
 BaseTools/Source/Python/UPT/Library/ParserValidate.py                                   |  130 +-
 BaseTools/Source/Python/UPT/Library/Parsing.py                                          |  363 +-
 BaseTools/Source/Python/UPT/Library/StringUtils.py                                      |  203 +-
 BaseTools/Source/Python/UPT/Library/UniClassObject.py                                   |  447 +-
 BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py                                  |   37 +-
 BaseTools/Source/Python/UPT/Library/Xml/__init__.py                                     |    2 +-
 BaseTools/Source/Python/UPT/Library/__init__.py                                         |    2 +-
 BaseTools/Source/Python/UPT/Logger/Log.py                                               |   97 +-
 BaseTools/Source/Python/UPT/Logger/StringTable.py                                       |  933 +--
 BaseTools/Source/Python/UPT/Logger/ToolError.py                                         |  117 +-
 BaseTools/Source/Python/UPT/Logger/__init__.py                                          |    2 +-
 BaseTools/Source/Python/UPT/MkPkg.py                                                    |   73 +-
 BaseTools/Source/Python/UPT/Object/POM/CommonObject.py                                  |   93 +-
 BaseTools/Source/Python/UPT/Object/POM/ModuleObject.py                                  |   35 +-
 BaseTools/Source/Python/UPT/Object/POM/PackageObject.py                                 |   13 +-
 BaseTools/Source/Python/UPT/Object/POM/__init__.py                                      |    2 +-
 BaseTools/Source/Python/UPT/Object/Parser/DecObject.py                                  |  186 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py                            |  138 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfBuildOptionObject.py                       |   15 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfCommonObject.py                            |   46 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfDefineCommonObject.py                      |   36 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py                            |  334 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfDepexObject.py                             |   21 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py                              |   51 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfHeaderObject.py                            |   34 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py                    |   35 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py                                    |   23 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py                          |   35 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py                               |  116 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py                               |   46 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py                          |   42 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py                            |   66 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py                     |   24 +-
 BaseTools/Source/Python/UPT/Object/Parser/__init__.py                                   |    2 +-
 BaseTools/Source/Python/UPT/Object/__init__.py                                          |    2 +-
 BaseTools/Source/Python/UPT/Parser/DecParser.py                                         |  280 +-
 BaseTools/Source/Python/UPT/Parser/DecParserMisc.py                                     |   56 +-
 BaseTools/Source/Python/UPT/Parser/InfAsBuiltProcess.py                                 |   47 +-
 BaseTools/Source/Python/UPT/Parser/InfBinarySectionParser.py                            |   44 +-
 BaseTools/Source/Python/UPT/Parser/InfBuildOptionSectionParser.py                       |   53 +-
 BaseTools/Source/Python/UPT/Parser/InfDefineSectionParser.py                            |   45 +-
 BaseTools/Source/Python/UPT/Parser/InfDepexSectionParser.py                             |   15 +-
 BaseTools/Source/Python/UPT/Parser/InfGuidPpiProtocolSectionParser.py                   |   73 +-
 BaseTools/Source/Python/UPT/Parser/InfLibrarySectionParser.py                           |   38 +-
 BaseTools/Source/Python/UPT/Parser/InfPackageSectionParser.py                           |   34 +-
 BaseTools/Source/Python/UPT/Parser/InfParser.py                                         |  164 +-
 BaseTools/Source/Python/UPT/Parser/InfParserMisc.py                                     |  122 +-
 BaseTools/Source/Python/UPT/Parser/InfPcdSectionParser.py                               |   45 +-
 BaseTools/Source/Python/UPT/Parser/InfSectionParser.py                                  |   94 +-
 BaseTools/Source/Python/UPT/Parser/InfSourceSectionParser.py                            |   39 +-
 BaseTools/Source/Python/UPT/Parser/__init__.py                                          |    2 +-
 BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py                               |  236 +-
 BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py                               |  183 +-
 BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py                           |   35 +-
 BaseTools/Source/Python/UPT/PomAdapter/__init__.py                                      |    2 +-
 BaseTools/Source/Python/UPT/ReplacePkg.py                                               |   60 +-
 BaseTools/Source/Python/UPT/RmPkg.py                                                    |   66 +-
 BaseTools/Source/Python/UPT/TestInstall.py                                              |   27 +-
 BaseTools/Source/Python/UPT/UPT.py                                                      |  150 +-
 BaseTools/Source/Python/UPT/UnitTest/CommentGeneratingUnitTest.py                       |   86 +-
 BaseTools/Source/Python/UPT/UnitTest/CommentParsingUnitTest.py                          |   91 +-
 BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py                                   |   23 +-
 BaseTools/Source/Python/UPT/UnitTest/DecParserUnitTest.py                               |  118 +-
 BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py                            |  131 +-
 BaseTools/Source/Python/UPT/Xml/CommonXml.py                                            |  265 +-
 BaseTools/Source/Python/UPT/Xml/GuidProtocolPpiXml.py                                   |  139 +-
 BaseTools/Source/Python/UPT/Xml/IniToXml.py                                             |  152 +-
 BaseTools/Source/Python/UPT/Xml/ModuleSurfaceAreaXml.py                                 |  143 +-
 BaseTools/Source/Python/UPT/Xml/PackageSurfaceAreaXml.py                                |   87 +-
 BaseTools/Source/Python/UPT/Xml/PcdXml.py                                               |  141 +-
 BaseTools/Source/Python/UPT/Xml/XmlParser.py                                            |  363 +-
 BaseTools/Source/Python/UPT/Xml/XmlParserMisc.py                                        |   22 +-
 BaseTools/Source/Python/UPT/Xml/__init__.py                                             |    2 +-
 BaseTools/Source/Python/Workspace/BuildClassObject.py                                   |  303 +-
 BaseTools/Source/Python/Workspace/DecBuildData.py                                       |  184 +-
 BaseTools/Source/Python/Workspace/DscBuildData.py                                       | 2007 +++--
 BaseTools/Source/Python/Workspace/InfBuildData.py                                       |  426 +-
 BaseTools/Source/Python/Workspace/MetaDataTable.py                                      |   98 +-
 BaseTools/Source/Python/Workspace/MetaFileCommentParser.py                              |   21 +-
 BaseTools/Source/Python/Workspace/MetaFileParser.py                                     |  885 ++-
 BaseTools/Source/Python/Workspace/MetaFileTable.py                                      |  217 +-
 BaseTools/Source/Python/Workspace/WorkspaceCommon.py                                    |   60 +-
 BaseTools/Source/Python/Workspace/WorkspaceDatabase.py                                  |   67 +-
 BaseTools/Source/Python/Workspace/__init__.py                                           |    2 +-
 BaseTools/Source/Python/build/BuildReport.py                                            |  766 +-
 BaseTools/Source/Python/build/__init__.py                                               |    2 +-
 BaseTools/Source/Python/build/build.py                                                  | 1145 +--
 BaseTools/Source/Python/build/buildoptions.py                                           |  113 +-
 BaseTools/Source/Python/sitecustomize.py                                                |   11 +-
 BaseTools/Source/Python/tests/Split/test_split.py                                       |   37 +-
 BaseTools/Tests/CToolsTests.py                                                          |    6 +-
 BaseTools/Tests/CheckPythonSyntax.py                                                    |   15 +-
 BaseTools/Tests/CheckUnicodeSourceFiles.py                                              |    4 +-
 BaseTools/Tests/PythonTest.py                                                           |    2 +-
 BaseTools/Tests/PythonToolsTests.py                                                     |    4 +-
 BaseTools/Tests/RunTests.py                                                             |    7 +-
 BaseTools/Tests/TestRegularExpression.py                                                |    7 +-
 BaseTools/Tests/TestTools.py                                                            |   54 +-
 BaseTools/Tests/TianoCompress.py                                                        |   15 +-
 335 files changed, 35765 insertions(+), 32705 deletions(-)

-- 
2.37.3



* [PATCH v1 1/1] Format BaseTools python files using autopep8
  2022-10-10 20:05 [PATCH v1 0/1] BaseTools: Fix Python Formatting Ayush Singh
@ 2022-10-10 20:05 ` Ayush Singh
  2022-10-12  4:56 ` [PATCH v1 0/1] BaseTools: Fix Python Formatting Ayush Singh
  1 sibling, 0 replies; 5+ messages in thread
From: Ayush Singh @ 2022-10-10 20:05 UTC (permalink / raw)
  To: devel

Format all Python files in BaseTools using the following command:
$ find . -name '*.py' -exec autopep8 --in-place '{}' \;
This is done to make the Python code PEP 8 compliant, as required by the
EDK II Python Development Process Specification.
Signed-off-by: Ayush Singh <ayushdevel1325@gmail.com>
---
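Note for reviewers: one way to confirm the change is purely mechanical (a
sketch; it assumes the same autopep8 version that generated the patch and
that the find command is invoked from the BaseTools directory) is to repeat
the command on the patched tree and check that the working tree stays clean:

$ find . -name '*.py' -exec autopep8 --in-place '{}' \;
$ git status --porcelain

If the second command prints nothing, the tree already matches autopep8's
output.
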
 BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py                                       |   74 +-
 BaseTools/Edk2ToolsBuild.py                                                             |    4 +-
 BaseTools/Plugin/BuildToolsReport/BuildToolsReportGenerator.py                          |   15 +-
 BaseTools/Plugin/LinuxGcc5ToolChain/LinuxGcc5ToolChain.py                               |    6 +-
 BaseTools/Plugin/WindowsResourceCompiler/WinRcPath.py                                   |    8 +-
 BaseTools/Scripts/BinToPcd.py                                                           |  185 +-
 BaseTools/Scripts/ConvertFceToStructurePcd.py                                           | 1312 ++--
 BaseTools/Scripts/ConvertMasmToNasm.py                                                  |    7 +-
 BaseTools/Scripts/ConvertUni.py                                                         |   14 +-
 BaseTools/Scripts/DetectNotUsedItem.py                                                  |   23 +-
 BaseTools/Scripts/FormatDosFiles.py                                                     |   25 +-
 BaseTools/Scripts/GetMaintainer.py                                                      |   19 +-
 BaseTools/Scripts/GetUtcDateTime.py                                                     |   18 +-
 BaseTools/Scripts/MemoryProfileSymbolGen.py                                             |  162 +-
 BaseTools/Scripts/PackageDocumentTools/__init__.py                                      |    2 +-
 BaseTools/Scripts/PackageDocumentTools/packagedoc_cli.py                                |  138 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/__init__.py                   |    2 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/__init__.py         |    2 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/doxygen.py          |   79 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/efibinary.py        |   96 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/ini.py              |   92 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/inidocview.py       |    3 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/message.py          |   12 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/__init__.py              |    2 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/__init__.py        |    2 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/baseobject.py      |  165 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/dec.py             |   41 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen.py      |  374 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen_spec.py |  372 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/dsc.py             |   25 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/inf.py             |   59 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/__init__.py                              |    2 +-
 BaseTools/Scripts/PatchCheck.py                                                         |   90 +-
 BaseTools/Scripts/RunMakefile.py                                                        |  258 +-
 BaseTools/Scripts/SetupGit.py                                                           |   23 +-
 BaseTools/Scripts/SmiHandlerProfileSymbolGen.py                                         |  165 +-
 BaseTools/Scripts/UpdateBuildVersions.py                                                |   64 +-
 BaseTools/Scripts/efi_debugging.py                                                      |    4 +-
 BaseTools/Scripts/efi_gdb.py                                                            |    1 +
 BaseTools/Source/C/Makefiles/NmakeSubdirs.py                                            |   43 +-
 BaseTools/Source/C/PyEfiCompressor/setup.py                                             |   16 +-
 BaseTools/Source/Python/AmlToC/AmlToC.py                                                |   29 +-
 BaseTools/Source/Python/AutoGen/AutoGen.py                                              |   58 +-
 BaseTools/Source/Python/AutoGen/AutoGenWorker.py                                        |  145 +-
 BaseTools/Source/Python/AutoGen/BuildEngine.py                                          |  158 +-
 BaseTools/Source/Python/AutoGen/DataPipe.py                                             |  152 +-
 BaseTools/Source/Python/AutoGen/GenC.py                                                 |  944 ++-
 BaseTools/Source/Python/AutoGen/GenDepex.py                                             |  211 +-
 BaseTools/Source/Python/AutoGen/GenMake.py                                              |  738 +-
 BaseTools/Source/Python/AutoGen/GenPcdDb.py                                             |  533 +-
 BaseTools/Source/Python/AutoGen/GenVar.py                                               |  182 +-
 BaseTools/Source/Python/AutoGen/IdfClassObject.py                                       |   97 +-
 BaseTools/Source/Python/AutoGen/IncludesAutoGen.py                                      |  109 +-
 BaseTools/Source/Python/AutoGen/InfSectionParser.py                                     |   47 +-
 BaseTools/Source/Python/AutoGen/ModuleAutoGen.py                                        |  945 ++-
 BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py                                  |  255 +-
 BaseTools/Source/Python/AutoGen/PlatformAutoGen.py                                      |  554 +-
 BaseTools/Source/Python/AutoGen/StrGather.py                                            |  221 +-
 BaseTools/Source/Python/AutoGen/UniClassObject.py                                       |  261 +-
 BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py                              |   23 +-
 BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py                                     |  405 +-
 BaseTools/Source/Python/AutoGen/__init__.py                                             |    2 +-
 BaseTools/Source/Python/BPDG/BPDG.py                                                    |   37 +-
 BaseTools/Source/Python/BPDG/GenVpd.py                                                  |  348 +-
 BaseTools/Source/Python/BPDG/StringTable.py                                             |   47 +-
 BaseTools/Source/Python/BPDG/__init__.py                                                |    2 +-
 BaseTools/Source/Python/Capsule/GenerateCapsule.py                                      | 1329 ++--
 BaseTools/Source/Python/Capsule/GenerateWindowsDriver.py                                |  119 +-
 BaseTools/Source/Python/Capsule/WindowsCapsuleSupportHelper.py                          |   83 +-
 BaseTools/Source/Python/Common/BuildToolError.py                                        |  109 +-
 BaseTools/Source/Python/Common/BuildVersion.py                                          |    2 +-
 BaseTools/Source/Python/Common/DataType.py                                              |  187 +-
 BaseTools/Source/Python/Common/Edk2/Capsule/FmpPayloadHeader.py                         |   84 +-
 BaseTools/Source/Python/Common/Edk2/Capsule/__init__.py                                 |    2 +-
 BaseTools/Source/Python/Common/Edk2/__init__.py                                         |    2 +-
 BaseTools/Source/Python/Common/EdkLogger.py                                             |  112 +-
 BaseTools/Source/Python/Common/Expression.py                                            |  243 +-
 BaseTools/Source/Python/Common/GlobalData.py                                            |   24 +-
 BaseTools/Source/Python/Common/LongFilePathOs.py                                        |   31 +-
 BaseTools/Source/Python/Common/LongFilePathOsPath.py                                    |   10 +-
 BaseTools/Source/Python/Common/LongFilePathSupport.py                                   |   11 +-
 BaseTools/Source/Python/Common/Misc.py                                                  |  479 +-
 BaseTools/Source/Python/Common/MultipleWorkspace.py                                     |   32 +-
 BaseTools/Source/Python/Common/Parsing.py                                               |  332 +-
 BaseTools/Source/Python/Common/RangeExpression.py                                       |   89 +-
 BaseTools/Source/Python/Common/StringUtils.py                                           |  194 +-
 BaseTools/Source/Python/Common/TargetTxtClassObject.py                                  |   73 +-
 BaseTools/Source/Python/Common/ToolDefClassObject.py                                    |   89 +-
 BaseTools/Source/Python/Common/Uefi/Capsule/CapsuleDependency.py                        |  394 +-
 BaseTools/Source/Python/Common/Uefi/Capsule/FmpAuthHeader.py                            |  117 +-
 BaseTools/Source/Python/Common/Uefi/Capsule/FmpCapsuleHeader.py                         |  286 +-
 BaseTools/Source/Python/Common/Uefi/Capsule/UefiCapsuleHeader.py                        |  110 +-
 BaseTools/Source/Python/Common/Uefi/Capsule/__init__.py                                 |    2 +-
 BaseTools/Source/Python/Common/Uefi/__init__.py                                         |    2 +-
 BaseTools/Source/Python/Common/VariableAttributes.py                                    |   14 +-
 BaseTools/Source/Python/Common/VpdInfoFile.py                                           |   96 +-
 BaseTools/Source/Python/Common/__init__.py                                              |    2 +-
 BaseTools/Source/Python/Common/caching.py                                               |   29 +-
 BaseTools/Source/Python/CommonDataClass/CommonClass.py                                  |   29 +-
 BaseTools/Source/Python/CommonDataClass/DataClass.py                                    |   88 +-
 BaseTools/Source/Python/CommonDataClass/Exceptions.py                                   |   12 +-
 BaseTools/Source/Python/CommonDataClass/FdfClass.py                                     |  129 +-
 BaseTools/Source/Python/CommonDataClass/__init__.py                                     |    2 +-
 BaseTools/Source/Python/Ecc/CParser3/CLexer.py                                          | 1908 ++---
 BaseTools/Source/Python/Ecc/CParser3/CParser.py                                         | 7876 +++++++++-----------
 BaseTools/Source/Python/Ecc/CParser4/CLexer.py                                          |  140 +-
 BaseTools/Source/Python/Ecc/CParser4/CListener.py                                       |  359 +-
 BaseTools/Source/Python/Ecc/CParser4/CParser.py                                         | 2451 +++---
 BaseTools/Source/Python/Ecc/Check.py                                                    |  404 +-
 BaseTools/Source/Python/Ecc/CodeFragment.py                                             |   68 +-
 BaseTools/Source/Python/Ecc/CodeFragmentCollector.py                                    |  151 +-
 BaseTools/Source/Python/Ecc/Configuration.py                                            |  245 +-
 BaseTools/Source/Python/Ecc/Database.py                                                 |  111 +-
 BaseTools/Source/Python/Ecc/EccGlobalData.py                                            |    2 +-
 BaseTools/Source/Python/Ecc/EccMain.py                                                  |  144 +-
 BaseTools/Source/Python/Ecc/EccToolError.py                                             |  177 +-
 BaseTools/Source/Python/Ecc/Exception.py                                                |   16 +-
 BaseTools/Source/Python/Ecc/FileProfile.py                                              |   10 +-
 BaseTools/Source/Python/Ecc/MetaDataParser.py                                           |   62 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py                          |   43 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py                         |  827 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileTable.py                          |  181 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/__init__.py                               |    2 +-
 BaseTools/Source/Python/Ecc/ParserWarning.py                                            |    8 +-
 BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py                                          |   39 +-
 BaseTools/Source/Python/Ecc/Xml/__init__.py                                             |    2 +-
 BaseTools/Source/Python/Ecc/__init__.py                                                 |    2 +-
 BaseTools/Source/Python/Ecc/c.py                                                        |  512 +-
 BaseTools/Source/Python/Eot/CParser3/CLexer.py                                          | 1908 ++---
 BaseTools/Source/Python/Eot/CParser3/CParser.py                                         | 7876 +++++++++-----------
 BaseTools/Source/Python/Eot/CParser4/CLexer.py                                          |  139 +-
 BaseTools/Source/Python/Eot/CParser4/CListener.py                                       |  358 +-
 BaseTools/Source/Python/Eot/CParser4/CParser.py                                         | 2451 +++---
 BaseTools/Source/Python/Eot/CodeFragment.py                                             |   78 +-
 BaseTools/Source/Python/Eot/CodeFragmentCollector.py                                    |  119 +-
 BaseTools/Source/Python/Eot/Database.py                                                 |   77 +-
 BaseTools/Source/Python/Eot/EotGlobalData.py                                            |    5 +-
 BaseTools/Source/Python/Eot/EotMain.py                                                  |  544 +-
 BaseTools/Source/Python/Eot/EotToolError.py                                             |    7 +-
 BaseTools/Source/Python/Eot/FileProfile.py                                              |   10 +-
 BaseTools/Source/Python/Eot/Identification.py                                           |   11 +-
 BaseTools/Source/Python/Eot/InfParserLite.py                                            |   52 +-
 BaseTools/Source/Python/Eot/Parser.py                                                   |  244 +-
 BaseTools/Source/Python/Eot/ParserWarning.py                                            |    6 +-
 BaseTools/Source/Python/Eot/Report.py                                                   |   63 +-
 BaseTools/Source/Python/Eot/__init__.py                                                 |    2 +-
 BaseTools/Source/Python/Eot/c.py                                                        |  100 +-
 BaseTools/Source/Python/FMMT/FMMT.py                                                    |   57 +-
 BaseTools/Source/Python/FMMT/__init__.py                                                |    4 +-
 BaseTools/Source/Python/FMMT/core/BinaryFactoryProduct.py                               |  151 +-
 BaseTools/Source/Python/FMMT/core/BiosTree.py                                           |   47 +-
 BaseTools/Source/Python/FMMT/core/BiosTreeNode.py                                       |   77 +-
 BaseTools/Source/Python/FMMT/core/FMMTOperation.py                                      |   40 +-
 BaseTools/Source/Python/FMMT/core/FMMTParser.py                                         |   30 +-
 BaseTools/Source/Python/FMMT/core/FvHandler.py                                          |  201 +-
 BaseTools/Source/Python/FMMT/core/GuidTools.py                                          |   47 +-
 BaseTools/Source/Python/FMMT/utils/FmmtLogger.py                                        |   10 +-
 BaseTools/Source/Python/FMMT/utils/FvLayoutPrint.py                                     |   29 +-
 BaseTools/Source/Python/FirmwareStorageFormat/Common.py                                 |   20 +-
 BaseTools/Source/Python/FirmwareStorageFormat/FfsFileHeader.py                          |    5 +-
 BaseTools/Source/Python/FirmwareStorageFormat/FvHeader.py                               |   31 +-
 BaseTools/Source/Python/FirmwareStorageFormat/SectionHeader.py                          |    9 +-
 BaseTools/Source/Python/FirmwareStorageFormat/__init__.py                               |    4 +-
 BaseTools/Source/Python/GenFds/AprioriSection.py                                        |   48 +-
 BaseTools/Source/Python/GenFds/Capsule.py                                               |  103 +-
 BaseTools/Source/Python/GenFds/CapsuleData.py                                           |  111 +-
 BaseTools/Source/Python/GenFds/CompressSection.py                                       |   42 +-
 BaseTools/Source/Python/GenFds/DataSection.py                                           |   63 +-
 BaseTools/Source/Python/GenFds/DepexSection.py                                          |   40 +-
 BaseTools/Source/Python/GenFds/EfiSection.py                                            |  167 +-
 BaseTools/Source/Python/GenFds/Fd.py                                                    |   80 +-
 BaseTools/Source/Python/GenFds/FdfParser.py                                             | 1680 +++--
 BaseTools/Source/Python/GenFds/Ffs.py                                                   |   58 +-
 BaseTools/Source/Python/GenFds/FfsFileStatement.py                                      |   67 +-
 BaseTools/Source/Python/GenFds/FfsInfStatement.py                                       |  572 +-
 BaseTools/Source/Python/GenFds/Fv.py                                                    |  240 +-
 BaseTools/Source/Python/GenFds/FvImageSection.py                                        |   77 +-
 BaseTools/Source/Python/GenFds/GenFds.py                                                |  353 +-
 BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py                                  |  355 +-
 BaseTools/Source/Python/GenFds/GuidSection.py                                           |   90 +-
 BaseTools/Source/Python/GenFds/OptRomFileStatement.py                                   |   16 +-
 BaseTools/Source/Python/GenFds/OptRomInfStatement.py                                    |   56 +-
 BaseTools/Source/Python/GenFds/OptionRom.py                                             |   46 +-
 BaseTools/Source/Python/GenFds/Region.py                                                |  113 +-
 BaseTools/Source/Python/GenFds/Rule.py                                                  |    8 +-
 BaseTools/Source/Python/GenFds/RuleComplexFile.py                                       |   12 +-
 BaseTools/Source/Python/GenFds/RuleSimpleFile.py                                        |   10 +-
 BaseTools/Source/Python/GenFds/Section.py                                               |  121 +-
 BaseTools/Source/Python/GenFds/UiSection.py                                             |   23 +-
 BaseTools/Source/Python/GenFds/VerSection.py                                            |   15 +-
 BaseTools/Source/Python/GenFds/__init__.py                                              |    2 +-
 BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py                            |   61 +-
 BaseTools/Source/Python/GenPatchPcdTable/__init__.py                                    |    2 +-
 BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py                                  |   49 +-
 BaseTools/Source/Python/PatchPcdValue/__init__.py                                       |    2 +-
 BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py                                          |  408 +-
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py                  |  259 +-
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py                          |  350 +-
 BaseTools/Source/Python/Split/Split.py                                                  |   17 +-
 BaseTools/Source/Python/Table/Table.py                                                  |   25 +-
 BaseTools/Source/Python/Table/TableDataModel.py                                         |   17 +-
 BaseTools/Source/Python/Table/TableDec.py                                               |   15 +-
 BaseTools/Source/Python/Table/TableDsc.py                                               |   15 +-
 BaseTools/Source/Python/Table/TableEotReport.py                                         |   19 +-
 BaseTools/Source/Python/Table/TableFdf.py                                               |   15 +-
 BaseTools/Source/Python/Table/TableFile.py                                              |   26 +-
 BaseTools/Source/Python/Table/TableFunction.py                                          |   15 +-
 BaseTools/Source/Python/Table/TableIdentifier.py                                        |   15 +-
 BaseTools/Source/Python/Table/TableInf.py                                               |   15 +-
 BaseTools/Source/Python/Table/TablePcd.py                                               |   15 +-
 BaseTools/Source/Python/Table/TableQuery.py                                             |   12 +-
 BaseTools/Source/Python/Table/TableReport.py                                            |   37 +-
 BaseTools/Source/Python/Table/__init__.py                                               |    2 +-
 BaseTools/Source/Python/TargetTool/TargetTool.py                                        |  107 +-
 BaseTools/Source/Python/TargetTool/__init__.py                                          |    2 +-
 BaseTools/Source/Python/Trim/Trim.py                                                    |  224 +-
 BaseTools/Source/Python/UPT/BuildVersion.py                                             |    2 +-
 BaseTools/Source/Python/UPT/Core/DependencyRules.py                                     |   82 +-
 BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py                            |   78 +-
 BaseTools/Source/Python/UPT/Core/FileHook.py                                            |   50 +-
 BaseTools/Source/Python/UPT/Core/IpiDb.py                                               |  230 +-
 BaseTools/Source/Python/UPT/Core/PackageFile.py                                         |   61 +-
 BaseTools/Source/Python/UPT/Core/__init__.py                                            |    2 +-
 BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py                                   |  149 +-
 BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py                                   |  181 +-
 BaseTools/Source/Python/UPT/GenMetaFile/GenMetaFileMisc.py                              |   40 +-
 BaseTools/Source/Python/UPT/GenMetaFile/GenXmlFile.py                                   |    2 +-
 BaseTools/Source/Python/UPT/GenMetaFile/__init__.py                                     |    2 +-
 BaseTools/Source/Python/UPT/InstallPkg.py                                               |  295 +-
 BaseTools/Source/Python/UPT/InventoryWs.py                                              |   45 +-
 BaseTools/Source/Python/UPT/Library/CommentGenerating.py                                |   88 +-
 BaseTools/Source/Python/UPT/Library/CommentParsing.py                                   |  177 +-
 BaseTools/Source/Python/UPT/Library/DataType.py                                         |  379 +-
 BaseTools/Source/Python/UPT/Library/ExpressionValidate.py                               |  147 +-
 BaseTools/Source/Python/UPT/Library/GlobalData.py                                       |    2 +-
 BaseTools/Source/Python/UPT/Library/Misc.py                                             |  227 +-
 BaseTools/Source/Python/UPT/Library/ParserValidate.py                                   |  130 +-
 BaseTools/Source/Python/UPT/Library/Parsing.py                                          |  363 +-
 BaseTools/Source/Python/UPT/Library/StringUtils.py                                      |  203 +-
 BaseTools/Source/Python/UPT/Library/UniClassObject.py                                   |  447 +-
 BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py                                  |   37 +-
 BaseTools/Source/Python/UPT/Library/Xml/__init__.py                                     |    2 +-
 BaseTools/Source/Python/UPT/Library/__init__.py                                         |    2 +-
 BaseTools/Source/Python/UPT/Logger/Log.py                                               |   97 +-
 BaseTools/Source/Python/UPT/Logger/StringTable.py                                       |  933 +--
 BaseTools/Source/Python/UPT/Logger/ToolError.py                                         |  117 +-
 BaseTools/Source/Python/UPT/Logger/__init__.py                                          |    2 +-
 BaseTools/Source/Python/UPT/MkPkg.py                                                    |   73 +-
 BaseTools/Source/Python/UPT/Object/POM/CommonObject.py                                  |   93 +-
 BaseTools/Source/Python/UPT/Object/POM/ModuleObject.py                                  |   35 +-
 BaseTools/Source/Python/UPT/Object/POM/PackageObject.py                                 |   13 +-
 BaseTools/Source/Python/UPT/Object/POM/__init__.py                                      |    2 +-
 BaseTools/Source/Python/UPT/Object/Parser/DecObject.py                                  |  186 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py                            |  138 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfBuildOptionObject.py                       |   15 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfCommonObject.py                            |   46 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfDefineCommonObject.py                      |   36 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py                            |  334 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfDepexObject.py                             |   21 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py                              |   51 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfHeaderObject.py                            |   34 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py                    |   35 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py                                    |   23 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py                          |   35 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py                               |  116 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py                               |   46 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py                          |   42 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py                            |   66 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py                     |   24 +-
 BaseTools/Source/Python/UPT/Object/Parser/__init__.py                                   |    2 +-
 BaseTools/Source/Python/UPT/Object/__init__.py                                          |    2 +-
 BaseTools/Source/Python/UPT/Parser/DecParser.py                                         |  280 +-
 BaseTools/Source/Python/UPT/Parser/DecParserMisc.py                                     |   56 +-
 BaseTools/Source/Python/UPT/Parser/InfAsBuiltProcess.py                                 |   47 +-
 BaseTools/Source/Python/UPT/Parser/InfBinarySectionParser.py                            |   44 +-
 BaseTools/Source/Python/UPT/Parser/InfBuildOptionSectionParser.py                       |   53 +-
 BaseTools/Source/Python/UPT/Parser/InfDefineSectionParser.py                            |   45 +-
 BaseTools/Source/Python/UPT/Parser/InfDepexSectionParser.py                             |   15 +-
 BaseTools/Source/Python/UPT/Parser/InfGuidPpiProtocolSectionParser.py                   |   73 +-
 BaseTools/Source/Python/UPT/Parser/InfLibrarySectionParser.py                           |   38 +-
 BaseTools/Source/Python/UPT/Parser/InfPackageSectionParser.py                           |   34 +-
 BaseTools/Source/Python/UPT/Parser/InfParser.py                                         |  164 +-
 BaseTools/Source/Python/UPT/Parser/InfParserMisc.py                                     |  122 +-
 BaseTools/Source/Python/UPT/Parser/InfPcdSectionParser.py                               |   45 +-
 BaseTools/Source/Python/UPT/Parser/InfSectionParser.py                                  |   94 +-
 BaseTools/Source/Python/UPT/Parser/InfSourceSectionParser.py                            |   39 +-
 BaseTools/Source/Python/UPT/Parser/__init__.py                                          |    2 +-
 BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py                               |  236 +-
 BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py                               |  183 +-
 BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py                           |   35 +-
 BaseTools/Source/Python/UPT/PomAdapter/__init__.py                                      |    2 +-
 BaseTools/Source/Python/UPT/ReplacePkg.py                                               |   60 +-
 BaseTools/Source/Python/UPT/RmPkg.py                                                    |   66 +-
 BaseTools/Source/Python/UPT/TestInstall.py                                              |   27 +-
 BaseTools/Source/Python/UPT/UPT.py                                                      |  150 +-
 BaseTools/Source/Python/UPT/UnitTest/CommentGeneratingUnitTest.py                       |   86 +-
 BaseTools/Source/Python/UPT/UnitTest/CommentParsingUnitTest.py                          |   91 +-
 BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py                                   |   23 +-
 BaseTools/Source/Python/UPT/UnitTest/DecParserUnitTest.py                               |  118 +-
 BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py                            |  131 +-
 BaseTools/Source/Python/UPT/Xml/CommonXml.py                                            |  265 +-
 BaseTools/Source/Python/UPT/Xml/GuidProtocolPpiXml.py                                   |  139 +-
 BaseTools/Source/Python/UPT/Xml/IniToXml.py                                             |  152 +-
 BaseTools/Source/Python/UPT/Xml/ModuleSurfaceAreaXml.py                                 |  143 +-
 BaseTools/Source/Python/UPT/Xml/PackageSurfaceAreaXml.py                                |   87 +-
 BaseTools/Source/Python/UPT/Xml/PcdXml.py                                               |  141 +-
 BaseTools/Source/Python/UPT/Xml/XmlParser.py                                            |  363 +-
 BaseTools/Source/Python/UPT/Xml/XmlParserMisc.py                                        |   22 +-
 BaseTools/Source/Python/UPT/Xml/__init__.py                                             |    2 +-
 BaseTools/Source/Python/Workspace/BuildClassObject.py                                   |  303 +-
 BaseTools/Source/Python/Workspace/DecBuildData.py                                       |  184 +-
 BaseTools/Source/Python/Workspace/DscBuildData.py                                       | 2007 +++--
 BaseTools/Source/Python/Workspace/InfBuildData.py                                       |  426 +-
 BaseTools/Source/Python/Workspace/MetaDataTable.py                                      |   98 +-
 BaseTools/Source/Python/Workspace/MetaFileCommentParser.py                              |   21 +-
 BaseTools/Source/Python/Workspace/MetaFileParser.py                                     |  885 ++-
 BaseTools/Source/Python/Workspace/MetaFileTable.py                                      |  217 +-
 BaseTools/Source/Python/Workspace/WorkspaceCommon.py                                    |   60 +-
 BaseTools/Source/Python/Workspace/WorkspaceDatabase.py                                  |   67 +-
 BaseTools/Source/Python/Workspace/__init__.py                                           |    2 +-
 BaseTools/Source/Python/build/BuildReport.py                                            |  766 +-
 BaseTools/Source/Python/build/__init__.py                                               |    2 +-
 BaseTools/Source/Python/build/build.py                                                  | 1145 +--
 BaseTools/Source/Python/build/buildoptions.py                                           |  113 +-
 BaseTools/Source/Python/sitecustomize.py                                                |   11 +-
 BaseTools/Source/Python/tests/Split/test_split.py                                       |   37 +-
 BaseTools/Tests/CToolsTests.py                                                          |    6 +-
 BaseTools/Tests/CheckPythonSyntax.py                                                    |   15 +-
 BaseTools/Tests/CheckUnicodeSourceFiles.py                                              |    4 +-
 BaseTools/Tests/PythonTest.py                                                           |    2 +-
 BaseTools/Tests/PythonToolsTests.py                                                     |    4 +-
 BaseTools/Tests/RunTests.py                                                             |    7 +-
 BaseTools/Tests/TestRegularExpression.py                                                |    7 +-
 BaseTools/Tests/TestTools.py                                                            |   54 +-
 BaseTools/Tests/TianoCompress.py                                                        |   15 +-
 335 files changed, 35765 insertions(+), 32705 deletions(-)
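For reviewers who want to spot-check the result locally, here is a minimal sketch that re-runs the PEP8 fixes over the tree using autopep8's Python API. The options shown are an assumption; this patch does not state which autopep8 settings were used, so the output may differ in detail.

    # Minimal sketch: re-apply PEP8 fixes to BaseTools with autopep8.
    # Assumption: default autopep8 options (the exact options used for this
    # patch are not stated), so the result may not match byte-for-byte.
    import autopep8
    from pathlib import Path

    for py_file in Path("BaseTools").rglob("*.py"):
        original = py_file.read_text(encoding="utf-8")
        fixed = autopep8.fix_code(original)  # returns the reformatted source
        if fixed != original:
            py_file.write_text(fixed, encoding="utf-8")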

diff --git a/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py b/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
index 3035732d5c81..cd0577e22c0e 100755
--- a/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
+++ b/BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py
@@ -26,22 +26,27 @@ import pipes
 # Convert using cygpath command line tool
 # Currently not used, but just in case we need it in the future
 #
+
+
 def ConvertCygPathToDosViacygpath(CygPath):
-  p = subprocess.Popen("cygpath -m " + pipes.quote(CygPath), shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, close_fds=True)
-  return p.stdout.read().strip()
+    p = subprocess.Popen("cygpath -m " + pipes.quote(CygPath), shell=True,
+                         stdout=subprocess.PIPE, stderr=subprocess.STDOUT, close_fds=True)
+    return p.stdout.read().strip()
 
 #
 #
 #
+
+
 def ConvertCygPathToDos(CygPath):
-  if CygPath.find("/cygdrive/") == 0:
-    # convert /cygdrive/c/Xyz to c:/Xyz
-    DosPath = CygPath[10] + ':' + CygPath[11:]
-  else:
-    DosPath = CygPath
+    if CygPath.find("/cygdrive/") == 0:
+        # convert /cygdrive/c/Xyz to c:/Xyz
+        DosPath = CygPath[10] + ':' + CygPath[11:]
+    else:
+        DosPath = CygPath
 
-  # pipes.quote will add the extra \\ for us.
-  return DosPath.replace('/', '\\')
+    # pipes.quote will add the extra \\ for us.
+    return DosPath.replace('/', '\\')
 
 
 # we receive our options as a list, but we will be passing them to the shell as a line
@@ -50,38 +55,37 @@ def ConvertCygPathToDos(CygPath):
 # if you don't use the shell you don't get a PATH search.
 def main(argv):
 
-  # use 1st argument as name of tool to call
-  Command = pipes.quote(sys.argv[1]);
+    # use 1st argument as name of tool to call
+    Command = pipes.quote(sys.argv[1])
 
-  ExceptionList = ["/interwork"]
+    ExceptionList = ["/interwork"]
 
-  for arg in argv:
-    if arg.find('/') == -1:
-      # if we don't need to convert just add to the command line
-      Command = Command + ' ' + pipes.quote(arg)
-    elif arg in ExceptionList:
-      # if it is in the list, then don't do a cygpath
-      # assembler stuff after --apcs has the /.
-      Command = Command + ' ' + pipes.quote(arg)
-    else:
-      if ((arg[0] == '-') and (arg[1] == 'I' or arg[1] == 'i')):
-        CygPath = arg[0] + arg[1] + ConvertCygPathToDos(arg[2:])
-      else:
-        CygPath = ConvertCygPathToDos(arg)
+    for arg in argv:
+        if arg.find('/') == -1:
+            # if we don't need to convert just add to the command line
+            Command = Command + ' ' + pipes.quote(arg)
+        elif arg in ExceptionList:
+            # if it is in the list, then don't do a cygpath
+            # assembler stuff after --apcs has the /.
+            Command = Command + ' ' + pipes.quote(arg)
+        else:
+            if ((arg[0] == '-') and (arg[1] == 'I' or arg[1] == 'i')):
+                CygPath = arg[0] + arg[1] + ConvertCygPathToDos(arg[2:])
+            else:
+                CygPath = ConvertCygPathToDos(arg)
 
-      Command = Command + ' ' + pipes.quote(CygPath)
+            Command = Command + ' ' + pipes.quote(CygPath)
 
-  # call the real tool with the converted paths
-  return subprocess.call(Command, shell=True)
+    # call the real tool with the converted paths
+    return subprocess.call(Command, shell=True)
 
 
 if __name__ == "__main__":
-  try:
-     ret = main(sys.argv[2:])
+    try:
+        ret = main(sys.argv[2:])
 
-  except:
-    print("exiting: exception from " + sys.argv[0])
-    ret = 2
-
-  sys.exit(ret)
+    except:
+        print("exiting: exception from " + sys.argv[0])
+        ret = 2
 
+    sys.exit(ret)
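For context, the converter reformatted above behaves as follows on illustrative inputs (example values, not taken from the patch):

    # Illustrative behaviour of ConvertCygPathToDos (example inputs only):
    ConvertCygPathToDos("/cygdrive/c/Xyz")  # -> 'c:\\Xyz' (drive letter restored)
    ConvertCygPathToDos("Sources/Foo.c")    # -> 'Sources\\Foo.c' (slashes flipped only)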
diff --git a/BaseTools/Edk2ToolsBuild.py b/BaseTools/Edk2ToolsBuild.py
index f862468ce275..f719924c2c0e 100644
--- a/BaseTools/Edk2ToolsBuild.py
+++ b/BaseTools/Edk2ToolsBuild.py
@@ -143,7 +143,8 @@ class Edk2ToolsBuild(BaseAbstractInvocable):
 
         elif self.tool_chain_tag.lower().startswith("gcc"):
             cpu_count = self.GetCpuThreads()
-            ret = RunCmd("make", f"-C .  -j {cpu_count}", workingdir=shell_env.get_shell_var("EDK_TOOLS_PATH"))
+            ret = RunCmd(
+                "make", f"-C .  -j {cpu_count}", workingdir=shell_env.get_shell_var("EDK_TOOLS_PATH"))
             if ret != 0:
                 raise Exception("Failed to build.")
 
@@ -168,7 +169,6 @@ class Edk2ToolsBuild(BaseAbstractInvocable):
         return cpus
 
 
-
 def main():
     Edk2ToolsBuild().Invoke()
 
diff --git a/BaseTools/Plugin/BuildToolsReport/BuildToolsReportGenerator.py b/BaseTools/Plugin/BuildToolsReport/BuildToolsReportGenerator.py
index 9f86b1c35885..5c826a9e7104 100644
--- a/BaseTools/Plugin/BuildToolsReport/BuildToolsReportGenerator.py
+++ b/BaseTools/Plugin/BuildToolsReport/BuildToolsReportGenerator.py
@@ -14,16 +14,19 @@ try:
             try:
                 from edk2toolext.environment import version_aggregator
             except ImportError:
-                logging.critical("Loading BuildToolsReportGenerator failed, please update your Edk2-PyTool-Extensions")
+                logging.critical(
+                    "Loading BuildToolsReportGenerator failed, please update your Edk2-PyTool-Extensions")
                 return 0
 
-            OutputReport = os.path.join(thebuilder.env.GetValue("BUILD_OUTPUT_BASE"), "BUILD_TOOLS_REPORT")
+            OutputReport = os.path.join(thebuilder.env.GetValue(
+                "BUILD_OUTPUT_BASE"), "BUILD_TOOLS_REPORT")
             OutputReport = os.path.normpath(OutputReport)
             if not os.path.isdir(os.path.dirname(OutputReport)):
                 os.makedirs(os.path.dirname(OutputReport))
 
             Report = BuildToolsReport()
-            Report.MakeReport(version_aggregator.GetVersionAggregator().GetAggregatedVersionInformation(), OutputReport=OutputReport)
+            Report.MakeReport(version_aggregator.GetVersionAggregator(
+            ).GetAggregatedVersionInformation(), OutputReport=OutputReport)
 
         def do_pre_build(self, thebuilder):
             self.do_report(thebuilder)
@@ -55,11 +58,13 @@ class BuildToolsReport(object):
 
         htmlfile = open(OutputReport + ".html", "w")
         jsonfile = open(OutputReport + ".json", "w")
-        template = open(os.path.join(BuildToolsReport.MY_FOLDER, "BuildToolsReport_Template.html"), "r")
+        template = open(os.path.join(BuildToolsReport.MY_FOLDER,
+                        "BuildToolsReport_Template.html"), "r")
 
         for line in template.readlines():
             if "%TO_BE_FILLED_IN_BY_PYTHON_SCRIPT%" in line:
-                line = line.replace("%TO_BE_FILLED_IN_BY_PYTHON_SCRIPT%", json.dumps(json_dict))
+                line = line.replace(
+                    "%TO_BE_FILLED_IN_BY_PYTHON_SCRIPT%", json.dumps(json_dict))
             htmlfile.write(line)
 
         jsonfile.write(json.dumps(versions_list, indent=4))
diff --git a/BaseTools/Plugin/LinuxGcc5ToolChain/LinuxGcc5ToolChain.py b/BaseTools/Plugin/LinuxGcc5ToolChain/LinuxGcc5ToolChain.py
index f0685d804029..8d5b0e0d3fba 100644
--- a/BaseTools/Plugin/LinuxGcc5ToolChain/LinuxGcc5ToolChain.py
+++ b/BaseTools/Plugin/LinuxGcc5ToolChain/LinuxGcc5ToolChain.py
@@ -57,7 +57,8 @@ class LinuxGcc5ToolChain(IUefiBuildPlugin):
                 return 0
 
             # make GCC5_ARM_PREFIX to align with tools_def.txt
-            prefix = os.path.join(install_path, "bin", "arm-none-linux-gnueabihf-")
+            prefix = os.path.join(install_path, "bin",
+                                  "arm-none-linux-gnueabihf-")
             shell_environment.GetEnvironment().set_shell_var("GCC5_ARM_PREFIX", prefix)
 
         # now confirm it exists
@@ -80,7 +81,8 @@ class LinuxGcc5ToolChain(IUefiBuildPlugin):
                 return 0
 
             # make GCC5_AARCH64_PREFIX to align with tools_def.txt
-            prefix = os.path.join(install_path, "bin", "aarch64-none-linux-gnu-")
+            prefix = os.path.join(install_path, "bin",
+                                  "aarch64-none-linux-gnu-")
             shell_environment.GetEnvironment().set_shell_var("GCC5_AARCH64_PREFIX", prefix)
 
         # now confirm it exists
diff --git a/BaseTools/Plugin/WindowsResourceCompiler/WinRcPath.py b/BaseTools/Plugin/WindowsResourceCompiler/WinRcPath.py
index ec2f2d1298f9..f6da2312f4bd 100644
--- a/BaseTools/Plugin/WindowsResourceCompiler/WinRcPath.py
+++ b/BaseTools/Plugin/WindowsResourceCompiler/WinRcPath.py
@@ -1,4 +1,4 @@
-## @file WinRcPath.py
+# @file WinRcPath.py
 # Plugin to find Windows SDK Resource Compiler rc.exe
 ##
 # This plugin works in conjuncture with the tools_def to support rc.exe
@@ -12,18 +12,20 @@ import edk2toollib.windows.locate_tools as locate_tools
 from edk2toolext.environment import shell_environment
 from edk2toolext.environment import version_aggregator
 
+
 class WinRcPath(IUefiBuildPlugin):
 
     def do_post_build(self, thebuilder):
         return 0
 
     def do_pre_build(self, thebuilder):
-        #get the locate tools module
+        # get the locate tools module
         path = locate_tools.FindToolInWinSdk("rc.exe")
         if path is None:
             thebuilder.logging.warning("Failed to find rc.exe")
         else:
             p = os.path.abspath(os.path.dirname(path))
             shell_environment.GetEnvironment().set_shell_var("WINSDK_PATH_FOR_RC_EXE", p)
-            version_aggregator.GetVersionAggregator().ReportVersion("WINSDK_PATH_FOR_RC_EXE", p, version_aggregator.VersionTypes.INFO)
+            version_aggregator.GetVersionAggregator().ReportVersion(
+                "WINSDK_PATH_FOR_RC_EXE", p, version_aggregator.VersionTypes.INFO)
         return 0
diff --git a/BaseTools/Scripts/BinToPcd.py b/BaseTools/Scripts/BinToPcd.py
index 3bc557b8412c..0e82a349d26d 100644
--- a/BaseTools/Scripts/BinToPcd.py
+++ b/BaseTools/Scripts/BinToPcd.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Convert a binary file to a VOID* PCD value or DSC file VOID* PCD statement.
 #
 # Copyright (c) 2016 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -18,89 +18,93 @@ import xdrlib
 #
 # Globals for help information
 #
-__prog__        = 'BinToPcd'
-__copyright__   = 'Copyright (c) 2016 - 2018, Intel Corporation. All rights reserved.'
+__prog__ = 'BinToPcd'
+__copyright__ = 'Copyright (c) 2016 - 2018, Intel Corporation. All rights reserved.'
 __description__ = 'Convert one or more binary files to a VOID* PCD value or DSC file VOID* PCD statement.\n'
 
 if __name__ == '__main__':
-    def ValidateUnsignedInteger (Argument):
+    def ValidateUnsignedInteger(Argument):
         try:
-            Value = int (Argument, 0)
+            Value = int(Argument, 0)
         except:
-            Message = '{Argument} is not a valid integer value.'.format (Argument = Argument)
-            raise argparse.ArgumentTypeError (Message)
+            Message = '{Argument} is not a valid integer value.'.format(
+                Argument=Argument)
+            raise argparse.ArgumentTypeError(Message)
         if Value < 0:
-            Message = '{Argument} is a negative value.'.format (Argument = Argument)
-            raise argparse.ArgumentTypeError (Message)
+            Message = '{Argument} is a negative value.'.format(
+                Argument=Argument)
+            raise argparse.ArgumentTypeError(Message)
         return Value
 
-    def ValidatePcdName (Argument):
-        if re.split ('[a-zA-Z\_][a-zA-Z0-9\_]*\.[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['', '']:
-            Message = '{Argument} is not in the form <PcdTokenSpaceGuidCName>.<PcdCName>'.format (Argument = Argument)
-            raise argparse.ArgumentTypeError (Message)
+    def ValidatePcdName(Argument):
+        if re.split('[a-zA-Z\_][a-zA-Z0-9\_]*\.[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['', '']:
+            Message = '{Argument} is not in the form <PcdTokenSpaceGuidCName>.<PcdCName>'.format(
+                Argument=Argument)
+            raise argparse.ArgumentTypeError(Message)
         return Argument
 
-    def ValidateGuidName (Argument):
-        if re.split ('[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['', '']:
-            Message = '{Argument} is not a valid GUID C name'.format (Argument = Argument)
-            raise argparse.ArgumentTypeError (Message)
+    def ValidateGuidName(Argument):
+        if re.split('[a-zA-Z\_][a-zA-Z0-9\_]*', Argument) != ['', '']:
+            Message = '{Argument} is not a valid GUID C name'.format(
+                Argument=Argument)
+            raise argparse.ArgumentTypeError(Message)
         return Argument
 
-    def ByteArray (Buffer, Xdr = False):
+    def ByteArray(Buffer, Xdr=False):
         if Xdr:
             #
             # If Xdr flag is set then encode data using the Variable-Length Opaque
             # Data format of RFC 4506 External Data Representation Standard (XDR).
             #
-            XdrEncoder = xdrlib.Packer ()
+            XdrEncoder = xdrlib.Packer()
             for Item in Buffer:
-                XdrEncoder.pack_bytes (Item)
-            Buffer = bytearray (XdrEncoder.get_buffer ())
+                XdrEncoder.pack_bytes(Item)
+            Buffer = bytearray(XdrEncoder.get_buffer())
         else:
             #
             # If Xdr flag is not set, then concatenate all the data
             #
-            Buffer = bytearray (b''.join (Buffer))
+            Buffer = bytearray(b''.join(Buffer))
         #
         # Return a PCD value of the form '{0x01, 0x02, ...}' along with the PCD length in bytes
         #
-        return '{' + (', '.join (['0x{Byte:02X}'.format (Byte = Item) for Item in Buffer])) + '}', len (Buffer)
+        return '{' + (', '.join(['0x{Byte:02X}'.format(Byte=Item) for Item in Buffer])) + '}', len(Buffer)
 
     #
     # Create command line argument parser object
     #
-    parser = argparse.ArgumentParser (prog = __prog__,
-                                      description = __description__ + __copyright__,
-                                      conflict_handler = 'resolve')
-    parser.add_argument ("-i", "--input", dest = 'InputFile', type = argparse.FileType ('rb'), action='append', required = True,
-                         help = "Input binary filename.  Multiple input files are combined into a single PCD.")
-    parser.add_argument ("-o", "--output", dest = 'OutputFile', type = argparse.FileType ('w'),
-                         help = "Output filename for PCD value or PCD statement")
-    parser.add_argument ("-p", "--pcd", dest = 'PcdName', type = ValidatePcdName,
-                         help = "Name of the PCD in the form <PcdTokenSpaceGuidCName>.<PcdCName>")
-    parser.add_argument ("-t", "--type", dest = 'PcdType', default = None, choices = ['VPD', 'HII'],
-                         help = "PCD statement type (HII or VPD).  Default is standard.")
-    parser.add_argument ("-m", "--max-size", dest = 'MaxSize', type = ValidateUnsignedInteger,
-                         help = "Maximum size of the PCD.  Ignored with --type HII.")
-    parser.add_argument ("-f", "--offset", dest = 'Offset', type = ValidateUnsignedInteger,
-                         help = "VPD offset if --type is VPD.  UEFI Variable offset if --type is HII.  Must be 8-byte aligned.")
-    parser.add_argument ("-n", "--variable-name", dest = 'VariableName',
-                         help = "UEFI variable name.  Only used with --type HII.")
-    parser.add_argument ("-g", "--variable-guid", type = ValidateGuidName, dest = 'VariableGuid',
-                         help = "UEFI variable GUID C name.  Only used with --type HII.")
-    parser.add_argument ("-x", "--xdr", dest = 'Xdr', action = "store_true",
-                         help = "Encode PCD using the Variable-Length Opaque Data format of RFC 4506 External Data Representation Standard (XDR)")
-    parser.add_argument ("-v", "--verbose", dest = 'Verbose', action = "store_true",
-                         help = "Increase output messages")
-    parser.add_argument ("-q", "--quiet", dest = 'Quiet', action = "store_true",
-                         help = "Reduce output messages")
-    parser.add_argument ("--debug", dest = 'Debug', type = int, metavar = '[0-9]', choices = range (0, 10), default = 0,
-                         help = "Set debug level")
+    parser = argparse.ArgumentParser(prog=__prog__,
+                                     description=__description__ + __copyright__,
+                                     conflict_handler='resolve')
+    parser.add_argument("-i", "--input", dest='InputFile', type=argparse.FileType('rb'), action='append', required=True,
+                        help="Input binary filename.  Multiple input files are combined into a single PCD.")
+    parser.add_argument("-o", "--output", dest='OutputFile', type=argparse.FileType('w'),
+                        help="Output filename for PCD value or PCD statement")
+    parser.add_argument("-p", "--pcd", dest='PcdName', type=ValidatePcdName,
+                        help="Name of the PCD in the form <PcdTokenSpaceGuidCName>.<PcdCName>")
+    parser.add_argument("-t", "--type", dest='PcdType', default=None, choices=['VPD', 'HII'],
+                        help="PCD statement type (HII or VPD).  Default is standard.")
+    parser.add_argument("-m", "--max-size", dest='MaxSize', type=ValidateUnsignedInteger,
+                        help="Maximum size of the PCD.  Ignored with --type HII.")
+    parser.add_argument("-f", "--offset", dest='Offset', type=ValidateUnsignedInteger,
+                        help="VPD offset if --type is VPD.  UEFI Variable offset if --type is HII.  Must be 8-byte aligned.")
+    parser.add_argument("-n", "--variable-name", dest='VariableName',
+                        help="UEFI variable name.  Only used with --type HII.")
+    parser.add_argument("-g", "--variable-guid", type=ValidateGuidName, dest='VariableGuid',
+                        help="UEFI variable GUID C name.  Only used with --type HII.")
+    parser.add_argument("-x", "--xdr", dest='Xdr', action="store_true",
+                        help="Encode PCD using the Variable-Length Opaque Data format of RFC 4506 External Data Representation Standard (XDR)")
+    parser.add_argument("-v", "--verbose", dest='Verbose', action="store_true",
+                        help="Increase output messages")
+    parser.add_argument("-q", "--quiet", dest='Quiet', action="store_true",
+                        help="Reduce output messages")
+    parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=range(0, 10), default=0,
+                        help="Set debug level")
 
     #
     # Parse command line arguments
     #
-    args = parser.parse_args ()
+    args = parser.parse_args()
 
     #
     # Read all binary input files
@@ -108,17 +112,18 @@ if __name__ == '__main__':
     Buffer = []
     for File in args.InputFile:
         try:
-            Buffer.append (File.read ())
-            File.close ()
+            Buffer.append(File.read())
+            File.close()
         except:
-            print ('BinToPcd: error: can not read binary input file {File}'.format (File = File))
-            sys.exit (1)
+            print(
+                'BinToPcd: error: can not read binary input file {File}'.format(File=File))
+            sys.exit(1)
 
     #
     # Convert PCD to an encoded string of hex values and determine the size of
     # the encoded PCD in bytes.
     #
-    PcdValue, PcdSize = ByteArray (Buffer, args.Xdr)
+    PcdValue, PcdSize = ByteArray(Buffer, args.Xdr)
 
     #
     # Convert binary buffer to a DSC file PCD statement
@@ -129,7 +134,7 @@ if __name__ == '__main__':
         #
         Pcd = PcdValue
         if args.Verbose:
-            print ('BinToPcd: Convert binary file to PCD Value')
+            print('BinToPcd: Convert binary file to PCD Value')
     elif args.PcdType is None:
         #
         # If --type is neither VPD nor HII, then use PCD statement syntax that is
@@ -141,19 +146,21 @@ if __name__ == '__main__':
             # If --max-size is not provided, then do not generate the syntax that
             # includes the maximum size.
             #
-            Pcd = '  {Name}|{Value}'.format (Name = args.PcdName, Value = PcdValue)
+            Pcd = '  {Name}|{Value}'.format(Name=args.PcdName, Value=PcdValue)
         elif args.MaxSize < PcdSize:
-            print ('BinToPcd: error: argument --max-size is smaller than input file.')
-            sys.exit (1)
+            print('BinToPcd: error: argument --max-size is smaller than input file.')
+            sys.exit(1)
         else:
-            Pcd = '  {Name}|{Value}|VOID*|{Size}'.format (Name = args.PcdName, Value = PcdValue, Size = args.MaxSize)
+            Pcd = '  {Name}|{Value}|VOID*|{Size}'.format(
+                Name=args.PcdName, Value=PcdValue, Size=args.MaxSize)
 
         if args.Verbose:
-            print ('BinToPcd: Convert binary file to PCD statement compatible with PCD sections:')
-            print ('    [PcdsFixedAtBuild]')
-            print ('    [PcdsPatchableInModule]')
-            print ('    [PcdsDynamicDefault]')
-            print ('    [PcdsDynamicExDefault]')
+            print(
+                'BinToPcd: Convert binary file to PCD statement compatible with PCD sections:')
+            print('    [PcdsFixedAtBuild]')
+            print('    [PcdsPatchableInModule]')
+            print('    [PcdsDynamicDefault]')
+            print('    [PcdsDynamicExDefault]')
     elif args.PcdType == 'VPD':
         if args.MaxSize is None:
             #
@@ -162,33 +169,37 @@ if __name__ == '__main__':
             #
             args.MaxSize = PcdSize
         if args.MaxSize < PcdSize:
-            print ('BinToPcd: error: argument --max-size is smaller than input file.')
-            sys.exit (1)
+            print('BinToPcd: error: argument --max-size is smaller than input file.')
+            sys.exit(1)
         if args.Offset is None:
             #
             # if --offset is not provided, then set offset field to '*' so build
             # tools will compute offset of PCD in VPD region.
             #
-            Pcd = '  {Name}|*|{Size}|{Value}'.format (Name = args.PcdName, Size = args.MaxSize, Value = PcdValue)
+            Pcd = '  {Name}|*|{Size}|{Value}'.format(
+                Name=args.PcdName, Size=args.MaxSize, Value=PcdValue)
         else:
             #
             # --offset value must be 8-byte aligned
             #
             if (args.Offset % 8) != 0:
-                print ('BinToPcd: error: argument --offset must be 8-byte aligned.')
-                sys.exit (1)
+                print('BinToPcd: error: argument --offset must be 8-byte aligned.')
+                sys.exit(1)
             #
             # Use the --offset value provided.
             #
-            Pcd = '  {Name}|{Offset}|{Size}|{Value}'.format (Name = args.PcdName, Offset = args.Offset, Size = args.MaxSize, Value = PcdValue)
+            Pcd = '  {Name}|{Offset}|{Size}|{Value}'.format(
+                Name=args.PcdName, Offset=args.Offset, Size=args.MaxSize, Value=PcdValue)
         if args.Verbose:
-            print ('BinToPcd: Convert binary file to PCD statement compatible with PCD sections')
-            print ('    [PcdsDynamicVpd]')
-            print ('    [PcdsDynamicExVpd]')
+            print(
+                'BinToPcd: Convert binary file to PCD statement compatible with PCD sections')
+            print('    [PcdsDynamicVpd]')
+            print('    [PcdsDynamicExVpd]')
     elif args.PcdType == 'HII':
         if args.VariableGuid is None or args.VariableName is None:
-            print ('BinToPcd: error: arguments --variable-guid and --variable-name are required for --type HII.')
-            sys.exit (1)
+            print(
+                'BinToPcd: error: arguments --variable-guid and --variable-name are required for --type HII.')
+            sys.exit(1)
         if args.Offset is None:
             #
             # Use UEFI Variable offset of 0 if --offset is not provided
@@ -198,23 +209,25 @@ if __name__ == '__main__':
         # --offset value must be 8-byte aligned
         #
         if (args.Offset % 8) != 0:
-            print ('BinToPcd: error: argument --offset must be 8-byte aligned.')
-            sys.exit (1)
-        Pcd = '  {Name}|L"{VarName}"|{VarGuid}|{Offset}|{Value}'.format (Name = args.PcdName, VarName = args.VariableName, VarGuid = args.VariableGuid, Offset = args.Offset, Value = PcdValue)
+            print('BinToPcd: error: argument --offset must be 8-byte aligned.')
+            sys.exit(1)
+        Pcd = '  {Name}|L"{VarName}"|{VarGuid}|{Offset}|{Value}'.format(
+            Name=args.PcdName, VarName=args.VariableName, VarGuid=args.VariableGuid, Offset=args.Offset, Value=PcdValue)
         if args.Verbose:
-            print ('BinToPcd: Convert binary file to PCD statement compatible with PCD sections')
-            print ('    [PcdsDynamicHii]')
-            print ('    [PcdsDynamicExHii]')
+            print(
+                'BinToPcd: Convert binary file to PCD statement compatible with PCD sections')
+            print('    [PcdsDynamicHii]')
+            print('    [PcdsDynamicExHii]')
 
     #
     # Write PCD value or PCD statement to the output file
     #
     try:
-        args.OutputFile.write (Pcd)
-        args.OutputFile.close ()
+        args.OutputFile.write(Pcd)
+        args.OutputFile.close()
     except:
         #
         # If output file is not specified or it can not be written, then write the
         # PCD value or PCD statement to the console
         #
-        print (Pcd)
+        print(Pcd)
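To make the reformatted interface above concrete, a hypothetical invocation and the effect of the ByteArray helper are sketched below; the file and PCD names are placeholders, not taken from the patch.

    # Hypothetical command line (placeholder file and PCD names):
    #   python BinToPcd.py -i MyBlob.bin -p gMyTokenSpaceGuid.PcdMyBlob -o MyBlob.txt
    #
    # Effect of the ByteArray helper on a small example input:
    ByteArray([b'\x01\x02'])            # -> ('{0x01, 0x02}', 2)
    ByteArray([b'\x01\x02'], Xdr=True)  # XDR form: 4-byte big-endian length,
                                        # then the data zero-padded to a
                                        # 4-byte multiple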
diff --git a/BaseTools/Scripts/ConvertFceToStructurePcd.py b/BaseTools/Scripts/ConvertFceToStructurePcd.py
index 9e7fe58768b1..066ddccbc245 100644
--- a/BaseTools/Scripts/ConvertFceToStructurePcd.py
+++ b/BaseTools/Scripts/ConvertFceToStructurePcd.py
@@ -1,5 +1,5 @@
 #!/usr/bin/python
-## @file
+# @file
 # Firmware Configuration Editor (FCE) from https://firmware.intel.com/develop
 # can parse BIOS image and generate Firmware Configuration file.
 # This script bases on Firmware Configuration file, and generate the structure
@@ -21,13 +21,13 @@ import argparse
 #
 # Globals for help information
 #
-__prog__        = 'ConvertFceToStructurePcd'
-__version__     = '%s Version %s' % (__prog__, '0.1 ')
-__copyright__   = 'Copyright (c) 2018, Intel Corporation. All rights reserved.'
+__prog__ = 'ConvertFceToStructurePcd'
+__version__ = '%s Version %s' % (__prog__, '0.1 ')
+__copyright__ = 'Copyright (c) 2018, Intel Corporation. All rights reserved.'
 __description__ = 'Generate Structure PCD in DEC/DSC/INF based on Firmware Configuration.\n'
 
 
-dscstatement='''[Defines]
+dscstatement = '''[Defines]
   VPD_TOOL_GUID                  = 8C3D856A-9BE6-468E-850A-24F7A8D38E08
 
 [SkuIds]
@@ -50,693 +50,747 @@ decstatement = '''[Guids]
 infstatement = '''[Pcd]
 '''
 
-SECTION='PcdsDynamicHii'
-PCD_NAME='gStructPcdTokenSpaceGuid.Pcd'
+SECTION = 'PcdsDynamicHii'
+PCD_NAME = 'gStructPcdTokenSpaceGuid.Pcd'
 Max_Pcd_Len = 100
 
-WARNING=[]
-ERRORMSG=[]
+WARNING = []
+ERRORMSG = []
+
 
 class parser_lst(object):
 
-  def __init__(self,filelist):
-    self._ignore=['BOOLEAN', 'UINT8', 'UINT16', 'UINT32', 'UINT64']
-    self.file=filelist
-    self.text=self.megre_lst()[0]
-    self.content=self.megre_lst()[1]
+    def __init__(self, filelist):
+        self._ignore = ['BOOLEAN', 'UINT8', 'UINT16', 'UINT32', 'UINT64']
+        self.file = filelist
+        self.text = self.megre_lst()[0]
+        self.content = self.megre_lst()[1]
 
-  def megre_lst(self):
-    alltext=''
-    content={}
-    for file in self.file:
-      with open(file,'r') as f:
-        read =f.read()
-      alltext += read
-      content[file]=read
-    return alltext,content
+    def megre_lst(self):
+        alltext = ''
+        content = {}
+        for file in self.file:
+            with open(file, 'r') as f:
+                read = f.read()
+            alltext += read
+            content[file] = read
+        return alltext, content
 
-  def struct_lst(self):#{struct:lst file}
-    structs_file={}
-    name_format = re.compile(r'(?<!typedef)\s+struct (\w+) {.*?;', re.S)
-    for i in list(self.content.keys()):
-      structs= name_format.findall(self.content[i])
-      if structs:
-        for j in structs:
-          if j not in self._ignore:
-            structs_file[j]=i
-      else:
-        print("%s"%structs)
-    return structs_file
+    def struct_lst(self):  # {struct:lst file}
+        structs_file = {}
+        name_format = re.compile(r'(?<!typedef)\s+struct (\w+) {.*?;', re.S)
+        for i in list(self.content.keys()):
+            structs = name_format.findall(self.content[i])
+            if structs:
+                for j in structs:
+                    if j not in self._ignore:
+                        structs_file[j] = i
+            else:
+                print("%s" % structs)
+        return structs_file
 
-  def struct(self):#struct:{offset:name}
-    unit_num = re.compile('(\d+)')
-    offset1_re = re.compile('(\d+)\[')
-    pcdname_num_re = re.compile('\w+\[(\S+)\]')
-    pcdname_re = re.compile('\](.*)\<')
-    pcdname2_re = re.compile('(\w+)\[')
-    uint_re = re.compile('\<(\S+)\>')
-    name_format = re.compile(r'(?<!typedef)\s+struct (\w+) {.*?;', re.S)
-    name=name_format.findall(self.text)
-    info={}
-    unparse=[]
-    if name:
-      tmp_n = [n for n in name if n not in self._ignore]
-      name = list(set(tmp_n))
-      name.sort(key = tmp_n.index)
-      name.reverse()
-      #name=list(set(name).difference(set(self._ignore)))
-      for struct in name:
-        s_re = re.compile(r'struct %s :(.*?)};'% struct, re.S)
+    def struct(self):  # struct:{offset:name}
+        unit_num = re.compile('(\d+)')
+        offset1_re = re.compile('(\d+)\[')
+        pcdname_num_re = re.compile('\w+\[(\S+)\]')
+        pcdname_re = re.compile('\](.*)\<')
+        pcdname2_re = re.compile('(\w+)\[')
+        uint_re = re.compile('\<(\S+)\>')
+        name_format = re.compile(r'(?<!typedef)\s+struct (\w+) {.*?;', re.S)
+        name = name_format.findall(self.text)
+        info = {}
+        unparse = []
+        if name:
+            tmp_n = [n for n in name if n not in self._ignore]
+            name = list(set(tmp_n))
+            name.sort(key=tmp_n.index)
+            name.reverse()
+            # name=list(set(name).difference(set(self._ignore)))
+            for struct in name:
+                s_re = re.compile(r'struct %s :(.*?)};' % struct, re.S)
+                content = s_re.search(self.text)
+                if content:
+                    tmp_dict = {}
+                    text = content.group().split('+')
+                    for line in text[1:]:
+                        offset = offset1_re.findall(line)
+                        t_name = pcdname_re.findall(line)
+                        uint = uint_re.findall(line)
+                        if offset and uint:
+                            offset = offset[0]
+                            uint = uint[0]
+                            if t_name:
+                                t_name = t_name[0].strip()
+                                if (' ' in t_name) or ("=" in t_name) or (";" in t_name) or ("\\" in name) or (t_name == ''):
+                                    WARNING.append("Warning:Invalid Pcd name '%s' for Offset %s in struct %s" % (
+                                        t_name, offset, struct))
+                                else:
+                                    if '[' in t_name:
+                                        if uint in ['UINT8', 'UINT16', 'UINT32', 'UINT64']:
+                                            offset = int(offset, 10)
+                                            tmp_name = pcdname2_re.findall(t_name)[
+                                                0] + '[0]'
+                                            tmp_dict[offset] = tmp_name
+                                            pcdname_num = int(
+                                                pcdname_num_re.findall(t_name)[0], 10)
+                                            uint = int(
+                                                unit_num.findall(uint)[0], 10)
+                                            bit = uint // 8
+                                            for i in range(1, pcdname_num):
+                                                offset += bit
+                                                tmp_name = pcdname2_re.findall(
+                                                    t_name)[0] + '[%s]' % i
+                                                tmp_dict[offset] = tmp_name
+                                        else:
+                                            tmp_name = pcdname2_re.findall(t_name)[
+                                                0]
+                                            pcdname_num = pcdname_num_re.findall(t_name)[
+                                                0]
+                                            line = [offset, tmp_name,
+                                                    pcdname_num, uint]
+                                            line.append(struct)
+                                            unparse.append(line)
+                                    else:
+                                        if uint not in ['UINT8', 'UINT16', 'UINT32', 'UINT64', 'BOOLEAN']:
+                                            line = [offset, t_name, 0, uint]
+                                            line.append(struct)
+                                            unparse.append(line)
+                                        else:
+                                            offset = int(offset, 10)
+                                            tmp_dict[offset] = t_name
+                info[struct] = tmp_dict
+            if len(unparse) != 0:
+                for u in unparse:
+                    if u[3] in list(info.keys()):
+                        unpar = self.nameISstruct(u, info[u[3]])
+                        info[u[4]] = dict(
+                            list(info[u[4]].items())+list(unpar[u[4]].items()))
+        else:
+            print("ERROR: No struct name found in %s" % self.file)
+            ERRORMSG.append("ERROR: No struct name found in %s" % self.file)
+        return info
+
+    def nameISstruct(self, line, key_dict):
+        dict = {}
+        dict2 = {}
+        s_re = re.compile(r'struct %s :(.*?)};' % line[3], re.S)
+        size_re = re.compile(r'mTotalSize \[(\S+)\]')
         content = s_re.search(self.text)
         if content:
-          tmp_dict = {}
-          text = content.group().split('+')
-          for line in text[1:]:
-            offset = offset1_re.findall(line)
-            t_name = pcdname_re.findall(line)
-            uint = uint_re.findall(line)
-            if offset and uint:
-              offset = offset[0]
-              uint = uint[0]
-              if t_name:
-                t_name = t_name[0].strip()
-                if (' ' in t_name) or ("=" in t_name) or (";" in t_name) or("\\" in name) or (t_name ==''):
-                  WARNING.append("Warning:Invalid Pcd name '%s' for Offset %s in struct %s" % (t_name,offset, struct))
-                else:
-                  if '[' in t_name:
-                    if uint in ['UINT8', 'UINT16', 'UINT32', 'UINT64']:
-                      offset = int(offset, 10)
-                      tmp_name = pcdname2_re.findall(t_name)[0] + '[0]'
-                      tmp_dict[offset] = tmp_name
-                      pcdname_num = int(pcdname_num_re.findall(t_name)[0],10)
-                      uint = int(unit_num.findall(uint)[0],10)
-                      bit = uint // 8
-                      for i in range(1, pcdname_num):
-                        offset += bit
-                        tmp_name = pcdname2_re.findall(t_name)[0] + '[%s]' % i
-                        tmp_dict[offset] = tmp_name
-                    else:
-                      tmp_name = pcdname2_re.findall(t_name)[0]
-                      pcdname_num = pcdname_num_re.findall(t_name)[0]
-                      line = [offset,tmp_name,pcdname_num,uint]
-                      line.append(struct)
-                      unparse.append(line)
-                  else:
-                    if uint not in ['UINT8', 'UINT16', 'UINT32', 'UINT64', 'BOOLEAN']:
-                      line = [offset, t_name, 0, uint]
-                      line.append(struct)
-                      unparse.append(line)
-                    else:
-                      offset = int(offset,10)
-                      tmp_dict[offset] = t_name
-        info[struct] = tmp_dict
-      if len(unparse) != 0:
-        for u in unparse:
-          if u[3] in list(info.keys()):
-            unpar = self.nameISstruct(u,info[u[3]])
-            info[u[4]]= dict(list(info[u[4]].items())+list(unpar[u[4]].items()))
-    else:
-      print("ERROR: No struct name found in %s" % self.file)
-      ERRORMSG.append("ERROR: No struct name found in %s" % self.file)
-    return info
+            s_size = size_re.findall(content.group())[0]
+        else:
+            s_size = '0'
+            print("ERROR: Struct %s not define mTotalSize in lst file" %
+                  line[3])
+            ERRORMSG.append(
+                "ERROR: Struct %s not define mTotalSize in lst file" % line[3])
+        size = int(line[0], 10)
+        if line[2] != 0:
+            for j in range(0, int(line[2], 10)):
+                for k in list(key_dict.keys()):
+                    offset = size + k
+                    name = '%s.%s' % ((line[1]+'[%s]' % j), key_dict[k])
+                    dict[offset] = name
+                size = int(s_size, 16)+size
+        elif line[2] == 0:
+            for k in list(key_dict.keys()):
+                offset = size + k
+                name = '%s.%s' % (line[1], key_dict[k])
+                dict[offset] = name
+        dict2[line[4]] = dict
+        return dict2
 
+    def efivarstore_parser(self):
+        efivarstore_format = re.compile(r'efivarstore.*?;', re.S)
+        struct_re = re.compile(r'efivarstore(.*?),', re.S)
+        name_re = re.compile(r'name=(\w+)')
+        efivarstore_dict = {}
+        efitxt = efivarstore_format.findall(self.text)
+        for i in efitxt:
+            struct = struct_re.findall(i.replace(' ', ''))
+            if struct[0] in self._ignore:
+                continue
+            name = name_re.findall(i.replace(' ', ''))
+            if struct and name:
+                efivarstore_dict[name[0]] = struct[0]
+            else:
+                print(
+                    "ERROR: Can't find Struct or name in lst file, please check have this format:efivarstore XXXX, name=xxxx")
+                ERRORMSG.append(
+                    "ERROR: Can't find Struct or name in lst file, please check have this format:efivarstore XXXX, name=xxxx")
+        return efivarstore_dict
 
-  def nameISstruct(self,line,key_dict):
-    dict={}
-    dict2={}
-    s_re = re.compile(r'struct %s :(.*?)};' % line[3], re.S)
-    size_re = re.compile(r'mTotalSize \[(\S+)\]')
-    content = s_re.search(self.text)
-    if content:
-      s_size = size_re.findall(content.group())[0]
-    else:
-      s_size = '0'
-      print("ERROR: Struct %s not define mTotalSize in lst file" %line[3])
-      ERRORMSG.append("ERROR: Struct %s not define mTotalSize in lst file" %line[3])
-    size = int(line[0], 10)
-    if line[2] != 0:
-      for j in range(0, int(line[2], 10)):
-        for k in list(key_dict.keys()):
-          offset = size  + k
-          name ='%s.%s' %((line[1]+'[%s]'%j),key_dict[k])
-          dict[offset] = name
-        size = int(s_size,16)+size
-    elif line[2] == 0:
-      for k in list(key_dict.keys()):
-        offset = size + k
-        name = '%s.%s' % (line[1], key_dict[k])
-        dict[offset] = name
-    dict2[line[4]] = dict
-    return dict2
-
-  def efivarstore_parser(self):
-    efivarstore_format = re.compile(r'efivarstore.*?;', re.S)
-    struct_re = re.compile(r'efivarstore(.*?),',re.S)
-    name_re = re.compile(r'name=(\w+)')
-    efivarstore_dict={}
-    efitxt = efivarstore_format.findall(self.text)
-    for i in efitxt:
-      struct = struct_re.findall(i.replace(' ',''))
-      if struct[0] in self._ignore:
-          continue
-      name = name_re.findall(i.replace(' ',''))
-      if struct and name:
-        efivarstore_dict[name[0]]=struct[0]
-      else:
-        print("ERROR: Can't find Struct or name in lst file, please check have this format:efivarstore XXXX, name=xxxx")
-        ERRORMSG.append("ERROR: Can't find Struct or name in lst file, please check have this format:efivarstore XXXX, name=xxxx")
-    return efivarstore_dict
 
 class Config(object):
 
-  def __init__(self,Config):
-    self.config=Config
+    def __init__(self, Config):
+        self.config = Config
 
-  #Parser .config file,return list[offset,name,guid,value,help]
-  def config_parser(self):
-    ids_re =re.compile('_ID:(\d+)',re.S)
-    id_re= re.compile('\s+')
-    info = []
-    info_dict={}
-    with open(self.config, 'r') as text:
-      read = text.read()
-    if 'DEFAULT_ID:' in read:
-      all_txt = read.split('FCEKEY DEFAULT')
-      for i in all_txt[1:]:
-        part = [] #save all infomation for DEFAULT_ID
-        str_id=''
-        ids = ids_re.findall(i.replace(' ',''))
-        for m in ids:
-          str_id +=m+'_'
-        str_id=str_id[:-1]
-        part.append(ids)
-        section = i.split('\nQ') #split with '\nQ ' to get every block
-        part +=self.section_parser(section)
-        info_dict[str_id] = self.section_parser(section)
-        info.append(part)
-    else:
-      part = []
-      id=('0','0')
-      str_id='0_0'
-      part.append(id)
-      section = read.split('\nQ')
-      part +=self.section_parser(section)
-      info_dict[str_id] = self.section_parser(section)
-      info.append(part)
-    return info_dict
+    # Parser .config file,return list[offset,name,guid,value,help]
+    def config_parser(self):
+        ids_re = re.compile('_ID:(\d+)', re.S)
+        id_re = re.compile('\s+')
+        info = []
+        info_dict = {}
+        with open(self.config, 'r') as text:
+            read = text.read()
+        if 'DEFAULT_ID:' in read:
+            all_txt = read.split('FCEKEY DEFAULT')
+            for i in all_txt[1:]:
+                part = []  # save all infomation for DEFAULT_ID
+                str_id = ''
+                ids = ids_re.findall(i.replace(' ', ''))
+                for m in ids:
+                    str_id += m+'_'
+                str_id = str_id[:-1]
+                part.append(ids)
+                # split with '\nQ ' to get every block
+                section = i.split('\nQ')
+                part += self.section_parser(section)
+                info_dict[str_id] = self.section_parser(section)
+                info.append(part)
+        else:
+            part = []
+            id = ('0', '0')
+            str_id = '0_0'
+            part.append(id)
+            section = read.split('\nQ')
+            part += self.section_parser(section)
+            info_dict[str_id] = self.section_parser(section)
+            info.append(part)
+        return info_dict
 
-  def eval_id(self,id):
-    id = id.split("_")
-    default_id=id[0:len(id)//2]
-    platform_id=id[len(id)//2:]
-    text=''
-    for i in range(len(default_id)):
-      text +="%s.common.%s.%s,"%(SECTION,self.id_name(platform_id[i],'PLATFORM'),self.id_name(default_id[i],'DEFAULT'))
-    return '\n[%s]\n'%text[:-1]
+    def eval_id(self, id):
+        id = id.split("_")
+        default_id = id[0:len(id)//2]
+        platform_id = id[len(id)//2:]
+        text = ''
+        for i in range(len(default_id)):
+            text += "%s.common.%s.%s," % (SECTION, self.id_name(
+                platform_id[i], 'PLATFORM'), self.id_name(default_id[i], 'DEFAULT'))
+        return '\n[%s]\n' % text[:-1]
 
-  def id_name(self,ID, flag):
-    platform_dict = {'0': 'DEFAULT'}
-    default_dict = {'0': 'STANDARD', '1': 'MANUFACTURING'}
-    if flag == "PLATFORM":
-      try:
-        value = platform_dict[ID]
-      except KeyError:
-        value = 'SKUID%s' % ID
-    elif flag == 'DEFAULT':
-      try:
-        value = default_dict[ID]
-      except KeyError:
-        value = 'DEFAULTID%s' % ID
-    else:
-      value = None
-    return value
+    def id_name(self, ID, flag):
+        platform_dict = {'0': 'DEFAULT'}
+        default_dict = {'0': 'STANDARD', '1': 'MANUFACTURING'}
+        if flag == "PLATFORM":
+            try:
+                value = platform_dict[ID]
+            except KeyError:
+                value = 'SKUID%s' % ID
+        elif flag == 'DEFAULT':
+            try:
+                value = default_dict[ID]
+            except KeyError:
+                value = 'DEFAULTID%s' % ID
+        else:
+            value = None
+        return value
 
-  def section_parser(self,section):
-    offset_re = re.compile(r'offset=(\w+)')
-    name_re = re.compile(r'name=(\S+)')
-    guid_re = re.compile(r'guid=(\S+)')
-  #  help_re = re.compile(r'help = (.*)')
-    attribute_re=re.compile(r'attribute=(\w+)')
-    value_re = re.compile(r'(//.*)')
-    part = []
-    part_without_comment = []
-    for x in section[1:]:
-        line=x.split('\n')[0]
-        comment_list = value_re.findall(line) # the string \\... in "Q...." line
-        comment_list[0] = comment_list[0].replace('//', '')
-        comment_ori = comment_list[0].strip()
-        comment = ""
-        for each in comment_ori:
-            if each != " " and "\x21" > each or each > "\x7E":
-                if bytes(each, 'utf-16') == b'\xff\xfe\xae\x00':
-                    each = '(R)'
-                else:
-                    each = ""
-            comment += each
-        line=value_re.sub('',line) #delete \\... in "Q...." line
-        list1=line.split(' ')
-        value=self.value_parser(list1)
-        offset = offset_re.findall(x.replace(' ',''))
-        name = name_re.findall(x.replace(' ',''))
-        guid = guid_re.findall(x.replace(' ',''))
-        attribute =attribute_re.findall(x.replace(' ',''))
-        if offset and name and guid and value and attribute:
-          if attribute[0] in ['0x3','0x7']:
-            offset = int(offset[0], 16)
-            #help = help_re.findall(x)
-            text_without_comment = offset, name[0], guid[0], value, attribute[0]
-            if text_without_comment in part_without_comment:
-                # check if exists same Pcd with different comments, add different comments in one line with "|".
-                dupl_index = part_without_comment.index(text_without_comment)
-                part[dupl_index] = list(part[dupl_index])
-                if comment not in part[dupl_index][-1]:
-                    part[dupl_index][-1] += " | " + comment
-                part[dupl_index] = tuple(part[dupl_index])
+    def section_parser(self, section):
+        offset_re = re.compile(r'offset=(\w+)')
+        name_re = re.compile(r'name=(\S+)')
+        guid_re = re.compile(r'guid=(\S+)')
+    #  help_re = re.compile(r'help = (.*)')
+        attribute_re = re.compile(r'attribute=(\w+)')
+        value_re = re.compile(r'(//.*)')
+        part = []
+        part_without_comment = []
+        for x in section[1:]:
+            line = x.split('\n')[0]
+            # the string //... in the "Q ...." line
+            comment_list = value_re.findall(line)
+            comment_list[0] = comment_list[0].replace('//', '')
+            comment_ori = comment_list[0].strip()
+            comment = ""
+            for each in comment_ori:
+                if each != " " and "\x21" > each or each > "\x7E":
+                    if bytes(each, 'utf-16') == b'\xff\xfe\xae\x00':
+                        each = '(R)'
+                    else:
+                        each = ""
+                comment += each
+            line = value_re.sub('', line)  # delete the //... comment from the "Q ...." line
+            list1 = line.split(' ')
+            value = self.value_parser(list1)
+            offset = offset_re.findall(x.replace(' ', ''))
+            name = name_re.findall(x.replace(' ', ''))
+            guid = guid_re.findall(x.replace(' ', ''))
+            attribute = attribute_re.findall(x.replace(' ', ''))
+            if offset and name and guid and value and attribute:
+                if attribute[0] in ['0x3', '0x7']:
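+                    # keep only NV variables: 0x3 = NV|BS, 0x7 = NV|BS|RT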
+                    offset = int(offset[0], 16)
+                    #help = help_re.findall(x)
+                    text_without_comment = offset, name[0], guid[0], value, attribute[0]
+                    if text_without_comment in part_without_comment:
+                        # if the same Pcd already exists with a different comment, join the comments on one line with "|".
+                        dupl_index = part_without_comment.index(
+                            text_without_comment)
+                        part[dupl_index] = list(part[dupl_index])
+                        if comment not in part[dupl_index][-1]:
+                            part[dupl_index][-1] += " | " + comment
+                        part[dupl_index] = tuple(part[dupl_index])
+                    else:
+                        text = offset, name[0], guid[0], value, attribute[0], comment
+                        part_without_comment.append(text_without_comment)
+                        part.append(text)
+        return part
+
+    def value_parser(self, list1):
+        list1 = [t for t in list1 if t != '']  # remove '' from list
+        first_num = int(list1[0], 16)
+        if list1[first_num + 1] == 'STRING':  # parse STRING
+            if list1[-1] == '""':
+                value = "{0x0, 0x0}"
             else:
-                text = offset, name[0], guid[0], value, attribute[0], comment
-                part_without_comment.append(text_without_comment)
-                part.append(text)
-    return(part)
-
-  def value_parser(self, list1):
-    list1 = [t for t in list1 if t != '']  # remove '' form list
-    first_num = int(list1[0], 16)
-    if list1[first_num + 1] == 'STRING':  # parser STRING
-      if list1[-1] == '""':
-        value = "{0x0, 0x0}"
-      else:
-        value = 'L%s' % list1[-1]
-    elif list1[first_num + 1] == 'ORDERED_LIST':  # parser ORDERED_LIST
-      value_total = int(list1[first_num + 2])
-      list2 = list1[-value_total:]
-      tmp = []
-      line = ''
-      for i in list2:
-        if len(i) % 2 == 0 and len(i) != 2:
-          for m in range(0, len(i) // 2):
-            tmp.append('0x%02x' % (int('0x%s' % i, 16) >> m * 8 & 0xff))
+                value = 'L%s' % list1[-1]
+        elif list1[first_num + 1] == 'ORDERED_LIST':  # parse ORDERED_LIST
+            value_total = int(list1[first_num + 2])
+            list2 = list1[-value_total:]
+            tmp = []
+            line = ''
+            for i in list2:
+                if len(i) % 2 == 0 and len(i) != 2:
+                    for m in range(0, len(i) // 2):
+                        tmp.append('0x%02x' %
+                                   (int('0x%s' % i, 16) >> m * 8 & 0xff))
+                else:
+                    tmp.append('0x%s' % i)
+            for i in tmp:
+                line += '%s,' % i
+            value = '{%s}' % line[:-1]
         else:
-          tmp.append('0x%s' % i)
-      for i in tmp:
-        line += '%s,' % i
-      value = '{%s}' % line[:-1]
-    else:
-      value = "0x%01x" % int(list1[-1], 16)
-    return value
+            value = "0x%01x" % int(list1[-1], 16)
+        return value
 
 
-#parser Guid file, get guid name form guid value
+# parse the Guid.xref file, get the guid name from the guid value
 class GUID(object):
 
-  def __init__(self,path):
-    self.path = path
-    self.guidfile = self.gfile()
-    self.guiddict = self.guid_dict()
+    def __init__(self, path):
+        self.path = path
+        self.guidfile = self.gfile()
+        self.guiddict = self.guid_dict()
 
-  def gfile(self):
-    for root, dir, file in os.walk(self.path, topdown=True, followlinks=False):
-      if 'FV' in dir:
-        gfile = os.path.join(root,'Fv','Guid.xref')
-        if os.path.isfile(gfile):
-          return gfile
+    def gfile(self):
+        for root, dir, file in os.walk(self.path, topdown=True, followlinks=False):
+            if 'FV' in dir:
+                gfile = os.path.join(root, 'Fv', 'Guid.xref')
+                if os.path.isfile(gfile):
+                    return gfile
+                else:
+                    print("ERROR: Guid.xref file not found")
+                    ERRORMSG.append("ERROR: Guid.xref file not found")
+                    exit()
+
+    def guid_dict(self):
+        guiddict = {}
+        with open(self.guidfile, 'r') as file:
+            lines = file.readlines()
+        guidinfo = lines
+        for line in guidinfo:
+            list = line.strip().split(' ')
+            if list:
+                if len(list) > 1:
+                    guiddict[list[0].upper()] = list[1]
+                elif list[0] != '' and len(list) == 1:
+                    print("Error: line %s can't be parser in %s" %
+                          (line.strip(), self.guidfile))
+                    ERRORMSG.append("Error: line %s can't be parser in %s" % (
+                        line.strip(), self.guidfile))
+            else:
+                print("ERROR: No data in %s" % self.guidfile)
+                ERRORMSG.append("ERROR: No data in %s" % self.guidfile)
+        return guiddict
+
+    def guid_parser(self, guid):
+        if guid.upper() in self.guiddict:
+            return self.guiddict[guid.upper()]
         else:
-          print("ERROR: Guid.xref file not found")
-          ERRORMSG.append("ERROR: Guid.xref file not found")
-          exit()
+            print("ERROR: GUID %s not found in file %s" %
+                  (guid, self.guidfile))
+            ERRORMSG.append("ERROR: GUID %s not found in file %s" %
+                            (guid, self.guidfile))
+            return guid
 
-  def guid_dict(self):
-    guiddict={}
-    with open(self.guidfile,'r') as file:
-      lines = file.readlines()
-    guidinfo=lines
-    for line in guidinfo:
-      list=line.strip().split(' ')
-      if list:
-        if len(list)>1:
-          guiddict[list[0].upper()]=list[1]
-        elif list[0] != ''and len(list)==1:
-          print("Error: line %s can't be parser in %s"%(line.strip(),self.guidfile))
-          ERRORMSG.append("Error: line %s can't be parser in %s"%(line.strip(),self.guidfile))
-      else:
-        print("ERROR: No data in %s" %self.guidfile)
-        ERRORMSG.append("ERROR: No data in %s" %self.guidfile)
-    return guiddict
-
-  def guid_parser(self,guid):
-    if guid.upper() in self.guiddict:
-      return self.guiddict[guid.upper()]
-    else:
-      print("ERROR: GUID %s not found in file %s"%(guid, self.guidfile))
-      ERRORMSG.append("ERROR: GUID %s not found in file %s"%(guid, self.guidfile))
-      return guid
 
 class PATH(object):
 
-  def __init__(self,path):
-    self.path=path
-    self.rootdir=self.get_root_dir()
-    self.usefuldir=set()
-    self.lstinf = {}
-    for path in self.rootdir:
-      for o_root, o_dir, o_file in os.walk(os.path.join(path, "OUTPUT"), topdown=True, followlinks=False):
-        for INF in o_file:
-          if os.path.splitext(INF)[1] == '.inf':
-            for l_root, l_dir, l_file in os.walk(os.path.join(path, "DEBUG"), topdown=True,
-                               followlinks=False):
-              for LST in l_file:
-                if os.path.splitext(LST)[1] == '.lst':
-                  self.lstinf[os.path.join(l_root, LST)] = os.path.join(o_root, INF)
-                  self.usefuldir.add(path)
+    def __init__(self, path):
+        self.path = path
+        self.rootdir = self.get_root_dir()
+        self.usefuldir = set()
+        self.lstinf = {}
+        for path in self.rootdir:
+            for o_root, o_dir, o_file in os.walk(os.path.join(path, "OUTPUT"), topdown=True, followlinks=False):
+                for INF in o_file:
+                    if os.path.splitext(INF)[1] == '.inf':
+                        for l_root, l_dir, l_file in os.walk(os.path.join(path, "DEBUG"), topdown=True,
+                                                             followlinks=False):
+                            for LST in l_file:
+                                if os.path.splitext(LST)[1] == '.lst':
+                                    self.lstinf[os.path.join(l_root, LST)] = os.path.join(
+                                        o_root, INF)
+                                    self.usefuldir.add(path)
 
-  def get_root_dir(self):
-    rootdir=[]
-    for root,dir,file in os.walk(self.path,topdown=True,followlinks=False):
-      if "OUTPUT" in root:
-        updir=root.split("OUTPUT",1)[0]
-        rootdir.append(updir)
-    rootdir=list(set(rootdir))
-    return rootdir
+    def get_root_dir(self):
+        rootdir = []
+        for root, dir, file in os.walk(self.path, topdown=True, followlinks=False):
+            if "OUTPUT" in root:
+                updir = root.split("OUTPUT", 1)[0]
+                rootdir.append(updir)
+        rootdir = list(set(rootdir))
+        return rootdir
 
-  def lst_inf(self):
-    return self.lstinf
+    def lst_inf(self):
+        return self.lstinf
 
-  def package(self):
-    package={}
-    package_re=re.compile(r'Packages\.\w+]\n(.*)',re.S)
-    for i in list(self.lstinf.values()):
-      with open(i,'r') as inf:
-        read=inf.read()
-      section=read.split('[')
-      for j in section:
-        p=package_re.findall(j)
-        if p:
-          package[i]=p[0].rstrip()
-    return package
+    def package(self):
+        package = {}
+        package_re = re.compile(r'Packages\.\w+]\n(.*)', re.S)
+        for i in list(self.lstinf.values()):
+            with open(i, 'r') as inf:
+                read = inf.read()
+            section = read.split('[')
+            for j in section:
+                p = package_re.findall(j)
+                if p:
+                    package[i] = p[0].rstrip()
+        return package
 
-  def header(self,struct):
-    header={}
-    head_re = re.compile('typedef.*} %s;[\n]+(.*)(?:typedef|formset)'%struct,re.M|re.S)
-    head_re2 = re.compile(r'#line[\s\d]+"(\S+h)"')
-    for i in list(self.lstinf.keys()):
-      with open(i,'r') as lst:
-        read = lst.read()
-      h = head_re.findall(read)
-      if h:
-        head=head_re2.findall(h[0])
-        if head:
-          format = head[0].replace('\\\\','/').replace('\\','/')
-          name =format.split('/')[-1]
-          head = self.headerfileset.get(name)
-          if head:
-            head = head.replace('\\','/')
-            header[struct] = head
-    return header
-  @property
-  def headerfileset(self):
-    headerset = dict()
-    for root,dirs,files in os.walk(self.path):
-      for file in files:
-        if os.path.basename(file) == 'deps.txt':
-          with open(os.path.join(root,file),"r") as fr:
-            for line in fr.readlines():
-              headerset[os.path.basename(line).strip()] = line.strip()
-    return headerset
+    def header(self, struct):
+        header = {}
+        head_re = re.compile(
+            'typedef.*} %s;[\n]+(.*)(?:typedef|formset)' % struct, re.M | re.S)
+        head_re2 = re.compile(r'#line[\s\d]+"(\S+h)"')
+        for i in list(self.lstinf.keys()):
+            with open(i, 'r') as lst:
+                read = lst.read()
+            h = head_re.findall(read)
+            if h:
+                head = head_re2.findall(h[0])
+                if head:
+                    format = head[0].replace('\\\\', '/').replace('\\', '/')
+                    name = format.split('/')[-1]
+                    head = self.headerfileset.get(name)
+                    if head:
+                        head = head.replace('\\', '/')
+                        header[struct] = head
+        return header
+
+    @property
+    def headerfileset(self):
+        headerset = dict()
+        for root, dirs, files in os.walk(self.path):
+            for file in files:
+                if os.path.basename(file) == 'deps.txt':
+                    with open(os.path.join(root, file), "r") as fr:
+                        for line in fr.readlines():
+                            headerset[os.path.basename(
+                                line).strip()] = line.strip()
+        return headerset
+
+    def makefile(self, filename):
+        re_format = re.compile(r'DEBUG_DIR.*(?:\S+Pkg)\\(.*\\%s)' % filename)
+        for i in self.usefuldir:
+            with open(os.path.join(i, 'Makefile'), 'r') as make:
+                read = make.read()
+            dir = re_format.findall(read)
+            if dir:
+                return dir[0]
+        return None
 
-  def makefile(self,filename):
-    re_format = re.compile(r'DEBUG_DIR.*(?:\S+Pkg)\\(.*\\%s)'%filename)
-    for i in self.usefuldir:
-      with open(os.path.join(i,'Makefile'),'r') as make:
-        read = make.read()
-      dir = re_format.findall(read)
-      if dir:
-        return dir[0]
-    return None
 
 class mainprocess(object):
 
-  def __init__(self,InputPath,Config,OutputPath):
-    self.init = 0xFCD00000
-    self.inputpath = os.path.abspath(InputPath)
-    self.outputpath = os.path.abspath(OutputPath)
-    self.LST = PATH(self.inputpath)
-    self.lst_dict = self.LST.lst_inf()
-    self.Config = Config
-    self.attribute_dict = {'0x3': 'NV, BS', '0x7': 'NV, BS, RT'}
-    self.guid = GUID(self.inputpath)
-    self.header={}
+    def __init__(self, InputPath, Config, OutputPath):
+        self.init = 0xFCD00000
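+        # 0xFCD00000 is a placeholder token; plus() replaces it with unique incrementing values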
+        self.inputpath = os.path.abspath(InputPath)
+        self.outputpath = os.path.abspath(OutputPath)
+        self.LST = PATH(self.inputpath)
+        self.lst_dict = self.LST.lst_inf()
+        self.Config = Config
+        self.attribute_dict = {'0x3': 'NV, BS', '0x7': 'NV, BS, RT'}
+        self.guid = GUID(self.inputpath)
+        self.header = {}
 
-  def main(self):
-    conf=Config(self.Config)
-    config_dict=conf.config_parser() #get {'0_0':[offset,name,guid,value,attribute]...,'1_0':....}
-    lst=parser_lst(list(self.lst_dict.keys()))
-    efi_dict=lst.efivarstore_parser() #get {name:struct} form lst file
-    keys=sorted(config_dict.keys())
-    all_struct=lst.struct()
-    stru_lst=lst.struct_lst()
-    title_list=[]
-    info_list=[]
-    header_list=[]
-    inf_list =[]
-    for i in stru_lst:
-      tmp = self.LST.header(i)
-      self.header.update(tmp)
-    for id_key in keys:
-      tmp_id=[id_key] #['0_0',[(struct,[name...]),(struct,[name...])]]
-      tmp_info={} #{name:struct}
-      for section in config_dict[id_key]:
-        c_offset,c_name,c_guid,c_value,c_attribute,c_comment = section
-        if c_name in efi_dict:
-          struct = efi_dict[c_name]
-          title='%s%s|L"%s"|%s|0x00||%s\n'%(PCD_NAME,c_name,c_name,self.guid.guid_parser(c_guid),self.attribute_dict[c_attribute])
-          if struct in all_struct:
-            lstfile = stru_lst[struct]
-            struct_dict=all_struct[struct]
-            try:
-              title2 = '%s%s|{0}|%s|0xFCD00000{\n <HeaderFiles>\n  %s\n <Packages>\n%s\n}\n' % (PCD_NAME, c_name, struct, self.header[struct], self.LST.package()[self.lst_dict[lstfile]])
-            except KeyError:
-              WARNING.append("Warning: No <HeaderFiles> for struct %s"%struct)
-              title2 = '%s%s|{0}|%s|0xFCD00000{\n <HeaderFiles>\n  %s\n <Packages>\n%s\n}\n' % (PCD_NAME, c_name, struct, '', self.LST.package()[self.lst_dict[lstfile]])
-            header_list.append(title2)
-          elif struct not in lst._ignore:
-            struct_dict ={}
-            print("ERROR: Struct %s can't found in lst file" %struct)
-            ERRORMSG.append("ERROR: Struct %s can't found in lst file" %struct)
-          if c_offset in struct_dict:
-            offset_name=struct_dict[c_offset]
-            info = "%s%s.%s|%s\n"%(PCD_NAME,c_name,offset_name,c_value)
-            blank_length = Max_Pcd_Len - len(info)
-            if blank_length <= 0:
-                info_comment = "%s%s.%s|%s%s# %s\n"%(PCD_NAME,c_name,offset_name,c_value,"     ",c_comment)
-            else:
-                info_comment = "%s%s.%s|%s%s# %s\n"%(PCD_NAME,c_name,offset_name,c_value,blank_length*" ",c_comment)
-            inf = "%s%s\n"%(PCD_NAME,c_name)
-            inf_list.append(inf)
-            tmp_info[info_comment]=title
-          else:
-            print("ERROR: Can't find offset %s with struct name %s"%(c_offset,struct))
-            ERRORMSG.append("ERROR: Can't find offset %s with name %s"%(c_offset,struct))
+    def main(self):
+        conf = Config(self.Config)
+        # get {'0_0':[offset,name,guid,value,attribute]...,'1_0':....}
+        config_dict = conf.config_parser()
+        lst = parser_lst(list(self.lst_dict.keys()))
+        efi_dict = lst.efivarstore_parser()  # get {name:struct} from lst file
+        keys = sorted(config_dict.keys())
+        all_struct = lst.struct()
+        stru_lst = lst.struct_lst()
+        title_list = []
+        info_list = []
+        header_list = []
+        inf_list = []
+        for i in stru_lst:
+            tmp = self.LST.header(i)
+            self.header.update(tmp)
+        for id_key in keys:
+            # ['0_0',[(struct,[name...]),(struct,[name...])]]
+            tmp_id = [id_key]
+            tmp_info = {}  # {name:struct}
+            for section in config_dict[id_key]:
+                c_offset, c_name, c_guid, c_value, c_attribute, c_comment = section
+                if c_name in efi_dict:
+                    struct = efi_dict[c_name]
+                    title = '%s%s|L"%s"|%s|0x00||%s\n' % (PCD_NAME, c_name, c_name, self.guid.guid_parser(
+                        c_guid), self.attribute_dict[c_attribute])
+                    if struct in all_struct:
+                        lstfile = stru_lst[struct]
+                        struct_dict = all_struct[struct]
+                        try:
+                            title2 = '%s%s|{0}|%s|0xFCD00000{\n <HeaderFiles>\n  %s\n <Packages>\n%s\n}\n' % (
+                                PCD_NAME, c_name, struct, self.header[struct], self.LST.package()[self.lst_dict[lstfile]])
+                        except KeyError:
+                            WARNING.append(
+                                "Warning: No <HeaderFiles> for struct %s" % struct)
+                            title2 = '%s%s|{0}|%s|0xFCD00000{\n <HeaderFiles>\n  %s\n <Packages>\n%s\n}\n' % (
+                                PCD_NAME, c_name, struct, '', self.LST.package()[self.lst_dict[lstfile]])
+                        header_list.append(title2)
+                    elif struct not in lst._ignore:
+                        struct_dict = {}
+                        print("ERROR: Struct %s can't found in lst file" % struct)
+                        ERRORMSG.append(
+                            "ERROR: Struct %s can't found in lst file" % struct)
+                    if c_offset in struct_dict:
+                        offset_name = struct_dict[c_offset]
+                        info = "%s%s.%s|%s\n" % (
+                            PCD_NAME, c_name, offset_name, c_value)
+                        blank_length = Max_Pcd_Len - len(info)
+                        if blank_length <= 0:
+                            info_comment = "%s%s.%s|%s%s# %s\n" % (
+                                PCD_NAME, c_name, offset_name, c_value, "     ", c_comment)
+                        else:
+                            info_comment = "%s%s.%s|%s%s# %s\n" % (
+                                PCD_NAME, c_name, offset_name, c_value, blank_length*" ", c_comment)
+                        inf = "%s%s\n" % (PCD_NAME, c_name)
+                        inf_list.append(inf)
+                        tmp_info[info_comment] = title
+                    else:
+                        print("ERROR: Can't find offset %s with struct name %s" % (
+                            c_offset, struct))
+                        ERRORMSG.append(
+                            "ERROR: Can't find offset %s with name %s" % (c_offset, struct))
+                else:
+                    print("ERROR: Can't find name %s in lst file" % (c_name))
+                    ERRORMSG.append(
+                        "ERROR: Can't find name %s in lst file" % (c_name))
+            tmp_id.append(list(self.reverse_dict(tmp_info).items()))
+            id, tmp_title_list, tmp_info_list = self.read_list(tmp_id)
+            title_list += tmp_title_list
+            info_list.append(tmp_info_list)
+        inf_list = self.del_repeat(inf_list)
+        header_list = self.plus(self.del_repeat(header_list))
+        title_all = list(set(title_list))
+        info_list = self.remove_bracket(self.del_repeat(info_list))
+        for i in range(len(info_list)-1, -1, -1):
+            if len(info_list[i]) == 0:
+                info_list.remove(info_list[i])
+        for i in (inf_list, title_all, header_list):
+            i.sort()
+        return keys, title_all, info_list, header_list, inf_list
+
+    def correct_sort(self, PcdString):
+        # sort the Pcd list with two rules:
+        # first, sort by Pcd name;
+        # second, if the Pcd has several elements, sort them by index value.
+        if ("]|") in PcdString:
+            Pcdname = PcdString.split("[")[0]
+            Pcdindex = int(PcdString.split("[")[1].split("]")[0])
         else:
-          print("ERROR: Can't find name %s in lst file"%(c_name))
-          ERRORMSG.append("ERROR: Can't find name %s in lst file"%(c_name))
-      tmp_id.append(list(self.reverse_dict(tmp_info).items()))
-      id,tmp_title_list,tmp_info_list = self.read_list(tmp_id)
-      title_list +=tmp_title_list
-      info_list.append(tmp_info_list)
-    inf_list = self.del_repeat(inf_list)
-    header_list = self.plus(self.del_repeat(header_list))
-    title_all=list(set(title_list))
-    info_list = self.remove_bracket(self.del_repeat(info_list))
-    for i in range(len(info_list)-1,-1,-1):
-      if len(info_list[i]) == 0:
-        info_list.remove(info_list[i])
-    for i in (inf_list, title_all, header_list):
-      i.sort()
-    return keys,title_all,info_list,header_list,inf_list
+            Pcdname = PcdString.split("|")[0]
+            Pcdindex = 0
+        return Pcdname, Pcdindex
 
-  def correct_sort(self, PcdString):
-    # sort the Pcd list with two rules:
-    # First sort through Pcd name;
-    # Second if the Pcd exists several elements, sort them through index value.
-    if ("]|") in PcdString:
-        Pcdname = PcdString.split("[")[0]
-        Pcdindex = int(PcdString.split("[")[1].split("]")[0])
-    else:
-        Pcdname = PcdString.split("|")[0]
-        Pcdindex = 0
-    return Pcdname, Pcdindex
+    def remove_bracket(self, List):
+        for i in List:
+            for j in i:
+                tmp = j.split("|")
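+                # strip the array index from string Pcds and from empty {0x0, 0x0} values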
+                if (('L"' in j) and ("[" in j)) or (tmp[1].split("#")[0].strip() == '{0x0, 0x0}'):
+                    tmp[0] = tmp[0][:tmp[0].index('[')]
+                    List[List.index(i)][i.index(j)] = "|".join(tmp)
+                else:
+                    List[List.index(i)][i.index(j)] = j
+        for i in List:
+            if type(i) == type([0, 0]):
+                i.sort(key=lambda x: (self.correct_sort(
+                    x)[0], self.correct_sort(x)[1]))
+        return List
 
-  def remove_bracket(self,List):
-    for i in List:
-      for j in i:
-        tmp = j.split("|")
-        if (('L"' in j) and ("[" in j)) or (tmp[1].split("#")[0].strip() == '{0x0, 0x0}'):
-          tmp[0] = tmp[0][:tmp[0].index('[')]
-          List[List.index(i)][i.index(j)] = "|".join(tmp)
-        else:
-          List[List.index(i)][i.index(j)] = j
-    for i in List:
-      if type(i) == type([0,0]):
-        i.sort(key = lambda x:(self.correct_sort(x)[0], self.correct_sort(x)[1]))
-    return List
+    def write_all(self):
+        title_flag = 1
+        info_flag = 1
+        if not os.path.isdir(self.outputpath):
+            os.makedirs(self.outputpath)
+        decwrite = write2file(os.path.join(
+            self.outputpath, 'StructurePcd.dec'))
+        dscwrite = write2file(os.path.join(
+            self.outputpath, 'StructurePcd.dsc'))
+        infwrite = write2file(os.path.join(
+            self.outputpath, 'StructurePcd.inf'))
+        conf = Config(self.Config)
+        ids, title, info, header, inf = self.main()
+        decwrite.add2file(decstatement)
+        decwrite.add2file(header)
+        infwrite.add2file(infstatement)
+        infwrite.add2file(inf)
+        dscwrite.add2file(dscstatement)
+        for id in ids:
+            dscwrite.add2file(conf.eval_id(id))
+            if title_flag:
+                dscwrite.add2file(title)
+                title_flag = 0
+            if len(info) == 1:
+                dscwrite.add2file(info)
+            elif len(info) == 2:
+                if info_flag:
+                    dscwrite.add2file(info[0])
+                    info_flag = 0
+                else:
+                    dscwrite.add2file(info[1])
 
-  def write_all(self):
-    title_flag=1
-    info_flag=1
-    if not os.path.isdir(self.outputpath):
-      os.makedirs(self.outputpath)
-    decwrite = write2file(os.path.join(self.outputpath,'StructurePcd.dec'))
-    dscwrite = write2file(os.path.join(self.outputpath,'StructurePcd.dsc'))
-    infwrite = write2file(os.path.join(self.outputpath, 'StructurePcd.inf'))
-    conf = Config(self.Config)
-    ids,title,info,header,inf=self.main()
-    decwrite.add2file(decstatement)
-    decwrite.add2file(header)
-    infwrite.add2file(infstatement)
-    infwrite.add2file(inf)
-    dscwrite.add2file(dscstatement)
-    for id in ids:
-      dscwrite.add2file(conf.eval_id(id))
-      if title_flag:
-        dscwrite.add2file(title)
-        title_flag=0
-      if len(info) == 1:
-        dscwrite.add2file(info)
-      elif len(info) == 2:
-        if info_flag:
-          dscwrite.add2file(info[0])
-          info_flag =0
+    def del_repeat(self, List):
+        if len(List) == 1 or len(List) == 0:
+            return List
         else:
-          dscwrite.add2file(info[1])
-
-  def del_repeat(self,List):
-    if len(List) == 1 or len(List) == 0:
-      return List
-    else:
-      if type(List[0]) != type('xxx'):
-        alist=[]
-        for i in range(len(List)):
-          if i == 0:
-            alist.append(List[0])
-          else:
-            plist = []
-            for j in range(i):
-              plist += List[j]
-            alist.append(self.__del(list(set(plist)), List[i]))
-        return alist
-      else:
-        return list(set(List))
+            if type(List[0]) != type('xxx'):
+                alist = []
+                for i in range(len(List)):
+                    if i == 0:
+                        alist.append(List[0])
+                    else:
+                        plist = []
+                        for j in range(i):
+                            plist += List[j]
+                        alist.append(self.__del(list(set(plist)), List[i]))
+                return alist
+            else:
+                return list(set(List))
 
+    def __del(self, list1, list2):
+        return list(set(list2).difference(set(list1)))
 
-  def __del(self,list1,list2):
-    return list(set(list2).difference(set(list1)))
+    def reverse_dict(self, dict):
+        data = {}
+        for i in list(dict.items()):
+            if i[1] not in list(data.keys()):
+                data[i[1]] = [i[0]]
+            else:
+                data[i[1]].append(i[0])
+        return data
 
-  def reverse_dict(self,dict):
-    data={}
-    for i in list(dict.items()):
-      if i[1] not in list(data.keys()):
-        data[i[1]]=[i[0]]
-      else:
-        data[i[1]].append(i[0])
-    return data
+    def read_list(self, list):
+        title_list = []
+        info_list = []
+        for i in list[1]:
+            title_list.append(i[0])
+            for j in i[1]:
+                info_list.append(j)
+        return list[0], title_list, info_list
 
-  def read_list(self,list):
-    title_list=[]
-    info_list=[]
-    for i in list[1]:
-      title_list.append(i[0])
-      for j in i[1]:
-        info_list.append(j)
-    return list[0],title_list,info_list
+    def plus(self, list):
+        nums = []
+        for i in list:
+            if type(i) != type([0]):
+                self.init += 1
+                num = "0x%01x" % self.init
+                j = i.replace('0xFCD00000', num.upper())
+                nums.append(j)
+        return nums
 
-  def plus(self,list):
-    nums=[]
-    for i in list:
-      if type(i) != type([0]):
-        self.init += 1
-        num = "0x%01x" % self.init
-        j=i.replace('0xFCD00000',num.upper())
-        nums.append(j)
-    return nums
 
 class write2file(object):
 
-  def __init__(self,Output):
-    self.output=Output
-    self.text=''
-    if os.path.exists(self.output):
-      os.remove(self.output)
+    def __init__(self, Output):
+        self.output = Output
+        self.text = ''
+        if os.path.exists(self.output):
+            os.remove(self.output)
 
-  def add2file(self,content):
-    self.text = ''
-    with open(self.output,'a+') as file:
-      file.write(self.__gen(content))
+    def add2file(self, content):
+        self.text = ''
+        with open(self.output, 'a+') as file:
+            file.write(self.__gen(content))
 
-  def __gen(self,content):
-    if type(content) == type(''):
-      return content
-    elif type(content) == type([0,0])or type(content) == type((0,0)):
-      return self.__readlist(content)
-    elif type(content) == type({0:0}):
-      return self.__readdict(content)
+    def __gen(self, content):
+        if type(content) == type(''):
+            return content
+        elif type(content) == type([0, 0]) or type(content) == type((0, 0)):
+            return self.__readlist(content)
+        elif type(content) == type({0: 0}):
+            return self.__readdict(content)
 
-  def __readlist(self,list):
-    for i in list:
-      if type(i) == type([0,0])or type(i) == type((0,0)):
-        self.__readlist(i)
-      elif type(i) == type('') :
-        self.text +=i
-    return self.text
+    def __readlist(self, list):
+        for i in list:
+            if type(i) == type([0, 0]) or type(i) == type((0, 0)):
+                self.__readlist(i)
+            elif type(i) == type(''):
+                self.text += i
+        return self.text
+
+    def __readdict(self, dict):
+        content = list(dict.items())
+        return self.__readlist(content)
 
-  def __readdict(self,dict):
-    content=list(dict.items())
-    return self.__readlist(content)
 
 def stamp():
-  return datetime.datetime.now()
+    return datetime.datetime.now()
 
-def dtime(start,end,id=None):
-  if id:
-    pass
-    print("%s time:%s" % (id,str(end - start)))
-  else:
-    print("Total time:%s" %str(end-start)[:-7])
+
+def dtime(start, end, id=None):
+    if id:
+        pass
+        print("%s time:%s" % (id, str(end - start)))
+    else:
+        print("Total time:%s" % str(end-start)[:-7])
 
 
 def main():
-  start = stamp()
-  parser = argparse.ArgumentParser(prog = __prog__,
-                                   description = __description__ + __copyright__,
-                                   conflict_handler = 'resolve')
-  parser.add_argument('-v', '--version', action = 'version',version = __version__, help="show program's version number and exit")
-  parser.add_argument('-p', '--path', metavar='PATH', dest='path', help="platform build output directory")
-  parser.add_argument('-c', '--config',metavar='FILENAME', dest='config', help="firmware configuration file")
-  parser.add_argument('-o', '--outputdir', metavar='PATH', dest='output', help="output directoy")
-  options = parser.parse_args()
-  if options.config:
-    if options.path:
-      if options.output:
-        run = mainprocess(options.path, options.config, options.output)
-        print("Running...")
-        run.write_all()
-        if WARNING:
-          warning = list(set(WARNING))
-          for j in warning:
-            print(j)
-        if ERRORMSG:
-          ERROR = list(set(ERRORMSG))
-          with open("ERROR.log", 'w+') as error:
-            for i in ERROR:
-              error.write(i + '\n')
-          print("Some error find, error log in ERROR.log")
-        print('Finished, Output files in directory %s'%os.path.abspath(options.output))
-      else:
-        print('Error command, no output path, use -h for help')
+    start = stamp()
+    parser = argparse.ArgumentParser(prog=__prog__,
+                                     description=__description__ + __copyright__,
+                                     conflict_handler='resolve')
+    parser.add_argument('-v', '--version', action='version',
+                        version=__version__, help="show program's version number and exit")
+    parser.add_argument('-p', '--path', metavar='PATH',
+                        dest='path', help="platform build output directory")
+    parser.add_argument('-c', '--config', metavar='FILENAME',
+                        dest='config', help="firmware configuration file")
+    parser.add_argument('-o', '--outputdir', metavar='PATH',
+                        dest='output', help="output directory")
+    options = parser.parse_args()
+    if options.config:
+        if options.path:
+            if options.output:
+                run = mainprocess(options.path, options.config, options.output)
+                print("Running...")
+                run.write_all()
+                if WARNING:
+                    warning = list(set(WARNING))
+                    for j in warning:
+                        print(j)
+                if ERRORMSG:
+                    ERROR = list(set(ERRORMSG))
+                    with open("ERROR.log", 'w+') as error:
+                        for i in ERROR:
+                            error.write(i + '\n')
+                    print("Some error find, error log in ERROR.log")
+                print('Finished, Output files in directory %s' %
+                      os.path.abspath(options.output))
+            else:
+                print('Error command, no output path, use -h for help')
+        else:
+            print('Error command, no build path input, use -h for help')
     else:
-      print('Error command, no build path input, use -h for help')
-  else:
-    print('Error command, no output file, use -h for help')
-  end = stamp()
-  dtime(start, end)
+        print('Error command, no output file, use -h for help')
+    end = stamp()
+    dtime(start, end)
+
 
 if __name__ == '__main__':
-  main()
+    main()
diff --git a/BaseTools/Scripts/ConvertMasmToNasm.py b/BaseTools/Scripts/ConvertMasmToNasm.py
index 0ef5a8badb94..c4439dd5c5bc 100755
--- a/BaseTools/Scripts/ConvertMasmToNasm.py
+++ b/BaseTools/Scripts/ConvertMasmToNasm.py
@@ -834,7 +834,8 @@ class ConvertInfFile(CommonUtils):
                 reldst = self.RootRelative(dst)
                 print('Unabled to convert', reldst)
         if unsupportedArchCount > 0 and not self.args.quiet:
-            print('Skipped', unsupportedArchCount, 'files based on architecture')
+            print('Skipped', unsupportedArchCount,
+                  'files based on architecture')
 
     def UpdateInfAsmFile(self, dst, IgnoreMissingAsm=False):
         infPath = os.path.split(os.path.realpath(self.inf))[0]
@@ -961,7 +962,8 @@ class ConvertInfFiles(CommonUtils):
                 reldst = self.RootRelative(dst)
                 print('Unabled to convert', reldst)
         if unsupportedArchCount > 0 and not self.args.quiet:
-            print('Skipped', unsupportedArchCount, 'files based on architecture')
+            print('Skipped', unsupportedArchCount,
+                  'files based on architecture')
 
 
 class ConvertDirectories(CommonUtils):
@@ -1002,4 +1004,5 @@ class ConvertAsmApp(CommonUtils):
         elif not self.dirmode:
             ConvertAsmFile(src, dst, self)
 
+
 ConvertAsmApp()
diff --git a/BaseTools/Scripts/ConvertUni.py b/BaseTools/Scripts/ConvertUni.py
index d87550b63132..c4980f1e1a6f 100755
--- a/BaseTools/Scripts/ConvertUni.py
+++ b/BaseTools/Scripts/ConvertUni.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #  Check a patch for various format issues
 #
 #  Copyright (c) 2015, Intel Corporation. All rights reserved.<BR>
@@ -7,14 +7,14 @@
 #
 
 from __future__ import print_function
+import sys
+import os
+import codecs
+import argparse
 
 VersionNumber = '0.1'
 __copyright__ = "Copyright (c) 2015, Intel Corporation  All rights reserved."
 
-import argparse
-import codecs
-import os
-import sys
 
 class ConvertOneArg:
     """Converts utf-16 to utf-8 for one command line argument.
@@ -82,7 +82,8 @@ class ConvertOneArg:
         f.write(new_content)
         f.close()
 
-        print(source + ": converted, size", len(file_content), '=>', len(new_content))
+        print(source + ": converted, size",
+              len(file_content), '=>', len(new_content))
         return True
 
 
@@ -121,5 +122,6 @@ class ConvertUniApp:
         self.args = parser.parse_args()
         self.utf8 = not self.args.utf_16
 
+
 if __name__ == "__main__":
     sys.exit(ConvertUniApp().retval)
diff --git a/BaseTools/Scripts/DetectNotUsedItem.py b/BaseTools/Scripts/DetectNotUsedItem.py
index 61481fb9187c..807627d5e9fe 100644
--- a/BaseTools/Scripts/DetectNotUsedItem.py
+++ b/BaseTools/Scripts/DetectNotUsedItem.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Detect unreferenced PCD and GUID/Protocols/PPIs.
 #
 # Copyright (c) 2019, Intel Corporation. All rights reserved.
@@ -73,7 +73,8 @@ class PROCESS(object):
                     Comment_Line.append(Index)
                     if NotComment:
                         if content != "\n" and content != "\r\n":
-                            ItemName[Index] = content.split('=')[0].split('|')[0].split('#')[0].strip()
+                            ItemName[Index] = content.split('=')[0].split('|')[
+                                0].split('#')[0].strip()
                             Comments[Index] = Comment_Line
                             Comment_Line = []
         return ItemName, Comments
@@ -103,7 +104,8 @@ class PROCESS(object):
         InfDscFdfContent = self.ParserDscFdfInfFile()
         for LineNum in list(DecItem.keys()):
             DecItemName = DecItem[LineNum]
-            Match_reg = re.compile("(?<![a-zA-Z0-9_-])%s(?![a-zA-Z0-9_-])" % DecItemName)
+            Match_reg = re.compile(
+                "(?<![a-zA-Z0-9_-])%s(?![a-zA-Z0-9_-])" % DecItemName)
             MatchFlag = False
             for Line in InfDscFdfContent:
                 if Match_reg.search(Line):
@@ -115,13 +117,16 @@ class PROCESS(object):
         return NotUsedItem, DecComments
 
     def Display(self, UnuseDict):
-        print("DEC File:\n%s\n%s%s" % (self.Dec, "{:<15}".format("Line Number"), "{:<0}".format("Unused Item")))
+        print("DEC File:\n%s\n%s%s" % (self.Dec, "{:<15}".format(
+            "Line Number"), "{:<0}".format("Unused Item")))
         self.Log.append(
             "DEC File:\n%s\n%s%s\n" % (self.Dec, "{:<15}".format("Line Number"), "{:<0}".format("Unused Item")))
         for num in list(sorted(UnuseDict.keys())):
             ItemName = UnuseDict[num]
-            print("%s%s%s" % (" " * 3, "{:<12}".format(num + 1), "{:<1}".format(ItemName)))
-            self.Log.append(("%s%s%s\n" % (" " * 3, "{:<12}".format(num + 1), "{:<1}".format(ItemName))))
+            print("%s%s%s" %
+                  (" " * 3, "{:<12}".format(num + 1), "{:<1}".format(ItemName)))
+            self.Log.append(
+                ("%s%s%s\n" % (" " * 3, "{:<12}".format(num + 1), "{:<1}".format(ItemName))))
 
     def Clean(self, UnUseDict, Comments):
         removednum = []
@@ -174,7 +179,8 @@ def main():
     parser = argparse.ArgumentParser(prog=__prog__,
                                      description=__description__ + __copyright__,
                                      conflict_handler='resolve')
-    parser.add_argument('-i', '--input', metavar="", dest='InputDec', help="Input DEC file name.")
+    parser.add_argument('-i', '--input', metavar="",
+                        dest='InputDec', help="Input DEC file name.")
     parser.add_argument('--dirs', metavar="", action='append', dest='Dirs',
                         help="The package directory. To specify more directories, please repeat this option.")
     parser.add_argument('--clean', action='store_true', default=False, dest='Clean',
@@ -187,7 +193,8 @@ def main():
             print("Error: Invalid DEC file input: %s" % options.InputDec)
         if options.Dirs:
             M = Main()
-            M.mainprocess(options.InputDec, options.Dirs, options.Clean, options.Logfile)
+            M.mainprocess(options.InputDec, options.Dirs,
+                          options.Clean, options.Logfile)
         else:
             print("Error: the following argument is required:'--dirs'.")
     else:
diff --git a/BaseTools/Scripts/FormatDosFiles.py b/BaseTools/Scripts/FormatDosFiles.py
index e119334dede7..79f3a82fc7d3 100644
--- a/BaseTools/Scripts/FormatDosFiles.py
+++ b/BaseTools/Scripts/FormatDosFiles.py
@@ -18,14 +18,17 @@ import re
 import sys
 import copy
 
-__prog__        = 'FormatDosFiles'
-__version__     = '%s Version %s' % (__prog__, '0.10 ')
-__copyright__   = 'Copyright (c) 2018-2019, Intel Corporation. All rights reserved.'
+__prog__ = 'FormatDosFiles'
+__version__ = '%s Version %s' % (__prog__, '0.10 ')
+__copyright__ = 'Copyright (c) 2018-2019, Intel Corporation. All rights reserved.'
 __description__ = 'Convert source files to meet the EDKII C Coding Standards Specification.\n'
-DEFAULT_EXT_LIST = ['.h', '.c', '.nasm', '.nasmb', '.asm', '.S', '.inf', '.dec', '.dsc', '.fdf', '.uni', '.asl', '.aslc', '.vfr', '.idf', '.txt', '.bat', '.py']
+DEFAULT_EXT_LIST = ['.h', '.c', '.nasm', '.nasmb', '.asm', '.S', '.inf', '.dec',
+                    '.dsc', '.fdf', '.uni', '.asl', '.aslc', '.vfr', '.idf', '.txt', '.bat', '.py']
+
+# To work in both python2 and python3, the re pattern should use a binary string, which is the bytes type in python3.
+# This is because reading a file in binary mode in python3 returns bytes, and bytes cannot be mixed with str.
+
 
-#For working in python2 and python3 environment, re pattern should use binary string, which is bytes type in python3.
-#Because in python3,read from file in binary mode will return bytes type,and in python3 bytes type can not be mixed with str type.
 def FormatFile(FilePath, Args):
     with open(FilePath, 'rb') as Fd:
         Content = Fd.read()
@@ -43,6 +46,7 @@ def FormatFile(FilePath, Args):
             if not Args.Quiet:
                 print(FilePath)
 
+
 def FormatFilesInDir(DirPath, ExtList, Args):
 
     FileList = []
@@ -68,12 +72,14 @@ def FormatFilesInDir(DirPath, ExtList, Args):
             if Continue:
                 continue
         for FileName in [f for f in FileNames if any(f.endswith(ext) for ext in ExtList)]:
-                FileList.append(os.path.join(DirPath, FileName))
+            FileList.append(os.path.join(DirPath, FileName))
     for File in FileList:
         FormatFile(File, Args)
 
+
 if __name__ == "__main__":
-    parser = argparse.ArgumentParser(prog=__prog__, description=__description__ + __copyright__, conflict_handler = 'resolve')
+    parser = argparse.ArgumentParser(
+        prog=__prog__, description=__description__ + __copyright__, conflict_handler='resolve')
 
     parser.add_argument('Path', nargs='+',
                         help='the path for files to be converted.It could be directory or file path.')
@@ -88,7 +94,8 @@ if __name__ == "__main__":
                         help='reduce output messages')
     parser.add_argument('--debug', dest='Debug', type=int, metavar='[0-9]', choices=range(0, 10), default=0,
                         help='set debug level')
-    parser.add_argument('--exclude', dest='Exclude', nargs='+', help="directory name or file name which will be excluded")
+    parser.add_argument('--exclude', dest='Exclude', nargs='+',
+                        help="directory name or file name which will be excluded")
     args = parser.parse_args()
     DefaultExt = copy.copy(DEFAULT_EXT_LIST)
 
diff --git a/BaseTools/Scripts/GetMaintainer.py b/BaseTools/Scripts/GetMaintainer.py
index d1e042c0afe4..b289decb8471 100644
--- a/BaseTools/Scripts/GetMaintainer.py
+++ b/BaseTools/Scripts/GetMaintainer.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #  Retrieves the people to request review from on submission of a commit.
 #
 #  Copyright (c) 2019, Linaro Ltd. All rights reserved.<BR>
@@ -25,6 +25,7 @@ EXPRESSIONS = {
     'webpage':    re.compile(r'^W:\s*(?P<webpage>.*?)\r*$')
 }
 
+
 def printsection(section):
     """Prints out the dictionary describing a Maintainers.txt section."""
     print('===')
@@ -33,6 +34,7 @@ def printsection(section):
         for item in section[key]:
             print('  %s' % item)
 
+
 def pattern_to_regex(pattern):
     """Takes a string containing regular UNIX path wildcards
        and returns a string suitable for matching with regex."""
@@ -49,6 +51,7 @@ def pattern_to_regex(pattern):
 
     return pattern
 
+
 def path_in_section(path, section):
     """Returns True of False indicating whether the path is covered by
        the current section."""
@@ -72,6 +75,7 @@ def path_in_section(path, section):
 
     return False
 
+
 def get_section_maintainers(path, section):
     """Returns a list with email addresses to any M: and R: entries
        matching the provided path in the provided section."""
@@ -82,7 +86,8 @@ def get_section_maintainers(path, section):
     if path_in_section(path, section):
         for status in section['status']:
             if status not in nowarn_status:
-                print('WARNING: Maintained status for "%s" is \'%s\'!' % (path, status))
+                print('WARNING: Maintained status for "%s" is \'%s\'!' %
+                      (path, status))
         for address in section['maintainer'], section['reviewer']:
             # Convert to list if necessary
             if isinstance(address, list):
@@ -98,6 +103,7 @@ def get_section_maintainers(path, section):
 
     return maintainers, lists
 
+
 def get_maintainers(path, sections, level=0):
     """For 'path', iterates over all sections, returning maintainers
        for matching ones."""
@@ -115,7 +121,8 @@ def get_maintainers(path, sections, level=0):
         # REPO.working_dir/<default>
         print('"%s": no maintainers found, looking for default' % path)
         if level == 0:
-            maintainers = get_maintainers('<default>', sections, level=level + 1)
+            maintainers = get_maintainers(
+                '<default>', sections, level=level + 1)
         else:
             print("No <default> maintainers set for project.")
         if not maintainers:
@@ -123,6 +130,7 @@ def get_maintainers(path, sections, level=0):
 
     return maintainers + lists
 
+
 def parse_maintainers_line(line):
     """Parse one line of Maintainers.txt, returning any match group and its key."""
     for key, expression in EXPRESSIONS.items():
@@ -131,6 +139,7 @@ def parse_maintainers_line(line):
             return key, match.group(key)
     return None, None
 
+
 def parse_maintainers_file(filename):
     """Parse the Maintainers.txt from top-level of repo and
        return a list containing dictionaries of all sections."""
@@ -153,11 +162,13 @@ def parse_maintainers_file(filename):
 
         return sectionlist
 
+
 def get_modified_files(repo, args):
     """Returns a list of the files modified by the commit specified in 'args'."""
     commit = repo.commit(args.commit)
     return commit.stats.files
 
+
 if __name__ == '__main__':
     PARSER = argparse.ArgumentParser(
         description='Retrieves information on who to cc for review on a given commit')
@@ -178,7 +189,7 @@ if __name__ == '__main__':
     SECTIONS = parse_maintainers_file(CONFIG_FILE)
 
     if ARGS.lookup:
-        FILES = [ARGS.lookup.replace('\\','/')]
+        FILES = [ARGS.lookup.replace('\\', '/')]
     else:
         FILES = get_modified_files(REPO, ARGS)
 
diff --git a/BaseTools/Scripts/GetUtcDateTime.py b/BaseTools/Scripts/GetUtcDateTime.py
index 3cfb6ac2ae77..8327854835c5 100644
--- a/BaseTools/Scripts/GetUtcDateTime.py
+++ b/BaseTools/Scripts/GetUtcDateTime.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #  Get current UTC date and time information and output as ascii code.
 #
 #  Copyright (c) 2019, Intel Corporation. All rights reserved.<BR>
@@ -6,10 +6,11 @@
 #  SPDX-License-Identifier: BSD-2-Clause-Patent
 #
 
-VersionNumber = '0.1'
-import sys
-import datetime
 import argparse
+import datetime
+import sys
+VersionNumber = '0.1'
+
 
 def Main():
     PARSER = argparse.ArgumentParser(
@@ -26,19 +27,20 @@ def Main():
 
     ARGS = PARSER.parse_args()
     if len(sys.argv) == 1:
-        print ("ERROR: At least one argument is required!\n")
+        print("ERROR: At least one argument is required!\n")
         PARSER.print_help()
 
     today = datetime.datetime.utcnow()
     if ARGS.year:
         ReversedNumber = str(today.year)[::-1]
-        print (''.join(hex(ord(HexString))[2:] for HexString in ReversedNumber))
+        print(''.join(hex(ord(HexString))[2:] for HexString in ReversedNumber))
     if ARGS.date:
         ReversedNumber = str(today.strftime("%m%d"))[::-1]
-        print (''.join(hex(ord(HexString))[2:] for HexString in ReversedNumber))
+        print(''.join(hex(ord(HexString))[2:] for HexString in ReversedNumber))
     if ARGS.time:
         ReversedNumber = str(today.strftime("%H%M"))[::-1]
-        print (''.join(hex(ord(HexString))[2:] for HexString in ReversedNumber))
+        print(''.join(hex(ord(HexString))[2:] for HexString in ReversedNumber))
+
 
 if __name__ == '__main__':
     Main()
diff --git a/BaseTools/Scripts/MemoryProfileSymbolGen.py b/BaseTools/Scripts/MemoryProfileSymbolGen.py
index 69df2fb99b30..0a2ee3f20311 100644
--- a/BaseTools/Scripts/MemoryProfileSymbolGen.py
+++ b/BaseTools/Scripts/MemoryProfileSymbolGen.py
@@ -17,6 +17,7 @@ from optparse import OptionParser
 versionNumber = "1.1"
 __copyright__ = "Copyright (c) 2016 - 2018, Intel Corporation. All rights reserved."
 
+
 class Symbols:
     def __init__(self):
         self.listLineAddress = []
@@ -26,36 +27,36 @@ class Symbols:
         # Cache for line
         self.sourceName = ""
 
-
-    def getSymbol (self, rva):
+    def getSymbol(self, rva):
         index = 0
-        lineName  = 0
+        lineName = 0
         sourceName = "??"
-        while index + 1 < self.lineCount :
-            if self.listLineAddress[index][0] <= rva and self.listLineAddress[index + 1][0] > rva :
+        while index + 1 < self.lineCount:
+            if self.listLineAddress[index][0] <= rva and self.listLineAddress[index + 1][0] > rva:
                 offset = rva - self.listLineAddress[index][0]
                 functionName = self.listLineAddress[index][1]
                 lineName = self.listLineAddress[index][2]
                 sourceName = self.listLineAddress[index][3]
-                if lineName == 0 :
-                  return " (" + self.listLineAddress[index][1] + "() - " + ")"
-                else :
-                  return " (" + self.listLineAddress[index][1] + "() - " + sourceName + ":" + str(lineName) + ")"
+                if lineName == 0:
+                    return " (" + self.listLineAddress[index][1] + "() - " + ")"
+                else:
+                    return " (" + self.listLineAddress[index][1] + "() - " + sourceName + ":" + str(lineName) + ")"
             index += 1
 
         return " (unknown)"
 
     def parse_debug_file(self, driverName, pdbName):
-        if cmp (pdbName, "") == 0 :
+        if cmp(pdbName, "") == 0:
             return
-        self.pdbName = pdbName;
+        self.pdbName = pdbName
 
         try:
             nmCommand = "nm"
             nmLineOption = "-l"
             print("parsing (debug) - " + pdbName)
-            os.system ('%s %s %s > nmDump.line.log' % (nmCommand, nmLineOption, pdbName))
-        except :
+            os.system('%s %s %s > nmDump.line.log' %
+                      (nmCommand, nmLineOption, pdbName))
+        except:
             print('ERROR: nm command not available.  Please verify PATH')
             return
 
@@ -70,36 +71,38 @@ class Symbols:
         patchLineFileMatchString = "([0-9a-fA-F]*)\s+[T|D|t|d]\s+(\w+)\s*((?:[a-zA-Z]:)?[\w+\-./_a-zA-Z0-9\\\\]*):?([0-9]*)"
 
         for reportLine in reportLines:
-            #print "check - " + reportLine
+            # print "check - " + reportLine
             match = re.match(patchLineFileMatchString, reportLine)
             if match is not None:
-                #print "match - " + reportLine[:-1]
-                #print "0 - " + match.group(0)
-                #print "1 - " + match.group(1)
-                #print "2 - " + match.group(2)
-                #print "3 - " + match.group(3)
-                #print "4 - " + match.group(4)
+                # print "match - " + reportLine[:-1]
+                # print "0 - " + match.group(0)
+                # print "1 - " + match.group(1)
+                # print "2 - " + match.group(2)
+                # print "3 - " + match.group(3)
+                # print "4 - " + match.group(4)
 
-                rva = int (match.group(1), 16)
+                rva = int(match.group(1), 16)
                 functionName = match.group(2)
                 sourceName = match.group(3)
-                if cmp (match.group(4), "") != 0 :
-                    lineName = int (match.group(4))
-                else :
+                if cmp(match.group(4), "") != 0:
+                    lineName = int(match.group(4))
+                else:
                     lineName = 0
-                self.listLineAddress.append ([rva, functionName, lineName, sourceName])
+                self.listLineAddress.append(
+                    [rva, functionName, lineName, sourceName])
 
-        self.lineCount = len (self.listLineAddress)
+        self.lineCount = len(self.listLineAddress)
 
-        self.listLineAddress = sorted(self.listLineAddress, key=lambda symbolAddress:symbolAddress[0])
+        self.listLineAddress = sorted(
+            self.listLineAddress, key=lambda symbolAddress: symbolAddress[0])
 
-        #for key in self.listLineAddress :
-            #print "rva - " + "%x"%(key[0]) + ", func - " + key[1] + ", line - " + str(key[2]) + ", source - " + key[3]
+        # for key in self.listLineAddress :
+        # print "rva - " + "%x"%(key[0]) + ", func - " + key[1] + ", line - " + str(key[2]) + ", source - " + key[3]
 
     def parse_pdb_file(self, driverName, pdbName):
-        if cmp (pdbName, "") == 0 :
+        if cmp(pdbName, "") == 0:
             return
-        self.pdbName = pdbName;
+        self.pdbName = pdbName
 
         try:
             #DIA2DumpCommand = "\"C:\\Program Files (x86)\Microsoft Visual Studio 14.0\\DIA SDK\\Samples\\DIA2Dump\\x64\\Debug\\Dia2Dump.exe\""
@@ -108,8 +111,9 @@ class Symbols:
             DIA2LinesOption = "-l"
             print("parsing (pdb) - " + pdbName)
             #os.system ('%s %s %s > DIA2Dump.symbol.log' % (DIA2DumpCommand, DIA2SymbolOption, pdbName))
-            os.system ('%s %s %s > DIA2Dump.line.log' % (DIA2DumpCommand, DIA2LinesOption, pdbName))
-        except :
+            os.system('%s %s %s > DIA2Dump.line.log' %
+                      (DIA2DumpCommand, DIA2LinesOption, pdbName))
+        except:
             print('ERROR: DIA2Dump command not available.  Please verify PATH')
             return
 
@@ -129,108 +133,120 @@ class Symbols:
         patchLineFileMatchStringFunc = "\*\*\s+(\w+)\s*"
 
         for reportLine in reportLines:
-            #print "check line - " + reportLine
+            # print "check line - " + reportLine
             match = re.match(patchLineFileMatchString, reportLine)
             if match is not None:
-                #print "match - " + reportLine[:-1]
-                #print "0 - " + match.group(0)
-                #print "1 - " + match.group(1)
-                #print "2 - " + match.group(2)
-                if cmp (match.group(3), "") != 0 :
+                # print "match - " + reportLine[:-1]
+                # print "0 - " + match.group(0)
+                # print "1 - " + match.group(1)
+                # print "2 - " + match.group(2)
+                if cmp(match.group(3), "") != 0:
                     self.sourceName = match.group(3)
                 sourceName = self.sourceName
                 functionName = self.functionName
 
-                rva = int (match.group(2), 16)
-                lineName = int (match.group(1))
-                self.listLineAddress.append ([rva, functionName, lineName, sourceName])
-            else :
+                rva = int(match.group(2), 16)
+                lineName = int(match.group(1))
+                self.listLineAddress.append(
+                    [rva, functionName, lineName, sourceName])
+            else:
                 match = re.match(patchLineFileMatchStringFunc, reportLine)
                 if match is not None:
                     self.functionName = match.group(1)
 
-        self.lineCount = len (self.listLineAddress)
-        self.listLineAddress = sorted(self.listLineAddress, key=lambda symbolAddress:symbolAddress[0])
+        self.lineCount = len(self.listLineAddress)
+        self.listLineAddress = sorted(
+            self.listLineAddress, key=lambda symbolAddress: symbolAddress[0])
+
+        # for key in self.listLineAddress :
+        # print "rva - " + "%x"%(key[0]) + ", func - " + key[1] + ", line - " + str(key[2]) + ", source - " + key[3]
 
-        #for key in self.listLineAddress :
-            #print "rva - " + "%x"%(key[0]) + ", func - " + key[1] + ", line - " + str(key[2]) + ", source - " + key[3]
 
 class SymbolsFile:
     def __init__(self):
         self.symbolsTable = {}
 
+
 symbolsFile = ""
 
 driverName = ""
 rvaName = ""
 symbolName = ""
 
+
 def getSymbolName(driverName, rva):
     global symbolsFile
 
-    #print "driverName - " + driverName
+    # print "driverName - " + driverName
 
-    try :
+    try:
         symbolList = symbolsFile.symbolsTable[driverName]
         if symbolList is not None:
-            return symbolList.getSymbol (rva)
+            return symbolList.getSymbol(rva)
         else:
             return " (???)"
     except Exception:
         return " (???)"
 
+
 def processLine(newline):
     global driverName
     global rvaName
 
     driverPrefixLen = len("Driver - ")
     # get driver name
-    if cmp(newline[0:driverPrefixLen], "Driver - ") == 0 :
+    if cmp(newline[0:driverPrefixLen], "Driver - ") == 0:
         driverlineList = newline.split(" ")
         driverName = driverlineList[2]
-        #print "Checking : ", driverName
+        # print "Checking : ", driverName
 
         # EDKII application output
         pdbMatchString = "Driver - \w* \(Usage - 0x[0-9a-fA-F]+\) \(Pdb - ([:\-.\w\\\\/]*)\)\s*"
         pdbName = ""
         match = re.match(pdbMatchString, newline)
         if match is not None:
-            #print "match - " + newline
-            #print "0 - " + match.group(0)
-            #print "1 - " + match.group(1)
+            # print "match - " + newline
+            # print "0 - " + match.group(0)
+            # print "1 - " + match.group(1)
             pdbName = match.group(1)
-            #print "PDB - " + pdbName
+            # print "PDB - " + pdbName
 
         symbolsFile.symbolsTable[driverName] = Symbols()
 
-        if cmp (pdbName[-3:], "pdb") == 0 :
-            symbolsFile.symbolsTable[driverName].parse_pdb_file (driverName, pdbName)
-        else :
-            symbolsFile.symbolsTable[driverName].parse_debug_file (driverName, pdbName)
+        if cmp(pdbName[-3:], "pdb") == 0:
+            symbolsFile.symbolsTable[driverName].parse_pdb_file(
+                driverName, pdbName)
+        else:
+            symbolsFile.symbolsTable[driverName].parse_debug_file(
+                driverName, pdbName)
 
-    elif cmp(newline, "") == 0 :
+    elif cmp(newline, "") == 0:
         driverName = ""
 
     # check entry line
-    if newline.find ("<==") != -1 :
+    if newline.find("<==") != -1:
         entry_list = newline.split(" ")
         rvaName = entry_list[4]
-        #print "rva : ", rvaName
-        symbolName = getSymbolName (driverName, int(rvaName, 16))
-    else :
+        # print "rva : ", rvaName
+        symbolName = getSymbolName(driverName, int(rvaName, 16))
+    else:
         rvaName = ""
         symbolName = ""
 
-    if cmp(rvaName, "") == 0 :
+    if cmp(rvaName, "") == 0:
         return newline
-    else :
+    else:
         return newline + symbolName
 
+
 def myOptionParser():
     usage = "%prog [--version] [-h] [--help] [-i inputfile [-o outputfile]]"
-    Parser = OptionParser(usage=usage, description=__copyright__, version="%prog " + str(versionNumber))
-    Parser.add_option("-i", "--inputfile", dest="inputfilename", type="string", help="The input memory profile info file output from MemoryProfileInfo application in MdeModulePkg")
-    Parser.add_option("-o", "--outputfile", dest="outputfilename", type="string", help="The output memory profile info file with symbol, MemoryProfileInfoSymbol.txt will be used if it is not specified")
+    Parser = OptionParser(usage=usage, description=__copyright__,
+                          version="%prog " + str(versionNumber))
+    Parser.add_option("-i", "--inputfile", dest="inputfilename", type="string",
+                      help="The input memory profile info file output from MemoryProfileInfo application in MdeModulePkg")
+    Parser.add_option("-o", "--outputfile", dest="outputfilename", type="string",
+                      help="The output memory profile info file with symbol, MemoryProfileInfoSymbol.txt will be used if it is not specified")
 
     (Options, args) = Parser.parse_args()
     if Options.inputfilename is None:
@@ -239,6 +255,7 @@ def myOptionParser():
         Options.outputfilename = "MemoryProfileInfoSymbol.txt"
     return Options
 
+
 def main():
     global symbolsFile
     global Options
@@ -246,12 +263,12 @@ def main():
 
     symbolsFile = SymbolsFile()
 
-    try :
+    try:
         file = open(Options.inputfilename)
     except Exception:
         print("fail to open " + Options.inputfilename)
         return 1
-    try :
+    try:
         newfile = open(Options.outputfilename, "w")
     except Exception:
         print("fail to open " + Options.outputfilename)
@@ -272,5 +289,6 @@ def main():
         file.close()
         newfile.close()
 
+
 if __name__ == '__main__':
     sys.exit(main())
diff --git a/BaseTools/Scripts/PackageDocumentTools/__init__.py b/BaseTools/Scripts/PackageDocumentTools/__init__.py
index 57dfebde5916..dd93c30888c4 100644
--- a/BaseTools/Scripts/PackageDocumentTools/__init__.py
+++ b/BaseTools/Scripts/PackageDocumentTools/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
 #
diff --git a/BaseTools/Scripts/PackageDocumentTools/packagedoc_cli.py b/BaseTools/Scripts/PackageDocumentTools/packagedoc_cli.py
index eb3164a553fa..3e7b6742c155 100644
--- a/BaseTools/Scripts/PackageDocumentTools/packagedoc_cli.py
+++ b/BaseTools/Scripts/PackageDocumentTools/packagedoc_cli.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This module provide command line entry for generating package document!
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -7,23 +7,29 @@
 #
 
 from __future__ import print_function
-import os, sys, logging, traceback, subprocess
+import os
+import sys
+import logging
+import traceback
+import subprocess
 from optparse import OptionParser
 
 from plugins.EdkPlugins.edk2.model import baseobject
 from plugins.EdkPlugins.edk2.model import doxygengen
 
-gArchMarcoDict = {'ALL'      : 'MDE_CPU_IA32 MDE_CPU_X64 MDE_CPU_EBC MDE_CPU_IPF _MSC_EXTENSIONS __GNUC__ __INTEL_COMPILER',
+gArchMarcoDict = {'ALL': 'MDE_CPU_IA32 MDE_CPU_X64 MDE_CPU_EBC MDE_CPU_IPF _MSC_EXTENSIONS __GNUC__ __INTEL_COMPILER',
                   'IA32_MSFT': 'MDE_CPU_IA32 _MSC_EXTENSIONS',
-                  'IA32_GNU' : 'MDE_CPU_IA32 __GNUC__',
-                  'X64_MSFT' : 'MDE_CPU_X64 _MSC_EXTENSIONS  ASM_PFX= OPTIONAL= ',
-                  'X64_GNU'  : 'MDE_CPU_X64 __GNUC__  ASM_PFX= OPTIONAL= ',
-                  'IPF_MSFT' : 'MDE_CPU_IPF _MSC_EXTENSIONS  ASM_PFX= OPTIONAL= ',
-                  'IPF_GNU'  : 'MDE_CPU_IPF __GNUC__  ASM_PFX= OPTIONAL= ',
+                  'IA32_GNU': 'MDE_CPU_IA32 __GNUC__',
+                  'X64_MSFT': 'MDE_CPU_X64 _MSC_EXTENSIONS  ASM_PFX= OPTIONAL= ',
+                  'X64_GNU': 'MDE_CPU_X64 __GNUC__  ASM_PFX= OPTIONAL= ',
+                  'IPF_MSFT': 'MDE_CPU_IPF _MSC_EXTENSIONS  ASM_PFX= OPTIONAL= ',
+                  'IPF_GNU': 'MDE_CPU_IPF __GNUC__  ASM_PFX= OPTIONAL= ',
                   'EBC_INTEL': 'MDE_CPU_EBC __INTEL_COMPILER  ASM_PFX= OPTIONAL= '}
 
+
 def parseCmdArgs():
-    parser = OptionParser(version="Package Document Generation Tools - Version 0.1")
+    parser = OptionParser(
+        version="Package Document Generation Tools - Version 0.1")
     parser.add_option('-w', '--workspace', action='store', type='string', dest='WorkspacePath',
                       help='Specify workspace absolute path. For example: c:\\tianocore')
     parser.add_option('-p', '--decfile', action='store', dest='PackagePath',
@@ -47,22 +53,27 @@ def parseCmdArgs():
     if options.WorkspacePath is None:
         errors.append('- Please specify workspace path via option -w!')
     elif not os.path.exists(options.WorkspacePath):
-        errors.append("- Invalid workspace path %s! The workspace path should be exist in absolute path!" % options.WorkspacePath)
+        errors.append(
+            "- Invalid workspace path %s! The workspace path should be exist in absolute path!" % options.WorkspacePath)
 
     if options.PackagePath is None:
         errors.append('- Please specify package DEC file path via option -p!')
     elif not os.path.exists(options.PackagePath):
-        errors.append("- Invalid package's DEC file path %s! The DEC path should be exist in absolute path!" % options.PackagePath)
+        errors.append(
+            "- Invalid package's DEC file path %s! The DEC path should be exist in absolute path!" % options.PackagePath)
 
     default = "C:\\Program Files\\doxygen\\bin\\doxygen.exe"
     if options.DoxygenPath is None:
         if os.path.exists(default):
-            print("Warning: Assume doxygen tool is installed at %s. If not, please specify via -x" % default)
+            print(
+                "Warning: Assume doxygen tool is installed at %s. If not, please specify via -x" % default)
             options.DoxygenPath = default
         else:
-            errors.append('- Please specify the path of doxygen tool installation via option -x! or install it in default path %s' % default)
+            errors.append(
+                '- Please specify the path of doxygen tool installation via option -x! or install it in default path %s' % default)
     elif not os.path.exists(options.DoxygenPath):
-        errors.append("- Invalid doxygen tool path %s! The doxygen tool path should be exist in absolute path!" % options.DoxygenPath)
+        errors.append(
+            "- Invalid doxygen tool path %s! The doxygen tool path should be exist in absolute path!" % options.DoxygenPath)
 
     if options.OutputPath is not None:
         if not os.path.exists(options.OutputPath):
@@ -70,20 +81,24 @@ def parseCmdArgs():
             try:
                 os.makedirs(options.OutputPath)
             except:
-                errors.append('- Fail to create the output directory %s' % options.OutputPath)
+                errors.append(
+                    '- Fail to create the output directory %s' % options.OutputPath)
     else:
         if options.PackagePath is not None and os.path.exists(options.PackagePath):
             dirpath = os.path.dirname(options.PackagePath)
-            default = os.path.join (dirpath, "Document")
-            print('Warning: Assume document output at %s. If not, please specify via option -o' % default)
+            default = os.path.join(dirpath, "Document")
+            print(
+                'Warning: Assume document output at %s. If not, please specify via option -o' % default)
             options.OutputPath = default
             if not os.path.exists(default):
                 try:
                     os.makedirs(default)
                 except:
-                    errors.append('- Fail to create default output directory %s! Please specify document output diretory via option -o' % default)
+                    errors.append(
+                        '- Fail to create default output directory %s! Please specify document output diretory via option -o' % default)
         else:
-            errors.append('- Please specify document output path via option -o!')
+            errors.append(
+                '- Please specify document output path via option -o!')
 
     if options.Arch is None:
         options.Arch = 'ALL'
@@ -104,29 +119,35 @@ def parseCmdArgs():
                 print('Warning: Assume the installation path of Microsoft HTML Workshop is %s. If not, specify via option -c.' % default)
                 options.HtmlWorkshopPath = default
             else:
-                errors.append('- Please specify the installation path of Microsoft HTML Workshop via option -c!')
+                errors.append(
+                    '- Please specify the installation path of Microsoft HTML Workshop via option -c!')
         elif not os.path.exists(options.HtmlWorkshopPath):
-            errors.append('- The installation path of Microsoft HTML Workshop %s does not exists. ' % options.HtmlWorkshopPath)
+            errors.append(
+                '- The installation path of Microsoft HTML Workshop %s does not exists. ' % options.HtmlWorkshopPath)
 
     if len(errors) != 0:
         print('\n')
-        parser.error('Fail to start due to following reasons: \n%s' %'\n'.join(errors))
+        parser.error('Fail to start due to following reasons: \n%s' %
+                     '\n'.join(errors))
     return (options.WorkspacePath, options.PackagePath, options.DoxygenPath, options.OutputPath,
             options.Arch, options.DocumentMode, options.IncludeOnly, options.HtmlWorkshopPath)
 
+
 def createPackageObject(wsPath, pkgPath):
     try:
         pkgObj = baseobject.Package(None, wsPath)
         pkgObj.Load(pkgPath)
     except:
-        logging.getLogger().error ('Fail to create package object!')
+        logging.getLogger().error('Fail to create package object!')
         return None
 
     return pkgObj
 
+
 def callbackLogMessage(msg, level):
     print(msg.strip())
 
+
 def callbackCreateDoxygenProcess(doxPath, configPath):
     if sys.platform == 'win32':
         cmd = '"%s" %s' % (doxPath, configPath)
@@ -146,7 +167,8 @@ def DocumentFixup(outPath, arch):
             if dir.lower() in ['.svn', '_svn', 'cvs']:
                 dirs.remove(dir)
         for file in files:
-            if not file.lower().endswith('.html'): continue
+            if not file.lower().endswith('.html'):
+                continue
             fullpath = os.path.join(outPath, root, file)
             try:
                 f = open(fullpath, 'r')
@@ -169,6 +191,7 @@ def DocumentFixup(outPath, arch):
 
     print('    >>> Finish all document fixing up! \n')
 
+
 def FixPageBaseLib(path, text):
     print('    >>> Fixup BaseLib file page at file %s \n' % path)
     lines = text.split('\n')
@@ -184,13 +207,13 @@ def FixPageBaseLib(path, text):
             lines[index] = '<td class="memname">#define BASE_LIBRARY_JUMP_BUFFER_ALIGNMENT&nbsp;&nbsp;&nbsp;9&nbsp;[EBC, x64]   </td>'
         if line.find('BASE_LIBRARY_JUMP_BUFFER_ALIGNMENT</a>&nbsp;&nbsp;&nbsp;4') != -1:
             lines[index] = lines[index].replace('BASE_LIBRARY_JUMP_BUFFER_ALIGNMENT</a>&nbsp;&nbsp;&nbsp;4',
-                                 'BASE_LIBRARY_JUMP_BUFFER_ALIGNMENT</a>&nbsp;&nbsp;&nbsp;4&nbsp;[IA32]')
+                                                'BASE_LIBRARY_JUMP_BUFFER_ALIGNMENT</a>&nbsp;&nbsp;&nbsp;4&nbsp;[IA32]')
         if line.find('BASE_LIBRARY_JUMP_BUFFER_ALIGNMENT</a>&nbsp;&nbsp;&nbsp;0x10') != -1:
             lines[index] = lines[index].replace('BASE_LIBRARY_JUMP_BUFFER_ALIGNMENT</a>&nbsp;&nbsp;&nbsp;0x10',
-                                 'BASE_LIBRARY_JUMP_BUFFER_ALIGNMENT</a>&nbsp;&nbsp;&nbsp;0x10&nbsp;[IPF]')
+                                                'BASE_LIBRARY_JUMP_BUFFER_ALIGNMENT</a>&nbsp;&nbsp;&nbsp;0x10&nbsp;[IPF]')
         if line.find('BASE_LIBRARY_JUMP_BUFFER_ALIGNMENT</a>&nbsp;&nbsp;&nbsp;8') != -1:
             lines[index] = lines[index].replace('BASE_LIBRARY_JUMP_BUFFER_ALIGNMENT</a>&nbsp;&nbsp;&nbsp;8',
-                                 'BASE_LIBRARY_JUMP_BUFFER_ALIGNMENT</a>&nbsp;&nbsp;&nbsp;8&nbsp;[x64, EBC]')
+                                                'BASE_LIBRARY_JUMP_BUFFER_ALIGNMENT</a>&nbsp;&nbsp;&nbsp;8&nbsp;[x64, EBC]')
         if line.find('>BASE_LIBRARY_JUMP_BUFFER</a>') != -1:
             if lastBaseJumpIndex != -1:
                 del lines[lastBaseJumpIndex]
@@ -208,15 +231,18 @@ def FixPageBaseLib(path, text):
         return
     print("    <<< Finish to fixup file %s\n" % path)
 
+
 def FixPageIA32_IDT_GATE_DESCRIPTOR(path, text):
     print('    >>> Fixup structure reference IA32_IDT_GATE_DESCRIPTOR at file %s \n' % path)
     lines = text.split('\n')
     for index in range(len(lines) - 1, -1, -1):
         line = lines[index].strip()
         if line.find('struct {</td>') != -1 and lines[index - 2].find('>Uint64</a></td>') != -1:
-            lines.insert(index, '<tr><td colspan="2"><br><h2>Data Fields For X64</h2></td></tr>')
+            lines.insert(
+                index, '<tr><td colspan="2"><br><h2>Data Fields For X64</h2></td></tr>')
         if line.find('struct {</td>') != -1 and lines[index - 1].find('Data Fields') != -1:
-            lines.insert(index, '<tr><td colspan="2"><br><h2>Data Fields For IA32</h2></td></tr>')
+            lines.insert(
+                index, '<tr><td colspan="2"><br><h2>Data Fields For IA32</h2></td></tr>')
     try:
         f = open(path, 'w')
         f.write('\n'.join(lines))
@@ -226,6 +252,7 @@ def FixPageIA32_IDT_GATE_DESCRIPTOR(path, text):
         return
     print("    <<< Finish to fixup file %s\n" % path)
 
+
 def FixPageBASE_LIBRARY_JUMP_BUFFER(path, text):
     print('    >>> Fixup structure reference BASE_LIBRARY_JUMP_BUFFER at file %s \n' % path)
     lines = text.split('\n')
@@ -237,23 +264,27 @@ def FixPageBASE_LIBRARY_JUMP_BUFFER(path, text):
             bInDetail = False
         if line.startswith('EBC context buffer used by') and lines[index - 1].startswith('x64 context buffer'):
             lines[index] = "IA32/IPF/X64/" + line
-            bNeedRemove  = True
+            bNeedRemove = True
         if line.startswith("x64 context buffer") or line.startswith('IPF context buffer used by') or \
            line.startswith('IA32 context buffer used by'):
             if bNeedRemove:
                 lines.remove(line)
         if line.find('>R0</a>') != -1 and not bInDetail:
             if lines[index - 1] != '<tr><td colspan="2"><br><h2>Data Fields For EBC</h2></td></tr>':
-                lines.insert(index, '<tr><td colspan="2"><br><h2>Data Fields For EBC</h2></td></tr>')
+                lines.insert(
+                    index, '<tr><td colspan="2"><br><h2>Data Fields For EBC</h2></td></tr>')
         if line.find('>Rbx</a>') != -1 and not bInDetail:
             if lines[index - 1] != '<tr><td colspan="2"><br><h2>Data Fields For X64</h2></td></tr>':
-                lines.insert(index, '<tr><td colspan="2"><br><h2>Data Fields For X64</h2></td></tr>')
+                lines.insert(
+                    index, '<tr><td colspan="2"><br><h2>Data Fields For X64</h2></td></tr>')
         if line.find('>F2</a>') != -1 and not bInDetail:
             if lines[index - 1] != '<tr><td colspan="2"><br><h2>Data Fields For IPF</h2></td></tr>':
-                lines.insert(index, '<tr><td colspan="2"><br><h2>Data Fields For IPF</h2></td></tr>')
+                lines.insert(
+                    index, '<tr><td colspan="2"><br><h2>Data Fields For IPF</h2></td></tr>')
         if line.find('>Ebx</a>') != -1 and not bInDetail:
             if lines[index - 1] != '<tr><td colspan="2"><br><h2>Data Fields For IA32</h2></td></tr>':
-                lines.insert(index, '<tr><td colspan="2"><br><h2>Data Fields For IA32</h2></td></tr>')
+                lines.insert(
+                    index, '<tr><td colspan="2"><br><h2>Data Fields For IA32</h2></td></tr>')
     try:
         f = open(path, 'w')
         f.write('\n'.join(lines))
@@ -263,17 +294,18 @@ def FixPageBASE_LIBRARY_JUMP_BUFFER(path, text):
         return
     print("    <<< Finish to fixup file %s\n" % path)
 
+
 def FixPageUefiDriverEntryPoint(path, text):
     print('    >>> Fixup file reference MdePkg/Include/Library/UefiDriverEntryPoint.h at file %s \n' % path)
     lines = text.split('\n')
     bInModuleEntry = False
-    bInEfiMain     = False
-    ModuleEntryDlCount  = 0
+    bInEfiMain = False
+    ModuleEntryDlCount = 0
     ModuleEntryDelStart = 0
-    ModuleEntryDelEnd   = 0
-    EfiMainDlCount      = 0
-    EfiMainDelStart     = 0
-    EfiMainDelEnd       = 0
+    ModuleEntryDelEnd = 0
+    EfiMainDlCount = 0
+    EfiMainDelStart = 0
+    EfiMainDelEnd = 0
 
     for index in range(len(lines)):
         line = lines[index].strip()
@@ -320,13 +352,13 @@ def FixPageUefiApplicationEntryPoint(path, text):
     print('    >>> Fixup file reference MdePkg/Include/Library/UefiApplicationEntryPoint.h at file %s \n' % path)
     lines = text.split('\n')
     bInModuleEntry = False
-    bInEfiMain     = False
-    ModuleEntryDlCount  = 0
+    bInEfiMain = False
+    ModuleEntryDlCount = 0
     ModuleEntryDelStart = 0
-    ModuleEntryDelEnd   = 0
-    EfiMainDlCount      = 0
-    EfiMainDelStart     = 0
-    EfiMainDelEnd       = 0
+    ModuleEntryDelEnd = 0
+    EfiMainDlCount = 0
+    EfiMainDelStart = 0
+    EfiMainDelEnd = 0
 
     for index in range(len(lines)):
         line = lines[index].strip()
@@ -368,12 +400,14 @@ def FixPageUefiApplicationEntryPoint(path, text):
         return
     print("    <<< Finish to fixup file %s\n" % path)
 
+
 if __name__ == '__main__':
     wspath, pkgpath, doxpath, outpath, archtag, docmode, isinc, hwpath = parseCmdArgs()
 
     # configure logging system
     logfilepath = os.path.join(outpath, 'log.txt')
-    logging.basicConfig(format='%(levelname)-8s %(message)s', level=logging.DEBUG)
+    logging.basicConfig(
+        format='%(levelname)-8s %(message)s', level=logging.DEBUG)
 
     # create package model object firstly
     pkgObj = createPackageObject(wspath, pkgpath)
@@ -381,13 +415,13 @@ if __name__ == '__main__':
         sys.exit(-1)
 
     # create doxygen action model
-    arch    = None
+    arch = None
     tooltag = None
     if archtag.lower() != 'all':
         arch = archtag.split('_')[0]
         tooltag = archtag.split('_')[1]
     else:
-        arch    = 'all'
+        arch = 'all'
         tooltag = 'all'
 
     # preprocess package and call doxygen
@@ -419,6 +453,8 @@ if __name__ == '__main__':
         else:
             cmd = '%s %s' % (hwpath, indexpath)
         subprocess.call(cmd)
-        print('\nFinish to generate package document! Please open %s for review' % os.path.join(outpath, 'html', 'index.chm'))
+        print('\nFinish to generate package document! Please open %s for review' %
+              os.path.join(outpath, 'html', 'index.chm'))
     else:
-        print('\nFinish to generate package document! Please open %s for review' % os.path.join(outpath, 'html', 'index.html'))
+        print('\nFinish to generate package document! Please open %s for review' %
+              os.path.join(outpath, 'html', 'index.html'))
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/__init__.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/__init__.py
index 57dfebde5916..dd93c30888c4 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/__init__.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
 #
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/__init__.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/__init__.py
index 57dfebde5916..dd93c30888c4 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/__init__.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
 #
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/doxygen.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/doxygen.py
index a8f87751a0fa..9081234a9088 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/doxygen.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/doxygen.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
 #
@@ -11,10 +11,11 @@ import os
 
 from .message import *
 
+
 class BaseDoxygeItem:
     def __init__(self, name, tag=''):
         self.mName = name
-        self.mTag  = tag
+        self.mTag = tag
         self.mDescription = ''
         self.mText = []
 
@@ -27,6 +28,7 @@ class BaseDoxygeItem:
     def Generate(self):
         """This interface need to be override"""
 
+
 class Section(BaseDoxygeItem):
     def Generate(self):
         """This interface need to be override"""
@@ -38,13 +40,14 @@ class Section(BaseDoxygeItem):
         self.mText.append(self.mDescription)
         return self.mText
 
+
 class Page(BaseDoxygeItem):
     def __init__(self, name, tag=None, isSort=True):
         BaseDoxygeItem.__init__(self, name, tag)
-        self.mSubPages     = []
-        self.mIsMainPage   = False
-        self.mSections     = []
-        self.mIsSort       = isSort
+        self.mSubPages = []
+        self.mIsMainPage = False
+        self.mSections = []
+        self.mIsSort = isSort
 
     def GetSubpageCount(self):
         return len(self.mSubPages)
@@ -88,7 +91,8 @@ class Page(BaseDoxygeItem):
             if self.mIsSort:
                 self.mSubPages.sort(key=lambda x: x.mName.lower())
             for page in self.mSubPages:
-                self.mText.insert(endIndex, '<li>\subpage %s \"%s\" </li>' % (page.mTag, page.mName))
+                self.mText.insert(
+                    endIndex, '<li>\subpage %s \"%s\" </li>' % (page.mTag, page.mName))
                 endIndex += 1
                 self.mText += page.Generate()
             self.mText.insert(endIndex, '</ul>')
@@ -96,10 +100,11 @@ class Page(BaseDoxygeItem):
         self.mText.insert(endIndex, ' **/')
         return self.mText
 
+
 class DoxygenFile(Page):
     def __init__(self, name, file):
         Page.__init__(self, name)
-        self.mFilename  = file
+        self.mFilename = file
         self.mIsMainPage = True
 
     def GetFilename(self):
@@ -112,11 +117,12 @@ class DoxygenFile(Page):
             f.write('\n'.join(str))
             f.close()
         except IOError as e:
-            ErrorMsg ('Fail to write file %s' % self.mFilename)
+            ErrorMsg('Fail to write file %s' % self.mFilename)
             return False
 
         return True
 
+
 doxygenConfigTemplate = """
 DOXYFILE_ENCODING      = UTF-8
 PROJECT_NAME           = %(ProjectName)s
@@ -326,19 +332,21 @@ DOT_CLEANUP            = YES
 SEARCHENGINE           = NO
 
 """
+
+
 class DoxygenConfigFile:
     def __init__(self):
-        self.mProjectName  = ''
-        self.mOutputDir    = ''
-        self.mFileList     = []
-        self.mIncludeList  = []
-        self.mStripPath    = ''
-        self.mExamplePath  = ''
-        self.mPattern      = ['*.c', '*.h',
-                              '*.asm', '*.s', '.nasm', '*.html', '*.dox']
-        self.mMode         = 'HTML'
-        self.mWarningFile  = ''
-        self.mPreDefined   = []
+        self.mProjectName = ''
+        self.mOutputDir = ''
+        self.mFileList = []
+        self.mIncludeList = []
+        self.mStripPath = ''
+        self.mExamplePath = ''
+        self.mPattern = ['*.c', '*.h',
+                         '*.asm', '*.s', '.nasm', '*.html', '*.dox']
+        self.mMode = 'HTML'
+        self.mWarningFile = ''
+        self.mPreDefined = []
         self.mProjectVersion = 0.1
 
     def SetChmMode(self):
@@ -399,7 +407,7 @@ class DoxygenConfigFile:
         self.mPreDefined.append(macro)
 
     def Generate(self, path):
-        files    = ' \\\n'.join(self.mFileList)
+        files = ' \\\n'.join(self.mFileList)
         includes = ' \\\n'.join(self.mIncludeList)
         patterns = ' \\\n'.join(self.mPattern)
         if self.mMode.lower() == 'html':
@@ -409,32 +417,33 @@ class DoxygenConfigFile:
             sHtmlHelp = 'YES'
             sTreeView = 'NO'
 
-        text = doxygenConfigTemplate % {'ProjectName':self.mProjectName,
-                                        'OutputDir':self.mOutputDir,
-                                        'StripPath':self.mStripPath,
-                                        'ExamplePath':self.mExamplePath,
-                                        'FileList':files,
-                                        'Pattern':patterns,
-                                        'WhetherGenerateHtmlHelp':sHtmlHelp,
-                                        'WhetherGenerateTreeView':sTreeView,
-                                        'IncludePath':includes,
-                                        'WarningFile':self.mWarningFile,
-                                        'PreDefined':' '.join(self.mPreDefined),
-                                        'ProjectVersion':self.mProjectVersion}
+        text = doxygenConfigTemplate % {'ProjectName': self.mProjectName,
+                                        'OutputDir': self.mOutputDir,
+                                        'StripPath': self.mStripPath,
+                                        'ExamplePath': self.mExamplePath,
+                                        'FileList': files,
+                                        'Pattern': patterns,
+                                        'WhetherGenerateHtmlHelp': sHtmlHelp,
+                                        'WhetherGenerateTreeView': sTreeView,
+                                        'IncludePath': includes,
+                                        'WarningFile': self.mWarningFile,
+                                        'PreDefined': ' '.join(self.mPreDefined),
+                                        'ProjectVersion': self.mProjectVersion}
         try:
             f = open(path, 'w')
             f.write(text)
             f.close()
         except IOError as e:
-            ErrorMsg ('Fail to generate doxygen config file %s' % path)
+            ErrorMsg('Fail to generate doxygen config file %s' % path)
             return False
 
         return True
 
+
 ########################################################################
 #                  TEST                   CODE
 ########################################################################
-if __name__== '__main__':
+if __name__ == '__main__':
     df = DoxygenFile('Platform Document', 'm:\tree')
     df.AddPage(Page('Module', 'module'))
     p = df.AddPage(Page('Library', 'library'))
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/efibinary.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/efibinary.py
index 1bc4938bfcdf..c0d7518317e3 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/efibinary.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/efibinary.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
 #
@@ -13,9 +13,11 @@ import os
 import logging
 import core.pe as pe
 
+
 def GetLogger():
     return logging.getLogger('EFI Binary File')
 
+
 class EFIBinaryError(Exception):
     def __init__(self, message):
         Exception.__init__(self)
@@ -24,6 +26,7 @@ class EFIBinaryError(Exception):
     def GetMessage(self):
         return self._message
 
+
 class EfiFd(object):
     EFI_FV_HEADER_SIZE = 0x48
 
@@ -43,28 +46,29 @@ class EfiFd(object):
     def GetFvs(self):
         return self._fvs
 
+
 class EfiFv(object):
     FILE_SYSTEM_GUID = uuid.UUID('{8c8ce578-8a3d-4f1c-9935-896185c32dd3}')
 
     def __init__(self, parent=None):
-        self._size         = 0
-        self._filename     = None
-        self._fvheader     = None
+        self._size = 0
+        self._filename = None
+        self._fvheader = None
         self._blockentries = []
-        self._ffs          = []
+        self._ffs = []
 
         # following field is for FV in FD
-        self._parent       = parent
-        self._offset       = 0
-        self._raw          = array.array('B')
+        self._parent = parent
+        self._offset = 0
+        self._raw = array.array('B')
 
     def Load(self, fd):
-        self._offset   = fd.tell()
+        self._offset = fd.tell()
         self._filename = fd.name
 
         # get file header
         self._fvheader = EfiFirmwareVolumeHeader.Read(fd)
-        #self._fvheader.Dump()
+        # self._fvheader.Dump()
 
         self._size = self._fvheader.GetFvLength()
 
@@ -80,10 +84,10 @@ class EfiFv(object):
             self._blockentries.append(blockentry)
             blockentry = BlockMapEntry.Read(fd)
 
-
         if self._fvheader.GetSize() + (len(self._blockentries)) * 8 != \
            self._fvheader.GetHeaderLength():
-            raise EFIBinaryError("Volume Header length not consistent with block map!")
+            raise EFIBinaryError(
+                "Volume Header length not consistent with block map!")
 
         index = align(fd.tell(), 8)
         count = 0
@@ -120,10 +124,11 @@ class EfiFv(object):
     def GetRawData(self):
         return self._raw.tolist()
 
+
 class BinaryItem(object):
     def __init__(self, parent=None):
         self._size = 0
-        self._arr  = array.array('B')
+        self._arr = array.array('B')
         self._parent = parent
 
     @classmethod
@@ -144,6 +149,7 @@ class BinaryItem(object):
     def GetParent(self):
         return self._parent
 
+
 class EfiFirmwareVolumeHeader(BinaryItem):
     def GetSize(self):
         return 56
@@ -267,6 +273,7 @@ class EfiFirmwareVolumeHeader(BinaryItem):
     def GetRawData(self):
         return self._arr.tolist()
 
+
 class BlockMapEntry(BinaryItem):
     def GetSize(self):
         return 8
@@ -285,15 +292,16 @@ class BlockMapEntry(BinaryItem):
     def __str__(self):
         return '[BlockEntry] Number = 0x%X, length=0x%X' % (self.GetNumberBlocks(), self.GetLength())
 
+
 class EfiFfs(object):
-    FFS_HEADER_SIZE  = 24
+    FFS_HEADER_SIZE = 24
 
     def __init__(self, parent=None):
         self._header = None
 
         # following field is for FFS in FV file.
-        self._parent  = parent
-        self._offset  = 0
+        self._parent = parent
+        self._offset = 0
         self._sections = []
 
     def Load(self, fd):
@@ -335,8 +343,8 @@ class EfiFfs(object):
         return self._header.GetNameGuid()
 
     def DumpContent(self):
-        list  = self._content.tolist()
-        line  = []
+        list = self._content.tolist()
+        line = []
         count = 0
         for item in list:
             if count < 32:
@@ -358,13 +366,14 @@ class EfiFfs(object):
     def GetSections(self):
         return self._sections
 
+
 class EfiFfsHeader(BinaryItem):
-    ffs_state_map = {0x01:'EFI_FILE_HEADER_CONSTRUCTION',
-                     0x02:'EFI_FILE_HEADER_VALID',
-                     0x04:'EFI_FILE_DATA_VALID',
-                     0x08:'EFI_FILE_MARKED_FOR_UPDATE',
-                     0x10:'EFI_FILE_DELETED',
-                     0x20:'EFI_FILE_HEADER_INVALID'}
+    ffs_state_map = {0x01: 'EFI_FILE_HEADER_CONSTRUCTION',
+                     0x02: 'EFI_FILE_HEADER_VALID',
+                     0x04: 'EFI_FILE_DATA_VALID',
+                     0x08: 'EFI_FILE_MARKED_FOR_UPDATE',
+                     0x10: 'EFI_FILE_DELETED',
+                     0x20: 'EFI_FILE_HEADER_INVALID'}
 
     def GetSize(self):
         return 24
@@ -377,7 +386,6 @@ class EfiFfsHeader(BinaryItem):
         list = self._arr.tolist()
         return int(list[18])
 
-
     def GetTypeString(self):
         value = self.GetType()
         if value == 0x01:
@@ -454,7 +462,7 @@ class EfiSection(object):
     EFI_SECTION_HEADER_SIZE = 4
 
     def __init__(self, parent=None):
-        self._size   = 0
+        self._size = 0
         self._parent = parent
         self._offset = 0
         self._contents = array.array('B')
@@ -465,8 +473,8 @@ class EfiSection(object):
         self._header = EfiSectionHeader.Read(fd, self)
 
         if self._header.GetTypeString() == "EFI_SECTION_PE32":
-             pefile = pe.PEFile(self)
-             pefile.Load(fd, self.GetContentSize())
+            pefile = pe.PEFile(self)
+            pefile.Load(fd, self.GetContentSize())
 
         fd.seek(self._offset)
         self._contents.fromfile(fd, self.GetContentSize())
@@ -491,6 +499,7 @@ class EfiSection(object):
     def GetSectionOffset(self):
         return self._offset + self.EFI_SECTION_HEADER_SIZE
 
+
 class EfiSectionHeader(BinaryItem):
     section_type_map = {0x01: 'EFI_SECTION_COMPRESSION',
                         0x02: 'EFI_SECTION_GUID_DEFINED',
@@ -505,6 +514,7 @@ class EfiSectionHeader(BinaryItem):
                         0x18: 'EFI_SECTION_FREEFORM_SUBTYPE_GUID',
                         0x19: 'EFI_SECTION_RAW',
                         0x1B: 'EFI_SECTION_PEI_DEPEX'}
+
     def GetSize(self):
         return 4
 
@@ -527,8 +537,10 @@ class EfiSectionHeader(BinaryItem):
         print('type = 0x%X' % self.GetType())
 
 
+rMapEntry = re.compile(
+    '^(\w+)[ \(\w\)]* \(BaseAddress=([0-9a-fA-F]+), EntryPoint=([0-9a-fA-F]+), GUID=([0-9a-fA-F\-]+)')
+
 
-rMapEntry = re.compile('^(\w+)[ \(\w\)]* \(BaseAddress=([0-9a-fA-F]+), EntryPoint=([0-9a-fA-F]+), GUID=([0-9a-fA-F\-]+)')
 class EfiFvMapFile(object):
     def __init__(self):
         self._mapentries = {}
@@ -549,12 +561,13 @@ class EfiFvMapFile(object):
                 # new entry
                 ret = rMapEntry.match(line)
                 if ret is not None:
-                    name     = ret.groups()[0]
+                    name = ret.groups()[0]
                     baseaddr = int(ret.groups()[1], 16)
-                    entry    = int(ret.groups()[2], 16)
-                    guidstr  = '{' + ret.groups()[3] + '}'
-                    guid     = uuid.UUID(guidstr)
-                    self._mapentries[guid] = EfiFvMapFileEntry(name, baseaddr, entry, guid)
+                    entry = int(ret.groups()[2], 16)
+                    guidstr = '{' + ret.groups()[3] + '}'
+                    guid = uuid.UUID(guidstr)
+                    self._mapentries[guid] = EfiFvMapFileEntry(
+                        name, baseaddr, entry, guid)
         return True
 
     def GetEntry(self, guid):
@@ -562,12 +575,13 @@ class EfiFvMapFile(object):
             return self._mapentries[guid]
         return None
 
+
 class EfiFvMapFileEntry(object):
     def __init__(self, name, baseaddr, entry, guid):
-        self._name     = name
+        self._name = name
         self._baseaddr = baseaddr
-        self._entry    = entry
-        self._guid     = guid
+        self._entry = entry
+        self._guid = guid
 
     def GetName(self):
         return self._name
@@ -578,6 +592,7 @@ class EfiFvMapFileEntry(object):
     def GetEntryPoint(self):
         return self._entry
 
+
 def list2guid(list):
     val1 = list2int(list[0:4])
     val2 = list2int(list[4:6])
@@ -586,20 +601,25 @@ def list2guid(list):
     for item in list[8:16]:
         val4 = (val4 << 8) | int(item)
 
-    val  = val1 << 12 * 8 | val2 << 10 * 8 | val3 << 8 * 8 | val4
+    val = val1 << 12 * 8 | val2 << 10 * 8 | val3 << 8 * 8 | val4
     guid = uuid.UUID(int=val)
     return guid
 
+
 def list2int(list):
     val = 0
     for index in range(len(list) - 1, -1, -1):
         val = (val << 8) | int(list[index])
     return val
 
+
 def align(value, alignment):
     return (value + ((alignment - value) & (alignment - 1)))
 
+
 gInvalidGuid = uuid.UUID(int=0xffffffffffffffffffffffffffffffff)
+
+
 def isValidGuid(guid):
     if guid == gInvalidGuid:
         return False
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/ini.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/ini.py
index 7facb9aa24c7..c51195d58336 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/ini.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/ini.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
 #
@@ -12,16 +12,19 @@ import os
 
 section_re = re.compile(r'^\[([\w., "]+)\]')
 
+
 class BaseINIFile(object):
     _objs = {}
+
     def __new__(cls, *args, **kwargs):
         """Maintain only a single instance of this object
         @return: instance of this class
 
         """
-        if len(args) == 0: return object.__new__(cls)
+        if len(args) == 0:
+            return object.__new__(cls)
         filename = args[0]
-        parent   = None
+        parent = None
         if len(args) > 1:
             parent = args[1]
 
@@ -35,19 +38,21 @@ class BaseINIFile(object):
         return cls._objs[key]
 
     def __init__(self, filename=None, parent=None):
-        self._lines    = []
+        self._lines = []
         self._sections = {}
         self._filename = filename
-        self._globals  = []
+        self._globals = []
         self._isModify = True
 
     def AddParent(self, parent):
-        if parent is None: return
+        if parent is None:
+            return
         if not hasattr(self, "_parents"):
             self._parents = []
 
         if parent in self._parents:
-            ErrorMsg("Duplicate parent is found for INI file %s" % self._filename)
+            ErrorMsg("Duplicate parent is found for INI file %s" %
+                     self._filename)
             return
         self._parents.append(parent)
 
@@ -58,7 +63,8 @@ class BaseINIFile(object):
         return self._isModify
 
     def Modify(self, modify=True, obj=None):
-        if modify == self._isModify: return
+        if modify == self._isModify:
+            return
         self._isModify = modify
         if modify:
             for parent in self._parents:
@@ -73,7 +79,7 @@ class BaseINIFile(object):
 
         try:
             handle = open(filename, 'r')
-            self._lines  = handle.readlines()
+            self._lines = handle.readlines()
             handle.close()
         except:
             raise EdkException("Fail to open file %s" % filename)
@@ -102,22 +108,25 @@ class BaseINIFile(object):
         return arr
 
     def Parse(self):
-        if not self._isModify: return True
-        if not self._ReadLines(self._filename): return False
+        if not self._isModify:
+            return True
+        if not self._ReadLines(self._filename):
+            return False
 
-        sObjs    = []
+        sObjs = []
         inGlobal = True
         # process line
         for index in range(len(self._lines)):
             templine = self._lines[index].strip()
             # skip comments
-            if len(templine) == 0: continue
+            if len(templine) == 0:
+                continue
             if re.match("^\[=*\]", templine) or re.match("^#", templine) or \
                re.match("\*+/", templine):
                 continue
 
             m = section_re.match(templine)
-            if m is not None: # found a section
+            if m is not None:  # found a section
                 inGlobal = False
                 # Finish the latest section first
                 if len(sObjs) != 0:
@@ -132,7 +141,8 @@ class BaseINIFile(object):
                 sname_arr = m.groups()[0].split(',')
                 sObjs = []
                 for name in sname_arr:
-                    sObj = self.GetSectionInstance(self, name, (len(sname_arr) > 1))
+                    sObj = self.GetSectionInstance(
+                        self, name, (len(sname_arr) > 1))
                     sObj._start = index
                     sObjs.append(sObj)
                     if name.lower() not in self._sections:
@@ -164,14 +174,16 @@ class BaseINIFile(object):
             assert parent in self._parents, "when destory ini object, can not found parent reference!"
             self._parents.remove(parent)
 
-        if len(self._parents) != 0: return
+        if len(self._parents) != 0:
+            return
 
         for sects in self._sections.values():
             for sect in sects:
                 sect.Destroy()
 
         # dereference from _objs array
-        assert self.GetFilename() in self._objs.keys(), "When destroy ini object, can not find obj reference!"
+        assert self.GetFilename() in self._objs.keys(
+        ), "When destroy ini object, can not find obj reference!"
         assert self in self._objs.values(), "When destroy ini object, can not find obj reference!"
         del self._objs[self.GetFilename()]
 
@@ -208,12 +220,13 @@ class BaseINIFile(object):
 
     def AddNewSection(self, sectName):
         if sectName.lower() in self._sections.keys():
-            ErrorMsg('Section %s can not be created for conflict with existing section')
+            ErrorMsg(
+                'Section %s can not be created for conflict with existing section')
             return None
 
         sectionObj = self.GetSectionInstance(self, sectName)
         sectionObj._start = len(self._lines)
-        sectionObj._end   = len(self._lines) + 1
+        sectionObj._end = len(self._lines) + 1
         self._lines.append('[%s]\n' % sectName)
         self._lines.append('\n\n')
         self._sections[sectName.lower()] = sectionObj
@@ -228,19 +241,19 @@ class BaseINIFile(object):
     def __str__(self):
         return ''.join(self._lines)
 
-    ## Get file header's comment from basic INI file.
+    # Get file header's comment from basic INI file.
     #  The file comments has two style:
     #  1) #/** @file
     #  2) ## @file
     #
     def GetFileHeader(self):
         desc = []
-        lineArr  = self._lines
+        lineArr = self._lines
         inHeader = False
         for num in range(len(self._lines)):
             line = lineArr[num].strip()
             if not inHeader and (line.startswith("#/**") or line.startswith("##")) and \
-                line.find("@file") != -1:
+                    line.find("@file") != -1:
                 inHeader = True
                 continue
             if inHeader and (line.startswith("#**/") or line.startswith('##')):
@@ -254,14 +267,15 @@ class BaseINIFile(object):
                     desc.append(line[prefixIndex + 1:])
         return '<br>\n'.join(desc)
 
+
 class BaseINISection(object):
     def __init__(self, parent, name, isCombined=False):
-        self._parent     = parent
-        self._name       = name
+        self._parent = parent
+        self._name = name
         self._isCombined = isCombined
-        self._start      = 0
-        self._end        = 0
-        self._objs       = []
+        self._start = 0
+        self._end = 0
+        self._objs = []
 
     def __del__(self):
         for obj in self._objs:
@@ -354,7 +368,7 @@ class BaseINISection(object):
 
     def GetComment(self):
         comments = []
-        start  = self._start - 1
+        start = self._start - 1
         bFound = False
 
         while (start > 0):
@@ -376,18 +390,21 @@ class BaseINISection(object):
             end = start + 1
             while (end < self._start):
                 line = self.GetLine(end).strip()
-                if len(line) == 0: break
-                if not line.startswith('#'): break
+                if len(line) == 0:
+                    break
+                if not line.startswith('#'):
+                    break
                 index = line.rfind('#')
                 if (index + 1) < len(line):
                     comments.append(line[index + 1:])
                 end += 1
         return comments
 
+
 class BaseINIGlobalObject(object):
     def __init__(self, parent):
         self._start = 0
-        self._end   = 0
+        self._end = 0
 
     def Parse(self):
         return True
@@ -398,10 +415,11 @@ class BaseINIGlobalObject(object):
     def __del__(self):
         pass
 
+
 class BaseINISectionObject(object):
     def __init__(self, parent):
-        self._start  = 0
-        self._end    = 0
+        self._start = 0
+        self._end = 0
         self._parent = parent
 
     def __del__(self):
@@ -444,7 +462,7 @@ class BaseINISectionObject(object):
 
     def GetComment(self):
         comments = []
-        start  = self.GetStartLinenumber() - 1
+        start = self.GetStartLinenumber() - 1
         bFound = False
 
         while (start > 0):
@@ -466,8 +484,10 @@ class BaseINISectionObject(object):
             end = start + 1
             while (end <= self.GetStartLinenumber() - 1):
                 line = self.GetParent().GetLine(end).strip()
-                if len(line) == 0: break
-                if not line.startswith('#'): break
+                if len(line) == 0:
+                    break
+                if not line.startswith('#'):
+                    break
                 index = line.rfind('#')
                 if (index + 1) < len(line):
                     comments.append(line[index + 1:])
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/inidocview.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/inidocview.py
index 849d0fc91a53..43e053fc0c60 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/inidocview.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/inidocview.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
 #
@@ -7,6 +7,7 @@
 
 import core.editor
 
+
 class INIDoc(core.editor.EditorDocument):
     def __init__(self):
         core.editor.EditorDocument.__init__(self)
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/message.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/message.py
index 882538a1711b..636aa4d40378 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/message.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/message.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
 #
@@ -9,23 +9,28 @@ def GetEdkLogger():
     import logging
     return logging.getLogger('edk')
 
+
 class EdkException(Exception):
     def __init__(self, message, fName=None, fNo=None):
         self._message = message
         ErrorMsg(message, fName, fNo)
 
     def GetMessage(self):
-        return '[EDK Failure]: %s' %self._message
+        return '[EDK Failure]: %s' % self._message
+
 
 def ErrorMsg(mess, fName=None, fNo=None):
     GetEdkLogger().error(NormalMessage('#ERR#', mess, fName, fNo))
 
+
 def LogMsg(mess, fName=None, fNo=None):
     GetEdkLogger().info(NormalMessage('@LOG@', mess, fName, fNo))
 
+
 def WarnMsg(mess, fName=None, fNo=None):
     GetEdkLogger().warning(NormalMessage('!WAR!', mess, fName, fNo))
 
+
 def NormalMessage(type, mess, fName=None, fNo=None):
     strMsg = type
 
@@ -41,6 +46,3 @@ def NormalMessage(type, mess, fName=None, fNo=None):
     strMsg += mess
 
     return strMsg
-
-
-
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/__init__.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/__init__.py
index 57dfebde5916..dd93c30888c4 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/__init__.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
 #
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/__init__.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/__init__.py
index 57dfebde5916..dd93c30888c4 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/__init__.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
 #
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/baseobject.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/baseobject.py
index ee00529f464b..307dac9949af 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/baseobject.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/baseobject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
 #
@@ -11,6 +11,7 @@ from plugins.EdkPlugins.edk2.model import dec
 import os
 from plugins.EdkPlugins.basemodel.message import *
 
+
 class SurfaceObject(object):
     _objs = {}
 
@@ -27,10 +28,10 @@ class SurfaceObject(object):
         return obj
 
     def __init__(self, parent, workspace):
-        self._parent    = parent
-        self._fileObj   = None
+        self._parent = parent
+        self._fileObj = None
         self._workspace = workspace
-        self._isModify  = False
+        self._isModify = False
         self._modifiedObjs = []
 
     def __del__(self):
@@ -68,7 +69,8 @@ class SurfaceObject(object):
 
     def Load(self, relativePath):
         # if has been loaded, directly return
-        if self._fileObj is not None: return True
+        if self._fileObj is not None:
+            return True
 
         relativePath = os.path.normpath(relativePath)
         fullPath = os.path.join(self._workspace, relativePath)
@@ -133,11 +135,12 @@ class SurfaceObject(object):
                 continue
         return arr
 
+
 class Platform(SurfaceObject):
     def __init__(self, parent, workspace):
         SurfaceObject.__init__(self, parent, workspace)
-        self._modules    = []
-        self._packages   = []
+        self._modules = []
+        self._packages = []
 
     def Destroy(self):
         for module in self._modules:
@@ -201,7 +204,8 @@ class Platform(SurfaceObject):
 
             return obj.GetInstance()
 
-        ErrorMsg("Fail to get library class %s [%s][%s] from platform %s" % (classname, arch, type, self.GetFilename()))
+        ErrorMsg("Fail to get library class %s [%s][%s] from platform %s" % (
+            classname, arch, type, self.GetFilename()))
         return None
 
     def GetPackage(self, path):
@@ -224,7 +228,8 @@ class Platform(SurfaceObject):
         # do not care force paramter for platform object
         isFileChanged = self.GetFileObj().IsModified()
         ret = SurfaceObject.Reload(self, False)
-        if not ret: return False
+        if not ret:
+            return False
         if isFileChanged:
             # destroy all modules and reload them again
             for obj in self._modules:
@@ -285,8 +290,10 @@ class Platform(SurfaceObject):
         for oldSect in sects:
             newSect = newDsc.AddNewSection(oldSect.GetName())
             for oldComObj in oldSect.GetObjects():
-                module = self.GetModuleObject(oldComObj.GetFilename(), oldSect.GetArch())
-                if module is None: continue
+                module = self.GetModuleObject(
+                    oldComObj.GetFilename(), oldSect.GetArch())
+                if module is None:
+                    continue
 
                 newComObj = dsc.DSCComponentObject(newSect)
                 newComObj.SetFilename(oldComObj.GetFilename())
@@ -295,13 +302,14 @@ class Platform(SurfaceObject):
                 libdict = module.GetLibraries()
                 for libclass in libdict.keys():
                     if libdict[libclass] is not None:
-                        newComObj.AddOverideLib(libclass, libdict[libclass].GetRelativeFilename().replace('\\', '/'))
+                        newComObj.AddOverideLib(
+                            libclass, libdict[libclass].GetRelativeFilename().replace('\\', '/'))
 
                 # add all pcds for override section
                 pcddict = module.GetPcds()
                 for pcd in pcddict.values():
-                    buildPcd   = pcd.GetBuildObj()
-                    buildType  = buildPcd.GetPcdType()
+                    buildPcd = pcd.GetBuildObj()
+                    buildType = buildPcd.GetPcdType()
                     buildValue = None
                     if buildType.lower() == 'pcdsdynamichii' or \
                        buildType.lower() == 'pcdsdynamicvpd' or \
@@ -315,20 +323,21 @@ class Platform(SurfaceObject):
                 newSect.AddObject(newComObj)
         return newDsc
 
+
 class Module(SurfaceObject):
     def __init__(self, parent, workspace):
         SurfaceObject.__init__(self, parent, workspace)
-        self._arch        = 'common'
-        self._parent      = parent
+        self._arch = 'common'
+        self._parent = parent
         self._overidePcds = {}
         self._overideLibs = {}
-        self._libs        = {}
-        self._pcds        = {}
-        self._ppis        = []
-        self._protocols   = []
-        self._depexs      = []
-        self._guids       = []
-        self._packages    = []
+        self._libs = {}
+        self._pcds = {}
+        self._ppis = []
+        self._protocols = []
+        self._depexs = []
+        self._guids = []
+        self._packages = []
 
     def Destroy(self):
         for lib in self._libs.values():
@@ -397,7 +406,8 @@ class Module(SurfaceObject):
     def GetPcds(self):
         pcds = self._pcds.copy()
         for lib in self._libs.values():
-            if lib is None: continue
+            if lib is None:
+                continue
             for name in lib._pcds.keys():
                 pcds[name] = lib._pcds[name]
         return pcds
@@ -406,7 +416,8 @@ class Module(SurfaceObject):
         ppis = []
         ppis += self._ppis
         for lib in self._libs.values():
-            if lib is None: continue
+            if lib is None:
+                continue
             ppis += lib._ppis
         return ppis
 
@@ -414,7 +425,8 @@ class Module(SurfaceObject):
         pros = []
         pros = self._protocols
         for lib in self._libs.values():
-            if lib is None: continue
+            if lib is None:
+                continue
             pros += lib._protocols
         return pros
 
@@ -422,7 +434,8 @@ class Module(SurfaceObject):
         guids = []
         guids += self._guids
         for lib in self._libs.values():
-            if lib is None: continue
+            if lib is None:
+                continue
             guids += lib._guids
         return guids
 
@@ -430,7 +443,8 @@ class Module(SurfaceObject):
         deps = []
         deps += self._depexs
         for lib in self._libs.values():
-            if lib is None: continue
+            if lib is None:
+                continue
             deps += lib._depexs
         return deps
 
@@ -449,13 +463,15 @@ class Module(SurfaceObject):
             if issubclass(parent.__class__, Platform):
                 path = parent.GetLibraryPath(classname, arch, type)
                 if path is None:
-                    ErrorMsg('Fail to get library instance for %s' % classname, self.GetFilename())
+                    ErrorMsg('Fail to get library instance for %s' %
+                             classname, self.GetFilename())
                     return None
                 self._libs[classname] = Library(self, self.GetWorkspace())
                 if not self._libs[classname].Load(path, self.GetArch()):
                     self._libs[classname] = None
             else:
-                self._libs[classname] = parent.GetLibraryInstance(classname, arch, type)
+                self._libs[classname] = parent.GetLibraryInstance(
+                    classname, arch, type)
         return self._libs[classname]
 
     def GetSourceObjs(self):
@@ -479,14 +495,14 @@ class Module(SurfaceObject):
 
     def _SearchSurfaceItems(self):
         # get surface item from self's inf
-        pcds  = []
-        ppis  = []
-        pros  = []
-        deps  = []
+        pcds = []
+        ppis = []
+        pros = []
+        deps = []
         guids = []
         if self.GetFileObj() is not None:
             pcds = self.FilterObjsByArch(self.GetFileObj().GetSectionObjectsByName('pcd'),
-                                          self.GetArch())
+                                         self.GetArch())
             for pcd in pcds:
                 if pcd.GetPcdName() not in self._pcds.keys():
                     pcdItem = PcdItem(pcd.GetPcdName(), self, pcd)
@@ -518,7 +534,7 @@ class Module(SurfaceObject):
                 self._depexs.append(item)
 
             guids += self.FilterObjsByArch(self.GetFileObj().GetSectionObjectsByName('guids'),
-                                          self.GetArch())
+                                           self.GetArch())
             for guid in guids:
                 item = GuidItem(guid.GetName(), self, guid)
                 if item not in self._guids:
@@ -542,7 +558,8 @@ class Module(SurfaceObject):
 
     def GetLibraryClassHeaderFilePath(self):
         lcname = self.GetFileObj().GetProduceLibraryClass()
-        if lcname is None: return None
+        if lcname is None:
+            return None
 
         pkgs = self.GetPackages()
         for package in pkgs:
@@ -556,7 +573,8 @@ class Module(SurfaceObject):
             callback(self, "Starting reload...")
 
         ret = SurfaceObject.Reload(self, force)
-        if not ret: return False
+        if not ret:
+            return False
 
         if not force and not self.IsModified():
             return True
@@ -612,6 +630,7 @@ class Module(SurfaceObject):
             self._isModify = modify
             self.GetParent().Modify(modify, self)
 
+
 class Library(Module):
     def __init__(self, parent, workspace):
         Module.__init__(self, parent, workspace)
@@ -637,13 +656,14 @@ class Library(Module):
         self._pcds.clear()
         SurfaceObject.Destroy(self)
 
+
 class Package(SurfaceObject):
     def __init__(self, parent, workspace):
         SurfaceObject.__init__(self, parent, workspace)
-        self._pcds      = {}
-        self._guids     = {}
+        self._pcds = {}
+        self._guids = {}
         self._protocols = {}
-        self._ppis      = {}
+        self._ppis = {}
 
     def GetPcds(self):
         return self._pcds
@@ -679,19 +699,22 @@ class Package(SurfaceObject):
 
     def Load(self, relativePath):
         ret = SurfaceObject.Load(self, relativePath)
-        if not ret: return False
+        if not ret:
+            return False
         pcds = self.GetFileObj().GetSectionObjectsByName('pcds')
         for pcd in pcds:
             if pcd.GetPcdName() in self._pcds.keys():
                 if self._pcds[pcd.GetPcdName()] is not None:
                     self._pcds[pcd.GetPcdName()].AddDecObj(pcd)
             else:
-                self._pcds[pcd.GetPcdName()] = PcdItem(pcd.GetPcdName(), self, pcd)
+                self._pcds[pcd.GetPcdName()] = PcdItem(
+                    pcd.GetPcdName(), self, pcd)
 
         guids = self.GetFileObj().GetSectionObjectsByName('guids')
         for guid in guids:
             if guid.GetName() not in self._guids.keys():
-                self._guids[guid.GetName()] = GuidItem(guid.GetName(), self, guid)
+                self._guids[guid.GetName()] = GuidItem(
+                    guid.GetName(), self, guid)
             else:
                 WarnMsg("Duplicate definition for %s" % guid.GetName())
 
@@ -705,7 +728,8 @@ class Package(SurfaceObject):
         protocols = self.GetFileObj().GetSectionObjectsByName('protocols')
         for protocol in protocols:
             if protocol.GetName() not in self._protocols.keys():
-                self._protocols[protocol.GetName()] = ProtocolItem(protocol.GetName(), self, protocol)
+                self._protocols[protocol.GetName()] = ProtocolItem(
+                    protocol.GetName(), self, protocol)
             else:
                 WarnMsg("Duplicate definition for %s" % protocol.GetName())
 
@@ -720,7 +744,8 @@ class Package(SurfaceObject):
     def GetPcdDefineObjs(self, name=None):
         arr = []
         objs = self.GetFileObj().GetSectionObjectsByName('pcds')
-        if name is None: return objs
+        if name is None:
+            return objs
 
         for obj in objs:
             if obj.GetPcdName().lower() == name.lower():
@@ -748,6 +773,7 @@ class Package(SurfaceObject):
                 return obj.GetHeaderFile()
         return None
 
+
 class DepexItem(object):
     def __init__(self, parent, infObj):
         self._parent = parent
@@ -759,19 +785,21 @@ class DepexItem(object):
     def GetInfObject(self):
         return self._infObj
 
+
 class ModulePcd(object):
     _type_mapping = {'FeaturePcd': 'PcdsFeatureFlag',
                      'FixedPcd': 'PcdsFixedAtBuild',
                      'PatchPcd': 'PcdsPatchableInModule'}
 
     def __init__(self, parent, name, infObj, pcdItem):
-        assert issubclass(parent.__class__, Module), "Module's PCD's parent must be module!"
+        assert issubclass(parent.__class__,
+                          Module), "Module's PCD's parent must be module!"
         assert pcdItem is not None, 'Pcd %s does not in some package!' % name
 
-        self._name          = name
-        self._parent        = parent
-        self._pcdItem       = pcdItem
-        self._infObj        = infObj
+        self._name = name
+        self._parent = parent
+        self._pcdItem = pcdItem
+        self._infObj = infObj
 
     def GetName(self):
         return self._name
@@ -787,7 +815,8 @@ class ModulePcd(object):
         self._infObj = None
 
     def GetBuildObj(self):
-        platformInfos = self._parent.GetPlatform().GetPcdBuildObjs(self._name, self.GetArch())
+        platformInfos = self._parent.GetPlatform(
+        ).GetPcdBuildObjs(self._name, self.GetArch())
         modulePcdType = self._infObj.GetPcdType()
 
         # if platform do not gives pcd's value, get default value from package
@@ -802,12 +831,12 @@ class ModulePcd(object):
 
                     if self._type_mapping[modulePcdType] == obj.GetPcdType():
                         return obj
-                ErrorMsg ('Module PCD type %s does not in valied range [%s] in package!' % \
-                          (modulePcdType))
+                ErrorMsg('Module PCD type %s does not in valied range [%s] in package!' %
+                         (modulePcdType))
         else:
             if modulePcdType.lower() == 'pcd':
                 if len(platformInfos) > 1:
-                    WarnMsg("Find more than one value for PCD %s in platform %s" % \
+                    WarnMsg("Find more than one value for PCD %s in platform %s" %
                             (self._name, self._parent.GetPlatform().GetFilename()))
                 return platformInfos[0]
             else:
@@ -819,7 +848,7 @@ class ModulePcd(object):
                     if self._type_mapping[modulePcdType] == obj.GetPcdType():
                         return obj
 
-                ErrorMsg('Can not find value for pcd %s in pcd type %s' % \
+                ErrorMsg('Can not find value for pcd %s in pcd type %s' %
                          (self._name, modulePcdType))
         return None
 
@@ -832,8 +861,8 @@ class SurfaceItem(object):
         @return: instance of this class
 
         """
-        name    = args[0]
-        parent  = args[1]
+        name = args[0]
+        parent = args[1]
         fileObj = args[2]
         if issubclass(parent.__class__, Package):
             if name in cls._objs.keys():
@@ -845,19 +874,18 @@ class SurfaceItem(object):
             return obj
         elif issubclass(parent.__class__, Module):
             if name not in cls._objs.keys():
-                ErrorMsg("%s item does not defined in any package! It is used by module %s" % \
+                ErrorMsg("%s item does not defined in any package! It is used by module %s" %
                          (name, parent.GetFilename()))
                 return None
             return cls._objs[name]
 
         return None
 
-
     def __init__(self, name, parent, fileObj):
         if issubclass(parent.__class__, Package):
-            self._name    = name
-            self._parent  = parent
-            self._decObj  = [fileObj]
+            self._name = name
+            self._parent = parent
+            self._decObj = [fileObj]
             self._refMods = {}
         else:
             self.RefModule(parent, fileObj)
@@ -879,7 +907,8 @@ class SurfaceItem(object):
 
     def DeRef(self, mObj):
         if mObj not in self._refMods.keys():
-            WarnMsg("%s is not referenced by module %s" % (self._name, mObj.GetFilename()))
+            WarnMsg("%s is not referenced by module %s" %
+                    (self._name, mObj.GetFilename()))
             return
         del self._refMods[mObj]
 
@@ -897,16 +926,17 @@ class SurfaceItem(object):
     def GetDecObjects(self):
         return self._decObj
 
+
 class PcdItem(SurfaceItem):
     def AddDecObj(self, fileObj):
         for decObj in self._decObj:
             if decObj.GetFilename() != fileObj.GetFilename():
-                ErrorMsg("Pcd %s defined in more than one packages : %s and %s" % \
+                ErrorMsg("Pcd %s defined in more than one packages : %s and %s" %
                          (self._name, decObj.GetFilename(), fileObj.GetFilename()))
                 return
             if decObj.GetPcdType() == fileObj.GetPcdType() and \
                decObj.GetArch().lower() == fileObj.GetArch():
-                ErrorMsg("Pcd %s is duplicated defined in pcd type %s in package %s" % \
+                ErrorMsg("Pcd %s is duplicated defined in pcd type %s in package %s" %
                          (self._name, decObj.GetPcdType(), decObj.GetFilename()))
                 return
         self._decObj.append(fileObj)
@@ -918,11 +948,14 @@ class PcdItem(SurfaceItem):
                 types += obj.GetPcdType()
         return types
 
+
 class GuidItem(SurfaceItem):
     pass
 
+
 class PpiItem(SurfaceItem):
     pass
 
+
 class ProtocolItem(SurfaceItem):
     pass
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/dec.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/dec.py
index c25ab322efd9..879c63c12f23 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/dec.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/dec.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
 #
@@ -6,9 +6,11 @@
 #
 
 from plugins.EdkPlugins.basemodel import ini
-import re, os
+import re
+import os
 from plugins.EdkPlugins.basemodel.message import *
 
+
 class DECFile(ini.BaseINIFile):
 
     def GetSectionInstance(self, parent, name, isCombined=False):
@@ -39,6 +41,7 @@ class DECFile(ini.BaseINIFile):
 
         return arr
 
+
 class DECSection(ini.BaseINISection):
     def GetSectionINIObject(self, parent):
         type = self.GetType()
@@ -79,10 +82,12 @@ class DECSection(ini.BaseINISection):
 
         return True
 
+
 class DECSectionObject(ini.BaseINISectionObject):
     def GetArch(self):
         return self.GetParent().GetArch()
 
+
 class DECDefineSectionObject(DECSectionObject):
     def __init__(self, parent):
         DECSectionObject.__init__(self, parent)
@@ -90,20 +95,21 @@ class DECDefineSectionObject(DECSectionObject):
         self._value = None
 
     def Parse(self):
-        assert (self._start == self._end), 'The object in define section must be in single line'
+        assert (self._start ==
+                self._end), 'The object in define section must be in single line'
 
         line = self.GetLineByOffset(self._start).strip()
 
         line = line.split('#')[0]
-        arr  = line.split('=')
+        arr = line.split('=')
         if len(arr) != 2:
             ErrorMsg('Invalid define section object',
-                   self.GetFilename(),
-                   self.GetParent().GetName()
-                   )
+                     self.GetFilename(),
+                     self.GetParent().GetName()
+                     )
             return False
 
-        self._key   = arr[0].strip()
+        self._key = arr[0].strip()
         self._value = arr[1].strip()
 
         return True
@@ -114,6 +120,7 @@ class DECDefineSectionObject(DECSectionObject):
     def GetValue(self):
         return self._value
 
+
 class DECGuidObject(DECSectionObject):
     _objs = {}
 
@@ -149,8 +156,10 @@ class DECGuidObject(DECSectionObject):
     def GetObjectDict():
         return DECGuidObject._objs
 
+
 class DECPpiObject(DECSectionObject):
     _objs = {}
+
     def __init__(self, parent):
         DECSectionObject.__init__(self, parent)
         self._name = None
@@ -183,6 +192,7 @@ class DECPpiObject(DECSectionObject):
     def GetObjectDict():
         return DECPpiObject._objs
 
+
 class DECProtocolObject(DECSectionObject):
     _objs = {}
 
@@ -214,11 +224,11 @@ class DECProtocolObject(DECSectionObject):
         if len(objdict[self._name]) == 0:
             del objdict[self._name]
 
-
     @staticmethod
     def GetObjectDict():
         return DECProtocolObject._objs
 
+
 class DECLibraryClassObject(DECSectionObject):
     _objs = {}
 
@@ -256,6 +266,7 @@ class DECLibraryClassObject(DECSectionObject):
     def GetObjectDict():
         return DECLibraryClassObject._objs
 
+
 class DECIncludeObject(DECSectionObject):
     def __init__(self, parent):
         DECSectionObject.__init__(self, parent)
@@ -263,19 +274,21 @@ class DECIncludeObject(DECSectionObject):
     def GetPath(self):
         return self.GetLineByOffset(self._start).split('#')[0].strip()
 
+
 class DECPcdObject(DECSectionObject):
     _objs = {}
 
     def __init__(self, parent):
         DECSectionObject.__init__(self, parent)
-        self.mPcdName           = None
-        self.mPcdDefaultValue   = None
-        self.mPcdDataType       = None
-        self.mPcdToken          = None
+        self.mPcdName = None
+        self.mPcdDefaultValue = None
+        self.mPcdDataType = None
+        self.mPcdToken = None
 
     def Parse(self):
         line = self.GetLineByOffset(self._start).strip().split('#')[0]
-        (self.mPcdName, self.mPcdDefaultValue, self.mPcdDataType, self.mPcdToken) = line.split('|')
+        (self.mPcdName, self.mPcdDefaultValue,
+         self.mPcdDataType, self.mPcdToken) = line.split('|')
         objdict = DECPcdObject._objs
         if self.mPcdName not in objdict.keys():
             objdict[self.mPcdName] = [self]
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen.py
index 89833043c6b1..9d7d03b5749c 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # This file produce action class to generate doxygen document for edk2 codebase.
 # The action classes are shared by GUI and command line tools.
@@ -24,22 +24,24 @@ from plugins.EdkPlugins.basemodel.message import *
 
 _ignore_dir = ['.svn', '_svn', 'cvs']
 _inf_key_description_mapping_table = {
-  'INF_VERSION':'Version of INF file specification',
-  #'BASE_NAME':'Module Name',
-  'FILE_GUID':'Module Guid',
-  'MODULE_TYPE': 'Module Type',
-  'VERSION_STRING': 'Module Version',
-  'LIBRARY_CLASS': 'Produced Library Class',
-  'EFI_SPECIFICATION_VERSION': 'UEFI Specification Version',
-  'PI_SPECIFICATION_VERSION': 'PI Specification Version',
-  'ENTRY_POINT': 'Module Entry Point Function',
-  'CONSTRUCTOR': 'Library Constructor Function'
+    'INF_VERSION': 'Version of INF file specification',
+    # 'BASE_NAME':'Module Name',
+    'FILE_GUID': 'Module Guid',
+    'MODULE_TYPE': 'Module Type',
+    'VERSION_STRING': 'Module Version',
+    'LIBRARY_CLASS': 'Produced Library Class',
+    'EFI_SPECIFICATION_VERSION': 'UEFI Specification Version',
+    'PI_SPECIFICATION_VERSION': 'PI Specification Version',
+    'ENTRY_POINT': 'Module Entry Point Function',
+    'CONSTRUCTOR': 'Library Constructor Function'
 }
 
 _dec_key_description_mapping_table = {
-  'DEC_SPECIFICATION': 'Version of DEC file specification',
-  'PACKAGE_GUID': 'Package Guid'
+    'DEC_SPECIFICATION': 'Version of DEC file specification',
+    'PACKAGE_GUID': 'Package Guid'
 }
+
+
 class DoxygenAction:
     """This is base class for all doxygen action.
     """
@@ -50,17 +52,17 @@ class DoxygenAction:
         @param  outputPath      the obosolution output path.
         @param  log             log function for output message
         """
-        self._doxPath       = doxPath
-        self._chmPath       = chmPath
-        self._outputPath    = outputPath
-        self._projname      = projname
-        self._configFile    = None          # doxygen config file is used by doxygen exe file
+        self._doxPath = doxPath
+        self._chmPath = chmPath
+        self._outputPath = outputPath
+        self._projname = projname
+        self._configFile = None          # doxygen config file is used by doxygen exe file
         self._indexPageFile = None          # doxygen page file for index page.
-        self._log           = log
-        self._mode          = mode
-        self._verbose       = verbose
+        self._log = log
+        self._mode = mode
+        self._verbose = verbose
         self._doxygenCallback = None
-        self._chmCallback     = None
+        self._chmCallback = None
 
     def Log(self, message, level='info'):
         if self._log is not None:
@@ -71,13 +73,15 @@ class DoxygenAction:
 
     def Generate(self):
         """Generate interface called by outer directly"""
-        self.Log(">>>>>> Start generate doxygen document for %s... Zzz....\n" % self._projname)
+        self.Log(
+            ">>>>>> Start generate doxygen document for %s... Zzz....\n" % self._projname)
 
         # create doxygen config file at first
         self._configFile = doxygen.DoxygenConfigFile()
         self._configFile.SetOutputDir(self._outputPath)
 
-        self._configFile.SetWarningFilePath(os.path.join(self._outputPath, 'warning.txt'))
+        self._configFile.SetWarningFilePath(
+            os.path.join(self._outputPath, 'warning.txt'))
         if self._mode.lower() == 'html':
             self._configFile.SetHtmlMode()
         else:
@@ -92,15 +96,18 @@ class DoxygenAction:
             self.Log("Fail to generate index page!\n", 'error')
             return False
         else:
-            self.Log("Success to create doxygen index page file %s \n" % indexPagePath)
+            self.Log("Success to create doxygen index page file %s \n" %
+                     indexPagePath)
 
         # Add index page doxygen file to file list.
         self._configFile.AddFile(indexPagePath)
 
         # save config file to output path
-        configFilePath = os.path.join(self._outputPath, self._projname + '.doxygen_config')
+        configFilePath = os.path.join(
+            self._outputPath, self._projname + '.doxygen_config')
         self._configFile.Generate(configFilePath)
-        self.Log("    <<<<<< Success Save doxygen config file to %s...\n" % configFilePath)
+        self.Log("    <<<<<< Success Save doxygen config file to %s...\n" %
+                 configFilePath)
 
         # launch doxygen tool to generate document
         if self._doxygenCallback is not None:
@@ -128,17 +135,20 @@ class DoxygenAction:
     def RegisterCallbackCHMProcess(self, callback):
         self._chmCallback = callback
 
+
 class PlatformDocumentAction(DoxygenAction):
     """Generate platform doxygen document, will be implement at future."""
 
+
 class PackageDocumentAction(DoxygenAction):
     """Generate package reference document"""
 
     def __init__(self, doxPath, chmPath, outputPath, pObj, mode='html', log=None, arch=None, tooltag=None,
-                  onlyInclude=False, verbose=False):
-        DoxygenAction.__init__(self, doxPath, chmPath, outputPath, pObj.GetName(), mode, log, verbose)
-        self._pObj   = pObj
-        self._arch   = arch
+                 onlyInclude=False, verbose=False):
+        DoxygenAction.__init__(self, doxPath, chmPath,
+                               outputPath, pObj.GetName(), mode, log, verbose)
+        self._pObj = pObj
+        self._arch = arch
         self._tooltag = tooltag
         self._onlyIncludeDocument = onlyInclude
 
@@ -166,7 +176,8 @@ class PackageDocumentAction(DoxygenAction):
             namestr += '[%s]' % self._tooltag
         self._configFile.SetProjectName(namestr)
         self._configFile.SetStripPath(self._pObj.GetWorkspace())
-        self._configFile.SetProjectVersion(self._pObj.GetFileObj().GetVersion())
+        self._configFile.SetProjectVersion(
+            self._pObj.GetFileObj().GetVersion())
         self._configFile.AddPattern('*.decdoxygen')
 
         if self._tooltag.lower() == 'msft':
@@ -186,28 +197,32 @@ class PackageDocumentAction(DoxygenAction):
 
     def GenerateIndexPage(self):
         """Generate doxygen index page. Inherited class should implement it."""
-        fObj   = self._pObj.GetFileObj()
-        pdObj  = doxygen.DoxygenFile('%s Package Document' % self._pObj.GetName(),
-                                     '%s.decdoxygen' % self._pObj.GetFilename())
+        fObj = self._pObj.GetFileObj()
+        pdObj = doxygen.DoxygenFile('%s Package Document' % self._pObj.GetName(),
+                                    '%s.decdoxygen' % self._pObj.GetFilename())
         self._configFile.AddFile(pdObj.GetFilename())
         pdObj.AddDescription(fObj.GetFileHeader())
 
         defSection = fObj.GetSectionByName('defines')[0]
-        baseSection = doxygen.Section('PackageBasicInformation', 'Package Basic Information')
+        baseSection = doxygen.Section(
+            'PackageBasicInformation', 'Package Basic Information')
         descr = '<TABLE>'
         for obj in defSection.GetObjects():
             if obj.GetKey() in _dec_key_description_mapping_table.keys():
                 descr += '<TR>'
-                descr += '<TD><B>%s</B></TD>' % _dec_key_description_mapping_table[obj.GetKey()]
+                descr += '<TD><B>%s</B></TD>' % _dec_key_description_mapping_table[obj.GetKey(
+                )]
                 descr += '<TD>%s</TD>' % obj.GetValue()
                 descr += '</TR>'
         descr += '</TABLE><br>'
         baseSection.AddDescription(descr)
         pdObj.AddSection(baseSection)
 
-        knownIssueSection = doxygen.Section('Known_Issue_section', 'Known Issue')
+        knownIssueSection = doxygen.Section(
+            'Known_Issue_section', 'Known Issue')
         knownIssueSection.AddDescription('<ul>')
-        knownIssueSection.AddDescription('<li> OPTIONAL macro for function parameter can not be dealed with doxygen, so it disapear in this document! </li>')
+        knownIssueSection.AddDescription(
+            '<li> OPTIONAL macro for function parameter can not be dealed with doxygen, so it disapear in this document! </li>')
         knownIssueSection.AddDescription('</ul>')
         pdObj.AddSection(knownIssueSection)
 
@@ -215,7 +230,8 @@ class PackageDocumentAction(DoxygenAction):
         pages = self.GenerateIncludesSubPage(self._pObj, self._configFile)
         if len(pages) != 0:
             pdObj.AddPages(pages)
-        pages = self.GenerateLibraryClassesSubPage(self._pObj, self._configFile)
+        pages = self.GenerateLibraryClassesSubPage(
+            self._pObj, self._configFile)
         if len(pages) != 0:
             pdObj.AddPages(pages)
         pages = self.GeneratePcdSubPages(self._pObj, self._configFile)
@@ -231,7 +247,8 @@ class PackageDocumentAction(DoxygenAction):
         if len(pages) != 0:
             pdObj.AddPages(pages)
         if not self._onlyIncludeDocument:
-            pdObj.AddPages(self.GenerateModulePages(self._pObj, self._configFile))
+            pdObj.AddPages(self.GenerateModulePages(
+                self._pObj, self._configFile))
 
         pdObj.Save()
         return pdObj.GetFilename()
@@ -244,16 +261,20 @@ class PackageDocumentAction(DoxygenAction):
         configFile.AddIncludePath(os.path.join(pkpath, 'Include', 'Protocol'))
         configFile.AddIncludePath(os.path.join(pkpath, 'Include', 'Ppi'))
         configFile.AddIncludePath(os.path.join(pkpath, 'Include', 'Guid'))
-        configFile.AddIncludePath(os.path.join(pkpath, 'Include', 'IndustryStandard'))
+        configFile.AddIncludePath(os.path.join(
+            pkpath, 'Include', 'IndustryStandard'))
 
         rootArray = []
-        pageRoot = doxygen.Page("Public Includes", "%s_public_includes" % pObj.GetName())
+        pageRoot = doxygen.Page(
+            "Public Includes", "%s_public_includes" % pObj.GetName())
         objs = pObj.GetFileObj().GetSectionObjectsByName('includes')
-        if len(objs) == 0: return []
+        if len(objs) == 0:
+            return []
 
         for obj in objs:
             # Add path to include path
-            path = os.path.join(pObj.GetFileObj().GetPackageRootPath(), obj.GetPath())
+            path = os.path.join(
+                pObj.GetFileObj().GetPackageRootPath(), obj.GetPath())
             configFile.AddIncludePath(path)
 
             # only list common folder's include file
@@ -261,27 +282,35 @@ class PackageDocumentAction(DoxygenAction):
                 continue
 
             bNeedAddIncludePage = False
-            topPage = doxygen.Page(self._ConvertPathToDoxygen(path, pObj), 'public_include_top')
+            topPage = doxygen.Page(self._ConvertPathToDoxygen(
+                path, pObj), 'public_include_top')
 
             topPage.AddDescription('<ul>\n')
             for file in os.listdir(path):
-                if file.lower() in _ignore_dir: continue
+                if file.lower() in _ignore_dir:
+                    continue
                 fullpath = os.path.join(path, file)
                 if os.path.isfile(fullpath):
-                    self.ProcessSourceFileForInclude(fullpath, pObj, configFile)
-                    topPage.AddDescription('<li> \link %s\endlink </li>\n' % self._ConvertPathToDoxygen(fullpath, pObj))
+                    self.ProcessSourceFileForInclude(
+                        fullpath, pObj, configFile)
+                    topPage.AddDescription(
+                        '<li> \link %s\endlink </li>\n' % self._ConvertPathToDoxygen(fullpath, pObj))
                 else:
                     if file.lower() in ['library', 'protocol', 'guid', 'ppi', 'ia32', 'x64', 'ipf', 'ebc', 'arm', 'pi', 'uefi', 'aarch64']:
                         continue
                     bNeedAddSubPage = False
-                    subpage = doxygen.Page(self._ConvertPathToDoxygen(fullpath, pObj), 'public_include_%s' % file)
+                    subpage = doxygen.Page(self._ConvertPathToDoxygen(
+                        fullpath, pObj), 'public_include_%s' % file)
                     subpage.AddDescription('<ul>\n')
                     for subfile in os.listdir(fullpath):
-                        if subfile.lower() in _ignore_dir: continue
+                        if subfile.lower() in _ignore_dir:
+                            continue
                         bNeedAddSubPage = True
                         subfullpath = os.path.join(fullpath, subfile)
-                        self.ProcessSourceFileForInclude(subfullpath, pObj, configFile)
-                        subpage.AddDescription('<li> \link %s \endlink </li>\n' % self._ConvertPathToDoxygen(subfullpath, pObj))
+                        self.ProcessSourceFileForInclude(
+                            subfullpath, pObj, configFile)
+                        subpage.AddDescription(
+                            '<li> \link %s \endlink </li>\n' % self._ConvertPathToDoxygen(subfullpath, pObj))
                     subpage.AddDescription('</ul>\n')
                     if bNeedAddSubPage:
                         bNeedAddIncludePage = True
@@ -304,9 +333,11 @@ class PackageDocumentAction(DoxygenAction):
         @param  fObj DEC file object.
         """
         rootArray = []
-        pageRoot = doxygen.Page("Library Class", "%s_libraryclass" % pObj.GetName())
+        pageRoot = doxygen.Page(
+            "Library Class", "%s_libraryclass" % pObj.GetName())
         objs = pObj.GetFileObj().GetSectionObjectsByName('libraryclass', self._arch)
-        if len(objs) == 0: return []
+        if len(objs) == 0:
+            return []
 
         if self._arch is not None:
             for obj in objs:
@@ -314,17 +345,21 @@ class PackageDocumentAction(DoxygenAction):
                                          "lc_%s" % obj.GetClassName())
                 comments = obj.GetComment()
                 if len(comments) != 0:
-                    classPage.AddDescription('<br>\n'.join(comments) + '<br>\n')
+                    classPage.AddDescription(
+                        '<br>\n'.join(comments) + '<br>\n')
                 pageRoot.AddPage(classPage)
-                path = os.path.join(pObj.GetFileObj().GetPackageRootPath(), obj.GetHeaderFile())
+                path = os.path.join(
+                    pObj.GetFileObj().GetPackageRootPath(), obj.GetHeaderFile())
                 path = path[len(pObj.GetWorkspace()) + 1:]
                 if len(comments) == 0:
-                    classPage.AddDescription('\copydoc %s<p>' % obj.GetHeaderFile())
+                    classPage.AddDescription(
+                        '\copydoc %s<p>' % obj.GetHeaderFile())
                 section = doxygen.Section('ref', 'Refer to Header File')
                 section.AddDescription('\link %s\n' % obj.GetHeaderFile())
                 section.AddDescription(' \endlink<p>\n')
                 classPage.AddSection(section)
-                fullPath = os.path.join(pObj.GetFileObj().GetPackageRootPath(), obj.GetHeaderFile())
+                fullPath = os.path.join(
+                    pObj.GetFileObj().GetPackageRootPath(), obj.GetHeaderFile())
                 self.ProcessSourceFileForInclude(fullPath, pObj, configFile)
         else:
             archPageDict = {}
@@ -338,17 +373,21 @@ class PackageDocumentAction(DoxygenAction):
                                          "lc_%s" % obj.GetClassName())
                 comments = obj.GetComment()
                 if len(comments) != 0:
-                    classPage.AddDescription('<br>\n'.join(comments) + '<br>\n')
+                    classPage.AddDescription(
+                        '<br>\n'.join(comments) + '<br>\n')
                 subArchRoot.AddPage(classPage)
-                path = os.path.join(pObj.GetFileObj().GetPackageRootPath(), obj.GetHeaderFile())
+                path = os.path.join(
+                    pObj.GetFileObj().GetPackageRootPath(), obj.GetHeaderFile())
                 path = path[len(pObj.GetWorkspace()) + 1:]
                 if len(comments) == 0:
-                    classPage.AddDescription('\copydoc %s<p>' % obj.GetHeaderFile())
+                    classPage.AddDescription(
+                        '\copydoc %s<p>' % obj.GetHeaderFile())
                 section = doxygen.Section('ref', 'Refer to Header File')
                 section.AddDescription('\link %s\n' % obj.GetHeaderFile())
                 section.AddDescription(' \endlink<p>\n')
                 classPage.AddSection(section)
-                fullPath = os.path.join(pObj.GetFileObj().GetPackageRootPath(), obj.GetHeaderFile())
+                fullPath = os.path.join(
+                    pObj.GetFileObj().GetPackageRootPath(), obj.GetHeaderFile())
 
                 self.ProcessSourceFileForInclude(fullPath, pObj, configFile)
         rootArray.append(pageRoot)
@@ -388,10 +427,12 @@ class PackageDocumentAction(DoxygenAction):
                 continue
             index = lines[no].lower().find('include')
             #mo = IncludePattern.finditer(lines[no].lower())
-            mo = re.match(r"^#\s*include\s+[<\"]([\\/\w.]+)[>\"]$", lines[no].strip().lower())
+            mo = re.match(
+                r"^#\s*include\s+[<\"]([\\/\w.]+)[>\"]$", lines[no].strip().lower())
             if not mo:
                 continue
-            mo = re.match(r"^[#\w\s]+[<\"]([\\/\w.]+)[>\"]$", lines[no].strip())
+            mo = re.match(r"^[#\w\s]+[<\"]([\\/\w.]+)[>\"]$",
+                          lines[no].strip())
             filePath = mo.groups()[0]
 
             if filePath is None or len(filePath) == 0:
@@ -402,49 +443,60 @@ class PackageDocumentAction(DoxygenAction):
 
             if os.path.exists(os.path.join(os.path.dirname(path), filePath)):
                 # Find the file in current directory
-                fullPath = os.path.join(os.path.dirname(path), filePath).replace('\\', '/')
+                fullPath = os.path.join(os.path.dirname(
+                    path), filePath).replace('\\', '/')
             else:
                 # find in depedent package's include path
                 incObjs = pObj.GetFileObj().GetSectionObjectsByName('includes')
                 for incObj in incObjs:
-                    incPath = os.path.join(pObj.GetFileObj().GetPackageRootPath(), incObj.GetPath()).strip()
+                    incPath = os.path.join(
+                        pObj.GetFileObj().GetPackageRootPath(), incObj.GetPath()).strip()
                     incPath = os.path.realpath(os.path.join(incPath, filePath))
                     if os.path.exists(incPath):
                         fullPath = incPath
                         break
                 if infObj is not None:
                     pkgInfObjs = infObj.GetSectionObjectsByName('packages')
-                    for obj in  pkgInfObjs:
-                        decObj = dec.DECFile(os.path.join(pObj.GetWorkspace(), obj.GetPath()))
+                    for obj in pkgInfObjs:
+                        decObj = dec.DECFile(os.path.join(
+                            pObj.GetWorkspace(), obj.GetPath()))
                         if not decObj:
-                            ErrorMsg ('Fail to create pacakge object for %s' % obj.GetPackageName())
+                            ErrorMsg('Fail to create pacakge object for %s' %
+                                     obj.GetPackageName())
                             continue
                         if not decObj.Parse():
-                            ErrorMsg ('Fail to load package object for %s' % obj.GetPackageName())
+                            ErrorMsg('Fail to load package object for %s' %
+                                     obj.GetPackageName())
                             continue
                         incObjs = decObj.GetSectionObjectsByName('includes')
                         for incObj in incObjs:
-                            incPath = os.path.join(decObj.GetPackageRootPath(), incObj.GetPath()).replace('\\', '/')
+                            incPath = os.path.join(
+                                decObj.GetPackageRootPath(), incObj.GetPath()).replace('\\', '/')
                             if os.path.exists(os.path.join(incPath, filePath)):
-                                fullPath = os.path.join(os.path.join(incPath, filePath))
+                                fullPath = os.path.join(
+                                    os.path.join(incPath, filePath))
                                 break
                         if fullPath is not None:
                             break
 
             if fullPath is None and self.IsVerbose():
-                self.Log('Can not resolve header file %s for file %s in package %s\n' % (filePath, path, pObj.GetFileObj().GetFilename()), 'error')
+                self.Log('Can not resolve header file %s for file %s in package %s\n' % (
+                    filePath, path, pObj.GetFileObj().GetFilename()), 'error')
                 return
             else:
                 fullPath = fullPath.replace('\\', '/')
                 if self.IsVerbose():
-                    self.Log('Preprocessing: Add include file %s for file %s\n' % (fullPath, path))
+                    self.Log('Preprocessing: Add include file %s for file %s\n' % (
+                        fullPath, path))
                 #LogMsg ('Preprocessing: Add include file %s for file %s' % (fullPath, path))
-                self.ProcessSourceFileForInclude(fullPath, pObj, configFile, infObj)
+                self.ProcessSourceFileForInclude(
+                    fullPath, pObj, configFile, infObj)
 
     def AddAllIncludeFiles(self, pObj, configFile):
         objs = pObj.GetFileObj().GetSectionObjectsByName('includes')
         for obj in objs:
-            incPath = os.path.join(pObj.GetFileObj().GetPackageRootPath(), obj.GetPath())
+            incPath = os.path.join(
+                pObj.GetFileObj().GetPackageRootPath(), obj.GetPath())
             for root, dirs, files in os.walk(incPath):
                 for dir in dirs:
                     if dir.lower() in _ignore_dir:
@@ -469,15 +521,17 @@ class PackageDocumentAction(DoxygenAction):
         typeArchRootPageDict = {}
         for obj in objs:
             if obj.GetPcdType() not in typeRootPageDict.keys():
-                typeRootPageDict[obj.GetPcdType()] = doxygen.Page(obj.GetPcdType(), 'pcd_%s_root_page' % obj.GetPcdType())
+                typeRootPageDict[obj.GetPcdType()] = doxygen.Page(
+                    obj.GetPcdType(), 'pcd_%s_root_page' % obj.GetPcdType())
                 pcdRootPage.AddPage(typeRootPageDict[obj.GetPcdType()])
             typeRoot = typeRootPageDict[obj.GetPcdType()]
             if self._arch is not None:
                 pcdPage = doxygen.Page('%s' % obj.GetPcdName(),
-                                        'pcd_%s_%s_%s' % (obj.GetPcdType(), obj.GetArch(), obj.GetPcdName().split('.')[1]))
-                pcdPage.AddDescription('<br>\n'.join(obj.GetComment()) + '<br>\n')
+                                       'pcd_%s_%s_%s' % (obj.GetPcdType(), obj.GetArch(), obj.GetPcdName().split('.')[1]))
+                pcdPage.AddDescription(
+                    '<br>\n'.join(obj.GetComment()) + '<br>\n')
                 section = doxygen.Section('PCDinformation', 'PCD Information')
-                desc  = '<TABLE>'
+                desc = '<TABLE>'
                 desc += '<TR>'
                 desc += '<TD><CAPTION>Name</CAPTION></TD>'
                 desc += '<TD><CAPTION>Token Space</CAPTION></TD>'
@@ -486,8 +540,10 @@ class PackageDocumentAction(DoxygenAction):
                 desc += '<TD><CAPTION>Default Value</CAPTION></TD>'
                 desc += '</TR>'
                 desc += '<TR>'
-                desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdName().split('.')[1]
-                desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdName().split('.')[0]
+                desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdName().split('.')[
+                    1]
+                desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdName().split('.')[
+                    0]
                 desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdToken()
                 desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdDataType()
                 desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdValue()
@@ -499,15 +555,17 @@ class PackageDocumentAction(DoxygenAction):
             else:
                 keystr = obj.GetPcdType() + obj.GetArch()
                 if keystr not in typeArchRootPageDict.keys():
-                    typeArchRootPage = doxygen.Page(obj.GetArch(), 'pcd_%s_%s_root_page' % (obj.GetPcdType(), obj.GetArch()))
+                    typeArchRootPage = doxygen.Page(
+                        obj.GetArch(), 'pcd_%s_%s_root_page' % (obj.GetPcdType(), obj.GetArch()))
                     typeArchRootPageDict[keystr] = typeArchRootPage
                     typeRoot.AddPage(typeArchRootPage)
                 typeArchRoot = typeArchRootPageDict[keystr]
                 pcdPage = doxygen.Page('%s' % obj.GetPcdName(),
-                                        'pcd_%s_%s_%s' % (obj.GetPcdType(), obj.GetArch(), obj.GetPcdName().split('.')[1]))
-                pcdPage.AddDescription('<br>\n'.join(obj.GetComment()) + '<br>\n')
+                                       'pcd_%s_%s_%s' % (obj.GetPcdType(), obj.GetArch(), obj.GetPcdName().split('.')[1]))
+                pcdPage.AddDescription(
+                    '<br>\n'.join(obj.GetComment()) + '<br>\n')
                 section = doxygen.Section('PCDinformation', 'PCD Information')
-                desc  = '<TABLE>'
+                desc = '<TABLE>'
                 desc += '<TR>'
                 desc += '<TD><CAPTION>Name</CAPTION></TD>'
                 desc += '<TD><CAPTION>Token Space</CAPTION></TD>'
@@ -516,8 +574,10 @@ class PackageDocumentAction(DoxygenAction):
                 desc += '<TD><CAPTION>Default Value</CAPTION></TD>'
                 desc += '</TR>'
                 desc += '<TR>'
-                desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdName().split('.')[1]
-                desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdName().split('.')[0]
+                desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdName().split('.')[
+                    1]
+                desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdName().split('.')[
+                    0]
                 desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdToken()
                 desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdDataType()
                 desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdValue()
@@ -535,7 +595,7 @@ class PackageDocumentAction(DoxygenAction):
         if len(comments) != 0:
             guidPage.AddDescription('<br>'.join(obj.GetComment()) + '<br>')
         section = doxygen.Section('BasicGuidInfo', 'GUID Information')
-        desc  = '<TABLE>'
+        desc = '<TABLE>'
         desc += '<TR>'
         desc += '<TD><CAPTION>GUID\'s Guid Name</CAPTION></TD><TD><CAPTION>GUID\'s Guid</CAPTION></TD>'
         desc += '</TR>'
@@ -567,19 +627,23 @@ class PackageDocumentAction(DoxygenAction):
         """
         pageRoot = doxygen.Page('GUID', 'guid_root_page')
         objs = pObj.GetFileObj().GetSectionObjectsByName('guids', self._arch)
-        if len(objs) == 0: return []
+        if len(objs) == 0:
+            return []
         if self._arch is not None:
             for obj in objs:
-                pageRoot.AddPage(self._GenerateGuidSubPage(pObj, obj, configFile))
+                pageRoot.AddPage(
+                    self._GenerateGuidSubPage(pObj, obj, configFile))
         else:
             guidArchRootPageDict = {}
             for obj in objs:
                 if obj.GetArch() not in guidArchRootPageDict.keys():
-                    guidArchRoot = doxygen.Page(obj.GetArch(), 'guid_arch_root_%s' % obj.GetArch())
+                    guidArchRoot = doxygen.Page(
+                        obj.GetArch(), 'guid_arch_root_%s' % obj.GetArch())
                     pageRoot.AddPage(guidArchRoot)
                     guidArchRootPageDict[obj.GetArch()] = guidArchRoot
                 guidArchRoot = guidArchRootPageDict[obj.GetArch()]
-                guidArchRoot.AddPage(self._GenerateGuidSubPage(pObj, obj, configFile))
+                guidArchRoot.AddPage(
+                    self._GenerateGuidSubPage(pObj, obj, configFile))
         return [pageRoot]
 
     def _GeneratePpiSubPage(self, pObj, obj, configFile):
@@ -588,7 +652,7 @@ class PackageDocumentAction(DoxygenAction):
         if len(comments) != 0:
             guidPage.AddDescription('<br>'.join(obj.GetComment()) + '<br>')
         section = doxygen.Section('BasicPpiInfo', 'PPI Information')
-        desc  = '<TABLE>'
+        desc = '<TABLE>'
         desc += '<TR>'
         desc += '<TD><CAPTION>PPI\'s Guid Name</CAPTION></TD><TD><CAPTION>PPI\'s Guid</CAPTION></TD>'
         desc += '</TR>'
@@ -620,28 +684,33 @@ class PackageDocumentAction(DoxygenAction):
         """
         pageRoot = doxygen.Page('PPI', 'ppi_root_page')
         objs = pObj.GetFileObj().GetSectionObjectsByName('ppis', self._arch)
-        if len(objs) == 0: return []
+        if len(objs) == 0:
+            return []
         if self._arch is not None:
             for obj in objs:
-                pageRoot.AddPage(self._GeneratePpiSubPage(pObj, obj, configFile))
+                pageRoot.AddPage(
+                    self._GeneratePpiSubPage(pObj, obj, configFile))
         else:
             guidArchRootPageDict = {}
             for obj in objs:
                 if obj.GetArch() not in guidArchRootPageDict.keys():
-                    guidArchRoot = doxygen.Page(obj.GetArch(), 'ppi_arch_root_%s' % obj.GetArch())
+                    guidArchRoot = doxygen.Page(
+                        obj.GetArch(), 'ppi_arch_root_%s' % obj.GetArch())
                     pageRoot.AddPage(guidArchRoot)
                     guidArchRootPageDict[obj.GetArch()] = guidArchRoot
                 guidArchRoot = guidArchRootPageDict[obj.GetArch()]
-                guidArchRoot.AddPage(self._GeneratePpiSubPage(pObj, obj, configFile))
+                guidArchRoot.AddPage(
+                    self._GeneratePpiSubPage(pObj, obj, configFile))
         return [pageRoot]
 
     def _GenerateProtocolSubPage(self, pObj, obj, configFile):
-        guidPage = doxygen.Page(obj.GetName(), 'protocol_page_%s' % obj.GetName())
+        guidPage = doxygen.Page(
+            obj.GetName(), 'protocol_page_%s' % obj.GetName())
         comments = obj.GetComment()
         if len(comments) != 0:
             guidPage.AddDescription('<br>'.join(obj.GetComment()) + '<br>')
         section = doxygen.Section('BasicProtocolInfo', 'PROTOCOL Information')
-        desc  = '<TABLE>'
+        desc = '<TABLE>'
         desc += '<TR>'
         desc += '<TD><CAPTION>PROTOCOL\'s Guid Name</CAPTION></TD><TD><CAPTION>PROTOCOL\'s Guid</CAPTION></TD>'
         desc += '</TR>'
@@ -674,19 +743,23 @@ class PackageDocumentAction(DoxygenAction):
         """
         pageRoot = doxygen.Page('PROTOCOL', 'protocol_root_page')
         objs = pObj.GetFileObj().GetSectionObjectsByName('protocols', self._arch)
-        if len(objs) == 0: return []
+        if len(objs) == 0:
+            return []
         if self._arch is not None:
             for obj in objs:
-                pageRoot.AddPage(self._GenerateProtocolSubPage(pObj, obj, configFile))
+                pageRoot.AddPage(
+                    self._GenerateProtocolSubPage(pObj, obj, configFile))
         else:
             guidArchRootPageDict = {}
             for obj in objs:
                 if obj.GetArch() not in guidArchRootPageDict.keys():
-                    guidArchRoot = doxygen.Page(obj.GetArch(), 'protocol_arch_root_%s' % obj.GetArch())
+                    guidArchRoot = doxygen.Page(
+                        obj.GetArch(), 'protocol_arch_root_%s' % obj.GetArch())
                     pageRoot.AddPage(guidArchRoot)
                     guidArchRootPageDict[obj.GetArch()] = guidArchRoot
                 guidArchRoot = guidArchRootPageDict[obj.GetArch()]
-                guidArchRoot.AddPage(self._GenerateProtocolSubPage(pObj, obj, configFile))
+                guidArchRoot.AddPage(
+                    self._GenerateProtocolSubPage(pObj, obj, configFile))
         return [pageRoot]
 
     def FindHeaderFileForGuid(self, pObj, name, configFile):
@@ -699,8 +772,8 @@ class PackageDocumentAction(DoxygenAction):
 
         @return full path of header file and None if not found.
         """
-        startPath  = pObj.GetFileObj().GetPackageRootPath()
-        incPath    = os.path.join(startPath, 'Include').replace('\\', '/')
+        startPath = pObj.GetFileObj().GetPackageRootPath()
+        incPath = os.path.join(startPath, 'Include').replace('\\', '/')
         # if <PackagePath>/include exist, then search header under it.
         if os.path.exists(incPath):
             startPath = incPath
@@ -760,7 +833,7 @@ class PackageDocumentAction(DoxygenAction):
         modObjs = []
         for infpath in infList:
             infObj = inf.INFFile(infpath)
-            #infObj = INFFileObject.INFFile (pObj.GetWorkspacePath(),
+            # infObj = INFFileObject.INFFile (pObj.GetWorkspacePath(),
             #                                inf)
             if not infObj:
                 self.Log('Fail create INF object for %s' % inf)
@@ -777,13 +850,15 @@ class PackageDocumentAction(DoxygenAction):
             libRootPage = doxygen.Page('Libraries', 'lib_root_page')
             rootPages.append(libRootPage)
             for libInf in libObjs:
-                libRootPage.AddPage(self.GenerateModulePage(pObj, libInf, configFile, True))
+                libRootPage.AddPage(self.GenerateModulePage(
+                    pObj, libInf, configFile, True))
 
         if len(modObjs) != 0:
             modRootPage = doxygen.Page('Modules', 'module_root_page')
             rootPages.append(modRootPage)
             for modInf in modObjs:
-                modRootPage.AddPage(self.GenerateModulePage(pObj, modInf, configFile, False))
+                modRootPage.AddPage(self.GenerateModulePage(
+                    pObj, modInf, configFile, False))
 
         return rootPages
 
@@ -798,13 +873,15 @@ class PackageDocumentAction(DoxygenAction):
         """
         workspace = pObj.GetWorkspace()
         refDecObjs = []
-        for obj in  infObj.GetSectionObjectsByName('packages'):
+        for obj in infObj.GetSectionObjectsByName('packages'):
             decObj = dec.DECFile(os.path.join(workspace, obj.GetPath()))
             if not decObj:
-                ErrorMsg ('Fail to create pacakge object for %s' % obj.GetPackageName())
+                ErrorMsg('Fail to create pacakge object for %s' %
+                         obj.GetPackageName())
                 continue
             if not decObj.Parse():
-                ErrorMsg ('Fail to load package object for %s' % obj.GetPackageName())
+                ErrorMsg('Fail to load package object for %s' %
+                         obj.GetPackageName())
                 continue
             refDecObjs.append(decObj)
 
@@ -812,12 +889,14 @@ class PackageDocumentAction(DoxygenAction):
                                'module_%s' % infObj.GetBaseName())
         modPage.AddDescription(infObj.GetFileHeader())
 
-        basicInfSection = doxygen.Section('BasicModuleInformation', 'Basic Module Information')
+        basicInfSection = doxygen.Section(
+            'BasicModuleInformation', 'Basic Module Information')
         desc = "<TABLE>"
         for obj in infObj.GetSectionObjectsByName('defines'):
             key = obj.GetKey()
             value = obj.GetValue()
-            if key not in _inf_key_description_mapping_table.keys(): continue
+            if key not in _inf_key_description_mapping_table.keys():
+                continue
             if key == 'LIBRARY_CLASS' and value.find('|') != -1:
                 clsname, types = value.split('|')
                 desc += '<TR>'
@@ -841,7 +920,7 @@ class PackageDocumentAction(DoxygenAction):
         modPage.AddSection(basicInfSection)
 
         # Add protocol section
-        data  = []
+        data = []
         for obj in infObj.GetSectionObjectsByName('pcd', self._arch):
             data.append(obj.GetPcdName().strip())
         if len(data) != 0:
@@ -852,7 +931,8 @@ class PackageDocumentAction(DoxygenAction):
                 desc += '<TR>'
                 desc += '<TD>%s</TD>' % item.split('.')[1]
                 desc += '<TD>%s</TD>' % item.split('.')[0]
-                pkgbasename = self.SearchPcdPackage(item, workspace, refDecObjs)
+                pkgbasename = self.SearchPcdPackage(
+                    item, workspace, refDecObjs)
                 desc += '<TD>%s</TD>' % pkgbasename
                 desc += '</TR>'
             desc += "</TABLE>"
@@ -861,8 +941,8 @@ class PackageDocumentAction(DoxygenAction):
 
         # Add protocol section
         #sects = infObj.GetSectionByString('protocol')
-        data  = []
-        #for sect in sects:
+        data = []
+        # for sect in sects:
         for obj in infObj.GetSectionObjectsByName('protocol', self._arch):
             data.append(obj.GetName().strip())
         if len(data) != 0:
@@ -872,7 +952,8 @@ class PackageDocumentAction(DoxygenAction):
             for item in data:
                 desc += '<TR>'
                 desc += '<TD>%s</TD>' % item
-                pkgbasename = self.SearchProtocolPackage(item, workspace, refDecObjs)
+                pkgbasename = self.SearchProtocolPackage(
+                    item, workspace, refDecObjs)
                 desc += '<TD>%s</TD>' % pkgbasename
                 desc += '</TR>'
             desc += "</TABLE>"
@@ -881,8 +962,8 @@ class PackageDocumentAction(DoxygenAction):
 
         # Add ppi section
         #sects = infObj.GetSectionByString('ppi')
-        data  = []
-        #for sect in sects:
+        data = []
+        # for sect in sects:
         for obj in infObj.GetSectionObjectsByName('ppi', self._arch):
             data.append(obj.GetName().strip())
         if len(data) != 0:
@@ -892,7 +973,8 @@ class PackageDocumentAction(DoxygenAction):
             for item in data:
                 desc += '<TR>'
                 desc += '<TD>%s</TD>' % item
-                pkgbasename = self.SearchPpiPackage(item, workspace, refDecObjs)
+                pkgbasename = self.SearchPpiPackage(
+                    item, workspace, refDecObjs)
                 desc += '<TD>%s</TD>' % pkgbasename
                 desc += '</TR>'
             desc += "</TABLE>"
@@ -901,8 +983,8 @@ class PackageDocumentAction(DoxygenAction):
 
         # Add guid section
         #sects = infObj.GetSectionByString('guid')
-        data  = []
-        #for sect in sects:
+        data = []
+        # for sect in sects:
         for obj in infObj.GetSectionObjectsByName('guid', self._arch):
             data.append(obj.GetName().strip())
         if len(data) != 0:
@@ -912,7 +994,8 @@ class PackageDocumentAction(DoxygenAction):
             for item in data:
                 desc += '<TR>'
                 desc += '<TD>%s</TD>' % item
-                pkgbasename = self.SearchGuidPackage(item, workspace, refDecObjs)
+                pkgbasename = self.SearchGuidPackage(
+                    item, workspace, refDecObjs)
                 desc += '<TD>%s</TD>' % pkgbasename
                 desc += '</TR>'
             desc += "</TABLE>"
@@ -928,12 +1011,13 @@ class PackageDocumentAction(DoxygenAction):
             desc += '<TD>Produce</TD>'
             try:
                 pkgname, hPath = self.SearchLibraryClassHeaderFile(infObj.GetProduceLibraryClass(),
-                                                              workspace,
-                                                              refDecObjs)
+                                                                   workspace,
+                                                                   refDecObjs)
             except:
-                self.Log ('fail to get package header file for lib class %s' % infObj.GetProduceLibraryClass())
+                self.Log('fail to get package header file for lib class %s' %
+                         infObj.GetProduceLibraryClass())
                 pkgname = 'NULL'
-                hPath   = 'NULL'
+                hPath = 'NULL'
             desc += '<TD>%s</TD>' % pkgname
             if hPath != "NULL":
                 desc += '<TD>\link %s \endlink</TD>' % hPath
@@ -949,9 +1033,10 @@ class PackageDocumentAction(DoxygenAction):
             if retarr is not None:
                 pkgname, hPath = retarr
             else:
-                self.Log('Fail find the library class %s definition from module %s dependent package!' % (lcObj.GetClass(), infObj.GetFilename()), 'error')
+                self.Log('Fail find the library class %s definition from module %s dependent package!' % (
+                    lcObj.GetClass(), infObj.GetFilename()), 'error')
                 pkgname = 'NULL'
-                hPath   = 'NULL'
+                hPath = 'NULL'
             desc += '<TD>Consume</TD>'
             desc += '<TD>%s</TD>' % pkgname
             desc += '<TD>\link %s \endlink</TD>' % hPath
@@ -964,22 +1049,25 @@ class PackageDocumentAction(DoxygenAction):
         section.AddDescription('<ul>\n')
         for obj in infObj.GetSourceObjects(self._arch, self._tooltag):
             sPath = infObj.GetModuleRootPath()
-            sPath = os.path.join(sPath, obj.GetSourcePath()).replace('\\', '/').strip()
+            sPath = os.path.join(sPath, obj.GetSourcePath()
+                                 ).replace('\\', '/').strip()
             if sPath.lower().endswith('.uni') or sPath.lower().endswith('.s') or sPath.lower().endswith('.asm') or sPath.lower().endswith('.nasm'):
                 newPath = self.TranslateUniFile(sPath)
                 configFile.AddFile(newPath)
                 newPath = newPath[len(pObj.GetWorkspace()) + 1:]
-                section.AddDescription('<li> \link %s \endlink </li>' %  newPath)
+                section.AddDescription(
+                    '<li> \link %s \endlink </li>' % newPath)
             else:
-                self.ProcessSourceFileForInclude(sPath, pObj, configFile, infObj)
+                self.ProcessSourceFileForInclude(
+                    sPath, pObj, configFile, infObj)
                 sPath = sPath[len(pObj.GetWorkspace()) + 1:]
                 section.AddDescription('<li>\link %s \endlink </li>' % sPath)
         section.AddDescription('</ul>\n')
         modPage.AddSection(section)
 
         #sects = infObj.GetSectionByString('depex')
-        data  = []
-        #for sect in sects:
+        data = []
+        # for sect in sects:
         for obj in infObj.GetSectionObjectsByName('depex'):
             data.append(str(obj))
         if len(data) != 0:
@@ -1029,35 +1117,35 @@ class PackageDocumentAction(DoxygenAction):
         return newpath
 
     def SearchPcdPackage(self, pcdname, workspace, decObjs):
-        for decObj in  decObjs:
+        for decObj in decObjs:
             for pcd in decObj.GetSectionObjectsByName('pcd'):
                 if pcdname == pcd.GetPcdName():
                     return decObj.GetBaseName()
         return None
 
     def SearchProtocolPackage(self, protname, workspace, decObjs):
-        for decObj in  decObjs:
+        for decObj in decObjs:
             for proto in decObj.GetSectionObjectsByName('protocol'):
                 if protname == proto.GetName():
                     return decObj.GetBaseName()
         return None
 
     def SearchPpiPackage(self, ppiname, workspace, decObjs):
-        for decObj in  decObjs:
+        for decObj in decObjs:
             for ppi in decObj.GetSectionObjectsByName('ppi'):
                 if ppiname == ppi.GetName():
                     return decObj.GetBaseName()
         return None
 
     def SearchGuidPackage(self, guidname, workspace, decObjs):
-        for decObj in  decObjs:
+        for decObj in decObjs:
             for guid in decObj.GetSectionObjectsByName('guid'):
                 if guidname == guid.GetName():
                     return decObj.GetBaseName()
         return None
 
     def SearchLibraryClassHeaderFile(self, className, workspace, decObjs):
-        for decObj in  decObjs:
+        for decObj in decObjs:
             for cls in decObj.GetSectionObjectsByName('libraryclasses'):
                 if cls.GetClassName().strip() == className:
                     path = cls.GetHeaderFile().strip()
@@ -1072,9 +1160,11 @@ class PackageDocumentAction(DoxygenAction):
         path = path[len(pRootPath) + 1:]
         return path.replace('\\', '/')
 
+
 def IsCHeaderFile(path):
     return CheckPathPostfix(path, 'h')
 
+
 def CheckPathPostfix(path, str):
     index = path.rfind('.')
     if index == -1:
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen_spec.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen_spec.py
index e37938c466a2..56e1a637c0ff 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen_spec.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen_spec.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # This file produce action class to generate doxygen document for edk2 codebase.
 # The action classes are shared by GUI and command line tools.
@@ -21,22 +21,24 @@ from plugins.EdkPlugins.basemodel.message import *
 
 _ignore_dir = ['.svn', '_svn', 'cvs']
 _inf_key_description_mapping_table = {
-  'INF_VERSION':'Version of INF file specification',
-  #'BASE_NAME':'Module Name',
-  'FILE_GUID':'Module Guid',
-  'MODULE_TYPE': 'Module Type',
-  'VERSION_STRING': 'Module Version',
-  'LIBRARY_CLASS': 'Produced Library Class',
-  'EFI_SPECIFICATION_VERSION': 'UEFI Specification Version',
-  'PI_SPECIFICATION_VERSION': 'PI Specification Version',
-  'ENTRY_POINT': 'Module Entry Point Function',
-  'CONSTRUCTOR': 'Library Constructor Function'
+    'INF_VERSION': 'Version of INF file specification',
+    # 'BASE_NAME':'Module Name',
+    'FILE_GUID': 'Module Guid',
+    'MODULE_TYPE': 'Module Type',
+    'VERSION_STRING': 'Module Version',
+    'LIBRARY_CLASS': 'Produced Library Class',
+    'EFI_SPECIFICATION_VERSION': 'UEFI Specification Version',
+    'PI_SPECIFICATION_VERSION': 'PI Specification Version',
+    'ENTRY_POINT': 'Module Entry Point Function',
+    'CONSTRUCTOR': 'Library Constructor Function'
 }
 
 _dec_key_description_mapping_table = {
-  'DEC_SPECIFICATION': 'Version of DEC file specification',
-  'PACKAGE_GUID': 'Package Guid'
+    'DEC_SPECIFICATION': 'Version of DEC file specification',
+    'PACKAGE_GUID': 'Package Guid'
 }
+
+
 class DoxygenAction:
     """This is base class for all doxygen action.
     """
@@ -47,17 +49,17 @@ class DoxygenAction:
         @param  outputPath      the obosolution output path.
         @param  log             log function for output message
         """
-        self._doxPath       = doxPath
-        self._chmPath       = chmPath
-        self._outputPath    = outputPath
-        self._projname      = projname
-        self._configFile    = None          # doxygen config file is used by doxygen exe file
+        self._doxPath = doxPath
+        self._chmPath = chmPath
+        self._outputPath = outputPath
+        self._projname = projname
+        self._configFile = None          # doxygen config file is used by doxygen exe file
         self._indexPageFile = None          # doxygen page file for index page.
-        self._log           = log
-        self._mode          = mode
-        self._verbose       = verbose
+        self._log = log
+        self._mode = mode
+        self._verbose = verbose
         self._doxygenCallback = None
-        self._chmCallback     = None
+        self._chmCallback = None
 
     def Log(self, message, level='info'):
         if self._log is not None:
@@ -68,13 +70,15 @@ class DoxygenAction:
 
     def Generate(self):
         """Generate interface called by outer directly"""
-        self.Log(">>>>>> Start generate doxygen document for %s... Zzz....\n" % self._projname)
+        self.Log(
+            ">>>>>> Start generate doxygen document for %s... Zzz....\n" % self._projname)
 
         # create doxygen config file at first
         self._configFile = doxygen.DoxygenConfigFile()
         self._configFile.SetOutputDir(self._outputPath)
 
-        self._configFile.SetWarningFilePath(os.path.join(self._outputPath, 'warning.txt'))
+        self._configFile.SetWarningFilePath(
+            os.path.join(self._outputPath, 'warning.txt'))
         if self._mode.lower() == 'html':
             self._configFile.SetHtmlMode()
         else:
@@ -89,15 +93,18 @@ class DoxygenAction:
             self.Log("Fail to generate index page!\n", 'error')
             return False
         else:
-            self.Log("Success to create doxygen index page file %s \n" % indexPagePath)
+            self.Log("Success to create doxygen index page file %s \n" %
+                     indexPagePath)
 
         # Add index page doxygen file to file list.
         self._configFile.AddFile(indexPagePath)
 
         # save config file to output path
-        configFilePath = os.path.join(self._outputPath, self._projname + '.doxygen_config')
+        configFilePath = os.path.join(
+            self._outputPath, self._projname + '.doxygen_config')
         self._configFile.Generate(configFilePath)
-        self.Log("    <<<<<< Success Save doxygen config file to %s...\n" % configFilePath)
+        self.Log("    <<<<<< Success Save doxygen config file to %s...\n" %
+                 configFilePath)
 
         # launch doxygen tool to generate document
         if self._doxygenCallback is not None:
@@ -125,17 +132,20 @@ class DoxygenAction:
     def RegisterCallbackCHMProcess(self, callback):
         self._chmCallback = callback
 
+
 class PlatformDocumentAction(DoxygenAction):
     """Generate platform doxygen document, will be implement at future."""
 
+
 class PackageDocumentAction(DoxygenAction):
     """Generate package reference document"""
 
     def __init__(self, doxPath, chmPath, outputPath, pObj, mode='html', log=None, arch=None, tooltag=None,
                  macros=[], onlyInclude=False, verbose=False):
-        DoxygenAction.__init__(self, doxPath, chmPath, outputPath, pObj.GetName(), mode, log, verbose)
-        self._pObj   = pObj
-        self._arch   = arch
+        DoxygenAction.__init__(self, doxPath, chmPath,
+                               outputPath, pObj.GetName(), mode, log, verbose)
+        self._pObj = pObj
+        self._arch = arch
         self._tooltag = tooltag
         self._macros = macros
         self._onlyIncludeDocument = onlyInclude
@@ -167,7 +177,8 @@ class PackageDocumentAction(DoxygenAction):
             namestr += '[%s]' % self._tooltag
         self._configFile.SetProjectName(namestr)
         self._configFile.SetStripPath(self._pObj.GetWorkspace())
-        self._configFile.SetProjectVersion(self._pObj.GetFileObj().GetVersion())
+        self._configFile.SetProjectVersion(
+            self._pObj.GetFileObj().GetVersion())
         self._configFile.AddPattern('*.decdoxygen')
 
         if self._tooltag.lower() == 'msft':
@@ -187,28 +198,32 @@ class PackageDocumentAction(DoxygenAction):
 
     def GenerateIndexPage(self):
         """Generate doxygen index page. Inherited class should implement it."""
-        fObj   = self._pObj.GetFileObj()
-        pdObj  = doxygen.DoxygenFile('%s Package Document' % self._pObj.GetName(),
-                                     '%s.decdoxygen' % self._pObj.GetFilename())
+        fObj = self._pObj.GetFileObj()
+        pdObj = doxygen.DoxygenFile('%s Package Document' % self._pObj.GetName(),
+                                    '%s.decdoxygen' % self._pObj.GetFilename())
         self._configFile.AddFile(pdObj.GetFilename())
         pdObj.AddDescription(fObj.GetFileHeader())
 
         defSection = fObj.GetSectionByName('defines')[0]
-        baseSection = doxygen.Section('PackageBasicInformation', 'Package Basic Information')
+        baseSection = doxygen.Section(
+            'PackageBasicInformation', 'Package Basic Information')
         descr = '<TABLE>'
         for obj in defSection.GetObjects():
             if obj.GetKey() in _dec_key_description_mapping_table.keys():
                 descr += '<TR>'
-                descr += '<TD><B>%s</B></TD>' % _dec_key_description_mapping_table[obj.GetKey()]
+                descr += '<TD><B>%s</B></TD>' % _dec_key_description_mapping_table[obj.GetKey(
+                )]
                 descr += '<TD>%s</TD>' % obj.GetValue()
                 descr += '</TR>'
         descr += '</TABLE><br>'
         baseSection.AddDescription(descr)
         pdObj.AddSection(baseSection)
 
-        knownIssueSection = doxygen.Section('Known_Issue_section', 'Known Issue')
+        knownIssueSection = doxygen.Section(
+            'Known_Issue_section', 'Known Issue')
         knownIssueSection.AddDescription('<ul>')
-        knownIssueSection.AddDescription('<li> OPTIONAL macro for function parameter can not be dealed with doxygen, so it disapear in this document! </li>')
+        knownIssueSection.AddDescription(
+            '<li> OPTIONAL macro for function parameter can not be dealed with doxygen, so it disapear in this document! </li>')
         knownIssueSection.AddDescription('</ul>')
         pdObj.AddSection(knownIssueSection)
 
@@ -216,7 +231,8 @@ class PackageDocumentAction(DoxygenAction):
         pages = self.GenerateIncludesSubPage(self._pObj, self._configFile)
         if len(pages) != 0:
             pdObj.AddPages(pages)
-        pages = self.GenerateLibraryClassesSubPage(self._pObj, self._configFile)
+        pages = self.GenerateLibraryClassesSubPage(
+            self._pObj, self._configFile)
         if len(pages) != 0:
             pdObj.AddPages(pages)
         pages = self.GeneratePcdSubPages(self._pObj, self._configFile)
@@ -232,7 +248,8 @@ class PackageDocumentAction(DoxygenAction):
         if len(pages) != 0:
             pdObj.AddPages(pages)
         if not self._onlyIncludeDocument:
-            pdObj.AddPages(self.GenerateModulePages(self._pObj, self._configFile))
+            pdObj.AddPages(self.GenerateModulePages(
+                self._pObj, self._configFile))
 
         pdObj.Save()
         return pdObj.GetFilename()
@@ -245,16 +262,20 @@ class PackageDocumentAction(DoxygenAction):
         configFile.AddIncludePath(os.path.join(pkpath, 'Include', 'Protocol'))
         configFile.AddIncludePath(os.path.join(pkpath, 'Include', 'Ppi'))
         configFile.AddIncludePath(os.path.join(pkpath, 'Include', 'Guid'))
-        configFile.AddIncludePath(os.path.join(pkpath, 'Include', 'IndustryStandard'))
+        configFile.AddIncludePath(os.path.join(
+            pkpath, 'Include', 'IndustryStandard'))
 
         rootArray = []
-        pageRoot = doxygen.Page("Public Includes", "%s_public_includes" % pObj.GetName())
+        pageRoot = doxygen.Page(
+            "Public Includes", "%s_public_includes" % pObj.GetName())
         objs = pObj.GetFileObj().GetSectionObjectsByName('includes')
-        if len(objs) == 0: return []
+        if len(objs) == 0:
+            return []
 
         for obj in objs:
             # Add path to include path
-            path = os.path.join(pObj.GetFileObj().GetPackageRootPath(), obj.GetPath())
+            path = os.path.join(
+                pObj.GetFileObj().GetPackageRootPath(), obj.GetPath())
             configFile.AddIncludePath(path)
 
             # only list common folder's include file
@@ -262,27 +283,35 @@ class PackageDocumentAction(DoxygenAction):
                 continue
 
             bNeedAddIncludePage = False
-            topPage = doxygen.Page(self._ConvertPathToDoxygen(path, pObj), 'public_include_top')
+            topPage = doxygen.Page(self._ConvertPathToDoxygen(
+                path, pObj), 'public_include_top')
 
             topPage.AddDescription('<ul>\n')
             for file in os.listdir(path):
-                if file.lower() in _ignore_dir: continue
+                if file.lower() in _ignore_dir:
+                    continue
                 fullpath = os.path.join(path, file)
                 if os.path.isfile(fullpath):
-                    self.ProcessSourceFileForInclude(fullpath, pObj, configFile)
-                    topPage.AddDescription('<li> \link %s\endlink </li>\n' % self._ConvertPathToDoxygen(fullpath, pObj))
+                    self.ProcessSourceFileForInclude(
+                        fullpath, pObj, configFile)
+                    topPage.AddDescription(
+                        '<li> \link %s\endlink </li>\n' % self._ConvertPathToDoxygen(fullpath, pObj))
                 else:
                     if file.lower() in ['library', 'protocol', 'guid', 'ppi', 'ia32', 'x64', 'ipf', 'ebc', 'arm', 'pi', 'uefi', 'aarch64']:
                         continue
                     bNeedAddSubPage = False
-                    subpage = doxygen.Page(self._ConvertPathToDoxygen(fullpath, pObj), 'public_include_%s' % file)
+                    subpage = doxygen.Page(self._ConvertPathToDoxygen(
+                        fullpath, pObj), 'public_include_%s' % file)
                     subpage.AddDescription('<ul>\n')
                     for subfile in os.listdir(fullpath):
-                        if subfile.lower() in _ignore_dir: continue
+                        if subfile.lower() in _ignore_dir:
+                            continue
                         bNeedAddSubPage = True
                         subfullpath = os.path.join(fullpath, subfile)
-                        self.ProcessSourceFileForInclude(subfullpath, pObj, configFile)
-                        subpage.AddDescription('<li> \link %s \endlink </li>\n' % self._ConvertPathToDoxygen(subfullpath, pObj))
+                        self.ProcessSourceFileForInclude(
+                            subfullpath, pObj, configFile)
+                        subpage.AddDescription(
+                            '<li> \link %s \endlink </li>\n' % self._ConvertPathToDoxygen(subfullpath, pObj))
                     subpage.AddDescription('</ul>\n')
                     if bNeedAddSubPage:
                         bNeedAddIncludePage = True
@@ -305,9 +334,11 @@ class PackageDocumentAction(DoxygenAction):
         @param  fObj DEC file object.
         """
         rootArray = []
-        pageRoot = doxygen.Page("Library Class", "%s_libraryclass" % pObj.GetName())
+        pageRoot = doxygen.Page(
+            "Library Class", "%s_libraryclass" % pObj.GetName())
         objs = pObj.GetFileObj().GetSectionObjectsByName('libraryclass', self._arch)
-        if len(objs) == 0: return []
+        if len(objs) == 0:
+            return []
 
         if self._arch is not None:
             for obj in objs:
@@ -315,17 +346,21 @@ class PackageDocumentAction(DoxygenAction):
                                          "lc_%s" % obj.GetClassName())
                 comments = obj.GetComment()
                 if len(comments) != 0:
-                    classPage.AddDescription('<br>\n'.join(comments) + '<br>\n')
+                    classPage.AddDescription(
+                        '<br>\n'.join(comments) + '<br>\n')
                 pageRoot.AddPage(classPage)
-                path = os.path.join(pObj.GetFileObj().GetPackageRootPath(), obj.GetHeaderFile())
+                path = os.path.join(
+                    pObj.GetFileObj().GetPackageRootPath(), obj.GetHeaderFile())
                 path = path[len(pObj.GetWorkspace()) + 1:]
                 if len(comments) == 0:
-                    classPage.AddDescription('\copydoc %s<p>' % obj.GetHeaderFile())
+                    classPage.AddDescription(
+                        '\copydoc %s<p>' % obj.GetHeaderFile())
                 section = doxygen.Section('ref', 'Refer to Header File')
                 section.AddDescription('\link %s\n' % obj.GetHeaderFile())
                 section.AddDescription(' \endlink<p>\n')
                 classPage.AddSection(section)
-                fullPath = os.path.join(pObj.GetFileObj().GetPackageRootPath(), obj.GetHeaderFile())
+                fullPath = os.path.join(
+                    pObj.GetFileObj().GetPackageRootPath(), obj.GetHeaderFile())
                 self.ProcessSourceFileForInclude(fullPath, pObj, configFile)
         else:
             archPageDict = {}
@@ -339,17 +374,21 @@ class PackageDocumentAction(DoxygenAction):
                                          "lc_%s" % obj.GetClassName())
                 comments = obj.GetComment()
                 if len(comments) != 0:
-                    classPage.AddDescription('<br>\n'.join(comments) + '<br>\n')
+                    classPage.AddDescription(
+                        '<br>\n'.join(comments) + '<br>\n')
                 subArchRoot.AddPage(classPage)
-                path = os.path.join(pObj.GetFileObj().GetPackageRootPath(), obj.GetHeaderFile())
+                path = os.path.join(
+                    pObj.GetFileObj().GetPackageRootPath(), obj.GetHeaderFile())
                 path = path[len(pObj.GetWorkspace()) + 1:]
                 if len(comments) == 0:
-                    classPage.AddDescription('\copydoc %s<p>' % obj.GetHeaderFile())
+                    classPage.AddDescription(
+                        '\copydoc %s<p>' % obj.GetHeaderFile())
                 section = doxygen.Section('ref', 'Refer to Header File')
                 section.AddDescription('\link %s\n' % obj.GetHeaderFile())
                 section.AddDescription(' \endlink<p>\n')
                 classPage.AddSection(section)
-                fullPath = os.path.join(pObj.GetFileObj().GetPackageRootPath(), obj.GetHeaderFile())
+                fullPath = os.path.join(
+                    pObj.GetFileObj().GetPackageRootPath(), obj.GetHeaderFile())
 
                 self.ProcessSourceFileForInclude(fullPath, pObj, configFile)
         rootArray.append(pageRoot)
@@ -389,10 +428,12 @@ class PackageDocumentAction(DoxygenAction):
                 continue
             index = lines[no].lower().find('include')
             #mo = IncludePattern.finditer(lines[no].lower())
-            mo = re.match(r"^#\s*include\s+[<\"]([\\/\w.]+)[>\"]$", lines[no].strip().lower())
+            mo = re.match(
+                r"^#\s*include\s+[<\"]([\\/\w.]+)[>\"]$", lines[no].strip().lower())
             if not mo:
                 continue
-            mo = re.match(r"^[#\w\s]+[<\"]([\\/\w.]+)[>\"]$", lines[no].strip())
+            mo = re.match(r"^[#\w\s]+[<\"]([\\/\w.]+)[>\"]$",
+                          lines[no].strip())
             filePath = mo.groups()[0]
 
             if filePath is None or len(filePath) == 0:
@@ -403,49 +444,60 @@ class PackageDocumentAction(DoxygenAction):
 
             if os.path.exists(os.path.join(os.path.dirname(path), filePath)):
                 # Find the file in current directory
-                fullPath = os.path.join(os.path.dirname(path), filePath).replace('\\', '/')
+                fullPath = os.path.join(os.path.dirname(
+                    path), filePath).replace('\\', '/')
             else:
                 # find in depedent package's include path
                 incObjs = pObj.GetFileObj().GetSectionObjectsByName('includes')
                 for incObj in incObjs:
-                    incPath = os.path.join(pObj.GetFileObj().GetPackageRootPath(), incObj.GetPath()).strip()
+                    incPath = os.path.join(
+                        pObj.GetFileObj().GetPackageRootPath(), incObj.GetPath()).strip()
                     incPath = os.path.realpath(os.path.join(incPath, filePath))
                     if os.path.exists(incPath):
                         fullPath = incPath
                         break
                 if infObj is not None:
                     pkgInfObjs = infObj.GetSectionObjectsByName('packages')
-                    for obj in  pkgInfObjs:
-                        decObj = dec.DECFile(os.path.join(pObj.GetWorkspace(), obj.GetPath()))
+                    for obj in pkgInfObjs:
+                        decObj = dec.DECFile(os.path.join(
+                            pObj.GetWorkspace(), obj.GetPath()))
                         if not decObj:
-                            ErrorMsg ('Fail to create pacakge object for %s' % obj.GetPackageName())
+                            ErrorMsg('Fail to create pacakge object for %s' %
+                                     obj.GetPackageName())
                             continue
                         if not decObj.Parse():
-                            ErrorMsg ('Fail to load package object for %s' % obj.GetPackageName())
+                            ErrorMsg('Fail to load package object for %s' %
+                                     obj.GetPackageName())
                             continue
                         incObjs = decObj.GetSectionObjectsByName('includes')
                         for incObj in incObjs:
-                            incPath = os.path.join(decObj.GetPackageRootPath(), incObj.GetPath()).replace('\\', '/')
+                            incPath = os.path.join(
+                                decObj.GetPackageRootPath(), incObj.GetPath()).replace('\\', '/')
                             if os.path.exists(os.path.join(incPath, filePath)):
-                                fullPath = os.path.join(os.path.join(incPath, filePath))
+                                fullPath = os.path.join(
+                                    os.path.join(incPath, filePath))
                                 break
                         if fullPath is not None:
                             break
 
             if fullPath is None and self.IsVerbose():
-                self.Log('Can not resolve header file %s for file %s in package %s\n' % (filePath, path, pObj.GetFileObj().GetFilename()), 'error')
+                self.Log('Can not resolve header file %s for file %s in package %s\n' % (
+                    filePath, path, pObj.GetFileObj().GetFilename()), 'error')
                 return
             else:
                 fullPath = fullPath.replace('\\', '/')
                 if self.IsVerbose():
-                    self.Log('Preprocessing: Add include file %s for file %s\n' % (fullPath, path))
+                    self.Log('Preprocessing: Add include file %s for file %s\n' % (
+                        fullPath, path))
                 #LogMsg ('Preprocessing: Add include file %s for file %s' % (fullPath, path))
-                self.ProcessSourceFileForInclude(fullPath, pObj, configFile, infObj)
+                self.ProcessSourceFileForInclude(
+                    fullPath, pObj, configFile, infObj)
 
     def AddAllIncludeFiles(self, pObj, configFile):
         objs = pObj.GetFileObj().GetSectionObjectsByName('includes')
         for obj in objs:
-            incPath = os.path.join(pObj.GetFileObj().GetPackageRootPath(), obj.GetPath())
+            incPath = os.path.join(
+                pObj.GetFileObj().GetPackageRootPath(), obj.GetPath())
             for root, dirs, files in os.walk(incPath):
                 for dir in dirs:
                     if dir.lower() in _ignore_dir:
@@ -470,15 +522,17 @@ class PackageDocumentAction(DoxygenAction):
         typeArchRootPageDict = {}
         for obj in objs:
             if obj.GetPcdType() not in typeRootPageDict.keys():
-                typeRootPageDict[obj.GetPcdType()] = doxygen.Page(obj.GetPcdType(), 'pcd_%s_root_page' % obj.GetPcdType())
+                typeRootPageDict[obj.GetPcdType()] = doxygen.Page(
+                    obj.GetPcdType(), 'pcd_%s_root_page' % obj.GetPcdType())
                 pcdRootPage.AddPage(typeRootPageDict[obj.GetPcdType()])
             typeRoot = typeRootPageDict[obj.GetPcdType()]
             if self._arch is not None:
                 pcdPage = doxygen.Page('%s' % obj.GetPcdName(),
-                                        'pcd_%s_%s_%s' % (obj.GetPcdType(), obj.GetArch(), obj.GetPcdName().split('.')[1]))
-                pcdPage.AddDescription('<br>\n'.join(obj.GetComment()) + '<br>\n')
+                                       'pcd_%s_%s_%s' % (obj.GetPcdType(), obj.GetArch(), obj.GetPcdName().split('.')[1]))
+                pcdPage.AddDescription(
+                    '<br>\n'.join(obj.GetComment()) + '<br>\n')
                 section = doxygen.Section('PCDinformation', 'PCD Information')
-                desc  = '<TABLE>'
+                desc = '<TABLE>'
                 desc += '<TR>'
                 desc += '<TD><CAPTION>Name</CAPTION></TD>'
                 desc += '<TD><CAPTION>Token Space</CAPTION></TD>'
@@ -487,8 +541,10 @@ class PackageDocumentAction(DoxygenAction):
                 desc += '<TD><CAPTION>Default Value</CAPTION></TD>'
                 desc += '</TR>'
                 desc += '<TR>'
-                desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdName().split('.')[1]
-                desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdName().split('.')[0]
+                desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdName().split('.')[
+                    1]
+                desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdName().split('.')[
+                    0]
                 desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdToken()
                 desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdDataType()
                 desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdValue()
@@ -500,15 +556,17 @@ class PackageDocumentAction(DoxygenAction):
             else:
                 keystr = obj.GetPcdType() + obj.GetArch()
                 if keystr not in typeArchRootPageDict.keys():
-                    typeArchRootPage = doxygen.Page(obj.GetArch(), 'pcd_%s_%s_root_page' % (obj.GetPcdType(), obj.GetArch()))
+                    typeArchRootPage = doxygen.Page(
+                        obj.GetArch(), 'pcd_%s_%s_root_page' % (obj.GetPcdType(), obj.GetArch()))
                     typeArchRootPageDict[keystr] = typeArchRootPage
                     typeRoot.AddPage(typeArchRootPage)
                 typeArchRoot = typeArchRootPageDict[keystr]
                 pcdPage = doxygen.Page('%s' % obj.GetPcdName(),
-                                        'pcd_%s_%s_%s' % (obj.GetPcdType(), obj.GetArch(), obj.GetPcdName().split('.')[1]))
-                pcdPage.AddDescription('<br>\n'.join(obj.GetComment()) + '<br>\n')
+                                       'pcd_%s_%s_%s' % (obj.GetPcdType(), obj.GetArch(), obj.GetPcdName().split('.')[1]))
+                pcdPage.AddDescription(
+                    '<br>\n'.join(obj.GetComment()) + '<br>\n')
                 section = doxygen.Section('PCDinformation', 'PCD Information')
-                desc  = '<TABLE>'
+                desc = '<TABLE>'
                 desc += '<TR>'
                 desc += '<TD><CAPTION>Name</CAPTION></TD>'
                 desc += '<TD><CAPTION>Token Space</CAPTION></TD>'
@@ -517,8 +575,10 @@ class PackageDocumentAction(DoxygenAction):
                 desc += '<TD><CAPTION>Default Value</CAPTION></TD>'
                 desc += '</TR>'
                 desc += '<TR>'
-                desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdName().split('.')[1]
-                desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdName().split('.')[0]
+                desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdName().split('.')[
+                    1]
+                desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdName().split('.')[
+                    0]
                 desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdToken()
                 desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdDataType()
                 desc += '<TD><CAPTION>%s</CAPTION></TD>' % obj.GetPcdValue()
@@ -536,7 +596,7 @@ class PackageDocumentAction(DoxygenAction):
         if len(comments) != 0:
             guidPage.AddDescription('<br>'.join(obj.GetComment()) + '<br>')
         section = doxygen.Section('BasicGuidInfo', 'GUID Information')
-        desc  = '<TABLE>'
+        desc = '<TABLE>'
         desc += '<TR>'
         desc += '<TD><CAPTION>GUID\'s Guid Name</CAPTION></TD><TD><CAPTION>GUID\'s Guid</CAPTION></TD>'
         desc += '</TR>'
@@ -568,19 +628,23 @@ class PackageDocumentAction(DoxygenAction):
         """
         pageRoot = doxygen.Page('GUID', 'guid_root_page')
         objs = pObj.GetFileObj().GetSectionObjectsByName('guids', self._arch)
-        if len(objs) == 0: return []
+        if len(objs) == 0:
+            return []
         if self._arch is not None:
             for obj in objs:
-                pageRoot.AddPage(self._GenerateGuidSubPage(pObj, obj, configFile))
+                pageRoot.AddPage(
+                    self._GenerateGuidSubPage(pObj, obj, configFile))
         else:
             guidArchRootPageDict = {}
             for obj in objs:
                 if obj.GetArch() not in guidArchRootPageDict.keys():
-                    guidArchRoot = doxygen.Page(obj.GetArch(), 'guid_arch_root_%s' % obj.GetArch())
+                    guidArchRoot = doxygen.Page(
+                        obj.GetArch(), 'guid_arch_root_%s' % obj.GetArch())
                     pageRoot.AddPage(guidArchRoot)
                     guidArchRootPageDict[obj.GetArch()] = guidArchRoot
                 guidArchRoot = guidArchRootPageDict[obj.GetArch()]
-                guidArchRoot.AddPage(self._GenerateGuidSubPage(pObj, obj, configFile))
+                guidArchRoot.AddPage(
+                    self._GenerateGuidSubPage(pObj, obj, configFile))
         return [pageRoot]
 
     def _GeneratePpiSubPage(self, pObj, obj, configFile):
@@ -589,7 +653,7 @@ class PackageDocumentAction(DoxygenAction):
         if len(comments) != 0:
             guidPage.AddDescription('<br>'.join(obj.GetComment()) + '<br>')
         section = doxygen.Section('BasicPpiInfo', 'PPI Information')
-        desc  = '<TABLE>'
+        desc = '<TABLE>'
         desc += '<TR>'
         desc += '<TD><CAPTION>PPI\'s Guid Name</CAPTION></TD><TD><CAPTION>PPI\'s Guid</CAPTION></TD>'
         desc += '</TR>'
@@ -621,28 +685,33 @@ class PackageDocumentAction(DoxygenAction):
         """
         pageRoot = doxygen.Page('PPI', 'ppi_root_page')
         objs = pObj.GetFileObj().GetSectionObjectsByName('ppis', self._arch)
-        if len(objs) == 0: return []
+        if len(objs) == 0:
+            return []
         if self._arch is not None:
             for obj in objs:
-                pageRoot.AddPage(self._GeneratePpiSubPage(pObj, obj, configFile))
+                pageRoot.AddPage(
+                    self._GeneratePpiSubPage(pObj, obj, configFile))
         else:
             guidArchRootPageDict = {}
             for obj in objs:
                 if obj.GetArch() not in guidArchRootPageDict.keys():
-                    guidArchRoot = doxygen.Page(obj.GetArch(), 'ppi_arch_root_%s' % obj.GetArch())
+                    guidArchRoot = doxygen.Page(
+                        obj.GetArch(), 'ppi_arch_root_%s' % obj.GetArch())
                     pageRoot.AddPage(guidArchRoot)
                     guidArchRootPageDict[obj.GetArch()] = guidArchRoot
                 guidArchRoot = guidArchRootPageDict[obj.GetArch()]
-                guidArchRoot.AddPage(self._GeneratePpiSubPage(pObj, obj, configFile))
+                guidArchRoot.AddPage(
+                    self._GeneratePpiSubPage(pObj, obj, configFile))
         return [pageRoot]
 
     def _GenerateProtocolSubPage(self, pObj, obj, configFile):
-        guidPage = doxygen.Page(obj.GetName(), 'protocol_page_%s' % obj.GetName())
+        guidPage = doxygen.Page(
+            obj.GetName(), 'protocol_page_%s' % obj.GetName())
         comments = obj.GetComment()
         if len(comments) != 0:
             guidPage.AddDescription('<br>'.join(obj.GetComment()) + '<br>')
         section = doxygen.Section('BasicProtocolInfo', 'PROTOCOL Information')
-        desc  = '<TABLE>'
+        desc = '<TABLE>'
         desc += '<TR>'
         desc += '<TD><CAPTION>PROTOCOL\'s Guid Name</CAPTION></TD><TD><CAPTION>PROTOCOL\'s Guid</CAPTION></TD>'
         desc += '</TR>'
@@ -675,19 +744,23 @@ class PackageDocumentAction(DoxygenAction):
         """
         pageRoot = doxygen.Page('PROTOCOL', 'protocol_root_page')
         objs = pObj.GetFileObj().GetSectionObjectsByName('protocols', self._arch)
-        if len(objs) == 0: return []
+        if len(objs) == 0:
+            return []
         if self._arch is not None:
             for obj in objs:
-                pageRoot.AddPage(self._GenerateProtocolSubPage(pObj, obj, configFile))
+                pageRoot.AddPage(
+                    self._GenerateProtocolSubPage(pObj, obj, configFile))
         else:
             guidArchRootPageDict = {}
             for obj in objs:
                 if obj.GetArch() not in guidArchRootPageDict.keys():
-                    guidArchRoot = doxygen.Page(obj.GetArch(), 'protocol_arch_root_%s' % obj.GetArch())
+                    guidArchRoot = doxygen.Page(
+                        obj.GetArch(), 'protocol_arch_root_%s' % obj.GetArch())
                     pageRoot.AddPage(guidArchRoot)
                     guidArchRootPageDict[obj.GetArch()] = guidArchRoot
                 guidArchRoot = guidArchRootPageDict[obj.GetArch()]
-                guidArchRoot.AddPage(self._GenerateProtocolSubPage(pObj, obj, configFile))
+                guidArchRoot.AddPage(
+                    self._GenerateProtocolSubPage(pObj, obj, configFile))
         return [pageRoot]
 
     def FindHeaderFileForGuid(self, pObj, name, configFile):
@@ -700,8 +773,8 @@ class PackageDocumentAction(DoxygenAction):
 
         @return full path of header file and None if not found.
         """
-        startPath  = pObj.GetFileObj().GetPackageRootPath()
-        incPath    = os.path.join(startPath, 'Include').replace('\\', '/')
+        startPath = pObj.GetFileObj().GetPackageRootPath()
+        incPath = os.path.join(startPath, 'Include').replace('\\', '/')
         # if <PackagePath>/include exist, then search header under it.
         if os.path.exists(incPath):
             startPath = incPath
@@ -761,7 +834,7 @@ class PackageDocumentAction(DoxygenAction):
         modObjs = []
         for infpath in infList:
             infObj = inf.INFFile(infpath)
-            #infObj = INFFileObject.INFFile (pObj.GetWorkspacePath(),
+            # infObj = INFFileObject.INFFile (pObj.GetWorkspacePath(),
             #                                inf)
             if not infObj:
                 self.Log('Fail create INF object for %s' % inf)
@@ -778,13 +851,15 @@ class PackageDocumentAction(DoxygenAction):
             libRootPage = doxygen.Page('Libraries', 'lib_root_page')
             rootPages.append(libRootPage)
             for libInf in libObjs:
-                libRootPage.AddPage(self.GenerateModulePage(pObj, libInf, configFile, True))
+                libRootPage.AddPage(self.GenerateModulePage(
+                    pObj, libInf, configFile, True))
 
         if len(modObjs) != 0:
             modRootPage = doxygen.Page('Modules', 'module_root_page')
             rootPages.append(modRootPage)
             for modInf in modObjs:
-                modRootPage.AddPage(self.GenerateModulePage(pObj, modInf, configFile, False))
+                modRootPage.AddPage(self.GenerateModulePage(
+                    pObj, modInf, configFile, False))
 
         return rootPages
 
@@ -799,13 +874,15 @@ class PackageDocumentAction(DoxygenAction):
         """
         workspace = pObj.GetWorkspace()
         refDecObjs = []
-        for obj in  infObj.GetSectionObjectsByName('packages'):
+        for obj in infObj.GetSectionObjectsByName('packages'):
             decObj = dec.DECFile(os.path.join(workspace, obj.GetPath()))
             if not decObj:
-                ErrorMsg ('Fail to create pacakge object for %s' % obj.GetPackageName())
+                ErrorMsg('Fail to create pacakge object for %s' %
+                         obj.GetPackageName())
                 continue
             if not decObj.Parse():
-                ErrorMsg ('Fail to load package object for %s' % obj.GetPackageName())
+                ErrorMsg('Fail to load package object for %s' %
+                         obj.GetPackageName())
                 continue
             refDecObjs.append(decObj)
 
@@ -813,12 +890,14 @@ class PackageDocumentAction(DoxygenAction):
                                'module_%s' % infObj.GetBaseName())
         modPage.AddDescription(infObj.GetFileHeader())
 
-        basicInfSection = doxygen.Section('BasicModuleInformation', 'Basic Module Information')
+        basicInfSection = doxygen.Section(
+            'BasicModuleInformation', 'Basic Module Information')
         desc = "<TABLE>"
         for obj in infObj.GetSectionObjectsByName('defines'):
             key = obj.GetKey()
             value = obj.GetValue()
-            if key not in _inf_key_description_mapping_table.keys(): continue
+            if key not in _inf_key_description_mapping_table.keys():
+                continue
             if key == 'LIBRARY_CLASS' and value.find('|') != -1:
                 clsname, types = value.split('|')
                 desc += '<TR>'
@@ -842,7 +921,7 @@ class PackageDocumentAction(DoxygenAction):
         modPage.AddSection(basicInfSection)
 
         # Add protocol section
-        data  = []
+        data = []
         for obj in infObj.GetSectionObjectsByName('pcd', self._arch):
             data.append(obj.GetPcdName().strip())
         if len(data) != 0:
@@ -853,7 +932,8 @@ class PackageDocumentAction(DoxygenAction):
                 desc += '<TR>'
                 desc += '<TD>%s</TD>' % item.split('.')[1]
                 desc += '<TD>%s</TD>' % item.split('.')[0]
-                pkgbasename = self.SearchPcdPackage(item, workspace, refDecObjs)
+                pkgbasename = self.SearchPcdPackage(
+                    item, workspace, refDecObjs)
                 desc += '<TD>%s</TD>' % pkgbasename
                 desc += '</TR>'
             desc += "</TABLE>"
@@ -862,8 +942,8 @@ class PackageDocumentAction(DoxygenAction):
 
         # Add protocol section
         #sects = infObj.GetSectionByString('protocol')
-        data  = []
-        #for sect in sects:
+        data = []
+        # for sect in sects:
         for obj in infObj.GetSectionObjectsByName('protocol', self._arch):
             data.append(obj.GetName().strip())
         if len(data) != 0:
@@ -873,7 +953,8 @@ class PackageDocumentAction(DoxygenAction):
             for item in data:
                 desc += '<TR>'
                 desc += '<TD>%s</TD>' % item
-                pkgbasename = self.SearchProtocolPackage(item, workspace, refDecObjs)
+                pkgbasename = self.SearchProtocolPackage(
+                    item, workspace, refDecObjs)
                 desc += '<TD>%s</TD>' % pkgbasename
                 desc += '</TR>'
             desc += "</TABLE>"
@@ -882,8 +963,8 @@ class PackageDocumentAction(DoxygenAction):
 
         # Add ppi section
         #sects = infObj.GetSectionByString('ppi')
-        data  = []
-        #for sect in sects:
+        data = []
+        # for sect in sects:
         for obj in infObj.GetSectionObjectsByName('ppi', self._arch):
             data.append(obj.GetName().strip())
         if len(data) != 0:
@@ -893,7 +974,8 @@ class PackageDocumentAction(DoxygenAction):
             for item in data:
                 desc += '<TR>'
                 desc += '<TD>%s</TD>' % item
-                pkgbasename = self.SearchPpiPackage(item, workspace, refDecObjs)
+                pkgbasename = self.SearchPpiPackage(
+                    item, workspace, refDecObjs)
                 desc += '<TD>%s</TD>' % pkgbasename
                 desc += '</TR>'
             desc += "</TABLE>"
@@ -902,8 +984,8 @@ class PackageDocumentAction(DoxygenAction):
 
         # Add guid section
         #sects = infObj.GetSectionByString('guid')
-        data  = []
-        #for sect in sects:
+        data = []
+        # for sect in sects:
         for obj in infObj.GetSectionObjectsByName('guid', self._arch):
             data.append(obj.GetName().strip())
         if len(data) != 0:
@@ -913,7 +995,8 @@ class PackageDocumentAction(DoxygenAction):
             for item in data:
                 desc += '<TR>'
                 desc += '<TD>%s</TD>' % item
-                pkgbasename = self.SearchGuidPackage(item, workspace, refDecObjs)
+                pkgbasename = self.SearchGuidPackage(
+                    item, workspace, refDecObjs)
                 desc += '<TD>%s</TD>' % pkgbasename
                 desc += '</TR>'
             desc += "</TABLE>"
@@ -929,12 +1012,13 @@ class PackageDocumentAction(DoxygenAction):
             desc += '<TD>Produce</TD>'
             try:
                 pkgname, hPath = self.SearchLibraryClassHeaderFile(infObj.GetProduceLibraryClass(),
-                                                              workspace,
-                                                              refDecObjs)
+                                                                   workspace,
+                                                                   refDecObjs)
             except:
-                self.Log ('fail to get package header file for lib class %s' % infObj.GetProduceLibraryClass())
+                self.Log('fail to get package header file for lib class %s' %
+                         infObj.GetProduceLibraryClass())
                 pkgname = 'NULL'
-                hPath   = 'NULL'
+                hPath = 'NULL'
             desc += '<TD>%s</TD>' % pkgname
             if hPath != "NULL":
                 #desc += '<TD>\link %s \endlink</TD>' % hPath
@@ -951,9 +1035,10 @@ class PackageDocumentAction(DoxygenAction):
             if retarr is not None:
                 pkgname, hPath = retarr
             else:
-                self.Log('Fail find the library class %s definition from module %s dependent package!' % (lcObj.GetClass(), infObj.GetFilename()), 'error')
+                self.Log('Fail find the library class %s definition from module %s dependent package!' % (
+                    lcObj.GetClass(), infObj.GetFilename()), 'error')
                 pkgname = 'NULL'
-                hPath   = 'NULL'
+                hPath = 'NULL'
             desc += '<TD>Consume</TD>'
             desc += '<TD>%s</TD>' % pkgname
             desc += '<TD>%s</TD>' % hPath
@@ -966,22 +1051,25 @@ class PackageDocumentAction(DoxygenAction):
         section.AddDescription('<ul>\n')
         for obj in infObj.GetSourceObjects(self._arch, self._tooltag):
             sPath = infObj.GetModuleRootPath()
-            sPath = os.path.join(sPath, obj.GetSourcePath()).replace('\\', '/').strip()
+            sPath = os.path.join(sPath, obj.GetSourcePath()
+                                 ).replace('\\', '/').strip()
             if sPath.lower().endswith('.uni') or sPath.lower().endswith('.s') or sPath.lower().endswith('.asm') or sPath.lower().endswith('.nasm'):
                 newPath = self.TranslateUniFile(sPath)
                 configFile.AddFile(newPath)
                 newPath = newPath[len(pObj.GetWorkspace()) + 1:]
-                section.AddDescription('<li> \link %s \endlink </li>' %  newPath)
+                section.AddDescription(
+                    '<li> \link %s \endlink </li>' % newPath)
             else:
-                self.ProcessSourceFileForInclude(sPath, pObj, configFile, infObj)
+                self.ProcessSourceFileForInclude(
+                    sPath, pObj, configFile, infObj)
                 sPath = sPath[len(pObj.GetWorkspace()) + 1:]
                 section.AddDescription('<li>\link %s \endlink </li>' % sPath)
         section.AddDescription('</ul>\n')
         modPage.AddSection(section)
 
         #sects = infObj.GetSectionByString('depex')
-        data  = []
-        #for sect in sects:
+        data = []
+        # for sect in sects:
         for obj in infObj.GetSectionObjectsByName('depex'):
             data.append(str(obj))
         if len(data) != 0:
@@ -1031,35 +1119,35 @@ class PackageDocumentAction(DoxygenAction):
         return newpath
 
     def SearchPcdPackage(self, pcdname, workspace, decObjs):
-        for decObj in  decObjs:
+        for decObj in decObjs:
             for pcd in decObj.GetSectionObjectsByName('pcd'):
                 if pcdname == pcd.GetPcdName():
                     return decObj.GetBaseName()
         return None
 
     def SearchProtocolPackage(self, protname, workspace, decObjs):
-        for decObj in  decObjs:
+        for decObj in decObjs:
             for proto in decObj.GetSectionObjectsByName('protocol'):
                 if protname == proto.GetName():
                     return decObj.GetBaseName()
         return None
 
     def SearchPpiPackage(self, ppiname, workspace, decObjs):
-        for decObj in  decObjs:
+        for decObj in decObjs:
             for ppi in decObj.GetSectionObjectsByName('ppi'):
                 if ppiname == ppi.GetName():
                     return decObj.GetBaseName()
         return None
 
     def SearchGuidPackage(self, guidname, workspace, decObjs):
-        for decObj in  decObjs:
+        for decObj in decObjs:
             for guid in decObj.GetSectionObjectsByName('guid'):
                 if guidname == guid.GetName():
                     return decObj.GetBaseName()
         return None
 
     def SearchLibraryClassHeaderFile(self, className, workspace, decObjs):
-        for decObj in  decObjs:
+        for decObj in decObjs:
             for cls in decObj.GetSectionObjectsByName('libraryclasses'):
                 if cls.GetClassName().strip() == className:
                     path = cls.GetHeaderFile().strip()
@@ -1074,9 +1162,11 @@ class PackageDocumentAction(DoxygenAction):
         path = path[len(pRootPath) + 1:]
         return path.replace('\\', '/')
 
+
 def IsCHeaderFile(path):
     return CheckPathPostfix(path, 'h')
 
+
 def CheckPathPostfix(path, str):
     index = path.rfind('.')
     if index == -1:
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/dsc.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/dsc.py
index a2d23a7732f4..566531035307 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/dsc.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/dsc.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
 #
@@ -6,9 +6,11 @@
 #
 
 from plugins.EdkPlugins.basemodel import ini
-import re, os
+import re
+import os
 from plugins.EdkPlugins.basemodel.message import *
 
+
 class DSCFile(ini.BaseINIFile):
     def GetSectionInstance(self, parent, name, isCombined=False):
         return DSCSection(parent, name, isCombined)
@@ -16,6 +18,7 @@ class DSCFile(ini.BaseINIFile):
     def GetComponents(self):
         return self.GetSectionObjectsByName('Components')
 
+
 class DSCSection(ini.BaseINISection):
     def GetSectionINIObject(self, parent):
         type = self.GetType()
@@ -53,10 +56,12 @@ class DSCSection(ini.BaseINISection):
             return 'common'
         return arr[2]
 
+
 class DSCSectionObject(ini.BaseINISectionObject):
     def GetArch(self):
         return self.GetParent().GetArch()
 
+
 class DSCPcdObject(DSCSectionObject):
 
     def __init__(self, parent):
@@ -65,8 +70,8 @@ class DSCPcdObject(DSCSectionObject):
 
     def Parse(self):
         line = self.GetLineByOffset(self._start).strip().split('#')[0]
-        self._name   = line.split('|')[0]
-        self._value  = line.split('|')[1]
+        self._name = line.split('|')[0]
+        self._value = line.split('|')[1]
         return True
 
     def GetPcdName(self):
@@ -78,6 +83,7 @@ class DSCPcdObject(DSCSectionObject):
     def GetPcdValue(self):
         return self._value
 
+
 class DSCLibraryClassObject(DSCSectionObject):
     def __init__(self, parent):
         ini.BaseINISectionObject.__init__(self, parent)
@@ -96,13 +102,14 @@ class DSCLibraryClassObject(DSCSectionObject):
     def GetModuleType(self):
         return self.GetParent().GetModuleType()
 
+
 class DSCComponentObject(DSCSectionObject):
 
     def __init__(self, parent):
         ini.BaseINISectionObject.__init__(self, parent)
-        self._OveridePcds      = {}
+        self._OveridePcds = {}
         self._OverideLibraries = {}
-        self._Filename         = ''
+        self._Filename = ''
 
     def __del__(self):
         self._OverideLibraries.clear()
@@ -140,7 +147,7 @@ class DSCComponentObject(DSCSectionObject):
             # The end line is '}' and could be ignored
             #
             curr = self._start + 1
-            end  = self._end - 1
+            end = self._end - 1
             OverideName = ''
             while (curr <= end):
                 line = self.GetLineByOffset(curr).strip()
@@ -176,7 +183,8 @@ class DSCComponentObject(DSCSectionObject):
         if hasLib:
             lines.append('    <LibraryClasses>\n')
             for libKey in self._OverideLibraries.keys():
-                lines.append('      %s|%s\n' % (libKey, self._OverideLibraries[libKey]))
+                lines.append('      %s|%s\n' %
+                             (libKey, self._OverideLibraries[libKey]))
 
         if hasPcd:
             for key in self._OveridePcds.keys():
@@ -192,4 +200,3 @@ class DSCComponentObject(DSCSectionObject):
             lines.append('  }\n')
 
         return lines
-
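
The dsc.py hunks above are representative of the mechanical changes applied throughout the series: "import re, os" split onto separate lines, alignment padding around '=' collapsed to a single space, two blank lines inserted before each top-level class, and statements longer than 79 columns wrapped at the opening parenthesis. As a minimal sketch, this kind of rewrite can be reproduced through autopep8's documented fix_code() helper; the exact options used for the series are not stated in the cover letter, so defaults are assumed here:

    # Sketch only: default-option autopep8 over a fragment shaped like the code above.
    import autopep8

    before = (
        "import re, os\n"
        "class DSCPcdObject:\n"
        "    def Parse(self, line):\n"
        "        self._name   = line.split('|')[0]\n"
    )
    after = autopep8.fix_code(before)
    print(after)
    # Expected shape of the output: one import per line, a single space around
    # '=', and two blank lines before the class definition.
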
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/inf.py b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/inf.py
index 430e71af2d89..259e8a409fac 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/inf.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/inf.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
 #
@@ -6,9 +6,11 @@
 #
 
 from plugins.EdkPlugins.basemodel import ini
-import re, os
+import re
+import os
 from plugins.EdkPlugins.basemodel.message import *
 
+
 class INFFile(ini.BaseINIFile):
     _libobjs = {}
 
@@ -17,7 +19,8 @@ class INFFile(ini.BaseINIFile):
 
     def GetProduceLibraryClass(self):
         obj = self.GetDefine("LIBRARY_CLASS")
-        if obj is None: return None
+        if obj is None:
+            return None
 
         return obj.split('|')[0].strip()
 
@@ -116,10 +119,12 @@ class INFSection(ini.BaseINISection):
 
         return True
 
+
 class INFSectionObject(ini.BaseINISectionObject):
     def GetArch(self):
         return self.GetParent().GetArch()
 
+
 class INFDefineSectionObject(INFSectionObject):
     def __init__(self, parent):
         INFSectionObject.__init__(self, parent)
@@ -127,20 +132,21 @@ class INFDefineSectionObject(INFSectionObject):
         self._value = None
 
     def Parse(self):
-        assert (self._start == self._end), 'The object in define section must be in single line'
+        assert (self._start ==
+                self._end), 'The object in define section must be in single line'
 
         line = self.GetLineByOffset(self._start).strip()
 
         line = line.split('#')[0]
-        arr  = line.split('=')
+        arr = line.split('=')
         if len(arr) != 2:
             ErrorMsg('Invalid define section object',
-                   self.GetFilename(),
-                   self._start
-                   )
+                     self.GetFilename(),
+                     self._start
+                     )
             return False
 
-        self._key   = arr[0].strip()
+        self._key = arr[0].strip()
         self._value = arr[1].strip()
 
         return True
@@ -151,8 +157,10 @@ class INFDefineSectionObject(INFSectionObject):
     def GetValue(self):
         return self._value
 
+
 class INFLibraryClassObject(INFSectionObject):
     _objs = {}
+
     def __init__(self, parent):
         INFSectionObject.__init__(self, parent)
         self._classname = None
@@ -161,7 +169,8 @@ class INFLibraryClassObject(INFSectionObject):
         return self._classname
 
     def Parse(self):
-        self._classname = self.GetLineByOffset(self._start).split('#')[0].strip()
+        self._classname = self.GetLineByOffset(
+            self._start).split('#')[0].strip()
         objdict = INFLibraryClassObject._objs
         if self._classname in objdict:
             objdict[self._classname].append(self)
@@ -182,21 +191,24 @@ class INFLibraryClassObject(INFSectionObject):
     def GetObjectDict():
         return INFLibraryClassObject._objs
 
+
 class INFDependentPackageObject(INFSectionObject):
     def GetPath(self):
         return self.GetLineByOffset(self._start).split('#')[0].strip()
 
+
 class INFSourceObject(INFSectionObject):
     _objs = {}
+
     def __init__(self, parent):
         INFSectionObject.__init__(self, parent)
 
-        self.mSourcename  = None
-        self.mToolCode    = None
-        self.mFamily      = None
-        self.mTagName     = None
-        self.mFeaturePcd  = None
-        self.mFilename    = None
+        self.mSourcename = None
+        self.mToolCode = None
+        self.mFamily = None
+        self.mTagName = None
+        self.mFeaturePcd = None
+        self.mFilename = None
 
     def GetSourcePath(self):
         return self.mSourcename
@@ -273,15 +285,16 @@ class INFSourceObject(INFSectionObject):
     def GetObjectDict():
         return INFSourceObject._objs
 
+
 class INFPcdObject(INFSectionObject):
     _objs = {}
 
     def __init__(self, parent):
         INFSectionObject.__init__(self, parent)
 
-        self.mPcdType      = None
+        self.mPcdType = None
         self.mDefaultValue = None
-        self.mPcdName      = None
+        self.mPcdName = None
 
     @staticmethod
     def GetObjectDict():
@@ -291,7 +304,7 @@ class INFPcdObject(INFSectionObject):
         line = self.GetLineByOffset(self._start).strip().split('#')[0]
 
         arr = line.split('|')
-        self.mPcdName       = arr[0].strip()
+        self.mPcdName = arr[0].strip()
 
         if len(arr) >= 2:
             self.mDefaultValue = arr[1].strip()
@@ -319,17 +332,17 @@ class INFPcdObject(INFSectionObject):
         if len(objdict[self.GetName()]) == 0:
             del objdict[self.GetName()]
 
+
 class INFGuidObject(INFSectionObject):
     def __init__(self, parent):
         INFSectionObject.__init__(self, parent)
         self._name = None
 
     def Parse(self):
-        line = self.GetLineByOffset(self._start).strip().split('#')[0].split("|")[0]
-        self._name =  line.strip()
+        line = self.GetLineByOffset(self._start).strip().split('#')[
+            0].split("|")[0]
+        self._name = line.strip()
         return True
 
     def GetName(self):
         return self._name
-
-
diff --git a/BaseTools/Scripts/PackageDocumentTools/plugins/__init__.py b/BaseTools/Scripts/PackageDocumentTools/plugins/__init__.py
index 57dfebde5916..dd93c30888c4 100644
--- a/BaseTools/Scripts/PackageDocumentTools/plugins/__init__.py
+++ b/BaseTools/Scripts/PackageDocumentTools/plugins/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
 #
diff --git a/BaseTools/Scripts/PatchCheck.py b/BaseTools/Scripts/PatchCheck.py
index 63e6223f8ebc..551bc3d26fdf 100755
--- a/BaseTools/Scripts/PatchCheck.py
+++ b/BaseTools/Scripts/PatchCheck.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #  Check a patch for various format issues
 #
 #  Copyright (c) 2015 - 2021, Intel Corporation. All rights reserved.<BR>
@@ -9,23 +9,23 @@
 #
 
 from __future__ import print_function
+import email.header
+import sys
+import subprocess
+import re
+import os
+import argparse
+import email
 
 VersionNumber = '0.1'
 __copyright__ = "Copyright (c) 2015 - 2016, Intel Corporation  All rights reserved."
 
-import email
-import argparse
-import os
-import re
-import subprocess
-import sys
-
-import email.header
 
 class Verbose:
     SILENT, ONELINE, NORMAL = range(3)
     level = NORMAL
 
+
 class EmailAddressCheck:
     """Checks an email address."""
 
@@ -55,7 +55,7 @@ class EmailAddressCheck:
             count += 1
 
     email_re1 = re.compile(r'(?:\s*)(.*?)(\s*)<(.+)>\s*$',
-                           re.MULTILINE|re.IGNORECASE)
+                           re.MULTILINE | re.IGNORECASE)
 
     def check_email_address(self, email):
         email = email.strip()
@@ -86,13 +86,14 @@ class EmailAddressCheck:
             self.error("Email rewritten by lists DMARC / DKIM / SPF: " +
                        email)
 
+
 class CommitMessageCheck:
     """Checks the contents of a git commit message."""
 
     def __init__(self, subject, message, author_email):
         self.ok = True
 
-        if subject is None and  message is None:
+        if subject is None and message is None:
             self.error('Commit message is missing!')
             return
 
@@ -104,7 +105,7 @@ class CommitMessageCheck:
         self.subject = subject
         self.msg = message
 
-        print (subject)
+        print(subject)
 
         self.check_contributed_under()
         if not MergifyMerge:
@@ -143,7 +144,7 @@ class CommitMessageCheck:
     # requires ':' to be present.  Matches if there is white space before
     # the tag or between the tag and the ':'.
     contributed_under_re = \
-        re.compile(r'^\s*contributed-under\s*:', re.MULTILINE|re.IGNORECASE)
+        re.compile(r'^\s*contributed-under\s*:', re.MULTILINE | re.IGNORECASE)
 
     def check_contributed_under(self):
         match = self.contributed_under_re.search(self.msg)
@@ -160,7 +161,7 @@ class CommitMessageCheck:
         re_str = (r'^(?P<tag>' + sub_re +
                   r')(\s*):(\s*)(?P<value>\S.*?)(?:\s*)$')
         try:
-            return re.compile(re_str, re.MULTILINE|re.IGNORECASE)
+            return re.compile(re_str, re.MULTILINE | re.IGNORECASE)
         except Exception:
             print("Tried to compile re:", re_str)
             raise
@@ -184,7 +185,7 @@ class CommitMessageCheck:
 
         bad_case_sigs = filter(lambda m: m[0] != sig, sigs)
         for s in bad_case_sigs:
-            self.error("'" +s[0] + "' should be '" + sig + "'")
+            self.error("'" + s[0] + "' should be '" + sig + "'")
 
         for s in sigs:
             if s[1] != '':
@@ -198,7 +199,7 @@ class CommitMessageCheck:
         return sigs
 
     def check_signed_off_by(self):
-        sob='Signed-off-by'
+        sob = 'Signed-off-by'
         if self.msg.find(sob) < 0:
             self.error('Missing Signed-off-by! (Note: this must be ' +
                        'added by the code contributor!)')
@@ -217,7 +218,7 @@ class CommitMessageCheck:
         'Suggested',
         'Acked',
         'Cc'
-        )
+    )
 
     def check_misc_signatures(self):
         for sig in self.sig_types:
@@ -251,7 +252,7 @@ class CommitMessageCheck:
                 self.error(
                     'First line of commit message (subject line) is too long (%d >= 93).' %
                     (len(lines[0].rstrip()))
-                    )
+                )
         else:
             #
             # If CVE-xxxx-xxxxx is not present in subject line, then limit
@@ -261,7 +262,7 @@ class CommitMessageCheck:
                 self.error(
                     'First line of commit message (subject line) is too long (%d >= 76).' %
                     (len(lines[0].rstrip()))
-                    )
+                )
 
         if count >= 1 and len(lines[0].strip()) == 0:
             self.error('First line of commit message (subject line) ' +
@@ -281,14 +282,14 @@ class CommitMessageCheck:
                 not lines[i].startswith('Reported-by:') and
                 not lines[i].startswith('Suggested-by:') and
                 not lines[i].startswith('Signed-off-by:') and
-                not lines[i].startswith('Cc:')):
+                    not lines[i].startswith('Cc:')):
                 #
                 # Print a warning if body line is longer than 75 characters
                 #
                 print(
                     'WARNING - Line %d of commit message is too long (%d >= 76).' %
                     (i + 1, len(lines[i]))
-                    )
+                )
                 print(lines[i])
 
         last_sig_line = None
@@ -307,8 +308,10 @@ class CommitMessageCheck:
                 break
             last_sig_line = line.strip()
 
+
 (START, PRE_PATCH, PATCH) = range(3)
 
+
 class GitDiffCheck:
     """Checks the contents of a git diff."""
 
@@ -347,8 +350,8 @@ class GitDiffCheck:
             if line.startswith('@@ '):
                 self.state = PRE_PATCH
             elif len(line) >= 1 and line[0] not in ' -+' and \
-                 not line.startswith('\r\n') and  \
-                 not line.startswith(r'\ No newline ') and not self.binary:
+                    not line.startswith('\r\n') and  \
+                    not line.startswith(r'\ No newline ') and not self.binary:
                 for line in self.lines[self.line_num + 1:]:
                     if line.startswith('diff --git'):
                         self.format_error('diff found after end of patch')
@@ -364,10 +367,10 @@ class GitDiffCheck:
                 self.force_crlf = True
                 self.force_notabs = True
                 if self.filename.endswith('.sh') or \
-                    self.filename.startswith('BaseTools/BinWrappers/PosixLike/') or \
-                    self.filename.startswith('BaseTools/BinPipWrappers/PosixLike/') or \
-                    self.filename.startswith('BaseTools/Bin/CYGWIN_NT-5.1-i686/') or \
-                    self.filename == 'BaseTools/BuildEnv':
+                        self.filename.startswith('BaseTools/BinWrappers/PosixLike/') or \
+                        self.filename.startswith('BaseTools/BinPipWrappers/PosixLike/') or \
+                        self.filename.startswith('BaseTools/Bin/CYGWIN_NT-5.1-i686/') or \
+                        self.filename == 'BaseTools/BuildEnv':
                     #
                     # Do not enforce CR/LF line endings for linux shell scripts.
                     # Some linux shell scripts don't end with the ".sh" extension,
@@ -391,7 +394,7 @@ class GitDiffCheck:
                 self.state = PATCH
                 self.binary = False
             elif line.startswith('GIT binary patch') or \
-                 line.startswith('Binary files'):
+                    line.startswith('Binary files'):
                 self.state = PATCH
                 self.binary = True
                 if self.is_newfile:
@@ -437,7 +440,7 @@ class GitDiffCheck:
         'copy from ',
         'copy to ',
         'rename ',
-        )
+    )
 
     line_endings = ('\r\n', '\n\r', '\n', '\r')
 
@@ -448,7 +451,7 @@ class GitDiffCheck:
                    re.VERBOSE)
 
     def added_line_error(self, msg, line):
-        lines = [ msg ]
+        lines = [msg]
         if self.filename is not None:
             lines.append('File: ' + self.filename)
         lines.append('Line: ' + line)
@@ -514,6 +517,7 @@ class GitDiffCheck:
             print(prefix, line)
             count += 1
 
+
 class CheckOnePatch:
     """Checks the contents of a git email formatted patch.
 
@@ -528,7 +532,8 @@ class CheckOnePatch:
         email_check = EmailAddressCheck(self.author_email, 'Author')
         email_ok = email_check.ok
 
-        msg_check = CommitMessageCheck(self.commit_subject, self.commit_msg, self.author_email)
+        msg_check = CommitMessageCheck(
+            self.commit_subject, self.commit_msg, self.author_email)
         msg_ok = msg_check.ok
 
         diff_ok = True
@@ -550,7 +555,6 @@ class CheckOnePatch:
                 result = 'bad ' + ' and '.join(result)
             print(name, result)
 
-
     git_diff_re = re.compile(r'''
                                  ^ diff \s+ --git \s+ a/.+ \s+ b/.+ $
                              ''',
@@ -629,10 +633,12 @@ class CheckOnePatch:
 
         self.commit_subject = subject.replace('\r\n', '')
         self.commit_subject = self.commit_subject.replace('\n', '')
-        self.commit_subject = self.subject_prefix_re.sub('', self.commit_subject, 1)
+        self.commit_subject = self.subject_prefix_re.sub(
+            '', self.commit_subject, 1)
 
         self.author_email = pmail['from']
 
+
 class CheckGitCommits:
     """Reads patches from git based on the specified git revision range.
 
@@ -642,7 +648,7 @@ class CheckGitCommits:
     def __init__(self, rev_spec, max_count):
         commits = self.read_commit_list_from_git(rev_spec, max_count)
         if len(commits) == 1 and Verbose.level > Verbose.ONELINE:
-            commits = [ rev_spec ]
+            commits = [rev_spec]
         self.ok = True
         blank_line = False
         for commit in commits:
@@ -661,7 +667,7 @@ class CheckGitCommits:
 
     def read_commit_list_from_git(self, rev_spec, max_count):
         # Run git to get the commit patch
-        cmd = [ 'rev-list', '--abbrev-commit', '--no-walk' ]
+        cmd = ['rev-list', '--abbrev-commit', '--no-walk']
         if max_count is not None:
             cmd.append('--max-count=' + str(max_count))
         cmd.append(rev_spec)
@@ -679,13 +685,14 @@ class CheckGitCommits:
                             '--no-use-mailmap', commit)
 
     def run_git(self, *args):
-        cmd = [ 'git' ]
+        cmd = ['git']
         cmd += args
         p = subprocess.Popen(cmd,
-                     stdout=subprocess.PIPE,
-                     stderr=subprocess.STDOUT)
+                             stdout=subprocess.PIPE,
+                             stderr=subprocess.STDOUT)
         Result = p.communicate()
-        return Result[0].decode('utf-8', 'ignore') if Result[0] and Result[0].find(b"fatal")!=0 else None
+        return Result[0].decode('utf-8', 'ignore') if Result[0] and Result[0].find(b"fatal") != 0 else None
+
 
 class CheckOnePatchFile:
     """Performs a patch check for a single file.
@@ -705,6 +712,7 @@ class CheckOnePatchFile:
             print('Checking patch file:', patch_filename)
         self.ok = CheckOnePatch(patch_filename, patch).ok
 
+
 class CheckOneArg:
     """Performs a patch check for a single command line argument.
 
@@ -720,6 +728,7 @@ class CheckOneArg:
             checker = CheckGitCommits(param, max_count)
         self.ok = checker.ok
 
+
 class PatchCheckApp:
     """Checks patches based on the command line arguments."""
 
@@ -728,7 +737,7 @@ class PatchCheckApp:
         patches = self.args.patches
 
         if len(patches) == 0:
-            patches = [ 'HEAD' ]
+            patches = ['HEAD']
 
         self.ok = True
         self.count = None
@@ -772,5 +781,6 @@ class PatchCheckApp:
         if self.args.silent:
             Verbose.level = Verbose.SILENT
 
+
 if __name__ == "__main__":
     sys.exit(PatchCheckApp().retval)
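
Beyond whitespace, the PatchCheck.py diff above also hoists the import block above the VersionNumber and __copyright__ assignments, re-indents the continuation lines of multi-line conditions, and adds spaces around '|' in re.MULTILINE | re.IGNORECASE. Since import hoisting reorders statements rather than merely reflowing them, one quick review aid is an AST comparison; a minimal sketch, where the saved pre-format copy 'PatchCheck.py.orig' is a hypothetical name and not something produced by this patch:

    # Sketch only: pure whitespace changes compare equal under ast.dump(), while
    # statement reordering (such as the import hoisting above) shows up as a
    # difference that can then be reviewed by hand.
    import ast

    def ast_equal(old_source, new_source):
        # ast.dump() ignores formatting and, by default, line/column attributes.
        return ast.dump(ast.parse(old_source)) == ast.dump(ast.parse(new_source))

    with open('PatchCheck.py.orig') as f_old, open('PatchCheck.py') as f_new:
        print(ast_equal(f_old.read(), f_new.read()))
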
diff --git a/BaseTools/Scripts/RunMakefile.py b/BaseTools/Scripts/RunMakefile.py
index 0e0a0114f9c7..a65cd0717495 100644
--- a/BaseTools/Scripts/RunMakefile.py
+++ b/BaseTools/Scripts/RunMakefile.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Run a makefile as part of a PREBUILD or POSTBUILD action.
 #
 # Copyright (c) 2017, Intel Corporation. All rights reserved.<BR>
@@ -17,9 +17,9 @@ import subprocess
 #
 # Globals for help information
 #
-__prog__        = 'RunMakefile'
-__version__     = '%s Version %s' % (__prog__, '1.0')
-__copyright__   = 'Copyright (c) 2017, Intel Corporation. All rights reserved.'
+__prog__ = 'RunMakefile'
+__version__ = '%s Version %s' % (__prog__, '1.0')
+__copyright__ = 'Copyright (c) 2017, Intel Corporation. All rights reserved.'
 __description__ = 'Run a makefile as part of a PREBUILD or POSTBUILD action.\n'
 
 #
@@ -27,146 +27,152 @@ __description__ = 'Run a makefile as part of a PREBUILD or POSTBUILD action.\n'
 #
 gArgs = None
 
+
 def Log(Message):
-  if not gArgs.Verbose:
-    return
-  sys.stdout.write (__prog__ + ': ' + Message + '\n')
+    if not gArgs.Verbose:
+        return
+    sys.stdout.write(__prog__ + ': ' + Message + '\n')
+
 
 def Error(Message, ExitValue=1):
-  sys.stderr.write (__prog__ + ': ERROR: ' + Message + '\n')
-  sys.exit (ExitValue)
+    sys.stderr.write(__prog__ + ': ERROR: ' + Message + '\n')
+    sys.exit(ExitValue)
+
 
 def RelativePath(target):
-  return os.path.relpath (target, gWorkspace)
+    return os.path.relpath(target, gWorkspace)
+
 
 def NormalizePath(target):
-  if isinstance(target, tuple):
-    return os.path.normpath (os.path.join (*target))
-  else:
-    return os.path.normpath (target)
+    if isinstance(target, tuple):
+        return os.path.normpath(os.path.join(*target))
+    else:
+        return os.path.normpath(target)
+
 
 if __name__ == '__main__':
-  #
-  # Create command line argument parser object
-  #
-  parser = argparse.ArgumentParser (
-                      prog = __prog__,
-                      version = __version__,
-                      description = __description__ + __copyright__,
-                      conflict_handler = 'resolve'
-                      )
-  parser.add_argument (
-           '-a', '--arch', dest = 'Arch', nargs = '+', action = 'append',
-           required = True,
-           help = '''ARCHS is one of list: IA32, X64, IPF, ARM, AARCH64 or EBC,
+    #
+    # Create command line argument parser object
+    #
+    parser = argparse.ArgumentParser(
+        prog=__prog__,
+        version=__version__,
+        description=__description__ + __copyright__,
+        conflict_handler='resolve'
+    )
+    parser.add_argument(
+        '-a', '--arch', dest='Arch', nargs='+', action='append',
+        required=True,
+        help='''ARCHS is one of list: IA32, X64, IPF, ARM, AARCH64 or EBC,
                      which overrides target.txt's TARGET_ARCH definition. To
                      specify more archs, please repeat this option.'''
-           )
-  parser.add_argument (
-           '-t', '--tagname', dest = 'ToolChain', required = True,
-           help = '''Using the Tool Chain Tagname to build the platform,
+    )
+    parser.add_argument(
+        '-t', '--tagname', dest='ToolChain', required=True,
+        help='''Using the Tool Chain Tagname to build the platform,
                      overriding target.txt's TOOL_CHAIN_TAG definition.'''
-           )
-  parser.add_argument (
-           '-p', '--platform', dest = 'PlatformFile', required = True,
-           help = '''Build the platform specified by the DSC file name argument,
+    )
+    parser.add_argument(
+        '-p', '--platform', dest='PlatformFile', required=True,
+        help='''Build the platform specified by the DSC file name argument,
                      overriding target.txt's ACTIVE_PLATFORM definition.'''
-           )
-  parser.add_argument (
-           '-b', '--buildtarget', dest = 'BuildTarget', required = True,
-           help = '''Using the TARGET to build the platform, overriding
+    )
+    parser.add_argument(
+        '-b', '--buildtarget', dest='BuildTarget', required=True,
+        help='''Using the TARGET to build the platform, overriding
                      target.txt's TARGET definition.'''
-           )
-  parser.add_argument (
-           '--conf=', dest = 'ConfDirectory', required = True,
-           help = '''Specify the customized Conf directory.'''
-           )
-  parser.add_argument (
-           '-D', '--define', dest = 'Define', nargs='*', action = 'append',
-           help = '''Macro: "Name [= Value]".'''
-           )
-  parser.add_argument (
-           '--makefile', dest = 'Makefile', required = True,
-           help = '''Makefile to run passing in arguments as makefile defines.'''
-           )
-  parser.add_argument (
-           '-v', '--verbose', dest = 'Verbose', action = 'store_true',
-           help = '''Turn on verbose output with informational messages printed'''
-           )
+    )
+    parser.add_argument(
+        '--conf=', dest='ConfDirectory', required=True,
+        help='''Specify the customized Conf directory.'''
+    )
+    parser.add_argument(
+        '-D', '--define', dest='Define', nargs='*', action='append',
+        help='''Macro: "Name [= Value]".'''
+    )
+    parser.add_argument(
+        '--makefile', dest='Makefile', required=True,
+        help='''Makefile to run passing in arguments as makefile defines.'''
+    )
+    parser.add_argument(
+        '-v', '--verbose', dest='Verbose', action='store_true',
+        help='''Turn on verbose output with informational messages printed'''
+    )
 
-  #
-  # Parse command line arguments
-  #
-  gArgs, remaining = parser.parse_known_args()
-  gArgs.BuildType = 'all'
-  for BuildType in ['all', 'fds', 'genc', 'genmake', 'clean', 'cleanall', 'modules', 'libraries', 'run']:
-    if BuildType in remaining:
-      gArgs.BuildType = BuildType
-      remaining.remove(BuildType)
-      break
-  gArgs.Remaining = ' '.join(remaining)
+    #
+    # Parse command line arguments
+    #
+    gArgs, remaining = parser.parse_known_args()
+    gArgs.BuildType = 'all'
+    for BuildType in ['all', 'fds', 'genc', 'genmake', 'clean', 'cleanall', 'modules', 'libraries', 'run']:
+        if BuildType in remaining:
+            gArgs.BuildType = BuildType
+            remaining.remove(BuildType)
+            break
+    gArgs.Remaining = ' '.join(remaining)
 
-  #
-  # Start
-  #
-  Log ('Start')
+    #
+    # Start
+    #
+    Log('Start')
 
-  #
-  # Find makefile in WORKSPACE or PACKAGES_PATH
-  #
-  PathList = ['']
-  try:
-    PathList.append(os.environ['WORKSPACE'])
-  except:
-    Error ('WORKSPACE environment variable not set')
-  try:
-    PathList += os.environ['PACKAGES_PATH'].split(os.pathsep)
-  except:
-    pass
-  for Path in PathList:
-    Makefile = NormalizePath((Path, gArgs.Makefile))
-    if os.path.exists (Makefile):
-      break
-  if not os.path.exists(Makefile):
-    Error ('makefile %s not found' % (gArgs.Makefile))
+    #
+    # Find makefile in WORKSPACE or PACKAGES_PATH
+    #
+    PathList = ['']
+    try:
+        PathList.append(os.environ['WORKSPACE'])
+    except:
+        Error('WORKSPACE environment variable not set')
+    try:
+        PathList += os.environ['PACKAGES_PATH'].split(os.pathsep)
+    except:
+        pass
+    for Path in PathList:
+        Makefile = NormalizePath((Path, gArgs.Makefile))
+        if os.path.exists(Makefile):
+            break
+    if not os.path.exists(Makefile):
+        Error('makefile %s not found' % (gArgs.Makefile))
 
-  #
-  # Build command line arguments converting build arguments to makefile defines
-  #
-  CommandLine = [Makefile]
-  CommandLine.append('TARGET_ARCH="%s"' % (' '.join([Item[0] for Item in gArgs.Arch])))
-  CommandLine.append('TOOL_CHAIN_TAG="%s"' % (gArgs.ToolChain))
-  CommandLine.append('TARGET="%s"' % (gArgs.BuildTarget))
-  CommandLine.append('ACTIVE_PLATFORM="%s"' % (gArgs.PlatformFile))
-  CommandLine.append('CONF_DIRECTORY="%s"' % (gArgs.ConfDirectory))
-  if gArgs.Define:
-    for Item in gArgs.Define:
-      if '=' not in Item[0]:
-        continue
-      Item = Item[0].split('=', 1)
-      CommandLine.append('%s="%s"' % (Item[0], Item[1]))
-  CommandLine.append('EXTRA_FLAGS="%s"' % (gArgs.Remaining))
-  CommandLine.append(gArgs.BuildType)
-  if sys.platform == "win32":
-    CommandLine = 'nmake /f %s' % (' '.join(CommandLine))
-  else:
-    CommandLine = 'make -f %s' % (' '.join(CommandLine))
+    #
+    # Build command line arguments converting build arguments to makefile defines
+    #
+    CommandLine = [Makefile]
+    CommandLine.append('TARGET_ARCH="%s"' %
+                       (' '.join([Item[0] for Item in gArgs.Arch])))
+    CommandLine.append('TOOL_CHAIN_TAG="%s"' % (gArgs.ToolChain))
+    CommandLine.append('TARGET="%s"' % (gArgs.BuildTarget))
+    CommandLine.append('ACTIVE_PLATFORM="%s"' % (gArgs.PlatformFile))
+    CommandLine.append('CONF_DIRECTORY="%s"' % (gArgs.ConfDirectory))
+    if gArgs.Define:
+        for Item in gArgs.Define:
+            if '=' not in Item[0]:
+                continue
+            Item = Item[0].split('=', 1)
+            CommandLine.append('%s="%s"' % (Item[0], Item[1]))
+    CommandLine.append('EXTRA_FLAGS="%s"' % (gArgs.Remaining))
+    CommandLine.append(gArgs.BuildType)
+    if sys.platform == "win32":
+        CommandLine = 'nmake /f %s' % (' '.join(CommandLine))
+    else:
+        CommandLine = 'make -f %s' % (' '.join(CommandLine))
 
-  #
-  # Run the makefile
-  #
-  try:
-    Process = subprocess.Popen(CommandLine, shell=True)
-  except:
-    Error ('make command not available.  Please verify PATH')
-  Process.communicate()
+    #
+    # Run the makefile
+    #
+    try:
+        Process = subprocess.Popen(CommandLine, shell=True)
+    except:
+        Error('make command not available.  Please verify PATH')
+    Process.communicate()
 
-  #
-  # Done
-  #
-  Log ('Done')
+    #
+    # Done
+    #
+    Log('Done')
 
-  #
-  # Return status from running the makefile
-  #
-  sys.exit(Process.returncode)
+    #
+    # Return status from running the makefile
+    #
+    sys.exit(Process.returncode)
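
The RunMakefile.py diff above re-indents the whole module body from two-space to four-space indentation and reflows the long argparse calls into hanging indents, but the bare "except:" clauses are left as they were; they fall into the group of flake8 findings the cover letter leaves open. A minimal sketch for counting those leftovers, assuming flake8's documented legacy API (the module, option, and path names here come from flake8's documentation, not from this patch):

    # Sketch only: list remaining bare-except (E722) findings after the reformat.
    from flake8.api import legacy as flake8

    style_guide = flake8.get_style_guide(select=['E722'])
    report = style_guide.check_files(['BaseTools/Scripts/RunMakefile.py'])
    print(report.get_statistics('E722'))
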
diff --git a/BaseTools/Scripts/SetupGit.py b/BaseTools/Scripts/SetupGit.py
index 91814199bfb9..a1bcd185bf0f 100644
--- a/BaseTools/Scripts/SetupGit.py
+++ b/BaseTools/Scripts/SetupGit.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #  Set up the git configuration for contributing to TianoCore projects
 #
 #  Copyright (c) 2019, Linaro Ltd. All rights reserved.<BR>
@@ -50,7 +50,7 @@ UPSTREAMS = [
     {'name': 'edk2-test',
      'repo': 'https://github.com/tianocore/edk2-test.git',
      'list': 'devel@edk2.groups.io', 'prefix': 'edk2-test'}
-    ]
+]
 
 # The minimum version required for all of the below options to work
 MIN_GIT_VERSION = (1, 9, 0)
@@ -78,11 +78,12 @@ OPTIONS = [
     {'section': 'format',      'option': 'numbered',          'value': True},
     {'section': 'format',      'option': 'signoff',           'value': False},
     {'section': 'log',         'option': 'mailmap',           'value': True},
-    {'section': 'notes',       'option': 'rewriteRef',        'value': 'refs/notes/commits'},
+    {'section': 'notes',       'option': 'rewriteRef',
+        'value': 'refs/notes/commits'},
     {'section': 'sendemail',   'option': 'chainreplyto',      'value': False},
     {'section': 'sendemail',   'option': 'thread',            'value': True},
     {'section': 'sendemail',   'option': 'transferEncoding',  'value': '8bit'},
-    ]
+]
 
 
 def locate_repo():
@@ -97,7 +98,7 @@ def locate_repo():
 
 def fuzzy_match_repo_url(one, other):
     """Compares two repository URLs, ignoring protocol and optional trailing '.git'."""
-    oneresult   = re.match(r'.*://(?P<oneresult>.*?)(\.git)*$', one)
+    oneresult = re.match(r'.*://(?P<oneresult>.*?)(\.git)*$', one)
     otherresult = re.match(r'.*://(?P<otherresult>.*?)(\.git)*$', other)
 
     if oneresult and otherresult:
@@ -204,13 +205,17 @@ if __name__ == '__main__':
                                                            entry['option'], value))
             else:
                 if ARGS.force:
-                    write_config_value(REPO, entry['section'], entry['option'], entry['value'])
+                    write_config_value(
+                        REPO, entry['section'], entry['option'], entry['value'])
                 else:
                     print("Not overwriting existing %s.%s value:" % (entry['section'],
                                                                      entry['option']))
                     print("  '%s' != '%s'" % (value, entry['value']))
-                    print("  add '-f' to command line to force overwriting existing settings")
+                    print(
+                        "  add '-f' to command line to force overwriting existing settings")
         else:
-            print("%s.%s => '%s'" % (entry['section'], entry['option'], entry['value']))
+            print("%s.%s => '%s'" %
+                  (entry['section'], entry['option'], entry['value']))
             if not ARGS.check:
-                write_config_value(REPO, entry['section'], entry['option'], entry['value'])
+                write_config_value(
+                    REPO, entry['section'], entry['option'], entry['value'])
diff --git a/BaseTools/Scripts/SmiHandlerProfileSymbolGen.py b/BaseTools/Scripts/SmiHandlerProfileSymbolGen.py
index 1e2e7fd1049b..4caea88c68e2 100644
--- a/BaseTools/Scripts/SmiHandlerProfileSymbolGen.py
+++ b/BaseTools/Scripts/SmiHandlerProfileSymbolGen.py
@@ -20,6 +20,7 @@ import xml.dom.minidom
 versionNumber = "1.1"
 __copyright__ = "Copyright (c) 2016, Intel Corporation. All rights reserved."
 
+
 class Symbols:
     def __init__(self):
         self.listLineAddress = []
@@ -29,36 +30,36 @@ class Symbols:
         # Cache for line
         self.sourceName = ""
 
-
-    def getSymbol (self, rva):
+    def getSymbol(self, rva):
         index = 0
-        lineName  = 0
+        lineName = 0
         sourceName = "??"
-        while index + 1 < self.lineCount :
-            if self.listLineAddress[index][0] <= rva and self.listLineAddress[index + 1][0] > rva :
+        while index + 1 < self.lineCount:
+            if self.listLineAddress[index][0] <= rva and self.listLineAddress[index + 1][0] > rva:
                 offset = rva - self.listLineAddress[index][0]
                 functionName = self.listLineAddress[index][1]
                 lineName = self.listLineAddress[index][2]
                 sourceName = self.listLineAddress[index][3]
-                if lineName == 0 :
-                  return [functionName]
-                else :
-                  return [functionName, sourceName, lineName]
+                if lineName == 0:
+                    return [functionName]
+                else:
+                    return [functionName, sourceName, lineName]
             index += 1
 
         return []
 
     def parse_debug_file(self, driverName, pdbName):
-        if cmp (pdbName, "") == 0 :
+        if cmp(pdbName, "") == 0:
             return
-        self.pdbName = pdbName;
+        self.pdbName = pdbName
 
         try:
             nmCommand = "nm"
             nmLineOption = "-l"
             print("parsing (debug) - " + pdbName)
-            os.system ('%s %s %s > nmDump.line.log' % (nmCommand, nmLineOption, pdbName))
-        except :
+            os.system('%s %s %s > nmDump.line.log' %
+                      (nmCommand, nmLineOption, pdbName))
+        except:
             print('ERROR: nm command not available.  Please verify PATH')
             return
 
@@ -75,23 +76,25 @@ class Symbols:
         for reportLine in reportLines:
             match = re.match(patchLineFileMatchString, reportLine)
             if match is not None:
-                rva = int (match.group(1), 16)
+                rva = int(match.group(1), 16)
                 functionName = match.group(2)
                 sourceName = match.group(3)
-                if cmp (match.group(4), "") != 0 :
-                    lineName = int (match.group(4))
-                else :
+                if cmp(match.group(4), "") != 0:
+                    lineName = int(match.group(4))
+                else:
                     lineName = 0
-                self.listLineAddress.append ([rva, functionName, lineName, sourceName])
+                self.listLineAddress.append(
+                    [rva, functionName, lineName, sourceName])
 
-        self.lineCount = len (self.listLineAddress)
+        self.lineCount = len(self.listLineAddress)
 
-        self.listLineAddress = sorted(self.listLineAddress, key=lambda symbolAddress:symbolAddress[0])
+        self.listLineAddress = sorted(
+            self.listLineAddress, key=lambda symbolAddress: symbolAddress[0])
 
     def parse_pdb_file(self, driverName, pdbName):
-        if cmp (pdbName, "") == 0 :
+        if cmp(pdbName, "") == 0:
             return
-        self.pdbName = pdbName;
+        self.pdbName = pdbName
 
         try:
             #DIA2DumpCommand = "\"C:\\Program Files (x86)\Microsoft Visual Studio 14.0\\DIA SDK\\Samples\\DIA2Dump\\x64\\Debug\\Dia2Dump.exe\""
@@ -100,8 +103,9 @@ class Symbols:
             DIA2LinesOption = "-l"
             print("parsing (pdb) - " + pdbName)
             #os.system ('%s %s %s > DIA2Dump.symbol.log' % (DIA2DumpCommand, DIA2SymbolOption, pdbName))
-            os.system ('%s %s %s > DIA2Dump.line.log' % (DIA2DumpCommand, DIA2LinesOption, pdbName))
-        except :
+            os.system('%s %s %s > DIA2Dump.line.log' %
+                      (DIA2DumpCommand, DIA2LinesOption, pdbName))
+        except:
             print('ERROR: DIA2Dump command not available.  Please verify PATH')
             return
 
@@ -123,50 +127,60 @@ class Symbols:
         for reportLine in reportLines:
             match = re.match(patchLineFileMatchString, reportLine)
             if match is not None:
-                if cmp (match.group(3), "") != 0 :
+                if cmp(match.group(3), "") != 0:
                     self.sourceName = match.group(3)
                 sourceName = self.sourceName
                 functionName = self.functionName
 
-                rva = int (match.group(2), 16)
-                lineName = int (match.group(1))
-                self.listLineAddress.append ([rva, functionName, lineName, sourceName])
-            else :
+                rva = int(match.group(2), 16)
+                lineName = int(match.group(1))
+                self.listLineAddress.append(
+                    [rva, functionName, lineName, sourceName])
+            else:
                 match = re.match(patchLineFileMatchStringFunc, reportLine)
                 if match is not None:
                     self.functionName = match.group(1)
 
-        self.lineCount = len (self.listLineAddress)
-        self.listLineAddress = sorted(self.listLineAddress, key=lambda symbolAddress:symbolAddress[0])
+        self.lineCount = len(self.listLineAddress)
+        self.listLineAddress = sorted(
+            self.listLineAddress, key=lambda symbolAddress: symbolAddress[0])
+
 
 class SymbolsFile:
     def __init__(self):
         self.symbolsTable = {}
 
+
 symbolsFile = ""
 
 driverName = ""
 rvaName = ""
 symbolName = ""
 
+
 def getSymbolName(driverName, rva):
     global symbolsFile
 
-    try :
+    try:
         symbolList = symbolsFile.symbolsTable[driverName]
         if symbolList is not None:
-            return symbolList.getSymbol (rva)
+            return symbolList.getSymbol(rva)
         else:
             return []
     except Exception:
         return []
 
+
 def myOptionParser():
     usage = "%prog [--version] [-h] [--help] [-i inputfile [-o outputfile] [-g guidreffile]]"
-    Parser = OptionParser(usage=usage, description=__copyright__, version="%prog " + str(versionNumber))
-    Parser.add_option("-i", "--inputfile", dest="inputfilename", type="string", help="The input memory profile info file output from MemoryProfileInfo application in MdeModulePkg")
-    Parser.add_option("-o", "--outputfile", dest="outputfilename", type="string", help="The output memory profile info file with symbol, MemoryProfileInfoSymbol.txt will be used if it is not specified")
-    Parser.add_option("-g", "--guidref", dest="guidreffilename", type="string", help="The input guid ref file output from build")
+    Parser = OptionParser(usage=usage, description=__copyright__,
+                          version="%prog " + str(versionNumber))
+    Parser.add_option("-i", "--inputfile", dest="inputfilename", type="string",
+                      help="The input memory profile info file output from MemoryProfileInfo application in MdeModulePkg")
+    Parser.add_option("-o", "--outputfile", dest="outputfilename", type="string",
+                      help="The output memory profile info file with symbol, MemoryProfileInfoSymbol.txt will be used if it is not specified")
+    Parser.add_option("-g", "--guidref", dest="guidreffilename",
+                      type="string", help="The input guid ref file output from build")
 
     (Options, args) = Parser.parse_args()
     if Options.inputfilename is None:
@@ -175,22 +189,24 @@ def myOptionParser():
         Options.outputfilename = "SmiHandlerProfileInfoSymbol.xml"
     return Options
 
+
 dictGuid = {
-  '00000000-0000-0000-0000-000000000000':'gZeroGuid',
-  '2A571201-4966-47F6-8B86-F31E41F32F10':'gEfiEventLegacyBootGuid',
-  '27ABF055-B1B8-4C26-8048-748F37BAA2DF':'gEfiEventExitBootServicesGuid',
-  '7CE88FB3-4BD7-4679-87A8-A8D8DEE50D2B':'gEfiEventReadyToBootGuid',
-  '02CE967A-DD7E-4FFC-9EE7-810CF0470880':'gEfiEndOfDxeEventGroupGuid',
-  '60FF8964-E906-41D0-AFED-F241E974E08E':'gEfiDxeSmmReadyToLockProtocolGuid',
-  '18A3C6DC-5EEA-48C8-A1C1-B53389F98999':'gEfiSmmSwDispatch2ProtocolGuid',
-  '456D2859-A84B-4E47-A2EE-3276D886997D':'gEfiSmmSxDispatch2ProtocolGuid',
-  '4CEC368E-8E8E-4D71-8BE1-958C45FC8A53':'gEfiSmmPeriodicTimerDispatch2ProtocolGuid',
-  'EE9B8D90-C5A6-40A2-BDE2-52558D33CCA1':'gEfiSmmUsbDispatch2ProtocolGuid',
-  '25566B03-B577-4CBF-958C-ED663EA24380':'gEfiSmmGpiDispatch2ProtocolGuid',
-  '7300C4A1-43F2-4017-A51B-C81A7F40585B':'gEfiSmmStandbyButtonDispatch2ProtocolGuid',
-  '1B1183FA-1823-46A7-8872-9C578755409D':'gEfiSmmPowerButtonDispatch2ProtocolGuid',
-  '58DC368D-7BFA-4E77-ABBC-0E29418DF930':'gEfiSmmIoTrapDispatch2ProtocolGuid',
-  }
+    '00000000-0000-0000-0000-000000000000': 'gZeroGuid',
+    '2A571201-4966-47F6-8B86-F31E41F32F10': 'gEfiEventLegacyBootGuid',
+    '27ABF055-B1B8-4C26-8048-748F37BAA2DF': 'gEfiEventExitBootServicesGuid',
+    '7CE88FB3-4BD7-4679-87A8-A8D8DEE50D2B': 'gEfiEventReadyToBootGuid',
+    '02CE967A-DD7E-4FFC-9EE7-810CF0470880': 'gEfiEndOfDxeEventGroupGuid',
+    '60FF8964-E906-41D0-AFED-F241E974E08E': 'gEfiDxeSmmReadyToLockProtocolGuid',
+    '18A3C6DC-5EEA-48C8-A1C1-B53389F98999': 'gEfiSmmSwDispatch2ProtocolGuid',
+    '456D2859-A84B-4E47-A2EE-3276D886997D': 'gEfiSmmSxDispatch2ProtocolGuid',
+    '4CEC368E-8E8E-4D71-8BE1-958C45FC8A53': 'gEfiSmmPeriodicTimerDispatch2ProtocolGuid',
+    'EE9B8D90-C5A6-40A2-BDE2-52558D33CCA1': 'gEfiSmmUsbDispatch2ProtocolGuid',
+    '25566B03-B577-4CBF-958C-ED663EA24380': 'gEfiSmmGpiDispatch2ProtocolGuid',
+    '7300C4A1-43F2-4017-A51B-C81A7F40585B': 'gEfiSmmStandbyButtonDispatch2ProtocolGuid',
+    '1B1183FA-1823-46A7-8872-9C578755409D': 'gEfiSmmPowerButtonDispatch2ProtocolGuid',
+    '58DC368D-7BFA-4E77-ABBC-0E29418DF930': 'gEfiSmmIoTrapDispatch2ProtocolGuid',
+}
+
 
 def genGuidString(guidreffile):
     guidLines = guidreffile.readlines()
@@ -199,27 +215,32 @@ def genGuidString(guidreffile):
         if len(guidLineList) == 2:
             guid = guidLineList[0]
             guidName = guidLineList[1]
-            if guid not in dictGuid :
+            if guid not in dictGuid:
                 dictGuid[guid] = guidName
 
+
 def createSym(symbolName):
     SymbolNode = xml.dom.minidom.Document().createElement("Symbol")
     SymbolFunction = xml.dom.minidom.Document().createElement("Function")
-    SymbolFunctionData = xml.dom.minidom.Document().createTextNode(symbolName[0])
+    SymbolFunctionData = xml.dom.minidom.Document(
+    ).createTextNode(symbolName[0])
     SymbolFunction.appendChild(SymbolFunctionData)
     SymbolNode.appendChild(SymbolFunction)
     if (len(symbolName)) >= 2:
         SymbolSourceFile = xml.dom.minidom.Document().createElement("SourceFile")
-        SymbolSourceFileData = xml.dom.minidom.Document().createTextNode(symbolName[1])
+        SymbolSourceFileData = xml.dom.minidom.Document(
+        ).createTextNode(symbolName[1])
         SymbolSourceFile.appendChild(SymbolSourceFileData)
         SymbolNode.appendChild(SymbolSourceFile)
         if (len(symbolName)) >= 3:
             SymbolLineNumber = xml.dom.minidom.Document().createElement("LineNumber")
-            SymbolLineNumberData = xml.dom.minidom.Document().createTextNode(str(symbolName[2]))
+            SymbolLineNumberData = xml.dom.minidom.Document(
+            ).createTextNode(str(symbolName[2]))
             SymbolLineNumber.appendChild(SymbolLineNumberData)
             SymbolNode.appendChild(SymbolLineNumber)
     return SymbolNode
 
+
 def main():
     global symbolsFile
     global Options
@@ -227,14 +248,14 @@ def main():
 
     symbolsFile = SymbolsFile()
 
-    try :
+    try:
         DOMTree = xml.dom.minidom.parse(Options.inputfilename)
     except Exception:
         print("fail to open input " + Options.inputfilename)
         return 1
 
     if Options.guidreffilename is not None:
-        try :
+        try:
             guidreffile = open(Options.guidreffilename)
         except Exception:
             print("fail to open guidref" + Options.guidreffilename)
@@ -244,8 +265,10 @@ def main():
 
     SmiHandlerProfile = DOMTree.documentElement
 
-    SmiHandlerDatabase = SmiHandlerProfile.getElementsByTagName("SmiHandlerDatabase")
-    SmiHandlerCategory = SmiHandlerDatabase[0].getElementsByTagName("SmiHandlerCategory")
+    SmiHandlerDatabase = SmiHandlerProfile.getElementsByTagName(
+        "SmiHandlerDatabase")
+    SmiHandlerCategory = SmiHandlerDatabase[0].getElementsByTagName(
+        "SmiHandlerCategory")
     for smiHandlerCategory in SmiHandlerCategory:
         SmiEntry = smiHandlerCategory.getElementsByTagName("SmiEntry")
         for smiEntry in SmiEntry:
@@ -265,10 +288,12 @@ def main():
 
                     symbolsFile.symbolsTable[driverName] = Symbols()
 
-                    if cmp (pdbName[-3:], "pdb") == 0 :
-                        symbolsFile.symbolsTable[driverName].parse_pdb_file (driverName, pdbName)
-                    else :
-                        symbolsFile.symbolsTable[driverName].parse_debug_file (driverName, pdbName)
+                    if cmp(pdbName[-3:], "pdb") == 0:
+                        symbolsFile.symbolsTable[driverName].parse_pdb_file(
+                            driverName, pdbName)
+                    else:
+                        symbolsFile.symbolsTable[driverName].parse_debug_file(
+                            driverName, pdbName)
 
                     Handler = smiHandler.getElementsByTagName("Handler")
                     RVA = Handler[0].getElementsByTagName("RVA")
@@ -276,7 +301,8 @@ def main():
 
                     if (len(RVA)) >= 1:
                         rvaName = RVA[0].childNodes[0].data
-                        symbolName = getSymbolName (driverName, int(rvaName, 16))
+                        symbolName = getSymbolName(
+                            driverName, int(rvaName, 16))
 
                         if (len(symbolName)) >= 1:
                             SymbolNode = createSym(symbolName)
@@ -288,20 +314,23 @@ def main():
 
                     if (len(RVA)) >= 1:
                         rvaName = RVA[0].childNodes[0].data
-                        symbolName = getSymbolName (driverName, int(rvaName, 16))
+                        symbolName = getSymbolName(
+                            driverName, int(rvaName, 16))
 
                         if (len(symbolName)) >= 1:
                             SymbolNode = createSym(symbolName)
                             Caller[0].appendChild(SymbolNode)
 
-    try :
+    try:
         newfile = open(Options.outputfilename, "w")
     except Exception:
         print("fail to open output" + Options.outputfilename)
         return 1
 
-    newfile.write(DOMTree.toprettyxml(indent = "\t", newl = "\n", encoding = "utf-8"))
+    newfile.write(DOMTree.toprettyxml(
+        indent="\t", newl="\n", encoding="utf-8"))
     newfile.close()
 
+
 if __name__ == '__main__':
     sys.exit(main())
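
One further leftover visible in the SmiHandlerProfileSymbolGen.py hunks above: the calls to cmp() are reformatted but kept, and cmp() was removed in Python 3, so a flake8 run under Python 3 would report these as errors rather than style warnings. This is the kind of finding the cover letter asks about rather than fixes; a purely illustrative sketch of the Python 3 equivalent follows (not something this patch changes):

    # Sketch only: cmp(a, b) == 0 is simply an equality test in Python 3.
    def is_empty_pdb_name(pdbName):
        # equivalent to: cmp(pdbName, "") == 0
        return pdbName == ""
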
diff --git a/BaseTools/Scripts/UpdateBuildVersions.py b/BaseTools/Scripts/UpdateBuildVersions.py
index 0f019f25cf50..d5174d040b44 100755
--- a/BaseTools/Scripts/UpdateBuildVersions.py
+++ b/BaseTools/Scripts/UpdateBuildVersions.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Update build revisions of the tools when performing a developer build
 #
 # This script will modife the C/Include/Common/BuildVersion.h file and the two
@@ -27,7 +27,8 @@ SYS_ENV_ERR = "ERROR : %s system environment variable must be set prior to runni
 
 __execname__ = "UpdateBuildVersions.py"
 SVN_REVISION = "$LastChangedRevision: 3 $"
-SVN_REVISION = SVN_REVISION.replace("$LastChangedRevision:", "").replace("$", "").strip()
+SVN_REVISION = SVN_REVISION.replace(
+    "$LastChangedRevision:", "").replace("$", "").strip()
 __copyright__ = "Copyright (c) 2014, Intel Corporation. All rights reserved."
 VERSION_NUMBER = "0.7.0"
 __version__ = "Version %s.%s" % (VERSION_NUMBER, SVN_REVISION)
@@ -74,13 +75,16 @@ def ShellCommandResults(CmdLine, Opt):
 
     returnValue = 0
     try:
-        subprocess.check_call(args=shlex.split(CmdLine), stderr=subprocess.STDOUT, stdout=file_list)
+        subprocess.check_call(args=shlex.split(CmdLine),
+                              stderr=subprocess.STDOUT, stdout=file_list)
     except subprocess.CalledProcessError as err_val:
         file_list.close()
         if not Opt.silent:
-            sys.stderr.write("ERROR : %d : %s\n" % (err_val.returncode, err_val.__str__()))
+            sys.stderr.write("ERROR : %d : %s\n" %
+                             (err_val.returncode, err_val.__str__()))
             if os.path.exists(filename):
-                sys.stderr.write("      : Partial results may be in this file: %s\n" % filename)
+                sys.stderr.write(
+                    "      : Partial results may be in this file: %s\n" % filename)
             sys.stderr.flush()
         returnValue = err_val.returncode
 
@@ -91,7 +95,8 @@ def ShellCommandResults(CmdLine, Opt):
             sys.stderr.write("I/O ERROR : %s : %s\n" % (str(errno), strerror))
             sys.stderr.write("ERROR : this command failed : %s\n" % CmdLine)
             if os.path.exists(filename):
-                sys.stderr.write("      : Partial results may be in this file: %s\n" % filename)
+                sys.stderr.write(
+                    "      : Partial results may be in this file: %s\n" % filename)
             sys.stderr.flush()
         returnValue = errno
 
@@ -102,16 +107,19 @@ def ShellCommandResults(CmdLine, Opt):
             sys.stderr.write("OS ERROR : %s : %s\n" % (str(errno), strerror))
             sys.stderr.write("ERROR : this command failed : %s\n" % CmdLine)
             if os.path.exists(filename):
-                sys.stderr.write("      : Partial results may be in this file: %s\n" % filename)
+                sys.stderr.write(
+                    "      : Partial results may be in this file: %s\n" % filename)
             sys.stderr.flush()
         returnValue = errno
 
     except KeyboardInterrupt:
         file_list.close()
         if not Opt.silent:
-            sys.stderr.write("ERROR : Command terminated by user : %s\n" % CmdLine)
+            sys.stderr.write(
+                "ERROR : Command terminated by user : %s\n" % CmdLine)
             if os.path.exists(filename):
-                sys.stderr.write("      : Partial results may be in this file: %s\n" % filename)
+                sys.stderr.write(
+                    "      : Partial results may be in this file: %s\n" % filename)
             sys.stderr.flush()
         returnValue = 1
 
@@ -136,14 +144,15 @@ def ShellCommandResults(CmdLine, Opt):
 def UpdateBuildVersionPython(Rev, UserModified, opts):
     """ This routine will update the BuildVersion.h files in the C source tree """
     for SubDir in ["Common", "UPT"]:
-        PyPath = os.path.join(os.environ['BASE_TOOLS_PATH'], "Source", "Python", SubDir)
+        PyPath = os.path.join(
+            os.environ['BASE_TOOLS_PATH'], "Source", "Python", SubDir)
         BuildVersionPy = os.path.join(PyPath, "BuildVersion.py")
         fd_ = open(os.path.normpath(BuildVersionPy), 'r')
         contents = fd_.readlines()
         fd_.close()
         if opts.HAVE_SVN is False:
             BuildVersionOrig = os.path.join(PyPath, "orig_BuildVersion.py")
-            fd_ = open (BuildVersionOrig, 'w')
+            fd_ = open(BuildVersionOrig, 'w')
             for line in contents:
                 fd_.write(line)
             fd_.flush()
@@ -166,7 +175,8 @@ def UpdateBuildVersionPython(Rev, UserModified, opts):
 
 def UpdateBuildVersionH(Rev, UserModified, opts):
     """ This routine will update the BuildVersion.h files in the C source tree """
-    CPath = os.path.join(os.environ['BASE_TOOLS_PATH'], "Source", "C", "Include", "Common")
+    CPath = os.path.join(
+        os.environ['BASE_TOOLS_PATH'], "Source", "C", "Include", "Common")
     BuildVersionH = os.path.join(CPath, "BuildVersion.h")
     fd_ = open(os.path.normpath(BuildVersionH), 'r')
     contents = fd_.readlines()
@@ -185,7 +195,7 @@ def UpdateBuildVersionH(Rev, UserModified, opts):
             new_line = "#define __BUILD_VERSION \"Developer Build based on Revision: %s\"" % Rev
             if UserModified:
                 new_line = "#define __BUILD_VERSION \"Developer Build based on Revision: %s with Modified Sources\"" % \
-                            Rev
+                    Rev
             new_content.append(new_line)
             continue
         new_content.append(line)
@@ -222,7 +232,8 @@ def RevertCmd(Filename, Opt):
 
     except KeyboardInterrupt:
         if not Opt.silent:
-            sys.stderr.write("ERROR : Command terminated by user : %s\n" % CmdLine)
+            sys.stderr.write(
+                "ERROR : Command terminated by user : %s\n" % CmdLine)
             sys.stderr.flush()
 
     if Opt.verbose:
@@ -236,7 +247,8 @@ def GetSvnRevision(opts):
     Modified = False
 
     if opts.HAVE_SVN is False:
-        sys.stderr.write("WARNING: the svn command-line tool is not available.\n")
+        sys.stderr.write(
+            "WARNING: the svn command-line tool is not available.\n")
         return (Revision, Modified)
 
     SrcPath = os.path.join(os.environ['BASE_TOOLS_PATH'], "Source")
@@ -323,7 +335,8 @@ def CheckOriginals(Opts):
     Returns 0 if this succeeds, or 1 if the copy function fails. It will also return 0 if the orig_BuildVersion.* file
     does not exist.
     """
-    CPath = os.path.join(os.environ['BASE_TOOLS_PATH'], "Source", "C", "Include", "Common")
+    CPath = os.path.join(
+        os.environ['BASE_TOOLS_PATH'], "Source", "C", "Include", "Common")
     BuildVersionH = os.path.join(CPath, "BuildVersion.h")
     OrigBuildVersionH = os.path.join(CPath, "orig_BuildVersion.h")
     if not os.path.exists(OrigBuildVersionH):
@@ -331,7 +344,8 @@ def CheckOriginals(Opts):
     if CopyOrig(OrigBuildVersionH, BuildVersionH, Opts):
         return 1
     for SubDir in ["Common", "UPT"]:
-        PyPath = os.path.join(os.environ['BASE_TOOLS_PATH'], "Source", "Python", SubDir)
+        PyPath = os.path.join(
+            os.environ['BASE_TOOLS_PATH'], "Source", "Python", SubDir)
         BuildVersionPy = os.path.join(PyPath, "BuildVersion.h")
         OrigBuildVersionPy = os.path.join(PyPath, "orig_BuildVersion.h")
         if not os.path.exists(OrigBuildVersionPy):
@@ -351,12 +365,15 @@ def RevertBuildVersionFiles(opts):
             return 1
         return 0
     # SVN is available
-    BuildVersionH = os.path.join(os.environ['BASE_TOOLS_PATH'], "Source", "C", "Include", "Common", "BuildVersion.h")
+    BuildVersionH = os.path.join(
+        os.environ['BASE_TOOLS_PATH'], "Source", "C", "Include", "Common", "BuildVersion.h")
     RevertCmd(BuildVersionH, opts)
     for SubDir in ["Common", "UPT"]:
-        BuildVersionPy = os.path.join(os.environ['BASE_TOOLS_PATH'], "Source", "Python", SubDir, "BuildVersion.py")
+        BuildVersionPy = os.path.join(
+            os.environ['BASE_TOOLS_PATH'], "Source", "Python", SubDir, "BuildVersion.py")
         RevertCmd(BuildVersionPy, opts)
 
+
 def UpdateRevisionFiles():
     """ Main routine that will update the BuildVersion.py and BuildVersion.h files."""
     options = ParseOptions()
@@ -368,10 +385,10 @@ def UpdateRevisionFiles():
         sys.stderr.write(SYS_ENV_ERR % 'BASE_TOOLS_PATH')
         return 1
     if not os.path.exists(os.environ['BASE_TOOLS_PATH']):
-        sys.stderr.write("Unable to locate the %s directory." % os.environ['BASE_TOOLS_PATH'])
+        sys.stderr.write("Unable to locate the %s directory." %
+                         os.environ['BASE_TOOLS_PATH'])
         return 1
 
-
     options.HAVE_SVN = CheckSvn(options)
     if options.TEST_SVN:
         return (not options.HAVE_SVN)
@@ -384,7 +401,8 @@ def UpdateRevisionFiles():
         RevertBuildVersionFiles(options)
         Revision, Modified = GetSvnRevision(options)
         if options.verbose:
-            sys.stdout.write("Revision: %s is Modified: %s\n" % (Revision, Modified))
+            sys.stdout.write("Revision: %s is Modified: %s\n" %
+                             (Revision, Modified))
             sys.stdout.flush()
         UpdateBuildVersionH(Revision, Modified, options)
         UpdateBuildVersionPython(Revision, Modified, options)
@@ -394,5 +412,3 @@ def UpdateRevisionFiles():
 
 if __name__ == "__main__":
     sys.exit(UpdateRevisionFiles())
-
-
diff --git a/BaseTools/Scripts/efi_debugging.py b/BaseTools/Scripts/efi_debugging.py
index 9848cd5968c7..8ea0184097e8 100755
--- a/BaseTools/Scripts/efi_debugging.py
+++ b/BaseTools/Scripts/efi_debugging.py
@@ -1668,11 +1668,11 @@ class EfiDevicePath:
                 node, extra = self._ctype_read_ex(type_str, ptr, hdr.Length)
                 if 'VENDOR_DEVICE_PATH' in type(node).__name__:
                     guid_type = self.guid_override_dict.get(
-                                        GuidNames.to_uuid(node.Guid), None)
+                        GuidNames.to_uuid(node.Guid), None)
                     if guid_type:
                         # use the ctype associated with the GUID
                         node, extra = self._ctype_read_ex(
-                                                guid_type, ptr, hdr.Length)
+                            guid_type, ptr, hdr.Length)
 
                 instance.append((type(node).__name__, hdr.Type,
                                 hdr.SubType, hdr.Length, node, extra))
diff --git a/BaseTools/Scripts/efi_gdb.py b/BaseTools/Scripts/efi_gdb.py
index f3e7fd9d0c28..7cf62067a051 100755
--- a/BaseTools/Scripts/efi_gdb.py
+++ b/BaseTools/Scripts/efi_gdb.py
@@ -852,6 +852,7 @@ class LoadEmulatorEfiSymbols(gdb.Breakpoint):
     Note: make sure SecGdbScriptBreak is not optimized away!
     Also turn off the dlopen() flow like on macOS.
     '''
+
     def stop(self):
         symbols = EfiSymbols()
         # Emulator adds SizeOfHeaders so we need file alignment to search
diff --git a/BaseTools/Source/C/Makefiles/NmakeSubdirs.py b/BaseTools/Source/C/Makefiles/NmakeSubdirs.py
index 1f4a45004f4b..717cb700ac4f 100644
--- a/BaseTools/Source/C/Makefiles/NmakeSubdirs.py
+++ b/BaseTools/Source/C/Makefiles/NmakeSubdirs.py
@@ -20,13 +20,15 @@ import subprocess
 import multiprocessing
 import copy
 import sys
-__prog__        = 'NmakeSubdirs'
-__version__     = '%s Version %s' % (__prog__, '0.10 ')
-__copyright__   = 'Copyright (c) 2018, Intel Corporation. All rights reserved.'
+__prog__ = 'NmakeSubdirs'
+__version__ = '%s Version %s' % (__prog__, '0.10 ')
+__copyright__ = 'Copyright (c) 2018, Intel Corporation. All rights reserved.'
 __description__ = 'Replace for NmakeSubdirs.bat in windows ,support parallel build for nmake.\n'
 
 cpu_count = multiprocessing.cpu_count()
 output_lock = threading.Lock()
+
+
 def RunCommand(WorkDir=None, *Args, **kwargs):
     if WorkDir is None:
         WorkDir = os.curdir
@@ -34,17 +36,21 @@ def RunCommand(WorkDir=None, *Args, **kwargs):
         kwargs["stderr"] = subprocess.STDOUT
     if "stdout" not in kwargs:
         kwargs["stdout"] = subprocess.PIPE
-    p = subprocess.Popen(Args, cwd=WorkDir, stderr=kwargs["stderr"], stdout=kwargs["stdout"])
+    p = subprocess.Popen(
+        Args, cwd=WorkDir, stderr=kwargs["stderr"], stdout=kwargs["stdout"])
     stdout, stderr = p.communicate()
     message = ""
     if stdout is not None:
-        message = stdout.decode(errors='ignore') #for compatibility in python 2 and 3
+        # for compatibility in python 2 and 3
+        message = stdout.decode(errors='ignore')
 
     if p.returncode != 0:
-        raise RuntimeError("Error while execute command \'{0}\' in direcotry {1}\n{2}".format(" ".join(Args), WorkDir, message))
+        raise RuntimeError("Error while execute command \'{0}\' in direcotry {1}\n{2}".format(
+            " ".join(Args), WorkDir, message))
 
     output_lock.acquire(True)
-    print("execute command \"{0}\" in directory {1}".format(" ".join(Args), WorkDir))
+    print("execute command \"{0}\" in directory {1}".format(
+        " ".join(Args), WorkDir))
     try:
         print(message)
     except:
@@ -53,6 +59,7 @@ def RunCommand(WorkDir=None, *Args, **kwargs):
 
     return p.returncode, stdout
 
+
 class TaskUnit(object):
     def __init__(self, func, args, kwargs):
         self.func = func
@@ -71,6 +78,7 @@ class TaskUnit(object):
 
         return "{0}({1})".format(self.func.__name__, ",".join(para))
 
+
 class ThreadControl(object):
 
     def __init__(self, maxthread):
@@ -123,7 +131,8 @@ class ThreadControl(object):
             try:
                 task.run()
             except RuntimeError as e:
-                if self.error: break
+                if self.error:
+                    break
                 self.errorLock.acquire(True)
                 self.error = True
                 self.errorMsg = str(e)
@@ -135,6 +144,7 @@ class ThreadControl(object):
         self.running.remove(threading.currentThread())
         self.runningLock.release()
 
+
 def Run():
     curdir = os.path.abspath(os.curdir)
     if len(args.subdirs) == 1:
@@ -142,25 +152,30 @@ def Run():
     if args.jobs == 1:
         try:
             for dir in args.subdirs:
-                RunCommand(os.path.join(curdir, dir), "nmake", args.target, stdout=sys.stdout, stderr=subprocess.STDOUT)
+                RunCommand(os.path.join(curdir, dir), "nmake", args.target,
+                           stdout=sys.stdout, stderr=subprocess.STDOUT)
         except RuntimeError:
             exit(1)
     else:
         controller = ThreadControl(args.jobs)
         for dir in args.subdirs:
-            controller.addTask(RunCommand, os.path.join(curdir, dir), "nmake", args.target)
+            controller.addTask(RunCommand, os.path.join(
+                curdir, dir), "nmake", args.target)
         controller.startSchedule()
         controller.waitComplete()
         if controller.error:
             exit(1)
 
+
 if __name__ == "__main__":
-    parser = argparse.ArgumentParser(prog=__prog__, description=__description__ + __copyright__, conflict_handler='resolve')
+    parser = argparse.ArgumentParser(
+        prog=__prog__, description=__description__ + __copyright__, conflict_handler='resolve')
 
     parser.add_argument("target", help="the target for nmake")
-    parser.add_argument("subdirs", nargs="+", help="the relative dir path of makefile")
-    parser.add_argument("--jobs", type=int, dest="jobs", default=cpu_count, help="thread number")
+    parser.add_argument("subdirs", nargs="+",
+                        help="the relative dir path of makefile")
+    parser.add_argument("--jobs", type=int, dest="jobs",
+                        default=cpu_count, help="thread number")
     parser.add_argument('--version', action='version', version=__version__)
     args = parser.parse_args()
     Run()
-
diff --git a/BaseTools/Source/C/PyEfiCompressor/setup.py b/BaseTools/Source/C/PyEfiCompressor/setup.py
index 3daf178b5401..506bace9cb7e 100644
--- a/BaseTools/Source/C/PyEfiCompressor/setup.py
+++ b/BaseTools/Source/C/PyEfiCompressor/setup.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # package and install PyEfiCompressor extension
 #
 #  Copyright (c) 2008, Intel Corporation. All rights reserved.<BR>
@@ -23,15 +23,15 @@ setup(
         Extension(
             'EfiCompressor',
             sources=[
-                os.path.join(BaseToolsDir, 'Source', 'C', 'Common', 'Decompress.c'),
+                os.path.join(BaseToolsDir, 'Source', 'C',
+                             'Common', 'Decompress.c'),
                 'EfiCompressor.c'
-                ],
+            ],
             include_dirs=[
                 os.path.join(BaseToolsDir, 'Source', 'C', 'Include'),
                 os.path.join(BaseToolsDir, 'Source', 'C', 'Include', 'Ia32'),
                 os.path.join(BaseToolsDir, 'Source', 'C', 'Common')
-                ],
-            )
-        ],
-  )
-
+            ],
+        )
+    ],
+)
diff --git a/BaseTools/Source/Python/AmlToC/AmlToC.py b/BaseTools/Source/Python/AmlToC/AmlToC.py
index 346de7159de7..80f18c04b049 100644
--- a/BaseTools/Source/Python/AmlToC/AmlToC.py
+++ b/BaseTools/Source/Python/AmlToC/AmlToC.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # Convert an AML file to a .c file containing the AML bytecode stored in a
 # C array.
@@ -24,10 +24,12 @@ Tables\Dsdt.c will contain a C array named "dsdt_aml_code" that contains
 the AML bytecode.
 """
 
-## Parse the command line arguments.
+# Parse the command line arguments.
 #
 # @retval A argparse.NameSpace instance, containing parsed values.
 #
+
+
 def ParseArgs():
     # Initialize the parser.
     Parser = argparse.ArgumentParser(description=__description__)
@@ -50,7 +52,8 @@ def ParseArgs():
         with open(Args.InputFile, "rb") as fIn:
             Signature = str(fIn.read(4))
             if ("DSDT" not in Signature) and ("SSDT" not in Signature):
-                EdkLogger.info("Invalid file type. File does not have a valid DSDT or SSDT signature: {}".format(Args.InputFile))
+                EdkLogger.info(
+                    "Invalid file type. File does not have a valid DSDT or SSDT signature: {}".format(Args.InputFile))
                 return None
 
     # Get the basename of the input file.
@@ -70,7 +73,7 @@ def ParseArgs():
 
     return Args
 
-## Convert an AML file to a .c file containing the AML bytecode stored
+# Convert an AML file to a .c file containing the AML bytecode stored
 #  in a C array.
 #
 # @param  InputFile     Path to the input AML file.
@@ -78,11 +81,13 @@ def ParseArgs():
 # @param  BaseName      Base name of the input file.
 #                       This is also the name of the generated .c file.
 #
+
+
 def AmlToC(InputFile, OutputFile, BaseName):
 
     ArrayName = BaseName.lower() + "_aml_code"
     FileHeader =\
-"""
+        """
 // This file has been generated from:
 //   -Python script: {}
 //   -Input AML file: {}
@@ -91,7 +96,8 @@ def AmlToC(InputFile, OutputFile, BaseName):
 
     with open(InputFile, "rb") as fIn, open(OutputFile, "w") as fOut:
         # Write header.
-        fOut.write(FileHeader.format(os.path.abspath(InputFile), os.path.abspath(__file__)))
+        fOut.write(FileHeader.format(os.path.abspath(
+            InputFile), os.path.abspath(__file__)))
 
         # Write the array and its content.
         fOut.write("unsigned char {}[] = {{\n  ".format(ArrayName))
@@ -105,7 +111,7 @@ def AmlToC(InputFile, OutputFile, BaseName):
             byte = fIn.read(1)
         fOut.write("\n};\n")
 
-## Main method
+# Main method
 #
 # This method:
 #   1-  Initialize an EdkLogger instance.
@@ -116,6 +122,8 @@ def AmlToC(InputFile, OutputFile, BaseName):
 # @retval 0     Success.
 # @retval 1     Error.
 #
+
+
 def Main():
     # Initialize an EdkLogger instance.
     EdkLogger.Initialize()
@@ -128,15 +136,18 @@ def Main():
 
         # Convert an AML file to a .c file containing the AML bytecode stored
         # in a C array.
-        AmlToC(CommandArguments.InputFile, CommandArguments.OutputFile, CommandArguments.BaseName)
+        AmlToC(CommandArguments.InputFile,
+               CommandArguments.OutputFile, CommandArguments.BaseName)
     except Exception as e:
         print(e)
         return 1
 
     return 0
 
+
 if __name__ == '__main__':
     r = Main()
     # 0-127 is a safe return range, and 1 is a standard default error
-    if r < 0 or r > 127: r = 1
+    if r < 0 or r > 127:
+        r = 1
     sys.exit(r)
diff --git a/BaseTools/Source/Python/AutoGen/AutoGen.py b/BaseTools/Source/Python/AutoGen/AutoGen.py
index d9ee699d8f30..2181cf521215 100644
--- a/BaseTools/Source/Python/AutoGen/AutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGen.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Generate AutoGen.h, AutoGen.c and *.depex files
 #
 # Copyright (c) 2007 - 2019, Intel Corporation. All rights reserved.<BR>
@@ -8,20 +8,23 @@
 # SPDX-License-Identifier: BSD-2-Clause-Patent
 #
 
-## Import Modules
+# Import Modules
 #
 from __future__ import print_function
 from __future__ import absolute_import
 from Common.DataType import TAB_STAR
-## Base class for AutoGen
+# Base class for AutoGen
 #
 #   This class just implements the cache mechanism of AutoGen objects.
 #
+
+
 class AutoGen(object):
     # database to maintain the objects in each child class
-    __ObjectCache = {}    # (BuildTarget, ToolChain, ARCH, platform file): AutoGen object
+    # (BuildTarget, ToolChain, ARCH, platform file): AutoGen object
+    __ObjectCache = {}
 
-    ## Factory method
+    # Factory method
     #
     #   @param  Class           class object of real AutoGen class
     #                           (WorkspaceAutoGen, ModuleAutoGen or PlatformAutoGen)
@@ -44,17 +47,17 @@ class AutoGen(object):
         RetVal = cls.__ObjectCache[Key] = super(AutoGen, cls).__new__(cls)
         return RetVal
 
-
-    ## hash() operator
+    # hash() operator
     #
     #  The file path of platform file will be used to represent hash value of this object
     #
     #   @retval int     Hash value of the file path of platform file
     #
+
     def __hash__(self):
         return hash(self.MetaFile)
 
-    ## str() operator
+    # str() operator
     #
     #  The file path of platform file will be used to represent this object
     #
@@ -63,7 +66,7 @@ class AutoGen(object):
     def __str__(self):
         return str(self.MetaFile)
 
-    ## "==" operator
+    # "==" operator
     def __eq__(self, Other):
         return Other and self.MetaFile == Other
 
@@ -71,31 +74,34 @@ class AutoGen(object):
     def Cache(cls):
         return cls.__ObjectCache
 
+
 #
 # The priority list while override build option
 #
-PrioList = {"0x11111"  : 16,     #  TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE (Highest)
-            "0x01111"  : 15,     #  ******_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
-            "0x10111"  : 14,     #  TARGET_*********_ARCH_COMMANDTYPE_ATTRIBUTE
-            "0x00111"  : 13,     #  ******_*********_ARCH_COMMANDTYPE_ATTRIBUTE
-            "0x11011"  : 12,     #  TARGET_TOOLCHAIN_****_COMMANDTYPE_ATTRIBUTE
-            "0x01011"  : 11,     #  ******_TOOLCHAIN_****_COMMANDTYPE_ATTRIBUTE
-            "0x10011"  : 10,     #  TARGET_*********_****_COMMANDTYPE_ATTRIBUTE
-            "0x00011"  : 9,      #  ******_*********_****_COMMANDTYPE_ATTRIBUTE
-            "0x11101"  : 8,      #  TARGET_TOOLCHAIN_ARCH_***********_ATTRIBUTE
-            "0x01101"  : 7,      #  ******_TOOLCHAIN_ARCH_***********_ATTRIBUTE
-            "0x10101"  : 6,      #  TARGET_*********_ARCH_***********_ATTRIBUTE
-            "0x00101"  : 5,      #  ******_*********_ARCH_***********_ATTRIBUTE
-            "0x11001"  : 4,      #  TARGET_TOOLCHAIN_****_***********_ATTRIBUTE
-            "0x01001"  : 3,      #  ******_TOOLCHAIN_****_***********_ATTRIBUTE
-            "0x10001"  : 2,      #  TARGET_*********_****_***********_ATTRIBUTE
-            "0x00001"  : 1}      #  ******_*********_****_***********_ATTRIBUTE (Lowest)
-## Calculate the priority value of the build option
+PrioList = {"0x11111": 16,  # TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE (Highest)
+            "0x01111": 15,  # ******_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
+            "0x10111": 14,  # TARGET_*********_ARCH_COMMANDTYPE_ATTRIBUTE
+            "0x00111": 13,  # ******_*********_ARCH_COMMANDTYPE_ATTRIBUTE
+            "0x11011": 12,  # TARGET_TOOLCHAIN_****_COMMANDTYPE_ATTRIBUTE
+            "0x01011": 11,  # ******_TOOLCHAIN_****_COMMANDTYPE_ATTRIBUTE
+            "0x10011": 10,  # TARGET_*********_****_COMMANDTYPE_ATTRIBUTE
+            "0x00011": 9,  # ******_*********_****_COMMANDTYPE_ATTRIBUTE
+            "0x11101": 8,  # TARGET_TOOLCHAIN_ARCH_***********_ATTRIBUTE
+            "0x01101": 7,  # ******_TOOLCHAIN_ARCH_***********_ATTRIBUTE
+            "0x10101": 6,  # TARGET_*********_ARCH_***********_ATTRIBUTE
+            "0x00101": 5,  # ******_*********_ARCH_***********_ATTRIBUTE
+            "0x11001": 4,  # TARGET_TOOLCHAIN_****_***********_ATTRIBUTE
+            "0x01001": 3,  # ******_TOOLCHAIN_****_***********_ATTRIBUTE
+            "0x10001": 2,  # TARGET_*********_****_***********_ATTRIBUTE
+            "0x00001": 1}  # ******_*********_****_***********_ATTRIBUTE (Lowest)
+# Calculate the priority value of the build option
 #
 # @param    Key    Build option definition contain: TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
 #
 # @retval   Value  Priority value based on the priority list.
 #
+
+
 def CalculatePriorityValue(Key):
     Target, ToolChain, Arch, CommandType, Attr = Key.split('_')
     PriorityValue = 0x11111
diff --git a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
index 0ba2339bed64..3b90571d505f 100755
--- a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Create makefile for MS nmake and GNU make
 #
 # Copyright (c) 2019, Intel Corporation. All rights reserved.<BR>
@@ -9,7 +9,7 @@ import multiprocessing as mp
 import threading
 from Common.Misc import PathClass
 from AutoGen.ModuleAutoGen import ModuleAutoGen
-from AutoGen.ModuleAutoGenHelper import WorkSpaceInfo,AutoGenInfo
+from AutoGen.ModuleAutoGenHelper import WorkSpaceInfo, AutoGenInfo
 import Common.GlobalData as GlobalData
 import Common.EdkLogger as EdkLogger
 import os
@@ -26,6 +26,7 @@ from AutoGen.DataPipe import MemoryDataPipe
 import logging
 import time
 
+
 def clearQ(q):
     try:
         while True:
@@ -33,16 +34,19 @@ def clearQ(q):
     except Empty:
         pass
 
+
 class LogAgent(threading.Thread):
-    def __init__(self,log_q,log_level,log_file=None):
-        super(LogAgent,self).__init__()
+    def __init__(self, log_q, log_level, log_file=None):
+        super(LogAgent, self).__init__()
         self.log_q = log_q
         self.log_level = log_level
         self.log_file = log_file
+
     def InitLogger(self):
         # For DEBUG level (All DEBUG_0~9 are applicable)
         self._DebugLogger_agent = logging.getLogger("tool_debug_agent")
-        _DebugFormatter = logging.Formatter("[%(asctime)s.%(msecs)d]: %(message)s", datefmt="%H:%M:%S")
+        _DebugFormatter = logging.Formatter(
+            "[%(asctime)s.%(msecs)d]: %(message)s", datefmt="%H:%M:%S")
         self._DebugLogger_agent.setLevel(self.log_level)
         _DebugChannel = logging.StreamHandler(sys.stdout)
         _DebugChannel.setFormatter(_DebugFormatter)
@@ -71,7 +75,7 @@ class LogAgent(threading.Thread):
             _Ch.setFormatter(_DebugFormatter)
             self._DebugLogger_agent.addHandler(_Ch)
 
-            _Ch= logging.FileHandler(self.log_file)
+            _Ch = logging.FileHandler(self.log_file)
             _Ch.setFormatter(_InfoFormatter)
             self._InfoLogger_agent.addHandler(_Ch)
 
@@ -86,23 +90,30 @@ class LogAgent(threading.Thread):
             if log_message is None:
                 break
             if log_message.name == "tool_error":
-                self._ErrorLogger_agent.log(log_message.levelno,log_message.getMessage())
+                self._ErrorLogger_agent.log(
+                    log_message.levelno, log_message.getMessage())
             elif log_message.name == "tool_info":
-                self._InfoLogger_agent.log(log_message.levelno,log_message.getMessage())
+                self._InfoLogger_agent.log(
+                    log_message.levelno, log_message.getMessage())
             elif log_message.name == "tool_debug":
-                self._DebugLogger_agent.log(log_message.levelno,log_message.getMessage())
+                self._DebugLogger_agent.log(
+                    log_message.levelno, log_message.getMessage())
             else:
-                self._InfoLogger_agent.log(log_message.levelno,log_message.getMessage())
+                self._InfoLogger_agent.log(
+                    log_message.levelno, log_message.getMessage())
 
     def kill(self):
         self.log_q.put(None)
+
+
 class AutoGenManager(threading.Thread):
-    def __init__(self,autogen_workers, feedback_q,error_event):
-        super(AutoGenManager,self).__init__()
+    def __init__(self, autogen_workers, feedback_q, error_event):
+        super(AutoGenManager, self).__init__()
         self.autogen_workers = autogen_workers
         self.feedback_q = feedback_q
         self.Status = True
         self.error_event = error_event
+
     def run(self):
         try:
             fin_num = 0
@@ -113,10 +124,12 @@ class AutoGenManager(threading.Thread):
                 if badnews == "Done":
                     fin_num += 1
                 elif badnews == "QueueEmpty":
-                    EdkLogger.debug(EdkLogger.DEBUG_9, "Worker %s: %s" % (os.getpid(), badnews))
+                    EdkLogger.debug(EdkLogger.DEBUG_9,
+                                    "Worker %s: %s" % (os.getpid(), badnews))
                     self.TerminateWorkers()
                 else:
-                    EdkLogger.debug(EdkLogger.DEBUG_9, "Worker %s: %s" % (os.getpid(), badnews))
+                    EdkLogger.debug(EdkLogger.DEBUG_9,
+                                    "Worker %s: %s" % (os.getpid(), badnews))
                     self.Status = False
                     self.TerminateWorkers()
                 if fin_num == len(self.autogen_workers):
@@ -143,20 +156,23 @@ class AutoGenManager(threading.Thread):
                     cache_num += 1
                 else:
                     GlobalData.gModuleAllCacheStatus.add(item)
-                if cache_num  == len(self.autogen_workers):
+                if cache_num == len(self.autogen_workers):
                     break
         except:
-            print ("cache_q error")
+            print("cache_q error")
 
     def TerminateWorkers(self):
         self.error_event.set()
+
     def kill(self):
         self.feedback_q.put(None)
+
+
 class AutoGenWorkerInProcess(mp.Process):
-    def __init__(self,module_queue,data_pipe_file_path,feedback_q,file_lock,cache_q,log_q,error_event):
+    def __init__(self, module_queue, data_pipe_file_path, feedback_q, file_lock, cache_q, log_q, error_event):
         mp.Process.__init__(self)
         self.module_queue = module_queue
-        self.data_pipe_file_path =data_pipe_file_path
+        self.data_pipe_file_path = data_pipe_file_path
         self.data_pipe = None
         self.feedback_q = feedback_q
         self.PlatformMetaFileSet = {}
@@ -164,12 +180,14 @@ class AutoGenWorkerInProcess(mp.Process):
         self.cache_q = cache_q
         self.log_q = log_q
         self.error_event = error_event
-    def GetPlatformMetaFile(self,filepath,root):
+
+    def GetPlatformMetaFile(self, filepath, root):
         try:
-            return self.PlatformMetaFileSet[(filepath,root)]
+            return self.PlatformMetaFileSet[(filepath, root)]
         except:
-            self.PlatformMetaFileSet[(filepath,root)]  = filepath
-            return self.PlatformMetaFileSet[(filepath,root)]
+            self.PlatformMetaFileSet[(filepath, root)] = filepath
+            return self.PlatformMetaFileSet[(filepath, root)]
+
     def run(self):
         try:
             taskname = "Init"
@@ -178,7 +196,8 @@ class AutoGenWorkerInProcess(mp.Process):
                     self.data_pipe = MemoryDataPipe()
                     self.data_pipe.load(self.data_pipe_file_path)
                 except:
-                    self.feedback_q.put(taskname + ":" + "load data pipe %s failed." % self.data_pipe_file_path)
+                    self.feedback_q.put(
+                        taskname + ":" + "load data pipe %s failed." % self.data_pipe_file_path)
             EdkLogger.LogClientInitialize(self.log_q)
             loglevel = self.data_pipe.Get("LogLevel")
             if not loglevel:
@@ -193,12 +212,13 @@ class AutoGenWorkerInProcess(mp.Process):
             PackagesPath = os.getenv("PACKAGES_PATH")
             mws.setWs(workspacedir, PackagesPath)
             self.Wa = WorkSpaceInfo(
-                workspacedir,active_p,target,toolchain,archlist
-                )
+                workspacedir, active_p, target, toolchain, archlist
+            )
             self.Wa._SrcTimeStamp = self.data_pipe.Get("Workspace_timestamp")
             GlobalData.gGlobalDefines = self.data_pipe.Get("G_defines")
             GlobalData.gCommandLineDefines = self.data_pipe.Get("CL_defines")
-            GlobalData.gCommandMaxLength = self.data_pipe.Get('gCommandMaxLength')
+            GlobalData.gCommandMaxLength = self.data_pipe.Get(
+                'gCommandMaxLength')
             os.environ._data = self.data_pipe.Get("Env_Var")
             GlobalData.gWorkspace = workspacedir
             GlobalData.gDisableIncludePathCheck = False
@@ -208,23 +228,26 @@ class AutoGenWorkerInProcess(mp.Process):
             GlobalData.gUseHashCache = self.data_pipe.Get("UseHashCache")
             GlobalData.gBinCacheSource = self.data_pipe.Get("BinCacheSource")
             GlobalData.gBinCacheDest = self.data_pipe.Get("BinCacheDest")
-            GlobalData.gPlatformHashFile = self.data_pipe.Get("PlatformHashFile")
+            GlobalData.gPlatformHashFile = self.data_pipe.Get(
+                "PlatformHashFile")
             GlobalData.gModulePreMakeCacheStatus = dict()
             GlobalData.gModuleMakeCacheStatus = dict()
             GlobalData.gHashChainStatus = dict()
             GlobalData.gCMakeHashFile = dict()
             GlobalData.gModuleHashFile = dict()
             GlobalData.gFileHashDict = dict()
-            GlobalData.gEnableGenfdsMultiThread = self.data_pipe.Get("EnableGenfdsMultiThread")
-            GlobalData.gPlatformFinalPcds = self.data_pipe.Get("gPlatformFinalPcds")
+            GlobalData.gEnableGenfdsMultiThread = self.data_pipe.Get(
+                "EnableGenfdsMultiThread")
+            GlobalData.gPlatformFinalPcds = self.data_pipe.Get(
+                "gPlatformFinalPcds")
             GlobalData.file_lock = self.file_lock
             CommandTarget = self.data_pipe.Get("CommandTarget")
             pcd_from_build_option = []
             for pcd_tuple in self.data_pipe.Get("BuildOptPcd"):
-                pcd_id = ".".join((pcd_tuple[0],pcd_tuple[1]))
+                pcd_id = ".".join((pcd_tuple[0], pcd_tuple[1]))
                 if pcd_tuple[2].strip():
-                    pcd_id = ".".join((pcd_id,pcd_tuple[2]))
-                pcd_from_build_option.append("=".join((pcd_id,pcd_tuple[3])))
+                    pcd_id = ".".join((pcd_id, pcd_tuple[2]))
+                pcd_from_build_option.append("=".join((pcd_id, pcd_tuple[3])))
             GlobalData.BuildOptionPcd = pcd_from_build_option
             module_count = 0
             FfsCmd = self.data_pipe.Get("FfsCommand")
@@ -232,36 +255,40 @@ class AutoGenWorkerInProcess(mp.Process):
                 FfsCmd = {}
             GlobalData.FfsCmd = FfsCmd
             PlatformMetaFile = self.GetPlatformMetaFile(self.data_pipe.Get("P_Info").get("ActivePlatform"),
-                                             self.data_pipe.Get("P_Info").get("WorkspaceDir"))
+                                                        self.data_pipe.Get("P_Info").get("WorkspaceDir"))
             while True:
                 if self.error_event.is_set():
                     break
                 module_count += 1
                 try:
-                    module_file,module_root,module_path,module_basename,module_originalpath,module_arch,IsLib = self.module_queue.get_nowait()
+                    module_file, module_root, module_path, module_basename, module_originalpath, module_arch, IsLib = self.module_queue.get_nowait()
                 except Empty:
-                    EdkLogger.debug(EdkLogger.DEBUG_9, "Worker %s: %s" % (os.getpid(), "Fake Empty."))
+                    EdkLogger.debug(EdkLogger.DEBUG_9, "Worker %s: %s" % (
+                        os.getpid(), "Fake Empty."))
                     time.sleep(0.01)
                     continue
                 if module_file is None:
-                    EdkLogger.debug(EdkLogger.DEBUG_9, "Worker %s: %s" % (os.getpid(), "Worker get the last item in the queue."))
+                    EdkLogger.debug(EdkLogger.DEBUG_9, "Worker %s: %s" % (
+                        os.getpid(), "Worker get the last item in the queue."))
                     self.feedback_q.put("QueueEmpty")
                     time.sleep(0.01)
                     continue
 
-                modulefullpath = os.path.join(module_root,module_file)
-                taskname = " : ".join((modulefullpath,module_arch))
-                module_metafile = PathClass(module_file,module_root)
+                modulefullpath = os.path.join(module_root, module_file)
+                taskname = " : ".join((modulefullpath, module_arch))
+                module_metafile = PathClass(module_file, module_root)
                 if module_path:
                     module_metafile.Path = module_path
                 if module_basename:
                     module_metafile.BaseName = module_basename
                 if module_originalpath:
-                    module_metafile.OriginalPath = PathClass(module_originalpath,module_root)
+                    module_metafile.OriginalPath = PathClass(
+                        module_originalpath, module_root)
                 arch = module_arch
                 target = self.data_pipe.Get("P_Info").get("Target")
                 toolchain = self.data_pipe.Get("P_Info").get("ToolChain")
-                Ma = ModuleAutoGen(self.Wa,module_metafile,target,toolchain,arch,PlatformMetaFile,self.data_pipe)
+                Ma = ModuleAutoGen(self.Wa, module_metafile, target,
+                                   toolchain, arch, PlatformMetaFile, self.data_pipe)
                 Ma.IsLibrary = IsLib
                 # SourceFileList calling sequence impact the makefile string sequence.
                 # Create cached SourceFileList here to unify its calling sequence for both
@@ -275,13 +302,16 @@ class AutoGenWorkerInProcess(mp.Process):
                         self.feedback_q.put(taskname)
 
                     if CacheResult:
-                        self.cache_q.put((Ma.MetaFile.Path, Ma.Arch, "PreMakeCache", True))
+                        self.cache_q.put(
+                            (Ma.MetaFile.Path, Ma.Arch, "PreMakeCache", True))
                         continue
                     else:
-                        self.cache_q.put((Ma.MetaFile.Path, Ma.Arch, "PreMakeCache", False))
+                        self.cache_q.put(
+                            (Ma.MetaFile.Path, Ma.Arch, "PreMakeCache", False))
 
                 Ma.CreateCodeFile(False)
-                Ma.CreateMakeFile(False,GenFfsList=FfsCmd.get((Ma.MetaFile.Path, Ma.Arch),[]))
+                Ma.CreateMakeFile(False, GenFfsList=FfsCmd.get(
+                    (Ma.MetaFile.Path, Ma.Arch), []))
                 Ma.CreateAsBuiltInf()
                 if GlobalData.gBinCacheSource and CommandTarget in [None, "", "all"]:
                     try:
@@ -291,22 +321,28 @@ class AutoGenWorkerInProcess(mp.Process):
                         self.feedback_q.put(taskname)
 
                     if CacheResult:
-                        self.cache_q.put((Ma.MetaFile.Path, Ma.Arch, "MakeCache", True))
+                        self.cache_q.put(
+                            (Ma.MetaFile.Path, Ma.Arch, "MakeCache", True))
                         continue
                     else:
-                        self.cache_q.put((Ma.MetaFile.Path, Ma.Arch, "MakeCache", False))
+                        self.cache_q.put(
+                            (Ma.MetaFile.Path, Ma.Arch, "MakeCache", False))
 
         except Exception as e:
-            EdkLogger.debug(EdkLogger.DEBUG_9, "Worker %s: %s" % (os.getpid(), str(e)))
+            EdkLogger.debug(EdkLogger.DEBUG_9, "Worker %s: %s" %
+                            (os.getpid(), str(e)))
             self.feedback_q.put(taskname)
         finally:
-            EdkLogger.debug(EdkLogger.DEBUG_9, "Worker %s: %s" % (os.getpid(), "Done"))
+            EdkLogger.debug(EdkLogger.DEBUG_9, "Worker %s: %s" %
+                            (os.getpid(), "Done"))
             self.feedback_q.put("Done")
             self.cache_q.put("CacheDone")
 
     def printStatus(self):
-        print("Processs ID: %d Run %d modules in AutoGen " % (os.getpid(),len(AutoGen.Cache())))
-        print("Processs ID: %d Run %d modules in AutoGenInfo " % (os.getpid(),len(AutoGenInfo.GetCache())))
+        print("Processs ID: %d Run %d modules in AutoGen " %
+              (os.getpid(), len(AutoGen.Cache())))
+        print("Processs ID: %d Run %d modules in AutoGenInfo " %
+              (os.getpid(), len(AutoGenInfo.GetCache())))
         groupobj = {}
         for buildobj in BuildDB.BuildObject.GetCache().values():
             if str(buildobj).lower().endswith("dec"):
@@ -326,6 +362,9 @@ class AutoGenWorkerInProcess(mp.Process):
                 except:
                     groupobj['inf'] = [str(buildobj)]
 
-        print("Processs ID: %d Run %d pkg in WDB " % (os.getpid(),len(groupobj.get("dec",[]))))
-        print("Processs ID: %d Run %d pla in WDB " % (os.getpid(),len(groupobj.get("dsc",[]))))
-        print("Processs ID: %d Run %d inf in WDB " % (os.getpid(),len(groupobj.get("inf",[]))))
+        print("Processs ID: %d Run %d pkg in WDB " %
+              (os.getpid(), len(groupobj.get("dec", []))))
+        print("Processs ID: %d Run %d pla in WDB " %
+              (os.getpid(), len(groupobj.get("dsc", []))))
+        print("Processs ID: %d Run %d inf in WDB " %
+              (os.getpid(), len(groupobj.get("inf", []))))
diff --git a/BaseTools/Source/Python/AutoGen/BuildEngine.py b/BaseTools/Source/Python/AutoGen/BuildEngine.py
index 752a1a1f6a86..7f810dce97de 100644
--- a/BaseTools/Source/Python/AutoGen/BuildEngine.py
+++ b/BaseTools/Source/Python/AutoGen/BuildEngine.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # The engine for building files
 #
 # Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -9,6 +9,7 @@
 # Import Modules
 #
 from __future__ import print_function
+import Common.EdkLogger as EdkLogger
 import Common.LongFilePathOs as os
 import re
 import copy
@@ -24,26 +25,29 @@ from Common.TargetTxtClassObject import TargetTxtDict
 gDefaultBuildRuleFile = 'build_rule.txt'
 AutoGenReqBuildRuleVerNum = '0.1'
 
-import Common.EdkLogger as EdkLogger
 
-## Convert file type to file list macro name
+# Convert file type to file list macro name
 #
 #   @param      FileType    The name of file type
 #
 #   @retval     string      The name of macro
 #
+
 def FileListMacro(FileType):
     return "%sS" % FileType.replace("-", "_").upper()
 
-## Convert file type to list file macro name
+# Convert file type to list file macro name
 #
 #   @param      FileType    The name of file type
 #
 #   @retval     string      The name of macro
 #
+
+
 def ListFileMacro(FileType):
     return "%s_LIST" % FileListMacro(FileType)
 
+
 class TargetDescBlock(object):
     def __init__(self, Inputs, Outputs, Commands, Dependencies):
         self.InitWorker(Inputs, Outputs, Commands, Dependencies)
@@ -77,17 +81,19 @@ class TargetDescBlock(object):
     def IsMultipleInput(self):
         return len(self.Inputs) > 1
 
-## Class for one build rule
+# Class for one build rule
 #
 # This represents a build rule which can give out corresponding command list for
 # building the given source file(s). The result can be used for generating the
 # target for makefile.
 #
+
+
 class FileBuildRule:
     INC_LIST_MACRO = "INC_LIST"
     INC_MACRO = "INC"
 
-    ## constructor
+    # constructor
     #
     #   @param  Input       The dictionary representing input file(s) for a rule
     #   @param  Output      The list representing output file(s) for a rule
@@ -165,28 +171,30 @@ class FileBuildRule:
         # All build targets generated by this rule for a module
         self.BuildTargets = {}
 
-    ## str() function support
+    # str() function support
     #
     #   @retval     string
     #
     def __str__(self):
         SourceString = ""
-        SourceString += " %s %s %s" % (self.SourceFileType, " ".join(self.SourceFileExtList), self.ExtraSourceFileList)
+        SourceString += " %s %s %s" % (self.SourceFileType, " ".join(
+            self.SourceFileExtList), self.ExtraSourceFileList)
         DestString = ", ".join([str(i) for i in self.DestFileList])
         CommandString = "\n\t".join(self.CommandList)
         return "%s : %s\n\t%s" % (DestString, SourceString, CommandString)
 
-    def Instantiate(self, Macros = None):
+    def Instantiate(self, Macros=None):
         if Macros is None:
             Macros = {}
         NewRuleObject = copy.copy(self)
         NewRuleObject.BuildTargets = {}
         NewRuleObject.DestFileList = []
         for File in self.DestFileList:
-            NewRuleObject.DestFileList.append(PathClass(NormPath(File, Macros)))
+            NewRuleObject.DestFileList.append(
+                PathClass(NormPath(File, Macros)))
         return NewRuleObject
 
-    ## Apply the rule to given source file(s)
+    # Apply the rule to given source file(s)
     #
     #   @param  SourceFile      One file or a list of files to be built
     #   @param  RelativeToDir   The relative path of the source file
@@ -233,24 +241,26 @@ class FileBuildRule:
 
         BuildRulePlaceholderDict = {
             # source file
-            "src"       :   SrcFile,
-            "s_path"    :   SrcPath,
-            "s_dir"     :   SrcFileDir,
-            "s_name"    :   SrcFileName,
-            "s_base"    :   SrcFileBase,
-            "s_ext"     :   SrcFileExt,
+            "src":   SrcFile,
+            "s_path":   SrcPath,
+            "s_dir":   SrcFileDir,
+            "s_name":   SrcFileName,
+            "s_base":   SrcFileBase,
+            "s_ext":   SrcFileExt,
             # destination file
-            "dst"       :   DestFile,
-            "d_path"    :   DestPath,
-            "d_name"    :   DestFileName,
-            "d_base"    :   DestFileBase,
-            "d_ext"     :   DestFileExt,
+            "dst":   DestFile,
+            "d_path":   DestPath,
+            "d_name":   DestFileName,
+            "d_base":   DestFileBase,
+            "d_ext":   DestFileExt,
         }
 
         DstFile = []
         for File in self.DestFileList:
-            File = string.Template(str(File)).safe_substitute(BuildRulePlaceholderDict)
-            File = string.Template(str(File)).safe_substitute(BuildRulePlaceholderDict)
+            File = string.Template(str(File)).safe_substitute(
+                BuildRulePlaceholderDict)
+            File = string.Template(str(File)).safe_substitute(
+                BuildRulePlaceholderDict)
             DstFile.append(PathClass(File, IsBinary=True))
 
         if DstFile[0] in self.BuildTargets:
@@ -262,14 +272,17 @@ class FileBuildRule:
                         #
                         # Command line should be regenerated since some macros are different
                         #
-                        CommandList = self._BuildCommand(BuildRulePlaceholderDict)
-                        TargetDesc.InitWorker([SourceFile], DstFile, CommandList, self.ExtraSourceFileList)
+                        CommandList = self._BuildCommand(
+                            BuildRulePlaceholderDict)
+                        TargetDesc.InitWorker(
+                            [SourceFile], DstFile, CommandList, self.ExtraSourceFileList)
                         break
             else:
                 TargetDesc.AddInput(SourceFile)
         else:
             CommandList = self._BuildCommand(BuildRulePlaceholderDict)
-            TargetDesc = TargetDescBlock([SourceFile], DstFile, CommandList, self.ExtraSourceFileList)
+            TargetDesc = TargetDescBlock(
+                [SourceFile], DstFile, CommandList, self.ExtraSourceFileList)
             TargetDesc.ListFileMacro = self.ListFileMacro
             TargetDesc.FileListMacro = self.FileListMacro
             TargetDesc.IncListFileMacro = self.IncListFileMacro
@@ -282,16 +295,20 @@ class FileBuildRule:
     def _BuildCommand(self, Macros):
         CommandList = []
         for CommandString in self.CommandList:
-            CommandString = string.Template(CommandString).safe_substitute(Macros)
-            CommandString = string.Template(CommandString).safe_substitute(Macros)
+            CommandString = string.Template(
+                CommandString).safe_substitute(Macros)
+            CommandString = string.Template(
+                CommandString).safe_substitute(Macros)
             CommandList.append(CommandString)
         return CommandList
 
-## Class for build rules
+# Class for build rules
 #
 # BuildRule class parses rules defined in a file or passed by caller, and converts
 # the rule into FileBuildRule object.
 #
+
+
 class BuildRule:
     _SectionHeader = "SECTIONHEADER"
     _Section = "SECTION"
@@ -310,7 +327,7 @@ class BuildRule:
     _BinaryFileRule = FileBuildRule(TAB_DEFAULT_BINARY_FILE, [], [os.path.join("$(OUTPUT_DIR)", "${s_name}")],
                                     ["$(CP) ${src} ${dst}"], [])
 
-    ## Constructor
+    # Constructor
     #
     #   @param  File                The file containing build rules in a well defined format
     #   @param  Content             The string list of build rules in a well defined format
@@ -328,42 +345,48 @@ class BuildRule:
         elif Content is not None:
             self.RuleContent = Content
         else:
-            EdkLogger.error("build", PARAMETER_MISSING, ExtraData="No rule file or string given")
+            EdkLogger.error("build", PARAMETER_MISSING,
+                            ExtraData="No rule file or string given")
 
         self.SupportedToolChainFamilyList = SupportedFamily
-        self.RuleDatabase = tdict(True, 4)  # {FileExt, ModuleType, Arch, Family : FileBuildRule object}
+        # {FileExt, ModuleType, Arch, Family : FileBuildRule object}
+        self.RuleDatabase = tdict(True, 4)
         self.Ext2FileType = {}  # {ext : file-type}
         self.FileTypeList = set()
 
         self._LineIndex = LineIndex
         self._State = ""
-        self._RuleInfo = tdict(True, 2)     # {toolchain family : {"InputFile": {}, "OutputFile" : [], "Command" : []}}
+        # {toolchain family : {"InputFile": {}, "OutputFile" : [], "Command" : []}}
+        self._RuleInfo = tdict(True, 2)
         self._FileType = ''
         self._BuildTypeList = set()
         self._ArchList = set()
         self._FamilyList = []
         self._TotalToolChainFamilySet = set()
-        self._RuleObjectList = [] # FileBuildRule object list
+        self._RuleObjectList = []  # FileBuildRule object list
         self._FileVersion = ""
 
         self.Parse()
 
         # some intrinsic rules
-        self.RuleDatabase[TAB_DEFAULT_BINARY_FILE, TAB_COMMON, TAB_COMMON, TAB_COMMON] = self._BinaryFileRule
+        self.RuleDatabase[TAB_DEFAULT_BINARY_FILE, TAB_COMMON,
+                          TAB_COMMON, TAB_COMMON] = self._BinaryFileRule
         self.FileTypeList.add(TAB_DEFAULT_BINARY_FILE)
 
-    ## Parse the build rule strings
+    # Parse the build rule strings
     def Parse(self):
         self._State = self._Section
         for Index in range(self._LineIndex, len(self.RuleContent)):
             # Clean up the line and replace path separator with native one
-            Line = self.RuleContent[Index].strip().replace(self._PATH_SEP, os.path.sep)
+            Line = self.RuleContent[Index].strip().replace(
+                self._PATH_SEP, os.path.sep)
             self.RuleContent[Index] = Line
 
             # find the build_rule_version
             if Line and Line[0] == "#" and Line.find(TAB_BUILD_RULE_VERSION) != -1:
                 if Line.find("=") != -1 and Line.find("=") < (len(Line) - 1) and (Line[(Line.find("=") + 1):]).split():
-                    self._FileVersion = (Line[(Line.find("=") + 1):]).split()[0]
+                    self._FileVersion = (
+                        Line[(Line.find("=") + 1):]).split()[0]
             # skip empty or comment line
             if Line == "" or Line[0] == "#":
                 continue
@@ -383,14 +406,14 @@ class BuildRule:
         # merge last section information into rule database
         self.EndOfSection()
 
-    ## Parse definitions under a section
+    # Parse definitions under a section
     #
     #   @param  LineIndex   The line index of build rule text
     #
     def ParseSection(self, LineIndex):
         pass
 
-    ## Parse definitions under a subsection
+    # Parse definitions under a subsection
     #
     #   @param  LineIndex   The line index of build rule text
     #
@@ -398,14 +421,14 @@ class BuildRule:
         # currently nothing here
         pass
 
-    ## Placeholder for not supported sections
+    # Placeholder for not supported sections
     #
     #   @param  LineIndex   The line index of build rule text
     #
     def SkipSection(self, LineIndex):
         pass
 
-    ## Merge section information just got into rule database
+    # Merge section information just got into rule database
     def EndOfSection(self):
         Database = self.RuleDatabase
         # if there's specific toolchain family, 'COMMON' doesn't make sense any more
@@ -417,14 +440,16 @@ class BuildRule:
             Command = self._RuleInfo[Family, self._Command]
             ExtraDependency = self._RuleInfo[Family, self._ExtraDependency]
 
-            BuildRule = FileBuildRule(self._FileType, Input, Output, Command, ExtraDependency)
+            BuildRule = FileBuildRule(
+                self._FileType, Input, Output, Command, ExtraDependency)
             for BuildType in self._BuildTypeList:
                 for Arch in self._ArchList:
-                    Database[self._FileType, BuildType, Arch, Family] = BuildRule
+                    Database[self._FileType, BuildType,
+                             Arch, Family] = BuildRule
                     for FileExt in BuildRule.SourceFileExtList:
                         self.Ext2FileType[FileExt] = self._FileType
 
-    ## Parse section header
+    # Parse section header
     #
     #   @param  LineIndex   The line index of build rule text
     #
@@ -439,7 +464,8 @@ class BuildRule:
         for RuleName in RuleNameList:
             Arch = TAB_COMMON
             BuildType = TAB_COMMON
-            TokenList = [Token.strip().upper() for Token in RuleName.split('.')]
+            TokenList = [Token.strip().upper()
+                         for Token in RuleName.split('.')]
             # old format: Build.File-Type
             if TokenList[0] == "BUILD":
                 if len(TokenList) == 1:
@@ -486,7 +512,7 @@ class BuildRule:
         self._State = self._Section
         self.FileTypeList.add(FileType)
 
-    ## Parse sub-section header
+    # Parse sub-section header
     #
     #   @param  LineIndex   The line index of build rule text
     #
@@ -529,8 +555,10 @@ class BuildRule:
     #
     #   @param  LineIndex   The line index of build rule text
     #
+
     def ParseInputFileSubSection(self, LineIndex):
-        FileList = [File.strip() for File in self.RuleContent[LineIndex].split(",")]
+        FileList = [File.strip()
+                    for File in self.RuleContent[LineIndex].split(",")]
         for ToolChainFamily in self._FamilyList:
             if self._RuleInfo[ToolChainFamily, self._State] is None:
                 self._RuleInfo[ToolChainFamily, self._State] = []
@@ -546,9 +574,10 @@ class BuildRule:
         for ToolChainFamily in self._FamilyList:
             if self._RuleInfo[ToolChainFamily, self._State] is None:
                 self._RuleInfo[ToolChainFamily, self._State] = []
-            self._RuleInfo[ToolChainFamily, self._State].append(self.RuleContent[LineIndex])
+            self._RuleInfo[ToolChainFamily, self._State].append(
+                self.RuleContent[LineIndex])
 
-    ## Get a build rule via [] operator
+    # Get a build rule via [] operator
     #
     #   @param  FileExt             The extension of a file
     #   @param  ToolChainFamily     The tool chain family name
@@ -577,17 +606,18 @@ class BuildRule:
         return self.RuleDatabase[Key]
 
     _StateHandler = {
-        _SectionHeader     : ParseSectionHeader,
-        _Section           : ParseSection,
-        _SubSectionHeader  : ParseSubSectionHeader,
-        _SubSection        : ParseSubSection,
-        _InputFile         : ParseInputFileSubSection,
-        _OutputFile        : ParseCommonSubSection,
-        _ExtraDependency   : ParseCommonSubSection,
-        _Command           : ParseCommonSubSection,
-        _UnknownSection    : SkipSection,
+        _SectionHeader: ParseSectionHeader,
+        _Section: ParseSection,
+        _SubSectionHeader: ParseSubSectionHeader,
+        _SubSection: ParseSubSection,
+        _InputFile: ParseInputFileSubSection,
+        _OutputFile: ParseCommonSubSection,
+        _ExtraDependency: ParseCommonSubSection,
+        _Command: ParseCommonSubSection,
+        _UnknownSection: SkipSection,
     }
 
+
 class ToolBuildRule():
 
     def __new__(cls, *args, **kw):
@@ -618,13 +648,14 @@ class ToolBuildRule():
         if RetVal._FileVersion == "":
             RetVal._FileVersion = AutoGenReqBuildRuleVerNum
         else:
-            if RetVal._FileVersion < AutoGenReqBuildRuleVerNum :
+            if RetVal._FileVersion < AutoGenReqBuildRuleVerNum:
                 # If Build Rule's version is less than the version number required by the tools, halting the build.
                 EdkLogger.error("build", AUTOGEN_ERROR,
-                                ExtraData="The version number [%s] of build_rule.txt is less than the version number required by the AutoGen.(the minimum required version number is [%s])"\
-                                 % (RetVal._FileVersion, AutoGenReqBuildRuleVerNum))
+                                ExtraData="The version number [%s] of build_rule.txt is less than the version number required by the AutoGen.(the minimum required version number is [%s])"
+                                % (RetVal._FileVersion, AutoGenReqBuildRuleVerNum))
         self._ToolBuildRule = RetVal
 
+
 # This acts like the main() function for the script, unless it is 'import'ed into another
 # script.
 if __name__ == '__main__':
@@ -647,4 +678,3 @@ if __name__ == '__main__':
         print(str(Br[".s", SUP_MODULE_SEC, "IPF", "COMMON"][1]))
         print()
         print(str(Br[".s", SUP_MODULE_SEC][1]))
-
diff --git a/BaseTools/Source/Python/AutoGen/DataPipe.py b/BaseTools/Source/Python/AutoGen/DataPipe.py
index 848c7a82963e..5ae948f5b5ab 100755
--- a/BaseTools/Source/Python/AutoGen/DataPipe.py
+++ b/BaseTools/Source/Python/AutoGen/DataPipe.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Create makefile for MS nmake and GNU make
 #
 # Copyright (c) 2019, Intel Corporation. All rights reserved.<BR>
@@ -13,10 +13,11 @@ import pickle
 from pickle import HIGHEST_PROTOCOL
 from Common import EdkLogger
 
+
 class PCD_DATA():
-    def __init__(self,TokenCName,TokenSpaceGuidCName,Type,DatumType,SkuInfoList,DefaultValue,
-                 MaxDatumSize,UserDefinedDefaultStoresFlag,validateranges,
-                 validlists,expressions,CustomAttribute,TokenValue):
+    def __init__(self, TokenCName, TokenSpaceGuidCName, Type, DatumType, SkuInfoList, DefaultValue,
+                 MaxDatumSize, UserDefinedDefaultStoresFlag, validateranges,
+                 validlists, expressions, CustomAttribute, TokenValue):
         self.TokenCName = TokenCName
         self.TokenSpaceGuidCName = TokenSpaceGuidCName
         self.Type = Type
@@ -31,143 +32,156 @@ class PCD_DATA():
         self.CustomAttribute = CustomAttribute
         self.TokenValue = TokenValue
 
+
 class DataPipe(object):
     def __init__(self, BuildDir=None):
         self.data_container = {}
         self.BuildDir = BuildDir
         self.dump_file = ""
 
+
 class MemoryDataPipe(DataPipe):
 
-    def Get(self,key):
+    def Get(self, key):
         return self.data_container.get(key)
 
-    def dump(self,file_path):
+    def dump(self, file_path):
         self.dump_file = file_path
-        with open(file_path,'wb') as fd:
-            pickle.dump(self.data_container,fd,pickle.HIGHEST_PROTOCOL)
+        with open(file_path, 'wb') as fd:
+            pickle.dump(self.data_container, fd, pickle.HIGHEST_PROTOCOL)
 
-    def load(self,file_path):
-        with open(file_path,'rb') as fd:
+    def load(self, file_path):
+        with open(file_path, 'rb') as fd:
             self.data_container = pickle.load(fd)
 
     @property
     def DataContainer(self):
         return self.data_container
+
     @DataContainer.setter
-    def DataContainer(self,data):
+    def DataContainer(self, data):
         self.data_container.update(data)
 
-    def FillData(self,PlatformInfo):
-        #Platform Pcds
+    def FillData(self, PlatformInfo):
+        # Platform Pcds
         self.DataContainer = {
-            "PLA_PCD" : [PCD_DATA(
-            pcd.TokenCName,pcd.TokenSpaceGuidCName,pcd.Type,
-            pcd.DatumType,pcd.SkuInfoList,pcd.DefaultValue,
-            pcd.MaxDatumSize,pcd.UserDefinedDefaultStoresFlag,pcd.validateranges,
-                 pcd.validlists,pcd.expressions,pcd.CustomAttribute,pcd.TokenValue)
-            for pcd in PlatformInfo.Platform.Pcds.values()]
-            }
+            "PLA_PCD": [PCD_DATA(
+                pcd.TokenCName, pcd.TokenSpaceGuidCName, pcd.Type,
+                pcd.DatumType, pcd.SkuInfoList, pcd.DefaultValue,
+                pcd.MaxDatumSize, pcd.UserDefinedDefaultStoresFlag, pcd.validateranges,
+                pcd.validlists, pcd.expressions, pcd.CustomAttribute, pcd.TokenValue)
+                for pcd in PlatformInfo.Platform.Pcds.values()]
+        }
 
-        #Platform Module Pcds
+        # Platform Module Pcds
         ModulePcds = {}
         for m in PlatformInfo.Platform.Modules:
             module = PlatformInfo.Platform.Modules[m]
-            m_pcds =  module.Pcds
+            m_pcds = module.Pcds
             if m_pcds:
                 ModulePcds[module.Guid] = [PCD_DATA(
-            pcd.TokenCName,pcd.TokenSpaceGuidCName,pcd.Type,
-            pcd.DatumType,pcd.SkuInfoList,pcd.DefaultValue,
-            pcd.MaxDatumSize,pcd.UserDefinedDefaultStoresFlag,pcd.validateranges,
-                 pcd.validlists,pcd.expressions,pcd.CustomAttribute,pcd.TokenValue)
-            for pcd in PlatformInfo.Platform.Modules[m].Pcds.values()]
+                    pcd.TokenCName, pcd.TokenSpaceGuidCName, pcd.Type,
+                    pcd.DatumType, pcd.SkuInfoList, pcd.DefaultValue,
+                    pcd.MaxDatumSize, pcd.UserDefinedDefaultStoresFlag, pcd.validateranges,
+                    pcd.validlists, pcd.expressions, pcd.CustomAttribute, pcd.TokenValue)
+                    for pcd in PlatformInfo.Platform.Modules[m].Pcds.values()]
 
+        self.DataContainer = {"MOL_PCDS": ModulePcds}
 
-        self.DataContainer = {"MOL_PCDS":ModulePcds}
-
-        #Module's Library Instance
+        # Module's Library Instance
         ModuleLibs = {}
         libModules = {}
         for m in PlatformInfo.Platform.Modules:
-            module_obj = BuildDB.BuildObject[m,PlatformInfo.Arch,PlatformInfo.BuildTarget,PlatformInfo.ToolChain]
-            Libs = GetModuleLibInstances(module_obj, PlatformInfo.Platform, BuildDB.BuildObject, PlatformInfo.Arch,PlatformInfo.BuildTarget,PlatformInfo.ToolChain,PlatformInfo.MetaFile,EdkLogger)
+            module_obj = BuildDB.BuildObject[m, PlatformInfo.Arch,
+                                             PlatformInfo.BuildTarget, PlatformInfo.ToolChain]
+            Libs = GetModuleLibInstances(module_obj, PlatformInfo.Platform, BuildDB.BuildObject, PlatformInfo.Arch,
+                                         PlatformInfo.BuildTarget, PlatformInfo.ToolChain, PlatformInfo.MetaFile, EdkLogger)
             for lib in Libs:
                 try:
-                    libModules[(lib.MetaFile.File,lib.MetaFile.Root,lib.Arch,lib.MetaFile.Path)].append((m.File,m.Root,module_obj.Arch,m.Path))
+                    libModules[(lib.MetaFile.File, lib.MetaFile.Root, lib.Arch, lib.MetaFile.Path)].append(
+                        (m.File, m.Root, module_obj.Arch, m.Path))
                 except:
-                    libModules[(lib.MetaFile.File,lib.MetaFile.Root,lib.Arch,lib.MetaFile.Path)] = [(m.File,m.Root,module_obj.Arch,m.Path)]
-            ModuleLibs[(m.File,m.Root,module_obj.Arch,m.Path)] = [(l.MetaFile.File,l.MetaFile.Root,l.Arch,l.MetaFile.Path) for l in Libs]
-        self.DataContainer = {"DEPS":ModuleLibs}
-        self.DataContainer = {"REFS":libModules}
+                    libModules[(lib.MetaFile.File, lib.MetaFile.Root, lib.Arch, lib.MetaFile.Path)] = [
+                        (m.File, m.Root, module_obj.Arch, m.Path)]
+            ModuleLibs[(m.File, m.Root, module_obj.Arch, m.Path)] = [
+                (l.MetaFile.File, l.MetaFile.Root, l.Arch, l.MetaFile.Path) for l in Libs]
+        self.DataContainer = {"DEPS": ModuleLibs}
+        self.DataContainer = {"REFS": libModules}
 
-        #Platform BuildOptions
+        # Platform BuildOptions
 
-        platform_build_opt =  PlatformInfo.EdkIIBuildOption
+        platform_build_opt = PlatformInfo.EdkIIBuildOption
 
         ToolDefinition = PlatformInfo.ToolDefinition
         module_build_opt = {}
         for m in PlatformInfo.Platform.Modules:
-            ModuleTypeOptions, PlatformModuleOptions = PlatformInfo.GetGlobalBuildOptions(BuildDB.BuildObject[m,PlatformInfo.Arch,PlatformInfo.BuildTarget,PlatformInfo.ToolChain])
+            ModuleTypeOptions, PlatformModuleOptions = PlatformInfo.GetGlobalBuildOptions(
+                BuildDB.BuildObject[m, PlatformInfo.Arch, PlatformInfo.BuildTarget, PlatformInfo.ToolChain])
             if ModuleTypeOptions or PlatformModuleOptions:
-                module_build_opt.update({(m.File,m.Root): {"ModuleTypeOptions":ModuleTypeOptions, "PlatformModuleOptions":PlatformModuleOptions}})
+                module_build_opt.update({(m.File, m.Root): {
+                                        "ModuleTypeOptions": ModuleTypeOptions, "PlatformModuleOptions": PlatformModuleOptions}})
 
-        self.DataContainer = {"PLA_BO":platform_build_opt,
-                              "TOOLDEF":ToolDefinition,
-                              "MOL_BO":module_build_opt
+        self.DataContainer = {"PLA_BO": platform_build_opt,
+                              "TOOLDEF": ToolDefinition,
+                              "MOL_BO": module_build_opt
                               }
 
-
-
-        #Platform Info
+        # Platform Info
         PInfo = {
-            "WorkspaceDir":PlatformInfo.Workspace.WorkspaceDir,
-            "Target":PlatformInfo.BuildTarget,
-            "ToolChain":PlatformInfo.Workspace.ToolChain,
-            "BuildRuleFile":PlatformInfo.BuildRule,
+            "WorkspaceDir": PlatformInfo.Workspace.WorkspaceDir,
+            "Target": PlatformInfo.BuildTarget,
+            "ToolChain": PlatformInfo.Workspace.ToolChain,
+            "BuildRuleFile": PlatformInfo.BuildRule,
             "Arch": PlatformInfo.Arch,
-            "ArchList":PlatformInfo.Workspace.ArchList,
-            "ActivePlatform":PlatformInfo.MetaFile
-            }
-        self.DataContainer = {'P_Info':PInfo}
+            "ArchList": PlatformInfo.Workspace.ArchList,
+            "ActivePlatform": PlatformInfo.MetaFile
+        }
+        self.DataContainer = {'P_Info': PInfo}
 
-        self.DataContainer = {'M_Name':PlatformInfo.UniqueBaseName}
+        self.DataContainer = {'M_Name': PlatformInfo.UniqueBaseName}
 
         self.DataContainer = {"ToolChainFamily": PlatformInfo.ToolChainFamily}
 
         self.DataContainer = {"BuildRuleFamily": PlatformInfo.BuildRuleFamily}
 
-        self.DataContainer = {"MixedPcd":GlobalData.MixedPcd}
+        self.DataContainer = {"MixedPcd": GlobalData.MixedPcd}
 
-        self.DataContainer = {"BuildOptPcd":GlobalData.BuildOptionPcd}
+        self.DataContainer = {"BuildOptPcd": GlobalData.BuildOptionPcd}
 
         self.DataContainer = {"BuildCommand": PlatformInfo.BuildCommand}
 
-        self.DataContainer = {"AsBuildModuleList": PlatformInfo._AsBuildModuleList}
+        self.DataContainer = {
+            "AsBuildModuleList": PlatformInfo._AsBuildModuleList}
 
         self.DataContainer = {"G_defines": GlobalData.gGlobalDefines}
 
         self.DataContainer = {"CL_defines": GlobalData.gCommandLineDefines}
 
-        self.DataContainer = {"gCommandMaxLength": GlobalData.gCommandMaxLength}
+        self.DataContainer = {
+            "gCommandMaxLength": GlobalData.gCommandMaxLength}
 
-        self.DataContainer = {"Env_Var": {k:v for k, v in os.environ.items()}}
+        self.DataContainer = {"Env_Var": {k: v for k, v in os.environ.items()}}
 
-        self.DataContainer = {"PackageList": [(dec.MetaFile,dec.Arch) for dec in PlatformInfo.PackageList]}
+        self.DataContainer = {"PackageList": [
+            (dec.MetaFile, dec.Arch) for dec in PlatformInfo.PackageList]}
 
         self.DataContainer = {"GuidDict": PlatformInfo.Platform._GuidDict}
 
-        self.DataContainer = {"DatabasePath":GlobalData.gDatabasePath}
+        self.DataContainer = {"DatabasePath": GlobalData.gDatabasePath}
 
-        self.DataContainer = {"FdfParser": True if GlobalData.gFdfParser else False}
+        self.DataContainer = {
+            "FdfParser": True if GlobalData.gFdfParser else False}
 
         self.DataContainer = {"LogLevel": EdkLogger.GetLevel()}
 
-        self.DataContainer = {"UseHashCache":GlobalData.gUseHashCache}
+        self.DataContainer = {"UseHashCache": GlobalData.gUseHashCache}
 
-        self.DataContainer = {"BinCacheSource":GlobalData.gBinCacheSource}
+        self.DataContainer = {"BinCacheSource": GlobalData.gBinCacheSource}
 
-        self.DataContainer = {"BinCacheDest":GlobalData.gBinCacheDest}
+        self.DataContainer = {"BinCacheDest": GlobalData.gBinCacheDest}
 
-        self.DataContainer = {"EnableGenfdsMultiThread":GlobalData.gEnableGenfdsMultiThread}
+        self.DataContainer = {
+            "EnableGenfdsMultiThread": GlobalData.gEnableGenfdsMultiThread}
 
-        self.DataContainer = {"gPlatformFinalPcds":GlobalData.gPlatformFinalPcds}
+        self.DataContainer = {
+            "gPlatformFinalPcds": GlobalData.gPlatformFinalPcds}
diff --git a/BaseTools/Source/Python/AutoGen/GenC.py b/BaseTools/Source/Python/AutoGen/GenC.py
index a2053d548521..fe02f160fd01 100755
--- a/BaseTools/Source/Python/AutoGen/GenC.py
+++ b/BaseTools/Source/Python/AutoGen/GenC.py
@@ -1,11 +1,11 @@
-## @file
+# @file
 # Routines for generating AutoGen.h and AutoGen.c
 #
 # Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
 # SPDX-License-Identifier: BSD-2-Clause-Patent
 #
 
-## Import Modules
+# Import Modules
 #
 from __future__ import absolute_import
 import string
@@ -21,28 +21,31 @@ from .StrGather import *
 from .GenPcdDb import CreatePcdDatabaseCode
 from .IdfClassObject import *
 
-## PCD type string
-gItemTypeStringDatabase  = {
-    TAB_PCDS_FEATURE_FLAG       :   TAB_PCDS_FIXED_AT_BUILD,
-    TAB_PCDS_FIXED_AT_BUILD     :   TAB_PCDS_FIXED_AT_BUILD,
+# PCD type string
+gItemTypeStringDatabase = {
+    TAB_PCDS_FEATURE_FLAG:   TAB_PCDS_FIXED_AT_BUILD,
+    TAB_PCDS_FIXED_AT_BUILD:   TAB_PCDS_FIXED_AT_BUILD,
     TAB_PCDS_PATCHABLE_IN_MODULE:   'BinaryPatch',
-    TAB_PCDS_DYNAMIC            :   '',
-    TAB_PCDS_DYNAMIC_DEFAULT    :   '',
-    TAB_PCDS_DYNAMIC_VPD        :   '',
-    TAB_PCDS_DYNAMIC_HII        :   '',
-    TAB_PCDS_DYNAMIC_EX         :   '',
-    TAB_PCDS_DYNAMIC_EX_DEFAULT :   '',
-    TAB_PCDS_DYNAMIC_EX_VPD     :   '',
-    TAB_PCDS_DYNAMIC_EX_HII     :   '',
+    TAB_PCDS_DYNAMIC:   '',
+    TAB_PCDS_DYNAMIC_DEFAULT:   '',
+    TAB_PCDS_DYNAMIC_VPD:   '',
+    TAB_PCDS_DYNAMIC_HII:   '',
+    TAB_PCDS_DYNAMIC_EX:   '',
+    TAB_PCDS_DYNAMIC_EX_DEFAULT:   '',
+    TAB_PCDS_DYNAMIC_EX_VPD:   '',
+    TAB_PCDS_DYNAMIC_EX_HII:   '',
 }
 
 
-## Datum size
-gDatumSizeStringDatabase = {TAB_UINT8:'8',TAB_UINT16:'16',TAB_UINT32:'32',TAB_UINT64:'64','BOOLEAN':'BOOLEAN',TAB_VOID:'8'}
-gDatumSizeStringDatabaseH = {TAB_UINT8:'8',TAB_UINT16:'16',TAB_UINT32:'32',TAB_UINT64:'64','BOOLEAN':'BOOL',TAB_VOID:'PTR'}
-gDatumSizeStringDatabaseLib = {TAB_UINT8:'8',TAB_UINT16:'16',TAB_UINT32:'32',TAB_UINT64:'64','BOOLEAN':'Bool',TAB_VOID:'Ptr'}
+# Datum size
+gDatumSizeStringDatabase = {TAB_UINT8: '8', TAB_UINT16: '16',
+                            TAB_UINT32: '32', TAB_UINT64: '64', 'BOOLEAN': 'BOOLEAN', TAB_VOID: '8'}
+gDatumSizeStringDatabaseH = {TAB_UINT8: '8', TAB_UINT16: '16',
+                             TAB_UINT32: '32', TAB_UINT64: '64', 'BOOLEAN': 'BOOL', TAB_VOID: 'PTR'}
+gDatumSizeStringDatabaseLib = {TAB_UINT8: '8', TAB_UINT16: '16',
+                               TAB_UINT32: '32', TAB_UINT64: '64', 'BOOLEAN': 'Bool', TAB_VOID: 'Ptr'}
 
-## AutoGen File Header Templates
+# AutoGen File Header Templates
 gAutoGenHeaderString = TemplateString("""\
 /**
   DO NOT EDIT
@@ -75,7 +78,7 @@ gAutoGenHEpilogueString = """
 #endif
 """
 
-## PEI Core Entry Point Templates
+# PEI Core Entry Point Templates
 gPeiCoreEntryPointPrototype = TemplateString("""
 ${BEGIN}
 VOID
@@ -105,7 +108,7 @@ ${END}
 """)
 
 
-## DXE Core Entry Point Templates
+# DXE Core Entry Point Templates
 gDxeCoreEntryPointPrototype = TemplateString("""
 ${BEGIN}
 VOID
@@ -130,7 +133,7 @@ ProcessModuleEntryPointList (
 ${END}
 """)
 
-## PEIM Entry Point Templates
+# PEIM Entry Point Templates
 gPeimEntryPointPrototype = TemplateString("""
 ${BEGIN}
 EFI_STATUS
@@ -143,7 +146,7 @@ ${END}
 """)
 
 gPeimEntryPointString = [
-TemplateString("""
+    TemplateString("""
 GLOBAL_REMOVE_IF_UNREFERENCED const UINT32 _gPeimRevision = ${PiSpecVersion};
 
 EFI_STATUS
@@ -157,7 +160,7 @@ ProcessModuleEntryPointList (
   return EFI_SUCCESS;
 }
 """),
-TemplateString("""
+    TemplateString("""
 GLOBAL_REMOVE_IF_UNREFERENCED const UINT32 _gPeimRevision = ${PiSpecVersion};
 ${BEGIN}
 EFI_STATUS
@@ -172,7 +175,7 @@ ProcessModuleEntryPointList (
 }
 ${END}
 """),
-TemplateString("""
+    TemplateString("""
 GLOBAL_REMOVE_IF_UNREFERENCED const UINT32 _gPeimRevision = ${PiSpecVersion};
 
 EFI_STATUS
@@ -198,7 +201,7 @@ ${END}
 """)
 ]
 
-## SMM_CORE Entry Point Templates
+# SMM_CORE Entry Point Templates
 gSmmCoreEntryPointPrototype = TemplateString("""
 ${BEGIN}
 EFI_STATUS
@@ -227,7 +230,7 @@ ProcessModuleEntryPointList (
 ${END}
 """)
 
-## MM_CORE_STANDALONE Entry Point Templates
+# MM_CORE_STANDALONE Entry Point Templates
 gMmCoreStandaloneEntryPointPrototype = TemplateString("""
 ${BEGIN}
 EFI_STATUS
@@ -253,7 +256,7 @@ ProcessModuleEntryPointList (
 ${END}
 """)
 
-## MM_STANDALONE Entry Point Templates
+# MM_STANDALONE Entry Point Templates
 gMmStandaloneEntryPointPrototype = TemplateString("""
 ${BEGIN}
 EFI_STATUS
@@ -266,7 +269,7 @@ ${END}
 """)
 
 gMmStandaloneEntryPointString = [
-TemplateString("""
+    TemplateString("""
 GLOBAL_REMOVE_IF_UNREFERENCED const UINT32 _gMmRevision = ${PiSpecVersion};
 
 EFI_STATUS
@@ -280,7 +283,7 @@ ProcessModuleEntryPointList (
   return EFI_SUCCESS;
 }
 """),
-TemplateString("""
+    TemplateString("""
 GLOBAL_REMOVE_IF_UNREFERENCED const UINT32 _gMmRevision = ${PiSpecVersion};
 ${BEGIN}
 EFI_STATUS
@@ -295,7 +298,7 @@ ProcessModuleEntryPointList (
 }
 ${END}
 """),
-TemplateString("""
+    TemplateString("""
 GLOBAL_REMOVE_IF_UNREFERENCED const UINT32 _gMmRevision = ${PiSpecVersion};
 
 EFI_STATUS
@@ -321,7 +324,7 @@ ${END}
 """)
 ]
 
-## DXE SMM Entry Point Templates
+# DXE SMM Entry Point Templates
 gDxeSmmEntryPointPrototype = TemplateString("""
 ${BEGIN}
 EFI_STATUS
@@ -334,7 +337,7 @@ ${END}
 """)
 
 gDxeSmmEntryPointString = [
-TemplateString("""
+    TemplateString("""
 const UINT32 _gUefiDriverRevision = ${UefiSpecVersion};
 const UINT32 _gDxeRevision = ${PiSpecVersion};
 
@@ -349,7 +352,7 @@ ProcessModuleEntryPointList (
   return EFI_SUCCESS;
 }
 """),
-TemplateString("""
+    TemplateString("""
 const UINT32 _gUefiDriverRevision = ${UefiSpecVersion};
 const UINT32 _gDxeRevision = ${PiSpecVersion};
 
@@ -390,7 +393,7 @@ ${END}
 """)
 ]
 
-## UEFI Driver Entry Point Templates
+# UEFI Driver Entry Point Templates
 gUefiDriverEntryPointPrototype = TemplateString("""
 ${BEGIN}
 EFI_STATUS
@@ -403,7 +406,7 @@ ${END}
 """)
 
 gUefiDriverEntryPointString = [
-TemplateString("""
+    TemplateString("""
 const UINT32 _gUefiDriverRevision = ${UefiSpecVersion};
 const UINT32 _gDxeRevision = ${PiSpecVersion};
 
@@ -417,7 +420,7 @@ ProcessModuleEntryPointList (
   return EFI_SUCCESS;
 }
 """),
-TemplateString("""
+    TemplateString("""
 const UINT32 _gUefiDriverRevision = ${UefiSpecVersion};
 const UINT32 _gDxeRevision = ${PiSpecVersion};
 
@@ -445,7 +448,7 @@ ExitDriver (
   gBS->Exit (gImageHandle, Status, 0, NULL);
 }
 """),
-TemplateString("""
+    TemplateString("""
 const UINT32 _gUefiDriverRevision = ${UefiSpecVersion};
 const UINT32 _gDxeRevision = ${PiSpecVersion};
 
@@ -485,7 +488,7 @@ ExitDriver (
 ]
 
 
-## UEFI Application Entry Point Templates
+# UEFI Application Entry Point Templates
 gUefiApplicationEntryPointPrototype = TemplateString("""
 ${BEGIN}
 EFI_STATUS
@@ -498,7 +501,7 @@ ${END}
 """)
 
 gUefiApplicationEntryPointString = [
-TemplateString("""
+    TemplateString("""
 const UINT32 _gUefiDriverRevision = ${UefiSpecVersion};
 
 EFI_STATUS
@@ -511,7 +514,7 @@ ProcessModuleEntryPointList (
   return EFI_SUCCESS;
 }
 """),
-TemplateString("""
+    TemplateString("""
 const UINT32 _gUefiDriverRevision = ${UefiSpecVersion};
 
 ${BEGIN}
@@ -538,7 +541,7 @@ ExitDriver (
   gBS->Exit (gImageHandle, Status, 0, NULL);
 }
 """),
-TemplateString("""
+    TemplateString("""
 const UINT32 _gUefiDriverRevision = ${UefiSpecVersion};
 
 EFI_STATUS
@@ -576,7 +579,7 @@ ExitDriver (
 """)
 ]
 
-## UEFI Unload Image Templates
+# UEFI Unload Image Templates
 gUefiUnloadImagePrototype = TemplateString("""
 ${BEGIN}
 EFI_STATUS
@@ -588,7 +591,7 @@ ${END}
 """)
 
 gUefiUnloadImageString = [
-TemplateString("""
+    TemplateString("""
 GLOBAL_REMOVE_IF_UNREFERENCED const UINT8 _gDriverUnloadImageCount = ${Count};
 
 EFI_STATUS
@@ -600,7 +603,7 @@ ProcessModuleUnloadList (
   return EFI_SUCCESS;
 }
 """),
-TemplateString("""
+    TemplateString("""
 GLOBAL_REMOVE_IF_UNREFERENCED const UINT8 _gDriverUnloadImageCount = ${Count};
 
 ${BEGIN}
@@ -614,7 +617,7 @@ ProcessModuleUnloadList (
 }
 ${END}
 """),
-TemplateString("""
+    TemplateString("""
 GLOBAL_REMOVE_IF_UNREFERENCED const UINT8 _gDriverUnloadImageCount = ${Count};
 
 EFI_STATUS
@@ -639,7 +642,7 @@ ${END}
 ]
 
 gLibraryStructorPrototype = {
-SUP_MODULE_BASE  : TemplateString("""${BEGIN}
+    SUP_MODULE_BASE: TemplateString("""${BEGIN}
 RETURN_STATUS
 EFIAPI
 ${Function} (
@@ -647,7 +650,7 @@ ${Function} (
   );${END}
 """),
 
-'PEI'   : TemplateString("""${BEGIN}
+    'PEI': TemplateString("""${BEGIN}
 EFI_STATUS
 EFIAPI
 ${Function} (
@@ -656,7 +659,7 @@ ${Function} (
   );${END}
 """),
 
-'DXE'   : TemplateString("""${BEGIN}
+    'DXE': TemplateString("""${BEGIN}
 EFI_STATUS
 EFIAPI
 ${Function} (
@@ -665,7 +668,7 @@ ${Function} (
   );${END}
 """),
 
-'MM'   : TemplateString("""${BEGIN}
+    'MM': TemplateString("""${BEGIN}
 EFI_STATUS
 EFIAPI
 ${Function} (
@@ -676,30 +679,30 @@ ${Function} (
 }
 
 gLibraryStructorCall = {
-SUP_MODULE_BASE  : TemplateString("""${BEGIN}
+    SUP_MODULE_BASE: TemplateString("""${BEGIN}
   Status = ${Function} ();
   ASSERT_RETURN_ERROR (Status);${END}
 """),
 
-'PEI'   : TemplateString("""${BEGIN}
+    'PEI': TemplateString("""${BEGIN}
   Status = ${Function} (FileHandle, PeiServices);
   ASSERT_EFI_ERROR (Status);${END}
 """),
 
-'DXE'   : TemplateString("""${BEGIN}
+    'DXE': TemplateString("""${BEGIN}
   Status = ${Function} (ImageHandle, SystemTable);
   ASSERT_EFI_ERROR (Status);${END}
 """),
 
-'MM'   : TemplateString("""${BEGIN}
+    'MM': TemplateString("""${BEGIN}
   Status = ${Function} (ImageHandle, MmSystemTable);
   ASSERT_EFI_ERROR (Status);${END}
 """),
 }
 
-## Library Constructor and Destructor Templates
+# Library Constructor and Destructor Templates
 gLibraryString = {
-SUP_MODULE_BASE  :   TemplateString("""
+    SUP_MODULE_BASE:   TemplateString("""
 ${BEGIN}${FunctionPrototype}${END}
 
 VOID
@@ -713,7 +716,7 @@ ${FunctionCall}${END}
 }
 """),
 
-'PEI'   :   TemplateString("""
+    'PEI':   TemplateString("""
 ${BEGIN}${FunctionPrototype}${END}
 
 VOID
@@ -728,7 +731,7 @@ ${FunctionCall}${END}
 }
 """),
 
-'DXE'   :   TemplateString("""
+    'DXE':   TemplateString("""
 ${BEGIN}${FunctionPrototype}${END}
 
 VOID
@@ -743,7 +746,7 @@ ${FunctionCall}${END}
 }
 """),
 
-'MM'   :   TemplateString("""
+    'MM':   TemplateString("""
 ${BEGIN}${FunctionPrototype}${END}
 
 VOID
@@ -762,34 +765,36 @@ ${FunctionCall}${END}
 gBasicHeaderFile = "Base.h"
 
 gModuleTypeHeaderFile = {
-    SUP_MODULE_BASE              :   [gBasicHeaderFile, "Library/DebugLib.h"],
-    SUP_MODULE_SEC               :   ["PiPei.h", "Library/DebugLib.h"],
-    SUP_MODULE_PEI_CORE          :   ["PiPei.h", "Library/DebugLib.h", "Library/PeiCoreEntryPoint.h"],
-    SUP_MODULE_PEIM              :   ["PiPei.h", "Library/DebugLib.h", "Library/PeimEntryPoint.h"],
-    SUP_MODULE_DXE_CORE          :   ["PiDxe.h", "Library/DebugLib.h", "Library/DxeCoreEntryPoint.h"],
-    SUP_MODULE_DXE_DRIVER        :   ["PiDxe.h", "Library/BaseLib.h", "Library/DebugLib.h", "Library/UefiBootServicesTableLib.h", "Library/UefiDriverEntryPoint.h"],
-    SUP_MODULE_DXE_SMM_DRIVER    :   ["PiDxe.h", "Library/BaseLib.h", "Library/DebugLib.h", "Library/UefiBootServicesTableLib.h", "Library/UefiDriverEntryPoint.h"],
+    SUP_MODULE_BASE:   [gBasicHeaderFile, "Library/DebugLib.h"],
+    SUP_MODULE_SEC:   ["PiPei.h", "Library/DebugLib.h"],
+    SUP_MODULE_PEI_CORE:   ["PiPei.h", "Library/DebugLib.h", "Library/PeiCoreEntryPoint.h"],
+    SUP_MODULE_PEIM:   ["PiPei.h", "Library/DebugLib.h", "Library/PeimEntryPoint.h"],
+    SUP_MODULE_DXE_CORE:   ["PiDxe.h", "Library/DebugLib.h", "Library/DxeCoreEntryPoint.h"],
+    SUP_MODULE_DXE_DRIVER:   ["PiDxe.h", "Library/BaseLib.h", "Library/DebugLib.h", "Library/UefiBootServicesTableLib.h", "Library/UefiDriverEntryPoint.h"],
+    SUP_MODULE_DXE_SMM_DRIVER:   ["PiDxe.h", "Library/BaseLib.h", "Library/DebugLib.h", "Library/UefiBootServicesTableLib.h", "Library/UefiDriverEntryPoint.h"],
     SUP_MODULE_DXE_RUNTIME_DRIVER:   ["PiDxe.h", "Library/BaseLib.h", "Library/DebugLib.h", "Library/UefiBootServicesTableLib.h", "Library/UefiDriverEntryPoint.h"],
-    SUP_MODULE_DXE_SAL_DRIVER    :   ["PiDxe.h", "Library/BaseLib.h", "Library/DebugLib.h", "Library/UefiBootServicesTableLib.h", "Library/UefiDriverEntryPoint.h"],
-    SUP_MODULE_UEFI_DRIVER       :   ["Uefi.h",  "Library/BaseLib.h", "Library/DebugLib.h", "Library/UefiBootServicesTableLib.h", "Library/UefiDriverEntryPoint.h"],
-    SUP_MODULE_UEFI_APPLICATION  :   ["Uefi.h",  "Library/BaseLib.h", "Library/DebugLib.h", "Library/UefiBootServicesTableLib.h", "Library/UefiApplicationEntryPoint.h"],
-    SUP_MODULE_SMM_CORE          :   ["PiDxe.h", "Library/BaseLib.h", "Library/DebugLib.h", "Library/UefiDriverEntryPoint.h"],
-    SUP_MODULE_MM_STANDALONE     :   ["PiMm.h", "Library/BaseLib.h", "Library/DebugLib.h", "Library/StandaloneMmDriverEntryPoint.h"],
-    SUP_MODULE_MM_CORE_STANDALONE :  ["PiMm.h", "Library/BaseLib.h", "Library/DebugLib.h", "Library/StandaloneMmCoreEntryPoint.h"],
-    SUP_MODULE_USER_DEFINED      :   [gBasicHeaderFile, "Library/DebugLib.h"],
-    SUP_MODULE_HOST_APPLICATION  :   [gBasicHeaderFile, "Library/DebugLib.h"]
+    SUP_MODULE_DXE_SAL_DRIVER:   ["PiDxe.h", "Library/BaseLib.h", "Library/DebugLib.h", "Library/UefiBootServicesTableLib.h", "Library/UefiDriverEntryPoint.h"],
+    SUP_MODULE_UEFI_DRIVER:   ["Uefi.h",  "Library/BaseLib.h", "Library/DebugLib.h", "Library/UefiBootServicesTableLib.h", "Library/UefiDriverEntryPoint.h"],
+    SUP_MODULE_UEFI_APPLICATION:   ["Uefi.h",  "Library/BaseLib.h", "Library/DebugLib.h", "Library/UefiBootServicesTableLib.h", "Library/UefiApplicationEntryPoint.h"],
+    SUP_MODULE_SMM_CORE:   ["PiDxe.h", "Library/BaseLib.h", "Library/DebugLib.h", "Library/UefiDriverEntryPoint.h"],
+    SUP_MODULE_MM_STANDALONE:   ["PiMm.h", "Library/BaseLib.h", "Library/DebugLib.h", "Library/StandaloneMmDriverEntryPoint.h"],
+    SUP_MODULE_MM_CORE_STANDALONE:  ["PiMm.h", "Library/BaseLib.h", "Library/DebugLib.h", "Library/StandaloneMmCoreEntryPoint.h"],
+    SUP_MODULE_USER_DEFINED:   [gBasicHeaderFile, "Library/DebugLib.h"],
+    SUP_MODULE_HOST_APPLICATION:   [gBasicHeaderFile, "Library/DebugLib.h"]
 }
 
-## Autogen internal worker macro to define DynamicEx PCD name includes both the TokenSpaceGuidName
+# Autogen internal worker macro to define DynamicEx PCD name includes both the TokenSpaceGuidName
 #  the TokenName and Guid comparison to avoid define name collisions.
 #
 #   @param      Info        The ModuleAutoGen object
 #   @param      AutoGenH    The TemplateString object for header file
 #
 #
+
+
 def DynExPcdTokenNumberMapping(Info, AutoGenH):
     ExTokenCNameList = []
-    PcdExList        = []
+    PcdExList = []
     # Even it is the Library, the PCD is saved in the ModulePcdList
     PcdList = Info.ModulePcdList
     for Pcd in PcdList:
@@ -817,7 +822,8 @@ def DynExPcdTokenNumberMapping(Info, AutoGenH):
             if Pcd.TokenCName == TokenCName:
                 Index = Index + 1
                 if Index == 1:
-                    AutoGenH.Append('\n#define __PCD_%s_ADDR_CMP(GuidPtr)  (' % (RealTokenCName))
+                    AutoGenH.Append(
+                        '\n#define __PCD_%s_ADDR_CMP(GuidPtr)  (' % (RealTokenCName))
                     AutoGenH.Append('\\\n  (GuidPtr == &%s) ? _PCD_TOKEN_%s_%s:'
                                     % (Pcd.TokenSpaceGuidCName, Pcd.TokenSpaceGuidCName, RealTokenCName))
                 else:
@@ -842,7 +848,8 @@ def DynExPcdTokenNumberMapping(Info, AutoGenH):
             if Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET and Pcd.TokenCName == TokenCName:
                 Index = Index + 1
                 if Index == 1:
-                    AutoGenH.Append('\n#define __PCD_%s_VAL_CMP(GuidPtr)  (' % (RealTokenCName))
+                    AutoGenH.Append(
+                        '\n#define __PCD_%s_VAL_CMP(GuidPtr)  (' % (RealTokenCName))
                     AutoGenH.Append('\\\n  (GuidPtr == NULL) ? 0:')
                     AutoGenH.Append('\\\n  COMPAREGUID (GuidPtr, &%s) ? _PCD_TOKEN_%s_%s:'
                                     % (Pcd.TokenSpaceGuidCName, Pcd.TokenSpaceGuidCName, RealTokenCName))
@@ -859,15 +866,18 @@ def DynExPcdTokenNumberMapping(Info, AutoGenH):
                                     % (RealTokenCName, RealTokenCName, RealTokenCName, RealTokenCName))
                 TokenCNameList.add(TokenCName)
 
-## Create code for module PCDs
+# Create code for module PCDs
 #
 #   @param      Info        The ModuleAutoGen object
 #   @param      AutoGenC    The TemplateString object for C code
 #   @param      AutoGenH    The TemplateString object for header file
 #   @param      Pcd         The PCD object
 #
+
+
 def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
-    TokenSpaceGuidValue = Pcd.TokenSpaceGuidValue   #Info.GuidList[Pcd.TokenSpaceGuidCName]
+    # Info.GuidList[Pcd.TokenSpaceGuidCName]
+    TokenSpaceGuidValue = Pcd.TokenSpaceGuidValue
     PcdTokenNumber = Info.PlatformInfo.PcdTokenNumber
     #
     # Write PCDs
@@ -878,7 +888,7 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
             TokenCName = PcdItem[0]
             break
     PcdTokenName = '_PCD_TOKEN_' + TokenCName
-    PatchPcdSizeTokenName = '_PCD_PATCHABLE_' + TokenCName +'_SIZE'
+    PatchPcdSizeTokenName = '_PCD_PATCHABLE_' + TokenCName + '_SIZE'
     PatchPcdSizeVariableName = '_gPcd_BinaryPatch_Size_' + TokenCName
     PatchPcdMaxSizeVariable = '_gPcd_BinaryPatch_MaxSize_' + TokenCName
     FixPcdSizeTokenName = '_PCD_SIZE_' + TokenCName
@@ -909,23 +919,33 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
                 TokenNumber = 0
             else:
                 EdkLogger.error("build", AUTOGEN_ERROR,
-                                "No generated token number for %s.%s\n" % (Pcd.TokenSpaceGuidCName, TokenCName),
+                                "No generated token number for %s.%s\n" % (
+                                    Pcd.TokenSpaceGuidCName, TokenCName),
                                 ExtraData="[%s]" % str(Info))
         else:
-            TokenNumber = PcdTokenNumber[Pcd.TokenCName, Pcd.TokenSpaceGuidCName]
+            TokenNumber = PcdTokenNumber[Pcd.TokenCName,
+                                         Pcd.TokenSpaceGuidCName]
         AutoGenH.Append('\n#define %s  %dU\n' % (PcdTokenName, TokenNumber))
 
-    EdkLogger.debug(EdkLogger.DEBUG_3, "Creating code for " + TokenCName + "." + Pcd.TokenSpaceGuidCName)
+    EdkLogger.debug(EdkLogger.DEBUG_3, "Creating code for " +
+                    TokenCName + "." + Pcd.TokenSpaceGuidCName)
     if Pcd.Type not in gItemTypeStringDatabase:
         EdkLogger.error("build", AUTOGEN_ERROR,
-                        "Unknown PCD type [%s] of PCD %s.%s" % (Pcd.Type, Pcd.TokenSpaceGuidCName, TokenCName),
+                        "Unknown PCD type [%s] of PCD %s.%s" % (
+                            Pcd.Type, Pcd.TokenSpaceGuidCName, TokenCName),
                         ExtraData="[%s]" % str(Info))
 
     DatumSize = gDatumSizeStringDatabase[Pcd.DatumType] if Pcd.DatumType in gDatumSizeStringDatabase else gDatumSizeStringDatabase[TAB_VOID]
     DatumSizeLib = gDatumSizeStringDatabaseLib[Pcd.DatumType] if Pcd.DatumType in gDatumSizeStringDatabaseLib else gDatumSizeStringDatabaseLib[TAB_VOID]
-    GetModeName = '_PCD_GET_MODE_' + gDatumSizeStringDatabaseH[Pcd.DatumType] + '_' + TokenCName if Pcd.DatumType in gDatumSizeStringDatabaseH else '_PCD_GET_MODE_' + gDatumSizeStringDatabaseH[TAB_VOID] + '_' + TokenCName
-    SetModeName = '_PCD_SET_MODE_' + gDatumSizeStringDatabaseH[Pcd.DatumType] + '_' + TokenCName if Pcd.DatumType in gDatumSizeStringDatabaseH else '_PCD_SET_MODE_' + gDatumSizeStringDatabaseH[TAB_VOID] + '_' + TokenCName
-    SetModeStatusName = '_PCD_SET_MODE_' + gDatumSizeStringDatabaseH[Pcd.DatumType] + '_S_' + TokenCName if Pcd.DatumType in gDatumSizeStringDatabaseH else '_PCD_SET_MODE_' + gDatumSizeStringDatabaseH[TAB_VOID] + '_S_' + TokenCName
+    GetModeName = '_PCD_GET_MODE_' + gDatumSizeStringDatabaseH[Pcd.DatumType] + '_' + \
+        TokenCName if Pcd.DatumType in gDatumSizeStringDatabaseH else '_PCD_GET_MODE_' + \
+        gDatumSizeStringDatabaseH[TAB_VOID] + '_' + TokenCName
+    SetModeName = '_PCD_SET_MODE_' + gDatumSizeStringDatabaseH[Pcd.DatumType] + '_' + \
+        TokenCName if Pcd.DatumType in gDatumSizeStringDatabaseH else '_PCD_SET_MODE_' + \
+        gDatumSizeStringDatabaseH[TAB_VOID] + '_' + TokenCName
+    SetModeStatusName = '_PCD_SET_MODE_' + gDatumSizeStringDatabaseH[Pcd.DatumType] + '_S_' + \
+        TokenCName if Pcd.DatumType in gDatumSizeStringDatabaseH else '_PCD_SET_MODE_' + \
+        gDatumSizeStringDatabaseH[TAB_VOID] + '_S_' + TokenCName
     GetModeSizeName = '_PCD_GET_MODE_SIZE' + '_' + TokenCName
 
     if Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET:
@@ -944,26 +964,41 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
         # If only PcdToken and PcdGet/Set used in all Pcds with different CName, it should succeed to build.
         # If PcdToken and PcdGet/Set used in the Pcds with different Guids but same CName, it should failed to build.
         if PcdExCNameTest > 1:
-            AutoGenH.Append('// Disabled the macros, as PcdToken and PcdGet/Set are not allowed in the case that more than one DynamicEx Pcds are different Guids but same CName.\n')
-            AutoGenH.Append('// #define %s  %s\n' % (PcdTokenName, PcdExTokenName))
-            AutoGenH.Append('// #define %s  LibPcdGetEx%s(&%s, %s)\n' % (GetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
-            AutoGenH.Append('// #define %s  LibPcdGetExSize(&%s, %s)\n' % (GetModeSizeName, Pcd.TokenSpaceGuidCName, PcdTokenName))
+            AutoGenH.Append(
+                '// Disabled the macros, as PcdToken and PcdGet/Set are not allowed in the case that more than one DynamicEx Pcds are different Guids but same CName.\n')
+            AutoGenH.Append('// #define %s  %s\n' %
+                            (PcdTokenName, PcdExTokenName))
+            AutoGenH.Append('// #define %s  LibPcdGetEx%s(&%s, %s)\n' %
+                            (GetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
+            AutoGenH.Append('// #define %s  LibPcdGetExSize(&%s, %s)\n' %
+                            (GetModeSizeName, Pcd.TokenSpaceGuidCName, PcdTokenName))
             if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES:
-                AutoGenH.Append('// #define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%s(&%s, %s, (SizeOfBuffer), (Buffer))\n' % (SetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
-                AutoGenH.Append('// #define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%sS(&%s, %s, (SizeOfBuffer), (Buffer))\n' % (SetModeStatusName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
+                AutoGenH.Append('// #define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%s(&%s, %s, (SizeOfBuffer), (Buffer))\n' %
+                                (SetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
+                AutoGenH.Append('// #define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%sS(&%s, %s, (SizeOfBuffer), (Buffer))\n' %
+                                (SetModeStatusName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
             else:
-                AutoGenH.Append('// #define %s(Value)  LibPcdSetEx%s(&%s, %s, (Value))\n' % (SetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
-                AutoGenH.Append('// #define %s(Value)  LibPcdSetEx%sS(&%s, %s, (Value))\n' % (SetModeStatusName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
+                AutoGenH.Append('// #define %s(Value)  LibPcdSetEx%s(&%s, %s, (Value))\n' %
+                                (SetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
+                AutoGenH.Append('// #define %s(Value)  LibPcdSetEx%sS(&%s, %s, (Value))\n' % (
+                    SetModeStatusName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
         else:
-            AutoGenH.Append('#define %s  %s\n' % (PcdTokenName, PcdExTokenName))
-            AutoGenH.Append('#define %s  LibPcdGetEx%s(&%s, %s)\n' % (GetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
-            AutoGenH.Append('#define %s LibPcdGetExSize(&%s, %s)\n' % (GetModeSizeName, Pcd.TokenSpaceGuidCName, PcdTokenName))
+            AutoGenH.Append('#define %s  %s\n' %
+                            (PcdTokenName, PcdExTokenName))
+            AutoGenH.Append('#define %s  LibPcdGetEx%s(&%s, %s)\n' % (
+                GetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
+            AutoGenH.Append('#define %s LibPcdGetExSize(&%s, %s)\n' % (
+                GetModeSizeName, Pcd.TokenSpaceGuidCName, PcdTokenName))
             if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES:
-                AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%s(&%s, %s, (SizeOfBuffer), (Buffer))\n' % (SetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
-                AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%sS(&%s, %s, (SizeOfBuffer), (Buffer))\n' % (SetModeStatusName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
+                AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%s(&%s, %s, (SizeOfBuffer), (Buffer))\n' % (
+                    SetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
+                AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%sS(&%s, %s, (SizeOfBuffer), (Buffer))\n' % (
+                    SetModeStatusName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
             else:
-                AutoGenH.Append('#define %s(Value)  LibPcdSetEx%s(&%s, %s, (Value))\n' % (SetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
-                AutoGenH.Append('#define %s(Value)  LibPcdSetEx%sS(&%s, %s, (Value))\n' % (SetModeStatusName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
+                AutoGenH.Append('#define %s(Value)  LibPcdSetEx%s(&%s, %s, (Value))\n' % (
+                    SetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
+                AutoGenH.Append('#define %s(Value)  LibPcdSetEx%sS(&%s, %s, (Value))\n' % (
+                    SetModeStatusName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
     elif Pcd.Type in PCD_DYNAMIC_TYPE_SET:
         PcdCNameTest = 0
         for PcdModule in Info.LibraryPcdList + Info.ModulePcdList:
@@ -973,18 +1008,26 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
             if PcdCNameTest > 1:
                 break
         if PcdCNameTest > 1:
-            EdkLogger.error("build", AUTOGEN_ERROR, "More than one Dynamic Pcds [%s] are different Guids but same CName. They need to be changed to DynamicEx type to avoid the confliction.\n" % (TokenCName), ExtraData="[%s]" % str(Info.MetaFile.Path))
+            EdkLogger.error("build", AUTOGEN_ERROR, "More than one Dynamic Pcds [%s] are different Guids but same CName. They need to be changed to DynamicEx type to avoid the confliction.\n" % (
+                TokenCName), ExtraData="[%s]" % str(Info.MetaFile.Path))
         else:
-            AutoGenH.Append('#define %s  LibPcdGet%s(%s)\n' % (GetModeName, DatumSizeLib, PcdTokenName))
-            AutoGenH.Append('#define %s  LibPcdGetSize(%s)\n' % (GetModeSizeName, PcdTokenName))
+            AutoGenH.Append('#define %s  LibPcdGet%s(%s)\n' %
+                            (GetModeName, DatumSizeLib, PcdTokenName))
+            AutoGenH.Append('#define %s  LibPcdGetSize(%s)\n' %
+                            (GetModeSizeName, PcdTokenName))
             if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES:
-                AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPcdSet%s(%s, (SizeOfBuffer), (Buffer))\n' %(SetModeName, DatumSizeLib, PcdTokenName))
-                AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPcdSet%sS(%s, (SizeOfBuffer), (Buffer))\n' % (SetModeStatusName, DatumSizeLib, PcdTokenName))
+                AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPcdSet%s(%s, (SizeOfBuffer), (Buffer))\n' % (
+                    SetModeName, DatumSizeLib, PcdTokenName))
+                AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPcdSet%sS(%s, (SizeOfBuffer), (Buffer))\n' % (
+                    SetModeStatusName, DatumSizeLib, PcdTokenName))
             else:
-                AutoGenH.Append('#define %s(Value)  LibPcdSet%s(%s, (Value))\n' % (SetModeName, DatumSizeLib, PcdTokenName))
-                AutoGenH.Append('#define %s(Value)  LibPcdSet%sS(%s, (Value))\n' % (SetModeStatusName, DatumSizeLib, PcdTokenName))
+                AutoGenH.Append('#define %s(Value)  LibPcdSet%s(%s, (Value))\n' % (
+                    SetModeName, DatumSizeLib, PcdTokenName))
+                AutoGenH.Append('#define %s(Value)  LibPcdSet%sS(%s, (Value))\n' % (
+                    SetModeStatusName, DatumSizeLib, PcdTokenName))
     else:
-        PcdVariableName = '_gPcd_' + gItemTypeStringDatabase[Pcd.Type] + '_' + TokenCName
+        PcdVariableName = '_gPcd_' + \
+            gItemTypeStringDatabase[Pcd.Type] + '_' + TokenCName
         Const = 'const'
         if Pcd.Type == TAB_PCDS_PATCHABLE_IN_MODULE:
             Const = ''
@@ -1007,18 +1050,21 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
                     Value = Value[:-1]
                 if Value.startswith('0') and not Value.lower().startswith('0x') and len(Value) > 1 and Value.lstrip('0'):
                     Value = Value.lstrip('0')
-                ValueNumber = int (Value, 0)
+                ValueNumber = int(Value, 0)
             except:
                 EdkLogger.error("build", AUTOGEN_ERROR,
-                                "PCD value is not valid dec or hex number for datum type [%s] of PCD %s.%s" % (Pcd.DatumType, Pcd.TokenSpaceGuidCName, TokenCName),
+                                "PCD value is not valid dec or hex number for datum type [%s] of PCD %s.%s" % (
+                                    Pcd.DatumType, Pcd.TokenSpaceGuidCName, TokenCName),
                                 ExtraData="[%s]" % str(Info))
             if ValueNumber < 0:
                 EdkLogger.error("build", AUTOGEN_ERROR,
-                                "PCD can't be set to negative value for datum type [%s] of PCD %s.%s" % (Pcd.DatumType, Pcd.TokenSpaceGuidCName, TokenCName),
+                                "PCD can't be set to negative value for datum type [%s] of PCD %s.%s" % (
+                                    Pcd.DatumType, Pcd.TokenSpaceGuidCName, TokenCName),
                                 ExtraData="[%s]" % str(Info))
             elif ValueNumber > MAX_VAL_TYPE[Pcd.DatumType]:
                 EdkLogger.error("build", AUTOGEN_ERROR,
-                                "Too large PCD value for datum type [%s] of PCD %s.%s" % (Pcd.DatumType, Pcd.TokenSpaceGuidCName, TokenCName),
+                                "Too large PCD value for datum type [%s] of PCD %s.%s" % (
+                                    Pcd.DatumType, Pcd.TokenSpaceGuidCName, TokenCName),
                                 ExtraData="[%s]" % str(Info))
             if Pcd.DatumType == TAB_UINT64 and not Value.endswith('ULL'):
                 Value += 'ULL'
@@ -1028,7 +1074,8 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
         if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES:
             if not Pcd.MaxDatumSize:
                 EdkLogger.error("build", AUTOGEN_ERROR,
-                                "Unknown [MaxDatumSize] of PCD [%s.%s]" % (Pcd.TokenSpaceGuidCName, TokenCName),
+                                "Unknown [MaxDatumSize] of PCD [%s.%s]" % (
+                                    Pcd.TokenSpaceGuidCName, TokenCName),
                                 ExtraData="[%s]" % str(Info))
 
             ArraySize = int(Pcd.MaxDatumSize, 0)
@@ -1038,23 +1085,26 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
             else:
                 if Value[0] == 'L':
                     Unicode = True
-                Value = Value.lstrip('L')   #.strip('"')
+                Value = Value.lstrip('L')  # .strip('"')
                 Value = eval(Value)         # translate escape character
                 ValueSize = len(Value) + 1
                 NewValue = '{'
                 for Index in range(0, len(Value)):
                     if Unicode:
-                        NewValue = NewValue + str(ord(Value[Index]) % 0x10000) + ', '
+                        NewValue = NewValue + \
+                            str(ord(Value[Index]) % 0x10000) + ', '
                     else:
-                        NewValue = NewValue + str(ord(Value[Index]) % 0x100) + ', '
+                        NewValue = NewValue + \
+                            str(ord(Value[Index]) % 0x100) + ', '
                 if Unicode:
                     ArraySize = ArraySize // 2
                 Value = NewValue + '0 }'
             if ArraySize < ValueSize:
                 if Pcd.MaxSizeUserSet:
                     EdkLogger.error("build", AUTOGEN_ERROR,
-                                "The maximum size of VOID* type PCD '%s.%s' is less than its actual size occupied." % (Pcd.TokenSpaceGuidCName, TokenCName),
-                                ExtraData="[%s]" % str(Info))
+                                    "The maximum size of VOID* type PCD '%s.%s' is less than its actual size occupied." % (
+                                        Pcd.TokenSpaceGuidCName, TokenCName),
+                                    ExtraData="[%s]" % str(Info))
                 else:
                     ArraySize = Pcd.GetPcdSize()
                     if Unicode:
@@ -1076,76 +1126,108 @@ def CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd):
             #
             # For unicode, UINT16 array will be generated, so the alignment of unicode is guaranteed.
             #
-            AutoGenH.Append('#define %s  %s%s\n' %(PcdValueName, Type, PcdVariableName))
+            AutoGenH.Append('#define %s  %s%s\n' %
+                            (PcdValueName, Type, PcdVariableName))
             if Unicode:
-                AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED %s UINT16 %s%s = %s;\n' % (Const, PcdVariableName, Array, Value))
-                AutoGenH.Append('extern %s UINT16 %s%s;\n' %(Const, PcdVariableName, Array))
+                AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED %s UINT16 %s%s = %s;\n' % (
+                    Const, PcdVariableName, Array, Value))
+                AutoGenH.Append('extern %s UINT16 %s%s;\n' %
+                                (Const, PcdVariableName, Array))
             else:
-                AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED %s UINT8 %s%s = %s;\n' % (Const, PcdVariableName, Array, Value))
-                AutoGenH.Append('extern %s UINT8 %s%s;\n' %(Const, PcdVariableName, Array))
-            AutoGenH.Append('#define %s  %s%s\n' %(GetModeName, Type, PcdVariableName))
+                AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED %s UINT8 %s%s = %s;\n' % (
+                    Const, PcdVariableName, Array, Value))
+                AutoGenH.Append('extern %s UINT8 %s%s;\n' %
+                                (Const, PcdVariableName, Array))
+            AutoGenH.Append('#define %s  %s%s\n' %
+                            (GetModeName, Type, PcdVariableName))
 
             PcdDataSize = Pcd.GetPcdSize()
             if Pcd.Type == TAB_PCDS_FIXED_AT_BUILD:
-                AutoGenH.Append('#define %s %s\n' % (FixPcdSizeTokenName, PcdDataSize))
-                AutoGenH.Append('#define %s  %s \n' % (GetModeSizeName, FixPcdSizeTokenName))
-                AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED const UINTN %s = %s;\n' % (FixedPcdSizeVariableName, PcdDataSize))
+                AutoGenH.Append('#define %s %s\n' %
+                                (FixPcdSizeTokenName, PcdDataSize))
+                AutoGenH.Append('#define %s  %s \n' %
+                                (GetModeSizeName, FixPcdSizeTokenName))
+                AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED const UINTN %s = %s;\n' % (
+                    FixedPcdSizeVariableName, PcdDataSize))
             if Pcd.Type == TAB_PCDS_PATCHABLE_IN_MODULE:
-                AutoGenH.Append('#define %s %s\n' % (PatchPcdSizeTokenName, Pcd.MaxDatumSize))
-                AutoGenH.Append('#define %s  %s \n' % (GetModeSizeName, PatchPcdSizeVariableName))
-                AutoGenH.Append('extern UINTN %s; \n' % PatchPcdSizeVariableName)
-                AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED UINTN %s = %s;\n' % (PatchPcdSizeVariableName, PcdDataSize))
-                AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED const UINTN %s = %s;\n' % (PatchPcdMaxSizeVariable, Pcd.MaxDatumSize))
+                AutoGenH.Append('#define %s %s\n' %
+                                (PatchPcdSizeTokenName, Pcd.MaxDatumSize))
+                AutoGenH.Append('#define %s  %s \n' %
+                                (GetModeSizeName, PatchPcdSizeVariableName))
+                AutoGenH.Append('extern UINTN %s; \n' %
+                                PatchPcdSizeVariableName)
+                AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED UINTN %s = %s;\n' % (
+                    PatchPcdSizeVariableName, PcdDataSize))
+                AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED const UINTN %s = %s;\n' % (
+                    PatchPcdMaxSizeVariable, Pcd.MaxDatumSize))
         elif Pcd.Type == TAB_PCDS_PATCHABLE_IN_MODULE:
-            AutoGenH.Append('#define %s  %s\n' %(PcdValueName, Value))
-            AutoGenC.Append('volatile %s %s %s = %s;\n' %(Const, Pcd.DatumType, PcdVariableName, PcdValueName))
-            AutoGenH.Append('extern volatile %s  %s  %s%s;\n' % (Const, Pcd.DatumType, PcdVariableName, Array))
-            AutoGenH.Append('#define %s  %s%s\n' % (GetModeName, Type, PcdVariableName))
+            AutoGenH.Append('#define %s  %s\n' % (PcdValueName, Value))
+            AutoGenC.Append('volatile %s %s %s = %s;\n' % (
+                Const, Pcd.DatumType, PcdVariableName, PcdValueName))
+            AutoGenH.Append('extern volatile %s  %s  %s%s;\n' %
+                            (Const, Pcd.DatumType, PcdVariableName, Array))
+            AutoGenH.Append('#define %s  %s%s\n' %
+                            (GetModeName, Type, PcdVariableName))
 
             PcdDataSize = Pcd.GetPcdSize()
-            AutoGenH.Append('#define %s %s\n' % (PatchPcdSizeTokenName, PcdDataSize))
+            AutoGenH.Append('#define %s %s\n' %
+                            (PatchPcdSizeTokenName, PcdDataSize))
 
-            AutoGenH.Append('#define %s  %s \n' % (GetModeSizeName, PatchPcdSizeVariableName))
+            AutoGenH.Append('#define %s  %s \n' %
+                            (GetModeSizeName, PatchPcdSizeVariableName))
             AutoGenH.Append('extern UINTN %s; \n' % PatchPcdSizeVariableName)
-            AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED UINTN %s = %s;\n' % (PatchPcdSizeVariableName, PcdDataSize))
+            AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED UINTN %s = %s;\n' % (
+                PatchPcdSizeVariableName, PcdDataSize))
         else:
             PcdDataSize = Pcd.GetPcdSize()
-            AutoGenH.Append('#define %s %s\n' % (FixPcdSizeTokenName, PcdDataSize))
-            AutoGenH.Append('#define %s  %s \n' % (GetModeSizeName, FixPcdSizeTokenName))
+            AutoGenH.Append('#define %s %s\n' %
+                            (FixPcdSizeTokenName, PcdDataSize))
+            AutoGenH.Append('#define %s  %s \n' %
+                            (GetModeSizeName, FixPcdSizeTokenName))
 
-            AutoGenH.Append('#define %s  %s\n' %(PcdValueName, Value))
-            AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED %s %s %s = %s;\n' %(Const, Pcd.DatumType, PcdVariableName, PcdValueName))
-            AutoGenH.Append('extern %s  %s  %s%s;\n' % (Const, Pcd.DatumType, PcdVariableName, Array))
-            AutoGenH.Append('#define %s  %s%s\n' % (GetModeName, Type, PcdVariableName))
+            AutoGenH.Append('#define %s  %s\n' % (PcdValueName, Value))
+            AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED %s %s %s = %s;\n' % (
+                Const, Pcd.DatumType, PcdVariableName, PcdValueName))
+            AutoGenH.Append('extern %s  %s  %s%s;\n' %
+                            (Const, Pcd.DatumType, PcdVariableName, Array))
+            AutoGenH.Append('#define %s  %s%s\n' %
+                            (GetModeName, Type, PcdVariableName))
 
         if Pcd.Type == TAB_PCDS_PATCHABLE_IN_MODULE:
             if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES:
-                AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPatchPcdSetPtrAndSize((VOID *)_gPcd_BinaryPatch_%s, &_gPcd_BinaryPatch_Size_%s, (UINTN)_PCD_PATCHABLE_%s_SIZE, (SizeOfBuffer), (Buffer))\n' % (SetModeName, Pcd.TokenCName, Pcd.TokenCName, Pcd.TokenCName))
-                AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPatchPcdSetPtrAndSizeS((VOID *)_gPcd_BinaryPatch_%s, &_gPcd_BinaryPatch_Size_%s, (UINTN)_PCD_PATCHABLE_%s_SIZE, (SizeOfBuffer), (Buffer))\n' % (SetModeStatusName, Pcd.TokenCName, Pcd.TokenCName, Pcd.TokenCName))
+                AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPatchPcdSetPtrAndSize((VOID *)_gPcd_BinaryPatch_%s, &_gPcd_BinaryPatch_Size_%s, (UINTN)_PCD_PATCHABLE_%s_SIZE, (SizeOfBuffer), (Buffer))\n' %
+                                (SetModeName, Pcd.TokenCName, Pcd.TokenCName, Pcd.TokenCName))
+                AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPatchPcdSetPtrAndSizeS((VOID *)_gPcd_BinaryPatch_%s, &_gPcd_BinaryPatch_Size_%s, (UINTN)_PCD_PATCHABLE_%s_SIZE, (SizeOfBuffer), (Buffer))\n' %
+                                (SetModeStatusName, Pcd.TokenCName, Pcd.TokenCName, Pcd.TokenCName))
             else:
-                AutoGenH.Append('#define %s(Value)  (%s = (Value))\n' % (SetModeName, PcdVariableName))
-                AutoGenH.Append('#define %s(Value)  ((%s = (Value)), RETURN_SUCCESS) \n' % (SetModeStatusName, PcdVariableName))
+                AutoGenH.Append('#define %s(Value)  (%s = (Value))\n' %
+                                (SetModeName, PcdVariableName))
+                AutoGenH.Append('#define %s(Value)  ((%s = (Value)), RETURN_SUCCESS) \n' % (
+                    SetModeStatusName, PcdVariableName))
         else:
-            AutoGenH.Append('//#define %s  ASSERT(FALSE)  // It is not allowed to set value for a FIXED_AT_BUILD PCD\n' % SetModeName)
+            AutoGenH.Append(
+                '//#define %s  ASSERT(FALSE)  // It is not allowed to set value for a FIXED_AT_BUILD PCD\n' % SetModeName)
 
-## Create code for library module PCDs
+# Create code for library module PCDs
 #
 #   @param      Info        The ModuleAutoGen object
 #   @param      AutoGenC    The TemplateString object for C code
 #   @param      AutoGenH    The TemplateString object for header file
 #   @param      Pcd         The PCD object
 #
+
+
 def CreateLibraryPcdCode(Info, AutoGenC, AutoGenH, Pcd):
     PcdTokenNumber = Info.PlatformInfo.PcdTokenNumber
     TokenSpaceGuidCName = Pcd.TokenSpaceGuidCName
-    TokenCName  = Pcd.TokenCName
+    TokenCName = Pcd.TokenCName
     for PcdItem in GlobalData.MixedPcd:
         if (TokenCName, TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]:
             TokenCName = PcdItem[0]
             break
     PcdTokenName = '_PCD_TOKEN_' + TokenCName
     FixPcdSizeTokenName = '_PCD_SIZE_' + TokenCName
-    PatchPcdSizeTokenName = '_PCD_PATCHABLE_' + TokenCName +'_SIZE'
+    PatchPcdSizeTokenName = '_PCD_PATCHABLE_' + TokenCName + '_SIZE'
     PatchPcdSizeVariableName = '_gPcd_BinaryPatch_Size_' + TokenCName
     PatchPcdMaxSizeVariable = '_gPcd_BinaryPatch_MaxSize_' + TokenCName
     FixedPcdSizeVariableName = '_gPcd_FixedAtBuild_Size_' + TokenCName
@@ -1173,28 +1255,37 @@ def CreateLibraryPcdCode(Info, AutoGenC, AutoGenH, Pcd):
                 TokenNumber = 0
             else:
                 EdkLogger.error("build", AUTOGEN_ERROR,
-                                "No generated token number for %s.%s\n" % (Pcd.TokenSpaceGuidCName, TokenCName),
+                                "No generated token number for %s.%s\n" % (
+                                    Pcd.TokenSpaceGuidCName, TokenCName),
                                 ExtraData="[%s]" % str(Info))
         else:
-            TokenNumber = PcdTokenNumber[Pcd.TokenCName, Pcd.TokenSpaceGuidCName]
+            TokenNumber = PcdTokenNumber[Pcd.TokenCName,
+                                         Pcd.TokenSpaceGuidCName]
 
     if Pcd.Type not in gItemTypeStringDatabase:
         EdkLogger.error("build", AUTOGEN_ERROR,
-                        "Unknown PCD type [%s] of PCD %s.%s" % (Pcd.Type, Pcd.TokenSpaceGuidCName, TokenCName),
+                        "Unknown PCD type [%s] of PCD %s.%s" % (
+                            Pcd.Type, Pcd.TokenSpaceGuidCName, TokenCName),
                         ExtraData="[%s]" % str(Info))
 
-    DatumType   = Pcd.DatumType
+    DatumType = Pcd.DatumType
     DatumSize = gDatumSizeStringDatabase[Pcd.DatumType] if Pcd.DatumType in gDatumSizeStringDatabase else gDatumSizeStringDatabase[TAB_VOID]
     DatumSizeLib = gDatumSizeStringDatabaseLib[Pcd.DatumType] if Pcd.DatumType in gDatumSizeStringDatabaseLib else gDatumSizeStringDatabaseLib[TAB_VOID]
-    GetModeName = '_PCD_GET_MODE_' + gDatumSizeStringDatabaseH[Pcd.DatumType] + '_' + TokenCName if Pcd.DatumType in gDatumSizeStringDatabaseH else '_PCD_GET_MODE_' + gDatumSizeStringDatabaseH[TAB_VOID] + '_' + TokenCName
-    SetModeName = '_PCD_SET_MODE_' + gDatumSizeStringDatabaseH[Pcd.DatumType] + '_' + TokenCName if Pcd.DatumType in gDatumSizeStringDatabaseH else '_PCD_SET_MODE_' + gDatumSizeStringDatabaseH[TAB_VOID] + '_' + TokenCName
-    SetModeStatusName = '_PCD_SET_MODE_' + gDatumSizeStringDatabaseH[Pcd.DatumType] + '_S_' + TokenCName if Pcd.DatumType in gDatumSizeStringDatabaseH else '_PCD_SET_MODE_' + gDatumSizeStringDatabaseH[TAB_VOID] + '_S_' + TokenCName
+    GetModeName = '_PCD_GET_MODE_' + gDatumSizeStringDatabaseH[Pcd.DatumType] + '_' + \
+        TokenCName if Pcd.DatumType in gDatumSizeStringDatabaseH else '_PCD_GET_MODE_' + \
+        gDatumSizeStringDatabaseH[TAB_VOID] + '_' + TokenCName
+    SetModeName = '_PCD_SET_MODE_' + gDatumSizeStringDatabaseH[Pcd.DatumType] + '_' + \
+        TokenCName if Pcd.DatumType in gDatumSizeStringDatabaseH else '_PCD_SET_MODE_' + \
+        gDatumSizeStringDatabaseH[TAB_VOID] + '_' + TokenCName
+    SetModeStatusName = '_PCD_SET_MODE_' + gDatumSizeStringDatabaseH[Pcd.DatumType] + '_S_' + \
+        TokenCName if Pcd.DatumType in gDatumSizeStringDatabaseH else '_PCD_SET_MODE_' + \
+        gDatumSizeStringDatabaseH[TAB_VOID] + '_S_' + TokenCName
     GetModeSizeName = '_PCD_GET_MODE_SIZE' + '_' + TokenCName
 
     Type = ''
     Array = ''
     if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES:
-        if Pcd.DefaultValue[0]== '{':
+        if Pcd.DefaultValue[0] == '{':
             Type = '(VOID *)'
         Array = '[]'
     PcdItemType = Pcd.Type
@@ -1217,28 +1308,44 @@ def CreateLibraryPcdCode(Info, AutoGenC, AutoGenH, Pcd):
         # If only PcdGet/Set used in all Pcds with different CName, it should succeed to build.
         # If PcdGet/Set used in the Pcds with different Guids but same CName, it should failed to build.
         if PcdExCNameTest > 1:
-            AutoGenH.Append('// Disabled the macros, as PcdToken and PcdGet/Set are not allowed in the case that more than one DynamicEx Pcds are different Guids but same CName.\n')
-            AutoGenH.Append('// #define %s  %s\n' % (PcdTokenName, PcdExTokenName))
-            AutoGenH.Append('// #define %s  LibPcdGetEx%s(&%s, %s)\n' % (GetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
-            AutoGenH.Append('// #define %s  LibPcdGetExSize(&%s, %s)\n' % (GetModeSizeName, Pcd.TokenSpaceGuidCName, PcdTokenName))
+            AutoGenH.Append(
+                '// Disabled the macros, as PcdToken and PcdGet/Set are not allowed in the case that more than one DynamicEx Pcds are different Guids but same CName.\n')
+            AutoGenH.Append('// #define %s  %s\n' %
+                            (PcdTokenName, PcdExTokenName))
+            AutoGenH.Append('// #define %s  LibPcdGetEx%s(&%s, %s)\n' %
+                            (GetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
+            AutoGenH.Append('// #define %s  LibPcdGetExSize(&%s, %s)\n' %
+                            (GetModeSizeName, Pcd.TokenSpaceGuidCName, PcdTokenName))
             if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES:
-                AutoGenH.Append('// #define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%s(&%s, %s, (SizeOfBuffer), (Buffer))\n' % (SetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
-                AutoGenH.Append('// #define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%sS(&%s, %s, (SizeOfBuffer), (Buffer))\n' % (SetModeStatusName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
+                AutoGenH.Append('// #define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%s(&%s, %s, (SizeOfBuffer), (Buffer))\n' %
+                                (SetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
+                AutoGenH.Append('// #define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%sS(&%s, %s, (SizeOfBuffer), (Buffer))\n' %
+                                (SetModeStatusName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
             else:
-                AutoGenH.Append('// #define %s(Value)  LibPcdSetEx%s(&%s, %s, (Value))\n' % (SetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
-                AutoGenH.Append('// #define %s(Value)  LibPcdSetEx%sS(&%s, %s, (Value))\n' % (SetModeStatusName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
+                AutoGenH.Append('// #define %s(Value)  LibPcdSetEx%s(&%s, %s, (Value))\n' %
+                                (SetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
+                AutoGenH.Append('// #define %s(Value)  LibPcdSetEx%sS(&%s, %s, (Value))\n' % (
+                    SetModeStatusName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
         else:
-            AutoGenH.Append('#define %s  %s\n' % (PcdTokenName, PcdExTokenName))
-            AutoGenH.Append('#define %s  LibPcdGetEx%s(&%s, %s)\n' % (GetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
-            AutoGenH.Append('#define %s LibPcdGetExSize(&%s, %s)\n' % (GetModeSizeName, Pcd.TokenSpaceGuidCName, PcdTokenName))
+            AutoGenH.Append('#define %s  %s\n' %
+                            (PcdTokenName, PcdExTokenName))
+            AutoGenH.Append('#define %s  LibPcdGetEx%s(&%s, %s)\n' % (
+                GetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
+            AutoGenH.Append('#define %s LibPcdGetExSize(&%s, %s)\n' % (
+                GetModeSizeName, Pcd.TokenSpaceGuidCName, PcdTokenName))
             if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES:
-                AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%s(&%s, %s, (SizeOfBuffer), (Buffer))\n' % (SetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
-                AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%sS(&%s, %s, (SizeOfBuffer), (Buffer))\n' % (SetModeStatusName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
+                AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%s(&%s, %s, (SizeOfBuffer), (Buffer))\n' % (
+                    SetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
+                AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPcdSetEx%sS(&%s, %s, (SizeOfBuffer), (Buffer))\n' % (
+                    SetModeStatusName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
             else:
-                AutoGenH.Append('#define %s(Value)  LibPcdSetEx%s(&%s, %s, (Value))\n' % (SetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
-                AutoGenH.Append('#define %s(Value)  LibPcdSetEx%sS(&%s, %s, (Value))\n' % (SetModeStatusName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
+                AutoGenH.Append('#define %s(Value)  LibPcdSetEx%s(&%s, %s, (Value))\n' % (
+                    SetModeName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
+                AutoGenH.Append('#define %s(Value)  LibPcdSetEx%sS(&%s, %s, (Value))\n' % (
+                    SetModeStatusName, DatumSizeLib, Pcd.TokenSpaceGuidCName, PcdTokenName))
     else:
-        AutoGenH.Append('#define _PCD_TOKEN_%s  %dU\n' % (TokenCName, TokenNumber))
+        AutoGenH.Append('#define _PCD_TOKEN_%s  %dU\n' %
+                        (TokenCName, TokenNumber))
     if PcdItemType in PCD_DYNAMIC_TYPE_SET:
         PcdList = []
         PcdCNameList = []
@@ -1248,51 +1355,75 @@ def CreateLibraryPcdCode(Info, AutoGenC, AutoGenH, Pcd):
             if PcdModule.Type in PCD_DYNAMIC_TYPE_SET:
                 PcdCNameList.append(PcdModule.TokenCName)
         if PcdCNameList.count(Pcd.TokenCName) > 1:
-            EdkLogger.error("build", AUTOGEN_ERROR, "More than one Dynamic Pcds [%s] are different Guids but same CName.They need to be changed to DynamicEx type to avoid the confliction.\n" % (TokenCName), ExtraData="[%s]" % str(Info.MetaFile.Path))
+            EdkLogger.error("build", AUTOGEN_ERROR, "More than one Dynamic Pcds [%s] are different Guids but same CName.They need to be changed to DynamicEx type to avoid the confliction.\n" % (
+                TokenCName), ExtraData="[%s]" % str(Info.MetaFile.Path))
         else:
-            AutoGenH.Append('#define %s  LibPcdGet%s(%s)\n' % (GetModeName, DatumSizeLib, PcdTokenName))
-            AutoGenH.Append('#define %s  LibPcdGetSize(%s)\n' % (GetModeSizeName, PcdTokenName))
+            AutoGenH.Append('#define %s  LibPcdGet%s(%s)\n' %
+                            (GetModeName, DatumSizeLib, PcdTokenName))
+            AutoGenH.Append('#define %s  LibPcdGetSize(%s)\n' %
+                            (GetModeSizeName, PcdTokenName))
             if DatumType not in TAB_PCD_NUMERIC_TYPES:
-                AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPcdSet%s(%s, (SizeOfBuffer), (Buffer))\n' %(SetModeName, DatumSizeLib, PcdTokenName))
-                AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPcdSet%sS(%s, (SizeOfBuffer), (Buffer))\n' % (SetModeStatusName, DatumSizeLib, PcdTokenName))
+                AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPcdSet%s(%s, (SizeOfBuffer), (Buffer))\n' % (
+                    SetModeName, DatumSizeLib, PcdTokenName))
+                AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPcdSet%sS(%s, (SizeOfBuffer), (Buffer))\n' % (
+                    SetModeStatusName, DatumSizeLib, PcdTokenName))
             else:
-                AutoGenH.Append('#define %s(Value)  LibPcdSet%s(%s, (Value))\n' % (SetModeName, DatumSizeLib, PcdTokenName))
-                AutoGenH.Append('#define %s(Value)  LibPcdSet%sS(%s, (Value))\n' % (SetModeStatusName, DatumSizeLib, PcdTokenName))
+                AutoGenH.Append('#define %s(Value)  LibPcdSet%s(%s, (Value))\n' % (
+                    SetModeName, DatumSizeLib, PcdTokenName))
+                AutoGenH.Append('#define %s(Value)  LibPcdSet%sS(%s, (Value))\n' % (
+                    SetModeStatusName, DatumSizeLib, PcdTokenName))
     if PcdItemType == TAB_PCDS_PATCHABLE_IN_MODULE:
-        PcdVariableName = '_gPcd_' + gItemTypeStringDatabase[TAB_PCDS_PATCHABLE_IN_MODULE] + '_' + TokenCName
+        PcdVariableName = '_gPcd_' + \
+            gItemTypeStringDatabase[TAB_PCDS_PATCHABLE_IN_MODULE] + \
+            '_' + TokenCName
         if DatumType not in TAB_PCD_NUMERIC_TYPES:
             if DatumType == TAB_VOID and Array == '[]':
                 DatumType = [TAB_UINT8, TAB_UINT16][Pcd.DefaultValue[0] == 'L']
             else:
                 DatumType = TAB_UINT8
-            AutoGenH.Append('extern %s _gPcd_BinaryPatch_%s%s;\n' %(DatumType, TokenCName, Array))
+            AutoGenH.Append('extern %s _gPcd_BinaryPatch_%s%s;\n' %
+                            (DatumType, TokenCName, Array))
         else:
-            AutoGenH.Append('extern volatile  %s  %s%s;\n' % (DatumType, PcdVariableName, Array))
-        AutoGenH.Append('#define %s  %s_gPcd_BinaryPatch_%s\n' %(GetModeName, Type, TokenCName))
+            AutoGenH.Append('extern volatile  %s  %s%s;\n' %
+                            (DatumType, PcdVariableName, Array))
+        AutoGenH.Append('#define %s  %s_gPcd_BinaryPatch_%s\n' %
+                        (GetModeName, Type, TokenCName))
         PcdDataSize = Pcd.GetPcdSize()
         if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES:
-            AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPatchPcdSetPtrAndSize((VOID *)_gPcd_BinaryPatch_%s, &%s, %s, (SizeOfBuffer), (Buffer))\n' % (SetModeName, TokenCName, PatchPcdSizeVariableName, PatchPcdMaxSizeVariable))
-            AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPatchPcdSetPtrAndSizeS((VOID *)_gPcd_BinaryPatch_%s, &%s, %s, (SizeOfBuffer), (Buffer))\n' % (SetModeStatusName, TokenCName, PatchPcdSizeVariableName, PatchPcdMaxSizeVariable))
-            AutoGenH.Append('#define %s %s\n' % (PatchPcdSizeTokenName, PatchPcdMaxSizeVariable))
-            AutoGenH.Append('extern const UINTN %s; \n' % PatchPcdMaxSizeVariable)
+            AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPatchPcdSetPtrAndSize((VOID *)_gPcd_BinaryPatch_%s, &%s, %s, (SizeOfBuffer), (Buffer))\n' %
+                            (SetModeName, TokenCName, PatchPcdSizeVariableName, PatchPcdMaxSizeVariable))
+            AutoGenH.Append('#define %s(SizeOfBuffer, Buffer)  LibPatchPcdSetPtrAndSizeS((VOID *)_gPcd_BinaryPatch_%s, &%s, %s, (SizeOfBuffer), (Buffer))\n' %
+                            (SetModeStatusName, TokenCName, PatchPcdSizeVariableName, PatchPcdMaxSizeVariable))
+            AutoGenH.Append('#define %s %s\n' %
+                            (PatchPcdSizeTokenName, PatchPcdMaxSizeVariable))
+            AutoGenH.Append('extern const UINTN %s; \n' %
+                            PatchPcdMaxSizeVariable)
         else:
-            AutoGenH.Append('#define %s(Value)  (%s = (Value))\n' % (SetModeName, PcdVariableName))
-            AutoGenH.Append('#define %s(Value)  ((%s = (Value)), RETURN_SUCCESS)\n' % (SetModeStatusName, PcdVariableName))
-            AutoGenH.Append('#define %s %s\n' % (PatchPcdSizeTokenName, PcdDataSize))
+            AutoGenH.Append('#define %s(Value)  (%s = (Value))\n' %
+                            (SetModeName, PcdVariableName))
+            AutoGenH.Append('#define %s(Value)  ((%s = (Value)), RETURN_SUCCESS)\n' % (
+                SetModeStatusName, PcdVariableName))
+            AutoGenH.Append('#define %s %s\n' %
+                            (PatchPcdSizeTokenName, PcdDataSize))
 
-        AutoGenH.Append('#define %s %s\n' % (GetModeSizeName, PatchPcdSizeVariableName))
+        AutoGenH.Append('#define %s %s\n' %
+                        (GetModeSizeName, PatchPcdSizeVariableName))
         AutoGenH.Append('extern UINTN %s; \n' % PatchPcdSizeVariableName)
 
     if PcdItemType == TAB_PCDS_FIXED_AT_BUILD or PcdItemType == TAB_PCDS_FEATURE_FLAG:
         key = ".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
-        PcdVariableName = '_gPcd_' + gItemTypeStringDatabase[Pcd.Type] + '_' + TokenCName
+        PcdVariableName = '_gPcd_' + \
+            gItemTypeStringDatabase[Pcd.Type] + '_' + TokenCName
         if DatumType == TAB_VOID and Array == '[]':
             DatumType = [TAB_UINT8, TAB_UINT16][Pcd.DefaultValue[0] == 'L']
         if DatumType not in TAB_PCD_NUMERIC_TYPES_VOID:
             DatumType = TAB_UINT8
-        AutoGenH.Append('extern const %s _gPcd_FixedAtBuild_%s%s;\n' %(DatumType, TokenCName, Array))
-        AutoGenH.Append('#define %s  %s_gPcd_FixedAtBuild_%s\n' %(GetModeName, Type, TokenCName))
-        AutoGenH.Append('//#define %s  ASSERT(FALSE)  // It is not allowed to set value for a FIXED_AT_BUILD PCD\n' % SetModeName)
+        AutoGenH.Append('extern const %s _gPcd_FixedAtBuild_%s%s;\n' %
+                        (DatumType, TokenCName, Array))
+        AutoGenH.Append('#define %s  %s_gPcd_FixedAtBuild_%s\n' %
+                        (GetModeName, Type, TokenCName))
+        AutoGenH.Append(
+            '//#define %s  ASSERT(FALSE)  // It is not allowed to set value for a FIXED_AT_BUILD PCD\n' % SetModeName)
 
         ConstFixedPcd = False
         if PcdItemType == TAB_PCDS_FIXED_AT_BUILD and (key in Info.ConstPcd or (Info.IsLibrary and not Info.ReferenceModules)):
@@ -1300,29 +1431,40 @@ def CreateLibraryPcdCode(Info, AutoGenC, AutoGenH, Pcd):
             if key in Info.ConstPcd:
                 Pcd.DefaultValue = Info.ConstPcd[key]
             if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES:
-                AutoGenH.Append('#define _PCD_VALUE_%s %s%s\n' %(TokenCName, Type, PcdVariableName))
+                AutoGenH.Append('#define _PCD_VALUE_%s %s%s\n' %
+                                (TokenCName, Type, PcdVariableName))
             else:
-                AutoGenH.Append('#define _PCD_VALUE_%s %s\n' %(TokenCName, Pcd.DefaultValue))
+                AutoGenH.Append('#define _PCD_VALUE_%s %s\n' %
+                                (TokenCName, Pcd.DefaultValue))
         PcdDataSize = Pcd.GetPcdSize()
         if PcdItemType == TAB_PCDS_FIXED_AT_BUILD:
             if Pcd.DatumType not in TAB_PCD_NUMERIC_TYPES:
                 if ConstFixedPcd:
-                    AutoGenH.Append('#define %s %s\n' % (FixPcdSizeTokenName, PcdDataSize))
-                    AutoGenH.Append('#define %s %s\n' % (GetModeSizeName, FixPcdSizeTokenName))
+                    AutoGenH.Append('#define %s %s\n' %
+                                    (FixPcdSizeTokenName, PcdDataSize))
+                    AutoGenH.Append('#define %s %s\n' %
+                                    (GetModeSizeName, FixPcdSizeTokenName))
                 else:
-                    AutoGenH.Append('#define %s %s\n' % (GetModeSizeName, FixedPcdSizeVariableName))
-                    AutoGenH.Append('#define %s %s\n' % (FixPcdSizeTokenName, FixedPcdSizeVariableName))
-                    AutoGenH.Append('extern const UINTN %s; \n' % FixedPcdSizeVariableName)
+                    AutoGenH.Append('#define %s %s\n' %
+                                    (GetModeSizeName, FixedPcdSizeVariableName))
+                    AutoGenH.Append('#define %s %s\n' % (
+                        FixPcdSizeTokenName, FixedPcdSizeVariableName))
+                    AutoGenH.Append('extern const UINTN %s; \n' %
+                                    FixedPcdSizeVariableName)
             else:
-                AutoGenH.Append('#define %s %s\n' % (FixPcdSizeTokenName, PcdDataSize))
-                AutoGenH.Append('#define %s %s\n' % (GetModeSizeName, FixPcdSizeTokenName))
+                AutoGenH.Append('#define %s %s\n' %
+                                (FixPcdSizeTokenName, PcdDataSize))
+                AutoGenH.Append('#define %s %s\n' %
+                                (GetModeSizeName, FixPcdSizeTokenName))
 
-## Create code for library constructor
+# Create code for library constructor
 #
 #   @param      Info        The ModuleAutoGen object
 #   @param      AutoGenC    The TemplateString object for C code
 #   @param      AutoGenH    The TemplateString object for header file
 #
+
+
 def CreateLibraryConstructorCode(Info, AutoGenC, AutoGenH):
     #
     # Library Constructors
@@ -1336,21 +1478,29 @@ def CreateLibraryConstructorCode(Info, AutoGenC, AutoGenH):
     for Lib in DependentLibraryList:
         if len(Lib.ConstructorList) <= 0:
             continue
-        Dict = {'Function':Lib.ConstructorList}
+        Dict = {'Function': Lib.ConstructorList}
         if Lib.ModuleType in [SUP_MODULE_BASE, SUP_MODULE_SEC]:
-            ConstructorPrototypeString.Append(gLibraryStructorPrototype[SUP_MODULE_BASE].Replace(Dict))
-            ConstructorCallingString.Append(gLibraryStructorCall[SUP_MODULE_BASE].Replace(Dict))
+            ConstructorPrototypeString.Append(
+                gLibraryStructorPrototype[SUP_MODULE_BASE].Replace(Dict))
+            ConstructorCallingString.Append(
+                gLibraryStructorCall[SUP_MODULE_BASE].Replace(Dict))
         if Info.ModuleType not in [SUP_MODULE_BASE, SUP_MODULE_USER_DEFINED, SUP_MODULE_HOST_APPLICATION]:
             if Lib.ModuleType in SUP_MODULE_SET_PEI:
-                ConstructorPrototypeString.Append(gLibraryStructorPrototype['PEI'].Replace(Dict))
-                ConstructorCallingString.Append(gLibraryStructorCall['PEI'].Replace(Dict))
+                ConstructorPrototypeString.Append(
+                    gLibraryStructorPrototype['PEI'].Replace(Dict))
+                ConstructorCallingString.Append(
+                    gLibraryStructorCall['PEI'].Replace(Dict))
             elif Lib.ModuleType in [SUP_MODULE_DXE_CORE, SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_SMM_DRIVER, SUP_MODULE_DXE_RUNTIME_DRIVER,
                                     SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_UEFI_DRIVER, SUP_MODULE_UEFI_APPLICATION, SUP_MODULE_SMM_CORE]:
-                ConstructorPrototypeString.Append(gLibraryStructorPrototype['DXE'].Replace(Dict))
-                ConstructorCallingString.Append(gLibraryStructorCall['DXE'].Replace(Dict))
+                ConstructorPrototypeString.Append(
+                    gLibraryStructorPrototype['DXE'].Replace(Dict))
+                ConstructorCallingString.Append(
+                    gLibraryStructorCall['DXE'].Replace(Dict))
             elif Lib.ModuleType in [SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE]:
-                ConstructorPrototypeString.Append(gLibraryStructorPrototype['MM'].Replace(Dict))
-                ConstructorCallingString.Append(gLibraryStructorCall['MM'].Replace(Dict))
+                ConstructorPrototypeString.Append(
+                    gLibraryStructorPrototype['MM'].Replace(Dict))
+                ConstructorCallingString.Append(
+                    gLibraryStructorCall['MM'].Replace(Dict))
 
     if str(ConstructorPrototypeString) == '':
         ConstructorPrototypeList = []
@@ -1362,9 +1512,9 @@ def CreateLibraryConstructorCode(Info, AutoGenC, AutoGenH):
         ConstructorCallingList = [str(ConstructorCallingString)]
 
     Dict = {
-        'Type'              :   'Constructor',
-        'FunctionPrototype' :   ConstructorPrototypeList,
-        'FunctionCall'      :   ConstructorCallingList
+        'Type':   'Constructor',
+        'FunctionPrototype':   ConstructorPrototypeList,
+        'FunctionCall':   ConstructorCallingList
     }
     if Info.IsLibrary:
         AutoGenH.Append("${BEGIN}${FunctionPrototype}${END}", Dict)
@@ -1379,12 +1529,14 @@ def CreateLibraryConstructorCode(Info, AutoGenC, AutoGenH):
         elif Info.ModuleType in [SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE]:
             AutoGenC.Append(gLibraryString['MM'].Replace(Dict))
 
-## Create code for library destructor
+# Create code for library destructor
 #
 #   @param      Info        The ModuleAutoGen object
 #   @param      AutoGenC    The TemplateString object for C code
 #   @param      AutoGenH    The TemplateString object for header file
 #
+
+
 def CreateLibraryDestructorCode(Info, AutoGenC, AutoGenH):
     #
     # Library Destructors
@@ -1399,21 +1551,29 @@ def CreateLibraryDestructorCode(Info, AutoGenC, AutoGenH):
         Lib = DependentLibraryList[Index]
         if len(Lib.DestructorList) <= 0:
             continue
-        Dict = {'Function':Lib.DestructorList}
+        Dict = {'Function': Lib.DestructorList}
         if Lib.ModuleType in [SUP_MODULE_BASE, SUP_MODULE_SEC]:
-            DestructorPrototypeString.Append(gLibraryStructorPrototype[SUP_MODULE_BASE].Replace(Dict))
-            DestructorCallingString.Append(gLibraryStructorCall[SUP_MODULE_BASE].Replace(Dict))
+            DestructorPrototypeString.Append(
+                gLibraryStructorPrototype[SUP_MODULE_BASE].Replace(Dict))
+            DestructorCallingString.Append(
+                gLibraryStructorCall[SUP_MODULE_BASE].Replace(Dict))
         if Info.ModuleType not in [SUP_MODULE_BASE, SUP_MODULE_USER_DEFINED, SUP_MODULE_HOST_APPLICATION]:
             if Lib.ModuleType in SUP_MODULE_SET_PEI:
-                DestructorPrototypeString.Append(gLibraryStructorPrototype['PEI'].Replace(Dict))
-                DestructorCallingString.Append(gLibraryStructorCall['PEI'].Replace(Dict))
+                DestructorPrototypeString.Append(
+                    gLibraryStructorPrototype['PEI'].Replace(Dict))
+                DestructorCallingString.Append(
+                    gLibraryStructorCall['PEI'].Replace(Dict))
             elif Lib.ModuleType in [SUP_MODULE_DXE_CORE, SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_SMM_DRIVER, SUP_MODULE_DXE_RUNTIME_DRIVER,
                                     SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_UEFI_DRIVER, SUP_MODULE_UEFI_APPLICATION, SUP_MODULE_SMM_CORE]:
-                DestructorPrototypeString.Append(gLibraryStructorPrototype['DXE'].Replace(Dict))
-                DestructorCallingString.Append(gLibraryStructorCall['DXE'].Replace(Dict))
+                DestructorPrototypeString.Append(
+                    gLibraryStructorPrototype['DXE'].Replace(Dict))
+                DestructorCallingString.Append(
+                    gLibraryStructorCall['DXE'].Replace(Dict))
             elif Lib.ModuleType in [SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE]:
-                DestructorPrototypeString.Append(gLibraryStructorPrototype['MM'].Replace(Dict))
-                DestructorCallingString.Append(gLibraryStructorCall['MM'].Replace(Dict))
+                DestructorPrototypeString.Append(
+                    gLibraryStructorPrototype['MM'].Replace(Dict))
+                DestructorCallingString.Append(
+                    gLibraryStructorCall['MM'].Replace(Dict))
 
     if str(DestructorPrototypeString) == '':
         DestructorPrototypeList = []
@@ -1425,9 +1585,9 @@ def CreateLibraryDestructorCode(Info, AutoGenC, AutoGenH):
         DestructorCallingList = [str(DestructorCallingString)]
 
     Dict = {
-        'Type'              :   'Destructor',
-        'FunctionPrototype' :   DestructorPrototypeList,
-        'FunctionCall'      :   DestructorCallingList
+        'Type':   'Destructor',
+        'FunctionPrototype':   DestructorPrototypeList,
+        'FunctionCall':   DestructorCallingList
     }
     if Info.IsLibrary:
         AutoGenH.Append("${BEGIN}${FunctionPrototype}${END}", Dict)
@@ -1443,7 +1603,7 @@ def CreateLibraryDestructorCode(Info, AutoGenC, AutoGenH):
             AutoGenC.Append(gLibraryString['MM'].Replace(Dict))
 
 
-## Create code for ModuleEntryPoint
+# Create code for ModuleEntryPoint
 #
 #   @param      Info        The ModuleAutoGen object
 #   @param      AutoGenC    The TemplateString object for C code
@@ -1465,8 +1625,8 @@ def CreateModuleEntryPointCode(Info, AutoGenC, AutoGenH):
     else:
         UefiSpecVersion = '0x00000000'
     Dict = {
-        'Function'       :   Info.Module.ModuleEntryPointList,
-        'PiSpecVersion'  :   PiSpecVersion + 'U',
+        'Function':   Info.Module.ModuleEntryPointList,
+        'PiSpecVersion':   PiSpecVersion + 'U',
         'UefiSpecVersion':   UefiSpecVersion + 'U'
     }
 
@@ -1474,12 +1634,12 @@ def CreateModuleEntryPointCode(Info, AutoGenC, AutoGenH):
         if Info.SourceFileList:
             if NumEntryPoints != 1:
                 EdkLogger.error(
-                  "build",
-                  AUTOGEN_ERROR,
-                  '%s must have exactly one entry point' % Info.ModuleType,
-                  File=str(Info),
-                  ExtraData= ", ".join(Info.Module.ModuleEntryPointList)
-                  )
+                    "build",
+                    AUTOGEN_ERROR,
+                    '%s must have exactly one entry point' % Info.ModuleType,
+                    File=str(Info),
+                    ExtraData=", ".join(Info.Module.ModuleEntryPointList)
+                )
     if Info.ModuleType == SUP_MODULE_PEI_CORE:
         AutoGenC.Append(gPeiCoreEntryPointString.Replace(Dict))
         AutoGenH.Append(gPeiCoreEntryPointPrototype.Replace(Dict))
@@ -1494,13 +1654,15 @@ def CreateModuleEntryPointCode(Info, AutoGenC, AutoGenH):
         AutoGenH.Append(gMmCoreStandaloneEntryPointPrototype.Replace(Dict))
     elif Info.ModuleType == SUP_MODULE_PEIM:
         if NumEntryPoints < 2:
-            AutoGenC.Append(gPeimEntryPointString[NumEntryPoints].Replace(Dict))
+            AutoGenC.Append(
+                gPeimEntryPointString[NumEntryPoints].Replace(Dict))
         else:
             AutoGenC.Append(gPeimEntryPointString[2].Replace(Dict))
         AutoGenH.Append(gPeimEntryPointPrototype.Replace(Dict))
     elif Info.ModuleType in [SUP_MODULE_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_UEFI_DRIVER]:
         if NumEntryPoints < 2:
-            AutoGenC.Append(gUefiDriverEntryPointString[NumEntryPoints].Replace(Dict))
+            AutoGenC.Append(
+                gUefiDriverEntryPointString[NumEntryPoints].Replace(Dict))
         else:
             AutoGenC.Append(gUefiDriverEntryPointString[2].Replace(Dict))
         AutoGenH.Append(gUefiDriverEntryPointPrototype.Replace(Dict))
@@ -1512,23 +1674,27 @@ def CreateModuleEntryPointCode(Info, AutoGenC, AutoGenH):
         AutoGenH.Append(gDxeSmmEntryPointPrototype.Replace(Dict))
     elif Info.ModuleType == SUP_MODULE_MM_STANDALONE:
         if NumEntryPoints < 2:
-            AutoGenC.Append(gMmStandaloneEntryPointString[NumEntryPoints].Replace(Dict))
+            AutoGenC.Append(
+                gMmStandaloneEntryPointString[NumEntryPoints].Replace(Dict))
         else:
             AutoGenC.Append(gMmStandaloneEntryPointString[2].Replace(Dict))
         AutoGenH.Append(gMmStandaloneEntryPointPrototype.Replace(Dict))
     elif Info.ModuleType == SUP_MODULE_UEFI_APPLICATION:
         if NumEntryPoints < 2:
-            AutoGenC.Append(gUefiApplicationEntryPointString[NumEntryPoints].Replace(Dict))
+            AutoGenC.Append(
+                gUefiApplicationEntryPointString[NumEntryPoints].Replace(Dict))
         else:
             AutoGenC.Append(gUefiApplicationEntryPointString[2].Replace(Dict))
         AutoGenH.Append(gUefiApplicationEntryPointPrototype.Replace(Dict))
 
-## Create code for ModuleUnloadImage
+# Create code for ModuleUnloadImage
 #
 #   @param      Info        The ModuleAutoGen object
 #   @param      AutoGenC    The TemplateString object for C code
 #   @param      AutoGenH    The TemplateString object for header file
 #
+
+
 def CreateModuleUnloadImageCode(Info, AutoGenC, AutoGenH):
     if Info.IsLibrary or Info.ModuleType in [SUP_MODULE_USER_DEFINED, SUP_MODULE_HOST_APPLICATION, SUP_MODULE_BASE, SUP_MODULE_SEC]:
         return
@@ -1536,19 +1702,22 @@ def CreateModuleUnloadImageCode(Info, AutoGenC, AutoGenH):
     # Unload Image Handlers
     #
     NumUnloadImage = len(Info.Module.ModuleUnloadImageList)
-    Dict = {'Count':str(NumUnloadImage) + 'U', 'Function':Info.Module.ModuleUnloadImageList}
+    Dict = {'Count': str(NumUnloadImage) + 'U',
+            'Function': Info.Module.ModuleUnloadImageList}
     if NumUnloadImage < 2:
         AutoGenC.Append(gUefiUnloadImageString[NumUnloadImage].Replace(Dict))
     else:
         AutoGenC.Append(gUefiUnloadImageString[2].Replace(Dict))
     AutoGenH.Append(gUefiUnloadImagePrototype.Replace(Dict))
 
-## Create code for GUID
+# Create code for GUID
 #
 #   @param      Info        The ModuleAutoGen object
 #   @param      AutoGenC    The TemplateString object for C code
 #   @param      AutoGenH    The TemplateString object for header file
 #
+
+
 def CreateGuidDefinitionCode(Info, AutoGenC, AutoGenH):
     if Info.ModuleType in [SUP_MODULE_USER_DEFINED, SUP_MODULE_HOST_APPLICATION, SUP_MODULE_BASE]:
         GuidType = TAB_GUID
@@ -1564,15 +1733,18 @@ def CreateGuidDefinitionCode(Info, AutoGenC, AutoGenH):
     #
     for Key in Info.GuidList:
         if not Info.IsLibrary:
-            AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED %s %s = %s;\n' % (GuidType, Key, Info.GuidList[Key]))
+            AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED %s %s = %s;\n' % (
+                GuidType, Key, Info.GuidList[Key]))
         AutoGenH.Append('extern %s %s;\n' % (GuidType, Key))
 
-## Create code for protocol
+# Create code for protocol
 #
 #   @param      Info        The ModuleAutoGen object
 #   @param      AutoGenC    The TemplateString object for C code
 #   @param      AutoGenH    The TemplateString object for header file
 #
+
+
 def CreateProtocolDefinitionCode(Info, AutoGenC, AutoGenH):
     if Info.ModuleType in [SUP_MODULE_USER_DEFINED, SUP_MODULE_HOST_APPLICATION, SUP_MODULE_BASE]:
         GuidType = TAB_GUID
@@ -1588,15 +1760,18 @@ def CreateProtocolDefinitionCode(Info, AutoGenC, AutoGenH):
     #
     for Key in Info.ProtocolList:
         if not Info.IsLibrary:
-            AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED %s %s = %s;\n' % (GuidType, Key, Info.ProtocolList[Key]))
+            AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED %s %s = %s;\n' % (
+                GuidType, Key, Info.ProtocolList[Key]))
         AutoGenH.Append('extern %s %s;\n' % (GuidType, Key))
 
-## Create code for PPI
+# Create code for PPI
 #
 #   @param      Info        The ModuleAutoGen object
 #   @param      AutoGenC    The TemplateString object for C code
 #   @param      AutoGenH    The TemplateString object for header file
 #
+
+
 def CreatePpiDefinitionCode(Info, AutoGenC, AutoGenH):
     if Info.ModuleType in [SUP_MODULE_USER_DEFINED, SUP_MODULE_HOST_APPLICATION, SUP_MODULE_BASE]:
         GuidType = TAB_GUID
@@ -1612,15 +1787,18 @@ def CreatePpiDefinitionCode(Info, AutoGenC, AutoGenH):
     #
     for Key in Info.PpiList:
         if not Info.IsLibrary:
-            AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED %s %s = %s;\n' % (GuidType, Key, Info.PpiList[Key]))
+            AutoGenC.Append('GLOBAL_REMOVE_IF_UNREFERENCED %s %s = %s;\n' % (
+                GuidType, Key, Info.PpiList[Key]))
         AutoGenH.Append('extern %s %s;\n' % (GuidType, Key))
 
-## Create code for PCD
+# Create code for PCD
 #
 #   @param      Info        The ModuleAutoGen object
 #   @param      AutoGenC    The TemplateString object for C code
 #   @param      AutoGenH    The TemplateString object for header file
 #
+
+
 def CreatePcdCode(Info, AutoGenC, AutoGenH):
 
     # Collect Token Space GUIDs used by DynamicEc PCDs
@@ -1634,7 +1812,8 @@ def CreatePcdCode(Info, AutoGenC, AutoGenH):
     AutoGenH.Append("extern UINT64 _gPcd_SkuId_Array[];\n")
     # Add extern declarations to AutoGen.h if one or more Token Space GUIDs were found
     if TokenSpaceList:
-        AutoGenH.Append("\n// Definition of PCD Token Space GUIDs used in this module\n\n")
+        AutoGenH.Append(
+            "\n// Definition of PCD Token Space GUIDs used in this module\n\n")
         if Info.ModuleType in [SUP_MODULE_USER_DEFINED, SUP_MODULE_HOST_APPLICATION, SUP_MODULE_BASE]:
             GuidType = TAB_GUID
         else:
@@ -1647,24 +1826,26 @@ def CreatePcdCode(Info, AutoGenC, AutoGenH):
             AutoGenH.Append("\n// PCD definitions\n")
         for Pcd in Info.ModulePcdList:
             CreateLibraryPcdCode(Info, AutoGenC, AutoGenH, Pcd)
-        DynExPcdTokenNumberMapping (Info, AutoGenH)
+        DynExPcdTokenNumberMapping(Info, AutoGenH)
     else:
         AutoGenC.Append("\n// Definition of SkuId Array\n")
-        AutoGenC.Append("GLOBAL_REMOVE_IF_UNREFERENCED UINT64 _gPcd_SkuId_Array[] = %s;\n" % SkuMgr.DumpSkuIdArrary())
+        AutoGenC.Append(
+            "GLOBAL_REMOVE_IF_UNREFERENCED UINT64 _gPcd_SkuId_Array[] = %s;\n" % SkuMgr.DumpSkuIdArrary())
         if Info.ModulePcdList:
             AutoGenH.Append("\n// Definition of PCDs used in this module\n")
             AutoGenC.Append("\n// Definition of PCDs used in this module\n")
         for Pcd in Info.ModulePcdList:
             CreateModulePcdCode(Info, AutoGenC, AutoGenH, Pcd)
-        DynExPcdTokenNumberMapping (Info, AutoGenH)
+        DynExPcdTokenNumberMapping(Info, AutoGenH)
         if Info.LibraryPcdList:
-            AutoGenH.Append("\n// Definition of PCDs used in libraries is in AutoGen.c\n")
+            AutoGenH.Append(
+                "\n// Definition of PCDs used in libraries is in AutoGen.c\n")
             AutoGenC.Append("\n// Definition of PCDs used in libraries\n")
         for Pcd in Info.LibraryPcdList:
             CreateModulePcdCode(Info, AutoGenC, AutoGenC, Pcd)
     CreatePcdDatabaseCode(Info, AutoGenC, AutoGenH)
 
-## Create code for unicode string definition
+# Create code for unicode string definition
 #
 #   @param      Info        The ModuleAutoGen object
 #   @param      AutoGenC    The TemplateString object for C code
@@ -1672,6 +1853,8 @@ def CreatePcdCode(Info, AutoGenC, AutoGenH):
 #   @param      UniGenCFlag     UniString is generated into AutoGen C file when it is set to True
 #   @param      UniGenBinBuffer Buffer to store uni string package data
 #
+
+
 def CreateUnicodeStringCode(Info, AutoGenC, AutoGenH, UniGenCFlag, UniGenBinBuffer):
     WorkingDir = os.getcwd()
     os.chdir(Info.WorkspaceDir)
@@ -1698,12 +1881,13 @@ def CreateUnicodeStringCode(Info, AutoGenC, AutoGenH, UniGenCFlag, UniGenBinBuff
     else:
         ShellMode = False
 
-    #RFC4646 is only for EDKII modules and ISO639-2 for EDK modules
+    # RFC4646 is only for EDKII modules and ISO639-2 for EDK modules
     if EDK2Module:
         FilterInfo = [EDK2Module] + [Info.PlatformInfo.Platform.RFCLanguages]
     else:
         FilterInfo = [EDK2Module] + [Info.PlatformInfo.Platform.ISOLanguages]
-    Header, Code = GetStringFiles(Info.UnicodeFileList, SrcList, IncList, Info.IncludePathList, ['.uni', '.inf'], Info.Name, CompatibleMode, ShellMode, UniGenCFlag, UniGenBinBuffer, FilterInfo)
+    Header, Code = GetStringFiles(Info.UnicodeFileList, SrcList, IncList, Info.IncludePathList, [
+                                  '.uni', '.inf'], Info.Name, CompatibleMode, ShellMode, UniGenCFlag, UniGenBinBuffer, FilterInfo)
     if CompatibleMode or UniGenCFlag:
         AutoGenC.Append("\n//\n//Unicode String Pack Definition\n//\n")
         AutoGenC.Append(Code)
@@ -1714,9 +1898,10 @@ def CreateUnicodeStringCode(Info, AutoGenC, AutoGenH, UniGenCFlag, UniGenBinBuff
         AutoGenH.Append("\n#define STRING_ARRAY_NAME %sStrings\n" % Info.Name)
     os.chdir(WorkingDir)
 
+
 def CreateIdfFileCode(Info, AutoGenC, StringH, IdfGenCFlag, IdfGenBinBuffer):
     if len(Info.IdfFileList) > 0:
-        ImageFiles = IdfFileClassObject(sorted (Info.IdfFileList))
+        ImageFiles = IdfFileClassObject(sorted(Info.IdfFileList))
         if ImageFiles.ImageFilesDict:
             Index = 1
             PaletteIndex = 1
@@ -1739,32 +1924,41 @@ def CreateIdfFileCode(Info, AutoGenC, StringH, IdfGenCFlag, IdfGenBinBuffer):
                         for sourcefile in Info.SourceFileList:
                             if FileObj.FileName == sourcefile.File:
                                 if not sourcefile.Ext.upper() in ['.PNG', '.BMP', '.JPG']:
-                                    EdkLogger.error("build", AUTOGEN_ERROR, "The %s's postfix must be one of .bmp, .jpg, .png" % (FileObj.FileName), ExtraData="[%s]" % str(Info))
+                                    EdkLogger.error("build", AUTOGEN_ERROR, "The %s's postfix must be one of .bmp, .jpg, .png" % (
+                                        FileObj.FileName), ExtraData="[%s]" % str(Info))
                                 FileObj.File = sourcefile
                                 break
                         else:
-                            EdkLogger.error("build", AUTOGEN_ERROR, "The %s in %s is not defined in the driver's [Sources] section" % (FileObj.FileName, Idf), ExtraData="[%s]" % str(Info))
+                            EdkLogger.error("build", AUTOGEN_ERROR, "The %s in %s is not defined in the driver's [Sources] section" % (
+                                FileObj.FileName, Idf), ExtraData="[%s]" % str(Info))
 
                     for FileObj in ImageFiles.ImageFilesDict[Idf]:
                         ID = FileObj.ImageID
                         File = FileObj.File
                         try:
-                            SearchImageID (FileObj, FileList)
+                            SearchImageID(FileObj, FileList)
                             if FileObj.Referenced:
                                 if (ValueStartPtr - len(DEFINE_STR + ID)) <= 0:
-                                    Line = DEFINE_STR + ' ' + ID + ' ' + DecToHexStr(Index, 4) + '\n'
+                                    Line = DEFINE_STR + ' ' + ID + ' ' + \
+                                        DecToHexStr(Index, 4) + '\n'
                                 else:
-                                    Line = DEFINE_STR + ' ' + ID + ' ' * (ValueStartPtr - len(DEFINE_STR + ID)) + DecToHexStr(Index, 4) + '\n'
+                                    Line = DEFINE_STR + ' ' + ID + ' ' * \
+                                        (ValueStartPtr - len(DEFINE_STR + ID)
+                                         ) + DecToHexStr(Index, 4) + '\n'
 
                                 if File not in FileDict:
                                     FileDict[File] = Index
                                 else:
-                                    DuplicateBlock = pack('B', EFI_HII_IIBT_DUPLICATE)
+                                    DuplicateBlock = pack(
+                                        'B', EFI_HII_IIBT_DUPLICATE)
                                     DuplicateBlock += pack('H', FileDict[File])
                                     ImageBuffer += DuplicateBlock
-                                    BufferStr = WriteLine(BufferStr, '// %s: %s: %s' % (DecToHexStr(Index, 4), ID, DecToHexStr(Index, 4)))
-                                    TempBufferList = AscToHexList(DuplicateBlock)
-                                    BufferStr = WriteLine(BufferStr, CreateArrayItem(TempBufferList, 16) + '\n')
+                                    BufferStr = WriteLine(
+                                        BufferStr, '// %s: %s: %s' % (DecToHexStr(Index, 4), ID, DecToHexStr(Index, 4)))
+                                    TempBufferList = AscToHexList(
+                                        DuplicateBlock)
+                                    BufferStr = WriteLine(
+                                        BufferStr, CreateArrayItem(TempBufferList, 16) + '\n')
                                     StringH.Append(Line)
                                     Index += 1
                                     continue
@@ -1773,50 +1967,65 @@ def CreateIdfFileCode(Info, AutoGenC, StringH, IdfGenCFlag, IdfGenBinBuffer):
                                 Buffer = TmpFile.read()
                                 TmpFile.close()
                                 if File.Ext.upper() == '.PNG':
-                                    TempBuffer = pack('B', EFI_HII_IIBT_IMAGE_PNG)
+                                    TempBuffer = pack(
+                                        'B', EFI_HII_IIBT_IMAGE_PNG)
                                     TempBuffer += pack('I', len(Buffer))
                                     TempBuffer += Buffer
                                 elif File.Ext.upper() == '.JPG':
-                                    ImageType, = struct.unpack('4s', Buffer[6:10])
+                                    ImageType, = struct.unpack(
+                                        '4s', Buffer[6:10])
                                     if ImageType != b'JFIF':
-                                        EdkLogger.error("build", FILE_TYPE_MISMATCH, "The file %s is not a standard JPG file." % File.Path)
-                                    TempBuffer = pack('B', EFI_HII_IIBT_IMAGE_JPEG)
+                                        EdkLogger.error(
+                                            "build", FILE_TYPE_MISMATCH, "The file %s is not a standard JPG file." % File.Path)
+                                    TempBuffer = pack(
+                                        'B', EFI_HII_IIBT_IMAGE_JPEG)
                                     TempBuffer += pack('I', len(Buffer))
                                     TempBuffer += Buffer
                                 elif File.Ext.upper() == '.BMP':
-                                    TempBuffer, TempPalette = BmpImageDecoder(File, Buffer, PaletteIndex, FileObj.TransParent)
+                                    TempBuffer, TempPalette = BmpImageDecoder(
+                                        File, Buffer, PaletteIndex, FileObj.TransParent)
                                     if len(TempPalette) > 1:
                                         PaletteIndex += 1
-                                        NewPalette = pack('H', len(TempPalette))
+                                        NewPalette = pack(
+                                            'H', len(TempPalette))
                                         NewPalette += TempPalette
                                         PaletteBuffer += NewPalette
-                                        PaletteStr = WriteLine(PaletteStr, '// %s: %s: %s' % (DecToHexStr(PaletteIndex - 1, 4), ID, DecToHexStr(PaletteIndex - 1, 4)))
-                                        TempPaletteList = AscToHexList(NewPalette)
-                                        PaletteStr = WriteLine(PaletteStr, CreateArrayItem(TempPaletteList, 16) + '\n')
+                                        PaletteStr = WriteLine(PaletteStr, '// %s: %s: %s' % (
+                                            DecToHexStr(PaletteIndex - 1, 4), ID, DecToHexStr(PaletteIndex - 1, 4)))
+                                        TempPaletteList = AscToHexList(
+                                            NewPalette)
+                                        PaletteStr = WriteLine(
+                                            PaletteStr, CreateArrayItem(TempPaletteList, 16) + '\n')
                                 ImageBuffer += TempBuffer
-                                BufferStr = WriteLine(BufferStr, '// %s: %s: %s' % (DecToHexStr(Index, 4), ID, DecToHexStr(Index, 4)))
+                                BufferStr = WriteLine(
+                                    BufferStr, '// %s: %s: %s' % (DecToHexStr(Index, 4), ID, DecToHexStr(Index, 4)))
                                 TempBufferList = AscToHexList(TempBuffer)
-                                BufferStr = WriteLine(BufferStr, CreateArrayItem(TempBufferList, 16) + '\n')
+                                BufferStr = WriteLine(
+                                    BufferStr, CreateArrayItem(TempBufferList, 16) + '\n')
 
                                 StringH.Append(Line)
                                 Index += 1
                         except IOError:
-                            EdkLogger.error("build", FILE_NOT_FOUND, ExtraData=File.Path)
+                            EdkLogger.error(
+                                "build", FILE_NOT_FOUND, ExtraData=File.Path)
 
             BufferStr = WriteLine(BufferStr, '// End of the Image Info')
-            BufferStr = WriteLine(BufferStr, CreateArrayItem(DecToHexList(EFI_HII_IIBT_END, 2)) + '\n')
+            BufferStr = WriteLine(BufferStr, CreateArrayItem(
+                DecToHexList(EFI_HII_IIBT_END, 2)) + '\n')
             ImageEnd = pack('B', EFI_HII_IIBT_END)
             ImageBuffer += ImageEnd
 
             if len(ImageBuffer) > 1:
                 ImageInfoOffset = 12
             if len(PaletteBuffer) > 1:
-                PaletteInfoOffset = 12 + len(ImageBuffer) - 1 # -1 is for the first empty pad byte of ImageBuffer
+                # -1 is for the first empty pad byte of ImageBuffer
+                PaletteInfoOffset = 12 + len(ImageBuffer) - 1
 
             IMAGE_PACKAGE_HDR = pack('=II', ImageInfoOffset, PaletteInfoOffset)
             # PACKAGE_HEADER_Length = PACKAGE_HEADER + ImageInfoOffset + PaletteInfoOffset + ImageBuffer Length + PaletteCount + PaletteBuffer Length
             if len(PaletteBuffer) > 1:
-                PACKAGE_HEADER_Length = 4 + 4 + 4 + len(ImageBuffer) - 1 + 2 + len(PaletteBuffer) - 1
+                PACKAGE_HEADER_Length = 4 + 4 + 4 + \
+                    len(ImageBuffer) - 1 + 2 + len(PaletteBuffer) - 1
             else:
                 PACKAGE_HEADER_Length = 4 + 4 + 4 + len(ImageBuffer) - 1
             if PaletteIndex > 1:
@@ -1824,12 +2033,14 @@ def CreateIdfFileCode(Info, AutoGenC, StringH, IdfGenCFlag, IdfGenBinBuffer):
             # EFI_HII_PACKAGE_HEADER length max value is 0xFFFFFF
             Hex_Length = '%06X' % PACKAGE_HEADER_Length
             if PACKAGE_HEADER_Length > 0xFFFFFF:
-                EdkLogger.error("build", AUTOGEN_ERROR, "The Length of EFI_HII_PACKAGE_HEADER exceed its maximum value", ExtraData="[%s]" % str(Info))
-            PACKAGE_HEADER = pack('=HBB', int('0x' + Hex_Length[2:], 16), int('0x' + Hex_Length[0:2], 16), EFI_HII_PACKAGE_IMAGES)
+                EdkLogger.error(
+                    "build", AUTOGEN_ERROR, "The Length of EFI_HII_PACKAGE_HEADER exceed its maximum value", ExtraData="[%s]" % str(Info))
+            PACKAGE_HEADER = pack('=HBB', int(
+                '0x' + Hex_Length[2:], 16), int('0x' + Hex_Length[0:2], 16), EFI_HII_PACKAGE_IMAGES)
 
             IdfGenBinBuffer.write(PACKAGE_HEADER)
             IdfGenBinBuffer.write(IMAGE_PACKAGE_HDR)
-            if len(ImageBuffer) > 1 :
+            if len(ImageBuffer) > 1:
                 IdfGenBinBuffer.write(ImageBuffer[1:])
             if PaletteIndex > 1:
                 IdfGenBinBuffer.write(PALETTE_INFO_HEADER)
@@ -1839,27 +2050,34 @@ def CreateIdfFileCode(Info, AutoGenC, StringH, IdfGenCFlag, IdfGenBinBuffer):
             if IdfGenCFlag:
                 TotalLength = EFI_HII_ARRAY_SIZE_LENGTH + PACKAGE_HEADER_Length
                 AutoGenC.Append("\n//\n//Image Pack Definition\n//\n")
-                AllStr = WriteLine('', CHAR_ARRAY_DEFIN + ' ' + Info.Module.BaseName + 'Images' + '[] = {\n')
+                AllStr = WriteLine('', CHAR_ARRAY_DEFIN + ' ' +
+                                   Info.Module.BaseName + 'Images' + '[] = {\n')
                 AllStr = WriteLine(AllStr, '// STRGATHER_OUTPUT_HEADER')
-                AllStr = WriteLine(AllStr, CreateArrayItem(DecToHexList(TotalLength)) + '\n')
+                AllStr = WriteLine(AllStr, CreateArrayItem(
+                    DecToHexList(TotalLength)) + '\n')
                 AllStr = WriteLine(AllStr, '// Image PACKAGE HEADER\n')
                 IMAGE_PACKAGE_HDR_List = AscToHexList(PACKAGE_HEADER)
                 IMAGE_PACKAGE_HDR_List += AscToHexList(IMAGE_PACKAGE_HDR)
-                AllStr = WriteLine(AllStr, CreateArrayItem(IMAGE_PACKAGE_HDR_List, 16) + '\n')
+                AllStr = WriteLine(AllStr, CreateArrayItem(
+                    IMAGE_PACKAGE_HDR_List, 16) + '\n')
                 AllStr = WriteLine(AllStr, '// Image DATA\n')
                 if BufferStr:
                     AllStr = WriteLine(AllStr, BufferStr)
                 if PaletteStr:
                     AllStr = WriteLine(AllStr, '// Palette Header\n')
-                    PALETTE_INFO_HEADER_List = AscToHexList(PALETTE_INFO_HEADER)
-                    AllStr = WriteLine(AllStr, CreateArrayItem(PALETTE_INFO_HEADER_List, 16) + '\n')
+                    PALETTE_INFO_HEADER_List = AscToHexList(
+                        PALETTE_INFO_HEADER)
+                    AllStr = WriteLine(AllStr, CreateArrayItem(
+                        PALETTE_INFO_HEADER_List, 16) + '\n')
                     AllStr = WriteLine(AllStr, '// Palette Data\n')
                     AllStr = WriteLine(AllStr, PaletteStr)
                 AllStr = WriteLine(AllStr, '};')
                 AutoGenC.Append(AllStr)
                 AutoGenC.Append("\n")
-                StringH.Append('\nextern unsigned char ' + Info.Module.BaseName + 'Images[];\n')
-                StringH.Append("\n#define IMAGE_ARRAY_NAME %sImages\n" % Info.Module.BaseName)
+                StringH.Append('\nextern unsigned char ' +
+                               Info.Module.BaseName + 'Images[];\n')
+                StringH.Append(
+                    "\n#define IMAGE_ARRAY_NAME %sImages\n" % Info.Module.BaseName)
 
 # typedef struct _EFI_HII_IMAGE_PACKAGE_HDR {
 #   EFI_HII_PACKAGE_HEADER  Header;          # Standard package header, where Header.Type = EFI_HII_PACKAGE_IMAGES
@@ -1878,24 +2096,31 @@ def CreateIdfFileCode(Info, AutoGenC, StringH, IdfGenCFlag, IdfGenBinBuffer):
 #   UINT8    BlockBody[];
 # } EFI_HII_IMAGE_BLOCK;
 
+
 def BmpImageDecoder(File, Buffer, PaletteIndex, TransParent):
     ImageType, = struct.unpack('2s', Buffer[0:2])
-    if ImageType!= b'BM': # BMP file type is 'BM'
-        EdkLogger.error("build", FILE_TYPE_MISMATCH, "The file %s is not a standard BMP file." % File.Path)
-    BMP_IMAGE_HEADER = collections.namedtuple('BMP_IMAGE_HEADER', ['bfSize', 'bfReserved1', 'bfReserved2', 'bfOffBits', 'biSize', 'biWidth', 'biHeight', 'biPlanes', 'biBitCount', 'biCompression', 'biSizeImage', 'biXPelsPerMeter', 'biYPelsPerMeter', 'biClrUsed', 'biClrImportant'])
+    if ImageType != b'BM':  # BMP file type is 'BM'
+        EdkLogger.error("build", FILE_TYPE_MISMATCH,
+                        "The file %s is not a standard BMP file." % File.Path)
+    BMP_IMAGE_HEADER = collections.namedtuple('BMP_IMAGE_HEADER', ['bfSize', 'bfReserved1', 'bfReserved2', 'bfOffBits', 'biSize', 'biWidth',
+                                              'biHeight', 'biPlanes', 'biBitCount', 'biCompression', 'biSizeImage', 'biXPelsPerMeter', 'biYPelsPerMeter', 'biClrUsed', 'biClrImportant'])
     BMP_IMAGE_HEADER_STRUCT = struct.Struct('IHHIIIIHHIIIIII')
-    BmpHeader = BMP_IMAGE_HEADER._make(BMP_IMAGE_HEADER_STRUCT.unpack_from(Buffer[2:]))
+    BmpHeader = BMP_IMAGE_HEADER._make(
+        BMP_IMAGE_HEADER_STRUCT.unpack_from(Buffer[2:]))
     #
     # Doesn't support compress.
     #
     if BmpHeader.biCompression != 0:
-        EdkLogger.error("build", FORMAT_NOT_SUPPORTED, "The compress BMP file %s is not support." % File.Path)
+        EdkLogger.error("build", FORMAT_NOT_SUPPORTED,
+                        "The compress BMP file %s is not support." % File.Path)
 
     # The Width and Height is UINT16 type in Image Package
     if BmpHeader.biWidth > 0xFFFF:
-        EdkLogger.error("build", FORMAT_NOT_SUPPORTED, "The BMP file %s Width is exceed 0xFFFF." % File.Path)
+        EdkLogger.error("build", FORMAT_NOT_SUPPORTED,
+                        "The BMP file %s Width is exceed 0xFFFF." % File.Path)
     if BmpHeader.biHeight > 0xFFFF:
-        EdkLogger.error("build", FORMAT_NOT_SUPPORTED, "The BMP file %s Height is exceed 0xFFFF." % File.Path)
+        EdkLogger.error("build", FORMAT_NOT_SUPPORTED,
+                        "The BMP file %s Height is exceed 0xFFFF." % File.Path)
 
     PaletteBuffer = pack('x')
     if BmpHeader.biBitCount == 1:
@@ -1906,7 +2131,8 @@ def BmpImageDecoder(File, Buffer, PaletteIndex, TransParent):
         ImageBuffer += pack('B', PaletteIndex)
         Width = (BmpHeader.biWidth + 7)//8
         if BmpHeader.bfOffBits > BMP_IMAGE_HEADER_STRUCT.size + 2:
-            PaletteBuffer = Buffer[BMP_IMAGE_HEADER_STRUCT.size + 2 : BmpHeader.bfOffBits]
+            PaletteBuffer = Buffer[BMP_IMAGE_HEADER_STRUCT.size +
+                                   2: BmpHeader.bfOffBits]
     elif BmpHeader.biBitCount == 4:
         if TransParent:
             ImageBuffer = pack('B', EFI_HII_IIBT_IMAGE_4BIT_TRANS)
@@ -1915,7 +2141,8 @@ def BmpImageDecoder(File, Buffer, PaletteIndex, TransParent):
         ImageBuffer += pack('B', PaletteIndex)
         Width = (BmpHeader.biWidth + 1)//2
         if BmpHeader.bfOffBits > BMP_IMAGE_HEADER_STRUCT.size + 2:
-            PaletteBuffer = Buffer[BMP_IMAGE_HEADER_STRUCT.size + 2 : BmpHeader.bfOffBits]
+            PaletteBuffer = Buffer[BMP_IMAGE_HEADER_STRUCT.size +
+                                   2: BmpHeader.bfOffBits]
     elif BmpHeader.biBitCount == 8:
         if TransParent:
             ImageBuffer = pack('B', EFI_HII_IIBT_IMAGE_8BIT_TRANS)
@@ -1924,7 +2151,8 @@ def BmpImageDecoder(File, Buffer, PaletteIndex, TransParent):
         ImageBuffer += pack('B', PaletteIndex)
         Width = BmpHeader.biWidth
         if BmpHeader.bfOffBits > BMP_IMAGE_HEADER_STRUCT.size + 2:
-            PaletteBuffer = Buffer[BMP_IMAGE_HEADER_STRUCT.size + 2 : BmpHeader.bfOffBits]
+            PaletteBuffer = Buffer[BMP_IMAGE_HEADER_STRUCT.size +
+                                   2: BmpHeader.bfOffBits]
     elif BmpHeader.biBitCount == 24:
         if TransParent:
             ImageBuffer = pack('B', EFI_HII_IIBT_IMAGE_24BIT_TRANS)
@@ -1932,7 +2160,8 @@ def BmpImageDecoder(File, Buffer, PaletteIndex, TransParent):
             ImageBuffer = pack('B', EFI_HII_IIBT_IMAGE_24BIT)
         Width = BmpHeader.biWidth * 3
     else:
-        EdkLogger.error("build", FORMAT_NOT_SUPPORTED, "Only support the 1 bit, 4 bit, 8bit, 24 bit BMP files.", ExtraData="[%s]" % str(File.Path))
+        EdkLogger.error("build", FORMAT_NOT_SUPPORTED,
+                        "Only support the 1 bit, 4 bit, 8bit, 24 bit BMP files.", ExtraData="[%s]" % str(File.Path))
 
     ImageBuffer += pack('H', BmpHeader.biWidth)
     ImageBuffer += pack('H', BmpHeader.biHeight)
@@ -1943,7 +2172,7 @@ def BmpImageDecoder(File, Buffer, PaletteIndex, TransParent):
             Start = End + (Width % 4) - 4 - Width
         else:
             Start = End - Width
-        ImageBuffer += Buffer[Start + 1 : Start + Width + 1]
+        ImageBuffer += Buffer[Start + 1: Start + Width + 1]
         End = Start
 
     # handle the Palette info,  BMP use 4 bytes for R, G, B and Reserved info while EFI_HII_RGB_PIXEL only have the R, G, B info
@@ -1956,22 +2185,26 @@ def BmpImageDecoder(File, Buffer, PaletteIndex, TransParent):
         PaletteBuffer = PaletteTemp[1:]
     return ImageBuffer, PaletteBuffer
 
-## Create common code
+# Create common code
 #
 #   @param      Info        The ModuleAutoGen object
 #   @param      AutoGenC    The TemplateString object for C code
 #   @param      AutoGenH    The TemplateString object for header file
 #
+
+
 def CreateHeaderCode(Info, AutoGenC, AutoGenH):
     # file header
-    AutoGenH.Append(gAutoGenHeaderString.Replace({'FileName':'AutoGen.h'}))
+    AutoGenH.Append(gAutoGenHeaderString.Replace({'FileName': 'AutoGen.h'}))
     # header file Prologue
-    AutoGenH.Append(gAutoGenHPrologueString.Replace({'File':'AUTOGENH','Guid':Info.Guid.replace('-', '_')}))
+    AutoGenH.Append(gAutoGenHPrologueString.Replace(
+        {'File': 'AUTOGENH', 'Guid': Info.Guid.replace('-', '_')}))
     AutoGenH.Append(gAutoGenHCppPrologueString)
 
     # header files includes
     if Info.ModuleType in gModuleTypeHeaderFile:
-        AutoGenH.Append("#include <%s>\n" % gModuleTypeHeaderFile[Info.ModuleType][0])
+        AutoGenH.Append("#include <%s>\n" %
+                        gModuleTypeHeaderFile[Info.ModuleType][0])
     #
     # if either PcdLib in [LibraryClasses] sections or there exist Pcd section, add PcdLib.h
     # As if modules only uses FixedPcd, then PcdLib is not needed in [LibraryClasses] section.
@@ -1986,13 +2219,15 @@ def CreateHeaderCode(Info, AutoGenC, AutoGenH):
     if Info.IsLibrary:
         return
 
-    AutoGenH.Append("#define EFI_CALLER_ID_GUID \\\n  %s\n" % GuidStringToGuidStructureString(Info.Guid))
-    AutoGenH.Append("#define EDKII_DSC_PLATFORM_GUID \\\n  %s\n" % GuidStringToGuidStructureString(Info.PlatformInfo.Guid))
+    AutoGenH.Append("#define EFI_CALLER_ID_GUID \\\n  %s\n" %
+                    GuidStringToGuidStructureString(Info.Guid))
+    AutoGenH.Append("#define EDKII_DSC_PLATFORM_GUID \\\n  %s\n" %
+                    GuidStringToGuidStructureString(Info.PlatformInfo.Guid))
 
     if Info.IsLibrary:
         return
     # C file header
-    AutoGenC.Append(gAutoGenHeaderString.Replace({'FileName':'AutoGen.c'}))
+    AutoGenC.Append(gAutoGenHeaderString.Replace({'FileName': 'AutoGen.c'}))
     # C file header files includes
     if Info.ModuleType in gModuleTypeHeaderFile:
         for Inc in gModuleTypeHeaderFile[Info.ModuleType]:
@@ -2003,20 +2238,25 @@ def CreateHeaderCode(Info, AutoGenC, AutoGenH):
     #
     # Publish the CallerId Guid
     #
-    AutoGenC.Append('\nGLOBAL_REMOVE_IF_UNREFERENCED GUID gEfiCallerIdGuid = %s;\n' % GuidStringToGuidStructureString(Info.Guid))
-    AutoGenC.Append('\nGLOBAL_REMOVE_IF_UNREFERENCED GUID gEdkiiDscPlatformGuid = %s;\n' % GuidStringToGuidStructureString(Info.PlatformInfo.Guid))
-    AutoGenC.Append('\nGLOBAL_REMOVE_IF_UNREFERENCED CHAR8 *gEfiCallerBaseName = "%s";\n' % Info.Name)
+    AutoGenC.Append('\nGLOBAL_REMOVE_IF_UNREFERENCED GUID gEfiCallerIdGuid = %s;\n' %
+                    GuidStringToGuidStructureString(Info.Guid))
+    AutoGenC.Append('\nGLOBAL_REMOVE_IF_UNREFERENCED GUID gEdkiiDscPlatformGuid = %s;\n' %
+                    GuidStringToGuidStructureString(Info.PlatformInfo.Guid))
+    AutoGenC.Append(
+        '\nGLOBAL_REMOVE_IF_UNREFERENCED CHAR8 *gEfiCallerBaseName = "%s";\n' % Info.Name)
 
-## Create common code for header file
+# Create common code for header file
 #
 #   @param      Info        The ModuleAutoGen object
 #   @param      AutoGenC    The TemplateString object for C code
 #   @param      AutoGenH    The TemplateString object for header file
 #
+
+
 def CreateFooterCode(Info, AutoGenC, AutoGenH):
     AutoGenH.Append(gAutoGenHEpilogueString)
 
-## Create code for a module
+# Create code for a module
 #
 #   @param      Info        The ModuleAutoGen object
 #   @param      AutoGenC    The TemplateString object for C code
@@ -2028,6 +2268,8 @@ def CreateFooterCode(Info, AutoGenC, AutoGenH):
 #   @param      IdfGenCFlag     IdfString is generated into AutoGen C file when it is set to True
 #   @param      IdfGenBinBuffer Buffer to store Idf string package data
 #
+
+
 def CreateCode(Info, AutoGenC, AutoGenH, StringH, UniGenCFlag, UniGenBinBuffer, StringIdf, IdfGenCFlag, IdfGenBinBuffer):
     CreateHeaderCode(Info, AutoGenC, AutoGenH)
 
@@ -2042,25 +2284,34 @@ def CreateCode(Info, AutoGenC, AutoGenH, StringH, UniGenCFlag, UniGenBinBuffer,
 
     if Info.UnicodeFileList:
         FileName = "%sStrDefs.h" % Info.Name
-        StringH.Append(gAutoGenHeaderString.Replace({'FileName':FileName}))
-        StringH.Append(gAutoGenHPrologueString.Replace({'File':'STRDEFS', 'Guid':Info.Guid.replace('-', '_')}))
-        CreateUnicodeStringCode(Info, AutoGenC, StringH, UniGenCFlag, UniGenBinBuffer)
+        StringH.Append(gAutoGenHeaderString.Replace({'FileName': FileName}))
+        StringH.Append(gAutoGenHPrologueString.Replace(
+            {'File': 'STRDEFS', 'Guid': Info.Guid.replace('-', '_')}))
+        CreateUnicodeStringCode(Info, AutoGenC, StringH,
+                                UniGenCFlag, UniGenBinBuffer)
 
         GuidMacros = []
         for Guid in Info.Module.Guids:
             if Guid in Info.Module.GetGuidsUsedByPcd():
                 continue
-            GuidMacros.append('#define %s %s' % (Guid, Info.Module.Guids[Guid]))
+            GuidMacros.append('#define %s %s' %
+                              (Guid, Info.Module.Guids[Guid]))
         for Guid, Value in list(Info.Module.Protocols.items()) + list(Info.Module.Ppis.items()):
             GuidMacros.append('#define %s %s' % (Guid, Value))
         # supports FixedAtBuild and FeaturePcd usage in VFR file
         if Info.VfrFileList and Info.ModulePcdList:
-            GuidMacros.append('#define %s %s' % ('FixedPcdGetBool(TokenName)', '_PCD_VALUE_##TokenName'))
-            GuidMacros.append('#define %s %s' % ('FixedPcdGet8(TokenName)', '_PCD_VALUE_##TokenName'))
-            GuidMacros.append('#define %s %s' % ('FixedPcdGet16(TokenName)', '_PCD_VALUE_##TokenName'))
-            GuidMacros.append('#define %s %s' % ('FixedPcdGet32(TokenName)', '_PCD_VALUE_##TokenName'))
-            GuidMacros.append('#define %s %s' % ('FixedPcdGet64(TokenName)', '_PCD_VALUE_##TokenName'))
-            GuidMacros.append('#define %s %s' % ('FeaturePcdGet(TokenName)', '_PCD_VALUE_##TokenName'))
+            GuidMacros.append('#define %s %s' % (
+                'FixedPcdGetBool(TokenName)', '_PCD_VALUE_##TokenName'))
+            GuidMacros.append('#define %s %s' % (
+                'FixedPcdGet8(TokenName)', '_PCD_VALUE_##TokenName'))
+            GuidMacros.append('#define %s %s' % (
+                'FixedPcdGet16(TokenName)', '_PCD_VALUE_##TokenName'))
+            GuidMacros.append('#define %s %s' % (
+                'FixedPcdGet32(TokenName)', '_PCD_VALUE_##TokenName'))
+            GuidMacros.append('#define %s %s' % (
+                'FixedPcdGet64(TokenName)', '_PCD_VALUE_##TokenName'))
+            GuidMacros.append('#define %s %s' % (
+                'FeaturePcdGet(TokenName)', '_PCD_VALUE_##TokenName'))
             for Pcd in Info.ModulePcdList:
                 if Pcd.Type in [TAB_PCDS_FIXED_AT_BUILD, TAB_PCDS_FEATURE_FLAG]:
                     TokenCName = Pcd.TokenCName
@@ -2075,29 +2326,33 @@ def CreateCode(Info, AutoGenC, AutoGenH, StringH, UniGenCFlag, UniGenBinBuffer,
                         if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]:
                             TokenCName = PcdItem[0]
                             break
-                    GuidMacros.append('#define %s %s' % ('_PCD_VALUE_'+TokenCName, Value))
+                    GuidMacros.append('#define %s %s' %
+                                      ('_PCD_VALUE_'+TokenCName, Value))
 
         if Info.IdfFileList:
             GuidMacros.append('#include "%sImgDefs.h"' % Info.Name)
 
         if GuidMacros:
-            StringH.Append('\n#ifdef VFRCOMPILE\n%s\n#endif\n' % '\n'.join(GuidMacros))
+            StringH.Append('\n#ifdef VFRCOMPILE\n%s\n#endif\n' %
+                           '\n'.join(GuidMacros))
 
         StringH.Append("\n#endif\n")
         AutoGenH.Append('#include "%s"\n' % FileName)
 
     if Info.IdfFileList:
         FileName = "%sImgDefs.h" % Info.Name
-        StringIdf.Append(gAutoGenHeaderString.Replace({'FileName':FileName}))
-        StringIdf.Append(gAutoGenHPrologueString.Replace({'File':'IMAGEDEFS', 'Guid':Info.Guid.replace('-', '_')}))
-        CreateIdfFileCode(Info, AutoGenC, StringIdf, IdfGenCFlag, IdfGenBinBuffer)
+        StringIdf.Append(gAutoGenHeaderString.Replace({'FileName': FileName}))
+        StringIdf.Append(gAutoGenHPrologueString.Replace(
+            {'File': 'IMAGEDEFS', 'Guid': Info.Guid.replace('-', '_')}))
+        CreateIdfFileCode(Info, AutoGenC, StringIdf,
+                          IdfGenCFlag, IdfGenBinBuffer)
 
         StringIdf.Append("\n#endif\n")
         AutoGenH.Append('#include "%s"\n' % FileName)
 
     CreateFooterCode(Info, AutoGenC, AutoGenH)
 
-## Create the code file
+# Create the code file
 #
 #   @param      FilePath     The path of code file
 #   @param      Content      The content of code file
@@ -2106,6 +2361,7 @@ def CreateCode(Info, AutoGenC, AutoGenH, StringH, UniGenCFlag, UniGenBinBuffer,
 #   @retval     True        If file content is changed or file doesn't exist
 #   @retval     False       If the file exists and the content is not changed
 #
+
+
 def Generate(FilePath, Content, IsBinaryFile):
     return SaveFileOnChange(FilePath, Content, IsBinaryFile)
-
diff --git a/BaseTools/Source/Python/AutoGen/GenDepex.py b/BaseTools/Source/Python/AutoGen/GenDepex.py
index f2f2e9d65b5f..31be6e897cb9 100644
--- a/BaseTools/Source/Python/AutoGen/GenDepex.py
+++ b/BaseTools/Source/Python/AutoGen/GenDepex.py
@@ -1,10 +1,10 @@
-## @file
+# @file
 # This file is used to generate DEPEX file for module's dependency expression
 #
 # Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
 # SPDX-License-Identifier: BSD-2-Clause-Patent
 
-## Import Modules
+# Import Modules
 #
 import sys
 import Common.LongFilePathOs as os
@@ -22,113 +22,120 @@ from Common import EdkLogger as EdkLogger
 from Common.BuildVersion import gBUILD_VERSION
 from Common.DataType import *
 
-## Regular expression for matching "DEPENDENCY_START ... DEPENDENCY_END"
+# Regular expression for matching "DEPENDENCY_START ... DEPENDENCY_END"
 gStartClosePattern = re.compile(".*DEPENDENCY_START(.+)DEPENDENCY_END.*", re.S)
 
-## Mapping between module type and EFI phase
+# Mapping between module type and EFI phase
 gType2Phase = {
-    SUP_MODULE_BASE              :   None,
-    SUP_MODULE_SEC               :   "PEI",
-    SUP_MODULE_PEI_CORE          :   "PEI",
-    SUP_MODULE_PEIM              :   "PEI",
-    SUP_MODULE_DXE_CORE          :   "DXE",
-    SUP_MODULE_DXE_DRIVER        :   "DXE",
-    SUP_MODULE_DXE_SMM_DRIVER    :   "DXE",
+    SUP_MODULE_BASE:   None,
+    SUP_MODULE_SEC:   "PEI",
+    SUP_MODULE_PEI_CORE:   "PEI",
+    SUP_MODULE_PEIM:   "PEI",
+    SUP_MODULE_DXE_CORE:   "DXE",
+    SUP_MODULE_DXE_DRIVER:   "DXE",
+    SUP_MODULE_DXE_SMM_DRIVER:   "DXE",
     SUP_MODULE_DXE_RUNTIME_DRIVER:   "DXE",
-    SUP_MODULE_DXE_SAL_DRIVER    :   "DXE",
-    SUP_MODULE_UEFI_DRIVER       :   "DXE",
-    SUP_MODULE_UEFI_APPLICATION  :   "DXE",
-    SUP_MODULE_SMM_CORE          :   "DXE",
-    SUP_MODULE_MM_STANDALONE     :   "MM",
-    SUP_MODULE_MM_CORE_STANDALONE :  "MM",
+    SUP_MODULE_DXE_SAL_DRIVER:   "DXE",
+    SUP_MODULE_UEFI_DRIVER:   "DXE",
+    SUP_MODULE_UEFI_APPLICATION:   "DXE",
+    SUP_MODULE_SMM_CORE:   "DXE",
+    SUP_MODULE_MM_STANDALONE:   "MM",
+    SUP_MODULE_MM_CORE_STANDALONE:  "MM",
 }
 
-## Convert dependency expression string into EFI internal representation
+# Convert dependency expression string into EFI internal representation
 #
 #   DependencyExpression class is used to parse dependency expression string and
 # convert it into its binary form.
 #
+
+
 class DependencyExpression:
 
     ArchProtocols = {
-                        '665e3ff6-46cc-11d4-9a38-0090273fc14d',     #   'gEfiBdsArchProtocolGuid'
-                        '26baccb1-6f42-11d4-bce7-0080c73c8881',     #   'gEfiCpuArchProtocolGuid'
-                        '26baccb2-6f42-11d4-bce7-0080c73c8881',     #   'gEfiMetronomeArchProtocolGuid'
-                        '1da97072-bddc-4b30-99f1-72a0b56fff2a',     #   'gEfiMonotonicCounterArchProtocolGuid'
-                        '27cfac87-46cc-11d4-9a38-0090273fc14d',     #   'gEfiRealTimeClockArchProtocolGuid'
-                        '27cfac88-46cc-11d4-9a38-0090273fc14d',     #   'gEfiResetArchProtocolGuid'
-                        'b7dfb4e1-052f-449f-87be-9818fc91b733',     #   'gEfiRuntimeArchProtocolGuid'
-                        'a46423e3-4617-49f1-b9ff-d1bfa9115839',     #   'gEfiSecurityArchProtocolGuid'
-                        '26baccb3-6f42-11d4-bce7-0080c73c8881',     #   'gEfiTimerArchProtocolGuid'
-                        '6441f818-6362-4e44-b570-7dba31dd2453',     #   'gEfiVariableWriteArchProtocolGuid'
-                        '1e5668e2-8481-11d4-bcf1-0080c73c8881',     #   'gEfiVariableArchProtocolGuid'
-                        '665e3ff5-46cc-11d4-9a38-0090273fc14d'      #   'gEfiWatchdogTimerArchProtocolGuid'
-                    }
+        '665e3ff6-46cc-11d4-9a38-0090273fc14d',  # 'gEfiBdsArchProtocolGuid'
+        '26baccb1-6f42-11d4-bce7-0080c73c8881',  # 'gEfiCpuArchProtocolGuid'
+        '26baccb2-6f42-11d4-bce7-0080c73c8881',  # 'gEfiMetronomeArchProtocolGuid'
+        # 'gEfiMonotonicCounterArchProtocolGuid'
+        '1da97072-bddc-4b30-99f1-72a0b56fff2a',
+        '27cfac87-46cc-11d4-9a38-0090273fc14d',  # 'gEfiRealTimeClockArchProtocolGuid'
+        '27cfac88-46cc-11d4-9a38-0090273fc14d',  # 'gEfiResetArchProtocolGuid'
+        'b7dfb4e1-052f-449f-87be-9818fc91b733',  # 'gEfiRuntimeArchProtocolGuid'
+        'a46423e3-4617-49f1-b9ff-d1bfa9115839',  # 'gEfiSecurityArchProtocolGuid'
+        '26baccb3-6f42-11d4-bce7-0080c73c8881',  # 'gEfiTimerArchProtocolGuid'
+        '6441f818-6362-4e44-b570-7dba31dd2453',  # 'gEfiVariableWriteArchProtocolGuid'
+        '1e5668e2-8481-11d4-bcf1-0080c73c8881',  # 'gEfiVariableArchProtocolGuid'
+        '665e3ff5-46cc-11d4-9a38-0090273fc14d'  # 'gEfiWatchdogTimerArchProtocolGuid'
+    }
 
     OpcodePriority = {
-        DEPEX_OPCODE_AND   :   1,
-        DEPEX_OPCODE_OR    :   1,
-        DEPEX_OPCODE_NOT   :   2,
+        DEPEX_OPCODE_AND:   1,
+        DEPEX_OPCODE_OR:   1,
+        DEPEX_OPCODE_NOT:   2,
     }
 
     Opcode = {
-        "PEI"   : {
-            DEPEX_OPCODE_PUSH  :   0x02,
-            DEPEX_OPCODE_AND   :   0x03,
-            DEPEX_OPCODE_OR    :   0x04,
-            DEPEX_OPCODE_NOT   :   0x05,
-            DEPEX_OPCODE_TRUE  :   0x06,
-            DEPEX_OPCODE_FALSE :   0x07,
-            DEPEX_OPCODE_END   :   0x08
+        "PEI": {
+            DEPEX_OPCODE_PUSH:   0x02,
+            DEPEX_OPCODE_AND:   0x03,
+            DEPEX_OPCODE_OR:   0x04,
+            DEPEX_OPCODE_NOT:   0x05,
+            DEPEX_OPCODE_TRUE:   0x06,
+            DEPEX_OPCODE_FALSE:   0x07,
+            DEPEX_OPCODE_END:   0x08
         },
 
-        "DXE"   : {
+        "DXE": {
             DEPEX_OPCODE_BEFORE:   0x00,
-            DEPEX_OPCODE_AFTER :   0x01,
-            DEPEX_OPCODE_PUSH  :   0x02,
-            DEPEX_OPCODE_AND   :   0x03,
-            DEPEX_OPCODE_OR    :   0x04,
-            DEPEX_OPCODE_NOT   :   0x05,
-            DEPEX_OPCODE_TRUE  :   0x06,
-            DEPEX_OPCODE_FALSE :   0x07,
-            DEPEX_OPCODE_END   :   0x08,
-            DEPEX_OPCODE_SOR   :   0x09
+            DEPEX_OPCODE_AFTER:   0x01,
+            DEPEX_OPCODE_PUSH:   0x02,
+            DEPEX_OPCODE_AND:   0x03,
+            DEPEX_OPCODE_OR:   0x04,
+            DEPEX_OPCODE_NOT:   0x05,
+            DEPEX_OPCODE_TRUE:   0x06,
+            DEPEX_OPCODE_FALSE:   0x07,
+            DEPEX_OPCODE_END:   0x08,
+            DEPEX_OPCODE_SOR:   0x09
         },
 
-        "MM"   : {
+        "MM": {
             DEPEX_OPCODE_BEFORE:   0x00,
-            DEPEX_OPCODE_AFTER :   0x01,
-            DEPEX_OPCODE_PUSH  :   0x02,
-            DEPEX_OPCODE_AND   :   0x03,
-            DEPEX_OPCODE_OR    :   0x04,
-            DEPEX_OPCODE_NOT   :   0x05,
-            DEPEX_OPCODE_TRUE  :   0x06,
-            DEPEX_OPCODE_FALSE :   0x07,
-            DEPEX_OPCODE_END   :   0x08,
-            DEPEX_OPCODE_SOR   :   0x09
+            DEPEX_OPCODE_AFTER:   0x01,
+            DEPEX_OPCODE_PUSH:   0x02,
+            DEPEX_OPCODE_AND:   0x03,
+            DEPEX_OPCODE_OR:   0x04,
+            DEPEX_OPCODE_NOT:   0x05,
+            DEPEX_OPCODE_TRUE:   0x06,
+            DEPEX_OPCODE_FALSE:   0x07,
+            DEPEX_OPCODE_END:   0x08,
+            DEPEX_OPCODE_SOR:   0x09
         }
     }
 
     # all supported op codes and operands
-    SupportedOpcode = [DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER, DEPEX_OPCODE_PUSH, DEPEX_OPCODE_AND, DEPEX_OPCODE_OR, DEPEX_OPCODE_NOT, DEPEX_OPCODE_END, DEPEX_OPCODE_SOR]
+    SupportedOpcode = [DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER, DEPEX_OPCODE_PUSH,
+                       DEPEX_OPCODE_AND, DEPEX_OPCODE_OR, DEPEX_OPCODE_NOT, DEPEX_OPCODE_END, DEPEX_OPCODE_SOR]
     SupportedOperand = [DEPEX_OPCODE_TRUE, DEPEX_OPCODE_FALSE]
 
-    OpcodeWithSingleOperand = [DEPEX_OPCODE_NOT, DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER]
+    OpcodeWithSingleOperand = [DEPEX_OPCODE_NOT,
+                               DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER]
     OpcodeWithTwoOperand = [DEPEX_OPCODE_AND, DEPEX_OPCODE_OR]
 
     # op code that should not be the last one
-    NonEndingOpcode = [DEPEX_OPCODE_AND, DEPEX_OPCODE_OR, DEPEX_OPCODE_NOT, DEPEX_OPCODE_SOR]
+    NonEndingOpcode = [DEPEX_OPCODE_AND, DEPEX_OPCODE_OR,
+                       DEPEX_OPCODE_NOT, DEPEX_OPCODE_SOR]
     # op code must not present at the same time
     ExclusiveOpcode = [DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER]
     # op code that should be the first one if it presents
-    AboveAllOpcode = [DEPEX_OPCODE_SOR, DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER]
+    AboveAllOpcode = [DEPEX_OPCODE_SOR,
+                      DEPEX_OPCODE_BEFORE, DEPEX_OPCODE_AFTER]
 
     #
     # open and close brace must be taken as individual tokens
     #
     TokenPattern = re.compile("(\(|\)|\{[^{}]+\{?[^{}]+\}?[ ]*\}|\w+)")
 
-    ## Constructor
+    # Constructor
     #
     #   @param  Expression  The list or string of dependency expression
     #   @param  ModuleType  The type of the module using the dependency expression
@@ -152,7 +159,8 @@ class DependencyExpression:
         EdkLogger.debug(EdkLogger.DEBUG_8, repr(self))
         if Optimize:
             self.Optimize()
-            EdkLogger.debug(EdkLogger.DEBUG_8, "\n    Optimized: " + repr(self))
+            EdkLogger.debug(EdkLogger.DEBUG_8,
+                            "\n    Optimized: " + repr(self))
 
     def __str__(self):
         return " ".join(self.TokenList)
@@ -166,11 +174,11 @@ class DependencyExpression:
                 WellForm += ' ' + Token
         return WellForm
 
-    ## Split the expression string into token list
+    # Split the expression string into token list
     def GetExpressionTokenList(self):
         self.TokenList = self.TokenPattern.findall(self.ExpressionString)
 
-    ## Convert token list into postfix notation
+    # Convert token list into postfix notation
     def GetPostfixNotation(self):
         Stack = []
         LastToken = ''
@@ -198,8 +206,8 @@ class DependencyExpression:
                         EdkLogger.error("GenDepex", PARSER_ERROR, "Invalid dependency expression: missing operator before NOT",
                                         ExtraData="Near %s" % LastToken)
                 elif LastToken in self.SupportedOpcode + ['(', '', None]:
-                        EdkLogger.error("GenDepex", PARSER_ERROR, "Invalid dependency expression: missing operand before " + Token,
-                                        ExtraData="Near %s" % LastToken)
+                    EdkLogger.error("GenDepex", PARSER_ERROR, "Invalid dependency expression: missing operand before " + Token,
+                                    ExtraData="Near %s" % LastToken)
 
                 while len(Stack) > 0:
                     if Stack[-1] == "(" or self.OpcodePriority[Token] >= self.OpcodePriority[Stack[-1]]:
@@ -223,7 +231,8 @@ class DependencyExpression:
                     self.OpcodeList.append(Token)
                 else:
                     EdkLogger.error("GenDepex", PARSER_ERROR,
-                                    "Opcode=%s doesn't supported in %s stage " % (Token, self.Phase),
+                                    "Opcode=%s doesn't supported in %s stage " % (
+                                        Token, self.Phase),
                                     ExtraData=str(self))
                 self.PostfixNotation.append(Token)
             LastToken = Token
@@ -237,7 +246,7 @@ class DependencyExpression:
         if self.PostfixNotation[-1] != DEPEX_OPCODE_END:
             self.PostfixNotation.append(DEPEX_OPCODE_END)
 
-    ## Validate the dependency expression
+    # Validate the dependency expression
     def ValidateOpcode(self):
         for Op in self.AboveAllOpcode:
             if Op in self.PostfixNotation:
@@ -265,14 +274,14 @@ class DependencyExpression:
             EdkLogger.error("GenDepex", PARSER_ERROR, "Extra expressions after END",
                             ExtraData=str(self))
 
-    ## Simply optimize the dependency expression by removing duplicated operands
+    # Simply optimize the dependency expression by removing duplicated operands
     def Optimize(self):
         OpcodeSet = set(self.OpcodeList)
         # if there are isn't one in the set, return
         if len(OpcodeSet) != 1:
-          return
+            return
         Op = OpcodeSet.pop()
-        #if Op isn't either OR or AND, return
+        # if Op isn't either OR or AND, return
         if Op not in [DEPEX_OPCODE_AND, DEPEX_OPCODE_OR]:
             return
         NewOperand = []
@@ -319,25 +328,29 @@ class DependencyExpression:
         self.PostfixNotation = []
         self.GetPostfixNotation()
 
-
-    ## Convert a GUID value in C structure format into its binary form
+    # Convert a GUID value in C structure format into its binary form
     #
     #   @param  Guid    The GUID value in C structure format
     #
     #   @retval array   The byte array representing the GUID value
     #
+
     def GetGuidValue(self, Guid):
-        GuidValueString = Guid.replace("{", "").replace("}", "").replace(" ", "")
+        GuidValueString = Guid.replace(
+            "{", "").replace("}", "").replace(" ", "")
         GuidValueList = GuidValueString.split(",")
         if len(GuidValueList) != 11 and len(GuidValueList) == 16:
-            GuidValueString = GuidStringToGuidStructureString(GuidStructureByteArrayToGuidString(Guid))
-            GuidValueString = GuidValueString.replace("{", "").replace("}", "").replace(" ", "")
+            GuidValueString = GuidStringToGuidStructureString(
+                GuidStructureByteArrayToGuidString(Guid))
+            GuidValueString = GuidValueString.replace(
+                "{", "").replace("}", "").replace(" ", "")
             GuidValueList = GuidValueString.split(",")
         if len(GuidValueList) != 11:
-            EdkLogger.error("GenDepex", PARSER_ERROR, "Invalid GUID value string or opcode: %s" % Guid)
+            EdkLogger.error("GenDepex", PARSER_ERROR,
+                            "Invalid GUID value string or opcode: %s" % Guid)
         return pack("1I2H8B", *(int(value, 16) for value in GuidValueList))
 
-    ## Save the binary form of dependency expression in file
+    # Save the binary form of dependency expression in file
     #
     #   @param  File    The path of file. If None is given, put the data on console
     #
@@ -354,7 +367,8 @@ class DependencyExpression:
                 Buffer.write(pack("B", self.Opcode[self.Phase][Item]))
             elif Item in self.SupportedOpcode:
                 EdkLogger.error("GenDepex", FORMAT_INVALID,
-                                "Opcode [%s] is not expected in %s phase" % (Item, self.Phase),
+                                "Opcode [%s] is not expected in %s phase" % (
+                                    Item, self.Phase),
                                 ExtraData=self.ExpressionString)
             else:
                 Buffer.write(self.GetGuidValue(Item))
@@ -370,19 +384,23 @@ class DependencyExpression:
         Buffer.close()
         return FileChangeFlag
 
+
 versionNumber = ("0.04" + " " + gBUILD_VERSION)
 __version__ = "%prog Version " + versionNumber
 __copyright__ = "Copyright (c) 2007-2018, Intel Corporation  All rights reserved."
 __usage__ = "%prog [options] [dependency_expression_file]"
 
-## Parse command line options
+# Parse command line options
 #
 #   @retval OptionParser
 #
+
+
 def GetOptions():
     from optparse import OptionParser
 
-    Parser = OptionParser(description=__copyright__, version=__version__, usage=__usage__)
+    Parser = OptionParser(description=__copyright__,
+                          version=__version__, usage=__usage__)
 
     Parser.add_option("-o", "--output", dest="OutputFile", default=None, metavar="FILE",
                       help="Specify the name of depex file to be generated")
@@ -394,14 +412,15 @@ def GetOptions():
                       help="Do some simple optimization on the expression.")
     Parser.add_option("-v", "--verbose", dest="verbose", default=False, action="store_true",
                       help="build with verbose information")
-    Parser.add_option("-d", "--debug", action="store", type="int", help="Enable debug messages at specified level.")
+    Parser.add_option("-d", "--debug", action="store", type="int",
+                      help="Enable debug messages at specified level.")
     Parser.add_option("-q", "--quiet", dest="quiet", default=False, action="store_true",
                       help="build with little information")
 
     return Parser.parse_args()
 
 
-## Entrance method
+# Entrance method
 #
 # @retval 0     Tool was successful
 # @retval 1     Tool failed
@@ -422,12 +441,14 @@ def Main():
 
     try:
         if Option.ModuleType is None or Option.ModuleType not in gType2Phase:
-            EdkLogger.error("GenDepex", OPTION_MISSING, "Module type is not specified or supported")
+            EdkLogger.error("GenDepex", OPTION_MISSING,
+                            "Module type is not specified or supported")
 
         DxsFile = ''
         if len(Input) > 0 and Option.Expression == "":
             DxsFile = Input[0]
-            DxsString = open(DxsFile, 'r').read().replace("\n", " ").replace("\r", " ")
+            DxsString = open(DxsFile, 'r').read().replace(
+                "\n", " ").replace("\r", " ")
             DxsString = gStartClosePattern.sub("\\1", DxsString)
         elif Option.Expression != "":
             if Option.Expression[0] == '"':
@@ -435,9 +456,11 @@ def Main():
             else:
                 DxsString = Option.Expression
         else:
-            EdkLogger.error("GenDepex", OPTION_MISSING, "No expression string or file given")
+            EdkLogger.error("GenDepex", OPTION_MISSING,
+                            "No expression string or file given")
 
-        Dpx = DependencyExpression(DxsString, Option.ModuleType, Option.Optimize)
+        Dpx = DependencyExpression(
+            DxsString, Option.ModuleType, Option.Optimize)
         if Option.OutputFile is not None:
             FileChangeFlag = Dpx.Generate(Option.OutputFile)
             if not FileChangeFlag and DxsFile:
@@ -459,6 +482,6 @@ def Main():
 
     return 0
 
+
 if __name__ == '__main__':
     sys.exit(Main())
-
diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
index daec9c6d54b2..acbc3056d918 100755
--- a/BaseTools/Source/Python/AutoGen/GenMake.py
+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Create makefile for MS nmake and GNU make
 #
 # Copyright (c) 2007 - 2021, Intel Corporation. All rights reserved.<BR>
@@ -6,7 +6,7 @@
 # SPDX-License-Identifier: BSD-2-Clause-Patent
 #
 
-## Import Modules
+# Import Modules
 #
 from __future__ import absolute_import
 import Common.LongFilePathOs as os
@@ -24,33 +24,34 @@ import Common.GlobalData as GlobalData
 from collections import OrderedDict
 from Common.DataType import TAB_COMPILER_MSFT
 
-## Regular expression for finding header file inclusions
-gIncludePattern = re.compile(r"^[ \t]*[#%]?[ \t]*include(?:[ \t]*(?:\\(?:\r\n|\r|\n))*[ \t]*)*(?:\(?[\"<]?[ \t]*)([-\w.\\/() \t]+)(?:[ \t]*[\">]?\)?)", re.MULTILINE | re.UNICODE | re.IGNORECASE)
+# Regular expression for finding header file inclusions
+gIncludePattern = re.compile(
+    r"^[ \t]*[#%]?[ \t]*include(?:[ \t]*(?:\\(?:\r\n|\r|\n))*[ \t]*)*(?:\(?[\"<]?[ \t]*)([-\w.\\/() \t]+)(?:[ \t]*[\">]?\)?)", re.MULTILINE | re.UNICODE | re.IGNORECASE)
 
-## Regular expression for matching macro used in header file inclusion
+# Regular expression for matching macro used in header file inclusion
 gMacroPattern = re.compile("([_A-Z][_A-Z0-9]*)[ \t]*\((.+)\)", re.UNICODE)
 
 gIsFileMap = {}
 
-## pattern for include style in Edk.x code
+# pattern for include style in Edk.x code
 gProtocolDefinition = "Protocol/%(HeaderKey)s/%(HeaderKey)s.h"
 gGuidDefinition = "Guid/%(HeaderKey)s/%(HeaderKey)s.h"
 gArchProtocolDefinition = "ArchProtocol/%(HeaderKey)s/%(HeaderKey)s.h"
 gPpiDefinition = "Ppi/%(HeaderKey)s/%(HeaderKey)s.h"
 gIncludeMacroConversion = {
-  "EFI_PROTOCOL_DEFINITION"         :   gProtocolDefinition,
-  "EFI_GUID_DEFINITION"             :   gGuidDefinition,
-  "EFI_ARCH_PROTOCOL_DEFINITION"    :   gArchProtocolDefinition,
-  "EFI_PROTOCOL_PRODUCER"           :   gProtocolDefinition,
-  "EFI_PROTOCOL_CONSUMER"           :   gProtocolDefinition,
-  "EFI_PROTOCOL_DEPENDENCY"         :   gProtocolDefinition,
-  "EFI_ARCH_PROTOCOL_PRODUCER"      :   gArchProtocolDefinition,
-  "EFI_ARCH_PROTOCOL_CONSUMER"      :   gArchProtocolDefinition,
-  "EFI_ARCH_PROTOCOL_DEPENDENCY"    :   gArchProtocolDefinition,
-  "EFI_PPI_DEFINITION"              :   gPpiDefinition,
-  "EFI_PPI_PRODUCER"                :   gPpiDefinition,
-  "EFI_PPI_CONSUMER"                :   gPpiDefinition,
-  "EFI_PPI_DEPENDENCY"              :   gPpiDefinition,
+    "EFI_PROTOCOL_DEFINITION":   gProtocolDefinition,
+    "EFI_GUID_DEFINITION":   gGuidDefinition,
+    "EFI_ARCH_PROTOCOL_DEFINITION":   gArchProtocolDefinition,
+    "EFI_PROTOCOL_PRODUCER":   gProtocolDefinition,
+    "EFI_PROTOCOL_CONSUMER":   gProtocolDefinition,
+    "EFI_PROTOCOL_DEPENDENCY":   gProtocolDefinition,
+    "EFI_ARCH_PROTOCOL_PRODUCER":   gArchProtocolDefinition,
+    "EFI_ARCH_PROTOCOL_CONSUMER":   gArchProtocolDefinition,
+    "EFI_ARCH_PROTOCOL_DEPENDENCY":   gArchProtocolDefinition,
+    "EFI_PPI_DEFINITION":   gPpiDefinition,
+    "EFI_PPI_PRODUCER":   gPpiDefinition,
+    "EFI_PPI_CONSUMER":   gPpiDefinition,
+    "EFI_PPI_DEPENDENCY":   gPpiDefinition,
 }
 
 NMAKE_FILETYPE = "nmake"
@@ -58,21 +59,23 @@ GMAKE_FILETYPE = "gmake"
 WIN32_PLATFORM = "win32"
 POSIX_PLATFORM = "posix"
 
-## BuildFile class
+# BuildFile class
 #
 #  This base class encapsules build file and its generation. It uses template to generate
 #  the content of build file. The content of build file will be got from AutoGen objects.
 #
+
+
 class BuildFile(object):
-    ## template used to generate the build file (i.e. makefile if using make)
+    # template used to generate the build file (i.e. makefile if using make)
     _TEMPLATE_ = TemplateString('')
 
     _DEFAULT_FILE_NAME_ = "Makefile"
 
-    ## default file name for each type of build file
+    # default file name for each type of build file
     _FILE_NAME_ = {
-        NMAKE_FILETYPE :   "Makefile",
-        GMAKE_FILETYPE :   "GNUmakefile"
+        NMAKE_FILETYPE:   "Makefile",
+        GMAKE_FILETYPE:   "GNUmakefile"
     }
 
     # Get Makefile name.
@@ -82,7 +85,7 @@ class BuildFile(object):
         else:
             return self._FILE_NAME_[self._FileType]
 
-    ## Fixed header string for makefile
+    # Fixed header string for makefile
     _MAKEFILE_HEADER = '''#
 # DO NOT EDIT
 # This file is auto-generated by build utility
@@ -97,13 +100,13 @@ class BuildFile(object):
 #
     '''
 
-    ## Header string for each type of build file
+    # Header string for each type of build file
     _FILE_HEADER_ = {
-        NMAKE_FILETYPE :   _MAKEFILE_HEADER % _FILE_NAME_[NMAKE_FILETYPE],
-        GMAKE_FILETYPE :   _MAKEFILE_HEADER % _FILE_NAME_[GMAKE_FILETYPE]
+        NMAKE_FILETYPE:   _MAKEFILE_HEADER % _FILE_NAME_[NMAKE_FILETYPE],
+        GMAKE_FILETYPE:   _MAKEFILE_HEADER % _FILE_NAME_[GMAKE_FILETYPE]
     }
 
-    ## shell commands which can be used in build file in the form of macro
+    # shell commands which can be used in build file in the form of macro
     #   $(CP)     copy file command
     #   $(MV)     move file command
     #   $(RM)     remove file command
@@ -111,64 +114,65 @@ class BuildFile(object):
     #   $(RD)     remove dir command
     #
     _SHELL_CMD_ = {
-        WIN32_PLATFORM : {
-            "CP"    :   "copy /y",
-            "MV"    :   "move /y",
-            "RM"    :   "del /f /q",
-            "MD"    :   "mkdir",
-            "RD"    :   "rmdir /s /q",
+        WIN32_PLATFORM: {
+            "CP":   "copy /y",
+            "MV":   "move /y",
+            "RM":   "del /f /q",
+            "MD":   "mkdir",
+            "RD":   "rmdir /s /q",
         },
 
-        POSIX_PLATFORM : {
-            "CP"    :   "cp -p -f",
-            "MV"    :   "mv -f",
-            "RM"    :   "rm -f",
-            "MD"    :   "mkdir -p",
-            "RD"    :   "rm -r -f",
+        POSIX_PLATFORM: {
+            "CP":   "cp -p -f",
+            "MV":   "mv -f",
+            "RM":   "rm -f",
+            "MD":   "mkdir -p",
+            "RD":   "rm -r -f",
         }
     }
 
-    ## directory separator
+    # directory separator
     _SEP_ = {
-        WIN32_PLATFORM :   "\\",
-        POSIX_PLATFORM :   "/"
+        WIN32_PLATFORM:   "\\",
+        POSIX_PLATFORM:   "/"
     }
 
-    ## directory creation template
+    # directory creation template
     _MD_TEMPLATE_ = {
-        WIN32_PLATFORM :   'if not exist %(dir)s $(MD) %(dir)s',
-        POSIX_PLATFORM :   "$(MD) %(dir)s"
+        WIN32_PLATFORM:   'if not exist %(dir)s $(MD) %(dir)s',
+        POSIX_PLATFORM:   "$(MD) %(dir)s"
     }
 
-    ## directory removal template
+    # directory removal template
     _RD_TEMPLATE_ = {
-        WIN32_PLATFORM :   'if exist %(dir)s $(RD) %(dir)s',
-        POSIX_PLATFORM :   "$(RD) %(dir)s"
+        WIN32_PLATFORM:   'if exist %(dir)s $(RD) %(dir)s',
+        POSIX_PLATFORM:   "$(RD) %(dir)s"
     }
-    ## cp if exist
+    # cp if exist
     _CP_TEMPLATE_ = {
-        WIN32_PLATFORM :   'if exist %(Src)s $(CP) %(Src)s %(Dst)s',
-        POSIX_PLATFORM :   "test -f %(Src)s && $(CP) %(Src)s %(Dst)s"
+        WIN32_PLATFORM:   'if exist %(Src)s $(CP) %(Src)s %(Dst)s',
+        POSIX_PLATFORM:   "test -f %(Src)s && $(CP) %(Src)s %(Dst)s"
     }
 
     _CD_TEMPLATE_ = {
-        WIN32_PLATFORM :   'if exist %(dir)s cd %(dir)s',
-        POSIX_PLATFORM :   "test -e %(dir)s && cd %(dir)s"
+        WIN32_PLATFORM:   'if exist %(dir)s cd %(dir)s',
+        POSIX_PLATFORM:   "test -e %(dir)s && cd %(dir)s"
     }
 
     _MAKE_TEMPLATE_ = {
-        WIN32_PLATFORM :   'if exist %(file)s "$(MAKE)" $(MAKE_FLAGS) -f %(file)s',
-        POSIX_PLATFORM :   'test -e %(file)s && "$(MAKE)" $(MAKE_FLAGS) -f %(file)s'
+        WIN32_PLATFORM:   'if exist %(file)s "$(MAKE)" $(MAKE_FLAGS) -f %(file)s',
+        POSIX_PLATFORM:   'test -e %(file)s && "$(MAKE)" $(MAKE_FLAGS) -f %(file)s'
     }
 
     _INCLUDE_CMD_ = {
-        NMAKE_FILETYPE :   '!INCLUDE',
-        GMAKE_FILETYPE :   "include"
+        NMAKE_FILETYPE:   '!INCLUDE',
+        GMAKE_FILETYPE:   "include"
     }
 
-    _INC_FLAG_ = {TAB_COMPILER_MSFT : "/I", "GCC" : "-I", "INTEL" : "-I", "NASM" : "-I"}
+    _INC_FLAG_ = {TAB_COMPILER_MSFT: "/I",
+                  "GCC": "-I", "INTEL": "-I", "NASM": "-I"}
 
-    ## Constructor of BuildFile
+    # Constructor of BuildFile
     #
     #   @param  AutoGenObject   Object of AutoGen class
     #
@@ -188,7 +192,7 @@ class BuildFile(object):
         else:
             self._Platform = POSIX_PLATFORM
 
-    ## Create build file.
+    # Create build file.
     #
     #  Only nmake and gmake are supported.
     #
@@ -199,33 +203,33 @@ class BuildFile(object):
         FileContent = self._TEMPLATE_.Replace(self._TemplateDict)
         FileName = self.getMakefileName()
         if not os.path.exists(os.path.join(self._AutoGenObject.MakeFileDir, "deps.txt")):
-            with open(os.path.join(self._AutoGenObject.MakeFileDir, "deps.txt"),"w+") as fd:
+            with open(os.path.join(self._AutoGenObject.MakeFileDir, "deps.txt"), "w+") as fd:
                 fd.write("")
         if not os.path.exists(os.path.join(self._AutoGenObject.MakeFileDir, "dependency")):
-            with open(os.path.join(self._AutoGenObject.MakeFileDir, "dependency"),"w+") as fd:
+            with open(os.path.join(self._AutoGenObject.MakeFileDir, "dependency"), "w+") as fd:
                 fd.write("")
         if not os.path.exists(os.path.join(self._AutoGenObject.MakeFileDir, "deps_target")):
-            with open(os.path.join(self._AutoGenObject.MakeFileDir, "deps_target"),"w+") as fd:
+            with open(os.path.join(self._AutoGenObject.MakeFileDir, "deps_target"), "w+") as fd:
                 fd.write("")
         return SaveFileOnChange(os.path.join(self._AutoGenObject.MakeFileDir, FileName), FileContent, False)
 
-    ## Return a list of directory creation command string
+    # Return a list of directory creation command string
     #
     #   @param      DirList     The list of directory to be created
     #
     #   @retval     list        The directory creation command list
     #
     def GetCreateDirectoryCommand(self, DirList):
-        return [self._MD_TEMPLATE_[self._Platform] % {'dir':Dir} for Dir in DirList]
+        return [self._MD_TEMPLATE_[self._Platform] % {'dir': Dir} for Dir in DirList]
 
-    ## Return a list of directory removal command string
+    # Return a list of directory removal command string
     #
     #   @param      DirList     The list of directory to be removed
     #
     #   @retval     list        The directory removal command list
     #
     def GetRemoveDirectoryCommand(self, DirList):
-        return [self._RD_TEMPLATE_[self._Platform] % {'dir':Dir} for Dir in DirList]
+        return [self._RD_TEMPLATE_[self._Platform] % {'dir': Dir} for Dir in DirList]
 
     def PlaceMacro(self, Path, MacroDefinitions=None):
         if Path.startswith("$("):
@@ -244,13 +248,15 @@ class BuildFile(object):
                     break
             return Path
 
-## ModuleMakefile class
+# ModuleMakefile class
 #
 #  This class encapsules makefie and its generation for module. It uses template to generate
 #  the content of makefile. The content of makefile will be got from ModuleAutoGen object.
 #
+
+
 class ModuleMakefile(BuildFile):
-    ## template used to generate the makefile for module
+    # template used to generate the makefile for module
     _TEMPLATE_ = TemplateString('''\
 ${makefile_header}
 
@@ -432,10 +438,12 @@ cleanlib:
 \t${BEGIN}-@${library_build_command} cleanall
 \t${END}@cd $(MODULE_BUILD_DIR)\n\n''')
 
-    _FILE_MACRO_TEMPLATE = TemplateString("${macro_name} = ${BEGIN} \\\n    ${source_file}${END}\n")
-    _BUILD_TARGET_TEMPLATE = TemplateString("${BEGIN}${target} : ${deps}\n${END}\t${cmd}\n")
+    _FILE_MACRO_TEMPLATE = TemplateString(
+        "${macro_name} = ${BEGIN} \\\n    ${source_file}${END}\n")
+    _BUILD_TARGET_TEMPLATE = TemplateString(
+        "${BEGIN}${target} : ${deps}\n${END}\t${cmd}\n")
 
-    ## Constructor of ModuleMakefile
+    # Constructor of ModuleMakefile
     #
     #   @param  ModuleAutoGen   Object of ModuleAutoGen class
     #
@@ -460,14 +468,14 @@ cleanlib:
         self.LibraryBuildDirectoryList = []
         self.SystemLibraryList = []
         self.Macros = OrderedDict()
-        self.Macros["OUTPUT_DIR"      ] = self._AutoGenObject.Macros["OUTPUT_DIR"]
-        self.Macros["DEBUG_DIR"       ] = self._AutoGenObject.Macros["DEBUG_DIR"]
+        self.Macros["OUTPUT_DIR"] = self._AutoGenObject.Macros["OUTPUT_DIR"]
+        self.Macros["DEBUG_DIR"] = self._AutoGenObject.Macros["DEBUG_DIR"]
         self.Macros["MODULE_BUILD_DIR"] = self._AutoGenObject.Macros["MODULE_BUILD_DIR"]
-        self.Macros["BIN_DIR"         ] = self._AutoGenObject.Macros["BIN_DIR"]
-        self.Macros["BUILD_DIR"       ] = self._AutoGenObject.Macros["BUILD_DIR"]
-        self.Macros["WORKSPACE"       ] = self._AutoGenObject.Macros["WORKSPACE"]
-        self.Macros["FFS_OUTPUT_DIR"  ] = self._AutoGenObject.Macros["FFS_OUTPUT_DIR"]
-        self.GenFfsList                 = ModuleAutoGen.GenFfsList
+        self.Macros["BIN_DIR"] = self._AutoGenObject.Macros["BIN_DIR"]
+        self.Macros["BUILD_DIR"] = self._AutoGenObject.Macros["BUILD_DIR"]
+        self.Macros["WORKSPACE"] = self._AutoGenObject.Macros["WORKSPACE"]
+        self.Macros["FFS_OUTPUT_DIR"] = self._AutoGenObject.Macros["FFS_OUTPUT_DIR"]
+        self.GenFfsList = ModuleAutoGen.GenFfsList
         self.MacroList = ['FFS_OUTPUT_DIR', 'MODULE_GUID', 'OUTPUT_DIR']
         self.FfsOutputFileList = []
         self.DependencyHeaderFileSet = set()
@@ -534,7 +542,8 @@ cleanlib:
                         continue
                     # Remove duplicated include path, if any
                     if Attr == "FLAGS":
-                        Value = RemoveDupOption(Value, IncPrefix, MyAgo.IncludePathList)
+                        Value = RemoveDupOption(
+                            Value, IncPrefix, MyAgo.IncludePathList)
                         if Tool == "OPTROM" and PCI_COMPRESS_Flag:
                             ValueList = Value.split()
                             if ValueList:
@@ -554,7 +563,8 @@ cleanlib:
         if RespDict:
             RespFileListContent = ''
             for Resp in RespDict:
-                RespFile = os.path.join(MyAgo.OutputDir, str(Resp).lower() + '.txt')
+                RespFile = os.path.join(
+                    MyAgo.OutputDir, str(Resp).lower() + '.txt')
                 StrList = RespDict[Resp].split(' ')
                 UnexpandMacro = []
                 NewStr = []
@@ -566,7 +576,8 @@ cleanlib:
                 UnexpandMacroStr = ' '.join(UnexpandMacro)
                 NewRespStr = ' '.join(NewStr)
                 SaveFileOnChange(RespFile, NewRespStr, False)
-                ToolsDef.append("%s = %s" % (Resp, UnexpandMacroStr + ' @' + RespFile))
+                ToolsDef.append("%s = %s" %
+                                (Resp, UnexpandMacroStr + ' @' + RespFile))
                 RespFileListContent += '@' + RespFile + TAB_LINE_BREAK
                 RespFileListContent += NewRespStr + TAB_LINE_BREAK
             SaveFileOnChange(RespFileList, RespFileListContent, False)
@@ -584,36 +595,39 @@ cleanlib:
         self.ParserGenerateFfsCmd()
 
         # Generate macros used to represent input files
-        FileMacroList = [] # macro name = file list
+        FileMacroList = []  # macro name = file list
         for FileListMacro in self.FileListMacros:
             FileMacro = self._FILE_MACRO_TEMPLATE.Replace(
-                                                    {
-                                                        "macro_name"  : FileListMacro,
-                                                        "source_file" : self.FileListMacros[FileListMacro]
-                                                    }
-                                                    )
+                {
+                    "macro_name": FileListMacro,
+                    "source_file": self.FileListMacros[FileListMacro]
+                }
+            )
             FileMacroList.append(FileMacro)
 
         # INC_LIST is special
         FileMacro = ""
         IncludePathList = []
-        for P in  MyAgo.IncludePathList:
+        for P in MyAgo.IncludePathList:
             IncludePathList.append(IncPrefix + self.PlaceMacro(P, self.Macros))
             if FileBuildRule.INC_LIST_MACRO in self.ListFileMacros:
-                self.ListFileMacros[FileBuildRule.INC_LIST_MACRO].append(IncPrefix + P)
+                self.ListFileMacros[FileBuildRule.INC_LIST_MACRO].append(
+                    IncPrefix + P)
         FileMacro += self._FILE_MACRO_TEMPLATE.Replace(
-                                                {
-                                                    "macro_name"   : "INC",
-                                                    "source_file" : IncludePathList
-                                                }
-                                                )
+            {
+                "macro_name": "INC",
+                "source_file": IncludePathList
+            }
+        )
         FileMacroList.append(FileMacro)
         # Add support when compiling .nasm source files
         IncludePathList = []
-        asmsource = [item for item in MyAgo.SourceFileList if item.File.upper().endswith((".NASM",".ASM",".NASMB","S"))]
+        asmsource = [item for item in MyAgo.SourceFileList if item.File.upper(
+        ).endswith((".NASM", ".ASM", ".NASMB", "S"))]
         if asmsource:
-            for P in  MyAgo.IncludePathList:
-                IncludePath = self._INC_FLAG_['NASM'] + self.PlaceMacro(P, self.Macros)
+            for P in MyAgo.IncludePathList:
+                IncludePath = self._INC_FLAG_[
+                    'NASM'] + self.PlaceMacro(P, self.Macros)
                 if IncludePath.endswith(os.sep):
                     IncludePath = IncludePath.rstrip(os.sep)
                 # When compiling .nasm files, need to add a literal backslash at each path.
@@ -624,29 +638,33 @@ cleanlib:
                 else:
                     IncludePath = os.path.join(IncludePath, '')
                 IncludePathList.append(IncludePath)
-            FileMacroList.append(self._FILE_MACRO_TEMPLATE.Replace({"macro_name": "NASM_INC", "source_file": IncludePathList}))
+            FileMacroList.append(self._FILE_MACRO_TEMPLATE.Replace(
+                {"macro_name": "NASM_INC", "source_file": IncludePathList}))
 
         # Generate macros used to represent files containing list of input files
         for ListFileMacro in self.ListFileMacros:
-            ListFileName = os.path.join(MyAgo.OutputDir, "%s.lst" % ListFileMacro.lower()[:len(ListFileMacro) - 5])
+            ListFileName = os.path.join(MyAgo.OutputDir, "%s.lst" % ListFileMacro.lower()[
+                                        :len(ListFileMacro) - 5])
             FileMacroList.append("%s = %s" % (ListFileMacro, ListFileName))
             SaveFileOnChange(
                 ListFileName,
                 "\n".join(self.ListFileMacros[ListFileMacro]),
                 False
-                )
+            )
 
         # Generate objlist used to create .obj file
         for Type in self.ObjTargetDict:
             NewLine = ' '.join(list(self.ObjTargetDict[Type]))
-            FileMacroList.append("OBJLIST_%s = %s" % (list(self.ObjTargetDict.keys()).index(Type), NewLine))
+            FileMacroList.append("OBJLIST_%s = %s" % (
+                list(self.ObjTargetDict.keys()).index(Type), NewLine))
 
         BcTargetList = []
 
         MakefileName = self.getMakefileName()
         LibraryMakeCommandList = []
         for D in self.LibraryBuildDirectoryList:
-            Command = self._MAKE_TEMPLATE_[self._Platform] % {"file":os.path.join(D, MakefileName)}
+            Command = self._MAKE_TEMPLATE_[self._Platform] % {
+                "file": os.path.join(D, MakefileName)}
             LibraryMakeCommandList.append(Command)
 
         package_rel_dir = MyAgo.SourceDir
@@ -663,66 +681,67 @@ cleanlib:
             package_rel_dir = package_rel_dir[index + 1:]
 
         MakefileTemplateDict = {
-            "makefile_header"           : self._FILE_HEADER_[self._FileType],
-            "makefile_path"             : os.path.join("$(MODULE_BUILD_DIR)", MakefileName),
-            "makefile_name"             : MakefileName,
-            "platform_name"             : self.PlatformInfo.Name,
-            "platform_guid"             : self.PlatformInfo.Guid,
-            "platform_version"          : self.PlatformInfo.Version,
+            "makefile_header": self._FILE_HEADER_[self._FileType],
+            "makefile_path": os.path.join("$(MODULE_BUILD_DIR)", MakefileName),
+            "makefile_name": MakefileName,
+            "platform_name": self.PlatformInfo.Name,
+            "platform_guid": self.PlatformInfo.Guid,
+            "platform_version": self.PlatformInfo.Version,
             "platform_relative_directory": self.PlatformInfo.SourceDir,
-            "platform_output_directory" : self.PlatformInfo.OutputDir,
-            "ffs_output_directory"      : MyAgo.Macros["FFS_OUTPUT_DIR"],
-            "platform_dir"              : MyAgo.Macros["PLATFORM_DIR"],
+            "platform_output_directory": self.PlatformInfo.OutputDir,
+            "ffs_output_directory": MyAgo.Macros["FFS_OUTPUT_DIR"],
+            "platform_dir": MyAgo.Macros["PLATFORM_DIR"],
 
-            "module_name"               : MyAgo.Name,
-            "module_guid"               : MyAgo.Guid,
-            "module_name_guid"          : MyAgo.UniqueBaseName,
-            "module_version"            : MyAgo.Version,
-            "module_type"               : MyAgo.ModuleType,
-            "module_file"               : MyAgo.MetaFile.Name,
-            "module_file_base_name"     : MyAgo.MetaFile.BaseName,
-            "module_relative_directory" : MyAgo.SourceDir,
-            "module_dir"                : mws.join (self.Macros["WORKSPACE"], MyAgo.SourceDir),
+            "module_name": MyAgo.Name,
+            "module_guid": MyAgo.Guid,
+            "module_name_guid": MyAgo.UniqueBaseName,
+            "module_version": MyAgo.Version,
+            "module_type": MyAgo.ModuleType,
+            "module_file": MyAgo.MetaFile.Name,
+            "module_file_base_name": MyAgo.MetaFile.BaseName,
+            "module_relative_directory": MyAgo.SourceDir,
+            "module_dir": mws.join(self.Macros["WORKSPACE"], MyAgo.SourceDir),
             "package_relative_directory": package_rel_dir,
-            "module_extra_defines"      : ["%s = %s" % (k, v) for k, v in MyAgo.Module.Defines.items()],
+            "module_extra_defines": ["%s = %s" % (k, v) for k, v in MyAgo.Module.Defines.items()],
 
-            "architecture"              : MyAgo.Arch,
-            "toolchain_tag"             : MyAgo.ToolChain,
-            "build_target"              : MyAgo.BuildTarget,
+            "architecture": MyAgo.Arch,
+            "toolchain_tag": MyAgo.ToolChain,
+            "build_target": MyAgo.BuildTarget,
 
-            "platform_build_directory"  : self.PlatformInfo.BuildDir,
-            "module_build_directory"    : MyAgo.BuildDir,
-            "module_output_directory"   : MyAgo.OutputDir,
-            "module_debug_directory"    : MyAgo.DebugDir,
+            "platform_build_directory": self.PlatformInfo.BuildDir,
+            "module_build_directory": MyAgo.BuildDir,
+            "module_output_directory": MyAgo.OutputDir,
+            "module_debug_directory": MyAgo.DebugDir,
 
-            "separator"                 : Separator,
-            "module_tool_definitions"   : ToolsDef,
+            "separator": Separator,
+            "module_tool_definitions": ToolsDef,
 
-            "shell_command_code"        : list(self._SHELL_CMD_[self._Platform].keys()),
-            "shell_command"             : list(self._SHELL_CMD_[self._Platform].values()),
+            "shell_command_code": list(self._SHELL_CMD_[self._Platform].keys()),
+            "shell_command": list(self._SHELL_CMD_[self._Platform].values()),
 
-            "module_entry_point"        : ModuleEntryPoint,
-            "image_entry_point"         : ImageEntryPoint,
-            "arch_entry_point"          : ArchEntryPoint,
-            "remaining_build_target"    : self.ResultFileList,
-            "common_dependency_file"    : self.CommonFileDependency,
-            "create_directory_command"  : self.GetCreateDirectoryCommand(self.IntermediateDirectoryList),
-            "clean_command"             : self.GetRemoveDirectoryCommand(["$(OUTPUT_DIR)"]),
-            "cleanall_command"          : self.GetRemoveDirectoryCommand(["$(DEBUG_DIR)", "$(OUTPUT_DIR)"]),
-            "dependent_library_build_directory" : self.LibraryBuildDirectoryList,
-            "library_build_command"     : LibraryMakeCommandList,
-            "file_macro"                : FileMacroList,
-            "file_build_target"         : self.BuildTargetList,
+            "module_entry_point": ModuleEntryPoint,
+            "image_entry_point": ImageEntryPoint,
+            "arch_entry_point": ArchEntryPoint,
+            "remaining_build_target": self.ResultFileList,
+            "common_dependency_file": self.CommonFileDependency,
+            "create_directory_command": self.GetCreateDirectoryCommand(self.IntermediateDirectoryList),
+            "clean_command": self.GetRemoveDirectoryCommand(["$(OUTPUT_DIR)"]),
+            "cleanall_command": self.GetRemoveDirectoryCommand(["$(DEBUG_DIR)", "$(OUTPUT_DIR)"]),
+            "dependent_library_build_directory": self.LibraryBuildDirectoryList,
+            "library_build_command": LibraryMakeCommandList,
+            "file_macro": FileMacroList,
+            "file_build_target": self.BuildTargetList,
             "backward_compatible_target": BcTargetList,
-            "INCLUDETAG"                   : "\n".join([self._INCLUDE_CMD_[self._FileType] + " " + os.path.join("$(MODULE_BUILD_DIR)","dependency"),
-                                                              self._INCLUDE_CMD_[self._FileType] + " " + os.path.join("$(MODULE_BUILD_DIR)","deps_target")
-                                                              ])
+            "INCLUDETAG": "\n".join([self._INCLUDE_CMD_[self._FileType] + " " + os.path.join("$(MODULE_BUILD_DIR)", "dependency"),
+                                     self._INCLUDE_CMD_[
+                                         self._FileType] + " " + os.path.join("$(MODULE_BUILD_DIR)", "deps_target")
+                                     ])
         }
 
         return MakefileTemplateDict
 
     def ParserGenerateFfsCmd(self):
-        #Add Ffs cmd to self.BuildTargetList
+        # Add Ffs cmd to self.BuildTargetList
         OutputFile = ''
         DepsFileList = []
 
@@ -734,9 +753,10 @@ cleanlib:
                     Dst = self.ReplaceMacro(Dst)
                     if Dst not in self.ResultFileList:
                         self.ResultFileList.append(Dst)
-                    if '%s :' %(Dst) not in self.BuildTargetList:
-                        self.BuildTargetList.append("%s : %s" %(Dst,Src))
-                        self.BuildTargetList.append('\t' + self._CP_TEMPLATE_[self._Platform] %{'Src': Src, 'Dst': Dst})
+                    if '%s :' % (Dst) not in self.BuildTargetList:
+                        self.BuildTargetList.append("%s : %s" % (Dst, Src))
+                        self.BuildTargetList.append(
+                            '\t' + self._CP_TEMPLATE_[self._Platform] % {'Src': Src, 'Dst': Dst})
 
             FfsCmdList = Cmd[0]
             for index, Str in enumerate(FfsCmdList):
@@ -753,14 +773,16 @@ cleanlib:
             OutputFile = self.ReplaceMacro(OutputFile)
             self.ResultFileList.append(OutputFile)
             DepsFileString = self.ReplaceMacro(DepsFileString)
-            self.BuildTargetList.append('%s : %s' % (OutputFile, DepsFileString))
+            self.BuildTargetList.append(
+                '%s : %s' % (OutputFile, DepsFileString))
             CmdString = ' '.join(FfsCmdList).strip()
             CmdString = self.ReplaceMacro(CmdString)
             self.BuildTargetList.append('\t%s' % CmdString)
 
             self.ParseSecCmd(DepsFileList, Cmd[1])
-            for SecOutputFile, SecDepsFile, SecCmd in self.FfsOutputFileList :
-                self.BuildTargetList.append('%s : %s' % (self.ReplaceMacro(SecOutputFile), self.ReplaceMacro(SecDepsFile)))
+            for SecOutputFile, SecDepsFile, SecCmd in self.FfsOutputFileList:
+                self.BuildTargetList.append('%s : %s' % (self.ReplaceMacro(
+                    SecOutputFile), self.ReplaceMacro(SecDepsFile)))
                 self.BuildTargetList.append('\t%s' % self.ReplaceMacro(SecCmd))
             self.FfsOutputFileList = []
 
@@ -778,10 +800,13 @@ cleanlib:
                                 SecDepsFileList.append(SecCmdList[index + 1])
                             index = index + 1
                         if CmdName == 'Trim':
-                            SecDepsFileList.append(os.path.join('$(DEBUG_DIR)', os.path.basename(OutputFile).replace('offset', 'efi')))
+                            SecDepsFileList.append(os.path.join(
+                                '$(DEBUG_DIR)', os.path.basename(OutputFile).replace('offset', 'efi')))
                         if OutputFile.endswith('.ui') or OutputFile.endswith('.ver'):
-                            SecDepsFileList.append(os.path.join('$(MODULE_DIR)', '$(MODULE_FILE)'))
-                        self.FfsOutputFileList.append((OutputFile, ' '.join(SecDepsFileList), SecCmdStr))
+                            SecDepsFileList.append(os.path.join(
+                                '$(MODULE_DIR)', '$(MODULE_FILE)'))
+                        self.FfsOutputFileList.append(
+                            (OutputFile, ' '.join(SecDepsFileList), SecCmdStr))
                         if len(SecDepsFileList) > 0:
                             self.ParseSecCmd(SecDepsFileList, CmdTuple)
                         break
@@ -798,14 +823,14 @@ cleanlib:
 
     def CommandExceedLimit(self):
         FlagDict = {
-                    'CC'    :  { 'Macro' : '$(CC_FLAGS)',    'Value' : False},
-                    'PP'    :  { 'Macro' : '$(PP_FLAGS)',    'Value' : False},
-                    'APP'   :  { 'Macro' : '$(APP_FLAGS)',   'Value' : False},
-                    'ASLPP' :  { 'Macro' : '$(ASLPP_FLAGS)', 'Value' : False},
-                    'VFRPP' :  { 'Macro' : '$(VFRPP_FLAGS)', 'Value' : False},
-                    'ASM'   :  { 'Macro' : '$(ASM_FLAGS)',   'Value' : False},
-                    'ASLCC' :  { 'Macro' : '$(ASLCC_FLAGS)', 'Value' : False},
-                   }
+            'CC':  {'Macro': '$(CC_FLAGS)',    'Value': False},
+            'PP':  {'Macro': '$(PP_FLAGS)',    'Value': False},
+            'APP':  {'Macro': '$(APP_FLAGS)',   'Value': False},
+            'ASLPP':  {'Macro': '$(ASLPP_FLAGS)', 'Value': False},
+            'VFRPP':  {'Macro': '$(VFRPP_FLAGS)', 'Value': False},
+            'ASM':  {'Macro': '$(ASM_FLAGS)',   'Value': False},
+            'ASLCC':  {'Macro': '$(ASLCC_FLAGS)', 'Value': False},
+        }
 
         RespDict = {}
         FileTypeList = []
@@ -830,44 +855,53 @@ cleanlib:
                         SingleCommandList = SingleCommand.split()
                         if len(SingleCommandList) > 0:
                             for Flag in FlagDict:
-                                if '$('+ Flag +')' in SingleCommandList[0]:
+                                if '$(' + Flag + ')' in SingleCommandList[0]:
                                     Tool = Flag
                                     break
                         if Tool:
                             if 'PATH' not in self._AutoGenObject.BuildOption[Tool]:
-                                EdkLogger.error("build", AUTOGEN_ERROR, "%s_PATH doesn't exist in %s ToolChain and %s Arch." %(Tool, self._AutoGenObject.ToolChain, self._AutoGenObject.Arch), ExtraData="[%s]" % str(self._AutoGenObject))
-                            SingleCommandLength += len(self._AutoGenObject.BuildOption[Tool]['PATH'])
+                                EdkLogger.error("build", AUTOGEN_ERROR, "%s_PATH doesn't exist in %s ToolChain and %s Arch." % (
+                                    Tool, self._AutoGenObject.ToolChain, self._AutoGenObject.Arch), ExtraData="[%s]" % str(self._AutoGenObject))
+                            SingleCommandLength += len(
+                                self._AutoGenObject.BuildOption[Tool]['PATH'])
                             for item in SingleCommandList[1:]:
                                 if FlagDict[Tool]['Macro'] in item:
                                     if 'FLAGS' not in self._AutoGenObject.BuildOption[Tool]:
-                                        EdkLogger.error("build", AUTOGEN_ERROR, "%s_FLAGS doesn't exist in %s ToolChain and %s Arch." %(Tool, self._AutoGenObject.ToolChain, self._AutoGenObject.Arch), ExtraData="[%s]" % str(self._AutoGenObject))
+                                        EdkLogger.error("build", AUTOGEN_ERROR, "%s_FLAGS doesn't exist in %s ToolChain and %s Arch." % (
+                                            Tool, self._AutoGenObject.ToolChain, self._AutoGenObject.Arch), ExtraData="[%s]" % str(self._AutoGenObject))
                                     Str = self._AutoGenObject.BuildOption[Tool]['FLAGS']
                                     for Option in self._AutoGenObject.BuildOption:
                                         for Attr in self._AutoGenObject.BuildOption[Option]:
                                             if Str.find(Option + '_' + Attr) != -1:
-                                                Str = Str.replace('$(' + Option + '_' + Attr + ')', self._AutoGenObject.BuildOption[Option][Attr])
+                                                Str = Str.replace(
+                                                    '$(' + Option + '_' + Attr + ')', self._AutoGenObject.BuildOption[Option][Attr])
                                     while(Str.find('$(') != -1):
                                         for macro in self._AutoGenObject.Macros:
-                                            MacroName = '$('+ macro + ')'
+                                            MacroName = '$(' + macro + ')'
                                             if (Str.find(MacroName) != -1):
-                                                Str = Str.replace(MacroName, self._AutoGenObject.Macros[macro])
+                                                Str = Str.replace(
+                                                    MacroName, self._AutoGenObject.Macros[macro])
                                                 break
                                         else:
                                             break
                                     SingleCommandLength += len(Str)
                                 elif '$(INC)' in item:
-                                    SingleCommandLength += self._AutoGenObject.IncludePathLength + len(IncPrefix) * len(self._AutoGenObject.IncludePathList)
+                                    SingleCommandLength += self._AutoGenObject.IncludePathLength + \
+                                        len(IncPrefix) * \
+                                        len(self._AutoGenObject.IncludePathList)
                                 elif item.find('$(') != -1:
                                     Str = item
                                     for Option in self._AutoGenObject.BuildOption:
                                         for Attr in self._AutoGenObject.BuildOption[Option]:
                                             if Str.find(Option + '_' + Attr) != -1:
-                                                Str = Str.replace('$(' + Option + '_' + Attr + ')', self._AutoGenObject.BuildOption[Option][Attr])
+                                                Str = Str.replace(
+                                                    '$(' + Option + '_' + Attr + ')', self._AutoGenObject.BuildOption[Option][Attr])
                                     while(Str.find('$(') != -1):
                                         for macro in self._AutoGenObject.Macros:
-                                            MacroName = '$('+ macro + ')'
+                                            MacroName = '$(' + macro + ')'
                                             if (Str.find(MacroName) != -1):
-                                                Str = Str.replace(MacroName, self._AutoGenObject.Macros[macro])
+                                                Str = Str.replace(
+                                                    MacroName, self._AutoGenObject.Macros[macro])
                                                 break
                                         else:
                                             break
@@ -880,19 +914,22 @@ cleanlib:
                 for Flag in FlagDict:
                     if FlagDict[Flag]['Value']:
                         Key = Flag + '_RESP'
-                        RespMacro = FlagDict[Flag]['Macro'].replace('FLAGS', 'RESP')
+                        RespMacro = FlagDict[Flag]['Macro'].replace(
+                            'FLAGS', 'RESP')
                         Value = self._AutoGenObject.BuildOption[Flag]['FLAGS']
                         for inc in self._AutoGenObject.IncludePathList:
                             Value += ' ' + IncPrefix + inc
                         for Option in self._AutoGenObject.BuildOption:
                             for Attr in self._AutoGenObject.BuildOption[Option]:
                                 if Value.find(Option + '_' + Attr) != -1:
-                                    Value = Value.replace('$(' + Option + '_' + Attr + ')', self._AutoGenObject.BuildOption[Option][Attr])
+                                    Value = Value.replace(
+                                        '$(' + Option + '_' + Attr + ')', self._AutoGenObject.BuildOption[Option][Attr])
                         while (Value.find('$(') != -1):
                             for macro in self._AutoGenObject.Macros:
-                                MacroName = '$('+ macro + ')'
+                                MacroName = '$(' + macro + ')'
                                 if (Value.find(MacroName) != -1):
-                                    Value = Value.replace(MacroName, self._AutoGenObject.Macros[macro])
+                                    Value = Value.replace(
+                                        MacroName, self._AutoGenObject.Macros[macro])
                                     break
                             else:
                                 break
@@ -904,7 +941,8 @@ cleanlib:
                         for Target in BuildTargets:
                             for i, SingleCommand in enumerate(BuildTargets[Target].Commands):
                                 if FlagDict[Flag]['Macro'] in SingleCommand:
-                                    BuildTargets[Target].Commands[i] = SingleCommand.replace('$(INC)', '').replace(FlagDict[Flag]['Macro'], RespMacro)
+                                    BuildTargets[Target].Commands[i] = SingleCommand.replace(
+                                        '$(INC)', '').replace(FlagDict[Flag]['Macro'], RespMacro)
         return RespDict
 
     def ProcessBuildTargetList(self, RespFile, ToolsDef):
@@ -926,7 +964,8 @@ cleanlib:
                 if Item in SourceFileList:
                     SourceFileList.remove(Item)
 
-        FileDependencyDict = {item:ForceIncludedFile for item in SourceFileList}
+        FileDependencyDict = {
+            item: ForceIncludedFile for item in SourceFileList}
 
         for Dependency in FileDependencyDict.values():
             self.DependencyHeaderFileSet.update(set(Dependency))
@@ -985,11 +1024,12 @@ cleanlib:
                 continue
             if GlobalData.gUseHashCache:
                 GlobalData.gModuleBuildTracking[self._AutoGenObject] = 'FAIL_METAFILE'
-            EdkLogger.warn("build","Module MetaFile [Sources] is missing local header!",
-                        ExtraData = "Local Header: " + aFile + " not found in " + self._AutoGenObject.MetaFile.Path
-                        )
+            EdkLogger.warn("build", "Module MetaFile [Sources] is missing local header!",
+                           ExtraData="Local Header: " + aFile +
+                           " not found in " + self._AutoGenObject.MetaFile.Path
+                           )
 
-        for File,Dependency in FileDependencyDict.items():
+        for File, Dependency in FileDependencyDict.items():
             if not Dependency:
                 continue
 
@@ -1024,7 +1064,8 @@ cleanlib:
                 for Dep in T.Dependencies:
                     Deps.append(self.PlaceMacro(str(Dep), self.Macros))
                     if Dep != '$(MAKE_FILE)':
-                        CCodeDeps.append(self.PlaceMacro(str(Dep), self.Macros))
+                        CCodeDeps.append(
+                            self.PlaceMacro(str(Dep), self.Macros))
                 # Add inclusion-dependencies
                 if len(T.Inputs) == 1 and T.Inputs[0] in FileDependencyDict:
                     for F in FileDependencyDict[T.Inputs[0]]:
@@ -1035,7 +1076,8 @@ cleanlib:
                     # In order to use file list macro as dependency
                     if T.GenListFile:
                         # gnu tools need forward slash path separator, even on Windows
-                        self.ListFileMacros[T.ListFileMacro].append(str(F).replace ('\\', '/'))
+                        self.ListFileMacros[T.ListFileMacro].append(
+                            str(F).replace('\\', '/'))
                         self.FileListMacros[T.FileListMacro].append(NewFile)
                     elif T.GenFileListMacro:
                         self.FileListMacros[T.FileListMacro].append(NewFile)
@@ -1054,25 +1096,32 @@ cleanlib:
                                                                                    CmdCppDict, DependencyDict, RespFile,
                                                                                    ToolsDef, resp_file_number)
                     resp_file_number += 1
-                    TargetDict = {"target": self.PlaceMacro(T.Target.Path, self.Macros), "cmd": "\n\t".join(T.Commands),"deps": CCodeDeps}
-                    CmdLine = self._BUILD_TARGET_TEMPLATE.Replace(TargetDict).rstrip().replace('\t$(OBJLIST', '$(OBJLIST')
+                    TargetDict = {"target": self.PlaceMacro(
+                        T.Target.Path, self.Macros), "cmd": "\n\t".join(T.Commands), "deps": CCodeDeps}
+                    CmdLine = self._BUILD_TARGET_TEMPLATE.Replace(
+                        TargetDict).rstrip().replace('\t$(OBJLIST', '$(OBJLIST')
                     if T.Commands:
-                        CmdLine = '%s%s' %(CmdLine, TAB_LINE_BREAK)
+                        CmdLine = '%s%s' % (CmdLine, TAB_LINE_BREAK)
                     if CCodeDeps or CmdLine:
                         self.BuildTargetList.append(CmdLine)
                 else:
-                    TargetDict = {"target": self.PlaceMacro(T.Target.Path, self.Macros), "cmd": "\n\t".join(T.Commands),"deps": Deps}
-                    self.BuildTargetList.append(self._BUILD_TARGET_TEMPLATE.Replace(TargetDict))
+                    TargetDict = {"target": self.PlaceMacro(
+                        T.Target.Path, self.Macros), "cmd": "\n\t".join(T.Commands), "deps": Deps}
+                    self.BuildTargetList.append(
+                        self._BUILD_TARGET_TEMPLATE.Replace(TargetDict))
 
                     # Add a Makefile rule for targets generating multiple files.
                     # The main output is a prerequisite for the other output files.
                     for i in T.Outputs[1:]:
-                        AnnexeTargetDict = {"target": self.PlaceMacro(i.Path, self.Macros), "cmd": "", "deps": self.PlaceMacro(T.Target.Path, self.Macros)}
-                        self.BuildTargetList.append(self._BUILD_TARGET_TEMPLATE.Replace(AnnexeTargetDict))
+                        AnnexeTargetDict = {"target": self.PlaceMacro(
+                            i.Path, self.Macros), "cmd": "", "deps": self.PlaceMacro(T.Target.Path, self.Macros)}
+                        self.BuildTargetList.append(
+                            self._BUILD_TARGET_TEMPLATE.Replace(AnnexeTargetDict))
 
     def ParserCCodeFile(self, T, Type, CmdSumDict, CmdTargetDict, CmdCppDict, DependencyDict, RespFile, ToolsDef,
-                            resp_file_number):
-        SaveFilePath = os.path.join(RespFile, "cc_resp_%s.txt" % resp_file_number)
+                        resp_file_number):
+        SaveFilePath = os.path.join(
+            RespFile, "cc_resp_%s.txt" % resp_file_number)
         if not CmdSumDict:
             for item in self._AutoGenObject.Targets[Type]:
                 CmdSumDict[item.Target.SubDir] = item.Target.BaseName
@@ -1097,39 +1146,47 @@ cleanlib:
                 if len(SingleCommandList) > 0 and self.CheckCCCmd(SingleCommandList):
                     for Temp in SingleCommandList:
                         if Temp.startswith('/Fo'):
-                            CmdSign = '%s%s' % (Temp.rsplit(TAB_SLASH, 1)[0], TAB_SLASH)
+                            CmdSign = '%s%s' % (Temp.rsplit(
+                                TAB_SLASH, 1)[0], TAB_SLASH)
                             break
                     else:
                         continue
                     if CmdSign not in list(CmdTargetDict.keys()):
                         cmd = Item.replace(Temp, CmdSign)
                         if SingleCommandList[-1] in cmd:
-                            CmdTargetDict[CmdSign] = [cmd.replace(SingleCommandList[-1], "").rstrip(), SingleCommandList[-1]]
+                            CmdTargetDict[CmdSign] = [cmd.replace(
+                                SingleCommandList[-1], "").rstrip(), SingleCommandList[-1]]
                     else:
                         # CmdTargetDict[CmdSign] = "%s %s" % (CmdTargetDict[CmdSign], SingleCommandList[-1])
                         CmdTargetDict[CmdSign].append(SingleCommandList[-1])
                     Index = CommandList.index(Item)
                     CommandList.pop(Index)
-                    BaseName = SingleCommandList[-1].rsplit('.',1)[0]
+                    BaseName = SingleCommandList[-1].rsplit('.', 1)[0]
                     if BaseName.endswith("%s%s" % (TAB_SLASH, CmdSumDict[CmdSign[3:].rsplit(TAB_SLASH, 1)[0]])):
                         Cpplist = CmdCppDict[T.Target.SubDir]
-                        Cpplist.insert(0, '$(OBJLIST_%d): ' % list(self.ObjTargetDict.keys()).index(T.Target.SubDir))
+                        Cpplist.insert(0, '$(OBJLIST_%d): ' % list(
+                            self.ObjTargetDict.keys()).index(T.Target.SubDir))
                         source_files = CmdTargetDict[CmdSign][1:]
                         source_files.insert(0, " ")
-                        if len(source_files)>2:
-                            SaveFileOnChange(SaveFilePath, " ".join(source_files), False)
+                        if len(source_files) > 2:
+                            SaveFileOnChange(
+                                SaveFilePath, " ".join(source_files), False)
                             T.Commands[Index] = '%s\n\t%s $(cc_resp_%s)' % (
-                            ' \\\n\t'.join(Cpplist), CmdTargetDict[CmdSign][0], resp_file_number)
-                            ToolsDef.append("cc_resp_%s = @%s" % (resp_file_number, SaveFilePath))
+                                ' \\\n\t'.join(Cpplist), CmdTargetDict[CmdSign][0], resp_file_number)
+                            ToolsDef.append("cc_resp_%s = @%s" %
+                                            (resp_file_number, SaveFilePath))
 
-                        elif len(source_files)<=2 and len(" ".join(CmdTargetDict[CmdSign][:2]))>GlobalData.gCommandMaxLength:
-                            SaveFileOnChange(SaveFilePath, " ".join(source_files), False)
+                        elif len(source_files) <= 2 and len(" ".join(CmdTargetDict[CmdSign][:2])) > GlobalData.gCommandMaxLength:
+                            SaveFileOnChange(
+                                SaveFilePath, " ".join(source_files), False)
                             T.Commands[Index] = '%s\n\t%s $(cc_resp_%s)' % (
                                 ' \\\n\t'.join(Cpplist), CmdTargetDict[CmdSign][0], resp_file_number)
-                            ToolsDef.append("cc_resp_%s = @%s" % (resp_file_number, SaveFilePath))
+                            ToolsDef.append("cc_resp_%s = @%s" %
+                                            (resp_file_number, SaveFilePath))
 
                         else:
-                            T.Commands[Index] = '%s\n\t%s' % (' \\\n\t'.join(Cpplist), " ".join(CmdTargetDict[CmdSign]))
+                            T.Commands[Index] = '%s\n\t%s' % (' \\\n\t'.join(
+                                Cpplist), " ".join(CmdTargetDict[CmdSign]))
                     else:
                         T.Commands.pop(Index)
         return T, CmdSumDict, CmdTargetDict, CmdCppDict
@@ -1139,13 +1196,15 @@ cleanlib:
             if '$(CC)' in cmd:
                 return True
         return False
-    ## For creating makefile targets for dependent libraries
+    # For creating makefile targets for dependent libraries
+
     def ProcessDependentLibrary(self):
         for LibraryAutoGen in self._AutoGenObject.LibraryAutoGenList:
             if not LibraryAutoGen.IsBinaryModule:
-                self.LibraryBuildDirectoryList.append(self.PlaceMacro(LibraryAutoGen.BuildDir, self.Macros))
+                self.LibraryBuildDirectoryList.append(
+                    self.PlaceMacro(LibraryAutoGen.BuildDir, self.Macros))
 
-    ## Return a list containing source file's dependencies
+    # Return a list containing source file's dependencies
     #
     #   @param      FileList        The list of source files
     #   @param      ForceInculeList The list of files which will be included forcely
@@ -1156,17 +1215,18 @@ cleanlib:
     def GetFileDependency(self, FileList, ForceInculeList, SearchPathList):
         Dependency = {}
         for F in FileList:
-            Dependency[F] = GetDependencyList(self._AutoGenObject, self.FileCache, F, ForceInculeList, SearchPathList)
+            Dependency[F] = GetDependencyList(
+                self._AutoGenObject, self.FileCache, F, ForceInculeList, SearchPathList)
         return Dependency
 
 
-## CustomMakefile class
+# CustomMakefile class
 #
 #  This class encapsules makefie and its generation for module. It uses template to generate
 #  the content of makefile. The content of makefile will be got from ModuleAutoGen object.
 #
 class CustomMakefile(BuildFile):
-    ## template used to generate the makefile for module with custom makefile
+    # template used to generate the makefile for module with custom makefile
     _TEMPLATE_ = TemplateString('''\
 ${makefile_header}
 
@@ -1258,7 +1318,7 @@ ${BEGIN}\t-@${create_directory_command}\n${END}\
 
 ''')
 
-    ## Constructor of CustomMakefile
+    # Constructor of CustomMakefile
     #
     #   @param  ModuleAutoGen   Object of ModuleAutoGen class
     #
@@ -1277,9 +1337,9 @@ ${BEGIN}\t-@${create_directory_command}\n${END}\
             EdkLogger.error('build', OPTION_NOT_SUPPORTED, "No custom makefile for %s" % self._FileType,
                             ExtraData="[%s]" % str(MyAgo))
         MakefilePath = mws.join(
-                                MyAgo.WorkspaceDir,
-                                MyAgo.CustomMakefile[self._FileType]
-                                )
+            MyAgo.WorkspaceDir,
+            MyAgo.CustomMakefile[self._FileType]
+        )
         try:
             CustomMakefile = open(MakefilePath, 'r').read()
         except:
@@ -1296,61 +1356,65 @@ ${BEGIN}\t-@${create_directory_command}\n${END}\
                 if Attr == "FAMILY":
                     continue
                 elif Attr == "PATH":
-                    ToolsDef.append("%s = %s" % (Tool, MyAgo.BuildOption[Tool][Attr]))
+                    ToolsDef.append("%s = %s" %
+                                    (Tool, MyAgo.BuildOption[Tool][Attr]))
                 else:
-                    ToolsDef.append("%s_%s = %s" % (Tool, Attr, MyAgo.BuildOption[Tool][Attr]))
+                    ToolsDef.append("%s_%s = %s" %
+                                    (Tool, Attr, MyAgo.BuildOption[Tool][Attr]))
             ToolsDef.append("")
 
         MakefileName = self.getMakefileName()
         MakefileTemplateDict = {
-            "makefile_header"           : self._FILE_HEADER_[self._FileType],
-            "makefile_path"             : os.path.join("$(MODULE_BUILD_DIR)", MakefileName),
-            "platform_name"             : self.PlatformInfo.Name,
-            "platform_guid"             : self.PlatformInfo.Guid,
-            "platform_version"          : self.PlatformInfo.Version,
+            "makefile_header": self._FILE_HEADER_[self._FileType],
+            "makefile_path": os.path.join("$(MODULE_BUILD_DIR)", MakefileName),
+            "platform_name": self.PlatformInfo.Name,
+            "platform_guid": self.PlatformInfo.Guid,
+            "platform_version": self.PlatformInfo.Version,
             "platform_relative_directory": self.PlatformInfo.SourceDir,
-            "platform_output_directory" : self.PlatformInfo.OutputDir,
-            "platform_dir"              : MyAgo.Macros["PLATFORM_DIR"],
+            "platform_output_directory": self.PlatformInfo.OutputDir,
+            "platform_dir": MyAgo.Macros["PLATFORM_DIR"],
 
-            "module_name"               : MyAgo.Name,
-            "module_guid"               : MyAgo.Guid,
-            "module_name_guid"          : MyAgo.UniqueBaseName,
-            "module_version"            : MyAgo.Version,
-            "module_type"               : MyAgo.ModuleType,
-            "module_file"               : MyAgo.MetaFile,
-            "module_file_base_name"     : MyAgo.MetaFile.BaseName,
-            "module_relative_directory" : MyAgo.SourceDir,
-            "module_dir"                : mws.join (MyAgo.WorkspaceDir, MyAgo.SourceDir),
+            "module_name": MyAgo.Name,
+            "module_guid": MyAgo.Guid,
+            "module_name_guid": MyAgo.UniqueBaseName,
+            "module_version": MyAgo.Version,
+            "module_type": MyAgo.ModuleType,
+            "module_file": MyAgo.MetaFile,
+            "module_file_base_name": MyAgo.MetaFile.BaseName,
+            "module_relative_directory": MyAgo.SourceDir,
+            "module_dir": mws.join(MyAgo.WorkspaceDir, MyAgo.SourceDir),
 
-            "architecture"              : MyAgo.Arch,
-            "toolchain_tag"             : MyAgo.ToolChain,
-            "build_target"              : MyAgo.BuildTarget,
+            "architecture": MyAgo.Arch,
+            "toolchain_tag": MyAgo.ToolChain,
+            "build_target": MyAgo.BuildTarget,
 
-            "platform_build_directory"  : self.PlatformInfo.BuildDir,
-            "module_build_directory"    : MyAgo.BuildDir,
-            "module_output_directory"   : MyAgo.OutputDir,
-            "module_debug_directory"    : MyAgo.DebugDir,
+            "platform_build_directory": self.PlatformInfo.BuildDir,
+            "module_build_directory": MyAgo.BuildDir,
+            "module_output_directory": MyAgo.OutputDir,
+            "module_debug_directory": MyAgo.DebugDir,
 
-            "separator"                 : Separator,
-            "module_tool_definitions"   : ToolsDef,
+            "separator": Separator,
+            "module_tool_definitions": ToolsDef,
 
-            "shell_command_code"        : list(self._SHELL_CMD_[self._Platform].keys()),
-            "shell_command"             : list(self._SHELL_CMD_[self._Platform].values()),
+            "shell_command_code": list(self._SHELL_CMD_[self._Platform].keys()),
+            "shell_command": list(self._SHELL_CMD_[self._Platform].values()),
 
-            "create_directory_command"  : self.GetCreateDirectoryCommand(self.IntermediateDirectoryList),
-            "custom_makefile_content"   : CustomMakefile
+            "create_directory_command": self.GetCreateDirectoryCommand(self.IntermediateDirectoryList),
+            "custom_makefile_content": CustomMakefile
         }
 
         return MakefileTemplateDict
 
-## PlatformMakefile class
+# PlatformMakefile class
 #
 #  This class encapsules makefie and its generation for platform. It uses
 # template to generate the content of makefile. The content of makefile will be
 # got from PlatformAutoGen object.
 #
+
+
 class PlatformMakefile(BuildFile):
-    ## template used to generate the makefile for platform
+    # template used to generate the makefile for platform
     _TEMPLATE_ = TemplateString('''\
 ${makefile_header}
 
@@ -1445,7 +1509,7 @@ cleanlib:
 \t${END}@cd $(BUILD_DIR)\n
 ''')
 
-    ## Constructor of PlatformMakefile
+    # Constructor of PlatformMakefile
     #
     #   @param  ModuleAutoGen   Object of PlatformAutoGen class
     #
@@ -1477,9 +1541,9 @@ cleanlib:
         LibraryMakefileList = []
         LibraryMakeCommandList = []
         for D in self.LibraryBuildDirectoryList:
-            D = self.PlaceMacro(D, {"BUILD_DIR":MyAgo.BuildDir})
+            D = self.PlaceMacro(D, {"BUILD_DIR": MyAgo.BuildDir})
             Makefile = os.path.join(D, MakefileName)
-            Command = self._MAKE_TEMPLATE_[self._Platform] % {"file":Makefile}
+            Command = self._MAKE_TEMPLATE_[self._Platform] % {"file": Makefile}
             LibraryMakefileList.append(Makefile)
             LibraryMakeCommandList.append(Command)
         self.LibraryMakeCommandList = LibraryMakeCommandList
@@ -1487,44 +1551,44 @@ cleanlib:
         ModuleMakefileList = []
         ModuleMakeCommandList = []
         for D in self.ModuleBuildDirectoryList:
-            D = self.PlaceMacro(D, {"BUILD_DIR":MyAgo.BuildDir})
+            D = self.PlaceMacro(D, {"BUILD_DIR": MyAgo.BuildDir})
             Makefile = os.path.join(D, MakefileName)
-            Command = self._MAKE_TEMPLATE_[self._Platform] % {"file":Makefile}
+            Command = self._MAKE_TEMPLATE_[self._Platform] % {"file": Makefile}
             ModuleMakefileList.append(Makefile)
             ModuleMakeCommandList.append(Command)
 
         MakefileTemplateDict = {
-            "makefile_header"           : self._FILE_HEADER_[self._FileType],
-            "makefile_path"             : os.path.join("$(BUILD_DIR)", MakefileName),
-            "make_path"                 : MyAgo.ToolDefinition["MAKE"]["PATH"],
-            "makefile_name"             : MakefileName,
-            "platform_name"             : MyAgo.Name,
-            "platform_guid"             : MyAgo.Guid,
-            "platform_version"          : MyAgo.Version,
-            "platform_file"             : MyAgo.MetaFile,
+            "makefile_header": self._FILE_HEADER_[self._FileType],
+            "makefile_path": os.path.join("$(BUILD_DIR)", MakefileName),
+            "make_path": MyAgo.ToolDefinition["MAKE"]["PATH"],
+            "makefile_name": MakefileName,
+            "platform_name": MyAgo.Name,
+            "platform_guid": MyAgo.Guid,
+            "platform_version": MyAgo.Version,
+            "platform_file": MyAgo.MetaFile,
             "platform_relative_directory": MyAgo.SourceDir,
-            "platform_output_directory" : MyAgo.OutputDir,
-            "platform_build_directory"  : MyAgo.BuildDir,
-            "platform_dir"              : MyAgo.Macros["PLATFORM_DIR"],
+            "platform_output_directory": MyAgo.OutputDir,
+            "platform_build_directory": MyAgo.BuildDir,
+            "platform_dir": MyAgo.Macros["PLATFORM_DIR"],
 
-            "toolchain_tag"             : MyAgo.ToolChain,
-            "build_target"              : MyAgo.BuildTarget,
-            "shell_command_code"        : list(self._SHELL_CMD_[self._Platform].keys()),
-            "shell_command"             : list(self._SHELL_CMD_[self._Platform].values()),
-            "build_architecture_list"   : MyAgo.Arch,
-            "architecture"              : MyAgo.Arch,
-            "separator"                 : Separator,
-            "create_directory_command"  : self.GetCreateDirectoryCommand(self.IntermediateDirectoryList),
-            "cleanall_command"          : self.GetRemoveDirectoryCommand(self.IntermediateDirectoryList),
-            "library_makefile_list"     : LibraryMakefileList,
-            "module_makefile_list"      : ModuleMakefileList,
-            "library_build_command"     : LibraryMakeCommandList,
-            "module_build_command"      : ModuleMakeCommandList,
+            "toolchain_tag": MyAgo.ToolChain,
+            "build_target": MyAgo.BuildTarget,
+            "shell_command_code": list(self._SHELL_CMD_[self._Platform].keys()),
+            "shell_command": list(self._SHELL_CMD_[self._Platform].values()),
+            "build_architecture_list": MyAgo.Arch,
+            "architecture": MyAgo.Arch,
+            "separator": Separator,
+            "create_directory_command": self.GetCreateDirectoryCommand(self.IntermediateDirectoryList),
+            "cleanall_command": self.GetRemoveDirectoryCommand(self.IntermediateDirectoryList),
+            "library_makefile_list": LibraryMakefileList,
+            "module_makefile_list": ModuleMakefileList,
+            "library_build_command": LibraryMakeCommandList,
+            "module_build_command": ModuleMakeCommandList,
         }
 
         return MakefileTemplateDict
 
-    ## Get the root directory list for intermediate files of all modules build
+    # Get the root directory list for intermediate files of all modules build
     #
     #   @retval     list    The list of directory
     #
@@ -1532,10 +1596,11 @@ cleanlib:
         DirList = []
         for ModuleAutoGen in self._AutoGenObject.ModuleAutoGenList:
             if not ModuleAutoGen.IsBinaryModule:
-                DirList.append(os.path.join(self._AutoGenObject.BuildDir, ModuleAutoGen.BuildDir))
+                DirList.append(os.path.join(
+                    self._AutoGenObject.BuildDir, ModuleAutoGen.BuildDir))
         return DirList
 
-    ## Get the root directory list for intermediate files of all libraries build
+    # Get the root directory list for intermediate files of all libraries build
     #
     #   @retval     list    The list of directory
     #
@@ -1543,20 +1608,23 @@ cleanlib:
         DirList = []
         for LibraryAutoGen in self._AutoGenObject.LibraryAutoGenList:
             if not LibraryAutoGen.IsBinaryModule:
-                DirList.append(os.path.join(self._AutoGenObject.BuildDir, LibraryAutoGen.BuildDir))
+                DirList.append(os.path.join(
+                    self._AutoGenObject.BuildDir, LibraryAutoGen.BuildDir))
         return DirList
 
-## TopLevelMakefile class
+# TopLevelMakefile class
 #
 #  This class encapsules makefie and its generation for entrance makefile. It
 # uses template to generate the content of makefile. The content of makefile
 # will be got from WorkspaceAutoGen object.
 #
+
+
 class TopLevelMakefile(BuildFile):
-    ## template used to generate toplevel makefile
+    # template used to generate toplevel makefile
     _TEMPLATE_ = TemplateString('''${BEGIN}\tGenFds -f ${fdf_file} --conf=${conf_directory} -o ${platform_build_directory} -t ${toolchain_tag} -b ${build_target} -p ${active_platform} -a ${build_architecture_list} ${extra_options}${END}${BEGIN} -r ${fd} ${END}${BEGIN} -i ${fv} ${END}${BEGIN} -C ${cap} ${END}${BEGIN} -D ${macro} ${END}''')
 
-    ## Constructor of TopLevelMakefile
+    # Constructor of TopLevelMakefile
     #
     #   @param  Workspace   Object of WorkspaceAutoGen class
     #
@@ -1578,7 +1646,8 @@ class TopLevelMakefile(BuildFile):
                             ExtraData="[%s]" % str(MyAgo))
 
         for Arch in MyAgo.ArchList:
-            self.IntermediateDirectoryList.append(Separator.join(["$(BUILD_DIR)", Arch]))
+            self.IntermediateDirectoryList.append(
+                Separator.join(["$(BUILD_DIR)", Arch]))
         self.IntermediateDirectoryList.append("$(FV_DIR)")
 
         # TRICK: for not generating GenFds call in makefile if no FDF file
@@ -1591,7 +1660,8 @@ class TopLevelMakefile(BuildFile):
             MacroDict.update(GlobalData.gCommandLineDefines)
             for MacroName in MacroDict:
                 if MacroDict[MacroName] != "":
-                    MacroList.append('"%s=%s"' % (MacroName, MacroDict[MacroName].replace('\\', '\\\\')))
+                    MacroList.append('"%s=%s"' % (
+                        MacroName, MacroDict[MacroName].replace('\\', '\\\\')))
                 else:
                     MacroList.append('"%s"' % MacroName)
         else:
@@ -1620,48 +1690,50 @@ class TopLevelMakefile(BuildFile):
             else:
                 pcdname = '.'.join(pcd[0:2])
             if pcd[3].startswith('{'):
-                ExtraOption += " --pcd " + pcdname + '=' + 'H' + '"' + pcd[3] + '"'
+                ExtraOption += " --pcd " + pcdname + \
+                    '=' + 'H' + '"' + pcd[3] + '"'
             else:
                 ExtraOption += " --pcd " + pcdname + '=' + pcd[3]
 
         MakefileName = self.getMakefileName()
         SubBuildCommandList = []
         for A in MyAgo.ArchList:
-            Command = self._MAKE_TEMPLATE_[self._Platform] % {"file":os.path.join("$(BUILD_DIR)", A, MakefileName)}
+            Command = self._MAKE_TEMPLATE_[self._Platform] % {
+                "file": os.path.join("$(BUILD_DIR)", A, MakefileName)}
             SubBuildCommandList.append(Command)
 
         MakefileTemplateDict = {
-            "makefile_header"           : self._FILE_HEADER_[self._FileType],
-            "makefile_path"             : os.path.join("$(BUILD_DIR)", MakefileName),
-            "make_path"                 : MyAgo.ToolDefinition["MAKE"]["PATH"],
-            "platform_name"             : MyAgo.Name,
-            "platform_guid"             : MyAgo.Guid,
-            "platform_version"          : MyAgo.Version,
-            "platform_build_directory"  : MyAgo.BuildDir,
-            "conf_directory"            : GlobalData.gConfDirectory,
+            "makefile_header": self._FILE_HEADER_[self._FileType],
+            "makefile_path": os.path.join("$(BUILD_DIR)", MakefileName),
+            "make_path": MyAgo.ToolDefinition["MAKE"]["PATH"],
+            "platform_name": MyAgo.Name,
+            "platform_guid": MyAgo.Guid,
+            "platform_version": MyAgo.Version,
+            "platform_build_directory": MyAgo.BuildDir,
+            "conf_directory": GlobalData.gConfDirectory,
 
-            "toolchain_tag"             : MyAgo.ToolChain,
-            "build_target"              : MyAgo.BuildTarget,
-            "shell_command_code"        : list(self._SHELL_CMD_[self._Platform].keys()),
-            "shell_command"             : list(self._SHELL_CMD_[self._Platform].values()),
-            'arch'                      : list(MyAgo.ArchList),
-            "build_architecture_list"   : ','.join(MyAgo.ArchList),
-            "separator"                 : Separator,
-            "create_directory_command"  : self.GetCreateDirectoryCommand(self.IntermediateDirectoryList),
-            "cleanall_command"          : self.GetRemoveDirectoryCommand(self.IntermediateDirectoryList),
-            "sub_build_command"         : SubBuildCommandList,
-            "fdf_file"                  : FdfFileList,
-            "active_platform"           : str(MyAgo),
-            "fd"                        : MyAgo.FdTargetList,
-            "fv"                        : MyAgo.FvTargetList,
-            "cap"                       : MyAgo.CapTargetList,
-            "extra_options"             : ExtraOption,
-            "macro"                     : MacroList,
+            "toolchain_tag": MyAgo.ToolChain,
+            "build_target": MyAgo.BuildTarget,
+            "shell_command_code": list(self._SHELL_CMD_[self._Platform].keys()),
+            "shell_command": list(self._SHELL_CMD_[self._Platform].values()),
+            'arch': list(MyAgo.ArchList),
+            "build_architecture_list": ','.join(MyAgo.ArchList),
+            "separator": Separator,
+            "create_directory_command": self.GetCreateDirectoryCommand(self.IntermediateDirectoryList),
+            "cleanall_command": self.GetRemoveDirectoryCommand(self.IntermediateDirectoryList),
+            "sub_build_command": SubBuildCommandList,
+            "fdf_file": FdfFileList,
+            "active_platform": str(MyAgo),
+            "fd": MyAgo.FdTargetList,
+            "fv": MyAgo.FvTargetList,
+            "cap": MyAgo.CapTargetList,
+            "extra_options": ExtraOption,
+            "macro": MacroList,
         }
 
         return MakefileTemplateDict
 
-    ## Get the root directory list for intermediate files of all modules build
+    # Get the root directory list for intermediate files of all modules build
     #
     #   @retval     list    The list of directory
     #
@@ -1669,10 +1741,11 @@ class TopLevelMakefile(BuildFile):
         DirList = []
         for ModuleAutoGen in self._AutoGenObject.ModuleAutoGenList:
             if not ModuleAutoGen.IsBinaryModule:
-                DirList.append(os.path.join(self._AutoGenObject.BuildDir, ModuleAutoGen.BuildDir))
+                DirList.append(os.path.join(
+                    self._AutoGenObject.BuildDir, ModuleAutoGen.BuildDir))
         return DirList
 
-    ## Get the root directory list for intermediate files of all libraries build
+    # Get the root directory list for intermediate files of all libraries build
     #
     #   @retval     list    The list of directory
     #
@@ -1680,10 +1753,11 @@ class TopLevelMakefile(BuildFile):
         DirList = []
         for LibraryAutoGen in self._AutoGenObject.LibraryAutoGenList:
             if not LibraryAutoGen.IsBinaryModule:
-                DirList.append(os.path.join(self._AutoGenObject.BuildDir, LibraryAutoGen.BuildDir))
+                DirList.append(os.path.join(
+                    self._AutoGenObject.BuildDir, LibraryAutoGen.BuildDir))
         return DirList
 
-## Find dependencies for one source file
+# Find dependencies for one source file
 #
 #  By searching recursively "#include" directive in file, find out all the
 #  files needed by given source file. The dependencies will be only searched
@@ -1695,8 +1769,11 @@ class TopLevelMakefile(BuildFile):
 #
 #   @retval     list            The list of files the given source file depends on
 #
+
+
 def GetDependencyList(AutoGenObject, FileCache, File, ForceList, SearchPathList):
-    EdkLogger.debug(EdkLogger.DEBUG_1, "Try to get dependency files for %s" % File)
+    EdkLogger.debug(EdkLogger.DEBUG_1,
+                    "Try to get dependency files for %s" % File)
     FileStack = [File] + ForceList
     DependencySet = set()
 
@@ -1725,7 +1802,8 @@ def GetDependencyList(AutoGenObject, FileCache, File, ForceList, SearchPathList)
                 FileContent = Fd.read()
                 Fd.close()
             except BaseException as X:
-                EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=F.Path + "\n\t" + str(X))
+                EdkLogger.error("build", FILE_OPEN_FAILURE,
+                                ExtraData=F.Path + "\n\t" + str(X))
             if len(FileContent) == 0:
                 continue
             try:
@@ -1746,7 +1824,8 @@ def GetDependencyList(AutoGenObject, FileCache, File, ForceList, SearchPathList)
                     HeaderType = HeaderList[0][0]
                     HeaderKey = HeaderList[0][1]
                     if HeaderType in gIncludeMacroConversion:
-                        Inc = gIncludeMacroConversion[HeaderType] % {"HeaderKey" : HeaderKey}
+                        Inc = gIncludeMacroConversion[HeaderType] % {
+                            "HeaderKey": HeaderKey}
                     else:
                         # not known macro used in #include, always build the file by
                         # returning a empty dependency
@@ -1776,7 +1855,7 @@ def GetDependencyList(AutoGenObject, FileCache, File, ForceList, SearchPathList)
                     FileStack.append(FilePath)
                 break
             else:
-                EdkLogger.debug(EdkLogger.DEBUG_9, "%s included by %s was not found "\
+                EdkLogger.debug(EdkLogger.DEBUG_9, "%s included by %s was not found "
                                 "in any given path:\n\t%s" % (Inc, F, "\n\t".join(SearchPathList)))
 
         FileCache[F] = FullPathDependList
@@ -1789,6 +1868,7 @@ def GetDependencyList(AutoGenObject, FileCache, File, ForceList, SearchPathList)
 
     return DependencyList
 
+
 # This acts like the main() function for the script, unless it is 'import'ed into another script.
 if __name__ == '__main__':
     pass
diff --git a/BaseTools/Source/Python/AutoGen/GenPcdDb.py b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
index ad5dae0e5a2f..8bf76e6a7058 100644
--- a/BaseTools/Source/Python/AutoGen/GenPcdDb.py
+++ b/BaseTools/Source/Python/AutoGen/GenPcdDb.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Routines for generating Pcd Database
 #
 # Copyright (c) 2013 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -94,10 +94,10 @@ ${END}
 #endif
 """)
 
-## Mapping between PCD driver type and EFI phase
+# Mapping between PCD driver type and EFI phase
 gPcdPhaseMap = {
-    "PEI_PCD_DRIVER"    :   "PEI",
-    "DXE_PCD_DRIVER"    :   "DXE"
+    "PEI_PCD_DRIVER":   "PEI",
+    "DXE_PCD_DRIVER":   "DXE"
 }
 
 gPcdDatabaseAutoGenH = TemplateString("""
@@ -233,7 +233,7 @@ ${PHASE}_PCD_DATABASE_INIT g${PHASE}PcdDbInit = {
 #endif
 """)
 
-## DbItemList
+# DbItemList
 #
 #  The class holds the Pcd database items. ItemSize if not zero should match the item datum type in the C structure.
 #  When the structure is changed, remember to check the ItemSize and the related  PackStr in PackData()
@@ -241,6 +241,8 @@ ${PHASE}_PCD_DATABASE_INIT g${PHASE}PcdDbInit = {
 #  the DataList corresponds to the data that need to be written to database. If DataList is not present, then RawDataList
 #  will be written to the database.
 #
+
+
 class DbItemList:
     def __init__(self, ItemSize, DataList=None, RawDataList=None):
         self.ItemSize = ItemSize
@@ -269,13 +271,14 @@ class DbItemList:
             self.ListSize = 0
             return self.ListSize
         if self.ItemSize == 0:
-            self.ListSize = self.GetInterOffset(len(self.RawDataList) - 1) + len(self.RawDataList[len(self.RawDataList)-1])
+            self.ListSize = self.GetInterOffset(
+                len(self.RawDataList) - 1) + len(self.RawDataList[len(self.RawDataList)-1])
         else:
             self.ListSize = self.ItemSize * len(self.RawDataList)
         return self.ListSize
 
     def PackData(self):
-        ## PackGuid
+        # PackGuid
         #
         # Pack the GUID value in C structure format into data array
         #
@@ -305,10 +308,12 @@ class DbItemList:
 
         return Buffer
 
-## DbExMapTblItemList
+# DbExMapTblItemList
 #
 #  The class holds the ExMap table
 #
+
+
 class DbExMapTblItemList (DbItemList):
     def __init__(self, ItemSize, DataList=None, RawDataList=None):
         DbItemList.__init__(self, ItemSize, DataList, RawDataList)
@@ -323,11 +328,13 @@ class DbExMapTblItemList (DbItemList):
                            GetIntegerValue(Datas[2]))
         return Buffer
 
-## DbComItemList
+# DbComItemList
 #
 # The DbComItemList is a special kind of DbItemList in case that the size of the List can not be computed by the
 # ItemSize multiply the ItemCount.
 #
+
+
 class DbComItemList (DbItemList):
     def __init__(self, ItemSize, DataList=None, RawDataList=None):
         DbItemList.__init__(self, ItemSize, DataList, RawDataList)
@@ -356,7 +363,8 @@ class DbComItemList (DbItemList):
             if len(self.RawDataList) == 0:
                 self.ListSize = 0
             else:
-                self.ListSize = self.GetInterOffset(len(self.RawDataList) - 1) + len(self.RawDataList[len(self.RawDataList)-1]) * self.ItemSize
+                self.ListSize = self.GetInterOffset(len(
+                    self.RawDataList) - 1) + len(self.RawDataList[len(self.RawDataList)-1]) * self.ItemSize
 
         return self.ListSize
 
@@ -374,10 +382,12 @@ class DbComItemList (DbItemList):
 
         return Buffer
 
-## DbVariableTableItemList
+# DbVariableTableItemList
 #
 #  The class holds the Variable header value table
 #
+
+
 class DbVariableTableItemList (DbComItemList):
     def __init__(self, ItemSize, DataList=None, RawDataList=None):
         DbComItemList.__init__(self, ItemSize, DataList, RawDataList)
@@ -397,8 +407,9 @@ class DbVariableTableItemList (DbComItemList):
                                GetIntegerValue(0))
         return Buffer
 
+
 class DbStringHeadTableItemList(DbItemList):
-    def __init__(self,ItemSize,DataList=None,RawDataList=None):
+    def __init__(self, ItemSize, DataList=None, RawDataList=None):
         DbItemList.__init__(self, ItemSize, DataList, RawDataList)
 
     def GetInterOffset(self, Index):
@@ -426,7 +437,8 @@ class DbStringHeadTableItemList(DbItemList):
             self.ListSize = 0
             return self.ListSize
         if self.ItemSize == 0:
-            self.ListSize = self.GetInterOffset(len(self.RawDataList) - 1) + len(self.RawDataList[len(self.RawDataList)-1])
+            self.ListSize = self.GetInterOffset(
+                len(self.RawDataList) - 1) + len(self.RawDataList[len(self.RawDataList)-1])
         else:
             for Datas in self.RawDataList:
                 if type(Datas) in (list, tuple):
@@ -435,10 +447,12 @@ class DbStringHeadTableItemList(DbItemList):
                     self.ListSize += self.ItemSize
         return self.ListSize
 
-## DbSkuHeadTableItemList
+# DbSkuHeadTableItemList
 #
 #  The class holds the Sku header value table
 #
+
+
 class DbSkuHeadTableItemList (DbItemList):
     def __init__(self, ItemSize, DataList=None, RawDataList=None):
         DbItemList.__init__(self, ItemSize, DataList, RawDataList)
@@ -452,10 +466,12 @@ class DbSkuHeadTableItemList (DbItemList):
                            GetIntegerValue(Data[1]))
         return Buffer
 
-## DbSizeTableItemList
+# DbSizeTableItemList
 #
 #  The class holds the size table
 #
+
+
 class DbSizeTableItemList (DbItemList):
     def __init__(self, ItemSize, DataList=None, RawDataList=None):
         DbItemList.__init__(self, ItemSize, DataList, RawDataList)
@@ -465,6 +481,7 @@ class DbSizeTableItemList (DbItemList):
         for Data in self.RawDataList:
             length += (1 + len(Data[1]))
         return length * self.ItemSize
+
     def PackData(self):
         PackStr = "=H"
         Buffer = bytearray()
@@ -473,13 +490,15 @@ class DbSizeTableItemList (DbItemList):
                            GetIntegerValue(Data[0]))
             for subData in Data[1]:
                 Buffer += pack(PackStr,
-                           GetIntegerValue(subData))
+                               GetIntegerValue(subData))
         return Buffer
 
-## DbStringItemList
+# DbStringItemList
 #
 #  The class holds the string table
 #
+
+
 class DbStringItemList (DbComItemList):
     def __init__(self, ItemSize, DataList=None, RawDataList=None, LenList=None):
         if DataList is None:
@@ -504,6 +523,7 @@ class DbStringItemList (DbComItemList):
             DataList.append(ActualDatas)
         self.LenList = LenList
         DbComItemList.__init__(self, ItemSize, DataList, RawDataList)
+
     def GetInterOffset(self, Index):
         Offset = 0
 
@@ -520,7 +540,8 @@ class DbStringItemList (DbComItemList):
         if len(self.LenList) == 0:
             self.ListSize = 0
         else:
-            self.ListSize = self.GetInterOffset(len(self.LenList) - 1) + self.LenList[len(self.LenList)-1]
+            self.ListSize = self.GetInterOffset(
+                len(self.LenList) - 1) + self.LenList[len(self.LenList)-1]
 
         return self.ListSize
 
@@ -529,8 +550,7 @@ class DbStringItemList (DbComItemList):
         return DbComItemList.PackData(self)
 
 
-
-##  Find the index in two list where the item matches the key separately
+# Find the index in two list where the item matches the key separately
 #
 #   @param      Key1   The key used to search the List1
 #   @param      List1  The list that Key1 will be searched
@@ -551,7 +571,7 @@ def GetMatchedIndex(Key1, List1, Key2, List2):
     return -1
 
 
-## convert StringArray like {0x36, 0x00, 0x34, 0x00, 0x21, 0x00, 0x36, 0x00, 0x34, 0x00, 0x00, 0x00}
+# convert StringArray like {0x36, 0x00, 0x34, 0x00, 0x21, 0x00, 0x36, 0x00, 0x34, 0x00, 0x00, 0x00}
 # to List like [0x36, 0x00, 0x34, 0x00, 0x21, 0x00, 0x36, 0x00, 0x34, 0x00, 0x00, 0x00]
 #
 #   @param      StringArray A string array like {0x36, 0x00, 0x34, 0x00, 0x21, 0x00, 0x36, 0x00, 0x34, 0x00, 0x00, 0x00}
@@ -564,7 +584,7 @@ def StringArrayToList(StringArray):
     return eval(StringArray)
 
 
-## Convert TokenType String like  "PCD_DATUM_TYPE_UINT32 | PCD_TYPE_HII" to TokenType value
+# Convert TokenType String like  "PCD_DATUM_TYPE_UINT32 | PCD_TYPE_HII" to TokenType value
 #
 #   @param      TokenType  A TokenType string like "PCD_DATUM_TYPE_UINT32 | PCD_TYPE_HII"
 #
@@ -576,7 +596,7 @@ def GetTokenTypeValue(TokenType):
         "PCD_TYPE_DATA": (0x0 << 28),
         "PCD_TYPE_HII": (0x8 << 28),
         "PCD_TYPE_VPD": (0x4 << 28),
-#        "PCD_TYPE_SKU_ENABLED":(0x2 << 28),
+        #        "PCD_TYPE_SKU_ENABLED":(0x2 << 28),
         "PCD_TYPE_STRING": (0x1 << 28),
 
         "PCD_DATUM_TYPE_SHIFT": 24,
@@ -588,106 +608,112 @@ def GetTokenTypeValue(TokenType):
 
         "PCD_DATUM_TYPE_SHIFT2": 20,
         "PCD_DATUM_TYPE_UINT8_BOOLEAN": (0x1 << 20 | 0x1 << 24),
-        }
+    }
     return eval(TokenType, TokenTypeDict)
 
-## construct the external Pcd database using data from Dict
+# construct the external Pcd database using data from Dict
 #
 #   @param      Dict  A dictionary contains Pcd related tables
 #
 #   @retval     Buffer A byte stream of the Pcd database
 #
+
+
 def BuildExDataBase(Dict):
     # init Db items
     InitValueUint64 = Dict['INIT_DB_VALUE_UINT64']
-    DbInitValueUint64 = DbComItemList(8, RawDataList = InitValueUint64)
+    DbInitValueUint64 = DbComItemList(8, RawDataList=InitValueUint64)
     VardefValueUint64 = Dict['VARDEF_DB_VALUE_UINT64']
-    DbVardefValueUint64 = DbItemList(8, RawDataList = VardefValueUint64)
+    DbVardefValueUint64 = DbItemList(8, RawDataList=VardefValueUint64)
     InitValueUint32 = Dict['INIT_DB_VALUE_UINT32']
-    DbInitValueUint32 = DbComItemList(4, RawDataList = InitValueUint32)
+    DbInitValueUint32 = DbComItemList(4, RawDataList=InitValueUint32)
     VardefValueUint32 = Dict['VARDEF_DB_VALUE_UINT32']
-    DbVardefValueUint32 = DbItemList(4, RawDataList = VardefValueUint32)
+    DbVardefValueUint32 = DbItemList(4, RawDataList=VardefValueUint32)
     VpdHeadValue = Dict['VPD_DB_VALUE']
-    DbVpdHeadValue = DbComItemList(4, RawDataList = VpdHeadValue)
-    ExMapTable = list(zip(Dict['EXMAPPING_TABLE_EXTOKEN'], Dict['EXMAPPING_TABLE_LOCAL_TOKEN'], Dict['EXMAPPING_TABLE_GUID_INDEX']))
-    DbExMapTable = DbExMapTblItemList(8, RawDataList = ExMapTable)
+    DbVpdHeadValue = DbComItemList(4, RawDataList=VpdHeadValue)
+    ExMapTable = list(zip(Dict['EXMAPPING_TABLE_EXTOKEN'],
+                      Dict['EXMAPPING_TABLE_LOCAL_TOKEN'], Dict['EXMAPPING_TABLE_GUID_INDEX']))
+    DbExMapTable = DbExMapTblItemList(8, RawDataList=ExMapTable)
     LocalTokenNumberTable = Dict['LOCAL_TOKEN_NUMBER_DB_VALUE']
-    DbLocalTokenNumberTable = DbItemList(4, RawDataList = LocalTokenNumberTable)
+    DbLocalTokenNumberTable = DbItemList(4, RawDataList=LocalTokenNumberTable)
     GuidTable = Dict['GUID_STRUCTURE']
-    DbGuidTable = DbItemList(16, RawDataList = GuidTable)
+    DbGuidTable = DbItemList(16, RawDataList=GuidTable)
     StringHeadValue = Dict['STRING_DB_VALUE']
     # DbItemList to DbStringHeadTableItemList
-    DbStringHeadValue = DbStringHeadTableItemList(4, RawDataList = StringHeadValue)
+    DbStringHeadValue = DbStringHeadTableItemList(
+        4, RawDataList=StringHeadValue)
     VariableTable = Dict['VARIABLE_DB_VALUE']
-    DbVariableTable = DbVariableTableItemList(20, RawDataList = VariableTable)
+    DbVariableTable = DbVariableTableItemList(20, RawDataList=VariableTable)
     NumberOfSkuEnabledPcd = GetIntegerValue(Dict['SKU_HEAD_SIZE'])
 
-    Dict['STRING_TABLE_DB_VALUE'] = [StringArrayToList(x) for x in Dict['STRING_TABLE_VALUE']]
+    Dict['STRING_TABLE_DB_VALUE'] = [
+        StringArrayToList(x) for x in Dict['STRING_TABLE_VALUE']]
 
     StringTableValue = Dict['STRING_TABLE_DB_VALUE']
     # when calcute the offset, should use StringTableLen instead of StringTableValue, as string maximum len may be different with actual len
     StringTableLen = Dict['STRING_TABLE_LENGTH']
-    DbStringTableLen = DbStringItemList(0, RawDataList = StringTableValue, LenList = StringTableLen)
-
+    DbStringTableLen = DbStringItemList(
+        0, RawDataList=StringTableValue, LenList=StringTableLen)
 
     PcdTokenTable = Dict['PCD_TOKENSPACE']
     PcdTokenLen = Dict['PCD_TOKENSPACE_LENGTH']
     PcdTokenTableValue = [StringArrayToList(x) for x in Dict['PCD_TOKENSPACE']]
-    DbPcdTokenTable = DbStringItemList(0, RawDataList = PcdTokenTableValue, LenList = PcdTokenLen)
+    DbPcdTokenTable = DbStringItemList(
+        0, RawDataList=PcdTokenTableValue, LenList=PcdTokenLen)
 
     PcdCNameTable = Dict['PCD_CNAME']
     PcdCNameLen = Dict['PCD_CNAME_LENGTH']
     PcdCNameTableValue = [StringArrayToList(x) for x in Dict['PCD_CNAME']]
-    DbPcdCNameTable = DbStringItemList(0, RawDataList = PcdCNameTableValue, LenList = PcdCNameLen)
+    DbPcdCNameTable = DbStringItemList(
+        0, RawDataList=PcdCNameTableValue, LenList=PcdCNameLen)
 
     PcdNameOffsetTable = Dict['PCD_NAME_OFFSET']
-    DbPcdNameOffsetTable = DbItemList(4, RawDataList = PcdNameOffsetTable)
+    DbPcdNameOffsetTable = DbItemList(4, RawDataList=PcdNameOffsetTable)
 
-    SizeTableValue = list(zip(Dict['SIZE_TABLE_MAXIMUM_LENGTH'], Dict['SIZE_TABLE_CURRENT_LENGTH']))
-    DbSizeTableValue = DbSizeTableItemList(2, RawDataList = SizeTableValue)
+    SizeTableValue = list(
+        zip(Dict['SIZE_TABLE_MAXIMUM_LENGTH'], Dict['SIZE_TABLE_CURRENT_LENGTH']))
+    DbSizeTableValue = DbSizeTableItemList(2, RawDataList=SizeTableValue)
     InitValueUint16 = Dict['INIT_DB_VALUE_UINT16']
-    DbInitValueUint16 = DbComItemList(2, RawDataList = InitValueUint16)
+    DbInitValueUint16 = DbComItemList(2, RawDataList=InitValueUint16)
     VardefValueUint16 = Dict['VARDEF_DB_VALUE_UINT16']
-    DbVardefValueUint16 = DbItemList(2, RawDataList = VardefValueUint16)
+    DbVardefValueUint16 = DbItemList(2, RawDataList=VardefValueUint16)
     InitValueUint8 = Dict['INIT_DB_VALUE_UINT8']
-    DbInitValueUint8 = DbComItemList(1, RawDataList = InitValueUint8)
+    DbInitValueUint8 = DbComItemList(1, RawDataList=InitValueUint8)
     VardefValueUint8 = Dict['VARDEF_DB_VALUE_UINT8']
-    DbVardefValueUint8 = DbItemList(1, RawDataList = VardefValueUint8)
+    DbVardefValueUint8 = DbItemList(1, RawDataList=VardefValueUint8)
     InitValueBoolean = Dict['INIT_DB_VALUE_BOOLEAN']
-    DbInitValueBoolean = DbComItemList(1, RawDataList = InitValueBoolean)
+    DbInitValueBoolean = DbComItemList(1, RawDataList=InitValueBoolean)
     VardefValueBoolean = Dict['VARDEF_DB_VALUE_BOOLEAN']
-    DbVardefValueBoolean = DbItemList(1, RawDataList = VardefValueBoolean)
+    DbVardefValueBoolean = DbItemList(1, RawDataList=VardefValueBoolean)
     SkuidValue = Dict['SKUID_VALUE']
-    DbSkuidValue = DbItemList(8, RawDataList = SkuidValue)
-
-
+    DbSkuidValue = DbItemList(8, RawDataList=SkuidValue)
 
     # Unit Db Items
     UnInitValueUint64 = Dict['UNINIT_GUID_DECL_UINT64']
-    DbUnInitValueUint64 = DbItemList(8, RawDataList = UnInitValueUint64)
+    DbUnInitValueUint64 = DbItemList(8, RawDataList=UnInitValueUint64)
     UnInitValueUint32 = Dict['UNINIT_GUID_DECL_UINT32']
-    DbUnInitValueUint32 = DbItemList(4, RawDataList = UnInitValueUint32)
+    DbUnInitValueUint32 = DbItemList(4, RawDataList=UnInitValueUint32)
     UnInitValueUint16 = Dict['UNINIT_GUID_DECL_UINT16']
-    DbUnInitValueUint16 = DbItemList(2, RawDataList = UnInitValueUint16)
+    DbUnInitValueUint16 = DbItemList(2, RawDataList=UnInitValueUint16)
     UnInitValueUint8 = Dict['UNINIT_GUID_DECL_UINT8']
-    DbUnInitValueUint8 = DbItemList(1, RawDataList = UnInitValueUint8)
+    DbUnInitValueUint8 = DbItemList(1, RawDataList=UnInitValueUint8)
     UnInitValueBoolean = Dict['UNINIT_GUID_DECL_BOOLEAN']
-    DbUnInitValueBoolean = DbItemList(1, RawDataList = UnInitValueBoolean)
+    DbUnInitValueBoolean = DbItemList(1, RawDataList=UnInitValueBoolean)
     PcdTokenNumberMap = Dict['PCD_ORDER_TOKEN_NUMBER_MAP']
 
     DbNameTotle = ["SkuidValue",  "InitValueUint64", "VardefValueUint64", "InitValueUint32", "VardefValueUint32", "VpdHeadValue", "ExMapTable",
-               "LocalTokenNumberTable", "GuidTable", "StringHeadValue",  "PcdNameOffsetTable", "VariableTable", "StringTableLen", "PcdTokenTable", "PcdCNameTable",
-               "SizeTableValue", "InitValueUint16", "VardefValueUint16", "InitValueUint8", "VardefValueUint8", "InitValueBoolean",
-               "VardefValueBoolean", "UnInitValueUint64", "UnInitValueUint32", "UnInitValueUint16", "UnInitValueUint8", "UnInitValueBoolean"]
+                   "LocalTokenNumberTable", "GuidTable", "StringHeadValue",  "PcdNameOffsetTable", "VariableTable", "StringTableLen", "PcdTokenTable", "PcdCNameTable",
+                   "SizeTableValue", "InitValueUint16", "VardefValueUint16", "InitValueUint8", "VardefValueUint8", "InitValueBoolean",
+                   "VardefValueBoolean", "UnInitValueUint64", "UnInitValueUint32", "UnInitValueUint16", "UnInitValueUint8", "UnInitValueBoolean"]
 
     DbTotal = [SkuidValue,  InitValueUint64, VardefValueUint64, InitValueUint32, VardefValueUint32, VpdHeadValue, ExMapTable,
                LocalTokenNumberTable, GuidTable, StringHeadValue,  PcdNameOffsetTable, VariableTable, StringTableLen, PcdTokenTable, PcdCNameTable,
                SizeTableValue, InitValueUint16, VardefValueUint16, InitValueUint8, VardefValueUint8, InitValueBoolean,
                VardefValueBoolean, UnInitValueUint64, UnInitValueUint32, UnInitValueUint16, UnInitValueUint8, UnInitValueBoolean]
     DbItemTotal = [DbSkuidValue,  DbInitValueUint64, DbVardefValueUint64, DbInitValueUint32, DbVardefValueUint32, DbVpdHeadValue, DbExMapTable,
-               DbLocalTokenNumberTable, DbGuidTable, DbStringHeadValue,  DbPcdNameOffsetTable, DbVariableTable, DbStringTableLen, DbPcdTokenTable, DbPcdCNameTable,
-               DbSizeTableValue, DbInitValueUint16, DbVardefValueUint16, DbInitValueUint8, DbVardefValueUint8, DbInitValueBoolean,
-               DbVardefValueBoolean, DbUnInitValueUint64, DbUnInitValueUint32, DbUnInitValueUint16, DbUnInitValueUint8, DbUnInitValueBoolean]
+                   DbLocalTokenNumberTable, DbGuidTable, DbStringHeadValue,  DbPcdNameOffsetTable, DbVariableTable, DbStringTableLen, DbPcdTokenTable, DbPcdCNameTable,
+                   DbSizeTableValue, DbInitValueUint16, DbVardefValueUint16, DbInitValueUint8, DbVardefValueUint8, DbInitValueBoolean,
+                   DbVardefValueBoolean, DbUnInitValueUint64, DbUnInitValueUint32, DbUnInitValueUint16, DbUnInitValueUint8, DbUnInitValueBoolean]
 
     # VardefValueBoolean is the last table in the init table items
     InitTableNum = DbNameTotle.index("VardefValueBoolean") + 1
@@ -701,7 +727,6 @@ def BuildExDataBase(Dict):
             break
         SkuIdTableOffset += DbItemTotal[DbIndex].GetListSize()
 
-
     # Get offset of SkuValue table in the database
 
     # Fix up the LocalTokenNumberTable, SkuHeader table
@@ -721,22 +746,22 @@ def BuildExDataBase(Dict):
 
         TokenTypeValue = Dict['TOKEN_TYPE'][LocalTokenNumberTableIndex]
         TokenTypeValue = GetTokenTypeValue(TokenTypeValue)
-        LocalTokenNumberTable[LocalTokenNumberTableIndex] = DbOffset|int(TokenTypeValue)
+        LocalTokenNumberTable[LocalTokenNumberTableIndex] = DbOffset | int(
+            TokenTypeValue)
         # if PCD_TYPE_SKU_ENABLED, then we need to fix up the SkuTable
 
-
-
-
     # resolve variable table offset
     for VariableEntries in VariableTable:
         skuindex = 0
         for VariableEntryPerSku in VariableEntries:
-            (VariableHeadGuidIndex, VariableHeadStringIndex, SKUVariableOffset, VariableOffset, VariableRefTable, VariableAttribute) = VariableEntryPerSku[:]
+            (VariableHeadGuidIndex, VariableHeadStringIndex, SKUVariableOffset,
+             VariableOffset, VariableRefTable, VariableAttribute) = VariableEntryPerSku[:]
             DbIndex = 0
             DbOffset = FixedHeaderLen
             for DbIndex in range(len(DbTotal)):
                 if DbTotal[DbIndex] is VariableRefTable:
-                    DbOffset += DbItemTotal[DbIndex].GetInterOffset(VariableOffset)
+                    DbOffset += DbItemTotal[DbIndex].GetInterOffset(
+                        VariableOffset)
                     break
                 DbOffset += DbItemTotal[DbIndex].GetListSize()
                 if DbIndex + 1 == InitTableNum:
@@ -749,8 +774,10 @@ def BuildExDataBase(Dict):
             skuindex += 1
             if DbIndex >= InitTableNum:
                 assert(False)
-            VarAttr, VarProp = VariableAttributes.GetVarAttributes(VariableAttribute)
-            VariableEntryPerSku[:] = (VariableHeadStringIndex, DbOffset, VariableHeadGuidIndex, SKUVariableOffset, VarAttr, VarProp)
+            VarAttr, VarProp = VariableAttributes.GetVarAttributes(
+                VariableAttribute)
+            VariableEntryPerSku[:] = (VariableHeadStringIndex, DbOffset,
+                                      VariableHeadGuidIndex, SKUVariableOffset, VarAttr, VarProp)
 
     # calculate various table offset now
     DbTotalLength = FixedHeaderLen
@@ -770,17 +797,16 @@ def BuildExDataBase(Dict):
         elif DbItemTotal[DbIndex] is DbPcdNameOffsetTable:
             DbPcdNameOffset = DbTotalLength
 
-
         DbTotalLength += DbItemTotal[DbIndex].GetListSize()
     if not Dict['PCD_INFO_FLAG']:
-        DbPcdNameOffset  = 0
+        DbPcdNameOffset = 0
     LocalTokenCount = GetIntegerValue(Dict['LOCAL_TOKEN_NUMBER'])
     ExTokenCount = GetIntegerValue(Dict['EX_TOKEN_NUMBER'])
     GuidTableCount = GetIntegerValue(Dict['GUID_TABLE_SIZE'])
     SystemSkuId = GetIntegerValue(Dict['SYSTEM_SKU_ID_VALUE'])
     Pad = 0xDA
 
-    UninitDataBaseSize  = 0
+    UninitDataBaseSize = 0
     for Item in (DbUnInitValueUint64, DbUnInitValueUint32, DbUnInitValueUint16, DbUnInitValueUint8, DbUnInitValueBoolean):
         UninitDataBaseSize += Item.GetListSize()
 
@@ -846,7 +872,7 @@ def BuildExDataBase(Dict):
 
     Index = 0
     for Item in DbItemTotal:
-        Index +=1
+        Index += 1
         packdata = Item.PackData()
         for i in range(len(packdata)):
             Buffer += packdata[i:i + 1]
@@ -858,20 +884,23 @@ def BuildExDataBase(Dict):
             break
     return Buffer
 
-## Create code for PCD database
+# Create code for PCD database
 #
 #   @param      Info        The ModuleAutoGen object
 #   @param      AutoGenC    The TemplateString object for C code
 #   @param      AutoGenH    The TemplateString object for header file
 #
-def CreatePcdDatabaseCode (Info, AutoGenC, AutoGenH):
+
+
+def CreatePcdDatabaseCode(Info, AutoGenC, AutoGenH):
     if Info.PcdIsDriver == "":
         return
     if Info.PcdIsDriver not in gPcdPhaseMap:
         EdkLogger.error("build", AUTOGEN_ERROR, "Not supported PcdIsDriver type:%s" % Info.PcdIsDriver,
                         ExtraData="[%s]" % str(Info))
 
-    AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer = NewCreatePcdDatabasePhaseSpecificAutoGen (Info.PlatformInfo, 'PEI')
+    AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer = NewCreatePcdDatabasePhaseSpecificAutoGen(
+        Info.PlatformInfo, 'PEI')
     AutoGenH.Append(AdditionalAutoGenH.String)
 
     Phase = gPcdPhaseMap[Info.PcdIsDriver]
@@ -879,26 +908,32 @@ def CreatePcdDatabaseCode (Info, AutoGenC, AutoGenH):
         AutoGenC.Append(AdditionalAutoGenC.String)
 
     if Phase == 'DXE':
-        AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer = NewCreatePcdDatabasePhaseSpecificAutoGen (Info.PlatformInfo, Phase)
+        AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer = NewCreatePcdDatabasePhaseSpecificAutoGen(
+            Info.PlatformInfo, Phase)
         AutoGenH.Append(AdditionalAutoGenH.String)
         AutoGenC.Append(AdditionalAutoGenC.String)
 
     if Info.IsBinaryModule:
-        DbFileName = os.path.join(Info.PlatformInfo.BuildDir, TAB_FV_DIRECTORY, Phase + "PcdDataBase.raw")
+        DbFileName = os.path.join(
+            Info.PlatformInfo.BuildDir, TAB_FV_DIRECTORY, Phase + "PcdDataBase.raw")
     else:
         DbFileName = os.path.join(Info.OutputDir, Phase + "PcdDataBase.raw")
     DbFile = BytesIO()
     DbFile.write(PcdDbBuffer)
     Changed = SaveFileOnChange(DbFileName, DbFile.getvalue(), True)
+
+
 def CreatePcdDataBase(PcdDBData):
     delta = {}
     for skuname, skuid in PcdDBData:
         if len(PcdDBData[(skuname, skuid)][1]) != len(PcdDBData[(TAB_DEFAULT, "0")][1]):
-            EdkLogger.error("build", AUTOGEN_ERROR, "The size of each sku in one pcd are not same")
+            EdkLogger.error("build", AUTOGEN_ERROR,
+                            "The size of each sku in one pcd are not same")
     for skuname, skuid in PcdDBData:
         if skuname == TAB_DEFAULT:
             continue
-        delta[(skuname, skuid)] = [(index, data, hex(data)) for index, data in enumerate(PcdDBData[(skuname, skuid)][1]) if PcdDBData[(skuname, skuid)][1][index] != PcdDBData[(TAB_DEFAULT, "0")][1][index]]
+        delta[(skuname, skuid)] = [(index, data, hex(data)) for index, data in enumerate(PcdDBData[(
+            skuname, skuid)][1]) if PcdDBData[(skuname, skuid)][1][index] != PcdDBData[(TAB_DEFAULT, "0")][1][index]]
     databasebuff = PcdDBData[(TAB_DEFAULT, "0")][0]
 
     for skuname, skuid in delta:
@@ -922,25 +957,30 @@ def CreatePcdDataBase(PcdDBData):
 
     return newbuffer
 
+
 def CreateVarCheckBin(VarCheckTab):
     return VarCheckTab[(TAB_DEFAULT, "0")]
 
+
 def CreateAutoGen(PcdDriverAutoGenData):
     autogenC = TemplateString()
     for skuname, skuid in PcdDriverAutoGenData:
         autogenC.Append("//SKUID: %s" % skuname)
         autogenC.Append(PcdDriverAutoGenData[(skuname, skuid)][1].String)
     return (PcdDriverAutoGenData[(skuname, skuid)][0], autogenC)
+
+
 def NewCreatePcdDatabasePhaseSpecificAutoGen(Platform, Phase):
     def prune_sku(pcd, skuname):
         new_pcd = copy.deepcopy(pcd)
-        new_pcd.SkuInfoList = {skuname:pcd.SkuInfoList[skuname]}
+        new_pcd.SkuInfoList = {skuname: pcd.SkuInfoList[skuname]}
         new_pcd.isinit = 'INIT'
         if new_pcd.DatumType in TAB_PCD_NUMERIC_TYPES:
             for skuobj in pcd.SkuInfoList.values():
                 if skuobj.DefaultValue:
-                    defaultvalue = int(skuobj.DefaultValue, 16) if skuobj.DefaultValue.upper().startswith("0X") else int(skuobj.DefaultValue, 10)
-                    if defaultvalue  != 0:
+                    defaultvalue = int(skuobj.DefaultValue, 16) if skuobj.DefaultValue.upper(
+                    ).startswith("0X") else int(skuobj.DefaultValue, 10)
+                    if defaultvalue != 0:
                         new_pcd.isinit = "INIT"
                         break
                 elif skuobj.VariableName:
@@ -950,64 +990,72 @@ def NewCreatePcdDatabasePhaseSpecificAutoGen(Platform, Phase):
                 new_pcd.isinit = "UNINIT"
         return new_pcd
     DynamicPcds = Platform.DynamicPcdList
-    DynamicPcdSet_Sku = {(SkuName, skuobj.SkuId):[] for pcd in DynamicPcds for (SkuName, skuobj) in pcd.SkuInfoList.items() }
+    DynamicPcdSet_Sku = {(SkuName, skuobj.SkuId): [] for pcd in DynamicPcds for (
+        SkuName, skuobj) in pcd.SkuInfoList.items()}
     for skuname, skuid in DynamicPcdSet_Sku:
-        DynamicPcdSet_Sku[(skuname, skuid)] = [prune_sku(pcd, skuname) for pcd in DynamicPcds]
+        DynamicPcdSet_Sku[(skuname, skuid)] = [
+            prune_sku(pcd, skuname) for pcd in DynamicPcds]
     PcdDBData = {}
     PcdDriverAutoGenData = {}
     VarCheckTableData = {}
     if DynamicPcdSet_Sku:
         for skuname, skuid in DynamicPcdSet_Sku:
-            AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer, VarCheckTab = CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdSet_Sku[(skuname, skuid)], Phase)
+            AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer, VarCheckTab = CreatePcdDatabasePhaseSpecificAutoGen(
+                Platform, DynamicPcdSet_Sku[(skuname, skuid)], Phase)
             final_data = ()
             for item in range(len(PcdDbBuffer)):
                 final_data += unpack("B", PcdDbBuffer[item:item+1])
             PcdDBData[(skuname, skuid)] = (PcdDbBuffer, final_data)
-            PcdDriverAutoGenData[(skuname, skuid)] = (AdditionalAutoGenH, AdditionalAutoGenC)
+            PcdDriverAutoGenData[(skuname, skuid)] = (
+                AdditionalAutoGenH, AdditionalAutoGenC)
             VarCheckTableData[(skuname, skuid)] = VarCheckTab
         if Platform.Platform.VarCheckFlag:
             dest = os.path.join(Platform.BuildDir, TAB_FV_DIRECTORY)
             VarCheckTable = CreateVarCheckBin(VarCheckTableData)
             VarCheckTable.dump(dest, Phase)
-        AdditionalAutoGenH, AdditionalAutoGenC =  CreateAutoGen(PcdDriverAutoGenData)
+        AdditionalAutoGenH, AdditionalAutoGenC = CreateAutoGen(
+            PcdDriverAutoGenData)
     else:
-        AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer, VarCheckTab = CreatePcdDatabasePhaseSpecificAutoGen (Platform, {}, Phase)
+        AdditionalAutoGenH, AdditionalAutoGenC, PcdDbBuffer, VarCheckTab = CreatePcdDatabasePhaseSpecificAutoGen(
+            Platform, {}, Phase)
         final_data = ()
         for item in range(len(PcdDbBuffer)):
             final_data += unpack("B", PcdDbBuffer[item:item + 1])
         PcdDBData[(TAB_DEFAULT, "0")] = (PcdDbBuffer, final_data)
 
     return AdditionalAutoGenH, AdditionalAutoGenC, CreatePcdDataBase(PcdDBData)
-## Create PCD database in DXE or PEI phase
+# Create PCD database in DXE or PEI phase
 #
 #   @param      Platform    The platform object
 #   @retval     tuple       Two TemplateString objects for C code and header file,
 #                           respectively
 #
-def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
+
+
+def CreatePcdDatabasePhaseSpecificAutoGen(Platform, DynamicPcdList, Phase):
     AutoGenC = TemplateString()
     AutoGenH = TemplateString()
 
     Dict = {
-        'PHASE'                         : Phase,
-        'SERVICE_DRIVER_VERSION'        : DATABASE_VERSION,
-        'GUID_TABLE_SIZE'               : '1U',
-        'STRING_TABLE_SIZE'             : '1U',
-        'SKUID_TABLE_SIZE'              : '1U',
-        'LOCAL_TOKEN_NUMBER_TABLE_SIZE' : '0U',
-        'LOCAL_TOKEN_NUMBER'            : '0U',
-        'EXMAPPING_TABLE_SIZE'          : '1U',
-        'EX_TOKEN_NUMBER'               : '0U',
-        'SIZE_TABLE_SIZE'               : '2U',
-        'SKU_HEAD_SIZE'                 : '1U',
-        'GUID_TABLE_EMPTY'              : 'TRUE',
-        'STRING_TABLE_EMPTY'            : 'TRUE',
-        'SKUID_TABLE_EMPTY'             : 'TRUE',
-        'DATABASE_EMPTY'                : 'TRUE',
-        'EXMAP_TABLE_EMPTY'             : 'TRUE',
-        'PCD_DATABASE_UNINIT_EMPTY'     : '  UINT8  dummy; /* PCD_DATABASE_UNINIT is empty */',
-        'SYSTEM_SKU_ID'                 : '  SKU_ID             SystemSkuId;',
-        'SYSTEM_SKU_ID_VALUE'           : '0U'
+        'PHASE': Phase,
+        'SERVICE_DRIVER_VERSION': DATABASE_VERSION,
+        'GUID_TABLE_SIZE': '1U',
+        'STRING_TABLE_SIZE': '1U',
+        'SKUID_TABLE_SIZE': '1U',
+        'LOCAL_TOKEN_NUMBER_TABLE_SIZE': '0U',
+        'LOCAL_TOKEN_NUMBER': '0U',
+        'EXMAPPING_TABLE_SIZE': '1U',
+        'EX_TOKEN_NUMBER': '0U',
+        'SIZE_TABLE_SIZE': '2U',
+        'SKU_HEAD_SIZE': '1U',
+        'GUID_TABLE_EMPTY': 'TRUE',
+        'STRING_TABLE_EMPTY': 'TRUE',
+        'SKUID_TABLE_EMPTY': 'TRUE',
+        'DATABASE_EMPTY': 'TRUE',
+        'EXMAP_TABLE_EMPTY': 'TRUE',
+        'PCD_DATABASE_UNINIT_EMPTY': '  UINT8  dummy; /* PCD_DATABASE_UNINIT is empty */',
+        'SYSTEM_SKU_ID': '  SKU_ID             SystemSkuId;',
+        'SYSTEM_SKU_ID_VALUE': '0U'
     }
 
     SkuObj = Platform.Platform.SkuIdMgr
@@ -1017,20 +1065,20 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
 
     for DatumType in TAB_PCD_NUMERIC_TYPES_VOID:
         Dict['VARDEF_CNAME_' + DatumType] = []
-        Dict['VARDEF_GUID_' + DatumType]  = []
+        Dict['VARDEF_GUID_' + DatumType] = []
         Dict['VARDEF_SKUID_' + DatumType] = []
         Dict['VARDEF_VALUE_' + DatumType] = []
         Dict['VARDEF_DB_VALUE_' + DatumType] = []
         for Init in ['INIT', 'UNINIT']:
-            Dict[Init+'_CNAME_DECL_' + DatumType]   = []
-            Dict[Init+'_GUID_DECL_' + DatumType]    = []
+            Dict[Init+'_CNAME_DECL_' + DatumType] = []
+            Dict[Init+'_GUID_DECL_' + DatumType] = []
             Dict[Init+'_NUMSKUS_DECL_' + DatumType] = []
-            Dict[Init+'_VALUE_' + DatumType]        = []
+            Dict[Init+'_VALUE_' + DatumType] = []
             Dict[Init+'_DB_VALUE_'+DatumType] = []
 
     for Type in ['STRING_HEAD', 'VPD_HEAD', 'VARIABLE_HEAD']:
-        Dict[Type + '_CNAME_DECL']   = []
-        Dict[Type + '_GUID_DECL']    = []
+        Dict[Type + '_CNAME_DECL'] = []
+        Dict[Type + '_GUID_DECL'] = []
         Dict[Type + '_NUMSKUS_DECL'] = []
         Dict[Type + '_VALUE'] = []
 
@@ -1039,23 +1087,23 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
     Dict['VARIABLE_DB_VALUE'] = []
 
     Dict['STRING_TABLE_INDEX'] = []
-    Dict['STRING_TABLE_LENGTH']  = []
+    Dict['STRING_TABLE_LENGTH'] = []
     Dict['STRING_TABLE_CNAME'] = []
-    Dict['STRING_TABLE_GUID']  = []
+    Dict['STRING_TABLE_GUID'] = []
     Dict['STRING_TABLE_VALUE'] = []
     Dict['STRING_TABLE_DB_VALUE'] = []
 
     Dict['SIZE_TABLE_CNAME'] = []
-    Dict['SIZE_TABLE_GUID']  = []
-    Dict['SIZE_TABLE_CURRENT_LENGTH']  = []
-    Dict['SIZE_TABLE_MAXIMUM_LENGTH']  = []
+    Dict['SIZE_TABLE_GUID'] = []
+    Dict['SIZE_TABLE_CURRENT_LENGTH'] = []
+    Dict['SIZE_TABLE_MAXIMUM_LENGTH'] = []
 
     Dict['EXMAPPING_TABLE_EXTOKEN'] = []
     Dict['EXMAPPING_TABLE_LOCAL_TOKEN'] = []
     Dict['EXMAPPING_TABLE_GUID_INDEX'] = []
 
     Dict['GUID_STRUCTURE'] = []
-    Dict['SKUID_VALUE'] = [0] # init Dict length
+    Dict['SKUID_VALUE'] = [0]  # init Dict length
     Dict['VARDEF_HEADER'] = []
 
     Dict['LOCAL_TOKEN_NUMBER_DB_VALUE'] = []
@@ -1084,7 +1132,8 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
     GuidList = []
     VarCheckTab = VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER()
     i = 0
-    ReorderedDynPcdList = GetOrderedDynamicPcdList(DynamicPcdList, Platform.PcdTokenNumber)
+    ReorderedDynPcdList = GetOrderedDynamicPcdList(
+        DynamicPcdList, Platform.PcdTokenNumber)
     for item in ReorderedDynPcdList:
         if item.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
             item.DatumType = TAB_VOID
@@ -1098,7 +1147,8 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
             if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]:
                 CName = PcdItem[0]
 
-        EdkLogger.debug(EdkLogger.DEBUG_3, "PCD: %s %s (%s : %s)" % (CName, TokenSpaceGuidCName, Pcd.Phase, Phase))
+        EdkLogger.debug(EdkLogger.DEBUG_3, "PCD: %s %s (%s : %s)" %
+                        (CName, TokenSpaceGuidCName, Pcd.Phase, Phase))
 
         if Pcd.Phase == 'PEI':
             NumberOfPeiLocalTokens += 1
@@ -1111,7 +1161,8 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
         # TODO: need GetGuidValue() definition
         #
         TokenSpaceGuidStructure = Pcd.TokenSpaceGuidValue
-        TokenSpaceGuid = GuidStructureStringToGuidValueName(TokenSpaceGuidStructure)
+        TokenSpaceGuid = GuidStructureStringToGuidValueName(
+            TokenSpaceGuidStructure)
         if Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET:
             if TokenSpaceGuid not in GuidList:
                 GuidList.append(TokenSpaceGuid)
@@ -1149,17 +1200,20 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
             if SkuId is None or SkuId == '':
                 continue
 
-
             SkuIdIndex += 1
 
             if len(Sku.VariableName) > 0:
                 VariableGuidStructure = Sku.VariableGuidValue
-                VariableGuid = GuidStructureStringToGuidValueName(VariableGuidStructure)
+                VariableGuid = GuidStructureStringToGuidValueName(
+                    VariableGuidStructure)
                 if Platform.Platform.VarCheckFlag:
-                    var_check_obj = VAR_CHECK_PCD_VARIABLE_TAB(VariableGuidStructure, StringToArray(Sku.VariableName))
+                    var_check_obj = VAR_CHECK_PCD_VARIABLE_TAB(
+                        VariableGuidStructure, StringToArray(Sku.VariableName))
                     try:
-                        var_check_obj.push_back(GetValidationObject(Pcd, Sku.VariableOffset))
-                        VarAttr, _ = VariableAttributes.GetVarAttributes(Sku.VariableAttribute)
+                        var_check_obj.push_back(
+                            GetValidationObject(Pcd, Sku.VariableOffset))
+                        VarAttr, _ = VariableAttributes.GetVarAttributes(
+                            Sku.VariableAttribute)
                         var_check_obj.SetAttributes(VarAttr)
                         var_check_obj.UpdateSize()
                         VarCheckTab.push_back(var_check_obj)
@@ -1171,11 +1225,12 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
                             ValidInfo = Pcd.validlists[0]
                         if ValidInfo:
                             EdkLogger.error("build", PCD_VALIDATION_INFO_ERROR,
-                                                "The PCD '%s.%s' Validation information defined in DEC file has incorrect format." % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName),
-                                                ExtraData = "[%s]" % str(ValidInfo))
+                                            "The PCD '%s.%s' Validation information defined in DEC file has incorrect format." % (
+                                                Pcd.TokenSpaceGuidCName, Pcd.TokenCName),
+                                            ExtraData="[%s]" % str(ValidInfo))
                         else:
                             EdkLogger.error("build", PCD_VALIDATION_INFO_ERROR,
-                                                "The PCD '%s.%s' Validation information defined in DEC file has incorrect format." % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
+                                            "The PCD '%s.%s' Validation information defined in DEC file has incorrect format." % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
                 Pcd.TokenTypeList.append('PCD_TYPE_HII')
                 Pcd.InitString = 'INIT'
                 # Store all variable names of one HII PCD under different SKU to stringTable
@@ -1184,9 +1239,11 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
                 VariableNameStructure = StringToArray(Sku.VariableName)
 
                 #  Make pointer of VaraibleName(HII PCD) 2 bytes aligned
-                VariableNameStructureBytes = VariableNameStructure.lstrip("{").rstrip("}").split(",")
+                VariableNameStructureBytes = VariableNameStructure.lstrip(
+                    "{").rstrip("}").split(",")
                 if len(VariableNameStructureBytes) % 2:
-                    VariableNameStructure = "{%s,0x00}" % ",".join(VariableNameStructureBytes)
+                    VariableNameStructure = "{%s,0x00}" % ",".join(
+                        VariableNameStructureBytes)
 
                 if VariableNameStructure not in Dict['STRING_TABLE_VALUE']:
                     Dict['STRING_TABLE_CNAME'].append(CName)
@@ -1194,16 +1251,19 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
                     if StringTableIndex == 0:
                         Dict['STRING_TABLE_INDEX'].append('')
                     else:
-                        Dict['STRING_TABLE_INDEX'].append('_%d' % StringTableIndex)
-                    VarNameSize = len(VariableNameStructure.replace(',', ' ').split())
-                    Dict['STRING_TABLE_LENGTH'].append(VarNameSize )
+                        Dict['STRING_TABLE_INDEX'].append(
+                            '_%d' % StringTableIndex)
+                    VarNameSize = len(
+                        VariableNameStructure.replace(',', ' ').split())
+                    Dict['STRING_TABLE_LENGTH'].append(VarNameSize)
                     Dict['STRING_TABLE_VALUE'].append(VariableNameStructure)
                     StringHeadOffsetList.append(str(StringTableSize) + 'U')
                     VarStringDbOffsetList = []
                     VarStringDbOffsetList.append(StringTableSize)
                     Dict['STRING_DB_VALUE'].append(VarStringDbOffsetList)
                     StringTableIndex += 1
-                    StringTableSize += len(VariableNameStructure.replace(',', ' ').split())
+                    StringTableSize += len(
+                        VariableNameStructure.replace(',', ' ').split())
                 VariableHeadStringIndex = 0
                 for Index in range(Dict['STRING_TABLE_VALUE'].index(VariableNameStructure)):
                     VariableHeadStringIndex += Dict['STRING_TABLE_LENGTH'][Index]
@@ -1220,32 +1280,38 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
                 if "PCD_TYPE_STRING" in Pcd.TokenTypeList:
                     VariableHeadValueList.append('%dU, offsetof(%s_PCD_DATABASE, Init.%s_%s), %dU, %sU' %
                                                  (VariableHeadStringIndex, Phase, CName, TokenSpaceGuid,
-                                                 VariableHeadGuidIndex, Sku.VariableOffset))
+                                                  VariableHeadGuidIndex, Sku.VariableOffset))
                 else:
                     VariableHeadValueList.append('%dU, offsetof(%s_PCD_DATABASE, Init.%s_%s_VariableDefault_%s), %dU, %sU' %
                                                  (VariableHeadStringIndex, Phase, CName, TokenSpaceGuid, SkuIdIndex,
-                                                 VariableHeadGuidIndex, Sku.VariableOffset))
+                                                  VariableHeadGuidIndex, Sku.VariableOffset))
                 Dict['VARDEF_CNAME_'+Pcd.DatumType].append(CName)
                 Dict['VARDEF_GUID_'+Pcd.DatumType].append(TokenSpaceGuid)
                 Dict['VARDEF_SKUID_'+Pcd.DatumType].append(SkuIdIndex)
-                if "PCD_TYPE_STRING" in  Pcd.TokenTypeList:
-                    Dict['VARDEF_VALUE_' + Pcd.DatumType].append("%s_%s[%d]" % (Pcd.TokenCName, TokenSpaceGuid, SkuIdIndex))
+                if "PCD_TYPE_STRING" in Pcd.TokenTypeList:
+                    Dict['VARDEF_VALUE_' + Pcd.DatumType].append(
+                        "%s_%s[%d]" % (Pcd.TokenCName, TokenSpaceGuid, SkuIdIndex))
                 else:
                     #
                     # ULL (for UINT64) or U(other integer type) should be append to avoid
                     # warning under linux building environment.
                     #
-                    Dict['VARDEF_DB_VALUE_'+Pcd.DatumType].append(Sku.HiiDefaultValue)
+                    Dict['VARDEF_DB_VALUE_' +
+                         Pcd.DatumType].append(Sku.HiiDefaultValue)
 
                     if Pcd.DatumType == TAB_UINT64:
-                        Dict['VARDEF_VALUE_'+Pcd.DatumType].append(Sku.HiiDefaultValue + "ULL")
+                        Dict['VARDEF_VALUE_' +
+                             Pcd.DatumType].append(Sku.HiiDefaultValue + "ULL")
                     elif Pcd.DatumType in (TAB_UINT32, TAB_UINT16, TAB_UINT8):
-                        Dict['VARDEF_VALUE_'+Pcd.DatumType].append(Sku.HiiDefaultValue + "U")
+                        Dict['VARDEF_VALUE_' +
+                             Pcd.DatumType].append(Sku.HiiDefaultValue + "U")
                     elif Pcd.DatumType == "BOOLEAN":
                         if eval(Sku.HiiDefaultValue) in [1, 0]:
-                            Dict['VARDEF_VALUE_'+Pcd.DatumType].append(str(eval(Sku.HiiDefaultValue)) + "U")
+                            Dict['VARDEF_VALUE_'+Pcd.DatumType].append(
+                                str(eval(Sku.HiiDefaultValue)) + "U")
                     else:
-                        Dict['VARDEF_VALUE_'+Pcd.DatumType].append(Sku.HiiDefaultValue)
+                        Dict['VARDEF_VALUE_' +
+                             Pcd.DatumType].append(Sku.HiiDefaultValue)
 
                 # construct the VariableHeader value
                 if "PCD_TYPE_STRING" in Pcd.TokenTypeList:
@@ -1260,9 +1326,11 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
                                                  (VariableHeadGuidIndex, VariableHeadStringIndex, Sku.VariableOffset,
                                                   Phase, CName, TokenSpaceGuid, SkuIdIndex))
                     # the Pcd default value was filled before
-                    VariableOffset = len(Dict['VARDEF_DB_VALUE_' + Pcd.DatumType]) - 1
+                    VariableOffset = len(
+                        Dict['VARDEF_DB_VALUE_' + Pcd.DatumType]) - 1
                     VariableRefTable = Dict['VARDEF_DB_VALUE_' + Pcd.DatumType]
-                VariableDbValueList.append([VariableHeadGuidIndex, VariableHeadStringIndex, Sku.VariableOffset, VariableOffset, VariableRefTable, Sku.VariableAttribute])
+                VariableDbValueList.append([VariableHeadGuidIndex, VariableHeadStringIndex,
+                                           Sku.VariableOffset, VariableOffset, VariableRefTable, Sku.VariableAttribute])
 
             elif Sku.VpdOffset != '':
                 Pcd.TokenTypeList.append('PCD_TYPE_VPD')
@@ -1289,19 +1357,27 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
                     if StringTableIndex == 0:
                         Dict['STRING_TABLE_INDEX'].append('')
                     else:
-                        Dict['STRING_TABLE_INDEX'].append('_%d' % StringTableIndex)
+                        Dict['STRING_TABLE_INDEX'].append(
+                            '_%d' % StringTableIndex)
                     if Sku.DefaultValue[0] == 'L':
-                        DefaultValueBinStructure = StringToArray(Sku.DefaultValue)
-                        Size = len(DefaultValueBinStructure.replace(',', ' ').split())
-                        Dict['STRING_TABLE_VALUE'].append(DefaultValueBinStructure)
+                        DefaultValueBinStructure = StringToArray(
+                            Sku.DefaultValue)
+                        Size = len(DefaultValueBinStructure.replace(
+                            ',', ' ').split())
+                        Dict['STRING_TABLE_VALUE'].append(
+                            DefaultValueBinStructure)
                     elif Sku.DefaultValue[0] == '"':
-                        DefaultValueBinStructure = StringToArray(Sku.DefaultValue)
+                        DefaultValueBinStructure = StringToArray(
+                            Sku.DefaultValue)
                         Size = len(Sku.DefaultValue) - 2 + 1
-                        Dict['STRING_TABLE_VALUE'].append(DefaultValueBinStructure)
+                        Dict['STRING_TABLE_VALUE'].append(
+                            DefaultValueBinStructure)
                     elif Sku.DefaultValue[0] == '{':
-                        DefaultValueBinStructure = StringToArray(Sku.DefaultValue)
+                        DefaultValueBinStructure = StringToArray(
+                            Sku.DefaultValue)
                         Size = len(Sku.DefaultValue.split(","))
-                        Dict['STRING_TABLE_VALUE'].append(DefaultValueBinStructure)
+                        Dict['STRING_TABLE_VALUE'].append(
+                            DefaultValueBinStructure)
 
                     StringHeadOffsetList.append(str(StringTableSize) + 'U')
                     StringDbOffsetList.append(StringTableSize)
@@ -1310,8 +1386,9 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
                         if MaxDatumSize < Size:
                             if Pcd.MaxSizeUserSet:
                                 EdkLogger.error("build", AUTOGEN_ERROR,
-                                            "The maximum size of VOID* type PCD '%s.%s' is less than its actual size occupied." % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName),
-                                            ExtraData="[%s]" % str(Platform))
+                                                "The maximum size of VOID* type PCD '%s.%s' is less than its actual size occupied." % (
+                                                    Pcd.TokenSpaceGuidCName, Pcd.TokenCName),
+                                                ExtraData="[%s]" % str(Platform))
                             else:
                                 MaxDatumSize = Size
                     else:
@@ -1351,16 +1428,16 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
         if Pcd.DatumType == TAB_VOID:
             Dict['SIZE_TABLE_CNAME'].append(CName)
             Dict['SIZE_TABLE_GUID'].append(TokenSpaceGuid)
-            Dict['SIZE_TABLE_MAXIMUM_LENGTH'].append(str(Pcd.MaxDatumSize) + 'U')
+            Dict['SIZE_TABLE_MAXIMUM_LENGTH'].append(
+                str(Pcd.MaxDatumSize) + 'U')
             Dict['SIZE_TABLE_CURRENT_LENGTH'].append(VoidStarTypeCurrSize)
 
-
-
         if 'PCD_TYPE_HII' in Pcd.TokenTypeList:
             Dict['VARIABLE_HEAD_CNAME_DECL'].append(CName)
             Dict['VARIABLE_HEAD_GUID_DECL'].append(TokenSpaceGuid)
             Dict['VARIABLE_HEAD_NUMSKUS_DECL'].append(len(Pcd.SkuInfoList))
-            Dict['VARIABLE_HEAD_VALUE'].append('{ %s }\n' % ' },\n    { '.join(VariableHeadValueList))
+            Dict['VARIABLE_HEAD_VALUE'].append(
+                '{ %s }\n' % ' },\n    { '.join(VariableHeadValueList))
             Dict['VARDEF_HEADER'].append('_Variable_Header')
             Dict['VARIABLE_DB_VALUE'].append(VariableDbValueList)
         else:
@@ -1369,7 +1446,8 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
             Dict['VPD_HEAD_CNAME_DECL'].append(CName)
             Dict['VPD_HEAD_GUID_DECL'].append(TokenSpaceGuid)
             Dict['VPD_HEAD_NUMSKUS_DECL'].append(len(Pcd.SkuInfoList))
-            Dict['VPD_HEAD_VALUE'].append('{ %s }' % ' }, { '.join(VpdHeadOffsetList))
+            Dict['VPD_HEAD_VALUE'].append(
+                '{ %s }' % ' }, { '.join(VpdHeadOffsetList))
             Dict['VPD_DB_VALUE'].append(VpdDbOffsetList)
         if 'PCD_TYPE_STRING' in Pcd.TokenTypeList:
             Dict['STRING_HEAD_CNAME_DECL'].append(CName)
@@ -1377,28 +1455,34 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
             Dict['STRING_HEAD_NUMSKUS_DECL'].append(len(Pcd.SkuInfoList))
             Dict['STRING_HEAD_VALUE'].append(', '.join(StringHeadOffsetList))
             Dict['STRING_DB_VALUE'].append(StringDbOffsetList)
-            PCD_STRING_INDEX_MAP[len(Dict['STRING_HEAD_CNAME_DECL']) -1 ] = len(Dict['STRING_DB_VALUE']) -1
+            PCD_STRING_INDEX_MAP[len(
+                Dict['STRING_HEAD_CNAME_DECL']) - 1] = len(Dict['STRING_DB_VALUE']) - 1
         if 'PCD_TYPE_DATA' in Pcd.TokenTypeList:
             Dict[Pcd.InitString+'_CNAME_DECL_'+Pcd.DatumType].append(CName)
-            Dict[Pcd.InitString+'_GUID_DECL_'+Pcd.DatumType].append(TokenSpaceGuid)
-            Dict[Pcd.InitString+'_NUMSKUS_DECL_'+Pcd.DatumType].append(len(Pcd.SkuInfoList))
+            Dict[Pcd.InitString+'_GUID_DECL_' +
+                 Pcd.DatumType].append(TokenSpaceGuid)
+            Dict[Pcd.InitString+'_NUMSKUS_DECL_' +
+                 Pcd.DatumType].append(len(Pcd.SkuInfoList))
             if Pcd.InitString == 'UNINIT':
                 Dict['PCD_DATABASE_UNINIT_EMPTY'] = ''
             else:
-                Dict[Pcd.InitString+'_VALUE_'+Pcd.DatumType].append(', '.join(ValueList))
-                Dict[Pcd.InitString+'_DB_VALUE_'+Pcd.DatumType].append(DbValueList)
+                Dict[Pcd.InitString+'_VALUE_' +
+                     Pcd.DatumType].append(', '.join(ValueList))
+                Dict[Pcd.InitString+'_DB_VALUE_' +
+                     Pcd.DatumType].append(DbValueList)
 
     if Phase == 'PEI':
         NumberOfLocalTokens = NumberOfPeiLocalTokens
     if Phase == 'DXE':
         NumberOfLocalTokens = NumberOfDxeLocalTokens
 
-    Dict['TOKEN_INIT']       = ['' for x in range(NumberOfLocalTokens)]
-    Dict['TOKEN_CNAME']      = ['' for x in range(NumberOfLocalTokens)]
-    Dict['TOKEN_GUID']       = ['' for x in range(NumberOfLocalTokens)]
-    Dict['TOKEN_TYPE']       = ['' for x in range(NumberOfLocalTokens)]
-    Dict['LOCAL_TOKEN_NUMBER_DB_VALUE'] = ['' for x in range(NumberOfLocalTokens)]
-    Dict['PCD_CNAME']        = ['' for x in range(NumberOfLocalTokens)]
+    Dict['TOKEN_INIT'] = ['' for x in range(NumberOfLocalTokens)]
+    Dict['TOKEN_CNAME'] = ['' for x in range(NumberOfLocalTokens)]
+    Dict['TOKEN_GUID'] = ['' for x in range(NumberOfLocalTokens)]
+    Dict['TOKEN_TYPE'] = ['' for x in range(NumberOfLocalTokens)]
+    Dict['LOCAL_TOKEN_NUMBER_DB_VALUE'] = [
+        '' for x in range(NumberOfLocalTokens)]
+    Dict['PCD_CNAME'] = ['' for x in range(NumberOfLocalTokens)]
     Dict['PCD_TOKENSPACE_MAP'] = ['' for x in range(NumberOfLocalTokens)]
     Dict['PCD_CNAME_LENGTH'] = [0 for x in range(NumberOfLocalTokens)]
     SkuEnablePcdIndex = 0
@@ -1408,8 +1492,11 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
         if Pcd.Phase != Phase:
             continue
 
-        TokenSpaceGuid = GuidStructureStringToGuidValueName(Pcd.TokenSpaceGuidValue) #(Platform.PackageList, TokenSpaceGuidCName))
-        GeneratedTokenNumber = Platform.PcdTokenNumber[CName, TokenSpaceGuidCName] - 1
+        # (Platform.PackageList, TokenSpaceGuidCName))
+        TokenSpaceGuid = GuidStructureStringToGuidValueName(
+            Pcd.TokenSpaceGuidValue)
+        GeneratedTokenNumber = Platform.PcdTokenNumber[CName,
+                                                       TokenSpaceGuidCName] - 1
         if Phase == 'DXE':
             GeneratedTokenNumber -= NumberOfPeiLocalTokens
 
@@ -1421,9 +1508,11 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
             if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]:
                 CName = PcdItem[0]
 
-        EdkLogger.debug(EdkLogger.DEBUG_1, "PCD = %s.%s" % (CName, TokenSpaceGuidCName))
+        EdkLogger.debug(EdkLogger.DEBUG_1, "PCD = %s.%s" %
+                        (CName, TokenSpaceGuidCName))
         EdkLogger.debug(EdkLogger.DEBUG_1, "phase = %s" % Phase)
-        EdkLogger.debug(EdkLogger.DEBUG_1, "GeneratedTokenNumber = %s" % str(GeneratedTokenNumber))
+        EdkLogger.debug(EdkLogger.DEBUG_1, "GeneratedTokenNumber = %s" %
+                        str(GeneratedTokenNumber))
 
         #
         # following four Dict items hold the information for LocalTokenNumberTable
@@ -1434,47 +1523,56 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
 
         Dict['TOKEN_CNAME'][GeneratedTokenNumber] = CName
         Dict['TOKEN_GUID'][GeneratedTokenNumber] = TokenSpaceGuid
-        Dict['TOKEN_TYPE'][GeneratedTokenNumber] = ' | '.join(Pcd.TokenTypeList)
+        Dict['TOKEN_TYPE'][GeneratedTokenNumber] = ' | '.join(
+            Pcd.TokenTypeList)
 
         if Platform.Platform.PcdInfoFlag:
-            TokenSpaceGuidCNameArray = StringToArray('"' + TokenSpaceGuidCName + '"' )
+            TokenSpaceGuidCNameArray = StringToArray(
+                '"' + TokenSpaceGuidCName + '"')
             if TokenSpaceGuidCNameArray not in Dict['PCD_TOKENSPACE']:
                 Dict['PCD_TOKENSPACE'].append(TokenSpaceGuidCNameArray)
-                Dict['PCD_TOKENSPACE_LENGTH'].append( len(TokenSpaceGuidCNameArray.split(",")) )
-            Dict['PCD_TOKENSPACE_MAP'][GeneratedTokenNumber] = Dict['PCD_TOKENSPACE'].index(TokenSpaceGuidCNameArray)
-            CNameBinArray = StringToArray('"' + CName + '"' )
+                Dict['PCD_TOKENSPACE_LENGTH'].append(
+                    len(TokenSpaceGuidCNameArray.split(",")))
+            Dict['PCD_TOKENSPACE_MAP'][GeneratedTokenNumber] = Dict['PCD_TOKENSPACE'].index(
+                TokenSpaceGuidCNameArray)
+            CNameBinArray = StringToArray('"' + CName + '"')
             Dict['PCD_CNAME'][GeneratedTokenNumber] = CNameBinArray
 
-            Dict['PCD_CNAME_LENGTH'][GeneratedTokenNumber] = len(CNameBinArray.split(","))
-
+            Dict['PCD_CNAME_LENGTH'][GeneratedTokenNumber] = len(
+                CNameBinArray.split(","))
 
         Pcd.TokenTypeList = list(set(Pcd.TokenTypeList))
 
         # search the Offset and Table, used by LocalTokenNumberTableOffset
         if 'PCD_TYPE_HII' in Pcd.TokenTypeList:
             # Find index by CName, TokenSpaceGuid
-            Offset = GetMatchedIndex(CName, Dict['VARIABLE_HEAD_CNAME_DECL'], TokenSpaceGuid, Dict['VARIABLE_HEAD_GUID_DECL'])
+            Offset = GetMatchedIndex(
+                CName, Dict['VARIABLE_HEAD_CNAME_DECL'], TokenSpaceGuid, Dict['VARIABLE_HEAD_GUID_DECL'])
             assert(Offset != -1)
             Table = Dict['VARIABLE_DB_VALUE']
         if 'PCD_TYPE_VPD' in Pcd.TokenTypeList:
-            Offset = GetMatchedIndex(CName, Dict['VPD_HEAD_CNAME_DECL'], TokenSpaceGuid, Dict['VPD_HEAD_GUID_DECL'])
+            Offset = GetMatchedIndex(
+                CName, Dict['VPD_HEAD_CNAME_DECL'], TokenSpaceGuid, Dict['VPD_HEAD_GUID_DECL'])
             assert(Offset != -1)
             Table = Dict['VPD_DB_VALUE']
         if 'PCD_TYPE_STRING' in Pcd.TokenTypeList and 'PCD_TYPE_HII' not in Pcd.TokenTypeList:
             # Find index by CName, TokenSpaceGuid
-            Offset = GetMatchedIndex(CName, Dict['STRING_HEAD_CNAME_DECL'], TokenSpaceGuid, Dict['STRING_HEAD_GUID_DECL'])
+            Offset = GetMatchedIndex(
+                CName, Dict['STRING_HEAD_CNAME_DECL'], TokenSpaceGuid, Dict['STRING_HEAD_GUID_DECL'])
             Offset = PCD_STRING_INDEX_MAP[Offset]
             assert(Offset != -1)
             Table = Dict['STRING_DB_VALUE']
         if 'PCD_TYPE_DATA' in Pcd.TokenTypeList:
             # need to store whether it is in init table or not
-            Offset = GetMatchedIndex(CName, Dict[Pcd.InitString+'_CNAME_DECL_'+Pcd.DatumType], TokenSpaceGuid, Dict[Pcd.InitString+'_GUID_DECL_'+Pcd.DatumType])
+            Offset = GetMatchedIndex(CName, Dict[Pcd.InitString+'_CNAME_DECL_'+Pcd.DatumType],
+                                     TokenSpaceGuid, Dict[Pcd.InitString+'_GUID_DECL_'+Pcd.DatumType])
             assert(Offset != -1)
             if Pcd.InitString == 'UNINIT':
-                Table =  Dict[Pcd.InitString+'_GUID_DECL_'+Pcd.DatumType]
+                Table = Dict[Pcd.InitString+'_GUID_DECL_'+Pcd.DatumType]
             else:
                 Table = Dict[Pcd.InitString+'_DB_VALUE_'+Pcd.DatumType]
-        Dict['LOCAL_TOKEN_NUMBER_DB_VALUE'][GeneratedTokenNumber] = (Offset, Table)
+        Dict['LOCAL_TOKEN_NUMBER_DB_VALUE'][GeneratedTokenNumber] = (
+            Offset, Table)
 
         #
         # Update VARDEF_HEADER
@@ -1484,7 +1582,6 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
         else:
             Dict['VARDEF_HEADER'][GeneratedTokenNumber] = ''
 
-
         if Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET:
 
             if Phase == 'DXE':
@@ -1499,10 +1596,11 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
             # to the EXMAPPING_TABLE.
             #
 
-
             Dict['EXMAPPING_TABLE_EXTOKEN'].append(str(Pcd.TokenValue) + 'U')
-            Dict['EXMAPPING_TABLE_LOCAL_TOKEN'].append(str(GeneratedTokenNumber + 1) + 'U')
-            Dict['EXMAPPING_TABLE_GUID_INDEX'].append(str(GuidList.index(TokenSpaceGuid)) + 'U')
+            Dict['EXMAPPING_TABLE_LOCAL_TOKEN'].append(
+                str(GeneratedTokenNumber + 1) + 'U')
+            Dict['EXMAPPING_TABLE_GUID_INDEX'].append(
+                str(GuidList.index(TokenSpaceGuid)) + 'U')
 
     if Platform.Platform.PcdInfoFlag:
         for index in range(len(Dict['PCD_TOKENSPACE_MAP'])):
@@ -1515,7 +1613,8 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
             StringTableIndex += 1
         for index in range(len(Dict['PCD_CNAME'])):
             Dict['PCD_CNAME_OFFSET'].append(StringTableSize)
-            Dict['PCD_NAME_OFFSET'].append(Dict['PCD_TOKENSPACE_OFFSET'][index])
+            Dict['PCD_NAME_OFFSET'].append(
+                Dict['PCD_TOKENSPACE_OFFSET'][index])
             Dict['PCD_NAME_OFFSET'].append(StringTableSize)
             StringTableSize += Dict['PCD_CNAME_LENGTH'][index]
             StringTableIndex += 1
@@ -1523,7 +1622,8 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
         Dict['GUID_TABLE_EMPTY'] = 'FALSE'
         Dict['GUID_TABLE_SIZE'] = str(len(GuidList)) + 'U'
     else:
-        Dict['GUID_STRUCTURE'] = [GuidStringToGuidStructureString('00000000-0000-0000-0000-000000000000')]
+        Dict['GUID_STRUCTURE'] = [GuidStringToGuidStructureString(
+            '00000000-0000-0000-0000-000000000000')]
 
     if StringTableIndex == 0:
         Dict['STRING_TABLE_INDEX'].append('')
@@ -1542,14 +1642,14 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
         Dict['SIZE_TABLE_MAXIMUM_LENGTH'].append('0U')
 
     if NumberOfLocalTokens != 0:
-        Dict['DATABASE_EMPTY']                = 'FALSE'
+        Dict['DATABASE_EMPTY'] = 'FALSE'
         Dict['LOCAL_TOKEN_NUMBER_TABLE_SIZE'] = NumberOfLocalTokens
-        Dict['LOCAL_TOKEN_NUMBER']            = NumberOfLocalTokens
+        Dict['LOCAL_TOKEN_NUMBER'] = NumberOfLocalTokens
 
     if NumberOfExTokens != 0:
-        Dict['EXMAP_TABLE_EMPTY']    = 'FALSE'
+        Dict['EXMAP_TABLE_EMPTY'] = 'FALSE'
         Dict['EXMAPPING_TABLE_SIZE'] = str(NumberOfExTokens) + 'U'
-        Dict['EX_TOKEN_NUMBER']      = str(NumberOfExTokens) + 'U'
+        Dict['EX_TOKEN_NUMBER'] = str(NumberOfExTokens) + 'U'
     else:
         Dict['EXMAPPING_TABLE_EXTOKEN'].append('0U')
         Dict['EXMAPPING_TABLE_LOCAL_TOKEN'].append('0U')
@@ -1587,11 +1687,15 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
             for Count in range(len(Dict['TOKEN_CNAME'])):
                 for Count1 in range(len(Dict['SIZE_TABLE_CNAME'])):
                     if Dict['TOKEN_CNAME'][Count] == Dict['SIZE_TABLE_CNAME'][Count1] and \
-                        Dict['TOKEN_GUID'][Count] == Dict['SIZE_TABLE_GUID'][Count1]:
-                        SizeCNameTempList.append(Dict['SIZE_TABLE_CNAME'][Count1])
-                        SizeGuidTempList.append(Dict['SIZE_TABLE_GUID'][Count1])
-                        SizeCurLenTempList.append(Dict['SIZE_TABLE_CURRENT_LENGTH'][Count1])
-                        SizeMaxLenTempList.append(Dict['SIZE_TABLE_MAXIMUM_LENGTH'][Count1])
+                            Dict['TOKEN_GUID'][Count] == Dict['SIZE_TABLE_GUID'][Count1]:
+                        SizeCNameTempList.append(
+                            Dict['SIZE_TABLE_CNAME'][Count1])
+                        SizeGuidTempList.append(
+                            Dict['SIZE_TABLE_GUID'][Count1])
+                        SizeCurLenTempList.append(
+                            Dict['SIZE_TABLE_CURRENT_LENGTH'][Count1])
+                        SizeMaxLenTempList.append(
+                            Dict['SIZE_TABLE_MAXIMUM_LENGTH'][Count1])
 
             for Count in range(len(Dict['SIZE_TABLE_CNAME'])):
                 Dict['SIZE_TABLE_CNAME'][Count] = SizeCNameTempList[Count]
@@ -1606,10 +1710,11 @@ def CreatePcdDatabasePhaseSpecificAutoGen (Platform, DynamicPcdList, Phase):
     Buffer = BuildExDataBase(Dict)
     return AutoGenH, AutoGenC, Buffer, VarCheckTab
 
+
 def GetOrderedDynamicPcdList(DynamicPcdList, PcdTokenNumberList):
     ReorderedDyPcdList = [None for i in range(len(DynamicPcdList))]
     for Pcd in DynamicPcdList:
         if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) in PcdTokenNumberList:
-            ReorderedDyPcdList[PcdTokenNumberList[Pcd.TokenCName, Pcd.TokenSpaceGuidCName]-1] = Pcd
+            ReorderedDyPcdList[PcdTokenNumberList[Pcd.TokenCName,
+                                                  Pcd.TokenSpaceGuidCName]-1] = Pcd
     return ReorderedDyPcdList
-
diff --git a/BaseTools/Source/Python/AutoGen/GenVar.py b/BaseTools/Source/Python/AutoGen/GenVar.py
index f2ad54ba630e..8b7ff2f66196 100644
--- a/BaseTools/Source/Python/AutoGen/GenVar.py
+++ b/BaseTools/Source/Python/AutoGen/GenVar.py
@@ -17,11 +17,13 @@ import collections
 import Common.DataType as DataType
 import Common.GlobalData as GlobalData
 
-var_info = collections.namedtuple("uefi_var", "pcdindex,pcdname,defaultstoragename,skuname,var_name, var_guid, var_offset,var_attribute,pcd_default_value, default_value, data_type,PcdDscLine,StructurePcd")
+var_info = collections.namedtuple(
+    "uefi_var", "pcdindex,pcdname,defaultstoragename,skuname,var_name, var_guid, var_offset,var_attribute,pcd_default_value, default_value, data_type,PcdDscLine,StructurePcd")
 NvStorageHeaderSize = 28
 VariableHeaderSize = 32
 AuthenticatedVariableHeaderSize = 60
 
+
 class VariableMgr(object):
     def __init__(self, DefaultStoreMap, SkuIdMap):
         self.VarInfo = []
@@ -46,9 +48,11 @@ class VariableMgr(object):
         if not self.NVHeaderBuff:
             return ""
         self.NVHeaderBuff = self.NVHeaderBuff[:8] + pack("=Q", maxsize)
-        default_var_bin = VariableMgr.format_data(self.NVHeaderBuff + self.VarDefaultBuff + self.VarDeltaBuff)
+        default_var_bin = VariableMgr.format_data(
+            self.NVHeaderBuff + self.VarDefaultBuff + self.VarDeltaBuff)
         value_str = "{"
-        default_var_bin_strip = [ data.strip("""'""") for data in default_var_bin]
+        default_var_bin_strip = [data.strip(
+            """'""") for data in default_var_bin]
         value_str += ",".join(default_var_bin_strip)
         value_str += "}"
         return value_str
@@ -57,13 +61,17 @@ class VariableMgr(object):
         indexedvarinfo = collections.OrderedDict()
         for item in self.VarInfo:
             if (item.skuname, item.defaultstoragename, item.var_name, item.var_guid) not in indexedvarinfo:
-                indexedvarinfo[(item.skuname, item.defaultstoragename, item.var_name, item.var_guid) ] = []
-            indexedvarinfo[(item.skuname, item.defaultstoragename, item.var_name, item.var_guid)].append(item)
+                indexedvarinfo[(item.skuname, item.defaultstoragename,
+                                item.var_name, item.var_guid)] = []
+            indexedvarinfo[(item.skuname, item.defaultstoragename,
+                            item.var_name, item.var_guid)].append(item)
         for key in indexedvarinfo:
             sku_var_info_offset_list = indexedvarinfo[key]
-            sku_var_info_offset_list.sort(key=lambda x:x.PcdDscLine)
-            FirstOffset = int(sku_var_info_offset_list[0].var_offset, 16) if sku_var_info_offset_list[0].var_offset.upper().startswith("0X") else int(sku_var_info_offset_list[0].var_offset)
-            fisrtvalue_list = sku_var_info_offset_list[0].default_value.strip("{").strip("}").split(",")
+            sku_var_info_offset_list.sort(key=lambda x: x.PcdDscLine)
+            FirstOffset = int(sku_var_info_offset_list[0].var_offset, 16) if sku_var_info_offset_list[0].var_offset.upper(
+            ).startswith("0X") else int(sku_var_info_offset_list[0].var_offset)
+            fisrtvalue_list = sku_var_info_offset_list[0].default_value.strip(
+                "{").strip("}").split(",")
             firstdata_type = sku_var_info_offset_list[0].data_type
             if firstdata_type in DataType.TAB_PCD_NUMERIC_TYPES:
                 fisrtdata_flag = DataType.PACK_CODE_BY_SIZE[MAX_SIZE_TYPE[firstdata_type]]
@@ -71,12 +79,15 @@ class VariableMgr(object):
                 fisrtvalue_list = []
                 pack_data = pack(fisrtdata_flag, int(fisrtdata, 0))
                 for data_byte in range(len(pack_data)):
-                    fisrtvalue_list.append(hex(unpack("B", pack_data[data_byte:data_byte + 1])[0]))
+                    fisrtvalue_list.append(
+                        hex(unpack("B", pack_data[data_byte:data_byte + 1])[0]))
             newvalue_list = ["0x00"] * FirstOffset + fisrtvalue_list
 
             for var_item in sku_var_info_offset_list[1:]:
-                CurOffset = int(var_item.var_offset, 16) if var_item.var_offset.upper().startswith("0X") else int(var_item.var_offset)
-                CurvalueList = var_item.default_value.strip("{").strip("}").split(",")
+                CurOffset = int(var_item.var_offset, 16) if var_item.var_offset.upper(
+                ).startswith("0X") else int(var_item.var_offset)
+                CurvalueList = var_item.default_value.strip(
+                    "{").strip("}").split(",")
                 Curdata_type = var_item.data_type
                 if Curdata_type in DataType.TAB_PCD_NUMERIC_TYPES:
                     data_flag = DataType.PACK_CODE_BY_SIZE[MAX_SIZE_TYPE[Curdata_type]]
@@ -84,15 +95,20 @@ class VariableMgr(object):
                     CurvalueList = []
                     pack_data = pack(data_flag, int(data, 0))
                     for data_byte in range(len(pack_data)):
-                        CurvalueList.append(hex(unpack("B", pack_data[data_byte:data_byte + 1])[0]))
+                        CurvalueList.append(
+                            hex(unpack("B", pack_data[data_byte:data_byte + 1])[0]))
                 if CurOffset > len(newvalue_list):
-                    newvalue_list = newvalue_list + ["0x00"] * (CurOffset - len(newvalue_list)) + CurvalueList
+                    newvalue_list = newvalue_list + \
+                        ["0x00"] * (CurOffset - len(newvalue_list)
+                                    ) + CurvalueList
                 else:
-                    newvalue_list[CurOffset : CurOffset + len(CurvalueList)] = CurvalueList
+                    newvalue_list[CurOffset: CurOffset +
+                                  len(CurvalueList)] = CurvalueList
 
-            newvaluestr =  "{" + ",".join(newvalue_list) +"}"
+            newvaluestr = "{" + ",".join(newvalue_list) + "}"
             n = sku_var_info_offset_list[0]
-            indexedvarinfo[key] =  [var_info(n.pcdindex, n.pcdname, n.defaultstoragename, n.skuname, n.var_name, n.var_guid, "0x00", n.var_attribute, newvaluestr, newvaluestr, DataType.TAB_VOID,n.PcdDscLine,n.StructurePcd)]
+            indexedvarinfo[key] = [var_info(n.pcdindex, n.pcdname, n.defaultstoragename, n.skuname, n.var_name, n.var_guid,
+                                            "0x00", n.var_attribute, newvaluestr, newvaluestr, DataType.TAB_VOID, n.PcdDscLine, n.StructurePcd)]
         self.VarInfo = [item[0] for item in list(indexedvarinfo.values())]
 
     def process_variable_data(self):
@@ -103,7 +119,8 @@ class VariableMgr(object):
         for item in self.VarInfo:
             if item.pcdindex not in indexedvarinfo:
                 indexedvarinfo[item.pcdindex] = dict()
-            indexedvarinfo[item.pcdindex][(item.skuname, item.defaultstoragename)] = item
+            indexedvarinfo[item.pcdindex][(
+                item.skuname, item.defaultstoragename)] = item
 
         for index in indexedvarinfo:
             sku_var_info = indexedvarinfo[index]
@@ -111,40 +128,52 @@ class VariableMgr(object):
             default_data_buffer = ""
             others_data_buffer = ""
             tail = None
-            default_sku_default = indexedvarinfo[index].get((DataType.TAB_DEFAULT, DataType.TAB_DEFAULT_STORES_DEFAULT))
+            default_sku_default = indexedvarinfo[index].get(
+                (DataType.TAB_DEFAULT, DataType.TAB_DEFAULT_STORES_DEFAULT))
 
             if default_sku_default.data_type not in DataType.TAB_PCD_NUMERIC_TYPES:
-                var_max_len = max(len(var_item.default_value.split(",")) for var_item in sku_var_info.values())
+                var_max_len = max(len(var_item.default_value.split(","))
+                                  for var_item in sku_var_info.values())
                 if len(default_sku_default.default_value.split(",")) < var_max_len:
-                    tail = ",".join("0x00" for i in range(var_max_len-len(default_sku_default.default_value.split(","))))
+                    tail = ",".join("0x00" for i in range(
+                        var_max_len-len(default_sku_default.default_value.split(","))))
 
-            default_data_buffer = VariableMgr.PACK_VARIABLES_DATA(default_sku_default.default_value, default_sku_default.data_type, tail)
+            default_data_buffer = VariableMgr.PACK_VARIABLES_DATA(
+                default_sku_default.default_value, default_sku_default.data_type, tail)
 
             default_data_array = ()
             for item in range(len(default_data_buffer)):
-                default_data_array += unpack("B", default_data_buffer[item:item + 1])
+                default_data_array += unpack("B",
+                                             default_data_buffer[item:item + 1])
 
-            var_data[(DataType.TAB_DEFAULT, DataType.TAB_DEFAULT_STORES_DEFAULT)][index] = (default_data_buffer, sku_var_info[(DataType.TAB_DEFAULT, DataType.TAB_DEFAULT_STORES_DEFAULT)])
+            var_data[(DataType.TAB_DEFAULT, DataType.TAB_DEFAULT_STORES_DEFAULT)][index] = (
+                default_data_buffer, sku_var_info[(DataType.TAB_DEFAULT, DataType.TAB_DEFAULT_STORES_DEFAULT)])
 
             for (skuid, defaultstoragename) in indexedvarinfo[index]:
                 tail = None
                 if (skuid, defaultstoragename) == (DataType.TAB_DEFAULT, DataType.TAB_DEFAULT_STORES_DEFAULT):
                     continue
-                other_sku_other = indexedvarinfo[index][(skuid, defaultstoragename)]
+                other_sku_other = indexedvarinfo[index][(
+                    skuid, defaultstoragename)]
 
                 if default_sku_default.data_type not in DataType.TAB_PCD_NUMERIC_TYPES:
                     if len(other_sku_other.default_value.split(",")) < var_max_len:
-                        tail = ",".join("0x00" for i in range(var_max_len-len(other_sku_other.default_value.split(","))))
+                        tail = ",".join("0x00" for i in range(
+                            var_max_len-len(other_sku_other.default_value.split(","))))
 
-                others_data_buffer = VariableMgr.PACK_VARIABLES_DATA(other_sku_other.default_value, other_sku_other.data_type, tail)
+                others_data_buffer = VariableMgr.PACK_VARIABLES_DATA(
+                    other_sku_other.default_value, other_sku_other.data_type, tail)
 
                 others_data_array = ()
                 for item in range(len(others_data_buffer)):
-                    others_data_array += unpack("B", others_data_buffer[item:item + 1])
+                    others_data_array += unpack("B",
+                                                others_data_buffer[item:item + 1])
 
-                data_delta = VariableMgr.calculate_delta(default_data_array, others_data_array)
+                data_delta = VariableMgr.calculate_delta(
+                    default_data_array, others_data_array)
 
-                var_data[(skuid, defaultstoragename)][index] = (data_delta, sku_var_info[(skuid, defaultstoragename)])
+                var_data[(skuid, defaultstoragename)][index] = (
+                    data_delta, sku_var_info[(skuid, defaultstoragename)])
         return var_data
 
     def new_process_varinfo(self):
@@ -155,41 +184,51 @@ class VariableMgr(object):
         if not var_data:
             return []
 
-        pcds_default_data = var_data.get((DataType.TAB_DEFAULT, DataType.TAB_DEFAULT_STORES_DEFAULT), {})
+        pcds_default_data = var_data.get(
+            (DataType.TAB_DEFAULT, DataType.TAB_DEFAULT_STORES_DEFAULT), {})
         NvStoreDataBuffer = bytearray()
         var_data_offset = collections.OrderedDict()
         offset = NvStorageHeaderSize
         for default_data, default_info in pcds_default_data.values():
-            var_name_buffer = VariableMgr.PACK_VARIABLE_NAME(default_info.var_name)
+            var_name_buffer = VariableMgr.PACK_VARIABLE_NAME(
+                default_info.var_name)
 
             vendorguid = default_info.var_guid.split('-')
 
             if default_info.var_attribute:
-                var_attr_value, _ = VariableAttributes.GetVarAttributes(default_info.var_attribute)
+                var_attr_value, _ = VariableAttributes.GetVarAttributes(
+                    default_info.var_attribute)
             else:
                 var_attr_value = 0x07
 
             DataBuffer = VariableMgr.AlignData(var_name_buffer + default_data)
 
             data_size = len(DataBuffer)
-            if GlobalData.gCommandLineDefines.get(TAB_DSC_DEFINES_VPD_AUTHENTICATED_VARIABLE_STORE,"FALSE").upper() == "TRUE":
-                offset += AuthenticatedVariableHeaderSize + len(default_info.var_name.split(","))
+            if GlobalData.gCommandLineDefines.get(TAB_DSC_DEFINES_VPD_AUTHENTICATED_VARIABLE_STORE, "FALSE").upper() == "TRUE":
+                offset += AuthenticatedVariableHeaderSize + \
+                    len(default_info.var_name.split(","))
             else:
-                offset += VariableHeaderSize + len(default_info.var_name.split(","))
+                offset += VariableHeaderSize + \
+                    len(default_info.var_name.split(","))
             var_data_offset[default_info.pcdindex] = offset
             offset += data_size - len(default_info.var_name.split(","))
-            if GlobalData.gCommandLineDefines.get(TAB_DSC_DEFINES_VPD_AUTHENTICATED_VARIABLE_STORE,"FALSE").upper() == "TRUE":
-                var_header_buffer = VariableMgr.PACK_AUTHENTICATED_VARIABLE_HEADER(var_attr_value, len(default_info.var_name.split(",")), len (default_data), vendorguid)
+            if GlobalData.gCommandLineDefines.get(TAB_DSC_DEFINES_VPD_AUTHENTICATED_VARIABLE_STORE, "FALSE").upper() == "TRUE":
+                var_header_buffer = VariableMgr.PACK_AUTHENTICATED_VARIABLE_HEADER(
+                    var_attr_value, len(default_info.var_name.split(",")), len(default_data), vendorguid)
             else:
-                var_header_buffer = VariableMgr.PACK_VARIABLE_HEADER(var_attr_value, len(default_info.var_name.split(",")), len (default_data), vendorguid)
+                var_header_buffer = VariableMgr.PACK_VARIABLE_HEADER(var_attr_value, len(
+                    default_info.var_name.split(",")), len(default_data), vendorguid)
             NvStoreDataBuffer += (var_header_buffer + DataBuffer)
 
-        if GlobalData.gCommandLineDefines.get(TAB_DSC_DEFINES_VPD_AUTHENTICATED_VARIABLE_STORE,"FALSE").upper() == "TRUE":
-            variable_storage_header_buffer = VariableMgr.PACK_AUTHENTICATED_VARIABLE_STORE_HEADER(len(NvStoreDataBuffer) + 28)
+        if GlobalData.gCommandLineDefines.get(TAB_DSC_DEFINES_VPD_AUTHENTICATED_VARIABLE_STORE, "FALSE").upper() == "TRUE":
+            variable_storage_header_buffer = VariableMgr.PACK_AUTHENTICATED_VARIABLE_STORE_HEADER(
+                len(NvStoreDataBuffer) + 28)
         else:
-            variable_storage_header_buffer = VariableMgr.PACK_VARIABLE_STORE_HEADER(len(NvStoreDataBuffer) + 28)
+            variable_storage_header_buffer = VariableMgr.PACK_VARIABLE_STORE_HEADER(
+                len(NvStoreDataBuffer) + 28)
 
-        nv_default_part = VariableMgr.AlignData(VariableMgr.PACK_DEFAULT_DATA(0, 0, VariableMgr.unpack_data(variable_storage_header_buffer+NvStoreDataBuffer)), 8)
+        nv_default_part = VariableMgr.AlignData(VariableMgr.PACK_DEFAULT_DATA(
+            0, 0, VariableMgr.unpack_data(variable_storage_header_buffer+NvStoreDataBuffer)), 8)
 
         data_delta_structure_buffer = bytearray()
         for skuname, defaultstore in var_data:
@@ -200,24 +239,26 @@ class VariableMgr(object):
             for pcdindex in pcds_sku_data:
                 offset = var_data_offset[pcdindex]
                 delta_data, _ = pcds_sku_data[pcdindex]
-                delta_data = [(item[0] + offset, item[1]) for item in delta_data]
+                delta_data = [(item[0] + offset, item[1])
+                              for item in delta_data]
                 delta_data_set.extend(delta_data)
 
-            data_delta_structure_buffer += VariableMgr.AlignData(self.PACK_DELTA_DATA(skuname, defaultstore, delta_data_set), 8)
+            data_delta_structure_buffer += VariableMgr.AlignData(
+                self.PACK_DELTA_DATA(skuname, defaultstore, delta_data_set), 8)
 
         size = len(nv_default_part + data_delta_structure_buffer) + 16
         maxsize = self.VpdRegionSize if self.VpdRegionSize else size
-        NV_Store_Default_Header = VariableMgr.PACK_NV_STORE_DEFAULT_HEADER(size, maxsize)
+        NV_Store_Default_Header = VariableMgr.PACK_NV_STORE_DEFAULT_HEADER(
+            size, maxsize)
 
-        self.NVHeaderBuff =  NV_Store_Default_Header
-        self.VarDefaultBuff =nv_default_part
-        self.VarDeltaBuff =  data_delta_structure_buffer
+        self.NVHeaderBuff = NV_Store_Default_Header
+        self.VarDefaultBuff = nv_default_part
+        self.VarDeltaBuff = data_delta_structure_buffer
         return VariableMgr.format_data(NV_Store_Default_Header + nv_default_part + data_delta_structure_buffer)
 
-
     @staticmethod
     def format_data(data):
-        return  [hex(item) for item in VariableMgr.unpack_data(data)]
+        return [hex(item) for item in VariableMgr.unpack_data(data)]
 
     @staticmethod
     def unpack_data(data):
@@ -229,7 +270,8 @@ class VariableMgr(object):
     @staticmethod
     def calculate_delta(default, theother):
         if len(default) - len(theother) != 0:
-            EdkLogger.error("build", FORMAT_INVALID, 'The variable data length is not the same for the same PCD.')
+            EdkLogger.error("build", FORMAT_INVALID,
+                            'The variable data length is not the same for the same PCD.')
         data_delta = []
         for i in range(len(default)):
             if default[i] != theother[i]:
@@ -241,7 +283,8 @@ class VariableMgr(object):
         default_var_bin = self.new_process_varinfo()
         if default_var_bin:
             value_str = "{"
-            default_var_bin_strip = [ data.strip("""'""") for data in default_var_bin]
+            default_var_bin_strip = [data.strip(
+                """'""") for data in default_var_bin]
             value_str += ",".join(default_var_bin_strip)
             value_str += "}"
             return value_str
@@ -291,7 +334,7 @@ class VariableMgr(object):
     @staticmethod
     def PACK_VARIABLE_HEADER(attribute, namesize, datasize, vendorguid):
 
-        Buffer = pack('=H', 0x55AA) # pack StartID
+        Buffer = pack('=H', 0x55AA)  # pack StartID
         Buffer += pack('=B', 0x3F)  # pack State
         Buffer += pack('=B', 0)     # pack reserved
 
@@ -313,18 +356,18 @@ class VariableMgr(object):
         Buffer += pack('=L', attribute)
 
         Buffer += pack('=Q', 0)        # pack MonotonicCount
-        Buffer += pack('=HBBBBBBLhBB', # pack TimeStamp
-                         0,            # UINT16 Year
-                         0,            # UINT8  Month
-                         0,            # UINT8  Day
-                         0,            # UINT8  Hour
-                         0,            # UINT8  Minute
-                         0,            # UINT8  Second
-                         0,            # UINT8  Pad1
-                         0,            # UINT32 Nanosecond
-                         0,            # INT16  TimeZone
-                         0,            # UINT8  Daylight
-                         0)            # UINT8  Pad2
+        Buffer += pack('=HBBBBBBLhBB',  # pack TimeStamp
+                       0,            # UINT16 Year
+                       0,            # UINT8  Month
+                       0,            # UINT8  Day
+                       0,            # UINT8  Hour
+                       0,            # UINT8  Minute
+                       0,            # UINT8  Second
+                       0,            # UINT8  Pad1
+                       0,            # UINT32 Nanosecond
+                       0,            # INT16  TimeZone
+                       0,            # UINT8  Daylight
+                       0)            # UINT8  Pad2
         Buffer += pack('=L', 0)        # pack PubKeyIndex
 
         Buffer += pack('=L', namesize)
@@ -335,7 +378,7 @@ class VariableMgr(object):
         return Buffer
 
     @staticmethod
-    def PACK_VARIABLES_DATA(var_value,data_type, tail = None):
+    def PACK_VARIABLES_DATA(var_value, data_type, tail=None):
         Buffer = bytearray()
         data_len = 0
         if data_type == DataType.TAB_VOID:
@@ -347,9 +390,10 @@ class VariableMgr(object):
                     Buffer += pack("=B", int(value_char, 16))
                 data_len += len(tail.split(","))
         elif data_type == "BOOLEAN":
-            Buffer += pack("=B", True) if var_value.upper() in ["TRUE","1"] else pack("=B", False)
+            Buffer += pack("=B", True) if var_value.upper() in [
+                "TRUE", "1"] else pack("=B", False)
             data_len += 1
-        elif data_type  == DataType.TAB_UINT8:
+        elif data_type == DataType.TAB_UINT8:
             Buffer += pack("=B", GetIntegerValue(var_value))
             data_len += 1
         elif data_type == DataType.TAB_UINT16:
@@ -404,7 +448,7 @@ class VariableMgr(object):
         return Buffer
 
     @staticmethod
-    def AlignData(data, align = 4):
+    def AlignData(data, align=4):
         mybuffer = data
         if (len(data) % align) > 0:
             for i in range(align - (len(data) % align)):
diff --git a/BaseTools/Source/Python/AutoGen/IdfClassObject.py b/BaseTools/Source/Python/AutoGen/IdfClassObject.py
index a6b8123c2539..1526a9f01de4 100644
--- a/BaseTools/Source/Python/AutoGen/IdfClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/IdfClassObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to collect all defined strings in Image Definition files
 #
 # Copyright (c) 2016, Intel Corporation. All rights reserved.<BR>
@@ -18,53 +18,56 @@ import os
 from Common.GlobalData import gIdentifierPattern
 from .UniClassObject import StripComments
 
-IMAGE_TOKEN = re.compile('IMAGE_TOKEN *\(([A-Z0-9_]+) *\)', re.MULTILINE | re.UNICODE)
+IMAGE_TOKEN = re.compile(
+    'IMAGE_TOKEN *\(([A-Z0-9_]+) *\)', re.MULTILINE | re.UNICODE)
 
 #
 # Value of different image information block types
 #
-EFI_HII_IIBT_END               = 0x00
-EFI_HII_IIBT_IMAGE_1BIT        = 0x10
-EFI_HII_IIBT_IMAGE_1BIT_TRANS  = 0x11
-EFI_HII_IIBT_IMAGE_4BIT        = 0x12
-EFI_HII_IIBT_IMAGE_4BIT_TRANS  = 0x13
-EFI_HII_IIBT_IMAGE_8BIT        = 0x14
-EFI_HII_IIBT_IMAGE_8BIT_TRANS  = 0x15
-EFI_HII_IIBT_IMAGE_24BIT       = 0x16
+EFI_HII_IIBT_END = 0x00
+EFI_HII_IIBT_IMAGE_1BIT = 0x10
+EFI_HII_IIBT_IMAGE_1BIT_TRANS = 0x11
+EFI_HII_IIBT_IMAGE_4BIT = 0x12
+EFI_HII_IIBT_IMAGE_4BIT_TRANS = 0x13
+EFI_HII_IIBT_IMAGE_8BIT = 0x14
+EFI_HII_IIBT_IMAGE_8BIT_TRANS = 0x15
+EFI_HII_IIBT_IMAGE_24BIT = 0x16
 EFI_HII_IIBT_IMAGE_24BIT_TRANS = 0x17
-EFI_HII_IIBT_IMAGE_JPEG        = 0x18
-EFI_HII_IIBT_IMAGE_PNG         = 0x19
-EFI_HII_IIBT_DUPLICATE         = 0x20
-EFI_HII_IIBT_SKIP2             = 0x21
-EFI_HII_IIBT_SKIP1             = 0x22
-EFI_HII_IIBT_EXT1              = 0x30
-EFI_HII_IIBT_EXT2              = 0x31
-EFI_HII_IIBT_EXT4              = 0x32
+EFI_HII_IIBT_IMAGE_JPEG = 0x18
+EFI_HII_IIBT_IMAGE_PNG = 0x19
+EFI_HII_IIBT_DUPLICATE = 0x20
+EFI_HII_IIBT_SKIP2 = 0x21
+EFI_HII_IIBT_SKIP1 = 0x22
+EFI_HII_IIBT_EXT1 = 0x30
+EFI_HII_IIBT_EXT2 = 0x31
+EFI_HII_IIBT_EXT4 = 0x32
 
 #
 # Value of HII package type
 #
-EFI_HII_PACKAGE_TYPE_ALL           = 0x00
-EFI_HII_PACKAGE_TYPE_GUID          = 0x01
-EFI_HII_PACKAGE_FORMS              = 0x02
-EFI_HII_PACKAGE_STRINGS            = 0x04
-EFI_HII_PACKAGE_FONTS              = 0x05
-EFI_HII_PACKAGE_IMAGES             = 0x06
-EFI_HII_PACKAGE_SIMPLE_FONTS       = 0x07
-EFI_HII_PACKAGE_DEVICE_PATH        = 0x08
-EFI_HII_PACKAGE_KEYBOARD_LAYOUT    = 0x09
-EFI_HII_PACKAGE_ANIMATIONS         = 0x0A
-EFI_HII_PACKAGE_END                = 0xDF
-EFI_HII_PACKAGE_TYPE_SYSTEM_BEGIN  = 0xE0
-EFI_HII_PACKAGE_TYPE_SYSTEM_END    = 0xFF
+EFI_HII_PACKAGE_TYPE_ALL = 0x00
+EFI_HII_PACKAGE_TYPE_GUID = 0x01
+EFI_HII_PACKAGE_FORMS = 0x02
+EFI_HII_PACKAGE_STRINGS = 0x04
+EFI_HII_PACKAGE_FONTS = 0x05
+EFI_HII_PACKAGE_IMAGES = 0x06
+EFI_HII_PACKAGE_SIMPLE_FONTS = 0x07
+EFI_HII_PACKAGE_DEVICE_PATH = 0x08
+EFI_HII_PACKAGE_KEYBOARD_LAYOUT = 0x09
+EFI_HII_PACKAGE_ANIMATIONS = 0x0A
+EFI_HII_PACKAGE_END = 0xDF
+EFI_HII_PACKAGE_TYPE_SYSTEM_BEGIN = 0xE0
+EFI_HII_PACKAGE_TYPE_SYSTEM_END = 0xFF
+
 
 class IdfFileClassObject(object):
-    def __init__(self, FileList = []):
+    def __init__(self, FileList=[]):
         self.ImageFilesDict = {}
         self.ImageIDList = []
         for File in FileList:
             if File is None:
-                EdkLogger.error("Image Definition File Parser", PARSER_ERROR, 'No Image definition file is given.')
+                EdkLogger.error("Image Definition File Parser",
+                                PARSER_ERROR, 'No Image definition file is given.')
 
             try:
                 IdfFile = open(LongFilePath(File.Path), mode='r')
@@ -82,30 +85,38 @@ class IdfFileClassObject(object):
 
                 LineNo = GetLineNo(FileIn, Line, False)
                 if not Line.startswith('#image '):
-                    EdkLogger.error("Image Definition File Parser", PARSER_ERROR, 'The %s in Line %s of File %s is invalid.' % (Line, LineNo, File.Path))
+                    EdkLogger.error("Image Definition File Parser", PARSER_ERROR,
+                                    'The %s in Line %s of File %s is invalid.' % (Line, LineNo, File.Path))
 
                 if Line.find('#image ') >= 0:
                     LineDetails = Line.split()
                     Len = len(LineDetails)
                     if Len != 3 and Len != 4:
-                        EdkLogger.error("Image Definition File Parser", PARSER_ERROR, 'The format is not match #image IMAGE_ID [TRANSPARENT] ImageFileName in Line %s of File %s.' % (LineNo, File.Path))
+                        EdkLogger.error("Image Definition File Parser", PARSER_ERROR,
+                                        'The format is not match #image IMAGE_ID [TRANSPARENT] ImageFileName in Line %s of File %s.' % (LineNo, File.Path))
                     if Len == 4 and LineDetails[2] != 'TRANSPARENT':
-                        EdkLogger.error("Image Definition File Parser", PARSER_ERROR, 'Please use the keyword "TRANSPARENT" to describe the transparency setting in Line %s of File %s.' % (LineNo, File.Path))
+                        EdkLogger.error("Image Definition File Parser", PARSER_ERROR,
+                                        'Please use the keyword "TRANSPARENT" to describe the transparency setting in Line %s of File %s.' % (LineNo, File.Path))
                     MatchString = gIdentifierPattern.match(LineDetails[1])
                     if MatchString is None:
-                        EdkLogger.error('Image Definition  File Parser', FORMAT_INVALID, 'The Image token name %s defined in Idf file %s contains the invalid character.' % (LineDetails[1], File.Path))
+                        EdkLogger.error('Image Definition  File Parser', FORMAT_INVALID,
+                                        'The Image token name %s defined in Idf file %s contains the invalid character.' % (LineDetails[1], File.Path))
                     if LineDetails[1] not in self.ImageIDList:
                         self.ImageIDList.append(LineDetails[1])
                     else:
-                        EdkLogger.error("Image Definition File Parser", PARSER_ERROR, 'The %s in Line %s of File %s is already defined.' % (LineDetails[1], LineNo, File.Path))
+                        EdkLogger.error("Image Definition File Parser", PARSER_ERROR, 'The %s in Line %s of File %s is already defined.' % (
+                            LineDetails[1], LineNo, File.Path))
                     if Len == 4:
-                        ImageFile = ImageFileObject(LineDetails[Len-1], LineDetails[1], True)
+                        ImageFile = ImageFileObject(
+                            LineDetails[Len-1], LineDetails[1], True)
                     else:
-                        ImageFile = ImageFileObject(LineDetails[Len-1], LineDetails[1], False)
+                        ImageFile = ImageFileObject(
+                            LineDetails[Len-1], LineDetails[1], False)
                     ImageFileList.append(ImageFile)
             if ImageFileList:
                 self.ImageFilesDict[File] = ImageFileList
 
+
 def SearchImageID(ImageFileObject, FileList):
     if FileList == []:
         return ImageFileObject
@@ -116,11 +127,13 @@ def SearchImageID(ImageFileObject, FileList):
             for Line in Lines:
                 ImageIdList = IMAGE_TOKEN.findall(Line)
                 for ID in ImageIdList:
-                    EdkLogger.debug(EdkLogger.DEBUG_5, "Found ImageID identifier: " + ID)
+                    EdkLogger.debug(EdkLogger.DEBUG_5,
+                                    "Found ImageID identifier: " + ID)
                     ImageFileObject.SetImageIDReferenced(ID)
 
+
 class ImageFileObject(object):
-    def __init__(self, FileName, ImageID, TransParent = False):
+    def __init__(self, FileName, ImageID, TransParent=False):
         self.FileName = FileName
         self.File = ''
         self.ImageID = ImageID
diff --git a/BaseTools/Source/Python/AutoGen/IncludesAutoGen.py b/BaseTools/Source/Python/AutoGen/IncludesAutoGen.py
index 5ec26eb98b42..7d3dbae33512 100644
--- a/BaseTools/Source/Python/AutoGen/IncludesAutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/IncludesAutoGen.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Build cache intermediate result and state
 #
 # Copyright (c) 2019 - 2020, Intel Corporation. All rights reserved.<BR>
@@ -16,6 +16,7 @@ gIsFileMap = {}
 
 DEP_FILE_TAIL = "# Updated \n"
 
+
 class IncludesAutoGen():
     """ This class is to manage the dependent files witch are used in Makefile to support incremental build.
         1. C files:
@@ -34,6 +35,7 @@ class IncludesAutoGen():
             2. ASM PP use c preprocessor to find out all included files with #include format and generate a deps file
             3. build tool updates the .deps file
     """
+
     def __init__(self, makefile_folder, ModuleAuto):
         self.d_folder = makefile_folder
         self.makefile_folder = makefile_folder
@@ -42,14 +44,16 @@ class IncludesAutoGen():
         self.workspace = ModuleAuto.WorkspaceDir
 
     def CreateModuleDeps(self):
-        SaveFileOnChange(os.path.join(self.makefile_folder,"deps.txt"),"\n".join(self.DepsCollection),False)
+        SaveFileOnChange(os.path.join(self.makefile_folder,
+                         "deps.txt"), "\n".join(self.DepsCollection), False)
 
     def CreateDepsInclude(self):
-        deps_file = {'deps_file':self.deps_files}
+        deps_file = {'deps_file': self.deps_files}
 
         MakePath = self.module_autogen.BuildOption.get('MAKE', {}).get('PATH')
         if not MakePath:
-            EdkLogger.error("build", PARAMETER_MISSING, Message="No Make path available.")
+            EdkLogger.error("build", PARAMETER_MISSING,
+                            Message="No Make path available.")
         elif "nmake" in MakePath:
             _INCLUDE_DEPS_TEMPLATE = TemplateString('''
 ${BEGIN}
@@ -69,10 +73,12 @@ ${END}
             deps_include_str = _INCLUDE_DEPS_TEMPLATE.Replace(deps_file)
         except Exception as e:
             print(e)
-        SaveFileOnChange(os.path.join(self.makefile_folder,"dependency"),deps_include_str,False)
+        SaveFileOnChange(os.path.join(self.makefile_folder,
+                         "dependency"), deps_include_str, False)
 
     def CreateDepsTarget(self):
-        SaveFileOnChange(os.path.join(self.makefile_folder,"deps_target"),"\n".join([item +":" for item in self.DepsCollection]),False)
+        SaveFileOnChange(os.path.join(self.makefile_folder, "deps_target"), "\n".join(
+            [item + ":" for item in self.DepsCollection]), False)
 
     @cached_property
     def deps_files(self):
@@ -93,14 +99,14 @@ ${END}
         targetname = [item[0].Name for item in self.TargetFileList.values()]
         for abspath in self.deps_files:
             try:
-                with open(abspath,"r") as fd:
+                with open(abspath, "r") as fd:
                     lines = fd.readlines()
 
                 firstlineitems = lines[0].split(": ")
                 dependency_file = firstlineitems[1].strip(" \\\n")
                 dependency_file = dependency_file.strip('''"''')
                 if dependency_file:
-                    if os.path.normpath(dependency_file +".deps") == abspath:
+                    if os.path.normpath(dependency_file + ".deps") == abspath:
                         continue
                     filename = os.path.basename(dependency_file).strip()
                     if filename not in targetname:
@@ -113,14 +119,15 @@ ${END}
                     dependency_file = dependency_file.strip('''"''')
                     if dependency_file == '':
                         continue
-                    if os.path.normpath(dependency_file +".deps") == abspath:
+                    if os.path.normpath(dependency_file + ".deps") == abspath:
                         continue
                     filename = os.path.basename(dependency_file).strip()
                     if filename in targetname:
                         continue
                     includes.add(dependency_file.strip())
             except Exception as e:
-                EdkLogger.error("build",FILE_NOT_FOUND, "%s doesn't exist" % abspath, ExtraData=str(e), RaiseError=False)
+                EdkLogger.error("build", FILE_NOT_FOUND, "%s doesn't exist" %
+                                abspath, ExtraData=str(e), RaiseError=False)
                 continue
         rt = sorted(list(set([item.strip(' " \\\n') for item in includes])))
         return rt
@@ -128,49 +135,60 @@ ${END}
     @cached_property
     def SourceFileList(self):
         """ Get a map of module's source files name to module's source files path """
-        source = {os.path.basename(item.File):item.Path for item in self.module_autogen.SourceFileList}
+        source = {os.path.basename(
+            item.File): item.Path for item in self.module_autogen.SourceFileList}
         middle_file = {}
         for afile in source:
             if afile.upper().endswith(".VFR"):
-                middle_file.update({afile.split(".")[0]+".c":os.path.join(self.module_autogen.DebugDir,afile.split(".")[0]+".c")})
-            if afile.upper().endswith((".S","ASM")):
-                middle_file.update({afile.split(".")[0]+".i":os.path.join(self.module_autogen.OutputDir,afile.split(".")[0]+".i")})
+                middle_file.update({afile.split(".")[
+                                   0]+".c": os.path.join(self.module_autogen.DebugDir, afile.split(".")[0]+".c")})
+            if afile.upper().endswith((".S", "ASM")):
+                middle_file.update({afile.split(".")[
+                                   0]+".i": os.path.join(self.module_autogen.OutputDir, afile.split(".")[0]+".i")})
             if afile.upper().endswith(".ASL"):
-                middle_file.update({afile.split(".")[0]+".i":os.path.join(self.module_autogen.OutputDir,afile.split(".")[0]+".i")})
-        source.update({"AutoGen.c":os.path.join(self.module_autogen.OutputDir,"AutoGen.c")})
+                middle_file.update({afile.split(".")[
+                                   0]+".i": os.path.join(self.module_autogen.OutputDir, afile.split(".")[0]+".i")})
+        source.update({"AutoGen.c": os.path.join(
+            self.module_autogen.OutputDir, "AutoGen.c")})
         source.update(middle_file)
         return source
 
     @cached_property
     def HasNamesakeSourceFile(self):
-        source_base_name = set([os.path.basename(item.File) for item in self.module_autogen.SourceFileList])
+        source_base_name = set([os.path.basename(item.File)
+                               for item in self.module_autogen.SourceFileList])
         rt = len(source_base_name) != len(self.module_autogen.SourceFileList)
         return rt
+
     @cached_property
     def CcPPCommandPathSet(self):
         rt = set()
-        rt.add(self.module_autogen.BuildOption.get('CC',{}).get('PATH'))
-        rt.add(self.module_autogen.BuildOption.get('ASLCC',{}).get('PATH'))
-        rt.add(self.module_autogen.BuildOption.get('ASLPP',{}).get('PATH'))
-        rt.add(self.module_autogen.BuildOption.get('VFRPP',{}).get('PATH'))
-        rt.add(self.module_autogen.BuildOption.get('PP',{}).get('PATH'))
-        rt.add(self.module_autogen.BuildOption.get('APP',{}).get('PATH'))
+        rt.add(self.module_autogen.BuildOption.get('CC', {}).get('PATH'))
+        rt.add(self.module_autogen.BuildOption.get('ASLCC', {}).get('PATH'))
+        rt.add(self.module_autogen.BuildOption.get('ASLPP', {}).get('PATH'))
+        rt.add(self.module_autogen.BuildOption.get('VFRPP', {}).get('PATH'))
+        rt.add(self.module_autogen.BuildOption.get('PP', {}).get('PATH'))
+        rt.add(self.module_autogen.BuildOption.get('APP', {}).get('PATH'))
         rt.discard(None)
         return rt
+
     @cached_property
     def TargetFileList(self):
         """ Get a map of module's target name to a tuple of module's targets path and whose input file path """
         targets = {}
-        targets["AutoGen.obj"] = (PathClass(os.path.join(self.module_autogen.OutputDir,"AutoGen.obj")),PathClass(os.path.join(self.module_autogen.DebugDir,"AutoGen.c")))
+        targets["AutoGen.obj"] = (PathClass(os.path.join(self.module_autogen.OutputDir, "AutoGen.obj")), PathClass(
+            os.path.join(self.module_autogen.DebugDir, "AutoGen.c")))
         for item in self.module_autogen.Targets.values():
             for block in item:
-                targets[block.Target.Path] = (block.Target,block.Inputs[0])
+                targets[block.Target.Path] = (block.Target, block.Inputs[0])
         return targets
 
-    def GetRealTarget(self,source_file_abs):
+    def GetRealTarget(self, source_file_abs):
         """ Get the final target file based on source file abspath """
-        source_target_map = {item[1].Path:item[0].Path for item in self.TargetFileList.values()}
-        source_name_map = {item[1].File:item[0].Path for item in self.TargetFileList.values()}
+        source_target_map = {
+            item[1].Path: item[0].Path for item in self.TargetFileList.values()}
+        source_name_map = {
+            item[1].File: item[0].Path for item in self.TargetFileList.values()}
         target_abs = source_target_map.get(source_file_abs)
         if target_abs is None:
             if source_file_abs.strip().endswith(".i"):
@@ -205,28 +223,33 @@ ${END}
                             if not item.startswith("/"):
                                 if item.endswith(".txt") and item.startswith("@"):
                                     with open(item[1:], "r") as file:
-                                        source_files = file.readlines()[0].split()
+                                        source_files = file.readlines()[
+                                            0].split()
                                         SourceFileAbsPathMap = {os.path.basename(file): file for file in source_files if
                                                                 os.path.exists(file)}
                                 else:
                                     if os.path.exists(item):
-                                        SourceFileAbsPathMap.update({os.path.basename(item): item.strip()})
+                                        SourceFileAbsPathMap.update(
+                                            {os.path.basename(item): item.strip()})
                         # SourceFileAbsPathMap = {os.path.basename(item):item for item in cc_options if not item.startswith("/") and os.path.exists(item)}
             if line in SourceFileAbsPathMap:
                 current_source = line
                 if current_source not in ModuleDepDict:
                     ModuleDepDict[SourceFileAbsPathMap[current_source]] = []
-            elif "Note: including file:" ==  line.lstrip()[:21]:
+            elif "Note: including file:" == line.lstrip()[:21]:
                 if not current_source:
-                    EdkLogger.error("build",BUILD_ERROR, "Parse /showIncludes output failed. line: %s. \n" % line, RaiseError=False)
+                    EdkLogger.error(
+                        "build", BUILD_ERROR, "Parse /showIncludes output failed. line: %s. \n" % line, RaiseError=False)
                 else:
-                    ModuleDepDict[SourceFileAbsPathMap[current_source]].append(line.lstrip()[22:].strip())
+                    ModuleDepDict[SourceFileAbsPathMap[current_source]].append(line.lstrip()[
+                                                                               22:].strip())
 
         for source_abs in ModuleDepDict:
             if ModuleDepDict[source_abs]:
                 target_abs = self.GetRealTarget(source_abs)
                 dep_file_name = os.path.basename(source_abs) + ".deps"
-                SaveFileOnChange(os.path.join(os.path.dirname(target_abs),dep_file_name)," \\\n".join([target_abs+":"] + ['''"''' + item +'''"''' for item in ModuleDepDict[source_abs]]),False)
+                SaveFileOnChange(os.path.join(os.path.dirname(target_abs), dep_file_name), " \\\n".join(
+                    [target_abs+":"] + ['''"''' + item + '''"''' for item in ModuleDepDict[source_abs]]), False)
 
     def UpdateDepsFileforNonMsvc(self):
         """ Update .deps files.
@@ -239,7 +262,7 @@ ${END}
                 continue
             try:
                 newcontent = []
-                with open(abspath,"r") as fd:
+                with open(abspath, "r") as fd:
                     lines = fd.readlines()
                 if lines[-1] == DEP_FILE_TAIL:
                     continue
@@ -250,7 +273,7 @@ ${END}
                 else:
                     sourceitem = lines[1].strip().split(" ")[0]
 
-                source_abs = self.SourceFileList.get(sourceitem,sourceitem)
+                source_abs = self.SourceFileList.get(sourceitem, sourceitem)
                 firstlineitems[0] = self.GetRealTarget(source_abs)
                 p_target = firstlineitems
                 if not p_target[0].strip().endswith(":"):
@@ -268,10 +291,11 @@ ${END}
 
                 newcontent.append("\n")
                 newcontent.append(DEP_FILE_TAIL)
-                with open(abspath,"w") as fw:
+                with open(abspath, "w") as fw:
                     fw.write("".join(newcontent))
             except Exception as e:
-                EdkLogger.error("build",FILE_NOT_FOUND, "%s doesn't exist" % abspath, ExtraData=str(e), RaiseError=False)
+                EdkLogger.error("build", FILE_NOT_FOUND, "%s doesn't exist" %
+                                abspath, ExtraData=str(e), RaiseError=False)
                 continue
 
     def UpdateDepsFileforTrim(self):
@@ -282,7 +306,7 @@ ${END}
                 continue
             try:
                 newcontent = []
-                with open(abspath,"r") as fd:
+                with open(abspath, "r") as fd:
                     lines = fd.readlines()
                 if lines[-1] == DEP_FILE_TAIL:
                     continue
@@ -291,14 +315,15 @@ ${END}
                 targetitem = self.GetRealTarget(source_abs.strip(" :"))
 
                 targetitem += ": "
-                if len(lines)>=2:
+                if len(lines) >= 2:
                     targetitem += lines[1]
                 newcontent.append(targetitem)
                 newcontent.extend(lines[2:])
                 newcontent.append("\n")
                 newcontent.append(DEP_FILE_TAIL)
-                with open(abspath,"w") as fw:
+                with open(abspath, "w") as fw:
                     fw.write("".join(newcontent))
             except Exception as e:
-                EdkLogger.error("build",FILE_NOT_FOUND, "%s doesn't exist" % abspath, ExtraData=str(e), RaiseError=False)
+                EdkLogger.error("build", FILE_NOT_FOUND, "%s doesn't exist" %
+                                abspath, ExtraData=str(e), RaiseError=False)
                 continue
diff --git a/BaseTools/Source/Python/AutoGen/InfSectionParser.py b/BaseTools/Source/Python/AutoGen/InfSectionParser.py
index a55ddac341b6..0b27fa4e4ea4 100644
--- a/BaseTools/Source/Python/AutoGen/InfSectionParser.py
+++ b/BaseTools/Source/Python/AutoGen/InfSectionParser.py
@@ -1,11 +1,11 @@
-## @file
+# @file
 # Parser a Inf file and Get specify section data.
 #
 # Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
 # SPDX-License-Identifier: BSD-2-Clause-Patent
 #
 
-## Import Modules
+# Import Modules
 #
 
 import Common.EdkLogger as EdkLogger
@@ -31,7 +31,8 @@ class InfSectionParser():
             with open(self._FilePath, "r") as File:
                 FileLinesList = File.readlines()
         except BaseException:
-            EdkLogger.error("build", AUTOGEN_ERROR, 'File %s is opened failed.' % self._FilePath)
+            EdkLogger.error("build", AUTOGEN_ERROR,
+                            'File %s is opened failed.' % self._FilePath)
 
         for Index in range(0, len(FileLinesList)):
             line = str(FileLinesList[Index]).strip()
@@ -48,8 +49,8 @@ class InfSectionParser():
                 UserExtFind = True
                 FindEnd = False
 
-            if (NextLine != '' and NextLine[0] == TAB_SECTION_START and \
-                NextLine[-1] == TAB_SECTION_END) or FileLastLine:
+            if (NextLine != '' and NextLine[0] == TAB_SECTION_START and
+                    NextLine[-1] == TAB_SECTION_END) or FileLastLine:
                 UserExtFind = False
                 FindEnd = True
                 self._FileSectionDataList.append({SectionLine: SectionData[:]})
@@ -66,13 +67,16 @@ class InfSectionParser():
         for SectionDataDict in self._FileSectionDataList:
             for key in SectionDataDict:
                 if key.lower().startswith("[userextensions") and key.lower().find('.tianocore.') > -1:
-                    SectionLine = key.lstrip(TAB_SECTION_START).rstrip(TAB_SECTION_END)
+                    SectionLine = key.lstrip(
+                        TAB_SECTION_START).rstrip(TAB_SECTION_END)
                     SubSectionList = [SectionLine]
                     if str(SectionLine).find(TAB_COMMA_SPLIT) > -1:
-                        SubSectionList = str(SectionLine).split(TAB_COMMA_SPLIT)
+                        SubSectionList = str(
+                            SectionLine).split(TAB_COMMA_SPLIT)
                     for SubSection in SubSectionList:
                         if SubSection.lower().find('.tianocore.') > -1:
-                            UserExtensionTianoCore.append({SubSection: SectionDataDict[key]})
+                            UserExtensionTianoCore.append(
+                                {SubSection: SectionDataDict[key]})
         return UserExtensionTianoCore
 
     # Get depex expression
@@ -85,10 +89,12 @@ class InfSectionParser():
         for SectionDataDict in self._FileSectionDataList:
             for key in SectionDataDict:
                 if key.lower() == "[depex]" or key.lower().startswith("[depex."):
-                    SectionLine = key.lstrip(TAB_SECTION_START).rstrip(TAB_SECTION_END)
+                    SectionLine = key.lstrip(
+                        TAB_SECTION_START).rstrip(TAB_SECTION_END)
                     SubSectionList = [SectionLine]
                     if str(SectionLine).find(TAB_COMMA_SPLIT) > -1:
-                        SubSectionList = str(SectionLine).split(TAB_COMMA_SPLIT)
+                        SubSectionList = str(
+                            SectionLine).split(TAB_COMMA_SPLIT)
                     for SubSection in SubSectionList:
                         SectionList = SubSection.split(TAB_SPLIT)
                         SubKey = ()
@@ -99,21 +105,8 @@ class InfSectionParser():
                         elif len(SectionList) == 3:
                             SubKey = (SectionList[1], SectionList[2])
                         else:
-                            EdkLogger.error("build", AUTOGEN_ERROR, 'Section %s is invalid.' % key)
-                        DepexExpressionList.append({SubKey: SectionDataDict[key]})
+                            EdkLogger.error(
+                                "build", AUTOGEN_ERROR, 'Section %s is invalid.' % key)
+                        DepexExpressionList.append(
+                            {SubKey: SectionDataDict[key]})
         return DepexExpressionList
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
diff --git a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
index d05410b32966..4983a2e124fb 100755
--- a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Create makefile for MS nmake and GNU make
 #
 # Copyright (c) 2019, Intel Corporation. All rights reserved.<BR>
@@ -10,7 +10,7 @@ from Common.LongFilePathSupport import LongFilePath, CopyLongFilePath
 from Common.BuildToolError import *
 from Common.DataType import *
 from Common.Misc import *
-from Common.StringUtils import NormPath,GetSplitList
+from Common.StringUtils import NormPath, GetSplitList
 from collections import defaultdict
 from Workspace.WorkspaceCommon import OrderedListDict
 import os.path as path
@@ -25,20 +25,22 @@ from GenPatchPcdTable.GenPatchPcdTable import parsePcdInfoFromMapFile
 from Workspace.MetaFileCommentParser import UsageList
 from .GenPcdDb import CreatePcdDatabaseCode
 from Common.caching import cached_class_function
-from AutoGen.ModuleAutoGenHelper import PlatformInfo,WorkSpaceInfo
+from AutoGen.ModuleAutoGenHelper import PlatformInfo, WorkSpaceInfo
 import json
 import tempfile
 
-## Mapping Makefile type
-gMakeTypeMap = {TAB_COMPILER_MSFT:"nmake", "GCC":"gmake"}
+# Mapping Makefile type
+gMakeTypeMap = {TAB_COMPILER_MSFT: "nmake", "GCC": "gmake"}
 #
 # Regular expression for finding Include Directories, the difference between MSFT and INTEL/GCC
 # is the former use /I , the Latter used -I to specify include directories
 #
-gBuildOptIncludePatternMsft = re.compile(r"(?:.*?)/I[ \t]*([^ ]*)", re.MULTILINE | re.DOTALL)
-gBuildOptIncludePatternOther = re.compile(r"(?:.*?)-I[ \t]*([^ ]*)", re.MULTILINE | re.DOTALL)
+gBuildOptIncludePatternMsft = re.compile(
+    r"(?:.*?)/I[ \t]*([^ ]*)", re.MULTILINE | re.DOTALL)
+gBuildOptIncludePatternOther = re.compile(
+    r"(?:.*?)-I[ \t]*([^ ]*)", re.MULTILINE | re.DOTALL)
 
-## default file name for AutoGen
+# default file name for AutoGen
 gAutoGenCodeFileName = "AutoGen.c"
 gAutoGenHeaderFileName = "AutoGen.h"
 gAutoGenStringFileName = "%(module_name)sStrDefs.h"
@@ -133,11 +135,15 @@ ${tail_comments}
 # extend lists contained in a dictionary with lists stored in another dictionary
 # if CopyToDict is not derived from DefaultDict(list) then this may raise exception
 #
+
+
 def ExtendCopyDictionaryLists(CopyToDict, CopyFromDict):
     for Key in CopyFromDict:
         CopyToDict[Key].extend(CopyFromDict[Key])
 
 # Create a directory specified by a set of path elements and return the full path
+
+
 def _MakeDir(PathList):
     RetVal = path.join(*PathList)
     CreateDirectory(RetVal)
@@ -146,6 +152,8 @@ def _MakeDir(PathList):
 #
 # Convert string to C format array
 #
+
+
 def _ConvertStringToByteArray(Value):
     Value = Value.strip()
     if not Value:
@@ -182,32 +190,35 @@ def _ConvertStringToByteArray(Value):
     Value = NewValue + '0}'
     return Value
 
-## ModuleAutoGen class
+# ModuleAutoGen class
 #
 # This class encapsules the AutoGen behaviors for the build tools. In addition to
 # the generation of AutoGen.h and AutoGen.c, it will generate *.depex file according
 # to the [depex] section in module's inf file.
 #
+
+
 class ModuleAutoGen(AutoGen):
     # call super().__init__ then call the worker function with different parameter count
     def __init__(self, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
         if not hasattr(self, "_Init"):
-            self._InitWorker(Workspace, MetaFile, Target, Toolchain, Arch, *args)
+            self._InitWorker(Workspace, MetaFile, Target,
+                             Toolchain, Arch, *args)
             self._Init = True
 
-    ## Cache the timestamps of metafiles of every module in a class attribute
+    # Cache the timestamps of metafiles of every module in a class attribute
     #
     TimeDict = {}
 
     def __new__(cls, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
-#         check if this module is employed by active platform
-        if not PlatformInfo(Workspace, args[0], Target, Toolchain, Arch,args[-1]).ValidModule(MetaFile):
-            EdkLogger.verbose("Module [%s] for [%s] is not employed by active platform\n" \
+        #         check if this module is employed by active platform
+        if not PlatformInfo(Workspace, args[0], Target, Toolchain, Arch, args[-1]).ValidModule(MetaFile):
+            EdkLogger.verbose("Module [%s] for [%s] is not employed by active platform\n"
                               % (MetaFile, Arch))
             return None
         return super(ModuleAutoGen, cls).__new__(cls, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs)
 
-    ## Initialize ModuleAutoGen
+    # Initialize ModuleAutoGen
     #
     #   @param      Workspace           EdkIIWorkspaceBuild object
     #   @param      ModuleFile          The path of module file
@@ -216,9 +227,11 @@ class ModuleAutoGen(AutoGen):
     #   @param      Arch                The arch the module supports
     #   @param      PlatformFile        Platform meta-file
     #
-    def _InitWorker(self, Workspace, ModuleFile, Target, Toolchain, Arch, PlatformFile,DataPipe):
-        EdkLogger.debug(EdkLogger.DEBUG_9, "AutoGen module [%s] [%s]" % (ModuleFile, Arch))
-        GlobalData.gProcessingFile = "%s [%s, %s, %s]" % (ModuleFile, Arch, Toolchain, Target)
+    def _InitWorker(self, Workspace, ModuleFile, Target, Toolchain, Arch, PlatformFile, DataPipe):
+        EdkLogger.debug(EdkLogger.DEBUG_9,
+                        "AutoGen module [%s] [%s]" % (ModuleFile, Arch))
+        GlobalData.gProcessingFile = "%s [%s, %s, %s]" % (
+            ModuleFile, Arch, Toolchain, Target)
 
         self.Workspace = Workspace
         self.WorkspaceDir = ""
@@ -241,35 +254,38 @@ class ModuleAutoGen(AutoGen):
 
         self.BuildDatabase = self.Workspace.BuildDatabase
         self.BuildRuleOrder = None
-        self.BuildTime      = 0
+        self.BuildTime = 0
 
         self._GuidComments = OrderedListDict()
         self._ProtocolComments = OrderedListDict()
         self._PpiComments = OrderedListDict()
-        self._BuildTargets            = None
-        self._IntroBuildTargetList    = None
-        self._FinalBuildTargetList    = None
-        self._FileTypes               = None
+        self._BuildTargets = None
+        self._IntroBuildTargetList = None
+        self._FinalBuildTargetList = None
+        self._FileTypes = None
 
         self.AutoGenDepSet = set()
         self.ReferenceModules = []
-        self.ConstPcd                  = {}
-        self.FileDependCache  = {}
+        self.ConstPcd = {}
+        self.FileDependCache = {}
 
     def __init_platform_info__(self):
         pinfo = self.DataPipe.Get("P_Info")
         self.WorkspaceDir = pinfo.get("WorkspaceDir")
-        self.PlatformInfo = PlatformInfo(self.Workspace,pinfo.get("ActivePlatform"),pinfo.get("Target"),pinfo.get("ToolChain"),pinfo.get("Arch"),self.DataPipe)
-    ## hash() operator of ModuleAutoGen
+        self.PlatformInfo = PlatformInfo(self.Workspace, pinfo.get("ActivePlatform"), pinfo.get(
+            "Target"), pinfo.get("ToolChain"), pinfo.get("Arch"), self.DataPipe)
+    # hash() operator of ModuleAutoGen
     #
     #  The module file path and arch string will be used to represent
     #  hash value of this object
     #
     #   @retval   int Hash value of the module file path and arch
     #
+
     @cached_class_function
     def __hash__(self):
-        return hash((self.MetaFile, self.Arch, self.ToolChain,self.BuildTarget))
+        return hash((self.MetaFile, self.Arch, self.ToolChain, self.BuildTarget))
+
     def __repr__(self):
         return "%s [%s]" % (self.MetaFile, self.Arch)
 
@@ -290,7 +306,8 @@ class ModuleAutoGen(AutoGen):
         for Pcd in self.FixedAtBuildPcds:
             if Pcd.DatumType == TAB_VOID:
                 if '.'.join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName)) not in RetVal:
-                    RetVal['.'.join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))] = Pcd.DefaultValue
+                    RetVal['.'.join(
+                        (Pcd.TokenSpaceGuidCName, Pcd.TokenCName))] = Pcd.DefaultValue
         return RetVal
 
     @property
@@ -298,61 +315,61 @@ class ModuleAutoGen(AutoGen):
         ModuleNames = self.DataPipe.Get("M_Name")
         if not ModuleNames:
             return self.Name
-        return ModuleNames.get((self.Name,self.MetaFile),self.Name)
+        return ModuleNames.get((self.Name, self.MetaFile), self.Name)
 
     # Macros could be used in build_rule.txt (also Makefile)
     @cached_property
     def Macros(self):
         return OrderedDict((
-            ("WORKSPACE" ,self.WorkspaceDir),
-            ("MODULE_NAME" ,self.Name),
-            ("MODULE_NAME_GUID" ,self.UniqueBaseName),
-            ("MODULE_GUID" ,self.Guid),
-            ("MODULE_VERSION" ,self.Version),
-            ("MODULE_TYPE" ,self.ModuleType),
-            ("MODULE_FILE" ,str(self.MetaFile)),
-            ("MODULE_FILE_BASE_NAME" ,self.MetaFile.BaseName),
-            ("MODULE_RELATIVE_DIR" ,self.SourceDir),
-            ("MODULE_DIR" ,self.SourceDir),
-            ("BASE_NAME" ,self.Name),
-            ("ARCH" ,self.Arch),
-            ("TOOLCHAIN" ,self.ToolChain),
-            ("TOOLCHAIN_TAG" ,self.ToolChain),
-            ("TOOL_CHAIN_TAG" ,self.ToolChain),
-            ("TARGET" ,self.BuildTarget),
-            ("BUILD_DIR" ,self.PlatformInfo.BuildDir),
-            ("BIN_DIR" ,os.path.join(self.PlatformInfo.BuildDir, self.Arch)),
-            ("LIB_DIR" ,os.path.join(self.PlatformInfo.BuildDir, self.Arch)),
-            ("MODULE_BUILD_DIR" ,self.BuildDir),
-            ("OUTPUT_DIR" ,self.OutputDir),
-            ("DEBUG_DIR" ,self.DebugDir),
-            ("DEST_DIR_OUTPUT" ,self.OutputDir),
-            ("DEST_DIR_DEBUG" ,self.DebugDir),
-            ("PLATFORM_NAME" ,self.PlatformInfo.Name),
-            ("PLATFORM_GUID" ,self.PlatformInfo.Guid),
-            ("PLATFORM_VERSION" ,self.PlatformInfo.Version),
-            ("PLATFORM_RELATIVE_DIR" ,self.PlatformInfo.SourceDir),
-            ("PLATFORM_DIR" ,mws.join(self.WorkspaceDir, self.PlatformInfo.SourceDir)),
-            ("PLATFORM_OUTPUT_DIR" ,self.PlatformInfo.OutputDir),
-            ("FFS_OUTPUT_DIR" ,self.FfsOutputDir)
-            ))
+            ("WORKSPACE", self.WorkspaceDir),
+            ("MODULE_NAME", self.Name),
+            ("MODULE_NAME_GUID", self.UniqueBaseName),
+            ("MODULE_GUID", self.Guid),
+            ("MODULE_VERSION", self.Version),
+            ("MODULE_TYPE", self.ModuleType),
+            ("MODULE_FILE", str(self.MetaFile)),
+            ("MODULE_FILE_BASE_NAME", self.MetaFile.BaseName),
+            ("MODULE_RELATIVE_DIR", self.SourceDir),
+            ("MODULE_DIR", self.SourceDir),
+            ("BASE_NAME", self.Name),
+            ("ARCH", self.Arch),
+            ("TOOLCHAIN", self.ToolChain),
+            ("TOOLCHAIN_TAG", self.ToolChain),
+            ("TOOL_CHAIN_TAG", self.ToolChain),
+            ("TARGET", self.BuildTarget),
+            ("BUILD_DIR", self.PlatformInfo.BuildDir),
+            ("BIN_DIR", os.path.join(self.PlatformInfo.BuildDir, self.Arch)),
+            ("LIB_DIR", os.path.join(self.PlatformInfo.BuildDir, self.Arch)),
+            ("MODULE_BUILD_DIR", self.BuildDir),
+            ("OUTPUT_DIR", self.OutputDir),
+            ("DEBUG_DIR", self.DebugDir),
+            ("DEST_DIR_OUTPUT", self.OutputDir),
+            ("DEST_DIR_DEBUG", self.DebugDir),
+            ("PLATFORM_NAME", self.PlatformInfo.Name),
+            ("PLATFORM_GUID", self.PlatformInfo.Guid),
+            ("PLATFORM_VERSION", self.PlatformInfo.Version),
+            ("PLATFORM_RELATIVE_DIR", self.PlatformInfo.SourceDir),
+            ("PLATFORM_DIR", mws.join(self.WorkspaceDir, self.PlatformInfo.SourceDir)),
+            ("PLATFORM_OUTPUT_DIR", self.PlatformInfo.OutputDir),
+            ("FFS_OUTPUT_DIR", self.FfsOutputDir)
+        ))
 
-    ## Return the module build data object
+    # Return the module build data object
     @cached_property
     def Module(self):
         return self.BuildDatabase[self.MetaFile, self.Arch, self.BuildTarget, self.ToolChain]
 
-    ## Return the module name
+    # Return the module name
     @cached_property
     def Name(self):
         return self.Module.BaseName
 
-    ## Return the module DxsFile if exist
+    # Return the module DxsFile if exist
     @cached_property
     def DxsFile(self):
         return self.Module.DxsFile
 
-    ## Return the module meta-file GUID
+    # Return the module meta-file GUID
     @cached_property
     def Guid(self):
         #
@@ -367,84 +384,85 @@ class ModuleAutoGen(AutoGen):
             return os.path.basename(self.MetaFile.Path)[:36]
         return self.Module.Guid
 
-    ## Return the module version
+    # Return the module version
     @cached_property
     def Version(self):
         return self.Module.Version
 
-    ## Return the module type
+    # Return the module type
     @cached_property
     def ModuleType(self):
         return self.Module.ModuleType
 
-    ## Return the component type (for Edk.x style of module)
+    # Return the component type (for Edk.x style of module)
     @cached_property
     def ComponentType(self):
         return self.Module.ComponentType
 
-    ## Return the build type
+    # Return the build type
     @cached_property
     def BuildType(self):
         return self.Module.BuildType
 
-    ## Return the PCD_IS_DRIVER setting
+    # Return the PCD_IS_DRIVER setting
     @cached_property
     def PcdIsDriver(self):
         return self.Module.PcdIsDriver
 
-    ## Return the autogen version, i.e. module meta-file version
+    # Return the autogen version, i.e. module meta-file version
     @cached_property
     def AutoGenVersion(self):
         return self.Module.AutoGenVersion
 
-    ## Check if the module is library or not
+    # Check if the module is library or not
     @cached_property
     def IsLibrary(self):
         return bool(self.Module.LibraryClass)
 
-    ## Check if the module is binary module or not
+    # Check if the module is binary module or not
     @cached_property
     def IsBinaryModule(self):
         return self.Module.IsBinaryModule
 
-    ## Return the directory to store intermediate files of the module
+    # Return the directory to store intermediate files of the module
     @cached_property
     def BuildDir(self):
         return _MakeDir((
-                                    self.PlatformInfo.BuildDir,
-                                    self.Arch,
-                                    self.SourceDir,
-                                    self.MetaFile.BaseName
-            ))
+            self.PlatformInfo.BuildDir,
+            self.Arch,
+            self.SourceDir,
+            self.MetaFile.BaseName
+        ))
 
-    ## Return the directory to store the intermediate object files of the module
+    # Return the directory to store the intermediate object files of the module
     @cached_property
     def OutputDir(self):
         return _MakeDir((self.BuildDir, "OUTPUT"))
 
-    ## Return the directory path to store ffs file
+    # Return the directory path to store ffs file
     @cached_property
     def FfsOutputDir(self):
         if GlobalData.gFdfParser:
             return path.join(self.PlatformInfo.BuildDir, TAB_FV_DIRECTORY, "Ffs", self.Guid + self.Name)
         return ''
 
-    ## Return the directory to store auto-gened source files of the module
+    # Return the directory to store auto-gened source files of the module
     @cached_property
     def DebugDir(self):
         return _MakeDir((self.BuildDir, "DEBUG"))
 
-    ## Return the path of custom file
+    # Return the path of custom file
     @cached_property
     def CustomMakefile(self):
         RetVal = {}
         for Type in self.Module.CustomMakefile:
             MakeType = gMakeTypeMap[Type] if Type in gMakeTypeMap else 'nmake'
-            File = os.path.join(self.SourceDir, self.Module.CustomMakefile[Type])
+            File = os.path.join(
+                self.SourceDir, self.Module.CustomMakefile[Type])
             RetVal[MakeType] = File
         return RetVal
 
-    ## Return the directory of the makefile
+    # Return the directory of the makefile
     #
     #   @retval     string  The directory string of module's makefile
     #
@@ -452,7 +470,7 @@ class ModuleAutoGen(AutoGen):
     def MakeFileDir(self):
         return self.BuildDir
 
-    ## Return build command string
+    # Return build command string
     #
     #   @retval     string  Build command string
     #
@@ -460,7 +478,7 @@ class ModuleAutoGen(AutoGen):
     def BuildCommand(self):
         return self.PlatformInfo.BuildCommand
 
-    ## Get Module package and Platform package
+    # Get Module package and Platform package
     #
     #   @retval list The list of package object
     #
@@ -469,14 +487,15 @@ class ModuleAutoGen(AutoGen):
         PkagList = []
         if self.Module.Packages:
             PkagList.extend(self.Module.Packages)
-        Platform = self.BuildDatabase[self.PlatformInfo.MetaFile, self.Arch, self.BuildTarget, self.ToolChain]
+        Platform = self.BuildDatabase[self.PlatformInfo.MetaFile,
+                                      self.Arch, self.BuildTarget, self.ToolChain]
         for Package in Platform.Packages:
             if Package in PkagList:
                 continue
             PkagList.append(Package)
         return PkagList
 
-    ## Get object list of all packages the module and its dependent libraries belong to and the Platform depends on
+    # Get object list of all packages the module and its dependent libraries belong to and the Platform depends on
     #
     #   @retval     list    The list of package object
     #
@@ -491,13 +510,13 @@ class ModuleAutoGen(AutoGen):
                 PackageList.append(Package)
         return PackageList
 
-    ## Get the depex string
+    # Get the depex string
     #
     # @return : a string contain all depex expression.
     def _GetDepexExpresionString(self):
         DepexStr = ''
         DepexList = []
-        ## DPX_SOURCE IN Define section.
+        # DPX_SOURCE IN Define section.
         if self.Module.DxsFile:
             return DepexStr
         for M in [self.Module] + self.DependentLibraryList:
@@ -507,7 +526,8 @@ class ModuleAutoGen(AutoGen):
             for DepexExpression in DepexExpressionList:
                 for key in DepexExpression:
                     Arch, ModuleType = key
-                    DepexExpr = [x for x in DepexExpression[key] if not str(x).startswith('#')]
+                    DepexExpr = [x for x in DepexExpression[key]
+                                 if not str(x).startswith('#')]
                     # the type of build module is USER_DEFINED.
                     # All different DEPEX section tags would be copied into the As Built INF file
                     # and there would be separate DEPEX section tags
@@ -516,22 +536,22 @@ class ModuleAutoGen(AutoGen):
                             DepexList.append({(Arch, ModuleType): DepexExpr})
                     else:
                         if Arch.upper() == TAB_ARCH_COMMON or \
-                          (Arch.upper() == self.Arch.upper() and \
-                          ModuleType.upper() in [TAB_ARCH_COMMON, self.ModuleType.upper()]):
+                            (Arch.upper() == self.Arch.upper() and
+                           ModuleType.upper() in [TAB_ARCH_COMMON, self.ModuleType.upper()]):
                             DepexList.append({(Arch, ModuleType): DepexExpr})
 
-        #the type of build module is USER_DEFINED.
+        # The build module type is USER_DEFINED.
         if self.ModuleType.upper() == SUP_MODULE_USER_DEFINED or self.ModuleType.upper() == SUP_MODULE_HOST_APPLICATION:
             for Depex in DepexList:
                 for key in Depex:
                     DepexStr += '[Depex.%s.%s]\n' % key
-                    DepexStr += '\n'.join('# '+ val for val in Depex[key])
+                    DepexStr += '\n'.join('# ' + val for val in Depex[key])
                     DepexStr += '\n\n'
             if not DepexStr:
                 return '[Depex.%s]\n' % self.Arch
             return DepexStr
 
-        #the type of build module not is USER_DEFINED.
+        # The build module type is not USER_DEFINED.
         Count = 0
         for Depex in DepexList:
             Count += 1
@@ -551,7 +571,7 @@ class ModuleAutoGen(AutoGen):
             return '[Depex.%s]\n' % self.Arch
         return '[Depex.%s]\n#  ' % self.Arch + DepexStr
 
-    ## Merge dependency expression
+    # Merge dependency expression
     #
     #   @retval     list    The token list of the dependency expression after parsed
     #
@@ -573,7 +593,7 @@ class ModuleAutoGen(AutoGen):
                 if DepexList != []:
                     DepexList.append('AND')
                 DepexList.append('(')
-                #replace D with value if D is FixedAtBuild PCD
+                # replace D with value if D is FixedAtBuild PCD
                 NewList = []
                 for item in D:
                     if '.' not in item:
@@ -586,7 +606,8 @@ class ModuleAutoGen(AutoGen):
                                                 "{} used in [Depex] section should be used as FixedAtBuild type and VOID* datum type and 16 bytes in the module.".format(item))
                             NewList.append(Value)
                         except:
-                            EdkLogger.error("build", FORMAT_INVALID, "{} used in [Depex] section should be used as FixedAtBuild type and VOID* datum type in the module.".format(item))
+                            EdkLogger.error(
+                                "build", FORMAT_INVALID, "{} used in [Depex] section should be used as FixedAtBuild type and VOID* datum type in the module.".format(item))
 
                 DepexList.extend(NewList)
                 if DepexList[-1] == 'END':  # no need of a END at this time
@@ -594,14 +615,15 @@ class ModuleAutoGen(AutoGen):
                 DepexList.append(')')
                 Inherited = True
             if Inherited:
-                EdkLogger.verbose("DEPEX[%s] (+%s) = %s" % (self.Name, M.Module.BaseName, DepexList))
+                EdkLogger.verbose("DEPEX[%s] (+%s) = %s" %
+                                  (self.Name, M.Module.BaseName, DepexList))
             if 'BEFORE' in DepexList or 'AFTER' in DepexList:
                 break
             if len(DepexList) > 0:
                 EdkLogger.verbose('')
-        return {self.ModuleType:DepexList}
+        return {self.ModuleType: DepexList}
 
-    ## Merge dependency expression
+    # Merge dependency expression
     #
     #   @retval     list    The token list of the dependency expression after parsed
     #
@@ -621,17 +643,19 @@ class ModuleAutoGen(AutoGen):
                     DepexExpressionString += ' AND '
                 DepexExpressionString += '('
                 DepexExpressionString += D
-                DepexExpressionString = DepexExpressionString.rstrip('END').strip()
+                DepexExpressionString = DepexExpressionString.rstrip(
+                    'END').strip()
                 DepexExpressionString += ')'
                 Inherited = True
             if Inherited:
-                EdkLogger.verbose("DEPEX[%s] (+%s) = %s" % (self.Name, M.BaseName, DepexExpressionString))
+                EdkLogger.verbose(
+                    "DEPEX[%s] (+%s) = %s" % (self.Name, M.BaseName, DepexExpressionString))
             if 'BEFORE' in DepexExpressionString or 'AFTER' in DepexExpressionString:
                 break
         if len(DepexExpressionString) > 0:
             EdkLogger.verbose('')
 
-        return {self.ModuleType:DepexExpressionString}
+        return {self.ModuleType: DepexExpressionString}
 
     # Get the tiano core user extension, it is contain dependent library.
     # @retval: a list contain tiano core userextension.
@@ -650,14 +674,15 @@ class ModuleAutoGen(AutoGen):
                         Arch = ItemList[3]
                     if Arch.upper() == TAB_ARCH_COMMON or Arch.upper() == self.Arch.upper():
                         TianoCoreList = []
-                        TianoCoreList.extend([TAB_SECTION_START + Section + TAB_SECTION_END])
+                        TianoCoreList.extend(
+                            [TAB_SECTION_START + Section + TAB_SECTION_END])
                         TianoCoreList.extend(TianoCoreUserExtent[Section][:])
                         TianoCoreList.append('\n')
                         TianoCoreUserExtentionList.append(TianoCoreList)
 
         return TianoCoreUserExtentionList
 
-    ## Return the list of specification version required for the module
+    # Return the list of specification version required for the module
     #
     #   @retval     list    The list of specification defined in module file
     #
@@ -665,19 +690,21 @@ class ModuleAutoGen(AutoGen):
     def Specification(self):
         return self.Module.Specification
 
-    ## Tool option for the module build
+    # Tool option for the module build
     #
     #   @param      PlatformInfo    The object of PlatformBuildInfo
     #   @retval     dict            The dict containing valid options
     #
     @cached_property
     def BuildOption(self):
-        RetVal, self.BuildRuleOrder = self.PlatformInfo.ApplyBuildOption(self.Module)
+        RetVal, self.BuildRuleOrder = self.PlatformInfo.ApplyBuildOption(
+            self.Module)
         if self.BuildRuleOrder:
-            self.BuildRuleOrder = ['.%s' % Ext for Ext in self.BuildRuleOrder.split()]
+            self.BuildRuleOrder = ['.%s' %
+                                   Ext for Ext in self.BuildRuleOrder.split()]
         return RetVal
 
-    ## Get include path list from tool option for the module build
+    # Get include path list from tool option for the module build
     #
     #   @retval     list            The include path list
     #
@@ -704,7 +731,8 @@ class ModuleAutoGen(AutoGen):
             except KeyError:
                 FlagOption = ''
 
-            IncPathList = [NormPath(Path, self.Macros) for Path in BuildOptIncludeRegEx.findall(FlagOption)]
+            IncPathList = [NormPath(Path, self.Macros)
+                           for Path in BuildOptIncludeRegEx.findall(FlagOption)]
 
             #
             # EDK II modules must not reference header files outside of the packages they depend on or
@@ -713,7 +741,8 @@ class ModuleAutoGen(AutoGen):
             if GlobalData.gDisableIncludePathCheck == False:
                 for Path in IncPathList:
                     if (Path not in self.IncludePathList) and (CommonPath([Path, self.MetaFile.Dir]) != self.MetaFile.Dir):
-                        ErrMsg = "The include directory for the EDK II module in this line is invalid %s specified in %s FLAGS '%s'" % (Path, Tool, FlagOption)
+                        ErrMsg = "The include directory for the EDK II module in this line is invalid %s specified in %s FLAGS '%s'" % (
+                            Path, Tool, FlagOption)
                         EdkLogger.error("build",
                                         PARAMETER_INVALID,
                                         ExtraData=ErrMsg,
@@ -721,7 +750,7 @@ class ModuleAutoGen(AutoGen):
             RetVal += IncPathList
         return RetVal
 
-    ## Return a list of files which can be built from source
+    # Return a list of files which can be built from source
     #
     #  What kind of files can be built is determined by build rules in
     #  $(CONF_DIRECTORY)/build_rule.txt and toolchain family.
@@ -730,7 +759,8 @@ class ModuleAutoGen(AutoGen):
     def SourceFileList(self):
         RetVal = []
         ToolChainTagSet = {"", TAB_STAR, self.ToolChain}
-        ToolChainFamilySet = {"", TAB_STAR, self.ToolChainFamily, self.BuildRuleFamily}
+        ToolChainFamilySet = {"", TAB_STAR,
+                              self.ToolChainFamily, self.BuildRuleFamily}
         for F in self.Module.Sources:
             # match tool chain
             if F.TagName not in ToolChainTagSet:
@@ -740,10 +770,10 @@ class ModuleAutoGen(AutoGen):
             # match tool chain family or build rule family
             if F.ToolChainFamily not in ToolChainFamilySet:
                 EdkLogger.debug(
-                            EdkLogger.DEBUG_0,
-                            "The file [%s] must be built by tools of [%s], " \
-                            "but current toolchain family is [%s], buildrule family is [%s]" \
-                                % (str(F), F.ToolChainFamily, self.ToolChainFamily, self.BuildRuleFamily))
+                    EdkLogger.DEBUG_0,
+                    "The file [%s] must be built by tools of [%s], "
+                    "but current toolchain family is [%s], buildrule family is [%s]"
+                    % (str(F), F.ToolChainFamily, self.ToolChainFamily, self.BuildRuleFamily))
                 continue
 
             # add the file path into search path list for file including
@@ -762,7 +792,7 @@ class ModuleAutoGen(AutoGen):
         self.BuildOption
         for SingleFile in FileList:
             if self.BuildRuleOrder and SingleFile.Ext in self.BuildRuleOrder and SingleFile.Ext in self.BuildRules:
-                key = SingleFile.Path.rsplit(SingleFile.Ext,1)[0]
+                key = SingleFile.Path.rsplit(SingleFile.Ext, 1)[0]
                 if key in Order_Dict:
                     Order_Dict[key].append(SingleFile.Ext)
                 else:
@@ -780,22 +810,22 @@ class ModuleAutoGen(AutoGen):
 
         return FileList
 
-    ## Return the list of unicode files
+    # Return the list of unicode files
     @cached_property
     def UnicodeFileList(self):
-        return self.FileTypes.get(TAB_UNICODE_FILE,[])
+        return self.FileTypes.get(TAB_UNICODE_FILE, [])
 
-    ## Return the list of vfr files
+    # Return the list of vfr files
     @cached_property
     def VfrFileList(self):
         return self.FileTypes.get(TAB_VFR_FILE, [])
 
-    ## Return the list of Image Definition files
+    # Return the list of Image Definition files
     @cached_property
     def IdfFileList(self):
-        return self.FileTypes.get(TAB_IMAGE_FILE,[])
+        return self.FileTypes.get(TAB_IMAGE_FILE, [])
 
-    ## Return a list of files which can be built from binary
+    # Return a list of files which can be built from binary
     #
     #  "Build" binary files are just to copy them to build directory.
     #
@@ -816,19 +846,23 @@ class ModuleAutoGen(AutoGen):
         RetVal = {}
         BuildRuleDatabase = self.PlatformInfo.BuildRule
         for Type in BuildRuleDatabase.FileTypeList:
-            #first try getting build rule by BuildRuleFamily
-            RuleObject = BuildRuleDatabase[Type, self.BuildType, self.Arch, self.BuildRuleFamily]
+            # first try getting build rule by BuildRuleFamily
+            RuleObject = BuildRuleDatabase[Type,
+                                           self.BuildType, self.Arch, self.BuildRuleFamily]
             if not RuleObject:
                 # build type is always module type, but ...
                 if self.ModuleType != self.BuildType:
-                    RuleObject = BuildRuleDatabase[Type, self.ModuleType, self.Arch, self.BuildRuleFamily]
-            #second try getting build rule by ToolChainFamily
+                    RuleObject = BuildRuleDatabase[Type,
+                                                   self.ModuleType, self.Arch, self.BuildRuleFamily]
+            # second try getting build rule by ToolChainFamily
             if not RuleObject:
-                RuleObject = BuildRuleDatabase[Type, self.BuildType, self.Arch, self.ToolChainFamily]
+                RuleObject = BuildRuleDatabase[Type,
+                                               self.BuildType, self.Arch, self.ToolChainFamily]
                 if not RuleObject:
                     # build type is always module type, but ...
                     if self.ModuleType != self.BuildType:
-                        RuleObject = BuildRuleDatabase[Type, self.ModuleType, self.Arch, self.ToolChainFamily]
+                        RuleObject = BuildRuleDatabase[Type,
+                                                       self.ModuleType, self.Arch, self.ToolChainFamily]
             if not RuleObject:
                 continue
             RuleObject = RuleObject.Instantiate(self.Macros)
@@ -906,7 +940,8 @@ class ModuleAutoGen(AutoGen):
 
             # to avoid cyclic rule
             if FileType in RuleChain:
-                EdkLogger.error("build", ERROR_STATEMENT, "Cyclic dependency detected while generating rule for %s" % str(Source))
+                EdkLogger.error(
+                    "build", ERROR_STATEMENT, "Cyclic dependency detected while generating rule for %s" % str(Source))
 
             RuleChain.add(FileType)
             SourceList.extend(Target.Outputs)
@@ -925,10 +960,10 @@ class ModuleAutoGen(AutoGen):
             self._BuildTargets = defaultdict(set)
             self._FileTypes = defaultdict(set)
 
-        #TRICK: call SourceFileList property to apply build rule for source files
+        # TRICK: call SourceFileList property to apply build rule for source files
         self.SourceFileList
 
-        #TRICK: call _GetBinaryFileList to apply build rule for binary files
+        # TRICK: call _GetBinaryFileList to apply build rule for binary files
         self.BinaryFileList
 
         return self._BuildTargets
@@ -948,7 +983,7 @@ class ModuleAutoGen(AutoGen):
         self.Targets
         return self._FileTypes
 
-    ## Get the list of package object the module depends on and the Platform depends on
+    # Get the list of package object the module depends on and the Platform depends on
     #
     #   @retval     list    The package object list
     #
@@ -956,7 +991,7 @@ class ModuleAutoGen(AutoGen):
     def DependentPackageList(self):
         return self.PackageList
 
-    ## Return the list of auto-generated code file
+    # Return the list of auto-generated code file
     #
     #   @retval     list        The list of auto-generated file
     #
@@ -970,7 +1005,8 @@ class ModuleAutoGen(AutoGen):
         AutoGenH = TemplateString()
         StringH = TemplateString()
         StringIdf = TemplateString()
-        GenC.CreateCode(self, AutoGenC, AutoGenH, StringH, AutoGenUniIdf, UniStringBinBuffer, StringIdf, AutoGenUniIdf, IdfGenBinBuffer)
+        GenC.CreateCode(self, AutoGenC, AutoGenH, StringH, AutoGenUniIdf,
+                        UniStringBinBuffer, StringIdf, AutoGenUniIdf, IdfGenBinBuffer)
         #
         # AutoGen.c is generated if there are library classes in inf, or there are object files
         #
@@ -984,22 +1020,26 @@ class ModuleAutoGen(AutoGen):
             RetVal[AutoFile] = str(AutoGenH)
             self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
         if str(StringH) != "":
-            AutoFile = PathClass(gAutoGenStringFileName % {"module_name":self.Name}, self.DebugDir)
+            AutoFile = PathClass(gAutoGenStringFileName %
+                                 {"module_name": self.Name}, self.DebugDir)
             RetVal[AutoFile] = str(StringH)
             self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
         if UniStringBinBuffer is not None and UniStringBinBuffer.getvalue() != b"":
-            AutoFile = PathClass(gAutoGenStringFormFileName % {"module_name":self.Name}, self.OutputDir)
+            AutoFile = PathClass(gAutoGenStringFormFileName %
+                                 {"module_name": self.Name}, self.OutputDir)
             RetVal[AutoFile] = UniStringBinBuffer.getvalue()
             AutoFile.IsBinary = True
             self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
         if UniStringBinBuffer is not None:
             UniStringBinBuffer.close()
         if str(StringIdf) != "":
-            AutoFile = PathClass(gAutoGenImageDefFileName % {"module_name":self.Name}, self.DebugDir)
+            AutoFile = PathClass(gAutoGenImageDefFileName %
+                                 {"module_name": self.Name}, self.DebugDir)
             RetVal[AutoFile] = str(StringIdf)
             self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
         if IdfGenBinBuffer is not None and IdfGenBinBuffer.getvalue() != b"":
-            AutoFile = PathClass(gAutoGenIdfFileName % {"module_name":self.Name}, self.OutputDir)
+            AutoFile = PathClass(gAutoGenIdfFileName %
+                                 {"module_name": self.Name}, self.OutputDir)
             RetVal[AutoFile] = IdfGenBinBuffer.getvalue()
             AutoFile.IsBinary = True
             self._ApplyBuildRule(AutoFile, TAB_UNKNOWN_FILE)
@@ -1007,7 +1047,7 @@ class ModuleAutoGen(AutoGen):
             IdfGenBinBuffer.close()
         return RetVal
 
-    ## Return the list of library modules explicitly or implicitly used by this module
+    # Return the list of library modules explicitly or implicitly used by this module
     @cached_property
     def DependentLibraryList(self):
         # only merge library classes and PCD for non-library module
@@ -1015,7 +1055,7 @@ class ModuleAutoGen(AutoGen):
             return []
         return self.PlatformInfo.ApplyLibraryInstance(self.Module)
 
-    ## Get the list of PCDs from current module
+    # Get the list of PCDs from current module
     #
     #   @retval     list                    The list of PCD
     #
@@ -1025,6 +1065,7 @@ class ModuleAutoGen(AutoGen):
         RetVal = self.PlatformInfo.ApplyPcdSetting(self, self.Module.Pcds)
 
         return RetVal
+
     @cached_property
     def _PcdComments(self):
         ReVal = OrderedListDict()
@@ -1034,7 +1075,7 @@ class ModuleAutoGen(AutoGen):
                 ExtendCopyDictionaryLists(ReVal, Library.PcdComments)
         return ReVal
 
-    ## Get the list of PCDs from dependent libraries
+    # Get the list of PCDs from dependent libraries
     #
     #   @retval     list                    The list of PCD
     #
@@ -1053,10 +1094,11 @@ class ModuleAutoGen(AutoGen):
                     continue
                 Pcds.add(Key)
                 PcdsInLibrary[Key] = copy.copy(Library.Pcds[Key])
-            RetVal.extend(self.PlatformInfo.ApplyPcdSetting(self, PcdsInLibrary, Library=Library))
+            RetVal.extend(self.PlatformInfo.ApplyPcdSetting(
+                self, PcdsInLibrary, Library=Library))
         return RetVal
 
-    ## Get the GUID value mapping
+    # Get the GUID value mapping
     #
     #   @retval     dict    The mapping between GUID cname and its value
     #
@@ -1075,20 +1117,23 @@ class ModuleAutoGen(AutoGen):
         for Library in self.DependentLibraryList:
             RetVal.update(Library.GetGuidsUsedByPcd())
         return RetVal
-    ## Get the protocol value mapping
+    # Get the protocol value mapping
     #
     #   @retval     dict    The mapping between protocol cname and its value
     #
+
     @cached_property
     def ProtocolList(self):
         RetVal = OrderedDict(self.Module.Protocols)
         for Library in self.DependentLibraryList:
             RetVal.update(Library.Protocols)
-            ExtendCopyDictionaryLists(self._ProtocolComments, Library.ProtocolComments)
-        ExtendCopyDictionaryLists(self._ProtocolComments, self.Module.ProtocolComments)
+            ExtendCopyDictionaryLists(
+                self._ProtocolComments, Library.ProtocolComments)
+        ExtendCopyDictionaryLists(
+            self._ProtocolComments, self.Module.ProtocolComments)
         return RetVal
 
-    ## Get the PPI value mapping
+    # Get the PPI value mapping
     #
     #   @retval     dict    The mapping between PPI cname and its value
     #
@@ -1101,7 +1146,7 @@ class ModuleAutoGen(AutoGen):
         ExtendCopyDictionaryLists(self._PpiComments, self.Module.PpiComments)
         return RetVal
 
-    ## Get the list of include search path
+    # Get the list of include search path
     #
     #   @retval     list                    The list path
     #
@@ -1118,7 +1163,8 @@ class ModuleAutoGen(AutoGen):
             IncludesList = Package.Includes
             if Package._PrivateIncludes:
                 if not self.MetaFile.OriginalPath.Path.startswith(PackageDir):
-                    IncludesList = list(set(Package.Includes).difference(set(Package._PrivateIncludes)))
+                    IncludesList = list(set(Package.Includes).difference(
+                        set(Package._PrivateIncludes)))
             for Inc in IncludesList:
                 if Inc not in RetVal:
                     RetVal.append(str(Inc))
@@ -1134,8 +1180,8 @@ class ModuleAutoGen(AutoGen):
                 whitespace = False
                 for flag in flags.split(" "):
                     flag = flag.strip()
-                    if flag.startswith(("/I","-I")):
-                        if len(flag)>2:
+                    if flag.startswith(("/I", "-I")):
+                        if len(flag) > 2:
                             if os.path.exists(flag[2:]):
                                 IncPathList.append(flag[2:])
                         else:
@@ -1151,7 +1197,7 @@ class ModuleAutoGen(AutoGen):
     def IncludePathLength(self):
         return sum(len(inc)+1 for inc in self.IncludePathList)
 
-    ## Get the list of include paths from the packages
+    # Get the list of include paths from the packages
     #
     #   @IncludesList     list             The list path
     #
@@ -1163,10 +1209,11 @@ class ModuleAutoGen(AutoGen):
             IncludesList = Package.Includes
             if Package._PrivateIncludes:
                 if not self.MetaFile.Path.startswith(PackageDir):
-                    IncludesList = list(set(Package.Includes).difference(set(Package._PrivateIncludes)))
+                    IncludesList = list(set(Package.Includes).difference(
+                        set(Package._PrivateIncludes)))
         return IncludesList
 
-    ## Get HII EX PCDs which maybe used by VFR
+    # Get HII EX PCDs which may be used by VFR
     #
     #  efivarstore used by VFR may relate with HII EX PCDs
     #  Get the variable name and GUID from efivarstore and HII EX PCD
@@ -1209,8 +1256,10 @@ class ModuleAutoGen(AutoGen):
                 Guid = gEfiVarStoreGuidPattern.search(Content, Pos)
                 if not Guid:
                     break
-                NameArray = _ConvertStringToByteArray('L"' + Name.group(1) + '"')
-                NameGuids.add((NameArray, GuidStructureStringToGuidString(Guid.group(1))))
+                NameArray = _ConvertStringToByteArray(
+                    'L"' + Name.group(1) + '"')
+                NameGuids.add(
+                    (NameArray, GuidStructureStringToGuidString(Guid.group(1))))
                 Pos = Content.find('efivarstore', Name.end())
         if not NameGuids:
             return []
@@ -1219,7 +1268,8 @@ class ModuleAutoGen(AutoGen):
             if Pcd.Type != TAB_PCDS_DYNAMIC_EX_HII:
                 continue
             for SkuInfo in Pcd.SkuInfoList.values():
-                Value = GuidValue(SkuInfo.VariableGuid, self.PlatformInfo.PackageList, self.MetaFile.Path)
+                Value = GuidValue(
+                    SkuInfo.VariableGuid, self.PlatformInfo.PackageList, self.MetaFile.Path)
                 if not Value:
                     continue
                 Name = _ConvertStringToByteArray(SkuInfo.VariableName)
@@ -1233,12 +1283,13 @@ class ModuleAutoGen(AutoGen):
     def _GenOffsetBin(self):
         VfrUniBaseName = {}
         for SourceFile in self.Module.Sources:
-            if SourceFile.Type.upper() == ".VFR" :
+            if SourceFile.Type.upper() == ".VFR":
                 #
                 # search the .map file to find the offset of vfr binary in the PE32+/TE file.
                 #
-                VfrUniBaseName[SourceFile.BaseName] = (SourceFile.BaseName + "Bin")
-            elif SourceFile.Type.upper() == ".UNI" :
+                VfrUniBaseName[SourceFile.BaseName] = (
+                    SourceFile.BaseName + "Bin")
+            elif SourceFile.Type.upper() == ".UNI":
                 #
                 # search the .map file to find the offset of Uni strings binary in the PE32+/TE file.
                 #
@@ -1248,17 +1299,19 @@ class ModuleAutoGen(AutoGen):
             return None
         MapFileName = os.path.join(self.OutputDir, self.Name + ".map")
         EfiFileName = os.path.join(self.OutputDir, self.Name + ".efi")
-        VfrUniOffsetList = GetVariableOffset(MapFileName, EfiFileName, list(VfrUniBaseName.values()))
+        VfrUniOffsetList = GetVariableOffset(
+            MapFileName, EfiFileName, list(VfrUniBaseName.values()))
         if not VfrUniOffsetList:
             return None
 
         OutputName = '%sOffset.bin' % self.Name
-        UniVfrOffsetFileName    =  os.path.join( self.OutputDir, OutputName)
+        UniVfrOffsetFileName = os.path.join(self.OutputDir, OutputName)
 
         try:
             fInputfile = open(UniVfrOffsetFileName, "wb+", 0)
         except:
-            EdkLogger.error("build", FILE_OPEN_FAILURE, "File open failed for %s" % UniVfrOffsetFileName, None)
+            EdkLogger.error("build", FILE_OPEN_FAILURE,
+                            "File open failed for %s" % UniVfrOffsetFileName, None)
 
         # Use a instance of BytesIO to cache data
         fStringIO = BytesIO()
@@ -1272,8 +1325,8 @@ class ModuleAutoGen(AutoGen):
                 #
                 UniGuid = b'\xe0\xc5\x13\x89\xf63\x86M\x9b\xf1C\xef\x89\xfc\x06f'
                 fStringIO.write(UniGuid)
-                UniValue = pack ('Q', int (Item[1], 16))
-                fStringIO.write (UniValue)
+                UniValue = pack('Q', int(Item[1], 16))
+                fStringIO.write(UniValue)
             else:
                 #
                 # VFR binary offset in image.
@@ -1282,19 +1335,19 @@ class ModuleAutoGen(AutoGen):
                 #
                 VfrGuid = b'\xb4|\xbc\xd0Gj_I\xaa\x11q\x07F\xda\x06\xa2'
                 fStringIO.write(VfrGuid)
-                VfrValue = pack ('Q', int (Item[1], 16))
-                fStringIO.write (VfrValue)
+                VfrValue = pack('Q', int(Item[1], 16))
+                fStringIO.write(VfrValue)
         #
         # write data into file.
         #
-        try :
-            fInputfile.write (fStringIO.getvalue())
+        try:
+            fInputfile.write(fStringIO.getvalue())
         except:
             EdkLogger.error("build", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the "
-                            "file been locked or using by other applications." %UniVfrOffsetFileName, None)
+                            "file been locked or using by other applications." % UniVfrOffsetFileName, None)
 
-        fStringIO.close ()
-        fInputfile.close ()
+        fStringIO.close()
+        fInputfile.close()
         return OutputName
 
     @cached_property
@@ -1315,7 +1368,7 @@ class ModuleAutoGen(AutoGen):
 
         return retVal
 
-    ## Create AsBuilt INF file the module
+    # Create AsBuilt INF file the module
     #
     def CreateAsBuiltInf(self):
 
@@ -1334,7 +1387,7 @@ class ModuleAutoGen(AutoGen):
         if self.BinaryFileList:
             return
 
-        ### TODO: How to handles mixed source and binary modules
+        # TODO: How to handle mixed source and binary modules
 
         # Find all DynamicEx and PatchableInModule PCDs used by this module and dependent libraries
         # Also find all packages that the DynamicEx PCDs depend on
@@ -1346,12 +1399,15 @@ class ModuleAutoGen(AutoGen):
         for Pcd in self.ModulePcdList + self.LibraryPcdList:
             if Pcd.Type == TAB_PCDS_PATCHABLE_IN_MODULE:
                 PatchablePcds.append(Pcd)
-                PcdCheckList.append((Pcd.TokenCName, Pcd.TokenSpaceGuidCName, TAB_PCDS_PATCHABLE_IN_MODULE))
+                PcdCheckList.append(
+                    (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, TAB_PCDS_PATCHABLE_IN_MODULE))
             elif Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET:
                 if Pcd not in Pcds:
                     Pcds.append(Pcd)
-                    PcdCheckList.append((Pcd.TokenCName, Pcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC_EX))
-                    PcdCheckList.append((Pcd.TokenCName, Pcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC))
+                    PcdCheckList.append(
+                        (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC_EX))
+                    PcdCheckList.append(
+                        (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC))
                     PcdTokenSpaceList.append(Pcd.TokenSpaceGuidCName)
         GuidList = OrderedDict(self.GuidList)
         for TokenSpace in self.GetGuidsUsedByPcd:
@@ -1363,7 +1419,8 @@ class ModuleAutoGen(AutoGen):
         for Package in self.DerivedPackageList:
             if Package in Packages:
                 continue
-            BeChecked = (Package.Guids, Package.Ppis, Package.Protocols, Package.Pcds)
+            BeChecked = (Package.Guids, Package.Ppis,
+                         Package.Protocols, Package.Pcds)
             Found = False
             for Index in range(len(BeChecked)):
                 for Item in CheckList[Index]:
@@ -1380,7 +1437,7 @@ class ModuleAutoGen(AutoGen):
                 continue
             for VfrPcd in VfrPcds:
                 if ((VfrPcd.TokenCName, VfrPcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC_EX) in Pkg.Pcds or
-                    (VfrPcd.TokenCName, VfrPcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC) in Pkg.Pcds):
+                        (VfrPcd.TokenCName, VfrPcd.TokenSpaceGuidCName, TAB_PCDS_DYNAMIC) in Pkg.Pcds):
                     Packages.append(Pkg)
                     break
 
@@ -1390,36 +1447,36 @@ class ModuleAutoGen(AutoGen):
         MDefs = self.Module.Defines
 
         AsBuiltInfDict = {
-          'module_name'                       : self.Name,
-          'module_guid'                       : Guid,
-          'module_module_type'                : ModuleType,
-          'module_version_string'             : [MDefs['VERSION_STRING']] if 'VERSION_STRING' in MDefs else [],
-          'pcd_is_driver_string'              : [],
-          'module_uefi_specification_version' : [],
-          'module_pi_specification_version'   : [],
-          'module_entry_point'                : self.Module.ModuleEntryPointList,
-          'module_unload_image'               : self.Module.ModuleUnloadImageList,
-          'module_constructor'                : self.Module.ConstructorList,
-          'module_destructor'                 : self.Module.DestructorList,
-          'module_shadow'                     : [MDefs['SHADOW']] if 'SHADOW' in MDefs else [],
-          'module_pci_vendor_id'              : [MDefs['PCI_VENDOR_ID']] if 'PCI_VENDOR_ID' in MDefs else [],
-          'module_pci_device_id'              : [MDefs['PCI_DEVICE_ID']] if 'PCI_DEVICE_ID' in MDefs else [],
-          'module_pci_class_code'             : [MDefs['PCI_CLASS_CODE']] if 'PCI_CLASS_CODE' in MDefs else [],
-          'module_pci_revision'               : [MDefs['PCI_REVISION']] if 'PCI_REVISION' in MDefs else [],
-          'module_build_number'               : [MDefs['BUILD_NUMBER']] if 'BUILD_NUMBER' in MDefs else [],
-          'module_spec'                       : [MDefs['SPEC']] if 'SPEC' in MDefs else [],
-          'module_uefi_hii_resource_section'  : [MDefs['UEFI_HII_RESOURCE_SECTION']] if 'UEFI_HII_RESOURCE_SECTION' in MDefs else [],
-          'module_uni_file'                   : [MDefs['MODULE_UNI_FILE']] if 'MODULE_UNI_FILE' in MDefs else [],
-          'module_arch'                       : self.Arch,
-          'package_item'                      : [Package.MetaFile.File.replace('\\', '/') for Package in Packages],
-          'binary_item'                       : [],
-          'patchablepcd_item'                 : [],
-          'pcd_item'                          : [],
-          'protocol_item'                     : [],
-          'ppi_item'                          : [],
-          'guid_item'                         : [],
-          'flags_item'                        : [],
-          'libraryclasses_item'               : []
+            'module_name': self.Name,
+            'module_guid': Guid,
+            'module_module_type': ModuleType,
+            'module_version_string': [MDefs['VERSION_STRING']] if 'VERSION_STRING' in MDefs else [],
+            'pcd_is_driver_string': [],
+            'module_uefi_specification_version': [],
+            'module_pi_specification_version': [],
+            'module_entry_point': self.Module.ModuleEntryPointList,
+            'module_unload_image': self.Module.ModuleUnloadImageList,
+            'module_constructor': self.Module.ConstructorList,
+            'module_destructor': self.Module.DestructorList,
+            'module_shadow': [MDefs['SHADOW']] if 'SHADOW' in MDefs else [],
+            'module_pci_vendor_id': [MDefs['PCI_VENDOR_ID']] if 'PCI_VENDOR_ID' in MDefs else [],
+            'module_pci_device_id': [MDefs['PCI_DEVICE_ID']] if 'PCI_DEVICE_ID' in MDefs else [],
+            'module_pci_class_code': [MDefs['PCI_CLASS_CODE']] if 'PCI_CLASS_CODE' in MDefs else [],
+            'module_pci_revision': [MDefs['PCI_REVISION']] if 'PCI_REVISION' in MDefs else [],
+            'module_build_number': [MDefs['BUILD_NUMBER']] if 'BUILD_NUMBER' in MDefs else [],
+            'module_spec': [MDefs['SPEC']] if 'SPEC' in MDefs else [],
+            'module_uefi_hii_resource_section': [MDefs['UEFI_HII_RESOURCE_SECTION']] if 'UEFI_HII_RESOURCE_SECTION' in MDefs else [],
+            'module_uni_file': [MDefs['MODULE_UNI_FILE']] if 'MODULE_UNI_FILE' in MDefs else [],
+            'module_arch': self.Arch,
+            'package_item': [Package.MetaFile.File.replace('\\', '/') for Package in Packages],
+            'binary_item': [],
+            'patchablepcd_item': [],
+            'pcd_item': [],
+            'protocol_item': [],
+            'ppi_item': [],
+            'guid_item': [],
+            'flags_item': [],
+            'libraryclasses_item': []
         }
 
         if 'MODULE_UNI_FILE' in MDefs:
@@ -1436,22 +1493,27 @@ class ModuleAutoGen(AutoGen):
             AsBuiltInfDict['pcd_is_driver_string'].append(DriverType)
 
         if 'UEFI_SPECIFICATION_VERSION' in self.Specification:
-            AsBuiltInfDict['module_uefi_specification_version'].append(self.Specification['UEFI_SPECIFICATION_VERSION'])
+            AsBuiltInfDict['module_uefi_specification_version'].append(
+                self.Specification['UEFI_SPECIFICATION_VERSION'])
         if 'PI_SPECIFICATION_VERSION' in self.Specification:
-            AsBuiltInfDict['module_pi_specification_version'].append(self.Specification['PI_SPECIFICATION_VERSION'])
+            AsBuiltInfDict['module_pi_specification_version'].append(
+                self.Specification['PI_SPECIFICATION_VERSION'])
 
         OutputDir = self.OutputDir.replace('\\', '/').strip('/')
         DebugDir = self.DebugDir.replace('\\', '/').strip('/')
         for Item in self.CodaTargetList:
-            File = Item.Target.Path.replace('\\', '/').strip('/').replace(DebugDir, '').replace(OutputDir, '').strip('/')
+            File = Item.Target.Path.replace(
+                '\\', '/').strip('/').replace(DebugDir, '').replace(OutputDir, '').strip('/')
             if os.path.isabs(File):
-                File = File.replace('\\', '/').strip('/').replace(OutputDir, '').strip('/')
+                File = File.replace(
+                    '\\', '/').strip('/').replace(OutputDir, '').strip('/')
             if Item.Target.Ext.lower() == '.aml':
                 AsBuiltInfDict['binary_item'].append('ASL|' + File)
             elif Item.Target.Ext.lower() == '.acpi':
                 AsBuiltInfDict['binary_item'].append('ACPI|' + File)
             elif Item.Target.Ext.lower() == '.efi':
-                AsBuiltInfDict['binary_item'].append('PE32|' + self.Name + '.efi')
+                AsBuiltInfDict['binary_item'].append(
+                    'PE32|' + self.Name + '.efi')
             else:
                 AsBuiltInfDict['binary_item'].append('BIN|' + File)
         if not self.DepexGenerated:
@@ -1460,11 +1522,14 @@ class ModuleAutoGen(AutoGen):
                 self.DepexGenerated = True
         if self.DepexGenerated:
             if self.ModuleType in [SUP_MODULE_PEIM]:
-                AsBuiltInfDict['binary_item'].append('PEI_DEPEX|' + self.Name + '.depex')
+                AsBuiltInfDict['binary_item'].append(
+                    'PEI_DEPEX|' + self.Name + '.depex')
             elif self.ModuleType in [SUP_MODULE_DXE_DRIVER, SUP_MODULE_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_UEFI_DRIVER]:
-                AsBuiltInfDict['binary_item'].append('DXE_DEPEX|' + self.Name + '.depex')
+                AsBuiltInfDict['binary_item'].append(
+                    'DXE_DEPEX|' + self.Name + '.depex')
             elif self.ModuleType in [SUP_MODULE_DXE_SMM_DRIVER]:
-                AsBuiltInfDict['binary_item'].append('SMM_DEPEX|' + self.Name + '.depex')
+                AsBuiltInfDict['binary_item'].append(
+                    'SMM_DEPEX|' + self.Name + '.depex')
 
         Bin = self._GenOffsetBin()
         if Bin:
@@ -1478,10 +1543,12 @@ class ModuleAutoGen(AutoGen):
         StartPos = 0
         for Index in range(len(HeaderComments)):
             if HeaderComments[Index].find('@BinaryHeader') != -1:
-                HeaderComments[Index] = HeaderComments[Index].replace('@BinaryHeader', '@file')
+                HeaderComments[Index] = HeaderComments[Index].replace(
+                    '@BinaryHeader', '@file')
                 StartPos = Index
                 break
-        AsBuiltInfDict['header_comments'] = '\n'.join(HeaderComments[StartPos:]).replace(':#', '://')
+        AsBuiltInfDict['header_comments'] = '\n'.join(
+            HeaderComments[StartPos:]).replace(':#', '://')
         AsBuiltInfDict['tail_comments'] = '\n'.join(self.Module.TailComments)
 
         GenList = [
@@ -1491,13 +1558,14 @@ class ModuleAutoGen(AutoGen):
         ]
         for Item in GenList:
             for CName in Item[0]:
-                Comments = '\n  '.join(Item[1][CName]) if CName in Item[1] else ''
+                Comments = '\n  '.join(
+                    Item[1][CName]) if CName in Item[1] else ''
                 Entry = Comments + '\n  ' + CName if Comments else CName
                 AsBuiltInfDict[Item[2]].append(Entry)
         PatchList = parsePcdInfoFromMapFile(
-                            os.path.join(self.OutputDir, self.Name + '.map'),
-                            os.path.join(self.OutputDir, self.Name + '.efi')
-                        )
+            os.path.join(self.OutputDir, self.Name + '.map'),
+            os.path.join(self.OutputDir, self.Name + '.efi')
+        )
         if PatchList:
             for Pcd in PatchablePcds:
                 TokenCName = Pcd.TokenCName
@@ -1530,7 +1598,8 @@ class ModuleAutoGen(AutoGen):
                 else:
                     if Pcd.MaxDatumSize is None or Pcd.MaxDatumSize == '':
                         EdkLogger.error("build", AUTOGEN_ERROR,
-                                        "Unknown [MaxDatumSize] of PCD [%s.%s]" % (Pcd.TokenSpaceGuidCName, TokenCName)
+                                        "Unknown [MaxDatumSize] of PCD [%s.%s]" % (
+                                            Pcd.TokenSpaceGuidCName, TokenCName)
                                         )
                     ArraySize = int(Pcd.MaxDatumSize, 0)
                     PcdValue = Pcd.DefaultValue
@@ -1545,9 +1614,11 @@ class ModuleAutoGen(AutoGen):
                             if Unicode:
                                 CharVal = ord(PcdValue[Index])
                                 NewValue = NewValue + '0x%02x' % (CharVal & 0x00FF) + ', ' \
-                                        + '0x%02x' % (CharVal >> 8) + ', '
+                                    + '0x%02x' % (CharVal >> 8) + ', '
                             else:
-                                NewValue = NewValue + '0x%02x' % (ord(PcdValue[Index]) % 0x100) + ', '
+                                NewValue = NewValue + \
+                                    '0x%02x' % (
+                                        ord(PcdValue[Index]) % 0x100) + ', '
                         Padding = '0x00, '
                         if Unicode:
                             Padding = Padding * 2
@@ -1555,28 +1626,34 @@ class ModuleAutoGen(AutoGen):
                         if ArraySize < (len(PcdValue) + 1):
                             if Pcd.MaxSizeUserSet:
                                 EdkLogger.error("build", AUTOGEN_ERROR,
-                                            "The maximum size of VOID* type PCD '%s.%s' is less than its actual size occupied." % (Pcd.TokenSpaceGuidCName, TokenCName)
-                                            )
+                                                "The maximum size of VOID* type PCD '%s.%s' is less than its actual size occupied." % (
+                                                    Pcd.TokenSpaceGuidCName, TokenCName)
+                                                )
                             else:
                                 ArraySize = len(PcdValue) + 1
                         if ArraySize > len(PcdValue) + 1:
-                            NewValue = NewValue + Padding * (ArraySize - len(PcdValue) - 1)
+                            NewValue = NewValue + Padding * \
+                                (ArraySize - len(PcdValue) - 1)
                         PcdValue = NewValue + Padding.strip().rstrip(',') + '}'
                     elif len(PcdValue.split(',')) <= ArraySize:
-                        PcdValue = PcdValue.rstrip('}') + ', 0x00' * (ArraySize - len(PcdValue.split(',')))
+                        PcdValue = PcdValue.rstrip(
+                            '}') + ', 0x00' * (ArraySize - len(PcdValue.split(',')))
                         PcdValue += '}'
                     else:
                         if Pcd.MaxSizeUserSet:
                             EdkLogger.error("build", AUTOGEN_ERROR,
-                                        "The maximum size of VOID* type PCD '%s.%s' is less than its actual size occupied." % (Pcd.TokenSpaceGuidCName, TokenCName)
-                                        )
+                                            "The maximum size of VOID* type PCD '%s.%s' is less than its actual size occupied." % (
+                                                Pcd.TokenSpaceGuidCName, TokenCName)
+                                            )
                         else:
                             ArraySize = len(PcdValue) + 1
                 PcdItem = '%s.%s|%s|0x%X' % \
-                    (Pcd.TokenSpaceGuidCName, TokenCName, PcdValue, PatchPcd[1])
+                    (Pcd.TokenSpaceGuidCName,
+                     TokenCName, PcdValue, PatchPcd[1])
                 PcdComments = ''
                 if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) in self._PcdComments:
-                    PcdComments = '\n  '.join(self._PcdComments[Pcd.TokenSpaceGuidCName, Pcd.TokenCName])
+                    PcdComments = '\n  '.join(
+                        self._PcdComments[Pcd.TokenSpaceGuidCName, Pcd.TokenCName])
                 if PcdComments:
                     PcdItem = PcdComments + '\n  ' + PcdItem
                 AsBuiltInfDict['patchablepcd_item'].append(PcdItem)
@@ -1592,10 +1669,12 @@ class ModuleAutoGen(AutoGen):
             if Pcd.Type == TAB_PCDS_DYNAMIC_EX_HII:
                 for SkuName in Pcd.SkuInfoList:
                     SkuInfo = Pcd.SkuInfoList[SkuName]
-                    HiiInfo = '## %s|%s|%s' % (SkuInfo.VariableName, SkuInfo.VariableGuid, SkuInfo.VariableOffset)
+                    HiiInfo = '## %s|%s|%s' % (
+                        SkuInfo.VariableName, SkuInfo.VariableGuid, SkuInfo.VariableOffset)
                     break
             if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) in self._PcdComments:
-                PcdCommentList = self._PcdComments[Pcd.TokenSpaceGuidCName, Pcd.TokenCName][:]
+                PcdCommentList = self._PcdComments[Pcd.TokenSpaceGuidCName,
+                                                   Pcd.TokenCName][:]
             if HiiInfo:
                 UsageIndex = -1
                 UsageStr = ''
@@ -1606,7 +1685,8 @@ class ModuleAutoGen(AutoGen):
                             UsageIndex = Index
                             break
                 if UsageIndex != -1:
-                    PcdCommentList[UsageIndex] = '## %s %s %s' % (UsageStr, HiiInfo, PcdCommentList[UsageIndex].replace(UsageStr, ''))
+                    PcdCommentList[UsageIndex] = '## %s %s %s' % (
+                        UsageStr, HiiInfo, PcdCommentList[UsageIndex].replace(UsageStr, ''))
                 else:
                     PcdCommentList.append('## UNDEFINED ' + HiiInfo)
             PcdComments = '\n  '.join(PcdCommentList)
@@ -1616,11 +1696,13 @@ class ModuleAutoGen(AutoGen):
             AsBuiltInfDict['pcd_item'].append(PcdEntry)
         for Item in self.BuildOption:
             if 'FLAGS' in self.BuildOption[Item]:
-                AsBuiltInfDict['flags_item'].append('%s:%s_%s_%s_%s_FLAGS = %s' % (self.ToolChainFamily, self.BuildTarget, self.ToolChain, self.Arch, Item, self.BuildOption[Item]['FLAGS'].strip()))
+                AsBuiltInfDict['flags_item'].append('%s:%s_%s_%s_%s_FLAGS = %s' % (
+                    self.ToolChainFamily, self.BuildTarget, self.ToolChain, self.Arch, Item, self.BuildOption[Item]['FLAGS'].strip()))
 
         # Generated LibraryClasses section in comments.
         for Library in self.LibraryAutoGenList:
-            AsBuiltInfDict['libraryclasses_item'].append(Library.MetaFile.File.replace('\\', '/'))
+            AsBuiltInfDict['libraryclasses_item'].append(
+                Library.MetaFile.File.replace('\\', '/'))
 
         # Generated UserExtensions TianoCore section.
         # All tianocore user extensions are copied.
@@ -1639,7 +1721,8 @@ class ModuleAutoGen(AutoGen):
         AsBuiltInf = TemplateString()
         AsBuiltInf.Append(gAsBuiltInfHeaderString.Replace(AsBuiltInfDict))
 
-        SaveFileOnChange(os.path.join(self.OutputDir, self.Name + '.inf'), str(AsBuiltInf), False)
+        SaveFileOnChange(os.path.join(
+            self.OutputDir, self.Name + '.inf'), str(AsBuiltInf), False)
 
         self.IsAsBuiltInfCreated = True
 
@@ -1654,7 +1737,8 @@ class ModuleAutoGen(AutoGen):
         try:
             CopyFileOnChange(File, destination_dir)
         except:
-            EdkLogger.quiet("[cache warning]: fail to copy file:%s to folder:%s" % (File, destination_dir))
+            EdkLogger.quiet("[cache warning]: fail to copy file:%s to folder:%s" % (
+                File, destination_dir))
             return
 
     def CopyModuleToCache(self):
@@ -1664,42 +1748,52 @@ class ModuleAutoGen(AutoGen):
         PreMakeHashStr = None
         MakeTimeStamp = 0
         PreMakeTimeStamp = 0
-        Files = [f for f in os.listdir(LongFilePath(self.BuildDir)) if path.isfile(LongFilePath(path.join(self.BuildDir, f)))]
+        Files = [f for f in os.listdir(LongFilePath(self.BuildDir)) if path.isfile(
+            LongFilePath(path.join(self.BuildDir, f)))]
         for File in Files:
             if ".MakeHashFileList." in File:
-                #find lastest file through time stamp
-                FileTimeStamp = os.stat(LongFilePath(path.join(self.BuildDir, File)))[8]
+                # find the latest file by time stamp
+                FileTimeStamp = os.stat(LongFilePath(
+                    path.join(self.BuildDir, File)))[8]
                 if FileTimeStamp > MakeTimeStamp:
                     MakeTimeStamp = FileTimeStamp
                     MakeHashStr = File.split('.')[-1]
                     if len(MakeHashStr) != 32:
-                        EdkLogger.quiet("[cache error]: wrong MakeHashFileList file:%s" % (File))
+                        EdkLogger.quiet(
+                            "[cache error]: wrong MakeHashFileList file:%s" % (File))
             if ".PreMakeHashFileList." in File:
-                FileTimeStamp = os.stat(LongFilePath(path.join(self.BuildDir, File)))[8]
+                FileTimeStamp = os.stat(LongFilePath(
+                    path.join(self.BuildDir, File)))[8]
                 if FileTimeStamp > PreMakeTimeStamp:
                     PreMakeTimeStamp = FileTimeStamp
                     PreMakeHashStr = File.split('.')[-1]
                     if len(PreMakeHashStr) != 32:
-                        EdkLogger.quiet("[cache error]: wrong PreMakeHashFileList file:%s" % (File))
+                        EdkLogger.quiet(
+                            "[cache error]: wrong PreMakeHashFileList file:%s" % (File))
 
         if not MakeHashStr:
-            EdkLogger.quiet("[cache error]: No MakeHashFileList file for module:%s[%s]" % (self.MetaFile.Path, self.Arch))
+            EdkLogger.quiet("[cache error]: No MakeHashFileList file for module:%s[%s]" % (
+                self.MetaFile.Path, self.Arch))
             return
         if not PreMakeHashStr:
-            EdkLogger.quiet("[cache error]: No PreMakeHashFileList file for module:%s[%s]" % (self.MetaFile.Path, self.Arch))
+            EdkLogger.quiet("[cache error]: No PreMakeHashFileList file for module:%s[%s]" % (
+                self.MetaFile.Path, self.Arch))
             return
 
         # Create Cache destination dirs
-        FileDir = path.join(GlobalData.gBinCacheDest, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
-        FfsDir = path.join(GlobalData.gBinCacheDest, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, TAB_FV_DIRECTORY, "Ffs", self.Guid + self.Name)
+        FileDir = path.join(GlobalData.gBinCacheDest, self.PlatformInfo.OutputDir, self.BuildTarget +
+                            "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
+        FfsDir = path.join(GlobalData.gBinCacheDest, self.PlatformInfo.OutputDir, self.BuildTarget +
+                           "_" + self.ToolChain, TAB_FV_DIRECTORY, "Ffs", self.Guid + self.Name)
         CacheFileDir = path.join(FileDir, MakeHashStr)
         CacheFfsDir = path.join(FfsDir, MakeHashStr)
-        CreateDirectory (CacheFileDir)
-        CreateDirectory (CacheFfsDir)
+        CreateDirectory(CacheFileDir)
+        CreateDirectory(CacheFfsDir)
 
         # Create ModuleHashPair file to support multiple version cache together
         ModuleHashPair = path.join(FileDir, self.Name + ".ModuleHashPair")
-        ModuleHashPairList = [] # tuple list: [tuple(PreMakefileHash, MakeHash)]
+        # tuple list: [tuple(PreMakefileHash, MakeHash)]
+        ModuleHashPairList = []
         if os.path.exists(ModuleHashPair):
             with open(ModuleHashPair, 'r') as f:
                 ModuleHashPairList = json.load(f)
@@ -1710,44 +1804,46 @@ class ModuleAutoGen(AutoGen):
 
         # Copy files to Cache destination dirs
         if not self.OutputFile:
-            Ma = self.BuildDatabase[self.MetaFile, self.Arch, self.BuildTarget, self.ToolChain]
+            Ma = self.BuildDatabase[self.MetaFile,
+                                    self.Arch, self.BuildTarget, self.ToolChain]
             self.OutputFile = Ma.Binaries
         for File in self.OutputFile:
             if File.startswith(os.path.abspath(self.FfsOutputDir)+os.sep):
                 self.CacheCopyFile(CacheFfsDir, self.FfsOutputDir, File)
             else:
-                if  self.Name + ".autogen.hash." in File or \
-                    self.Name + ".autogen.hashchain." in File or \
-                    self.Name + ".hash." in File or \
-                    self.Name + ".hashchain." in File or \
-                    self.Name + ".PreMakeHashFileList." in File or \
-                    self.Name + ".MakeHashFileList." in File:
+                if self.Name + ".autogen.hash." in File or \
+                        self.Name + ".autogen.hashchain." in File or \
+                        self.Name + ".hash." in File or \
+                        self.Name + ".hashchain." in File or \
+                        self.Name + ".PreMakeHashFileList." in File or \
+                        self.Name + ".MakeHashFileList." in File:
                     self.CacheCopyFile(FileDir, self.BuildDir, File)
                 else:
                     self.CacheCopyFile(CacheFileDir, self.BuildDir, File)
-    ## Create makefile for the module and its dependent libraries
+    # Create makefile for the module and its dependent libraries
     #
     #   @param      CreateLibraryMakeFile   Flag indicating if or not the makefiles of
     #                                       dependent libraries will be created
     #
+
     @cached_class_function
-    def CreateMakeFile(self, CreateLibraryMakeFile=True, GenFfsList = []):
+    def CreateMakeFile(self, CreateLibraryMakeFile=True, GenFfsList=[]):
 
         # nest this function inside it's only caller.
         def CreateTimeStamp():
             FileSet = {self.MetaFile.Path}
 
             for SourceFile in self.Module.Sources:
-                FileSet.add (SourceFile.Path)
+                FileSet.add(SourceFile.Path)
 
             for Lib in self.DependentLibraryList:
-                FileSet.add (Lib.MetaFile.Path)
+                FileSet.add(Lib.MetaFile.Path)
 
             for f in self.AutoGenDepSet:
-                FileSet.add (f.Path)
+                FileSet.add(f.Path)
 
-            if os.path.exists (self.TimeStampPath):
-                os.remove (self.TimeStampPath)
+            if os.path.exists(self.TimeStampPath):
+                os.remove(self.TimeStampPath)
 
             SaveFileOnChange(self.TimeStampPath, "\n".join(FileSet), False)
 
@@ -1789,11 +1885,12 @@ class ModuleAutoGen(AutoGen):
             SrcPath = File.Path
             DstPath = os.path.join(self.OutputDir, os.path.basename(SrcPath))
             CopyLongFilePath(SrcPath, DstPath)
-    ## Create autogen code for the module and its dependent libraries
+    # Create autogen code for the module and its dependent libraries
     #
     #   @param      CreateLibraryCodeFile   Flag indicating if or not the code of
     #                                       dependent libraries will be created
     #
+
     def CreateCodeFile(self, CreateLibraryCodeFile=True):
 
         if self.IsCodeFileCreated:
@@ -1822,14 +1919,14 @@ class ModuleAutoGen(AutoGen):
             else:
                 IgoredAutoGenList.append(str(File))
 
-
         for ModuleType in self.DepexList:
             # Ignore empty [depex] section or [depex] section for SUP_MODULE_USER_DEFINED module
             if len(self.DepexList[ModuleType]) == 0 or ModuleType == SUP_MODULE_USER_DEFINED or ModuleType == SUP_MODULE_HOST_APPLICATION:
                 continue
 
-            Dpx = GenDepex.DependencyExpression(self.DepexList[ModuleType], ModuleType, True)
-            DpxFile = gAutoGenDepexFileName % {"module_name" : self.Name}
+            Dpx = GenDepex.DependencyExpression(
+                self.DepexList[ModuleType], ModuleType, True)
+            DpxFile = gAutoGenDepexFileName % {"module_name": self.Name}
 
             if len(Dpx.PostfixNotation) != 0:
                 self.DepexGenerated = True
@@ -1853,20 +1950,20 @@ class ModuleAutoGen(AutoGen):
 
         return AutoGenList
 
-    ## Summarize the ModuleAutoGen objects of all libraries used by this module
+    # Summarize the ModuleAutoGen objects of all libraries used by this module
     @cached_property
     def LibraryAutoGenList(self):
         RetVal = []
         for Library in self.DependentLibraryList:
             La = ModuleAutoGen(
-                        self.Workspace,
-                        Library.MetaFile,
-                        self.BuildTarget,
-                        self.ToolChain,
-                        self.Arch,
-                        self.PlatformInfo.MetaFile,
-                        self.DataPipe
-                        )
+                self.Workspace,
+                Library.MetaFile,
+                self.BuildTarget,
+                self.ToolChain,
+                self.Arch,
+                self.PlatformInfo.MetaFile,
+                self.DataPipe
+            )
             La.IsLibrary = True
             if La not in RetVal:
                 RetVal.append(La)
@@ -1888,10 +1985,11 @@ class ModuleAutoGen(AutoGen):
         # Add Makefile
         abspath = path.join(self.BuildDir, self.Name + ".makefile")
         try:
-            with open(LongFilePath(abspath),"r") as fd:
+            with open(LongFilePath(abspath), "r") as fd:
                 lines = fd.readlines()
         except Exception as e:
-            EdkLogger.error("build",FILE_NOT_FOUND, "%s doesn't exist" % abspath, ExtraData=str(e), RaiseError=False)
+            EdkLogger.error("build", FILE_NOT_FOUND, "%s doesn't exist" %
+                            abspath, ExtraData=str(e), RaiseError=False)
         if lines:
             DependencyFileSet.update(lines)
 
@@ -1901,20 +1999,24 @@ class ModuleAutoGen(AutoGen):
         m = hashlib.md5()
         for File in sorted(DependencyFileSet, key=lambda x: str(x)):
             if not path.exists(LongFilePath(str(File))):
-                EdkLogger.quiet("[cache warning]: header file %s is missing for module: %s[%s]" % (File, self.MetaFile.Path, self.Arch))
+                EdkLogger.quiet("[cache warning]: header file %s is missing for module: %s[%s]" % (
+                    File, self.MetaFile.Path, self.Arch))
                 continue
             with open(LongFilePath(str(File)), 'rb') as f:
                 Content = f.read()
             m.update(Content)
             FileList.append((str(File), hashlib.md5(Content).hexdigest()))
 
-        HashChainFile = path.join(self.BuildDir, self.Name + ".autogen.hashchain." + m.hexdigest())
-        GlobalData.gCMakeHashFile[(self.MetaFile.Path, self.Arch)] = HashChainFile
+        HashChainFile = path.join(
+            self.BuildDir, self.Name + ".autogen.hashchain." + m.hexdigest())
+        GlobalData.gCMakeHashFile[(
+            self.MetaFile.Path, self.Arch)] = HashChainFile
         try:
             with open(LongFilePath(HashChainFile), 'w') as f:
                 json.dump(FileList, f, indent=2)
         except:
-            EdkLogger.quiet("[cache warning]: fail to save hashchain file:%s" % HashChainFile)
+            EdkLogger.quiet(
+                "[cache warning]: fail to save hashchain file:%s" % HashChainFile)
             return False
 
     def GenModuleHash(self):
@@ -1939,17 +2041,18 @@ class ModuleAutoGen(AutoGen):
         abspath = path.join(self.BuildDir, "deps.txt")
         rt = None
         try:
-            with open(LongFilePath(abspath),"r") as fd:
+            with open(LongFilePath(abspath), "r") as fd:
                 lines = fd.readlines()
                 if lines:
-                    rt = set([item.lstrip().strip("\n") for item in lines if item.strip("\n").endswith(".h")])
+                    rt = set([item.lstrip().strip("\n")
+                             for item in lines if item.strip("\n").endswith(".h")])
         except Exception as e:
-            EdkLogger.error("build",FILE_NOT_FOUND, "%s doesn't exist" % abspath, ExtraData=str(e), RaiseError=False)
+            EdkLogger.error("build", FILE_NOT_FOUND, "%s doesn't exist" %
+                            abspath, ExtraData=str(e), RaiseError=False)
 
         if rt:
             DependencyFileSet.update(rt)
 
-
         # Calculate the hash of all dependency files above
         # Initialize hash object
         FileList = []
@@ -1961,20 +2064,24 @@ class ModuleAutoGen(AutoGen):
             if BuildDirStr in path.abspath(File).lower():
                 continue
             if not path.exists(LongFilePath(File)):
-                EdkLogger.quiet("[cache warning]: header file %s is missing for module: %s[%s]" % (File, self.MetaFile.Path, self.Arch))
+                EdkLogger.quiet("[cache warning]: header file %s is missing for module: %s[%s]" % (
+                    File, self.MetaFile.Path, self.Arch))
                 continue
             with open(LongFilePath(File), 'rb') as f:
                 Content = f.read()
             m.update(Content)
             FileList.append((File, hashlib.md5(Content).hexdigest()))
 
-        HashChainFile = path.join(self.BuildDir, self.Name + ".hashchain." + m.hexdigest())
-        GlobalData.gModuleHashFile[(self.MetaFile.Path, self.Arch)] = HashChainFile
+        HashChainFile = path.join(
+            self.BuildDir, self.Name + ".hashchain." + m.hexdigest())
+        GlobalData.gModuleHashFile[(
+            self.MetaFile.Path, self.Arch)] = HashChainFile
         try:
             with open(LongFilePath(HashChainFile), 'w') as f:
                 json.dump(FileList, f, indent=2)
         except:
-            EdkLogger.quiet("[cache warning]: fail to save hashchain file:%s" % HashChainFile)
+            EdkLogger.quiet(
+                "[cache warning]: fail to save hashchain file:%s" % HashChainFile)
             return False
 
     def GenPreMakefileHashList(self):
@@ -1998,56 +2105,68 @@ class ModuleAutoGen(AutoGen):
             FileList.append(HashFile)
             m.update(HashFile.encode('utf-8'))
         else:
-            EdkLogger.quiet("[cache warning]: No Platform HashFile: %s" % HashFile)
+            EdkLogger.quiet(
+                "[cache warning]: No Platform HashFile: %s" % HashFile)
 
         # Add Package level hash
         if self.DependentPackageList:
             for Pkg in sorted(self.DependentPackageList, key=lambda x: x.PackageName):
                 if not (Pkg.PackageName, Pkg.Arch) in GlobalData.gPackageHashFile:
-                    EdkLogger.quiet("[cache warning]:No Package %s for module %s[%s]" % (Pkg.PackageName, self.MetaFile.Path, self.Arch))
+                    EdkLogger.quiet("[cache warning]:No Package %s for module %s[%s]" % (
+                        Pkg.PackageName, self.MetaFile.Path, self.Arch))
                     continue
-                HashFile = GlobalData.gPackageHashFile[(Pkg.PackageName, Pkg.Arch)]
+                HashFile = GlobalData.gPackageHashFile[(
+                    Pkg.PackageName, Pkg.Arch)]
                 if path.exists(LongFilePath(HashFile)):
                     FileList.append(HashFile)
                     m.update(HashFile.encode('utf-8'))
                 else:
-                    EdkLogger.quiet("[cache warning]:No Package HashFile: %s" % HashFile)
+                    EdkLogger.quiet(
+                        "[cache warning]:No Package HashFile: %s" % HashFile)
 
         # Add Module self
         # GenPreMakefileHashList needed in both --binary-destination
         # and --hash. And --hash might save ModuleHashFile in remote dict
         # during multiprocessing.
         if (self.MetaFile.Path, self.Arch) in GlobalData.gModuleHashFile:
-            HashFile = GlobalData.gModuleHashFile[(self.MetaFile.Path, self.Arch)]
+            HashFile = GlobalData.gModuleHashFile[(
+                self.MetaFile.Path, self.Arch)]
         else:
-            EdkLogger.quiet("[cache error]:No ModuleHashFile for module: %s[%s]" % (self.MetaFile.Path, self.Arch))
+            EdkLogger.quiet("[cache error]:No ModuleHashFile for module: %s[%s]" % (
+                self.MetaFile.Path, self.Arch))
         if path.exists(LongFilePath(HashFile)):
             FileList.append(HashFile)
             m.update(HashFile.encode('utf-8'))
         else:
-            EdkLogger.quiet("[cache warning]:No Module HashFile: %s" % HashFile)
+            EdkLogger.quiet(
+                "[cache warning]:No Module HashFile: %s" % HashFile)
 
         # Add Library hash
         if self.LibraryAutoGenList:
             for Lib in sorted(self.LibraryAutoGenList, key=lambda x: x.MetaFile.Path):
 
                 if (Lib.MetaFile.Path, Lib.Arch) in GlobalData.gModuleHashFile:
-                    HashFile = GlobalData.gModuleHashFile[(Lib.MetaFile.Path, Lib.Arch)]
+                    HashFile = GlobalData.gModuleHashFile[(
+                        Lib.MetaFile.Path, Lib.Arch)]
                 else:
-                    EdkLogger.quiet("[cache error]:No ModuleHashFile for lib: %s[%s]" % (Lib.MetaFile.Path, Lib.Arch))
+                    EdkLogger.quiet("[cache error]:No ModuleHashFile for lib: %s[%s]" % (
+                        Lib.MetaFile.Path, Lib.Arch))
                 if path.exists(LongFilePath(HashFile)):
                     FileList.append(HashFile)
                     m.update(HashFile.encode('utf-8'))
                 else:
-                    EdkLogger.quiet("[cache warning]:No Lib HashFile: %s" % HashFile)
+                    EdkLogger.quiet(
+                        "[cache warning]:No Lib HashFile: %s" % HashFile)
 
         # Save PreMakeHashFileList
-        FilePath = path.join(self.BuildDir, self.Name + ".PreMakeHashFileList." + m.hexdigest())
+        FilePath = path.join(self.BuildDir, self.Name +
+                             ".PreMakeHashFileList." + m.hexdigest())
         try:
             with open(LongFilePath(FilePath), 'w') as f:
                 json.dump(FileList, f, indent=0)
         except:
-            EdkLogger.quiet("[cache warning]: fail to save PreMake HashFileList: %s" % FilePath)
+            EdkLogger.quiet(
+                "[cache warning]: fail to save PreMake HashFileList: %s" % FilePath)
 
     def GenMakefileHashList(self):
         # GenMakefileHashList only need in --binary-destination which will
@@ -2065,39 +2184,48 @@ class ModuleAutoGen(AutoGen):
             FileList.append(HashFile)
             m.update(HashFile.encode('utf-8'))
         else:
-            EdkLogger.quiet("[cache warning]:No AutoGen HashFile: %s" % HashFile)
+            EdkLogger.quiet(
+                "[cache warning]:No AutoGen HashFile: %s" % HashFile)
 
         # Add Module self
         if (self.MetaFile.Path, self.Arch) in GlobalData.gModuleHashFile:
-            HashFile = GlobalData.gModuleHashFile[(self.MetaFile.Path, self.Arch)]
+            HashFile = GlobalData.gModuleHashFile[(
+                self.MetaFile.Path, self.Arch)]
         else:
-            EdkLogger.quiet("[cache error]:No ModuleHashFile for module: %s[%s]" % (self.MetaFile.Path, self.Arch))
+            EdkLogger.quiet("[cache error]:No ModuleHashFile for module: %s[%s]" % (
+                self.MetaFile.Path, self.Arch))
         if path.exists(LongFilePath(HashFile)):
             FileList.append(HashFile)
             m.update(HashFile.encode('utf-8'))
         else:
-            EdkLogger.quiet("[cache warning]:No Module HashFile: %s" % HashFile)
+            EdkLogger.quiet(
+                "[cache warning]:No Module HashFile: %s" % HashFile)
 
         # Add Library hash
         if self.LibraryAutoGenList:
             for Lib in sorted(self.LibraryAutoGenList, key=lambda x: x.MetaFile.Path):
                 if (Lib.MetaFile.Path, Lib.Arch) in GlobalData.gModuleHashFile:
-                    HashFile = GlobalData.gModuleHashFile[(Lib.MetaFile.Path, Lib.Arch)]
+                    HashFile = GlobalData.gModuleHashFile[(
+                        Lib.MetaFile.Path, Lib.Arch)]
                 else:
-                    EdkLogger.quiet("[cache error]:No ModuleHashFile for lib: %s[%s]" % (Lib.MetaFile.Path, Lib.Arch))
+                    EdkLogger.quiet("[cache error]:No ModuleHashFile for lib: %s[%s]" % (
+                        Lib.MetaFile.Path, Lib.Arch))
                 if path.exists(LongFilePath(HashFile)):
                     FileList.append(HashFile)
                     m.update(HashFile.encode('utf-8'))
                 else:
-                    EdkLogger.quiet("[cache warning]:No Lib HashFile: %s" % HashFile)
+                    EdkLogger.quiet(
+                        "[cache warning]:No Lib HashFile: %s" % HashFile)
 
         # Save MakeHashFileList
-        FilePath = path.join(self.BuildDir, self.Name + ".MakeHashFileList." + m.hexdigest())
+        FilePath = path.join(self.BuildDir, self.Name +
+                             ".MakeHashFileList." + m.hexdigest())
         try:
             with open(LongFilePath(FilePath), 'w') as f:
                 json.dump(FileList, f, indent=0)
         except:
-            EdkLogger.quiet("[cache warning]: fail to save Make HashFileList: %s" % FilePath)
+            EdkLogger.quiet(
+                "[cache warning]: fail to save Make HashFileList: %s" % FilePath)
 
     def CheckHashChainFile(self, HashChainFile):
         # Assume the HashChainFile basename format is the 'x.hashchain.16BytesHexStr'
@@ -2105,19 +2233,21 @@ class ModuleAutoGen(AutoGen):
         # all hashchain files content
         HashStr = HashChainFile.split('.')[-1]
         if len(HashStr) != 32:
-            EdkLogger.quiet("[cache error]: wrong format HashChainFile:%s" % (File))
+            EdkLogger.quiet(
+                "[cache error]: wrong format HashChainFile:%s" % (File))
             return False
 
         try:
             with open(LongFilePath(HashChainFile), 'r') as f:
                 HashChainList = json.load(f)
         except:
-            EdkLogger.quiet("[cache error]: fail to load HashChainFile: %s" % HashChainFile)
+            EdkLogger.quiet(
+                "[cache error]: fail to load HashChainFile: %s" % HashChainFile)
             return False
 
         # Print the different file info
         # print(HashChainFile)
-        for idx, (SrcFile, SrcHash) in enumerate (HashChainList):
+        for idx, (SrcFile, SrcHash) in enumerate(HashChainList):
             if SrcFile in GlobalData.gFileHashDict:
                 DestHash = GlobalData.gFileHashDict[SrcFile]
             else:
@@ -2129,15 +2259,17 @@ class ModuleAutoGen(AutoGen):
                 except IOError as X:
                     # cache miss if SrcFile is removed in new version code
                     GlobalData.gFileHashDict[SrcFile] = 0
-                    EdkLogger.quiet("[cache insight]: first cache miss file in %s is %s" % (HashChainFile, SrcFile))
+                    EdkLogger.quiet("[cache insight]: first cache miss file in %s is %s" % (
+                        HashChainFile, SrcFile))
                     return False
             if SrcHash != DestHash:
-                EdkLogger.quiet("[cache insight]: first cache miss file in %s is %s" % (HashChainFile, SrcFile))
+                EdkLogger.quiet("[cache insight]: first cache miss file in %s is %s" % (
+                    HashChainFile, SrcFile))
                 return False
 
         return True
 
-    ## Decide whether we can skip the left autogen and make process
+    # Decide whether we can skip the remaining autogen and make process
     def CanSkipbyMakeCache(self):
         # For --binary-source only
         # CanSkipbyMakeCache consume below dicts:
@@ -2155,44 +2287,57 @@ class ModuleAutoGen(AutoGen):
 
         # If Module is binary, which has special build rule, do not skip by cache.
         if self.IsBinaryModule:
-            print("[cache miss]: MakeCache: Skip BinaryModule:", self.MetaFile.Path, self.Arch)
-            GlobalData.gModuleMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = False
+            print("[cache miss]: MakeCache: Skip BinaryModule:",
+                  self.MetaFile.Path, self.Arch)
+            GlobalData.gModuleMakeCacheStatus[(
+                self.MetaFile.Path, self.Arch)] = False
             return False
 
         # see .inc as binary file, do not skip by hash
         for f_ext in self.SourceFileList:
             if '.inc' in str(f_ext):
-                print("[cache miss]: MakeCache: Skip '.inc' File:", self.MetaFile.Path, self.Arch)
-                GlobalData.gModuleMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = False
+                print("[cache miss]: MakeCache: Skip '.inc' File:",
+                      self.MetaFile.Path, self.Arch)
+                GlobalData.gModuleMakeCacheStatus[(
+                    self.MetaFile.Path, self.Arch)] = False
                 return False
 
-        ModuleCacheDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
-        FfsDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, TAB_FV_DIRECTORY, "Ffs", self.Guid + self.Name)
+        ModuleCacheDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir,
+                                   self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
+        FfsDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir,
+                           self.BuildTarget + "_" + self.ToolChain, TAB_FV_DIRECTORY, "Ffs", self.Guid + self.Name)
 
-        ModuleHashPairList = [] # tuple list: [tuple(PreMakefileHash, MakeHash)]
-        ModuleHashPair = path.join(ModuleCacheDir, self.Name + ".ModuleHashPair")
+        # tuple list: [tuple(PreMakefileHash, MakeHash)]
+        ModuleHashPairList = []
+        ModuleHashPair = path.join(
+            ModuleCacheDir, self.Name + ".ModuleHashPair")
         try:
             with open(LongFilePath(ModuleHashPair), 'r') as f:
                 ModuleHashPairList = json.load(f)
         except:
             # ModuleHashPair might not exist for a newly added module
-            GlobalData.gModuleMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = False
-            EdkLogger.quiet("[cache warning]: fail to load ModuleHashPair file: %s" % ModuleHashPair)
+            GlobalData.gModuleMakeCacheStatus[(
+                self.MetaFile.Path, self.Arch)] = False
+            EdkLogger.quiet(
+                "[cache warning]: fail to load ModuleHashPair file: %s" % ModuleHashPair)
             print("[cache miss]: MakeCache:", self.MetaFile.Path, self.Arch)
             return False
 
         # Check the PreMakeHash in ModuleHashPairList one by one
-        for idx, (PreMakefileHash, MakeHash) in enumerate (ModuleHashPairList):
+        for idx, (PreMakefileHash, MakeHash) in enumerate(ModuleHashPairList):
             SourceHashDir = path.join(ModuleCacheDir, MakeHash)
             SourceFfsHashDir = path.join(FfsDir, MakeHash)
-            PreMakeHashFileList_FilePah = path.join(ModuleCacheDir, self.Name + ".PreMakeHashFileList." + PreMakefileHash)
-            MakeHashFileList_FilePah = path.join(ModuleCacheDir, self.Name + ".MakeHashFileList." + MakeHash)
+            PreMakeHashFileList_FilePah = path.join(
+                ModuleCacheDir, self.Name + ".PreMakeHashFileList." + PreMakefileHash)
+            MakeHashFileList_FilePah = path.join(
+                ModuleCacheDir, self.Name + ".MakeHashFileList." + MakeHash)
 
             try:
                 with open(LongFilePath(MakeHashFileList_FilePah), 'r') as f:
                     MakeHashFileList = json.load(f)
             except:
-                EdkLogger.quiet("[cache error]: fail to load MakeHashFileList file: %s" % MakeHashFileList_FilePah)
+                EdkLogger.quiet(
+                    "[cache error]: fail to load MakeHashFileList file: %s" % MakeHashFileList_FilePah)
                 continue
 
             HashMiss = False
@@ -2206,13 +2351,16 @@ class ModuleAutoGen(AutoGen):
                 elif HashChainStatus == True:
                     continue
                 # Convert to path start with cache source dir
-                RelativePath = os.path.relpath(HashChainFile, self.WorkspaceDir)
-                NewFilePath = os.path.join(GlobalData.gBinCacheSource, RelativePath)
+                RelativePath = os.path.relpath(
+                    HashChainFile, self.WorkspaceDir)
+                NewFilePath = os.path.join(
+                    GlobalData.gBinCacheSource, RelativePath)
                 if self.CheckHashChainFile(NewFilePath):
                     GlobalData.gHashChainStatus[HashChainFile] = True
                     # Save the module self HashFile for GenPreMakefileHashList later usage
                     if self.Name + ".hashchain." in HashChainFile:
-                        GlobalData.gModuleHashFile[(self.MetaFile.Path, self.Arch)] = HashChainFile
+                        GlobalData.gModuleHashFile[(
+                            self.MetaFile.Path, self.Arch)] = HashChainFile
                 else:
                     GlobalData.gHashChainStatus[HashChainFile] = False
                     HashMiss = True
@@ -2230,20 +2378,23 @@ class ModuleAutoGen(AutoGen):
                 for root, dir, files in os.walk(SourceFfsHashDir):
                     for f in files:
                         File = path.join(root, f)
-                        self.CacheCopyFile(self.FfsOutputDir, SourceFfsHashDir, File)
+                        self.CacheCopyFile(
+                            self.FfsOutputDir, SourceFfsHashDir, File)
 
             if self.Name == "PcdPeim" or self.Name == "PcdDxe":
                 CreatePcdDatabaseCode(self, TemplateString(), TemplateString())
 
             print("[cache hit]: MakeCache:", self.MetaFile.Path, self.Arch)
-            GlobalData.gModuleMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = True
+            GlobalData.gModuleMakeCacheStatus[(
+                self.MetaFile.Path, self.Arch)] = True
             return True
 
         print("[cache miss]: MakeCache:", self.MetaFile.Path, self.Arch)
-        GlobalData.gModuleMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = False
+        GlobalData.gModuleMakeCacheStatus[(
+            self.MetaFile.Path, self.Arch)] = False
         return False
 
-    ## Decide whether we can skip the left autogen and make process
+    # Decide whether we can skip the remaining autogen and make process
     def CanSkipbyPreMakeCache(self):
         # CanSkipbyPreMakeCache consume below dicts:
         #     gModulePreMakeCacheStatus
@@ -2261,20 +2412,25 @@ class ModuleAutoGen(AutoGen):
 
         # If Module is binary, which has special build rule, do not skip by cache.
         if self.IsBinaryModule:
-            print("[cache miss]: PreMakeCache: Skip BinaryModule:", self.MetaFile.Path, self.Arch)
-            GlobalData.gModulePreMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = False
+            print("[cache miss]: PreMakeCache: Skip BinaryModule:",
+                  self.MetaFile.Path, self.Arch)
+            GlobalData.gModulePreMakeCacheStatus[(
+                self.MetaFile.Path, self.Arch)] = False
             return False
 
         # see .inc as binary file, do not skip by hash
         for f_ext in self.SourceFileList:
             if '.inc' in str(f_ext):
-                print("[cache miss]: PreMakeCache: Skip '.inc' File:", self.MetaFile.Path, self.Arch)
-                GlobalData.gModulePreMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = False
+                print("[cache miss]: PreMakeCache: Skip '.inc' File:",
+                      self.MetaFile.Path, self.Arch)
+                GlobalData.gModulePreMakeCacheStatus[(
+                    self.MetaFile.Path, self.Arch)] = False
                 return False
 
         # For --hash only in the incremental build
         if not GlobalData.gBinCacheSource:
-            Files = [path.join(self.BuildDir, f) for f in os.listdir(self.BuildDir) if path.isfile(path.join(self.BuildDir, f))]
+            Files = [path.join(self.BuildDir, f) for f in os.listdir(
+                self.BuildDir) if path.isfile(path.join(self.BuildDir, f))]
             PreMakeHashFileList_FilePah = None
             MakeTimeStamp = 0
             # Find latest PreMakeHashFileList file in self.BuildDir folder
@@ -2285,16 +2441,20 @@ class ModuleAutoGen(AutoGen):
                         MakeTimeStamp = FileTimeStamp
                         PreMakeHashFileList_FilePah = File
             if not PreMakeHashFileList_FilePah:
-                GlobalData.gModulePreMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = False
+                GlobalData.gModulePreMakeCacheStatus[(
+                    self.MetaFile.Path, self.Arch)] = False
                 return False
 
             try:
                 with open(LongFilePath(PreMakeHashFileList_FilePah), 'r') as f:
                     PreMakeHashFileList = json.load(f)
             except:
-                EdkLogger.quiet("[cache error]: fail to load PreMakeHashFileList file: %s" % PreMakeHashFileList_FilePah)
-                print("[cache miss]: PreMakeCache:", self.MetaFile.Path, self.Arch)
-                GlobalData.gModulePreMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = False
+                EdkLogger.quiet(
+                    "[cache error]: fail to load PreMakeHashFileList file: %s" % PreMakeHashFileList_FilePah)
+                print("[cache miss]: PreMakeCache:",
+                      self.MetaFile.Path, self.Arch)
+                GlobalData.gModulePreMakeCacheStatus[(
+                    self.MetaFile.Path, self.Arch)] = False
                 return False
 
             HashMiss = False
@@ -2311,48 +2471,62 @@ class ModuleAutoGen(AutoGen):
                     GlobalData.gHashChainStatus[HashChainFile] = True
                     # Save the module self HashFile for GenPreMakefileHashList later usage
                     if self.Name + ".hashchain." in HashChainFile:
-                        GlobalData.gModuleHashFile[(self.MetaFile.Path, self.Arch)] = HashChainFile
+                        GlobalData.gModuleHashFile[(
+                            self.MetaFile.Path, self.Arch)] = HashChainFile
                 else:
                     GlobalData.gHashChainStatus[HashChainFile] = False
                     HashMiss = True
                     break
 
             if HashMiss:
-                print("[cache miss]: PreMakeCache:", self.MetaFile.Path, self.Arch)
-                GlobalData.gModulePreMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = False
+                print("[cache miss]: PreMakeCache:",
+                      self.MetaFile.Path, self.Arch)
+                GlobalData.gModulePreMakeCacheStatus[(
+                    self.MetaFile.Path, self.Arch)] = False
                 return False
             else:
-                print("[cache hit]: PreMakeCache:", self.MetaFile.Path, self.Arch)
-                GlobalData.gModulePreMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = True
+                print("[cache hit]: PreMakeCache:",
+                      self.MetaFile.Path, self.Arch)
+                GlobalData.gModulePreMakeCacheStatus[(
+                    self.MetaFile.Path, self.Arch)] = True
                 return True
 
-        ModuleCacheDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
-        FfsDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, TAB_FV_DIRECTORY, "Ffs", self.Guid + self.Name)
+        ModuleCacheDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir,
+                                   self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
+        FfsDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir,
+                           self.BuildTarget + "_" + self.ToolChain, TAB_FV_DIRECTORY, "Ffs", self.Guid + self.Name)
 
-        ModuleHashPairList = [] # tuple list: [tuple(PreMakefileHash, MakeHash)]
-        ModuleHashPair = path.join(ModuleCacheDir, self.Name + ".ModuleHashPair")
+        # tuple list: [tuple(PreMakefileHash, MakeHash)]
+        ModuleHashPairList = []
+        ModuleHashPair = path.join(
+            ModuleCacheDir, self.Name + ".ModuleHashPair")
         try:
             with open(LongFilePath(ModuleHashPair), 'r') as f:
                 ModuleHashPairList = json.load(f)
         except:
             # ModuleHashPair might not exist for a newly added module
-            GlobalData.gModulePreMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = False
-            EdkLogger.quiet("[cache warning]: fail to load ModuleHashPair file: %s" % ModuleHashPair)
+            GlobalData.gModulePreMakeCacheStatus[(
+                self.MetaFile.Path, self.Arch)] = False
+            EdkLogger.quiet(
+                "[cache warning]: fail to load ModuleHashPair file: %s" % ModuleHashPair)
             print("[cache miss]: PreMakeCache:", self.MetaFile.Path, self.Arch)
             return False
 
         # Check the PreMakeHash in ModuleHashPairList one by one
-        for idx, (PreMakefileHash, MakeHash) in enumerate (ModuleHashPairList):
+        for idx, (PreMakefileHash, MakeHash) in enumerate(ModuleHashPairList):
             SourceHashDir = path.join(ModuleCacheDir, MakeHash)
             SourceFfsHashDir = path.join(FfsDir, MakeHash)
-            PreMakeHashFileList_FilePah = path.join(ModuleCacheDir, self.Name + ".PreMakeHashFileList." + PreMakefileHash)
-            MakeHashFileList_FilePah = path.join(ModuleCacheDir, self.Name + ".MakeHashFileList." + MakeHash)
+            PreMakeHashFileList_FilePah = path.join(
+                ModuleCacheDir, self.Name + ".PreMakeHashFileList." + PreMakefileHash)
+            MakeHashFileList_FilePah = path.join(
+                ModuleCacheDir, self.Name + ".MakeHashFileList." + MakeHash)
 
             try:
                 with open(LongFilePath(PreMakeHashFileList_FilePah), 'r') as f:
                     PreMakeHashFileList = json.load(f)
             except:
-                EdkLogger.quiet("[cache error]: fail to load PreMakeHashFileList file: %s" % PreMakeHashFileList_FilePah)
+                EdkLogger.quiet(
+                    "[cache error]: fail to load PreMakeHashFileList file: %s" % PreMakeHashFileList_FilePah)
                 continue
 
             HashMiss = False
@@ -2366,8 +2540,10 @@ class ModuleAutoGen(AutoGen):
                 elif HashChainStatus == True:
                     continue
                 # Convert to path start with cache source dir
-                RelativePath = os.path.relpath(HashChainFile, self.WorkspaceDir)
-                NewFilePath = os.path.join(GlobalData.gBinCacheSource, RelativePath)
+                RelativePath = os.path.relpath(
+                    HashChainFile, self.WorkspaceDir)
+                NewFilePath = os.path.join(
+                    GlobalData.gBinCacheSource, RelativePath)
                 if self.CheckHashChainFile(NewFilePath):
                     GlobalData.gHashChainStatus[HashChainFile] = True
                 else:
@@ -2387,20 +2563,23 @@ class ModuleAutoGen(AutoGen):
                 for root, dir, files in os.walk(SourceFfsHashDir):
                     for f in files:
                         File = path.join(root, f)
-                        self.CacheCopyFile(self.FfsOutputDir, SourceFfsHashDir, File)
+                        self.CacheCopyFile(
+                            self.FfsOutputDir, SourceFfsHashDir, File)
 
             if self.Name == "PcdPeim" or self.Name == "PcdDxe":
                 CreatePcdDatabaseCode(self, TemplateString(), TemplateString())
 
             print("[cache hit]: PreMakeCache:", self.MetaFile.Path, self.Arch)
-            GlobalData.gModulePreMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = True
+            GlobalData.gModulePreMakeCacheStatus[(
+                self.MetaFile.Path, self.Arch)] = True
             return True
 
         print("[cache miss]: PreMakeCache:", self.MetaFile.Path, self.Arch)
-        GlobalData.gModulePreMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = False
+        GlobalData.gModulePreMakeCacheStatus[(
+            self.MetaFile.Path, self.Arch)] = False
         return False
 
-    ## Decide whether we can skip the Module build
+    # Decide whether we can skip the Module build
     def CanSkipbyCache(self, gHitSet):
         # Hashing feature is off
         if not GlobalData.gBinCacheSource:
@@ -2411,7 +2590,7 @@ class ModuleAutoGen(AutoGen):
 
         return False
 
-    ## Decide whether we can skip the ModuleAutoGen process
+    # Decide whether we can skip the ModuleAutoGen process
     #  If any source file is newer than the module, then we cannot skip
     #
     def CanSkip(self):
@@ -2422,19 +2601,19 @@ class ModuleAutoGen(AutoGen):
             return True
         if not os.path.exists(self.TimeStampPath):
             return False
-        #last creation time of the module
+        # last creation time of the module
         DstTimeStamp = os.stat(self.TimeStampPath)[8]
 
         SrcTimeStamp = self.Workspace._SrcTimeStamp
         if SrcTimeStamp > DstTimeStamp:
             return False
 
-        with open(self.TimeStampPath,'r') as f:
+        with open(self.TimeStampPath, 'r') as f:
             for source in f:
                 source = source.rstrip('\n')
                 if not os.path.exists(source):
                     return False
-                if source not in ModuleAutoGen.TimeDict :
+                if source not in ModuleAutoGen.TimeDict:
                     ModuleAutoGen.TimeDict[source] = os.stat(source)[8]
                 if ModuleAutoGen.TimeDict[source] > DstTimeStamp:
                     return False
diff --git a/BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py b/BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py
index 036fdac3d7df..1fa3805d2480 100644
--- a/BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py
+++ b/BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py
@@ -1,21 +1,21 @@
-## @file
+# @file
 # Create makefile for MS nmake and GNU make
 #
 # Copyright (c) 2019 - 2021, Intel Corporation. All rights reserved.<BR>
 # SPDX-License-Identifier: BSD-2-Clause-Patent
 #
 from __future__ import absolute_import
-from Workspace.WorkspaceDatabase import WorkspaceDatabase,BuildDB
+from Workspace.WorkspaceDatabase import WorkspaceDatabase, BuildDB
 from Common.caching import cached_property
-from AutoGen.BuildEngine import BuildRule,AutoGenReqBuildRuleVerNum
+from AutoGen.BuildEngine import BuildRule, AutoGenReqBuildRuleVerNum
 from AutoGen.AutoGen import CalculatePriorityValue
-from Common.Misc import CheckPcdDatum,GuidValue
+from Common.Misc import CheckPcdDatum, GuidValue
 from Common.Expression import ValueExpressionEx
 from Common.DataType import *
 from CommonDataClass.Exceptions import *
 from CommonDataClass.CommonClass import SkuInfoClass
 import Common.EdkLogger as EdkLogger
-from Common.BuildToolError import OPTION_CONFLICT,FORMAT_INVALID,RESOURCE_NOT_AVAILABLE
+from Common.BuildToolError import OPTION_CONFLICT, FORMAT_INVALID, RESOURCE_NOT_AVAILABLE
 from Common.MultipleWorkspace import MultipleWorkspace as mws
 from collections import defaultdict
 from Common.Misc import PathClass
@@ -25,31 +25,34 @@ import os
 #
 # The priority list while override build option
 #
-PrioList = {"0x11111"  : 16,     #  TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE (Highest)
-            "0x01111"  : 15,     #  ******_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
-            "0x10111"  : 14,     #  TARGET_*********_ARCH_COMMANDTYPE_ATTRIBUTE
-            "0x00111"  : 13,     #  ******_*********_ARCH_COMMANDTYPE_ATTRIBUTE
-            "0x11011"  : 12,     #  TARGET_TOOLCHAIN_****_COMMANDTYPE_ATTRIBUTE
-            "0x01011"  : 11,     #  ******_TOOLCHAIN_****_COMMANDTYPE_ATTRIBUTE
-            "0x10011"  : 10,     #  TARGET_*********_****_COMMANDTYPE_ATTRIBUTE
-            "0x00011"  : 9,      #  ******_*********_****_COMMANDTYPE_ATTRIBUTE
-            "0x11101"  : 8,      #  TARGET_TOOLCHAIN_ARCH_***********_ATTRIBUTE
-            "0x01101"  : 7,      #  ******_TOOLCHAIN_ARCH_***********_ATTRIBUTE
-            "0x10101"  : 6,      #  TARGET_*********_ARCH_***********_ATTRIBUTE
-            "0x00101"  : 5,      #  ******_*********_ARCH_***********_ATTRIBUTE
-            "0x11001"  : 4,      #  TARGET_TOOLCHAIN_****_***********_ATTRIBUTE
-            "0x01001"  : 3,      #  ******_TOOLCHAIN_****_***********_ATTRIBUTE
-            "0x10001"  : 2,      #  TARGET_*********_****_***********_ATTRIBUTE
-            "0x00001"  : 1}      #  ******_*********_****_***********_ATTRIBUTE (Lowest)
-## Base class for AutoGen
+PrioList = {"0x11111": 16,  # TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE (Highest)
+            "0x01111": 15,  # ******_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
+            "0x10111": 14,  # TARGET_*********_ARCH_COMMANDTYPE_ATTRIBUTE
+            "0x00111": 13,  # ******_*********_ARCH_COMMANDTYPE_ATTRIBUTE
+            "0x11011": 12,  # TARGET_TOOLCHAIN_****_COMMANDTYPE_ATTRIBUTE
+            "0x01011": 11,  # ******_TOOLCHAIN_****_COMMANDTYPE_ATTRIBUTE
+            "0x10011": 10,  # TARGET_*********_****_COMMANDTYPE_ATTRIBUTE
+            "0x00011": 9,  # ******_*********_****_COMMANDTYPE_ATTRIBUTE
+            "0x11101": 8,  # TARGET_TOOLCHAIN_ARCH_***********_ATTRIBUTE
+            "0x01101": 7,  # ******_TOOLCHAIN_ARCH_***********_ATTRIBUTE
+            "0x10101": 6,  # TARGET_*********_ARCH_***********_ATTRIBUTE
+            "0x00101": 5,  # ******_*********_ARCH_***********_ATTRIBUTE
+            "0x11001": 4,  # TARGET_TOOLCHAIN_****_***********_ATTRIBUTE
+            "0x01001": 3,  # ******_TOOLCHAIN_****_***********_ATTRIBUTE
+            "0x10001": 2,  # TARGET_*********_****_***********_ATTRIBUTE
+            "0x00001": 1}  # ******_*********_****_***********_ATTRIBUTE (Lowest)
+# Base class for AutoGen
 #
 #   This class just implements the cache mechanism of AutoGen objects.
 #
+
+
 class AutoGenInfo(object):
     # database to maintain the objects in each child class
-    __ObjectCache = {}    # (BuildTarget, ToolChain, ARCH, platform file): AutoGen object
+    # (BuildTarget, ToolChain, ARCH, platform file): AutoGen object
+    __ObjectCache = {}
 
-    ## Factory method
+    # Factory method
     #
     #   @param  Class           class object of real AutoGen class
     #                           (WorkspaceAutoGen, ModuleAutoGen or PlatformAutoGen)
@@ -64,6 +67,7 @@ class AutoGenInfo(object):
     @classmethod
     def GetCache(cls):
         return cls.__ObjectCache
+
     def __new__(cls, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
         # check if the object has been created
         Key = (Target, Toolchain, Arch, MetaFile)
@@ -74,17 +78,17 @@ class AutoGenInfo(object):
         RetVal = cls.__ObjectCache[Key] = super(AutoGenInfo, cls).__new__(cls)
         return RetVal
 
-
-    ## hash() operator
+    # hash() operator
     #
     #  The file path of platform file will be used to represent hash value of this object
     #
     #   @retval int     Hash value of the file path of platform file
     #
+
     def __hash__(self):
         return hash(self.MetaFile)
 
-    ## str() operator
+    # str() operator
     #
     #  The file path of platform file will be used to represent this object
     #
@@ -93,11 +97,11 @@ class AutoGenInfo(object):
     def __str__(self):
         return str(self.MetaFile)
 
-    ## "==" operator
+    # "==" operator
     def __eq__(self, Other):
         return Other and self.MetaFile == Other
 
-    ## Expand * in build option key
+    # Expand * in build option key
     #
     #   @param  Options     Options to be expanded
     #   @param  ToolDef     Use specified ToolDef instead of full version.
@@ -111,7 +115,7 @@ class AutoGenInfo(object):
         if not ToolDef:
             ToolDef = self.ToolDefinition
         BuildOptions = {}
-        FamilyMatch  = False
+        FamilyMatch = False
         FamilyIsNull = True
 
         OverrideList = {}
@@ -124,12 +128,12 @@ class AutoGenInfo(object):
             # Key[1] -- TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
             #
             if (Key[0] == self.BuildRuleFamily and
-                (ModuleStyle is None or len(Key) < 3 or (len(Key) > 2 and Key[2] == ModuleStyle))):
+                    (ModuleStyle is None or len(Key) < 3 or (len(Key) > 2 and Key[2] == ModuleStyle))):
                 Target, ToolChain, Arch, CommandType, Attr = Key[1].split('_')
                 if (Target == self.BuildTarget or Target == TAB_STAR) and\
                     (ToolChain == self.ToolChain or ToolChain == TAB_STAR) and\
                     (Arch == self.Arch or Arch == TAB_STAR) and\
-                    Options[Key].startswith("="):
+                        Options[Key].startswith("="):
 
                     if OverrideList.get(Key[1]) is not None:
                         OverrideList.pop(Key[1])
@@ -142,18 +146,20 @@ class AutoGenInfo(object):
             KeyList = list(OverrideList.keys())
             for Index in range(len(KeyList)):
                 NowKey = KeyList[Index]
-                Target1, ToolChain1, Arch1, CommandType1, Attr1 = NowKey.split("_")
+                Target1, ToolChain1, Arch1, CommandType1, Attr1 = NowKey.split(
+                    "_")
                 for Index1 in range(len(KeyList) - Index - 1):
                     NextKey = KeyList[Index1 + Index + 1]
                     #
                     # Compare two Key, if one is included by another, choose the higher priority one
                     #
-                    Target2, ToolChain2, Arch2, CommandType2, Attr2 = NextKey.split("_")
+                    Target2, ToolChain2, Arch2, CommandType2, Attr2 = NextKey.split(
+                        "_")
                     if (Target1 == Target2 or Target1 == TAB_STAR or Target2 == TAB_STAR) and\
                         (ToolChain1 == ToolChain2 or ToolChain1 == TAB_STAR or ToolChain2 == TAB_STAR) and\
                         (Arch1 == Arch2 or Arch1 == TAB_STAR or Arch2 == TAB_STAR) and\
                         (CommandType1 == CommandType2 or CommandType1 == TAB_STAR or CommandType2 == TAB_STAR) and\
-                        (Attr1 == Attr2 or Attr1 == TAB_STAR or Attr2 == TAB_STAR):
+                            (Attr1 == Attr2 or Attr1 == TAB_STAR or Attr2 == TAB_STAR):
 
                         if CalculatePriorityValue(NowKey) > CalculatePriorityValue(NextKey):
                             if Options.get((self.BuildRuleFamily, NextKey)) is not None:
@@ -163,7 +169,7 @@ class AutoGenInfo(object):
                                 Options.pop((self.BuildRuleFamily, NowKey))
 
         for Key in Options:
-            if ModuleStyle is not None and len (Key) > 2:
+            if ModuleStyle is not None and len(Key) > 2:
                 # Check Module style is EDK or EDKII.
                 # Only append build option for the matched style module.
                 if ModuleStyle == EDK_NAME and Key[2] != EDK_NAME:
@@ -208,7 +214,7 @@ class AutoGenInfo(object):
             return BuildOptions
 
         for Key in Options:
-            if ModuleStyle is not None and len (Key) > 2:
+            if ModuleStyle is not None and len(Key) > 2:
                 # Check Module style is EDK or EDKII.
                 # Only append build option for the matched style module.
                 if ModuleStyle == EDK_NAME and Key[2] != EDK_NAME:
@@ -249,14 +255,17 @@ class AutoGenInfo(object):
                                 BuildOptions[Tool][Attr] = Options[Key]
         return BuildOptions
 #
-#This class is the pruned WorkSpaceAutoGen for ModuleAutoGen in multiple thread
+# This class is the pruned WorkSpaceAutoGen for ModuleAutoGen in multiple thread
 #
+
+
 class WorkSpaceInfo(AutoGenInfo):
-    def __init__(self,Workspace, MetaFile, Target, ToolChain, Arch):
+    def __init__(self, Workspace, MetaFile, Target, ToolChain, Arch):
         if not hasattr(self, "_Init"):
             self.do_init(Workspace, MetaFile, Target, ToolChain, Arch)
             self._Init = True
-    def do_init(self,Workspace, MetaFile, Target, ToolChain, Arch):
+
+    def do_init(self, Workspace, MetaFile, Target, ToolChain, Arch):
         self._SrcTimeStamp = 0
         self.Db = BuildDB
         self.BuildDatabase = self.Db.BuildObject
@@ -266,6 +275,7 @@ class WorkSpaceInfo(AutoGenInfo):
         self.ActivePlatform = MetaFile
         self.ArchList = Arch
         self.AutoGenObjectList = []
+
     @property
     def BuildDir(self):
         return self.AutoGenObjectList[0].BuildDir
@@ -277,9 +287,11 @@ class WorkSpaceInfo(AutoGenInfo):
     @property
     def FlashDefinition(self):
         return self.AutoGenObjectList[0].Platform.FlashDefinition
+
     @property
     def GenFdsCommandDict(self):
-        FdsCommandDict = self.AutoGenObjectList[0].DataPipe.Get("FdsCommandDict")
+        FdsCommandDict = self.AutoGenObjectList[0].DataPipe.Get(
+            "FdsCommandDict")
         if FdsCommandDict:
             return FdsCommandDict
         return {}
@@ -288,12 +300,15 @@ class WorkSpaceInfo(AutoGenInfo):
     def FvDir(self):
         return os.path.join(self.BuildDir, TAB_FV_DIRECTORY)
 
+
 class PlatformInfo(AutoGenInfo):
-    def __init__(self, Workspace, MetaFile, Target, ToolChain, Arch,DataPipe):
+    def __init__(self, Workspace, MetaFile, Target, ToolChain, Arch, DataPipe):
         if not hasattr(self, "_Init"):
-            self.do_init(Workspace, MetaFile, Target, ToolChain, Arch,DataPipe)
+            self.do_init(Workspace, MetaFile, Target,
+                         ToolChain, Arch, DataPipe)
             self._Init = True
-    def do_init(self,Workspace, MetaFile, Target, ToolChain, Arch,DataPipe):
+
+    def do_init(self, Workspace, MetaFile, Target, ToolChain, Arch, DataPipe):
         self.Wa = Workspace
         self.WorkspaceDir = self.Wa.WorkspaceDir
         self.MetaFile = MetaFile
@@ -301,10 +316,12 @@ class PlatformInfo(AutoGenInfo):
         self.Target = Target
         self.BuildTarget = Target
         self.ToolChain = ToolChain
-        self.Platform = self.Wa.BuildDatabase[self.MetaFile, self.Arch, self.Target, self.ToolChain]
+        self.Platform = self.Wa.BuildDatabase[self.MetaFile,
+                                              self.Arch, self.Target, self.ToolChain]
 
         self.SourceDir = MetaFile.SubDir
         self.DataPipe = DataPipe
+
     @cached_property
     def _AsBuildModuleList(self):
         retVal = self.DataPipe.Get("AsBuildModuleList")
@@ -312,7 +329,7 @@ class PlatformInfo(AutoGenInfo):
             retVal = {}
         return retVal
 
-    ## Test if a module is supported by the platform
+    # Test if a module is supported by the platform
     #
     #  An error will be raised directly if the module or its arch is not supported
     #  by the platform or current configuration
@@ -342,47 +359,48 @@ class PlatformInfo(AutoGenInfo):
     @cached_property
     def PackageList(self):
         RetVal = set()
-        for dec_file,Arch in self.DataPipe.Get("PackageList"):
-            RetVal.add(self.Wa.BuildDatabase[dec_file,Arch,self.BuildTarget, self.ToolChain])
+        for dec_file, Arch in self.DataPipe.Get("PackageList"):
+            RetVal.add(
+                self.Wa.BuildDatabase[dec_file, Arch, self.BuildTarget, self.ToolChain])
         return list(RetVal)
 
-    ## Return the directory to store all intermediate and final files built
+    # Return the directory to store all intermediate and final files built
     @cached_property
     def BuildDir(self):
         if os.path.isabs(self.OutputDir):
             RetVal = os.path.join(
-                                os.path.abspath(self.OutputDir),
-                                self.Target + "_" + self.ToolChain,
-                                )
+                os.path.abspath(self.OutputDir),
+                self.Target + "_" + self.ToolChain,
+            )
         else:
             RetVal = os.path.join(
-                                self.WorkspaceDir,
-                                self.OutputDir,
-                                self.Target + "_" + self.ToolChain,
-                                )
+                self.WorkspaceDir,
+                self.OutputDir,
+                self.Target + "_" + self.ToolChain,
+            )
         return RetVal
 
-    ## Return the build output directory platform specifies
+    # Return the build output directory platform specifies
     @cached_property
     def OutputDir(self):
         return self.Platform.OutputDirectory
 
-    ## Return platform name
+    # Return platform name
     @cached_property
     def Name(self):
         return self.Platform.PlatformName
 
-    ## Return meta-file GUID
+    # Return meta-file GUID
     @cached_property
     def Guid(self):
         return self.Platform.Guid
 
-    ## Return platform version
+    # Return platform version
     @cached_property
     def Version(self):
         return self.Platform.Version
 
-    ## Return paths of tools
+    # Return paths of tools
     @cached_property
     def ToolDefinition(self):
         retVal = self.DataPipe.Get("TOOLDEF")
@@ -390,7 +408,7 @@ class PlatformInfo(AutoGenInfo):
             retVal = {}
         return retVal
 
-    ## Return build command string
+    # Return build command string
     #
     #   @retval     string  Build command string
     #
@@ -408,7 +426,7 @@ class PlatformInfo(AutoGenInfo):
             retVal = {}
         return retVal
 
-    ## Override PCD setting (type, value, ...)
+    # Override PCD setting (type, value, ...)
     #
     #   @param  ToPcd       The PCD to be overridden
     #   @param  FromPcd     The PCD overriding from
@@ -428,15 +446,16 @@ class PlatformInfo(AutoGenInfo):
             if ToPcd.Pending and FromPcd.Type:
                 ToPcd.Type = FromPcd.Type
             elif ToPcd.Type and FromPcd.Type\
-                and ToPcd.Type != FromPcd.Type and ToPcd.Type in FromPcd.Type:
+                    and ToPcd.Type != FromPcd.Type and ToPcd.Type in FromPcd.Type:
                 if ToPcd.Type.strip() == TAB_PCDS_DYNAMIC_EX:
                     ToPcd.Type = FromPcd.Type
             elif ToPcd.Type and FromPcd.Type \
-                and ToPcd.Type != FromPcd.Type:
+                    and ToPcd.Type != FromPcd.Type:
                 if Library:
-                    Module = str(Module) + " 's library file (" + str(Library) + ")"
+                    Module = str(Module) + \
+                        " 's library file (" + str(Library) + ")"
                 EdkLogger.error("build", OPTION_CONFLICT, "Mismatched PCD type",
-                                ExtraData="%s.%s is used as [%s] in module %s, but as [%s] in %s."\
+                                ExtraData="%s.%s is used as [%s] in module %s, but as [%s] in %s."
                                           % (ToPcd.TokenSpaceGuidCName, TokenCName,
                                              ToPcd.Type, Module, FromPcd.Type, Msg),
                                           File=self.MetaFile)
@@ -457,10 +476,11 @@ class PlatformInfo(AutoGenInfo):
             # Add Flexible PCD format parse
             if ToPcd.DefaultValue:
                 try:
-                    ToPcd.DefaultValue = ValueExpressionEx(ToPcd.DefaultValue, ToPcd.DatumType, self._GuidDict)(True)
+                    ToPcd.DefaultValue = ValueExpressionEx(
+                        ToPcd.DefaultValue, ToPcd.DatumType, self._GuidDict)(True)
                 except BadExpression as Value:
-                    EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s", %s' %(ToPcd.TokenSpaceGuidCName, ToPcd.TokenCName, ToPcd.DefaultValue, Value),
-                                        File=self.MetaFile)
+                    EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s", %s' % (ToPcd.TokenSpaceGuidCName, ToPcd.TokenCName, ToPcd.DefaultValue, Value),
+                                    File=self.MetaFile)
 
             # check the validation of datum
             IsValid, Cause = CheckPcdDatum(ToPcd.DatumType, ToPcd.DefaultValue)
@@ -473,7 +493,7 @@ class PlatformInfo(AutoGenInfo):
             ToPcd.CustomAttribute = FromPcd.CustomAttribute
 
         if FromPcd is not None and ToPcd.DatumType == TAB_VOID and not ToPcd.MaxDatumSize:
-            EdkLogger.debug(EdkLogger.DEBUG_9, "No MaxDatumSize specified for PCD %s.%s" \
+            EdkLogger.debug(EdkLogger.DEBUG_9, "No MaxDatumSize specified for PCD %s.%s"
                             % (ToPcd.TokenSpaceGuidCName, TokenCName))
             Value = ToPcd.DefaultValue
             if not Value:
@@ -487,18 +507,19 @@ class PlatformInfo(AutoGenInfo):
 
         # apply default SKU for dynamic PCDS if specified one is not available
         if (ToPcd.Type in PCD_DYNAMIC_TYPE_SET or ToPcd.Type in PCD_DYNAMIC_EX_TYPE_SET) \
-            and not ToPcd.SkuInfoList:
+                and not ToPcd.SkuInfoList:
             if self.Platform.SkuName in self.Platform.SkuIds:
                 SkuName = self.Platform.SkuName
             else:
                 SkuName = TAB_DEFAULT
             ToPcd.SkuInfoList = {
-                SkuName : SkuInfoClass(SkuName, self.Platform.SkuIds[SkuName][0], '', '', '', '', '', ToPcd.DefaultValue)
+                SkuName: SkuInfoClass(
+                    SkuName, self.Platform.SkuIds[SkuName][0], '', '', '', '', '', ToPcd.DefaultValue)
             }
 
     def ApplyPcdSetting(self, Ma, Pcds, Library=""):
         # for each PCD in module
-        Module=Ma.Module
+        Module = Ma.Module
         for Name, Guid in Pcds:
             PcdInModule = Pcds[Name, Guid]
             # find out the PCD setting in platform
@@ -507,34 +528,40 @@ class PlatformInfo(AutoGenInfo):
             else:
                 PcdInPlatform = None
             # then override the settings if any
-            self._OverridePcd(PcdInModule, PcdInPlatform, Module, Msg="DSC PCD sections", Library=Library)
+            self._OverridePcd(PcdInModule, PcdInPlatform,
+                              Module, Msg="DSC PCD sections", Library=Library)
             # resolve the VariableGuid value
             for SkuId in PcdInModule.SkuInfoList:
                 Sku = PcdInModule.SkuInfoList[SkuId]
-                if Sku.VariableGuid == '': continue
-                Sku.VariableGuidValue = GuidValue(Sku.VariableGuid, self.PackageList, self.MetaFile.Path)
+                if Sku.VariableGuid == '':
+                    continue
+                Sku.VariableGuidValue = GuidValue(
+                    Sku.VariableGuid, self.PackageList, self.MetaFile.Path)
                 if Sku.VariableGuidValue is None:
                     PackageList = "\n\t".join(str(P) for P in self.PackageList)
                     EdkLogger.error(
-                                'build',
-                                RESOURCE_NOT_AVAILABLE,
-                                "Value of GUID [%s] is not found in" % Sku.VariableGuid,
-                                ExtraData=PackageList + "\n\t(used with %s.%s from module %s)" \
-                                                        % (Guid, Name, str(Module)),
-                                File=self.MetaFile
-                                )
+                        'build',
+                        RESOURCE_NOT_AVAILABLE,
+                        "Value of GUID [%s] is not found in" % Sku.VariableGuid,
+                        ExtraData=PackageList +
+                        "\n\t(used with %s.%s from module %s)"
+                        % (Guid, Name, str(Module)),
+                        File=self.MetaFile
+                    )
 
         # override PCD settings with module specific setting
         ModuleScopePcds = self.DataPipe.Get("MOL_PCDS")
         if Module in self.Platform.Modules:
             PlatformModule = self.Platform.Modules[str(Module)]
-            PCD_DATA = ModuleScopePcds.get(Ma.Guid,{})
-            mPcds = {(pcd.TokenCName,pcd.TokenSpaceGuidCName): pcd for pcd in PCD_DATA}
-            for Key  in mPcds:
+            PCD_DATA = ModuleScopePcds.get(Ma.Guid, {})
+            mPcds = {(pcd.TokenCName, pcd.TokenSpaceGuidCName)
+                      : pcd for pcd in PCD_DATA}
+            for Key in mPcds:
                 if self.BuildOptionPcd:
                     for pcd in self.BuildOptionPcd:
-                        (TokenSpaceGuidCName, TokenCName, FieldName, pcdvalue, _) = pcd
-                        if (TokenCName, TokenSpaceGuidCName) == Key and FieldName =="":
+                        (TokenSpaceGuidCName, TokenCName,
+                         FieldName, pcdvalue, _) = pcd
+                        if (TokenCName, TokenSpaceGuidCName) == Key and FieldName == "":
                             PlatformModule.Pcds[Key].DefaultValue = pcdvalue
                             PlatformModule.Pcds[Key].PcdValueFromComm = pcdvalue
                             break
@@ -549,7 +576,8 @@ class PlatformInfo(AutoGenInfo):
                             Flag = True
                             break
                 if Flag:
-                    self._OverridePcd(ToPcd, mPcds[Key], Module, Msg="DSC Components Module scoped PCD section", Library=Library)
+                    self._OverridePcd(
+                        ToPcd, mPcds[Key], Module, Msg="DSC Components Module scoped PCD section", Library=Library)
         # use PCD value to calculate the MaxDatumSize when it is not specified
         for Name, Guid in Pcds:
             Pcd = Pcds[Name, Guid]
@@ -572,9 +600,9 @@ class PlatformInfo(AutoGenInfo):
 #         for pcd in PlatformPcdData:
 #             for skuid in pcd.SkuInfoList:
 #                 pcd.SkuInfoList[skuid] = self.CreateSkuInfoFromDict(pcd.SkuInfoList[skuid])
-        return {(pcddata.TokenCName,pcddata.TokenSpaceGuidCName):pcddata for pcddata in PlatformPcdData}
+        return {(pcddata.TokenCName, pcddata.TokenSpaceGuidCName): pcddata for pcddata in PlatformPcdData}
 
-    def CreateSkuInfoFromDict(self,SkuInfoDict):
+    def CreateSkuInfoFromDict(self, SkuInfoDict):
         return SkuInfoClass(
             SkuInfoDict.get("SkuIdName"),
             SkuInfoDict.get("SkuId"),
@@ -585,27 +613,33 @@ class PlatformInfo(AutoGenInfo):
             SkuInfoDict.get("VpdOffset"),
             SkuInfoDict.get("DefaultValue"),
             SkuInfoDict.get("VariableGuidValue"),
-            SkuInfoDict.get("VariableAttribute",""),
-            SkuInfoDict.get("DefaultStore",None)
-            )
+            SkuInfoDict.get("VariableAttribute", ""),
+            SkuInfoDict.get("DefaultStore", None)
+        )
+
     @cached_property
     def MixedPcd(self):
         return self.DataPipe.Get("MixedPcd")
+
     @cached_property
     def _GuidDict(self):
         RetVal = self.DataPipe.Get("GuidDict")
         if RetVal is None:
             RetVal = {}
         return RetVal
+
     @cached_property
     def BuildOptionPcd(self):
         return self.DataPipe.Get("BuildOptPcd")
-    def ApplyBuildOption(self,module):
+
+    def ApplyBuildOption(self, module):
         PlatformOptions = self.DataPipe.Get("PLA_BO")
         ModuleBuildOptions = self.DataPipe.Get("MOL_BO")
-        ModuleOptionFromDsc = ModuleBuildOptions.get((module.MetaFile.File,module.MetaFile.Root))
+        ModuleOptionFromDsc = ModuleBuildOptions.get(
+            (module.MetaFile.File, module.MetaFile.Root))
         if ModuleOptionFromDsc:
-            ModuleTypeOptions, PlatformModuleOptions = ModuleOptionFromDsc["ModuleTypeOptions"],ModuleOptionFromDsc["PlatformModuleOptions"]
+            ModuleTypeOptions, PlatformModuleOptions = ModuleOptionFromDsc[
+                "ModuleTypeOptions"], ModuleOptionFromDsc["PlatformModuleOptions"]
         else:
             ModuleTypeOptions, PlatformModuleOptions = {}, {}
         ToolDefinition = self.DataPipe.Get("TOOLDEF")
@@ -639,29 +673,34 @@ class PlatformInfo(AutoGenInfo):
                     for ExpandedTool in ToolList:
                         # check if override is indicated
                         if Value.startswith('='):
-                            BuildOptions[ExpandedTool][Attr] = mws.handleWsMacro(Value[1:])
+                            BuildOptions[ExpandedTool][Attr] = mws.handleWsMacro(
+                                Value[1:])
                         else:
                             if Attr != 'PATH':
-                                BuildOptions[ExpandedTool][Attr] += " " + mws.handleWsMacro(Value)
+                                BuildOptions[ExpandedTool][Attr] += " " + \
+                                    mws.handleWsMacro(Value)
                             else:
-                                BuildOptions[ExpandedTool][Attr] = mws.handleWsMacro(Value)
+                                BuildOptions[ExpandedTool][Attr] = mws.handleWsMacro(
+                                    Value)
 
         return BuildOptions, BuildRuleOrder
 
-    def ApplyLibraryInstance(self,module):
+    def ApplyLibraryInstance(self, module):
         alldeps = self.DataPipe.Get("DEPS")
         if alldeps is None:
             alldeps = {}
-        mod_libs = alldeps.get((module.MetaFile.File,module.MetaFile.Root,module.Arch,module.MetaFile.Path),[])
+        mod_libs = alldeps.get(
+            (module.MetaFile.File, module.MetaFile.Root, module.Arch, module.MetaFile.Path), [])
         retVal = []
-        for (file_path,root,arch,abs_path) in mod_libs:
-            libMetaFile = PathClass(file_path,root)
-            libMetaFile.OriginalPath = PathClass(file_path,root)
+        for (file_path, root, arch, abs_path) in mod_libs:
+            libMetaFile = PathClass(file_path, root)
+            libMetaFile.OriginalPath = PathClass(file_path, root)
             libMetaFile.Path = abs_path
-            retVal.append(self.Wa.BuildDatabase[libMetaFile, arch, self.Target,self.ToolChain])
+            retVal.append(
+                self.Wa.BuildDatabase[libMetaFile, arch, self.Target, self.ToolChain])
         return retVal
 
-    ## Parse build_rule.txt in Conf Directory.
+    # Parse build_rule.txt in Conf Directory.
     #
     #   @retval     BuildRule object
     #
diff --git a/BaseTools/Source/Python/AutoGen/PlatformAutoGen.py b/BaseTools/Source/Python/AutoGen/PlatformAutoGen.py
index 592d4824a4b3..292466053c9c 100644
--- a/BaseTools/Source/Python/AutoGen/PlatformAutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/PlatformAutoGen.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Create makefile for MS nmake and GNU make
 #
 # Copyright (c) 2019 - 2021, Intel Corporation. All rights reserved.<BR>
@@ -6,7 +6,7 @@
 # SPDX-License-Identifier: BSD-2-Clause-Patent
 #
 
-## Import Modules
+# Import Modules
 #
 from __future__ import print_function
 from __future__ import absolute_import
@@ -14,7 +14,7 @@ import os.path as path
 import copy
 from collections import defaultdict
 
-from .BuildEngine import BuildRule,gDefaultBuildRuleFile,AutoGenReqBuildRuleVerNum
+from .BuildEngine import BuildRule, gDefaultBuildRuleFile, AutoGenReqBuildRuleVerNum
 from .GenVar import VariableMgr, var_info
 from . import GenMake
 from AutoGen.DataPipe import MemoryDataPipe
@@ -25,17 +25,19 @@ from Workspace.WorkspaceCommon import GetModuleLibInstances
 from CommonDataClass.CommonClass import SkuInfoClass
 from Common.caching import cached_class_function
 from Common.Expression import ValueExpressionEx
-from Common.StringUtils import StringToArray,NormPath
+from Common.StringUtils import StringToArray, NormPath
 from Common.BuildToolError import *
 from Common.DataType import *
 from Common.Misc import *
 import Common.VpdInfoFile as VpdInfoFile
 
-## Split command line option string to list
+# Split command line option string to list
 #
 # subprocess.Popen needs the args to be a sequence. Otherwise there's problem
 # in non-windows platform to launch command
 #
+
+
 def _SplitOption(OptionString):
     OptionList = []
     LastChar = " "
@@ -60,11 +62,13 @@ def _SplitOption(OptionString):
     OptionList.append(OptionString[OptionStart:])
     return OptionList
 
-## AutoGen class for platform
+# AutoGen class for platform
 #
 #  PlatformAutoGen class will process the original information in platform
 #  file in order to generate makefile for platform.
 #
+
+
 class PlatformAutoGen(AutoGen):
     # call super().__init__ then call the worker function with different parameter count
     def __init__(self, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
@@ -79,9 +83,7 @@ class PlatformAutoGen(AutoGen):
     _NonDynaPcdList_ = []
     _PlatformPcds = {}
 
-
-
-    ## Initialize PlatformAutoGen
+    # Initialize PlatformAutoGen
     #
     #
     #   @param      Workspace       WorkspaceAutoGen object
@@ -90,9 +92,12 @@ class PlatformAutoGen(AutoGen):
     #   @param      Toolchain       Name of tool chain
     #   @param      Arch            arch of the platform supports
     #
+
     def _InitWorker(self, Workspace, PlatformFile, Target, Toolchain, Arch):
-        EdkLogger.debug(EdkLogger.DEBUG_9, "AutoGen platform [%s] [%s]" % (PlatformFile, Arch))
-        GlobalData.gProcessingFile = "%s [%s, %s, %s]" % (PlatformFile, Arch, Toolchain, Target)
+        EdkLogger.debug(EdkLogger.DEBUG_9,
+                        "AutoGen platform [%s] [%s]" % (PlatformFile, Arch))
+        GlobalData.gProcessingFile = "%s [%s, %s, %s]" % (
+            PlatformFile, Arch, Toolchain, Target)
 
         self.MetaFile = PlatformFile
         self.Workspace = Workspace
@@ -111,8 +116,10 @@ class PlatformAutoGen(AutoGen):
         # indicating whether the file has been created.
         self.MakeFileName = ""
 
-        self._DynamicPcdList = None    # [(TokenCName1, TokenSpaceGuidCName1), (TokenCName2, TokenSpaceGuidCName2), ...]
-        self._NonDynamicPcdList = None # [(TokenCName1, TokenSpaceGuidCName1), (TokenCName2, TokenSpaceGuidCName2), ...]
+        # [(TokenCName1, TokenSpaceGuidCName1), (TokenCName2, TokenSpaceGuidCName2), ...]
+        self._DynamicPcdList = None
+        # [(TokenCName1, TokenSpaceGuidCName1), (TokenCName2, TokenSpaceGuidCName2), ...]
+        self._NonDynamicPcdList = None
 
         self._AsBuildInfList = []
         self._AsBuildModuleList = []
@@ -122,8 +129,10 @@ class PlatformAutoGen(AutoGen):
         if GlobalData.gFdfParser is not None:
             self._AsBuildInfList = GlobalData.gFdfParser.Profile.InfList
             for Inf in self._AsBuildInfList:
-                InfClass = PathClass(NormPath(Inf), GlobalData.gWorkspace, self.Arch)
-                M = self.BuildDatabase[InfClass, self.Arch, self.BuildTarget, self.ToolChain]
+                InfClass = PathClass(
+                    NormPath(Inf), GlobalData.gWorkspace, self.Arch)
+                M = self.BuildDatabase[InfClass, self.Arch,
+                                       self.BuildTarget, self.ToolChain]
                 if not M.IsBinaryModule:
                     continue
                 self._AsBuildModuleList.append(InfClass)
@@ -135,27 +144,31 @@ class PlatformAutoGen(AutoGen):
         self.DataPipe.FillData(self)
 
         return True
+
     def FillData_LibConstPcd(self):
         libConstPcd = {}
         for LibAuto in self.LibraryAutoGenList:
             if LibAuto.ConstPcd:
-                libConstPcd[(LibAuto.MetaFile.File,LibAuto.MetaFile.Root,LibAuto.Arch,LibAuto.MetaFile.Path)] = LibAuto.ConstPcd
-        self.DataPipe.DataContainer = {"LibConstPcd":libConstPcd}
-    ## hash() operator of PlatformAutoGen
+                libConstPcd[(LibAuto.MetaFile.File, LibAuto.MetaFile.Root,
+                             LibAuto.Arch, LibAuto.MetaFile.Path)] = LibAuto.ConstPcd
+        self.DataPipe.DataContainer = {"LibConstPcd": libConstPcd}
+    # hash() operator of PlatformAutoGen
     #
     #  The platform file path and arch string will be used to represent
     #  hash value of this object
     #
     #   @retval   int Hash value of the platform file path and arch
     #
+
     @cached_class_function
     def __hash__(self):
-        return hash((self.MetaFile, self.Arch,self.ToolChain,self.BuildTarget))
+        return hash((self.MetaFile, self.Arch, self.ToolChain, self.BuildTarget))
+
     @cached_class_function
     def __repr__(self):
         return "%s [%s]" % (self.MetaFile, self.Arch)
 
-    ## Create autogen code for platform and modules
+    # Create autogen code for platform and modules
     #
     #  Since there's no autogen code for platform, this method will do nothing
     #  if CreateModuleCodeFile is set to False.
@@ -172,17 +185,17 @@ class PlatformAutoGen(AutoGen):
         for Ma in self.ModuleAutoGenList:
             Ma.CreateCodeFile(CreateModuleCodeFile)
 
-    ## Generate Fds Command
+    # Generate Fds Command
     @cached_property
     def GenFdsCommand(self):
         return self.Workspace.GenFdsCommand
 
-    ## Create makefile for the platform and modules in it
+    # Create makefile for the platform and modules in it
     #
     #   @param      CreateModuleMakeFile    Flag indicating if the makefile for
     #                                       modules will be created as well
     #
-    def CreateMakeFile(self, CreateModuleMakeFile=False, FfsCommand = {}):
+    def CreateMakeFile(self, CreateModuleMakeFile=False, FfsCommand={}):
         if CreateModuleMakeFile:
             for Ma in self._MaList:
                 key = (Ma.MetaFile.File, self.Arch)
@@ -206,8 +219,9 @@ class PlatformAutoGen(AutoGen):
     @property
     def AllPcdList(self):
         return self.DynamicPcdList + self.NonDynamicPcdList
-    ## Deal with Shared FixedAtBuild Pcds
+    # Deal with Shared FixedAtBuild Pcds
     #
+
     def CollectFixedAtBuildPcds(self):
         for LibAuto in self.LibraryAutoGenList:
             FixedAtBuildPcds = {}
@@ -231,7 +245,8 @@ class PlatformAutoGen(AutoGen):
                 if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) not in self.NonDynamicPcdDict:
                     continue
                 else:
-                    DscPcd = self.NonDynamicPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)]
+                    DscPcd = self.NonDynamicPcdDict[(
+                        Pcd.TokenCName, Pcd.TokenSpaceGuidCName)]
                     if DscPcd.Type != TAB_PCDS_FIXED_AT_BUILD:
                         continue
                 if key in ShareFixedAtBuildPcdsSameValue and ShareFixedAtBuildPcdsSameValue[key]:
@@ -249,7 +264,8 @@ class PlatformAutoGen(AutoGen):
                         VpdRegionBase = FdRegion.Offset
                         break
 
-        VariableInfo = VariableMgr(self.DscBuildDataObj._GetDefaultStores(), self.DscBuildDataObj.SkuIds)
+        VariableInfo = VariableMgr(
+            self.DscBuildDataObj._GetDefaultStores(), self.DscBuildDataObj.SkuIds)
         VariableInfo.SetVpdRegionMaxSize(VpdRegionSize)
         VariableInfo.SetVpdRegionOffset(VpdRegionBase)
         Index = 0
@@ -264,40 +280,52 @@ class PlatformAutoGen(AutoGen):
                     if Sku.VariableAttribute and 'NV' not in Sku.VariableAttribute:
                         continue
                     VariableGuidStructure = Sku.VariableGuidValue
-                    VariableGuid = GuidStructureStringToGuidString(VariableGuidStructure)
+                    VariableGuid = GuidStructureStringToGuidString(
+                        VariableGuidStructure)
                     for StorageName in Sku.DefaultStoreDict:
-                        VariableInfo.append_variable(var_info(Index, pcdname, StorageName, SkuName, StringToArray(Sku.VariableName), VariableGuid, Sku.VariableOffset, Sku.VariableAttribute, Sku.HiiDefaultValue, Sku.DefaultStoreDict[StorageName] if Pcd.DatumType in TAB_PCD_NUMERIC_TYPES else StringToArray(Sku.DefaultStoreDict[StorageName]), Pcd.DatumType, Pcd.CustomAttribute['DscPosition'], Pcd.CustomAttribute.get('IsStru',False)))
+                        VariableInfo.append_variable(var_info(Index, pcdname, StorageName, SkuName, StringToArray(Sku.VariableName), VariableGuid, Sku.VariableOffset, Sku.VariableAttribute, Sku.HiiDefaultValue, Sku.DefaultStoreDict[
+                                                     StorageName] if Pcd.DatumType in TAB_PCD_NUMERIC_TYPES else StringToArray(Sku.DefaultStoreDict[StorageName]), Pcd.DatumType, Pcd.CustomAttribute['DscPosition'], Pcd.CustomAttribute.get('IsStru', False)))
             Index += 1
         return VariableInfo
 
     def UpdateNVStoreMaxSize(self, OrgVpdFile):
         if self.VariableInfo:
-            VpdMapFilePath = os.path.join(self.BuildDir, TAB_FV_DIRECTORY, "%s.map" % self.Platform.VpdToolGuid)
-            PcdNvStoreDfBuffer = [item for item in self._DynamicPcdList if item.TokenCName == "PcdNvStoreDefaultValueBuffer" and item.TokenSpaceGuidCName == "gEfiMdeModulePkgTokenSpaceGuid"]
+            VpdMapFilePath = os.path.join(
+                self.BuildDir, TAB_FV_DIRECTORY, "%s.map" % self.Platform.VpdToolGuid)
+            PcdNvStoreDfBuffer = [item for item in self._DynamicPcdList if item.TokenCName ==
+                                  "PcdNvStoreDefaultValueBuffer" and item.TokenSpaceGuidCName == "gEfiMdeModulePkgTokenSpaceGuid"]
 
             if PcdNvStoreDfBuffer:
                 try:
                     OrgVpdFile.Read(VpdMapFilePath)
                     PcdItems = OrgVpdFile.GetOffset(PcdNvStoreDfBuffer[0])
-                    NvStoreOffset = list(PcdItems.values())[0].strip() if PcdItems else '0'
+                    NvStoreOffset = list(PcdItems.values())[
+                        0].strip() if PcdItems else '0'
                 except:
-                    EdkLogger.error("build", FILE_READ_FAILURE, "Can not find VPD map file %s to fix up VPD offset." % VpdMapFilePath)
+                    EdkLogger.error(
+                        "build", FILE_READ_FAILURE, "Can not find VPD map file %s to fix up VPD offset." % VpdMapFilePath)
 
-                NvStoreOffset = int(NvStoreOffset, 16) if NvStoreOffset.upper().startswith("0X") else int(NvStoreOffset)
-                default_skuobj = PcdNvStoreDfBuffer[0].SkuInfoList.get(TAB_DEFAULT)
-                maxsize = self.VariableInfo.VpdRegionSize  - NvStoreOffset if self.VariableInfo.VpdRegionSize else len(default_skuobj.DefaultValue.split(","))
-                var_data = self.VariableInfo.PatchNVStoreDefaultMaxSize(maxsize)
+                NvStoreOffset = int(NvStoreOffset, 16) if NvStoreOffset.upper(
+                ).startswith("0X") else int(NvStoreOffset)
+                default_skuobj = PcdNvStoreDfBuffer[0].SkuInfoList.get(
+                    TAB_DEFAULT)
+                maxsize = self.VariableInfo.VpdRegionSize - \
+                    NvStoreOffset if self.VariableInfo.VpdRegionSize else len(
+                        default_skuobj.DefaultValue.split(","))
+                var_data = self.VariableInfo.PatchNVStoreDefaultMaxSize(
+                    maxsize)
 
                 if var_data and default_skuobj:
                     default_skuobj.DefaultValue = var_data
                     PcdNvStoreDfBuffer[0].DefaultValue = var_data
                     PcdNvStoreDfBuffer[0].SkuInfoList.clear()
                     PcdNvStoreDfBuffer[0].SkuInfoList[TAB_DEFAULT] = default_skuobj
-                    PcdNvStoreDfBuffer[0].MaxDatumSize = str(len(default_skuobj.DefaultValue.split(",")))
+                    PcdNvStoreDfBuffer[0].MaxDatumSize = str(
+                        len(default_skuobj.DefaultValue.split(",")))
 
         return OrgVpdFile
 
-    ## Collect dynamic PCDs
+    # Collect dynamic PCDs
     #
     #  Gather dynamic PCDs list from each module and their settings from platform
     #  This interface should be invoked explicitly when platform action is created.
@@ -315,23 +343,26 @@ class PlatformAutoGen(AutoGen):
             InfName = mws.join(self.WorkspaceDir, InfName)
             FdfModuleList.append(os.path.normpath(InfName))
         for M in self._MbList:
-#            F is the Module for which M is the module autogen
+            #            F is the Module for which M is the module autogen
             ModPcdList = self.ApplyPcdSetting(M, M.ModulePcdList)
             LibPcdList = []
             for lib in M.LibraryPcdList:
-                LibPcdList.extend(self.ApplyPcdSetting(M, M.LibraryPcdList[lib], lib))
+                LibPcdList.extend(self.ApplyPcdSetting(
+                    M, M.LibraryPcdList[lib], lib))
             for PcdFromModule in ModPcdList + LibPcdList:
 
                 # make sure that the "VOID*" kind of datum has MaxDatumSize set
                 if PcdFromModule.DatumType == TAB_VOID and not PcdFromModule.MaxDatumSize:
-                    NoDatumTypePcdList.add("%s.%s [%s]" % (PcdFromModule.TokenSpaceGuidCName, PcdFromModule.TokenCName, M.MetaFile))
+                    NoDatumTypePcdList.add("%s.%s [%s]" % (
+                        PcdFromModule.TokenSpaceGuidCName, PcdFromModule.TokenCName, M.MetaFile))
 
                 # Check the PCD from Binary INF or Source INF
                 if M.IsBinaryModule == True:
                     PcdFromModule.IsFromBinaryInf = True
 
                 # Check the PCD from DSC or not
-                PcdFromModule.IsFromDsc = (PcdFromModule.TokenCName, PcdFromModule.TokenSpaceGuidCName) in self.Platform.Pcds
+                PcdFromModule.IsFromDsc = (
+                    PcdFromModule.TokenCName, PcdFromModule.TokenSpaceGuidCName) in self.Platform.Pcds
 
                 if PcdFromModule.Type in PCD_DYNAMIC_TYPE_SET or PcdFromModule.Type in PCD_DYNAMIC_EX_TYPE_SET:
                     if M.MetaFile.Path not in FdfModuleList:
@@ -344,7 +375,7 @@ class PlatformAutoGen(AutoGen):
                         # PCD will not be added into the Database unless it is used by other
                         # modules that are included in the FDF file.
                         if PcdFromModule.Type in PCD_DYNAMIC_TYPE_SET and \
-                            PcdFromModule.IsFromBinaryInf == False:
+                                PcdFromModule.IsFromBinaryInf == False:
                             # Print warning message to let the developer make a determine.
                             continue
                         # If one of the Source built modules listed in the DSC is not listed in
@@ -373,16 +404,19 @@ class PlatformAutoGen(AutoGen):
                 elif PcdFromModule in self._NonDynaPcdList_ and PcdFromModule.IsFromBinaryInf == True:
                     Index = self._NonDynaPcdList_.index(PcdFromModule)
                     if self._NonDynaPcdList_[Index].IsFromBinaryInf == False:
-                        #The PCD from Binary INF will override the same one from source INF
-                        self._NonDynaPcdList_.remove (self._NonDynaPcdList_[Index])
+                        # The PCD from Binary INF will override the same one from source INF
+                        self._NonDynaPcdList_.remove(
+                            self._NonDynaPcdList_[Index])
                         PcdFromModule.Pending = False
-                        self._NonDynaPcdList_.append (PcdFromModule)
-        DscModuleSet = {os.path.normpath(ModuleInf.Path) for ModuleInf in self.Platform.Modules}
+                        self._NonDynaPcdList_.append(PcdFromModule)
+        DscModuleSet = {os.path.normpath(ModuleInf.Path)
+                        for ModuleInf in self.Platform.Modules}
         # add the PCD from modules that listed in FDF but not in DSC to Database
         for InfName in FdfModuleList:
             if InfName not in DscModuleSet:
                 InfClass = PathClass(InfName)
-                M = self.BuildDatabase[InfClass, self.Arch, self.BuildTarget, self.ToolChain]
+                M = self.BuildDatabase[InfClass, self.Arch,
+                                       self.BuildTarget, self.ToolChain]
                 # If a module INF in FDF but not in current arch's DSC module list, it must be module (either binary or source)
                 # for different Arch. PCDs in source module for different Arch is already added before, so skip the source module here.
                 # For binary module, if in current arch, we need to list the PCDs into database.
@@ -401,7 +435,8 @@ class PlatformAutoGen(AutoGen):
                                         % (PcdFromModule.Type, PcdFromModule.TokenCName, InfName))
                     # make sure that the "VOID*" kind of datum has MaxDatumSize set
                     if PcdFromModule.DatumType == TAB_VOID and not PcdFromModule.MaxDatumSize:
-                        NoDatumTypePcdList.add("%s.%s [%s]" % (PcdFromModule.TokenSpaceGuidCName, PcdFromModule.TokenCName, InfName))
+                        NoDatumTypePcdList.add("%s.%s [%s]" % (
+                            PcdFromModule.TokenSpaceGuidCName, PcdFromModule.TokenCName, InfName))
                     if M.ModuleType in SUP_MODULE_SET_PEI:
                         PcdFromModule.Phase = "PEI"
                     if PcdFromModule not in self._DynaPcdList_ and PcdFromModule.Type in PCD_DYNAMIC_EX_TYPE_SET:
@@ -429,11 +464,11 @@ class PlatformAutoGen(AutoGen):
                 continue
             Index = self._DynaPcdList_.index(PcdFromModule)
             if PcdFromModule.IsFromDsc == False and \
-                PcdFromModule.Type in TAB_PCDS_PATCHABLE_IN_MODULE and \
-                PcdFromModule.IsFromBinaryInf == True and \
-                self._DynaPcdList_[Index].IsFromBinaryInf == False:
+                    PcdFromModule.Type in TAB_PCDS_PATCHABLE_IN_MODULE and \
+                    PcdFromModule.IsFromBinaryInf == True and \
+                    self._DynaPcdList_[Index].IsFromBinaryInf == False:
                 Index = self._DynaPcdList_.index(PcdFromModule)
-                self._DynaPcdList_.remove (self._DynaPcdList_[Index])
+                self._DynaPcdList_.remove(self._DynaPcdList_[Index])
 
         # print out error information and break the build, if error found
         if len(NoDatumTypePcdList) > 0:
@@ -455,10 +490,10 @@ class PlatformAutoGen(AutoGen):
         # The reason of sorting is make sure the unicode string is in double-byte alignment in string table.
         #
         UnicodePcdArray = set()
-        HiiPcdArray     = set()
-        OtherPcdArray   = set()
-        VpdPcdDict      = {}
-        VpdFile               = VpdInfoFile.VpdInfoFile()
+        HiiPcdArray = set()
+        OtherPcdArray = set()
+        VpdPcdDict = {}
+        VpdFile = VpdInfoFile.VpdInfoFile()
         NeedProcessVpdMapFile = False
 
         for pcd in self.Platform.Pcds:
@@ -483,27 +518,31 @@ class PlatformAutoGen(AutoGen):
                 if Pcd.Type in [TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_EX_VPD]:
                     VpdPcdDict[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)] = Pcd
 
-            #Collect DynamicHii PCD values and assign it to DynamicExVpd PCD gEfiMdeModulePkgTokenSpaceGuid.PcdNvStoreDefaultValueBuffer
-            PcdNvStoreDfBuffer = VpdPcdDict.get(("PcdNvStoreDefaultValueBuffer", "gEfiMdeModulePkgTokenSpaceGuid"))
+            # Collect DynamicHii PCD values and assign it to DynamicExVpd PCD gEfiMdeModulePkgTokenSpaceGuid.PcdNvStoreDefaultValueBuffer
+            PcdNvStoreDfBuffer = VpdPcdDict.get(
+                ("PcdNvStoreDefaultValueBuffer", "gEfiMdeModulePkgTokenSpaceGuid"))
             if PcdNvStoreDfBuffer:
                 self.VariableInfo = self.CollectVariables(self._DynamicPcdList)
                 vardump = self.VariableInfo.dump()
                 if vardump:
                     #
-                    #According to PCD_DATABASE_INIT in edk2\MdeModulePkg\Include\Guid\PcdDataBaseSignatureGuid.h,
-                    #the max size for string PCD should not exceed USHRT_MAX 65535(0xffff).
-                    #typedef UINT16 SIZE_INFO;
-                    #//SIZE_INFO  SizeTable[];
+                    # According to PCD_DATABASE_INIT in edk2\MdeModulePkg\Include\Guid\PcdDataBaseSignatureGuid.h,
+                    # the max size for string PCD should not exceed USHRT_MAX 65535(0xffff).
+                    # typedef UINT16 SIZE_INFO;
+                    # //SIZE_INFO  SizeTable[];
                     if len(vardump.split(",")) > 0xffff:
-                        EdkLogger.error("build", RESOURCE_OVERFLOW, 'The current length of PCD %s value is %d, it exceeds to the max size of String PCD.' %(".".join([PcdNvStoreDfBuffer.TokenSpaceGuidCName,PcdNvStoreDfBuffer.TokenCName]) ,len(vardump.split(","))))
+                        EdkLogger.error("build", RESOURCE_OVERFLOW, 'The current length of PCD %s value is %d, it exceeds to the max size of String PCD.' % (
+                            ".".join([PcdNvStoreDfBuffer.TokenSpaceGuidCName, PcdNvStoreDfBuffer.TokenCName]), len(vardump.split(","))))
                     PcdNvStoreDfBuffer.DefaultValue = vardump
                     for skuname in PcdNvStoreDfBuffer.SkuInfoList:
                         PcdNvStoreDfBuffer.SkuInfoList[skuname].DefaultValue = vardump
-                        PcdNvStoreDfBuffer.MaxDatumSize = str(len(vardump.split(",")))
+                        PcdNvStoreDfBuffer.MaxDatumSize = str(
+                            len(vardump.split(",")))
             else:
-                #If the end user define [DefaultStores] and [XXX.Menufacturing] in DSC, but forget to configure PcdNvStoreDefaultValueBuffer to PcdsDynamicVpd
+                # If the end user define [DefaultStores] and [XXX.Menufacturing] in DSC, but forget to configure PcdNvStoreDefaultValueBuffer to PcdsDynamicVpd
                 if [Pcd for Pcd in self._DynamicPcdList if Pcd.UserDefinedDefaultStoresFlag]:
-                    EdkLogger.warn("build", "PcdNvStoreDefaultValueBuffer should be defined as PcdsDynamicExVpd in dsc file since the DefaultStores is enabled for this platform.\n%s" %self.Platform.MetaFile.Path)
+                    EdkLogger.warn(
+                        "build", "PcdNvStoreDefaultValueBuffer should be defined as PcdsDynamicExVpd in dsc file since the DefaultStores is enabled for this platform.\n%s" % self.Platform.MetaFile.Path)
             PlatformPcds = sorted(self._PlatformPcds.keys())
             #
             # Add VPD type PCD into VpdFile and determine whether the VPD PCD need to be fixed up.
@@ -527,7 +566,7 @@ class PlatformAutoGen(AutoGen):
                         Sku.VpdOffset = Sku.VpdOffset.strip()
                         PcdValue = Sku.DefaultValue
                         if PcdValue == "":
-                            PcdValue  = Pcd.DefaultValue
+                            PcdValue = Pcd.DefaultValue
                         if Sku.VpdOffset != TAB_STAR:
                             if PcdValue.startswith("{"):
                                 Alignment = 8
@@ -541,12 +580,15 @@ class PlatformAutoGen(AutoGen):
                                 try:
                                     VpdOffset = int(Sku.VpdOffset, 16)
                                 except:
-                                    EdkLogger.error("build", FORMAT_INVALID, "Invalid offset value %s for PCD %s.%s." % (Sku.VpdOffset, Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
+                                    EdkLogger.error("build", FORMAT_INVALID, "Invalid offset value %s for PCD %s.%s." % (
+                                        Sku.VpdOffset, Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
                             if VpdOffset % Alignment != 0:
                                 if PcdValue.startswith("{"):
-                                    EdkLogger.warn("build", "The offset value of PCD %s.%s is not 8-byte aligned!" %(Pcd.TokenSpaceGuidCName, Pcd.TokenCName), File=self.MetaFile)
+                                    EdkLogger.warn("build", "The offset value of PCD %s.%s is not 8-byte aligned!" % (
+                                        Pcd.TokenSpaceGuidCName, Pcd.TokenCName), File=self.MetaFile)
                                 else:
-                                    EdkLogger.error("build", FORMAT_INVALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Alignment))
+                                    EdkLogger.error("build", FORMAT_INVALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' % (
+                                        Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Alignment))
                         if PcdValue not in SkuValueMap:
                             SkuValueMap[PcdValue] = []
                             VpdFile.Add(Pcd, SkuName, Sku.VpdOffset)
@@ -555,7 +597,7 @@ class PlatformAutoGen(AutoGen):
                         if not NeedProcessVpdMapFile and Sku.VpdOffset == TAB_STAR:
                             NeedProcessVpdMapFile = True
                             if self.Platform.VpdToolGuid is None or self.Platform.VpdToolGuid == '':
-                                EdkLogger.error("Build", FILE_NOT_FOUND, \
+                                EdkLogger.error("Build", FILE_NOT_FOUND,
                                                 "Fail to find third-party BPDG tool to process VPD PCDs. BPDG Guid tool need to be defined in tools_def.txt and VPD_TOOL_GUID need to be provided in DSC file.")
 
                     VpdSkuMap[PcdKey] = SkuValueMap
@@ -572,16 +614,18 @@ class PlatformAutoGen(AutoGen):
                             # This PCD has been referenced by module
                             if (VpdPcd.TokenSpaceGuidCName == DscPcdEntry.TokenSpaceGuidCName) and \
                                (VpdPcd.TokenCName == DscPcdEntry.TokenCName):
-                                    FoundFlag = True
+                                FoundFlag = True
 
                         # Not found, it should be signature
-                        if not FoundFlag :
+                        if not FoundFlag:
                             # just pick the a value to determine whether is unicode string type
                             SkuValueMap = {}
                             SkuObjList = list(DscPcdEntry.SkuInfoList.items())
-                            DefaultSku = DscPcdEntry.SkuInfoList.get(TAB_DEFAULT)
+                            DefaultSku = DscPcdEntry.SkuInfoList.get(
+                                TAB_DEFAULT)
                             if DefaultSku:
-                                defaultindex = SkuObjList.index((TAB_DEFAULT, DefaultSku))
+                                defaultindex = SkuObjList.index(
+                                    (TAB_DEFAULT, DefaultSku))
                                 SkuObjList[0], SkuObjList[defaultindex] = SkuObjList[defaultindex], SkuObjList[0]
                             for (SkuName, Sku) in SkuObjList:
                                 Sku.VpdOffset = Sku.VpdOffset.strip()
@@ -594,24 +638,26 @@ class PlatformAutoGen(AutoGen):
                                            (DecPcdEntry.TokenCName == DscPcdEntry.TokenCName):
                                             # Print warning message to let the developer make a determine.
                                             EdkLogger.warn("build", "Unreferenced vpd pcd used!",
-                                                            File=self.MetaFile, \
-                                                            ExtraData = "PCD: %s.%s used in the DSC file %s is unreferenced." \
-                                                            %(DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName, self.Platform.MetaFile.Path))
+                                                           File=self.MetaFile,
+                                                           ExtraData="PCD: %s.%s used in the DSC file %s is unreferenced."
+                                                           % (DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName, self.Platform.MetaFile.Path))
 
-                                            DscPcdEntry.DatumType    = DecPcdEntry.DatumType
+                                            DscPcdEntry.DatumType = DecPcdEntry.DatumType
                                             DscPcdEntry.DefaultValue = DecPcdEntry.DefaultValue
                                             DscPcdEntry.TokenValue = DecPcdEntry.TokenValue
-                                            DscPcdEntry.TokenSpaceGuidValue = eachDec.Guids[DecPcdEntry.TokenSpaceGuidCName]
+                                            DscPcdEntry.TokenSpaceGuidValue = eachDec.Guids[
+                                                DecPcdEntry.TokenSpaceGuidCName]
                                             # Only fix the value while no value provided in DSC file.
                                             if not Sku.DefaultValue:
-                                                DscPcdEntry.SkuInfoList[list(DscPcdEntry.SkuInfoList.keys())[0]].DefaultValue = DecPcdEntry.DefaultValue
+                                                DscPcdEntry.SkuInfoList[list(DscPcdEntry.SkuInfoList.keys())[
+                                                    0]].DefaultValue = DecPcdEntry.DefaultValue
 
                                 if DscPcdEntry not in self._DynamicPcdList:
                                     self._DynamicPcdList.append(DscPcdEntry)
                                 Sku.VpdOffset = Sku.VpdOffset.strip()
                                 PcdValue = Sku.DefaultValue
                                 if PcdValue == "":
-                                    PcdValue  = DscPcdEntry.DefaultValue
+                                    PcdValue = DscPcdEntry.DefaultValue
                                 if Sku.VpdOffset != TAB_STAR:
                                     if PcdValue.startswith("{"):
                                         Alignment = 8
@@ -625,15 +671,19 @@ class PlatformAutoGen(AutoGen):
                                         try:
                                             VpdOffset = int(Sku.VpdOffset, 16)
                                         except:
-                                            EdkLogger.error("build", FORMAT_INVALID, "Invalid offset value %s for PCD %s.%s." % (Sku.VpdOffset, DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName))
+                                            EdkLogger.error("build", FORMAT_INVALID, "Invalid offset value %s for PCD %s.%s." % (
+                                                Sku.VpdOffset, DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName))
                                     if VpdOffset % Alignment != 0:
                                         if PcdValue.startswith("{"):
-                                            EdkLogger.warn("build", "The offset value of PCD %s.%s is not 8-byte aligned!" %(DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName), File=self.MetaFile)
+                                            EdkLogger.warn("build", "The offset value of PCD %s.%s is not 8-byte aligned!" % (
+                                                DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName), File=self.MetaFile)
                                         else:
-                                            EdkLogger.error("build", FORMAT_INVALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' % (DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName, Alignment))
+                                            EdkLogger.error("build", FORMAT_INVALID, 'The offset value of PCD %s.%s should be %s-byte aligned.' % (
+                                                DscPcdEntry.TokenSpaceGuidCName, DscPcdEntry.TokenCName, Alignment))
                                 if PcdValue not in SkuValueMap:
                                     SkuValueMap[PcdValue] = []
-                                    VpdFile.Add(DscPcdEntry, SkuName, Sku.VpdOffset)
+                                    VpdFile.Add(
+                                        DscPcdEntry, SkuName, Sku.VpdOffset)
                                 SkuValueMap[PcdValue].append(Sku)
                                 if not NeedProcessVpdMapFile and Sku.VpdOffset == TAB_STAR:
                                     NeedProcessVpdMapFile = True
@@ -656,16 +706,20 @@ class PlatformAutoGen(AutoGen):
                 self.FixVpdOffset(VpdFile)
 
                 self.FixVpdOffset(self.UpdateNVStoreMaxSize(VpdFile))
-                PcdNvStoreDfBuffer = [item for item in self._DynamicPcdList if item.TokenCName == "PcdNvStoreDefaultValueBuffer" and item.TokenSpaceGuidCName == "gEfiMdeModulePkgTokenSpaceGuid"]
+                PcdNvStoreDfBuffer = [item for item in self._DynamicPcdList if item.TokenCName ==
+                                      "PcdNvStoreDefaultValueBuffer" and item.TokenSpaceGuidCName == "gEfiMdeModulePkgTokenSpaceGuid"]
                 if PcdNvStoreDfBuffer:
-                    PcdName,PcdGuid = PcdNvStoreDfBuffer[0].TokenCName, PcdNvStoreDfBuffer[0].TokenSpaceGuidCName
-                    if (PcdName,PcdGuid) in VpdSkuMap:
-                        DefaultSku = PcdNvStoreDfBuffer[0].SkuInfoList.get(TAB_DEFAULT)
-                        VpdSkuMap[(PcdName,PcdGuid)] = {DefaultSku.DefaultValue:[SkuObj for SkuObj in PcdNvStoreDfBuffer[0].SkuInfoList.values() ]}
+                    PcdName, PcdGuid = PcdNvStoreDfBuffer[0].TokenCName, PcdNvStoreDfBuffer[0].TokenSpaceGuidCName
+                    if (PcdName, PcdGuid) in VpdSkuMap:
+                        DefaultSku = PcdNvStoreDfBuffer[0].SkuInfoList.get(
+                            TAB_DEFAULT)
+                        VpdSkuMap[(PcdName, PcdGuid)] = {DefaultSku.DefaultValue: [
+                            SkuObj for SkuObj in PcdNvStoreDfBuffer[0].SkuInfoList.values()]}
 
                 # Process VPD map file generated by third party BPDG tool
                 if NeedProcessVpdMapFile:
-                    VpdMapFilePath = os.path.join(self.BuildDir, TAB_FV_DIRECTORY, "%s.map" % self.Platform.VpdToolGuid)
+                    VpdMapFilePath = os.path.join(
+                        self.BuildDir, TAB_FV_DIRECTORY, "%s.map" % self.Platform.VpdToolGuid)
                     try:
                         VpdFile.Read(VpdMapFilePath)
 
@@ -673,7 +727,7 @@ class PlatformAutoGen(AutoGen):
                         for pcd in VpdSkuMap:
                             vpdinfo = VpdFile.GetVpdInfo(pcd)
                             if vpdinfo is None:
-                            # just pick the a value to determine whether is unicode string type
+                                # just pick the a value to determine whether is unicode string type
                                 continue
                             for pcdvalue in VpdSkuMap[pcd]:
                                 for sku in VpdSkuMap[pcd][pcdvalue]:
@@ -681,7 +735,8 @@ class PlatformAutoGen(AutoGen):
                                         if item[2] == pcdvalue:
                                             sku.VpdOffset = item[1]
                     except:
-                        EdkLogger.error("build", FILE_READ_FAILURE, "Can not find VPD map file %s to fix up VPD offset." % VpdMapFilePath)
+                        EdkLogger.error(
+                            "build", FILE_READ_FAILURE, "Can not find VPD map file %s to fix up VPD offset." % VpdMapFilePath)
 
             # Delete the DynamicPcdList At the last time enter into this function
             for Pcd in self._DynamicPcdList:
@@ -706,25 +761,29 @@ class PlatformAutoGen(AutoGen):
         self._DynamicPcdList.extend(list(HiiPcdArray))
         self._DynamicPcdList.extend(list(OtherPcdArray))
         self._DynamicPcdList.sort()
-        allskuset = [(SkuName, Sku.SkuId) for pcd in self._DynamicPcdList for (SkuName, Sku) in pcd.SkuInfoList.items()]
+        allskuset = [(SkuName, Sku.SkuId) for pcd in self._DynamicPcdList for (
+            SkuName, Sku) in pcd.SkuInfoList.items()]
         for pcd in self._DynamicPcdList:
             if len(pcd.SkuInfoList) == 1:
                 for (SkuName, SkuId) in allskuset:
                     if isinstance(SkuId, str) and eval(SkuId) == 0 or SkuId == 0:
                         continue
-                    pcd.SkuInfoList[SkuName] = copy.deepcopy(pcd.SkuInfoList[TAB_DEFAULT])
+                    pcd.SkuInfoList[SkuName] = copy.deepcopy(
+                        pcd.SkuInfoList[TAB_DEFAULT])
                     pcd.SkuInfoList[SkuName].SkuId = SkuId
                     pcd.SkuInfoList[SkuName].SkuIdName = SkuName
 
-    def FixVpdOffset(self, VpdFile ):
+    def FixVpdOffset(self, VpdFile):
         FvPath = os.path.join(self.BuildDir, TAB_FV_DIRECTORY)
         if not os.path.exists(FvPath):
             try:
                 os.makedirs(FvPath)
             except:
-                EdkLogger.error("build", FILE_WRITE_FAILURE, "Fail to create FV folder under %s" % self.BuildDir)
+                EdkLogger.error("build", FILE_WRITE_FAILURE,
+                                "Fail to create FV folder under %s" % self.BuildDir)
 
-        VpdFilePath = os.path.join(FvPath, "%s.txt" % self.Platform.VpdToolGuid)
+        VpdFilePath = os.path.join(FvPath, "%s.txt" %
+                                   self.Platform.VpdToolGuid)
 
         if VpdFile.Write(VpdFilePath):
             # retrieve BPDG tool's path from tool_def.txt according to VPD_TOOL_GUID defined in DSC file.
@@ -732,66 +791,68 @@ class PlatformAutoGen(AutoGen):
             for ToolDef in self.ToolDefinition.values():
                 if TAB_GUID in ToolDef and ToolDef[TAB_GUID] == self.Platform.VpdToolGuid:
                     if "PATH" not in ToolDef:
-                        EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE, "PATH attribute was not provided for BPDG guid tool %s in tools_def.txt" % self.Platform.VpdToolGuid)
+                        EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE,
+                                        "PATH attribute was not provided for BPDG guid tool %s in tools_def.txt" % self.Platform.VpdToolGuid)
                     BPDGToolName = ToolDef["PATH"]
                     break
             # Call third party GUID BPDG tool.
             if BPDGToolName is not None:
                 VpdInfoFile.CallExtenalBPDGTool(BPDGToolName, VpdFilePath)
             else:
-                EdkLogger.error("Build", FILE_NOT_FOUND, "Fail to find third-party BPDG tool to process VPD PCDs. BPDG Guid tool need to be defined in tools_def.txt and VPD_TOOL_GUID need to be provided in DSC file.")
+                EdkLogger.error(
+                    "Build", FILE_NOT_FOUND, "Fail to find third-party BPDG tool to process VPD PCDs. BPDG Guid tool need to be defined in tools_def.txt and VPD_TOOL_GUID need to be provided in DSC file.")
 
-    ## Return the platform build data object
+    # Return the platform build data object
     @cached_property
     def Platform(self):
         return self.BuildDatabase[self.MetaFile, self.Arch, self.BuildTarget, self.ToolChain]
 
-    ## Return platform name
+    # Return platform name
     @cached_property
     def Name(self):
         return self.Platform.PlatformName
 
-    ## Return the meta file GUID
+    # Return the meta file GUID
     @cached_property
     def Guid(self):
         return self.Platform.Guid
 
-    ## Return the platform version
+    # Return the platform version
     @cached_property
     def Version(self):
         return self.Platform.Version
 
-    ## Return the FDF file name
+    # Return the FDF file name
     @cached_property
     def FdfFile(self):
         if self.Workspace.FdfFile:
-            RetVal= mws.join(self.WorkspaceDir, self.Workspace.FdfFile)
+            RetVal = mws.join(self.WorkspaceDir, self.Workspace.FdfFile)
         else:
             RetVal = ''
         return RetVal
 
-    ## Return the build output directory platform specifies
+    # Return the build output directory platform specifies
     @cached_property
     def OutputDir(self):
         return self.Platform.OutputDirectory
 
-    ## Return the directory to store all intermediate and final files built
+    # Return the directory to store all intermediate and final files built
     @cached_property
     def BuildDir(self):
         if os.path.isabs(self.OutputDir):
             GlobalData.gBuildDirectory = RetVal = path.join(
-                                        path.abspath(self.OutputDir),
-                                        self.BuildTarget + "_" + self.ToolChain,
-                                        )
+                path.abspath(self.OutputDir),
+                self.BuildTarget + "_" + self.ToolChain,
+            )
         else:
             GlobalData.gBuildDirectory = RetVal = path.join(
-                                        self.WorkspaceDir,
-                                        self.OutputDir,
-                                        self.BuildTarget + "_" + self.ToolChain,
-                                        )
+                self.WorkspaceDir,
+                self.OutputDir,
+                self.BuildTarget + "_" + self.ToolChain,
+            )
         return RetVal
 
-    ## Return directory of platform makefile
+    # Return directory of platform makefile
     #
     #   @retval     string  Makefile directory
     #
@@ -799,7 +860,7 @@ class PlatformAutoGen(AutoGen):
     def MakeFileDir(self):
         return path.join(self.BuildDir, self.Arch)
 
-    ## Return build command string
+    # Return build command string
     #
     #   @retval     string  Build command string
     #
@@ -827,7 +888,7 @@ class PlatformAutoGen(AutoGen):
                 RetVal = RetVal + _SplitOption(Flags.strip())
         return RetVal
 
-    ## Compute a tool defintion key priority value in range 0..15
+    # Compute a tool defintion key priority value in range 0..15
     #
     #  TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE  15
     #  ******_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE  14
@@ -846,15 +907,15 @@ class PlatformAutoGen(AutoGen):
     #  TARGET_*********_****_***********_ATTRIBUTE   1
     #  ******_*********_****_***********_ATTRIBUTE   0
     #
-    def ToolDefinitionPriority (self,Key):
+    def ToolDefinitionPriority(self, Key):
         KeyList = Key.split('_')
         Priority = 0
-        for Index in range (0, min(4, len(KeyList))):
+        for Index in range(0, min(4, len(KeyList))):
             if KeyList[Index] != '*':
                 Priority += (1 << Index)
         return Priority
 
-    ## Get tool chain definition
+    # Get tool chain definition
     #
     #  Get each tool definition for given tool chain from tools_def.txt and platform
     #
@@ -867,7 +928,8 @@ class PlatformAutoGen(AutoGen):
         RetVal = OrderedDict()
         DllPathList = set()
 
-        PrioritizedDefList = sorted(ToolDefinition.keys(), key=self.ToolDefinitionPriority, reverse=True)
+        PrioritizedDefList = sorted(ToolDefinition.keys(
+        ), key=self.ToolDefinitionPriority, reverse=True)
         for Def in PrioritizedDefList:
             Target, Tag, Arch, Tool, Attr = Def.split("_")
             if Target == TAB_STAR:
@@ -918,13 +980,17 @@ class PlatformAutoGen(AutoGen):
                 if Tool in self._BuildOptionWithToolDef(RetVal) and Attr in self._BuildOptionWithToolDef(RetVal)[Tool]:
                     # check if override is indicated
                     if self._BuildOptionWithToolDef(RetVal)[Tool][Attr].startswith('='):
-                        Value = self._BuildOptionWithToolDef(RetVal)[Tool][Attr][1:].strip()
+                        Value = self._BuildOptionWithToolDef(
+                            RetVal)[Tool][Attr][1:].strip()
                     else:
                         # Do not append PATH or GUID
                         if Attr != 'PATH' and Attr != 'GUID':
-                            Value += " " + self._BuildOptionWithToolDef(RetVal)[Tool][Attr]
+                            Value += " " + \
+                                self._BuildOptionWithToolDef(RetVal)[
+                                    Tool][Attr]
                         else:
-                            Value = self._BuildOptionWithToolDef(RetVal)[Tool][Attr]
+                            Value = self._BuildOptionWithToolDef(RetVal)[
+                                Tool][Attr]
                 if Attr == "PATH":
                     # Don't put MAKE definition in the file
                     if Tool != "MAKE":
@@ -938,7 +1004,8 @@ class PlatformAutoGen(AutoGen):
                         ToolsDef += "%s_%s = %s\n" % (Tool, Attr, Value)
             ToolsDef += "\n"
 
-        tool_def_file = os.path.join(self.MakeFileDir, "TOOLS_DEF." + self.Arch)
+        tool_def_file = os.path.join(
+            self.MakeFileDir, "TOOLS_DEF." + self.Arch)
         SaveFileOnChange(tool_def_file, ToolsDef, False)
         for DllPath in DllPathList:
             os.environ["PATH"] = DllPath + os.pathsep + os.environ["PATH"]
@@ -946,23 +1013,24 @@ class PlatformAutoGen(AutoGen):
 
         return RetVal
 
-    ## Return the paths of tools
+    # Return the paths of tools
     @cached_property
     def ToolDefinitionFile(self):
-        tool_def_file = os.path.join(self.MakeFileDir, "TOOLS_DEF." + self.Arch)
+        tool_def_file = os.path.join(
+            self.MakeFileDir, "TOOLS_DEF." + self.Arch)
         if not os.path.exists(tool_def_file):
             self.ToolDefinition
         return tool_def_file
 
-    ## Retrieve the toolchain family of given toolchain tag. Default to 'MSFT'.
+    # Retrieve the toolchain family of given toolchain tag. Default to 'MSFT'.
     @cached_property
     def ToolChainFamily(self):
         ToolDefinition = self.Workspace.ToolDef.ToolsDefTxtDatabase
         if TAB_TOD_DEFINES_FAMILY not in ToolDefinition \
            or self.ToolChain not in ToolDefinition[TAB_TOD_DEFINES_FAMILY] \
            or not ToolDefinition[TAB_TOD_DEFINES_FAMILY][self.ToolChain]:
-            EdkLogger.verbose("No tool chain family found in configuration for %s. Default to MSFT." \
-                               % self.ToolChain)
+            EdkLogger.verbose("No tool chain family found in configuration for %s. Default to MSFT."
+                              % self.ToolChain)
             RetVal = TAB_COMPILER_MSFT
         else:
             RetVal = ToolDefinition[TAB_TOD_DEFINES_FAMILY][self.ToolChain]
@@ -974,13 +1042,13 @@ class PlatformAutoGen(AutoGen):
         if TAB_TOD_DEFINES_BUILDRULEFAMILY not in ToolDefinition \
            or self.ToolChain not in ToolDefinition[TAB_TOD_DEFINES_BUILDRULEFAMILY] \
            or not ToolDefinition[TAB_TOD_DEFINES_BUILDRULEFAMILY][self.ToolChain]:
-            EdkLogger.verbose("No tool chain family found in configuration for %s. Default to MSFT." \
-                               % self.ToolChain)
+            EdkLogger.verbose("No tool chain family found in configuration for %s. Default to MSFT."
+                              % self.ToolChain)
             return TAB_COMPILER_MSFT
 
         return ToolDefinition[TAB_TOD_DEFINES_BUILDRULEFAMILY][self.ToolChain]
 
-    ## Return the build options specific for all modules in this platform
+    # Return the build options specific for all modules in this platform
     @cached_property
     def BuildOption(self):
         return self._ExpandBuildOption(self.Platform.BuildOptions)
@@ -988,17 +1056,17 @@ class PlatformAutoGen(AutoGen):
     def _BuildOptionWithToolDef(self, ToolDef):
         return self._ExpandBuildOption(self.Platform.BuildOptions, ToolDef=ToolDef)
 
-    ## Return the build options specific for EDK modules in this platform
+    # Return the build options specific for EDK modules in this platform
     @cached_property
     def EdkBuildOption(self):
         return self._ExpandBuildOption(self.Platform.BuildOptions, EDK_NAME)
 
-    ## Return the build options specific for EDKII modules in this platform
+    # Return the build options specific for EDKII modules in this platform
     @cached_property
     def EdkIIBuildOption(self):
         return self._ExpandBuildOption(self.Platform.BuildOptions, EDKII_NAME)
 
-    ## Parse build_rule.txt in Conf Directory.
+    # Parse build_rule.txt in Conf Directory.
     #
     #   @retval     BuildRule object
     #
@@ -1006,21 +1074,22 @@ class PlatformAutoGen(AutoGen):
     def BuildRule(self):
         BuildRuleFile = None
         if TAB_TAT_DEFINES_BUILD_RULE_CONF in self.Workspace.TargetTxt.TargetTxtDictionary:
-            BuildRuleFile = self.Workspace.TargetTxt.TargetTxtDictionary[TAB_TAT_DEFINES_BUILD_RULE_CONF]
+            BuildRuleFile = self.Workspace.TargetTxt.TargetTxtDictionary[
+                TAB_TAT_DEFINES_BUILD_RULE_CONF]
         if not BuildRuleFile:
             BuildRuleFile = gDefaultBuildRuleFile
         RetVal = BuildRule(BuildRuleFile)
         if RetVal._FileVersion == "":
             RetVal._FileVersion = AutoGenReqBuildRuleVerNum
         else:
-            if RetVal._FileVersion < AutoGenReqBuildRuleVerNum :
+            if RetVal._FileVersion < AutoGenReqBuildRuleVerNum:
                 # If Build Rule's version is less than the version number required by the tools, halting the build.
                 EdkLogger.error("build", AUTOGEN_ERROR,
-                                ExtraData="The version number [%s] of build_rule.txt is less than the version number required by the AutoGen.(the minimum required version number is [%s])"\
-                                 % (RetVal._FileVersion, AutoGenReqBuildRuleVerNum))
+                                ExtraData="The version number [%s] of build_rule.txt is less than the version number required by the AutoGen. (The minimum required version number is [%s])"
+                                % (RetVal._FileVersion, AutoGenReqBuildRuleVerNum))
         return RetVal
 
-    ## Summarize the packages used by modules in this platform
+    # Summarize the packages used by modules in this platform
     @cached_property
     def PackageList(self):
         RetVal = set()
@@ -1028,34 +1097,35 @@ class PlatformAutoGen(AutoGen):
             RetVal.update(Mb.Packages)
             for lb in Mb.LibInstances:
                 RetVal.update(lb.Packages)
-        #Collect package set information from INF of FDF
+        # Collect package set information from INF of FDF
         for ModuleFile in self._AsBuildModuleList:
             if ModuleFile in self.Platform.Modules:
                 continue
-            ModuleData = self.BuildDatabase[ModuleFile, self.Arch, self.BuildTarget, self.ToolChain]
+            ModuleData = self.BuildDatabase[ModuleFile,
+                                            self.Arch, self.BuildTarget, self.ToolChain]
             RetVal.update(ModuleData.Packages)
         RetVal.update(self.Platform.Packages)
         return list(RetVal)
 
     @cached_property
     def NonDynamicPcdDict(self):
-        return {(Pcd.TokenCName, Pcd.TokenSpaceGuidCName):Pcd for Pcd in self.NonDynamicPcdList}
+        return {(Pcd.TokenCName, Pcd.TokenSpaceGuidCName): Pcd for Pcd in self.NonDynamicPcdList}
 
-    ## Get list of non-dynamic PCDs
+    # Get list of non-dynamic PCDs
     @property
     def NonDynamicPcdList(self):
         if not self._NonDynamicPcdList:
             self.CollectPlatformDynamicPcds()
         return self._NonDynamicPcdList
 
-    ## Get list of dynamic PCDs
+    # Get list of dynamic PCDs
     @property
     def DynamicPcdList(self):
         if not self._DynamicPcdList:
             self.CollectPlatformDynamicPcds()
         return self._DynamicPcdList
 
-    ## Generate Token Number for all PCD
+    # Generate Token Number for all PCD
     @cached_property
     def PcdTokenNumber(self):
         RetVal = OrderedDict()
@@ -1071,25 +1141,29 @@ class PlatformAutoGen(AutoGen):
         #
         for Pcd in self.DynamicPcdList:
             if Pcd.Phase == "PEI" and Pcd.Type in PCD_DYNAMIC_TYPE_SET:
-                EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber))
+                EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" %
+                                (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber))
                 RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] = TokenNumber
                 TokenNumber += 1
 
         for Pcd in self.DynamicPcdList:
             if Pcd.Phase == "PEI" and Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET:
-                EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber))
+                EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" %
+                                (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber))
                 RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] = TokenNumber
                 TokenNumber += 1
 
         for Pcd in self.DynamicPcdList:
             if Pcd.Phase == "DXE" and Pcd.Type in PCD_DYNAMIC_TYPE_SET:
-                EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber))
+                EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" %
+                                (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber))
                 RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] = TokenNumber
                 TokenNumber += 1
 
         for Pcd in self.DynamicPcdList:
             if Pcd.Phase == "DXE" and Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET:
-                EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" % (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber))
+                EdkLogger.debug(EdkLogger.DEBUG_5, "%s %s (%s) -> %d" %
+                                (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Phase, TokenNumber))
                 RetVal[Pcd.TokenCName, Pcd.TokenSpaceGuidCName] = TokenNumber
                 TokenNumber += 1
 
@@ -1102,7 +1176,8 @@ class PlatformAutoGen(AutoGen):
         ModuleList = []
         for m in self.Platform.Modules:
             component = self.Platform.Modules[m]
-            module = self.BuildDatabase[m, self.Arch, self.BuildTarget, self.ToolChain]
+            module = self.BuildDatabase[m, self.Arch,
+                                        self.BuildTarget, self.ToolChain]
             module.Guid = component.Guid
             ModuleList.append(module)
         return ModuleList
@@ -1111,18 +1186,18 @@ class PlatformAutoGen(AutoGen):
     def _MaList(self):
         for ModuleFile in self.Platform.Modules:
             Ma = ModuleAutoGen(
-                  self.Workspace,
-                  ModuleFile,
-                  self.BuildTarget,
-                  self.ToolChain,
-                  self.Arch,
-                  self.MetaFile,
-                  self.DataPipe
-                  )
+                self.Workspace,
+                ModuleFile,
+                self.BuildTarget,
+                self.ToolChain,
+                self.Arch,
+                self.MetaFile,
+                self.DataPipe
+            )
             self.Platform.Modules[ModuleFile].M = Ma
         return [x.M for x in self.Platform.Modules.values()]
 
-    ## Summarize ModuleAutoGen objects of all modules to be built for this platform
+    # Summarize ModuleAutoGen objects of all modules to be built for this platform
     @cached_property
     def ModuleAutoGenList(self):
         RetVal = []
@@ -1131,7 +1206,7 @@ class PlatformAutoGen(AutoGen):
                 RetVal.append(Ma)
         return RetVal
 
-    ## Summarize ModuleAutoGen objects of all libraries to be built for this platform
+    # Summarize ModuleAutoGen objects of all libraries to be built for this platform
     @cached_property
     def LibraryAutoGenList(self):
         RetVal = []
@@ -1143,7 +1218,7 @@ class PlatformAutoGen(AutoGen):
                     La.ReferenceModules.append(Ma)
         return RetVal
 
-    ## Test if a module is supported by the platform
+    # Test if a module is supported by the platform
     #
     #  An error will be raised directly if the module or its arch is not supported
     #  by the platform or current configuration
@@ -1151,23 +1226,28 @@ class PlatformAutoGen(AutoGen):
     def ValidModule(self, Module):
         return Module in self.Platform.Modules or Module in self.Platform.LibraryInstances \
             or Module in self._AsBuildModuleList
+
     @cached_property
-    def GetAllModuleInfo(self,WithoutPcd=True):
+    def GetAllModuleInfo(self, WithoutPcd=True):
         ModuleLibs = set()
         for m in self.Platform.Modules:
-            module_obj = self.BuildDatabase[m,self.Arch,self.BuildTarget,self.ToolChain]
+            module_obj = self.BuildDatabase[m, self.Arch,
+                                            self.BuildTarget, self.ToolChain]
             if not bool(module_obj.LibraryClass):
-                Libs = GetModuleLibInstances(module_obj, self.Platform, self.BuildDatabase, self.Arch,self.BuildTarget,self.ToolChain,self.MetaFile,EdkLogger)
+                Libs = GetModuleLibInstances(module_obj, self.Platform, self.BuildDatabase,
+                                             self.Arch, self.BuildTarget, self.ToolChain, self.MetaFile, EdkLogger)
             else:
                 Libs = []
-            ModuleLibs.update( set([(l.MetaFile.File,l.MetaFile.Root,l.MetaFile.Path,l.MetaFile.BaseName,l.MetaFile.OriginalPath,l.Arch,True) for l in Libs]))
+            ModuleLibs.update(set([(l.MetaFile.File, l.MetaFile.Root, l.MetaFile.Path,
+                              l.MetaFile.BaseName, l.MetaFile.OriginalPath, l.Arch, True) for l in Libs]))
             if WithoutPcd and module_obj.PcdIsDriver:
                 continue
-            ModuleLibs.add((m.File,m.Root,m.Path,m.BaseName,m.OriginalPath,module_obj.Arch,bool(module_obj.LibraryClass)))
+            ModuleLibs.add((m.File, m.Root, m.Path, m.BaseName, m.OriginalPath,
+                           module_obj.Arch, bool(module_obj.LibraryClass)))
 
         return ModuleLibs
 
-    ## Resolve the library classes in a module to library instances
+    # Resolve the library classes in a module to library instances
     #
     # This method will not only resolve library classes but also sort the library
     # instances according to the dependency-ship.
@@ -1190,7 +1270,7 @@ class PlatformAutoGen(AutoGen):
                                      self.MetaFile,
                                      EdkLogger)
 
-    ## Override PCD setting (type, value, ...)
+    # Override PCD setting (type, value, ...)
     #
     #   @param  ToPcd       The PCD to be overridden
     #   @param  FromPcd     The PCD overriding from
@@ -1210,15 +1290,16 @@ class PlatformAutoGen(AutoGen):
             if ToPcd.Pending and FromPcd.Type:
                 ToPcd.Type = FromPcd.Type
             elif ToPcd.Type and FromPcd.Type\
-                and ToPcd.Type != FromPcd.Type and ToPcd.Type in FromPcd.Type:
+                    and ToPcd.Type != FromPcd.Type and ToPcd.Type in FromPcd.Type:
                 if ToPcd.Type.strip() == TAB_PCDS_DYNAMIC_EX:
                     ToPcd.Type = FromPcd.Type
             elif ToPcd.Type and FromPcd.Type \
-                and ToPcd.Type != FromPcd.Type:
+                    and ToPcd.Type != FromPcd.Type:
                 if Library:
-                    Module = str(Module) + " 's library file (" + str(Library) + ")"
+                    Module = str(Module) + \
+                        " 's library file (" + str(Library) + ")"
                 EdkLogger.error("build", OPTION_CONFLICT, "Mismatched PCD type",
-                                ExtraData="%s.%s is used as [%s] in module %s, but as [%s] in %s."\
+                                ExtraData="%s.%s is used as [%s] in module %s, but as [%s] in %s."
                                           % (ToPcd.TokenSpaceGuidCName, TokenCName,
                                              ToPcd.Type, Module, FromPcd.Type, Msg),
                                           File=self.MetaFile)
@@ -1239,10 +1320,11 @@ class PlatformAutoGen(AutoGen):
             # Add Flexible PCD format parse
             if ToPcd.DefaultValue:
                 try:
-                    ToPcd.DefaultValue = ValueExpressionEx(ToPcd.DefaultValue, ToPcd.DatumType, self.Platform._GuidDict)(True)
+                    ToPcd.DefaultValue = ValueExpressionEx(
+                        ToPcd.DefaultValue, ToPcd.DatumType, self.Platform._GuidDict)(True)
                 except BadExpression as Value:
-                    EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s", %s' %(ToPcd.TokenSpaceGuidCName, ToPcd.TokenCName, ToPcd.DefaultValue, Value),
-                                        File=self.MetaFile)
+                    EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s", %s' % (ToPcd.TokenSpaceGuidCName, ToPcd.TokenCName, ToPcd.DefaultValue, Value),
+                                    File=self.MetaFile)
 
             # check the validation of datum
             IsValid, Cause = CheckPcdDatum(ToPcd.DatumType, ToPcd.DefaultValue)
@@ -1255,7 +1337,7 @@ class PlatformAutoGen(AutoGen):
             ToPcd.CustomAttribute = FromPcd.CustomAttribute
 
         if FromPcd is not None and ToPcd.DatumType == TAB_VOID and not ToPcd.MaxDatumSize:
-            EdkLogger.debug(EdkLogger.DEBUG_9, "No MaxDatumSize specified for PCD %s.%s" \
+            EdkLogger.debug(EdkLogger.DEBUG_9, "No MaxDatumSize specified for PCD %s.%s"
                             % (ToPcd.TokenSpaceGuidCName, TokenCName))
             Value = ToPcd.DefaultValue
             if not Value:
@@ -1269,16 +1351,17 @@ class PlatformAutoGen(AutoGen):
 
         # apply default SKU for dynamic PCDS if specified one is not available
         if (ToPcd.Type in PCD_DYNAMIC_TYPE_SET or ToPcd.Type in PCD_DYNAMIC_EX_TYPE_SET) \
-            and not ToPcd.SkuInfoList:
+                and not ToPcd.SkuInfoList:
             if self.Platform.SkuName in self.Platform.SkuIds:
                 SkuName = self.Platform.SkuName
             else:
                 SkuName = TAB_DEFAULT
             ToPcd.SkuInfoList = {
-                SkuName : SkuInfoClass(SkuName, self.Platform.SkuIds[SkuName][0], '', '', '', '', '', ToPcd.DefaultValue)
+                SkuName: SkuInfoClass(
+                    SkuName, self.Platform.SkuIds[SkuName][0], '', '', '', '', '', ToPcd.DefaultValue)
             }
 
-    ## Apply PCD setting defined platform to a module
+    # Apply PCD setting defined platform to a module
     #
     #   @param  Module  The module from which the PCD setting will be overridden
     #
@@ -1294,31 +1377,36 @@ class PlatformAutoGen(AutoGen):
             else:
                 PcdInPlatform = None
             # then override the settings if any
-            self._OverridePcd(PcdInModule, PcdInPlatform, Module, Msg="DSC PCD sections", Library=Library)
+            self._OverridePcd(PcdInModule, PcdInPlatform,
+                              Module, Msg="DSC PCD sections", Library=Library)
             # resolve the VariableGuid value
             for SkuId in PcdInModule.SkuInfoList:
                 Sku = PcdInModule.SkuInfoList[SkuId]
-                if Sku.VariableGuid == '': continue
-                Sku.VariableGuidValue = GuidValue(Sku.VariableGuid, self.PackageList, self.MetaFile.Path)
+                if Sku.VariableGuid == '':
+                    continue
+                Sku.VariableGuidValue = GuidValue(
+                    Sku.VariableGuid, self.PackageList, self.MetaFile.Path)
                 if Sku.VariableGuidValue is None:
                     PackageList = "\n\t".join(str(P) for P in self.PackageList)
                     EdkLogger.error(
-                                'build',
-                                RESOURCE_NOT_AVAILABLE,
-                                "Value of GUID [%s] is not found in" % Sku.VariableGuid,
-                                ExtraData=PackageList + "\n\t(used with %s.%s from module %s)" \
-                                                        % (Guid, Name, str(Module)),
-                                File=self.MetaFile
-                                )
+                        'build',
+                        RESOURCE_NOT_AVAILABLE,
+                        "Value of GUID [%s] is not found in" % Sku.VariableGuid,
+                        ExtraData=PackageList +
+                        "\n\t(used with %s.%s from module %s)"
+                        % (Guid, Name, str(Module)),
+                        File=self.MetaFile
+                    )
 
         # override PCD settings with module specific setting
         if Module in self.Platform.Modules:
             PlatformModule = self.Platform.Modules[str(Module)]
-            for Key  in PlatformModule.Pcds:
+            for Key in PlatformModule.Pcds:
                 if GlobalData.BuildOptionPcd:
                     for pcd in GlobalData.BuildOptionPcd:
-                        (TokenSpaceGuidCName, TokenCName, FieldName, pcdvalue, _) = pcd
-                        if (TokenCName, TokenSpaceGuidCName) == Key and FieldName =="":
+                        (TokenSpaceGuidCName, TokenCName,
+                         FieldName, pcdvalue, _) = pcd
+                        if (TokenCName, TokenSpaceGuidCName) == Key and FieldName == "":
                             PlatformModule.Pcds[Key].DefaultValue = pcdvalue
                             PlatformModule.Pcds[Key].PcdValueFromComm = pcdvalue
                             break
@@ -1333,7 +1421,8 @@ class PlatformAutoGen(AutoGen):
                             Flag = True
                             break
                 if Flag:
-                    self._OverridePcd(ToPcd, PlatformModule.Pcds[Key], Module, Msg="DSC Components Module scoped PCD section", Library=Library)
+                    self._OverridePcd(
+                        ToPcd, PlatformModule.Pcds[Key], Module, Msg="DSC Components Module scoped PCD section", Library=Library)
         # use PCD value to calculate the MaxDatumSize when it is not specified
         for Name, Guid in Pcds:
             Pcd = Pcds[Name, Guid]
@@ -1350,7 +1439,7 @@ class PlatformAutoGen(AutoGen):
                     Pcd.MaxDatumSize = str(len(Value) - 1)
         return list(Pcds.values())
 
-    ## Append build options in platform to a module
+    # Append build options in platform to a module
     #
     #   @param  Module  The module to which the build options will be appended
     #
@@ -1359,12 +1448,14 @@ class PlatformAutoGen(AutoGen):
     def ApplyBuildOption(self, Module):
         # Get the different options for the different style module
         PlatformOptions = self.EdkIIBuildOption
-        ModuleTypeOptions = self.Platform.GetBuildOptionsByModuleType(EDKII_NAME, Module.ModuleType)
+        ModuleTypeOptions = self.Platform.GetBuildOptionsByModuleType(
+            EDKII_NAME, Module.ModuleType)
         ModuleTypeOptions = self._ExpandBuildOption(ModuleTypeOptions)
         ModuleOptions = self._ExpandBuildOption(Module.BuildOptions)
         if Module in self.Platform.Modules:
             PlatformModule = self.Platform.Modules[str(Module)]
-            PlatformModuleOptions = self._ExpandBuildOption(PlatformModule.BuildOptions)
+            PlatformModuleOptions = self._ExpandBuildOption(
+                PlatformModule.BuildOptions)
         else:
             PlatformModuleOptions = {}
 
@@ -1397,59 +1488,67 @@ class PlatformAutoGen(AutoGen):
                     for ExpandedTool in ToolList:
                         # check if override is indicated
                         if Value.startswith('='):
-                            BuildOptions[ExpandedTool][Attr] = mws.handleWsMacro(Value[1:])
+                            BuildOptions[ExpandedTool][Attr] = mws.handleWsMacro(
+                                Value[1:])
                         else:
                             if Attr != 'PATH':
-                                BuildOptions[ExpandedTool][Attr] += " " + mws.handleWsMacro(Value)
+                                BuildOptions[ExpandedTool][Attr] += " " + \
+                                    mws.handleWsMacro(Value)
                             else:
-                                BuildOptions[ExpandedTool][Attr] = mws.handleWsMacro(Value)
+                                BuildOptions[ExpandedTool][Attr] = mws.handleWsMacro(
+                                    Value)
 
         return BuildOptions, BuildRuleOrder
 
-
-    def GetGlobalBuildOptions(self,Module):
-        ModuleTypeOptions = self.Platform.GetBuildOptionsByModuleType(EDKII_NAME, Module.ModuleType)
+    def GetGlobalBuildOptions(self, Module):
+        ModuleTypeOptions = self.Platform.GetBuildOptionsByModuleType(
+            EDKII_NAME, Module.ModuleType)
         ModuleTypeOptions = self._ExpandBuildOption(ModuleTypeOptions)
 
         if Module in self.Platform.Modules:
             PlatformModule = self.Platform.Modules[str(Module)]
-            PlatformModuleOptions = self._ExpandBuildOption(PlatformModule.BuildOptions)
+            PlatformModuleOptions = self._ExpandBuildOption(
+                PlatformModule.BuildOptions)
         else:
             PlatformModuleOptions = {}
 
-        return ModuleTypeOptions,PlatformModuleOptions
-    def ModuleGuid(self,Module):
+        return ModuleTypeOptions, PlatformModuleOptions
+
+    def ModuleGuid(self, Module):
         if os.path.basename(Module.MetaFile.File) != os.path.basename(Module.MetaFile.Path):
             #
             # Length of GUID is 36
             #
             return os.path.basename(Module.MetaFile.Path)[:36]
         return Module.Guid
+
     @cached_property
     def UniqueBaseName(self):
-        retVal ={}
+        retVal = {}
         ModuleNameDict = {}
         UniqueName = {}
         for Module in self._MbList:
-            unique_base_name = '%s_%s' % (Module.BaseName,self.ModuleGuid(Module))
+            unique_base_name = '%s_%s' % (
+                Module.BaseName, self.ModuleGuid(Module))
             if unique_base_name not in ModuleNameDict:
                 ModuleNameDict[unique_base_name] = []
             ModuleNameDict[unique_base_name].append(Module.MetaFile)
             if Module.BaseName not in UniqueName:
                 UniqueName[Module.BaseName] = set()
-            UniqueName[Module.BaseName].add((self.ModuleGuid(Module),Module.MetaFile))
+            UniqueName[Module.BaseName].add(
+                (self.ModuleGuid(Module), Module.MetaFile))
         for module_paths in ModuleNameDict.values():
-            if len(set(module_paths))>1:
+            if len(set(module_paths)) > 1:
                 samemodules = list(set(module_paths))
                 EdkLogger.error("build", FILE_DUPLICATED, 'Modules have same BaseName and FILE_GUID:\n'
-                                    '  %s\n  %s' % (samemodules[0], samemodules[1]))
+                                '  %s\n  %s' % (samemodules[0], samemodules[1]))
         for name in UniqueName:
             Guid_Path = UniqueName[name]
             if len(Guid_Path) > 1:
-                for guid,mpath in Guid_Path:
-                    retVal[(name,mpath)] = '%s_%s' % (name,guid)
+                for guid, mpath in Guid_Path:
+                    retVal[(name, mpath)] = '%s_%s' % (name, guid)
         return retVal
-    ## Expand * in build option key
+    # Expand * in build option key
     #
     #   @param  Options     Options to be expanded
     #   @param  ToolDef     Use specified ToolDef instead of full version.
@@ -1459,11 +1558,12 @@ class PlatformAutoGen(AutoGen):
     #
     #   @retval options     Options expanded
     #
+
     def _ExpandBuildOption(self, Options, ModuleStyle=None, ToolDef=None):
         if not ToolDef:
             ToolDef = self.ToolDefinition
         BuildOptions = {}
-        FamilyMatch  = False
+        FamilyMatch = False
         FamilyIsNull = True
 
         OverrideList = {}
@@ -1476,12 +1576,12 @@ class PlatformAutoGen(AutoGen):
             # Key[1] -- TARGET_TOOLCHAIN_ARCH_COMMANDTYPE_ATTRIBUTE
             #
             if (Key[0] == self.BuildRuleFamily and
-                (ModuleStyle is None or len(Key) < 3 or (len(Key) > 2 and Key[2] == ModuleStyle))):
+                    (ModuleStyle is None or len(Key) < 3 or (len(Key) > 2 and Key[2] == ModuleStyle))):
                 Target, ToolChain, Arch, CommandType, Attr = Key[1].split('_')
                 if (Target == self.BuildTarget or Target == TAB_STAR) and\
                     (ToolChain == self.ToolChain or ToolChain == TAB_STAR) and\
                     (Arch == self.Arch or Arch == TAB_STAR) and\
-                    Options[Key].startswith("="):
+                        Options[Key].startswith("="):
 
                     if OverrideList.get(Key[1]) is not None:
                         OverrideList.pop(Key[1])
@@ -1494,18 +1594,20 @@ class PlatformAutoGen(AutoGen):
             KeyList = list(OverrideList.keys())
             for Index in range(len(KeyList)):
                 NowKey = KeyList[Index]
-                Target1, ToolChain1, Arch1, CommandType1, Attr1 = NowKey.split("_")
+                Target1, ToolChain1, Arch1, CommandType1, Attr1 = NowKey.split(
+                    "_")
                 for Index1 in range(len(KeyList) - Index - 1):
                     NextKey = KeyList[Index1 + Index + 1]
                     #
                     # Compare two Key, if one is included by another, choose the higher priority one
                     #
-                    Target2, ToolChain2, Arch2, CommandType2, Attr2 = NextKey.split("_")
+                    Target2, ToolChain2, Arch2, CommandType2, Attr2 = NextKey.split(
+                        "_")
                     if (Target1 == Target2 or Target1 == TAB_STAR or Target2 == TAB_STAR) and\
                         (ToolChain1 == ToolChain2 or ToolChain1 == TAB_STAR or ToolChain2 == TAB_STAR) and\
                         (Arch1 == Arch2 or Arch1 == TAB_STAR or Arch2 == TAB_STAR) and\
                         (CommandType1 == CommandType2 or CommandType1 == TAB_STAR or CommandType2 == TAB_STAR) and\
-                        (Attr1 == Attr2 or Attr1 == TAB_STAR or Attr2 == TAB_STAR):
+                            (Attr1 == Attr2 or Attr1 == TAB_STAR or Attr2 == TAB_STAR):
 
                         if CalculatePriorityValue(NowKey) > CalculatePriorityValue(NextKey):
                             if Options.get((self.BuildRuleFamily, NextKey)) is not None:
@@ -1515,7 +1617,7 @@ class PlatformAutoGen(AutoGen):
                                 Options.pop((self.BuildRuleFamily, NowKey))
 
         for Key in Options:
-            if ModuleStyle is not None and len (Key) > 2:
+            if ModuleStyle is not None and len(Key) > 2:
                 # Check Module style is EDK or EDKII.
                 # Only append build option for the matched style module.
                 if ModuleStyle == EDK_NAME and Key[2] != EDK_NAME:
@@ -1561,7 +1663,7 @@ class PlatformAutoGen(AutoGen):
             return BuildOptions
 
         for Key in Options:
-            if ModuleStyle is not None and len (Key) > 2:
+            if ModuleStyle is not None and len(Key) > 2:
                 # Check Module style is EDK or EDKII.
                 # Only append build option for the matched style module.
                 if ModuleStyle == EDK_NAME and Key[2] != EDK_NAME:
diff --git a/BaseTools/Source/Python/AutoGen/StrGather.py b/BaseTools/Source/Python/AutoGen/StrGather.py
index eed30388bea1..441f6ebcbf0e 100644
--- a/BaseTools/Source/Python/AutoGen/StrGather.py
+++ b/BaseTools/Source/Python/AutoGen/StrGather.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to parse a strings file and create or add to a string database
 # file.
 #
@@ -54,7 +54,8 @@ NOT_REFERENCED = 'not referenced'
 COMMENT_NOT_REFERENCED = ' ' + COMMENT + NOT_REFERENCED
 CHAR_ARRAY_DEFIN = 'unsigned char'
 COMMON_FILE_NAME = 'Strings'
-STRING_TOKEN = re.compile('STRING_TOKEN *\(([A-Z0-9_]+) *\)', re.MULTILINE | re.UNICODE)
+STRING_TOKEN = re.compile(
+    'STRING_TOKEN *\(([A-Z0-9_]+) *\)', re.MULTILINE | re.UNICODE)
 
 EFI_HII_ARRAY_SIZE_LENGTH = 4
 EFI_HII_PACKAGE_HEADER_LENGTH = 4
@@ -65,17 +66,19 @@ EFI_STRING_ID_LENGTH = 2
 EFI_HII_LANGUAGE_WINDOW = 0
 EFI_HII_LANGUAGE_WINDOW_LENGTH = 2
 EFI_HII_LANGUAGE_WINDOW_NUMBER = 16
-EFI_HII_STRING_PACKAGE_HDR_LENGTH = EFI_HII_PACKAGE_HEADER_LENGTH + EFI_HII_HDR_SIZE_LENGTH + EFI_HII_STRING_OFFSET_LENGTH + EFI_HII_LANGUAGE_WINDOW_LENGTH * EFI_HII_LANGUAGE_WINDOW_NUMBER + EFI_STRING_ID_LENGTH
+EFI_HII_STRING_PACKAGE_HDR_LENGTH = EFI_HII_PACKAGE_HEADER_LENGTH + EFI_HII_HDR_SIZE_LENGTH + \
+    EFI_HII_STRING_OFFSET_LENGTH + EFI_HII_LANGUAGE_WINDOW_LENGTH * \
+    EFI_HII_LANGUAGE_WINDOW_NUMBER + EFI_STRING_ID_LENGTH
 
-H_C_FILE_HEADER = ['//', \
-                   '//  DO NOT EDIT -- auto-generated file', \
-                   '//', \
-                   '//  This file is generated by the StrGather utility', \
+H_C_FILE_HEADER = ['//',
+                   '//  DO NOT EDIT -- auto-generated file',
+                   '//',
+                   '//  This file is generated by the StrGather utility',
                    '//']
 LANGUAGE_NAME_STRING_NAME = '$LANGUAGE_NAME'
 PRINTABLE_LANGUAGE_NAME_STRING_NAME = '$PRINTABLE_LANGUAGE_NAME'
 
-## Convert a dec number to a hex string
+# Convert a dec number to a hex string
 #
 # Convert a dec number to a formatted hex string in length digit
 # The digit is set to default 8
@@ -88,10 +91,12 @@ PRINTABLE_LANGUAGE_NAME_STRING_NAME = '$PRINTABLE_LANGUAGE_NAME'
 #
 # @retval:       The formatted hex string
 #
-def DecToHexStr(Dec, Digit = 8):
+
+
+def DecToHexStr(Dec, Digit=8):
     return '0x{0:0{1}X}'.format(Dec, Digit)
 
-## Convert a dec number to a hex list
+# Convert a dec number to a hex list
 #
 # Convert a dec number to a formatted hex list in size digit
 # The digit is set to default 8
@@ -103,11 +108,13 @@ def DecToHexStr(Dec, Digit = 8):
 #
 # @retval:       A list for formatted hex string
 #
-def DecToHexList(Dec, Digit = 8):
+
+
+def DecToHexList(Dec, Digit=8):
     Hex = '{0:0{1}X}'.format(Dec, Digit)
     return ["0x" + Hex[Bit:Bit + 2] for Bit in range(Digit - 2, -1, -2)]
 
-## Convert a acsii string to a hex list
+# Convert an ASCII string to a hex list
 #
 # Convert a acsii string to a formatted hex list
 # AscToHexList('en-US') is ['0x65', '0x6E', '0x2D', '0x55', '0x53']
@@ -116,13 +123,15 @@ def DecToHexList(Dec, Digit = 8):
 #
 # @retval:       A list for formatted hex string
 #
+
+
 def AscToHexList(Ascii):
     try:
         return ['0x{0:02X}'.format(Item) for Item in Ascii]
     except:
         return ['0x{0:02X}'.format(ord(Item)) for Item in Ascii]
 
-## Create content of .h file
+# Create content of .h file
 #
 # Create content of .h file
 #
@@ -133,16 +142,22 @@ def AscToHexList(Ascii):
 #
 # @retval Str:           A string of .h file content
 #
+
+
 def CreateHFileContent(BaseName, UniObjectClass, IsCompatibleMode, UniGenCFlag):
     Str = []
     ValueStartPtr = 60
-    Line = COMMENT_DEFINE_STR + ' ' + LANGUAGE_NAME_STRING_NAME + ' ' * (ValueStartPtr - len(DEFINE_STR + LANGUAGE_NAME_STRING_NAME)) + DecToHexStr(0, 4) + COMMENT_NOT_REFERENCED
+    Line = COMMENT_DEFINE_STR + ' ' + LANGUAGE_NAME_STRING_NAME + ' ' * \
+        (ValueStartPtr - len(DEFINE_STR + LANGUAGE_NAME_STRING_NAME)) + \
+        DecToHexStr(0, 4) + COMMENT_NOT_REFERENCED
     Str = WriteLine(Str, Line)
-    Line = COMMENT_DEFINE_STR + ' ' + PRINTABLE_LANGUAGE_NAME_STRING_NAME + ' ' * (ValueStartPtr - len(DEFINE_STR + PRINTABLE_LANGUAGE_NAME_STRING_NAME)) + DecToHexStr(1, 4) + COMMENT_NOT_REFERENCED
+    Line = COMMENT_DEFINE_STR + ' ' + PRINTABLE_LANGUAGE_NAME_STRING_NAME + ' ' * \
+        (ValueStartPtr - len(DEFINE_STR + PRINTABLE_LANGUAGE_NAME_STRING_NAME)
+         ) + DecToHexStr(1, 4) + COMMENT_NOT_REFERENCED
     Str = WriteLine(Str, Line)
     UnusedStr = ''
 
-    #Group the referred/Unused STRING token together.
+    # Group the referenced/unused STRING tokens together.
     for Index in range(2, len(UniObjectClass.OrderedStringList[UniObjectClass.LanguageDef[0][0]])):
         StringItem = UniObjectClass.OrderedStringList[UniObjectClass.LanguageDef[0][0]][Index]
         Name = StringItem.StringName
@@ -152,25 +167,31 @@ def CreateHFileContent(BaseName, UniObjectClass, IsCompatibleMode, UniGenCFlag):
             Line = ''
             if Referenced == True:
                 if (ValueStartPtr - len(DEFINE_STR + Name)) <= 0:
-                    Line = DEFINE_STR + ' ' + Name + ' ' + DecToHexStr(Token, 4)
+                    Line = DEFINE_STR + ' ' + Name + \
+                        ' ' + DecToHexStr(Token, 4)
                 else:
-                    Line = DEFINE_STR + ' ' + Name + ' ' * (ValueStartPtr - len(DEFINE_STR + Name)) + DecToHexStr(Token, 4)
+                    Line = DEFINE_STR + ' ' + Name + ' ' * \
+                        (ValueStartPtr - len(DEFINE_STR + Name)) + \
+                        DecToHexStr(Token, 4)
                 Str = WriteLine(Str, Line)
             else:
                 if (ValueStartPtr - len(DEFINE_STR + Name)) <= 0:
-                    Line = COMMENT_DEFINE_STR + ' ' + Name + ' ' + DecToHexStr(Token, 4) + COMMENT_NOT_REFERENCED
+                    Line = COMMENT_DEFINE_STR + ' ' + Name + ' ' + \
+                        DecToHexStr(Token, 4) + COMMENT_NOT_REFERENCED
                 else:
-                    Line = COMMENT_DEFINE_STR + ' ' + Name + ' ' * (ValueStartPtr - len(DEFINE_STR + Name)) + DecToHexStr(Token, 4) + COMMENT_NOT_REFERENCED
+                    Line = COMMENT_DEFINE_STR + ' ' + Name + ' ' * \
+                        (ValueStartPtr - len(DEFINE_STR + Name)) + \
+                        DecToHexStr(Token, 4) + COMMENT_NOT_REFERENCED
                 UnusedStr = WriteLine(UnusedStr, Line)
 
-    Str.extend( UnusedStr)
+    Str.extend(UnusedStr)
 
     Str = WriteLine(Str, '')
     if IsCompatibleMode or UniGenCFlag:
         Str = WriteLine(Str, 'extern unsigned char ' + BaseName + 'Strings[];')
     return "".join(Str)
 
-## Create a complete .h file
+# Create a complete .h file
 #
 # Create a complet .h file with file header and file content
 #
@@ -181,21 +202,26 @@ def CreateHFileContent(BaseName, UniObjectClass, IsCompatibleMode, UniGenCFlag):
 #
 # @retval Str:           A string of complete .h file
 #
+
+
 def CreateHFile(BaseName, UniObjectClass, IsCompatibleMode, UniGenCFlag):
-    HFile = WriteLine('', CreateHFileContent(BaseName, UniObjectClass, IsCompatibleMode, UniGenCFlag))
+    HFile = WriteLine('', CreateHFileContent(
+        BaseName, UniObjectClass, IsCompatibleMode, UniGenCFlag))
 
     return "".join(HFile)
 
-## Create a buffer to store all items in an array
+# Create a buffer to store all items in an array
 #
 # @param BinBuffer   Buffer to contain Binary data.
 # @param Array:      The array need to be formatted
 #
+
+
 def CreateBinBuffer(BinBuffer, Array):
     for Item in Array:
         BinBuffer.write(pack("B", int(Item, 16)))
 
-## Create a formatted string all items in an array
+# Create a formatted string of all items in an array
 #
 # Use ',' to join each item in an array, and break an new line when reaching the width (default is 16)
 #
@@ -204,7 +230,9 @@ def CreateBinBuffer(BinBuffer, Array):
 #
 # @retval ArrayItem: A string for all formatted array items
 #
-def CreateArrayItem(Array, Width = 16):
+
+
+def CreateArrayItem(Array, Width=16):
     MaxLength = Width
     Index = 0
     Line = '  '
@@ -222,7 +250,7 @@ def CreateArrayItem(Array, Width = 16):
 
     return "".join(ArrayItem)
 
-## CreateCFileStringValue
+# CreateCFileStringValue
 #
 # Create a line with string value
 #
@@ -231,13 +259,14 @@ def CreateArrayItem(Array, Width = 16):
 # @retval Str:   A formatted string with string value
 #
 
+
 def CreateCFileStringValue(Value):
     Value = [StringBlockType] + Value
     Str = WriteLine('', CreateArrayItem(Value))
 
     return "".join(Str)
 
-## GetFilteredLanguage
+# GetFilteredLanguage
 #
 # apply get best language rules to the UNI language code list
 #
@@ -246,6 +275,8 @@ def CreateCFileStringValue(Value):
 #
 # @retval UniLanguageListFiltered:   the filtered language code
 #
+
+
 def GetFilteredLanguage(UniLanguageList, LanguageFilterList):
     UniLanguageListFiltered = []
     # if filter list is empty, then consider there is no filter
@@ -269,12 +300,14 @@ def GetFilteredLanguage(UniLanguageList, LanguageFilterList):
 
             for UniLanguage in UniLanguageList:
                 if UniLanguage.find('-') != -1:
-                    UniLanguagePrimaryTag = UniLanguage[0:UniLanguage.find('-')].lower()
+                    UniLanguagePrimaryTag = UniLanguage[0:UniLanguage.find(
+                        '-')].lower()
                 else:
                     UniLanguagePrimaryTag = UniLanguage
 
                 if len(UniLanguagePrimaryTag) == 3:
-                    UniLanguagePrimaryTag = LangConvTable.get(UniLanguagePrimaryTag)
+                    UniLanguagePrimaryTag = LangConvTable.get(
+                        UniLanguagePrimaryTag)
 
                 if PrimaryTag == UniLanguagePrimaryTag:
                     if UniLanguage not in UniLanguageListFiltered:
@@ -294,10 +327,10 @@ def GetFilteredLanguage(UniLanguageList, LanguageFilterList):
                             break
                     else:
                         UniLanguageListFiltered.append(DefaultTag)
-    return  UniLanguageListFiltered
+    return UniLanguageListFiltered
 
 
-## Create content of .c file
+# Create content of .c file
 #
 # Create content of .c file
 #
@@ -322,14 +355,15 @@ def CreateCFileContent(BaseName, UniObjectClass, IsCompatibleMode, UniBinBuffer,
         LanguageFilterList = FilterInfo[1]
     else:
         # EDK module is using ISO639-2 format filter, convert to the RFC4646 format
-        LanguageFilterList = [LangConvTable.get(F.lower()) for F in FilterInfo[1]]
+        LanguageFilterList = [LangConvTable.get(
+            F.lower()) for F in FilterInfo[1]]
 
     UniLanguageList = []
     for IndexI in range(len(UniObjectClass.LanguageDef)):
         UniLanguageList += [UniObjectClass.LanguageDef[IndexI][0]]
 
-    UniLanguageListFiltered = GetFilteredLanguage(UniLanguageList, LanguageFilterList)
-
+    UniLanguageListFiltered = GetFilteredLanguage(
+        UniLanguageList, LanguageFilterList)
 
     #
     # Create lines for each language's strings
@@ -358,16 +392,21 @@ def CreateCFileContent(BaseName, UniObjectClass, IsCompatibleMode, UniBinBuffer,
                 Index = Index + 1
             else:
                 if NumberOfUseOtherLangDef > 0:
-                    StrStringValue = WriteLine(StrStringValue, CreateArrayItem([StringSkipType] + DecToHexList(NumberOfUseOtherLangDef, 4)))
-                    CreateBinBuffer (StringBuffer, ([StringSkipType] + DecToHexList(NumberOfUseOtherLangDef, 4)))
+                    StrStringValue = WriteLine(StrStringValue, CreateArrayItem(
+                        [StringSkipType] + DecToHexList(NumberOfUseOtherLangDef, 4)))
+                    CreateBinBuffer(
+                        StringBuffer, ([StringSkipType] + DecToHexList(NumberOfUseOtherLangDef, 4)))
                     NumberOfUseOtherLangDef = 0
                     ArrayLength = ArrayLength + 3
                 if Referenced and Item.Token > 0:
                     Index = Index + 1
-                    StrStringValue = WriteLine(StrStringValue, "// %s: %s:%s" % (DecToHexStr(Index, 4), Name, DecToHexStr(Token, 4)))
-                    StrStringValue = Write(StrStringValue, CreateCFileStringValue(Value))
-                    CreateBinBuffer (StringBuffer, [StringBlockType] + Value)
-                    ArrayLength = ArrayLength + Item.Length + 1 # 1 is for the length of string type
+                    StrStringValue = WriteLine(
+                        StrStringValue, "// %s: %s:%s" % (DecToHexStr(Index, 4), Name, DecToHexStr(Token, 4)))
+                    StrStringValue = Write(
+                        StrStringValue, CreateCFileStringValue(Value))
+                    CreateBinBuffer(StringBuffer, [StringBlockType] + Value)
+                    ArrayLength = ArrayLength + Item.Length + \
+                        1  # 1 is for the length of string type
 
         #
         # EFI_HII_PACKAGE_HEADER
@@ -382,13 +421,13 @@ def CreateCFileContent(BaseName, UniObjectClass, IsCompatibleMode, UniBinBuffer,
         TotalLength = TotalLength + ArrayLength
 
         List = DecToHexList(ArrayLength, 6) + \
-               [StringPackageType] + \
-               DecToHexList(Offset) + \
-               DecToHexList(Offset) + \
-               DecToHexList(EFI_HII_LANGUAGE_WINDOW, EFI_HII_LANGUAGE_WINDOW_LENGTH * 2) * EFI_HII_LANGUAGE_WINDOW_NUMBER + \
-               DecToHexList(EFI_STRING_ID, 4) + \
-               AscToHexList(Language) + \
-               DecToHexList(0, 2)
+            [StringPackageType] + \
+            DecToHexList(Offset) + \
+            DecToHexList(Offset) + \
+            DecToHexList(EFI_HII_LANGUAGE_WINDOW, EFI_HII_LANGUAGE_WINDOW_LENGTH * 2) * EFI_HII_LANGUAGE_WINDOW_NUMBER + \
+            DecToHexList(EFI_STRING_ID, 4) + \
+            AscToHexList(Language) + \
+            DecToHexList(0, 2)
         Str = WriteLine(Str, CreateArrayItem(List, 16) + '\n')
 
         #
@@ -406,23 +445,25 @@ def CreateCFileContent(BaseName, UniObjectClass, IsCompatibleMode, UniBinBuffer,
         # Create binary UNI string
         #
         if UniBinBuffer:
-            CreateBinBuffer (UniBinBuffer, List)
-            UniBinBuffer.write (StringBuffer.getvalue())
-            UniBinBuffer.write (pack("B", int(EFI_HII_SIBT_END, 16)))
+            CreateBinBuffer(UniBinBuffer, List)
+            UniBinBuffer.write(StringBuffer.getvalue())
+            UniBinBuffer.write(pack("B", int(EFI_HII_SIBT_END, 16)))
         StringBuffer.close()
 
     #
     # Create line for string variable name
     # "unsigned char $(BaseName)Strings[] = {"
     #
-    AllStr = WriteLine('', CHAR_ARRAY_DEFIN + ' ' + BaseName + COMMON_FILE_NAME + '[] = {\n')
+    AllStr = WriteLine('', CHAR_ARRAY_DEFIN + ' ' +
+                       BaseName + COMMON_FILE_NAME + '[] = {\n')
 
     if IsCompatibleMode:
         #
         # Create FRAMEWORK_EFI_HII_PACK_HEADER in compatible mode
         #
         AllStr = WriteLine(AllStr, '// FRAMEWORK PACKAGE HEADER Length')
-        AllStr = WriteLine(AllStr, CreateArrayItem(DecToHexList(TotalLength + 2)) + '\n')
+        AllStr = WriteLine(AllStr, CreateArrayItem(
+            DecToHexList(TotalLength + 2)) + '\n')
         AllStr = WriteLine(AllStr, '// FRAMEWORK PACKAGE HEADER Type')
         AllStr = WriteLine(AllStr, CreateArrayItem(DecToHexList(2, 4)) + '\n')
     else:
@@ -430,7 +471,8 @@ def CreateCFileContent(BaseName, UniObjectClass, IsCompatibleMode, UniBinBuffer,
         # Create whole array length in UEFI mode
         #
         AllStr = WriteLine(AllStr, '// STRGATHER_OUTPUT_HEADER')
-        AllStr = WriteLine(AllStr, CreateArrayItem(DecToHexList(TotalLength)) + '\n')
+        AllStr = WriteLine(AllStr, CreateArrayItem(
+            DecToHexList(TotalLength)) + '\n')
 
     #
     # Join package data
@@ -439,17 +481,19 @@ def CreateCFileContent(BaseName, UniObjectClass, IsCompatibleMode, UniBinBuffer,
 
     return "".join(AllStr)
 
-## Create end of .c file
+# Create end of .c file
 #
 # Create end of .c file
 #
 # @retval Str:           A string of .h file end
 #
+
+
 def CreateCFileEnd():
     Str = Write('', '};')
     return Str
 
-## Create a .c file
+# Create a .c file
 #
 # Create a complete .c file
 #
@@ -460,13 +504,16 @@ def CreateCFileEnd():
 #
 # @retval CFile:          A string of complete .c file
 #
+
+
 def CreateCFile(BaseName, UniObjectClass, IsCompatibleMode, FilterInfo):
     CFile = ''
-    CFile = WriteLine(CFile, CreateCFileContent(BaseName, UniObjectClass, IsCompatibleMode, None, FilterInfo))
+    CFile = WriteLine(CFile, CreateCFileContent(
+        BaseName, UniObjectClass, IsCompatibleMode, None, FilterInfo))
     CFile = WriteLine(CFile, CreateCFileEnd())
     return "".join(CFile)
 
-## GetFileList
+# GetFileList
 #
 # Get a list for all files
 #
@@ -475,9 +522,12 @@ def CreateCFile(BaseName, UniObjectClass, IsCompatibleMode, FilterInfo):
 #
 # @retval FileList:    A list of all files found
 #
+
+
 def GetFileList(SourceFileList, IncludeList, SkipList):
     if IncludeList is None:
-        EdkLogger.error("UnicodeStringGather", AUTOGEN_ERROR, "Include path for unicode file is not defined")
+        EdkLogger.error("UnicodeStringGather", AUTOGEN_ERROR,
+                        "Include path for unicode file is not defined")
 
     FileList = []
     if SkipList is None:
@@ -499,7 +549,8 @@ def GetFileList(SourceFileList, IncludeList, SkipList):
             IsSkip = False
             for Skip in SkipList:
                 if os.path.splitext(File)[1].upper() == Skip.upper():
-                    EdkLogger.verbose("Skipped %s for string token uses search" % File)
+                    EdkLogger.verbose(
+                        "Skipped %s for string token uses search" % File)
                     IsSkip = True
                     break
 
@@ -510,7 +561,7 @@ def GetFileList(SourceFileList, IncludeList, SkipList):
 
     return FileList
 
-## SearchString
+# SearchString
 #
 # Search whether all string defined in UniObjectClass are referenced
 # All string used should be set to Referenced
@@ -521,6 +572,8 @@ def GetFileList(SourceFileList, IncludeList, SkipList):
 #
 # @retval UniObjectClass: UniObjectClass after searched
 #
+
+
 def SearchString(UniObjectClass, FileList, IsCompatibleMode):
     if FileList == []:
         return UniObjectClass
@@ -531,55 +584,65 @@ def SearchString(UniObjectClass, FileList, IsCompatibleMode):
                 Lines = open(File, 'r')
                 for Line in Lines:
                     for StrName in STRING_TOKEN.findall(Line):
-                        EdkLogger.debug(EdkLogger.DEBUG_5, "Found string identifier: " + StrName)
+                        EdkLogger.debug(EdkLogger.DEBUG_5,
+                                        "Found string identifier: " + StrName)
                         UniObjectClass.SetStringReferenced(StrName)
         except:
-            EdkLogger.error("UnicodeStringGather", AUTOGEN_ERROR, "SearchString: Error while processing file", File=File, RaiseError=False)
+            EdkLogger.error("UnicodeStringGather", AUTOGEN_ERROR,
+                            "SearchString: Error while processing file", File=File, RaiseError=False)
             raise
 
     UniObjectClass.ReToken()
 
     return UniObjectClass
 
-## GetStringFiles
+# GetStringFiles
 #
 # This function is used for UEFI2.1 spec
 #
 #
-def GetStringFiles(UniFilList, SourceFileList, IncludeList, IncludePathList, SkipList, BaseName, IsCompatibleMode = False, ShellMode = False, UniGenCFlag = True, UniGenBinBuffer = None, FilterInfo = [True, []]):
+
+
+def GetStringFiles(UniFilList, SourceFileList, IncludeList, IncludePathList, SkipList, BaseName, IsCompatibleMode=False, ShellMode=False, UniGenCFlag=True, UniGenBinBuffer=None, FilterInfo=[True, []]):
     if len(UniFilList) > 0:
         if ShellMode:
             #
             # support ISO 639-2 codes in .UNI files of EDK Shell
             #
-            Uni = UniFileClassObject(sorted(UniFilList, key=lambda x: x.File), True, IncludePathList)
+            Uni = UniFileClassObject(
+                sorted(UniFilList, key=lambda x: x.File), True, IncludePathList)
         else:
-            Uni = UniFileClassObject(sorted(UniFilList, key=lambda x: x.File), IsCompatibleMode, IncludePathList)
+            Uni = UniFileClassObject(
+                sorted(UniFilList, key=lambda x: x.File), IsCompatibleMode, IncludePathList)
     else:
-        EdkLogger.error("UnicodeStringGather", AUTOGEN_ERROR, 'No unicode files given')
+        EdkLogger.error("UnicodeStringGather", AUTOGEN_ERROR,
+                        'No unicode files given')
 
     FileList = GetFileList(SourceFileList, IncludeList, SkipList)
 
-    Uni = SearchString(Uni, sorted (FileList), IsCompatibleMode)
+    Uni = SearchString(Uni, sorted(FileList), IsCompatibleMode)
 
     HFile = CreateHFile(BaseName, Uni, IsCompatibleMode, UniGenCFlag)
     CFile = None
     if IsCompatibleMode or UniGenCFlag:
         CFile = CreateCFile(BaseName, Uni, IsCompatibleMode, FilterInfo)
     if UniGenBinBuffer:
-        CreateCFileContent(BaseName, Uni, IsCompatibleMode, UniGenBinBuffer, FilterInfo)
+        CreateCFileContent(BaseName, Uni, IsCompatibleMode,
+                           UniGenBinBuffer, FilterInfo)
 
     return HFile, CFile
 
 #
 # Write an item
 #
+
+
 def Write(Target, Item):
-    if isinstance(Target,str):
+    if isinstance(Target, str):
         Target = [Target]
     if not Target:
         Target = []
-    if isinstance(Item,list):
+    if isinstance(Item, list):
         Target.extend(Item)
     else:
         Target.append(Item)
@@ -588,8 +651,10 @@ def Write(Target, Item):
 #
 # Write an item with a break line
 #
+
+
 def WriteLine(Target, Item):
-    if isinstance(Target,str):
+    if isinstance(Target, str):
         Target = [Target]
     if not Target:
         Target = []
@@ -600,14 +665,15 @@ def WriteLine(Target, Item):
     Target.append('\n')
     return Target
 
+
 # This acts like the main() function for the script, unless it is 'import'ed into another
 # script.
 if __name__ == '__main__':
     EdkLogger.info('start')
 
     UniFileList = [
-                   r'C:\\Edk\\Strings2.uni',
-                   r'C:\\Edk\\Strings.uni'
+        r'C:\\Edk\\Strings2.uni',
+        r'C:\\Edk\\Strings.uni'
     ]
 
     SrcFileList = []
@@ -616,12 +682,13 @@ if __name__ == '__main__':
             SrcFileList.append(File)
 
     IncludeList = [
-                   r'C:\\Edk'
+        r'C:\\Edk'
     ]
 
     SkipList = ['.inf', '.uni']
     BaseName = 'DriverSample'
-    (h, c) = GetStringFiles(UniFileList, SrcFileList, IncludeList, SkipList, BaseName, True)
+    (h, c) = GetStringFiles(UniFileList, SrcFileList,
+                            IncludeList, SkipList, BaseName, True)
     hfile = open('unistring.h', 'w')
     cfile = open('unistring.c', 'w')
     hfile.write(h)
diff --git a/BaseTools/Source/Python/AutoGen/UniClassObject.py b/BaseTools/Source/Python/AutoGen/UniClassObject.py
index b16330e36825..753364560e04 100644
--- a/BaseTools/Source/Python/AutoGen/UniClassObject.py
+++ b/BaseTools/Source/Python/AutoGen/UniClassObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to collect all defined strings in multiple uni files
 #
 #
@@ -11,7 +11,9 @@
 # Import Modules
 #
 from __future__ import print_function
-import Common.LongFilePathOs as os, codecs, re
+import Common.LongFilePathOs as os
+import codecs
+import re
 import shlex
 import Common.EdkLogger as EdkLogger
 from io import BytesIO
@@ -38,9 +40,10 @@ NULL = u'\u0000'
 TAB = u'\t'
 BACK_SLASH_PLACEHOLDER = u'\u0006'
 
-gIncludePattern = re.compile("^#include +[\"<]+([^\"< >]+)[>\"]+$", re.MULTILINE | re.UNICODE)
+gIncludePattern = re.compile(
+    "^#include +[\"<]+([^\"< >]+)[>\"]+$", re.MULTILINE | re.UNICODE)
 
-## Convert a unicode string to a Hex list
+# Convert a unicode string to a Hex list
 #
 # Convert a unicode string to a Hex list
 # UniToHexList('ABC') is ['0x41', '0x00', '0x42', '0x00', '0x43', '0x00']
@@ -49,6 +52,8 @@ gIncludePattern = re.compile("^#include +[\"<]+([^\"< >]+)[>\"]+$", re.MULTILINE
 #
 # @retval List:  The formatted hex list
 #
+
+
 def UniToHexList(Uni):
     List = []
     for Item in Uni:
@@ -57,40 +62,41 @@ def UniToHexList(Uni):
         List.append('0x' + Temp[0:2])
     return List
 
-LangConvTable = {'eng':'en', 'fra':'fr', \
-                 'aar':'aa', 'abk':'ab', 'ave':'ae', 'afr':'af', 'aka':'ak', 'amh':'am', \
-                 'arg':'an', 'ara':'ar', 'asm':'as', 'ava':'av', 'aym':'ay', 'aze':'az', \
-                 'bak':'ba', 'bel':'be', 'bul':'bg', 'bih':'bh', 'bis':'bi', 'bam':'bm', \
-                 'ben':'bn', 'bod':'bo', 'bre':'br', 'bos':'bs', 'cat':'ca', 'che':'ce', \
-                 'cha':'ch', 'cos':'co', 'cre':'cr', 'ces':'cs', 'chu':'cu', 'chv':'cv', \
-                 'cym':'cy', 'dan':'da', 'deu':'de', 'div':'dv', 'dzo':'dz', 'ewe':'ee', \
-                 'ell':'el', 'epo':'eo', 'spa':'es', 'est':'et', 'eus':'eu', 'fas':'fa', \
-                 'ful':'ff', 'fin':'fi', 'fij':'fj', 'fao':'fo', 'fry':'fy', 'gle':'ga', \
-                 'gla':'gd', 'glg':'gl', 'grn':'gn', 'guj':'gu', 'glv':'gv', 'hau':'ha', \
-                 'heb':'he', 'hin':'hi', 'hmo':'ho', 'hrv':'hr', 'hat':'ht', 'hun':'hu', \
-                 'hye':'hy', 'her':'hz', 'ina':'ia', 'ind':'id', 'ile':'ie', 'ibo':'ig', \
-                 'iii':'ii', 'ipk':'ik', 'ido':'io', 'isl':'is', 'ita':'it', 'iku':'iu', \
-                 'jpn':'ja', 'jav':'jv', 'kat':'ka', 'kon':'kg', 'kik':'ki', 'kua':'kj', \
-                 'kaz':'kk', 'kal':'kl', 'khm':'km', 'kan':'kn', 'kor':'ko', 'kau':'kr', \
-                 'kas':'ks', 'kur':'ku', 'kom':'kv', 'cor':'kw', 'kir':'ky', 'lat':'la', \
-                 'ltz':'lb', 'lug':'lg', 'lim':'li', 'lin':'ln', 'lao':'lo', 'lit':'lt', \
-                 'lub':'lu', 'lav':'lv', 'mlg':'mg', 'mah':'mh', 'mri':'mi', 'mkd':'mk', \
-                 'mal':'ml', 'mon':'mn', 'mar':'mr', 'msa':'ms', 'mlt':'mt', 'mya':'my', \
-                 'nau':'na', 'nob':'nb', 'nde':'nd', 'nep':'ne', 'ndo':'ng', 'nld':'nl', \
-                 'nno':'nn', 'nor':'no', 'nbl':'nr', 'nav':'nv', 'nya':'ny', 'oci':'oc', \
-                 'oji':'oj', 'orm':'om', 'ori':'or', 'oss':'os', 'pan':'pa', 'pli':'pi', \
-                 'pol':'pl', 'pus':'ps', 'por':'pt', 'que':'qu', 'roh':'rm', 'run':'rn', \
-                 'ron':'ro', 'rus':'ru', 'kin':'rw', 'san':'sa', 'srd':'sc', 'snd':'sd', \
-                 'sme':'se', 'sag':'sg', 'sin':'si', 'slk':'sk', 'slv':'sl', 'smo':'sm', \
-                 'sna':'sn', 'som':'so', 'sqi':'sq', 'srp':'sr', 'ssw':'ss', 'sot':'st', \
-                 'sun':'su', 'swe':'sv', 'swa':'sw', 'tam':'ta', 'tel':'te', 'tgk':'tg', \
-                 'tha':'th', 'tir':'ti', 'tuk':'tk', 'tgl':'tl', 'tsn':'tn', 'ton':'to', \
-                 'tur':'tr', 'tso':'ts', 'tat':'tt', 'twi':'tw', 'tah':'ty', 'uig':'ug', \
-                 'ukr':'uk', 'urd':'ur', 'uzb':'uz', 'ven':'ve', 'vie':'vi', 'vol':'vo', \
-                 'wln':'wa', 'wol':'wo', 'xho':'xh', 'yid':'yi', 'yor':'yo', 'zha':'za', \
-                 'zho':'zh', 'zul':'zu'}
 
-## GetLanguageCode
+LangConvTable = {'eng': 'en', 'fra': 'fr',
+                 'aar': 'aa', 'abk': 'ab', 'ave': 'ae', 'afr': 'af', 'aka': 'ak', 'amh': 'am',
+                 'arg': 'an', 'ara': 'ar', 'asm': 'as', 'ava': 'av', 'aym': 'ay', 'aze': 'az',
+                 'bak': 'ba', 'bel': 'be', 'bul': 'bg', 'bih': 'bh', 'bis': 'bi', 'bam': 'bm',
+                 'ben': 'bn', 'bod': 'bo', 'bre': 'br', 'bos': 'bs', 'cat': 'ca', 'che': 'ce',
+                 'cha': 'ch', 'cos': 'co', 'cre': 'cr', 'ces': 'cs', 'chu': 'cu', 'chv': 'cv',
+                 'cym': 'cy', 'dan': 'da', 'deu': 'de', 'div': 'dv', 'dzo': 'dz', 'ewe': 'ee',
+                 'ell': 'el', 'epo': 'eo', 'spa': 'es', 'est': 'et', 'eus': 'eu', 'fas': 'fa',
+                 'ful': 'ff', 'fin': 'fi', 'fij': 'fj', 'fao': 'fo', 'fry': 'fy', 'gle': 'ga',
+                 'gla': 'gd', 'glg': 'gl', 'grn': 'gn', 'guj': 'gu', 'glv': 'gv', 'hau': 'ha',
+                 'heb': 'he', 'hin': 'hi', 'hmo': 'ho', 'hrv': 'hr', 'hat': 'ht', 'hun': 'hu',
+                 'hye': 'hy', 'her': 'hz', 'ina': 'ia', 'ind': 'id', 'ile': 'ie', 'ibo': 'ig',
+                 'iii': 'ii', 'ipk': 'ik', 'ido': 'io', 'isl': 'is', 'ita': 'it', 'iku': 'iu',
+                 'jpn': 'ja', 'jav': 'jv', 'kat': 'ka', 'kon': 'kg', 'kik': 'ki', 'kua': 'kj',
+                 'kaz': 'kk', 'kal': 'kl', 'khm': 'km', 'kan': 'kn', 'kor': 'ko', 'kau': 'kr',
+                 'kas': 'ks', 'kur': 'ku', 'kom': 'kv', 'cor': 'kw', 'kir': 'ky', 'lat': 'la',
+                 'ltz': 'lb', 'lug': 'lg', 'lim': 'li', 'lin': 'ln', 'lao': 'lo', 'lit': 'lt',
+                 'lub': 'lu', 'lav': 'lv', 'mlg': 'mg', 'mah': 'mh', 'mri': 'mi', 'mkd': 'mk',
+                 'mal': 'ml', 'mon': 'mn', 'mar': 'mr', 'msa': 'ms', 'mlt': 'mt', 'mya': 'my',
+                 'nau': 'na', 'nob': 'nb', 'nde': 'nd', 'nep': 'ne', 'ndo': 'ng', 'nld': 'nl',
+                 'nno': 'nn', 'nor': 'no', 'nbl': 'nr', 'nav': 'nv', 'nya': 'ny', 'oci': 'oc',
+                 'oji': 'oj', 'orm': 'om', 'ori': 'or', 'oss': 'os', 'pan': 'pa', 'pli': 'pi',
+                 'pol': 'pl', 'pus': 'ps', 'por': 'pt', 'que': 'qu', 'roh': 'rm', 'run': 'rn',
+                 'ron': 'ro', 'rus': 'ru', 'kin': 'rw', 'san': 'sa', 'srd': 'sc', 'snd': 'sd',
+                 'sme': 'se', 'sag': 'sg', 'sin': 'si', 'slk': 'sk', 'slv': 'sl', 'smo': 'sm',
+                 'sna': 'sn', 'som': 'so', 'sqi': 'sq', 'srp': 'sr', 'ssw': 'ss', 'sot': 'st',
+                 'sun': 'su', 'swe': 'sv', 'swa': 'sw', 'tam': 'ta', 'tel': 'te', 'tgk': 'tg',
+                 'tha': 'th', 'tir': 'ti', 'tuk': 'tk', 'tgl': 'tl', 'tsn': 'tn', 'ton': 'to',
+                 'tur': 'tr', 'tso': 'ts', 'tat': 'tt', 'twi': 'tw', 'tah': 'ty', 'uig': 'ug',
+                 'ukr': 'uk', 'urd': 'ur', 'uzb': 'uz', 'ven': 've', 'vie': 'vi', 'vol': 'vo',
+                 'wln': 'wa', 'wol': 'wo', 'xho': 'xh', 'yid': 'yi', 'yor': 'yo', 'zha': 'za',
+                 'zho': 'zh', 'zul': 'zu'}
+
+# GetLanguageCode
 #
 # Check the language code read from .UNI file and convert ISO 639-2 codes to RFC 4646 codes if appropriate
 # ISO 639-2 language codes supported in compatibility mode
@@ -100,6 +106,8 @@ LangConvTable = {'eng':'en', 'fra':'fr', \
 #
 # @retval LangName:  Valid language code in RFC 4646 format or None
 #
+
+
 def GetLanguageCode(LangName, IsCompatibleMode, File):
     length = len(LangName)
     if IsCompatibleMode:
@@ -109,7 +117,8 @@ def GetLanguageCode(LangName, IsCompatibleMode, File):
                 return TempLangName
             return LangName
         else:
-            EdkLogger.error("Unicode File Parser", FORMAT_INVALID, "Invalid ISO 639-2 language code : %s" % LangName, File)
+            EdkLogger.error("Unicode File Parser", FORMAT_INVALID,
+                            "Invalid ISO 639-2 language code : %s" % LangName, File)
 
     if (LangName[0] == 'X' or LangName[0] == 'x') and LangName[1] == '-':
         return LangName
@@ -128,14 +137,17 @@ def GetLanguageCode(LangName, IsCompatibleMode, File):
         if LangName[0:3].isalpha() and LangConvTable.get(LangName.lower()) is None and LangName[3] == '-':
             return LangName
 
-    EdkLogger.error("Unicode File Parser", FORMAT_INVALID, "Invalid RFC 4646 language code : %s" % LangName, File)
+    EdkLogger.error("Unicode File Parser", FORMAT_INVALID,
+                    "Invalid RFC 4646 language code : %s" % LangName, File)
 
-## Ucs2Codec
+# Ucs2Codec
 #
 # This is only a partial codec implementation. It only supports
 # encoding, and is primarily used to check that all the characters are
 # valid for UCS-2.
 #
+
+
 class Ucs2Codec(codecs.Codec):
     def __init__(self):
         self.__utf16 = codecs.lookup('utf-16')
@@ -150,7 +162,10 @@ class Ucs2Codec(codecs.Codec):
                 raise ValueError("Code Point too large to encode in UCS-2")
         return self.__utf16.encode(input)
 
+
 TheUcs2Codec = Ucs2Codec()
+
+
 def Ucs2Search(name):
     if name in ['ucs-2', 'ucs_2']:
         return codecs.CodecInfo(
@@ -159,14 +174,18 @@ def Ucs2Search(name):
             decode=TheUcs2Codec.decode)
     else:
         return None
+
+
 codecs.register(Ucs2Search)
 
-## StringDefClassObject
+# StringDefClassObject
 #
 # A structure for language definition
 #
+
+
 class StringDefClassObject(object):
-    def __init__(self, Name = None, Value = None, Referenced = False, Token = None, UseOtherLangDef = ''):
+    def __init__(self, Name=None, Value=None, Referenced=False, Token=None, UseOtherLangDef=''):
         self.StringName = ''
         self.StringNameByteList = []
         self.StringValue = ''
@@ -188,42 +207,48 @@ class StringDefClassObject(object):
 
     def __str__(self):
         return repr(self.StringName) + ' ' + \
-               repr(self.Token) + ' ' + \
-               repr(self.Referenced) + ' ' + \
-               repr(self.StringValue) + ' ' + \
-               repr(self.UseOtherLangDef)
+            repr(self.Token) + ' ' + \
+            repr(self.Referenced) + ' ' + \
+            repr(self.StringValue) + ' ' + \
+            repr(self.UseOtherLangDef)
 
-    def UpdateValue(self, Value = None):
+    def UpdateValue(self, Value=None):
         if Value is not None:
             self.StringValue = Value + u'\x00'        # Add a NULL at string tail
             self.StringValueByteList = UniToHexList(self.StringValue)
             self.Length = len(self.StringValueByteList)
 
+
 def StripComments(Line):
     Comment = u'//'
     CommentPos = Line.find(Comment)
     while CommentPos >= 0:
-    # if there are non matched quotes before the comment header
-    # then we are in the middle of a string
-    # but we need to ignore the escaped quotes and backslashes.
+        # if there are non matched quotes before the comment header
+        # then we are in the middle of a string
+        # but we need to ignore the escaped quotes and backslashes.
         if ((Line.count(u'"', 0, CommentPos) - Line.count(u'\\"', 0, CommentPos)) & 1) == 1:
-            CommentPos = Line.find (Comment, CommentPos + 1)
+            CommentPos = Line.find(Comment, CommentPos + 1)
         else:
             return Line[:CommentPos].strip()
     return Line.strip()
 
-## UniFileClassObject
+# UniFileClassObject
 #
 # A structure for .uni file definition
 #
+
+
 class UniFileClassObject(object):
-    def __init__(self, FileList = [], IsCompatibleMode = False, IncludePathList = []):
+    def __init__(self, FileList=[], IsCompatibleMode=False, IncludePathList=[]):
         self.FileList = FileList
         self.Token = 2
-        self.LanguageDef = []                   #[ [u'LanguageIdentifier', u'PrintableName'], ... ]
-        self.OrderedStringList = {}             #{ u'LanguageIdentifier' : [StringDefClassObject]  }
-        self.OrderedStringDict = {}             #{ u'LanguageIdentifier' : {StringName:(IndexInList)}  }
-        self.OrderedStringListByToken = {}      #{ u'LanguageIdentifier' : {Token: StringDefClassObject} }
+        self.LanguageDef = []  # [ [u'LanguageIdentifier', u'PrintableName'], ... ]
+        # { u'LanguageIdentifier' : [StringDefClassObject]  }
+        self.OrderedStringList = {}
+        # { u'LanguageIdentifier' : {StringName:(IndexInList)}  }
+        self.OrderedStringDict = {}
+        # { u'LanguageIdentifier' : {Token: StringDefClassObject} }
+        self.OrderedStringListByToken = {}
         self.IsCompatibleMode = IsCompatibleMode
         self.IncludePathList = IncludePathList
         if len(self.FileList) > 0:
@@ -236,23 +261,26 @@ class UniFileClassObject(object):
         Lang = shlex.split(Line.split(u"//")[0])
         if len(Lang) != 3:
             try:
-                FileIn = UniFileClassObject.OpenUniFile(LongFilePath(File.Path))
+                FileIn = UniFileClassObject.OpenUniFile(
+                    LongFilePath(File.Path))
             except UnicodeError as X:
-                EdkLogger.error("build", FILE_READ_FAILURE, "File read failure: %s" % str(X), ExtraData=File);
+                EdkLogger.error("build", FILE_READ_FAILURE,
+                                "File read failure: %s" % str(X), ExtraData=File)
             except:
-                EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=File);
+                EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=File)
             LineNo = GetLineNo(FileIn, Line, False)
             EdkLogger.error("Unicode File Parser", PARSER_ERROR, "Wrong language definition",
                             ExtraData="""%s\n\t*Correct format is like '#langdef en-US "English"'""" % Line, File=File, Line=LineNo)
         else:
-            LangName = GetLanguageCode(Lang[1], self.IsCompatibleMode, self.File)
+            LangName = GetLanguageCode(
+                Lang[1], self.IsCompatibleMode, self.File)
             LangPrintName = Lang[2]
 
         IsLangInDef = False
         for Item in self.LanguageDef:
             if Item[0] == LangName:
                 IsLangInDef = True
-                break;
+                break
 
         if not IsLangInDef:
             self.LanguageDef.append([LangName, LangPrintName])
@@ -260,8 +288,10 @@ class UniFileClassObject(object):
         #
         # Add language string
         #
-        self.AddStringToList(u'$LANGUAGE_NAME', LangName, LangName, 0, True, Index=0)
-        self.AddStringToList(u'$PRINTABLE_LANGUAGE_NAME', LangName, LangPrintName, 1, True, Index=1)
+        self.AddStringToList(u'$LANGUAGE_NAME', LangName,
+                             LangName, 0, True, Index=0)
+        self.AddStringToList(u'$PRINTABLE_LANGUAGE_NAME',
+                             LangName, LangPrintName, 1, True, Index=1)
 
         if not IsLangInDef:
             #
@@ -270,14 +300,16 @@ class UniFileClassObject(object):
             #
             FirstLangName = self.LanguageDef[0][0]
             if LangName != FirstLangName:
-                for Index in range (2, len (self.OrderedStringList[FirstLangName])):
+                for Index in range(2, len(self.OrderedStringList[FirstLangName])):
                     Item = self.OrderedStringList[FirstLangName][Index]
                     if Item.UseOtherLangDef != '':
                         OtherLang = Item.UseOtherLangDef
                     else:
                         OtherLang = FirstLangName
-                    self.OrderedStringList[LangName].append (StringDefClassObject(Item.StringName, '', Item.Referenced, Item.Token, OtherLang))
-                    self.OrderedStringDict[LangName][Item.StringName] = len(self.OrderedStringList[LangName]) - 1
+                    self.OrderedStringList[LangName].append(StringDefClassObject(
+                        Item.StringName, '', Item.Referenced, Item.Token, OtherLang))
+                    self.OrderedStringDict[LangName][Item.StringName] = len(
+                        self.OrderedStringList[LangName]) - 1
         return True
 
     @staticmethod
@@ -297,7 +329,7 @@ class UniFileClassObject(object):
         #
         Encoding = 'utf-8'
         if (FileIn.startswith(codecs.BOM_UTF16_BE) or
-            FileIn.startswith(codecs.BOM_UTF16_LE)):
+                FileIn.startswith(codecs.BOM_UTF16_LE)):
             Encoding = 'utf-16'
 
         UniFileClassObject.VerifyUcs2Data(FileIn, FileName, Encoding)
@@ -322,9 +354,10 @@ class UniFileClassObject(object):
             (Reader, Writer) = (Info.streamreader, Info.streamwriter)
             File = codecs.StreamReaderWriter(UniFile, Reader, Writer)
             LineNumber = 0
-            ErrMsg = lambda Encoding, LineNumber: \
-                     '%s contains invalid %s characters on line %d.' % \
-                     (FileName, Encoding, LineNumber)
+
+            def ErrMsg(Encoding, LineNumber): return \
+                '%s contains invalid %s characters on line %d.' % \
+                (FileName, Encoding, LineNumber)
             while True:
                 LineNumber = LineNumber + 1
                 try:
@@ -349,22 +382,26 @@ class UniFileClassObject(object):
         if Name != '':
             MatchString = gIdentifierPattern.match(Name)
             if MatchString is None:
-                EdkLogger.error('Unicode File Parser', FORMAT_INVALID, 'The string token name %s defined in UNI file %s contains the invalid character.' % (Name, self.File))
+                EdkLogger.error('Unicode File Parser', FORMAT_INVALID,
+                                'The string token name %s defined in UNI file %s contains the invalid character.' % (Name, self.File))
         LanguageList = Item.split(u'#language ')
         for IndexI in range(len(LanguageList)):
             if IndexI == 0:
                 continue
             else:
                 Language = LanguageList[IndexI].split()[0]
-                Value = LanguageList[IndexI][LanguageList[IndexI].find(u'\"') + len(u'\"') : LanguageList[IndexI].rfind(u'\"')] #.replace(u'\r\n', u'')
-                Language = GetLanguageCode(Language, self.IsCompatibleMode, self.File)
+                Value = LanguageList[IndexI][LanguageList[IndexI].find(
+                    u'\"') + len(u'\"'): LanguageList[IndexI].rfind(u'\"')]  # .replace(u'\r\n', u'')
+                Language = GetLanguageCode(
+                    Language, self.IsCompatibleMode, self.File)
                 self.AddStringToList(Name, Language, Value)
 
     #
     # Get include file list and load them
     #
     def GetIncludeFile(self, Item, Dir):
-        FileName = Item[Item.find(u'#include ') + len(u'#include ') :Item.find(u' ', len(u'#include '))][1:-1]
+        FileName = Item[Item.find(
+            u'#include ') + len(u'#include '):Item.find(u' ', len(u'#include '))][1:-1]
         self.LoadUniFile(FileName)
 
     #
@@ -374,11 +411,13 @@ class UniFileClassObject(object):
         try:
             FileIn = UniFileClassObject.OpenUniFile(LongFilePath(File.Path))
         except UnicodeError as X:
-            EdkLogger.error("build", FILE_READ_FAILURE, "File read failure: %s" % str(X), ExtraData=File.Path);
+            EdkLogger.error("build", FILE_READ_FAILURE,
+                            "File read failure: %s" % str(X), ExtraData=File.Path)
         except OSError:
-            EdkLogger.error("Unicode File Parser", FILE_NOT_FOUND, ExtraData=File.Path)
+            EdkLogger.error("Unicode File Parser",
+                            FILE_NOT_FOUND, ExtraData=File.Path)
         except:
-            EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=File.Path);
+            EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=File.Path)
 
         Lines = []
         #
@@ -395,7 +434,6 @@ class UniFileClassObject(object):
             if len(Line) == 0:
                 continue
 
-
             Line = Line.replace(u'/langdef', u'#langdef')
             Line = Line.replace(u'/string', u'#string')
             Line = Line.replace(u'/language', u'#language')
@@ -417,15 +455,15 @@ class UniFileClassObject(object):
             StartPos = Line.find(u'\\x')
             while (StartPos != -1):
                 EndPos = Line.find(u'\\', StartPos + 1, StartPos + 7)
-                if EndPos != -1 and EndPos - StartPos == 6 :
-                    if g4HexChar.match(Line[StartPos + 2 : EndPos], re.UNICODE):
-                        EndStr = Line[EndPos: ]
+                if EndPos != -1 and EndPos - StartPos == 6:
+                    if g4HexChar.match(Line[StartPos + 2: EndPos], re.UNICODE):
+                        EndStr = Line[EndPos:]
                         UniStr = Line[StartPos + 2: EndPos]
                         if EndStr.startswith(u'\\x') and len(EndStr) >= 7:
-                            if EndStr[6] == u'\\' and g4HexChar.match(EndStr[2 : 6], re.UNICODE):
-                                Line = Line[0 : StartPos] + UniStr + EndStr
+                            if EndStr[6] == u'\\' and g4HexChar.match(EndStr[2: 6], re.UNICODE):
+                                Line = Line[0: StartPos] + UniStr + EndStr
                         else:
-                            Line = Line[0 : StartPos] + UniStr + EndStr[1:]
+                            Line = Line[0: StartPos] + UniStr + EndStr[1:]
                 StartPos = Line.find(u'\\x', StartPos + 1)
 
             IncList = gIncludePattern.findall(Line)
@@ -436,7 +474,8 @@ class UniFileClassObject(object):
                         Lines.extend(self.PreProcess(IncFile))
                         break
                 else:
-                    EdkLogger.error("Unicode File Parser", FILE_NOT_FOUND, Message="Cannot find include file", ExtraData=str(IncList[0]))
+                    EdkLogger.error("Unicode File Parser", FILE_NOT_FOUND,
+                                    Message="Cannot find include file", ExtraData=str(IncList[0]))
                 continue
 
             Lines.append(Line)
@@ -446,9 +485,10 @@ class UniFileClassObject(object):
     #
     # Load a .uni file
     #
-    def LoadUniFile(self, File = None):
+    def LoadUniFile(self, File=None):
         if File is None:
-            EdkLogger.error("Unicode File Parser", PARSER_ERROR, 'No unicode file is given')
+            EdkLogger.error("Unicode File Parser", PARSER_ERROR,
+                            'No unicode file is given')
         self.File = File
         #
         # Process special char in file
@@ -488,10 +528,12 @@ class UniFileClassObject(object):
             #     Mi segunda secuencia 2
             #
             if Line.find(u'#string ') >= 0 and Line.find(u'#language ') < 0 and \
-                SecondLine.find(u'#string ') < 0 and SecondLine.find(u'#language ') >= 0 and \
-                ThirdLine.find(u'#string ') < 0 and ThirdLine.find(u'#language ') < 0:
-                Name = Line[Line.find(u'#string ') + len(u'#string ') : ].strip(' ')
-                Language = SecondLine[SecondLine.find(u'#language ') + len(u'#language ') : ].strip(' ')
+                    SecondLine.find(u'#string ') < 0 and SecondLine.find(u'#language ') >= 0 and \
+                    ThirdLine.find(u'#string ') < 0 and ThirdLine.find(u'#language ') < 0:
+                Name = Line[Line.find(u'#string ') +
+                            len(u'#string '):].strip(' ')
+                Language = SecondLine[SecondLine.find(
+                    u'#language ') + len(u'#language '):].strip(' ')
                 for IndexJ in range(IndexI + 2, len(Lines)):
                     if Lines[IndexJ].find(u'#string ') < 0 and Lines[IndexJ].find(u'#language ') < 0:
                         Value = Value + Lines[IndexJ]
@@ -499,12 +541,14 @@ class UniFileClassObject(object):
                         IndexI = IndexJ
                         break
                 # Value = Value.replace(u'\r\n', u'')
-                Language = GetLanguageCode(Language, self.IsCompatibleMode, self.File)
+                Language = GetLanguageCode(
+                    Language, self.IsCompatibleMode, self.File)
                 # Check the string name
                 if not self.IsCompatibleMode and Name != '':
                     MatchString = gIdentifierPattern.match(Name)
                     if MatchString is None:
-                        EdkLogger.error('Unicode File Parser', FORMAT_INVALID, 'The string token name %s defined in UNI file %s contains the invalid character.' % (Name, self.File))
+                        EdkLogger.error('Unicode File Parser', FORMAT_INVALID,
+                                        'The string token name %s defined in UNI file %s contains the invalid character.' % (Name, self.File))
                 self.AddStringToList(Name, Language, Value)
                 continue
 
@@ -529,7 +573,8 @@ class UniFileClassObject(object):
                     elif Lines[IndexJ].find(u'#string ') < 0 and Lines[IndexJ].find(u'#language ') >= 0:
                         StringItem = StringItem + Lines[IndexJ]
                     elif Lines[IndexJ].count(u'\"') >= 2:
-                        StringItem = StringItem[ : StringItem.rfind(u'\"')] + Lines[IndexJ][Lines[IndexJ].find(u'\"') + len(u'\"') : ]
+                        StringItem = StringItem[: StringItem.rfind(
+                            u'\"')] + Lines[IndexJ][Lines[IndexJ].find(u'\"') + len(u'\"'):]
                 self.GetStringObject(StringItem)
                 continue
 
@@ -544,12 +589,12 @@ class UniFileClassObject(object):
     #
     # Add a string to list
     #
-    def AddStringToList(self, Name, Language, Value, Token = None, Referenced = False, UseOtherLangDef = '', Index = -1):
+    def AddStringToList(self, Name, Language, Value, Token=None, Referenced=False, UseOtherLangDef='', Index=-1):
         for LangNameItem in self.LanguageDef:
             if Language == LangNameItem[0]:
                 break
         else:
-            EdkLogger.error('Unicode File Parser', FORMAT_NOT_SUPPORTED, "The language '%s' for %s is not defined in Unicode file %s." \
+            EdkLogger.error('Unicode File Parser', FORMAT_NOT_SUPPORTED, "The language '%s' for %s is not defined in Unicode file %s."
                             % (Language, Name, self.File))
 
         if Language not in self.OrderedStringList:
@@ -568,7 +613,8 @@ class UniFileClassObject(object):
         if IsAdded:
             Token = len(self.OrderedStringList[Language])
             if Index == -1:
-                self.OrderedStringList[Language].append(StringDefClassObject(Name, Value, Referenced, Token, UseOtherLangDef))
+                self.OrderedStringList[Language].append(StringDefClassObject(
+                    Name, Value, Referenced, Token, UseOtherLangDef))
                 self.OrderedStringDict[Language][Name] = Token
                 for LangName in self.LanguageDef:
                     #
@@ -580,10 +626,13 @@ class UniFileClassObject(object):
                             OtherLangDef = UseOtherLangDef
                         else:
                             OtherLangDef = Language
-                        self.OrderedStringList[LangName[0]].append(StringDefClassObject(Name, '', Referenced, Token, OtherLangDef))
-                        self.OrderedStringDict[LangName[0]][Name] = len(self.OrderedStringList[LangName[0]]) - 1
+                        self.OrderedStringList[LangName[0]].append(
+                            StringDefClassObject(Name, '', Referenced, Token, OtherLangDef))
+                        self.OrderedStringDict[LangName[0]][Name] = len(
+                            self.OrderedStringList[LangName[0]]) - 1
             else:
-                self.OrderedStringList[Language].insert(Index, StringDefClassObject(Name, Value, Referenced, Token, UseOtherLangDef))
+                self.OrderedStringList[Language].insert(Index, StringDefClassObject(
+                    Name, Value, Referenced, Token, UseOtherLangDef))
                 self.OrderedStringDict[Language][Name] = Index
 
     #
@@ -637,7 +686,7 @@ class UniFileClassObject(object):
         # Use small token for all referred string stoken.
         #
         RefToken = 0
-        for Index in range (0, len (self.OrderedStringList[FirstLangName])):
+        for Index in range(0, len(self.OrderedStringList[FirstLangName])):
             FirstLangItem = self.OrderedStringList[FirstLangName][Index]
             if FirstLangItem.Referenced == True:
                 for LangNameItem in self.LanguageDef:
@@ -652,7 +701,7 @@ class UniFileClassObject(object):
         # Use big token for all unreferred string stoken.
         #
         UnRefToken = 0
-        for Index in range (0, len (self.OrderedStringList[FirstLangName])):
+        for Index in range(0, len(self.OrderedStringList[FirstLangName])):
             FirstLangItem = self.OrderedStringList[FirstLangName][Index]
             if FirstLangItem.Referenced == False:
                 for LangNameItem in self.LanguageDef:
@@ -667,17 +716,19 @@ class UniFileClassObject(object):
     #
     def ShowMe(self):
         print(self.LanguageDef)
-        #print self.OrderedStringList
+        # print self.OrderedStringList
         for Item in self.OrderedStringList:
             print(Item)
             for Member in self.OrderedStringList[Item]:
                 print(str(Member))
 
+
 # This acts like the main() function for the script, unless it is 'import'ed into another
 # script.
 if __name__ == '__main__':
     EdkLogger.Initialize()
     EdkLogger.SetLevel(EdkLogger.DEBUG_0)
-    a = UniFileClassObject([PathClass("C:\\Edk\\Strings.uni"), PathClass("C:\\Edk\\Strings2.uni")])
+    a = UniFileClassObject(
+        [PathClass("C:\\Edk\\Strings.uni"), PathClass("C:\\Edk\\Strings2.uni")])
     a.ReToken()
     a.ShowMe()
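
For reviewers comparing the before/after of UniClassObject.py above: the UCS-2 handling is untouched by this patch, only its layout changes. As a rough standalone sketch (the names below are illustrative, not BaseTools identifiers), the encode-only codec pattern being reformatted works like this:

import codecs

class Ucs2OnlyCodec(codecs.Codec):
    """Encode-only helper: accepts text, rejects anything outside UCS-2."""
    def __init__(self):
        self.__utf16 = codecs.lookup('utf-16')

    def encode(self, input, errors='strict'):
        # UCS-2 has no surrogate pairs, so any code point above U+FFFF
        # cannot be represented and is rejected up front.
        for ch in input:
            if ord(ch) > 0xFFFF:
                raise ValueError("Code Point too large to encode in UCS-2")
        return self.__utf16.encode(input, errors)

codec = Ucs2OnlyCodec()
data, consumed = codec.encode(u'English')   # fine: every character is in the BMP
# codec.encode(u'\U0001F600')               # would raise ValueError

In the real module the codec is additionally registered under the 'ucs-2'/'ucs_2' names via codecs.register(), as shown in the hunk above; the sketch only demonstrates the encode-time check.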
diff --git a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
index ad8c9b598025..36907e983b60 100644
--- a/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
+++ b/BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py
@@ -15,6 +15,7 @@ from io import BytesIO
 from struct import pack
 from Common.DataType import *
 
+
 class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
     def __init__(self):
         self.var_check_info = []
@@ -56,7 +57,7 @@ class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
                     else:
                         realLength += item.StorageWidth
                         realLength += item.StorageWidth
-                if (index == len(self.var_check_info)) :
+                if (index == len(self.var_check_info)):
                     if (itemIndex < len(var_check_tab.validtab)) and realLength % 4:
                         realLength += (4 - (realLength % 4))
                 else:
@@ -136,14 +137,16 @@ class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
                         Buffer += b
                         realLength += item.StorageWidth
                     else:
-                        b = pack(PACK_CODE_BY_SIZE[item.StorageWidth], v_data[0])
+                        b = pack(
+                            PACK_CODE_BY_SIZE[item.StorageWidth], v_data[0])
                         Buffer += b
                         realLength += item.StorageWidth
-                        b = pack(PACK_CODE_BY_SIZE[item.StorageWidth], v_data[1])
+                        b = pack(
+                            PACK_CODE_BY_SIZE[item.StorageWidth], v_data[1])
                         Buffer += b
                         realLength += item.StorageWidth
 
-                if (index == len(self.var_check_info)) :
+                if (index == len(self.var_check_info)):
                     if (itemIndex < len(var_check_tab.validtab)) and realLength % 4:
                         for i in range(4 - (realLength % 4)):
                             b = pack("=B", var_check_tab.pad)
@@ -173,6 +176,7 @@ class VAR_CHECK_PCD_VARIABLE_TAB_CONTAINER(object):
 
 class VAR_CHECK_PCD_VARIABLE_TAB(object):
     pad = 0xDA
+
     def __init__(self, TokenSpaceGuid, PcdCName):
         self.Revision = 0x0001
         self.HeaderLength = 0
@@ -180,7 +184,8 @@ class VAR_CHECK_PCD_VARIABLE_TAB(object):
         self.Type = 0
         self.Reserved = 0
         self.Attributes = 0x00000000
-        self.Guid = eval("[" + TokenSpaceGuid.replace("{", "").replace("}", "") + "]")
+        self.Guid = eval(
+            "[" + TokenSpaceGuid.replace("{", "").replace("}", "") + "]")
         self.Name = PcdCName
         self.validtab = []
 
@@ -233,9 +238,11 @@ class VAR_CHECK_PCD_VALID_OBJ(object):
     def __eq__(self, validObj):
         return validObj and self.VarOffset == validObj.VarOffset
 
+
 class VAR_CHECK_PCD_VALID_LIST(VAR_CHECK_PCD_VALID_OBJ):
     def __init__(self, VarOffset, validlist, PcdDataType):
-        super(VAR_CHECK_PCD_VALID_LIST, self).__init__(VarOffset, validlist, PcdDataType)
+        super(VAR_CHECK_PCD_VALID_LIST, self).__init__(
+            VarOffset, validlist, PcdDataType)
         self.Type = 1
         valid_num_list = []
         for item in self.rawdata:
@@ -249,13 +256,13 @@ class VAR_CHECK_PCD_VALID_LIST(VAR_CHECK_PCD_VALID_OBJ):
             else:
                 self.data.add(int(valid_num))
 
-
         self.Length = 5 + len(self.data) * self.StorageWidth
 
 
 class VAR_CHECK_PCD_VALID_RANGE(VAR_CHECK_PCD_VALID_OBJ):
     def __init__(self, VarOffset, validrange, PcdDataType):
-        super(VAR_CHECK_PCD_VALID_RANGE, self).__init__(VarOffset, validrange, PcdDataType)
+        super(VAR_CHECK_PCD_VALID_RANGE, self).__init__(
+            VarOffset, validrange, PcdDataType)
         self.Type = 2
         RangeExpr = ""
         i = 0
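
Likewise, the ValidCheckingInfoObject.py changes above only re-wrap the struct-packing calls. A hedged sketch of the width-keyed packing plus 4-byte padding pattern follows; PACK_CODE_BY_SIZE here is a stand-in for the table the module presumably gets from its `from Common.DataType import *`:

from struct import pack

# Stand-in mapping from storage width (bytes) to a struct format code;
# the real values are not defined in this sketch.
PACK_CODE_BY_SIZE = {1: "=B", 2: "=H", 4: "=L", 8: "=Q"}

def pack_valid_pair(storage_width, low, high, pad=0xDA):
    """Pack a (low, high) pair at the given width, then pad to 4-byte alignment."""
    buf = pack(PACK_CODE_BY_SIZE[storage_width], low)
    buf += pack(PACK_CODE_BY_SIZE[storage_width], high)
    while len(buf) % 4:
        buf += pack("=B", pad)   # same 0xDA pad byte the table class uses
    return buf

print(pack_valid_pair(1, 0x10, 0x20).hex())   # '1020dada': 2 data bytes + 2 pad bytes

This is only meant to make the re-wrapped pack()/padding hunks easier to read; it is not BaseTools code.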
diff --git a/BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py b/BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py
index f86c749c08c3..1f10e03c268f 100644
--- a/BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py
@@ -1,11 +1,11 @@
-## @file
+# @file
 # Create makefile for MS nmake and GNU make
 #
 # Copyright (c) 2019, Intel Corporation. All rights reserved.<BR>
 # SPDX-License-Identifier: BSD-2-Clause-Patent
 #
 
-## Import Modules
+# Import Modules
 #
 from __future__ import print_function
 from __future__ import absolute_import
@@ -25,25 +25,28 @@ from Common.DataType import *
 from Common.Misc import *
 import json
 
-## Regular expression for splitting Dependency Expression string into tokens
+# Regular expression for splitting Dependency Expression string into tokens
 gDepexTokenPattern = re.compile("(\(|\)|\w+| \S+\.inf)")
 
-## Regular expression for match: PCD(xxxx.yyy)
+# Regular expression for match: PCD(xxxx.yyy)
 gPCDAsGuidPattern = re.compile(r"^PCD\(.+\..+\)$")
 
-## Workspace AutoGen class
+# Workspace AutoGen class
 #
 #   This class is used mainly to control the whole platform build for different
 # architecture. This class will generate top level makefile.
 #
+
+
 class WorkspaceAutoGen(AutoGen):
     # call super().__init__ then call the worker function with different parameter count
     def __init__(self, Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs):
         if not hasattr(self, "_Init"):
-            self._InitWorker(Workspace, MetaFile, Target, Toolchain, Arch, *args, **kwargs)
+            self._InitWorker(Workspace, MetaFile, Target,
+                             Toolchain, Arch, *args, **kwargs)
             self._Init = True
 
-    ## Initialize WorkspaceAutoGen
+    # Initialize WorkspaceAutoGen
     #
     #   @param  WorkspaceDir            Root directory of workspace
     #   @param  ActivePlatform          Meta-file of active platform
@@ -60,25 +63,26 @@ class WorkspaceAutoGen(AutoGen):
     #   @param  SkuId                   SKU id from command line
     #
     def _InitWorker(self, WorkspaceDir, ActivePlatform, Target, Toolchain, ArchList, MetaFileDb,
-              BuildConfig, ToolDefinition, FlashDefinitionFile='', Fds=None, Fvs=None, Caps=None, SkuId='', UniFlag=None,
-              Progress=None, BuildModule=None):
-        self.BuildDatabase  = MetaFileDb
-        self.MetaFile       = ActivePlatform
-        self.WorkspaceDir   = WorkspaceDir
-        self.Platform       = self.BuildDatabase[self.MetaFile, TAB_ARCH_COMMON, Target, Toolchain]
+                    BuildConfig, ToolDefinition, FlashDefinitionFile='', Fds=None, Fvs=None, Caps=None, SkuId='', UniFlag=None,
+                    Progress=None, BuildModule=None):
+        self.BuildDatabase = MetaFileDb
+        self.MetaFile = ActivePlatform
+        self.WorkspaceDir = WorkspaceDir
+        self.Platform = self.BuildDatabase[self.MetaFile,
+                                           TAB_ARCH_COMMON, Target, Toolchain]
         GlobalData.gActivePlatform = self.Platform
-        self.BuildTarget    = Target
-        self.ToolChain      = Toolchain
-        self.ArchList       = ArchList
-        self.SkuId          = SkuId
-        self.UniFlag        = UniFlag
+        self.BuildTarget = Target
+        self.ToolChain = Toolchain
+        self.ArchList = ArchList
+        self.SkuId = SkuId
+        self.UniFlag = UniFlag
 
-        self.TargetTxt      = BuildConfig
-        self.ToolDef        = ToolDefinition
-        self.FdfFile        = FlashDefinitionFile
-        self.FdTargetList   = Fds if Fds else []
-        self.FvTargetList   = Fvs if Fvs else []
-        self.CapTargetList  = Caps if Caps else []
+        self.TargetTxt = BuildConfig
+        self.ToolDef = ToolDefinition
+        self.FdfFile = FlashDefinitionFile
+        self.FdTargetList = Fds if Fds else []
+        self.FvTargetList = Fvs if Fvs else []
+        self.CapTargetList = Caps if Caps else []
         self.AutoGenObjectList = []
         self._GuidDict = {}
 
@@ -90,7 +94,8 @@ class WorkspaceAutoGen(AutoGen):
 
         EdkLogger.info("")
         if self.ArchList:
-            EdkLogger.info('%-16s = %s' % ("Architecture(s)", ' '.join(self.ArchList)))
+            EdkLogger.info('%-16s = %s' %
+                           ("Architecture(s)", ' '.join(self.ArchList)))
         EdkLogger.info('%-16s = %s' % ("Build target", self.BuildTarget))
         EdkLogger.info('%-16s = %s' % ("Toolchain", self.ToolChain))
 
@@ -99,7 +104,8 @@ class WorkspaceAutoGen(AutoGen):
             EdkLogger.info('%-24s = %s' % ("Active Module", BuildModule))
 
         if self.FdfFile:
-            EdkLogger.info('%-24s = %s' % ("Flash Image Definition", self.FdfFile))
+            EdkLogger.info('%-24s = %s' %
+                           ("Flash Image Definition", self.FdfFile))
 
         EdkLogger.verbose("\nFLASH_DEFINITION = %s" % self.FdfFile)
 
@@ -144,9 +150,10 @@ class WorkspaceAutoGen(AutoGen):
             ArchList = set(self.ArchList) & set(self.Platform.SupArchList)
         if not ArchList:
             EdkLogger.error("build", PARAMETER_INVALID,
-                            ExtraData = "Invalid ARCH specified. [Valid ARCH: %s]" % (" ".join(self.Platform.SupArchList)))
+                            ExtraData="Invalid ARCH specified. [Valid ARCH: %s]" % (" ".join(self.Platform.SupArchList)))
         elif self.ArchList and len(ArchList) != len(self.ArchList):
-            SkippedArchList = set(self.ArchList).symmetric_difference(set(self.Platform.SupArchList))
+            SkippedArchList = set(self.ArchList).symmetric_difference(
+                set(self.Platform.SupArchList))
             EdkLogger.verbose("\nArch [%s] is ignored because the platform supports [%s] only!"
                               % (" ".join(SkippedArchList), " ".join(self.Platform.SupArchList)))
         self.ArchList = tuple(ArchList)
@@ -163,10 +170,12 @@ class WorkspaceAutoGen(AutoGen):
         oriPkgSet = set()
         PlatformPkg = set()
         for Arch in self.ArchList:
-            Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
+            Platform = self.BuildDatabase[self.MetaFile,
+                                          Arch, self.BuildTarget, self.ToolChain]
             oriInfList = Platform.Modules
             for ModuleFile in oriInfList:
-                ModuleData = self.BuildDatabase[ModuleFile, Platform._Arch, Platform._Target, Platform._Toolchain]
+                ModuleData = self.BuildDatabase[ModuleFile,
+                                                Platform._Arch, Platform._Target, Platform._Toolchain]
                 oriPkgSet.update(ModuleData.Packages)
                 for Pkg in oriPkgSet:
                     Guids = Pkg.Guids
@@ -192,17 +201,21 @@ class WorkspaceAutoGen(AutoGen):
                 for FdRegion in FdDict.RegionList:
                     if str(FdRegion.RegionType) == 'FILE' and self.Platform.VpdToolGuid in str(FdRegion.RegionDataList):
                         if int(FdRegion.Offset) % 8 != 0:
-                            EdkLogger.error("build", FORMAT_INVALID, 'The VPD Base Address %s must be 8-byte aligned.' % (FdRegion.Offset))
+                            EdkLogger.error(
+                                "build", FORMAT_INVALID, 'The VPD Base Address %s must be 8-byte aligned.' % (FdRegion.Offset))
             FdfProfile = Fdf.Profile
         else:
             if self.FdTargetList:
-                EdkLogger.info("No flash definition file found. FD [%s] will be ignored." % " ".join(self.FdTargetList))
+                EdkLogger.info("No flash definition file found. FD [%s] will be ignored." % " ".join(
+                    self.FdTargetList))
                 self.FdTargetList = []
             if self.FvTargetList:
-                EdkLogger.info("No flash definition file found. FV [%s] will be ignored." % " ".join(self.FvTargetList))
+                EdkLogger.info("No flash definition file found. FV [%s] will be ignored." % " ".join(
+                    self.FvTargetList))
                 self.FvTargetList = []
             if self.CapTargetList:
-                EdkLogger.info("No flash definition file found. Capsule [%s] will be ignored." % " ".join(self.CapTargetList))
+                EdkLogger.info("No flash definition file found. Capsule [%s] will be ignored." % " ".join(
+                    self.CapTargetList))
                 self.CapTargetList = []
 
         return FdfProfile
@@ -221,37 +234,46 @@ class WorkspaceAutoGen(AutoGen):
                 if key == 'ArchTBD':
                     MetaFile_cache = defaultdict(set)
                     for Arch in self.ArchList:
-                        Current_Platform_cache = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
+                        Current_Platform_cache = self.BuildDatabase[self.MetaFile,
+                                                                    Arch, self.BuildTarget, self.ToolChain]
                         for Pkey in Current_Platform_cache.Modules:
-                            MetaFile_cache[Arch].add(Current_Platform_cache.Modules[Pkey].MetaFile)
+                            MetaFile_cache[Arch].add(
+                                Current_Platform_cache.Modules[Pkey].MetaFile)
                     for Inf in self.FdfProfile.InfDict[key]:
-                        ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
+                        ModuleFile = PathClass(
+                            NormPath(Inf), GlobalData.gWorkspace, Arch)
                         for Arch in self.ArchList:
                             if ModuleFile in MetaFile_cache[Arch]:
                                 break
                         else:
-                            ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
+                            ModuleData = self.BuildDatabase[ModuleFile,
+                                                            Arch, self.BuildTarget, self.ToolChain]
                             if not ModuleData.IsBinaryModule:
-                                EdkLogger.error('build', PARSER_ERROR, "Module %s NOT found in DSC file; Is it really a binary module?" % ModuleFile)
+                                EdkLogger.error(
+                                    'build', PARSER_ERROR, "Module %s NOT found in DSC file; Is it really a binary module?" % ModuleFile)
 
                 else:
                     for Arch in self.ArchList:
                         if Arch == key:
-                            Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
+                            Platform = self.BuildDatabase[self.MetaFile,
+                                                          Arch, self.BuildTarget, self.ToolChain]
                             MetaFileList = set()
                             for Pkey in Platform.Modules:
-                                MetaFileList.add(Platform.Modules[Pkey].MetaFile)
+                                MetaFileList.add(
+                                    Platform.Modules[Pkey].MetaFile)
                             for Inf in self.FdfProfile.InfDict[key]:
-                                ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
+                                ModuleFile = PathClass(
+                                    NormPath(Inf), GlobalData.gWorkspace, Arch)
                                 if ModuleFile in MetaFileList:
                                     continue
-                                ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
+                                ModuleData = self.BuildDatabase[ModuleFile,
+                                                                Arch, self.BuildTarget, self.ToolChain]
                                 if not ModuleData.IsBinaryModule:
-                                    EdkLogger.error('build', PARSER_ERROR, "Module %s NOT found in DSC file; Is it really a binary module?" % ModuleFile)
-
-
+                                    EdkLogger.error(
+                                        'build', PARSER_ERROR, "Module %s NOT found in DSC file; Is it really a binary module?" % ModuleFile)
 
     # parse FDF file to get PCDs in it, if any
+
     def VerifyPcdsFromFDF(self):
 
         if self.FdfProfile:
@@ -260,22 +282,23 @@ class WorkspaceAutoGen(AutoGen):
 
     def ProcessPcdType(self):
         for Arch in self.ArchList:
-            Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
+            Platform = self.BuildDatabase[self.MetaFile,
+                                          Arch, self.BuildTarget, self.ToolChain]
             Platform.Pcds
             # generate the SourcePcdDict and BinaryPcdDict
             Libs = []
             for BuildData in list(self.BuildDatabase._CACHE_.values()):
                 if BuildData.Arch != Arch:
                     continue
-                if BuildData.MetaFile.Ext == '.inf' and str(BuildData) in Platform.Modules :
+                if BuildData.MetaFile.Ext == '.inf' and str(BuildData) in Platform.Modules:
                     Libs.extend(GetModuleLibInstances(BuildData, Platform,
-                                     self.BuildDatabase,
-                                     Arch,
-                                     self.BuildTarget,
-                                     self.ToolChain,
-                                     self.Platform.MetaFile,
-                                     EdkLogger
-                                     ))
+                                                      self.BuildDatabase,
+                                                      Arch,
+                                                      self.BuildTarget,
+                                                      self.ToolChain,
+                                                      self.Platform.MetaFile,
+                                                      EdkLogger
+                                                      ))
             for BuildData in list(self.BuildDatabase._CACHE_.values()):
                 if BuildData.Arch != Arch:
                     continue
@@ -289,21 +312,23 @@ class WorkspaceAutoGen(AutoGen):
                                     BuildData.Pcds[key].Pending = False
 
                             if BuildData.MetaFile in Platform.Modules:
-                                PlatformModule = Platform.Modules[str(BuildData.MetaFile)]
+                                PlatformModule = Platform.Modules[str(
+                                    BuildData.MetaFile)]
                                 if key in PlatformModule.Pcds:
                                     PcdInPlatform = PlatformModule.Pcds[key]
                                     if PcdInPlatform.Type:
                                         BuildData.Pcds[key].Type = PcdInPlatform.Type
                                         BuildData.Pcds[key].Pending = False
                             else:
-                                #Pcd used in Library, Pcd Type from reference module if Pcd Type is Pending
+                                # Pcd used in Library, Pcd Type from reference module if Pcd Type is Pending
                                 if BuildData.Pcds[key].Pending:
                                     if bool(BuildData.LibraryClass):
                                         if BuildData in set(Libs):
                                             ReferenceModules = BuildData.ReferenceModules
                                             for ReferenceModule in ReferenceModules:
                                                 if ReferenceModule.MetaFile in Platform.Modules:
-                                                    RefPlatformModule = Platform.Modules[str(ReferenceModule.MetaFile)]
+                                                    RefPlatformModule = Platform.Modules[str(
+                                                        ReferenceModule.MetaFile)]
                                                     if key in RefPlatformModule.Pcds:
                                                         PcdInReferenceModule = RefPlatformModule.Pcds[key]
                                                         if PcdInReferenceModule.Type:
@@ -313,8 +338,10 @@ class WorkspaceAutoGen(AutoGen):
 
     def ProcessMixedPcd(self):
         for Arch in self.ArchList:
-            SourcePcdDict = {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PATCHABLE_IN_MODULE:set(),TAB_PCDS_DYNAMIC:set(),TAB_PCDS_FIXED_AT_BUILD:set()}
-            BinaryPcdDict = {TAB_PCDS_DYNAMIC_EX:set(), TAB_PCDS_PATCHABLE_IN_MODULE:set()}
+            SourcePcdDict = {TAB_PCDS_DYNAMIC_EX: set(), TAB_PCDS_PATCHABLE_IN_MODULE: set(
+            ), TAB_PCDS_DYNAMIC: set(), TAB_PCDS_FIXED_AT_BUILD: set()}
+            BinaryPcdDict = {TAB_PCDS_DYNAMIC_EX: set(
+            ), TAB_PCDS_PATCHABLE_IN_MODULE: set()}
             SourcePcdDict_Keys = SourcePcdDict.keys()
             BinaryPcdDict_Keys = BinaryPcdDict.keys()
 
@@ -327,21 +354,27 @@ class WorkspaceAutoGen(AutoGen):
                     for key in BuildData.Pcds:
                         if TAB_PCDS_DYNAMIC_EX in BuildData.Pcds[key].Type:
                             if BuildData.IsBinaryModule:
-                                BinaryPcdDict[TAB_PCDS_DYNAMIC_EX].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
+                                BinaryPcdDict[TAB_PCDS_DYNAMIC_EX].add(
+                                    (BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
                             else:
-                                SourcePcdDict[TAB_PCDS_DYNAMIC_EX].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
+                                SourcePcdDict[TAB_PCDS_DYNAMIC_EX].add(
+                                    (BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
 
                         elif TAB_PCDS_PATCHABLE_IN_MODULE in BuildData.Pcds[key].Type:
                             if BuildData.MetaFile.Ext == '.inf':
                                 if BuildData.IsBinaryModule:
-                                    BinaryPcdDict[TAB_PCDS_PATCHABLE_IN_MODULE].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
+                                    BinaryPcdDict[TAB_PCDS_PATCHABLE_IN_MODULE].add(
+                                        (BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
                                 else:
-                                    SourcePcdDict[TAB_PCDS_PATCHABLE_IN_MODULE].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
+                                    SourcePcdDict[TAB_PCDS_PATCHABLE_IN_MODULE].add(
+                                        (BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
 
                         elif TAB_PCDS_DYNAMIC in BuildData.Pcds[key].Type:
-                            SourcePcdDict[TAB_PCDS_DYNAMIC].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
+                            SourcePcdDict[TAB_PCDS_DYNAMIC].add(
+                                (BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
                         elif TAB_PCDS_FIXED_AT_BUILD in BuildData.Pcds[key].Type:
-                            SourcePcdDict[TAB_PCDS_FIXED_AT_BUILD].add((BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
+                            SourcePcdDict[TAB_PCDS_FIXED_AT_BUILD].add(
+                                (BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName))
 
             #
             # A PCD can only use one type for all source modules
@@ -349,13 +382,16 @@ class WorkspaceAutoGen(AutoGen):
             for i in SourcePcdDict_Keys:
                 for j in SourcePcdDict_Keys:
                     if i != j:
-                        Intersections = SourcePcdDict[i].intersection(SourcePcdDict[j])
+                        Intersections = SourcePcdDict[i].intersection(
+                            SourcePcdDict[j])
                         if len(Intersections) > 0:
                             EdkLogger.error(
-                            'build',
-                            FORMAT_INVALID,
-                            "Building modules from source INFs, following PCD use %s and %s access method. It must be corrected to use only one access method." % (i, j),
-                            ExtraData='\n\t'.join(str(P[1]+'.'+P[0]) for P in Intersections)
+                                'build',
+                                FORMAT_INVALID,
+                                "Building modules from source INFs, following PCD use %s and %s access method. It must be corrected to use only one access method." % (
+                                    i, j),
+                                ExtraData='\n\t'.join(
+                                    str(P[1]+'.'+P[0]) for P in Intersections)
                             )
 
             #
@@ -364,7 +400,8 @@ class WorkspaceAutoGen(AutoGen):
             for i in BinaryPcdDict_Keys:
                 for j in BinaryPcdDict_Keys:
                     if i != j:
-                        Intersections = BinaryPcdDict[i].intersection(BinaryPcdDict[j])
+                        Intersections = BinaryPcdDict[i].intersection(
+                            BinaryPcdDict[j])
                         for item in Intersections:
                             NewPcd1 = (item[0] + '_' + i, item[1])
                             NewPcd2 = (item[0] + '_' + j, item[1])
@@ -382,7 +419,8 @@ class WorkspaceAutoGen(AutoGen):
             for i in SourcePcdDict_Keys:
                 for j in BinaryPcdDict_Keys:
                     if i != j:
-                        Intersections = SourcePcdDict[i].intersection(BinaryPcdDict[j])
+                        Intersections = SourcePcdDict[i].intersection(
+                            BinaryPcdDict[j])
                         for item in Intersections:
                             NewPcd1 = (item[0] + '_' + i, item[1])
                             NewPcd2 = (item[0] + '_' + j, item[1])
@@ -394,7 +432,8 @@ class WorkspaceAutoGen(AutoGen):
                                 if NewPcd2 not in GlobalData.MixedPcd[item]:
                                     GlobalData.MixedPcd[item].append(NewPcd2)
 
-            BuildData = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
+            BuildData = self.BuildDatabase[self.MetaFile,
+                                           Arch, self.BuildTarget, self.ToolChain]
             for key in BuildData.Pcds:
                 for SinglePcd in GlobalData.MixedPcd:
                     if (BuildData.Pcds[key].TokenCName, BuildData.Pcds[key].TokenSpaceGuidCName) == SinglePcd:
@@ -403,7 +442,8 @@ class WorkspaceAutoGen(AutoGen):
                             if (Pcd_Type == BuildData.Pcds[key].Type) or (Pcd_Type == TAB_PCDS_DYNAMIC_EX and BuildData.Pcds[key].Type in PCD_DYNAMIC_EX_TYPE_SET) or \
                                (Pcd_Type == TAB_PCDS_DYNAMIC and BuildData.Pcds[key].Type in PCD_DYNAMIC_TYPE_SET):
                                 Value = BuildData.Pcds[key]
-                                Value.TokenCName = BuildData.Pcds[key].TokenCName + '_' + Pcd_Type
+                                Value.TokenCName = BuildData.Pcds[key].TokenCName + \
+                                    '_' + Pcd_Type
                                 if len(key) == 2:
                                     newkey = (Value.TokenCName, key[1])
                                 elif len(key) == 3:
@@ -423,7 +463,7 @@ class WorkspaceAutoGen(AutoGen):
                     for item in GlobalData.MixedPcd[key]:
                         PcdSet[item] = Value
 
-    #Collect package set information from INF of FDF
+    # Collect package set information from INF of FDF
     @cached_property
     def PkgSet(self):
         if not self.FdfFile:
@@ -435,23 +475,27 @@ class WorkspaceAutoGen(AutoGen):
             ModuleList = []
         Pkgs = {}
         for Arch in self.ArchList:
-            Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
+            Platform = self.BuildDatabase[self.MetaFile,
+                                          Arch, self.BuildTarget, self.ToolChain]
             PkgSet = set()
             for mb in [self.BuildDatabase[m, Arch, self.BuildTarget, self.ToolChain] for m in Platform.Modules]:
                 PkgSet.update(mb.Packages)
             for Inf in ModuleList:
-                ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch)
+                ModuleFile = PathClass(
+                    NormPath(Inf), GlobalData.gWorkspace, Arch)
                 if ModuleFile in Platform.Modules:
                     continue
-                ModuleData = self.BuildDatabase[ModuleFile, Arch, self.BuildTarget, self.ToolChain]
+                ModuleData = self.BuildDatabase[ModuleFile,
+                                                Arch, self.BuildTarget, self.ToolChain]
                 PkgSet.update(ModuleData.Packages)
             PkgSet.update(Platform.Packages)
             Pkgs[Arch] = list(PkgSet)
         return Pkgs
 
-    def VerifyPcdDeclearation(self,PcdSet):
+    def VerifyPcdDeclearation(self, PcdSet):
         for Arch in self.ArchList:
-            Platform = self.BuildDatabase[self.MetaFile, Arch, self.BuildTarget, self.ToolChain]
+            Platform = self.BuildDatabase[self.MetaFile,
+                                          Arch, self.BuildTarget, self.ToolChain]
             Pkgs = self.PkgSet[Arch]
             DecPcds = set()
             DecPcdsKey = set()
@@ -461,33 +505,41 @@ class WorkspaceAutoGen(AutoGen):
                     DecPcdsKey.add((Pcd[0], Pcd[1], Pcd[2]))
 
             Platform.SkuName = self.SkuId
-            for Name, Guid,Fileds in PcdSet:
+            for Name, Guid, Fileds in PcdSet:
                 if (Name, Guid) not in DecPcds:
                     EdkLogger.error(
                         'build',
                         PARSER_ERROR,
-                        "PCD (%s.%s) used in FDF is not declared in DEC files." % (Guid, Name),
-                        File = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][0],
-                        Line = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][1]
+                        "PCD (%s.%s) used in FDF is not declared in DEC files." % (
+                            Guid, Name),
+                        File=self.FdfProfile.PcdFileLineDict[Name,
+                                                             Guid, Fileds][0],
+                        Line=self.FdfProfile.PcdFileLineDict[Name,
+                                                             Guid, Fileds][1]
                     )
                 else:
                     # Check whether Dynamic or DynamicEx PCD used in FDF file. If used, build break and give a error message.
                     if (Name, Guid, TAB_PCDS_FIXED_AT_BUILD) in DecPcdsKey \
-                        or (Name, Guid, TAB_PCDS_PATCHABLE_IN_MODULE) in DecPcdsKey \
-                        or (Name, Guid, TAB_PCDS_FEATURE_FLAG) in DecPcdsKey:
+                            or (Name, Guid, TAB_PCDS_PATCHABLE_IN_MODULE) in DecPcdsKey \
+                            or (Name, Guid, TAB_PCDS_FEATURE_FLAG) in DecPcdsKey:
                         continue
                     elif (Name, Guid, TAB_PCDS_DYNAMIC) in DecPcdsKey or (Name, Guid, TAB_PCDS_DYNAMIC_EX) in DecPcdsKey:
                         EdkLogger.error(
-                                'build',
-                                PARSER_ERROR,
-                                "Using Dynamic or DynamicEx type of PCD [%s.%s] in FDF file is not allowed." % (Guid, Name),
-                                File = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][0],
-                                Line = self.FdfProfile.PcdFileLineDict[Name, Guid, Fileds][1]
+                            'build',
+                            PARSER_ERROR,
+                            "Using Dynamic or DynamicEx type of PCD [%s.%s] in FDF file is not allowed." % (
+                                Guid, Name),
+                            File=self.FdfProfile.PcdFileLineDict[Name,
+                                                                 Guid, Fileds][0],
+                            Line=self.FdfProfile.PcdFileLineDict[Name,
+                                                                 Guid, Fileds][1]
                         )
+
     def CollectAllPcds(self):
 
         for Arch in self.ArchList:
-            Pa = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
+            Pa = PlatformAutoGen(self, self.MetaFile,
+                                 self.BuildTarget, self.ToolChain, Arch)
             #
             # Explicitly collect platform's dynamic PCDs
             #
@@ -496,21 +548,24 @@ class WorkspaceAutoGen(AutoGen):
             self.AutoGenObjectList.append(Pa)
         # We need to calculate the PcdTokenNumber after all Arch Pcds are collected.
         for Arch in self.ArchList:
-            #Pcd TokenNumber
-            Pa = PlatformAutoGen(self, self.MetaFile, self.BuildTarget, self.ToolChain, Arch)
-            self.UpdateModuleDataPipe(Arch,  {"PCD_TNUM":Pa.PcdTokenNumber})
+            # Pcd TokenNumber
+            Pa = PlatformAutoGen(self, self.MetaFile,
+                                 self.BuildTarget, self.ToolChain, Arch)
+            self.UpdateModuleDataPipe(Arch,  {"PCD_TNUM": Pa.PcdTokenNumber})
 
-    def UpdateModuleDataPipe(self,arch, attr_dict):
+    def UpdateModuleDataPipe(self, arch, attr_dict):
         for (Target, Toolchain, Arch, MetaFile) in AutoGen.Cache():
             if Arch != arch:
                 continue
             try:
-                AutoGen.Cache()[(Target, Toolchain, Arch, MetaFile)].DataPipe.DataContainer = attr_dict
+                AutoGen.Cache()[(Target, Toolchain, Arch, MetaFile)
+                                ].DataPipe.DataContainer = attr_dict
             except Exception:
                 pass
     #
     # Generate Package level hash value
     #
+
     def GeneratePkgLevelHash(self):
         for Arch in self.ArchList:
             GlobalData.gPackageHash = {}
@@ -518,7 +573,6 @@ class WorkspaceAutoGen(AutoGen):
                 for Pkg in self.PkgSet[Arch]:
                     self._GenPkgLevelHash(Pkg)
 
-
     def CreateBuildOptionsFile(self):
         #
         # Create BuildOptions Macro & PCD metafile, also add the Active Platform and FDF file.
@@ -536,7 +590,8 @@ class WorkspaceAutoGen(AutoGen):
             content += 'Flash Image Definition: '
             content += str(self.FdfFile)
             content += TAB_LINE_BREAK
-        SaveFileOnChange(os.path.join(self.BuildDir, 'BuildOptions'), content, False)
+        SaveFileOnChange(os.path.join(
+            self.BuildDir, 'BuildOptions'), content, False)
 
     def CreatePcdTokenNumberFile(self):
         #
@@ -548,17 +603,22 @@ class WorkspaceAutoGen(AutoGen):
             if Pa.DynamicPcdList:
                 for Pcd in Pa.DynamicPcdList:
                     PcdTokenNumber += TAB_LINE_BREAK
-                    PcdTokenNumber += str((Pcd.TokenCName, Pcd.TokenSpaceGuidCName))
+                    PcdTokenNumber += str((Pcd.TokenCName,
+                                          Pcd.TokenSpaceGuidCName))
                     PcdTokenNumber += ' : '
-                    PcdTokenNumber += str(Pa.PcdTokenNumber[Pcd.TokenCName, Pcd.TokenSpaceGuidCName])
-        SaveFileOnChange(os.path.join(self.BuildDir, 'PcdTokenNumber'), PcdTokenNumber, False)
+                    PcdTokenNumber += str(
+                        Pa.PcdTokenNumber[Pcd.TokenCName, Pcd.TokenSpaceGuidCName])
+        SaveFileOnChange(os.path.join(
+            self.BuildDir, 'PcdTokenNumber'), PcdTokenNumber, False)
 
     def GeneratePlatformLevelHash(self):
         #
         # Get set of workspace metafiles
         #
-        AllWorkSpaceMetaFiles = self._GetMetaFiles(self.BuildTarget, self.ToolChain)
-        AllWorkSpaceMetaFileList = sorted(AllWorkSpaceMetaFiles, key=lambda x: str(x))
+        AllWorkSpaceMetaFiles = self._GetMetaFiles(
+            self.BuildTarget, self.ToolChain)
+        AllWorkSpaceMetaFileList = sorted(
+            AllWorkSpaceMetaFiles, key=lambda x: str(x))
         #
         # Retrieve latest modified time of all metafiles
         #
@@ -583,17 +643,20 @@ class WorkspaceAutoGen(AutoGen):
             HashDir = path.join(self.BuildDir, "Hash_Platform")
             HashFile = path.join(HashDir, 'Platform.hash.' + m.hexdigest())
             SaveFileOnChange(HashFile, m.hexdigest(), False)
-            HashChainFile = path.join(HashDir, 'Platform.hashchain.' + m.hexdigest())
+            HashChainFile = path.join(
+                HashDir, 'Platform.hashchain.' + m.hexdigest())
             GlobalData.gPlatformHashFile = HashChainFile
             try:
                 with open(HashChainFile, 'w') as f:
                     json.dump(FileList, f, indent=2)
             except:
-                EdkLogger.quiet("[cache warning]: fail to save hashchain file:%s" % HashChainFile)
+                EdkLogger.quiet(
+                    "[cache warning]: fail to save hashchain file:%s" % HashChainFile)
 
             if GlobalData.gBinCacheDest:
                 # Copy platform hash files to cache destination
-                FileDir = path.join(GlobalData.gBinCacheDest, self.OutputDir, self.BuildTarget + "_" + self.ToolChain, "Hash_Platform")
+                FileDir = path.join(GlobalData.gBinCacheDest, self.OutputDir,
+                                    self.BuildTarget + "_" + self.ToolChain, "Hash_Platform")
                 CacheFileDir = FileDir
                 CreateDirectory(CacheFileDir)
                 CopyFileOnChange(HashFile, CacheFileDir)
@@ -603,7 +666,7 @@ class WorkspaceAutoGen(AutoGen):
         # Write metafile list to build directory
         #
         AutoGenFilePath = os.path.join(self.BuildDir, 'AutoGen')
-        if os.path.exists (AutoGenFilePath):
+        if os.path.exists(AutoGenFilePath):
             os.remove(AutoGenFilePath)
         if not os.path.exists(self.BuildDir):
             os.makedirs(self.BuildDir)
@@ -616,7 +679,8 @@ class WorkspaceAutoGen(AutoGen):
         if Pkg.PackageName in GlobalData.gPackageHash:
             return
 
-        PkgDir = os.path.join(self.BuildDir, Pkg.Arch, "Hash_Pkg", Pkg.PackageName)
+        PkgDir = os.path.join(self.BuildDir, Pkg.Arch,
+                              "Hash_Pkg", Pkg.PackageName)
         CreateDirectory(PkgDir)
         FileList = []
         m = hashlib.md5()
@@ -625,7 +689,8 @@ class WorkspaceAutoGen(AutoGen):
         Content = f.read()
         f.close()
         m.update(Content)
-        FileList.append((str(Pkg.MetaFile.Path), hashlib.md5(Content).hexdigest()))
+        FileList.append(
+            (str(Pkg.MetaFile.Path), hashlib.md5(Content).hexdigest()))
         # Get include files hash value
         if Pkg.Includes:
             for inc in sorted(Pkg.Includes, key=lambda x: str(x)):
@@ -636,23 +701,29 @@ class WorkspaceAutoGen(AutoGen):
                         Content = f.read()
                         f.close()
                         m.update(Content)
-                        FileList.append((str(File_Path), hashlib.md5(Content).hexdigest()))
+                        FileList.append(
+                            (str(File_Path), hashlib.md5(Content).hexdigest()))
         GlobalData.gPackageHash[Pkg.PackageName] = m.hexdigest()
 
         HashDir = PkgDir
-        HashFile = path.join(HashDir, Pkg.PackageName + '.hash.' + m.hexdigest())
+        HashFile = path.join(HashDir, Pkg.PackageName +
+                             '.hash.' + m.hexdigest())
         SaveFileOnChange(HashFile, m.hexdigest(), False)
-        HashChainFile = path.join(HashDir, Pkg.PackageName + '.hashchain.' + m.hexdigest())
-        GlobalData.gPackageHashFile[(Pkg.PackageName, Pkg.Arch)] = HashChainFile
+        HashChainFile = path.join(
+            HashDir, Pkg.PackageName + '.hashchain.' + m.hexdigest())
+        GlobalData.gPackageHashFile[(
+            Pkg.PackageName, Pkg.Arch)] = HashChainFile
         try:
             with open(HashChainFile, 'w') as f:
                 json.dump(FileList, f, indent=2)
         except:
-            EdkLogger.quiet("[cache warning]: fail to save hashchain file:%s" % HashChainFile)
+            EdkLogger.quiet(
+                "[cache warning]: fail to save hashchain file:%s" % HashChainFile)
 
         if GlobalData.gBinCacheDest:
             # Copy Pkg hash files to cache destination dir
-            FileDir = path.join(GlobalData.gBinCacheDest, self.OutputDir, self.BuildTarget + "_" + self.ToolChain, Pkg.Arch, "Hash_Pkg", Pkg.PackageName)
+            FileDir = path.join(GlobalData.gBinCacheDest, self.OutputDir, self.BuildTarget +
+                                "_" + self.ToolChain, Pkg.Arch, "Hash_Pkg", Pkg.PackageName)
             CacheFileDir = FileDir
             CreateDirectory(CacheFileDir)
             CopyFileOnChange(HashFile, CacheFileDir)
@@ -664,9 +735,9 @@ class WorkspaceAutoGen(AutoGen):
         # add fdf
         #
         if self.FdfFile:
-            AllWorkSpaceMetaFiles.add (self.FdfFile.Path)
+            AllWorkSpaceMetaFiles.add(self.FdfFile.Path)
             for f in GlobalData.gFdfParser.GetAllIncludedFile():
-                AllWorkSpaceMetaFiles.add (f.FileName)
+                AllWorkSpaceMetaFiles.add(f.FileName)
         #
         # add dsc
         #
@@ -675,8 +746,10 @@ class WorkspaceAutoGen(AutoGen):
         #
         # add build_rule.txt & tools_def.txt
         #
-        AllWorkSpaceMetaFiles.add(os.path.join(GlobalData.gConfDirectory, gDefaultBuildRuleFile))
-        AllWorkSpaceMetaFiles.add(os.path.join(GlobalData.gConfDirectory, gDefaultToolsDefFile))
+        AllWorkSpaceMetaFiles.add(os.path.join(
+            GlobalData.gConfDirectory, gDefaultBuildRuleFile))
+        AllWorkSpaceMetaFiles.add(os.path.join(
+            GlobalData.gConfDirectory, gDefaultToolsDefFile))
 
         # add BuildOption metafile
         #
@@ -684,7 +757,8 @@ class WorkspaceAutoGen(AutoGen):
 
         # add PcdToken Number file for Dynamic/DynamicEx Pcd
         #
-        AllWorkSpaceMetaFiles.add(os.path.join(self.BuildDir, 'PcdTokenNumber'))
+        AllWorkSpaceMetaFiles.add(os.path.join(
+            self.BuildDir, 'PcdTokenNumber'))
 
         for Pa in self.AutoGenObjectList:
             AllWorkSpaceMetaFiles.add(Pa.ToolDefinitionFile)
@@ -706,10 +780,10 @@ class WorkspaceAutoGen(AutoGen):
 
     def _CheckPcdDefineAndType(self):
         PcdTypeSet = {TAB_PCDS_FIXED_AT_BUILD,
-            TAB_PCDS_PATCHABLE_IN_MODULE,
-            TAB_PCDS_FEATURE_FLAG,
-            TAB_PCDS_DYNAMIC,
-            TAB_PCDS_DYNAMIC_EX}
+                      TAB_PCDS_PATCHABLE_IN_MODULE,
+                      TAB_PCDS_FEATURE_FLAG,
+                      TAB_PCDS_DYNAMIC,
+                      TAB_PCDS_DYNAMIC_EX}
 
         # This dict store PCDs which are not used by any modules with specified arches
         UnusedPcd = OrderedDict()
@@ -737,7 +811,7 @@ class WorkspaceAutoGen(AutoGen):
                             EdkLogger.error(
                                 'build',
                                 FORMAT_INVALID,
-                                "Type [%s] of PCD [%s.%s] in DSC file doesn't match the type [%s] defined in DEC file." \
+                                "Type [%s] of PCD [%s.%s] in DSC file doesn't match the type [%s] defined in DEC file."
                                 % (Pa.Platform.Pcds[Pcd].Type, Pcd[1], Pcd[0], Type),
                                 ExtraData=None
                             )
@@ -757,42 +831,42 @@ class WorkspaceAutoGen(AutoGen):
     def __repr__(self):
         return "%s [%s]" % (self.MetaFile, ", ".join(self.ArchList))
 
-    ## Return the directory to store FV files
+    # Return the directory to store FV files
     @cached_property
     def FvDir(self):
         return path.join(self.BuildDir, TAB_FV_DIRECTORY)
 
-    ## Return the directory to store all intermediate and final files built
+    # Return the directory to store all intermediate and final files built
     @cached_property
     def BuildDir(self):
         return self.AutoGenObjectList[0].BuildDir
 
-    ## Return the build output directory platform specifies
+    # Return the build output directory platform specifies
     @cached_property
     def OutputDir(self):
         return self.Platform.OutputDirectory
 
-    ## Return platform name
+    # Return platform name
     @cached_property
     def Name(self):
         return self.Platform.PlatformName
 
-    ## Return meta-file GUID
+    # Return meta-file GUID
     @cached_property
     def Guid(self):
         return self.Platform.Guid
 
-    ## Return platform version
+    # Return platform version
     @cached_property
     def Version(self):
         return self.Platform.Version
 
-    ## Return paths of tools
+    # Return paths of tools
     @cached_property
     def ToolDefinition(self):
         return self.AutoGenObjectList[0].ToolDefinition
 
-    ## Return directory of platform makefile
+    # Return directory of platform makefile
     #
     #   @retval     string  Makefile directory
     #
@@ -800,7 +874,7 @@ class WorkspaceAutoGen(AutoGen):
     def MakeFileDir(self):
         return self.BuildDir
 
-    ## Return build command string
+    # Return build command string
     #
     #   @retval     string  Build command string
     #
@@ -809,7 +883,7 @@ class WorkspaceAutoGen(AutoGen):
         # BuildCommand should be all the same. So just get one from platform AutoGen
         return self.AutoGenObjectList[0].BuildCommand
 
-    ## Check the PCDs token value conflict in each DEC file.
+    # Check the PCDs token value conflict in each DEC file.
     #
     # Will cause build break and raise error message while two PCDs conflict.
     #
@@ -821,7 +895,7 @@ class WorkspaceAutoGen(AutoGen):
                 PcdList = list(Package.Pcds.values())
                 PcdList.sort(key=lambda x: int(x.TokenValue, 0))
                 Count = 0
-                while (Count < len(PcdList) - 1) :
+                while (Count < len(PcdList) - 1):
                     Item = PcdList[Count]
                     ItemNext = PcdList[Count + 1]
                     #
@@ -834,13 +908,15 @@ class WorkspaceAutoGen(AutoGen):
                         RemainPcdListLength = len(PcdList) - Count - 2
                         for ValueSameCount in range(RemainPcdListLength):
                             if int(PcdList[len(PcdList) - RemainPcdListLength + ValueSameCount].TokenValue, 0) == int(Item.TokenValue, 0):
-                                SameTokenValuePcdList.append(PcdList[len(PcdList) - RemainPcdListLength + ValueSameCount])
+                                SameTokenValuePcdList.append(
+                                    PcdList[len(PcdList) - RemainPcdListLength + ValueSameCount])
                             else:
-                                break;
+                                break
                         #
                         # Sort same token value PCD list with TokenGuid and TokenCName
                         #
-                        SameTokenValuePcdList.sort(key=lambda x: "%s.%s" % (x.TokenSpaceGuidCName, x.TokenCName))
+                        SameTokenValuePcdList.sort(key=lambda x: "%s.%s" % (
+                            x.TokenSpaceGuidCName, x.TokenCName))
                         SameTokenValuePcdListCount = 0
                         while (SameTokenValuePcdListCount < len(SameTokenValuePcdList) - 1):
                             Flag = False
@@ -850,24 +926,25 @@ class WorkspaceAutoGen(AutoGen):
                             if (TemListItem.TokenSpaceGuidCName == TemListItemNext.TokenSpaceGuidCName) and (TemListItem.TokenCName != TemListItemNext.TokenCName):
                                 for PcdItem in GlobalData.MixedPcd:
                                     if (TemListItem.TokenCName, TemListItem.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem] or \
-                                        (TemListItemNext.TokenCName, TemListItemNext.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]:
+                                            (TemListItemNext.TokenCName, TemListItemNext.TokenSpaceGuidCName) in GlobalData.MixedPcd[PcdItem]:
                                         Flag = True
                                 if not Flag:
                                     EdkLogger.error(
-                                                'build',
-                                                FORMAT_INVALID,
-                                                "The TokenValue [%s] of PCD [%s.%s] is conflict with: [%s.%s] in %s"\
-                                                % (TemListItem.TokenValue, TemListItem.TokenSpaceGuidCName, TemListItem.TokenCName, TemListItemNext.TokenSpaceGuidCName, TemListItemNext.TokenCName, Package),
-                                                ExtraData=None
-                                                )
+                                        'build',
+                                        FORMAT_INVALID,
+                                        "The TokenValue [%s] of PCD [%s.%s] is conflict with: [%s.%s] in %s"
+                                        % (TemListItem.TokenValue, TemListItem.TokenSpaceGuidCName, TemListItem.TokenCName, TemListItemNext.TokenSpaceGuidCName, TemListItemNext.TokenCName, Package),
+                                        ExtraData=None
+                                    )
                             SameTokenValuePcdListCount += 1
                         Count += SameTokenValuePcdListCount
                     Count += 1
 
                 PcdList = list(Package.Pcds.values())
-                PcdList.sort(key=lambda x: "%s.%s" % (x.TokenSpaceGuidCName, x.TokenCName))
+                PcdList.sort(key=lambda x: "%s.%s" %
+                             (x.TokenSpaceGuidCName, x.TokenCName))
                 Count = 0
-                while (Count < len(PcdList) - 1) :
+                while (Count < len(PcdList) - 1):
                     Item = PcdList[Count]
                     ItemNext = PcdList[Count + 1]
                     #
@@ -875,14 +952,15 @@ class WorkspaceAutoGen(AutoGen):
                     #
                     if (Item.TokenSpaceGuidCName == ItemNext.TokenSpaceGuidCName) and (Item.TokenCName == ItemNext.TokenCName) and (int(Item.TokenValue, 0) != int(ItemNext.TokenValue, 0)):
                         EdkLogger.error(
-                                    'build',
-                                    FORMAT_INVALID,
-                                    "The TokenValue [%s] of PCD [%s.%s] in %s defined in two places should be same as well."\
-                                    % (Item.TokenValue, Item.TokenSpaceGuidCName, Item.TokenCName, Package),
-                                    ExtraData=None
-                                    )
+                            'build',
+                            FORMAT_INVALID,
+                            "The TokenValue [%s] of PCD [%s.%s] in %s defined in two places should be same as well."
+                            % (Item.TokenValue, Item.TokenSpaceGuidCName, Item.TokenCName, Package),
+                            ExtraData=None
+                        )
                     Count += 1
-    ## Generate fds command
+    # Generate fds command
+
     @property
     def GenFdsCommand(self):
         return (GenMake.TopLevelMakefile(self)._TEMPLATE_.Replace(GenMake.TopLevelMakefile(self)._TemplateDict)).strip()
@@ -909,7 +987,8 @@ class WorkspaceAutoGen(AutoGen):
             else:
                 pcdname = '.'.join(pcd[0:2])
             if pcd[3].startswith('{'):
-                FdsCommandDict["OptionPcd"].append(pcdname + '=' + 'H' + '"' + pcd[3] + '"')
+                FdsCommandDict["OptionPcd"].append(
+                    pcdname + '=' + 'H' + '"' + pcd[3] + '"')
             else:
                 FdsCommandDict["OptionPcd"].append(pcdname + '=' + pcd[3])
 
@@ -920,7 +999,8 @@ class WorkspaceAutoGen(AutoGen):
         MacroDict.update(GlobalData.gCommandLineDefines)
         for MacroName in MacroDict:
             if MacroDict[MacroName] != "":
-                MacroList.append('"%s=%s"' % (MacroName, MacroDict[MacroName].replace('\\', '\\\\')))
+                MacroList.append('"%s=%s"' % (
+                    MacroName, MacroDict[MacroName].replace('\\', '\\\\')))
             else:
                 MacroList.append('"%s"' % MacroName)
         FdsCommandDict["macro"] = MacroList
@@ -939,7 +1019,7 @@ class WorkspaceAutoGen(AutoGen):
         FdsCommandDict["cap"] = self.CapTargetList
         return FdsCommandDict
 
-    ## Create makefile for the platform and modules in it
+    # Create makefile for the platform and modules in it
     #
     #   @param      CreateDepsMakeFile      Flag indicating if the makefile for
     #                                       modules will be created as well
@@ -950,7 +1030,7 @@ class WorkspaceAutoGen(AutoGen):
         for Pa in self.AutoGenObjectList:
             Pa.CreateMakeFile(CreateDepsMakeFile)
 
-    ## Create autogen code for platform and modules
+    # Create autogen code for platform and modules
     #
     #  Since there's no autogen code for platform, this method will do nothing
     #  if CreateModuleCodeFile is set to False.
@@ -964,8 +1044,7 @@ class WorkspaceAutoGen(AutoGen):
         for Pa in self.AutoGenObjectList:
             Pa.CreateCodeFile(CreateDepsCodeFile)
 
-    ## Create AsBuilt INF file the platform
+    # Create AsBuilt INF file the platform
     #
     def CreateAsBuiltInf(self):
         return
-
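
A note on the mechanics, since every hunk above follows the same pattern:
autopep8 removes the spaces around keyword-argument '=' signs (E251), drops
stray semicolons and the whitespace before '(' or ':' (E703/E211/E203), and
wraps calls that run past the line limit so the continuation lines align
under the opening bracket (E501 together with the E127/E128 fixes). As a
rough, self-contained sketch only -- the exact invocation and option set used
for this series are not shown in the hunks, so the defaults below are an
assumption -- the same kind of rewrite can be reproduced in memory through
the autopep8 Python API:

    import autopep8

    # Made-up snippet (not taken from the tree) with the same issues the
    # hunks above correct: spaces around keyword '=', two statements joined
    # by a semicolon, and a call that runs past 79 columns.
    before = (
        "def report(logger, name, guid):\n"
        "    logger.error('build', File = name, Line = guid); "
        "logger.error('build', 'Using Dynamic or DynamicEx type of PCD "
        "[%s.%s] in FDF file is not allowed.' % (guid, name))\n"
    )

    # fix_code() returns the reformatted source and leaves the input string
    # untouched; the 79-column limit is PEP 8's default and an assumption
    # about what the series used.
    after = autopep8.fix_code(before, options={"max_line_length": 79})
    print(after)

A stock run should split the two statements onto separate lines, drop the
spaces around the keyword '=' signs and re-wrap the long call in the same
style as the EdkLogger hunks above.
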
diff --git a/BaseTools/Source/Python/AutoGen/__init__.py b/BaseTools/Source/Python/AutoGen/__init__.py
index 47451c4c40fa..7e12731d6c86 100644
--- a/BaseTools/Source/Python/AutoGen/__init__.py
+++ b/BaseTools/Source/Python/AutoGen/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'AutoGen' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
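
The single-character hunk above is the other recurring change in this patch:
pycodestyle flags a doubled comment marker as E266 ("too many leading '#'
for block comment"), and the hunks in this series show autopep8 rewriting
the EDK2 Doxygen-style '## @file' and '## <description>' headers to
single-hash comments, here and throughout the larger files. A minimal sketch
of applying the tool to just this file (assuming an edk2 checkout with
autopep8 installed; the default options are an assumption, not the exact set
used for the series):

    import autopep8

    # Path taken from the diff header above.
    path = "BaseTools/Source/Python/AutoGen/__init__.py"

    with open(path) as handle:
        original = handle.read()

    formatted = autopep8.fix_code(original)

    # Write back only when something actually changed, mirroring what
    # 'autopep8 --in-place' does when run over a whole tree.
    if formatted != original:
        with open(path, "w") as handle:
            handle.write(formatted)
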
diff --git a/BaseTools/Source/Python/BPDG/BPDG.py b/BaseTools/Source/Python/BPDG/BPDG.py
index 283e08a37a0f..0820cfd5e7e0 100644
--- a/BaseTools/Source/Python/BPDG/BPDG.py
+++ b/BaseTools/Source/Python/BPDG/BPDG.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #  Intel Binary Product Data Generation Tool (Intel BPDG).
 #  This tool provide a simple process for the creation of a binary file containing read-only
 #  configuration data for EDK II platforms that contain Dynamic and DynamicEx PCDs described
@@ -28,10 +28,10 @@ from Common.BuildVersion import gBUILD_VERSION
 from . import StringTable as st
 from . import GenVpd
 
-PROJECT_NAME       = st.LBL_BPDG_LONG_UNI
-VERSION            = (st.LBL_BPDG_VERSION + " Build " + gBUILD_VERSION)
+PROJECT_NAME = st.LBL_BPDG_LONG_UNI
+VERSION = (st.LBL_BPDG_VERSION + " Build " + gBUILD_VERSION)
 
-## Tool entrance method
+# Tool entrance method
 #
 # This method mainly dispatch specific methods per the command line options.
 # If no error found, return zero value so the caller of this tool can know
@@ -40,6 +40,8 @@ VERSION            = (st.LBL_BPDG_VERSION + " Build " + gBUILD_VERSION)
 #   @retval 0     Tool was successful
 #   @retval 1     Tool failed
 #
+
+
 def main():
     global Options, Args
 
@@ -59,24 +61,26 @@ def main():
         EdkLogger.SetLevel(EdkLogger.INFO)
 
     if Options.bin_filename is None:
-        EdkLogger.error("BPDG", ATTRIBUTE_NOT_AVAILABLE, "Please use the -o option to specify the file name for the VPD binary file")
+        EdkLogger.error("BPDG", ATTRIBUTE_NOT_AVAILABLE,
+                        "Please use the -o option to specify the file name for the VPD binary file")
     if Options.filename is None:
-        EdkLogger.error("BPDG", ATTRIBUTE_NOT_AVAILABLE, "Please use the -m option to specify the file name for the mapping file")
+        EdkLogger.error("BPDG", ATTRIBUTE_NOT_AVAILABLE,
+                        "Please use the -m option to specify the file name for the mapping file")
 
     Force = False
     if Options.opt_force is not None:
         Force = True
 
-    if (Args[0] is not None) :
+    if (Args[0] is not None):
         StartBpdg(Args[0], Options.filename, Options.bin_filename, Force)
-    else :
+    else:
         EdkLogger.error("BPDG", ATTRIBUTE_NOT_AVAILABLE, "Please specify the file which contain the VPD pcd info.",
                         None)
 
     return ReturnCode
 
 
-## Parse command line options
+# Parse command line options
 #
 # Using standard Python module optparse to parse command line option of this tool.
 #
@@ -107,13 +111,14 @@ def MyOptionParser():
 
     (options, args) = parser.parse_args()
     if len(args) == 0:
-        EdkLogger.info("Please specify the filename.txt file which contain the VPD pcd info!")
+        EdkLogger.info(
+            "Please specify the filename.txt file which contain the VPD pcd info!")
         EdkLogger.info(parser.usage)
         sys.exit(1)
     return options, args
 
 
-## Start BPDG and call the main functions
+# Start BPDG and call the main functions
 #
 # This method mainly focus on call GenVPD class member functions to complete
 # BPDG's target. It will process VpdFile override, and provide the interface file
@@ -133,7 +138,7 @@ def StartBpdg(InputFileName, MapFileName, VpdFileName, Force):
         if choice.strip().lower() not in ['y', 'yes', '']:
             return
 
-    GenVPD = GenVpd.GenVPD (InputFileName, MapFileName, VpdFileName)
+    GenVPD = GenVpd.GenVPD(InputFileName, MapFileName, VpdFileName)
 
     EdkLogger.info('%-24s = %s' % ("VPD input data file: ", InputFileName))
     EdkLogger.info('%-24s = %s' % ("VPD output map file: ", MapFileName))
@@ -146,13 +151,13 @@ def StartBpdg(InputFileName, MapFileName, VpdFileName, Force):
 
     EdkLogger.info("- Vpd pcd fixed done! -")
 
+
 if __name__ == '__main__':
     try:
         r = main()
     except FatalError as e:
         r = e
-    ## 0-127 is a safe return range, and 1 is a standard default error
-    if r < 0 or r > 127: r = 1
+    # 0-127 is a safe return range, and 1 is a standard default error
+    if r < 0 or r > 127:
+        r = 1
     sys.exit(r)
-
-
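
Because hunks like the return-code clamp above only re-break lines and
comments, one cheap way to double-check that a formatting-only pass changed
no behaviour is to compare a file's syntax tree before and after the
rewrite. This is only a suggestion on my part, not something the series
itself does, and the '.orig' backup name used below is hypothetical:

    import ast

    def same_ast(path_before, path_after):
        """Return True if the two sources parse to an identical AST.

        Formatting-only edits (whitespace, line wrapping, comments) do not
        show up in the AST, so a pure autopep8 pass should compare equal,
        while anything that altered actual code would not.
        """
        with open(path_before) as before, open(path_after) as after:
            tree_before = ast.parse(before.read())
            tree_after = ast.parse(after.read())
        # ast.dump() omits line/column attributes by default, so two files
        # that differ only in layout produce the same dump.
        return ast.dump(tree_before) == ast.dump(tree_after)

    # Example: same_ast("BPDG.py.orig", "BPDG.py") should be True for every
    # file touched by this patch if the reformatting is behaviour-neutral.
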
diff --git a/BaseTools/Source/Python/BPDG/GenVpd.py b/BaseTools/Source/Python/BPDG/GenVpd.py
index 049c082f40e2..371d5a8ebf06 100644
--- a/BaseTools/Source/Python/BPDG/GenVpd.py
+++ b/BaseTools/Source/Python/BPDG/GenVpd.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #  This file include GenVpd class for fix the Vpd type PCD offset, and PcdEntry for describe
 #  and process each entry of vpd type PCD.
 #
@@ -25,40 +25,42 @@ _FORMAT_CHAR = {1: 'B',
                 8: 'Q'
                 }
 
-## The VPD PCD data structure for store and process each VPD PCD entry.
+# The VPD PCD data structure for store and process each VPD PCD entry.
 #
 #  This class contain method to format and pack pcd's value.
 #
+
+
 class PcdEntry:
-    def __init__(self, PcdCName, SkuId,PcdOffset, PcdSize, PcdValue, Lineno=None, FileName=None, PcdUnpackValue=None,
+    def __init__(self, PcdCName, SkuId, PcdOffset, PcdSize, PcdValue, Lineno=None, FileName=None, PcdUnpackValue=None,
                  PcdBinOffset=None, PcdBinSize=None, Alignment=None):
-        self.PcdCName       = PcdCName.strip()
-        self.SkuId          = SkuId.strip()
-        self.PcdOffset      = PcdOffset.strip()
-        self.PcdSize        = PcdSize.strip()
-        self.PcdValue       = PcdValue.strip()
-        self.Lineno         = Lineno.strip()
-        self.FileName       = FileName.strip()
+        self.PcdCName = PcdCName.strip()
+        self.SkuId = SkuId.strip()
+        self.PcdOffset = PcdOffset.strip()
+        self.PcdSize = PcdSize.strip()
+        self.PcdValue = PcdValue.strip()
+        self.Lineno = Lineno.strip()
+        self.FileName = FileName.strip()
         self.PcdUnpackValue = PcdUnpackValue
-        self.PcdBinOffset   = PcdBinOffset
-        self.PcdBinSize     = PcdBinSize
-        self.Alignment       = Alignment
+        self.PcdBinOffset = PcdBinOffset
+        self.PcdBinSize = PcdBinSize
+        self.Alignment = Alignment
 
-        if self.PcdValue == '' :
+        if self.PcdValue == '':
             EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID,
                             "Invalid PCD format(Name: %s File: %s line: %s) , no Value specified!" % (self.PcdCName, self.FileName, self.Lineno))
 
-        if self.PcdOffset == '' :
+        if self.PcdOffset == '':
             EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID,
                             "Invalid PCD format(Name: %s File: %s Line: %s) , no Offset specified!" % (self.PcdCName, self.FileName, self.Lineno))
 
-        if self.PcdSize == '' :
+        if self.PcdSize == '':
             EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID,
                             "Invalid PCD format(Name: %s File: %s Line: %s), no PcdSize specified!" % (self.PcdCName, self.FileName, self.Lineno))
 
-        self._GenOffsetValue ()
+        self._GenOffsetValue()
 
-    ## Analyze the string value to judge the PCD's datum type equal to Boolean or not.
+    # Analyze the string value to judge the PCD's datum type equal to Boolean or not.
     #
     #  @param   ValueString      PCD's value
     #  @param   Size             PCD's size
@@ -75,7 +77,7 @@ class PcdEntry:
 
         return False
 
-    ## Convert the PCD's value from string to integer.
+    # Convert the PCD's value from string to integer.
     #
     #  This function will try to convert the Offset value form string to integer
     #  for both hexadecimal and decimal.
@@ -83,7 +85,7 @@ class PcdEntry:
     def _GenOffsetValue(self):
         if self.PcdOffset != TAB_STAR:
             try:
-                self.PcdBinOffset = int (self.PcdOffset)
+                self.PcdBinOffset = int(self.PcdOffset)
             except:
                 try:
                     self.PcdBinOffset = int(self.PcdOffset, 16)
@@ -91,7 +93,7 @@ class PcdEntry:
                     EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID,
                                     "Invalid offset value %s for PCD %s (File: %s Line: %s)" % (self.PcdOffset, self.PcdCName, self.FileName, self.Lineno))
 
-    ## Pack Boolean type VPD PCD's value form string to binary type.
+    # Pack Boolean type VPD PCD's value form string to binary type.
     #
     #  @param ValueString     The boolean type string for pack.
     #
@@ -110,7 +112,7 @@ class PcdEntry:
                 EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID,
                                 "Invalid size or value for PCD %s to pack(File: %s Line: %s)." % (self.PcdCName, self.FileName, self.Lineno))
 
-    ## Pack Integer type VPD PCD's value form string to binary type.
+    # Pack Integer type VPD PCD's value form string to binary type.
     #
     #  @param ValueString     The Integer type string for pack.
     #
@@ -127,11 +129,11 @@ class PcdEntry:
                 if IntValue < 0:
                     EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID,
                                     "PCD can't be set to negative value %d for PCD %s in %s datum type(File: %s Line: %s)." % (
-                                    IntValue, self.PcdCName, Type, self.FileName, self.Lineno))
+                                        IntValue, self.PcdCName, Type, self.FileName, self.Lineno))
                 elif IntValue > MAX_VAL_TYPE[Type]:
                     EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID,
                                     "Too large PCD value %d for datum type %s for PCD %s(File: %s Line: %s)." % (
-                                    IntValue, Type, self.PcdCName, self.FileName, self.Lineno))
+                                        IntValue, Type, self.PcdCName, self.FileName, self.Lineno))
 
         try:
             self.PcdValue = pack(_FORMAT_CHAR[Size], IntValue)
@@ -139,7 +141,7 @@ class PcdEntry:
             EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID,
                             "Invalid size or value for PCD %s to pack(File: %s Line: %s)." % (self.PcdCName, self.FileName, self.Lineno))
 
-    ## Pack VOID* type VPD PCD's value form string to binary type.
+    # Pack VOID* type VPD PCD's value form string to binary type.
     #
     #  The VOID* type of string divided into 3 sub-type:
     #    1:    L"String"/L'String', Unicode type string.
@@ -159,7 +161,7 @@ class PcdEntry:
             EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID,
                             "Invalid VOID* type PCD %s value %s (File: %s Line: %s)" % (self.PcdCName, ValueString, self.FileName, self.Lineno))
 
-    ## Pack an Ascii PCD value.
+    # Pack an Ascii PCD value.
     #
     #  An Ascii string for a PCD should be in format as  ""/''.
     #
@@ -168,7 +170,8 @@ class PcdEntry:
             EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID,
                             "Invalid parameter Size %s of PCD %s!(File: %s Line: %s)" % (self.PcdBinSize, self.PcdCName, self.FileName, self.Lineno))
         if (ValueString == ""):
-            EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID, "Invalid parameter ValueString %s of PCD %s!(File: %s Line: %s)" % (self.PcdUnpackValue, self.PcdCName, self.FileName, self.Lineno))
+            EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID, "Invalid parameter ValueString %s of PCD %s!(File: %s Line: %s)" % (
+                self.PcdUnpackValue, self.PcdCName, self.FileName, self.Lineno))
 
         QuotedFlag = True
         if ValueString.startswith("'"):
@@ -185,15 +188,17 @@ class PcdEntry:
             EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID,
                             "Invalid size or value for PCD %s to pack(File: %s Line: %s)." % (self.PcdCName, self.FileName, self.Lineno))
 
-    ## Pack a byte-array PCD value.
+    # Pack a byte-array PCD value.
     #
     #  A byte-array for a PCD should be in format as  {0x01, 0x02, ...}.
     #
     def _PackByteArray(self, ValueString, Size):
         if (Size < 0):
-            EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID, "Invalid parameter Size %s of PCD %s!(File: %s Line: %s)" % (self.PcdBinSize, self.PcdCName, self.FileName, self.Lineno))
+            EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID, "Invalid parameter Size %s of PCD %s!(File: %s Line: %s)" % (
+                self.PcdBinSize, self.PcdCName, self.FileName, self.Lineno))
         if (ValueString == ""):
-            EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID, "Invalid parameter ValueString %s of PCD %s!(File: %s Line: %s)" % (self.PcdUnpackValue, self.PcdCName, self.FileName, self.Lineno))
+            EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID, "Invalid parameter ValueString %s of PCD %s!(File: %s Line: %s)" % (
+                self.PcdUnpackValue, self.PcdCName, self.FileName, self.Lineno))
 
         ValueString = ValueString.strip()
         ValueString = ValueString.lstrip('{').strip('}')
@@ -214,7 +219,7 @@ class PcdEntry:
                     Value = int(ValueList[Index], 16)
                 except:
                     EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID,
-                                    "The value item %s in byte array %s is an invalid HEX value.(File: %s Line: %s)" % \
+                                    "The value item %s in byte array %s is an invalid HEX value.(File: %s Line: %s)" %
                                     (ValueList[Index], ValueString, self.FileName, self.Lineno))
             else:
                 # translate decimal value
@@ -222,12 +227,12 @@ class PcdEntry:
                     Value = int(ValueList[Index], 10)
                 except:
                     EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID,
-                                    "The value item %s in byte array %s is an invalid DECIMAL value.(File: %s Line: %s)" % \
+                                    "The value item %s in byte array %s is an invalid DECIMAL value.(File: %s Line: %s)" %
                                     (ValueList[Index], ValueString, self.FileName, self.Lineno))
 
             if Value > 255:
                 EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID,
-                                "The value item %s in byte array %s do not in range 0 ~ 0xFF(File: %s Line: %s)" % \
+                                "The value item %s in byte array %s do not in range 0 ~ 0xFF(File: %s Line: %s)" %
                                 (ValueList[Index], ValueString, self.FileName, self.Lineno))
 
             ReturnArray.append(Value)
@@ -237,14 +242,14 @@ class PcdEntry:
 
         self.PcdValue = ReturnArray.tolist()
 
-    ## Pack a unicode PCD value into byte array.
+    # Pack a unicode PCD value into byte array.
     #
     #  A unicode string for a PCD should be in format as  L""/L''.
     #
     def _PackUnicode(self, UnicodeString, Size):
         if (Size < 0):
-            EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID, "Invalid parameter Size %s of PCD %s!(File: %s Line: %s)" % \
-                             (self.PcdBinSize, self.PcdCName, self.FileName, self.Lineno))
+            EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID, "Invalid parameter Size %s of PCD %s!(File: %s Line: %s)" %
+                            (self.PcdBinSize, self.PcdCName, self.FileName, self.Lineno))
 
         QuotedFlag = True
         if UnicodeString.startswith("L'"):
@@ -254,7 +259,7 @@ class PcdEntry:
         # No null-terminator in L'string'
         if (QuotedFlag and (len(UnicodeString) + 1) * 2 > Size) or (not QuotedFlag and len(UnicodeString) * 2 > Size):
             EdkLogger.error("BPDG", BuildToolError.RESOURCE_OVERFLOW,
-                            "The size of unicode string %s is too larger for size %s(File: %s Line: %s)" % \
+                            "The size of unicode string %s is too larger for size %s(File: %s Line: %s)" %
                             (UnicodeString, Size, self.FileName, self.Lineno))
 
         ReturnArray = array.array('B')
@@ -264,7 +269,7 @@ class PcdEntry:
                 ReturnArray.append(0)
             except:
                 EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID,
-                                "Invalid unicode character %s in unicode string %s(File: %s Line: %s)" % \
+                                "Invalid unicode character %s in unicode string %s(File: %s Line: %s)" %
                                 (Value, UnicodeString, self.FileName, self.Lineno))
 
         for Index in range(len(UnicodeString) * 2, Size):
@@ -273,8 +278,7 @@ class PcdEntry:
         self.PcdValue = ReturnArray.tolist()
 
 
-
-## The class implementing the BPDG VPD PCD offset fix process
+# The class implementing the BPDG VPD PCD offset fix process
 #
 #   The VPD PCD offset fix process includes:
 #       1. Parse the input guided.txt file and store it in the data structure;
@@ -282,8 +286,8 @@ class PcdEntry:
 #       3. Fixed offset if needed;
 #       4. Generate output file, including guided.map and guided.bin file;
 #
-class GenVPD :
-    ## Constructor of DscBuildData
+class GenVPD:
+    # Constructor of DscBuildData
     #
     #  Initialize object of GenVPD
     #   @Param      InputFileName   The filename include the vpd type pcd information
@@ -293,28 +297,30 @@ class GenVPD :
     #   @param      VpdFileName     The filename of Vpd file that hold vpd pcd information.
     #
     def __init__(self, InputFileName, MapFileName, VpdFileName):
-        self.InputFileName           = InputFileName
-        self.MapFileName             = MapFileName
-        self.VpdFileName             = VpdFileName
-        self.FileLinesList           = []
-        self.PcdFixedOffsetSizeList  = []
-        self.PcdUnknownOffsetList    = []
+        self.InputFileName = InputFileName
+        self.MapFileName = MapFileName
+        self.VpdFileName = VpdFileName
+        self.FileLinesList = []
+        self.PcdFixedOffsetSizeList = []
+        self.PcdUnknownOffsetList = []
         try:
             fInputfile = open(InputFileName, "r")
             try:
                 self.FileLinesList = fInputfile.readlines()
             except:
-                EdkLogger.error("BPDG", BuildToolError.FILE_READ_FAILURE, "File read failed for %s" % InputFileName, None)
+                EdkLogger.error("BPDG", BuildToolError.FILE_READ_FAILURE,
+                                "File read failed for %s" % InputFileName, None)
             finally:
                 fInputfile.close()
         except:
-            EdkLogger.error("BPDG", BuildToolError.FILE_OPEN_FAILURE, "File open failed for %s" % InputFileName, None)
+            EdkLogger.error("BPDG", BuildToolError.FILE_OPEN_FAILURE,
+                            "File open failed for %s" % InputFileName, None)
 
     ##
     # Parser the input file which is generated by the build tool. Convert the value of each pcd's
     # from string to its real format. Also remove the useless line in the input file.
     #
-    def ParserInputFile (self):
+    def ParserInputFile(self):
         count = 0
         for line in self.FileLinesList:
             # Strip "\r\n" generated by readlines ().
@@ -322,7 +328,7 @@ class GenVPD :
             line = line.rstrip(os.linesep)
 
             # Skip the comment line
-            if (not line.startswith("#")) and len(line) > 1 :
+            if (not line.startswith("#")) and len(line) > 1:
                 #
                 # Enhanced for support "|" character in the string.
                 #
@@ -345,10 +351,10 @@ class GenVPD :
                 self.FileLinesList[count] = ValueList
                 # Store the line number
                 self.FileLinesList[count].append(str(count + 1))
-            elif len(line) <= 1 :
+            elif len(line) <= 1:
                 # Set the blank line to "None"
                 self.FileLinesList[count] = None
-            else :
+            else:
                 # Set the comment line to "None"
                 self.FileLinesList[count] = None
             count += 1
@@ -356,40 +362,41 @@ class GenVPD :
         # The line count contain usage information
         count = 0
         # Delete useless lines
-        while (True) :
-            try :
-                if (self.FileLinesList[count] is None) :
+        while (True):
+            try:
+                if (self.FileLinesList[count] is None):
                     del(self.FileLinesList[count])
-                else :
+                else:
                     count += 1
-            except :
+            except:
                 break
         #
         # After remove the useless line, if there are no data remain in the file line list,
         # Report warning messages to user's.
         #
-        if len(self.FileLinesList) == 0 :
+        if len(self.FileLinesList) == 0:
             EdkLogger.warn('BPDG', BuildToolError.RESOURCE_NOT_AVAILABLE,
                            "There are no VPD type pcds defined in DSC file, Please check it.")
 
         # Process the pcds one by one base on the pcd's value and size
         count = 0
         for line in self.FileLinesList:
-            if line is not None :
-                PCD = PcdEntry(line[0], line[1], line[2], line[3], line[4], line[5], self.InputFileName)
+            if line is not None:
+                PCD = PcdEntry(line[0], line[1], line[2], line[3],
+                               line[4], line[5], self.InputFileName)
                 # Strip the space char
-                PCD.PcdCName     = PCD.PcdCName.strip(' ')
-                PCD.SkuId        = PCD.SkuId.strip(' ')
-                PCD.PcdOffset    = PCD.PcdOffset.strip(' ')
-                PCD.PcdSize      = PCD.PcdSize.strip(' ')
-                PCD.PcdValue     = PCD.PcdValue.strip(' ')
-                PCD.Lineno       = PCD.Lineno.strip(' ')
+                PCD.PcdCName = PCD.PcdCName.strip(' ')
+                PCD.SkuId = PCD.SkuId.strip(' ')
+                PCD.PcdOffset = PCD.PcdOffset.strip(' ')
+                PCD.PcdSize = PCD.PcdSize.strip(' ')
+                PCD.PcdValue = PCD.PcdValue.strip(' ')
+                PCD.Lineno = PCD.Lineno.strip(' ')
 
                 #
                 # Store the original pcd value.
                 # This information will be useful while generate the output map file.
                 #
-                PCD.PcdUnpackValue    =  str(PCD.PcdValue)
+                PCD.PcdUnpackValue = str(PCD.PcdValue)
 
                 #
                 # Translate PCD size string to an integer value.
@@ -402,7 +409,8 @@ class GenVPD :
                         PackSize = int(PCD.PcdSize, 16)
                         PCD.PcdBinSize = PackSize
                     except:
-                        EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID, "Invalid PCD size value %s at file: %s line: %s" % (PCD.PcdSize, self.InputFileName, PCD.Lineno))
+                        EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID, "Invalid PCD size value %s at file: %s line: %s" % (
+                            PCD.PcdSize, self.InputFileName, PCD.Lineno))
 
                 #
                 # If value is Unicode string (e.g. L""), then use 2-byte alignment
@@ -420,12 +428,15 @@ class GenVPD :
                 if PCD.PcdOffset != TAB_STAR:
                     if PCD.PcdOccupySize % Alignment != 0:
                         if PCD.PcdUnpackValue.startswith("{"):
-                            EdkLogger.warn("BPDG", "The offset value of PCD %s is not 8-byte aligned!" %(PCD.PcdCName), File=self.InputFileName)
+                            EdkLogger.warn("BPDG", "The offset value of PCD %s is not 8-byte aligned!" % (
+                                PCD.PcdCName), File=self.InputFileName)
                         else:
-                            EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID, 'The offset value of PCD %s should be %s-byte aligned.' % (PCD.PcdCName, Alignment))
+                            EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID,
+                                            'The offset value of PCD %s should be %s-byte aligned.' % (PCD.PcdCName, Alignment))
                 else:
                     if PCD.PcdOccupySize % Alignment != 0:
-                        PCD.PcdOccupySize = (PCD.PcdOccupySize // Alignment + 1) * Alignment
+                        PCD.PcdOccupySize = (
+                            PCD.PcdOccupySize // Alignment + 1) * Alignment
 
                 PackSize = PCD.PcdOccupySize
                 if PCD._IsBoolean(PCD.PcdValue, PCD.PcdSize):
@@ -453,29 +464,29 @@ class GenVPD :
 
                 self.FileLinesList[count] = PCD
                 count += 1
-            else :
+            else:
                 continue
 
     ##
     # This function used to create a clean list only contain useful information and reorganized to make it
     # easy to be sorted
     #
-    def FormatFileLine (self) :
+    def FormatFileLine(self):
 
-        for eachPcd in self.FileLinesList :
-            if eachPcd.PcdOffset != TAB_STAR :
+        for eachPcd in self.FileLinesList:
+            if eachPcd.PcdOffset != TAB_STAR:
                 # Use pcd's Offset value as key, and pcd's Value as value
                 self.PcdFixedOffsetSizeList.append(eachPcd)
-            else :
+            else:
                 # Use pcd's CName as key, and pcd's Size as value
                 self.PcdUnknownOffsetList.append(eachPcd)
 
-
     ##
     # This function is use to fix the offset value which the not specified in the map file.
     # Usually it use the star (meaning any offset) character in the offset field
     #
-    def FixVpdOffset (self):
+
+    def FixVpdOffset(self):
         # At first, the offset should start at 0
         # Sort fixed offset list in order to find out where has free spaces for the pcd's offset
         # value is TAB_STAR to insert into.
@@ -487,187 +498,204 @@ class GenVPD :
         #
         self.PcdUnknownOffsetList.sort(key=lambda x: x.PcdBinSize)
 
-        index =0
+        index = 0
         for pcd in self.PcdUnknownOffsetList:
             index += 1
             if pcd.PcdCName == ".".join(("gEfiMdeModulePkgTokenSpaceGuid", "PcdNvStoreDefaultValueBuffer")):
                 if index != len(self.PcdUnknownOffsetList):
                     for i in range(len(self.PcdUnknownOffsetList) - index):
-                        self.PcdUnknownOffsetList[index+i -1 ], self.PcdUnknownOffsetList[index+i] = self.PcdUnknownOffsetList[index+i], self.PcdUnknownOffsetList[index+i -1]
+                        self.PcdUnknownOffsetList[index+i - 1], self.PcdUnknownOffsetList[index +
+                                                                                          i] = self.PcdUnknownOffsetList[index+i], self.PcdUnknownOffsetList[index+i - 1]
 
         #
         # Process all Offset value are TAB_STAR
         #
-        if (len(self.PcdFixedOffsetSizeList) == 0) and (len(self.PcdUnknownOffsetList) != 0) :
+        if (len(self.PcdFixedOffsetSizeList) == 0) and (len(self.PcdUnknownOffsetList) != 0):
             # The offset start from 0
             NowOffset = 0
-            for Pcd in self.PcdUnknownOffsetList :
+            for Pcd in self.PcdUnknownOffsetList:
                 if NowOffset % Pcd.Alignment != 0:
-                    NowOffset = (NowOffset// Pcd.Alignment + 1) * Pcd.Alignment
+                    NowOffset = (NowOffset // Pcd.Alignment + 1) * \
+                        Pcd.Alignment
                 Pcd.PcdBinOffset = NowOffset
-                Pcd.PcdOffset    = str(hex(Pcd.PcdBinOffset))
-                NowOffset       += Pcd.PcdOccupySize
+                Pcd.PcdOffset = str(hex(Pcd.PcdBinOffset))
+                NowOffset += Pcd.PcdOccupySize
 
             self.PcdFixedOffsetSizeList = self.PcdUnknownOffsetList
             return
 
         # Check the offset of VPD type pcd's offset start from 0.
-        if self.PcdFixedOffsetSizeList[0].PcdBinOffset != 0 :
+        if self.PcdFixedOffsetSizeList[0].PcdBinOffset != 0:
             EdkLogger.warn("BPDG", "The offset of VPD type pcd should start with 0, please check it.",
-                            None)
+                           None)
 
         # Judge whether the offset in fixed pcd offset list is overlapped or not.
         lenOfList = len(self.PcdFixedOffsetSizeList)
-        count     = 0
-        while (count < lenOfList - 1) :
-            PcdNow  = self.PcdFixedOffsetSizeList[count]
+        count = 0
+        while (count < lenOfList - 1):
+            PcdNow = self.PcdFixedOffsetSizeList[count]
             PcdNext = self.PcdFixedOffsetSizeList[count+1]
             # Two pcd's offset is same
-            if PcdNow.PcdBinOffset == PcdNext.PcdBinOffset :
+            if PcdNow.PcdBinOffset == PcdNext.PcdBinOffset:
                 EdkLogger.error("BPDG", BuildToolError.ATTRIBUTE_GET_FAILURE,
-                                "The offset of %s at line: %s is same with %s at line: %s in file %s" % \
-                                (PcdNow.PcdCName, PcdNow.Lineno, PcdNext.PcdCName, PcdNext.Lineno, PcdNext.FileName),
+                                "The offset of %s at line: %s is same with %s at line: %s in file %s" %
+                                (PcdNow.PcdCName, PcdNow.Lineno, PcdNext.PcdCName,
+                                 PcdNext.Lineno, PcdNext.FileName),
                                 None)
 
             # Overlapped
-            if PcdNow.PcdBinOffset + PcdNow.PcdOccupySize > PcdNext.PcdBinOffset :
+            if PcdNow.PcdBinOffset + PcdNow.PcdOccupySize > PcdNext.PcdBinOffset:
                 EdkLogger.error("BPDG", BuildToolError.ATTRIBUTE_GET_FAILURE,
-                                "The offset of %s at line: %s is overlapped with %s at line: %s in file %s" % \
-                                (PcdNow.PcdCName, PcdNow.Lineno, PcdNext.PcdCName, PcdNext.Lineno, PcdNext.FileName),
+                                "The offset of %s at line: %s is overlapped with %s at line: %s in file %s" %
+                                (PcdNow.PcdCName, PcdNow.Lineno, PcdNext.PcdCName,
+                                 PcdNext.Lineno, PcdNext.FileName),
                                 None)
 
             # Has free space, raise a warning message
-            if PcdNow.PcdBinOffset + PcdNow.PcdOccupySize < PcdNext.PcdBinOffset :
+            if PcdNow.PcdBinOffset + PcdNow.PcdOccupySize < PcdNext.PcdBinOffset:
                 EdkLogger.warn("BPDG", BuildToolError.ATTRIBUTE_GET_FAILURE,
-                               "The offsets have free space of between %s at line: %s and %s at line: %s in file %s" % \
-                               (PcdNow.PcdCName, PcdNow.Lineno, PcdNext.PcdCName, PcdNext.Lineno, PcdNext.FileName),
-                                None)
+                               "The offsets have free space of between %s at line: %s and %s at line: %s in file %s" %
+                               (PcdNow.PcdCName, PcdNow.Lineno, PcdNext.PcdCName,
+                                PcdNext.Lineno, PcdNext.FileName),
+                               None)
             count += 1
 
-        LastOffset              = self.PcdFixedOffsetSizeList[0].PcdBinOffset
-        FixOffsetSizeListCount  = 0
-        lenOfList               = len(self.PcdFixedOffsetSizeList)
-        lenOfUnfixedList        = len(self.PcdUnknownOffsetList)
+        LastOffset = self.PcdFixedOffsetSizeList[0].PcdBinOffset
+        FixOffsetSizeListCount = 0
+        lenOfList = len(self.PcdFixedOffsetSizeList)
+        lenOfUnfixedList = len(self.PcdUnknownOffsetList)
 
         ##
         # Insert the un-fixed offset pcd's list into fixed offset pcd's list if has free space between those pcds.
         #
-        while (FixOffsetSizeListCount < lenOfList) :
+        while (FixOffsetSizeListCount < lenOfList):
 
-            eachFixedPcd     = self.PcdFixedOffsetSizeList[FixOffsetSizeListCount]
-            NowOffset        = eachFixedPcd.PcdBinOffset
+            eachFixedPcd = self.PcdFixedOffsetSizeList[FixOffsetSizeListCount]
+            NowOffset = eachFixedPcd.PcdBinOffset
 
             # Has free space
-            if LastOffset < NowOffset :
-                if lenOfUnfixedList != 0 :
+            if LastOffset < NowOffset:
+                if lenOfUnfixedList != 0:
                     countOfUnfixedList = 0
-                    while(countOfUnfixedList < lenOfUnfixedList) :
-                        eachUnfixedPcd      = self.PcdUnknownOffsetList[countOfUnfixedList]
-                        needFixPcdSize      = eachUnfixedPcd.PcdOccupySize
+                    while(countOfUnfixedList < lenOfUnfixedList):
+                        eachUnfixedPcd = self.PcdUnknownOffsetList[countOfUnfixedList]
+                        needFixPcdSize = eachUnfixedPcd.PcdOccupySize
                         # Not been fixed
-                        if eachUnfixedPcd.PcdOffset == TAB_STAR :
+                        if eachUnfixedPcd.PcdOffset == TAB_STAR:
                             if LastOffset % eachUnfixedPcd.Alignment != 0:
-                                LastOffset = (LastOffset // eachUnfixedPcd.Alignment + 1) * eachUnfixedPcd.Alignment
+                                LastOffset = (
+                                    LastOffset // eachUnfixedPcd.Alignment + 1) * eachUnfixedPcd.Alignment
                             # The offset un-fixed pcd can write into this free space
-                            if needFixPcdSize <= (NowOffset - LastOffset) :
+                            if needFixPcdSize <= (NowOffset - LastOffset):
                                 # Change the offset value of un-fixed pcd
-                                eachUnfixedPcd.PcdOffset    = str(hex(LastOffset))
+                                eachUnfixedPcd.PcdOffset = str(hex(LastOffset))
                                 eachUnfixedPcd.PcdBinOffset = LastOffset
                                 # Insert this pcd into fixed offset pcd list.
-                                self.PcdFixedOffsetSizeList.insert(FixOffsetSizeListCount, eachUnfixedPcd)
+                                self.PcdFixedOffsetSizeList.insert(
+                                    FixOffsetSizeListCount, eachUnfixedPcd)
 
                                 # Delete the item's offset that has been fixed and added into fixed offset list
-                                self.PcdUnknownOffsetList.pop(countOfUnfixedList)
+                                self.PcdUnknownOffsetList.pop(
+                                    countOfUnfixedList)
 
                                 # After item added, should enlarge the length of fixed pcd offset list
-                                lenOfList               += 1
-                                FixOffsetSizeListCount  += 1
+                                lenOfList += 1
+                                FixOffsetSizeListCount += 1
 
                                 # Decrease the un-fixed pcd offset list's length
-                                lenOfUnfixedList        -= 1
+                                lenOfUnfixedList -= 1
 
                                 # Modify the last offset value
-                                LastOffset              += needFixPcdSize
-                            else :
+                                LastOffset += needFixPcdSize
+                            else:
                                 # It can not insert into those two pcds, need to check still has other space can store it.
-                                LastOffset             = NowOffset + self.PcdFixedOffsetSizeList[FixOffsetSizeListCount].PcdOccupySize
+                                LastOffset = NowOffset + \
+                                    self.PcdFixedOffsetSizeList[FixOffsetSizeListCount].PcdOccupySize
                                 FixOffsetSizeListCount += 1
                                 break
 
                 # Set the FixOffsetSizeListCount = lenOfList for quit the loop
-                else :
+                else:
                     FixOffsetSizeListCount = lenOfList
 
             # No free space, smoothly connect with previous pcd.
-            elif LastOffset == NowOffset :
+            elif LastOffset == NowOffset:
                 LastOffset = NowOffset + eachFixedPcd.PcdOccupySize
                 FixOffsetSizeListCount += 1
             # Usually it will not enter into this thunk, if so, means it overlapped.
-            else :
+            else:
                 EdkLogger.error("BPDG", BuildToolError.ATTRIBUTE_NOT_AVAILABLE,
-                                "The offset value definition has overlapped at pcd: %s, its offset is: %s, in file: %s line: %s" % \
-                                (eachFixedPcd.PcdCName, eachFixedPcd.PcdOffset, eachFixedPcd.InputFileName, eachFixedPcd.Lineno),
+                                "The offset value definition has overlapped at pcd: %s, its offset is: %s, in file: %s line: %s" %
+                                (eachFixedPcd.PcdCName, eachFixedPcd.PcdOffset,
+                                 eachFixedPcd.InputFileName, eachFixedPcd.Lineno),
                                 None)
                 FixOffsetSizeListCount += 1
 
         # Continue to process the un-fixed offset pcd's list, add this time, just append them behind the fixed pcd's offset list.
-        lenOfUnfixedList  = len(self.PcdUnknownOffsetList)
-        lenOfList         = len(self.PcdFixedOffsetSizeList)
-        while (lenOfUnfixedList > 0) :
+        lenOfUnfixedList = len(self.PcdUnknownOffsetList)
+        lenOfList = len(self.PcdFixedOffsetSizeList)
+        while (lenOfUnfixedList > 0):
             # Still has items need to process
             # The last pcd instance
-            LastPcd    = self.PcdFixedOffsetSizeList[lenOfList-1]
+            LastPcd = self.PcdFixedOffsetSizeList[lenOfList-1]
             NeedFixPcd = self.PcdUnknownOffsetList[0]
 
             NeedFixPcd.PcdBinOffset = LastPcd.PcdBinOffset + LastPcd.PcdOccupySize
             if NeedFixPcd.PcdBinOffset % NeedFixPcd.Alignment != 0:
-                NeedFixPcd.PcdBinOffset = (NeedFixPcd.PcdBinOffset // NeedFixPcd.Alignment + 1) * NeedFixPcd.Alignment
+                NeedFixPcd.PcdBinOffset = (
+                    NeedFixPcd.PcdBinOffset // NeedFixPcd.Alignment + 1) * NeedFixPcd.Alignment
 
-            NeedFixPcd.PcdOffset    = str(hex(NeedFixPcd.PcdBinOffset))
+            NeedFixPcd.PcdOffset = str(hex(NeedFixPcd.PcdBinOffset))
 
             # Insert this pcd into fixed offset pcd list's tail.
             self.PcdFixedOffsetSizeList.insert(lenOfList, NeedFixPcd)
             # Delete the item's offset that has been fixed and added into fixed offset list
             self.PcdUnknownOffsetList.pop(0)
 
-            lenOfList          += 1
-            lenOfUnfixedList   -= 1
+            lenOfList += 1
+            lenOfUnfixedList -= 1
     ##
     # Write the final data into output files.
     #
-    def GenerateVpdFile (self, MapFileName, BinFileName):
-        #Open an VPD file to process
+
+    def GenerateVpdFile(self, MapFileName, BinFileName):
+        # Open an VPD file to process
 
         try:
             fVpdFile = open(BinFileName, "wb")
         except:
             # Open failed
-            EdkLogger.error("BPDG", BuildToolError.FILE_OPEN_FAILURE, "File open failed for %s" % self.VpdFileName, None)
+            EdkLogger.error("BPDG", BuildToolError.FILE_OPEN_FAILURE,
+                            "File open failed for %s" % self.VpdFileName, None)
 
-        try :
+        try:
             fMapFile = open(MapFileName, "w")
         except:
             # Open failed
-            EdkLogger.error("BPDG", BuildToolError.FILE_OPEN_FAILURE, "File open failed for %s" % self.MapFileName, None)
+            EdkLogger.error("BPDG", BuildToolError.FILE_OPEN_FAILURE,
+                            "File open failed for %s" % self.MapFileName, None)
 
         # Use a instance of BytesIO to cache data
         fStringIO = BytesIO()
 
         # Write the header of map file.
-        try :
-            fMapFile.write (st.MAP_FILE_COMMENT_TEMPLATE + "\n")
+        try:
+            fMapFile.write(st.MAP_FILE_COMMENT_TEMPLATE + "\n")
         except:
-            EdkLogger.error("BPDG", BuildToolError.FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the file been locked or using by other applications." % self.MapFileName, None)
+            EdkLogger.error("BPDG", BuildToolError.FILE_WRITE_FAILURE,
+                            "Write data to file %s failed, please check whether the file been locked or using by other applications." % self.MapFileName, None)
 
-        for eachPcd in self.PcdFixedOffsetSizeList  :
+        for eachPcd in self.PcdFixedOffsetSizeList:
             # write map file
-            try :
-                fMapFile.write("%s | %s | %s | %s | %s  \n" % (eachPcd.PcdCName, eachPcd.SkuId, eachPcd.PcdOffset, eachPcd.PcdSize, eachPcd.PcdUnpackValue))
+            try:
+                fMapFile.write("%s | %s | %s | %s | %s  \n" % (
+                    eachPcd.PcdCName, eachPcd.SkuId, eachPcd.PcdOffset, eachPcd.PcdSize, eachPcd.PcdUnpackValue))
             except:
-                EdkLogger.error("BPDG", BuildToolError.FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the file been locked or using by other applications." % self.MapFileName, None)
+                EdkLogger.error("BPDG", BuildToolError.FILE_WRITE_FAILURE,
+                                "Write data to file %s failed, please check whether the file been locked or using by other applications." % self.MapFileName, None)
 
             # Write Vpd binary file
-            fStringIO.seek (eachPcd.PcdBinOffset)
+            fStringIO.seek(eachPcd.PcdBinOffset)
             if isinstance(eachPcd.PcdValue, list):
                 for i in range(len(eachPcd.PcdValue)):
                     Value = eachPcd.PcdValue[i:i + 1]
@@ -676,14 +704,14 @@ class GenVPD :
                     else:
                         fStringIO.write(bytes(Value))
             else:
-                fStringIO.write (eachPcd.PcdValue)
+                fStringIO.write(eachPcd.PcdValue)
 
-        try :
-            fVpdFile.write (fStringIO.getvalue())
+        try:
+            fVpdFile.write(fStringIO.getvalue())
         except:
-            EdkLogger.error("BPDG", BuildToolError.FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the file been locked or using by other applications." % self.VpdFileName, None)
-
-        fStringIO.close ()
-        fVpdFile.close ()
-        fMapFile.close ()
+            EdkLogger.error("BPDG", BuildToolError.FILE_WRITE_FAILURE,
+                            "Write data to file %s failed, please check whether the file been locked or using by other applications." % self.VpdFileName, None)
 
+        fStringIO.close()
+        fVpdFile.close()
+        fMapFile.close()
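
The GenVPD hunks above only re-wrap the offset/alignment arithmetic across lines; the rounding step itself is unchanged. As a minimal standalone sketch of that step (the helper name here is illustrative only and does not exist in BPDG):

def align_up(offset, alignment):
    # Round offset up to the next multiple of alignment; this mirrors the
    # "(offset // alignment + 1) * alignment" expression re-wrapped above.
    if offset % alignment != 0:
        offset = (offset // alignment + 1) * alignment
    return offset

assert align_up(0x13, 4) == 0x14
assert align_up(0x14, 4) == 0x14  # already-aligned offsets are left unchanged
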
diff --git a/BaseTools/Source/Python/BPDG/StringTable.py b/BaseTools/Source/Python/BPDG/StringTable.py
index cd8b2d732645..b142ebc60129 100644
--- a/BaseTools/Source/Python/BPDG/StringTable.py
+++ b/BaseTools/Source/Python/BPDG/StringTable.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define strings used in the BPDG tool
 #
 # Copyright (c) 2010 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -6,9 +6,9 @@
 ##
 
 
-#string table starts here...
+# string table starts here...
 
-#strings are classified as following types
+# strings are classified as following types
 #    MSG_...: it is a message string
 #    ERR_...: it is a error string
 #    WRN_...: it is a warning string
@@ -19,7 +19,7 @@
 #    XRC_...: it is a user visible string from xrc file
 
 MAP_FILE_COMMENT_TEMPLATE = \
-"""
+    """
 ## @file
 #
 #  THIS IS AUTO-GENERATED FILE BY BPDG TOOLS AND PLEASE DO NOT MAKE MODIFICATION.
@@ -38,12 +38,12 @@ MAP_FILE_COMMENT_TEMPLATE = \
 """
 
 
-
-LBL_BPDG_LONG_UNI           = (u"Intel(r) Binary Product Data Generation Tool (Intel(r) BPDG)")
-LBL_BPDG_VERSION            = (u"1.0")
-LBL_BPDG_USAGE              = \
-(
-"""BPDG options -o Filename.bin -m Filename.map Filename.txt
+LBL_BPDG_LONG_UNI = (
+    u"Intel(r) Binary Product Data Generation Tool (Intel(r) BPDG)")
+LBL_BPDG_VERSION = (u"1.0")
+LBL_BPDG_USAGE = \
+    (
+        """BPDG options -o Filename.bin -m Filename.map Filename.txt
 Copyright (c) 2010 - 2018, Intel Corporation All Rights Reserved.
 
   Intel(r) Binary Product Data Generation Tool (Intel(r) BPDG)
@@ -56,17 +56,20 @@ Required Flags:
             the mapping of Pcd name, offset, datum size and value derived
             from the input file and any automatic calculations.
 """
-)
+    )
 
-MSG_OPTION_HELP             = ("Show this help message and exit.")
-MSG_OPTION_DEBUG_LEVEL      = ("Print DEBUG statements, where DEBUG_LEVEL is 0-9.")
-MSG_OPTION_VERBOSE          = ("Print informational statements.")
-MSG_OPTION_QUIET            = ("Returns the exit code and will display only error messages.")
-MSG_OPTION_VPD_FILENAME     = ("Specify the file name for the VPD binary file.")
-MSG_OPTION_MAP_FILENAME     = ("Generate file name for consumption during the build that contains the mapping of Pcd name, offset, datum size and value derived from the input file and any automatic calculations.")
-MSG_OPTION_FORCE            = ("Will force overwriting existing output files rather than returning an error message.")
+MSG_OPTION_HELP = ("Show this help message and exit.")
+MSG_OPTION_DEBUG_LEVEL = ("Print DEBUG statements, where DEBUG_LEVEL is 0-9.")
+MSG_OPTION_VERBOSE = ("Print informational statements.")
+MSG_OPTION_QUIET = (
+    "Returns the exit code and will display only error messages.")
+MSG_OPTION_VPD_FILENAME = ("Specify the file name for the VPD binary file.")
+MSG_OPTION_MAP_FILENAME = (
+    "Generate file name for consumption during the build that contains the mapping of Pcd name, offset, datum size and value derived from the input file and any automatic calculations.")
+MSG_OPTION_FORCE = (
+    "Will force overwriting existing output files rather than returning an error message.")
 
-ERR_INVALID_DEBUG_LEVEL     = ("Invalid level for debug message. Only "
-                                "'DEBUG', 'INFO', 'WARNING', 'ERROR', "
-                                "'CRITICAL' are supported for debugging "
-                                "messages.")
+ERR_INVALID_DEBUG_LEVEL = ("Invalid level for debug message. Only "
+                           "'DEBUG', 'INFO', 'WARNING', 'ERROR', "
+                           "'CRITICAL' are supported for debugging "
+                           "messages.")
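
The StringTable.py hunks above indent the continuation lines that open the triple-quoted constants. That whitespace sits before the opening quotes, so the string values are unchanged; a small self-contained check (the names and text below are placeholders, not the real constants):

OLD_STYLE = \
"""sample
"""
NEW_STYLE = \
    """sample
"""
assert OLD_STYLE == NEW_STYLE  # indentation before the opening quotes is not part of the literal
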
diff --git a/BaseTools/Source/Python/BPDG/__init__.py b/BaseTools/Source/Python/BPDG/__init__.py
index ea120ab9c743..20f8a593e315 100644
--- a/BaseTools/Source/Python/BPDG/__init__.py
+++ b/BaseTools/Source/Python/BPDG/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'BPDG' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/Capsule/GenerateCapsule.py b/BaseTools/Source/Python/Capsule/GenerateCapsule.py
index 35435946c664..f44bc42022cb 100644
--- a/BaseTools/Source/Python/Capsule/GenerateCapsule.py
+++ b/BaseTools/Source/Python/Capsule/GenerateCapsule.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Generate a capsule.
 #
 # This tool generates a UEFI Capsule around an FMP Capsule. The capsule payload
@@ -29,20 +29,21 @@ import shutil
 import platform
 import json
 from Common.Uefi.Capsule.UefiCapsuleHeader import UefiCapsuleHeaderClass
-from Common.Uefi.Capsule.FmpCapsuleHeader  import FmpCapsuleHeaderClass
-from Common.Uefi.Capsule.FmpAuthHeader     import FmpAuthHeaderClass
+from Common.Uefi.Capsule.FmpCapsuleHeader import FmpCapsuleHeaderClass
+from Common.Uefi.Capsule.FmpAuthHeader import FmpAuthHeaderClass
 from Common.Uefi.Capsule.CapsuleDependency import CapsuleDependencyClass
-from Common.Edk2.Capsule.FmpPayloadHeader  import FmpPayloadHeaderClass
+from Common.Edk2.Capsule.FmpPayloadHeader import FmpPayloadHeaderClass
 
 #
 # Globals for help information
 #
-__prog__        = 'GenerateCapsule'
-__version__     = '0.10'
-__copyright__   = 'Copyright (c) 2022, Intel Corporation. All rights reserved.'
+__prog__ = 'GenerateCapsule'
+__version__ = '0.10'
+__copyright__ = 'Copyright (c) 2022, Intel Corporation. All rights reserved.'
 __description__ = 'Generate a capsule.\n'
 
-def SignPayloadSignTool (Payload, ToolPath, PfxFile, SubjectName, Verbose = False):
+
+def SignPayloadSignTool(Payload, ToolPath, PfxFile, SubjectName, Verbose=False):
     #
     # Create a temporary directory
     #
@@ -51,17 +52,18 @@ def SignPayloadSignTool (Payload, ToolPath, PfxFile, SubjectName, Verbose = Fals
     #
     # Generate temp file name for the payload contents
     #
-    TempFileName = os.path.join (TempDirectoryName, 'Payload.bin')
+    TempFileName = os.path.join(TempDirectoryName, 'Payload.bin')
 
     #
     # Create temporary payload file for signing
     #
     try:
-        with open (TempFileName, 'wb') as File:
-            File.write (Payload)
+        with open(TempFileName, 'wb') as File:
+            File.write(Payload)
     except:
-        shutil.rmtree (TempDirectoryName)
-        raise ValueError ('GenerateCapsule: error: can not write temporary payload file.')
+        shutil.rmtree(TempDirectoryName)
+        raise ValueError(
+            'GenerateCapsule: error: can not write temporary payload file.')
 
     #
     # Build signtool command
@@ -69,79 +71,89 @@ def SignPayloadSignTool (Payload, ToolPath, PfxFile, SubjectName, Verbose = Fals
     if ToolPath is None:
         ToolPath = ''
     Command = ''
-    Command = Command + '"{Path}" '.format (Path = os.path.join (ToolPath, 'signtool.exe'))
+    Command = Command + \
+        '"{Path}" '.format(Path=os.path.join(ToolPath, 'signtool.exe'))
     Command = Command + 'sign /fd sha256 /p7ce DetachedSignedData /p7co 1.2.840.113549.1.7.2 '
-    Command = Command + '/p7 {TempDir} '.format (TempDir = TempDirectoryName)
+    Command = Command + '/p7 {TempDir} '.format(TempDir=TempDirectoryName)
     if PfxFile is not None:
-        Command = Command + '/f {PfxFile} '.format (PfxFile = PfxFile)
+        Command = Command + '/f {PfxFile} '.format(PfxFile=PfxFile)
     if SubjectName is not None:
-        Command = Command + '/n {SubjectName} '.format (SubjectName = SubjectName)
+        Command = Command + '/n {SubjectName} '.format(SubjectName=SubjectName)
     Command = Command + TempFileName
     if Verbose:
-        print (Command)
+        print(Command)
 
     #
     # Sign the input file using the specified private key
     #
     try:
-        Process = subprocess.Popen (Command, stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE, shell = True)
+        Process = subprocess.Popen(Command, stdin=subprocess.PIPE,
+                                   stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
         Result = Process.communicate('')
     except:
-        shutil.rmtree (TempDirectoryName)
-        raise ValueError ('GenerateCapsule: error: can not run signtool.')
+        shutil.rmtree(TempDirectoryName)
+        raise ValueError('GenerateCapsule: error: can not run signtool.')
 
     if Process.returncode != 0:
-        shutil.rmtree (TempDirectoryName)
-        print (Result[1].decode())
-        raise ValueError ('GenerateCapsule: error: signtool failed.')
+        shutil.rmtree(TempDirectoryName)
+        print(Result[1].decode())
+        raise ValueError('GenerateCapsule: error: signtool failed.')
 
     #
     # Read the signature from the generated output file
     #
     try:
-        with open (TempFileName + '.p7', 'rb') as File:
-            Signature = File.read ()
+        with open(TempFileName + '.p7', 'rb') as File:
+            Signature = File.read()
     except:
-        shutil.rmtree (TempDirectoryName)
-        raise ValueError ('GenerateCapsule: error: can not read signature file.')
+        shutil.rmtree(TempDirectoryName)
+        raise ValueError(
+            'GenerateCapsule: error: can not read signature file.')
 
-    shutil.rmtree (TempDirectoryName)
+    shutil.rmtree(TempDirectoryName)
     return Signature
 
-def VerifyPayloadSignTool (Payload, CertData, ToolPath, PfxFile, SubjectName, Verbose = False):
-    print ('signtool verify is not supported.')
-    raise ValueError ('GenerateCapsule: error: signtool verify is not supported.')
 
-def SignPayloadOpenSsl (Payload, ToolPath, SignerPrivateCertFile, OtherPublicCertFile, TrustedPublicCertFile, Verbose = False):
+def VerifyPayloadSignTool(Payload, CertData, ToolPath, PfxFile, SubjectName, Verbose=False):
+    print('signtool verify is not supported.')
+    raise ValueError(
+        'GenerateCapsule: error: signtool verify is not supported.')
+
+
+def SignPayloadOpenSsl(Payload, ToolPath, SignerPrivateCertFile, OtherPublicCertFile, TrustedPublicCertFile, Verbose=False):
     #
     # Build openssl command
     #
     if ToolPath is None:
         ToolPath = ''
     Command = ''
-    Command = Command + '"{Path}" '.format (Path = os.path.join (ToolPath, 'openssl'))
+    Command = Command + \
+        '"{Path}" '.format(Path=os.path.join(ToolPath, 'openssl'))
     Command = Command + 'smime -sign -binary -outform DER -md sha256 '
-    Command = Command + '-signer "{Private}" -certfile "{Public}"'.format (Private = SignerPrivateCertFile, Public = OtherPublicCertFile)
+    Command = Command + '-signer "{Private}" -certfile "{Public}"'.format(
+        Private=SignerPrivateCertFile, Public=OtherPublicCertFile)
     if Verbose:
-        print (Command)
+        print(Command)
 
     #
     # Sign the input file using the specified private key and capture signature from STDOUT
     #
     try:
-        Process = subprocess.Popen (Command, stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE, shell = True)
-        Result = Process.communicate(input = Payload)
+        Process = subprocess.Popen(Command, stdin=subprocess.PIPE,
+                                   stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
+        Result = Process.communicate(input=Payload)
         Signature = Result[0]
     except:
-        raise ValueError ('GenerateCapsule: error: can not run openssl.')
+        raise ValueError('GenerateCapsule: error: can not run openssl.')
 
     if Process.returncode != 0:
-        print (Result[1].decode())
-        raise ValueError ('GenerateCapsule: error: openssl failed.')
+        print(Result[1].decode())
+        raise ValueError('GenerateCapsule: error: openssl failed.')
 
     return Signature
 
-def VerifyPayloadOpenSsl (Payload, CertData, ToolPath, SignerPrivateCertFile, OtherPublicCertFile, TrustedPublicCertFile, Verbose = False):
+
+def VerifyPayloadOpenSsl(Payload, CertData, ToolPath, SignerPrivateCertFile, OtherPublicCertFile, TrustedPublicCertFile, Verbose=False):
     #
     # Create a temporary directory
     #
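
The SignPayloadOpenSsl hunks above re-wrap a subprocess call whose shape is: build an "openssl smime -sign" command line, pipe the payload in on stdin, and read the detached DER signature from stdout. A condensed sketch of that pattern on its own, assuming an openssl binary on PATH and placeholder certificate paths (not a drop-in replacement for the tool's function):

import subprocess

def sign_with_openssl(payload, signer_cert, other_cert, openssl='openssl'):
    # payload is bytes; it goes in via stdin and the detached DER
    # signature comes back on stdout.
    command = ('"{openssl}" smime -sign -binary -outform DER -md sha256 '
               '-signer "{signer}" -certfile "{certs}"').format(
                   openssl=openssl, signer=signer_cert, certs=other_cert)
    process = subprocess.Popen(command, stdin=subprocess.PIPE,
                               stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                               shell=True)
    signature, errors = process.communicate(input=payload)
    if process.returncode != 0:
        raise ValueError(errors.decode())
    return signature
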
@@ -150,17 +162,18 @@ def VerifyPayloadOpenSsl (Payload, CertData, ToolPath, SignerPrivateCertFile, Ot
     #
     # Generate temp file name for the payload contents
     #
-    TempFileName = os.path.join (TempDirectoryName, 'Payload.bin')
+    TempFileName = os.path.join(TempDirectoryName, 'Payload.bin')
 
     #
     # Create temporary payload file for verification
     #
     try:
-        with open (TempFileName, 'wb') as File:
-            File.write (Payload)
+        with open(TempFileName, 'wb') as File:
+            File.write(Payload)
     except:
-        shutil.rmtree (TempDirectoryName)
-        raise ValueError ('GenerateCapsule: error: can not write temporary payload file.')
+        shutil.rmtree(TempDirectoryName)
+        raise ValueError(
+            'GenerateCapsule: error: can not write temporary payload file.')
 
     #
     # Build openssl command
@@ -168,30 +181,34 @@ def VerifyPayloadOpenSsl (Payload, CertData, ToolPath, SignerPrivateCertFile, Ot
     if ToolPath is None:
         ToolPath = ''
     Command = ''
-    Command = Command + '"{Path}" '.format (Path = os.path.join (ToolPath, 'openssl'))
+    Command = Command + \
+        '"{Path}" '.format(Path=os.path.join(ToolPath, 'openssl'))
     Command = Command + 'smime -verify -inform DER '
-    Command = Command + '-content {Content} -CAfile "{Public}"'.format (Content = TempFileName, Public = TrustedPublicCertFile)
+    Command = Command + '-content {Content} -CAfile "{Public}"'.format(
+        Content=TempFileName, Public=TrustedPublicCertFile)
     if Verbose:
-        print (Command)
+        print(Command)
 
     #
     # Verify signature
     #
     try:
-        Process = subprocess.Popen (Command, stdin = subprocess.PIPE, stdout = subprocess.PIPE, stderr = subprocess.PIPE, shell = True)
-        Result = Process.communicate(input = CertData)
+        Process = subprocess.Popen(Command, stdin=subprocess.PIPE,
+                                   stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
+        Result = Process.communicate(input=CertData)
     except:
-        shutil.rmtree (TempDirectoryName)
-        raise ValueError ('GenerateCapsule: error: can not run openssl.')
+        shutil.rmtree(TempDirectoryName)
+        raise ValueError('GenerateCapsule: error: can not run openssl.')
 
     if Process.returncode != 0:
-        shutil.rmtree (TempDirectoryName)
-        print (Result[1].decode())
-        raise ValueError ('GenerateCapsule: error: openssl failed.')
+        shutil.rmtree(TempDirectoryName)
+        print(Result[1].decode())
+        raise ValueError('GenerateCapsule: error: openssl failed.')
 
-    shutil.rmtree (TempDirectoryName)
+    shutil.rmtree(TempDirectoryName)
     return Payload
 
+
 if __name__ == '__main__':
     def convert_arg_line_to_args(arg_line):
         for arg in arg_line.split():
@@ -199,234 +216,286 @@ if __name__ == '__main__':
                 continue
             yield arg
 
-    def ValidateUnsignedInteger (Argument):
+    def ValidateUnsignedInteger(Argument):
         try:
-            Value = int (Argument, 0)
+            Value = int(Argument, 0)
         except:
-            Message = '{Argument} is not a valid integer value.'.format (Argument = Argument)
-            raise argparse.ArgumentTypeError (Message)
+            Message = '{Argument} is not a valid integer value.'.format(
+                Argument=Argument)
+            raise argparse.ArgumentTypeError(Message)
         if Value < 0:
-            Message = '{Argument} is a negative value.'.format (Argument = Argument)
-            raise argparse.ArgumentTypeError (Message)
+            Message = '{Argument} is a negative value.'.format(
+                Argument=Argument)
+            raise argparse.ArgumentTypeError(Message)
         return Value
 
-    def ValidateRegistryFormatGuid (Argument):
+    def ValidateRegistryFormatGuid(Argument):
         try:
-            Value = uuid.UUID (Argument)
+            Value = uuid.UUID(Argument)
         except:
-            Message = '{Argument} is not a valid registry format GUID value.'.format (Argument = Argument)
-            raise argparse.ArgumentTypeError (Message)
+            Message = '{Argument} is not a valid registry format GUID value.'.format(
+                Argument=Argument)
+            raise argparse.ArgumentTypeError(Message)
         return Value
 
-    def ConvertJsonValue (Config, FieldName, Convert, Required = True, Default = None, Open = False):
+    def ConvertJsonValue(Config, FieldName, Convert, Required=True, Default=None, Open=False):
         if FieldName not in Config:
             if Required:
-                print ('GenerateCapsule: error: Payload descriptor invalid syntax. Could not find {Key} in payload descriptor.'.format(Key = FieldName))
-                sys.exit (1)
+                print('GenerateCapsule: error: Payload descriptor invalid syntax. Could not find {Key} in payload descriptor.'.format(
+                    Key=FieldName))
+                sys.exit(1)
             return Default
         try:
-            Value = Convert (Config[FieldName])
+            Value = Convert(Config[FieldName])
         except:
-            print ('GenerateCapsule: error: {Key} in payload descriptor has invalid syntax.'.format (Key = FieldName))
-            sys.exit (1)
+            print('GenerateCapsule: error: {Key} in payload descriptor has invalid syntax.'.format(
+                Key=FieldName))
+            sys.exit(1)
         if Open:
             try:
-                Value = open (Value, "rb")
+                Value = open(Value, "rb")
             except:
-                print ('GenerateCapsule: error: can not open file {File}'.format (File = FieldName))
-                sys.exit (1)
+                print('GenerateCapsule: error: can not open file {File}'.format(
+                    File=FieldName))
+                sys.exit(1)
         return Value
 
-    def DecodeJsonFileParse (Json):
+    def DecodeJsonFileParse(Json):
         if 'Payloads' not in Json:
-            print ('GenerateCapsule: error "Payloads" section not found in JSON file {File}'.format (File = args.JsonFile.name))
-            sys.exit (1)
+            print('GenerateCapsule: error "Payloads" section not found in JSON file {File}'.format(
+                File=args.JsonFile.name))
+            sys.exit(1)
         for Config in Json['Payloads']:
             #
             # Parse fields from JSON
             #
-            PayloadFile                  = ConvertJsonValue (Config, 'Payload', os.path.expandvars, Required = False)
-            Guid                         = ConvertJsonValue (Config, 'Guid', ValidateRegistryFormatGuid, Required = False)
-            FwVersion                    = ConvertJsonValue (Config, 'FwVersion', ValidateUnsignedInteger, Required = False)
-            LowestSupportedVersion       = ConvertJsonValue (Config, 'LowestSupportedVersion', ValidateUnsignedInteger, Required = False)
-            HardwareInstance             = ConvertJsonValue (Config, 'HardwareInstance', ValidateUnsignedInteger, Required = False, Default = 0)
-            MonotonicCount               = ConvertJsonValue (Config, 'MonotonicCount', ValidateUnsignedInteger, Required = False, Default = 0)
-            SignToolPfxFile              = ConvertJsonValue (Config, 'SignToolPfxFile', os.path.expandvars, Required = False, Default = None, Open = True)
-            SignToolSubjectName          = ConvertJsonValue (Config, 'SignToolSubjectName', os.path.expandvars, Required = False, Default = None, Open = True)
-            OpenSslSignerPrivateCertFile = ConvertJsonValue (Config, 'OpenSslSignerPrivateCertFile', os.path.expandvars, Required = False, Default = None, Open = True)
-            OpenSslOtherPublicCertFile   = ConvertJsonValue (Config, 'OpenSslOtherPublicCertFile', os.path.expandvars, Required = False, Default = None, Open = True)
-            OpenSslTrustedPublicCertFile = ConvertJsonValue (Config, 'OpenSslTrustedPublicCertFile', os.path.expandvars, Required = False, Default = None, Open = True)
-            SigningToolPath              = ConvertJsonValue (Config, 'SigningToolPath', os.path.expandvars, Required = False, Default = None)
-            UpdateImageIndex             = ConvertJsonValue (Config, 'UpdateImageIndex', ValidateUnsignedInteger, Required = False, Default = 1)
+            PayloadFile = ConvertJsonValue(
+                Config, 'Payload', os.path.expandvars, Required=False)
+            Guid = ConvertJsonValue(
+                Config, 'Guid', ValidateRegistryFormatGuid, Required=False)
+            FwVersion = ConvertJsonValue(
+                Config, 'FwVersion', ValidateUnsignedInteger, Required=False)
+            LowestSupportedVersion = ConvertJsonValue(
+                Config, 'LowestSupportedVersion', ValidateUnsignedInteger, Required=False)
+            HardwareInstance = ConvertJsonValue(
+                Config, 'HardwareInstance', ValidateUnsignedInteger, Required=False, Default=0)
+            MonotonicCount = ConvertJsonValue(
+                Config, 'MonotonicCount', ValidateUnsignedInteger, Required=False, Default=0)
+            SignToolPfxFile = ConvertJsonValue(
+                Config, 'SignToolPfxFile', os.path.expandvars, Required=False, Default=None, Open=True)
+            SignToolSubjectName = ConvertJsonValue(
+                Config, 'SignToolSubjectName', os.path.expandvars, Required=False, Default=None, Open=True)
+            OpenSslSignerPrivateCertFile = ConvertJsonValue(
+                Config, 'OpenSslSignerPrivateCertFile', os.path.expandvars, Required=False, Default=None, Open=True)
+            OpenSslOtherPublicCertFile = ConvertJsonValue(
+                Config, 'OpenSslOtherPublicCertFile', os.path.expandvars, Required=False, Default=None, Open=True)
+            OpenSslTrustedPublicCertFile = ConvertJsonValue(
+                Config, 'OpenSslTrustedPublicCertFile', os.path.expandvars, Required=False, Default=None, Open=True)
+            SigningToolPath = ConvertJsonValue(
+                Config, 'SigningToolPath', os.path.expandvars, Required=False, Default=None)
+            UpdateImageIndex = ConvertJsonValue(
+                Config, 'UpdateImageIndex', ValidateUnsignedInteger, Required=False, Default=1)
 
-            PayloadDescriptorList.append (PayloadDescriptor (
-                                            PayloadFile,
-                                            Guid,
-                                            FwVersion,
-                                            LowestSupportedVersion,
-                                            MonotonicCount,
-                                            HardwareInstance,
-                                            UpdateImageIndex,
-                                            SignToolPfxFile,
-                                            SignToolSubjectName,
-                                            OpenSslSignerPrivateCertFile,
-                                            OpenSslOtherPublicCertFile,
-                                            OpenSslTrustedPublicCertFile,
-                                            SigningToolPath
-                                            ))
+            PayloadDescriptorList.append(PayloadDescriptor(
+                PayloadFile,
+                Guid,
+                FwVersion,
+                LowestSupportedVersion,
+                MonotonicCount,
+                HardwareInstance,
+                UpdateImageIndex,
+                SignToolPfxFile,
+                SignToolSubjectName,
+                OpenSslSignerPrivateCertFile,
+                OpenSslOtherPublicCertFile,
+                OpenSslTrustedPublicCertFile,
+                SigningToolPath
+            ))
 
-    def EncodeJsonFileParse (Json):
+    def EncodeJsonFileParse(Json):
         if 'EmbeddedDrivers' not in Json:
-            print ('GenerateCapsule: warning "EmbeddedDrivers" section not found in JSON file {File}'.format (File = args.JsonFile.name))
+            print('GenerateCapsule: warning "EmbeddedDrivers" section not found in JSON file {File}'.format(
+                File=args.JsonFile.name))
         else:
             for Config in Json['EmbeddedDrivers']:
-                EmbeddedDriverFile      = ConvertJsonValue(Config, 'Driver', os.path.expandvars, Open = True)
+                EmbeddedDriverFile = ConvertJsonValue(
+                    Config, 'Driver', os.path.expandvars, Open=True)
                 #
-                #Read EmbeddedDriver file
+                # Read EmbeddedDriver file
                 #
                 try:
                     if args.Verbose:
-                        print ('Read EmbeddedDriver file {File}'.format (File = EmbeddedDriverFile.name))
+                        print('Read EmbeddedDriver file {File}'.format(
+                            File=EmbeddedDriverFile.name))
                     Driver = EmbeddedDriverFile.read()
                 except:
-                    print ('GenerateCapsule: error: can not read EmbeddedDriver file {File}'.format (File = EmbeddedDriverFile.name))
-                    sys.exit (1)
-                EmbeddedDriverDescriptorList.append (Driver)
+                    print('GenerateCapsule: error: can not read EmbeddedDriver file {File}'.format(
+                        File=EmbeddedDriverFile.name))
+                    sys.exit(1)
+                EmbeddedDriverDescriptorList.append(Driver)
 
         if 'Payloads' not in Json:
-            print ('GenerateCapsule: error: "Payloads" section not found in JSON file {File}'.format (File = args.JsonFile.name))
-            sys.exit (1)
+            print('GenerateCapsule: error: "Payloads" section not found in JSON file {File}'.format(
+                File=args.JsonFile.name))
+            sys.exit(1)
         for Config in Json['Payloads']:
             #
             # Parse fields from JSON
             #
-            PayloadFile                  = ConvertJsonValue (Config, 'Payload', os.path.expandvars, Open = True)
-            Guid                         = ConvertJsonValue (Config, 'Guid', ValidateRegistryFormatGuid)
-            FwVersion                    = ConvertJsonValue (Config, 'FwVersion', ValidateUnsignedInteger)
-            LowestSupportedVersion       = ConvertJsonValue (Config, 'LowestSupportedVersion', ValidateUnsignedInteger)
-            HardwareInstance             = ConvertJsonValue (Config, 'HardwareInstance', ValidateUnsignedInteger, Required = False, Default = 0)
-            UpdateImageIndex             = ConvertJsonValue (Config, 'UpdateImageIndex', ValidateUnsignedInteger, Required = False, Default = 1)
-            MonotonicCount               = ConvertJsonValue (Config, 'MonotonicCount', ValidateUnsignedInteger, Required = False, Default = 0)
-            SignToolPfxFile              = ConvertJsonValue (Config, 'SignToolPfxFile', os.path.expandvars, Required = False, Default = None, Open = True)
-            SignToolSubjectName          = ConvertJsonValue (Config, 'SignToolSubjectName', os.path.expandvars, Required = False, Default = None, Open = True)
-            OpenSslSignerPrivateCertFile = ConvertJsonValue (Config, 'OpenSslSignerPrivateCertFile', os.path.expandvars, Required = False, Default = None, Open = True)
-            OpenSslOtherPublicCertFile   = ConvertJsonValue (Config, 'OpenSslOtherPublicCertFile', os.path.expandvars, Required = False, Default = None, Open = True)
-            OpenSslTrustedPublicCertFile = ConvertJsonValue (Config, 'OpenSslTrustedPublicCertFile', os.path.expandvars, Required = False, Default = None, Open = True)
-            SigningToolPath              = ConvertJsonValue (Config, 'SigningToolPath', os.path.expandvars, Required = False, Default = None)
-            DepexExp                     = ConvertJsonValue (Config, 'Dependencies', str, Required = False, Default = None)
+            PayloadFile = ConvertJsonValue(
+                Config, 'Payload', os.path.expandvars, Open=True)
+            Guid = ConvertJsonValue(Config, 'Guid', ValidateRegistryFormatGuid)
+            FwVersion = ConvertJsonValue(
+                Config, 'FwVersion', ValidateUnsignedInteger)
+            LowestSupportedVersion = ConvertJsonValue(
+                Config, 'LowestSupportedVersion', ValidateUnsignedInteger)
+            HardwareInstance = ConvertJsonValue(
+                Config, 'HardwareInstance', ValidateUnsignedInteger, Required=False, Default=0)
+            UpdateImageIndex = ConvertJsonValue(
+                Config, 'UpdateImageIndex', ValidateUnsignedInteger, Required=False, Default=1)
+            MonotonicCount = ConvertJsonValue(
+                Config, 'MonotonicCount', ValidateUnsignedInteger, Required=False, Default=0)
+            SignToolPfxFile = ConvertJsonValue(
+                Config, 'SignToolPfxFile', os.path.expandvars, Required=False, Default=None, Open=True)
+            SignToolSubjectName = ConvertJsonValue(
+                Config, 'SignToolSubjectName', os.path.expandvars, Required=False, Default=None, Open=True)
+            OpenSslSignerPrivateCertFile = ConvertJsonValue(
+                Config, 'OpenSslSignerPrivateCertFile', os.path.expandvars, Required=False, Default=None, Open=True)
+            OpenSslOtherPublicCertFile = ConvertJsonValue(
+                Config, 'OpenSslOtherPublicCertFile', os.path.expandvars, Required=False, Default=None, Open=True)
+            OpenSslTrustedPublicCertFile = ConvertJsonValue(
+                Config, 'OpenSslTrustedPublicCertFile', os.path.expandvars, Required=False, Default=None, Open=True)
+            SigningToolPath = ConvertJsonValue(
+                Config, 'SigningToolPath', os.path.expandvars, Required=False, Default=None)
+            DepexExp = ConvertJsonValue(
+                Config, 'Dependencies', str, Required=False, Default=None)
 
             #
             # Read binary input file
             #
             try:
                 if args.Verbose:
-                    print ('Read binary input file {File}'.format (File = PayloadFile.name))
+                    print('Read binary input file {File}'.format(
+                        File=PayloadFile.name))
                 Payload = PayloadFile.read()
-                PayloadFile.close ()
+                PayloadFile.close()
             except:
-                print ('GenerateCapsule: error: can not read binary input file {File}'.format (File = PayloadFile.name))
-                sys.exit (1)
-            PayloadDescriptorList.append (PayloadDescriptor (
-                                            Payload,
-                                            Guid,
-                                            FwVersion,
-                                            LowestSupportedVersion,
-                                            MonotonicCount,
-                                            HardwareInstance,
-                                            UpdateImageIndex,
-                                            SignToolPfxFile,
-                                            SignToolSubjectName,
-                                            OpenSslSignerPrivateCertFile,
-                                            OpenSslOtherPublicCertFile,
-                                            OpenSslTrustedPublicCertFile,
-                                            SigningToolPath,
-                                            DepexExp
-                                            ))
+                print('GenerateCapsule: error: can not read binary input file {File}'.format(
+                    File=PayloadFile.name))
+                sys.exit(1)
+            PayloadDescriptorList.append(PayloadDescriptor(
+                Payload,
+                Guid,
+                FwVersion,
+                LowestSupportedVersion,
+                MonotonicCount,
+                HardwareInstance,
+                UpdateImageIndex,
+                SignToolPfxFile,
+                SignToolSubjectName,
+                OpenSslSignerPrivateCertFile,
+                OpenSslOtherPublicCertFile,
+                OpenSslTrustedPublicCertFile,
+                SigningToolPath,
+                DepexExp
+            ))
 
-    def GenerateOutputJson (PayloadJsonDescriptorList):
+    def GenerateOutputJson(PayloadJsonDescriptorList):
         PayloadJson = {
-                          "Payloads" : [
-                              {
-                                  "Guid": str(PayloadDescriptor.Guid).upper(),
-                                  "FwVersion": str(PayloadDescriptor.FwVersion),
-                                  "LowestSupportedVersion": str(PayloadDescriptor.LowestSupportedVersion),
-                                  "MonotonicCount": str(PayloadDescriptor.MonotonicCount),
-                                  "Payload": PayloadDescriptor.Payload,
-                                  "HardwareInstance": str(PayloadDescriptor.HardwareInstance),
-                                  "UpdateImageIndex": str(PayloadDescriptor.UpdateImageIndex),
-                                  "SignToolPfxFile": str(PayloadDescriptor.SignToolPfxFile),
-                                  "SignToolSubjectName": str(PayloadDescriptor.SignToolSubjectName),
-                                  "OpenSslSignerPrivateCertFile": str(PayloadDescriptor.OpenSslSignerPrivateCertFile),
-                                  "OpenSslOtherPublicCertFile": str(PayloadDescriptor.OpenSslOtherPublicCertFile),
-                                  "OpenSslTrustedPublicCertFile": str(PayloadDescriptor.OpenSslTrustedPublicCertFile),
-                                  "SigningToolPath": str(PayloadDescriptor.SigningToolPath),
-                                  "Dependencies" : str(PayloadDescriptor.DepexExp)
-                              }for PayloadDescriptor in PayloadJsonDescriptorList
-                          ]
-                      }
+            "Payloads": [
+                {
+                    "Guid": str(PayloadDescriptor.Guid).upper(),
+                    "FwVersion": str(PayloadDescriptor.FwVersion),
+                    "LowestSupportedVersion": str(PayloadDescriptor.LowestSupportedVersion),
+                    "MonotonicCount": str(PayloadDescriptor.MonotonicCount),
+                    "Payload": PayloadDescriptor.Payload,
+                    "HardwareInstance": str(PayloadDescriptor.HardwareInstance),
+                    "UpdateImageIndex": str(PayloadDescriptor.UpdateImageIndex),
+                    "SignToolPfxFile": str(PayloadDescriptor.SignToolPfxFile),
+                    "SignToolSubjectName": str(PayloadDescriptor.SignToolSubjectName),
+                    "OpenSslSignerPrivateCertFile": str(PayloadDescriptor.OpenSslSignerPrivateCertFile),
+                    "OpenSslOtherPublicCertFile": str(PayloadDescriptor.OpenSslOtherPublicCertFile),
+                    "OpenSslTrustedPublicCertFile": str(PayloadDescriptor.OpenSslTrustedPublicCertFile),
+                    "SigningToolPath": str(PayloadDescriptor.SigningToolPath),
+                    "Dependencies": str(PayloadDescriptor.DepexExp)
+                }for PayloadDescriptor in PayloadJsonDescriptorList
+            ]
+        }
         OutputJsonFile = args.OutputFile.name + '.json'
         if 'Payloads' in PayloadJson:
-            PayloadSection = PayloadJson ['Payloads']
+            PayloadSection = PayloadJson['Payloads']
         Index = 0
         for PayloadField in PayloadSection:
             if PayloadJsonDescriptorList[Index].SignToolPfxFile is None:
-                del PayloadField ['SignToolPfxFile']
+                del PayloadField['SignToolPfxFile']
             if PayloadJsonDescriptorList[Index].SignToolSubjectName is None:
-                del PayloadField ['SignToolSubjectName']
+                del PayloadField['SignToolSubjectName']
             if PayloadJsonDescriptorList[Index].OpenSslSignerPrivateCertFile is None:
-                del PayloadField ['OpenSslSignerPrivateCertFile']
+                del PayloadField['OpenSslSignerPrivateCertFile']
             if PayloadJsonDescriptorList[Index].OpenSslOtherPublicCertFile is None:
-                del PayloadField ['OpenSslOtherPublicCertFile']
+                del PayloadField['OpenSslOtherPublicCertFile']
             if PayloadJsonDescriptorList[Index].OpenSslTrustedPublicCertFile is None:
-                del PayloadField ['OpenSslTrustedPublicCertFile']
+                del PayloadField['OpenSslTrustedPublicCertFile']
             if PayloadJsonDescriptorList[Index].SigningToolPath is None:
-                del PayloadField ['SigningToolPath']
+                del PayloadField['SigningToolPath']
             Index = Index + 1
-        Result = json.dumps (PayloadJson, indent=4, sort_keys=True, separators=(',', ': '))
-        with open (OutputJsonFile, 'w') as OutputFile:
-            OutputFile.write (Result)
+        Result = json.dumps(PayloadJson, indent=4,
+                            sort_keys=True, separators=(',', ': '))
+        with open(OutputJsonFile, 'w') as OutputFile:
+            OutputFile.write(Result)
 
-    def CheckArgumentConflict (args):
+    def CheckArgumentConflict(args):
         if args.Encode:
             if args.InputFile:
-                print ('GenerateCapsule: error: Argument InputFile conflicts with Argument -j')
-                sys.exit (1)
+                print(
+                    'GenerateCapsule: error: Argument InputFile conflicts with Argument -j')
+                sys.exit(1)
             if args.EmbeddedDriver:
-                print ('GenerateCapsule: error: Argument --embedded-driver conflicts with Argument -j')
-                sys.exit (1)
+                print(
+                    'GenerateCapsule: error: Argument --embedded-driver conflicts with Argument -j')
+                sys.exit(1)
         if args.Guid:
-            print ('GenerateCapsule: error: Argument --guid conflicts with Argument -j')
-            sys.exit (1)
+            print('GenerateCapsule: error: Argument --guid conflicts with Argument -j')
+            sys.exit(1)
         if args.FwVersion:
-            print ('GenerateCapsule: error: Argument --fw-version conflicts with Argument -j')
-            sys.exit (1)
+            print(
+                'GenerateCapsule: error: Argument --fw-version conflicts with Argument -j')
+            sys.exit(1)
         if args.LowestSupportedVersion:
-            print ('GenerateCapsule: error: Argument --lsv conflicts with Argument -j')
-            sys.exit (1)
+            print('GenerateCapsule: error: Argument --lsv conflicts with Argument -j')
+            sys.exit(1)
         if args.MonotonicCount:
-            print ('GenerateCapsule: error: Argument --monotonic-count conflicts with Argument -j')
-            sys.exit (1)
+            print(
+                'GenerateCapsule: error: Argument --monotonic-count conflicts with Argument -j')
+            sys.exit(1)
         if args.HardwareInstance:
-            print ('GenerateCapsule: error: Argument --hardware-instance conflicts with Argument -j')
-            sys.exit (1)
+            print(
+                'GenerateCapsule: error: Argument --hardware-instance conflicts with Argument -j')
+            sys.exit(1)
         if args.SignToolPfxFile:
-            print ('GenerateCapsule: error: Argument --pfx-file conflicts with Argument -j')
-            sys.exit (1)
+            print(
+                'GenerateCapsule: error: Argument --pfx-file conflicts with Argument -j')
+            sys.exit(1)
         if args.SignToolSubjectName:
-            print ('GenerateCapsule: error: Argument --SubjectName conflicts with Argument -j')
-            sys.exit (1)
+            print(
+                'GenerateCapsule: error: Argument --SubjectName conflicts with Argument -j')
+            sys.exit(1)
         if args.OpenSslSignerPrivateCertFile:
-            print ('GenerateCapsule: error: Argument --signer-private-cert conflicts with Argument -j')
-            sys.exit (1)
+            print(
+                'GenerateCapsule: error: Argument --signer-private-cert conflicts with Argument -j')
+            sys.exit(1)
         if args.OpenSslOtherPublicCertFile:
-            print ('GenerateCapsule: error: Argument --other-public-cert conflicts with Argument -j')
-            sys.exit (1)
+            print(
+                'GenerateCapsule: error: Argument --other-public-cert conflicts with Argument -j')
+            sys.exit(1)
         if args.OpenSslTrustedPublicCertFile:
-            print ('GenerateCapsule: error: Argument --trusted-public-cert conflicts with Argument -j')
-            sys.exit (1)
+            print(
+                'GenerateCapsule: error: Argument --trusted-public-cert conflicts with Argument -j')
+            sys.exit(1)
         if args.SigningToolPath:
-            print ('GenerateCapsule: error: Argument --signing-tool-path conflicts with Argument -j')
-            sys.exit (1)
+            print(
+                'GenerateCapsule: error: Argument --signing-tool-path conflicts with Argument -j')
+            sys.exit(1)
 
     class PayloadDescriptor (object):
         def __init__(self,
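
The hunk above re-wraps the JSON parsing without changing which fields are read. For reference, a minimal "Payloads" entry that those ConvertJsonValue calls accept, written as a Python dict; the file name, GUID, and version strings below are made-up placeholders:

import json

example_input = {
    "Payloads": [
        {
            "Payload": "Firmware.bin",   # opened for reading (Open=True)
            "Guid": "11111111-2222-3333-4444-555555555555",
            "FwVersion": "0x00000002",
            "LowestSupportedVersion": "0x00000001"
            # Optional keys default as above: HardwareInstance=0,
            # MonotonicCount=0, UpdateImageIndex=1, Dependencies=None,
            # and the signing-related entries default to None.
        }
    ]
}
print(json.dumps(example_input, indent=4))

The numeric fields are given as strings because ValidateUnsignedInteger parses them with int(value, 0), which accepts "0x..." hex or decimal text.
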
@@ -434,90 +503,108 @@ if __name__ == '__main__':
                      Guid,
                      FwVersion,
                      LowestSupportedVersion,
-                     MonotonicCount               = 0,
-                     HardwareInstance             = 0,
-                     UpdateImageIndex             = 1,
-                     SignToolPfxFile              = None,
-                     SignToolSubjectName          = None,
-                     OpenSslSignerPrivateCertFile = None,
-                     OpenSslOtherPublicCertFile   = None,
-                     OpenSslTrustedPublicCertFile = None,
-                     SigningToolPath              = None,
-                     DepexExp                     = None
+                     MonotonicCount=0,
+                     HardwareInstance=0,
+                     UpdateImageIndex=1,
+                     SignToolPfxFile=None,
+                     SignToolSubjectName=None,
+                     OpenSslSignerPrivateCertFile=None,
+                     OpenSslOtherPublicCertFile=None,
+                     OpenSslTrustedPublicCertFile=None,
+                     SigningToolPath=None,
+                     DepexExp=None
                      ):
-            self.Payload                      = Payload
-            self.Guid                         = Guid
-            self.FwVersion                    = FwVersion
-            self.LowestSupportedVersion       = LowestSupportedVersion
-            self.MonotonicCount               = MonotonicCount
-            self.HardwareInstance             = HardwareInstance
-            self.UpdateImageIndex             = UpdateImageIndex
-            self.SignToolPfxFile              = SignToolPfxFile
-            self.SignToolSubjectName          = SignToolSubjectName
+            self.Payload = Payload
+            self.Guid = Guid
+            self.FwVersion = FwVersion
+            self.LowestSupportedVersion = LowestSupportedVersion
+            self.MonotonicCount = MonotonicCount
+            self.HardwareInstance = HardwareInstance
+            self.UpdateImageIndex = UpdateImageIndex
+            self.SignToolPfxFile = SignToolPfxFile
+            self.SignToolSubjectName = SignToolSubjectName
             self.OpenSslSignerPrivateCertFile = OpenSslSignerPrivateCertFile
-            self.OpenSslOtherPublicCertFile   = OpenSslOtherPublicCertFile
+            self.OpenSslOtherPublicCertFile = OpenSslOtherPublicCertFile
             self.OpenSslTrustedPublicCertFile = OpenSslTrustedPublicCertFile
-            self.SigningToolPath              = SigningToolPath
-            self.DepexExp                     = DepexExp
+            self.SigningToolPath = SigningToolPath
+            self.DepexExp = DepexExp
 
             self.UseSignTool = (self.SignToolPfxFile is not None or
                                 self.SignToolSubjectName is not None)
-            self.UseOpenSsl  = (self.OpenSslSignerPrivateCertFile is not None and
-                                self.OpenSslOtherPublicCertFile is not None and
-                                self.OpenSslTrustedPublicCertFile is not None)
-            self.AnyOpenSsl  = (self.OpenSslSignerPrivateCertFile is not None or
-                                self.OpenSslOtherPublicCertFile is not None or
-                                self.OpenSslTrustedPublicCertFile is not None)
+            self.UseOpenSsl = (self.OpenSslSignerPrivateCertFile is not None and
+                               self.OpenSslOtherPublicCertFile is not None and
+                               self.OpenSslTrustedPublicCertFile is not None)
+            self.AnyOpenSsl = (self.OpenSslSignerPrivateCertFile is not None or
+                               self.OpenSslOtherPublicCertFile is not None or
+                               self.OpenSslTrustedPublicCertFile is not None)
             self.UseDependency = self.DepexExp is not None
 
         def Validate(self, args):
             if self.UseSignTool and self.AnyOpenSsl:
-                raise argparse.ArgumentTypeError ('Providing both signtool and OpenSSL options is not supported')
+                raise argparse.ArgumentTypeError(
+                    'Providing both signtool and OpenSSL options is not supported')
             if not self.UseSignTool and not self.UseOpenSsl and self.AnyOpenSsl:
                 if args.JsonFile:
-                    raise argparse.ArgumentTypeError ('the following JSON fields are required for OpenSSL: OpenSslSignerPrivateCertFile, OpenSslOtherPublicCertFile, OpenSslTrustedPublicCertFile')
+                    raise argparse.ArgumentTypeError(
+                        'the following JSON fields are required for OpenSSL: OpenSslSignerPrivateCertFile, OpenSslOtherPublicCertFile, OpenSslTrustedPublicCertFile')
                 else:
-                    raise argparse.ArgumentTypeError ('the following options are required for OpenSSL: --signer-private-cert, --other-public-cert, --trusted-public-cert')
+                    raise argparse.ArgumentTypeError(
+                        'the following options are required for OpenSSL: --signer-private-cert, --other-public-cert, --trusted-public-cert')
             if self.UseSignTool and platform.system() != 'Windows':
-                raise argparse.ArgumentTypeError ('Use of signtool is not supported on this operating system.')
+                raise argparse.ArgumentTypeError(
+                    'Use of signtool is not supported on this operating system.')
             if args.Encode:
                 if self.FwVersion is None or self.LowestSupportedVersion is None:
                     if args.JsonFile:
-                        raise argparse.ArgumentTypeError ('the following JSON fields are required: FwVersion, LowestSupportedVersion')
+                        raise argparse.ArgumentTypeError(
+                            'the following JSON fields are required: FwVersion, LowestSupportedVersion')
                     else:
-                        raise argparse.ArgumentTypeError ('the following options are required: --fw-version, --lsv')
+                        raise argparse.ArgumentTypeError(
+                            'the following options are required: --fw-version, --lsv')
                 if self.FwVersion > 0xFFFFFFFF:
                     if args.JsonFile:
-                        raise argparse.ArgumentTypeError ('JSON field FwVersion must be an integer in range 0x0..0xffffffff')
+                        raise argparse.ArgumentTypeError(
+                            'JSON field FwVersion must be an integer in range 0x0..0xffffffff')
                     else:
-                        raise argparse.ArgumentTypeError ('--fw-version must be an integer in range 0x0..0xffffffff')
+                        raise argparse.ArgumentTypeError(
+                            '--fw-version must be an integer in range 0x0..0xffffffff')
                 if self.LowestSupportedVersion > 0xFFFFFFFF:
                     if args.JsonFile:
-                        raise argparse.ArgumentTypeError ('JSON field LowestSupportedVersion must be an integer in range 0x0..0xffffffff')
+                        raise argparse.ArgumentTypeError(
+                            'JSON field LowestSupportedVersion must be an integer in range 0x0..0xffffffff')
                     else:
-                        raise argparse.ArgumentTypeError ('--lsv must be an integer in range 0x0..0xffffffff')
+                        raise argparse.ArgumentTypeError(
+                            '--lsv must be an integer in range 0x0..0xffffffff')
 
             if args.Encode:
                 if self.Guid is None:
                     if args.JsonFile:
-                        raise argparse.ArgumentTypeError ('the following JSON field is required: Guid')
+                        raise argparse.ArgumentTypeError(
+                            'the following JSON field is required: Guid')
                     else:
-                        raise argparse.ArgumentTypeError ('the following option is required: --guid')
+                        raise argparse.ArgumentTypeError(
+                            'the following option is required: --guid')
                 if self.HardwareInstance > 0xFFFFFFFFFFFFFFFF:
                     if args.JsonFile:
-                        raise argparse.ArgumentTypeError ('JSON field HardwareInstance must be an integer in range 0x0..0xffffffffffffffff')
+                        raise argparse.ArgumentTypeError(
+                            'JSON field HardwareInstance must be an integer in range 0x0..0xffffffffffffffff')
                     else:
-                        raise argparse.ArgumentTypeError ('--hardware-instance must be an integer in range 0x0..0xffffffffffffffff')
+                        raise argparse.ArgumentTypeError(
+                            '--hardware-instance must be an integer in range 0x0..0xffffffffffffffff')
                 if self.MonotonicCount > 0xFFFFFFFFFFFFFFFF:
                     if args.JsonFile:
-                        raise argparse.ArgumentTypeError ('JSON field MonotonicCount must be an integer in range 0x0..0xffffffffffffffff')
+                        raise argparse.ArgumentTypeError(
+                            'JSON field MonotonicCount must be an integer in range 0x0..0xffffffffffffffff')
                     else:
-                        raise argparse.ArgumentTypeError ('--monotonic-count must be an integer in range 0x0..0xffffffffffffffff')
-                if self.UpdateImageIndex >0xFF:
+                        raise argparse.ArgumentTypeError(
+                            '--monotonic-count must be an integer in range 0x0..0xffffffffffffffff')
+                if self.UpdateImageIndex > 0xFF:
                     if args.JsonFile:
-                        raise argparse.ArgumentTypeError ('JSON field UpdateImageIndex must be an integer in range 0x0..0xff')
+                        raise argparse.ArgumentTypeError(
+                            'JSON field UpdateImageIndex must be an integer in range 0x0..0xff')
                     else:
-                        raise argparse.ArgumentTypeError ('--update-image-index must be an integer in range 0x0..0xff')
+                        raise argparse.ArgumentTypeError(
+                            '--update-image-index must be an integer in range 0x0..0xff')
 
             if self.UseSignTool:
                 if self.SignToolPfxFile is not None:
@@ -528,7 +615,7 @@ if __name__ == '__main__':
                 self.OpenSslOtherPublicCertFile.close()
                 self.OpenSslTrustedPublicCertFile.close()
                 self.OpenSslSignerPrivateCertFile = self.OpenSslSignerPrivateCertFile.name
-                self.OpenSslOtherPublicCertFile   = self.OpenSslOtherPublicCertFile.name
+                self.OpenSslOtherPublicCertFile = self.OpenSslOtherPublicCertFile.name
                 self.OpenSslTrustedPublicCertFile = self.OpenSslTrustedPublicCertFile.name
 
             #
@@ -537,487 +624,534 @@ if __name__ == '__main__':
             if args.Encode:
                 if 'PersistAcrossReset' not in args.CapsuleFlag:
                     if 'InitiateReset' in args.CapsuleFlag:
-                        raise argparse.ArgumentTypeError ('--capflag InitiateReset also requires --capflag PersistAcrossReset')
+                        raise argparse.ArgumentTypeError(
+                            '--capflag InitiateReset also requires --capflag PersistAcrossReset')
                 if args.CapsuleOemFlag > 0xFFFF:
-                    raise argparse.ArgumentTypeError ('--capoemflag must be an integer between 0x0000 and 0xffff')
+                    raise argparse.ArgumentTypeError(
+                        '--capoemflag must be an integer between 0x0000 and 0xffff')
 
             return True
 
-
-    def Encode (PayloadDescriptorList, EmbeddedDriverDescriptorList, Buffer):
+    def Encode(PayloadDescriptorList, EmbeddedDriverDescriptorList, Buffer):
         if args.JsonFile:
             CheckArgumentConflict(args)
             try:
-                Json = json.loads (args.JsonFile.read ())
+                Json = json.loads(args.JsonFile.read())
             except:
-                print ('GenerateCapsule: error: {JSONFile} loads failure. '.format (JSONFile = args.JsonFile))
-                sys.exit (1)
+                print('GenerateCapsule: error: {JSONFile} loads failure. '.format(
+                    JSONFile=args.JsonFile))
+                sys.exit(1)
             EncodeJsonFileParse(Json)
         else:
             for Driver in args.EmbeddedDriver:
-                EmbeddedDriverDescriptorList.append (Driver.read())
-            PayloadDescriptorList.append (PayloadDescriptor (
-                                            Buffer,
-                                            args.Guid,
-                                            args.FwVersion,
-                                            args.LowestSupportedVersion,
-                                            args.MonotonicCount,
-                                            args.HardwareInstance,
-                                            args.UpdateImageIndex,
-                                            args.SignToolPfxFile,
-                                            args.SignToolSubjectName,
-                                            args.OpenSslSignerPrivateCertFile,
-                                            args.OpenSslOtherPublicCertFile,
-                                            args.OpenSslTrustedPublicCertFile,
-                                            args.SigningToolPath,
-                                            None
-                                            ))
+                EmbeddedDriverDescriptorList.append(Driver.read())
+            PayloadDescriptorList.append(PayloadDescriptor(
+                Buffer,
+                args.Guid,
+                args.FwVersion,
+                args.LowestSupportedVersion,
+                args.MonotonicCount,
+                args.HardwareInstance,
+                args.UpdateImageIndex,
+                args.SignToolPfxFile,
+                args.SignToolSubjectName,
+                args.OpenSslSignerPrivateCertFile,
+                args.OpenSslOtherPublicCertFile,
+                args.OpenSslTrustedPublicCertFile,
+                args.SigningToolPath,
+                None
+            ))
         for SinglePayloadDescriptor in PayloadDescriptorList:
             try:
-                SinglePayloadDescriptor.Validate (args)
+                SinglePayloadDescriptor.Validate(args)
             except Exception as Msg:
-                print ('GenerateCapsule: error:' + str(Msg))
-                sys.exit (1)
+                print('GenerateCapsule: error:' + str(Msg))
+                sys.exit(1)
         for SinglePayloadDescriptor in PayloadDescriptorList:
             ImageCapsuleSupport = 0x0000000000000000
             Result = SinglePayloadDescriptor.Payload
             try:
-                FmpPayloadHeader.FwVersion              = SinglePayloadDescriptor.FwVersion
+                FmpPayloadHeader.FwVersion = SinglePayloadDescriptor.FwVersion
                 FmpPayloadHeader.LowestSupportedVersion = SinglePayloadDescriptor.LowestSupportedVersion
-                FmpPayloadHeader.Payload                = SinglePayloadDescriptor.Payload
-                Result = FmpPayloadHeader.Encode ()
+                FmpPayloadHeader.Payload = SinglePayloadDescriptor.Payload
+                Result = FmpPayloadHeader.Encode()
                 if args.Verbose:
-                    FmpPayloadHeader.DumpInfo ()
+                    FmpPayloadHeader.DumpInfo()
             except:
-                print ('GenerateCapsule: error: can not encode FMP Payload Header')
-                sys.exit (1)
+                print('GenerateCapsule: error: can not encode FMP Payload Header')
+                sys.exit(1)
             if SinglePayloadDescriptor.UseDependency:
                 CapsuleDependency.Payload = Result
                 CapsuleDependency.DepexExp = SinglePayloadDescriptor.DepexExp
-                ImageCapsuleSupport        |= FmpCapsuleHeader.CAPSULE_SUPPORT_DEPENDENCY
-                Result = CapsuleDependency.Encode ()
+                ImageCapsuleSupport |= FmpCapsuleHeader.CAPSULE_SUPPORT_DEPENDENCY
+                Result = CapsuleDependency.Encode()
                 if args.Verbose:
-                    CapsuleDependency.DumpInfo ()
+                    CapsuleDependency.DumpInfo()
             if SinglePayloadDescriptor.UseOpenSsl or SinglePayloadDescriptor.UseSignTool:
                 #
                 # Sign image with 64-bit MonotonicCount appended to end of image
                 #
                 try:
                     if SinglePayloadDescriptor.UseSignTool:
-                        CertData = SignPayloadSignTool (
-                            Result + struct.pack ('<Q', SinglePayloadDescriptor.MonotonicCount),
+                        CertData = SignPayloadSignTool(
+                            Result +
+                            struct.pack(
+                                '<Q', SinglePayloadDescriptor.MonotonicCount),
                             SinglePayloadDescriptor.SigningToolPath,
                             SinglePayloadDescriptor.SignToolPfxFile,
                             SinglePayloadDescriptor.SignToolSubjectName,
-                            Verbose = args.Verbose
+                            Verbose=args.Verbose
                         )
                     else:
-                        CertData = SignPayloadOpenSsl (
-                            Result + struct.pack ('<Q', SinglePayloadDescriptor.MonotonicCount),
+                        CertData = SignPayloadOpenSsl(
+                            Result +
+                            struct.pack(
+                                '<Q', SinglePayloadDescriptor.MonotonicCount),
                             SinglePayloadDescriptor.SigningToolPath,
                             SinglePayloadDescriptor.OpenSslSignerPrivateCertFile,
                             SinglePayloadDescriptor.OpenSslOtherPublicCertFile,
                             SinglePayloadDescriptor.OpenSslTrustedPublicCertFile,
-                            Verbose = args.Verbose
+                            Verbose=args.Verbose
                         )
                 except Exception as Msg:
-                    print ('GenerateCapsule: error: can not sign payload \n' + str(Msg))
-                    sys.exit (1)
+                    print('GenerateCapsule: error: can not sign payload \n' + str(Msg))
+                    sys.exit(1)
 
                 try:
                     FmpAuthHeader.MonotonicCount = SinglePayloadDescriptor.MonotonicCount
-                    FmpAuthHeader.CertData       = CertData
-                    FmpAuthHeader.Payload        = Result
-                    ImageCapsuleSupport          |= FmpCapsuleHeader.CAPSULE_SUPPORT_AUTHENTICATION
-                    Result = FmpAuthHeader.Encode ()
+                    FmpAuthHeader.CertData = CertData
+                    FmpAuthHeader.Payload = Result
+                    ImageCapsuleSupport |= FmpCapsuleHeader.CAPSULE_SUPPORT_AUTHENTICATION
+                    Result = FmpAuthHeader.Encode()
                     if args.Verbose:
-                        FmpAuthHeader.DumpInfo ()
+                        FmpAuthHeader.DumpInfo()
                 except:
-                    print ('GenerateCapsule: error: can not encode FMP Auth Header')
-                    sys.exit (1)
-            FmpCapsuleHeader.AddPayload (SinglePayloadDescriptor.Guid, Result, HardwareInstance = SinglePayloadDescriptor.HardwareInstance, UpdateImageIndex = SinglePayloadDescriptor.UpdateImageIndex, CapsuleSupport = ImageCapsuleSupport)
+                    print('GenerateCapsule: error: can not encode FMP Auth Header')
+                    sys.exit(1)
+            FmpCapsuleHeader.AddPayload(SinglePayloadDescriptor.Guid, Result, HardwareInstance=SinglePayloadDescriptor.HardwareInstance,
+                                        UpdateImageIndex=SinglePayloadDescriptor.UpdateImageIndex, CapsuleSupport=ImageCapsuleSupport)
         try:
             for EmbeddedDriver in EmbeddedDriverDescriptorList:
                 FmpCapsuleHeader.AddEmbeddedDriver(EmbeddedDriver)
 
-            Result = FmpCapsuleHeader.Encode ()
+            Result = FmpCapsuleHeader.Encode()
             if args.Verbose:
-                FmpCapsuleHeader.DumpInfo ()
+                FmpCapsuleHeader.DumpInfo()
         except:
-            print ('GenerateCapsule: error: can not encode FMP Capsule Header')
-            sys.exit (1)
+            print('GenerateCapsule: error: can not encode FMP Capsule Header')
+            sys.exit(1)
 
         try:
-            UefiCapsuleHeader.OemFlags            = args.CapsuleOemFlag
-            UefiCapsuleHeader.PersistAcrossReset  = 'PersistAcrossReset'  in args.CapsuleFlag
+            UefiCapsuleHeader.OemFlags = args.CapsuleOemFlag
+            UefiCapsuleHeader.PersistAcrossReset = 'PersistAcrossReset' in args.CapsuleFlag
             UefiCapsuleHeader.PopulateSystemTable = False
-            UefiCapsuleHeader.InitiateReset       = 'InitiateReset'       in args.CapsuleFlag
-            UefiCapsuleHeader.Payload             = Result
-            Result = UefiCapsuleHeader.Encode ()
+            UefiCapsuleHeader.InitiateReset = 'InitiateReset' in args.CapsuleFlag
+            UefiCapsuleHeader.Payload = Result
+            Result = UefiCapsuleHeader.Encode()
             if args.Verbose:
-                UefiCapsuleHeader.DumpInfo ()
+                UefiCapsuleHeader.DumpInfo()
         except:
-            print ('GenerateCapsule: error: can not encode UEFI Capsule Header')
-            sys.exit (1)
+            print('GenerateCapsule: error: can not encode UEFI Capsule Header')
+            sys.exit(1)
         try:
             if args.Verbose:
-                print ('Write binary output file {File}'.format (File = args.OutputFile.name))
-            args.OutputFile.write (Result)
-            args.OutputFile.close ()
+                print('Write binary output file {File}'.format(
+                    File=args.OutputFile.name))
+            args.OutputFile.write(Result)
+            args.OutputFile.close()
         except:
-            print ('GenerateCapsule: error: can not write binary output file {File}'.format (File = args.OutputFile.name))
-            sys.exit (1)
+            print('GenerateCapsule: error: can not write binary output file {File}'.format(
+                File=args.OutputFile.name))
+            sys.exit(1)
 
-    def Decode (PayloadDescriptorList, PayloadJsonDescriptorList, Buffer):
+    def Decode(PayloadDescriptorList, PayloadJsonDescriptorList, Buffer):
         if args.JsonFile:
             CheckArgumentConflict(args)
         #
         # Parse payload descriptors from JSON
         #
             try:
-                Json = json.loads (args.JsonFile.read())
+                Json = json.loads(args.JsonFile.read())
             except:
-                print ('GenerateCapsule: error: {JSONFile} loads failure. '.format (JSONFile = args.JsonFile))
-                sys.exit (1)
-            DecodeJsonFileParse (Json)
+                print('GenerateCapsule: error: {JSONFile} loads failure. '.format(
+                    JSONFile=args.JsonFile))
+                sys.exit(1)
+            DecodeJsonFileParse(Json)
         else:
-            PayloadDescriptorList.append (PayloadDescriptor (
-                                            Buffer,
-                                            args.Guid,
-                                            args.FwVersion,
-                                            args.LowestSupportedVersion,
-                                            args.MonotonicCount,
-                                            args.HardwareInstance,
-                                            args.UpdateImageIndex,
-                                            args.SignToolPfxFile,
-                                            args.SignSubjectName,
-                                            args.OpenSslSignerPrivateCertFile,
-                                            args.OpenSslOtherPublicCertFile,
-                                            args.OpenSslTrustedPublicCertFile,
-                                            args.SigningToolPath,
-                                            None
-                                            ))
+            PayloadDescriptorList.append(PayloadDescriptor(
+                Buffer,
+                args.Guid,
+                args.FwVersion,
+                args.LowestSupportedVersion,
+                args.MonotonicCount,
+                args.HardwareInstance,
+                args.UpdateImageIndex,
+                args.SignToolPfxFile,
+                args.SignSubjectName,
+                args.OpenSslSignerPrivateCertFile,
+                args.OpenSslOtherPublicCertFile,
+                args.OpenSslTrustedPublicCertFile,
+                args.SigningToolPath,
+                None
+            ))
         #
         # Perform additional verification on payload descriptors
         #
         for SinglePayloadDescriptor in PayloadDescriptorList:
             try:
-                SinglePayloadDescriptor.Validate (args)
+                SinglePayloadDescriptor.Validate(args)
             except Exception as Msg:
-                print ('GenerateCapsule: error:' + str(Msg))
-                sys.exit (1)
+                print('GenerateCapsule: error:' + str(Msg))
+                sys.exit(1)
         try:
-            Result = UefiCapsuleHeader.Decode (Buffer)
-            if len (Result) > 0:
-                Result = FmpCapsuleHeader.Decode (Result)
+            Result = UefiCapsuleHeader.Decode(Buffer)
+            if len(Result) > 0:
+                Result = FmpCapsuleHeader.Decode(Result)
                 if args.JsonFile:
-                    if FmpCapsuleHeader.PayloadItemCount != len (PayloadDescriptorList):
+                    if FmpCapsuleHeader.PayloadItemCount != len(PayloadDescriptorList):
                         CapsulePayloadNum = FmpCapsuleHeader.PayloadItemCount
-                        JsonPayloadNum = len (PayloadDescriptorList)
-                        print ('GenerateCapsule: Decode error: {JsonPayloadNumber} payloads in JSON file {File} and {CapsulePayloadNumber} payloads in Capsule {CapsuleName}'.format (JsonPayloadNumber = JsonPayloadNum, File = args.JsonFile.name, CapsulePayloadNumber = CapsulePayloadNum, CapsuleName = args.InputFile.name))
-                        sys.exit (1)
-                    for Index in range (0, FmpCapsuleHeader.PayloadItemCount):
-                        if Index < len (PayloadDescriptorList):
-                            GUID = FmpCapsuleHeader.GetFmpCapsuleImageHeader (Index).UpdateImageTypeId
-                            HardwareInstance = FmpCapsuleHeader.GetFmpCapsuleImageHeader (Index).UpdateHardwareInstance
-                            UpdateImageIndex = FmpCapsuleHeader.GetFmpCapsuleImageHeader (Index).UpdateImageIndex
+                        JsonPayloadNum = len(PayloadDescriptorList)
+                        print('GenerateCapsule: Decode error: {JsonPayloadNumber} payloads in JSON file {File} and {CapsulePayloadNumber} payloads in Capsule {CapsuleName}'.format(
+                            JsonPayloadNumber=JsonPayloadNum, File=args.JsonFile.name, CapsulePayloadNumber=CapsulePayloadNum, CapsuleName=args.InputFile.name))
+                        sys.exit(1)
+                    for Index in range(0, FmpCapsuleHeader.PayloadItemCount):
+                        if Index < len(PayloadDescriptorList):
+                            GUID = FmpCapsuleHeader.GetFmpCapsuleImageHeader(
+                                Index).UpdateImageTypeId
+                            HardwareInstance = FmpCapsuleHeader.GetFmpCapsuleImageHeader(
+                                Index).UpdateHardwareInstance
+                            UpdateImageIndex = FmpCapsuleHeader.GetFmpCapsuleImageHeader(
+                                Index).UpdateImageIndex
                             if PayloadDescriptorList[Index].Guid != GUID or PayloadDescriptorList[Index].HardwareInstance != HardwareInstance:
-                                print ('GenerateCapsule: Decode error: Guid or HardwareInstance pair in input JSON file {File} does not match the payload {PayloadIndex} in Capsule {InputCapsule}'.format (File = args.JsonFile.name, PayloadIndex = Index + 1, InputCapsule = args.InputFile.name))
-                                sys.exit (1)
-                            PayloadDescriptorList[Index].Payload = FmpCapsuleHeader.GetFmpCapsuleImageHeader (Index).Payload
-                            DecodeJsonOutput = args.OutputFile.name + '.Payload.{Index:d}.bin'.format (Index = Index + 1)
-                            PayloadJsonDescriptorList.append (PayloadDescriptor (
-                                                                DecodeJsonOutput,
-                                                                GUID,
-                                                                None,
-                                                                None,
-                                                                None,
-                                                                HardwareInstance,
-                                                                UpdateImageIndex,
-                                                                PayloadDescriptorList[Index].SignToolPfxFile,
-                                                                PayloadDescriptorList[Index].SignToolSubjectName,
-                                                                PayloadDescriptorList[Index].OpenSslSignerPrivateCertFile,
-                                                                PayloadDescriptorList[Index].OpenSslOtherPublicCertFile,
-                                                                PayloadDescriptorList[Index].OpenSslTrustedPublicCertFile,
-                                                                PayloadDescriptorList[Index].SigningToolPath,
-                                                                None
-                                                                ))
+                                print('GenerateCapsule: Decode error: Guid or HardwareInstance pair in input JSON file {File} does not match the payload {PayloadIndex} in Capsule {InputCapsule}'.format(
+                                    File=args.JsonFile.name, PayloadIndex=Index + 1, InputCapsule=args.InputFile.name))
+                                sys.exit(1)
+                            PayloadDescriptorList[Index].Payload = FmpCapsuleHeader.GetFmpCapsuleImageHeader(
+                                Index).Payload
+                            DecodeJsonOutput = args.OutputFile.name + \
+                                '.Payload.{Index:d}.bin'.format(
+                                    Index=Index + 1)
+                            PayloadJsonDescriptorList.append(PayloadDescriptor(
+                                DecodeJsonOutput,
+                                GUID,
+                                None,
+                                None,
+                                None,
+                                HardwareInstance,
+                                UpdateImageIndex,
+                                PayloadDescriptorList[Index].SignToolPfxFile,
+                                PayloadDescriptorList[Index].SignToolSubjectName,
+                                PayloadDescriptorList[Index].OpenSslSignerPrivateCertFile,
+                                PayloadDescriptorList[Index].OpenSslOtherPublicCertFile,
+                                PayloadDescriptorList[Index].OpenSslTrustedPublicCertFile,
+                                PayloadDescriptorList[Index].SigningToolPath,
+                                None
+                            ))
                 else:
-                    PayloadDescriptorList[0].Payload = FmpCapsuleHeader.GetFmpCapsuleImageHeader (0).Payload
-                    for Index in range (0, FmpCapsuleHeader.PayloadItemCount):
+                    PayloadDescriptorList[0].Payload = FmpCapsuleHeader.GetFmpCapsuleImageHeader(
+                        0).Payload
+                    for Index in range(0, FmpCapsuleHeader.PayloadItemCount):
                         if Index > 0:
-                            PayloadDecodeFile = FmpCapsuleHeader.GetFmpCapsuleImageHeader (Index).Payload
-                            PayloadDescriptorList.append (PayloadDescriptor (PayloadDecodeFile,
-                                                            None,
-                                                            None,
-                                                            None,
-                                                            None,
-                                                            None,
-                                                            None,
-                                                            None,
-                                                            None,
-                                                            None,
-                                                            None,
-                                                            None,
-                                                            None
-                                                            ))
-                        GUID = FmpCapsuleHeader.GetFmpCapsuleImageHeader (Index).UpdateImageTypeId
-                        HardwareInstance = FmpCapsuleHeader.GetFmpCapsuleImageHeader (Index).UpdateHardwareInstance
-                        UpdateImageIndex = FmpCapsuleHeader.GetFmpCapsuleImageHeader (Index).UpdateImageIndex
-                        DecodeJsonOutput = args.OutputFile.name + '.Payload.{Index:d}.bin'.format (Index = Index + 1)
-                        PayloadJsonDescriptorList.append (PayloadDescriptor (
-                                                            DecodeJsonOutput,
-                                                            GUID,
-                                                            None,
-                                                            None,
-                                                            None,
-                                                            HardwareInstance,
-                                                            UpdateImageIndex,
-                                                            PayloadDescriptorList[Index].SignToolPfxFile,
-                                                            PayloadDescriptorList[Index].SignToolSubjectName,
-                                                            PayloadDescriptorList[Index].OpenSslSignerPrivateCertFile,
-                                                            PayloadDescriptorList[Index].OpenSslOtherPublicCertFile,
-                                                            PayloadDescriptorList[Index].OpenSslTrustedPublicCertFile,
-                                                            PayloadDescriptorList[Index].SigningToolPath,
-                                                            None
-                                                            ))
+                            PayloadDecodeFile = FmpCapsuleHeader.GetFmpCapsuleImageHeader(
+                                Index).Payload
+                            PayloadDescriptorList.append(PayloadDescriptor(PayloadDecodeFile,
+                                                                           None,
+                                                                           None,
+                                                                           None,
+                                                                           None,
+                                                                           None,
+                                                                           None,
+                                                                           None,
+                                                                           None,
+                                                                           None,
+                                                                           None,
+                                                                           None,
+                                                                           None
+                                                                           ))
+                        GUID = FmpCapsuleHeader.GetFmpCapsuleImageHeader(
+                            Index).UpdateImageTypeId
+                        HardwareInstance = FmpCapsuleHeader.GetFmpCapsuleImageHeader(
+                            Index).UpdateHardwareInstance
+                        UpdateImageIndex = FmpCapsuleHeader.GetFmpCapsuleImageHeader(
+                            Index).UpdateImageIndex
+                        DecodeJsonOutput = args.OutputFile.name + \
+                            '.Payload.{Index:d}.bin'.format(Index=Index + 1)
+                        PayloadJsonDescriptorList.append(PayloadDescriptor(
+                            DecodeJsonOutput,
+                            GUID,
+                            None,
+                            None,
+                            None,
+                            HardwareInstance,
+                            UpdateImageIndex,
+                            PayloadDescriptorList[Index].SignToolPfxFile,
+                            PayloadDescriptorList[Index].SignToolSubjectName,
+                            PayloadDescriptorList[Index].OpenSslSignerPrivateCertFile,
+                            PayloadDescriptorList[Index].OpenSslOtherPublicCertFile,
+                            PayloadDescriptorList[Index].OpenSslTrustedPublicCertFile,
+                            PayloadDescriptorList[Index].SigningToolPath,
+                            None
+                        ))
                 JsonIndex = 0
                 for SinglePayloadDescriptor in PayloadDescriptorList:
                     if args.Verbose:
-                        print ('========')
-                        UefiCapsuleHeader.DumpInfo ()
-                        print ('--------')
-                        FmpCapsuleHeader.DumpInfo ()
+                        print('========')
+                        UefiCapsuleHeader.DumpInfo()
+                        print('--------')
+                        FmpCapsuleHeader.DumpInfo()
                     if FmpAuthHeader.IsSigned(SinglePayloadDescriptor.Payload):
                         if not SinglePayloadDescriptor.UseOpenSsl and not SinglePayloadDescriptor.UseSignTool:
-                            print ('GenerateCapsule: decode warning: can not verify signed payload without cert or pfx file. Index = {Index}'.format (Index = JsonIndex + 1))
-                        SinglePayloadDescriptor.Payload = FmpAuthHeader.Decode (SinglePayloadDescriptor.Payload)
+                            print('GenerateCapsule: decode warning: can not verify signed payload without cert or pfx file. Index = {Index}'.format(
+                                Index=JsonIndex + 1))
+                        SinglePayloadDescriptor.Payload = FmpAuthHeader.Decode(
+                            SinglePayloadDescriptor.Payload)
                         PayloadJsonDescriptorList[JsonIndex].MonotonicCount = FmpAuthHeader.MonotonicCount
                         if args.Verbose:
-                            print ('--------')
-                            FmpAuthHeader.DumpInfo ()
+                            print('--------')
+                            FmpAuthHeader.DumpInfo()
 
                         #
                         # Verify Image with 64-bit MonotonicCount appended to end of image
                         #
                         try:
-                          if SinglePayloadDescriptor.UseSignTool:
-                              CertData = VerifyPayloadSignTool (
-                                           FmpAuthHeader.Payload + struct.pack ('<Q', FmpAuthHeader.MonotonicCount),
-                                           FmpAuthHeader.CertData,
-                                           SinglePayloadDescriptor.SigningToolPath,
-                                           SinglePayloadDescriptor.SignToolPfxFile,
-                                           SinglePayloadDescriptor.SignToolSubjectName,
-                                           Verbose = args.Verbose
-                                           )
-                          else:
-                              CertData = VerifyPayloadOpenSsl (
-                                           FmpAuthHeader.Payload + struct.pack ('<Q', FmpAuthHeader.MonotonicCount),
-                                           FmpAuthHeader.CertData,
-                                           SinglePayloadDescriptor.SigningToolPath,
-                                           SinglePayloadDescriptor.OpenSslSignerPrivateCertFile,
-                                           SinglePayloadDescriptor.OpenSslOtherPublicCertFile,
-                                           SinglePayloadDescriptor.OpenSslTrustedPublicCertFile,
-                                           Verbose = args.Verbose
-                                           )
+                            if SinglePayloadDescriptor.UseSignTool:
+                                CertData = VerifyPayloadSignTool(
+                                    FmpAuthHeader.Payload +
+                                    struct.pack(
+                                        '<Q', FmpAuthHeader.MonotonicCount),
+                                    FmpAuthHeader.CertData,
+                                    SinglePayloadDescriptor.SigningToolPath,
+                                    SinglePayloadDescriptor.SignToolPfxFile,
+                                    SinglePayloadDescriptor.SignToolSubjectName,
+                                    Verbose=args.Verbose
+                                )
+                            else:
+                                CertData = VerifyPayloadOpenSsl(
+                                    FmpAuthHeader.Payload +
+                                    struct.pack(
+                                        '<Q', FmpAuthHeader.MonotonicCount),
+                                    FmpAuthHeader.CertData,
+                                    SinglePayloadDescriptor.SigningToolPath,
+                                    SinglePayloadDescriptor.OpenSslSignerPrivateCertFile,
+                                    SinglePayloadDescriptor.OpenSslOtherPublicCertFile,
+                                    SinglePayloadDescriptor.OpenSslTrustedPublicCertFile,
+                                    Verbose=args.Verbose
+                                )
                         except Exception as Msg:
-                            print ('GenerateCapsule: warning: payload verification failed Index = {Index} \n'.format (Index = JsonIndex + 1) + str(Msg))
+                            print('GenerateCapsule: warning: payload verification failed Index = {Index} \n'.format(
+                                Index=JsonIndex + 1) + str(Msg))
                     else:
                         if args.Verbose:
-                            print ('--------')
-                            print ('No EFI_FIRMWARE_IMAGE_AUTHENTICATION')
+                            print('--------')
+                            print('No EFI_FIRMWARE_IMAGE_AUTHENTICATION')
 
-                    PayloadSignature = struct.unpack ('<I', SinglePayloadDescriptor.Payload[0:4])
+                    PayloadSignature = struct.unpack(
+                        '<I', SinglePayloadDescriptor.Payload[0:4])
                     if PayloadSignature != FmpPayloadHeader.Signature:
                         SinglePayloadDescriptor.UseDependency = True
                         try:
-                            SinglePayloadDescriptor.Payload = CapsuleDependency.Decode (SinglePayloadDescriptor.Payload)
+                            SinglePayloadDescriptor.Payload = CapsuleDependency.Decode(
+                                SinglePayloadDescriptor.Payload)
                             PayloadJsonDescriptorList[JsonIndex].DepexExp = CapsuleDependency.DepexExp
                             if args.Verbose:
-                                print ('--------')
-                                CapsuleDependency.DumpInfo ()
+                                print('--------')
+                                CapsuleDependency.DumpInfo()
                         except Exception as Msg:
-                            print ('GenerateCapsule: error: invalid dependency expression')
+                            print(
+                                'GenerateCapsule: error: invalid dependency expression')
                     else:
                         if args.Verbose:
-                            print ('--------')
-                            print ('No EFI_FIRMWARE_IMAGE_DEP')
+                            print('--------')
+                            print('No EFI_FIRMWARE_IMAGE_DEP')
 
                     try:
-                        SinglePayloadDescriptor.Payload = FmpPayloadHeader.Decode (SinglePayloadDescriptor.Payload)
+                        SinglePayloadDescriptor.Payload = FmpPayloadHeader.Decode(
+                            SinglePayloadDescriptor.Payload)
                         PayloadJsonDescriptorList[JsonIndex].FwVersion = FmpPayloadHeader.FwVersion
                         PayloadJsonDescriptorList[JsonIndex].LowestSupportedVersion = FmpPayloadHeader.LowestSupportedVersion
                         JsonIndex = JsonIndex + 1
                         if args.Verbose:
-                            print ('--------')
-                            FmpPayloadHeader.DumpInfo ()
-                            print ('========')
+                            print('--------')
+                            FmpPayloadHeader.DumpInfo()
+                            print('========')
                     except:
                         if args.Verbose:
-                            print ('--------')
-                            print ('No FMP_PAYLOAD_HEADER')
-                            print ('========')
-                        sys.exit (1)
+                            print('--------')
+                            print('No FMP_PAYLOAD_HEADER')
+                            print('========')
+                        sys.exit(1)
                 #
                 # Write embedded driver file(s)
                 #
-                for Index in range (0, FmpCapsuleHeader.EmbeddedDriverCount):
-                    EmbeddedDriverBuffer = FmpCapsuleHeader.GetEmbeddedDriver (Index)
-                    EmbeddedDriverPath = args.OutputFile.name + '.EmbeddedDriver.{Index:d}.efi'.format (Index = Index + 1)
+                for Index in range(0, FmpCapsuleHeader.EmbeddedDriverCount):
+                    EmbeddedDriverBuffer = FmpCapsuleHeader.GetEmbeddedDriver(
+                        Index)
+                    EmbeddedDriverPath = args.OutputFile.name + \
+                        '.EmbeddedDriver.{Index:d}.efi'.format(Index=Index + 1)
                     try:
                         if args.Verbose:
-                            print ('Write embedded driver file {File}'.format (File = EmbeddedDriverPath))
-                        with open (EmbeddedDriverPath, 'wb') as EmbeddedDriverFile:
-                            EmbeddedDriverFile.write (EmbeddedDriverBuffer)
+                            print('Write embedded driver file {File}'.format(
+                                File=EmbeddedDriverPath))
+                        with open(EmbeddedDriverPath, 'wb') as EmbeddedDriverFile:
+                            EmbeddedDriverFile.write(EmbeddedDriverBuffer)
                     except:
-                        print ('GenerateCapsule: error: can not write embedded driver file {File}'.format (File = EmbeddedDriverPath))
-                        sys.exit (1)
+                        print('GenerateCapsule: error: can not write embedded driver file {File}'.format(
+                            File=EmbeddedDriverPath))
+                        sys.exit(1)
 
         except:
-            print ('GenerateCapsule: error: can not decode capsule')
-            sys.exit (1)
+            print('GenerateCapsule: error: can not decode capsule')
+            sys.exit(1)
         GenerateOutputJson(PayloadJsonDescriptorList)
         PayloadIndex = 0
         for SinglePayloadDescriptor in PayloadDescriptorList:
             if args.OutputFile is None:
-                print ('GenerateCapsule: Decode error: OutputFile is needed for decode output')
-                sys.exit (1)
+                print(
+                    'GenerateCapsule: Decode error: OutputFile is needed for decode output')
+                sys.exit(1)
             try:
                 if args.Verbose:
-                    print ('Write binary output file {File}'.format (File = args.OutputFile.name))
-                PayloadDecodePath = args.OutputFile.name + '.Payload.{Index:d}.bin'.format (Index = PayloadIndex + 1)
-                with open (PayloadDecodePath, 'wb') as PayloadDecodeFile:
-                    PayloadDecodeFile.write (SinglePayloadDescriptor.Payload)
+                    print('Write binary output file {File}'.format(
+                        File=args.OutputFile.name))
+                PayloadDecodePath = args.OutputFile.name + \
+                    '.Payload.{Index:d}.bin'.format(Index=PayloadIndex + 1)
+                with open(PayloadDecodePath, 'wb') as PayloadDecodeFile:
+                    PayloadDecodeFile.write(SinglePayloadDescriptor.Payload)
                 PayloadIndex = PayloadIndex + 1
             except:
-                print ('GenerateCapsule: error: can not write binary output file {File}'.format (File = SinglePayloadDescriptor.OutputFile.name))
-                sys.exit (1)
+                print('GenerateCapsule: error: can not write binary output file {File}'.format(
+                    File=SinglePayloadDescriptor.OutputFile.name))
+                sys.exit(1)
 
-    def DumpInfo (Buffer, args):
+    def DumpInfo(Buffer, args):
         if args.OutputFile is not None:
-            raise argparse.ArgumentTypeError ('the following option is not supported for dumpinfo operations: --output')
+            raise argparse.ArgumentTypeError(
+                'the following option is not supported for dumpinfo operations: --output')
         try:
-            Result = UefiCapsuleHeader.Decode (Buffer)
-            print ('========')
-            UefiCapsuleHeader.DumpInfo ()
-            if len (Result) > 0:
-                FmpCapsuleHeader.Decode (Result)
-                print ('--------')
-                FmpCapsuleHeader.DumpInfo ()
-                for Index in range (0, FmpCapsuleHeader.PayloadItemCount):
-                    Result = FmpCapsuleHeader.GetFmpCapsuleImageHeader (Index).Payload
+            Result = UefiCapsuleHeader.Decode(Buffer)
+            print('========')
+            UefiCapsuleHeader.DumpInfo()
+            if len(Result) > 0:
+                FmpCapsuleHeader.Decode(Result)
+                print('--------')
+                FmpCapsuleHeader.DumpInfo()
+                for Index in range(0, FmpCapsuleHeader.PayloadItemCount):
+                    Result = FmpCapsuleHeader.GetFmpCapsuleImageHeader(
+                        Index).Payload
                     try:
-                        Result = FmpAuthHeader.Decode (Result)
-                        print ('--------')
-                        FmpAuthHeader.DumpInfo ()
+                        Result = FmpAuthHeader.Decode(Result)
+                        print('--------')
+                        FmpAuthHeader.DumpInfo()
                     except:
-                        print ('--------')
-                        print ('No EFI_FIRMWARE_IMAGE_AUTHENTICATION')
+                        print('--------')
+                        print('No EFI_FIRMWARE_IMAGE_AUTHENTICATION')
 
-                    PayloadSignature = struct.unpack ('<I', Result[0:4])
+                    PayloadSignature = struct.unpack('<I', Result[0:4])
                     if PayloadSignature != FmpPayloadHeader.Signature:
                         try:
-                            Result = CapsuleDependency.Decode (Result)
-                            print ('--------')
-                            CapsuleDependency.DumpInfo ()
+                            Result = CapsuleDependency.Decode(Result)
+                            print('--------')
+                            CapsuleDependency.DumpInfo()
                         except:
-                            print ('GenerateCapsule: error: invalid dependency expression')
+                            print(
+                                'GenerateCapsule: error: invalid dependency expression')
                     else:
-                        print ('--------')
-                        print ('No EFI_FIRMWARE_IMAGE_DEP')
+                        print('--------')
+                        print('No EFI_FIRMWARE_IMAGE_DEP')
                     try:
-                        Result = FmpPayloadHeader.Decode (Result)
-                        print ('--------')
-                        FmpPayloadHeader.DumpInfo ()
+                        Result = FmpPayloadHeader.Decode(Result)
+                        print('--------')
+                        FmpPayloadHeader.DumpInfo()
                     except:
-                        print ('--------')
-                        print ('No FMP_PAYLOAD_HEADER')
-                    print ('========')
+                        print('--------')
+                        print('No FMP_PAYLOAD_HEADER')
+                    print('========')
         except:
-            print ('GenerateCapsule: error: can not decode capsule')
-            sys.exit (1)
+            print('GenerateCapsule: error: can not decode capsule')
+            sys.exit(1)
     #
     # Create command line argument parser object
     #
-    parser = argparse.ArgumentParser (
-                        prog = __prog__,
-                        description = __description__ + __copyright__,
-                        conflict_handler = 'resolve',
-                        fromfile_prefix_chars = '@'
-                        )
+    parser = argparse.ArgumentParser(
+        prog=__prog__,
+        description=__description__ + __copyright__,
+        conflict_handler='resolve',
+        fromfile_prefix_chars='@'
+    )
     parser.convert_arg_line_to_args = convert_arg_line_to_args
 
     #
     # Add input and output file arguments
     #
-    parser.add_argument("InputFile",  type = argparse.FileType('rb'), nargs='?',
-                        help = "Input binary payload filename.")
-    parser.add_argument("-o", "--output", dest = 'OutputFile', type = argparse.FileType('wb'),
-                        help = "Output filename.")
+    parser.add_argument("InputFile",  type=argparse.FileType('rb'), nargs='?',
+                        help="Input binary payload filename.")
+    parser.add_argument("-o", "--output", dest='OutputFile', type=argparse.FileType('wb'),
+                        help="Output filename.")
     #
     # Add group for -e and -d flags that are mutually exclusive and required
     #
-    group = parser.add_mutually_exclusive_group (required = True)
-    group.add_argument ("-e", "--encode", dest = 'Encode', action = "store_true",
-                        help = "Encode file")
-    group.add_argument ("-d", "--decode", dest = 'Decode', action = "store_true",
-                        help = "Decode file")
-    group.add_argument ("--dump-info", dest = 'DumpInfo', action = "store_true",
-                        help = "Display FMP Payload Header information")
+    group = parser.add_mutually_exclusive_group(required=True)
+    group.add_argument("-e", "--encode", dest='Encode', action="store_true",
+                       help="Encode file")
+    group.add_argument("-d", "--decode", dest='Decode', action="store_true",
+                       help="Decode file")
+    group.add_argument("--dump-info", dest='DumpInfo', action="store_true",
+                       help="Display FMP Payload Header information")
     #
     # Add optional arguments for this command
     #
-    parser.add_argument ("-j", "--json-file", dest = 'JsonFile', type=argparse.FileType('r'),
-                         help = "JSON configuration file for multiple payloads and embedded drivers.")
-    parser.add_argument ("--capflag", dest = 'CapsuleFlag', action='append', default = [],
-                         choices=['PersistAcrossReset', 'InitiateReset'],
-                         help = "Capsule flag can be PersistAcrossReset or InitiateReset or not set")
-    parser.add_argument ("--capoemflag", dest = 'CapsuleOemFlag', type = ValidateUnsignedInteger, default = 0x0000,
-                         help = "Capsule OEM Flag is an integer between 0x0000 and 0xffff.")
-
-    parser.add_argument ("--guid", dest = 'Guid', type = ValidateRegistryFormatGuid,
-                         help = "The FMP/ESRT GUID in registry format.  Required for single payload encode operations.")
-    parser.add_argument ("--hardware-instance", dest = 'HardwareInstance', type = ValidateUnsignedInteger, default = 0x0000000000000000,
-                         help = "The 64-bit hardware instance.  The default is 0x0000000000000000")
-
-
-    parser.add_argument ("--monotonic-count", dest = 'MonotonicCount', type = ValidateUnsignedInteger, default = 0x0000000000000000,
-                         help = "64-bit monotonic count value in header.  Default is 0x0000000000000000.")
-
-    parser.add_argument ("--fw-version", dest = 'FwVersion', type = ValidateUnsignedInteger,
-                         help = "The 32-bit version of the binary payload (e.g. 0x11223344 or 5678).  Required for encode operations.")
-    parser.add_argument ("--lsv", dest = 'LowestSupportedVersion', type = ValidateUnsignedInteger,
-                         help = "The 32-bit lowest supported version of the binary payload (e.g. 0x11223344 or 5678).  Required for encode operations.")
-
-    parser.add_argument ("--pfx-file", dest='SignToolPfxFile', type=argparse.FileType('rb'),
-                         help="signtool PFX certificate filename.")
-    parser.add_argument ("--subject-name", dest='SignToolSubjectName',
-                         help="signtool certificate subject name.")
-
-    parser.add_argument ("--signer-private-cert", dest='OpenSslSignerPrivateCertFile', type=argparse.FileType('rb'),
-                         help="OpenSSL signer private certificate filename.")
-    parser.add_argument ("--other-public-cert", dest='OpenSslOtherPublicCertFile', type=argparse.FileType('rb'),
-                         help="OpenSSL other public certificate filename.")
-    parser.add_argument ("--trusted-public-cert", dest='OpenSslTrustedPublicCertFile', type=argparse.FileType('rb'),
-                         help="OpenSSL trusted public certificate filename.")
-
-    parser.add_argument ("--signing-tool-path", dest = 'SigningToolPath',
-                         help = "Path to signtool or OpenSSL tool.  Optional if path to tools are already in PATH.")
-
-    parser.add_argument ("--embedded-driver", dest = 'EmbeddedDriver', type = argparse.FileType('rb'), action='append', default = [],
-                         help = "Path to embedded UEFI driver to add to capsule.")
+    parser.add_argument("-j", "--json-file", dest='JsonFile', type=argparse.FileType('r'),
+                        help="JSON configuration file for multiple payloads and embedded drivers.")
+    parser.add_argument("--capflag", dest='CapsuleFlag', action='append', default=[],
+                        choices=['PersistAcrossReset', 'InitiateReset'],
+                        help="Capsule flag can be PersistAcrossReset or InitiateReset or not set")
+    parser.add_argument("--capoemflag", dest='CapsuleOemFlag', type=ValidateUnsignedInteger, default=0x0000,
+                        help="Capsule OEM Flag is an integer between 0x0000 and 0xffff.")
+
+    parser.add_argument("--guid", dest='Guid', type=ValidateRegistryFormatGuid,
+                        help="The FMP/ESRT GUID in registry format.  Required for single payload encode operations.")
+    parser.add_argument("--hardware-instance", dest='HardwareInstance', type=ValidateUnsignedInteger, default=0x0000000000000000,
+                        help="The 64-bit hardware instance.  The default is 0x0000000000000000")
+
+    parser.add_argument("--monotonic-count", dest='MonotonicCount', type=ValidateUnsignedInteger, default=0x0000000000000000,
+                        help="64-bit monotonic count value in header.  Default is 0x0000000000000000.")
+
+    parser.add_argument("--fw-version", dest='FwVersion', type=ValidateUnsignedInteger,
+                        help="The 32-bit version of the binary payload (e.g. 0x11223344 or 5678).  Required for encode operations.")
+    parser.add_argument("--lsv", dest='LowestSupportedVersion', type=ValidateUnsignedInteger,
+                        help="The 32-bit lowest supported version of the binary payload (e.g. 0x11223344 or 5678).  Required for encode operations.")
+
+    parser.add_argument("--pfx-file", dest='SignToolPfxFile', type=argparse.FileType('rb'),
+                        help="signtool PFX certificate filename.")
+    parser.add_argument("--subject-name", dest='SignToolSubjectName',
+                        help="signtool certificate subject name.")
+
+    parser.add_argument("--signer-private-cert", dest='OpenSslSignerPrivateCertFile', type=argparse.FileType('rb'),
+                        help="OpenSSL signer private certificate filename.")
+    parser.add_argument("--other-public-cert", dest='OpenSslOtherPublicCertFile', type=argparse.FileType('rb'),
+                        help="OpenSSL other public certificate filename.")
+    parser.add_argument("--trusted-public-cert", dest='OpenSslTrustedPublicCertFile', type=argparse.FileType('rb'),
+                        help="OpenSSL trusted public certificate filename.")
+
+    parser.add_argument("--signing-tool-path", dest='SigningToolPath',
+                        help="Path to signtool or OpenSSL tool.  Optional if path to tools are already in PATH.")
+
+    parser.add_argument("--embedded-driver", dest='EmbeddedDriver', type=argparse.FileType('rb'), action='append', default=[],
+                        help="Path to embedded UEFI driver to add to capsule.")
 
     #
     # Add optional arguments common to all operations
     #
-    parser.add_argument ('--version', action='version', version='%(prog)s ' + __version__)
-    parser.add_argument ("-v", "--verbose", dest = 'Verbose', action = "store_true",
-                         help = "Turn on verbose output with informational messages printed, including capsule headers and warning messages.")
-    parser.add_argument ("-q", "--quiet", dest = 'Quiet', action = "store_true",
-                         help = "Disable all messages except fatal errors.")
-    parser.add_argument ("--debug", dest = 'Debug', type = int, metavar = '[0-9]', choices = range (0, 10), default = 0,
-                         help = "Set debug level")
-    parser.add_argument ("--update-image-index", dest = 'UpdateImageIndex', type = ValidateUnsignedInteger, default = 0x01, help = "unique number identifying the firmware image within the device ")
+    parser.add_argument('--version', action='version',
+                        version='%(prog)s ' + __version__)
+    parser.add_argument("-v", "--verbose", dest='Verbose', action="store_true",
+                        help="Turn on verbose output with informational messages printed, including capsule headers and warning messages.")
+    parser.add_argument("-q", "--quiet", dest='Quiet', action="store_true",
+                        help="Disable all messages except fatal errors.")
+    parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=range(0, 10), default=0,
+                        help="Set debug level")
+    parser.add_argument("--update-image-index", dest='UpdateImageIndex', type=ValidateUnsignedInteger,
+                        default=0x01, help="unique number identifying the firmware image within the device ")
 
     #
     # Parse command line arguments
@@ -1029,48 +1163,51 @@ if __name__ == '__main__':
     #
     Buffer = ''
     if args.InputFile:
-        if os.path.getsize (args.InputFile.name) == 0:
-            print ('GenerateCapsule: error: InputFile {File} is empty'.format (File = args.InputFile.name))
-            sys.exit (1)
+        if os.path.getsize(args.InputFile.name) == 0:
+            print('GenerateCapsule: error: InputFile {File} is empty'.format(
+                File=args.InputFile.name))
+            sys.exit(1)
         try:
             if args.Verbose:
-                print ('Read binary input file {File}'.format (File = args.InputFile.name))
-            Buffer = args.InputFile.read ()
-            args.InputFile.close ()
+                print('Read binary input file {File}'.format(
+                    File=args.InputFile.name))
+            Buffer = args.InputFile.read()
+            args.InputFile.close()
         except:
-            print ('GenerateCapsule: error: can not read binary input file {File}'.format (File = args.InputFile.name))
-            sys.exit (1)
+            print('GenerateCapsule: error: can not read binary input file {File}'.format(
+                File=args.InputFile.name))
+            sys.exit(1)
 
     #
     # Create objects
     #
-    UefiCapsuleHeader = UefiCapsuleHeaderClass ()
-    FmpCapsuleHeader  = FmpCapsuleHeaderClass ()
-    FmpAuthHeader     = FmpAuthHeaderClass ()
-    FmpPayloadHeader  = FmpPayloadHeaderClass ()
-    CapsuleDependency = CapsuleDependencyClass ()
+    UefiCapsuleHeader = UefiCapsuleHeaderClass()
+    FmpCapsuleHeader = FmpCapsuleHeaderClass()
+    FmpAuthHeader = FmpAuthHeaderClass()
+    FmpPayloadHeader = FmpPayloadHeaderClass()
+    CapsuleDependency = CapsuleDependencyClass()
 
     EmbeddedDriverDescriptorList = []
     PayloadDescriptorList = []
     PayloadJsonDescriptorList = []
 
     #
-    #Encode Operation
+    # Encode Operation
     #
     if args.Encode:
-        Encode (PayloadDescriptorList, EmbeddedDriverDescriptorList, Buffer)
+        Encode(PayloadDescriptorList, EmbeddedDriverDescriptorList, Buffer)
 
     #
-    #Decode Operation
+    # Decode Operation
     #
     if args.Decode:
-        Decode (PayloadDescriptorList, PayloadJsonDescriptorList, Buffer)
+        Decode(PayloadDescriptorList, PayloadJsonDescriptorList, Buffer)
 
     #
-    #Dump Info Operation
+    # Dump Info Operation
     #
     if args.DumpInfo:
-        DumpInfo (Buffer, args)
+        DumpInfo(Buffer, args)
 
     if args.Verbose:
         print('Success')
diff --git a/BaseTools/Source/Python/Capsule/GenerateWindowsDriver.py b/BaseTools/Source/Python/Capsule/GenerateWindowsDriver.py
index bea7a0df387c..d9726ed25051 100644
--- a/BaseTools/Source/Python/Capsule/GenerateWindowsDriver.py
+++ b/BaseTools/Source/Python/Capsule/GenerateWindowsDriver.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Generate a capsule windows driver.
 #
 # Copyright (c) 2019, Intel Corporation. All rights reserved.<BR>
@@ -21,30 +21,33 @@ import platform
 import re
 import logging
 from WindowsCapsuleSupportHelper import WindowsCapsuleSupportHelper
-from Common.Uefi.Capsule.FmpCapsuleHeader  import FmpCapsuleHeaderClass
+from Common.Uefi.Capsule.FmpCapsuleHeader import FmpCapsuleHeaderClass
 from Common.Uefi.Capsule.UefiCapsuleHeader import UefiCapsuleHeaderClass
 
 #
 # Globals for help information
 #
-__prog__        = 'GenerateWindowsDriver'
-__version__     = '0.0'
-__copyright__   = 'Copyright (c) 2019, Intel Corporation. All rights reserved.'
+__prog__ = 'GenerateWindowsDriver'
+__version__ = '0.0'
+__copyright__ = 'Copyright (c) 2019, Intel Corporation. All rights reserved.'
 __description__ = 'Generate Capsule Windows Driver.\n'
 
-def GetCapGuid (InputFile):
+
+def GetCapGuid(InputFile):
     with open(InputFile, 'rb') as File:
         Buffer = File.read()
     try:
-        Result = UefiCapsuleHeader.Decode (Buffer)
-        if len (Result) > 0:
-            FmpCapsuleHeader.Decode (Result)
-            for index in range (0, FmpCapsuleHeader.PayloadItemCount):
-                Guid = FmpCapsuleHeader.GetFmpCapsuleImageHeader (index).UpdateImageTypeId
+        Result = UefiCapsuleHeader.Decode(Buffer)
+        if len(Result) > 0:
+            FmpCapsuleHeader.Decode(Result)
+            for index in range(0, FmpCapsuleHeader.PayloadItemCount):
+                Guid = FmpCapsuleHeader.GetFmpCapsuleImageHeader(
+                    index).UpdateImageTypeId
         return Guid
     except:
-        print ('GenerateCapsule: error: can not decode capsule')
-        sys.exit (1)
+        print('GenerateCapsule: error: can not decode capsule')
+        sys.exit(1)
+
 
 def ArgCheck(args):
     Version = args.CapsuleVersion_DotString.split('.')
@@ -53,20 +56,23 @@ def ArgCheck(args):
         logging.critical("Name invalid: '%s'", args.CapsuleVersion_DotString)
         raise ValueError("Name invalid.")
     for sub in Version:
-        if  int(sub, 16) > 65536:
-            logging.critical("Name invalid: '%s'", args.CapsuleVersion_DotString)
+        if int(sub, 16) > 65536:
+            logging.critical("Name invalid: '%s'",
+                             args.CapsuleVersion_DotString)
             raise ValueError("Name exceed limit 65536.")
 
     if not (re.compile(r'[\a-fA-F0-9]*$')).match(args.CapsuleVersion_DotString):
         logging.critical("Name invalid: '%s'", args.CapsuleVersion_DotString)
         raise ValueError("Name has invalid chars.")
 
+
 def CapsuleGuidCheck(InputFile, Guid):
     CapGuid = GetCapGuid(InputFile)
     if (str(Guid).lower() != str(CapGuid)):
         print('GenerateWindowsDriver error: Different Guid from Capsule')
         sys.exit(1)
 
+
 if __name__ == '__main__':
     def convert_arg_line_to_args(arg_line):
         for arg in arg_line.split():
@@ -74,47 +80,58 @@ if __name__ == '__main__':
                 continue
             yield arg
 
-    parser = argparse.ArgumentParser (
-                        prog = __prog__,
-                        description = __description__ + __copyright__,
-                        conflict_handler = 'resolve',
-                        fromfile_prefix_chars = '@'
-                        )
+    parser = argparse.ArgumentParser(
+        prog=__prog__,
+        description=__description__ + __copyright__,
+        conflict_handler='resolve',
+        fromfile_prefix_chars='@'
+    )
     parser.convert_arg_line_to_args = convert_arg_line_to_args
-    parser.add_argument("--output-folder", dest = 'OutputFolder', help = "firmware resource update driver package output folder.")
-    parser.add_argument("--product-fmp-guid", dest = 'ProductFmpGuid', help = "firmware GUID of resource update driver package")
-    parser.add_argument("--capsuleversion-dotstring", dest = 'CapsuleVersion_DotString', help = "firmware version with date on which update driver package is authored")
-    parser.add_argument("--capsuleversion-hexstring", dest = 'CapsuleVersion_HexString', help = "firmware version in Hex of update driver package")
-    parser.add_argument("--product-fw-provider", dest = 'ProductFwProvider', help = "vendor/provider of entire firmware resource update driver package")
-    parser.add_argument("--product-fw-mfg-name", dest = 'ProductFwMfgName', help = "manufacturer/vendor of firmware resource update driver package")
-    parser.add_argument("--product-fw-desc", dest = "ProductFwDesc", help = "description about resource update driver")
-    parser.add_argument("--capsule-file-name", dest = 'CapsuleFileName', help ="firmware resource image file")
-    parser.add_argument("--pfx-file", dest = 'PfxFile', help = "pfx file path used to sign resource update driver")
-    parser.add_argument("--arch", dest = 'Arch', help = "supported architecture:arm/x64/amd64/arm64/aarch64", default = 'amd64')
-    parser.add_argument("--operating-system-string", dest = 'OperatingSystemString', help = "supported operating system:win10/10/10_au/10_rs2/10_rs3/10_rs4/server10/server2016/serverrs2/serverrs3/serverrs4", default = "win10")
+    parser.add_argument("--output-folder", dest='OutputFolder',
+                        help="firmware resource update driver package output folder.")
+    parser.add_argument("--product-fmp-guid", dest='ProductFmpGuid',
+                        help="firmware GUID of resource update driver package")
+    parser.add_argument("--capsuleversion-dotstring", dest='CapsuleVersion_DotString',
+                        help="firmware version with date on which update driver package is authored")
+    parser.add_argument("--capsuleversion-hexstring", dest='CapsuleVersion_HexString',
+                        help="firmware version in Hex of update driver package")
+    parser.add_argument("--product-fw-provider", dest='ProductFwProvider',
+                        help="vendor/provider of entire firmware resource update driver package")
+    parser.add_argument("--product-fw-mfg-name", dest='ProductFwMfgName',
+                        help="manufacturer/vendor of firmware resource update driver package")
+    parser.add_argument("--product-fw-desc", dest="ProductFwDesc",
+                        help="description about resource update driver")
+    parser.add_argument("--capsule-file-name", dest='CapsuleFileName',
+                        help="firmware resource image file")
+    parser.add_argument("--pfx-file", dest='PfxFile',
+                        help="pfx file path used to sign resource update driver")
+    parser.add_argument("--arch", dest='Arch',
+                        help="supported architecture:arm/x64/amd64/arm64/aarch64", default='amd64')
+    parser.add_argument("--operating-system-string", dest='OperatingSystemString',
+                        help="supported operating system:win10/10/10_au/10_rs2/10_rs3/10_rs4/server10/server2016/serverrs2/serverrs3/serverrs4", default="win10")
 
     args = parser.parse_args()
     InputFile = os.path.join(args.OutputFolder, '') + args.CapsuleFileName
-    UefiCapsuleHeader = UefiCapsuleHeaderClass ()
-    FmpCapsuleHeader  = FmpCapsuleHeaderClass ()
+    UefiCapsuleHeader = UefiCapsuleHeaderClass()
+    FmpCapsuleHeader = FmpCapsuleHeaderClass()
     CapsuleGuidCheck(InputFile, args.ProductFmpGuid)
     ArgCheck(args)
     ProductName = os.path.splitext(args.CapsuleFileName)[0]
-    WindowsDriver = WindowsCapsuleSupportHelper ()
+    WindowsDriver = WindowsCapsuleSupportHelper()
 
-    WindowsDriver.PackageWindowsCapsuleFiles (
-                                                   args.OutputFolder,
-                                                   ProductName,
-                                                   args.ProductFmpGuid,
-                                                   args.CapsuleVersion_DotString,
-                                                   args.CapsuleVersion_HexString,
-                                                   args.ProductFwProvider,
-                                                   args.ProductFwMfgName,
-                                                   args.ProductFwDesc,
-                                                   args.CapsuleFileName,
-                                                   args.PfxFile,
-                                                   None,
-                                                   None,
-                                                   args.Arch,
-                                                   args.OperatingSystemString
-                                                   )
+    WindowsDriver.PackageWindowsCapsuleFiles(
+        args.OutputFolder,
+        ProductName,
+        args.ProductFmpGuid,
+        args.CapsuleVersion_DotString,
+        args.CapsuleVersion_HexString,
+        args.ProductFwProvider,
+        args.ProductFwMfgName,
+        args.ProductFwDesc,
+        args.CapsuleFileName,
+        args.PfxFile,
+        None,
+        None,
+        args.Arch,
+        args.OperatingSystemString
+    )
diff --git a/BaseTools/Source/Python/Capsule/WindowsCapsuleSupportHelper.py b/BaseTools/Source/Python/Capsule/WindowsCapsuleSupportHelper.py
index a29ac21ae890..d6b0aa04832c 100644
--- a/BaseTools/Source/Python/Capsule/WindowsCapsuleSupportHelper.py
+++ b/BaseTools/Source/Python/Capsule/WindowsCapsuleSupportHelper.py
@@ -21,44 +21,49 @@ from edk2toollib.windows.capsule.inf_generator import InfGenerator
 from edk2toollib.utility_functions import CatalogSignWithSignTool
 from edk2toollib.windows.locate_tools import FindToolInWinSdk
 
+
 class WindowsCapsuleSupportHelper(object):
 
-  def RegisterHelpers(self, obj):
-      fp = os.path.abspath(__file__)
-      obj.Register("PackageWindowsCapsuleFiles", WindowsCapsuleSupportHelper.PackageWindowsCapsuleFiles, fp)
-
-
-  @staticmethod
-  def PackageWindowsCapsuleFiles(OutputFolder, ProductName, ProductFmpGuid, CapsuleVersion_DotString,
-    CapsuleVersion_HexString, ProductFwProvider, ProductFwMfgName, ProductFwDesc, CapsuleFileName, PfxFile=None, PfxPass=None,
-    Rollback=False, Arch='amd64', OperatingSystem_String='Win10'):
-
-      logging.debug("CapsulePackage: Create Windows Capsule Files")
-
-      #Make INF
-      InfFilePath = os.path.join(OutputFolder, ProductName + ".inf")
-      InfTool = InfGenerator(ProductName, ProductFwProvider, ProductFmpGuid, Arch, ProductFwDesc, CapsuleVersion_DotString, CapsuleVersion_HexString)
-      InfTool.Manufacturer = ProductFwMfgName  #optional
-      ret = InfTool.MakeInf(InfFilePath, CapsuleFileName, Rollback)
-      if(ret != 0):
-          raise Exception("CreateWindowsInf Failed with errorcode %d" % ret)
-
-      #Make CAT
-      CatFilePath = os.path.realpath(os.path.join(OutputFolder, ProductName + ".cat"))
-      CatTool = CatGenerator(Arch, OperatingSystem_String)
-      ret = CatTool.MakeCat(CatFilePath)
-
-      if(ret != 0):
-          raise Exception("Creating Cat file Failed with errorcode %d" % ret)
-
-      if(PfxFile is not None):
-          #Find Signtool
-          SignToolPath = FindToolInWinSdk("signtool.exe")
-          if not os.path.exists(SignToolPath):
-              raise Exception("Can't find signtool on this machine.")
-          #dev sign the cat file
-          ret = CatalogSignWithSignTool(SignToolPath, CatFilePath, PfxFile, PfxPass)
-          if(ret != 0):
-              raise Exception("Signing Cat file Failed with errorcode %d" % ret)
-
-      return ret
+    def RegisterHelpers(self, obj):
+        fp = os.path.abspath(__file__)
+        obj.Register("PackageWindowsCapsuleFiles",
+                     WindowsCapsuleSupportHelper.PackageWindowsCapsuleFiles, fp)
+
+    @staticmethod
+    def PackageWindowsCapsuleFiles(OutputFolder, ProductName, ProductFmpGuid, CapsuleVersion_DotString,
+                                   CapsuleVersion_HexString, ProductFwProvider, ProductFwMfgName, ProductFwDesc, CapsuleFileName, PfxFile=None, PfxPass=None,
+                                   Rollback=False, Arch='amd64', OperatingSystem_String='Win10'):
+
+        logging.debug("CapsulePackage: Create Windows Capsule Files")
+
+        # Make INF
+        InfFilePath = os.path.join(OutputFolder, ProductName + ".inf")
+        InfTool = InfGenerator(ProductName, ProductFwProvider, ProductFmpGuid,
+                               Arch, ProductFwDesc, CapsuleVersion_DotString, CapsuleVersion_HexString)
+        InfTool.Manufacturer = ProductFwMfgName  # optional
+        ret = InfTool.MakeInf(InfFilePath, CapsuleFileName, Rollback)
+        if(ret != 0):
+            raise Exception("CreateWindowsInf Failed with errorcode %d" % ret)
+
+        # Make CAT
+        CatFilePath = os.path.realpath(
+            os.path.join(OutputFolder, ProductName + ".cat"))
+        CatTool = CatGenerator(Arch, OperatingSystem_String)
+        ret = CatTool.MakeCat(CatFilePath)
+
+        if(ret != 0):
+            raise Exception("Creating Cat file Failed with errorcode %d" % ret)
+
+        if(PfxFile is not None):
+            # Find Signtool
+            SignToolPath = FindToolInWinSdk("signtool.exe")
+            if not os.path.exists(SignToolPath):
+                raise Exception("Can't find signtool on this machine.")
+            # dev sign the cat file
+            ret = CatalogSignWithSignTool(
+                SignToolPath, CatFilePath, PfxFile, PfxPass)
+            if(ret != 0):
+                raise Exception(
+                    "Signing Cat file Failed with errorcode %d" % ret)
+
+        return ret
diff --git a/BaseTools/Source/Python/Common/BuildToolError.py b/BaseTools/Source/Python/Common/BuildToolError.py
index 21549683cd19..2efd364fc4b3 100644
--- a/BaseTools/Source/Python/Common/BuildToolError.py
+++ b/BaseTools/Source/Python/Common/BuildToolError.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Standardized Error Handling infrastructures.
 #
 # Copyright (c) 2007 - 2016, Intel Corporation. All rights reserved.<BR>
@@ -34,7 +34,7 @@ OPTION_UNKNOWN_ERROR = 0x1FFF
 
 PARAMETER_INVALID = 0x2000
 PARAMETER_MISSING = 0x2001
-PARAMETER_UNKNOWN_ERROR =0x2FFF
+PARAMETER_UNKNOWN_ERROR = 0x2FFF
 
 FORMAT_INVALID = 0x3000
 FORMAT_NOT_SUPPORTED = 0x3001
@@ -89,72 +89,75 @@ ERROR_STATEMENT = 0xFFFD
 ABORT_ERROR = 0xFFFE
 UNKNOWN_ERROR = 0xFFFF
 
-## Error message of each error code
+# Error message of each error code
 gErrorMessage = {
-    FILE_NOT_FOUND          :   "File/directory not found in workspace",
-    FILE_OPEN_FAILURE       :   "File open failure",
-    FILE_WRITE_FAILURE      :   "File write failure",
-    FILE_PARSE_FAILURE      :   "File parse failure",
-    FILE_READ_FAILURE       :   "File read failure",
-    FILE_CREATE_FAILURE     :   "File create failure",
-    FILE_CHECKSUM_FAILURE   :   "Invalid checksum of file",
-    FILE_COMPRESS_FAILURE   :   "File compress failure",
-    FILE_DECOMPRESS_FAILURE :   "File decompress failure",
-    FILE_MOVE_FAILURE       :   "File move failure",
-    FILE_DELETE_FAILURE     :   "File delete failure",
-    FILE_COPY_FAILURE       :   "File copy failure",
+    FILE_NOT_FOUND:   "File/directory not found in workspace",
+    FILE_OPEN_FAILURE:   "File open failure",
+    FILE_WRITE_FAILURE:   "File write failure",
+    FILE_PARSE_FAILURE:   "File parse failure",
+    FILE_READ_FAILURE:   "File read failure",
+    FILE_CREATE_FAILURE:   "File create failure",
+    FILE_CHECKSUM_FAILURE:   "Invalid checksum of file",
+    FILE_COMPRESS_FAILURE:   "File compress failure",
+    FILE_DECOMPRESS_FAILURE:   "File decompress failure",
+    FILE_MOVE_FAILURE:   "File move failure",
+    FILE_DELETE_FAILURE:   "File delete failure",
+    FILE_COPY_FAILURE:   "File copy failure",
     FILE_POSITIONING_FAILURE:   "Failed to seeking position",
-    FILE_ALREADY_EXIST      :   "File or directory already exists",
-    FILE_TYPE_MISMATCH      :   "Incorrect file type",
-    FILE_CASE_MISMATCH      :   "File name case mismatch",
-    FILE_DUPLICATED         :   "Duplicated file found",
-    FILE_UNKNOWN_ERROR      :   "Unknown error encountered on file",
+    FILE_ALREADY_EXIST:   "File or directory already exists",
+    FILE_TYPE_MISMATCH:   "Incorrect file type",
+    FILE_CASE_MISMATCH:   "File name case mismatch",
+    FILE_DUPLICATED:   "Duplicated file found",
+    FILE_UNKNOWN_ERROR:   "Unknown error encountered on file",
 
-    OPTION_UNKNOWN          :   "Unknown option",
-    OPTION_MISSING          :   "Missing option",
-    OPTION_CONFLICT         :   "Conflict options",
-    OPTION_VALUE_INVALID    :   "Invalid value of option",
-    OPTION_DEPRECATED       :   "Deprecated option",
-    OPTION_NOT_SUPPORTED    :   "Unsupported option",
-    OPTION_UNKNOWN_ERROR    :   "Unknown error when processing options",
+    OPTION_UNKNOWN:   "Unknown option",
+    OPTION_MISSING:   "Missing option",
+    OPTION_CONFLICT:   "Conflict options",
+    OPTION_VALUE_INVALID:   "Invalid value of option",
+    OPTION_DEPRECATED:   "Deprecated option",
+    OPTION_NOT_SUPPORTED:   "Unsupported option",
+    OPTION_UNKNOWN_ERROR:   "Unknown error when processing options",
 
-    PARAMETER_INVALID       :   "Invalid parameter",
-    PARAMETER_MISSING       :   "Missing parameter",
-    PARAMETER_UNKNOWN_ERROR :   "Unknown error in parameters",
+    PARAMETER_INVALID:   "Invalid parameter",
+    PARAMETER_MISSING:   "Missing parameter",
+    PARAMETER_UNKNOWN_ERROR:   "Unknown error in parameters",
 
-    FORMAT_INVALID          :   "Invalid syntax/format",
-    FORMAT_NOT_SUPPORTED    :   "Not supported syntax/format",
-    FORMAT_UNKNOWN          :   "Unknown format",
-    FORMAT_UNKNOWN_ERROR    :   "Unknown error in syntax/format ",
+    FORMAT_INVALID:   "Invalid syntax/format",
+    FORMAT_NOT_SUPPORTED:   "Not supported syntax/format",
+    FORMAT_UNKNOWN:   "Unknown format",
+    FORMAT_UNKNOWN_ERROR:   "Unknown error in syntax/format ",
 
-    RESOURCE_NOT_AVAILABLE  :   "Not available",
-    RESOURCE_ALLOCATE_FAILURE :   "Allocate failure",
-    RESOURCE_FULL           :   "Full",
-    RESOURCE_OVERFLOW       :   "Overflow",
-    RESOURCE_UNDERRUN       :   "Underrun",
-    RESOURCE_UNKNOWN_ERROR  :   "Unknown error",
+    RESOURCE_NOT_AVAILABLE:   "Not available",
+    RESOURCE_ALLOCATE_FAILURE:   "Allocate failure",
+    RESOURCE_FULL:   "Full",
+    RESOURCE_OVERFLOW:   "Overflow",
+    RESOURCE_UNDERRUN:   "Underrun",
+    RESOURCE_UNKNOWN_ERROR:   "Unknown error",
 
-    ATTRIBUTE_NOT_AVAILABLE :   "Not available",
-    ATTRIBUTE_GET_FAILURE   :   "Failed to retrieve",
-    ATTRIBUTE_SET_FAILURE   :   "Failed to set",
+    ATTRIBUTE_NOT_AVAILABLE:   "Not available",
+    ATTRIBUTE_GET_FAILURE:   "Failed to retrieve",
+    ATTRIBUTE_SET_FAILURE:   "Failed to set",
     ATTRIBUTE_UPDATE_FAILURE:   "Failed to update",
-    ATTRIBUTE_ACCESS_DENIED :   "Access denied",
-    ATTRIBUTE_UNKNOWN_ERROR :   "Unknown error when accessing",
+    ATTRIBUTE_ACCESS_DENIED:   "Access denied",
+    ATTRIBUTE_UNKNOWN_ERROR:   "Unknown error when accessing",
 
-    COMMAND_FAILURE         :   "Failed to execute command",
+    COMMAND_FAILURE:   "Failed to execute command",
 
-    IO_NOT_READY            :   "Not ready",
-    IO_BUSY                 :   "Busy",
-    IO_TIMEOUT              :   "Timeout",
-    IO_UNKNOWN_ERROR        :   "Unknown error in IO operation",
+    IO_NOT_READY:   "Not ready",
+    IO_BUSY:   "Busy",
+    IO_TIMEOUT:   "Timeout",
+    IO_UNKNOWN_ERROR:   "Unknown error in IO operation",
 
-    ERROR_STATEMENT         :   "!error statement",
-    UNKNOWN_ERROR           :   "Unknown error",
+    ERROR_STATEMENT:   "!error statement",
+    UNKNOWN_ERROR:   "Unknown error",
 }
 
-## Exception indicating a fatal error
+# Exception indicating a fatal error
+
+
 class FatalError(Exception):
     pass
 
+
 if __name__ == "__main__":
     pass
diff --git a/BaseTools/Source/Python/Common/BuildVersion.py b/BaseTools/Source/Python/Common/BuildVersion.py
index 088609f54170..8e276013bcce 100644
--- a/BaseTools/Source/Python/Common/BuildVersion.py
+++ b/BaseTools/Source/Python/Common/BuildVersion.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # This file is for build version number auto generation
 #
diff --git a/BaseTools/Source/Python/Common/DataType.py b/BaseTools/Source/Python/Common/DataType.py
index dc4962333351..67433b95c759 100644
--- a/BaseTools/Source/Python/Common/DataType.py
+++ b/BaseTools/Source/Python/Common/DataType.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define common static strings used by INF/DEC/DSC files
 #
 # Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -39,8 +39,10 @@ TAB_VOID = 'VOID*'
 TAB_GUID = 'GUID'
 
 TAB_PCD_CLEAN_NUMERIC_TYPES = {TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64}
-TAB_PCD_NUMERIC_TYPES = {TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, 'BOOLEAN'}
-TAB_PCD_NUMERIC_TYPES_VOID = {TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, 'BOOLEAN', TAB_VOID}
+TAB_PCD_NUMERIC_TYPES = {TAB_UINT8, TAB_UINT16,
+                         TAB_UINT32, TAB_UINT64, 'BOOLEAN'}
+TAB_PCD_NUMERIC_TYPES_VOID = {TAB_UINT8, TAB_UINT16,
+                              TAB_UINT32, TAB_UINT64, 'BOOLEAN', TAB_VOID}
 
 TAB_WORKSPACE = '$(WORKSPACE)'
 TAB_FV_DIRECTORY = 'FV'
@@ -55,7 +57,8 @@ TAB_ARCH_AARCH64 = 'AARCH64'
 
 TAB_ARCH_RISCV64 = 'RISCV64'
 
-ARCH_SET_FULL = {TAB_ARCH_IA32, TAB_ARCH_X64, TAB_ARCH_ARM, TAB_ARCH_EBC, TAB_ARCH_AARCH64, TAB_ARCH_RISCV64, TAB_ARCH_COMMON}
+ARCH_SET_FULL = {TAB_ARCH_IA32, TAB_ARCH_X64, TAB_ARCH_ARM,
+                 TAB_ARCH_EBC, TAB_ARCH_AARCH64, TAB_ARCH_RISCV64, TAB_ARCH_COMMON}
 
 SUP_MODULE_BASE = 'BASE'
 SUP_MODULE_SEC = 'SEC'
@@ -74,8 +77,8 @@ SUP_MODULE_SMM_CORE = 'SMM_CORE'
 SUP_MODULE_MM_STANDALONE = 'MM_STANDALONE'
 SUP_MODULE_MM_CORE_STANDALONE = 'MM_CORE_STANDALONE'
 
-SUP_MODULE_LIST = [SUP_MODULE_BASE, SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM, SUP_MODULE_DXE_CORE, SUP_MODULE_DXE_DRIVER, \
-                   SUP_MODULE_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_DXE_SMM_DRIVER, SUP_MODULE_UEFI_DRIVER, \
+SUP_MODULE_LIST = [SUP_MODULE_BASE, SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM, SUP_MODULE_DXE_CORE, SUP_MODULE_DXE_DRIVER,
+                   SUP_MODULE_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, SUP_MODULE_DXE_SMM_DRIVER, SUP_MODULE_UEFI_DRIVER,
                    SUP_MODULE_UEFI_APPLICATION, SUP_MODULE_USER_DEFINED, SUP_MODULE_HOST_APPLICATION, SUP_MODULE_SMM_CORE, SUP_MODULE_MM_STANDALONE, SUP_MODULE_MM_CORE_STANDALONE]
 SUP_MODULE_LIST_STRING = TAB_VALUE_SPLIT.join(SUP_MODULE_LIST)
 SUP_MODULE_SET_PEI = {SUP_MODULE_PEIM, SUP_MODULE_PEI_CORE}
@@ -95,18 +98,18 @@ EDKII_NAME = 'EDKII'
 MSG_EDKII_MAIL_ADDR = 'devel@edk2.groups.io'
 
 COMPONENT_TO_MODULE_MAP_DICT = {
-    EDK_COMPONENT_TYPE_LIBRARY               :   SUP_MODULE_BASE,
-    EDK_COMPONENT_TYPE_SECURITY_CORE         :   SUP_MODULE_SEC,
-    EDK_COMPONENT_TYPE_PEI_CORE              :   SUP_MODULE_PEI_CORE,
-    EDK_COMPONENT_TYPE_COMBINED_PEIM_DRIVER  :   SUP_MODULE_PEIM,
-    EDK_COMPONENT_TYPE_PIC_PEIM              :   SUP_MODULE_PEIM,
-    EDK_COMPONENT_TYPE_RELOCATABLE_PEIM      :   SUP_MODULE_PEIM,
-    "PE32_PEIM"                              :   SUP_MODULE_PEIM,
-    EDK_COMPONENT_TYPE_BS_DRIVER             :   SUP_MODULE_DXE_DRIVER,
-    EDK_COMPONENT_TYPE_RT_DRIVER             :   SUP_MODULE_DXE_RUNTIME_DRIVER,
-    EDK_COMPONENT_TYPE_SAL_RT_DRIVER         :   SUP_MODULE_DXE_SAL_DRIVER,
-    EDK_COMPONENT_TYPE_APPLICATION           :   SUP_MODULE_UEFI_APPLICATION,
-    "LOGO"                                   :   SUP_MODULE_BASE,
+    EDK_COMPONENT_TYPE_LIBRARY:   SUP_MODULE_BASE,
+    EDK_COMPONENT_TYPE_SECURITY_CORE:   SUP_MODULE_SEC,
+    EDK_COMPONENT_TYPE_PEI_CORE:   SUP_MODULE_PEI_CORE,
+    EDK_COMPONENT_TYPE_COMBINED_PEIM_DRIVER:   SUP_MODULE_PEIM,
+    EDK_COMPONENT_TYPE_PIC_PEIM:   SUP_MODULE_PEIM,
+    EDK_COMPONENT_TYPE_RELOCATABLE_PEIM:   SUP_MODULE_PEIM,
+    "PE32_PEIM":   SUP_MODULE_PEIM,
+    EDK_COMPONENT_TYPE_BS_DRIVER:   SUP_MODULE_DXE_DRIVER,
+    EDK_COMPONENT_TYPE_RT_DRIVER:   SUP_MODULE_DXE_RUNTIME_DRIVER,
+    EDK_COMPONENT_TYPE_SAL_RT_DRIVER:   SUP_MODULE_DXE_SAL_DRIVER,
+    EDK_COMPONENT_TYPE_APPLICATION:   SUP_MODULE_UEFI_APPLICATION,
+    "LOGO":   SUP_MODULE_BASE,
 }
 
 BINARY_FILE_TYPE_FW = 'FW'
@@ -208,57 +211,86 @@ TAB_PCDS_DYNAMIC_DEFAULT = 'DynamicDefault'
 TAB_PCDS_DYNAMIC_VPD = 'DynamicVpd'
 TAB_PCDS_DYNAMIC_HII = 'DynamicHii'
 
-PCD_DYNAMIC_TYPE_SET = {TAB_PCDS_DYNAMIC, TAB_PCDS_DYNAMIC_DEFAULT, TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_HII}
-PCD_DYNAMIC_EX_TYPE_SET = {TAB_PCDS_DYNAMIC_EX, TAB_PCDS_DYNAMIC_EX_DEFAULT, TAB_PCDS_DYNAMIC_EX_VPD, TAB_PCDS_DYNAMIC_EX_HII}
+PCD_DYNAMIC_TYPE_SET = {TAB_PCDS_DYNAMIC, TAB_PCDS_DYNAMIC_DEFAULT,
+                        TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_HII}
+PCD_DYNAMIC_EX_TYPE_SET = {TAB_PCDS_DYNAMIC_EX, TAB_PCDS_DYNAMIC_EX_DEFAULT,
+                           TAB_PCDS_DYNAMIC_EX_VPD, TAB_PCDS_DYNAMIC_EX_HII}
 
 # leave as a list for order
-PCD_TYPE_LIST = [TAB_PCDS_FIXED_AT_BUILD, TAB_PCDS_PATCHABLE_IN_MODULE, TAB_PCDS_FEATURE_FLAG, TAB_PCDS_DYNAMIC, TAB_PCDS_DYNAMIC_EX]
+PCD_TYPE_LIST = [TAB_PCDS_FIXED_AT_BUILD, TAB_PCDS_PATCHABLE_IN_MODULE,
+                 TAB_PCDS_FEATURE_FLAG, TAB_PCDS_DYNAMIC, TAB_PCDS_DYNAMIC_EX]
 
 TAB_PCDS_FIXED_AT_BUILD_NULL = TAB_PCDS + TAB_PCDS_FIXED_AT_BUILD
-TAB_PCDS_FIXED_AT_BUILD_COMMON = TAB_PCDS + TAB_PCDS_FIXED_AT_BUILD + TAB_SPLIT + TAB_ARCH_COMMON
-TAB_PCDS_FIXED_AT_BUILD_IA32 = TAB_PCDS + TAB_PCDS_FIXED_AT_BUILD + TAB_SPLIT + TAB_ARCH_IA32
-TAB_PCDS_FIXED_AT_BUILD_X64 = TAB_PCDS + TAB_PCDS_FIXED_AT_BUILD + TAB_SPLIT + TAB_ARCH_X64
-TAB_PCDS_FIXED_AT_BUILD_ARM = TAB_PCDS + TAB_PCDS_FIXED_AT_BUILD + TAB_SPLIT + TAB_ARCH_ARM
-TAB_PCDS_FIXED_AT_BUILD_EBC = TAB_PCDS + TAB_PCDS_FIXED_AT_BUILD + TAB_SPLIT + TAB_ARCH_EBC
-TAB_PCDS_FIXED_AT_BUILD_AARCH64 = TAB_PCDS + TAB_PCDS_FIXED_AT_BUILD + TAB_SPLIT + TAB_ARCH_AARCH64
+TAB_PCDS_FIXED_AT_BUILD_COMMON = TAB_PCDS + \
+    TAB_PCDS_FIXED_AT_BUILD + TAB_SPLIT + TAB_ARCH_COMMON
+TAB_PCDS_FIXED_AT_BUILD_IA32 = TAB_PCDS + \
+    TAB_PCDS_FIXED_AT_BUILD + TAB_SPLIT + TAB_ARCH_IA32
+TAB_PCDS_FIXED_AT_BUILD_X64 = TAB_PCDS + \
+    TAB_PCDS_FIXED_AT_BUILD + TAB_SPLIT + TAB_ARCH_X64
+TAB_PCDS_FIXED_AT_BUILD_ARM = TAB_PCDS + \
+    TAB_PCDS_FIXED_AT_BUILD + TAB_SPLIT + TAB_ARCH_ARM
+TAB_PCDS_FIXED_AT_BUILD_EBC = TAB_PCDS + \
+    TAB_PCDS_FIXED_AT_BUILD + TAB_SPLIT + TAB_ARCH_EBC
+TAB_PCDS_FIXED_AT_BUILD_AARCH64 = TAB_PCDS + \
+    TAB_PCDS_FIXED_AT_BUILD + TAB_SPLIT + TAB_ARCH_AARCH64
 
 TAB_PCDS_PATCHABLE_IN_MODULE_NULL = TAB_PCDS + TAB_PCDS_PATCHABLE_IN_MODULE
-TAB_PCDS_PATCHABLE_IN_MODULE_COMMON = TAB_PCDS + TAB_PCDS_PATCHABLE_IN_MODULE + TAB_SPLIT + TAB_ARCH_COMMON
-TAB_PCDS_PATCHABLE_IN_MODULE_IA32 = TAB_PCDS + TAB_PCDS_PATCHABLE_IN_MODULE + TAB_SPLIT + TAB_ARCH_IA32
-TAB_PCDS_PATCHABLE_IN_MODULE_X64 = TAB_PCDS + TAB_PCDS_PATCHABLE_IN_MODULE + TAB_SPLIT + TAB_ARCH_X64
-TAB_PCDS_PATCHABLE_IN_MODULE_ARM = TAB_PCDS + TAB_PCDS_PATCHABLE_IN_MODULE + TAB_SPLIT + TAB_ARCH_ARM
-TAB_PCDS_PATCHABLE_IN_MODULE_EBC = TAB_PCDS + TAB_PCDS_PATCHABLE_IN_MODULE + TAB_SPLIT + TAB_ARCH_EBC
-TAB_PCDS_PATCHABLE_IN_MODULE_AARCH64 = TAB_PCDS + TAB_PCDS_PATCHABLE_IN_MODULE + TAB_SPLIT + TAB_ARCH_AARCH64
+TAB_PCDS_PATCHABLE_IN_MODULE_COMMON = TAB_PCDS + \
+    TAB_PCDS_PATCHABLE_IN_MODULE + TAB_SPLIT + TAB_ARCH_COMMON
+TAB_PCDS_PATCHABLE_IN_MODULE_IA32 = TAB_PCDS + \
+    TAB_PCDS_PATCHABLE_IN_MODULE + TAB_SPLIT + TAB_ARCH_IA32
+TAB_PCDS_PATCHABLE_IN_MODULE_X64 = TAB_PCDS + \
+    TAB_PCDS_PATCHABLE_IN_MODULE + TAB_SPLIT + TAB_ARCH_X64
+TAB_PCDS_PATCHABLE_IN_MODULE_ARM = TAB_PCDS + \
+    TAB_PCDS_PATCHABLE_IN_MODULE + TAB_SPLIT + TAB_ARCH_ARM
+TAB_PCDS_PATCHABLE_IN_MODULE_EBC = TAB_PCDS + \
+    TAB_PCDS_PATCHABLE_IN_MODULE + TAB_SPLIT + TAB_ARCH_EBC
+TAB_PCDS_PATCHABLE_IN_MODULE_AARCH64 = TAB_PCDS + \
+    TAB_PCDS_PATCHABLE_IN_MODULE + TAB_SPLIT + TAB_ARCH_AARCH64
 
 TAB_PCDS_FEATURE_FLAG_NULL = TAB_PCDS + TAB_PCDS_FEATURE_FLAG
-TAB_PCDS_FEATURE_FLAG_COMMON = TAB_PCDS + TAB_PCDS_FEATURE_FLAG + TAB_SPLIT + TAB_ARCH_COMMON
-TAB_PCDS_FEATURE_FLAG_IA32 = TAB_PCDS + TAB_PCDS_FEATURE_FLAG + TAB_SPLIT + TAB_ARCH_IA32
-TAB_PCDS_FEATURE_FLAG_X64 = TAB_PCDS + TAB_PCDS_FEATURE_FLAG + TAB_SPLIT + TAB_ARCH_X64
-TAB_PCDS_FEATURE_FLAG_ARM = TAB_PCDS + TAB_PCDS_FEATURE_FLAG + TAB_SPLIT + TAB_ARCH_ARM
-TAB_PCDS_FEATURE_FLAG_EBC = TAB_PCDS + TAB_PCDS_FEATURE_FLAG + TAB_SPLIT + TAB_ARCH_EBC
-TAB_PCDS_FEATURE_FLAG_AARCH64 = TAB_PCDS + TAB_PCDS_FEATURE_FLAG + TAB_SPLIT + TAB_ARCH_AARCH64
+TAB_PCDS_FEATURE_FLAG_COMMON = TAB_PCDS + \
+    TAB_PCDS_FEATURE_FLAG + TAB_SPLIT + TAB_ARCH_COMMON
+TAB_PCDS_FEATURE_FLAG_IA32 = TAB_PCDS + \
+    TAB_PCDS_FEATURE_FLAG + TAB_SPLIT + TAB_ARCH_IA32
+TAB_PCDS_FEATURE_FLAG_X64 = TAB_PCDS + \
+    TAB_PCDS_FEATURE_FLAG + TAB_SPLIT + TAB_ARCH_X64
+TAB_PCDS_FEATURE_FLAG_ARM = TAB_PCDS + \
+    TAB_PCDS_FEATURE_FLAG + TAB_SPLIT + TAB_ARCH_ARM
+TAB_PCDS_FEATURE_FLAG_EBC = TAB_PCDS + \
+    TAB_PCDS_FEATURE_FLAG + TAB_SPLIT + TAB_ARCH_EBC
+TAB_PCDS_FEATURE_FLAG_AARCH64 = TAB_PCDS + \
+    TAB_PCDS_FEATURE_FLAG + TAB_SPLIT + TAB_ARCH_AARCH64
 
 TAB_PCDS_DYNAMIC_EX_NULL = TAB_PCDS + TAB_PCDS_DYNAMIC_EX
 TAB_PCDS_DYNAMIC_EX_DEFAULT_NULL = TAB_PCDS + TAB_PCDS_DYNAMIC_EX_DEFAULT
 TAB_PCDS_DYNAMIC_EX_HII_NULL = TAB_PCDS + TAB_PCDS_DYNAMIC_EX_HII
 TAB_PCDS_DYNAMIC_EX_VPD_NULL = TAB_PCDS + TAB_PCDS_DYNAMIC_EX_VPD
-TAB_PCDS_DYNAMIC_EX_COMMON = TAB_PCDS + TAB_PCDS_DYNAMIC_EX + TAB_SPLIT + TAB_ARCH_COMMON
-TAB_PCDS_DYNAMIC_EX_IA32 = TAB_PCDS + TAB_PCDS_DYNAMIC_EX + TAB_SPLIT + TAB_ARCH_IA32
-TAB_PCDS_DYNAMIC_EX_X64 = TAB_PCDS + TAB_PCDS_DYNAMIC_EX + TAB_SPLIT + TAB_ARCH_X64
-TAB_PCDS_DYNAMIC_EX_ARM = TAB_PCDS + TAB_PCDS_DYNAMIC_EX + TAB_SPLIT + TAB_ARCH_ARM
-TAB_PCDS_DYNAMIC_EX_EBC = TAB_PCDS + TAB_PCDS_DYNAMIC_EX + TAB_SPLIT + TAB_ARCH_EBC
-TAB_PCDS_DYNAMIC_EX_AARCH64 = TAB_PCDS + TAB_PCDS_DYNAMIC_EX + TAB_SPLIT + TAB_ARCH_AARCH64
+TAB_PCDS_DYNAMIC_EX_COMMON = TAB_PCDS + \
+    TAB_PCDS_DYNAMIC_EX + TAB_SPLIT + TAB_ARCH_COMMON
+TAB_PCDS_DYNAMIC_EX_IA32 = TAB_PCDS + \
+    TAB_PCDS_DYNAMIC_EX + TAB_SPLIT + TAB_ARCH_IA32
+TAB_PCDS_DYNAMIC_EX_X64 = TAB_PCDS + \
+    TAB_PCDS_DYNAMIC_EX + TAB_SPLIT + TAB_ARCH_X64
+TAB_PCDS_DYNAMIC_EX_ARM = TAB_PCDS + \
+    TAB_PCDS_DYNAMIC_EX + TAB_SPLIT + TAB_ARCH_ARM
+TAB_PCDS_DYNAMIC_EX_EBC = TAB_PCDS + \
+    TAB_PCDS_DYNAMIC_EX + TAB_SPLIT + TAB_ARCH_EBC
+TAB_PCDS_DYNAMIC_EX_AARCH64 = TAB_PCDS + \
+    TAB_PCDS_DYNAMIC_EX + TAB_SPLIT + TAB_ARCH_AARCH64
 
 TAB_PCDS_DYNAMIC_NULL = TAB_PCDS + TAB_PCDS_DYNAMIC
 TAB_PCDS_DYNAMIC_DEFAULT_NULL = TAB_PCDS + TAB_PCDS_DYNAMIC_DEFAULT
 TAB_PCDS_DYNAMIC_HII_NULL = TAB_PCDS + TAB_PCDS_DYNAMIC_HII
 TAB_PCDS_DYNAMIC_VPD_NULL = TAB_PCDS + TAB_PCDS_DYNAMIC_VPD
-TAB_PCDS_DYNAMIC_COMMON = TAB_PCDS + TAB_PCDS_DYNAMIC + TAB_SPLIT + TAB_ARCH_COMMON
+TAB_PCDS_DYNAMIC_COMMON = TAB_PCDS + \
+    TAB_PCDS_DYNAMIC + TAB_SPLIT + TAB_ARCH_COMMON
 TAB_PCDS_DYNAMIC_IA32 = TAB_PCDS + TAB_PCDS_DYNAMIC + TAB_SPLIT + TAB_ARCH_IA32
 TAB_PCDS_DYNAMIC_X64 = TAB_PCDS + TAB_PCDS_DYNAMIC + TAB_SPLIT + TAB_ARCH_X64
 TAB_PCDS_DYNAMIC_ARM = TAB_PCDS + TAB_PCDS_DYNAMIC + TAB_SPLIT + TAB_ARCH_ARM
 TAB_PCDS_DYNAMIC_EBC = TAB_PCDS + TAB_PCDS_DYNAMIC + TAB_SPLIT + TAB_ARCH_EBC
-TAB_PCDS_DYNAMIC_AARCH64 = TAB_PCDS + TAB_PCDS_DYNAMIC + TAB_SPLIT + TAB_ARCH_AARCH64
+TAB_PCDS_DYNAMIC_AARCH64 = TAB_PCDS + \
+    TAB_PCDS_DYNAMIC + TAB_SPLIT + TAB_ARCH_AARCH64
 
 TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_PEI_PAGE_SIZE = 'PcdLoadFixAddressPeiCodePageNumber'
 TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_PEI_PAGE_SIZE_DATA_TYPE = 'UINT32'
@@ -268,15 +300,17 @@ TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_RUNTIME_PAGE_SIZE = 'PcdLoadFixAddressRuntim
 TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_RUNTIME_PAGE_SIZE_DATA_TYPE = 'UINT32'
 TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_SMM_PAGE_SIZE = 'PcdLoadFixAddressSmmCodePageNumber'
 TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_SMM_PAGE_SIZE_DATA_TYPE = 'UINT32'
-TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_SET =  {TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_PEI_PAGE_SIZE, \
-                                            TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_DXE_PAGE_SIZE, \
-                                            TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_RUNTIME_PAGE_SIZE, \
-                                            TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_SMM_PAGE_SIZE}
+TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_SET = {TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_PEI_PAGE_SIZE,
+                                           TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_DXE_PAGE_SIZE,
+                                           TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_RUNTIME_PAGE_SIZE,
+                                           TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_SMM_PAGE_SIZE}
 
-## The mapping dictionary from datum type to its maximum number.
-MAX_VAL_TYPE = {"BOOLEAN":0x01, TAB_UINT8:0xFF, TAB_UINT16:0xFFFF, TAB_UINT32:0xFFFFFFFF, TAB_UINT64:0xFFFFFFFFFFFFFFFF}
-## The mapping dictionary from datum type to size string.
-MAX_SIZE_TYPE = {"BOOLEAN":1, TAB_UINT8:1, TAB_UINT16:2, TAB_UINT32:4, TAB_UINT64:8}
+# The mapping dictionary from datum type to its maximum number.
+MAX_VAL_TYPE = {"BOOLEAN": 0x01, TAB_UINT8: 0xFF, TAB_UINT16: 0xFFFF,
+                TAB_UINT32: 0xFFFFFFFF, TAB_UINT64: 0xFFFFFFFFFFFFFFFF}
+# The mapping dictionary from datum type to size string.
+MAX_SIZE_TYPE = {"BOOLEAN": 1, TAB_UINT8: 1,
+                 TAB_UINT16: 2, TAB_UINT32: 4, TAB_UINT64: 8}
 
 TAB_DEPEX = 'Depex'
 TAB_DEPEX_COMMON = TAB_DEPEX + TAB_SPLIT + TAB_ARCH_COMMON
@@ -343,7 +377,8 @@ TAB_INF_DEFINES_FFS_EXT = 'FFS_EXT'
 TAB_INF_DEFINES_FV_EXT = 'FV_EXT'
 TAB_INF_DEFINES_SOURCE_FV = 'SOURCE_FV'
 TAB_INF_DEFINES_VERSION_NUMBER = 'VERSION_NUMBER'
-TAB_INF_DEFINES_VERSION = 'VERSION'          # for Edk inf, the same as VERSION_NUMBER
+# for Edk inf, the same as VERSION_NUMBER
+TAB_INF_DEFINES_VERSION = 'VERSION'
 TAB_INF_DEFINES_VERSION_STRING = 'VERSION_STRING'
 TAB_INF_DEFINES_PCD_IS_DRIVER = 'PCD_IS_DRIVER'
 TAB_INF_DEFINES_TIANO_EDK_FLASHMAP_H = 'TIANO_EDK_FLASHMAP_H'
@@ -463,10 +498,11 @@ TAB_UNKNOWN = 'UNKNOWN'
 #
 # Build database path
 #
-DATABASE_PATH = ":memory:" #"BuildDatabase.db"
+DATABASE_PATH = ":memory:"  # "BuildDatabase.db"
 
 # used by ECC
-MODIFIER_SET = {'IN', 'OUT', 'OPTIONAL', 'UNALIGNED', 'EFI_RUNTIMESERVICE', 'EFI_BOOTSERVICE', 'EFIAPI'}
+MODIFIER_SET = {'IN', 'OUT', 'OPTIONAL', 'UNALIGNED',
+                'EFI_RUNTIMESERVICE', 'EFI_BOOTSERVICE', 'EFIAPI'}
 
 # Dependency Opcodes
 DEPEX_OPCODE_BEFORE = "BEFORE"
@@ -481,7 +517,8 @@ DEPEX_OPCODE_TRUE = "TRUE"
 DEPEX_OPCODE_FALSE = "FALSE"
 
 # Dependency Expression
-DEPEX_SUPPORTED_OPCODE_SET = {"BEFORE", "AFTER", "PUSH", "AND", "OR", "NOT", "END", "SOR", "TRUE", "FALSE", '(', ')'}
+DEPEX_SUPPORTED_OPCODE_SET = {"BEFORE", "AFTER", "PUSH",
+                              "AND", "OR", "NOT", "END", "SOR", "TRUE", "FALSE", '(', ')'}
 
 TAB_STATIC_LIBRARY = "STATIC-LIBRARY-FILE"
 TAB_DYNAMIC_LIBRARY = "DYNAMIC-LIBRARY-FILE"
@@ -513,29 +550,29 @@ PCDS_DYNAMICEX_DEFAULT = "PcdsDynamicExDefault"
 PCDS_DYNAMICEX_VPD = "PcdsDynamicExVpd"
 PCDS_DYNAMICEX_HII = "PcdsDynamicExHii"
 
-SECTIONS_HAVE_ITEM_PCD_SET = {PCDS_DYNAMIC_DEFAULT.upper(), PCDS_DYNAMIC_VPD.upper(), PCDS_DYNAMIC_HII.upper(), \
+SECTIONS_HAVE_ITEM_PCD_SET = {PCDS_DYNAMIC_DEFAULT.upper(), PCDS_DYNAMIC_VPD.upper(), PCDS_DYNAMIC_HII.upper(),
                               PCDS_DYNAMICEX_DEFAULT.upper(), PCDS_DYNAMICEX_VPD.upper(), PCDS_DYNAMICEX_HII.upper()}
 # Section allowed to have items after arch
 SECTIONS_HAVE_ITEM_AFTER_ARCH_SET = {TAB_LIBRARY_CLASSES.upper(), TAB_DEPEX.upper(), TAB_USER_EXTENSIONS.upper(),
-                                 PCDS_DYNAMIC_DEFAULT.upper(),
-                                 PCDS_DYNAMIC_VPD.upper(),
-                                 PCDS_DYNAMIC_HII.upper(),
-                                 PCDS_DYNAMICEX_DEFAULT.upper(),
-                                 PCDS_DYNAMICEX_VPD.upper(),
-                                 PCDS_DYNAMICEX_HII.upper(),
-                                 TAB_BUILD_OPTIONS.upper(),
-                                 TAB_PACKAGES.upper(),
-                                 TAB_INCLUDES.upper()}
+                                     PCDS_DYNAMIC_DEFAULT.upper(),
+                                     PCDS_DYNAMIC_VPD.upper(),
+                                     PCDS_DYNAMIC_HII.upper(),
+                                     PCDS_DYNAMICEX_DEFAULT.upper(),
+                                     PCDS_DYNAMICEX_VPD.upper(),
+                                     PCDS_DYNAMICEX_HII.upper(),
+                                     TAB_BUILD_OPTIONS.upper(),
+                                     TAB_PACKAGES.upper(),
+                                     TAB_INCLUDES.upper()}
 
 #
 # pack codes as used in PcdDb and elsewhere
 #
 PACK_PATTERN_GUID = '=LHHBBBBBBBB'
-PACK_CODE_BY_SIZE = {8:'=Q',
-                     4:'=L',
-                     2:'=H',
-                     1:'=B',
-                     0:'=B',
-                    16:""}
+PACK_CODE_BY_SIZE = {8: '=Q',
+                     4: '=L',
+                     2: '=H',
+                     1: '=B',
+                     0: '=B',
+                     16: ""}
 
 TAB_COMPILER_MSFT = 'MSFT'
diff --git a/BaseTools/Source/Python/Common/Edk2/Capsule/FmpPayloadHeader.py b/BaseTools/Source/Python/Common/Edk2/Capsule/FmpPayloadHeader.py
index ddc142c39ef2..c3dd899c85ae 100644
--- a/BaseTools/Source/Python/Common/Edk2/Capsule/FmpPayloadHeader.py
+++ b/BaseTools/Source/Python/Common/Edk2/Capsule/FmpPayloadHeader.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Module that encodes and decodes a FMP_PAYLOAD_HEADER with a payload.
 # The FMP_PAYLOAD_HEADER is processed by the FmpPayloadHeaderLib in the
 # FmpDevicePkg.
@@ -13,11 +13,14 @@ FmpPayloadHeader
 
 import struct
 
-def _SIGNATURE_32 (A, B, C, D):
-    return struct.unpack ('=I',bytearray (A + B + C + D, 'ascii'))[0]
 
-def _SIGNATURE_32_TO_STRING (Signature):
-    return struct.pack ("<I", Signature).decode ()
+def _SIGNATURE_32(A, B, C, D):
+    return struct.unpack('=I', bytearray(A + B + C + D, 'ascii'))[0]
+
+
+def _SIGNATURE_32_TO_STRING(Signature):
+    return struct.pack("<I", Signature).decode()
+
 
 class FmpPayloadHeaderClass (object):
     #
@@ -31,55 +34,60 @@ class FmpPayloadHeaderClass (object):
     # #define FMP_PAYLOAD_HEADER_SIGNATURE SIGNATURE_32 ('M', 'S', 'S', '1')
     #
     _StructFormat = '<IIII'
-    _StructSize   = struct.calcsize (_StructFormat)
+    _StructSize = struct.calcsize(_StructFormat)
 
-    _FMP_PAYLOAD_HEADER_SIGNATURE = _SIGNATURE_32 ('M', 'S', 'S', '1')
+    _FMP_PAYLOAD_HEADER_SIGNATURE = _SIGNATURE_32('M', 'S', 'S', '1')
 
-    def __init__ (self):
-        self._Valid                 = False
-        self.Signature              = self._FMP_PAYLOAD_HEADER_SIGNATURE
-        self.HeaderSize             = self._StructSize
-        self.FwVersion              = 0x00000000
+    def __init__(self):
+        self._Valid = False
+        self.Signature = self._FMP_PAYLOAD_HEADER_SIGNATURE
+        self.HeaderSize = self._StructSize
+        self.FwVersion = 0x00000000
         self.LowestSupportedVersion = 0x00000000
-        self.Payload                = b''
+        self.Payload = b''
 
-    def Encode (self):
-        FmpPayloadHeader = struct.pack (
-                                     self._StructFormat,
-                                     self.Signature,
-                                     self.HeaderSize,
-                                     self.FwVersion,
-                                     self.LowestSupportedVersion
-                                     )
+    def Encode(self):
+        FmpPayloadHeader = struct.pack(
+            self._StructFormat,
+            self.Signature,
+            self.HeaderSize,
+            self.FwVersion,
+            self.LowestSupportedVersion
+        )
         self._Valid = True
         return FmpPayloadHeader + self.Payload
 
-    def Decode (self, Buffer):
-        if len (Buffer) < self._StructSize:
+    def Decode(self, Buffer):
+        if len(Buffer) < self._StructSize:
             raise ValueError
         (Signature, HeaderSize, FwVersion, LowestSupportedVersion) = \
-            struct.unpack (
-                     self._StructFormat,
-                     Buffer[0:self._StructSize]
-                     )
+            struct.unpack(
+            self._StructFormat,
+            Buffer[0:self._StructSize]
+        )
         if Signature != self._FMP_PAYLOAD_HEADER_SIGNATURE:
             raise ValueError
         if HeaderSize < self._StructSize:
             raise ValueError
-        self.Signature              = Signature
-        self.HeaderSize             = HeaderSize
-        self.FwVersion              = FwVersion
+        self.Signature = Signature
+        self.HeaderSize = HeaderSize
+        self.FwVersion = FwVersion
         self.LowestSupportedVersion = LowestSupportedVersion
-        self.Payload                = Buffer[self.HeaderSize:]
+        self.Payload = Buffer[self.HeaderSize:]
 
-        self._Valid                 = True
+        self._Valid = True
         return self.Payload
 
-    def DumpInfo (self):
+    def DumpInfo(self):
         if not self._Valid:
             raise ValueError
-        print ('FMP_PAYLOAD_HEADER.Signature              = {Signature:08X} ({SignatureString})'.format (Signature = self.Signature, SignatureString = _SIGNATURE_32_TO_STRING (self.Signature)))
-        print ('FMP_PAYLOAD_HEADER.HeaderSize             = {HeaderSize:08X}'.format (HeaderSize = self.HeaderSize))
-        print ('FMP_PAYLOAD_HEADER.FwVersion              = {FwVersion:08X}'.format (FwVersion = self.FwVersion))
-        print ('FMP_PAYLOAD_HEADER.LowestSupportedVersion = {LowestSupportedVersion:08X}'.format (LowestSupportedVersion = self.LowestSupportedVersion))
-        print ('sizeof (Payload)                          = {Size:08X}'.format (Size = len (self.Payload)))
+        print('FMP_PAYLOAD_HEADER.Signature              = {Signature:08X} ({SignatureString})'.format(
+            Signature=self.Signature, SignatureString=_SIGNATURE_32_TO_STRING(self.Signature)))
+        print('FMP_PAYLOAD_HEADER.HeaderSize             = {HeaderSize:08X}'.format(
+            HeaderSize=self.HeaderSize))
+        print('FMP_PAYLOAD_HEADER.FwVersion              = {FwVersion:08X}'.format(
+            FwVersion=self.FwVersion))
+        print('FMP_PAYLOAD_HEADER.LowestSupportedVersion = {LowestSupportedVersion:08X}'.format(
+            LowestSupportedVersion=self.LowestSupportedVersion))
+        print('sizeof (Payload)                          = {Size:08X}'.format(
+            Size=len(self.Payload)))
diff --git a/BaseTools/Source/Python/Common/Edk2/Capsule/__init__.py b/BaseTools/Source/Python/Common/Edk2/Capsule/__init__.py
index 1e69fca2eab1..09f1e0aed2b3 100644
--- a/BaseTools/Source/Python/Common/Edk2/Capsule/__init__.py
+++ b/BaseTools/Source/Python/Common/Edk2/Capsule/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'Common.Edk2.Capsule' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/Common/Edk2/__init__.py b/BaseTools/Source/Python/Common/Edk2/__init__.py
index 168041c40cbf..0b300b5eaa86 100644
--- a/BaseTools/Source/Python/Common/Edk2/__init__.py
+++ b/BaseTools/Source/Python/Common/Edk2/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'Common.Edk2' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/Common/EdkLogger.py b/BaseTools/Source/Python/Common/EdkLogger.py
index 06da4a9d0a1d..2cf4119375a4 100644
--- a/BaseTools/Source/Python/Common/EdkLogger.py
+++ b/BaseTools/Source/Python/Common/EdkLogger.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file implements the log mechanism for Python tools.
 #
 # Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -22,11 +22,13 @@
 # OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
 # This copyright is for QueueHandler.
 
-## Import modules
+# Import modules
 from __future__ import absolute_import
-import Common.LongFilePathOs as os, sys, logging
+import Common.LongFilePathOs as os
+import sys
+import logging
 import traceback
-from  .BuildToolError import *
+from .BuildToolError import *
 try:
     from logging.handlers import QueueHandler
 except:
@@ -95,10 +97,14 @@ except:
                 self.enqueue(self.prepare(record))
             except Exception:
                 self.handleError(record)
+
+
 class BlockQueueHandler(QueueHandler):
     def enqueue(self, record):
-        self.queue.put(record,True)
-## Log level constants
+        self.queue.put(record, True)
+
+
+# Log level constants
 DEBUG_0 = 1
 DEBUG_1 = 2
 DEBUG_2 = 3
@@ -110,11 +116,11 @@ DEBUG_7 = 8
 DEBUG_8 = 9
 DEBUG_9 = 10
 VERBOSE = 15
-INFO    = 20
-WARN    = 30
-QUIET   = 40
-ERROR   = 50
-SILENT  = 99
+INFO = 20
+WARN = 30
+QUIET = 40
+ERROR = 50
+SILENT = 99
 
 IsRaiseError = True
 
@@ -128,7 +134,8 @@ _LogLevels = [DEBUG_0, DEBUG_1, DEBUG_2, DEBUG_3, DEBUG_4, DEBUG_5,
 
 # For DEBUG level (All DEBUG_0~9 are applicable)
 _DebugLogger = logging.getLogger("tool_debug")
-_DebugFormatter = logging.Formatter("[%(asctime)s.%(msecs)d]: %(message)s", datefmt="%H:%M:%S")
+_DebugFormatter = logging.Formatter(
+    "[%(asctime)s.%(msecs)d]: %(message)s", datefmt="%H:%M:%S")
 
 # For VERBOSE, INFO, WARN level
 _InfoLogger = logging.getLogger("tool_info")
@@ -151,12 +158,14 @@ _DebugMessageTemplate = '%(file)s(%(line)s): debug: \n    %(msg)s'
 #
 _WarningAsError = False
 
-## Log debug message
+# Log debug message
 #
 #   @param  Level       DEBUG level (DEBUG0~9)
 #   @param  Message     Debug information
 #   @param  ExtraData   More information associated with "Message"
 #
+
+
 def debug(Level, Message, ExtraData=None):
     if _DebugLogger.level > Level:
         return
@@ -166,9 +175,9 @@ def debug(Level, Message, ExtraData=None):
     # Find out the caller method information
     CallerStack = traceback.extract_stack()[-2]
     TemplateDict = {
-        "file"      : CallerStack[0],
-        "line"      : CallerStack[1],
-        "msg"       : Message,
+        "file": CallerStack[0],
+        "line": CallerStack[1],
+        "msg": Message,
     }
 
     if ExtraData is not None:
@@ -178,14 +187,16 @@ def debug(Level, Message, ExtraData=None):
 
     _DebugLogger.log(Level, LogText)
 
-## Log verbose message
+# Log verbose message
 #
 #   @param  Message     Verbose information
 #
+
+
 def verbose(Message):
     return _InfoLogger.log(VERBOSE, Message)
 
-## Log warning message
+# Log warning message
 #
 #   Warning messages are those which might be wrong but won't fail the tool.
 #
@@ -196,6 +207,8 @@ def verbose(Message):
 #   @param  Line        The line number in the "File" which caused the warning.
 #   @param  ExtraData   More information associated with "Message"
 #
+
+
 def warn(ToolName, Message, File=None, Line=None, ExtraData=None):
     if _InfoLogger.level > WARN:
         return
@@ -210,10 +223,10 @@ def warn(ToolName, Message, File=None, Line=None, ExtraData=None):
         Line = "%d" % Line
 
     TemplateDict = {
-        "tool"      : ToolName,
-        "file"      : File,
-        "line"      : Line,
-        "msg"       : Message,
+        "tool": ToolName,
+        "file": File,
+        "line": Line,
+        "msg": Message,
     }
 
     if File is not None:
@@ -230,10 +243,11 @@ def warn(ToolName, Message, File=None, Line=None, ExtraData=None):
     if _WarningAsError == True:
         raise FatalError(WARNING_AS_ERROR)
 
-## Log INFO message
-info    = _InfoLogger.info
 
-## Log ERROR message
+# Log INFO message
+info = _InfoLogger.info
+
+# Log ERROR message
 #
 #   Once an error messages is logged, the tool's execution will be broken by raising
 # an exception. If you don't want to break the execution later, you can give
@@ -249,6 +263,8 @@ info    = _InfoLogger.info
 #   @param  RaiseError  Raise an exception to break the tool's execution if
 #                       it's True. This is the default behavior.
 #
+
+
 def error(ToolName, ErrorCode, Message=None, File=None, Line=None, ExtraData=None, RaiseError=IsRaiseError):
     if Line is None:
         Line = "..."
@@ -265,16 +281,16 @@ def error(ToolName, ErrorCode, Message=None, File=None, Line=None, ExtraData=Non
         ExtraData = ""
 
     TemplateDict = {
-        "tool"      : _ToolName,
-        "file"      : File,
-        "line"      : Line,
-        "errorcode" : ErrorCode,
-        "msg"       : Message,
-        "extra"     : ExtraData
+        "tool": _ToolName,
+        "file": File,
+        "line": Line,
+        "errorcode": ErrorCode,
+        "msg": Message,
+        "extra": ExtraData
     }
 
     if File is not None:
-        LogText =  _ErrorMessageTemplate % TemplateDict
+        LogText = _ErrorMessageTemplate % TemplateDict
     else:
         LogText = _ErrorMessageTemplateWithoutFile % TemplateDict
 
@@ -283,10 +299,13 @@ def error(ToolName, ErrorCode, Message=None, File=None, Line=None, ExtraData=Non
     if RaiseError and IsRaiseError:
         raise FatalError(ErrorCode)
 
+
 # Log information which should be always put out
-quiet   = _ErrorLogger.error
+quiet = _ErrorLogger.error
+
+# Initialize log system
+
 
-## Initialize log system
 def LogClientInitialize(log_q):
     #
     # Since we use different format to log different levels of message into different
@@ -310,9 +329,11 @@ def LogClientInitialize(log_q):
     _ErrorCh.setFormatter(_ErrorFormatter)
     _ErrorLogger.addHandler(_ErrorCh)
 
-## Set log level
+# Set log level
 #
 #   @param  Level   One of log level in _LogLevel
+
+
 def SetLevel(Level):
     if Level not in _LogLevels:
         info("Not supported log level (%d). Use default level instead." % Level)
@@ -321,7 +342,9 @@ def SetLevel(Level):
     _InfoLogger.setLevel(Level)
     _ErrorLogger.setLevel(Level)
 
-## Initialize log system
+# Initialize log system
+
+
 def Initialize():
     #
     # Since we use different format to log different levels of message into different
@@ -345,23 +368,30 @@ def Initialize():
     _ErrorCh.setFormatter(_ErrorFormatter)
     _ErrorLogger.addHandler(_ErrorCh)
 
+
 def InitializeForUnitTest():
     Initialize()
     SetLevel(SILENT)
 
-## Get current log level
+# Get current log level
+
+
 def GetLevel():
     return _InfoLogger.getEffectiveLevel()
 
-## Raise up warning as error
+# Raise up warning as error
+
+
 def SetWarningAsError():
     global _WarningAsError
     _WarningAsError = True
 
-## Specify a file to store the log message as well as put on console
+# Specify a file to store the log message as well as put on console
 #
 #   @param  LogFile     The file path used to store the log message
 #
+
+
 def SetLogFile(LogFile):
     if os.path.exists(LogFile):
         os.remove(LogFile)
@@ -370,7 +400,7 @@ def SetLogFile(LogFile):
     _Ch.setFormatter(_DebugFormatter)
     _DebugLogger.addHandler(_Ch)
 
-    _Ch= logging.FileHandler(LogFile)
+    _Ch = logging.FileHandler(LogFile)
     _Ch.setFormatter(_InfoFormatter)
     _InfoLogger.addHandler(_Ch)
 
@@ -378,6 +408,6 @@ def SetLogFile(LogFile):
     _Ch.setFormatter(_ErrorFormatter)
     _ErrorLogger.addHandler(_Ch)
 
+
 if __name__ == '__main__':
     pass
-
diff --git a/BaseTools/Source/Python/Common/Expression.py b/BaseTools/Source/Python/Common/Expression.py
index 31bf0e4b6cf7..ac9c767f213e 100644
--- a/BaseTools/Source/Python/Common/Expression.py
+++ b/BaseTools/Source/Python/Common/Expression.py
@@ -1,17 +1,17 @@
-## @file
+# @file
 # This file is used to parse and evaluate expression in directive or PCD value.
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
 # SPDX-License-Identifier: BSD-2-Clause-Patent
 
-## Import Modules
+# Import Modules
 #
 from __future__ import print_function
 from __future__ import absolute_import
 from Common.GlobalData import *
 from CommonDataClass.Exceptions import BadExpression
 from CommonDataClass.Exceptions import WrnExpression
-from .Misc import GuidStringToGuidStructureString, ParseFieldValue,CopyDict
+from .Misc import GuidStringToGuidStructureString, ParseFieldValue, CopyDict
 import Common.EdkLogger as EdkLogger
 import copy
 from Common.DataType import *
@@ -19,36 +19,38 @@ import sys
 from random import sample
 import string
 
-ERR_STRING_EXPR         = 'This operator cannot be used in string expression: [%s].'
-ERR_SNYTAX              = 'Syntax error, the rest of expression cannot be evaluated: [%s].'
-ERR_MATCH               = 'No matching right parenthesis.'
-ERR_STRING_TOKEN        = 'Bad string token: [%s].'
-ERR_MACRO_TOKEN         = 'Bad macro token: [%s].'
-ERR_EMPTY_TOKEN         = 'Empty token is not allowed.'
-ERR_PCD_RESOLVE         = 'The PCD should be FeatureFlag type or FixedAtBuild type: [%s].'
-ERR_VALID_TOKEN         = 'No more valid token found from rest of string: [%s].'
-ERR_EXPR_TYPE           = 'Different types found in expression.'
-ERR_OPERATOR_UNSUPPORT  = 'Unsupported operator: [%s]'
-ERR_REL_NOT_IN          = 'Expect "IN" after "not" operator.'
-WRN_BOOL_EXPR           = 'Operand of boolean type cannot be used in arithmetic expression.'
-WRN_EQCMP_STR_OTHERS    = '== Comparison between Operand of string type and Boolean/Number Type always return False.'
-WRN_NECMP_STR_OTHERS    = '!= Comparison between Operand of string type and Boolean/Number Type always return True.'
-ERR_RELCMP_STR_OTHERS   = 'Operator taking Operand of string type and Boolean/Number Type is not allowed: [%s].'
-ERR_STRING_CMP          = 'Unicode string and general string cannot be compared: [%s %s %s]'
-ERR_ARRAY_TOKEN         = 'Bad C array or C format GUID token: [%s].'
-ERR_ARRAY_ELE           = 'This must be HEX value for NList or Array: [%s].'
-ERR_EMPTY_EXPR          = 'Empty expression is not allowed.'
-ERR_IN_OPERAND          = 'Macro after IN operator can only be: $(FAMILY), $(ARCH), $(TOOL_CHAIN_TAG) and $(TARGET).'
+ERR_STRING_EXPR = 'This operator cannot be used in string expression: [%s].'
+ERR_SNYTAX = 'Syntax error, the rest of expression cannot be evaluated: [%s].'
+ERR_MATCH = 'No matching right parenthesis.'
+ERR_STRING_TOKEN = 'Bad string token: [%s].'
+ERR_MACRO_TOKEN = 'Bad macro token: [%s].'
+ERR_EMPTY_TOKEN = 'Empty token is not allowed.'
+ERR_PCD_RESOLVE = 'The PCD should be FeatureFlag type or FixedAtBuild type: [%s].'
+ERR_VALID_TOKEN = 'No more valid token found from rest of string: [%s].'
+ERR_EXPR_TYPE = 'Different types found in expression.'
+ERR_OPERATOR_UNSUPPORT = 'Unsupported operator: [%s]'
+ERR_REL_NOT_IN = 'Expect "IN" after "not" operator.'
+WRN_BOOL_EXPR = 'Operand of boolean type cannot be used in arithmetic expression.'
+WRN_EQCMP_STR_OTHERS = '== Comparison between Operand of string type and Boolean/Number Type always return False.'
+WRN_NECMP_STR_OTHERS = '!= Comparison between Operand of string type and Boolean/Number Type always return True.'
+ERR_RELCMP_STR_OTHERS = 'Operator taking Operand of string type and Boolean/Number Type is not allowed: [%s].'
+ERR_STRING_CMP = 'Unicode string and general string cannot be compared: [%s %s %s]'
+ERR_ARRAY_TOKEN = 'Bad C array or C format GUID token: [%s].'
+ERR_ARRAY_ELE = 'This must be HEX value for NList or Array: [%s].'
+ERR_EMPTY_EXPR = 'Empty expression is not allowed.'
+ERR_IN_OPERAND = 'Macro after IN operator can only be: $(FAMILY), $(ARCH), $(TOOL_CHAIN_TAG) and $(TARGET).'
 
 __ValidString = re.compile(r'[_a-zA-Z][_0-9a-zA-Z]*$')
 _ReLabel = re.compile('LABEL\((\w+)\)')
 _ReOffset = re.compile('OFFSET_OF\((\w+)\)')
 PcdPattern = re.compile(r'^[_a-zA-Z][0-9A-Za-z_]*\.[_a-zA-Z][0-9A-Za-z_]*$')
 
-## SplitString
+# SplitString
 #  Split string to list according double quote
 #  For example: abc"de\"f"ghi"jkl"mn will be: ['abc', '"de\"f"', 'ghi', '"jkl"', 'mn']
 #
+
+
 def SplitString(String):
     # There might be escaped quote: "abc\"def\\\"ghi", 'abc\'def\\\'ghi'
     RanStr = ''.join(sample(string.ascii_letters + string.digits, 8))
@@ -87,9 +89,10 @@ def SplitString(String):
         RetList.append(Item)
     for i, ch in enumerate(RetList):
         if RanStr in ch:
-            RetList[i] = ch.replace(RanStr,'\\\\')
+            RetList[i] = ch.replace(RanStr, '\\\\')
     return RetList
 
+
 def SplitPcdValueString(String):
     # There might be escaped comma in GUID() or DEVICE_PATH() or " "
     # or ' ' or L' ' or L" "
@@ -129,12 +132,14 @@ def SplitPcdValueString(String):
         RetList.append(Item)
     for i, ch in enumerate(RetList):
         if RanStr in ch:
-            RetList[i] = ch.replace(RanStr,'\\\\')
+            RetList[i] = ch.replace(RanStr, '\\\\')
     return RetList
 
+
 def IsValidCName(Str):
     return True if __ValidString.match(Str) else False
 
+
 def BuildOptionValue(PcdValue, GuidDict):
     if PcdValue.startswith('H'):
         InputValue = PcdValue[1:]
@@ -151,9 +156,11 @@ def BuildOptionValue(PcdValue, GuidDict):
 
     return PcdValue
 
-## ReplaceExprMacro
+# ReplaceExprMacro
 #
-def ReplaceExprMacro(String, Macros, ExceptionList = None):
+
+
+def ReplaceExprMacro(String, Macros, ExceptionList=None):
     StrList = SplitString(String)
     for i, String in enumerate(StrList):
         InQuote = False
@@ -200,6 +207,8 @@ def ReplaceExprMacro(String, Macros, ExceptionList = None):
     return ''.join(StrList)
 
 # transfer int to string for in/not in expression
+
+
 def IntToStr(Value):
     StrList = []
     while Value > 0:
@@ -208,8 +217,10 @@ def IntToStr(Value):
     Value = '"' + ''.join(StrList) + '"'
     return Value
 
+
 SupportedInMacroList = ['TARGET', 'TOOL_CHAIN_TAG', 'ARCH', 'FAMILY']
 
+
 class BaseExpression(object):
     def __init__(self, *args, **kwargs):
         super(BaseExpression, self).__init__()
@@ -225,35 +236,36 @@ class BaseExpression(object):
         self._Idx = Idx
         return False
 
+
 class ValueExpression(BaseExpression):
     # Logical operator mapping
     LogicalOperators = {
-        '&&' : 'and', '||' : 'or',
-        '!'  : 'not', 'AND': 'and',
-        'OR' : 'or' , 'NOT': 'not',
-        'XOR': '^'  , 'xor': '^',
-        'EQ' : '==' , 'NE' : '!=',
-        'GT' : '>'  , 'LT' : '<',
-        'GE' : '>=' , 'LE' : '<=',
-        'IN' : 'in'
+        '&&': 'and', '||': 'or',
+        '!': 'not', 'AND': 'and',
+        'OR': 'or', 'NOT': 'not',
+        'XOR': '^', 'xor': '^',
+        'EQ': '==', 'NE': '!=',
+        'GT': '>', 'LT': '<',
+        'GE': '>=', 'LE': '<=',
+        'IN': 'in'
     }
 
-    NonLetterOpLst = ['+', '-', TAB_STAR, '/', '%', '&', '|', '^', '~', '<<', '>>', '!', '=', '>', '<', '?', ':']
-
+    NonLetterOpLst = ['+', '-', TAB_STAR, '/', '%', '&', '|',
+                      '^', '~', '<<', '>>', '!', '=', '>', '<', '?', ':']
 
     SymbolPattern = re.compile("("
-                                 "\$\([A-Z][A-Z0-9_]*\)|\$\(\w+\.\w+\)|\w+\.\w+|"
-                                 "&&|\|\||!(?!=)|"
-                                 "(?<=\W)AND(?=\W)|(?<=\W)OR(?=\W)|(?<=\W)NOT(?=\W)|(?<=\W)XOR(?=\W)|"
-                                 "(?<=\W)EQ(?=\W)|(?<=\W)NE(?=\W)|(?<=\W)GT(?=\W)|(?<=\W)LT(?=\W)|(?<=\W)GE(?=\W)|(?<=\W)LE(?=\W)"
+                               "\$\([A-Z][A-Z0-9_]*\)|\$\(\w+\.\w+\)|\w+\.\w+|"
+                               "&&|\|\||!(?!=)|"
+                               "(?<=\W)AND(?=\W)|(?<=\W)OR(?=\W)|(?<=\W)NOT(?=\W)|(?<=\W)XOR(?=\W)|"
+                               "(?<=\W)EQ(?=\W)|(?<=\W)NE(?=\W)|(?<=\W)GT(?=\W)|(?<=\W)LT(?=\W)|(?<=\W)GE(?=\W)|(?<=\W)LE(?=\W)"
                                ")")
 
     @staticmethod
-    def Eval(Operator, Oprand1, Oprand2 = None):
+    def Eval(Operator, Oprand1, Oprand2=None):
         WrnExp = None
 
         if Operator not in {"==", "!=", ">=", "<=", ">", "<", "in", "not in"} and \
-            (isinstance(Oprand1, type('')) or isinstance(Oprand2, type(''))):
+                (isinstance(Oprand1, type('')) or isinstance(Oprand2, type(''))):
             raise BadExpression(ERR_STRING_EXPR % Operator)
         if Operator in {'in', 'not in'}:
             if not isinstance(Oprand1, type('')):
@@ -261,11 +273,11 @@ class ValueExpression(BaseExpression):
             if not isinstance(Oprand2, type('')):
                 Oprand2 = IntToStr(Oprand2)
         TypeDict = {
-            type(0)  : 0,
+            type(0): 0,
             # For python2 long type
-            type(sys.maxsize + 1) : 0,
-            type('') : 1,
-            type(True) : 2
+            type(sys.maxsize + 1): 0,
+            type(''): 1,
+            type(True): 2
         }
 
         EvalStr = ''
@@ -305,15 +317,16 @@ class ValueExpression(BaseExpression):
             if isinstance(Oprand1, type('')) and isinstance(Oprand2, type('')):
                 if ((Oprand1.startswith('L"') or Oprand1.startswith("L'")) and (not Oprand2.startswith('L"')) and (not Oprand2.startswith("L'"))) or \
                         (((not Oprand1.startswith('L"')) and (not Oprand1.startswith("L'"))) and (Oprand2.startswith('L"') or Oprand2.startswith("L'"))):
-                    raise BadExpression(ERR_STRING_CMP % (Oprand1, Operator, Oprand2))
+                    raise BadExpression(ERR_STRING_CMP %
+                                        (Oprand1, Operator, Oprand2))
             if 'in' in Operator and isinstance(Oprand2, type('')):
                 Oprand2 = Oprand2.split()
             EvalStr = 'Oprand1 ' + Operator + ' Oprand2'
 
         # Local symbols used by built in eval function
         Dict = {
-            'Oprand1' : Oprand1,
-            'Oprand2' : Oprand2
+            'Oprand1': Oprand1,
+            'Oprand2': Oprand2
         }
         try:
             Val = eval(EvalStr, {}, Dict)
@@ -340,8 +353,8 @@ class ValueExpression(BaseExpression):
             return
 
         self._Expr = ReplaceExprMacro(Expression.strip(),
-                                  SymbolTable,
-                                  SupportedInMacroList)
+                                      SymbolTable,
+                                      SupportedInMacroList)
 
         if not self._Expr.strip():
             raise BadExpression(ERR_EMPTY_EXPR)
@@ -447,6 +460,7 @@ class ValueExpression(BaseExpression):
                 Val = Warn.result
         return Val
     # A [? B]*
+
     def _ConExpr(self):
         return self._ExprFuncTemplate(self._OrExpr, {'?', ':'})
 
@@ -584,11 +598,11 @@ class ValueExpression(BaseExpression):
         # All whitespace and tabs in array are already stripped.
         IsArray = IsGuid = False
         if len(Token.split(',')) == 11 and len(Token.split(',{')) == 2 \
-            and len(Token.split('},')) == 1:
+                and len(Token.split('},')) == 1:
             HexLen = [11, 6, 6, 5, 4, 4, 4, 4, 4, 4, 6]
-            HexList= Token.split(',')
+            HexList = Token.split(',')
             if HexList[3].startswith('{') and \
-                not [Index for Index, Hex in enumerate(HexList) if len(Hex) > HexLen[Index]]:
+                    not [Index for Index, Hex in enumerate(HexList) if len(Hex) > HexLen[Index]]:
                 IsGuid = True
         if Token.lstrip('{').rstrip('}').find('{') == -1:
             if not [Hex for Hex in Token.lstrip('{').rstrip('}').split(',') if len(Hex) > 4]:
@@ -608,7 +622,8 @@ class ValueExpression(BaseExpression):
 
         # Replace escape \\\", \"
         if self._Expr[Idx] == '"':
-            Expr = self._Expr[self._Idx:].replace('\\\\', '//').replace('\\\"', '\\\'')
+            Expr = self._Expr[self._Idx:].replace(
+                '\\\\', '//').replace('\\\"', '\\\'')
             for Ch in Expr:
                 self._Idx += 1
                 if Ch == '"':
@@ -616,9 +631,10 @@ class ValueExpression(BaseExpression):
             self._Token = self._LiteralToken = self._Expr[Idx:self._Idx]
             if not self._Token.endswith('"'):
                 raise BadExpression(ERR_STRING_TOKEN % self._Token)
-        #Replace escape \\\', \'
+        # Replace escape \\\', \'
         elif self._Expr[Idx] == "'":
-            Expr = self._Expr[self._Idx:].replace('\\\\', '//').replace("\\\'", "\\\"")
+            Expr = self._Expr[self._Idx:].replace(
+                '\\\\', '//').replace("\\\'", "\\\"")
             for Ch in Expr:
                 self._Idx += 1
                 if Ch == "'":
@@ -631,7 +647,7 @@ class ValueExpression(BaseExpression):
 
     # Get token that is comprised by alphanumeric, underscore or dot(used by PCD)
     # @param IsAlphaOp: Indicate if parsing general token or script operator(EQ, NE...)
-    def __GetIdToken(self, IsAlphaOp = False):
+    def __GetIdToken(self, IsAlphaOp=False):
         IdToken = ''
         for Ch in self._Expr[self._Idx:]:
             if not self.__IsIdChar(Ch) or ('?' in self._Expr and Ch == ':'):
@@ -655,7 +671,8 @@ class ValueExpression(BaseExpression):
                 Ex = BadExpression(ERR_PCD_RESOLVE % self._Token)
                 Ex.Pcd = self._Token
                 raise Ex
-            self._Token = ValueExpression(self._Symb[self._Token], self._Symb)(True, self._Depth+1)
+            self._Token = ValueExpression(
+                self._Symb[self._Token], self._Symb)(True, self._Depth+1)
             if not isinstance(self._Token, type('')):
                 self._LiteralToken = hex(self._Token)
                 return
@@ -697,7 +714,7 @@ class ValueExpression(BaseExpression):
 
     def __IsHexLiteral(self):
         if self._LiteralToken.startswith('{') and \
-            self._LiteralToken.endswith('}'):
+                self._LiteralToken.endswith('}'):
             return True
 
         if gHexPattern.match(self._LiteralToken):
@@ -741,7 +758,7 @@ class ValueExpression(BaseExpression):
             try:
                 RetValue = Re.search(Expr).group(1)
             except:
-                 raise BadExpression('Invalid Expression %s' % Expr)
+                raise BadExpression('Invalid Expression %s' % Expr)
             Idx = self._Idx
             for Ch in Expr:
                 self._Idx += 1
@@ -749,21 +766,23 @@ class ValueExpression(BaseExpression):
                     Prefix = self._Expr[Idx:self._Idx - 1]
                     Idx = self._Idx
                 if Ch == ')':
-                    TmpValue = self._Expr[Idx :self._Idx - 1]
+                    TmpValue = self._Expr[Idx:self._Idx - 1]
                     TmpValue = ValueExpression(TmpValue)(True)
-                    TmpValue = '0x%x' % int(TmpValue) if not isinstance(TmpValue, type('')) else TmpValue
+                    TmpValue = '0x%x' % int(TmpValue) if not isinstance(
+                        TmpValue, type('')) else TmpValue
                     break
             self._Token, Size = ParseFieldValue(Prefix + '(' + TmpValue + ')')
-            return  self._Token
+            return self._Token
 
         self._Token = ''
         if Expr:
             Ch = Expr[0]
             Match = gGuidPattern.match(Expr)
             if Match and not Expr[Match.end():Match.end()+1].isalnum() \
-                and Expr[Match.end():Match.end()+1] != '_':
+                    and Expr[Match.end():Match.end()+1] != '_':
                 self._Idx += Match.end()
-                self._Token = ValueExpression(GuidStringToGuidStructureString(Expr[0:Match.end()]))(True, self._Depth+1)
+                self._Token = ValueExpression(GuidStringToGuidStructureString(
+                    Expr[0:Match.end()]))(True, self._Depth+1)
                 return self._Token
             elif self.__IsIdChar(Ch):
                 return self.__GetIdToken()
@@ -781,7 +800,8 @@ class ValueExpression(BaseExpression):
     # Parse operator
     def _GetOperator(self):
         self.__SkipWS()
-        LegalOpLst = ['&&', '||', '!=', '==', '>=', '<='] + self.NonLetterOpLst + ['?', ':']
+        LegalOpLst = ['&&', '||', '!=', '==', '>=', '<='] + \
+            self.NonLetterOpLst + ['?', ':']
 
         self._Token = ''
         Expr = self._Expr[self._Idx:]
@@ -813,6 +833,7 @@ class ValueExpression(BaseExpression):
         self._Token = OpToken
         return OpToken
 
+
 class ValueExpressionEx(ValueExpression):
     def __init__(self, PcdValue, PcdType, SymbolTable={}):
         ValueExpression.__init__(self, PcdValue, SymbolTable)
@@ -828,11 +849,11 @@ class ValueExpressionEx(ValueExpression):
                     PcdValue, Size = ParseFieldValue(PcdValue)
                     PcdValueList = []
                     for I in range(Size):
-                        PcdValueList.append('0x%02X'%(PcdValue & 0xff))
+                        PcdValueList.append('0x%02X' % (PcdValue & 0xff))
                         PcdValue = PcdValue >> 8
                     PcdValue = '{' + ','.join(PcdValueList) + '}'
-                elif self.PcdType in TAB_PCD_NUMERIC_TYPES and (PcdValue.startswith("'") or \
-                          PcdValue.startswith('"') or PcdValue.startswith("L'") or PcdValue.startswith('L"') or PcdValue.startswith('{')):
+                elif self.PcdType in TAB_PCD_NUMERIC_TYPES and (PcdValue.startswith("'") or
+                                                                PcdValue.startswith('"') or PcdValue.startswith("L'") or PcdValue.startswith('L"') or PcdValue.startswith('{')):
                     raise BadExpression
             except WrnExpression as Value:
                 PcdValue = Value.result
@@ -865,12 +886,14 @@ class ValueExpressionEx(ValueExpression):
                             else:
                                 ItemSize = 0
                                 ValueType = TAB_UINT8
-                            Item = ValueExpressionEx(Item, ValueType, self._Symb)(True)
+                            Item = ValueExpressionEx(
+                                Item, ValueType, self._Symb)(True)
                             if ItemSize == 0:
                                 try:
                                     tmpValue = int(Item, 0)
                                     if tmpValue > 255:
-                                        raise BadExpression("Byte  array number %s should less than 0xFF." % Item)
+                                        raise BadExpression(
+                                            "Byte  array number %s should less than 0xFF." % Item)
                                 except BadExpression as Value:
                                     raise BadExpression(Value)
                                 except ValueError:
@@ -888,24 +911,30 @@ class ValueExpressionEx(ValueExpression):
                         try:
                             TmpValue, Size = ParseFieldValue(PcdValue)
                         except BadExpression as Value:
-                            raise BadExpression("Type: %s, Value: %s, %s" % (self.PcdType, PcdValue, Value))
+                            raise BadExpression("Type: %s, Value: %s, %s" % (
+                                self.PcdType, PcdValue, Value))
                     if isinstance(TmpValue, type('')):
                         try:
                             TmpValue = int(TmpValue)
                         except:
-                            raise  BadExpression(Value)
+                            raise BadExpression(Value)
                     else:
                         PcdValue = '0x%0{}X'.format(Size) % (TmpValue)
                     if TmpValue < 0:
-                        raise  BadExpression('Type %s PCD Value is negative' % self.PcdType)
+                        raise BadExpression(
+                            'Type %s PCD Value is negative' % self.PcdType)
                     if self.PcdType == TAB_UINT8 and Size > 1:
-                        raise BadExpression('Type %s PCD Value Size is Larger than 1 byte' % self.PcdType)
+                        raise BadExpression(
+                            'Type %s PCD Value Size is Larger than 1 byte' % self.PcdType)
                     if self.PcdType == TAB_UINT16 and Size > 2:
-                        raise BadExpression('Type %s PCD Value Size is Larger than 2 byte' % self.PcdType)
+                        raise BadExpression(
+                            'Type %s PCD Value Size is Larger than 2 byte' % self.PcdType)
                     if self.PcdType == TAB_UINT32 and Size > 4:
-                        raise BadExpression('Type %s PCD Value Size is Larger than 4 byte' % self.PcdType)
+                        raise BadExpression(
+                            'Type %s PCD Value Size is Larger than 4 byte' % self.PcdType)
                     if self.PcdType == TAB_UINT64 and Size > 8:
-                        raise BadExpression('Type %s PCD Value Size is Larger than 8 byte' % self.PcdType)
+                        raise BadExpression(
+                            'Type %s PCD Value Size is Larger than 8 byte' % self.PcdType)
                 else:
                     try:
                         TmpValue = int(PcdValue)
@@ -914,11 +943,13 @@ class ValueExpressionEx(ValueExpression):
                             PcdValue = '{0x00}'
                         else:
                             for I in range((TmpValue.bit_length() + 7) // 8):
-                                TmpList.append('0x%02x' % ((TmpValue >> I * 8) & 0xff))
+                                TmpList.append('0x%02x' %
+                                               ((TmpValue >> I * 8) & 0xff))
                             PcdValue = '{' + ', '.join(TmpList) + '}'
                     except:
                         if PcdValue.strip().startswith('{'):
-                            PcdValueList = SplitPcdValueString(PcdValue.strip()[1:-1])
+                            PcdValueList = SplitPcdValueString(
+                                PcdValue.strip()[1:-1])
                             LabelDict = {}
                             NewPcdValueList = []
                             LabelOffset = 0
@@ -930,7 +961,8 @@ class ValueExpressionEx(ValueExpression):
                                 if LabelList:
                                     for Label in LabelList:
                                         if not IsValidCName(Label):
-                                            raise BadExpression('%s is not a valid c variable name' % Label)
+                                            raise BadExpression(
+                                                '%s is not a valid c variable name' % Label)
                                         if Label not in LabelDict:
                                             LabelDict[Label] = str(LabelOffset)
                                 if Item.startswith(TAB_UINT8):
@@ -943,7 +975,8 @@ class ValueExpressionEx(ValueExpression):
                                     LabelOffset = LabelOffset + 8
                                 else:
                                     try:
-                                        ItemValue, ItemSize = ParseFieldValue(Item)
+                                        ItemValue, ItemSize = ParseFieldValue(
+                                            Item)
                                         LabelOffset = LabelOffset + ItemSize
                                     except:
                                         LabelOffset = LabelOffset + 1
@@ -962,9 +995,11 @@ class ValueExpressionEx(ValueExpression):
                                 # replace each offset, except errors
                                 for Offset in OffsetList:
                                     try:
-                                        Item = Item.replace('OFFSET_OF({})'.format(Offset), LabelDict[Offset])
+                                        Item = Item.replace('OFFSET_OF({})'.format(
+                                            Offset), LabelDict[Offset])
                                     except:
-                                        raise BadExpression('%s not defined' % Offset)
+                                        raise BadExpression(
+                                            '%s not defined' % Offset)
 
                                 NewPcdValueList.append(Item)
 
@@ -975,13 +1010,16 @@ class ValueExpressionEx(ValueExpression):
                                 TokenSpaceGuidName = ''
                                 if Item.startswith(TAB_GUID) and Item.endswith(')'):
                                     try:
-                                        TokenSpaceGuidName = re.search('GUID\((\w+)\)', Item).group(1)
+                                        TokenSpaceGuidName = re.search(
+                                            'GUID\((\w+)\)', Item).group(1)
                                     except:
                                         pass
                                     if TokenSpaceGuidName and TokenSpaceGuidName in self._Symb:
-                                        Item = 'GUID(' + self._Symb[TokenSpaceGuidName] + ')'
+                                        Item = 'GUID(' + \
+                                            self._Symb[TokenSpaceGuidName] + ')'
                                     elif TokenSpaceGuidName:
-                                        raise BadExpression('%s not found in DEC file' % TokenSpaceGuidName)
+                                        raise BadExpression(
+                                            '%s not found in DEC file' % TokenSpaceGuidName)
                                     Item, Size = ParseFieldValue(Item)
                                     for Index in range(0, Size):
                                         ValueStr = '0x%02X' % (int(Item) & 255)
@@ -1009,26 +1047,34 @@ class ValueExpressionEx(ValueExpression):
                                     else:
                                         ItemSize = 0
                                     if ValueType:
-                                        TmpValue = ValueExpressionEx(Item, ValueType, self._Symb)(True)
+                                        TmpValue = ValueExpressionEx(
+                                            Item, ValueType, self._Symb)(True)
                                     else:
-                                        TmpValue = ValueExpressionEx(Item, self.PcdType, self._Symb)(True)
-                                    Item = '0x%x' % TmpValue if not isinstance(TmpValue, type('')) else TmpValue
+                                        TmpValue = ValueExpressionEx(
+                                            Item, self.PcdType, self._Symb)(True)
+                                    Item = '0x%x' % TmpValue if not isinstance(
+                                        TmpValue, type('')) else TmpValue
                                     if ItemSize == 0:
-                                        ItemValue, ItemSize = ParseFieldValue(Item)
+                                        ItemValue, ItemSize = ParseFieldValue(
+                                            Item)
                                         if Item[0] not in {'"', 'L', '{'} and ItemSize > 1:
-                                            raise BadExpression("Byte  array number %s should less than 0xFF." % Item)
+                                            raise BadExpression(
+                                                "Byte  array number %s should less than 0xFF." % Item)
                                     else:
                                         ItemValue = ParseFieldValue(Item)[0]
                                     for I in range(0, ItemSize):
-                                        ValueStr = '0x%02X' % (int(ItemValue) & 255)
+                                        ValueStr = '0x%02X' % (
+                                            int(ItemValue) & 255)
                                         ItemValue >>= 8
                                         AllPcdValueList.append(ValueStr)
                                     Size += ItemSize
 
                             if Size > 0:
-                                PcdValue = '{' + ','.join(AllPcdValueList) + '}'
+                                PcdValue = '{' + \
+                                    ','.join(AllPcdValueList) + '}'
                         else:
-                            raise  BadExpression("Type: %s, Value: %s, %s"%(self.PcdType, PcdValue, Value))
+                            raise BadExpression("Type: %s, Value: %s, %s" % (
+                                self.PcdType, PcdValue, Value))
 
             if PcdValue == 'True':
                 PcdValue = '1'
@@ -1038,6 +1084,7 @@ class ValueExpressionEx(ValueExpression):
         if RealValue:
             return PcdValue
 
+
 if __name__ == '__main__':
     pass
     while True:
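
A quick usage sketch for the helpers above (illustrative only, not part of the patch; it assumes an edk2 checkout with BaseTools/Source/Python on sys.path):

    import sys
    sys.path.insert(0, 'BaseTools/Source/Python')  # assumption: run from an edk2 checkout

    from Common.Expression import SplitString, IsValidCName

    # Quoted segments stay intact as single list items (see the SplitString comment above).
    print(SplitString('abc"de\\"f"ghi"jkl"mn'))
    # expected, per the module comment: ['abc', '"de\"f"', 'ghi', '"jkl"', 'mn']

    print(IsValidCName('MyPcdToken'))   # True
    print(IsValidCName('0BadName'))     # False
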
diff --git a/BaseTools/Source/Python/Common/GlobalData.py b/BaseTools/Source/Python/Common/GlobalData.py
index 197bd8366682..34cd90987566 100755
--- a/BaseTools/Source/Python/Common/GlobalData.py
+++ b/BaseTools/Source/Python/Common/GlobalData.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define common static strings used by INF/DEC/DSC files
 #
 # Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -35,32 +35,35 @@ gGuidDict = {}
 # definition for a MACRO name.  used to create regular expressions below.
 _MacroNamePattern = "[A-Z][A-Z0-9_]*"
 
-## Regular expression for matching macro used in DSC/DEC/INF file inclusion
-gMacroRefPattern = re.compile("\$\(({})\)".format(_MacroNamePattern), re.UNICODE)
+# Regular expression for matching macro used in DSC/DEC/INF file inclusion
+gMacroRefPattern = re.compile(
+    "\$\(({})\)".format(_MacroNamePattern), re.UNICODE)
 gMacroDefPattern = re.compile("^(DEFINE|EDK_GLOBAL)[ \t]+")
 gMacroNamePattern = re.compile("^{}$".format(_MacroNamePattern))
 
 # definition for a GUID.  used to create regular expressions below.
 _HexChar = r"[0-9a-fA-F]"
-_GuidPattern = r"{Hex}{{8}}-{Hex}{{4}}-{Hex}{{4}}-{Hex}{{4}}-{Hex}{{12}}".format(Hex=_HexChar)
+_GuidPattern = r"{Hex}{{8}}-{Hex}{{4}}-{Hex}{{4}}-{Hex}{{4}}-{Hex}{{12}}".format(
+    Hex=_HexChar)
 
-## Regular expressions for GUID matching
+# Regular expressions for GUID matching
 gGuidPattern = re.compile(r'{}'.format(_GuidPattern))
 gGuidPatternEnd = re.compile(r'{}$'.format(_GuidPattern))
 
-## Regular expressions for HEX matching
+# Regular expressions for HEX matching
 g4HexChar = re.compile(r'{}{{4}}'.format(_HexChar))
 gHexPattern = re.compile(r'0[xX]{}+'.format(_HexChar))
 gHexPatternAll = re.compile(r'0[xX]{}+$'.format(_HexChar))
 
-## Regular expressions for string identifier checking
+# Regular expressions for string identifier checking
 gIdentifierPattern = re.compile('^[a-zA-Z][a-zA-Z0-9_]*$', re.UNICODE)
-## Regular expression for GUID c structure format
+# Regular expression for GUID c structure format
 _GuidCFormatPattern = r"{{\s*0[xX]{Hex}{{1,8}}\s*,\s*0[xX]{Hex}{{1,4}}\s*,\s*0[xX]{Hex}{{1,4}}" \
                       r"\s*,\s*{{\s*0[xX]{Hex}{{1,2}}\s*,\s*0[xX]{Hex}{{1,2}}" \
                       r"\s*,\s*0[xX]{Hex}{{1,2}}\s*,\s*0[xX]{Hex}{{1,2}}" \
                       r"\s*,\s*0[xX]{Hex}{{1,2}}\s*,\s*0[xX]{Hex}{{1,2}}" \
-                      r"\s*,\s*0[xX]{Hex}{{1,2}}\s*,\s*0[xX]{Hex}{{1,2}}\s*}}\s*}}".format(Hex=_HexChar)
+                      r"\s*,\s*0[xX]{Hex}{{1,2}}\s*,\s*0[xX]{Hex}{{1,2}}\s*}}\s*}}".format(
+                          Hex=_HexChar)
 gGuidCFormatPattern = re.compile(r"{}".format(_GuidCFormatPattern))
 
 #
@@ -98,7 +101,7 @@ MixedPcd = {}
 
 # Structure Pcd dict
 gStructurePcd = {}
-gPcdSkuOverrides={}
+gPcdSkuOverrides = {}
 # Pcd name for the Pcd which used in the Conditional directives
 gConditionalPcds = []
 
@@ -122,4 +125,3 @@ gEnableGenfdsMultiThread = True
 gSikpAutoGenCache = set()
 # Common lock for the file access in multiple process AutoGens
 file_lock = None
-
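
For reference, the GUID patterns above can be tried standalone; this is a re-creation of the two pattern lines shown in the hunk, not an import of GlobalData:

    import re

    _HexChar = r"[0-9a-fA-F]"
    _GuidPattern = r"{Hex}{{8}}-{Hex}{{4}}-{Hex}{{4}}-{Hex}{{4}}-{Hex}{{12}}".format(
        Hex=_HexChar)
    gGuidPatternEnd = re.compile(r'{}$'.format(_GuidPattern))

    # Registry-format GUID strings match end-to-end; trailing characters do not.
    print(bool(gGuidPatternEnd.match('12345678-abcd-ef01-2345-6789abcdef01')))   # True
    print(bool(gGuidPatternEnd.match('12345678-abcd-ef01-2345-6789abcdef01}')))  # False
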
diff --git a/BaseTools/Source/Python/Common/LongFilePathOs.py b/BaseTools/Source/Python/Common/LongFilePathOs.py
index 190f36d7ec15..1f205ce14231 100644
--- a/BaseTools/Source/Python/Common/LongFilePathOs.py
+++ b/BaseTools/Source/Python/Common/LongFilePathOs.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Override built in module os to provide support for long file path
 #
 # Copyright (c) 2014 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -13,46 +13,58 @@ import time
 
 path = LongFilePathOsPath
 
+
 def access(path, mode):
     return os.access(LongFilePath(path), mode)
 
+
 def remove(path):
-   Timeout = 0.0
-   while Timeout < 5.0:
-       try:
-           return os.remove(LongFilePath(path))
-       except:
-           time.sleep(0.1)
-           Timeout = Timeout + 0.1
-   return os.remove(LongFilePath(path))
+    Timeout = 0.0
+    while Timeout < 5.0:
+        try:
+            return os.remove(LongFilePath(path))
+        except:
+            time.sleep(0.1)
+            Timeout = Timeout + 0.1
+    return os.remove(LongFilePath(path))
+
 
 def removedirs(name):
     return os.removedirs(LongFilePath(name))
 
+
 def rmdir(path):
     return os.rmdir(LongFilePath(path))
 
+
 def mkdir(path):
     return os.mkdir(LongFilePath(path))
 
+
 def makedirs(name, mode=0o777):
     return os.makedirs(LongFilePath(name), mode)
 
+
 def rename(old, new):
     return os.rename(LongFilePath(old), LongFilePath(new))
 
+
 def chdir(path):
     return os.chdir(LongFilePath(path))
 
+
 def chmod(path, mode):
     return os.chmod(LongFilePath(path), mode)
 
+
 def stat(path):
     return os.stat(LongFilePath(path))
 
+
 def utime(path, times):
     return os.utime(LongFilePath(path), times)
 
+
 def listdir(path):
     List = []
     uList = os.listdir(u"%s" % LongFilePath(path))
@@ -60,6 +72,7 @@ def listdir(path):
         List.append(Item)
     return List
 
+
 if hasattr(os, 'replace'):
     def replace(src, dst):
         return os.replace(LongFilePath(src), LongFilePath(dst))
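
A minimal usage sketch for the wrapper above (not part of the patch; same edk2 checkout and sys.path assumption), imported in place of the stdlib os:

    import sys
    sys.path.insert(0, 'BaseTools/Source/Python')

    import Common.LongFilePathOs as os   # drop-in for the stdlib os module

    os.mkdir('demo_dir')                 # 'demo_dir' is a made-up scratch directory
    print(os.stat('demo_dir').st_mode)
    print('demo_dir' in os.listdir('.'))
    os.rmdir('demo_dir')
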
diff --git a/BaseTools/Source/Python/Common/LongFilePathOsPath.py b/BaseTools/Source/Python/Common/LongFilePathOsPath.py
index 60a053652550..543ecb1b7bcd 100644
--- a/BaseTools/Source/Python/Common/LongFilePathOsPath.py
+++ b/BaseTools/Source/Python/Common/LongFilePathOsPath.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Override built in module os.path to provide support for long file path
 #
 # Copyright (c) 2014, Intel Corporation. All rights reserved.<BR>
@@ -8,27 +8,35 @@
 import os
 from Common.LongFilePathSupport import LongFilePath
 
+
 def isfile(path):
     return os.path.isfile(LongFilePath(path))
 
+
 def isdir(path):
     return os.path.isdir(LongFilePath(path))
 
+
 def exists(path):
     return os.path.exists(LongFilePath(path))
 
+
 def getsize(filename):
     return os.path.getsize(LongFilePath(filename))
 
+
 def getmtime(filename):
     return os.path.getmtime(LongFilePath(filename))
 
+
 def getatime(filename):
     return os.path.getatime(LongFilePath(filename))
 
+
 def getctime(filename):
     return os.path.getctime(LongFilePath(filename))
 
+
 join = os.path.join
 splitext = os.path.splitext
 splitdrive = os.path.splitdrive
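
Similarly for the os.path wrapper (same assumptions): filesystem queries are routed through LongFilePath(), while pure string helpers such as join are re-exported unchanged:

    import sys
    sys.path.insert(0, 'BaseTools/Source/Python')

    from Common import LongFilePathOsPath as lfp

    print(lfp.join('BaseTools', 'Source', 'Python'))   # plain os.path.join
    print(lfp.exists('BaseTools'))                     # path goes through LongFilePath() first
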
diff --git a/BaseTools/Source/Python/Common/LongFilePathSupport.py b/BaseTools/Source/Python/Common/LongFilePathSupport.py
index 38c4396544cc..ca6962e0a177 100644
--- a/BaseTools/Source/Python/Common/LongFilePathSupport.py
+++ b/BaseTools/Source/Python/Common/LongFilePathSupport.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Override built in function file.open to provide support for long file path
 #
 # Copyright (c) 2014 - 2015, Intel Corporation. All rights reserved.<BR>
@@ -14,6 +14,8 @@ import codecs
 # OpenLongPath
 # Convert a file path to a long file path
 #
+
+
 def LongFilePath(FileName):
     FileName = os.path.normpath(FileName)
     if platform.system() == 'Windows':
@@ -29,9 +31,12 @@ def LongFilePath(FileName):
 # OpenLongFilePath
 # wrap open to support opening a long file path
 #
-def OpenLongFilePath(FileName, Mode='r', Buffer= -1):
+
+
+def OpenLongFilePath(FileName, Mode='r', Buffer=-1):
     return open(LongFilePath(FileName), Mode, Buffer)
 
+
 def CodecOpenLongFilePath(Filename, Mode='rb', Encoding=None, Errors='strict', Buffering=1):
     return codecs.open(LongFilePath(Filename), Mode, Encoding, Errors, Buffering)
 
@@ -39,6 +44,8 @@ def CodecOpenLongFilePath(Filename, Mode='rb', Encoding=None, Errors='strict', B
 # CopyLongFilePath
 # wrap copyfile to support copy a long file path
 #
+
+
 def CopyLongFilePath(src, dst):
     with open(LongFilePath(src), 'rb') as fsrc:
         with open(LongFilePath(dst), 'wb') as fdst:
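
And a sketch for the open/copy helpers above (file names are hypothetical; same checkout assumption):

    import sys
    sys.path.insert(0, 'BaseTools/Source/Python')

    from Common.LongFilePathSupport import LongFilePath, OpenLongFilePath, CopyLongFilePath

    print(LongFilePath('Conf/target.txt'))          # normpath'd; adjusted on Windows for long-path support
    with OpenLongFilePath('Conf/target.txt') as f:  # 'Conf/target.txt' is a made-up example file
        print(f.read(80))
    CopyLongFilePath('Conf/target.txt', 'Conf/target.txt.bak')
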
diff --git a/BaseTools/Source/Python/Common/Misc.py b/BaseTools/Source/Python/Common/Misc.py
index 4be7957138a5..4dcf72d37e28 100755
--- a/BaseTools/Source/Python/Common/Misc.py
+++ b/BaseTools/Source/Python/Common/Misc.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Common routines used by all tools
 #
 # Copyright (c) 2007 - 2019, Intel Corporation. All rights reserved.<BR>
@@ -42,19 +42,22 @@ from Common.caching import cached_property
 import struct
 
 ArrayIndex = re.compile("\[\s*[0-9a-fA-FxX]*\s*\]")
-## Regular expression used to find out place holders in string template
-gPlaceholderPattern = re.compile("\$\{([^$()\s]+)\}", re.MULTILINE | re.UNICODE)
+# Regular expression used to find out place holders in string template
+gPlaceholderPattern = re.compile(
+    "\$\{([^$()\s]+)\}", re.MULTILINE | re.UNICODE)
 
-## regular expressions for map file processing
+# regular expressions for map file processing
 startPatternGeneral = re.compile("^Start[' ']+Length[' ']+Name[' ']+Class")
-addressPatternGeneral = re.compile("^Address[' ']+Publics by Value[' ']+Rva\+Base")
+addressPatternGeneral = re.compile(
+    "^Address[' ']+Publics by Value[' ']+Rva\+Base")
 valuePatternGcc = re.compile('^([\w_\.]+) +([\da-fA-Fx]+) +([\da-fA-Fx]+)$')
 pcdPatternGcc = re.compile('^([\da-fA-Fx]+) +([\da-fA-Fx]+)')
-secReGeneral = re.compile('^([\da-fA-F]+):([\da-fA-F]+) +([\da-fA-F]+)[Hh]? +([.\w\$]+) +(\w+)', re.UNICODE)
+secReGeneral = re.compile(
+    '^([\da-fA-F]+):([\da-fA-F]+) +([\da-fA-F]+)[Hh]? +([.\w\$]+) +(\w+)', re.UNICODE)
 
 StructPattern = re.compile(r'[_a-zA-Z][0-9A-Za-z_]*$')
 
-## Dictionary used to store dependencies of files
+# Dictionary used to store dependencies of files
 gDependencyDatabase = {}    # arch : {file path : [dependent files list]}
 
 #
@@ -64,6 +67,7 @@ gDependencyDatabase = {}    # arch : {file path : [dependent files list]}
 #
 _TempInfs = []
 
+
 def GetVariableOffset(mapfilepath, efifilepath, varnames):
     """ Parse map file to get variable offset in current EFI file
     @param mapfilepath    Map file absolution path
@@ -80,35 +84,39 @@ def GetVariableOffset(mapfilepath, efifilepath, varnames):
     except:
         return None
 
-    if len(lines) == 0: return None
+    if len(lines) == 0:
+        return None
     firstline = lines[0].strip()
     if re.match('^\s*Address\s*Size\s*Align\s*Out\s*In\s*Symbol\s*$', firstline):
         return _parseForXcodeAndClang9(lines, efifilepath, varnames)
     if (firstline.startswith("Archive member included ") and
-        firstline.endswith(" file (symbol)")):
+            firstline.endswith(" file (symbol)")):
         return _parseForGCC(lines, efifilepath, varnames)
     if firstline.startswith("# Path:"):
         return _parseForXcodeAndClang9(lines, efifilepath, varnames)
     return _parseGeneral(lines, efifilepath, varnames)
 
+
 def _parseForXcodeAndClang9(lines, efifilepath, varnames):
     status = 0
     ret = []
     for line in lines:
         line = line.strip()
-        if status == 0 and (re.match('^\s*Address\s*Size\s*Align\s*Out\s*In\s*Symbol\s*$', line) \
-            or line == "# Symbols:"):
+        if status == 0 and (re.match('^\s*Address\s*Size\s*Align\s*Out\s*In\s*Symbol\s*$', line)
+                            or line == "# Symbols:"):
             status = 1
             continue
         if status == 1 and len(line) != 0:
             for varname in varnames:
                 if varname in line:
                     # cannot pregenerate this RegEx since it uses varname from varnames.
-                    m = re.match('^([\da-fA-FxX]+)([\s\S]*)([_]*%s)$' % varname, line)
+                    m = re.match(
+                        '^([\da-fA-FxX]+)([\s\S]*)([_]*%s)$' % varname, line)
                     if m is not None:
                         ret.append((varname, m.group(1)))
     return ret
 
+
 def _parseForGCC(lines, efifilepath, varnames):
     """ Parse map file generated by GCC linker """
     status = 0
@@ -123,7 +131,7 @@ def _parseForGCC(lines, efifilepath, varnames):
         elif status == 1 and line == 'Linker script and memory map':
             status = 2
             continue
-        elif status ==2 and line == 'START GROUP':
+        elif status == 2 and line == 'START GROUP':
             status = 3
             continue
 
@@ -144,7 +152,8 @@ def _parseForGCC(lines, efifilepath, varnames):
                     if Str:
                         m = pcdPatternGcc.match(Str.strip())
                         if m is not None:
-                            varoffset.append((varname, int(m.groups(0)[0], 16), int(sections[-1][1], 16), sections[-1][0]))
+                            varoffset.append((varname, int(m.groups(0)[0], 16), int(
+                                sections[-1][1], 16), sections[-1][0]))
 
     if not varoffset:
         return []
@@ -152,7 +161,7 @@ def _parseForGCC(lines, efifilepath, varnames):
     efisecs = PeImageClass(efifilepath).SectionHeaderList
     if efisecs is None or len(efisecs) == 0:
         return []
-    #redirection
+    # redirection
     redirection = 0
     for efisec in efisecs:
         for section in sections:
@@ -163,14 +172,17 @@ def _parseForGCC(lines, efifilepath, varnames):
     for var in varoffset:
         for efisec in efisecs:
             if var[1] >= efisec[1] and var[1] < efisec[1]+efisec[3]:
-                ret.append((var[0], hex(efisec[2] + var[1] - efisec[1] - redirection)))
+                ret.append(
+                    (var[0], hex(efisec[2] + var[1] - efisec[1] - redirection)))
     return ret
 
+
 def _parseGeneral(lines, efifilepath, varnames):
-    status = 0    #0 - beginning of file; 1 - PE section definition; 2 - symbol table
-    secs  = []    # key = section name
+    status = 0  # 0 - beginning of file; 1 - PE section definition; 2 - symbol table
+    secs = []    # key = section name
     varoffset = []
-    symRe = re.compile('^([\da-fA-F]+):([\da-fA-F]+) +([\.:\\\\\w\?@\$-]+) +([\da-fA-F]+)', re.UNICODE)
+    symRe = re.compile(
+        '^([\da-fA-F]+):([\da-fA-F]+) +([\.:\\\\\w\?@\$-]+) +([\da-fA-F]+)', re.UNICODE)
 
     for line in lines:
         line = line.strip()
@@ -184,27 +196,30 @@ def _parseGeneral(lines, efifilepath, varnames):
             status = 3
             continue
         if status == 1 and len(line) != 0:
-            m =  secReGeneral.match(line)
+            m = secReGeneral.match(line)
             assert m is not None, "Fail to parse the section in map file , line is %s" % line
             sec_no, sec_start, sec_length, sec_name, sec_class = m.groups(0)
-            secs.append([int(sec_no, 16), int(sec_start, 16), int(sec_length, 16), sec_name, sec_class])
+            secs.append([int(sec_no, 16), int(sec_start, 16),
+                        int(sec_length, 16), sec_name, sec_class])
         if status == 2 and len(line) != 0:
             for varname in varnames:
                 m = symRe.match(line)
                 assert m is not None, "Fail to parse the symbol in map file, line is %s" % line
                 sec_no, sym_offset, sym_name, vir_addr = m.groups(0)
-                sec_no     = int(sec_no,     16)
+                sec_no = int(sec_no,     16)
                 sym_offset = int(sym_offset, 16)
-                vir_addr   = int(vir_addr,   16)
+                vir_addr = int(vir_addr,   16)
                 # cannot pregenerate this RegEx since it uses varname from varnames.
                 m2 = re.match('^[_]*(%s)' % varname, sym_name)
                 if m2 is not None:
                     # fond a binary pcd entry in map file
                     for sec in secs:
                         if sec[0] == sec_no and (sym_offset >= sec[1] and sym_offset < sec[1] + sec[2]):
-                            varoffset.append([varname, sec[3], sym_offset, vir_addr, sec_no])
+                            varoffset.append(
+                                [varname, sec[3], sym_offset, vir_addr, sec_no])
 
-    if not varoffset: return []
+    if not varoffset:
+        return []
 
     # get section information from efi file
     efisecs = PeImageClass(efifilepath).SectionHeaderList
@@ -223,7 +238,7 @@ def _parseGeneral(lines, efifilepath, varnames):
 
     return ret
 
-## Routine to process duplicated INF
+# Routine to process duplicated INF
 #
 #  This function is called by following two cases:
 #  Case 1 in DSC:
@@ -245,6 +260,8 @@ def _parseGeneral(lines, efifilepath, varnames):
 #
 #  @retval         return the new PathClass object
 #
+
+
 def ProcessDuplicatedInf(Path, BaseName, Workspace):
     Filename = os.path.split(Path.File)[1]
     if '.' in Filename:
@@ -295,20 +312,24 @@ def ProcessDuplicatedInf(Path, BaseName, Workspace):
     shutil.copy2(str(Path), TempFullPath)
     return RtPath
 
-## Remove temporary created INFs whose paths were saved in _TempInfs
+# Remove temporary created INFs whose paths were saved in _TempInfs
 #
+
+
 def ClearDuplicatedInf():
     while _TempInfs:
         File = _TempInfs.pop()
         if os.path.exists(File):
             os.remove(File)
 
-## Convert GUID string in xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx style to C structure style
+# Convert GUID string in xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx style to C structure style
 #
 #   @param      Guid    The GUID string
 #
 #   @retval     string  The GUID string in C structure style
 #
+
+
 def GuidStringToGuidStructureString(Guid):
     GuidList = Guid.split('-')
     Result = '{'
@@ -320,100 +341,112 @@ def GuidStringToGuidStructureString(Guid):
     Result += '}}'
     return Result
 
-## Convert GUID structure in byte array to xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
+# Convert GUID structure in byte array to xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
 #
 #   @param      GuidValue   The GUID value in byte array
 #
 #   @retval     string      The GUID value in xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx format
 #
+
+
 def GuidStructureByteArrayToGuidString(GuidValue):
-    guidValueString = GuidValue.lower().replace("{", "").replace("}", "").replace(" ", "").replace(";", "")
+    guidValueString = GuidValue.lower().replace(
+        "{", "").replace("}", "").replace(" ", "").replace(";", "")
     guidValueList = guidValueString.split(",")
     if len(guidValueList) != 16:
         return ''
         #EdkLogger.error(None, None, "Invalid GUID value string %s" % GuidValue)
     try:
         return "%02x%02x%02x%02x-%02x%02x-%02x%02x-%02x%02x-%02x%02x%02x%02x%02x%02x" % (
-                int(guidValueList[3], 16),
-                int(guidValueList[2], 16),
-                int(guidValueList[1], 16),
-                int(guidValueList[0], 16),
-                int(guidValueList[5], 16),
-                int(guidValueList[4], 16),
-                int(guidValueList[7], 16),
-                int(guidValueList[6], 16),
-                int(guidValueList[8], 16),
-                int(guidValueList[9], 16),
-                int(guidValueList[10], 16),
-                int(guidValueList[11], 16),
-                int(guidValueList[12], 16),
-                int(guidValueList[13], 16),
-                int(guidValueList[14], 16),
-                int(guidValueList[15], 16)
-                )
+            int(guidValueList[3], 16),
+            int(guidValueList[2], 16),
+            int(guidValueList[1], 16),
+            int(guidValueList[0], 16),
+            int(guidValueList[5], 16),
+            int(guidValueList[4], 16),
+            int(guidValueList[7], 16),
+            int(guidValueList[6], 16),
+            int(guidValueList[8], 16),
+            int(guidValueList[9], 16),
+            int(guidValueList[10], 16),
+            int(guidValueList[11], 16),
+            int(guidValueList[12], 16),
+            int(guidValueList[13], 16),
+            int(guidValueList[14], 16),
+            int(guidValueList[15], 16)
+        )
     except:
         return ''
 
-## Convert GUID string in C structure style to xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
+# Convert GUID string in C structure style to xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
 #
 #   @param      GuidValue   The GUID value in C structure format
 #
 #   @retval     string      The GUID value in xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx format
 #
+
+
 def GuidStructureStringToGuidString(GuidValue):
     if not GlobalData.gGuidCFormatPattern.match(GuidValue):
         return ''
-    guidValueString = GuidValue.lower().replace("{", "").replace("}", "").replace(" ", "").replace(";", "")
+    guidValueString = GuidValue.lower().replace(
+        "{", "").replace("}", "").replace(" ", "").replace(";", "")
     guidValueList = guidValueString.split(",")
     if len(guidValueList) != 11:
         return ''
         #EdkLogger.error(None, None, "Invalid GUID value string %s" % GuidValue)
     try:
         return "%08x-%04x-%04x-%02x%02x-%02x%02x%02x%02x%02x%02x" % (
-                int(guidValueList[0], 16),
-                int(guidValueList[1], 16),
-                int(guidValueList[2], 16),
-                int(guidValueList[3], 16),
-                int(guidValueList[4], 16),
-                int(guidValueList[5], 16),
-                int(guidValueList[6], 16),
-                int(guidValueList[7], 16),
-                int(guidValueList[8], 16),
-                int(guidValueList[9], 16),
-                int(guidValueList[10], 16)
-                )
+            int(guidValueList[0], 16),
+            int(guidValueList[1], 16),
+            int(guidValueList[2], 16),
+            int(guidValueList[3], 16),
+            int(guidValueList[4], 16),
+            int(guidValueList[5], 16),
+            int(guidValueList[6], 16),
+            int(guidValueList[7], 16),
+            int(guidValueList[8], 16),
+            int(guidValueList[9], 16),
+            int(guidValueList[10], 16)
+        )
     except:
         return ''
 
-## Convert GUID string in C structure style to xxxxxxxx_xxxx_xxxx_xxxx_xxxxxxxxxxxx
+# Convert GUID string in C structure style to xxxxxxxx_xxxx_xxxx_xxxx_xxxxxxxxxxxx
 #
 #   @param      GuidValue   The GUID value in C structure format
 #
 #   @retval     string      The GUID value in xxxxxxxx_xxxx_xxxx_xxxx_xxxxxxxxxxxx format
 #
+
+
 def GuidStructureStringToGuidValueName(GuidValue):
-    guidValueString = GuidValue.lower().replace("{", "").replace("}", "").replace(" ", "")
+    guidValueString = GuidValue.lower().replace(
+        "{", "").replace("}", "").replace(" ", "")
     guidValueList = guidValueString.split(",")
     if len(guidValueList) != 11:
-        EdkLogger.error(None, FORMAT_INVALID, "Invalid GUID value string [%s]" % GuidValue)
+        EdkLogger.error(None, FORMAT_INVALID,
+                        "Invalid GUID value string [%s]" % GuidValue)
     return "%08x_%04x_%04x_%02x%02x_%02x%02x%02x%02x%02x%02x" % (
-            int(guidValueList[0], 16),
-            int(guidValueList[1], 16),
-            int(guidValueList[2], 16),
-            int(guidValueList[3], 16),
-            int(guidValueList[4], 16),
-            int(guidValueList[5], 16),
-            int(guidValueList[6], 16),
-            int(guidValueList[7], 16),
-            int(guidValueList[8], 16),
-            int(guidValueList[9], 16),
-            int(guidValueList[10], 16)
-            )
+        int(guidValueList[0], 16),
+        int(guidValueList[1], 16),
+        int(guidValueList[2], 16),
+        int(guidValueList[3], 16),
+        int(guidValueList[4], 16),
+        int(guidValueList[5], 16),
+        int(guidValueList[6], 16),
+        int(guidValueList[7], 16),
+        int(guidValueList[8], 16),
+        int(guidValueList[9], 16),
+        int(guidValueList[10], 16)
+    )
 
-## Create directories
+# Create directories
 #
 #   @param      Directory   The directory name
 #
+
+
 def CreateDirectory(Directory):
     if Directory is None or Directory.strip() == "":
         return True
@@ -424,10 +457,12 @@ def CreateDirectory(Directory):
         return False
     return True
 
-## Remove directories, including files and sub-directories in it
+# Remove directories, including files and sub-directories in it
 #
 #   @param      Directory   The directory name
 #
+
+
 def RemoveDirectory(Directory, Recursively=False):
     if Directory is None or Directory.strip() == "" or not os.path.exists(Directory):
         return
@@ -442,7 +477,7 @@ def RemoveDirectory(Directory, Recursively=False):
         os.chdir(CurrentDirectory)
     os.rmdir(Directory)
 
-## Store content in file
+# Store content in file
 #
 #  This method is used to save file only when its content is changed. This is
 #  quite useful for "make" system to decide what will be re-built and what won't.
@@ -454,6 +489,8 @@ def RemoveDirectory(Directory, Recursively=False):
 #   @retval     True            If the file content is changed and the file is renewed
 #   @retval     False           If the file content is the same
 #
+
+
 def SaveFileOnChange(File, Content, IsBinaryFile=True, FileLock=None):
 
     # Convert to long file path format
@@ -477,12 +514,14 @@ def SaveFileOnChange(File, Content, IsBinaryFile=True, FileLock=None):
 
     DirName = os.path.dirname(File)
     if not CreateDirectory(DirName):
-        EdkLogger.error(None, FILE_CREATE_FAILURE, "Could not create directory %s" % DirName)
+        EdkLogger.error(None, FILE_CREATE_FAILURE,
+                        "Could not create directory %s" % DirName)
     else:
         if DirName == '':
             DirName = os.getcwd()
         if not os.access(DirName, os.W_OK):
-            EdkLogger.error(None, PERMISSION_FAILURE, "Do not have write permission on directory %s" % DirName)
+            EdkLogger.error(None, PERMISSION_FAILURE,
+                            "Do not have write permission on directory %s" % DirName)
 
     OpenMode = "w"
     if IsBinaryFile:
@@ -494,16 +533,17 @@ def SaveFileOnChange(File, Content, IsBinaryFile=True, FileLock=None):
     if FileLock:
         FileLock.acquire()
 
-
     if GlobalData.gIsWindows and not os.path.exists(File):
         try:
             with open(File, OpenMode) as tf:
                 tf.write(Content)
         except IOError as X:
             if GlobalData.gBinCacheSource:
-                EdkLogger.quiet("[cache error]:fails to save file with error: %s" % (X))
+                EdkLogger.quiet(
+                    "[cache error]:fails to save file with error: %s" % (X))
             else:
-                EdkLogger.error(None, FILE_CREATE_FAILURE, ExtraData='IOError %s' % X)
+                EdkLogger.error(None, FILE_CREATE_FAILURE,
+                                ExtraData='IOError %s' % X)
         finally:
             if FileLock:
                 FileLock.release()
@@ -513,16 +553,18 @@ def SaveFileOnChange(File, Content, IsBinaryFile=True, FileLock=None):
                 Fd.write(Content)
         except IOError as X:
             if GlobalData.gBinCacheSource:
-                EdkLogger.quiet("[cache error]:fails to save file with error: %s" % (X))
+                EdkLogger.quiet(
+                    "[cache error]:fails to save file with error: %s" % (X))
             else:
-                EdkLogger.error(None, FILE_CREATE_FAILURE, ExtraData='IOError %s' % X)
+                EdkLogger.error(None, FILE_CREATE_FAILURE,
+                                ExtraData='IOError %s' % X)
         finally:
             if FileLock:
                 FileLock.release()
 
     return True
 
-## Copy source file only if it is different from the destination file
+# Copy source file only if it is different from the destination file
 #
 #  This method is used to copy file only if the source file and destination
 #  file content are different. This is quite useful to avoid duplicated
@@ -534,6 +576,8 @@ def SaveFileOnChange(File, Content, IsBinaryFile=True, FileLock=None):
 #   @retval     True      The two files content are different and the file is copied
 #   @retval     False     No copy really happen
 #
+
+
 def CopyFileOnChange(SrcFile, Dst, FileLock=None):
 
     # Convert to long file path format
@@ -541,7 +585,8 @@ def CopyFileOnChange(SrcFile, Dst, FileLock=None):
     Dst = LongFilePath(Dst)
 
     if os.path.isdir(SrcFile):
-        EdkLogger.error(None, FILE_COPY_FAILURE, ExtraData='CopyFileOnChange SrcFile is a dir, not a file: %s' % SrcFile)
+        EdkLogger.error(None, FILE_COPY_FAILURE,
+                        ExtraData='CopyFileOnChange SrcFile is a dir, not a file: %s' % SrcFile)
         return False
 
     if os.path.isdir(Dst):
@@ -554,12 +599,14 @@ def CopyFileOnChange(SrcFile, Dst, FileLock=None):
 
     DirName = os.path.dirname(DstFile)
     if not CreateDirectory(DirName):
-        EdkLogger.error(None, FILE_CREATE_FAILURE, "Could not create directory %s" % DirName)
+        EdkLogger.error(None, FILE_CREATE_FAILURE,
+                        "Could not create directory %s" % DirName)
     else:
         if DirName == '':
             DirName = os.getcwd()
         if not os.access(DirName, os.W_OK):
-            EdkLogger.error(None, PERMISSION_FAILURE, "Do not have write permission on directory %s" % DirName)
+            EdkLogger.error(None, PERMISSION_FAILURE,
+                            "Do not have write permission on directory %s" % DirName)
 
     # use default file_lock if no input new lock
     if not FileLock:
@@ -571,22 +618,26 @@ def CopyFileOnChange(SrcFile, Dst, FileLock=None):
         CopyLong(SrcFile, DstFile)
     except IOError as X:
         if GlobalData.gBinCacheSource:
-            EdkLogger.quiet("[cache error]:fails to copy file with error: %s" % (X))
+            EdkLogger.quiet(
+                "[cache error]:fails to copy file with error: %s" % (X))
         else:
-            EdkLogger.error(None, FILE_COPY_FAILURE, ExtraData='IOError %s' % X)
+            EdkLogger.error(None, FILE_COPY_FAILURE,
+                            ExtraData='IOError %s' % X)
     finally:
         if FileLock:
             FileLock.release()
 
     return True
 
-## Retrieve and cache the real path name in file system
+# Retrieve and cache the real path name in file system
 #
 #   @param      Root    The root directory of path relative to
 #
 #   @retval     str     The path string if the path exists
 #   @retval     None    If path doesn't exist
 #
+
+
 class DirCache:
     _CACHE_ = set()
     _UPPER_CACHE_ = {}
@@ -651,6 +702,7 @@ class DirCache:
             return os.path.join(self._Root, self._UPPER_CACHE_[UpperPath])
         return None
 
+
 def RealPath(File, Dir='', OverrideDir=''):
     NewFile = os.path.normpath(os.path.join(Dir, File))
     NewFile = GlobalData.gAllFiles[NewFile]
@@ -659,7 +711,7 @@ def RealPath(File, Dir='', OverrideDir=''):
         NewFile = GlobalData.gAllFiles[NewFile]
     return NewFile
 
-## Get GUID value from given packages
+# Get GUID value from given packages
 #
 #   @param      CName           The CName of the GUID
 #   @param      PackageList     List of packages looking-up in
@@ -668,7 +720,9 @@ def RealPath(File, Dir='', OverrideDir=''):
 #   @retval     GuidValue   if the CName is found in any given package
 #   @retval     None        if the CName is not found in all given packages
 #
-def GuidValue(CName, PackageList, Inffile = None):
+
+
+def GuidValue(CName, PackageList, Inffile=None):
     for P in PackageList:
         GuidKeys = list(P.Guids.keys())
         if Inffile and P._PrivateGuids:
@@ -678,7 +732,7 @@ def GuidValue(CName, PackageList, Inffile = None):
             return P.Guids[CName]
     return None
 
-## A string template class
+# A string template class
 #
 #  This class implements a template for string replacement. A string template
 #  looks like following
@@ -691,6 +745,8 @@ def GuidValue(CName, PackageList, Inffile = None):
 #  be not used and, in this case, the "placeholder_name" must not a list and it
 #  will just be replaced once.
 #
+
+
 class TemplateString(object):
     _REPEAT_START_FLAG = "BEGIN"
     _REPEAT_END_FLAG = "END"
@@ -712,12 +768,14 @@ class TemplateString(object):
                 #   PlaceHolderName, PlaceHolderStartPoint, PlaceHolderEndPoint
                 #
                 for PlaceHolder, Start, End in PlaceHolderList:
-                    self._SubSectionList.append(TemplateSection[SubSectionStart:Start])
+                    self._SubSectionList.append(
+                        TemplateSection[SubSectionStart:Start])
                     self._SubSectionList.append(TemplateSection[Start:End])
                     self._PlaceHolderList.append(PlaceHolder)
                     SubSectionStart = End
                 if SubSectionStart < len(TemplateSection):
-                    self._SubSectionList.append(TemplateSection[SubSectionStart:])
+                    self._SubSectionList.append(
+                        TemplateSection[SubSectionStart:])
             else:
                 self._SubSectionList = [TemplateSection]
 
@@ -738,11 +796,11 @@ class TemplateString(object):
                         RepeatTime = len(Value)
                     elif RepeatTime != len(Value):
                         EdkLogger.error(
-                                    "TemplateString",
-                                    PARAMETER_INVALID,
-                                    "${%s} has different repeat time from others!" % PlaceHolder,
-                                    ExtraData=str(self._Template)
-                                    )
+                            "TemplateString",
+                            PARAMETER_INVALID,
+                            "${%s} has different repeat time from others!" % PlaceHolder,
+                            ExtraData=str(self._Template)
+                        )
                     RepeatPlaceHolders["${%s}" % PlaceHolder] = Value
                 else:
                     NonRepeatPlaceHolders["${%s}" % PlaceHolder] = Value
@@ -764,26 +822,27 @@ class TemplateString(object):
                         if S not in RepeatPlaceHolders:
                             TempStringList.append(S)
                         else:
-                            TempStringList.append(str(RepeatPlaceHolders[S][Index]))
+                            TempStringList.append(
+                                str(RepeatPlaceHolders[S][Index]))
                 StringList = TempStringList
 
             return "".join(StringList)
 
-    ## Constructor
+    # Constructor
     def __init__(self, Template=None):
         self.String = []
         self.IsBinary = False
         self._Template = Template
         self._TemplateSectionList = self._Parse(Template)
 
-    ## str() operator
+    # str() operator
     #
     #   @retval     string  The string replaced
     #
     def __str__(self):
         return "".join(self.String)
 
-    ## Split the template string into fragments per the ${BEGIN} and ${END} flags
+    # Split the template string into fragments per the ${BEGIN} and ${END} flags
     #
     #   @retval     list    A list of TemplateString.Section objects
     #
@@ -797,7 +856,8 @@ class TemplateString(object):
             MatchObj = gPlaceholderPattern.search(Template, SearchFrom)
             if not MatchObj:
                 if MatchEnd <= len(Template):
-                    TemplateSection = TemplateString.Section(Template[SectionStart:], PlaceHolderList)
+                    TemplateSection = TemplateString.Section(
+                        Template[SectionStart:], PlaceHolderList)
                     TemplateSectionList.append(TemplateSection)
                 break
 
@@ -807,21 +867,24 @@ class TemplateString(object):
 
             if MatchString == self._REPEAT_START_FLAG:
                 if MatchStart > SectionStart:
-                    TemplateSection = TemplateString.Section(Template[SectionStart:MatchStart], PlaceHolderList)
+                    TemplateSection = TemplateString.Section(
+                        Template[SectionStart:MatchStart], PlaceHolderList)
                     TemplateSectionList.append(TemplateSection)
                 SectionStart = MatchEnd
                 PlaceHolderList = []
             elif MatchString == self._REPEAT_END_FLAG:
-                TemplateSection = TemplateString.Section(Template[SectionStart:MatchStart], PlaceHolderList)
+                TemplateSection = TemplateString.Section(
+                    Template[SectionStart:MatchStart], PlaceHolderList)
                 TemplateSectionList.append(TemplateSection)
                 SectionStart = MatchEnd
                 PlaceHolderList = []
             else:
-                PlaceHolderList.append((MatchString, MatchStart - SectionStart, MatchEnd - SectionStart))
+                PlaceHolderList.append(
+                    (MatchString, MatchStart - SectionStart, MatchEnd - SectionStart))
             SearchFrom = MatchEnd
         return TemplateSectionList
 
-    ## Replace the string template with dictionary of placeholders and append it to previous one
+    # Replace the string template with dictionary of placeholders and append it to previous one
     #
     #   @param      AppendString    The string template to append
     #   @param      Dictionary      The placeholder dictionaries
@@ -829,14 +892,15 @@ class TemplateString(object):
     def Append(self, AppendString, Dictionary=None):
         if Dictionary:
             SectionList = self._Parse(AppendString)
-            self.String.append( "".join(S.Instantiate(Dictionary) for S in SectionList))
+            self.String.append("".join(S.Instantiate(Dictionary)
+                               for S in SectionList))
         else:
-            if isinstance(AppendString,list):
+            if isinstance(AppendString, list):
                 self.String.extend(AppendString)
             else:
                 self.String.append(AppendString)
 
-    ## Replace the string template with dictionary of placeholders
+    # Replace the string template with dictionary of placeholders
     #
     #   @param      Dictionary      The placeholder dictionaries
     #
@@ -845,17 +909,19 @@ class TemplateString(object):
     def Replace(self, Dictionary=None):
         return "".join(S.Instantiate(Dictionary) for S in self._TemplateSectionList)
 
-## Progress indicator class
+# Progress indicator class
 #
 #  This class makes use of thread to print progress on console.
 #
+
+
 class Progressor:
     # for avoiding deadloop
     _StopFlag = None
     _ProgressThread = None
     _CheckInterval = 0.25
 
-    ## Constructor
+    # Constructor
     #
     #   @param      OpenMessage     The string printed before progress characters
     #   @param      CloseMessage    The string printed after progress characters
@@ -870,7 +936,7 @@ class Progressor:
         if Progressor._StopFlag is None:
             Progressor._StopFlag = threading.Event()
 
-    ## Start to print progress character
+    # Start to print progress character
     #
     #   @param      OpenMessage     The string printed before progress characters
     #
@@ -879,11 +945,12 @@ class Progressor:
             self.PromptMessage = OpenMessage
         Progressor._StopFlag.clear()
         if Progressor._ProgressThread is None:
-            Progressor._ProgressThread = threading.Thread(target=self._ProgressThreadEntry)
+            Progressor._ProgressThread = threading.Thread(
+                target=self._ProgressThreadEntry)
             Progressor._ProgressThread.setDaemon(False)
             Progressor._ProgressThread.start()
 
-    ## Stop printing progress character
+    # Stop printing progress character
     #
     #   @param      CloseMessage    The string printed after progress characters
     #
@@ -894,7 +961,7 @@ class Progressor:
         self.Abort()
         self.CodaMessage = OriginalCodaMessage
 
-    ## Thread entry method
+    # Thread entry method
     def _ProgressThreadEntry(self):
         sys.stdout.write(self.PromptMessage + " ")
         sys.stdout.flush()
@@ -909,7 +976,7 @@ class Progressor:
         sys.stdout.write(" " + self.CodaMessage + "\n")
         sys.stdout.flush()
 
-    ## Abort the progress display
+    # Abort the progress display
     @staticmethod
     def Abort():
         if Progressor._StopFlag is not None:
@@ -919,7 +986,7 @@ class Progressor:
             Progressor._ProgressThread = None
 
 
-## Dictionary using prioritized list as key
+# Dictionary using prioritized list as key
 #
 class tdict:
     _ListType = type([])
@@ -957,7 +1024,7 @@ class tdict:
 
     def _GetSingleValue(self, FirstKey, RestKeys):
         Value = None
-        #print "%s-%s" % (FirstKey, self._Level_) ,
+        # print "%s-%s" % (FirstKey, self._Level_) ,
         if self._Level_ > 1:
             if FirstKey == self._Wildcard:
                 if FirstKey in self.data:
@@ -965,12 +1032,13 @@ class tdict:
                 if Value is None:
                     for Key in self.data:
                         Value = self.data[Key][RestKeys]
-                        if Value is not None: break
+                        if Value is not None:
+                            break
             else:
                 if FirstKey in self.data:
                     Value = self.data[FirstKey][RestKeys]
                 if Value is None and self._Wildcard in self.data:
-                    #print "Value=None"
+                    # print "Value=None"
                     Value = self.data[self._Wildcard][RestKeys]
         else:
             if FirstKey == self._Wildcard:
@@ -979,7 +1047,8 @@ class tdict:
                 if Value is None:
                     for Key in self.data:
                         Value = self.data[Key]
-                        if Value is not None: break
+                        if Value is not None:
+                            break
             else:
                 if FirstKey in self.data:
                     Value = self.data[FirstKey]
@@ -1057,6 +1126,7 @@ class tdict:
                 keys |= self.data[Key].GetKeys(KeyIndex - 1)
             return keys
 
+
 def AnalyzePcdExpression(Setting):
     RanStr = ''.join(sample(string.ascii_letters + string.digits, 8))
     Setting = Setting.replace('\\\\', RanStr).strip()
@@ -1094,20 +1164,22 @@ def AnalyzePcdExpression(Setting):
         StartPos = Pos + 1
     for i, ch in enumerate(FieldList):
         if RanStr in ch:
-            FieldList[i] = ch.replace(RanStr,'\\\\')
+            FieldList[i] = ch.replace(RanStr, '\\\\')
     return FieldList
 
-def ParseFieldValue (Value):
-    def ParseDevPathValue (Value):
+
+def ParseFieldValue(Value):
+    def ParseDevPathValue(Value):
         if '\\' in Value:
             Value.replace('\\', '/').replace(' ', '')
 
         Cmd = 'DevicePath ' + '"' + Value + '"'
         try:
-            p = subprocess.Popen(Cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
+            p = subprocess.Popen(Cmd, stdout=subprocess.PIPE,
+                                 stderr=subprocess.PIPE, shell=True)
             out, err = p.communicate()
         except Exception as X:
-            raise BadExpression("DevicePath: %s" % (str(X)) )
+            raise BadExpression("DevicePath: %s" % (str(X)))
         finally:
             subprocess._cleanup()
             p.stdout.close()
@@ -1124,27 +1196,31 @@ def ParseFieldValue (Value):
     if isinstance(Value, type(0)):
         return Value, (Value.bit_length() + 7) // 8
     if not isinstance(Value, type('')):
-        raise BadExpression('Type %s is %s' %(Value, type(Value)))
+        raise BadExpression('Type %s is %s' % (Value, type(Value)))
     Value = Value.strip()
     if Value.startswith(TAB_UINT8) and Value.endswith(')'):
         Value, Size = ParseFieldValue(Value.split('(', 1)[1][:-1])
         if Size > 1:
-            raise BadExpression('Value (%s) Size larger than %d' %(Value, Size))
+            raise BadExpression(
+                'Value (%s) Size larger than %d' % (Value, Size))
         return Value, 1
     if Value.startswith(TAB_UINT16) and Value.endswith(')'):
         Value, Size = ParseFieldValue(Value.split('(', 1)[1][:-1])
         if Size > 2:
-            raise BadExpression('Value (%s) Size larger than %d' %(Value, Size))
+            raise BadExpression(
+                'Value (%s) Size larger than %d' % (Value, Size))
         return Value, 2
     if Value.startswith(TAB_UINT32) and Value.endswith(')'):
         Value, Size = ParseFieldValue(Value.split('(', 1)[1][:-1])
         if Size > 4:
-            raise BadExpression('Value (%s) Size larger than %d' %(Value, Size))
+            raise BadExpression(
+                'Value (%s) Size larger than %d' % (Value, Size))
         return Value, 4
     if Value.startswith(TAB_UINT64) and Value.endswith(')'):
         Value, Size = ParseFieldValue(Value.split('(', 1)[1][:-1])
         if Size > 8:
-            raise BadExpression('Value (%s) Size larger than %d' % (Value, Size))
+            raise BadExpression(
+                'Value (%s) Size larger than %d' % (Value, Size))
         return Value, 8
     if Value.startswith(TAB_GUID) and Value.endswith(')'):
         Value = Value.split('(', 1)[1][:-1].strip()
@@ -1158,7 +1234,7 @@ def ParseFieldValue (Value):
         try:
             Value = uuid.UUID(Value).bytes_le
             ValueL, ValueH = struct.unpack('2Q', Value)
-            Value = (ValueH << 64 ) | ValueL
+            Value = (ValueH << 64) | ValueL
 
         except ValueError as Message:
             raise BadExpression(Message)
@@ -1257,7 +1333,7 @@ def ParseFieldValue (Value):
         return 0, 1
     return Value, 1
 
-## AnalyzeDscPcd
+# AnalyzeDscPcd
 #
 #  Analyze DSC PCD value, since there is no data type info in DSC
 #  This function is used to match functions (AnalyzePcdData) used for retrieving PCD value from database
@@ -1282,6 +1358,8 @@ def ParseFieldValue (Value):
 #    IsValid:   True if conforming EBNF, otherwise False
 #    Index:     The index where PcdValue is in ValueList
 #
+
+
 def AnalyzeDscPcd(Setting, PcdType, DataType=''):
     FieldList = AnalyzePcdExpression(Setting)
 
@@ -1350,7 +1428,7 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=''):
         return [HiiString, Guid, Offset, Value, Attribute], IsValid, 3
     return [], False, 0
 
-## AnalyzePcdData
+# AnalyzePcdData
 #
 #  Analyze the pcd Value, Datum type and TokenNumber.
 #  Used to avoid split issue while the value string contain "|" character
@@ -1359,6 +1437,8 @@ def AnalyzeDscPcd(Setting, PcdType, DataType=''):
 #
 #  @retval   ValueList: A List contain value, datum type and toke number.
 #
+
+
 def AnalyzePcdData(Setting):
     ValueList = ['', '', '']
 
@@ -1379,18 +1459,21 @@ def AnalyzePcdData(Setting):
 
     return ValueList
 
-## check format of PCD value against its the datum type
+# check format of PCD value against its the datum type
 #
 # For PCD value setting
 #
+
+
 def CheckPcdDatum(Type, Value):
     if Type == TAB_VOID:
         ValueRe = re.compile(r'\s*L?\".*\"\s*$')
         if not (((Value.startswith('L"') or Value.startswith('"')) and Value.endswith('"'))
                 or (Value.startswith('{') and Value.endswith('}')) or (Value.startswith("L'") or Value.startswith("'") and Value.endswith("'"))
-               ):
+                ):
             return False, "Invalid value [%s] of type [%s]; must be in the form of {...} for array"\
-                          ", \"...\" or \'...\' for string, L\"...\" or L\'...\' for unicode string" % (Value, Type)
+                          ", \"...\" or \'...\' for string, L\"...\" or L\'...\' for unicode string" % (
+                              Value, Type)
         elif ValueRe.match(Value):
             # Check the chars in UnicodeString or CString is printable
             if Value.startswith("L"):
@@ -1419,12 +1502,14 @@ def CheckPcdDatum(Type, Value):
                 return False, "Too large PCD value[%s] for datum type [%s]" % (Value, Type)
         except:
             return False, "Invalid value [%s] of type [%s];"\
-                          " must be a hexadecimal, decimal or octal in C language format." % (Value, Type)
+                          " must be a hexadecimal, decimal or octal in C language format." % (
+                              Value, Type)
     else:
         return True, "StructurePcd"
 
     return True, ""
 
+
 def CommonPath(PathList):
     P1 = min(PathList).split(os.path.sep)
     P2 = max(PathList).split(os.path.sep)
@@ -1433,6 +1518,7 @@ def CommonPath(PathList):
             return os.path.sep.join(P1[:Index])
     return os.path.sep.join(P1)
 
+
 class PathClass(object):
     def __init__(self, File='', Root='', AlterRoot='', Type='', IsBinary=False,
                  Arch='COMMON', ToolChainFamily='', Target='', TagName='', ToolCode=''):
@@ -1484,7 +1570,7 @@ class PathClass(object):
         self.ToolChainFamily = ToolChainFamily
         self.OriginalPath = self
 
-    ## Convert the object of this class to a string
+    # Convert the object of this class to a string
     #
     #  Convert member Path of the class to a string
     #
@@ -1493,7 +1579,7 @@ class PathClass(object):
     def __str__(self):
         return self.Path
 
-    ## Override __eq__ function
+    # Override __eq__ function
     #
     # Check whether PathClass are the same
     #
@@ -1503,7 +1589,7 @@ class PathClass(object):
     def __eq__(self, Other):
         return self.Path == str(Other)
 
-    ## Override __cmp__ function
+    # Override __cmp__ function
     #
     # Customize the comparison operation of two PathClass
     #
@@ -1521,7 +1607,7 @@ class PathClass(object):
         else:
             return -1
 
-    ## Override __hash__ function
+    # Override __hash__ function
     #
     # Use Path as key in hash table
     #
@@ -1542,14 +1628,16 @@ class PathClass(object):
         def RealPath2(File, Dir='', OverrideDir=''):
             NewFile = None
             if OverrideDir:
-                NewFile = GlobalData.gAllFiles[os.path.normpath(os.path.join(OverrideDir, File))]
+                NewFile = GlobalData.gAllFiles[os.path.normpath(
+                    os.path.join(OverrideDir, File))]
                 if NewFile:
                     if OverrideDir[-1] == os.path.sep:
                         return NewFile[len(OverrideDir):], NewFile[0:len(OverrideDir)]
                     else:
                         return NewFile[len(OverrideDir) + 1:], NewFile[0:len(OverrideDir)]
             if GlobalData.gAllFiles:
-                NewFile = GlobalData.gAllFiles[os.path.normpath(os.path.join(Dir, File))]
+                NewFile = GlobalData.gAllFiles[os.path.normpath(
+                    os.path.join(Dir, File))]
             if not NewFile:
                 NewFile = os.path.normpath(os.path.join(Dir, File))
                 if not os.path.exists(NewFile):
@@ -1577,7 +1665,7 @@ class PathClass(object):
                 RealFile = os.path.join(self.AlterRoot, self.File)
             elif self.Root:
                 RealFile = os.path.join(self.Root, self.File)
-            if len (mws.getPkgPath()) == 0:
+            if len(mws.getPkgPath()) == 0:
                 return FILE_NOT_FOUND, os.path.join(self.AlterRoot, RealFile)
             else:
                 return FILE_NOT_FOUND, "%s is not found in packages path:\n\t%s" % (self.File, '\n\t'.join(mws.getPkgPath()))
@@ -1600,19 +1688,21 @@ class PathClass(object):
             self.Path = os.path.join(RealRoot, RealFile)
         return ErrorCode, ErrorInfo
 
-## Parse PE image to get the required PE information.
+# Parse PE image to get the required PE information.
 #
+
+
 class PeImageClass():
-    ## Constructor
+    # Constructor
     #
     #   @param  File FilePath of PeImage
     #
     def __init__(self, PeFile):
-        self.FileName   = PeFile
-        self.IsValid    = False
-        self.Size       = 0
+        self.FileName = PeFile
+        self.IsValid = False
+        self.Size = 0
         self.EntryPoint = 0
-        self.SectionAlignment  = 0
+        self.SectionAlignment = 0
         self.SectionHeaderList = []
         self.ErrorInfo = ''
         try:
@@ -1625,7 +1715,7 @@ class PeImageClass():
         ByteArray.fromfile(PeObject, 0x3E)
         ByteList = ByteArray.tolist()
         # DOS signature should be 'MZ'
-        if self._ByteListToStr (ByteList[0x0:0x2]) != 'MZ':
+        if self._ByteListToStr(ByteList[0x0:0x2]) != 'MZ':
             self.ErrorInfo = self.FileName + ' has no valid DOS signature MZ'
             return
 
@@ -1653,20 +1743,21 @@ class PeImageClass():
         ByteArray = array.array('B')
         ByteArray.fromfile(PeObject, OptionalHeaderSize)
         ByteList = ByteArray.tolist()
-        self.EntryPoint       = self._ByteListToInt(ByteList[0x10:0x14])
+        self.EntryPoint = self._ByteListToInt(ByteList[0x10:0x14])
         self.SectionAlignment = self._ByteListToInt(ByteList[0x20:0x24])
-        self.Size             = self._ByteListToInt(ByteList[0x38:0x3C])
+        self.Size = self._ByteListToInt(ByteList[0x38:0x3C])
 
         # Read each Section Header
         for Index in range(SecNumber):
             ByteArray = array.array('B')
             ByteArray.fromfile(PeObject, 0x28)
             ByteList = ByteArray.tolist()
-            SecName  = self._ByteListToStr(ByteList[0:8])
+            SecName = self._ByteListToStr(ByteList[0:8])
             SecVirtualSize = self._ByteListToInt(ByteList[8:12])
-            SecRawAddress  = self._ByteListToInt(ByteList[20:24])
+            SecRawAddress = self._ByteListToInt(ByteList[20:24])
             SecVirtualAddress = self._ByteListToInt(ByteList[12:16])
-            self.SectionHeaderList.append((SecName, SecVirtualAddress, SecRawAddress, SecVirtualSize))
+            self.SectionHeaderList.append(
+                (SecName, SecVirtualAddress, SecRawAddress, SecVirtualSize))
         self.IsValid = True
         PeObject.close()
 
@@ -1684,48 +1775,55 @@ class PeImageClass():
             Value = (Value << 8) | int(ByteList[index])
         return Value
 
+
 class DefaultStore():
-    def __init__(self, DefaultStores ):
+    def __init__(self, DefaultStores):
 
         self.DefaultStores = DefaultStores
+
     def DefaultStoreID(self, DefaultStoreName):
         for key, value in self.DefaultStores.items():
             if value == DefaultStoreName:
                 return key
         return None
+
     def GetDefaultDefault(self):
         if not self.DefaultStores or "0" in self.DefaultStores:
             return "0", TAB_DEFAULT_STORES_DEFAULT
         else:
             minvalue = min(int(value_str) for value_str in self.DefaultStores)
             return (str(minvalue), self.DefaultStores[str(minvalue)])
+
     def GetMin(self, DefaultSIdList):
         if not DefaultSIdList:
             return TAB_DEFAULT_STORES_DEFAULT
-        storeidset = {storeid for storeid, storename in self.DefaultStores.values() if storename in DefaultSIdList}
+        storeidset = {storeid for storeid, storename in self.DefaultStores.values(
+        ) if storename in DefaultSIdList}
         if not storeidset:
             return ""
-        minid = min(storeidset )
+        minid = min(storeidset)
         for sid, name in self.DefaultStores.values():
             if sid == minid:
                 return name
 
+
 class SkuClass():
     DEFAULT = 0
     SINGLE = 1
-    MULTIPLE =2
+    MULTIPLE = 2
 
-    def __init__(self,SkuIdentifier='', SkuIds=None):
+    def __init__(self, SkuIdentifier='', SkuIds=None):
         if SkuIds is None:
             SkuIds = {}
 
         for SkuName in SkuIds:
             SkuId = SkuIds[SkuName][0]
-            skuid_num = int(SkuId, 16) if SkuId.upper().startswith("0X") else int(SkuId)
+            skuid_num = int(SkuId, 16) if SkuId.upper(
+            ).startswith("0X") else int(SkuId)
             if skuid_num > 0xFFFFFFFFFFFFFFFF:
                 EdkLogger.error("build", PARAMETER_INVALID,
-                            ExtraData = "SKU-ID [%s] value %s exceeds the max value of UINT64"
-                                      % (SkuName, SkuId))
+                                ExtraData="SKU-ID [%s] value %s exceeds the max value of UINT64"
+                                % (SkuName, SkuId))
 
         self.AvailableSkuIds = OrderedDict()
         self.SkuIdSet = []
@@ -1738,26 +1836,28 @@ class SkuClass():
             self.SkuIdNumberSet = ['0U']
         elif SkuIdentifier == 'ALL':
             self.SkuIdSet = list(SkuIds.keys())
-            self.SkuIdNumberSet = [num[0].strip() + 'U' for num in SkuIds.values()]
+            self.SkuIdNumberSet = [
+                num[0].strip() + 'U' for num in SkuIds.values()]
         else:
             r = SkuIdentifier.split('|')
-            self.SkuIdSet=[(r[k].strip()).upper() for k in range(len(r))]
+            self.SkuIdSet = [(r[k].strip()).upper() for k in range(len(r))]
             k = None
             try:
-                self.SkuIdNumberSet = [SkuIds[k][0].strip() + 'U' for k in self.SkuIdSet]
+                self.SkuIdNumberSet = [
+                    SkuIds[k][0].strip() + 'U' for k in self.SkuIdSet]
             except Exception:
                 EdkLogger.error("build", PARAMETER_INVALID,
-                            ExtraData = "SKU-ID [%s] is not supported by the platform. [Valid SKU-ID: %s]"
-                                      % (k, " | ".join(SkuIds.keys())))
+                                ExtraData="SKU-ID [%s] is not supported by the platform. [Valid SKU-ID: %s]"
+                                % (k, " | ".join(SkuIds.keys())))
         for each in self.SkuIdSet:
             if each in SkuIds:
                 self.AvailableSkuIds[each] = SkuIds[each][0]
             else:
                 EdkLogger.error("build", PARAMETER_INVALID,
-                            ExtraData="SKU-ID [%s] is not supported by the platform. [Valid SKU-ID: %s]"
-                                      % (each, " | ".join(SkuIds.keys())))
+                                ExtraData="SKU-ID [%s] is not supported by the platform. [Valid SKU-ID: %s]"
+                                % (each, " | ".join(SkuIds.keys())))
         if self.SkuUsageType != SkuClass.SINGLE:
-            self.AvailableSkuIds.update({'DEFAULT':0, 'COMMON':0})
+            self.AvailableSkuIds.update({'DEFAULT': 0, 'COMMON': 0})
         if self.SkuIdSet:
             GlobalData.gSkuids = (self.SkuIdSet)
             if 'COMMON' in GlobalData.gSkuids:
@@ -1773,7 +1873,7 @@ class SkuClass():
         if not self._SkuInherit:
             self._SkuInherit = {}
             for item in self.SkuData.values():
-                self._SkuInherit[item[1]]=item[2] if item[2] else "DEFAULT"
+                self._SkuInherit[item[1]] = item[2] if item[2] else "DEFAULT"
         return self._SkuInherit.get(skuname, "DEFAULT")
 
     def GetSkuChain(self, sku):
@@ -1788,6 +1888,7 @@ class SkuClass():
                 break
         skulist.reverse()
         return skulist
+
     def SkuOverrideOrder(self):
         skuorderset = []
         for skuname in self.SkuIdSet:
@@ -1844,12 +1945,14 @@ class SkuClass():
         else:
             return 'DEFAULT'
 
-##  Get the integer value from string like "14U" or integer like 2
+# Get the integer value from string like "14U" or integer like 2
 #
 #   @param      Input   The object that may be either a integer value or a string
 #
 #   @retval     Value    The integer value that the input represents
 #
+
+
 def GetIntegerValue(Input):
     if not isinstance(Input, str):
         return Input
@@ -1871,6 +1974,8 @@ def GetIntegerValue(Input):
 #
 # Pack a GUID (registry format) list into a buffer and return it
 #
+
+
 def PackGUID(Guid):
     return pack(PACK_PATTERN_GUID,
                 int(Guid[0], 16),
@@ -1889,6 +1994,8 @@ def PackGUID(Guid):
 #
 # Pack a GUID (byte) list into a buffer and return it
 #
+
+
 def PackByteFormatGUID(Guid):
     return pack(PACK_PATTERN_GUID,
                 Guid[0],
@@ -1904,19 +2011,21 @@ def PackByteFormatGUID(Guid):
                 Guid[10],
                 )
 
-## DeepCopy dict/OrderedDict recusively
+# DeepCopy dict/OrderedDict recusively
 #
 #   @param      ori_dict    a nested dict or ordereddict
 #
 #   @retval     new dict or orderdict
 #
+
+
 def CopyDict(ori_dict):
     dict_type = ori_dict.__class__
-    if dict_type not in (dict,OrderedDict):
+    if dict_type not in (dict, OrderedDict):
         return ori_dict
     new_dict = dict_type()
     for key in ori_dict:
-        if isinstance(ori_dict[key],(dict,OrderedDict)):
+        if isinstance(ori_dict[key], (dict, OrderedDict)):
             new_dict[key] = CopyDict(ori_dict[key])
         else:
             new_dict[key] = ori_dict[key]
@@ -1925,5 +2034,7 @@ def CopyDict(ori_dict):
 #
 # Remove the c/c++ comments: // and /* */
 #
+
+
 def RemoveCComments(ctext):
     return re.sub('//.*?\n|/\*.*?\*/', '\n', ctext, flags=re.S)
diff --git a/BaseTools/Source/Python/Common/MultipleWorkspace.py b/BaseTools/Source/Python/Common/MultipleWorkspace.py
index ad5d48588b48..598ea311a724 100644
--- a/BaseTools/Source/Python/Common/MultipleWorkspace.py
+++ b/BaseTools/Source/Python/Common/MultipleWorkspace.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # manage multiple workspace file.
 #
 # This file is required to make Python interpreter treat the directory
@@ -11,7 +11,7 @@
 import Common.LongFilePathOs as os
 from Common.DataType import TAB_WORKSPACE
 
-## MultipleWorkspace
+# MultipleWorkspace
 #
 # This class manage multiple workspace behavior
 #
@@ -20,11 +20,13 @@ from Common.DataType import TAB_WORKSPACE
 # @var WORKSPACE:      defined the current WORKSPACE
 # @var PACKAGES_PATH:  defined the other WORKSPACE, if current WORKSPACE is invalid, search valid WORKSPACE from PACKAGES_PATH
 #
+
+
 class MultipleWorkspace(object):
     WORKSPACE = ''
     PACKAGES_PATH = None
 
-    ## convertPackagePath()
+    # convertPackagePath()
     #
     #   Convert path to match workspace.
     #
@@ -34,11 +36,11 @@ class MultipleWorkspace(object):
     #
     @classmethod
     def convertPackagePath(cls, Ws, Path):
-        if str(os.path.normcase (Path)).startswith(Ws):
+        if str(os.path.normcase(Path)).startswith(Ws):
             return os.path.join(Ws, os.path.relpath(Path, Ws))
         return Path
 
-    ## setWs()
+    # setWs()
     #
     #   set WORKSPACE and PACKAGES_PATH environment
     #
@@ -50,11 +52,12 @@ class MultipleWorkspace(object):
     def setWs(cls, Ws, PackagesPath=None):
         cls.WORKSPACE = Ws
         if PackagesPath:
-            cls.PACKAGES_PATH = [cls.convertPackagePath (Ws, os.path.normpath(Path.strip())) for Path in PackagesPath.split(os.pathsep)]
+            cls.PACKAGES_PATH = [cls.convertPackagePath(Ws, os.path.normpath(
+                Path.strip())) for Path in PackagesPath.split(os.pathsep)]
         else:
             cls.PACKAGES_PATH = []
 
-    ## join()
+    # join()
     #
     #   rewrite os.path.join function
     #
@@ -74,7 +77,7 @@ class MultipleWorkspace(object):
             Path = os.path.join(Ws, *p)
         return Path
 
-    ## relpath()
+    # relpath()
     #
     #   rewrite os.path.relpath function
     #
@@ -93,7 +96,7 @@ class MultipleWorkspace(object):
             Path = os.path.relpath(Path, Ws)
         return Path
 
-    ## getWs()
+    # getWs()
     #
     #   get valid workspace for the path
     #
@@ -112,7 +115,7 @@ class MultipleWorkspace(object):
                     return Pkg
         return Ws
 
-    ## handleWsMacro()
+    # handleWsMacro()
     #
     #   handle the $(WORKSPACE) tag, if current workspace is invalid path relative the tool, replace it.
     #
@@ -128,17 +131,19 @@ class MultipleWorkspace(object):
                     MacroStartPos = str.find(TAB_WORKSPACE)
                     if MacroStartPos != -1:
                         Substr = str[MacroStartPos:]
-                        Path = Substr.replace(TAB_WORKSPACE, cls.WORKSPACE).strip()
+                        Path = Substr.replace(
+                            TAB_WORKSPACE, cls.WORKSPACE).strip()
                         if not os.path.exists(Path):
                             for Pkg in cls.PACKAGES_PATH:
-                                Path = Substr.replace(TAB_WORKSPACE, Pkg).strip()
+                                Path = Substr.replace(
+                                    TAB_WORKSPACE, Pkg).strip()
                                 if os.path.exists(Path):
                                     break
                         PathList[i] = str[0:MacroStartPos] + Path
             PathStr = ' '.join(PathList)
         return PathStr
 
-    ## getPkgPath()
+    # getPkgPath()
     #
     #   get all package paths.
     #
@@ -147,4 +152,3 @@ class MultipleWorkspace(object):
     @classmethod
     def getPkgPath(cls):
         return cls.PACKAGES_PATH
-
diff --git a/BaseTools/Source/Python/Common/Parsing.py b/BaseTools/Source/Python/Common/Parsing.py
index 740283a04d85..22015fa00c12 100644
--- a/BaseTools/Source/Python/Common/Parsing.py
+++ b/BaseTools/Source/Python/Common/Parsing.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define common parsing related functions used in parsing INF/DEC/DSC process
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -13,10 +13,12 @@ from .StringUtils import *
 from CommonDataClass.DataClass import *
 from .DataType import *
 
-## ParseDefineMacro
+# ParseDefineMacro
 #
 # Search whole table to find all defined Macro and replaced them with the real values
 #
+
+
 def ParseDefineMacro2(Table, RecordSets, GlobalMacro):
     Macros = {}
     #
@@ -41,10 +43,12 @@ def ParseDefineMacro2(Table, RecordSets, GlobalMacro):
         for Item in Value:
             Item[0] = ReplaceMacro(Item[0], Macros)
 
-## ParseDefineMacro
+# ParseDefineMacro
 #
 # Search whole table to find all defined Macro and replaced them with the real values
 #
+
+
 def ParseDefineMacro(Table, GlobalMacro):
     Macros = {}
     #
@@ -55,18 +59,18 @@ def ParseDefineMacro(Table, GlobalMacro):
                     and Enabled > -1""" % (Table.Table, MODEL_META_DATA_DEFINE)
     RecordSet = Table.Exec(SqlCommand)
     for Record in RecordSet:
-#***************************************************************************************************************************************************
-#            The follow SqlCommand (expr replace) is not supported in Sqlite 3.3.4 which is used in Python 2.5                                     *
-#            Reserved Only                                                                                                                         *
-#            SqlCommand = """update %s set Value1 = replace(Value1, '%s', '%s')                                                                    *
-#                            where ID in (select ID from %s                                                                                        *
-#                                         where Model = %s                                                                                         *
-#                                         and Value1 like '%%%s%%'                                                                                 *
-#                                         and StartLine > %s                                                                                       *
-#                                         and Enabled > -1                                                                                         *
-#                                         and Arch = '%s')""" % \                                                                                  *
-#                                         (self.TblDsc.Table, Record[0], Record[1], self.TblDsc.Table, Record[2], Record[1], Record[3], Record[4]) *
-#***************************************************************************************************************************************************
+        # ***************************************************************************************************************************************************
+        #            The follow SqlCommand (expr replace) is not supported in Sqlite 3.3.4 which is used in Python 2.5                                     *
+        #            Reserved Only                                                                                                                         *
+        #            SqlCommand = """update %s set Value1 = replace(Value1, '%s', '%s')                                                                    *
+        #                            where ID in (select ID from %s                                                                                        *
+        #                                         where Model = %s                                                                                         *
+        #                                         and Value1 like '%%%s%%'                                                                                 *
+        #                                         and StartLine > %s                                                                                       *
+        #                                         and Enabled > -1                                                                                         *
+        #                                         and Arch = '%s')""" % \                                                                                  *
+        #                                         (self.TblDsc.Table, Record[0], Record[1], self.TblDsc.Table, Record[2], Record[1], Record[3], Record[4]) *
+        # ***************************************************************************************************************************************************
         Macros[Record[0]] = Record[1]
 
     #
@@ -80,7 +84,7 @@ def ParseDefineMacro(Table, GlobalMacro):
     SqlCommand = """select ID, Value1 from %s
                     where Model != %s
                     and Value1 like '%%$(%%' and Value1 like '%%)%%'
-                    and Enabled > -1"""  % (Table.Table, MODEL_META_DATA_DEFINE)
+                    and Enabled > -1""" % (Table.Table, MODEL_META_DATA_DEFINE)
     FoundRecords = Table.Exec(SqlCommand)
     for FoundRecord in FoundRecords:
         NewValue = ReplaceMacro(FoundRecord[1], Macros)
@@ -88,7 +92,7 @@ def ParseDefineMacro(Table, GlobalMacro):
                         where ID = %s""" % (Table.Table, ConvertToSqlString2(NewValue), FoundRecord[0])
         Table.Exec(SqlCommand)
 
-##QueryDefinesItem
+# QueryDefinesItem
 #
 # Search item of section [Defines] by name, return its values
 #
@@ -98,6 +102,8 @@ def ParseDefineMacro(Table, GlobalMacro):
 #
 # @retval RecordSet: A list of all matched records
 #
+
+
 def QueryDefinesItem(Table, Name, Arch, BelongsToFile):
     SqlCommand = """select Value2 from %s
                     where Model = %s
@@ -132,7 +138,7 @@ def QueryDefinesItem(Table, Name, Arch, BelongsToFile):
                     RetVal.append(Item)
         return RetVal
 
-##QueryDefinesItem
+# QueryDefinesItem
 #
 # Search item of section [Defines] by name, return its values
 #
@@ -142,6 +148,8 @@ def QueryDefinesItem(Table, Name, Arch, BelongsToFile):
 #
 # @retval RecordSet: A list of all matched records
 #
+
+
 def QueryDefinesItem2(Table, Arch, BelongsToFile):
     SqlCommand = """select Value1, Value2, StartLine from %s
                     where Model = %s
@@ -159,7 +167,7 @@ def QueryDefinesItem2(Table, Arch, BelongsToFile):
 
     return RecordSet
 
-##QueryDscItem
+# QueryDscItem
 #
 # Search all dsc item for a specific section
 #
@@ -168,6 +176,8 @@ def QueryDefinesItem2(Table, Arch, BelongsToFile):
 #
 # @retval RecordSet: A list of all matched records
 #
+
+
 def QueryDscItem(Table, Model, BelongsToItem, BelongsToFile):
     SqlCommand = """select Value1, Arch, StartLine, ID, Value2 from %s
                     where Model = %s
@@ -176,7 +186,7 @@ def QueryDscItem(Table, Model, BelongsToItem, BelongsToFile):
                     and Enabled > -1""" % (Table.Table, Model, BelongsToItem, BelongsToFile)
     return Table.Exec(SqlCommand)
 
-##QueryDecItem
+# QueryDecItem
 #
 # Search all dec item for a specific section
 #
@@ -185,6 +195,8 @@ def QueryDscItem(Table, Model, BelongsToItem, BelongsToFile):
 #
 # @retval RecordSet: A list of all matched records
 #
+
+
 def QueryDecItem(Table, Model, BelongsToItem):
     SqlCommand = """select Value1, Arch, StartLine, ID, Value2 from %s
                     where Model = %s
@@ -192,7 +204,7 @@ def QueryDecItem(Table, Model, BelongsToItem):
                     and Enabled > -1""" % (Table.Table, Model, BelongsToItem)
     return Table.Exec(SqlCommand)
 
-##QueryInfItem
+# QueryInfItem
 #
 # Search all dec item for a specific section
 #
@@ -201,6 +213,8 @@ def QueryDecItem(Table, Model, BelongsToItem):
 #
 # @retval RecordSet: A list of all matched records
 #
+
+
 def QueryInfItem(Table, Model, BelongsToItem):
     SqlCommand = """select Value1, Arch, StartLine, ID, Value2 from %s
                     where Model = %s
@@ -208,7 +222,7 @@ def QueryInfItem(Table, Model, BelongsToItem):
                     and Enabled > -1""" % (Table.Table, Model, BelongsToItem)
     return Table.Exec(SqlCommand)
 
-## GetBuildOption
+# GetBuildOption
 #
 # Parse a string with format "[<Family>:]<ToolFlag>=Flag"
 # Return (Family, ToolFlag, Flag)
@@ -218,21 +232,24 @@ def QueryInfItem(Table, Model, BelongsToItem):
 #
 # @retval truple() A truple structure as (Family, ToolChain, Flag)
 #
-def GetBuildOption(String, File, LineNo = -1):
+
+
+def GetBuildOption(String, File, LineNo=-1):
     (Family, ToolChain, Flag) = ('', '', '')
     if String.find(TAB_EQUAL_SPLIT) < 0:
-        RaiseParserError(String, 'BuildOptions', File, '[<Family>:]<ToolFlag>=Flag', LineNo)
+        RaiseParserError(String, 'BuildOptions', File,
+                         '[<Family>:]<ToolFlag>=Flag', LineNo)
     else:
-        List = GetSplitValueList(String, TAB_EQUAL_SPLIT, MaxSplit = 1)
+        List = GetSplitValueList(String, TAB_EQUAL_SPLIT, MaxSplit=1)
         if List[0].find(':') > -1:
-            Family = List[0][ : List[0].find(':')].strip()
-            ToolChain = List[0][List[0].find(':') + 1 : ].strip()
+            Family = List[0][: List[0].find(':')].strip()
+            ToolChain = List[0][List[0].find(':') + 1:].strip()
         else:
             ToolChain = List[0].strip()
         Flag = List[1].strip()
     return (Family, ToolChain, Flag)
 
-## Get Library Class
+# Get Library Class
 #
 # Get Library of Dsc as <LibraryClassKeyWord>|<LibraryInstance>
 #
@@ -241,20 +258,25 @@ def GetBuildOption(String, File, LineNo = -1):
 #
 # @retval (LibraryClassKeyWord, LibraryInstance, [SUP_MODULE_LIST]) Formatted Library Item
 #
-def GetLibraryClass(Item, ContainerFile, WorkspaceDir, LineNo = -1):
+
+
+def GetLibraryClass(Item, ContainerFile, WorkspaceDir, LineNo=-1):
     List = GetSplitValueList(Item[0])
     SupMod = SUP_MODULE_LIST_STRING
     if len(List) != 2:
-        RaiseParserError(Item[0], 'LibraryClasses', ContainerFile, '<LibraryClassKeyWord>|<LibraryInstance>')
+        RaiseParserError(Item[0], 'LibraryClasses', ContainerFile,
+                         '<LibraryClassKeyWord>|<LibraryInstance>')
     else:
-        CheckFileType(List[1], '.Inf', ContainerFile, 'library class instance', Item[0], LineNo)
-        CheckFileExist(WorkspaceDir, List[1], ContainerFile, 'LibraryClasses', Item[0], LineNo)
+        CheckFileType(List[1], '.Inf', ContainerFile,
+                      'library class instance', Item[0], LineNo)
+        CheckFileExist(
+            WorkspaceDir, List[1], ContainerFile, 'LibraryClasses', Item[0], LineNo)
         if Item[1] != '':
             SupMod = Item[1]
 
     return (List[0], List[1], SupMod)
 
-## Get Library Class
+# Get Library Class
 #
 # Get Library of Dsc as <LibraryClassKeyWord>[|<LibraryInstance>][|<TokenSpaceGuidCName>.<PcdCName>]
 #
@@ -263,23 +285,29 @@ def GetLibraryClass(Item, ContainerFile, WorkspaceDir, LineNo = -1):
 #
 # @retval (LibraryClassKeyWord, LibraryInstance, [SUP_MODULE_LIST]) Formatted Library Item
 #
-def GetLibraryClassOfInf(Item, ContainerFile, WorkspaceDir, LineNo = -1):
+
+
+def GetLibraryClassOfInf(Item, ContainerFile, WorkspaceDir, LineNo=-1):
     ItemList = GetSplitValueList((Item[0] + DataType.TAB_VALUE_SPLIT * 2))
     SupMod = SUP_MODULE_LIST_STRING
 
     if len(ItemList) > 5:
-        RaiseParserError(Item[0], 'LibraryClasses', ContainerFile, '<LibraryClassKeyWord>[|<LibraryInstance>][|<TokenSpaceGuidCName>.<PcdCName>]')
+        RaiseParserError(Item[0], 'LibraryClasses', ContainerFile,
+                         '<LibraryClassKeyWord>[|<LibraryInstance>][|<TokenSpaceGuidCName>.<PcdCName>]')
     else:
-        CheckFileType(ItemList[1], '.Inf', ContainerFile, 'LibraryClasses', Item[0], LineNo)
-        CheckFileExist(WorkspaceDir, ItemList[1], ContainerFile, 'LibraryClasses', Item[0], LineNo)
+        CheckFileType(ItemList[1], '.Inf', ContainerFile,
+                      'LibraryClasses', Item[0], LineNo)
+        CheckFileExist(
+            WorkspaceDir, ItemList[1], ContainerFile, 'LibraryClasses', Item[0], LineNo)
         if ItemList[2] != '':
-            CheckPcdTokenInfo(ItemList[2], 'LibraryClasses', ContainerFile, LineNo)
+            CheckPcdTokenInfo(
+                ItemList[2], 'LibraryClasses', ContainerFile, LineNo)
         if Item[1] != '':
             SupMod = Item[1]
 
     return (ItemList[0], ItemList[1], ItemList[2], SupMod)
 
-## CheckPcdTokenInfo
+# CheckPcdTokenInfo
 #
 # Check if PcdTokenInfo is following <TokenSpaceGuidCName>.<PcdCName>
 #
@@ -289,7 +317,9 @@ def GetLibraryClassOfInf(Item, ContainerFile, WorkspaceDir, LineNo = -1):
 #
 # @retval True PcdTokenInfo is in correct format
 #
-def CheckPcdTokenInfo(TokenInfoString, Section, File, LineNo = -1):
+
+
+def CheckPcdTokenInfo(TokenInfoString, Section, File, LineNo=-1):
     Format = '<TokenSpaceGuidCName>.<PcdCName>'
     if TokenInfoString != '' and TokenInfoString is not None:
         TokenInfoList = GetSplitValueList(TokenInfoString, TAB_SPLIT)
@@ -298,7 +328,7 @@ def CheckPcdTokenInfo(TokenInfoString, Section, File, LineNo = -1):
 
     RaiseParserError(TokenInfoString, Section, File, Format, LineNo)
 
-## Get Pcd
+# Get Pcd
 #
 # Get Pcd of Dsc as <PcdTokenSpaceGuidCName>.<TokenCName>|<Value>[|<Type>|<MaximumDatumSize>]
 #
@@ -307,12 +337,15 @@ def CheckPcdTokenInfo(TokenInfoString, Section, File, LineNo = -1):
 #
 # @retval (TokenInfo[1], TokenInfo[0], List[1], List[2], List[3], Type)
 #
-def GetPcd(Item, Type, ContainerFile, LineNo = -1):
+
+
+def GetPcd(Item, Type, ContainerFile, LineNo=-1):
     TokenGuid, TokenName, Value, MaximumDatumSize, Token = '', '', '', '', ''
     List = GetSplitValueList(Item + TAB_VALUE_SPLIT * 2)
 
     if len(List) < 4 or len(List) > 6:
-        RaiseParserError(Item, 'Pcds' + Type, ContainerFile, '<PcdTokenSpaceGuidCName>.<TokenCName>|<Value>[|<Type>|<MaximumDatumSize>]', LineNo)
+        RaiseParserError(Item, 'Pcds' + Type, ContainerFile,
+                         '<PcdTokenSpaceGuidCName>.<TokenCName>|<Value>[|<Type>|<MaximumDatumSize>]', LineNo)
     else:
         Value = List[1]
         MaximumDatumSize = List[2]
@@ -323,7 +356,7 @@ def GetPcd(Item, Type, ContainerFile, LineNo = -1):
 
     return (TokenName, TokenGuid, Value, MaximumDatumSize, Token, Type)
 
-## Get FeatureFlagPcd
+# Get FeatureFlagPcd
 #
 # Get FeatureFlagPcd of Dsc as <PcdTokenSpaceGuidCName>.<TokenCName>|TRUE/FALSE
 #
@@ -332,11 +365,14 @@ def GetPcd(Item, Type, ContainerFile, LineNo = -1):
 #
 # @retval (TokenInfo[1], TokenInfo[0], List[1], Type)
 #
-def GetFeatureFlagPcd(Item, Type, ContainerFile, LineNo = -1):
+
+
+def GetFeatureFlagPcd(Item, Type, ContainerFile, LineNo=-1):
     TokenGuid, TokenName, Value = '', '', ''
     List = GetSplitValueList(Item)
     if len(List) != 2:
-        RaiseParserError(Item, 'Pcds' + Type, ContainerFile, '<PcdTokenSpaceGuidCName>.<TokenCName>|TRUE/FALSE', LineNo)
+        RaiseParserError(Item, 'Pcds' + Type, ContainerFile,
+                         '<PcdTokenSpaceGuidCName>.<TokenCName>|TRUE/FALSE', LineNo)
     else:
         Value = List[1]
     if CheckPcdTokenInfo(List[0], 'Pcds' + Type, ContainerFile, LineNo):
@@ -344,7 +380,7 @@ def GetFeatureFlagPcd(Item, Type, ContainerFile, LineNo = -1):
 
     return (TokenName, TokenGuid, Value, Type)
 
-## Get DynamicDefaultPcd
+# Get DynamicDefaultPcd
 #
 # Get DynamicDefaultPcd of Dsc as <PcdTokenSpaceGuidCName>.<TokenCName>|<Value>[|<DatumTyp>[|<MaxDatumSize>]]
 #
@@ -353,11 +389,14 @@ def GetFeatureFlagPcd(Item, Type, ContainerFile, LineNo = -1):
 #
 # @retval (TokenInfo[1], TokenInfo[0], List[1], List[2], List[3], Type)
 #
-def GetDynamicDefaultPcd(Item, Type, ContainerFile, LineNo = -1):
+
+
+def GetDynamicDefaultPcd(Item, Type, ContainerFile, LineNo=-1):
     TokenGuid, TokenName, Value, DatumTyp, MaxDatumSize = '', '', '', '', ''
     List = GetSplitValueList(Item + TAB_VALUE_SPLIT * 2)
     if len(List) < 4 or len(List) > 8:
-        RaiseParserError(Item, 'Pcds' + Type, ContainerFile, '<PcdTokenSpaceGuidCName>.<TokenCName>|<Value>[|<DatumTyp>[|<MaxDatumSize>]]', LineNo)
+        RaiseParserError(Item, 'Pcds' + Type, ContainerFile,
+                         '<PcdTokenSpaceGuidCName>.<TokenCName>|<Value>[|<DatumTyp>[|<MaxDatumSize>]]', LineNo)
     else:
         Value = List[1]
         DatumTyp = List[2]
@@ -367,7 +406,7 @@ def GetDynamicDefaultPcd(Item, Type, ContainerFile, LineNo = -1):
 
     return (TokenName, TokenGuid, Value, DatumTyp, MaxDatumSize, Type)
 
-## Get DynamicHiiPcd
+# Get DynamicHiiPcd
 #
 # Get DynamicHiiPcd of Dsc as <PcdTokenSpaceGuidCName>.<TokenCName>|<String>|<VariableGuidCName>|<VariableOffset>[|<DefaultValue>[|<MaximumDatumSize>]]
 #
@@ -376,11 +415,14 @@ def GetDynamicDefaultPcd(Item, Type, ContainerFile, LineNo = -1):
 #
 # @retval (TokenInfo[1], TokenInfo[0], List[1], List[2], List[3], List[4], List[5], Type)
 #
-def GetDynamicHiiPcd(Item, Type, ContainerFile, LineNo = -1):
+
+
+def GetDynamicHiiPcd(Item, Type, ContainerFile, LineNo=-1):
     TokenGuid, TokenName, L1, L2, L3, L4, L5 = '', '', '', '', '', '', ''
     List = GetSplitValueList(Item + TAB_VALUE_SPLIT * 2)
     if len(List) < 6 or len(List) > 8:
-        RaiseParserError(Item, 'Pcds' + Type, ContainerFile, '<PcdTokenSpaceGuidCName>.<TokenCName>|<String>|<VariableGuidCName>|<VariableOffset>[|<DefaultValue>[|<MaximumDatumSize>]]', LineNo)
+        RaiseParserError(Item, 'Pcds' + Type, ContainerFile,
+                         '<PcdTokenSpaceGuidCName>.<TokenCName>|<String>|<VariableGuidCName>|<VariableOffset>[|<DefaultValue>[|<MaximumDatumSize>]]', LineNo)
     else:
         L1, L2, L3, L4, L5 = List[1], List[2], List[3], List[4], List[5]
     if CheckPcdTokenInfo(List[0], 'Pcds' + Type, ContainerFile, LineNo):
@@ -388,7 +430,7 @@ def GetDynamicHiiPcd(Item, Type, ContainerFile, LineNo = -1):
 
     return (TokenName, TokenGuid, L1, L2, L3, L4, L5, Type)
 
-## Get DynamicVpdPcd
+# Get DynamicVpdPcd
 #
 # Get DynamicVpdPcd of Dsc as <PcdTokenSpaceGuidCName>.<TokenCName>|<VpdOffset>[|<MaximumDatumSize>]
 #
@@ -397,11 +439,14 @@ def GetDynamicHiiPcd(Item, Type, ContainerFile, LineNo = -1):
 #
 # @retval (TokenInfo[1], TokenInfo[0], List[1], List[2], Type)
 #
-def GetDynamicVpdPcd(Item, Type, ContainerFile, LineNo = -1):
+
+
+def GetDynamicVpdPcd(Item, Type, ContainerFile, LineNo=-1):
     TokenGuid, TokenName, L1, L2 = '', '', '', ''
     List = GetSplitValueList(Item + TAB_VALUE_SPLIT)
     if len(List) < 3 or len(List) > 4:
-        RaiseParserError(Item, 'Pcds' + Type, ContainerFile, '<PcdTokenSpaceGuidCName>.<TokenCName>|<VpdOffset>[|<MaximumDatumSize>]', LineNo)
+        RaiseParserError(Item, 'Pcds' + Type, ContainerFile,
+                         '<PcdTokenSpaceGuidCName>.<TokenCName>|<VpdOffset>[|<MaximumDatumSize>]', LineNo)
     else:
         L1, L2 = List[1], List[2]
     if CheckPcdTokenInfo(List[0], 'Pcds' + Type, ContainerFile, LineNo):
@@ -409,7 +454,7 @@ def GetDynamicVpdPcd(Item, Type, ContainerFile, LineNo = -1):
 
     return (TokenName, TokenGuid, L1, L2, Type)
 
-## GetComponent
+# GetComponent
 #
 # Parse block of the components defined in dsc file
 # Set KeyValues as [ ['component name', [lib1, lib2, lib3], [bo1, bo2, bo3], [pcd1, pcd2, pcd3]], ...]
@@ -419,8 +464,11 @@ def GetDynamicVpdPcd(Item, Type, ContainerFile, LineNo = -1):
 #
 # @retval True Get component successfully
 #
+
+
 def GetComponent(Lines, KeyValues):
-    (findBlock, findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild, findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, False, False, False, False, False)
+    (findBlock, findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild,
+     findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, False, False, False, False, False)
     ListItem = None
     LibraryClassItem = []
     BuildOption = []
@@ -442,39 +490,49 @@ def GetComponent(Lines, KeyValues):
             #
             if Line.endswith('{'):
                 findBlock = True
-                ListItem = CleanString(Line.rsplit('{', 1)[0], DataType.TAB_COMMENT_SPLIT)
+                ListItem = CleanString(Line.rsplit(
+                    '{', 1)[0], DataType.TAB_COMMENT_SPLIT)
 
         #
         # Parse a block content
         #
         if findBlock:
             if Line.find('<LibraryClasses>') != -1:
-                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild, findPcdsDynamic, findPcdsDynamicEx) = (True, False, False, False, False, False, False)
+                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild,
+                 findPcdsDynamic, findPcdsDynamicEx) = (True, False, False, False, False, False, False)
                 continue
             if Line.find('<BuildOptions>') != -1:
-                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild, findPcdsDynamic, findPcdsDynamicEx) = (False, True, False, False, False, False, False)
+                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild,
+                 findPcdsDynamic, findPcdsDynamicEx) = (False, True, False, False, False, False, False)
                 continue
             if Line.find('<PcdsFeatureFlag>') != -1:
-                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild, findPcdsDynamic, findPcdsDynamicEx) = (False, False, True, False, False, False, False)
+                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild,
+                 findPcdsDynamic, findPcdsDynamicEx) = (False, False, True, False, False, False, False)
                 continue
             if Line.find('<PcdsPatchableInModule>') != -1:
-                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild, findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, True, False, False, False)
+                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild,
+                 findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, True, False, False, False)
                 continue
             if Line.find('<PcdsFixedAtBuild>') != -1:
-                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild, findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, False, True, False, False)
+                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild,
+                 findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, False, True, False, False)
                 continue
             if Line.find('<PcdsDynamic>') != -1:
-                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild, findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, False, False, True, False)
+                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild,
+                 findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, False, False, True, False)
                 continue
             if Line.find('<PcdsDynamicEx>') != -1:
-                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild, findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, False, False, False, True)
+                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild,
+                 findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, False, False, False, True)
                 continue
             if Line.endswith('}'):
                 #
                 # find '}' at line tail
                 #
-                KeyValues.append([ListItem, LibraryClassItem, BuildOption, Pcd])
-                (findBlock, findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild, findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, False, False, False, False, False)
+                KeyValues.append(
+                    [ListItem, LibraryClassItem, BuildOption, Pcd])
+                (findBlock, findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild,
+                 findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, False, False, False, False, False)
                 LibraryClassItem, BuildOption, Pcd = [], [], []
                 continue
 
@@ -498,7 +556,7 @@ def GetComponent(Lines, KeyValues):
 
     return True
 
-## GetExec
+# GetExec
 #
 # Parse a string with format "InfFilename [EXEC = ExecFilename]"
 # Return (InfFilename, ExecFilename)
@@ -507,18 +565,20 @@ def GetComponent(Lines, KeyValues):
 #
 # @retval truple() A pair as (InfFilename, ExecFilename)
 #
+
+
 def GetExec(String):
     InfFilename = ''
     ExecFilename = ''
     if String.find('EXEC') > -1:
-        InfFilename = String[ : String.find('EXEC')].strip()
-        ExecFilename = String[String.find('EXEC') + len('EXEC') : ].strip()
+        InfFilename = String[: String.find('EXEC')].strip()
+        ExecFilename = String[String.find('EXEC') + len('EXEC'):].strip()
     else:
         InfFilename = String.strip()
 
     return (InfFilename, ExecFilename)
 
-## GetComponents
+# GetComponents
 #
 # Parse block of the components defined in dsc file
 # Set KeyValues as [ ['component name', [lib1, lib2, lib3], [bo1, bo2, bo3], [pcd1, pcd2, pcd3]], ...]
@@ -530,10 +590,13 @@ def GetExec(String):
 #
 # @retval True Get component successfully
 #
+
+
 def GetComponents(Lines, Key, KeyValues, CommentCharacter):
     if Lines.find(DataType.TAB_SECTION_END) > -1:
         Lines = Lines.split(DataType.TAB_SECTION_END, 1)[1]
-    (findBlock, findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild, findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, False, False, False, False, False)
+    (findBlock, findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild,
+     findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, False, False, False, False, False)
     ListItem = None
     LibraryClassItem = []
     BuildOption = []
@@ -552,39 +615,49 @@ def GetComponents(Lines, Key, KeyValues, CommentCharacter):
             #
             if Line.endswith('{'):
                 findBlock = True
-                ListItem = CleanString(Line.rsplit('{', 1)[0], CommentCharacter)
+                ListItem = CleanString(Line.rsplit('{', 1)[
+                                       0], CommentCharacter)
 
         #
         # Parse a block content
         #
         if findBlock:
             if Line.find('<LibraryClasses>') != -1:
-                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild, findPcdsDynamic, findPcdsDynamicEx) = (True, False, False, False, False, False, False)
+                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild,
+                 findPcdsDynamic, findPcdsDynamicEx) = (True, False, False, False, False, False, False)
                 continue
             if Line.find('<BuildOptions>') != -1:
-                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild, findPcdsDynamic, findPcdsDynamicEx) = (False, True, False, False, False, False, False)
+                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild,
+                 findPcdsDynamic, findPcdsDynamicEx) = (False, True, False, False, False, False, False)
                 continue
             if Line.find('<PcdsFeatureFlag>') != -1:
-                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild, findPcdsDynamic, findPcdsDynamicEx) = (False, False, True, False, False, False, False)
+                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild,
+                 findPcdsDynamic, findPcdsDynamicEx) = (False, False, True, False, False, False, False)
                 continue
             if Line.find('<PcdsPatchableInModule>') != -1:
-                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild, findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, True, False, False, False)
+                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild,
+                 findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, True, False, False, False)
                 continue
             if Line.find('<PcdsFixedAtBuild>') != -1:
-                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild, findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, False, True, False, False)
+                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild,
+                 findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, False, True, False, False)
                 continue
             if Line.find('<PcdsDynamic>') != -1:
-                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild, findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, False, False, True, False)
+                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild,
+                 findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, False, False, True, False)
                 continue
             if Line.find('<PcdsDynamicEx>') != -1:
-                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild, findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, False, False, False, True)
+                (findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild,
+                 findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, False, False, False, True)
                 continue
             if Line.endswith('}'):
                 #
                 # find '}' at line tail
                 #
-                KeyValues.append([ListItem, LibraryClassItem, BuildOption, Pcd])
-                (findBlock, findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild, findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, False, False, False, False, False)
+                KeyValues.append(
+                    [ListItem, LibraryClassItem, BuildOption, Pcd])
+                (findBlock, findLibraryClass, findBuildOption, findPcdsFeatureFlag, findPcdsPatchableInModule, findPcdsFixedAtBuild,
+                 findPcdsDynamic, findPcdsDynamicEx) = (False, False, False, False, False, False, False, False)
                 LibraryClassItem, BuildOption, Pcd = [], [], []
                 continue
 
@@ -608,7 +681,7 @@ def GetComponents(Lines, Key, KeyValues, CommentCharacter):
 
     return True
 
-## Get Source
+# Get Source
 #
 # Get Source of Inf as <Filename>[|<Family>[|<TagName>[|<ToolCode>[|<PcdFeatureFlag>]]]]
 #
@@ -617,19 +690,23 @@ def GetComponents(Lines, Key, KeyValues, CommentCharacter):
 #
 # @retval (List[0], List[1], List[2], List[3], List[4])
 #
-def GetSource(Item, ContainerFile, FileRelativePath, LineNo = -1):
+
+
+def GetSource(Item, ContainerFile, FileRelativePath, LineNo=-1):
     ItemNew = Item + DataType.TAB_VALUE_SPLIT * 4
     List = GetSplitValueList(ItemNew)
     if len(List) < 5 or len(List) > 9:
-        RaiseParserError(Item, 'Sources', ContainerFile, '<Filename>[|<Family>[|<TagName>[|<ToolCode>[|<PcdFeatureFlag>]]]]', LineNo)
+        RaiseParserError(Item, 'Sources', ContainerFile,
+                         '<Filename>[|<Family>[|<TagName>[|<ToolCode>[|<PcdFeatureFlag>]]]]', LineNo)
     List[0] = NormPath(List[0])
-    CheckFileExist(FileRelativePath, List[0], ContainerFile, 'Sources', Item, LineNo)
+    CheckFileExist(FileRelativePath, List[0],
+                   ContainerFile, 'Sources', Item, LineNo)
     if List[4] != '':
         CheckPcdTokenInfo(List[4], 'Sources', ContainerFile, LineNo)
 
     return (List[0], List[1], List[2], List[3], List[4])
 
-## Get Binary
+# Get Binary
 #
 # Get Binary of Inf as <Filename>[|<Family>[|<TagName>[|<ToolCode>[|<PcdFeatureFlag>]]]]
 #
@@ -639,11 +716,14 @@ def GetSource(Item, ContainerFile, FileRelativePath, LineNo = -1):
 # @retval (List[0], List[1], List[2], List[3])
 # @retval List
 #
-def GetBinary(Item, ContainerFile, FileRelativePath, LineNo = -1):
+
+
+def GetBinary(Item, ContainerFile, FileRelativePath, LineNo=-1):
     ItemNew = Item + DataType.TAB_VALUE_SPLIT
     List = GetSplitValueList(ItemNew)
     if len(List) != 4 and len(List) != 5:
-        RaiseParserError(Item, 'Binaries', ContainerFile, "<FileType>|<Filename>|<Target>[|<TokenSpaceGuidCName>.<PcdCName>]", LineNo)
+        RaiseParserError(Item, 'Binaries', ContainerFile,
+                         "<FileType>|<Filename>|<Target>[|<TokenSpaceGuidCName>.<PcdCName>]", LineNo)
     else:
         if List[3] != '':
             CheckPcdTokenInfo(List[3], 'Binaries', ContainerFile, LineNo)
@@ -657,7 +737,7 @@ def GetBinary(Item, ContainerFile, FileRelativePath, LineNo = -1):
     elif len(List) == 1:
         return (List[0], '', '', '')
 
-## Get Guids/Protocols/Ppis
+# Get Guids/Protocols/Ppis
 #
 # Get Guids/Protocols/Ppis of Inf as <GuidCName>[|<PcdFeatureFlag>]
 #
@@ -667,7 +747,9 @@ def GetBinary(Item, ContainerFile, FileRelativePath, LineNo = -1):
 #
 # @retval (List[0], List[1])
 #
-def GetGuidsProtocolsPpisOfInf(Item, Type, ContainerFile, LineNo = -1):
+
+
+def GetGuidsProtocolsPpisOfInf(Item, Type, ContainerFile, LineNo=-1):
     ItemNew = Item + TAB_VALUE_SPLIT
     List = GetSplitValueList(ItemNew)
     if List[1] != '':
@@ -675,7 +757,7 @@ def GetGuidsProtocolsPpisOfInf(Item, Type, ContainerFile, LineNo = -1):
 
     return (List[0], List[1])
 
-## Get Guids/Protocols/Ppis
+# Get Guids/Protocols/Ppis
 #
 # Get Guids/Protocols/Ppis of Dec as <GuidCName>=<GuidValue>
 #
@@ -685,14 +767,17 @@ def GetGuidsProtocolsPpisOfInf(Item, Type, ContainerFile, LineNo = -1):
 #
 # @retval (List[0], List[1])
 #
-def GetGuidsProtocolsPpisOfDec(Item, Type, ContainerFile, LineNo = -1):
+
+
+def GetGuidsProtocolsPpisOfDec(Item, Type, ContainerFile, LineNo=-1):
     List = GetSplitValueList(Item, DataType.TAB_EQUAL_SPLIT)
     if len(List) != 2:
-        RaiseParserError(Item, Type, ContainerFile, '<CName>=<GuidValue>', LineNo)
+        RaiseParserError(Item, Type, ContainerFile,
+                         '<CName>=<GuidValue>', LineNo)
 
     return (List[0], List[1])
 
-## GetPackage
+# GetPackage
 #
 # Get Package of Inf as <PackagePath>[|<PcdFeatureFlag>]
 #
@@ -702,18 +787,21 @@ def GetGuidsProtocolsPpisOfDec(Item, Type, ContainerFile, LineNo = -1):
 #
 # @retval (List[0], List[1])
 #
-def GetPackage(Item, ContainerFile, FileRelativePath, LineNo = -1):
+
+
+def GetPackage(Item, ContainerFile, FileRelativePath, LineNo=-1):
     ItemNew = Item + TAB_VALUE_SPLIT
     List = GetSplitValueList(ItemNew)
     CheckFileType(List[0], '.Dec', ContainerFile, 'package', List[0], LineNo)
-    CheckFileExist(FileRelativePath, List[0], ContainerFile, 'Packages', List[0], LineNo)
+    CheckFileExist(FileRelativePath,
+                   List[0], ContainerFile, 'Packages', List[0], LineNo)
 
     if List[1] != '':
         CheckPcdTokenInfo(List[1], 'Packages', ContainerFile, LineNo)
 
     return (List[0], List[1])
 
-## Get Pcd Values of Inf
+# Get Pcd Values of Inf
 #
 # Get Pcd of Inf as <TokenSpaceGuidCName>.<PcdCName>[|<Value>]
 #
@@ -723,6 +811,8 @@ def GetPackage(Item, ContainerFile, FileRelativePath, LineNo = -1):
 #
 # @retval (TokenSpcCName, TokenCName, Value, ItemType) Formatted Pcd Item
 #
+
+
 def GetPcdOfInf(Item, Type, File, LineNo):
     Format = '<TokenSpaceGuidCName>.<PcdCName>[|<Value>]'
     TokenGuid, TokenName, Value, InfType = '', '', '', ''
@@ -752,12 +842,12 @@ def GetPcdOfInf(Item, Type, File, LineNo):
     return (TokenGuid, TokenName, Value, Type)
 
 
-## Get Pcd Values of Dec
+# Get Pcd Values of Dec
 #
 # Get Pcd of Dec as <TokenSpcCName>.<TokenCName>|<Value>|<DatumType>|<Token>
 # @retval (TokenSpcCName, TokenCName, Value, DatumType, Token, ItemType) Formatted Pcd Item
 #
-def GetPcdOfDec(Item, Type, File, LineNo = -1):
+def GetPcdOfDec(Item, Type, File, LineNo=-1):
     Format = '<TokenSpaceGuidCName>.<PcdCName>|<Value>|<DatumType>|<Token>'
     TokenGuid, TokenName, Value, DatumType, Token = '', '', '', '', ''
     List = GetSplitValueList(Item)
@@ -776,7 +866,7 @@ def GetPcdOfDec(Item, Type, File, LineNo = -1):
 
     return (TokenGuid, TokenName, Value, DatumType, Token, Type)
 
-## Parse DEFINE statement
+# Parse DEFINE statement
 #
 # Get DEFINE macros
 #
@@ -784,15 +874,22 @@ def GetPcdOfDec(Item, Type, File, LineNo = -1):
 # Value1: Macro Name
 # Value2: Macro Value
 #
+
+
 def ParseDefine(LineValue, StartLine, Table, FileID, Filename, SectionName, SectionModel, Arch):
-    EdkLogger.debug(EdkLogger.DEBUG_2, "DEFINE statement '%s' found in section %s" % (LineValue, SectionName))
-    Define = GetSplitValueList(CleanString(LineValue[LineValue.upper().find(DataType.TAB_DEFINE.upper() + ' ') + len(DataType.TAB_DEFINE + ' ') : ]), TAB_EQUAL_SPLIT, 1)
-    Table.Insert(MODEL_META_DATA_DEFINE, Define[0], Define[1], '', '', '', Arch, SectionModel, FileID, StartLine, -1, StartLine, -1, 0)
+    EdkLogger.debug(EdkLogger.DEBUG_2, "DEFINE statement '%s' found in section %s" % (
+        LineValue, SectionName))
+    Define = GetSplitValueList(CleanString(LineValue[LineValue.upper().find(
+        DataType.TAB_DEFINE.upper() + ' ') + len(DataType.TAB_DEFINE + ' '):]), TAB_EQUAL_SPLIT, 1)
+    Table.Insert(MODEL_META_DATA_DEFINE, Define[0], Define[1], '', '',
+                 '', Arch, SectionModel, FileID, StartLine, -1, StartLine, -1, 0)
 
-## InsertSectionItems
+# InsertSectionItems
 #
 # Insert item data of a section to a dict
 #
+
+
 def InsertSectionItems(Model, CurrentSection, SectionItemList, ArchList, ThirdList, RecordSet):
     # Insert each item data of a section
     for Index in range(0, len(ArchList)):
@@ -804,9 +901,10 @@ def InsertSectionItems(Model, CurrentSection, SectionItemList, ArchList, ThirdLi
         Records = RecordSet[Model]
         for SectionItem in SectionItemList:
             BelongsToItem, EndLine, EndColumn = -1, -1, -1
-            LineValue, StartLine, EndLine, Comment = SectionItem[0], SectionItem[1], SectionItem[1], SectionItem[2]
+            LineValue, StartLine, EndLine, Comment = SectionItem[
+                0], SectionItem[1], SectionItem[1], SectionItem[2]
 
-            EdkLogger.debug(4, "Parsing %s ..." %LineValue)
+            EdkLogger.debug(4, "Parsing %s ..." % LineValue)
             # And then parse DEFINE statement
             if LineValue.upper().find(DataType.TAB_DEFINE.upper() + ' ') > -1:
                 continue
@@ -818,7 +916,7 @@ def InsertSectionItems(Model, CurrentSection, SectionItemList, ArchList, ThirdLi
         if RecordSet != {}:
             RecordSet[Model] = Records
 
-## Insert records to database
+# Insert records to database
 #
 # Insert item data of a section to database
 # @param Table:            The Table to be inserted
@@ -831,6 +929,8 @@ def InsertSectionItems(Model, CurrentSection, SectionItemList, ArchList, ThirdLi
 # @param IfDefList:        A list of all conditional statements
 # @param RecordSet:        A dict of all parsed records
 #
+
+
 def InsertSectionItemsIntoDatabase(Table, FileID, Filename, Model, CurrentSection, SectionItemList, ArchList, ThirdList, IfDefList, RecordSet):
     #
     # Insert each item data of a section
@@ -846,31 +946,35 @@ def InsertSectionItemsIntoDatabase(Table, FileID, Filename, Model, CurrentSectio
             BelongsToItem, EndLine, EndColumn = -1, -1, -1
             LineValue, StartLine, EndLine = SectionItem[0], SectionItem[1], SectionItem[1]
 
-            EdkLogger.debug(4, "Parsing %s ..." %LineValue)
+            EdkLogger.debug(4, "Parsing %s ..." % LineValue)
             #
             # And then parse DEFINE statement
             #
             if LineValue.upper().find(DataType.TAB_DEFINE.upper() + ' ') > -1:
-                ParseDefine(LineValue, StartLine, Table, FileID, Filename, CurrentSection, Model, Arch)
+                ParseDefine(LineValue, StartLine, Table, FileID,
+                            Filename, CurrentSection, Model, Arch)
                 continue
 
             #
             # At last parse other sections
             #
-            ID = Table.Insert(Model, LineValue, Third, Third, '', '', Arch, -1, FileID, StartLine, -1, StartLine, -1, 0)
+            ID = Table.Insert(Model, LineValue, Third, Third, '',
+                              '', Arch, -1, FileID, StartLine, -1, StartLine, -1, 0)
             Records.append([LineValue, Arch, StartLine, ID, Third])
 
         if RecordSet != {}:
             RecordSet[Model] = Records
 
-## GenMetaDatSectionItem
+# GenMetaDatSectionItem
+
+
 def GenMetaDatSectionItem(Key, Value, List):
     if Key not in List:
         List[Key] = [Value]
     else:
         List[Key].append(Value)
 
-## IsValidWord
+# IsValidWord
 #
 # Check whether the word is valid.
 # <Word>   ::=  (a-zA-Z0-9_)(a-zA-Z0-9_-){0,} Alphanumeric characters with
@@ -880,6 +984,8 @@ def GenMetaDatSectionItem(Key, Value, List):
 #
 # @param Word:  The word string need to be checked.
 #
+
+
 def IsValidWord(Word):
     if not Word:
         return False
diff --git a/BaseTools/Source/Python/Common/RangeExpression.py b/BaseTools/Source/Python/Common/RangeExpression.py
index 039d2814670f..5820806e4ccb 100644
--- a/BaseTools/Source/Python/Common/RangeExpression.py
+++ b/BaseTools/Source/Python/Common/RangeExpression.py
@@ -36,8 +36,9 @@ ERR_ARRAY_ELE = 'This must be HEX value for NList or Array: [%s].'
 ERR_EMPTY_EXPR = 'Empty expression is not allowed.'
 ERR_IN_OPERAND = 'Macro after IN operator can only be: $(FAMILY), $(ARCH), $(TOOL_CHAIN_TAG) and $(TARGET).'
 
+
 class RangeObject(object):
-    def __init__(self, start, end, empty = False):
+    def __init__(self, start, end, empty=False):
 
         if int(start) < int(end):
             self.start = int(start)
@@ -47,13 +48,15 @@ class RangeObject(object):
             self.end = int(start)
         self.empty = empty
 
+
 class RangeContainer(object):
     def __init__(self):
         self.rangelist = []
 
     def push(self, RangeObject):
         self.rangelist.append(RangeObject)
-        self.rangelist = sorted(self.rangelist, key = lambda rangeobj : rangeobj.start)
+        self.rangelist = sorted(
+            self.rangelist, key=lambda rangeobj: rangeobj.start)
         self.merge()
 
     def pop(self):
@@ -68,6 +71,7 @@ class RangeContainer(object):
             else:
                 newrangelist.append(rangeobj)
         self.rangelist = newrangelist
+
     def merge(self):
         self.__clean__()
         for i in range(0, len(self.rangelist) - 1):
@@ -75,7 +79,8 @@ class RangeContainer(object):
                 continue
             else:
                 self.rangelist[i + 1].start = self.rangelist[i].start
-                self.rangelist[i + 1].end = self.rangelist[i + 1].end > self.rangelist[i].end and self.rangelist[i + 1].end or self.rangelist[i].end
+                self.rangelist[i + 1].end = self.rangelist[i +
+                                                           1].end > self.rangelist[i].end and self.rangelist[i + 1].end or self.rangelist[i].end
                 self.rangelist[i].empty = True
 
         self.__clean__()
@@ -91,6 +96,7 @@ class RangeContainer(object):
 class XOROperatorObject(object):
     def __init__(self):
         pass
+
     def Calculate(self, Operand, DataType, SymbolTable):
         if isinstance(Operand, type('')) and not Operand.isalnum():
             Expr = "XOR ..."
@@ -98,13 +104,16 @@ class XOROperatorObject(object):
         rangeId = str(uuid.uuid1())
         rangeContainer = RangeContainer()
         rangeContainer.push(RangeObject(0, int(Operand) - 1))
-        rangeContainer.push(RangeObject(int(Operand) + 1, MAX_VAL_TYPE[DataType]))
+        rangeContainer.push(RangeObject(
+            int(Operand) + 1, MAX_VAL_TYPE[DataType]))
         SymbolTable[rangeId] = rangeContainer
         return rangeId
 
+
 class LEOperatorObject(object):
     def __init__(self):
         pass
+
     def Calculate(self, Operand, DataType, SymbolTable):
         if isinstance(Operand, type('')) and not Operand.isalnum():
             Expr = "LE ..."
@@ -114,9 +123,12 @@ class LEOperatorObject(object):
         rangeContainer.push(RangeObject(0, int(Operand)))
         SymbolTable[rangeId1] = rangeContainer
         return rangeId1
+
+
 class LTOperatorObject(object):
     def __init__(self):
         pass
+
     def Calculate(self, Operand, DataType, SymbolTable):
         if isinstance(Operand, type('')) and not Operand.isalnum():
             Expr = "LT ..."
@@ -127,9 +139,11 @@ class LTOperatorObject(object):
         SymbolTable[rangeId1] = rangeContainer
         return rangeId1
 
+
 class GEOperatorObject(object):
     def __init__(self):
         pass
+
     def Calculate(self, Operand, DataType, SymbolTable):
         if isinstance(Operand, type('')) and not Operand.isalnum():
             Expr = "GE ..."
@@ -140,22 +154,27 @@ class GEOperatorObject(object):
         SymbolTable[rangeId1] = rangeContainer
         return rangeId1
 
+
 class GTOperatorObject(object):
     def __init__(self):
         pass
+
     def Calculate(self, Operand, DataType, SymbolTable):
         if isinstance(Operand, type('')) and not Operand.isalnum():
             Expr = "GT ..."
             raise BadExpression(ERR_SNYTAX % Expr)
         rangeId1 = str(uuid.uuid1())
         rangeContainer = RangeContainer()
-        rangeContainer.push(RangeObject(int(Operand) + 1, MAX_VAL_TYPE[DataType]))
+        rangeContainer.push(RangeObject(
+            int(Operand) + 1, MAX_VAL_TYPE[DataType]))
         SymbolTable[rangeId1] = rangeContainer
         return rangeId1
 
+
 class EQOperatorObject(object):
     def __init__(self):
         pass
+
     def Calculate(self, Operand, DataType, SymbolTable):
         if isinstance(Operand, type('')) and not Operand.isalnum():
             Expr = "EQ ..."
@@ -166,6 +185,7 @@ class EQOperatorObject(object):
         SymbolTable[rangeId1] = rangeContainer
         return rangeId1
 
+
 def GetOperatorObject(Operator):
     if Operator == '>':
         return GTOperatorObject()
@@ -182,17 +202,18 @@ def GetOperatorObject(Operator):
     else:
         raise BadExpression("Bad Operator")
 
+
 class RangeExpression(BaseExpression):
     # Logical operator mapping
     LogicalOperators = {
-        '&&' : 'and', '||' : 'or',
-        '!'  : 'not', 'AND': 'and',
-        'OR' : 'or' , 'NOT': 'not',
-        'XOR': '^'  , 'xor': '^',
-        'EQ' : '==' , 'NE' : '!=',
-        'GT' : '>'  , 'LT' : '<',
-        'GE' : '>=' , 'LE' : '<=',
-        'IN' : 'in'
+        '&&': 'and', '||': 'or',
+        '!': 'not', 'AND': 'and',
+        'OR': 'or', 'NOT': 'not',
+        'XOR': '^', 'xor': '^',
+        'EQ': '==', 'NE': '!=',
+        'GT': '>', 'LT': '<',
+        'GE': '>=', 'LE': '<=',
+        'IN': 'in'
     }
 
     NonLetterOpLst = ['+', '-', '&', '|', '^', '!', '=', '>', '<']
@@ -227,7 +248,6 @@ class RangeExpression(BaseExpression):
         self._Expr = expr
         return expr
 
-
     def EvalRange(self, Operator, Oprand):
 
         operatorobj = GetOperatorObject(Operator)
@@ -283,23 +303,23 @@ class RangeExpression(BaseExpression):
 #        rangeContainer.dump()
         return rangeid
 
-
     def NegativeRange(self, Oprand1):
         rangeContainer1 = self.operanddict[Oprand1]
 
-
         rangeids = []
 
         for rangeobj in rangeContainer1.pop():
             rangeContainer = RangeContainer()
             rangeid = str(uuid.uuid1())
             if rangeobj.empty:
-                rangeContainer.push(RangeObject(0, MAX_VAL_TYPE[self.PcdDataType]))
+                rangeContainer.push(RangeObject(
+                    0, MAX_VAL_TYPE[self.PcdDataType]))
             else:
                 if rangeobj.start > 0:
                     rangeContainer.push(RangeObject(0, rangeobj.start - 1))
                 if rangeobj.end < MAX_VAL_TYPE[self.PcdDataType]:
-                    rangeContainer.push(RangeObject(rangeobj.end + 1, MAX_VAL_TYPE[self.PcdDataType]))
+                    rangeContainer.push(RangeObject(
+                        rangeobj.end + 1, MAX_VAL_TYPE[self.PcdDataType]))
             self.operanddict[rangeid] = rangeContainer
             rangeids.append(rangeid)
 
@@ -321,7 +341,7 @@ class RangeExpression(BaseExpression):
         self.operanddict[rangeid2] = self.operanddict[re]
         return rangeid2
 
-    def Eval(self, Operator, Oprand1, Oprand2 = None):
+    def Eval(self, Operator, Oprand1, Oprand2=None):
 
         if Operator in ["!", "NOT", "not"]:
             if not gGuidPattern.match(Oprand1.strip()):
@@ -330,7 +350,7 @@ class RangeExpression(BaseExpression):
         else:
             if Operator in ["==", ">=", "<=", ">", "<", '^']:
                 return self.EvalRange(Operator, Oprand1)
-            elif Operator == 'and' :
+            elif Operator == 'and':
                 if not gGuidPatternEnd.match(Oprand1.strip()) or not gGuidPatternEnd.match(Oprand2.strip()):
                     raise BadExpression(ERR_STRING_EXPR % Operator)
                 return self.Rangeintersection(Oprand1, Oprand2)
@@ -341,11 +361,11 @@ class RangeExpression(BaseExpression):
             else:
                 raise BadExpression(ERR_STRING_EXPR % Operator)
 
-
-    def __init__(self, Expression, PcdDataType, SymbolTable = None):
+    def __init__(self, Expression, PcdDataType, SymbolTable=None):
         if SymbolTable is None:
             SymbolTable = {}
-        super(RangeExpression, self).__init__(self, Expression, PcdDataType, SymbolTable)
+        super(RangeExpression, self).__init__(
+            self, Expression, PcdDataType, SymbolTable)
         self._NoProcess = False
         if not isinstance(Expression, type('')):
             self._Expr = Expression
@@ -367,7 +387,6 @@ class RangeExpression(BaseExpression):
         self._Token = ''
         self._WarnExcept = None
 
-
         # Literal token without any conversion
         self._LiteralToken = ''
 
@@ -383,7 +402,7 @@ class RangeExpression(BaseExpression):
     #   @return: True or False if RealValue is False
     #            Evaluated value of string format if RealValue is True
     #
-    def __call__(self, RealValue = False, Depth = 0):
+    def __call__(self, RealValue=False, Depth=0):
         if self._NoProcess:
             return self._Expr
 
@@ -397,7 +416,7 @@ class RangeExpression(BaseExpression):
         if RealValue and Depth == 0:
             self._Token = self._Expr
             if gGuidPatternEnd.match(self._Expr):
-                return [self.operanddict[self._Expr] ]
+                return [self.operanddict[self._Expr]]
 
             self._Idx = 0
             self._Token = ''
@@ -507,11 +526,11 @@ class RangeExpression(BaseExpression):
         # All whitespace and tabs in array are already stripped.
         IsArray = IsGuid = False
         if len(Token.split(',')) == 11 and len(Token.split(',{')) == 2 \
-            and len(Token.split('},')) == 1:
+                and len(Token.split('},')) == 1:
             HexLen = [11, 6, 6, 5, 4, 4, 4, 4, 4, 4, 6]
             HexList = Token.split(',')
             if HexList[3].startswith('{') and \
-                not [Index for Index, Hex in enumerate(HexList) if len(Hex) > HexLen[Index]]:
+                    not [Index for Index, Hex in enumerate(HexList) if len(Hex) > HexLen[Index]]:
                 IsGuid = True
         if Token.lstrip('{').rstrip('}').find('{') == -1:
             if not [Hex for Hex in Token.lstrip('{').rstrip('}').split(',') if len(Hex) > 4]:
@@ -530,7 +549,8 @@ class RangeExpression(BaseExpression):
         self._Idx += 1
 
         # Replace escape \\\", \"
-        Expr = self._Expr[self._Idx:].replace('\\\\', '//').replace('\\\"', '\\\'')
+        Expr = self._Expr[self._Idx:].replace(
+            '\\\\', '//').replace('\\\"', '\\\'')
         for Ch in Expr:
             self._Idx += 1
             if Ch == '"':
@@ -543,7 +563,7 @@ class RangeExpression(BaseExpression):
 
     # Get token that is comprised by alphanumeric, underscore or dot(used by PCD)
     # @param IsAlphaOp: Indicate if parsing general token or script operator(EQ, NE...)
-    def __GetIdToken(self, IsAlphaOp = False):
+    def __GetIdToken(self, IsAlphaOp=False):
         IdToken = ''
         for Ch in self._Expr[self._Idx:]:
             if not self.__IsIdChar(Ch):
@@ -567,7 +587,8 @@ class RangeExpression(BaseExpression):
                 Ex = BadExpression(ERR_PCD_RESOLVE % self._Token)
                 Ex.Pcd = self._Token
                 raise Ex
-            self._Token = RangeExpression(self._Symb[self._Token], self._Symb)(True, self._Depth + 1)
+            self._Token = RangeExpression(
+                self._Symb[self._Token], self._Symb)(True, self._Depth + 1)
             if not isinstance(self._Token, type('')):
                 self._LiteralToken = hex(self._Token)
                 return
@@ -581,7 +602,7 @@ class RangeExpression(BaseExpression):
         else:
             self.__IsNumberToken()
 
-    def __GetNList(self, InArray = False):
+    def __GetNList(self, InArray=False):
         self._GetSingleToken()
         if not self.__IsHexLiteral():
             if InArray:
@@ -609,7 +630,7 @@ class RangeExpression(BaseExpression):
 
     def __IsHexLiteral(self):
         if self._LiteralToken.startswith('{') and \
-            self._LiteralToken.endswith('}'):
+                self._LiteralToken.endswith('}'):
             return True
 
         if gHexPattern.match(self._LiteralToken):
@@ -645,7 +666,7 @@ class RangeExpression(BaseExpression):
             Ch = Expr[0]
             Match = gGuidPattern.match(Expr)
             if Match and not Expr[Match.end():Match.end() + 1].isalnum() \
-                and Expr[Match.end():Match.end() + 1] != '_':
+                    and Expr[Match.end():Match.end() + 1] != '_':
                 self._Idx += Match.end()
                 self._Token = Expr[0:Match.end()]
                 return self._Token
diff --git a/BaseTools/Source/Python/Common/StringUtils.py b/BaseTools/Source/Python/Common/StringUtils.py
index 73dafa797a52..370ce7147681 100644
--- a/BaseTools/Source/Python/Common/StringUtils.py
+++ b/BaseTools/Source/Python/Common/StringUtils.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define common string related functions used in parsing process
 #
 # Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -24,7 +24,7 @@ from Common.MultipleWorkspace import MultipleWorkspace as mws
 gHexVerPatt = re.compile('0x[a-f0-9]{4}[a-f0-9]{4}$', re.IGNORECASE)
 gHumanReadableVerPatt = re.compile(r'([1-9][0-9]*|0)\.[0-9]{1,2}$')
 
-## GetSplitValueList
+# GetSplitValueList
 #
 # Get a value list from a string with multiple values split with SplitTag
 # The default SplitTag is DataType.TAB_VALUE_SPLIT
@@ -36,7 +36,9 @@ gHumanReadableVerPatt = re.compile(r'([1-9][0-9]*|0)\.[0-9]{1,2}$')
 #
 # @retval list() A list for splitted string
 #
-def GetSplitValueList(String, SplitTag=DataType.TAB_VALUE_SPLIT, MaxSplit= -1):
+
+
+def GetSplitValueList(String, SplitTag=DataType.TAB_VALUE_SPLIT, MaxSplit=-1):
     ValueList = []
     Last = 0
     Escaped = False
@@ -80,7 +82,7 @@ def GetSplitValueList(String, SplitTag=DataType.TAB_VALUE_SPLIT, MaxSplit= -1):
 
     return ValueList
 
-## GetSplitList
+# GetSplitList
 #
 # Get a value list from a string with multiple values split with SplitString
 # The default SplitTag is DataType.TAB_VALUE_SPLIT
@@ -92,10 +94,12 @@ def GetSplitValueList(String, SplitTag=DataType.TAB_VALUE_SPLIT, MaxSplit= -1):
 #
 # @retval list() A list for splitted string
 #
-def GetSplitList(String, SplitStr=DataType.TAB_VALUE_SPLIT, MaxSplit= -1):
+
+
+def GetSplitList(String, SplitStr=DataType.TAB_VALUE_SPLIT, MaxSplit=-1):
     return list(map(lambda l: l.strip(), String.split(SplitStr, MaxSplit)))
 
-## MergeArches
+# MergeArches
 #
 # Find a key's all arches in dict, add the new arch to the list
 # If not exist any arch, set the arch directly
@@ -104,13 +108,15 @@ def GetSplitList(String, SplitStr=DataType.TAB_VALUE_SPLIT, MaxSplit= -1):
 # @param Key:   The input value for Key
 # @param Arch:  The Arch to be added or merged
 #
+
+
 def MergeArches(Dict, Key, Arch):
     if Key in Dict:
         Dict[Key].append(Arch)
     else:
         Dict[Key] = Arch.split()
 
-## GenDefines
+# GenDefines
 #
 # Parse a string with format "DEFINE <VarName> = <PATH>"
 # Generate a map Defines[VarName] = PATH
@@ -124,9 +130,12 @@ def MergeArches(Dict, Key, Arch):
 # @retval 1   DEFINE statement found, but not valid
 # @retval -1  DEFINE statement not found
 #
+
+
 def GenDefines(String, Arch, Defines):
     if String.find(DataType.TAB_DEFINE + ' ') > -1:
-        List = String.replace(DataType.TAB_DEFINE + ' ', '').split(DataType.TAB_EQUAL_SPLIT)
+        List = String.replace(DataType.TAB_DEFINE + ' ',
+                              '').split(DataType.TAB_EQUAL_SPLIT)
         if len(List) == 2:
             Defines[(CleanString(List[0]), Arch)] = CleanString(List[1])
             return 0
@@ -135,7 +144,7 @@ def GenDefines(String, Arch, Defines):
 
     return 1
 
-## GenInclude
+# GenInclude
 #
 # Parse a string with format "!include <Filename>"
 # Return the file path
@@ -148,15 +157,18 @@ def GenDefines(String, Arch, Defines):
 # @retval True
 # @retval False
 #
+
+
 def GenInclude(String, IncludeFiles, Arch):
     if String.upper().find(DataType.TAB_INCLUDE.upper() + ' ') > -1:
-        IncludeFile = CleanString(String[String.upper().find(DataType.TAB_INCLUDE.upper() + ' ') + len(DataType.TAB_INCLUDE + ' ') : ])
+        IncludeFile = CleanString(String[String.upper().find(
+            DataType.TAB_INCLUDE.upper() + ' ') + len(DataType.TAB_INCLUDE + ' '):])
         MergeArches(IncludeFiles, IncludeFile, Arch)
         return True
     else:
         return False
 
-## GetLibraryClassesWithModuleType
+# GetLibraryClassesWithModuleType
 #
 # Get Library Class definition when no module type defined
 #
@@ -167,6 +179,8 @@ def GenInclude(String, IncludeFiles, Arch):
 #
 # @retval True Get library classes successfully
 #
+
+
 def GetLibraryClassesWithModuleType(Lines, Key, KeyValues, CommentCharacter):
     newKey = SplitModuleType(Key)
     Lines = Lines.split(DataType.TAB_SECTION_END, 1)[1]
@@ -178,7 +192,7 @@ def GetLibraryClassesWithModuleType(Lines, Key, KeyValues, CommentCharacter):
 
     return True
 
-## GetDynamics
+# GetDynamics
 #
 # Get Dynamic Pcds
 #
@@ -189,6 +203,8 @@ def GetLibraryClassesWithModuleType(Lines, Key, KeyValues, CommentCharacter):
 #
 # @retval True Get Dynamic Pcds successfully
 #
+
+
 def GetDynamics(Lines, Key, KeyValues, CommentCharacter):
     #
     # Get SkuId Name List
@@ -200,11 +216,12 @@ def GetDynamics(Lines, Key, KeyValues, CommentCharacter):
     for Line in LineList:
         Line = CleanString(Line, CommentCharacter)
         if Line != '' and Line[0] != CommentCharacter:
-            KeyValues.append([CleanString(Line, CommentCharacter), SkuIdNameList[1]])
+            KeyValues.append(
+                [CleanString(Line, CommentCharacter), SkuIdNameList[1]])
 
     return True
 
-## SplitModuleType
+# SplitModuleType
 #
 # Split ModuleType out of section defien to get key
 # [LibraryClass.Arch.ModuleType|ModuleType|ModuleType] -> [ 'LibraryClass.Arch', ['ModuleType', 'ModuleType', 'ModuleType'] ]
@@ -213,6 +230,8 @@ def GetDynamics(Lines, Key, KeyValues, CommentCharacter):
 #
 # @retval ReturnValue A list for module types
 #
+
+
 def SplitModuleType(Key):
     KeyList = Key.split(DataType.TAB_SPLIT)
     #
@@ -232,7 +251,7 @@ def SplitModuleType(Key):
 
     return ReturnValue
 
-## Replace macro in strings list
+# Replace macro in strings list
 #
 # This method replace macros used in a given string list. The macros are
 # given in a dictionary.
@@ -243,19 +262,22 @@ def SplitModuleType(Key):
 #
 # @retval NewList           A new string list whose macros are replaced
 #
+
+
 def ReplaceMacros(StringList, MacroDefinitions=None, SelfReplacement=False):
     NewList = []
     if MacroDefinitions is None:
         MacroDefinitions = {}
     for String in StringList:
         if isinstance(String, type('')):
-            NewList.append(ReplaceMacro(String, MacroDefinitions, SelfReplacement))
+            NewList.append(ReplaceMacro(
+                String, MacroDefinitions, SelfReplacement))
         else:
             NewList.append(String)
 
     return NewList
 
-## Replace macro in string
+# Replace macro in string
 #
 # This method replace macros used in given string. The macros are given in a
 # dictionary.
@@ -266,6 +288,8 @@ def ReplaceMacros(StringList, MacroDefinitions=None, SelfReplacement=False):
 #
 # @retval string            The string whose macros are replaced
 #
+
+
 def ReplaceMacro(String, MacroDefinitions=None, SelfReplacement=False, RaiseError=False):
     LastString = String
     if MacroDefinitions is None:
@@ -284,7 +308,8 @@ def ReplaceMacro(String, MacroDefinitions=None, SelfReplacement=False, RaiseErro
                     String = String.replace("$(%s)" % Macro, '')
                 continue
             if "$(%s)" % Macro not in MacroDefinitions[Macro]:
-                String = String.replace("$(%s)" % Macro, MacroDefinitions[Macro])
+                String = String.replace(
+                    "$(%s)" % Macro, MacroDefinitions[Macro])
         # in case there's macro not defined
         if String == LastString:
             break
@@ -292,7 +317,7 @@ def ReplaceMacro(String, MacroDefinitions=None, SelfReplacement=False, RaiseErro
 
     return String
 
-## NormPath
+# NormPath
 #
 # Create a normal path
 # And replace DEFINE in the path
@@ -302,6 +327,8 @@ def ReplaceMacro(String, MacroDefinitions=None, SelfReplacement=False, RaiseErro
 #
 # @retval Path Formatted path
 #
+
+
 def NormPath(Path, Defines=None):
     IsRelativePath = False
     if Path:
@@ -317,7 +344,7 @@ def NormPath(Path, Defines=None):
         #
         Path = os.path.normpath(Path)
         if Path.startswith(GlobalData.gWorkspace) and not Path.startswith(GlobalData.gBuildDirectory) and not os.path.exists(Path):
-            Path = Path[len (GlobalData.gWorkspace):]
+            Path = Path[len(GlobalData.gWorkspace):]
             if Path[0] == os.path.sep:
                 Path = Path[1:]
             Path = mws.join(GlobalData.gWorkspace, Path)
@@ -327,7 +354,7 @@ def NormPath(Path, Defines=None):
 
     return Path
 
-## CleanString
+# CleanString
 #
 # Remove comments in a string
 # Remove spaces
@@ -337,11 +364,13 @@ def NormPath(Path, Defines=None):
 #
 # @retval Path Formatted path
 #
+
+
 def CleanString(Line, CommentCharacter=DataType.TAB_COMMENT_SPLIT, AllowCppStyleComment=False, BuildOption=False):
     #
     # remove whitespace
     #
-    Line = Line.strip();
+    Line = Line.strip()
     #
     # Replace Edk's comment character
     #
@@ -383,11 +412,11 @@ def CleanString(Line, CommentCharacter=DataType.TAB_COMMENT_SPLIT, AllowCppStyle
     #
     # remove whitespace again
     #
-    Line = Line.strip();
+    Line = Line.strip()
 
     return Line
 
-## CleanString2
+# CleanString2
 #
 # Split statement with comments in a string
 # Remove spaces
@@ -397,11 +426,13 @@ def CleanString(Line, CommentCharacter=DataType.TAB_COMMENT_SPLIT, AllowCppStyle
 #
 # @retval Path Formatted path
 #
+
+
 def CleanString2(Line, CommentCharacter=DataType.TAB_COMMENT_SPLIT, AllowCppStyleComment=False):
     #
     # remove whitespace
     #
-    Line = Line.strip();
+    Line = Line.strip()
     #
     # Replace Edk's comment character
     #
@@ -428,7 +459,7 @@ def CleanString2(Line, CommentCharacter=DataType.TAB_COMMENT_SPLIT, AllowCppStyl
 
     return Line, Comment
 
-## GetMultipleValuesOfKeyFromLines
+# GetMultipleValuesOfKeyFromLines
 #
 # Parse multiple strings to clean comment and spaces
 # The result is saved to KeyValues
@@ -440,6 +471,8 @@ def CleanString2(Line, CommentCharacter=DataType.TAB_COMMENT_SPLIT, AllowCppStyl
 #
 # @retval True Successfully executed
 #
+
+
 def GetMultipleValuesOfKeyFromLines(Lines, Key, KeyValues, CommentCharacter):
     Lines = Lines.split(DataType.TAB_SECTION_END, 1)[1]
     LineList = Lines.split('\n')
@@ -450,7 +483,7 @@ def GetMultipleValuesOfKeyFromLines(Lines, Key, KeyValues, CommentCharacter):
 
     return True
 
-## GetDefineValue
+# GetDefineValue
 #
 # Parse a DEFINE statement to get defined value
 # DEFINE Key Value
@@ -461,11 +494,13 @@ def GetMultipleValuesOfKeyFromLines(Lines, Key, KeyValues, CommentCharacter):
 #
 # @retval string The defined value
 #
+
+
 def GetDefineValue(String, Key, CommentCharacter):
     String = CleanString(String)
-    return String[String.find(Key + ' ') + len(Key + ' ') : ]
+    return String[String.find(Key + ' ') + len(Key + ' '):]
 
-## GetHexVerValue
+# GetHexVerValue
 #
 # Get a Hex Version Value
 #
@@ -476,6 +511,8 @@ def GetDefineValue(String, Key, CommentCharacter):
 #               If VerString is correctly formatted, return a Hex value of the Version Number (0xmmmmnnnn)
 #                   where mmmm is the major number and nnnn is the adjusted minor number.
 #
+
+
 def GetHexVerValue(VerString):
     VerString = CleanString(VerString)
 
@@ -485,7 +522,7 @@ def GetHexVerValue(VerString):
         Minor = ValueList[1]
         if len(Minor) == 1:
             Minor += '0'
-        DeciValue = (int(Major) << 16) + int(Minor);
+        DeciValue = (int(Major) << 16) + int(Minor)
         return "0x%08x" % DeciValue
     elif gHexVerPatt.match(VerString):
         return VerString
@@ -493,7 +530,7 @@ def GetHexVerValue(VerString):
         return None
 
 
-## GetSingleValueOfKeyFromLines
+# GetSingleValueOfKeyFromLines
 #
 # Parse multiple strings as below to get value of each definition line
 # Key1 = Value1
@@ -523,12 +560,14 @@ def GetSingleValueOfKeyFromLines(Lines, Dictionary, CommentCharacter, KeySplitCh
         if Line.find(DataType.TAB_INF_DEFINES_DEFINE + ' ') > -1:
             if '' in DefineValues:
                 DefineValues.remove('')
-            DefineValues.append(GetDefineValue(Line, DataType.TAB_INF_DEFINES_DEFINE, CommentCharacter))
+            DefineValues.append(GetDefineValue(
+                Line, DataType.TAB_INF_DEFINES_DEFINE, CommentCharacter))
             continue
         if Line.find(DataType.TAB_INF_DEFINES_SPEC + ' ') > -1:
             if '' in SpecValues:
                 SpecValues.remove('')
-            SpecValues.append(GetDefineValue(Line, DataType.TAB_INF_DEFINES_SPEC, CommentCharacter))
+            SpecValues.append(GetDefineValue(
+                Line, DataType.TAB_INF_DEFINES_SPEC, CommentCharacter))
             continue
 
         #
@@ -543,9 +582,11 @@ def GetSingleValueOfKeyFromLines(Lines, Dictionary, CommentCharacter, KeySplitCh
                 #
                 LineList[1] = CleanString(LineList[1], CommentCharacter)
                 if ValueSplitFlag:
-                    Value = list(map(string.strip, LineList[1].split(ValueSplitCharacter)))
+                    Value = list(
+                        map(string.strip, LineList[1].split(ValueSplitCharacter)))
                 else:
-                    Value = CleanString(LineList[1], CommentCharacter).splitlines()
+                    Value = CleanString(
+                        LineList[1], CommentCharacter).splitlines()
 
                 if Key[0] in Dictionary:
                     if Key[0] not in Keys:
@@ -565,7 +606,7 @@ def GetSingleValueOfKeyFromLines(Lines, Dictionary, CommentCharacter, KeySplitCh
 
     return True
 
-## The content to be parsed
+# The content to be parsed
 #
 # Do pre-check for a file before it is parsed
 # Check $()
@@ -575,6 +616,8 @@ def GetSingleValueOfKeyFromLines(Lines, Dictionary, CommentCharacter, KeySplitCh
 # @param FileContent:    File content to be parsed
 # @param SupSectionTag:  Used for error report
 #
+
+
 def PreCheck(FileName, FileContent, SupSectionTag):
     LineNo = 0
     IsFailed = False
@@ -596,7 +639,8 @@ def PreCheck(FileName, FileContent, SupSectionTag):
         #
         if Line.find('$') > -1:
             if Line.find('$(') < 0 or Line.find(')') < 0:
-                EdkLogger.error("Parser", FORMAT_INVALID, Line=LineNo, File=FileName, RaiseError=EdkLogger.IsRaiseError)
+                EdkLogger.error("Parser", FORMAT_INVALID, Line=LineNo,
+                                File=FileName, RaiseError=EdkLogger.IsRaiseError)
 
         #
         # Check []
@@ -606,7 +650,8 @@ def PreCheck(FileName, FileContent, SupSectionTag):
             # Only get one '[' or one ']'
             #
             if not (Line.find('[') > -1 and Line.find(']') > -1):
-                EdkLogger.error("Parser", FORMAT_INVALID, Line=LineNo, File=FileName, RaiseError=EdkLogger.IsRaiseError)
+                EdkLogger.error("Parser", FORMAT_INVALID, Line=LineNo,
+                                File=FileName, RaiseError=EdkLogger.IsRaiseError)
 
         #
         # Regenerate FileContent
@@ -614,11 +659,12 @@ def PreCheck(FileName, FileContent, SupSectionTag):
         NewFileContent = NewFileContent + Line + '\r\n'
 
     if IsFailed:
-       EdkLogger.error("Parser", FORMAT_INVALID, Line=LineNo, File=FileName, RaiseError=EdkLogger.IsRaiseError)
+        EdkLogger.error("Parser", FORMAT_INVALID, Line=LineNo,
+                        File=FileName, RaiseError=EdkLogger.IsRaiseError)
 
     return NewFileContent
 
-## CheckFileType
+# CheckFileType
 #
 # Check if the Filename is including ExtName
 # Return True if it exists
@@ -632,20 +678,23 @@ def PreCheck(FileName, FileContent, SupSectionTag):
 #
 # @retval True The file type is correct
 #
-def CheckFileType(CheckFilename, ExtName, ContainerFilename, SectionName, Line, LineNo= -1):
+
+
+def CheckFileType(CheckFilename, ExtName, ContainerFilename, SectionName, Line, LineNo=-1):
     if CheckFilename != '' and CheckFilename is not None:
         (Root, Ext) = os.path.splitext(CheckFilename)
         if Ext.upper() != ExtName.upper():
             ContainerFile = open(ContainerFilename, 'r').read()
             if LineNo == -1:
                 LineNo = GetLineNo(ContainerFile, Line)
-            ErrorMsg = "Invalid %s. '%s' is found, but '%s' file is needed" % (SectionName, CheckFilename, ExtName)
+            ErrorMsg = "Invalid %s. '%s' is found, but '%s' file is needed" % (
+                SectionName, CheckFilename, ExtName)
             EdkLogger.error("Parser", PARSER_ERROR, ErrorMsg, Line=LineNo,
                             File=ContainerFilename, RaiseError=EdkLogger.IsRaiseError)
 
     return True
 
-## CheckFileExist
+# CheckFileExist
 #
 # Check if the file exists
 # Return True if it exists
@@ -659,7 +708,9 @@ def CheckFileType(CheckFilename, ExtName, ContainerFilename, SectionName, Line,
 #
 # @retval The file full path if the file exists
 #
-def CheckFileExist(WorkspaceDir, CheckFilename, ContainerFilename, SectionName, Line, LineNo= -1):
+
+
+def CheckFileExist(WorkspaceDir, CheckFilename, ContainerFilename, SectionName, Line, LineNo=-1):
     CheckFile = ''
     if CheckFilename != '' and CheckFilename is not None:
         CheckFile = WorkspaceFile(WorkspaceDir, CheckFilename)
@@ -667,13 +718,14 @@ def CheckFileExist(WorkspaceDir, CheckFilename, ContainerFilename, SectionName,
             ContainerFile = open(ContainerFilename, 'r').read()
             if LineNo == -1:
                 LineNo = GetLineNo(ContainerFile, Line)
-            ErrorMsg = "Can't find file '%s' defined in section '%s'" % (CheckFile, SectionName)
+            ErrorMsg = "Can't find file '%s' defined in section '%s'" % (
+                CheckFile, SectionName)
             EdkLogger.error("Parser", PARSER_ERROR, ErrorMsg,
                             File=ContainerFilename, Line=LineNo, RaiseError=EdkLogger.IsRaiseError)
 
     return CheckFile
 
-## GetLineNo
+# GetLineNo
 #
 # Find the index of a line in a file
 #
@@ -683,6 +735,8 @@ def CheckFileExist(WorkspaceDir, CheckFilename, ContainerFilename, SectionName,
 # @retval int  Index of the line
 # @retval -1     The line is not found
 #
+
+
 def GetLineNo(FileContent, Line, IsIgnoreComment=True):
     LineList = FileContent.splitlines()
     for Index in range(len(LineList)):
@@ -697,7 +751,7 @@ def GetLineNo(FileContent, Line, IsIgnoreComment=True):
 
     return -1
 
-## RaiseParserError
+# RaiseParserError
 #
 # Raise a parser error
 #
@@ -706,15 +760,19 @@ def GetLineNo(FileContent, Line, IsIgnoreComment=True):
 # @param File:     File which has the string
 # @param Format:   Correct format
 #
-def RaiseParserError(Line, Section, File, Format='', LineNo= -1):
+
+
+def RaiseParserError(Line, Section, File, Format='', LineNo=-1):
     if LineNo == -1:
         LineNo = GetLineNo(open(os.path.normpath(File), 'r').read(), Line)
-    ErrorMsg = "Invalid statement '%s' is found in section '%s'" % (Line, Section)
+    ErrorMsg = "Invalid statement '%s' is found in section '%s'" % (
+        Line, Section)
     if Format != '':
         Format = "Correct format is " + Format
-    EdkLogger.error("Parser", PARSER_ERROR, ErrorMsg, File=File, Line=LineNo, ExtraData=Format, RaiseError=EdkLogger.IsRaiseError)
+    EdkLogger.error("Parser", PARSER_ERROR, ErrorMsg, File=File,
+                    Line=LineNo, ExtraData=Format, RaiseError=EdkLogger.IsRaiseError)
 
-## WorkspaceFile
+# WorkspaceFile
 #
 # Return a full path with workspace dir
 #
@@ -723,10 +781,12 @@ def RaiseParserError(Line, Section, File, Format='', LineNo= -1):
 #
 # @retval string A full path
 #
+
+
 def WorkspaceFile(WorkspaceDir, Filename):
     return mws.join(NormPath(WorkspaceDir), NormPath(Filename))
 
-## Split string
+# Split string
 #
 # Remove '"' which startswith and endswith string
 #
@@ -734,6 +794,8 @@ def WorkspaceFile(WorkspaceDir, Filename):
 #
 # @retval String: The string after removed '""'
 #
+
+
 def SplitString(String):
     if String.startswith('\"'):
         String = String[1:]
@@ -742,27 +804,33 @@ def SplitString(String):
 
     return String
 
-## Convert To Sql String
+# Convert To Sql String
 #
 # 1. Replace "'" with "''" in each item of StringList
 #
 # @param StringList:  A list for strings to be converted
 #
+
+
 def ConvertToSqlString(StringList):
     return list(map(lambda s: s.replace("'", "''"), StringList))
 
-## Convert To Sql String
+# Convert To Sql String
 #
 # 1. Replace "'" with "''" in the String
 #
 # @param String:  A String to be converted
 #
+
+
 def ConvertToSqlString2(String):
     return String.replace("'", "''")
 
 #
 # Remove comment block
 #
+
+
 def RemoveBlockComment(Lines):
     IsFindBlockComment = False
     IsFindBlockCode = False
@@ -775,10 +843,12 @@ def RemoveBlockComment(Lines):
         # Remove comment block
         #
         if Line.find(DataType.TAB_COMMENT_EDK_START) > -1:
-            ReservedLine = GetSplitList(Line, DataType.TAB_COMMENT_EDK_START, 1)[0]
+            ReservedLine = GetSplitList(
+                Line, DataType.TAB_COMMENT_EDK_START, 1)[0]
             IsFindBlockComment = True
         if Line.find(DataType.TAB_COMMENT_EDK_END) > -1:
-            Line = ReservedLine + GetSplitList(Line, DataType.TAB_COMMENT_EDK_END, 1)[1]
+            Line = ReservedLine + \
+                GetSplitList(Line, DataType.TAB_COMMENT_EDK_END, 1)[1]
             ReservedLine = ''
             IsFindBlockComment = False
         if IsFindBlockComment:
@@ -791,6 +861,8 @@ def RemoveBlockComment(Lines):
 #
 # Get String of a List
 #
+
+
 def GetStringOfList(List, Split=' '):
     if not isinstance(List, type([])):
         return List
@@ -803,16 +875,20 @@ def GetStringOfList(List, Split=' '):
 #
 # Get HelpTextList from HelpTextClassList
 #
+
+
 def GetHelpTextList(HelpTextClassList):
     List = []
     if HelpTextClassList:
         for HelpText in HelpTextClassList:
             if HelpText.String.endswith('\n'):
-                HelpText.String = HelpText.String[0: len(HelpText.String) - len('\n')]
+                HelpText.String = HelpText.String[0: len(
+                    HelpText.String) - len('\n')]
                 List.extend(HelpText.String.split('\n'))
 
     return List
 
+
 def StringToArray(String):
     if String.startswith('L"'):
         if String == "L\"\"":
@@ -836,6 +912,7 @@ def StringToArray(String):
         else:
             return '{%s,0,0}' % ','.join(String.split())
 
+
 def StringArrayLength(String):
     if String.startswith('L"'):
         return (len(String) - 3 + 1) * 2
@@ -844,6 +921,7 @@ def StringArrayLength(String):
     else:
         return len(String.split()) + 1
 
+
 def RemoveDupOption(OptionString, Which="/I", Against=None):
     OptionList = OptionString.split()
     ValueList = []
@@ -863,6 +941,7 @@ def RemoveDupOption(OptionString, Which="/I", Against=None):
             ValueList.append(Val)
     return " ".join(OptionList)
 
+
 ##
 #
 # This acts like the main() function for the script, unless it is 'import'ed into another
@@ -870,4 +949,3 @@ def RemoveDupOption(OptionString, Which="/I", Against=None):
 #
 if __name__ == '__main__':
     pass
-
diff --git a/BaseTools/Source/Python/Common/TargetTxtClassObject.py b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
index 363c38302b8e..77f3098f801e 100644
--- a/BaseTools/Source/Python/Common/TargetTxtClassObject.py
+++ b/BaseTools/Source/Python/Common/TargetTxtClassObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define each component of Target.txt file
 #
 # Copyright (c) 2007 - 2014, Intel Corporation. All rights reserved.<BR>
@@ -22,7 +22,7 @@ from Common.MultipleWorkspace import MultipleWorkspace as mws
 
 gDefaultTargetTxtFile = "target.txt"
 
-## TargetTxtClassObject
+# TargetTxtClassObject
 #
 # This class defined content used in file target.txt
 #
@@ -31,23 +31,25 @@ gDefaultTargetTxtFile = "target.txt"
 #
 # @var TargetTxtDictionary:  To store keys and values defined in target.txt
 #
+
+
 class TargetTxtClassObject(object):
-    def __init__(self, Filename = None):
+    def __init__(self, Filename=None):
         self.TargetTxtDictionary = {
-            DataType.TAB_TAT_DEFINES_ACTIVE_PLATFORM                            : '',
-            DataType.TAB_TAT_DEFINES_ACTIVE_MODULE                              : '',
-            DataType.TAB_TAT_DEFINES_TOOL_CHAIN_CONF                            : '',
-            DataType.TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER               : '',
-            DataType.TAB_TAT_DEFINES_TARGET                                     : [],
-            DataType.TAB_TAT_DEFINES_TOOL_CHAIN_TAG                             : [],
-            DataType.TAB_TAT_DEFINES_TARGET_ARCH                                : [],
-            DataType.TAB_TAT_DEFINES_BUILD_RULE_CONF                            : '',
+            DataType.TAB_TAT_DEFINES_ACTIVE_PLATFORM: '',
+            DataType.TAB_TAT_DEFINES_ACTIVE_MODULE: '',
+            DataType.TAB_TAT_DEFINES_TOOL_CHAIN_CONF: '',
+            DataType.TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER: '',
+            DataType.TAB_TAT_DEFINES_TARGET: [],
+            DataType.TAB_TAT_DEFINES_TOOL_CHAIN_TAG: [],
+            DataType.TAB_TAT_DEFINES_TARGET_ARCH: [],
+            DataType.TAB_TAT_DEFINES_BUILD_RULE_CONF: '',
         }
         self.ConfDirectoryPath = ""
         if Filename is not None:
             self.LoadTargetTxtFile(Filename)
 
-    ## LoadTargetTxtFile
+    # LoadTargetTxtFile
     #
     # Load target.txt file and parse it, return a set structure to store keys and values
     #
@@ -58,12 +60,13 @@ class TargetTxtClassObject(object):
     #
     def LoadTargetTxtFile(self, Filename):
         if os.path.exists(Filename) and os.path.isfile(Filename):
-             return self.ConvertTextFileToDict(Filename, '#', '=')
+            return self.ConvertTextFileToDict(Filename, '#', '=')
         else:
-            EdkLogger.error("Target.txt Parser", FILE_NOT_FOUND, ExtraData=Filename)
+            EdkLogger.error("Target.txt Parser",
+                            FILE_NOT_FOUND, ExtraData=Filename)
             return 1
 
-    ## ConvertTextFileToDict
+    # ConvertTextFileToDict
     #
     # Convert a text file to a dictionary of (name:value) pairs.
     # The data is saved to self.TargetTxtDictionary
@@ -97,30 +100,36 @@ class TargetTxtClassObject(object):
             else:
                 Value = ""
 
-            if Key in [DataType.TAB_TAT_DEFINES_ACTIVE_PLATFORM, DataType.TAB_TAT_DEFINES_TOOL_CHAIN_CONF, \
+            if Key in [DataType.TAB_TAT_DEFINES_ACTIVE_PLATFORM, DataType.TAB_TAT_DEFINES_TOOL_CHAIN_CONF,
                        DataType.TAB_TAT_DEFINES_ACTIVE_MODULE, DataType.TAB_TAT_DEFINES_BUILD_RULE_CONF]:
                 self.TargetTxtDictionary[Key] = Value.replace('\\', '/')
                 if Key == DataType.TAB_TAT_DEFINES_TOOL_CHAIN_CONF and self.TargetTxtDictionary[Key]:
                     if self.TargetTxtDictionary[Key].startswith("Conf/"):
-                        Tools_Def = os.path.join(self.ConfDirectoryPath, self.TargetTxtDictionary[Key].strip())
+                        Tools_Def = os.path.join(
+                            self.ConfDirectoryPath, self.TargetTxtDictionary[Key].strip())
                         if not os.path.exists(Tools_Def) or not os.path.isfile(Tools_Def):
                             # If Conf/Conf does not exist, try just the Conf/ directory
-                            Tools_Def = os.path.join(self.ConfDirectoryPath, self.TargetTxtDictionary[Key].replace("Conf/", "", 1).strip())
+                            Tools_Def = os.path.join(
+                                self.ConfDirectoryPath, self.TargetTxtDictionary[Key].replace("Conf/", "", 1).strip())
                     else:
                         # The File pointed to by TOOL_CHAIN_CONF is not in a Conf/ directory
-                        Tools_Def = os.path.join(self.ConfDirectoryPath, self.TargetTxtDictionary[Key].strip())
+                        Tools_Def = os.path.join(
+                            self.ConfDirectoryPath, self.TargetTxtDictionary[Key].strip())
                     self.TargetTxtDictionary[Key] = Tools_Def
                 if Key == DataType.TAB_TAT_DEFINES_BUILD_RULE_CONF and self.TargetTxtDictionary[Key]:
                     if self.TargetTxtDictionary[Key].startswith("Conf/"):
-                        Build_Rule = os.path.join(self.ConfDirectoryPath, self.TargetTxtDictionary[Key].strip())
+                        Build_Rule = os.path.join(
+                            self.ConfDirectoryPath, self.TargetTxtDictionary[Key].strip())
                         if not os.path.exists(Build_Rule) or not os.path.isfile(Build_Rule):
                             # If Conf/Conf does not exist, try just the Conf/ directory
-                            Build_Rule = os.path.join(self.ConfDirectoryPath, self.TargetTxtDictionary[Key].replace("Conf/", "", 1).strip())
+                            Build_Rule = os.path.join(
+                                self.ConfDirectoryPath, self.TargetTxtDictionary[Key].replace("Conf/", "", 1).strip())
                     else:
                         # The File pointed to by BUILD_RULE_CONF is not in a Conf/ directory
-                        Build_Rule = os.path.join(self.ConfDirectoryPath, self.TargetTxtDictionary[Key].strip())
+                        Build_Rule = os.path.join(
+                            self.ConfDirectoryPath, self.TargetTxtDictionary[Key].strip())
                     self.TargetTxtDictionary[Key] = Build_Rule
-            elif Key in [DataType.TAB_TAT_DEFINES_TARGET, DataType.TAB_TAT_DEFINES_TARGET_ARCH, \
+            elif Key in [DataType.TAB_TAT_DEFINES_TARGET, DataType.TAB_TAT_DEFINES_TARGET_ARCH,
                          DataType.TAB_TAT_DEFINES_TOOL_CHAIN_TAG]:
                 self.TargetTxtDictionary[Key] = Value.split()
             elif Key == DataType.TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER:
@@ -130,13 +139,13 @@ class TargetTxtClassObject(object):
                     EdkLogger.error("build", FORMAT_INVALID, "Invalid number of [%s]: %s." % (Key, Value),
                                     File=FileName)
                 self.TargetTxtDictionary[Key] = Value
-            #elif Key not in GlobalData.gGlobalDefines:
+            # elif Key not in GlobalData.gGlobalDefines:
             #    GlobalData.gGlobalDefines[Key] = Value
 
         F.close()
         return 0
 
-## TargetTxtDict
+# TargetTxtDict
 #
 # Load target.txt in input Conf dir
 #
@@ -145,6 +154,7 @@ class TargetTxtClassObject(object):
 # @retval Target An instance of TargetTxtClassObject() with loaded target.txt
 #
 
+
 class TargetTxtDict():
 
     def __new__(cls, *args, **kw):
@@ -173,19 +183,23 @@ class TargetTxtDict():
             if not os.path.isabs(ConfDirectoryPath):
                 # Since alternate directory name is not absolute, the alternate directory is located within the WORKSPACE
                 # This also handles someone specifying the Conf directory in the workspace. Using --conf=Conf
-                ConfDirectoryPath = mws.join(os.environ["WORKSPACE"], ConfDirectoryPath)
+                ConfDirectoryPath = mws.join(
+                    os.environ["WORKSPACE"], ConfDirectoryPath)
         else:
             if "CONF_PATH" in os.environ:
-                ConfDirectoryPath = os.path.normcase(os.path.normpath(os.environ["CONF_PATH"]))
+                ConfDirectoryPath = os.path.normcase(
+                    os.path.normpath(os.environ["CONF_PATH"]))
             else:
                 # Get standard WORKSPACE/Conf use the absolute path to the WORKSPACE/Conf
                 ConfDirectoryPath = mws.join(os.environ["WORKSPACE"], 'Conf')
         GlobalData.gConfDirectory = ConfDirectoryPath
-        targettxt = os.path.normpath(os.path.join(ConfDirectoryPath, gDefaultTargetTxtFile))
+        targettxt = os.path.normpath(os.path.join(
+            ConfDirectoryPath, gDefaultTargetTxtFile))
         if os.path.exists(targettxt):
             Target.LoadTargetTxtFile(targettxt)
         self.TxtTarget = Target
 
+
 ##
 #
 # This acts like the main() function for the script, unless it is 'import'ed into another
@@ -194,6 +208,7 @@ class TargetTxtDict():
 if __name__ == '__main__':
     pass
     Target = TargetTxtDict(os.getenv("WORKSPACE"))
-    print(Target.TargetTxtDictionary[DataType.TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER])
+    print(
+        Target.TargetTxtDictionary[DataType.TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER])
     print(Target.TargetTxtDictionary[DataType.TAB_TAT_DEFINES_TARGET])
     print(Target.TargetTxtDictionary)
diff --git a/BaseTools/Source/Python/Common/ToolDefClassObject.py b/BaseTools/Source/Python/Common/ToolDefClassObject.py
index 2b4b23849196..bad836722407 100644
--- a/BaseTools/Source/Python/Common/ToolDefClassObject.py
+++ b/BaseTools/Source/Python/Common/ToolDefClassObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define each component of tools_def.txt file
 #
 # Copyright (c) 2007 - 2021, Intel Corporation. All rights reserved.<BR>
@@ -22,9 +22,8 @@ import Common.GlobalData as GlobalData
 from Common import GlobalData
 from Common.MultipleWorkspace import MultipleWorkspace as mws
 from .DataType import TAB_TOD_DEFINES_TARGET, TAB_TOD_DEFINES_TOOL_CHAIN_TAG,\
-                     TAB_TOD_DEFINES_TARGET_ARCH, TAB_TOD_DEFINES_COMMAND_TYPE\
-                     , TAB_TOD_DEFINES_FAMILY, TAB_TOD_DEFINES_BUILDRULEFAMILY,\
-                     TAB_STAR, TAB_TAT_DEFINES_TOOL_CHAIN_CONF
+    TAB_TOD_DEFINES_TARGET_ARCH, TAB_TOD_DEFINES_COMMAND_TYPE, TAB_TOD_DEFINES_FAMILY, TAB_TOD_DEFINES_BUILDRULEFAMILY,\
+    TAB_STAR, TAB_TAT_DEFINES_TOOL_CHAIN_CONF
 
 
 ##
@@ -35,7 +34,7 @@ gEnvRefPattern = re.compile('(ENV\([^\(\)]+\))')
 gMacroDefPattern = re.compile("DEFINE\s+([^\s]+)")
 gDefaultToolsDefFile = "tools_def.txt"
 
-## ToolDefClassObject
+# ToolDefClassObject
 #
 # This class defined content used in file tools_def.txt
 #
@@ -45,6 +44,8 @@ gDefaultToolsDefFile = "tools_def.txt"
 # @var ToolsDefTxtDictionary:  To store keys and values defined in target.txt
 # @var MacroDictionary:        To store keys and values defined in DEFINE statement
 #
+
+
 class ToolDefClassObject(object):
     def __init__(self, FileName=None):
         self.ToolsDefTxtDictionary = {}
@@ -55,7 +56,7 @@ class ToolDefClassObject(object):
         if FileName is not None:
             self.LoadToolDefFile(FileName)
 
-    ## LoadToolDefFile
+    # LoadToolDefFile
     #
     # Load target.txt file and parse it
     #
@@ -67,26 +68,30 @@ class ToolDefClassObject(object):
         mws.setWs(GlobalData.gWorkspace, PackagesPath)
 
         self.ToolsDefTxtDatabase = {
-            TAB_TOD_DEFINES_TARGET          :   [],
-            TAB_TOD_DEFINES_TOOL_CHAIN_TAG  :   [],
-            TAB_TOD_DEFINES_TARGET_ARCH     :   [],
-            TAB_TOD_DEFINES_COMMAND_TYPE    :   []
+            TAB_TOD_DEFINES_TARGET:   [],
+            TAB_TOD_DEFINES_TOOL_CHAIN_TAG:   [],
+            TAB_TOD_DEFINES_TARGET_ARCH:   [],
+            TAB_TOD_DEFINES_COMMAND_TYPE:   []
         }
 
         self.IncludeToolDefFile(FileName)
 
-        self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_TARGET] = list(set(self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_TARGET]))
-        self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_TOOL_CHAIN_TAG] = list(set(self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_TOOL_CHAIN_TAG]))
-        self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_TARGET_ARCH] = list(set(self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_TARGET_ARCH]))
+        self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_TARGET] = list(
+            set(self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_TARGET]))
+        self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_TOOL_CHAIN_TAG] = list(
+            set(self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_TOOL_CHAIN_TAG]))
+        self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_TARGET_ARCH] = list(
+            set(self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_TARGET_ARCH]))
 
-        self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_COMMAND_TYPE] = list(set(self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_COMMAND_TYPE]))
+        self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_COMMAND_TYPE] = list(
+            set(self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_COMMAND_TYPE]))
 
         self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_TARGET].sort()
         self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_TOOL_CHAIN_TAG].sort()
         self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_TARGET_ARCH].sort()
         self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_COMMAND_TYPE].sort()
 
-    ## IncludeToolDefFile
+    # IncludeToolDefFile
     #
     # Load target.txt file and parse it as if its contents were inside the main file
     #
@@ -99,9 +104,11 @@ class ToolDefClassObject(object):
                 F = open(FileName, 'r')
                 FileContent = F.readlines()
             except:
-                EdkLogger.error("tools_def.txt parser", FILE_OPEN_FAILURE, ExtraData=FileName)
+                EdkLogger.error("tools_def.txt parser",
+                                FILE_OPEN_FAILURE, ExtraData=FileName)
         else:
-            EdkLogger.error("tools_def.txt parser", FILE_NOT_FOUND, ExtraData=FileName)
+            EdkLogger.error("tools_def.txt parser",
+                            FILE_NOT_FOUND, ExtraData=FileName)
 
         for Index in range(len(FileContent)):
             Line = FileContent[Index].strip()
@@ -114,7 +121,7 @@ class ToolDefClassObject(object):
                 if not Done:
                     EdkLogger.error("tools_def.txt parser", ATTRIBUTE_NOT_AVAILABLE,
                                     "Macro or Environment has not been defined",
-                                ExtraData=IncFile[4:-1], File=FileName, Line=Index+1)
+                                    ExtraData=IncFile[4:-1], File=FileName, Line=Index+1)
                 IncFile = NormPath(IncFile)
 
                 if not os.path.isabs(IncFile):
@@ -132,10 +139,12 @@ class ToolDefClassObject(object):
                             #
                             # try directory of current file
                             #
-                            IncFileTmp = PathClass(IncFile, os.path.dirname(FileName))
+                            IncFileTmp = PathClass(
+                                IncFile, os.path.dirname(FileName))
                             ErrorCode = IncFileTmp.Validate()[0]
                             if ErrorCode != 0:
-                                EdkLogger.error("tools_def.txt parser", FILE_NOT_FOUND, ExtraData=IncFile)
+                                EdkLogger.error(
+                                    "tools_def.txt parser", FILE_NOT_FOUND, ExtraData=IncFile)
 
                     if isinstance(IncFileTmp, PathClass):
                         IncFile = IncFileTmp.Path
@@ -147,14 +156,16 @@ class ToolDefClassObject(object):
 
             NameValuePair = Line.split("=", 1)
             if len(NameValuePair) != 2:
-                EdkLogger.warn("tools_def.txt parser", "Line %d: not correct assignment statement, skipped" % (Index + 1))
+                EdkLogger.warn(
+                    "tools_def.txt parser", "Line %d: not correct assignment statement, skipped" % (Index + 1))
                 continue
 
             Name = NameValuePair[0].strip()
             Value = NameValuePair[1].strip()
 
             if Name == "IDENTIFIER":
-                EdkLogger.debug(EdkLogger.DEBUG_8, "Line %d: Found identifier statement, skipped: %s" % ((Index + 1), Value))
+                EdkLogger.debug(EdkLogger.DEBUG_8, "Line %d: Found identifier statement, skipped: %s" % (
+                    (Index + 1), Value))
                 continue
 
             MacroDefinition = gMacroDefPattern.findall(Name)
@@ -163,11 +174,12 @@ class ToolDefClassObject(object):
                 if not Done:
                     EdkLogger.error("tools_def.txt parser", ATTRIBUTE_NOT_AVAILABLE,
                                     "Macro or Environment has not been defined",
-                                ExtraData=Value[4:-1], File=FileName, Line=Index+1)
+                                    ExtraData=Value[4:-1], File=FileName, Line=Index+1)
 
                 MacroName = MacroDefinition[0].strip()
                 self.MacroDictionary["DEF(%s)" % MacroName] = Value
-                EdkLogger.debug(EdkLogger.DEBUG_8, "Line %d: Found macro: %s = %s" % ((Index + 1), MacroName, Value))
+                EdkLogger.debug(EdkLogger.DEBUG_8, "Line %d: Found macro: %s = %s" % (
+                    (Index + 1), MacroName, Value))
                 continue
 
             Done, Value = self.ExpandMacros(Value)
@@ -178,10 +190,12 @@ class ToolDefClassObject(object):
 
             List = Name.split('_')
             if len(List) != 5:
-                EdkLogger.verbose("Line %d: Not a valid name of definition: %s" % ((Index + 1), Name))
+                EdkLogger.verbose(
+                    "Line %d: Not a valid name of definition: %s" % ((Index + 1), Name))
                 continue
             elif List[4] == TAB_STAR:
-                EdkLogger.verbose("Line %d: '*' is not allowed in last field: %s" % ((Index + 1), Name))
+                EdkLogger.verbose(
+                    "Line %d: '*' is not allowed in last field: %s" % ((Index + 1), Name))
                 continue
             else:
                 self.ToolsDefTxtDictionary[Name] = Value
@@ -197,20 +211,23 @@ class ToolDefClassObject(object):
                     if TAB_TOD_DEFINES_FAMILY not in self.ToolsDefTxtDatabase:
                         self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_FAMILY] = {}
                         self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_FAMILY][List[1]] = Value
-                        self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_BUILDRULEFAMILY] = {}
+                        self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_BUILDRULEFAMILY] = {
+                        }
                         self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_BUILDRULEFAMILY][List[1]] = Value
                     elif List[1] not in self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_FAMILY]:
                         self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_FAMILY][List[1]] = Value
                         self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_BUILDRULEFAMILY][List[1]] = Value
                     elif self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_FAMILY][List[1]] != Value:
-                        EdkLogger.verbose("Line %d: No override allowed for the family of a tool chain: %s" % ((Index + 1), Name))
+                        EdkLogger.verbose(
+                            "Line %d: No override allowed for the family of a tool chain: %s" % ((Index + 1), Name))
                 if List[4] == TAB_TOD_DEFINES_BUILDRULEFAMILY and List[2] == TAB_STAR and List[3] == TAB_STAR:
                     if TAB_TOD_DEFINES_BUILDRULEFAMILY not in self.ToolsDefTxtDatabase \
                        or List[1] not in self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_FAMILY]:
-                        EdkLogger.verbose("Line %d: The family is not specified, but BuildRuleFamily is specified for the tool chain: %s" % ((Index + 1), Name))
+                        EdkLogger.verbose(
+                            "Line %d: The family is not specified, but BuildRuleFamily is specified for the tool chain: %s" % ((Index + 1), Name))
                     self.ToolsDefTxtDatabase[TAB_TOD_DEFINES_BUILDRULEFAMILY][List[1]] = Value
 
-    ## ExpandMacros
+    # ExpandMacros
     #
     # Replace defined macros with real value
     #
@@ -228,7 +245,8 @@ class ToolDefClassObject(object):
                 if Ref in self.MacroDictionary:
                     Value = Value.replace(Ref, self.MacroDictionary[Ref])
                 else:
-                    Value = Value.replace(Ref, self.MacroDictionary[Ref.upper()])
+                    Value = Value.replace(
+                        Ref, self.MacroDictionary[Ref.upper()])
         MacroReference = gMacroRefPattern.findall(Value)
         for Ref in MacroReference:
             if Ref not in self.MacroDictionary:
@@ -237,7 +255,7 @@ class ToolDefClassObject(object):
 
         return True, Value
 
-## ToolDefDict
+# ToolDefDict
 #
 # Load tools_def.txt in input Conf dir
 #
@@ -275,11 +293,14 @@ class ToolDefDict():
             if ToolsDefFile:
                 ToolDef.LoadToolDefFile(os.path.normpath(ToolsDefFile))
             else:
-                ToolDef.LoadToolDefFile(os.path.normpath(os.path.join(self.ConfDir, gDefaultToolsDefFile)))
+                ToolDef.LoadToolDefFile(os.path.normpath(
+                    os.path.join(self.ConfDir, gDefaultToolsDefFile)))
         else:
-            ToolDef.LoadToolDefFile(os.path.normpath(os.path.join(self.ConfDir, gDefaultToolsDefFile)))
+            ToolDef.LoadToolDefFile(os.path.normpath(
+                os.path.join(self.ConfDir, gDefaultToolsDefFile)))
         self._ToolDef = ToolDef
 
+
 ##
 #
 # This acts like the main() function for the script, unless it is 'import'ed into another
diff --git a/BaseTools/Source/Python/Common/Uefi/Capsule/CapsuleDependency.py b/BaseTools/Source/Python/Common/Uefi/Capsule/CapsuleDependency.py
index 74004857a737..c132a073d093 100644
--- a/BaseTools/Source/Python/Common/Uefi/Capsule/CapsuleDependency.py
+++ b/BaseTools/Source/Python/Common/Uefi/Capsule/CapsuleDependency.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Module that encodes and decodes a capsule dependency.
 #
 # Copyright (c) 2019, Intel Corporation. All rights reserved.<BR>
@@ -14,95 +14,103 @@ import re
 CapsuleDependency
 '''
 
+
 class OpConvert (object):
-    def __init__ (self):
+    def __init__(self):
         # Opcode: (OperandSize, PackSize, PackFmt, EncodeConvert, DecodeConvert)
         self._DepexOperations = {0x00:    (16, 16, 's', self.Str2Guid, self.Guid2Str),
                                  0x01:    (4,  1,  'I', self.Str2Uint, self.Uint2Str),
                                  0x02:    (1,  0,  's', self.Str2Utf8, self.Byte2Str),
                                  }
 
-    def Str2Uint (self, Data):
+    def Str2Uint(self, Data):
         try:
-            Value = int (Data, 16)
+            Value = int(Data, 16)
         except:
-            Message = '{Data} is not a valid integer value.'.format (Data = Data)
-            raise ValueError (Message)
+            Message = '{Data} is not a valid integer value.'.format(Data=Data)
+            raise ValueError(Message)
         if Value < 0 or Value > 0xFFFFFFFF:
-            Message = '{Data} is not an UINT32.'.format (Data = Data)
-            raise ValueError (Message)
+            Message = '{Data} is not an UINT32.'.format(Data=Data)
+            raise ValueError(Message)
         return Value
 
-    def Uint2Str (self, Data):
+    def Uint2Str(self, Data):
         if Data < 0 or Data > 0xFFFFFFFF:
-            Message = '{Data} is not an UINT32.'.format (Data = Data)
-            raise ValueError (Message)
-        return "0x{Data:08x}".format (Data = Data)
+            Message = '{Data} is not an UINT32.'.format(Data=Data)
+            raise ValueError(Message)
+        return "0x{Data:08x}".format(Data=Data)
 
-    def Str2Guid (self, Data):
+    def Str2Guid(self, Data):
         try:
-            Guid = uuid.UUID (Data)
+            Guid = uuid.UUID(Data)
         except:
-            Message = '{Data} is not a valid registry format GUID value.'.format (Data = Data)
-            raise ValueError (Message)
+            Message = '{Data} is not a valid registry format GUID value.'.format(
+                Data=Data)
+            raise ValueError(Message)
         return Guid.bytes_le
 
-    def Guid2Str (self, Data):
+    def Guid2Str(self, Data):
         try:
-            Guid = uuid.UUID (bytes_le = Data)
+            Guid = uuid.UUID(bytes_le=Data)
         except:
-            Message = '{Data} is not a valid binary format GUID value.'.format (Data = Data)
-            raise ValueError (Message)
-        return str (Guid).upper ()
+            Message = '{Data} is not a valid binary format GUID value.'.format(
+                Data=Data)
+            raise ValueError(Message)
+        return str(Guid).upper()
 
-    def Str2Utf8 (self, Data):
-        if isinstance (Data, str):
-            return Data.encode ('utf-8')
+    def Str2Utf8(self, Data):
+        if isinstance(Data, str):
+            return Data.encode('utf-8')
         else:
-            Message = '{Data} is not a valid string.'.format (Data = Data)
-            raise ValueError (Message)
+            Message = '{Data} is not a valid string.'.format(Data=Data)
+            raise ValueError(Message)
 
-    def Byte2Str (self, Data):
-        if isinstance (Data, bytes):
+    def Byte2Str(self, Data):
+        if isinstance(Data, bytes):
             if Data[-1:] == b'\x00':
-                return str (Data[:-1], 'utf-8')
+                return str(Data[:-1], 'utf-8')
             else:
-                return str (Data, 'utf-8')
+                return str(Data, 'utf-8')
         else:
-            Message = '{Data} is not a valid binary string.'.format (Data = Data)
-            raise ValueError (Message)
+            Message = '{Data} is not a valid binary string.'.format(Data=Data)
+            raise ValueError(Message)
 
-    def OpEncode (self, Opcode, Operand = None):
-        BinTemp = struct.pack ('<b', Opcode)
+    def OpEncode(self, Opcode, Operand=None):
+        BinTemp = struct.pack('<b', Opcode)
         if Opcode <= 0x02 and Operand != None:
-            OperandSize, PackSize, PackFmt, EncodeConvert, DecodeConvert = self._DepexOperations[Opcode]
-            Value = EncodeConvert (Operand)
+            OperandSize, PackSize, PackFmt, EncodeConvert, DecodeConvert = self._DepexOperations[
+                Opcode]
+            Value = EncodeConvert(Operand)
             if Opcode == 0x02:
-                PackSize = len (Value) + 1
-            BinTemp += struct.pack ('<{PackSize}{PackFmt}'.format (PackSize = PackSize, PackFmt = PackFmt), Value)
+                PackSize = len(Value) + 1
+            BinTemp += struct.pack('<{PackSize}{PackFmt}'.format(
+                PackSize=PackSize, PackFmt=PackFmt), Value)
         return BinTemp
 
-    def OpDecode (self, Buffer):
-        Opcode = struct.unpack ('<b', Buffer[0:1])[0]
+    def OpDecode(self, Buffer):
+        Opcode = struct.unpack('<b', Buffer[0:1])[0]
         if Opcode <= 0x02:
-            OperandSize, PackSize, PackFmt, EncodeConvert, DecodeConvert = self._DepexOperations[Opcode]
+            OperandSize, PackSize, PackFmt, EncodeConvert, DecodeConvert = self._DepexOperations[
+                Opcode]
             if Opcode == 0x02:
                 try:
-                    PackSize = Buffer[1:].index (b'\x00') + 1
+                    PackSize = Buffer[1:].index(b'\x00') + 1
                     OperandSize = PackSize
                 except:
                     Message = 'CapsuleDependency: OpConvert: error: decode failed with wrong opcode/string.'
-                    raise ValueError (Message)
+                    raise ValueError(Message)
             try:
-                Operand = DecodeConvert (struct.unpack ('<{PackSize}{PackFmt}'.format (PackSize = PackSize, PackFmt = PackFmt), Buffer[1:1+OperandSize])[0])
+                Operand = DecodeConvert(struct.unpack('<{PackSize}{PackFmt}'.format(
+                    PackSize=PackSize, PackFmt=PackFmt), Buffer[1:1+OperandSize])[0])
             except:
                 Message = 'CapsuleDependency: OpConvert: error: decode failed with unpack failure.'
-                raise ValueError (Message)
+                raise ValueError(Message)
         else:
             Operand = None
             OperandSize = 0
         return (Opcode, Operand, OperandSize)
 
+
 class CapsuleDependencyClass (object):
     # //**************************************************************
     # // Image Attribute - Dependency
@@ -122,147 +130,158 @@ class CapsuleDependencyClass (object):
                     '<=':  [4, 0x0C, 2],
                     }
 
-    def __init__ (self):
-        self.Payload              = b''
-        self._DepexExp            = None
-        self._DepexList           = []
-        self._DepexDump           = []
-        self.Depex                = b''
-        self._Valid               = False
-        self._DepexSize           = 0
-        self._opReferenceReverse  = {v[1] : k for k, v in self._opReference.items ()}
-        self.OpConverter          = OpConvert ()
+    def __init__(self):
+        self.Payload = b''
+        self._DepexExp = None
+        self._DepexList = []
+        self._DepexDump = []
+        self.Depex = b''
+        self._Valid = False
+        self._DepexSize = 0
+        self._opReferenceReverse = {
+            v[1]: k for k, v in self._opReference.items()}
+        self.OpConverter = OpConvert()
 
     @property
-    def DepexExp (self):
+    def DepexExp(self):
         return self._DepexExp
 
     @DepexExp.setter
-    def DepexExp (self, DepexExp = ''):
-        if isinstance (DepexExp, str):
-            DepexExp = re.sub (r'\n',r' ',DepexExp)
-            DepexExp = re.sub (r'\(',r' ( ',DepexExp)
-            DepexExp = re.sub (r'\)',r' ) ',DepexExp)
-            DepexExp = re.sub (r'~',r' ~ ',DepexExp)
-            self._DepexList = re.findall(r"[^\s\"\']+|\"[^\"]*\"|\'[^\']*\'",DepexExp)
-            self._DepexExp  = " ".join(self._DepexList)
+    def DepexExp(self, DepexExp=''):
+        if isinstance(DepexExp, str):
+            DepexExp = re.sub(r'\n', r' ', DepexExp)
+            DepexExp = re.sub(r'\(', r' ( ', DepexExp)
+            DepexExp = re.sub(r'\)', r' ) ', DepexExp)
+            DepexExp = re.sub(r'~', r' ~ ', DepexExp)
+            self._DepexList = re.findall(
+                r"[^\s\"\']+|\"[^\"]*\"|\'[^\']*\'", DepexExp)
+            self._DepexExp = " ".join(self._DepexList)
 
         else:
             Msg = 'Input Depex Expression is not valid string.'
-            raise ValueError (Msg)
+            raise ValueError(Msg)
 
-    def IsValidOperator (self, op):
-        return op in self._opReference.keys ()
+    def IsValidOperator(self, op):
+        return op in self._opReference.keys()
 
-    def IsValidUnaryOperator (self, op):
-        return op in self._opReference.keys () and self._opReference[op][2] == 1
+    def IsValidUnaryOperator(self, op):
+        return op in self._opReference.keys() and self._opReference[op][2] == 1
 
-    def IsValidBinocularOperator (self, op):
-        return op in self._opReference.keys () and self._opReference[op][2] == 2
+    def IsValidBinocularOperator(self, op):
+        return op in self._opReference.keys() and self._opReference[op][2] == 2
 
-    def IsValidGuid (self, operand):
+    def IsValidGuid(self, operand):
         try:
-            uuid.UUID (operand)
+            uuid.UUID(operand)
         except:
             return False
         return True
 
-    def IsValidVersion (self, operand):
+    def IsValidVersion(self, operand):
         try:
-            Value = int (operand, 16)
+            Value = int(operand, 16)
             if Value < 0 or Value > 0xFFFFFFFF:
                 return False
         except:
             return False
         return True
 
-    def IsValidBoolean (self, operand):
+    def IsValidBoolean(self, operand):
         try:
-            return operand.upper () in ['TRUE', 'FALSE']
+            return operand.upper() in ['TRUE', 'FALSE']
         except:
             return False
 
-    def IsValidOperand (self, operand):
-        return self.IsValidVersion (operand) or self.IsValidGuid (operand) or self.IsValidBoolean (operand)
+    def IsValidOperand(self, operand):
+        return self.IsValidVersion(operand) or self.IsValidGuid(operand) or self.IsValidBoolean(operand)
 
-    def IsValidString (self, operand):
+    def IsValidString(self, operand):
         return operand[0] == "\"" and operand[-1] == "\"" and len(operand) >= 2
 
     # Check if priority of current operater is greater than pervious op
-    def PriorityNotGreater (self, prevOp, currOp):
+    def PriorityNotGreater(self, prevOp, currOp):
         return self._opReference[currOp][0] <= self._opReference[prevOp][0]
 
-    def ValidateDepex (self):
+    def ValidateDepex(self):
         OpList = self._DepexList
 
         i = 0
-        while i < len (OpList):
+        while i < len(OpList):
             Op = OpList[i]
 
             if Op == 'DECLARE':
                 i += 1
-                if i >= len (OpList):
-                    Msg = 'No more Operand after {Op}.'.format (Op = OpList[i-1])
-                    raise IndexError (Msg)
+                if i >= len(OpList):
+                    Msg = 'No more Operand after {Op}.'.format(Op=OpList[i-1])
+                    raise IndexError(Msg)
                 # Check valid string
                 if not self.IsValidString(OpList[i]):
-                    Msg = '{Operand} after {Op} is not a valid expression input.'.format (Operand = OpList[i], Op = OpList[i-1])
-                    raise ValueError (Msg)
+                    Msg = '{Operand} after {Op} is not a valid expression input.'.format(
+                        Operand=OpList[i], Op=OpList[i-1])
+                    raise ValueError(Msg)
 
             elif Op == '(':
                 # Expression cannot end with (
-                if i == len (OpList) - 1:
+                if i == len(OpList) - 1:
                     Msg = 'Expression cannot end with \'(\''
-                    raise ValueError (Msg)
+                    raise ValueError(Msg)
                 # The previous op after '(' cannot be a binocular operator
-                if self.IsValidBinocularOperator (OpList[i+1]) :
-                    Msg = '{Op} after \'(\' is not a valid expression input.'.format (Op = OpList[i+1])
-                    raise ValueError (Msg)
+                if self.IsValidBinocularOperator(OpList[i+1]):
+                    Msg = '{Op} after \'(\' is not a valid expression input.'.format(
+                        Op=OpList[i+1])
+                    raise ValueError(Msg)
 
             elif Op == ')':
                 # Expression cannot start with )
                 if i == 0:
                     Msg = 'Expression cannot start with \')\''
-                    raise ValueError (Msg)
+                    raise ValueError(Msg)
                 # The previous op before ')' cannot be an operator
-                if self.IsValidOperator (OpList[i-1]):
-                    Msg = '{Op} before \')\' is not a valid expression input.'.format (Op = OpList[i-1])
-                    raise ValueError (Msg)
+                if self.IsValidOperator(OpList[i-1]):
+                    Msg = '{Op} before \')\' is not a valid expression input.'.format(
+                        Op=OpList[i-1])
+                    raise ValueError(Msg)
                 # The next op after ')' cannot be operand or unary operator
-                if (i + 1) < len (OpList) and (self.IsValidOperand (OpList[i+1]) or self.IsValidUnaryOperator (OpList[i+1])):
-                    Msg = '{Op} after \')\' is not a valid expression input.'.format (Op = OpList[i+1])
-                    raise ValueError (Msg)
+                if (i + 1) < len(OpList) and (self.IsValidOperand(OpList[i+1]) or self.IsValidUnaryOperator(OpList[i+1])):
+                    Msg = '{Op} after \')\' is not a valid expression input.'.format(
+                        Op=OpList[i+1])
+                    raise ValueError(Msg)
 
-            elif self.IsValidOperand (Op):
+            elif self.IsValidOperand(Op):
                 # The next expression of operand cannot be operand or unary operator
-                if (i + 1) < len (OpList) and (self.IsValidOperand (OpList[i+1]) or self.IsValidUnaryOperator (OpList[i+1])):
-                    Msg = '{Op} after {PrevOp} is not a valid expression input.'.format (Op = OpList[i+1], PrevOp = Op)
-                    raise ValueError (Msg)
+                if (i + 1) < len(OpList) and (self.IsValidOperand(OpList[i+1]) or self.IsValidUnaryOperator(OpList[i+1])):
+                    Msg = '{Op} after {PrevOp} is not a valid expression input.'.format(
+                        Op=OpList[i+1], PrevOp=Op)
+                    raise ValueError(Msg)
 
-            elif self.IsValidOperator (Op):
+            elif self.IsValidOperator(Op):
                 # The next op of operator cannot binocular operator
-                if (i + 1) < len (OpList) and self.IsValidBinocularOperator (OpList[i+1]):
-                    Msg = '{Op} after {PrevOp} is not a valid expression input.'.format (Op = OpList[i+1], PrevOp = Op)
-                    raise ValueError (Msg)
+                if (i + 1) < len(OpList) and self.IsValidBinocularOperator(OpList[i+1]):
+                    Msg = '{Op} after {PrevOp} is not a valid expression input.'.format(
+                        Op=OpList[i+1], PrevOp=Op)
+                    raise ValueError(Msg)
                 # The first op can not be binocular operator
-                if i == 0 and self.IsValidBinocularOperator (Op):
-                    Msg = 'Expression cannot start with an operator {Op}.'.format (Op = Op)
-                    raise ValueError (Msg)
+                if i == 0 and self.IsValidBinocularOperator(Op):
+                    Msg = 'Expression cannot start with an operator {Op}.'.format(
+                        Op=Op)
+                    raise ValueError(Msg)
                 # The last op can not be operator
-                if i == len (OpList) - 1:
-                    Msg = 'Expression cannot ended with an operator {Op}.'.format (Op = Op)
-                    raise ValueError (Msg)
+                if i == len(OpList) - 1:
+                    Msg = 'Expression cannot ended with an operator {Op}.'.format(
+                        Op=Op)
+                    raise ValueError(Msg)
                 # The next op of unary operator cannot be guid / version
-                if self.IsValidUnaryOperator (Op) and (self.IsValidGuid (OpList[i+1]) or self.IsValidVersion (OpList[i+1])):
-                    Msg = '{Op} after {PrevOp} is not a valid expression input.'.format (Op = OpList[i+1], PrevOp = Op)
-                    raise ValueError (Msg)
+                if self.IsValidUnaryOperator(Op) and (self.IsValidGuid(OpList[i+1]) or self.IsValidVersion(OpList[i+1])):
+                    Msg = '{Op} after {PrevOp} is not a valid expression input.'.format(
+                        Op=OpList[i+1], PrevOp=Op)
+                    raise ValueError(Msg)
 
             else:
-                Msg = '{Op} is not a valid expression input.'.format (Op = Op)
-                raise ValueError (Msg)
+                Msg = '{Op} is not a valid expression input.'.format(Op=Op)
+                raise ValueError(Msg)
             i += 1
 
-    def Encode (self):
+    def Encode(self):
         # initialize
         self.Depex = b''
         self._DepexDump = []
@@ -270,80 +289,84 @@ class CapsuleDependencyClass (object):
         OpeartorStack = []
         OpList = self._DepexList
 
-        self.ValidateDepex ()
+        self.ValidateDepex()
 
         # convert
         i = 0
-        while i < len (OpList):
+        while i < len(OpList):
             Op = OpList[i]
             if Op == 'DECLARE':
                 # This declare next expression value is a VERSION_STRING
                 i += 1
-                self.Depex += self.OpConverter.OpEncode (0x02, OpList[i][1:-1])
+                self.Depex += self.OpConverter.OpEncode(0x02, OpList[i][1:-1])
 
             elif Op == '(':
-                OpeartorStack.append (Op)
+                OpeartorStack.append(Op)
 
             elif Op == ')':
                 while (OpeartorStack and OpeartorStack[-1] != '('):
-                    Operator = OpeartorStack.pop ()
-                    self.Depex += self.OpConverter.OpEncode (self._opReference[Operator][1])
+                    Operator = OpeartorStack.pop()
+                    self.Depex += self.OpConverter.OpEncode(
+                        self._opReference[Operator][1])
                 try:
-                    OpeartorStack.pop () # pop out '('
+                    OpeartorStack.pop()  # pop out '('
                 except:
                     Msg = 'Pop out \'(\' failed, too many \')\''
-                    raise ValueError (Msg)
+                    raise ValueError(Msg)
 
-            elif self.IsValidGuid (Op):
+            elif self.IsValidGuid(Op):
                 if not OperandStack:
-                    OperandStack.append (self.OpConverter.OpEncode (0x00, Op))
+                    OperandStack.append(self.OpConverter.OpEncode(0x00, Op))
                 else:
                     # accroding to uefi spec 2.8, the guid/version operands is a reversed order in firmware comparison.
-                    self.Depex += self.OpConverter.OpEncode (0x00, Op)
-                    self.Depex += OperandStack.pop ()
+                    self.Depex += self.OpConverter.OpEncode(0x00, Op)
+                    self.Depex += OperandStack.pop()
 
-            elif self.IsValidVersion (Op):
+            elif self.IsValidVersion(Op):
                 if not OperandStack:
-                    OperandStack.append (self.OpConverter.OpEncode (0x01, Op))
+                    OperandStack.append(self.OpConverter.OpEncode(0x01, Op))
                 else:
                     # accroding to uefi spec 2.8, the guid/version operands is a reversed order in firmware comparison.
-                    self.Depex += self.OpConverter.OpEncode (0x01, Op)
-                    self.Depex += OperandStack.pop ()
+                    self.Depex += self.OpConverter.OpEncode(0x01, Op)
+                    self.Depex += OperandStack.pop()
 
-            elif self.IsValidBoolean (Op):
-                if Op.upper () == 'FALSE':
-                    self.Depex += self.OpConverter.OpEncode (0x07)
-                elif Op.upper () == 'TRUE':
-                    self.Depex += self.OpConverter.OpEncode (0x06)
+            elif self.IsValidBoolean(Op):
+                if Op.upper() == 'FALSE':
+                    self.Depex += self.OpConverter.OpEncode(0x07)
+                elif Op.upper() == 'TRUE':
+                    self.Depex += self.OpConverter.OpEncode(0x06)
 
-            elif self.IsValidOperator (Op):
-                while (OpeartorStack and OpeartorStack[-1] != '(' and self.PriorityNotGreater (OpeartorStack[-1], Op)):
-                    Operator = OpeartorStack.pop ()
-                    self.Depex += self.OpConverter.OpEncode (self._opReference[Operator][1])
-                OpeartorStack.append (Op)
+            elif self.IsValidOperator(Op):
+                while (OpeartorStack and OpeartorStack[-1] != '(' and self.PriorityNotGreater(OpeartorStack[-1], Op)):
+                    Operator = OpeartorStack.pop()
+                    self.Depex += self.OpConverter.OpEncode(
+                        self._opReference[Operator][1])
+                OpeartorStack.append(Op)
 
             i += 1
 
         while OpeartorStack:
-            Operator = OpeartorStack.pop ()
+            Operator = OpeartorStack.pop()
             if Operator == '(':
                 Msg = 'Too many \'(\'.'
-                raise ValueError (Msg)
-            self.Depex += self.OpConverter.OpEncode (self._opReference[Operator][1])
-        self.Depex += self.OpConverter.OpEncode (0x0D)
+                raise ValueError(Msg)
+            self.Depex += self.OpConverter.OpEncode(
+                self._opReference[Operator][1])
+        self.Depex += self.OpConverter.OpEncode(0x0D)
 
         self._Valid = True
-        self._DepexSize = len (self.Depex)
+        self._DepexSize = len(self.Depex)
         return self.Depex + self.Payload
 
-    def Decode (self, Buffer):
+    def Decode(self, Buffer):
         # initialize
         self.Depex = Buffer
         OperandStack = []
         DepexLen = 0
 
         while True:
-            Opcode, Operand, OperandSize = self.OpConverter.OpDecode (Buffer[DepexLen:])
+            Opcode, Operand, OperandSize = self.OpConverter.OpDecode(
+                Buffer[DepexLen:])
             DepexLen += OperandSize + 1
 
             if Opcode == 0x0D:
@@ -351,59 +374,68 @@ class CapsuleDependencyClass (object):
 
             elif Opcode == 0x02:
                 if not OperandStack:
-                    OperandStack.append ('DECLARE \"{String}\"'.format (String = Operand))
+                    OperandStack.append(
+                        'DECLARE \"{String}\"'.format(String=Operand))
                 else:
-                    PrevOperand = OperandStack.pop ()
-                    OperandStack.append ('{Operand} DECLARE \"{String}\"'.format (Operand = PrevOperand, String = Operand))
+                    PrevOperand = OperandStack.pop()
+                    OperandStack.append('{Operand} DECLARE \"{String}\"'.format(
+                        Operand=PrevOperand, String=Operand))
 
             elif Opcode in [0x00, 0x01]:
-                OperandStack.append (Operand)
+                OperandStack.append(Operand)
 
             elif Opcode == 0x06:
-                OperandStack.append ('TRUE')
+                OperandStack.append('TRUE')
 
             elif Opcode == 0x07:
-                OperandStack.append ('FALSE')
+                OperandStack.append('FALSE')
 
-            elif self.IsValidOperator (self._opReferenceReverse[Opcode]):
+            elif self.IsValidOperator(self._opReferenceReverse[Opcode]):
                 Operator = self._opReferenceReverse[Opcode]
-                if self.IsValidUnaryOperator (self._opReferenceReverse[Opcode]) and len (OperandStack) >= 1:
-                    Oprand = OperandStack.pop ()
-                    OperandStack.append (' ( {Operator} {Oprand} )'.format (Operator = Operator, Oprand = Oprand))
-                elif self.IsValidBinocularOperator (self._opReferenceReverse[Opcode]) and len (OperandStack) >= 2:
-                    Oprand1 = OperandStack.pop ()
-                    Oprand2 = OperandStack.pop ()
-                    OperandStack.append (' ( {Oprand1} {Operator} {Oprand2} )'.format (Operator = Operator, Oprand1 = Oprand1, Oprand2 = Oprand2))
+                if self.IsValidUnaryOperator(self._opReferenceReverse[Opcode]) and len(OperandStack) >= 1:
+                    Oprand = OperandStack.pop()
+                    OperandStack.append(' ( {Operator} {Oprand} )'.format(
+                        Operator=Operator, Oprand=Oprand))
+                elif self.IsValidBinocularOperator(self._opReferenceReverse[Opcode]) and len(OperandStack) >= 2:
+                    Oprand1 = OperandStack.pop()
+                    Oprand2 = OperandStack.pop()
+                    OperandStack.append(' ( {Oprand1} {Operator} {Oprand2} )'.format(
+                        Operator=Operator, Oprand1=Oprand1, Oprand2=Oprand2))
                 else:
-                    Msg = 'No enough Operands for {Opcode:02X}.'.format (Opcode = Opcode)
-                    raise ValueError (Msg)
+                    Msg = 'No enough Operands for {Opcode:02X}.'.format(
+                        Opcode=Opcode)
+                    raise ValueError(Msg)
 
             else:
-                Msg = '{Opcode:02X} is not a valid OpCode.'.format (Opcode = Opcode)
-                raise ValueError (Msg)
+                Msg = '{Opcode:02X} is not a valid OpCode.'.format(
+                    Opcode=Opcode)
+                raise ValueError(Msg)
 
-        self.DepexExp = OperandStack[0].strip (' ')
+        self.DepexExp = OperandStack[0].strip(' ')
         self.Payload = Buffer[DepexLen:]
         self._Valid = True
         self._DepexSize = DepexLen
         return self.Payload
 
-
-    def DumpInfo (self):
+    def DumpInfo(self):
         DepexLen = 0
         Opcode = None
         Buffer = self.Depex
 
         if self._Valid == True:
-            print ('EFI_FIRMWARE_IMAGE_DEP.Dependencies = {')
+            print('EFI_FIRMWARE_IMAGE_DEP.Dependencies = {')
             while Opcode != 0x0D:
-                Opcode, Operand, OperandSize = self.OpConverter.OpDecode (Buffer[DepexLen:])
+                Opcode, Operand, OperandSize = self.OpConverter.OpDecode(
+                    Buffer[DepexLen:])
                 DepexLen += OperandSize + 1
                 if Operand:
-                    print ('    {Opcode:02X}, {Operand},'.format (Opcode = Opcode, Operand = Operand))
+                    print('    {Opcode:02X}, {Operand},'.format(
+                        Opcode=Opcode, Operand=Operand))
                 else:
-                    print ('    {Opcode:02X},'.format (Opcode = Opcode))
-            print ('}')
+                    print('    {Opcode:02X},'.format(Opcode=Opcode))
+            print('}')
 
-            print ('sizeof (EFI_FIRMWARE_IMAGE_DEP.Dependencies)    = {Size:08X}'.format (Size = self._DepexSize))
-            print ('sizeof (Payload)                                = {Size:08X}'.format (Size = len (self.Payload)))
+            print('sizeof (EFI_FIRMWARE_IMAGE_DEP.Dependencies)    = {Size:08X}'.format(
+                Size=self._DepexSize))
+            print('sizeof (Payload)                                = {Size:08X}'.format(
+                Size=len(self.Payload)))
diff --git a/BaseTools/Source/Python/Common/Uefi/Capsule/FmpAuthHeader.py b/BaseTools/Source/Python/Common/Uefi/Capsule/FmpAuthHeader.py
index 48c605faa8dd..75539a3e32ad 100644
--- a/BaseTools/Source/Python/Common/Uefi/Capsule/FmpAuthHeader.py
+++ b/BaseTools/Source/Python/Common/Uefi/Capsule/FmpAuthHeader.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Module that encodes and decodes a EFI_FIRMWARE_IMAGE_AUTHENTICATION with
 # certificate data and payload data.
 #
@@ -13,6 +13,7 @@ FmpAuthHeader
 import struct
 import uuid
 
+
 class FmpAuthHeaderClass (object):
     # ///
     # /// Image Attribute -Authentication Required
@@ -96,58 +97,58 @@ class FmpAuthHeaderClass (object):
     #   }
 
     _StructFormat = '<QIHH16s'
-    _StructSize   = struct.calcsize (_StructFormat)
+    _StructSize = struct.calcsize(_StructFormat)
 
     _MonotonicCountFormat = '<Q'
-    _MonotonicCountSize   = struct.calcsize (_MonotonicCountFormat)
+    _MonotonicCountSize = struct.calcsize(_MonotonicCountFormat)
 
     _StructAuthInfoFormat = '<IHH16s'
-    _StructAuthInfoSize   = struct.calcsize (_StructAuthInfoFormat)
+    _StructAuthInfoSize = struct.calcsize(_StructAuthInfoFormat)
 
-    _WIN_CERT_REVISION        = 0x0200
-    _WIN_CERT_TYPE_EFI_GUID   = 0x0EF1
-    _EFI_CERT_TYPE_PKCS7_GUID = uuid.UUID ('4aafd29d-68df-49ee-8aa9-347d375665a7')
+    _WIN_CERT_REVISION = 0x0200
+    _WIN_CERT_TYPE_EFI_GUID = 0x0EF1
+    _EFI_CERT_TYPE_PKCS7_GUID = uuid.UUID(
+        '4aafd29d-68df-49ee-8aa9-347d375665a7')
 
-    def __init__ (self):
-        self._Valid              = False
-        self.MonotonicCount      = 0
-        self.dwLength            = self._StructAuthInfoSize
-        self.wRevision           = self._WIN_CERT_REVISION
-        self.wCertificateType    = self._WIN_CERT_TYPE_EFI_GUID
-        self.CertType            = self._EFI_CERT_TYPE_PKCS7_GUID
-        self.CertData            = b''
-        self.Payload             = b''
+    def __init__(self):
+        self._Valid = False
+        self.MonotonicCount = 0
+        self.dwLength = self._StructAuthInfoSize
+        self.wRevision = self._WIN_CERT_REVISION
+        self.wCertificateType = self._WIN_CERT_TYPE_EFI_GUID
+        self.CertType = self._EFI_CERT_TYPE_PKCS7_GUID
+        self.CertData = b''
+        self.Payload = b''
 
-
-    def Encode (self):
+    def Encode(self):
         if self.wRevision != self._WIN_CERT_REVISION:
             raise ValueError
         if self.wCertificateType != self._WIN_CERT_TYPE_EFI_GUID:
             raise ValueError
         if self.CertType != self._EFI_CERT_TYPE_PKCS7_GUID:
             raise ValueError
-        self.dwLength = self._StructAuthInfoSize + len (self.CertData)
+        self.dwLength = self._StructAuthInfoSize + len(self.CertData)
 
-        FmpAuthHeader = struct.pack (
-                                 self._StructFormat,
-                                 self.MonotonicCount,
-                                 self.dwLength,
-                                 self.wRevision,
-                                 self.wCertificateType,
-                                 self.CertType.bytes_le
-                                 )
+        FmpAuthHeader = struct.pack(
+            self._StructFormat,
+            self.MonotonicCount,
+            self.dwLength,
+            self.wRevision,
+            self.wCertificateType,
+            self.CertType.bytes_le
+        )
         self._Valid = True
 
         return FmpAuthHeader + self.CertData + self.Payload
 
-    def Decode (self, Buffer):
-        if len (Buffer) < self._StructSize:
+    def Decode(self, Buffer):
+        if len(Buffer) < self._StructSize:
             raise ValueError
         (MonotonicCount, dwLength, wRevision, wCertificateType, CertType) = \
-            struct.unpack (
-                     self._StructFormat,
-                     Buffer[0:self._StructSize]
-                     )
+            struct.unpack(
+            self._StructFormat,
+            Buffer[0:self._StructSize]
+        )
         if dwLength < self._StructAuthInfoSize:
             raise ValueError
         if wRevision != self._WIN_CERT_REVISION:
@@ -156,35 +157,43 @@ class FmpAuthHeaderClass (object):
             raise ValueError
         if CertType != self._EFI_CERT_TYPE_PKCS7_GUID.bytes_le:
             raise ValueError
-        self.MonotonicCount   = MonotonicCount
-        self.dwLength         = dwLength
-        self.wRevision        = wRevision
+        self.MonotonicCount = MonotonicCount
+        self.dwLength = dwLength
+        self.wRevision = wRevision
         self.wCertificateType = wCertificateType
-        self.CertType         = uuid.UUID (bytes_le = CertType)
-        self.CertData         = Buffer[self._StructSize:self._MonotonicCountSize + self.dwLength]
-        self.Payload          = Buffer[self._MonotonicCountSize + self.dwLength:]
-        self._Valid           = True
+        self.CertType = uuid.UUID(bytes_le=CertType)
+        self.CertData = Buffer[self._StructSize:
+                               self._MonotonicCountSize + self.dwLength]
+        self.Payload = Buffer[self._MonotonicCountSize + self.dwLength:]
+        self._Valid = True
         return self.Payload
 
-    def IsSigned (self, Buffer):
-        if len (Buffer) < self._StructSize:
+    def IsSigned(self, Buffer):
+        if len(Buffer) < self._StructSize:
             return False
         (MonotonicCount, dwLength, wRevision, wCertificateType, CertType) = \
-            struct.unpack (
-                     self._StructFormat,
-                     Buffer[0:self._StructSize]
-                     )
+            struct.unpack(
+            self._StructFormat,
+            Buffer[0:self._StructSize]
+        )
         if CertType != self._EFI_CERT_TYPE_PKCS7_GUID.bytes_le:
             return False
         return True
 
-    def DumpInfo (self):
+    def DumpInfo(self):
         if not self._Valid:
             raise ValueError
-        print ('EFI_FIRMWARE_IMAGE_AUTHENTICATION.MonotonicCount                = {MonotonicCount:016X}'.format (MonotonicCount = self.MonotonicCount))
-        print ('EFI_FIRMWARE_IMAGE_AUTHENTICATION.AuthInfo.Hdr.dwLength         = {dwLength:08X}'.format (dwLength = self.dwLength))
-        print ('EFI_FIRMWARE_IMAGE_AUTHENTICATION.AuthInfo.Hdr.wRevision        = {wRevision:04X}'.format (wRevision = self.wRevision))
-        print ('EFI_FIRMWARE_IMAGE_AUTHENTICATION.AuthInfo.Hdr.wCertificateType = {wCertificateType:04X}'.format (wCertificateType = self.wCertificateType))
-        print ('EFI_FIRMWARE_IMAGE_AUTHENTICATION.AuthInfo.CertType             = {Guid}'.format (Guid = str(self.CertType).upper()))
-        print ('sizeof (EFI_FIRMWARE_IMAGE_AUTHENTICATION.AuthInfo.CertData)    = {Size:08X}'.format (Size = len (self.CertData)))
-        print ('sizeof (Payload)                                                = {Size:08X}'.format (Size = len (self.Payload)))
+        print('EFI_FIRMWARE_IMAGE_AUTHENTICATION.MonotonicCount                = {MonotonicCount:016X}'.format(
+            MonotonicCount=self.MonotonicCount))
+        print('EFI_FIRMWARE_IMAGE_AUTHENTICATION.AuthInfo.Hdr.dwLength         = {dwLength:08X}'.format(
+            dwLength=self.dwLength))
+        print('EFI_FIRMWARE_IMAGE_AUTHENTICATION.AuthInfo.Hdr.wRevision        = {wRevision:04X}'.format(
+            wRevision=self.wRevision))
+        print('EFI_FIRMWARE_IMAGE_AUTHENTICATION.AuthInfo.Hdr.wCertificateType = {wCertificateType:04X}'.format(
+            wCertificateType=self.wCertificateType))
+        print('EFI_FIRMWARE_IMAGE_AUTHENTICATION.AuthInfo.CertType             = {Guid}'.format(
+            Guid=str(self.CertType).upper()))
+        print('sizeof (EFI_FIRMWARE_IMAGE_AUTHENTICATION.AuthInfo.CertData)    = {Size:08X}'.format(
+            Size=len(self.CertData)))
+        print('sizeof (Payload)                                                = {Size:08X}'.format(
+            Size=len(self.Payload)))
diff --git a/BaseTools/Source/Python/Common/Uefi/Capsule/FmpCapsuleHeader.py b/BaseTools/Source/Python/Common/Uefi/Capsule/FmpCapsuleHeader.py
index 8abb449c6fd7..6fb041c5fc77 100644
--- a/BaseTools/Source/Python/Common/Uefi/Capsule/FmpCapsuleHeader.py
+++ b/BaseTools/Source/Python/Common/Uefi/Capsule/FmpCapsuleHeader.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Module that encodes and decodes a EFI_FIRMWARE_MANAGEMENT_CAPSULE_HEADER with
 # a payload.
 #
@@ -13,6 +13,7 @@ FmpCapsuleHeader
 import struct
 import uuid
 
+
 class FmpCapsuleImageHeaderClass (object):
     # typedef struct {
     #   UINT32   Version;
@@ -57,79 +58,90 @@ class FmpCapsuleImageHeaderClass (object):
     #  #define EFI_FIRMWARE_MANAGEMENT_CAPSULE_IMAGE_HEADER_INIT_VERSION 0x00000003
 
     _StructFormat = '<I16sB3BIIQQ'
-    _StructSize   = struct.calcsize (_StructFormat)
+    _StructSize = struct.calcsize(_StructFormat)
 
     EFI_FIRMWARE_MANAGEMENT_CAPSULE_IMAGE_HEADER_INIT_VERSION = 0x00000003
 
-    def __init__ (self):
-        self._Valid                 = False
-        self.Version                = self.EFI_FIRMWARE_MANAGEMENT_CAPSULE_IMAGE_HEADER_INIT_VERSION
-        self.UpdateImageTypeId      = uuid.UUID ('00000000-0000-0000-0000-000000000000')
-        self.UpdateImageIndex       = 0
-        self.UpdateImageSize        = 0
-        self.UpdateVendorCodeSize   = 0
+    def __init__(self):
+        self._Valid = False
+        self.Version = self.EFI_FIRMWARE_MANAGEMENT_CAPSULE_IMAGE_HEADER_INIT_VERSION
+        self.UpdateImageTypeId = uuid.UUID(
+            '00000000-0000-0000-0000-000000000000')
+        self.UpdateImageIndex = 0
+        self.UpdateImageSize = 0
+        self.UpdateVendorCodeSize = 0
         self.UpdateHardwareInstance = 0x0000000000000000
-        self.ImageCapsuleSupport    = 0x0000000000000000
-        self.Payload                = b''
-        self.VendorCodeBytes        = b''
+        self.ImageCapsuleSupport = 0x0000000000000000
+        self.Payload = b''
+        self.VendorCodeBytes = b''
 
-    def Encode (self):
-        self.UpdateImageSize      = len (self.Payload)
-        self.UpdateVendorCodeSize = len (self.VendorCodeBytes)
-        FmpCapsuleImageHeader = struct.pack (
-                                         self._StructFormat,
-                                         self.Version,
-                                         self.UpdateImageTypeId.bytes_le,
-                                         self.UpdateImageIndex,
-                                         0,0,0,
-                                         self.UpdateImageSize,
-                                         self.UpdateVendorCodeSize,
-                                         self.UpdateHardwareInstance,
-                                         self.ImageCapsuleSupport
-                                         )
+    def Encode(self):
+        self.UpdateImageSize = len(self.Payload)
+        self.UpdateVendorCodeSize = len(self.VendorCodeBytes)
+        FmpCapsuleImageHeader = struct.pack(
+            self._StructFormat,
+            self.Version,
+            self.UpdateImageTypeId.bytes_le,
+            self.UpdateImageIndex,
+            0, 0, 0,
+            self.UpdateImageSize,
+            self.UpdateVendorCodeSize,
+            self.UpdateHardwareInstance,
+            self.ImageCapsuleSupport
+        )
         self._Valid = True
         return FmpCapsuleImageHeader + self.Payload + self.VendorCodeBytes
 
-    def Decode (self, Buffer):
-        if len (Buffer) < self._StructSize:
+    def Decode(self, Buffer):
+        if len(Buffer) < self._StructSize:
             raise ValueError
         (Version, UpdateImageTypeId, UpdateImageIndex, r0, r1, r2, UpdateImageSize, UpdateVendorCodeSize, UpdateHardwareInstance, ImageCapsuleSupport) = \
-            struct.unpack (
-                     self._StructFormat,
-                     Buffer[0:self._StructSize]
-                     )
+            struct.unpack(
+            self._StructFormat,
+            Buffer[0:self._StructSize]
+        )
 
         if Version < self.EFI_FIRMWARE_MANAGEMENT_CAPSULE_IMAGE_HEADER_INIT_VERSION:
             raise ValueError
         if UpdateImageIndex < 1:
             raise ValueError
-        if UpdateImageSize + UpdateVendorCodeSize != len (Buffer[self._StructSize:]):
+        if UpdateImageSize + UpdateVendorCodeSize != len(Buffer[self._StructSize:]):
             raise ValueError
 
-        self.Version                = Version
-        self.UpdateImageTypeId      = uuid.UUID (bytes_le = UpdateImageTypeId)
-        self.UpdateImageIndex       = UpdateImageIndex
-        self.UpdateImageSize        = UpdateImageSize
-        self.UpdateVendorCodeSize   = UpdateVendorCodeSize
+        self.Version = Version
+        self.UpdateImageTypeId = uuid.UUID(bytes_le=UpdateImageTypeId)
+        self.UpdateImageIndex = UpdateImageIndex
+        self.UpdateImageSize = UpdateImageSize
+        self.UpdateVendorCodeSize = UpdateVendorCodeSize
         self.UpdateHardwareInstance = UpdateHardwareInstance
-        self.ImageCapsuleSupport    = ImageCapsuleSupport
-        self.Payload                = Buffer[self._StructSize:self._StructSize + UpdateImageSize]
-        self.VendorCodeBytes        = Buffer[self._StructSize + UpdateImageSize:]
-        self._Valid                 = True
+        self.ImageCapsuleSupport = ImageCapsuleSupport
+        self.Payload = Buffer[self._StructSize:self._StructSize + UpdateImageSize]
+        self.VendorCodeBytes = Buffer[self._StructSize + UpdateImageSize:]
+        self._Valid = True
         return Buffer[self._StructSize:]
 
-    def DumpInfo (self):
+    def DumpInfo(self):
         if not self._Valid:
             raise ValueError
-        print ('EFI_FIRMWARE_MANAGEMENT_CAPSULE_IMAGE_HEADER.Version                = {Version:08X}'.format (Version = self.Version))
-        print ('EFI_FIRMWARE_MANAGEMENT_CAPSULE_IMAGE_HEADER.UpdateImageTypeId      = {UpdateImageTypeId}'.format (UpdateImageTypeId = str(self.UpdateImageTypeId).upper()))
-        print ('EFI_FIRMWARE_MANAGEMENT_CAPSULE_IMAGE_HEADER.UpdateImageIndex       = {UpdateImageIndex:08X}'.format (UpdateImageIndex = self.UpdateImageIndex))
-        print ('EFI_FIRMWARE_MANAGEMENT_CAPSULE_IMAGE_HEADER.UpdateImageSize        = {UpdateImageSize:08X}'.format (UpdateImageSize = self.UpdateImageSize))
-        print ('EFI_FIRMWARE_MANAGEMENT_CAPSULE_IMAGE_HEADER.UpdateVendorCodeSize   = {UpdateVendorCodeSize:08X}'.format (UpdateVendorCodeSize = self.UpdateVendorCodeSize))
-        print ('EFI_FIRMWARE_MANAGEMENT_CAPSULE_IMAGE_HEADER.UpdateHardwareInstance = {UpdateHardwareInstance:016X}'.format (UpdateHardwareInstance = self.UpdateHardwareInstance))
-        print ('EFI_FIRMWARE_MANAGEMENT_CAPSULE_IMAGE_HEADER.ImageCapsuleSupport    = {ImageCapsuleSupport:016X}'.format (ImageCapsuleSupport = self.ImageCapsuleSupport))
-        print ('sizeof (Payload)                                                    = {Size:08X}'.format (Size = len (self.Payload)))
-        print ('sizeof (VendorCodeBytes)                                            = {Size:08X}'.format (Size = len (self.VendorCodeBytes)))
+        print('EFI_FIRMWARE_MANAGEMENT_CAPSULE_IMAGE_HEADER.Version                = {Version:08X}'.format(
+            Version=self.Version))
+        print('EFI_FIRMWARE_MANAGEMENT_CAPSULE_IMAGE_HEADER.UpdateImageTypeId      = {UpdateImageTypeId}'.format(
+            UpdateImageTypeId=str(self.UpdateImageTypeId).upper()))
+        print('EFI_FIRMWARE_MANAGEMENT_CAPSULE_IMAGE_HEADER.UpdateImageIndex       = {UpdateImageIndex:08X}'.format(
+            UpdateImageIndex=self.UpdateImageIndex))
+        print('EFI_FIRMWARE_MANAGEMENT_CAPSULE_IMAGE_HEADER.UpdateImageSize        = {UpdateImageSize:08X}'.format(
+            UpdateImageSize=self.UpdateImageSize))
+        print('EFI_FIRMWARE_MANAGEMENT_CAPSULE_IMAGE_HEADER.UpdateVendorCodeSize   = {UpdateVendorCodeSize:08X}'.format(
+            UpdateVendorCodeSize=self.UpdateVendorCodeSize))
+        print('EFI_FIRMWARE_MANAGEMENT_CAPSULE_IMAGE_HEADER.UpdateHardwareInstance = {UpdateHardwareInstance:016X}'.format(
+            UpdateHardwareInstance=self.UpdateHardwareInstance))
+        print('EFI_FIRMWARE_MANAGEMENT_CAPSULE_IMAGE_HEADER.ImageCapsuleSupport    = {ImageCapsuleSupport:016X}'.format(
+            ImageCapsuleSupport=self.ImageCapsuleSupport))
+        print('sizeof (Payload)                                                    = {Size:08X}'.format(
+            Size=len(self.Payload)))
+        print('sizeof (VendorCodeBytes)                                            = {Size:08X}'.format(
+            Size=len(self.VendorCodeBytes)))
+
 
 class FmpCapsuleHeaderClass (object):
     # typedef struct {
@@ -156,155 +168,163 @@ class FmpCapsuleHeaderClass (object):
     #
     #  #define EFI_FIRMWARE_MANAGEMENT_CAPSULE_HEADER_INIT_VERSION       0x00000001
     _StructFormat = '<IHH'
-    _StructSize   = struct.calcsize (_StructFormat)
+    _StructSize = struct.calcsize(_StructFormat)
 
     _ItemOffsetFormat = '<Q'
-    _ItemOffsetSize   = struct.calcsize (_ItemOffsetFormat)
+    _ItemOffsetSize = struct.calcsize(_ItemOffsetFormat)
 
     EFI_FIRMWARE_MANAGEMENT_CAPSULE_HEADER_INIT_VERSION = 0x00000001
     CAPSULE_SUPPORT_AUTHENTICATION = 0x0000000000000001
-    CAPSULE_SUPPORT_DEPENDENCY     = 0x0000000000000002
+    CAPSULE_SUPPORT_DEPENDENCY = 0x0000000000000002
 
-    def __init__ (self):
-        self._Valid                     = False
-        self.Version                    = self.EFI_FIRMWARE_MANAGEMENT_CAPSULE_HEADER_INIT_VERSION
-        self.EmbeddedDriverCount        = 0
-        self.PayloadItemCount           = 0
-        self._ItemOffsetList            = []
-        self._EmbeddedDriverList        = []
-        self._PayloadList               = []
+    def __init__(self):
+        self._Valid = False
+        self.Version = self.EFI_FIRMWARE_MANAGEMENT_CAPSULE_HEADER_INIT_VERSION
+        self.EmbeddedDriverCount = 0
+        self.PayloadItemCount = 0
+        self._ItemOffsetList = []
+        self._EmbeddedDriverList = []
+        self._PayloadList = []
         self._FmpCapsuleImageHeaderList = []
 
-    def AddEmbeddedDriver (self, EmbeddedDriver):
-        self._EmbeddedDriverList.append (EmbeddedDriver)
+    def AddEmbeddedDriver(self, EmbeddedDriver):
+        self._EmbeddedDriverList.append(EmbeddedDriver)
 
-    def GetEmbeddedDriver (self, Index):
-        if Index > len (self._EmbeddedDriverList):
+    def GetEmbeddedDriver(self, Index):
+        if Index > len(self._EmbeddedDriverList):
             raise ValueError
         return self._EmbeddedDriverList[Index]
 
-    def AddPayload (self, UpdateImageTypeId, Payload = b'', VendorCodeBytes = b'', HardwareInstance = 0, UpdateImageIndex = 1, CapsuleSupport = 0):
-        self._PayloadList.append ((UpdateImageTypeId, Payload, VendorCodeBytes, HardwareInstance, UpdateImageIndex, CapsuleSupport))
+    def AddPayload(self, UpdateImageTypeId, Payload=b'', VendorCodeBytes=b'', HardwareInstance=0, UpdateImageIndex=1, CapsuleSupport=0):
+        self._PayloadList.append((UpdateImageTypeId, Payload, VendorCodeBytes,
+                                 HardwareInstance, UpdateImageIndex, CapsuleSupport))
 
-    def GetFmpCapsuleImageHeader (self, Index):
-        if Index >= len (self._FmpCapsuleImageHeaderList):
+    def GetFmpCapsuleImageHeader(self, Index):
+        if Index >= len(self._FmpCapsuleImageHeaderList):
             raise ValueError
         return self._FmpCapsuleImageHeaderList[Index]
 
-    def Encode (self):
-        self.EmbeddedDriverCount = len (self._EmbeddedDriverList)
-        self.PayloadItemCount    = len (self._PayloadList)
+    def Encode(self):
+        self.EmbeddedDriverCount = len(self._EmbeddedDriverList)
+        self.PayloadItemCount = len(self._PayloadList)
 
-        FmpCapsuleHeader = struct.pack (
-                                    self._StructFormat,
-                                    self.Version,
-                                    self.EmbeddedDriverCount,
-                                    self.PayloadItemCount
-                                    )
+        FmpCapsuleHeader = struct.pack(
+            self._StructFormat,
+            self.Version,
+            self.EmbeddedDriverCount,
+            self.PayloadItemCount
+        )
 
         FmpCapsuleData = b''
-        Offset = self._StructSize + (self.EmbeddedDriverCount + self.PayloadItemCount) * self._ItemOffsetSize
+        Offset = self._StructSize + \
+            (self.EmbeddedDriverCount + self.PayloadItemCount) * self._ItemOffsetSize
         for EmbeddedDriver in self._EmbeddedDriverList:
             FmpCapsuleData = FmpCapsuleData + EmbeddedDriver
-            self._ItemOffsetList.append (Offset)
-            Offset = Offset + len (EmbeddedDriver)
+            self._ItemOffsetList.append(Offset)
+            Offset = Offset + len(EmbeddedDriver)
         Index = 1
         for (UpdateImageTypeId, Payload, VendorCodeBytes, HardwareInstance, UpdateImageIndex, CapsuleSupport) in self._PayloadList:
-            FmpCapsuleImageHeader = FmpCapsuleImageHeaderClass ()
-            FmpCapsuleImageHeader.UpdateImageTypeId      = UpdateImageTypeId
-            FmpCapsuleImageHeader.UpdateImageIndex       = UpdateImageIndex
-            FmpCapsuleImageHeader.Payload                = Payload
-            FmpCapsuleImageHeader.VendorCodeBytes        = VendorCodeBytes
+            FmpCapsuleImageHeader = FmpCapsuleImageHeaderClass()
+            FmpCapsuleImageHeader.UpdateImageTypeId = UpdateImageTypeId
+            FmpCapsuleImageHeader.UpdateImageIndex = UpdateImageIndex
+            FmpCapsuleImageHeader.Payload = Payload
+            FmpCapsuleImageHeader.VendorCodeBytes = VendorCodeBytes
             FmpCapsuleImageHeader.UpdateHardwareInstance = HardwareInstance
-            FmpCapsuleImageHeader.ImageCapsuleSupport    = CapsuleSupport
-            FmpCapsuleImage = FmpCapsuleImageHeader.Encode ()
+            FmpCapsuleImageHeader.ImageCapsuleSupport = CapsuleSupport
+            FmpCapsuleImage = FmpCapsuleImageHeader.Encode()
             FmpCapsuleData = FmpCapsuleData + FmpCapsuleImage
 
-            self._ItemOffsetList.append (Offset)
-            self._FmpCapsuleImageHeaderList.append (FmpCapsuleImageHeader)
+            self._ItemOffsetList.append(Offset)
+            self._FmpCapsuleImageHeaderList.append(FmpCapsuleImageHeader)
 
-            Offset = Offset + len (FmpCapsuleImage)
+            Offset = Offset + len(FmpCapsuleImage)
             Index = Index + 1
 
         for Offset in self._ItemOffsetList:
-          FmpCapsuleHeader = FmpCapsuleHeader + struct.pack (self._ItemOffsetFormat, Offset)
+            FmpCapsuleHeader = FmpCapsuleHeader + \
+                struct.pack(self._ItemOffsetFormat, Offset)
 
         self._Valid = True
         return FmpCapsuleHeader + FmpCapsuleData
 
-    def Decode (self, Buffer):
-        if len (Buffer) < self._StructSize:
+    def Decode(self, Buffer):
+        if len(Buffer) < self._StructSize:
             raise ValueError
         (Version, EmbeddedDriverCount, PayloadItemCount) = \
-            struct.unpack (
-                     self._StructFormat,
-                     Buffer[0:self._StructSize]
-                     )
+            struct.unpack(
+            self._StructFormat,
+            Buffer[0:self._StructSize]
+        )
         if Version < self.EFI_FIRMWARE_MANAGEMENT_CAPSULE_HEADER_INIT_VERSION:
             raise ValueError
 
-        self.Version                    = Version
-        self.EmbeddedDriverCount        = EmbeddedDriverCount
-        self.PayloadItemCount           = PayloadItemCount
-        self._ItemOffsetList            = []
-        self._EmbeddedDriverList        = []
-        self._PayloadList               = []
+        self.Version = Version
+        self.EmbeddedDriverCount = EmbeddedDriverCount
+        self.PayloadItemCount = PayloadItemCount
+        self._ItemOffsetList = []
+        self._EmbeddedDriverList = []
+        self._PayloadList = []
         self._FmpCapsuleImageHeaderList = []
 
         #
         # Parse the ItemOffsetList values
         #
         Offset = self._StructSize
-        for Index in range (0, EmbeddedDriverCount + PayloadItemCount):
-            ItemOffset = struct.unpack (self._ItemOffsetFormat, Buffer[Offset:Offset + self._ItemOffsetSize])[0]
-            if ItemOffset >= len (Buffer):
+        for Index in range(0, EmbeddedDriverCount + PayloadItemCount):
+            ItemOffset = struct.unpack(
+                self._ItemOffsetFormat, Buffer[Offset:Offset + self._ItemOffsetSize])[0]
+            if ItemOffset >= len(Buffer):
                 raise ValueError
-            self._ItemOffsetList.append (ItemOffset)
+            self._ItemOffsetList.append(ItemOffset)
             Offset = Offset + self._ItemOffsetSize
         Result = Buffer[Offset:]
 
         #
         # Parse the EmbeddedDrivers
         #
-        for Index in range (0, EmbeddedDriverCount):
+        for Index in range(0, EmbeddedDriverCount):
             Offset = self._ItemOffsetList[Index]
-            if Index < (len (self._ItemOffsetList) - 1):
+            if Index < (len(self._ItemOffsetList) - 1):
                 Length = self._ItemOffsetList[Index + 1] - Offset
             else:
-                Length = len (Buffer) - Offset
-            self.AddEmbeddedDriver (Buffer[Offset:Offset + Length])
+                Length = len(Buffer) - Offset
+            self.AddEmbeddedDriver(Buffer[Offset:Offset + Length])
 
         #
         # Parse the Payloads that are FMP Capsule Images
         #
-        for Index in range (EmbeddedDriverCount, EmbeddedDriverCount + PayloadItemCount):
+        for Index in range(EmbeddedDriverCount, EmbeddedDriverCount + PayloadItemCount):
             Offset = self._ItemOffsetList[Index]
-            if Index < (len (self._ItemOffsetList) - 1):
+            if Index < (len(self._ItemOffsetList) - 1):
                 Length = self._ItemOffsetList[Index + 1] - Offset
             else:
-                Length = len (Buffer) - Offset
-            FmpCapsuleImageHeader = FmpCapsuleImageHeaderClass ()
-            FmpCapsuleImageHeader.Decode (Buffer[Offset:Offset + Length])
-            self.AddPayload (
-                   FmpCapsuleImageHeader.UpdateImageTypeId,
-                   FmpCapsuleImageHeader.Payload,
-                   FmpCapsuleImageHeader.VendorCodeBytes
-                   )
-            self._FmpCapsuleImageHeaderList.append (FmpCapsuleImageHeader)
+                Length = len(Buffer) - Offset
+            FmpCapsuleImageHeader = FmpCapsuleImageHeaderClass()
+            FmpCapsuleImageHeader.Decode(Buffer[Offset:Offset + Length])
+            self.AddPayload(
+                FmpCapsuleImageHeader.UpdateImageTypeId,
+                FmpCapsuleImageHeader.Payload,
+                FmpCapsuleImageHeader.VendorCodeBytes
+            )
+            self._FmpCapsuleImageHeaderList.append(FmpCapsuleImageHeader)
 
         self._Valid = True
         return Result
 
-    def DumpInfo (self):
+    def DumpInfo(self):
         if not self._Valid:
             raise ValueError
-        print ('EFI_FIRMWARE_MANAGEMENT_CAPSULE_HEADER.Version             = {Version:08X}'.format (Version = self.Version))
-        print ('EFI_FIRMWARE_MANAGEMENT_CAPSULE_HEADER.EmbeddedDriverCount = {EmbeddedDriverCount:08X}'.format (EmbeddedDriverCount = self.EmbeddedDriverCount))
+        print('EFI_FIRMWARE_MANAGEMENT_CAPSULE_HEADER.Version             = {Version:08X}'.format(
+            Version=self.Version))
+        print('EFI_FIRMWARE_MANAGEMENT_CAPSULE_HEADER.EmbeddedDriverCount = {EmbeddedDriverCount:08X}'.format(
+            EmbeddedDriverCount=self.EmbeddedDriverCount))
         for EmbeddedDriver in self._EmbeddedDriverList:
-            print ('  sizeof (EmbeddedDriver)                                  = {Size:08X}'.format (Size = len (EmbeddedDriver)))
-        print ('EFI_FIRMWARE_MANAGEMENT_CAPSULE_HEADER.PayloadItemCount    = {PayloadItemCount:08X}'.format (PayloadItemCount = self.PayloadItemCount))
-        print ('EFI_FIRMWARE_MANAGEMENT_CAPSULE_HEADER.ItemOffsetList      = ')
+            print('  sizeof (EmbeddedDriver)                                  = {Size:08X}'.format(
+                Size=len(EmbeddedDriver)))
+        print('EFI_FIRMWARE_MANAGEMENT_CAPSULE_HEADER.PayloadItemCount    = {PayloadItemCount:08X}'.format(
+            PayloadItemCount=self.PayloadItemCount))
+        print('EFI_FIRMWARE_MANAGEMENT_CAPSULE_HEADER.ItemOffsetList      = ')
         for Offset in self._ItemOffsetList:
-            print ('  {Offset:016X}'.format (Offset = Offset))
+            print('  {Offset:016X}'.format(Offset=Offset))
         for FmpCapsuleImageHeader in self._FmpCapsuleImageHeaderList:
-            FmpCapsuleImageHeader.DumpInfo ()
+            FmpCapsuleImageHeader.DumpInfo()
diff --git a/BaseTools/Source/Python/Common/Uefi/Capsule/UefiCapsuleHeader.py b/BaseTools/Source/Python/Common/Uefi/Capsule/UefiCapsuleHeader.py
index 0e59028697ed..0ed4cbf5f73a 100644
--- a/BaseTools/Source/Python/Common/Uefi/Capsule/UefiCapsuleHeader.py
+++ b/BaseTools/Source/Python/Common/Uefi/Capsule/UefiCapsuleHeader.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Module that encodes and decodes a EFI_CAPSULE_HEADER with a payload
 #
 # Copyright (c) 2018, Intel Corporation. All rights reserved.<BR>
@@ -12,6 +12,7 @@ UefiCapsuleHeader
 import struct
 import uuid
 
+
 class UefiCapsuleHeaderClass (object):
     # typedef struct {
     #   ///
@@ -41,26 +42,27 @@ class UefiCapsuleHeaderClass (object):
     # #define CAPSULE_FLAGS_INITIATE_RESET                0x00040000
     #
     _StructFormat = '<16sIIII'
-    _StructSize   = struct.calcsize (_StructFormat)
+    _StructSize = struct.calcsize(_StructFormat)
 
-    EFI_FIRMWARE_MANAGEMENT_CAPSULE_ID_GUID = uuid.UUID ('6DCBD5ED-E82D-4C44-BDA1-7194199AD92A')
+    EFI_FIRMWARE_MANAGEMENT_CAPSULE_ID_GUID = uuid.UUID(
+        '6DCBD5ED-E82D-4C44-BDA1-7194199AD92A')
 
-    _CAPSULE_FLAGS_PERSIST_ACROSS_RESET  = 0x00010000
+    _CAPSULE_FLAGS_PERSIST_ACROSS_RESET = 0x00010000
     _CAPSULE_FLAGS_POPULATE_SYSTEM_TABLE = 0x00020000
-    _CAPSULE_FLAGS_INITIATE_RESET        = 0x00040000
+    _CAPSULE_FLAGS_INITIATE_RESET = 0x00040000
 
-    def __init__ (self):
-        self._Valid              = False
-        self.CapsuleGuid         = self.EFI_FIRMWARE_MANAGEMENT_CAPSULE_ID_GUID
-        self.HeaderSize          = self._StructSize
-        self.OemFlags            = 0x0000
-        self.PersistAcrossReset  = False
+    def __init__(self):
+        self._Valid = False
+        self.CapsuleGuid = self.EFI_FIRMWARE_MANAGEMENT_CAPSULE_ID_GUID
+        self.HeaderSize = self._StructSize
+        self.OemFlags = 0x0000
+        self.PersistAcrossReset = False
         self.PopulateSystemTable = False
-        self.InitiateReset       = False
-        self.CapsuleImageSize    = self.HeaderSize
-        self.Payload             = b''
+        self.InitiateReset = False
+        self.CapsuleImageSize = self.HeaderSize
+        self.Payload = b''
 
-    def Encode (self):
+    def Encode(self):
         Flags = self.OemFlags
         if self.PersistAcrossReset:
             Flags = Flags | self._CAPSULE_FLAGS_PERSIST_ACROSS_RESET
@@ -69,44 +71,46 @@ class UefiCapsuleHeaderClass (object):
         if self.InitiateReset:
             Flags = Flags | self._CAPSULE_FLAGS_INITIATE_RESET
 
-        self.CapsuleImageSize = self.HeaderSize + len (self.Payload)
+        self.CapsuleImageSize = self.HeaderSize + len(self.Payload)
 
-        UefiCapsuleHeader = struct.pack (
-                                     self._StructFormat,
-                                     self.CapsuleGuid.bytes_le,
-                                     self.HeaderSize,
-                                     Flags,
-                                     self.CapsuleImageSize,
-                                     0
-                                     )
+        UefiCapsuleHeader = struct.pack(
+            self._StructFormat,
+            self.CapsuleGuid.bytes_le,
+            self.HeaderSize,
+            Flags,
+            self.CapsuleImageSize,
+            0
+        )
         self._Valid = True
         return UefiCapsuleHeader + self.Payload
 
-    def Decode (self, Buffer):
-        if len (Buffer) < self._StructSize:
+    def Decode(self, Buffer):
+        if len(Buffer) < self._StructSize:
             raise ValueError
         (CapsuleGuid, HeaderSize, Flags, CapsuleImageSize, Reserved) = \
-            struct.unpack (
-                     self._StructFormat,
-                     Buffer[0:self._StructSize]
-                     )
+            struct.unpack(
+            self._StructFormat,
+            Buffer[0:self._StructSize]
+        )
         if HeaderSize < self._StructSize:
             raise ValueError
-        if CapsuleImageSize != len (Buffer):
+        if CapsuleImageSize != len(Buffer):
             raise ValueError
-        self.CapsuleGuid         = uuid.UUID (bytes_le = CapsuleGuid)
-        self.HeaderSize          = HeaderSize
-        self.OemFlags            = Flags & 0xffff
-        self.PersistAcrossReset  = (Flags & self._CAPSULE_FLAGS_PERSIST_ACROSS_RESET) != 0
-        self.PopulateSystemTable = (Flags & self._CAPSULE_FLAGS_POPULATE_SYSTEM_TABLE) != 0
-        self.InitiateReset       = (Flags & self._CAPSULE_FLAGS_INITIATE_RESET) != 0
-        self.CapsuleImageSize    = CapsuleImageSize
-        self.Payload             = Buffer[self.HeaderSize:]
+        self.CapsuleGuid = uuid.UUID(bytes_le=CapsuleGuid)
+        self.HeaderSize = HeaderSize
+        self.OemFlags = Flags & 0xffff
+        self.PersistAcrossReset = (
+            Flags & self._CAPSULE_FLAGS_PERSIST_ACROSS_RESET) != 0
+        self.PopulateSystemTable = (
+            Flags & self._CAPSULE_FLAGS_POPULATE_SYSTEM_TABLE) != 0
+        self.InitiateReset = (Flags & self._CAPSULE_FLAGS_INITIATE_RESET) != 0
+        self.CapsuleImageSize = CapsuleImageSize
+        self.Payload = Buffer[self.HeaderSize:]
 
-        self._Valid              = True
+        self._Valid = True
         return self.Payload
 
-    def DumpInfo (self):
+    def DumpInfo(self):
         if not self._Valid:
             raise ValueError
         Flags = self.OemFlags
@@ -116,15 +120,21 @@ class UefiCapsuleHeaderClass (object):
             Flags = Flags | self._CAPSULE_FLAGS_POPULATE_SYSTEM_TABLE
         if self.InitiateReset:
             Flags = Flags | self._CAPSULE_FLAGS_INITIATE_RESET
-        print ('EFI_CAPSULE_HEADER.CapsuleGuid      = {Guid}'.format (Guid = str(self.CapsuleGuid).upper()))
-        print ('EFI_CAPSULE_HEADER.HeaderSize       = {Size:08X}'.format (Size = self.HeaderSize))
-        print ('EFI_CAPSULE_HEADER.Flags            = {Flags:08X}'.format (Flags = Flags))
-        print ('  OEM Flags                         = {Flags:04X}'.format (Flags = self.OemFlags))
+        print('EFI_CAPSULE_HEADER.CapsuleGuid      = {Guid}'.format(
+            Guid=str(self.CapsuleGuid).upper()))
+        print('EFI_CAPSULE_HEADER.HeaderSize       = {Size:08X}'.format(
+            Size=self.HeaderSize))
+        print('EFI_CAPSULE_HEADER.Flags            = {Flags:08X}'.format(
+            Flags=Flags))
+        print('  OEM Flags                         = {Flags:04X}'.format(
+            Flags=self.OemFlags))
         if self.PersistAcrossReset:
-            print ('  CAPSULE_FLAGS_PERSIST_ACROSS_RESET')
+            print('  CAPSULE_FLAGS_PERSIST_ACROSS_RESET')
         if self.PopulateSystemTable:
-            print ('  CAPSULE_FLAGS_POPULATE_SYSTEM_TABLE')
+            print('  CAPSULE_FLAGS_POPULATE_SYSTEM_TABLE')
         if self.InitiateReset:
-            print ('  CAPSULE_FLAGS_INITIATE_RESET')
-        print ('EFI_CAPSULE_HEADER.CapsuleImageSize = {Size:08X}'.format (Size = self.CapsuleImageSize))
-        print ('sizeof (Payload)                    = {Size:08X}'.format (Size = len (self.Payload)))
+            print('  CAPSULE_FLAGS_INITIATE_RESET')
+        print('EFI_CAPSULE_HEADER.CapsuleImageSize = {Size:08X}'.format(
+            Size=self.CapsuleImageSize))
+        print('sizeof (Payload)                    = {Size:08X}'.format(
+            Size=len(self.Payload)))
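
A usage sketch, not part of the patch itself: assuming BaseTools/Source/Python is on sys.path, the two reformatted capsule modules above compose as shown below. The GUID, the payload bytes, and the variable names (FmpHeader, UefiHeader, Check) are made-up placeholders; the class and method names come straight from the code in the diff.

import uuid

from Common.Uefi.Capsule.FmpCapsuleHeader import FmpCapsuleHeaderClass
from Common.Uefi.Capsule.UefiCapsuleHeader import UefiCapsuleHeaderClass

# Build the inner FMP capsule: one payload image, no embedded drivers.
FmpHeader = FmpCapsuleHeaderClass()
FmpHeader.AddPayload(
    uuid.UUID('11111111-2222-3333-4444-555555555555'),  # placeholder UpdateImageTypeId
    Payload=b'firmware image bytes',                     # placeholder image data
    HardwareInstance=0
)

# Wrap it in an EFI_CAPSULE_HEADER with the usual reset flags set.
UefiHeader = UefiCapsuleHeaderClass()
UefiHeader.PersistAcrossReset = True
UefiHeader.InitiateReset = True
UefiHeader.Payload = FmpHeader.Encode()
CapsuleBytes = UefiHeader.Encode()

# Round trip: Decode() accepts the encoded buffer, DumpInfo() prints the header fields.
Check = UefiCapsuleHeaderClass()
Check.Decode(CapsuleBytes)
Check.DumpInfo()
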
diff --git a/BaseTools/Source/Python/Common/Uefi/Capsule/__init__.py b/BaseTools/Source/Python/Common/Uefi/Capsule/__init__.py
index 329adcd51216..0420c953aa1f 100644
--- a/BaseTools/Source/Python/Common/Uefi/Capsule/__init__.py
+++ b/BaseTools/Source/Python/Common/Uefi/Capsule/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'Common.Uefi.Capsule' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/Common/Uefi/__init__.py b/BaseTools/Source/Python/Common/Uefi/__init__.py
index 213883d04833..1a49d89e5970 100644
--- a/BaseTools/Source/Python/Common/Uefi/__init__.py
+++ b/BaseTools/Source/Python/Common/Uefi/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'Common.Uefi' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/Common/VariableAttributes.py b/BaseTools/Source/Python/Common/VariableAttributes.py
index 90c43f87d171..4477e747cc9e 100644
--- a/BaseTools/Source/Python/Common/VariableAttributes.py
+++ b/BaseTools/Source/Python/Common/VariableAttributes.py
@@ -13,11 +13,11 @@ class VariableAttributes(object):
     EFI_VARIABLE_RUNTIME_ACCESS = 0x00000004
     VAR_CHECK_VARIABLE_PROPERTY_READ_ONLY = 0x00000001
     VarAttributesMap = {
-                     "NV":EFI_VARIABLE_NON_VOLATILE,
-                     "BS":EFI_VARIABLE_BOOTSERVICE_ACCESS,
-                     "RT":EFI_VARIABLE_RUNTIME_ACCESS,
-                     "RO":VAR_CHECK_VARIABLE_PROPERTY_READ_ONLY
-                     }
+        "NV": EFI_VARIABLE_NON_VOLATILE,
+        "BS": EFI_VARIABLE_BOOTSERVICE_ACCESS,
+        "RT": EFI_VARIABLE_RUNTIME_ACCESS,
+        "RO": VAR_CHECK_VARIABLE_PROPERTY_READ_ONLY
+    }
 
     def __init__(self):
         pass
@@ -33,8 +33,10 @@ class VariableAttributes(object):
             if attr == 'RO':
                 VarProp = VariableAttributes.VAR_CHECK_VARIABLE_PROPERTY_READ_ONLY
             else:
-                VarAttr = VarAttr | VariableAttributes.VarAttributesMap.get(attr, 0x00000000)
+                VarAttr = VarAttr | VariableAttributes.VarAttributesMap.get(
+                    attr, 0x00000000)
         return VarAttr, VarProp
+
     @staticmethod
     def ValidateVarAttributes(var_attr_str):
         if not var_attr_str:
diff --git a/BaseTools/Source/Python/Common/VpdInfoFile.py b/BaseTools/Source/Python/Common/VpdInfoFile.py
index 4249b9f899e7..5886cb772e42 100644
--- a/BaseTools/Source/Python/Common/VpdInfoFile.py
+++ b/BaseTools/Source/Python/Common/VpdInfoFile.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # This package manage the VPD PCD information file which will be generated
 # by build tool's autogen.
@@ -21,7 +21,7 @@ from Common.Misc import SaveFileOnChange
 from Common.DataType import *
 
 FILE_COMMENT_TEMPLATE = \
-"""
+    """
 ## @file
 #
 #  THIS IS AUTO-GENERATED FILE BY BUILD TOOLS AND PLEASE DO NOT MAKE MODIFICATION.
@@ -40,7 +40,7 @@ FILE_COMMENT_TEMPLATE = \
 
 """
 
-## The class manage VpdInfoFile.
+# The class manage VpdInfoFile.
 #
 #  This file contains an ordered (based on position in the DSC file) list of the PCDs specified in the platform description file (DSC). The Value field that will be assigned to the PCD comes from the DSC file, INF file (if not defined in the DSC file) or the DEC file (if not defined in the INF file). This file is used as an input to the BPDG tool.
 #  Format for this file (using EBNF notation) is:
@@ -62,12 +62,15 @@ FILE_COMMENT_TEMPLATE = \
 #  <CArray>          ::=  "{" <HexNumber> ["," <HexNumber>]* "}"
 #  <NList>           ::=  <HexNumber> ["," <HexNumber>]*
 #
+
+
 class VpdInfoFile:
 
     _rVpdPcdLine = None
-    ## Constructor
+    # Constructor
+
     def __init__(self):
-        ## Dictionary for VPD in following format
+        # Dictionary for VPD in following format
         #
         #  Key    : PcdClassObject instance.
         #           @see BuildClassObject.PcdClassObject
@@ -75,7 +78,7 @@ class VpdInfoFile:
         self._VpdArray = {}
         self._VpdInfo = {}
 
-    ## Add a VPD PCD collected from platform's autogen when building.
+    # Add a VPD PCD collected from platform's autogen when building.
     #
     #  @param vpds  The list of VPD PCD collected for a platform.
     #               @see BuildClassObject.PcdClassObject
@@ -84,10 +87,12 @@ class VpdInfoFile:
     #
     def Add(self, Vpd, skuname, Offset):
         if (Vpd is None):
-            EdkLogger.error("VpdInfoFile", BuildToolError.ATTRIBUTE_UNKNOWN_ERROR, "Invalid VPD PCD entry.")
+            EdkLogger.error(
+                "VpdInfoFile", BuildToolError.ATTRIBUTE_UNKNOWN_ERROR, "Invalid VPD PCD entry.")
 
         if not (Offset >= "0" or Offset == TAB_STAR):
-            EdkLogger.error("VpdInfoFile", BuildToolError.PARAMETER_INVALID, "Invalid offset parameter: %s." % Offset)
+            EdkLogger.error("VpdInfoFile", BuildToolError.PARAMETER_INVALID,
+                            "Invalid offset parameter: %s." % Offset)
 
         if Vpd.DatumType == TAB_VOID:
             if Vpd.MaxDatumSize <= "0":
@@ -107,14 +112,14 @@ class VpdInfoFile:
             #
             self._VpdArray[Vpd] = {}
 
-        self._VpdArray[Vpd].update({skuname:Offset})
+        self._VpdArray[Vpd].update({skuname: Offset})
 
-
-    ## Generate VPD PCD information into a text file
+    # Generate VPD PCD information into a text file
     #
     #  If parameter FilePath is invalid, then assert.
     #  If
     #  @param FilePath        The given file path which would hold VPD information
+
     def Write(self, FilePath):
         if not (FilePath is not None or len(FilePath) != 0):
             EdkLogger.error("VpdInfoFile", BuildToolError.PARAMETER_INVALID,
@@ -130,15 +135,16 @@ class VpdInfoFile:
                     PcdTokenCName = PcdItem[0]
             for skuname in self._VpdArray[Pcd]:
                 PcdValue = str(Pcd.SkuInfoList[skuname].DefaultValue).strip()
-                if PcdValue == "" :
-                    PcdValue  = Pcd.DefaultValue
+                if PcdValue == "":
+                    PcdValue = Pcd.DefaultValue
 
-                Content += "%s.%s|%s|%s|%s|%s  \n" % (Pcd.TokenSpaceGuidCName, PcdTokenCName, skuname, str(self._VpdArray[Pcd][skuname]).strip(), str(Pcd.MaxDatumSize).strip(), PcdValue)
+                Content += "%s.%s|%s|%s|%s|%s  \n" % (Pcd.TokenSpaceGuidCName, PcdTokenCName, skuname, str(
+                    self._VpdArray[Pcd][skuname]).strip(), str(Pcd.MaxDatumSize).strip(), PcdValue)
                 i += 1
 
         return SaveFileOnChange(FilePath, Content, False)
 
-    ## Read an existing VPD PCD info file.
+    # Read an existing VPD PCD info file.
     #
     #  This routine will read VPD PCD information from existing file and construct
     #  internal PcdClassObject array.
@@ -162,17 +168,21 @@ class VpdInfoFile:
             # the line must follow output format defined in BPDG spec.
             #
             try:
-                PcdName, SkuId, Offset, Size, Value = Line.split("#")[0].split("|")
-                PcdName, SkuId, Offset, Size, Value = PcdName.strip(), SkuId.strip(), Offset.strip(), Size.strip(), Value.strip()
+                PcdName, SkuId, Offset, Size, Value = Line.split("#")[
+                    0].split("|")
+                PcdName, SkuId, Offset, Size, Value = PcdName.strip(
+                ), SkuId.strip(), Offset.strip(), Size.strip(), Value.strip()
                 TokenSpaceName, PcdTokenName = PcdName.split(".")
             except:
-                EdkLogger.error("BPDG", BuildToolError.PARSER_ERROR, "Fail to parse VPD information file %s" % FilePath)
+                EdkLogger.error("BPDG", BuildToolError.PARSER_ERROR,
+                                "Fail to parse VPD information file %s" % FilePath)
 
             Found = False
 
             if (TokenSpaceName, PcdTokenName) not in self._VpdInfo:
                 self._VpdInfo[(TokenSpaceName, PcdTokenName)] = {}
-            self._VpdInfo[(TokenSpaceName, PcdTokenName)][(SkuId, Offset)] = Value
+            self._VpdInfo[(TokenSpaceName, PcdTokenName)
+                          ][(SkuId, Offset)] = Value
             for VpdObject in self._VpdArray:
                 VpdObjectTokenCName = VpdObject.TokenCName
                 for PcdItem in GlobalData.MixedPcd:
@@ -182,13 +192,15 @@ class VpdInfoFile:
                     if VpdObject.TokenSpaceGuidCName == TokenSpaceName and VpdObjectTokenCName == PcdTokenName.strip() and sku == SkuId:
                         if self._VpdArray[VpdObject][sku] == TAB_STAR:
                             if Offset == TAB_STAR:
-                                EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID, "The offset of %s has not been fixed up by third-party BPDG tool." % PcdName)
+                                EdkLogger.error("BPDG", BuildToolError.FORMAT_INVALID,
+                                                "The offset of %s has not been fixed up by third-party BPDG tool." % PcdName)
                             self._VpdArray[VpdObject][sku] = Offset
                         Found = True
             if not Found:
-                EdkLogger.error("BPDG", BuildToolError.PARSER_ERROR, "Can not find PCD defined in VPD guid file.")
+                EdkLogger.error("BPDG", BuildToolError.PARSER_ERROR,
+                                "Can not find PCD defined in VPD guid file.")
 
-    ## Get count of VPD PCD collected from platform's autogen when building.
+    # Get count of VPD PCD collected from platform's autogen when building.
     #
     #  @return The integer count value
     def GetCount(self):
@@ -198,7 +210,7 @@ class VpdInfoFile:
 
         return Count
 
-    ## Get an offset value for a given VPD PCD
+    # Get an offset value for a given VPD PCD
     #
     #  Because BPDG only support one Sku, so only return offset for SKU default.
     #
@@ -211,18 +223,22 @@ class VpdInfoFile:
             return None
 
         return self._VpdArray[vpd]
+
     def GetVpdInfo(self, arg):
         (PcdTokenName, TokenSpaceName) = arg
-        return [(sku,offset,value) for (sku,offset),value in self._VpdInfo.get((TokenSpaceName, PcdTokenName)).items()]
+        return [(sku, offset, value) for (sku, offset), value in self._VpdInfo.get((TokenSpaceName, PcdTokenName)).items()]
 
-## Call external BPDG tool to process VPD file
+# Call external BPDG tool to process VPD file
 #
 #  @param ToolPath      The string path name for BPDG tool
 #  @param VpdFileName   The string path name for VPD information guid.txt
 #
+
+
 def CallExtenalBPDGTool(ToolPath, VpdFileName):
     assert ToolPath is not None, "Invalid parameter ToolPath"
-    assert VpdFileName is not None and os.path.exists(VpdFileName), "Invalid parameter VpdFileName"
+    assert VpdFileName is not None and os.path.exists(
+        VpdFileName), "Invalid parameter VpdFileName"
 
     OutputDir = os.path.dirname(VpdFileName)
     FileName = os.path.basename(VpdFileName)
@@ -232,24 +248,26 @@ def CallExtenalBPDGTool(ToolPath, VpdFileName):
 
     try:
         PopenObject = subprocess.Popen(' '.join([ToolPath,
-                                        '-o', OutputBinFileName,
-                                        '-m', OutputMapFileName,
-                                        '-q',
-                                        '-f',
-                                        VpdFileName]),
-                                        stdout=subprocess.PIPE,
-                                        stderr= subprocess.PIPE,
-                                        shell=True)
+                                                 '-o', OutputBinFileName,
+                                                 '-m', OutputMapFileName,
+                                                 '-q',
+                                                 '-f',
+                                                 VpdFileName]),
+                                       stdout=subprocess.PIPE,
+                                       stderr=subprocess.PIPE,
+                                       shell=True)
     except Exception as X:
-        EdkLogger.error("BPDG", BuildToolError.COMMAND_FAILURE, ExtraData=str(X))
+        EdkLogger.error("BPDG", BuildToolError.COMMAND_FAILURE,
+                        ExtraData=str(X))
     (out, error) = PopenObject.communicate()
     print(out.decode())
-    while PopenObject.returncode is None :
+    while PopenObject.returncode is None:
         PopenObject.wait()
 
     if PopenObject.returncode != 0:
-        EdkLogger.debug(EdkLogger.DEBUG_1, "Fail to call BPDG tool", str(error))
-        EdkLogger.error("BPDG", BuildToolError.COMMAND_FAILURE, "Fail to execute BPDG tool with exit code: %d, the error message is: \n %s" % \
-                            (PopenObject.returncode, str(error)))
+        EdkLogger.debug(EdkLogger.DEBUG_1,
+                        "Fail to call BPDG tool", str(error))
+        EdkLogger.error("BPDG", BuildToolError.COMMAND_FAILURE, "Fail to execute BPDG tool with exit code: %d, the error message is: \n %s" %
+                        (PopenObject.returncode, str(error)))
 
     return PopenObject.returncode
diff --git a/BaseTools/Source/Python/Common/__init__.py b/BaseTools/Source/Python/Common/__init__.py
index 7aaef6ae318c..b53b69b3d15d 100644
--- a/BaseTools/Source/Python/Common/__init__.py
+++ b/BaseTools/Source/Python/Common/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'Common' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/Common/caching.py b/BaseTools/Source/Python/Common/caching.py
index fda30f7321ef..1c2375026433 100644
--- a/BaseTools/Source/Python/Common/caching.py
+++ b/BaseTools/Source/Python/Common/caching.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # help with caching in BaseTools
 #
 # Copyright (c) 2018, Intel Corporation. All rights reserved.<BR>
@@ -6,36 +6,45 @@
 # SPDX-License-Identifier: BSD-2-Clause-Patent
 #
 
-## Import Modules
+# Import Modules
 #
 
 # for class function
 class cached_class_function(object):
     def __init__(self, function):
         self._function = function
+
     def __get__(self, obj, cls):
-        def CallMeHere(*args,**kwargs):
-            Value = self._function(obj, *args,**kwargs)
-            obj.__dict__[self._function.__name__] = lambda *args,**kwargs:Value
+        def CallMeHere(*args, **kwargs):
+            Value = self._function(obj, *args, **kwargs)
+            obj.__dict__[
+                self._function.__name__] = lambda *args, **kwargs: Value
             return Value
         return CallMeHere
 
 # for class property
+
+
 class cached_property(object):
     def __init__(self, function):
         self._function = function
+
     def __get__(self, obj, cls):
         Value = obj.__dict__[self._function.__name__] = self._function(obj)
         return Value
 
 # for non-class function
+
+
 class cached_basic_function(object):
     def __init__(self, function):
         self._function = function
     # wrapper to call _do since <class>.__dict__ doesn't support changing __call__
-    def __call__(self,*args,**kwargs):
-        return self._do(*args,**kwargs)
-    def _do(self,*args,**kwargs):
-        Value = self._function(*args,**kwargs)
-        self.__dict__['_do'] = lambda self,*args,**kwargs:Value
+
+    def __call__(self, *args, **kwargs):
+        return self._do(*args, **kwargs)
+
+    def _do(self, *args, **kwargs):
+        Value = self._function(*args, **kwargs)
+        self.__dict__['_do'] = lambda self, *args, **kwargs: Value
         return Value
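
Another small sketch, again not from the patch, of how the cached_property descriptor reformatted above behaves: the first attribute access calls the wrapped function and stores the result in the instance __dict__, so later accesses skip the call. The Build class, its sources argument, and the SourceCount property are invented for illustration; only Common.caching comes from this tree (assuming BaseTools/Source/Python is importable).

from Common.caching import cached_property

class Build(object):
    def __init__(self, sources):
        self.sources = sources

    @cached_property
    def SourceCount(self):
        print('counting sources')   # executed only on the first access
        return len(self.sources)

b = Build(['a.c', 'b.c'])
print(b.SourceCount)   # prints 'counting sources', then 2
print(b.SourceCount)   # prints 2 only; the value now lives in b.__dict__
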
diff --git a/BaseTools/Source/Python/CommonDataClass/CommonClass.py b/BaseTools/Source/Python/CommonDataClass/CommonClass.py
index bcf52c7b75dd..920703b783a5 100644
--- a/BaseTools/Source/Python/CommonDataClass/CommonClass.py
+++ b/BaseTools/Source/Python/CommonDataClass/CommonClass.py
@@ -1,11 +1,11 @@
-## @file
+# @file
 # This file is used to define common items of class object
 #
 # Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
 # SPDX-License-Identifier: BSD-2-Clause-Patent
 
 
-## SkuInfoClass
+# SkuInfoClass
 #
 # This class defined SkuInfo item used in Module/Platform/Package files
 #
@@ -29,8 +29,8 @@
 # @var DefaultValue:       To store value for DefaultValue
 #
 class SkuInfoClass(object):
-    def __init__(self, SkuIdName = '', SkuId = '', VariableName = '', VariableGuid = '', VariableOffset = '',
-                 HiiDefaultValue = '', VpdOffset = '', DefaultValue = '', VariableGuidValue = '', VariableAttribute = '', DefaultStore = None):
+    def __init__(self, SkuIdName='', SkuId='', VariableName='', VariableGuid='', VariableOffset='',
+                 HiiDefaultValue='', VpdOffset='', DefaultValue='', VariableGuidValue='', VariableAttribute='', DefaultStore=None):
         self.SkuIdName = SkuIdName
         self.SkuId = SkuId
 
@@ -57,7 +57,7 @@ class SkuInfoClass(object):
         #
         self.DefaultValue = DefaultValue
 
-    ## Convert the class to a string
+    # Convert the class to a string
     #
     #  Convert each member of the class to string
     #  Organize to a single line format string
@@ -66,16 +66,16 @@ class SkuInfoClass(object):
     #
     def __str__(self):
         Rtn = 'SkuId = ' + str(self.SkuId) + "," + \
-                    'SkuIdName = ' + str(self.SkuIdName) + "," + \
-                    'VariableName = ' + str(self.VariableName) + "," + \
-                    'VariableGuid = ' + str(self.VariableGuid) + "," + \
-                    'VariableOffset = ' + str(self.VariableOffset) + "," + \
-                    'HiiDefaultValue = ' + str(self.HiiDefaultValue) + "," + \
-                    'VpdOffset = ' + str(self.VpdOffset) + "," + \
-                    'DefaultValue = ' + str(self.DefaultValue) + ","
+            'SkuIdName = ' + str(self.SkuIdName) + "," + \
+            'VariableName = ' + str(self.VariableName) + "," + \
+            'VariableGuid = ' + str(self.VariableGuid) + "," + \
+            'VariableOffset = ' + str(self.VariableOffset) + "," + \
+            'HiiDefaultValue = ' + str(self.HiiDefaultValue) + "," + \
+            'VpdOffset = ' + str(self.VpdOffset) + "," + \
+            'DefaultValue = ' + str(self.DefaultValue) + ","
         return Rtn
 
-    def __deepcopy__(self,memo):
+    def __deepcopy__(self, memo):
         new_sku = SkuInfoClass()
         new_sku.SkuIdName = self.SkuIdName
         new_sku.SkuId = self.SkuId
@@ -85,7 +85,8 @@ class SkuInfoClass(object):
         new_sku.VariableOffset = self.VariableOffset
         new_sku.HiiDefaultValue = self.HiiDefaultValue
         new_sku.VariableAttribute = self.VariableAttribute
-        new_sku.DefaultStoreDict = {key:value for key,value in self.DefaultStoreDict.items()}
+        new_sku.DefaultStoreDict = {
+            key: value for key, value in self.DefaultStoreDict.items()}
         new_sku.VpdOffset = self.VpdOffset
         new_sku.DefaultValue = self.DefaultValue
         return new_sku
diff --git a/BaseTools/Source/Python/CommonDataClass/DataClass.py b/BaseTools/Source/Python/CommonDataClass/DataClass.py
index 6f35bd4c8e8f..54b42f2806be 100644
--- a/BaseTools/Source/Python/CommonDataClass/DataClass.py
+++ b/BaseTools/Source/Python/CommonDataClass/DataClass.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define class for data structure used in ECC
 #
 # Copyright (c) 2008 - 2014, Intel Corporation. All rights reserved.<BR>
@@ -73,14 +73,14 @@ MODEL_PCD_DYNAMIC_DEFAULT = 4009
 MODEL_PCD_DYNAMIC_VPD = 4010
 MODEL_PCD_DYNAMIC_HII = 4011
 MODEL_PCD_TYPE_LIST = [MODEL_PCD_FIXED_AT_BUILD,
-                        MODEL_PCD_PATCHABLE_IN_MODULE,
-                        MODEL_PCD_FEATURE_FLAG,
-                        MODEL_PCD_DYNAMIC_DEFAULT,
-                        MODEL_PCD_DYNAMIC_HII,
-                        MODEL_PCD_DYNAMIC_VPD,
-                        MODEL_PCD_DYNAMIC_EX_DEFAULT,
-                        MODEL_PCD_DYNAMIC_EX_HII,
-                        MODEL_PCD_DYNAMIC_EX_VPD
+                       MODEL_PCD_PATCHABLE_IN_MODULE,
+                       MODEL_PCD_FEATURE_FLAG,
+                       MODEL_PCD_DYNAMIC_DEFAULT,
+                       MODEL_PCD_DYNAMIC_HII,
+                       MODEL_PCD_DYNAMIC_VPD,
+                       MODEL_PCD_DYNAMIC_EX_DEFAULT,
+                       MODEL_PCD_DYNAMIC_EX_HII,
+                       MODEL_PCD_DYNAMIC_EX_VPD
                        ]
 
 MODEL_META_DATA_HEADER_COMMENT = 5000
@@ -119,13 +119,15 @@ MODEL_LIST = [('MODEL_UNKNOWN', MODEL_UNKNOWN),
               ('MODEL_FILE_CIF', MODEL_FILE_CIF),
               ('MODEL_FILE_OTHERS', MODEL_FILE_OTHERS),
               ('MODEL_IDENTIFIER_FILE_HEADER', MODEL_IDENTIFIER_FILE_HEADER),
-              ('MODEL_IDENTIFIER_FUNCTION_HEADER', MODEL_IDENTIFIER_FUNCTION_HEADER),
+              ('MODEL_IDENTIFIER_FUNCTION_HEADER',
+               MODEL_IDENTIFIER_FUNCTION_HEADER),
               ('MODEL_IDENTIFIER_COMMENT', MODEL_IDENTIFIER_COMMENT),
               ('MODEL_IDENTIFIER_PARAMETER', MODEL_IDENTIFIER_PARAMETER),
               ('MODEL_IDENTIFIER_STRUCTURE', MODEL_IDENTIFIER_STRUCTURE),
               ('MODEL_IDENTIFIER_VARIABLE', MODEL_IDENTIFIER_VARIABLE),
               ('MODEL_IDENTIFIER_INCLUDE', MODEL_IDENTIFIER_INCLUDE),
-              ('MODEL_IDENTIFIER_PREDICATE_EXPRESSION', MODEL_IDENTIFIER_PREDICATE_EXPRESSION),
+              ('MODEL_IDENTIFIER_PREDICATE_EXPRESSION',
+               MODEL_IDENTIFIER_PREDICATE_EXPRESSION),
               ('MODEL_IDENTIFIER_ENUMERATE', MODEL_IDENTIFIER_ENUMERATE),
               ('MODEL_IDENTIFIER_PCD', MODEL_IDENTIFIER_PCD),
               ('MODEL_IDENTIFIER_UNION', MODEL_IDENTIFIER_UNION),
@@ -134,10 +136,13 @@ MODEL_LIST = [('MODEL_UNKNOWN', MODEL_UNKNOWN),
               ('MODEL_IDENTIFIER_MACRO_DEFINE', MODEL_IDENTIFIER_MACRO_DEFINE),
               ('MODEL_IDENTIFIER_MACRO_ENDIF', MODEL_IDENTIFIER_MACRO_ENDIF),
               ('MODEL_IDENTIFIER_MACRO_PROGMA', MODEL_IDENTIFIER_MACRO_PROGMA),
-              ('MODEL_IDENTIFIER_FUNCTION_CALLING', MODEL_IDENTIFIER_FUNCTION_CALLING),
+              ('MODEL_IDENTIFIER_FUNCTION_CALLING',
+               MODEL_IDENTIFIER_FUNCTION_CALLING),
               ('MODEL_IDENTIFIER_TYPEDEF', MODEL_IDENTIFIER_TYPEDEF),
-              ('MODEL_IDENTIFIER_FUNCTION_DECLARATION', MODEL_IDENTIFIER_FUNCTION_DECLARATION),
-              ('MODEL_IDENTIFIER_ASSIGNMENT_EXPRESSION', MODEL_IDENTIFIER_ASSIGNMENT_EXPRESSION),
+              ('MODEL_IDENTIFIER_FUNCTION_DECLARATION',
+               MODEL_IDENTIFIER_FUNCTION_DECLARATION),
+              ('MODEL_IDENTIFIER_ASSIGNMENT_EXPRESSION',
+               MODEL_IDENTIFIER_ASSIGNMENT_EXPRESSION),
               ('MODEL_EFI_PROTOCOL', MODEL_EFI_PROTOCOL),
               ('MODEL_EFI_PPI', MODEL_EFI_PPI),
               ('MODEL_EFI_GUID', MODEL_EFI_GUID),
@@ -165,20 +170,25 @@ MODEL_LIST = [('MODEL_UNKNOWN', MODEL_UNKNOWN),
               ("MODEL_META_DATA_HEADER", MODEL_META_DATA_HEADER),
               ("MODEL_META_DATA_INCLUDE", MODEL_META_DATA_INCLUDE),
               ("MODEL_META_DATA_DEFINE", MODEL_META_DATA_DEFINE),
-              ("MODEL_META_DATA_CONDITIONAL_STATEMENT_IF", MODEL_META_DATA_CONDITIONAL_STATEMENT_IF),
-              ("MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSE", MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSE),
-              ("MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF", MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF),
-              ("MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF", MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF),
-              ("MODEL_META_DATA_CONDITIONAL_STATEMENT_ERROR", MODEL_META_DATA_CONDITIONAL_STATEMENT_ERROR),
+              ("MODEL_META_DATA_CONDITIONAL_STATEMENT_IF",
+               MODEL_META_DATA_CONDITIONAL_STATEMENT_IF),
+              ("MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSE",
+               MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSE),
+              ("MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF",
+               MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF),
+              ("MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF",
+               MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF),
+              ("MODEL_META_DATA_CONDITIONAL_STATEMENT_ERROR",
+               MODEL_META_DATA_CONDITIONAL_STATEMENT_ERROR),
               ("MODEL_META_DATA_BUILD_OPTION", MODEL_META_DATA_BUILD_OPTION),
               ("MODEL_META_DATA_COMPONENT", MODEL_META_DATA_COMPONENT),
               ('MODEL_META_DATA_USER_EXTENSION', MODEL_META_DATA_USER_EXTENSION),
               ('MODEL_META_DATA_PACKAGE', MODEL_META_DATA_PACKAGE),
               ('MODEL_META_DATA_NMAKE', MODEL_META_DATA_NMAKE),
               ('MODEL_META_DATA_COMMENT', MODEL_META_DATA_COMMENT)
-             ]
+              ]
 
-## FunctionClass
+# FunctionClass
 #
 # This class defines a structure of a function
 #
@@ -212,12 +222,14 @@ MODEL_LIST = [('MODEL_UNKNOWN', MODEL_UNKNOWN),
 # @var IdentifierList:     IdentifierList of a File
 # @var PcdList:            PcdList of a File
 #
+
+
 class FunctionClass(object):
-    def __init__(self, ID = -1, Header = '', Modifier = '', Name = '', ReturnStatement = '', \
-                 StartLine = -1, StartColumn = -1, EndLine = -1, EndColumn = -1, \
-                 BodyStartLine = -1, BodyStartColumn = -1, BelongsToFile = -1, \
-                 IdentifierList = [], PcdList = [], \
-                 FunNameStartLine = -1, FunNameStartColumn = -1):
+    def __init__(self, ID=-1, Header='', Modifier='', Name='', ReturnStatement='',
+                 StartLine=-1, StartColumn=-1, EndLine=-1, EndColumn=-1,
+                 BodyStartLine=-1, BodyStartColumn=-1, BelongsToFile=-1,
+                 IdentifierList=[], PcdList=[],
+                 FunNameStartLine=-1, FunNameStartColumn=-1):
         self.ID = ID
         self.Header = Header
         self.Modifier = Modifier
@@ -236,7 +248,7 @@ class FunctionClass(object):
         self.IdentifierList = IdentifierList
         self.PcdList = PcdList
 
-## IdentifierClass
+# IdentifierClass
 #
 # This class defines a structure of a variable
 #
@@ -266,9 +278,11 @@ class FunctionClass(object):
 # @var EndLine:              EndLine of a Identifier
 # @var EndColumn:            EndColumn of a Identifier
 #
+
+
 class IdentifierClass(object):
-    def __init__(self, ID = -1, Modifier = '', Type = '', Name = '', Value = '', Model = MODEL_UNKNOWN, \
-                 BelongsToFile = -1, BelongsToFunction = -1, StartLine = -1, StartColumn = -1, EndLine = -1, EndColumn = -1):
+    def __init__(self, ID=-1, Modifier='', Type='', Name='', Value='', Model=MODEL_UNKNOWN,
+                 BelongsToFile=-1, BelongsToFunction=-1, StartLine=-1, StartColumn=-1, EndLine=-1, EndColumn=-1):
         self.ID = ID
         self.Modifier = Modifier
         self.Type = Type
@@ -282,7 +296,7 @@ class IdentifierClass(object):
         self.EndLine = EndLine
         self.EndColumn = EndColumn
 
-## PcdClass
+# PcdClass
 #
 # This class defines a structure of a Pcd
 #
@@ -312,9 +326,11 @@ class IdentifierClass(object):
 # @var EndLine:                EndLine of a Pcd
 # @var EndColumn:              EndColumn of a Pcd
 #
+
+
 class PcdDataClass(object):
-    def __init__(self, ID = -1, CName = '', TokenSpaceGuidCName = '', Token = '', DatumType = '', Model = MODEL_UNKNOWN, \
-                 BelongsToFile = -1, BelongsToFunction = -1, StartLine = -1, StartColumn = -1, EndLine = -1, EndColumn = -1):
+    def __init__(self, ID=-1, CName='', TokenSpaceGuidCName='', Token='', DatumType='', Model=MODEL_UNKNOWN,
+                 BelongsToFile=-1, BelongsToFunction=-1, StartLine=-1, StartColumn=-1, EndLine=-1, EndColumn=-1):
         self.ID = ID
         self.CName = CName
         self.TokenSpaceGuidCName = TokenSpaceGuidCName
@@ -327,7 +343,7 @@ class PcdDataClass(object):
         self.EndLine = EndLine
         self.EndColumn = EndColumn
 
-## FileClass
+# FileClass
 #
 # This class defines a structure of a file
 #
@@ -353,9 +369,11 @@ class PcdDataClass(object):
 # @var IdentifierList:    IdentifierList of a File
 # @var PcdList:           PcdList of a File
 #
+
+
 class FileClass(object):
-    def __init__(self, ID = -1, Name = '', ExtName = '', Path = '', FullPath = '', Model = MODEL_UNKNOWN, TimeStamp = '', \
-                 FunctionList = [], IdentifierList = [], PcdList = []):
+    def __init__(self, ID=-1, Name='', ExtName='', Path='', FullPath='', Model=MODEL_UNKNOWN, TimeStamp='',
+                 FunctionList=[], IdentifierList=[], PcdList=[]):
         self.ID = ID
         self.Name = Name
         self.ExtName = ExtName
diff --git a/BaseTools/Source/Python/CommonDataClass/Exceptions.py b/BaseTools/Source/Python/CommonDataClass/Exceptions.py
index 4489b757e881..3d31d7b074ea 100644
--- a/BaseTools/Source/Python/CommonDataClass/Exceptions.py
+++ b/BaseTools/Source/Python/CommonDataClass/Exceptions.py
@@ -1,23 +1,27 @@
-## @file
+# @file
 # This file is used to define common Exceptions class used in python tools
 #
 # Copyright (c) 2011, Intel Corporation. All rights reserved.<BR>
 # SPDX-License-Identifier: BSD-2-Clause-Patent
 
-## Exceptions used in Expression
+# Exceptions used in Expression
 class EvaluationException(Exception):
     pass
 
+
 class BadExpression(EvaluationException):
     pass
 
+
 class WrnExpression(Exception):
     pass
 
-## Exceptions used in macro replacements
+# Exceptions used in macro replacements
+
+
 class MacroException(Exception):
     pass
 
+
 class SymbolNotFound(MacroException):
     pass
-
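
For reviewers who want to reproduce this kind of reformatting on a single
file, the following is a minimal sketch using autopep8's Python API. The
file path and the use of fix_code() with its default options are
illustrative assumptions for this sketch, not details taken from the patch
itself.

    import autopep8

    # Illustrative target; substitute any BaseTools Python source file.
    path = "BaseTools/Source/Python/CommonDataClass/Exceptions.py"

    with open(path, "r", encoding="utf-8") as f:
        original = f.read()

    # fix_code() returns a PEP 8-conformant copy of the source string;
    # with default options it applies only autopep8's safe fixes.
    formatted = autopep8.fix_code(original)

    if formatted != original:
        with open(path, "w", encoding="utf-8") as f:
            f.write(formatted)

Diffing the rewritten file against the original shows the same style of
whitespace changes as the hunks above.
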
diff --git a/BaseTools/Source/Python/CommonDataClass/FdfClass.py b/BaseTools/Source/Python/CommonDataClass/FdfClass.py
index 2fbb7b436a9b..de1edb6f04fe 100644
--- a/BaseTools/Source/Python/CommonDataClass/FdfClass.py
+++ b/BaseTools/Source/Python/CommonDataClass/FdfClass.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # classes represent data in FDF
 #
 #  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -6,11 +6,11 @@
 #  SPDX-License-Identifier: BSD-2-Clause-Patent
 #
 
-## FD data in FDF
+# FD data in FDF
 #
 #
 class FDClassObject:
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -30,11 +30,13 @@ class FDClassObject:
         self.SetVarDict = {}
         self.RegionList = []
 
-## FFS data in FDF
+# FFS data in FDF
 #
 #
+
+
 class FfsClassObject:
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -45,11 +47,13 @@ class FfsClassObject:
         self.Alignment = None
         self.SectionList = []
 
-## FILE statement data in FDF
+# FILE statement data in FDF
 #
 #
-class FileStatementClassObject (FfsClassObject) :
-    ## The constructor
+
+
+class FileStatementClassObject (FfsClassObject):
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -63,11 +67,13 @@ class FileStatementClassObject (FfsClassObject) :
         self.DefineVarDict = {}
         self.KeepReloc = None
 
-## INF statement data in FDF
+# INF statement data in FDF
 #
 #
+
+
 class FfsInfStatementClassObject(FfsClassObject):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -82,22 +88,26 @@ class FfsInfStatementClassObject(FfsClassObject):
         self.KeepReloc = None
         self.UseArch = None
 
-## section data in FDF
+# section data in FDF
 #
 #
+
+
 class SectionClassObject:
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
     def __init__(self):
         self.Alignment = None
 
-## Depex expression section in FDF
+# Depex expression section in FDF
 #
 #
+
+
 class DepexSectionClassObject (SectionClassObject):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -106,11 +116,13 @@ class DepexSectionClassObject (SectionClassObject):
         self.Expression = None
         self.ExpressionProcessed = False
 
-## Compress section data in FDF
+# Compress section data in FDF
 #
 #
-class CompressSectionClassObject (SectionClassObject) :
-    ## The constructor
+
+
+class CompressSectionClassObject (SectionClassObject):
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -119,11 +131,13 @@ class CompressSectionClassObject (SectionClassObject) :
         self.CompType = None
         self.SectionList = []
 
-## Data section data in FDF
+# Data section data in FDF
 #
 #
+
+
 class DataSectionClassObject (SectionClassObject):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -134,11 +148,13 @@ class DataSectionClassObject (SectionClassObject):
         self.SectionList = []
         self.KeepReloc = True
 
-## Rule section data in FDF
+# Rule section data in FDF
 #
 #
+
+
 class EfiSectionClassObject (SectionClassObject):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -153,11 +169,13 @@ class EfiSectionClassObject (SectionClassObject):
         self.BuildNum = None
         self.KeepReloc = None
 
-## FV image section data in FDF
+# FV image section data in FDF
 #
 #
+
+
 class FvImageSectionClassObject (SectionClassObject):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -170,11 +188,13 @@ class FvImageSectionClassObject (SectionClassObject):
         self.FvFileExtension = None
         self.FvAddr = None
 
-## GUIDed section data in FDF
+# GUIDed section data in FDF
 #
 #
-class GuidSectionClassObject (SectionClassObject) :
-    ## The constructor
+
+
+class GuidSectionClassObject (SectionClassObject):
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -190,11 +210,13 @@ class GuidSectionClassObject (SectionClassObject) :
         self.FvParentAddr = None
         self.IncludeFvSection = False
 
-## UI section data in FDF
+# UI section data in FDF
 #
 #
+
+
 class UiSectionClassObject (SectionClassObject):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -203,11 +225,13 @@ class UiSectionClassObject (SectionClassObject):
         self.StringData = None
         self.FileName = None
 
-## Version section data in FDF
+# Version section data in FDF
 #
 #
+
+
 class VerSectionClassObject (SectionClassObject):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -217,11 +241,13 @@ class VerSectionClassObject (SectionClassObject):
         self.StringData = None
         self.FileName = None
 
-## Rule data in FDF
+# Rule data in FDF
 #
 #
-class RuleClassObject :
-    ## The constructor
+
+
+class RuleClassObject:
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -238,11 +264,13 @@ class RuleClassObject :
         self.KeyStringList = []
         self.KeepReloc = None
 
-## Complex rule data in FDF
+# Complex rule data in FDF
 #
 #
-class RuleComplexFileClassObject(RuleClassObject) :
-    ## The constructor
+
+
+class RuleComplexFileClassObject(RuleClassObject):
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -250,11 +278,13 @@ class RuleComplexFileClassObject(RuleClassObject) :
         RuleClassObject.__init__(self)
         self.SectionList = []
 
-## Simple rule data in FDF
+# Simple rule data in FDF
 #
 #
-class RuleSimpleFileClassObject(RuleClassObject) :
-    ## The constructor
+
+
+class RuleSimpleFileClassObject(RuleClassObject):
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -264,11 +294,13 @@ class RuleSimpleFileClassObject(RuleClassObject) :
         self.SectionType = ''
         self.FileExtension = None
 
-## File extension rule data in FDF
+# File extension rule data in FDF
 #
 #
+
+
 class RuleFileExtensionClassObject(RuleClassObject):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -276,11 +308,13 @@ class RuleFileExtensionClassObject(RuleClassObject):
         RuleClassObject.__init__(self)
         self.FileExtension = None
 
-## Capsule data in FDF
+# Capsule data in FDF
 #
 #
-class CapsuleClassObject :
-    ## The constructor
+
+
+class CapsuleClassObject:
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -298,15 +332,16 @@ class CapsuleClassObject :
         self.CapsuleDataList = []
         self.FmpPayloadList = []
 
-## OptionROM data in FDF
+# OptionROM data in FDF
 #
 #
+
+
 class OptionRomClassObject:
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
     def __init__(self):
         self.DriverName = None
         self.FfsList = []
-
diff --git a/BaseTools/Source/Python/CommonDataClass/__init__.py b/BaseTools/Source/Python/CommonDataClass/__init__.py
index 81af4c07a36e..b56969e90fe2 100644
--- a/BaseTools/Source/Python/CommonDataClass/__init__.py
+++ b/BaseTools/Source/Python/CommonDataClass/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'CommonDataClass' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
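
To gauge how much of a package conforms to PEP 8 after a pass like this,
a short pycodestyle check can be run over the reformatted tree. The target
directory below is an illustrative assumption; any of the reformatted
BaseTools paths could be substituted.

    import pycodestyle

    # quiet=True suppresses per-line output; only the count is printed.
    # StyleGuide accepts files or directories and walks them recursively.
    style = pycodestyle.StyleGuide(quiet=True)
    report = style.check_files(["BaseTools/Source/Python/CommonDataClass"])
    print("remaining PEP 8 violations:", report.total_errors)

A non-zero count points at remaining style issues that the formatter did
not rewrite.
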
diff --git a/BaseTools/Source/Python/Ecc/CParser3/CLexer.py b/BaseTools/Source/Python/Ecc/CParser3/CLexer.py
index ca03adea7a65..c48cb2404bf0 100644
--- a/BaseTools/Source/Python/Ecc/CParser3/CLexer.py
+++ b/BaseTools/Source/Python/Ecc/CParser3/CLexer.py
@@ -3,7 +3,7 @@
 from antlr3 import *
 from antlr3.compat import set, frozenset
 
-## @file
+# @file
 # The file defines the Lexer for C source files.
 #
 # THIS FILE IS AUTO-GENERATED. PLEASE DO NOT MODIFY THIS FILE.
@@ -17,127 +17,127 @@ from antlr3.compat import set, frozenset
 ##
 
 
-
 # for convenience in actions
 HIDDEN = BaseRecognizer.HIDDEN
 
 # token types
-T114=114
-T115=115
-T116=116
-T117=117
-FloatTypeSuffix=16
-LETTER=11
-T29=29
-T28=28
-T27=27
-T26=26
-T25=25
-EOF=-1
-STRING_LITERAL=9
-FLOATING_POINT_LITERAL=10
-T38=38
-T37=37
-T39=39
-T34=34
-COMMENT=22
-T33=33
-T36=36
-T35=35
-T30=30
-T32=32
-T31=31
-LINE_COMMENT=23
-IntegerTypeSuffix=14
-CHARACTER_LITERAL=8
-T49=49
-T48=48
-T100=100
-T43=43
-T42=42
-T102=102
-T41=41
-T101=101
-T40=40
-T47=47
-T46=46
-T45=45
-T44=44
-T109=109
-T107=107
-T108=108
-T105=105
-WS=19
-T106=106
-T103=103
-T104=104
-T50=50
-LINE_COMMAND=24
-T59=59
-T113=113
-T52=52
-T112=112
-T51=51
-T111=111
-T54=54
-T110=110
-EscapeSequence=12
-DECIMAL_LITERAL=7
-T53=53
-T56=56
-T55=55
-T58=58
-T57=57
-T75=75
-T76=76
-T73=73
-T74=74
-T79=79
-T77=77
-T78=78
-Exponent=15
-HexDigit=13
-T72=72
-T71=71
-T70=70
-T62=62
-T63=63
-T64=64
-T65=65
-T66=66
-T67=67
-T68=68
-T69=69
-IDENTIFIER=4
-UnicodeVocabulary=21
-HEX_LITERAL=5
-T61=61
-T60=60
-T99=99
-T97=97
-BS=20
-T98=98
-T95=95
-T96=96
-OCTAL_LITERAL=6
-T94=94
-Tokens=118
-T93=93
-T92=92
-T91=91
-T90=90
-T88=88
-T89=89
-T84=84
-T85=85
-T86=86
-T87=87
-UnicodeEscape=18
-T81=81
-T80=80
-T83=83
-OctalEscape=17
-T82=82
+T114 = 114
+T115 = 115
+T116 = 116
+T117 = 117
+FloatTypeSuffix = 16
+LETTER = 11
+T29 = 29
+T28 = 28
+T27 = 27
+T26 = 26
+T25 = 25
+EOF = -1
+STRING_LITERAL = 9
+FLOATING_POINT_LITERAL = 10
+T38 = 38
+T37 = 37
+T39 = 39
+T34 = 34
+COMMENT = 22
+T33 = 33
+T36 = 36
+T35 = 35
+T30 = 30
+T32 = 32
+T31 = 31
+LINE_COMMENT = 23
+IntegerTypeSuffix = 14
+CHARACTER_LITERAL = 8
+T49 = 49
+T48 = 48
+T100 = 100
+T43 = 43
+T42 = 42
+T102 = 102
+T41 = 41
+T101 = 101
+T40 = 40
+T47 = 47
+T46 = 46
+T45 = 45
+T44 = 44
+T109 = 109
+T107 = 107
+T108 = 108
+T105 = 105
+WS = 19
+T106 = 106
+T103 = 103
+T104 = 104
+T50 = 50
+LINE_COMMAND = 24
+T59 = 59
+T113 = 113
+T52 = 52
+T112 = 112
+T51 = 51
+T111 = 111
+T54 = 54
+T110 = 110
+EscapeSequence = 12
+DECIMAL_LITERAL = 7
+T53 = 53
+T56 = 56
+T55 = 55
+T58 = 58
+T57 = 57
+T75 = 75
+T76 = 76
+T73 = 73
+T74 = 74
+T79 = 79
+T77 = 77
+T78 = 78
+Exponent = 15
+HexDigit = 13
+T72 = 72
+T71 = 71
+T70 = 70
+T62 = 62
+T63 = 63
+T64 = 64
+T65 = 65
+T66 = 66
+T67 = 67
+T68 = 68
+T69 = 69
+IDENTIFIER = 4
+UnicodeVocabulary = 21
+HEX_LITERAL = 5
+T61 = 61
+T60 = 60
+T99 = 99
+T97 = 97
+BS = 20
+T98 = 98
+T95 = 95
+T96 = 96
+OCTAL_LITERAL = 6
+T94 = 94
+Tokens = 118
+T93 = 93
+T92 = 92
+T91 = 91
+T90 = 90
+T88 = 88
+T89 = 89
+T84 = 84
+T85 = 85
+T86 = 86
+T87 = 87
+UnicodeEscape = 18
+T81 = 81
+T80 = 80
+T83 = 83
+OctalEscape = 17
+T82 = 82
+
 
 class CLexer(Lexer):
 
@@ -147,31 +147,27 @@ class CLexer(Lexer):
         Lexer.__init__(self, input)
         self.dfa25 = self.DFA25(
             self, 25,
-            eot = self.DFA25_eot,
-            eof = self.DFA25_eof,
-            min = self.DFA25_min,
-            max = self.DFA25_max,
-            accept = self.DFA25_accept,
-            special = self.DFA25_special,
-            transition = self.DFA25_transition
-            )
+            eot=self.DFA25_eot,
+            eof=self.DFA25_eof,
+            min=self.DFA25_min,
+            max=self.DFA25_max,
+            accept=self.DFA25_accept,
+            special=self.DFA25_special,
+            transition=self.DFA25_transition
+        )
         self.dfa35 = self.DFA35(
             self, 35,
-            eot = self.DFA35_eot,
-            eof = self.DFA35_eof,
-            min = self.DFA35_min,
-            max = self.DFA35_max,
-            accept = self.DFA35_accept,
-            special = self.DFA35_special,
-            transition = self.DFA35_transition
-            )
-
-
-
-
-
+            eot=self.DFA35_eot,
+            eof=self.DFA35_eof,
+            min=self.DFA35_min,
+            max=self.DFA35_max,
+            accept=self.DFA35_accept,
+            special=self.DFA35_special,
+            transition=self.DFA35_transition
+        )
 
     # $ANTLR start T25
+
     def mT25(self, ):
 
         try:
@@ -181,19 +177,14 @@ class CLexer(Lexer):
             # C.g:27:7: ';'
             self.match(u';')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T25
 
-
-
     # $ANTLR start T26
+
     def mT26(self, ):
 
         try:
@@ -203,20 +194,14 @@ class CLexer(Lexer):
             # C.g:28:7: 'typedef'
             self.match("typedef")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T26
 
-
-
     # $ANTLR start T27
+
     def mT27(self, ):
 
         try:
@@ -226,19 +211,14 @@ class CLexer(Lexer):
             # C.g:29:7: ','
             self.match(u',')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T27
 
-
-
     # $ANTLR start T28
+
     def mT28(self, ):
 
         try:
@@ -248,19 +228,14 @@ class CLexer(Lexer):
             # C.g:30:7: '='
             self.match(u'=')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T28
 
-
-
     # $ANTLR start T29
+
     def mT29(self, ):
 
         try:
@@ -270,20 +245,14 @@ class CLexer(Lexer):
             # C.g:31:7: 'extern'
             self.match("extern")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T29
 
-
-
     # $ANTLR start T30
+
     def mT30(self, ):
 
         try:
@@ -293,20 +262,14 @@ class CLexer(Lexer):
             # C.g:32:7: 'static'
             self.match("static")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T30
 
-
-
     # $ANTLR start T31
+
     def mT31(self, ):
 
         try:
@@ -316,20 +279,14 @@ class CLexer(Lexer):
             # C.g:33:7: 'auto'
             self.match("auto")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T31
 
-
-
     # $ANTLR start T32
+
     def mT32(self, ):
 
         try:
@@ -339,20 +296,14 @@ class CLexer(Lexer):
             # C.g:34:7: 'register'
             self.match("register")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T32
 
-
-
     # $ANTLR start T33
+
     def mT33(self, ):
 
         try:
@@ -362,20 +313,14 @@ class CLexer(Lexer):
             # C.g:35:7: 'STATIC'
             self.match("STATIC")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T33
 
-
-
     # $ANTLR start T34
+
     def mT34(self, ):
 
         try:
@@ -385,20 +330,14 @@ class CLexer(Lexer):
             # C.g:36:7: 'void'
             self.match("void")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T34
 
-
-
     # $ANTLR start T35
+
     def mT35(self, ):
 
         try:
@@ -408,20 +347,14 @@ class CLexer(Lexer):
             # C.g:37:7: 'char'
             self.match("char")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T35
 
-
-
     # $ANTLR start T36
+
     def mT36(self, ):
 
         try:
@@ -431,20 +364,14 @@ class CLexer(Lexer):
             # C.g:38:7: 'short'
             self.match("short")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T36
 
-
-
     # $ANTLR start T37
+
     def mT37(self, ):
 
         try:
@@ -454,20 +381,14 @@ class CLexer(Lexer):
             # C.g:39:7: 'int'
             self.match("int")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T37
 
-
-
     # $ANTLR start T38
+
     def mT38(self, ):
 
         try:
@@ -477,20 +398,14 @@ class CLexer(Lexer):
             # C.g:40:7: 'long'
             self.match("long")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T38
 
-
-
     # $ANTLR start T39
+
     def mT39(self, ):
 
         try:
@@ -500,20 +415,14 @@ class CLexer(Lexer):
             # C.g:41:7: 'float'
             self.match("float")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T39
 
-
-
     # $ANTLR start T40
+
     def mT40(self, ):
 
         try:
@@ -523,20 +432,14 @@ class CLexer(Lexer):
             # C.g:42:7: 'double'
             self.match("double")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T40
 
-
-
     # $ANTLR start T41
+
     def mT41(self, ):
 
         try:
@@ -546,20 +449,14 @@ class CLexer(Lexer):
             # C.g:43:7: 'signed'
             self.match("signed")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T41
 
-
-
     # $ANTLR start T42
+
     def mT42(self, ):
 
         try:
@@ -569,20 +466,14 @@ class CLexer(Lexer):
             # C.g:44:7: 'unsigned'
             self.match("unsigned")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T42
 
-
-
     # $ANTLR start T43
+
     def mT43(self, ):
 
         try:
@@ -592,19 +483,14 @@ class CLexer(Lexer):
             # C.g:45:7: '{'
             self.match(u'{')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T43
 
-
-
     # $ANTLR start T44
+
     def mT44(self, ):
 
         try:
@@ -614,19 +500,14 @@ class CLexer(Lexer):
             # C.g:46:7: '}'
             self.match(u'}')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T44
 
-
-
     # $ANTLR start T45
+
     def mT45(self, ):
 
         try:
@@ -636,20 +517,14 @@ class CLexer(Lexer):
             # C.g:47:7: 'struct'
             self.match("struct")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T45
 
-
-
     # $ANTLR start T46
+
     def mT46(self, ):
 
         try:
@@ -659,20 +534,14 @@ class CLexer(Lexer):
             # C.g:48:7: 'union'
             self.match("union")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T46
 
-
-
     # $ANTLR start T47
+
     def mT47(self, ):
 
         try:
@@ -682,19 +551,14 @@ class CLexer(Lexer):
             # C.g:49:7: ':'
             self.match(u':')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T47
 
-
-
     # $ANTLR start T48
+
     def mT48(self, ):
 
         try:
@@ -704,20 +568,14 @@ class CLexer(Lexer):
             # C.g:50:7: 'enum'
             self.match("enum")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T48
 
-
-
     # $ANTLR start T49
+
     def mT49(self, ):
 
         try:
@@ -727,20 +585,14 @@ class CLexer(Lexer):
             # C.g:51:7: 'const'
             self.match("const")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T49
 
-
-
     # $ANTLR start T50
+
     def mT50(self, ):
 
         try:
@@ -750,20 +602,14 @@ class CLexer(Lexer):
             # C.g:52:7: 'volatile'
             self.match("volatile")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T50
 
-
-
     # $ANTLR start T51
+
     def mT51(self, ):
 
         try:
@@ -773,20 +619,14 @@ class CLexer(Lexer):
             # C.g:53:7: 'IN'
             self.match("IN")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T51
 
-
-
     # $ANTLR start T52
+
     def mT52(self, ):
 
         try:
@@ -796,20 +636,14 @@ class CLexer(Lexer):
             # C.g:54:7: 'OUT'
             self.match("OUT")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T52
 
-
-
     # $ANTLR start T53
+
     def mT53(self, ):
 
         try:
@@ -819,20 +653,14 @@ class CLexer(Lexer):
             # C.g:55:7: 'OPTIONAL'
             self.match("OPTIONAL")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T53
 
-
-
     # $ANTLR start T54
+
     def mT54(self, ):
 
         try:
@@ -842,20 +670,14 @@ class CLexer(Lexer):
             # C.g:56:7: 'CONST'
             self.match("CONST")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T54
 
-
-
     # $ANTLR start T55
+
     def mT55(self, ):
 
         try:
@@ -865,20 +687,14 @@ class CLexer(Lexer):
             # C.g:57:7: 'UNALIGNED'
             self.match("UNALIGNED")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T55
 
-
-
     # $ANTLR start T56
+
     def mT56(self, ):
 
         try:
@@ -888,20 +704,14 @@ class CLexer(Lexer):
             # C.g:58:7: 'VOLATILE'
             self.match("VOLATILE")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T56
 
-
-
     # $ANTLR start T57
+
     def mT57(self, ):
 
         try:
@@ -911,20 +721,14 @@ class CLexer(Lexer):
             # C.g:59:7: 'GLOBAL_REMOVE_IF_UNREFERENCED'
             self.match("GLOBAL_REMOVE_IF_UNREFERENCED")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T57
 
-
-
     # $ANTLR start T58
+
     def mT58(self, ):
 
         try:
@@ -934,20 +738,14 @@ class CLexer(Lexer):
             # C.g:60:7: 'EFIAPI'
             self.match("EFIAPI")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T58
 
-
-
     # $ANTLR start T59
+
     def mT59(self, ):
 
         try:
@@ -957,20 +755,14 @@ class CLexer(Lexer):
             # C.g:61:7: 'EFI_BOOTSERVICE'
             self.match("EFI_BOOTSERVICE")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T59
 
-
-
     # $ANTLR start T60
+
     def mT60(self, ):
 
         try:
@@ -980,20 +772,14 @@ class CLexer(Lexer):
             # C.g:62:7: 'EFI_RUNTIMESERVICE'
             self.match("EFI_RUNTIMESERVICE")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T60
 
-
-
     # $ANTLR start T61
+
     def mT61(self, ):
 
         try:
@@ -1003,20 +789,14 @@ class CLexer(Lexer):
             # C.g:63:7: 'PACKED'
             self.match("PACKED")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T61
 
-
-
     # $ANTLR start T62
+
     def mT62(self, ):
 
         try:
@@ -1026,19 +806,14 @@ class CLexer(Lexer):
             # C.g:64:7: '('
             self.match(u'(')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T62
 
-
-
     # $ANTLR start T63
+
     def mT63(self, ):
 
         try:
@@ -1048,19 +823,14 @@ class CLexer(Lexer):
             # C.g:65:7: ')'
             self.match(u')')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T63
 
-
-
     # $ANTLR start T64
+
     def mT64(self, ):
 
         try:
@@ -1070,19 +840,14 @@ class CLexer(Lexer):
             # C.g:66:7: '['
             self.match(u'[')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T64
 
-
-
     # $ANTLR start T65
+
     def mT65(self, ):
 
         try:
@@ -1092,19 +857,14 @@ class CLexer(Lexer):
             # C.g:67:7: ']'
             self.match(u']')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T65
 
-
-
     # $ANTLR start T66
+
     def mT66(self, ):
 
         try:
@@ -1114,19 +874,14 @@ class CLexer(Lexer):
             # C.g:68:7: '*'
             self.match(u'*')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T66
 
-
-
     # $ANTLR start T67
+
     def mT67(self, ):
 
         try:
@@ -1136,20 +891,14 @@ class CLexer(Lexer):
             # C.g:69:7: '...'
             self.match("...")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T67
 
-
-
     # $ANTLR start T68
+
     def mT68(self, ):
 
         try:
@@ -1159,19 +908,14 @@ class CLexer(Lexer):
             # C.g:70:7: '+'
             self.match(u'+')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T68
 
-
-
     # $ANTLR start T69
+
     def mT69(self, ):
 
         try:
@@ -1181,19 +925,14 @@ class CLexer(Lexer):
             # C.g:71:7: '-'
             self.match(u'-')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T69
 
-
-
     # $ANTLR start T70
+
     def mT70(self, ):
 
         try:
@@ -1203,19 +942,14 @@ class CLexer(Lexer):
             # C.g:72:7: '/'
             self.match(u'/')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T70
 
-
-
     # $ANTLR start T71
+
     def mT71(self, ):
 
         try:
@@ -1225,19 +959,14 @@ class CLexer(Lexer):
             # C.g:73:7: '%'
             self.match(u'%')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T71
 
-
-
     # $ANTLR start T72
+
     def mT72(self, ):
 
         try:
@@ -1247,20 +976,14 @@ class CLexer(Lexer):
             # C.g:74:7: '++'
             self.match("++")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T72
 
-
-
     # $ANTLR start T73
+
     def mT73(self, ):
 
         try:
@@ -1270,20 +993,14 @@ class CLexer(Lexer):
             # C.g:75:7: '--'
             self.match("--")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T73
 
-
-
     # $ANTLR start T74
+
     def mT74(self, ):
 
         try:
@@ -1293,20 +1010,14 @@ class CLexer(Lexer):
             # C.g:76:7: 'sizeof'
             self.match("sizeof")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T74
 
-
-
     # $ANTLR start T75
+
     def mT75(self, ):
 
         try:
@@ -1316,19 +1027,14 @@ class CLexer(Lexer):
             # C.g:77:7: '.'
             self.match(u'.')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T75
 
-
-
     # $ANTLR start T76
+
     def mT76(self, ):
 
         try:
@@ -1338,20 +1044,14 @@ class CLexer(Lexer):
             # C.g:78:7: '->'
             self.match("->")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T76
 
-
-
     # $ANTLR start T77
+
     def mT77(self, ):
 
         try:
@@ -1361,19 +1061,14 @@ class CLexer(Lexer):
             # C.g:79:7: '&'
             self.match(u'&')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T77
 
-
-
     # $ANTLR start T78
+
     def mT78(self, ):
 
         try:
@@ -1383,19 +1078,14 @@ class CLexer(Lexer):
             # C.g:80:7: '~'
             self.match(u'~')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T78
 
-
-
     # $ANTLR start T79
+
     def mT79(self, ):
 
         try:
@@ -1405,19 +1095,14 @@ class CLexer(Lexer):
             # C.g:81:7: '!'
             self.match(u'!')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T79
 
-
-
     # $ANTLR start T80
+
     def mT80(self, ):
 
         try:
@@ -1427,20 +1112,14 @@ class CLexer(Lexer):
             # C.g:82:7: '*='
             self.match("*=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T80
 
-
-
     # $ANTLR start T81
+
     def mT81(self, ):
 
         try:
@@ -1450,20 +1129,14 @@ class CLexer(Lexer):
             # C.g:83:7: '/='
             self.match("/=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T81
 
-
-
     # $ANTLR start T82
+
     def mT82(self, ):
 
         try:
@@ -1473,20 +1146,14 @@ class CLexer(Lexer):
             # C.g:84:7: '%='
             self.match("%=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T82
 
-
-
     # $ANTLR start T83
+
     def mT83(self, ):
 
         try:
@@ -1496,20 +1163,14 @@ class CLexer(Lexer):
             # C.g:85:7: '+='
             self.match("+=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T83
 
-
-
     # $ANTLR start T84
+
     def mT84(self, ):
 
         try:
@@ -1519,20 +1180,14 @@ class CLexer(Lexer):
             # C.g:86:7: '-='
             self.match("-=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T84
 
-
-
     # $ANTLR start T85
+
     def mT85(self, ):
 
         try:
@@ -1542,20 +1197,14 @@ class CLexer(Lexer):
             # C.g:87:7: '<<='
             self.match("<<=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T85
 
-
-
     # $ANTLR start T86
+
     def mT86(self, ):
 
         try:
@@ -1565,20 +1214,14 @@ class CLexer(Lexer):
             # C.g:88:7: '>>='
             self.match(">>=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T86
 
-
-
     # $ANTLR start T87
+
     def mT87(self, ):
 
         try:
@@ -1588,20 +1231,14 @@ class CLexer(Lexer):
             # C.g:89:7: '&='
             self.match("&=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T87
 
-
-
     # $ANTLR start T88
+
     def mT88(self, ):
 
         try:
@@ -1611,20 +1248,14 @@ class CLexer(Lexer):
             # C.g:90:7: '^='
             self.match("^=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T88
 
-
-
     # $ANTLR start T89
+
     def mT89(self, ):
 
         try:
@@ -1634,20 +1265,14 @@ class CLexer(Lexer):
             # C.g:91:7: '|='
             self.match("|=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T89
 
-
-
     # $ANTLR start T90
+
     def mT90(self, ):
 
         try:
@@ -1657,19 +1282,14 @@ class CLexer(Lexer):
             # C.g:92:7: '?'
             self.match(u'?')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T90
 
-
-
     # $ANTLR start T91
+
     def mT91(self, ):
 
         try:
@@ -1679,20 +1299,14 @@ class CLexer(Lexer):
             # C.g:93:7: '||'
             self.match("||")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T91
 
-
-
     # $ANTLR start T92
+
     def mT92(self, ):
 
         try:
@@ -1702,20 +1316,14 @@ class CLexer(Lexer):
             # C.g:94:7: '&&'
             self.match("&&")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T92
 
-
-
     # $ANTLR start T93
+
     def mT93(self, ):
 
         try:
@@ -1725,19 +1333,14 @@ class CLexer(Lexer):
             # C.g:95:7: '|'
             self.match(u'|')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T93
 
-
-
     # $ANTLR start T94
+
     def mT94(self, ):
 
         try:
@@ -1747,19 +1350,14 @@ class CLexer(Lexer):
             # C.g:96:7: '^'
             self.match(u'^')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T94
 
-
-
     # $ANTLR start T95
+
     def mT95(self, ):
 
         try:
@@ -1769,20 +1367,14 @@ class CLexer(Lexer):
             # C.g:97:7: '=='
             self.match("==")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T95
 
-
-
     # $ANTLR start T96
+
     def mT96(self, ):
 
         try:
@@ -1792,20 +1384,14 @@ class CLexer(Lexer):
             # C.g:98:7: '!='
             self.match("!=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T96
 
-
-
     # $ANTLR start T97
+
     def mT97(self, ):
 
         try:
@@ -1815,19 +1401,14 @@ class CLexer(Lexer):
             # C.g:99:7: '<'
             self.match(u'<')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T97
 
-
-
     # $ANTLR start T98
+
     def mT98(self, ):
 
         try:
@@ -1837,19 +1418,14 @@ class CLexer(Lexer):
             # C.g:100:7: '>'
             self.match(u'>')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T98
 
-
-
     # $ANTLR start T99
+
     def mT99(self, ):
 
         try:
@@ -1859,20 +1435,14 @@ class CLexer(Lexer):
             # C.g:101:7: '<='
             self.match("<=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T99
 
-
-
     # $ANTLR start T100
+
     def mT100(self, ):
 
         try:
@@ -1882,20 +1452,14 @@ class CLexer(Lexer):
             # C.g:102:8: '>='
             self.match(">=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T100
 
-
-
     # $ANTLR start T101
+
     def mT101(self, ):
 
         try:
@@ -1905,20 +1469,14 @@ class CLexer(Lexer):
             # C.g:103:8: '<<'
             self.match("<<")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T101
 
-
-
     # $ANTLR start T102
+
     def mT102(self, ):
 
         try:
@@ -1928,20 +1486,14 @@ class CLexer(Lexer):
             # C.g:104:8: '>>'
             self.match(">>")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T102
 
-
-
     # $ANTLR start T103
+
     def mT103(self, ):
 
         try:
@@ -1951,20 +1503,14 @@ class CLexer(Lexer):
             # C.g:105:8: '__asm__'
             self.match("__asm__")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T103
 
-
-
     # $ANTLR start T104
+
     def mT104(self, ):
 
         try:
@@ -1974,20 +1520,14 @@ class CLexer(Lexer):
             # C.g:106:8: '_asm'
             self.match("_asm")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T104
 
-
-
     # $ANTLR start T105
+
     def mT105(self, ):
 
         try:
@@ -1997,20 +1537,14 @@ class CLexer(Lexer):
             # C.g:107:8: '__asm'
             self.match("__asm")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T105
 
-
-
     # $ANTLR start T106
+
     def mT106(self, ):
 
         try:
@@ -2020,20 +1554,14 @@ class CLexer(Lexer):
             # C.g:108:8: 'case'
             self.match("case")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T106
 
-
-
     # $ANTLR start T107
+
     def mT107(self, ):
 
         try:
@@ -2043,20 +1571,14 @@ class CLexer(Lexer):
             # C.g:109:8: 'default'
             self.match("default")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T107
 
-
-
     # $ANTLR start T108
+
     def mT108(self, ):
 
         try:
@@ -2066,20 +1588,14 @@ class CLexer(Lexer):
             # C.g:110:8: 'if'
             self.match("if")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T108
 
-
-
     # $ANTLR start T109
+
     def mT109(self, ):
 
         try:
@@ -2089,20 +1605,14 @@ class CLexer(Lexer):
             # C.g:111:8: 'else'
             self.match("else")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T109
 
-
-
     # $ANTLR start T110
+
     def mT110(self, ):
 
         try:
@@ -2112,20 +1622,14 @@ class CLexer(Lexer):
             # C.g:112:8: 'switch'
             self.match("switch")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T110
 
-
-
     # $ANTLR start T111
+
     def mT111(self, ):
 
         try:
@@ -2135,20 +1639,14 @@ class CLexer(Lexer):
             # C.g:113:8: 'while'
             self.match("while")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T111
 
-
-
     # $ANTLR start T112
+
     def mT112(self, ):
 
         try:
@@ -2158,20 +1656,14 @@ class CLexer(Lexer):
             # C.g:114:8: 'do'
             self.match("do")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T112
 
-
-
     # $ANTLR start T113
+
     def mT113(self, ):
 
         try:
@@ -2181,20 +1673,14 @@ class CLexer(Lexer):
             # C.g:115:8: 'for'
             self.match("for")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T113
 
-
-
     # $ANTLR start T114
+
     def mT114(self, ):
 
         try:
@@ -2204,20 +1690,14 @@ class CLexer(Lexer):
             # C.g:116:8: 'goto'
             self.match("goto")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T114
 
-
-
     # $ANTLR start T115
+
     def mT115(self, ):
 
         try:
@@ -2227,20 +1707,14 @@ class CLexer(Lexer):
             # C.g:117:8: 'continue'
             self.match("continue")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T115
 
-
-
     # $ANTLR start T116
+
     def mT116(self, ):
 
         try:
@@ -2250,20 +1724,14 @@ class CLexer(Lexer):
             # C.g:118:8: 'break'
             self.match("break")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T116
 
-
-
     # $ANTLR start T117
+
     def mT117(self, ):
 
         try:
@@ -2273,20 +1741,14 @@ class CLexer(Lexer):
             # C.g:119:8: 'return'
             self.match("return")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T117
 
-
-
     # $ANTLR start IDENTIFIER
+
     def mIDENTIFIER(self, ):
 
         try:
@@ -2297,34 +1759,25 @@ class CLexer(Lexer):
             self.mLETTER()
 
             # C.g:586:11: ( LETTER | '0' .. '9' )*
-            while True: #loop1
+            while True:  # loop1
                 alt1 = 2
                 LA1_0 = self.input.LA(1)
 
-                if (LA1_0 == u'$' or (u'0' <= LA1_0 <= u'9') or (u'A' <= LA1_0 <= u'Z') or LA1_0 == u'_' or (u'a' <= LA1_0 <= u'z')) :
+                if (LA1_0 == u'$' or (u'0' <= LA1_0 <= u'9') or (u'A' <= LA1_0 <= u'Z') or LA1_0 == u'_' or (u'a' <= LA1_0 <= u'z')):
                     alt1 = 1
 
-
                 if alt1 == 1:
                     # C.g:
                     if self.input.LA(1) == u'$' or (u'0' <= self.input.LA(1) <= u'9') or (u'A' <= self.input.LA(1) <= u'Z') or self.input.LA(1) == u'_' or (u'a' <= self.input.LA(1) <= u'z'):
-                        self.input.consume();
+                        self.input.consume()
 
                     else:
                         mse = MismatchedSetException(None, self.input)
                         self.recover(mse)
                         raise mse
 
-
-
-
                 else:
-                    break #loop1
-
-
-
-
-
+                    break  # loop1
 
         finally:
 
@@ -2332,36 +1785,29 @@ class CLexer(Lexer):
 
     # $ANTLR end IDENTIFIER
 
-
-
     # $ANTLR start LETTER
+
     def mLETTER(self, ):
 
         try:
             # C.g:591:2: ( '$' | 'A' .. 'Z' | 'a' .. 'z' | '_' )
             # C.g:
             if self.input.LA(1) == u'$' or (u'A' <= self.input.LA(1) <= u'Z') or self.input.LA(1) == u'_' or (u'a' <= self.input.LA(1) <= u'z'):
-                self.input.consume();
+                self.input.consume()
 
             else:
                 mse = MismatchedSetException(None, self.input)
                 self.recover(mse)
                 raise mse
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end LETTER
 
-
-
     # $ANTLR start CHARACTER_LITERAL
+
     def mCHARACTER_LITERAL(self, ):
 
         try:
@@ -2373,27 +1819,25 @@ class CLexer(Lexer):
             alt2 = 2
             LA2_0 = self.input.LA(1)
 
-            if (LA2_0 == u'L') :
+            if (LA2_0 == u'L'):
                 alt2 = 1
             if alt2 == 1:
                 # C.g:598:10: 'L'
                 self.match(u'L')
 
-
-
-
             self.match(u'\'')
 
             # C.g:598:21: ( EscapeSequence | ~ ( '\\'' | '\\\\' ) )
             alt3 = 2
             LA3_0 = self.input.LA(1)
 
-            if (LA3_0 == u'\\') :
+            if (LA3_0 == u'\\'):
                 alt3 = 1
-            elif ((u'\u0000' <= LA3_0 <= u'&') or (u'(' <= LA3_0 <= u'[') or (u']' <= LA3_0 <= u'\uFFFE')) :
+            elif ((u'\u0000' <= LA3_0 <= u'&') or (u'(' <= LA3_0 <= u'[') or (u']' <= LA3_0 <= u'\uFFFE')):
                 alt3 = 2
             else:
-                nvae = NoViableAltException("598:21: ( EscapeSequence | ~ ( '\\'' | '\\\\' ) )", 3, 0, self.input)
+                nvae = NoViableAltException(
+                    "598:21: ( EscapeSequence | ~ ( '\\'' | '\\\\' ) )", 3, 0, self.input)
 
                 raise nvae
 
@@ -2401,37 +1845,26 @@ class CLexer(Lexer):
                 # C.g:598:23: EscapeSequence
                 self.mEscapeSequence()
 
-
-
             elif alt3 == 2:
                 # C.g:598:40: ~ ( '\\'' | '\\\\' )
                 if (u'\u0000' <= self.input.LA(1) <= u'&') or (u'(' <= self.input.LA(1) <= u'[') or (u']' <= self.input.LA(1) <= u'\uFFFE'):
-                    self.input.consume();
+                    self.input.consume()
 
                 else:
                     mse = MismatchedSetException(None, self.input)
                     self.recover(mse)
                     raise mse
 
-
-
-
-
             self.match(u'\'')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end CHARACTER_LITERAL
 
-
-
     # $ANTLR start STRING_LITERAL
+
     def mSTRING_LITERAL(self, ):
 
         try:
@@ -2443,66 +1876,51 @@ class CLexer(Lexer):
             alt4 = 2
             LA4_0 = self.input.LA(1)
 
-            if (LA4_0 == u'L') :
+            if (LA4_0 == u'L'):
                 alt4 = 1
             if alt4 == 1:
                 # C.g:602:9: 'L'
                 self.match(u'L')
 
-
-
-
             self.match(u'"')
 
             # C.g:602:19: ( EscapeSequence | ~ ( '\\\\' | '\"' ) )*
-            while True: #loop5
+            while True:  # loop5
                 alt5 = 3
                 LA5_0 = self.input.LA(1)
 
-                if (LA5_0 == u'\\') :
+                if (LA5_0 == u'\\'):
                     alt5 = 1
-                elif ((u'\u0000' <= LA5_0 <= u'!') or (u'#' <= LA5_0 <= u'[') or (u']' <= LA5_0 <= u'\uFFFE')) :
+                elif ((u'\u0000' <= LA5_0 <= u'!') or (u'#' <= LA5_0 <= u'[') or (u']' <= LA5_0 <= u'\uFFFE')):
                     alt5 = 2
 
-
                 if alt5 == 1:
                     # C.g:602:21: EscapeSequence
                     self.mEscapeSequence()
 
-
-
                 elif alt5 == 2:
                     # C.g:602:38: ~ ( '\\\\' | '\"' )
                     if (u'\u0000' <= self.input.LA(1) <= u'!') or (u'#' <= self.input.LA(1) <= u'[') or (u']' <= self.input.LA(1) <= u'\uFFFE'):
-                        self.input.consume();
+                        self.input.consume()
 
                     else:
                         mse = MismatchedSetException(None, self.input)
                         self.recover(mse)
                         raise mse
 
-
-
-
                 else:
-                    break #loop5
-
+                    break  # loop5
 
             self.match(u'"')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end STRING_LITERAL
 
-
-
     # $ANTLR start HEX_LITERAL
+
     def mHEX_LITERAL(self, ):
 
         try:
@@ -2513,66 +1931,53 @@ class CLexer(Lexer):
             self.match(u'0')
 
             if self.input.LA(1) == u'X' or self.input.LA(1) == u'x':
-                self.input.consume();
+                self.input.consume()
 
             else:
                 mse = MismatchedSetException(None, self.input)
                 self.recover(mse)
                 raise mse
 
-
             # C.g:605:29: ( HexDigit )+
             cnt6 = 0
-            while True: #loop6
+            while True:  # loop6
                 alt6 = 2
                 LA6_0 = self.input.LA(1)
 
-                if ((u'0' <= LA6_0 <= u'9') or (u'A' <= LA6_0 <= u'F') or (u'a' <= LA6_0 <= u'f')) :
+                if ((u'0' <= LA6_0 <= u'9') or (u'A' <= LA6_0 <= u'F') or (u'a' <= LA6_0 <= u'f')):
                     alt6 = 1
 
-
                 if alt6 == 1:
                     # C.g:605:29: HexDigit
                     self.mHexDigit()
 
-
-
                 else:
                     if cnt6 >= 1:
-                        break #loop6
+                        break  # loop6
 
                     eee = EarlyExitException(6, self.input)
                     raise eee
 
                 cnt6 += 1
 
-
             # C.g:605:39: ( IntegerTypeSuffix )?
             alt7 = 2
             LA7_0 = self.input.LA(1)
 
-            if (LA7_0 == u'L' or LA7_0 == u'U' or LA7_0 == u'l' or LA7_0 == u'u') :
+            if (LA7_0 == u'L' or LA7_0 == u'U' or LA7_0 == u'l' or LA7_0 == u'u'):
                 alt7 = 1
             if alt7 == 1:
                 # C.g:605:39: IntegerTypeSuffix
                 self.mIntegerTypeSuffix()
 
-
-
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end HEX_LITERAL
 
-
-
     # $ANTLR start DECIMAL_LITERAL
+
     def mDECIMAL_LITERAL(self, ):
 
         try:
@@ -2584,12 +1989,13 @@ class CLexer(Lexer):
             alt9 = 2
             LA9_0 = self.input.LA(1)
 
-            if (LA9_0 == u'0') :
+            if (LA9_0 == u'0'):
                 alt9 = 1
-            elif ((u'1' <= LA9_0 <= u'9')) :
+            elif ((u'1' <= LA9_0 <= u'9')):
                 alt9 = 2
             else:
-                nvae = NoViableAltException("607:19: ( '0' | '1' .. '9' ( '0' .. '9' )* )", 9, 0, self.input)
+                nvae = NoViableAltException(
+                    "607:19: ( '0' | '1' .. '9' ( '0' .. '9' )* )", 9, 0, self.input)
 
                 raise nvae
 
@@ -2597,60 +2003,43 @@ class CLexer(Lexer):
                 # C.g:607:20: '0'
                 self.match(u'0')
 
-
-
             elif alt9 == 2:
                 # C.g:607:26: '1' .. '9' ( '0' .. '9' )*
                 self.matchRange(u'1', u'9')
 
                 # C.g:607:35: ( '0' .. '9' )*
-                while True: #loop8
+                while True:  # loop8
                     alt8 = 2
                     LA8_0 = self.input.LA(1)
 
-                    if ((u'0' <= LA8_0 <= u'9')) :
+                    if ((u'0' <= LA8_0 <= u'9')):
                         alt8 = 1
 
-
                     if alt8 == 1:
                         # C.g:607:35: '0' .. '9'
                         self.matchRange(u'0', u'9')
 
-
-
                     else:
-                        break #loop8
-
-
-
-
+                        break  # loop8
 
             # C.g:607:46: ( IntegerTypeSuffix )?
             alt10 = 2
             LA10_0 = self.input.LA(1)
 
-            if (LA10_0 == u'L' or LA10_0 == u'U' or LA10_0 == u'l' or LA10_0 == u'u') :
+            if (LA10_0 == u'L' or LA10_0 == u'U' or LA10_0 == u'l' or LA10_0 == u'u'):
                 alt10 = 1
             if alt10 == 1:
                 # C.g:607:46: IntegerTypeSuffix
                 self.mIntegerTypeSuffix()
 
-
-
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end DECIMAL_LITERAL
 
-
-
     # $ANTLR start OCTAL_LITERAL
+
     def mOCTAL_LITERAL(self, ):
 
         try:
@@ -2662,83 +2051,65 @@ class CLexer(Lexer):
 
             # C.g:609:21: ( '0' .. '7' )+
             cnt11 = 0
-            while True: #loop11
+            while True:  # loop11
                 alt11 = 2
                 LA11_0 = self.input.LA(1)
 
-                if ((u'0' <= LA11_0 <= u'7')) :
+                if ((u'0' <= LA11_0 <= u'7')):
                     alt11 = 1
 
-
                 if alt11 == 1:
                     # C.g:609:22: '0' .. '7'
                     self.matchRange(u'0', u'7')
 
-
-
                 else:
                     if cnt11 >= 1:
-                        break #loop11
+                        break  # loop11
 
                     eee = EarlyExitException(11, self.input)
                     raise eee
 
                 cnt11 += 1
 
-
             # C.g:609:33: ( IntegerTypeSuffix )?
             alt12 = 2
             LA12_0 = self.input.LA(1)
 
-            if (LA12_0 == u'L' or LA12_0 == u'U' or LA12_0 == u'l' or LA12_0 == u'u') :
+            if (LA12_0 == u'L' or LA12_0 == u'U' or LA12_0 == u'l' or LA12_0 == u'u'):
                 alt12 = 1
             if alt12 == 1:
                 # C.g:609:33: IntegerTypeSuffix
                 self.mIntegerTypeSuffix()
 
-
-
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end OCTAL_LITERAL
 
-
-
     # $ANTLR start HexDigit
+
     def mHexDigit(self, ):
 
         try:
             # C.g:612:10: ( ( '0' .. '9' | 'a' .. 'f' | 'A' .. 'F' ) )
             # C.g:612:12: ( '0' .. '9' | 'a' .. 'f' | 'A' .. 'F' )
             if (u'0' <= self.input.LA(1) <= u'9') or (u'A' <= self.input.LA(1) <= u'F') or (u'a' <= self.input.LA(1) <= u'f'):
-                self.input.consume();
+                self.input.consume()
 
             else:
                 mse = MismatchedSetException(None, self.input)
                 self.recover(mse)
                 raise mse
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end HexDigit
 
-
-
     # $ANTLR start IntegerTypeSuffix
+
     def mIntegerTypeSuffix(self, ):
 
         try:
@@ -2746,114 +2117,98 @@ class CLexer(Lexer):
             alt13 = 4
             LA13_0 = self.input.LA(1)
 
-            if (LA13_0 == u'U' or LA13_0 == u'u') :
+            if (LA13_0 == u'U' or LA13_0 == u'u'):
                 LA13_1 = self.input.LA(2)
 
-                if (LA13_1 == u'L' or LA13_1 == u'l') :
+                if (LA13_1 == u'L' or LA13_1 == u'l'):
                     LA13_3 = self.input.LA(3)
 
-                    if (LA13_3 == u'L' or LA13_3 == u'l') :
+                    if (LA13_3 == u'L' or LA13_3 == u'l'):
                         alt13 = 4
                     else:
                         alt13 = 3
                 else:
                     alt13 = 1
-            elif (LA13_0 == u'L' or LA13_0 == u'l') :
+            elif (LA13_0 == u'L' or LA13_0 == u'l'):
                 alt13 = 2
             else:
-                nvae = NoViableAltException("614:1: fragment IntegerTypeSuffix : ( ( 'u' | 'U' ) | ( 'l' | 'L' ) | ( 'u' | 'U' ) ( 'l' | 'L' ) | ( 'u' | 'U' ) ( 'l' | 'L' ) ( 'l' | 'L' ) );", 13, 0, self.input)
+                nvae = NoViableAltException(
+                    "614:1: fragment IntegerTypeSuffix : ( ( 'u' | 'U' ) | ( 'l' | 'L' ) | ( 'u' | 'U' ) ( 'l' | 'L' ) | ( 'u' | 'U' ) ( 'l' | 'L' ) ( 'l' | 'L' ) );", 13, 0, self.input)
 
                 raise nvae
 
             if alt13 == 1:
                 # C.g:616:4: ( 'u' | 'U' )
                 if self.input.LA(1) == u'U' or self.input.LA(1) == u'u':
-                    self.input.consume();
+                    self.input.consume()
 
                 else:
                     mse = MismatchedSetException(None, self.input)
                     self.recover(mse)
                     raise mse
 
-
-
-
             elif alt13 == 2:
                 # C.g:617:4: ( 'l' | 'L' )
                 if self.input.LA(1) == u'L' or self.input.LA(1) == u'l':
-                    self.input.consume();
+                    self.input.consume()
 
                 else:
                     mse = MismatchedSetException(None, self.input)
                     self.recover(mse)
                     raise mse
 
-
-
-
             elif alt13 == 3:
                 # C.g:618:4: ( 'u' | 'U' ) ( 'l' | 'L' )
                 if self.input.LA(1) == u'U' or self.input.LA(1) == u'u':
-                    self.input.consume();
+                    self.input.consume()
 
                 else:
                     mse = MismatchedSetException(None, self.input)
                     self.recover(mse)
                     raise mse
 
-
                 if self.input.LA(1) == u'L' or self.input.LA(1) == u'l':
-                    self.input.consume();
+                    self.input.consume()
 
                 else:
                     mse = MismatchedSetException(None, self.input)
                     self.recover(mse)
                     raise mse
 
-
-
-
             elif alt13 == 4:
                 # C.g:619:4: ( 'u' | 'U' ) ( 'l' | 'L' ) ( 'l' | 'L' )
                 if self.input.LA(1) == u'U' or self.input.LA(1) == u'u':
-                    self.input.consume();
+                    self.input.consume()
 
                 else:
                     mse = MismatchedSetException(None, self.input)
                     self.recover(mse)
                     raise mse
 
-
                 if self.input.LA(1) == u'L' or self.input.LA(1) == u'l':
-                    self.input.consume();
+                    self.input.consume()
 
                 else:
                     mse = MismatchedSetException(None, self.input)
                     self.recover(mse)
                     raise mse
 
-
                 if self.input.LA(1) == u'L' or self.input.LA(1) == u'l':
-                    self.input.consume();
+                    self.input.consume()
 
                 else:
                     mse = MismatchedSetException(None, self.input)
                     self.recover(mse)
                     raise mse
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end IntegerTypeSuffix
 
-
-
     # $ANTLR start FLOATING_POINT_LITERAL
+
     def mFLOATING_POINT_LITERAL(self, ):
 
         try:
@@ -2866,337 +2221,269 @@ class CLexer(Lexer):
                 # C.g:623:9: ( '0' .. '9' )+ '.' ( '0' .. '9' )* ( Exponent )? ( FloatTypeSuffix )?
                 # C.g:623:9: ( '0' .. '9' )+
                 cnt14 = 0
-                while True: #loop14
+                while True:  # loop14
                     alt14 = 2
                     LA14_0 = self.input.LA(1)
 
-                    if ((u'0' <= LA14_0 <= u'9')) :
+                    if ((u'0' <= LA14_0 <= u'9')):
                         alt14 = 1
 
-
                     if alt14 == 1:
                         # C.g:623:10: '0' .. '9'
                         self.matchRange(u'0', u'9')
 
-
-
                     else:
                         if cnt14 >= 1:
-                            break #loop14
+                            break  # loop14
 
                         eee = EarlyExitException(14, self.input)
                         raise eee
 
                     cnt14 += 1
 
-
                 self.match(u'.')
 
                 # C.g:623:25: ( '0' .. '9' )*
-                while True: #loop15
+                while True:  # loop15
                     alt15 = 2
                     LA15_0 = self.input.LA(1)
 
-                    if ((u'0' <= LA15_0 <= u'9')) :
+                    if ((u'0' <= LA15_0 <= u'9')):
                         alt15 = 1
 
-
                     if alt15 == 1:
                         # C.g:623:26: '0' .. '9'
                         self.matchRange(u'0', u'9')
 
-
-
                     else:
-                        break #loop15
-
+                        break  # loop15
 
                 # C.g:623:37: ( Exponent )?
                 alt16 = 2
                 LA16_0 = self.input.LA(1)
 
-                if (LA16_0 == u'E' or LA16_0 == u'e') :
+                if (LA16_0 == u'E' or LA16_0 == u'e'):
                     alt16 = 1
                 if alt16 == 1:
                     # C.g:623:37: Exponent
                     self.mExponent()
 
-
-
-
                 # C.g:623:47: ( FloatTypeSuffix )?
                 alt17 = 2
                 LA17_0 = self.input.LA(1)
 
-                if (LA17_0 == u'D' or LA17_0 == u'F' or LA17_0 == u'd' or LA17_0 == u'f') :
+                if (LA17_0 == u'D' or LA17_0 == u'F' or LA17_0 == u'd' or LA17_0 == u'f'):
                     alt17 = 1
                 if alt17 == 1:
                     # C.g:623:47: FloatTypeSuffix
                     self.mFloatTypeSuffix()
 
-
-
-
-
-
             elif alt25 == 2:
                 # C.g:624:9: '.' ( '0' .. '9' )+ ( Exponent )? ( FloatTypeSuffix )?
                 self.match(u'.')
 
                 # C.g:624:13: ( '0' .. '9' )+
                 cnt18 = 0
-                while True: #loop18
+                while True:  # loop18
                     alt18 = 2
                     LA18_0 = self.input.LA(1)
 
-                    if ((u'0' <= LA18_0 <= u'9')) :
+                    if ((u'0' <= LA18_0 <= u'9')):
                         alt18 = 1
 
-
                     if alt18 == 1:
                         # C.g:624:14: '0' .. '9'
                         self.matchRange(u'0', u'9')
 
-
-
                     else:
                         if cnt18 >= 1:
-                            break #loop18
+                            break  # loop18
 
                         eee = EarlyExitException(18, self.input)
                         raise eee
 
                     cnt18 += 1
 
-
                 # C.g:624:25: ( Exponent )?
                 alt19 = 2
                 LA19_0 = self.input.LA(1)
 
-                if (LA19_0 == u'E' or LA19_0 == u'e') :
+                if (LA19_0 == u'E' or LA19_0 == u'e'):
                     alt19 = 1
                 if alt19 == 1:
                     # C.g:624:25: Exponent
                     self.mExponent()
 
-
-
-
                 # C.g:624:35: ( FloatTypeSuffix )?
                 alt20 = 2
                 LA20_0 = self.input.LA(1)
 
-                if (LA20_0 == u'D' or LA20_0 == u'F' or LA20_0 == u'd' or LA20_0 == u'f') :
+                if (LA20_0 == u'D' or LA20_0 == u'F' or LA20_0 == u'd' or LA20_0 == u'f'):
                     alt20 = 1
                 if alt20 == 1:
                     # C.g:624:35: FloatTypeSuffix
                     self.mFloatTypeSuffix()
 
-
-
-
-
-
             elif alt25 == 3:
                 # C.g:625:9: ( '0' .. '9' )+ Exponent ( FloatTypeSuffix )?
                 # C.g:625:9: ( '0' .. '9' )+
                 cnt21 = 0
-                while True: #loop21
+                while True:  # loop21
                     alt21 = 2
                     LA21_0 = self.input.LA(1)
 
-                    if ((u'0' <= LA21_0 <= u'9')) :
+                    if ((u'0' <= LA21_0 <= u'9')):
                         alt21 = 1
 
-
                     if alt21 == 1:
                         # C.g:625:10: '0' .. '9'
                         self.matchRange(u'0', u'9')
 
-
-
                     else:
                         if cnt21 >= 1:
-                            break #loop21
+                            break  # loop21
 
                         eee = EarlyExitException(21, self.input)
                         raise eee
 
                     cnt21 += 1
 
-
                 self.mExponent()
 
                 # C.g:625:30: ( FloatTypeSuffix )?
                 alt22 = 2
                 LA22_0 = self.input.LA(1)
 
-                if (LA22_0 == u'D' or LA22_0 == u'F' or LA22_0 == u'd' or LA22_0 == u'f') :
+                if (LA22_0 == u'D' or LA22_0 == u'F' or LA22_0 == u'd' or LA22_0 == u'f'):
                     alt22 = 1
                 if alt22 == 1:
                     # C.g:625:30: FloatTypeSuffix
                     self.mFloatTypeSuffix()
 
-
-
-
-
-
             elif alt25 == 4:
                 # C.g:626:9: ( '0' .. '9' )+ ( Exponent )? FloatTypeSuffix
                 # C.g:626:9: ( '0' .. '9' )+
                 cnt23 = 0
-                while True: #loop23
+                while True:  # loop23
                     alt23 = 2
                     LA23_0 = self.input.LA(1)
 
-                    if ((u'0' <= LA23_0 <= u'9')) :
+                    if ((u'0' <= LA23_0 <= u'9')):
                         alt23 = 1
 
-
                     if alt23 == 1:
                         # C.g:626:10: '0' .. '9'
                         self.matchRange(u'0', u'9')
 
-
-
                     else:
                         if cnt23 >= 1:
-                            break #loop23
+                            break  # loop23
 
                         eee = EarlyExitException(23, self.input)
                         raise eee
 
                     cnt23 += 1
 
-
                 # C.g:626:21: ( Exponent )?
                 alt24 = 2
                 LA24_0 = self.input.LA(1)
 
-                if (LA24_0 == u'E' or LA24_0 == u'e') :
+                if (LA24_0 == u'E' or LA24_0 == u'e'):
                     alt24 = 1
                 if alt24 == 1:
                     # C.g:626:21: Exponent
                     self.mExponent()
 
-
-
-
                 self.mFloatTypeSuffix()
 
-
-
-
         finally:
 
             pass
 
     # $ANTLR end FLOATING_POINT_LITERAL
 
-
-
     # $ANTLR start Exponent
+
     def mExponent(self, ):
 
         try:
             # C.g:630:10: ( ( 'e' | 'E' ) ( '+' | '-' )? ( '0' .. '9' )+ )
             # C.g:630:12: ( 'e' | 'E' ) ( '+' | '-' )? ( '0' .. '9' )+
             if self.input.LA(1) == u'E' or self.input.LA(1) == u'e':
-                self.input.consume();
+                self.input.consume()
 
             else:
                 mse = MismatchedSetException(None, self.input)
                 self.recover(mse)
                 raise mse
 
-
             # C.g:630:22: ( '+' | '-' )?
             alt26 = 2
             LA26_0 = self.input.LA(1)
 
-            if (LA26_0 == u'+' or LA26_0 == u'-') :
+            if (LA26_0 == u'+' or LA26_0 == u'-'):
                 alt26 = 1
             if alt26 == 1:
                 # C.g:
                 if self.input.LA(1) == u'+' or self.input.LA(1) == u'-':
-                    self.input.consume();
+                    self.input.consume()
 
                 else:
                     mse = MismatchedSetException(None, self.input)
                     self.recover(mse)
                     raise mse
 
-
-
-
-
             # C.g:630:33: ( '0' .. '9' )+
             cnt27 = 0
-            while True: #loop27
+            while True:  # loop27
                 alt27 = 2
                 LA27_0 = self.input.LA(1)
 
-                if ((u'0' <= LA27_0 <= u'9')) :
+                if ((u'0' <= LA27_0 <= u'9')):
                     alt27 = 1
 
-
                 if alt27 == 1:
                     # C.g:630:34: '0' .. '9'
                     self.matchRange(u'0', u'9')
 
-
-
                 else:
                     if cnt27 >= 1:
-                        break #loop27
+                        break  # loop27
 
                     eee = EarlyExitException(27, self.input)
                     raise eee
 
                 cnt27 += 1
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end Exponent
 
-
-
     # $ANTLR start FloatTypeSuffix
+
     def mFloatTypeSuffix(self, ):
 
         try:
             # C.g:633:17: ( ( 'f' | 'F' | 'd' | 'D' ) )
             # C.g:633:19: ( 'f' | 'F' | 'd' | 'D' )
             if self.input.LA(1) == u'D' or self.input.LA(1) == u'F' or self.input.LA(1) == u'd' or self.input.LA(1) == u'f':
-                self.input.consume();
+                self.input.consume()
 
             else:
                 mse = MismatchedSetException(None, self.input)
                 self.recover(mse)
                 raise mse
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end FloatTypeSuffix
 
-
-
     # $ANTLR start EscapeSequence
+
     def mEscapeSequence(self, ):
 
         try:
@@ -3204,20 +2491,22 @@ class CLexer(Lexer):
             alt28 = 2
             LA28_0 = self.input.LA(1)
 
-            if (LA28_0 == u'\\') :
+            if (LA28_0 == u'\\'):
                 LA28_1 = self.input.LA(2)
 
-                if (LA28_1 == u'"' or LA28_1 == u'\'' or LA28_1 == u'\\' or LA28_1 == u'b' or LA28_1 == u'f' or LA28_1 == u'n' or LA28_1 == u'r' or LA28_1 == u't') :
+                if (LA28_1 == u'"' or LA28_1 == u'\'' or LA28_1 == u'\\' or LA28_1 == u'b' or LA28_1 == u'f' or LA28_1 == u'n' or LA28_1 == u'r' or LA28_1 == u't'):
                     alt28 = 1
-                elif ((u'0' <= LA28_1 <= u'7')) :
+                elif ((u'0' <= LA28_1 <= u'7')):
                     alt28 = 2
                 else:
-                    nvae = NoViableAltException("635:1: fragment EscapeSequence : ( '\\\\' ( 'b' | 't' | 'n' | 'f' | 'r' | '\\\"' | '\\'' | '\\\\' ) | OctalEscape );", 28, 1, self.input)
+                    nvae = NoViableAltException(
+                        "635:1: fragment EscapeSequence : ( '\\\\' ( 'b' | 't' | 'n' | 'f' | 'r' | '\\\"' | '\\'' | '\\\\' ) | OctalEscape );", 28, 1, self.input)
 
                     raise nvae
 
             else:
-                nvae = NoViableAltException("635:1: fragment EscapeSequence : ( '\\\\' ( 'b' | 't' | 'n' | 'f' | 'r' | '\\\"' | '\\'' | '\\\\' ) | OctalEscape );", 28, 0, self.input)
+                nvae = NoViableAltException(
+                    "635:1: fragment EscapeSequence : ( '\\\\' ( 'b' | 't' | 'n' | 'f' | 'r' | '\\\"' | '\\'' | '\\\\' ) | OctalEscape );", 28, 0, self.input)
 
                 raise nvae
 
@@ -3226,32 +2515,25 @@ class CLexer(Lexer):
                 self.match(u'\\')
 
                 if self.input.LA(1) == u'"' or self.input.LA(1) == u'\'' or self.input.LA(1) == u'\\' or self.input.LA(1) == u'b' or self.input.LA(1) == u'f' or self.input.LA(1) == u'n' or self.input.LA(1) == u'r' or self.input.LA(1) == u't':
-                    self.input.consume();
+                    self.input.consume()
 
                 else:
                     mse = MismatchedSetException(None, self.input)
                     self.recover(mse)
                     raise mse
 
-
-
-
             elif alt28 == 2:
                 # C.g:638:9: OctalEscape
                 self.mOctalEscape()
 
-
-
-
         finally:
 
             pass
 
     # $ANTLR end EscapeSequence
 
-
-
     # $ANTLR start OctalEscape
+
     def mOctalEscape(self, ):
 
         try:
@@ -3259,35 +2541,37 @@ class CLexer(Lexer):
             alt29 = 3
             LA29_0 = self.input.LA(1)
 
-            if (LA29_0 == u'\\') :
+            if (LA29_0 == u'\\'):
                 LA29_1 = self.input.LA(2)
 
-                if ((u'0' <= LA29_1 <= u'3')) :
+                if ((u'0' <= LA29_1 <= u'3')):
                     LA29_2 = self.input.LA(3)
 
-                    if ((u'0' <= LA29_2 <= u'7')) :
+                    if ((u'0' <= LA29_2 <= u'7')):
                         LA29_4 = self.input.LA(4)
 
-                        if ((u'0' <= LA29_4 <= u'7')) :
+                        if ((u'0' <= LA29_4 <= u'7')):
                             alt29 = 1
                         else:
                             alt29 = 2
                     else:
                         alt29 = 3
-                elif ((u'4' <= LA29_1 <= u'7')) :
+                elif ((u'4' <= LA29_1 <= u'7')):
                     LA29_3 = self.input.LA(3)
 
-                    if ((u'0' <= LA29_3 <= u'7')) :
+                    if ((u'0' <= LA29_3 <= u'7')):
                         alt29 = 2
                     else:
                         alt29 = 3
                 else:
-                    nvae = NoViableAltException("641:1: fragment OctalEscape : ( '\\\\' ( '0' .. '3' ) ( '0' .. '7' ) ( '0' .. '7' ) | '\\\\' ( '0' .. '7' ) ( '0' .. '7' ) | '\\\\' ( '0' .. '7' ) );", 29, 1, self.input)
+                    nvae = NoViableAltException(
+                        "641:1: fragment OctalEscape : ( '\\\\' ( '0' .. '3' ) ( '0' .. '7' ) ( '0' .. '7' ) | '\\\\' ( '0' .. '7' ) ( '0' .. '7' ) | '\\\\' ( '0' .. '7' ) );", 29, 1, self.input)
 
                     raise nvae
 
             else:
-                nvae = NoViableAltException("641:1: fragment OctalEscape : ( '\\\\' ( '0' .. '3' ) ( '0' .. '7' ) ( '0' .. '7' ) | '\\\\' ( '0' .. '7' ) ( '0' .. '7' ) | '\\\\' ( '0' .. '7' ) );", 29, 0, self.input)
+                nvae = NoViableAltException(
+                    "641:1: fragment OctalEscape : ( '\\\\' ( '0' .. '3' ) ( '0' .. '7' ) ( '0' .. '7' ) | '\\\\' ( '0' .. '7' ) ( '0' .. '7' ) | '\\\\' ( '0' .. '7' ) );", 29, 0, self.input)
 
                 raise nvae
 
@@ -3299,25 +2583,14 @@ class CLexer(Lexer):
                 # C.g:643:15: '0' .. '3'
                 self.matchRange(u'0', u'3')
 
-
-
-
                 # C.g:643:25: ( '0' .. '7' )
                 # C.g:643:26: '0' .. '7'
                 self.matchRange(u'0', u'7')
 
-
-
-
                 # C.g:643:36: ( '0' .. '7' )
                 # C.g:643:37: '0' .. '7'
                 self.matchRange(u'0', u'7')
 
-
-
-
-
-
             elif alt29 == 2:
                 # C.g:644:9: '\\\\' ( '0' .. '7' ) ( '0' .. '7' )
                 self.match(u'\\')
@@ -3326,18 +2599,10 @@ class CLexer(Lexer):
                 # C.g:644:15: '0' .. '7'
                 self.matchRange(u'0', u'7')
 
-
-
-
                 # C.g:644:25: ( '0' .. '7' )
                 # C.g:644:26: '0' .. '7'
                 self.matchRange(u'0', u'7')
 
-
-
-
-
-
             elif alt29 == 3:
                 # C.g:645:9: '\\\\' ( '0' .. '7' )
                 self.match(u'\\')
@@ -3346,21 +2611,14 @@ class CLexer(Lexer):
                 # C.g:645:15: '0' .. '7'
                 self.matchRange(u'0', u'7')
 
-
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end OctalEscape
 
-
-
     # $ANTLR start UnicodeEscape
+
     def mUnicodeEscape(self, ):
 
         try:
@@ -3378,19 +2636,14 @@ class CLexer(Lexer):
 
             self.mHexDigit()
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end UnicodeEscape
 
-
-
     # $ANTLR start WS
+
     def mWS(self, ):
 
         try:
@@ -3399,20 +2652,16 @@ class CLexer(Lexer):
             # C.g:653:5: ( ( ' ' | '\\r' | '\\t' | '\\u000C' | '\\n' ) )
             # C.g:653:8: ( ' ' | '\\r' | '\\t' | '\\u000C' | '\\n' )
             if (u'\t' <= self.input.LA(1) <= u'\n') or (u'\f' <= self.input.LA(1) <= u'\r') or self.input.LA(1) == u' ':
-                self.input.consume();
+                self.input.consume()
 
             else:
                 mse = MismatchedSetException(None, self.input)
                 self.recover(mse)
                 raise mse
 
-
-            #action start
-            self.channel=HIDDEN;
-            #action end
-
-
-
+            # action start
+            self.channel = HIDDEN
+            # action end
 
         finally:
 
@@ -3420,9 +2669,8 @@ class CLexer(Lexer):
 
     # $ANTLR end WS
 
-
-
     # $ANTLR start BS
+
     def mBS(self, ):
 
         try:
@@ -3434,15 +2682,9 @@ class CLexer(Lexer):
             # C.g:657:8: '\\\\'
             self.match(u'\\')
 
-
-
-
-            #action start
-            self.channel=HIDDEN;
-            #action end
-
-
-
+            # action start
+            self.channel = HIDDEN
+            # action end
 
         finally:
 
@@ -3450,9 +2692,8 @@ class CLexer(Lexer):
 
     # $ANTLR end BS
 
-
-
     # $ANTLR start UnicodeVocabulary
+
     def mUnicodeVocabulary(self, ):
 
         try:
@@ -3462,19 +2703,14 @@ class CLexer(Lexer):
             # C.g:665:7: '\\u0003' .. '\\uFFFE'
             self.matchRange(u'\u0003', u'\uFFFE')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end UnicodeVocabulary
 
-
-
     # $ANTLR start COMMENT
+
     def mCOMMENT(self, ):
 
         try:
@@ -3484,44 +2720,34 @@ class CLexer(Lexer):
             # C.g:668:9: '/*' ( options {greedy=false; } : . )* '*/'
             self.match("/*")
 
-
             # C.g:668:14: ( options {greedy=false; } : . )*
-            while True: #loop30
+            while True:  # loop30
                 alt30 = 2
                 LA30_0 = self.input.LA(1)
 
-                if (LA30_0 == u'*') :
+                if (LA30_0 == u'*'):
                     LA30_1 = self.input.LA(2)
 
-                    if (LA30_1 == u'/') :
+                    if (LA30_1 == u'/'):
                         alt30 = 2
-                    elif ((u'\u0000' <= LA30_1 <= u'.') or (u'0' <= LA30_1 <= u'\uFFFE')) :
+                    elif ((u'\u0000' <= LA30_1 <= u'.') or (u'0' <= LA30_1 <= u'\uFFFE')):
                         alt30 = 1
 
-
-                elif ((u'\u0000' <= LA30_0 <= u')') or (u'+' <= LA30_0 <= u'\uFFFE')) :
+                elif ((u'\u0000' <= LA30_0 <= u')') or (u'+' <= LA30_0 <= u'\uFFFE')):
                     alt30 = 1
 
-
                 if alt30 == 1:
                     # C.g:668:42: .
                     self.matchAny()
 
-
-
                 else:
-                    break #loop30
-
+                    break  # loop30
 
             self.match("*/")
 
-
-            #action start
-            self.channel=HIDDEN;
-            #action end
-
-
-
+            # action start
+            self.channel = HIDDEN
+            # action end
 
         finally:
 
@@ -3529,9 +2755,8 @@ class CLexer(Lexer):
 
     # $ANTLR end COMMENT
 
-
-
     # $ANTLR start LINE_COMMENT
+
     def mLINE_COMMENT(self, ):
 
         try:
@@ -3541,54 +2766,42 @@ class CLexer(Lexer):
             # C.g:673:7: '//' (~ ( '\\n' | '\\r' ) )* ( '\\r' )? '\\n'
             self.match("//")
 
-
             # C.g:673:12: (~ ( '\\n' | '\\r' ) )*
-            while True: #loop31
+            while True:  # loop31
                 alt31 = 2
                 LA31_0 = self.input.LA(1)
 
-                if ((u'\u0000' <= LA31_0 <= u'\t') or (u'\u000B' <= LA31_0 <= u'\f') or (u'\u000E' <= LA31_0 <= u'\uFFFE')) :
+                if ((u'\u0000' <= LA31_0 <= u'\t') or (u'\u000B' <= LA31_0 <= u'\f') or (u'\u000E' <= LA31_0 <= u'\uFFFE')):
                     alt31 = 1
 
-
                 if alt31 == 1:
                     # C.g:673:12: ~ ( '\\n' | '\\r' )
                     if (u'\u0000' <= self.input.LA(1) <= u'\t') or (u'\u000B' <= self.input.LA(1) <= u'\f') or (u'\u000E' <= self.input.LA(1) <= u'\uFFFE'):
-                        self.input.consume();
+                        self.input.consume()
 
                     else:
                         mse = MismatchedSetException(None, self.input)
                         self.recover(mse)
                         raise mse
 
-
-
-
                 else:
-                    break #loop31
-
+                    break  # loop31
 
             # C.g:673:26: ( '\\r' )?
             alt32 = 2
             LA32_0 = self.input.LA(1)
 
-            if (LA32_0 == u'\r') :
+            if (LA32_0 == u'\r'):
                 alt32 = 1
             if alt32 == 1:
                 # C.g:673:26: '\\r'
                 self.match(u'\r')
 
-
-
-
             self.match(u'\n')
 
-            #action start
-            self.channel=HIDDEN;
-            #action end
-
-
-
+            # action start
+            self.channel = HIDDEN
+            # action end
 
         finally:
 
@@ -3596,9 +2809,8 @@ class CLexer(Lexer):
 
     # $ANTLR end LINE_COMMENT
 
-
-
     # $ANTLR start LINE_COMMAND
+
     def mLINE_COMMAND(self, ):
 
         try:
@@ -3609,52 +2821,41 @@ class CLexer(Lexer):
             self.match(u'#')
 
             # C.g:678:11: (~ ( '\\n' | '\\r' ) )*
-            while True: #loop33
+            while True:  # loop33
                 alt33 = 2
                 LA33_0 = self.input.LA(1)
 
-                if ((u'\u0000' <= LA33_0 <= u'\t') or (u'\u000B' <= LA33_0 <= u'\f') or (u'\u000E' <= LA33_0 <= u'\uFFFE')) :
+                if ((u'\u0000' <= LA33_0 <= u'\t') or (u'\u000B' <= LA33_0 <= u'\f') or (u'\u000E' <= LA33_0 <= u'\uFFFE')):
                     alt33 = 1
 
-
                 if alt33 == 1:
                     # C.g:678:11: ~ ( '\\n' | '\\r' )
                     if (u'\u0000' <= self.input.LA(1) <= u'\t') or (u'\u000B' <= self.input.LA(1) <= u'\f') or (u'\u000E' <= self.input.LA(1) <= u'\uFFFE'):
-                        self.input.consume();
+                        self.input.consume()
 
                     else:
                         mse = MismatchedSetException(None, self.input)
                         self.recover(mse)
                         raise mse
 
-
-
-
                 else:
-                    break #loop33
-
+                    break  # loop33
 
             # C.g:678:25: ( '\\r' )?
             alt34 = 2
             LA34_0 = self.input.LA(1)
 
-            if (LA34_0 == u'\r') :
+            if (LA34_0 == u'\r'):
                 alt34 = 1
             if alt34 == 1:
                 # C.g:678:25: '\\r'
                 self.match(u'\r')
 
-
-
-
             self.match(u'\n')
 
-            #action start
-            self.channel=HIDDEN;
-            #action end
-
-
-
+            # action start
+            self.channel = HIDDEN
+            # action end
 
         finally:
 
@@ -3662,8 +2863,6 @@ class CLexer(Lexer):
 
     # $ANTLR end LINE_COMMAND
 
-
-
     def mTokens(self):
         # C.g:1:8: ( T25 | T26 | T27 | T28 | T29 | T30 | T31 | T32 | T33 | T34 | T35 | T36 | T37 | T38 | T39 | T40 | T41 | T42 | T43 | T44 | T45 | T46 | T47 | T48 | T49 | T50 | T51 | T52 | T53 | T54 | T55 | T56 | T57 | T58 | T59 | T60 | T61 | T62 | T63 | T64 | T65 | T66 | T67 | T68 | T69 | T70 | T71 | T72 | T73 | T74 | T75 | T76 | T77 | T78 | T79 | T80 | T81 | T82 | T83 | T84 | T85 | T86 | T87 | T88 | T89 | T90 | T91 | T92 | T93 | T94 | T95 | T96 | T97 | T98 | T99 | T100 | T101 | T102 | T103 | T104 | T105 | T106 | T107 | T108 | T109 | T110 | T111 | T112 | T113 | T114 | T115 | T116 | T117 | IDENTIFIER | CHARACTER_LITERAL | STRING_LITERAL | HEX_LITERAL | DECIMAL_LITERAL | OCTAL_LITERAL | FLOATING_POINT_LITERAL | WS | BS | UnicodeVocabulary | COMMENT | LINE_COMMENT | LINE_COMMAND )
         alt35 = 106
@@ -3672,681 +2871,463 @@ class CLexer(Lexer):
             # C.g:1:10: T25
             self.mT25()
 
-
-
         elif alt35 == 2:
             # C.g:1:14: T26
             self.mT26()
 
-
-
         elif alt35 == 3:
             # C.g:1:18: T27
             self.mT27()
 
-
-
         elif alt35 == 4:
             # C.g:1:22: T28
             self.mT28()
 
-
-
         elif alt35 == 5:
             # C.g:1:26: T29
             self.mT29()
 
-
-
         elif alt35 == 6:
             # C.g:1:30: T30
             self.mT30()
 
-
-
         elif alt35 == 7:
             # C.g:1:34: T31
             self.mT31()
 
-
-
         elif alt35 == 8:
             # C.g:1:38: T32
             self.mT32()
 
-
-
         elif alt35 == 9:
             # C.g:1:42: T33
             self.mT33()
 
-
-
         elif alt35 == 10:
             # C.g:1:46: T34
             self.mT34()
 
-
-
         elif alt35 == 11:
             # C.g:1:50: T35
             self.mT35()
 
-
-
         elif alt35 == 12:
             # C.g:1:54: T36
             self.mT36()
 
-
-
         elif alt35 == 13:
             # C.g:1:58: T37
             self.mT37()
 
-
-
         elif alt35 == 14:
             # C.g:1:62: T38
             self.mT38()
 
-
-
         elif alt35 == 15:
             # C.g:1:66: T39
             self.mT39()
 
-
-
         elif alt35 == 16:
             # C.g:1:70: T40
             self.mT40()
 
-
-
         elif alt35 == 17:
             # C.g:1:74: T41
             self.mT41()
 
-
-
         elif alt35 == 18:
             # C.g:1:78: T42
             self.mT42()
 
-
-
         elif alt35 == 19:
             # C.g:1:82: T43
             self.mT43()
 
-
-
         elif alt35 == 20:
             # C.g:1:86: T44
             self.mT44()
 
-
-
         elif alt35 == 21:
             # C.g:1:90: T45
             self.mT45()
 
-
-
         elif alt35 == 22:
             # C.g:1:94: T46
             self.mT46()
 
-
-
         elif alt35 == 23:
             # C.g:1:98: T47
             self.mT47()
 
-
-
         elif alt35 == 24:
             # C.g:1:102: T48
             self.mT48()
 
-
-
         elif alt35 == 25:
             # C.g:1:106: T49
             self.mT49()
 
-
-
         elif alt35 == 26:
             # C.g:1:110: T50
             self.mT50()
 
-
-
         elif alt35 == 27:
             # C.g:1:114: T51
             self.mT51()
 
-
-
         elif alt35 == 28:
             # C.g:1:118: T52
             self.mT52()
 
-
-
         elif alt35 == 29:
             # C.g:1:122: T53
             self.mT53()
 
-
-
         elif alt35 == 30:
             # C.g:1:126: T54
             self.mT54()
 
-
-
         elif alt35 == 31:
             # C.g:1:130: T55
             self.mT55()
 
-
-
         elif alt35 == 32:
             # C.g:1:134: T56
             self.mT56()
 
-
-
         elif alt35 == 33:
             # C.g:1:138: T57
             self.mT57()
 
-
-
         elif alt35 == 34:
             # C.g:1:142: T58
             self.mT58()
 
-
-
         elif alt35 == 35:
             # C.g:1:146: T59
             self.mT59()
 
-
-
         elif alt35 == 36:
             # C.g:1:150: T60
             self.mT60()
 
-
-
         elif alt35 == 37:
             # C.g:1:154: T61
             self.mT61()
 
-
-
         elif alt35 == 38:
             # C.g:1:158: T62
             self.mT62()
 
-
-
         elif alt35 == 39:
             # C.g:1:162: T63
             self.mT63()
 
-
-
         elif alt35 == 40:
             # C.g:1:166: T64
             self.mT64()
 
-
-
         elif alt35 == 41:
             # C.g:1:170: T65
             self.mT65()
 
-
-
         elif alt35 == 42:
             # C.g:1:174: T66
             self.mT66()
 
-
-
         elif alt35 == 43:
             # C.g:1:178: T67
             self.mT67()
 
-
-
         elif alt35 == 44:
             # C.g:1:182: T68
             self.mT68()
 
-
-
         elif alt35 == 45:
             # C.g:1:186: T69
             self.mT69()
 
-
-
         elif alt35 == 46:
             # C.g:1:190: T70
             self.mT70()
 
-
-
         elif alt35 == 47:
             # C.g:1:194: T71
             self.mT71()
 
-
-
         elif alt35 == 48:
             # C.g:1:198: T72
             self.mT72()
 
-
-
         elif alt35 == 49:
             # C.g:1:202: T73
             self.mT73()
 
-
-
         elif alt35 == 50:
             # C.g:1:206: T74
             self.mT74()
 
-
-
         elif alt35 == 51:
             # C.g:1:210: T75
             self.mT75()
 
-
-
         elif alt35 == 52:
             # C.g:1:214: T76
             self.mT76()
 
-
-
         elif alt35 == 53:
             # C.g:1:218: T77
             self.mT77()
 
-
-
         elif alt35 == 54:
             # C.g:1:222: T78
             self.mT78()
 
-
-
         elif alt35 == 55:
             # C.g:1:226: T79
             self.mT79()
 
-
-
         elif alt35 == 56:
             # C.g:1:230: T80
             self.mT80()
 
-
-
         elif alt35 == 57:
             # C.g:1:234: T81
             self.mT81()
 
-
-
         elif alt35 == 58:
             # C.g:1:238: T82
             self.mT82()
 
-
-
         elif alt35 == 59:
             # C.g:1:242: T83
             self.mT83()
 
-
-
         elif alt35 == 60:
             # C.g:1:246: T84
             self.mT84()
 
-
-
         elif alt35 == 61:
             # C.g:1:250: T85
             self.mT85()
 
-
-
         elif alt35 == 62:
             # C.g:1:254: T86
             self.mT86()
 
-
-
         elif alt35 == 63:
             # C.g:1:258: T87
             self.mT87()
 
-
-
         elif alt35 == 64:
             # C.g:1:262: T88
             self.mT88()
 
-
-
         elif alt35 == 65:
             # C.g:1:266: T89
             self.mT89()
 
-
-
         elif alt35 == 66:
             # C.g:1:270: T90
             self.mT90()
 
-
-
         elif alt35 == 67:
             # C.g:1:274: T91
             self.mT91()
 
-
-
         elif alt35 == 68:
             # C.g:1:278: T92
             self.mT92()
 
-
-
         elif alt35 == 69:
             # C.g:1:282: T93
             self.mT93()
 
-
-
         elif alt35 == 70:
             # C.g:1:286: T94
             self.mT94()
 
-
-
         elif alt35 == 71:
             # C.g:1:290: T95
             self.mT95()
 
-
-
         elif alt35 == 72:
             # C.g:1:294: T96
             self.mT96()
 
-
-
         elif alt35 == 73:
             # C.g:1:298: T97
             self.mT97()
 
-
-
         elif alt35 == 74:
             # C.g:1:302: T98
             self.mT98()
 
-
-
         elif alt35 == 75:
             # C.g:1:306: T99
             self.mT99()
 
-
-
         elif alt35 == 76:
             # C.g:1:310: T100
             self.mT100()
 
-
-
         elif alt35 == 77:
             # C.g:1:315: T101
             self.mT101()
 
-
-
         elif alt35 == 78:
             # C.g:1:320: T102
             self.mT102()
 
-
-
         elif alt35 == 79:
             # C.g:1:325: T103
             self.mT103()
 
-
-
         elif alt35 == 80:
             # C.g:1:330: T104
             self.mT104()
 
-
-
         elif alt35 == 81:
             # C.g:1:335: T105
             self.mT105()
 
-
-
         elif alt35 == 82:
             # C.g:1:340: T106
             self.mT106()
 
-
-
         elif alt35 == 83:
             # C.g:1:345: T107
             self.mT107()
 
-
-
         elif alt35 == 84:
             # C.g:1:350: T108
             self.mT108()
 
-
-
         elif alt35 == 85:
             # C.g:1:355: T109
             self.mT109()
 
-
-
         elif alt35 == 86:
             # C.g:1:360: T110
             self.mT110()
 
-
-
         elif alt35 == 87:
             # C.g:1:365: T111
             self.mT111()
 
-
-
         elif alt35 == 88:
             # C.g:1:370: T112
             self.mT112()
 
-
-
         elif alt35 == 89:
             # C.g:1:375: T113
             self.mT113()
 
-
-
         elif alt35 == 90:
             # C.g:1:380: T114
             self.mT114()
 
-
-
         elif alt35 == 91:
             # C.g:1:385: T115
             self.mT115()
 
-
-
         elif alt35 == 92:
             # C.g:1:390: T116
             self.mT116()
 
-
-
         elif alt35 == 93:
             # C.g:1:395: T117
             self.mT117()
 
-
-
         elif alt35 == 94:
             # C.g:1:400: IDENTIFIER
             self.mIDENTIFIER()
 
-
-
         elif alt35 == 95:
             # C.g:1:411: CHARACTER_LITERAL
             self.mCHARACTER_LITERAL()
 
-
-
         elif alt35 == 96:
             # C.g:1:429: STRING_LITERAL
             self.mSTRING_LITERAL()
 
-
-
         elif alt35 == 97:
             # C.g:1:444: HEX_LITERAL
             self.mHEX_LITERAL()
 
-
-
         elif alt35 == 98:
             # C.g:1:456: DECIMAL_LITERAL
             self.mDECIMAL_LITERAL()
 
-
-
         elif alt35 == 99:
             # C.g:1:472: OCTAL_LITERAL
             self.mOCTAL_LITERAL()
 
-
-
         elif alt35 == 100:
             # C.g:1:486: FLOATING_POINT_LITERAL
             self.mFLOATING_POINT_LITERAL()
 
-
-
         elif alt35 == 101:
             # C.g:1:509: WS
             self.mWS()
 
-
-
         elif alt35 == 102:
             # C.g:1:512: BS
             self.mBS()
 
-
-
         elif alt35 == 103:
             # C.g:1:515: UnicodeVocabulary
             self.mUnicodeVocabulary()
 
-
-
         elif alt35 == 104:
             # C.g:1:533: COMMENT
             self.mCOMMENT()
 
-
-
         elif alt35 == 105:
             # C.g:1:541: LINE_COMMENT
             self.mLINE_COMMENT()
 
-
-
         elif alt35 == 106:
             # C.g:1:554: LINE_COMMAND
             self.mLINE_COMMAND()
 
-
-
-
-
-
-
-
     # lookup tables for DFA #25
 
     DFA25_eot = DFA.unpack(
         u"\7\uffff\1\10\2\uffff"
-        )
+    )
 
     DFA25_eof = DFA.unpack(
         u"\12\uffff"
-        )
+    )
 
     DFA25_min = DFA.unpack(
         u"\2\56\2\uffff\1\53\1\uffff\2\60\2\uffff"
-        )
+    )
 
     DFA25_max = DFA.unpack(
         u"\1\71\1\146\2\uffff\1\71\1\uffff\1\71\1\146\2\uffff"
-        )
+    )
 
     DFA25_accept = DFA.unpack(
         u"\2\uffff\1\2\1\1\1\uffff\1\4\2\uffff\2\3"
-        )
+    )
 
     DFA25_special = DFA.unpack(
         u"\12\uffff"
-        )
-
+    )
 
     DFA25_transition = [
         DFA.unpack(u"\1\2\1\uffff\12\1"),
         DFA.unpack(u"\1\3\1\uffff\12\1\12\uffff\1\5\1\4\1\5\35\uffff\1\5"
-        u"\1\4\1\5"),
+                   u"\1\4\1\5"),
         DFA.unpack(u""),
         DFA.unpack(u""),
         DFA.unpack(u"\1\6\1\uffff\1\6\2\uffff\12\7"),
         DFA.unpack(u""),
         DFA.unpack(u"\12\7"),
         DFA.unpack(u"\12\7\12\uffff\1\11\1\uffff\1\11\35\uffff\1\11\1\uffff"
-        u"\1\11"),
+                   u"\1\11"),
         DFA.unpack(u""),
         DFA.unpack(u"")
     ]
@@ -4376,11 +3357,11 @@ class CLexer(Lexer):
         u"\uffff\1\u0164\1\u0165\1\76\1\u0167\3\76\6\uffff\1\u016b\1\uffff"
         u"\3\76\1\uffff\21\76\1\u0180\2\76\1\uffff\3\76\1\u0186\1\76\1\uffff"
         u"\11\76\1\u0191\1\uffff"
-        )
+    )
 
     DFA35_eof = DFA.unpack(
         u"\u0192\uffff"
-        )
+    )
 
     DFA35_min = DFA.unpack(
         u"\1\3\1\uffff\1\171\1\uffff\1\75\1\154\1\150\1\165\1\145\1\124\1"
@@ -4413,7 +3394,7 @@ class CLexer(Lexer):
         u"\1\111\1\137\1\122\1\103\1\111\1\126\1\105\1\106\1\111\1\44\1\137"
         u"\1\103\1\uffff\1\125\1\105\1\116\1\44\1\122\1\uffff\1\105\1\106"
         u"\1\105\1\122\1\105\1\116\1\103\1\105\1\104\1\44\1\uffff"
-        )
+    )
 
     DFA35_max = DFA.unpack(
         u"\1\ufffe\1\uffff\1\171\1\uffff\1\75\1\170\1\167\1\165\1\145\1\124"
@@ -4447,7 +3428,7 @@ class CLexer(Lexer):
         u"\1\106\1\111\1\172\1\137\1\103\1\uffff\1\125\1\105\1\116\1\172"
         u"\1\122\1\uffff\1\105\1\106\1\105\1\122\1\105\1\116\1\103\1\105"
         u"\1\104\1\172\1\uffff"
-        )
+    )
 
     DFA35_accept = DFA.unpack(
         u"\1\uffff\1\1\1\uffff\1\3\15\uffff\1\23\1\24\1\27\10\uffff\1\46"
@@ -4467,21 +3448,20 @@ class CLexer(Lexer):
         u"\uffff\1\42\1\45\1\uffff\1\2\3\uffff\1\123\7\uffff\1\117\1\10\1"
         u"\32\1\133\1\22\1\35\1\uffff\1\40\3\uffff\1\37\24\uffff\1\43\5\uffff"
         u"\1\44\12\uffff\1\41"
-        )
+    )
 
     DFA35_special = DFA.unpack(
         u"\u0192\uffff"
-        )
-
+    )
 
     DFA35_transition = [
         DFA.unpack(u"\6\73\2\70\1\73\2\70\22\73\1\70\1\50\1\65\1\72\1\63"
-        u"\1\45\1\46\1\64\1\34\1\35\1\40\1\42\1\3\1\43\1\41\1\44\1\66\11"
-        u"\67\1\23\1\1\1\51\1\4\1\52\1\55\1\73\2\63\1\26\1\63\1\32\1\63\1"
-        u"\31\1\63\1\24\2\63\1\62\2\63\1\25\1\33\2\63\1\11\1\63\1\27\1\30"
-        u"\4\63\1\36\1\71\1\37\1\53\1\56\1\73\1\7\1\61\1\13\1\17\1\5\1\16"
-        u"\1\60\1\63\1\14\2\63\1\15\5\63\1\10\1\6\1\2\1\20\1\12\1\57\3\63"
-        u"\1\21\1\54\1\22\1\47\uff80\73"),
+                   u"\1\45\1\46\1\64\1\34\1\35\1\40\1\42\1\3\1\43\1\41\1\44\1\66\11"
+                   u"\67\1\23\1\1\1\51\1\4\1\52\1\55\1\73\2\63\1\26\1\63\1\32\1\63\1"
+                   u"\31\1\63\1\24\2\63\1\62\2\63\1\25\1\33\2\63\1\11\1\63\1\27\1\30"
+                   u"\4\63\1\36\1\71\1\37\1\53\1\56\1\73\1\7\1\61\1\13\1\17\1\5\1\16"
+                   u"\1\60\1\63\1\14\2\63\1\15\5\63\1\10\1\6\1\2\1\20\1\12\1\57\3\63"
+                   u"\1\21\1\54\1\22\1\47\uff80\73"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\75"),
         DFA.unpack(u""),
@@ -4536,7 +3516,7 @@ class CLexer(Lexer):
         DFA.unpack(u"\47\u0092\1\uffff\uffd7\u0092"),
         DFA.unpack(u"\uffff\u0091"),
         DFA.unpack(u"\1\154\1\uffff\10\u0094\2\154\12\uffff\3\154\21\uffff"
-        u"\1\u0093\13\uffff\3\154\21\uffff\1\u0093"),
+                   u"\1\u0093\13\uffff\3\154\21\uffff\1\u0093"),
         DFA.unpack(u"\1\154\1\uffff\12\u0096\12\uffff\3\154\35\uffff\3\154"),
         DFA.unpack(u""),
         DFA.unpack(u""),
@@ -4563,20 +3543,20 @@ class CLexer(Lexer):
         DFA.unpack(u"\1\u00ab"),
         DFA.unpack(u"\1\u00ac"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u00ae"),
         DFA.unpack(u"\1\u00af"),
         DFA.unpack(u"\1\u00b0"),
         DFA.unpack(u"\1\u00b1"),
         DFA.unpack(u"\1\u00b2"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\24\76\1\u00b3\5\76"),
+                   u"\24\76\1\u00b3\5\76"),
         DFA.unpack(u"\1\u00b6\11\uffff\1\u00b5"),
         DFA.unpack(u""),
         DFA.unpack(u""),
         DFA.unpack(u""),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u00b8"),
         DFA.unpack(u"\1\u00b9"),
         DFA.unpack(u"\1\u00ba"),
@@ -4634,7 +3614,7 @@ class CLexer(Lexer):
         DFA.unpack(u""),
         DFA.unpack(u""),
         DFA.unpack(u"\1\154\1\uffff\10\u0094\2\154\12\uffff\3\154\35\uffff"
-        u"\3\154"),
+                   u"\3\154"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\154\1\uffff\12\u0096\12\uffff\3\154\35\uffff\3\154"),
         DFA.unpack(u""),
@@ -4661,10 +3641,10 @@ class CLexer(Lexer):
         DFA.unpack(u"\1\u00dd"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u00df"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u00e1"),
         DFA.unpack(u"\1\u00e2"),
         DFA.unpack(u"\1\u00e3"),
@@ -4674,7 +3654,7 @@ class CLexer(Lexer):
         DFA.unpack(u""),
         DFA.unpack(u"\1\u00e6"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u00e8"),
         DFA.unpack(u"\1\u00e9"),
         DFA.unpack(u"\1\u00ea"),
@@ -4693,10 +3673,10 @@ class CLexer(Lexer):
         DFA.unpack(u""),
         DFA.unpack(u"\1\u00f4"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u00f6"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u00f8"),
         DFA.unpack(u"\1\u00f9"),
         DFA.unpack(u"\1\u00fa"),
@@ -4704,22 +3684,22 @@ class CLexer(Lexer):
         DFA.unpack(u"\1\u00fc"),
         DFA.unpack(u"\1\u00fd"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u00ff"),
         DFA.unpack(u"\1\u0100"),
         DFA.unpack(u"\1\u0101"),
         DFA.unpack(u"\1\u0102"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u0105"),
         DFA.unpack(u"\1\u0106"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\u0109"),
         DFA.unpack(u"\1\u010a"),
@@ -4737,10 +3717,10 @@ class CLexer(Lexer):
         DFA.unpack(u"\1\u0116"),
         DFA.unpack(u"\1\u0117"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u0119"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u011b"),
         DFA.unpack(u"\1\u011c"),
         DFA.unpack(u""),
@@ -4752,7 +3732,7 @@ class CLexer(Lexer):
         DFA.unpack(u"\1\u0121"),
         DFA.unpack(u"\1\u0122"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\u0124"),
         DFA.unpack(u"\1\u0125"),
@@ -4762,19 +3742,19 @@ class CLexer(Lexer):
         DFA.unpack(u""),
         DFA.unpack(u"\1\u0128"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u""),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u012b"),
         DFA.unpack(u"\1\u012c"),
         DFA.unpack(u"\1\u012d"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u012f"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u0131"),
         DFA.unpack(u"\1\u0132"),
         DFA.unpack(u"\1\u0133"),
@@ -4783,39 +3763,39 @@ class CLexer(Lexer):
         DFA.unpack(u"\1\u0136"),
         DFA.unpack(u"\1\u0137"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\u0138\1"
-        u"\uffff\32\76"),
+                   u"\uffff\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u013c"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\u0143"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u0146"),
         DFA.unpack(u"\1\u0147"),
         DFA.unpack(u""),
         DFA.unpack(u""),
         DFA.unpack(u"\1\u0148"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u014a"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\u014b"),
@@ -4826,15 +3806,15 @@ class CLexer(Lexer):
         DFA.unpack(u"\1\u014f"),
         DFA.unpack(u"\1\u0150"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u0153"),
         DFA.unpack(u""),
         DFA.unpack(u""),
         DFA.unpack(u""),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u""),
         DFA.unpack(u""),
@@ -4847,7 +3827,7 @@ class CLexer(Lexer):
         DFA.unpack(u"\1\u0156"),
         DFA.unpack(u"\1\u0157"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\u0159"),
         DFA.unpack(u"\1\u015a"),
@@ -4859,22 +3839,22 @@ class CLexer(Lexer):
         DFA.unpack(u""),
         DFA.unpack(u""),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u0166"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u0168"),
         DFA.unpack(u"\1\u0169"),
         DFA.unpack(u"\1\u016a"),
@@ -4885,7 +3865,7 @@ class CLexer(Lexer):
         DFA.unpack(u""),
         DFA.unpack(u""),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\u016c"),
         DFA.unpack(u"\1\u016d"),
@@ -4909,7 +3889,7 @@ class CLexer(Lexer):
         DFA.unpack(u"\1\u017e"),
         DFA.unpack(u"\1\u017f"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u0181"),
         DFA.unpack(u"\1\u0182"),
         DFA.unpack(u""),
@@ -4917,7 +3897,7 @@ class CLexer(Lexer):
         DFA.unpack(u"\1\u0184"),
         DFA.unpack(u"\1\u0185"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u0187"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\u0188"),
@@ -4930,12 +3910,10 @@ class CLexer(Lexer):
         DFA.unpack(u"\1\u018f"),
         DFA.unpack(u"\1\u0190"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"")
     ]
 
     # class definition for DFA #35
 
     DFA35 = DFA
-
-
diff --git a/BaseTools/Source/Python/Ecc/CParser3/CParser.py b/BaseTools/Source/Python/Ecc/CParser3/CParser.py
index b078397969f9..aaba24b977ea 100644
--- a/BaseTools/Source/Python/Ecc/CParser3/CParser.py
+++ b/BaseTools/Source/Python/Ecc/CParser3/CParser.py
@@ -5,7 +5,7 @@ from __future__ import absolute_import
 from antlr3 import *
 from antlr3.compat import set, frozenset
 
-## @file
+# @file
 # The file defines the parser for C source files.
 #
 # THIS FILE IS AUTO-GENERATED. PLEASE DO NOT MODIFY THIS FILE.
@@ -22,33 +22,32 @@ from Ecc import CodeFragment
 from Ecc import FileProfile
 
 
-
 # for convenience in actions
 HIDDEN = BaseRecognizer.HIDDEN
 
 # token types
-BS=20
-LINE_COMMENT=23
-FloatTypeSuffix=16
-IntegerTypeSuffix=14
-LETTER=11
-OCTAL_LITERAL=6
-CHARACTER_LITERAL=8
-Exponent=15
-EOF=-1
-HexDigit=13
-STRING_LITERAL=9
-WS=19
-FLOATING_POINT_LITERAL=10
-IDENTIFIER=4
-UnicodeEscape=18
-LINE_COMMAND=24
-UnicodeVocabulary=21
-HEX_LITERAL=5
-COMMENT=22
-DECIMAL_LITERAL=7
-EscapeSequence=12
-OctalEscape=17
+BS = 20
+LINE_COMMENT = 23
+FloatTypeSuffix = 16
+IntegerTypeSuffix = 14
+LETTER = 11
+OCTAL_LITERAL = 6
+CHARACTER_LITERAL = 8
+Exponent = 15
+EOF = -1
+HexDigit = 13
+STRING_LITERAL = 9
+WS = 19
+FLOATING_POINT_LITERAL = 10
+IDENTIFIER = 4
+UnicodeEscape = 18
+LINE_COMMAND = 24
+UnicodeVocabulary = 21
+HEX_LITERAL = 5
+COMMENT = 22
+DECIMAL_LITERAL = 7
+EscapeSequence = 12
+OctalEscape = 17
 
 # token names
 tokenNames = [
@@ -81,6 +80,8 @@ class function_definition_scope(object):
         self.LBOffset = None
         self.DeclLine = None
         self.DeclOffset = None
+
+
 class postfix_expression_scope(object):
     def __init__(self):
         self.FuncCallText = None
@@ -98,41 +99,46 @@ class CParser(Parser):
         self.postfix_expression_stack = []
 
     def printTokenInfo(self, line, offset, tokenText):
-        print(str(line)+ ',' + str(offset) + ':' + str(tokenText))
+        print(str(line) + ',' + str(offset) + ':' + str(tokenText))
 
     def StorePredicateExpression(self, StartLine, StartOffset, EndLine, EndOffset, Text):
-      PredExp = CodeFragment.PredicateExpression(Text, (StartLine, StartOffset), (EndLine, EndOffset))
-      FileProfile.PredicateExpressionList.append(PredExp)
+        PredExp = CodeFragment.PredicateExpression(
+            Text, (StartLine, StartOffset), (EndLine, EndOffset))
+        FileProfile.PredicateExpressionList.append(PredExp)
 
     def StoreEnumerationDefinition(self, StartLine, StartOffset, EndLine, EndOffset, Text):
-      EnumDef = CodeFragment.EnumerationDefinition(Text, (StartLine, StartOffset), (EndLine, EndOffset))
-      FileProfile.EnumerationDefinitionList.append(EnumDef)
+        EnumDef = CodeFragment.EnumerationDefinition(
+            Text, (StartLine, StartOffset), (EndLine, EndOffset))
+        FileProfile.EnumerationDefinitionList.append(EnumDef)
 
     def StoreStructUnionDefinition(self, StartLine, StartOffset, EndLine, EndOffset, Text):
-      SUDef = CodeFragment.StructUnionDefinition(Text, (StartLine, StartOffset), (EndLine, EndOffset))
-      FileProfile.StructUnionDefinitionList.append(SUDef)
+        SUDef = CodeFragment.StructUnionDefinition(
+            Text, (StartLine, StartOffset), (EndLine, EndOffset))
+        FileProfile.StructUnionDefinitionList.append(SUDef)
 
     def StoreTypedefDefinition(self, StartLine, StartOffset, EndLine, EndOffset, FromText, ToText):
-      Tdef = CodeFragment.TypedefDefinition(FromText, ToText, (StartLine, StartOffset), (EndLine, EndOffset))
-      FileProfile.TypedefDefinitionList.append(Tdef)
+        Tdef = CodeFragment.TypedefDefinition(
+            FromText, ToText, (StartLine, StartOffset), (EndLine, EndOffset))
+        FileProfile.TypedefDefinitionList.append(Tdef)
 
     def StoreFunctionDefinition(self, StartLine, StartOffset, EndLine, EndOffset, ModifierText, DeclText, LeftBraceLine, LeftBraceOffset, DeclLine, DeclOffset):
-      FuncDef = CodeFragment.FunctionDefinition(ModifierText, DeclText, (StartLine, StartOffset), (EndLine, EndOffset), (LeftBraceLine, LeftBraceOffset), (DeclLine, DeclOffset))
-      FileProfile.FunctionDefinitionList.append(FuncDef)
+        FuncDef = CodeFragment.FunctionDefinition(ModifierText, DeclText, (StartLine, StartOffset), (
+            EndLine, EndOffset), (LeftBraceLine, LeftBraceOffset), (DeclLine, DeclOffset))
+        FileProfile.FunctionDefinitionList.append(FuncDef)
 
     def StoreVariableDeclaration(self, StartLine, StartOffset, EndLine, EndOffset, ModifierText, DeclText):
-      VarDecl = CodeFragment.VariableDeclaration(ModifierText, DeclText, (StartLine, StartOffset), (EndLine, EndOffset))
-      FileProfile.VariableDeclarationList.append(VarDecl)
+        VarDecl = CodeFragment.VariableDeclaration(
+            ModifierText, DeclText, (StartLine, StartOffset), (EndLine, EndOffset))
+        FileProfile.VariableDeclarationList.append(VarDecl)
 
     def StoreFunctionCalling(self, StartLine, StartOffset, EndLine, EndOffset, FuncName, ParamList):
-      FuncCall = CodeFragment.FunctionCalling(FuncName, ParamList, (StartLine, StartOffset), (EndLine, EndOffset))
-      FileProfile.FunctionCallingList.append(FuncCall)
-
-
-
+        FuncCall = CodeFragment.FunctionCalling(
+            FuncName, ParamList, (StartLine, StartOffset), (EndLine, EndOffset))
+        FileProfile.FunctionCallingList.append(FuncCall)
 
     # $ANTLR start translation_unit
     # C.g:102:1: translation_unit : ( external_declaration )* ;
+
     def translation_unit(self, ):
 
         translation_unit_StartIndex = self.input.index()
@@ -144,30 +150,24 @@ class CParser(Parser):
                 # C.g:103:2: ( ( external_declaration )* )
                 # C.g:103:4: ( external_declaration )*
                 # C.g:103:4: ( external_declaration )*
-                while True: #loop1
+                while True:  # loop1
                     alt1 = 2
                     LA1_0 = self.input.LA(1)
 
-                    if (LA1_0 == IDENTIFIER or LA1_0 == 26 or (29 <= LA1_0 <= 42) or (45 <= LA1_0 <= 46) or (48 <= LA1_0 <= 62) or LA1_0 == 66) :
+                    if (LA1_0 == IDENTIFIER or LA1_0 == 26 or (29 <= LA1_0 <= 42) or (45 <= LA1_0 <= 46) or (48 <= LA1_0 <= 62) or LA1_0 == 66):
                         alt1 = 1
 
-
                     if alt1 == 1:
                         # C.g:0:0: external_declaration
-                        self.following.append(self.FOLLOW_external_declaration_in_translation_unit74)
+                        self.following.append(
+                            self.FOLLOW_external_declaration_in_translation_unit74)
                         self.external_declaration()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop1
-
-
-
-
-
+                        break  # loop1
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -182,9 +182,9 @@ class CParser(Parser):
 
     # $ANTLR end translation_unit
 
-
     # $ANTLR start external_declaration
     # C.g:114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );
+
     def external_declaration(self, ):
 
         external_declaration_StartIndex = self.input.index()
@@ -197,316 +197,335 @@ class CParser(Parser):
                 alt3 = 3
                 LA3_0 = self.input.LA(1)
 
-                if ((29 <= LA3_0 <= 33)) :
+                if ((29 <= LA3_0 <= 33)):
                     LA3_1 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 1, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 1, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 34) :
+                elif (LA3_0 == 34):
                     LA3_2 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 2, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 2, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 35) :
+                elif (LA3_0 == 35):
                     LA3_3 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 3, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 3, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 36) :
+                elif (LA3_0 == 36):
                     LA3_4 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 4, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 4, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 37) :
+                elif (LA3_0 == 37):
                     LA3_5 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 5, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 5, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 38) :
+                elif (LA3_0 == 38):
                     LA3_6 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 6, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 6, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 39) :
+                elif (LA3_0 == 39):
                     LA3_7 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 7, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 7, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 40) :
+                elif (LA3_0 == 40):
                     LA3_8 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 8, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 8, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 41) :
+                elif (LA3_0 == 41):
                     LA3_9 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 9, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 9, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 42) :
+                elif (LA3_0 == 42):
                     LA3_10 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 10, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 10, self.input)
 
                         raise nvae
 
-                elif ((45 <= LA3_0 <= 46)) :
+                elif ((45 <= LA3_0 <= 46)):
                     LA3_11 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 11, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 11, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 48) :
+                elif (LA3_0 == 48):
                     LA3_12 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 12, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 12, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == IDENTIFIER) :
+                elif (LA3_0 == IDENTIFIER):
                     LA3_13 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
-                    elif (True) :
+                    elif (True):
                         alt3 = 3
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 13, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 13, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 58) :
+                elif (LA3_0 == 58):
                     LA3_14 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 14, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 14, self.input)
 
                         raise nvae
 
                 elif (LA3_0 == 66) and (self.synpred4()):
                     alt3 = 1
-                elif (LA3_0 == 59) :
+                elif (LA3_0 == 59):
                     LA3_16 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 16, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 16, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 60) :
+                elif (LA3_0 == 60):
                     LA3_17 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 17, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 17, self.input)
 
                         raise nvae
 
-                elif ((49 <= LA3_0 <= 57) or LA3_0 == 61) :
+                elif ((49 <= LA3_0 <= 57) or LA3_0 == 61):
                     LA3_18 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 18, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 18, self.input)
 
                         raise nvae
 
                 elif (LA3_0 == 62) and (self.synpred4()):
                     alt3 = 1
-                elif (LA3_0 == 26) :
+                elif (LA3_0 == 26):
                     alt3 = 2
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 0, self.input)
+                    nvae = NoViableAltException(
+                        "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 0, self.input)
 
                     raise nvae
 
                 if alt3 == 1:
                     # C.g:119:4: ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition
-                    self.following.append(self.FOLLOW_function_definition_in_external_declaration113)
+                    self.following.append(
+                        self.FOLLOW_function_definition_in_external_declaration113)
                     self.function_definition()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt3 == 2:
                     # C.g:120:4: declaration
-                    self.following.append(self.FOLLOW_declaration_in_external_declaration118)
+                    self.following.append(
+                        self.FOLLOW_declaration_in_external_declaration118)
                     self.declaration()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt3 == 3:
                     # C.g:121:4: macro_statement ( ';' )?
-                    self.following.append(self.FOLLOW_macro_statement_in_external_declaration123)
+                    self.following.append(
+                        self.FOLLOW_macro_statement_in_external_declaration123)
                     self.macro_statement()
                     self.following.pop()
                     if self.failed:
@@ -515,19 +534,15 @@ class CParser(Parser):
                     alt2 = 2
                     LA2_0 = self.input.LA(1)
 
-                    if (LA2_0 == 25) :
+                    if (LA2_0 == 25):
                         alt2 = 1
                     if alt2 == 1:
                         # C.g:121:21: ';'
-                        self.match(self.input, 25, self.FOLLOW_25_in_external_declaration126)
+                        self.match(self.input, 25,
+                                   self.FOLLOW_25_in_external_declaration126)
                         if self.failed:
                             return
 
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -546,10 +561,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start function_definition
     # C.g:126:1: function_definition : (d= declaration_specifiers )? declarator ( ( declaration )+ a= compound_statement | b= compound_statement ) ;
+
     def function_definition(self, ):
         self.function_definition_stack.append(function_definition_scope())
         retval = self.function_definition_return()
@@ -563,14 +577,12 @@ class CParser(Parser):
 
         declarator1 = None
 
-
-
-        self.function_definition_stack[-1].ModifierText =  ''
-        self.function_definition_stack[-1].DeclText =  ''
-        self.function_definition_stack[-1].LBLine =  0
-        self.function_definition_stack[-1].LBOffset =  0
-        self.function_definition_stack[-1].DeclLine =  0
-        self.function_definition_stack[-1].DeclOffset =  0
+        self.function_definition_stack[-1].ModifierText = ''
+        self.function_definition_stack[-1].DeclText = ''
+        self.function_definition_stack[-1].LBLine = 0
+        self.function_definition_stack[-1].LBOffset = 0
+        self.function_definition_stack[-1].DeclLine = 0
+        self.function_definition_stack[-1].DeclOffset = 0
 
         try:
             try:
@@ -591,119 +603,119 @@ class CParser(Parser):
                     elif LA4 == 58:
                         LA4_21 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 59:
                         LA4_22 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 60:
                         LA4_23 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == IDENTIFIER:
                         LA4_24 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 62:
                         LA4_25 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 29 or LA4 == 30 or LA4 == 31 or LA4 == 32 or LA4 == 33:
                         LA4_26 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 34:
                         LA4_27 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 35:
                         LA4_28 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 36:
                         LA4_29 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 37:
                         LA4_30 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 38:
                         LA4_31 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 39:
                         LA4_32 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 40:
                         LA4_33 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 41:
                         LA4_34 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 42:
                         LA4_35 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 45 or LA4 == 46:
                         LA4_36 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 48:
                         LA4_37 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 49 or LA4 == 50 or LA4 == 51 or LA4 == 52 or LA4 == 53 or LA4 == 54 or LA4 == 55 or LA4 == 56 or LA4 == 57 or LA4 == 61:
                         LA4_38 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                 elif LA4 == 58:
                     LA4_14 = self.input.LA(2)
 
-                    if (self.synpred7()) :
+                    if (self.synpred7()):
                         alt4 = 1
                 elif LA4 == 59:
                     LA4_16 = self.input.LA(2)
 
-                    if (self.synpred7()) :
+                    if (self.synpred7()):
                         alt4 = 1
                 elif LA4 == 60:
                     LA4_17 = self.input.LA(2)
 
-                    if (self.synpred7()) :
+                    if (self.synpred7()):
                         alt4 = 1
                 if alt4 == 1:
                     # C.g:0:0: d= declaration_specifiers
-                    self.following.append(self.FOLLOW_declaration_specifiers_in_function_definition157)
+                    self.following.append(
+                        self.FOLLOW_declaration_specifiers_in_function_definition157)
                     d = self.declaration_specifiers()
                     self.following.pop()
                     if self.failed:
                         return retval
 
-
-
-                self.following.append(self.FOLLOW_declarator_in_function_definition160)
+                self.following.append(
+                    self.FOLLOW_declarator_in_function_definition160)
                 declarator1 = self.declarator()
                 self.following.pop()
                 if self.failed:
@@ -712,16 +724,17 @@ class CParser(Parser):
                 alt6 = 2
                 LA6_0 = self.input.LA(1)
 
-                if (LA6_0 == IDENTIFIER or LA6_0 == 26 or (29 <= LA6_0 <= 42) or (45 <= LA6_0 <= 46) or (48 <= LA6_0 <= 61)) :
+                if (LA6_0 == IDENTIFIER or LA6_0 == 26 or (29 <= LA6_0 <= 42) or (45 <= LA6_0 <= 46) or (48 <= LA6_0 <= 61)):
                     alt6 = 1
-                elif (LA6_0 == 43) :
+                elif (LA6_0 == 43):
                     alt6 = 2
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return retval
 
-                    nvae = NoViableAltException("147:3: ( ( declaration )+ a= compound_statement | b= compound_statement )", 6, 0, self.input)
+                    nvae = NoViableAltException(
+                        "147:3: ( ( declaration )+ a= compound_statement | b= compound_statement )", 6, 0, self.input)
 
                     raise nvae
 
@@ -729,26 +742,25 @@ class CParser(Parser):
                     # C.g:147:5: ( declaration )+ a= compound_statement
                     # C.g:147:5: ( declaration )+
                     cnt5 = 0
-                    while True: #loop5
+                    while True:  # loop5
                         alt5 = 2
                         LA5_0 = self.input.LA(1)
 
-                        if (LA5_0 == IDENTIFIER or LA5_0 == 26 or (29 <= LA5_0 <= 42) or (45 <= LA5_0 <= 46) or (48 <= LA5_0 <= 61)) :
+                        if (LA5_0 == IDENTIFIER or LA5_0 == 26 or (29 <= LA5_0 <= 42) or (45 <= LA5_0 <= 46) or (48 <= LA5_0 <= 61)):
                             alt5 = 1
 
-
                         if alt5 == 1:
                             # C.g:0:0: declaration
-                            self.following.append(self.FOLLOW_declaration_in_function_definition166)
+                            self.following.append(
+                                self.FOLLOW_declaration_in_function_definition166)
                             self.declaration()
                             self.following.pop()
                             if self.failed:
                                 return retval
 
-
                         else:
                             if cnt5 >= 1:
-                                break #loop5
+                                break  # loop5
 
                             if self.backtracking > 0:
                                 self.failed = True
@@ -759,51 +771,46 @@ class CParser(Parser):
 
                         cnt5 += 1
 
-
-                    self.following.append(self.FOLLOW_compound_statement_in_function_definition171)
+                    self.following.append(
+                        self.FOLLOW_compound_statement_in_function_definition171)
                     a = self.compound_statement()
                     self.following.pop()
                     if self.failed:
                         return retval
 
-
                 elif alt6 == 2:
                     # C.g:148:5: b= compound_statement
-                    self.following.append(self.FOLLOW_compound_statement_in_function_definition180)
+                    self.following.append(
+                        self.FOLLOW_compound_statement_in_function_definition180)
                     b = self.compound_statement()
                     self.following.pop()
                     if self.failed:
                         return retval
 
-
-
                 if self.backtracking == 0:
 
                     if d is not None:
-                      self.function_definition_stack[-1].ModifierText = self.input.toString(d.start, d.stop)
+                        self.function_definition_stack[-1].ModifierText = self.input.toString(
+                            d.start, d.stop)
                     else:
-                      self.function_definition_stack[-1].ModifierText = ''
-                    self.function_definition_stack[-1].DeclText = self.input.toString(declarator1.start, declarator1.stop)
+                        self.function_definition_stack[-1].ModifierText = ''
+                    self.function_definition_stack[-1].DeclText = self.input.toString(
+                        declarator1.start, declarator1.stop)
                     self.function_definition_stack[-1].DeclLine = declarator1.start.line
                     self.function_definition_stack[-1].DeclOffset = declarator1.start.charPositionInLine
                     if a is not None:
-                      self.function_definition_stack[-1].LBLine = a.start.line
-                      self.function_definition_stack[-1].LBOffset = a.start.charPositionInLine
+                        self.function_definition_stack[-1].LBLine = a.start.line
+                        self.function_definition_stack[-1].LBOffset = a.start.charPositionInLine
                     else:
-                      self.function_definition_stack[-1].LBLine = b.start.line
-                      self.function_definition_stack[-1].LBOffset = b.start.charPositionInLine
-
-
-
-
+                        self.function_definition_stack[-1].LBLine = b.start.line
+                        self.function_definition_stack[-1].LBOffset = b.start.charPositionInLine
 
                 retval.stop = self.input.LT(-1)
 
                 if self.backtracking == 0:
 
-                    self.StoreFunctionDefinition(retval.start.line, retval.start.charPositionInLine, retval.stop.line, retval.stop.charPositionInLine, self.function_definition_stack[-1].ModifierText, self.function_definition_stack[-1].DeclText, self.function_definition_stack[-1].LBLine, self.function_definition_stack[-1].LBOffset, self.function_definition_stack[-1].DeclLine, self.function_definition_stack[-1].DeclOffset)
-
-
+                    self.StoreFunctionDefinition(retval.start.line, retval.start.charPositionInLine, retval.stop.line, retval.stop.charPositionInLine, self.function_definition_stack[-1].ModifierText, self.function_definition_stack[
+                                                 -1].DeclText, self.function_definition_stack[-1].LBLine, self.function_definition_stack[-1].LBOffset, self.function_definition_stack[-1].DeclLine, self.function_definition_stack[-1].DeclOffset)
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -819,9 +826,9 @@ class CParser(Parser):
 
     # $ANTLR end function_definition
 
-
     # $ANTLR start declaration
     # C.g:166:1: declaration : (a= 'typedef' (b= declaration_specifiers )? c= init_declarator_list d= ';' | s= declaration_specifiers (t= init_declarator_list )? e= ';' );
+
     def declaration(self, ):
 
         declaration_StartIndex = self.input.index()
@@ -836,7 +843,6 @@ class CParser(Parser):
 
         t = None
 
-
         try:
             try:
                 if self.backtracking > 0 and self.alreadyParsedRule(self.input, 4):
@@ -846,23 +852,25 @@ class CParser(Parser):
                 alt9 = 2
                 LA9_0 = self.input.LA(1)
 
-                if (LA9_0 == 26) :
+                if (LA9_0 == 26):
                     alt9 = 1
-                elif (LA9_0 == IDENTIFIER or (29 <= LA9_0 <= 42) or (45 <= LA9_0 <= 46) or (48 <= LA9_0 <= 61)) :
+                elif (LA9_0 == IDENTIFIER or (29 <= LA9_0 <= 42) or (45 <= LA9_0 <= 46) or (48 <= LA9_0 <= 61)):
                     alt9 = 2
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("166:1: declaration : (a= 'typedef' (b= declaration_specifiers )? c= init_declarator_list d= ';' | s= declaration_specifiers (t= init_declarator_list )? e= ';' );", 9, 0, self.input)
+                    nvae = NoViableAltException(
+                        "166:1: declaration : (a= 'typedef' (b= declaration_specifiers )? c= init_declarator_list d= ';' | s= declaration_specifiers (t= init_declarator_list )? e= ';' );", 9, 0, self.input)
 
                     raise nvae
 
                 if alt9 == 1:
                     # C.g:167:4: a= 'typedef' (b= declaration_specifiers )? c= init_declarator_list d= ';'
                     a = self.input.LT(1)
-                    self.match(self.input, 26, self.FOLLOW_26_in_declaration203)
+                    self.match(self.input, 26,
+                               self.FOLLOW_26_in_declaration203)
                     if self.failed:
                         return
                     # C.g:167:17: (b= declaration_specifiers )?
@@ -873,60 +881,61 @@ class CParser(Parser):
                     elif LA7 == IDENTIFIER:
                         LA7_13 = self.input.LA(2)
 
-                        if (LA7_13 == 62) :
+                        if (LA7_13 == 62):
                             LA7_21 = self.input.LA(3)
 
-                            if (self.synpred10()) :
+                            if (self.synpred10()):
                                 alt7 = 1
-                        elif (LA7_13 == IDENTIFIER or (29 <= LA7_13 <= 42) or (45 <= LA7_13 <= 46) or (48 <= LA7_13 <= 61) or LA7_13 == 66) :
+                        elif (LA7_13 == IDENTIFIER or (29 <= LA7_13 <= 42) or (45 <= LA7_13 <= 46) or (48 <= LA7_13 <= 61) or LA7_13 == 66):
                             alt7 = 1
                     elif LA7 == 58:
                         LA7_14 = self.input.LA(2)
 
-                        if (self.synpred10()) :
+                        if (self.synpred10()):
                             alt7 = 1
                     elif LA7 == 59:
                         LA7_16 = self.input.LA(2)
 
-                        if (self.synpred10()) :
+                        if (self.synpred10()):
                             alt7 = 1
                     elif LA7 == 60:
                         LA7_17 = self.input.LA(2)
 
-                        if (self.synpred10()) :
+                        if (self.synpred10()):
                             alt7 = 1
                     if alt7 == 1:
                         # C.g:0:0: b= declaration_specifiers
-                        self.following.append(self.FOLLOW_declaration_specifiers_in_declaration207)
+                        self.following.append(
+                            self.FOLLOW_declaration_specifiers_in_declaration207)
                         b = self.declaration_specifiers()
                         self.following.pop()
                         if self.failed:
                             return
 
-
-
-                    self.following.append(self.FOLLOW_init_declarator_list_in_declaration216)
+                    self.following.append(
+                        self.FOLLOW_init_declarator_list_in_declaration216)
                     c = self.init_declarator_list()
                     self.following.pop()
                     if self.failed:
                         return
                     d = self.input.LT(1)
-                    self.match(self.input, 25, self.FOLLOW_25_in_declaration220)
+                    self.match(self.input, 25,
+                               self.FOLLOW_25_in_declaration220)
                     if self.failed:
                         return
                     if self.backtracking == 0:
 
                         if b is not None:
-                          self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, self.input.toString(b.start, b.stop), self.input.toString(c.start, c.stop))
+                            self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, self.input.toString(
+                                b.start, b.stop), self.input.toString(c.start, c.stop))
                         else:
-                          self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, '', self.input.toString(c.start, c.stop))
-
-
-
+                            self.StoreTypedefDefinition(
+                                a.line, a.charPositionInLine, d.line, d.charPositionInLine, '', self.input.toString(c.start, c.stop))
 
                 elif alt9 == 2:
                     # C.g:175:4: s= declaration_specifiers (t= init_declarator_list )? e= ';'
-                    self.following.append(self.FOLLOW_declaration_specifiers_in_declaration234)
+                    self.following.append(
+                        self.FOLLOW_declaration_specifiers_in_declaration234)
                     s = self.declaration_specifiers()
                     self.following.pop()
                     if self.failed:
@@ -935,30 +944,27 @@ class CParser(Parser):
                     alt8 = 2
                     LA8_0 = self.input.LA(1)
 
-                    if (LA8_0 == IDENTIFIER or (58 <= LA8_0 <= 60) or LA8_0 == 62 or LA8_0 == 66) :
+                    if (LA8_0 == IDENTIFIER or (58 <= LA8_0 <= 60) or LA8_0 == 62 or LA8_0 == 66):
                         alt8 = 1
                     if alt8 == 1:
                         # C.g:0:0: t= init_declarator_list
-                        self.following.append(self.FOLLOW_init_declarator_list_in_declaration238)
+                        self.following.append(
+                            self.FOLLOW_init_declarator_list_in_declaration238)
                         t = self.init_declarator_list()
                         self.following.pop()
                         if self.failed:
                             return
 
-
-
                     e = self.input.LT(1)
-                    self.match(self.input, 25, self.FOLLOW_25_in_declaration243)
+                    self.match(self.input, 25,
+                               self.FOLLOW_25_in_declaration243)
                     if self.failed:
                         return
                     if self.backtracking == 0:
 
                         if t is not None:
-                          self.StoreVariableDeclaration(s.start.line, s.start.charPositionInLine, t.start.line, t.start.charPositionInLine, self.input.toString(s.start, s.stop), self.input.toString(t.start, t.stop))
-
-
-
-
+                            self.StoreVariableDeclaration(s.start.line, s.start.charPositionInLine, t.start.line, t.start.charPositionInLine, self.input.toString(
+                                s.start, s.stop), self.input.toString(t.start, t.stop))
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -978,10 +984,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start declaration_specifiers
     # C.g:182:1: declaration_specifiers : ( storage_class_specifier | type_specifier | type_qualifier )+ ;
+
     def declaration_specifiers(self, ):
 
         retval = self.declaration_specifiers_return()
@@ -996,44 +1001,39 @@ class CParser(Parser):
                 # C.g:183:6: ( storage_class_specifier | type_specifier | type_qualifier )+
                 # C.g:183:6: ( storage_class_specifier | type_specifier | type_qualifier )+
                 cnt10 = 0
-                while True: #loop10
+                while True:  # loop10
                     alt10 = 4
                     LA10 = self.input.LA(1)
                     if LA10 == 58:
                         LA10_2 = self.input.LA(2)
 
-                        if (self.synpred15()) :
+                        if (self.synpred15()):
                             alt10 = 3
 
-
                     elif LA10 == 59:
                         LA10_3 = self.input.LA(2)
 
-                        if (self.synpred15()) :
+                        if (self.synpred15()):
                             alt10 = 3
 
-
                     elif LA10 == 60:
                         LA10_4 = self.input.LA(2)
 
-                        if (self.synpred15()) :
+                        if (self.synpred15()):
                             alt10 = 3
 
-
                     elif LA10 == IDENTIFIER:
                         LA10_5 = self.input.LA(2)
 
-                        if (self.synpred14()) :
+                        if (self.synpred14()):
                             alt10 = 2
 
-
                     elif LA10 == 53:
                         LA10_9 = self.input.LA(2)
 
-                        if (self.synpred15()) :
+                        if (self.synpred15()):
                             alt10 = 3
 
-
                     elif LA10 == 29 or LA10 == 30 or LA10 == 31 or LA10 == 32 or LA10 == 33:
                         alt10 = 1
                     elif LA10 == 34 or LA10 == 35 or LA10 == 36 or LA10 == 37 or LA10 == 38 or LA10 == 39 or LA10 == 40 or LA10 == 41 or LA10 == 42 or LA10 == 45 or LA10 == 46 or LA10 == 48:
@@ -1043,34 +1043,34 @@ class CParser(Parser):
 
                     if alt10 == 1:
                         # C.g:183:10: storage_class_specifier
-                        self.following.append(self.FOLLOW_storage_class_specifier_in_declaration_specifiers264)
+                        self.following.append(
+                            self.FOLLOW_storage_class_specifier_in_declaration_specifiers264)
                         self.storage_class_specifier()
                         self.following.pop()
                         if self.failed:
                             return retval
 
-
                     elif alt10 == 2:
                         # C.g:184:7: type_specifier
-                        self.following.append(self.FOLLOW_type_specifier_in_declaration_specifiers272)
+                        self.following.append(
+                            self.FOLLOW_type_specifier_in_declaration_specifiers272)
                         self.type_specifier()
                         self.following.pop()
                         if self.failed:
                             return retval
 
-
                     elif alt10 == 3:
                         # C.g:185:13: type_qualifier
-                        self.following.append(self.FOLLOW_type_qualifier_in_declaration_specifiers286)
+                        self.following.append(
+                            self.FOLLOW_type_qualifier_in_declaration_specifiers286)
                         self.type_qualifier()
                         self.following.pop()
                         if self.failed:
                             return retval
 
-
                     else:
                         if cnt10 >= 1:
-                            break #loop10
+                            break  # loop10
 
                         if self.backtracking > 0:
                             self.failed = True
@@ -1081,13 +1081,8 @@ class CParser(Parser):
 
                     cnt10 += 1
 
-
-
-
-
                 retval.stop = self.input.LT(-1)
 
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -1106,10 +1101,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start init_declarator_list
     # C.g:189:1: init_declarator_list : init_declarator ( ',' init_declarator )* ;
+
     def init_declarator_list(self, ):
 
         retval = self.init_declarator_list_return()
@@ -1122,42 +1116,38 @@ class CParser(Parser):
 
                 # C.g:190:2: ( init_declarator ( ',' init_declarator )* )
                 # C.g:190:4: init_declarator ( ',' init_declarator )*
-                self.following.append(self.FOLLOW_init_declarator_in_init_declarator_list308)
+                self.following.append(
+                    self.FOLLOW_init_declarator_in_init_declarator_list308)
                 self.init_declarator()
                 self.following.pop()
                 if self.failed:
                     return retval
                 # C.g:190:20: ( ',' init_declarator )*
-                while True: #loop11
+                while True:  # loop11
                     alt11 = 2
                     LA11_0 = self.input.LA(1)
 
-                    if (LA11_0 == 27) :
+                    if (LA11_0 == 27):
                         alt11 = 1
 
-
                     if alt11 == 1:
                         # C.g:190:21: ',' init_declarator
-                        self.match(self.input, 27, self.FOLLOW_27_in_init_declarator_list311)
+                        self.match(self.input, 27,
+                                   self.FOLLOW_27_in_init_declarator_list311)
                         if self.failed:
                             return retval
-                        self.following.append(self.FOLLOW_init_declarator_in_init_declarator_list313)
+                        self.following.append(
+                            self.FOLLOW_init_declarator_in_init_declarator_list313)
                         self.init_declarator()
                         self.following.pop()
                         if self.failed:
                             return retval
 
-
                     else:
-                        break #loop11
-
-
-
-
+                        break  # loop11
 
                 retval.stop = self.input.LT(-1)
 
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -1171,9 +1161,9 @@ class CParser(Parser):
 
     # $ANTLR end init_declarator_list
 
-
     # $ANTLR start init_declarator
     # C.g:193:1: init_declarator : declarator ( '=' initializer )? ;
+
     def init_declarator(self, ):
 
         init_declarator_StartIndex = self.input.index()
@@ -1184,7 +1174,8 @@ class CParser(Parser):
 
                 # C.g:194:2: ( declarator ( '=' initializer )? )
                 # C.g:194:4: declarator ( '=' initializer )?
-                self.following.append(self.FOLLOW_declarator_in_init_declarator326)
+                self.following.append(
+                    self.FOLLOW_declarator_in_init_declarator326)
                 self.declarator()
                 self.following.pop()
                 if self.failed:
@@ -1193,25 +1184,21 @@ class CParser(Parser):
                 alt12 = 2
                 LA12_0 = self.input.LA(1)
 
-                if (LA12_0 == 28) :
+                if (LA12_0 == 28):
                     alt12 = 1
                 if alt12 == 1:
                     # C.g:194:16: '=' initializer
-                    self.match(self.input, 28, self.FOLLOW_28_in_init_declarator329)
+                    self.match(self.input, 28,
+                               self.FOLLOW_28_in_init_declarator329)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_initializer_in_init_declarator331)
+                    self.following.append(
+                        self.FOLLOW_initializer_in_init_declarator331)
                     self.initializer()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -1225,9 +1212,9 @@ class CParser(Parser):
 
     # $ANTLR end init_declarator
 
-
     # $ANTLR start storage_class_specifier
     # C.g:197:1: storage_class_specifier : ( 'extern' | 'static' | 'auto' | 'register' | 'STATIC' );
+
     def storage_class_specifier(self, ):
 
         storage_class_specifier_StartIndex = self.input.index()
@@ -1239,7 +1226,7 @@ class CParser(Parser):
                 # C.g:198:2: ( 'extern' | 'static' | 'auto' | 'register' | 'STATIC' )
                 # C.g:
                 if (29 <= self.input.LA(1) <= 33):
-                    self.input.consume();
+                    self.input.consume()
                     self.errorRecovery = False
                     self.failed = False
 
@@ -1251,14 +1238,9 @@ class CParser(Parser):
                     mse = MismatchedSetException(None, self.input)
                     self.recoverFromMismatchedSet(
                         self.input, mse, self.FOLLOW_set_in_storage_class_specifier0
-                        )
+                    )
                     raise mse
 
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -1272,9 +1254,9 @@ class CParser(Parser):
 
     # $ANTLR end storage_class_specifier
 
-
     # $ANTLR start type_specifier
     # C.g:205:1: type_specifier : ( 'void' | 'char' | 'short' | 'int' | 'long' | 'float' | 'double' | 'signed' | 'unsigned' | s= struct_or_union_specifier | e= enum_specifier | ( IDENTIFIER ( type_qualifier )* declarator )=> type_id );
+
     def type_specifier(self, ):
 
         type_specifier_StartIndex = self.input.index()
@@ -1282,7 +1264,6 @@ class CParser(Parser):
 
         e = None
 
-
         try:
             try:
                 if self.backtracking > 0 and self.alreadyParsedRule(self.input, 9):
@@ -1292,27 +1273,27 @@ class CParser(Parser):
                 alt13 = 12
                 LA13_0 = self.input.LA(1)
 
-                if (LA13_0 == 34) :
+                if (LA13_0 == 34):
                     alt13 = 1
-                elif (LA13_0 == 35) :
+                elif (LA13_0 == 35):
                     alt13 = 2
-                elif (LA13_0 == 36) :
+                elif (LA13_0 == 36):
                     alt13 = 3
-                elif (LA13_0 == 37) :
+                elif (LA13_0 == 37):
                     alt13 = 4
-                elif (LA13_0 == 38) :
+                elif (LA13_0 == 38):
                     alt13 = 5
-                elif (LA13_0 == 39) :
+                elif (LA13_0 == 39):
                     alt13 = 6
-                elif (LA13_0 == 40) :
+                elif (LA13_0 == 40):
                     alt13 = 7
-                elif (LA13_0 == 41) :
+                elif (LA13_0 == 41):
                     alt13 = 8
-                elif (LA13_0 == 42) :
+                elif (LA13_0 == 42):
                     alt13 = 9
-                elif ((45 <= LA13_0 <= 46)) :
+                elif ((45 <= LA13_0 <= 46)):
                     alt13 = 10
-                elif (LA13_0 == 48) :
+                elif (LA13_0 == 48):
                     alt13 = 11
                 elif (LA13_0 == IDENTIFIER) and (self.synpred34()):
                     alt13 = 12
@@ -1321,76 +1302,78 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("205:1: type_specifier : ( 'void' | 'char' | 'short' | 'int' | 'long' | 'float' | 'double' | 'signed' | 'unsigned' | s= struct_or_union_specifier | e= enum_specifier | ( IDENTIFIER ( type_qualifier )* declarator )=> type_id );", 13, 0, self.input)
+                    nvae = NoViableAltException(
+                        "205:1: type_specifier : ( 'void' | 'char' | 'short' | 'int' | 'long' | 'float' | 'double' | 'signed' | 'unsigned' | s= struct_or_union_specifier | e= enum_specifier | ( IDENTIFIER ( type_qualifier )* declarator )=> type_id );", 13, 0, self.input)
 
                     raise nvae
 
                 if alt13 == 1:
                     # C.g:206:4: 'void'
-                    self.match(self.input, 34, self.FOLLOW_34_in_type_specifier376)
+                    self.match(self.input, 34,
+                               self.FOLLOW_34_in_type_specifier376)
                     if self.failed:
                         return
 
-
                 elif alt13 == 2:
                     # C.g:207:4: 'char'
-                    self.match(self.input, 35, self.FOLLOW_35_in_type_specifier381)
+                    self.match(self.input, 35,
+                               self.FOLLOW_35_in_type_specifier381)
                     if self.failed:
                         return
 
-
                 elif alt13 == 3:
                     # C.g:208:4: 'short'
-                    self.match(self.input, 36, self.FOLLOW_36_in_type_specifier386)
+                    self.match(self.input, 36,
+                               self.FOLLOW_36_in_type_specifier386)
                     if self.failed:
                         return
 
-
                 elif alt13 == 4:
                     # C.g:209:4: 'int'
-                    self.match(self.input, 37, self.FOLLOW_37_in_type_specifier391)
+                    self.match(self.input, 37,
+                               self.FOLLOW_37_in_type_specifier391)
                     if self.failed:
                         return
 
-
                 elif alt13 == 5:
                     # C.g:210:4: 'long'
-                    self.match(self.input, 38, self.FOLLOW_38_in_type_specifier396)
+                    self.match(self.input, 38,
+                               self.FOLLOW_38_in_type_specifier396)
                     if self.failed:
                         return
 
-
                 elif alt13 == 6:
                     # C.g:211:4: 'float'
-                    self.match(self.input, 39, self.FOLLOW_39_in_type_specifier401)
+                    self.match(self.input, 39,
+                               self.FOLLOW_39_in_type_specifier401)
                     if self.failed:
                         return
 
-
                 elif alt13 == 7:
                     # C.g:212:4: 'double'
-                    self.match(self.input, 40, self.FOLLOW_40_in_type_specifier406)
+                    self.match(self.input, 40,
+                               self.FOLLOW_40_in_type_specifier406)
                     if self.failed:
                         return
 
-
                 elif alt13 == 8:
                     # C.g:213:4: 'signed'
-                    self.match(self.input, 41, self.FOLLOW_41_in_type_specifier411)
+                    self.match(self.input, 41,
+                               self.FOLLOW_41_in_type_specifier411)
                     if self.failed:
                         return
 
-
                 elif alt13 == 9:
                     # C.g:214:4: 'unsigned'
-                    self.match(self.input, 42, self.FOLLOW_42_in_type_specifier416)
+                    self.match(self.input, 42,
+                               self.FOLLOW_42_in_type_specifier416)
                     if self.failed:
                         return
 
-
                 elif alt13 == 10:
                     # C.g:215:4: s= struct_or_union_specifier
-                    self.following.append(self.FOLLOW_struct_or_union_specifier_in_type_specifier423)
+                    self.following.append(
+                        self.FOLLOW_struct_or_union_specifier_in_type_specifier423)
                     s = self.struct_or_union_specifier()
                     self.following.pop()
                     if self.failed:
@@ -1398,14 +1381,13 @@ class CParser(Parser):
                     if self.backtracking == 0:
 
                         if s.stop is not None:
-                          self.StoreStructUnionDefinition(s.start.line, s.start.charPositionInLine, s.stop.line, s.stop.charPositionInLine, self.input.toString(s.start, s.stop))
-
-
-
+                            self.StoreStructUnionDefinition(
+                                s.start.line, s.start.charPositionInLine, s.stop.line, s.stop.charPositionInLine, self.input.toString(s.start, s.stop))
 
                 elif alt13 == 11:
                     # C.g:220:4: e= enum_specifier
-                    self.following.append(self.FOLLOW_enum_specifier_in_type_specifier433)
+                    self.following.append(
+                        self.FOLLOW_enum_specifier_in_type_specifier433)
                     e = self.enum_specifier()
                     self.following.pop()
                     if self.failed:
@@ -1413,21 +1395,18 @@ class CParser(Parser):
                     if self.backtracking == 0:
 
                         if e.stop is not None:
-                          self.StoreEnumerationDefinition(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
-
-
-
+                            self.StoreEnumerationDefinition(
+                                e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
                 elif alt13 == 12:
                     # C.g:225:4: ( IDENTIFIER ( type_qualifier )* declarator )=> type_id
-                    self.following.append(self.FOLLOW_type_id_in_type_specifier451)
+                    self.following.append(
+                        self.FOLLOW_type_id_in_type_specifier451)
                     self.type_id()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -1441,9 +1420,9 @@ class CParser(Parser):
 
     # $ANTLR end type_specifier
 
-
     # $ANTLR start type_id
     # C.g:228:1: type_id : IDENTIFIER ;
+
     def type_id(self, ):
 
         type_id_StartIndex = self.input.index()
@@ -1454,13 +1433,11 @@ class CParser(Parser):
 
                 # C.g:229:5: ( IDENTIFIER )
                 # C.g:229:9: IDENTIFIER
-                self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_type_id467)
+                self.match(self.input, IDENTIFIER,
+                           self.FOLLOW_IDENTIFIER_in_type_id467)
                 if self.failed:
                     return
 
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -1479,10 +1456,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start struct_or_union_specifier
     # C.g:233:1: struct_or_union_specifier options {k=3; } : ( struct_or_union ( IDENTIFIER )? '{' struct_declaration_list '}' | struct_or_union IDENTIFIER );
+
     def struct_or_union_specifier(self, ):
 
         retval = self.struct_or_union_specifier_return()
@@ -1497,33 +1473,35 @@ class CParser(Parser):
                 alt15 = 2
                 LA15_0 = self.input.LA(1)
 
-                if ((45 <= LA15_0 <= 46)) :
+                if ((45 <= LA15_0 <= 46)):
                     LA15_1 = self.input.LA(2)
 
-                    if (LA15_1 == IDENTIFIER) :
+                    if (LA15_1 == IDENTIFIER):
                         LA15_2 = self.input.LA(3)
 
-                        if (LA15_2 == 43) :
+                        if (LA15_2 == 43):
                             alt15 = 1
-                        elif (LA15_2 == EOF or LA15_2 == IDENTIFIER or LA15_2 == 25 or LA15_2 == 27 or (29 <= LA15_2 <= 42) or (45 <= LA15_2 <= 64) or LA15_2 == 66) :
+                        elif (LA15_2 == EOF or LA15_2 == IDENTIFIER or LA15_2 == 25 or LA15_2 == 27 or (29 <= LA15_2 <= 42) or (45 <= LA15_2 <= 64) or LA15_2 == 66):
                             alt15 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return retval
 
-                            nvae = NoViableAltException("233:1: struct_or_union_specifier options {k=3; } : ( struct_or_union ( IDENTIFIER )? '{' struct_declaration_list '}' | struct_or_union IDENTIFIER );", 15, 2, self.input)
+                            nvae = NoViableAltException(
+                                "233:1: struct_or_union_specifier options {k=3; } : ( struct_or_union ( IDENTIFIER )? '{' struct_declaration_list '}' | struct_or_union IDENTIFIER );", 15, 2, self.input)
 
                             raise nvae
 
-                    elif (LA15_1 == 43) :
+                    elif (LA15_1 == 43):
                         alt15 = 1
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return retval
 
-                        nvae = NoViableAltException("233:1: struct_or_union_specifier options {k=3; } : ( struct_or_union ( IDENTIFIER )? '{' struct_declaration_list '}' | struct_or_union IDENTIFIER );", 15, 1, self.input)
+                        nvae = NoViableAltException(
+                            "233:1: struct_or_union_specifier options {k=3; } : ( struct_or_union ( IDENTIFIER )? '{' struct_declaration_list '}' | struct_or_union IDENTIFIER );", 15, 1, self.input)
 
                         raise nvae
 
@@ -1532,13 +1510,15 @@ class CParser(Parser):
                         self.failed = True
                         return retval
 
-                    nvae = NoViableAltException("233:1: struct_or_union_specifier options {k=3; } : ( struct_or_union ( IDENTIFIER )? '{' struct_declaration_list '}' | struct_or_union IDENTIFIER );", 15, 0, self.input)
+                    nvae = NoViableAltException(
+                        "233:1: struct_or_union_specifier options {k=3; } : ( struct_or_union ( IDENTIFIER )? '{' struct_declaration_list '}' | struct_or_union IDENTIFIER );", 15, 0, self.input)
 
                     raise nvae
 
                 if alt15 == 1:
                     # C.g:235:4: struct_or_union ( IDENTIFIER )? '{' struct_declaration_list '}'
-                    self.following.append(self.FOLLOW_struct_or_union_in_struct_or_union_specifier494)
+                    self.following.append(
+                        self.FOLLOW_struct_or_union_in_struct_or_union_specifier494)
                     self.struct_or_union()
                     self.following.pop()
                     if self.failed:
@@ -1547,50 +1527,52 @@ class CParser(Parser):
                     alt14 = 2
                     LA14_0 = self.input.LA(1)
 
-                    if (LA14_0 == IDENTIFIER) :
+                    if (LA14_0 == IDENTIFIER):
                         alt14 = 1
                     if alt14 == 1:
                         # C.g:0:0: IDENTIFIER
-                        self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_struct_or_union_specifier496)
+                        self.match(
+                            self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_struct_or_union_specifier496)
                         if self.failed:
                             return retval
 
-
-
-                    self.match(self.input, 43, self.FOLLOW_43_in_struct_or_union_specifier499)
+                    self.match(self.input, 43,
+                               self.FOLLOW_43_in_struct_or_union_specifier499)
                     if self.failed:
                         return retval
-                    self.following.append(self.FOLLOW_struct_declaration_list_in_struct_or_union_specifier501)
+                    self.following.append(
+                        self.FOLLOW_struct_declaration_list_in_struct_or_union_specifier501)
                     self.struct_declaration_list()
                     self.following.pop()
                     if self.failed:
                         return retval
-                    self.match(self.input, 44, self.FOLLOW_44_in_struct_or_union_specifier503)
+                    self.match(self.input, 44,
+                               self.FOLLOW_44_in_struct_or_union_specifier503)
                     if self.failed:
                         return retval
 
-
                 elif alt15 == 2:
                     # C.g:236:4: struct_or_union IDENTIFIER
-                    self.following.append(self.FOLLOW_struct_or_union_in_struct_or_union_specifier508)
+                    self.following.append(
+                        self.FOLLOW_struct_or_union_in_struct_or_union_specifier508)
                     self.struct_or_union()
                     self.following.pop()
                     if self.failed:
                         return retval
-                    self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_struct_or_union_specifier510)
+                    self.match(
+                        self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_struct_or_union_specifier510)
                     if self.failed:
                         return retval
 
-
                 retval.stop = self.input.LT(-1)
 
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
             if self.backtracking > 0:
-                self.memoize(self.input, 11, struct_or_union_specifier_StartIndex)
+                self.memoize(self.input, 11,
+                             struct_or_union_specifier_StartIndex)
 
             pass
 
@@ -1598,9 +1580,9 @@ class CParser(Parser):
 
     # $ANTLR end struct_or_union_specifier
 
-
     # $ANTLR start struct_or_union
     # C.g:239:1: struct_or_union : ( 'struct' | 'union' );
+
     def struct_or_union(self, ):
 
         struct_or_union_StartIndex = self.input.index()
@@ -1612,7 +1594,7 @@ class CParser(Parser):
                 # C.g:240:2: ( 'struct' | 'union' )
                 # C.g:
                 if (45 <= self.input.LA(1) <= 46):
-                    self.input.consume();
+                    self.input.consume()
                     self.errorRecovery = False
                     self.failed = False
 
@@ -1624,14 +1606,9 @@ class CParser(Parser):
                     mse = MismatchedSetException(None, self.input)
                     self.recoverFromMismatchedSet(
                         self.input, mse, self.FOLLOW_set_in_struct_or_union0
-                        )
+                    )
                     raise mse
 
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -1645,9 +1622,9 @@ class CParser(Parser):
 
     # $ANTLR end struct_or_union
 
-
     # $ANTLR start struct_declaration_list
     # C.g:244:1: struct_declaration_list : ( struct_declaration )+ ;
+
     def struct_declaration_list(self, ):
 
         struct_declaration_list_StartIndex = self.input.index()
@@ -1660,26 +1637,25 @@ class CParser(Parser):
                 # C.g:245:4: ( struct_declaration )+
                 # C.g:245:4: ( struct_declaration )+
                 cnt16 = 0
-                while True: #loop16
+                while True:  # loop16
                     alt16 = 2
                     LA16_0 = self.input.LA(1)
 
-                    if (LA16_0 == IDENTIFIER or (34 <= LA16_0 <= 42) or (45 <= LA16_0 <= 46) or (48 <= LA16_0 <= 61)) :
+                    if (LA16_0 == IDENTIFIER or (34 <= LA16_0 <= 42) or (45 <= LA16_0 <= 46) or (48 <= LA16_0 <= 61)):
                         alt16 = 1
 
-
                     if alt16 == 1:
                         # C.g:0:0: struct_declaration
-                        self.following.append(self.FOLLOW_struct_declaration_in_struct_declaration_list537)
+                        self.following.append(
+                            self.FOLLOW_struct_declaration_in_struct_declaration_list537)
                         self.struct_declaration()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
                         if cnt16 >= 1:
-                            break #loop16
+                            break  # loop16
 
                         if self.backtracking > 0:
                             self.failed = True
@@ -1690,17 +1666,13 @@ class CParser(Parser):
 
                     cnt16 += 1
 
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
             if self.backtracking > 0:
-                self.memoize(self.input, 13, struct_declaration_list_StartIndex)
+                self.memoize(self.input, 13,
+                             struct_declaration_list_StartIndex)
 
             pass
 
@@ -1708,9 +1680,9 @@ class CParser(Parser):
 
     # $ANTLR end struct_declaration_list
 
-
     # $ANTLR start struct_declaration
     # C.g:248:1: struct_declaration : specifier_qualifier_list struct_declarator_list ';' ;
+
     def struct_declaration(self, ):
 
         struct_declaration_StartIndex = self.input.index()
@@ -1721,23 +1693,23 @@ class CParser(Parser):
 
                 # C.g:249:2: ( specifier_qualifier_list struct_declarator_list ';' )
                 # C.g:249:4: specifier_qualifier_list struct_declarator_list ';'
-                self.following.append(self.FOLLOW_specifier_qualifier_list_in_struct_declaration549)
+                self.following.append(
+                    self.FOLLOW_specifier_qualifier_list_in_struct_declaration549)
                 self.specifier_qualifier_list()
                 self.following.pop()
                 if self.failed:
                     return
-                self.following.append(self.FOLLOW_struct_declarator_list_in_struct_declaration551)
+                self.following.append(
+                    self.FOLLOW_struct_declarator_list_in_struct_declaration551)
                 self.struct_declarator_list()
                 self.following.pop()
                 if self.failed:
                     return
-                self.match(self.input, 25, self.FOLLOW_25_in_struct_declaration553)
+                self.match(self.input, 25,
+                           self.FOLLOW_25_in_struct_declaration553)
                 if self.failed:
                     return
 
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -1751,9 +1723,9 @@ class CParser(Parser):
 
     # $ANTLR end struct_declaration
 
-
     # $ANTLR start specifier_qualifier_list
     # C.g:252:1: specifier_qualifier_list : ( type_qualifier | type_specifier )+ ;
+
     def specifier_qualifier_list(self, ):
 
         specifier_qualifier_list_StartIndex = self.input.index()
@@ -1766,30 +1738,27 @@ class CParser(Parser):
                 # C.g:253:4: ( type_qualifier | type_specifier )+
                 # C.g:253:4: ( type_qualifier | type_specifier )+
                 cnt17 = 0
-                while True: #loop17
+                while True:  # loop17
                     alt17 = 3
                     LA17 = self.input.LA(1)
                     if LA17 == 58:
                         LA17_2 = self.input.LA(2)
 
-                        if (self.synpred39()) :
+                        if (self.synpred39()):
                             alt17 = 1
 
-
                     elif LA17 == 59:
                         LA17_3 = self.input.LA(2)
 
-                        if (self.synpred39()) :
+                        if (self.synpred39()):
                             alt17 = 1
 
-
                     elif LA17 == 60:
                         LA17_4 = self.input.LA(2)
 
-                        if (self.synpred39()) :
+                        if (self.synpred39()):
                             alt17 = 1
 
-
                     elif LA17 == IDENTIFIER:
                         LA17 = self.input.LA(2)
                         if LA17 == EOF or LA17 == IDENTIFIER or LA17 == 34 or LA17 == 35 or LA17 == 36 or LA17 == 37 or LA17 == 38 or LA17 == 39 or LA17 == 40 or LA17 == 41 or LA17 == 42 or LA17 == 45 or LA17 == 46 or LA17 == 48 or LA17 == 49 or LA17 == 50 or LA17 == 51 or LA17 == 52 or LA17 == 53 or LA17 == 54 or LA17 == 55 or LA17 == 56 or LA17 == 57 or LA17 == 58 or LA17 == 59 or LA17 == 60 or LA17 == 61 or LA17 == 63 or LA17 == 66:
@@ -1797,25 +1766,21 @@ class CParser(Parser):
                         elif LA17 == 62:
                             LA17_94 = self.input.LA(3)
 
-                            if (self.synpred40()) :
+                            if (self.synpred40()):
                                 alt17 = 2
 
-
                         elif LA17 == 47:
                             LA17_95 = self.input.LA(3)
 
-                            if (self.synpred40()) :
+                            if (self.synpred40()):
                                 alt17 = 2
 
-
                         elif LA17 == 64:
                             LA17_96 = self.input.LA(3)
 
-                            if (self.synpred40()) :
+                            if (self.synpred40()):
                                 alt17 = 2
 
-
-
                     elif LA17 == 49 or LA17 == 50 or LA17 == 51 or LA17 == 52 or LA17 == 53 or LA17 == 54 or LA17 == 55 or LA17 == 56 or LA17 == 57 or LA17 == 61:
                         alt17 = 1
                     elif LA17 == 34 or LA17 == 35 or LA17 == 36 or LA17 == 37 or LA17 == 38 or LA17 == 39 or LA17 == 40 or LA17 == 41 or LA17 == 42 or LA17 == 45 or LA17 == 46 or LA17 == 48:
@@ -1823,25 +1788,25 @@ class CParser(Parser):
 
                     if alt17 == 1:
                         # C.g:253:6: type_qualifier
-                        self.following.append(self.FOLLOW_type_qualifier_in_specifier_qualifier_list566)
+                        self.following.append(
+                            self.FOLLOW_type_qualifier_in_specifier_qualifier_list566)
                         self.type_qualifier()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     elif alt17 == 2:
                         # C.g:253:23: type_specifier
-                        self.following.append(self.FOLLOW_type_specifier_in_specifier_qualifier_list570)
+                        self.following.append(
+                            self.FOLLOW_type_specifier_in_specifier_qualifier_list570)
                         self.type_specifier()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
                         if cnt17 >= 1:
-                            break #loop17
+                            break  # loop17
 
                         if self.backtracking > 0:
                             self.failed = True
@@ -1852,17 +1817,13 @@ class CParser(Parser):
 
                     cnt17 += 1
 
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
             if self.backtracking > 0:
-                self.memoize(self.input, 15, specifier_qualifier_list_StartIndex)
+                self.memoize(self.input, 15,
+                             specifier_qualifier_list_StartIndex)
 
             pass
 
@@ -1870,9 +1831,9 @@ class CParser(Parser):
 
     # $ANTLR end specifier_qualifier_list
 
-
     # $ANTLR start struct_declarator_list
     # C.g:256:1: struct_declarator_list : struct_declarator ( ',' struct_declarator )* ;
+
     def struct_declarator_list(self, ):
 
         struct_declarator_list_StartIndex = self.input.index()
@@ -1883,39 +1844,35 @@ class CParser(Parser):
 
                 # C.g:257:2: ( struct_declarator ( ',' struct_declarator )* )
                 # C.g:257:4: struct_declarator ( ',' struct_declarator )*
-                self.following.append(self.FOLLOW_struct_declarator_in_struct_declarator_list584)
+                self.following.append(
+                    self.FOLLOW_struct_declarator_in_struct_declarator_list584)
                 self.struct_declarator()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:257:22: ( ',' struct_declarator )*
-                while True: #loop18
+                while True:  # loop18
                     alt18 = 2
                     LA18_0 = self.input.LA(1)
 
-                    if (LA18_0 == 27) :
+                    if (LA18_0 == 27):
                         alt18 = 1
 
-
                     if alt18 == 1:
                         # C.g:257:23: ',' struct_declarator
-                        self.match(self.input, 27, self.FOLLOW_27_in_struct_declarator_list587)
+                        self.match(self.input, 27,
+                                   self.FOLLOW_27_in_struct_declarator_list587)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_struct_declarator_in_struct_declarator_list589)
+                        self.following.append(
+                            self.FOLLOW_struct_declarator_in_struct_declarator_list589)
                         self.struct_declarator()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop18
-
-
-
-
-
+                        break  # loop18
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -1930,9 +1887,9 @@ class CParser(Parser):
 
     # $ANTLR end struct_declarator_list
 
-
     # $ANTLR start struct_declarator
     # C.g:260:1: struct_declarator : ( declarator ( ':' constant_expression )? | ':' constant_expression );
+
     def struct_declarator(self, ):
 
         struct_declarator_StartIndex = self.input.index()
@@ -1945,22 +1902,24 @@ class CParser(Parser):
                 alt20 = 2
                 LA20_0 = self.input.LA(1)
 
-                if (LA20_0 == IDENTIFIER or (58 <= LA20_0 <= 60) or LA20_0 == 62 or LA20_0 == 66) :
+                if (LA20_0 == IDENTIFIER or (58 <= LA20_0 <= 60) or LA20_0 == 62 or LA20_0 == 66):
                     alt20 = 1
-                elif (LA20_0 == 47) :
+                elif (LA20_0 == 47):
                     alt20 = 2
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("260:1: struct_declarator : ( declarator ( ':' constant_expression )? | ':' constant_expression );", 20, 0, self.input)
+                    nvae = NoViableAltException(
+                        "260:1: struct_declarator : ( declarator ( ':' constant_expression )? | ':' constant_expression );", 20, 0, self.input)
 
                     raise nvae
 
                 if alt20 == 1:
                     # C.g:261:4: declarator ( ':' constant_expression )?
-                    self.following.append(self.FOLLOW_declarator_in_struct_declarator602)
+                    self.following.append(
+                        self.FOLLOW_declarator_in_struct_declarator602)
                     self.declarator()
                     self.following.pop()
                     if self.failed:
@@ -1969,36 +1928,34 @@ class CParser(Parser):
                     alt19 = 2
                     LA19_0 = self.input.LA(1)
 
-                    if (LA19_0 == 47) :
+                    if (LA19_0 == 47):
                         alt19 = 1
                     if alt19 == 1:
                         # C.g:261:16: ':' constant_expression
-                        self.match(self.input, 47, self.FOLLOW_47_in_struct_declarator605)
+                        self.match(self.input, 47,
+                                   self.FOLLOW_47_in_struct_declarator605)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_constant_expression_in_struct_declarator607)
+                        self.following.append(
+                            self.FOLLOW_constant_expression_in_struct_declarator607)
                         self.constant_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
-
-
-
                 elif alt20 == 2:
                     # C.g:262:4: ':' constant_expression
-                    self.match(self.input, 47, self.FOLLOW_47_in_struct_declarator614)
+                    self.match(self.input, 47,
+                               self.FOLLOW_47_in_struct_declarator614)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_constant_expression_in_struct_declarator616)
+                    self.following.append(
+                        self.FOLLOW_constant_expression_in_struct_declarator616)
                     self.constant_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -2017,10 +1974,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start enum_specifier
     # C.g:265:1: enum_specifier options {k=3; } : ( 'enum' '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER );
+
     def enum_specifier(self, ):
 
         retval = self.enum_specifier_return()
@@ -2035,33 +1991,35 @@ class CParser(Parser):
                 alt23 = 3
                 LA23_0 = self.input.LA(1)
 
-                if (LA23_0 == 48) :
+                if (LA23_0 == 48):
                     LA23_1 = self.input.LA(2)
 
-                    if (LA23_1 == IDENTIFIER) :
+                    if (LA23_1 == IDENTIFIER):
                         LA23_2 = self.input.LA(3)
 
-                        if (LA23_2 == 43) :
+                        if (LA23_2 == 43):
                             alt23 = 2
-                        elif (LA23_2 == EOF or LA23_2 == IDENTIFIER or LA23_2 == 25 or LA23_2 == 27 or (29 <= LA23_2 <= 42) or (45 <= LA23_2 <= 64) or LA23_2 == 66) :
+                        elif (LA23_2 == EOF or LA23_2 == IDENTIFIER or LA23_2 == 25 or LA23_2 == 27 or (29 <= LA23_2 <= 42) or (45 <= LA23_2 <= 64) or LA23_2 == 66):
                             alt23 = 3
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return retval
 
-                            nvae = NoViableAltException("265:1: enum_specifier options {k=3; } : ( 'enum' '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER );", 23, 2, self.input)
+                            nvae = NoViableAltException(
+                                "265:1: enum_specifier options {k=3; } : ( 'enum' '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER );", 23, 2, self.input)
 
                             raise nvae
 
-                    elif (LA23_1 == 43) :
+                    elif (LA23_1 == 43):
                         alt23 = 1
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return retval
 
-                        nvae = NoViableAltException("265:1: enum_specifier options {k=3; } : ( 'enum' '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER );", 23, 1, self.input)
+                        nvae = NoViableAltException(
+                            "265:1: enum_specifier options {k=3; } : ( 'enum' '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER );", 23, 1, self.input)
 
                         raise nvae
 
@@ -2070,19 +2028,23 @@ class CParser(Parser):
                         self.failed = True
                         return retval
 
-                    nvae = NoViableAltException("265:1: enum_specifier options {k=3; } : ( 'enum' '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER );", 23, 0, self.input)
+                    nvae = NoViableAltException(
+                        "265:1: enum_specifier options {k=3; } : ( 'enum' '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER );", 23, 0, self.input)
 
                     raise nvae
 
                 if alt23 == 1:
                     # C.g:267:4: 'enum' '{' enumerator_list ( ',' )? '}'
-                    self.match(self.input, 48, self.FOLLOW_48_in_enum_specifier634)
+                    self.match(self.input, 48,
+                               self.FOLLOW_48_in_enum_specifier634)
                     if self.failed:
                         return retval
-                    self.match(self.input, 43, self.FOLLOW_43_in_enum_specifier636)
+                    self.match(self.input, 43,
+                               self.FOLLOW_43_in_enum_specifier636)
                     if self.failed:
                         return retval
-                    self.following.append(self.FOLLOW_enumerator_list_in_enum_specifier638)
+                    self.following.append(
+                        self.FOLLOW_enumerator_list_in_enum_specifier638)
                     self.enumerator_list()
                     self.following.pop()
                     if self.failed:
@@ -2091,33 +2053,36 @@ class CParser(Parser):
                     alt21 = 2
                     LA21_0 = self.input.LA(1)
 
-                    if (LA21_0 == 27) :
+                    if (LA21_0 == 27):
                         alt21 = 1
                     if alt21 == 1:
                         # C.g:0:0: ','
-                        self.match(self.input, 27, self.FOLLOW_27_in_enum_specifier640)
+                        self.match(self.input, 27,
+                                   self.FOLLOW_27_in_enum_specifier640)
                         if self.failed:
                             return retval
 
-
-
-                    self.match(self.input, 44, self.FOLLOW_44_in_enum_specifier643)
+                    self.match(self.input, 44,
+                               self.FOLLOW_44_in_enum_specifier643)
                     if self.failed:
                         return retval
 
-
                 elif alt23 == 2:
                     # C.g:268:4: 'enum' IDENTIFIER '{' enumerator_list ( ',' )? '}'
-                    self.match(self.input, 48, self.FOLLOW_48_in_enum_specifier648)
+                    self.match(self.input, 48,
+                               self.FOLLOW_48_in_enum_specifier648)
                     if self.failed:
                         return retval
-                    self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_enum_specifier650)
+                    self.match(self.input, IDENTIFIER,
+                               self.FOLLOW_IDENTIFIER_in_enum_specifier650)
                     if self.failed:
                         return retval
-                    self.match(self.input, 43, self.FOLLOW_43_in_enum_specifier652)
+                    self.match(self.input, 43,
+                               self.FOLLOW_43_in_enum_specifier652)
                     if self.failed:
                         return retval
-                    self.following.append(self.FOLLOW_enumerator_list_in_enum_specifier654)
+                    self.following.append(
+                        self.FOLLOW_enumerator_list_in_enum_specifier654)
                     self.enumerator_list()
                     self.following.pop()
                     if self.failed:
@@ -2126,34 +2091,33 @@ class CParser(Parser):
                     alt22 = 2
                     LA22_0 = self.input.LA(1)
 
-                    if (LA22_0 == 27) :
+                    if (LA22_0 == 27):
                         alt22 = 1
                     if alt22 == 1:
                         # C.g:0:0: ','
-                        self.match(self.input, 27, self.FOLLOW_27_in_enum_specifier656)
+                        self.match(self.input, 27,
+                                   self.FOLLOW_27_in_enum_specifier656)
                         if self.failed:
                             return retval
 
-
-
-                    self.match(self.input, 44, self.FOLLOW_44_in_enum_specifier659)
+                    self.match(self.input, 44,
+                               self.FOLLOW_44_in_enum_specifier659)
                     if self.failed:
                         return retval
 
-
                 elif alt23 == 3:
                     # C.g:269:4: 'enum' IDENTIFIER
-                    self.match(self.input, 48, self.FOLLOW_48_in_enum_specifier664)
+                    self.match(self.input, 48,
+                               self.FOLLOW_48_in_enum_specifier664)
                     if self.failed:
                         return retval
-                    self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_enum_specifier666)
+                    self.match(self.input, IDENTIFIER,
+                               self.FOLLOW_IDENTIFIER_in_enum_specifier666)
                     if self.failed:
                         return retval
 
-
                 retval.stop = self.input.LT(-1)
 
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -2167,9 +2131,9 @@ class CParser(Parser):
 
     # $ANTLR end enum_specifier
 
-
     # $ANTLR start enumerator_list
     # C.g:272:1: enumerator_list : enumerator ( ',' enumerator )* ;
+
     def enumerator_list(self, ):
 
         enumerator_list_StartIndex = self.input.index()
@@ -2180,44 +2144,38 @@ class CParser(Parser):
 
                 # C.g:273:2: ( enumerator ( ',' enumerator )* )
                 # C.g:273:4: enumerator ( ',' enumerator )*
-                self.following.append(self.FOLLOW_enumerator_in_enumerator_list677)
+                self.following.append(
+                    self.FOLLOW_enumerator_in_enumerator_list677)
                 self.enumerator()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:273:15: ( ',' enumerator )*
-                while True: #loop24
+                while True:  # loop24
                     alt24 = 2
                     LA24_0 = self.input.LA(1)
 
-                    if (LA24_0 == 27) :
+                    if (LA24_0 == 27):
                         LA24_1 = self.input.LA(2)
 
-                        if (LA24_1 == IDENTIFIER) :
+                        if (LA24_1 == IDENTIFIER):
                             alt24 = 1
 
-
-
-
                     if alt24 == 1:
                         # C.g:273:16: ',' enumerator
-                        self.match(self.input, 27, self.FOLLOW_27_in_enumerator_list680)
+                        self.match(self.input, 27,
+                                   self.FOLLOW_27_in_enumerator_list680)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_enumerator_in_enumerator_list682)
+                        self.following.append(
+                            self.FOLLOW_enumerator_in_enumerator_list682)
                         self.enumerator()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop24
-
-
-
-
-
+                        break  # loop24
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -2232,9 +2190,9 @@ class CParser(Parser):
 
     # $ANTLR end enumerator_list
 
-
     # $ANTLR start enumerator
     # C.g:276:1: enumerator : IDENTIFIER ( '=' constant_expression )? ;
+
     def enumerator(self, ):
 
         enumerator_StartIndex = self.input.index()
@@ -2245,32 +2203,28 @@ class CParser(Parser):
 
                 # C.g:277:2: ( IDENTIFIER ( '=' constant_expression )? )
                 # C.g:277:4: IDENTIFIER ( '=' constant_expression )?
-                self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_enumerator695)
+                self.match(self.input, IDENTIFIER,
+                           self.FOLLOW_IDENTIFIER_in_enumerator695)
                 if self.failed:
                     return
                 # C.g:277:15: ( '=' constant_expression )?
                 alt25 = 2
                 LA25_0 = self.input.LA(1)
 
-                if (LA25_0 == 28) :
+                if (LA25_0 == 28):
                     alt25 = 1
                 if alt25 == 1:
                     # C.g:277:16: '=' constant_expression
                     self.match(self.input, 28, self.FOLLOW_28_in_enumerator698)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_constant_expression_in_enumerator700)
+                    self.following.append(
+                        self.FOLLOW_constant_expression_in_enumerator700)
                     self.constant_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -2284,9 +2238,9 @@ class CParser(Parser):
 
     # $ANTLR end enumerator
 
-
     # $ANTLR start type_qualifier
     # C.g:280:1: type_qualifier : ( 'const' | 'volatile' | 'IN' | 'OUT' | 'OPTIONAL' | 'CONST' | 'UNALIGNED' | 'VOLATILE' | 'GLOBAL_REMOVE_IF_UNREFERENCED' | 'EFIAPI' | 'EFI_BOOTSERVICE' | 'EFI_RUNTIMESERVICE' | 'PACKED' );
+
     def type_qualifier(self, ):
 
         type_qualifier_StartIndex = self.input.index()
@@ -2298,7 +2252,7 @@ class CParser(Parser):
                 # C.g:281:2: ( 'const' | 'volatile' | 'IN' | 'OUT' | 'OPTIONAL' | 'CONST' | 'UNALIGNED' | 'VOLATILE' | 'GLOBAL_REMOVE_IF_UNREFERENCED' | 'EFIAPI' | 'EFI_BOOTSERVICE' | 'EFI_RUNTIMESERVICE' | 'PACKED' )
                 # C.g:
                 if (49 <= self.input.LA(1) <= 61):
-                    self.input.consume();
+                    self.input.consume()
                     self.errorRecovery = False
                     self.failed = False
 
@@ -2310,14 +2264,9 @@ class CParser(Parser):
                     mse = MismatchedSetException(None, self.input)
                     self.recoverFromMismatchedSet(
                         self.input, mse, self.FOLLOW_set_in_type_qualifier0
-                        )
+                    )
                     raise mse
 
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -2336,10 +2285,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start declarator
     # C.g:296:1: declarator : ( ( pointer )? ( 'EFIAPI' )? ( 'EFI_BOOTSERVICE' )? ( 'EFI_RUNTIMESERVICE' )? direct_declarator | pointer );
+
     def declarator(self, ):
 
         retval = self.declarator_return()
@@ -2354,30 +2302,32 @@ class CParser(Parser):
                 alt30 = 2
                 LA30_0 = self.input.LA(1)
 
-                if (LA30_0 == 66) :
+                if (LA30_0 == 66):
                     LA30_1 = self.input.LA(2)
 
-                    if (self.synpred66()) :
+                    if (self.synpred66()):
                         alt30 = 1
-                    elif (True) :
+                    elif (True):
                         alt30 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return retval
 
-                        nvae = NoViableAltException("296:1: declarator : ( ( pointer )? ( 'EFIAPI' )? ( 'EFI_BOOTSERVICE' )? ( 'EFI_RUNTIMESERVICE' )? direct_declarator | pointer );", 30, 1, self.input)
+                        nvae = NoViableAltException(
+                            "296:1: declarator : ( ( pointer )? ( 'EFIAPI' )? ( 'EFI_BOOTSERVICE' )? ( 'EFI_RUNTIMESERVICE' )? direct_declarator | pointer );", 30, 1, self.input)
 
                         raise nvae
 
-                elif (LA30_0 == IDENTIFIER or (58 <= LA30_0 <= 60) or LA30_0 == 62) :
+                elif (LA30_0 == IDENTIFIER or (58 <= LA30_0 <= 60) or LA30_0 == 62):
                     alt30 = 1
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return retval
 
-                    nvae = NoViableAltException("296:1: declarator : ( ( pointer )? ( 'EFIAPI' )? ( 'EFI_BOOTSERVICE' )? ( 'EFI_RUNTIMESERVICE' )? direct_declarator | pointer );", 30, 0, self.input)
+                    nvae = NoViableAltException(
+                        "296:1: declarator : ( ( pointer )? ( 'EFIAPI' )? ( 'EFI_BOOTSERVICE' )? ( 'EFI_RUNTIMESERVICE' )? direct_declarator | pointer );", 30, 0, self.input)
 
                     raise nvae
 
@@ -2387,67 +2337,63 @@ class CParser(Parser):
                     alt26 = 2
                     LA26_0 = self.input.LA(1)
 
-                    if (LA26_0 == 66) :
+                    if (LA26_0 == 66):
                         alt26 = 1
                     if alt26 == 1:
                         # C.g:0:0: pointer
-                        self.following.append(self.FOLLOW_pointer_in_declarator784)
+                        self.following.append(
+                            self.FOLLOW_pointer_in_declarator784)
                         self.pointer()
                         self.following.pop()
                         if self.failed:
                             return retval
 
-
-
                     # C.g:297:13: ( 'EFIAPI' )?
                     alt27 = 2
                     LA27_0 = self.input.LA(1)
 
-                    if (LA27_0 == 58) :
+                    if (LA27_0 == 58):
                         alt27 = 1
                     if alt27 == 1:
                         # C.g:297:14: 'EFIAPI'
-                        self.match(self.input, 58, self.FOLLOW_58_in_declarator788)
+                        self.match(self.input, 58,
+                                   self.FOLLOW_58_in_declarator788)
                         if self.failed:
                             return retval
 
-
-
                     # C.g:297:25: ( 'EFI_BOOTSERVICE' )?
                     alt28 = 2
                     LA28_0 = self.input.LA(1)
 
-                    if (LA28_0 == 59) :
+                    if (LA28_0 == 59):
                         alt28 = 1
                     if alt28 == 1:
                         # C.g:297:26: 'EFI_BOOTSERVICE'
-                        self.match(self.input, 59, self.FOLLOW_59_in_declarator793)
+                        self.match(self.input, 59,
+                                   self.FOLLOW_59_in_declarator793)
                         if self.failed:
                             return retval
 
-
-
                     # C.g:297:46: ( 'EFI_RUNTIMESERVICE' )?
                     alt29 = 2
                     LA29_0 = self.input.LA(1)
 
-                    if (LA29_0 == 60) :
+                    if (LA29_0 == 60):
                         alt29 = 1
                     if alt29 == 1:
                         # C.g:297:47: 'EFI_RUNTIMESERVICE'
-                        self.match(self.input, 60, self.FOLLOW_60_in_declarator798)
+                        self.match(self.input, 60,
+                                   self.FOLLOW_60_in_declarator798)
                         if self.failed:
                             return retval
 
-
-
-                    self.following.append(self.FOLLOW_direct_declarator_in_declarator802)
+                    self.following.append(
+                        self.FOLLOW_direct_declarator_in_declarator802)
                     self.direct_declarator()
                     self.following.pop()
                     if self.failed:
                         return retval
 
-
                 elif alt30 == 2:
                     # C.g:299:4: pointer
                     self.following.append(self.FOLLOW_pointer_in_declarator808)
@@ -2456,10 +2402,8 @@ class CParser(Parser):
                     if self.failed:
                         return retval
 
-
                 retval.stop = self.input.LT(-1)
 
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -2473,9 +2417,9 @@ class CParser(Parser):
 
     # $ANTLR end declarator
 
-
     # $ANTLR start direct_declarator
     # C.g:302:1: direct_declarator : ( IDENTIFIER ( declarator_suffix )* | '(' ( 'EFIAPI' )? declarator ')' ( declarator_suffix )+ );
+
     def direct_declarator(self, ):
 
         direct_declarator_StartIndex = self.input.index()
@@ -2488,556 +2432,485 @@ class CParser(Parser):
                 alt34 = 2
                 LA34_0 = self.input.LA(1)
 
-                if (LA34_0 == IDENTIFIER) :
+                if (LA34_0 == IDENTIFIER):
                     alt34 = 1
-                elif (LA34_0 == 62) :
+                elif (LA34_0 == 62):
                     alt34 = 2
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("302:1: direct_declarator : ( IDENTIFIER ( declarator_suffix )* | '(' ( 'EFIAPI' )? declarator ')' ( declarator_suffix )+ );", 34, 0, self.input)
+                    nvae = NoViableAltException(
+                        "302:1: direct_declarator : ( IDENTIFIER ( declarator_suffix )* | '(' ( 'EFIAPI' )? declarator ')' ( declarator_suffix )+ );", 34, 0, self.input)
 
                     raise nvae
 
                 if alt34 == 1:
                     # C.g:303:4: IDENTIFIER ( declarator_suffix )*
-                    self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_direct_declarator819)
+                    self.match(self.input, IDENTIFIER,
+                               self.FOLLOW_IDENTIFIER_in_direct_declarator819)
                     if self.failed:
                         return
                     # C.g:303:15: ( declarator_suffix )*
-                    while True: #loop31
+                    while True:  # loop31
                         alt31 = 2
                         LA31_0 = self.input.LA(1)
 
-                        if (LA31_0 == 62) :
+                        if (LA31_0 == 62):
                             LA31 = self.input.LA(2)
                             if LA31 == 63:
                                 LA31_30 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 58:
                                 LA31_31 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 66:
                                 LA31_32 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 59:
                                 LA31_33 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 60:
                                 LA31_34 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == IDENTIFIER:
                                 LA31_35 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 29 or LA31 == 30 or LA31 == 31 or LA31 == 32 or LA31 == 33:
                                 LA31_37 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 34:
                                 LA31_38 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 35:
                                 LA31_39 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 36:
                                 LA31_40 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 37:
                                 LA31_41 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 38:
                                 LA31_42 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 39:
                                 LA31_43 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 40:
                                 LA31_44 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 41:
                                 LA31_45 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 42:
                                 LA31_46 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 45 or LA31 == 46:
                                 LA31_47 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 48:
                                 LA31_48 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 49 or LA31 == 50 or LA31 == 51 or LA31 == 52 or LA31 == 53 or LA31 == 54 or LA31 == 55 or LA31 == 56 or LA31 == 57 or LA31 == 61:
                                 LA31_49 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
-
-                        elif (LA31_0 == 64) :
+                        elif (LA31_0 == 64):
                             LA31 = self.input.LA(2)
                             if LA31 == 65:
                                 LA31_51 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 62:
                                 LA31_52 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == IDENTIFIER:
                                 LA31_53 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == HEX_LITERAL:
                                 LA31_54 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == OCTAL_LITERAL:
                                 LA31_55 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == DECIMAL_LITERAL:
                                 LA31_56 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == CHARACTER_LITERAL:
                                 LA31_57 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == STRING_LITERAL:
                                 LA31_58 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == FLOATING_POINT_LITERAL:
                                 LA31_59 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 72:
                                 LA31_60 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 73:
                                 LA31_61 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 66 or LA31 == 68 or LA31 == 69 or LA31 == 77 or LA31 == 78 or LA31 == 79:
                                 LA31_62 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 74:
                                 LA31_63 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
-
-
-
                         if alt31 == 1:
                             # C.g:0:0: declarator_suffix
-                            self.following.append(self.FOLLOW_declarator_suffix_in_direct_declarator821)
+                            self.following.append(
+                                self.FOLLOW_declarator_suffix_in_direct_declarator821)
                             self.declarator_suffix()
                             self.following.pop()
                             if self.failed:
                                 return
 
-
                         else:
-                            break #loop31
-
-
-
+                            break  # loop31
 
                 elif alt34 == 2:
                     # C.g:304:4: '(' ( 'EFIAPI' )? declarator ')' ( declarator_suffix )+
-                    self.match(self.input, 62, self.FOLLOW_62_in_direct_declarator827)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_direct_declarator827)
                     if self.failed:
                         return
                     # C.g:304:8: ( 'EFIAPI' )?
                     alt32 = 2
                     LA32_0 = self.input.LA(1)
 
-                    if (LA32_0 == 58) :
+                    if (LA32_0 == 58):
                         LA32_1 = self.input.LA(2)
 
-                        if (self.synpred69()) :
+                        if (self.synpred69()):
                             alt32 = 1
                     if alt32 == 1:
                         # C.g:304:9: 'EFIAPI'
-                        self.match(self.input, 58, self.FOLLOW_58_in_direct_declarator830)
+                        self.match(self.input, 58,
+                                   self.FOLLOW_58_in_direct_declarator830)
                         if self.failed:
                             return
 
-
-
-                    self.following.append(self.FOLLOW_declarator_in_direct_declarator834)
+                    self.following.append(
+                        self.FOLLOW_declarator_in_direct_declarator834)
                     self.declarator()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_direct_declarator836)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_direct_declarator836)
                     if self.failed:
                         return
                     # C.g:304:35: ( declarator_suffix )+
                     cnt33 = 0
-                    while True: #loop33
+                    while True:  # loop33
                         alt33 = 2
                         LA33_0 = self.input.LA(1)
 
-                        if (LA33_0 == 62) :
+                        if (LA33_0 == 62):
                             LA33 = self.input.LA(2)
                             if LA33 == 63:
                                 LA33_30 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 58:
                                 LA33_31 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 66:
                                 LA33_32 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 59:
                                 LA33_33 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 60:
                                 LA33_34 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == IDENTIFIER:
                                 LA33_35 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 29 or LA33 == 30 or LA33 == 31 or LA33 == 32 or LA33 == 33:
                                 LA33_37 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 34:
                                 LA33_38 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 35:
                                 LA33_39 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 36:
                                 LA33_40 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 37:
                                 LA33_41 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 38:
                                 LA33_42 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 39:
                                 LA33_43 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 40:
                                 LA33_44 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 41:
                                 LA33_45 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 42:
                                 LA33_46 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 45 or LA33 == 46:
                                 LA33_47 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 48:
                                 LA33_48 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 49 or LA33 == 50 or LA33 == 51 or LA33 == 52 or LA33 == 53 or LA33 == 54 or LA33 == 55 or LA33 == 56 or LA33 == 57 or LA33 == 61:
                                 LA33_49 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
-
-                        elif (LA33_0 == 64) :
+                        elif (LA33_0 == 64):
                             LA33 = self.input.LA(2)
                             if LA33 == 65:
                                 LA33_51 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 62:
                                 LA33_52 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == IDENTIFIER:
                                 LA33_53 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == HEX_LITERAL:
                                 LA33_54 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == OCTAL_LITERAL:
                                 LA33_55 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == DECIMAL_LITERAL:
                                 LA33_56 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == CHARACTER_LITERAL:
                                 LA33_57 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == STRING_LITERAL:
                                 LA33_58 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == FLOATING_POINT_LITERAL:
                                 LA33_59 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 72:
                                 LA33_60 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 73:
                                 LA33_61 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 66 or LA33 == 68 or LA33 == 69 or LA33 == 77 or LA33 == 78 or LA33 == 79:
                                 LA33_62 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 74:
                                 LA33_63 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
-
-
-
                         if alt33 == 1:
                             # C.g:0:0: declarator_suffix
-                            self.following.append(self.FOLLOW_declarator_suffix_in_direct_declarator838)
+                            self.following.append(
+                                self.FOLLOW_declarator_suffix_in_direct_declarator838)
                             self.declarator_suffix()
                             self.following.pop()
                             if self.failed:
                                 return
 
-
                         else:
                             if cnt33 >= 1:
-                                break #loop33
+                                break  # loop33
 
                             if self.backtracking > 0:
                                 self.failed = True
@@ -3048,10 +2921,6 @@ class CParser(Parser):
 
                         cnt33 += 1
 
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -3065,9 +2934,9 @@ class CParser(Parser):
 
     # $ANTLR end direct_declarator
 
-
     # $ANTLR start declarator_suffix
     # C.g:307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );
+
     def declarator_suffix(self, ):
 
         declarator_suffix_StartIndex = self.input.index()
@@ -3080,23 +2949,24 @@ class CParser(Parser):
                 alt35 = 5
                 LA35_0 = self.input.LA(1)
 
-                if (LA35_0 == 64) :
+                if (LA35_0 == 64):
                     LA35_1 = self.input.LA(2)
 
-                    if (LA35_1 == 65) :
+                    if (LA35_1 == 65):
                         alt35 = 2
-                    elif ((IDENTIFIER <= LA35_1 <= FLOATING_POINT_LITERAL) or LA35_1 == 62 or LA35_1 == 66 or (68 <= LA35_1 <= 69) or (72 <= LA35_1 <= 74) or (77 <= LA35_1 <= 79)) :
+                    elif ((IDENTIFIER <= LA35_1 <= FLOATING_POINT_LITERAL) or LA35_1 == 62 or LA35_1 == 66 or (68 <= LA35_1 <= 69) or (72 <= LA35_1 <= 74) or (77 <= LA35_1 <= 79)):
                         alt35 = 1
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 1, self.input)
+                        nvae = NoViableAltException(
+                            "307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 1, self.input)
 
                         raise nvae
 
-                elif (LA35_0 == 62) :
+                elif (LA35_0 == 62):
                     LA35 = self.input.LA(2)
                     if LA35 == 63:
                         alt35 = 5
@@ -3105,16 +2975,17 @@ class CParser(Parser):
                     elif LA35 == IDENTIFIER:
                         LA35_29 = self.input.LA(3)
 
-                        if (self.synpred73()) :
+                        if (self.synpred73()):
                             alt35 = 3
-                        elif (self.synpred74()) :
+                        elif (self.synpred74()):
                             alt35 = 4
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 29, self.input)
+                            nvae = NoViableAltException(
+                                "307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 29, self.input)
 
                             raise nvae
 
@@ -3123,7 +2994,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 2, self.input)
+                        nvae = NoViableAltException(
+                            "307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 2, self.input)
 
                         raise nvae
 
@@ -3132,76 +3004,84 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 0, self.input)
+                    nvae = NoViableAltException(
+                        "307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 0, self.input)
 
                     raise nvae
 
                 if alt35 == 1:
                     # C.g:308:6: '[' constant_expression ']'
-                    self.match(self.input, 64, self.FOLLOW_64_in_declarator_suffix852)
+                    self.match(self.input, 64,
+                               self.FOLLOW_64_in_declarator_suffix852)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_constant_expression_in_declarator_suffix854)
+                    self.following.append(
+                        self.FOLLOW_constant_expression_in_declarator_suffix854)
                     self.constant_expression()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 65, self.FOLLOW_65_in_declarator_suffix856)
+                    self.match(self.input, 65,
+                               self.FOLLOW_65_in_declarator_suffix856)
                     if self.failed:
                         return
 
-
                 elif alt35 == 2:
                     # C.g:309:9: '[' ']'
-                    self.match(self.input, 64, self.FOLLOW_64_in_declarator_suffix866)
+                    self.match(self.input, 64,
+                               self.FOLLOW_64_in_declarator_suffix866)
                     if self.failed:
                         return
-                    self.match(self.input, 65, self.FOLLOW_65_in_declarator_suffix868)
+                    self.match(self.input, 65,
+                               self.FOLLOW_65_in_declarator_suffix868)
                     if self.failed:
                         return
 
-
                 elif alt35 == 3:
                     # C.g:310:9: '(' parameter_type_list ')'
-                    self.match(self.input, 62, self.FOLLOW_62_in_declarator_suffix878)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_declarator_suffix878)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_parameter_type_list_in_declarator_suffix880)
+                    self.following.append(
+                        self.FOLLOW_parameter_type_list_in_declarator_suffix880)
                     self.parameter_type_list()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_declarator_suffix882)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_declarator_suffix882)
                     if self.failed:
                         return
 
-
                 elif alt35 == 4:
                     # C.g:311:9: '(' identifier_list ')'
-                    self.match(self.input, 62, self.FOLLOW_62_in_declarator_suffix892)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_declarator_suffix892)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_identifier_list_in_declarator_suffix894)
+                    self.following.append(
+                        self.FOLLOW_identifier_list_in_declarator_suffix894)
                     self.identifier_list()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_declarator_suffix896)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_declarator_suffix896)
                     if self.failed:
                         return
 
-
                 elif alt35 == 5:
                     # C.g:312:9: '(' ')'
-                    self.match(self.input, 62, self.FOLLOW_62_in_declarator_suffix906)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_declarator_suffix906)
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_declarator_suffix908)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_declarator_suffix908)
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -3215,9 +3095,9 @@ class CParser(Parser):
 
     # $ANTLR end declarator_suffix
 
-
     # $ANTLR start pointer
     # C.g:315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );
+
     def pointer(self, ):
 
         pointer_StartIndex = self.input.index()
@@ -3230,69 +3110,73 @@ class CParser(Parser):
                 alt38 = 3
                 LA38_0 = self.input.LA(1)
 
-                if (LA38_0 == 66) :
+                if (LA38_0 == 66):
                     LA38 = self.input.LA(2)
                     if LA38 == 66:
                         LA38_2 = self.input.LA(3)
 
-                        if (self.synpred78()) :
+                        if (self.synpred78()):
                             alt38 = 2
-                        elif (True) :
+                        elif (True):
                             alt38 = 3
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 2, self.input)
+                            nvae = NoViableAltException(
+                                "315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 2, self.input)
 
                             raise nvae
 
                     elif LA38 == 58:
                         LA38_3 = self.input.LA(3)
 
-                        if (self.synpred77()) :
+                        if (self.synpred77()):
                             alt38 = 1
-                        elif (True) :
+                        elif (True):
                             alt38 = 3
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 3, self.input)
+                            nvae = NoViableAltException(
+                                "315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 3, self.input)
 
                             raise nvae
 
                     elif LA38 == 59:
                         LA38_4 = self.input.LA(3)
 
-                        if (self.synpred77()) :
+                        if (self.synpred77()):
                             alt38 = 1
-                        elif (True) :
+                        elif (True):
                             alt38 = 3
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 4, self.input)
+                            nvae = NoViableAltException(
+                                "315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 4, self.input)
 
                             raise nvae
 
                     elif LA38 == 60:
                         LA38_5 = self.input.LA(3)
 
-                        if (self.synpred77()) :
+                        if (self.synpred77()):
                             alt38 = 1
-                        elif (True) :
+                        elif (True):
                             alt38 = 3
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 5, self.input)
+                            nvae = NoViableAltException(
+                                "315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 5, self.input)
 
                             raise nvae
 
@@ -3301,32 +3185,34 @@ class CParser(Parser):
                     elif LA38 == 53:
                         LA38_21 = self.input.LA(3)
 
-                        if (self.synpred77()) :
+                        if (self.synpred77()):
                             alt38 = 1
-                        elif (True) :
+                        elif (True):
                             alt38 = 3
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 21, self.input)
+                            nvae = NoViableAltException(
+                                "315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 21, self.input)
 
                             raise nvae
 
                     elif LA38 == 49 or LA38 == 50 or LA38 == 51 or LA38 == 52 or LA38 == 54 or LA38 == 55 or LA38 == 56 or LA38 == 57 or LA38 == 61:
                         LA38_29 = self.input.LA(3)
 
-                        if (self.synpred77()) :
+                        if (self.synpred77()):
                             alt38 = 1
-                        elif (True) :
+                        elif (True):
                             alt38 = 3
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 29, self.input)
+                            nvae = NoViableAltException(
+                                "315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 29, self.input)
 
                             raise nvae
 
@@ -3335,7 +3221,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 1, self.input)
+                        nvae = NoViableAltException(
+                            "315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 1, self.input)
 
                         raise nvae
 
@@ -3344,7 +3231,8 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 0, self.input)
+                    nvae = NoViableAltException(
+                        "315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 0, self.input)
 
                     raise nvae
 
@@ -3355,57 +3243,51 @@ class CParser(Parser):
                         return
                     # C.g:316:8: ( type_qualifier )+
                     cnt36 = 0
-                    while True: #loop36
+                    while True:  # loop36
                         alt36 = 2
                         LA36 = self.input.LA(1)
                         if LA36 == 58:
                             LA36_2 = self.input.LA(2)
 
-                            if (self.synpred75()) :
+                            if (self.synpred75()):
                                 alt36 = 1
 
-
                         elif LA36 == 59:
                             LA36_3 = self.input.LA(2)
 
-                            if (self.synpred75()) :
+                            if (self.synpred75()):
                                 alt36 = 1
 
-
                         elif LA36 == 60:
                             LA36_4 = self.input.LA(2)
 
-                            if (self.synpred75()) :
+                            if (self.synpred75()):
                                 alt36 = 1
 
-
                         elif LA36 == 53:
                             LA36_20 = self.input.LA(2)
 
-                            if (self.synpred75()) :
+                            if (self.synpred75()):
                                 alt36 = 1
 
-
                         elif LA36 == 49 or LA36 == 50 or LA36 == 51 or LA36 == 52 or LA36 == 54 or LA36 == 55 or LA36 == 56 or LA36 == 57 or LA36 == 61:
                             LA36_28 = self.input.LA(2)
 
-                            if (self.synpred75()) :
+                            if (self.synpred75()):
                                 alt36 = 1
 
-
-
                         if alt36 == 1:
                             # C.g:0:0: type_qualifier
-                            self.following.append(self.FOLLOW_type_qualifier_in_pointer921)
+                            self.following.append(
+                                self.FOLLOW_type_qualifier_in_pointer921)
                             self.type_qualifier()
                             self.following.pop()
                             if self.failed:
                                 return
 
-
                         else:
                             if cnt36 >= 1:
-                                break #loop36
+                                break  # loop36
 
                             if self.backtracking > 0:
                                 self.failed = True
@@ -3416,28 +3298,24 @@ class CParser(Parser):
 
                         cnt36 += 1
 
-
                     # C.g:316:24: ( pointer )?
                     alt37 = 2
                     LA37_0 = self.input.LA(1)
 
-                    if (LA37_0 == 66) :
+                    if (LA37_0 == 66):
                         LA37_1 = self.input.LA(2)
 
-                        if (self.synpred76()) :
+                        if (self.synpred76()):
                             alt37 = 1
                     if alt37 == 1:
                         # C.g:0:0: pointer
-                        self.following.append(self.FOLLOW_pointer_in_pointer924)
+                        self.following.append(
+                            self.FOLLOW_pointer_in_pointer924)
                         self.pointer()
                         self.following.pop()
                         if self.failed:
                             return
 
-
-
-
-
                 elif alt38 == 2:
                     # C.g:317:4: '*' pointer
                     self.match(self.input, 66, self.FOLLOW_66_in_pointer930)
@@ -3449,15 +3327,12 @@ class CParser(Parser):
                     if self.failed:
                         return
 
-
                 elif alt38 == 3:
                     # C.g:318:4: '*'
                     self.match(self.input, 66, self.FOLLOW_66_in_pointer937)
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -3471,9 +3346,9 @@ class CParser(Parser):
 
     # $ANTLR end pointer
 
-
     # $ANTLR start parameter_type_list
     # C.g:321:1: parameter_type_list : parameter_list ( ',' ( 'OPTIONAL' )? '...' )? ;
+
     def parameter_type_list(self, ):
 
         parameter_type_list_StartIndex = self.input.index()
@@ -3484,7 +3359,8 @@ class CParser(Parser):
 
                 # C.g:322:2: ( parameter_list ( ',' ( 'OPTIONAL' )? '...' )? )
                 # C.g:322:4: parameter_list ( ',' ( 'OPTIONAL' )? '...' )?
-                self.following.append(self.FOLLOW_parameter_list_in_parameter_type_list948)
+                self.following.append(
+                    self.FOLLOW_parameter_list_in_parameter_type_list948)
                 self.parameter_list()
                 self.following.pop()
                 if self.failed:
@@ -3493,37 +3369,32 @@ class CParser(Parser):
                 alt40 = 2
                 LA40_0 = self.input.LA(1)
 
-                if (LA40_0 == 27) :
+                if (LA40_0 == 27):
                     alt40 = 1
                 if alt40 == 1:
                     # C.g:322:20: ',' ( 'OPTIONAL' )? '...'
-                    self.match(self.input, 27, self.FOLLOW_27_in_parameter_type_list951)
+                    self.match(self.input, 27,
+                               self.FOLLOW_27_in_parameter_type_list951)
                     if self.failed:
                         return
                     # C.g:322:24: ( 'OPTIONAL' )?
                     alt39 = 2
                     LA39_0 = self.input.LA(1)
 
-                    if (LA39_0 == 53) :
+                    if (LA39_0 == 53):
                         alt39 = 1
                     if alt39 == 1:
                         # C.g:322:25: 'OPTIONAL'
-                        self.match(self.input, 53, self.FOLLOW_53_in_parameter_type_list954)
+                        self.match(self.input, 53,
+                                   self.FOLLOW_53_in_parameter_type_list954)
                         if self.failed:
                             return
 
-
-
-                    self.match(self.input, 67, self.FOLLOW_67_in_parameter_type_list958)
+                    self.match(self.input, 67,
+                               self.FOLLOW_67_in_parameter_type_list958)
                     if self.failed:
                         return
 
-
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -3537,9 +3408,9 @@ class CParser(Parser):
 
     # $ANTLR end parameter_type_list
 
-
     # $ANTLR start parameter_list
     # C.g:325:1: parameter_list : parameter_declaration ( ',' ( 'OPTIONAL' )? parameter_declaration )* ;
+
     def parameter_list(self, ):
 
         parameter_list_StartIndex = self.input.index()
@@ -3550,68 +3421,60 @@ class CParser(Parser):
 
                 # C.g:326:2: ( parameter_declaration ( ',' ( 'OPTIONAL' )? parameter_declaration )* )
                 # C.g:326:4: parameter_declaration ( ',' ( 'OPTIONAL' )? parameter_declaration )*
-                self.following.append(self.FOLLOW_parameter_declaration_in_parameter_list971)
+                self.following.append(
+                    self.FOLLOW_parameter_declaration_in_parameter_list971)
                 self.parameter_declaration()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:326:26: ( ',' ( 'OPTIONAL' )? parameter_declaration )*
-                while True: #loop42
+                while True:  # loop42
                     alt42 = 2
                     LA42_0 = self.input.LA(1)
 
-                    if (LA42_0 == 27) :
+                    if (LA42_0 == 27):
                         LA42_1 = self.input.LA(2)
 
-                        if (LA42_1 == 53) :
+                        if (LA42_1 == 53):
                             LA42_3 = self.input.LA(3)
 
-                            if (self.synpred82()) :
+                            if (self.synpred82()):
                                 alt42 = 1
 
-
-                        elif (LA42_1 == IDENTIFIER or (29 <= LA42_1 <= 42) or (45 <= LA42_1 <= 46) or (48 <= LA42_1 <= 52) or (54 <= LA42_1 <= 61) or LA42_1 == 66) :
+                        elif (LA42_1 == IDENTIFIER or (29 <= LA42_1 <= 42) or (45 <= LA42_1 <= 46) or (48 <= LA42_1 <= 52) or (54 <= LA42_1 <= 61) or LA42_1 == 66):
                             alt42 = 1
 
-
-
-
                     if alt42 == 1:
                         # C.g:326:27: ',' ( 'OPTIONAL' )? parameter_declaration
-                        self.match(self.input, 27, self.FOLLOW_27_in_parameter_list974)
+                        self.match(self.input, 27,
+                                   self.FOLLOW_27_in_parameter_list974)
                         if self.failed:
                             return
                         # C.g:326:31: ( 'OPTIONAL' )?
                         alt41 = 2
                         LA41_0 = self.input.LA(1)
 
-                        if (LA41_0 == 53) :
+                        if (LA41_0 == 53):
                             LA41_1 = self.input.LA(2)
 
-                            if (self.synpred81()) :
+                            if (self.synpred81()):
                                 alt41 = 1
                         if alt41 == 1:
                             # C.g:326:32: 'OPTIONAL'
-                            self.match(self.input, 53, self.FOLLOW_53_in_parameter_list977)
+                            self.match(self.input, 53,
+                                       self.FOLLOW_53_in_parameter_list977)
                             if self.failed:
                                 return
 
-
-
-                        self.following.append(self.FOLLOW_parameter_declaration_in_parameter_list981)
+                        self.following.append(
+                            self.FOLLOW_parameter_declaration_in_parameter_list981)
                         self.parameter_declaration()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop42
-
-
-
-
-
+                        break  # loop42
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -3626,9 +3489,9 @@ class CParser(Parser):
 
     # $ANTLR end parameter_list
 
-
     # $ANTLR start parameter_declaration
     # C.g:329:1: parameter_declaration : ( declaration_specifiers ( declarator | abstract_declarator )* ( 'OPTIONAL' )? | ( pointer )* IDENTIFIER );
+
     def parameter_declaration(self, ):
 
         parameter_declaration_StartIndex = self.input.index()
@@ -3645,16 +3508,17 @@ class CParser(Parser):
                 elif LA46 == IDENTIFIER:
                     LA46_13 = self.input.LA(2)
 
-                    if (self.synpred86()) :
+                    if (self.synpred86()):
                         alt46 = 1
-                    elif (True) :
+                    elif (True):
                         alt46 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("329:1: parameter_declaration : ( declaration_specifiers ( declarator | abstract_declarator )* ( 'OPTIONAL' )? | ( pointer )* IDENTIFIER );", 46, 13, self.input)
+                        nvae = NoViableAltException(
+                            "329:1: parameter_declaration : ( declaration_specifiers ( declarator | abstract_declarator )* ( 'OPTIONAL' )? | ( pointer )* IDENTIFIER );", 46, 13, self.input)
 
                         raise nvae
 
@@ -3665,30 +3529,31 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("329:1: parameter_declaration : ( declaration_specifiers ( declarator | abstract_declarator )* ( 'OPTIONAL' )? | ( pointer )* IDENTIFIER );", 46, 0, self.input)
+                    nvae = NoViableAltException(
+                        "329:1: parameter_declaration : ( declaration_specifiers ( declarator | abstract_declarator )* ( 'OPTIONAL' )? | ( pointer )* IDENTIFIER );", 46, 0, self.input)
 
                     raise nvae
 
                 if alt46 == 1:
                     # C.g:330:4: declaration_specifiers ( declarator | abstract_declarator )* ( 'OPTIONAL' )?
-                    self.following.append(self.FOLLOW_declaration_specifiers_in_parameter_declaration994)
+                    self.following.append(
+                        self.FOLLOW_declaration_specifiers_in_parameter_declaration994)
                     self.declaration_specifiers()
                     self.following.pop()
                     if self.failed:
                         return
                     # C.g:330:27: ( declarator | abstract_declarator )*
-                    while True: #loop43
+                    while True:  # loop43
                         alt43 = 3
                         LA43 = self.input.LA(1)
                         if LA43 == 66:
                             LA43_5 = self.input.LA(2)
 
-                            if (self.synpred83()) :
+                            if (self.synpred83()):
                                 alt43 = 1
-                            elif (self.synpred84()) :
+                            elif (self.synpred84()):
                                 alt43 = 2
 
-
                         elif LA43 == IDENTIFIER or LA43 == 58 or LA43 == 59 or LA43 == 60:
                             alt43 = 1
                         elif LA43 == 62:
@@ -3698,129 +3563,115 @@ class CParser(Parser):
                             elif LA43 == IDENTIFIER:
                                 LA43_37 = self.input.LA(3)
 
-                                if (self.synpred83()) :
+                                if (self.synpred83()):
                                     alt43 = 1
-                                elif (self.synpred84()) :
+                                elif (self.synpred84()):
                                     alt43 = 2
 
-
                             elif LA43 == 58:
                                 LA43_38 = self.input.LA(3)
 
-                                if (self.synpred83()) :
+                                if (self.synpred83()):
                                     alt43 = 1
-                                elif (self.synpred84()) :
+                                elif (self.synpred84()):
                                     alt43 = 2
 
-
                             elif LA43 == 66:
                                 LA43_39 = self.input.LA(3)
 
-                                if (self.synpred83()) :
+                                if (self.synpred83()):
                                     alt43 = 1
-                                elif (self.synpred84()) :
+                                elif (self.synpred84()):
                                     alt43 = 2
 
-
                             elif LA43 == 59:
                                 LA43_40 = self.input.LA(3)
 
-                                if (self.synpred83()) :
+                                if (self.synpred83()):
                                     alt43 = 1
-                                elif (self.synpred84()) :
+                                elif (self.synpred84()):
                                     alt43 = 2
 
-
                             elif LA43 == 60:
                                 LA43_41 = self.input.LA(3)
 
-                                if (self.synpred83()) :
+                                if (self.synpred83()):
                                     alt43 = 1
-                                elif (self.synpred84()) :
+                                elif (self.synpred84()):
                                     alt43 = 2
 
-
                             elif LA43 == 62:
                                 LA43_43 = self.input.LA(3)
 
-                                if (self.synpred83()) :
+                                if (self.synpred83()):
                                     alt43 = 1
-                                elif (self.synpred84()) :
+                                elif (self.synpred84()):
                                     alt43 = 2
 
-
-
                         elif LA43 == 64:
                             alt43 = 2
 
                         if alt43 == 1:
                             # C.g:330:28: declarator
-                            self.following.append(self.FOLLOW_declarator_in_parameter_declaration997)
+                            self.following.append(
+                                self.FOLLOW_declarator_in_parameter_declaration997)
                             self.declarator()
                             self.following.pop()
                             if self.failed:
                                 return
 
-
                         elif alt43 == 2:
                             # C.g:330:39: abstract_declarator
-                            self.following.append(self.FOLLOW_abstract_declarator_in_parameter_declaration999)
+                            self.following.append(
+                                self.FOLLOW_abstract_declarator_in_parameter_declaration999)
                             self.abstract_declarator()
                             self.following.pop()
                             if self.failed:
                                 return
 
-
                         else:
-                            break #loop43
-
+                            break  # loop43
 
                     # C.g:330:61: ( 'OPTIONAL' )?
                     alt44 = 2
                     LA44_0 = self.input.LA(1)
 
-                    if (LA44_0 == 53) :
+                    if (LA44_0 == 53):
                         alt44 = 1
                     if alt44 == 1:
                         # C.g:330:62: 'OPTIONAL'
-                        self.match(self.input, 53, self.FOLLOW_53_in_parameter_declaration1004)
+                        self.match(self.input, 53,
+                                   self.FOLLOW_53_in_parameter_declaration1004)
                         if self.failed:
                             return
 
-
-
-
-
                 elif alt46 == 2:
                     # C.g:332:4: ( pointer )* IDENTIFIER
                     # C.g:332:4: ( pointer )*
-                    while True: #loop45
+                    while True:  # loop45
                         alt45 = 2
                         LA45_0 = self.input.LA(1)
 
-                        if (LA45_0 == 66) :
+                        if (LA45_0 == 66):
                             alt45 = 1
 
-
                         if alt45 == 1:
                             # C.g:0:0: pointer
-                            self.following.append(self.FOLLOW_pointer_in_parameter_declaration1013)
+                            self.following.append(
+                                self.FOLLOW_pointer_in_parameter_declaration1013)
                             self.pointer()
                             self.following.pop()
                             if self.failed:
                                 return
 
-
                         else:
-                            break #loop45
+                            break  # loop45
 
-
-                    self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_parameter_declaration1016)
+                    self.match(self.input, IDENTIFIER,
+                               self.FOLLOW_IDENTIFIER_in_parameter_declaration1016)
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -3834,9 +3685,9 @@ class CParser(Parser):
 
     # $ANTLR end parameter_declaration
 
-
     # $ANTLR start identifier_list
     # C.g:335:1: identifier_list : IDENTIFIER ( ',' IDENTIFIER )* ;
+
     def identifier_list(self, ):
 
         identifier_list_StartIndex = self.input.index()
@@ -3847,35 +3698,31 @@ class CParser(Parser):
 
                 # C.g:336:2: ( IDENTIFIER ( ',' IDENTIFIER )* )
                 # C.g:336:4: IDENTIFIER ( ',' IDENTIFIER )*
-                self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_identifier_list1027)
+                self.match(self.input, IDENTIFIER,
+                           self.FOLLOW_IDENTIFIER_in_identifier_list1027)
                 if self.failed:
                     return
                 # C.g:337:2: ( ',' IDENTIFIER )*
-                while True: #loop47
+                while True:  # loop47
                     alt47 = 2
                     LA47_0 = self.input.LA(1)
 
-                    if (LA47_0 == 27) :
+                    if (LA47_0 == 27):
                         alt47 = 1
 
-
                     if alt47 == 1:
                         # C.g:337:3: ',' IDENTIFIER
-                        self.match(self.input, 27, self.FOLLOW_27_in_identifier_list1031)
+                        self.match(self.input, 27,
+                                   self.FOLLOW_27_in_identifier_list1031)
                         if self.failed:
                             return
-                        self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_identifier_list1033)
+                        self.match(self.input, IDENTIFIER,
+                                   self.FOLLOW_IDENTIFIER_in_identifier_list1033)
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop47
-
-
-
-
-
+                        break  # loop47
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -3890,9 +3737,9 @@ class CParser(Parser):
 
     # $ANTLR end identifier_list
 
-
     # $ANTLR start type_name
     # C.g:340:1: type_name : ( specifier_qualifier_list ( abstract_declarator )? | type_id );
+
     def type_name(self, ):
 
         type_name_StartIndex = self.input.index()
@@ -3905,21 +3752,22 @@ class CParser(Parser):
                 alt49 = 2
                 LA49_0 = self.input.LA(1)
 
-                if ((34 <= LA49_0 <= 42) or (45 <= LA49_0 <= 46) or (48 <= LA49_0 <= 61)) :
+                if ((34 <= LA49_0 <= 42) or (45 <= LA49_0 <= 46) or (48 <= LA49_0 <= 61)):
                     alt49 = 1
-                elif (LA49_0 == IDENTIFIER) :
+                elif (LA49_0 == IDENTIFIER):
                     LA49_13 = self.input.LA(2)
 
-                    if (self.synpred90()) :
+                    if (self.synpred90()):
                         alt49 = 1
-                    elif (True) :
+                    elif (True):
                         alt49 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("340:1: type_name : ( specifier_qualifier_list ( abstract_declarator )? | type_id );", 49, 13, self.input)
+                        nvae = NoViableAltException(
+                            "340:1: type_name : ( specifier_qualifier_list ( abstract_declarator )? | type_id );", 49, 13, self.input)
 
                         raise nvae
 
@@ -3928,13 +3776,15 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("340:1: type_name : ( specifier_qualifier_list ( abstract_declarator )? | type_id );", 49, 0, self.input)
+                    nvae = NoViableAltException(
+                        "340:1: type_name : ( specifier_qualifier_list ( abstract_declarator )? | type_id );", 49, 0, self.input)
 
                     raise nvae
 
                 if alt49 == 1:
                     # C.g:341:4: specifier_qualifier_list ( abstract_declarator )?
-                    self.following.append(self.FOLLOW_specifier_qualifier_list_in_type_name1046)
+                    self.following.append(
+                        self.FOLLOW_specifier_qualifier_list_in_type_name1046)
                     self.specifier_qualifier_list()
                     self.following.pop()
                     if self.failed:
@@ -3943,20 +3793,17 @@ class CParser(Parser):
                     alt48 = 2
                     LA48_0 = self.input.LA(1)
 
-                    if (LA48_0 == 62 or LA48_0 == 64 or LA48_0 == 66) :
+                    if (LA48_0 == 62 or LA48_0 == 64 or LA48_0 == 66):
                         alt48 = 1
                     if alt48 == 1:
                         # C.g:0:0: abstract_declarator
-                        self.following.append(self.FOLLOW_abstract_declarator_in_type_name1048)
+                        self.following.append(
+                            self.FOLLOW_abstract_declarator_in_type_name1048)
                         self.abstract_declarator()
                         self.following.pop()
                         if self.failed:
                             return
 
-
-
-
-
                 elif alt49 == 2:
                     # C.g:342:4: type_id
                     self.following.append(self.FOLLOW_type_id_in_type_name1054)
@@ -3965,8 +3812,6 @@ class CParser(Parser):
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -3980,9 +3825,9 @@ class CParser(Parser):
 
     # $ANTLR end type_name
 
-
     # $ANTLR start abstract_declarator
     # C.g:345:1: abstract_declarator : ( pointer ( direct_abstract_declarator )? | direct_abstract_declarator );
+
     def abstract_declarator(self, ):
 
         abstract_declarator_StartIndex = self.input.index()
@@ -3995,22 +3840,24 @@ class CParser(Parser):
                 alt51 = 2
                 LA51_0 = self.input.LA(1)
 
-                if (LA51_0 == 66) :
+                if (LA51_0 == 66):
                     alt51 = 1
-                elif (LA51_0 == 62 or LA51_0 == 64) :
+                elif (LA51_0 == 62 or LA51_0 == 64):
                     alt51 = 2
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("345:1: abstract_declarator : ( pointer ( direct_abstract_declarator )? | direct_abstract_declarator );", 51, 0, self.input)
+                    nvae = NoViableAltException(
+                        "345:1: abstract_declarator : ( pointer ( direct_abstract_declarator )? | direct_abstract_declarator );", 51, 0, self.input)
 
                     raise nvae
 
                 if alt51 == 1:
                     # C.g:346:4: pointer ( direct_abstract_declarator )?
-                    self.following.append(self.FOLLOW_pointer_in_abstract_declarator1065)
+                    self.following.append(
+                        self.FOLLOW_pointer_in_abstract_declarator1065)
                     self.pointer()
                     self.following.pop()
                     if self.failed:
@@ -4019,202 +3866,198 @@ class CParser(Parser):
                     alt50 = 2
                     LA50_0 = self.input.LA(1)
 
-                    if (LA50_0 == 62) :
+                    if (LA50_0 == 62):
                         LA50 = self.input.LA(2)
                         if LA50 == 63:
                             LA50_12 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 58:
                             LA50_13 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 66:
                             LA50_14 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 59:
                             LA50_15 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 60:
                             LA50_16 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == IDENTIFIER:
                             LA50_17 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 62:
                             LA50_18 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 64:
                             LA50_19 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 29 or LA50 == 30 or LA50 == 31 or LA50 == 32 or LA50 == 33:
                             LA50_20 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 34:
                             LA50_21 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 35:
                             LA50_22 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 36:
                             LA50_23 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 37:
                             LA50_24 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 38:
                             LA50_25 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 39:
                             LA50_26 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 40:
                             LA50_27 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 41:
                             LA50_28 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 42:
                             LA50_29 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 45 or LA50 == 46:
                             LA50_30 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 48:
                             LA50_31 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 49 or LA50 == 50 or LA50 == 51 or LA50 == 52 or LA50 == 53 or LA50 == 54 or LA50 == 55 or LA50 == 56 or LA50 == 57 or LA50 == 61:
                             LA50_32 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
-                    elif (LA50_0 == 64) :
+                    elif (LA50_0 == 64):
                         LA50 = self.input.LA(2)
                         if LA50 == 65:
                             LA50_33 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 62:
                             LA50_34 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == IDENTIFIER:
                             LA50_35 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == HEX_LITERAL:
                             LA50_36 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == OCTAL_LITERAL:
                             LA50_37 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == DECIMAL_LITERAL:
                             LA50_38 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == CHARACTER_LITERAL:
                             LA50_39 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == STRING_LITERAL:
                             LA50_40 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == FLOATING_POINT_LITERAL:
                             LA50_41 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 72:
                             LA50_42 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 73:
                             LA50_43 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 66 or LA50 == 68 or LA50 == 69 or LA50 == 77 or LA50 == 78 or LA50 == 79:
                             LA50_44 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 74:
                             LA50_45 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                     if alt50 == 1:
                         # C.g:0:0: direct_abstract_declarator
-                        self.following.append(self.FOLLOW_direct_abstract_declarator_in_abstract_declarator1067)
+                        self.following.append(
+                            self.FOLLOW_direct_abstract_declarator_in_abstract_declarator1067)
                         self.direct_abstract_declarator()
                         self.following.pop()
                         if self.failed:
                             return
 
-
-
-
-
                 elif alt51 == 2:
                     # C.g:347:4: direct_abstract_declarator
-                    self.following.append(self.FOLLOW_direct_abstract_declarator_in_abstract_declarator1073)
+                    self.following.append(
+                        self.FOLLOW_direct_abstract_declarator_in_abstract_declarator1073)
                     self.direct_abstract_declarator()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -4228,9 +4071,9 @@ class CParser(Parser):
 
     # $ANTLR end abstract_declarator
 
-
     # $ANTLR start direct_abstract_declarator
     # C.g:350:1: direct_abstract_declarator : ( '(' abstract_declarator ')' | abstract_declarator_suffix ) ( abstract_declarator_suffix )* ;
+
     def direct_abstract_declarator(self, ):
 
         direct_abstract_declarator_StartIndex = self.input.index()
@@ -4245,23 +4088,24 @@ class CParser(Parser):
                 alt52 = 2
                 LA52_0 = self.input.LA(1)
 
-                if (LA52_0 == 62) :
+                if (LA52_0 == 62):
                     LA52 = self.input.LA(2)
                     if LA52 == IDENTIFIER or LA52 == 29 or LA52 == 30 or LA52 == 31 or LA52 == 32 or LA52 == 33 or LA52 == 34 or LA52 == 35 or LA52 == 36 or LA52 == 37 or LA52 == 38 or LA52 == 39 or LA52 == 40 or LA52 == 41 or LA52 == 42 or LA52 == 45 or LA52 == 46 or LA52 == 48 or LA52 == 49 or LA52 == 50 or LA52 == 51 or LA52 == 52 or LA52 == 53 or LA52 == 54 or LA52 == 55 or LA52 == 56 or LA52 == 57 or LA52 == 58 or LA52 == 59 or LA52 == 60 or LA52 == 61 or LA52 == 63:
                         alt52 = 2
                     elif LA52 == 66:
                         LA52_18 = self.input.LA(3)
 
-                        if (self.synpred93()) :
+                        if (self.synpred93()):
                             alt52 = 1
-                        elif (True) :
+                        elif (True):
                             alt52 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("351:4: ( '(' abstract_declarator ')' | abstract_declarator_suffix )", 52, 18, self.input)
+                            nvae = NoViableAltException(
+                                "351:4: ( '(' abstract_declarator ')' | abstract_declarator_suffix )", 52, 18, self.input)
 
                             raise nvae
 
@@ -4272,306 +4116,269 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("351:4: ( '(' abstract_declarator ')' | abstract_declarator_suffix )", 52, 1, self.input)
+                        nvae = NoViableAltException(
+                            "351:4: ( '(' abstract_declarator ')' | abstract_declarator_suffix )", 52, 1, self.input)
 
                         raise nvae
 
-                elif (LA52_0 == 64) :
+                elif (LA52_0 == 64):
                     alt52 = 2
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("351:4: ( '(' abstract_declarator ')' | abstract_declarator_suffix )", 52, 0, self.input)
+                    nvae = NoViableAltException(
+                        "351:4: ( '(' abstract_declarator ')' | abstract_declarator_suffix )", 52, 0, self.input)
 
                     raise nvae
 
                 if alt52 == 1:
                     # C.g:351:6: '(' abstract_declarator ')'
-                    self.match(self.input, 62, self.FOLLOW_62_in_direct_abstract_declarator1086)
+                    self.match(
+                        self.input, 62, self.FOLLOW_62_in_direct_abstract_declarator1086)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_abstract_declarator_in_direct_abstract_declarator1088)
+                    self.following.append(
+                        self.FOLLOW_abstract_declarator_in_direct_abstract_declarator1088)
                     self.abstract_declarator()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_direct_abstract_declarator1090)
+                    self.match(
+                        self.input, 63, self.FOLLOW_63_in_direct_abstract_declarator1090)
                     if self.failed:
                         return
 
-
                 elif alt52 == 2:
                     # C.g:351:36: abstract_declarator_suffix
-                    self.following.append(self.FOLLOW_abstract_declarator_suffix_in_direct_abstract_declarator1094)
+                    self.following.append(
+                        self.FOLLOW_abstract_declarator_suffix_in_direct_abstract_declarator1094)
                     self.abstract_declarator_suffix()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
                 # C.g:351:65: ( abstract_declarator_suffix )*
-                while True: #loop53
+                while True:  # loop53
                     alt53 = 2
                     LA53_0 = self.input.LA(1)
 
-                    if (LA53_0 == 62) :
+                    if (LA53_0 == 62):
                         LA53 = self.input.LA(2)
                         if LA53 == 63:
                             LA53_12 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 58:
                             LA53_13 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 66:
                             LA53_14 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 59:
                             LA53_15 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 60:
                             LA53_16 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == IDENTIFIER:
                             LA53_17 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 29 or LA53 == 30 or LA53 == 31 or LA53 == 32 or LA53 == 33:
                             LA53_19 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 34:
                             LA53_20 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 35:
                             LA53_21 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 36:
                             LA53_22 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 37:
                             LA53_23 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 38:
                             LA53_24 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 39:
                             LA53_25 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 40:
                             LA53_26 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 41:
                             LA53_27 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 42:
                             LA53_28 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 45 or LA53 == 46:
                             LA53_29 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 48:
                             LA53_30 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 49 or LA53 == 50 or LA53 == 51 or LA53 == 52 or LA53 == 53 or LA53 == 54 or LA53 == 55 or LA53 == 56 or LA53 == 57 or LA53 == 61:
                             LA53_31 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
-
-                    elif (LA53_0 == 64) :
+                    elif (LA53_0 == 64):
                         LA53 = self.input.LA(2)
                         if LA53 == 65:
                             LA53_33 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 62:
                             LA53_34 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == IDENTIFIER:
                             LA53_35 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == HEX_LITERAL:
                             LA53_36 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == OCTAL_LITERAL:
                             LA53_37 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == DECIMAL_LITERAL:
                             LA53_38 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == CHARACTER_LITERAL:
                             LA53_39 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == STRING_LITERAL:
                             LA53_40 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == FLOATING_POINT_LITERAL:
                             LA53_41 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 72:
                             LA53_42 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 73:
                             LA53_43 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 66 or LA53 == 68 or LA53 == 69 or LA53 == 77 or LA53 == 78 or LA53 == 79:
                             LA53_44 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 74:
                             LA53_45 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
-
-
-
                     if alt53 == 1:
                         # C.g:0:0: abstract_declarator_suffix
-                        self.following.append(self.FOLLOW_abstract_declarator_suffix_in_direct_abstract_declarator1098)
+                        self.following.append(
+                            self.FOLLOW_abstract_declarator_suffix_in_direct_abstract_declarator1098)
                         self.abstract_declarator_suffix()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop53
-
-
-
-
-
+                        break  # loop53
 
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
             if self.backtracking > 0:
-                self.memoize(self.input, 32, direct_abstract_declarator_StartIndex)
+                self.memoize(self.input, 32,
+                             direct_abstract_declarator_StartIndex)
 
             pass
 
@@ -4579,9 +4386,9 @@ class CParser(Parser):
 
     # $ANTLR end direct_abstract_declarator
 
-
     # $ANTLR start abstract_declarator_suffix
     # C.g:354:1: abstract_declarator_suffix : ( '[' ']' | '[' constant_expression ']' | '(' ')' | '(' parameter_type_list ')' );
+
     def abstract_declarator_suffix(self, ):
 
         abstract_declarator_suffix_StartIndex = self.input.index()
@@ -4594,35 +4401,37 @@ class CParser(Parser):
                 alt54 = 4
                 LA54_0 = self.input.LA(1)
 
-                if (LA54_0 == 64) :
+                if (LA54_0 == 64):
                     LA54_1 = self.input.LA(2)
 
-                    if (LA54_1 == 65) :
+                    if (LA54_1 == 65):
                         alt54 = 1
-                    elif ((IDENTIFIER <= LA54_1 <= FLOATING_POINT_LITERAL) or LA54_1 == 62 or LA54_1 == 66 or (68 <= LA54_1 <= 69) or (72 <= LA54_1 <= 74) or (77 <= LA54_1 <= 79)) :
+                    elif ((IDENTIFIER <= LA54_1 <= FLOATING_POINT_LITERAL) or LA54_1 == 62 or LA54_1 == 66 or (68 <= LA54_1 <= 69) or (72 <= LA54_1 <= 74) or (77 <= LA54_1 <= 79)):
                         alt54 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("354:1: abstract_declarator_suffix : ( '[' ']' | '[' constant_expression ']' | '(' ')' | '(' parameter_type_list ')' );", 54, 1, self.input)
+                        nvae = NoViableAltException(
+                            "354:1: abstract_declarator_suffix : ( '[' ']' | '[' constant_expression ']' | '(' ')' | '(' parameter_type_list ')' );", 54, 1, self.input)
 
                         raise nvae
 
-                elif (LA54_0 == 62) :
+                elif (LA54_0 == 62):
                     LA54_2 = self.input.LA(2)
 
-                    if (LA54_2 == 63) :
+                    if (LA54_2 == 63):
                         alt54 = 3
-                    elif (LA54_2 == IDENTIFIER or (29 <= LA54_2 <= 42) or (45 <= LA54_2 <= 46) or (48 <= LA54_2 <= 61) or LA54_2 == 66) :
+                    elif (LA54_2 == IDENTIFIER or (29 <= LA54_2 <= 42) or (45 <= LA54_2 <= 46) or (48 <= LA54_2 <= 61) or LA54_2 == 66):
                         alt54 = 4
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("354:1: abstract_declarator_suffix : ( '[' ']' | '[' constant_expression ']' | '(' ')' | '(' parameter_type_list ')' );", 54, 2, self.input)
+                        nvae = NoViableAltException(
+                            "354:1: abstract_declarator_suffix : ( '[' ']' | '[' constant_expression ']' | '(' ')' | '(' parameter_type_list ')' );", 54, 2, self.input)
 
                         raise nvae
 
@@ -4631,67 +4440,74 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("354:1: abstract_declarator_suffix : ( '[' ']' | '[' constant_expression ']' | '(' ')' | '(' parameter_type_list ')' );", 54, 0, self.input)
+                    nvae = NoViableAltException(
+                        "354:1: abstract_declarator_suffix : ( '[' ']' | '[' constant_expression ']' | '(' ')' | '(' parameter_type_list ')' );", 54, 0, self.input)
 
                     raise nvae
 
                 if alt54 == 1:
                     # C.g:355:4: '[' ']'
-                    self.match(self.input, 64, self.FOLLOW_64_in_abstract_declarator_suffix1110)
+                    self.match(
+                        self.input, 64, self.FOLLOW_64_in_abstract_declarator_suffix1110)
                     if self.failed:
                         return
-                    self.match(self.input, 65, self.FOLLOW_65_in_abstract_declarator_suffix1112)
+                    self.match(
+                        self.input, 65, self.FOLLOW_65_in_abstract_declarator_suffix1112)
                     if self.failed:
                         return
 
-
                 elif alt54 == 2:
                     # C.g:356:4: '[' constant_expression ']'
-                    self.match(self.input, 64, self.FOLLOW_64_in_abstract_declarator_suffix1117)
+                    self.match(
+                        self.input, 64, self.FOLLOW_64_in_abstract_declarator_suffix1117)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_constant_expression_in_abstract_declarator_suffix1119)
+                    self.following.append(
+                        self.FOLLOW_constant_expression_in_abstract_declarator_suffix1119)
                     self.constant_expression()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 65, self.FOLLOW_65_in_abstract_declarator_suffix1121)
+                    self.match(
+                        self.input, 65, self.FOLLOW_65_in_abstract_declarator_suffix1121)
                     if self.failed:
                         return
 
-
                 elif alt54 == 3:
                     # C.g:357:4: '(' ')'
-                    self.match(self.input, 62, self.FOLLOW_62_in_abstract_declarator_suffix1126)
+                    self.match(
+                        self.input, 62, self.FOLLOW_62_in_abstract_declarator_suffix1126)
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_abstract_declarator_suffix1128)
+                    self.match(
+                        self.input, 63, self.FOLLOW_63_in_abstract_declarator_suffix1128)
                     if self.failed:
                         return
 
-
                 elif alt54 == 4:
                     # C.g:358:4: '(' parameter_type_list ')'
-                    self.match(self.input, 62, self.FOLLOW_62_in_abstract_declarator_suffix1133)
+                    self.match(
+                        self.input, 62, self.FOLLOW_62_in_abstract_declarator_suffix1133)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_parameter_type_list_in_abstract_declarator_suffix1135)
+                    self.following.append(
+                        self.FOLLOW_parameter_type_list_in_abstract_declarator_suffix1135)
                     self.parameter_type_list()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_abstract_declarator_suffix1137)
+                    self.match(
+                        self.input, 63, self.FOLLOW_63_in_abstract_declarator_suffix1137)
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
             if self.backtracking > 0:
-                self.memoize(self.input, 33, abstract_declarator_suffix_StartIndex)
+                self.memoize(self.input, 33,
+                             abstract_declarator_suffix_StartIndex)
 
             pass
 
@@ -4699,9 +4515,9 @@ class CParser(Parser):
 
     # $ANTLR end abstract_declarator_suffix
 
-
     # $ANTLR start initializer
     # C.g:361:1: initializer : ( assignment_expression | '{' initializer_list ( ',' )? '}' );
+
     def initializer(self, ):
 
         initializer_StartIndex = self.input.index()
@@ -4714,34 +4530,37 @@ class CParser(Parser):
                 alt56 = 2
                 LA56_0 = self.input.LA(1)
 
-                if ((IDENTIFIER <= LA56_0 <= FLOATING_POINT_LITERAL) or LA56_0 == 62 or LA56_0 == 66 or (68 <= LA56_0 <= 69) or (72 <= LA56_0 <= 74) or (77 <= LA56_0 <= 79)) :
+                if ((IDENTIFIER <= LA56_0 <= FLOATING_POINT_LITERAL) or LA56_0 == 62 or LA56_0 == 66 or (68 <= LA56_0 <= 69) or (72 <= LA56_0 <= 74) or (77 <= LA56_0 <= 79)):
                     alt56 = 1
-                elif (LA56_0 == 43) :
+                elif (LA56_0 == 43):
                     alt56 = 2
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("361:1: initializer : ( assignment_expression | '{' initializer_list ( ',' )? '}' );", 56, 0, self.input)
+                    nvae = NoViableAltException(
+                        "361:1: initializer : ( assignment_expression | '{' initializer_list ( ',' )? '}' );", 56, 0, self.input)
 
                     raise nvae
 
                 if alt56 == 1:
                     # C.g:363:4: assignment_expression
-                    self.following.append(self.FOLLOW_assignment_expression_in_initializer1150)
+                    self.following.append(
+                        self.FOLLOW_assignment_expression_in_initializer1150)
                     self.assignment_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt56 == 2:
                     # C.g:364:4: '{' initializer_list ( ',' )? '}'
-                    self.match(self.input, 43, self.FOLLOW_43_in_initializer1155)
+                    self.match(self.input, 43,
+                               self.FOLLOW_43_in_initializer1155)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_initializer_list_in_initializer1157)
+                    self.following.append(
+                        self.FOLLOW_initializer_list_in_initializer1157)
                     self.initializer_list()
                     self.following.pop()
                     if self.failed:
@@ -4750,22 +4569,20 @@ class CParser(Parser):
                     alt55 = 2
                     LA55_0 = self.input.LA(1)
 
-                    if (LA55_0 == 27) :
+                    if (LA55_0 == 27):
                         alt55 = 1
                     if alt55 == 1:
                         # C.g:0:0: ','
-                        self.match(self.input, 27, self.FOLLOW_27_in_initializer1159)
+                        self.match(self.input, 27,
+                                   self.FOLLOW_27_in_initializer1159)
                         if self.failed:
                             return
 
-
-
-                    self.match(self.input, 44, self.FOLLOW_44_in_initializer1162)
+                    self.match(self.input, 44,
+                               self.FOLLOW_44_in_initializer1162)
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -4779,9 +4596,9 @@ class CParser(Parser):
 
     # $ANTLR end initializer
 
-
     # $ANTLR start initializer_list
     # C.g:367:1: initializer_list : initializer ( ',' initializer )* ;
+
     def initializer_list(self, ):
 
         initializer_list_StartIndex = self.input.index()
@@ -4792,44 +4609,38 @@ class CParser(Parser):
 
                 # C.g:368:2: ( initializer ( ',' initializer )* )
                 # C.g:368:4: initializer ( ',' initializer )*
-                self.following.append(self.FOLLOW_initializer_in_initializer_list1173)
+                self.following.append(
+                    self.FOLLOW_initializer_in_initializer_list1173)
                 self.initializer()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:368:16: ( ',' initializer )*
-                while True: #loop57
+                while True:  # loop57
                     alt57 = 2
                     LA57_0 = self.input.LA(1)
 
-                    if (LA57_0 == 27) :
+                    if (LA57_0 == 27):
                         LA57_1 = self.input.LA(2)
 
-                        if ((IDENTIFIER <= LA57_1 <= FLOATING_POINT_LITERAL) or LA57_1 == 43 or LA57_1 == 62 or LA57_1 == 66 or (68 <= LA57_1 <= 69) or (72 <= LA57_1 <= 74) or (77 <= LA57_1 <= 79)) :
+                        if ((IDENTIFIER <= LA57_1 <= FLOATING_POINT_LITERAL) or LA57_1 == 43 or LA57_1 == 62 or LA57_1 == 66 or (68 <= LA57_1 <= 69) or (72 <= LA57_1 <= 74) or (77 <= LA57_1 <= 79)):
                             alt57 = 1
 
-
-
-
                     if alt57 == 1:
                         # C.g:368:17: ',' initializer
-                        self.match(self.input, 27, self.FOLLOW_27_in_initializer_list1176)
+                        self.match(self.input, 27,
+                                   self.FOLLOW_27_in_initializer_list1176)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_initializer_in_initializer_list1178)
+                        self.following.append(
+                            self.FOLLOW_initializer_in_initializer_list1178)
                         self.initializer()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop57
-
-
-
-
-
+                        break  # loop57
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -4849,10 +4660,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start argument_expression_list
     # C.g:373:1: argument_expression_list : assignment_expression ( 'OPTIONAL' )? ( ',' assignment_expression ( 'OPTIONAL' )? )* ;
+
     def argument_expression_list(self, ):
 
         retval = self.argument_expression_list_return()
@@ -4865,7 +4675,8 @@ class CParser(Parser):
 
                 # C.g:374:2: ( assignment_expression ( 'OPTIONAL' )? ( ',' assignment_expression ( 'OPTIONAL' )? )* )
                 # C.g:374:6: assignment_expression ( 'OPTIONAL' )? ( ',' assignment_expression ( 'OPTIONAL' )? )*
-                self.following.append(self.FOLLOW_assignment_expression_in_argument_expression_list1196)
+                self.following.append(
+                    self.FOLLOW_assignment_expression_in_argument_expression_list1196)
                 self.assignment_expression()
                 self.following.pop()
                 if self.failed:
@@ -4874,31 +4685,31 @@ class CParser(Parser):
                 alt58 = 2
                 LA58_0 = self.input.LA(1)
 
-                if (LA58_0 == 53) :
+                if (LA58_0 == 53):
                     alt58 = 1
                 if alt58 == 1:
                     # C.g:374:29: 'OPTIONAL'
-                    self.match(self.input, 53, self.FOLLOW_53_in_argument_expression_list1199)
+                    self.match(self.input, 53,
+                               self.FOLLOW_53_in_argument_expression_list1199)
                     if self.failed:
                         return retval
 
-
-
                 # C.g:374:42: ( ',' assignment_expression ( 'OPTIONAL' )? )*
-                while True: #loop60
+                while True:  # loop60
                     alt60 = 2
                     LA60_0 = self.input.LA(1)
 
-                    if (LA60_0 == 27) :
+                    if (LA60_0 == 27):
                         alt60 = 1
 
-
                     if alt60 == 1:
                         # C.g:374:43: ',' assignment_expression ( 'OPTIONAL' )?
-                        self.match(self.input, 27, self.FOLLOW_27_in_argument_expression_list1204)
+                        self.match(
+                            self.input, 27, self.FOLLOW_27_in_argument_expression_list1204)
                         if self.failed:
                             return retval
-                        self.following.append(self.FOLLOW_assignment_expression_in_argument_expression_list1206)
+                        self.following.append(
+                            self.FOLLOW_assignment_expression_in_argument_expression_list1206)
                         self.assignment_expression()
                         self.following.pop()
                         if self.failed:
@@ -4907,34 +4718,27 @@ class CParser(Parser):
                         alt59 = 2
                         LA59_0 = self.input.LA(1)
 
-                        if (LA59_0 == 53) :
+                        if (LA59_0 == 53):
                             alt59 = 1
                         if alt59 == 1:
                             # C.g:374:70: 'OPTIONAL'
-                            self.match(self.input, 53, self.FOLLOW_53_in_argument_expression_list1209)
+                            self.match(
+                                self.input, 53, self.FOLLOW_53_in_argument_expression_list1209)
                             if self.failed:
                                 return retval
 
-
-
-
-
                     else:
-                        break #loop60
-
-
-
-
+                        break  # loop60
 
                 retval.stop = self.input.LT(-1)
 
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
             if self.backtracking > 0:
-                self.memoize(self.input, 36, argument_expression_list_StartIndex)
+                self.memoize(self.input, 36,
+                             argument_expression_list_StartIndex)
 
             pass
 
@@ -4942,9 +4746,9 @@ class CParser(Parser):
 
     # $ANTLR end argument_expression_list
 
-
     # $ANTLR start additive_expression
     # C.g:377:1: additive_expression : ( multiplicative_expression ) ( '+' multiplicative_expression | '-' multiplicative_expression )* ;
+
     def additive_expression(self, ):
 
         additive_expression_StartIndex = self.input.index()
@@ -4957,56 +4761,51 @@ class CParser(Parser):
                 # C.g:378:4: ( multiplicative_expression ) ( '+' multiplicative_expression | '-' multiplicative_expression )*
                 # C.g:378:4: ( multiplicative_expression )
                 # C.g:378:5: multiplicative_expression
-                self.following.append(self.FOLLOW_multiplicative_expression_in_additive_expression1225)
+                self.following.append(
+                    self.FOLLOW_multiplicative_expression_in_additive_expression1225)
                 self.multiplicative_expression()
                 self.following.pop()
                 if self.failed:
                     return
 
-
-
                 # C.g:378:32: ( '+' multiplicative_expression | '-' multiplicative_expression )*
-                while True: #loop61
+                while True:  # loop61
                     alt61 = 3
                     LA61_0 = self.input.LA(1)
 
-                    if (LA61_0 == 68) :
+                    if (LA61_0 == 68):
                         alt61 = 1
-                    elif (LA61_0 == 69) :
+                    elif (LA61_0 == 69):
                         alt61 = 2
 
-
                     if alt61 == 1:
                         # C.g:378:33: '+' multiplicative_expression
-                        self.match(self.input, 68, self.FOLLOW_68_in_additive_expression1229)
+                        self.match(self.input, 68,
+                                   self.FOLLOW_68_in_additive_expression1229)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_multiplicative_expression_in_additive_expression1231)
+                        self.following.append(
+                            self.FOLLOW_multiplicative_expression_in_additive_expression1231)
                         self.multiplicative_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     elif alt61 == 2:
                         # C.g:378:65: '-' multiplicative_expression
-                        self.match(self.input, 69, self.FOLLOW_69_in_additive_expression1235)
+                        self.match(self.input, 69,
+                                   self.FOLLOW_69_in_additive_expression1235)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_multiplicative_expression_in_additive_expression1237)
+                        self.following.append(
+                            self.FOLLOW_multiplicative_expression_in_additive_expression1237)
                         self.multiplicative_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop61
-
-
-
-
-
+                        break  # loop61
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -5021,9 +4820,9 @@ class CParser(Parser):
 
     # $ANTLR end additive_expression
 
-
     # $ANTLR start multiplicative_expression
     # C.g:381:1: multiplicative_expression : ( cast_expression ) ( '*' cast_expression | '/' cast_expression | '%' cast_expression )* ;
+
     def multiplicative_expression(self, ):
 
         multiplicative_expression_StartIndex = self.input.index()
@@ -5036,16 +4835,15 @@ class CParser(Parser):
                 # C.g:382:4: ( cast_expression ) ( '*' cast_expression | '/' cast_expression | '%' cast_expression )*
                 # C.g:382:4: ( cast_expression )
                 # C.g:382:5: cast_expression
-                self.following.append(self.FOLLOW_cast_expression_in_multiplicative_expression1251)
+                self.following.append(
+                    self.FOLLOW_cast_expression_in_multiplicative_expression1251)
                 self.cast_expression()
                 self.following.pop()
                 if self.failed:
                     return
 
-
-
                 # C.g:382:22: ( '*' cast_expression | '/' cast_expression | '%' cast_expression )*
-                while True: #loop62
+                while True:  # loop62
                     alt62 = 4
                     LA62 = self.input.LA(1)
                     if LA62 == 66:
@@ -5057,54 +4855,53 @@ class CParser(Parser):
 
                     if alt62 == 1:
                         # C.g:382:23: '*' cast_expression
-                        self.match(self.input, 66, self.FOLLOW_66_in_multiplicative_expression1255)
+                        self.match(
+                            self.input, 66, self.FOLLOW_66_in_multiplicative_expression1255)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_cast_expression_in_multiplicative_expression1257)
+                        self.following.append(
+                            self.FOLLOW_cast_expression_in_multiplicative_expression1257)
                         self.cast_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     elif alt62 == 2:
                         # C.g:382:45: '/' cast_expression
-                        self.match(self.input, 70, self.FOLLOW_70_in_multiplicative_expression1261)
+                        self.match(
+                            self.input, 70, self.FOLLOW_70_in_multiplicative_expression1261)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_cast_expression_in_multiplicative_expression1263)
+                        self.following.append(
+                            self.FOLLOW_cast_expression_in_multiplicative_expression1263)
                         self.cast_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     elif alt62 == 3:
                         # C.g:382:67: '%' cast_expression
-                        self.match(self.input, 71, self.FOLLOW_71_in_multiplicative_expression1267)
+                        self.match(
+                            self.input, 71, self.FOLLOW_71_in_multiplicative_expression1267)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_cast_expression_in_multiplicative_expression1269)
+                        self.following.append(
+                            self.FOLLOW_cast_expression_in_multiplicative_expression1269)
                         self.cast_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop62
-
-
-
-
-
+                        break  # loop62
 
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
             if self.backtracking > 0:
-                self.memoize(self.input, 38, multiplicative_expression_StartIndex)
+                self.memoize(self.input, 38,
+                             multiplicative_expression_StartIndex)
 
             pass
 
@@ -5112,9 +4909,9 @@ class CParser(Parser):
 
     # $ANTLR end multiplicative_expression
 
-
     # $ANTLR start cast_expression
     # C.g:385:1: cast_expression : ( '(' type_name ')' cast_expression | unary_expression );
+
     def cast_expression(self, ):
 
         cast_expression_StartIndex = self.input.index()
@@ -5127,23 +4924,24 @@ class CParser(Parser):
                 alt63 = 2
                 LA63_0 = self.input.LA(1)
 
-                if (LA63_0 == 62) :
+                if (LA63_0 == 62):
                     LA63 = self.input.LA(2)
                     if LA63 == 34 or LA63 == 35 or LA63 == 36 or LA63 == 37 or LA63 == 38 or LA63 == 39 or LA63 == 40 or LA63 == 41 or LA63 == 42 or LA63 == 45 or LA63 == 46 or LA63 == 48 or LA63 == 49 or LA63 == 50 or LA63 == 51 or LA63 == 52 or LA63 == 53 or LA63 == 54 or LA63 == 55 or LA63 == 56 or LA63 == 57 or LA63 == 58 or LA63 == 59 or LA63 == 60 or LA63 == 61:
                         alt63 = 1
                     elif LA63 == IDENTIFIER:
                         LA63_25 = self.input.LA(3)
 
-                        if (self.synpred109()) :
+                        if (self.synpred109()):
                             alt63 = 1
-                        elif (True) :
+                        elif (True):
                             alt63 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("385:1: cast_expression : ( '(' type_name ')' cast_expression | unary_expression );", 63, 25, self.input)
+                            nvae = NoViableAltException(
+                                "385:1: cast_expression : ( '(' type_name ')' cast_expression | unary_expression );", 63, 25, self.input)
 
                             raise nvae
 
@@ -5154,51 +4952,55 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("385:1: cast_expression : ( '(' type_name ')' cast_expression | unary_expression );", 63, 1, self.input)
+                        nvae = NoViableAltException(
+                            "385:1: cast_expression : ( '(' type_name ')' cast_expression | unary_expression );", 63, 1, self.input)
 
                         raise nvae
 
-                elif ((IDENTIFIER <= LA63_0 <= FLOATING_POINT_LITERAL) or LA63_0 == 66 or (68 <= LA63_0 <= 69) or (72 <= LA63_0 <= 74) or (77 <= LA63_0 <= 79)) :
+                elif ((IDENTIFIER <= LA63_0 <= FLOATING_POINT_LITERAL) or LA63_0 == 66 or (68 <= LA63_0 <= 69) or (72 <= LA63_0 <= 74) or (77 <= LA63_0 <= 79)):
                     alt63 = 2
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("385:1: cast_expression : ( '(' type_name ')' cast_expression | unary_expression );", 63, 0, self.input)
+                    nvae = NoViableAltException(
+                        "385:1: cast_expression : ( '(' type_name ')' cast_expression | unary_expression );", 63, 0, self.input)
 
                     raise nvae
 
                 if alt63 == 1:
                     # C.g:386:4: '(' type_name ')' cast_expression
-                    self.match(self.input, 62, self.FOLLOW_62_in_cast_expression1282)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_cast_expression1282)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_type_name_in_cast_expression1284)
+                    self.following.append(
+                        self.FOLLOW_type_name_in_cast_expression1284)
                     self.type_name()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_cast_expression1286)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_cast_expression1286)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_cast_expression_in_cast_expression1288)
+                    self.following.append(
+                        self.FOLLOW_cast_expression_in_cast_expression1288)
                     self.cast_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt63 == 2:
                     # C.g:387:4: unary_expression
-                    self.following.append(self.FOLLOW_unary_expression_in_cast_expression1293)
+                    self.following.append(
+                        self.FOLLOW_unary_expression_in_cast_expression1293)
                     self.unary_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -5212,9 +5014,9 @@ class CParser(Parser):
 
     # $ANTLR end cast_expression
 
-
     # $ANTLR start unary_expression
     # C.g:390:1: unary_expression : ( postfix_expression | '++' unary_expression | '--' unary_expression | unary_operator cast_expression | 'sizeof' unary_expression | 'sizeof' '(' type_name ')' );
+
     def unary_expression(self, ):
 
         unary_expression_StartIndex = self.input.index()
@@ -5237,30 +5039,32 @@ class CParser(Parser):
                 elif LA64 == 74:
                     LA64_12 = self.input.LA(2)
 
-                    if (LA64_12 == 62) :
+                    if (LA64_12 == 62):
                         LA64_13 = self.input.LA(3)
 
-                        if (self.synpred114()) :
+                        if (self.synpred114()):
                             alt64 = 5
-                        elif (True) :
+                        elif (True):
                             alt64 = 6
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("390:1: unary_expression : ( postfix_expression | '++' unary_expression | '--' unary_expression | unary_operator cast_expression | 'sizeof' unary_expression | 'sizeof' '(' type_name ')' );", 64, 13, self.input)
+                            nvae = NoViableAltException(
+                                "390:1: unary_expression : ( postfix_expression | '++' unary_expression | '--' unary_expression | unary_operator cast_expression | 'sizeof' unary_expression | 'sizeof' '(' type_name ')' );", 64, 13, self.input)
 
                             raise nvae
 
-                    elif ((IDENTIFIER <= LA64_12 <= FLOATING_POINT_LITERAL) or LA64_12 == 66 or (68 <= LA64_12 <= 69) or (72 <= LA64_12 <= 74) or (77 <= LA64_12 <= 79)) :
+                    elif ((IDENTIFIER <= LA64_12 <= FLOATING_POINT_LITERAL) or LA64_12 == 66 or (68 <= LA64_12 <= 69) or (72 <= LA64_12 <= 74) or (77 <= LA64_12 <= 79)):
                         alt64 = 5
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("390:1: unary_expression : ( postfix_expression | '++' unary_expression | '--' unary_expression | unary_operator cast_expression | 'sizeof' unary_expression | 'sizeof' '(' type_name ')' );", 64, 12, self.input)
+                        nvae = NoViableAltException(
+                            "390:1: unary_expression : ( postfix_expression | '++' unary_expression | '--' unary_expression | unary_operator cast_expression | 'sizeof' unary_expression | 'sizeof' '(' type_name ')' );", 64, 12, self.input)
 
                         raise nvae
 
@@ -5269,88 +5073,95 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("390:1: unary_expression : ( postfix_expression | '++' unary_expression | '--' unary_expression | unary_operator cast_expression | 'sizeof' unary_expression | 'sizeof' '(' type_name ')' );", 64, 0, self.input)
+                    nvae = NoViableAltException(
+                        "390:1: unary_expression : ( postfix_expression | '++' unary_expression | '--' unary_expression | unary_operator cast_expression | 'sizeof' unary_expression | 'sizeof' '(' type_name ')' );", 64, 0, self.input)
 
                     raise nvae
 
                 if alt64 == 1:
                     # C.g:391:4: postfix_expression
-                    self.following.append(self.FOLLOW_postfix_expression_in_unary_expression1304)
+                    self.following.append(
+                        self.FOLLOW_postfix_expression_in_unary_expression1304)
                     self.postfix_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt64 == 2:
                     # C.g:392:4: '++' unary_expression
-                    self.match(self.input, 72, self.FOLLOW_72_in_unary_expression1309)
+                    self.match(self.input, 72,
+                               self.FOLLOW_72_in_unary_expression1309)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_unary_expression_in_unary_expression1311)
+                    self.following.append(
+                        self.FOLLOW_unary_expression_in_unary_expression1311)
                     self.unary_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt64 == 3:
                     # C.g:393:4: '--' unary_expression
-                    self.match(self.input, 73, self.FOLLOW_73_in_unary_expression1316)
+                    self.match(self.input, 73,
+                               self.FOLLOW_73_in_unary_expression1316)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_unary_expression_in_unary_expression1318)
+                    self.following.append(
+                        self.FOLLOW_unary_expression_in_unary_expression1318)
                     self.unary_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt64 == 4:
                     # C.g:394:4: unary_operator cast_expression
-                    self.following.append(self.FOLLOW_unary_operator_in_unary_expression1323)
+                    self.following.append(
+                        self.FOLLOW_unary_operator_in_unary_expression1323)
                     self.unary_operator()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_cast_expression_in_unary_expression1325)
+                    self.following.append(
+                        self.FOLLOW_cast_expression_in_unary_expression1325)
                     self.cast_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt64 == 5:
                     # C.g:395:4: 'sizeof' unary_expression
-                    self.match(self.input, 74, self.FOLLOW_74_in_unary_expression1330)
+                    self.match(self.input, 74,
+                               self.FOLLOW_74_in_unary_expression1330)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_unary_expression_in_unary_expression1332)
+                    self.following.append(
+                        self.FOLLOW_unary_expression_in_unary_expression1332)
                     self.unary_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt64 == 6:
                     # C.g:396:4: 'sizeof' '(' type_name ')'
-                    self.match(self.input, 74, self.FOLLOW_74_in_unary_expression1337)
+                    self.match(self.input, 74,
+                               self.FOLLOW_74_in_unary_expression1337)
                     if self.failed:
                         return
-                    self.match(self.input, 62, self.FOLLOW_62_in_unary_expression1339)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_unary_expression1339)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_type_name_in_unary_expression1341)
+                    self.following.append(
+                        self.FOLLOW_type_name_in_unary_expression1341)
                     self.type_name()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_unary_expression1343)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_unary_expression1343)
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -5364,9 +5175,9 @@ class CParser(Parser):
 
     # $ANTLR end unary_expression
 
-
     # $ANTLR start postfix_expression
     # C.g:399:1: postfix_expression : p= primary_expression ( '[' expression ']' | '(' a= ')' | '(' c= argument_expression_list b= ')' | '(' macro_parameter_list ')' | '.' x= IDENTIFIER | '*' y= IDENTIFIER | '->' z= IDENTIFIER | '++' | '--' )* ;
+
     def postfix_expression(self, ):
         self.postfix_expression_stack.append(postfix_expression_scope())
         postfix_expression_StartIndex = self.input.index()
@@ -5379,9 +5190,7 @@ class CParser(Parser):
 
         c = None
 
-
-
-        self.postfix_expression_stack[-1].FuncCallText =  ''
+        self.postfix_expression_stack[-1].FuncCallText = ''
 
         try:
             try:
@@ -5390,30 +5199,29 @@ class CParser(Parser):
 
                 # C.g:406:2: (p= primary_expression ( '[' expression ']' | '(' a= ')' | '(' c= argument_expression_list b= ')' | '(' macro_parameter_list ')' | '.' x= IDENTIFIER | '*' y= IDENTIFIER | '->' z= IDENTIFIER | '++' | '--' )* )
                 # C.g:406:6: p= primary_expression ( '[' expression ']' | '(' a= ')' | '(' c= argument_expression_list b= ')' | '(' macro_parameter_list ')' | '.' x= IDENTIFIER | '*' y= IDENTIFIER | '->' z= IDENTIFIER | '++' | '--' )*
-                self.following.append(self.FOLLOW_primary_expression_in_postfix_expression1367)
+                self.following.append(
+                    self.FOLLOW_primary_expression_in_postfix_expression1367)
                 p = self.primary_expression()
                 self.following.pop()
                 if self.failed:
                     return
                 if self.backtracking == 0:
-                    self.postfix_expression_stack[-1].FuncCallText += self.input.toString(p.start, p.stop)
+                    self.postfix_expression_stack[-1].FuncCallText += self.input.toString(
+                        p.start, p.stop)
 
                 # C.g:407:9: ( '[' expression ']' | '(' a= ')' | '(' c= argument_expression_list b= ')' | '(' macro_parameter_list ')' | '.' x= IDENTIFIER | '*' y= IDENTIFIER | '->' z= IDENTIFIER | '++' | '--' )*
-                while True: #loop65
+                while True:  # loop65
                     alt65 = 10
                     LA65 = self.input.LA(1)
                     if LA65 == 66:
                         LA65_1 = self.input.LA(2)
 
-                        if (LA65_1 == IDENTIFIER) :
+                        if (LA65_1 == IDENTIFIER):
                             LA65_30 = self.input.LA(3)
 
-                            if (self.synpred120()) :
+                            if (self.synpred120()):
                                 alt65 = 6
 
-
-
-
                     elif LA65 == 64:
                         alt65 = 1
                     elif LA65 == 62:
@@ -5425,21 +5233,19 @@ class CParser(Parser):
                         elif LA65 == IDENTIFIER:
                             LA65_55 = self.input.LA(3)
 
-                            if (self.synpred117()) :
+                            if (self.synpred117()):
                                 alt65 = 3
-                            elif (self.synpred118()) :
+                            elif (self.synpred118()):
                                 alt65 = 4
 
-
                         elif LA65 == 66:
                             LA65_57 = self.input.LA(3)
 
-                            if (self.synpred117()) :
+                            if (self.synpred117()):
                                 alt65 = 3
-                            elif (self.synpred118()) :
+                            elif (self.synpred118()):
                                 alt65 = 4
 
-
                         elif LA65 == HEX_LITERAL or LA65 == OCTAL_LITERAL or LA65 == DECIMAL_LITERAL or LA65 == CHARACTER_LITERAL or LA65 == STRING_LITERAL or LA65 == FLOATING_POINT_LITERAL or LA65 == 62 or LA65 == 68 or LA65 == 69 or LA65 == 72 or LA65 == 73 or LA65 == 74 or LA65 == 77 or LA65 == 78 or LA65 == 79:
                             alt65 = 3
 
@@ -5454,130 +5260,132 @@ class CParser(Parser):
 
                     if alt65 == 1:
                         # C.g:407:13: '[' expression ']'
-                        self.match(self.input, 64, self.FOLLOW_64_in_postfix_expression1383)
+                        self.match(self.input, 64,
+                                   self.FOLLOW_64_in_postfix_expression1383)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_expression_in_postfix_expression1385)
+                        self.following.append(
+                            self.FOLLOW_expression_in_postfix_expression1385)
                         self.expression()
                         self.following.pop()
                         if self.failed:
                             return
-                        self.match(self.input, 65, self.FOLLOW_65_in_postfix_expression1387)
+                        self.match(self.input, 65,
+                                   self.FOLLOW_65_in_postfix_expression1387)
                         if self.failed:
                             return
 
-
                     elif alt65 == 2:
                         # C.g:408:13: '(' a= ')'
-                        self.match(self.input, 62, self.FOLLOW_62_in_postfix_expression1401)
+                        self.match(self.input, 62,
+                                   self.FOLLOW_62_in_postfix_expression1401)
                         if self.failed:
                             return
                         a = self.input.LT(1)
-                        self.match(self.input, 63, self.FOLLOW_63_in_postfix_expression1405)
+                        self.match(self.input, 63,
+                                   self.FOLLOW_63_in_postfix_expression1405)
                         if self.failed:
                             return
                         if self.backtracking == 0:
-                            self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, a.line, a.charPositionInLine, self.postfix_expression_stack[-1].FuncCallText, '')
-
-
+                            self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, a.line,
+                                                      a.charPositionInLine, self.postfix_expression_stack[-1].FuncCallText, '')
 
                     elif alt65 == 3:
                         # C.g:409:13: '(' c= argument_expression_list b= ')'
-                        self.match(self.input, 62, self.FOLLOW_62_in_postfix_expression1420)
+                        self.match(self.input, 62,
+                                   self.FOLLOW_62_in_postfix_expression1420)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_argument_expression_list_in_postfix_expression1424)
+                        self.following.append(
+                            self.FOLLOW_argument_expression_list_in_postfix_expression1424)
                         c = self.argument_expression_list()
                         self.following.pop()
                         if self.failed:
                             return
                         b = self.input.LT(1)
-                        self.match(self.input, 63, self.FOLLOW_63_in_postfix_expression1428)
+                        self.match(self.input, 63,
+                                   self.FOLLOW_63_in_postfix_expression1428)
                         if self.failed:
                             return
                         if self.backtracking == 0:
-                            self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, b.line, b.charPositionInLine, self.postfix_expression_stack[-1].FuncCallText, self.input.toString(c.start, c.stop))
-
-
+                            self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, b.line, b.charPositionInLine,
+                                                      self.postfix_expression_stack[-1].FuncCallText, self.input.toString(c.start, c.stop))
 
                     elif alt65 == 4:
                         # C.g:410:13: '(' macro_parameter_list ')'
-                        self.match(self.input, 62, self.FOLLOW_62_in_postfix_expression1444)
+                        self.match(self.input, 62,
+                                   self.FOLLOW_62_in_postfix_expression1444)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_macro_parameter_list_in_postfix_expression1446)
+                        self.following.append(
+                            self.FOLLOW_macro_parameter_list_in_postfix_expression1446)
                         self.macro_parameter_list()
                         self.following.pop()
                         if self.failed:
                             return
-                        self.match(self.input, 63, self.FOLLOW_63_in_postfix_expression1448)
+                        self.match(self.input, 63,
+                                   self.FOLLOW_63_in_postfix_expression1448)
                         if self.failed:
                             return
 
-
                     elif alt65 == 5:
                         # C.g:411:13: '.' x= IDENTIFIER
-                        self.match(self.input, 75, self.FOLLOW_75_in_postfix_expression1462)
+                        self.match(self.input, 75,
+                                   self.FOLLOW_75_in_postfix_expression1462)
                         if self.failed:
                             return
                         x = self.input.LT(1)
-                        self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_postfix_expression1466)
+                        self.match(
+                            self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_postfix_expression1466)
                         if self.failed:
                             return
                         if self.backtracking == 0:
                             self.postfix_expression_stack[-1].FuncCallText += '.' + x.text
 
-
-
                     elif alt65 == 6:
                         # C.g:412:13: '*' y= IDENTIFIER
-                        self.match(self.input, 66, self.FOLLOW_66_in_postfix_expression1482)
+                        self.match(self.input, 66,
+                                   self.FOLLOW_66_in_postfix_expression1482)
                         if self.failed:
                             return
                         y = self.input.LT(1)
-                        self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_postfix_expression1486)
+                        self.match(
+                            self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_postfix_expression1486)
                         if self.failed:
                             return
                         if self.backtracking == 0:
                             self.postfix_expression_stack[-1].FuncCallText = y.text
 
-
-
                     elif alt65 == 7:
                         # C.g:413:13: '->' z= IDENTIFIER
-                        self.match(self.input, 76, self.FOLLOW_76_in_postfix_expression1502)
+                        self.match(self.input, 76,
+                                   self.FOLLOW_76_in_postfix_expression1502)
                         if self.failed:
                             return
                         z = self.input.LT(1)
-                        self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_postfix_expression1506)
+                        self.match(
+                            self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_postfix_expression1506)
                         if self.failed:
                             return
                         if self.backtracking == 0:
                             self.postfix_expression_stack[-1].FuncCallText += '->' + z.text
 
-
-
                     elif alt65 == 8:
                         # C.g:414:13: '++'
-                        self.match(self.input, 72, self.FOLLOW_72_in_postfix_expression1522)
+                        self.match(self.input, 72,
+                                   self.FOLLOW_72_in_postfix_expression1522)
                         if self.failed:
                             return
 
-
                     elif alt65 == 9:
                         # C.g:415:13: '--'
-                        self.match(self.input, 73, self.FOLLOW_73_in_postfix_expression1536)
+                        self.match(self.input, 73,
+                                   self.FOLLOW_73_in_postfix_expression1536)
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop65
-
-
-
-
-
+                        break  # loop65
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -5593,9 +5401,9 @@ class CParser(Parser):
 
     # $ANTLR end postfix_expression
 
-
     # $ANTLR start macro_parameter_list
     # C.g:419:1: macro_parameter_list : parameter_declaration ( ',' parameter_declaration )* ;
+
     def macro_parameter_list(self, ):
 
         macro_parameter_list_StartIndex = self.input.index()
@@ -5606,39 +5414,35 @@ class CParser(Parser):
 
                 # C.g:420:2: ( parameter_declaration ( ',' parameter_declaration )* )
                 # C.g:420:4: parameter_declaration ( ',' parameter_declaration )*
-                self.following.append(self.FOLLOW_parameter_declaration_in_macro_parameter_list1559)
+                self.following.append(
+                    self.FOLLOW_parameter_declaration_in_macro_parameter_list1559)
                 self.parameter_declaration()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:420:26: ( ',' parameter_declaration )*
-                while True: #loop66
+                while True:  # loop66
                     alt66 = 2
                     LA66_0 = self.input.LA(1)
 
-                    if (LA66_0 == 27) :
+                    if (LA66_0 == 27):
                         alt66 = 1
 
-
                     if alt66 == 1:
                         # C.g:420:27: ',' parameter_declaration
-                        self.match(self.input, 27, self.FOLLOW_27_in_macro_parameter_list1562)
+                        self.match(self.input, 27,
+                                   self.FOLLOW_27_in_macro_parameter_list1562)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_parameter_declaration_in_macro_parameter_list1564)
+                        self.following.append(
+                            self.FOLLOW_parameter_declaration_in_macro_parameter_list1564)
                         self.parameter_declaration()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop66
-
-
-
-
-
+                        break  # loop66
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -5653,9 +5457,9 @@ class CParser(Parser):
 
     # $ANTLR end macro_parameter_list
 
-
     # $ANTLR start unary_operator
     # C.g:423:1: unary_operator : ( '&' | '*' | '+' | '-' | '~' | '!' );
+
     def unary_operator(self, ):
 
         unary_operator_StartIndex = self.input.index()
@@ -5667,7 +5471,7 @@ class CParser(Parser):
                 # C.g:424:2: ( '&' | '*' | '+' | '-' | '~' | '!' )
                 # C.g:
                 if self.input.LA(1) == 66 or (68 <= self.input.LA(1) <= 69) or (77 <= self.input.LA(1) <= 79):
-                    self.input.consume();
+                    self.input.consume()
                     self.errorRecovery = False
                     self.failed = False
 
@@ -5679,14 +5483,9 @@ class CParser(Parser):
                     mse = MismatchedSetException(None, self.input)
                     self.recoverFromMismatchedSet(
                         self.input, mse, self.FOLLOW_set_in_unary_operator0
-                        )
+                    )
                     raise mse
 
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -5705,10 +5504,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start primary_expression
     # C.g:432:1: primary_expression : ( IDENTIFIER | constant | '(' expression ')' );
+
     def primary_expression(self, ):
 
         retval = self.primary_expression_return()
@@ -5725,16 +5523,17 @@ class CParser(Parser):
                 if LA67 == IDENTIFIER:
                     LA67_1 = self.input.LA(2)
 
-                    if (LA67_1 == EOF or LA67_1 == 25 or (27 <= LA67_1 <= 28) or LA67_1 == 44 or LA67_1 == 47 or LA67_1 == 53 or (62 <= LA67_1 <= 66) or (68 <= LA67_1 <= 73) or (75 <= LA67_1 <= 77) or (80 <= LA67_1 <= 102)) :
+                    if (LA67_1 == EOF or LA67_1 == 25 or (27 <= LA67_1 <= 28) or LA67_1 == 44 or LA67_1 == 47 or LA67_1 == 53 or (62 <= LA67_1 <= 66) or (68 <= LA67_1 <= 73) or (75 <= LA67_1 <= 77) or (80 <= LA67_1 <= 102)):
                         alt67 = 1
-                    elif (LA67_1 == IDENTIFIER or LA67_1 == STRING_LITERAL) :
+                    elif (LA67_1 == IDENTIFIER or LA67_1 == STRING_LITERAL):
                         alt67 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return retval
 
-                        nvae = NoViableAltException("432:1: primary_expression : ( IDENTIFIER | constant | '(' expression ')' );", 67, 1, self.input)
+                        nvae = NoViableAltException(
+                            "432:1: primary_expression : ( IDENTIFIER | constant | '(' expression ')' );", 67, 1, self.input)
 
                         raise nvae
 
@@ -5747,44 +5546,46 @@ class CParser(Parser):
                         self.failed = True
                         return retval
 
-                    nvae = NoViableAltException("432:1: primary_expression : ( IDENTIFIER | constant | '(' expression ')' );", 67, 0, self.input)
+                    nvae = NoViableAltException(
+                        "432:1: primary_expression : ( IDENTIFIER | constant | '(' expression ')' );", 67, 0, self.input)
 
                     raise nvae
 
                 if alt67 == 1:
                     # C.g:433:4: IDENTIFIER
-                    self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_primary_expression1613)
+                    self.match(self.input, IDENTIFIER,
+                               self.FOLLOW_IDENTIFIER_in_primary_expression1613)
                     if self.failed:
                         return retval
 
-
                 elif alt67 == 2:
                     # C.g:434:4: constant
-                    self.following.append(self.FOLLOW_constant_in_primary_expression1618)
+                    self.following.append(
+                        self.FOLLOW_constant_in_primary_expression1618)
                     self.constant()
                     self.following.pop()
                     if self.failed:
                         return retval
 
-
                 elif alt67 == 3:
                     # C.g:435:4: '(' expression ')'
-                    self.match(self.input, 62, self.FOLLOW_62_in_primary_expression1623)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_primary_expression1623)
                     if self.failed:
                         return retval
-                    self.following.append(self.FOLLOW_expression_in_primary_expression1625)
+                    self.following.append(
+                        self.FOLLOW_expression_in_primary_expression1625)
                     self.expression()
                     self.following.pop()
                     if self.failed:
                         return retval
-                    self.match(self.input, 63, self.FOLLOW_63_in_primary_expression1627)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_primary_expression1627)
                     if self.failed:
                         return retval
 
-
                 retval.stop = self.input.LT(-1)
 
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -5798,9 +5599,9 @@ class CParser(Parser):
 
     # $ANTLR end primary_expression
 
-
     # $ANTLR start constant
     # C.g:438:1: constant : ( HEX_LITERAL | OCTAL_LITERAL | DECIMAL_LITERAL | CHARACTER_LITERAL | ( ( IDENTIFIER )* ( STRING_LITERAL )+ )+ ( IDENTIFIER )* | FLOATING_POINT_LITERAL );
+
     def constant(self, ):
 
         constant_StartIndex = self.input.index()
@@ -5829,111 +5630,103 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("438:1: constant : ( HEX_LITERAL | OCTAL_LITERAL | DECIMAL_LITERAL | CHARACTER_LITERAL | ( ( IDENTIFIER )* ( STRING_LITERAL )+ )+ ( IDENTIFIER )* | FLOATING_POINT_LITERAL );", 72, 0, self.input)
+                    nvae = NoViableAltException(
+                        "438:1: constant : ( HEX_LITERAL | OCTAL_LITERAL | DECIMAL_LITERAL | CHARACTER_LITERAL | ( ( IDENTIFIER )* ( STRING_LITERAL )+ )+ ( IDENTIFIER )* | FLOATING_POINT_LITERAL );", 72, 0, self.input)
 
                     raise nvae
 
                 if alt72 == 1:
                     # C.g:439:9: HEX_LITERAL
-                    self.match(self.input, HEX_LITERAL, self.FOLLOW_HEX_LITERAL_in_constant1643)
+                    self.match(self.input, HEX_LITERAL,
+                               self.FOLLOW_HEX_LITERAL_in_constant1643)
                     if self.failed:
                         return
 
-
                 elif alt72 == 2:
                     # C.g:440:9: OCTAL_LITERAL
-                    self.match(self.input, OCTAL_LITERAL, self.FOLLOW_OCTAL_LITERAL_in_constant1653)
+                    self.match(self.input, OCTAL_LITERAL,
+                               self.FOLLOW_OCTAL_LITERAL_in_constant1653)
                     if self.failed:
                         return
 
-
                 elif alt72 == 3:
                     # C.g:441:9: DECIMAL_LITERAL
-                    self.match(self.input, DECIMAL_LITERAL, self.FOLLOW_DECIMAL_LITERAL_in_constant1663)
+                    self.match(self.input, DECIMAL_LITERAL,
+                               self.FOLLOW_DECIMAL_LITERAL_in_constant1663)
                     if self.failed:
                         return
 
-
                 elif alt72 == 4:
                     # C.g:442:7: CHARACTER_LITERAL
-                    self.match(self.input, CHARACTER_LITERAL, self.FOLLOW_CHARACTER_LITERAL_in_constant1671)
+                    self.match(self.input, CHARACTER_LITERAL,
+                               self.FOLLOW_CHARACTER_LITERAL_in_constant1671)
                     if self.failed:
                         return
 
-
                 elif alt72 == 5:
                     # C.g:443:7: ( ( IDENTIFIER )* ( STRING_LITERAL )+ )+ ( IDENTIFIER )*
                     # C.g:443:7: ( ( IDENTIFIER )* ( STRING_LITERAL )+ )+
                     cnt70 = 0
-                    while True: #loop70
+                    while True:  # loop70
                         alt70 = 2
                         LA70_0 = self.input.LA(1)
 
-                        if (LA70_0 == IDENTIFIER) :
+                        if (LA70_0 == IDENTIFIER):
                             LA70_1 = self.input.LA(2)
 
-                            if (LA70_1 == STRING_LITERAL) :
+                            if (LA70_1 == STRING_LITERAL):
                                 alt70 = 1
-                            elif (LA70_1 == IDENTIFIER) :
+                            elif (LA70_1 == IDENTIFIER):
                                 LA70_33 = self.input.LA(3)
 
-                                if (self.synpred138()) :
+                                if (self.synpred138()):
                                     alt70 = 1
 
-
-
-
-                        elif (LA70_0 == STRING_LITERAL) :
+                        elif (LA70_0 == STRING_LITERAL):
                             alt70 = 1
 
-
                         if alt70 == 1:
                             # C.g:443:8: ( IDENTIFIER )* ( STRING_LITERAL )+
                             # C.g:443:8: ( IDENTIFIER )*
-                            while True: #loop68
+                            while True:  # loop68
                                 alt68 = 2
                                 LA68_0 = self.input.LA(1)
 
-                                if (LA68_0 == IDENTIFIER) :
+                                if (LA68_0 == IDENTIFIER):
                                     alt68 = 1
 
-
                                 if alt68 == 1:
                                     # C.g:0:0: IDENTIFIER
-                                    self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_constant1680)
+                                    self.match(
+                                        self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_constant1680)
                                     if self.failed:
                                         return
 
-
                                 else:
-                                    break #loop68
-
+                                    break  # loop68
 
                             # C.g:443:20: ( STRING_LITERAL )+
                             cnt69 = 0
-                            while True: #loop69
+                            while True:  # loop69
                                 alt69 = 2
                                 LA69_0 = self.input.LA(1)
 
-                                if (LA69_0 == STRING_LITERAL) :
+                                if (LA69_0 == STRING_LITERAL):
                                     LA69_31 = self.input.LA(2)
 
-                                    if (self.synpred137()) :
+                                    if (self.synpred137()):
                                         alt69 = 1
 
-
-
-
                                 if alt69 == 1:
                                     # C.g:0:0: STRING_LITERAL
-                                    self.match(self.input, STRING_LITERAL, self.FOLLOW_STRING_LITERAL_in_constant1683)
+                                    self.match(
+                                        self.input, STRING_LITERAL, self.FOLLOW_STRING_LITERAL_in_constant1683)
                                     if self.failed:
                                         return
 
-
                                 else:
                                     if cnt69 >= 1:
-                                        break #loop69
+                                        break  # loop69
 
                                     if self.backtracking > 0:
                                         self.failed = True
@@ -5944,12 +5737,9 @@ class CParser(Parser):
 
                                 cnt69 += 1
 
-
-
-
                         else:
                             if cnt70 >= 1:
-                                break #loop70
+                                break  # loop70
 
                             if self.backtracking > 0:
                                 self.failed = True
@@ -5960,37 +5750,31 @@ class CParser(Parser):
 
                         cnt70 += 1
 
-
                     # C.g:443:38: ( IDENTIFIER )*
-                    while True: #loop71
+                    while True:  # loop71
                         alt71 = 2
                         LA71_0 = self.input.LA(1)
 
-                        if (LA71_0 == IDENTIFIER) :
+                        if (LA71_0 == IDENTIFIER):
                             alt71 = 1
 
-
                         if alt71 == 1:
                             # C.g:0:0: IDENTIFIER
-                            self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_constant1688)
+                            self.match(self.input, IDENTIFIER,
+                                       self.FOLLOW_IDENTIFIER_in_constant1688)
                             if self.failed:
                                 return
 
-
                         else:
-                            break #loop71
-
-
-
+                            break  # loop71
 
                 elif alt72 == 6:
                     # C.g:444:9: FLOATING_POINT_LITERAL
-                    self.match(self.input, FLOATING_POINT_LITERAL, self.FOLLOW_FLOATING_POINT_LITERAL_in_constant1699)
+                    self.match(self.input, FLOATING_POINT_LITERAL,
+                               self.FOLLOW_FLOATING_POINT_LITERAL_in_constant1699)
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -6009,10 +5793,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start expression
     # C.g:449:1: expression : assignment_expression ( ',' assignment_expression )* ;
+
     def expression(self, ):
 
         retval = self.expression_return()
@@ -6025,42 +5808,38 @@ class CParser(Parser):
 
                 # C.g:450:2: ( assignment_expression ( ',' assignment_expression )* )
                 # C.g:450:4: assignment_expression ( ',' assignment_expression )*
-                self.following.append(self.FOLLOW_assignment_expression_in_expression1715)
+                self.following.append(
+                    self.FOLLOW_assignment_expression_in_expression1715)
                 self.assignment_expression()
                 self.following.pop()
                 if self.failed:
                     return retval
                 # C.g:450:26: ( ',' assignment_expression )*
-                while True: #loop73
+                while True:  # loop73
                     alt73 = 2
                     LA73_0 = self.input.LA(1)
 
-                    if (LA73_0 == 27) :
+                    if (LA73_0 == 27):
                         alt73 = 1
 
-
                     if alt73 == 1:
                         # C.g:450:27: ',' assignment_expression
-                        self.match(self.input, 27, self.FOLLOW_27_in_expression1718)
+                        self.match(self.input, 27,
+                                   self.FOLLOW_27_in_expression1718)
                         if self.failed:
                             return retval
-                        self.following.append(self.FOLLOW_assignment_expression_in_expression1720)
+                        self.following.append(
+                            self.FOLLOW_assignment_expression_in_expression1720)
                         self.assignment_expression()
                         self.following.pop()
                         if self.failed:
                             return retval
 
-
                     else:
-                        break #loop73
-
-
-
-
+                        break  # loop73
 
                 retval.stop = self.input.LT(-1)
 
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -6074,9 +5853,9 @@ class CParser(Parser):
 
     # $ANTLR end expression
 
-
     # $ANTLR start constant_expression
     # C.g:453:1: constant_expression : conditional_expression ;
+
     def constant_expression(self, ):
 
         constant_expression_StartIndex = self.input.index()
@@ -6087,15 +5866,13 @@ class CParser(Parser):
 
                 # C.g:454:2: ( conditional_expression )
                 # C.g:454:4: conditional_expression
-                self.following.append(self.FOLLOW_conditional_expression_in_constant_expression1733)
+                self.following.append(
+                    self.FOLLOW_conditional_expression_in_constant_expression1733)
                 self.conditional_expression()
                 self.following.pop()
                 if self.failed:
                     return
 
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -6109,9 +5886,9 @@ class CParser(Parser):
 
     # $ANTLR end constant_expression
 
-
     # $ANTLR start assignment_expression
     # C.g:457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );
+
     def assignment_expression(self, ):
 
         assignment_expression_StartIndex = self.input.index()
@@ -6128,112 +5905,119 @@ class CParser(Parser):
                     if LA74 == 64:
                         LA74_13 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 13, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 13, self.input)
 
                             raise nvae
 
                     elif LA74 == 62:
                         LA74_14 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 14, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 14, self.input)
 
                             raise nvae
 
                     elif LA74 == 75:
                         LA74_15 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 15, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 15, self.input)
 
                             raise nvae
 
                     elif LA74 == 66:
                         LA74_16 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 16, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 16, self.input)
 
                             raise nvae
 
                     elif LA74 == 76:
                         LA74_17 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 17, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 17, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_18 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 18, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 18, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_19 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 19, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 19, self.input)
 
                             raise nvae
 
@@ -6242,32 +6026,34 @@ class CParser(Parser):
                     elif LA74 == STRING_LITERAL:
                         LA74_21 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 21, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 21, self.input)
 
                             raise nvae
 
                     elif LA74 == IDENTIFIER:
                         LA74_22 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 22, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 22, self.input)
 
                             raise nvae
 
@@ -6278,7 +6064,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 1, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 1, self.input)
 
                         raise nvae
 
@@ -6287,112 +6074,119 @@ class CParser(Parser):
                     if LA74 == 64:
                         LA74_44 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 44, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 44, self.input)
 
                             raise nvae
 
                     elif LA74 == 62:
                         LA74_45 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 45, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 45, self.input)
 
                             raise nvae
 
                     elif LA74 == 75:
                         LA74_46 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 46, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 46, self.input)
 
                             raise nvae
 
                     elif LA74 == 66:
                         LA74_47 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 47, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 47, self.input)
 
                             raise nvae
 
                     elif LA74 == 76:
                         LA74_48 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 48, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 48, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_49 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 49, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 49, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_50 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 50, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 50, self.input)
 
                             raise nvae
 
@@ -6405,7 +6199,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 2, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 2, self.input)
 
                         raise nvae
 
@@ -6414,112 +6209,119 @@ class CParser(Parser):
                     if LA74 == 64:
                         LA74_73 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 73, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 73, self.input)
 
                             raise nvae
 
                     elif LA74 == 62:
                         LA74_74 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 74, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 74, self.input)
 
                             raise nvae
 
                     elif LA74 == 75:
                         LA74_75 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 75, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 75, self.input)
 
                             raise nvae
 
                     elif LA74 == 66:
                         LA74_76 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 76, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 76, self.input)
 
                             raise nvae
 
                     elif LA74 == 76:
                         LA74_77 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 77, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 77, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_78 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 78, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 78, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_79 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 79, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 79, self.input)
 
                             raise nvae
 
@@ -6532,7 +6334,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 3, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 3, self.input)
 
                         raise nvae
 
@@ -6541,112 +6344,119 @@ class CParser(Parser):
                     if LA74 == 64:
                         LA74_102 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 102, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 102, self.input)
 
                             raise nvae
 
                     elif LA74 == 62:
                         LA74_103 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 103, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 103, self.input)
 
                             raise nvae
 
                     elif LA74 == 75:
                         LA74_104 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 104, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 104, self.input)
 
                             raise nvae
 
                     elif LA74 == 66:
                         LA74_105 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 105, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 105, self.input)
 
                             raise nvae
 
                     elif LA74 == 76:
                         LA74_106 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 106, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 106, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_107 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 107, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 107, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_108 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 108, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 108, self.input)
 
                             raise nvae
 
@@ -6659,7 +6469,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 4, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 4, self.input)
 
                         raise nvae
 
@@ -6668,112 +6479,119 @@ class CParser(Parser):
                     if LA74 == 64:
                         LA74_131 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 131, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 131, self.input)
 
                             raise nvae
 
                     elif LA74 == 62:
                         LA74_132 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 132, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 132, self.input)
 
                             raise nvae
 
                     elif LA74 == 75:
                         LA74_133 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 133, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 133, self.input)
 
                             raise nvae
 
                     elif LA74 == 66:
                         LA74_134 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 134, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 134, self.input)
 
                             raise nvae
 
                     elif LA74 == 76:
                         LA74_135 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 135, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 135, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_136 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 136, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 136, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_137 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 137, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 137, self.input)
 
                             raise nvae
 
@@ -6786,7 +6604,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 5, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 5, self.input)
 
                         raise nvae
 
@@ -6795,128 +6614,136 @@ class CParser(Parser):
                     if LA74 == IDENTIFIER:
                         LA74_160 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 160, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 160, self.input)
 
                             raise nvae
 
                     elif LA74 == 64:
                         LA74_161 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 161, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 161, self.input)
 
                             raise nvae
 
                     elif LA74 == 62:
                         LA74_162 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 162, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 162, self.input)
 
                             raise nvae
 
                     elif LA74 == 75:
                         LA74_163 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 163, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 163, self.input)
 
                             raise nvae
 
                     elif LA74 == 66:
                         LA74_164 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 164, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 164, self.input)
 
                             raise nvae
 
                     elif LA74 == 76:
                         LA74_165 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 165, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 165, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_166 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 166, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 166, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_167 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 167, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 167, self.input)
 
                             raise nvae
 
@@ -6925,16 +6752,17 @@ class CParser(Parser):
                     elif LA74 == STRING_LITERAL:
                         LA74_189 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 189, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 189, self.input)
 
                             raise nvae
 
@@ -6945,7 +6773,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 6, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 6, self.input)
 
                         raise nvae
 
@@ -6954,112 +6783,119 @@ class CParser(Parser):
                     if LA74 == 64:
                         LA74_191 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 191, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 191, self.input)
 
                             raise nvae
 
                     elif LA74 == 62:
                         LA74_192 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 192, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 192, self.input)
 
                             raise nvae
 
                     elif LA74 == 75:
                         LA74_193 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 193, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 193, self.input)
 
                             raise nvae
 
                     elif LA74 == 66:
                         LA74_194 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 194, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 194, self.input)
 
                             raise nvae
 
                     elif LA74 == 76:
                         LA74_195 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 195, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 195, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_196 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 196, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 196, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_197 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 197, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 197, self.input)
 
                             raise nvae
 
@@ -7072,7 +6908,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 7, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 7, self.input)
 
                         raise nvae
 
@@ -7081,192 +6918,204 @@ class CParser(Parser):
                     if LA74 == IDENTIFIER:
                         LA74_220 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 220, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 220, self.input)
 
                             raise nvae
 
                     elif LA74 == HEX_LITERAL:
                         LA74_221 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 221, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 221, self.input)
 
                             raise nvae
 
                     elif LA74 == OCTAL_LITERAL:
                         LA74_222 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 222, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 222, self.input)
 
                             raise nvae
 
                     elif LA74 == DECIMAL_LITERAL:
                         LA74_223 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 223, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 223, self.input)
 
                             raise nvae
 
                     elif LA74 == CHARACTER_LITERAL:
                         LA74_224 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 224, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 224, self.input)
 
                             raise nvae
 
                     elif LA74 == STRING_LITERAL:
                         LA74_225 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 225, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 225, self.input)
 
                             raise nvae
 
                     elif LA74 == FLOATING_POINT_LITERAL:
                         LA74_226 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 226, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 226, self.input)
 
                             raise nvae
 
                     elif LA74 == 62:
                         LA74_227 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 227, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 227, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_228 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 228, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 228, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_229 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 229, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 229, self.input)
 
                             raise nvae
 
                     elif LA74 == 66 or LA74 == 68 or LA74 == 69 or LA74 == 77 or LA74 == 78 or LA74 == 79:
                         LA74_230 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 230, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 230, self.input)
 
                             raise nvae
 
                     elif LA74 == 74:
                         LA74_231 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 231, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 231, self.input)
 
                             raise nvae
 
@@ -7277,7 +7126,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 8, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 8, self.input)
 
                         raise nvae
 
@@ -7286,192 +7136,204 @@ class CParser(Parser):
                     if LA74 == IDENTIFIER:
                         LA74_244 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 244, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 244, self.input)
 
                             raise nvae
 
                     elif LA74 == HEX_LITERAL:
                         LA74_245 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 245, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 245, self.input)
 
                             raise nvae
 
                     elif LA74 == OCTAL_LITERAL:
                         LA74_246 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 246, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 246, self.input)
 
                             raise nvae
 
                     elif LA74 == DECIMAL_LITERAL:
                         LA74_247 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 247, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 247, self.input)
 
                             raise nvae
 
                     elif LA74 == CHARACTER_LITERAL:
                         LA74_248 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 248, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 248, self.input)
 
                             raise nvae
 
                     elif LA74 == STRING_LITERAL:
                         LA74_249 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 249, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 249, self.input)
 
                             raise nvae
 
                     elif LA74 == FLOATING_POINT_LITERAL:
                         LA74_250 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 250, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 250, self.input)
 
                             raise nvae
 
                     elif LA74 == 62:
                         LA74_251 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 251, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 251, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_252 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 252, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 252, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_253 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 253, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 253, self.input)
 
                             raise nvae
 
                     elif LA74 == 66 or LA74 == 68 or LA74 == 69 or LA74 == 77 or LA74 == 78 or LA74 == 79:
                         LA74_254 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 254, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 254, self.input)
 
                             raise nvae
 
                     elif LA74 == 74:
                         LA74_255 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 255, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 255, self.input)
 
                             raise nvae
 
@@ -7480,7 +7342,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 9, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 9, self.input)
 
                         raise nvae
 
@@ -7489,192 +7352,204 @@ class CParser(Parser):
                     if LA74 == IDENTIFIER:
                         LA74_256 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 256, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 256, self.input)
 
                             raise nvae
 
                     elif LA74 == HEX_LITERAL:
                         LA74_257 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 257, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 257, self.input)
 
                             raise nvae
 
                     elif LA74 == OCTAL_LITERAL:
                         LA74_258 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 258, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 258, self.input)
 
                             raise nvae
 
                     elif LA74 == DECIMAL_LITERAL:
                         LA74_259 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 259, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 259, self.input)
 
                             raise nvae
 
                     elif LA74 == CHARACTER_LITERAL:
                         LA74_260 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 260, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 260, self.input)
 
                             raise nvae
 
                     elif LA74 == STRING_LITERAL:
                         LA74_261 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 261, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 261, self.input)
 
                             raise nvae
 
                     elif LA74 == FLOATING_POINT_LITERAL:
                         LA74_262 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 262, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 262, self.input)
 
                             raise nvae
 
                     elif LA74 == 62:
                         LA74_263 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 263, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 263, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_264 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 264, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 264, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_265 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 265, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 265, self.input)
 
                             raise nvae
 
                     elif LA74 == 66 or LA74 == 68 or LA74 == 69 or LA74 == 77 or LA74 == 78 or LA74 == 79:
                         LA74_266 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 266, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 266, self.input)
 
                             raise nvae
 
                     elif LA74 == 74:
                         LA74_267 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 267, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 267, self.input)
 
                             raise nvae
 
@@ -7683,7 +7558,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 10, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 10, self.input)
 
                         raise nvae
 
@@ -7692,192 +7568,204 @@ class CParser(Parser):
                     if LA74 == 62:
                         LA74_268 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 268, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 268, self.input)
 
                             raise nvae
 
                     elif LA74 == IDENTIFIER:
                         LA74_269 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 269, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 269, self.input)
 
                             raise nvae
 
                     elif LA74 == HEX_LITERAL:
                         LA74_270 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 270, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 270, self.input)
 
                             raise nvae
 
                     elif LA74 == OCTAL_LITERAL:
                         LA74_271 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 271, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 271, self.input)
 
                             raise nvae
 
                     elif LA74 == DECIMAL_LITERAL:
                         LA74_272 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 272, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 272, self.input)
 
                             raise nvae
 
                     elif LA74 == CHARACTER_LITERAL:
                         LA74_273 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 273, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 273, self.input)
 
                             raise nvae
 
                     elif LA74 == STRING_LITERAL:
                         LA74_274 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 274, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 274, self.input)
 
                             raise nvae
 
                     elif LA74 == FLOATING_POINT_LITERAL:
                         LA74_275 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 275, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 275, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_276 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 276, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 276, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_277 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 277, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 277, self.input)
 
                             raise nvae
 
                     elif LA74 == 66 or LA74 == 68 or LA74 == 69 or LA74 == 77 or LA74 == 78 or LA74 == 79:
                         LA74_278 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 278, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 278, self.input)
 
                             raise nvae
 
                     elif LA74 == 74:
                         LA74_279 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 279, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 279, self.input)
 
                             raise nvae
 
@@ -7886,7 +7774,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 11, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 11, self.input)
 
                         raise nvae
 
@@ -7895,192 +7784,204 @@ class CParser(Parser):
                     if LA74 == 62:
                         LA74_280 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 280, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 280, self.input)
 
                             raise nvae
 
                     elif LA74 == IDENTIFIER:
                         LA74_281 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 281, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 281, self.input)
 
                             raise nvae
 
                     elif LA74 == HEX_LITERAL:
                         LA74_282 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 282, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 282, self.input)
 
                             raise nvae
 
                     elif LA74 == OCTAL_LITERAL:
                         LA74_283 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 283, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 283, self.input)
 
                             raise nvae
 
                     elif LA74 == DECIMAL_LITERAL:
                         LA74_284 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 284, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 284, self.input)
 
                             raise nvae
 
                     elif LA74 == CHARACTER_LITERAL:
                         LA74_285 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 285, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 285, self.input)
 
                             raise nvae
 
                     elif LA74 == STRING_LITERAL:
                         LA74_286 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 286, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 286, self.input)
 
                             raise nvae
 
                     elif LA74 == FLOATING_POINT_LITERAL:
                         LA74_287 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 287, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 287, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_288 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 288, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 288, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_289 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 289, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 289, self.input)
 
                             raise nvae
 
                     elif LA74 == 66 or LA74 == 68 or LA74 == 69 or LA74 == 77 or LA74 == 78 or LA74 == 79:
                         LA74_290 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 290, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 290, self.input)
 
                             raise nvae
 
                     elif LA74 == 74:
                         LA74_291 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 291, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 291, self.input)
 
                             raise nvae
 
@@ -8089,7 +7990,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 12, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 12, self.input)
 
                         raise nvae
 
@@ -8098,39 +8000,41 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 0, self.input)
+                    nvae = NoViableAltException(
+                        "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 0, self.input)
 
                     raise nvae
 
                 if alt74 == 1:
                     # C.g:458:4: lvalue assignment_operator assignment_expression
-                    self.following.append(self.FOLLOW_lvalue_in_assignment_expression1744)
+                    self.following.append(
+                        self.FOLLOW_lvalue_in_assignment_expression1744)
                     self.lvalue()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_assignment_operator_in_assignment_expression1746)
+                    self.following.append(
+                        self.FOLLOW_assignment_operator_in_assignment_expression1746)
                     self.assignment_operator()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_assignment_expression_in_assignment_expression1748)
+                    self.following.append(
+                        self.FOLLOW_assignment_expression_in_assignment_expression1748)
                     self.assignment_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt74 == 2:
                     # C.g:459:4: conditional_expression
-                    self.following.append(self.FOLLOW_conditional_expression_in_assignment_expression1753)
+                    self.following.append(
+                        self.FOLLOW_conditional_expression_in_assignment_expression1753)
                     self.conditional_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -8144,9 +8048,9 @@ class CParser(Parser):
 
     # $ANTLR end assignment_expression
 
-
     # $ANTLR start lvalue
     # C.g:462:1: lvalue : unary_expression ;
+
     def lvalue(self, ):
 
         lvalue_StartIndex = self.input.index()
@@ -8157,15 +8061,13 @@ class CParser(Parser):
 
                 # C.g:463:2: ( unary_expression )
                 # C.g:463:4: unary_expression
-                self.following.append(self.FOLLOW_unary_expression_in_lvalue1765)
+                self.following.append(
+                    self.FOLLOW_unary_expression_in_lvalue1765)
                 self.unary_expression()
                 self.following.pop()
                 if self.failed:
                     return
 
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -8179,9 +8081,9 @@ class CParser(Parser):
 
     # $ANTLR end lvalue
 
-
     # $ANTLR start assignment_operator
     # C.g:466:1: assignment_operator : ( '=' | '*=' | '/=' | '%=' | '+=' | '-=' | '<<=' | '>>=' | '&=' | '^=' | '|=' );
+
     def assignment_operator(self, ):
 
         assignment_operator_StartIndex = self.input.index()
@@ -8193,7 +8095,7 @@ class CParser(Parser):
                 # C.g:467:2: ( '=' | '*=' | '/=' | '%=' | '+=' | '-=' | '<<=' | '>>=' | '&=' | '^=' | '|=' )
                 # C.g:
                 if self.input.LA(1) == 28 or (80 <= self.input.LA(1) <= 89):
-                    self.input.consume();
+                    self.input.consume()
                     self.errorRecovery = False
                     self.failed = False
 
@@ -8205,14 +8107,9 @@ class CParser(Parser):
                     mse = MismatchedSetException(None, self.input)
                     self.recoverFromMismatchedSet(
                         self.input, mse, self.FOLLOW_set_in_assignment_operator0
-                        )
+                    )
                     raise mse
 
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -8226,15 +8123,14 @@ class CParser(Parser):
 
     # $ANTLR end assignment_operator
 
-
     # $ANTLR start conditional_expression
     # C.g:480:1: conditional_expression : e= logical_or_expression ( '?' expression ':' conditional_expression )? ;
+
     def conditional_expression(self, ):
 
         conditional_expression_StartIndex = self.input.index()
         e = None
 
-
         try:
             try:
                 if self.backtracking > 0 and self.alreadyParsedRule(self.input, 51):
@@ -8242,7 +8138,8 @@ class CParser(Parser):
 
                 # C.g:481:2: (e= logical_or_expression ( '?' expression ':' conditional_expression )? )
                 # C.g:481:4: e= logical_or_expression ( '?' expression ':' conditional_expression )?
-                self.following.append(self.FOLLOW_logical_or_expression_in_conditional_expression1839)
+                self.following.append(
+                    self.FOLLOW_logical_or_expression_in_conditional_expression1839)
                 e = self.logical_or_expression()
                 self.following.pop()
                 if self.failed:
@@ -8251,35 +8148,33 @@ class CParser(Parser):
                 alt75 = 2
                 LA75_0 = self.input.LA(1)
 
-                if (LA75_0 == 90) :
+                if (LA75_0 == 90):
                     alt75 = 1
                 if alt75 == 1:
                     # C.g:481:29: '?' expression ':' conditional_expression
-                    self.match(self.input, 90, self.FOLLOW_90_in_conditional_expression1842)
+                    self.match(self.input, 90,
+                               self.FOLLOW_90_in_conditional_expression1842)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_expression_in_conditional_expression1844)
+                    self.following.append(
+                        self.FOLLOW_expression_in_conditional_expression1844)
                     self.expression()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 47, self.FOLLOW_47_in_conditional_expression1846)
+                    self.match(self.input, 47,
+                               self.FOLLOW_47_in_conditional_expression1846)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_conditional_expression_in_conditional_expression1848)
+                    self.following.append(
+                        self.FOLLOW_conditional_expression_in_conditional_expression1848)
                     self.conditional_expression()
                     self.following.pop()
                     if self.failed:
                         return
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
-
-
-
-
-
-
-
+                        self.StorePredicateExpression(
+                            e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -8299,10 +8194,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start logical_or_expression
     # C.g:484:1: logical_or_expression : logical_and_expression ( '||' logical_and_expression )* ;
+
     def logical_or_expression(self, ):
 
         retval = self.logical_or_expression_return()
@@ -8315,42 +8209,38 @@ class CParser(Parser):
 
                 # C.g:485:2: ( logical_and_expression ( '||' logical_and_expression )* )
                 # C.g:485:4: logical_and_expression ( '||' logical_and_expression )*
-                self.following.append(self.FOLLOW_logical_and_expression_in_logical_or_expression1863)
+                self.following.append(
+                    self.FOLLOW_logical_and_expression_in_logical_or_expression1863)
                 self.logical_and_expression()
                 self.following.pop()
                 if self.failed:
                     return retval
                 # C.g:485:27: ( '||' logical_and_expression )*
-                while True: #loop76
+                while True:  # loop76
                     alt76 = 2
                     LA76_0 = self.input.LA(1)
 
-                    if (LA76_0 == 91) :
+                    if (LA76_0 == 91):
                         alt76 = 1
 
-
                     if alt76 == 1:
                         # C.g:485:28: '||' logical_and_expression
-                        self.match(self.input, 91, self.FOLLOW_91_in_logical_or_expression1866)
+                        self.match(self.input, 91,
+                                   self.FOLLOW_91_in_logical_or_expression1866)
                         if self.failed:
                             return retval
-                        self.following.append(self.FOLLOW_logical_and_expression_in_logical_or_expression1868)
+                        self.following.append(
+                            self.FOLLOW_logical_and_expression_in_logical_or_expression1868)
                         self.logical_and_expression()
                         self.following.pop()
                         if self.failed:
                             return retval
 
-
                     else:
-                        break #loop76
-
-
-
-
+                        break  # loop76
 
                 retval.stop = self.input.LT(-1)
 
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -8364,9 +8254,9 @@ class CParser(Parser):
 
     # $ANTLR end logical_or_expression
 
-
     # $ANTLR start logical_and_expression
     # C.g:488:1: logical_and_expression : inclusive_or_expression ( '&&' inclusive_or_expression )* ;
+
     def logical_and_expression(self, ):
 
         logical_and_expression_StartIndex = self.input.index()
@@ -8377,39 +8267,35 @@ class CParser(Parser):
 
                 # C.g:489:2: ( inclusive_or_expression ( '&&' inclusive_or_expression )* )
                 # C.g:489:4: inclusive_or_expression ( '&&' inclusive_or_expression )*
-                self.following.append(self.FOLLOW_inclusive_or_expression_in_logical_and_expression1881)
+                self.following.append(
+                    self.FOLLOW_inclusive_or_expression_in_logical_and_expression1881)
                 self.inclusive_or_expression()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:489:28: ( '&&' inclusive_or_expression )*
-                while True: #loop77
+                while True:  # loop77
                     alt77 = 2
                     LA77_0 = self.input.LA(1)
 
-                    if (LA77_0 == 92) :
+                    if (LA77_0 == 92):
                         alt77 = 1
 
-
                     if alt77 == 1:
                         # C.g:489:29: '&&' inclusive_or_expression
-                        self.match(self.input, 92, self.FOLLOW_92_in_logical_and_expression1884)
+                        self.match(
+                            self.input, 92, self.FOLLOW_92_in_logical_and_expression1884)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_inclusive_or_expression_in_logical_and_expression1886)
+                        self.following.append(
+                            self.FOLLOW_inclusive_or_expression_in_logical_and_expression1886)
                         self.inclusive_or_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop77
-
-
-
-
-
+                        break  # loop77
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -8424,9 +8310,9 @@ class CParser(Parser):
 
     # $ANTLR end logical_and_expression
 
-
     # $ANTLR start inclusive_or_expression
     # C.g:492:1: inclusive_or_expression : exclusive_or_expression ( '|' exclusive_or_expression )* ;
+
     def inclusive_or_expression(self, ):
 
         inclusive_or_expression_StartIndex = self.input.index()
@@ -8437,46 +8323,43 @@ class CParser(Parser):
 
                 # C.g:493:2: ( exclusive_or_expression ( '|' exclusive_or_expression )* )
                 # C.g:493:4: exclusive_or_expression ( '|' exclusive_or_expression )*
-                self.following.append(self.FOLLOW_exclusive_or_expression_in_inclusive_or_expression1899)
+                self.following.append(
+                    self.FOLLOW_exclusive_or_expression_in_inclusive_or_expression1899)
                 self.exclusive_or_expression()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:493:28: ( '|' exclusive_or_expression )*
-                while True: #loop78
+                while True:  # loop78
                     alt78 = 2
                     LA78_0 = self.input.LA(1)
 
-                    if (LA78_0 == 93) :
+                    if (LA78_0 == 93):
                         alt78 = 1
 
-
                     if alt78 == 1:
                         # C.g:493:29: '|' exclusive_or_expression
-                        self.match(self.input, 93, self.FOLLOW_93_in_inclusive_or_expression1902)
+                        self.match(
+                            self.input, 93, self.FOLLOW_93_in_inclusive_or_expression1902)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_exclusive_or_expression_in_inclusive_or_expression1904)
+                        self.following.append(
+                            self.FOLLOW_exclusive_or_expression_in_inclusive_or_expression1904)
                         self.exclusive_or_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop78
-
-
-
-
-
+                        break  # loop78
 
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
             if self.backtracking > 0:
-                self.memoize(self.input, 54, inclusive_or_expression_StartIndex)
+                self.memoize(self.input, 54,
+                             inclusive_or_expression_StartIndex)
 
             pass
 
@@ -8484,9 +8367,9 @@ class CParser(Parser):
 
     # $ANTLR end inclusive_or_expression
 
-
     # $ANTLR start exclusive_or_expression
     # C.g:496:1: exclusive_or_expression : and_expression ( '^' and_expression )* ;
+
     def exclusive_or_expression(self, ):
 
         exclusive_or_expression_StartIndex = self.input.index()
@@ -8497,46 +8380,43 @@ class CParser(Parser):
 
                 # C.g:497:2: ( and_expression ( '^' and_expression )* )
                 # C.g:497:4: and_expression ( '^' and_expression )*
-                self.following.append(self.FOLLOW_and_expression_in_exclusive_or_expression1917)
+                self.following.append(
+                    self.FOLLOW_and_expression_in_exclusive_or_expression1917)
                 self.and_expression()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:497:19: ( '^' and_expression )*
-                while True: #loop79
+                while True:  # loop79
                     alt79 = 2
                     LA79_0 = self.input.LA(1)
 
-                    if (LA79_0 == 94) :
+                    if (LA79_0 == 94):
                         alt79 = 1
 
-
                     if alt79 == 1:
                         # C.g:497:20: '^' and_expression
-                        self.match(self.input, 94, self.FOLLOW_94_in_exclusive_or_expression1920)
+                        self.match(
+                            self.input, 94, self.FOLLOW_94_in_exclusive_or_expression1920)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_and_expression_in_exclusive_or_expression1922)
+                        self.following.append(
+                            self.FOLLOW_and_expression_in_exclusive_or_expression1922)
                         self.and_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop79
-
-
-
-
-
+                        break  # loop79
 
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
             if self.backtracking > 0:
-                self.memoize(self.input, 55, exclusive_or_expression_StartIndex)
+                self.memoize(self.input, 55,
+                             exclusive_or_expression_StartIndex)
 
             pass
 
@@ -8544,9 +8424,9 @@ class CParser(Parser):
 
     # $ANTLR end exclusive_or_expression
 
-
     # $ANTLR start and_expression
     # C.g:500:1: and_expression : equality_expression ( '&' equality_expression )* ;
+
     def and_expression(self, ):
 
         and_expression_StartIndex = self.input.index()
@@ -8557,39 +8437,35 @@ class CParser(Parser):
 
                 # C.g:501:2: ( equality_expression ( '&' equality_expression )* )
                 # C.g:501:4: equality_expression ( '&' equality_expression )*
-                self.following.append(self.FOLLOW_equality_expression_in_and_expression1935)
+                self.following.append(
+                    self.FOLLOW_equality_expression_in_and_expression1935)
                 self.equality_expression()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:501:24: ( '&' equality_expression )*
-                while True: #loop80
+                while True:  # loop80
                     alt80 = 2
                     LA80_0 = self.input.LA(1)
 
-                    if (LA80_0 == 77) :
+                    if (LA80_0 == 77):
                         alt80 = 1
 
-
                     if alt80 == 1:
                         # C.g:501:25: '&' equality_expression
-                        self.match(self.input, 77, self.FOLLOW_77_in_and_expression1938)
+                        self.match(self.input, 77,
+                                   self.FOLLOW_77_in_and_expression1938)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_equality_expression_in_and_expression1940)
+                        self.following.append(
+                            self.FOLLOW_equality_expression_in_and_expression1940)
                         self.equality_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop80
-
-
-
-
-
+                        break  # loop80
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -8604,9 +8480,9 @@ class CParser(Parser):
 
     # $ANTLR end and_expression
 
-
     # $ANTLR start equality_expression
     # C.g:503:1: equality_expression : relational_expression ( ( '==' | '!=' ) relational_expression )* ;
+
     def equality_expression(self, ):
 
         equality_expression_StartIndex = self.input.index()
@@ -8617,24 +8493,24 @@ class CParser(Parser):
 
                 # C.g:504:2: ( relational_expression ( ( '==' | '!=' ) relational_expression )* )
                 # C.g:504:4: relational_expression ( ( '==' | '!=' ) relational_expression )*
-                self.following.append(self.FOLLOW_relational_expression_in_equality_expression1952)
+                self.following.append(
+                    self.FOLLOW_relational_expression_in_equality_expression1952)
                 self.relational_expression()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:504:26: ( ( '==' | '!=' ) relational_expression )*
-                while True: #loop81
+                while True:  # loop81
                     alt81 = 2
                     LA81_0 = self.input.LA(1)
 
-                    if ((95 <= LA81_0 <= 96)) :
+                    if ((95 <= LA81_0 <= 96)):
                         alt81 = 1
 
-
                     if alt81 == 1:
                         # C.g:504:27: ( '==' | '!=' ) relational_expression
                         if (95 <= self.input.LA(1) <= 96):
-                            self.input.consume();
+                            self.input.consume()
                             self.errorRecovery = False
                             self.failed = False
 
@@ -8646,24 +8522,18 @@ class CParser(Parser):
                             mse = MismatchedSetException(None, self.input)
                             self.recoverFromMismatchedSet(
                                 self.input, mse, self.FOLLOW_set_in_equality_expression1955
-                                )
+                            )
                             raise mse
 
-
-                        self.following.append(self.FOLLOW_relational_expression_in_equality_expression1961)
+                        self.following.append(
+                            self.FOLLOW_relational_expression_in_equality_expression1961)
                         self.relational_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop81
-
-
-
-
-
+                        break  # loop81
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -8678,9 +8548,9 @@ class CParser(Parser):
 
     # $ANTLR end equality_expression
 
-
     # $ANTLR start relational_expression
     # C.g:507:1: relational_expression : shift_expression ( ( '<' | '>' | '<=' | '>=' ) shift_expression )* ;
+
     def relational_expression(self, ):
 
         relational_expression_StartIndex = self.input.index()
@@ -8691,24 +8561,24 @@ class CParser(Parser):
 
                 # C.g:508:2: ( shift_expression ( ( '<' | '>' | '<=' | '>=' ) shift_expression )* )
                 # C.g:508:4: shift_expression ( ( '<' | '>' | '<=' | '>=' ) shift_expression )*
-                self.following.append(self.FOLLOW_shift_expression_in_relational_expression1975)
+                self.following.append(
+                    self.FOLLOW_shift_expression_in_relational_expression1975)
                 self.shift_expression()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:508:21: ( ( '<' | '>' | '<=' | '>=' ) shift_expression )*
-                while True: #loop82
+                while True:  # loop82
                     alt82 = 2
                     LA82_0 = self.input.LA(1)
 
-                    if ((97 <= LA82_0 <= 100)) :
+                    if ((97 <= LA82_0 <= 100)):
                         alt82 = 1
 
-
                     if alt82 == 1:
                         # C.g:508:22: ( '<' | '>' | '<=' | '>=' ) shift_expression
                         if (97 <= self.input.LA(1) <= 100):
-                            self.input.consume();
+                            self.input.consume()
                             self.errorRecovery = False
                             self.failed = False
 
@@ -8720,24 +8590,18 @@ class CParser(Parser):
                             mse = MismatchedSetException(None, self.input)
                             self.recoverFromMismatchedSet(
                                 self.input, mse, self.FOLLOW_set_in_relational_expression1978
-                                )
+                            )
                             raise mse
 
-
-                        self.following.append(self.FOLLOW_shift_expression_in_relational_expression1988)
+                        self.following.append(
+                            self.FOLLOW_shift_expression_in_relational_expression1988)
                         self.shift_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop82
-
-
-
-
-
+                        break  # loop82
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -8752,9 +8616,9 @@ class CParser(Parser):
 
     # $ANTLR end relational_expression
 
-
     # $ANTLR start shift_expression
     # C.g:511:1: shift_expression : additive_expression ( ( '<<' | '>>' ) additive_expression )* ;
+
     def shift_expression(self, ):
 
         shift_expression_StartIndex = self.input.index()
@@ -8765,24 +8629,24 @@ class CParser(Parser):
 
                 # C.g:512:2: ( additive_expression ( ( '<<' | '>>' ) additive_expression )* )
                 # C.g:512:4: additive_expression ( ( '<<' | '>>' ) additive_expression )*
-                self.following.append(self.FOLLOW_additive_expression_in_shift_expression2001)
+                self.following.append(
+                    self.FOLLOW_additive_expression_in_shift_expression2001)
                 self.additive_expression()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:512:24: ( ( '<<' | '>>' ) additive_expression )*
-                while True: #loop83
+                while True:  # loop83
                     alt83 = 2
                     LA83_0 = self.input.LA(1)
 
-                    if ((101 <= LA83_0 <= 102)) :
+                    if ((101 <= LA83_0 <= 102)):
                         alt83 = 1
 
-
                     if alt83 == 1:
                         # C.g:512:25: ( '<<' | '>>' ) additive_expression
                         if (101 <= self.input.LA(1) <= 102):
-                            self.input.consume();
+                            self.input.consume()
                             self.errorRecovery = False
                             self.failed = False
 
@@ -8794,24 +8658,18 @@ class CParser(Parser):
                             mse = MismatchedSetException(None, self.input)
                             self.recoverFromMismatchedSet(
                                 self.input, mse, self.FOLLOW_set_in_shift_expression2004
-                                )
+                            )
                             raise mse
 
-
-                        self.following.append(self.FOLLOW_additive_expression_in_shift_expression2010)
+                        self.following.append(
+                            self.FOLLOW_additive_expression_in_shift_expression2010)
                         self.additive_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop83
-
-
-
-
-
+                        break  # loop83
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -8826,9 +8684,9 @@ class CParser(Parser):
 
     # $ANTLR end shift_expression
 
-
     # $ANTLR start statement
     # C.g:517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );
+
     def statement(self, ):
 
         statement_StartIndex = self.input.index()
@@ -8845,20 +8703,21 @@ class CParser(Parser):
                     if LA84 == 62:
                         LA84_43 = self.input.LA(3)
 
-                        if (self.synpred169()) :
+                        if (self.synpred169()):
                             alt84 = 3
-                        elif (self.synpred173()) :
+                        elif (self.synpred173()):
                             alt84 = 7
-                        elif (self.synpred174()) :
+                        elif (self.synpred174()):
                             alt84 = 8
-                        elif (True) :
+                        elif (True):
                             alt84 = 11
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 43, self.input)
+                            nvae = NoViableAltException(
+                                "517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 43, self.input)
 
                             raise nvae
 
@@ -8869,48 +8728,51 @@ class CParser(Parser):
                     elif LA84 == 66:
                         LA84_47 = self.input.LA(3)
 
-                        if (self.synpred169()) :
+                        if (self.synpred169()):
                             alt84 = 3
-                        elif (True) :
+                        elif (True):
                             alt84 = 11
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 47, self.input)
+                            nvae = NoViableAltException(
+                                "517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 47, self.input)
 
                             raise nvae
 
                     elif LA84 == IDENTIFIER:
                         LA84_53 = self.input.LA(3)
 
-                        if (self.synpred169()) :
+                        if (self.synpred169()):
                             alt84 = 3
-                        elif (True) :
+                        elif (True):
                             alt84 = 11
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 53, self.input)
+                            nvae = NoViableAltException(
+                                "517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 53, self.input)
 
                             raise nvae
 
                     elif LA84 == 25:
                         LA84_68 = self.input.LA(3)
 
-                        if (self.synpred169()) :
+                        if (self.synpred169()):
                             alt84 = 3
-                        elif (True) :
+                        elif (True):
                             alt84 = 11
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 68, self.input)
+                            nvae = NoViableAltException(
+                                "517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 68, self.input)
 
                             raise nvae
 
@@ -8921,7 +8783,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 1, self.input)
+                        nvae = NoViableAltException(
+                            "517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 1, self.input)
 
                         raise nvae
 
@@ -8950,110 +8813,110 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 0, self.input)
+                    nvae = NoViableAltException(
+                        "517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 0, self.input)
 
                     raise nvae
 
                 if alt84 == 1:
                     # C.g:518:4: labeled_statement
-                    self.following.append(self.FOLLOW_labeled_statement_in_statement2025)
+                    self.following.append(
+                        self.FOLLOW_labeled_statement_in_statement2025)
                     self.labeled_statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt84 == 2:
                     # C.g:519:4: compound_statement
-                    self.following.append(self.FOLLOW_compound_statement_in_statement2030)
+                    self.following.append(
+                        self.FOLLOW_compound_statement_in_statement2030)
                     self.compound_statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt84 == 3:
                     # C.g:520:4: expression_statement
-                    self.following.append(self.FOLLOW_expression_statement_in_statement2035)
+                    self.following.append(
+                        self.FOLLOW_expression_statement_in_statement2035)
                     self.expression_statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt84 == 4:
                     # C.g:521:4: selection_statement
-                    self.following.append(self.FOLLOW_selection_statement_in_statement2040)
+                    self.following.append(
+                        self.FOLLOW_selection_statement_in_statement2040)
                     self.selection_statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt84 == 5:
                     # C.g:522:4: iteration_statement
-                    self.following.append(self.FOLLOW_iteration_statement_in_statement2045)
+                    self.following.append(
+                        self.FOLLOW_iteration_statement_in_statement2045)
                     self.iteration_statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt84 == 6:
                     # C.g:523:4: jump_statement
-                    self.following.append(self.FOLLOW_jump_statement_in_statement2050)
+                    self.following.append(
+                        self.FOLLOW_jump_statement_in_statement2050)
                     self.jump_statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt84 == 7:
                     # C.g:524:4: macro_statement
-                    self.following.append(self.FOLLOW_macro_statement_in_statement2055)
+                    self.following.append(
+                        self.FOLLOW_macro_statement_in_statement2055)
                     self.macro_statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt84 == 8:
                     # C.g:525:4: asm2_statement
-                    self.following.append(self.FOLLOW_asm2_statement_in_statement2060)
+                    self.following.append(
+                        self.FOLLOW_asm2_statement_in_statement2060)
                     self.asm2_statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt84 == 9:
                     # C.g:526:4: asm1_statement
-                    self.following.append(self.FOLLOW_asm1_statement_in_statement2065)
+                    self.following.append(
+                        self.FOLLOW_asm1_statement_in_statement2065)
                     self.asm1_statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt84 == 10:
                     # C.g:527:4: asm_statement
-                    self.following.append(self.FOLLOW_asm_statement_in_statement2070)
+                    self.following.append(
+                        self.FOLLOW_asm_statement_in_statement2070)
                     self.asm_statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt84 == 11:
                     # C.g:528:4: declaration
-                    self.following.append(self.FOLLOW_declaration_in_statement2075)
+                    self.following.append(
+                        self.FOLLOW_declaration_in_statement2075)
                     self.declaration()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -9067,9 +8930,9 @@ class CParser(Parser):
 
     # $ANTLR end statement
 
-
     # $ANTLR start asm2_statement
     # C.g:531:1: asm2_statement : ( '__asm__' )? IDENTIFIER '(' (~ ( ';' ) )* ')' ';' ;
+
     def asm2_statement(self, ):
 
         asm2_statement_StartIndex = self.input.index()
@@ -9084,42 +8947,41 @@ class CParser(Parser):
                 alt85 = 2
                 LA85_0 = self.input.LA(1)
 
-                if (LA85_0 == 103) :
+                if (LA85_0 == 103):
                     alt85 = 1
                 if alt85 == 1:
                     # C.g:0:0: '__asm__'
-                    self.match(self.input, 103, self.FOLLOW_103_in_asm2_statement2086)
+                    self.match(self.input, 103,
+                               self.FOLLOW_103_in_asm2_statement2086)
                     if self.failed:
                         return
 
-
-
-                self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_asm2_statement2089)
+                self.match(self.input, IDENTIFIER,
+                           self.FOLLOW_IDENTIFIER_in_asm2_statement2089)
                 if self.failed:
                     return
-                self.match(self.input, 62, self.FOLLOW_62_in_asm2_statement2091)
+                self.match(self.input, 62,
+                           self.FOLLOW_62_in_asm2_statement2091)
                 if self.failed:
                     return
                 # C.g:532:30: (~ ( ';' ) )*
-                while True: #loop86
+                while True:  # loop86
                     alt86 = 2
                     LA86_0 = self.input.LA(1)
 
-                    if (LA86_0 == 63) :
+                    if (LA86_0 == 63):
                         LA86_1 = self.input.LA(2)
 
-                        if ((IDENTIFIER <= LA86_1 <= LINE_COMMAND) or (26 <= LA86_1 <= 117)) :
+                        if ((IDENTIFIER <= LA86_1 <= LINE_COMMAND) or (26 <= LA86_1 <= 117)):
                             alt86 = 1
 
-
-                    elif ((IDENTIFIER <= LA86_0 <= LINE_COMMAND) or (26 <= LA86_0 <= 62) or (64 <= LA86_0 <= 117)) :
+                    elif ((IDENTIFIER <= LA86_0 <= LINE_COMMAND) or (26 <= LA86_0 <= 62) or (64 <= LA86_0 <= 117)):
                         alt86 = 1
 
-
                     if alt86 == 1:
                         # C.g:532:31: ~ ( ';' )
                         if (IDENTIFIER <= self.input.LA(1) <= LINE_COMMAND) or (26 <= self.input.LA(1) <= 117):
-                            self.input.consume();
+                            self.input.consume()
                             self.errorRecovery = False
                             self.failed = False
 
@@ -9131,26 +8993,21 @@ class CParser(Parser):
                             mse = MismatchedSetException(None, self.input)
                             self.recoverFromMismatchedSet(
                                 self.input, mse, self.FOLLOW_set_in_asm2_statement2094
-                                )
+                            )
                             raise mse
 
-
-
-
                     else:
-                        break #loop86
+                        break  # loop86
 
-
-                self.match(self.input, 63, self.FOLLOW_63_in_asm2_statement2101)
+                self.match(self.input, 63,
+                           self.FOLLOW_63_in_asm2_statement2101)
                 if self.failed:
                     return
-                self.match(self.input, 25, self.FOLLOW_25_in_asm2_statement2103)
+                self.match(self.input, 25,
+                           self.FOLLOW_25_in_asm2_statement2103)
                 if self.failed:
                     return
 
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -9164,9 +9021,9 @@ class CParser(Parser):
 
     # $ANTLR end asm2_statement
 
-
     # $ANTLR start asm1_statement
     # C.g:535:1: asm1_statement : '_asm' '{' (~ ( '}' ) )* '}' ;
+
     def asm1_statement(self, ):
 
         asm1_statement_StartIndex = self.input.index()
@@ -9177,25 +9034,26 @@ class CParser(Parser):
 
                 # C.g:536:2: ( '_asm' '{' (~ ( '}' ) )* '}' )
                 # C.g:536:4: '_asm' '{' (~ ( '}' ) )* '}'
-                self.match(self.input, 104, self.FOLLOW_104_in_asm1_statement2115)
+                self.match(self.input, 104,
+                           self.FOLLOW_104_in_asm1_statement2115)
                 if self.failed:
                     return
-                self.match(self.input, 43, self.FOLLOW_43_in_asm1_statement2117)
+                self.match(self.input, 43,
+                           self.FOLLOW_43_in_asm1_statement2117)
                 if self.failed:
                     return
                 # C.g:536:15: (~ ( '}' ) )*
-                while True: #loop87
+                while True:  # loop87
                     alt87 = 2
                     LA87_0 = self.input.LA(1)
 
-                    if ((IDENTIFIER <= LA87_0 <= 43) or (45 <= LA87_0 <= 117)) :
+                    if ((IDENTIFIER <= LA87_0 <= 43) or (45 <= LA87_0 <= 117)):
                         alt87 = 1
 
-
                     if alt87 == 1:
                         # C.g:536:16: ~ ( '}' )
                         if (IDENTIFIER <= self.input.LA(1) <= 43) or (45 <= self.input.LA(1) <= 117):
-                            self.input.consume();
+                            self.input.consume()
                             self.errorRecovery = False
                             self.failed = False
 
@@ -9207,23 +9065,17 @@ class CParser(Parser):
                             mse = MismatchedSetException(None, self.input)
                             self.recoverFromMismatchedSet(
                                 self.input, mse, self.FOLLOW_set_in_asm1_statement2120
-                                )
+                            )
                             raise mse
 
-
-
-
                     else:
-                        break #loop87
+                        break  # loop87
 
-
-                self.match(self.input, 44, self.FOLLOW_44_in_asm1_statement2127)
+                self.match(self.input, 44,
+                           self.FOLLOW_44_in_asm1_statement2127)
                 if self.failed:
                     return
 
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -9237,9 +9089,9 @@ class CParser(Parser):
 
     # $ANTLR end asm1_statement
 
-
     # $ANTLR start asm_statement
     # C.g:539:1: asm_statement : '__asm' '{' (~ ( '}' ) )* '}' ;
+
     def asm_statement(self, ):
 
         asm_statement_StartIndex = self.input.index()
@@ -9250,25 +9102,25 @@ class CParser(Parser):
 
                 # C.g:540:2: ( '__asm' '{' (~ ( '}' ) )* '}' )
                 # C.g:540:4: '__asm' '{' (~ ( '}' ) )* '}'
-                self.match(self.input, 105, self.FOLLOW_105_in_asm_statement2138)
+                self.match(self.input, 105,
+                           self.FOLLOW_105_in_asm_statement2138)
                 if self.failed:
                     return
                 self.match(self.input, 43, self.FOLLOW_43_in_asm_statement2140)
                 if self.failed:
                     return
                 # C.g:540:16: (~ ( '}' ) )*
-                while True: #loop88
+                while True:  # loop88
                     alt88 = 2
                     LA88_0 = self.input.LA(1)
 
-                    if ((IDENTIFIER <= LA88_0 <= 43) or (45 <= LA88_0 <= 117)) :
+                    if ((IDENTIFIER <= LA88_0 <= 43) or (45 <= LA88_0 <= 117)):
                         alt88 = 1
 
-
                     if alt88 == 1:
                         # C.g:540:17: ~ ( '}' )
                         if (IDENTIFIER <= self.input.LA(1) <= 43) or (45 <= self.input.LA(1) <= 117):
-                            self.input.consume();
+                            self.input.consume()
                             self.errorRecovery = False
                             self.failed = False
 
@@ -9280,23 +9132,16 @@ class CParser(Parser):
                             mse = MismatchedSetException(None, self.input)
                             self.recoverFromMismatchedSet(
                                 self.input, mse, self.FOLLOW_set_in_asm_statement2143
-                                )
+                            )
                             raise mse
 
-
-
-
                     else:
-                        break #loop88
-
+                        break  # loop88
 
                 self.match(self.input, 44, self.FOLLOW_44_in_asm_statement2150)
                 if self.failed:
                     return
 
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -9310,9 +9155,9 @@ class CParser(Parser):
 
     # $ANTLR end asm_statement
 
-
     # $ANTLR start macro_statement
     # C.g:543:1: macro_statement : IDENTIFIER '(' ( declaration )* ( statement_list )? ( expression )? ')' ;
+
     def macro_statement(self, ):
 
         macro_statement_StartIndex = self.input.index()
@@ -9323,14 +9168,16 @@ class CParser(Parser):
 
                 # C.g:544:2: ( IDENTIFIER '(' ( declaration )* ( statement_list )? ( expression )? ')' )
                 # C.g:544:4: IDENTIFIER '(' ( declaration )* ( statement_list )? ( expression )? ')'
-                self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_macro_statement2162)
+                self.match(self.input, IDENTIFIER,
+                           self.FOLLOW_IDENTIFIER_in_macro_statement2162)
                 if self.failed:
                     return
-                self.match(self.input, 62, self.FOLLOW_62_in_macro_statement2164)
+                self.match(self.input, 62,
+                           self.FOLLOW_62_in_macro_statement2164)
                 if self.failed:
                     return
                 # C.g:544:19: ( declaration )*
-                while True: #loop89
+                while True:  # loop89
                     alt89 = 2
                     LA89 = self.input.LA(1)
                     if LA89 == IDENTIFIER:
@@ -9338,1904 +9185,1622 @@ class CParser(Parser):
                         if LA89 == 62:
                             LA89_45 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_47 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 66:
                             LA89_50 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_68 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_71 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_72 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_73 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_74 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_75 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_76 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_77 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_78 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_79 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_80 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_81 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_82 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_83 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_84 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_85 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_86 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 26:
                         LA89 = self.input.LA(2)
                         if LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_87 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_88 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_89 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_90 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_91 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_92 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_93 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_94 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_95 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_96 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_97 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_98 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_99 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_100 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 66:
                             LA89_101 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_102 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_103 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_104 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_105 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                         LA89 = self.input.LA(2)
                         if LA89 == 66:
                             LA89_106 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_107 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_108 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_109 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_110 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_111 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_112 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_113 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_114 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_115 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_116 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_117 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_118 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_119 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_120 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_121 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_122 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_123 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_124 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_125 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 34:
                         LA89 = self.input.LA(2)
                         if LA89 == 66:
                             LA89_126 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_127 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_128 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_129 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_130 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_131 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_132 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_133 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_134 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_135 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_136 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_137 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_138 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_139 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_140 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_141 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_142 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_143 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_144 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_145 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 35:
                         LA89 = self.input.LA(2)
                         if LA89 == 66:
                             LA89_146 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_147 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_148 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_149 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_150 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_151 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_152 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_153 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_154 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_155 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_156 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_157 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_158 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_159 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_160 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_161 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_162 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_163 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_164 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_165 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 36:
                         LA89 = self.input.LA(2)
                         if LA89 == 66:
                             LA89_166 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_167 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_168 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_169 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_170 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_171 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_172 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_173 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_174 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_175 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_176 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_177 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_178 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_179 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_180 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_181 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_182 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_183 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_184 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_185 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 37:
                         LA89 = self.input.LA(2)
                         if LA89 == 66:
                             LA89_186 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_187 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_188 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_189 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_190 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_191 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_192 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_193 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_194 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_195 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_196 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_197 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_198 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_199 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_200 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_201 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_202 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_203 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_204 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_205 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 38:
                         LA89 = self.input.LA(2)
                         if LA89 == 66:
                             LA89_206 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_207 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_208 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_209 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_210 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_211 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_212 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_213 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_214 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_215 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_216 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_217 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_218 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_219 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_220 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_221 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_222 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_223 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_224 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_225 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 39:
                         LA89 = self.input.LA(2)
                         if LA89 == 66:
                             LA89_226 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_227 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_228 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_229 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_230 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_231 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_232 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_233 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_234 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_235 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_236 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_237 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_238 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_239 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_240 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_241 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_242 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_243 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_244 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_245 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 40:
                         LA89 = self.input.LA(2)
                         if LA89 == 66:
                             LA89_246 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_247 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_248 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_249 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_250 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_251 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_252 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_253 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_254 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_255 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_256 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_257 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_258 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_259 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_260 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_261 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_262 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_263 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_264 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_265 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 41:
                         LA89 = self.input.LA(2)
                         if LA89 == 66:
                             LA89_266 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_267 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_268 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_269 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_270 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_271 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_272 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_273 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_274 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_275 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_276 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_277 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_278 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_279 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_280 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_281 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_282 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_283 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_284 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_285 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 42:
                         LA89 = self.input.LA(2)
                         if LA89 == 66:
                             LA89_286 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_287 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_288 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_289 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_290 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_291 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_292 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_293 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_294 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_295 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_296 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_297 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_298 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_299 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_300 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_301 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_302 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_303 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_304 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_305 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 45 or LA89 == 46:
                         LA89_40 = self.input.LA(2)
 
-                        if (LA89_40 == IDENTIFIER) :
+                        if (LA89_40 == IDENTIFIER):
                             LA89_306 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-                        elif (LA89_40 == 43) :
+                        elif (LA89_40 == 43):
                             LA89_307 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
-
                     elif LA89 == 48:
                         LA89_41 = self.input.LA(2)
 
-                        if (LA89_41 == 43) :
+                        if (LA89_41 == 43):
                             LA89_308 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-                        elif (LA89_41 == IDENTIFIER) :
+                        elif (LA89_41 == IDENTIFIER):
                             LA89_309 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
-
                     elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 58 or LA89 == 59 or LA89 == 60 or LA89 == 61:
                         LA89 = self.input.LA(2)
                         if LA89 == 66:
                             LA89_310 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_311 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_312 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_313 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_314 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_315 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_316 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_317 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_318 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_319 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_320 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_321 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_322 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_323 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_324 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_325 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_326 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_327 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_328 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_329 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
-
                     if alt89 == 1:
                         # C.g:0:0: declaration
-                        self.following.append(self.FOLLOW_declaration_in_macro_statement2166)
+                        self.following.append(
+                            self.FOLLOW_declaration_in_macro_statement2166)
                         self.declaration()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop89
-
+                        break  # loop89
 
                 # C.g:544:33: ( statement_list )?
                 alt90 = 2
@@ -11247,122 +10812,122 @@ class CParser(Parser):
                     elif LA90 == 62:
                         LA90_45 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == STRING_LITERAL:
                         LA90_46 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == IDENTIFIER:
                         LA90_47 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 64:
                         LA90_48 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 75:
                         LA90_49 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66:
                         LA90_50 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 76:
                         LA90_51 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_52 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_53 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 70:
                         LA90_54 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 71:
                         LA90_55 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 68:
                         LA90_56 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 69:
                         LA90_57 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 101 or LA90 == 102:
                         LA90_58 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 97 or LA90 == 98 or LA90 == 99 or LA90 == 100:
                         LA90_59 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 95 or LA90 == 96:
                         LA90_60 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 77:
                         LA90_61 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 94:
                         LA90_62 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 93:
                         LA90_63 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 92:
                         LA90_64 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 91:
                         LA90_65 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 90:
                         LA90_66 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 27:
                         LA90_67 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 28 or LA90 == 80 or LA90 == 81 or LA90 == 82 or LA90 == 83 or LA90 == 84 or LA90 == 85 or LA90 == 86 or LA90 == 87 or LA90 == 88 or LA90 == 89:
                         LA90_70 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                 elif LA90 == 25 or LA90 == 26 or LA90 == 29 or LA90 == 30 or LA90 == 31 or LA90 == 32 or LA90 == 33 or LA90 == 34 or LA90 == 35 or LA90 == 36 or LA90 == 37 or LA90 == 38 or LA90 == 39 or LA90 == 40 or LA90 == 41 or LA90 == 42 or LA90 == 43 or LA90 == 45 or LA90 == 46 or LA90 == 48 or LA90 == 49 or LA90 == 50 or LA90 == 51 or LA90 == 52 or LA90 == 53 or LA90 == 54 or LA90 == 55 or LA90 == 56 or LA90 == 57 or LA90 == 58 or LA90 == 59 or LA90 == 60 or LA90 == 61 or LA90 == 103 or LA90 == 104 or LA90 == 105 or LA90 == 106 or LA90 == 107 or LA90 == 108 or LA90 == 110 or LA90 == 111 or LA90 == 112 or LA90 == 113 or LA90 == 114 or LA90 == 115 or LA90 == 116 or LA90 == 117:
                     alt90 = 1
@@ -11371,112 +10936,112 @@ class CParser(Parser):
                     if LA90 == 64:
                         LA90_87 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 62:
                         LA90_88 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 75:
                         LA90_89 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66:
                         LA90_90 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 76:
                         LA90_91 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_92 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_93 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 28 or LA90 == 80 or LA90 == 81 or LA90 == 82 or LA90 == 83 or LA90 == 84 or LA90 == 85 or LA90 == 86 or LA90 == 87 or LA90 == 88 or LA90 == 89:
                         LA90_94 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 70:
                         LA90_95 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 71:
                         LA90_96 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 68:
                         LA90_97 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 69:
                         LA90_98 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 101 or LA90 == 102:
                         LA90_99 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 97 or LA90 == 98 or LA90 == 99 or LA90 == 100:
                         LA90_100 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 95 or LA90 == 96:
                         LA90_101 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 77:
                         LA90_102 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 94:
                         LA90_103 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 93:
                         LA90_104 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 92:
                         LA90_105 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 91:
                         LA90_106 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 90:
                         LA90_107 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 27:
                         LA90_108 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 25:
                         alt90 = 1
@@ -11485,226 +11050,226 @@ class CParser(Parser):
                     if LA90 == 64:
                         LA90_111 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 62:
                         LA90_112 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 75:
                         LA90_113 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66:
                         LA90_114 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 76:
                         LA90_115 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_116 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_117 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 70:
                         LA90_118 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 71:
                         LA90_119 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 68:
                         LA90_120 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 69:
                         LA90_121 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 101 or LA90 == 102:
                         LA90_122 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 97 or LA90 == 98 or LA90 == 99 or LA90 == 100:
                         LA90_123 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 95 or LA90 == 96:
                         LA90_124 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 77:
                         LA90_125 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 94:
                         LA90_126 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 93:
                         LA90_127 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 92:
                         LA90_128 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 91:
                         LA90_129 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 90:
                         LA90_130 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 27:
                         LA90_131 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 25:
                         alt90 = 1
                     elif LA90 == 28 or LA90 == 80 or LA90 == 81 or LA90 == 82 or LA90 == 83 or LA90 == 84 or LA90 == 85 or LA90 == 86 or LA90 == 87 or LA90 == 88 or LA90 == 89:
                         LA90_134 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                 elif LA90 == DECIMAL_LITERAL:
                     LA90 = self.input.LA(2)
                     if LA90 == 64:
                         LA90_135 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 62:
                         LA90_136 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 75:
                         LA90_137 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66:
                         LA90_138 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 76:
                         LA90_139 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_140 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_141 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 28 or LA90 == 80 or LA90 == 81 or LA90 == 82 or LA90 == 83 or LA90 == 84 or LA90 == 85 or LA90 == 86 or LA90 == 87 or LA90 == 88 or LA90 == 89:
                         LA90_142 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 70:
                         LA90_143 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 71:
                         LA90_144 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 68:
                         LA90_145 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 69:
                         LA90_146 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 101 or LA90 == 102:
                         LA90_147 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 97 or LA90 == 98 or LA90 == 99 or LA90 == 100:
                         LA90_148 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 95 or LA90 == 96:
                         LA90_149 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 77:
                         LA90_150 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 94:
                         LA90_151 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 93:
                         LA90_152 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 92:
                         LA90_153 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 91:
                         LA90_154 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 90:
                         LA90_155 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 27:
                         LA90_156 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 25:
                         alt90 = 1
@@ -11713,236 +11278,236 @@ class CParser(Parser):
                     if LA90 == 64:
                         LA90_159 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 62:
                         LA90_160 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 75:
                         LA90_161 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66:
                         LA90_162 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 76:
                         LA90_163 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_164 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_165 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 70:
                         LA90_166 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 71:
                         LA90_167 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 68:
                         LA90_168 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 69:
                         LA90_169 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 101 or LA90 == 102:
                         LA90_170 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 97 or LA90 == 98 or LA90 == 99 or LA90 == 100:
                         LA90_171 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 95 or LA90 == 96:
                         LA90_172 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 77:
                         LA90_173 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 94:
                         LA90_174 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 93:
                         LA90_175 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 92:
                         LA90_176 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 91:
                         LA90_177 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 90:
                         LA90_178 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 27:
                         LA90_179 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 25:
                         alt90 = 1
                     elif LA90 == 28 or LA90 == 80 or LA90 == 81 or LA90 == 82 or LA90 == 83 or LA90 == 84 or LA90 == 85 or LA90 == 86 or LA90 == 87 or LA90 == 88 or LA90 == 89:
                         LA90_181 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                 elif LA90 == STRING_LITERAL:
                     LA90 = self.input.LA(2)
                     if LA90 == IDENTIFIER:
                         LA90_183 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 64:
                         LA90_184 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 62:
                         LA90_185 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 75:
                         LA90_186 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66:
                         LA90_187 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 76:
                         LA90_188 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_189 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_190 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 28 or LA90 == 80 or LA90 == 81 or LA90 == 82 or LA90 == 83 or LA90 == 84 or LA90 == 85 or LA90 == 86 or LA90 == 87 or LA90 == 88 or LA90 == 89:
                         LA90_191 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == STRING_LITERAL:
                         LA90_192 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 70:
                         LA90_193 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 71:
                         LA90_194 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 68:
                         LA90_195 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 69:
                         LA90_196 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 101 or LA90 == 102:
                         LA90_197 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 97 or LA90 == 98 or LA90 == 99 or LA90 == 100:
                         LA90_198 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 95 or LA90 == 96:
                         LA90_199 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 77:
                         LA90_200 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 94:
                         LA90_201 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 93:
                         LA90_202 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 92:
                         LA90_203 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 91:
                         LA90_204 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 90:
                         LA90_205 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 27:
                         LA90_206 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 25:
                         alt90 = 1
@@ -11951,112 +11516,112 @@ class CParser(Parser):
                     if LA90 == 64:
                         LA90_209 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 62:
                         LA90_210 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 75:
                         LA90_211 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66:
                         LA90_212 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 76:
                         LA90_213 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_214 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_215 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 28 or LA90 == 80 or LA90 == 81 or LA90 == 82 or LA90 == 83 or LA90 == 84 or LA90 == 85 or LA90 == 86 or LA90 == 87 or LA90 == 88 or LA90 == 89:
                         LA90_216 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 70:
                         LA90_217 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 71:
                         LA90_218 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 68:
                         LA90_219 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 69:
                         LA90_220 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 101 or LA90 == 102:
                         LA90_221 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 97 or LA90 == 98 or LA90 == 99 or LA90 == 100:
                         LA90_222 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 95 or LA90 == 96:
                         LA90_223 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 77:
                         LA90_224 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 94:
                         LA90_225 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 93:
                         LA90_226 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 92:
                         LA90_227 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 91:
                         LA90_228 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 90:
                         LA90_229 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 27:
                         LA90_230 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 25:
                         alt90 = 1
@@ -12065,404 +11630,400 @@ class CParser(Parser):
                     if LA90 == IDENTIFIER:
                         LA90_233 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == HEX_LITERAL:
                         LA90_234 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == OCTAL_LITERAL:
                         LA90_235 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == DECIMAL_LITERAL:
                         LA90_236 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == CHARACTER_LITERAL:
                         LA90_237 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == STRING_LITERAL:
                         LA90_238 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == FLOATING_POINT_LITERAL:
                         LA90_239 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 62:
                         LA90_240 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_241 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_242 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66 or LA90 == 68 or LA90 == 69 or LA90 == 77 or LA90 == 78 or LA90 == 79:
                         LA90_243 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 74:
                         LA90_244 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 49 or LA90 == 50 or LA90 == 51 or LA90 == 52 or LA90 == 53 or LA90 == 54 or LA90 == 55 or LA90 == 56 or LA90 == 57 or LA90 == 58 or LA90 == 59 or LA90 == 60 or LA90 == 61:
                         LA90_245 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 34:
                         LA90_246 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 35:
                         LA90_247 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 36:
                         LA90_248 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 37:
                         LA90_249 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 38:
                         LA90_250 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 39:
                         LA90_251 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 40:
                         LA90_252 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 41:
                         LA90_253 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 42:
                         LA90_254 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 45 or LA90 == 46:
                         LA90_255 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 48:
                         LA90_256 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                 elif LA90 == 72:
                     LA90 = self.input.LA(2)
                     if LA90 == IDENTIFIER:
                         LA90_257 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == HEX_LITERAL:
                         LA90_258 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == OCTAL_LITERAL:
                         LA90_259 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == DECIMAL_LITERAL:
                         LA90_260 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == CHARACTER_LITERAL:
                         LA90_261 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == STRING_LITERAL:
                         LA90_262 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == FLOATING_POINT_LITERAL:
                         LA90_263 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 62:
                         LA90_264 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_265 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_266 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66 or LA90 == 68 or LA90 == 69 or LA90 == 77 or LA90 == 78 or LA90 == 79:
                         LA90_267 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 74:
                         LA90_268 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                 elif LA90 == 73:
                     LA90 = self.input.LA(2)
                     if LA90 == IDENTIFIER:
                         LA90_269 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == HEX_LITERAL:
                         LA90_270 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == OCTAL_LITERAL:
                         LA90_271 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == DECIMAL_LITERAL:
                         LA90_272 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == CHARACTER_LITERAL:
                         LA90_273 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == STRING_LITERAL:
                         LA90_274 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == FLOATING_POINT_LITERAL:
                         LA90_275 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 62:
                         LA90_276 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_277 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_278 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66 or LA90 == 68 or LA90 == 69 or LA90 == 77 or LA90 == 78 or LA90 == 79:
                         LA90_279 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 74:
                         LA90_280 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                 elif LA90 == 66 or LA90 == 68 or LA90 == 69 or LA90 == 77 or LA90 == 78 or LA90 == 79:
                     LA90 = self.input.LA(2)
                     if LA90 == 62:
                         LA90_281 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == IDENTIFIER:
                         LA90_282 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == HEX_LITERAL:
                         LA90_283 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == OCTAL_LITERAL:
                         LA90_284 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == DECIMAL_LITERAL:
                         LA90_285 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == CHARACTER_LITERAL:
                         LA90_286 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == STRING_LITERAL:
                         LA90_287 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == FLOATING_POINT_LITERAL:
                         LA90_288 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_289 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_290 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66 or LA90 == 68 or LA90 == 69 or LA90 == 77 or LA90 == 78 or LA90 == 79:
                         LA90_291 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 74:
                         LA90_292 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                 elif LA90 == 74:
                     LA90 = self.input.LA(2)
                     if LA90 == 62:
                         LA90_293 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == IDENTIFIER:
                         LA90_294 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == HEX_LITERAL:
                         LA90_295 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == OCTAL_LITERAL:
                         LA90_296 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == DECIMAL_LITERAL:
                         LA90_297 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == CHARACTER_LITERAL:
                         LA90_298 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == STRING_LITERAL:
                         LA90_299 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == FLOATING_POINT_LITERAL:
                         LA90_300 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_301 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_302 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66 or LA90 == 68 or LA90 == 69 or LA90 == 77 or LA90 == 78 or LA90 == 79:
                         LA90_303 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 74:
                         LA90_304 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                 if alt90 == 1:
                     # C.g:0:0: statement_list
-                    self.following.append(self.FOLLOW_statement_list_in_macro_statement2170)
+                    self.following.append(
+                        self.FOLLOW_statement_list_in_macro_statement2170)
                     self.statement_list()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
                 # C.g:544:49: ( expression )?
                 alt91 = 2
                 LA91_0 = self.input.LA(1)
 
-                if ((IDENTIFIER <= LA91_0 <= FLOATING_POINT_LITERAL) or LA91_0 == 62 or LA91_0 == 66 or (68 <= LA91_0 <= 69) or (72 <= LA91_0 <= 74) or (77 <= LA91_0 <= 79)) :
+                if ((IDENTIFIER <= LA91_0 <= FLOATING_POINT_LITERAL) or LA91_0 == 62 or LA91_0 == 66 or (68 <= LA91_0 <= 69) or (72 <= LA91_0 <= 74) or (77 <= LA91_0 <= 79)):
                     alt91 = 1
                 if alt91 == 1:
                     # C.g:0:0: expression
-                    self.following.append(self.FOLLOW_expression_in_macro_statement2173)
+                    self.following.append(
+                        self.FOLLOW_expression_in_macro_statement2173)
                     self.expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
-                self.match(self.input, 63, self.FOLLOW_63_in_macro_statement2176)
+                self.match(self.input, 63,
+                           self.FOLLOW_63_in_macro_statement2176)
                 if self.failed:
                     return
 
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -12476,9 +12037,9 @@ class CParser(Parser):
 
     # $ANTLR end macro_statement
 
-
     # $ANTLR start labeled_statement
     # C.g:547:1: labeled_statement : ( IDENTIFIER ':' statement | 'case' constant_expression ':' statement | 'default' ':' statement );
+
     def labeled_statement(self, ):
 
         labeled_statement_StartIndex = self.input.index()
@@ -12501,61 +12062,68 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("547:1: labeled_statement : ( IDENTIFIER ':' statement | 'case' constant_expression ':' statement | 'default' ':' statement );", 92, 0, self.input)
+                    nvae = NoViableAltException(
+                        "547:1: labeled_statement : ( IDENTIFIER ':' statement | 'case' constant_expression ':' statement | 'default' ':' statement );", 92, 0, self.input)
 
                     raise nvae
 
                 if alt92 == 1:
                     # C.g:548:4: IDENTIFIER ':' statement
-                    self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_labeled_statement2188)
+                    self.match(self.input, IDENTIFIER,
+                               self.FOLLOW_IDENTIFIER_in_labeled_statement2188)
                     if self.failed:
                         return
-                    self.match(self.input, 47, self.FOLLOW_47_in_labeled_statement2190)
+                    self.match(self.input, 47,
+                               self.FOLLOW_47_in_labeled_statement2190)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_statement_in_labeled_statement2192)
+                    self.following.append(
+                        self.FOLLOW_statement_in_labeled_statement2192)
                     self.statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt92 == 2:
                     # C.g:549:4: 'case' constant_expression ':' statement
-                    self.match(self.input, 106, self.FOLLOW_106_in_labeled_statement2197)
+                    self.match(self.input, 106,
+                               self.FOLLOW_106_in_labeled_statement2197)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_constant_expression_in_labeled_statement2199)
+                    self.following.append(
+                        self.FOLLOW_constant_expression_in_labeled_statement2199)
                     self.constant_expression()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 47, self.FOLLOW_47_in_labeled_statement2201)
+                    self.match(self.input, 47,
+                               self.FOLLOW_47_in_labeled_statement2201)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_statement_in_labeled_statement2203)
+                    self.following.append(
+                        self.FOLLOW_statement_in_labeled_statement2203)
                     self.statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt92 == 3:
                     # C.g:550:4: 'default' ':' statement
-                    self.match(self.input, 107, self.FOLLOW_107_in_labeled_statement2208)
+                    self.match(self.input, 107,
+                               self.FOLLOW_107_in_labeled_statement2208)
                     if self.failed:
                         return
-                    self.match(self.input, 47, self.FOLLOW_47_in_labeled_statement2210)
+                    self.match(self.input, 47,
+                               self.FOLLOW_47_in_labeled_statement2210)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_statement_in_labeled_statement2212)
+                    self.following.append(
+                        self.FOLLOW_statement_in_labeled_statement2212)
                     self.statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -12574,10 +12142,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start compound_statement
     # C.g:553:1: compound_statement : '{' ( declaration )* ( statement_list )? '}' ;
+
     def compound_statement(self, ):
 
         retval = self.compound_statement_return()
@@ -12590,11 +12157,12 @@ class CParser(Parser):
 
                 # C.g:554:2: ( '{' ( declaration )* ( statement_list )? '}' )
                 # C.g:554:4: '{' ( declaration )* ( statement_list )? '}'
-                self.match(self.input, 43, self.FOLLOW_43_in_compound_statement2223)
+                self.match(self.input, 43,
+                           self.FOLLOW_43_in_compound_statement2223)
                 if self.failed:
                     return retval
                 # C.g:554:8: ( declaration )*
-                while True: #loop93
+                while True:  # loop93
                     alt93 = 2
                     LA93 = self.input.LA(1)
                     if LA93 == IDENTIFIER:
@@ -12602,1930 +12170,1645 @@ class CParser(Parser):
                         if LA93 == 62:
                             LA93_44 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_47 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 66:
                             LA93_48 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_49 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_50 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_51 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_52 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_53 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_54 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_55 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_56 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_57 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_58 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_59 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_60 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_61 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_62 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_63 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_64 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_65 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 26:
                         LA93 = self.input.LA(2)
                         if LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_86 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_87 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_88 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_89 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_90 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_91 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_92 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_93 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_94 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_95 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_96 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_97 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_98 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_99 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 66:
                             LA93_100 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_101 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_102 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_103 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_104 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                         LA93 = self.input.LA(2)
                         if LA93 == 66:
                             LA93_105 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_106 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_107 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_108 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_109 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_110 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_111 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_112 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_113 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_114 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_115 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_116 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_117 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_118 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_119 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_120 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_121 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_122 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_123 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_124 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 34:
                         LA93 = self.input.LA(2)
                         if LA93 == 66:
                             LA93_125 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_126 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_127 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_128 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_129 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_130 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_131 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_132 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_133 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_134 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_135 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_136 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_137 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_138 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_139 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_140 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_141 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_142 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_143 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_144 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 35:
                         LA93 = self.input.LA(2)
                         if LA93 == 66:
                             LA93_145 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_146 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_147 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_148 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_149 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_150 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_151 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_152 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_153 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_154 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_155 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_156 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_157 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_158 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_159 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_160 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_161 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_162 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_163 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_164 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 36:
                         LA93 = self.input.LA(2)
                         if LA93 == 66:
                             LA93_165 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_166 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_167 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_168 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_169 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_170 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_171 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_172 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_173 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_174 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_175 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_176 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_177 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_178 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_179 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_180 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_181 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_182 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_183 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_184 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 37:
                         LA93 = self.input.LA(2)
                         if LA93 == 66:
                             LA93_185 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_186 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_187 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_188 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_189 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_190 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_191 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_192 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_193 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_194 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_195 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_196 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_197 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_198 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_199 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_200 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_201 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_202 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_203 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_204 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 38:
                         LA93 = self.input.LA(2)
                         if LA93 == 66:
                             LA93_205 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_206 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_207 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_208 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_209 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_210 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_211 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_212 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_213 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_214 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_215 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_216 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_217 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_218 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_219 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_220 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_221 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_222 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_223 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_224 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 39:
                         LA93 = self.input.LA(2)
                         if LA93 == 66:
                             LA93_225 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_226 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_227 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_228 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_229 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_230 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_231 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_232 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_233 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_234 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_235 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_236 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_237 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_238 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_239 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_240 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_241 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_242 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_243 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_244 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 40:
                         LA93 = self.input.LA(2)
                         if LA93 == 66:
                             LA93_245 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_246 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_247 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_248 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_249 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_250 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_251 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_252 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_253 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_254 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_255 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_256 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_257 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_258 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_259 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_260 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_261 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_262 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_263 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_264 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 41:
                         LA93 = self.input.LA(2)
                         if LA93 == 66:
                             LA93_265 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_266 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_267 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_268 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_269 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_270 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_271 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_272 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_273 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_274 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_275 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_276 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_277 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_278 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_279 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_280 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_281 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_282 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_283 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_284 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 42:
                         LA93 = self.input.LA(2)
                         if LA93 == 66:
                             LA93_285 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_286 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_287 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_288 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_289 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_290 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_291 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_292 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_293 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_294 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_295 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_296 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_297 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_298 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_299 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_300 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_301 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_302 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_303 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_304 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 45 or LA93 == 46:
                         LA93_40 = self.input.LA(2)
 
-                        if (LA93_40 == IDENTIFIER) :
+                        if (LA93_40 == IDENTIFIER):
                             LA93_305 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-                        elif (LA93_40 == 43) :
+                        elif (LA93_40 == 43):
                             LA93_306 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
-
                     elif LA93 == 48:
                         LA93_41 = self.input.LA(2)
 
-                        if (LA93_41 == 43) :
+                        if (LA93_41 == 43):
                             LA93_307 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-                        elif (LA93_41 == IDENTIFIER) :
+                        elif (LA93_41 == IDENTIFIER):
                             LA93_308 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
-
                     elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 58 or LA93 == 59 or LA93 == 60 or LA93 == 61:
                         LA93 = self.input.LA(2)
                         if LA93 == 66:
                             LA93_309 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_310 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_311 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_312 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_313 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_314 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_315 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_316 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_317 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_318 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_319 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_320 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_321 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_322 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_323 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_324 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_325 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_326 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_327 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_328 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
-
                     if alt93 == 1:
                         # C.g:0:0: declaration
-                        self.following.append(self.FOLLOW_declaration_in_compound_statement2225)
+                        self.following.append(
+                            self.FOLLOW_declaration_in_compound_statement2225)
                         self.declaration()
                         self.following.pop()
                         if self.failed:
                             return retval
 
-
                     else:
-                        break #loop93
-
+                        break  # loop93
 
                 # C.g:554:21: ( statement_list )?
                 alt94 = 2
                 LA94_0 = self.input.LA(1)
 
-                if ((IDENTIFIER <= LA94_0 <= FLOATING_POINT_LITERAL) or (25 <= LA94_0 <= 26) or (29 <= LA94_0 <= 43) or (45 <= LA94_0 <= 46) or (48 <= LA94_0 <= 62) or LA94_0 == 66 or (68 <= LA94_0 <= 69) or (72 <= LA94_0 <= 74) or (77 <= LA94_0 <= 79) or (103 <= LA94_0 <= 108) or (110 <= LA94_0 <= 117)) :
+                if ((IDENTIFIER <= LA94_0 <= FLOATING_POINT_LITERAL) or (25 <= LA94_0 <= 26) or (29 <= LA94_0 <= 43) or (45 <= LA94_0 <= 46) or (48 <= LA94_0 <= 62) or LA94_0 == 66 or (68 <= LA94_0 <= 69) or (72 <= LA94_0 <= 74) or (77 <= LA94_0 <= 79) or (103 <= LA94_0 <= 108) or (110 <= LA94_0 <= 117)):
                     alt94 = 1
                 if alt94 == 1:
                     # C.g:0:0: statement_list
-                    self.following.append(self.FOLLOW_statement_list_in_compound_statement2228)
+                    self.following.append(
+                        self.FOLLOW_statement_list_in_compound_statement2228)
                     self.statement_list()
                     self.following.pop()
                     if self.failed:
                         return retval
 
-
-
-                self.match(self.input, 44, self.FOLLOW_44_in_compound_statement2231)
+                self.match(self.input, 44,
+                           self.FOLLOW_44_in_compound_statement2231)
                 if self.failed:
                     return retval
 
-
-
                 retval.stop = self.input.LT(-1)
 
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -14539,9 +13822,9 @@ class CParser(Parser):
 
     # $ANTLR end compound_statement
 
-
     # $ANTLR start statement_list
     # C.g:557:1: statement_list : ( statement )+ ;
+
     def statement_list(self, ):
 
         statement_list_StartIndex = self.input.index()
@@ -14554,7 +13837,7 @@ class CParser(Parser):
                 # C.g:558:4: ( statement )+
                 # C.g:558:4: ( statement )+
                 cnt95 = 0
-                while True: #loop95
+                while True:  # loop95
                     alt95 = 2
                     LA95 = self.input.LA(1)
                     if LA95 == IDENTIFIER:
@@ -14562,330 +13845,283 @@ class CParser(Parser):
                         if LA95 == 62:
                             LA95_46 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 25 or LA95 == 29 or LA95 == 30 or LA95 == 31 or LA95 == 32 or LA95 == 33 or LA95 == 34 or LA95 == 35 or LA95 == 36 or LA95 == 37 or LA95 == 38 or LA95 == 39 or LA95 == 40 or LA95 == 41 or LA95 == 42 or LA95 == 45 or LA95 == 46 or LA95 == 47 or LA95 == 48 or LA95 == 49 or LA95 == 50 or LA95 == 51 or LA95 == 52 or LA95 == 53 or LA95 == 54 or LA95 == 55 or LA95 == 56 or LA95 == 57 or LA95 == 58 or LA95 == 59 or LA95 == 60 or LA95 == 61:
                             alt95 = 1
                         elif LA95 == STRING_LITERAL:
                             LA95_48 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == IDENTIFIER:
                             LA95_49 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 64:
                             LA95_50 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 75:
                             LA95_51 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66:
                             LA95_52 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 76:
                             LA95_53 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_54 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_55 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 70:
                             LA95_56 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 71:
                             LA95_57 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 68:
                             LA95_58 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 69:
                             LA95_59 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 101 or LA95 == 102:
                             LA95_60 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 97 or LA95 == 98 or LA95 == 99 or LA95 == 100:
                             LA95_61 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 95 or LA95 == 96:
                             LA95_62 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 77:
                             LA95_63 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 94:
                             LA95_64 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 93:
                             LA95_65 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 92:
                             LA95_66 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 91:
                             LA95_67 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 90:
                             LA95_68 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 27:
                             LA95_69 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 28 or LA95 == 80 or LA95 == 81 or LA95 == 82 or LA95 == 83 or LA95 == 84 or LA95 == 85 or LA95 == 86 or LA95 == 87 or LA95 == 88 or LA95 == 89:
                             LA95_88 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
-
                     elif LA95 == HEX_LITERAL:
                         LA95 = self.input.LA(2)
                         if LA95 == 64:
                             LA95_89 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 62:
                             LA95_90 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 75:
                             LA95_91 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66:
                             LA95_92 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 76:
                             LA95_93 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_94 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_95 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 28 or LA95 == 80 or LA95 == 81 or LA95 == 82 or LA95 == 83 or LA95 == 84 or LA95 == 85 or LA95 == 86 or LA95 == 87 or LA95 == 88 or LA95 == 89:
                             LA95_96 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 70:
                             LA95_97 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 71:
                             LA95_98 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 68:
                             LA95_99 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 69:
                             LA95_100 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 101 or LA95 == 102:
                             LA95_101 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 97 or LA95 == 98 or LA95 == 99 or LA95 == 100:
                             LA95_102 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 95 or LA95 == 96:
                             LA95_103 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 77:
                             LA95_104 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 94:
                             LA95_105 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 93:
                             LA95_106 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 92:
                             LA95_107 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 91:
                             LA95_108 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 90:
                             LA95_109 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 27:
                             LA95_110 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 25:
                             alt95 = 1
 
@@ -14894,157 +14130,135 @@ class CParser(Parser):
                         if LA95 == 64:
                             LA95_113 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 62:
                             LA95_114 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 75:
                             LA95_115 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66:
                             LA95_116 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 76:
                             LA95_117 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_118 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_119 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 70:
                             LA95_120 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 71:
                             LA95_121 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 68:
                             LA95_122 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 69:
                             LA95_123 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 101 or LA95 == 102:
                             LA95_124 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 97 or LA95 == 98 or LA95 == 99 or LA95 == 100:
                             LA95_125 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 95 or LA95 == 96:
                             LA95_126 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 77:
                             LA95_127 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 94:
                             LA95_128 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 93:
                             LA95_129 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 92:
                             LA95_130 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 91:
                             LA95_131 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 90:
                             LA95_132 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 27:
                             LA95_133 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 28 or LA95 == 80 or LA95 == 81 or LA95 == 82 or LA95 == 83 or LA95 == 84 or LA95 == 85 or LA95 == 86 or LA95 == 87 or LA95 == 88 or LA95 == 89:
                             LA95_135 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 25:
                             alt95 = 1
 
@@ -15053,157 +14267,135 @@ class CParser(Parser):
                         if LA95 == 64:
                             LA95_137 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 62:
                             LA95_138 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 75:
                             LA95_139 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66:
                             LA95_140 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 76:
                             LA95_141 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_142 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_143 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 28 or LA95 == 80 or LA95 == 81 or LA95 == 82 or LA95 == 83 or LA95 == 84 or LA95 == 85 or LA95 == 86 or LA95 == 87 or LA95 == 88 or LA95 == 89:
                             LA95_144 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 70:
                             LA95_145 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 71:
                             LA95_146 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 68:
                             LA95_147 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 69:
                             LA95_148 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 101 or LA95 == 102:
                             LA95_149 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 97 or LA95 == 98 or LA95 == 99 or LA95 == 100:
                             LA95_150 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 95 or LA95 == 96:
                             LA95_151 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 77:
                             LA95_152 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 94:
                             LA95_153 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 93:
                             LA95_154 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 92:
                             LA95_155 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 91:
                             LA95_156 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 90:
                             LA95_157 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 27:
                             LA95_158 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 25:
                             alt95 = 1
 
@@ -15212,157 +14404,135 @@ class CParser(Parser):
                         if LA95 == 64:
                             LA95_161 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 62:
                             LA95_162 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 75:
                             LA95_163 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66:
                             LA95_164 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 76:
                             LA95_165 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_166 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_167 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 28 or LA95 == 80 or LA95 == 81 or LA95 == 82 or LA95 == 83 or LA95 == 84 or LA95 == 85 or LA95 == 86 or LA95 == 87 or LA95 == 88 or LA95 == 89:
                             LA95_168 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 70:
                             LA95_169 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 71:
                             LA95_170 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 68:
                             LA95_171 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 69:
                             LA95_172 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 101 or LA95 == 102:
                             LA95_173 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 97 or LA95 == 98 or LA95 == 99 or LA95 == 100:
                             LA95_174 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 95 or LA95 == 96:
                             LA95_175 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 77:
                             LA95_176 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 94:
                             LA95_177 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 93:
                             LA95_178 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 92:
                             LA95_179 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 91:
                             LA95_180 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 90:
                             LA95_181 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 27:
                             LA95_182 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 25:
                             alt95 = 1
 
@@ -15371,867 +14541,742 @@ class CParser(Parser):
                         if LA95 == IDENTIFIER:
                             LA95_185 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 64:
                             LA95_186 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 62:
                             LA95_187 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 75:
                             LA95_188 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66:
                             LA95_189 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 76:
                             LA95_190 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_191 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_192 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 70:
                             LA95_193 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 71:
                             LA95_194 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 68:
                             LA95_195 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 69:
                             LA95_196 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 101 or LA95 == 102:
                             LA95_197 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 97 or LA95 == 98 or LA95 == 99 or LA95 == 100:
                             LA95_198 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 95 or LA95 == 96:
                             LA95_199 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 77:
                             LA95_200 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 94:
                             LA95_201 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 93:
                             LA95_202 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 92:
                             LA95_203 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 91:
                             LA95_204 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 90:
                             LA95_205 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 27:
                             LA95_206 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 25:
                             alt95 = 1
                         elif LA95 == STRING_LITERAL:
                             LA95_208 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 28 or LA95 == 80 or LA95 == 81 or LA95 == 82 or LA95 == 83 or LA95 == 84 or LA95 == 85 or LA95 == 86 or LA95 == 87 or LA95 == 88 or LA95 == 89:
                             LA95_209 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
-
                     elif LA95 == FLOATING_POINT_LITERAL:
                         LA95 = self.input.LA(2)
                         if LA95 == 64:
                             LA95_211 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 62:
                             LA95_212 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 75:
                             LA95_213 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66:
                             LA95_214 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 76:
                             LA95_215 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_216 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_217 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 70:
                             LA95_218 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 71:
                             LA95_219 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 68:
                             LA95_220 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 69:
                             LA95_221 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 101 or LA95 == 102:
                             LA95_222 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 97 or LA95 == 98 or LA95 == 99 or LA95 == 100:
                             LA95_223 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 95 or LA95 == 96:
                             LA95_224 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 77:
                             LA95_225 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 94:
                             LA95_226 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 93:
                             LA95_227 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 92:
                             LA95_228 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 91:
                             LA95_229 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 90:
                             LA95_230 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 27:
                             LA95_231 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 25:
                             alt95 = 1
                         elif LA95 == 28 or LA95 == 80 or LA95 == 81 or LA95 == 82 or LA95 == 83 or LA95 == 84 or LA95 == 85 or LA95 == 86 or LA95 == 87 or LA95 == 88 or LA95 == 89:
                             LA95_234 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
-
                     elif LA95 == 62:
                         LA95 = self.input.LA(2)
                         if LA95 == IDENTIFIER:
                             LA95_235 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == HEX_LITERAL:
                             LA95_236 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == OCTAL_LITERAL:
                             LA95_237 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == DECIMAL_LITERAL:
                             LA95_238 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == CHARACTER_LITERAL:
                             LA95_239 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == STRING_LITERAL:
                             LA95_240 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == FLOATING_POINT_LITERAL:
                             LA95_241 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 62:
                             LA95_242 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_243 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_244 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66 or LA95 == 68 or LA95 == 69 or LA95 == 77 or LA95 == 78 or LA95 == 79:
                             LA95_245 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 74:
                             LA95_246 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 49 or LA95 == 50 or LA95 == 51 or LA95 == 52 or LA95 == 53 or LA95 == 54 or LA95 == 55 or LA95 == 56 or LA95 == 57 or LA95 == 58 or LA95 == 59 or LA95 == 60 or LA95 == 61:
                             LA95_247 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 34:
                             LA95_248 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 35:
                             LA95_249 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 36:
                             LA95_250 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 37:
                             LA95_251 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 38:
                             LA95_252 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 39:
                             LA95_253 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 40:
                             LA95_254 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 41:
                             LA95_255 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 42:
                             LA95_256 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 45 or LA95 == 46:
                             LA95_257 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 48:
                             LA95_258 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
-
                     elif LA95 == 72:
                         LA95 = self.input.LA(2)
                         if LA95 == IDENTIFIER:
                             LA95_259 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == HEX_LITERAL:
                             LA95_260 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == OCTAL_LITERAL:
                             LA95_261 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == DECIMAL_LITERAL:
                             LA95_262 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == CHARACTER_LITERAL:
                             LA95_263 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == STRING_LITERAL:
                             LA95_264 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == FLOATING_POINT_LITERAL:
                             LA95_265 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 62:
                             LA95_266 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_267 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_268 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66 or LA95 == 68 or LA95 == 69 or LA95 == 77 or LA95 == 78 or LA95 == 79:
                             LA95_269 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 74:
                             LA95_270 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
-
                     elif LA95 == 73:
                         LA95 = self.input.LA(2)
                         if LA95 == IDENTIFIER:
                             LA95_271 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == HEX_LITERAL:
                             LA95_272 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == OCTAL_LITERAL:
                             LA95_273 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == DECIMAL_LITERAL:
                             LA95_274 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == CHARACTER_LITERAL:
                             LA95_275 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == STRING_LITERAL:
                             LA95_276 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == FLOATING_POINT_LITERAL:
                             LA95_277 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 62:
                             LA95_278 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_279 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_280 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66 or LA95 == 68 or LA95 == 69 or LA95 == 77 or LA95 == 78 or LA95 == 79:
                             LA95_281 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 74:
                             LA95_282 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
-
                     elif LA95 == 66 or LA95 == 68 or LA95 == 69 or LA95 == 77 or LA95 == 78 or LA95 == 79:
                         LA95 = self.input.LA(2)
                         if LA95 == 62:
                             LA95_283 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == IDENTIFIER:
                             LA95_284 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == HEX_LITERAL:
                             LA95_285 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == OCTAL_LITERAL:
                             LA95_286 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == DECIMAL_LITERAL:
                             LA95_287 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == CHARACTER_LITERAL:
                             LA95_288 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == STRING_LITERAL:
                             LA95_289 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == FLOATING_POINT_LITERAL:
                             LA95_290 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_291 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_292 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66 or LA95 == 68 or LA95 == 69 or LA95 == 77 or LA95 == 78 or LA95 == 79:
                             LA95_293 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 74:
                             LA95_294 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
-
                     elif LA95 == 74:
                         LA95 = self.input.LA(2)
                         if LA95 == 62:
                             LA95_295 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == IDENTIFIER:
                             LA95_296 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == HEX_LITERAL:
                             LA95_297 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == OCTAL_LITERAL:
                             LA95_298 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == DECIMAL_LITERAL:
                             LA95_299 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == CHARACTER_LITERAL:
                             LA95_300 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == STRING_LITERAL:
                             LA95_301 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == FLOATING_POINT_LITERAL:
                             LA95_302 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_303 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_304 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66 or LA95 == 68 or LA95 == 69 or LA95 == 77 or LA95 == 78 or LA95 == 79:
                             LA95_305 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 74:
                             LA95_306 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
-
                     elif LA95 == 25 or LA95 == 26 or LA95 == 29 or LA95 == 30 or LA95 == 31 or LA95 == 32 or LA95 == 33 or LA95 == 34 or LA95 == 35 or LA95 == 36 or LA95 == 37 or LA95 == 38 or LA95 == 39 or LA95 == 40 or LA95 == 41 or LA95 == 42 or LA95 == 43 or LA95 == 45 or LA95 == 46 or LA95 == 48 or LA95 == 49 or LA95 == 50 or LA95 == 51 or LA95 == 52 or LA95 == 53 or LA95 == 54 or LA95 == 55 or LA95 == 56 or LA95 == 57 or LA95 == 58 or LA95 == 59 or LA95 == 60 or LA95 == 61 or LA95 == 103 or LA95 == 104 or LA95 == 105 or LA95 == 106 or LA95 == 107 or LA95 == 108 or LA95 == 110 or LA95 == 111 or LA95 == 112 or LA95 == 113 or LA95 == 114 or LA95 == 115 or LA95 == 116 or LA95 == 117:
                         alt95 = 1
 
                     if alt95 == 1:
                         # C.g:0:0: statement
-                        self.following.append(self.FOLLOW_statement_in_statement_list2242)
+                        self.following.append(
+                            self.FOLLOW_statement_in_statement_list2242)
                         self.statement()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
                         if cnt95 >= 1:
-                            break #loop95
+                            break  # loop95
 
                         if self.backtracking > 0:
                             self.failed = True
@@ -16242,11 +15287,6 @@ class CParser(Parser):
 
                     cnt95 += 1
 
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -16265,10 +15305,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start expression_statement
     # C.g:561:1: expression_statement : ( ';' | expression ';' );
+
     def expression_statement(self, ):
 
         retval = self.expression_statement_return()
@@ -16283,41 +15322,42 @@ class CParser(Parser):
                 alt96 = 2
                 LA96_0 = self.input.LA(1)
 
-                if (LA96_0 == 25) :
+                if (LA96_0 == 25):
                     alt96 = 1
-                elif ((IDENTIFIER <= LA96_0 <= FLOATING_POINT_LITERAL) or LA96_0 == 62 or LA96_0 == 66 or (68 <= LA96_0 <= 69) or (72 <= LA96_0 <= 74) or (77 <= LA96_0 <= 79)) :
+                elif ((IDENTIFIER <= LA96_0 <= FLOATING_POINT_LITERAL) or LA96_0 == 62 or LA96_0 == 66 or (68 <= LA96_0 <= 69) or (72 <= LA96_0 <= 74) or (77 <= LA96_0 <= 79)):
                     alt96 = 2
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return retval
 
-                    nvae = NoViableAltException("561:1: expression_statement : ( ';' | expression ';' );", 96, 0, self.input)
+                    nvae = NoViableAltException(
+                        "561:1: expression_statement : ( ';' | expression ';' );", 96, 0, self.input)
 
                     raise nvae
 
                 if alt96 == 1:
                     # C.g:562:4: ';'
-                    self.match(self.input, 25, self.FOLLOW_25_in_expression_statement2254)
+                    self.match(self.input, 25,
+                               self.FOLLOW_25_in_expression_statement2254)
                     if self.failed:
                         return retval
 
-
                 elif alt96 == 2:
                     # C.g:563:4: expression ';'
-                    self.following.append(self.FOLLOW_expression_in_expression_statement2259)
+                    self.following.append(
+                        self.FOLLOW_expression_in_expression_statement2259)
                     self.expression()
                     self.following.pop()
                     if self.failed:
                         return retval
-                    self.match(self.input, 25, self.FOLLOW_25_in_expression_statement2261)
+                    self.match(self.input, 25,
+                               self.FOLLOW_25_in_expression_statement2261)
                     if self.failed:
                         return retval
 
-
                 retval.stop = self.input.LT(-1)
 
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -16331,15 +15371,14 @@ class CParser(Parser):
 
     # $ANTLR end expression_statement
 
-
     # $ANTLR start selection_statement
     # C.g:566:1: selection_statement : ( 'if' '(' e= expression ')' statement ( options {k=1; backtrack=false; } : 'else' statement )? | 'switch' '(' expression ')' statement );
+
     def selection_statement(self, ):
 
         selection_statement_StartIndex = self.input.index()
         e = None
 
-
         try:
             try:
                 if self.backtracking > 0 and self.alreadyParsedRule(self.input, 69):
@@ -16349,39 +15388,46 @@ class CParser(Parser):
                 alt98 = 2
                 LA98_0 = self.input.LA(1)
 
-                if (LA98_0 == 108) :
+                if (LA98_0 == 108):
                     alt98 = 1
-                elif (LA98_0 == 110) :
+                elif (LA98_0 == 110):
                     alt98 = 2
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("566:1: selection_statement : ( 'if' '(' e= expression ')' statement ( options {k=1; backtrack=false; } : 'else' statement )? | 'switch' '(' expression ')' statement );", 98, 0, self.input)
+                    nvae = NoViableAltException(
+                        "566:1: selection_statement : ( 'if' '(' e= expression ')' statement ( options {k=1; backtrack=false; } : 'else' statement )? | 'switch' '(' expression ')' statement );", 98, 0, self.input)
 
                     raise nvae
 
                 if alt98 == 1:
                     # C.g:567:4: 'if' '(' e= expression ')' statement ( options {k=1; backtrack=false; } : 'else' statement )?
-                    self.match(self.input, 108, self.FOLLOW_108_in_selection_statement2272)
+                    self.match(self.input, 108,
+                               self.FOLLOW_108_in_selection_statement2272)
                     if self.failed:
                         return
-                    self.match(self.input, 62, self.FOLLOW_62_in_selection_statement2274)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_selection_statement2274)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_expression_in_selection_statement2278)
+                    self.following.append(
+                        self.FOLLOW_expression_in_selection_statement2278)
                     e = self.expression()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_selection_statement2280)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_selection_statement2280)
                     if self.failed:
                         return
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
+                        self.StorePredicateExpression(
+                            e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
-                    self.following.append(self.FOLLOW_statement_in_selection_statement2284)
+                    self.following.append(
+                        self.FOLLOW_statement_in_selection_statement2284)
                     self.statement()
                     self.following.pop()
                     if self.failed:
@@ -16390,47 +15436,48 @@ class CParser(Parser):
                     alt97 = 2
                     LA97_0 = self.input.LA(1)
 
-                    if (LA97_0 == 109) :
+                    if (LA97_0 == 109):
                         alt97 = 1
                     if alt97 == 1:
                         # C.g:567:200: 'else' statement
-                        self.match(self.input, 109, self.FOLLOW_109_in_selection_statement2299)
+                        self.match(self.input, 109,
+                                   self.FOLLOW_109_in_selection_statement2299)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_statement_in_selection_statement2301)
+                        self.following.append(
+                            self.FOLLOW_statement_in_selection_statement2301)
                         self.statement()
                         self.following.pop()
                         if self.failed:
                             return
 
-
-
-
-
                 elif alt98 == 2:
                     # C.g:568:4: 'switch' '(' expression ')' statement
-                    self.match(self.input, 110, self.FOLLOW_110_in_selection_statement2308)
+                    self.match(self.input, 110,
+                               self.FOLLOW_110_in_selection_statement2308)
                     if self.failed:
                         return
-                    self.match(self.input, 62, self.FOLLOW_62_in_selection_statement2310)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_selection_statement2310)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_expression_in_selection_statement2312)
+                    self.following.append(
+                        self.FOLLOW_expression_in_selection_statement2312)
                     self.expression()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_selection_statement2314)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_selection_statement2314)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_statement_in_selection_statement2316)
+                    self.following.append(
+                        self.FOLLOW_statement_in_selection_statement2316)
                     self.statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -16444,15 +15491,14 @@ class CParser(Parser):
 
     # $ANTLR end selection_statement
 
-
     # $ANTLR start iteration_statement
     # C.g:571:1: iteration_statement : ( 'while' '(' e= expression ')' statement | 'do' statement 'while' '(' e= expression ')' ';' | 'for' '(' expression_statement e= expression_statement ( expression )? ')' statement );
+
     def iteration_statement(self, ):
 
         iteration_statement_StartIndex = self.input.index()
         e = None
 
-
         try:
             try:
                 if self.backtracking > 0 and self.alreadyParsedRule(self.input, 70):
@@ -16472,82 +15518,97 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("571:1: iteration_statement : ( 'while' '(' e= expression ')' statement | 'do' statement 'while' '(' e= expression ')' ';' | 'for' '(' expression_statement e= expression_statement ( expression )? ')' statement );", 100, 0, self.input)
+                    nvae = NoViableAltException(
+                        "571:1: iteration_statement : ( 'while' '(' e= expression ')' statement | 'do' statement 'while' '(' e= expression ')' ';' | 'for' '(' expression_statement e= expression_statement ( expression )? ')' statement );", 100, 0, self.input)
 
                     raise nvae
 
                 if alt100 == 1:
                     # C.g:572:4: 'while' '(' e= expression ')' statement
-                    self.match(self.input, 111, self.FOLLOW_111_in_iteration_statement2327)
+                    self.match(self.input, 111,
+                               self.FOLLOW_111_in_iteration_statement2327)
                     if self.failed:
                         return
-                    self.match(self.input, 62, self.FOLLOW_62_in_iteration_statement2329)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_iteration_statement2329)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_expression_in_iteration_statement2333)
+                    self.following.append(
+                        self.FOLLOW_expression_in_iteration_statement2333)
                     e = self.expression()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_iteration_statement2335)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_iteration_statement2335)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_statement_in_iteration_statement2337)
+                    self.following.append(
+                        self.FOLLOW_statement_in_iteration_statement2337)
                     self.statement()
                     self.following.pop()
                     if self.failed:
                         return
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
-
-
+                        self.StorePredicateExpression(
+                            e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
                 elif alt100 == 2:
                     # C.g:573:4: 'do' statement 'while' '(' e= expression ')' ';'
-                    self.match(self.input, 112, self.FOLLOW_112_in_iteration_statement2344)
+                    self.match(self.input, 112,
+                               self.FOLLOW_112_in_iteration_statement2344)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_statement_in_iteration_statement2346)
+                    self.following.append(
+                        self.FOLLOW_statement_in_iteration_statement2346)
                     self.statement()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 111, self.FOLLOW_111_in_iteration_statement2348)
+                    self.match(self.input, 111,
+                               self.FOLLOW_111_in_iteration_statement2348)
                     if self.failed:
                         return
-                    self.match(self.input, 62, self.FOLLOW_62_in_iteration_statement2350)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_iteration_statement2350)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_expression_in_iteration_statement2354)
+                    self.following.append(
+                        self.FOLLOW_expression_in_iteration_statement2354)
                     e = self.expression()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_iteration_statement2356)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_iteration_statement2356)
                     if self.failed:
                         return
-                    self.match(self.input, 25, self.FOLLOW_25_in_iteration_statement2358)
+                    self.match(self.input, 25,
+                               self.FOLLOW_25_in_iteration_statement2358)
                     if self.failed:
                         return
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
-
-
+                        self.StorePredicateExpression(
+                            e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
                 elif alt100 == 3:
                     # C.g:574:4: 'for' '(' expression_statement e= expression_statement ( expression )? ')' statement
-                    self.match(self.input, 113, self.FOLLOW_113_in_iteration_statement2365)
+                    self.match(self.input, 113,
+                               self.FOLLOW_113_in_iteration_statement2365)
                     if self.failed:
                         return
-                    self.match(self.input, 62, self.FOLLOW_62_in_iteration_statement2367)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_iteration_statement2367)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_expression_statement_in_iteration_statement2369)
+                    self.following.append(
+                        self.FOLLOW_expression_statement_in_iteration_statement2369)
                     self.expression_statement()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_expression_statement_in_iteration_statement2373)
+                    self.following.append(
+                        self.FOLLOW_expression_statement_in_iteration_statement2373)
                     e = self.expression_statement()
                     self.following.pop()
                     if self.failed:
@@ -16556,31 +15617,30 @@ class CParser(Parser):
                     alt99 = 2
                     LA99_0 = self.input.LA(1)
 
-                    if ((IDENTIFIER <= LA99_0 <= FLOATING_POINT_LITERAL) or LA99_0 == 62 or LA99_0 == 66 or (68 <= LA99_0 <= 69) or (72 <= LA99_0 <= 74) or (77 <= LA99_0 <= 79)) :
+                    if ((IDENTIFIER <= LA99_0 <= FLOATING_POINT_LITERAL) or LA99_0 == 62 or LA99_0 == 66 or (68 <= LA99_0 <= 69) or (72 <= LA99_0 <= 74) or (77 <= LA99_0 <= 79)):
                         alt99 = 1
                     if alt99 == 1:
                         # C.g:0:0: expression
-                        self.following.append(self.FOLLOW_expression_in_iteration_statement2375)
+                        self.following.append(
+                            self.FOLLOW_expression_in_iteration_statement2375)
                         self.expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
-
-                    self.match(self.input, 63, self.FOLLOW_63_in_iteration_statement2378)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_iteration_statement2378)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_statement_in_iteration_statement2380)
+                    self.following.append(
+                        self.FOLLOW_statement_in_iteration_statement2380)
                     self.statement()
                     self.following.pop()
                     if self.failed:
                         return
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
-
-
-
+                        self.StorePredicateExpression(
+                            e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -16595,9 +15655,9 @@ class CParser(Parser):
 
     # $ANTLR end iteration_statement
 
-
     # $ANTLR start jump_statement
     # C.g:577:1: jump_statement : ( 'goto' IDENTIFIER ';' | 'continue' ';' | 'break' ';' | 'return' ';' | 'return' expression ';' );
+
     def jump_statement(self, ):
 
         jump_statement_StartIndex = self.input.index()
@@ -16618,16 +15678,17 @@ class CParser(Parser):
                 elif LA101 == 117:
                     LA101_4 = self.input.LA(2)
 
-                    if (LA101_4 == 25) :
+                    if (LA101_4 == 25):
                         alt101 = 4
-                    elif ((IDENTIFIER <= LA101_4 <= FLOATING_POINT_LITERAL) or LA101_4 == 62 or LA101_4 == 66 or (68 <= LA101_4 <= 69) or (72 <= LA101_4 <= 74) or (77 <= LA101_4 <= 79)) :
+                    elif ((IDENTIFIER <= LA101_4 <= FLOATING_POINT_LITERAL) or LA101_4 == 62 or LA101_4 == 66 or (68 <= LA101_4 <= 69) or (72 <= LA101_4 <= 74) or (77 <= LA101_4 <= 79)):
                         alt101 = 5
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("577:1: jump_statement : ( 'goto' IDENTIFIER ';' | 'continue' ';' | 'break' ';' | 'return' ';' | 'return' expression ';' );", 101, 4, self.input)
+                        nvae = NoViableAltException(
+                            "577:1: jump_statement : ( 'goto' IDENTIFIER ';' | 'continue' ';' | 'break' ';' | 'return' ';' | 'return' expression ';' );", 101, 4, self.input)
 
                         raise nvae
 
@@ -16636,69 +15697,76 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("577:1: jump_statement : ( 'goto' IDENTIFIER ';' | 'continue' ';' | 'break' ';' | 'return' ';' | 'return' expression ';' );", 101, 0, self.input)
+                    nvae = NoViableAltException(
+                        "577:1: jump_statement : ( 'goto' IDENTIFIER ';' | 'continue' ';' | 'break' ';' | 'return' ';' | 'return' expression ';' );", 101, 0, self.input)
 
                     raise nvae
 
                 if alt101 == 1:
                     # C.g:578:4: 'goto' IDENTIFIER ';'
-                    self.match(self.input, 114, self.FOLLOW_114_in_jump_statement2393)
+                    self.match(self.input, 114,
+                               self.FOLLOW_114_in_jump_statement2393)
                     if self.failed:
                         return
-                    self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_jump_statement2395)
+                    self.match(self.input, IDENTIFIER,
+                               self.FOLLOW_IDENTIFIER_in_jump_statement2395)
                     if self.failed:
                         return
-                    self.match(self.input, 25, self.FOLLOW_25_in_jump_statement2397)
+                    self.match(self.input, 25,
+                               self.FOLLOW_25_in_jump_statement2397)
                     if self.failed:
                         return
 
-
                 elif alt101 == 2:
                     # C.g:579:4: 'continue' ';'
-                    self.match(self.input, 115, self.FOLLOW_115_in_jump_statement2402)
+                    self.match(self.input, 115,
+                               self.FOLLOW_115_in_jump_statement2402)
                     if self.failed:
                         return
-                    self.match(self.input, 25, self.FOLLOW_25_in_jump_statement2404)
+                    self.match(self.input, 25,
+                               self.FOLLOW_25_in_jump_statement2404)
                     if self.failed:
                         return
 
-
                 elif alt101 == 3:
                     # C.g:580:4: 'break' ';'
-                    self.match(self.input, 116, self.FOLLOW_116_in_jump_statement2409)
+                    self.match(self.input, 116,
+                               self.FOLLOW_116_in_jump_statement2409)
                     if self.failed:
                         return
-                    self.match(self.input, 25, self.FOLLOW_25_in_jump_statement2411)
+                    self.match(self.input, 25,
+                               self.FOLLOW_25_in_jump_statement2411)
                     if self.failed:
                         return
 
-
                 elif alt101 == 4:
                     # C.g:581:4: 'return' ';'
-                    self.match(self.input, 117, self.FOLLOW_117_in_jump_statement2416)
+                    self.match(self.input, 117,
+                               self.FOLLOW_117_in_jump_statement2416)
                     if self.failed:
                         return
-                    self.match(self.input, 25, self.FOLLOW_25_in_jump_statement2418)
+                    self.match(self.input, 25,
+                               self.FOLLOW_25_in_jump_statement2418)
                     if self.failed:
                         return
 
-
                 elif alt101 == 5:
                     # C.g:582:4: 'return' expression ';'
-                    self.match(self.input, 117, self.FOLLOW_117_in_jump_statement2423)
+                    self.match(self.input, 117,
+                               self.FOLLOW_117_in_jump_statement2423)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_expression_in_jump_statement2425)
+                    self.following.append(
+                        self.FOLLOW_expression_in_jump_statement2425)
                     self.expression()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 25, self.FOLLOW_25_in_jump_statement2427)
+                    self.match(self.input, 25,
+                               self.FOLLOW_25_in_jump_statement2427)
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -16716,18 +15784,17 @@ class CParser(Parser):
     def synpred2_fragment(self, ):
         # C.g:119:6: ( declaration_specifiers )
         # C.g:119:6: declaration_specifiers
-        self.following.append(self.FOLLOW_declaration_specifiers_in_synpred2100)
+        self.following.append(
+            self.FOLLOW_declaration_specifiers_in_synpred2100)
         self.declaration_specifiers()
         self.following.pop()
         if self.failed:
             return
 
-
     # $ANTLR end synpred2
 
-
-
     # $ANTLR start synpred4
+
     def synpred4_fragment(self, ):
         # C.g:119:4: ( ( declaration_specifiers )? declarator ( declaration )* '{' )
         # C.g:119:6: ( declaration_specifiers )? declarator ( declaration )* '{'
@@ -16741,134 +15808,132 @@ class CParser(Parser):
             if LA102 == 62:
                 LA102_21 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 29 or LA102 == 30 or LA102 == 31 or LA102 == 32 or LA102 == 33:
                 LA102_23 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 34:
                 LA102_24 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 35:
                 LA102_25 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 36:
                 LA102_26 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 37:
                 LA102_27 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 38:
                 LA102_28 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 39:
                 LA102_29 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 40:
                 LA102_30 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 41:
                 LA102_31 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 42:
                 LA102_32 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 45 or LA102 == 46:
                 LA102_33 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 48:
                 LA102_34 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == IDENTIFIER:
                 LA102_35 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 58:
                 LA102_36 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 66:
                 alt102 = 1
             elif LA102 == 59:
                 LA102_39 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 60:
                 LA102_40 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 49 or LA102 == 50 or LA102 == 51 or LA102 == 52 or LA102 == 53 or LA102 == 54 or LA102 == 55 or LA102 == 56 or LA102 == 57 or LA102 == 61:
                 LA102_41 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
         elif LA102 == 58:
             LA102_14 = self.input.LA(2)
 
-            if (self.synpred2()) :
+            if (self.synpred2()):
                 alt102 = 1
         elif LA102 == 59:
             LA102_16 = self.input.LA(2)
 
-            if (self.synpred2()) :
+            if (self.synpred2()):
                 alt102 = 1
         elif LA102 == 60:
             LA102_17 = self.input.LA(2)
 
-            if (self.synpred2()) :
+            if (self.synpred2()):
                 alt102 = 1
         if alt102 == 1:
             # C.g:0:0: declaration_specifiers
-            self.following.append(self.FOLLOW_declaration_specifiers_in_synpred4100)
+            self.following.append(
+                self.FOLLOW_declaration_specifiers_in_synpred4100)
             self.declaration_specifiers()
             self.following.pop()
             if self.failed:
                 return
 
-
-
         self.following.append(self.FOLLOW_declarator_in_synpred4103)
         self.declarator()
         self.following.pop()
         if self.failed:
             return
         # C.g:119:41: ( declaration )*
-        while True: #loop103
+        while True:  # loop103
             alt103 = 2
             LA103_0 = self.input.LA(1)
 
-            if (LA103_0 == IDENTIFIER or LA103_0 == 26 or (29 <= LA103_0 <= 42) or (45 <= LA103_0 <= 46) or (48 <= LA103_0 <= 61)) :
+            if (LA103_0 == IDENTIFIER or LA103_0 == 26 or (29 <= LA103_0 <= 42) or (45 <= LA103_0 <= 46) or (48 <= LA103_0 <= 61)):
                 alt103 = 1
 
-
             if alt103 == 1:
                 # C.g:0:0: declaration
                 self.following.append(self.FOLLOW_declaration_in_synpred4105)
@@ -16877,21 +15942,17 @@ class CParser(Parser):
                 if self.failed:
                     return
 
-
             else:
-                break #loop103
-
+                break  # loop103
 
         self.match(self.input, 43, self.FOLLOW_43_in_synpred4108)
         if self.failed:
             return
 
-
     # $ANTLR end synpred4
 
-
-
     # $ANTLR start synpred5
+
     def synpred5_fragment(self, ):
         # C.g:120:4: ( declaration )
         # C.g:120:4: declaration
@@ -16901,42 +15962,38 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred5
 
-
-
     # $ANTLR start synpred7
+
     def synpred7_fragment(self, ):
         # C.g:146:6: ( declaration_specifiers )
         # C.g:146:6: declaration_specifiers
-        self.following.append(self.FOLLOW_declaration_specifiers_in_synpred7157)
+        self.following.append(
+            self.FOLLOW_declaration_specifiers_in_synpred7157)
         self.declaration_specifiers()
         self.following.pop()
         if self.failed:
             return
 
-
     # $ANTLR end synpred7
 
-
-
     # $ANTLR start synpred10
+
     def synpred10_fragment(self, ):
         # C.g:167:18: ( declaration_specifiers )
         # C.g:167:18: declaration_specifiers
-        self.following.append(self.FOLLOW_declaration_specifiers_in_synpred10207)
+        self.following.append(
+            self.FOLLOW_declaration_specifiers_in_synpred10207)
         self.declaration_specifiers()
         self.following.pop()
         if self.failed:
             return
 
-
     # $ANTLR end synpred10
 
-
-
     # $ANTLR start synpred14
+
     def synpred14_fragment(self, ):
         # C.g:184:7: ( type_specifier )
         # C.g:184:7: type_specifier
@@ -16946,12 +16003,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred14
 
-
-
     # $ANTLR start synpred15
+
     def synpred15_fragment(self, ):
         # C.g:185:13: ( type_qualifier )
         # C.g:185:13: type_qualifier
@@ -16961,12 +16016,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred15
 
-
-
     # $ANTLR start synpred33
+
     def synpred33_fragment(self, ):
         # C.g:225:16: ( type_qualifier )
         # C.g:225:16: type_qualifier
@@ -16976,58 +16029,53 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred33
 
-
-
     # $ANTLR start synpred34
+
     def synpred34_fragment(self, ):
         # C.g:225:4: ( IDENTIFIER ( type_qualifier )* declarator )
         # C.g:225:5: IDENTIFIER ( type_qualifier )* declarator
-        self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_synpred34442)
+        self.match(self.input, IDENTIFIER,
+                   self.FOLLOW_IDENTIFIER_in_synpred34442)
         if self.failed:
             return
         # C.g:225:16: ( type_qualifier )*
-        while True: #loop106
+        while True:  # loop106
             alt106 = 2
             LA106 = self.input.LA(1)
             if LA106 == 58:
                 LA106_2 = self.input.LA(2)
 
-                if (self.synpred33()) :
+                if (self.synpred33()):
                     alt106 = 1
 
-
             elif LA106 == 59:
                 LA106_3 = self.input.LA(2)
 
-                if (self.synpred33()) :
+                if (self.synpred33()):
                     alt106 = 1
 
-
             elif LA106 == 60:
                 LA106_4 = self.input.LA(2)
 
-                if (self.synpred33()) :
+                if (self.synpred33()):
                     alt106 = 1
 
-
             elif LA106 == 49 or LA106 == 50 or LA106 == 51 or LA106 == 52 or LA106 == 53 or LA106 == 54 or LA106 == 55 or LA106 == 56 or LA106 == 57 or LA106 == 61:
                 alt106 = 1
 
             if alt106 == 1:
                 # C.g:0:0: type_qualifier
-                self.following.append(self.FOLLOW_type_qualifier_in_synpred34444)
+                self.following.append(
+                    self.FOLLOW_type_qualifier_in_synpred34444)
                 self.type_qualifier()
                 self.following.pop()
                 if self.failed:
                     return
 
-
             else:
-                break #loop106
-
+                break  # loop106
 
         self.following.append(self.FOLLOW_declarator_in_synpred34447)
         self.declarator()
@@ -17035,12 +16083,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred34
 
-
-
     # $ANTLR start synpred39
+
     def synpred39_fragment(self, ):
         # C.g:253:6: ( type_qualifier )
         # C.g:253:6: type_qualifier
@@ -17050,12 +16096,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred39
 
-
-
     # $ANTLR start synpred40
+
     def synpred40_fragment(self, ):
         # C.g:253:23: ( type_specifier )
         # C.g:253:23: type_specifier
@@ -17065,12 +16109,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred40
 
-
-
     # $ANTLR start synpred66
+
     def synpred66_fragment(self, ):
         # C.g:297:4: ( ( pointer )? ( 'EFIAPI' )? ( 'EFI_BOOTSERVICE' )? ( 'EFI_RUNTIMESERVICE' )? direct_declarator )
         # C.g:297:4: ( pointer )? ( 'EFIAPI' )? ( 'EFI_BOOTSERVICE' )? ( 'EFI_RUNTIMESERVICE' )? direct_declarator
@@ -17078,7 +16120,7 @@ class CParser(Parser):
         alt111 = 2
         LA111_0 = self.input.LA(1)
 
-        if (LA111_0 == 66) :
+        if (LA111_0 == 66):
             alt111 = 1
         if alt111 == 1:
             # C.g:0:0: pointer
@@ -17088,13 +16130,11 @@ class CParser(Parser):
             if self.failed:
                 return
 
-
-
         # C.g:297:13: ( 'EFIAPI' )?
         alt112 = 2
         LA112_0 = self.input.LA(1)
 
-        if (LA112_0 == 58) :
+        if (LA112_0 == 58):
             alt112 = 1
         if alt112 == 1:
             # C.g:297:14: 'EFIAPI'
@@ -17102,13 +16142,11 @@ class CParser(Parser):
             if self.failed:
                 return
 
-
-
         # C.g:297:25: ( 'EFI_BOOTSERVICE' )?
         alt113 = 2
         LA113_0 = self.input.LA(1)
 
-        if (LA113_0 == 59) :
+        if (LA113_0 == 59):
             alt113 = 1
         if alt113 == 1:
             # C.g:297:26: 'EFI_BOOTSERVICE'
@@ -17116,13 +16154,11 @@ class CParser(Parser):
             if self.failed:
                 return
 
-
-
         # C.g:297:46: ( 'EFI_RUNTIMESERVICE' )?
         alt114 = 2
         LA114_0 = self.input.LA(1)
 
-        if (LA114_0 == 60) :
+        if (LA114_0 == 60):
             alt114 = 1
         if alt114 == 1:
             # C.g:297:47: 'EFI_RUNTIMESERVICE'
@@ -17130,20 +16166,16 @@ class CParser(Parser):
             if self.failed:
                 return
 
-
-
         self.following.append(self.FOLLOW_direct_declarator_in_synpred66802)
         self.direct_declarator()
         self.following.pop()
         if self.failed:
             return
 
-
     # $ANTLR end synpred66
 
-
-
     # $ANTLR start synpred67
+
     def synpred67_fragment(self, ):
         # C.g:303:15: ( declarator_suffix )
         # C.g:303:15: declarator_suffix
@@ -17153,12 +16185,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred67
 
-
-
     # $ANTLR start synpred69
+
     def synpred69_fragment(self, ):
         # C.g:304:9: ( 'EFIAPI' )
         # C.g:304:9: 'EFIAPI'
@@ -17166,12 +16196,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred69
 
-
-
     # $ANTLR start synpred70
+
     def synpred70_fragment(self, ):
         # C.g:304:35: ( declarator_suffix )
         # C.g:304:35: declarator_suffix
@@ -17181,12 +16209,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred70
 
-
-
     # $ANTLR start synpred73
+
     def synpred73_fragment(self, ):
         # C.g:310:9: ( '(' parameter_type_list ')' )
         # C.g:310:9: '(' parameter_type_list ')'
@@ -17202,12 +16228,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred73
 
-
-
     # $ANTLR start synpred74
+
     def synpred74_fragment(self, ):
         # C.g:311:9: ( '(' identifier_list ')' )
         # C.g:311:9: '(' identifier_list ')'
@@ -17223,12 +16247,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred74
 
-
-
     # $ANTLR start synpred75
+
     def synpred75_fragment(self, ):
         # C.g:316:8: ( type_qualifier )
         # C.g:316:8: type_qualifier
@@ -17238,12 +16260,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred75
 
-
-
     # $ANTLR start synpred76
+
     def synpred76_fragment(self, ):
         # C.g:316:24: ( pointer )
         # C.g:316:24: pointer
@@ -17253,12 +16273,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred76
 
-
-
     # $ANTLR start synpred77
+
     def synpred77_fragment(self, ):
         # C.g:316:4: ( '*' ( type_qualifier )+ ( pointer )? )
         # C.g:316:4: '*' ( type_qualifier )+ ( pointer )?
@@ -17267,26 +16285,25 @@ class CParser(Parser):
             return
         # C.g:316:8: ( type_qualifier )+
         cnt116 = 0
-        while True: #loop116
+        while True:  # loop116
             alt116 = 2
             LA116_0 = self.input.LA(1)
 
-            if ((49 <= LA116_0 <= 61)) :
+            if ((49 <= LA116_0 <= 61)):
                 alt116 = 1
 
-
             if alt116 == 1:
                 # C.g:0:0: type_qualifier
-                self.following.append(self.FOLLOW_type_qualifier_in_synpred77921)
+                self.following.append(
+                    self.FOLLOW_type_qualifier_in_synpred77921)
                 self.type_qualifier()
                 self.following.pop()
                 if self.failed:
                     return
 
-
             else:
                 if cnt116 >= 1:
-                    break #loop116
+                    break  # loop116
 
                 if self.backtracking > 0:
                     self.failed = True
@@ -17297,12 +16314,11 @@ class CParser(Parser):
 
             cnt116 += 1
 
-
         # C.g:316:24: ( pointer )?
         alt117 = 2
         LA117_0 = self.input.LA(1)
 
-        if (LA117_0 == 66) :
+        if (LA117_0 == 66):
             alt117 = 1
         if alt117 == 1:
             # C.g:0:0: pointer
@@ -17312,15 +16328,10 @@ class CParser(Parser):
             if self.failed:
                 return
 
-
-
-
-
     # $ANTLR end synpred77
 
-
-
     # $ANTLR start synpred78
+
     def synpred78_fragment(self, ):
         # C.g:317:4: ( '*' pointer )
         # C.g:317:4: '*' pointer
@@ -17333,12 +16344,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred78
 
-
-
     # $ANTLR start synpred81
+
     def synpred81_fragment(self, ):
         # C.g:326:32: ( 'OPTIONAL' )
         # C.g:326:32: 'OPTIONAL'
@@ -17346,12 +16355,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred81
 
-
-
     # $ANTLR start synpred82
+
     def synpred82_fragment(self, ):
         # C.g:326:27: ( ',' ( 'OPTIONAL' )? parameter_declaration )
         # C.g:326:27: ',' ( 'OPTIONAL' )? parameter_declaration
@@ -17362,10 +16369,10 @@ class CParser(Parser):
         alt119 = 2
         LA119_0 = self.input.LA(1)
 
-        if (LA119_0 == 53) :
+        if (LA119_0 == 53):
             LA119_1 = self.input.LA(2)
 
-            if (self.synpred81()) :
+            if (self.synpred81()):
                 alt119 = 1
         if alt119 == 1:
             # C.g:326:32: 'OPTIONAL'
@@ -17373,20 +16380,17 @@ class CParser(Parser):
             if self.failed:
                 return
 
-
-
-        self.following.append(self.FOLLOW_parameter_declaration_in_synpred82981)
+        self.following.append(
+            self.FOLLOW_parameter_declaration_in_synpred82981)
         self.parameter_declaration()
         self.following.pop()
         if self.failed:
             return
 
-
     # $ANTLR end synpred82
 
-
-
     # $ANTLR start synpred83
+
     def synpred83_fragment(self, ):
         # C.g:330:28: ( declarator )
         # C.g:330:28: declarator
@@ -17396,12 +16400,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred83
 
-
-
     # $ANTLR start synpred84
+
     def synpred84_fragment(self, ):
         # C.g:330:39: ( abstract_declarator )
         # C.g:330:39: abstract_declarator
@@ -17411,33 +16413,31 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred84
 
-
-
     # $ANTLR start synpred86
+
     def synpred86_fragment(self, ):
         # C.g:330:4: ( declaration_specifiers ( declarator | abstract_declarator )* ( 'OPTIONAL' )? )
         # C.g:330:4: declaration_specifiers ( declarator | abstract_declarator )* ( 'OPTIONAL' )?
-        self.following.append(self.FOLLOW_declaration_specifiers_in_synpred86994)
+        self.following.append(
+            self.FOLLOW_declaration_specifiers_in_synpred86994)
         self.declaration_specifiers()
         self.following.pop()
         if self.failed:
             return
         # C.g:330:27: ( declarator | abstract_declarator )*
-        while True: #loop120
+        while True:  # loop120
             alt120 = 3
             LA120 = self.input.LA(1)
             if LA120 == 66:
                 LA120_3 = self.input.LA(2)
 
-                if (self.synpred83()) :
+                if (self.synpred83()):
                     alt120 = 1
-                elif (self.synpred84()) :
+                elif (self.synpred84()):
                     alt120 = 2
 
-
             elif LA120 == IDENTIFIER or LA120 == 58 or LA120 == 59 or LA120 == 60:
                 alt120 = 1
             elif LA120 == 62:
@@ -17447,58 +16447,51 @@ class CParser(Parser):
                 elif LA120 == 58:
                     LA120_21 = self.input.LA(3)
 
-                    if (self.synpred83()) :
+                    if (self.synpred83()):
                         alt120 = 1
-                    elif (self.synpred84()) :
+                    elif (self.synpred84()):
                         alt120 = 2
 
-
                 elif LA120 == 66:
                     LA120_22 = self.input.LA(3)
 
-                    if (self.synpred83()) :
+                    if (self.synpred83()):
                         alt120 = 1
-                    elif (self.synpred84()) :
+                    elif (self.synpred84()):
                         alt120 = 2
 
-
                 elif LA120 == 59:
                     LA120_23 = self.input.LA(3)
 
-                    if (self.synpred83()) :
+                    if (self.synpred83()):
                         alt120 = 1
-                    elif (self.synpred84()) :
+                    elif (self.synpred84()):
                         alt120 = 2
 
-
                 elif LA120 == 60:
                     LA120_24 = self.input.LA(3)
 
-                    if (self.synpred83()) :
+                    if (self.synpred83()):
                         alt120 = 1
-                    elif (self.synpred84()) :
+                    elif (self.synpred84()):
                         alt120 = 2
 
-
                 elif LA120 == IDENTIFIER:
                     LA120_25 = self.input.LA(3)
 
-                    if (self.synpred83()) :
+                    if (self.synpred83()):
                         alt120 = 1
-                    elif (self.synpred84()) :
+                    elif (self.synpred84()):
                         alt120 = 2
 
-
                 elif LA120 == 62:
                     LA120_26 = self.input.LA(3)
 
-                    if (self.synpred83()) :
+                    if (self.synpred83()):
                         alt120 = 1
-                    elif (self.synpred84()) :
+                    elif (self.synpred84()):
                         alt120 = 2
 
-
-
             elif LA120 == 64:
                 alt120 = 2
 
@@ -17510,25 +16503,23 @@ class CParser(Parser):
                 if self.failed:
                     return
 
-
             elif alt120 == 2:
                 # C.g:330:39: abstract_declarator
-                self.following.append(self.FOLLOW_abstract_declarator_in_synpred86999)
+                self.following.append(
+                    self.FOLLOW_abstract_declarator_in_synpred86999)
                 self.abstract_declarator()
                 self.following.pop()
                 if self.failed:
                     return
 
-
             else:
-                break #loop120
-
+                break  # loop120
 
         # C.g:330:61: ( 'OPTIONAL' )?
         alt121 = 2
         LA121_0 = self.input.LA(1)
 
-        if (LA121_0 == 53) :
+        if (LA121_0 == 53):
             alt121 = 1
         if alt121 == 1:
             # C.g:330:62: 'OPTIONAL'
@@ -17536,19 +16527,15 @@ class CParser(Parser):
             if self.failed:
                 return
 
-
-
-
-
     # $ANTLR end synpred86
 
-
-
     # $ANTLR start synpred90
+
     def synpred90_fragment(self, ):
         # C.g:341:4: ( specifier_qualifier_list ( abstract_declarator )? )
         # C.g:341:4: specifier_qualifier_list ( abstract_declarator )?
-        self.following.append(self.FOLLOW_specifier_qualifier_list_in_synpred901046)
+        self.following.append(
+            self.FOLLOW_specifier_qualifier_list_in_synpred901046)
         self.specifier_qualifier_list()
         self.following.pop()
         if self.failed:
@@ -17557,40 +16544,35 @@ class CParser(Parser):
         alt122 = 2
         LA122_0 = self.input.LA(1)
 
-        if (LA122_0 == 62 or LA122_0 == 64 or LA122_0 == 66) :
+        if (LA122_0 == 62 or LA122_0 == 64 or LA122_0 == 66):
             alt122 = 1
         if alt122 == 1:
             # C.g:0:0: abstract_declarator
-            self.following.append(self.FOLLOW_abstract_declarator_in_synpred901048)
+            self.following.append(
+                self.FOLLOW_abstract_declarator_in_synpred901048)
             self.abstract_declarator()
             self.following.pop()
             if self.failed:
                 return
 
-
-
-
-
     # $ANTLR end synpred90
 
-
-
     # $ANTLR start synpred91
+
     def synpred91_fragment(self, ):
         # C.g:346:12: ( direct_abstract_declarator )
         # C.g:346:12: direct_abstract_declarator
-        self.following.append(self.FOLLOW_direct_abstract_declarator_in_synpred911067)
+        self.following.append(
+            self.FOLLOW_direct_abstract_declarator_in_synpred911067)
         self.direct_abstract_declarator()
         self.following.pop()
         if self.failed:
             return
 
-
     # $ANTLR end synpred91
 
-
-
     # $ANTLR start synpred93
+
     def synpred93_fragment(self, ):
         # C.g:351:6: ( '(' abstract_declarator ')' )
         # C.g:351:6: '(' abstract_declarator ')'
@@ -17606,27 +16588,24 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred93
 
-
-
     # $ANTLR start synpred94
+
     def synpred94_fragment(self, ):
         # C.g:351:65: ( abstract_declarator_suffix )
         # C.g:351:65: abstract_declarator_suffix
-        self.following.append(self.FOLLOW_abstract_declarator_suffix_in_synpred941098)
+        self.following.append(
+            self.FOLLOW_abstract_declarator_suffix_in_synpred941098)
         self.abstract_declarator_suffix()
         self.following.pop()
         if self.failed:
             return
 
-
     # $ANTLR end synpred94
 
-
-
     # $ANTLR start synpred109
+
     def synpred109_fragment(self, ):
         # C.g:386:4: ( '(' type_name ')' cast_expression )
         # C.g:386:4: '(' type_name ')' cast_expression
@@ -17647,12 +16626,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred109
 
-
-
     # $ANTLR start synpred114
+
     def synpred114_fragment(self, ):
         # C.g:395:4: ( 'sizeof' unary_expression )
         # C.g:395:4: 'sizeof' unary_expression
@@ -17665,19 +16642,18 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred114
 
-
-
     # $ANTLR start synpred117
+
     def synpred117_fragment(self, ):
         # C.g:409:13: ( '(' argument_expression_list ')' )
         # C.g:409:13: '(' argument_expression_list ')'
         self.match(self.input, 62, self.FOLLOW_62_in_synpred1171420)
         if self.failed:
             return
-        self.following.append(self.FOLLOW_argument_expression_list_in_synpred1171424)
+        self.following.append(
+            self.FOLLOW_argument_expression_list_in_synpred1171424)
         self.argument_expression_list()
         self.following.pop()
         if self.failed:
@@ -17686,19 +16662,18 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred117
 
-
-
     # $ANTLR start synpred118
+
     def synpred118_fragment(self, ):
         # C.g:410:13: ( '(' macro_parameter_list ')' )
         # C.g:410:13: '(' macro_parameter_list ')'
         self.match(self.input, 62, self.FOLLOW_62_in_synpred1181444)
         if self.failed:
             return
-        self.following.append(self.FOLLOW_macro_parameter_list_in_synpred1181446)
+        self.following.append(
+            self.FOLLOW_macro_parameter_list_in_synpred1181446)
         self.macro_parameter_list()
         self.following.pop()
         if self.failed:
@@ -17707,84 +16682,77 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred118
 
-
-
     # $ANTLR start synpred120
+
     def synpred120_fragment(self, ):
         # C.g:412:13: ( '*' IDENTIFIER )
         # C.g:412:13: '*' IDENTIFIER
         self.match(self.input, 66, self.FOLLOW_66_in_synpred1201482)
         if self.failed:
             return
-        self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_synpred1201486)
+        self.match(self.input, IDENTIFIER,
+                   self.FOLLOW_IDENTIFIER_in_synpred1201486)
         if self.failed:
             return
 
-
     # $ANTLR end synpred120
 
-
-
     # $ANTLR start synpred137
+
     def synpred137_fragment(self, ):
         # C.g:443:20: ( STRING_LITERAL )
         # C.g:443:20: STRING_LITERAL
-        self.match(self.input, STRING_LITERAL, self.FOLLOW_STRING_LITERAL_in_synpred1371683)
+        self.match(self.input, STRING_LITERAL,
+                   self.FOLLOW_STRING_LITERAL_in_synpred1371683)
         if self.failed:
             return
 
-
     # $ANTLR end synpred137
 
-
-
     # $ANTLR start synpred138
+
     def synpred138_fragment(self, ):
         # C.g:443:8: ( ( IDENTIFIER )* ( STRING_LITERAL )+ )
         # C.g:443:8: ( IDENTIFIER )* ( STRING_LITERAL )+
         # C.g:443:8: ( IDENTIFIER )*
-        while True: #loop125
+        while True:  # loop125
             alt125 = 2
             LA125_0 = self.input.LA(1)
 
-            if (LA125_0 == IDENTIFIER) :
+            if (LA125_0 == IDENTIFIER):
                 alt125 = 1
 
-
             if alt125 == 1:
                 # C.g:0:0: IDENTIFIER
-                self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_synpred1381680)
+                self.match(self.input, IDENTIFIER,
+                           self.FOLLOW_IDENTIFIER_in_synpred1381680)
                 if self.failed:
                     return
 
-
             else:
-                break #loop125
-
+                break  # loop125
 
         # C.g:443:20: ( STRING_LITERAL )+
         cnt126 = 0
-        while True: #loop126
+        while True:  # loop126
             alt126 = 2
             LA126_0 = self.input.LA(1)
 
-            if (LA126_0 == STRING_LITERAL) :
+            if (LA126_0 == STRING_LITERAL):
                 alt126 = 1
 
-
             if alt126 == 1:
                 # C.g:0:0: STRING_LITERAL
-                self.match(self.input, STRING_LITERAL, self.FOLLOW_STRING_LITERAL_in_synpred1381683)
+                self.match(self.input, STRING_LITERAL,
+                           self.FOLLOW_STRING_LITERAL_in_synpred1381683)
                 if self.failed:
                     return
 
-
             else:
                 if cnt126 >= 1:
-                    break #loop126
+                    break  # loop126
 
                 if self.backtracking > 0:
                     self.failed = True
@@ -17795,14 +16763,10 @@ class CParser(Parser):
 
             cnt126 += 1
 
-
-
-
     # $ANTLR end synpred138
 
-
-
     # $ANTLR start synpred142
+
     def synpred142_fragment(self, ):
         # C.g:458:4: ( lvalue assignment_operator assignment_expression )
         # C.g:458:4: lvalue assignment_operator assignment_expression
@@ -17811,38 +16775,37 @@ class CParser(Parser):
         self.following.pop()
         if self.failed:
             return
-        self.following.append(self.FOLLOW_assignment_operator_in_synpred1421746)
+        self.following.append(
+            self.FOLLOW_assignment_operator_in_synpred1421746)
         self.assignment_operator()
         self.following.pop()
         if self.failed:
             return
-        self.following.append(self.FOLLOW_assignment_expression_in_synpred1421748)
+        self.following.append(
+            self.FOLLOW_assignment_expression_in_synpred1421748)
         self.assignment_expression()
         self.following.pop()
         if self.failed:
             return
 
-
     # $ANTLR end synpred142
 
-
-
     # $ANTLR start synpred169
+
     def synpred169_fragment(self, ):
         # C.g:520:4: ( expression_statement )
         # C.g:520:4: expression_statement
-        self.following.append(self.FOLLOW_expression_statement_in_synpred1692035)
+        self.following.append(
+            self.FOLLOW_expression_statement_in_synpred1692035)
         self.expression_statement()
         self.following.pop()
         if self.failed:
             return
 
-
     # $ANTLR end synpred169
 
-
-
     # $ANTLR start synpred173
+
     def synpred173_fragment(self, ):
         # C.g:524:4: ( macro_statement )
         # C.g:524:4: macro_statement
@@ -17852,12 +16815,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred173
 
-
-
     # $ANTLR start synpred174
+
     def synpred174_fragment(self, ):
         # C.g:525:4: ( asm2_statement )
         # C.g:525:4: asm2_statement
@@ -17867,12 +16828,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred174
 
-
-
     # $ANTLR start synpred181
+
     def synpred181_fragment(self, ):
         # C.g:544:19: ( declaration )
         # C.g:544:19: declaration
@@ -17882,12 +16841,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred181
 
-
-
     # $ANTLR start synpred182
+
     def synpred182_fragment(self, ):
         # C.g:544:33: ( statement_list )
         # C.g:544:33: statement_list
@@ -17897,12 +16854,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred182
 
-
-
     # $ANTLR start synpred186
+
     def synpred186_fragment(self, ):
         # C.g:554:8: ( declaration )
         # C.g:554:8: declaration
@@ -17912,12 +16867,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred186
 
-
-
     # $ANTLR start synpred188
+
     def synpred188_fragment(self, ):
         # C.g:558:4: ( statement )
         # C.g:558:4: statement
@@ -17927,11 +16880,8 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred188
 
-
-
     def synpred69(self):
         self.backtracking += 1
         start = self.input.mark()
@@ -18382,35 +17332,42 @@ class CParser(Parser):
         self.failed = False
         return success
 
-
-
-
-
-    FOLLOW_external_declaration_in_translation_unit74 = frozenset([1, 4, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66])
+    FOLLOW_external_declaration_in_translation_unit74 = frozenset(
+        [1, 4, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66])
     FOLLOW_function_definition_in_external_declaration113 = frozenset([1])
     FOLLOW_declaration_in_external_declaration118 = frozenset([1])
     FOLLOW_macro_statement_in_external_declaration123 = frozenset([1, 25])
     FOLLOW_25_in_external_declaration126 = frozenset([1])
-    FOLLOW_declaration_specifiers_in_function_definition157 = frozenset([4, 58, 59, 60, 62, 66])
-    FOLLOW_declarator_in_function_definition160 = frozenset([4, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
-    FOLLOW_declaration_in_function_definition166 = frozenset([4, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_declaration_specifiers_in_function_definition157 = frozenset(
+        [4, 58, 59, 60, 62, 66])
+    FOLLOW_declarator_in_function_definition160 = frozenset(
+        [4, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_declaration_in_function_definition166 = frozenset(
+        [4, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
     FOLLOW_compound_statement_in_function_definition171 = frozenset([1])
     FOLLOW_compound_statement_in_function_definition180 = frozenset([1])
-    FOLLOW_26_in_declaration203 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66])
-    FOLLOW_declaration_specifiers_in_declaration207 = frozenset([4, 58, 59, 60, 62, 66])
+    FOLLOW_26_in_declaration203 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39,
+                                            40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66])
+    FOLLOW_declaration_specifiers_in_declaration207 = frozenset(
+        [4, 58, 59, 60, 62, 66])
     FOLLOW_init_declarator_list_in_declaration216 = frozenset([25])
     FOLLOW_25_in_declaration220 = frozenset([1])
-    FOLLOW_declaration_specifiers_in_declaration234 = frozenset([4, 25, 58, 59, 60, 62, 66])
+    FOLLOW_declaration_specifiers_in_declaration234 = frozenset(
+        [4, 25, 58, 59, 60, 62, 66])
     FOLLOW_init_declarator_list_in_declaration238 = frozenset([25])
     FOLLOW_25_in_declaration243 = frozenset([1])
-    FOLLOW_storage_class_specifier_in_declaration_specifiers264 = frozenset([1, 4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
-    FOLLOW_type_specifier_in_declaration_specifiers272 = frozenset([1, 4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
-    FOLLOW_type_qualifier_in_declaration_specifiers286 = frozenset([1, 4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_storage_class_specifier_in_declaration_specifiers264 = frozenset(
+        [1, 4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_type_specifier_in_declaration_specifiers272 = frozenset(
+        [1, 4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_type_qualifier_in_declaration_specifiers286 = frozenset(
+        [1, 4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
     FOLLOW_init_declarator_in_init_declarator_list308 = frozenset([1, 27])
     FOLLOW_27_in_init_declarator_list311 = frozenset([4, 58, 59, 60, 62, 66])
     FOLLOW_init_declarator_in_init_declarator_list313 = frozenset([1, 27])
     FOLLOW_declarator_in_init_declarator326 = frozenset([1, 28])
-    FOLLOW_28_in_init_declarator329 = frozenset([4, 5, 6, 7, 8, 9, 10, 43, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_28_in_init_declarator329 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 43, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_initializer_in_init_declarator331 = frozenset([1])
     FOLLOW_set_in_storage_class_specifier0 = frozenset([1])
     FOLLOW_34_in_type_specifier376 = frozenset([1])
@@ -18428,25 +17385,34 @@ class CParser(Parser):
     FOLLOW_IDENTIFIER_in_type_id467 = frozenset([1])
     FOLLOW_struct_or_union_in_struct_or_union_specifier494 = frozenset([4, 43])
     FOLLOW_IDENTIFIER_in_struct_or_union_specifier496 = frozenset([43])
-    FOLLOW_43_in_struct_or_union_specifier499 = frozenset([4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
-    FOLLOW_struct_declaration_list_in_struct_or_union_specifier501 = frozenset([44])
+    FOLLOW_43_in_struct_or_union_specifier499 = frozenset(
+        [4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_struct_declaration_list_in_struct_or_union_specifier501 = frozenset([
+                                                                               44])
     FOLLOW_44_in_struct_or_union_specifier503 = frozenset([1])
     FOLLOW_struct_or_union_in_struct_or_union_specifier508 = frozenset([4])
     FOLLOW_IDENTIFIER_in_struct_or_union_specifier510 = frozenset([1])
     FOLLOW_set_in_struct_or_union0 = frozenset([1])
-    FOLLOW_struct_declaration_in_struct_declaration_list537 = frozenset([1, 4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
-    FOLLOW_specifier_qualifier_list_in_struct_declaration549 = frozenset([4, 47, 58, 59, 60, 62, 66])
+    FOLLOW_struct_declaration_in_struct_declaration_list537 = frozenset(
+        [1, 4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_specifier_qualifier_list_in_struct_declaration549 = frozenset(
+        [4, 47, 58, 59, 60, 62, 66])
     FOLLOW_struct_declarator_list_in_struct_declaration551 = frozenset([25])
     FOLLOW_25_in_struct_declaration553 = frozenset([1])
-    FOLLOW_type_qualifier_in_specifier_qualifier_list566 = frozenset([1, 4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
-    FOLLOW_type_specifier_in_specifier_qualifier_list570 = frozenset([1, 4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_type_qualifier_in_specifier_qualifier_list566 = frozenset(
+        [1, 4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_type_specifier_in_specifier_qualifier_list570 = frozenset(
+        [1, 4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
     FOLLOW_struct_declarator_in_struct_declarator_list584 = frozenset([1, 27])
-    FOLLOW_27_in_struct_declarator_list587 = frozenset([4, 47, 58, 59, 60, 62, 66])
+    FOLLOW_27_in_struct_declarator_list587 = frozenset(
+        [4, 47, 58, 59, 60, 62, 66])
     FOLLOW_struct_declarator_in_struct_declarator_list589 = frozenset([1, 27])
     FOLLOW_declarator_in_struct_declarator602 = frozenset([1, 47])
-    FOLLOW_47_in_struct_declarator605 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_47_in_struct_declarator605 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_constant_expression_in_struct_declarator607 = frozenset([1])
-    FOLLOW_47_in_struct_declarator614 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_47_in_struct_declarator614 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_constant_expression_in_struct_declarator616 = frozenset([1])
     FOLLOW_48_in_enum_specifier634 = frozenset([43])
     FOLLOW_43_in_enum_specifier636 = frozenset([4])
@@ -18465,7 +17431,8 @@ class CParser(Parser):
     FOLLOW_27_in_enumerator_list680 = frozenset([4])
     FOLLOW_enumerator_in_enumerator_list682 = frozenset([1, 27])
     FOLLOW_IDENTIFIER_in_enumerator695 = frozenset([1, 28])
-    FOLLOW_28_in_enumerator698 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_28_in_enumerator698 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_constant_expression_in_enumerator700 = frozenset([1])
     FOLLOW_set_in_type_qualifier0 = frozenset([1])
     FOLLOW_pointer_in_declarator784 = frozenset([4, 58, 59, 60, 62])
@@ -18481,12 +17448,14 @@ class CParser(Parser):
     FOLLOW_declarator_in_direct_declarator834 = frozenset([63])
     FOLLOW_63_in_direct_declarator836 = frozenset([62, 64])
     FOLLOW_declarator_suffix_in_direct_declarator838 = frozenset([1, 62, 64])
-    FOLLOW_64_in_declarator_suffix852 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_64_in_declarator_suffix852 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_constant_expression_in_declarator_suffix854 = frozenset([65])
     FOLLOW_65_in_declarator_suffix856 = frozenset([1])
     FOLLOW_64_in_declarator_suffix866 = frozenset([65])
     FOLLOW_65_in_declarator_suffix868 = frozenset([1])
-    FOLLOW_62_in_declarator_suffix878 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_62_in_declarator_suffix878 = frozenset(
+        [4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
     FOLLOW_parameter_type_list_in_declarator_suffix880 = frozenset([63])
     FOLLOW_63_in_declarator_suffix882 = frozenset([1])
     FOLLOW_62_in_declarator_suffix892 = frozenset([4])
@@ -18494,8 +17463,10 @@ class CParser(Parser):
     FOLLOW_63_in_declarator_suffix896 = frozenset([1])
     FOLLOW_62_in_declarator_suffix906 = frozenset([63])
     FOLLOW_63_in_declarator_suffix908 = frozenset([1])
-    FOLLOW_66_in_pointer919 = frozenset([49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
-    FOLLOW_type_qualifier_in_pointer921 = frozenset([1, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_66_in_pointer919 = frozenset(
+        [49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_type_qualifier_in_pointer921 = frozenset(
+        [1, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
     FOLLOW_pointer_in_pointer924 = frozenset([1])
     FOLLOW_66_in_pointer930 = frozenset([66])
     FOLLOW_pointer_in_pointer932 = frozenset([1])
@@ -18505,109 +17476,165 @@ class CParser(Parser):
     FOLLOW_53_in_parameter_type_list954 = frozenset([67])
     FOLLOW_67_in_parameter_type_list958 = frozenset([1])
     FOLLOW_parameter_declaration_in_parameter_list971 = frozenset([1, 27])
-    FOLLOW_27_in_parameter_list974 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
-    FOLLOW_53_in_parameter_list977 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_27_in_parameter_list974 = frozenset(
+        [4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_53_in_parameter_list977 = frozenset(
+        [4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
     FOLLOW_parameter_declaration_in_parameter_list981 = frozenset([1, 27])
-    FOLLOW_declaration_specifiers_in_parameter_declaration994 = frozenset([1, 4, 53, 58, 59, 60, 62, 64, 66])
-    FOLLOW_declarator_in_parameter_declaration997 = frozenset([1, 4, 53, 58, 59, 60, 62, 64, 66])
-    FOLLOW_abstract_declarator_in_parameter_declaration999 = frozenset([1, 4, 53, 58, 59, 60, 62, 64, 66])
+    FOLLOW_declaration_specifiers_in_parameter_declaration994 = frozenset(
+        [1, 4, 53, 58, 59, 60, 62, 64, 66])
+    FOLLOW_declarator_in_parameter_declaration997 = frozenset(
+        [1, 4, 53, 58, 59, 60, 62, 64, 66])
+    FOLLOW_abstract_declarator_in_parameter_declaration999 = frozenset(
+        [1, 4, 53, 58, 59, 60, 62, 64, 66])
     FOLLOW_53_in_parameter_declaration1004 = frozenset([1])
     FOLLOW_pointer_in_parameter_declaration1013 = frozenset([4, 66])
     FOLLOW_IDENTIFIER_in_parameter_declaration1016 = frozenset([1])
     FOLLOW_IDENTIFIER_in_identifier_list1027 = frozenset([1, 27])
     FOLLOW_27_in_identifier_list1031 = frozenset([4])
     FOLLOW_IDENTIFIER_in_identifier_list1033 = frozenset([1, 27])
-    FOLLOW_specifier_qualifier_list_in_type_name1046 = frozenset([1, 62, 64, 66])
+    FOLLOW_specifier_qualifier_list_in_type_name1046 = frozenset([
+                                                                 1, 62, 64, 66])
     FOLLOW_abstract_declarator_in_type_name1048 = frozenset([1])
     FOLLOW_type_id_in_type_name1054 = frozenset([1])
     FOLLOW_pointer_in_abstract_declarator1065 = frozenset([1, 62, 64])
-    FOLLOW_direct_abstract_declarator_in_abstract_declarator1067 = frozenset([1])
-    FOLLOW_direct_abstract_declarator_in_abstract_declarator1073 = frozenset([1])
+    FOLLOW_direct_abstract_declarator_in_abstract_declarator1067 = frozenset([
+                                                                             1])
+    FOLLOW_direct_abstract_declarator_in_abstract_declarator1073 = frozenset([
+                                                                             1])
     FOLLOW_62_in_direct_abstract_declarator1086 = frozenset([62, 64, 66])
-    FOLLOW_abstract_declarator_in_direct_abstract_declarator1088 = frozenset([63])
+    FOLLOW_abstract_declarator_in_direct_abstract_declarator1088 = frozenset([
+                                                                             63])
     FOLLOW_63_in_direct_abstract_declarator1090 = frozenset([1, 62, 64])
-    FOLLOW_abstract_declarator_suffix_in_direct_abstract_declarator1094 = frozenset([1, 62, 64])
-    FOLLOW_abstract_declarator_suffix_in_direct_abstract_declarator1098 = frozenset([1, 62, 64])
+    FOLLOW_abstract_declarator_suffix_in_direct_abstract_declarator1094 = frozenset([
+                                                                                    1, 62, 64])
+    FOLLOW_abstract_declarator_suffix_in_direct_abstract_declarator1098 = frozenset([
+                                                                                    1, 62, 64])
     FOLLOW_64_in_abstract_declarator_suffix1110 = frozenset([65])
     FOLLOW_65_in_abstract_declarator_suffix1112 = frozenset([1])
-    FOLLOW_64_in_abstract_declarator_suffix1117 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_constant_expression_in_abstract_declarator_suffix1119 = frozenset([65])
+    FOLLOW_64_in_abstract_declarator_suffix1117 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_constant_expression_in_abstract_declarator_suffix1119 = frozenset([
+                                                                             65])
     FOLLOW_65_in_abstract_declarator_suffix1121 = frozenset([1])
     FOLLOW_62_in_abstract_declarator_suffix1126 = frozenset([63])
     FOLLOW_63_in_abstract_declarator_suffix1128 = frozenset([1])
-    FOLLOW_62_in_abstract_declarator_suffix1133 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
-    FOLLOW_parameter_type_list_in_abstract_declarator_suffix1135 = frozenset([63])
+    FOLLOW_62_in_abstract_declarator_suffix1133 = frozenset(
+        [4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_parameter_type_list_in_abstract_declarator_suffix1135 = frozenset([
+                                                                             63])
     FOLLOW_63_in_abstract_declarator_suffix1137 = frozenset([1])
     FOLLOW_assignment_expression_in_initializer1150 = frozenset([1])
-    FOLLOW_43_in_initializer1155 = frozenset([4, 5, 6, 7, 8, 9, 10, 43, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_43_in_initializer1155 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 43, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_initializer_list_in_initializer1157 = frozenset([27, 44])
     FOLLOW_27_in_initializer1159 = frozenset([44])
     FOLLOW_44_in_initializer1162 = frozenset([1])
     FOLLOW_initializer_in_initializer_list1173 = frozenset([1, 27])
-    FOLLOW_27_in_initializer_list1176 = frozenset([4, 5, 6, 7, 8, 9, 10, 43, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_27_in_initializer_list1176 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 43, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_initializer_in_initializer_list1178 = frozenset([1, 27])
-    FOLLOW_assignment_expression_in_argument_expression_list1196 = frozenset([1, 27, 53])
+    FOLLOW_assignment_expression_in_argument_expression_list1196 = frozenset([
+                                                                             1, 27, 53])
     FOLLOW_53_in_argument_expression_list1199 = frozenset([1, 27])
-    FOLLOW_27_in_argument_expression_list1204 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_assignment_expression_in_argument_expression_list1206 = frozenset([1, 27, 53])
+    FOLLOW_27_in_argument_expression_list1204 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_assignment_expression_in_argument_expression_list1206 = frozenset([
+                                                                             1, 27, 53])
     FOLLOW_53_in_argument_expression_list1209 = frozenset([1, 27])
-    FOLLOW_multiplicative_expression_in_additive_expression1225 = frozenset([1, 68, 69])
-    FOLLOW_68_in_additive_expression1229 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_multiplicative_expression_in_additive_expression1231 = frozenset([1, 68, 69])
-    FOLLOW_69_in_additive_expression1235 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_multiplicative_expression_in_additive_expression1237 = frozenset([1, 68, 69])
-    FOLLOW_cast_expression_in_multiplicative_expression1251 = frozenset([1, 66, 70, 71])
-    FOLLOW_66_in_multiplicative_expression1255 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_cast_expression_in_multiplicative_expression1257 = frozenset([1, 66, 70, 71])
-    FOLLOW_70_in_multiplicative_expression1261 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_cast_expression_in_multiplicative_expression1263 = frozenset([1, 66, 70, 71])
-    FOLLOW_71_in_multiplicative_expression1267 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_cast_expression_in_multiplicative_expression1269 = frozenset([1, 66, 70, 71])
-    FOLLOW_62_in_cast_expression1282 = frozenset([4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_multiplicative_expression_in_additive_expression1225 = frozenset([
+                                                                            1, 68, 69])
+    FOLLOW_68_in_additive_expression1229 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_multiplicative_expression_in_additive_expression1231 = frozenset([
+                                                                            1, 68, 69])
+    FOLLOW_69_in_additive_expression1235 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_multiplicative_expression_in_additive_expression1237 = frozenset([
+                                                                            1, 68, 69])
+    FOLLOW_cast_expression_in_multiplicative_expression1251 = frozenset([
+                                                                        1, 66, 70, 71])
+    FOLLOW_66_in_multiplicative_expression1255 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_cast_expression_in_multiplicative_expression1257 = frozenset([
+                                                                        1, 66, 70, 71])
+    FOLLOW_70_in_multiplicative_expression1261 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_cast_expression_in_multiplicative_expression1263 = frozenset([
+                                                                        1, 66, 70, 71])
+    FOLLOW_71_in_multiplicative_expression1267 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_cast_expression_in_multiplicative_expression1269 = frozenset([
+                                                                        1, 66, 70, 71])
+    FOLLOW_62_in_cast_expression1282 = frozenset(
+        [4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
     FOLLOW_type_name_in_cast_expression1284 = frozenset([63])
-    FOLLOW_63_in_cast_expression1286 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_63_in_cast_expression1286 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_cast_expression_in_cast_expression1288 = frozenset([1])
     FOLLOW_unary_expression_in_cast_expression1293 = frozenset([1])
     FOLLOW_postfix_expression_in_unary_expression1304 = frozenset([1])
-    FOLLOW_72_in_unary_expression1309 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_72_in_unary_expression1309 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_unary_expression_in_unary_expression1311 = frozenset([1])
-    FOLLOW_73_in_unary_expression1316 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_73_in_unary_expression1316 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_unary_expression_in_unary_expression1318 = frozenset([1])
-    FOLLOW_unary_operator_in_unary_expression1323 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_unary_operator_in_unary_expression1323 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_cast_expression_in_unary_expression1325 = frozenset([1])
-    FOLLOW_74_in_unary_expression1330 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_74_in_unary_expression1330 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_unary_expression_in_unary_expression1332 = frozenset([1])
     FOLLOW_74_in_unary_expression1337 = frozenset([62])
-    FOLLOW_62_in_unary_expression1339 = frozenset([4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_62_in_unary_expression1339 = frozenset(
+        [4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
     FOLLOW_type_name_in_unary_expression1341 = frozenset([63])
     FOLLOW_63_in_unary_expression1343 = frozenset([1])
-    FOLLOW_primary_expression_in_postfix_expression1367 = frozenset([1, 62, 64, 66, 72, 73, 75, 76])
-    FOLLOW_64_in_postfix_expression1383 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_primary_expression_in_postfix_expression1367 = frozenset(
+        [1, 62, 64, 66, 72, 73, 75, 76])
+    FOLLOW_64_in_postfix_expression1383 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_expression_in_postfix_expression1385 = frozenset([65])
-    FOLLOW_65_in_postfix_expression1387 = frozenset([1, 62, 64, 66, 72, 73, 75, 76])
+    FOLLOW_65_in_postfix_expression1387 = frozenset(
+        [1, 62, 64, 66, 72, 73, 75, 76])
     FOLLOW_62_in_postfix_expression1401 = frozenset([63])
-    FOLLOW_63_in_postfix_expression1405 = frozenset([1, 62, 64, 66, 72, 73, 75, 76])
-    FOLLOW_62_in_postfix_expression1420 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_63_in_postfix_expression1405 = frozenset(
+        [1, 62, 64, 66, 72, 73, 75, 76])
+    FOLLOW_62_in_postfix_expression1420 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_argument_expression_list_in_postfix_expression1424 = frozenset([63])
-    FOLLOW_63_in_postfix_expression1428 = frozenset([1, 62, 64, 66, 72, 73, 75, 76])
-    FOLLOW_62_in_postfix_expression1444 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_63_in_postfix_expression1428 = frozenset(
+        [1, 62, 64, 66, 72, 73, 75, 76])
+    FOLLOW_62_in_postfix_expression1444 = frozenset(
+        [4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
     FOLLOW_macro_parameter_list_in_postfix_expression1446 = frozenset([63])
-    FOLLOW_63_in_postfix_expression1448 = frozenset([1, 62, 64, 66, 72, 73, 75, 76])
+    FOLLOW_63_in_postfix_expression1448 = frozenset(
+        [1, 62, 64, 66, 72, 73, 75, 76])
     FOLLOW_75_in_postfix_expression1462 = frozenset([4])
-    FOLLOW_IDENTIFIER_in_postfix_expression1466 = frozenset([1, 62, 64, 66, 72, 73, 75, 76])
+    FOLLOW_IDENTIFIER_in_postfix_expression1466 = frozenset(
+        [1, 62, 64, 66, 72, 73, 75, 76])
     FOLLOW_66_in_postfix_expression1482 = frozenset([4])
-    FOLLOW_IDENTIFIER_in_postfix_expression1486 = frozenset([1, 62, 64, 66, 72, 73, 75, 76])
+    FOLLOW_IDENTIFIER_in_postfix_expression1486 = frozenset(
+        [1, 62, 64, 66, 72, 73, 75, 76])
     FOLLOW_76_in_postfix_expression1502 = frozenset([4])
-    FOLLOW_IDENTIFIER_in_postfix_expression1506 = frozenset([1, 62, 64, 66, 72, 73, 75, 76])
-    FOLLOW_72_in_postfix_expression1522 = frozenset([1, 62, 64, 66, 72, 73, 75, 76])
-    FOLLOW_73_in_postfix_expression1536 = frozenset([1, 62, 64, 66, 72, 73, 75, 76])
-    FOLLOW_parameter_declaration_in_macro_parameter_list1559 = frozenset([1, 27])
-    FOLLOW_27_in_macro_parameter_list1562 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
-    FOLLOW_parameter_declaration_in_macro_parameter_list1564 = frozenset([1, 27])
+    FOLLOW_IDENTIFIER_in_postfix_expression1506 = frozenset(
+        [1, 62, 64, 66, 72, 73, 75, 76])
+    FOLLOW_72_in_postfix_expression1522 = frozenset(
+        [1, 62, 64, 66, 72, 73, 75, 76])
+    FOLLOW_73_in_postfix_expression1536 = frozenset(
+        [1, 62, 64, 66, 72, 73, 75, 76])
+    FOLLOW_parameter_declaration_in_macro_parameter_list1559 = frozenset([
+                                                                         1, 27])
+    FOLLOW_27_in_macro_parameter_list1562 = frozenset(
+        [4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_parameter_declaration_in_macro_parameter_list1564 = frozenset([
+                                                                         1, 27])
     FOLLOW_set_in_unary_operator0 = frozenset([1])
     FOLLOW_IDENTIFIER_in_primary_expression1613 = frozenset([1])
     FOLLOW_constant_in_primary_expression1618 = frozenset([1])
-    FOLLOW_62_in_primary_expression1623 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_62_in_primary_expression1623 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_expression_in_primary_expression1625 = frozenset([63])
     FOLLOW_63_in_primary_expression1627 = frozenset([1])
     FOLLOW_HEX_LITERAL_in_constant1643 = frozenset([1])
@@ -18619,44 +17646,71 @@ class CParser(Parser):
     FOLLOW_IDENTIFIER_in_constant1688 = frozenset([1, 4])
     FOLLOW_FLOATING_POINT_LITERAL_in_constant1699 = frozenset([1])
     FOLLOW_assignment_expression_in_expression1715 = frozenset([1, 27])
-    FOLLOW_27_in_expression1718 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_27_in_expression1718 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_assignment_expression_in_expression1720 = frozenset([1, 27])
     FOLLOW_conditional_expression_in_constant_expression1733 = frozenset([1])
-    FOLLOW_lvalue_in_assignment_expression1744 = frozenset([28, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89])
-    FOLLOW_assignment_operator_in_assignment_expression1746 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_lvalue_in_assignment_expression1744 = frozenset(
+        [28, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89])
+    FOLLOW_assignment_operator_in_assignment_expression1746 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_assignment_expression_in_assignment_expression1748 = frozenset([1])
     FOLLOW_conditional_expression_in_assignment_expression1753 = frozenset([1])
     FOLLOW_unary_expression_in_lvalue1765 = frozenset([1])
     FOLLOW_set_in_assignment_operator0 = frozenset([1])
-    FOLLOW_logical_or_expression_in_conditional_expression1839 = frozenset([1, 90])
-    FOLLOW_90_in_conditional_expression1842 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_logical_or_expression_in_conditional_expression1839 = frozenset([
+                                                                           1, 90])
+    FOLLOW_90_in_conditional_expression1842 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_expression_in_conditional_expression1844 = frozenset([47])
-    FOLLOW_47_in_conditional_expression1846 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_conditional_expression_in_conditional_expression1848 = frozenset([1])
-    FOLLOW_logical_and_expression_in_logical_or_expression1863 = frozenset([1, 91])
-    FOLLOW_91_in_logical_or_expression1866 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_logical_and_expression_in_logical_or_expression1868 = frozenset([1, 91])
-    FOLLOW_inclusive_or_expression_in_logical_and_expression1881 = frozenset([1, 92])
-    FOLLOW_92_in_logical_and_expression1884 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_inclusive_or_expression_in_logical_and_expression1886 = frozenset([1, 92])
-    FOLLOW_exclusive_or_expression_in_inclusive_or_expression1899 = frozenset([1, 93])
-    FOLLOW_93_in_inclusive_or_expression1902 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_exclusive_or_expression_in_inclusive_or_expression1904 = frozenset([1, 93])
+    FOLLOW_47_in_conditional_expression1846 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_conditional_expression_in_conditional_expression1848 = frozenset([
+                                                                            1])
+    FOLLOW_logical_and_expression_in_logical_or_expression1863 = frozenset([
+                                                                           1, 91])
+    FOLLOW_91_in_logical_or_expression1866 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_logical_and_expression_in_logical_or_expression1868 = frozenset([
+                                                                           1, 91])
+    FOLLOW_inclusive_or_expression_in_logical_and_expression1881 = frozenset([
+                                                                             1, 92])
+    FOLLOW_92_in_logical_and_expression1884 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_inclusive_or_expression_in_logical_and_expression1886 = frozenset([
+                                                                             1, 92])
+    FOLLOW_exclusive_or_expression_in_inclusive_or_expression1899 = frozenset([
+                                                                              1, 93])
+    FOLLOW_93_in_inclusive_or_expression1902 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_exclusive_or_expression_in_inclusive_or_expression1904 = frozenset([
+                                                                              1, 93])
     FOLLOW_and_expression_in_exclusive_or_expression1917 = frozenset([1, 94])
-    FOLLOW_94_in_exclusive_or_expression1920 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_94_in_exclusive_or_expression1920 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_and_expression_in_exclusive_or_expression1922 = frozenset([1, 94])
     FOLLOW_equality_expression_in_and_expression1935 = frozenset([1, 77])
-    FOLLOW_77_in_and_expression1938 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_77_in_and_expression1938 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_equality_expression_in_and_expression1940 = frozenset([1, 77])
-    FOLLOW_relational_expression_in_equality_expression1952 = frozenset([1, 95, 96])
-    FOLLOW_set_in_equality_expression1955 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_relational_expression_in_equality_expression1961 = frozenset([1, 95, 96])
-    FOLLOW_shift_expression_in_relational_expression1975 = frozenset([1, 97, 98, 99, 100])
-    FOLLOW_set_in_relational_expression1978 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_shift_expression_in_relational_expression1988 = frozenset([1, 97, 98, 99, 100])
-    FOLLOW_additive_expression_in_shift_expression2001 = frozenset([1, 101, 102])
-    FOLLOW_set_in_shift_expression2004 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_additive_expression_in_shift_expression2010 = frozenset([1, 101, 102])
+    FOLLOW_relational_expression_in_equality_expression1952 = frozenset([
+                                                                        1, 95, 96])
+    FOLLOW_set_in_equality_expression1955 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_relational_expression_in_equality_expression1961 = frozenset([
+                                                                        1, 95, 96])
+    FOLLOW_shift_expression_in_relational_expression1975 = frozenset([
+                                                                     1, 97, 98, 99, 100])
+    FOLLOW_set_in_relational_expression1978 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_shift_expression_in_relational_expression1988 = frozenset([
+                                                                     1, 97, 98, 99, 100])
+    FOLLOW_additive_expression_in_shift_expression2001 = frozenset([
+                                                                   1, 101, 102])
+    FOLLOW_set_in_shift_expression2004 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_additive_expression_in_shift_expression2010 = frozenset([
+                                                                   1, 101, 102])
     FOLLOW_labeled_statement_in_statement2025 = frozenset([1])
     FOLLOW_compound_statement_in_statement2030 = frozenset([1])
     FOLLOW_expression_statement_in_statement2035 = frozenset([1])
@@ -18670,72 +17724,101 @@ class CParser(Parser):
     FOLLOW_declaration_in_statement2075 = frozenset([1])
     FOLLOW_103_in_asm2_statement2086 = frozenset([4])
     FOLLOW_IDENTIFIER_in_asm2_statement2089 = frozenset([62])
-    FOLLOW_62_in_asm2_statement2091 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
-    FOLLOW_set_in_asm2_statement2094 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_62_in_asm2_statement2091 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58,
+                                                59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_set_in_asm2_statement2094 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58,
+                                                 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_63_in_asm2_statement2101 = frozenset([25])
     FOLLOW_25_in_asm2_statement2103 = frozenset([1])
     FOLLOW_104_in_asm1_statement2115 = frozenset([43])
-    FOLLOW_43_in_asm1_statement2117 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
-    FOLLOW_set_in_asm1_statement2120 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_43_in_asm1_statement2117 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58,
+                                                59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_set_in_asm1_statement2120 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58,
+                                                 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_44_in_asm1_statement2127 = frozenset([1])
     FOLLOW_105_in_asm_statement2138 = frozenset([43])
-    FOLLOW_43_in_asm_statement2140 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
-    FOLLOW_set_in_asm_statement2143 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_43_in_asm_statement2140 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58,
+                                               59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_set_in_asm_statement2143 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58,
+                                                59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_44_in_asm_statement2150 = frozenset([1])
     FOLLOW_IDENTIFIER_in_macro_statement2162 = frozenset([62])
-    FOLLOW_62_in_macro_statement2164 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
-    FOLLOW_declaration_in_macro_statement2166 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
-    FOLLOW_statement_list_in_macro_statement2170 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 63, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_62_in_macro_statement2164 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51,
+                                                 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_declaration_in_macro_statement2166 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49,
+                                                          50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_statement_list_in_macro_statement2170 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 63, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_expression_in_macro_statement2173 = frozenset([63])
     FOLLOW_63_in_macro_statement2176 = frozenset([1])
     FOLLOW_IDENTIFIER_in_labeled_statement2188 = frozenset([47])
-    FOLLOW_47_in_labeled_statement2190 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_47_in_labeled_statement2190 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50,
+                                                   51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_statement_in_labeled_statement2192 = frozenset([1])
-    FOLLOW_106_in_labeled_statement2197 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_106_in_labeled_statement2197 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_constant_expression_in_labeled_statement2199 = frozenset([47])
-    FOLLOW_47_in_labeled_statement2201 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_47_in_labeled_statement2201 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50,
+                                                   51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_statement_in_labeled_statement2203 = frozenset([1])
     FOLLOW_107_in_labeled_statement2208 = frozenset([47])
-    FOLLOW_47_in_labeled_statement2210 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_47_in_labeled_statement2210 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50,
+                                                   51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_statement_in_labeled_statement2212 = frozenset([1])
-    FOLLOW_43_in_compound_statement2223 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
-    FOLLOW_declaration_in_compound_statement2225 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_43_in_compound_statement2223 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 48, 49,
+                                                    50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_declaration_in_compound_statement2225 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 48,
+                                                             49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_statement_list_in_compound_statement2228 = frozenset([44])
     FOLLOW_44_in_compound_statement2231 = frozenset([1])
-    FOLLOW_statement_in_statement_list2242 = frozenset([1, 4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_statement_in_statement_list2242 = frozenset([1, 4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49,
+                                                       50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_25_in_expression_statement2254 = frozenset([1])
     FOLLOW_expression_in_expression_statement2259 = frozenset([25])
     FOLLOW_25_in_expression_statement2261 = frozenset([1])
     FOLLOW_108_in_selection_statement2272 = frozenset([62])
-    FOLLOW_62_in_selection_statement2274 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_62_in_selection_statement2274 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_expression_in_selection_statement2278 = frozenset([63])
-    FOLLOW_63_in_selection_statement2280 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_63_in_selection_statement2280 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50,
+                                                     51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_statement_in_selection_statement2284 = frozenset([1, 109])
-    FOLLOW_109_in_selection_statement2299 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_109_in_selection_statement2299 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49,
+                                                      50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_statement_in_selection_statement2301 = frozenset([1])
     FOLLOW_110_in_selection_statement2308 = frozenset([62])
-    FOLLOW_62_in_selection_statement2310 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_62_in_selection_statement2310 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_expression_in_selection_statement2312 = frozenset([63])
-    FOLLOW_63_in_selection_statement2314 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_63_in_selection_statement2314 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50,
+                                                     51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_statement_in_selection_statement2316 = frozenset([1])
     FOLLOW_111_in_iteration_statement2327 = frozenset([62])
-    FOLLOW_62_in_iteration_statement2329 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_62_in_iteration_statement2329 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_expression_in_iteration_statement2333 = frozenset([63])
-    FOLLOW_63_in_iteration_statement2335 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_63_in_iteration_statement2335 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50,
+                                                     51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_statement_in_iteration_statement2337 = frozenset([1])
-    FOLLOW_112_in_iteration_statement2344 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_112_in_iteration_statement2344 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49,
+                                                      50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_statement_in_iteration_statement2346 = frozenset([111])
     FOLLOW_111_in_iteration_statement2348 = frozenset([62])
-    FOLLOW_62_in_iteration_statement2350 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_62_in_iteration_statement2350 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_expression_in_iteration_statement2354 = frozenset([63])
     FOLLOW_63_in_iteration_statement2356 = frozenset([25])
     FOLLOW_25_in_iteration_statement2358 = frozenset([1])
     FOLLOW_113_in_iteration_statement2365 = frozenset([62])
-    FOLLOW_62_in_iteration_statement2367 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_expression_statement_in_iteration_statement2369 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_expression_statement_in_iteration_statement2373 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 63, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_62_in_iteration_statement2367 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 25, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_expression_statement_in_iteration_statement2369 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 25, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_expression_statement_in_iteration_statement2373 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 63, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_expression_in_iteration_statement2375 = frozenset([63])
-    FOLLOW_63_in_iteration_statement2378 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_63_in_iteration_statement2378 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50,
+                                                     51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_statement_in_iteration_statement2380 = frozenset([1])
     FOLLOW_114_in_jump_statement2393 = frozenset([4])
     FOLLOW_IDENTIFIER_in_jump_statement2395 = frozenset([25])
@@ -18746,13 +17829,17 @@ class CParser(Parser):
     FOLLOW_25_in_jump_statement2411 = frozenset([1])
     FOLLOW_117_in_jump_statement2416 = frozenset([25])
     FOLLOW_25_in_jump_statement2418 = frozenset([1])
-    FOLLOW_117_in_jump_statement2423 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_117_in_jump_statement2423 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_expression_in_jump_statement2425 = frozenset([25])
     FOLLOW_25_in_jump_statement2427 = frozenset([1])
     FOLLOW_declaration_specifiers_in_synpred2100 = frozenset([1])
-    FOLLOW_declaration_specifiers_in_synpred4100 = frozenset([4, 58, 59, 60, 62, 66])
-    FOLLOW_declarator_in_synpred4103 = frozenset([4, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
-    FOLLOW_declaration_in_synpred4105 = frozenset([4, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_declaration_specifiers_in_synpred4100 = frozenset(
+        [4, 58, 59, 60, 62, 66])
+    FOLLOW_declarator_in_synpred4103 = frozenset(
+        [4, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_declaration_in_synpred4105 = frozenset(
+        [4, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
     FOLLOW_43_in_synpred4108 = frozenset([1])
     FOLLOW_declaration_in_synpred5118 = frozenset([1])
     FOLLOW_declaration_specifiers_in_synpred7157 = frozenset([1])
@@ -18760,8 +17847,10 @@ class CParser(Parser):
     FOLLOW_type_specifier_in_synpred14272 = frozenset([1])
     FOLLOW_type_qualifier_in_synpred15286 = frozenset([1])
     FOLLOW_type_qualifier_in_synpred33444 = frozenset([1])
-    FOLLOW_IDENTIFIER_in_synpred34442 = frozenset([4, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66])
-    FOLLOW_type_qualifier_in_synpred34444 = frozenset([4, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66])
+    FOLLOW_IDENTIFIER_in_synpred34442 = frozenset(
+        [4, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66])
+    FOLLOW_type_qualifier_in_synpred34444 = frozenset(
+        [4, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66])
     FOLLOW_declarator_in_synpred34447 = frozenset([1])
     FOLLOW_type_qualifier_in_synpred39566 = frozenset([1])
     FOLLOW_type_specifier_in_synpred40570 = frozenset([1])
@@ -18773,7 +17862,8 @@ class CParser(Parser):
     FOLLOW_declarator_suffix_in_synpred67821 = frozenset([1])
     FOLLOW_58_in_synpred69830 = frozenset([1])
     FOLLOW_declarator_suffix_in_synpred70838 = frozenset([1])
-    FOLLOW_62_in_synpred73878 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_62_in_synpred73878 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39,
+                                          40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
     FOLLOW_parameter_type_list_in_synpred73880 = frozenset([63])
     FOLLOW_63_in_synpred73882 = frozenset([1])
     FOLLOW_62_in_synpred74892 = frozenset([4])
@@ -18781,38 +17871,51 @@ class CParser(Parser):
     FOLLOW_63_in_synpred74896 = frozenset([1])
     FOLLOW_type_qualifier_in_synpred75921 = frozenset([1])
     FOLLOW_pointer_in_synpred76924 = frozenset([1])
-    FOLLOW_66_in_synpred77919 = frozenset([49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
-    FOLLOW_type_qualifier_in_synpred77921 = frozenset([1, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_66_in_synpred77919 = frozenset(
+        [49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_type_qualifier_in_synpred77921 = frozenset(
+        [1, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
     FOLLOW_pointer_in_synpred77924 = frozenset([1])
     FOLLOW_66_in_synpred78930 = frozenset([66])
     FOLLOW_pointer_in_synpred78932 = frozenset([1])
     FOLLOW_53_in_synpred81977 = frozenset([1])
-    FOLLOW_27_in_synpred82974 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
-    FOLLOW_53_in_synpred82977 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_27_in_synpred82974 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39,
+                                          40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_53_in_synpred82977 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39,
+                                          40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
     FOLLOW_parameter_declaration_in_synpred82981 = frozenset([1])
     FOLLOW_declarator_in_synpred83997 = frozenset([1])
     FOLLOW_abstract_declarator_in_synpred84999 = frozenset([1])
-    FOLLOW_declaration_specifiers_in_synpred86994 = frozenset([1, 4, 53, 58, 59, 60, 62, 64, 66])
-    FOLLOW_declarator_in_synpred86997 = frozenset([1, 4, 53, 58, 59, 60, 62, 64, 66])
-    FOLLOW_abstract_declarator_in_synpred86999 = frozenset([1, 4, 53, 58, 59, 60, 62, 64, 66])
+    FOLLOW_declaration_specifiers_in_synpred86994 = frozenset(
+        [1, 4, 53, 58, 59, 60, 62, 64, 66])
+    FOLLOW_declarator_in_synpred86997 = frozenset(
+        [1, 4, 53, 58, 59, 60, 62, 64, 66])
+    FOLLOW_abstract_declarator_in_synpred86999 = frozenset(
+        [1, 4, 53, 58, 59, 60, 62, 64, 66])
     FOLLOW_53_in_synpred861004 = frozenset([1])
-    FOLLOW_specifier_qualifier_list_in_synpred901046 = frozenset([1, 62, 64, 66])
+    FOLLOW_specifier_qualifier_list_in_synpred901046 = frozenset([
+                                                                 1, 62, 64, 66])
     FOLLOW_abstract_declarator_in_synpred901048 = frozenset([1])
     FOLLOW_direct_abstract_declarator_in_synpred911067 = frozenset([1])
     FOLLOW_62_in_synpred931086 = frozenset([62, 64, 66])
     FOLLOW_abstract_declarator_in_synpred931088 = frozenset([63])
     FOLLOW_63_in_synpred931090 = frozenset([1])
     FOLLOW_abstract_declarator_suffix_in_synpred941098 = frozenset([1])
-    FOLLOW_62_in_synpred1091282 = frozenset([4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_62_in_synpred1091282 = frozenset(
+        [4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
     FOLLOW_type_name_in_synpred1091284 = frozenset([63])
-    FOLLOW_63_in_synpred1091286 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_63_in_synpred1091286 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_cast_expression_in_synpred1091288 = frozenset([1])
-    FOLLOW_74_in_synpred1141330 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_74_in_synpred1141330 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_unary_expression_in_synpred1141332 = frozenset([1])
-    FOLLOW_62_in_synpred1171420 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_62_in_synpred1171420 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_argument_expression_list_in_synpred1171424 = frozenset([63])
     FOLLOW_63_in_synpred1171428 = frozenset([1])
-    FOLLOW_62_in_synpred1181444 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_62_in_synpred1181444 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38,
+                                            39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
     FOLLOW_macro_parameter_list_in_synpred1181446 = frozenset([63])
     FOLLOW_63_in_synpred1181448 = frozenset([1])
     FOLLOW_66_in_synpred1201482 = frozenset([4])
@@ -18820,8 +17923,10 @@ class CParser(Parser):
     FOLLOW_STRING_LITERAL_in_synpred1371683 = frozenset([1])
     FOLLOW_IDENTIFIER_in_synpred1381680 = frozenset([4, 9])
     FOLLOW_STRING_LITERAL_in_synpred1381683 = frozenset([1, 9])
-    FOLLOW_lvalue_in_synpred1421744 = frozenset([28, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89])
-    FOLLOW_assignment_operator_in_synpred1421746 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_lvalue_in_synpred1421744 = frozenset(
+        [28, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89])
+    FOLLOW_assignment_operator_in_synpred1421746 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_assignment_expression_in_synpred1421748 = frozenset([1])
     FOLLOW_expression_statement_in_synpred1692035 = frozenset([1])
     FOLLOW_macro_statement_in_synpred1732055 = frozenset([1])
@@ -18830,4 +17935,3 @@ class CParser(Parser):
     FOLLOW_statement_list_in_synpred1822170 = frozenset([1])
     FOLLOW_declaration_in_synpred1862225 = frozenset([1])
     FOLLOW_statement_in_synpred1882242 = frozenset([1])
-
diff --git a/BaseTools/Source/Python/Ecc/CParser4/CLexer.py b/BaseTools/Source/Python/Ecc/CParser4/CLexer.py
index a2cc5bf56e66..739152edf0f5 100644
--- a/BaseTools/Source/Python/Ecc/CParser4/CLexer.py
+++ b/BaseTools/Source/Python/Ecc/CParser4/CLexer.py
@@ -5,7 +5,7 @@ from typing.io import TextIO
 import sys
 
 
-## @file
+# @file
 # The file defines the parser for C source files.
 #
 # THIS FILE IS AUTO-GENENERATED. PLEASE DON NOT MODIFY THIS FILE.
@@ -21,6 +21,7 @@ import sys
 import Ecc.CodeFragment as CodeFragment
 import Ecc.FileProfile as FileProfile
 
+
 def serializedATN():
     with StringIO() as buf:
         buf.write("\3\u608b\ua72a\u8133\ub9ed\u417c\u3be7\u7786\u5964\2k")
@@ -423,7 +424,7 @@ class CLexer(Lexer):
 
     atn = ATNDeserializer().deserialize(serializedATN())
 
-    decisionsToDFA = [ DFA(ds, i) for i, ds in enumerate(atn.decisionToState) ]
+    decisionsToDFA = [DFA(ds, i) for i, ds in enumerate(atn.decisionToState)]
 
     T__0 = 1
     T__1 = 2
@@ -531,96 +532,99 @@ class CLexer(Lexer):
     LINE_COMMENT = 104
     LINE_COMMAND = 105
 
-    channelNames = [ u"DEFAULT_TOKEN_CHANNEL", u"HIDDEN" ]
+    channelNames = [u"DEFAULT_TOKEN_CHANNEL", u"HIDDEN"]
 
-    modeNames = [ "DEFAULT_MODE" ]
+    modeNames = ["DEFAULT_MODE"]
 
-    literalNames = [ "<INVALID>",
-            "'{'", "';'", "'typedef'", "','", "'='", "'extern'", "'static'",
-            "'auto'", "'register'", "'STATIC'", "'void'", "'char'", "'short'",
-            "'int'", "'long'", "'float'", "'double'", "'signed'", "'unsigned'",
-            "'}'", "'struct'", "'union'", "':'", "'enum'", "'const'", "'volatile'",
-            "'IN'", "'OUT'", "'OPTIONAL'", "'CONST'", "'UNALIGNED'", "'VOLATILE'",
-            "'GLOBAL_REMOVE_IF_UNREFERENCED'", "'EFIAPI'", "'EFI_BOOTSERVICE'",
-            "'EFI_RUNTIMESERVICE'", "'PACKED'", "'('", "')'", "'['", "']'",
-            "'*'", "'...'", "'+'", "'-'", "'/'", "'%'", "'++'", "'--'",
-            "'sizeof'", "'.'", "'->'", "'&'", "'~'", "'!'", "'*='", "'/='",
-            "'%='", "'+='", "'-='", "'<<='", "'>>='", "'&='", "'^='", "'|='",
-            "'?'", "'||'", "'&&'", "'|'", "'^'", "'=='", "'!='", "'<'",
-            "'>'", "'<='", "'>='", "'<<'", "'>>'", "'__asm__'", "'_asm'",
-            "'__asm'", "'case'", "'default'", "'if'", "'else'", "'switch'",
-            "'while'", "'do'", "'goto'", "'continue'", "'break'", "'return'" ]
+    literalNames = ["<INVALID>",
+                    "'{'", "';'", "'typedef'", "','", "'='", "'extern'", "'static'",
+                    "'auto'", "'register'", "'STATIC'", "'void'", "'char'", "'short'",
+                    "'int'", "'long'", "'float'", "'double'", "'signed'", "'unsigned'",
+                    "'}'", "'struct'", "'union'", "':'", "'enum'", "'const'", "'volatile'",
+                    "'IN'", "'OUT'", "'OPTIONAL'", "'CONST'", "'UNALIGNED'", "'VOLATILE'",
+                    "'GLOBAL_REMOVE_IF_UNREFERENCED'", "'EFIAPI'", "'EFI_BOOTSERVICE'",
+                    "'EFI_RUNTIMESERVICE'", "'PACKED'", "'('", "')'", "'['", "']'",
+                    "'*'", "'...'", "'+'", "'-'", "'/'", "'%'", "'++'", "'--'",
+                    "'sizeof'", "'.'", "'->'", "'&'", "'~'", "'!'", "'*='", "'/='",
+                    "'%='", "'+='", "'-='", "'<<='", "'>>='", "'&='", "'^='", "'|='",
+                    "'?'", "'||'", "'&&'", "'|'", "'^'", "'=='", "'!='", "'<'",
+                    "'>'", "'<='", "'>='", "'<<'", "'>>'", "'__asm__'", "'_asm'",
+                    "'__asm'", "'case'", "'default'", "'if'", "'else'", "'switch'",
+                    "'while'", "'do'", "'goto'", "'continue'", "'break'", "'return'"]
 
-    symbolicNames = [ "<INVALID>",
-            "IDENTIFIER", "CHARACTER_LITERAL", "STRING_LITERAL", "HEX_LITERAL",
-            "DECIMAL_LITERAL", "OCTAL_LITERAL", "FLOATING_POINT_LITERAL",
-            "WS", "BS", "UnicodeVocabulary", "COMMENT", "LINE_COMMENT",
-            "LINE_COMMAND" ]
+    symbolicNames = ["<INVALID>",
+                     "IDENTIFIER", "CHARACTER_LITERAL", "STRING_LITERAL", "HEX_LITERAL",
+                     "DECIMAL_LITERAL", "OCTAL_LITERAL", "FLOATING_POINT_LITERAL",
+                     "WS", "BS", "UnicodeVocabulary", "COMMENT", "LINE_COMMENT",
+                     "LINE_COMMAND"]
 
-    ruleNames = [ "T__0", "T__1", "T__2", "T__3", "T__4", "T__5", "T__6",
-                  "T__7", "T__8", "T__9", "T__10", "T__11", "T__12", "T__13",
-                  "T__14", "T__15", "T__16", "T__17", "T__18", "T__19",
-                  "T__20", "T__21", "T__22", "T__23", "T__24", "T__25",
-                  "T__26", "T__27", "T__28", "T__29", "T__30", "T__31",
-                  "T__32", "T__33", "T__34", "T__35", "T__36", "T__37",
-                  "T__38", "T__39", "T__40", "T__41", "T__42", "T__43",
-                  "T__44", "T__45", "T__46", "T__47", "T__48", "T__49",
-                  "T__50", "T__51", "T__52", "T__53", "T__54", "T__55",
-                  "T__56", "T__57", "T__58", "T__59", "T__60", "T__61",
-                  "T__62", "T__63", "T__64", "T__65", "T__66", "T__67",
-                  "T__68", "T__69", "T__70", "T__71", "T__72", "T__73",
-                  "T__74", "T__75", "T__76", "T__77", "T__78", "T__79",
-                  "T__80", "T__81", "T__82", "T__83", "T__84", "T__85",
-                  "T__86", "T__87", "T__88", "T__89", "T__90", "T__91",
-                  "IDENTIFIER", "LETTER", "CHARACTER_LITERAL", "STRING_LITERAL",
-                  "HEX_LITERAL", "DECIMAL_LITERAL", "OCTAL_LITERAL", "HexDigit",
-                  "IntegerTypeSuffix", "FLOATING_POINT_LITERAL", "Exponent",
-                  "FloatTypeSuffix", "EscapeSequence", "OctalEscape", "UnicodeEscape",
-                  "WS", "BS", "UnicodeVocabulary", "COMMENT", "LINE_COMMENT",
-                  "LINE_COMMAND" ]
+    ruleNames = ["T__0", "T__1", "T__2", "T__3", "T__4", "T__5", "T__6",
+                 "T__7", "T__8", "T__9", "T__10", "T__11", "T__12", "T__13",
+                 "T__14", "T__15", "T__16", "T__17", "T__18", "T__19",
+                 "T__20", "T__21", "T__22", "T__23", "T__24", "T__25",
+                 "T__26", "T__27", "T__28", "T__29", "T__30", "T__31",
+                 "T__32", "T__33", "T__34", "T__35", "T__36", "T__37",
+                 "T__38", "T__39", "T__40", "T__41", "T__42", "T__43",
+                 "T__44", "T__45", "T__46", "T__47", "T__48", "T__49",
+                 "T__50", "T__51", "T__52", "T__53", "T__54", "T__55",
+                 "T__56", "T__57", "T__58", "T__59", "T__60", "T__61",
+                 "T__62", "T__63", "T__64", "T__65", "T__66", "T__67",
+                 "T__68", "T__69", "T__70", "T__71", "T__72", "T__73",
+                 "T__74", "T__75", "T__76", "T__77", "T__78", "T__79",
+                 "T__80", "T__81", "T__82", "T__83", "T__84", "T__85",
+                 "T__86", "T__87", "T__88", "T__89", "T__90", "T__91",
+                 "IDENTIFIER", "LETTER", "CHARACTER_LITERAL", "STRING_LITERAL",
+                 "HEX_LITERAL", "DECIMAL_LITERAL", "OCTAL_LITERAL", "HexDigit",
+                 "IntegerTypeSuffix", "FLOATING_POINT_LITERAL", "Exponent",
+                 "FloatTypeSuffix", "EscapeSequence", "OctalEscape", "UnicodeEscape",
+                 "WS", "BS", "UnicodeVocabulary", "COMMENT", "LINE_COMMENT",
+                 "LINE_COMMAND"]
 
     grammarFileName = "C.g4"
 
     # @param  output= sys.stdout Type: TextIO
-    def __init__(self,input=None,output= sys.stdout):
+    def __init__(self, input=None, output=sys.stdout):
         super().__init__(input, output)
         self.checkVersion("4.7.1")
-        self._interp = LexerATNSimulator(self, self.atn, self.decisionsToDFA, PredictionContextCache())
+        self._interp = LexerATNSimulator(
+            self, self.atn, self.decisionsToDFA, PredictionContextCache())
         self._actions = None
         self._predicates = None
 
+    def printTokenInfo(self, line, offset, tokenText):
+        print(str(line) + ',' + str(offset) + ':' + str(tokenText))
 
-
-    def printTokenInfo(self,line,offset,tokenText):
-        print(str(line)+ ',' + str(offset) + ':' + str(tokenText))
-
-    def StorePredicateExpression(self,StartLine,StartOffset,EndLine,EndOffset,Text):
-        PredExp = CodeFragment.PredicateExpression(Text, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StorePredicateExpression(self, StartLine, StartOffset, EndLine, EndOffset, Text):
+        PredExp = CodeFragment.PredicateExpression(
+            Text, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.PredicateExpressionList.append(PredExp)
 
-    def StoreEnumerationDefinition(self,StartLine,StartOffset,EndLine,EndOffset,Text):
-        EnumDef = CodeFragment.EnumerationDefinition(Text, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StoreEnumerationDefinition(self, StartLine, StartOffset, EndLine, EndOffset, Text):
+        EnumDef = CodeFragment.EnumerationDefinition(
+            Text, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.EnumerationDefinitionList.append(EnumDef)
 
-    def StoreStructUnionDefinition(self,StartLine,StartOffset,EndLine,EndOffset,Text):
-        SUDef = CodeFragment.StructUnionDefinition(Text, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StoreStructUnionDefinition(self, StartLine, StartOffset, EndLine, EndOffset, Text):
+        SUDef = CodeFragment.StructUnionDefinition(
+            Text, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.StructUnionDefinitionList.append(SUDef)
 
-    def StoreTypedefDefinition(self,StartLine,StartOffset,EndLine,EndOffset,FromText,ToText):
-        Tdef = CodeFragment.TypedefDefinition(FromText, ToText, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StoreTypedefDefinition(self, StartLine, StartOffset, EndLine, EndOffset, FromText, ToText):
+        Tdef = CodeFragment.TypedefDefinition(
+            FromText, ToText, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.TypedefDefinitionList.append(Tdef)
 
-    def StoreFunctionDefinition(self,StartLine,StartOffset,EndLine,EndOffset,ModifierText,DeclText,LeftBraceLine,LeftBraceOffset,DeclLine,DeclOffset):
-        FuncDef = CodeFragment.FunctionDefinition(ModifierText, DeclText, (StartLine, StartOffset), (EndLine, EndOffset), (LeftBraceLine, LeftBraceOffset), (DeclLine, DeclOffset))
+    def StoreFunctionDefinition(self, StartLine, StartOffset, EndLine, EndOffset, ModifierText, DeclText, LeftBraceLine, LeftBraceOffset, DeclLine, DeclOffset):
+        FuncDef = CodeFragment.FunctionDefinition(ModifierText, DeclText, (StartLine, StartOffset), (
+            EndLine, EndOffset), (LeftBraceLine, LeftBraceOffset), (DeclLine, DeclOffset))
         FileProfile.FunctionDefinitionList.append(FuncDef)
 
-    def StoreVariableDeclaration(self,StartLine,StartOffset,EndLine,EndOffset,ModifierText,DeclText):
-        VarDecl = CodeFragment.VariableDeclaration(ModifierText, DeclText, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StoreVariableDeclaration(self, StartLine, StartOffset, EndLine, EndOffset, ModifierText, DeclText):
+        VarDecl = CodeFragment.VariableDeclaration(
+            ModifierText, DeclText, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.VariableDeclarationList.append(VarDecl)
 
-    def StoreFunctionCalling(self,StartLine,StartOffset,EndLine,EndOffset,FuncName,ParamList):
-        FuncCall = CodeFragment.FunctionCalling(FuncName, ParamList, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StoreFunctionCalling(self, StartLine, StartOffset, EndLine, EndOffset, FuncName, ParamList):
+        FuncCall = CodeFragment.FunctionCalling(
+            FuncName, ParamList, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.FunctionCallingList.append(FuncCall)
-
-
-
diff --git a/BaseTools/Source/Python/Ecc/CParser4/CListener.py b/BaseTools/Source/Python/Ecc/CParser4/CListener.py
index bb4351d9249a..866d5717d42b 100644
--- a/BaseTools/Source/Python/Ecc/CParser4/CListener.py
+++ b/BaseTools/Source/Python/Ecc/CParser4/CListener.py
@@ -5,7 +5,7 @@ if __name__ is not None and "." in __name__:
 else:
     from CParser import CParser
 
-## @file
+# @file
 # The file defines the parser for C source files.
 #
 # THIS FILE IS AUTO-GENENERATED. PLEASE DON NOT MODIFY THIS FILE.
@@ -27,783 +27,710 @@ class CListener(ParseTreeListener):
 
     # Enter a parse tree produced by CParser#translation_unit.
     # @param  ctx Type: CParser.Translation_unitContext
-    def enterTranslation_unit(self,ctx):
+    def enterTranslation_unit(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#translation_unit.
     # @param  ctx Type: CParser.Translation_unitContext
-    def exitTranslation_unit(self,ctx):
+    def exitTranslation_unit(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#external_declaration.
     # @param  ctx Type: CParser.External_declarationContext
-    def enterExternal_declaration(self,ctx):
+    def enterExternal_declaration(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#external_declaration.
     # @param  ctx Type: CParser.External_declarationContext
-    def exitExternal_declaration(self,ctx):
+    def exitExternal_declaration(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#function_definition.
     # @param  ctx Type: CParser.Function_definitionContext
-    def enterFunction_definition(self,ctx):
+    def enterFunction_definition(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#function_definition.
     # @param  ctx Type: CParser.Function_definitionContext
-    def exitFunction_definition(self,ctx):
+    def exitFunction_definition(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#declaration_specifiers.
     # @param  ctx Type: CParser.Declaration_specifiersContext
-    def enterDeclaration_specifiers(self,ctx):
+    def enterDeclaration_specifiers(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#declaration_specifiers.
     # @param  ctx Type: CParser.Declaration_specifiersContext
-    def exitDeclaration_specifiers(self,ctx):
+    def exitDeclaration_specifiers(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#declaration.
     # @param  ctx Type: CParser.DeclarationContext
-    def enterDeclaration(self,ctx):
+    def enterDeclaration(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#declaration.
     # @param  ctx Type: CParser.DeclarationContext
-    def exitDeclaration(self,ctx):
+    def exitDeclaration(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#init_declarator_list.
     # @param  ctx Type: CParser.Init_declarator_listContext
-    def enterInit_declarator_list(self,ctx):
+    def enterInit_declarator_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#init_declarator_list.
     # @param  ctx Type: CParser.Init_declarator_listContext
-    def exitInit_declarator_list(self,ctx):
+    def exitInit_declarator_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#init_declarator.
     # @param  ctx Type: CParser.Init_declaratorContext
-    def enterInit_declarator(self,ctx):
+    def enterInit_declarator(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#init_declarator.
     # @param  ctx Type: CParser.Init_declaratorContext
-    def exitInit_declarator(self,ctx):
+    def exitInit_declarator(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#storage_class_specifier.
     # @param  ctx Type: CParser.Storage_class_specifierContext
-    def enterStorage_class_specifier(self,ctx):
+    def enterStorage_class_specifier(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#storage_class_specifier.
     # @param  ctx Type: CParser.Storage_class_specifierContext
-    def exitStorage_class_specifier(self,ctx):
+    def exitStorage_class_specifier(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#type_specifier.
     # @param  ctx Type: CParser.Type_specifierContext
-    def enterType_specifier(self,ctx):
+    def enterType_specifier(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#type_specifier.
     # @param  ctx Type: CParser.Type_specifierContext
-    def exitType_specifier(self,ctx):
+    def exitType_specifier(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#type_id.
     # @param  ctx Type: CParser.Type_idContext
-    def enterType_id(self,ctx):
+    def enterType_id(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#type_id.
     # @param  ctx Type: CParser.Type_idContext
-    def exitType_id(self,ctx):
+    def exitType_id(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#struct_or_union_specifier.
     # @param  ctx Type: CParser.Struct_or_union_specifierContext
-    def enterStruct_or_union_specifier(self,ctx):
+    def enterStruct_or_union_specifier(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#struct_or_union_specifier.
     # @param  ctx Type: CParser.Struct_or_union_specifierContext
-    def exitStruct_or_union_specifier(self,ctx):
+    def exitStruct_or_union_specifier(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#struct_or_union.
     # @param  ctx Type: CParser.Struct_or_unionContext
-    def enterStruct_or_union(self,ctx):
+    def enterStruct_or_union(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#struct_or_union.
     # @param  ctx Type: CParser.Struct_or_unionContext
-    def exitStruct_or_union(self,ctx):
+    def exitStruct_or_union(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#struct_declaration_list.
     # @param  ctx Type: CParser.Struct_declaration_listContext
-    def enterStruct_declaration_list(self,ctx):
+    def enterStruct_declaration_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#struct_declaration_list.
     # @param  ctx Type: CParser.Struct_declaration_listContext
-    def exitStruct_declaration_list(self,ctx):
+    def exitStruct_declaration_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#struct_declaration.
     # @param  ctx Type: CParser.Struct_declarationContext
-    def enterStruct_declaration(self,ctx):
+    def enterStruct_declaration(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#struct_declaration.
     # @param  ctx Type: CParser.Struct_declarationContext
-    def exitStruct_declaration(self,ctx):
+    def exitStruct_declaration(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#specifier_qualifier_list.
     # @param  ctx Type: CParser.Specifier_qualifier_listContext
-    def enterSpecifier_qualifier_list(self,ctx):
+    def enterSpecifier_qualifier_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#specifier_qualifier_list.
     # @param  ctx Type: CParser.Specifier_qualifier_listContext
-    def exitSpecifier_qualifier_list(self,ctx):
+    def exitSpecifier_qualifier_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#struct_declarator_list.
     # @param  ctx Type: CParser.Struct_declarator_listContext
-    def enterStruct_declarator_list(self,ctx):
+    def enterStruct_declarator_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#struct_declarator_list.
     # @param  ctx Type: CParser.Struct_declarator_listContext
-    def exitStruct_declarator_list(self,ctx):
+    def exitStruct_declarator_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#struct_declarator.
     # @param  ctx Type: CParser.Struct_declaratorContext
-    def enterStruct_declarator(self,ctx):
+    def enterStruct_declarator(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#struct_declarator.
     # @param  ctx Type: CParser.Struct_declaratorContext
-    def exitStruct_declarator(self,ctx):
+    def exitStruct_declarator(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#enum_specifier.
     # @param  ctx Type: CParser.Enum_specifierContext
-    def enterEnum_specifier(self,ctx):
+    def enterEnum_specifier(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#enum_specifier.
     # @param  ctx Type: CParser.Enum_specifierContext
-    def exitEnum_specifier(self,ctx):
+    def exitEnum_specifier(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#enumerator_list.
     # @param  ctx Type: CParser.Enumerator_listContext
-    def enterEnumerator_list(self,ctx):
+    def enterEnumerator_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#enumerator_list.
     # @param  ctx Type: CParser.Enumerator_listContext
-    def exitEnumerator_list(self,ctx):
+    def exitEnumerator_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#enumerator.
     # @param  ctx Type: CParser.EnumeratorContext
-    def enterEnumerator(self,ctx):
+    def enterEnumerator(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#enumerator.
     # @param  ctx Type: CParser.EnumeratorContext
-    def exitEnumerator(self,ctx):
+    def exitEnumerator(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#type_qualifier.
     # @param  ctx Type: CParser.Type_qualifierContext
-    def enterType_qualifier(self,ctx):
+    def enterType_qualifier(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#type_qualifier.
     # @param  ctx Type: CParser.Type_qualifierContext
-    def exitType_qualifier(self,ctx):
+    def exitType_qualifier(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#declarator.
     # @param  ctx Type: CParser.DeclaratorContext
-    def enterDeclarator(self,ctx):
+    def enterDeclarator(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#declarator.
     # @param  ctx Type: CParser.DeclaratorContext
-    def exitDeclarator(self,ctx):
+    def exitDeclarator(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#direct_declarator.
     # @param  ctx Type: CParser.Direct_declaratorContext
-    def enterDirect_declarator(self,ctx):
+    def enterDirect_declarator(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#direct_declarator.
     # @param  ctx Type: CParser.Direct_declaratorContext
-    def exitDirect_declarator(self,ctx):
+    def exitDirect_declarator(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#declarator_suffix.
     # @param  ctx Type: CParser.Declarator_suffixContext
-    def enterDeclarator_suffix(self,ctx):
+    def enterDeclarator_suffix(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#declarator_suffix.
     # @param  ctx Type: CParser.Declarator_suffixContext
-    def exitDeclarator_suffix(self,ctx):
+    def exitDeclarator_suffix(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#pointer.
     # @param  ctx Type: CParser.PointerContext
-    def enterPointer(self,ctx):
+    def enterPointer(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#pointer.
     # @param  ctx Type: CParser.PointerContext
-    def exitPointer(self,ctx):
+    def exitPointer(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#parameter_type_list.
     # @param  ctx Type: CParser.Parameter_type_listContext
-    def enterParameter_type_list(self,ctx):
+    def enterParameter_type_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#parameter_type_list.
     # @param  ctx Type: CParser.Parameter_type_listContext
-    def exitParameter_type_list(self,ctx):
+    def exitParameter_type_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#parameter_list.
     # @param  ctx Type: CParser.Parameter_listContext
-    def enterParameter_list(self,ctx):
+    def enterParameter_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#parameter_list.
     # @param  ctx Type: CParser.Parameter_listContext
-    def exitParameter_list(self,ctx):
+    def exitParameter_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#parameter_declaration.
     # @param  ctx Type: CParser.Parameter_declarationContext
-    def enterParameter_declaration(self,ctx):
+    def enterParameter_declaration(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#parameter_declaration.
     # @param  ctx Type: CParser.Parameter_declarationContext
-    def exitParameter_declaration(self,ctx):
+    def exitParameter_declaration(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#identifier_list.
     # @param  ctx Type: CParser.Identifier_listContext
-    def enterIdentifier_list(self,ctx):
+    def enterIdentifier_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#identifier_list.
     # @param  ctx Type: CParser.Identifier_listContext
-    def exitIdentifier_list(self,ctx):
+    def exitIdentifier_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#type_name.
     # @param  ctx Type: CParser.Type_nameContext
-    def enterType_name(self,ctx):
+    def enterType_name(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#type_name.
     # @param  ctx Type: CParser.Type_nameContext
-    def exitType_name(self,ctx):
+    def exitType_name(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#abstract_declarator.
     # @param  ctx Type: CParser.Abstract_declaratorContext
-    def enterAbstract_declarator(self,ctx):
+    def enterAbstract_declarator(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#abstract_declarator.
     # @param  ctx Type: CParser.Abstract_declaratorContext
-    def exitAbstract_declarator(self,ctx):
+    def exitAbstract_declarator(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#direct_abstract_declarator.
     # @param  ctx Type: CParser.Direct_abstract_declaratorContext
-    def enterDirect_abstract_declarator(self,ctx):
+    def enterDirect_abstract_declarator(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#direct_abstract_declarator.
     # @param  ctx Type: CParser.Direct_abstract_declaratorContext
-    def exitDirect_abstract_declarator(self,ctx):
+    def exitDirect_abstract_declarator(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#abstract_declarator_suffix.
     # @param  ctx Type: CParser.Abstract_declarator_suffixContext
-    def enterAbstract_declarator_suffix(self,ctx):
+    def enterAbstract_declarator_suffix(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#abstract_declarator_suffix.
     # @param  ctx Type: CParser.Abstract_declarator_suffixContext
-    def exitAbstract_declarator_suffix(self,ctx):
+    def exitAbstract_declarator_suffix(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#initializer.
     # @param  ctx Type: CParser.InitializerContext
-    def enterInitializer(self,ctx):
+    def enterInitializer(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#initializer.
     # @param  ctx Type: CParser.InitializerContext
-    def exitInitializer(self,ctx):
+    def exitInitializer(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#initializer_list.
     # @param  ctx Type: CParser.Initializer_listContext
-    def enterInitializer_list(self,ctx):
+    def enterInitializer_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#initializer_list.
     # @param  ctx Type: CParser.Initializer_listContext
-    def exitInitializer_list(self,ctx):
+    def exitInitializer_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#argument_expression_list.
     # @param  ctx Type: CParser.Argument_expression_listContext
-    def enterArgument_expression_list(self,ctx):
+    def enterArgument_expression_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#argument_expression_list.
     # @param  ctx Type: CParser.Argument_expression_listContext
-    def exitArgument_expression_list(self,ctx):
+    def exitArgument_expression_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#additive_expression.
     # @param  ctx Type: CParser.Additive_expressionContext
-    def enterAdditive_expression(self,ctx):
+    def enterAdditive_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#additive_expression.
     # @param  ctx Type: CParser.Additive_expressionContext
-    def exitAdditive_expression(self,ctx):
+    def exitAdditive_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#multiplicative_expression.
     # @param  ctx Type: CParser.Multiplicative_expressionContext
-    def enterMultiplicative_expression(self,ctx):
+    def enterMultiplicative_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#multiplicative_expression.
     # @param  ctx Type: CParser.Multiplicative_expressionContext
-    def exitMultiplicative_expression(self,ctx):
+    def exitMultiplicative_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#cast_expression.
     # @param  ctx Type: CParser.Cast_expressionContext
-    def enterCast_expression(self,ctx):
+    def enterCast_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#cast_expression.
     # @param  ctx Type: CParser.Cast_expressionContext
-    def exitCast_expression(self,ctx):
+    def exitCast_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#unary_expression.
     # @param  ctx Type: CParser.Unary_expressionContext
-    def enterUnary_expression(self,ctx):
+    def enterUnary_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#unary_expression.
     # @param  ctx Type: CParser.Unary_expressionContext
-    def exitUnary_expression(self,ctx):
+    def exitUnary_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#postfix_expression.
     # @param  ctx Type: CParser.Postfix_expressionContext
-    def enterPostfix_expression(self,ctx):
+    def enterPostfix_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#postfix_expression.
     # @param  ctx Type: CParser.Postfix_expressionContext
-    def exitPostfix_expression(self,ctx):
+    def exitPostfix_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#macro_parameter_list.
     # @param  ctx Type: CParser.Macro_parameter_listContext
-    def enterMacro_parameter_list(self,ctx):
+    def enterMacro_parameter_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#macro_parameter_list.
     # @param  ctx Type: CParser.Macro_parameter_listContext
-    def exitMacro_parameter_list(self,ctx):
+    def exitMacro_parameter_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#unary_operator.
     # @param  ctx Type: CParser.Unary_operatorContext
-    def enterUnary_operator(self,ctx):
+    def enterUnary_operator(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#unary_operator.
     # @param  ctx Type: CParser.Unary_operatorContext
-    def exitUnary_operator(self,ctx):
+    def exitUnary_operator(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#primary_expression.
     # @param  ctx Type: CParser.Primary_expressionContext
-    def enterPrimary_expression(self,ctx):
+    def enterPrimary_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#primary_expression.
     # @param  ctx Type: CParser.Primary_expressionContext
-    def exitPrimary_expression(self,ctx):
+    def exitPrimary_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#constant.
     # @param  ctx Type: CParser.ConstantContext
-    def enterConstant(self,ctx):
+    def enterConstant(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#constant.
     # @param  ctx Type: CParser.ConstantContext
-    def exitConstant(self,ctx):
+    def exitConstant(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#expression.
     # @param  ctx Type: CParser.ExpressionContext
-    def enterExpression(self,ctx):
+    def enterExpression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#expression.
     # @param  ctx Type: CParser.ExpressionContext
-    def exitExpression(self,ctx):
+    def exitExpression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#constant_expression.
     # @param  ctx Type: CParser.Constant_expressionContext
-    def enterConstant_expression(self,ctx):
+    def enterConstant_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#constant_expression.
     # @param  ctx Type: CParser.Constant_expressionContext
-    def exitConstant_expression(self,ctx):
+    def exitConstant_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#assignment_expression.
     # @param  ctx Type: CParser.Assignment_expressionContext
-    def enterAssignment_expression(self,ctx):
+    def enterAssignment_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#assignment_expression.
     # @param  ctx Type: CParser.Assignment_expressionContext
-    def exitAssignment_expression(self,ctx):
+    def exitAssignment_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#lvalue.
     # @param  ctx Type: CParser.LvalueContext
-    def enterLvalue(self,ctx):
+    def enterLvalue(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#lvalue.
     # @param  ctx Type: CParser.LvalueContext
-    def exitLvalue(self,ctx):
+    def exitLvalue(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#assignment_operator.
     # @param  ctx Type: CParser.Assignment_operatorContext
-    def enterAssignment_operator(self,ctx):
+    def enterAssignment_operator(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#assignment_operator.
     # @param  ctx Type: CParser.Assignment_operatorContext
-    def exitAssignment_operator(self,ctx):
+    def exitAssignment_operator(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#conditional_expression.
     # @param  ctx Type: CParser.Conditional_expressionContext
-    def enterConditional_expression(self,ctx):
+    def enterConditional_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#conditional_expression.
     # @param  ctx Type: CParser.Conditional_expressionContext
-    def exitConditional_expression(self,ctx):
+    def exitConditional_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#logical_or_expression.
     # @param  ctx Type: CParser.Logical_or_expressionContext
-    def enterLogical_or_expression(self,ctx):
+    def enterLogical_or_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#logical_or_expression.
     # @param  ctx Type: CParser.Logical_or_expressionContext
-    def exitLogical_or_expression(self,ctx):
+    def exitLogical_or_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#logical_and_expression.
     # @param  ctx Type: CParser.Logical_and_expressionContext
-    def enterLogical_and_expression(self,ctx):
+    def enterLogical_and_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#logical_and_expression.
     # @param  ctx Type: CParser.Logical_and_expressionContext
-    def exitLogical_and_expression(self,ctx):
+    def exitLogical_and_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#inclusive_or_expression.
     # @param  ctx Type: CParser.Inclusive_or_expressionContext
-    def enterInclusive_or_expression(self,ctx):
+    def enterInclusive_or_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#inclusive_or_expression.
     # @param  ctx Type: CParser.Inclusive_or_expressionContext
-    def exitInclusive_or_expression(self,ctx):
+    def exitInclusive_or_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#exclusive_or_expression.
     # @param  ctx Type: CParser.Exclusive_or_expressionContext
-    def enterExclusive_or_expression(self,ctx):
+    def enterExclusive_or_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#exclusive_or_expression.
     # @param  ctx Type: CParser.Exclusive_or_expressionContext
-    def exitExclusive_or_expression(self,ctx):
+    def exitExclusive_or_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#and_expression.
     # @param  ctx Type: CParser.And_expressionContext
-    def enterAnd_expression(self,ctx):
+    def enterAnd_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#and_expression.
     # @param  ctx Type: CParser.And_expressionContext
-    def exitAnd_expression(self,ctx):
+    def exitAnd_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#equality_expression.
     # @param  ctx Type: CParser.Equality_expressionContext
-    def enterEquality_expression(self,ctx):
+    def enterEquality_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#equality_expression.
     # @param  ctx Type: CParser.Equality_expressionContext
-    def exitEquality_expression(self,ctx):
+    def exitEquality_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#relational_expression.
     # @param  ctx Type: CParser.Relational_expressionContext
-    def enterRelational_expression(self,ctx):
+    def enterRelational_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#relational_expression.
     # @param  ctx Type: CParser.Relational_expressionContext
-    def exitRelational_expression(self,ctx):
+    def exitRelational_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#shift_expression.
     # @param  ctx Type: CParser.Shift_expressionContext
-    def enterShift_expression(self,ctx):
+    def enterShift_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#shift_expression.
     # @param  ctx Type: CParser.Shift_expressionContext
-    def exitShift_expression(self,ctx):
+    def exitShift_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#statement.
     # @param  ctx Type: CParser.StatementContext
-    def enterStatement(self,ctx):
+    def enterStatement(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#statement.
     # @param  ctx Type: CParser.StatementContext
-    def exitStatement(self,ctx):
+    def exitStatement(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#asm2_statement.
     # @param  ctx Type: CParser.Asm2_statementContext
-    def enterAsm2_statement(self,ctx):
+    def enterAsm2_statement(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#asm2_statement.
     # @param  ctx Type: CParser.Asm2_statementContext
-    def exitAsm2_statement(self,ctx):
+    def exitAsm2_statement(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#asm1_statement.
     # @param  ctx Type: CParser.Asm1_statementContext
-    def enterAsm1_statement(self,ctx):
+    def enterAsm1_statement(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#asm1_statement.
     # @param  ctx Type: CParser.Asm1_statementContext
-    def exitAsm1_statement(self,ctx):
+    def exitAsm1_statement(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#asm_statement.
     # @param  ctx Type: CParser.Asm_statementContext
-    def enterAsm_statement(self,ctx):
+    def enterAsm_statement(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#asm_statement.
     # @param  ctx Type: CParser.Asm_statementContext
-    def exitAsm_statement(self,ctx):
+    def exitAsm_statement(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#macro_statement.
     # @param  ctx Type: CParser.Macro_statementContext
-    def enterMacro_statement(self,ctx):
+    def enterMacro_statement(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#macro_statement.
     # @param  ctx Type: CParser.Macro_statementContext
-    def exitMacro_statement(self,ctx):
+    def exitMacro_statement(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#labeled_statement.
     # @param  ctx Type: CParser.Labeled_statementContext
-    def enterLabeled_statement(self,ctx):
+    def enterLabeled_statement(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#labeled_statement.
     # @param  ctx Type: CParser.Labeled_statementContext
-    def exitLabeled_statement(self,ctx):
+    def exitLabeled_statement(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#compound_statement.
     # @param  ctx Type: CParser.Compound_statementContext
-    def enterCompound_statement(self,ctx):
+    def enterCompound_statement(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#compound_statement.
     # @param  ctx Type: CParser.Compound_statementContext
-    def exitCompound_statement(self,ctx):
+    def exitCompound_statement(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#statement_list.
     # @param  ctx Type: CParser.Statement_listContext
-    def enterStatement_list(self,ctx):
+    def enterStatement_list(self, ctx):
         pass
 
-
     # Exit a parse tree produced by CParser#statement_list.
     # @param  ctx Type: CParser.Statement_listContext
-    def exitStatement_list(self,ctx):
+    def exitStatement_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#expression_statement.
     # @param  ctx Type: CParser.Expression_statementContext
-    def enterExpression_statement(self,ctx):
+    def enterExpression_statement(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#expression_statement.
     # @param  ctx Type: CParser.Expression_statementContext
-    def exitExpression_statement(self,ctx):
+    def exitExpression_statement(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#selection_statement.
     # @param  ctx Type: CParser.Selection_statementContext
-    def enterSelection_statement(self,ctx):
+    def enterSelection_statement(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#selection_statement.
     # @param  ctx Type: CParser.Selection_statementContext
-    def exitSelection_statement(self,ctx):
+    def exitSelection_statement(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#iteration_statement.
     # @param  ctx Type: CParser.Iteration_statementContext
-    def enterIteration_statement(self,ctx):
+    def enterIteration_statement(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#iteration_statement.
     # @param  ctx Type: CParser.Iteration_statementContext
-    def exitIteration_statement(self,ctx):
+    def exitIteration_statement(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#jump_statement.
     # @param  ctx Type: CParser.Jump_statementContext
-    def enterJump_statement(self,ctx):
+    def enterJump_statement(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#jump_statement.
     # @param  ctx Type: CParser.Jump_statementContext
-    def exitJump_statement(self,ctx):
+    def exitJump_statement(self, ctx):
         pass
-
-
diff --git a/BaseTools/Source/Python/Ecc/CParser4/CParser.py b/BaseTools/Source/Python/Ecc/CParser4/CParser.py
index 31d23d55aa57..22c17c66680a 100644
--- a/BaseTools/Source/Python/Ecc/CParser4/CParser.py
+++ b/BaseTools/Source/Python/Ecc/CParser4/CParser.py
@@ -6,7 +6,7 @@ from typing.io import TextIO
 import sys
 
 
-## @file
+# @file
 # The file defines the parser for C source files.
 #
 # THIS FILE IS AUTO-GENENERATED. PLEASE DON NOT MODIFY THIS FILE.
@@ -22,6 +22,7 @@ import sys
 import Ecc.CodeFragment as CodeFragment
 import Ecc.FileProfile as FileProfile
 
+
 def serializedATN():
     with StringIO() as buf:
         buf.write("\3\u608b\ua72a\u8133\ub9ed\u417c\u3be7\u7786\u5964\3k")
@@ -475,61 +476,61 @@ def serializedATN():
         return buf.getvalue()
 
 
-class CParser ( Parser ):
+class CParser (Parser):
 
     grammarFileName = "C.g4"
 
     atn = ATNDeserializer().deserialize(serializedATN())
 
-    decisionsToDFA = [ DFA(ds, i) for i, ds in enumerate(atn.decisionToState) ]
+    decisionsToDFA = [DFA(ds, i) for i, ds in enumerate(atn.decisionToState)]
 
     sharedContextCache = PredictionContextCache()
 
-    literalNames = [ "<INVALID>", "'{'", "';'", "'typedef'", "','", "'='",
-                     "'extern'", "'static'", "'auto'", "'register'", "'STATIC'",
-                     "'void'", "'char'", "'short'", "'int'", "'long'", "'float'",
-                     "'double'", "'signed'", "'unsigned'", "'}'", "'struct'",
-                     "'union'", "':'", "'enum'", "'const'", "'volatile'",
-                     "'IN'", "'OUT'", "'OPTIONAL'", "'CONST'", "'UNALIGNED'",
-                     "'VOLATILE'", "'GLOBAL_REMOVE_IF_UNREFERENCED'", "'EFIAPI'",
-                     "'EFI_BOOTSERVICE'", "'EFI_RUNTIMESERVICE'", "'PACKED'",
-                     "'('", "')'", "'['", "']'", "'*'", "'...'", "'+'",
-                     "'-'", "'/'", "'%'", "'++'", "'--'", "'sizeof'", "'.'",
-                     "'->'", "'&'", "'~'", "'!'", "'*='", "'/='", "'%='",
-                     "'+='", "'-='", "'<<='", "'>>='", "'&='", "'^='", "'|='",
-                     "'?'", "'||'", "'&&'", "'|'", "'^'", "'=='", "'!='",
-                     "'<'", "'>'", "'<='", "'>='", "'<<'", "'>>'", "'__asm__'",
-                     "'_asm'", "'__asm'", "'case'", "'default'", "'if'",
-                     "'else'", "'switch'", "'while'", "'do'", "'goto'",
-                     "'continue'", "'break'", "'return'" ]
+    literalNames = ["<INVALID>", "'{'", "';'", "'typedef'", "','", "'='",
+                    "'extern'", "'static'", "'auto'", "'register'", "'STATIC'",
+                    "'void'", "'char'", "'short'", "'int'", "'long'", "'float'",
+                    "'double'", "'signed'", "'unsigned'", "'}'", "'struct'",
+                    "'union'", "':'", "'enum'", "'const'", "'volatile'",
+                    "'IN'", "'OUT'", "'OPTIONAL'", "'CONST'", "'UNALIGNED'",
+                    "'VOLATILE'", "'GLOBAL_REMOVE_IF_UNREFERENCED'", "'EFIAPI'",
+                    "'EFI_BOOTSERVICE'", "'EFI_RUNTIMESERVICE'", "'PACKED'",
+                    "'('", "')'", "'['", "']'", "'*'", "'...'", "'+'",
+                    "'-'", "'/'", "'%'", "'++'", "'--'", "'sizeof'", "'.'",
+                    "'->'", "'&'", "'~'", "'!'", "'*='", "'/='", "'%='",
+                    "'+='", "'-='", "'<<='", "'>>='", "'&='", "'^='", "'|='",
+                    "'?'", "'||'", "'&&'", "'|'", "'^'", "'=='", "'!='",
+                    "'<'", "'>'", "'<='", "'>='", "'<<'", "'>>'", "'__asm__'",
+                    "'_asm'", "'__asm'", "'case'", "'default'", "'if'",
+                    "'else'", "'switch'", "'while'", "'do'", "'goto'",
+                    "'continue'", "'break'", "'return'"]
 
-    symbolicNames = [ "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "IDENTIFIER", "CHARACTER_LITERAL", "STRING_LITERAL",
-                      "HEX_LITERAL", "DECIMAL_LITERAL", "OCTAL_LITERAL",
-                      "FLOATING_POINT_LITERAL", "WS", "BS", "UnicodeVocabulary",
-                      "COMMENT", "LINE_COMMENT", "LINE_COMMAND" ]
+    symbolicNames = ["<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "IDENTIFIER", "CHARACTER_LITERAL", "STRING_LITERAL",
+                     "HEX_LITERAL", "DECIMAL_LITERAL", "OCTAL_LITERAL",
+                     "FLOATING_POINT_LITERAL", "WS", "BS", "UnicodeVocabulary",
+                     "COMMENT", "LINE_COMMENT", "LINE_COMMAND"]
 
     RULE_translation_unit = 0
     RULE_external_declaration = 1
@@ -603,225 +604,224 @@ class CParser ( Parser ):
     RULE_iteration_statement = 69
     RULE_jump_statement = 70
 
-    ruleNames =  [ "translation_unit", "external_declaration", "function_definition",
-                   "declaration_specifiers", "declaration", "init_declarator_list",
-                   "init_declarator", "storage_class_specifier", "type_specifier",
-                   "type_id", "struct_or_union_specifier", "struct_or_union",
-                   "struct_declaration_list", "struct_declaration", "specifier_qualifier_list",
-                   "struct_declarator_list", "struct_declarator", "enum_specifier",
-                   "enumerator_list", "enumerator", "type_qualifier", "declarator",
-                   "direct_declarator", "declarator_suffix", "pointer",
-                   "parameter_type_list", "parameter_list", "parameter_declaration",
-                   "identifier_list", "type_name", "abstract_declarator",
-                   "direct_abstract_declarator", "abstract_declarator_suffix",
-                   "initializer", "initializer_list", "argument_expression_list",
-                   "additive_expression", "multiplicative_expression", "cast_expression",
-                   "unary_expression", "postfix_expression", "macro_parameter_list",
-                   "unary_operator", "primary_expression", "constant", "expression",
-                   "constant_expression", "assignment_expression", "lvalue",
-                   "assignment_operator", "conditional_expression", "logical_or_expression",
-                   "logical_and_expression", "inclusive_or_expression",
-                   "exclusive_or_expression", "and_expression", "equality_expression",
-                   "relational_expression", "shift_expression", "statement",
-                   "asm2_statement", "asm1_statement", "asm_statement",
-                   "macro_statement", "labeled_statement", "compound_statement",
-                   "statement_list", "expression_statement", "selection_statement",
-                   "iteration_statement", "jump_statement" ]
+    ruleNames = ["translation_unit", "external_declaration", "function_definition",
+                 "declaration_specifiers", "declaration", "init_declarator_list",
+                 "init_declarator", "storage_class_specifier", "type_specifier",
+                 "type_id", "struct_or_union_specifier", "struct_or_union",
+                 "struct_declaration_list", "struct_declaration", "specifier_qualifier_list",
+                 "struct_declarator_list", "struct_declarator", "enum_specifier",
+                 "enumerator_list", "enumerator", "type_qualifier", "declarator",
+                 "direct_declarator", "declarator_suffix", "pointer",
+                 "parameter_type_list", "parameter_list", "parameter_declaration",
+                 "identifier_list", "type_name", "abstract_declarator",
+                 "direct_abstract_declarator", "abstract_declarator_suffix",
+                 "initializer", "initializer_list", "argument_expression_list",
+                 "additive_expression", "multiplicative_expression", "cast_expression",
+                 "unary_expression", "postfix_expression", "macro_parameter_list",
+                 "unary_operator", "primary_expression", "constant", "expression",
+                 "constant_expression", "assignment_expression", "lvalue",
+                 "assignment_operator", "conditional_expression", "logical_or_expression",
+                 "logical_and_expression", "inclusive_or_expression",
+                 "exclusive_or_expression", "and_expression", "equality_expression",
+                 "relational_expression", "shift_expression", "statement",
+                 "asm2_statement", "asm1_statement", "asm_statement",
+                 "macro_statement", "labeled_statement", "compound_statement",
+                 "statement_list", "expression_statement", "selection_statement",
+                 "iteration_statement", "jump_statement"]
 
     EOF = Token.EOF
-    T__0=1
-    T__1=2
-    T__2=3
-    T__3=4
-    T__4=5
-    T__5=6
-    T__6=7
-    T__7=8
-    T__8=9
-    T__9=10
-    T__10=11
-    T__11=12
-    T__12=13
-    T__13=14
-    T__14=15
-    T__15=16
-    T__16=17
-    T__17=18
-    T__18=19
-    T__19=20
-    T__20=21
-    T__21=22
-    T__22=23
-    T__23=24
-    T__24=25
-    T__25=26
-    T__26=27
-    T__27=28
-    T__28=29
-    T__29=30
-    T__30=31
-    T__31=32
-    T__32=33
-    T__33=34
-    T__34=35
-    T__35=36
-    T__36=37
-    T__37=38
-    T__38=39
-    T__39=40
-    T__40=41
-    T__41=42
-    T__42=43
-    T__43=44
-    T__44=45
-    T__45=46
-    T__46=47
-    T__47=48
-    T__48=49
-    T__49=50
-    T__50=51
-    T__51=52
-    T__52=53
-    T__53=54
-    T__54=55
-    T__55=56
-    T__56=57
-    T__57=58
-    T__58=59
-    T__59=60
-    T__60=61
-    T__61=62
-    T__62=63
-    T__63=64
-    T__64=65
-    T__65=66
-    T__66=67
-    T__67=68
-    T__68=69
-    T__69=70
-    T__70=71
-    T__71=72
-    T__72=73
-    T__73=74
-    T__74=75
-    T__75=76
-    T__76=77
-    T__77=78
-    T__78=79
-    T__79=80
-    T__80=81
-    T__81=82
-    T__82=83
-    T__83=84
-    T__84=85
-    T__85=86
-    T__86=87
-    T__87=88
-    T__88=89
-    T__89=90
-    T__90=91
-    T__91=92
-    IDENTIFIER=93
-    CHARACTER_LITERAL=94
-    STRING_LITERAL=95
-    HEX_LITERAL=96
-    DECIMAL_LITERAL=97
-    OCTAL_LITERAL=98
-    FLOATING_POINT_LITERAL=99
-    WS=100
-    BS=101
-    UnicodeVocabulary=102
-    COMMENT=103
-    LINE_COMMENT=104
-    LINE_COMMAND=105
+    T__0 = 1
+    T__1 = 2
+    T__2 = 3
+    T__3 = 4
+    T__4 = 5
+    T__5 = 6
+    T__6 = 7
+    T__7 = 8
+    T__8 = 9
+    T__9 = 10
+    T__10 = 11
+    T__11 = 12
+    T__12 = 13
+    T__13 = 14
+    T__14 = 15
+    T__15 = 16
+    T__16 = 17
+    T__17 = 18
+    T__18 = 19
+    T__19 = 20
+    T__20 = 21
+    T__21 = 22
+    T__22 = 23
+    T__23 = 24
+    T__24 = 25
+    T__25 = 26
+    T__26 = 27
+    T__27 = 28
+    T__28 = 29
+    T__29 = 30
+    T__30 = 31
+    T__31 = 32
+    T__32 = 33
+    T__33 = 34
+    T__34 = 35
+    T__35 = 36
+    T__36 = 37
+    T__37 = 38
+    T__38 = 39
+    T__39 = 40
+    T__40 = 41
+    T__41 = 42
+    T__42 = 43
+    T__43 = 44
+    T__44 = 45
+    T__45 = 46
+    T__46 = 47
+    T__47 = 48
+    T__48 = 49
+    T__49 = 50
+    T__50 = 51
+    T__51 = 52
+    T__52 = 53
+    T__53 = 54
+    T__54 = 55
+    T__55 = 56
+    T__56 = 57
+    T__57 = 58
+    T__58 = 59
+    T__59 = 60
+    T__60 = 61
+    T__61 = 62
+    T__62 = 63
+    T__63 = 64
+    T__64 = 65
+    T__65 = 66
+    T__66 = 67
+    T__67 = 68
+    T__68 = 69
+    T__69 = 70
+    T__70 = 71
+    T__71 = 72
+    T__72 = 73
+    T__73 = 74
+    T__74 = 75
+    T__75 = 76
+    T__76 = 77
+    T__77 = 78
+    T__78 = 79
+    T__79 = 80
+    T__80 = 81
+    T__81 = 82
+    T__82 = 83
+    T__83 = 84
+    T__84 = 85
+    T__85 = 86
+    T__86 = 87
+    T__87 = 88
+    T__88 = 89
+    T__89 = 90
+    T__90 = 91
+    T__91 = 92
+    IDENTIFIER = 93
+    CHARACTER_LITERAL = 94
+    STRING_LITERAL = 95
+    HEX_LITERAL = 96
+    DECIMAL_LITERAL = 97
+    OCTAL_LITERAL = 98
+    FLOATING_POINT_LITERAL = 99
+    WS = 100
+    BS = 101
+    UnicodeVocabulary = 102
+    COMMENT = 103
+    LINE_COMMENT = 104
+    LINE_COMMAND = 105
 
     # @param  input Type: TokenStream
     # @param  output= sys.stdout Type: TextIO
-    def __init__(self,input,output= sys.stdout):
+    def __init__(self, input, output=sys.stdout):
         super().__init__(input, output)
         self.checkVersion("4.7.1")
-        self._interp = ParserATNSimulator(self, self.atn, self.decisionsToDFA, self.sharedContextCache)
+        self._interp = ParserATNSimulator(
+            self, self.atn, self.decisionsToDFA, self.sharedContextCache)
         self._predicates = None
 
+    def printTokenInfo(self, line, offset, tokenText):
+        print(str(line) + ',' + str(offset) + ':' + str(tokenText))
 
-
-
-    def printTokenInfo(self,line,offset,tokenText):
-        print(str(line)+ ',' + str(offset) + ':' + str(tokenText))
-
-    def StorePredicateExpression(self,StartLine,StartOffset,EndLine,EndOffset,Text):
-        PredExp = CodeFragment.PredicateExpression(Text, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StorePredicateExpression(self, StartLine, StartOffset, EndLine, EndOffset, Text):
+        PredExp = CodeFragment.PredicateExpression(
+            Text, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.PredicateExpressionList.append(PredExp)
 
-    def StoreEnumerationDefinition(self,StartLine,StartOffset,EndLine,EndOffset,Text):
-        EnumDef = CodeFragment.EnumerationDefinition(Text, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StoreEnumerationDefinition(self, StartLine, StartOffset, EndLine, EndOffset, Text):
+        EnumDef = CodeFragment.EnumerationDefinition(
+            Text, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.EnumerationDefinitionList.append(EnumDef)
 
-    def StoreStructUnionDefinition(self,StartLine,StartOffset,EndLine,EndOffset,Text):
-        SUDef = CodeFragment.StructUnionDefinition(Text, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StoreStructUnionDefinition(self, StartLine, StartOffset, EndLine, EndOffset, Text):
+        SUDef = CodeFragment.StructUnionDefinition(
+            Text, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.StructUnionDefinitionList.append(SUDef)
 
-    def StoreTypedefDefinition(self,StartLine,StartOffset,EndLine,EndOffset,FromText,ToText):
-        Tdef = CodeFragment.TypedefDefinition(FromText, ToText, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StoreTypedefDefinition(self, StartLine, StartOffset, EndLine, EndOffset, FromText, ToText):
+        Tdef = CodeFragment.TypedefDefinition(
+            FromText, ToText, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.TypedefDefinitionList.append(Tdef)
 
-    def StoreFunctionDefinition(self,StartLine,StartOffset,EndLine,EndOffset,ModifierText,DeclText,LeftBraceLine,LeftBraceOffset,DeclLine,DeclOffset):
-        FuncDef = CodeFragment.FunctionDefinition(ModifierText, DeclText, (StartLine, StartOffset), (EndLine, EndOffset), (LeftBraceLine, LeftBraceOffset), (DeclLine, DeclOffset))
+    def StoreFunctionDefinition(self, StartLine, StartOffset, EndLine, EndOffset, ModifierText, DeclText, LeftBraceLine, LeftBraceOffset, DeclLine, DeclOffset):
+        FuncDef = CodeFragment.FunctionDefinition(ModifierText, DeclText, (StartLine, StartOffset), (
+            EndLine, EndOffset), (LeftBraceLine, LeftBraceOffset), (DeclLine, DeclOffset))
         FileProfile.FunctionDefinitionList.append(FuncDef)
 
-    def StoreVariableDeclaration(self,StartLine,StartOffset,EndLine,EndOffset,ModifierText,DeclText):
-        VarDecl = CodeFragment.VariableDeclaration(ModifierText, DeclText, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StoreVariableDeclaration(self, StartLine, StartOffset, EndLine, EndOffset, ModifierText, DeclText):
+        VarDecl = CodeFragment.VariableDeclaration(
+            ModifierText, DeclText, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.VariableDeclarationList.append(VarDecl)
 
-    def StoreFunctionCalling(self,StartLine,StartOffset,EndLine,EndOffset,FuncName,ParamList):
-        FuncCall = CodeFragment.FunctionCalling(FuncName, ParamList, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StoreFunctionCalling(self, StartLine, StartOffset, EndLine, EndOffset, FuncName, ParamList):
+        FuncCall = CodeFragment.FunctionCalling(
+            FuncName, ParamList, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.FunctionCallingList.append(FuncCall)
 
-
-
     class Translation_unitContext(ParserRuleContext):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def external_declaration(self,i=None):
+        def external_declaration(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.External_declarationContext)
             else:
-                return self.getTypedRuleContext(CParser.External_declarationContext,i)
-
+                return self.getTypedRuleContext(CParser.External_declarationContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_translation_unit
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterTranslation_unit" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterTranslation_unit"):
                 listener.enterTranslation_unit(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitTranslation_unit" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitTranslation_unit"):
                 listener.exitTranslation_unit(self)
 
-
-
-
     def translation_unit(self):
 
         localctx = CParser.Translation_unitContext(self, self._ctx, self.state)
         self.enterRule(localctx, 0, self.RULE_translation_unit)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 145
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while (((_la) & ~0x3f) == 0 and ((1 << _la) & ((1 << CParser.T__2) | (1 << CParser.T__5) | (1 << CParser.T__6) | (1 << CParser.T__7) | (1 << CParser.T__8) | (1 << CParser.T__9) | (1 << CParser.T__10) | (1 << CParser.T__11) | (1 << CParser.T__12) | (1 << CParser.T__13) | (1 << CParser.T__14) | (1 << CParser.T__15) | (1 << CParser.T__16) | (1 << CParser.T__17) | (1 << CParser.T__18) | (1 << CParser.T__20) | (1 << CParser.T__21) | (1 << CParser.T__23) | (1 << CParser.T__24) | (1 << CParser.T__25) | (1 << CParser.T__26) | (1 << CParser.T__27) | (1 << CParser.T__28) | (1 << CParser.T__29) | (1 << CParser.T__30) | (1 << CParser.T__31) | (1 << CParser.T__32) | (1 << CParser.T__33) | (1 << CParser.T__34) | (1 << CParser.T__35) | (1 << CParser.T__36) | (1 << CParser.T__37) | (1 << CParser.T__41))) != 0) or _la==CParser.IDENTIFIER:
+            while (((_la) & ~0x3f) == 0 and ((1 << _la) & ((1 << CParser.T__2) | (1 << CParser.T__5) | (1 << CParser.T__6) | (1 << CParser.T__7) | (1 << CParser.T__8) | (1 << CParser.T__9) | (1 << CParser.T__10) | (1 << CParser.T__11) | (1 << CParser.T__12) | (1 << CParser.T__13) | (1 << CParser.T__14) | (1 << CParser.T__15) | (1 << CParser.T__16) | (1 << CParser.T__17) | (1 << CParser.T__18) | (1 << CParser.T__20) | (1 << CParser.T__21) | (1 << CParser.T__23) | (1 << CParser.T__24) | (1 << CParser.T__25) | (1 << CParser.T__26) | (1 << CParser.T__27) | (1 << CParser.T__28) | (1 << CParser.T__29) | (1 << CParser.T__30) | (1 << CParser.T__31) | (1 << CParser.T__32) | (1 << CParser.T__33) | (1 << CParser.T__34) | (1 << CParser.T__35) | (1 << CParser.T__36) | (1 << CParser.T__37) | (1 << CParser.T__41))) != 0) or _la == CParser.IDENTIFIER:
                 self.state = 142
                 self.external_declaration()
                 self.state = 147
@@ -840,75 +840,67 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def declarator(self):
-            return self.getTypedRuleContext(CParser.DeclaratorContext,0)
-
+            return self.getTypedRuleContext(CParser.DeclaratorContext, 0)
 
         def declaration_specifiers(self):
-            return self.getTypedRuleContext(CParser.Declaration_specifiersContext,0)
-
+            return self.getTypedRuleContext(CParser.Declaration_specifiersContext, 0)
 
         # @param  i=None Type: int
-        def declaration(self,i=None):
+        def declaration(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.DeclarationContext)
             else:
-                return self.getTypedRuleContext(CParser.DeclarationContext,i)
-
+                return self.getTypedRuleContext(CParser.DeclarationContext, i)
 
         def function_definition(self):
-            return self.getTypedRuleContext(CParser.Function_definitionContext,0)
-
+            return self.getTypedRuleContext(CParser.Function_definitionContext, 0)
 
         def macro_statement(self):
-            return self.getTypedRuleContext(CParser.Macro_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Macro_statementContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_external_declaration
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterExternal_declaration" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterExternal_declaration"):
                 listener.enterExternal_declaration(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitExternal_declaration" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitExternal_declaration"):
                 listener.exitExternal_declaration(self)
 
-
-
-
     def external_declaration(self):
 
-        localctx = CParser.External_declarationContext(self, self._ctx, self.state)
+        localctx = CParser.External_declarationContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 2, self.RULE_external_declaration)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.state = 166
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,4,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 4, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 149
                 self._errHandler.sync(self)
-                la_ = self._interp.adaptivePredict(self._input,1,self._ctx)
+                la_ = self._interp.adaptivePredict(self._input, 1, self._ctx)
                 if la_ == 1:
                     self.state = 148
                     self.declaration_specifiers()
 
-
                 self.state = 151
                 self.declarator()
                 self.state = 155
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                while (((_la) & ~0x3f) == 0 and ((1 << _la) & ((1 << CParser.T__2) | (1 << CParser.T__5) | (1 << CParser.T__6) | (1 << CParser.T__7) | (1 << CParser.T__8) | (1 << CParser.T__9) | (1 << CParser.T__10) | (1 << CParser.T__11) | (1 << CParser.T__12) | (1 << CParser.T__13) | (1 << CParser.T__14) | (1 << CParser.T__15) | (1 << CParser.T__16) | (1 << CParser.T__17) | (1 << CParser.T__18) | (1 << CParser.T__20) | (1 << CParser.T__21) | (1 << CParser.T__23) | (1 << CParser.T__24) | (1 << CParser.T__25) | (1 << CParser.T__26) | (1 << CParser.T__27) | (1 << CParser.T__28) | (1 << CParser.T__29) | (1 << CParser.T__30) | (1 << CParser.T__31) | (1 << CParser.T__32) | (1 << CParser.T__33) | (1 << CParser.T__34) | (1 << CParser.T__35) | (1 << CParser.T__36))) != 0) or _la==CParser.IDENTIFIER:
+                while (((_la) & ~0x3f) == 0 and ((1 << _la) & ((1 << CParser.T__2) | (1 << CParser.T__5) | (1 << CParser.T__6) | (1 << CParser.T__7) | (1 << CParser.T__8) | (1 << CParser.T__9) | (1 << CParser.T__10) | (1 << CParser.T__11) | (1 << CParser.T__12) | (1 << CParser.T__13) | (1 << CParser.T__14) | (1 << CParser.T__15) | (1 << CParser.T__16) | (1 << CParser.T__17) | (1 << CParser.T__18) | (1 << CParser.T__20) | (1 << CParser.T__21) | (1 << CParser.T__23) | (1 << CParser.T__24) | (1 << CParser.T__25) | (1 << CParser.T__26) | (1 << CParser.T__27) | (1 << CParser.T__28) | (1 << CParser.T__29) | (1 << CParser.T__30) | (1 << CParser.T__31) | (1 << CParser.T__32) | (1 << CParser.T__33) | (1 << CParser.T__34) | (1 << CParser.T__35) | (1 << CParser.T__36))) != 0) or _la == CParser.IDENTIFIER:
                     self.state = 152
                     self.declaration()
                     self.state = 157
@@ -938,14 +930,12 @@ class CParser ( Parser ):
                 self.state = 164
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__1:
+                if _la == CParser.T__1:
                     self.state = 163
                     self.match(CParser.T__1)
 
-
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -958,7 +948,7 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
             self.ModifierText = ''
@@ -967,71 +957,64 @@ class CParser ( Parser ):
             self.LBOffset = 0
             self.DeclLine = 0
             self.DeclOffset = 0
-            self.d = None # Declaration_specifiersContext
-            self._declaration_specifiers = None # Declaration_specifiersContext
-            self._declarator = None # DeclaratorContext
-            self.a = None # Compound_statementContext
-            self.b = None # Compound_statementContext
+            self.d = None  # Declaration_specifiersContext
+            self._declaration_specifiers = None  # Declaration_specifiersContext
+            self._declarator = None  # DeclaratorContext
+            self.a = None  # Compound_statementContext
+            self.b = None  # Compound_statementContext
 
         def declarator(self):
-            return self.getTypedRuleContext(CParser.DeclaratorContext,0)
-
+            return self.getTypedRuleContext(CParser.DeclaratorContext, 0)
 
         def compound_statement(self):
-            return self.getTypedRuleContext(CParser.Compound_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Compound_statementContext, 0)
 
         def declaration_specifiers(self):
-            return self.getTypedRuleContext(CParser.Declaration_specifiersContext,0)
-
+            return self.getTypedRuleContext(CParser.Declaration_specifiersContext, 0)
 
         # @param  i=None Type: int
-        def declaration(self,i=None):
+        def declaration(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.DeclarationContext)
             else:
-                return self.getTypedRuleContext(CParser.DeclarationContext,i)
-
+                return self.getTypedRuleContext(CParser.DeclarationContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_function_definition
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterFunction_definition" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterFunction_definition"):
                 listener.enterFunction_definition(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitFunction_definition" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitFunction_definition"):
                 listener.exitFunction_definition(self)
 
-
-
-
     def function_definition(self):
 
-        localctx = CParser.Function_definitionContext(self, self._ctx, self.state)
+        localctx = CParser.Function_definitionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 4, self.RULE_function_definition)
 
-        ModifierText = '';
-        DeclText = '';
-        LBLine = 0;
-        LBOffset = 0;
-        DeclLine = 0;
-        DeclOffset = 0;
+        ModifierText = ''
+        DeclText = ''
+        LBLine = 0
+        LBOffset = 0
+        DeclLine = 0
+        DeclOffset = 0
 
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 169
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,5,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 5, self._ctx)
             if la_ == 1:
                 self.state = 168
                 localctx.d = localctx._declaration_specifiers = self.declaration_specifiers()
 
-
             self.state = 171
             localctx._declarator = self.declarator()
             self.state = 180
@@ -1047,7 +1030,7 @@ class CParser ( Parser ):
                     self.state = 175
                     self._errHandler.sync(self)
                     _la = self._input.LA(1)
-                    if not ((((_la) & ~0x3f) == 0 and ((1 << _la) & ((1 << CParser.T__2) | (1 << CParser.T__5) | (1 << CParser.T__6) | (1 << CParser.T__7) | (1 << CParser.T__8) | (1 << CParser.T__9) | (1 << CParser.T__10) | (1 << CParser.T__11) | (1 << CParser.T__12) | (1 << CParser.T__13) | (1 << CParser.T__14) | (1 << CParser.T__15) | (1 << CParser.T__16) | (1 << CParser.T__17) | (1 << CParser.T__18) | (1 << CParser.T__20) | (1 << CParser.T__21) | (1 << CParser.T__23) | (1 << CParser.T__24) | (1 << CParser.T__25) | (1 << CParser.T__26) | (1 << CParser.T__27) | (1 << CParser.T__28) | (1 << CParser.T__29) | (1 << CParser.T__30) | (1 << CParser.T__31) | (1 << CParser.T__32) | (1 << CParser.T__33) | (1 << CParser.T__34) | (1 << CParser.T__35) | (1 << CParser.T__36))) != 0) or _la==CParser.IDENTIFIER):
+                    if not ((((_la) & ~0x3f) == 0 and ((1 << _la) & ((1 << CParser.T__2) | (1 << CParser.T__5) | (1 << CParser.T__6) | (1 << CParser.T__7) | (1 << CParser.T__8) | (1 << CParser.T__9) | (1 << CParser.T__10) | (1 << CParser.T__11) | (1 << CParser.T__12) | (1 << CParser.T__13) | (1 << CParser.T__14) | (1 << CParser.T__15) | (1 << CParser.T__16) | (1 << CParser.T__17) | (1 << CParser.T__18) | (1 << CParser.T__20) | (1 << CParser.T__21) | (1 << CParser.T__23) | (1 << CParser.T__24) | (1 << CParser.T__25) | (1 << CParser.T__26) | (1 << CParser.T__27) | (1 << CParser.T__28) | (1 << CParser.T__29) | (1 << CParser.T__30) | (1 << CParser.T__31) | (1 << CParser.T__32) | (1 << CParser.T__33) | (1 << CParser.T__34) | (1 << CParser.T__35) | (1 << CParser.T__36))) != 0) or _la == CParser.IDENTIFIER):
                         break
 
                 self.state = 177
@@ -1060,24 +1043,30 @@ class CParser ( Parser ):
             else:
                 raise NoViableAltException(self)
 
-
             if localctx.d != None:
-                ModifierText = (None if localctx._declaration_specifiers is None else self._input.getText((localctx._declaration_specifiers.start,localctx._declaration_specifiers.stop)))
+                ModifierText = (None if localctx._declaration_specifiers is None else self._input.getText(
+                    (localctx._declaration_specifiers.start, localctx._declaration_specifiers.stop)))
             else:
                 ModifierText = ''
-            DeclText = (None if localctx._declarator is None else self._input.getText((localctx._declarator.start,localctx._declarator.stop)))
-            DeclLine = (None if localctx._declarator is None else localctx._declarator.start).line
-            DeclOffset = (None if localctx._declarator is None else localctx._declarator.start).column
+            DeclText = (None if localctx._declarator is None else self._input.getText(
+                (localctx._declarator.start, localctx._declarator.stop)))
+            DeclLine = (
+                None if localctx._declarator is None else localctx._declarator.start).line
+            DeclOffset = (
+                None if localctx._declarator is None else localctx._declarator.start).column
             if localctx.a != None:
                 LBLine = (None if localctx.a is None else localctx.a.start).line
-                LBOffset = (None if localctx.a is None else localctx.a.start).column
+                LBOffset = (
+                    None if localctx.a is None else localctx.a.start).column
             else:
                 LBLine = (None if localctx.b is None else localctx.b.start).line
-                LBOffset = (None if localctx.b is None else localctx.b.start).column
+                LBOffset = (
+                    None if localctx.b is None else localctx.b.start).column
 
             self._ctx.stop = self._input.LT(-1)
 
-            self.StoreFunctionDefinition(localctx.start.line, localctx.start.column, localctx.stop.line, localctx.stop.column, ModifierText, DeclText, LBLine, LBOffset, DeclLine, DeclOffset)
+            self.StoreFunctionDefinition(localctx.start.line, localctx.start.column, localctx.stop.line,
+                                         localctx.stop.column, ModifierText, DeclText, LBLine, LBOffset, DeclLine, DeclOffset)
 
         except RecognitionException as re:
             localctx.exception = re
@@ -1091,60 +1080,55 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def storage_class_specifier(self,i=None):
+        def storage_class_specifier(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Storage_class_specifierContext)
             else:
-                return self.getTypedRuleContext(CParser.Storage_class_specifierContext,i)
-
+                return self.getTypedRuleContext(CParser.Storage_class_specifierContext, i)
 
         # @param  i=None Type: int
-        def type_specifier(self,i=None):
+        def type_specifier(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Type_specifierContext)
             else:
-                return self.getTypedRuleContext(CParser.Type_specifierContext,i)
-
+                return self.getTypedRuleContext(CParser.Type_specifierContext, i)
 
         # @param  i=None Type: int
-        def type_qualifier(self,i=None):
+        def type_qualifier(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Type_qualifierContext)
             else:
-                return self.getTypedRuleContext(CParser.Type_qualifierContext,i)
-
+                return self.getTypedRuleContext(CParser.Type_qualifierContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_declaration_specifiers
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterDeclaration_specifiers" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterDeclaration_specifiers"):
                 listener.enterDeclaration_specifiers(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitDeclaration_specifiers" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitDeclaration_specifiers"):
                 listener.exitDeclaration_specifiers(self)
 
-
-
-
     def declaration_specifiers(self):
 
-        localctx = CParser.Declaration_specifiersContext(self, self._ctx, self.state)
+        localctx = CParser.Declaration_specifiersContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 6, self.RULE_declaration_specifiers)
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 187
             self._errHandler.sync(self)
             _alt = 1
-            while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
+            while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
                 if _alt == 1:
                     self.state = 187
                     self._errHandler.sync(self)
@@ -1164,12 +1148,11 @@ class CParser ( Parser ):
                     else:
                         raise NoViableAltException(self)
 
-
                 else:
                     raise NoViableAltException(self)
                 self.state = 189
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,9,self._ctx)
+                _alt = self._interp.adaptivePredict(self._input, 9, self._ctx)
 
         except RecognitionException as re:
             localctx.exception = re
@@ -1183,46 +1166,41 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
-            self.a = None # Token
-            self.b = None # Declaration_specifiersContext
-            self.c = None # Init_declarator_listContext
-            self.d = None # Token
-            self.s = None # Declaration_specifiersContext
-            self.t = None # Init_declarator_listContext
-            self.e = None # Token
+            self.a = None  # Token
+            self.b = None  # Declaration_specifiersContext
+            self.c = None  # Init_declarator_listContext
+            self.d = None  # Token
+            self.s = None  # Declaration_specifiersContext
+            self.t = None  # Init_declarator_listContext
+            self.e = None  # Token
 
         def init_declarator_list(self):
-            return self.getTypedRuleContext(CParser.Init_declarator_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Init_declarator_listContext, 0)
 
         def declaration_specifiers(self):
-            return self.getTypedRuleContext(CParser.Declaration_specifiersContext,0)
-
+            return self.getTypedRuleContext(CParser.Declaration_specifiersContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_declaration
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterDeclaration" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterDeclaration"):
                 listener.enterDeclaration(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitDeclaration" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitDeclaration"):
                 listener.exitDeclaration(self)
 
-
-
-
     def declaration(self):
 
         localctx = CParser.DeclarationContext(self, self._ctx, self.state)
         self.enterRule(localctx, 8, self.RULE_declaration)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.state = 206
             self._errHandler.sync(self)
@@ -1233,21 +1211,22 @@ class CParser ( Parser ):
                 localctx.a = self.match(CParser.T__2)
                 self.state = 193
                 self._errHandler.sync(self)
-                la_ = self._interp.adaptivePredict(self._input,10,self._ctx)
+                la_ = self._interp.adaptivePredict(self._input, 10, self._ctx)
                 if la_ == 1:
                     self.state = 192
                     localctx.b = self.declaration_specifiers()
 
-
                 self.state = 195
                 localctx.c = self.init_declarator_list()
                 self.state = 196
                 localctx.d = self.match(CParser.T__1)
 
                 if localctx.b is not None:
-                    self.StoreTypedefDefinition(localctx.a.line, localctx.a.column, (0 if localctx.d is None else localctx.d.line), localctx.d.column, (None if localctx.b is None else self._input.getText((localctx.b.start,localctx.b.stop))), (None if localctx.c is None else self._input.getText((localctx.c.start,localctx.c.stop))))
+                    self.StoreTypedefDefinition(localctx.a.line, localctx.a.column, (0 if localctx.d is None else localctx.d.line), localctx.d.column, (None if localctx.b is None else self._input.getText(
+                        (localctx.b.start, localctx.b.stop))), (None if localctx.c is None else self._input.getText((localctx.c.start, localctx.c.stop))))
                 else:
-                    self.StoreTypedefDefinition(localctx.a.line, localctx.a.column, (0 if localctx.d is None else localctx.d.line), localctx.d.column, '', (None if localctx.c is None else self._input.getText((localctx.c.start,localctx.c.stop))))
+                    self.StoreTypedefDefinition(localctx.a.line, localctx.a.column, (0 if localctx.d is None else localctx.d.line),
+                                                localctx.d.column, '', (None if localctx.c is None else self._input.getText((localctx.c.start, localctx.c.stop))))
 
                 pass
             elif token in [CParser.T__5, CParser.T__6, CParser.T__7, CParser.T__8, CParser.T__9, CParser.T__10, CParser.T__11, CParser.T__12, CParser.T__13, CParser.T__14, CParser.T__15, CParser.T__16, CParser.T__17, CParser.T__18, CParser.T__20, CParser.T__21, CParser.T__23, CParser.T__24, CParser.T__25, CParser.T__26, CParser.T__27, CParser.T__28, CParser.T__29, CParser.T__30, CParser.T__31, CParser.T__32, CParser.T__33, CParser.T__34, CParser.T__35, CParser.T__36, CParser.IDENTIFIER]:
@@ -1261,12 +1240,12 @@ class CParser ( Parser ):
                     self.state = 200
                     localctx.t = self.init_declarator_list()
 
-
                 self.state = 203
                 localctx.e = self.match(CParser.T__1)
 
                 if localctx.t is not None:
-                    self.StoreVariableDeclaration((None if localctx.s is None else localctx.s.start).line, (None if localctx.s is None else localctx.s.start).column, (None if localctx.t is None else localctx.t.start).line, (None if localctx.t is None else localctx.t.start).column, (None if localctx.s is None else self._input.getText((localctx.s.start,localctx.s.stop))), (None if localctx.t is None else self._input.getText((localctx.t.start,localctx.t.stop))))
+                    self.StoreVariableDeclaration((None if localctx.s is None else localctx.s.start).line, (None if localctx.s is None else localctx.s.start).column, (None if localctx.t is None else localctx.t.start).line, (
+                        None if localctx.t is None else localctx.t.start).column, (None if localctx.s is None else self._input.getText((localctx.s.start, localctx.s.stop))), (None if localctx.t is None else self._input.getText((localctx.t.start, localctx.t.stop))))
 
                 pass
             else:
@@ -1284,39 +1263,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def init_declarator(self,i=None):
+        def init_declarator(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Init_declaratorContext)
             else:
-                return self.getTypedRuleContext(CParser.Init_declaratorContext,i)
-
+                return self.getTypedRuleContext(CParser.Init_declaratorContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_init_declarator_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterInit_declarator_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterInit_declarator_list"):
                 listener.enterInit_declarator_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitInit_declarator_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitInit_declarator_list"):
                 listener.exitInit_declarator_list(self)
 
-
-
-
     def init_declarator_list(self):
 
-        localctx = CParser.Init_declarator_listContext(self, self._ctx, self.state)
+        localctx = CParser.Init_declarator_listContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 10, self.RULE_init_declarator_list)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 208
@@ -1324,7 +1300,7 @@ class CParser ( Parser ):
             self.state = 213
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__3:
+            while _la == CParser.T__3:
                 self.state = 209
                 self.match(CParser.T__3)
                 self.state = 210
@@ -1345,39 +1321,34 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def declarator(self):
-            return self.getTypedRuleContext(CParser.DeclaratorContext,0)
-
+            return self.getTypedRuleContext(CParser.DeclaratorContext, 0)
 
         def initializer(self):
-            return self.getTypedRuleContext(CParser.InitializerContext,0)
-
+            return self.getTypedRuleContext(CParser.InitializerContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_init_declarator
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterInit_declarator" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterInit_declarator"):
                 listener.enterInit_declarator(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitInit_declarator" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitInit_declarator"):
                 listener.exitInit_declarator(self)
 
-
-
-
     def init_declarator(self):
 
         localctx = CParser.Init_declaratorContext(self, self._ctx, self.state)
         self.enterRule(localctx, 12, self.RULE_init_declarator)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 216
@@ -1385,13 +1356,12 @@ class CParser ( Parser ):
             self.state = 219
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            if _la==CParser.T__4:
+            if _la == CParser.T__4:
                 self.state = 217
                 self.match(CParser.T__4)
                 self.state = 218
                 self.initializer()
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -1404,32 +1374,29 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
-
         def getRuleIndex(self):
             return CParser.RULE_storage_class_specifier
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterStorage_class_specifier" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterStorage_class_specifier"):
                 listener.enterStorage_class_specifier(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitStorage_class_specifier" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitStorage_class_specifier"):
                 listener.exitStorage_class_specifier(self)
 
-
-
-
     def storage_class_specifier(self):
 
-        localctx = CParser.Storage_class_specifierContext(self, self._ctx, self.state)
+        localctx = CParser.Storage_class_specifierContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 14, self.RULE_storage_class_specifier)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 221
@@ -1451,55 +1418,47 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
-            self.s = None # Struct_or_union_specifierContext
-            self.e = None # Enum_specifierContext
+            self.s = None  # Struct_or_union_specifierContext
+            self.e = None  # Enum_specifierContext
 
         def struct_or_union_specifier(self):
-            return self.getTypedRuleContext(CParser.Struct_or_union_specifierContext,0)
-
+            return self.getTypedRuleContext(CParser.Struct_or_union_specifierContext, 0)
 
         def enum_specifier(self):
-            return self.getTypedRuleContext(CParser.Enum_specifierContext,0)
-
+            return self.getTypedRuleContext(CParser.Enum_specifierContext, 0)
 
         def IDENTIFIER(self):
             return self.getToken(CParser.IDENTIFIER, 0)
 
         def declarator(self):
-            return self.getTypedRuleContext(CParser.DeclaratorContext,0)
-
+            return self.getTypedRuleContext(CParser.DeclaratorContext, 0)
 
         # @param  i=None Type: int
-        def type_qualifier(self,i=None):
+        def type_qualifier(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Type_qualifierContext)
             else:
-                return self.getTypedRuleContext(CParser.Type_qualifierContext,i)
-
+                return self.getTypedRuleContext(CParser.Type_qualifierContext, i)
 
         def type_id(self):
-            return self.getTypedRuleContext(CParser.Type_idContext,0)
-
+            return self.getTypedRuleContext(CParser.Type_idContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_type_specifier
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterType_specifier" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterType_specifier"):
                 listener.enterType_specifier(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitType_specifier" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitType_specifier"):
                 listener.exitType_specifier(self)
 
-
-
-
     def type_specifier(self):
 
         localctx = CParser.Type_specifierContext(self, self._ctx, self.state)
@@ -1507,7 +1466,7 @@ class CParser ( Parser ):
         try:
             self.state = 247
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,16,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 16, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 223
@@ -1568,7 +1527,8 @@ class CParser ( Parser ):
                 localctx.s = self.struct_or_union_specifier()
 
                 if localctx.s.stop is not None:
-                    self.StoreStructUnionDefinition((None if localctx.s is None else localctx.s.start).line, (None if localctx.s is None else localctx.s.start).column, (None if localctx.s is None else localctx.s.stop).line, (None if localctx.s is None else localctx.s.stop).column, (None if localctx.s is None else self._input.getText((localctx.s.start,localctx.s.stop))))
+                    self.StoreStructUnionDefinition((None if localctx.s is None else localctx.s.start).line, (None if localctx.s is None else localctx.s.start).column, (None if localctx.s is None else localctx.s.stop).line, (
+                        None if localctx.s is None else localctx.s.stop).column, (None if localctx.s is None else self._input.getText((localctx.s.start, localctx.s.stop))))
 
                 pass
 
@@ -1578,7 +1538,8 @@ class CParser ( Parser ):
                 localctx.e = self.enum_specifier()
 
                 if localctx.e.stop is not None:
-                    self.StoreEnumerationDefinition((None if localctx.e is None else localctx.e.start).line, (None if localctx.e is None else localctx.e.start).column, (None if localctx.e is None else localctx.e.stop).line, (None if localctx.e is None else localctx.e.stop).column, (None if localctx.e is None else self._input.getText((localctx.e.start,localctx.e.stop))))
+                    self.StoreEnumerationDefinition((None if localctx.e is None else localctx.e.start).line, (None if localctx.e is None else localctx.e.start).column, (None if localctx.e is None else localctx.e.stop).line, (
+                        None if localctx.e is None else localctx.e.stop).column, (None if localctx.e is None else self._input.getText((localctx.e.start, localctx.e.stop))))
 
                 pass
 
@@ -1588,14 +1549,15 @@ class CParser ( Parser ):
                 self.match(CParser.IDENTIFIER)
                 self.state = 242
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,15,self._ctx)
-                while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
-                    if _alt==1:
+                _alt = self._interp.adaptivePredict(self._input, 15, self._ctx)
+                while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
+                    if _alt == 1:
                         self.state = 239
                         self.type_qualifier()
                     self.state = 244
                     self._errHandler.sync(self)
-                    _alt = self._interp.adaptivePredict(self._input,15,self._ctx)
+                    _alt = self._interp.adaptivePredict(
+                        self._input, 15, self._ctx)
 
                 self.state = 245
                 self.declarator()
@@ -1607,7 +1569,6 @@ class CParser ( Parser ):
                 self.type_id()
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -1620,7 +1581,7 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
@@ -1631,18 +1592,15 @@ class CParser ( Parser ):
             return CParser.RULE_type_id
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterType_id" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterType_id"):
                 listener.enterType_id(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitType_id" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitType_id"):
                 listener.exitType_id(self)
 
-
-
-
     def type_id(self):
 
         localctx = CParser.Type_idContext(self, self._ctx, self.state)
@@ -1663,17 +1621,15 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def struct_or_union(self):
-            return self.getTypedRuleContext(CParser.Struct_or_unionContext,0)
-
+            return self.getTypedRuleContext(CParser.Struct_or_unionContext, 0)
 
         def struct_declaration_list(self):
-            return self.getTypedRuleContext(CParser.Struct_declaration_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Struct_declaration_listContext, 0)
 
         def IDENTIFIER(self):
             return self.getToken(CParser.IDENTIFIER, 0)
@@ -1682,27 +1638,25 @@ class CParser ( Parser ):
             return CParser.RULE_struct_or_union_specifier
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterStruct_or_union_specifier" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterStruct_or_union_specifier"):
                 listener.enterStruct_or_union_specifier(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitStruct_or_union_specifier" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitStruct_or_union_specifier"):
                 listener.exitStruct_or_union_specifier(self)
 
-
-
-
     def struct_or_union_specifier(self):
 
-        localctx = CParser.Struct_or_union_specifierContext(self, self._ctx, self.state)
+        localctx = CParser.Struct_or_union_specifierContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 20, self.RULE_struct_or_union_specifier)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.state = 262
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,18,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 18, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 251
@@ -1710,11 +1664,10 @@ class CParser ( Parser ):
                 self.state = 253
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.IDENTIFIER:
+                if _la == CParser.IDENTIFIER:
                     self.state = 252
                     self.match(CParser.IDENTIFIER)
 
-
                 self.state = 255
                 self.match(CParser.T__0)
                 self.state = 256
@@ -1731,7 +1684,6 @@ class CParser ( Parser ):
                 self.match(CParser.IDENTIFIER)
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -1744,37 +1696,33 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
-
         def getRuleIndex(self):
             return CParser.RULE_struct_or_union
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterStruct_or_union" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterStruct_or_union"):
                 listener.enterStruct_or_union(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitStruct_or_union" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitStruct_or_union"):
                 listener.exitStruct_or_union(self)
 
-
-
-
     def struct_or_union(self):
 
         localctx = CParser.Struct_or_unionContext(self, self._ctx, self.state)
         self.enterRule(localctx, 22, self.RULE_struct_or_union)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 264
             _la = self._input.LA(1)
-            if not(_la==CParser.T__20 or _la==CParser.T__21):
+            if not(_la == CParser.T__20 or _la == CParser.T__21):
                 self._errHandler.recoverInline(self)
             else:
                 self._errHandler.reportMatch(self)
@@ -1791,39 +1739,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def struct_declaration(self,i=None):
+        def struct_declaration(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Struct_declarationContext)
             else:
-                return self.getTypedRuleContext(CParser.Struct_declarationContext,i)
-
+                return self.getTypedRuleContext(CParser.Struct_declarationContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_struct_declaration_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterStruct_declaration_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterStruct_declaration_list"):
                 listener.enterStruct_declaration_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitStruct_declaration_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitStruct_declaration_list"):
                 listener.exitStruct_declaration_list(self)
 
-
-
-
     def struct_declaration_list(self):
 
-        localctx = CParser.Struct_declaration_listContext(self, self._ctx, self.state)
+        localctx = CParser.Struct_declaration_listContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 24, self.RULE_struct_declaration_list)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 267
@@ -1835,7 +1780,7 @@ class CParser ( Parser ):
                 self.state = 269
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if not ((((_la) & ~0x3f) == 0 and ((1 << _la) & ((1 << CParser.T__10) | (1 << CParser.T__11) | (1 << CParser.T__12) | (1 << CParser.T__13) | (1 << CParser.T__14) | (1 << CParser.T__15) | (1 << CParser.T__16) | (1 << CParser.T__17) | (1 << CParser.T__18) | (1 << CParser.T__20) | (1 << CParser.T__21) | (1 << CParser.T__23) | (1 << CParser.T__24) | (1 << CParser.T__25) | (1 << CParser.T__26) | (1 << CParser.T__27) | (1 << CParser.T__28) | (1 << CParser.T__29) | (1 << CParser.T__30) | (1 << CParser.T__31) | (1 << CParser.T__32) | (1 << CParser.T__33) | (1 << CParser.T__34) | (1 << CParser.T__35) | (1 << CParser.T__36))) != 0) or _la==CParser.IDENTIFIER):
+                if not ((((_la) & ~0x3f) == 0 and ((1 << _la) & ((1 << CParser.T__10) | (1 << CParser.T__11) | (1 << CParser.T__12) | (1 << CParser.T__13) | (1 << CParser.T__14) | (1 << CParser.T__15) | (1 << CParser.T__16) | (1 << CParser.T__17) | (1 << CParser.T__18) | (1 << CParser.T__20) | (1 << CParser.T__21) | (1 << CParser.T__23) | (1 << CParser.T__24) | (1 << CParser.T__25) | (1 << CParser.T__26) | (1 << CParser.T__27) | (1 << CParser.T__28) | (1 << CParser.T__29) | (1 << CParser.T__30) | (1 << CParser.T__31) | (1 << CParser.T__32) | (1 << CParser.T__33) | (1 << CParser.T__34) | (1 << CParser.T__35) | (1 << CParser.T__36))) != 0) or _la == CParser.IDENTIFIER):
                     break
 
         except RecognitionException as re:
@@ -1850,37 +1795,33 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def specifier_qualifier_list(self):
-            return self.getTypedRuleContext(CParser.Specifier_qualifier_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Specifier_qualifier_listContext, 0)
 
         def struct_declarator_list(self):
-            return self.getTypedRuleContext(CParser.Struct_declarator_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Struct_declarator_listContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_struct_declaration
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterStruct_declaration" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterStruct_declaration"):
                 listener.enterStruct_declaration(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitStruct_declaration" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitStruct_declaration"):
                 listener.exitStruct_declaration(self)
 
-
-
-
     def struct_declaration(self):
 
-        localctx = CParser.Struct_declarationContext(self, self._ctx, self.state)
+        localctx = CParser.Struct_declarationContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 26, self.RULE_struct_declaration)
         try:
             self.enterOuterAlt(localctx, 1)
@@ -1902,52 +1843,48 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def type_qualifier(self,i=None):
+        def type_qualifier(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Type_qualifierContext)
             else:
-                return self.getTypedRuleContext(CParser.Type_qualifierContext,i)
-
+                return self.getTypedRuleContext(CParser.Type_qualifierContext, i)
 
         # @param  i=None Type: int
-        def type_specifier(self,i=None):
+        def type_specifier(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Type_specifierContext)
             else:
-                return self.getTypedRuleContext(CParser.Type_specifierContext,i)
-
+                return self.getTypedRuleContext(CParser.Type_specifierContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_specifier_qualifier_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterSpecifier_qualifier_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterSpecifier_qualifier_list"):
                 listener.enterSpecifier_qualifier_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitSpecifier_qualifier_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitSpecifier_qualifier_list"):
                 listener.exitSpecifier_qualifier_list(self)
 
-
-
-
     def specifier_qualifier_list(self):
 
-        localctx = CParser.Specifier_qualifier_listContext(self, self._ctx, self.state)
+        localctx = CParser.Specifier_qualifier_listContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 28, self.RULE_specifier_qualifier_list)
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 277
             self._errHandler.sync(self)
             _alt = 1
-            while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
+            while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
                 if _alt == 1:
                     self.state = 277
                     self._errHandler.sync(self)
@@ -1963,12 +1900,11 @@ class CParser ( Parser ):
                     else:
                         raise NoViableAltException(self)
 
-
                 else:
                     raise NoViableAltException(self)
                 self.state = 279
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,21,self._ctx)
+                _alt = self._interp.adaptivePredict(self._input, 21, self._ctx)
 
         except RecognitionException as re:
             localctx.exception = re
@@ -1982,39 +1918,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def struct_declarator(self,i=None):
+        def struct_declarator(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Struct_declaratorContext)
             else:
-                return self.getTypedRuleContext(CParser.Struct_declaratorContext,i)
-
+                return self.getTypedRuleContext(CParser.Struct_declaratorContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_struct_declarator_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterStruct_declarator_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterStruct_declarator_list"):
                 listener.enterStruct_declarator_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitStruct_declarator_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitStruct_declarator_list"):
                 listener.exitStruct_declarator_list(self)
 
-
-
-
     def struct_declarator_list(self):
 
-        localctx = CParser.Struct_declarator_listContext(self, self._ctx, self.state)
+        localctx = CParser.Struct_declarator_listContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 30, self.RULE_struct_declarator_list)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 281
@@ -2022,7 +1955,7 @@ class CParser ( Parser ):
             self.state = 286
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__3:
+            while _la == CParser.T__3:
                 self.state = 282
                 self.match(CParser.T__3)
                 self.state = 283
@@ -2043,39 +1976,35 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def declarator(self):
-            return self.getTypedRuleContext(CParser.DeclaratorContext,0)
-
+            return self.getTypedRuleContext(CParser.DeclaratorContext, 0)
 
         def constant_expression(self):
-            return self.getTypedRuleContext(CParser.Constant_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Constant_expressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_struct_declarator
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterStruct_declarator" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterStruct_declarator"):
                 listener.enterStruct_declarator(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitStruct_declarator" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitStruct_declarator"):
                 listener.exitStruct_declarator(self)
 
-
-
-
     def struct_declarator(self):
 
-        localctx = CParser.Struct_declaratorContext(self, self._ctx, self.state)
+        localctx = CParser.Struct_declaratorContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 32, self.RULE_struct_declarator)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.state = 296
             self._errHandler.sync(self)
@@ -2087,13 +2016,12 @@ class CParser ( Parser ):
                 self.state = 292
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__22:
+                if _la == CParser.T__22:
                     self.state = 290
                     self.match(CParser.T__22)
                     self.state = 291
                     self.constant_expression()
 
-
                 pass
             elif token in [CParser.T__22]:
                 self.enterOuterAlt(localctx, 2)
@@ -2117,13 +2045,12 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def enumerator_list(self):
-            return self.getTypedRuleContext(CParser.Enumerator_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Enumerator_listContext, 0)
 
         def IDENTIFIER(self):
             return self.getToken(CParser.IDENTIFIER, 0)
@@ -2132,27 +2059,24 @@ class CParser ( Parser ):
             return CParser.RULE_enum_specifier
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterEnum_specifier" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterEnum_specifier"):
                 listener.enterEnum_specifier(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitEnum_specifier" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitEnum_specifier"):
                 listener.exitEnum_specifier(self)
 
-
-
-
     def enum_specifier(self):
 
         localctx = CParser.Enum_specifierContext(self, self._ctx, self.state)
         self.enterRule(localctx, 34, self.RULE_enum_specifier)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.state = 317
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,27,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 27, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 298
@@ -2164,11 +2088,10 @@ class CParser ( Parser ):
                 self.state = 302
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__3:
+                if _la == CParser.T__3:
                     self.state = 301
                     self.match(CParser.T__3)
 
-
                 self.state = 304
                 self.match(CParser.T__19)
                 pass
@@ -2186,11 +2109,10 @@ class CParser ( Parser ):
                 self.state = 311
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__3:
+                if _la == CParser.T__3:
                     self.state = 310
                     self.match(CParser.T__3)
 
-
                 self.state = 313
                 self.match(CParser.T__19)
                 pass
@@ -2203,7 +2125,6 @@ class CParser ( Parser ):
                 self.match(CParser.IDENTIFIER)
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -2216,34 +2137,30 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def enumerator(self,i=None):
+        def enumerator(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.EnumeratorContext)
             else:
-                return self.getTypedRuleContext(CParser.EnumeratorContext,i)
-
+                return self.getTypedRuleContext(CParser.EnumeratorContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_enumerator_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterEnumerator_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterEnumerator_list"):
                 listener.enterEnumerator_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitEnumerator_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitEnumerator_list"):
                 listener.exitEnumerator_list(self)
 
-
-
-
     def enumerator_list(self):
 
         localctx = CParser.Enumerator_listContext(self, self._ctx, self.state)
@@ -2254,16 +2171,16 @@ class CParser ( Parser ):
             self.enumerator()
             self.state = 324
             self._errHandler.sync(self)
-            _alt = self._interp.adaptivePredict(self._input,28,self._ctx)
-            while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
-                if _alt==1:
+            _alt = self._interp.adaptivePredict(self._input, 28, self._ctx)
+            while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
+                if _alt == 1:
                     self.state = 320
                     self.match(CParser.T__3)
                     self.state = 321
                     self.enumerator()
                 self.state = 326
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,28,self._ctx)
+                _alt = self._interp.adaptivePredict(self._input, 28, self._ctx)
 
         except RecognitionException as re:
             localctx.exception = re
@@ -2277,7 +2194,7 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
@@ -2285,30 +2202,26 @@ class CParser ( Parser ):
             return self.getToken(CParser.IDENTIFIER, 0)
 
         def constant_expression(self):
-            return self.getTypedRuleContext(CParser.Constant_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Constant_expressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_enumerator
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterEnumerator" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterEnumerator"):
                 listener.enterEnumerator(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitEnumerator" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitEnumerator"):
                 listener.exitEnumerator(self)
 
-
-
-
     def enumerator(self):
 
         localctx = CParser.EnumeratorContext(self, self._ctx, self.state)
         self.enterRule(localctx, 38, self.RULE_enumerator)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 327
@@ -2316,13 +2229,12 @@ class CParser ( Parser ):
             self.state = 330
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            if _la==CParser.T__4:
+            if _la == CParser.T__4:
                 self.state = 328
                 self.match(CParser.T__4)
                 self.state = 329
                 self.constant_expression()
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -2335,32 +2247,28 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
-
         def getRuleIndex(self):
             return CParser.RULE_type_qualifier
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterType_qualifier" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterType_qualifier"):
                 listener.enterType_qualifier(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitType_qualifier" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitType_qualifier"):
                 listener.exitType_qualifier(self)
 
-
-
-
     def type_qualifier(self):
 
         localctx = CParser.Type_qualifierContext(self, self._ctx, self.state)
         self.enterRule(localctx, 40, self.RULE_type_qualifier)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 332
@@ -2382,77 +2290,68 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def direct_declarator(self):
-            return self.getTypedRuleContext(CParser.Direct_declaratorContext,0)
-
+            return self.getTypedRuleContext(CParser.Direct_declaratorContext, 0)
 
         def pointer(self):
-            return self.getTypedRuleContext(CParser.PointerContext,0)
-
+            return self.getTypedRuleContext(CParser.PointerContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_declarator
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterDeclarator" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterDeclarator"):
                 listener.enterDeclarator(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitDeclarator" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitDeclarator"):
                 listener.exitDeclarator(self)
 
-
-
-
     def declarator(self):
 
         localctx = CParser.DeclaratorContext(self, self._ctx, self.state)
         self.enterRule(localctx, 42, self.RULE_declarator)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.state = 348
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,34,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 34, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 335
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__41:
+                if _la == CParser.T__41:
                     self.state = 334
                     self.pointer()
 
-
                 self.state = 338
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__33:
+                if _la == CParser.T__33:
                     self.state = 337
                     self.match(CParser.T__33)
 
-
                 self.state = 341
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__34:
+                if _la == CParser.T__34:
                     self.state = 340
                     self.match(CParser.T__34)
 
-
                 self.state = 344
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__35:
+                if _la == CParser.T__35:
                     self.state = 343
                     self.match(CParser.T__35)
 
-
                 self.state = 346
                 self.direct_declarator()
                 pass
@@ -2463,7 +2362,6 @@ class CParser ( Parser ):
                 self.pointer()
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -2476,7 +2374,7 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
@@ -2484,36 +2382,32 @@ class CParser ( Parser ):
             return self.getToken(CParser.IDENTIFIER, 0)
 
         # @param  i=None Type: int
-        def declarator_suffix(self,i=None):
+        def declarator_suffix(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Declarator_suffixContext)
             else:
-                return self.getTypedRuleContext(CParser.Declarator_suffixContext,i)
-
+                return self.getTypedRuleContext(CParser.Declarator_suffixContext, i)
 
         def declarator(self):
-            return self.getTypedRuleContext(CParser.DeclaratorContext,0)
-
+            return self.getTypedRuleContext(CParser.DeclaratorContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_direct_declarator
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterDirect_declarator" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterDirect_declarator"):
                 listener.enterDirect_declarator(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitDirect_declarator" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitDirect_declarator"):
                 listener.exitDirect_declarator(self)
 
-
-
-
     def direct_declarator(self):
 
-        localctx = CParser.Direct_declaratorContext(self, self._ctx, self.state)
+        localctx = CParser.Direct_declaratorContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 44, self.RULE_direct_declarator)
         try:
             self.state = 368
@@ -2525,14 +2419,15 @@ class CParser ( Parser ):
                 self.match(CParser.IDENTIFIER)
                 self.state = 354
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,35,self._ctx)
-                while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
-                    if _alt==1:
+                _alt = self._interp.adaptivePredict(self._input, 35, self._ctx)
+                while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
+                    if _alt == 1:
                         self.state = 351
                         self.declarator_suffix()
                     self.state = 356
                     self._errHandler.sync(self)
-                    _alt = self._interp.adaptivePredict(self._input,35,self._ctx)
+                    _alt = self._interp.adaptivePredict(
+                        self._input, 35, self._ctx)
 
                 pass
             elif token in [CParser.T__37]:
@@ -2541,12 +2436,11 @@ class CParser ( Parser ):
                 self.match(CParser.T__37)
                 self.state = 359
                 self._errHandler.sync(self)
-                la_ = self._interp.adaptivePredict(self._input,36,self._ctx)
+                la_ = self._interp.adaptivePredict(self._input, 36, self._ctx)
                 if la_ == 1:
                     self.state = 358
                     self.match(CParser.T__33)
 
-
                 self.state = 361
                 self.declarator()
                 self.state = 362
@@ -2554,7 +2448,7 @@ class CParser ( Parser ):
                 self.state = 364
                 self._errHandler.sync(self)
                 _alt = 1
-                while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
+                while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
                     if _alt == 1:
                         self.state = 363
                         self.declarator_suffix()
@@ -2563,7 +2457,8 @@ class CParser ( Parser ):
                         raise NoViableAltException(self)
                     self.state = 366
                     self._errHandler.sync(self)
-                    _alt = self._interp.adaptivePredict(self._input,37,self._ctx)
+                    _alt = self._interp.adaptivePredict(
+                        self._input, 37, self._ctx)
 
                 pass
             else:
@@ -2581,46 +2476,41 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def constant_expression(self):
-            return self.getTypedRuleContext(CParser.Constant_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Constant_expressionContext, 0)
 
         def parameter_type_list(self):
-            return self.getTypedRuleContext(CParser.Parameter_type_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Parameter_type_listContext, 0)
 
         def identifier_list(self):
-            return self.getTypedRuleContext(CParser.Identifier_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Identifier_listContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_declarator_suffix
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterDeclarator_suffix" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterDeclarator_suffix"):
                 listener.enterDeclarator_suffix(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitDeclarator_suffix" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitDeclarator_suffix"):
                 listener.exitDeclarator_suffix(self)
 
-
-
-
     def declarator_suffix(self):
 
-        localctx = CParser.Declarator_suffixContext(self, self._ctx, self.state)
+        localctx = CParser.Declarator_suffixContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 46, self.RULE_declarator_suffix)
         try:
             self.state = 386
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,39,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 39, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 370
@@ -2667,7 +2557,6 @@ class CParser ( Parser ):
                 self.match(CParser.T__38)
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -2680,38 +2569,33 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def type_qualifier(self,i=None):
+        def type_qualifier(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Type_qualifierContext)
             else:
-                return self.getTypedRuleContext(CParser.Type_qualifierContext,i)
-
+                return self.getTypedRuleContext(CParser.Type_qualifierContext, i)
 
         def pointer(self):
-            return self.getTypedRuleContext(CParser.PointerContext,0)
-
+            return self.getTypedRuleContext(CParser.PointerContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_pointer
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterPointer" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterPointer"):
                 listener.enterPointer(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitPointer" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitPointer"):
                 listener.exitPointer(self)
 
-
-
-
     def pointer(self):
 
         localctx = CParser.PointerContext(self, self._ctx, self.state)
@@ -2719,7 +2603,7 @@ class CParser ( Parser ):
         try:
             self.state = 400
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,42,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 42, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 388
@@ -2727,7 +2611,7 @@ class CParser ( Parser ):
                 self.state = 390
                 self._errHandler.sync(self)
                 _alt = 1
-                while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
+                while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
                     if _alt == 1:
                         self.state = 389
                         self.type_qualifier()
@@ -2736,16 +2620,16 @@ class CParser ( Parser ):
                         raise NoViableAltException(self)
                     self.state = 392
                     self._errHandler.sync(self)
-                    _alt = self._interp.adaptivePredict(self._input,40,self._ctx)
+                    _alt = self._interp.adaptivePredict(
+                        self._input, 40, self._ctx)
 
                 self.state = 395
                 self._errHandler.sync(self)
-                la_ = self._interp.adaptivePredict(self._input,41,self._ctx)
+                la_ = self._interp.adaptivePredict(self._input, 41, self._ctx)
                 if la_ == 1:
                     self.state = 394
                     self.pointer()
 
-
                 pass
 
             elif la_ == 2:
@@ -2762,7 +2646,6 @@ class CParser ( Parser ):
                 self.match(CParser.T__41)
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -2775,35 +2658,32 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def parameter_list(self):
-            return self.getTypedRuleContext(CParser.Parameter_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Parameter_listContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_parameter_type_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterParameter_type_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterParameter_type_list"):
                 listener.enterParameter_type_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitParameter_type_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitParameter_type_list"):
                 listener.exitParameter_type_list(self)
 
-
-
-
     def parameter_type_list(self):
 
-        localctx = CParser.Parameter_type_listContext(self, self._ctx, self.state)
+        localctx = CParser.Parameter_type_listContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 50, self.RULE_parameter_type_list)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 402
@@ -2811,21 +2691,19 @@ class CParser ( Parser ):
             self.state = 408
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            if _la==CParser.T__3:
+            if _la == CParser.T__3:
                 self.state = 403
                 self.match(CParser.T__3)
                 self.state = 405
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__28:
+                if _la == CParser.T__28:
                     self.state = 404
                     self.match(CParser.T__28)
 
-
                 self.state = 407
                 self.match(CParser.T__42)
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -2838,34 +2716,30 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def parameter_declaration(self,i=None):
+        def parameter_declaration(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Parameter_declarationContext)
             else:
-                return self.getTypedRuleContext(CParser.Parameter_declarationContext,i)
-
+                return self.getTypedRuleContext(CParser.Parameter_declarationContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_parameter_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterParameter_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterParameter_list"):
                 listener.enterParameter_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitParameter_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitParameter_list"):
                 listener.exitParameter_list(self)
 
-
-
-
     def parameter_list(self):
 
         localctx = CParser.Parameter_listContext(self, self._ctx, self.state)
@@ -2876,24 +2750,24 @@ class CParser ( Parser ):
             self.parameter_declaration()
             self.state = 418
             self._errHandler.sync(self)
-            _alt = self._interp.adaptivePredict(self._input,46,self._ctx)
-            while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
-                if _alt==1:
+            _alt = self._interp.adaptivePredict(self._input, 46, self._ctx)
+            while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
+                if _alt == 1:
                     self.state = 411
                     self.match(CParser.T__3)
                     self.state = 413
                     self._errHandler.sync(self)
-                    la_ = self._interp.adaptivePredict(self._input,45,self._ctx)
+                    la_ = self._interp.adaptivePredict(
+                        self._input, 45, self._ctx)
                     if la_ == 1:
                         self.state = 412
                         self.match(CParser.T__28)
 
-
                     self.state = 415
                     self.parameter_declaration()
                 self.state = 420
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,46,self._ctx)
+                _alt = self._interp.adaptivePredict(self._input, 46, self._ctx)
 
         except RecognitionException as re:
             localctx.exception = re
@@ -2907,66 +2781,60 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def declaration_specifiers(self):
-            return self.getTypedRuleContext(CParser.Declaration_specifiersContext,0)
-
+            return self.getTypedRuleContext(CParser.Declaration_specifiersContext, 0)
 
         # @param  i=None Type: int
-        def declarator(self,i=None):
+        def declarator(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.DeclaratorContext)
             else:
-                return self.getTypedRuleContext(CParser.DeclaratorContext,i)
-
+                return self.getTypedRuleContext(CParser.DeclaratorContext, i)
 
         # @param  i=None Type: int
-        def abstract_declarator(self,i=None):
+        def abstract_declarator(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Abstract_declaratorContext)
             else:
-                return self.getTypedRuleContext(CParser.Abstract_declaratorContext,i)
-
+                return self.getTypedRuleContext(CParser.Abstract_declaratorContext, i)
 
         def IDENTIFIER(self):
             return self.getToken(CParser.IDENTIFIER, 0)
 
         # @param  i=None Type: int
-        def pointer(self,i=None):
+        def pointer(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.PointerContext)
             else:
-                return self.getTypedRuleContext(CParser.PointerContext,i)
-
+                return self.getTypedRuleContext(CParser.PointerContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_parameter_declaration
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterParameter_declaration" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterParameter_declaration"):
                 listener.enterParameter_declaration(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitParameter_declaration" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitParameter_declaration"):
                 listener.exitParameter_declaration(self)
 
-
-
-
     def parameter_declaration(self):
 
-        localctx = CParser.Parameter_declarationContext(self, self._ctx, self.state)
+        localctx = CParser.Parameter_declarationContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 54, self.RULE_parameter_declaration)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.state = 439
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,51,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 51, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 421
@@ -2977,7 +2845,8 @@ class CParser ( Parser ):
                 while ((((_la - 34)) & ~0x3f) == 0 and ((1 << (_la - 34)) & ((1 << (CParser.T__33 - 34)) | (1 << (CParser.T__34 - 34)) | (1 << (CParser.T__35 - 34)) | (1 << (CParser.T__37 - 34)) | (1 << (CParser.T__39 - 34)) | (1 << (CParser.T__41 - 34)) | (1 << (CParser.IDENTIFIER - 34)))) != 0):
                     self.state = 424
                     self._errHandler.sync(self)
-                    la_ = self._interp.adaptivePredict(self._input,47,self._ctx)
+                    la_ = self._interp.adaptivePredict(
+                        self._input, 47, self._ctx)
                     if la_ == 1:
                         self.state = 422
                         self.declarator()
@@ -2988,7 +2857,6 @@ class CParser ( Parser ):
                         self.abstract_declarator()
                         pass
 
-
                     self.state = 428
                     self._errHandler.sync(self)
                     _la = self._input.LA(1)
@@ -2996,11 +2864,10 @@ class CParser ( Parser ):
                 self.state = 430
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__28:
+                if _la == CParser.T__28:
                     self.state = 429
                     self.match(CParser.T__28)
 
-
                 pass
 
             elif la_ == 2:
@@ -3008,7 +2875,7 @@ class CParser ( Parser ):
                 self.state = 435
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                while _la==CParser.T__41:
+                while _la == CParser.T__41:
                     self.state = 432
                     self.pointer()
                     self.state = 437
@@ -3019,7 +2886,6 @@ class CParser ( Parser ):
                 self.match(CParser.IDENTIFIER)
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -3032,12 +2898,12 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def IDENTIFIER(self,i=None):
+        def IDENTIFIER(self, i=None):
             if i is None:
                 return self.getTokens(CParser.IDENTIFIER)
             else:
@@ -3047,23 +2913,20 @@ class CParser ( Parser ):
             return CParser.RULE_identifier_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterIdentifier_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterIdentifier_list"):
                 listener.enterIdentifier_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitIdentifier_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitIdentifier_list"):
                 listener.exitIdentifier_list(self)
 
-
-
-
     def identifier_list(self):
 
         localctx = CParser.Identifier_listContext(self, self._ctx, self.state)
         self.enterRule(localctx, 56, self.RULE_identifier_list)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 441
@@ -3071,7 +2934,7 @@ class CParser ( Parser ):
             self.state = 446
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__3:
+            while _la == CParser.T__3:
                 self.state = 442
                 self.match(CParser.T__3)
                 self.state = 443
@@ -3092,47 +2955,41 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def specifier_qualifier_list(self):
-            return self.getTypedRuleContext(CParser.Specifier_qualifier_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Specifier_qualifier_listContext, 0)
 
         def abstract_declarator(self):
-            return self.getTypedRuleContext(CParser.Abstract_declaratorContext,0)
-
+            return self.getTypedRuleContext(CParser.Abstract_declaratorContext, 0)
 
         def type_id(self):
-            return self.getTypedRuleContext(CParser.Type_idContext,0)
-
+            return self.getTypedRuleContext(CParser.Type_idContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_type_name
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterType_name" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterType_name"):
                 listener.enterType_name(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitType_name" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitType_name"):
                 listener.exitType_name(self)
 
-
-
-
     def type_name(self):
 
         localctx = CParser.Type_nameContext(self, self._ctx, self.state)
         self.enterRule(localctx, 58, self.RULE_type_name)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.state = 454
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,54,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 54, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 449
@@ -3144,7 +3001,6 @@ class CParser ( Parser ):
                     self.state = 450
                     self.abstract_declarator()
 
-
                 pass
 
             elif la_ == 2:
@@ -3153,7 +3009,6 @@ class CParser ( Parser ):
                 self.type_id()
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -3166,37 +3021,33 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def pointer(self):
-            return self.getTypedRuleContext(CParser.PointerContext,0)
-
+            return self.getTypedRuleContext(CParser.PointerContext, 0)
 
         def direct_abstract_declarator(self):
-            return self.getTypedRuleContext(CParser.Direct_abstract_declaratorContext,0)
-
+            return self.getTypedRuleContext(CParser.Direct_abstract_declaratorContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_abstract_declarator
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterAbstract_declarator" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterAbstract_declarator"):
                 listener.enterAbstract_declarator(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitAbstract_declarator" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitAbstract_declarator"):
                 listener.exitAbstract_declarator(self)
 
-
-
-
     def abstract_declarator(self):
 
-        localctx = CParser.Abstract_declaratorContext(self, self._ctx, self.state)
+        localctx = CParser.Abstract_declaratorContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 60, self.RULE_abstract_declarator)
         try:
             self.state = 461
@@ -3208,12 +3059,11 @@ class CParser ( Parser ):
                 self.pointer()
                 self.state = 458
                 self._errHandler.sync(self)
-                la_ = self._interp.adaptivePredict(self._input,55,self._ctx)
+                la_ = self._interp.adaptivePredict(self._input, 55, self._ctx)
                 if la_ == 1:
                     self.state = 457
                     self.direct_abstract_declarator()
 
-
                 pass
             elif token in [CParser.T__37, CParser.T__39]:
                 self.enterOuterAlt(localctx, 2)
@@ -3235,46 +3085,43 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def abstract_declarator(self):
-            return self.getTypedRuleContext(CParser.Abstract_declaratorContext,0)
-
+            return self.getTypedRuleContext(CParser.Abstract_declaratorContext, 0)
 
         # @param  i=None Type: int
-        def abstract_declarator_suffix(self,i=None):
+        def abstract_declarator_suffix(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Abstract_declarator_suffixContext)
             else:
-                return self.getTypedRuleContext(CParser.Abstract_declarator_suffixContext,i)
-
+                return self.getTypedRuleContext(CParser.Abstract_declarator_suffixContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_direct_abstract_declarator
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterDirect_abstract_declarator" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterDirect_abstract_declarator"):
                 listener.enterDirect_abstract_declarator(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitDirect_abstract_declarator" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitDirect_abstract_declarator"):
                 listener.exitDirect_abstract_declarator(self)
 
-
-
     def direct_abstract_declarator(self):
 
-        localctx = CParser.Direct_abstract_declaratorContext(self, self._ctx, self.state)
+        localctx = CParser.Direct_abstract_declaratorContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 62, self.RULE_direct_abstract_declarator)
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 468
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,57,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 57, self._ctx)
             if la_ == 1:
                 self.state = 463
                 self.match(CParser.T__37)
@@ -3289,17 +3136,16 @@ class CParser ( Parser ):
                 self.abstract_declarator_suffix()
                 pass
 
-
             self.state = 473
             self._errHandler.sync(self)
-            _alt = self._interp.adaptivePredict(self._input,58,self._ctx)
-            while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
-                if _alt==1:
+            _alt = self._interp.adaptivePredict(self._input, 58, self._ctx)
+            while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
+                if _alt == 1:
                     self.state = 470
                     self.abstract_declarator_suffix()
                 self.state = 475
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,58,self._ctx)
+                _alt = self._interp.adaptivePredict(self._input, 58, self._ctx)
 
         except RecognitionException as re:
             localctx.exception = re
@@ -3313,42 +3159,38 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def constant_expression(self):
-            return self.getTypedRuleContext(CParser.Constant_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Constant_expressionContext, 0)
 
         def parameter_type_list(self):
-            return self.getTypedRuleContext(CParser.Parameter_type_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Parameter_type_listContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_abstract_declarator_suffix
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterAbstract_declarator_suffix" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterAbstract_declarator_suffix"):
                 listener.enterAbstract_declarator_suffix(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitAbstract_declarator_suffix" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitAbstract_declarator_suffix"):
                 listener.exitAbstract_declarator_suffix(self)
 
-
-
-
     def abstract_declarator_suffix(self):
 
-        localctx = CParser.Abstract_declarator_suffixContext(self, self._ctx, self.state)
+        localctx = CParser.Abstract_declarator_suffixContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 64, self.RULE_abstract_declarator_suffix)
         try:
             self.state = 488
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,59,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 59, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 476
@@ -3385,7 +3227,6 @@ class CParser ( Parser ):
                 self.match(CParser.T__38)
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -3398,39 +3239,34 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def assignment_expression(self):
-            return self.getTypedRuleContext(CParser.Assignment_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Assignment_expressionContext, 0)
 
         def initializer_list(self):
-            return self.getTypedRuleContext(CParser.Initializer_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Initializer_listContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_initializer
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterInitializer" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterInitializer"):
                 listener.enterInitializer(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitInitializer" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitInitializer"):
                 listener.exitInitializer(self)
 
-
-
-
     def initializer(self):
 
         localctx = CParser.InitializerContext(self, self._ctx, self.state)
         self.enterRule(localctx, 66, self.RULE_initializer)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.state = 498
             self._errHandler.sync(self)
@@ -3449,11 +3285,10 @@ class CParser ( Parser ):
                 self.state = 494
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__3:
+                if _la == CParser.T__3:
                     self.state = 493
                     self.match(CParser.T__3)
 
-
                 self.state = 496
                 self.match(CParser.T__19)
                 pass
@@ -3472,34 +3307,30 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def initializer(self,i=None):
+        def initializer(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.InitializerContext)
             else:
-                return self.getTypedRuleContext(CParser.InitializerContext,i)
-
+                return self.getTypedRuleContext(CParser.InitializerContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_initializer_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterInitializer_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterInitializer_list"):
                 listener.enterInitializer_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitInitializer_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitInitializer_list"):
                 listener.exitInitializer_list(self)
 
-
-
-
     def initializer_list(self):
 
         localctx = CParser.Initializer_listContext(self, self._ctx, self.state)
@@ -3510,16 +3341,16 @@ class CParser ( Parser ):
             self.initializer()
             self.state = 505
             self._errHandler.sync(self)
-            _alt = self._interp.adaptivePredict(self._input,62,self._ctx)
-            while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
-                if _alt==1:
+            _alt = self._interp.adaptivePredict(self._input, 62, self._ctx)
+            while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
+                if _alt == 1:
                     self.state = 501
                     self.match(CParser.T__3)
                     self.state = 502
                     self.initializer()
                 self.state = 507
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,62,self._ctx)
+                _alt = self._interp.adaptivePredict(self._input, 62, self._ctx)
 
         except RecognitionException as re:
             localctx.exception = re
@@ -3533,39 +3364,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def assignment_expression(self,i=None):
+        def assignment_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Assignment_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.Assignment_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.Assignment_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_argument_expression_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterArgument_expression_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterArgument_expression_list"):
                 listener.enterArgument_expression_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitArgument_expression_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitArgument_expression_list"):
                 listener.exitArgument_expression_list(self)
 
-
-
-
     def argument_expression_list(self):
 
-        localctx = CParser.Argument_expression_listContext(self, self._ctx, self.state)
+        localctx = CParser.Argument_expression_listContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 70, self.RULE_argument_expression_list)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 508
@@ -3573,15 +3401,14 @@ class CParser ( Parser ):
             self.state = 510
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            if _la==CParser.T__28:
+            if _la == CParser.T__28:
                 self.state = 509
                 self.match(CParser.T__28)
 
-
             self.state = 519
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__3:
+            while _la == CParser.T__3:
                 self.state = 512
                 self.match(CParser.T__3)
                 self.state = 513
@@ -3589,11 +3416,10 @@ class CParser ( Parser ):
                 self.state = 515
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__28:
+                if _la == CParser.T__28:
                     self.state = 514
                     self.match(CParser.T__28)
 
-
                 self.state = 521
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
@@ -3610,39 +3436,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def multiplicative_expression(self,i=None):
+        def multiplicative_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Multiplicative_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.Multiplicative_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.Multiplicative_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_additive_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterAdditive_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterAdditive_expression"):
                 listener.enterAdditive_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitAdditive_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitAdditive_expression"):
                 listener.exitAdditive_expression(self)
 
-
-
-
     def additive_expression(self):
 
-        localctx = CParser.Additive_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Additive_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 72, self.RULE_additive_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 522
@@ -3650,7 +3473,7 @@ class CParser ( Parser ):
             self.state = 529
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__43 or _la==CParser.T__44:
+            while _la == CParser.T__43 or _la == CParser.T__44:
                 self.state = 527
                 self._errHandler.sync(self)
                 token = self._input.LA(1)
@@ -3685,39 +3508,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def cast_expression(self,i=None):
+        def cast_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Cast_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.Cast_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.Cast_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_multiplicative_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterMultiplicative_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterMultiplicative_expression"):
                 listener.enterMultiplicative_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitMultiplicative_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitMultiplicative_expression"):
                 listener.exitMultiplicative_expression(self)
 
-
-
-
     def multiplicative_expression(self):
 
-        localctx = CParser.Multiplicative_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Multiplicative_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 74, self.RULE_multiplicative_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 532
@@ -3766,38 +3586,32 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def type_name(self):
-            return self.getTypedRuleContext(CParser.Type_nameContext,0)
-
+            return self.getTypedRuleContext(CParser.Type_nameContext, 0)
 
         def cast_expression(self):
-            return self.getTypedRuleContext(CParser.Cast_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Cast_expressionContext, 0)
 
         def unary_expression(self):
-            return self.getTypedRuleContext(CParser.Unary_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Unary_expressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_cast_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterCast_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterCast_expression"):
                 listener.enterCast_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitCast_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitCast_expression"):
                 listener.exitCast_expression(self)
 
-
-
-
     def cast_expression(self):
 
         localctx = CParser.Cast_expressionContext(self, self._ctx, self.state)
@@ -3805,7 +3619,7 @@ class CParser ( Parser ):
         try:
             self.state = 550
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,70,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 70, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 544
@@ -3824,7 +3638,6 @@ class CParser ( Parser ):
                 self.unary_expression()
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -3837,46 +3650,38 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def postfix_expression(self):
-            return self.getTypedRuleContext(CParser.Postfix_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Postfix_expressionContext, 0)
 
         def unary_expression(self):
-            return self.getTypedRuleContext(CParser.Unary_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Unary_expressionContext, 0)
 
         def unary_operator(self):
-            return self.getTypedRuleContext(CParser.Unary_operatorContext,0)
-
+            return self.getTypedRuleContext(CParser.Unary_operatorContext, 0)
 
         def cast_expression(self):
-            return self.getTypedRuleContext(CParser.Cast_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Cast_expressionContext, 0)
 
         def type_name(self):
-            return self.getTypedRuleContext(CParser.Type_nameContext,0)
-
+            return self.getTypedRuleContext(CParser.Type_nameContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_unary_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterUnary_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterUnary_expression"):
                 listener.enterUnary_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitUnary_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitUnary_expression"):
                 listener.exitUnary_expression(self)
 
-
-
-
     def unary_expression(self):
 
         localctx = CParser.Unary_expressionContext(self, self._ctx, self.state)
@@ -3884,7 +3689,7 @@ class CParser ( Parser ):
         try:
             self.state = 567
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,71,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 71, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 552
@@ -3935,7 +3740,6 @@ class CParser ( Parser ):
                 self.match(CParser.T__38)
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -3948,48 +3752,44 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
             self.FuncCallText = ''
-            self.p = None # Primary_expressionContext
-            self.a = None # Token
-            self.c = None # Argument_expression_listContext
-            self.b = None # Token
-            self.x = None # Token
-            self.y = None # Token
-            self.z = None # Token
+            self.p = None  # Primary_expressionContext
+            self.a = None  # Token
+            self.c = None  # Argument_expression_listContext
+            self.b = None  # Token
+            self.x = None  # Token
+            self.y = None  # Token
+            self.z = None  # Token
 
         def primary_expression(self):
-            return self.getTypedRuleContext(CParser.Primary_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Primary_expressionContext, 0)
 
         # @param  i=None Type: int
-        def expression(self,i=None):
+        def expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.ExpressionContext)
             else:
-                return self.getTypedRuleContext(CParser.ExpressionContext,i)
-
+                return self.getTypedRuleContext(CParser.ExpressionContext, i)
 
         # @param  i=None Type: int
-        def macro_parameter_list(self,i=None):
+        def macro_parameter_list(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Macro_parameter_listContext)
             else:
-                return self.getTypedRuleContext(CParser.Macro_parameter_listContext,i)
-
+                return self.getTypedRuleContext(CParser.Macro_parameter_listContext, i)
 
         # @param  i=None Type: int
-        def argument_expression_list(self,i=None):
+        def argument_expression_list(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Argument_expression_listContext)
             else:
-                return self.getTypedRuleContext(CParser.Argument_expression_listContext,i)
-
+                return self.getTypedRuleContext(CParser.Argument_expression_listContext, i)
 
         # @param  i=None Type: int
-        def IDENTIFIER(self,i=None):
+        def IDENTIFIER(self, i=None):
             if i is None:
                 return self.getTokens(CParser.IDENTIFIER)
             else:
@@ -3999,38 +3799,38 @@ class CParser ( Parser ):
             return CParser.RULE_postfix_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterPostfix_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterPostfix_expression"):
                 listener.enterPostfix_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitPostfix_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitPostfix_expression"):
                 listener.exitPostfix_expression(self)
 
-
-
-
     def postfix_expression(self):
 
-        localctx = CParser.Postfix_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Postfix_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 80, self.RULE_postfix_expression)
 
-        self.FuncCallText=''
+        self.FuncCallText = ''
 
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 569
             localctx.p = self.primary_expression()
-            self.FuncCallText += (None if localctx.p is None else self._input.getText((localctx.p.start,localctx.p.stop)))
+            self.FuncCallText += (None if localctx.p is None else self._input.getText(
+                (localctx.p.start, localctx.p.stop)))
             self.state = 600
             self._errHandler.sync(self)
-            _alt = self._interp.adaptivePredict(self._input,73,self._ctx)
-            while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
-                if _alt==1:
+            _alt = self._interp.adaptivePredict(self._input, 73, self._ctx)
+            while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
+                if _alt == 1:
                     self.state = 598
                     self._errHandler.sync(self)
-                    la_ = self._interp.adaptivePredict(self._input,72,self._ctx)
+                    la_ = self._interp.adaptivePredict(
+                        self._input, 72, self._ctx)
                     if la_ == 1:
                         self.state = 571
                         self.match(CParser.T__39)
@@ -4045,7 +3845,8 @@ class CParser ( Parser ):
                         self.match(CParser.T__37)
                         self.state = 576
                         localctx.a = self.match(CParser.T__38)
-                        self.StoreFunctionCalling((None if localctx.p is None else localctx.p.start).line, (None if localctx.p is None else localctx.p.start).column, (0 if localctx.a is None else localctx.a.line), localctx.a.column, self.FuncCallText, '')
+                        self.StoreFunctionCalling((None if localctx.p is None else localctx.p.start).line, (None if localctx.p is None else localctx.p.start).column, (
+                            0 if localctx.a is None else localctx.a.line), localctx.a.column, self.FuncCallText, '')
                         pass
 
                     elif la_ == 3:
@@ -4055,7 +3856,8 @@ class CParser ( Parser ):
                         localctx.c = self.argument_expression_list()
                         self.state = 580
                         localctx.b = self.match(CParser.T__38)
-                        self.StoreFunctionCalling((None if localctx.p is None else localctx.p.start).line, (None if localctx.p is None else localctx.p.start).column, (0 if localctx.b is None else localctx.b.line), localctx.b.column, self.FuncCallText, (None if localctx.c is None else self._input.getText((localctx.c.start,localctx.c.stop))))
+                        self.StoreFunctionCalling((None if localctx.p is None else localctx.p.start).line, (None if localctx.p is None else localctx.p.start).column, (
+                            0 if localctx.b is None else localctx.b.line), localctx.b.column, self.FuncCallText, (None if localctx.c is None else self._input.getText((localctx.c.start, localctx.c.stop))))
                         pass
 
                     elif la_ == 4:
@@ -4072,7 +3874,8 @@ class CParser ( Parser ):
                         self.match(CParser.T__50)
                         self.state = 588
                         localctx.x = self.match(CParser.IDENTIFIER)
-                        self.FuncCallText += '.' + (None if localctx.x is None else localctx.x.text)
+                        self.FuncCallText += '.' + \
+                            (None if localctx.x is None else localctx.x.text)
                         pass
 
                     elif la_ == 6:
@@ -4080,7 +3883,8 @@ class CParser ( Parser ):
                         self.match(CParser.T__41)
                         self.state = 591
                         localctx.y = self.match(CParser.IDENTIFIER)
-                        self.FuncCallText = (None if localctx.y is None else localctx.y.text)
+                        self.FuncCallText = (
+                            None if localctx.y is None else localctx.y.text)
                         pass
 
                     elif la_ == 7:
@@ -4088,7 +3892,8 @@ class CParser ( Parser ):
                         self.match(CParser.T__51)
                         self.state = 594
                         localctx.z = self.match(CParser.IDENTIFIER)
-                        self.FuncCallText += '->' + (None if localctx.z is None else localctx.z.text)
+                        self.FuncCallText += '->' + \
+                            (None if localctx.z is None else localctx.z.text)
                         pass
 
                     elif la_ == 8:
@@ -4101,10 +3906,9 @@ class CParser ( Parser ):
                         self.match(CParser.T__48)
                         pass
 
-
                 self.state = 602
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,73,self._ctx)
+                _alt = self._interp.adaptivePredict(self._input, 73, self._ctx)
 
         except RecognitionException as re:
             localctx.exception = re
@@ -4118,39 +3922,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def parameter_declaration(self,i=None):
+        def parameter_declaration(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Parameter_declarationContext)
             else:
-                return self.getTypedRuleContext(CParser.Parameter_declarationContext,i)
-
+                return self.getTypedRuleContext(CParser.Parameter_declarationContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_macro_parameter_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterMacro_parameter_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterMacro_parameter_list"):
                 listener.enterMacro_parameter_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitMacro_parameter_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitMacro_parameter_list"):
                 listener.exitMacro_parameter_list(self)
 
-
-
-
     def macro_parameter_list(self):
 
-        localctx = CParser.Macro_parameter_listContext(self, self._ctx, self.state)
+        localctx = CParser.Macro_parameter_listContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 82, self.RULE_macro_parameter_list)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 603
@@ -4158,7 +3959,7 @@ class CParser ( Parser ):
             self.state = 608
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__3:
+            while _la == CParser.T__3:
                 self.state = 604
                 self.match(CParser.T__3)
                 self.state = 605
@@ -4179,32 +3980,28 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
-
         def getRuleIndex(self):
             return CParser.RULE_unary_operator
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterUnary_operator" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterUnary_operator"):
                 listener.enterUnary_operator(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitUnary_operator" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitUnary_operator"):
                 listener.exitUnary_operator(self)
 
-
-
-
     def unary_operator(self):
 
         localctx = CParser.Unary_operatorContext(self, self._ctx, self.state)
         self.enterRule(localctx, 84, self.RULE_unary_operator)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 611
@@ -4226,7 +4023,7 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
@@ -4234,37 +4031,33 @@ class CParser ( Parser ):
             return self.getToken(CParser.IDENTIFIER, 0)
 
         def constant(self):
-            return self.getTypedRuleContext(CParser.ConstantContext,0)
-
+            return self.getTypedRuleContext(CParser.ConstantContext, 0)
 
         def expression(self):
-            return self.getTypedRuleContext(CParser.ExpressionContext,0)
-
+            return self.getTypedRuleContext(CParser.ExpressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_primary_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterPrimary_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterPrimary_expression"):
                 listener.enterPrimary_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitPrimary_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitPrimary_expression"):
                 listener.exitPrimary_expression(self)
 
-
-
-
     def primary_expression(self):
 
-        localctx = CParser.Primary_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Primary_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 86, self.RULE_primary_expression)
         try:
             self.state = 619
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,75,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 75, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 613
@@ -4287,7 +4080,6 @@ class CParser ( Parser ):
                 self.match(CParser.T__38)
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -4300,7 +4092,7 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
@@ -4317,14 +4109,14 @@ class CParser ( Parser ):
             return self.getToken(CParser.CHARACTER_LITERAL, 0)
 
         # @param  i=None Type: int
-        def IDENTIFIER(self,i=None):
+        def IDENTIFIER(self, i=None):
             if i is None:
                 return self.getTokens(CParser.IDENTIFIER)
             else:
                 return self.getToken(CParser.IDENTIFIER, i)
 
         # @param  i=None Type: int
-        def STRING_LITERAL(self,i=None):
+        def STRING_LITERAL(self, i=None):
             if i is None:
                 return self.getTokens(CParser.STRING_LITERAL)
             else:
@@ -4337,23 +4129,20 @@ class CParser ( Parser ):
             return CParser.RULE_constant
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterConstant" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterConstant"):
                 listener.enterConstant(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitConstant" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitConstant"):
                 listener.exitConstant(self)
 
-
-
-
     def constant(self):
 
         localctx = CParser.ConstantContext(self, self._ctx, self.state)
         self.enterRule(localctx, 88, self.RULE_constant)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.state = 647
             self._errHandler.sync(self)
@@ -4383,12 +4172,12 @@ class CParser ( Parser ):
                 self.state = 636
                 self._errHandler.sync(self)
                 _alt = 1
-                while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
+                while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
                     if _alt == 1:
                         self.state = 628
                         self._errHandler.sync(self)
                         _la = self._input.LA(1)
-                        while _la==CParser.IDENTIFIER:
+                        while _la == CParser.IDENTIFIER:
                             self.state = 625
                             self.match(CParser.IDENTIFIER)
                             self.state = 630
@@ -4398,7 +4187,7 @@ class CParser ( Parser ):
                         self.state = 632
                         self._errHandler.sync(self)
                         _alt = 1
-                        while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
+                        while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
                             if _alt == 1:
                                 self.state = 631
                                 self.match(CParser.STRING_LITERAL)
@@ -4407,19 +4196,20 @@ class CParser ( Parser ):
                                 raise NoViableAltException(self)
                             self.state = 634
                             self._errHandler.sync(self)
-                            _alt = self._interp.adaptivePredict(self._input,77,self._ctx)
-
+                            _alt = self._interp.adaptivePredict(
+                                self._input, 77, self._ctx)
 
                     else:
                         raise NoViableAltException(self)
                     self.state = 638
                     self._errHandler.sync(self)
-                    _alt = self._interp.adaptivePredict(self._input,78,self._ctx)
+                    _alt = self._interp.adaptivePredict(
+                        self._input, 78, self._ctx)
 
                 self.state = 643
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                while _la==CParser.IDENTIFIER:
+                while _la == CParser.IDENTIFIER:
                     self.state = 640
                     self.match(CParser.IDENTIFIER)
                     self.state = 645
@@ -4447,39 +4237,35 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def assignment_expression(self,i=None):
+        def assignment_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Assignment_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.Assignment_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.Assignment_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterExpression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterExpression"):
                 listener.enterExpression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitExpression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitExpression"):
                 listener.exitExpression(self)
 
-
-
-
     def expression(self):
 
         localctx = CParser.ExpressionContext(self, self._ctx, self.state)
         self.enterRule(localctx, 90, self.RULE_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 649
@@ -4487,7 +4273,7 @@ class CParser ( Parser ):
             self.state = 654
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__3:
+            while _la == CParser.T__3:
                 self.state = 650
                 self.match(CParser.T__3)
                 self.state = 651
@@ -4508,33 +4294,30 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def conditional_expression(self):
-            return self.getTypedRuleContext(CParser.Conditional_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Conditional_expressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_constant_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterConstant_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterConstant_expression"):
                 listener.enterConstant_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitConstant_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitConstant_expression"):
                 listener.exitConstant_expression(self)
 
-
-
-
     def constant_expression(self):
 
-        localctx = CParser.Constant_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Constant_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 92, self.RULE_constant_expression)
         try:
             self.enterOuterAlt(localctx, 1)
@@ -4552,50 +4335,44 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def lvalue(self):
-            return self.getTypedRuleContext(CParser.LvalueContext,0)
-
+            return self.getTypedRuleContext(CParser.LvalueContext, 0)
 
         def assignment_operator(self):
-            return self.getTypedRuleContext(CParser.Assignment_operatorContext,0)
-
+            return self.getTypedRuleContext(CParser.Assignment_operatorContext, 0)
 
         def assignment_expression(self):
-            return self.getTypedRuleContext(CParser.Assignment_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Assignment_expressionContext, 0)
 
         def conditional_expression(self):
-            return self.getTypedRuleContext(CParser.Conditional_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Conditional_expressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_assignment_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterAssignment_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterAssignment_expression"):
                 listener.enterAssignment_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitAssignment_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitAssignment_expression"):
                 listener.exitAssignment_expression(self)
 
-
-
-
     def assignment_expression(self):
 
-        localctx = CParser.Assignment_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Assignment_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 94, self.RULE_assignment_expression)
         try:
             self.state = 664
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,82,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 82, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 659
@@ -4612,7 +4389,6 @@ class CParser ( Parser ):
                 self.conditional_expression()
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -4625,30 +4401,26 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def unary_expression(self):
-            return self.getTypedRuleContext(CParser.Unary_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Unary_expressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_lvalue
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterLvalue" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterLvalue"):
                 listener.enterLvalue(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitLvalue" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitLvalue"):
                 listener.exitLvalue(self)
 
-
-
-
     def lvalue(self):
 
         localctx = CParser.LvalueContext(self, self._ctx, self.state)
@@ -4669,32 +4441,29 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
-
         def getRuleIndex(self):
             return CParser.RULE_assignment_operator
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterAssignment_operator" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterAssignment_operator"):
                 listener.enterAssignment_operator(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitAssignment_operator" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitAssignment_operator"):
                 listener.exitAssignment_operator(self)
 
-
-
-
     def assignment_operator(self):
 
-        localctx = CParser.Assignment_operatorContext(self, self._ctx, self.state)
+        localctx = CParser.Assignment_operatorContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 98, self.RULE_assignment_operator)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 668
@@ -4716,44 +4485,39 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
-            self.e = None # Logical_or_expressionContext
+            self.e = None  # Logical_or_expressionContext
 
         def logical_or_expression(self):
-            return self.getTypedRuleContext(CParser.Logical_or_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Logical_or_expressionContext, 0)
 
         def expression(self):
-            return self.getTypedRuleContext(CParser.ExpressionContext,0)
-
+            return self.getTypedRuleContext(CParser.ExpressionContext, 0)
 
         def conditional_expression(self):
-            return self.getTypedRuleContext(CParser.Conditional_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Conditional_expressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_conditional_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterConditional_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterConditional_expression"):
                 listener.enterConditional_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitConditional_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitConditional_expression"):
                 listener.exitConditional_expression(self)
 
-
-
-
     def conditional_expression(self):
 
-        localctx = CParser.Conditional_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Conditional_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 100, self.RULE_conditional_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 670
@@ -4761,7 +4525,7 @@ class CParser ( Parser ):
             self.state = 677
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            if _la==CParser.T__65:
+            if _la == CParser.T__65:
                 self.state = 671
                 self.match(CParser.T__65)
                 self.state = 672
@@ -4770,8 +4534,8 @@ class CParser ( Parser ):
                 self.match(CParser.T__22)
                 self.state = 674
                 self.conditional_expression()
-                self.StorePredicateExpression((None if localctx.e is None else localctx.e.start).line, (None if localctx.e is None else localctx.e.start).column, (None if localctx.e is None else localctx.e.stop).line, (None if localctx.e is None else localctx.e.stop).column, (None if localctx.e is None else self._input.getText((localctx.e.start,localctx.e.stop))))
-
+                self.StorePredicateExpression((None if localctx.e is None else localctx.e.start).line, (None if localctx.e is None else localctx.e.start).column, (None if localctx.e is None else localctx.e.stop).line, (
+                    None if localctx.e is None else localctx.e.stop).column, (None if localctx.e is None else self._input.getText((localctx.e.start, localctx.e.stop))))
 
         except RecognitionException as re:
             localctx.exception = re
@@ -4785,39 +4549,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def logical_and_expression(self,i=None):
+        def logical_and_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Logical_and_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.Logical_and_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.Logical_and_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_logical_or_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterLogical_or_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterLogical_or_expression"):
                 listener.enterLogical_or_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitLogical_or_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitLogical_or_expression"):
                 listener.exitLogical_or_expression(self)
 
-
-
-
     def logical_or_expression(self):
 
-        localctx = CParser.Logical_or_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Logical_or_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 102, self.RULE_logical_or_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 679
@@ -4825,7 +4586,7 @@ class CParser ( Parser ):
             self.state = 684
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__66:
+            while _la == CParser.T__66:
                 self.state = 680
                 self.match(CParser.T__66)
                 self.state = 681
@@ -4846,39 +4607,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def inclusive_or_expression(self,i=None):
+        def inclusive_or_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Inclusive_or_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.Inclusive_or_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.Inclusive_or_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_logical_and_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterLogical_and_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterLogical_and_expression"):
                 listener.enterLogical_and_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitLogical_and_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitLogical_and_expression"):
                 listener.exitLogical_and_expression(self)
 
-
-
-
     def logical_and_expression(self):
 
-        localctx = CParser.Logical_and_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Logical_and_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 104, self.RULE_logical_and_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 687
@@ -4886,7 +4644,7 @@ class CParser ( Parser ):
             self.state = 692
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__67:
+            while _la == CParser.T__67:
                 self.state = 688
                 self.match(CParser.T__67)
                 self.state = 689
@@ -4907,39 +4665,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def exclusive_or_expression(self,i=None):
+        def exclusive_or_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Exclusive_or_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.Exclusive_or_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.Exclusive_or_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_inclusive_or_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterInclusive_or_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterInclusive_or_expression"):
                 listener.enterInclusive_or_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitInclusive_or_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitInclusive_or_expression"):
                 listener.exitInclusive_or_expression(self)
 
-
-
-
     def inclusive_or_expression(self):
 
-        localctx = CParser.Inclusive_or_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Inclusive_or_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 106, self.RULE_inclusive_or_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 695
@@ -4947,7 +4702,7 @@ class CParser ( Parser ):
             self.state = 700
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__68:
+            while _la == CParser.T__68:
                 self.state = 696
                 self.match(CParser.T__68)
                 self.state = 697
@@ -4968,39 +4723,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def and_expression(self,i=None):
+        def and_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.And_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.And_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.And_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_exclusive_or_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterExclusive_or_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterExclusive_or_expression"):
                 listener.enterExclusive_or_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitExclusive_or_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitExclusive_or_expression"):
                 listener.exitExclusive_or_expression(self)
 
-
-
-
     def exclusive_or_expression(self):
 
-        localctx = CParser.Exclusive_or_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Exclusive_or_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 108, self.RULE_exclusive_or_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 703
@@ -5008,7 +4760,7 @@ class CParser ( Parser ):
             self.state = 708
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__69:
+            while _la == CParser.T__69:
                 self.state = 704
                 self.match(CParser.T__69)
                 self.state = 705
@@ -5029,39 +4781,35 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def equality_expression(self,i=None):
+        def equality_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Equality_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.Equality_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.Equality_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_and_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterAnd_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterAnd_expression"):
                 listener.enterAnd_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitAnd_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitAnd_expression"):
                 listener.exitAnd_expression(self)
 
-
-
-
     def and_expression(self):
 
         localctx = CParser.And_expressionContext(self, self._ctx, self.state)
         self.enterRule(localctx, 110, self.RULE_and_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 711
@@ -5069,7 +4817,7 @@ class CParser ( Parser ):
             self.state = 716
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__52:
+            while _la == CParser.T__52:
                 self.state = 712
                 self.match(CParser.T__52)
                 self.state = 713
@@ -5090,39 +4838,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def relational_expression(self,i=None):
+        def relational_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Relational_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.Relational_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.Relational_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_equality_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterEquality_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterEquality_expression"):
                 listener.enterEquality_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitEquality_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitEquality_expression"):
                 listener.exitEquality_expression(self)
 
-
-
-
     def equality_expression(self):
 
-        localctx = CParser.Equality_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Equality_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 112, self.RULE_equality_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 719
@@ -5130,10 +4875,10 @@ class CParser ( Parser ):
             self.state = 724
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__70 or _la==CParser.T__71:
+            while _la == CParser.T__70 or _la == CParser.T__71:
                 self.state = 720
                 _la = self._input.LA(1)
-                if not(_la==CParser.T__70 or _la==CParser.T__71):
+                if not(_la == CParser.T__70 or _la == CParser.T__71):
                     self._errHandler.recoverInline(self)
                 else:
                     self._errHandler.reportMatch(self)
@@ -5156,39 +4901,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def shift_expression(self,i=None):
+        def shift_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Shift_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.Shift_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.Shift_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_relational_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterRelational_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterRelational_expression"):
                 listener.enterRelational_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitRelational_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitRelational_expression"):
                 listener.exitRelational_expression(self)
 
-
-
-
     def relational_expression(self):
 
-        localctx = CParser.Relational_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Relational_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 114, self.RULE_relational_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 727
@@ -5222,39 +4964,35 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def additive_expression(self,i=None):
+        def additive_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Additive_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.Additive_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.Additive_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_shift_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterShift_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterShift_expression"):
                 listener.enterShift_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitShift_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitShift_expression"):
                 listener.exitShift_expression(self)
 
-
-
-
     def shift_expression(self):
 
         localctx = CParser.Shift_expressionContext(self, self._ctx, self.state)
         self.enterRule(localctx, 116, self.RULE_shift_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 735
@@ -5262,10 +5000,10 @@ class CParser ( Parser ):
             self.state = 740
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__76 or _la==CParser.T__77:
+            while _la == CParser.T__76 or _la == CParser.T__77:
                 self.state = 736
                 _la = self._input.LA(1)
-                if not(_la==CParser.T__76 or _la==CParser.T__77):
+                if not(_la == CParser.T__76 or _la == CParser.T__77):
                     self._errHandler.recoverInline(self)
                 else:
                     self._errHandler.reportMatch(self)
@@ -5288,70 +5026,56 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def labeled_statement(self):
-            return self.getTypedRuleContext(CParser.Labeled_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Labeled_statementContext, 0)
 
         def compound_statement(self):
-            return self.getTypedRuleContext(CParser.Compound_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Compound_statementContext, 0)
 
         def expression_statement(self):
-            return self.getTypedRuleContext(CParser.Expression_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Expression_statementContext, 0)
 
         def selection_statement(self):
-            return self.getTypedRuleContext(CParser.Selection_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Selection_statementContext, 0)
 
         def iteration_statement(self):
-            return self.getTypedRuleContext(CParser.Iteration_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Iteration_statementContext, 0)
 
         def jump_statement(self):
-            return self.getTypedRuleContext(CParser.Jump_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Jump_statementContext, 0)
 
         def macro_statement(self):
-            return self.getTypedRuleContext(CParser.Macro_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Macro_statementContext, 0)
 
         def asm2_statement(self):
-            return self.getTypedRuleContext(CParser.Asm2_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Asm2_statementContext, 0)
 
         def asm1_statement(self):
-            return self.getTypedRuleContext(CParser.Asm1_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Asm1_statementContext, 0)
 
         def asm_statement(self):
-            return self.getTypedRuleContext(CParser.Asm_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Asm_statementContext, 0)
 
         def declaration(self):
-            return self.getTypedRuleContext(CParser.DeclarationContext,0)
-
+            return self.getTypedRuleContext(CParser.DeclarationContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_statement
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterStatement" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterStatement"):
                 listener.enterStatement(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitStatement" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitStatement"):
                 listener.exitStatement(self)
 
-
-
-
     def statement(self):
 
         localctx = CParser.StatementContext(self, self._ctx, self.state)
@@ -5359,7 +5083,7 @@ class CParser ( Parser ):
         try:
             self.state = 754
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,92,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 92, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 743
@@ -5426,7 +5150,6 @@ class CParser ( Parser ):
                 self.declaration()
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -5439,7 +5162,7 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
@@ -5450,52 +5173,48 @@ class CParser ( Parser ):
             return CParser.RULE_asm2_statement
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterAsm2_statement" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterAsm2_statement"):
                 listener.enterAsm2_statement(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitAsm2_statement" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitAsm2_statement"):
                 listener.exitAsm2_statement(self)
 
-
-
-
     def asm2_statement(self):
 
         localctx = CParser.Asm2_statementContext(self, self._ctx, self.state)
         self.enterRule(localctx, 120, self.RULE_asm2_statement)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 757
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            if _la==CParser.T__78:
+            if _la == CParser.T__78:
                 self.state = 756
                 self.match(CParser.T__78)
 
-
             self.state = 759
             self.match(CParser.IDENTIFIER)
             self.state = 760
             self.match(CParser.T__37)
             self.state = 764
             self._errHandler.sync(self)
-            _alt = self._interp.adaptivePredict(self._input,94,self._ctx)
-            while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
-                if _alt==1:
+            _alt = self._interp.adaptivePredict(self._input, 94, self._ctx)
+            while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
+                if _alt == 1:
                     self.state = 761
                     _la = self._input.LA(1)
-                    if _la <= 0 or _la==CParser.T__1:
+                    if _la <= 0 or _la == CParser.T__1:
                         self._errHandler.recoverInline(self)
                     else:
                         self._errHandler.reportMatch(self)
                         self.consume()
                 self.state = 766
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,94,self._ctx)
+                _alt = self._interp.adaptivePredict(self._input, 94, self._ctx)
 
             self.state = 767
             self.match(CParser.T__38)
@@ -5513,32 +5232,28 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
-
         def getRuleIndex(self):
             return CParser.RULE_asm1_statement
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterAsm1_statement" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterAsm1_statement"):
                 listener.enterAsm1_statement(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitAsm1_statement" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitAsm1_statement"):
                 listener.exitAsm1_statement(self)
 
-
-
-
     def asm1_statement(self):
 
         localctx = CParser.Asm1_statementContext(self, self._ctx, self.state)
         self.enterRule(localctx, 122, self.RULE_asm1_statement)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 770
@@ -5551,7 +5266,7 @@ class CParser ( Parser ):
             while (((_la) & ~0x3f) == 0 and ((1 << _la) & ((1 << CParser.T__0) | (1 << CParser.T__1) | (1 << CParser.T__2) | (1 << CParser.T__3) | (1 << CParser.T__4) | (1 << CParser.T__5) | (1 << CParser.T__6) | (1 << CParser.T__7) | (1 << CParser.T__8) | (1 << CParser.T__9) | (1 << CParser.T__10) | (1 << CParser.T__11) | (1 << CParser.T__12) | (1 << CParser.T__13) | (1 << CParser.T__14) | (1 << CParser.T__15) | (1 << CParser.T__16) | (1 << CParser.T__17) | (1 << CParser.T__18) | (1 << CParser.T__20) | (1 << CParser.T__21) | (1 << CParser.T__22) | (1 << CParser.T__23) | (1 << CParser.T__24) | (1 << CParser.T__25) | (1 << CParser.T__26) | (1 << CParser.T__27) | (1 << CParser.T__28) | (1 << CParser.T__29) | (1 << CParser.T__30) | (1 << CParser.T__31) | (1 << CParser.T__32) | (1 << CParser.T__33) | (1 << CParser.T__34) | (1 << CParser.T__35) | (1 << CParser.T__36) | (1 << CParser.T__37) | (1 << CParser.T__38) | (1 << CParser.T__39) | (1 << CParser.T__40) | (1 << CParser.T__41) | (1 << CParser.T__42) | (1 << CParser.T__43) | (1 << CParser.T__44) | (1 << CParser.T__45) | (1 << CParser.T__46) | (1 << CParser.T__47) | (1 << CParser.T__48) | (1 << CParser.T__49) | (1 << CParser.T__50) | (1 << CParser.T__51) | (1 << CParser.T__52) | (1 << CParser.T__53) | (1 << CParser.T__54) | (1 << CParser.T__55) | (1 << CParser.T__56) | (1 << CParser.T__57) | (1 << CParser.T__58) | (1 << CParser.T__59) | (1 << CParser.T__60) | (1 << CParser.T__61) | (1 << CParser.T__62))) != 0) or ((((_la - 64)) & ~0x3f) == 0 and ((1 << (_la - 64)) & ((1 << (CParser.T__63 - 64)) | (1 << (CParser.T__64 - 64)) | (1 << (CParser.T__65 - 64)) | (1 << (CParser.T__66 - 64)) | (1 << (CParser.T__67 - 64)) | (1 << (CParser.T__68 - 64)) | (1 << (CParser.T__69 - 64)) | (1 << (CParser.T__70 - 64)) | (1 << (CParser.T__71 - 64)) | (1 << (CParser.T__72 - 64)) | (1 << (CParser.T__73 - 64)) | (1 << (CParser.T__74 - 64)) | (1 << (CParser.T__75 - 64)) | (1 << (CParser.T__76 - 64)) | (1 << (CParser.T__77 - 64)) | (1 << (CParser.T__78 - 64)) | (1 << (CParser.T__79 - 64)) | (1 << (CParser.T__80 - 64)) | (1 << (CParser.T__81 - 64)) | (1 << (CParser.T__82 - 64)) | (1 << (CParser.T__83 - 64)) | (1 << (CParser.T__84 - 64)) | (1 << (CParser.T__85 - 64)) | (1 << (CParser.T__86 - 64)) | (1 << (CParser.T__87 - 64)) | (1 << (CParser.T__88 - 64)) | (1 << (CParser.T__89 - 64)) | (1 << (CParser.T__90 - 64)) | (1 << (CParser.T__91 - 64)) | (1 << (CParser.IDENTIFIER - 64)) | (1 << (CParser.CHARACTER_LITERAL - 64)) | (1 << (CParser.STRING_LITERAL - 64)) | (1 << (CParser.HEX_LITERAL - 64)) | (1 << (CParser.DECIMAL_LITERAL - 64)) | (1 << (CParser.OCTAL_LITERAL - 64)) | (1 << (CParser.FLOATING_POINT_LITERAL - 64)) | (1 << (CParser.WS - 64)) | (1 << (CParser.BS - 64)) | (1 << (CParser.UnicodeVocabulary - 64)) | (1 << (CParser.COMMENT - 64)) | (1 << (CParser.LINE_COMMENT - 64)) | (1 << (CParser.LINE_COMMAND - 64)))) != 0):
                 self.state = 772
                 _la = self._input.LA(1)
-                if _la <= 0 or _la==CParser.T__19:
+                if _la <= 0 or _la == CParser.T__19:
                     self._errHandler.recoverInline(self)
                 else:
                     self._errHandler.reportMatch(self)
@@ -5574,32 +5289,28 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
-
         def getRuleIndex(self):
             return CParser.RULE_asm_statement
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterAsm_statement" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterAsm_statement"):
                 listener.enterAsm_statement(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitAsm_statement" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitAsm_statement"):
                 listener.exitAsm_statement(self)
 
-
-
-
     def asm_statement(self):
 
         localctx = CParser.Asm_statementContext(self, self._ctx, self.state)
         self.enterRule(localctx, 124, self.RULE_asm_statement)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 780
@@ -5612,7 +5323,7 @@ class CParser ( Parser ):
             while (((_la) & ~0x3f) == 0 and ((1 << _la) & ((1 << CParser.T__0) | (1 << CParser.T__1) | (1 << CParser.T__2) | (1 << CParser.T__3) | (1 << CParser.T__4) | (1 << CParser.T__5) | (1 << CParser.T__6) | (1 << CParser.T__7) | (1 << CParser.T__8) | (1 << CParser.T__9) | (1 << CParser.T__10) | (1 << CParser.T__11) | (1 << CParser.T__12) | (1 << CParser.T__13) | (1 << CParser.T__14) | (1 << CParser.T__15) | (1 << CParser.T__16) | (1 << CParser.T__17) | (1 << CParser.T__18) | (1 << CParser.T__20) | (1 << CParser.T__21) | (1 << CParser.T__22) | (1 << CParser.T__23) | (1 << CParser.T__24) | (1 << CParser.T__25) | (1 << CParser.T__26) | (1 << CParser.T__27) | (1 << CParser.T__28) | (1 << CParser.T__29) | (1 << CParser.T__30) | (1 << CParser.T__31) | (1 << CParser.T__32) | (1 << CParser.T__33) | (1 << CParser.T__34) | (1 << CParser.T__35) | (1 << CParser.T__36) | (1 << CParser.T__37) | (1 << CParser.T__38) | (1 << CParser.T__39) | (1 << CParser.T__40) | (1 << CParser.T__41) | (1 << CParser.T__42) | (1 << CParser.T__43) | (1 << CParser.T__44) | (1 << CParser.T__45) | (1 << CParser.T__46) | (1 << CParser.T__47) | (1 << CParser.T__48) | (1 << CParser.T__49) | (1 << CParser.T__50) | (1 << CParser.T__51) | (1 << CParser.T__52) | (1 << CParser.T__53) | (1 << CParser.T__54) | (1 << CParser.T__55) | (1 << CParser.T__56) | (1 << CParser.T__57) | (1 << CParser.T__58) | (1 << CParser.T__59) | (1 << CParser.T__60) | (1 << CParser.T__61) | (1 << CParser.T__62))) != 0) or ((((_la - 64)) & ~0x3f) == 0 and ((1 << (_la - 64)) & ((1 << (CParser.T__63 - 64)) | (1 << (CParser.T__64 - 64)) | (1 << (CParser.T__65 - 64)) | (1 << (CParser.T__66 - 64)) | (1 << (CParser.T__67 - 64)) | (1 << (CParser.T__68 - 64)) | (1 << (CParser.T__69 - 64)) | (1 << (CParser.T__70 - 64)) | (1 << (CParser.T__71 - 64)) | (1 << (CParser.T__72 - 64)) | (1 << (CParser.T__73 - 64)) | (1 << (CParser.T__74 - 64)) | (1 << (CParser.T__75 - 64)) | (1 << (CParser.T__76 - 64)) | (1 << (CParser.T__77 - 64)) | (1 << (CParser.T__78 - 64)) | (1 << (CParser.T__79 - 64)) | (1 << (CParser.T__80 - 64)) | (1 << (CParser.T__81 - 64)) | (1 << (CParser.T__82 - 64)) | (1 << (CParser.T__83 - 64)) | (1 << (CParser.T__84 - 64)) | (1 << (CParser.T__85 - 64)) | (1 << (CParser.T__86 - 64)) | (1 << (CParser.T__87 - 64)) | (1 << (CParser.T__88 - 64)) | (1 << (CParser.T__89 - 64)) | (1 << (CParser.T__90 - 64)) | (1 << (CParser.T__91 - 64)) | (1 << (CParser.IDENTIFIER - 64)) | (1 << (CParser.CHARACTER_LITERAL - 64)) | (1 << (CParser.STRING_LITERAL - 64)) | (1 << (CParser.HEX_LITERAL - 64)) | (1 << (CParser.DECIMAL_LITERAL - 64)) | (1 << (CParser.OCTAL_LITERAL - 64)) | (1 << (CParser.FLOATING_POINT_LITERAL - 64)) | (1 << (CParser.WS - 64)) | (1 << (CParser.BS - 64)) | (1 << (CParser.UnicodeVocabulary - 64)) | (1 << (CParser.COMMENT - 64)) | (1 << (CParser.LINE_COMMENT - 64)) | (1 << (CParser.LINE_COMMAND - 64)))) != 0):
                 self.state = 782
                 _la = self._input.LA(1)
-                if _la <= 0 or _la==CParser.T__19:
+                if _la <= 0 or _la == CParser.T__19:
                     self._errHandler.recoverInline(self)
                 else:
                     self._errHandler.reportMatch(self)
@@ -5635,7 +5346,7 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
@@ -5643,42 +5354,36 @@ class CParser ( Parser ):
             return self.getToken(CParser.IDENTIFIER, 0)
 
         # @param  i=None Type: int
-        def declaration(self,i=None):
+        def declaration(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.DeclarationContext)
             else:
-                return self.getTypedRuleContext(CParser.DeclarationContext,i)
-
+                return self.getTypedRuleContext(CParser.DeclarationContext, i)
 
         def statement_list(self):
-            return self.getTypedRuleContext(CParser.Statement_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Statement_listContext, 0)
 
         def expression(self):
-            return self.getTypedRuleContext(CParser.ExpressionContext,0)
-
+            return self.getTypedRuleContext(CParser.ExpressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_macro_statement
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterMacro_statement" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterMacro_statement"):
                 listener.enterMacro_statement(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitMacro_statement" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitMacro_statement"):
                 listener.exitMacro_statement(self)
 
-
-
-
     def macro_statement(self):
 
         localctx = CParser.Macro_statementContext(self, self._ctx, self.state)
         self.enterRule(localctx, 126, self.RULE_macro_statement)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 790
@@ -5687,23 +5392,22 @@ class CParser ( Parser ):
             self.match(CParser.T__37)
             self.state = 795
             self._errHandler.sync(self)
-            _alt = self._interp.adaptivePredict(self._input,97,self._ctx)
-            while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
-                if _alt==1:
+            _alt = self._interp.adaptivePredict(self._input, 97, self._ctx)
+            while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
+                if _alt == 1:
                     self.state = 792
                     self.declaration()
                 self.state = 797
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,97,self._ctx)
+                _alt = self._interp.adaptivePredict(self._input, 97, self._ctx)
 
             self.state = 799
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,98,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 98, self._ctx)
             if la_ == 1:
                 self.state = 798
                 self.statement_list()
 
-
             self.state = 802
             self._errHandler.sync(self)
             _la = self._input.LA(1)
@@ -5711,7 +5415,6 @@ class CParser ( Parser ):
                 self.state = 801
                 self.expression()
 
-
             self.state = 804
             self.match(CParser.T__38)
         except RecognitionException as re:
@@ -5726,7 +5429,7 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
@@ -5734,32 +5437,28 @@ class CParser ( Parser ):
             return self.getToken(CParser.IDENTIFIER, 0)
 
         def statement(self):
-            return self.getTypedRuleContext(CParser.StatementContext,0)
-
+            return self.getTypedRuleContext(CParser.StatementContext, 0)
 
         def constant_expression(self):
-            return self.getTypedRuleContext(CParser.Constant_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Constant_expressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_labeled_statement
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterLabeled_statement" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterLabeled_statement"):
                 listener.enterLabeled_statement(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitLabeled_statement" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitLabeled_statement"):
                 listener.exitLabeled_statement(self)
 
-
-
-
     def labeled_statement(self):
 
-        localctx = CParser.Labeled_statementContext(self, self._ctx, self.state)
+        localctx = CParser.Labeled_statementContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 128, self.RULE_labeled_statement)
         try:
             self.state = 817
@@ -5809,57 +5508,54 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def declaration(self,i=None):
+        def declaration(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.DeclarationContext)
             else:
-                return self.getTypedRuleContext(CParser.DeclarationContext,i)
-
+                return self.getTypedRuleContext(CParser.DeclarationContext, i)
 
         def statement_list(self):
-            return self.getTypedRuleContext(CParser.Statement_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Statement_listContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_compound_statement
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterCompound_statement" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterCompound_statement"):
                 listener.enterCompound_statement(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitCompound_statement" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitCompound_statement"):
                 listener.exitCompound_statement(self)
 
-
-
-
     def compound_statement(self):
 
-        localctx = CParser.Compound_statementContext(self, self._ctx, self.state)
+        localctx = CParser.Compound_statementContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 130, self.RULE_compound_statement)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 819
             self.match(CParser.T__0)
             self.state = 823
             self._errHandler.sync(self)
-            _alt = self._interp.adaptivePredict(self._input,101,self._ctx)
-            while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
-                if _alt==1:
+            _alt = self._interp.adaptivePredict(self._input, 101, self._ctx)
+            while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
+                if _alt == 1:
                     self.state = 820
                     self.declaration()
                 self.state = 825
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,101,self._ctx)
+                _alt = self._interp.adaptivePredict(
+                    self._input, 101, self._ctx)
 
             self.state = 827
             self._errHandler.sync(self)
@@ -5868,7 +5564,6 @@ class CParser ( Parser ):
                 self.state = 826
                 self.statement_list()
 
-
             self.state = 829
             self.match(CParser.T__19)
         except RecognitionException as re:
@@ -5883,34 +5578,30 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def statement(self,i=None):
+        def statement(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.StatementContext)
             else:
-                return self.getTypedRuleContext(CParser.StatementContext,i)
-
+                return self.getTypedRuleContext(CParser.StatementContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_statement_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterStatement_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterStatement_list"):
                 listener.enterStatement_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitStatement_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitStatement_list"):
                 listener.exitStatement_list(self)
 
-
-
-
     def statement_list(self):
 
         localctx = CParser.Statement_listContext(self, self._ctx, self.state)
@@ -5920,7 +5611,7 @@ class CParser ( Parser ):
             self.state = 832
             self._errHandler.sync(self)
             _alt = 1
-            while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
+            while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
                 if _alt == 1:
                     self.state = 831
                     self.statement()
@@ -5929,7 +5620,8 @@ class CParser ( Parser ):
                     raise NoViableAltException(self)
                 self.state = 834
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,103,self._ctx)
+                _alt = self._interp.adaptivePredict(
+                    self._input, 103, self._ctx)
 
         except RecognitionException as re:
             localctx.exception = re
@@ -5943,33 +5635,30 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def expression(self):
-            return self.getTypedRuleContext(CParser.ExpressionContext,0)
-
+            return self.getTypedRuleContext(CParser.ExpressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_expression_statement
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterExpression_statement" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterExpression_statement"):
                 listener.enterExpression_statement(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitExpression_statement" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitExpression_statement"):
                 listener.exitExpression_statement(self)
 
-
-
-
     def expression_statement(self):
 
-        localctx = CParser.Expression_statementContext(self, self._ctx, self.state)
+        localctx = CParser.Expression_statementContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 134, self.RULE_expression_statement)
         try:
             self.state = 840
@@ -6002,42 +5691,38 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
-            self.e = None # ExpressionContext
+            self.e = None  # ExpressionContext
 
         # @param  i=None Type: int
-        def statement(self,i=None):
+        def statement(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.StatementContext)
             else:
-                return self.getTypedRuleContext(CParser.StatementContext,i)
-
+                return self.getTypedRuleContext(CParser.StatementContext, i)
 
         def expression(self):
-            return self.getTypedRuleContext(CParser.ExpressionContext,0)
-
+            return self.getTypedRuleContext(CParser.ExpressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_selection_statement
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterSelection_statement" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterSelection_statement"):
                 listener.enterSelection_statement(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitSelection_statement" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitSelection_statement"):
                 listener.exitSelection_statement(self)
 
-
-
-
     def selection_statement(self):
 
-        localctx = CParser.Selection_statementContext(self, self._ctx, self.state)
+        localctx = CParser.Selection_statementContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 136, self.RULE_selection_statement)
         try:
             self.state = 858
@@ -6053,19 +5738,19 @@ class CParser ( Parser ):
                 localctx.e = self.expression()
                 self.state = 845
                 self.match(CParser.T__38)
-                self.StorePredicateExpression((None if localctx.e is None else localctx.e.start).line, (None if localctx.e is None else localctx.e.start).column, (None if localctx.e is None else localctx.e.stop).line, (None if localctx.e is None else localctx.e.stop).column, (None if localctx.e is None else self._input.getText((localctx.e.start,localctx.e.stop))))
+                self.StorePredicateExpression((None if localctx.e is None else localctx.e.start).line, (None if localctx.e is None else localctx.e.start).column, (None if localctx.e is None else localctx.e.stop).line, (
+                    None if localctx.e is None else localctx.e.stop).column, (None if localctx.e is None else self._input.getText((localctx.e.start, localctx.e.stop))))
                 self.state = 847
                 self.statement()
                 self.state = 850
                 self._errHandler.sync(self)
-                la_ = self._interp.adaptivePredict(self._input,105,self._ctx)
+                la_ = self._interp.adaptivePredict(self._input, 105, self._ctx)
                 if la_ == 1:
                     self.state = 848
                     self.match(CParser.T__84)
                     self.state = 849
                     self.statement()
 
-
                 pass
             elif token in [CParser.T__85]:
                 self.enterOuterAlt(localctx, 2)
@@ -6095,38 +5780,34 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
-            self.e = None # ExpressionContext
+            self.e = None  # ExpressionContext
 
         def statement(self):
-            return self.getTypedRuleContext(CParser.StatementContext,0)
-
+            return self.getTypedRuleContext(CParser.StatementContext, 0)
 
         def expression(self):
-            return self.getTypedRuleContext(CParser.ExpressionContext,0)
-
+            return self.getTypedRuleContext(CParser.ExpressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_iteration_statement
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterIteration_statement" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterIteration_statement"):
                 listener.enterIteration_statement(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitIteration_statement" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitIteration_statement"):
                 listener.exitIteration_statement(self)
 
-
-
-
     def iteration_statement(self):
 
-        localctx = CParser.Iteration_statementContext(self, self._ctx, self.state)
+        localctx = CParser.Iteration_statementContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 138, self.RULE_iteration_statement)
         try:
             self.state = 876
@@ -6144,7 +5825,8 @@ class CParser ( Parser ):
                 self.match(CParser.T__38)
                 self.state = 864
                 self.statement()
-                self.StorePredicateExpression((None if localctx.e is None else localctx.e.start).line, (None if localctx.e is None else localctx.e.start).column, (None if localctx.e is None else localctx.e.stop).line, (None if localctx.e is None else localctx.e.stop).column, (None if localctx.e is None else self._input.getText((localctx.e.start,localctx.e.stop))))
+                self.StorePredicateExpression((None if localctx.e is None else localctx.e.start).line, (None if localctx.e is None else localctx.e.start).column, (None if localctx.e is None else localctx.e.stop).line, (
+                    None if localctx.e is None else localctx.e.stop).column, (None if localctx.e is None else self._input.getText((localctx.e.start, localctx.e.stop))))
                 pass
             elif token in [CParser.T__87]:
                 self.enterOuterAlt(localctx, 2)
@@ -6162,7 +5844,8 @@ class CParser ( Parser ):
                 self.match(CParser.T__38)
                 self.state = 873
                 self.match(CParser.T__1)
-                self.StorePredicateExpression((None if localctx.e is None else localctx.e.start).line, (None if localctx.e is None else localctx.e.start).column, (None if localctx.e is None else localctx.e.stop).line, (None if localctx.e is None else localctx.e.stop).column, (None if localctx.e is None else self._input.getText((localctx.e.start,localctx.e.stop))))
+                self.StorePredicateExpression((None if localctx.e is None else localctx.e.start).line, (None if localctx.e is None else localctx.e.start).column, (None if localctx.e is None else localctx.e.stop).line, (
+                    None if localctx.e is None else localctx.e.stop).column, (None if localctx.e is None else self._input.getText((localctx.e.start, localctx.e.stop))))
                 pass
             else:
                 raise NoViableAltException(self)
@@ -6179,7 +5862,7 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
@@ -6187,25 +5870,21 @@ class CParser ( Parser ):
             return self.getToken(CParser.IDENTIFIER, 0)
 
         def expression(self):
-            return self.getTypedRuleContext(CParser.ExpressionContext,0)
-
+            return self.getTypedRuleContext(CParser.ExpressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_jump_statement
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterJump_statement" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterJump_statement"):
                 listener.enterJump_statement(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitJump_statement" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitJump_statement"):
                 listener.exitJump_statement(self)
 
-
-
-
     def jump_statement(self):
 
         localctx = CParser.Jump_statementContext(self, self._ctx, self.state)
@@ -6213,7 +5892,7 @@ class CParser ( Parser ):
         try:
             self.state = 891
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,108,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 108, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 878
@@ -6258,7 +5937,6 @@ class CParser ( Parser ):
                 self.match(CParser.T__1)
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -6266,8 +5944,3 @@ class CParser ( Parser ):
         finally:
             self.exitRule()
         return localctx
-
-
-
-
-
diff --git a/BaseTools/Source/Python/Ecc/Check.py b/BaseTools/Source/Python/Ecc/Check.py
index 33060db5f27a..362d4247560f 100644
--- a/BaseTools/Source/Python/Ecc/Check.py
+++ b/BaseTools/Source/Python/Ecc/Check.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define checkpoints used by ECC tool
 #
 # Copyright (c) 2021, Arm Limited. All rights reserved.<BR>
@@ -17,12 +17,14 @@ from Ecc import c
 from Common.LongFilePathSupport import OpenLongFilePath as open
 from Common.MultipleWorkspace import MultipleWorkspace as mws
 
-## Check
+# Check
 #
 # This class is to define checkpoints used by ECC tool
 #
 # @param object:          Inherited from object class
 #
+
+
 class Check(object):
     def __init__(self):
         pass
@@ -42,7 +44,6 @@ class Check(object):
     def SmmCommParaCheck(self):
         self.SmmCommParaCheckBufferType()
 
-
     # Check if SMM communication function has correct parameter type
     # 1. Get function calling with instance./->Communicate() interface
     # and make sure the protocol instance is of type EFI_SMM_COMMUNICATION_PROTOCOL.
@@ -59,6 +60,7 @@ class Check(object):
     #       report warning to indicate human code review.
     #    e. it is a buffer from other kind of pointers (may need to trace into nested function calls to locate),
     #       repeat checks in a.b.c and d.
+
     def SmmCommParaCheckBufferType(self):
         if EccGlobalData.gConfig.SmmCommParaCheckBufferType == '1' or EccGlobalData.gConfig.SmmCommParaCheckAll == '1':
             EdkLogger.quiet("Checking SMM communication parameter type ...")
@@ -88,7 +90,8 @@ class Check(object):
                             if SecondPara.startswith('&'):
                                 SecondPara = SecondPara[1:]
                             if SecondPara.endswith(']'):
-                                SecondParaIndex = SecondPara[SecondPara.find('[') + 1:-1]
+                                SecondParaIndex = SecondPara[SecondPara.find(
+                                    '[') + 1:-1]
                                 SecondPara = SecondPara[:SecondPara.find('[')]
                             # Get the ID
                             Id = Record[0]
@@ -96,7 +99,8 @@ class Check(object):
                             BelongsToFile = Record[3]
                             # Get the source file path
                             SqlCommand = """select FullPath from File where ID = %s""" % BelongsToFile
-                            NewRecordSet = EccGlobalData.gDb.TblFile.Exec(SqlCommand)
+                            NewRecordSet = EccGlobalData.gDb.TblFile.Exec(
+                                SqlCommand)
                             FullPath = NewRecordSet[0][0]
                             # Get the line no of function calling
                             StartLine = Record[4]
@@ -104,7 +108,8 @@ class Check(object):
                             SqlCommand = """select Value3 from INF where BelongsToFile = (select ID from File
                                             where Path = (select Path from File where ID = %s) and Model = 1011)
                                             and Value2 = 'MODULE_TYPE'""" % BelongsToFile
-                            NewRecordSet = EccGlobalData.gDb.TblFile.Exec(SqlCommand)
+                            NewRecordSet = EccGlobalData.gDb.TblFile.Exec(
+                                SqlCommand)
                             ModuleType = NewRecordSet[0][0] if NewRecordSet else None
 
                             # print BelongsToFile, FullPath, StartLine, ModuleType, SecondPara
@@ -113,14 +118,14 @@ class Check(object):
                             # Find the value of the parameter
                             if Value:
                                 if 'AllocatePage' in Value \
-                                    or 'AllocatePool' in Value \
-                                    or 'AllocateRuntimePool' in Value \
-                                    or 'AllocateZeroPool' in Value:
+                                        or 'AllocatePool' in Value \
+                                        or 'AllocateRuntimePool' in Value \
+                                        or 'AllocateZeroPool' in Value:
                                     pass
                                 else:
                                     if '->' in Value:
                                         if not EccGlobalData.gException.IsException(
-                                               ERROR_SMM_COMM_PARA_CHECK_BUFFER_TYPE, Value):
+                                                ERROR_SMM_COMM_PARA_CHECK_BUFFER_TYPE, Value):
                                             EccGlobalData.gDb.TblReport.Insert(ERROR_SMM_COMM_PARA_CHECK_BUFFER_TYPE,
                                                                                OtherMsg="Please review the buffer type"
                                                                                + "is correct or not. If it is correct" +
@@ -130,7 +135,7 @@ class Check(object):
                                                                                BelongsToItem=Id)
                                     else:
                                         if not EccGlobalData.gException.IsException(
-                                               ERROR_SMM_COMM_PARA_CHECK_BUFFER_TYPE, Value):
+                                                ERROR_SMM_COMM_PARA_CHECK_BUFFER_TYPE, Value):
                                             EccGlobalData.gDb.TblReport.Insert(ERROR_SMM_COMM_PARA_CHECK_BUFFER_TYPE,
                                                                                OtherMsg="Please review the buffer type"
                                                                                + "is correct or not. If it is correct" +
@@ -139,23 +144,23 @@ class Check(object):
                                                                                BelongsToTable=IdentifierTable,
                                                                                BelongsToItem=Id)
 
-
                             # Not find the value of the parameter
                             else:
                                 SqlCommand = """select ID, Modifier, Name, Value, Model, BelongsToFunction from %s
                                                 where Name = '%s' and StartLine < %s order by StartLine DESC""" \
                                                 % (IdentifierTable, SecondPara, StartLine)
-                                NewRecordSet = EccGlobalData.gDb.TblFile.Exec(SqlCommand)
+                                NewRecordSet = EccGlobalData.gDb.TblFile.Exec(
+                                    SqlCommand)
                                 if NewRecordSet:
                                     Value = NewRecordSet[0][1]
                                     if 'AllocatePage' in Value \
-                                        or 'AllocatePool' in Value \
-                                        or 'AllocateRuntimePool' in Value \
-                                        or 'AllocateZeroPool' in Value:
+                                            or 'AllocatePool' in Value \
+                                            or 'AllocateRuntimePool' in Value \
+                                            or 'AllocateZeroPool' in Value:
                                         pass
                                     else:
                                         if not EccGlobalData.gException.IsException(
-                                            ERROR_SMM_COMM_PARA_CHECK_BUFFER_TYPE, Value):
+                                                ERROR_SMM_COMM_PARA_CHECK_BUFFER_TYPE, Value):
                                             EccGlobalData.gDb.TblReport.Insert(ERROR_SMM_COMM_PARA_CHECK_BUFFER_TYPE,
                                                                                OtherMsg="Please review the buffer type"
                                                                                + "is correct or not. If it is correct" +
@@ -177,7 +182,8 @@ class Check(object):
                 FileIn = open(File, 'rb').read(2)
                 if FileIn != '\xff\xfe':
                     OtherMsg = "File %s is not a valid UTF-16 UNI file" % Record[1]
-                    EccGlobalData.gDb.TblReport.Insert(ERROR_GENERAL_CHECK_UNI, OtherMsg=OtherMsg, BelongsToTable='File', BelongsToItem=Record[0])
+                    EccGlobalData.gDb.TblReport.Insert(
+                        ERROR_GENERAL_CHECK_UNI, OtherMsg=OtherMsg, BelongsToTable='File', BelongsToItem=Record[0])
 
     # General Checking
     def GeneralCheck(self):
@@ -203,8 +209,10 @@ class Check(object):
                         for Char in Line:
                             IndexOfChar += 1
                             if Char == '\t':
-                                OtherMsg = "File %s has TAB char at line %s column %s" % (Record[1], IndexOfLine, IndexOfChar)
-                                EccGlobalData.gDb.TblReport.Insert(ERROR_GENERAL_CHECK_NO_TAB, OtherMsg=OtherMsg, BelongsToTable='File', BelongsToItem=Record[0])
+                                OtherMsg = "File %s has TAB char at line %s column %s" % (
+                                    Record[1], IndexOfLine, IndexOfChar)
+                                EccGlobalData.gDb.TblReport.Insert(
+                                    ERROR_GENERAL_CHECK_NO_TAB, OtherMsg=OtherMsg, BelongsToTable='File', BelongsToItem=Record[0])
 
     # Check Only use CRLF (Carriage Return Line Feed) line endings.
     def GeneralCheckLineEnding(self):
@@ -219,8 +227,10 @@ class Check(object):
                     for Line in op:
                         IndexOfLine += 1
                         if not bytes.decode(Line).endswith('\r\n'):
-                            OtherMsg = "File %s has invalid line ending at line %s" % (Record[1], IndexOfLine)
-                            EccGlobalData.gDb.TblReport.Insert(ERROR_GENERAL_CHECK_INVALID_LINE_ENDING, OtherMsg=OtherMsg, BelongsToTable='File', BelongsToItem=Record[0])
+                            OtherMsg = "File %s has invalid line ending at line %s" % (
+                                Record[1], IndexOfLine)
+                            EccGlobalData.gDb.TblReport.Insert(
+                                ERROR_GENERAL_CHECK_INVALID_LINE_ENDING, OtherMsg=OtherMsg, BelongsToTable='File', BelongsToItem=Record[0])
 
     # Check if there is no trailing white space in one line.
     def GeneralCheckTrailingWhiteSpaceLine(self):
@@ -235,8 +245,10 @@ class Check(object):
                     for Line in op:
                         IndexOfLine += 1
                         if Line.replace('\r', '').replace('\n', '').endswith(' '):
-                            OtherMsg = "File %s has trailing white spaces at line %s" % (Record[1], IndexOfLine)
-                            EccGlobalData.gDb.TblReport.Insert(ERROR_GENERAL_CHECK_TRAILING_WHITE_SPACE_LINE, OtherMsg=OtherMsg, BelongsToTable='File', BelongsToItem=Record[0])
+                            OtherMsg = "File %s has trailing white spaces at line %s" % (
+                                Record[1], IndexOfLine)
+                            EccGlobalData.gDb.TblReport.Insert(
+                                ERROR_GENERAL_CHECK_TRAILING_WHITE_SPACE_LINE, OtherMsg=OtherMsg, BelongsToTable='File', BelongsToItem=Record[0])
 
     # Check whether file has non ACSII char
     def GeneralCheckNonAcsii(self):
@@ -254,8 +266,10 @@ class Check(object):
                         for Char in Line:
                             IndexOfChar += 1
                             if ord(Char) > 126:
-                                OtherMsg = "File %s has Non-ASCII char at line %s column %s" % (Record[1], IndexOfLine, IndexOfChar)
-                                EccGlobalData.gDb.TblReport.Insert(ERROR_GENERAL_CHECK_NON_ACSII, OtherMsg=OtherMsg, BelongsToTable='File', BelongsToItem=Record[0])
+                                OtherMsg = "File %s has Non-ASCII char at line %s column %s" % (
+                                    Record[1], IndexOfLine, IndexOfChar)
+                                EccGlobalData.gDb.TblReport.Insert(
+                                    ERROR_GENERAL_CHECK_NON_ACSII, OtherMsg=OtherMsg, BelongsToTable='File', BelongsToItem=Record[0])
 
     # C Function Layout Checking
     def FunctionLayoutCheck(self):
@@ -270,7 +284,8 @@ class Check(object):
     # To check if the deprecated functions are used
     def FunctionLayoutCheckDeprecated(self):
         if EccGlobalData.gConfig.CFunctionLayoutCheckNoDeprecated == '1' or EccGlobalData.gConfig.CFunctionLayoutCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
-            EdkLogger.quiet("Checking function no deprecated one being used ...")
+            EdkLogger.quiet(
+                "Checking function no deprecated one being used ...")
 
             DeprecatedFunctionSet = ('UnicodeValueToString',
                                      'AsciiValueToString',
@@ -441,8 +456,8 @@ class Check(object):
         self.DeclCheckSameStructure()
         self.DeclCheckUnionType()
 
-
     # Check whether no use of int, unsigned, char, void, long in any .c, .h or .asl files.
+
     def DeclCheckNoUseCType(self):
         if EccGlobalData.gConfig.DeclarationDataTypeCheckNoUseCType == '1' or EccGlobalData.gConfig.DeclarationDataTypeCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking Declaration No use C type ...")
@@ -509,7 +524,8 @@ class Check(object):
             EdkLogger.quiet("Checking same struct ...")
             AllStructure = {}
             for IdentifierTable in EccGlobalData.gIdentifierTableList:
-                SqlCommand = """select ID, Name, BelongsToFile from %s where Model = %s""" % (IdentifierTable, MODEL_IDENTIFIER_STRUCTURE)
+                SqlCommand = """select ID, Name, BelongsToFile from %s where Model = %s""" % (
+                    IdentifierTable, MODEL_IDENTIFIER_STRUCTURE)
                 RecordSet = EccGlobalData.gDb.TblFile.Exec(SqlCommand)
                 for Record in RecordSet:
                     if Record[1] != '':
@@ -518,12 +534,15 @@ class Check(object):
                         else:
                             ID = AllStructure[Record[1]]
                             SqlCommand = """select FullPath from File where ID = %s """ % ID
-                            NewRecordSet = EccGlobalData.gDb.TblFile.Exec(SqlCommand)
+                            NewRecordSet = EccGlobalData.gDb.TblFile.Exec(
+                                SqlCommand)
                             OtherMsg = "The structure name '%s' is duplicate" % Record[1]
                             if NewRecordSet != []:
-                                OtherMsg = "The structure name [%s] is duplicate with the one defined in %s, maybe struct NOT typedefed or the typedef new type NOT used to qualify variables" % (Record[1], NewRecordSet[0][0])
+                                OtherMsg = "The structure name [%s] is duplicate with the one defined in %s, maybe struct NOT typedefed or the typedef new type NOT used to qualify variables" % (
+                                    Record[1], NewRecordSet[0][0])
                             if not EccGlobalData.gException.IsException(ERROR_DECLARATION_DATA_TYPE_CHECK_SAME_STRUCTURE, Record[1]):
-                                EccGlobalData.gDb.TblReport.Insert(ERROR_DECLARATION_DATA_TYPE_CHECK_SAME_STRUCTURE, OtherMsg=OtherMsg, BelongsToTable=IdentifierTable, BelongsToItem=Record[0])
+                                EccGlobalData.gDb.TblReport.Insert(
+                                    ERROR_DECLARATION_DATA_TYPE_CHECK_SAME_STRUCTURE, OtherMsg=OtherMsg, BelongsToTable=IdentifierTable, BelongsToItem=Record[0])
 
     # Check whether Union Type has a 'typedef' and the name is capital
     def DeclCheckUnionType(self):
@@ -564,7 +583,8 @@ class Check(object):
     # Check whether Non-Boolean comparisons use a compare operator (==, !=, >, < >=, <=).
     def PredicateExpressionCheckNonBooleanOperator(self):
         if EccGlobalData.gConfig.PredicateExpressionCheckNonBooleanOperator == '1' or EccGlobalData.gConfig.PredicateExpressionCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
-            EdkLogger.quiet("Checking predicate expression Non-Boolean variable...")
+            EdkLogger.quiet(
+                "Checking predicate expression Non-Boolean variable...")
 
 #            for Dirpath, Dirnames, Filenames in self.WalkTree():
 #                for F in Filenames:
@@ -621,7 +641,8 @@ class Check(object):
                     for Item in RecordDict[Key]:
                         Path = mws.relpath(Item[1], EccGlobalData.gWorkspace)
                         if not EccGlobalData.gException.IsException(ERROR_INCLUDE_FILE_CHECK_NAME, Path):
-                            EccGlobalData.gDb.TblReport.Insert(ERROR_INCLUDE_FILE_CHECK_NAME, OtherMsg="The file name for [%s] is duplicate" % Path, BelongsToTable='File', BelongsToItem=Item[0])
+                            EccGlobalData.gDb.TblReport.Insert(
+                                ERROR_INCLUDE_FILE_CHECK_NAME, OtherMsg="The file name for [%s] is duplicate" % Path, BelongsToTable='File', BelongsToItem=Item[0])
 
     # Check whether all include file contents is guarded by a #ifndef statement.
     def IncludeFileCheckIfndef(self):
@@ -682,15 +703,15 @@ class Check(object):
                         FullName = os.path.join(Dirpath, F)
                         op = open(FullName).readlines()
                         FileLinesList = op
-                        LineNo             = 0
-                        CurrentSection     = MODEL_UNKNOWN
-                        HeaderSectionLines       = []
+                        LineNo = 0
+                        CurrentSection = MODEL_UNKNOWN
+                        HeaderSectionLines = []
                         HeaderCommentStart = False
-                        HeaderCommentEnd   = False
+                        HeaderCommentEnd = False
 
                         for Line in FileLinesList:
-                            LineNo   = LineNo + 1
-                            Line     = Line.strip()
+                            LineNo = LineNo + 1
+                            Line = Line.strip()
                             if (LineNo < len(FileLinesList) - 1):
                                 NextLine = FileLinesList[LineNo].strip()
 
@@ -705,13 +726,15 @@ class Check(object):
                             #
                             if Line.startswith('#') and \
                                 (Line.find('@file') > -1) and \
-                                not HeaderCommentStart:
+                                    not HeaderCommentStart:
                                 if CurrentSection != MODEL_UNKNOWN:
                                     SqlStatement = """ select ID from File where FullPath like '%s'""" % FullName
-                                    ResultSet = EccGlobalData.gDb.TblFile.Exec(SqlStatement)
+                                    ResultSet = EccGlobalData.gDb.TblFile.Exec(
+                                        SqlStatement)
                                     for Result in ResultSet:
                                         Msg = 'INF/DEC/DSC/FDF file header comment should begin with ""## @file"" or ""# @file"" at the very top file'
-                                        EccGlobalData.gDb.TblReport.Insert(ERROR_DOXYGEN_CHECK_FILE_HEADER, Msg, "File", Result[0])
+                                        EccGlobalData.gDb.TblReport.Insert(
+                                            ERROR_DOXYGEN_CHECK_FILE_HEADER, Msg, "File", Result[0])
 
                                 else:
                                     CurrentSection = MODEL_IDENTIFIER_FILE_HEADER
@@ -726,38 +749,42 @@ class Check(object):
                             # Collect Header content.
                             #
                             if (Line.startswith('#') and CurrentSection == MODEL_IDENTIFIER_FILE_HEADER) and\
-                                HeaderCommentStart and not Line.startswith('##') and not\
-                                HeaderCommentEnd and NextLine != '':
+                                    HeaderCommentStart and not Line.startswith('##') and not\
+                                    HeaderCommentEnd and NextLine != '':
                                 HeaderSectionLines.append((Line, LineNo))
                                 continue
                             #
                             # Header content end
                             #
                             if (Line.startswith('##') or not Line.strip().startswith("#")) and HeaderCommentStart \
-                                and not HeaderCommentEnd:
+                                    and not HeaderCommentEnd:
                                 if Line.startswith('##'):
                                     HeaderCommentEnd = True
                                 HeaderSectionLines.append((Line, LineNo))
-                                ParseHeaderCommentSection(HeaderSectionLines, FullName)
+                                ParseHeaderCommentSection(
+                                    HeaderSectionLines, FullName)
                                 break
                         if HeaderCommentStart == False:
                             SqlStatement = """ select ID from File where FullPath like '%s'""" % FullName
-                            ResultSet = EccGlobalData.gDb.TblFile.Exec(SqlStatement)
+                            ResultSet = EccGlobalData.gDb.TblFile.Exec(
+                                SqlStatement)
                             for Result in ResultSet:
                                 Msg = 'INF/DEC/DSC/FDF file header comment should begin with ""## @file"" or ""# @file"" at the very top file'
-                                EccGlobalData.gDb.TblReport.Insert(ERROR_DOXYGEN_CHECK_FILE_HEADER, Msg, "File", Result[0])
+                                EccGlobalData.gDb.TblReport.Insert(
+                                    ERROR_DOXYGEN_CHECK_FILE_HEADER, Msg, "File", Result[0])
                         if HeaderCommentEnd == False:
                             SqlStatement = """ select ID from File where FullPath like '%s'""" % FullName
-                            ResultSet = EccGlobalData.gDb.TblFile.Exec(SqlStatement)
+                            ResultSet = EccGlobalData.gDb.TblFile.Exec(
+                                SqlStatement)
                             for Result in ResultSet:
                                 Msg = 'INF/DEC/DSC/FDF file header comment should end with ""##"" at the end of file header comment block'
                                 # Check whether the file header comment ends with '##'
                                 if EccGlobalData.gConfig.HeaderCheckFileCommentEnd == '1' or EccGlobalData.gConfig.HeaderCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
-                                    EccGlobalData.gDb.TblReport.Insert(ERROR_DOXYGEN_CHECK_FILE_HEADER, Msg, "File", Result[0])
-
-
+                                    EccGlobalData.gDb.TblReport.Insert(
+                                        ERROR_DOXYGEN_CHECK_FILE_HEADER, Msg, "File", Result[0])
 
     # Check whether the function headers follow the Doxygen special documentation blocks in section 2.3.5
+
     def DoxygenCheckFunctionHeader(self):
         if EccGlobalData.gConfig.DoxygenCheckFunctionHeader == '1' or EccGlobalData.gConfig.DoxygenCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking Doxygen function header ...")
@@ -770,9 +797,9 @@ class Check(object):
             for FullName in EccGlobalData.gCFileList + EccGlobalData.gHFileList:
                 MsgList = c.CheckFuncHeaderDoxygenComments(FullName)
 
-
     # Check whether the first line of text in a comment block is a brief description of the element being documented.
     # The brief description must end with a period.
+
     def DoxygenCheckCommentDescription(self):
         if EccGlobalData.gConfig.DoxygenCheckCommentDescription == '1' or EccGlobalData.gConfig.DoxygenCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             pass
@@ -853,7 +880,8 @@ class Check(object):
                 List = Record[1].split('|', 1)
                 SupModType = []
                 if len(List) == 1:
-                    SupModType = DT.SUP_MODULE_LIST_STRING.split(DT.TAB_VALUE_SPLIT)
+                    SupModType = DT.SUP_MODULE_LIST_STRING.split(
+                        DT.TAB_VALUE_SPLIT)
                 elif len(List) == 2:
                     SupModType = List[1].split()
 
@@ -865,7 +893,8 @@ class Check(object):
                             LibraryClasses[List[0]].append(Item)
 
                 if Record[2] != DT.SUP_MODULE_BASE and Record[2] not in SupModType:
-                    EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_LIBRARY_INSTANCE_2, OtherMsg="The Library Class '%s' does not specify its supported module types" % (List[0]), BelongsToTable='Inf', BelongsToItem=Record[0])
+                    EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_LIBRARY_INSTANCE_2, OtherMsg="The Library Class '%s' does not specify its supported module types" % (
+                        List[0]), BelongsToTable='Inf', BelongsToItem=Record[0])
 
             SqlCommand = """select A.ID, A.Value1, B.Value3 from Inf as A left join Inf as B
                             where A.Model = %s and B.Value2 = '%s' and B.Model = %s
@@ -885,22 +914,26 @@ class Check(object):
                 if Record[1] in LibraryClasses:
                     if Record[2] not in LibraryClasses[Record[1]] and DT.SUP_MODULE_BASE not in RecordDict[Record[1]]:
                         if not EccGlobalData.gException.IsException(ERROR_META_DATA_FILE_CHECK_LIBRARY_INSTANCE_1, Record[1]):
-                            EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_LIBRARY_INSTANCE_1, OtherMsg="The type of Library Class [%s] defined in Inf file does not match the type of the module" % (Record[1]), BelongsToTable='Inf', BelongsToItem=Record[0])
+                            EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_LIBRARY_INSTANCE_1, OtherMsg="The type of Library Class [%s] defined in Inf file does not match the type of the module" % (
+                                Record[1]), BelongsToTable='Inf', BelongsToItem=Record[0])
                 else:
                     if not EccGlobalData.gException.IsException(ERROR_META_DATA_FILE_CHECK_LIBRARY_INSTANCE_1, Record[1]):
-                        EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_LIBRARY_INSTANCE_1, OtherMsg="The type of Library Class [%s] defined in Inf file does not match the type of the module" % (Record[1]), BelongsToTable='Inf', BelongsToItem=Record[0])
+                        EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_LIBRARY_INSTANCE_1, OtherMsg="The type of Library Class [%s] defined in Inf file does not match the type of the module" % (
+                            Record[1]), BelongsToTable='Inf', BelongsToItem=Record[0])
 
     # Check whether a Library Instance has been defined for all dependent library classes
     def MetaDataFileCheckLibraryInstanceDependent(self):
         if EccGlobalData.gConfig.MetaDataFileCheckLibraryInstanceDependent == '1' or EccGlobalData.gConfig.MetaDataFileCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
-            EdkLogger.quiet("Checking for library instance dependent issue ...")
+            EdkLogger.quiet(
+                "Checking for library instance dependent issue ...")
             SqlCommand = """select ID, Value1, Value2 from Dsc where Model = %s""" % MODEL_EFI_LIBRARY_CLASS
             LibraryClasses = EccGlobalData.gDb.TblDsc.Exec(SqlCommand)
             for LibraryClass in LibraryClasses:
                 if LibraryClass[1].upper() == 'NULL' or LibraryClass[1].startswith('!ifdef') or LibraryClass[1].startswith('!ifndef') or LibraryClass[1].endswith('!endif'):
                     continue
                 else:
-                    LibraryIns = os.path.normpath(mws.join(EccGlobalData.gWorkspace, LibraryClass[2]))
+                    LibraryIns = os.path.normpath(
+                        mws.join(EccGlobalData.gWorkspace, LibraryClass[2]))
                     SkipDirString = '|'.join(EccGlobalData.gConfig.SkipDirList)
                     p = re.compile(r'.*[\\/](?:%s^\S)[\\/]?.*' % SkipDirString)
                     if p.match(os.path.split(LibraryIns)[0].upper()):
@@ -916,7 +949,8 @@ class Check(object):
                             IsFound = True
                     if not IsFound:
                         if not EccGlobalData.gException.IsException(ERROR_META_DATA_FILE_CHECK_LIBRARY_INSTANCE_DEPENDENT, LibraryClass[1]):
-                            EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_LIBRARY_INSTANCE_DEPENDENT, OtherMsg="The Library Class [%s] is not specified in '%s'" % (LibraryClass[1], LibraryClass[2]), BelongsToTable='Dsc', BelongsToItem=LibraryClass[0])
+                            EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_LIBRARY_INSTANCE_DEPENDENT, OtherMsg="The Library Class [%s] is not specified in '%s'" % (
+                                LibraryClass[1], LibraryClass[2]), BelongsToTable='Dsc', BelongsToItem=LibraryClass[0])
 
     # Check whether the Library Instances specified by the LibraryClasses sections are listed in order of dependencies
     def MetaDataFileCheckLibraryInstanceOrder(self):
@@ -929,11 +963,13 @@ class Check(object):
     def MetaDataFileCheckLibraryNoUse(self):
         if EccGlobalData.gConfig.MetaDataFileCheckLibraryNoUse == '1' or EccGlobalData.gConfig.MetaDataFileCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking for library instance not used ...")
-            SqlCommand = """select ID, Value1 from Inf as A where A.Model = %s and A.Value1 not in (select B.Value1 from Dsc as B where Model = %s)""" % (MODEL_EFI_LIBRARY_CLASS, MODEL_EFI_LIBRARY_CLASS)
+            SqlCommand = """select ID, Value1 from Inf as A where A.Model = %s and A.Value1 not in (select B.Value1 from Dsc as B where Model = %s)""" % (
+                MODEL_EFI_LIBRARY_CLASS, MODEL_EFI_LIBRARY_CLASS)
             RecordSet = EccGlobalData.gDb.TblInf.Exec(SqlCommand)
             for Record in RecordSet:
                 if not EccGlobalData.gException.IsException(ERROR_META_DATA_FILE_CHECK_LIBRARY_NO_USE, Record[1]):
-                    EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_LIBRARY_NO_USE, OtherMsg="The Library Class [%s] is not used in any platform" % (Record[1]), BelongsToTable='Inf', BelongsToItem=Record[0])
+                    EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_LIBRARY_NO_USE, OtherMsg="The Library Class [%s] is not used in any platform" % (
+                        Record[1]), BelongsToTable='Inf', BelongsToItem=Record[0])
             SqlCommand = """
                          select A.ID, A.Value1, A.BelongsToFile, A.StartLine, B.StartLine from Dsc as A left join Dsc as B
                          where A.Model = %s and B.Model = %s and A.Scope1 = B.Scope1 and A.Scope2 = B.Scope2 and A.ID != B.ID
@@ -942,16 +978,19 @@ class Check(object):
             RecordSet = EccGlobalData.gDb.TblDsc.Exec(SqlCommand)
             for Record in RecordSet:
                 if Record[3] and Record[4] and Record[3] != Record[4] and Record[1] != 'NULL':
-                    SqlCommand = """select FullPath from File where ID = %s""" % (Record[2])
+                    SqlCommand = """select FullPath from File where ID = %s""" % (
+                        Record[2])
                     FilePathList = EccGlobalData.gDb.TblFile.Exec(SqlCommand)
                     for FilePath in FilePathList:
                         if not EccGlobalData.gException.IsException(ERROR_META_DATA_FILE_CHECK_LIBRARY_NAME_DUPLICATE, Record[1]):
-                            EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_LIBRARY_NAME_DUPLICATE, OtherMsg="The Library Class [%s] is duplicated in '%s' line %s and line %s." % (Record[1], FilePath, Record[3], Record[4]), BelongsToTable='Dsc', BelongsToItem=Record[0])
+                            EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_LIBRARY_NAME_DUPLICATE, OtherMsg="The Library Class [%s] is duplicated in '%s' line %s and line %s." % (
+                                Record[1], FilePath, Record[3], Record[4]), BelongsToTable='Dsc', BelongsToItem=Record[0])
 
     # Check whether the header file in the Include\Library directory is defined in the package DEC file.
     def MetaDataFileCheckLibraryDefinedInDec(self):
         if EccGlobalData.gConfig.MetaDataFileCheckLibraryDefinedInDec == '1' or EccGlobalData.gConfig.MetaDataFileCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
-            EdkLogger.quiet("Checking for library instance whether be defined in the package dec file ...")
+            EdkLogger.quiet(
+                "Checking for library instance whether be defined in the package dec file ...")
             SqlCommand = """
                     select A.Value1, A.StartLine, A.ID, B.Value1 from Inf as A left join Dec as B
                     on A.Model = B.Model and A.Value1 = B.Value1 where A.Model=%s
@@ -961,14 +1000,16 @@ class Check(object):
                 LibraryInInf, Line, ID, LibraryDec = Record
                 if not LibraryDec:
                     if not EccGlobalData.gException.IsException(ERROR_META_DATA_FILE_CHECK_LIBRARY_NOT_DEFINED, LibraryInInf):
-                        EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_LIBRARY_NOT_DEFINED, \
-                                            OtherMsg="The Library Class [%s] in %s line is not defined in the associated package file." % (LibraryInInf, Line),
-                                            BelongsToTable='Inf', BelongsToItem=ID)
+                        EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_LIBRARY_NOT_DEFINED,
+                                                           OtherMsg="The Library Class [%s] in %s line is not defined in the associated package file." % (
+                                                               LibraryInInf, Line),
+                                                           BelongsToTable='Inf', BelongsToItem=ID)
 
     # Check that if an Inf file is specified in the FDF file but not in the Dsc file, then the Inf file is for a Binary module only
     def MetaDataFileCheckBinaryInfInFdf(self):
         if EccGlobalData.gConfig.MetaDataFileCheckBinaryInfInFdf == '1' or EccGlobalData.gConfig.MetaDataFileCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
-            EdkLogger.quiet("Checking for non-binary modules defined in FDF files ...")
+            EdkLogger.quiet(
+                "Checking for non-binary modules defined in FDF files ...")
             SqlCommand = """select A.ID, A.Value1 from Fdf as A
                          where A.Model = %s
                          and A.Enabled > -1
@@ -980,18 +1021,21 @@ class Check(object):
             for Record in RecordSet:
                 FdfID = Record[0]
                 FilePath = Record[1]
-                FilePath = os.path.normpath(mws.join(EccGlobalData.gWorkspace, FilePath))
+                FilePath = os.path.normpath(
+                    mws.join(EccGlobalData.gWorkspace, FilePath))
                 SqlCommand = """select ID from Inf where Model = %s and BelongsToFile = (select ID from File where FullPath like '%s')
                                 """ % (MODEL_EFI_SOURCE_FILE, FilePath)
                 NewRecordSet = EccGlobalData.gDb.TblFile.Exec(SqlCommand)
                 if NewRecordSet != []:
                     if not EccGlobalData.gException.IsException(ERROR_META_DATA_FILE_CHECK_BINARY_INF_IN_FDF, FilePath):
-                        EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_BINARY_INF_IN_FDF, OtherMsg="File [%s] defined in FDF file and not in DSC file must be a binary module" % (FilePath), BelongsToTable='Fdf', BelongsToItem=FdfID)
+                        EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_BINARY_INF_IN_FDF, OtherMsg="File [%s] defined in FDF file and not in DSC file must be a binary module" % (
+                            FilePath), BelongsToTable='Fdf', BelongsToItem=FdfID)
 
     # Check whether a PCD is set in a Dsc file or the FDF file, but not in both.
     def MetaDataFileCheckPcdDuplicate(self):
         if EccGlobalData.gConfig.MetaDataFileCheckPcdDuplicate == '1' or EccGlobalData.gConfig.MetaDataFileCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
-            EdkLogger.quiet("Checking for duplicate PCDs defined in both DSC and FDF files ...")
+            EdkLogger.quiet(
+                "Checking for duplicate PCDs defined in both DSC and FDF files ...")
             SqlCommand = """
                          select A.ID, A.Value1, A.Value2, A.BelongsToFile, B.ID, B.Value1, B.Value2, B.BelongsToFile from Dsc as A, Fdf as B
                          where A.Model >= %s and A.Model < %s
@@ -1006,16 +1050,21 @@ class Check(object):
             for Record in RecordSet:
                 SqlCommand1 = """select Name from File where ID = %s""" % Record[3]
                 SqlCommand2 = """select Name from File where ID = %s""" % Record[7]
-                DscFileName = os.path.splitext(EccGlobalData.gDb.TblDsc.Exec(SqlCommand1)[0][0])[0]
-                FdfFileName = os.path.splitext(EccGlobalData.gDb.TblDsc.Exec(SqlCommand2)[0][0])[0]
+                DscFileName = os.path.splitext(
+                    EccGlobalData.gDb.TblDsc.Exec(SqlCommand1)[0][0])[0]
+                FdfFileName = os.path.splitext(
+                    EccGlobalData.gDb.TblDsc.Exec(SqlCommand2)[0][0])[0]
                 if DscFileName != FdfFileName:
                     continue
                 if not EccGlobalData.gException.IsException(ERROR_META_DATA_FILE_CHECK_PCD_DUPLICATE, Record[1] + '.' + Record[2]):
-                    EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_DUPLICATE, OtherMsg="The PCD [%s] is defined in both FDF file and DSC file" % (Record[1] + '.' + Record[2]), BelongsToTable='Dsc', BelongsToItem=Record[0])
+                    EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_DUPLICATE, OtherMsg="The PCD [%s] is defined in both FDF file and DSC file" % (
+                        Record[1] + '.' + Record[2]), BelongsToTable='Dsc', BelongsToItem=Record[0])
                 if not EccGlobalData.gException.IsException(ERROR_META_DATA_FILE_CHECK_PCD_DUPLICATE, Record[5] + '.' + Record[6]):
-                    EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_DUPLICATE, OtherMsg="The PCD [%s] is defined in both FDF file and DSC file" % (Record[5] + '.' + Record[6]), BelongsToTable='Fdf', BelongsToItem=Record[4])
+                    EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_DUPLICATE, OtherMsg="The PCD [%s] is defined in both FDF file and DSC file" % (
+                        Record[5] + '.' + Record[6]), BelongsToTable='Fdf', BelongsToItem=Record[4])
 
-            EdkLogger.quiet("Checking for duplicate PCDs defined in DEC files ...")
+            EdkLogger.quiet(
+                "Checking for duplicate PCDs defined in DEC files ...")
             SqlCommand = """
                          select A.ID, A.Value1, A.Value2, A.Model, B.Model from Dec as A left join Dec as B
                          where A.Model >= %s and A.Model < %s
@@ -1034,12 +1083,14 @@ class Check(object):
             for Record in RecordSet:
                 RecordCat = Record[1] + '.' + Record[2]
                 if not EccGlobalData.gException.IsException(ERROR_META_DATA_FILE_CHECK_PCD_DUPLICATE, RecordCat):
-                    EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_DUPLICATE, OtherMsg="The PCD [%s] is defined duplicated in DEC file" % RecordCat, BelongsToTable='Dec', BelongsToItem=Record[0])
+                    EccGlobalData.gDb.TblReport.Insert(
+                        ERROR_META_DATA_FILE_CHECK_PCD_DUPLICATE, OtherMsg="The PCD [%s] is defined duplicated in DEC file" % RecordCat, BelongsToTable='Dec', BelongsToItem=Record[0])
 
     # Check that PCD settings in the FDF file are only related to flash.
     def MetaDataFileCheckPcdFlash(self):
         if EccGlobalData.gConfig.MetaDataFileCheckPcdFlash == '1' or EccGlobalData.gConfig.MetaDataFileCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
-            EdkLogger.quiet("Checking only Flash related PCDs are used in FDF ...")
+            EdkLogger.quiet(
+                "Checking only Flash related PCDs are used in FDF ...")
             SqlCommand = """
                          select ID, Value1, Value2, BelongsToFile from Fdf as A
                          where A.Model >= %s and Model < %s
@@ -1049,7 +1100,8 @@ class Check(object):
             RecordSet = EccGlobalData.gDb.TblFdf.Exec(SqlCommand)
             for Record in RecordSet:
                 if not EccGlobalData.gException.IsException(ERROR_META_DATA_FILE_CHECK_PCD_FLASH, Record[1] + '.' + Record[2]):
-                    EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_FLASH, OtherMsg="The PCD [%s] defined in FDF file is not related to Flash" % (Record[1] + '.' + Record[2]), BelongsToTable='Fdf', BelongsToItem=Record[0])
+                    EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_FLASH, OtherMsg="The PCD [%s] defined in FDF file is not related to Flash" % (
+                        Record[1] + '.' + Record[2]), BelongsToTable='Fdf', BelongsToItem=Record[0])
 
     # Check whether PCDs are used in Inf files but not specified in Dsc or FDF files
     def MetaDataFileCheckPcdNoUse(self):
@@ -1071,24 +1123,34 @@ class Check(object):
             RecordSet = EccGlobalData.gDb.TblInf.Exec(SqlCommand)
             for Record in RecordSet:
                 if not EccGlobalData.gException.IsException(ERROR_META_DATA_FILE_CHECK_PCD_NO_USE, Record[1] + '.' + Record[2]):
-                    EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_NO_USE, OtherMsg="The PCD [%s] defined in INF file is not specified in either DSC or FDF files" % (Record[1] + '.' + Record[2]), BelongsToTable='Inf', BelongsToItem=Record[0])
+                    EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_NO_USE, OtherMsg="The PCD [%s] defined in INF file is not specified in either DSC or FDF files" % (
+                        Record[1] + '.' + Record[2]), BelongsToTable='Inf', BelongsToItem=Record[0])
 
     # Check whether there are duplicate guids defined for Guid/Protocol/Ppi
     def MetaDataFileCheckGuidDuplicate(self):
         if EccGlobalData.gConfig.MetaDataFileCheckGuidDuplicate == '1' or EccGlobalData.gConfig.MetaDataFileCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Checking for duplicate GUID/PPI/PROTOCOL ...")
             # Check Guid
-            self.CheckGuidProtocolPpi(ERROR_META_DATA_FILE_CHECK_DUPLICATE_GUID, MODEL_EFI_GUID, EccGlobalData.gDb.TblDec)
-            self.CheckGuidProtocolPpi(ERROR_META_DATA_FILE_CHECK_DUPLICATE_GUID, MODEL_EFI_GUID, EccGlobalData.gDb.TblDsc)
-            self.CheckGuidProtocolPpiValue(ERROR_META_DATA_FILE_CHECK_DUPLICATE_GUID, MODEL_EFI_GUID)
+            self.CheckGuidProtocolPpi(
+                ERROR_META_DATA_FILE_CHECK_DUPLICATE_GUID, MODEL_EFI_GUID, EccGlobalData.gDb.TblDec)
+            self.CheckGuidProtocolPpi(
+                ERROR_META_DATA_FILE_CHECK_DUPLICATE_GUID, MODEL_EFI_GUID, EccGlobalData.gDb.TblDsc)
+            self.CheckGuidProtocolPpiValue(
+                ERROR_META_DATA_FILE_CHECK_DUPLICATE_GUID, MODEL_EFI_GUID)
             # Check protocol
-            self.CheckGuidProtocolPpi(ERROR_META_DATA_FILE_CHECK_DUPLICATE_PROTOCOL, MODEL_EFI_PROTOCOL, EccGlobalData.gDb.TblDec)
-            self.CheckGuidProtocolPpi(ERROR_META_DATA_FILE_CHECK_DUPLICATE_PROTOCOL, MODEL_EFI_PROTOCOL, EccGlobalData.gDb.TblDsc)
-            self.CheckGuidProtocolPpiValue(ERROR_META_DATA_FILE_CHECK_DUPLICATE_PROTOCOL, MODEL_EFI_PROTOCOL)
+            self.CheckGuidProtocolPpi(
+                ERROR_META_DATA_FILE_CHECK_DUPLICATE_PROTOCOL, MODEL_EFI_PROTOCOL, EccGlobalData.gDb.TblDec)
+            self.CheckGuidProtocolPpi(
+                ERROR_META_DATA_FILE_CHECK_DUPLICATE_PROTOCOL, MODEL_EFI_PROTOCOL, EccGlobalData.gDb.TblDsc)
+            self.CheckGuidProtocolPpiValue(
+                ERROR_META_DATA_FILE_CHECK_DUPLICATE_PROTOCOL, MODEL_EFI_PROTOCOL)
             # Check ppi
-            self.CheckGuidProtocolPpi(ERROR_META_DATA_FILE_CHECK_DUPLICATE_PPI, MODEL_EFI_PPI, EccGlobalData.gDb.TblDec)
-            self.CheckGuidProtocolPpi(ERROR_META_DATA_FILE_CHECK_DUPLICATE_PPI, MODEL_EFI_PPI, EccGlobalData.gDb.TblDsc)
-            self.CheckGuidProtocolPpiValue(ERROR_META_DATA_FILE_CHECK_DUPLICATE_PPI, MODEL_EFI_PPI)
+            self.CheckGuidProtocolPpi(
+                ERROR_META_DATA_FILE_CHECK_DUPLICATE_PPI, MODEL_EFI_PPI, EccGlobalData.gDb.TblDec)
+            self.CheckGuidProtocolPpi(
+                ERROR_META_DATA_FILE_CHECK_DUPLICATE_PPI, MODEL_EFI_PPI, EccGlobalData.gDb.TblDsc)
+            self.CheckGuidProtocolPpiValue(
+                ERROR_META_DATA_FILE_CHECK_DUPLICATE_PPI, MODEL_EFI_PPI)
 
     # Check whether all files under module directory are described in INF files
     def MetaDataFileCheckModuleFileNoUse(self):
@@ -1112,15 +1174,18 @@ class Check(object):
             RecordSet = EccGlobalData.gDb.TblInf.Exec(SqlCommand)
             for Record in RecordSet:
                 Path = Record[1]
-                Path = Path.upper().replace('\X64', '').replace('\IA32', '').replace('\EBC', '').replace('\IPF', '').replace('\ARM', '')
+                Path = Path.upper().replace('\X64', '').replace('\IA32', '').replace(
+                    '\EBC', '').replace('\IPF', '').replace('\ARM', '')
                 if Path in InfPathList:
                     if not EccGlobalData.gException.IsException(ERROR_META_DATA_FILE_CHECK_MODULE_FILE_NO_USE, Record[2]):
-                        EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_MODULE_FILE_NO_USE, OtherMsg="The source file [%s] is existing in module directory but it is not described in INF file." % (Record[2]), BelongsToTable='File', BelongsToItem=Record[0])
+                        EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_MODULE_FILE_NO_USE, OtherMsg="The source file [%s] is existing in module directory but it is not described in INF file." % (
+                            Record[2]), BelongsToTable='File', BelongsToItem=Record[0])
 
     # Check whether the PCD is correctly used in C functions via its type
     def MetaDataFileCheckPcdType(self):
         if EccGlobalData.gConfig.MetaDataFileCheckPcdType == '1' or EccGlobalData.gConfig.MetaDataFileCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
-            EdkLogger.quiet("Checking for pcd type in c code function usage ...")
+            EdkLogger.quiet(
+                "Checking for pcd type in c code function usage ...")
             SqlCommand = """
                          select ID, Model, Value1, Value2, BelongsToFile from INF where Model > %s and Model < %s
                          """ % (MODEL_PCD, MODEL_META_DATA_HEADER)
@@ -1148,19 +1213,23 @@ class Check(object):
                         FunName = Record[0]
                         if not EccGlobalData.gException.IsException(ERROR_META_DATA_FILE_CHECK_PCD_TYPE, FunName):
                             if Model in [MODEL_PCD_FIXED_AT_BUILD] and not FunName.startswith('FixedPcdGet'):
-                                EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_TYPE, OtherMsg="The pcd '%s' is defined as a FixPcd but now it is called by c function [%s]" % (PcdName, FunName), BelongsToTable=TblName, BelongsToItem=Record[1])
+                                EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_TYPE, OtherMsg="The pcd '%s' is defined as a FixPcd but now it is called by c function [%s]" % (
+                                    PcdName, FunName), BelongsToTable=TblName, BelongsToItem=Record[1])
                             if Model in [MODEL_PCD_FEATURE_FLAG] and (not FunName.startswith('FeaturePcdGet') and not FunName.startswith('FeaturePcdSet')):
-                                EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_TYPE, OtherMsg="The pcd '%s' is defined as a FeaturePcd but now it is called by c function [%s]" % (PcdName, FunName), BelongsToTable=TblName, BelongsToItem=Record[1])
+                                EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_TYPE, OtherMsg="The pcd '%s' is defined as a FeaturePcd but now it is called by c function [%s]" % (
+                                    PcdName, FunName), BelongsToTable=TblName, BelongsToItem=Record[1])
                             if Model in [MODEL_PCD_PATCHABLE_IN_MODULE] and (not FunName.startswith('PatchablePcdGet') and not FunName.startswith('PatchablePcdSet')):
-                                EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_TYPE, OtherMsg="The pcd '%s' is defined as a PatchablePcd but now it is called by c function [%s]" % (PcdName, FunName), BelongsToTable=TblName, BelongsToItem=Record[1])
+                                EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_PCD_TYPE, OtherMsg="The pcd '%s' is defined as a PatchablePcd but now it is called by c function [%s]" % (
+                                    PcdName, FunName), BelongsToTable=TblName, BelongsToItem=Record[1])
 
-            #ERROR_META_DATA_FILE_CHECK_PCD_TYPE
+            # ERROR_META_DATA_FILE_CHECK_PCD_TYPE
         pass
 
     # Internal worker function to get the INF workspace relative path from FileID
     def GetInfFilePathFromID(self, FileID):
         Table = EccGlobalData.gDb.TblFile
-        SqlCommand = """select A.FullPath from %s as A where A.ID = %s""" % (Table.Table, FileID)
+        SqlCommand = """select A.FullPath from %s as A where A.ID = %s""" % (
+            Table.Table, FileID)
         RecordSet = Table.Exec(SqlCommand)
         Path = ""
         for Record in RecordSet:
@@ -1170,7 +1239,8 @@ class Check(object):
     # Check whether two module INFs under one workspace have the same FILE_GUID value
     def MetaDataFileCheckModuleFileGuidDuplication(self):
         if EccGlobalData.gConfig.MetaDataFileCheckModuleFileGuidDuplication == '1' or EccGlobalData.gConfig.MetaDataFileCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
-            EdkLogger.quiet("Checking for pcd type in c code function usage ...")
+            EdkLogger.quiet(
+                "Checking for pcd type in c code function usage ...")
             Table = EccGlobalData.gDb.TblInf
             SqlCommand = """
                          select A.ID, A.Value3, A.BelongsToFile, B.BelongsToFile from %s as A, %s as B
@@ -1183,11 +1253,13 @@ class Check(object):
                 InfPath2 = self.GetInfFilePathFromID(Record[3])
                 if InfPath1 and InfPath2:
                     if not EccGlobalData.gException.IsException(ERROR_META_DATA_FILE_CHECK_MODULE_FILE_GUID_DUPLICATION, InfPath1):
-                        Msg = "The FILE_GUID of INF file [%s] is duplicated with that of %s" % (InfPath1, InfPath2)
-                        EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_MODULE_FILE_GUID_DUPLICATION, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
-
+                        Msg = "The FILE_GUID of INF file [%s] is duplicated with that of %s" % (
+                            InfPath1, InfPath2)
+                        EccGlobalData.gDb.TblReport.Insert(
+                            ERROR_META_DATA_FILE_CHECK_MODULE_FILE_GUID_DUPLICATION, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
 
     # Check Guid Format in module INF
+
     def MetaDataFileCheckModuleFileGuidFormat(self):
         if EccGlobalData.gConfig.MetaDataFileCheckModuleFileGuidFormat == '1' or EccGlobalData.gConfig.MetaDataFileCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Check Guid Format in module INF ...")
@@ -1201,7 +1273,8 @@ class Check(object):
                 Value2 = Record[2]
                 GuidCommentList = []
                 InfPath = self.GetInfFilePathFromID(Record[3])
-                Msg = "The GUID format of %s in INF file [%s] does not follow rules" % (Value1, InfPath)
+                Msg = "The GUID format of %s in INF file [%s] does not follow rules" % (
+                    Value1, InfPath)
                 if Value2.startswith(DT.TAB_SPECIAL_COMMENT):
                     GuidCommentList = Value2[2:].split(DT.TAB_SPECIAL_COMMENT)
                     if GuidCommentList[0].strip().startswith(DT.TAB_INF_USAGE_UNDEFINED):
@@ -1211,7 +1284,8 @@ class Check(object):
                                                                       DT.TAB_INF_USAGE_SOME_PRO,
                                                                       DT.TAB_INF_USAGE_CON,
                                                                       DT.TAB_INF_USAGE_SOME_CON)):
-                            EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_FORMAT_GUID, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
+                            EccGlobalData.gDb.TblReport.Insert(
+                                ERROR_META_DATA_FILE_CHECK_FORMAT_GUID, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
                         if not (GuidCommentList[1].strip()).startswith(DT.TAB_INF_GUIDTYPE_VAR) and \
                             not GuidCommentList[1].strip().startswith((DT.TAB_INF_GUIDTYPE_EVENT,
                                                                        DT.TAB_INF_GUIDTYPE_HII,
@@ -1224,11 +1298,14 @@ class Check(object):
                                                                        DT.TAB_INF_GUIDTYPE_PROTOCOL,
                                                                        DT.TAB_INF_GUIDTYPE_PPI,
                                                                        DT.TAB_INF_USAGE_UNDEFINED)):
-                                EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_FORMAT_GUID, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
+                            EccGlobalData.gDb.TblReport.Insert(
+                                ERROR_META_DATA_FILE_CHECK_FORMAT_GUID, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
                     else:
-                        EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_FORMAT_GUID, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
+                        EccGlobalData.gDb.TblReport.Insert(
+                            ERROR_META_DATA_FILE_CHECK_FORMAT_GUID, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
                 else:
-                    EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_FORMAT_GUID, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
+                    EccGlobalData.gDb.TblReport.Insert(
+                        ERROR_META_DATA_FILE_CHECK_FORMAT_GUID, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
 
     # Check Protocol Format in module INF
     def MetaDataFileCheckModuleFileProtocolFormat(self):
@@ -1244,7 +1321,8 @@ class Check(object):
                 Value2 = Record[2]
                 GuidCommentList = []
                 InfPath = self.GetInfFilePathFromID(Record[3])
-                Msg = "The Protocol format of %s in INF file [%s] does not follow rules" % (Value1, InfPath)
+                Msg = "The Protocol format of %s in INF file [%s] does not follow rules" % (
+                    Value1, InfPath)
                 if Value2.startswith(DT.TAB_SPECIAL_COMMENT):
                     GuidCommentList = Value2[2:].split(DT.TAB_SPECIAL_COMMENT)
                     if len(GuidCommentList) >= 1:
@@ -1256,12 +1334,14 @@ class Check(object):
                                                                       DT.TAB_INF_USAGE_TO_START,
                                                                       DT.TAB_INF_USAGE_BY_START,
                                                                       DT.TAB_INF_USAGE_UNDEFINED)):
-                            EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_FORMAT_PROTOCOL, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
+                            EccGlobalData.gDb.TblReport.Insert(
+                                ERROR_META_DATA_FILE_CHECK_FORMAT_PROTOCOL, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
                 else:
-                    EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_FORMAT_PROTOCOL, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
-
+                    EccGlobalData.gDb.TblReport.Insert(
+                        ERROR_META_DATA_FILE_CHECK_FORMAT_PROTOCOL, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
 
     # Check Ppi Format in module INF
+
     def MetaDataFileCheckModuleFilePpiFormat(self):
         if EccGlobalData.gConfig.MetaDataFileCheckModuleFilePpiFormat == '1' or EccGlobalData.gConfig.MetaDataFileCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
             EdkLogger.quiet("Check Ppi Format in module INF ...")
@@ -1275,7 +1355,8 @@ class Check(object):
                 Value2 = Record[2]
                 GuidCommentList = []
                 InfPath = self.GetInfFilePathFromID(Record[3])
-                Msg = "The Ppi format of %s in INF file [%s] does not follow rules" % (Value1, InfPath)
+                Msg = "The Ppi format of %s in INF file [%s] does not follow rules" % (
+                    Value1, InfPath)
                 if Value2.startswith(DT.TAB_SPECIAL_COMMENT):
                     GuidCommentList = Value2[2:].split(DT.TAB_SPECIAL_COMMENT)
                     if len(GuidCommentList) >= 1:
@@ -1285,9 +1366,11 @@ class Check(object):
                                                                       DT.TAB_INF_USAGE_SOME_CON,
                                                                       DT.TAB_INF_USAGE_NOTIFY,
                                                                       DT.TAB_INF_USAGE_UNDEFINED)):
-                            EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_FORMAT_PPI, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
+                            EccGlobalData.gDb.TblReport.Insert(
+                                ERROR_META_DATA_FILE_CHECK_FORMAT_PPI, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
                 else:
-                    EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_FORMAT_PPI, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
+                    EccGlobalData.gDb.TblReport.Insert(
+                        ERROR_META_DATA_FILE_CHECK_FORMAT_PPI, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
 
     # Check Pcd Format in module INF
     def MetaDataFileCheckModuleFilePcdFormat(self):
@@ -1304,7 +1387,8 @@ class Check(object):
                 Usage = Record[4]
                 PcdCommentList = []
                 InfPath = self.GetInfFilePathFromID(Record[5])
-                Msg = "The Pcd format of %s in INF file [%s] does not follow rules" % (PcdName, InfPath)
+                Msg = "The Pcd format of %s in INF file [%s] does not follow rules" % (
+                    PcdName, InfPath)
                 if Usage.startswith(DT.TAB_SPECIAL_COMMENT):
                     PcdCommentList = Usage[2:].split(DT.TAB_SPECIAL_COMMENT)
                     if len(PcdCommentList) >= 1:
@@ -1312,16 +1396,19 @@ class Check(object):
                             and not PcdCommentList[0].strip().startswith((DT.TAB_INF_USAGE_SOME_PRO,
                                                                           DT.TAB_INF_USAGE_CON,
                                                                           DT.TAB_INF_USAGE_UNDEFINED)):
-                            EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_FORMAT_PCD, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
+                            EccGlobalData.gDb.TblReport.Insert(
+                                ERROR_META_DATA_FILE_CHECK_FORMAT_PCD, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
                         if Model in [MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_DYNAMIC, MODEL_PCD_DYNAMIC_EX] \
                             and not PcdCommentList[0].strip().startswith((DT.TAB_INF_USAGE_PRO,
                                                                           DT.TAB_INF_USAGE_SOME_PRO,
                                                                           DT.TAB_INF_USAGE_CON,
                                                                           DT.TAB_INF_USAGE_SOME_CON,
                                                                           DT.TAB_INF_USAGE_UNDEFINED)):
-                            EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_FORMAT_PCD, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
+                            EccGlobalData.gDb.TblReport.Insert(
+                                ERROR_META_DATA_FILE_CHECK_FORMAT_PCD, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
                 else:
-                    EccGlobalData.gDb.TblReport.Insert(ERROR_META_DATA_FILE_CHECK_FORMAT_PCD, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
+                    EccGlobalData.gDb.TblReport.Insert(
+                        ERROR_META_DATA_FILE_CHECK_FORMAT_PCD, OtherMsg=Msg, BelongsToTable=Table.Table, BelongsToItem=Record[0])
 
     # Check whether there is a duplicate Guid/Ppi/Protocol name
     def CheckGuidProtocolPpi(self, ErrorID, Model, Table):
@@ -1344,7 +1431,8 @@ class Check(object):
         RecordSet = Table.Exec(SqlCommand)
         for Record in RecordSet:
             if not EccGlobalData.gException.IsException(ErrorID, Record[1]):
-                EccGlobalData.gDb.TblReport.Insert(ErrorID, OtherMsg="The %s name [%s] is defined more than one time" % (Name.upper(), Record[1]), BelongsToTable=Table.Table, BelongsToItem=Record[0])
+                EccGlobalData.gDb.TblReport.Insert(ErrorID, OtherMsg="The %s name [%s] is defined more than one time" % (
+                    Name.upper(), Record[1]), BelongsToTable=Table.Table, BelongsToItem=Record[0])
 
     # Check whether there is a duplicate Guid/Ppi/Protocol value
     def CheckGuidProtocolPpiValue(self, ErrorID, Model):
@@ -1366,17 +1454,18 @@ class Check(object):
         RecordSet = Table.Exec(SqlCommand)
         for Record in RecordSet:
             if not EccGlobalData.gException.IsException(ErrorID, Record[2]):
-                EccGlobalData.gDb.TblReport.Insert(ErrorID, OtherMsg="The %s value [%s] is used more than one time" % (Name.upper(), Record[2]), BelongsToTable=Table.Table, BelongsToItem=Record[0])
+                EccGlobalData.gDb.TblReport.Insert(ErrorID, OtherMsg="The %s value [%s] is used more than one time" % (
+                    Name.upper(), Record[2]), BelongsToTable=Table.Table, BelongsToItem=Record[0])
 
     # Naming Convention Check
     def NamingConventionCheck(self):
         if EccGlobalData.gConfig.NamingConventionCheckDefineStatement == '1' \
-        or EccGlobalData.gConfig.NamingConventionCheckTypedefStatement == '1' \
-        or EccGlobalData.gConfig.NamingConventionCheckIfndefStatement == '1' \
-        or EccGlobalData.gConfig.NamingConventionCheckVariableName == '1' \
-        or EccGlobalData.gConfig.NamingConventionCheckSingleCharacterVariable == '1' \
-        or EccGlobalData.gConfig.NamingConventionCheckAll == '1'\
-        or EccGlobalData.gConfig.CheckAll == '1':
+                or EccGlobalData.gConfig.NamingConventionCheckTypedefStatement == '1' \
+                or EccGlobalData.gConfig.NamingConventionCheckIfndefStatement == '1' \
+                or EccGlobalData.gConfig.NamingConventionCheckVariableName == '1' \
+                or EccGlobalData.gConfig.NamingConventionCheckSingleCharacterVariable == '1' \
+                or EccGlobalData.gConfig.NamingConventionCheckAll == '1'\
+                or EccGlobalData.gConfig.CheckAll == '1':
             for Dirpath, Dirnames, Filenames in self.WalkTree():
                 for F in Filenames:
                     if os.path.splitext(F)[1] in ('.h', '.c'):
@@ -1388,9 +1477,11 @@ class Check(object):
                         self.NamingConventionCheckDefineStatement(FileTable)
                         self.NamingConventionCheckTypedefStatement(FileTable)
                         self.NamingConventionCheckVariableName(FileTable)
-                        self.NamingConventionCheckSingleCharacterVariable(FileTable)
+                        self.NamingConventionCheckSingleCharacterVariable(
+                            FileTable)
                         if os.path.splitext(F)[1] in ('.h'):
-                            self.NamingConventionCheckIfndefStatement(FileTable)
+                            self.NamingConventionCheckIfndefStatement(
+                                FileTable)
 
         self.NamingConventionCheckPathName()
         self.NamingConventionCheckFunctionName()
@@ -1398,9 +1489,11 @@ class Check(object):
     # Check whether only capital letters are used for #define declarations
     def NamingConventionCheckDefineStatement(self, FileTable):
         if EccGlobalData.gConfig.NamingConventionCheckDefineStatement == '1' or EccGlobalData.gConfig.NamingConventionCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
-            EdkLogger.quiet("Checking naming convention of #define statement ...")
+            EdkLogger.quiet(
+                "Checking naming convention of #define statement ...")
 
-            SqlCommand = """select ID, Value from %s where Model = %s""" % (FileTable, MODEL_IDENTIFIER_MACRO_DEFINE)
+            SqlCommand = """select ID, Value from %s where Model = %s""" % (
+                FileTable, MODEL_IDENTIFIER_MACRO_DEFINE)
             RecordSet = EccGlobalData.gDb.TblFile.Exec(SqlCommand)
             for Record in RecordSet:
                 Name = Record[1].strip().split()[1]
@@ -1408,14 +1501,17 @@ class Check(object):
                     Name = Name[0:Name.find('(')]
                 if Name.upper() != Name:
                     if not EccGlobalData.gException.IsException(ERROR_NAMING_CONVENTION_CHECK_DEFINE_STATEMENT, Name):
-                        EccGlobalData.gDb.TblReport.Insert(ERROR_NAMING_CONVENTION_CHECK_DEFINE_STATEMENT, OtherMsg="The #define name [%s] does not follow the rules" % (Name), BelongsToTable=FileTable, BelongsToItem=Record[0])
+                        EccGlobalData.gDb.TblReport.Insert(ERROR_NAMING_CONVENTION_CHECK_DEFINE_STATEMENT, OtherMsg="The #define name [%s] does not follow the rules" % (
+                            Name), BelongsToTable=FileTable, BelongsToItem=Record[0])
 
     # Check whether only capital letters are used for typedef declarations
     def NamingConventionCheckTypedefStatement(self, FileTable):
         if EccGlobalData.gConfig.NamingConventionCheckTypedefStatement == '1' or EccGlobalData.gConfig.NamingConventionCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
-            EdkLogger.quiet("Checking naming convention of #typedef statement ...")
+            EdkLogger.quiet(
+                "Checking naming convention of #typedef statement ...")
 
-            SqlCommand = """select ID, Name from %s where Model = %s""" % (FileTable, MODEL_IDENTIFIER_TYPEDEF)
+            SqlCommand = """select ID, Name from %s where Model = %s""" % (
+                FileTable, MODEL_IDENTIFIER_TYPEDEF)
             RecordSet = EccGlobalData.gDb.TblFile.Exec(SqlCommand)
             for Record in RecordSet:
                 Name = Record[1].strip()
@@ -1423,27 +1519,32 @@ class Check(object):
                     if Name[0] == '(':
                         Name = Name[1:Name.find(')')]
                     if Name.find('(') > -1:
-                        Name = Name[Name.find('(') + 1 : Name.find(')')]
+                        Name = Name[Name.find('(') + 1: Name.find(')')]
                     Name = Name.replace('WINAPI', '')
                     Name = Name.replace('*', '').strip()
                     if Name.upper() != Name:
                         if not EccGlobalData.gException.IsException(ERROR_NAMING_CONVENTION_CHECK_TYPEDEF_STATEMENT, Name):
-                            EccGlobalData.gDb.TblReport.Insert(ERROR_NAMING_CONVENTION_CHECK_TYPEDEF_STATEMENT, OtherMsg="The #typedef name [%s] does not follow the rules" % (Name), BelongsToTable=FileTable, BelongsToItem=Record[0])
+                            EccGlobalData.gDb.TblReport.Insert(ERROR_NAMING_CONVENTION_CHECK_TYPEDEF_STATEMENT, OtherMsg="The #typedef name [%s] does not follow the rules" % (
+                                Name), BelongsToTable=FileTable, BelongsToItem=Record[0])
 
     # Check whether the #ifndef at the start of an include file uses both prefix and postfix underscore characters, '_'.
     def NamingConventionCheckIfndefStatement(self, FileTable):
         if EccGlobalData.gConfig.NamingConventionCheckIfndefStatement == '1' or EccGlobalData.gConfig.NamingConventionCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
-            EdkLogger.quiet("Checking naming convention of #ifndef statement ...")
+            EdkLogger.quiet(
+                "Checking naming convention of #ifndef statement ...")
 
-            SqlCommand = """select ID, Value from %s where Model = %s""" % (FileTable, MODEL_IDENTIFIER_MACRO_IFNDEF)
+            SqlCommand = """select ID, Value from %s where Model = %s""" % (
+                FileTable, MODEL_IDENTIFIER_MACRO_IFNDEF)
             RecordSet = EccGlobalData.gDb.TblFile.Exec(SqlCommand)
             if RecordSet:
                 # Only check the first ifndef statement of the file
-                FirstDefine = sorted(RecordSet, key=lambda Record: Record[0])[0]
+                FirstDefine = sorted(
+                    RecordSet, key=lambda Record: Record[0])[0]
                 Name = FirstDefine[1].replace('#ifndef', '').strip()
                 if Name[0] == '_' or Name[-1] != '_' or Name[-2] == '_':
                     if not EccGlobalData.gException.IsException(ERROR_NAMING_CONVENTION_CHECK_IFNDEF_STATEMENT, Name):
-                        EccGlobalData.gDb.TblReport.Insert(ERROR_NAMING_CONVENTION_CHECK_IFNDEF_STATEMENT, OtherMsg="The #ifndef name [%s] does not follow the rules" % (Name), BelongsToTable=FileTable, BelongsToItem=FirstDefine[0])
+                        EccGlobalData.gDb.TblReport.Insert(ERROR_NAMING_CONVENTION_CHECK_IFNDEF_STATEMENT, OtherMsg="The #ifndef name [%s] does not follow the rules" % (
+                            Name), BelongsToTable=FileTable, BelongsToItem=FirstDefine[0])
 
     # Rule for path name, variable name and function name
     # 1. First character should be upper case
@@ -1459,7 +1560,8 @@ class Check(object):
             for Record in RecordSet:
                 if not Pattern.match(Record[1]):
                     if not EccGlobalData.gException.IsException(ERROR_NAMING_CONVENTION_CHECK_PATH_NAME, Record[1]):
-                        EccGlobalData.gDb.TblReport.Insert(ERROR_NAMING_CONVENTION_CHECK_PATH_NAME, OtherMsg="The file path [%s] does not follow the rules" % (Record[1]), BelongsToTable='File', BelongsToItem=Record[0])
+                        EccGlobalData.gDb.TblReport.Insert(ERROR_NAMING_CONVENTION_CHECK_PATH_NAME, OtherMsg="The file path [%s] does not follow the rules" % (
+                            Record[1]), BelongsToTable='File', BelongsToItem=Record[0])
 
     # Rule for path name, variable name and function name
     # 1. First character should be upper case
@@ -1472,7 +1574,8 @@ class Check(object):
             EdkLogger.quiet("Checking naming convention of variable name ...")
             Pattern = re.compile(r'^[A-Zgm]+\S*[a-z]\S*$')
 
-            SqlCommand = """select ID, Name, Modifier from %s where Model = %s""" % (FileTable, MODEL_IDENTIFIER_VARIABLE)
+            SqlCommand = """select ID, Name, Modifier from %s where Model = %s""" % (
+                FileTable, MODEL_IDENTIFIER_VARIABLE)
             RecordSet = EccGlobalData.gDb.TblFile.Exec(SqlCommand)
             for Record in RecordSet:
                 Var = Record[1]
@@ -1481,7 +1584,8 @@ class Check(object):
                     Var = Var[5:].lstrip()
                 if not Pattern.match(Var) and not (Modifier.endswith('*') and Var.startswith('p')):
                     if not EccGlobalData.gException.IsException(ERROR_NAMING_CONVENTION_CHECK_VARIABLE_NAME, Record[1]):
-                        EccGlobalData.gDb.TblReport.Insert(ERROR_NAMING_CONVENTION_CHECK_VARIABLE_NAME, OtherMsg="The variable name [%s] does not follow the rules" % (Record[1]), BelongsToTable=FileTable, BelongsToItem=Record[0])
+                        EccGlobalData.gDb.TblReport.Insert(ERROR_NAMING_CONVENTION_CHECK_VARIABLE_NAME, OtherMsg="The variable name [%s] does not follow the rules" % (
+                            Record[1]), BelongsToTable=FileTable, BelongsToItem=Record[0])
 
     # Rule for path name, variable name and function name
     # 1. First character should be upper case
@@ -1497,20 +1601,25 @@ class Check(object):
             for Record in RecordSet:
                 if not Pattern.match(Record[1]):
                     if not EccGlobalData.gException.IsException(ERROR_NAMING_CONVENTION_CHECK_FUNCTION_NAME, Record[1]):
-                        EccGlobalData.gDb.TblReport.Insert(ERROR_NAMING_CONVENTION_CHECK_FUNCTION_NAME, OtherMsg="The function name [%s] does not follow the rules" % (Record[1]), BelongsToTable='Function', BelongsToItem=Record[0])
+                        EccGlobalData.gDb.TblReport.Insert(ERROR_NAMING_CONVENTION_CHECK_FUNCTION_NAME, OtherMsg="The function name [%s] does not follow the rules" % (
+                            Record[1]), BelongsToTable='Function', BelongsToItem=Record[0])
 
     # Check that short single-character variable names are NOT used
     def NamingConventionCheckSingleCharacterVariable(self, FileTable):
         if EccGlobalData.gConfig.NamingConventionCheckSingleCharacterVariable == '1' or EccGlobalData.gConfig.NamingConventionCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
-            EdkLogger.quiet("Checking naming convention of single character variable name ...")
+            EdkLogger.quiet(
+                "Checking naming convention of single character variable name ...")
 
-            SqlCommand = """select ID, Name from %s where Model = %s""" % (FileTable, MODEL_IDENTIFIER_VARIABLE)
+            SqlCommand = """select ID, Name from %s where Model = %s""" % (
+                FileTable, MODEL_IDENTIFIER_VARIABLE)
             RecordSet = EccGlobalData.gDb.TblFile.Exec(SqlCommand)
             for Record in RecordSet:
                 Variable = Record[1].replace('*', '')
                 if len(Variable) == 1:
                     if not EccGlobalData.gException.IsException(ERROR_NAMING_CONVENTION_CHECK_SINGLE_CHARACTER_VARIABLE, Record[1]):
-                        EccGlobalData.gDb.TblReport.Insert(ERROR_NAMING_CONVENTION_CHECK_SINGLE_CHARACTER_VARIABLE, OtherMsg="The variable name [%s] does not follow the rules" % (Record[1]), BelongsToTable=FileTable, BelongsToItem=Record[0])
+                        EccGlobalData.gDb.TblReport.Insert(ERROR_NAMING_CONVENTION_CHECK_SINGLE_CHARACTER_VARIABLE, OtherMsg="The variable name [%s] does not follow the rules" % (
+                            Record[1]), BelongsToTable=FileTable, BelongsToItem=Record[0])
+
 
 def FindPara(FilePath, Para, CallingLine):
     Lines = open(FilePath).readlines()
@@ -1525,6 +1634,7 @@ def FindPara(FilePath, Para, CallingLine):
 
     return ''
 
+
 ##
 #
 # This acts like the main() function for the script, unless it is 'import'ed into another
diff --git a/BaseTools/Source/Python/Ecc/CodeFragment.py b/BaseTools/Source/Python/Ecc/CodeFragment.py
index 8ced51f25bde..2f558196ae3c 100644
--- a/BaseTools/Source/Python/Ecc/CodeFragment.py
+++ b/BaseTools/Source/Python/Ecc/CodeFragment.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # fragments of source file
 #
 #  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -7,11 +7,11 @@
 #
 
 
-## The description of comment contents and start & end position
+# The description of comment contents and start & end position
 #
 #
-class Comment :
-    ## The constructor
+class Comment:
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Str         The message to record
@@ -25,11 +25,13 @@ class Comment :
         self.EndPos = End
         self.Type = CommentType
 
-## The description of preprocess directives and start & end position
+# The description of preprocess directives and start & end position
 #
 #
-class PP_Directive :
-    ## The constructor
+
+
+class PP_Directive:
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Str         The message to record
@@ -41,11 +43,13 @@ class PP_Directive :
         self.StartPos = Begin
         self.EndPos = End
 
-## The description of predicate expression and start & end position
+# The description of predicate expression and start & end position
 #
 #
-class PredicateExpression :
-    ## The constructor
+
+
+class PredicateExpression:
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Str         The message to record
@@ -57,11 +61,13 @@ class PredicateExpression :
         self.StartPos = Begin
         self.EndPos = End
 
-## The description of function definition and start & end position
+# The description of function definition and start & end position
 #
 #
-class FunctionDefinition :
-    ## The constructor
+
+
+class FunctionDefinition:
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Str         The message to record
@@ -77,11 +83,13 @@ class FunctionDefinition :
         self.LeftBracePos = LBPos
         self.NamePos = NamePos
 
-## The description of variable declaration and start & end position
+# The description of variable declaration and start & end position
 #
 #
-class VariableDeclaration :
-    ## The constructor
+
+
+class VariableDeclaration:
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Str         The message to record
@@ -94,11 +102,13 @@ class VariableDeclaration :
         self.StartPos = Begin
         self.NameStartPos = NamePos
 
-## The description of enum definition and start & end position
+# The description of enum definition and start & end position
 #
 #
-class EnumerationDefinition :
-    ## The constructor
+
+
+class EnumerationDefinition:
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Str         The message to record
@@ -110,11 +120,13 @@ class EnumerationDefinition :
         self.StartPos = Begin
         self.EndPos = End
 
-## The description of struct/union definition and start & end position
+# The description of struct/union definition and start & end position
 #
 #
-class StructUnionDefinition :
-    ## The constructor
+
+
+class StructUnionDefinition:
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Str         The message to record
@@ -126,11 +138,13 @@ class StructUnionDefinition :
         self.StartPos = Begin
         self.EndPos = End
 
-## The description of 'Typedef' definition and start & end position
+# The description of 'Typedef' definition and start & end position
 #
 #
-class TypedefDefinition :
-    ## The constructor
+
+
+class TypedefDefinition:
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Str         The message to record
@@ -143,8 +157,9 @@ class TypedefDefinition :
         self.StartPos = Begin
         self.EndPos = End
 
+
 class FunctionCalling:
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Str         The message to record
@@ -156,4 +171,3 @@ class FunctionCalling:
         self.ParamList = Param
         self.StartPos = Begin
         self.EndPos = End
-
diff --git a/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py b/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py
index d8d6aff08a6e..1f0bae3d6b92 100644
--- a/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py
+++ b/BaseTools/Source/Python/Ecc/CodeFragmentCollector.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # preprocess source file
 #
 #  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -32,21 +32,21 @@ from Ecc.CodeFragment import PP_Directive
 from Ecc.ParserWarning import Warning
 
 
-##define T_CHAR_SPACE                ' '
-##define T_CHAR_NULL                 '\0'
-##define T_CHAR_CR                   '\r'
-##define T_CHAR_TAB                  '\t'
-##define T_CHAR_LF                   '\n'
-##define T_CHAR_SLASH                '/'
-##define T_CHAR_BACKSLASH            '\\'
-##define T_CHAR_DOUBLE_QUOTE         '\"'
-##define T_CHAR_SINGLE_QUOTE         '\''
-##define T_CHAR_STAR                 '*'
-##define T_CHAR_HASH                 '#'
+# define T_CHAR_SPACE                ' '
+# define T_CHAR_NULL                 '\0'
+# define T_CHAR_CR                   '\r'
+# define T_CHAR_TAB                  '\t'
+# define T_CHAR_LF                   '\n'
+# define T_CHAR_SLASH                '/'
+# define T_CHAR_BACKSLASH            '\\'
+# define T_CHAR_DOUBLE_QUOTE         '\"'
+# define T_CHAR_SINGLE_QUOTE         '\''
+# define T_CHAR_STAR                 '*'
+# define T_CHAR_HASH                 '#'
 
-(T_CHAR_SPACE, T_CHAR_NULL, T_CHAR_CR, T_CHAR_TAB, T_CHAR_LF, T_CHAR_SLASH, \
-T_CHAR_BACKSLASH, T_CHAR_DOUBLE_QUOTE, T_CHAR_SINGLE_QUOTE, T_CHAR_STAR, T_CHAR_HASH) = \
-(' ', '\0', '\r', '\t', '\n', '/', '\\', '\"', '\'', '*', '#')
+(T_CHAR_SPACE, T_CHAR_NULL, T_CHAR_CR, T_CHAR_TAB, T_CHAR_LF, T_CHAR_SLASH,
+ T_CHAR_BACKSLASH, T_CHAR_DOUBLE_QUOTE, T_CHAR_SINGLE_QUOTE, T_CHAR_STAR, T_CHAR_HASH) = \
+    (' ', '\0', '\r', '\t', '\n', '/', '\\', '\"', '\'', '*', '#')
 
 SEPERATOR_TUPLE = ('=', '|', ',', '{', '}')
 
@@ -54,15 +54,17 @@ SEPERATOR_TUPLE = ('=', '|', ',', '{', '}')
 
 (T_PP_INCLUDE, T_PP_DEFINE, T_PP_OTHERS) = (0, 1, 2)
 
-## The collector for source code fragments.
+# The collector for source code fragments.
 #
 # PreprocessFile method should be called prior to ParseFile
 #
 # GetNext*** procedures mean these procedures will get next token first, then make judgement.
 # Get*** procedures mean these procedures will make judgement on current token only.
 #
+
+
 class CodeFragmentCollector:
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  FileName    The file that to be parsed
@@ -77,7 +79,7 @@ class CodeFragmentCollector:
         self.__Token = ""
         self.__SkippedChars = ""
 
-    ## __EndOfFile() method
+    # __EndOfFile() method
     #
     #   Judge current buffer pos is at file end
     #
@@ -98,7 +100,7 @@ class CodeFragmentCollector:
         else:
             return False
 
-    ## __EndOfLine() method
+    # __EndOfLine() method
     #
     #   Judge current buffer pos is at line end
     #
@@ -107,13 +109,14 @@ class CodeFragmentCollector:
     #   @retval False       Current File buffer position is NOT at line end
     #
     def __EndOfLine(self):
-        SizeOfCurrentLine = len(self.Profile.FileLinesList[self.CurrentLineNumber - 1])
+        SizeOfCurrentLine = len(
+            self.Profile.FileLinesList[self.CurrentLineNumber - 1])
         if self.CurrentOffsetWithinLine >= SizeOfCurrentLine - 1:
             return True
         else:
             return False
 
-    ## Rewind() method
+    # Rewind() method
     #
     #   Reset file data buffer to the initial state
     #
@@ -123,7 +126,7 @@ class CodeFragmentCollector:
         self.CurrentLineNumber = 1
         self.CurrentOffsetWithinLine = 0
 
-    ## __UndoOneChar() method
+    # __UndoOneChar() method
     #
     #   Go back one char in the file buffer
     #
@@ -142,7 +145,7 @@ class CodeFragmentCollector:
             self.CurrentOffsetWithinLine -= 1
         return True
 
-    ## __GetOneChar() method
+    # __GetOneChar() method
     #
     #   Move forward one char in the file buffer
     #
@@ -150,12 +153,12 @@ class CodeFragmentCollector:
     #
     def __GetOneChar(self):
         if self.CurrentOffsetWithinLine == len(self.Profile.FileLinesList[self.CurrentLineNumber - 1]) - 1:
-                self.CurrentLineNumber += 1
-                self.CurrentOffsetWithinLine = 0
+            self.CurrentLineNumber += 1
+            self.CurrentOffsetWithinLine = 0
         else:
-                self.CurrentOffsetWithinLine += 1
+            self.CurrentOffsetWithinLine += 1
 
-    ## __CurrentChar() method
+    # __CurrentChar() method
     #
     #   Get the char pointed to by the file buffer pointer
     #
@@ -163,12 +166,13 @@ class CodeFragmentCollector:
     #   @retval Char        Current char
     #
     def __CurrentChar(self):
-        CurrentChar = self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine]
+        CurrentChar = self.Profile.FileLinesList[self.CurrentLineNumber -
+                                                 1][self.CurrentOffsetWithinLine]
 #        if CurrentChar > 255:
 #            raise Warning("Non-Ascii char found At Line %d, offset %d" % (self.CurrentLineNumber, self.CurrentOffsetWithinLine), self.FileName, self.CurrentLineNumber)
         return CurrentChar
 
-    ## __NextChar() method
+    # __NextChar() method
     #
     #   Get the one char pass the char pointed to by the file buffer pointer
     #
@@ -181,7 +185,7 @@ class CodeFragmentCollector:
         else:
             return self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine + 1]
 
-    ## __SetCurrentCharValue() method
+    # __SetCurrentCharValue() method
     #
     #   Modify the value of current char
     #
@@ -189,9 +193,10 @@ class CodeFragmentCollector:
     #   @param  Value       The new value of current char
     #
     def __SetCurrentCharValue(self, Value):
-        self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine] = Value
+        self.Profile.FileLinesList[self.CurrentLineNumber -
+                                   1][self.CurrentOffsetWithinLine] = Value
 
-    ## __SetCharValue() method
+    # __SetCharValue() method
     #
     #   Modify the value of current char
     #
@@ -201,7 +206,7 @@ class CodeFragmentCollector:
     def __SetCharValue(self, Line, Offset, Value):
         self.Profile.FileLinesList[Line - 1][Offset] = Value
 
-    ## __CurrentLine() method
+    # __CurrentLine() method
     #
     #   Get the list that contains current line contents
     #
@@ -211,7 +216,7 @@ class CodeFragmentCollector:
     def __CurrentLine(self):
         return self.Profile.FileLinesList[self.CurrentLineNumber - 1]
 
-    ## __InsertComma() method
+    # __InsertComma() method
     #
     #   Insert ',' to replace PP
     #
@@ -220,9 +225,9 @@ class CodeFragmentCollector:
     #
     def __InsertComma(self, Line):
 
-
         if self.Profile.FileLinesList[Line - 1][0] != T_CHAR_HASH:
-            BeforeHashPart = str(self.Profile.FileLinesList[Line - 1]).split(T_CHAR_HASH)[0]
+            BeforeHashPart = str(
+                self.Profile.FileLinesList[Line - 1]).split(T_CHAR_HASH)[0]
             if BeforeHashPart.rstrip().endswith(T_CHAR_COMMA) or BeforeHashPart.rstrip().endswith(';'):
                 return
 
@@ -235,9 +240,10 @@ class CodeFragmentCollector:
         if str(self.Profile.FileLinesList[Line]).lstrip().startswith(',') or str(self.Profile.FileLinesList[Line]).lstrip().startswith(';'):
             return
 
-        self.Profile.FileLinesList[Line - 1].insert(self.CurrentOffsetWithinLine, ',')
+        self.Profile.FileLinesList[Line -
+                                   1].insert(self.CurrentOffsetWithinLine, ',')
 
-    ## PreprocessFile() method
+    # PreprocessFile() method
     #
     #   Preprocess file contents, replace comments with spaces.
     #   In the end, rewind the file buffer pointer to the beginning
@@ -259,7 +265,8 @@ class CodeFragmentCollector:
         InString = False
         InCharLiteral = False
 
-        self.Profile.FileLinesList = [list(s) for s in self.Profile.FileLinesListFromFile]
+        self.Profile.FileLinesList = [
+            list(s) for s in self.Profile.FileLinesListFromFile]
         while not self.__EndOfFile():
 
             if not InComment and self.__CurrentChar() == T_CHAR_DOUBLE_QUOTE:
@@ -276,7 +283,8 @@ class CodeFragmentCollector:
                     else:
                         PPExtend = False
 
-                EndLinePos = (self.CurrentLineNumber, self.CurrentOffsetWithinLine)
+                EndLinePos = (self.CurrentLineNumber,
+                              self.CurrentOffsetWithinLine)
 
                 if InComment and DoubleSlashComment:
                     InComment = False
@@ -297,7 +305,8 @@ class CodeFragmentCollector:
                     CurrentLine = "".join(self.__CurrentLine())
                     if CurrentLine.rstrip(T_CHAR_LF).rstrip(T_CHAR_CR).endswith(T_CHAR_BACKSLASH):
                         SlashIndex = CurrentLine.rindex(T_CHAR_BACKSLASH)
-                        self.__SetCharValue(self.CurrentLineNumber, SlashIndex, T_CHAR_SPACE)
+                        self.__SetCharValue(
+                            self.CurrentLineNumber, SlashIndex, T_CHAR_SPACE)
 
                 if InComment and not DoubleSlashComment and not HashComment:
                     CommentObj.Content += T_CHAR_LF
@@ -310,7 +319,8 @@ class CodeFragmentCollector:
                 self.__GetOneChar()
                 CommentObj.Content += self.__CurrentChar()
 #                self.__SetCurrentCharValue(T_CHAR_SPACE)
-                CommentObj.EndPos = (self.CurrentLineNumber, self.CurrentOffsetWithinLine)
+                CommentObj.EndPos = (self.CurrentLineNumber,
+                                     self.CurrentOffsetWithinLine)
                 FileProfile.CommentList.append(CommentObj)
                 CommentObj = None
                 self.__GetOneChar()
@@ -322,7 +332,8 @@ class CodeFragmentCollector:
                     if self.__CurrentChar() == T_CHAR_SLASH and self.__NextChar() == T_CHAR_SLASH:
                         InComment = False
                         HashComment = False
-                        PPDirectiveObj.EndPos = (self.CurrentLineNumber, self.CurrentOffsetWithinLine - 1)
+                        PPDirectiveObj.EndPos = (
+                            self.CurrentLineNumber, self.CurrentOffsetWithinLine - 1)
                         FileProfile.PPDirectiveList.append(PPDirectiveObj)
                         PPDirectiveObj = None
                         continue
@@ -338,15 +349,18 @@ class CodeFragmentCollector:
             elif self.__CurrentChar() == T_CHAR_SLASH and self.__NextChar() == T_CHAR_SLASH:
                 InComment = True
                 DoubleSlashComment = True
-                CommentObj = Comment('', (self.CurrentLineNumber, self.CurrentOffsetWithinLine), None, T_COMMENT_TWO_SLASH)
+                CommentObj = Comment(
+                    '', (self.CurrentLineNumber, self.CurrentOffsetWithinLine), None, T_COMMENT_TWO_SLASH)
             # check for '#' comment
             elif self.__CurrentChar() == T_CHAR_HASH and not InString and not InCharLiteral:
                 InComment = True
                 HashComment = True
-                PPDirectiveObj = PP_Directive('', (self.CurrentLineNumber, self.CurrentOffsetWithinLine), None)
+                PPDirectiveObj = PP_Directive(
+                    '', (self.CurrentLineNumber, self.CurrentOffsetWithinLine), None)
             # check for /* comment start
             elif self.__CurrentChar() == T_CHAR_SLASH and self.__NextChar() == T_CHAR_STAR:
-                CommentObj = Comment('', (self.CurrentLineNumber, self.CurrentOffsetWithinLine), None, T_COMMENT_SLASH_STAR)
+                CommentObj = Comment(
+                    '', (self.CurrentLineNumber, self.CurrentOffsetWithinLine), None, T_COMMENT_SLASH_STAR)
                 CommentObj.Content += self.__CurrentChar()
 #                self.__SetCurrentCharValue( T_CHAR_SPACE)
                 self.__GetOneChar()
@@ -381,7 +395,8 @@ class CodeFragmentCollector:
         InString = False
         InCharLiteral = False
 
-        self.Profile.FileLinesList = [list(s) for s in self.Profile.FileLinesListFromFile]
+        self.Profile.FileLinesList = [
+            list(s) for s in self.Profile.FileLinesListFromFile]
         while not self.__EndOfFile():
 
             if not InComment and self.__CurrentChar() == T_CHAR_DOUBLE_QUOTE:
@@ -398,7 +413,8 @@ class CodeFragmentCollector:
                     else:
                         PPExtend = False
 
-                EndLinePos = (self.CurrentLineNumber, self.CurrentOffsetWithinLine)
+                EndLinePos = (self.CurrentLineNumber,
+                              self.CurrentOffsetWithinLine)
 
                 if InComment and DoubleSlashComment:
                     InComment = False
@@ -419,7 +435,8 @@ class CodeFragmentCollector:
                     CurrentLine = "".join(self.__CurrentLine())
                     if CurrentLine.rstrip(T_CHAR_LF).rstrip(T_CHAR_CR).endswith(T_CHAR_BACKSLASH):
                         SlashIndex = CurrentLine.rindex(T_CHAR_BACKSLASH)
-                        self.__SetCharValue(self.CurrentLineNumber, SlashIndex, T_CHAR_SPACE)
+                        self.__SetCharValue(
+                            self.CurrentLineNumber, SlashIndex, T_CHAR_SPACE)
 
                 if InComment and not DoubleSlashComment and not HashComment:
                     CommentObj.Content += T_CHAR_LF
@@ -432,7 +449,8 @@ class CodeFragmentCollector:
                 self.__GetOneChar()
                 CommentObj.Content += self.__CurrentChar()
                 self.__SetCurrentCharValue(T_CHAR_SPACE)
-                CommentObj.EndPos = (self.CurrentLineNumber, self.CurrentOffsetWithinLine)
+                CommentObj.EndPos = (self.CurrentLineNumber,
+                                     self.CurrentOffsetWithinLine)
                 FileProfile.CommentList.append(CommentObj)
                 CommentObj = None
                 self.__GetOneChar()
@@ -444,7 +462,8 @@ class CodeFragmentCollector:
                     if self.__CurrentChar() == T_CHAR_SLASH and self.__NextChar() == T_CHAR_SLASH:
                         InComment = False
                         HashComment = False
-                        PPDirectiveObj.EndPos = (self.CurrentLineNumber, self.CurrentOffsetWithinLine - 1)
+                        PPDirectiveObj.EndPos = (
+                            self.CurrentLineNumber, self.CurrentOffsetWithinLine - 1)
                         FileProfile.PPDirectiveList.append(PPDirectiveObj)
                         PPDirectiveObj = None
                         continue
@@ -460,20 +479,23 @@ class CodeFragmentCollector:
             elif self.__CurrentChar() == T_CHAR_SLASH and self.__NextChar() == T_CHAR_SLASH:
                 InComment = True
                 DoubleSlashComment = True
-                CommentObj = Comment('', (self.CurrentLineNumber, self.CurrentOffsetWithinLine), None, T_COMMENT_TWO_SLASH)
+                CommentObj = Comment(
+                    '', (self.CurrentLineNumber, self.CurrentOffsetWithinLine), None, T_COMMENT_TWO_SLASH)
             # check for '#' comment
             elif self.__CurrentChar() == T_CHAR_HASH and not InString and not InCharLiteral:
                 InComment = True
                 HashComment = True
-                PPDirectiveObj = PP_Directive('', (self.CurrentLineNumber, self.CurrentOffsetWithinLine), None)
+                PPDirectiveObj = PP_Directive(
+                    '', (self.CurrentLineNumber, self.CurrentOffsetWithinLine), None)
             # check for /* comment start
             elif self.__CurrentChar() == T_CHAR_SLASH and self.__NextChar() == T_CHAR_STAR:
-                CommentObj = Comment('', (self.CurrentLineNumber, self.CurrentOffsetWithinLine), None, T_COMMENT_SLASH_STAR)
+                CommentObj = Comment(
+                    '', (self.CurrentLineNumber, self.CurrentOffsetWithinLine), None, T_COMMENT_SLASH_STAR)
                 CommentObj.Content += self.__CurrentChar()
-                self.__SetCurrentCharValue( T_CHAR_SPACE)
+                self.__SetCurrentCharValue(T_CHAR_SPACE)
                 self.__GetOneChar()
                 CommentObj.Content += self.__CurrentChar()
-                self.__SetCurrentCharValue( T_CHAR_SPACE)
+                self.__SetCurrentCharValue(T_CHAR_SPACE)
                 self.__GetOneChar()
                 InComment = True
             else:
@@ -489,7 +511,7 @@ class CodeFragmentCollector:
             FileProfile.PPDirectiveList.append(PPDirectiveObj)
         self.Rewind()
 
-    ## ParseFile() method
+    # ParseFile() method
     #
     #   Parse the file profile buffer to extract fd, fv ... information
     #   Exception will be raised if syntax error found
@@ -499,13 +521,15 @@ class CodeFragmentCollector:
     def ParseFile(self):
         self.PreprocessFile()
         # restore from ListOfList to ListOfString
-        self.Profile.FileLinesList = ["".join(list) for list in self.Profile.FileLinesList]
+        self.Profile.FileLinesList = [
+            "".join(list) for list in self.Profile.FileLinesList]
         FileStringContents = ''
         for fileLine in self.Profile.FileLinesList:
             FileStringContents += fileLine
         for Token in self.TokenReleaceList:
             if Token in FileStringContents:
-                FileStringContents = FileStringContents.replace(Token, 'TOKENSTRING')
+                FileStringContents = FileStringContents.replace(
+                    Token, 'TOKENSTRING')
         cStream = antlr.InputStream(FileStringContents)
         lexer = CLexer(cStream)
         tStream = antlr.CommonTokenStream(lexer)
@@ -515,7 +539,8 @@ class CodeFragmentCollector:
     def ParseFileWithClearedPPDirective(self):
         self.PreprocessFileWithClear()
         # restore from ListOfList to ListOfString
-        self.Profile.FileLinesList = ["".join(list) for list in self.Profile.FileLinesList]
+        self.Profile.FileLinesList = [
+            "".join(list) for list in self.Profile.FileLinesList]
         FileStringContents = ''
         for fileLine in self.Profile.FileLinesList:
             FileStringContents += fileLine
@@ -556,13 +581,14 @@ class CodeFragmentCollector:
         print('/********* VARIABLE DECLARATIONS ********/')
         print('/****************************************/')
         for var in FileProfile.VariableDeclarationList:
-            print(str(var.StartPos) + var.Modifier + ' '+ var.Declarator)
+            print(str(var.StartPos) + var.Modifier + ' ' + var.Declarator)
 
         print('/****************************************/')
         print('/********* FUNCTION DEFINITIONS *********/')
         print('/****************************************/')
         for func in FileProfile.FunctionDefinitionList:
-            print(str(func.StartPos) + func.Modifier + ' '+ func.Declarator + ' ' + str(func.NamePos))
+            print(str(func.StartPos) + func.Modifier + ' ' +
+                  func.Declarator + ' ' + str(func.NamePos))
 
         print('/****************************************/')
         print('/************ ENUMERATIONS **************/')
@@ -588,6 +614,7 @@ class CodeFragmentCollector:
         for typedef in FileProfile.TypedefDefinitionList:
             print(str(typedef.StartPos) + typedef.ToType)
 
+
 if __name__ == "__main__":
 
     collector = CodeFragmentCollector(sys.argv[1])
diff --git a/BaseTools/Source/Python/Ecc/Configuration.py b/BaseTools/Source/Python/Ecc/Configuration.py
index 9d9feaca5eb6..c5ae3acfb8b0 100644
--- a/BaseTools/Source/Python/Ecc/Configuration.py
+++ b/BaseTools/Source/Python/Ecc/Configuration.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define class Configuration
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -17,129 +17,131 @@ from Common.LongFilePathSupport import OpenLongFilePath as open
 
 _ConfigFileToInternalTranslation = {
     # not same
-    "ModifierList":"ModifierSet",
+    "ModifierList": "ModifierSet",
 
     # same
     # please keep this in correct alphabetical order.
-    "AutoCorrect":"AutoCorrect",
-    "BinaryExtList":"BinaryExtList",
-    "CFunctionLayoutCheckAll":"CFunctionLayoutCheckAll",
-    "CFunctionLayoutCheckDataDeclaration":"CFunctionLayoutCheckDataDeclaration",
-    "CFunctionLayoutCheckFunctionBody":"CFunctionLayoutCheckFunctionBody",
-    "CFunctionLayoutCheckFunctionName":"CFunctionLayoutCheckFunctionName",
-    "CFunctionLayoutCheckFunctionPrototype":"CFunctionLayoutCheckFunctionPrototype",
-    "CFunctionLayoutCheckNoInitOfVariable":"CFunctionLayoutCheckNoInitOfVariable",
-    "CFunctionLayoutCheckNoStatic":"CFunctionLayoutCheckNoStatic",
-    "CFunctionLayoutCheckOptionalFunctionalModifier":"CFunctionLayoutCheckOptionalFunctionalModifier",
-    "CFunctionLayoutCheckReturnType":"CFunctionLayoutCheckReturnType",
-    "CheckAll":"CheckAll",
-    "Copyright":"Copyright",
-    "DeclarationDataTypeCheckAll":"DeclarationDataTypeCheckAll",
-    "DeclarationDataTypeCheckEFIAPIModifier":"DeclarationDataTypeCheckEFIAPIModifier",
-    "DeclarationDataTypeCheckEnumeratedType":"DeclarationDataTypeCheckEnumeratedType",
-    "DeclarationDataTypeCheckInOutModifier":"DeclarationDataTypeCheckInOutModifier",
-    "DeclarationDataTypeCheckNoUseCType":"DeclarationDataTypeCheckNoUseCType",
-    "DeclarationDataTypeCheckSameStructure":"DeclarationDataTypeCheckSameStructure",
-    "DeclarationDataTypeCheckStructureDeclaration":"DeclarationDataTypeCheckStructureDeclaration",
-    "DeclarationDataTypeCheckUnionType":"DeclarationDataTypeCheckUnionType",
-    "DoxygenCheckAll":"DoxygenCheckAll",
-    "DoxygenCheckCommand":"DoxygenCheckCommand",
-    "DoxygenCheckCommentDescription":"DoxygenCheckCommentDescription",
-    "DoxygenCheckCommentFormat":"DoxygenCheckCommentFormat",
-    "DoxygenCheckFileHeader":"DoxygenCheckFileHeader",
-    "DoxygenCheckFunctionHeader":"DoxygenCheckFunctionHeader",
-    "GeneralCheckAll":"GeneralCheckAll",
-    "GeneralCheckCarriageReturn":"GeneralCheckCarriageReturn",
-    "GeneralCheckFileExistence":"GeneralCheckFileExistence",
-    "GeneralCheckIndentation":"GeneralCheckIndentation",
-    "GeneralCheckIndentationWidth":"GeneralCheckIndentationWidth",
-    "GeneralCheckLine":"GeneralCheckLine",
-    "GeneralCheckLineEnding":"GeneralCheckLineEnding",
-    "GeneralCheckLineWidth":"GeneralCheckLineWidth",
-    "GeneralCheckNoProgma":"GeneralCheckNoProgma",
-    "GeneralCheckNoTab":"GeneralCheckNoTab",
-    "GeneralCheckNo_Asm":"GeneralCheckNo_Asm",
-    "GeneralCheckNonAcsii":"GeneralCheckNonAcsii",
-    "GeneralCheckTabWidth":"GeneralCheckTabWidth",
-    "GeneralCheckTrailingWhiteSpaceLine":"GeneralCheckTrailingWhiteSpaceLine",
-    "GeneralCheckUni":"GeneralCheckUni",
-    "HeaderCheckAll":"HeaderCheckAll",
-    "HeaderCheckCFileCommentLicenseFormat":"HeaderCheckCFileCommentLicenseFormat",
-    "HeaderCheckCFileCommentReferenceFormat":"HeaderCheckCFileCommentReferenceFormat",
-    "HeaderCheckCFileCommentStartSpacesNum":"HeaderCheckCFileCommentStartSpacesNum",
-    "HeaderCheckFile":"HeaderCheckFile",
-    "HeaderCheckFileCommentEnd":"HeaderCheckFileCommentEnd",
-    "HeaderCheckFunction":"HeaderCheckFunction",
-    "IncludeFileCheckAll":"IncludeFileCheckAll",
-    "IncludeFileCheckData":"IncludeFileCheckData",
-    "IncludeFileCheckIfndefStatement":"IncludeFileCheckIfndefStatement",
-    "IncludeFileCheckSameName":"IncludeFileCheckSameName",
-    "MetaDataFileCheckAll":"MetaDataFileCheckAll",
-    "MetaDataFileCheckBinaryInfInFdf":"MetaDataFileCheckBinaryInfInFdf",
-    "MetaDataFileCheckGenerateFileList":"MetaDataFileCheckGenerateFileList",
-    "MetaDataFileCheckGuidDuplicate":"MetaDataFileCheckGuidDuplicate",
-    "MetaDataFileCheckLibraryDefinedInDec":"MetaDataFileCheckLibraryDefinedInDec",
-    "MetaDataFileCheckLibraryInstance":"MetaDataFileCheckLibraryInstance",
-    "MetaDataFileCheckLibraryInstanceDependent":"MetaDataFileCheckLibraryInstanceDependent",
-    "MetaDataFileCheckLibraryInstanceOrder":"MetaDataFileCheckLibraryInstanceOrder",
-    "MetaDataFileCheckLibraryNoUse":"MetaDataFileCheckLibraryNoUse",
-    "MetaDataFileCheckModuleFileGuidDuplication":"MetaDataFileCheckModuleFileGuidDuplication",
-    "MetaDataFileCheckModuleFileGuidFormat":"MetaDataFileCheckModuleFileGuidFormat",
-    "MetaDataFileCheckModuleFileNoUse":"MetaDataFileCheckModuleFileNoUse",
-    "MetaDataFileCheckModuleFilePcdFormat":"MetaDataFileCheckModuleFilePcdFormat",
-    "MetaDataFileCheckModuleFilePpiFormat":"MetaDataFileCheckModuleFilePpiFormat",
-    "MetaDataFileCheckModuleFileProtocolFormat":"MetaDataFileCheckModuleFileProtocolFormat",
-    "MetaDataFileCheckPathName":"MetaDataFileCheckPathName",
-    "MetaDataFileCheckPathOfGenerateFileList":"MetaDataFileCheckPathOfGenerateFileList",
-    "MetaDataFileCheckPcdDuplicate":"MetaDataFileCheckPcdDuplicate",
-    "MetaDataFileCheckPcdFlash":"MetaDataFileCheckPcdFlash",
-    "MetaDataFileCheckPcdNoUse":"MetaDataFileCheckPcdNoUse",
-    "MetaDataFileCheckPcdType":"MetaDataFileCheckPcdType",
-    "NamingConventionCheckAll":"NamingConventionCheckAll",
-    "NamingConventionCheckDefineStatement":"NamingConventionCheckDefineStatement",
-    "NamingConventionCheckFunctionName":"NamingConventionCheckFunctionName",
-    "NamingConventionCheckIfndefStatement":"NamingConventionCheckIfndefStatement",
-    "NamingConventionCheckPathName":"NamingConventionCheckPathName",
-    "NamingConventionCheckSingleCharacterVariable":"NamingConventionCheckSingleCharacterVariable",
-    "NamingConventionCheckTypedefStatement":"NamingConventionCheckTypedefStatement",
-    "NamingConventionCheckVariableName":"NamingConventionCheckVariableName",
-    "PredicateExpressionCheckAll":"PredicateExpressionCheckAll",
-    "PredicateExpressionCheckBooleanValue":"PredicateExpressionCheckBooleanValue",
-    "PredicateExpressionCheckComparisonNullType":"PredicateExpressionCheckComparisonNullType",
-    "PredicateExpressionCheckNonBooleanOperator":"PredicateExpressionCheckNonBooleanOperator",
-    "ScanOnlyDirList":"ScanOnlyDirList",
-    "SkipDirList":"SkipDirList",
-    "SkipFileList":"SkipFileList",
-    "SmmCommParaCheckAll":"SmmCommParaCheckAll",
-    "SmmCommParaCheckBufferType":"SmmCommParaCheckBufferType",
-    "SpaceCheckAll":"SpaceCheckAll",
-    "SpellingCheckAll":"SpellingCheckAll",
-    "TokenReleaceList":"TokenReleaceList",
-    "UniCheckAll":"UniCheckAll",
-    "UniCheckHelpInfo":"UniCheckHelpInfo",
-    "UniCheckPCDInfo":"UniCheckPCDInfo",
-    "Version":"Version"
-    }
+    "AutoCorrect": "AutoCorrect",
+    "BinaryExtList": "BinaryExtList",
+    "CFunctionLayoutCheckAll": "CFunctionLayoutCheckAll",
+    "CFunctionLayoutCheckDataDeclaration": "CFunctionLayoutCheckDataDeclaration",
+    "CFunctionLayoutCheckFunctionBody": "CFunctionLayoutCheckFunctionBody",
+    "CFunctionLayoutCheckFunctionName": "CFunctionLayoutCheckFunctionName",
+    "CFunctionLayoutCheckFunctionPrototype": "CFunctionLayoutCheckFunctionPrototype",
+    "CFunctionLayoutCheckNoInitOfVariable": "CFunctionLayoutCheckNoInitOfVariable",
+    "CFunctionLayoutCheckNoStatic": "CFunctionLayoutCheckNoStatic",
+    "CFunctionLayoutCheckOptionalFunctionalModifier": "CFunctionLayoutCheckOptionalFunctionalModifier",
+    "CFunctionLayoutCheckReturnType": "CFunctionLayoutCheckReturnType",
+    "CheckAll": "CheckAll",
+    "Copyright": "Copyright",
+    "DeclarationDataTypeCheckAll": "DeclarationDataTypeCheckAll",
+    "DeclarationDataTypeCheckEFIAPIModifier": "DeclarationDataTypeCheckEFIAPIModifier",
+    "DeclarationDataTypeCheckEnumeratedType": "DeclarationDataTypeCheckEnumeratedType",
+    "DeclarationDataTypeCheckInOutModifier": "DeclarationDataTypeCheckInOutModifier",
+    "DeclarationDataTypeCheckNoUseCType": "DeclarationDataTypeCheckNoUseCType",
+    "DeclarationDataTypeCheckSameStructure": "DeclarationDataTypeCheckSameStructure",
+    "DeclarationDataTypeCheckStructureDeclaration": "DeclarationDataTypeCheckStructureDeclaration",
+    "DeclarationDataTypeCheckUnionType": "DeclarationDataTypeCheckUnionType",
+    "DoxygenCheckAll": "DoxygenCheckAll",
+    "DoxygenCheckCommand": "DoxygenCheckCommand",
+    "DoxygenCheckCommentDescription": "DoxygenCheckCommentDescription",
+    "DoxygenCheckCommentFormat": "DoxygenCheckCommentFormat",
+    "DoxygenCheckFileHeader": "DoxygenCheckFileHeader",
+    "DoxygenCheckFunctionHeader": "DoxygenCheckFunctionHeader",
+    "GeneralCheckAll": "GeneralCheckAll",
+    "GeneralCheckCarriageReturn": "GeneralCheckCarriageReturn",
+    "GeneralCheckFileExistence": "GeneralCheckFileExistence",
+    "GeneralCheckIndentation": "GeneralCheckIndentation",
+    "GeneralCheckIndentationWidth": "GeneralCheckIndentationWidth",
+    "GeneralCheckLine": "GeneralCheckLine",
+    "GeneralCheckLineEnding": "GeneralCheckLineEnding",
+    "GeneralCheckLineWidth": "GeneralCheckLineWidth",
+    "GeneralCheckNoProgma": "GeneralCheckNoProgma",
+    "GeneralCheckNoTab": "GeneralCheckNoTab",
+    "GeneralCheckNo_Asm": "GeneralCheckNo_Asm",
+    "GeneralCheckNonAcsii": "GeneralCheckNonAcsii",
+    "GeneralCheckTabWidth": "GeneralCheckTabWidth",
+    "GeneralCheckTrailingWhiteSpaceLine": "GeneralCheckTrailingWhiteSpaceLine",
+    "GeneralCheckUni": "GeneralCheckUni",
+    "HeaderCheckAll": "HeaderCheckAll",
+    "HeaderCheckCFileCommentLicenseFormat": "HeaderCheckCFileCommentLicenseFormat",
+    "HeaderCheckCFileCommentReferenceFormat": "HeaderCheckCFileCommentReferenceFormat",
+    "HeaderCheckCFileCommentStartSpacesNum": "HeaderCheckCFileCommentStartSpacesNum",
+    "HeaderCheckFile": "HeaderCheckFile",
+    "HeaderCheckFileCommentEnd": "HeaderCheckFileCommentEnd",
+    "HeaderCheckFunction": "HeaderCheckFunction",
+    "IncludeFileCheckAll": "IncludeFileCheckAll",
+    "IncludeFileCheckData": "IncludeFileCheckData",
+    "IncludeFileCheckIfndefStatement": "IncludeFileCheckIfndefStatement",
+    "IncludeFileCheckSameName": "IncludeFileCheckSameName",
+    "MetaDataFileCheckAll": "MetaDataFileCheckAll",
+    "MetaDataFileCheckBinaryInfInFdf": "MetaDataFileCheckBinaryInfInFdf",
+    "MetaDataFileCheckGenerateFileList": "MetaDataFileCheckGenerateFileList",
+    "MetaDataFileCheckGuidDuplicate": "MetaDataFileCheckGuidDuplicate",
+    "MetaDataFileCheckLibraryDefinedInDec": "MetaDataFileCheckLibraryDefinedInDec",
+    "MetaDataFileCheckLibraryInstance": "MetaDataFileCheckLibraryInstance",
+    "MetaDataFileCheckLibraryInstanceDependent": "MetaDataFileCheckLibraryInstanceDependent",
+    "MetaDataFileCheckLibraryInstanceOrder": "MetaDataFileCheckLibraryInstanceOrder",
+    "MetaDataFileCheckLibraryNoUse": "MetaDataFileCheckLibraryNoUse",
+    "MetaDataFileCheckModuleFileGuidDuplication": "MetaDataFileCheckModuleFileGuidDuplication",
+    "MetaDataFileCheckModuleFileGuidFormat": "MetaDataFileCheckModuleFileGuidFormat",
+    "MetaDataFileCheckModuleFileNoUse": "MetaDataFileCheckModuleFileNoUse",
+    "MetaDataFileCheckModuleFilePcdFormat": "MetaDataFileCheckModuleFilePcdFormat",
+    "MetaDataFileCheckModuleFilePpiFormat": "MetaDataFileCheckModuleFilePpiFormat",
+    "MetaDataFileCheckModuleFileProtocolFormat": "MetaDataFileCheckModuleFileProtocolFormat",
+    "MetaDataFileCheckPathName": "MetaDataFileCheckPathName",
+    "MetaDataFileCheckPathOfGenerateFileList": "MetaDataFileCheckPathOfGenerateFileList",
+    "MetaDataFileCheckPcdDuplicate": "MetaDataFileCheckPcdDuplicate",
+    "MetaDataFileCheckPcdFlash": "MetaDataFileCheckPcdFlash",
+    "MetaDataFileCheckPcdNoUse": "MetaDataFileCheckPcdNoUse",
+    "MetaDataFileCheckPcdType": "MetaDataFileCheckPcdType",
+    "NamingConventionCheckAll": "NamingConventionCheckAll",
+    "NamingConventionCheckDefineStatement": "NamingConventionCheckDefineStatement",
+    "NamingConventionCheckFunctionName": "NamingConventionCheckFunctionName",
+    "NamingConventionCheckIfndefStatement": "NamingConventionCheckIfndefStatement",
+    "NamingConventionCheckPathName": "NamingConventionCheckPathName",
+    "NamingConventionCheckSingleCharacterVariable": "NamingConventionCheckSingleCharacterVariable",
+    "NamingConventionCheckTypedefStatement": "NamingConventionCheckTypedefStatement",
+    "NamingConventionCheckVariableName": "NamingConventionCheckVariableName",
+    "PredicateExpressionCheckAll": "PredicateExpressionCheckAll",
+    "PredicateExpressionCheckBooleanValue": "PredicateExpressionCheckBooleanValue",
+    "PredicateExpressionCheckComparisonNullType": "PredicateExpressionCheckComparisonNullType",
+    "PredicateExpressionCheckNonBooleanOperator": "PredicateExpressionCheckNonBooleanOperator",
+    "ScanOnlyDirList": "ScanOnlyDirList",
+    "SkipDirList": "SkipDirList",
+    "SkipFileList": "SkipFileList",
+    "SmmCommParaCheckAll": "SmmCommParaCheckAll",
+    "SmmCommParaCheckBufferType": "SmmCommParaCheckBufferType",
+    "SpaceCheckAll": "SpaceCheckAll",
+    "SpellingCheckAll": "SpellingCheckAll",
+    "TokenReleaceList": "TokenReleaceList",
+    "UniCheckAll": "UniCheckAll",
+    "UniCheckHelpInfo": "UniCheckHelpInfo",
+    "UniCheckPCDInfo": "UniCheckPCDInfo",
+    "Version": "Version"
+}
 
-## Configuration
+# Configuration
 #
 # This class is used to define all items in configuration file
 #
 # @param Filename:  The name of configuration file, the default is config.ini
 #
+
+
 class Configuration(object):
     def __init__(self, Filename):
         self.Filename = Filename
 
         self.Version = 0.1
 
-        ## Identify to if check all items
+        # Identify to if check all items
         # 1 - Check all items and ignore all other detailed items
         # 0 - Not check all items, the tool will go through all other detailed items to decide to check or not
         #
         self.CheckAll = 0
 
-        ## Identify to if automatically correct mistakes
+        # Identify to if automatically correct mistakes
         # 1 - Automatically correct
         # 0 - Not automatically correct
         # Only the following check points can be automatically corrected, others not listed below are not supported even it is 1
@@ -156,7 +158,7 @@ class Configuration(object):
         # Defaultly use the definition in class DataType
         self.ModifierSet = MODIFIER_SET
 
-        ## General Checking
+        # General Checking
         self.GeneralCheckAll = 0
 
         # Check whether NO Tab is used, replaced with spaces
@@ -190,10 +192,10 @@ class Configuration(object):
 
         self.CFunctionLayoutCheckNoDeprecated = 1
 
-        ## Space Checking
+        # Space Checking
         self.SpaceCheckAll = 1
 
-        ## Predicate Expression Checking
+        # Predicate Expression Checking
         self.PredicateExpressionCheckAll = 0
 
         # Check whether Boolean values, variable type BOOLEAN not use explicit comparisons to TRUE or FALSE
@@ -203,7 +205,7 @@ class Configuration(object):
         # Check whether a comparison of any pointer to zero must be done via the NULL type
         self.PredicateExpressionCheckComparisonNullType = 1
 
-        ## Headers Checking
+        # Headers Checking
         self.HeaderCheckAll = 0
 
         # Check whether File header exists
@@ -219,7 +221,7 @@ class Configuration(object):
         # Check whether C File header Comment have the License immediately after the ""Copyright"" line
         self.HeaderCheckCFileCommentLicenseFormat = 1
 
-        ## C Function Layout Checking
+        # C Function Layout Checking
         self.CFunctionLayoutCheckAll = 0
 
         # Check whether return type exists and in the first line
@@ -240,10 +242,10 @@ class Configuration(object):
         # Check whether no use of STATIC for functions
         self.CFunctionLayoutCheckNoStatic = 1
 
-        ## Include Files Checking
+        # Include Files Checking
         self.IncludeFileCheckAll = 0
 
-        #Check whether having include files with same name
+        # Check whether having include files with same name
         self.IncludeFileCheckSameName = 1
         # Check whether all include file contents is guarded by a #ifndef statement.
         # the #ifndef must be the first line of code following the file header comment
@@ -253,7 +255,7 @@ class Configuration(object):
         # Check whether include files NOT contain code or define data variables
         self.IncludeFileCheckData = 1
 
-        ## Declarations and Data Types Checking
+        # Declarations and Data Types Checking
         self.DeclarationDataTypeCheckAll = 0
 
         # Check whether no use of int, unsigned, char, void, long in any .c, .h or .asl files.
@@ -271,7 +273,7 @@ class Configuration(object):
         # Check whether Union Type has a 'typedef' and the name is capital
         self.DeclarationDataTypeCheckUnionType = 1
 
-        ## Naming Conventions Checking
+        # Naming Conventions Checking
         self.NamingConventionCheckAll = 0
 
         # Check whether only capital letters are used for #define declarations
@@ -293,7 +295,7 @@ class Configuration(object):
         # Check whether NO use short variable name with single character
         self.NamingConventionCheckSingleCharacterVariable = 1
 
-        ## Doxygen Checking
+        # Doxygen Checking
         self.DoxygenCheckAll = 0
 
         # Check whether the file headers are followed Doxygen special documentation blocks in section 2.3.5
@@ -308,7 +310,7 @@ class Configuration(object):
         # Check whether only Doxygen commands allowed to mark the code are @bug and @todo.
         self.DoxygenCheckCommand = 1
 
-        ## Meta-Data File Processing Checking
+        # Meta-Data File Processing Checking
         self.MetaDataFileCheckAll = 0
 
         # Check whether each file defined in meta-data exists
@@ -398,7 +400,8 @@ class Configuration(object):
         Filepath = os.path.normpath(self.Filename)
         if not os.path.isfile(Filepath):
             ErrorMsg = "Can't find configuration file '%s'" % Filepath
-            EdkLogger.error("Ecc", EdkLogger.ECC_ERROR, ErrorMsg, File = Filepath)
+            EdkLogger.error("Ecc", EdkLogger.ECC_ERROR,
+                            ErrorMsg, File=Filepath)
 
         LineNo = 0
         for Line in open(Filepath, 'r'):
@@ -408,8 +411,10 @@ class Configuration(object):
                 List = GetSplitValueList(Line, TAB_EQUAL_SPLIT)
                 if List[0] not in _ConfigFileToInternalTranslation:
                     ErrorMsg = "Invalid configuration option '%s' was found" % List[0]
-                    EdkLogger.error("Ecc", EdkLogger.ECC_ERROR, ErrorMsg, File = Filepath, Line = LineNo)
-                assert _ConfigFileToInternalTranslation[List[0]] in self.__dict__
+                    EdkLogger.error("Ecc", EdkLogger.ECC_ERROR,
+                                    ErrorMsg, File=Filepath, Line=LineNo)
+                assert _ConfigFileToInternalTranslation[List[0]
+                                                        ] in self.__dict__
                 if List[0] == 'ModifierList':
                     List[1] = GetSplitValueList(List[1], TAB_COMMA_SPLIT)
                 if List[0] == 'MetaDataFileCheckPathOfGenerateFileList' and List[1] == "":
@@ -424,13 +429,15 @@ class Configuration(object):
                     List[1] = GetSplitValueList(List[1], TAB_COMMA_SPLIT)
                 if List[0] == 'TokenReleaceList':
                     List[1] = GetSplitValueList(List[1], TAB_COMMA_SPLIT)
-                self.__dict__[_ConfigFileToInternalTranslation[List[0]]] = List[1]
+                self.__dict__[
+                    _ConfigFileToInternalTranslation[List[0]]] = List[1]
 
     def ShowMe(self):
         print(self.Filename)
         for Key in self.__dict__.keys():
             print(Key, '=', self.__dict__[Key])
 
+
 #
 # test that our dict and out class still match in contents.
 #
diff --git a/BaseTools/Source/Python/Ecc/Database.py b/BaseTools/Source/Python/Ecc/Database.py
index a5b70c52029b..c567dfbd04cf 100644
--- a/BaseTools/Source/Python/Ecc/Database.py
+++ b/BaseTools/Source/Python/Ecc/Database.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create a database used by ECC tool
 #
 # Copyright (c) 2007 - 2014, Intel Corporation. All rights reserved.<BR>
@@ -10,7 +10,8 @@
 #
 from __future__ import absolute_import
 import sqlite3
-import Common.LongFilePathOs as os, time
+import Common.LongFilePathOs as os
+import time
 
 import Common.EdkLogger as EdkLogger
 import CommonDataClass.DataClass as DataClass
@@ -31,7 +32,7 @@ from Table.TableFdf import TableFdf
 #
 DATABASE_PATH = "Ecc.db"
 
-## Database
+# Database
 #
 # This class defined the ECC database
 # During the phase of initialization, the database will create all tables and
@@ -44,6 +45,8 @@ DATABASE_PATH = "Ecc.db"
 # @var Cur:         Cursor of the connection
 # @var TblDataModel:  Local instance for TableDataModel
 #
+
+
 class Database(object):
     def __init__(self, DbPath):
         self.DbPath = DbPath
@@ -60,13 +63,13 @@ class Database(object):
         self.TblDsc = None
         self.TblFdf = None
 
-    ## Initialize ECC database
+    # Initialize ECC database
     #
     # 1. Delete all old existing tables
     # 2. Create new tables
     # 3. Initialize table DataModel
     #
-    def InitDatabase(self, NewDatabase = True):
+    def InitDatabase(self, NewDatabase=True):
         EdkLogger.verbose("\nInitialize ECC database started ...")
         #
         # Drop all old existing tables
@@ -74,7 +77,7 @@ class Database(object):
         if NewDatabase:
             if os.path.exists(self.DbPath):
                 os.remove(self.DbPath)
-        self.Conn = sqlite3.connect(self.DbPath, isolation_level = 'DEFERRED')
+        self.Conn = sqlite3.connect(self.DbPath, isolation_level='DEFERRED')
         self.Conn.execute("PRAGMA page_size=4096")
         self.Conn.execute("PRAGMA synchronous=OFF")
         # to avoid non-ascii character conversion error
@@ -127,14 +130,14 @@ class Database(object):
 
         EdkLogger.verbose("Initialize ECC database ... DONE!")
 
-    ## Query a table
+    # Query a table
     #
     # @param Table:  The instance of the table to be queried
     #
     def QueryTable(self, Table):
         Table.Query()
 
-    ## Close entire database
+    # Close entire database
     #
     # Commit all first
     # Close the connection and cursor
@@ -151,7 +154,7 @@ class Database(object):
         self.Cur.close()
         self.Conn.close()
 
-    ## Insert one file information
+    # Insert one file information
     #
     # Insert one file's information to the database
     # 1. Create a record in TableFile
@@ -165,7 +168,8 @@ class Database(object):
         #
         # Insert a record for file
         #
-        FileID = self.TblFile.Insert(File.Name, File.ExtName, File.Path, File.FullPath, Model = File.Model, TimeStamp = File.TimeStamp)
+        FileID = self.TblFile.Insert(
+            File.Name, File.ExtName, File.Path, File.FullPath, Model=File.Model, TimeStamp=File.TimeStamp)
 
         if File.Model == DataClass.MODEL_FILE_C or File.Model == DataClass.MODEL_FILE_H:
             IdTable = TableIdentifier(self.Cur)
@@ -175,47 +179,49 @@ class Database(object):
             # Insert function of file
             #
             for Function in File.FunctionList:
-                FunctionID = self.TblFunction.Insert(Function.Header, Function.Modifier, Function.Name, Function.ReturnStatement, \
-                                        Function.StartLine, Function.StartColumn, Function.EndLine, Function.EndColumn, \
-                                        Function.BodyStartLine, Function.BodyStartColumn, FileID, \
-                                        Function.FunNameStartLine, Function.FunNameStartColumn)
+                FunctionID = self.TblFunction.Insert(Function.Header, Function.Modifier, Function.Name, Function.ReturnStatement,
+                                                     Function.StartLine, Function.StartColumn, Function.EndLine, Function.EndColumn,
+                                                     Function.BodyStartLine, Function.BodyStartColumn, FileID,
+                                                     Function.FunNameStartLine, Function.FunNameStartColumn)
                 #
                 # Insert Identifier of function
                 #
                 for Identifier in Function.IdentifierList:
-                    IdentifierID = IdTable.Insert(Identifier.Modifier, Identifier.Type, Identifier.Name, Identifier.Value, Identifier.Model, \
-                                            FileID, FunctionID, Identifier.StartLine, Identifier.StartColumn, Identifier.EndLine, Identifier.EndColumn)
+                    IdentifierID = IdTable.Insert(Identifier.Modifier, Identifier.Type, Identifier.Name, Identifier.Value, Identifier.Model,
+                                                  FileID, FunctionID, Identifier.StartLine, Identifier.StartColumn, Identifier.EndLine, Identifier.EndColumn)
                 #
                 # Insert Pcd of function
                 #
                 for Pcd in Function.PcdList:
-                    PcdID = self.TblPcd.Insert(Pcd.CName, Pcd.TokenSpaceGuidCName, Pcd.Token, Pcd.DatumType, Pcd.Model, \
-                                       FileID, FunctionID, Pcd.StartLine, Pcd.StartColumn, Pcd.EndLine, Pcd.EndColumn)
+                    PcdID = self.TblPcd.Insert(Pcd.CName, Pcd.TokenSpaceGuidCName, Pcd.Token, Pcd.DatumType, Pcd.Model,
+                                               FileID, FunctionID, Pcd.StartLine, Pcd.StartColumn, Pcd.EndLine, Pcd.EndColumn)
             #
             # Insert Identifier of file
             #
             for Identifier in File.IdentifierList:
-                IdentifierID = IdTable.Insert(Identifier.Modifier, Identifier.Type, Identifier.Name, Identifier.Value, Identifier.Model, \
-                                        FileID, -1, Identifier.StartLine, Identifier.StartColumn, Identifier.EndLine, Identifier.EndColumn)
+                IdentifierID = IdTable.Insert(Identifier.Modifier, Identifier.Type, Identifier.Name, Identifier.Value, Identifier.Model,
+                                              FileID, -1, Identifier.StartLine, Identifier.StartColumn, Identifier.EndLine, Identifier.EndColumn)
             #
             # Insert Pcd of file
             #
             for Pcd in File.PcdList:
-                PcdID = self.TblPcd.Insert(Pcd.CName, Pcd.TokenSpaceGuidCName, Pcd.Token, Pcd.DatumType, Pcd.Model, \
-                                   FileID, -1, Pcd.StartLine, Pcd.StartColumn, Pcd.EndLine, Pcd.EndColumn)
+                PcdID = self.TblPcd.Insert(Pcd.CName, Pcd.TokenSpaceGuidCName, Pcd.Token, Pcd.DatumType, Pcd.Model,
+                                           FileID, -1, Pcd.StartLine, Pcd.StartColumn, Pcd.EndLine, Pcd.EndColumn)
 
-        EdkLogger.verbose("Insert information from file %s ... DONE!" % File.FullPath)
+        EdkLogger.verbose(
+            "Insert information from file %s ... DONE!" % File.FullPath)
 
-    ## UpdateIdentifierBelongsToFunction
+    # UpdateIdentifierBelongsToFunction
     #
     # Update the field "BelongsToFunction" for each Identifier
     #
     #
     def UpdateIdentifierBelongsToFunction_disabled(self):
-        EdkLogger.verbose("Update 'BelongsToFunction' for Identifiers started ...")
+        EdkLogger.verbose(
+            "Update 'BelongsToFunction' for Identifiers started ...")
 
         SqlCommand = """select ID, BelongsToFile, StartLine, EndLine, Model from Identifier"""
-        EdkLogger.debug(4, "SqlCommand: %s" %SqlCommand)
+        EdkLogger.debug(4, "SqlCommand: %s" % SqlCommand)
         self.Cur.execute(SqlCommand)
         Records = self.Cur.fetchall()
         for Record in Records:
@@ -232,12 +238,13 @@ class Database(object):
             SqlCommand = """select ID from Function
                         where StartLine < %s and EndLine > %s
                         and BelongsToFile = %s""" % (StartLine, EndLine, BelongsToFile)
-            EdkLogger.debug(4, "SqlCommand: %s" %SqlCommand)
+            EdkLogger.debug(4, "SqlCommand: %s" % SqlCommand)
             self.Cur.execute(SqlCommand)
             IDs = self.Cur.fetchall()
             for ID in IDs:
-                SqlCommand = """Update Identifier set BelongsToFunction = %s where ID = %s""" % (ID[0], IdentifierID)
-                EdkLogger.debug(4, "SqlCommand: %s" %SqlCommand)
+                SqlCommand = """Update Identifier set BelongsToFunction = %s where ID = %s""" % (
+                    ID[0], IdentifierID)
+                EdkLogger.debug(4, "SqlCommand: %s" % SqlCommand)
                 self.Cur.execute(SqlCommand)
 
             #
@@ -248,24 +255,27 @@ class Database(object):
                 SqlCommand = """select ID from Function
                         where StartLine = %s + 1
                         and BelongsToFile = %s""" % (EndLine, BelongsToFile)
-                EdkLogger.debug(4, "SqlCommand: %s" %SqlCommand)
+                EdkLogger.debug(4, "SqlCommand: %s" % SqlCommand)
                 self.Cur.execute(SqlCommand)
                 IDs = self.Cur.fetchall()
                 for ID in IDs:
-                    SqlCommand = """Update Identifier set BelongsToFunction = %s, Model = %s where ID = %s""" % (ID[0], DataClass.MODEL_IDENTIFIER_FUNCTION_HEADER, IdentifierID)
-                    EdkLogger.debug(4, "SqlCommand: %s" %SqlCommand)
+                    SqlCommand = """Update Identifier set BelongsToFunction = %s, Model = %s where ID = %s""" % (
+                        ID[0], DataClass.MODEL_IDENTIFIER_FUNCTION_HEADER, IdentifierID)
+                    EdkLogger.debug(4, "SqlCommand: %s" % SqlCommand)
                     self.Cur.execute(SqlCommand)
 
-        EdkLogger.verbose("Update 'BelongsToFunction' for Identifiers ... DONE")
+        EdkLogger.verbose(
+            "Update 'BelongsToFunction' for Identifiers ... DONE")
 
-
-    ## UpdateIdentifierBelongsToFunction
+    # UpdateIdentifierBelongsToFunction
     #
     # Update the field "BelongsToFunction" for each Identifier
     #
     #
+
     def UpdateIdentifierBelongsToFunction(self):
-        EdkLogger.verbose("Update 'BelongsToFunction' for Identifiers started ...")
+        EdkLogger.verbose(
+            "Update 'BelongsToFunction' for Identifiers started ...")
 
         SqlCommand = """select ID, BelongsToFile, StartLine, EndLine from Function"""
         Records = self.TblFunction.Exec(SqlCommand)
@@ -280,11 +290,12 @@ class Database(object):
             #Data2.append(("'file%s'" % BelongsToFile, FunctionID, DataClass.MODEL_IDENTIFIER_FUNCTION_HEADER, BelongsToFile, DataClass.MODEL_IDENTIFIER_COMMENT, StartLine - 1))
 
             SqlCommand = """Update Identifier%s set BelongsToFunction = %s where BelongsToFile = %s and StartLine > %s and EndLine < %s""" % \
-                        (BelongsToFile, FunctionID, BelongsToFile, StartLine, EndLine)
+                (BelongsToFile, FunctionID, BelongsToFile, StartLine, EndLine)
             self.TblIdentifier.Exec(SqlCommand)
 
             SqlCommand = """Update Identifier%s set BelongsToFunction = %s, Model = %s where BelongsToFile = %s and Model = %s and EndLine = %s""" % \
-                         (BelongsToFile, FunctionID, DataClass.MODEL_IDENTIFIER_FUNCTION_HEADER, BelongsToFile, DataClass.MODEL_IDENTIFIER_COMMENT, StartLine - 1)
+                         (BelongsToFile, FunctionID, DataClass.MODEL_IDENTIFIER_FUNCTION_HEADER,
+                          BelongsToFile, DataClass.MODEL_IDENTIFIER_COMMENT, StartLine - 1)
             self.TblIdentifier.Exec(SqlCommand)
 #       #
 #       # Check whether an identifier belongs to a function
@@ -313,20 +324,27 @@ class Database(object):
 #
 if __name__ == '__main__':
     EdkLogger.Initialize()
-    #EdkLogger.SetLevel(EdkLogger.VERBOSE)
+    # EdkLogger.SetLevel(EdkLogger.VERBOSE)
     EdkLogger.SetLevel(EdkLogger.DEBUG_0)
-    EdkLogger.verbose("Start at " + time.strftime('%H:%M:%S', time.localtime()))
+    EdkLogger.verbose(
+        "Start at " + time.strftime('%H:%M:%S', time.localtime()))
 
     Db = Database(DATABASE_PATH)
     Db.InitDatabase()
     Db.QueryTable(Db.TblDataModel)
 
-    identifier1 = DataClass.IdentifierClass(-1, '', '', "i''1", 'aaa', DataClass.MODEL_IDENTIFIER_COMMENT, 1, -1, 32,  43,  54,  43)
-    identifier2 = DataClass.IdentifierClass(-1, '', '', 'i1', 'aaa', DataClass.MODEL_IDENTIFIER_COMMENT, 1, -1, 15,  43,  20,  43)
-    identifier3 = DataClass.IdentifierClass(-1, '', '', 'i1', 'aaa', DataClass.MODEL_IDENTIFIER_COMMENT, 1, -1, 55,  43,  58,  43)
-    identifier4 = DataClass.IdentifierClass(-1, '', '', "i1'", 'aaa', DataClass.MODEL_IDENTIFIER_COMMENT, 1, -1, 77,  43,  88,  43)
-    fun1 = DataClass.FunctionClass(-1, '', '', 'fun1', '', 21, 2, 60,  45, 1, 23, 0, [], [])
-    file = DataClass.FileClass(-1, 'F1', 'c', 'C:\\', 'C:\\F1.exe', DataClass.MODEL_FILE_C, '2007-12-28', [fun1], [identifier1, identifier2, identifier3, identifier4], [])
+    identifier1 = DataClass.IdentifierClass(-1, '', '', "i''1", 'aaa',
+                                            DataClass.MODEL_IDENTIFIER_COMMENT, 1, -1, 32,  43,  54,  43)
+    identifier2 = DataClass.IdentifierClass(-1, '', '', 'i1', 'aaa',
+                                            DataClass.MODEL_IDENTIFIER_COMMENT, 1, -1, 15,  43,  20,  43)
+    identifier3 = DataClass.IdentifierClass(-1, '', '', 'i1', 'aaa',
+                                            DataClass.MODEL_IDENTIFIER_COMMENT, 1, -1, 55,  43,  58,  43)
+    identifier4 = DataClass.IdentifierClass(-1, '', '', "i1'", 'aaa',
+                                            DataClass.MODEL_IDENTIFIER_COMMENT, 1, -1, 77,  43,  88,  43)
+    fun1 = DataClass.FunctionClass(-1, '', '',
+                                   'fun1', '', 21, 2, 60,  45, 1, 23, 0, [], [])
+    file = DataClass.FileClass(-1, 'F1', 'c', 'C:\\', 'C:\\F1.exe', DataClass.MODEL_FILE_C,
+                               '2007-12-28', [fun1], [identifier1, identifier2, identifier3, identifier4], [])
     Db.InsertOneFile(file)
     Db.UpdateIdentifierBelongsToFunction()
 
@@ -337,4 +355,3 @@ if __name__ == '__main__':
 
     Db.Close()
     EdkLogger.verbose("End at " + time.strftime('%H:%M:%S', time.localtime()))
-
diff --git a/BaseTools/Source/Python/Ecc/EccGlobalData.py b/BaseTools/Source/Python/Ecc/EccGlobalData.py
index 89036dba071b..b9093d17664d 100644
--- a/BaseTools/Source/Python/Ecc/EccGlobalData.py
+++ b/BaseTools/Source/Python/Ecc/EccGlobalData.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to save global datas used by ECC tool
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
diff --git a/BaseTools/Source/Python/Ecc/EccMain.py b/BaseTools/Source/Python/Ecc/EccMain.py
index a349cd80147f..c5c715901ad3 100644
--- a/BaseTools/Source/Python/Ecc/EccMain.py
+++ b/BaseTools/Source/Python/Ecc/EccMain.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to be the main entrance of ECC tool
 #
 # Copyright (c) 2009 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -10,7 +10,10 @@
 # Import Modules
 #
 from __future__ import absolute_import
-import Common.LongFilePathOs as os, time, glob, sys
+import Common.LongFilePathOs as os
+import time
+import glob
+import sys
 import Common.EdkLogger as EdkLogger
 from Ecc import Database
 from Ecc import EccGlobalData
@@ -31,17 +34,20 @@ from Ecc.MetaFileWorkspace.MetaFileParser import InfParser
 from Ecc.MetaFileWorkspace.MetaFileParser import Fdf
 from Ecc.MetaFileWorkspace.MetaFileTable import MetaFileStorage
 from Ecc import c
-import re, string
+import re
+import string
 from Ecc.Exception import *
 from Common.LongFilePathSupport import OpenLongFilePath as open
 from Common.MultipleWorkspace import MultipleWorkspace as mws
 
-## Ecc
+# Ecc
 #
 # This class is used to define Ecc main entrance
 #
 # @param object:          Inherited from object class
 #
+
+
 class Ecc(object):
     def __init__(self):
         # Version and Copyright
@@ -61,9 +67,11 @@ class Ecc(object):
 
         # Parse the options and args
         self.ParseOption()
-        EdkLogger.info(time.strftime("%H:%M:%S, %b.%d %Y ", time.localtime()) + "[00:00]" + "\n")
+        EdkLogger.info(time.strftime("%H:%M:%S, %b.%d %Y ",
+                       time.localtime()) + "[00:00]" + "\n")
 
-        WorkspaceDir = os.path.normcase(os.path.normpath(os.environ["WORKSPACE"]))
+        WorkspaceDir = os.path.normcase(
+            os.path.normpath(os.environ["WORKSPACE"]))
         os.environ["WORKSPACE"] = WorkspaceDir
 
         # set multiple workspace
@@ -72,7 +80,7 @@ class Ecc(object):
 
         GlobalData.gWorkspace = WorkspaceDir
 
-        GlobalData.gGlobalDefines["WORKSPACE"]  = WorkspaceDir
+        GlobalData.gGlobalDefines["WORKSPACE"] = WorkspaceDir
 
         EdkLogger.info("Loading ECC configuration ... done")
         # Generate checkpoints list
@@ -112,11 +120,11 @@ class Ecc(object):
                 return
         self.ConfigFile = 'config.ini'
 
-
-    ## DetectOnlyScan
+    # DetectOnlyScan
     #
     # Detect whether only scanned folders have been enabled
     #
+
     def DetectOnlyScanDirs(self):
         if self.OnlyScan == True:
             OnlyScanDirs = []
@@ -126,16 +134,17 @@ class Ecc(object):
             if len(OnlyScanDirs) != 0:
                 self.BuildDatabase(OnlyScanDirs)
             else:
-                EdkLogger.error("ECC", BuildToolError.OPTION_VALUE_INVALID, ExtraData="Use -f option need to fill specific folders in config.ini file")
+                EdkLogger.error("ECC", BuildToolError.OPTION_VALUE_INVALID,
+                                ExtraData="Use -f option need to fill specific folders in config.ini file")
         else:
             self.BuildDatabase()
 
-
-    ## BuildDatabase
+    # BuildDatabase
     #
     # Build the database for target
     #
-    def BuildDatabase(self, SpeciDirs = None):
+
+    def BuildDatabase(self, SpeciDirs=None):
         # Clean report table
         EccGlobalData.gDb.TblReport.Drop()
         EccGlobalData.gDb.TblReport.Create()
@@ -151,27 +160,32 @@ class Ecc(object):
                     c.CollectSourceCodeDataIntoDB(EccGlobalData.gTarget)
                 else:
                     for specificDir in SpeciDirs:
-                        c.CollectSourceCodeDataIntoDB(os.path.join(EccGlobalData.gTarget, specificDir))
+                        c.CollectSourceCodeDataIntoDB(
+                            os.path.join(EccGlobalData.gTarget, specificDir))
 
-        EccGlobalData.gIdentifierTableList = GetTableList((MODEL_FILE_C, MODEL_FILE_H), 'Identifier', EccGlobalData.gDb)
+        EccGlobalData.gIdentifierTableList = GetTableList(
+            (MODEL_FILE_C, MODEL_FILE_H), 'Identifier', EccGlobalData.gDb)
         EccGlobalData.gCFileList = GetFileList(MODEL_FILE_C, EccGlobalData.gDb)
         EccGlobalData.gHFileList = GetFileList(MODEL_FILE_H, EccGlobalData.gDb)
-        EccGlobalData.gUFileList = GetFileList(MODEL_FILE_UNI, EccGlobalData.gDb)
+        EccGlobalData.gUFileList = GetFileList(
+            MODEL_FILE_UNI, EccGlobalData.gDb)
 
-    ## BuildMetaDataFileDatabase
+    # BuildMetaDataFileDatabase
     #
     # Build the database for meta data files
     #
-    def BuildMetaDataFileDatabase(self, SpecificDirs = None):
+    def BuildMetaDataFileDatabase(self, SpecificDirs=None):
         ScanFolders = []
         if SpecificDirs is None:
             ScanFolders.append(EccGlobalData.gTarget)
         else:
             for specificDir in SpecificDirs:
-                ScanFolders.append(os.path.join(EccGlobalData.gTarget, specificDir))
+                ScanFolders.append(os.path.join(
+                    EccGlobalData.gTarget, specificDir))
         EdkLogger.quiet("Building database for meta data files ...")
-        Op = open(EccGlobalData.gConfig.MetaDataFileCheckPathOfGenerateFileList, 'w+')
-        #SkipDirs = Read from config file
+        Op = open(
+            EccGlobalData.gConfig.MetaDataFileCheckPathOfGenerateFileList, 'w+')
+        # SkipDirs = Read from config file
         SkipDirs = EccGlobalData.gConfig.SkipDirList
         SkipDirString = '|'.join(SkipDirs)
 #         p = re.compile(r'.*[\\/](?:%s)[\\/]?.*' % SkipDirString)
@@ -195,7 +209,8 @@ class Ecc(object):
                         EdkLogger.quiet("Parsing %s" % Filename)
                         Op.write("%s\r" % Filename)
                         #Dec(Filename, True, True, EccGlobalData.gWorkspace, EccGlobalData.gDb)
-                        self.MetaFile = DecParser(Filename, MODEL_FILE_DEC, EccGlobalData.gDb.TblDec)
+                        self.MetaFile = DecParser(
+                            Filename, MODEL_FILE_DEC, EccGlobalData.gDb.TblDec)
                         self.MetaFile.Start()
                         continue
                     if len(File) > 4 and File[-4:].upper() == ".DSC":
@@ -203,7 +218,8 @@ class Ecc(object):
                         EdkLogger.quiet("Parsing %s" % Filename)
                         Op.write("%s\r" % Filename)
                         #Dsc(Filename, True, True, EccGlobalData.gWorkspace, EccGlobalData.gDb)
-                        self.MetaFile = DscParser(PathClass(Filename, Root), MODEL_FILE_DSC, MetaFileStorage(EccGlobalData.gDb.TblDsc.Cur, Filename, MODEL_FILE_DSC, True))
+                        self.MetaFile = DscParser(PathClass(Filename, Root), MODEL_FILE_DSC, MetaFileStorage(
+                            EccGlobalData.gDb.TblDsc.Cur, Filename, MODEL_FILE_DSC, True))
                         # always do post-process, in case of macros change
                         self.MetaFile.DoPostProcess()
                         self.MetaFile.Start()
@@ -214,21 +230,25 @@ class Ecc(object):
                         EdkLogger.quiet("Parsing %s" % Filename)
                         Op.write("%s\r" % Filename)
                         #Inf(Filename, True, True, EccGlobalData.gWorkspace, EccGlobalData.gDb)
-                        self.MetaFile = InfParser(Filename, MODEL_FILE_INF, EccGlobalData.gDb.TblInf)
+                        self.MetaFile = InfParser(
+                            Filename, MODEL_FILE_INF, EccGlobalData.gDb.TblInf)
                         self.MetaFile.Start()
                         continue
                     if len(File) > 4 and File[-4:].upper() == ".FDF":
                         Filename = os.path.normpath(os.path.join(Root, File))
                         EdkLogger.quiet("Parsing %s" % Filename)
                         Op.write("%s\r" % Filename)
-                        Fdf(Filename, True, EccGlobalData.gWorkspace, EccGlobalData.gDb)
+                        Fdf(Filename, True, EccGlobalData.gWorkspace,
+                            EccGlobalData.gDb)
                         continue
                     if len(File) > 4 and File[-4:].upper() == ".UNI":
                         Filename = os.path.normpath(os.path.join(Root, File))
                         EdkLogger.quiet("Parsing %s" % Filename)
                         Op.write("%s\r" % Filename)
-                        FileID = EccGlobalData.gDb.TblFile.InsertFile(Filename, MODEL_FILE_UNI)
-                        EccGlobalData.gDb.TblReport.UpdateBelongsToItemByFile(FileID, File)
+                        FileID = EccGlobalData.gDb.TblFile.InsertFile(
+                            Filename, MODEL_FILE_UNI)
+                        EccGlobalData.gDb.TblReport.UpdateBelongsToItemByFile(
+                            FileID, File)
                         continue
 
         Op.close()
@@ -292,7 +312,7 @@ class Ecc(object):
 
         return RealPath
 
-    ## ParseOption
+    # ParseOption
     #
     # Parse options
     #
@@ -309,7 +329,8 @@ class Ecc(object):
         else:
             EccGlobalData.gWorkspace = os.path.normpath(os.getenv("WORKSPACE"))
             if not os.path.exists(EccGlobalData.gWorkspace):
-                EdkLogger.error("ECC", BuildToolError.FILE_NOT_FOUND, ExtraData="WORKSPACE = %s" % EccGlobalData.gWorkspace)
+                EdkLogger.error("ECC", BuildToolError.FILE_NOT_FOUND,
+                                ExtraData="WORKSPACE = %s" % EccGlobalData.gWorkspace)
             os.environ["WORKSPACE"] = EccGlobalData.gWorkspace
         # Set log level
         self.SetLogLevel(Options)
@@ -325,16 +346,20 @@ class Ecc(object):
             self.ExceptionFile = Options.ExceptionFile
         if Options.Target is not None:
             if not os.path.isdir(Options.Target):
-                EdkLogger.error("ECC", BuildToolError.OPTION_VALUE_INVALID, ExtraData="Target [%s] does NOT exist" % Options.Target)
+                EdkLogger.error("ECC", BuildToolError.OPTION_VALUE_INVALID,
+                                ExtraData="Target [%s] does NOT exist" % Options.Target)
             else:
-                EccGlobalData.gTarget = self.GetRealPathCase(os.path.normpath(Options.Target))
+                EccGlobalData.gTarget = self.GetRealPathCase(
+                    os.path.normpath(Options.Target))
         else:
-            EdkLogger.warn("Ecc", EdkLogger.ECC_ERROR, "The target source tree was not specified, using current WORKSPACE instead!")
+            EdkLogger.warn("Ecc", EdkLogger.ECC_ERROR,
+                           "The target source tree was not specified, using current WORKSPACE instead!")
             EccGlobalData.gTarget = os.path.normpath(os.getenv("WORKSPACE"))
         if Options.keepdatabase is not None:
             self.IsInit = False
         if Options.metadata is not None and Options.sourcecode is not None:
-            EdkLogger.error("ECC", BuildToolError.OPTION_CONFLICT, ExtraData="-m and -s can't be specified at one time")
+            EdkLogger.error("ECC", BuildToolError.OPTION_CONFLICT,
+                            ExtraData="-m and -s can't be specified at one time")
         if Options.metadata is not None:
             self.ScanSourceCode = False
         if Options.sourcecode is not None:
@@ -342,7 +367,7 @@ class Ecc(object):
         if Options.folders is not None:
             self.OnlyScan = True
 
-    ## SetLogLevel
+    # SetLogLevel
     #
     # Set current log level of the tool based on args
     #
@@ -358,7 +383,7 @@ class Ecc(object):
         else:
             EdkLogger.SetLevel(EdkLogger.INFO)
 
-    ## Parse command line options
+    # Parse command line options
     #
     # Using standard Python module optparse to parse command line option of this tool.
     #
@@ -366,37 +391,46 @@ class Ecc(object):
     # @retval Args  Target of build command
     #
     def EccOptionParser(self):
-        Parser = OptionParser(description = self.Copyright, version = self.Version, prog = "Ecc.exe", usage = "%prog [options]")
+        Parser = OptionParser(description=self.Copyright,
+                              version=self.Version, prog="Ecc.exe", usage="%prog [options]")
         Parser.add_option("-t", "--target sourcepath", action="store", type="string", dest='Target',
-            help="Check all files under the target workspace.")
+                          help="Check all files under the target workspace.")
         Parser.add_option("-c", "--config filename", action="store", type="string", dest="ConfigFile",
-            help="Specify a configuration file. Defaultly use config.ini under ECC tool directory.")
+                          help="Specify a configuration file. Defaultly use config.ini under ECC tool directory.")
         Parser.add_option("-o", "--outfile filename", action="store", type="string", dest="OutputFile",
-            help="Specify the name of an output file, if and only if one filename was specified.")
+                          help="Specify the name of an output file, if and only if one filename was specified.")
         Parser.add_option("-r", "--reportfile filename", action="store", type="string", dest="ReportFile",
-            help="Specify the name of an report file, if and only if one filename was specified.")
+                          help="Specify the name of an report file, if and only if one filename was specified.")
         Parser.add_option("-e", "--exceptionfile filename", action="store", type="string", dest="ExceptionFile",
-            help="Specify the name of an exception file, if and only if one filename was specified.")
-        Parser.add_option("-m", "--metadata", action="store_true", type=None, help="Only scan meta-data files information if this option is specified.")
-        Parser.add_option("-s", "--sourcecode", action="store_true", type=None, help="Only scan source code files information if this option is specified.")
-        Parser.add_option("-k", "--keepdatabase", action="store_true", type=None, help="The existing Ecc database will not be cleaned except report information if this option is specified.")
+                          help="Specify the name of an exception file, if and only if one filename was specified.")
+        Parser.add_option("-m", "--metadata", action="store_true", type=None,
+                          help="Only scan meta-data files information if this option is specified.")
+        Parser.add_option("-s", "--sourcecode", action="store_true", type=None,
+                          help="Only scan source code files information if this option is specified.")
+        Parser.add_option("-k", "--keepdatabase", action="store_true", type=None,
+                          help="The existing Ecc database will not be cleaned except report information if this option is specified.")
         Parser.add_option("-l", "--log filename", action="store", dest="LogFile", help="""If specified, the tool should emit the changes that
                                                                                           were made by the tool after printing the result message.
                                                                                           If filename, the emit to the file, otherwise emit to
                                                                                           standard output. If no modifications were made, then do not
                                                                                           create a log file, or output a log message.""")
-        Parser.add_option("-q", "--quiet", action="store_true", type=None, help="Disable all messages except FATAL ERRORS.")
-        Parser.add_option("-v", "--verbose", action="store_true", type=None, help="Turn on verbose output with informational messages printed, "\
-                                                                                   "including library instances selected, final dependency expression, "\
-                                                                                   "and warning messages, etc.")
-        Parser.add_option("-d", "--debug", action="store", type="int", help="Enable debug messages at specified level.")
-        Parser.add_option("-w", "--workspace", action="store", type="string", dest='Workspace', help="Specify workspace.")
-        Parser.add_option("-f", "--folders", action="store_true", type=None, help="Only scanning specified folders which are recorded in config.ini file.")
+        Parser.add_option("-q", "--quiet", action="store_true",
+                          type=None, help="Disable all messages except FATAL ERRORS.")
+        Parser.add_option("-v", "--verbose", action="store_true", type=None, help="Turn on verbose output with informational messages printed, "
+                          "including library instances selected, final dependency expression, "
+                          "and warning messages, etc.")
+        Parser.add_option("-d", "--debug", action="store", type="int",
+                          help="Enable debug messages at specified level.")
+        Parser.add_option("-w", "--workspace", action="store",
+                          type="string", dest='Workspace', help="Specify workspace.")
+        Parser.add_option("-f", "--folders", action="store_true", type=None,
+                          help="Only scanning specified folders which are recorded in config.ini file.")
 
-        (Opt, Args)=Parser.parse_args()
+        (Opt, Args) = Parser.parse_args()
 
         return (Opt, Args)
 
+
 ##
 #
 # This acts like the main() function for the script, unless it is 'import'ed into another
@@ -411,5 +445,7 @@ if __name__ == '__main__':
     Ecc = Ecc()
     FinishTime = time.perf_counter()
 
-    BuildDuration = time.strftime("%M:%S", time.gmtime(int(round(FinishTime - StartTime))))
-    EdkLogger.quiet("\n%s [%s]" % (time.strftime("%H:%M:%S, %b.%d %Y", time.localtime()), BuildDuration))
+    BuildDuration = time.strftime(
+        "%M:%S", time.gmtime(int(round(FinishTime - StartTime))))
+    EdkLogger.quiet("\n%s [%s]" % (time.strftime(
+        "%H:%M:%S, %b.%d %Y", time.localtime()), BuildDuration))
diff --git a/BaseTools/Source/Python/Ecc/EccToolError.py b/BaseTools/Source/Python/Ecc/EccToolError.py
index d97bf7948ce8..0d370f49cd4d 100644
--- a/BaseTools/Source/Python/Ecc/EccToolError.py
+++ b/BaseTools/Source/Python/Ecc/EccToolError.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Standardized Error Handling infrastructures.
 #
 # Copyright (c) 2021, Arm Limited. All rights reserved.<BR>
@@ -105,101 +105,100 @@ ERROR_SPELLING_CHECK_ALL = 11000
 ERROR_SMM_COMM_PARA_CHECK_BUFFER_TYPE = 12001
 
 gEccErrorMessage = {
-    ERROR_GENERAL_CHECK_ALL : "",
-    ERROR_GENERAL_CHECK_NO_TAB : "'TAB' character is not allowed in source code, please replace each 'TAB' with two spaces.",
-    ERROR_GENERAL_CHECK_INDENTATION : "Indentation does not follow coding style",
-    ERROR_GENERAL_CHECK_LINE : "The width of each line does not follow coding style",
-    ERROR_GENERAL_CHECK_NO_ASM : "There should be no use of _asm in the source file",
-    ERROR_GENERAL_CHECK_NO_PROGMA : """There should be no use of "#progma" in source file except "#pragma pack(#)\"""",
-    ERROR_GENERAL_CHECK_CARRIAGE_RETURN : "There should be a carriage return at the end of the file",
-    ERROR_GENERAL_CHECK_FILE_EXISTENCE : "File not found",
-    ERROR_GENERAL_CHECK_NON_ACSII : "File has invalid Non-ACSII char",
-    ERROR_GENERAL_CHECK_UNI : "File is not a valid UTF-16 UNI file",
-    ERROR_GENERAL_CHECK_UNI_HELP_INFO : "UNI file that is associated by INF or DEC file need define the prompt and help information.",
-    ERROR_GENERAL_CHECK_INVALID_LINE_ENDING : "Only CRLF (Carriage Return Line Feed) is allowed to line ending.",
-    ERROR_GENERAL_CHECK_TRAILING_WHITE_SPACE_LINE : "There should be no trailing white space in one line.",
+    ERROR_GENERAL_CHECK_ALL: "",
+    ERROR_GENERAL_CHECK_NO_TAB: "'TAB' character is not allowed in source code, please replace each 'TAB' with two spaces.",
+    ERROR_GENERAL_CHECK_INDENTATION: "Indentation does not follow coding style",
+    ERROR_GENERAL_CHECK_LINE: "The width of each line does not follow coding style",
+    ERROR_GENERAL_CHECK_NO_ASM: "There should be no use of _asm in the source file",
+    ERROR_GENERAL_CHECK_NO_PROGMA: """There should be no use of "#progma" in source file except "#pragma pack(#)\"""",
+    ERROR_GENERAL_CHECK_CARRIAGE_RETURN: "There should be a carriage return at the end of the file",
+    ERROR_GENERAL_CHECK_FILE_EXISTENCE: "File not found",
+    ERROR_GENERAL_CHECK_NON_ACSII: "File has invalid Non-ACSII char",
+    ERROR_GENERAL_CHECK_UNI: "File is not a valid UTF-16 UNI file",
+    ERROR_GENERAL_CHECK_UNI_HELP_INFO: "UNI file that is associated by INF or DEC file need define the prompt and help information.",
+    ERROR_GENERAL_CHECK_INVALID_LINE_ENDING: "Only CRLF (Carriage Return Line Feed) is allowed to line ending.",
+    ERROR_GENERAL_CHECK_TRAILING_WHITE_SPACE_LINE: "There should be no trailing white space in one line.",
 
-    ERROR_SPACE_CHECK_ALL : "",
+    ERROR_SPACE_CHECK_ALL: "",
 
-    ERROR_PREDICATE_EXPRESSION_CHECK_ALL : "",
-    ERROR_PREDICATE_EXPRESSION_CHECK_BOOLEAN_VALUE : "Boolean values and variable type BOOLEAN should not use explicit comparisons to TRUE or FALSE",
-    ERROR_PREDICATE_EXPRESSION_CHECK_NO_BOOLEAN_OPERATOR : "Non-Boolean comparisons should use a compare operator (==, !=, >, < >=, <=)",
-    ERROR_PREDICATE_EXPRESSION_CHECK_COMPARISON_NULL_TYPE : "A comparison of any pointer to zero must be done via the NULL type",
+    ERROR_PREDICATE_EXPRESSION_CHECK_ALL: "",
+    ERROR_PREDICATE_EXPRESSION_CHECK_BOOLEAN_VALUE: "Boolean values and variable type BOOLEAN should not use explicit comparisons to TRUE or FALSE",
+    ERROR_PREDICATE_EXPRESSION_CHECK_NO_BOOLEAN_OPERATOR: "Non-Boolean comparisons should use a compare operator (==, !=, >, < >=, <=)",
+    ERROR_PREDICATE_EXPRESSION_CHECK_COMPARISON_NULL_TYPE: "A comparison of any pointer to zero must be done via the NULL type",
 
-    ERROR_HEADER_CHECK_ALL : "",
-    ERROR_HEADER_CHECK_FILE : "File header doesn't exist",
-    ERROR_HEADER_CHECK_FUNCTION : "Function header doesn't exist",
+    ERROR_HEADER_CHECK_ALL: "",
+    ERROR_HEADER_CHECK_FILE: "File header doesn't exist",
+    ERROR_HEADER_CHECK_FUNCTION: "Function header doesn't exist",
 
-    ERROR_C_FUNCTION_LAYOUT_CHECK_ALL : "",
-    ERROR_C_FUNCTION_LAYOUT_CHECK_RETURN_TYPE : "Return type of a function should exist and in the first line",
-    ERROR_C_FUNCTION_LAYOUT_CHECK_OPTIONAL_FUNCTIONAL_MODIFIER : "Any optional functional modifiers should exist and next to the return type",
-    ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_NAME : """Function name should be left justified, followed by the beginning of the parameter list, with the closing parenthesis on its own line, indented two spaces""",
-    ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE : "Function prototypes in include files have the same form as function definitions",
-    ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE_2 : "Function prototypes in include files have different parameter number with function definitions",
-    ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE_3 : "Function prototypes in include files have different parameter modifier with function definitions",
-    ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_BODY : "The body of a function should be contained by open and close braces that must be in the first column",
-    ERROR_C_FUNCTION_LAYOUT_CHECK_DATA_DECLARATION : "The data declarations should be the first code in a module",
-    ERROR_C_FUNCTION_LAYOUT_CHECK_NO_INIT_OF_VARIABLE : "There should be no initialization of a variable as part of its declaration",
-    ERROR_C_FUNCTION_LAYOUT_CHECK_NO_STATIC : "There should be no use of STATIC for functions",
+    ERROR_C_FUNCTION_LAYOUT_CHECK_ALL: "",
+    ERROR_C_FUNCTION_LAYOUT_CHECK_RETURN_TYPE: "Return type of a function should exist and in the first line",
+    ERROR_C_FUNCTION_LAYOUT_CHECK_OPTIONAL_FUNCTIONAL_MODIFIER: "Any optional functional modifiers should exist and next to the return type",
+    ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_NAME: """Function name should be left justified, followed by the beginning of the parameter list, with the closing parenthesis on its own line, indented two spaces""",
+    ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE: "Function prototypes in include files have the same form as function definitions",
+    ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE_2: "Function prototypes in include files have different parameter number with function definitions",
+    ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE_3: "Function prototypes in include files have different parameter modifier with function definitions",
+    ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_BODY: "The body of a function should be contained by open and close braces that must be in the first column",
+    ERROR_C_FUNCTION_LAYOUT_CHECK_DATA_DECLARATION: "The data declarations should be the first code in a module",
+    ERROR_C_FUNCTION_LAYOUT_CHECK_NO_INIT_OF_VARIABLE: "There should be no initialization of a variable as part of its declaration",
+    ERROR_C_FUNCTION_LAYOUT_CHECK_NO_STATIC: "There should be no use of STATIC for functions",
 
-    ERROR_INCLUDE_FILE_CHECK_ALL : "",
-    ERROR_INCLUDE_FILE_CHECK_IFNDEF_STATEMENT_1 : "All include file contents should be guarded by a #ifndef statement.",
-    ERROR_INCLUDE_FILE_CHECK_IFNDEF_STATEMENT_2 : "The #ifndef must be the first line of code following the file header comment",
-    ERROR_INCLUDE_FILE_CHECK_IFNDEF_STATEMENT_3 : "The #endif must appear on the last line in the file",
-    ERROR_INCLUDE_FILE_CHECK_DATA : "Include files should contain only public or only private data and cannot contain code or define data variables",
-    ERROR_INCLUDE_FILE_CHECK_NAME : "No permission for the include file with same names",
+    ERROR_INCLUDE_FILE_CHECK_ALL: "",
+    ERROR_INCLUDE_FILE_CHECK_IFNDEF_STATEMENT_1: "All include file contents should be guarded by a #ifndef statement.",
+    ERROR_INCLUDE_FILE_CHECK_IFNDEF_STATEMENT_2: "The #ifndef must be the first line of code following the file header comment",
+    ERROR_INCLUDE_FILE_CHECK_IFNDEF_STATEMENT_3: "The #endif must appear on the last line in the file",
+    ERROR_INCLUDE_FILE_CHECK_DATA: "Include files should contain only public or only private data and cannot contain code or define data variables",
+    ERROR_INCLUDE_FILE_CHECK_NAME: "No permission for the include file with same names",
 
-    ERROR_DECLARATION_DATA_TYPE_CHECK_ALL : "",
-    ERROR_DECLARATION_DATA_TYPE_CHECK_NO_USE_C_TYPE : "There should be no use of int, unsigned, char, void, long in any .c, .h or .asl files",
-    ERROR_DECLARATION_DATA_TYPE_CHECK_IN_OUT_MODIFIER : """The modifiers IN, OUT, OPTIONAL, and UNALIGNED should be used only to qualify arguments to a function and should not appear in a data type declaration""",
-    ERROR_DECLARATION_DATA_TYPE_CHECK_EFI_API_MODIFIER : "The EFIAPI modifier should be used at the entry of drivers, events, and member functions of protocols",
-    ERROR_DECLARATION_DATA_TYPE_CHECK_ENUMERATED_TYPE : "Enumerated Type should have a 'typedef' and the name must be in capital letters",
-    ERROR_DECLARATION_DATA_TYPE_CHECK_STRUCTURE_DECLARATION : "Structure Type should have a 'typedef' and the name must be in capital letters",
-    ERROR_DECLARATION_DATA_TYPE_CHECK_SAME_STRUCTURE : "No permission for the structure with same names",
-    ERROR_DECLARATION_DATA_TYPE_CHECK_UNION_TYPE : "Union Type should have a 'typedef' and the name must be in capital letters",
-    ERROR_DECLARATION_DATA_TYPE_CHECK_NESTED_STRUCTURE : "Complex types should be typedef-ed",
+    ERROR_DECLARATION_DATA_TYPE_CHECK_ALL: "",
+    ERROR_DECLARATION_DATA_TYPE_CHECK_NO_USE_C_TYPE: "There should be no use of int, unsigned, char, void, long in any .c, .h or .asl files",
+    ERROR_DECLARATION_DATA_TYPE_CHECK_IN_OUT_MODIFIER: """The modifiers IN, OUT, OPTIONAL, and UNALIGNED should be used only to qualify arguments to a function and should not appear in a data type declaration""",
+    ERROR_DECLARATION_DATA_TYPE_CHECK_EFI_API_MODIFIER: "The EFIAPI modifier should be used at the entry of drivers, events, and member functions of protocols",
+    ERROR_DECLARATION_DATA_TYPE_CHECK_ENUMERATED_TYPE: "Enumerated Type should have a 'typedef' and the name must be in capital letters",
+    ERROR_DECLARATION_DATA_TYPE_CHECK_STRUCTURE_DECLARATION: "Structure Type should have a 'typedef' and the name must be in capital letters",
+    ERROR_DECLARATION_DATA_TYPE_CHECK_SAME_STRUCTURE: "No permission for the structure with same names",
+    ERROR_DECLARATION_DATA_TYPE_CHECK_UNION_TYPE: "Union Type should have a 'typedef' and the name must be in capital letters",
+    ERROR_DECLARATION_DATA_TYPE_CHECK_NESTED_STRUCTURE: "Complex types should be typedef-ed",
 
-    ERROR_NAMING_CONVENTION_CHECK_ALL : "",
-    ERROR_NAMING_CONVENTION_CHECK_DEFINE_STATEMENT : "Only capital letters are allowed to be used for #define declarations",
-    ERROR_NAMING_CONVENTION_CHECK_TYPEDEF_STATEMENT : "Only capital letters are allowed to be used for typedef declarations",
-    ERROR_NAMING_CONVENTION_CHECK_IFNDEF_STATEMENT : "The #ifndef at the start of an include file should have one postfix underscore, and no prefix underscore character '_'",
-    ERROR_NAMING_CONVENTION_CHECK_PATH_NAME : """Path name does not follow the rules: 1. First character should be upper case 2. Must contain lower case characters 3. No white space characters""",
-    ERROR_NAMING_CONVENTION_CHECK_VARIABLE_NAME : """Variable name does not follow the rules: 1. First character should be upper case 2. Must contain lower case characters 3. No white space characters 4. Global variable name must start with a 'g'""",
-    ERROR_NAMING_CONVENTION_CHECK_FUNCTION_NAME : """Function name does not follow the rules: 1. First character should be upper case 2. Must contain lower case characters 3. No white space characters""",
-    ERROR_NAMING_CONVENTION_CHECK_SINGLE_CHARACTER_VARIABLE : "There should be no use of short (single character) variable names",
+    ERROR_NAMING_CONVENTION_CHECK_ALL: "",
+    ERROR_NAMING_CONVENTION_CHECK_DEFINE_STATEMENT: "Only capital letters are allowed to be used for #define declarations",
+    ERROR_NAMING_CONVENTION_CHECK_TYPEDEF_STATEMENT: "Only capital letters are allowed to be used for typedef declarations",
+    ERROR_NAMING_CONVENTION_CHECK_IFNDEF_STATEMENT: "The #ifndef at the start of an include file should have one postfix underscore, and no prefix underscore character '_'",
+    ERROR_NAMING_CONVENTION_CHECK_PATH_NAME: """Path name does not follow the rules: 1. First character should be upper case 2. Must contain lower case characters 3. No white space characters""",
+    ERROR_NAMING_CONVENTION_CHECK_VARIABLE_NAME: """Variable name does not follow the rules: 1. First character should be upper case 2. Must contain lower case characters 3. No white space characters 4. Global variable name must start with a 'g'""",
+    ERROR_NAMING_CONVENTION_CHECK_FUNCTION_NAME: """Function name does not follow the rules: 1. First character should be upper case 2. Must contain lower case characters 3. No white space characters""",
+    ERROR_NAMING_CONVENTION_CHECK_SINGLE_CHARACTER_VARIABLE: "There should be no use of short (single character) variable names",
 
-    ERROR_DOXYGEN_CHECK_ALL : "",
-    ERROR_DOXYGEN_CHECK_FILE_HEADER : "The file headers should follow Doxygen special documentation blocks in section 2.3.5",
-    ERROR_DOXYGEN_CHECK_FUNCTION_HEADER : "The function headers should follow Doxygen special documentation blocks in section 2.3.5",
-    ERROR_DOXYGEN_CHECK_COMMENT_DESCRIPTION : """The first line of text in a comment block should be a brief description of the element being documented and the brief description must end with a period.""",
-    ERROR_DOXYGEN_CHECK_COMMENT_FORMAT : "For comment line with '///< ... text ...' format, if it is used, it should be after the code section",
-    ERROR_DOXYGEN_CHECK_COMMAND : "Only Doxygen commands '@bug', '@todo', '@example', '@file', '@attention', '@param', '@post', '@pre', '@retval', '@return', '@sa', '@since', '@test', '@note', '@par', '@endcode', '@code', '@{', '@}' are allowed to mark the code",
+    ERROR_DOXYGEN_CHECK_ALL: "",
+    ERROR_DOXYGEN_CHECK_FILE_HEADER: "The file headers should follow Doxygen special documentation blocks in section 2.3.5",
+    ERROR_DOXYGEN_CHECK_FUNCTION_HEADER: "The function headers should follow Doxygen special documentation blocks in section 2.3.5",
+    ERROR_DOXYGEN_CHECK_COMMENT_DESCRIPTION: """The first line of text in a comment block should be a brief description of the element being documented and the brief description must end with a period.""",
+    ERROR_DOXYGEN_CHECK_COMMENT_FORMAT: "For comment line with '///< ... text ...' format, if it is used, it should be after the code section",
+    ERROR_DOXYGEN_CHECK_COMMAND: "Only Doxygen commands '@bug', '@todo', '@example', '@file', '@attention', '@param', '@post', '@pre', '@retval', '@return', '@sa', '@since', '@test', '@note', '@par', '@endcode', '@code', '@{', '@}' are allowed to mark the code",
 
-    ERROR_META_DATA_FILE_CHECK_ALL : "",
-    ERROR_META_DATA_FILE_CHECK_PATH_NAME : "The file defined in meta-data does not exist",
-    ERROR_META_DATA_FILE_CHECK_LIBRARY_INSTANCE_1 : "A library instances defined for a given module (or dependent library instance) doesn't match the module's type.",
-    ERROR_META_DATA_FILE_CHECK_LIBRARY_INSTANCE_2 : "A library instance must specify the Supported Module Types in its INF file",
-    ERROR_META_DATA_FILE_CHECK_LIBRARY_INSTANCE_DEPENDENT : "A library instance must be defined for all dependent library classes",
-    ERROR_META_DATA_FILE_CHECK_LIBRARY_INSTANCE_ORDER : "The library Instances specified by the LibraryClasses sections should be listed in order of dependencies",
-    ERROR_META_DATA_FILE_CHECK_LIBRARY_NO_USE : "There should be no unnecessary inclusion of library classes in the INF file",
-    ERROR_META_DATA_FILE_CHECK_LIBRARY_NAME_DUPLICATE : "Duplicate Library Class Name found",
-    ERROR_META_DATA_FILE_CHECK_BINARY_INF_IN_FDF : "An INF file is specified in the FDF file, but not in the DSC file, therefore the INF file must be for a Binary module only",
-    ERROR_META_DATA_FILE_CHECK_PCD_DUPLICATE : "Duplicate PCDs found",
-    ERROR_META_DATA_FILE_CHECK_PCD_FLASH : "PCD settings in the FDF file should only be related to flash",
-    ERROR_META_DATA_FILE_CHECK_PCD_NO_USE : "There should be no PCDs declared in INF files that are not specified in in either a DSC or FDF file",
-    ERROR_META_DATA_FILE_CHECK_DUPLICATE_GUID : "Duplicate GUID found",
-    ERROR_META_DATA_FILE_CHECK_DUPLICATE_PROTOCOL : "Duplicate PROTOCOL found",
-    ERROR_META_DATA_FILE_CHECK_DUPLICATE_PPI : "Duplicate PPI found",
-    ERROR_META_DATA_FILE_CHECK_MODULE_FILE_NO_USE : "No used module files found",
-    ERROR_META_DATA_FILE_CHECK_PCD_TYPE : "Wrong C code function used for this kind of PCD",
-    ERROR_META_DATA_FILE_CHECK_MODULE_FILE_GUID_DUPLICATION : "Module file has FILE_GUID collision with other module file",
-    ERROR_META_DATA_FILE_CHECK_FORMAT_GUID : "Wrong GUID Format used in Module file",
-    ERROR_META_DATA_FILE_CHECK_FORMAT_PROTOCOL : "Wrong Protocol Format used in Module file",
-    ERROR_META_DATA_FILE_CHECK_FORMAT_PPI : "Wrong Ppi Format used in Module file",
-    ERROR_META_DATA_FILE_CHECK_FORMAT_PCD : "Wrong Pcd Format used in Module file",
-    ERROR_META_DATA_FILE_CHECK_LIBRARY_NOT_DEFINED : "Not defined LibraryClass used in the Module file.",
-    ERROR_SPELLING_CHECK_ALL : "",
-
-    ERROR_SMM_COMM_PARA_CHECK_BUFFER_TYPE : "SMM communication function may use wrong parameter type",
-    }
+    ERROR_META_DATA_FILE_CHECK_ALL: "",
+    ERROR_META_DATA_FILE_CHECK_PATH_NAME: "The file defined in meta-data does not exist",
+    ERROR_META_DATA_FILE_CHECK_LIBRARY_INSTANCE_1: "A library instances defined for a given module (or dependent library instance) doesn't match the module's type.",
+    ERROR_META_DATA_FILE_CHECK_LIBRARY_INSTANCE_2: "A library instance must specify the Supported Module Types in its INF file",
+    ERROR_META_DATA_FILE_CHECK_LIBRARY_INSTANCE_DEPENDENT: "A library instance must be defined for all dependent library classes",
+    ERROR_META_DATA_FILE_CHECK_LIBRARY_INSTANCE_ORDER: "The library Instances specified by the LibraryClasses sections should be listed in order of dependencies",
+    ERROR_META_DATA_FILE_CHECK_LIBRARY_NO_USE: "There should be no unnecessary inclusion of library classes in the INF file",
+    ERROR_META_DATA_FILE_CHECK_LIBRARY_NAME_DUPLICATE: "Duplicate Library Class Name found",
+    ERROR_META_DATA_FILE_CHECK_BINARY_INF_IN_FDF: "An INF file is specified in the FDF file, but not in the DSC file, therefore the INF file must be for a Binary module only",
+    ERROR_META_DATA_FILE_CHECK_PCD_DUPLICATE: "Duplicate PCDs found",
+    ERROR_META_DATA_FILE_CHECK_PCD_FLASH: "PCD settings in the FDF file should only be related to flash",
+    ERROR_META_DATA_FILE_CHECK_PCD_NO_USE: "There should be no PCDs declared in INF files that are not specified in in either a DSC or FDF file",
+    ERROR_META_DATA_FILE_CHECK_DUPLICATE_GUID: "Duplicate GUID found",
+    ERROR_META_DATA_FILE_CHECK_DUPLICATE_PROTOCOL: "Duplicate PROTOCOL found",
+    ERROR_META_DATA_FILE_CHECK_DUPLICATE_PPI: "Duplicate PPI found",
+    ERROR_META_DATA_FILE_CHECK_MODULE_FILE_NO_USE: "No used module files found",
+    ERROR_META_DATA_FILE_CHECK_PCD_TYPE: "Wrong C code function used for this kind of PCD",
+    ERROR_META_DATA_FILE_CHECK_MODULE_FILE_GUID_DUPLICATION: "Module file has FILE_GUID collision with other module file",
+    ERROR_META_DATA_FILE_CHECK_FORMAT_GUID: "Wrong GUID Format used in Module file",
+    ERROR_META_DATA_FILE_CHECK_FORMAT_PROTOCOL: "Wrong Protocol Format used in Module file",
+    ERROR_META_DATA_FILE_CHECK_FORMAT_PPI: "Wrong Ppi Format used in Module file",
+    ERROR_META_DATA_FILE_CHECK_FORMAT_PCD: "Wrong Pcd Format used in Module file",
+    ERROR_META_DATA_FILE_CHECK_LIBRARY_NOT_DEFINED: "Not defined LibraryClass used in the Module file.",
+    ERROR_SPELLING_CHECK_ALL: "",
 
+    ERROR_SMM_COMM_PARA_CHECK_BUFFER_TYPE: "SMM communication function may use wrong parameter type",
+}
diff --git a/BaseTools/Source/Python/Ecc/Exception.py b/BaseTools/Source/Python/Ecc/Exception.py
index 9251b8d7c47f..f6dfbeb2c95f 100644
--- a/BaseTools/Source/Python/Ecc/Exception.py
+++ b/BaseTools/Source/Python/Ecc/Exception.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to parse exception items found by ECC tool
 #
 # Copyright (c) 2009 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -14,6 +14,8 @@ from Ecc.Xml.XmlRoutines import *
 import Common.LongFilePathOs as os
 
 # ExceptionXml to parse Exception Node of XML file
+
+
 class ExceptionXml(object):
     def __init__(self):
         self.KeyWord = ''
@@ -26,9 +28,11 @@ class ExceptionXml(object):
         self.FilePath = os.path.normpath(XmlElement(Item, '%s/FilePath' % Key))
 
     def __str__(self):
-        return 'ErrorID = %s KeyWord = %s FilePath = %s' %(self.ErrorID, self.KeyWord, self.FilePath)
+        return 'ErrorID = %s KeyWord = %s FilePath = %s' % (self.ErrorID, self.KeyWord, self.FilePath)
 
 # ExceptionListXml to parse Exception Node List of XML file
+
+
 class ExceptionListXml(object):
     def __init__(self):
         self.List = []
@@ -56,8 +60,10 @@ class ExceptionListXml(object):
         return RtnStr
 
 # A class to check exception
+
+
 class ExceptionCheck(object):
-    def __init__(self, FilePath = None):
+    def __init__(self, FilePath=None):
         self.ExceptionList = []
         self.ExceptionListXml = ExceptionListXml()
         self.LoadExceptionListXml(FilePath)
@@ -73,11 +79,13 @@ class ExceptionCheck(object):
         else:
             return False
 
+
 ##
 #
 # This acts like the main() function for the script, unless it is 'import'ed into another
 # script.
 #
 if __name__ == '__main__':
-    El = ExceptionCheck('C:\\Hess\\Project\\BuildTool\\src\\Ecc\\exception.xml')
+    El = ExceptionCheck(
+        'C:\\Hess\\Project\\BuildTool\\src\\Ecc\\exception.xml')
     print(El.ExceptionList)
diff --git a/BaseTools/Source/Python/Ecc/FileProfile.py b/BaseTools/Source/Python/Ecc/FileProfile.py
index eedf263b1f9d..ecc33fac2863 100644
--- a/BaseTools/Source/Python/Ecc/FileProfile.py
+++ b/BaseTools/Source/Python/Ecc/FileProfile.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # fragments of source file
 #
 #  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -26,13 +26,15 @@ StructUnionDefinitionList = []
 TypedefDefinitionList = []
 FunctionCallingList = []
 
-## record file data when parsing source
+# record file data when parsing source
 #
 # May raise Exception when opening file.
 #
-class FileProfile :
 
-    ## The constructor
+
+class FileProfile:
+
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  FileName    The file that to be parsed
diff --git a/BaseTools/Source/Python/Ecc/MetaDataParser.py b/BaseTools/Source/Python/Ecc/MetaDataParser.py
index d9f0da1ee0d6..d8d0d2ee408a 100644
--- a/BaseTools/Source/Python/Ecc/MetaDataParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaDataParser.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define common parser functions for meta-data
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -12,12 +12,14 @@ from Ecc.EccToolError import *
 from Common.MultipleWorkspace import MultipleWorkspace as mws
 from Ecc import EccGlobalData
 import re
-## Get the include path list for a source file
+# Get the include path list for a source file
 #
 # 1. Find the source file belongs to which inf file
 # 2. Find the inf's package
 # 3. Return the include path list of the package
 #
+
+
 def GetIncludeListOfFile(WorkSpace, Filepath, Db):
     IncludeList = []
     Filepath = os.path.normpath(Filepath)
@@ -46,26 +48,32 @@ def GetIncludeListOfFile(WorkSpace, Filepath, Db):
 
     return IncludeList
 
-## Get the file list
+# Get the file list
 #
 # Search table file and find all specific type files
 #
+
+
 def GetFileList(FileModel, Db):
     FileList = []
-    SqlCommand = """select FullPath from File where Model = %s""" % str(FileModel)
+    SqlCommand = """select FullPath from File where Model = %s""" % str(
+        FileModel)
     RecordSet = Db.TblFile.Exec(SqlCommand)
     for Record in RecordSet:
         FileList.append(Record[0])
 
     return FileList
 
-## Get the table list
+# Get the table list
 #
 # Search table file and find all small tables
 #
+
+
 def GetTableList(FileModelList, Table, Db):
     TableList = []
-    SqlCommand = """select ID from File where Model in %s""" % str(FileModelList)
+    SqlCommand = """select ID from File where Model in %s""" % str(
+        FileModelList)
     RecordSet = Db.TblFile.Exec(SqlCommand)
     for Record in RecordSet:
         TableName = Table + str(Record[0])
@@ -73,7 +81,7 @@ def GetTableList(FileModelList, Table, Db):
 
     return TableList
 
-## ParseHeaderCommentSection
+# ParseHeaderCommentSection
 #
 # Parse Header comment section lines, extract Abstract, Description, Copyright
 # , License lines
@@ -81,7 +89,9 @@ def GetTableList(FileModelList, Table, Db):
 # @param CommentList:   List of (Comment, LineNumber)
 # @param FileName:      FileName of the comment
 #
-def ParseHeaderCommentSection(CommentList, FileName = None):
+
+
+def ParseHeaderCommentSection(CommentList, FileName=None):
 
     Abstract = ''
     Description = ''
@@ -95,13 +105,13 @@ def ParseHeaderCommentSection(CommentList, FileName = None):
     # inf files
     #
     HEADER_COMMENT_NOT_STARTED = -1
-    HEADER_COMMENT_STARTED     = 0
-    HEADER_COMMENT_FILE        = 1
-    HEADER_COMMENT_ABSTRACT    = 2
+    HEADER_COMMENT_STARTED = 0
+    HEADER_COMMENT_FILE = 1
+    HEADER_COMMENT_ABSTRACT = 2
     HEADER_COMMENT_DESCRIPTION = 3
-    HEADER_COMMENT_COPYRIGHT   = 4
-    HEADER_COMMENT_LICENSE     = 5
-    HEADER_COMMENT_END         = 6
+    HEADER_COMMENT_COPYRIGHT = 4
+    HEADER_COMMENT_LICENSE = 5
+    HEADER_COMMENT_END = 6
     #
     # first find the last copyright line
     #
@@ -122,7 +132,8 @@ def ParseHeaderCommentSection(CommentList, FileName = None):
             ResultSet = EccGlobalData.gDb.TblFile.Exec(SqlStatement)
             for Result in ResultSet:
                 Msg = 'Comment must start with #'
-                EccGlobalData.gDb.TblReport.Insert(ERROR_DOXYGEN_CHECK_FILE_HEADER, Msg, "File", Result[0])
+                EccGlobalData.gDb.TblReport.Insert(
+                    ERROR_DOXYGEN_CHECK_FILE_HEADER, Msg, "File", Result[0])
         Comment = CleanString2(Line)[1]
         Comment = Comment.strip()
         #
@@ -130,7 +141,7 @@ def ParseHeaderCommentSection(CommentList, FileName = None):
         # indication of different block; or in the position that Abstract should be, also keep it
         # as it indicates that no abstract
         #
-        if not Comment and HeaderCommentStage not in [HEADER_COMMENT_LICENSE, \
+        if not Comment and HeaderCommentStage not in [HEADER_COMMENT_LICENSE,
                                                       HEADER_COMMENT_DESCRIPTION, HEADER_COMMENT_ABSTRACT]:
             continue
 
@@ -185,36 +196,41 @@ def ParseHeaderCommentSection(CommentList, FileName = None):
         ResultSet = EccGlobalData.gDb.TblFile.Exec(SqlStatement)
         for Result in ResultSet:
             Msg = 'Header comment section must have copyright information'
-            EccGlobalData.gDb.TblReport.Insert(ERROR_DOXYGEN_CHECK_FILE_HEADER, Msg, "File", Result[0])
+            EccGlobalData.gDb.TblReport.Insert(
+                ERROR_DOXYGEN_CHECK_FILE_HEADER, Msg, "File", Result[0])
 
     if not License.strip():
         SqlStatement = """ select ID from File where FullPath like '%s'""" % FileName
         ResultSet = EccGlobalData.gDb.TblFile.Exec(SqlStatement)
         for Result in ResultSet:
             Msg = 'Header comment section must have license information'
-            EccGlobalData.gDb.TblReport.Insert(ERROR_DOXYGEN_CHECK_FILE_HEADER, Msg, "File", Result[0])
+            EccGlobalData.gDb.TblReport.Insert(
+                ERROR_DOXYGEN_CHECK_FILE_HEADER, Msg, "File", Result[0])
 
     if not Abstract.strip() or Abstract.find('Component description file') > -1:
         SqlStatement = """ select ID from File where FullPath like '%s'""" % FileName
         ResultSet = EccGlobalData.gDb.TblFile.Exec(SqlStatement)
         for Result in ResultSet:
             Msg = 'Header comment section must have Abstract information.'
-            EccGlobalData.gDb.TblReport.Insert(ERROR_DOXYGEN_CHECK_FILE_HEADER, Msg, "File", Result[0])
+            EccGlobalData.gDb.TblReport.Insert(
+                ERROR_DOXYGEN_CHECK_FILE_HEADER, Msg, "File", Result[0])
 
     return Abstract.strip(), Description.strip(), Copyright.strip(), License.strip()
 
-## _IsCopyrightLine
+# _IsCopyrightLine
 # check whether current line is copyright line, the criteria is whether there is case insensitive keyword "Copyright"
 # followed by zero or more white space characters followed by a "(" character
 #
 # @param LineContent:  the line need to be checked
 # @return: True if current line is copyright line, False else
 #
-def _IsCopyrightLine (LineContent):
+
+
+def _IsCopyrightLine(LineContent):
     LineContent = LineContent.upper()
     Result = False
 
-    #Support below Copyright format
+    # Support below Copyright format
     # Copyright (C) 2020 Hewlett Packard Enterprise Development LP<BR>
     # (C) Copyright 2020 Hewlett Packard Enterprise Development LP<BR>
     ReIsCopyrightRe = re.compile(r"""(^|\s)COPYRIGHT *\(""", re.DOTALL)
@@ -225,7 +241,7 @@ def _IsCopyrightLine (LineContent):
     return Result
 
 
-## CleanString2
+# CleanString2
 #
 # Split comments in a string
 # Remove spaces
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
index 1d7f6eb10434..0b5af17c48b6 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create/update/query/erase table for files
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -15,11 +15,13 @@ import Common.EdkLogger as EdkLogger
 from CommonDataClass import DataClass
 from CommonDataClass.DataClass import FileClass
 
-## Convert to SQL required string format
+# Convert to SQL required string format
+
+
 def ConvertToSqlString(StringList):
     return map(lambda s: "'" + s.replace("'", "''") + "'", StringList)
 
-## TableFile
+# TableFile
 #
 # This class defined a common table
 #
@@ -28,6 +30,8 @@ def ConvertToSqlString(StringList):
 # @param Cursor:     Cursor of the database
 # @param TableName:  Name of the table
 #
+
+
 class Table(object):
     _COLUMN_ = ''
     _ID_STEP_ = 1
@@ -44,7 +48,7 @@ class Table(object):
     def __str__(self):
         return self.Table
 
-    ## Create table
+    # Create table
     #
     # Create a table
     #
@@ -53,14 +57,16 @@ class Table(object):
             self.Drop()
 
         if self.Temporary:
-            SqlCommand = """create temp table IF NOT EXISTS %s (%s)""" % (self.Table, self._COLUMN_)
+            SqlCommand = """create temp table IF NOT EXISTS %s (%s)""" % (
+                self.Table, self._COLUMN_)
         else:
-            SqlCommand = """create table IF NOT EXISTS %s (%s)""" % (self.Table, self._COLUMN_)
+            SqlCommand = """create table IF NOT EXISTS %s (%s)""" % (
+                self.Table, self._COLUMN_)
         EdkLogger.debug(EdkLogger.DEBUG_8, SqlCommand)
         self.Cur.execute(SqlCommand)
         self.ID = self.GetId()
 
-    ## Insert table
+    # Insert table
     #
     # Insert a record into a table
     #
@@ -69,12 +75,13 @@ class Table(object):
         if self.ID >= (self.IdBase + self._ID_MAX_):
             self.ID = self.IdBase + self._ID_STEP_
         Values = ", ".join(str(Arg) for Arg in Args)
-        SqlCommand = "insert into %s values(%s, %s)" % (self.Table, self.ID, Values)
+        SqlCommand = "insert into %s values(%s, %s)" % (
+            self.Table, self.ID, Values)
         EdkLogger.debug(EdkLogger.DEBUG_5, SqlCommand)
         self.Cur.execute(SqlCommand)
         return self.ID
 
-    ## Query table
+    # Query table
     #
     # Query all records of the table
     #
@@ -85,7 +92,7 @@ class Table(object):
             EdkLogger.verbose(str(Rs))
         TotalCount = self.GetId()
 
-    ## Drop a table
+    # Drop a table
     #
     # Drop the table
     #
@@ -96,7 +103,7 @@ class Table(object):
         except Exception as e:
             print("An error occurred when Drop a table:", e.args[0])
 
-    ## Get count
+    # Get count
     #
     # Get a count of all records of the table
     #
@@ -115,14 +122,14 @@ class Table(object):
             Id = self.IdBase
         return Id
 
-    ## Init the ID of the table
+    # Init the ID of the table
     #
     # Init the ID of the table
     #
     def InitID(self):
         self.ID = self.GetId()
 
-    ## Exec
+    # Exec
     #
     # Exec Sql Command, return result
     #
@@ -149,7 +156,7 @@ class Table(object):
         return self.Exec("select * from %s where ID > 0 order by ID" % (self.Table))
 
 
-## TableDataModel
+# TableDataModel
 #
 # This class defined a table used for data model
 #
@@ -163,10 +170,11 @@ class TableDataModel(Table):
         Name VARCHAR NOT NULL,
         Description VARCHAR
         """
+
     def __init__(self, Cursor):
         Table.__init__(self, Cursor, 'DataModel')
 
-    ## Insert table
+    # Insert table
     #
     # Insert a record into table DataModel
     #
@@ -179,7 +187,7 @@ class TableDataModel(Table):
         (Name, Description) = ConvertToSqlString((Name, Description))
         return Table.Insert(self, CrossIndex, Name, Description)
 
-    ## Init table
+    # Init table
     #
     # Create all default records of table DataModel
     #
@@ -195,7 +203,7 @@ class TableDataModel(Table):
             self.Insert(CrossIndex, Name, Description)
         EdkLogger.verbose("Initialize table DataModel ... DONE!")
 
-    ## Get CrossIndex
+    # Get CrossIndex
     #
     # Get a model's cross index from its name
     #
@@ -210,4 +218,3 @@ class TableDataModel(Table):
             CrossIndex = Item[0]
 
         return CrossIndex
-
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
index 2d98ac5eadb2..1cf1815c2fe2 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to parse meta files
 #
 # Copyright (c) 2008 - 2020, Intel Corporation. All rights reserved.<BR>
@@ -31,11 +31,13 @@ from GenFds.FdfParser import FdfParser
 from Common.LongFilePathSupport import OpenLongFilePath as open
 from Common.LongFilePathSupport import CodecOpenLongFilePath
 
-## RegEx for finding file versions
+# RegEx for finding file versions
 hexVersionPattern = re.compile(r'0[xX][\da-f-A-F]{5,8}')
 decVersionPattern = re.compile(r'\d+\.\d+')
 
-## A decorator used to parse macro definition
+# A decorator used to parse macro definition
+
+
 def ParseMacro(Parser):
     def MacroParser(self):
         Match = GlobalData.gMacroDefPattern.match(self._CurrentLine)
@@ -44,7 +46,8 @@ def ParseMacro(Parser):
             Parser(self)
             return
 
-        TokenList = GetSplitValueList(self._CurrentLine[Match.end(1):], TAB_EQUAL_SPLIT, 1)
+        TokenList = GetSplitValueList(
+            self._CurrentLine[Match.end(1):], TAB_EQUAL_SPLIT, 1)
         # Syntax check
         if not TokenList[0]:
             EdkLogger.error('Parser', FORMAT_INVALID, "No macro name given",
@@ -72,7 +75,8 @@ def ParseMacro(Parser):
                     self._FileLocalMacros[Name] = Value
                 else:
                     for Scope in self._Scope:
-                        self._SectionsMacroDict.setdefault((Scope[2], Scope[0], Scope[1]), {})[Name] = Value
+                        self._SectionsMacroDict.setdefault(
+                            (Scope[2], Scope[0], Scope[1]), {})[Name] = Value
             elif self._SectionType == MODEL_META_DATA_HEADER:
                 self._FileLocalMacros[Name] = Value
             else:
@@ -96,7 +100,7 @@ def ParseMacro(Parser):
 
     return MacroParser
 
-## Base class of parser
+# Base class of parser
 #
 #  This class is used for derivation purpose. The specific parser for one kind
 # type file must derive this class and implement some public interfaces.
@@ -108,6 +112,8 @@ def ParseMacro(Parser):
 #   @param      Owner           Owner ID (for sub-section parsing)
 #   @param      From            ID from which the data comes (for !INCLUDE directive)
 #
+
+
 class MetaFileParser(object):
     # data type (file content) for specific file type
     DataType = {}
@@ -115,7 +121,7 @@ class MetaFileParser(object):
     # Parser objects used to implement singleton
     MetaFiles = {}
 
-    ## Factory method
+    # Factory method
     #
     # One file, one parser object. This factory method makes sure that there's
     # only one object constructed for one meta file.
@@ -134,7 +140,7 @@ class MetaFileParser(object):
             Class.MetaFiles[FilePath] = ParserObject
             return ParserObject
 
-    ## Constructor of MetaFileParser
+    # Constructor of MetaFileParser
     #
     #  Initialize object of MetaFileParser
     #
@@ -179,37 +185,37 @@ class MetaFileParser(object):
         self._UniObj = None
         self._UniExtraObj = None
 
-    ## Store the parsed data in table
+    # Store the parsed data in table
     def _Store(self, *Args):
         return self._Table.Insert(*Args)
 
-    ## Virtual method for starting parse
+    # Virtual method for starting parse
     def Start(self):
         raise NotImplementedError
 
-    ## Notify a post-process is needed
+    # Notify a post-process is needed
     def DoPostProcess(self):
         self._PostProcessed = False
 
-    ## Set parsing complete flag in both class and table
+    # Set parsing complete flag in both class and table
     def _Done(self):
         self._Finished = True
-        ## Do not set end flag when processing included files
+        # Do not set end flag when processing included files
         if self._From == -1:
             self._Table.SetEndFlag()
 
     def _PostProcess(self):
         self._PostProcessed = True
 
-    ## Get the parse complete flag
+    # Get the parse complete flag
     def _GetFinished(self):
         return self._Finished
 
-    ## Set the complete flag
+    # Set the complete flag
     def _SetFinished(self, Value):
         self._Finished = Value
 
-    ## Use [] style to query data in table, just for readability
+    # Use [] style to query data in table, just for readability
     #
     #   DataInfo = [data_type, scope1(arch), scope2(platform/moduletype)]
     #
@@ -236,7 +242,7 @@ class MetaFileParser(object):
 
         return self._Table.Query(*DataInfo)
 
-    ## Data parser for the common format in different type of file
+    # Data parser for the common format in different type of file
     #
     #   The common format in the meatfile is like
     #
@@ -247,7 +253,7 @@ class MetaFileParser(object):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
         self._ValueList[0:len(TokenList)] = TokenList
 
-    ## Data parser for the format in which there's path
+    # Data parser for the format in which there's path
     #
     #   Only path can have macro used. So we need to replace them before use.
     #
@@ -258,23 +264,26 @@ class MetaFileParser(object):
         # Don't do macro replacement for dsc file at this point
         if not isinstance(self, DscParser):
             Macros = self._Macros
-            self._ValueList = [ReplaceMacro(Value, Macros) for Value in self._ValueList]
+            self._ValueList = [ReplaceMacro(Value, Macros)
+                               for Value in self._ValueList]
 
-    ## Skip unsupported data
+    # Skip unsupported data
     def _Skip(self):
         if self._SectionName == TAB_USER_EXTENSIONS.upper() and self._CurrentLine.upper().endswith('.UNI'):
             if EccGlobalData.gConfig.UniCheckHelpInfo == '1' or EccGlobalData.gConfig.UniCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
                 ExtraUni = self._CurrentLine.strip()
-                ExtraUniFile = os.path.join(os.path.dirname(self.MetaFile), ExtraUni)
+                ExtraUniFile = os.path.join(
+                    os.path.dirname(self.MetaFile), ExtraUni)
                 IsModuleUni = self.MetaFile.upper().endswith('.INF')
-                self._UniExtraObj = UniParser(ExtraUniFile, IsExtraUni=True, IsModuleUni=IsModuleUni)
+                self._UniExtraObj = UniParser(
+                    ExtraUniFile, IsExtraUni=True, IsModuleUni=IsModuleUni)
                 self._UniExtraObj.Start()
         else:
             EdkLogger.warn("Parser", "Unrecognized content", File=self.MetaFile,
-                            Line=self._LineIndex + 1, ExtraData=self._CurrentLine);
+                           Line=self._LineIndex + 1, ExtraData=self._CurrentLine)
         self._ValueList[0:1] = [self._CurrentLine]
 
-    ## Section header parser
+    # Section header parser
     #
     #   The section header is always in following format:
     #
@@ -298,7 +307,7 @@ class MetaFileParser(object):
             else:
                 self._SectionType = MODEL_UNKNOWN
                 EdkLogger.warn("Parser", "Unrecognized section", File=self.MetaFile,
-                                Line=self._LineIndex+1, ExtraData=self._CurrentLine)
+                               Line=self._LineIndex+1, ExtraData=self._CurrentLine)
             # S1 is always Arch
             if len(ItemList) > 1:
                 S1 = ItemList[1].upper()
@@ -319,7 +328,7 @@ class MetaFileParser(object):
         # If the section information is needed later, it should be stored in database
         self._ValueList[0] = self._SectionName
 
-    ## [defines] section parser
+    # [defines] section parser
     @ParseMacro
     def _DefineParser(self):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_EQUAL_SPLIT, 1)
@@ -331,7 +340,8 @@ class MetaFileParser(object):
             EdkLogger.error('Parser', FORMAT_INVALID, "No value specified",
                             ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex+1)
 
-        self._ValueList = [ReplaceMacro(Value, self._Macros) for Value in self._ValueList]
+        self._ValueList = [ReplaceMacro(Value, self._Macros)
+                           for Value in self._ValueList]
         Name, Value = self._ValueList[1], self._ValueList[2]
         # Sometimes, we need to make differences between EDK and EDK2 modules
         if Name == 'INF_VERSION':
@@ -351,7 +361,8 @@ class MetaFileParser(object):
         elif Name == 'MODULE_UNI_FILE':
             UniFile = os.path.join(os.path.dirname(self.MetaFile), Value)
             if os.path.exists(UniFile):
-                self._UniObj = UniParser(UniFile, IsExtraUni=False, IsModuleUni=True)
+                self._UniObj = UniParser(
+                    UniFile, IsExtraUni=False, IsModuleUni=True)
                 self._UniObj.Start()
             else:
                 EdkLogger.error('Parser', FILE_NOT_FOUND, "Module UNI file %s is missing." % Value,
@@ -360,14 +371,15 @@ class MetaFileParser(object):
         elif Name == 'PACKAGE_UNI_FILE':
             UniFile = os.path.join(os.path.dirname(self.MetaFile), Value)
             if os.path.exists(UniFile):
-                self._UniObj = UniParser(UniFile, IsExtraUni=False, IsModuleUni=False)
+                self._UniObj = UniParser(
+                    UniFile, IsExtraUni=False, IsModuleUni=False)
 
         if isinstance(self, InfParser) and self._Version < 0x00010005:
             # EDK module allows using defines as macros
             self._FileLocalMacros[Name] = Value
         self._Defines[Name] = Value
 
-    ## [BuildOptions] section parser
+    # [BuildOptions] section parser
     @ParseMacro
     def _BuildOptionParser(self):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_EQUAL_SPLIT, 1)
@@ -377,18 +389,19 @@ class MetaFileParser(object):
             self._ValueList[1] = TokenList2[1]              # keys
         else:
             self._ValueList[1] = TokenList[0]
-        if len(TokenList) == 2 and not isinstance(self, DscParser): # value
+        if len(TokenList) == 2 and not isinstance(self, DscParser):  # value
             self._ValueList[2] = ReplaceMacro(TokenList[1], self._Macros)
 
         if self._ValueList[1].count('_') != 4:
             EdkLogger.error(
                 'Parser',
                 FORMAT_INVALID,
-                "'%s' must be in format of <TARGET>_<TOOLCHAIN>_<ARCH>_<TOOL>_FLAGS" % self._ValueList[1],
+                "'%s' must be in format of <TARGET>_<TOOLCHAIN>_<ARCH>_<TOOL>_FLAGS" % self._ValueList[
+                    1],
                 ExtraData=self._CurrentLine,
                 File=self.MetaFile,
                 Line=self._LineIndex+1
-                )
+            )
 
     def _GetMacros(self):
         Macros = {}
@@ -396,23 +409,24 @@ class MetaFileParser(object):
         Macros.update(self._GetApplicableSectionMacro())
         return Macros
 
+    # Get section Macros that are applicable to current line, which may come from other sections
+    # that share the same name while scope is wider
 
-    ## Get section Macros that are applicable to current line, which may come from other sections
-    ## that share the same name while scope is wider
     def _GetApplicableSectionMacro(self):
         Macros = {}
         for Scope1, Scope2 in [("COMMON", "COMMON"), ("COMMON", self._Scope[0][1]),
                                (self._Scope[0][0], "COMMON"), (self._Scope[0][0], self._Scope[0][1])]:
             if (self._SectionType, Scope1, Scope2) in self._SectionsMacroDict:
-                Macros.update(self._SectionsMacroDict[(self._SectionType, Scope1, Scope2)])
+                Macros.update(self._SectionsMacroDict[(
+                    self._SectionType, Scope1, Scope2)])
         return Macros
 
-    _SectionParser  = {}
-    Finished        = property(_GetFinished, _SetFinished)
-    _Macros         = property(_GetMacros)
+    _SectionParser = {}
+    Finished = property(_GetFinished, _SetFinished)
+    _Macros = property(_GetMacros)
 
 
-## INF file parser class
+# INF file parser class
 #
 #   @param      FilePath        The path of platform description file
 #   @param      FileType        The raw data of DSC file
@@ -422,30 +436,30 @@ class MetaFileParser(object):
 class InfParser(MetaFileParser):
     # INF file supported data types (one type per section)
     DataType = {
-        TAB_UNKNOWN.upper() : MODEL_UNKNOWN,
-        TAB_INF_DEFINES.upper() : MODEL_META_DATA_HEADER,
-        TAB_DSC_DEFINES_DEFINE : MODEL_META_DATA_DEFINE,
-        TAB_BUILD_OPTIONS.upper() : MODEL_META_DATA_BUILD_OPTION,
-        TAB_INCLUDES.upper() : MODEL_EFI_INCLUDE,
-        TAB_LIBRARIES.upper() : MODEL_EFI_LIBRARY_INSTANCE,
-        TAB_LIBRARY_CLASSES.upper() : MODEL_EFI_LIBRARY_CLASS,
-        TAB_PACKAGES.upper() : MODEL_META_DATA_PACKAGE,
-        TAB_NMAKE.upper() : MODEL_META_DATA_NMAKE,
-        TAB_INF_FIXED_PCD.upper() : MODEL_PCD_FIXED_AT_BUILD,
-        TAB_INF_PATCH_PCD.upper() : MODEL_PCD_PATCHABLE_IN_MODULE,
-        TAB_INF_FEATURE_PCD.upper() : MODEL_PCD_FEATURE_FLAG,
-        TAB_INF_PCD_EX.upper() : MODEL_PCD_DYNAMIC_EX,
-        TAB_INF_PCD.upper() : MODEL_PCD_DYNAMIC,
-        TAB_SOURCES.upper() : MODEL_EFI_SOURCE_FILE,
-        TAB_GUIDS.upper() : MODEL_EFI_GUID,
-        TAB_PROTOCOLS.upper() : MODEL_EFI_PROTOCOL,
-        TAB_PPIS.upper() : MODEL_EFI_PPI,
-        TAB_DEPEX.upper() : MODEL_EFI_DEPEX,
-        TAB_BINARIES.upper() : MODEL_EFI_BINARY_FILE,
-        TAB_USER_EXTENSIONS.upper() : MODEL_META_DATA_USER_EXTENSION
+        TAB_UNKNOWN.upper(): MODEL_UNKNOWN,
+        TAB_INF_DEFINES.upper(): MODEL_META_DATA_HEADER,
+        TAB_DSC_DEFINES_DEFINE: MODEL_META_DATA_DEFINE,
+        TAB_BUILD_OPTIONS.upper(): MODEL_META_DATA_BUILD_OPTION,
+        TAB_INCLUDES.upper(): MODEL_EFI_INCLUDE,
+        TAB_LIBRARIES.upper(): MODEL_EFI_LIBRARY_INSTANCE,
+        TAB_LIBRARY_CLASSES.upper(): MODEL_EFI_LIBRARY_CLASS,
+        TAB_PACKAGES.upper(): MODEL_META_DATA_PACKAGE,
+        TAB_NMAKE.upper(): MODEL_META_DATA_NMAKE,
+        TAB_INF_FIXED_PCD.upper(): MODEL_PCD_FIXED_AT_BUILD,
+        TAB_INF_PATCH_PCD.upper(): MODEL_PCD_PATCHABLE_IN_MODULE,
+        TAB_INF_FEATURE_PCD.upper(): MODEL_PCD_FEATURE_FLAG,
+        TAB_INF_PCD_EX.upper(): MODEL_PCD_DYNAMIC_EX,
+        TAB_INF_PCD.upper(): MODEL_PCD_DYNAMIC,
+        TAB_SOURCES.upper(): MODEL_EFI_SOURCE_FILE,
+        TAB_GUIDS.upper(): MODEL_EFI_GUID,
+        TAB_PROTOCOLS.upper(): MODEL_EFI_PROTOCOL,
+        TAB_PPIS.upper(): MODEL_EFI_PPI,
+        TAB_DEPEX.upper(): MODEL_EFI_DEPEX,
+        TAB_BINARIES.upper(): MODEL_EFI_BINARY_FILE,
+        TAB_USER_EXTENSIONS.upper(): MODEL_META_DATA_USER_EXTENSION
     }
 
-    ## Constructor of InfParser
+    # Constructor of InfParser
     #
     #  Initialize object of InfParser
     #
@@ -462,7 +476,7 @@ class InfParser(MetaFileParser):
         self.TblFile = EccGlobalData.gDb.TblFile
         self.FileID = -1
 
-    ## Parser starter
+    # Parser starter
     def Start(self):
         NmakeLine = ''
         Content = ''
@@ -470,7 +484,8 @@ class InfParser(MetaFileParser):
         try:
             Content = open(str(self.MetaFile), 'r').readlines()
         except:
-            EdkLogger.error("Parser", FILE_READ_FAILURE, ExtraData=self.MetaFile)
+            EdkLogger.error("Parser", FILE_READ_FAILURE,
+                            ExtraData=self.MetaFile)
         #
         # Insert a record for file
         #
@@ -503,7 +518,7 @@ class InfParser(MetaFileParser):
                     Usage += ' ' + Line[Line.find(TAB_COMMENT_SPLIT):]
                     Line = Line[:Line.find(TAB_COMMENT_SPLIT)]
             else:
-            # skip empty, commented, block commented lines
+                # skip empty, commented, block commented lines
                 Line = CleanString(Content[Index], AllowCppStyleComment=True)
                 Usage = ''
             NextLine = ''
@@ -541,13 +556,15 @@ class InfParser(MetaFileParser):
                                              MODEL_EFI_PPI,
                                              MODEL_META_DATA_USER_EXTENSION]:
                         EdkLogger.error('Parser', FORMAT_INVALID,
-                                        "Section [%s] is not allowed in inf file without version" % (self._SectionName),
+                                        "Section [%s] is not allowed in inf file without version" % (
+                                            self._SectionName),
                                         ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex+1)
                 elif self._SectionType in [MODEL_EFI_INCLUDE,
                                            MODEL_EFI_LIBRARY_INSTANCE,
                                            MODEL_META_DATA_NMAKE]:
                     EdkLogger.error('Parser', FORMAT_INVALID,
-                                    "Section [%s] is not allowed in inf file with version 0x%08x" % (self._SectionName, self._Version),
+                                    "Section [%s] is not allowed in inf file with version 0x%08x" % (
+                                        self._SectionName, self._Version),
                                     ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex+1)
                 continue
             # merge two lines specified by '\' in section NMAKE
@@ -602,7 +619,7 @@ class InfParser(MetaFileParser):
                             File=self.MetaFile)
         self._Done()
 
-    ## Data parser for the format in which there's path
+    # Data parser for the format in which there's path
     #
     #   Only path can have macro used. So we need to replace them before use.
     #
@@ -618,7 +635,7 @@ class InfParser(MetaFileParser):
 
                 self._ValueList[Index] = ReplaceMacro(Value, Macros)
 
-    ## Parse [Sources] section
+    # Parse [Sources] section
     #
     #   Only path can have macro used. So we need to replace them before use.
     #
@@ -630,12 +647,14 @@ class InfParser(MetaFileParser):
         # For Acpi tables, remove macro like ' TABLE_NAME=Sata1'
         if 'COMPONENT_TYPE' in Macros:
             if self._Defines['COMPONENT_TYPE'].upper() == 'ACPITABLE':
-                self._ValueList[0] = GetSplitValueList(self._ValueList[0], ' ', 1)[0]
+                self._ValueList[0] = GetSplitValueList(
+                    self._ValueList[0], ' ', 1)[0]
         if self._Defines['BASE_NAME'] == 'Microcode':
             pass
-        self._ValueList = [ReplaceMacro(Value, Macros) for Value in self._ValueList]
+        self._ValueList = [ReplaceMacro(Value, Macros)
+                           for Value in self._ValueList]
 
-    ## Parse [Binaries] section
+    # Parse [Binaries] section
     #
     #   Only path can have macro used. So we need to replace them before use.
     #
@@ -644,20 +663,23 @@ class InfParser(MetaFileParser):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT, 2)
         if len(TokenList) < 2:
             EdkLogger.error('Parser', FORMAT_INVALID, "No file type or path specified",
-                            ExtraData=self._CurrentLine + " (<FileType> | <FilePath> [| <Target>])",
+                            ExtraData=self._CurrentLine +
+                            " (<FileType> | <FilePath> [| <Target>])",
                             File=self.MetaFile, Line=self._LineIndex+1)
         if not TokenList[0]:
             EdkLogger.error('Parser', FORMAT_INVALID, "No file type specified",
-                            ExtraData=self._CurrentLine + " (<FileType> | <FilePath> [| <Target>])",
+                            ExtraData=self._CurrentLine +
+                            " (<FileType> | <FilePath> [| <Target>])",
                             File=self.MetaFile, Line=self._LineIndex+1)
         if not TokenList[1]:
             EdkLogger.error('Parser', FORMAT_INVALID, "No file path specified",
-                            ExtraData=self._CurrentLine + " (<FileType> | <FilePath> [| <Target>])",
+                            ExtraData=self._CurrentLine +
+                            " (<FileType> | <FilePath> [| <Target>])",
                             File=self.MetaFile, Line=self._LineIndex+1)
         self._ValueList[0:len(TokenList)] = TokenList
         self._ValueList[1] = ReplaceMacro(self._ValueList[1], self._Macros)
 
-    ## [nmake] section parser (Edk.x style only)
+    # [nmake] section parser (Edk.x style only)
     def _NmakeParser(self):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_EQUAL_SPLIT, 1)
         self._ValueList[0:len(TokenList)] = TokenList
@@ -666,60 +688,65 @@ class InfParser(MetaFileParser):
         # remove self-reference in macro setting
         #self._ValueList[1] = ReplaceMacro(self._ValueList[1], {self._ValueList[0]:''})
 
-    ## [FixedPcd], [FeaturePcd], [PatchPcd], [Pcd] and [PcdEx] sections parser
+    # [FixedPcd], [FeaturePcd], [PatchPcd], [Pcd] and [PcdEx] sections parser
     @ParseMacro
     def _PcdParser(self):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT, 1)
         ValueList = GetSplitValueList(TokenList[0], TAB_SPLIT)
         if len(ValueList) != 2:
             EdkLogger.error('Parser', FORMAT_INVALID, "Illegal token space GUID and PCD name format",
-                            ExtraData=self._CurrentLine + " (<TokenSpaceGuidCName>.<PcdCName>)",
+                            ExtraData=self._CurrentLine +
+                            " (<TokenSpaceGuidCName>.<PcdCName>)",
                             File=self.MetaFile, Line=self._LineIndex+1)
         self._ValueList[0:1] = ValueList
         if len(TokenList) > 1:
             self._ValueList[2] = TokenList[1]
         if self._ValueList[0] == '' or self._ValueList[1] == '':
             EdkLogger.error('Parser', FORMAT_INVALID, "No token space GUID or PCD name specified",
-                            ExtraData=self._CurrentLine + " (<TokenSpaceGuidCName>.<PcdCName>)",
+                            ExtraData=self._CurrentLine +
+                            " (<TokenSpaceGuidCName>.<PcdCName>)",
                             File=self.MetaFile, Line=self._LineIndex+1)
 
         # if value are 'True', 'true', 'TRUE' or 'False', 'false', 'FALSE', replace with integer 1 or 0.
         if self._ValueList[2] != '':
-            InfPcdValueList = GetSplitValueList(TokenList[1], TAB_VALUE_SPLIT, 1)
+            InfPcdValueList = GetSplitValueList(
+                TokenList[1], TAB_VALUE_SPLIT, 1)
             if InfPcdValueList[0] in ['True', 'true', 'TRUE']:
-                self._ValueList[2] = TokenList[1].replace(InfPcdValueList[0], '1', 1);
+                self._ValueList[2] = TokenList[1].replace(
+                    InfPcdValueList[0], '1', 1)
             elif InfPcdValueList[0] in ['False', 'false', 'FALSE']:
-                self._ValueList[2] = TokenList[1].replace(InfPcdValueList[0], '0', 1);
+                self._ValueList[2] = TokenList[1].replace(
+                    InfPcdValueList[0], '0', 1)
 
-    ## [depex] section parser
+    # [depex] section parser
     @ParseMacro
     def _DepexParser(self):
         self._ValueList[0:1] = [self._CurrentLine]
 
     _SectionParser = {
-        MODEL_UNKNOWN                   :   MetaFileParser._Skip,
-        MODEL_META_DATA_HEADER          :   MetaFileParser._DefineParser,
-        MODEL_META_DATA_BUILD_OPTION    :   MetaFileParser._BuildOptionParser,
-        MODEL_EFI_INCLUDE               :   _IncludeParser,                 # for Edk.x modules
-        MODEL_EFI_LIBRARY_INSTANCE      :   MetaFileParser._CommonParser,   # for Edk.x modules
-        MODEL_EFI_LIBRARY_CLASS         :   MetaFileParser._PathParser,
-        MODEL_META_DATA_PACKAGE         :   MetaFileParser._PathParser,
-        MODEL_META_DATA_NMAKE           :   _NmakeParser,                   # for Edk.x modules
-        MODEL_PCD_FIXED_AT_BUILD        :   _PcdParser,
-        MODEL_PCD_PATCHABLE_IN_MODULE   :   _PcdParser,
-        MODEL_PCD_FEATURE_FLAG          :   _PcdParser,
-        MODEL_PCD_DYNAMIC_EX            :   _PcdParser,
-        MODEL_PCD_DYNAMIC               :   _PcdParser,
-        MODEL_EFI_SOURCE_FILE           :   _SourceFileParser,
-        MODEL_EFI_GUID                  :   MetaFileParser._CommonParser,
-        MODEL_EFI_PROTOCOL              :   MetaFileParser._CommonParser,
-        MODEL_EFI_PPI                   :   MetaFileParser._CommonParser,
-        MODEL_EFI_DEPEX                 :   _DepexParser,
-        MODEL_EFI_BINARY_FILE           :   _BinaryFileParser,
-        MODEL_META_DATA_USER_EXTENSION  :   MetaFileParser._Skip,
+        MODEL_UNKNOWN:   MetaFileParser._Skip,
+        MODEL_META_DATA_HEADER:   MetaFileParser._DefineParser,
+        MODEL_META_DATA_BUILD_OPTION:   MetaFileParser._BuildOptionParser,
+        MODEL_EFI_INCLUDE:   _IncludeParser,                 # for Edk.x modules
+        MODEL_EFI_LIBRARY_INSTANCE:   MetaFileParser._CommonParser,   # for Edk.x modules
+        MODEL_EFI_LIBRARY_CLASS:   MetaFileParser._PathParser,
+        MODEL_META_DATA_PACKAGE:   MetaFileParser._PathParser,
+        MODEL_META_DATA_NMAKE:   _NmakeParser,                   # for Edk.x modules
+        MODEL_PCD_FIXED_AT_BUILD:   _PcdParser,
+        MODEL_PCD_PATCHABLE_IN_MODULE:   _PcdParser,
+        MODEL_PCD_FEATURE_FLAG:   _PcdParser,
+        MODEL_PCD_DYNAMIC_EX:   _PcdParser,
+        MODEL_PCD_DYNAMIC:   _PcdParser,
+        MODEL_EFI_SOURCE_FILE:   _SourceFileParser,
+        MODEL_EFI_GUID:   MetaFileParser._CommonParser,
+        MODEL_EFI_PROTOCOL:   MetaFileParser._CommonParser,
+        MODEL_EFI_PPI:   MetaFileParser._CommonParser,
+        MODEL_EFI_DEPEX:   _DepexParser,
+        MODEL_EFI_BINARY_FILE:   _BinaryFileParser,
+        MODEL_META_DATA_USER_EXTENSION:   MetaFileParser._Skip,
     }
 
-## DSC file parser class
+# DSC file parser class
 #
 #   @param      FilePath        The path of platform description file
 #   @param      FileType        The raw data of DSC file
@@ -728,34 +755,36 @@ class InfParser(MetaFileParser):
 #   @param      Owner           Owner ID (for sub-section parsing)
 #   @param      From            ID from which the data comes (for !INCLUDE directive)
 #
+
+
 class DscParser(MetaFileParser):
     # DSC file supported data types (one type per section)
     DataType = {
-        TAB_SKUIDS.upper()                          :   MODEL_EFI_SKU_ID,
-        TAB_LIBRARIES.upper()                       :   MODEL_EFI_LIBRARY_INSTANCE,
-        TAB_LIBRARY_CLASSES.upper()                 :   MODEL_EFI_LIBRARY_CLASS,
-        TAB_BUILD_OPTIONS.upper()                   :   MODEL_META_DATA_BUILD_OPTION,
-        TAB_PCDS_FIXED_AT_BUILD_NULL.upper()        :   MODEL_PCD_FIXED_AT_BUILD,
-        TAB_PCDS_PATCHABLE_IN_MODULE_NULL.upper()   :   MODEL_PCD_PATCHABLE_IN_MODULE,
-        TAB_PCDS_FEATURE_FLAG_NULL.upper()          :   MODEL_PCD_FEATURE_FLAG,
-        TAB_PCDS_DYNAMIC_DEFAULT_NULL.upper()       :   MODEL_PCD_DYNAMIC_DEFAULT,
-        TAB_PCDS_DYNAMIC_HII_NULL.upper()           :   MODEL_PCD_DYNAMIC_HII,
-        TAB_PCDS_DYNAMIC_VPD_NULL.upper()           :   MODEL_PCD_DYNAMIC_VPD,
-        TAB_PCDS_DYNAMIC_EX_DEFAULT_NULL.upper()    :   MODEL_PCD_DYNAMIC_EX_DEFAULT,
-        TAB_PCDS_DYNAMIC_EX_HII_NULL.upper()        :   MODEL_PCD_DYNAMIC_EX_HII,
-        TAB_PCDS_DYNAMIC_EX_VPD_NULL.upper()        :   MODEL_PCD_DYNAMIC_EX_VPD,
-        TAB_COMPONENTS.upper()                      :   MODEL_META_DATA_COMPONENT,
-        TAB_DSC_DEFINES.upper()                     :   MODEL_META_DATA_HEADER,
-        TAB_DSC_DEFINES_DEFINE                      :   MODEL_META_DATA_DEFINE,
-        TAB_DSC_DEFINES_EDKGLOBAL                   :   MODEL_META_DATA_GLOBAL_DEFINE,
-        TAB_INCLUDE.upper()                         :   MODEL_META_DATA_INCLUDE,
-        TAB_IF.upper()                              :   MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
-        TAB_IF_DEF.upper()                          :   MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF,
-        TAB_IF_N_DEF.upper()                        :   MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF,
-        TAB_ELSE_IF.upper()                         :   MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSEIF,
-        TAB_ELSE.upper()                            :   MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSE,
-        TAB_END_IF.upper()                          :   MODEL_META_DATA_CONDITIONAL_STATEMENT_ENDIF,
-        TAB_ERROR.upper()                           :   MODEL_META_DATA_CONDITIONAL_STATEMENT_ERROR,
+        TAB_SKUIDS.upper():   MODEL_EFI_SKU_ID,
+        TAB_LIBRARIES.upper():   MODEL_EFI_LIBRARY_INSTANCE,
+        TAB_LIBRARY_CLASSES.upper():   MODEL_EFI_LIBRARY_CLASS,
+        TAB_BUILD_OPTIONS.upper():   MODEL_META_DATA_BUILD_OPTION,
+        TAB_PCDS_FIXED_AT_BUILD_NULL.upper():   MODEL_PCD_FIXED_AT_BUILD,
+        TAB_PCDS_PATCHABLE_IN_MODULE_NULL.upper():   MODEL_PCD_PATCHABLE_IN_MODULE,
+        TAB_PCDS_FEATURE_FLAG_NULL.upper():   MODEL_PCD_FEATURE_FLAG,
+        TAB_PCDS_DYNAMIC_DEFAULT_NULL.upper():   MODEL_PCD_DYNAMIC_DEFAULT,
+        TAB_PCDS_DYNAMIC_HII_NULL.upper():   MODEL_PCD_DYNAMIC_HII,
+        TAB_PCDS_DYNAMIC_VPD_NULL.upper():   MODEL_PCD_DYNAMIC_VPD,
+        TAB_PCDS_DYNAMIC_EX_DEFAULT_NULL.upper():   MODEL_PCD_DYNAMIC_EX_DEFAULT,
+        TAB_PCDS_DYNAMIC_EX_HII_NULL.upper():   MODEL_PCD_DYNAMIC_EX_HII,
+        TAB_PCDS_DYNAMIC_EX_VPD_NULL.upper():   MODEL_PCD_DYNAMIC_EX_VPD,
+        TAB_COMPONENTS.upper():   MODEL_META_DATA_COMPONENT,
+        TAB_DSC_DEFINES.upper():   MODEL_META_DATA_HEADER,
+        TAB_DSC_DEFINES_DEFINE:   MODEL_META_DATA_DEFINE,
+        TAB_DSC_DEFINES_EDKGLOBAL:   MODEL_META_DATA_GLOBAL_DEFINE,
+        TAB_INCLUDE.upper():   MODEL_META_DATA_INCLUDE,
+        TAB_IF.upper():   MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
+        TAB_IF_DEF.upper():   MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF,
+        TAB_IF_N_DEF.upper():   MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF,
+        TAB_ELSE_IF.upper():   MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSEIF,
+        TAB_ELSE.upper():   MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSE,
+        TAB_END_IF.upper():   MODEL_META_DATA_CONDITIONAL_STATEMENT_ENDIF,
+        TAB_ERROR.upper():   MODEL_META_DATA_CONDITIONAL_STATEMENT_ERROR,
     }
 
     # Valid names in define section
@@ -784,7 +813,7 @@ class DscParser(MetaFileParser):
 
     SymbolPattern = ValueExpression.SymbolPattern
 
-    ## Constructor of DscParser
+    # Constructor of DscParser
     #
     #  Initialize object of DscParser
     #
@@ -812,18 +841,19 @@ class DscParser(MetaFileParser):
         #  Map the ID between the original table and new table to track
         #  the owner item
         #
-        self._IdMapping = {-1:-1}
+        self._IdMapping = {-1: -1}
 
         self.TblFile = EccGlobalData.gDb.TblFile
         self.FileID = -1
 
-    ## Parser starter
+    # Parser starter
     def Start(self):
         Content = ''
         try:
             Content = open(str(self.MetaFile.Path), 'r').readlines()
         except:
-            EdkLogger.error("Parser", FILE_READ_FAILURE, ExtraData=self.MetaFile)
+            EdkLogger.error("Parser", FILE_READ_FAILURE,
+                            ExtraData=self.MetaFile)
         #
         # Insert a record for file
         #
@@ -834,7 +864,6 @@ class DscParser(MetaFileParser):
         else:
             self.FileID = self.TblFile.InsertFile(Filename, MODEL_FILE_DSC)
 
-
         for Index in range(0, len(Content)):
             Line = CleanString(Content[Index])
             # skip empty line
@@ -880,21 +909,21 @@ class DscParser(MetaFileParser):
             #
             for Arch, ModuleType in self._Scope:
                 self._LastItem = self._Store(
-                                        self._ItemType,
-                                        self._ValueList[0],
-                                        self._ValueList[1],
-                                        self._ValueList[2],
-                                        Arch,
-                                        ModuleType,
-                                        self._Owner[-1],
-                                        self.FileID,
-                                        self._From,
-                                        self._LineIndex+1,
-                                        -1,
-                                        self._LineIndex+1,
-                                        -1,
-                                        self._Enabled
-                                        )
+                    self._ItemType,
+                    self._ValueList[0],
+                    self._ValueList[1],
+                    self._ValueList[2],
+                    Arch,
+                    ModuleType,
+                    self._Owner[-1],
+                    self.FileID,
+                    self._From,
+                    self._LineIndex+1,
+                    -1,
+                    self._LineIndex+1,
+                    -1,
+                    self._Enabled
+                )
 
         if self._DirectiveStack:
             Type, Line, Text = self._DirectiveStack[-1]
@@ -902,7 +931,7 @@ class DscParser(MetaFileParser):
                             ExtraData=Text, File=self.MetaFile, Line=Line)
         self._Done()
 
-    ## <subsection_header> parser
+    # <subsection_header> parser
     def _SubsectionHeaderParser(self):
         self._SubsectionName = self._CurrentLine[1:-1].upper()
         if self._SubsectionName in self.DataType:
@@ -913,7 +942,7 @@ class DscParser(MetaFileParser):
                            Line=self._LineIndex+1, ExtraData=self._CurrentLine)
         self._ValueList[0] = self._SubsectionName
 
-    ## Directive statement parser
+    # Directive statement parser
     def _DirectiveParser(self):
         self._ValueList = ['', '', '']
         TokenList = GetSplitValueList(self._CurrentLine, ' ', 1)
@@ -951,7 +980,8 @@ class DscParser(MetaFileParser):
                 EdkLogger.error("Parser", FORMAT_INVALID, "'!elseif' after '!else'",
                                 File=self.MetaFile, Line=self._LineIndex+1,
                                 ExtraData=self._CurrentLine)
-            self._DirectiveStack.append((ItemType, self._LineIndex+1, self._CurrentLine))
+            self._DirectiveStack.append(
+                (ItemType, self._LineIndex+1, self._CurrentLine))
         elif self._From > 0:
             EdkLogger.error('Parser', FORMAT_INVALID,
                             "No '!include' allowed in included file",
@@ -963,23 +993,23 @@ class DscParser(MetaFileParser):
         # LineBegin=-1, ColumnBegin=-1, LineEnd=-1, ColumnEnd=-1, Enabled=-1
         #
         self._LastItem = self._Store(
-                                ItemType,
-                                self._ValueList[0],
-                                self._ValueList[1],
-                                self._ValueList[2],
-                                'COMMON',
-                                'COMMON',
-                                self._Owner[-1],
-                                self.FileID,
-                                self._From,
-                                self._LineIndex+1,
-                                -1,
-                                self._LineIndex+1,
-                                -1,
-                                0
-                                )
+            ItemType,
+            self._ValueList[0],
+            self._ValueList[1],
+            self._ValueList[2],
+            'COMMON',
+            'COMMON',
+            self._Owner[-1],
+            self.FileID,
+            self._From,
+            self._LineIndex+1,
+            -1,
+            self._LineIndex+1,
+            -1,
+            0
+        )
 
-    ## [defines] section parser
+    # [defines] section parser
     @ParseMacro
     def _DefineParser(self):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_EQUAL_SPLIT, 1)
@@ -993,7 +1023,7 @@ class DscParser(MetaFileParser):
             EdkLogger.error('Parser', FORMAT_INVALID, "No value specified",
                             ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex+1)
         if (not self._ValueList[1] in self.DefineKeywords and
-            (self._InSubsection and self._ValueList[1] not in self.SubSectionDefineKeywords)):
+                (self._InSubsection and self._ValueList[1] not in self.SubSectionDefineKeywords)):
             EdkLogger.error('Parser', FORMAT_INVALID,
                             "Unknown keyword found: %s. "
                             "If this is a macro you must "
@@ -1010,11 +1040,11 @@ class DscParser(MetaFileParser):
                             ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex+1)
         self._ValueList[0:len(TokenList)] = TokenList
 
-    ## Parse Edk style of library modules
+    # Parse Edk style of library modules
     def _LibraryInstanceParser(self):
         self._ValueList[0] = self._CurrentLine
 
-    ## PCD sections parser
+    # PCD sections parser
     #
     #   [PcdsFixedAtBuild]
     #   [PcdsPatchableInModule]
@@ -1036,20 +1066,24 @@ class DscParser(MetaFileParser):
             self._ValueList[2] = TokenList[1]
         if self._ValueList[0] == '' or self._ValueList[1] == '':
             EdkLogger.error('Parser', FORMAT_INVALID, "No token space GUID or PCD name specified",
-                            ExtraData=self._CurrentLine + " (<TokenSpaceGuidCName>.<TokenCName>|<PcdValue>)",
+                            ExtraData=self._CurrentLine +
+                            " (<TokenSpaceGuidCName>.<TokenCName>|<PcdValue>)",
                             File=self.MetaFile, Line=self._LineIndex+1)
         if self._ValueList[2] == '':
             EdkLogger.error('Parser', FORMAT_INVALID, "No PCD value given",
-                            ExtraData=self._CurrentLine + " (<TokenSpaceGuidCName>.<TokenCName>|<PcdValue>)",
+                            ExtraData=self._CurrentLine +
+                            " (<TokenSpaceGuidCName>.<TokenCName>|<PcdValue>)",
                             File=self.MetaFile, Line=self._LineIndex+1)
         # if value are 'True', 'true', 'TRUE' or 'False', 'false', 'FALSE', replace with integer 1 or 0.
         DscPcdValueList = GetSplitValueList(TokenList[1], TAB_VALUE_SPLIT, 1)
         if DscPcdValueList[0] in ['True', 'true', 'TRUE']:
-            self._ValueList[2] = TokenList[1].replace(DscPcdValueList[0], '1', 1);
+            self._ValueList[2] = TokenList[1].replace(
+                DscPcdValueList[0], '1', 1)
         elif DscPcdValueList[0] in ['False', 'false', 'FALSE']:
-            self._ValueList[2] = TokenList[1].replace(DscPcdValueList[0], '0', 1);
+            self._ValueList[2] = TokenList[1].replace(
+                DscPcdValueList[0], '0', 1)
 
-    ## [components] section parser
+    # [components] section parser
     @ParseMacro
     def _ComponentParser(self):
         if self._CurrentLine[-1] == '{':
@@ -1058,27 +1092,30 @@ class DscParser(MetaFileParser):
         else:
             self._ValueList[0] = self._CurrentLine
 
-    ## [LibraryClasses] section
+    # [LibraryClasses] section
     @ParseMacro
     def _LibraryClassParser(self):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
         if len(TokenList) < 2:
             EdkLogger.error('Parser', FORMAT_INVALID, "No library class or instance specified",
-                            ExtraData=self._CurrentLine + " (<LibraryClassName>|<LibraryInstancePath>)",
+                            ExtraData=self._CurrentLine +
+                            " (<LibraryClassName>|<LibraryInstancePath>)",
                             File=self.MetaFile, Line=self._LineIndex+1)
         if TokenList[0] == '':
             EdkLogger.error('Parser', FORMAT_INVALID, "No library class specified",
-                            ExtraData=self._CurrentLine + " (<LibraryClassName>|<LibraryInstancePath>)",
+                            ExtraData=self._CurrentLine +
+                            " (<LibraryClassName>|<LibraryInstancePath>)",
                             File=self.MetaFile, Line=self._LineIndex+1)
         if TokenList[1] == '':
             EdkLogger.error('Parser', FORMAT_INVALID, "No library instance specified",
-                            ExtraData=self._CurrentLine + " (<LibraryClassName>|<LibraryInstancePath>)",
+                            ExtraData=self._CurrentLine +
+                            " (<LibraryClassName>|<LibraryInstancePath>)",
                             File=self.MetaFile, Line=self._LineIndex+1)
 
         self._ValueList[0:len(TokenList)] = TokenList
 
+    # [BuildOptions] section parser
 
-    ## [BuildOptions] section parser
     @ParseMacro
     def _BuildOptionParser(self):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_EQUAL_SPLIT, 1)
@@ -1095,15 +1132,17 @@ class DscParser(MetaFileParser):
             EdkLogger.error(
                 'Parser',
                 FORMAT_INVALID,
-                "'%s' must be in format of <TARGET>_<TOOLCHAIN>_<ARCH>_<TOOL>_FLAGS" % self._ValueList[1],
+                "'%s' must be in format of <TARGET>_<TOOLCHAIN>_<ARCH>_<TOOL>_FLAGS" % self._ValueList[
+                    1],
                 ExtraData=self._CurrentLine,
                 File=self.MetaFile,
                 Line=self._LineIndex+1
-                )
+            )
 
-    ## Override parent's method since we'll do all macro replacements in parser
+    # Override parent's method since we'll do all macro replacements in parser
     def _GetMacros(self):
-        Macros = dict( [('ARCH', 'IA32'), ('FAMILY', TAB_COMPILER_MSFT), ('TOOL_CHAIN_TAG', 'VS2008x86'), ('TARGET', 'DEBUG')])
+        Macros = dict([('ARCH', 'IA32'), ('FAMILY', TAB_COMPILER_MSFT),
+                      ('TOOL_CHAIN_TAG', 'VS2008x86'), ('TARGET', 'DEBUG')])
         Macros.update(self._FileLocalMacros)
         Macros.update(self._GetApplicableSectionMacro())
         Macros.update(GlobalData.gEdkGlobal)
@@ -1116,39 +1155,40 @@ class DscParser(MetaFileParser):
 
     def _PostProcess(self):
         Processer = {
-            MODEL_META_DATA_SECTION_HEADER                  :   self.__ProcessSectionHeader,
-            MODEL_META_DATA_SUBSECTION_HEADER               :   self.__ProcessSubsectionHeader,
-            MODEL_META_DATA_HEADER                          :   self.__ProcessDefine,
-            MODEL_META_DATA_DEFINE                          :   self.__ProcessDefine,
-            MODEL_META_DATA_GLOBAL_DEFINE                   :   self.__ProcessDefine,
-            MODEL_META_DATA_INCLUDE                         :   self.__ProcessDirective,
-            MODEL_META_DATA_CONDITIONAL_STATEMENT_IF        :   self.__ProcessDirective,
-            MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSE      :   self.__ProcessDirective,
-            MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF     :   self.__ProcessDirective,
-            MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF    :   self.__ProcessDirective,
-            MODEL_META_DATA_CONDITIONAL_STATEMENT_ENDIF     :   self.__ProcessDirective,
-            MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSEIF    :   self.__ProcessDirective,
-            MODEL_EFI_SKU_ID                                :   self.__ProcessSkuId,
-            MODEL_EFI_LIBRARY_INSTANCE                      :   self.__ProcessLibraryInstance,
-            MODEL_EFI_LIBRARY_CLASS                         :   self.__ProcessLibraryClass,
-            MODEL_PCD_FIXED_AT_BUILD                        :   self.__ProcessPcd,
-            MODEL_PCD_PATCHABLE_IN_MODULE                   :   self.__ProcessPcd,
-            MODEL_PCD_FEATURE_FLAG                          :   self.__ProcessPcd,
-            MODEL_PCD_DYNAMIC_DEFAULT                       :   self.__ProcessPcd,
-            MODEL_PCD_DYNAMIC_HII                           :   self.__ProcessPcd,
-            MODEL_PCD_DYNAMIC_VPD                           :   self.__ProcessPcd,
-            MODEL_PCD_DYNAMIC_EX_DEFAULT                    :   self.__ProcessPcd,
-            MODEL_PCD_DYNAMIC_EX_HII                        :   self.__ProcessPcd,
-            MODEL_PCD_DYNAMIC_EX_VPD                        :   self.__ProcessPcd,
-            MODEL_META_DATA_COMPONENT                       :   self.__ProcessComponent,
-            MODEL_META_DATA_BUILD_OPTION                    :   self.__ProcessBuildOption,
-            MODEL_UNKNOWN                                   :   self._Skip,
-            MODEL_META_DATA_USER_EXTENSION                  :   self._Skip,
-            MODEL_META_DATA_CONDITIONAL_STATEMENT_ERROR     :   self._Skip,
+            MODEL_META_DATA_SECTION_HEADER:   self.__ProcessSectionHeader,
+            MODEL_META_DATA_SUBSECTION_HEADER:   self.__ProcessSubsectionHeader,
+            MODEL_META_DATA_HEADER:   self.__ProcessDefine,
+            MODEL_META_DATA_DEFINE:   self.__ProcessDefine,
+            MODEL_META_DATA_GLOBAL_DEFINE:   self.__ProcessDefine,
+            MODEL_META_DATA_INCLUDE:   self.__ProcessDirective,
+            MODEL_META_DATA_CONDITIONAL_STATEMENT_IF:   self.__ProcessDirective,
+            MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSE:   self.__ProcessDirective,
+            MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF:   self.__ProcessDirective,
+            MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF:   self.__ProcessDirective,
+            MODEL_META_DATA_CONDITIONAL_STATEMENT_ENDIF:   self.__ProcessDirective,
+            MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSEIF:   self.__ProcessDirective,
+            MODEL_EFI_SKU_ID:   self.__ProcessSkuId,
+            MODEL_EFI_LIBRARY_INSTANCE:   self.__ProcessLibraryInstance,
+            MODEL_EFI_LIBRARY_CLASS:   self.__ProcessLibraryClass,
+            MODEL_PCD_FIXED_AT_BUILD:   self.__ProcessPcd,
+            MODEL_PCD_PATCHABLE_IN_MODULE:   self.__ProcessPcd,
+            MODEL_PCD_FEATURE_FLAG:   self.__ProcessPcd,
+            MODEL_PCD_DYNAMIC_DEFAULT:   self.__ProcessPcd,
+            MODEL_PCD_DYNAMIC_HII:   self.__ProcessPcd,
+            MODEL_PCD_DYNAMIC_VPD:   self.__ProcessPcd,
+            MODEL_PCD_DYNAMIC_EX_DEFAULT:   self.__ProcessPcd,
+            MODEL_PCD_DYNAMIC_EX_HII:   self.__ProcessPcd,
+            MODEL_PCD_DYNAMIC_EX_VPD:   self.__ProcessPcd,
+            MODEL_META_DATA_COMPONENT:   self.__ProcessComponent,
+            MODEL_META_DATA_BUILD_OPTION:   self.__ProcessBuildOption,
+            MODEL_UNKNOWN:   self._Skip,
+            MODEL_META_DATA_USER_EXTENSION:   self._Skip,
+            MODEL_META_DATA_CONDITIONAL_STATEMENT_ERROR:   self._Skip,
         }
 
         self._RawTable = self._Table
-        self._Table = MetaFileStorage(self._RawTable.Cur, self.MetaFile, MODEL_FILE_DSC, True)
+        self._Table = MetaFileStorage(
+            self._RawTable.Cur, self.MetaFile, MODEL_FILE_DSC, True)
         self._DirectiveStack = []
         self._DirectiveEvalStack = []
         self._FileWithError = self.MetaFile
@@ -1160,7 +1200,7 @@ class DscParser(MetaFileParser):
         self.__RetrievePcdValue()
         self._Content = self._RawTable.GetAll()
         self._ContentIndex = 0
-        while self._ContentIndex < len(self._Content) :
+        while self._ContentIndex < len(self._Content):
             Id, self._ItemType, V1, V2, V3, S1, S2, Owner, BelongsToFile, self._From, \
                 LineStart, ColStart, LineEnd, ColEnd, Enabled = self._Content[self._ContentIndex]
 
@@ -1186,30 +1226,32 @@ class DscParser(MetaFileParser):
 #                                 Line=self._LineIndex+1)
             except MacroException as Excpt:
                 EdkLogger.error('Parser', FORMAT_INVALID, str(Excpt),
-                                File=self._FileWithError, ExtraData=' '.join(self._ValueList),
+                                File=self._FileWithError, ExtraData=' '.join(
+                                    self._ValueList),
                                 Line=self._LineIndex+1)
 
             if self._ValueList is None:
                 continue
 
             NewOwner = self._IdMapping.get(Owner, -1)
-            self._Enabled = int((not self._DirectiveEvalStack) or (False not in self._DirectiveEvalStack))
+            self._Enabled = int((not self._DirectiveEvalStack) or (
+                False not in self._DirectiveEvalStack))
             self._LastItem = self._Store(
-                                self._ItemType,
-                                self._ValueList[0],
-                                self._ValueList[1],
-                                self._ValueList[2],
-                                S1,
-                                S2,
-                                NewOwner,
-                                BelongsToFile,
-                                self._From,
-                                self._LineIndex+1,
-                                -1,
-                                self._LineIndex+1,
-                                -1,
-                                self._Enabled
-                                )
+                self._ItemType,
+                self._ValueList[0],
+                self._ValueList[1],
+                self._ValueList[2],
+                S1,
+                S2,
+                NewOwner,
+                BelongsToFile,
+                self._From,
+                self._LineIndex+1,
+                -1,
+                self._LineIndex+1,
+                -1,
+                self._Enabled
+            )
             self._IdMapping[Id] = self._LastItem
 
         RecordList = self._Table.GetAll()
@@ -1217,7 +1259,8 @@ class DscParser(MetaFileParser):
         self._RawTable.Drop()
         self._Table.Drop()
         for Record in RecordList:
-            EccGlobalData.gDb.TblDsc.Insert(Record[1], Record[2], Record[3], Record[4], Record[5], Record[6], Record[7], Record[8], Record[9], Record[10], Record[11], Record[12], Record[13], Record[14])
+            EccGlobalData.gDb.TblDsc.Insert(Record[1], Record[2], Record[3], Record[4], Record[5], Record[6],
+                                            Record[7], Record[8], Record[9], Record[10], Record[11], Record[12], Record[13], Record[14])
         GlobalData.gPlatformDefines.update(self._FileLocalMacros)
         self._PostProcessed = True
         self._Content = None
@@ -1237,7 +1280,8 @@ class DscParser(MetaFileParser):
             self._SubsectionType = MODEL_UNKNOWN
 
     def __RetrievePcdValue(self):
-        Records = self._RawTable.Query(MODEL_PCD_FEATURE_FLAG, BelongsToItem=-1.0)
+        Records = self._RawTable.Query(
+            MODEL_PCD_FEATURE_FLAG, BelongsToItem=-1.0)
         for TokenSpaceGuid, PcdName, Value, Dummy2, Dummy3, ID, Line in Records:
             Value, DatumType, MaxDatumSize = AnalyzePcdData(Value)
             # Only use PCD whose value is straitforward (no macro and PCD)
@@ -1250,7 +1294,8 @@ class DscParser(MetaFileParser):
                 continue
             self._Symbols[Name] = Value
 
-        Records = self._RawTable.Query(MODEL_PCD_FIXED_AT_BUILD, BelongsToItem=-1.0)
+        Records = self._RawTable.Query(
+            MODEL_PCD_FIXED_AT_BUILD, BelongsToItem=-1.0)
         for TokenSpaceGuid, PcdName, Value, Dummy2, Dummy3, ID, Line in Records:
             Value, DatumType, MaxDatumSize = AnalyzePcdData(Value)
             # Only use PCD whose value is straitforward (no macro and PCD)
@@ -1298,7 +1343,8 @@ class DscParser(MetaFileParser):
             try:
                 Result = ValueExpression(self._ValueList[1], Macros)()
             except SymbolNotFound as Exc:
-                EdkLogger.debug(EdkLogger.DEBUG_5, str(Exc), self._ValueList[1])
+                EdkLogger.debug(EdkLogger.DEBUG_5, str(Exc),
+                                self._ValueList[1])
                 Result = False
             except WrnExpression as Excpt:
                 #
@@ -1306,11 +1352,13 @@ class DscParser(MetaFileParser):
                 # the precise number of line and return the evaluation result
                 #
                 EdkLogger.warn('Parser', "Suspicious expression: %s" % str(Excpt),
-                                File=self._FileWithError, ExtraData=' '.join(self._ValueList),
-                                Line=self._LineIndex+1)
+                               File=self._FileWithError, ExtraData=' '.join(
+                                   self._ValueList),
+                               Line=self._LineIndex+1)
                 Result = Excpt.result
             except BadExpression as Exc:
-                EdkLogger.debug(EdkLogger.DEBUG_5, str(Exc), self._ValueList[1])
+                EdkLogger.debug(EdkLogger.DEBUG_5, str(Exc),
+                                self._ValueList[1])
                 Result = False
 
         if self._ItemType in [MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
@@ -1321,7 +1369,8 @@ class DscParser(MetaFileParser):
                 Result = bool(Result)
             else:
                 Macro = self._ValueList[1]
-                Macro = Macro[2:-1] if (Macro.startswith("$(") and Macro.endswith(")")) else Macro
+                Macro = Macro[2:-1] if (Macro.startswith("$(")
+                                        and Macro.endswith(")")) else Macro
                 Result = Macro in self._Macros
                 if self._ItemType == MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF:
                     Result = not Result
@@ -1356,7 +1405,8 @@ class DscParser(MetaFileParser):
             #
             __IncludeMacros.update(self._Macros)
 
-            IncludedFile = NormPath(ReplaceMacro(self._ValueList[1], __IncludeMacros, RaiseError=True))
+            IncludedFile = NormPath(ReplaceMacro(
+                self._ValueList[1], __IncludeMacros, RaiseError=True))
             #
             # First search the include file under the same directory as DSC file
             #
@@ -1370,11 +1420,12 @@ class DscParser(MetaFileParser):
                 ErrorCode, ErrorInfo2 = IncludedFile1.Validate()
                 if ErrorCode != 0:
                     EdkLogger.error('parser', ErrorCode, File=self._FileWithError,
-                                    Line=self._LineIndex+1, ExtraData=ErrorInfo1 + "\n"+ ErrorInfo2)
+                                    Line=self._LineIndex+1, ExtraData=ErrorInfo1 + "\n" + ErrorInfo2)
 
             self._FileWithError = IncludedFile1
 
-            IncludedFileTable = MetaFileStorage(self._Table.Cur, IncludedFile1, MODEL_FILE_DSC, True)
+            IncludedFileTable = MetaFileStorage(
+                self._Table.Cur, IncludedFile1, MODEL_FILE_DSC, True)
             Owner = self._Content[self._ContentIndex-1][0]
             Parser = DscParser(IncludedFile1, self._FileType, IncludedFileTable,
                                Owner=Owner, From=Owner)
@@ -1390,8 +1441,8 @@ class DscParser(MetaFileParser):
             # update current status with sub-parser's status
             self._SectionName = Parser._SectionName
             self._SectionType = Parser._SectionType
-            self._Scope       = Parser._Scope
-            self._Enabled     = Parser._Enabled
+            self._Scope = Parser._Scope
+            self._Enabled = Parser._Enabled
 
             # Insert all records in the table for the included file into dsc file table
             Records = IncludedFileTable.GetAll()
@@ -1406,10 +1457,12 @@ class DscParser(MetaFileParser):
                            for Value in self._ValueList]
 
     def __ProcessLibraryInstance(self):
-        self._ValueList = [ReplaceMacro(Value, self._Macros) for Value in self._ValueList]
+        self._ValueList = [ReplaceMacro(Value, self._Macros)
+                           for Value in self._ValueList]
 
     def __ProcessLibraryClass(self):
-        self._ValueList[1] = ReplaceMacro(self._ValueList[1], self._Macros, RaiseError=True)
+        self._ValueList[1] = ReplaceMacro(
+            self._ValueList[1], self._Macros, RaiseError=True)
 
     def __ProcessPcd(self):
         ValueList = GetSplitValueList(self._ValueList[2])
@@ -1444,54 +1497,56 @@ class DscParser(MetaFileParser):
                            for Value in self._ValueList]
 
     _SectionParser = {
-        MODEL_META_DATA_HEADER                          :   _DefineParser,
-        MODEL_EFI_SKU_ID                                :   _SkuIdParser,
-        MODEL_EFI_LIBRARY_INSTANCE                      :   _LibraryInstanceParser,
-        MODEL_EFI_LIBRARY_CLASS                         :   _LibraryClassParser,
-        MODEL_PCD_FIXED_AT_BUILD                        :   _PcdParser,
-        MODEL_PCD_PATCHABLE_IN_MODULE                   :   _PcdParser,
-        MODEL_PCD_FEATURE_FLAG                          :   _PcdParser,
-        MODEL_PCD_DYNAMIC_DEFAULT                       :   _PcdParser,
-        MODEL_PCD_DYNAMIC_HII                           :   _PcdParser,
-        MODEL_PCD_DYNAMIC_VPD                           :   _PcdParser,
-        MODEL_PCD_DYNAMIC_EX_DEFAULT                    :   _PcdParser,
-        MODEL_PCD_DYNAMIC_EX_HII                        :   _PcdParser,
-        MODEL_PCD_DYNAMIC_EX_VPD                        :   _PcdParser,
-        MODEL_META_DATA_COMPONENT                       :   _ComponentParser,
-        MODEL_META_DATA_BUILD_OPTION                    :   _BuildOptionParser,
-        MODEL_UNKNOWN                                   :   MetaFileParser._Skip,
-        MODEL_META_DATA_USER_EXTENSION                  :   MetaFileParser._Skip,
-        MODEL_META_DATA_SECTION_HEADER                  :   MetaFileParser._SectionHeaderParser,
-        MODEL_META_DATA_SUBSECTION_HEADER               :   _SubsectionHeaderParser,
+        MODEL_META_DATA_HEADER:   _DefineParser,
+        MODEL_EFI_SKU_ID:   _SkuIdParser,
+        MODEL_EFI_LIBRARY_INSTANCE:   _LibraryInstanceParser,
+        MODEL_EFI_LIBRARY_CLASS:   _LibraryClassParser,
+        MODEL_PCD_FIXED_AT_BUILD:   _PcdParser,
+        MODEL_PCD_PATCHABLE_IN_MODULE:   _PcdParser,
+        MODEL_PCD_FEATURE_FLAG:   _PcdParser,
+        MODEL_PCD_DYNAMIC_DEFAULT:   _PcdParser,
+        MODEL_PCD_DYNAMIC_HII:   _PcdParser,
+        MODEL_PCD_DYNAMIC_VPD:   _PcdParser,
+        MODEL_PCD_DYNAMIC_EX_DEFAULT:   _PcdParser,
+        MODEL_PCD_DYNAMIC_EX_HII:   _PcdParser,
+        MODEL_PCD_DYNAMIC_EX_VPD:   _PcdParser,
+        MODEL_META_DATA_COMPONENT:   _ComponentParser,
+        MODEL_META_DATA_BUILD_OPTION:   _BuildOptionParser,
+        MODEL_UNKNOWN:   MetaFileParser._Skip,
+        MODEL_META_DATA_USER_EXTENSION:   MetaFileParser._Skip,
+        MODEL_META_DATA_SECTION_HEADER:   MetaFileParser._SectionHeaderParser,
+        MODEL_META_DATA_SUBSECTION_HEADER:   _SubsectionHeaderParser,
     }
 
-    _Macros     = property(_GetMacros)
+    _Macros = property(_GetMacros)
 
-## DEC file parser class
+# DEC file parser class
 #
 #   @param      FilePath        The path of platform description file
 #   @param      FileType        The raw data of DSC file
 #   @param      Table           Database used to retrieve module/package information
 #   @param      Macros          Macros used for replacement in file
 #
+
+
 class DecParser(MetaFileParser):
     # DEC file supported data types (one type per section)
     DataType = {
-        TAB_DEC_DEFINES.upper()                     :   MODEL_META_DATA_HEADER,
-        TAB_DSC_DEFINES_DEFINE                      :   MODEL_META_DATA_DEFINE,
-        TAB_INCLUDES.upper()                        :   MODEL_EFI_INCLUDE,
-        TAB_LIBRARY_CLASSES.upper()                 :   MODEL_EFI_LIBRARY_CLASS,
-        TAB_GUIDS.upper()                           :   MODEL_EFI_GUID,
-        TAB_PPIS.upper()                            :   MODEL_EFI_PPI,
-        TAB_PROTOCOLS.upper()                       :   MODEL_EFI_PROTOCOL,
-        TAB_PCDS_FIXED_AT_BUILD_NULL.upper()        :   MODEL_PCD_FIXED_AT_BUILD,
-        TAB_PCDS_PATCHABLE_IN_MODULE_NULL.upper()   :   MODEL_PCD_PATCHABLE_IN_MODULE,
-        TAB_PCDS_FEATURE_FLAG_NULL.upper()          :   MODEL_PCD_FEATURE_FLAG,
-        TAB_PCDS_DYNAMIC_NULL.upper()               :   MODEL_PCD_DYNAMIC,
-        TAB_PCDS_DYNAMIC_EX_NULL.upper()            :   MODEL_PCD_DYNAMIC_EX,
+        TAB_DEC_DEFINES.upper():   MODEL_META_DATA_HEADER,
+        TAB_DSC_DEFINES_DEFINE:   MODEL_META_DATA_DEFINE,
+        TAB_INCLUDES.upper():   MODEL_EFI_INCLUDE,
+        TAB_LIBRARY_CLASSES.upper():   MODEL_EFI_LIBRARY_CLASS,
+        TAB_GUIDS.upper():   MODEL_EFI_GUID,
+        TAB_PPIS.upper():   MODEL_EFI_PPI,
+        TAB_PROTOCOLS.upper():   MODEL_EFI_PROTOCOL,
+        TAB_PCDS_FIXED_AT_BUILD_NULL.upper():   MODEL_PCD_FIXED_AT_BUILD,
+        TAB_PCDS_PATCHABLE_IN_MODULE_NULL.upper():   MODEL_PCD_PATCHABLE_IN_MODULE,
+        TAB_PCDS_FEATURE_FLAG_NULL.upper():   MODEL_PCD_FEATURE_FLAG,
+        TAB_PCDS_DYNAMIC_NULL.upper():   MODEL_PCD_DYNAMIC,
+        TAB_PCDS_DYNAMIC_EX_NULL.upper():   MODEL_PCD_DYNAMIC_EX,
     }
 
-    ## Constructor of DecParser
+    # Constructor of DecParser
     #
     #  Initialize object of DecParser
     #
@@ -1514,13 +1569,14 @@ class DecParser(MetaFileParser):
         self._include_flag = False
         self._package_flag = False
 
-    ## Parser starter
+    # Parser starter
     def Start(self):
         Content = ''
         try:
             Content = open(str(self.MetaFile), 'r').readlines()
         except:
-            EdkLogger.error("Parser", FILE_READ_FAILURE, ExtraData=self.MetaFile)
+            EdkLogger.error("Parser", FILE_READ_FAILURE,
+                            ExtraData=self.MetaFile)
 
         #
         # Insert a record for file
@@ -1580,7 +1636,7 @@ class DecParser(MetaFileParser):
                     self._LineIndex+1,
                     -1,
                     0
-                    )
+                )
                 for Comment, LineNo in self._Comments:
                     self._Store(
                         MODEL_META_DATA_COMMENT,
@@ -1596,7 +1652,7 @@ class DecParser(MetaFileParser):
                         LineNo,
                         -1,
                         0
-                        )
+                    )
             self._Comments = []
         self._Done()
 
@@ -1605,10 +1661,11 @@ class DecParser(MetaFileParser):
         for S1, S2, SectionType in self._Scope:
             for Scope1, Scope2 in [("COMMON", "COMMON"), ("COMMON", S2), (S1, "COMMON"), (S1, S2)]:
                 if (SectionType, Scope1, Scope2) in self._SectionsMacroDict:
-                    Macros.update(self._SectionsMacroDict[(SectionType, Scope1, Scope2)])
+                    Macros.update(
+                        self._SectionsMacroDict[(SectionType, Scope1, Scope2)])
         return Macros
 
-    ## Section header parser
+    # Section header parser
     #
     #   The section header is always in following format:
     #
@@ -1631,18 +1688,18 @@ class DecParser(MetaFileParser):
                     self._SectionType.append(self.DataType[self._SectionName])
             else:
                 EdkLogger.warn("Parser", "Unrecognized section", File=self.MetaFile,
-                                Line=self._LineIndex+1, ExtraData=self._CurrentLine)
+                               Line=self._LineIndex+1, ExtraData=self._CurrentLine)
                 continue
 
             if MODEL_PCD_FEATURE_FLAG in self._SectionType and len(self._SectionType) > 1:
                 EdkLogger.error(
-                            'Parser',
-                            FORMAT_INVALID,
-                            "%s must not be in the same section of other types of PCD" % TAB_PCDS_FEATURE_FLAG_NULL,
-                            File=self.MetaFile,
-                            Line=self._LineIndex+1,
-                            ExtraData=self._CurrentLine
-                            )
+                    'Parser',
+                    FORMAT_INVALID,
+                    "%s must not be in the same section of other types of PCD" % TAB_PCDS_FEATURE_FLAG_NULL,
+                    File=self.MetaFile,
+                    Line=self._LineIndex+1,
+                    ExtraData=self._CurrentLine
+                )
             # S1 is always Arch
             if len(ItemList) > 1:
                 S1 = ItemList[1].upper()
@@ -1662,29 +1719,32 @@ class DecParser(MetaFileParser):
             EdkLogger.error('Parser', FORMAT_INVALID, "'common' ARCH must not be used with specific ARCHs",
                             File=self.MetaFile, Line=self._LineIndex+1, ExtraData=self._CurrentLine)
 
-    ## [guids], [ppis] and [protocols] section parser
+    # [guids], [ppis] and [protocols] section parser
     @ParseMacro
     def _GuidParser(self):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_EQUAL_SPLIT, 1)
         if len(TokenList) < 2:
             EdkLogger.error('Parser', FORMAT_INVALID, "No GUID name or value specified",
-                            ExtraData=self._CurrentLine + " (<CName> = <GuidValueInCFormat>)",
+                            ExtraData=self._CurrentLine +
+                            " (<CName> = <GuidValueInCFormat>)",
                             File=self.MetaFile, Line=self._LineIndex+1)
         if TokenList[0] == '':
             EdkLogger.error('Parser', FORMAT_INVALID, "No GUID name specified",
-                            ExtraData=self._CurrentLine + " (<CName> = <GuidValueInCFormat>)",
+                            ExtraData=self._CurrentLine +
+                            " (<CName> = <GuidValueInCFormat>)",
                             File=self.MetaFile, Line=self._LineIndex+1)
         if TokenList[1] == '':
             EdkLogger.error('Parser', FORMAT_INVALID, "No GUID value specified",
-                            ExtraData=self._CurrentLine + " (<CName> = <GuidValueInCFormat>)",
+                            ExtraData=self._CurrentLine +
+                            " (<CName> = <GuidValueInCFormat>)",
                             File=self.MetaFile, Line=self._LineIndex+1)
         if TokenList[1][0] != '{' or TokenList[1][-1] != '}' or GuidStructureStringToGuidString(TokenList[1]) == '':
             EdkLogger.error('Parser', FORMAT_INVALID, "Invalid GUID value format",
-                            ExtraData=self._CurrentLine + \
-                                      " (<CName> = <GuidValueInCFormat:{8,4,4,{2,2,2,2,2,2,2,2}}>)",
+                            ExtraData=self._CurrentLine +
+                            " (<CName> = <GuidValueInCFormat:{8,4,4,{2,2,2,2,2,2,2,2}}>)",
                             File=self.MetaFile, Line=self._LineIndex+1)
         self._ValueList[0] = TokenList[0]
-        #Parse the Guid value format
+        # Parse the Guid value format
         GuidValueList = TokenList[1].strip(' {}').split(',')
         Index = 0
         HexList = []
@@ -1700,15 +1760,16 @@ class DecParser(MetaFileParser):
                         GuidValue = GuidValue.lstrip(' {')
                         HexList.append('0x' + str(GuidValue[2:]))
                         Index += 1
-            self._ValueList[1] = "{ %s, %s, %s, { %s, %s, %s, %s, %s, %s, %s, %s }}" % (HexList[0], HexList[1], HexList[2], HexList[3], HexList[4], HexList[5], HexList[6], HexList[7], HexList[8], HexList[9], HexList[10])
+            self._ValueList[1] = "{ %s, %s, %s, { %s, %s, %s, %s, %s, %s, %s, %s }}" % (
+                HexList[0], HexList[1], HexList[2], HexList[3], HexList[4], HexList[5], HexList[6], HexList[7], HexList[8], HexList[9], HexList[10])
         else:
             EdkLogger.error('Parser', FORMAT_INVALID, "Invalid GUID value format",
-                            ExtraData=self._CurrentLine + \
-                                      " (<CName> = <GuidValueInCFormat:{8,4,4,{2,2,2,2,2,2,2,2}}>)",
+                            ExtraData=self._CurrentLine +
+                            " (<CName> = <GuidValueInCFormat:{8,4,4,{2,2,2,2,2,2,2,2}}>)",
                             File=self.MetaFile, Line=self._LineIndex+1)
             self._ValueList[0] = ''
 
-    def ParsePcdName(self,namelist):
+    def ParsePcdName(self, namelist):
         if "[" in namelist[1]:
             pcdname = namelist[1][:namelist[1].index("[")]
             arrayindex = namelist[1][namelist[1].index("["):]
@@ -1716,7 +1777,7 @@ class DecParser(MetaFileParser):
             if len(namelist) == 2:
                 namelist.append(arrayindex)
             else:
-                namelist[2] = ".".join((arrayindex,namelist[2]))
+                namelist[2] = ".".join((arrayindex, namelist[2]))
         return namelist
 
     def StructPcdParser(self):
@@ -1735,10 +1796,12 @@ class DecParser(MetaFileParser):
                 return
 
             if self._include_flag:
-                self._ValueList[1] = "<HeaderFiles>_" + md5(self._CurrentLine.encode('utf-8')).hexdigest()
+                self._ValueList[1] = "<HeaderFiles>_" + \
+                    md5(self._CurrentLine.encode('utf-8')).hexdigest()
                 self._ValueList[2] = self._CurrentLine
             if self._package_flag and "}" != self._CurrentLine:
-                self._ValueList[1] = "<Packages>_" + md5(self._CurrentLine.encode('utf-8')).hexdigest()
+                self._ValueList[1] = "<Packages>_" + \
+                    md5(self._CurrentLine.encode('utf-8')).hexdigest()
                 self._ValueList[2] = self._CurrentLine
             if self._CurrentLine == "}":
                 self._package_flag = False
@@ -1759,12 +1822,12 @@ class DecParser(MetaFileParser):
             else:
                 if self._CurrentStructurePcdName != TAB_SPLIT.join(PcdNames[:2]):
                     EdkLogger.error('Parser', FORMAT_INVALID, "Pcd Name does not match: %s and %s " % (
-                    self._CurrentStructurePcdName, TAB_SPLIT.join(PcdNames[:2])),
-                                    File=self.MetaFile, Line=self._LineIndex + 1)
+                        self._CurrentStructurePcdName, TAB_SPLIT.join(PcdNames[:2])),
+                        File=self.MetaFile, Line=self._LineIndex + 1)
                 self._ValueList[1] = TAB_SPLIT.join(PcdNames[2:])
                 self._ValueList[2] = PcdTockens[1]
 
-    ## PCD sections parser
+    # PCD sections parser
     #
     #   [PcdsFixedAtBuild]
     #   [PcdsPatchableInModule]
@@ -1782,52 +1845,50 @@ class DecParser(MetaFileParser):
         # check PCD information
         if self._ValueList[0] == '' or self._ValueList[1] == '':
             EdkLogger.error('Parser', FORMAT_INVALID, "No token space GUID or PCD name specified",
-                            ExtraData=self._CurrentLine + \
-                                      " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
+                            ExtraData=self._CurrentLine +
+                            " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
                             File=self.MetaFile, Line=self._LineIndex+1)
         # check PCD datum information
         if len(TokenList) < 2 or TokenList[1] == '':
             EdkLogger.error('Parser', FORMAT_INVALID, "No PCD Datum information given",
-                            ExtraData=self._CurrentLine + \
-                                      " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
+                            ExtraData=self._CurrentLine +
+                            " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
                             File=self.MetaFile, Line=self._LineIndex+1)
 
-
-        ValueRe  = re.compile(r'^\s*L?\".*\|.*\"')
+        ValueRe = re.compile(r'^\s*L?\".*\|.*\"')
         PtrValue = ValueRe.findall(TokenList[1])
 
         # Has VOID* type string, may contain "|" character in the string.
         if len(PtrValue) != 0:
             ptrValueList = re.sub(ValueRe, '', TokenList[1])
-            ValueList    = GetSplitValueList(ptrValueList)
+            ValueList = GetSplitValueList(ptrValueList)
             ValueList[0] = PtrValue[0]
         else:
             ValueList = GetSplitValueList(TokenList[1])
 
-
         # check if there's enough datum information given
         if len(ValueList) != 3:
             EdkLogger.error('Parser', FORMAT_INVALID, "Invalid PCD Datum information given",
-                            ExtraData=self._CurrentLine + \
-                                      " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
+                            ExtraData=self._CurrentLine +
+                            " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
                             File=self.MetaFile, Line=self._LineIndex+1)
         # check default value
         if ValueList[0] == '':
             EdkLogger.error('Parser', FORMAT_INVALID, "Missing DefaultValue in PCD Datum information",
-                            ExtraData=self._CurrentLine + \
-                                      " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
+                            ExtraData=self._CurrentLine +
+                            " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
                             File=self.MetaFile, Line=self._LineIndex+1)
         # check datum type
         if ValueList[1] == '':
             EdkLogger.error('Parser', FORMAT_INVALID, "Missing DatumType in PCD Datum information",
-                            ExtraData=self._CurrentLine + \
-                                      " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
+                            ExtraData=self._CurrentLine +
+                            " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
                             File=self.MetaFile, Line=self._LineIndex+1)
         # check token of the PCD
         if ValueList[2] == '':
             EdkLogger.error('Parser', FORMAT_INVALID, "Missing Token in PCD Datum information",
-                            ExtraData=self._CurrentLine + \
-                                      " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
+                            ExtraData=self._CurrentLine +
+                            " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
                             File=self.MetaFile, Line=self._LineIndex+1)
         # check format of default value against the datum type
         IsValid, Cause = CheckPcdDatum(ValueList[1], ValueList[0])
@@ -1835,7 +1896,8 @@ class DecParser(MetaFileParser):
             EdkLogger.error('Parser', FORMAT_INVALID, Cause, ExtraData=self._CurrentLine,
                             File=self.MetaFile, Line=self._LineIndex+1)
         if Cause == "StructurePcd":
-            self._CurrentStructurePcdName = TAB_SPLIT.join(self._ValueList[0:2])
+            self._CurrentStructurePcdName = TAB_SPLIT.join(
+                self._ValueList[0:2])
             self._ValueList[0] = self._CurrentStructurePcdName
             self._ValueList[1] = ValueList[1].strip()
 
@@ -1848,7 +1910,8 @@ class DecParser(MetaFileParser):
             # check @ValidRange, @ValidList and @Expression format valid
             ErrorCodeValid = '0x0 <= %s <= 0xFFFFFFFF'
             PatternValidRangeIn = '(NOT)?\s*(\d+\s*-\s*\d+|0[xX][a-fA-F0-9]+\s*-\s*0[xX][a-fA-F0-9]+|LT\s*\d+|LT\s*0[xX][a-fA-F0-9]+|GT\s*\d+|GT\s*0[xX][a-fA-F0-9]+|LE\s*\d+|LE\s*0[xX][a-fA-F0-9]+|GE\s*\d+|GE\s*0[xX][a-fA-F0-9]+|XOR\s*\d+|XOR\s*0[xX][a-fA-F0-9]+|EQ\s*\d+|EQ\s*0[xX][a-fA-F0-9]+)'
-            PatternValidRng = re.compile('^' + '(NOT)?\s*' + PatternValidRangeIn + '$')
+            PatternValidRng = re.compile(
+                '^' + '(NOT)?\s*' + PatternValidRangeIn + '$')
             for Comment in self._Comments:
                 Comm = Comment[0].strip()
                 if not Comm:
@@ -1871,11 +1934,14 @@ class DecParser(MetaFileParser):
                         ErrorCode, Expression = ErrorCode.strip(), Expression.strip()
                         try:
                             if not eval(ErrorCodeValid % ErrorCode):
-                                EdkLogger.warn('Parser', '@ValidRange ErrorCode(%s) of PCD %s is not valid UINT32 value.' % (ErrorCode, TokenList[0]))
+                                EdkLogger.warn(
+                                    'Parser', '@ValidRange ErrorCode(%s) of PCD %s is not valid UINT32 value.' % (ErrorCode, TokenList[0]))
                         except:
-                            EdkLogger.warn('Parser', '@ValidRange ErrorCode(%s) of PCD %s is not valid UINT32 value.' % (ErrorCode, TokenList[0]))
+                            EdkLogger.warn(
+                                'Parser', '@ValidRange ErrorCode(%s) of PCD %s is not valid UINT32 value.' % (ErrorCode, TokenList[0]))
                         if not PatternValidRng.search(Expression):
-                            EdkLogger.warn('Parser', '@ValidRange Expression(%s) of PCD %s is incorrect format.' % (Expression, TokenList[0]))
+                            EdkLogger.warn(
+                                'Parser', '@ValidRange Expression(%s) of PCD %s is incorrect format.' % (Expression, TokenList[0]))
                     if ValidFormt[0:10] == '@ValidList':
                         ValidFormt = ValidFormt[10:]
                         ValidFormt = ValidFormt.lstrip()
@@ -1887,16 +1953,19 @@ class DecParser(MetaFileParser):
                         ErrorCode, Expression = ErrorCode.strip(), Expression.strip()
                         try:
                             if not eval(ErrorCodeValid % ErrorCode):
-                                EdkLogger.warn('Parser', '@ValidList ErrorCode(%s) of PCD %s is not valid UINT32 value.' % (ErrorCode, TokenList[0]))
+                                EdkLogger.warn(
+                                    'Parser', '@ValidList ErrorCode(%s) of PCD %s is not valid UINT32 value.' % (ErrorCode, TokenList[0]))
                         except:
-                            EdkLogger.warn('Parser', '@ValidList ErrorCode(%s) of PCD %s is not valid UINT32 value.' % (ErrorCode, TokenList[0]))
+                            EdkLogger.warn(
+                                'Parser', '@ValidList ErrorCode(%s) of PCD %s is not valid UINT32 value.' % (ErrorCode, TokenList[0]))
                         Values = Expression.split(',')
                         for Value in Values:
                             Value = Value.strip()
                             try:
                                 eval(Value)
                             except:
-                                EdkLogger.warn('Parser', '@ValidList Expression of PCD %s include a invalid value(%s).' % (TokenList[0], Value))
+                                EdkLogger.warn(
+                                    'Parser', '@ValidList Expression of PCD %s include a invalid value(%s).' % (TokenList[0], Value))
                                 break
                     if ValidFormt[0:11] == '@Expression':
                         ValidFormt = ValidFormt[11:]
@@ -1909,15 +1978,20 @@ class DecParser(MetaFileParser):
                         ErrorCode, Expression = ErrorCode.strip(), Expression.strip()
                         try:
                             if not eval(ErrorCodeValid % ErrorCode):
-                                EdkLogger.warn('Parser', '@Expression ErrorCode(%s) of PCD %s is not valid UINT32 value.' % (ErrorCode, TokenList[0]))
+                                EdkLogger.warn(
+                                    'Parser', '@Expression ErrorCode(%s) of PCD %s is not valid UINT32 value.' % (ErrorCode, TokenList[0]))
                         except:
-                            EdkLogger.warn('Parser', '@Expression ErrorCode(%s) of PCD %s is not valid UINT32 value.' % (ErrorCode, TokenList[0]))
+                            EdkLogger.warn(
+                                'Parser', '@Expression ErrorCode(%s) of PCD %s is not valid UINT32 value.' % (ErrorCode, TokenList[0]))
                         if not Expression:
-                            EdkLogger.warn('Parser', '@Expression Expression of PCD %s is incorrect format.' % TokenList[0])
+                            EdkLogger.warn(
+                                'Parser', '@Expression Expression of PCD %s is incorrect format.' % TokenList[0])
             if not Description:
-                EdkLogger.warn('Parser', 'PCD %s Description information is not provided.' % TokenList[0])
+                EdkLogger.warn(
+                    'Parser', 'PCD %s Description information is not provided.' % TokenList[0])
             if not Prompt:
-                EdkLogger.warn('Parser', 'PCD %s Prompt information is not provided.' % TokenList[0])
+                EdkLogger.warn(
+                    'Parser', 'PCD %s Prompt information is not provided.' % TokenList[0])
             # check Description, Prompt localization information
             if self._UniObj:
                 self._UniObj.CheckPcdInfo(TokenList[0])
@@ -1927,26 +2001,27 @@ class DecParser(MetaFileParser):
         elif ValueList[0] in ['False', 'false', 'FALSE']:
             ValueList[0] = '0'
 
-        self._ValueList[2] = ValueList[0].strip() + '|' + ValueList[1].strip() + '|' + ValueList[2].strip()
+        self._ValueList[2] = ValueList[0].strip(
+        ) + '|' + ValueList[1].strip() + '|' + ValueList[2].strip()
 
     _SectionParser = {
-        MODEL_META_DATA_HEADER          :   MetaFileParser._DefineParser,
-        MODEL_EFI_INCLUDE               :   MetaFileParser._PathParser,
-        MODEL_EFI_LIBRARY_CLASS         :   MetaFileParser._PathParser,
-        MODEL_EFI_GUID                  :   _GuidParser,
-        MODEL_EFI_PPI                   :   _GuidParser,
-        MODEL_EFI_PROTOCOL              :   _GuidParser,
-        MODEL_PCD_FIXED_AT_BUILD        :   _PcdParser,
-        MODEL_PCD_PATCHABLE_IN_MODULE   :   _PcdParser,
-        MODEL_PCD_FEATURE_FLAG          :   _PcdParser,
-        MODEL_PCD_DYNAMIC               :   _PcdParser,
-        MODEL_PCD_DYNAMIC_EX            :   _PcdParser,
-        MODEL_UNKNOWN                   :   MetaFileParser._Skip,
-        MODEL_META_DATA_USER_EXTENSION  :   MetaFileParser._Skip,
+        MODEL_META_DATA_HEADER:   MetaFileParser._DefineParser,
+        MODEL_EFI_INCLUDE:   MetaFileParser._PathParser,
+        MODEL_EFI_LIBRARY_CLASS:   MetaFileParser._PathParser,
+        MODEL_EFI_GUID:   _GuidParser,
+        MODEL_EFI_PPI:   _GuidParser,
+        MODEL_EFI_PROTOCOL:   _GuidParser,
+        MODEL_PCD_FIXED_AT_BUILD:   _PcdParser,
+        MODEL_PCD_PATCHABLE_IN_MODULE:   _PcdParser,
+        MODEL_PCD_FEATURE_FLAG:   _PcdParser,
+        MODEL_PCD_DYNAMIC:   _PcdParser,
+        MODEL_PCD_DYNAMIC_EX:   _PcdParser,
+        MODEL_UNKNOWN:   MetaFileParser._Skip,
+        MODEL_META_DATA_USER_EXTENSION:   MetaFileParser._Skip,
     }
 
 
-## Fdf
+# Fdf
 #
 # This class defined the structure used in Fdf object
 #
@@ -1954,7 +2029,7 @@ class DecParser(MetaFileParser):
 # @param WorkspaceDir:  Input value for current workspace directory, default is None
 #
 class Fdf(object):
-    def __init__(self, Filename = None, IsToDatabase = False, WorkspaceDir = None, Database = None):
+    def __init__(self, Filename=None, IsToDatabase=False, WorkspaceDir=None, Database=None):
         self.WorkspaceDir = WorkspaceDir
         self.IsToDatabase = IsToDatabase
 
@@ -1985,13 +2060,13 @@ class Fdf(object):
 
         return self.FileList[Filename]
 
-
-    ## Load Fdf file
+    # Load Fdf file
     #
     # Load the file if it exists
     #
     # @param Filename:  Input value for filename of Fdf file
     #
+
     def LoadFdfFile(self, Filename):
         FileList = []
         #
@@ -2006,7 +2081,7 @@ class Fdf(object):
         #
         if self.IsToDatabase:
             (Model, Value1, Value2, Value3, Scope1, Scope2, BelongsToItem, BelongsToFile, StartLine, StartColumn, EndLine, EndColumn, Enabled) = \
-            (0, '', '', '', 'COMMON', 'COMMON', -1, -1, -1, -1, -1, -1, 0)
+                (0, '', '', '', 'COMMON', 'COMMON', -1, -1, -1, -1, -1, -1, 0)
             for Index in range(0, len(Fdf.Profile.PcdDict)):
                 pass
             for Key in Fdf.Profile.PcdDict.keys():
@@ -2016,7 +2091,8 @@ class Fdf(object):
                 FileName = Fdf.Profile.PcdFileLineDict[Key][0]
                 StartLine = Fdf.Profile.PcdFileLineDict[Key][1]
                 BelongsToFile = self.InsertFile(FileName)
-                self.TblFdf.Insert(Model, Value1, Value2, Value3, Scope1, Scope2, BelongsToItem, BelongsToFile, StartLine, StartColumn, EndLine, EndColumn, Enabled)
+                self.TblFdf.Insert(Model, Value1, Value2, Value3, Scope1, Scope2, BelongsToItem,
+                                   BelongsToFile, StartLine, StartColumn, EndLine, EndColumn, Enabled)
             for Index in range(0, len(Fdf.Profile.InfList)):
                 Model = MODEL_META_DATA_COMPONENT
                 Value1 = Fdf.Profile.InfList[Index]
@@ -2024,7 +2100,9 @@ class Fdf(object):
                 FileName = Fdf.Profile.InfFileLineList[Index][0]
                 StartLine = Fdf.Profile.InfFileLineList[Index][1]
                 BelongsToFile = self.InsertFile(FileName)
-                self.TblFdf.Insert(Model, Value1, Value2, Value3, Scope1, Scope2, BelongsToItem, BelongsToFile, StartLine, StartColumn, EndLine, EndColumn, Enabled)
+                self.TblFdf.Insert(Model, Value1, Value2, Value3, Scope1, Scope2, BelongsToItem,
+                                   BelongsToFile, StartLine, StartColumn, EndLine, EndColumn, Enabled)
+
 
 class UniParser(object):
     # IsExtraUni defined the UNI file is Module UNI or extra Module UNI
@@ -2040,11 +2118,14 @@ class UniParser(object):
 
     def __read(self):
         try:
-            self.FileIn = CodecOpenLongFilePath(self.FilePath, Mode='rb', Encoding='utf_8').read()
+            self.FileIn = CodecOpenLongFilePath(
+                self.FilePath, Mode='rb', Encoding='utf_8').read()
         except UnicodeError:
-            self.FileIn = CodecOpenLongFilePath(self.FilePath, Mode='rb', Encoding='utf_16').read()
+            self.FileIn = CodecOpenLongFilePath(
+                self.FilePath, Mode='rb', Encoding='utf_16').read()
         except UnicodeError:
-            self.FileIn = CodecOpenLongFilePath(self.FilePath, Mode='rb', Encoding='utf_16_le').read()
+            self.FileIn = CodecOpenLongFilePath(
+                self.FilePath, Mode='rb', Encoding='utf_16_le').read()
         except IOError:
             self.FileIn = ""
 
@@ -2056,7 +2137,8 @@ class UniParser(object):
             else:
                 ModuleAbstract = self.CheckKeyValid('STR_MODULE_ABSTRACT')
                 self.PrintLog('STR_MODULE_ABSTRACT', ModuleAbstract)
-                ModuleDescription = self.CheckKeyValid('STR_MODULE_DESCRIPTION')
+                ModuleDescription = self.CheckKeyValid(
+                    'STR_MODULE_DESCRIPTION')
                 self.PrintLog('STR_MODULE_DESCRIPTION', ModuleDescription)
         else:
             if self.IsExtraUni:
@@ -2065,13 +2147,15 @@ class UniParser(object):
             else:
                 PackageAbstract = self.CheckKeyValid('STR_PACKAGE_ABSTRACT')
                 self.PrintLog('STR_PACKAGE_ABSTRACT', PackageAbstract)
-                PackageDescription = self.CheckKeyValid('STR_PACKAGE_DESCRIPTION')
+                PackageDescription = self.CheckKeyValid(
+                    'STR_PACKAGE_DESCRIPTION')
                 self.PrintLog('STR_PACKAGE_DESCRIPTION', PackageDescription)
 
     def CheckKeyValid(self, Key, Contents=None):
         if not Contents:
             Contents = self.FileIn
-        KeyPattern = re.compile('#string\s+%s\s+.*?#language.*?".*?"' % Key, re.S)
+        KeyPattern = re.compile(
+            '#string\s+%s\s+.*?#language.*?".*?"' % Key, re.S)
         if KeyPattern.search(Contents):
             return True
         return False
@@ -2088,9 +2172,11 @@ class UniParser(object):
         if not Value and Key not in self.Missing:
             Msg = '%s is missing in the %s file.' % (Key, self.FileName)
             EdkLogger.warn('Parser', Msg)
-            EccGlobalData.gDb.TblReport.Insert(EccToolError.ERROR_GENERAL_CHECK_UNI_HELP_INFO, OtherMsg=Msg, BelongsToTable='File', BelongsToItem=-2)
+            EccGlobalData.gDb.TblReport.Insert(
+                EccToolError.ERROR_GENERAL_CHECK_UNI_HELP_INFO, OtherMsg=Msg, BelongsToTable='File', BelongsToItem=-2)
             self.Missing.append(Key)
 
+
 ##
 #
 # This acts like the main() function for the script, unless it is 'import'ed into another
@@ -2098,4 +2184,3 @@ class UniParser(object):
 #
 if __name__ == '__main__':
     pass
-
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileTable.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileTable.py
index 0914aec460f5..9f7f8b5fc7c7 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileTable.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileTable.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create/update/query/erase a meta file table
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -17,11 +17,12 @@ import Ecc.EccGlobalData as EccGlobalData
 from Ecc.MetaFileWorkspace.MetaDataTable import Table
 from Ecc.MetaFileWorkspace.MetaDataTable import ConvertToSqlString
 from CommonDataClass.DataClass import MODEL_FILE_DSC, MODEL_FILE_DEC, MODEL_FILE_INF, \
-                                      MODEL_FILE_OTHERS
+    MODEL_FILE_OTHERS
+
 
 class MetaFileTable(Table):
-    ## Constructor
-    def __init__(self, Cursor, MetaFile, FileType, TableName, Temporary = False):
+    # Constructor
+    def __init__(self, Cursor, MetaFile, FileType, TableName, Temporary=False):
         self.MetaFile = MetaFile
         self.TblFile = EccGlobalData.gDb.TblFile
         if (FileType == MODEL_FILE_INF):
@@ -38,7 +39,7 @@ class MetaFileTable(Table):
         self.Create(False)
 
 
-## Python class representation of table storing module data
+# Python class representation of table storing module data
 class ModuleTable(MetaFileTable):
     _COLUMN_ = '''
         ID REAL PRIMARY KEY,
@@ -60,11 +61,11 @@ class ModuleTable(MetaFileTable):
     # used as table end flag, in case the changes to database is not committed to db file
     _DUMMY_ = "-1, -1, '====', '====', '====', '====', '====', -1, -1, -1, -1, -1, -1, -1"
 
-    ## Constructor
+    # Constructor
     def __init__(self, Cursor):
         MetaFileTable.__init__(self, Cursor, '', MODEL_FILE_INF, "Inf", False)
 
-    ## Insert a record into table Inf
+    # Insert a record into table Inf
     #
     # @param Model:          Model of a Inf item
     # @param Value1:         Value1 of a Inf item
@@ -80,27 +81,28 @@ class ModuleTable(MetaFileTable):
     # @param Enabled:        If this item enabled
     #
     def Insert(self, Model, Value1, Value2, Value3, Scope1='COMMON', Scope2='COMMON',
-               BelongsToItem=-1, BelongsToFile = -1, StartLine=-1, StartColumn=-1, EndLine=-1, EndColumn=-1, Enabled=0, Usage=''):
-        (Value1, Value2, Value3, Usage, Scope1, Scope2) = ConvertToSqlString((Value1, Value2, Value3, Usage, Scope1, Scope2))
+               BelongsToItem=-1, BelongsToFile=-1, StartLine=-1, StartColumn=-1, EndLine=-1, EndColumn=-1, Enabled=0, Usage=''):
+        (Value1, Value2, Value3, Usage, Scope1, Scope2) = ConvertToSqlString(
+            (Value1, Value2, Value3, Usage, Scope1, Scope2))
         return Table.Insert(
-                        self,
-                        Model,
-                        Value1,
-                        Value2,
-                        Value3,
-                        Usage,
-                        Scope1,
-                        Scope2,
-                        BelongsToItem,
-                        BelongsToFile,
-                        StartLine,
-                        StartColumn,
-                        EndLine,
-                        EndColumn,
-                        Enabled
-                        )
+            self,
+            Model,
+            Value1,
+            Value2,
+            Value3,
+            Usage,
+            Scope1,
+            Scope2,
+            BelongsToItem,
+            BelongsToFile,
+            StartLine,
+            StartColumn,
+            EndLine,
+            EndColumn,
+            Enabled
+        )
 
-    ## Query table
+    # Query table
     #
     # @param    Model:      The Model of Record
     # @param    Arch:       The Arch attribute of Record
@@ -117,10 +119,13 @@ class ModuleTable(MetaFileTable):
         if Platform is not None and Platform != 'COMMON':
             ConditionString += " AND (Scope2='%s' OR Scope2='COMMON' OR Scope2='DEFAULT')" % Platform
 
-        SqlCommand = "SELECT %s FROM %s WHERE %s" % (ValueString, self.Table, ConditionString)
+        SqlCommand = "SELECT %s FROM %s WHERE %s" % (
+            ValueString, self.Table, ConditionString)
         return self.Exec(SqlCommand)
 
-## Python class representation of table storing package data
+# Python class representation of table storing package data
+
+
 class PackageTable(MetaFileTable):
     _COLUMN_ = '''
         ID REAL PRIMARY KEY,
@@ -141,11 +146,11 @@ class PackageTable(MetaFileTable):
     # used as table end flag, in case the changes to database is not committed to db file
     _DUMMY_ = "-1, -1, '====', '====', '====', '====', '====', -1, -1, -1, -1, -1, -1, -1"
 
-    ## Constructor
+    # Constructor
     def __init__(self, Cursor):
         MetaFileTable.__init__(self, Cursor, '', MODEL_FILE_DEC, "Dec", False)
 
-    ## Insert table
+    # Insert table
     #
     # Insert a record into table Dec
     #
@@ -163,26 +168,27 @@ class PackageTable(MetaFileTable):
     # @param Enabled:        If this item enabled
     #
     def Insert(self, Model, Value1, Value2, Value3, Scope1='COMMON', Scope2='COMMON',
-               BelongsToItem=-1, BelongsToFile = -1, StartLine=-1, StartColumn=-1, EndLine=-1, EndColumn=-1, Enabled=0):
-        (Value1, Value2, Value3, Scope1, Scope2) = ConvertToSqlString((Value1, Value2, Value3, Scope1, Scope2))
+               BelongsToItem=-1, BelongsToFile=-1, StartLine=-1, StartColumn=-1, EndLine=-1, EndColumn=-1, Enabled=0):
+        (Value1, Value2, Value3, Scope1, Scope2) = ConvertToSqlString(
+            (Value1, Value2, Value3, Scope1, Scope2))
         return Table.Insert(
-                        self,
-                        Model,
-                        Value1,
-                        Value2,
-                        Value3,
-                        Scope1,
-                        Scope2,
-                        BelongsToItem,
-                        BelongsToFile,
-                        StartLine,
-                        StartColumn,
-                        EndLine,
-                        EndColumn,
-                        Enabled
-                        )
+            self,
+            Model,
+            Value1,
+            Value2,
+            Value3,
+            Scope1,
+            Scope2,
+            BelongsToItem,
+            BelongsToFile,
+            StartLine,
+            StartColumn,
+            EndLine,
+            EndColumn,
+            Enabled
+        )
 
-    ## Query table
+    # Query table
     #
     # @param    Model:  The Model of Record
     # @param    Arch:   The Arch attribute of Record
@@ -196,10 +202,13 @@ class PackageTable(MetaFileTable):
         if Arch is not None and Arch != 'COMMON':
             ConditionString += " AND (Scope1='%s' OR Scope1='COMMON')" % Arch
 
-        SqlCommand = "SELECT %s FROM %s WHERE %s" % (ValueString, self.Table, ConditionString)
+        SqlCommand = "SELECT %s FROM %s WHERE %s" % (
+            ValueString, self.Table, ConditionString)
         return self.Exec(SqlCommand)
 
-## Python class representation of table storing platform data
+# Python class representation of table storing platform data
+
+
 class PlatformTable(MetaFileTable):
     _COLUMN_ = '''
         ID REAL PRIMARY KEY,
@@ -221,11 +230,12 @@ class PlatformTable(MetaFileTable):
     # used as table end flag, in case the changes to database is not committed to db file
     _DUMMY_ = "-1, -1, '====', '====', '====', '====', '====', -1, -1, -1, -1, -1, -1, -1, -1"
 
-    ## Constructor
-    def __init__(self, Cursor, MetaFile = '', FileType = MODEL_FILE_DSC, Temporary = False):
-        MetaFileTable.__init__(self, Cursor, MetaFile, FileType, "Dsc", Temporary)
+    # Constructor
+    def __init__(self, Cursor, MetaFile='', FileType=MODEL_FILE_DSC, Temporary=False):
+        MetaFileTable.__init__(self, Cursor, MetaFile,
+                               FileType, "Dsc", Temporary)
 
-    ## Insert table
+    # Insert table
     #
     # Insert a record into table Dsc
     #
@@ -243,28 +253,29 @@ class PlatformTable(MetaFileTable):
     # @param EndColumn:      EndColumn of a Dsc item
     # @param Enabled:        If this item enabled
     #
-    def Insert(self, Model, Value1, Value2, Value3, Scope1='COMMON', Scope2='COMMON', BelongsToItem=-1, BelongsToFile = -1,
+    def Insert(self, Model, Value1, Value2, Value3, Scope1='COMMON', Scope2='COMMON', BelongsToItem=-1, BelongsToFile=-1,
                FromItem=-1, StartLine=-1, StartColumn=-1, EndLine=-1, EndColumn=-1, Enabled=1):
-        (Value1, Value2, Value3, Scope1, Scope2) = ConvertToSqlString((Value1, Value2, Value3, Scope1, Scope2))
+        (Value1, Value2, Value3, Scope1, Scope2) = ConvertToSqlString(
+            (Value1, Value2, Value3, Scope1, Scope2))
         return Table.Insert(
-                        self,
-                        Model,
-                        Value1,
-                        Value2,
-                        Value3,
-                        Scope1,
-                        Scope2,
-                        BelongsToItem,
-                        BelongsToFile,
-                        FromItem,
-                        StartLine,
-                        StartColumn,
-                        EndLine,
-                        EndColumn,
-                        Enabled
-                        )
+            self,
+            Model,
+            Value1,
+            Value2,
+            Value3,
+            Scope1,
+            Scope2,
+            BelongsToItem,
+            BelongsToFile,
+            FromItem,
+            StartLine,
+            StartColumn,
+            EndLine,
+            EndColumn,
+            Enabled
+        )
 
-    ## Query table
+    # Query table
     #
     # @param Model:          The Model of Record
     # @param Scope1:         Arch of a Dsc item
@@ -291,25 +302,28 @@ class PlatformTable(MetaFileTable):
         if FromItem is not None:
             ConditionString += " AND FromItem=%s" % FromItem
 
-        SqlCommand = "SELECT %s FROM %s WHERE %s" % (ValueString, self.Table, ConditionString)
+        SqlCommand = "SELECT %s FROM %s WHERE %s" % (
+            ValueString, self.Table, ConditionString)
         return self.Exec(SqlCommand)
 
-## Factory class to produce different storage for different type of meta-file
+# Factory class to produce different storage for different type of meta-file
+
+
 class MetaFileStorage(object):
     _FILE_TABLE_ = {
-        MODEL_FILE_INF      :   ModuleTable,
-        MODEL_FILE_DEC      :   PackageTable,
-        MODEL_FILE_DSC      :   PlatformTable,
-        MODEL_FILE_OTHERS   :   MetaFileTable,
+        MODEL_FILE_INF:   ModuleTable,
+        MODEL_FILE_DEC:   PackageTable,
+        MODEL_FILE_DSC:   PlatformTable,
+        MODEL_FILE_OTHERS:   MetaFileTable,
     }
 
     _FILE_TYPE_ = {
-        ".inf"  : MODEL_FILE_INF,
-        ".dec"  : MODEL_FILE_DEC,
-        ".dsc"  : MODEL_FILE_DSC,
+        ".inf": MODEL_FILE_INF,
+        ".dec": MODEL_FILE_DEC,
+        ".dsc": MODEL_FILE_DSC,
     }
 
-    ## Constructor
+    # Constructor
     def __new__(Class, Cursor, MetaFile, FileType=None, Temporary=False):
         # no type given, try to find one
         if not FileType:
@@ -326,4 +340,3 @@ class MetaFileStorage(object):
 
         # create the storage object and return it to caller
         return Class._FILE_TABLE_[FileType](*Args)
-
diff --git a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/__init__.py b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/__init__.py
index 85ae9937c43f..400adb76a0aa 100644
--- a/BaseTools/Source/Python/Ecc/MetaFileWorkspace/__init__.py
+++ b/BaseTools/Source/Python/Ecc/MetaFileWorkspace/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'Workspace' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/Ecc/ParserWarning.py b/BaseTools/Source/Python/Ecc/ParserWarning.py
index 2ae66d48487e..bf192b9a593c 100644
--- a/BaseTools/Source/Python/Ecc/ParserWarning.py
+++ b/BaseTools/Source/Python/Ecc/ParserWarning.py
@@ -1,23 +1,23 @@
-## @file
+# @file
 # This file is used to be the warning class of ECC tool
 #
 # Copyright (c) 2009 - 2018, Intel Corporation. All rights reserved.<BR>
 # SPDX-License-Identifier: BSD-2-Clause-Patent
 #
 
-## The exception class that used to report error messages when preprocessing
+# The exception class that used to report error messages when preprocessing
 #
 # Currently the "ToolName" is set to be "ECC PP".
 #
 class Warning (Exception):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Str         The message to record
     #   @param  File        The FDF name
     #   @param  Line        The Line number that error occurs
     #
-    def __init__(self, Str, File = None, Line = None):
+    def __init__(self, Str, File=None, Line=None):
         self.message = Str
         self.FileName = File
         self.LineNumber = Line
diff --git a/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py b/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
index b02f663b15a5..123cf66f7303 100644
--- a/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
+++ b/BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This is an XML API that uses a syntax similar to XPath, but it is written in
 # standard python so that no extra python packages are required to use it.
 #
@@ -14,7 +14,7 @@ import xml.dom.minidom
 import codecs
 from Common.LongFilePathSupport import OpenLongFilePath as open
 
-## Create a element of XML
+# Create a element of XML
 #
 # @param Name
 # @param String
@@ -23,6 +23,8 @@ from Common.LongFilePathSupport import OpenLongFilePath as open
 #
 # @revel Element
 #
+
+
 def CreateXmlElement(Name, String, NodeList, AttributeList):
     Doc = xml.dom.minidom.Document()
     Element = Doc.createElement(Name)
@@ -47,7 +49,7 @@ def CreateXmlElement(Name, String, NodeList, AttributeList):
 
     return Element
 
-## Get a list of XML nodes using XPath style syntax.
+# Get a list of XML nodes using XPath style syntax.
 #
 # Return a list of XML DOM nodes from the root Dom specified by XPath String.
 # If the input Dom or String is not valid, then an empty list is returned.
@@ -57,6 +59,8 @@ def CreateXmlElement(Name, String, NodeList, AttributeList):
 #
 # @revel  Nodes              A list of XML nodes matching XPath style Sting.
 #
+
+
 def XmlList(Dom, String):
     if String is None or String == "" or Dom is None or Dom == "":
         return []
@@ -83,7 +87,7 @@ def XmlList(Dom, String):
     return Nodes
 
 
-## Get a single XML node using XPath style syntax.
+# Get a single XML node using XPath style syntax.
 #
 # Return a single XML DOM node from the root Dom specified by XPath String.
 # If the input Dom or String is not valid, then an empty string is returned.
@@ -94,7 +98,7 @@ def XmlList(Dom, String):
 # @revel  Node               A single XML node matching XPath style Sting.
 #
 def XmlNode(Dom, String):
-    if String is None or String == ""  or Dom is None or Dom == "":
+    if String is None or String == "" or Dom is None or Dom == "":
         return ""
     if Dom.nodeType == Dom.DOCUMENT_NODE:
         Dom = Dom.documentElement
@@ -116,7 +120,7 @@ def XmlNode(Dom, String):
     return ""
 
 
-## Get a single XML element using XPath style syntax.
+# Get a single XML element using XPath style syntax.
 #
 # Return a single XML element from the root Dom specified by XPath String.
 # If the input Dom or String is not valid, then an empty string is returned.
@@ -133,7 +137,7 @@ def XmlElement(Dom, String):
         return ""
 
 
-## Get a single XML element of the current node.
+# Get a single XML element of the current node.
 #
 # Return a single XML element specified by the current root Dom.
 # If the input Dom is not valid, then an empty string is returned.
@@ -149,7 +153,7 @@ def XmlElementData(Dom):
         return ""
 
 
-## Get a list of XML elements using XPath style syntax.
+# Get a list of XML elements using XPath style syntax.
 #
 # Return a list of XML elements from the root Dom specified by XPath String.
 # If the input Dom or String is not valid, then an empty list is returned.
@@ -163,7 +167,7 @@ def XmlElementList(Dom, String):
     return map(XmlElementData, XmlList(Dom, String))
 
 
-## Get the XML attribute of the current node.
+# Get the XML attribute of the current node.
 #
 # Return a single XML attribute named Attribute from the current root Dom.
 # If the input Dom or Attribute is not valid, then an empty string is returned.
@@ -180,7 +184,7 @@ def XmlAttribute(Dom, Attribute):
         return ''
 
 
-## Get the XML node name of the current node.
+# Get the XML node name of the current node.
 #
 # Return a single XML node name from the current root Dom.
 # If the input Dom is not valid, then an empty string is returned.
@@ -195,7 +199,7 @@ def XmlNodeName(Dom):
     except:
         return ''
 
-## Parse an XML file.
+# Parse an XML file.
 #
 # Parse the input XML file named FileName and return a XML DOM it stands for.
 # If the input File is not a valid XML file, then an empty string is returned.
@@ -204,9 +208,11 @@ def XmlNodeName(Dom):
 #
 # @revel  Dom                The Dom object achieved from the XML file.
 #
+
+
 def XmlParseFile(FileName):
     try:
-        XmlFile = codecs.open(FileName,encoding='utf_8_sig')
+        XmlFile = codecs.open(FileName, encoding='utf_8_sig')
         Dom = xml.dom.minidom.parse(XmlFile)
         XmlFile.close()
         return Dom
@@ -214,12 +220,15 @@ def XmlParseFile(FileName):
         print(X)
         return ""
 
+
 # This acts like the main() function for the script, unless it is 'import'ed
 # into another script.
 if __name__ == '__main__':
     # Nothing to do here. Could do some unit tests.
-    A = CreateXmlElement('AAA', 'CCC',  [['AAA', '111'], ['BBB', '222']], [['A', '1'], ['B', '2']])
-    B = CreateXmlElement('ZZZ', 'CCC',  [['XXX', '111'], ['YYY', '222']], [['A', '1'], ['B', '2']])
+    A = CreateXmlElement('AAA', 'CCC',  [['AAA', '111'], ['BBB', '222']], [
+                         ['A', '1'], ['B', '2']])
+    B = CreateXmlElement('ZZZ', 'CCC',  [['XXX', '111'], ['YYY', '222']], [
+                         ['A', '1'], ['B', '2']])
     C = CreateXmlList('DDD', 'EEE', [A, B], ['FFF', 'GGG'])
-    print(C.toprettyxml(indent = " "))
+    print(C.toprettyxml(indent=" "))
     pass
diff --git a/BaseTools/Source/Python/Ecc/Xml/__init__.py b/BaseTools/Source/Python/Ecc/Xml/__init__.py
index 172e498451b8..03dedeed636e 100644
--- a/BaseTools/Source/Python/Ecc/Xml/__init__.py
+++ b/BaseTools/Source/Python/Ecc/Xml/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'Library' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/Ecc/__init__.py b/BaseTools/Source/Python/Ecc/__init__.py
index a1cac674d619..a8fa5404c7bb 100644
--- a/BaseTools/Source/Python/Ecc/__init__.py
+++ b/BaseTools/Source/Python/Ecc/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'Ecc' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/Ecc/c.py b/BaseTools/Source/Python/Ecc/c.py
index 61ad084fcc5b..e569cd3b408c 100644
--- a/BaseTools/Source/Python/Ecc/c.py
+++ b/BaseTools/Source/Python/Ecc/c.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to be the c coding style checking of ECC tool
 #
 # Copyright (c) 2009 - 2019, Intel Corporation. All rights reserved.<BR>
@@ -28,30 +28,38 @@ ComplexTypeDict = {}
 SUDict = {}
 IgnoredKeywordList = ['EFI_ERROR']
 
+
 def GetIgnoredDirListPattern():
     skipList = list(EccGlobalData.gConfig.SkipDirList) + ['.svn']
     DirString = '|'.join(skipList)
     p = re.compile(r'.*[\\/](?:%s)[\\/]?.*' % DirString)
     return p
 
+
 def GetFuncDeclPattern():
-    p = re.compile(r'(?:EFIAPI|EFI_BOOT_SERVICE|EFI_RUNTIME_SERVICE)?\s*[_\w]+\s*\(.*\)$', re.DOTALL)
+    p = re.compile(
+        r'(?:EFIAPI|EFI_BOOT_SERVICE|EFI_RUNTIME_SERVICE)?\s*[_\w]+\s*\(.*\)$', re.DOTALL)
     return p
 
+
 def GetArrayPattern():
     p = re.compile(r'[_\w]*\s*[\[.*\]]+')
     return p
 
+
 def GetTypedefFuncPointerPattern():
     p = re.compile('[_\w\s]*\([\w\s]*\*+\s*[_\w]+\s*\)\s*\(.*\)', re.DOTALL)
     return p
 
+
 def GetDB():
     return EccGlobalData.gDb
 
+
 def GetConfig():
     return EccGlobalData.gConfig
 
+
 def PrintErrorMsg(ErrorType, Msg, TableName, ItemId):
     Msg = Msg.replace('\n', '').replace('\r', '')
     MsgPartList = Msg.split()
@@ -59,7 +67,9 @@ def PrintErrorMsg(ErrorType, Msg, TableName, ItemId):
     for Part in MsgPartList:
         Msg += Part
         Msg += ' '
-    GetDB().TblReport.Insert(ErrorType, OtherMsg=Msg, BelongsToTable=TableName, BelongsToItem=ItemId)
+    GetDB().TblReport.Insert(ErrorType, OtherMsg=Msg,
+                             BelongsToTable=TableName, BelongsToItem=ItemId)
+
 
 def GetIdType(Str):
     Type = DataClass.MODEL_UNKNOWN
@@ -83,25 +93,30 @@ def GetIdType(Str):
         Type = DataClass.MODEL_UNKNOWN
     return Type
 
-def SuOccurInTypedef (Su, TdList):
+
+def SuOccurInTypedef(Su, TdList):
     for Td in TdList:
         if Su.StartPos[0] == Td.StartPos[0] and Su.EndPos[0] == Td.EndPos[0]:
             return True
     return False
 
+
 def GetIdentifierList():
     IdList = []
     for comment in FileProfile.CommentList:
-        IdComment = DataClass.IdentifierClass(-1, '', '', '', comment.Content, DataClass.MODEL_IDENTIFIER_COMMENT, -1, -1, comment.StartPos[0], comment.StartPos[1], comment.EndPos[0], comment.EndPos[1])
+        IdComment = DataClass.IdentifierClass(-1, '', '', '', comment.Content, DataClass.MODEL_IDENTIFIER_COMMENT, -
+                                              1, -1, comment.StartPos[0], comment.StartPos[1], comment.EndPos[0], comment.EndPos[1])
         IdList.append(IdComment)
 
     for pp in FileProfile.PPDirectiveList:
         Type = GetIdType(pp.Content)
-        IdPP = DataClass.IdentifierClass(-1, '', '', '', pp.Content, Type, -1, -1, pp.StartPos[0], pp.StartPos[1], pp.EndPos[0], pp.EndPos[1])
+        IdPP = DataClass.IdentifierClass(-1, '', '', '', pp.Content, Type, -
+                                         1, -1, pp.StartPos[0], pp.StartPos[1], pp.EndPos[0], pp.EndPos[1])
         IdList.append(IdPP)
 
     for pe in FileProfile.PredicateExpressionList:
-        IdPE = DataClass.IdentifierClass(-1, '', '', '', pe.Content, DataClass.MODEL_IDENTIFIER_PREDICATE_EXPRESSION, -1, -1, pe.StartPos[0], pe.StartPos[1], pe.EndPos[0], pe.EndPos[1])
+        IdPE = DataClass.IdentifierClass(-1, '', '', '', pe.Content, DataClass.MODEL_IDENTIFIER_PREDICATE_EXPRESSION, -
+                                         1, -1, pe.StartPos[0], pe.StartPos[1], pe.EndPos[0], pe.EndPos[1])
         IdList.append(IdPE)
 
     FuncDeclPattern = GetFuncDeclPattern()
@@ -177,7 +192,8 @@ def GetIdentifierList():
                             Index += 1
                             VarNameStartColumn += 1
                         PreChar = FirstChar
-            IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', var.Declarator, FuncName, DataClass.MODEL_IDENTIFIER_FUNCTION_DECLARATION, -1, -1, var.StartPos[0], var.StartPos[1], VarNameStartLine, VarNameStartColumn)
+            IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', var.Declarator, FuncName, DataClass.MODEL_IDENTIFIER_FUNCTION_DECLARATION, -
+                                              1, -1, var.StartPos[0], var.StartPos[1], VarNameStartLine, VarNameStartColumn)
             IdList.append(IdVar)
             continue
 
@@ -190,7 +206,8 @@ def GetIdentifierList():
                     var.Modifier += ' ' + Name[LSBPos:]
                     Name = Name[0:LSBPos]
 
-                IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', Name, (len(DeclList) > 1 and [DeclList[1]]or [''])[0], DataClass.MODEL_IDENTIFIER_VARIABLE, -1, -1, var.StartPos[0], var.StartPos[1], VarNameStartLine, VarNameStartColumn)
+                IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', Name, (len(DeclList) > 1 and [DeclList[1]] or [''])[
+                                                  0], DataClass.MODEL_IDENTIFIER_VARIABLE, -1, -1, var.StartPos[0], var.StartPos[1], VarNameStartLine, VarNameStartColumn)
                 IdList.append(IdVar)
         else:
             DeclList = var.Declarator.split('=')
@@ -199,7 +216,8 @@ def GetIdentifierList():
                 LSBPos = var.Declarator.find('[')
                 var.Modifier += ' ' + Name[LSBPos:]
                 Name = Name[0:LSBPos]
-            IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', Name, (len(DeclList) > 1 and [DeclList[1]]or [''])[0], DataClass.MODEL_IDENTIFIER_VARIABLE, -1, -1, var.StartPos[0], var.StartPos[1], VarNameStartLine, VarNameStartColumn)
+            IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', Name, (len(DeclList) > 1 and [DeclList[1]] or [''])[
+                                              0], DataClass.MODEL_IDENTIFIER_VARIABLE, -1, -1, var.StartPos[0], var.StartPos[1], VarNameStartLine, VarNameStartColumn)
             IdList.append(IdVar)
 
     for enum in FileProfile.EnumerationDefinitionList:
@@ -207,7 +225,8 @@ def GetIdentifierList():
         RBPos = enum.Content.find('}')
         Name = enum.Content[4:LBPos].strip()
         Value = enum.Content[LBPos + 1:RBPos]
-        IdEnum = DataClass.IdentifierClass(-1, '', '', Name, Value, DataClass.MODEL_IDENTIFIER_ENUMERATE, -1, -1, enum.StartPos[0], enum.StartPos[1], enum.EndPos[0], enum.EndPos[1])
+        IdEnum = DataClass.IdentifierClass(-1, '', '', Name, Value, DataClass.MODEL_IDENTIFIER_ENUMERATE, -
+                                           1, -1, enum.StartPos[0], enum.StartPos[1], enum.EndPos[0], enum.EndPos[1])
         IdList.append(IdEnum)
 
     for su in FileProfile.StructUnionDefinitionList:
@@ -226,7 +245,8 @@ def GetIdentifierList():
         else:
             Name = su.Content[SkipLen:LBPos].strip()
             Value = su.Content[LBPos:RBPos + 1]
-        IdPE = DataClass.IdentifierClass(-1, '', '', Name, Value, Type, -1, -1, su.StartPos[0], su.StartPos[1], su.EndPos[0], su.EndPos[1])
+        IdPE = DataClass.IdentifierClass(-1, '', '', Name, Value, Type, -
+                                         1, -1, su.StartPos[0], su.StartPos[1], su.EndPos[0], su.EndPos[1])
         IdList.append(IdPE)
 
     TdFuncPointerPattern = GetTypedefFuncPointerPattern()
@@ -242,7 +262,7 @@ def GetIdentifierList():
             if StarPos != -1:
                 Modifier += ' ' + TmpStr[0:StarPos]
             while TmpStr[StarPos] == '*':
-#                Modifier += ' ' + '*'
+                #                Modifier += ' ' + '*'
                 StarPos += 1
             TmpStr = TmpStr[StarPos:].strip()
             RBPos = TmpStr.find(')')
@@ -256,17 +276,20 @@ def GetIdentifierList():
         if Name.find('[') != -1:
             LBPos = Name.find('[')
             RBPos = Name.rfind(']')
-            Value += Name[LBPos : RBPos + 1]
-            Name = Name[0 : LBPos]
+            Value += Name[LBPos: RBPos + 1]
+            Name = Name[0: LBPos]
 
-        IdTd = DataClass.IdentifierClass(-1, Modifier, '', Name, Value, DataClass.MODEL_IDENTIFIER_TYPEDEF, -1, -1, td.StartPos[0], td.StartPos[1], td.EndPos[0], td.EndPos[1])
+        IdTd = DataClass.IdentifierClass(-1, Modifier, '', Name, Value, DataClass.MODEL_IDENTIFIER_TYPEDEF, -
+                                         1, -1, td.StartPos[0], td.StartPos[1], td.EndPos[0], td.EndPos[1])
         IdList.append(IdTd)
 
     for funcCall in FileProfile.FunctionCallingList:
-        IdFC = DataClass.IdentifierClass(-1, '', '', funcCall.FuncName, funcCall.ParamList, DataClass.MODEL_IDENTIFIER_FUNCTION_CALLING, -1, -1, funcCall.StartPos[0], funcCall.StartPos[1], funcCall.EndPos[0], funcCall.EndPos[1])
+        IdFC = DataClass.IdentifierClass(-1, '', '', funcCall.FuncName, funcCall.ParamList, DataClass.MODEL_IDENTIFIER_FUNCTION_CALLING, -
+                                         1, -1, funcCall.StartPos[0], funcCall.StartPos[1], funcCall.EndPos[0], funcCall.EndPos[1])
         IdList.append(IdFC)
     return IdList
 
+
 def StripNonAlnumChars(Str):
     StrippedStr = ''
     for Char in Str:
@@ -274,12 +297,13 @@ def StripNonAlnumChars(Str):
             StrippedStr += Char
     return StrippedStr
 
+
 def GetParamList(FuncDeclarator, FuncNameLine=0, FuncNameOffset=0):
     FuncDeclarator = StripComments(FuncDeclarator)
     ParamIdList = []
     #DeclSplitList = FuncDeclarator.split('(')
     LBPos = FuncDeclarator.find('(')
-    #if len(DeclSplitList) < 2:
+    # if len(DeclSplitList) < 2:
     if LBPos == -1:
         return ParamIdList
     #FuncName = DeclSplitList[0]
@@ -309,7 +333,7 @@ def GetParamList(FuncDeclarator, FuncNameLine=0, FuncNameOffset=0):
             FuncName = FuncName[:-1]
         TailChar = FuncName[-1]
 
-    OffsetSkipped += 1 #skip '('
+    OffsetSkipped += 1  # skip '('
 
     for p in ParamStr.split(','):
         ListP = p.split()
@@ -396,13 +420,15 @@ def GetParamList(FuncDeclarator, FuncNameLine=0, FuncNameOffset=0):
         ParamEndOffset = FuncNameOffset + OffsetSkipped
         if ParamName != '...':
             ParamName = StripNonAlnumChars(ParamName)
-        IdParam = DataClass.IdentifierClass(-1, ParamModifier, '', ParamName, '', DataClass.MODEL_IDENTIFIER_PARAMETER, -1, -1, ParamBeginLine, ParamBeginOffset, ParamEndLine, ParamEndOffset)
+        IdParam = DataClass.IdentifierClass(-1, ParamModifier, '', ParamName, '', DataClass.MODEL_IDENTIFIER_PARAMETER, -
+                                            1, -1, ParamBeginLine, ParamBeginOffset, ParamEndLine, ParamEndOffset)
         ParamIdList.append(IdParam)
 
-        OffsetSkipped += 1 #skip ','
+        OffsetSkipped += 1  # skip ','
 
     return ParamIdList
 
+
 def GetFunctionList():
     FuncObjList = []
     for FuncDef in FileProfile.FunctionDefinitionList:
@@ -476,11 +502,13 @@ def GetFunctionList():
                         FuncNameStartColumn += 1
                     PreChar = FirstChar
 
-        FuncObj = DataClass.FunctionClass(-1, FuncDef.Declarator, FuncDef.Modifier, FuncName.strip(), '', FuncDef.StartPos[0], FuncDef.StartPos[1], FuncDef.EndPos[0], FuncDef.EndPos[1], FuncDef.LeftBracePos[0], FuncDef.LeftBracePos[1], -1, ParamIdList, [], FuncNameStartLine, FuncNameStartColumn)
+        FuncObj = DataClass.FunctionClass(-1, FuncDef.Declarator, FuncDef.Modifier, FuncName.strip(
+        ), '', FuncDef.StartPos[0], FuncDef.StartPos[1], FuncDef.EndPos[0], FuncDef.EndPos[1], FuncDef.LeftBracePos[0], FuncDef.LeftBracePos[1], -1, ParamIdList, [], FuncNameStartLine, FuncNameStartColumn)
         FuncObjList.append(FuncObj)
 
     return FuncObjList
 
+
 def GetFileModificationTimeFromDB(FullFileName):
     TimeValue = 0.0
     Db = GetDB()
@@ -493,6 +521,7 @@ def GetFileModificationTimeFromDB(FullFileName):
         TimeValue = Result[0]
     return TimeValue
 
+
 def CollectSourceCodeDataIntoDB(RootDir):
     FileObjList = []
     tuple = os.walk(RootDir)
@@ -522,8 +551,10 @@ def CollectSourceCodeDataIntoDB(RootDir):
             model = DataClass.MODEL_FILE_OTHERS
             if os.path.splitext(f)[1] in ('.h', '.c'):
                 EdkLogger.info("Parsing " + FullName)
-                model = f.endswith('c') and DataClass.MODEL_FILE_C or DataClass.MODEL_FILE_H
-                collector = CodeFragmentCollector.CodeFragmentCollector(FullName)
+                model = f.endswith(
+                    'c') and DataClass.MODEL_FILE_C or DataClass.MODEL_FILE_H
+                collector = CodeFragmentCollector.CodeFragmentCollector(
+                    FullName)
                 collector.TokenReleaceList = TokenReleaceList
                 try:
                     collector.ParseFile()
@@ -536,13 +567,15 @@ def CollectSourceCodeDataIntoDB(RootDir):
             DirName = os.path.dirname(FullName)
             Ext = os.path.splitext(f)[1].lstrip('.')
             ModifiedTime = os.path.getmtime(FullName)
-            FileObj = DataClass.FileClass(-1, BaseName, Ext, DirName, FullName, model, ModifiedTime, GetFunctionList(), GetIdentifierList(), [])
+            FileObj = DataClass.FileClass(-1, BaseName, Ext, DirName, FullName,
+                                          model, ModifiedTime, GetFunctionList(), GetIdentifierList(), [])
             FileObjList.append(FileObj)
             if collector:
                 collector.CleanFileProfileBuffer()
 
     if len(ParseErrorFileList) > 0:
-        EdkLogger.info("Found unrecoverable error during parsing:\n\t%s\n" % "\n\t".join(ParseErrorFileList))
+        EdkLogger.info("Found unrecoverable error during parsing:\n\t%s\n" %
+                       "\n\t".join(ParseErrorFileList))
 
     Db = GetDB()
     for file in FileObjList:
@@ -551,6 +584,7 @@ def CollectSourceCodeDataIntoDB(RootDir):
 
     Db.UpdateIdentifierBelongsToFunction()
 
+
 def GetTableID(FullFileName, ErrorMsgList=None):
     if ErrorMsgList is None:
         ErrorMsgList = []
@@ -565,14 +599,17 @@ def GetTableID(FullFileName, ErrorMsgList=None):
     FileID = -1
     for Result in ResultSet:
         if FileID != -1:
-            ErrorMsgList.append('Duplicate file ID found in DB for file %s' % FullFileName)
+            ErrorMsgList.append(
+                'Duplicate file ID found in DB for file %s' % FullFileName)
             return - 2
         FileID = Result[0]
     if FileID == -1:
-        ErrorMsgList.append('NO file ID found in DB for file %s' % FullFileName)
+        ErrorMsgList.append(
+            'NO file ID found in DB for file %s' % FullFileName)
         return - 1
     return FileID
 
+
 def GetIncludeFileList(FullFileName):
     if os.path.splitext(FullFileName)[1].upper() not in ('.H'):
         return []
@@ -594,6 +631,7 @@ def GetIncludeFileList(FullFileName):
     IncludeFileListDict[FullFileName] = ResultSet
     return ResultSet
 
+
 def GetFullPathOfIncludeFile(Str, IncludePathList):
     for IncludePath in IncludePathList:
         FullPath = os.path.join(IncludePath, Str)
@@ -602,6 +640,7 @@ def GetFullPathOfIncludeFile(Str, IncludePathList):
             return FullPath
     return None
 
+
 def GetAllIncludeFiles(FullFileName):
     if AllIncludeFileListDict.get(FullFileName) is not None:
         return AllIncludeFileListDict.get(FullFileName)
@@ -609,7 +648,8 @@ def GetAllIncludeFiles(FullFileName):
     FileDirName = os.path.dirname(FullFileName)
     IncludePathList = IncludePathListDict.get(FileDirName)
     if IncludePathList is None:
-        IncludePathList = MetaDataParser.GetIncludeListOfFile(EccGlobalData.gWorkspace, FullFileName, GetDB())
+        IncludePathList = MetaDataParser.GetIncludeListOfFile(
+            EccGlobalData.gWorkspace, FullFileName, GetDB())
         if FileDirName not in IncludePathList:
             IncludePathList.insert(0, FileDirName)
         IncludePathListDict[FileDirName] = IncludePathList
@@ -638,6 +678,7 @@ def GetAllIncludeFiles(FullFileName):
     AllIncludeFileListDict[FullFileName] = IncludeFileQueue
     return IncludeFileQueue
 
+
 def GetPredicateListFromPredicateExpStr(PES):
 
     PredicateList = []
@@ -674,6 +715,7 @@ def GetPredicateListFromPredicateExpStr(PES):
             PredicateList.append(Exp.rstrip(';').rstrip(')').strip())
     return PredicateList
 
+
 def GetCNameList(Lvalue, StarList=[]):
     Lvalue += ' '
     i = 0
@@ -700,7 +742,6 @@ def GetCNameList(Lvalue, StarList=[]):
         if VarEnd == -1:
             break
 
-
         DotIndex = Lvalue[VarEnd:].find('.')
         ArrowIndex = Lvalue[VarEnd:].find('->')
         if DotIndex == -1 and ArrowIndex == -1:
@@ -710,7 +751,8 @@ def GetCNameList(Lvalue, StarList=[]):
         elif ArrowIndex == -1 and DotIndex != -1:
             SearchBegin = VarEnd + DotIndex
         else:
-            SearchBegin = VarEnd + ((DotIndex < ArrowIndex) and DotIndex or ArrowIndex)
+            SearchBegin = VarEnd + \
+                ((DotIndex < ArrowIndex) and DotIndex or ArrowIndex)
 
         i = SearchBegin
         VarStart = -1
@@ -718,6 +760,7 @@ def GetCNameList(Lvalue, StarList=[]):
 
     return VarList
 
+
 def SplitPredicateByOp(Str, Op, IsFuncCalling=False):
 
     Name = Str.strip()
@@ -780,6 +823,7 @@ def SplitPredicateByOp(Str, Op, IsFuncCalling=False):
 
         TmpStr = Str[0:Index - 1]
 
+
 def SplitPredicateStr(Str):
 
     Str = Str.lstrip('(')
@@ -815,12 +859,14 @@ def SplitPredicateStr(Str):
 
     return [[Str, None], None]
 
+
 def GetFuncContainsPE(ExpLine, ResultSet):
     for Result in ResultSet:
         if Result[0] < ExpLine and Result[1] > ExpLine:
             return Result
     return None
 
+
 def PatternInModifier(Modifier, SubStr):
     PartList = Modifier.split()
     for Part in PartList:
@@ -828,6 +874,7 @@ def PatternInModifier(Modifier, SubStr):
             return True
     return False
 
+
 def GetDataTypeFromModifier(ModifierStr):
     MList = ModifierStr.split()
     ReturnType = ''
@@ -844,6 +891,7 @@ def GetDataTypeFromModifier(ModifierStr):
         ReturnType = 'VOID'
     return ReturnType
 
+
 def DiffModifier(Str1, Str2):
     PartList1 = Str1.split()
     PartList2 = Str2.split()
@@ -852,6 +900,7 @@ def DiffModifier(Str1, Str2):
     else:
         return True
 
+
 def GetTypedefDict(FullFileName):
 
     Dict = ComplexTypeDict.get(FullFileName)
@@ -897,6 +946,7 @@ def GetTypedefDict(FullFileName):
     ComplexTypeDict[FullFileName] = Dict
     return Dict
 
+
 def GetSUDict(FullFileName):
 
     Dict = SUDict.get(FullFileName)
@@ -937,6 +987,7 @@ def GetSUDict(FullFileName):
     SUDict[FullFileName] = Dict
     return Dict
 
+
 def StripComments(Str):
     Str += '   '
     ListFromStr = list(Str)
@@ -983,6 +1034,7 @@ def StripComments(Str):
 
     return Str
 
+
 def GetFinalTypeValue(Type, FieldName, TypedefDict, SUDict):
     Value = TypedefDict.get(Type)
     if Value is None:
@@ -1019,22 +1071,24 @@ def GetFinalTypeValue(Type, FieldName, TypedefDict, SUDict):
                 Type = GetDataTypeFromModifier(Field[0:Index])
                 return Type.strip()
             else:
-            # For the condition that the field in struct is an array with [] suffixes...
+                # For the condition that the field in struct is an array with [] suffixes...
                 if not Field[Index + len(FieldName)].isalnum():
                     Type = GetDataTypeFromModifier(Field[0:Index])
                     return Type.strip()
 
     return None
 
+
 def GetRealType(Type, TypedefDict, TargetType=None):
     if TargetType is not None and Type == TargetType:
-            return Type
+        return Type
     while TypedefDict.get(Type):
         Type = TypedefDict.get(Type)
         if TargetType is not None and Type == TargetType:
             return Type
     return Type
 
+
 def GetTypeInfo(RefList, Modifier, FullFileName, TargetType=None):
     TypedefDict = GetTypedefDict(FullFileName)
     SUDict = GetSUDict(FullFileName)
@@ -1063,6 +1117,7 @@ def GetTypeInfo(RefList, Modifier, FullFileName, TargetType=None):
 
     return Type
 
+
 def GetVarInfo(PredVarList, FuncRecord, FullFileName, IsFuncCall=False, TargetType=None, StarList=None):
 
     PredVar = PredVarList[0]
@@ -1147,10 +1202,11 @@ def GetVarInfo(PredVarList, FuncRecord, FullFileName, IsFuncCall=False, TargetTy
     VarFound = False
     for Result in ResultSet:
         if len(PredVarList) > 1:
-            Type = GetTypeInfo(PredVarList[1:], Result[0], FullFileName, TargetType)
+            Type = GetTypeInfo(
+                PredVarList[1:], Result[0], FullFileName, TargetType)
             return Type
         else:
-#            Type = GetDataTypeFromModifier(Result[0]).split()[-1]
+            #            Type = GetDataTypeFromModifier(Result[0]).split()[-1]
             TypeList = GetDataTypeFromModifier(Result[0]).split()
             Type = TypeList[-1]
             if len(TypeList) > 1 and StarList is not None:
@@ -1169,7 +1225,8 @@ def GetVarInfo(PredVarList, FuncRecord, FullFileName, IsFuncCall=False, TargetTy
     for Param in ParamList:
         if Param.Name.strip() == PredVar:
             if len(PredVarList) > 1:
-                Type = GetTypeInfo(PredVarList[1:], Param.Modifier, FullFileName, TargetType)
+                Type = GetTypeInfo(
+                    PredVarList[1:], Param.Modifier, FullFileName, TargetType)
                 return Type
             else:
                 TypeList = GetDataTypeFromModifier(Param.Modifier).split()
@@ -1196,7 +1253,8 @@ def GetVarInfo(PredVarList, FuncRecord, FullFileName, IsFuncCall=False, TargetTy
 
     for Result in ResultSet:
         if len(PredVarList) > 1:
-            Type = GetTypeInfo(PredVarList[1:], Result[0], FullFileName, TargetType)
+            Type = GetTypeInfo(
+                PredVarList[1:], Result[0], FullFileName, TargetType)
             return Type
         else:
             TypeList = GetDataTypeFromModifier(Result[0]).split()
@@ -1227,7 +1285,8 @@ def GetVarInfo(PredVarList, FuncRecord, FullFileName, IsFuncCall=False, TargetTy
 
         for Result in ResultSet:
             if len(PredVarList) > 1:
-                Type = GetTypeInfo(PredVarList[1:], Result[0], FullFileName, TargetType)
+                Type = GetTypeInfo(
+                    PredVarList[1:], Result[0], FullFileName, TargetType)
                 return Type
             else:
                 TypeList = GetDataTypeFromModifier(Result[0]).split()
@@ -1243,6 +1302,7 @@ def GetVarInfo(PredVarList, FuncRecord, FullFileName, IsFuncCall=False, TargetTy
                 Type = GetRealType(Type, TypedefDict, TargetType)
                 return Type
 
+
 def GetTypeFromArray(Type, Var):
     Count = Var.count('[')
 
@@ -1253,6 +1313,7 @@ def GetTypeFromArray(Type, Var):
 
     return Type
 
+
 def CheckFuncLayoutReturnType(FullFileName):
     ErrorMsgList = []
 
@@ -1278,10 +1339,12 @@ def CheckFuncLayoutReturnType(FullFileName):
             Result0 = Result0[6:].strip()
         Index = Result0.find(TypeStart)
         if Index != 0 or Result[3] != 0:
-            PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_RETURN_TYPE, '[%s] Return Type should appear at the start of line' % FuncName, FileTable, Result[1])
+            PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_RETURN_TYPE,
+                          '[%s] Return Type should appear at the start of line' % FuncName, FileTable, Result[1])
 
         if Result[2] == Result[4]:
-            PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_RETURN_TYPE, '[%s] Return Type should appear on its own line' % FuncName, FileTable, Result[1])
+            PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_RETURN_TYPE,
+                          '[%s] Return Type should appear on its own line' % FuncName, FileTable, Result[1])
 
     SqlStatement = """ select Modifier, ID, StartLine, StartColumn, FunNameStartLine, Name
                        from Function
@@ -1299,7 +1362,9 @@ def CheckFuncLayoutReturnType(FullFileName):
             Result0 = Result0[6:].strip()
         Index = Result0.find(TypeStart)
         if Index != 0 or Result[3] != 0:
-            PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_RETURN_TYPE, '[%s] Return Type should appear at the start of line' % FuncName, 'Function', Result[1])
+            PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_RETURN_TYPE,
+                          '[%s] Return Type should appear at the start of line' % FuncName, 'Function', Result[1])
+
 
 def CheckFuncLayoutModifier(FullFileName):
     ErrorMsgList = []
@@ -1323,7 +1388,8 @@ def CheckFuncLayoutModifier(FullFileName):
             Result0 = Result0[6:].strip()
         Index = Result0.find(TypeStart)
         if Index != 0:
-            PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_OPTIONAL_FUNCTIONAL_MODIFIER, '', FileTable, Result[1])
+            PrintErrorMsg(
+                ERROR_C_FUNCTION_LAYOUT_CHECK_OPTIONAL_FUNCTIONAL_MODIFIER, '', FileTable, Result[1])
 
     SqlStatement = """ select Modifier, ID
                        from Function
@@ -1338,7 +1404,9 @@ def CheckFuncLayoutModifier(FullFileName):
             Result0 = Result0[6:].strip()
         Index = Result0.find(TypeStart)
         if Index != 0:
-            PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_OPTIONAL_FUNCTIONAL_MODIFIER, '', 'Function', Result[1])
+            PrintErrorMsg(
+                ERROR_C_FUNCTION_LAYOUT_CHECK_OPTIONAL_FUNCTIONAL_MODIFIER, '', 'Function', Result[1])
+
 
 def CheckFuncLayoutName(FullFileName):
     ErrorMsgList = []
@@ -1361,22 +1429,27 @@ def CheckFuncLayoutName(FullFileName):
         if EccGlobalData.gException.IsException(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_NAME, FuncName):
             continue
         if Result[2] != 0:
-            PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_NAME, 'Function name [%s] should appear at the start of a line' % FuncName, FileTable, Result[1])
+            PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_NAME,
+                          'Function name [%s] should appear at the start of a line' % FuncName, FileTable, Result[1])
         ParamList = GetParamList(Result[0])
         if len(ParamList) == 0:
             continue
         StartLine = 0
         for Param in ParamList:
             if Param.StartLine <= StartLine:
-                PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_NAME, 'Parameter %s should be in its own line.' % Param.Name, FileTable, Result[1])
+                PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_NAME,
+                              'Parameter %s should be in its own line.' % Param.Name, FileTable, Result[1])
             if Param.StartLine - StartLine > 1:
-                PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_NAME, 'Empty line appears before Parameter %s.' % Param.Name, FileTable, Result[1])
+                PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_NAME,
+                              'Empty line appears before Parameter %s.' % Param.Name, FileTable, Result[1])
             if not Pattern.match(Param.Name) and not Param.Name in ParamIgnoreList and not EccGlobalData.gException.IsException(ERROR_NAMING_CONVENTION_CHECK_VARIABLE_NAME, Param.Name):
-                PrintErrorMsg(ERROR_NAMING_CONVENTION_CHECK_VARIABLE_NAME, 'Parameter [%s] NOT follow naming convention.' % Param.Name, FileTable, Result[1])
+                PrintErrorMsg(ERROR_NAMING_CONVENTION_CHECK_VARIABLE_NAME,
+                              'Parameter [%s] NOT follow naming convention.' % Param.Name, FileTable, Result[1])
             StartLine = Param.StartLine
 
         if not Result[0].endswith('\n  )') and not Result[0].endswith('\r  )'):
-            PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_NAME, '\')\' should be on a new line and indented two spaces', FileTable, Result[1])
+            PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_NAME,
+                          '\')\' should be on a new line and indented two spaces', FileTable, Result[1])
 
     SqlStatement = """ select Modifier, ID, FunNameStartColumn, Name
                        from Function
@@ -1388,21 +1461,27 @@ def CheckFuncLayoutName(FullFileName):
         if EccGlobalData.gException.IsException(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_NAME, FuncName):
             continue
         if Result[2] != 0:
-            PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_NAME, 'Function name [%s] should appear at the start of a line' % FuncName, 'Function', Result[1])
+            PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_NAME,
+                          'Function name [%s] should appear at the start of a line' % FuncName, 'Function', Result[1])
         ParamList = GetParamList(Result[0])
         if len(ParamList) == 0:
             continue
         StartLine = 0
         for Param in ParamList:
             if Param.StartLine <= StartLine:
-                PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_NAME, 'Parameter %s should be in its own line.' % Param.Name, 'Function', Result[1])
+                PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_NAME,
+                              'Parameter %s should be in its own line.' % Param.Name, 'Function', Result[1])
             if Param.StartLine - StartLine > 1:
-                PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_NAME, 'Empty line appears before Parameter %s.' % Param.Name, 'Function', Result[1])
+                PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_NAME,
+                              'Empty line appears before Parameter %s.' % Param.Name, 'Function', Result[1])
             if not Pattern.match(Param.Name) and not Param.Name in ParamIgnoreList and not EccGlobalData.gException.IsException(ERROR_NAMING_CONVENTION_CHECK_VARIABLE_NAME, Param.Name):
-                PrintErrorMsg(ERROR_NAMING_CONVENTION_CHECK_VARIABLE_NAME, 'Parameter [%s] NOT follow naming convention.' % Param.Name, FileTable, Result[1])
+                PrintErrorMsg(ERROR_NAMING_CONVENTION_CHECK_VARIABLE_NAME,
+                              'Parameter [%s] NOT follow naming convention.' % Param.Name, FileTable, Result[1])
             StartLine = Param.StartLine
         if not Result[0].endswith('\n  )') and not Result[0].endswith('\r  )'):
-            PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_NAME, '\')\' should be on a new line and indented two spaces', 'Function', Result[1])
+            PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_NAME,
+                          '\')\' should be on a new line and indented two spaces', 'Function', Result[1])
+
 
 def CheckFuncLayoutPrototype(FullFileName):
     ErrorMsgList = []
@@ -1445,17 +1524,20 @@ def CheckFuncLayoutPrototype(FullFileName):
             DeclModifier = FuncDecl[0]
             if DeclName == FuncName:
                 if DiffModifier(FuncModifier, DeclModifier) and not EccGlobalData.gException.IsException(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE, FuncName):
-                    PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE, 'Function [%s] modifier different with prototype.' % FuncName, 'Function', FuncDef[3])
+                    PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE,
+                                  'Function [%s] modifier different with prototype.' % FuncName, 'Function', FuncDef[3])
                 ParamListOfDef = GetParamList(FuncDefHeader)
                 ParamListOfDecl = GetParamList(FuncDecl[1])
                 if len(ParamListOfDef) != len(ParamListOfDecl) and not EccGlobalData.gException.IsException(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE_2, FuncName):
-                    PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE_2, 'Parameter number different in function [%s].' % FuncName, 'Function', FuncDef[3])
+                    PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE_2,
+                                  'Parameter number different in function [%s].' % FuncName, 'Function', FuncDef[3])
                     break
 
                 Index = 0
                 while Index < len(ParamListOfDef):
                     if DiffModifier(ParamListOfDef[Index].Modifier, ParamListOfDecl[Index].Modifier) and not EccGlobalData.gException.IsException(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE_3, FuncName):
-                        PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE_3, 'Parameter %s has different modifier with prototype in function [%s].' % (ParamListOfDef[Index].Name, FuncName), 'Function', FuncDef[3])
+                        PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE_3, 'Parameter %s has different modifier with prototype in function [%s].' % (
+                            ParamListOfDef[Index].Name, FuncName), 'Function', FuncDef[3])
                     Index += 1
                 break
         else:
@@ -1488,20 +1570,24 @@ def CheckFuncLayoutPrototype(FullFileName):
             DeclModifier = FuncDecl[0]
             if DeclName == FuncName:
                 if DiffModifier(FuncModifier, DeclModifier) and not EccGlobalData.gException.IsException(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE, FuncName):
-                    PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE, 'Function [%s] modifier different with prototype.' % FuncName, 'Function', FuncDef[3])
+                    PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE,
+                                  'Function [%s] modifier different with prototype.' % FuncName, 'Function', FuncDef[3])
                 ParamListOfDef = GetParamList(FuncDefHeader)
                 ParamListOfDecl = GetParamList(FuncDecl[1])
                 if len(ParamListOfDef) != len(ParamListOfDecl) and not EccGlobalData.gException.IsException(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE_2, FuncName):
-                    PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE_2, 'Parameter number different in function [%s].' % FuncName, 'Function', FuncDef[3])
+                    PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE_2,
+                                  'Parameter number different in function [%s].' % FuncName, 'Function', FuncDef[3])
                     break
 
                 Index = 0
                 while Index < len(ParamListOfDef):
                     if DiffModifier(ParamListOfDef[Index].Modifier, ParamListOfDecl[Index].Modifier) and not EccGlobalData.gException.IsException(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE_3, FuncName):
-                        PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE_3, 'Parameter %s has different modifier with prototype in function [%s].' % (ParamListOfDef[Index].Name, FuncName), 'Function', FuncDef[3])
+                        PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_PROTO_TYPE_3, 'Parameter %s has different modifier with prototype in function [%s].' % (
+                            ParamListOfDef[Index].Name, FuncName), 'Function', FuncDef[3])
                     Index += 1
                 break
 
+
 def CheckFuncLayoutBody(FullFileName):
     ErrorMsgList = []
 
@@ -1522,14 +1608,17 @@ def CheckFuncLayoutBody(FullFileName):
         if Result[0] != 0:
             if not EccGlobalData.gException.IsException(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_BODY, Result[3]):
                 PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_BODY,
-                              'The open brace should be at the very beginning of a line for the function [%s].' % Result[3],
+                              'The open brace should be at the very beginning of a line for the function [%s].' % Result[
+                                  3],
                               'Function', Result[2])
         if Result[1] != 0:
             if not EccGlobalData.gException.IsException(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_BODY, Result[3]):
                 PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_FUNCTION_BODY,
-                              'The close brace should be at the very beginning of a line for the function [%s].' % Result[3],
+                              'The close brace should be at the very beginning of a line for the function [%s].' % Result[
+                                  3],
                               'Function', Result[2])
 
+
 def CheckFuncLayoutLocalVariable(FullFileName):
     ErrorMsgList = []
 
@@ -1561,7 +1650,9 @@ def CheckFuncLayoutLocalVariable(FullFileName):
 
         for Result in ResultSet:
             if len(Result[1]) > 0 and 'CONST' not in Result[3] and 'STATIC' not in Result[3]:
-                PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_NO_INIT_OF_VARIABLE, 'Variable Name: %s' % Result[0], FileTable, Result[2])
+                PrintErrorMsg(ERROR_C_FUNCTION_LAYOUT_CHECK_NO_INIT_OF_VARIABLE,
+                              'Variable Name: %s' % Result[0], FileTable, Result[2])
+
 
 def CheckMemberVariableFormat(Name, Value, FileTable, TdId, ModelId):
     ErrMsgList = []
@@ -1573,25 +1664,28 @@ def CheckMemberVariableFormat(Name, Value, FileTable, TdId, ModelId):
     if LBPos == -1 or RBPos == -1:
         return ErrMsgList
 
-    Fields = Value[LBPos + 1 : RBPos]
+    Fields = Value[LBPos + 1: RBPos]
     Fields = StripComments(Fields).strip()
-    NestPos = Fields.find ('struct')
+    NestPos = Fields.find('struct')
     if NestPos != -1 and (NestPos + len('struct') < len(Fields)) and ModelId != DataClass.MODEL_IDENTIFIER_UNION:
         if not Fields[NestPos + len('struct') + 1].isalnum():
             if not EccGlobalData.gException.IsException(ERROR_DECLARATION_DATA_TYPE_CHECK_NESTED_STRUCTURE, Name):
-                PrintErrorMsg(ERROR_DECLARATION_DATA_TYPE_CHECK_NESTED_STRUCTURE, 'Nested struct in [%s].' % (Name), FileTable, TdId)
+                PrintErrorMsg(ERROR_DECLARATION_DATA_TYPE_CHECK_NESTED_STRUCTURE,
+                              'Nested struct in [%s].' % (Name), FileTable, TdId)
             return ErrMsgList
-    NestPos = Fields.find ('union')
+    NestPos = Fields.find('union')
     if NestPos != -1 and (NestPos + len('union') < len(Fields)):
         if not Fields[NestPos + len('union') + 1].isalnum():
             if not EccGlobalData.gException.IsException(ERROR_DECLARATION_DATA_TYPE_CHECK_NESTED_STRUCTURE, Name):
-                PrintErrorMsg(ERROR_DECLARATION_DATA_TYPE_CHECK_NESTED_STRUCTURE, 'Nested union in [%s].' % (Name), FileTable, TdId)
+                PrintErrorMsg(ERROR_DECLARATION_DATA_TYPE_CHECK_NESTED_STRUCTURE,
+                              'Nested union in [%s].' % (Name), FileTable, TdId)
             return ErrMsgList
-    NestPos = Fields.find ('enum')
+    NestPos = Fields.find('enum')
     if NestPos != -1 and (NestPos + len('enum') < len(Fields)):
         if not Fields[NestPos + len('enum') + 1].isalnum():
             if not EccGlobalData.gException.IsException(ERROR_DECLARATION_DATA_TYPE_CHECK_NESTED_STRUCTURE, Name):
-                PrintErrorMsg(ERROR_DECLARATION_DATA_TYPE_CHECK_NESTED_STRUCTURE, 'Nested enum in [%s].' % (Name), FileTable, TdId)
+                PrintErrorMsg(ERROR_DECLARATION_DATA_TYPE_CHECK_NESTED_STRUCTURE,
+                              'Nested enum in [%s].' % (Name), FileTable, TdId)
             return ErrMsgList
 
     if ModelId == DataClass.MODEL_IDENTIFIER_ENUMERATE:
@@ -1655,6 +1749,7 @@ def CheckMemberVariableFormat(Name, Value, FileTable, TdId, ModelId):
 
     return ErrMsgList
 
+
 def CheckDeclTypedefFormat(FullFileName, ModelId):
     ErrorMsgList = []
 
@@ -1705,11 +1800,13 @@ def CheckDeclTypedefFormat(FullFileName, ModelId):
         if ValueModelId != ModelId:
             continue
         # Check member variable format.
-        ErrMsgList = CheckMemberVariableFormat(Name, Value, FileTable, Td[5], ModelId)
+        ErrMsgList = CheckMemberVariableFormat(
+            Name, Value, FileTable, Td[5], ModelId)
         for ErrMsg in ErrMsgList:
             if EccGlobalData.gException.IsException(ERROR_NAMING_CONVENTION_CHECK_VARIABLE_NAME, Name + '.' + ErrMsg):
                 continue
-            PrintErrorMsg(ERROR_NAMING_CONVENTION_CHECK_VARIABLE_NAME, 'Member variable [%s] NOT follow naming convention.' % (Name + '.' + ErrMsg), FileTable, Td[5])
+            PrintErrorMsg(ERROR_NAMING_CONVENTION_CHECK_VARIABLE_NAME, 'Member variable [%s] NOT follow naming convention.' % (
+                Name + '.' + ErrMsg), FileTable, Td[5])
 
     # First check in current file to see whether struct/union/enum is typedef-ed.
     UntypedefedList = []
@@ -1728,11 +1825,13 @@ def CheckDeclTypedefFormat(FullFileName, ModelId):
 
         if ValueModelId != ModelId:
             continue
-        ErrMsgList = CheckMemberVariableFormat(Name, Value, FileTable, Result[3], ModelId)
+        ErrMsgList = CheckMemberVariableFormat(
+            Name, Value, FileTable, Result[3], ModelId)
         for ErrMsg in ErrMsgList:
             if EccGlobalData.gException.IsException(ERROR_NAMING_CONVENTION_CHECK_VARIABLE_NAME, Result[0] + '.' + ErrMsg):
                 continue
-            PrintErrorMsg(ERROR_NAMING_CONVENTION_CHECK_VARIABLE_NAME, 'Member variable [%s] NOT follow naming convention.' % (Result[0] + '.' + ErrMsg), FileTable, Result[3])
+            PrintErrorMsg(ERROR_NAMING_CONVENTION_CHECK_VARIABLE_NAME, 'Member variable [%s] NOT follow naming convention.' % (
+                Result[0] + '.' + ErrMsg), FileTable, Result[3])
         # Check whether it is typedefed.
         Found = False
         for Td in TdList:
@@ -1742,11 +1841,13 @@ def CheckDeclTypedefFormat(FullFileName, ModelId):
             if Result[1] >= Td[3] and Td[4] >= Result[2]:
                 Found = True
                 if not Td[1].isupper():
-                    PrintErrorMsg(ErrorType, 'Typedef should be UPPER case', FileTable, Td[5])
+                    PrintErrorMsg(
+                        ErrorType, 'Typedef should be UPPER case', FileTable, Td[5])
             if Result[0] in Td[2].split():
                 Found = True
                 if not Td[1].isupper():
-                    PrintErrorMsg(ErrorType, 'Typedef should be UPPER case', FileTable, Td[5])
+                    PrintErrorMsg(
+                        ErrorType, 'Typedef should be UPPER case', FileTable, Td[5])
             if Found:
                 break
 
@@ -1783,27 +1884,34 @@ def CheckDeclTypedefFormat(FullFileName, ModelId):
             if Result[1] >= Td[3] and Td[4] >= Result[2]:
                 Found = True
                 if not Td[1].isupper():
-                    PrintErrorMsg(ErrorType, 'Typedef should be UPPER case', FileTable, Td[5])
+                    PrintErrorMsg(
+                        ErrorType, 'Typedef should be UPPER case', FileTable, Td[5])
             if Result[0] in Td[2].split():
                 Found = True
                 if not Td[1].isupper():
-                    PrintErrorMsg(ErrorType, 'Typedef should be UPPER case', FileTable, Td[5])
+                    PrintErrorMsg(
+                        ErrorType, 'Typedef should be UPPER case', FileTable, Td[5])
             if Found:
                 break
 
         if not Found:
-            PrintErrorMsg(ErrorType, 'No Typedef for %s' % Result[0], FileTable, Result[3])
+            PrintErrorMsg(ErrorType, 'No Typedef for %s' %
+                          Result[0], FileTable, Result[3])
             continue
 
+
 def CheckDeclStructTypedef(FullFileName):
     CheckDeclTypedefFormat(FullFileName, DataClass.MODEL_IDENTIFIER_STRUCTURE)
 
+
 def CheckDeclEnumTypedef(FullFileName):
     CheckDeclTypedefFormat(FullFileName, DataClass.MODEL_IDENTIFIER_ENUMERATE)
 
+
 def CheckDeclUnionTypedef(FullFileName):
     CheckDeclTypedefFormat(FullFileName, DataClass.MODEL_IDENTIFIER_UNION)
 
+
 def CheckDeclArgModifier(FullFileName):
     ErrorMsgList = []
 
@@ -1823,7 +1931,8 @@ def CheckDeclArgModifier(FullFileName):
     for Result in ResultSet:
         for Modifier in ModifierTuple:
             if PatternInModifier(Result[0], Modifier) and len(Result[0]) < MAX_MODIFIER_LENGTH:
-                PrintErrorMsg(ERROR_DECLARATION_DATA_TYPE_CHECK_IN_OUT_MODIFIER, 'Variable Modifier %s' % Result[0], FileTable, Result[2])
+                PrintErrorMsg(ERROR_DECLARATION_DATA_TYPE_CHECK_IN_OUT_MODIFIER,
+                              'Variable Modifier %s' % Result[0], FileTable, Result[2])
                 break
 
     SqlStatement = """ select Modifier, Name, ID
@@ -1834,7 +1943,8 @@ def CheckDeclArgModifier(FullFileName):
     for Result in ResultSet:
         for Modifier in ModifierTuple:
             if PatternInModifier(Result[0], Modifier):
-                PrintErrorMsg(ERROR_DECLARATION_DATA_TYPE_CHECK_IN_OUT_MODIFIER, 'Return Type Modifier %s' % Result[0], FileTable, Result[2])
+                PrintErrorMsg(ERROR_DECLARATION_DATA_TYPE_CHECK_IN_OUT_MODIFIER,
+                              'Return Type Modifier %s' % Result[0], FileTable, Result[2])
                 break
 
     SqlStatement = """ select Modifier, Header, ID
@@ -1845,9 +1955,11 @@ def CheckDeclArgModifier(FullFileName):
     for Result in ResultSet:
         for Modifier in ModifierTuple:
             if PatternInModifier(Result[0], Modifier):
-                PrintErrorMsg(ERROR_DECLARATION_DATA_TYPE_CHECK_IN_OUT_MODIFIER, 'Return Type Modifier %s' % Result[0], FileTable, Result[2])
+                PrintErrorMsg(ERROR_DECLARATION_DATA_TYPE_CHECK_IN_OUT_MODIFIER,
+                              'Return Type Modifier %s' % Result[0], FileTable, Result[2])
                 break
 
+
 def CheckDeclNoUseCType(FullFileName):
     ErrorMsgList = []
 
@@ -1870,7 +1982,8 @@ def CheckDeclNoUseCType(FullFileName):
                                                         Result[0] + ' ' + Result[1]):
                     continue
                 PrintErrorMsg(ERROR_DECLARATION_DATA_TYPE_CHECK_NO_USE_C_TYPE,
-                              'Invalid variable type (%s) in definition [%s]' % (Type, Result[0] + ' ' + Result[1]),
+                              'Invalid variable type (%s) in definition [%s]' % (
+                                  Type, Result[0] + ' ' + Result[1]),
                               FileTable,
                               Result[2])
                 break
@@ -1887,11 +2000,13 @@ def CheckDeclNoUseCType(FullFileName):
             continue
         for Type in CTypeTuple:
             if PatternInModifier(Result[0], Type):
-                PrintErrorMsg(ERROR_DECLARATION_DATA_TYPE_CHECK_NO_USE_C_TYPE, '%s Return type %s' % (FuncName, Result[0]), FileTable, Result[2])
+                PrintErrorMsg(ERROR_DECLARATION_DATA_TYPE_CHECK_NO_USE_C_TYPE, '%s Return type %s' % (
+                    FuncName, Result[0]), FileTable, Result[2])
 
             for Param in ParamList:
                 if PatternInModifier(Param.Modifier, Type):
-                    PrintErrorMsg(ERROR_DECLARATION_DATA_TYPE_CHECK_NO_USE_C_TYPE, 'Parameter %s' % Param.Name, FileTable, Result[2])
+                    PrintErrorMsg(ERROR_DECLARATION_DATA_TYPE_CHECK_NO_USE_C_TYPE,
+                                  'Parameter %s' % Param.Name, FileTable, Result[2])
 
     SqlStatement = """ select Modifier, Header, ID, Name
                        from Function
@@ -1905,11 +2020,13 @@ def CheckDeclNoUseCType(FullFileName):
             continue
         for Type in CTypeTuple:
             if PatternInModifier(Result[0], Type):
-                PrintErrorMsg(ERROR_DECLARATION_DATA_TYPE_CHECK_NO_USE_C_TYPE, '[%s] Return type %s' % (FuncName, Result[0]), FileTable, Result[2])
+                PrintErrorMsg(ERROR_DECLARATION_DATA_TYPE_CHECK_NO_USE_C_TYPE, '[%s] Return type %s' % (
+                    FuncName, Result[0]), FileTable, Result[2])
 
             for Param in ParamList:
                 if PatternInModifier(Param.Modifier, Type):
-                    PrintErrorMsg(ERROR_DECLARATION_DATA_TYPE_CHECK_NO_USE_C_TYPE, 'Parameter %s' % Param.Name, FileTable, Result[2])
+                    PrintErrorMsg(ERROR_DECLARATION_DATA_TYPE_CHECK_NO_USE_C_TYPE,
+                                  'Parameter %s' % Param.Name, FileTable, Result[2])
 
 
 def CheckPointerNullComparison(FullFileName):
@@ -1976,20 +2093,24 @@ def CheckPointerNullComparison(FullFileName):
                     Type = FuncReturnTypeDict.get(PredVarStr)
                     if Type is not None:
                         if Type.find('*') != -1 and Type != 'BOOLEAN*':
-                            PrintErrorMsg(ERROR_PREDICATE_EXPRESSION_CHECK_COMPARISON_NULL_TYPE, 'Predicate Expression: %s' % Exp, FileTable, Str[2])
+                            PrintErrorMsg(ERROR_PREDICATE_EXPRESSION_CHECK_COMPARISON_NULL_TYPE,
+                                          'Predicate Expression: %s' % Exp, FileTable, Str[2])
                         continue
 
                     if PredVarStr in FuncReturnTypeDict:
                         continue
 
-                Type = GetVarInfo(PredVarList, FuncRecord, FullFileName, IsFuncCall, None, StarList)
+                Type = GetVarInfo(PredVarList, FuncRecord,
+                                  FullFileName, IsFuncCall, None, StarList)
                 if SearchInCache:
                     FuncReturnTypeDict[PredVarStr] = Type
                 if Type is None:
                     continue
                 Type = GetTypeFromArray(Type, PredVarStr)
                 if Type.find('*') != -1 and Type != 'BOOLEAN*':
-                    PrintErrorMsg(ERROR_PREDICATE_EXPRESSION_CHECK_COMPARISON_NULL_TYPE, 'Predicate Expression: %s' % Exp, FileTable, Str[2])
+                    PrintErrorMsg(ERROR_PREDICATE_EXPRESSION_CHECK_COMPARISON_NULL_TYPE,
+                                  'Predicate Expression: %s' % Exp, FileTable, Str[2])
+
 
 def CheckNonBooleanValueComparison(FullFileName):
     ErrorMsgList = []
@@ -2056,18 +2177,21 @@ def CheckNonBooleanValueComparison(FullFileName):
                     Type = FuncReturnTypeDict.get(PredVarStr)
                     if Type is not None:
                         if Type.find('BOOLEAN') == -1:
-                            PrintErrorMsg(ERROR_PREDICATE_EXPRESSION_CHECK_NO_BOOLEAN_OPERATOR, 'Predicate Expression: %s' % Exp, FileTable, Str[2])
+                            PrintErrorMsg(ERROR_PREDICATE_EXPRESSION_CHECK_NO_BOOLEAN_OPERATOR,
+                                          'Predicate Expression: %s' % Exp, FileTable, Str[2])
                         continue
 
                     if PredVarStr in FuncReturnTypeDict:
                         continue
-                Type = GetVarInfo(PredVarList, FuncRecord, FullFileName, IsFuncCall, 'BOOLEAN', StarList)
+                Type = GetVarInfo(PredVarList, FuncRecord,
+                                  FullFileName, IsFuncCall, 'BOOLEAN', StarList)
                 if SearchInCache:
                     FuncReturnTypeDict[PredVarStr] = Type
                 if Type is None:
                     continue
                 if Type.find('BOOLEAN') == -1:
-                    PrintErrorMsg(ERROR_PREDICATE_EXPRESSION_CHECK_NO_BOOLEAN_OPERATOR, 'Predicate Expression: %s' % Exp, FileTable, Str[2])
+                    PrintErrorMsg(ERROR_PREDICATE_EXPRESSION_CHECK_NO_BOOLEAN_OPERATOR,
+                                  'Predicate Expression: %s' % Exp, FileTable, Str[2])
 
 
 def CheckBooleanValueComparison(FullFileName):
@@ -2135,19 +2259,22 @@ def CheckBooleanValueComparison(FullFileName):
                     Type = FuncReturnTypeDict.get(PredVarStr)
                     if Type is not None:
                         if Type.find('BOOLEAN') != -1:
-                            PrintErrorMsg(ERROR_PREDICATE_EXPRESSION_CHECK_BOOLEAN_VALUE, 'Predicate Expression: %s' % Exp, FileTable, Str[2])
+                            PrintErrorMsg(ERROR_PREDICATE_EXPRESSION_CHECK_BOOLEAN_VALUE,
+                                          'Predicate Expression: %s' % Exp, FileTable, Str[2])
                         continue
 
                     if PredVarStr in FuncReturnTypeDict:
                         continue
 
-                Type = GetVarInfo(PredVarList, FuncRecord, FullFileName, IsFuncCall, 'BOOLEAN', StarList)
+                Type = GetVarInfo(PredVarList, FuncRecord,
+                                  FullFileName, IsFuncCall, 'BOOLEAN', StarList)
                 if SearchInCache:
                     FuncReturnTypeDict[PredVarStr] = Type
                 if Type is None:
                     continue
                 if Type.find('BOOLEAN') != -1:
-                    PrintErrorMsg(ERROR_PREDICATE_EXPRESSION_CHECK_BOOLEAN_VALUE, 'Predicate Expression: %s' % Exp, FileTable, Str[2])
+                    PrintErrorMsg(ERROR_PREDICATE_EXPRESSION_CHECK_BOOLEAN_VALUE,
+                                  'Predicate Expression: %s' % Exp, FileTable, Str[2])
 
 
 def CheckHeaderFileData(FullFileName, AllTypedefFun=[]):
@@ -2170,7 +2297,8 @@ def CheckHeaderFileData(FullFileName, AllTypedefFun=[]):
                 if '(%s)' % Result[1] in Item:
                     break
             else:
-                PrintErrorMsg(ERROR_INCLUDE_FILE_CHECK_DATA, 'Variable definition appears in header file', FileTable, Result[0])
+                PrintErrorMsg(ERROR_INCLUDE_FILE_CHECK_DATA,
+                              'Variable definition appears in header file', FileTable, Result[0])
 
     SqlStatement = """ select ID
                        from Function
@@ -2178,10 +2306,12 @@ def CheckHeaderFileData(FullFileName, AllTypedefFun=[]):
                    """ % FileID
     ResultSet = Db.TblFile.Exec(SqlStatement)
     for Result in ResultSet:
-        PrintErrorMsg(ERROR_INCLUDE_FILE_CHECK_DATA, 'Function definition appears in header file', 'Function', Result[0])
+        PrintErrorMsg(ERROR_INCLUDE_FILE_CHECK_DATA,
+                      'Function definition appears in header file', 'Function', Result[0])
 
     return ErrorMsgList
 
+
 def CheckHeaderFileIfndef(FullFileName):
     ErrorMsgList = []
 
@@ -2197,7 +2327,8 @@ def CheckHeaderFileIfndef(FullFileName):
                    """ % (FileTable, DataClass.MODEL_IDENTIFIER_MACRO_IFNDEF)
     ResultSet = Db.TblFile.Exec(SqlStatement)
     if len(ResultSet) == 0:
-        PrintErrorMsg(ERROR_INCLUDE_FILE_CHECK_IFNDEF_STATEMENT_1, '', 'File', FileID)
+        PrintErrorMsg(ERROR_INCLUDE_FILE_CHECK_IFNDEF_STATEMENT_1,
+                      '', 'File', FileID)
         return ErrorMsgList
     for Result in ResultSet:
         SqlStatement = """ select Value, EndLine
@@ -2207,7 +2338,8 @@ def CheckHeaderFileIfndef(FullFileName):
         ResultSet = Db.TblFile.Exec(SqlStatement)
         for Result in ResultSet:
             if not Result[0].startswith('/*') and not Result[0].startswith('//'):
-                PrintErrorMsg(ERROR_INCLUDE_FILE_CHECK_IFNDEF_STATEMENT_2, '', 'File', FileID)
+                PrintErrorMsg(
+                    ERROR_INCLUDE_FILE_CHECK_IFNDEF_STATEMENT_2, '', 'File', FileID)
         break
 
     SqlStatement = """ select Value
@@ -2217,9 +2349,11 @@ def CheckHeaderFileIfndef(FullFileName):
     ResultSet = Db.TblFile.Exec(SqlStatement)
     for Result in ResultSet:
         if not Result[0].startswith('/*') and not Result[0].startswith('//'):
-            PrintErrorMsg(ERROR_INCLUDE_FILE_CHECK_IFNDEF_STATEMENT_3, '', 'File', FileID)
+            PrintErrorMsg(
+                ERROR_INCLUDE_FILE_CHECK_IFNDEF_STATEMENT_3, '', 'File', FileID)
     return ErrorMsgList
 
+
 def CheckDoxygenCommand(FullFileName):
     ErrorMsgList = []
 
@@ -2241,9 +2375,11 @@ def CheckDoxygenCommand(FullFileName):
         CommentPartList = CommentStr.split()
         for Part in CommentPartList:
             if Part.upper() == 'BUGBUG':
-                PrintErrorMsg(ERROR_DOXYGEN_CHECK_COMMAND, 'Bug should be marked with doxygen tag @bug', FileTable, Result[1])
+                PrintErrorMsg(ERROR_DOXYGEN_CHECK_COMMAND,
+                              'Bug should be marked with doxygen tag @bug', FileTable, Result[1])
             if Part.upper() == 'TODO':
-                PrintErrorMsg(ERROR_DOXYGEN_CHECK_COMMAND, 'ToDo should be marked with doxygen tag @todo', FileTable, Result[1])
+                PrintErrorMsg(ERROR_DOXYGEN_CHECK_COMMAND,
+                              'ToDo should be marked with doxygen tag @todo', FileTable, Result[1])
             if Part.startswith('@'):
                 if EccGlobalData.gException.IsException(ERROR_DOXYGEN_CHECK_COMMAND, Part):
                     continue
@@ -2253,14 +2389,17 @@ def CheckDoxygenCommand(FullFileName):
                     continue
                 if Part.lstrip('@').isalpha():
                     if Part.lstrip('@') not in DoxygenCommandList:
-                        PrintErrorMsg(ERROR_DOXYGEN_CHECK_COMMAND, 'Unknown doxygen command %s' % Part, FileTable, Result[1])
+                        PrintErrorMsg(
+                            ERROR_DOXYGEN_CHECK_COMMAND, 'Unknown doxygen command %s' % Part, FileTable, Result[1])
                 else:
                     Index = Part.find('[')
                     if Index == -1:
-                        PrintErrorMsg(ERROR_DOXYGEN_CHECK_COMMAND, 'Unknown doxygen command %s' % Part, FileTable, Result[1])
+                        PrintErrorMsg(
+                            ERROR_DOXYGEN_CHECK_COMMAND, 'Unknown doxygen command %s' % Part, FileTable, Result[1])
                     RealCmd = Part[1:Index]
                     if RealCmd not in DoxygenCommandList:
-                        PrintErrorMsg(ERROR_DOXYGEN_CHECK_COMMAND, 'Unknown doxygen command %s' % Part, FileTable, Result[1])
+                        PrintErrorMsg(
+                            ERROR_DOXYGEN_CHECK_COMMAND, 'Unknown doxygen command %s' % Part, FileTable, Result[1])
 
 
 def CheckDoxygenTripleForwardSlash(FullFileName):
@@ -2284,7 +2423,6 @@ def CheckDoxygenTripleForwardSlash(FullFileName):
     for Result in ResultSet:
         FuncDefSet.append(Result)
 
-
     FileTable = 'Identifier' + str(FileID)
     SqlStatement = """ select Value, ID, StartLine, StartColumn, EndLine, EndColumn
                        from %s
@@ -2299,7 +2437,6 @@ def CheckDoxygenTripleForwardSlash(FullFileName):
     except:
         print('Unrecognized chars in comment of file %s', FullFileName)
 
-
     for Result in CommentSet:
         CommentStr = Result[0]
         StartLine = Result[2]
@@ -2324,7 +2461,8 @@ def CheckDoxygenTripleForwardSlash(FullFileName):
                 Found = True
                 break
         if Found:
-            PrintErrorMsg(ERROR_DOXYGEN_CHECK_COMMENT_FORMAT, '', FileTable, Result[1])
+            PrintErrorMsg(ERROR_DOXYGEN_CHECK_COMMENT_FORMAT,
+                          '', FileTable, Result[1])
 
 
 def CheckFileHeaderDoxygenComments(FullFileName):
@@ -2342,7 +2480,8 @@ def CheckFileHeaderDoxygenComments(FullFileName):
                    """ % (FileTable, DataClass.MODEL_IDENTIFIER_COMMENT)
     ResultSet = Db.TblFile.Exec(SqlStatement)
     if len(ResultSet) == 0:
-        PrintErrorMsg(ERROR_HEADER_CHECK_FILE, 'No File License header appear at the very beginning of file.', 'File', FileID)
+        PrintErrorMsg(ERROR_HEADER_CHECK_FILE,
+                      'No File License header appear at the very beginning of file.', 'File', FileID)
         return ErrorMsgList
 
     NoHeaderCommentStartFlag = True
@@ -2364,7 +2503,7 @@ def CheckFileHeaderDoxygenComments(FullFileName):
         for CommentLine in CommentStrListTemp:
             if CommentLine.strip().startswith('/** @file'):
                 FileStartFlag = True
-            if FileStartFlag ==  True:
+            if FileStartFlag == True:
                 CommentStrList.append(CommentLine)
 
         ID = Result[1]
@@ -2388,7 +2527,8 @@ def CheckFileHeaderDoxygenComments(FullFileName):
             # Check whether C File header Comment content start with two spaces.
             if EccGlobalData.gConfig.HeaderCheckCFileCommentStartSpacesNum == '1' or EccGlobalData.gConfig.HeaderCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
                 if CommentLine.startswith('/** @file') == False and CommentLine.startswith('**/') == False and CommentLine.strip() and CommentLine.startswith('  ') == False:
-                    PrintErrorMsg(ERROR_HEADER_CHECK_FILE, 'File header comment content should start with two spaces at each line', FileTable, ID)
+                    PrintErrorMsg(
+                        ERROR_HEADER_CHECK_FILE, 'File header comment content should start with two spaces at each line', FileTable, ID)
 
             CommentLine = CommentLine.strip()
             if CommentLine.startswith('Copyright') or ('Copyright' in CommentLine and CommentLine.lower().startswith('(c)')):
@@ -2396,10 +2536,12 @@ def CheckFileHeaderDoxygenComments(FullFileName):
                 if CommentLine.find('All rights reserved') == -1:
                     for Copyright in EccGlobalData.gConfig.Copyright:
                         if CommentLine.find(Copyright) > -1:
-                            PrintErrorMsg(ERROR_HEADER_CHECK_FILE, '""All rights reserved"" announcement should be following the ""Copyright"" at the same line', FileTable, ID)
+                            PrintErrorMsg(
+                                ERROR_HEADER_CHECK_FILE, '""All rights reserved"" announcement should be following the ""Copyright"" at the same line', FileTable, ID)
                             break
                 if CommentLine.endswith('<BR>') == -1:
-                    PrintErrorMsg(ERROR_HEADER_CHECK_FILE, 'The ""<BR>"" at the end of the Copyright line is required', FileTable, ID)
+                    PrintErrorMsg(
+                        ERROR_HEADER_CHECK_FILE, 'The ""<BR>"" at the end of the Copyright line is required', FileTable, ID)
                 if NextLineIndex < len(CommentStrList) and CommentStrList[NextLineIndex].strip().startswith('Copyright') == False and CommentStrList[NextLineIndex].strip():
                     NoLicenseFlag = False
             if CommentLine.startswith('@par Revision Reference:'):
@@ -2415,20 +2557,26 @@ def CheckFileHeaderDoxygenComments(FullFileName):
                     if EccGlobalData.gConfig.HeaderCheckCFileCommentReferenceFormat == '1' or EccGlobalData.gConfig.HeaderCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
                         if RefListFlag == True:
                             if RefLine.strip() and RefLine.strip().startswith('**/') == False and RefLine.startswith('  -') == False:
-                                PrintErrorMsg(ERROR_HEADER_CHECK_FILE, 'Each reference on a separate line should begin with a bullet character ""-"" ', FileTable, ID)
+                                PrintErrorMsg(
+                                    ERROR_HEADER_CHECK_FILE, 'Each reference on a separate line should begin with a bullet character ""-"" ', FileTable, ID)
 
     if NoHeaderCommentStartFlag:
-        PrintErrorMsg(ERROR_DOXYGEN_CHECK_FILE_HEADER, 'File header comment should begin with ""/** @file""', FileTable, ID)
+        PrintErrorMsg(ERROR_DOXYGEN_CHECK_FILE_HEADER,
+                      'File header comment should begin with ""/** @file""', FileTable, ID)
         return
     if NoHeaderCommentEndFlag:
-        PrintErrorMsg(ERROR_HEADER_CHECK_FILE, 'File header comment should end with ""**/""', FileTable, ID)
+        PrintErrorMsg(ERROR_HEADER_CHECK_FILE,
+                      'File header comment should end with ""**/""', FileTable, ID)
         return
     if NoCopyrightFlag:
-        PrintErrorMsg(ERROR_HEADER_CHECK_FILE, 'File header comment missing the ""Copyright""', FileTable, ID)
-    #Check whether C File header Comment have the License immediately after the ""Copyright"" line.
+        PrintErrorMsg(ERROR_HEADER_CHECK_FILE,
+                      'File header comment missing the ""Copyright""', FileTable, ID)
+    # Check whether C File header Comment have the License immediately after the ""Copyright"" line.
     if EccGlobalData.gConfig.HeaderCheckCFileCommentLicenseFormat == '1' or EccGlobalData.gConfig.HeaderCheckAll == '1' or EccGlobalData.gConfig.CheckAll == '1':
         if NoLicenseFlag:
-            PrintErrorMsg(ERROR_HEADER_CHECK_FILE, 'File header comment should have the License immediately after the ""Copyright"" line', FileTable, ID)
+            PrintErrorMsg(ERROR_HEADER_CHECK_FILE,
+                          'File header comment should have the License immediately after the ""Copyright"" line', FileTable, ID)
+
 
 def CheckFuncHeaderDoxygenComments(FullFileName):
     ErrorMsgList = []
@@ -2460,14 +2608,18 @@ def CheckFuncHeaderDoxygenComments(FullFileName):
     ResultSet = Db.TblFile.Exec(SqlStatement)
     for Result in ResultSet:
         FuncName = Result[4]
-        FunctionHeaderComment = CheckCommentImmediatelyPrecedeFunctionHeader(Result[1], Result[2], CommentSet)
+        FunctionHeaderComment = CheckCommentImmediatelyPrecedeFunctionHeader(
+            Result[1], Result[2], CommentSet)
         if FunctionHeaderComment:
-            CheckFunctionHeaderConsistentWithDoxygenComment(Result[0], Result[1], Result[2], FunctionHeaderComment[0], FunctionHeaderComment[1], ErrorMsgList, FunctionHeaderComment[3], FileTable)
+            CheckFunctionHeaderConsistentWithDoxygenComment(
+                Result[0], Result[1], Result[2], FunctionHeaderComment[0], FunctionHeaderComment[1], ErrorMsgList, FunctionHeaderComment[3], FileTable)
         else:
             if EccGlobalData.gException.IsException(ERROR_HEADER_CHECK_FUNCTION, FuncName):
                 continue
-            ErrorMsgList.append('Line %d :Function %s has NO comment immediately preceding it.' % (Result[2], Result[1]))
-            PrintErrorMsg(ERROR_HEADER_CHECK_FUNCTION, 'Function [%s] has NO comment immediately preceding it.' % (FuncName), FileTable, Result[3])
+            ErrorMsgList.append(
+                'Line %d :Function %s has NO comment immediately preceding it.' % (Result[2], Result[1]))
+            PrintErrorMsg(ERROR_HEADER_CHECK_FUNCTION, 'Function [%s] has NO comment immediately preceding it.' % (
+                FuncName), FileTable, Result[3])
 
     # Func Def check
     SqlStatement = """ select Value, StartLine, EndLine, ID
@@ -2490,16 +2642,21 @@ def CheckFuncHeaderDoxygenComments(FullFileName):
     ResultSet = Db.TblFile.Exec(SqlStatement)
     for Result in ResultSet:
         FuncName = Result[4]
-        FunctionHeaderComment = CheckCommentImmediatelyPrecedeFunctionHeader(Result[1], Result[2], CommentSet)
+        FunctionHeaderComment = CheckCommentImmediatelyPrecedeFunctionHeader(
+            Result[1], Result[2], CommentSet)
         if FunctionHeaderComment:
-            CheckFunctionHeaderConsistentWithDoxygenComment(Result[0], Result[1], Result[2], FunctionHeaderComment[0], FunctionHeaderComment[1], ErrorMsgList, FunctionHeaderComment[3], FileTable)
+            CheckFunctionHeaderConsistentWithDoxygenComment(
+                Result[0], Result[1], Result[2], FunctionHeaderComment[0], FunctionHeaderComment[1], ErrorMsgList, FunctionHeaderComment[3], FileTable)
         else:
             if EccGlobalData.gException.IsException(ERROR_HEADER_CHECK_FUNCTION, FuncName):
                 continue
-            ErrorMsgList.append('Line %d :Function [%s] has NO comment immediately preceding it.' % (Result[2], Result[1]))
-            PrintErrorMsg(ERROR_HEADER_CHECK_FUNCTION, 'Function [%s] has NO comment immediately preceding it.' % (FuncName), 'Function', Result[3])
+            ErrorMsgList.append(
+                'Line %d :Function [%s] has NO comment immediately preceding it.' % (Result[2], Result[1]))
+            PrintErrorMsg(ERROR_HEADER_CHECK_FUNCTION, 'Function [%s] has NO comment immediately preceding it.' % (
+                FuncName), 'Function', Result[3])
     return ErrorMsgList
 
+
 def CheckCommentImmediatelyPrecedeFunctionHeader(FuncName, FuncStartLine, CommentSet):
 
     for Comment in CommentSet:
@@ -2507,6 +2664,7 @@ def CheckCommentImmediatelyPrecedeFunctionHeader(FuncName, FuncStartLine, Commen
             return Comment
     return None
 
+
 def GetDoxygenStrFromComment(Str):
     DoxygenStrList = []
     ParamTagList = Str.split('@param')
@@ -2543,28 +2701,38 @@ def GetDoxygenStrFromComment(Str):
 
     return DoxygenStrList
 
-def CheckGeneralDoxygenCommentLayout(Str, StartLine, ErrorMsgList, CommentId= -1, TableName=''):
-    #/** --*/ @retval after @param
+
+def CheckGeneralDoxygenCommentLayout(Str, StartLine, ErrorMsgList, CommentId=-1, TableName=''):
+    # /** --*/ @retval after @param
     if not Str.startswith('/**'):
-        ErrorMsgList.append('Line %d : Comment does NOT have prefix /** ' % StartLine)
-        PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER, 'Comment does NOT have prefix /** ', TableName, CommentId)
+        ErrorMsgList.append(
+            'Line %d : Comment does NOT have prefix /** ' % StartLine)
+        PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER,
+                      'Comment does NOT have prefix /** ', TableName, CommentId)
     if not Str.endswith('**/'):
-        ErrorMsgList.append('Line %d : Comment does NOT have tail **/ ' % StartLine)
-        PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER, 'Comment does NOT have tail **/ ', TableName, CommentId)
+        ErrorMsgList.append(
+            'Line %d : Comment does NOT have tail **/ ' % StartLine)
+        PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER,
+                      'Comment does NOT have tail **/ ', TableName, CommentId)
     FirstRetvalIndex = Str.find('@retval')
     LastParamIndex = Str.rfind('@param')
     if (FirstRetvalIndex > 0) and (LastParamIndex > 0) and (FirstRetvalIndex < LastParamIndex):
-        ErrorMsgList.append('Line %d : @retval appear before @param ' % StartLine)
-        PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER, 'in Comment, @retval appear before @param  ', TableName, CommentId)
+        ErrorMsgList.append(
+            'Line %d : @retval appear before @param ' % StartLine)
+        PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER,
+                      'in Comment, @retval appear before @param  ', TableName, CommentId)
 
-def CheckFunctionHeaderConsistentWithDoxygenComment(FuncModifier, FuncHeader, FuncStartLine, CommentStr, CommentStartLine, ErrorMsgList, CommentId= -1, TableName=''):
+
+def CheckFunctionHeaderConsistentWithDoxygenComment(FuncModifier, FuncHeader, FuncStartLine, CommentStr, CommentStartLine, ErrorMsgList, CommentId=-1, TableName=''):
 
     ParamList = GetParamList(FuncHeader)
-    CheckGeneralDoxygenCommentLayout(CommentStr, CommentStartLine, ErrorMsgList, CommentId, TableName)
+    CheckGeneralDoxygenCommentLayout(
+        CommentStr, CommentStartLine, ErrorMsgList, CommentId, TableName)
     DescriptionStr = CommentStr
     DoxygenStrList = GetDoxygenStrFromComment(DescriptionStr)
     if DescriptionStr.find('.') == -1:
-        PrintErrorMsg(ERROR_DOXYGEN_CHECK_COMMENT_DESCRIPTION, 'Comment description should end with period \'.\'', TableName, CommentId)
+        PrintErrorMsg(ERROR_DOXYGEN_CHECK_COMMENT_DESCRIPTION,
+                      'Comment description should end with period \'.\'', TableName, CommentId)
     DoxygenTagNumber = len(DoxygenStrList)
     ParamNumber = len(ParamList)
     for Param in ParamList:
@@ -2577,12 +2745,16 @@ def CheckFunctionHeaderConsistentWithDoxygenComment(FuncModifier, FuncHeader, Fu
             ParamName = ParamList[Index].Name.strip()
             Tag = DoxygenStrList[Index].strip(' ')
             if (not Tag[-1] == ('\n')) and (not Tag[-1] == ('\r')):
-                ErrorMsgList.append('Line %d : in Comment, <%s> does NOT end with new line ' % (CommentStartLine, Tag.replace('\n', '').replace('\r', '')))
-                PrintErrorMsg(ERROR_HEADER_CHECK_FUNCTION, 'in Comment, <%s> does NOT end with new line ' % (Tag.replace('\n', '').replace('\r', '')), TableName, CommentId)
+                ErrorMsgList.append('Line %d : in Comment, <%s> does NOT end with new line ' % (
+                    CommentStartLine, Tag.replace('\n', '').replace('\r', '')))
+                PrintErrorMsg(ERROR_HEADER_CHECK_FUNCTION, 'in Comment, <%s> does NOT end with new line ' % (
+                    Tag.replace('\n', '').replace('\r', '')), TableName, CommentId)
             TagPartList = Tag.split()
             if len(TagPartList) < 2:
-                ErrorMsgList.append('Line %d : in Comment, <%s> does NOT contain doxygen contents ' % (CommentStartLine, Tag.replace('\n', '').replace('\r', '')))
-                PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER, 'in Comment, <%s> does NOT contain doxygen contents ' % (Tag.replace('\n', '').replace('\r', '')), TableName, CommentId)
+                ErrorMsgList.append('Line %d : in Comment, <%s> does NOT contain doxygen contents ' % (
+                    CommentStartLine, Tag.replace('\n', '').replace('\r', '')))
+                PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER, 'in Comment, <%s> does NOT contain doxygen contents ' % (
+                    Tag.replace('\n', '').replace('\r', '')), TableName, CommentId)
                 Index += 1
                 continue
             LBPos = Tag.find('[')
@@ -2603,46 +2775,62 @@ def CheckFunctionHeaderConsistentWithDoxygenComment(FuncModifier, FuncHeader, Fu
                 if InOutStr != '':
                     if Tag.find('[' + InOutStr + ']') == -1:
                         if InOutStr != 'in, out':
-                            ErrorMsgList.append('Line %d : in Comment, <%s> does NOT have %s ' % (CommentStartLine, (TagPartList[0] + ' ' + TagPartList[1]).replace('\n', '').replace('\r', ''), '[' + InOutStr + ']'))
-                            PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER, 'in Comment, <%s> does NOT have %s ' % ((TagPartList[0] + ' ' + TagPartList[1]).replace('\n', '').replace('\r', ''), '[' + InOutStr + ']'), TableName, CommentId)
+                            ErrorMsgList.append('Line %d : in Comment, <%s> does NOT have %s ' % (
+                                CommentStartLine, (TagPartList[0] + ' ' + TagPartList[1]).replace('\n', '').replace('\r', ''), '[' + InOutStr + ']'))
+                            PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER, 'in Comment, <%s> does NOT have %s ' % (
+                                (TagPartList[0] + ' ' + TagPartList[1]).replace('\n', '').replace('\r', ''), '[' + InOutStr + ']'), TableName, CommentId)
                         else:
                             if Tag.find('[in,out]') == -1:
-                                ErrorMsgList.append('Line %d : in Comment, <%s> does NOT have %s ' % (CommentStartLine, (TagPartList[0] + ' ' + TagPartList[1]).replace('\n', '').replace('\r', ''), '[' + InOutStr + ']'))
-                                PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER, 'in Comment, <%s> does NOT have %s ' % ((TagPartList[0] + ' ' + TagPartList[1]).replace('\n', '').replace('\r', ''), '[' + InOutStr + ']'), TableName, CommentId)
-
+                                ErrorMsgList.append('Line %d : in Comment, <%s> does NOT have %s ' % (
+                                    CommentStartLine, (TagPartList[0] + ' ' + TagPartList[1]).replace('\n', '').replace('\r', ''), '[' + InOutStr + ']'))
+                                PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER, 'in Comment, <%s> does NOT have %s ' % (
+                                    (TagPartList[0] + ' ' + TagPartList[1]).replace('\n', '').replace('\r', ''), '[' + InOutStr + ']'), TableName, CommentId)
 
             if Tag.find(ParamName) == -1 and ParamName != 'VOID' and ParamName != 'void':
-                ErrorMsgList.append('Line %d : in Comment, <%s> is NOT consistent with parameter name %s ' % (CommentStartLine, (TagPartList[0] + ' ' + TagPartList[1]).replace('\n', '').replace('\r', ''), ParamName))
-                PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER, 'in Comment, <%s> is NOT consistent with parameter name %s ' % ((TagPartList[0] + ' ' + TagPartList[1]).replace('\n', '').replace('\r', ''), ParamName), TableName, CommentId)
+                ErrorMsgList.append('Line %d : in Comment, <%s> is NOT consistent with parameter name %s ' % (
+                    CommentStartLine, (TagPartList[0] + ' ' + TagPartList[1]).replace('\n', '').replace('\r', ''), ParamName))
+                PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER, 'in Comment, <%s> is NOT consistent with parameter name %s ' % (
+                    (TagPartList[0] + ' ' + TagPartList[1]).replace('\n', '').replace('\r', ''), ParamName), TableName, CommentId)
             Index += 1
 
         if Index < ParamNumber:
-            ErrorMsgList.append('Line %d : Number of doxygen tags in comment less than number of function parameters' % CommentStartLine)
-            PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER, 'Number of doxygen tags in comment less than number of function parameters ', TableName, CommentId)
+            ErrorMsgList.append(
+                'Line %d : Number of doxygen tags in comment less than number of function parameters' % CommentStartLine)
+            PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER,
+                          'Number of doxygen tags in comment less than number of function parameters ', TableName, CommentId)
         # VOID return type, NOT VOID*. VOID* should be matched with a doxygen tag.
         if (FuncModifier.find('VOID') != -1 or FuncModifier.find('void') != -1) and FuncModifier.find('*') == -1:
 
             # assume we allow a return description tag for void func. return. that's why 'DoxygenTagNumber - 1' is used instead of 'DoxygenTagNumber'
             if Index < DoxygenTagNumber - 1 or (Index < DoxygenTagNumber and DoxygenStrList[Index].startswith('@retval')):
-                ErrorMsgList.append('Line %d : VOID return type need NO doxygen tags in comment' % CommentStartLine)
-                PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER, 'VOID return type need no doxygen tags in comment ', TableName, CommentId)
+                ErrorMsgList.append(
+                    'Line %d : VOID return type need NO doxygen tags in comment' % CommentStartLine)
+                PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER,
+                              'VOID return type need no doxygen tags in comment ', TableName, CommentId)
         else:
             if Index < DoxygenTagNumber and not DoxygenStrList[Index].startswith('@retval') and not DoxygenStrList[Index].startswith('@return'):
-                ErrorMsgList.append('Line %d : Number of @param doxygen tags in comment does NOT match number of function parameters' % CommentStartLine)
-                PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER, 'Number of @param doxygen tags in comment does NOT match number of function parameters ', TableName, CommentId)
+                ErrorMsgList.append(
+                    'Line %d : Number of @param doxygen tags in comment does NOT match number of function parameters' % CommentStartLine)
+                PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER,
+                              'Number of @param doxygen tags in comment does NOT match number of function parameters ', TableName, CommentId)
     else:
         if ParamNumber == 0 and DoxygenTagNumber != 0 and ((FuncModifier.find('VOID') != -1 or FuncModifier.find('void') != -1) and FuncModifier.find('*') == -1):
-            ErrorMsgList.append('Line %d : VOID return type need NO doxygen tags in comment' % CommentStartLine)
-            PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER, 'VOID return type need NO doxygen tags in comment ', TableName, CommentId)
+            ErrorMsgList.append(
+                'Line %d : VOID return type need NO doxygen tags in comment' % CommentStartLine)
+            PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER,
+                          'VOID return type need NO doxygen tags in comment ', TableName, CommentId)
         if ParamNumber != 0 and DoxygenTagNumber == 0:
-            ErrorMsgList.append('Line %d : No doxygen tags in comment' % CommentStartLine)
-            PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER, 'No doxygen tags in comment ', TableName, CommentId)
+            ErrorMsgList.append(
+                'Line %d : No doxygen tags in comment' % CommentStartLine)
+            PrintErrorMsg(ERROR_DOXYGEN_CHECK_FUNCTION_HEADER,
+                          'No doxygen tags in comment ', TableName, CommentId)
+
 
 if __name__ == '__main__':
 
-#    EdkLogger.Initialize()
-#    EdkLogger.SetLevel(EdkLogger.QUIET)
-#    CollectSourceCodeDataIntoDB(sys.argv[1])
+    #    EdkLogger.Initialize()
+    #    EdkLogger.SetLevel(EdkLogger.QUIET)
+    #    CollectSourceCodeDataIntoDB(sys.argv[1])
     try:
         test_file = sys.argv[1]
     except IndexError as v:
diff --git a/BaseTools/Source/Python/Eot/CParser3/CLexer.py b/BaseTools/Source/Python/Eot/CParser3/CLexer.py
index ca03adea7a65..c48cb2404bf0 100644
--- a/BaseTools/Source/Python/Eot/CParser3/CLexer.py
+++ b/BaseTools/Source/Python/Eot/CParser3/CLexer.py
@@ -3,7 +3,7 @@
 from antlr3 import *
 from antlr3.compat import set, frozenset
 
-## @file
+# @file
 # The file defines the Lexer for C source files.
 #
 # THIS FILE IS AUTO-GENERATED. PLEASE DO NOT MODIFY THIS FILE.
@@ -17,127 +17,127 @@ from antlr3.compat import set, frozenset
 ##
 
 
-
 # for convenience in actions
 HIDDEN = BaseRecognizer.HIDDEN
 
 # token types
-T114=114
-T115=115
-T116=116
-T117=117
-FloatTypeSuffix=16
-LETTER=11
-T29=29
-T28=28
-T27=27
-T26=26
-T25=25
-EOF=-1
-STRING_LITERAL=9
-FLOATING_POINT_LITERAL=10
-T38=38
-T37=37
-T39=39
-T34=34
-COMMENT=22
-T33=33
-T36=36
-T35=35
-T30=30
-T32=32
-T31=31
-LINE_COMMENT=23
-IntegerTypeSuffix=14
-CHARACTER_LITERAL=8
-T49=49
-T48=48
-T100=100
-T43=43
-T42=42
-T102=102
-T41=41
-T101=101
-T40=40
-T47=47
-T46=46
-T45=45
-T44=44
-T109=109
-T107=107
-T108=108
-T105=105
-WS=19
-T106=106
-T103=103
-T104=104
-T50=50
-LINE_COMMAND=24
-T59=59
-T113=113
-T52=52
-T112=112
-T51=51
-T111=111
-T54=54
-T110=110
-EscapeSequence=12
-DECIMAL_LITERAL=7
-T53=53
-T56=56
-T55=55
-T58=58
-T57=57
-T75=75
-T76=76
-T73=73
-T74=74
-T79=79
-T77=77
-T78=78
-Exponent=15
-HexDigit=13
-T72=72
-T71=71
-T70=70
-T62=62
-T63=63
-T64=64
-T65=65
-T66=66
-T67=67
-T68=68
-T69=69
-IDENTIFIER=4
-UnicodeVocabulary=21
-HEX_LITERAL=5
-T61=61
-T60=60
-T99=99
-T97=97
-BS=20
-T98=98
-T95=95
-T96=96
-OCTAL_LITERAL=6
-T94=94
-Tokens=118
-T93=93
-T92=92
-T91=91
-T90=90
-T88=88
-T89=89
-T84=84
-T85=85
-T86=86
-T87=87
-UnicodeEscape=18
-T81=81
-T80=80
-T83=83
-OctalEscape=17
-T82=82
+T114 = 114
+T115 = 115
+T116 = 116
+T117 = 117
+FloatTypeSuffix = 16
+LETTER = 11
+T29 = 29
+T28 = 28
+T27 = 27
+T26 = 26
+T25 = 25
+EOF = -1
+STRING_LITERAL = 9
+FLOATING_POINT_LITERAL = 10
+T38 = 38
+T37 = 37
+T39 = 39
+T34 = 34
+COMMENT = 22
+T33 = 33
+T36 = 36
+T35 = 35
+T30 = 30
+T32 = 32
+T31 = 31
+LINE_COMMENT = 23
+IntegerTypeSuffix = 14
+CHARACTER_LITERAL = 8
+T49 = 49
+T48 = 48
+T100 = 100
+T43 = 43
+T42 = 42
+T102 = 102
+T41 = 41
+T101 = 101
+T40 = 40
+T47 = 47
+T46 = 46
+T45 = 45
+T44 = 44
+T109 = 109
+T107 = 107
+T108 = 108
+T105 = 105
+WS = 19
+T106 = 106
+T103 = 103
+T104 = 104
+T50 = 50
+LINE_COMMAND = 24
+T59 = 59
+T113 = 113
+T52 = 52
+T112 = 112
+T51 = 51
+T111 = 111
+T54 = 54
+T110 = 110
+EscapeSequence = 12
+DECIMAL_LITERAL = 7
+T53 = 53
+T56 = 56
+T55 = 55
+T58 = 58
+T57 = 57
+T75 = 75
+T76 = 76
+T73 = 73
+T74 = 74
+T79 = 79
+T77 = 77
+T78 = 78
+Exponent = 15
+HexDigit = 13
+T72 = 72
+T71 = 71
+T70 = 70
+T62 = 62
+T63 = 63
+T64 = 64
+T65 = 65
+T66 = 66
+T67 = 67
+T68 = 68
+T69 = 69
+IDENTIFIER = 4
+UnicodeVocabulary = 21
+HEX_LITERAL = 5
+T61 = 61
+T60 = 60
+T99 = 99
+T97 = 97
+BS = 20
+T98 = 98
+T95 = 95
+T96 = 96
+OCTAL_LITERAL = 6
+T94 = 94
+Tokens = 118
+T93 = 93
+T92 = 92
+T91 = 91
+T90 = 90
+T88 = 88
+T89 = 89
+T84 = 84
+T85 = 85
+T86 = 86
+T87 = 87
+UnicodeEscape = 18
+T81 = 81
+T80 = 80
+T83 = 83
+OctalEscape = 17
+T82 = 82
+
 
 class CLexer(Lexer):
 
@@ -147,31 +147,27 @@ class CLexer(Lexer):
         Lexer.__init__(self, input)
         self.dfa25 = self.DFA25(
             self, 25,
-            eot = self.DFA25_eot,
-            eof = self.DFA25_eof,
-            min = self.DFA25_min,
-            max = self.DFA25_max,
-            accept = self.DFA25_accept,
-            special = self.DFA25_special,
-            transition = self.DFA25_transition
-            )
+            eot=self.DFA25_eot,
+            eof=self.DFA25_eof,
+            min=self.DFA25_min,
+            max=self.DFA25_max,
+            accept=self.DFA25_accept,
+            special=self.DFA25_special,
+            transition=self.DFA25_transition
+        )
         self.dfa35 = self.DFA35(
             self, 35,
-            eot = self.DFA35_eot,
-            eof = self.DFA35_eof,
-            min = self.DFA35_min,
-            max = self.DFA35_max,
-            accept = self.DFA35_accept,
-            special = self.DFA35_special,
-            transition = self.DFA35_transition
-            )
-
-
-
-
-
+            eot=self.DFA35_eot,
+            eof=self.DFA35_eof,
+            min=self.DFA35_min,
+            max=self.DFA35_max,
+            accept=self.DFA35_accept,
+            special=self.DFA35_special,
+            transition=self.DFA35_transition
+        )
 
     # $ANTLR start T25
+
     def mT25(self, ):
 
         try:
@@ -181,19 +177,14 @@ class CLexer(Lexer):
             # C.g:27:7: ';'
             self.match(u';')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T25
 
-
-
     # $ANTLR start T26
+
     def mT26(self, ):
 
         try:
@@ -203,20 +194,14 @@ class CLexer(Lexer):
             # C.g:28:7: 'typedef'
             self.match("typedef")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T26
 
-
-
     # $ANTLR start T27
+
     def mT27(self, ):
 
         try:
@@ -226,19 +211,14 @@ class CLexer(Lexer):
             # C.g:29:7: ','
             self.match(u',')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T27
 
-
-
     # $ANTLR start T28
+
     def mT28(self, ):
 
         try:
@@ -248,19 +228,14 @@ class CLexer(Lexer):
             # C.g:30:7: '='
             self.match(u'=')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T28
 
-
-
     # $ANTLR start T29
+
     def mT29(self, ):
 
         try:
@@ -270,20 +245,14 @@ class CLexer(Lexer):
             # C.g:31:7: 'extern'
             self.match("extern")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T29
 
-
-
     # $ANTLR start T30
+
     def mT30(self, ):
 
         try:
@@ -293,20 +262,14 @@ class CLexer(Lexer):
             # C.g:32:7: 'static'
             self.match("static")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T30
 
-
-
     # $ANTLR start T31
+
     def mT31(self, ):
 
         try:
@@ -316,20 +279,14 @@ class CLexer(Lexer):
             # C.g:33:7: 'auto'
             self.match("auto")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T31
 
-
-
     # $ANTLR start T32
+
     def mT32(self, ):
 
         try:
@@ -339,20 +296,14 @@ class CLexer(Lexer):
             # C.g:34:7: 'register'
             self.match("register")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T32
 
-
-
     # $ANTLR start T33
+
     def mT33(self, ):
 
         try:
@@ -362,20 +313,14 @@ class CLexer(Lexer):
             # C.g:35:7: 'STATIC'
             self.match("STATIC")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T33
 
-
-
     # $ANTLR start T34
+
     def mT34(self, ):
 
         try:
@@ -385,20 +330,14 @@ class CLexer(Lexer):
             # C.g:36:7: 'void'
             self.match("void")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T34
 
-
-
     # $ANTLR start T35
+
     def mT35(self, ):
 
         try:
@@ -408,20 +347,14 @@ class CLexer(Lexer):
             # C.g:37:7: 'char'
             self.match("char")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T35
 
-
-
     # $ANTLR start T36
+
     def mT36(self, ):
 
         try:
@@ -431,20 +364,14 @@ class CLexer(Lexer):
             # C.g:38:7: 'short'
             self.match("short")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T36
 
-
-
     # $ANTLR start T37
+
     def mT37(self, ):
 
         try:
@@ -454,20 +381,14 @@ class CLexer(Lexer):
             # C.g:39:7: 'int'
             self.match("int")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T37
 
-
-
     # $ANTLR start T38
+
     def mT38(self, ):
 
         try:
@@ -477,20 +398,14 @@ class CLexer(Lexer):
             # C.g:40:7: 'long'
             self.match("long")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T38
 
-
-
     # $ANTLR start T39
+
     def mT39(self, ):
 
         try:
@@ -500,20 +415,14 @@ class CLexer(Lexer):
             # C.g:41:7: 'float'
             self.match("float")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T39
 
-
-
     # $ANTLR start T40
+
     def mT40(self, ):
 
         try:
@@ -523,20 +432,14 @@ class CLexer(Lexer):
             # C.g:42:7: 'double'
             self.match("double")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T40
 
-
-
     # $ANTLR start T41
+
     def mT41(self, ):
 
         try:
@@ -546,20 +449,14 @@ class CLexer(Lexer):
             # C.g:43:7: 'signed'
             self.match("signed")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T41
 
-
-
     # $ANTLR start T42
+
     def mT42(self, ):
 
         try:
@@ -569,20 +466,14 @@ class CLexer(Lexer):
             # C.g:44:7: 'unsigned'
             self.match("unsigned")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T42
 
-
-
     # $ANTLR start T43
+
     def mT43(self, ):
 
         try:
@@ -592,19 +483,14 @@ class CLexer(Lexer):
             # C.g:45:7: '{'
             self.match(u'{')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T43
 
-
-
     # $ANTLR start T44
+
     def mT44(self, ):
 
         try:
@@ -614,19 +500,14 @@ class CLexer(Lexer):
             # C.g:46:7: '}'
             self.match(u'}')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T44
 
-
-
     # $ANTLR start T45
+
     def mT45(self, ):
 
         try:
@@ -636,20 +517,14 @@ class CLexer(Lexer):
             # C.g:47:7: 'struct'
             self.match("struct")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T45
 
-
-
     # $ANTLR start T46
+
     def mT46(self, ):
 
         try:
@@ -659,20 +534,14 @@ class CLexer(Lexer):
             # C.g:48:7: 'union'
             self.match("union")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T46
 
-
-
     # $ANTLR start T47
+
     def mT47(self, ):
 
         try:
@@ -682,19 +551,14 @@ class CLexer(Lexer):
             # C.g:49:7: ':'
             self.match(u':')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T47
 
-
-
     # $ANTLR start T48
+
     def mT48(self, ):
 
         try:
@@ -704,20 +568,14 @@ class CLexer(Lexer):
             # C.g:50:7: 'enum'
             self.match("enum")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T48
 
-
-
     # $ANTLR start T49
+
     def mT49(self, ):
 
         try:
@@ -727,20 +585,14 @@ class CLexer(Lexer):
             # C.g:51:7: 'const'
             self.match("const")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T49
 
-
-
     # $ANTLR start T50
+
     def mT50(self, ):
 
         try:
@@ -750,20 +602,14 @@ class CLexer(Lexer):
             # C.g:52:7: 'volatile'
             self.match("volatile")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T50
 
-
-
     # $ANTLR start T51
+
     def mT51(self, ):
 
         try:
@@ -773,20 +619,14 @@ class CLexer(Lexer):
             # C.g:53:7: 'IN'
             self.match("IN")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T51
 
-
-
     # $ANTLR start T52
+
     def mT52(self, ):
 
         try:
@@ -796,20 +636,14 @@ class CLexer(Lexer):
             # C.g:54:7: 'OUT'
             self.match("OUT")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T52
 
-
-
     # $ANTLR start T53
+
     def mT53(self, ):
 
         try:
@@ -819,20 +653,14 @@ class CLexer(Lexer):
             # C.g:55:7: 'OPTIONAL'
             self.match("OPTIONAL")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T53
 
-
-
     # $ANTLR start T54
+
     def mT54(self, ):
 
         try:
@@ -842,20 +670,14 @@ class CLexer(Lexer):
             # C.g:56:7: 'CONST'
             self.match("CONST")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T54
 
-
-
     # $ANTLR start T55
+
     def mT55(self, ):
 
         try:
@@ -865,20 +687,14 @@ class CLexer(Lexer):
             # C.g:57:7: 'UNALIGNED'
             self.match("UNALIGNED")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T55
 
-
-
     # $ANTLR start T56
+
     def mT56(self, ):
 
         try:
@@ -888,20 +704,14 @@ class CLexer(Lexer):
             # C.g:58:7: 'VOLATILE'
             self.match("VOLATILE")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T56
 
-
-
     # $ANTLR start T57
+
     def mT57(self, ):
 
         try:
@@ -911,20 +721,14 @@ class CLexer(Lexer):
             # C.g:59:7: 'GLOBAL_REMOVE_IF_UNREFERENCED'
             self.match("GLOBAL_REMOVE_IF_UNREFERENCED")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T57
 
-
-
     # $ANTLR start T58
+
     def mT58(self, ):
 
         try:
@@ -934,20 +738,14 @@ class CLexer(Lexer):
             # C.g:60:7: 'EFIAPI'
             self.match("EFIAPI")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T58
 
-
-
     # $ANTLR start T59
+
     def mT59(self, ):
 
         try:
@@ -957,20 +755,14 @@ class CLexer(Lexer):
             # C.g:61:7: 'EFI_BOOTSERVICE'
             self.match("EFI_BOOTSERVICE")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T59
 
-
-
     # $ANTLR start T60
+
     def mT60(self, ):
 
         try:
@@ -980,20 +772,14 @@ class CLexer(Lexer):
             # C.g:62:7: 'EFI_RUNTIMESERVICE'
             self.match("EFI_RUNTIMESERVICE")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T60
 
-
-
     # $ANTLR start T61
+
     def mT61(self, ):
 
         try:
@@ -1003,20 +789,14 @@ class CLexer(Lexer):
             # C.g:63:7: 'PACKED'
             self.match("PACKED")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T61
 
-
-
     # $ANTLR start T62
+
     def mT62(self, ):
 
         try:
@@ -1026,19 +806,14 @@ class CLexer(Lexer):
             # C.g:64:7: '('
             self.match(u'(')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T62
 
-
-
     # $ANTLR start T63
+
     def mT63(self, ):
 
         try:
@@ -1048,19 +823,14 @@ class CLexer(Lexer):
             # C.g:65:7: ')'
             self.match(u')')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T63
 
-
-
     # $ANTLR start T64
+
     def mT64(self, ):
 
         try:
@@ -1070,19 +840,14 @@ class CLexer(Lexer):
             # C.g:66:7: '['
             self.match(u'[')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T64
 
-
-
     # $ANTLR start T65
+
     def mT65(self, ):
 
         try:
@@ -1092,19 +857,14 @@ class CLexer(Lexer):
             # C.g:67:7: ']'
             self.match(u']')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T65
 
-
-
     # $ANTLR start T66
+
     def mT66(self, ):
 
         try:
@@ -1114,19 +874,14 @@ class CLexer(Lexer):
             # C.g:68:7: '*'
             self.match(u'*')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T66
 
-
-
     # $ANTLR start T67
+
     def mT67(self, ):
 
         try:
@@ -1136,20 +891,14 @@ class CLexer(Lexer):
             # C.g:69:7: '...'
             self.match("...")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T67
 
-
-
     # $ANTLR start T68
+
     def mT68(self, ):
 
         try:
@@ -1159,19 +908,14 @@ class CLexer(Lexer):
             # C.g:70:7: '+'
             self.match(u'+')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T68
 
-
-
     # $ANTLR start T69
+
     def mT69(self, ):
 
         try:
@@ -1181,19 +925,14 @@ class CLexer(Lexer):
             # C.g:71:7: '-'
             self.match(u'-')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T69
 
-
-
     # $ANTLR start T70
+
     def mT70(self, ):
 
         try:
@@ -1203,19 +942,14 @@ class CLexer(Lexer):
             # C.g:72:7: '/'
             self.match(u'/')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T70
 
-
-
     # $ANTLR start T71
+
     def mT71(self, ):
 
         try:
@@ -1225,19 +959,14 @@ class CLexer(Lexer):
             # C.g:73:7: '%'
             self.match(u'%')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T71
 
-
-
     # $ANTLR start T72
+
     def mT72(self, ):
 
         try:
@@ -1247,20 +976,14 @@ class CLexer(Lexer):
             # C.g:74:7: '++'
             self.match("++")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T72
 
-
-
     # $ANTLR start T73
+
     def mT73(self, ):
 
         try:
@@ -1270,20 +993,14 @@ class CLexer(Lexer):
             # C.g:75:7: '--'
             self.match("--")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T73
 
-
-
     # $ANTLR start T74
+
     def mT74(self, ):
 
         try:
@@ -1293,20 +1010,14 @@ class CLexer(Lexer):
             # C.g:76:7: 'sizeof'
             self.match("sizeof")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T74
 
-
-
     # $ANTLR start T75
+
     def mT75(self, ):
 
         try:
@@ -1316,19 +1027,14 @@ class CLexer(Lexer):
             # C.g:77:7: '.'
             self.match(u'.')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T75
 
-
-
     # $ANTLR start T76
+
     def mT76(self, ):
 
         try:
@@ -1338,20 +1044,14 @@ class CLexer(Lexer):
             # C.g:78:7: '->'
             self.match("->")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T76
 
-
-
     # $ANTLR start T77
+
     def mT77(self, ):
 
         try:
@@ -1361,19 +1061,14 @@ class CLexer(Lexer):
             # C.g:79:7: '&'
             self.match(u'&')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T77
 
-
-
     # $ANTLR start T78
+
     def mT78(self, ):
 
         try:
@@ -1383,19 +1078,14 @@ class CLexer(Lexer):
             # C.g:80:7: '~'
             self.match(u'~')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T78
 
-
-
     # $ANTLR start T79
+
     def mT79(self, ):
 
         try:
@@ -1405,19 +1095,14 @@ class CLexer(Lexer):
             # C.g:81:7: '!'
             self.match(u'!')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T79
 
-
-
     # $ANTLR start T80
+
     def mT80(self, ):
 
         try:
@@ -1427,20 +1112,14 @@ class CLexer(Lexer):
             # C.g:82:7: '*='
             self.match("*=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T80
 
-
-
     # $ANTLR start T81
+
     def mT81(self, ):
 
         try:
@@ -1450,20 +1129,14 @@ class CLexer(Lexer):
             # C.g:83:7: '/='
             self.match("/=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T81
 
-
-
     # $ANTLR start T82
+
     def mT82(self, ):
 
         try:
@@ -1473,20 +1146,14 @@ class CLexer(Lexer):
             # C.g:84:7: '%='
             self.match("%=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T82
 
-
-
     # $ANTLR start T83
+
     def mT83(self, ):
 
         try:
@@ -1496,20 +1163,14 @@ class CLexer(Lexer):
             # C.g:85:7: '+='
             self.match("+=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T83
 
-
-
     # $ANTLR start T84
+
     def mT84(self, ):
 
         try:
@@ -1519,20 +1180,14 @@ class CLexer(Lexer):
             # C.g:86:7: '-='
             self.match("-=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T84
 
-
-
     # $ANTLR start T85
+
     def mT85(self, ):
 
         try:
@@ -1542,20 +1197,14 @@ class CLexer(Lexer):
             # C.g:87:7: '<<='
             self.match("<<=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T85
 
-
-
     # $ANTLR start T86
+
     def mT86(self, ):
 
         try:
@@ -1565,20 +1214,14 @@ class CLexer(Lexer):
             # C.g:88:7: '>>='
             self.match(">>=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T86
 
-
-
     # $ANTLR start T87
+
     def mT87(self, ):
 
         try:
@@ -1588,20 +1231,14 @@ class CLexer(Lexer):
             # C.g:89:7: '&='
             self.match("&=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T87
 
-
-
     # $ANTLR start T88
+
     def mT88(self, ):
 
         try:
@@ -1611,20 +1248,14 @@ class CLexer(Lexer):
             # C.g:90:7: '^='
             self.match("^=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T88
 
-
-
     # $ANTLR start T89
+
     def mT89(self, ):
 
         try:
@@ -1634,20 +1265,14 @@ class CLexer(Lexer):
             # C.g:91:7: '|='
             self.match("|=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T89
 
-
-
     # $ANTLR start T90
+
     def mT90(self, ):
 
         try:
@@ -1657,19 +1282,14 @@ class CLexer(Lexer):
             # C.g:92:7: '?'
             self.match(u'?')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T90
 
-
-
     # $ANTLR start T91
+
     def mT91(self, ):
 
         try:
@@ -1679,20 +1299,14 @@ class CLexer(Lexer):
             # C.g:93:7: '||'
             self.match("||")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T91
 
-
-
     # $ANTLR start T92
+
     def mT92(self, ):
 
         try:
@@ -1702,20 +1316,14 @@ class CLexer(Lexer):
             # C.g:94:7: '&&'
             self.match("&&")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T92
 
-
-
     # $ANTLR start T93
+
     def mT93(self, ):
 
         try:
@@ -1725,19 +1333,14 @@ class CLexer(Lexer):
             # C.g:95:7: '|'
             self.match(u'|')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T93
 
-
-
     # $ANTLR start T94
+
     def mT94(self, ):
 
         try:
@@ -1747,19 +1350,14 @@ class CLexer(Lexer):
             # C.g:96:7: '^'
             self.match(u'^')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T94
 
-
-
     # $ANTLR start T95
+
     def mT95(self, ):
 
         try:
@@ -1769,20 +1367,14 @@ class CLexer(Lexer):
             # C.g:97:7: '=='
             self.match("==")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T95
 
-
-
     # $ANTLR start T96
+
     def mT96(self, ):
 
         try:
@@ -1792,20 +1384,14 @@ class CLexer(Lexer):
             # C.g:98:7: '!='
             self.match("!=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T96
 
-
-
     # $ANTLR start T97
+
     def mT97(self, ):
 
         try:
@@ -1815,19 +1401,14 @@ class CLexer(Lexer):
             # C.g:99:7: '<'
             self.match(u'<')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T97
 
-
-
     # $ANTLR start T98
+
     def mT98(self, ):
 
         try:
@@ -1837,19 +1418,14 @@ class CLexer(Lexer):
             # C.g:100:7: '>'
             self.match(u'>')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T98
 
-
-
     # $ANTLR start T99
+
     def mT99(self, ):
 
         try:
@@ -1859,20 +1435,14 @@ class CLexer(Lexer):
             # C.g:101:7: '<='
             self.match("<=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T99
 
-
-
     # $ANTLR start T100
+
     def mT100(self, ):
 
         try:
@@ -1882,20 +1452,14 @@ class CLexer(Lexer):
             # C.g:102:8: '>='
             self.match(">=")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T100
 
-
-
     # $ANTLR start T101
+
     def mT101(self, ):
 
         try:
@@ -1905,20 +1469,14 @@ class CLexer(Lexer):
             # C.g:103:8: '<<'
             self.match("<<")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T101
 
-
-
     # $ANTLR start T102
+
     def mT102(self, ):
 
         try:
@@ -1928,20 +1486,14 @@ class CLexer(Lexer):
             # C.g:104:8: '>>'
             self.match(">>")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T102
 
-
-
     # $ANTLR start T103
+
     def mT103(self, ):
 
         try:
@@ -1951,20 +1503,14 @@ class CLexer(Lexer):
             # C.g:105:8: '__asm__'
             self.match("__asm__")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T103
 
-
-
     # $ANTLR start T104
+
     def mT104(self, ):
 
         try:
@@ -1974,20 +1520,14 @@ class CLexer(Lexer):
             # C.g:106:8: '_asm'
             self.match("_asm")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T104
 
-
-
     # $ANTLR start T105
+
     def mT105(self, ):
 
         try:
@@ -1997,20 +1537,14 @@ class CLexer(Lexer):
             # C.g:107:8: '__asm'
             self.match("__asm")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T105
 
-
-
     # $ANTLR start T106
+
     def mT106(self, ):
 
         try:
@@ -2020,20 +1554,14 @@ class CLexer(Lexer):
             # C.g:108:8: 'case'
             self.match("case")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T106
 
-
-
     # $ANTLR start T107
+
     def mT107(self, ):
 
         try:
@@ -2043,20 +1571,14 @@ class CLexer(Lexer):
             # C.g:109:8: 'default'
             self.match("default")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T107
 
-
-
     # $ANTLR start T108
+
     def mT108(self, ):
 
         try:
@@ -2066,20 +1588,14 @@ class CLexer(Lexer):
             # C.g:110:8: 'if'
             self.match("if")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T108
 
-
-
     # $ANTLR start T109
+
     def mT109(self, ):
 
         try:
@@ -2089,20 +1605,14 @@ class CLexer(Lexer):
             # C.g:111:8: 'else'
             self.match("else")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T109
 
-
-
     # $ANTLR start T110
+
     def mT110(self, ):
 
         try:
@@ -2112,20 +1622,14 @@ class CLexer(Lexer):
             # C.g:112:8: 'switch'
             self.match("switch")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T110
 
-
-
     # $ANTLR start T111
+
     def mT111(self, ):
 
         try:
@@ -2135,20 +1639,14 @@ class CLexer(Lexer):
             # C.g:113:8: 'while'
             self.match("while")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T111
 
-
-
     # $ANTLR start T112
+
     def mT112(self, ):
 
         try:
@@ -2158,20 +1656,14 @@ class CLexer(Lexer):
             # C.g:114:8: 'do'
             self.match("do")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T112
 
-
-
     # $ANTLR start T113
+
     def mT113(self, ):
 
         try:
@@ -2181,20 +1673,14 @@ class CLexer(Lexer):
             # C.g:115:8: 'for'
             self.match("for")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T113
 
-
-
     # $ANTLR start T114
+
     def mT114(self, ):
 
         try:
@@ -2204,20 +1690,14 @@ class CLexer(Lexer):
             # C.g:116:8: 'goto'
             self.match("goto")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T114
 
-
-
     # $ANTLR start T115
+
     def mT115(self, ):
 
         try:
@@ -2227,20 +1707,14 @@ class CLexer(Lexer):
             # C.g:117:8: 'continue'
             self.match("continue")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T115
 
-
-
     # $ANTLR start T116
+
     def mT116(self, ):
 
         try:
@@ -2250,20 +1724,14 @@ class CLexer(Lexer):
             # C.g:118:8: 'break'
             self.match("break")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T116
 
-
-
     # $ANTLR start T117
+
     def mT117(self, ):
 
         try:
@@ -2273,20 +1741,14 @@ class CLexer(Lexer):
             # C.g:119:8: 'return'
             self.match("return")
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end T117
 
-
-
     # $ANTLR start IDENTIFIER
+
     def mIDENTIFIER(self, ):
 
         try:
@@ -2297,34 +1759,25 @@ class CLexer(Lexer):
             self.mLETTER()
 
             # C.g:586:11: ( LETTER | '0' .. '9' )*
-            while True: #loop1
+            while True:  # loop1
                 alt1 = 2
                 LA1_0 = self.input.LA(1)
 
-                if (LA1_0 == u'$' or (u'0' <= LA1_0 <= u'9') or (u'A' <= LA1_0 <= u'Z') or LA1_0 == u'_' or (u'a' <= LA1_0 <= u'z')) :
+                if (LA1_0 == u'$' or (u'0' <= LA1_0 <= u'9') or (u'A' <= LA1_0 <= u'Z') or LA1_0 == u'_' or (u'a' <= LA1_0 <= u'z')):
                     alt1 = 1
 
-
                 if alt1 == 1:
                     # C.g:
                     if self.input.LA(1) == u'$' or (u'0' <= self.input.LA(1) <= u'9') or (u'A' <= self.input.LA(1) <= u'Z') or self.input.LA(1) == u'_' or (u'a' <= self.input.LA(1) <= u'z'):
-                        self.input.consume();
+                        self.input.consume()
 
                     else:
                         mse = MismatchedSetException(None, self.input)
                         self.recover(mse)
                         raise mse
 
-
-
-
                 else:
-                    break #loop1
-
-
-
-
-
+                    break  # loop1
 
         finally:
 
@@ -2332,36 +1785,29 @@ class CLexer(Lexer):
 
     # $ANTLR end IDENTIFIER
 
-
-
     # $ANTLR start LETTER
+
     def mLETTER(self, ):
 
         try:
             # C.g:591:2: ( '$' | 'A' .. 'Z' | 'a' .. 'z' | '_' )
             # C.g:
             if self.input.LA(1) == u'$' or (u'A' <= self.input.LA(1) <= u'Z') or self.input.LA(1) == u'_' or (u'a' <= self.input.LA(1) <= u'z'):
-                self.input.consume();
+                self.input.consume()
 
             else:
                 mse = MismatchedSetException(None, self.input)
                 self.recover(mse)
                 raise mse
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end LETTER
 
-
-
     # $ANTLR start CHARACTER_LITERAL
+
     def mCHARACTER_LITERAL(self, ):
 
         try:
@@ -2373,27 +1819,25 @@ class CLexer(Lexer):
             alt2 = 2
             LA2_0 = self.input.LA(1)
 
-            if (LA2_0 == u'L') :
+            if (LA2_0 == u'L'):
                 alt2 = 1
             if alt2 == 1:
                 # C.g:598:10: 'L'
                 self.match(u'L')
 
-
-
-
             self.match(u'\'')
 
             # C.g:598:21: ( EscapeSequence | ~ ( '\\'' | '\\\\' ) )
             alt3 = 2
             LA3_0 = self.input.LA(1)
 
-            if (LA3_0 == u'\\') :
+            if (LA3_0 == u'\\'):
                 alt3 = 1
-            elif ((u'\u0000' <= LA3_0 <= u'&') or (u'(' <= LA3_0 <= u'[') or (u']' <= LA3_0 <= u'\uFFFE')) :
+            elif ((u'\u0000' <= LA3_0 <= u'&') or (u'(' <= LA3_0 <= u'[') or (u']' <= LA3_0 <= u'\uFFFE')):
                 alt3 = 2
             else:
-                nvae = NoViableAltException("598:21: ( EscapeSequence | ~ ( '\\'' | '\\\\' ) )", 3, 0, self.input)
+                nvae = NoViableAltException(
+                    "598:21: ( EscapeSequence | ~ ( '\\'' | '\\\\' ) )", 3, 0, self.input)
 
                 raise nvae
 
@@ -2401,37 +1845,26 @@ class CLexer(Lexer):
                 # C.g:598:23: EscapeSequence
                 self.mEscapeSequence()
 
-
-
             elif alt3 == 2:
                 # C.g:598:40: ~ ( '\\'' | '\\\\' )
                 if (u'\u0000' <= self.input.LA(1) <= u'&') or (u'(' <= self.input.LA(1) <= u'[') or (u']' <= self.input.LA(1) <= u'\uFFFE'):
-                    self.input.consume();
+                    self.input.consume()
 
                 else:
                     mse = MismatchedSetException(None, self.input)
                     self.recover(mse)
                     raise mse
 
-
-
-
-
             self.match(u'\'')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end CHARACTER_LITERAL
 
-
-
     # $ANTLR start STRING_LITERAL
+
     def mSTRING_LITERAL(self, ):
 
         try:
@@ -2443,66 +1876,51 @@ class CLexer(Lexer):
             alt4 = 2
             LA4_0 = self.input.LA(1)
 
-            if (LA4_0 == u'L') :
+            if (LA4_0 == u'L'):
                 alt4 = 1
             if alt4 == 1:
                 # C.g:602:9: 'L'
                 self.match(u'L')
 
-
-
-
             self.match(u'"')
 
             # C.g:602:19: ( EscapeSequence | ~ ( '\\\\' | '\"' ) )*
-            while True: #loop5
+            while True:  # loop5
                 alt5 = 3
                 LA5_0 = self.input.LA(1)
 
-                if (LA5_0 == u'\\') :
+                if (LA5_0 == u'\\'):
                     alt5 = 1
-                elif ((u'\u0000' <= LA5_0 <= u'!') or (u'#' <= LA5_0 <= u'[') or (u']' <= LA5_0 <= u'\uFFFE')) :
+                elif ((u'\u0000' <= LA5_0 <= u'!') or (u'#' <= LA5_0 <= u'[') or (u']' <= LA5_0 <= u'\uFFFE')):
                     alt5 = 2
 
-
                 if alt5 == 1:
                     # C.g:602:21: EscapeSequence
                     self.mEscapeSequence()
 
-
-
                 elif alt5 == 2:
                     # C.g:602:38: ~ ( '\\\\' | '\"' )
                     if (u'\u0000' <= self.input.LA(1) <= u'!') or (u'#' <= self.input.LA(1) <= u'[') or (u']' <= self.input.LA(1) <= u'\uFFFE'):
-                        self.input.consume();
+                        self.input.consume()
 
                     else:
                         mse = MismatchedSetException(None, self.input)
                         self.recover(mse)
                         raise mse
 
-
-
-
                 else:
-                    break #loop5
-
+                    break  # loop5
 
             self.match(u'"')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end STRING_LITERAL
 
-
-
     # $ANTLR start HEX_LITERAL
+
     def mHEX_LITERAL(self, ):
 
         try:
@@ -2513,66 +1931,53 @@ class CLexer(Lexer):
             self.match(u'0')
 
             if self.input.LA(1) == u'X' or self.input.LA(1) == u'x':
-                self.input.consume();
+                self.input.consume()
 
             else:
                 mse = MismatchedSetException(None, self.input)
                 self.recover(mse)
                 raise mse
 
-
             # C.g:605:29: ( HexDigit )+
             cnt6 = 0
-            while True: #loop6
+            while True:  # loop6
                 alt6 = 2
                 LA6_0 = self.input.LA(1)
 
-                if ((u'0' <= LA6_0 <= u'9') or (u'A' <= LA6_0 <= u'F') or (u'a' <= LA6_0 <= u'f')) :
+                if ((u'0' <= LA6_0 <= u'9') or (u'A' <= LA6_0 <= u'F') or (u'a' <= LA6_0 <= u'f')):
                     alt6 = 1
 
-
                 if alt6 == 1:
                     # C.g:605:29: HexDigit
                     self.mHexDigit()
 
-
-
                 else:
                     if cnt6 >= 1:
-                        break #loop6
+                        break  # loop6
 
                     eee = EarlyExitException(6, self.input)
                     raise eee
 
                 cnt6 += 1
 
-
             # C.g:605:39: ( IntegerTypeSuffix )?
             alt7 = 2
             LA7_0 = self.input.LA(1)
 
-            if (LA7_0 == u'L' or LA7_0 == u'U' or LA7_0 == u'l' or LA7_0 == u'u') :
+            if (LA7_0 == u'L' or LA7_0 == u'U' or LA7_0 == u'l' or LA7_0 == u'u'):
                 alt7 = 1
             if alt7 == 1:
                 # C.g:605:39: IntegerTypeSuffix
                 self.mIntegerTypeSuffix()
 
-
-
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end HEX_LITERAL
 
-
-
     # $ANTLR start DECIMAL_LITERAL
+
     def mDECIMAL_LITERAL(self, ):
 
         try:
@@ -2584,12 +1989,13 @@ class CLexer(Lexer):
             alt9 = 2
             LA9_0 = self.input.LA(1)
 
-            if (LA9_0 == u'0') :
+            if (LA9_0 == u'0'):
                 alt9 = 1
-            elif ((u'1' <= LA9_0 <= u'9')) :
+            elif ((u'1' <= LA9_0 <= u'9')):
                 alt9 = 2
             else:
-                nvae = NoViableAltException("607:19: ( '0' | '1' .. '9' ( '0' .. '9' )* )", 9, 0, self.input)
+                nvae = NoViableAltException(
+                    "607:19: ( '0' | '1' .. '9' ( '0' .. '9' )* )", 9, 0, self.input)
 
                 raise nvae
 
@@ -2597,60 +2003,43 @@ class CLexer(Lexer):
                 # C.g:607:20: '0'
                 self.match(u'0')
 
-
-
             elif alt9 == 2:
                 # C.g:607:26: '1' .. '9' ( '0' .. '9' )*
                 self.matchRange(u'1', u'9')
 
                 # C.g:607:35: ( '0' .. '9' )*
-                while True: #loop8
+                while True:  # loop8
                     alt8 = 2
                     LA8_0 = self.input.LA(1)
 
-                    if ((u'0' <= LA8_0 <= u'9')) :
+                    if ((u'0' <= LA8_0 <= u'9')):
                         alt8 = 1
 
-
                     if alt8 == 1:
                         # C.g:607:35: '0' .. '9'
                         self.matchRange(u'0', u'9')
 
-
-
                     else:
-                        break #loop8
-
-
-
-
+                        break  # loop8
 
             # C.g:607:46: ( IntegerTypeSuffix )?
             alt10 = 2
             LA10_0 = self.input.LA(1)
 
-            if (LA10_0 == u'L' or LA10_0 == u'U' or LA10_0 == u'l' or LA10_0 == u'u') :
+            if (LA10_0 == u'L' or LA10_0 == u'U' or LA10_0 == u'l' or LA10_0 == u'u'):
                 alt10 = 1
             if alt10 == 1:
                 # C.g:607:46: IntegerTypeSuffix
                 self.mIntegerTypeSuffix()
 
-
-
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end DECIMAL_LITERAL
 
-
-
     # $ANTLR start OCTAL_LITERAL
+
     def mOCTAL_LITERAL(self, ):
 
         try:
@@ -2662,83 +2051,65 @@ class CLexer(Lexer):
 
             # C.g:609:21: ( '0' .. '7' )+
             cnt11 = 0
-            while True: #loop11
+            while True:  # loop11
                 alt11 = 2
                 LA11_0 = self.input.LA(1)
 
-                if ((u'0' <= LA11_0 <= u'7')) :
+                if ((u'0' <= LA11_0 <= u'7')):
                     alt11 = 1
 
-
                 if alt11 == 1:
                     # C.g:609:22: '0' .. '7'
                     self.matchRange(u'0', u'7')
 
-
-
                 else:
                     if cnt11 >= 1:
-                        break #loop11
+                        break  # loop11
 
                     eee = EarlyExitException(11, self.input)
                     raise eee
 
                 cnt11 += 1
 
-
             # C.g:609:33: ( IntegerTypeSuffix )?
             alt12 = 2
             LA12_0 = self.input.LA(1)
 
-            if (LA12_0 == u'L' or LA12_0 == u'U' or LA12_0 == u'l' or LA12_0 == u'u') :
+            if (LA12_0 == u'L' or LA12_0 == u'U' or LA12_0 == u'l' or LA12_0 == u'u'):
                 alt12 = 1
             if alt12 == 1:
                 # C.g:609:33: IntegerTypeSuffix
                 self.mIntegerTypeSuffix()
 
-
-
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end OCTAL_LITERAL
 
-
-
     # $ANTLR start HexDigit
+
     def mHexDigit(self, ):
 
         try:
             # C.g:612:10: ( ( '0' .. '9' | 'a' .. 'f' | 'A' .. 'F' ) )
             # C.g:612:12: ( '0' .. '9' | 'a' .. 'f' | 'A' .. 'F' )
             if (u'0' <= self.input.LA(1) <= u'9') or (u'A' <= self.input.LA(1) <= u'F') or (u'a' <= self.input.LA(1) <= u'f'):
-                self.input.consume();
+                self.input.consume()
 
             else:
                 mse = MismatchedSetException(None, self.input)
                 self.recover(mse)
                 raise mse
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end HexDigit
 
-
-
     # $ANTLR start IntegerTypeSuffix
+
     def mIntegerTypeSuffix(self, ):
 
         try:
@@ -2746,114 +2117,98 @@ class CLexer(Lexer):
             alt13 = 4
             LA13_0 = self.input.LA(1)
 
-            if (LA13_0 == u'U' or LA13_0 == u'u') :
+            if (LA13_0 == u'U' or LA13_0 == u'u'):
                 LA13_1 = self.input.LA(2)
 
-                if (LA13_1 == u'L' or LA13_1 == u'l') :
+                if (LA13_1 == u'L' or LA13_1 == u'l'):
                     LA13_3 = self.input.LA(3)
 
-                    if (LA13_3 == u'L' or LA13_3 == u'l') :
+                    if (LA13_3 == u'L' or LA13_3 == u'l'):
                         alt13 = 4
                     else:
                         alt13 = 3
                 else:
                     alt13 = 1
-            elif (LA13_0 == u'L' or LA13_0 == u'l') :
+            elif (LA13_0 == u'L' or LA13_0 == u'l'):
                 alt13 = 2
             else:
-                nvae = NoViableAltException("614:1: fragment IntegerTypeSuffix : ( ( 'u' | 'U' ) | ( 'l' | 'L' ) | ( 'u' | 'U' ) ( 'l' | 'L' ) | ( 'u' | 'U' ) ( 'l' | 'L' ) ( 'l' | 'L' ) );", 13, 0, self.input)
+                nvae = NoViableAltException(
+                    "614:1: fragment IntegerTypeSuffix : ( ( 'u' | 'U' ) | ( 'l' | 'L' ) | ( 'u' | 'U' ) ( 'l' | 'L' ) | ( 'u' | 'U' ) ( 'l' | 'L' ) ( 'l' | 'L' ) );", 13, 0, self.input)
 
                 raise nvae
 
             if alt13 == 1:
                 # C.g:616:4: ( 'u' | 'U' )
                 if self.input.LA(1) == u'U' or self.input.LA(1) == u'u':
-                    self.input.consume();
+                    self.input.consume()
 
                 else:
                     mse = MismatchedSetException(None, self.input)
                     self.recover(mse)
                     raise mse
 
-
-
-
             elif alt13 == 2:
                 # C.g:617:4: ( 'l' | 'L' )
                 if self.input.LA(1) == u'L' or self.input.LA(1) == u'l':
-                    self.input.consume();
+                    self.input.consume()
 
                 else:
                     mse = MismatchedSetException(None, self.input)
                     self.recover(mse)
                     raise mse
 
-
-
-
             elif alt13 == 3:
                 # C.g:618:4: ( 'u' | 'U' ) ( 'l' | 'L' )
                 if self.input.LA(1) == u'U' or self.input.LA(1) == u'u':
-                    self.input.consume();
+                    self.input.consume()
 
                 else:
                     mse = MismatchedSetException(None, self.input)
                     self.recover(mse)
                     raise mse
 
-
                 if self.input.LA(1) == u'L' or self.input.LA(1) == u'l':
-                    self.input.consume();
+                    self.input.consume()
 
                 else:
                     mse = MismatchedSetException(None, self.input)
                     self.recover(mse)
                     raise mse
 
-
-
-
             elif alt13 == 4:
                 # C.g:619:4: ( 'u' | 'U' ) ( 'l' | 'L' ) ( 'l' | 'L' )
                 if self.input.LA(1) == u'U' or self.input.LA(1) == u'u':
-                    self.input.consume();
+                    self.input.consume()
 
                 else:
                     mse = MismatchedSetException(None, self.input)
                     self.recover(mse)
                     raise mse
 
-
                 if self.input.LA(1) == u'L' or self.input.LA(1) == u'l':
-                    self.input.consume();
+                    self.input.consume()
 
                 else:
                     mse = MismatchedSetException(None, self.input)
                     self.recover(mse)
                     raise mse
 
-
                 if self.input.LA(1) == u'L' or self.input.LA(1) == u'l':
-                    self.input.consume();
+                    self.input.consume()
 
                 else:
                     mse = MismatchedSetException(None, self.input)
                     self.recover(mse)
                     raise mse
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end IntegerTypeSuffix
 
-
-
     # $ANTLR start FLOATING_POINT_LITERAL
+
     def mFLOATING_POINT_LITERAL(self, ):
 
         try:
@@ -2866,337 +2221,269 @@ class CLexer(Lexer):
                 # C.g:623:9: ( '0' .. '9' )+ '.' ( '0' .. '9' )* ( Exponent )? ( FloatTypeSuffix )?
                 # C.g:623:9: ( '0' .. '9' )+
                 cnt14 = 0
-                while True: #loop14
+                while True:  # loop14
                     alt14 = 2
                     LA14_0 = self.input.LA(1)
 
-                    if ((u'0' <= LA14_0 <= u'9')) :
+                    if ((u'0' <= LA14_0 <= u'9')):
                         alt14 = 1
 
-
                     if alt14 == 1:
                         # C.g:623:10: '0' .. '9'
                         self.matchRange(u'0', u'9')
 
-
-
                     else:
                         if cnt14 >= 1:
-                            break #loop14
+                            break  # loop14
 
                         eee = EarlyExitException(14, self.input)
                         raise eee
 
                     cnt14 += 1
 
-
                 self.match(u'.')
 
                 # C.g:623:25: ( '0' .. '9' )*
-                while True: #loop15
+                while True:  # loop15
                     alt15 = 2
                     LA15_0 = self.input.LA(1)
 
-                    if ((u'0' <= LA15_0 <= u'9')) :
+                    if ((u'0' <= LA15_0 <= u'9')):
                         alt15 = 1
 
-
                     if alt15 == 1:
                         # C.g:623:26: '0' .. '9'
                         self.matchRange(u'0', u'9')
 
-
-
                     else:
-                        break #loop15
-
+                        break  # loop15
 
                 # C.g:623:37: ( Exponent )?
                 alt16 = 2
                 LA16_0 = self.input.LA(1)
 
-                if (LA16_0 == u'E' or LA16_0 == u'e') :
+                if (LA16_0 == u'E' or LA16_0 == u'e'):
                     alt16 = 1
                 if alt16 == 1:
                     # C.g:623:37: Exponent
                     self.mExponent()
 
-
-
-
                 # C.g:623:47: ( FloatTypeSuffix )?
                 alt17 = 2
                 LA17_0 = self.input.LA(1)
 
-                if (LA17_0 == u'D' or LA17_0 == u'F' or LA17_0 == u'd' or LA17_0 == u'f') :
+                if (LA17_0 == u'D' or LA17_0 == u'F' or LA17_0 == u'd' or LA17_0 == u'f'):
                     alt17 = 1
                 if alt17 == 1:
                     # C.g:623:47: FloatTypeSuffix
                     self.mFloatTypeSuffix()
 
-
-
-
-
-
             elif alt25 == 2:
                 # C.g:624:9: '.' ( '0' .. '9' )+ ( Exponent )? ( FloatTypeSuffix )?
                 self.match(u'.')
 
                 # C.g:624:13: ( '0' .. '9' )+
                 cnt18 = 0
-                while True: #loop18
+                while True:  # loop18
                     alt18 = 2
                     LA18_0 = self.input.LA(1)
 
-                    if ((u'0' <= LA18_0 <= u'9')) :
+                    if ((u'0' <= LA18_0 <= u'9')):
                         alt18 = 1
 
-
                     if alt18 == 1:
                         # C.g:624:14: '0' .. '9'
                         self.matchRange(u'0', u'9')
 
-
-
                     else:
                         if cnt18 >= 1:
-                            break #loop18
+                            break  # loop18
 
                         eee = EarlyExitException(18, self.input)
                         raise eee
 
                     cnt18 += 1
 
-
                 # C.g:624:25: ( Exponent )?
                 alt19 = 2
                 LA19_0 = self.input.LA(1)
 
-                if (LA19_0 == u'E' or LA19_0 == u'e') :
+                if (LA19_0 == u'E' or LA19_0 == u'e'):
                     alt19 = 1
                 if alt19 == 1:
                     # C.g:624:25: Exponent
                     self.mExponent()
 
-
-
-
                 # C.g:624:35: ( FloatTypeSuffix )?
                 alt20 = 2
                 LA20_0 = self.input.LA(1)
 
-                if (LA20_0 == u'D' or LA20_0 == u'F' or LA20_0 == u'd' or LA20_0 == u'f') :
+                if (LA20_0 == u'D' or LA20_0 == u'F' or LA20_0 == u'd' or LA20_0 == u'f'):
                     alt20 = 1
                 if alt20 == 1:
                     # C.g:624:35: FloatTypeSuffix
                     self.mFloatTypeSuffix()
 
-
-
-
-
-
             elif alt25 == 3:
                 # C.g:625:9: ( '0' .. '9' )+ Exponent ( FloatTypeSuffix )?
                 # C.g:625:9: ( '0' .. '9' )+
                 cnt21 = 0
-                while True: #loop21
+                while True:  # loop21
                     alt21 = 2
                     LA21_0 = self.input.LA(1)
 
-                    if ((u'0' <= LA21_0 <= u'9')) :
+                    if ((u'0' <= LA21_0 <= u'9')):
                         alt21 = 1
 
-
                     if alt21 == 1:
                         # C.g:625:10: '0' .. '9'
                         self.matchRange(u'0', u'9')
 
-
-
                     else:
                         if cnt21 >= 1:
-                            break #loop21
+                            break  # loop21
 
                         eee = EarlyExitException(21, self.input)
                         raise eee
 
                     cnt21 += 1
 
-
                 self.mExponent()
 
                 # C.g:625:30: ( FloatTypeSuffix )?
                 alt22 = 2
                 LA22_0 = self.input.LA(1)
 
-                if (LA22_0 == u'D' or LA22_0 == u'F' or LA22_0 == u'd' or LA22_0 == u'f') :
+                if (LA22_0 == u'D' or LA22_0 == u'F' or LA22_0 == u'd' or LA22_0 == u'f'):
                     alt22 = 1
                 if alt22 == 1:
                     # C.g:625:30: FloatTypeSuffix
                     self.mFloatTypeSuffix()
 
-
-
-
-
-
             elif alt25 == 4:
                 # C.g:626:9: ( '0' .. '9' )+ ( Exponent )? FloatTypeSuffix
                 # C.g:626:9: ( '0' .. '9' )+
                 cnt23 = 0
-                while True: #loop23
+                while True:  # loop23
                     alt23 = 2
                     LA23_0 = self.input.LA(1)
 
-                    if ((u'0' <= LA23_0 <= u'9')) :
+                    if ((u'0' <= LA23_0 <= u'9')):
                         alt23 = 1
 
-
                     if alt23 == 1:
                         # C.g:626:10: '0' .. '9'
                         self.matchRange(u'0', u'9')
 
-
-
                     else:
                         if cnt23 >= 1:
-                            break #loop23
+                            break  # loop23
 
                         eee = EarlyExitException(23, self.input)
                         raise eee
 
                     cnt23 += 1
 
-
                 # C.g:626:21: ( Exponent )?
                 alt24 = 2
                 LA24_0 = self.input.LA(1)
 
-                if (LA24_0 == u'E' or LA24_0 == u'e') :
+                if (LA24_0 == u'E' or LA24_0 == u'e'):
                     alt24 = 1
                 if alt24 == 1:
                     # C.g:626:21: Exponent
                     self.mExponent()
 
-
-
-
                 self.mFloatTypeSuffix()
 
-
-
-
         finally:
 
             pass
 
     # $ANTLR end FLOATING_POINT_LITERAL
 
-
-
     # $ANTLR start Exponent
+
     def mExponent(self, ):
 
         try:
             # C.g:630:10: ( ( 'e' | 'E' ) ( '+' | '-' )? ( '0' .. '9' )+ )
             # C.g:630:12: ( 'e' | 'E' ) ( '+' | '-' )? ( '0' .. '9' )+
             if self.input.LA(1) == u'E' or self.input.LA(1) == u'e':
-                self.input.consume();
+                self.input.consume()
 
             else:
                 mse = MismatchedSetException(None, self.input)
                 self.recover(mse)
                 raise mse
 
-
             # C.g:630:22: ( '+' | '-' )?
             alt26 = 2
             LA26_0 = self.input.LA(1)
 
-            if (LA26_0 == u'+' or LA26_0 == u'-') :
+            if (LA26_0 == u'+' or LA26_0 == u'-'):
                 alt26 = 1
             if alt26 == 1:
                 # C.g:
                 if self.input.LA(1) == u'+' or self.input.LA(1) == u'-':
-                    self.input.consume();
+                    self.input.consume()
 
                 else:
                     mse = MismatchedSetException(None, self.input)
                     self.recover(mse)
                     raise mse
 
-
-
-
-
             # C.g:630:33: ( '0' .. '9' )+
             cnt27 = 0
-            while True: #loop27
+            while True:  # loop27
                 alt27 = 2
                 LA27_0 = self.input.LA(1)
 
-                if ((u'0' <= LA27_0 <= u'9')) :
+                if ((u'0' <= LA27_0 <= u'9')):
                     alt27 = 1
 
-
                 if alt27 == 1:
                     # C.g:630:34: '0' .. '9'
                     self.matchRange(u'0', u'9')
 
-
-
                 else:
                     if cnt27 >= 1:
-                        break #loop27
+                        break  # loop27
 
                     eee = EarlyExitException(27, self.input)
                     raise eee
 
                 cnt27 += 1
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end Exponent
 
-
-
     # $ANTLR start FloatTypeSuffix
+
     def mFloatTypeSuffix(self, ):
 
         try:
             # C.g:633:17: ( ( 'f' | 'F' | 'd' | 'D' ) )
             # C.g:633:19: ( 'f' | 'F' | 'd' | 'D' )
             if self.input.LA(1) == u'D' or self.input.LA(1) == u'F' or self.input.LA(1) == u'd' or self.input.LA(1) == u'f':
-                self.input.consume();
+                self.input.consume()
 
             else:
                 mse = MismatchedSetException(None, self.input)
                 self.recover(mse)
                 raise mse
 
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end FloatTypeSuffix
 
-
-
     # $ANTLR start EscapeSequence
+
     def mEscapeSequence(self, ):
 
         try:
@@ -3204,20 +2491,22 @@ class CLexer(Lexer):
             alt28 = 2
             LA28_0 = self.input.LA(1)
 
-            if (LA28_0 == u'\\') :
+            if (LA28_0 == u'\\'):
                 LA28_1 = self.input.LA(2)
 
-                if (LA28_1 == u'"' or LA28_1 == u'\'' or LA28_1 == u'\\' or LA28_1 == u'b' or LA28_1 == u'f' or LA28_1 == u'n' or LA28_1 == u'r' or LA28_1 == u't') :
+                if (LA28_1 == u'"' or LA28_1 == u'\'' or LA28_1 == u'\\' or LA28_1 == u'b' or LA28_1 == u'f' or LA28_1 == u'n' or LA28_1 == u'r' or LA28_1 == u't'):
                     alt28 = 1
-                elif ((u'0' <= LA28_1 <= u'7')) :
+                elif ((u'0' <= LA28_1 <= u'7')):
                     alt28 = 2
                 else:
-                    nvae = NoViableAltException("635:1: fragment EscapeSequence : ( '\\\\' ( 'b' | 't' | 'n' | 'f' | 'r' | '\\\"' | '\\'' | '\\\\' ) | OctalEscape );", 28, 1, self.input)
+                    nvae = NoViableAltException(
+                        "635:1: fragment EscapeSequence : ( '\\\\' ( 'b' | 't' | 'n' | 'f' | 'r' | '\\\"' | '\\'' | '\\\\' ) | OctalEscape );", 28, 1, self.input)
 
                     raise nvae
 
             else:
-                nvae = NoViableAltException("635:1: fragment EscapeSequence : ( '\\\\' ( 'b' | 't' | 'n' | 'f' | 'r' | '\\\"' | '\\'' | '\\\\' ) | OctalEscape );", 28, 0, self.input)
+                nvae = NoViableAltException(
+                    "635:1: fragment EscapeSequence : ( '\\\\' ( 'b' | 't' | 'n' | 'f' | 'r' | '\\\"' | '\\'' | '\\\\' ) | OctalEscape );", 28, 0, self.input)
 
                 raise nvae
 
@@ -3226,32 +2515,25 @@ class CLexer(Lexer):
                 self.match(u'\\')
 
                 if self.input.LA(1) == u'"' or self.input.LA(1) == u'\'' or self.input.LA(1) == u'\\' or self.input.LA(1) == u'b' or self.input.LA(1) == u'f' or self.input.LA(1) == u'n' or self.input.LA(1) == u'r' or self.input.LA(1) == u't':
-                    self.input.consume();
+                    self.input.consume()
 
                 else:
                     mse = MismatchedSetException(None, self.input)
                     self.recover(mse)
                     raise mse
 
-
-
-
             elif alt28 == 2:
                 # C.g:638:9: OctalEscape
                 self.mOctalEscape()
 
-
-
-
         finally:
 
             pass
 
     # $ANTLR end EscapeSequence
 
-
-
     # $ANTLR start OctalEscape
+
     def mOctalEscape(self, ):
 
         try:
@@ -3259,35 +2541,37 @@ class CLexer(Lexer):
             alt29 = 3
             LA29_0 = self.input.LA(1)
 
-            if (LA29_0 == u'\\') :
+            if (LA29_0 == u'\\'):
                 LA29_1 = self.input.LA(2)
 
-                if ((u'0' <= LA29_1 <= u'3')) :
+                if ((u'0' <= LA29_1 <= u'3')):
                     LA29_2 = self.input.LA(3)
 
-                    if ((u'0' <= LA29_2 <= u'7')) :
+                    if ((u'0' <= LA29_2 <= u'7')):
                         LA29_4 = self.input.LA(4)
 
-                        if ((u'0' <= LA29_4 <= u'7')) :
+                        if ((u'0' <= LA29_4 <= u'7')):
                             alt29 = 1
                         else:
                             alt29 = 2
                     else:
                         alt29 = 3
-                elif ((u'4' <= LA29_1 <= u'7')) :
+                elif ((u'4' <= LA29_1 <= u'7')):
                     LA29_3 = self.input.LA(3)
 
-                    if ((u'0' <= LA29_3 <= u'7')) :
+                    if ((u'0' <= LA29_3 <= u'7')):
                         alt29 = 2
                     else:
                         alt29 = 3
                 else:
-                    nvae = NoViableAltException("641:1: fragment OctalEscape : ( '\\\\' ( '0' .. '3' ) ( '0' .. '7' ) ( '0' .. '7' ) | '\\\\' ( '0' .. '7' ) ( '0' .. '7' ) | '\\\\' ( '0' .. '7' ) );", 29, 1, self.input)
+                    nvae = NoViableAltException(
+                        "641:1: fragment OctalEscape : ( '\\\\' ( '0' .. '3' ) ( '0' .. '7' ) ( '0' .. '7' ) | '\\\\' ( '0' .. '7' ) ( '0' .. '7' ) | '\\\\' ( '0' .. '7' ) );", 29, 1, self.input)
 
                     raise nvae
 
             else:
-                nvae = NoViableAltException("641:1: fragment OctalEscape : ( '\\\\' ( '0' .. '3' ) ( '0' .. '7' ) ( '0' .. '7' ) | '\\\\' ( '0' .. '7' ) ( '0' .. '7' ) | '\\\\' ( '0' .. '7' ) );", 29, 0, self.input)
+                nvae = NoViableAltException(
+                    "641:1: fragment OctalEscape : ( '\\\\' ( '0' .. '3' ) ( '0' .. '7' ) ( '0' .. '7' ) | '\\\\' ( '0' .. '7' ) ( '0' .. '7' ) | '\\\\' ( '0' .. '7' ) );", 29, 0, self.input)
 
                 raise nvae
 
@@ -3299,25 +2583,14 @@ class CLexer(Lexer):
                 # C.g:643:15: '0' .. '3'
                 self.matchRange(u'0', u'3')
 
-
-
-
                 # C.g:643:25: ( '0' .. '7' )
                 # C.g:643:26: '0' .. '7'
                 self.matchRange(u'0', u'7')
 
-
-
-
                 # C.g:643:36: ( '0' .. '7' )
                 # C.g:643:37: '0' .. '7'
                 self.matchRange(u'0', u'7')
 
-
-
-
-
-
             elif alt29 == 2:
                 # C.g:644:9: '\\\\' ( '0' .. '7' ) ( '0' .. '7' )
                 self.match(u'\\')
@@ -3326,18 +2599,10 @@ class CLexer(Lexer):
                 # C.g:644:15: '0' .. '7'
                 self.matchRange(u'0', u'7')
 
-
-
-
                 # C.g:644:25: ( '0' .. '7' )
                 # C.g:644:26: '0' .. '7'
                 self.matchRange(u'0', u'7')
 
-
-
-
-
-
             elif alt29 == 3:
                 # C.g:645:9: '\\\\' ( '0' .. '7' )
                 self.match(u'\\')
@@ -3346,21 +2611,14 @@ class CLexer(Lexer):
                 # C.g:645:15: '0' .. '7'
                 self.matchRange(u'0', u'7')
 
-
-
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end OctalEscape
 
-
-
     # $ANTLR start UnicodeEscape
+
     def mUnicodeEscape(self, ):
 
         try:
@@ -3378,19 +2636,14 @@ class CLexer(Lexer):
 
             self.mHexDigit()
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end UnicodeEscape
 
-
-
     # $ANTLR start WS
+
     def mWS(self, ):
 
         try:
@@ -3399,20 +2652,16 @@ class CLexer(Lexer):
             # C.g:653:5: ( ( ' ' | '\\r' | '\\t' | '\\u000C' | '\\n' ) )
             # C.g:653:8: ( ' ' | '\\r' | '\\t' | '\\u000C' | '\\n' )
             if (u'\t' <= self.input.LA(1) <= u'\n') or (u'\f' <= self.input.LA(1) <= u'\r') or self.input.LA(1) == u' ':
-                self.input.consume();
+                self.input.consume()
 
             else:
                 mse = MismatchedSetException(None, self.input)
                 self.recover(mse)
                 raise mse
 
-
-            #action start
-            self.channel=HIDDEN;
-            #action end
-
-
-
+            # action start
+            self.channel = HIDDEN
+            # action end
 
         finally:
 
@@ -3420,9 +2669,8 @@ class CLexer(Lexer):
 
     # $ANTLR end WS
 
-
-
     # $ANTLR start BS
+
     def mBS(self, ):
 
         try:
@@ -3434,15 +2682,9 @@ class CLexer(Lexer):
             # C.g:657:8: '\\\\'
             self.match(u'\\')
 
-
-
-
-            #action start
-            self.channel=HIDDEN;
-            #action end
-
-
-
+            # action start
+            self.channel = HIDDEN
+            # action end
 
         finally:
 
@@ -3450,9 +2692,8 @@ class CLexer(Lexer):
 
     # $ANTLR end BS
 
-
-
     # $ANTLR start UnicodeVocabulary
+
     def mUnicodeVocabulary(self, ):
 
         try:
@@ -3462,19 +2703,14 @@ class CLexer(Lexer):
             # C.g:665:7: '\\u0003' .. '\\uFFFE'
             self.matchRange(u'\u0003', u'\uFFFE')
 
-
-
-
-
         finally:
 
             pass
 
     # $ANTLR end UnicodeVocabulary
 
-
-
     # $ANTLR start COMMENT
+
     def mCOMMENT(self, ):
 
         try:
@@ -3484,44 +2720,34 @@ class CLexer(Lexer):
             # C.g:668:9: '/*' ( options {greedy=false; } : . )* '*/'
             self.match("/*")
 
-
             # C.g:668:14: ( options {greedy=false; } : . )*
-            while True: #loop30
+            while True:  # loop30
                 alt30 = 2
                 LA30_0 = self.input.LA(1)
 
-                if (LA30_0 == u'*') :
+                if (LA30_0 == u'*'):
                     LA30_1 = self.input.LA(2)
 
-                    if (LA30_1 == u'/') :
+                    if (LA30_1 == u'/'):
                         alt30 = 2
-                    elif ((u'\u0000' <= LA30_1 <= u'.') or (u'0' <= LA30_1 <= u'\uFFFE')) :
+                    elif ((u'\u0000' <= LA30_1 <= u'.') or (u'0' <= LA30_1 <= u'\uFFFE')):
                         alt30 = 1
 
-
-                elif ((u'\u0000' <= LA30_0 <= u')') or (u'+' <= LA30_0 <= u'\uFFFE')) :
+                elif ((u'\u0000' <= LA30_0 <= u')') or (u'+' <= LA30_0 <= u'\uFFFE')):
                     alt30 = 1
 
-
                 if alt30 == 1:
                     # C.g:668:42: .
                     self.matchAny()
 
-
-
                 else:
-                    break #loop30
-
+                    break  # loop30
 
             self.match("*/")
 
-
-            #action start
-            self.channel=HIDDEN;
-            #action end
-
-
-
+            # action start
+            self.channel = HIDDEN
+            # action end
 
         finally:
 
@@ -3529,9 +2755,8 @@ class CLexer(Lexer):
 
     # $ANTLR end COMMENT
 
-
-
     # $ANTLR start LINE_COMMENT
+
     def mLINE_COMMENT(self, ):
 
         try:
@@ -3541,54 +2766,42 @@ class CLexer(Lexer):
             # C.g:673:7: '//' (~ ( '\\n' | '\\r' ) )* ( '\\r' )? '\\n'
             self.match("//")
 
-
             # C.g:673:12: (~ ( '\\n' | '\\r' ) )*
-            while True: #loop31
+            while True:  # loop31
                 alt31 = 2
                 LA31_0 = self.input.LA(1)
 
-                if ((u'\u0000' <= LA31_0 <= u'\t') or (u'\u000B' <= LA31_0 <= u'\f') or (u'\u000E' <= LA31_0 <= u'\uFFFE')) :
+                if ((u'\u0000' <= LA31_0 <= u'\t') or (u'\u000B' <= LA31_0 <= u'\f') or (u'\u000E' <= LA31_0 <= u'\uFFFE')):
                     alt31 = 1
 
-
                 if alt31 == 1:
                     # C.g:673:12: ~ ( '\\n' | '\\r' )
                     if (u'\u0000' <= self.input.LA(1) <= u'\t') or (u'\u000B' <= self.input.LA(1) <= u'\f') or (u'\u000E' <= self.input.LA(1) <= u'\uFFFE'):
-                        self.input.consume();
+                        self.input.consume()
 
                     else:
                         mse = MismatchedSetException(None, self.input)
                         self.recover(mse)
                         raise mse
 
-
-
-
                 else:
-                    break #loop31
-
+                    break  # loop31
 
             # C.g:673:26: ( '\\r' )?
             alt32 = 2
             LA32_0 = self.input.LA(1)
 
-            if (LA32_0 == u'\r') :
+            if (LA32_0 == u'\r'):
                 alt32 = 1
             if alt32 == 1:
                 # C.g:673:26: '\\r'
                 self.match(u'\r')
 
-
-
-
             self.match(u'\n')
 
-            #action start
-            self.channel=HIDDEN;
-            #action end
-
-
-
+            # action start
+            self.channel = HIDDEN
+            # action end
 
         finally:
 
@@ -3596,9 +2809,8 @@ class CLexer(Lexer):
 
     # $ANTLR end LINE_COMMENT
 
-
-
     # $ANTLR start LINE_COMMAND
+
     def mLINE_COMMAND(self, ):
 
         try:
@@ -3609,52 +2821,41 @@ class CLexer(Lexer):
             self.match(u'#')
 
             # C.g:678:11: (~ ( '\\n' | '\\r' ) )*
-            while True: #loop33
+            while True:  # loop33
                 alt33 = 2
                 LA33_0 = self.input.LA(1)
 
-                if ((u'\u0000' <= LA33_0 <= u'\t') or (u'\u000B' <= LA33_0 <= u'\f') or (u'\u000E' <= LA33_0 <= u'\uFFFE')) :
+                if ((u'\u0000' <= LA33_0 <= u'\t') or (u'\u000B' <= LA33_0 <= u'\f') or (u'\u000E' <= LA33_0 <= u'\uFFFE')):
                     alt33 = 1
 
-
                 if alt33 == 1:
                     # C.g:678:11: ~ ( '\\n' | '\\r' )
                     if (u'\u0000' <= self.input.LA(1) <= u'\t') or (u'\u000B' <= self.input.LA(1) <= u'\f') or (u'\u000E' <= self.input.LA(1) <= u'\uFFFE'):
-                        self.input.consume();
+                        self.input.consume()
 
                     else:
                         mse = MismatchedSetException(None, self.input)
                         self.recover(mse)
                         raise mse
 
-
-
-
                 else:
-                    break #loop33
-
+                    break  # loop33
 
             # C.g:678:25: ( '\\r' )?
             alt34 = 2
             LA34_0 = self.input.LA(1)
 
-            if (LA34_0 == u'\r') :
+            if (LA34_0 == u'\r'):
                 alt34 = 1
             if alt34 == 1:
                 # C.g:678:25: '\\r'
                 self.match(u'\r')
 
-
-
-
             self.match(u'\n')
 
-            #action start
-            self.channel=HIDDEN;
-            #action end
-
-
-
+            # action start
+            self.channel = HIDDEN
+            # action end
 
         finally:
 
@@ -3662,8 +2863,6 @@ class CLexer(Lexer):
 
     # $ANTLR end LINE_COMMAND
 
-
-
     def mTokens(self):
         # C.g:1:8: ( T25 | T26 | T27 | T28 | T29 | T30 | T31 | T32 | T33 | T34 | T35 | T36 | T37 | T38 | T39 | T40 | T41 | T42 | T43 | T44 | T45 | T46 | T47 | T48 | T49 | T50 | T51 | T52 | T53 | T54 | T55 | T56 | T57 | T58 | T59 | T60 | T61 | T62 | T63 | T64 | T65 | T66 | T67 | T68 | T69 | T70 | T71 | T72 | T73 | T74 | T75 | T76 | T77 | T78 | T79 | T80 | T81 | T82 | T83 | T84 | T85 | T86 | T87 | T88 | T89 | T90 | T91 | T92 | T93 | T94 | T95 | T96 | T97 | T98 | T99 | T100 | T101 | T102 | T103 | T104 | T105 | T106 | T107 | T108 | T109 | T110 | T111 | T112 | T113 | T114 | T115 | T116 | T117 | IDENTIFIER | CHARACTER_LITERAL | STRING_LITERAL | HEX_LITERAL | DECIMAL_LITERAL | OCTAL_LITERAL | FLOATING_POINT_LITERAL | WS | BS | UnicodeVocabulary | COMMENT | LINE_COMMENT | LINE_COMMAND )
         alt35 = 106
@@ -3672,681 +2871,463 @@ class CLexer(Lexer):
             # C.g:1:10: T25
             self.mT25()
 
-
-
         elif alt35 == 2:
             # C.g:1:14: T26
             self.mT26()
 
-
-
         elif alt35 == 3:
             # C.g:1:18: T27
             self.mT27()
 
-
-
         elif alt35 == 4:
             # C.g:1:22: T28
             self.mT28()
 
-
-
         elif alt35 == 5:
             # C.g:1:26: T29
             self.mT29()
 
-
-
         elif alt35 == 6:
             # C.g:1:30: T30
             self.mT30()
 
-
-
         elif alt35 == 7:
             # C.g:1:34: T31
             self.mT31()
 
-
-
         elif alt35 == 8:
             # C.g:1:38: T32
             self.mT32()
 
-
-
         elif alt35 == 9:
             # C.g:1:42: T33
             self.mT33()
 
-
-
         elif alt35 == 10:
             # C.g:1:46: T34
             self.mT34()
 
-
-
         elif alt35 == 11:
             # C.g:1:50: T35
             self.mT35()
 
-
-
         elif alt35 == 12:
             # C.g:1:54: T36
             self.mT36()
 
-
-
         elif alt35 == 13:
             # C.g:1:58: T37
             self.mT37()
 
-
-
         elif alt35 == 14:
             # C.g:1:62: T38
             self.mT38()
 
-
-
         elif alt35 == 15:
             # C.g:1:66: T39
             self.mT39()
 
-
-
         elif alt35 == 16:
             # C.g:1:70: T40
             self.mT40()
 
-
-
         elif alt35 == 17:
             # C.g:1:74: T41
             self.mT41()
 
-
-
         elif alt35 == 18:
             # C.g:1:78: T42
             self.mT42()
 
-
-
         elif alt35 == 19:
             # C.g:1:82: T43
             self.mT43()
 
-
-
         elif alt35 == 20:
             # C.g:1:86: T44
             self.mT44()
 
-
-
         elif alt35 == 21:
             # C.g:1:90: T45
             self.mT45()
 
-
-
         elif alt35 == 22:
             # C.g:1:94: T46
             self.mT46()
 
-
-
         elif alt35 == 23:
             # C.g:1:98: T47
             self.mT47()
 
-
-
         elif alt35 == 24:
             # C.g:1:102: T48
             self.mT48()
 
-
-
         elif alt35 == 25:
             # C.g:1:106: T49
             self.mT49()
 
-
-
         elif alt35 == 26:
             # C.g:1:110: T50
             self.mT50()
 
-
-
         elif alt35 == 27:
             # C.g:1:114: T51
             self.mT51()
 
-
-
         elif alt35 == 28:
             # C.g:1:118: T52
             self.mT52()
 
-
-
         elif alt35 == 29:
             # C.g:1:122: T53
             self.mT53()
 
-
-
         elif alt35 == 30:
             # C.g:1:126: T54
             self.mT54()
 
-
-
         elif alt35 == 31:
             # C.g:1:130: T55
             self.mT55()
 
-
-
         elif alt35 == 32:
             # C.g:1:134: T56
             self.mT56()
 
-
-
         elif alt35 == 33:
             # C.g:1:138: T57
             self.mT57()
 
-
-
         elif alt35 == 34:
             # C.g:1:142: T58
             self.mT58()
 
-
-
         elif alt35 == 35:
             # C.g:1:146: T59
             self.mT59()
 
-
-
         elif alt35 == 36:
             # C.g:1:150: T60
             self.mT60()
 
-
-
         elif alt35 == 37:
             # C.g:1:154: T61
             self.mT61()
 
-
-
         elif alt35 == 38:
             # C.g:1:158: T62
             self.mT62()
 
-
-
         elif alt35 == 39:
             # C.g:1:162: T63
             self.mT63()
 
-
-
         elif alt35 == 40:
             # C.g:1:166: T64
             self.mT64()
 
-
-
         elif alt35 == 41:
             # C.g:1:170: T65
             self.mT65()
 
-
-
         elif alt35 == 42:
             # C.g:1:174: T66
             self.mT66()
 
-
-
         elif alt35 == 43:
             # C.g:1:178: T67
             self.mT67()
 
-
-
         elif alt35 == 44:
             # C.g:1:182: T68
             self.mT68()
 
-
-
         elif alt35 == 45:
             # C.g:1:186: T69
             self.mT69()
 
-
-
         elif alt35 == 46:
             # C.g:1:190: T70
             self.mT70()
 
-
-
         elif alt35 == 47:
             # C.g:1:194: T71
             self.mT71()
 
-
-
         elif alt35 == 48:
             # C.g:1:198: T72
             self.mT72()
 
-
-
         elif alt35 == 49:
             # C.g:1:202: T73
             self.mT73()
 
-
-
         elif alt35 == 50:
             # C.g:1:206: T74
             self.mT74()
 
-
-
         elif alt35 == 51:
             # C.g:1:210: T75
             self.mT75()
 
-
-
         elif alt35 == 52:
             # C.g:1:214: T76
             self.mT76()
 
-
-
         elif alt35 == 53:
             # C.g:1:218: T77
             self.mT77()
 
-
-
         elif alt35 == 54:
             # C.g:1:222: T78
             self.mT78()
 
-
-
         elif alt35 == 55:
             # C.g:1:226: T79
             self.mT79()
 
-
-
         elif alt35 == 56:
             # C.g:1:230: T80
             self.mT80()
 
-
-
         elif alt35 == 57:
             # C.g:1:234: T81
             self.mT81()
 
-
-
         elif alt35 == 58:
             # C.g:1:238: T82
             self.mT82()
 
-
-
         elif alt35 == 59:
             # C.g:1:242: T83
             self.mT83()
 
-
-
         elif alt35 == 60:
             # C.g:1:246: T84
             self.mT84()
 
-
-
         elif alt35 == 61:
             # C.g:1:250: T85
             self.mT85()
 
-
-
         elif alt35 == 62:
             # C.g:1:254: T86
             self.mT86()
 
-
-
         elif alt35 == 63:
             # C.g:1:258: T87
             self.mT87()
 
-
-
         elif alt35 == 64:
             # C.g:1:262: T88
             self.mT88()
 
-
-
         elif alt35 == 65:
             # C.g:1:266: T89
             self.mT89()
 
-
-
         elif alt35 == 66:
             # C.g:1:270: T90
             self.mT90()
 
-
-
         elif alt35 == 67:
             # C.g:1:274: T91
             self.mT91()
 
-
-
         elif alt35 == 68:
             # C.g:1:278: T92
             self.mT92()
 
-
-
         elif alt35 == 69:
             # C.g:1:282: T93
             self.mT93()
 
-
-
         elif alt35 == 70:
             # C.g:1:286: T94
             self.mT94()
 
-
-
         elif alt35 == 71:
             # C.g:1:290: T95
             self.mT95()
 
-
-
         elif alt35 == 72:
             # C.g:1:294: T96
             self.mT96()
 
-
-
         elif alt35 == 73:
             # C.g:1:298: T97
             self.mT97()
 
-
-
         elif alt35 == 74:
             # C.g:1:302: T98
             self.mT98()
 
-
-
         elif alt35 == 75:
             # C.g:1:306: T99
             self.mT99()
 
-
-
         elif alt35 == 76:
             # C.g:1:310: T100
             self.mT100()
 
-
-
         elif alt35 == 77:
             # C.g:1:315: T101
             self.mT101()
 
-
-
         elif alt35 == 78:
             # C.g:1:320: T102
             self.mT102()
 
-
-
         elif alt35 == 79:
             # C.g:1:325: T103
             self.mT103()
 
-
-
         elif alt35 == 80:
             # C.g:1:330: T104
             self.mT104()
 
-
-
         elif alt35 == 81:
             # C.g:1:335: T105
             self.mT105()
 
-
-
         elif alt35 == 82:
             # C.g:1:340: T106
             self.mT106()
 
-
-
         elif alt35 == 83:
             # C.g:1:345: T107
             self.mT107()
 
-
-
         elif alt35 == 84:
             # C.g:1:350: T108
             self.mT108()
 
-
-
         elif alt35 == 85:
             # C.g:1:355: T109
             self.mT109()
 
-
-
         elif alt35 == 86:
             # C.g:1:360: T110
             self.mT110()
 
-
-
         elif alt35 == 87:
             # C.g:1:365: T111
             self.mT111()
 
-
-
         elif alt35 == 88:
             # C.g:1:370: T112
             self.mT112()
 
-
-
         elif alt35 == 89:
             # C.g:1:375: T113
             self.mT113()
 
-
-
         elif alt35 == 90:
             # C.g:1:380: T114
             self.mT114()
 
-
-
         elif alt35 == 91:
             # C.g:1:385: T115
             self.mT115()
 
-
-
         elif alt35 == 92:
             # C.g:1:390: T116
             self.mT116()
 
-
-
         elif alt35 == 93:
             # C.g:1:395: T117
             self.mT117()
 
-
-
         elif alt35 == 94:
             # C.g:1:400: IDENTIFIER
             self.mIDENTIFIER()
 
-
-
         elif alt35 == 95:
             # C.g:1:411: CHARACTER_LITERAL
             self.mCHARACTER_LITERAL()
 
-
-
         elif alt35 == 96:
             # C.g:1:429: STRING_LITERAL
             self.mSTRING_LITERAL()
 
-
-
         elif alt35 == 97:
             # C.g:1:444: HEX_LITERAL
             self.mHEX_LITERAL()
 
-
-
         elif alt35 == 98:
             # C.g:1:456: DECIMAL_LITERAL
             self.mDECIMAL_LITERAL()
 
-
-
         elif alt35 == 99:
             # C.g:1:472: OCTAL_LITERAL
             self.mOCTAL_LITERAL()
 
-
-
         elif alt35 == 100:
             # C.g:1:486: FLOATING_POINT_LITERAL
             self.mFLOATING_POINT_LITERAL()
 
-
-
         elif alt35 == 101:
             # C.g:1:509: WS
             self.mWS()
 
-
-
         elif alt35 == 102:
             # C.g:1:512: BS
             self.mBS()
 
-
-
         elif alt35 == 103:
             # C.g:1:515: UnicodeVocabulary
             self.mUnicodeVocabulary()
 
-
-
         elif alt35 == 104:
             # C.g:1:533: COMMENT
             self.mCOMMENT()
 
-
-
         elif alt35 == 105:
             # C.g:1:541: LINE_COMMENT
             self.mLINE_COMMENT()
 
-
-
         elif alt35 == 106:
             # C.g:1:554: LINE_COMMAND
             self.mLINE_COMMAND()
 
-
-
-
-
-
-
-
     # lookup tables for DFA #25
 
     DFA25_eot = DFA.unpack(
         u"\7\uffff\1\10\2\uffff"
-        )
+    )
 
     DFA25_eof = DFA.unpack(
         u"\12\uffff"
-        )
+    )
 
     DFA25_min = DFA.unpack(
         u"\2\56\2\uffff\1\53\1\uffff\2\60\2\uffff"
-        )
+    )
 
     DFA25_max = DFA.unpack(
         u"\1\71\1\146\2\uffff\1\71\1\uffff\1\71\1\146\2\uffff"
-        )
+    )
 
     DFA25_accept = DFA.unpack(
         u"\2\uffff\1\2\1\1\1\uffff\1\4\2\uffff\2\3"
-        )
+    )
 
     DFA25_special = DFA.unpack(
         u"\12\uffff"
-        )
-
+    )
 
     DFA25_transition = [
         DFA.unpack(u"\1\2\1\uffff\12\1"),
         DFA.unpack(u"\1\3\1\uffff\12\1\12\uffff\1\5\1\4\1\5\35\uffff\1\5"
-        u"\1\4\1\5"),
+                   u"\1\4\1\5"),
         DFA.unpack(u""),
         DFA.unpack(u""),
         DFA.unpack(u"\1\6\1\uffff\1\6\2\uffff\12\7"),
         DFA.unpack(u""),
         DFA.unpack(u"\12\7"),
         DFA.unpack(u"\12\7\12\uffff\1\11\1\uffff\1\11\35\uffff\1\11\1\uffff"
-        u"\1\11"),
+                   u"\1\11"),
         DFA.unpack(u""),
         DFA.unpack(u"")
     ]
@@ -4376,11 +3357,11 @@ class CLexer(Lexer):
         u"\uffff\1\u0164\1\u0165\1\76\1\u0167\3\76\6\uffff\1\u016b\1\uffff"
         u"\3\76\1\uffff\21\76\1\u0180\2\76\1\uffff\3\76\1\u0186\1\76\1\uffff"
         u"\11\76\1\u0191\1\uffff"
-        )
+    )
 
     DFA35_eof = DFA.unpack(
         u"\u0192\uffff"
-        )
+    )
 
     DFA35_min = DFA.unpack(
         u"\1\3\1\uffff\1\171\1\uffff\1\75\1\154\1\150\1\165\1\145\1\124\1"
@@ -4413,7 +3394,7 @@ class CLexer(Lexer):
         u"\1\111\1\137\1\122\1\103\1\111\1\126\1\105\1\106\1\111\1\44\1\137"
         u"\1\103\1\uffff\1\125\1\105\1\116\1\44\1\122\1\uffff\1\105\1\106"
         u"\1\105\1\122\1\105\1\116\1\103\1\105\1\104\1\44\1\uffff"
-        )
+    )
 
     DFA35_max = DFA.unpack(
         u"\1\ufffe\1\uffff\1\171\1\uffff\1\75\1\170\1\167\1\165\1\145\1\124"
@@ -4447,7 +3428,7 @@ class CLexer(Lexer):
         u"\1\106\1\111\1\172\1\137\1\103\1\uffff\1\125\1\105\1\116\1\172"
         u"\1\122\1\uffff\1\105\1\106\1\105\1\122\1\105\1\116\1\103\1\105"
         u"\1\104\1\172\1\uffff"
-        )
+    )
 
     DFA35_accept = DFA.unpack(
         u"\1\uffff\1\1\1\uffff\1\3\15\uffff\1\23\1\24\1\27\10\uffff\1\46"
@@ -4467,21 +3448,20 @@ class CLexer(Lexer):
         u"\uffff\1\42\1\45\1\uffff\1\2\3\uffff\1\123\7\uffff\1\117\1\10\1"
         u"\32\1\133\1\22\1\35\1\uffff\1\40\3\uffff\1\37\24\uffff\1\43\5\uffff"
         u"\1\44\12\uffff\1\41"
-        )
+    )
 
     DFA35_special = DFA.unpack(
         u"\u0192\uffff"
-        )
-
+    )
 
     DFA35_transition = [
         DFA.unpack(u"\6\73\2\70\1\73\2\70\22\73\1\70\1\50\1\65\1\72\1\63"
-        u"\1\45\1\46\1\64\1\34\1\35\1\40\1\42\1\3\1\43\1\41\1\44\1\66\11"
-        u"\67\1\23\1\1\1\51\1\4\1\52\1\55\1\73\2\63\1\26\1\63\1\32\1\63\1"
-        u"\31\1\63\1\24\2\63\1\62\2\63\1\25\1\33\2\63\1\11\1\63\1\27\1\30"
-        u"\4\63\1\36\1\71\1\37\1\53\1\56\1\73\1\7\1\61\1\13\1\17\1\5\1\16"
-        u"\1\60\1\63\1\14\2\63\1\15\5\63\1\10\1\6\1\2\1\20\1\12\1\57\3\63"
-        u"\1\21\1\54\1\22\1\47\uff80\73"),
+                   u"\1\45\1\46\1\64\1\34\1\35\1\40\1\42\1\3\1\43\1\41\1\44\1\66\11"
+                   u"\67\1\23\1\1\1\51\1\4\1\52\1\55\1\73\2\63\1\26\1\63\1\32\1\63\1"
+                   u"\31\1\63\1\24\2\63\1\62\2\63\1\25\1\33\2\63\1\11\1\63\1\27\1\30"
+                   u"\4\63\1\36\1\71\1\37\1\53\1\56\1\73\1\7\1\61\1\13\1\17\1\5\1\16"
+                   u"\1\60\1\63\1\14\2\63\1\15\5\63\1\10\1\6\1\2\1\20\1\12\1\57\3\63"
+                   u"\1\21\1\54\1\22\1\47\uff80\73"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\75"),
         DFA.unpack(u""),
@@ -4536,7 +3516,7 @@ class CLexer(Lexer):
         DFA.unpack(u"\47\u0092\1\uffff\uffd7\u0092"),
         DFA.unpack(u"\uffff\u0091"),
         DFA.unpack(u"\1\154\1\uffff\10\u0094\2\154\12\uffff\3\154\21\uffff"
-        u"\1\u0093\13\uffff\3\154\21\uffff\1\u0093"),
+                   u"\1\u0093\13\uffff\3\154\21\uffff\1\u0093"),
         DFA.unpack(u"\1\154\1\uffff\12\u0096\12\uffff\3\154\35\uffff\3\154"),
         DFA.unpack(u""),
         DFA.unpack(u""),
@@ -4563,20 +3543,20 @@ class CLexer(Lexer):
         DFA.unpack(u"\1\u00ab"),
         DFA.unpack(u"\1\u00ac"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u00ae"),
         DFA.unpack(u"\1\u00af"),
         DFA.unpack(u"\1\u00b0"),
         DFA.unpack(u"\1\u00b1"),
         DFA.unpack(u"\1\u00b2"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\24\76\1\u00b3\5\76"),
+                   u"\24\76\1\u00b3\5\76"),
         DFA.unpack(u"\1\u00b6\11\uffff\1\u00b5"),
         DFA.unpack(u""),
         DFA.unpack(u""),
         DFA.unpack(u""),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u00b8"),
         DFA.unpack(u"\1\u00b9"),
         DFA.unpack(u"\1\u00ba"),
@@ -4634,7 +3614,7 @@ class CLexer(Lexer):
         DFA.unpack(u""),
         DFA.unpack(u""),
         DFA.unpack(u"\1\154\1\uffff\10\u0094\2\154\12\uffff\3\154\35\uffff"
-        u"\3\154"),
+                   u"\3\154"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\154\1\uffff\12\u0096\12\uffff\3\154\35\uffff\3\154"),
         DFA.unpack(u""),
@@ -4661,10 +3641,10 @@ class CLexer(Lexer):
         DFA.unpack(u"\1\u00dd"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u00df"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u00e1"),
         DFA.unpack(u"\1\u00e2"),
         DFA.unpack(u"\1\u00e3"),
@@ -4674,7 +3654,7 @@ class CLexer(Lexer):
         DFA.unpack(u""),
         DFA.unpack(u"\1\u00e6"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u00e8"),
         DFA.unpack(u"\1\u00e9"),
         DFA.unpack(u"\1\u00ea"),
@@ -4693,10 +3673,10 @@ class CLexer(Lexer):
         DFA.unpack(u""),
         DFA.unpack(u"\1\u00f4"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u00f6"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u00f8"),
         DFA.unpack(u"\1\u00f9"),
         DFA.unpack(u"\1\u00fa"),
@@ -4704,22 +3684,22 @@ class CLexer(Lexer):
         DFA.unpack(u"\1\u00fc"),
         DFA.unpack(u"\1\u00fd"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u00ff"),
         DFA.unpack(u"\1\u0100"),
         DFA.unpack(u"\1\u0101"),
         DFA.unpack(u"\1\u0102"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u0105"),
         DFA.unpack(u"\1\u0106"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\u0109"),
         DFA.unpack(u"\1\u010a"),
@@ -4737,10 +3717,10 @@ class CLexer(Lexer):
         DFA.unpack(u"\1\u0116"),
         DFA.unpack(u"\1\u0117"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u0119"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u011b"),
         DFA.unpack(u"\1\u011c"),
         DFA.unpack(u""),
@@ -4752,7 +3732,7 @@ class CLexer(Lexer):
         DFA.unpack(u"\1\u0121"),
         DFA.unpack(u"\1\u0122"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\u0124"),
         DFA.unpack(u"\1\u0125"),
@@ -4762,19 +3742,19 @@ class CLexer(Lexer):
         DFA.unpack(u""),
         DFA.unpack(u"\1\u0128"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u""),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u012b"),
         DFA.unpack(u"\1\u012c"),
         DFA.unpack(u"\1\u012d"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u012f"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u0131"),
         DFA.unpack(u"\1\u0132"),
         DFA.unpack(u"\1\u0133"),
@@ -4783,39 +3763,39 @@ class CLexer(Lexer):
         DFA.unpack(u"\1\u0136"),
         DFA.unpack(u"\1\u0137"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\u0138\1"
-        u"\uffff\32\76"),
+                   u"\uffff\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u013c"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\u0143"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u0146"),
         DFA.unpack(u"\1\u0147"),
         DFA.unpack(u""),
         DFA.unpack(u""),
         DFA.unpack(u"\1\u0148"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u014a"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\u014b"),
@@ -4826,15 +3806,15 @@ class CLexer(Lexer):
         DFA.unpack(u"\1\u014f"),
         DFA.unpack(u"\1\u0150"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u0153"),
         DFA.unpack(u""),
         DFA.unpack(u""),
         DFA.unpack(u""),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u""),
         DFA.unpack(u""),
@@ -4847,7 +3827,7 @@ class CLexer(Lexer):
         DFA.unpack(u"\1\u0156"),
         DFA.unpack(u"\1\u0157"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\u0159"),
         DFA.unpack(u"\1\u015a"),
@@ -4859,22 +3839,22 @@ class CLexer(Lexer):
         DFA.unpack(u""),
         DFA.unpack(u""),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u0166"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u0168"),
         DFA.unpack(u"\1\u0169"),
         DFA.unpack(u"\1\u016a"),
@@ -4885,7 +3865,7 @@ class CLexer(Lexer):
         DFA.unpack(u""),
         DFA.unpack(u""),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\u016c"),
         DFA.unpack(u"\1\u016d"),
@@ -4909,7 +3889,7 @@ class CLexer(Lexer):
         DFA.unpack(u"\1\u017e"),
         DFA.unpack(u"\1\u017f"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u0181"),
         DFA.unpack(u"\1\u0182"),
         DFA.unpack(u""),
@@ -4917,7 +3897,7 @@ class CLexer(Lexer):
         DFA.unpack(u"\1\u0184"),
         DFA.unpack(u"\1\u0185"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"\1\u0187"),
         DFA.unpack(u""),
         DFA.unpack(u"\1\u0188"),
@@ -4930,12 +3910,10 @@ class CLexer(Lexer):
         DFA.unpack(u"\1\u018f"),
         DFA.unpack(u"\1\u0190"),
         DFA.unpack(u"\1\76\13\uffff\12\76\7\uffff\32\76\4\uffff\1\76\1\uffff"
-        u"\32\76"),
+                   u"\32\76"),
         DFA.unpack(u"")
     ]
 
     # class definition for DFA #35
 
     DFA35 = DFA
-
-
diff --git a/BaseTools/Source/Python/Eot/CParser3/CParser.py b/BaseTools/Source/Python/Eot/CParser3/CParser.py
index 42bb4d2a1fef..aaaa9f9b585e 100644
--- a/BaseTools/Source/Python/Eot/CParser3/CParser.py
+++ b/BaseTools/Source/Python/Eot/CParser3/CParser.py
@@ -5,7 +5,7 @@ from __future__ import absolute_import
 from antlr3 import *
 from antlr3.compat import set, frozenset
 
-## @file
+# @file
 # The file defines the parser for C source files.
 #
 # THIS FILE IS AUTO-GENERATED. PLEASE DO NOT MODIFY THIS FILE.
@@ -22,33 +22,32 @@ from . import CodeFragment
 from . import FileProfile
 
 
-
 # for convenience in actions
 HIDDEN = BaseRecognizer.HIDDEN
 
 # token types
-BS=20
-LINE_COMMENT=23
-FloatTypeSuffix=16
-IntegerTypeSuffix=14
-LETTER=11
-OCTAL_LITERAL=6
-CHARACTER_LITERAL=8
-Exponent=15
-EOF=-1
-HexDigit=13
-STRING_LITERAL=9
-WS=19
-FLOATING_POINT_LITERAL=10
-IDENTIFIER=4
-UnicodeEscape=18
-LINE_COMMAND=24
-UnicodeVocabulary=21
-HEX_LITERAL=5
-COMMENT=22
-DECIMAL_LITERAL=7
-EscapeSequence=12
-OctalEscape=17
+BS = 20
+LINE_COMMENT = 23
+FloatTypeSuffix = 16
+IntegerTypeSuffix = 14
+LETTER = 11
+OCTAL_LITERAL = 6
+CHARACTER_LITERAL = 8
+Exponent = 15
+EOF = -1
+HexDigit = 13
+STRING_LITERAL = 9
+WS = 19
+FLOATING_POINT_LITERAL = 10
+IDENTIFIER = 4
+UnicodeEscape = 18
+LINE_COMMAND = 24
+UnicodeVocabulary = 21
+HEX_LITERAL = 5
+COMMENT = 22
+DECIMAL_LITERAL = 7
+EscapeSequence = 12
+OctalEscape = 17
 
 # token names
 tokenNames = [
@@ -81,6 +80,8 @@ class function_definition_scope(object):
         self.LBOffset = None
         self.DeclLine = None
         self.DeclOffset = None
+
+
 class postfix_expression_scope(object):
     def __init__(self):
         self.FuncCallText = None
@@ -98,41 +99,46 @@ class CParser(Parser):
         self.postfix_expression_stack = []
 
     def printTokenInfo(self, line, offset, tokenText):
-        print(str(line)+ ',' + str(offset) + ':' + str(tokenText))
+        print(str(line) + ',' + str(offset) + ':' + str(tokenText))
 
     def StorePredicateExpression(self, StartLine, StartOffset, EndLine, EndOffset, Text):
-      PredExp = CodeFragment.PredicateExpression(Text, (StartLine, StartOffset), (EndLine, EndOffset))
-      FileProfile.PredicateExpressionList.append(PredExp)
+        PredExp = CodeFragment.PredicateExpression(
+            Text, (StartLine, StartOffset), (EndLine, EndOffset))
+        FileProfile.PredicateExpressionList.append(PredExp)
 
     def StoreEnumerationDefinition(self, StartLine, StartOffset, EndLine, EndOffset, Text):
-      EnumDef = CodeFragment.EnumerationDefinition(Text, (StartLine, StartOffset), (EndLine, EndOffset))
-      FileProfile.EnumerationDefinitionList.append(EnumDef)
+        EnumDef = CodeFragment.EnumerationDefinition(
+            Text, (StartLine, StartOffset), (EndLine, EndOffset))
+        FileProfile.EnumerationDefinitionList.append(EnumDef)
 
     def StoreStructUnionDefinition(self, StartLine, StartOffset, EndLine, EndOffset, Text):
-      SUDef = CodeFragment.StructUnionDefinition(Text, (StartLine, StartOffset), (EndLine, EndOffset))
-      FileProfile.StructUnionDefinitionList.append(SUDef)
+        SUDef = CodeFragment.StructUnionDefinition(
+            Text, (StartLine, StartOffset), (EndLine, EndOffset))
+        FileProfile.StructUnionDefinitionList.append(SUDef)
 
     def StoreTypedefDefinition(self, StartLine, StartOffset, EndLine, EndOffset, FromText, ToText):
-      Tdef = CodeFragment.TypedefDefinition(FromText, ToText, (StartLine, StartOffset), (EndLine, EndOffset))
-      FileProfile.TypedefDefinitionList.append(Tdef)
+        Tdef = CodeFragment.TypedefDefinition(
+            FromText, ToText, (StartLine, StartOffset), (EndLine, EndOffset))
+        FileProfile.TypedefDefinitionList.append(Tdef)
 
     def StoreFunctionDefinition(self, StartLine, StartOffset, EndLine, EndOffset, ModifierText, DeclText, LeftBraceLine, LeftBraceOffset, DeclLine, DeclOffset):
-      FuncDef = CodeFragment.FunctionDefinition(ModifierText, DeclText, (StartLine, StartOffset), (EndLine, EndOffset), (LeftBraceLine, LeftBraceOffset), (DeclLine, DeclOffset))
-      FileProfile.FunctionDefinitionList.append(FuncDef)
+        FuncDef = CodeFragment.FunctionDefinition(ModifierText, DeclText, (StartLine, StartOffset), (
+            EndLine, EndOffset), (LeftBraceLine, LeftBraceOffset), (DeclLine, DeclOffset))
+        FileProfile.FunctionDefinitionList.append(FuncDef)
 
     def StoreVariableDeclaration(self, StartLine, StartOffset, EndLine, EndOffset, ModifierText, DeclText):
-      VarDecl = CodeFragment.VariableDeclaration(ModifierText, DeclText, (StartLine, StartOffset), (EndLine, EndOffset))
-      FileProfile.VariableDeclarationList.append(VarDecl)
+        VarDecl = CodeFragment.VariableDeclaration(
+            ModifierText, DeclText, (StartLine, StartOffset), (EndLine, EndOffset))
+        FileProfile.VariableDeclarationList.append(VarDecl)
 
     def StoreFunctionCalling(self, StartLine, StartOffset, EndLine, EndOffset, FuncName, ParamList):
-      FuncCall = CodeFragment.FunctionCalling(FuncName, ParamList, (StartLine, StartOffset), (EndLine, EndOffset))
-      FileProfile.FunctionCallingList.append(FuncCall)
-
-
-
+        FuncCall = CodeFragment.FunctionCalling(
+            FuncName, ParamList, (StartLine, StartOffset), (EndLine, EndOffset))
+        FileProfile.FunctionCallingList.append(FuncCall)
 
     # $ANTLR start translation_unit
     # C.g:102:1: translation_unit : ( external_declaration )* ;
+
     def translation_unit(self, ):
 
         translation_unit_StartIndex = self.input.index()
@@ -144,30 +150,24 @@ class CParser(Parser):
                 # C.g:103:2: ( ( external_declaration )* )
                 # C.g:103:4: ( external_declaration )*
                 # C.g:103:4: ( external_declaration )*
-                while True: #loop1
+                while True:  # loop1
                     alt1 = 2
                     LA1_0 = self.input.LA(1)
 
-                    if (LA1_0 == IDENTIFIER or LA1_0 == 26 or (29 <= LA1_0 <= 42) or (45 <= LA1_0 <= 46) or (48 <= LA1_0 <= 62) or LA1_0 == 66) :
+                    if (LA1_0 == IDENTIFIER or LA1_0 == 26 or (29 <= LA1_0 <= 42) or (45 <= LA1_0 <= 46) or (48 <= LA1_0 <= 62) or LA1_0 == 66):
                         alt1 = 1
 
-
                     if alt1 == 1:
                         # C.g:0:0: external_declaration
-                        self.following.append(self.FOLLOW_external_declaration_in_translation_unit74)
+                        self.following.append(
+                            self.FOLLOW_external_declaration_in_translation_unit74)
                         self.external_declaration()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop1
-
-
-
-
-
+                        break  # loop1
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -182,9 +182,9 @@ class CParser(Parser):
 
     # $ANTLR end translation_unit
 
-
     # $ANTLR start external_declaration
     # C.g:114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );
+
     def external_declaration(self, ):
 
         external_declaration_StartIndex = self.input.index()
@@ -197,316 +197,335 @@ class CParser(Parser):
                 alt3 = 3
                 LA3_0 = self.input.LA(1)
 
-                if ((29 <= LA3_0 <= 33)) :
+                if ((29 <= LA3_0 <= 33)):
                     LA3_1 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 1, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 1, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 34) :
+                elif (LA3_0 == 34):
                     LA3_2 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 2, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 2, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 35) :
+                elif (LA3_0 == 35):
                     LA3_3 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 3, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 3, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 36) :
+                elif (LA3_0 == 36):
                     LA3_4 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 4, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 4, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 37) :
+                elif (LA3_0 == 37):
                     LA3_5 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 5, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 5, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 38) :
+                elif (LA3_0 == 38):
                     LA3_6 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 6, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 6, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 39) :
+                elif (LA3_0 == 39):
                     LA3_7 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 7, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 7, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 40) :
+                elif (LA3_0 == 40):
                     LA3_8 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 8, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 8, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 41) :
+                elif (LA3_0 == 41):
                     LA3_9 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 9, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 9, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 42) :
+                elif (LA3_0 == 42):
                     LA3_10 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 10, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 10, self.input)
 
                         raise nvae
 
-                elif ((45 <= LA3_0 <= 46)) :
+                elif ((45 <= LA3_0 <= 46)):
                     LA3_11 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 11, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 11, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 48) :
+                elif (LA3_0 == 48):
                     LA3_12 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 12, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 12, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == IDENTIFIER) :
+                elif (LA3_0 == IDENTIFIER):
                     LA3_13 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
-                    elif (True) :
+                    elif (True):
                         alt3 = 3
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 13, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 13, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 58) :
+                elif (LA3_0 == 58):
                     LA3_14 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 14, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 14, self.input)
 
                         raise nvae
 
                 elif (LA3_0 == 66) and (self.synpred4()):
                     alt3 = 1
-                elif (LA3_0 == 59) :
+                elif (LA3_0 == 59):
                     LA3_16 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 16, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 16, self.input)
 
                         raise nvae
 
-                elif (LA3_0 == 60) :
+                elif (LA3_0 == 60):
                     LA3_17 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 17, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 17, self.input)
 
                         raise nvae
 
-                elif ((49 <= LA3_0 <= 57) or LA3_0 == 61) :
+                elif ((49 <= LA3_0 <= 57) or LA3_0 == 61):
                     LA3_18 = self.input.LA(2)
 
-                    if (self.synpred4()) :
+                    if (self.synpred4()):
                         alt3 = 1
-                    elif (self.synpred5()) :
+                    elif (self.synpred5()):
                         alt3 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 18, self.input)
+                        nvae = NoViableAltException(
+                            "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 18, self.input)
 
                         raise nvae
 
                 elif (LA3_0 == 62) and (self.synpred4()):
                     alt3 = 1
-                elif (LA3_0 == 26) :
+                elif (LA3_0 == 26):
                     alt3 = 2
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 0, self.input)
+                    nvae = NoViableAltException(
+                        "114:1: external_declaration options {k=1; } : ( ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition | declaration | macro_statement ( ';' )? );", 3, 0, self.input)
 
                     raise nvae
 
                 if alt3 == 1:
                     # C.g:119:4: ( ( declaration_specifiers )? declarator ( declaration )* '{' )=> function_definition
-                    self.following.append(self.FOLLOW_function_definition_in_external_declaration113)
+                    self.following.append(
+                        self.FOLLOW_function_definition_in_external_declaration113)
                     self.function_definition()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt3 == 2:
                     # C.g:120:4: declaration
-                    self.following.append(self.FOLLOW_declaration_in_external_declaration118)
+                    self.following.append(
+                        self.FOLLOW_declaration_in_external_declaration118)
                     self.declaration()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt3 == 3:
                     # C.g:121:4: macro_statement ( ';' )?
-                    self.following.append(self.FOLLOW_macro_statement_in_external_declaration123)
+                    self.following.append(
+                        self.FOLLOW_macro_statement_in_external_declaration123)
                     self.macro_statement()
                     self.following.pop()
                     if self.failed:
@@ -515,19 +534,15 @@ class CParser(Parser):
                     alt2 = 2
                     LA2_0 = self.input.LA(1)
 
-                    if (LA2_0 == 25) :
+                    if (LA2_0 == 25):
                         alt2 = 1
                     if alt2 == 1:
                         # C.g:121:21: ';'
-                        self.match(self.input, 25, self.FOLLOW_25_in_external_declaration126)
+                        self.match(self.input, 25,
+                                   self.FOLLOW_25_in_external_declaration126)
                         if self.failed:
                             return
 
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -546,10 +561,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start function_definition
     # C.g:126:1: function_definition : (d= declaration_specifiers )? declarator ( ( declaration )+ a= compound_statement | b= compound_statement ) ;
+
     def function_definition(self, ):
         self.function_definition_stack.append(function_definition_scope())
         retval = self.function_definition_return()
@@ -563,14 +577,12 @@ class CParser(Parser):
 
         declarator1 = None
 
-
-
-        self.function_definition_stack[-1].ModifierText =  ''
-        self.function_definition_stack[-1].DeclText =  ''
-        self.function_definition_stack[-1].LBLine =  0
-        self.function_definition_stack[-1].LBOffset =  0
-        self.function_definition_stack[-1].DeclLine =  0
-        self.function_definition_stack[-1].DeclOffset =  0
+        self.function_definition_stack[-1].ModifierText = ''
+        self.function_definition_stack[-1].DeclText = ''
+        self.function_definition_stack[-1].LBLine = 0
+        self.function_definition_stack[-1].LBOffset = 0
+        self.function_definition_stack[-1].DeclLine = 0
+        self.function_definition_stack[-1].DeclOffset = 0
 
         try:
             try:
@@ -591,119 +603,119 @@ class CParser(Parser):
                     elif LA4 == 58:
                         LA4_21 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 59:
                         LA4_22 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 60:
                         LA4_23 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == IDENTIFIER:
                         LA4_24 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 62:
                         LA4_25 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 29 or LA4 == 30 or LA4 == 31 or LA4 == 32 or LA4 == 33:
                         LA4_26 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 34:
                         LA4_27 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 35:
                         LA4_28 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 36:
                         LA4_29 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 37:
                         LA4_30 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 38:
                         LA4_31 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 39:
                         LA4_32 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 40:
                         LA4_33 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 41:
                         LA4_34 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 42:
                         LA4_35 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 45 or LA4 == 46:
                         LA4_36 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 48:
                         LA4_37 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                     elif LA4 == 49 or LA4 == 50 or LA4 == 51 or LA4 == 52 or LA4 == 53 or LA4 == 54 or LA4 == 55 or LA4 == 56 or LA4 == 57 or LA4 == 61:
                         LA4_38 = self.input.LA(3)
 
-                        if (self.synpred7()) :
+                        if (self.synpred7()):
                             alt4 = 1
                 elif LA4 == 58:
                     LA4_14 = self.input.LA(2)
 
-                    if (self.synpred7()) :
+                    if (self.synpred7()):
                         alt4 = 1
                 elif LA4 == 59:
                     LA4_16 = self.input.LA(2)
 
-                    if (self.synpred7()) :
+                    if (self.synpred7()):
                         alt4 = 1
                 elif LA4 == 60:
                     LA4_17 = self.input.LA(2)
 
-                    if (self.synpred7()) :
+                    if (self.synpred7()):
                         alt4 = 1
                 if alt4 == 1:
                     # C.g:0:0: d= declaration_specifiers
-                    self.following.append(self.FOLLOW_declaration_specifiers_in_function_definition157)
+                    self.following.append(
+                        self.FOLLOW_declaration_specifiers_in_function_definition157)
                     d = self.declaration_specifiers()
                     self.following.pop()
                     if self.failed:
                         return retval
 
-
-
-                self.following.append(self.FOLLOW_declarator_in_function_definition160)
+                self.following.append(
+                    self.FOLLOW_declarator_in_function_definition160)
                 declarator1 = self.declarator()
                 self.following.pop()
                 if self.failed:
@@ -712,16 +724,17 @@ class CParser(Parser):
                 alt6 = 2
                 LA6_0 = self.input.LA(1)
 
-                if (LA6_0 == IDENTIFIER or LA6_0 == 26 or (29 <= LA6_0 <= 42) or (45 <= LA6_0 <= 46) or (48 <= LA6_0 <= 61)) :
+                if (LA6_0 == IDENTIFIER or LA6_0 == 26 or (29 <= LA6_0 <= 42) or (45 <= LA6_0 <= 46) or (48 <= LA6_0 <= 61)):
                     alt6 = 1
-                elif (LA6_0 == 43) :
+                elif (LA6_0 == 43):
                     alt6 = 2
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return retval
 
-                    nvae = NoViableAltException("147:3: ( ( declaration )+ a= compound_statement | b= compound_statement )", 6, 0, self.input)
+                    nvae = NoViableAltException(
+                        "147:3: ( ( declaration )+ a= compound_statement | b= compound_statement )", 6, 0, self.input)
 
                     raise nvae
 
@@ -729,26 +742,25 @@ class CParser(Parser):
                     # C.g:147:5: ( declaration )+ a= compound_statement
                     # C.g:147:5: ( declaration )+
                     cnt5 = 0
-                    while True: #loop5
+                    while True:  # loop5
                         alt5 = 2
                         LA5_0 = self.input.LA(1)
 
-                        if (LA5_0 == IDENTIFIER or LA5_0 == 26 or (29 <= LA5_0 <= 42) or (45 <= LA5_0 <= 46) or (48 <= LA5_0 <= 61)) :
+                        if (LA5_0 == IDENTIFIER or LA5_0 == 26 or (29 <= LA5_0 <= 42) or (45 <= LA5_0 <= 46) or (48 <= LA5_0 <= 61)):
                             alt5 = 1
 
-
                         if alt5 == 1:
                             # C.g:0:0: declaration
-                            self.following.append(self.FOLLOW_declaration_in_function_definition166)
+                            self.following.append(
+                                self.FOLLOW_declaration_in_function_definition166)
                             self.declaration()
                             self.following.pop()
                             if self.failed:
                                 return retval
 
-
                         else:
                             if cnt5 >= 1:
-                                break #loop5
+                                break  # loop5
 
                             if self.backtracking > 0:
                                 self.failed = True
@@ -759,51 +771,46 @@ class CParser(Parser):
 
                         cnt5 += 1
 
-
-                    self.following.append(self.FOLLOW_compound_statement_in_function_definition171)
+                    self.following.append(
+                        self.FOLLOW_compound_statement_in_function_definition171)
                     a = self.compound_statement()
                     self.following.pop()
                     if self.failed:
                         return retval
 
-
                 elif alt6 == 2:
                     # C.g:148:5: b= compound_statement
-                    self.following.append(self.FOLLOW_compound_statement_in_function_definition180)
+                    self.following.append(
+                        self.FOLLOW_compound_statement_in_function_definition180)
                     b = self.compound_statement()
                     self.following.pop()
                     if self.failed:
                         return retval
 
-
-
                 if self.backtracking == 0:
 
                     if d is not None:
-                      self.function_definition_stack[-1].ModifierText = self.input.toString(d.start, d.stop)
+                        self.function_definition_stack[-1].ModifierText = self.input.toString(
+                            d.start, d.stop)
                     else:
-                      self.function_definition_stack[-1].ModifierText = ''
-                    self.function_definition_stack[-1].DeclText = self.input.toString(declarator1.start, declarator1.stop)
+                        self.function_definition_stack[-1].ModifierText = ''
+                    self.function_definition_stack[-1].DeclText = self.input.toString(
+                        declarator1.start, declarator1.stop)
                     self.function_definition_stack[-1].DeclLine = declarator1.start.line
                     self.function_definition_stack[-1].DeclOffset = declarator1.start.charPositionInLine
                     if a is not None:
-                      self.function_definition_stack[-1].LBLine = a.start.line
-                      self.function_definition_stack[-1].LBOffset = a.start.charPositionInLine
+                        self.function_definition_stack[-1].LBLine = a.start.line
+                        self.function_definition_stack[-1].LBOffset = a.start.charPositionInLine
                     else:
-                      self.function_definition_stack[-1].LBLine = b.start.line
-                      self.function_definition_stack[-1].LBOffset = b.start.charPositionInLine
-
-
-
-
+                        self.function_definition_stack[-1].LBLine = b.start.line
+                        self.function_definition_stack[-1].LBOffset = b.start.charPositionInLine
 
                 retval.stop = self.input.LT(-1)
 
                 if self.backtracking == 0:
 
-                    self.StoreFunctionDefinition(retval.start.line, retval.start.charPositionInLine, retval.stop.line, retval.stop.charPositionInLine, self.function_definition_stack[-1].ModifierText, self.function_definition_stack[-1].DeclText, self.function_definition_stack[-1].LBLine, self.function_definition_stack[-1].LBOffset, self.function_definition_stack[-1].DeclLine, self.function_definition_stack[-1].DeclOffset)
-
-
+                    self.StoreFunctionDefinition(retval.start.line, retval.start.charPositionInLine, retval.stop.line, retval.stop.charPositionInLine, self.function_definition_stack[-1].ModifierText, self.function_definition_stack[
+                                                 -1].DeclText, self.function_definition_stack[-1].LBLine, self.function_definition_stack[-1].LBOffset, self.function_definition_stack[-1].DeclLine, self.function_definition_stack[-1].DeclOffset)
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -819,9 +826,9 @@ class CParser(Parser):
 
     # $ANTLR end function_definition
 
-
     # $ANTLR start declaration
     # C.g:166:1: declaration : (a= 'typedef' (b= declaration_specifiers )? c= init_declarator_list d= ';' | s= declaration_specifiers (t= init_declarator_list )? e= ';' );
+
     def declaration(self, ):
 
         declaration_StartIndex = self.input.index()
@@ -836,7 +843,6 @@ class CParser(Parser):
 
         t = None
 
-
         try:
             try:
                 if self.backtracking > 0 and self.alreadyParsedRule(self.input, 4):
@@ -846,23 +852,25 @@ class CParser(Parser):
                 alt9 = 2
                 LA9_0 = self.input.LA(1)
 
-                if (LA9_0 == 26) :
+                if (LA9_0 == 26):
                     alt9 = 1
-                elif (LA9_0 == IDENTIFIER or (29 <= LA9_0 <= 42) or (45 <= LA9_0 <= 46) or (48 <= LA9_0 <= 61)) :
+                elif (LA9_0 == IDENTIFIER or (29 <= LA9_0 <= 42) or (45 <= LA9_0 <= 46) or (48 <= LA9_0 <= 61)):
                     alt9 = 2
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("166:1: declaration : (a= 'typedef' (b= declaration_specifiers )? c= init_declarator_list d= ';' | s= declaration_specifiers (t= init_declarator_list )? e= ';' );", 9, 0, self.input)
+                    nvae = NoViableAltException(
+                        "166:1: declaration : (a= 'typedef' (b= declaration_specifiers )? c= init_declarator_list d= ';' | s= declaration_specifiers (t= init_declarator_list )? e= ';' );", 9, 0, self.input)
 
                     raise nvae
 
                 if alt9 == 1:
                     # C.g:167:4: a= 'typedef' (b= declaration_specifiers )? c= init_declarator_list d= ';'
                     a = self.input.LT(1)
-                    self.match(self.input, 26, self.FOLLOW_26_in_declaration203)
+                    self.match(self.input, 26,
+                               self.FOLLOW_26_in_declaration203)
                     if self.failed:
                         return
                     # C.g:167:17: (b= declaration_specifiers )?
@@ -873,60 +881,61 @@ class CParser(Parser):
                     elif LA7 == IDENTIFIER:
                         LA7_13 = self.input.LA(2)
 
-                        if (LA7_13 == 62) :
+                        if (LA7_13 == 62):
                             LA7_21 = self.input.LA(3)
 
-                            if (self.synpred10()) :
+                            if (self.synpred10()):
                                 alt7 = 1
-                        elif (LA7_13 == IDENTIFIER or (29 <= LA7_13 <= 42) or (45 <= LA7_13 <= 46) or (48 <= LA7_13 <= 61) or LA7_13 == 66) :
+                        elif (LA7_13 == IDENTIFIER or (29 <= LA7_13 <= 42) or (45 <= LA7_13 <= 46) or (48 <= LA7_13 <= 61) or LA7_13 == 66):
                             alt7 = 1
                     elif LA7 == 58:
                         LA7_14 = self.input.LA(2)
 
-                        if (self.synpred10()) :
+                        if (self.synpred10()):
                             alt7 = 1
                     elif LA7 == 59:
                         LA7_16 = self.input.LA(2)
 
-                        if (self.synpred10()) :
+                        if (self.synpred10()):
                             alt7 = 1
                     elif LA7 == 60:
                         LA7_17 = self.input.LA(2)
 
-                        if (self.synpred10()) :
+                        if (self.synpred10()):
                             alt7 = 1
                     if alt7 == 1:
                         # C.g:0:0: b= declaration_specifiers
-                        self.following.append(self.FOLLOW_declaration_specifiers_in_declaration207)
+                        self.following.append(
+                            self.FOLLOW_declaration_specifiers_in_declaration207)
                         b = self.declaration_specifiers()
                         self.following.pop()
                         if self.failed:
                             return
 
-
-
-                    self.following.append(self.FOLLOW_init_declarator_list_in_declaration216)
+                    self.following.append(
+                        self.FOLLOW_init_declarator_list_in_declaration216)
                     c = self.init_declarator_list()
                     self.following.pop()
                     if self.failed:
                         return
                     d = self.input.LT(1)
-                    self.match(self.input, 25, self.FOLLOW_25_in_declaration220)
+                    self.match(self.input, 25,
+                               self.FOLLOW_25_in_declaration220)
                     if self.failed:
                         return
                     if self.backtracking == 0:
 
                         if b is not None:
-                          self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, self.input.toString(b.start, b.stop), self.input.toString(c.start, c.stop))
+                            self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, self.input.toString(
+                                b.start, b.stop), self.input.toString(c.start, c.stop))
                         else:
-                          self.StoreTypedefDefinition(a.line, a.charPositionInLine, d.line, d.charPositionInLine, '', self.input.toString(c.start, c.stop))
-
-
-
+                            self.StoreTypedefDefinition(
+                                a.line, a.charPositionInLine, d.line, d.charPositionInLine, '', self.input.toString(c.start, c.stop))
 
                 elif alt9 == 2:
                     # C.g:175:4: s= declaration_specifiers (t= init_declarator_list )? e= ';'
-                    self.following.append(self.FOLLOW_declaration_specifiers_in_declaration234)
+                    self.following.append(
+                        self.FOLLOW_declaration_specifiers_in_declaration234)
                     s = self.declaration_specifiers()
                     self.following.pop()
                     if self.failed:
@@ -935,30 +944,27 @@ class CParser(Parser):
                     alt8 = 2
                     LA8_0 = self.input.LA(1)
 
-                    if (LA8_0 == IDENTIFIER or (58 <= LA8_0 <= 60) or LA8_0 == 62 or LA8_0 == 66) :
+                    if (LA8_0 == IDENTIFIER or (58 <= LA8_0 <= 60) or LA8_0 == 62 or LA8_0 == 66):
                         alt8 = 1
                     if alt8 == 1:
                         # C.g:0:0: t= init_declarator_list
-                        self.following.append(self.FOLLOW_init_declarator_list_in_declaration238)
+                        self.following.append(
+                            self.FOLLOW_init_declarator_list_in_declaration238)
                         t = self.init_declarator_list()
                         self.following.pop()
                         if self.failed:
                             return
 
-
-
                     e = self.input.LT(1)
-                    self.match(self.input, 25, self.FOLLOW_25_in_declaration243)
+                    self.match(self.input, 25,
+                               self.FOLLOW_25_in_declaration243)
                     if self.failed:
                         return
                     if self.backtracking == 0:
 
                         if t is not None:
-                          self.StoreVariableDeclaration(s.start.line, s.start.charPositionInLine, t.start.line, t.start.charPositionInLine, self.input.toString(s.start, s.stop), self.input.toString(t.start, t.stop))
-
-
-
-
+                            self.StoreVariableDeclaration(s.start.line, s.start.charPositionInLine, t.start.line, t.start.charPositionInLine, self.input.toString(
+                                s.start, s.stop), self.input.toString(t.start, t.stop))
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -978,10 +984,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start declaration_specifiers
     # C.g:182:1: declaration_specifiers : ( storage_class_specifier | type_specifier | type_qualifier )+ ;
+
     def declaration_specifiers(self, ):
 
         retval = self.declaration_specifiers_return()
@@ -996,44 +1001,39 @@ class CParser(Parser):
                 # C.g:183:6: ( storage_class_specifier | type_specifier | type_qualifier )+
                 # C.g:183:6: ( storage_class_specifier | type_specifier | type_qualifier )+
                 cnt10 = 0
-                while True: #loop10
+                while True:  # loop10
                     alt10 = 4
                     LA10 = self.input.LA(1)
                     if LA10 == 58:
                         LA10_2 = self.input.LA(2)
 
-                        if (self.synpred15()) :
+                        if (self.synpred15()):
                             alt10 = 3
 
-
                     elif LA10 == 59:
                         LA10_3 = self.input.LA(2)
 
-                        if (self.synpred15()) :
+                        if (self.synpred15()):
                             alt10 = 3
 
-
                     elif LA10 == 60:
                         LA10_4 = self.input.LA(2)
 
-                        if (self.synpred15()) :
+                        if (self.synpred15()):
                             alt10 = 3
 
-
                     elif LA10 == IDENTIFIER:
                         LA10_5 = self.input.LA(2)
 
-                        if (self.synpred14()) :
+                        if (self.synpred14()):
                             alt10 = 2
 
-
                     elif LA10 == 53:
                         LA10_9 = self.input.LA(2)
 
-                        if (self.synpred15()) :
+                        if (self.synpred15()):
                             alt10 = 3
 
-
                     elif LA10 == 29 or LA10 == 30 or LA10 == 31 or LA10 == 32 or LA10 == 33:
                         alt10 = 1
                     elif LA10 == 34 or LA10 == 35 or LA10 == 36 or LA10 == 37 or LA10 == 38 or LA10 == 39 or LA10 == 40 or LA10 == 41 or LA10 == 42 or LA10 == 45 or LA10 == 46 or LA10 == 48:
@@ -1043,34 +1043,34 @@ class CParser(Parser):
 
                     if alt10 == 1:
                         # C.g:183:10: storage_class_specifier
-                        self.following.append(self.FOLLOW_storage_class_specifier_in_declaration_specifiers264)
+                        self.following.append(
+                            self.FOLLOW_storage_class_specifier_in_declaration_specifiers264)
                         self.storage_class_specifier()
                         self.following.pop()
                         if self.failed:
                             return retval
 
-
                     elif alt10 == 2:
                         # C.g:184:7: type_specifier
-                        self.following.append(self.FOLLOW_type_specifier_in_declaration_specifiers272)
+                        self.following.append(
+                            self.FOLLOW_type_specifier_in_declaration_specifiers272)
                         self.type_specifier()
                         self.following.pop()
                         if self.failed:
                             return retval
 
-
                     elif alt10 == 3:
                         # C.g:185:13: type_qualifier
-                        self.following.append(self.FOLLOW_type_qualifier_in_declaration_specifiers286)
+                        self.following.append(
+                            self.FOLLOW_type_qualifier_in_declaration_specifiers286)
                         self.type_qualifier()
                         self.following.pop()
                         if self.failed:
                             return retval
 
-
                     else:
                         if cnt10 >= 1:
-                            break #loop10
+                            break  # loop10
 
                         if self.backtracking > 0:
                             self.failed = True
@@ -1081,13 +1081,8 @@ class CParser(Parser):
 
                     cnt10 += 1
 
-
-
-
-
                 retval.stop = self.input.LT(-1)
 
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -1106,10 +1101,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start init_declarator_list
     # C.g:189:1: init_declarator_list : init_declarator ( ',' init_declarator )* ;
+
     def init_declarator_list(self, ):
 
         retval = self.init_declarator_list_return()
@@ -1122,42 +1116,38 @@ class CParser(Parser):
 
                 # C.g:190:2: ( init_declarator ( ',' init_declarator )* )
                 # C.g:190:4: init_declarator ( ',' init_declarator )*
-                self.following.append(self.FOLLOW_init_declarator_in_init_declarator_list308)
+                self.following.append(
+                    self.FOLLOW_init_declarator_in_init_declarator_list308)
                 self.init_declarator()
                 self.following.pop()
                 if self.failed:
                     return retval
                 # C.g:190:20: ( ',' init_declarator )*
-                while True: #loop11
+                while True:  # loop11
                     alt11 = 2
                     LA11_0 = self.input.LA(1)
 
-                    if (LA11_0 == 27) :
+                    if (LA11_0 == 27):
                         alt11 = 1
 
-
                     if alt11 == 1:
                         # C.g:190:21: ',' init_declarator
-                        self.match(self.input, 27, self.FOLLOW_27_in_init_declarator_list311)
+                        self.match(self.input, 27,
+                                   self.FOLLOW_27_in_init_declarator_list311)
                         if self.failed:
                             return retval
-                        self.following.append(self.FOLLOW_init_declarator_in_init_declarator_list313)
+                        self.following.append(
+                            self.FOLLOW_init_declarator_in_init_declarator_list313)
                         self.init_declarator()
                         self.following.pop()
                         if self.failed:
                             return retval
 
-
                     else:
-                        break #loop11
-
-
-
-
+                        break  # loop11
 
                 retval.stop = self.input.LT(-1)
 
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -1171,9 +1161,9 @@ class CParser(Parser):
 
     # $ANTLR end init_declarator_list
 
-
     # $ANTLR start init_declarator
     # C.g:193:1: init_declarator : declarator ( '=' initializer )? ;
+
     def init_declarator(self, ):
 
         init_declarator_StartIndex = self.input.index()
@@ -1184,7 +1174,8 @@ class CParser(Parser):
 
                 # C.g:194:2: ( declarator ( '=' initializer )? )
                 # C.g:194:4: declarator ( '=' initializer )?
-                self.following.append(self.FOLLOW_declarator_in_init_declarator326)
+                self.following.append(
+                    self.FOLLOW_declarator_in_init_declarator326)
                 self.declarator()
                 self.following.pop()
                 if self.failed:
@@ -1193,25 +1184,21 @@ class CParser(Parser):
                 alt12 = 2
                 LA12_0 = self.input.LA(1)
 
-                if (LA12_0 == 28) :
+                if (LA12_0 == 28):
                     alt12 = 1
                 if alt12 == 1:
                     # C.g:194:16: '=' initializer
-                    self.match(self.input, 28, self.FOLLOW_28_in_init_declarator329)
+                    self.match(self.input, 28,
+                               self.FOLLOW_28_in_init_declarator329)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_initializer_in_init_declarator331)
+                    self.following.append(
+                        self.FOLLOW_initializer_in_init_declarator331)
                     self.initializer()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -1225,9 +1212,9 @@ class CParser(Parser):
 
     # $ANTLR end init_declarator
 
-
     # $ANTLR start storage_class_specifier
     # C.g:197:1: storage_class_specifier : ( 'extern' | 'static' | 'auto' | 'register' | 'STATIC' );
+
     def storage_class_specifier(self, ):
 
         storage_class_specifier_StartIndex = self.input.index()
@@ -1239,7 +1226,7 @@ class CParser(Parser):
                 # C.g:198:2: ( 'extern' | 'static' | 'auto' | 'register' | 'STATIC' )
                 # C.g:
                 if (29 <= self.input.LA(1) <= 33):
-                    self.input.consume();
+                    self.input.consume()
                     self.errorRecovery = False
                     self.failed = False
 
@@ -1251,14 +1238,9 @@ class CParser(Parser):
                     mse = MismatchedSetException(None, self.input)
                     self.recoverFromMismatchedSet(
                         self.input, mse, self.FOLLOW_set_in_storage_class_specifier0
-                        )
+                    )
                     raise mse
 
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -1272,9 +1254,9 @@ class CParser(Parser):
 
     # $ANTLR end storage_class_specifier
 
-
     # $ANTLR start type_specifier
     # C.g:205:1: type_specifier : ( 'void' | 'char' | 'short' | 'int' | 'long' | 'float' | 'double' | 'signed' | 'unsigned' | s= struct_or_union_specifier | e= enum_specifier | ( IDENTIFIER ( type_qualifier )* declarator )=> type_id );
+
     def type_specifier(self, ):
 
         type_specifier_StartIndex = self.input.index()
@@ -1282,7 +1264,6 @@ class CParser(Parser):
 
         e = None
 
-
         try:
             try:
                 if self.backtracking > 0 and self.alreadyParsedRule(self.input, 9):
@@ -1292,27 +1273,27 @@ class CParser(Parser):
                 alt13 = 12
                 LA13_0 = self.input.LA(1)
 
-                if (LA13_0 == 34) :
+                if (LA13_0 == 34):
                     alt13 = 1
-                elif (LA13_0 == 35) :
+                elif (LA13_0 == 35):
                     alt13 = 2
-                elif (LA13_0 == 36) :
+                elif (LA13_0 == 36):
                     alt13 = 3
-                elif (LA13_0 == 37) :
+                elif (LA13_0 == 37):
                     alt13 = 4
-                elif (LA13_0 == 38) :
+                elif (LA13_0 == 38):
                     alt13 = 5
-                elif (LA13_0 == 39) :
+                elif (LA13_0 == 39):
                     alt13 = 6
-                elif (LA13_0 == 40) :
+                elif (LA13_0 == 40):
                     alt13 = 7
-                elif (LA13_0 == 41) :
+                elif (LA13_0 == 41):
                     alt13 = 8
-                elif (LA13_0 == 42) :
+                elif (LA13_0 == 42):
                     alt13 = 9
-                elif ((45 <= LA13_0 <= 46)) :
+                elif ((45 <= LA13_0 <= 46)):
                     alt13 = 10
-                elif (LA13_0 == 48) :
+                elif (LA13_0 == 48):
                     alt13 = 11
                 elif (LA13_0 == IDENTIFIER) and (self.synpred34()):
                     alt13 = 12
@@ -1321,76 +1302,78 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("205:1: type_specifier : ( 'void' | 'char' | 'short' | 'int' | 'long' | 'float' | 'double' | 'signed' | 'unsigned' | s= struct_or_union_specifier | e= enum_specifier | ( IDENTIFIER ( type_qualifier )* declarator )=> type_id );", 13, 0, self.input)
+                    nvae = NoViableAltException(
+                        "205:1: type_specifier : ( 'void' | 'char' | 'short' | 'int' | 'long' | 'float' | 'double' | 'signed' | 'unsigned' | s= struct_or_union_specifier | e= enum_specifier | ( IDENTIFIER ( type_qualifier )* declarator )=> type_id );", 13, 0, self.input)
 
                     raise nvae
 
                 if alt13 == 1:
                     # C.g:206:4: 'void'
-                    self.match(self.input, 34, self.FOLLOW_34_in_type_specifier376)
+                    self.match(self.input, 34,
+                               self.FOLLOW_34_in_type_specifier376)
                     if self.failed:
                         return
 
-
                 elif alt13 == 2:
                     # C.g:207:4: 'char'
-                    self.match(self.input, 35, self.FOLLOW_35_in_type_specifier381)
+                    self.match(self.input, 35,
+                               self.FOLLOW_35_in_type_specifier381)
                     if self.failed:
                         return
 
-
                 elif alt13 == 3:
                     # C.g:208:4: 'short'
-                    self.match(self.input, 36, self.FOLLOW_36_in_type_specifier386)
+                    self.match(self.input, 36,
+                               self.FOLLOW_36_in_type_specifier386)
                     if self.failed:
                         return
 
-
                 elif alt13 == 4:
                     # C.g:209:4: 'int'
-                    self.match(self.input, 37, self.FOLLOW_37_in_type_specifier391)
+                    self.match(self.input, 37,
+                               self.FOLLOW_37_in_type_specifier391)
                     if self.failed:
                         return
 
-
                 elif alt13 == 5:
                     # C.g:210:4: 'long'
-                    self.match(self.input, 38, self.FOLLOW_38_in_type_specifier396)
+                    self.match(self.input, 38,
+                               self.FOLLOW_38_in_type_specifier396)
                     if self.failed:
                         return
 
-
                 elif alt13 == 6:
                     # C.g:211:4: 'float'
-                    self.match(self.input, 39, self.FOLLOW_39_in_type_specifier401)
+                    self.match(self.input, 39,
+                               self.FOLLOW_39_in_type_specifier401)
                     if self.failed:
                         return
 
-
                 elif alt13 == 7:
                     # C.g:212:4: 'double'
-                    self.match(self.input, 40, self.FOLLOW_40_in_type_specifier406)
+                    self.match(self.input, 40,
+                               self.FOLLOW_40_in_type_specifier406)
                     if self.failed:
                         return
 
-
                 elif alt13 == 8:
                     # C.g:213:4: 'signed'
-                    self.match(self.input, 41, self.FOLLOW_41_in_type_specifier411)
+                    self.match(self.input, 41,
+                               self.FOLLOW_41_in_type_specifier411)
                     if self.failed:
                         return
 
-
                 elif alt13 == 9:
                     # C.g:214:4: 'unsigned'
-                    self.match(self.input, 42, self.FOLLOW_42_in_type_specifier416)
+                    self.match(self.input, 42,
+                               self.FOLLOW_42_in_type_specifier416)
                     if self.failed:
                         return
 
-
                 elif alt13 == 10:
                     # C.g:215:4: s= struct_or_union_specifier
-                    self.following.append(self.FOLLOW_struct_or_union_specifier_in_type_specifier423)
+                    self.following.append(
+                        self.FOLLOW_struct_or_union_specifier_in_type_specifier423)
                     s = self.struct_or_union_specifier()
                     self.following.pop()
                     if self.failed:
@@ -1398,14 +1381,13 @@ class CParser(Parser):
                     if self.backtracking == 0:
 
                         if s.stop is not None:
-                          self.StoreStructUnionDefinition(s.start.line, s.start.charPositionInLine, s.stop.line, s.stop.charPositionInLine, self.input.toString(s.start, s.stop))
-
-
-
+                            self.StoreStructUnionDefinition(
+                                s.start.line, s.start.charPositionInLine, s.stop.line, s.stop.charPositionInLine, self.input.toString(s.start, s.stop))
 
                 elif alt13 == 11:
                     # C.g:220:4: e= enum_specifier
-                    self.following.append(self.FOLLOW_enum_specifier_in_type_specifier433)
+                    self.following.append(
+                        self.FOLLOW_enum_specifier_in_type_specifier433)
                     e = self.enum_specifier()
                     self.following.pop()
                     if self.failed:
@@ -1413,21 +1395,18 @@ class CParser(Parser):
                     if self.backtracking == 0:
 
                         if e.stop is not None:
-                          self.StoreEnumerationDefinition(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
-
-
-
+                            self.StoreEnumerationDefinition(
+                                e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
                 elif alt13 == 12:
                     # C.g:225:4: ( IDENTIFIER ( type_qualifier )* declarator )=> type_id
-                    self.following.append(self.FOLLOW_type_id_in_type_specifier451)
+                    self.following.append(
+                        self.FOLLOW_type_id_in_type_specifier451)
                     self.type_id()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -1441,9 +1420,9 @@ class CParser(Parser):
 
     # $ANTLR end type_specifier
 
-
     # $ANTLR start type_id
     # C.g:228:1: type_id : IDENTIFIER ;
+
     def type_id(self, ):
 
         type_id_StartIndex = self.input.index()
@@ -1454,13 +1433,11 @@ class CParser(Parser):
 
                 # C.g:229:5: ( IDENTIFIER )
                 # C.g:229:9: IDENTIFIER
-                self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_type_id467)
+                self.match(self.input, IDENTIFIER,
+                           self.FOLLOW_IDENTIFIER_in_type_id467)
                 if self.failed:
                     return
 
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -1479,10 +1456,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start struct_or_union_specifier
     # C.g:233:1: struct_or_union_specifier options {k=3; } : ( struct_or_union ( IDENTIFIER )? '{' struct_declaration_list '}' | struct_or_union IDENTIFIER );
+
     def struct_or_union_specifier(self, ):
 
         retval = self.struct_or_union_specifier_return()
@@ -1497,33 +1473,35 @@ class CParser(Parser):
                 alt15 = 2
                 LA15_0 = self.input.LA(1)
 
-                if ((45 <= LA15_0 <= 46)) :
+                if ((45 <= LA15_0 <= 46)):
                     LA15_1 = self.input.LA(2)
 
-                    if (LA15_1 == IDENTIFIER) :
+                    if (LA15_1 == IDENTIFIER):
                         LA15_2 = self.input.LA(3)
 
-                        if (LA15_2 == 43) :
+                        if (LA15_2 == 43):
                             alt15 = 1
-                        elif (LA15_2 == EOF or LA15_2 == IDENTIFIER or LA15_2 == 25 or LA15_2 == 27 or (29 <= LA15_2 <= 42) or (45 <= LA15_2 <= 64) or LA15_2 == 66) :
+                        elif (LA15_2 == EOF or LA15_2 == IDENTIFIER or LA15_2 == 25 or LA15_2 == 27 or (29 <= LA15_2 <= 42) or (45 <= LA15_2 <= 64) or LA15_2 == 66):
                             alt15 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return retval
 
-                            nvae = NoViableAltException("233:1: struct_or_union_specifier options {k=3; } : ( struct_or_union ( IDENTIFIER )? '{' struct_declaration_list '}' | struct_or_union IDENTIFIER );", 15, 2, self.input)
+                            nvae = NoViableAltException(
+                                "233:1: struct_or_union_specifier options {k=3; } : ( struct_or_union ( IDENTIFIER )? '{' struct_declaration_list '}' | struct_or_union IDENTIFIER );", 15, 2, self.input)
 
                             raise nvae
 
-                    elif (LA15_1 == 43) :
+                    elif (LA15_1 == 43):
                         alt15 = 1
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return retval
 
-                        nvae = NoViableAltException("233:1: struct_or_union_specifier options {k=3; } : ( struct_or_union ( IDENTIFIER )? '{' struct_declaration_list '}' | struct_or_union IDENTIFIER );", 15, 1, self.input)
+                        nvae = NoViableAltException(
+                            "233:1: struct_or_union_specifier options {k=3; } : ( struct_or_union ( IDENTIFIER )? '{' struct_declaration_list '}' | struct_or_union IDENTIFIER );", 15, 1, self.input)
 
                         raise nvae
 
@@ -1532,13 +1510,15 @@ class CParser(Parser):
                         self.failed = True
                         return retval
 
-                    nvae = NoViableAltException("233:1: struct_or_union_specifier options {k=3; } : ( struct_or_union ( IDENTIFIER )? '{' struct_declaration_list '}' | struct_or_union IDENTIFIER );", 15, 0, self.input)
+                    nvae = NoViableAltException(
+                        "233:1: struct_or_union_specifier options {k=3; } : ( struct_or_union ( IDENTIFIER )? '{' struct_declaration_list '}' | struct_or_union IDENTIFIER );", 15, 0, self.input)
 
                     raise nvae
 
                 if alt15 == 1:
                     # C.g:235:4: struct_or_union ( IDENTIFIER )? '{' struct_declaration_list '}'
-                    self.following.append(self.FOLLOW_struct_or_union_in_struct_or_union_specifier494)
+                    self.following.append(
+                        self.FOLLOW_struct_or_union_in_struct_or_union_specifier494)
                     self.struct_or_union()
                     self.following.pop()
                     if self.failed:
@@ -1547,50 +1527,52 @@ class CParser(Parser):
                     alt14 = 2
                     LA14_0 = self.input.LA(1)
 
-                    if (LA14_0 == IDENTIFIER) :
+                    if (LA14_0 == IDENTIFIER):
                         alt14 = 1
                     if alt14 == 1:
                         # C.g:0:0: IDENTIFIER
-                        self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_struct_or_union_specifier496)
+                        self.match(
+                            self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_struct_or_union_specifier496)
                         if self.failed:
                             return retval
 
-
-
-                    self.match(self.input, 43, self.FOLLOW_43_in_struct_or_union_specifier499)
+                    self.match(self.input, 43,
+                               self.FOLLOW_43_in_struct_or_union_specifier499)
                     if self.failed:
                         return retval
-                    self.following.append(self.FOLLOW_struct_declaration_list_in_struct_or_union_specifier501)
+                    self.following.append(
+                        self.FOLLOW_struct_declaration_list_in_struct_or_union_specifier501)
                     self.struct_declaration_list()
                     self.following.pop()
                     if self.failed:
                         return retval
-                    self.match(self.input, 44, self.FOLLOW_44_in_struct_or_union_specifier503)
+                    self.match(self.input, 44,
+                               self.FOLLOW_44_in_struct_or_union_specifier503)
                     if self.failed:
                         return retval
 
-
                 elif alt15 == 2:
                     # C.g:236:4: struct_or_union IDENTIFIER
-                    self.following.append(self.FOLLOW_struct_or_union_in_struct_or_union_specifier508)
+                    self.following.append(
+                        self.FOLLOW_struct_or_union_in_struct_or_union_specifier508)
                     self.struct_or_union()
                     self.following.pop()
                     if self.failed:
                         return retval
-                    self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_struct_or_union_specifier510)
+                    self.match(
+                        self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_struct_or_union_specifier510)
                     if self.failed:
                         return retval
 
-
                 retval.stop = self.input.LT(-1)
 
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
             if self.backtracking > 0:
-                self.memoize(self.input, 11, struct_or_union_specifier_StartIndex)
+                self.memoize(self.input, 11,
+                             struct_or_union_specifier_StartIndex)
 
             pass
 
@@ -1598,9 +1580,9 @@ class CParser(Parser):
 
     # $ANTLR end struct_or_union_specifier
 
-
     # $ANTLR start struct_or_union
     # C.g:239:1: struct_or_union : ( 'struct' | 'union' );
+
     def struct_or_union(self, ):
 
         struct_or_union_StartIndex = self.input.index()
@@ -1612,7 +1594,7 @@ class CParser(Parser):
                 # C.g:240:2: ( 'struct' | 'union' )
                 # C.g:
                 if (45 <= self.input.LA(1) <= 46):
-                    self.input.consume();
+                    self.input.consume()
                     self.errorRecovery = False
                     self.failed = False
 
@@ -1624,14 +1606,9 @@ class CParser(Parser):
                     mse = MismatchedSetException(None, self.input)
                     self.recoverFromMismatchedSet(
                         self.input, mse, self.FOLLOW_set_in_struct_or_union0
-                        )
+                    )
                     raise mse
 
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -1645,9 +1622,9 @@ class CParser(Parser):
 
     # $ANTLR end struct_or_union
 
-
     # $ANTLR start struct_declaration_list
     # C.g:244:1: struct_declaration_list : ( struct_declaration )+ ;
+
     def struct_declaration_list(self, ):
 
         struct_declaration_list_StartIndex = self.input.index()
@@ -1660,26 +1637,25 @@ class CParser(Parser):
                 # C.g:245:4: ( struct_declaration )+
                 # C.g:245:4: ( struct_declaration )+
                 cnt16 = 0
-                while True: #loop16
+                while True:  # loop16
                     alt16 = 2
                     LA16_0 = self.input.LA(1)
 
-                    if (LA16_0 == IDENTIFIER or (34 <= LA16_0 <= 42) or (45 <= LA16_0 <= 46) or (48 <= LA16_0 <= 61)) :
+                    if (LA16_0 == IDENTIFIER or (34 <= LA16_0 <= 42) or (45 <= LA16_0 <= 46) or (48 <= LA16_0 <= 61)):
                         alt16 = 1
 
-
                     if alt16 == 1:
                         # C.g:0:0: struct_declaration
-                        self.following.append(self.FOLLOW_struct_declaration_in_struct_declaration_list537)
+                        self.following.append(
+                            self.FOLLOW_struct_declaration_in_struct_declaration_list537)
                         self.struct_declaration()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
                         if cnt16 >= 1:
-                            break #loop16
+                            break  # loop16
 
                         if self.backtracking > 0:
                             self.failed = True
@@ -1690,17 +1666,13 @@ class CParser(Parser):
 
                     cnt16 += 1
 
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
             if self.backtracking > 0:
-                self.memoize(self.input, 13, struct_declaration_list_StartIndex)
+                self.memoize(self.input, 13,
+                             struct_declaration_list_StartIndex)
 
             pass
 
@@ -1708,9 +1680,9 @@ class CParser(Parser):
 
     # $ANTLR end struct_declaration_list
 
-
     # $ANTLR start struct_declaration
     # C.g:248:1: struct_declaration : specifier_qualifier_list struct_declarator_list ';' ;
+
     def struct_declaration(self, ):
 
         struct_declaration_StartIndex = self.input.index()
@@ -1721,23 +1693,23 @@ class CParser(Parser):
 
                 # C.g:249:2: ( specifier_qualifier_list struct_declarator_list ';' )
                 # C.g:249:4: specifier_qualifier_list struct_declarator_list ';'
-                self.following.append(self.FOLLOW_specifier_qualifier_list_in_struct_declaration549)
+                self.following.append(
+                    self.FOLLOW_specifier_qualifier_list_in_struct_declaration549)
                 self.specifier_qualifier_list()
                 self.following.pop()
                 if self.failed:
                     return
-                self.following.append(self.FOLLOW_struct_declarator_list_in_struct_declaration551)
+                self.following.append(
+                    self.FOLLOW_struct_declarator_list_in_struct_declaration551)
                 self.struct_declarator_list()
                 self.following.pop()
                 if self.failed:
                     return
-                self.match(self.input, 25, self.FOLLOW_25_in_struct_declaration553)
+                self.match(self.input, 25,
+                           self.FOLLOW_25_in_struct_declaration553)
                 if self.failed:
                     return
 
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -1751,9 +1723,9 @@ class CParser(Parser):
 
     # $ANTLR end struct_declaration
 
-
     # $ANTLR start specifier_qualifier_list
     # C.g:252:1: specifier_qualifier_list : ( type_qualifier | type_specifier )+ ;
+
     def specifier_qualifier_list(self, ):
 
         specifier_qualifier_list_StartIndex = self.input.index()
@@ -1766,30 +1738,27 @@ class CParser(Parser):
                 # C.g:253:4: ( type_qualifier | type_specifier )+
                 # C.g:253:4: ( type_qualifier | type_specifier )+
                 cnt17 = 0
-                while True: #loop17
+                while True:  # loop17
                     alt17 = 3
                     LA17 = self.input.LA(1)
                     if LA17 == 58:
                         LA17_2 = self.input.LA(2)
 
-                        if (self.synpred39()) :
+                        if (self.synpred39()):
                             alt17 = 1
 
-
                     elif LA17 == 59:
                         LA17_3 = self.input.LA(2)
 
-                        if (self.synpred39()) :
+                        if (self.synpred39()):
                             alt17 = 1
 
-
                     elif LA17 == 60:
                         LA17_4 = self.input.LA(2)
 
-                        if (self.synpred39()) :
+                        if (self.synpred39()):
                             alt17 = 1
 
-
                     elif LA17 == IDENTIFIER:
                         LA17 = self.input.LA(2)
                         if LA17 == EOF or LA17 == IDENTIFIER or LA17 == 34 or LA17 == 35 or LA17 == 36 or LA17 == 37 or LA17 == 38 or LA17 == 39 or LA17 == 40 or LA17 == 41 or LA17 == 42 or LA17 == 45 or LA17 == 46 or LA17 == 48 or LA17 == 49 or LA17 == 50 or LA17 == 51 or LA17 == 52 or LA17 == 53 or LA17 == 54 or LA17 == 55 or LA17 == 56 or LA17 == 57 or LA17 == 58 or LA17 == 59 or LA17 == 60 or LA17 == 61 or LA17 == 63 or LA17 == 66:
@@ -1797,25 +1766,21 @@ class CParser(Parser):
                         elif LA17 == 62:
                             LA17_94 = self.input.LA(3)
 
-                            if (self.synpred40()) :
+                            if (self.synpred40()):
                                 alt17 = 2
 
-
                         elif LA17 == 47:
                             LA17_95 = self.input.LA(3)
 
-                            if (self.synpred40()) :
+                            if (self.synpred40()):
                                 alt17 = 2
 
-
                         elif LA17 == 64:
                             LA17_96 = self.input.LA(3)
 
-                            if (self.synpred40()) :
+                            if (self.synpred40()):
                                 alt17 = 2
 
-
-
                     elif LA17 == 49 or LA17 == 50 or LA17 == 51 or LA17 == 52 or LA17 == 53 or LA17 == 54 or LA17 == 55 or LA17 == 56 or LA17 == 57 or LA17 == 61:
                         alt17 = 1
                     elif LA17 == 34 or LA17 == 35 or LA17 == 36 or LA17 == 37 or LA17 == 38 or LA17 == 39 or LA17 == 40 or LA17 == 41 or LA17 == 42 or LA17 == 45 or LA17 == 46 or LA17 == 48:
@@ -1823,25 +1788,25 @@ class CParser(Parser):
 
                     if alt17 == 1:
                         # C.g:253:6: type_qualifier
-                        self.following.append(self.FOLLOW_type_qualifier_in_specifier_qualifier_list566)
+                        self.following.append(
+                            self.FOLLOW_type_qualifier_in_specifier_qualifier_list566)
                         self.type_qualifier()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     elif alt17 == 2:
                         # C.g:253:23: type_specifier
-                        self.following.append(self.FOLLOW_type_specifier_in_specifier_qualifier_list570)
+                        self.following.append(
+                            self.FOLLOW_type_specifier_in_specifier_qualifier_list570)
                         self.type_specifier()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
                         if cnt17 >= 1:
-                            break #loop17
+                            break  # loop17
 
                         if self.backtracking > 0:
                             self.failed = True
@@ -1852,17 +1817,13 @@ class CParser(Parser):
 
                     cnt17 += 1
 
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
             if self.backtracking > 0:
-                self.memoize(self.input, 15, specifier_qualifier_list_StartIndex)
+                self.memoize(self.input, 15,
+                             specifier_qualifier_list_StartIndex)
 
             pass
 
@@ -1870,9 +1831,9 @@ class CParser(Parser):
 
     # $ANTLR end specifier_qualifier_list
 
-
     # $ANTLR start struct_declarator_list
     # C.g:256:1: struct_declarator_list : struct_declarator ( ',' struct_declarator )* ;
+
     def struct_declarator_list(self, ):
 
         struct_declarator_list_StartIndex = self.input.index()
@@ -1883,39 +1844,35 @@ class CParser(Parser):
 
                 # C.g:257:2: ( struct_declarator ( ',' struct_declarator )* )
                 # C.g:257:4: struct_declarator ( ',' struct_declarator )*
-                self.following.append(self.FOLLOW_struct_declarator_in_struct_declarator_list584)
+                self.following.append(
+                    self.FOLLOW_struct_declarator_in_struct_declarator_list584)
                 self.struct_declarator()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:257:22: ( ',' struct_declarator )*
-                while True: #loop18
+                while True:  # loop18
                     alt18 = 2
                     LA18_0 = self.input.LA(1)
 
-                    if (LA18_0 == 27) :
+                    if (LA18_0 == 27):
                         alt18 = 1
 
-
                     if alt18 == 1:
                         # C.g:257:23: ',' struct_declarator
-                        self.match(self.input, 27, self.FOLLOW_27_in_struct_declarator_list587)
+                        self.match(self.input, 27,
+                                   self.FOLLOW_27_in_struct_declarator_list587)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_struct_declarator_in_struct_declarator_list589)
+                        self.following.append(
+                            self.FOLLOW_struct_declarator_in_struct_declarator_list589)
                         self.struct_declarator()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop18
-
-
-
-
-
+                        break  # loop18
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -1930,9 +1887,9 @@ class CParser(Parser):
 
     # $ANTLR end struct_declarator_list
 
-
     # $ANTLR start struct_declarator
     # C.g:260:1: struct_declarator : ( declarator ( ':' constant_expression )? | ':' constant_expression );
+
     def struct_declarator(self, ):
 
         struct_declarator_StartIndex = self.input.index()
@@ -1945,22 +1902,24 @@ class CParser(Parser):
                 alt20 = 2
                 LA20_0 = self.input.LA(1)
 
-                if (LA20_0 == IDENTIFIER or (58 <= LA20_0 <= 60) or LA20_0 == 62 or LA20_0 == 66) :
+                if (LA20_0 == IDENTIFIER or (58 <= LA20_0 <= 60) or LA20_0 == 62 or LA20_0 == 66):
                     alt20 = 1
-                elif (LA20_0 == 47) :
+                elif (LA20_0 == 47):
                     alt20 = 2
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("260:1: struct_declarator : ( declarator ( ':' constant_expression )? | ':' constant_expression );", 20, 0, self.input)
+                    nvae = NoViableAltException(
+                        "260:1: struct_declarator : ( declarator ( ':' constant_expression )? | ':' constant_expression );", 20, 0, self.input)
 
                     raise nvae
 
                 if alt20 == 1:
                     # C.g:261:4: declarator ( ':' constant_expression )?
-                    self.following.append(self.FOLLOW_declarator_in_struct_declarator602)
+                    self.following.append(
+                        self.FOLLOW_declarator_in_struct_declarator602)
                     self.declarator()
                     self.following.pop()
                     if self.failed:
@@ -1969,36 +1928,34 @@ class CParser(Parser):
                     alt19 = 2
                     LA19_0 = self.input.LA(1)
 
-                    if (LA19_0 == 47) :
+                    if (LA19_0 == 47):
                         alt19 = 1
                     if alt19 == 1:
                         # C.g:261:16: ':' constant_expression
-                        self.match(self.input, 47, self.FOLLOW_47_in_struct_declarator605)
+                        self.match(self.input, 47,
+                                   self.FOLLOW_47_in_struct_declarator605)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_constant_expression_in_struct_declarator607)
+                        self.following.append(
+                            self.FOLLOW_constant_expression_in_struct_declarator607)
                         self.constant_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
-
-
-
                 elif alt20 == 2:
                     # C.g:262:4: ':' constant_expression
-                    self.match(self.input, 47, self.FOLLOW_47_in_struct_declarator614)
+                    self.match(self.input, 47,
+                               self.FOLLOW_47_in_struct_declarator614)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_constant_expression_in_struct_declarator616)
+                    self.following.append(
+                        self.FOLLOW_constant_expression_in_struct_declarator616)
                     self.constant_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -2017,10 +1974,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start enum_specifier
     # C.g:265:1: enum_specifier options {k=3; } : ( 'enum' '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER );
+
     def enum_specifier(self, ):
 
         retval = self.enum_specifier_return()
@@ -2035,33 +1991,35 @@ class CParser(Parser):
                 alt23 = 3
                 LA23_0 = self.input.LA(1)
 
-                if (LA23_0 == 48) :
+                if (LA23_0 == 48):
                     LA23_1 = self.input.LA(2)
 
-                    if (LA23_1 == IDENTIFIER) :
+                    if (LA23_1 == IDENTIFIER):
                         LA23_2 = self.input.LA(3)
 
-                        if (LA23_2 == 43) :
+                        if (LA23_2 == 43):
                             alt23 = 2
-                        elif (LA23_2 == EOF or LA23_2 == IDENTIFIER or LA23_2 == 25 or LA23_2 == 27 or (29 <= LA23_2 <= 42) or (45 <= LA23_2 <= 64) or LA23_2 == 66) :
+                        elif (LA23_2 == EOF or LA23_2 == IDENTIFIER or LA23_2 == 25 or LA23_2 == 27 or (29 <= LA23_2 <= 42) or (45 <= LA23_2 <= 64) or LA23_2 == 66):
                             alt23 = 3
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return retval
 
-                            nvae = NoViableAltException("265:1: enum_specifier options {k=3; } : ( 'enum' '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER );", 23, 2, self.input)
+                            nvae = NoViableAltException(
+                                "265:1: enum_specifier options {k=3; } : ( 'enum' '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER );", 23, 2, self.input)
 
                             raise nvae
 
-                    elif (LA23_1 == 43) :
+                    elif (LA23_1 == 43):
                         alt23 = 1
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return retval
 
-                        nvae = NoViableAltException("265:1: enum_specifier options {k=3; } : ( 'enum' '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER );", 23, 1, self.input)
+                        nvae = NoViableAltException(
+                            "265:1: enum_specifier options {k=3; } : ( 'enum' '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER );", 23, 1, self.input)
 
                         raise nvae
 
@@ -2070,19 +2028,23 @@ class CParser(Parser):
                         self.failed = True
                         return retval
 
-                    nvae = NoViableAltException("265:1: enum_specifier options {k=3; } : ( 'enum' '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER );", 23, 0, self.input)
+                    nvae = NoViableAltException(
+                        "265:1: enum_specifier options {k=3; } : ( 'enum' '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER '{' enumerator_list ( ',' )? '}' | 'enum' IDENTIFIER );", 23, 0, self.input)
 
                     raise nvae
 
                 if alt23 == 1:
                     # C.g:267:4: 'enum' '{' enumerator_list ( ',' )? '}'
-                    self.match(self.input, 48, self.FOLLOW_48_in_enum_specifier634)
+                    self.match(self.input, 48,
+                               self.FOLLOW_48_in_enum_specifier634)
                     if self.failed:
                         return retval
-                    self.match(self.input, 43, self.FOLLOW_43_in_enum_specifier636)
+                    self.match(self.input, 43,
+                               self.FOLLOW_43_in_enum_specifier636)
                     if self.failed:
                         return retval
-                    self.following.append(self.FOLLOW_enumerator_list_in_enum_specifier638)
+                    self.following.append(
+                        self.FOLLOW_enumerator_list_in_enum_specifier638)
                     self.enumerator_list()
                     self.following.pop()
                     if self.failed:
@@ -2091,33 +2053,36 @@ class CParser(Parser):
                     alt21 = 2
                     LA21_0 = self.input.LA(1)
 
-                    if (LA21_0 == 27) :
+                    if (LA21_0 == 27):
                         alt21 = 1
                     if alt21 == 1:
                         # C.g:0:0: ','
-                        self.match(self.input, 27, self.FOLLOW_27_in_enum_specifier640)
+                        self.match(self.input, 27,
+                                   self.FOLLOW_27_in_enum_specifier640)
                         if self.failed:
                             return retval
 
-
-
-                    self.match(self.input, 44, self.FOLLOW_44_in_enum_specifier643)
+                    self.match(self.input, 44,
+                               self.FOLLOW_44_in_enum_specifier643)
                     if self.failed:
                         return retval
 
-
                 elif alt23 == 2:
                     # C.g:268:4: 'enum' IDENTIFIER '{' enumerator_list ( ',' )? '}'
-                    self.match(self.input, 48, self.FOLLOW_48_in_enum_specifier648)
+                    self.match(self.input, 48,
+                               self.FOLLOW_48_in_enum_specifier648)
                     if self.failed:
                         return retval
-                    self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_enum_specifier650)
+                    self.match(self.input, IDENTIFIER,
+                               self.FOLLOW_IDENTIFIER_in_enum_specifier650)
                     if self.failed:
                         return retval
-                    self.match(self.input, 43, self.FOLLOW_43_in_enum_specifier652)
+                    self.match(self.input, 43,
+                               self.FOLLOW_43_in_enum_specifier652)
                     if self.failed:
                         return retval
-                    self.following.append(self.FOLLOW_enumerator_list_in_enum_specifier654)
+                    self.following.append(
+                        self.FOLLOW_enumerator_list_in_enum_specifier654)
                     self.enumerator_list()
                     self.following.pop()
                     if self.failed:
@@ -2126,34 +2091,33 @@ class CParser(Parser):
                     alt22 = 2
                     LA22_0 = self.input.LA(1)
 
-                    if (LA22_0 == 27) :
+                    if (LA22_0 == 27):
                         alt22 = 1
                     if alt22 == 1:
                         # C.g:0:0: ','
-                        self.match(self.input, 27, self.FOLLOW_27_in_enum_specifier656)
+                        self.match(self.input, 27,
+                                   self.FOLLOW_27_in_enum_specifier656)
                         if self.failed:
                             return retval
 
-
-
-                    self.match(self.input, 44, self.FOLLOW_44_in_enum_specifier659)
+                    self.match(self.input, 44,
+                               self.FOLLOW_44_in_enum_specifier659)
                     if self.failed:
                         return retval
 
-
                 elif alt23 == 3:
                     # C.g:269:4: 'enum' IDENTIFIER
-                    self.match(self.input, 48, self.FOLLOW_48_in_enum_specifier664)
+                    self.match(self.input, 48,
+                               self.FOLLOW_48_in_enum_specifier664)
                     if self.failed:
                         return retval
-                    self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_enum_specifier666)
+                    self.match(self.input, IDENTIFIER,
+                               self.FOLLOW_IDENTIFIER_in_enum_specifier666)
                     if self.failed:
                         return retval
 
-
                 retval.stop = self.input.LT(-1)
 
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -2167,9 +2131,9 @@ class CParser(Parser):
 
     # $ANTLR end enum_specifier
 
-
     # $ANTLR start enumerator_list
     # C.g:272:1: enumerator_list : enumerator ( ',' enumerator )* ;
+
     def enumerator_list(self, ):
 
         enumerator_list_StartIndex = self.input.index()
@@ -2180,44 +2144,38 @@ class CParser(Parser):
 
                 # C.g:273:2: ( enumerator ( ',' enumerator )* )
                 # C.g:273:4: enumerator ( ',' enumerator )*
-                self.following.append(self.FOLLOW_enumerator_in_enumerator_list677)
+                self.following.append(
+                    self.FOLLOW_enumerator_in_enumerator_list677)
                 self.enumerator()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:273:15: ( ',' enumerator )*
-                while True: #loop24
+                while True:  # loop24
                     alt24 = 2
                     LA24_0 = self.input.LA(1)
 
-                    if (LA24_0 == 27) :
+                    if (LA24_0 == 27):
                         LA24_1 = self.input.LA(2)
 
-                        if (LA24_1 == IDENTIFIER) :
+                        if (LA24_1 == IDENTIFIER):
                             alt24 = 1
 
-
-
-
                     if alt24 == 1:
                         # C.g:273:16: ',' enumerator
-                        self.match(self.input, 27, self.FOLLOW_27_in_enumerator_list680)
+                        self.match(self.input, 27,
+                                   self.FOLLOW_27_in_enumerator_list680)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_enumerator_in_enumerator_list682)
+                        self.following.append(
+                            self.FOLLOW_enumerator_in_enumerator_list682)
                         self.enumerator()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop24
-
-
-
-
-
+                        break  # loop24
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -2232,9 +2190,9 @@ class CParser(Parser):
 
     # $ANTLR end enumerator_list
 
-
     # $ANTLR start enumerator
     # C.g:276:1: enumerator : IDENTIFIER ( '=' constant_expression )? ;
+
     def enumerator(self, ):
 
         enumerator_StartIndex = self.input.index()
@@ -2245,32 +2203,28 @@ class CParser(Parser):
 
                 # C.g:277:2: ( IDENTIFIER ( '=' constant_expression )? )
                 # C.g:277:4: IDENTIFIER ( '=' constant_expression )?
-                self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_enumerator695)
+                self.match(self.input, IDENTIFIER,
+                           self.FOLLOW_IDENTIFIER_in_enumerator695)
                 if self.failed:
                     return
                 # C.g:277:15: ( '=' constant_expression )?
                 alt25 = 2
                 LA25_0 = self.input.LA(1)
 
-                if (LA25_0 == 28) :
+                if (LA25_0 == 28):
                     alt25 = 1
                 if alt25 == 1:
                     # C.g:277:16: '=' constant_expression
                     self.match(self.input, 28, self.FOLLOW_28_in_enumerator698)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_constant_expression_in_enumerator700)
+                    self.following.append(
+                        self.FOLLOW_constant_expression_in_enumerator700)
                     self.constant_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -2284,9 +2238,9 @@ class CParser(Parser):
 
     # $ANTLR end enumerator
 
-
     # $ANTLR start type_qualifier
     # C.g:280:1: type_qualifier : ( 'const' | 'volatile' | 'IN' | 'OUT' | 'OPTIONAL' | 'CONST' | 'UNALIGNED' | 'VOLATILE' | 'GLOBAL_REMOVE_IF_UNREFERENCED' | 'EFIAPI' | 'EFI_BOOTSERVICE' | 'EFI_RUNTIMESERVICE' | 'PACKED' );
+
     def type_qualifier(self, ):
 
         type_qualifier_StartIndex = self.input.index()
@@ -2298,7 +2252,7 @@ class CParser(Parser):
                 # C.g:281:2: ( 'const' | 'volatile' | 'IN' | 'OUT' | 'OPTIONAL' | 'CONST' | 'UNALIGNED' | 'VOLATILE' | 'GLOBAL_REMOVE_IF_UNREFERENCED' | 'EFIAPI' | 'EFI_BOOTSERVICE' | 'EFI_RUNTIMESERVICE' | 'PACKED' )
                 # C.g:
                 if (49 <= self.input.LA(1) <= 61):
-                    self.input.consume();
+                    self.input.consume()
                     self.errorRecovery = False
                     self.failed = False
 
@@ -2310,14 +2264,9 @@ class CParser(Parser):
                     mse = MismatchedSetException(None, self.input)
                     self.recoverFromMismatchedSet(
                         self.input, mse, self.FOLLOW_set_in_type_qualifier0
-                        )
+                    )
                     raise mse
 
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -2336,10 +2285,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start declarator
     # C.g:296:1: declarator : ( ( pointer )? ( 'EFIAPI' )? ( 'EFI_BOOTSERVICE' )? ( 'EFI_RUNTIMESERVICE' )? direct_declarator | pointer );
+
     def declarator(self, ):
 
         retval = self.declarator_return()
@@ -2354,30 +2302,32 @@ class CParser(Parser):
                 alt30 = 2
                 LA30_0 = self.input.LA(1)
 
-                if (LA30_0 == 66) :
+                if (LA30_0 == 66):
                     LA30_1 = self.input.LA(2)
 
-                    if (self.synpred66()) :
+                    if (self.synpred66()):
                         alt30 = 1
-                    elif (True) :
+                    elif (True):
                         alt30 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return retval
 
-                        nvae = NoViableAltException("296:1: declarator : ( ( pointer )? ( 'EFIAPI' )? ( 'EFI_BOOTSERVICE' )? ( 'EFI_RUNTIMESERVICE' )? direct_declarator | pointer );", 30, 1, self.input)
+                        nvae = NoViableAltException(
+                            "296:1: declarator : ( ( pointer )? ( 'EFIAPI' )? ( 'EFI_BOOTSERVICE' )? ( 'EFI_RUNTIMESERVICE' )? direct_declarator | pointer );", 30, 1, self.input)
 
                         raise nvae
 
-                elif (LA30_0 == IDENTIFIER or (58 <= LA30_0 <= 60) or LA30_0 == 62) :
+                elif (LA30_0 == IDENTIFIER or (58 <= LA30_0 <= 60) or LA30_0 == 62):
                     alt30 = 1
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return retval
 
-                    nvae = NoViableAltException("296:1: declarator : ( ( pointer )? ( 'EFIAPI' )? ( 'EFI_BOOTSERVICE' )? ( 'EFI_RUNTIMESERVICE' )? direct_declarator | pointer );", 30, 0, self.input)
+                    nvae = NoViableAltException(
+                        "296:1: declarator : ( ( pointer )? ( 'EFIAPI' )? ( 'EFI_BOOTSERVICE' )? ( 'EFI_RUNTIMESERVICE' )? direct_declarator | pointer );", 30, 0, self.input)
 
                     raise nvae
 
@@ -2387,67 +2337,63 @@ class CParser(Parser):
                     alt26 = 2
                     LA26_0 = self.input.LA(1)
 
-                    if (LA26_0 == 66) :
+                    if (LA26_0 == 66):
                         alt26 = 1
                     if alt26 == 1:
                         # C.g:0:0: pointer
-                        self.following.append(self.FOLLOW_pointer_in_declarator784)
+                        self.following.append(
+                            self.FOLLOW_pointer_in_declarator784)
                         self.pointer()
                         self.following.pop()
                         if self.failed:
                             return retval
 
-
-
                     # C.g:297:13: ( 'EFIAPI' )?
                     alt27 = 2
                     LA27_0 = self.input.LA(1)
 
-                    if (LA27_0 == 58) :
+                    if (LA27_0 == 58):
                         alt27 = 1
                     if alt27 == 1:
                         # C.g:297:14: 'EFIAPI'
-                        self.match(self.input, 58, self.FOLLOW_58_in_declarator788)
+                        self.match(self.input, 58,
+                                   self.FOLLOW_58_in_declarator788)
                         if self.failed:
                             return retval
 
-
-
                     # C.g:297:25: ( 'EFI_BOOTSERVICE' )?
                     alt28 = 2
                     LA28_0 = self.input.LA(1)
 
-                    if (LA28_0 == 59) :
+                    if (LA28_0 == 59):
                         alt28 = 1
                     if alt28 == 1:
                         # C.g:297:26: 'EFI_BOOTSERVICE'
-                        self.match(self.input, 59, self.FOLLOW_59_in_declarator793)
+                        self.match(self.input, 59,
+                                   self.FOLLOW_59_in_declarator793)
                         if self.failed:
                             return retval
 
-
-
                     # C.g:297:46: ( 'EFI_RUNTIMESERVICE' )?
                     alt29 = 2
                     LA29_0 = self.input.LA(1)
 
-                    if (LA29_0 == 60) :
+                    if (LA29_0 == 60):
                         alt29 = 1
                     if alt29 == 1:
                         # C.g:297:47: 'EFI_RUNTIMESERVICE'
-                        self.match(self.input, 60, self.FOLLOW_60_in_declarator798)
+                        self.match(self.input, 60,
+                                   self.FOLLOW_60_in_declarator798)
                         if self.failed:
                             return retval
 
-
-
-                    self.following.append(self.FOLLOW_direct_declarator_in_declarator802)
+                    self.following.append(
+                        self.FOLLOW_direct_declarator_in_declarator802)
                     self.direct_declarator()
                     self.following.pop()
                     if self.failed:
                         return retval
 
-
                 elif alt30 == 2:
                     # C.g:299:4: pointer
                     self.following.append(self.FOLLOW_pointer_in_declarator808)
@@ -2456,10 +2402,8 @@ class CParser(Parser):
                     if self.failed:
                         return retval
 
-
                 retval.stop = self.input.LT(-1)
 
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -2473,9 +2417,9 @@ class CParser(Parser):
 
     # $ANTLR end declarator
 
-
     # $ANTLR start direct_declarator
     # C.g:302:1: direct_declarator : ( IDENTIFIER ( declarator_suffix )* | '(' ( 'EFIAPI' )? declarator ')' ( declarator_suffix )+ );
+
     def direct_declarator(self, ):
 
         direct_declarator_StartIndex = self.input.index()
@@ -2488,556 +2432,485 @@ class CParser(Parser):
                 alt34 = 2
                 LA34_0 = self.input.LA(1)
 
-                if (LA34_0 == IDENTIFIER) :
+                if (LA34_0 == IDENTIFIER):
                     alt34 = 1
-                elif (LA34_0 == 62) :
+                elif (LA34_0 == 62):
                     alt34 = 2
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("302:1: direct_declarator : ( IDENTIFIER ( declarator_suffix )* | '(' ( 'EFIAPI' )? declarator ')' ( declarator_suffix )+ );", 34, 0, self.input)
+                    nvae = NoViableAltException(
+                        "302:1: direct_declarator : ( IDENTIFIER ( declarator_suffix )* | '(' ( 'EFIAPI' )? declarator ')' ( declarator_suffix )+ );", 34, 0, self.input)
 
                     raise nvae
 
                 if alt34 == 1:
                     # C.g:303:4: IDENTIFIER ( declarator_suffix )*
-                    self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_direct_declarator819)
+                    self.match(self.input, IDENTIFIER,
+                               self.FOLLOW_IDENTIFIER_in_direct_declarator819)
                     if self.failed:
                         return
                     # C.g:303:15: ( declarator_suffix )*
-                    while True: #loop31
+                    while True:  # loop31
                         alt31 = 2
                         LA31_0 = self.input.LA(1)
 
-                        if (LA31_0 == 62) :
+                        if (LA31_0 == 62):
                             LA31 = self.input.LA(2)
                             if LA31 == 63:
                                 LA31_30 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 58:
                                 LA31_31 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 66:
                                 LA31_32 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 59:
                                 LA31_33 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 60:
                                 LA31_34 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == IDENTIFIER:
                                 LA31_35 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 29 or LA31 == 30 or LA31 == 31 or LA31 == 32 or LA31 == 33:
                                 LA31_37 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 34:
                                 LA31_38 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 35:
                                 LA31_39 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 36:
                                 LA31_40 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 37:
                                 LA31_41 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 38:
                                 LA31_42 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 39:
                                 LA31_43 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 40:
                                 LA31_44 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 41:
                                 LA31_45 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 42:
                                 LA31_46 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 45 or LA31 == 46:
                                 LA31_47 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 48:
                                 LA31_48 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 49 or LA31 == 50 or LA31 == 51 or LA31 == 52 or LA31 == 53 or LA31 == 54 or LA31 == 55 or LA31 == 56 or LA31 == 57 or LA31 == 61:
                                 LA31_49 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
-
-                        elif (LA31_0 == 64) :
+                        elif (LA31_0 == 64):
                             LA31 = self.input.LA(2)
                             if LA31 == 65:
                                 LA31_51 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 62:
                                 LA31_52 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == IDENTIFIER:
                                 LA31_53 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == HEX_LITERAL:
                                 LA31_54 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == OCTAL_LITERAL:
                                 LA31_55 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == DECIMAL_LITERAL:
                                 LA31_56 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == CHARACTER_LITERAL:
                                 LA31_57 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == STRING_LITERAL:
                                 LA31_58 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == FLOATING_POINT_LITERAL:
                                 LA31_59 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 72:
                                 LA31_60 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 73:
                                 LA31_61 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 66 or LA31 == 68 or LA31 == 69 or LA31 == 77 or LA31 == 78 or LA31 == 79:
                                 LA31_62 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
                             elif LA31 == 74:
                                 LA31_63 = self.input.LA(3)
 
-                                if (self.synpred67()) :
+                                if (self.synpred67()):
                                     alt31 = 1
 
-
-
-
-
                         if alt31 == 1:
                             # C.g:0:0: declarator_suffix
-                            self.following.append(self.FOLLOW_declarator_suffix_in_direct_declarator821)
+                            self.following.append(
+                                self.FOLLOW_declarator_suffix_in_direct_declarator821)
                             self.declarator_suffix()
                             self.following.pop()
                             if self.failed:
                                 return
 
-
                         else:
-                            break #loop31
-
-
-
+                            break  # loop31
 
                 elif alt34 == 2:
                     # C.g:304:4: '(' ( 'EFIAPI' )? declarator ')' ( declarator_suffix )+
-                    self.match(self.input, 62, self.FOLLOW_62_in_direct_declarator827)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_direct_declarator827)
                     if self.failed:
                         return
                     # C.g:304:8: ( 'EFIAPI' )?
                     alt32 = 2
                     LA32_0 = self.input.LA(1)
 
-                    if (LA32_0 == 58) :
+                    if (LA32_0 == 58):
                         LA32_1 = self.input.LA(2)
 
-                        if (self.synpred69()) :
+                        if (self.synpred69()):
                             alt32 = 1
                     if alt32 == 1:
                         # C.g:304:9: 'EFIAPI'
-                        self.match(self.input, 58, self.FOLLOW_58_in_direct_declarator830)
+                        self.match(self.input, 58,
+                                   self.FOLLOW_58_in_direct_declarator830)
                         if self.failed:
                             return
 
-
-
-                    self.following.append(self.FOLLOW_declarator_in_direct_declarator834)
+                    self.following.append(
+                        self.FOLLOW_declarator_in_direct_declarator834)
                     self.declarator()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_direct_declarator836)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_direct_declarator836)
                     if self.failed:
                         return
                     # C.g:304:35: ( declarator_suffix )+
                     cnt33 = 0
-                    while True: #loop33
+                    while True:  # loop33
                         alt33 = 2
                         LA33_0 = self.input.LA(1)
 
-                        if (LA33_0 == 62) :
+                        if (LA33_0 == 62):
                             LA33 = self.input.LA(2)
                             if LA33 == 63:
                                 LA33_30 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 58:
                                 LA33_31 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 66:
                                 LA33_32 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 59:
                                 LA33_33 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 60:
                                 LA33_34 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == IDENTIFIER:
                                 LA33_35 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 29 or LA33 == 30 or LA33 == 31 or LA33 == 32 or LA33 == 33:
                                 LA33_37 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 34:
                                 LA33_38 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 35:
                                 LA33_39 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 36:
                                 LA33_40 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 37:
                                 LA33_41 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 38:
                                 LA33_42 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 39:
                                 LA33_43 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 40:
                                 LA33_44 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 41:
                                 LA33_45 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 42:
                                 LA33_46 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 45 or LA33 == 46:
                                 LA33_47 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 48:
                                 LA33_48 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 49 or LA33 == 50 or LA33 == 51 or LA33 == 52 or LA33 == 53 or LA33 == 54 or LA33 == 55 or LA33 == 56 or LA33 == 57 or LA33 == 61:
                                 LA33_49 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
-
-                        elif (LA33_0 == 64) :
+                        elif (LA33_0 == 64):
                             LA33 = self.input.LA(2)
                             if LA33 == 65:
                                 LA33_51 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 62:
                                 LA33_52 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == IDENTIFIER:
                                 LA33_53 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == HEX_LITERAL:
                                 LA33_54 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == OCTAL_LITERAL:
                                 LA33_55 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == DECIMAL_LITERAL:
                                 LA33_56 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == CHARACTER_LITERAL:
                                 LA33_57 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == STRING_LITERAL:
                                 LA33_58 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == FLOATING_POINT_LITERAL:
                                 LA33_59 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 72:
                                 LA33_60 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 73:
                                 LA33_61 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 66 or LA33 == 68 or LA33 == 69 or LA33 == 77 or LA33 == 78 or LA33 == 79:
                                 LA33_62 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
                             elif LA33 == 74:
                                 LA33_63 = self.input.LA(3)
 
-                                if (self.synpred70()) :
+                                if (self.synpred70()):
                                     alt33 = 1
 
-
-
-
-
                         if alt33 == 1:
                             # C.g:0:0: declarator_suffix
-                            self.following.append(self.FOLLOW_declarator_suffix_in_direct_declarator838)
+                            self.following.append(
+                                self.FOLLOW_declarator_suffix_in_direct_declarator838)
                             self.declarator_suffix()
                             self.following.pop()
                             if self.failed:
                                 return
 
-
                         else:
                             if cnt33 >= 1:
-                                break #loop33
+                                break  # loop33
 
                             if self.backtracking > 0:
                                 self.failed = True
@@ -3048,10 +2921,6 @@ class CParser(Parser):
 
                         cnt33 += 1
 
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -3065,9 +2934,9 @@ class CParser(Parser):
 
     # $ANTLR end direct_declarator
 
-
     # $ANTLR start declarator_suffix
     # C.g:307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );
+
     def declarator_suffix(self, ):
 
         declarator_suffix_StartIndex = self.input.index()
@@ -3080,23 +2949,24 @@ class CParser(Parser):
                 alt35 = 5
                 LA35_0 = self.input.LA(1)
 
-                if (LA35_0 == 64) :
+                if (LA35_0 == 64):
                     LA35_1 = self.input.LA(2)
 
-                    if (LA35_1 == 65) :
+                    if (LA35_1 == 65):
                         alt35 = 2
-                    elif ((IDENTIFIER <= LA35_1 <= FLOATING_POINT_LITERAL) or LA35_1 == 62 or LA35_1 == 66 or (68 <= LA35_1 <= 69) or (72 <= LA35_1 <= 74) or (77 <= LA35_1 <= 79)) :
+                    elif ((IDENTIFIER <= LA35_1 <= FLOATING_POINT_LITERAL) or LA35_1 == 62 or LA35_1 == 66 or (68 <= LA35_1 <= 69) or (72 <= LA35_1 <= 74) or (77 <= LA35_1 <= 79)):
                         alt35 = 1
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 1, self.input)
+                        nvae = NoViableAltException(
+                            "307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 1, self.input)
 
                         raise nvae
 
-                elif (LA35_0 == 62) :
+                elif (LA35_0 == 62):
                     LA35 = self.input.LA(2)
                     if LA35 == 63:
                         alt35 = 5
@@ -3105,16 +2975,17 @@ class CParser(Parser):
                     elif LA35 == IDENTIFIER:
                         LA35_29 = self.input.LA(3)
 
-                        if (self.synpred73()) :
+                        if (self.synpred73()):
                             alt35 = 3
-                        elif (self.synpred74()) :
+                        elif (self.synpred74()):
                             alt35 = 4
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 29, self.input)
+                            nvae = NoViableAltException(
+                                "307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 29, self.input)
 
                             raise nvae
 
@@ -3123,7 +2994,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 2, self.input)
+                        nvae = NoViableAltException(
+                            "307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 2, self.input)
 
                         raise nvae
 
@@ -3132,76 +3004,84 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 0, self.input)
+                    nvae = NoViableAltException(
+                        "307:1: declarator_suffix : ( '[' constant_expression ']' | '[' ']' | '(' parameter_type_list ')' | '(' identifier_list ')' | '(' ')' );", 35, 0, self.input)
 
                     raise nvae
 
                 if alt35 == 1:
                     # C.g:308:6: '[' constant_expression ']'
-                    self.match(self.input, 64, self.FOLLOW_64_in_declarator_suffix852)
+                    self.match(self.input, 64,
+                               self.FOLLOW_64_in_declarator_suffix852)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_constant_expression_in_declarator_suffix854)
+                    self.following.append(
+                        self.FOLLOW_constant_expression_in_declarator_suffix854)
                     self.constant_expression()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 65, self.FOLLOW_65_in_declarator_suffix856)
+                    self.match(self.input, 65,
+                               self.FOLLOW_65_in_declarator_suffix856)
                     if self.failed:
                         return
 
-
                 elif alt35 == 2:
                     # C.g:309:9: '[' ']'
-                    self.match(self.input, 64, self.FOLLOW_64_in_declarator_suffix866)
+                    self.match(self.input, 64,
+                               self.FOLLOW_64_in_declarator_suffix866)
                     if self.failed:
                         return
-                    self.match(self.input, 65, self.FOLLOW_65_in_declarator_suffix868)
+                    self.match(self.input, 65,
+                               self.FOLLOW_65_in_declarator_suffix868)
                     if self.failed:
                         return
 
-
                 elif alt35 == 3:
                     # C.g:310:9: '(' parameter_type_list ')'
-                    self.match(self.input, 62, self.FOLLOW_62_in_declarator_suffix878)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_declarator_suffix878)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_parameter_type_list_in_declarator_suffix880)
+                    self.following.append(
+                        self.FOLLOW_parameter_type_list_in_declarator_suffix880)
                     self.parameter_type_list()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_declarator_suffix882)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_declarator_suffix882)
                     if self.failed:
                         return
 
-
                 elif alt35 == 4:
                     # C.g:311:9: '(' identifier_list ')'
-                    self.match(self.input, 62, self.FOLLOW_62_in_declarator_suffix892)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_declarator_suffix892)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_identifier_list_in_declarator_suffix894)
+                    self.following.append(
+                        self.FOLLOW_identifier_list_in_declarator_suffix894)
                     self.identifier_list()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_declarator_suffix896)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_declarator_suffix896)
                     if self.failed:
                         return
 
-
                 elif alt35 == 5:
                     # C.g:312:9: '(' ')'
-                    self.match(self.input, 62, self.FOLLOW_62_in_declarator_suffix906)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_declarator_suffix906)
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_declarator_suffix908)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_declarator_suffix908)
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -3215,9 +3095,9 @@ class CParser(Parser):
 
     # $ANTLR end declarator_suffix
 
-
     # $ANTLR start pointer
     # C.g:315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );
+
     def pointer(self, ):
 
         pointer_StartIndex = self.input.index()
@@ -3230,69 +3110,73 @@ class CParser(Parser):
                 alt38 = 3
                 LA38_0 = self.input.LA(1)
 
-                if (LA38_0 == 66) :
+                if (LA38_0 == 66):
                     LA38 = self.input.LA(2)
                     if LA38 == 66:
                         LA38_2 = self.input.LA(3)
 
-                        if (self.synpred78()) :
+                        if (self.synpred78()):
                             alt38 = 2
-                        elif (True) :
+                        elif (True):
                             alt38 = 3
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 2, self.input)
+                            nvae = NoViableAltException(
+                                "315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 2, self.input)
 
                             raise nvae
 
                     elif LA38 == 58:
                         LA38_3 = self.input.LA(3)
 
-                        if (self.synpred77()) :
+                        if (self.synpred77()):
                             alt38 = 1
-                        elif (True) :
+                        elif (True):
                             alt38 = 3
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 3, self.input)
+                            nvae = NoViableAltException(
+                                "315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 3, self.input)
 
                             raise nvae
 
                     elif LA38 == 59:
                         LA38_4 = self.input.LA(3)
 
-                        if (self.synpred77()) :
+                        if (self.synpred77()):
                             alt38 = 1
-                        elif (True) :
+                        elif (True):
                             alt38 = 3
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 4, self.input)
+                            nvae = NoViableAltException(
+                                "315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 4, self.input)
 
                             raise nvae
 
                     elif LA38 == 60:
                         LA38_5 = self.input.LA(3)
 
-                        if (self.synpred77()) :
+                        if (self.synpred77()):
                             alt38 = 1
-                        elif (True) :
+                        elif (True):
                             alt38 = 3
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 5, self.input)
+                            nvae = NoViableAltException(
+                                "315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 5, self.input)
 
                             raise nvae
 
@@ -3301,32 +3185,34 @@ class CParser(Parser):
                     elif LA38 == 53:
                         LA38_21 = self.input.LA(3)
 
-                        if (self.synpred77()) :
+                        if (self.synpred77()):
                             alt38 = 1
-                        elif (True) :
+                        elif (True):
                             alt38 = 3
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 21, self.input)
+                            nvae = NoViableAltException(
+                                "315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 21, self.input)
 
                             raise nvae
 
                     elif LA38 == 49 or LA38 == 50 or LA38 == 51 or LA38 == 52 or LA38 == 54 or LA38 == 55 or LA38 == 56 or LA38 == 57 or LA38 == 61:
                         LA38_29 = self.input.LA(3)
 
-                        if (self.synpred77()) :
+                        if (self.synpred77()):
                             alt38 = 1
-                        elif (True) :
+                        elif (True):
                             alt38 = 3
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 29, self.input)
+                            nvae = NoViableAltException(
+                                "315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 29, self.input)
 
                             raise nvae
 
@@ -3335,7 +3221,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 1, self.input)
+                        nvae = NoViableAltException(
+                            "315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 1, self.input)
 
                         raise nvae
 
@@ -3344,7 +3231,8 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 0, self.input)
+                    nvae = NoViableAltException(
+                        "315:1: pointer : ( '*' ( type_qualifier )+ ( pointer )? | '*' pointer | '*' );", 38, 0, self.input)
 
                     raise nvae
 
@@ -3355,57 +3243,51 @@ class CParser(Parser):
                         return
                     # C.g:316:8: ( type_qualifier )+
                     cnt36 = 0
-                    while True: #loop36
+                    while True:  # loop36
                         alt36 = 2
                         LA36 = self.input.LA(1)
                         if LA36 == 58:
                             LA36_2 = self.input.LA(2)
 
-                            if (self.synpred75()) :
+                            if (self.synpred75()):
                                 alt36 = 1
 
-
                         elif LA36 == 59:
                             LA36_3 = self.input.LA(2)
 
-                            if (self.synpred75()) :
+                            if (self.synpred75()):
                                 alt36 = 1
 
-
                         elif LA36 == 60:
                             LA36_4 = self.input.LA(2)
 
-                            if (self.synpred75()) :
+                            if (self.synpred75()):
                                 alt36 = 1
 
-
                         elif LA36 == 53:
                             LA36_20 = self.input.LA(2)
 
-                            if (self.synpred75()) :
+                            if (self.synpred75()):
                                 alt36 = 1
 
-
                         elif LA36 == 49 or LA36 == 50 or LA36 == 51 or LA36 == 52 or LA36 == 54 or LA36 == 55 or LA36 == 56 or LA36 == 57 or LA36 == 61:
                             LA36_28 = self.input.LA(2)
 
-                            if (self.synpred75()) :
+                            if (self.synpred75()):
                                 alt36 = 1
 
-
-
                         if alt36 == 1:
                             # C.g:0:0: type_qualifier
-                            self.following.append(self.FOLLOW_type_qualifier_in_pointer921)
+                            self.following.append(
+                                self.FOLLOW_type_qualifier_in_pointer921)
                             self.type_qualifier()
                             self.following.pop()
                             if self.failed:
                                 return
 
-
                         else:
                             if cnt36 >= 1:
-                                break #loop36
+                                break  # loop36
 
                             if self.backtracking > 0:
                                 self.failed = True
@@ -3416,28 +3298,24 @@ class CParser(Parser):
 
                         cnt36 += 1
 
-
                     # C.g:316:24: ( pointer )?
                     alt37 = 2
                     LA37_0 = self.input.LA(1)
 
-                    if (LA37_0 == 66) :
+                    if (LA37_0 == 66):
                         LA37_1 = self.input.LA(2)
 
-                        if (self.synpred76()) :
+                        if (self.synpred76()):
                             alt37 = 1
                     if alt37 == 1:
                         # C.g:0:0: pointer
-                        self.following.append(self.FOLLOW_pointer_in_pointer924)
+                        self.following.append(
+                            self.FOLLOW_pointer_in_pointer924)
                         self.pointer()
                         self.following.pop()
                         if self.failed:
                             return
 
-
-
-
-
                 elif alt38 == 2:
                     # C.g:317:4: '*' pointer
                     self.match(self.input, 66, self.FOLLOW_66_in_pointer930)
@@ -3449,15 +3327,12 @@ class CParser(Parser):
                     if self.failed:
                         return
 
-
                 elif alt38 == 3:
                     # C.g:318:4: '*'
                     self.match(self.input, 66, self.FOLLOW_66_in_pointer937)
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -3471,9 +3346,9 @@ class CParser(Parser):
 
     # $ANTLR end pointer
 
-
     # $ANTLR start parameter_type_list
     # C.g:321:1: parameter_type_list : parameter_list ( ',' ( 'OPTIONAL' )? '...' )? ;
+
     def parameter_type_list(self, ):
 
         parameter_type_list_StartIndex = self.input.index()
@@ -3484,7 +3359,8 @@ class CParser(Parser):
 
                 # C.g:322:2: ( parameter_list ( ',' ( 'OPTIONAL' )? '...' )? )
                 # C.g:322:4: parameter_list ( ',' ( 'OPTIONAL' )? '...' )?
-                self.following.append(self.FOLLOW_parameter_list_in_parameter_type_list948)
+                self.following.append(
+                    self.FOLLOW_parameter_list_in_parameter_type_list948)
                 self.parameter_list()
                 self.following.pop()
                 if self.failed:
@@ -3493,37 +3369,32 @@ class CParser(Parser):
                 alt40 = 2
                 LA40_0 = self.input.LA(1)
 
-                if (LA40_0 == 27) :
+                if (LA40_0 == 27):
                     alt40 = 1
                 if alt40 == 1:
                     # C.g:322:20: ',' ( 'OPTIONAL' )? '...'
-                    self.match(self.input, 27, self.FOLLOW_27_in_parameter_type_list951)
+                    self.match(self.input, 27,
+                               self.FOLLOW_27_in_parameter_type_list951)
                     if self.failed:
                         return
                     # C.g:322:24: ( 'OPTIONAL' )?
                     alt39 = 2
                     LA39_0 = self.input.LA(1)
 
-                    if (LA39_0 == 53) :
+                    if (LA39_0 == 53):
                         alt39 = 1
                     if alt39 == 1:
                         # C.g:322:25: 'OPTIONAL'
-                        self.match(self.input, 53, self.FOLLOW_53_in_parameter_type_list954)
+                        self.match(self.input, 53,
+                                   self.FOLLOW_53_in_parameter_type_list954)
                         if self.failed:
                             return
 
-
-
-                    self.match(self.input, 67, self.FOLLOW_67_in_parameter_type_list958)
+                    self.match(self.input, 67,
+                               self.FOLLOW_67_in_parameter_type_list958)
                     if self.failed:
                         return
 
-
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -3537,9 +3408,9 @@ class CParser(Parser):
 
     # $ANTLR end parameter_type_list
 
-
     # $ANTLR start parameter_list
     # C.g:325:1: parameter_list : parameter_declaration ( ',' ( 'OPTIONAL' )? parameter_declaration )* ;
+
     def parameter_list(self, ):
 
         parameter_list_StartIndex = self.input.index()
@@ -3550,68 +3421,60 @@ class CParser(Parser):
 
                 # C.g:326:2: ( parameter_declaration ( ',' ( 'OPTIONAL' )? parameter_declaration )* )
                 # C.g:326:4: parameter_declaration ( ',' ( 'OPTIONAL' )? parameter_declaration )*
-                self.following.append(self.FOLLOW_parameter_declaration_in_parameter_list971)
+                self.following.append(
+                    self.FOLLOW_parameter_declaration_in_parameter_list971)
                 self.parameter_declaration()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:326:26: ( ',' ( 'OPTIONAL' )? parameter_declaration )*
-                while True: #loop42
+                while True:  # loop42
                     alt42 = 2
                     LA42_0 = self.input.LA(1)
 
-                    if (LA42_0 == 27) :
+                    if (LA42_0 == 27):
                         LA42_1 = self.input.LA(2)
 
-                        if (LA42_1 == 53) :
+                        if (LA42_1 == 53):
                             LA42_3 = self.input.LA(3)
 
-                            if (self.synpred82()) :
+                            if (self.synpred82()):
                                 alt42 = 1
 
-
-                        elif (LA42_1 == IDENTIFIER or (29 <= LA42_1 <= 42) or (45 <= LA42_1 <= 46) or (48 <= LA42_1 <= 52) or (54 <= LA42_1 <= 61) or LA42_1 == 66) :
+                        elif (LA42_1 == IDENTIFIER or (29 <= LA42_1 <= 42) or (45 <= LA42_1 <= 46) or (48 <= LA42_1 <= 52) or (54 <= LA42_1 <= 61) or LA42_1 == 66):
                             alt42 = 1
 
-
-
-
                     if alt42 == 1:
                         # C.g:326:27: ',' ( 'OPTIONAL' )? parameter_declaration
-                        self.match(self.input, 27, self.FOLLOW_27_in_parameter_list974)
+                        self.match(self.input, 27,
+                                   self.FOLLOW_27_in_parameter_list974)
                         if self.failed:
                             return
                         # C.g:326:31: ( 'OPTIONAL' )?
                         alt41 = 2
                         LA41_0 = self.input.LA(1)
 
-                        if (LA41_0 == 53) :
+                        if (LA41_0 == 53):
                             LA41_1 = self.input.LA(2)
 
-                            if (self.synpred81()) :
+                            if (self.synpred81()):
                                 alt41 = 1
                         if alt41 == 1:
                             # C.g:326:32: 'OPTIONAL'
-                            self.match(self.input, 53, self.FOLLOW_53_in_parameter_list977)
+                            self.match(self.input, 53,
+                                       self.FOLLOW_53_in_parameter_list977)
                             if self.failed:
                                 return
 
-
-
-                        self.following.append(self.FOLLOW_parameter_declaration_in_parameter_list981)
+                        self.following.append(
+                            self.FOLLOW_parameter_declaration_in_parameter_list981)
                         self.parameter_declaration()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop42
-
-
-
-
-
+                        break  # loop42
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -3626,9 +3489,9 @@ class CParser(Parser):
 
     # $ANTLR end parameter_list
 
-
     # $ANTLR start parameter_declaration
     # C.g:329:1: parameter_declaration : ( declaration_specifiers ( declarator | abstract_declarator )* ( 'OPTIONAL' )? | ( pointer )* IDENTIFIER );
+
     def parameter_declaration(self, ):
 
         parameter_declaration_StartIndex = self.input.index()
@@ -3645,16 +3508,17 @@ class CParser(Parser):
                 elif LA46 == IDENTIFIER:
                     LA46_13 = self.input.LA(2)
 
-                    if (self.synpred86()) :
+                    if (self.synpred86()):
                         alt46 = 1
-                    elif (True) :
+                    elif (True):
                         alt46 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("329:1: parameter_declaration : ( declaration_specifiers ( declarator | abstract_declarator )* ( 'OPTIONAL' )? | ( pointer )* IDENTIFIER );", 46, 13, self.input)
+                        nvae = NoViableAltException(
+                            "329:1: parameter_declaration : ( declaration_specifiers ( declarator | abstract_declarator )* ( 'OPTIONAL' )? | ( pointer )* IDENTIFIER );", 46, 13, self.input)
 
                         raise nvae
 
@@ -3665,30 +3529,31 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("329:1: parameter_declaration : ( declaration_specifiers ( declarator | abstract_declarator )* ( 'OPTIONAL' )? | ( pointer )* IDENTIFIER );", 46, 0, self.input)
+                    nvae = NoViableAltException(
+                        "329:1: parameter_declaration : ( declaration_specifiers ( declarator | abstract_declarator )* ( 'OPTIONAL' )? | ( pointer )* IDENTIFIER );", 46, 0, self.input)
 
                     raise nvae
 
                 if alt46 == 1:
                     # C.g:330:4: declaration_specifiers ( declarator | abstract_declarator )* ( 'OPTIONAL' )?
-                    self.following.append(self.FOLLOW_declaration_specifiers_in_parameter_declaration994)
+                    self.following.append(
+                        self.FOLLOW_declaration_specifiers_in_parameter_declaration994)
                     self.declaration_specifiers()
                     self.following.pop()
                     if self.failed:
                         return
                     # C.g:330:27: ( declarator | abstract_declarator )*
-                    while True: #loop43
+                    while True:  # loop43
                         alt43 = 3
                         LA43 = self.input.LA(1)
                         if LA43 == 66:
                             LA43_5 = self.input.LA(2)
 
-                            if (self.synpred83()) :
+                            if (self.synpred83()):
                                 alt43 = 1
-                            elif (self.synpred84()) :
+                            elif (self.synpred84()):
                                 alt43 = 2
 
-
                         elif LA43 == IDENTIFIER or LA43 == 58 or LA43 == 59 or LA43 == 60:
                             alt43 = 1
                         elif LA43 == 62:
@@ -3698,129 +3563,115 @@ class CParser(Parser):
                             elif LA43 == IDENTIFIER:
                                 LA43_37 = self.input.LA(3)
 
-                                if (self.synpred83()) :
+                                if (self.synpred83()):
                                     alt43 = 1
-                                elif (self.synpred84()) :
+                                elif (self.synpred84()):
                                     alt43 = 2
 
-
                             elif LA43 == 58:
                                 LA43_38 = self.input.LA(3)
 
-                                if (self.synpred83()) :
+                                if (self.synpred83()):
                                     alt43 = 1
-                                elif (self.synpred84()) :
+                                elif (self.synpred84()):
                                     alt43 = 2
 
-
                             elif LA43 == 66:
                                 LA43_39 = self.input.LA(3)
 
-                                if (self.synpred83()) :
+                                if (self.synpred83()):
                                     alt43 = 1
-                                elif (self.synpred84()) :
+                                elif (self.synpred84()):
                                     alt43 = 2
 
-
                             elif LA43 == 59:
                                 LA43_40 = self.input.LA(3)
 
-                                if (self.synpred83()) :
+                                if (self.synpred83()):
                                     alt43 = 1
-                                elif (self.synpred84()) :
+                                elif (self.synpred84()):
                                     alt43 = 2
 
-
                             elif LA43 == 60:
                                 LA43_41 = self.input.LA(3)
 
-                                if (self.synpred83()) :
+                                if (self.synpred83()):
                                     alt43 = 1
-                                elif (self.synpred84()) :
+                                elif (self.synpred84()):
                                     alt43 = 2
 
-
                             elif LA43 == 62:
                                 LA43_43 = self.input.LA(3)
 
-                                if (self.synpred83()) :
+                                if (self.synpred83()):
                                     alt43 = 1
-                                elif (self.synpred84()) :
+                                elif (self.synpred84()):
                                     alt43 = 2
 
-
-
                         elif LA43 == 64:
                             alt43 = 2
 
                         if alt43 == 1:
                             # C.g:330:28: declarator
-                            self.following.append(self.FOLLOW_declarator_in_parameter_declaration997)
+                            self.following.append(
+                                self.FOLLOW_declarator_in_parameter_declaration997)
                             self.declarator()
                             self.following.pop()
                             if self.failed:
                                 return
 
-
                         elif alt43 == 2:
                             # C.g:330:39: abstract_declarator
-                            self.following.append(self.FOLLOW_abstract_declarator_in_parameter_declaration999)
+                            self.following.append(
+                                self.FOLLOW_abstract_declarator_in_parameter_declaration999)
                             self.abstract_declarator()
                             self.following.pop()
                             if self.failed:
                                 return
 
-
                         else:
-                            break #loop43
-
+                            break  # loop43
 
                     # C.g:330:61: ( 'OPTIONAL' )?
                     alt44 = 2
                     LA44_0 = self.input.LA(1)
 
-                    if (LA44_0 == 53) :
+                    if (LA44_0 == 53):
                         alt44 = 1
                     if alt44 == 1:
                         # C.g:330:62: 'OPTIONAL'
-                        self.match(self.input, 53, self.FOLLOW_53_in_parameter_declaration1004)
+                        self.match(self.input, 53,
+                                   self.FOLLOW_53_in_parameter_declaration1004)
                         if self.failed:
                             return
 
-
-
-
-
                 elif alt46 == 2:
                     # C.g:332:4: ( pointer )* IDENTIFIER
                     # C.g:332:4: ( pointer )*
-                    while True: #loop45
+                    while True:  # loop45
                         alt45 = 2
                         LA45_0 = self.input.LA(1)
 
-                        if (LA45_0 == 66) :
+                        if (LA45_0 == 66):
                             alt45 = 1
 
-
                         if alt45 == 1:
                             # C.g:0:0: pointer
-                            self.following.append(self.FOLLOW_pointer_in_parameter_declaration1013)
+                            self.following.append(
+                                self.FOLLOW_pointer_in_parameter_declaration1013)
                             self.pointer()
                             self.following.pop()
                             if self.failed:
                                 return
 
-
                         else:
-                            break #loop45
+                            break  # loop45
 
-
-                    self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_parameter_declaration1016)
+                    self.match(self.input, IDENTIFIER,
+                               self.FOLLOW_IDENTIFIER_in_parameter_declaration1016)
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -3834,9 +3685,9 @@ class CParser(Parser):
 
     # $ANTLR end parameter_declaration
 
-
     # $ANTLR start identifier_list
     # C.g:335:1: identifier_list : IDENTIFIER ( ',' IDENTIFIER )* ;
+
     def identifier_list(self, ):
 
         identifier_list_StartIndex = self.input.index()
@@ -3847,35 +3698,31 @@ class CParser(Parser):
 
                 # C.g:336:2: ( IDENTIFIER ( ',' IDENTIFIER )* )
                 # C.g:336:4: IDENTIFIER ( ',' IDENTIFIER )*
-                self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_identifier_list1027)
+                self.match(self.input, IDENTIFIER,
+                           self.FOLLOW_IDENTIFIER_in_identifier_list1027)
                 if self.failed:
                     return
                 # C.g:337:2: ( ',' IDENTIFIER )*
-                while True: #loop47
+                while True:  # loop47
                     alt47 = 2
                     LA47_0 = self.input.LA(1)
 
-                    if (LA47_0 == 27) :
+                    if (LA47_0 == 27):
                         alt47 = 1
 
-
                     if alt47 == 1:
                         # C.g:337:3: ',' IDENTIFIER
-                        self.match(self.input, 27, self.FOLLOW_27_in_identifier_list1031)
+                        self.match(self.input, 27,
+                                   self.FOLLOW_27_in_identifier_list1031)
                         if self.failed:
                             return
-                        self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_identifier_list1033)
+                        self.match(self.input, IDENTIFIER,
+                                   self.FOLLOW_IDENTIFIER_in_identifier_list1033)
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop47
-
-
-
-
-
+                        break  # loop47
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -3890,9 +3737,9 @@ class CParser(Parser):
 
     # $ANTLR end identifier_list
 
-
     # $ANTLR start type_name
     # C.g:340:1: type_name : ( specifier_qualifier_list ( abstract_declarator )? | type_id );
+
     def type_name(self, ):
 
         type_name_StartIndex = self.input.index()
@@ -3905,21 +3752,22 @@ class CParser(Parser):
                 alt49 = 2
                 LA49_0 = self.input.LA(1)
 
-                if ((34 <= LA49_0 <= 42) or (45 <= LA49_0 <= 46) or (48 <= LA49_0 <= 61)) :
+                if ((34 <= LA49_0 <= 42) or (45 <= LA49_0 <= 46) or (48 <= LA49_0 <= 61)):
                     alt49 = 1
-                elif (LA49_0 == IDENTIFIER) :
+                elif (LA49_0 == IDENTIFIER):
                     LA49_13 = self.input.LA(2)
 
-                    if (self.synpred90()) :
+                    if (self.synpred90()):
                         alt49 = 1
-                    elif (True) :
+                    elif (True):
                         alt49 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("340:1: type_name : ( specifier_qualifier_list ( abstract_declarator )? | type_id );", 49, 13, self.input)
+                        nvae = NoViableAltException(
+                            "340:1: type_name : ( specifier_qualifier_list ( abstract_declarator )? | type_id );", 49, 13, self.input)
 
                         raise nvae
 
@@ -3928,13 +3776,15 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("340:1: type_name : ( specifier_qualifier_list ( abstract_declarator )? | type_id );", 49, 0, self.input)
+                    nvae = NoViableAltException(
+                        "340:1: type_name : ( specifier_qualifier_list ( abstract_declarator )? | type_id );", 49, 0, self.input)
 
                     raise nvae
 
                 if alt49 == 1:
                     # C.g:341:4: specifier_qualifier_list ( abstract_declarator )?
-                    self.following.append(self.FOLLOW_specifier_qualifier_list_in_type_name1046)
+                    self.following.append(
+                        self.FOLLOW_specifier_qualifier_list_in_type_name1046)
                     self.specifier_qualifier_list()
                     self.following.pop()
                     if self.failed:
@@ -3943,20 +3793,17 @@ class CParser(Parser):
                     alt48 = 2
                     LA48_0 = self.input.LA(1)
 
-                    if (LA48_0 == 62 or LA48_0 == 64 or LA48_0 == 66) :
+                    if (LA48_0 == 62 or LA48_0 == 64 or LA48_0 == 66):
                         alt48 = 1
                     if alt48 == 1:
                         # C.g:0:0: abstract_declarator
-                        self.following.append(self.FOLLOW_abstract_declarator_in_type_name1048)
+                        self.following.append(
+                            self.FOLLOW_abstract_declarator_in_type_name1048)
                         self.abstract_declarator()
                         self.following.pop()
                         if self.failed:
                             return
 
-
-
-
-
                 elif alt49 == 2:
                     # C.g:342:4: type_id
                     self.following.append(self.FOLLOW_type_id_in_type_name1054)
@@ -3965,8 +3812,6 @@ class CParser(Parser):
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -3980,9 +3825,9 @@ class CParser(Parser):
 
     # $ANTLR end type_name
 
-
     # $ANTLR start abstract_declarator
     # C.g:345:1: abstract_declarator : ( pointer ( direct_abstract_declarator )? | direct_abstract_declarator );
+
     def abstract_declarator(self, ):
 
         abstract_declarator_StartIndex = self.input.index()
@@ -3995,22 +3840,24 @@ class CParser(Parser):
                 alt51 = 2
                 LA51_0 = self.input.LA(1)
 
-                if (LA51_0 == 66) :
+                if (LA51_0 == 66):
                     alt51 = 1
-                elif (LA51_0 == 62 or LA51_0 == 64) :
+                elif (LA51_0 == 62 or LA51_0 == 64):
                     alt51 = 2
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("345:1: abstract_declarator : ( pointer ( direct_abstract_declarator )? | direct_abstract_declarator );", 51, 0, self.input)
+                    nvae = NoViableAltException(
+                        "345:1: abstract_declarator : ( pointer ( direct_abstract_declarator )? | direct_abstract_declarator );", 51, 0, self.input)
 
                     raise nvae
 
                 if alt51 == 1:
                     # C.g:346:4: pointer ( direct_abstract_declarator )?
-                    self.following.append(self.FOLLOW_pointer_in_abstract_declarator1065)
+                    self.following.append(
+                        self.FOLLOW_pointer_in_abstract_declarator1065)
                     self.pointer()
                     self.following.pop()
                     if self.failed:
@@ -4019,202 +3866,198 @@ class CParser(Parser):
                     alt50 = 2
                     LA50_0 = self.input.LA(1)
 
-                    if (LA50_0 == 62) :
+                    if (LA50_0 == 62):
                         LA50 = self.input.LA(2)
                         if LA50 == 63:
                             LA50_12 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 58:
                             LA50_13 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 66:
                             LA50_14 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 59:
                             LA50_15 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 60:
                             LA50_16 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == IDENTIFIER:
                             LA50_17 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 62:
                             LA50_18 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 64:
                             LA50_19 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 29 or LA50 == 30 or LA50 == 31 or LA50 == 32 or LA50 == 33:
                             LA50_20 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 34:
                             LA50_21 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 35:
                             LA50_22 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 36:
                             LA50_23 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 37:
                             LA50_24 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 38:
                             LA50_25 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 39:
                             LA50_26 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 40:
                             LA50_27 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 41:
                             LA50_28 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 42:
                             LA50_29 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 45 or LA50 == 46:
                             LA50_30 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 48:
                             LA50_31 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 49 or LA50 == 50 or LA50 == 51 or LA50 == 52 or LA50 == 53 or LA50 == 54 or LA50 == 55 or LA50 == 56 or LA50 == 57 or LA50 == 61:
                             LA50_32 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
-                    elif (LA50_0 == 64) :
+                    elif (LA50_0 == 64):
                         LA50 = self.input.LA(2)
                         if LA50 == 65:
                             LA50_33 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 62:
                             LA50_34 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == IDENTIFIER:
                             LA50_35 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == HEX_LITERAL:
                             LA50_36 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == OCTAL_LITERAL:
                             LA50_37 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == DECIMAL_LITERAL:
                             LA50_38 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == CHARACTER_LITERAL:
                             LA50_39 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == STRING_LITERAL:
                             LA50_40 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == FLOATING_POINT_LITERAL:
                             LA50_41 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 72:
                             LA50_42 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 73:
                             LA50_43 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 66 or LA50 == 68 or LA50 == 69 or LA50 == 77 or LA50 == 78 or LA50 == 79:
                             LA50_44 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                         elif LA50 == 74:
                             LA50_45 = self.input.LA(3)
 
-                            if (self.synpred91()) :
+                            if (self.synpred91()):
                                 alt50 = 1
                     if alt50 == 1:
                         # C.g:0:0: direct_abstract_declarator
-                        self.following.append(self.FOLLOW_direct_abstract_declarator_in_abstract_declarator1067)
+                        self.following.append(
+                            self.FOLLOW_direct_abstract_declarator_in_abstract_declarator1067)
                         self.direct_abstract_declarator()
                         self.following.pop()
                         if self.failed:
                             return
 
-
-
-
-
                 elif alt51 == 2:
                     # C.g:347:4: direct_abstract_declarator
-                    self.following.append(self.FOLLOW_direct_abstract_declarator_in_abstract_declarator1073)
+                    self.following.append(
+                        self.FOLLOW_direct_abstract_declarator_in_abstract_declarator1073)
                     self.direct_abstract_declarator()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -4228,9 +4071,9 @@ class CParser(Parser):
 
     # $ANTLR end abstract_declarator
 
-
     # $ANTLR start direct_abstract_declarator
     # C.g:350:1: direct_abstract_declarator : ( '(' abstract_declarator ')' | abstract_declarator_suffix ) ( abstract_declarator_suffix )* ;
+
     def direct_abstract_declarator(self, ):
 
         direct_abstract_declarator_StartIndex = self.input.index()
@@ -4245,23 +4088,24 @@ class CParser(Parser):
                 alt52 = 2
                 LA52_0 = self.input.LA(1)
 
-                if (LA52_0 == 62) :
+                if (LA52_0 == 62):
                     LA52 = self.input.LA(2)
                     if LA52 == IDENTIFIER or LA52 == 29 or LA52 == 30 or LA52 == 31 or LA52 == 32 or LA52 == 33 or LA52 == 34 or LA52 == 35 or LA52 == 36 or LA52 == 37 or LA52 == 38 or LA52 == 39 or LA52 == 40 or LA52 == 41 or LA52 == 42 or LA52 == 45 or LA52 == 46 or LA52 == 48 or LA52 == 49 or LA52 == 50 or LA52 == 51 or LA52 == 52 or LA52 == 53 or LA52 == 54 or LA52 == 55 or LA52 == 56 or LA52 == 57 or LA52 == 58 or LA52 == 59 or LA52 == 60 or LA52 == 61 or LA52 == 63:
                         alt52 = 2
                     elif LA52 == 66:
                         LA52_18 = self.input.LA(3)
 
-                        if (self.synpred93()) :
+                        if (self.synpred93()):
                             alt52 = 1
-                        elif (True) :
+                        elif (True):
                             alt52 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("351:4: ( '(' abstract_declarator ')' | abstract_declarator_suffix )", 52, 18, self.input)
+                            nvae = NoViableAltException(
+                                "351:4: ( '(' abstract_declarator ')' | abstract_declarator_suffix )", 52, 18, self.input)
 
                             raise nvae
 
@@ -4272,306 +4116,269 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("351:4: ( '(' abstract_declarator ')' | abstract_declarator_suffix )", 52, 1, self.input)
+                        nvae = NoViableAltException(
+                            "351:4: ( '(' abstract_declarator ')' | abstract_declarator_suffix )", 52, 1, self.input)
 
                         raise nvae
 
-                elif (LA52_0 == 64) :
+                elif (LA52_0 == 64):
                     alt52 = 2
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("351:4: ( '(' abstract_declarator ')' | abstract_declarator_suffix )", 52, 0, self.input)
+                    nvae = NoViableAltException(
+                        "351:4: ( '(' abstract_declarator ')' | abstract_declarator_suffix )", 52, 0, self.input)
 
                     raise nvae
 
                 if alt52 == 1:
                     # C.g:351:6: '(' abstract_declarator ')'
-                    self.match(self.input, 62, self.FOLLOW_62_in_direct_abstract_declarator1086)
+                    self.match(
+                        self.input, 62, self.FOLLOW_62_in_direct_abstract_declarator1086)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_abstract_declarator_in_direct_abstract_declarator1088)
+                    self.following.append(
+                        self.FOLLOW_abstract_declarator_in_direct_abstract_declarator1088)
                     self.abstract_declarator()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_direct_abstract_declarator1090)
+                    self.match(
+                        self.input, 63, self.FOLLOW_63_in_direct_abstract_declarator1090)
                     if self.failed:
                         return
 
-
                 elif alt52 == 2:
                     # C.g:351:36: abstract_declarator_suffix
-                    self.following.append(self.FOLLOW_abstract_declarator_suffix_in_direct_abstract_declarator1094)
+                    self.following.append(
+                        self.FOLLOW_abstract_declarator_suffix_in_direct_abstract_declarator1094)
                     self.abstract_declarator_suffix()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
                 # C.g:351:65: ( abstract_declarator_suffix )*
-                while True: #loop53
+                while True:  # loop53
                     alt53 = 2
                     LA53_0 = self.input.LA(1)
 
-                    if (LA53_0 == 62) :
+                    if (LA53_0 == 62):
                         LA53 = self.input.LA(2)
                         if LA53 == 63:
                             LA53_12 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 58:
                             LA53_13 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 66:
                             LA53_14 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 59:
                             LA53_15 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 60:
                             LA53_16 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == IDENTIFIER:
                             LA53_17 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 29 or LA53 == 30 or LA53 == 31 or LA53 == 32 or LA53 == 33:
                             LA53_19 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 34:
                             LA53_20 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 35:
                             LA53_21 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 36:
                             LA53_22 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 37:
                             LA53_23 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 38:
                             LA53_24 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 39:
                             LA53_25 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 40:
                             LA53_26 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 41:
                             LA53_27 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 42:
                             LA53_28 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 45 or LA53 == 46:
                             LA53_29 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 48:
                             LA53_30 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 49 or LA53 == 50 or LA53 == 51 or LA53 == 52 or LA53 == 53 or LA53 == 54 or LA53 == 55 or LA53 == 56 or LA53 == 57 or LA53 == 61:
                             LA53_31 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
-
-                    elif (LA53_0 == 64) :
+                    elif (LA53_0 == 64):
                         LA53 = self.input.LA(2)
                         if LA53 == 65:
                             LA53_33 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 62:
                             LA53_34 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == IDENTIFIER:
                             LA53_35 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == HEX_LITERAL:
                             LA53_36 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == OCTAL_LITERAL:
                             LA53_37 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == DECIMAL_LITERAL:
                             LA53_38 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == CHARACTER_LITERAL:
                             LA53_39 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == STRING_LITERAL:
                             LA53_40 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == FLOATING_POINT_LITERAL:
                             LA53_41 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 72:
                             LA53_42 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 73:
                             LA53_43 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 66 or LA53 == 68 or LA53 == 69 or LA53 == 77 or LA53 == 78 or LA53 == 79:
                             LA53_44 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
                         elif LA53 == 74:
                             LA53_45 = self.input.LA(3)
 
-                            if (self.synpred94()) :
+                            if (self.synpred94()):
                                 alt53 = 1
 
-
-
-
-
                     if alt53 == 1:
                         # C.g:0:0: abstract_declarator_suffix
-                        self.following.append(self.FOLLOW_abstract_declarator_suffix_in_direct_abstract_declarator1098)
+                        self.following.append(
+                            self.FOLLOW_abstract_declarator_suffix_in_direct_abstract_declarator1098)
                         self.abstract_declarator_suffix()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop53
-
-
-
-
-
+                        break  # loop53
 
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
             if self.backtracking > 0:
-                self.memoize(self.input, 32, direct_abstract_declarator_StartIndex)
+                self.memoize(self.input, 32,
+                             direct_abstract_declarator_StartIndex)
 
             pass
 
@@ -4579,9 +4386,9 @@ class CParser(Parser):
 
     # $ANTLR end direct_abstract_declarator
 
-
     # $ANTLR start abstract_declarator_suffix
     # C.g:354:1: abstract_declarator_suffix : ( '[' ']' | '[' constant_expression ']' | '(' ')' | '(' parameter_type_list ')' );
+
     def abstract_declarator_suffix(self, ):
 
         abstract_declarator_suffix_StartIndex = self.input.index()
@@ -4594,35 +4401,37 @@ class CParser(Parser):
                 alt54 = 4
                 LA54_0 = self.input.LA(1)
 
-                if (LA54_0 == 64) :
+                if (LA54_0 == 64):
                     LA54_1 = self.input.LA(2)
 
-                    if (LA54_1 == 65) :
+                    if (LA54_1 == 65):
                         alt54 = 1
-                    elif ((IDENTIFIER <= LA54_1 <= FLOATING_POINT_LITERAL) or LA54_1 == 62 or LA54_1 == 66 or (68 <= LA54_1 <= 69) or (72 <= LA54_1 <= 74) or (77 <= LA54_1 <= 79)) :
+                    elif ((IDENTIFIER <= LA54_1 <= FLOATING_POINT_LITERAL) or LA54_1 == 62 or LA54_1 == 66 or (68 <= LA54_1 <= 69) or (72 <= LA54_1 <= 74) or (77 <= LA54_1 <= 79)):
                         alt54 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("354:1: abstract_declarator_suffix : ( '[' ']' | '[' constant_expression ']' | '(' ')' | '(' parameter_type_list ')' );", 54, 1, self.input)
+                        nvae = NoViableAltException(
+                            "354:1: abstract_declarator_suffix : ( '[' ']' | '[' constant_expression ']' | '(' ')' | '(' parameter_type_list ')' );", 54, 1, self.input)
 
                         raise nvae
 
-                elif (LA54_0 == 62) :
+                elif (LA54_0 == 62):
                     LA54_2 = self.input.LA(2)
 
-                    if (LA54_2 == 63) :
+                    if (LA54_2 == 63):
                         alt54 = 3
-                    elif (LA54_2 == IDENTIFIER or (29 <= LA54_2 <= 42) or (45 <= LA54_2 <= 46) or (48 <= LA54_2 <= 61) or LA54_2 == 66) :
+                    elif (LA54_2 == IDENTIFIER or (29 <= LA54_2 <= 42) or (45 <= LA54_2 <= 46) or (48 <= LA54_2 <= 61) or LA54_2 == 66):
                         alt54 = 4
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("354:1: abstract_declarator_suffix : ( '[' ']' | '[' constant_expression ']' | '(' ')' | '(' parameter_type_list ')' );", 54, 2, self.input)
+                        nvae = NoViableAltException(
+                            "354:1: abstract_declarator_suffix : ( '[' ']' | '[' constant_expression ']' | '(' ')' | '(' parameter_type_list ')' );", 54, 2, self.input)
 
                         raise nvae
 
@@ -4631,67 +4440,74 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("354:1: abstract_declarator_suffix : ( '[' ']' | '[' constant_expression ']' | '(' ')' | '(' parameter_type_list ')' );", 54, 0, self.input)
+                    nvae = NoViableAltException(
+                        "354:1: abstract_declarator_suffix : ( '[' ']' | '[' constant_expression ']' | '(' ')' | '(' parameter_type_list ')' );", 54, 0, self.input)
 
                     raise nvae
 
                 if alt54 == 1:
                     # C.g:355:4: '[' ']'
-                    self.match(self.input, 64, self.FOLLOW_64_in_abstract_declarator_suffix1110)
+                    self.match(
+                        self.input, 64, self.FOLLOW_64_in_abstract_declarator_suffix1110)
                     if self.failed:
                         return
-                    self.match(self.input, 65, self.FOLLOW_65_in_abstract_declarator_suffix1112)
+                    self.match(
+                        self.input, 65, self.FOLLOW_65_in_abstract_declarator_suffix1112)
                     if self.failed:
                         return
 
-
                 elif alt54 == 2:
                     # C.g:356:4: '[' constant_expression ']'
-                    self.match(self.input, 64, self.FOLLOW_64_in_abstract_declarator_suffix1117)
+                    self.match(
+                        self.input, 64, self.FOLLOW_64_in_abstract_declarator_suffix1117)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_constant_expression_in_abstract_declarator_suffix1119)
+                    self.following.append(
+                        self.FOLLOW_constant_expression_in_abstract_declarator_suffix1119)
                     self.constant_expression()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 65, self.FOLLOW_65_in_abstract_declarator_suffix1121)
+                    self.match(
+                        self.input, 65, self.FOLLOW_65_in_abstract_declarator_suffix1121)
                     if self.failed:
                         return
 
-
                 elif alt54 == 3:
                     # C.g:357:4: '(' ')'
-                    self.match(self.input, 62, self.FOLLOW_62_in_abstract_declarator_suffix1126)
+                    self.match(
+                        self.input, 62, self.FOLLOW_62_in_abstract_declarator_suffix1126)
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_abstract_declarator_suffix1128)
+                    self.match(
+                        self.input, 63, self.FOLLOW_63_in_abstract_declarator_suffix1128)
                     if self.failed:
                         return
 
-
                 elif alt54 == 4:
                     # C.g:358:4: '(' parameter_type_list ')'
-                    self.match(self.input, 62, self.FOLLOW_62_in_abstract_declarator_suffix1133)
+                    self.match(
+                        self.input, 62, self.FOLLOW_62_in_abstract_declarator_suffix1133)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_parameter_type_list_in_abstract_declarator_suffix1135)
+                    self.following.append(
+                        self.FOLLOW_parameter_type_list_in_abstract_declarator_suffix1135)
                     self.parameter_type_list()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_abstract_declarator_suffix1137)
+                    self.match(
+                        self.input, 63, self.FOLLOW_63_in_abstract_declarator_suffix1137)
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
             if self.backtracking > 0:
-                self.memoize(self.input, 33, abstract_declarator_suffix_StartIndex)
+                self.memoize(self.input, 33,
+                             abstract_declarator_suffix_StartIndex)
 
             pass
 
@@ -4699,9 +4515,9 @@ class CParser(Parser):
 
     # $ANTLR end abstract_declarator_suffix
 
-
     # $ANTLR start initializer
     # C.g:361:1: initializer : ( assignment_expression | '{' initializer_list ( ',' )? '}' );
+
     def initializer(self, ):
 
         initializer_StartIndex = self.input.index()
@@ -4714,34 +4530,37 @@ class CParser(Parser):
                 alt56 = 2
                 LA56_0 = self.input.LA(1)
 
-                if ((IDENTIFIER <= LA56_0 <= FLOATING_POINT_LITERAL) or LA56_0 == 62 or LA56_0 == 66 or (68 <= LA56_0 <= 69) or (72 <= LA56_0 <= 74) or (77 <= LA56_0 <= 79)) :
+                if ((IDENTIFIER <= LA56_0 <= FLOATING_POINT_LITERAL) or LA56_0 == 62 or LA56_0 == 66 or (68 <= LA56_0 <= 69) or (72 <= LA56_0 <= 74) or (77 <= LA56_0 <= 79)):
                     alt56 = 1
-                elif (LA56_0 == 43) :
+                elif (LA56_0 == 43):
                     alt56 = 2
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("361:1: initializer : ( assignment_expression | '{' initializer_list ( ',' )? '}' );", 56, 0, self.input)
+                    nvae = NoViableAltException(
+                        "361:1: initializer : ( assignment_expression | '{' initializer_list ( ',' )? '}' );", 56, 0, self.input)
 
                     raise nvae
 
                 if alt56 == 1:
                     # C.g:363:4: assignment_expression
-                    self.following.append(self.FOLLOW_assignment_expression_in_initializer1150)
+                    self.following.append(
+                        self.FOLLOW_assignment_expression_in_initializer1150)
                     self.assignment_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt56 == 2:
                     # C.g:364:4: '{' initializer_list ( ',' )? '}'
-                    self.match(self.input, 43, self.FOLLOW_43_in_initializer1155)
+                    self.match(self.input, 43,
+                               self.FOLLOW_43_in_initializer1155)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_initializer_list_in_initializer1157)
+                    self.following.append(
+                        self.FOLLOW_initializer_list_in_initializer1157)
                     self.initializer_list()
                     self.following.pop()
                     if self.failed:
@@ -4750,22 +4569,20 @@ class CParser(Parser):
                     alt55 = 2
                     LA55_0 = self.input.LA(1)
 
-                    if (LA55_0 == 27) :
+                    if (LA55_0 == 27):
                         alt55 = 1
                     if alt55 == 1:
                         # C.g:0:0: ','
-                        self.match(self.input, 27, self.FOLLOW_27_in_initializer1159)
+                        self.match(self.input, 27,
+                                   self.FOLLOW_27_in_initializer1159)
                         if self.failed:
                             return
 
-
-
-                    self.match(self.input, 44, self.FOLLOW_44_in_initializer1162)
+                    self.match(self.input, 44,
+                               self.FOLLOW_44_in_initializer1162)
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -4779,9 +4596,9 @@ class CParser(Parser):
 
     # $ANTLR end initializer
 
-
     # $ANTLR start initializer_list
     # C.g:367:1: initializer_list : initializer ( ',' initializer )* ;
+
     def initializer_list(self, ):
 
         initializer_list_StartIndex = self.input.index()
@@ -4792,44 +4609,38 @@ class CParser(Parser):
 
                 # C.g:368:2: ( initializer ( ',' initializer )* )
                 # C.g:368:4: initializer ( ',' initializer )*
-                self.following.append(self.FOLLOW_initializer_in_initializer_list1173)
+                self.following.append(
+                    self.FOLLOW_initializer_in_initializer_list1173)
                 self.initializer()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:368:16: ( ',' initializer )*
-                while True: #loop57
+                while True:  # loop57
                     alt57 = 2
                     LA57_0 = self.input.LA(1)
 
-                    if (LA57_0 == 27) :
+                    if (LA57_0 == 27):
                         LA57_1 = self.input.LA(2)
 
-                        if ((IDENTIFIER <= LA57_1 <= FLOATING_POINT_LITERAL) or LA57_1 == 43 or LA57_1 == 62 or LA57_1 == 66 or (68 <= LA57_1 <= 69) or (72 <= LA57_1 <= 74) or (77 <= LA57_1 <= 79)) :
+                        if ((IDENTIFIER <= LA57_1 <= FLOATING_POINT_LITERAL) or LA57_1 == 43 or LA57_1 == 62 or LA57_1 == 66 or (68 <= LA57_1 <= 69) or (72 <= LA57_1 <= 74) or (77 <= LA57_1 <= 79)):
                             alt57 = 1
 
-
-
-
                     if alt57 == 1:
                         # C.g:368:17: ',' initializer
-                        self.match(self.input, 27, self.FOLLOW_27_in_initializer_list1176)
+                        self.match(self.input, 27,
+                                   self.FOLLOW_27_in_initializer_list1176)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_initializer_in_initializer_list1178)
+                        self.following.append(
+                            self.FOLLOW_initializer_in_initializer_list1178)
                         self.initializer()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop57
-
-
-
-
-
+                        break  # loop57
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -4849,10 +4660,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start argument_expression_list
     # C.g:373:1: argument_expression_list : assignment_expression ( 'OPTIONAL' )? ( ',' assignment_expression ( 'OPTIONAL' )? )* ;
+
     def argument_expression_list(self, ):
 
         retval = self.argument_expression_list_return()
@@ -4865,7 +4675,8 @@ class CParser(Parser):
 
                 # C.g:374:2: ( assignment_expression ( 'OPTIONAL' )? ( ',' assignment_expression ( 'OPTIONAL' )? )* )
                 # C.g:374:6: assignment_expression ( 'OPTIONAL' )? ( ',' assignment_expression ( 'OPTIONAL' )? )*
-                self.following.append(self.FOLLOW_assignment_expression_in_argument_expression_list1196)
+                self.following.append(
+                    self.FOLLOW_assignment_expression_in_argument_expression_list1196)
                 self.assignment_expression()
                 self.following.pop()
                 if self.failed:
@@ -4874,31 +4685,31 @@ class CParser(Parser):
                 alt58 = 2
                 LA58_0 = self.input.LA(1)
 
-                if (LA58_0 == 53) :
+                if (LA58_0 == 53):
                     alt58 = 1
                 if alt58 == 1:
                     # C.g:374:29: 'OPTIONAL'
-                    self.match(self.input, 53, self.FOLLOW_53_in_argument_expression_list1199)
+                    self.match(self.input, 53,
+                               self.FOLLOW_53_in_argument_expression_list1199)
                     if self.failed:
                         return retval
 
-
-
                 # C.g:374:42: ( ',' assignment_expression ( 'OPTIONAL' )? )*
-                while True: #loop60
+                while True:  # loop60
                     alt60 = 2
                     LA60_0 = self.input.LA(1)
 
-                    if (LA60_0 == 27) :
+                    if (LA60_0 == 27):
                         alt60 = 1
 
-
                     if alt60 == 1:
                         # C.g:374:43: ',' assignment_expression ( 'OPTIONAL' )?
-                        self.match(self.input, 27, self.FOLLOW_27_in_argument_expression_list1204)
+                        self.match(
+                            self.input, 27, self.FOLLOW_27_in_argument_expression_list1204)
                         if self.failed:
                             return retval
-                        self.following.append(self.FOLLOW_assignment_expression_in_argument_expression_list1206)
+                        self.following.append(
+                            self.FOLLOW_assignment_expression_in_argument_expression_list1206)
                         self.assignment_expression()
                         self.following.pop()
                         if self.failed:
@@ -4907,34 +4718,27 @@ class CParser(Parser):
                         alt59 = 2
                         LA59_0 = self.input.LA(1)
 
-                        if (LA59_0 == 53) :
+                        if (LA59_0 == 53):
                             alt59 = 1
                         if alt59 == 1:
                             # C.g:374:70: 'OPTIONAL'
-                            self.match(self.input, 53, self.FOLLOW_53_in_argument_expression_list1209)
+                            self.match(
+                                self.input, 53, self.FOLLOW_53_in_argument_expression_list1209)
                             if self.failed:
                                 return retval
 
-
-
-
-
                     else:
-                        break #loop60
-
-
-
-
+                        break  # loop60
 
                 retval.stop = self.input.LT(-1)
 
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
             if self.backtracking > 0:
-                self.memoize(self.input, 36, argument_expression_list_StartIndex)
+                self.memoize(self.input, 36,
+                             argument_expression_list_StartIndex)
 
             pass
 
@@ -4942,9 +4746,9 @@ class CParser(Parser):
 
     # $ANTLR end argument_expression_list
 
-
     # $ANTLR start additive_expression
     # C.g:377:1: additive_expression : ( multiplicative_expression ) ( '+' multiplicative_expression | '-' multiplicative_expression )* ;
+
     def additive_expression(self, ):
 
         additive_expression_StartIndex = self.input.index()
@@ -4957,56 +4761,51 @@ class CParser(Parser):
                 # C.g:378:4: ( multiplicative_expression ) ( '+' multiplicative_expression | '-' multiplicative_expression )*
                 # C.g:378:4: ( multiplicative_expression )
                 # C.g:378:5: multiplicative_expression
-                self.following.append(self.FOLLOW_multiplicative_expression_in_additive_expression1225)
+                self.following.append(
+                    self.FOLLOW_multiplicative_expression_in_additive_expression1225)
                 self.multiplicative_expression()
                 self.following.pop()
                 if self.failed:
                     return
 
-
-
                 # C.g:378:32: ( '+' multiplicative_expression | '-' multiplicative_expression )*
-                while True: #loop61
+                while True:  # loop61
                     alt61 = 3
                     LA61_0 = self.input.LA(1)
 
-                    if (LA61_0 == 68) :
+                    if (LA61_0 == 68):
                         alt61 = 1
-                    elif (LA61_0 == 69) :
+                    elif (LA61_0 == 69):
                         alt61 = 2
 
-
                     if alt61 == 1:
                         # C.g:378:33: '+' multiplicative_expression
-                        self.match(self.input, 68, self.FOLLOW_68_in_additive_expression1229)
+                        self.match(self.input, 68,
+                                   self.FOLLOW_68_in_additive_expression1229)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_multiplicative_expression_in_additive_expression1231)
+                        self.following.append(
+                            self.FOLLOW_multiplicative_expression_in_additive_expression1231)
                         self.multiplicative_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     elif alt61 == 2:
                         # C.g:378:65: '-' multiplicative_expression
-                        self.match(self.input, 69, self.FOLLOW_69_in_additive_expression1235)
+                        self.match(self.input, 69,
+                                   self.FOLLOW_69_in_additive_expression1235)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_multiplicative_expression_in_additive_expression1237)
+                        self.following.append(
+                            self.FOLLOW_multiplicative_expression_in_additive_expression1237)
                         self.multiplicative_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop61
-
-
-
-
-
+                        break  # loop61
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -5021,9 +4820,9 @@ class CParser(Parser):
 
     # $ANTLR end additive_expression
 
-
     # $ANTLR start multiplicative_expression
     # C.g:381:1: multiplicative_expression : ( cast_expression ) ( '*' cast_expression | '/' cast_expression | '%' cast_expression )* ;
+
     def multiplicative_expression(self, ):
 
         multiplicative_expression_StartIndex = self.input.index()
@@ -5036,16 +4835,15 @@ class CParser(Parser):
                 # C.g:382:4: ( cast_expression ) ( '*' cast_expression | '/' cast_expression | '%' cast_expression )*
                 # C.g:382:4: ( cast_expression )
                 # C.g:382:5: cast_expression
-                self.following.append(self.FOLLOW_cast_expression_in_multiplicative_expression1251)
+                self.following.append(
+                    self.FOLLOW_cast_expression_in_multiplicative_expression1251)
                 self.cast_expression()
                 self.following.pop()
                 if self.failed:
                     return
 
-
-
                 # C.g:382:22: ( '*' cast_expression | '/' cast_expression | '%' cast_expression )*
-                while True: #loop62
+                while True:  # loop62
                     alt62 = 4
                     LA62 = self.input.LA(1)
                     if LA62 == 66:
@@ -5057,54 +4855,53 @@ class CParser(Parser):
 
                     if alt62 == 1:
                         # C.g:382:23: '*' cast_expression
-                        self.match(self.input, 66, self.FOLLOW_66_in_multiplicative_expression1255)
+                        self.match(
+                            self.input, 66, self.FOLLOW_66_in_multiplicative_expression1255)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_cast_expression_in_multiplicative_expression1257)
+                        self.following.append(
+                            self.FOLLOW_cast_expression_in_multiplicative_expression1257)
                         self.cast_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     elif alt62 == 2:
                         # C.g:382:45: '/' cast_expression
-                        self.match(self.input, 70, self.FOLLOW_70_in_multiplicative_expression1261)
+                        self.match(
+                            self.input, 70, self.FOLLOW_70_in_multiplicative_expression1261)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_cast_expression_in_multiplicative_expression1263)
+                        self.following.append(
+                            self.FOLLOW_cast_expression_in_multiplicative_expression1263)
                         self.cast_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     elif alt62 == 3:
                         # C.g:382:67: '%' cast_expression
-                        self.match(self.input, 71, self.FOLLOW_71_in_multiplicative_expression1267)
+                        self.match(
+                            self.input, 71, self.FOLLOW_71_in_multiplicative_expression1267)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_cast_expression_in_multiplicative_expression1269)
+                        self.following.append(
+                            self.FOLLOW_cast_expression_in_multiplicative_expression1269)
                         self.cast_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop62
-
-
-
-
-
+                        break  # loop62
 
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
             if self.backtracking > 0:
-                self.memoize(self.input, 38, multiplicative_expression_StartIndex)
+                self.memoize(self.input, 38,
+                             multiplicative_expression_StartIndex)
 
             pass
 
@@ -5112,9 +4909,9 @@ class CParser(Parser):
 
     # $ANTLR end multiplicative_expression
 
-
     # $ANTLR start cast_expression
     # C.g:385:1: cast_expression : ( '(' type_name ')' cast_expression | unary_expression );
+
     def cast_expression(self, ):
 
         cast_expression_StartIndex = self.input.index()
@@ -5127,23 +4924,24 @@ class CParser(Parser):
                 alt63 = 2
                 LA63_0 = self.input.LA(1)
 
-                if (LA63_0 == 62) :
+                if (LA63_0 == 62):
                     LA63 = self.input.LA(2)
                     if LA63 == 34 or LA63 == 35 or LA63 == 36 or LA63 == 37 or LA63 == 38 or LA63 == 39 or LA63 == 40 or LA63 == 41 or LA63 == 42 or LA63 == 45 or LA63 == 46 or LA63 == 48 or LA63 == 49 or LA63 == 50 or LA63 == 51 or LA63 == 52 or LA63 == 53 or LA63 == 54 or LA63 == 55 or LA63 == 56 or LA63 == 57 or LA63 == 58 or LA63 == 59 or LA63 == 60 or LA63 == 61:
                         alt63 = 1
                     elif LA63 == IDENTIFIER:
                         LA63_25 = self.input.LA(3)
 
-                        if (self.synpred109()) :
+                        if (self.synpred109()):
                             alt63 = 1
-                        elif (True) :
+                        elif (True):
                             alt63 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("385:1: cast_expression : ( '(' type_name ')' cast_expression | unary_expression );", 63, 25, self.input)
+                            nvae = NoViableAltException(
+                                "385:1: cast_expression : ( '(' type_name ')' cast_expression | unary_expression );", 63, 25, self.input)
 
                             raise nvae
 
@@ -5154,51 +4952,55 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("385:1: cast_expression : ( '(' type_name ')' cast_expression | unary_expression );", 63, 1, self.input)
+                        nvae = NoViableAltException(
+                            "385:1: cast_expression : ( '(' type_name ')' cast_expression | unary_expression );", 63, 1, self.input)
 
                         raise nvae
 
-                elif ((IDENTIFIER <= LA63_0 <= FLOATING_POINT_LITERAL) or LA63_0 == 66 or (68 <= LA63_0 <= 69) or (72 <= LA63_0 <= 74) or (77 <= LA63_0 <= 79)) :
+                elif ((IDENTIFIER <= LA63_0 <= FLOATING_POINT_LITERAL) or LA63_0 == 66 or (68 <= LA63_0 <= 69) or (72 <= LA63_0 <= 74) or (77 <= LA63_0 <= 79)):
                     alt63 = 2
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("385:1: cast_expression : ( '(' type_name ')' cast_expression | unary_expression );", 63, 0, self.input)
+                    nvae = NoViableAltException(
+                        "385:1: cast_expression : ( '(' type_name ')' cast_expression | unary_expression );", 63, 0, self.input)
 
                     raise nvae
 
                 if alt63 == 1:
                     # C.g:386:4: '(' type_name ')' cast_expression
-                    self.match(self.input, 62, self.FOLLOW_62_in_cast_expression1282)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_cast_expression1282)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_type_name_in_cast_expression1284)
+                    self.following.append(
+                        self.FOLLOW_type_name_in_cast_expression1284)
                     self.type_name()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_cast_expression1286)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_cast_expression1286)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_cast_expression_in_cast_expression1288)
+                    self.following.append(
+                        self.FOLLOW_cast_expression_in_cast_expression1288)
                     self.cast_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt63 == 2:
                     # C.g:387:4: unary_expression
-                    self.following.append(self.FOLLOW_unary_expression_in_cast_expression1293)
+                    self.following.append(
+                        self.FOLLOW_unary_expression_in_cast_expression1293)
                     self.unary_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -5212,9 +5014,9 @@ class CParser(Parser):
 
     # $ANTLR end cast_expression
 
-
     # $ANTLR start unary_expression
     # C.g:390:1: unary_expression : ( postfix_expression | '++' unary_expression | '--' unary_expression | unary_operator cast_expression | 'sizeof' unary_expression | 'sizeof' '(' type_name ')' );
+
     def unary_expression(self, ):
 
         unary_expression_StartIndex = self.input.index()
@@ -5237,30 +5039,32 @@ class CParser(Parser):
                 elif LA64 == 74:
                     LA64_12 = self.input.LA(2)
 
-                    if (LA64_12 == 62) :
+                    if (LA64_12 == 62):
                         LA64_13 = self.input.LA(3)
 
-                        if (self.synpred114()) :
+                        if (self.synpred114()):
                             alt64 = 5
-                        elif (True) :
+                        elif (True):
                             alt64 = 6
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("390:1: unary_expression : ( postfix_expression | '++' unary_expression | '--' unary_expression | unary_operator cast_expression | 'sizeof' unary_expression | 'sizeof' '(' type_name ')' );", 64, 13, self.input)
+                            nvae = NoViableAltException(
+                                "390:1: unary_expression : ( postfix_expression | '++' unary_expression | '--' unary_expression | unary_operator cast_expression | 'sizeof' unary_expression | 'sizeof' '(' type_name ')' );", 64, 13, self.input)
 
                             raise nvae
 
-                    elif ((IDENTIFIER <= LA64_12 <= FLOATING_POINT_LITERAL) or LA64_12 == 66 or (68 <= LA64_12 <= 69) or (72 <= LA64_12 <= 74) or (77 <= LA64_12 <= 79)) :
+                    elif ((IDENTIFIER <= LA64_12 <= FLOATING_POINT_LITERAL) or LA64_12 == 66 or (68 <= LA64_12 <= 69) or (72 <= LA64_12 <= 74) or (77 <= LA64_12 <= 79)):
                         alt64 = 5
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("390:1: unary_expression : ( postfix_expression | '++' unary_expression | '--' unary_expression | unary_operator cast_expression | 'sizeof' unary_expression | 'sizeof' '(' type_name ')' );", 64, 12, self.input)
+                        nvae = NoViableAltException(
+                            "390:1: unary_expression : ( postfix_expression | '++' unary_expression | '--' unary_expression | unary_operator cast_expression | 'sizeof' unary_expression | 'sizeof' '(' type_name ')' );", 64, 12, self.input)
 
                         raise nvae
 
@@ -5269,88 +5073,95 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("390:1: unary_expression : ( postfix_expression | '++' unary_expression | '--' unary_expression | unary_operator cast_expression | 'sizeof' unary_expression | 'sizeof' '(' type_name ')' );", 64, 0, self.input)
+                    nvae = NoViableAltException(
+                        "390:1: unary_expression : ( postfix_expression | '++' unary_expression | '--' unary_expression | unary_operator cast_expression | 'sizeof' unary_expression | 'sizeof' '(' type_name ')' );", 64, 0, self.input)
 
                     raise nvae
 
                 if alt64 == 1:
                     # C.g:391:4: postfix_expression
-                    self.following.append(self.FOLLOW_postfix_expression_in_unary_expression1304)
+                    self.following.append(
+                        self.FOLLOW_postfix_expression_in_unary_expression1304)
                     self.postfix_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt64 == 2:
                     # C.g:392:4: '++' unary_expression
-                    self.match(self.input, 72, self.FOLLOW_72_in_unary_expression1309)
+                    self.match(self.input, 72,
+                               self.FOLLOW_72_in_unary_expression1309)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_unary_expression_in_unary_expression1311)
+                    self.following.append(
+                        self.FOLLOW_unary_expression_in_unary_expression1311)
                     self.unary_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt64 == 3:
                     # C.g:393:4: '--' unary_expression
-                    self.match(self.input, 73, self.FOLLOW_73_in_unary_expression1316)
+                    self.match(self.input, 73,
+                               self.FOLLOW_73_in_unary_expression1316)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_unary_expression_in_unary_expression1318)
+                    self.following.append(
+                        self.FOLLOW_unary_expression_in_unary_expression1318)
                     self.unary_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt64 == 4:
                     # C.g:394:4: unary_operator cast_expression
-                    self.following.append(self.FOLLOW_unary_operator_in_unary_expression1323)
+                    self.following.append(
+                        self.FOLLOW_unary_operator_in_unary_expression1323)
                     self.unary_operator()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_cast_expression_in_unary_expression1325)
+                    self.following.append(
+                        self.FOLLOW_cast_expression_in_unary_expression1325)
                     self.cast_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt64 == 5:
                     # C.g:395:4: 'sizeof' unary_expression
-                    self.match(self.input, 74, self.FOLLOW_74_in_unary_expression1330)
+                    self.match(self.input, 74,
+                               self.FOLLOW_74_in_unary_expression1330)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_unary_expression_in_unary_expression1332)
+                    self.following.append(
+                        self.FOLLOW_unary_expression_in_unary_expression1332)
                     self.unary_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt64 == 6:
                     # C.g:396:4: 'sizeof' '(' type_name ')'
-                    self.match(self.input, 74, self.FOLLOW_74_in_unary_expression1337)
+                    self.match(self.input, 74,
+                               self.FOLLOW_74_in_unary_expression1337)
                     if self.failed:
                         return
-                    self.match(self.input, 62, self.FOLLOW_62_in_unary_expression1339)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_unary_expression1339)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_type_name_in_unary_expression1341)
+                    self.following.append(
+                        self.FOLLOW_type_name_in_unary_expression1341)
                     self.type_name()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_unary_expression1343)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_unary_expression1343)
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -5364,9 +5175,9 @@ class CParser(Parser):
 
     # $ANTLR end unary_expression
 
-
     # $ANTLR start postfix_expression
     # C.g:399:1: postfix_expression : p= primary_expression ( '[' expression ']' | '(' a= ')' | '(' c= argument_expression_list b= ')' | '(' macro_parameter_list ')' | '.' x= IDENTIFIER | '*' y= IDENTIFIER | '->' z= IDENTIFIER | '++' | '--' )* ;
+
     def postfix_expression(self, ):
         self.postfix_expression_stack.append(postfix_expression_scope())
         postfix_expression_StartIndex = self.input.index()
@@ -5379,9 +5190,7 @@ class CParser(Parser):
 
         c = None
 
-
-
-        self.postfix_expression_stack[-1].FuncCallText =  ''
+        self.postfix_expression_stack[-1].FuncCallText = ''
 
         try:
             try:
@@ -5390,30 +5199,29 @@ class CParser(Parser):
 
                 # C.g:406:2: (p= primary_expression ( '[' expression ']' | '(' a= ')' | '(' c= argument_expression_list b= ')' | '(' macro_parameter_list ')' | '.' x= IDENTIFIER | '*' y= IDENTIFIER | '->' z= IDENTIFIER | '++' | '--' )* )
                 # C.g:406:6: p= primary_expression ( '[' expression ']' | '(' a= ')' | '(' c= argument_expression_list b= ')' | '(' macro_parameter_list ')' | '.' x= IDENTIFIER | '*' y= IDENTIFIER | '->' z= IDENTIFIER | '++' | '--' )*
-                self.following.append(self.FOLLOW_primary_expression_in_postfix_expression1367)
+                self.following.append(
+                    self.FOLLOW_primary_expression_in_postfix_expression1367)
                 p = self.primary_expression()
                 self.following.pop()
                 if self.failed:
                     return
                 if self.backtracking == 0:
-                    self.postfix_expression_stack[-1].FuncCallText += self.input.toString(p.start, p.stop)
+                    self.postfix_expression_stack[-1].FuncCallText += self.input.toString(
+                        p.start, p.stop)
 
                 # C.g:407:9: ( '[' expression ']' | '(' a= ')' | '(' c= argument_expression_list b= ')' | '(' macro_parameter_list ')' | '.' x= IDENTIFIER | '*' y= IDENTIFIER | '->' z= IDENTIFIER | '++' | '--' )*
-                while True: #loop65
+                while True:  # loop65
                     alt65 = 10
                     LA65 = self.input.LA(1)
                     if LA65 == 66:
                         LA65_1 = self.input.LA(2)
 
-                        if (LA65_1 == IDENTIFIER) :
+                        if (LA65_1 == IDENTIFIER):
                             LA65_30 = self.input.LA(3)
 
-                            if (self.synpred120()) :
+                            if (self.synpred120()):
                                 alt65 = 6
 
-
-
-
                     elif LA65 == 64:
                         alt65 = 1
                     elif LA65 == 62:
@@ -5425,21 +5233,19 @@ class CParser(Parser):
                         elif LA65 == IDENTIFIER:
                             LA65_55 = self.input.LA(3)
 
-                            if (self.synpred117()) :
+                            if (self.synpred117()):
                                 alt65 = 3
-                            elif (self.synpred118()) :
+                            elif (self.synpred118()):
                                 alt65 = 4
 
-
                         elif LA65 == 66:
                             LA65_57 = self.input.LA(3)
 
-                            if (self.synpred117()) :
+                            if (self.synpred117()):
                                 alt65 = 3
-                            elif (self.synpred118()) :
+                            elif (self.synpred118()):
                                 alt65 = 4
 
-
                         elif LA65 == HEX_LITERAL or LA65 == OCTAL_LITERAL or LA65 == DECIMAL_LITERAL or LA65 == CHARACTER_LITERAL or LA65 == STRING_LITERAL or LA65 == FLOATING_POINT_LITERAL or LA65 == 62 or LA65 == 68 or LA65 == 69 or LA65 == 72 or LA65 == 73 or LA65 == 74 or LA65 == 77 or LA65 == 78 or LA65 == 79:
                             alt65 = 3
 
@@ -5454,130 +5260,132 @@ class CParser(Parser):
 
                     if alt65 == 1:
                         # C.g:407:13: '[' expression ']'
-                        self.match(self.input, 64, self.FOLLOW_64_in_postfix_expression1383)
+                        self.match(self.input, 64,
+                                   self.FOLLOW_64_in_postfix_expression1383)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_expression_in_postfix_expression1385)
+                        self.following.append(
+                            self.FOLLOW_expression_in_postfix_expression1385)
                         self.expression()
                         self.following.pop()
                         if self.failed:
                             return
-                        self.match(self.input, 65, self.FOLLOW_65_in_postfix_expression1387)
+                        self.match(self.input, 65,
+                                   self.FOLLOW_65_in_postfix_expression1387)
                         if self.failed:
                             return
 
-
                     elif alt65 == 2:
                         # C.g:408:13: '(' a= ')'
-                        self.match(self.input, 62, self.FOLLOW_62_in_postfix_expression1401)
+                        self.match(self.input, 62,
+                                   self.FOLLOW_62_in_postfix_expression1401)
                         if self.failed:
                             return
                         a = self.input.LT(1)
-                        self.match(self.input, 63, self.FOLLOW_63_in_postfix_expression1405)
+                        self.match(self.input, 63,
+                                   self.FOLLOW_63_in_postfix_expression1405)
                         if self.failed:
                             return
                         if self.backtracking == 0:
-                            self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, a.line, a.charPositionInLine, self.postfix_expression_stack[-1].FuncCallText, '')
-
-
+                            self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, a.line,
+                                                      a.charPositionInLine, self.postfix_expression_stack[-1].FuncCallText, '')
 
                     elif alt65 == 3:
                         # C.g:409:13: '(' c= argument_expression_list b= ')'
-                        self.match(self.input, 62, self.FOLLOW_62_in_postfix_expression1420)
+                        self.match(self.input, 62,
+                                   self.FOLLOW_62_in_postfix_expression1420)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_argument_expression_list_in_postfix_expression1424)
+                        self.following.append(
+                            self.FOLLOW_argument_expression_list_in_postfix_expression1424)
                         c = self.argument_expression_list()
                         self.following.pop()
                         if self.failed:
                             return
                         b = self.input.LT(1)
-                        self.match(self.input, 63, self.FOLLOW_63_in_postfix_expression1428)
+                        self.match(self.input, 63,
+                                   self.FOLLOW_63_in_postfix_expression1428)
                         if self.failed:
                             return
                         if self.backtracking == 0:
-                            self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, b.line, b.charPositionInLine, self.postfix_expression_stack[-1].FuncCallText, self.input.toString(c.start, c.stop))
-
-
+                            self.StoreFunctionCalling(p.start.line, p.start.charPositionInLine, b.line, b.charPositionInLine,
+                                                      self.postfix_expression_stack[-1].FuncCallText, self.input.toString(c.start, c.stop))
 
                     elif alt65 == 4:
                         # C.g:410:13: '(' macro_parameter_list ')'
-                        self.match(self.input, 62, self.FOLLOW_62_in_postfix_expression1444)
+                        self.match(self.input, 62,
+                                   self.FOLLOW_62_in_postfix_expression1444)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_macro_parameter_list_in_postfix_expression1446)
+                        self.following.append(
+                            self.FOLLOW_macro_parameter_list_in_postfix_expression1446)
                         self.macro_parameter_list()
                         self.following.pop()
                         if self.failed:
                             return
-                        self.match(self.input, 63, self.FOLLOW_63_in_postfix_expression1448)
+                        self.match(self.input, 63,
+                                   self.FOLLOW_63_in_postfix_expression1448)
                         if self.failed:
                             return
 
-
                     elif alt65 == 5:
                         # C.g:411:13: '.' x= IDENTIFIER
-                        self.match(self.input, 75, self.FOLLOW_75_in_postfix_expression1462)
+                        self.match(self.input, 75,
+                                   self.FOLLOW_75_in_postfix_expression1462)
                         if self.failed:
                             return
                         x = self.input.LT(1)
-                        self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_postfix_expression1466)
+                        self.match(
+                            self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_postfix_expression1466)
                         if self.failed:
                             return
                         if self.backtracking == 0:
                             self.postfix_expression_stack[-1].FuncCallText += '.' + x.text
 
-
-
                     elif alt65 == 6:
                         # C.g:412:13: '*' y= IDENTIFIER
-                        self.match(self.input, 66, self.FOLLOW_66_in_postfix_expression1482)
+                        self.match(self.input, 66,
+                                   self.FOLLOW_66_in_postfix_expression1482)
                         if self.failed:
                             return
                         y = self.input.LT(1)
-                        self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_postfix_expression1486)
+                        self.match(
+                            self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_postfix_expression1486)
                         if self.failed:
                             return
                         if self.backtracking == 0:
                             self.postfix_expression_stack[-1].FuncCallText = y.text
 
-
-
                     elif alt65 == 7:
                         # C.g:413:13: '->' z= IDENTIFIER
-                        self.match(self.input, 76, self.FOLLOW_76_in_postfix_expression1502)
+                        self.match(self.input, 76,
+                                   self.FOLLOW_76_in_postfix_expression1502)
                         if self.failed:
                             return
                         z = self.input.LT(1)
-                        self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_postfix_expression1506)
+                        self.match(
+                            self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_postfix_expression1506)
                         if self.failed:
                             return
                         if self.backtracking == 0:
                             self.postfix_expression_stack[-1].FuncCallText += '->' + z.text
 
-
-
                     elif alt65 == 8:
                         # C.g:414:13: '++'
-                        self.match(self.input, 72, self.FOLLOW_72_in_postfix_expression1522)
+                        self.match(self.input, 72,
+                                   self.FOLLOW_72_in_postfix_expression1522)
                         if self.failed:
                             return
 
-
                     elif alt65 == 9:
                         # C.g:415:13: '--'
-                        self.match(self.input, 73, self.FOLLOW_73_in_postfix_expression1536)
+                        self.match(self.input, 73,
+                                   self.FOLLOW_73_in_postfix_expression1536)
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop65
-
-
-
-
-
+                        break  # loop65
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -5593,9 +5401,9 @@ class CParser(Parser):
 
     # $ANTLR end postfix_expression
 
-
     # $ANTLR start macro_parameter_list
     # C.g:419:1: macro_parameter_list : parameter_declaration ( ',' parameter_declaration )* ;
+
     def macro_parameter_list(self, ):
 
         macro_parameter_list_StartIndex = self.input.index()
@@ -5606,39 +5414,35 @@ class CParser(Parser):
 
                 # C.g:420:2: ( parameter_declaration ( ',' parameter_declaration )* )
                 # C.g:420:4: parameter_declaration ( ',' parameter_declaration )*
-                self.following.append(self.FOLLOW_parameter_declaration_in_macro_parameter_list1559)
+                self.following.append(
+                    self.FOLLOW_parameter_declaration_in_macro_parameter_list1559)
                 self.parameter_declaration()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:420:26: ( ',' parameter_declaration )*
-                while True: #loop66
+                while True:  # loop66
                     alt66 = 2
                     LA66_0 = self.input.LA(1)
 
-                    if (LA66_0 == 27) :
+                    if (LA66_0 == 27):
                         alt66 = 1
 
-
                     if alt66 == 1:
                         # C.g:420:27: ',' parameter_declaration
-                        self.match(self.input, 27, self.FOLLOW_27_in_macro_parameter_list1562)
+                        self.match(self.input, 27,
+                                   self.FOLLOW_27_in_macro_parameter_list1562)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_parameter_declaration_in_macro_parameter_list1564)
+                        self.following.append(
+                            self.FOLLOW_parameter_declaration_in_macro_parameter_list1564)
                         self.parameter_declaration()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop66
-
-
-
-
-
+                        break  # loop66
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -5653,9 +5457,9 @@ class CParser(Parser):
 
     # $ANTLR end macro_parameter_list
 
-
     # $ANTLR start unary_operator
     # C.g:423:1: unary_operator : ( '&' | '*' | '+' | '-' | '~' | '!' );
+
     def unary_operator(self, ):
 
         unary_operator_StartIndex = self.input.index()
@@ -5667,7 +5471,7 @@ class CParser(Parser):
                 # C.g:424:2: ( '&' | '*' | '+' | '-' | '~' | '!' )
                 # C.g:
                 if self.input.LA(1) == 66 or (68 <= self.input.LA(1) <= 69) or (77 <= self.input.LA(1) <= 79):
-                    self.input.consume();
+                    self.input.consume()
                     self.errorRecovery = False
                     self.failed = False
 
@@ -5679,14 +5483,9 @@ class CParser(Parser):
                     mse = MismatchedSetException(None, self.input)
                     self.recoverFromMismatchedSet(
                         self.input, mse, self.FOLLOW_set_in_unary_operator0
-                        )
+                    )
                     raise mse
 
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -5705,10 +5504,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start primary_expression
     # C.g:432:1: primary_expression : ( IDENTIFIER | constant | '(' expression ')' );
+
     def primary_expression(self, ):
 
         retval = self.primary_expression_return()
@@ -5725,16 +5523,17 @@ class CParser(Parser):
                 if LA67 == IDENTIFIER:
                     LA67_1 = self.input.LA(2)
 
-                    if (LA67_1 == EOF or LA67_1 == 25 or (27 <= LA67_1 <= 28) or LA67_1 == 44 or LA67_1 == 47 or LA67_1 == 53 or (62 <= LA67_1 <= 66) or (68 <= LA67_1 <= 73) or (75 <= LA67_1 <= 77) or (80 <= LA67_1 <= 102)) :
+                    if (LA67_1 == EOF or LA67_1 == 25 or (27 <= LA67_1 <= 28) or LA67_1 == 44 or LA67_1 == 47 or LA67_1 == 53 or (62 <= LA67_1 <= 66) or (68 <= LA67_1 <= 73) or (75 <= LA67_1 <= 77) or (80 <= LA67_1 <= 102)):
                         alt67 = 1
-                    elif (LA67_1 == IDENTIFIER or LA67_1 == STRING_LITERAL) :
+                    elif (LA67_1 == IDENTIFIER or LA67_1 == STRING_LITERAL):
                         alt67 = 2
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return retval
 
-                        nvae = NoViableAltException("432:1: primary_expression : ( IDENTIFIER | constant | '(' expression ')' );", 67, 1, self.input)
+                        nvae = NoViableAltException(
+                            "432:1: primary_expression : ( IDENTIFIER | constant | '(' expression ')' );", 67, 1, self.input)
 
                         raise nvae
 
@@ -5747,44 +5546,46 @@ class CParser(Parser):
                         self.failed = True
                         return retval
 
-                    nvae = NoViableAltException("432:1: primary_expression : ( IDENTIFIER | constant | '(' expression ')' );", 67, 0, self.input)
+                    nvae = NoViableAltException(
+                        "432:1: primary_expression : ( IDENTIFIER | constant | '(' expression ')' );", 67, 0, self.input)
 
                     raise nvae
 
                 if alt67 == 1:
                     # C.g:433:4: IDENTIFIER
-                    self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_primary_expression1613)
+                    self.match(self.input, IDENTIFIER,
+                               self.FOLLOW_IDENTIFIER_in_primary_expression1613)
                     if self.failed:
                         return retval
 
-
                 elif alt67 == 2:
                     # C.g:434:4: constant
-                    self.following.append(self.FOLLOW_constant_in_primary_expression1618)
+                    self.following.append(
+                        self.FOLLOW_constant_in_primary_expression1618)
                     self.constant()
                     self.following.pop()
                     if self.failed:
                         return retval
 
-
                 elif alt67 == 3:
                     # C.g:435:4: '(' expression ')'
-                    self.match(self.input, 62, self.FOLLOW_62_in_primary_expression1623)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_primary_expression1623)
                     if self.failed:
                         return retval
-                    self.following.append(self.FOLLOW_expression_in_primary_expression1625)
+                    self.following.append(
+                        self.FOLLOW_expression_in_primary_expression1625)
                     self.expression()
                     self.following.pop()
                     if self.failed:
                         return retval
-                    self.match(self.input, 63, self.FOLLOW_63_in_primary_expression1627)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_primary_expression1627)
                     if self.failed:
                         return retval
 
-
                 retval.stop = self.input.LT(-1)
 
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -5798,9 +5599,9 @@ class CParser(Parser):
 
     # $ANTLR end primary_expression
 
-
     # $ANTLR start constant
     # C.g:438:1: constant : ( HEX_LITERAL | OCTAL_LITERAL | DECIMAL_LITERAL | CHARACTER_LITERAL | ( ( IDENTIFIER )* ( STRING_LITERAL )+ )+ ( IDENTIFIER )* | FLOATING_POINT_LITERAL );
+
     def constant(self, ):
 
         constant_StartIndex = self.input.index()
@@ -5829,111 +5630,103 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("438:1: constant : ( HEX_LITERAL | OCTAL_LITERAL | DECIMAL_LITERAL | CHARACTER_LITERAL | ( ( IDENTIFIER )* ( STRING_LITERAL )+ )+ ( IDENTIFIER )* | FLOATING_POINT_LITERAL );", 72, 0, self.input)
+                    nvae = NoViableAltException(
+                        "438:1: constant : ( HEX_LITERAL | OCTAL_LITERAL | DECIMAL_LITERAL | CHARACTER_LITERAL | ( ( IDENTIFIER )* ( STRING_LITERAL )+ )+ ( IDENTIFIER )* | FLOATING_POINT_LITERAL );", 72, 0, self.input)
 
                     raise nvae
 
                 if alt72 == 1:
                     # C.g:439:9: HEX_LITERAL
-                    self.match(self.input, HEX_LITERAL, self.FOLLOW_HEX_LITERAL_in_constant1643)
+                    self.match(self.input, HEX_LITERAL,
+                               self.FOLLOW_HEX_LITERAL_in_constant1643)
                     if self.failed:
                         return
 
-
                 elif alt72 == 2:
                     # C.g:440:9: OCTAL_LITERAL
-                    self.match(self.input, OCTAL_LITERAL, self.FOLLOW_OCTAL_LITERAL_in_constant1653)
+                    self.match(self.input, OCTAL_LITERAL,
+                               self.FOLLOW_OCTAL_LITERAL_in_constant1653)
                     if self.failed:
                         return
 
-
                 elif alt72 == 3:
                     # C.g:441:9: DECIMAL_LITERAL
-                    self.match(self.input, DECIMAL_LITERAL, self.FOLLOW_DECIMAL_LITERAL_in_constant1663)
+                    self.match(self.input, DECIMAL_LITERAL,
+                               self.FOLLOW_DECIMAL_LITERAL_in_constant1663)
                     if self.failed:
                         return
 
-
                 elif alt72 == 4:
                     # C.g:442:7: CHARACTER_LITERAL
-                    self.match(self.input, CHARACTER_LITERAL, self.FOLLOW_CHARACTER_LITERAL_in_constant1671)
+                    self.match(self.input, CHARACTER_LITERAL,
+                               self.FOLLOW_CHARACTER_LITERAL_in_constant1671)
                     if self.failed:
                         return
 
-
                 elif alt72 == 5:
                     # C.g:443:7: ( ( IDENTIFIER )* ( STRING_LITERAL )+ )+ ( IDENTIFIER )*
                     # C.g:443:7: ( ( IDENTIFIER )* ( STRING_LITERAL )+ )+
                     cnt70 = 0
-                    while True: #loop70
+                    while True:  # loop70
                         alt70 = 2
                         LA70_0 = self.input.LA(1)
 
-                        if (LA70_0 == IDENTIFIER) :
+                        if (LA70_0 == IDENTIFIER):
                             LA70_1 = self.input.LA(2)
 
-                            if (LA70_1 == STRING_LITERAL) :
+                            if (LA70_1 == STRING_LITERAL):
                                 alt70 = 1
-                            elif (LA70_1 == IDENTIFIER) :
+                            elif (LA70_1 == IDENTIFIER):
                                 LA70_33 = self.input.LA(3)
 
-                                if (self.synpred138()) :
+                                if (self.synpred138()):
                                     alt70 = 1
 
-
-
-
-                        elif (LA70_0 == STRING_LITERAL) :
+                        elif (LA70_0 == STRING_LITERAL):
                             alt70 = 1
 
-
                         if alt70 == 1:
                             # C.g:443:8: ( IDENTIFIER )* ( STRING_LITERAL )+
                             # C.g:443:8: ( IDENTIFIER )*
-                            while True: #loop68
+                            while True:  # loop68
                                 alt68 = 2
                                 LA68_0 = self.input.LA(1)
 
-                                if (LA68_0 == IDENTIFIER) :
+                                if (LA68_0 == IDENTIFIER):
                                     alt68 = 1
 
-
                                 if alt68 == 1:
                                     # C.g:0:0: IDENTIFIER
-                                    self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_constant1680)
+                                    self.match(
+                                        self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_constant1680)
                                     if self.failed:
                                         return
 
-
                                 else:
-                                    break #loop68
-
+                                    break  # loop68
 
                             # C.g:443:20: ( STRING_LITERAL )+
                             cnt69 = 0
-                            while True: #loop69
+                            while True:  # loop69
                                 alt69 = 2
                                 LA69_0 = self.input.LA(1)
 
-                                if (LA69_0 == STRING_LITERAL) :
+                                if (LA69_0 == STRING_LITERAL):
                                     LA69_31 = self.input.LA(2)
 
-                                    if (self.synpred137()) :
+                                    if (self.synpred137()):
                                         alt69 = 1
 
-
-
-
                                 if alt69 == 1:
                                     # C.g:0:0: STRING_LITERAL
-                                    self.match(self.input, STRING_LITERAL, self.FOLLOW_STRING_LITERAL_in_constant1683)
+                                    self.match(
+                                        self.input, STRING_LITERAL, self.FOLLOW_STRING_LITERAL_in_constant1683)
                                     if self.failed:
                                         return
 
-
                                 else:
                                     if cnt69 >= 1:
-                                        break #loop69
+                                        break  # loop69
 
                                     if self.backtracking > 0:
                                         self.failed = True
@@ -5944,12 +5737,9 @@ class CParser(Parser):
 
                                 cnt69 += 1
 
-
-
-
                         else:
                             if cnt70 >= 1:
-                                break #loop70
+                                break  # loop70
 
                             if self.backtracking > 0:
                                 self.failed = True
@@ -5960,37 +5750,31 @@ class CParser(Parser):
 
                         cnt70 += 1
 
-
                     # C.g:443:38: ( IDENTIFIER )*
-                    while True: #loop71
+                    while True:  # loop71
                         alt71 = 2
                         LA71_0 = self.input.LA(1)
 
-                        if (LA71_0 == IDENTIFIER) :
+                        if (LA71_0 == IDENTIFIER):
                             alt71 = 1
 
-
                         if alt71 == 1:
                             # C.g:0:0: IDENTIFIER
-                            self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_constant1688)
+                            self.match(self.input, IDENTIFIER,
+                                       self.FOLLOW_IDENTIFIER_in_constant1688)
                             if self.failed:
                                 return
 
-
                         else:
-                            break #loop71
-
-
-
+                            break  # loop71
 
                 elif alt72 == 6:
                     # C.g:444:9: FLOATING_POINT_LITERAL
-                    self.match(self.input, FLOATING_POINT_LITERAL, self.FOLLOW_FLOATING_POINT_LITERAL_in_constant1699)
+                    self.match(self.input, FLOATING_POINT_LITERAL,
+                               self.FOLLOW_FLOATING_POINT_LITERAL_in_constant1699)
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -6009,10 +5793,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start expression
     # C.g:449:1: expression : assignment_expression ( ',' assignment_expression )* ;
+
     def expression(self, ):
 
         retval = self.expression_return()
@@ -6025,42 +5808,38 @@ class CParser(Parser):
 
                 # C.g:450:2: ( assignment_expression ( ',' assignment_expression )* )
                 # C.g:450:4: assignment_expression ( ',' assignment_expression )*
-                self.following.append(self.FOLLOW_assignment_expression_in_expression1715)
+                self.following.append(
+                    self.FOLLOW_assignment_expression_in_expression1715)
                 self.assignment_expression()
                 self.following.pop()
                 if self.failed:
                     return retval
                 # C.g:450:26: ( ',' assignment_expression )*
-                while True: #loop73
+                while True:  # loop73
                     alt73 = 2
                     LA73_0 = self.input.LA(1)
 
-                    if (LA73_0 == 27) :
+                    if (LA73_0 == 27):
                         alt73 = 1
 
-
                     if alt73 == 1:
                         # C.g:450:27: ',' assignment_expression
-                        self.match(self.input, 27, self.FOLLOW_27_in_expression1718)
+                        self.match(self.input, 27,
+                                   self.FOLLOW_27_in_expression1718)
                         if self.failed:
                             return retval
-                        self.following.append(self.FOLLOW_assignment_expression_in_expression1720)
+                        self.following.append(
+                            self.FOLLOW_assignment_expression_in_expression1720)
                         self.assignment_expression()
                         self.following.pop()
                         if self.failed:
                             return retval
 
-
                     else:
-                        break #loop73
-
-
-
-
+                        break  # loop73
 
                 retval.stop = self.input.LT(-1)
 
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -6074,9 +5853,9 @@ class CParser(Parser):
 
     # $ANTLR end expression
 
-
     # $ANTLR start constant_expression
     # C.g:453:1: constant_expression : conditional_expression ;
+
     def constant_expression(self, ):
 
         constant_expression_StartIndex = self.input.index()
@@ -6087,15 +5866,13 @@ class CParser(Parser):
 
                 # C.g:454:2: ( conditional_expression )
                 # C.g:454:4: conditional_expression
-                self.following.append(self.FOLLOW_conditional_expression_in_constant_expression1733)
+                self.following.append(
+                    self.FOLLOW_conditional_expression_in_constant_expression1733)
                 self.conditional_expression()
                 self.following.pop()
                 if self.failed:
                     return
 
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -6109,9 +5886,9 @@ class CParser(Parser):
 
     # $ANTLR end constant_expression
 
-
     # $ANTLR start assignment_expression
     # C.g:457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );
+
     def assignment_expression(self, ):
 
         assignment_expression_StartIndex = self.input.index()
@@ -6128,112 +5905,119 @@ class CParser(Parser):
                     if LA74 == 64:
                         LA74_13 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 13, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 13, self.input)
 
                             raise nvae
 
                     elif LA74 == 62:
                         LA74_14 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 14, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 14, self.input)
 
                             raise nvae
 
                     elif LA74 == 75:
                         LA74_15 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 15, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 15, self.input)
 
                             raise nvae
 
                     elif LA74 == 66:
                         LA74_16 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 16, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 16, self.input)
 
                             raise nvae
 
                     elif LA74 == 76:
                         LA74_17 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 17, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 17, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_18 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 18, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 18, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_19 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 19, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 19, self.input)
 
                             raise nvae
 
@@ -6242,32 +6026,34 @@ class CParser(Parser):
                     elif LA74 == STRING_LITERAL:
                         LA74_21 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 21, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 21, self.input)
 
                             raise nvae
 
                     elif LA74 == IDENTIFIER:
                         LA74_22 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 22, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 22, self.input)
 
                             raise nvae
 
@@ -6278,7 +6064,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 1, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 1, self.input)
 
                         raise nvae
 
@@ -6287,112 +6074,119 @@ class CParser(Parser):
                     if LA74 == 64:
                         LA74_44 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 44, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 44, self.input)
 
                             raise nvae
 
                     elif LA74 == 62:
                         LA74_45 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 45, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 45, self.input)
 
                             raise nvae
 
                     elif LA74 == 75:
                         LA74_46 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 46, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 46, self.input)
 
                             raise nvae
 
                     elif LA74 == 66:
                         LA74_47 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 47, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 47, self.input)
 
                             raise nvae
 
                     elif LA74 == 76:
                         LA74_48 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 48, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 48, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_49 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 49, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 49, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_50 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 50, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 50, self.input)
 
                             raise nvae
 
@@ -6405,7 +6199,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 2, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 2, self.input)
 
                         raise nvae
 
@@ -6414,112 +6209,119 @@ class CParser(Parser):
                     if LA74 == 64:
                         LA74_73 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 73, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 73, self.input)
 
                             raise nvae
 
                     elif LA74 == 62:
                         LA74_74 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 74, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 74, self.input)
 
                             raise nvae
 
                     elif LA74 == 75:
                         LA74_75 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 75, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 75, self.input)
 
                             raise nvae
 
                     elif LA74 == 66:
                         LA74_76 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 76, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 76, self.input)
 
                             raise nvae
 
                     elif LA74 == 76:
                         LA74_77 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 77, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 77, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_78 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 78, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 78, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_79 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 79, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 79, self.input)
 
                             raise nvae
 
@@ -6532,7 +6334,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 3, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 3, self.input)
 
                         raise nvae
 
@@ -6541,112 +6344,119 @@ class CParser(Parser):
                     if LA74 == 64:
                         LA74_102 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 102, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 102, self.input)
 
                             raise nvae
 
                     elif LA74 == 62:
                         LA74_103 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 103, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 103, self.input)
 
                             raise nvae
 
                     elif LA74 == 75:
                         LA74_104 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 104, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 104, self.input)
 
                             raise nvae
 
                     elif LA74 == 66:
                         LA74_105 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 105, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 105, self.input)
 
                             raise nvae
 
                     elif LA74 == 76:
                         LA74_106 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 106, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 106, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_107 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 107, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 107, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_108 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 108, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 108, self.input)
 
                             raise nvae
 
@@ -6659,7 +6469,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 4, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 4, self.input)
 
                         raise nvae
 
@@ -6668,112 +6479,119 @@ class CParser(Parser):
                     if LA74 == 64:
                         LA74_131 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 131, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 131, self.input)
 
                             raise nvae
 
                     elif LA74 == 62:
                         LA74_132 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 132, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 132, self.input)
 
                             raise nvae
 
                     elif LA74 == 75:
                         LA74_133 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 133, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 133, self.input)
 
                             raise nvae
 
                     elif LA74 == 66:
                         LA74_134 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 134, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 134, self.input)
 
                             raise nvae
 
                     elif LA74 == 76:
                         LA74_135 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 135, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 135, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_136 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 136, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 136, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_137 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 137, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 137, self.input)
 
                             raise nvae
 
@@ -6786,7 +6604,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 5, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 5, self.input)
 
                         raise nvae
 
@@ -6795,128 +6614,136 @@ class CParser(Parser):
                     if LA74 == IDENTIFIER:
                         LA74_160 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 160, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 160, self.input)
 
                             raise nvae
 
                     elif LA74 == 64:
                         LA74_161 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 161, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 161, self.input)
 
                             raise nvae
 
                     elif LA74 == 62:
                         LA74_162 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 162, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 162, self.input)
 
                             raise nvae
 
                     elif LA74 == 75:
                         LA74_163 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 163, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 163, self.input)
 
                             raise nvae
 
                     elif LA74 == 66:
                         LA74_164 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 164, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 164, self.input)
 
                             raise nvae
 
                     elif LA74 == 76:
                         LA74_165 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 165, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 165, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_166 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 166, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 166, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_167 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 167, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 167, self.input)
 
                             raise nvae
 
@@ -6925,16 +6752,17 @@ class CParser(Parser):
                     elif LA74 == STRING_LITERAL:
                         LA74_189 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 189, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 189, self.input)
 
                             raise nvae
 
@@ -6945,7 +6773,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 6, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 6, self.input)
 
                         raise nvae
 
@@ -6954,112 +6783,119 @@ class CParser(Parser):
                     if LA74 == 64:
                         LA74_191 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 191, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 191, self.input)
 
                             raise nvae
 
                     elif LA74 == 62:
                         LA74_192 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 192, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 192, self.input)
 
                             raise nvae
 
                     elif LA74 == 75:
                         LA74_193 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 193, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 193, self.input)
 
                             raise nvae
 
                     elif LA74 == 66:
                         LA74_194 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 194, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 194, self.input)
 
                             raise nvae
 
                     elif LA74 == 76:
                         LA74_195 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 195, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 195, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_196 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 196, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 196, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_197 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 197, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 197, self.input)
 
                             raise nvae
 
@@ -7072,7 +6908,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 7, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 7, self.input)
 
                         raise nvae
 
@@ -7081,192 +6918,204 @@ class CParser(Parser):
                     if LA74 == IDENTIFIER:
                         LA74_220 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 220, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 220, self.input)
 
                             raise nvae
 
                     elif LA74 == HEX_LITERAL:
                         LA74_221 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 221, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 221, self.input)
 
                             raise nvae
 
                     elif LA74 == OCTAL_LITERAL:
                         LA74_222 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 222, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 222, self.input)
 
                             raise nvae
 
                     elif LA74 == DECIMAL_LITERAL:
                         LA74_223 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 223, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 223, self.input)
 
                             raise nvae
 
                     elif LA74 == CHARACTER_LITERAL:
                         LA74_224 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 224, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 224, self.input)
 
                             raise nvae
 
                     elif LA74 == STRING_LITERAL:
                         LA74_225 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 225, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 225, self.input)
 
                             raise nvae
 
                     elif LA74 == FLOATING_POINT_LITERAL:
                         LA74_226 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 226, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 226, self.input)
 
                             raise nvae
 
                     elif LA74 == 62:
                         LA74_227 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 227, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 227, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_228 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 228, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 228, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_229 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 229, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 229, self.input)
 
                             raise nvae
 
                     elif LA74 == 66 or LA74 == 68 or LA74 == 69 or LA74 == 77 or LA74 == 78 or LA74 == 79:
                         LA74_230 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 230, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 230, self.input)
 
                             raise nvae
 
                     elif LA74 == 74:
                         LA74_231 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 231, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 231, self.input)
 
                             raise nvae
 
@@ -7277,7 +7126,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 8, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 8, self.input)
 
                         raise nvae
 
@@ -7286,192 +7136,204 @@ class CParser(Parser):
                     if LA74 == IDENTIFIER:
                         LA74_244 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 244, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 244, self.input)
 
                             raise nvae
 
                     elif LA74 == HEX_LITERAL:
                         LA74_245 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 245, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 245, self.input)
 
                             raise nvae
 
                     elif LA74 == OCTAL_LITERAL:
                         LA74_246 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 246, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 246, self.input)
 
                             raise nvae
 
                     elif LA74 == DECIMAL_LITERAL:
                         LA74_247 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 247, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 247, self.input)
 
                             raise nvae
 
                     elif LA74 == CHARACTER_LITERAL:
                         LA74_248 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 248, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 248, self.input)
 
                             raise nvae
 
                     elif LA74 == STRING_LITERAL:
                         LA74_249 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 249, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 249, self.input)
 
                             raise nvae
 
                     elif LA74 == FLOATING_POINT_LITERAL:
                         LA74_250 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 250, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 250, self.input)
 
                             raise nvae
 
                     elif LA74 == 62:
                         LA74_251 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 251, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 251, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_252 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 252, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 252, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_253 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 253, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 253, self.input)
 
                             raise nvae
 
                     elif LA74 == 66 or LA74 == 68 or LA74 == 69 or LA74 == 77 or LA74 == 78 or LA74 == 79:
                         LA74_254 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 254, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 254, self.input)
 
                             raise nvae
 
                     elif LA74 == 74:
                         LA74_255 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 255, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 255, self.input)
 
                             raise nvae
 
@@ -7480,7 +7342,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 9, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 9, self.input)
 
                         raise nvae
 
@@ -7489,192 +7352,204 @@ class CParser(Parser):
                     if LA74 == IDENTIFIER:
                         LA74_256 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 256, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 256, self.input)
 
                             raise nvae
 
                     elif LA74 == HEX_LITERAL:
                         LA74_257 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 257, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 257, self.input)
 
                             raise nvae
 
                     elif LA74 == OCTAL_LITERAL:
                         LA74_258 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 258, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 258, self.input)
 
                             raise nvae
 
                     elif LA74 == DECIMAL_LITERAL:
                         LA74_259 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 259, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 259, self.input)
 
                             raise nvae
 
                     elif LA74 == CHARACTER_LITERAL:
                         LA74_260 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 260, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 260, self.input)
 
                             raise nvae
 
                     elif LA74 == STRING_LITERAL:
                         LA74_261 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 261, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 261, self.input)
 
                             raise nvae
 
                     elif LA74 == FLOATING_POINT_LITERAL:
                         LA74_262 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 262, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 262, self.input)
 
                             raise nvae
 
                     elif LA74 == 62:
                         LA74_263 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 263, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 263, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_264 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 264, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 264, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_265 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 265, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 265, self.input)
 
                             raise nvae
 
                     elif LA74 == 66 or LA74 == 68 or LA74 == 69 or LA74 == 77 or LA74 == 78 or LA74 == 79:
                         LA74_266 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 266, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 266, self.input)
 
                             raise nvae
 
                     elif LA74 == 74:
                         LA74_267 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 267, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 267, self.input)
 
                             raise nvae
 
@@ -7683,7 +7558,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 10, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 10, self.input)
 
                         raise nvae
 
@@ -7692,192 +7568,204 @@ class CParser(Parser):
                     if LA74 == 62:
                         LA74_268 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 268, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 268, self.input)
 
                             raise nvae
 
                     elif LA74 == IDENTIFIER:
                         LA74_269 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 269, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 269, self.input)
 
                             raise nvae
 
                     elif LA74 == HEX_LITERAL:
                         LA74_270 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 270, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 270, self.input)
 
                             raise nvae
 
                     elif LA74 == OCTAL_LITERAL:
                         LA74_271 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 271, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 271, self.input)
 
                             raise nvae
 
                     elif LA74 == DECIMAL_LITERAL:
                         LA74_272 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 272, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 272, self.input)
 
                             raise nvae
 
                     elif LA74 == CHARACTER_LITERAL:
                         LA74_273 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 273, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 273, self.input)
 
                             raise nvae
 
                     elif LA74 == STRING_LITERAL:
                         LA74_274 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 274, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 274, self.input)
 
                             raise nvae
 
                     elif LA74 == FLOATING_POINT_LITERAL:
                         LA74_275 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 275, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 275, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_276 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 276, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 276, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_277 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 277, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 277, self.input)
 
                             raise nvae
 
                     elif LA74 == 66 or LA74 == 68 or LA74 == 69 or LA74 == 77 or LA74 == 78 or LA74 == 79:
                         LA74_278 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 278, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 278, self.input)
 
                             raise nvae
 
                     elif LA74 == 74:
                         LA74_279 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 279, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 279, self.input)
 
                             raise nvae
 
@@ -7886,7 +7774,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 11, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 11, self.input)
 
                         raise nvae
 
@@ -7895,192 +7784,204 @@ class CParser(Parser):
                     if LA74 == 62:
                         LA74_280 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 280, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 280, self.input)
 
                             raise nvae
 
                     elif LA74 == IDENTIFIER:
                         LA74_281 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 281, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 281, self.input)
 
                             raise nvae
 
                     elif LA74 == HEX_LITERAL:
                         LA74_282 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 282, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 282, self.input)
 
                             raise nvae
 
                     elif LA74 == OCTAL_LITERAL:
                         LA74_283 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 283, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 283, self.input)
 
                             raise nvae
 
                     elif LA74 == DECIMAL_LITERAL:
                         LA74_284 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 284, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 284, self.input)
 
                             raise nvae
 
                     elif LA74 == CHARACTER_LITERAL:
                         LA74_285 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 285, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 285, self.input)
 
                             raise nvae
 
                     elif LA74 == STRING_LITERAL:
                         LA74_286 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 286, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 286, self.input)
 
                             raise nvae
 
                     elif LA74 == FLOATING_POINT_LITERAL:
                         LA74_287 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 287, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 287, self.input)
 
                             raise nvae
 
                     elif LA74 == 72:
                         LA74_288 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 288, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 288, self.input)
 
                             raise nvae
 
                     elif LA74 == 73:
                         LA74_289 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 289, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 289, self.input)
 
                             raise nvae
 
                     elif LA74 == 66 or LA74 == 68 or LA74 == 69 or LA74 == 77 or LA74 == 78 or LA74 == 79:
                         LA74_290 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 290, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 290, self.input)
 
                             raise nvae
 
                     elif LA74 == 74:
                         LA74_291 = self.input.LA(3)
 
-                        if (self.synpred142()) :
+                        if (self.synpred142()):
                             alt74 = 1
-                        elif (True) :
+                        elif (True):
                             alt74 = 2
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 291, self.input)
+                            nvae = NoViableAltException(
+                                "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 291, self.input)
 
                             raise nvae
 
@@ -8089,7 +7990,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 12, self.input)
+                        nvae = NoViableAltException(
+                            "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 12, self.input)
 
                         raise nvae
 
@@ -8098,39 +8000,41 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 0, self.input)
+                    nvae = NoViableAltException(
+                        "457:1: assignment_expression : ( lvalue assignment_operator assignment_expression | conditional_expression );", 74, 0, self.input)
 
                     raise nvae
 
                 if alt74 == 1:
                     # C.g:458:4: lvalue assignment_operator assignment_expression
-                    self.following.append(self.FOLLOW_lvalue_in_assignment_expression1744)
+                    self.following.append(
+                        self.FOLLOW_lvalue_in_assignment_expression1744)
                     self.lvalue()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_assignment_operator_in_assignment_expression1746)
+                    self.following.append(
+                        self.FOLLOW_assignment_operator_in_assignment_expression1746)
                     self.assignment_operator()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_assignment_expression_in_assignment_expression1748)
+                    self.following.append(
+                        self.FOLLOW_assignment_expression_in_assignment_expression1748)
                     self.assignment_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt74 == 2:
                     # C.g:459:4: conditional_expression
-                    self.following.append(self.FOLLOW_conditional_expression_in_assignment_expression1753)
+                    self.following.append(
+                        self.FOLLOW_conditional_expression_in_assignment_expression1753)
                     self.conditional_expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -8144,9 +8048,9 @@ class CParser(Parser):
 
     # $ANTLR end assignment_expression
 
-
     # $ANTLR start lvalue
     # C.g:462:1: lvalue : unary_expression ;
+
     def lvalue(self, ):
 
         lvalue_StartIndex = self.input.index()
@@ -8157,15 +8061,13 @@ class CParser(Parser):
 
                 # C.g:463:2: ( unary_expression )
                 # C.g:463:4: unary_expression
-                self.following.append(self.FOLLOW_unary_expression_in_lvalue1765)
+                self.following.append(
+                    self.FOLLOW_unary_expression_in_lvalue1765)
                 self.unary_expression()
                 self.following.pop()
                 if self.failed:
                     return
 
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -8179,9 +8081,9 @@ class CParser(Parser):
 
     # $ANTLR end lvalue
 
-
     # $ANTLR start assignment_operator
     # C.g:466:1: assignment_operator : ( '=' | '*=' | '/=' | '%=' | '+=' | '-=' | '<<=' | '>>=' | '&=' | '^=' | '|=' );
+
     def assignment_operator(self, ):
 
         assignment_operator_StartIndex = self.input.index()
@@ -8193,7 +8095,7 @@ class CParser(Parser):
                 # C.g:467:2: ( '=' | '*=' | '/=' | '%=' | '+=' | '-=' | '<<=' | '>>=' | '&=' | '^=' | '|=' )
                 # C.g:
                 if self.input.LA(1) == 28 or (80 <= self.input.LA(1) <= 89):
-                    self.input.consume();
+                    self.input.consume()
                     self.errorRecovery = False
                     self.failed = False
 
@@ -8205,14 +8107,9 @@ class CParser(Parser):
                     mse = MismatchedSetException(None, self.input)
                     self.recoverFromMismatchedSet(
                         self.input, mse, self.FOLLOW_set_in_assignment_operator0
-                        )
+                    )
                     raise mse
 
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -8226,15 +8123,14 @@ class CParser(Parser):
 
     # $ANTLR end assignment_operator
 
-
     # $ANTLR start conditional_expression
     # C.g:480:1: conditional_expression : e= logical_or_expression ( '?' expression ':' conditional_expression )? ;
+
     def conditional_expression(self, ):
 
         conditional_expression_StartIndex = self.input.index()
         e = None
 
-
         try:
             try:
                 if self.backtracking > 0 and self.alreadyParsedRule(self.input, 51):
@@ -8242,7 +8138,8 @@ class CParser(Parser):
 
                 # C.g:481:2: (e= logical_or_expression ( '?' expression ':' conditional_expression )? )
                 # C.g:481:4: e= logical_or_expression ( '?' expression ':' conditional_expression )?
-                self.following.append(self.FOLLOW_logical_or_expression_in_conditional_expression1839)
+                self.following.append(
+                    self.FOLLOW_logical_or_expression_in_conditional_expression1839)
                 e = self.logical_or_expression()
                 self.following.pop()
                 if self.failed:
@@ -8251,35 +8148,33 @@ class CParser(Parser):
                 alt75 = 2
                 LA75_0 = self.input.LA(1)
 
-                if (LA75_0 == 90) :
+                if (LA75_0 == 90):
                     alt75 = 1
                 if alt75 == 1:
                     # C.g:481:29: '?' expression ':' conditional_expression
-                    self.match(self.input, 90, self.FOLLOW_90_in_conditional_expression1842)
+                    self.match(self.input, 90,
+                               self.FOLLOW_90_in_conditional_expression1842)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_expression_in_conditional_expression1844)
+                    self.following.append(
+                        self.FOLLOW_expression_in_conditional_expression1844)
                     self.expression()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 47, self.FOLLOW_47_in_conditional_expression1846)
+                    self.match(self.input, 47,
+                               self.FOLLOW_47_in_conditional_expression1846)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_conditional_expression_in_conditional_expression1848)
+                    self.following.append(
+                        self.FOLLOW_conditional_expression_in_conditional_expression1848)
                     self.conditional_expression()
                     self.following.pop()
                     if self.failed:
                         return
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
-
-
-
-
-
-
-
+                        self.StorePredicateExpression(
+                            e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -8299,10 +8194,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start logical_or_expression
     # C.g:484:1: logical_or_expression : logical_and_expression ( '||' logical_and_expression )* ;
+
     def logical_or_expression(self, ):
 
         retval = self.logical_or_expression_return()
@@ -8315,42 +8209,38 @@ class CParser(Parser):
 
                 # C.g:485:2: ( logical_and_expression ( '||' logical_and_expression )* )
                 # C.g:485:4: logical_and_expression ( '||' logical_and_expression )*
-                self.following.append(self.FOLLOW_logical_and_expression_in_logical_or_expression1863)
+                self.following.append(
+                    self.FOLLOW_logical_and_expression_in_logical_or_expression1863)
                 self.logical_and_expression()
                 self.following.pop()
                 if self.failed:
                     return retval
                 # C.g:485:27: ( '||' logical_and_expression )*
-                while True: #loop76
+                while True:  # loop76
                     alt76 = 2
                     LA76_0 = self.input.LA(1)
 
-                    if (LA76_0 == 91) :
+                    if (LA76_0 == 91):
                         alt76 = 1
 
-
                     if alt76 == 1:
                         # C.g:485:28: '||' logical_and_expression
-                        self.match(self.input, 91, self.FOLLOW_91_in_logical_or_expression1866)
+                        self.match(self.input, 91,
+                                   self.FOLLOW_91_in_logical_or_expression1866)
                         if self.failed:
                             return retval
-                        self.following.append(self.FOLLOW_logical_and_expression_in_logical_or_expression1868)
+                        self.following.append(
+                            self.FOLLOW_logical_and_expression_in_logical_or_expression1868)
                         self.logical_and_expression()
                         self.following.pop()
                         if self.failed:
                             return retval
 
-
                     else:
-                        break #loop76
-
-
-
-
+                        break  # loop76
 
                 retval.stop = self.input.LT(-1)
 
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -8364,9 +8254,9 @@ class CParser(Parser):
 
     # $ANTLR end logical_or_expression
 
-
     # $ANTLR start logical_and_expression
     # C.g:488:1: logical_and_expression : inclusive_or_expression ( '&&' inclusive_or_expression )* ;
+
     def logical_and_expression(self, ):
 
         logical_and_expression_StartIndex = self.input.index()
@@ -8377,39 +8267,35 @@ class CParser(Parser):
 
                 # C.g:489:2: ( inclusive_or_expression ( '&&' inclusive_or_expression )* )
                 # C.g:489:4: inclusive_or_expression ( '&&' inclusive_or_expression )*
-                self.following.append(self.FOLLOW_inclusive_or_expression_in_logical_and_expression1881)
+                self.following.append(
+                    self.FOLLOW_inclusive_or_expression_in_logical_and_expression1881)
                 self.inclusive_or_expression()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:489:28: ( '&&' inclusive_or_expression )*
-                while True: #loop77
+                while True:  # loop77
                     alt77 = 2
                     LA77_0 = self.input.LA(1)
 
-                    if (LA77_0 == 92) :
+                    if (LA77_0 == 92):
                         alt77 = 1
 
-
                     if alt77 == 1:
                         # C.g:489:29: '&&' inclusive_or_expression
-                        self.match(self.input, 92, self.FOLLOW_92_in_logical_and_expression1884)
+                        self.match(
+                            self.input, 92, self.FOLLOW_92_in_logical_and_expression1884)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_inclusive_or_expression_in_logical_and_expression1886)
+                        self.following.append(
+                            self.FOLLOW_inclusive_or_expression_in_logical_and_expression1886)
                         self.inclusive_or_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop77
-
-
-
-
-
+                        break  # loop77
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -8424,9 +8310,9 @@ class CParser(Parser):
 
     # $ANTLR end logical_and_expression
 
-
     # $ANTLR start inclusive_or_expression
     # C.g:492:1: inclusive_or_expression : exclusive_or_expression ( '|' exclusive_or_expression )* ;
+
     def inclusive_or_expression(self, ):
 
         inclusive_or_expression_StartIndex = self.input.index()
@@ -8437,46 +8323,43 @@ class CParser(Parser):
 
                 # C.g:493:2: ( exclusive_or_expression ( '|' exclusive_or_expression )* )
                 # C.g:493:4: exclusive_or_expression ( '|' exclusive_or_expression )*
-                self.following.append(self.FOLLOW_exclusive_or_expression_in_inclusive_or_expression1899)
+                self.following.append(
+                    self.FOLLOW_exclusive_or_expression_in_inclusive_or_expression1899)
                 self.exclusive_or_expression()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:493:28: ( '|' exclusive_or_expression )*
-                while True: #loop78
+                while True:  # loop78
                     alt78 = 2
                     LA78_0 = self.input.LA(1)
 
-                    if (LA78_0 == 93) :
+                    if (LA78_0 == 93):
                         alt78 = 1
 
-
                     if alt78 == 1:
                         # C.g:493:29: '|' exclusive_or_expression
-                        self.match(self.input, 93, self.FOLLOW_93_in_inclusive_or_expression1902)
+                        self.match(
+                            self.input, 93, self.FOLLOW_93_in_inclusive_or_expression1902)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_exclusive_or_expression_in_inclusive_or_expression1904)
+                        self.following.append(
+                            self.FOLLOW_exclusive_or_expression_in_inclusive_or_expression1904)
                         self.exclusive_or_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop78
-
-
-
-
-
+                        break  # loop78
 
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
             if self.backtracking > 0:
-                self.memoize(self.input, 54, inclusive_or_expression_StartIndex)
+                self.memoize(self.input, 54,
+                             inclusive_or_expression_StartIndex)
 
             pass
 
@@ -8484,9 +8367,9 @@ class CParser(Parser):
 
     # $ANTLR end inclusive_or_expression
 
-
     # $ANTLR start exclusive_or_expression
     # C.g:496:1: exclusive_or_expression : and_expression ( '^' and_expression )* ;
+
     def exclusive_or_expression(self, ):
 
         exclusive_or_expression_StartIndex = self.input.index()
@@ -8497,46 +8380,43 @@ class CParser(Parser):
 
                 # C.g:497:2: ( and_expression ( '^' and_expression )* )
                 # C.g:497:4: and_expression ( '^' and_expression )*
-                self.following.append(self.FOLLOW_and_expression_in_exclusive_or_expression1917)
+                self.following.append(
+                    self.FOLLOW_and_expression_in_exclusive_or_expression1917)
                 self.and_expression()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:497:19: ( '^' and_expression )*
-                while True: #loop79
+                while True:  # loop79
                     alt79 = 2
                     LA79_0 = self.input.LA(1)
 
-                    if (LA79_0 == 94) :
+                    if (LA79_0 == 94):
                         alt79 = 1
 
-
                     if alt79 == 1:
                         # C.g:497:20: '^' and_expression
-                        self.match(self.input, 94, self.FOLLOW_94_in_exclusive_or_expression1920)
+                        self.match(
+                            self.input, 94, self.FOLLOW_94_in_exclusive_or_expression1920)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_and_expression_in_exclusive_or_expression1922)
+                        self.following.append(
+                            self.FOLLOW_and_expression_in_exclusive_or_expression1922)
                         self.and_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop79
-
-
-
-
-
+                        break  # loop79
 
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
         finally:
             if self.backtracking > 0:
-                self.memoize(self.input, 55, exclusive_or_expression_StartIndex)
+                self.memoize(self.input, 55,
+                             exclusive_or_expression_StartIndex)
 
             pass
 
@@ -8544,9 +8424,9 @@ class CParser(Parser):
 
     # $ANTLR end exclusive_or_expression
 
-
     # $ANTLR start and_expression
     # C.g:500:1: and_expression : equality_expression ( '&' equality_expression )* ;
+
     def and_expression(self, ):
 
         and_expression_StartIndex = self.input.index()
@@ -8557,39 +8437,35 @@ class CParser(Parser):
 
                 # C.g:501:2: ( equality_expression ( '&' equality_expression )* )
                 # C.g:501:4: equality_expression ( '&' equality_expression )*
-                self.following.append(self.FOLLOW_equality_expression_in_and_expression1935)
+                self.following.append(
+                    self.FOLLOW_equality_expression_in_and_expression1935)
                 self.equality_expression()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:501:24: ( '&' equality_expression )*
-                while True: #loop80
+                while True:  # loop80
                     alt80 = 2
                     LA80_0 = self.input.LA(1)
 
-                    if (LA80_0 == 77) :
+                    if (LA80_0 == 77):
                         alt80 = 1
 
-
                     if alt80 == 1:
                         # C.g:501:25: '&' equality_expression
-                        self.match(self.input, 77, self.FOLLOW_77_in_and_expression1938)
+                        self.match(self.input, 77,
+                                   self.FOLLOW_77_in_and_expression1938)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_equality_expression_in_and_expression1940)
+                        self.following.append(
+                            self.FOLLOW_equality_expression_in_and_expression1940)
                         self.equality_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop80
-
-
-
-
-
+                        break  # loop80
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -8604,9 +8480,9 @@ class CParser(Parser):
 
     # $ANTLR end and_expression
 
-
     # $ANTLR start equality_expression
     # C.g:503:1: equality_expression : relational_expression ( ( '==' | '!=' ) relational_expression )* ;
+
     def equality_expression(self, ):
 
         equality_expression_StartIndex = self.input.index()
@@ -8617,24 +8493,24 @@ class CParser(Parser):
 
                 # C.g:504:2: ( relational_expression ( ( '==' | '!=' ) relational_expression )* )
                 # C.g:504:4: relational_expression ( ( '==' | '!=' ) relational_expression )*
-                self.following.append(self.FOLLOW_relational_expression_in_equality_expression1952)
+                self.following.append(
+                    self.FOLLOW_relational_expression_in_equality_expression1952)
                 self.relational_expression()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:504:26: ( ( '==' | '!=' ) relational_expression )*
-                while True: #loop81
+                while True:  # loop81
                     alt81 = 2
                     LA81_0 = self.input.LA(1)
 
-                    if ((95 <= LA81_0 <= 96)) :
+                    if ((95 <= LA81_0 <= 96)):
                         alt81 = 1
 
-
                     if alt81 == 1:
                         # C.g:504:27: ( '==' | '!=' ) relational_expression
                         if (95 <= self.input.LA(1) <= 96):
-                            self.input.consume();
+                            self.input.consume()
                             self.errorRecovery = False
                             self.failed = False
 
@@ -8646,24 +8522,18 @@ class CParser(Parser):
                             mse = MismatchedSetException(None, self.input)
                             self.recoverFromMismatchedSet(
                                 self.input, mse, self.FOLLOW_set_in_equality_expression1955
-                                )
+                            )
                             raise mse
 
-
-                        self.following.append(self.FOLLOW_relational_expression_in_equality_expression1961)
+                        self.following.append(
+                            self.FOLLOW_relational_expression_in_equality_expression1961)
                         self.relational_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop81
-
-
-
-
-
+                        break  # loop81
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -8678,9 +8548,9 @@ class CParser(Parser):
 
     # $ANTLR end equality_expression
 
-
     # $ANTLR start relational_expression
     # C.g:507:1: relational_expression : shift_expression ( ( '<' | '>' | '<=' | '>=' ) shift_expression )* ;
+
     def relational_expression(self, ):
 
         relational_expression_StartIndex = self.input.index()
@@ -8691,24 +8561,24 @@ class CParser(Parser):
 
                 # C.g:508:2: ( shift_expression ( ( '<' | '>' | '<=' | '>=' ) shift_expression )* )
                 # C.g:508:4: shift_expression ( ( '<' | '>' | '<=' | '>=' ) shift_expression )*
-                self.following.append(self.FOLLOW_shift_expression_in_relational_expression1975)
+                self.following.append(
+                    self.FOLLOW_shift_expression_in_relational_expression1975)
                 self.shift_expression()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:508:21: ( ( '<' | '>' | '<=' | '>=' ) shift_expression )*
-                while True: #loop82
+                while True:  # loop82
                     alt82 = 2
                     LA82_0 = self.input.LA(1)
 
-                    if ((97 <= LA82_0 <= 100)) :
+                    if ((97 <= LA82_0 <= 100)):
                         alt82 = 1
 
-
                     if alt82 == 1:
                         # C.g:508:22: ( '<' | '>' | '<=' | '>=' ) shift_expression
                         if (97 <= self.input.LA(1) <= 100):
-                            self.input.consume();
+                            self.input.consume()
                             self.errorRecovery = False
                             self.failed = False
 
@@ -8720,24 +8590,18 @@ class CParser(Parser):
                             mse = MismatchedSetException(None, self.input)
                             self.recoverFromMismatchedSet(
                                 self.input, mse, self.FOLLOW_set_in_relational_expression1978
-                                )
+                            )
                             raise mse
 
-
-                        self.following.append(self.FOLLOW_shift_expression_in_relational_expression1988)
+                        self.following.append(
+                            self.FOLLOW_shift_expression_in_relational_expression1988)
                         self.shift_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop82
-
-
-
-
-
+                        break  # loop82
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -8752,9 +8616,9 @@ class CParser(Parser):
 
     # $ANTLR end relational_expression
 
-
     # $ANTLR start shift_expression
     # C.g:511:1: shift_expression : additive_expression ( ( '<<' | '>>' ) additive_expression )* ;
+
     def shift_expression(self, ):
 
         shift_expression_StartIndex = self.input.index()
@@ -8765,24 +8629,24 @@ class CParser(Parser):
 
                 # C.g:512:2: ( additive_expression ( ( '<<' | '>>' ) additive_expression )* )
                 # C.g:512:4: additive_expression ( ( '<<' | '>>' ) additive_expression )*
-                self.following.append(self.FOLLOW_additive_expression_in_shift_expression2001)
+                self.following.append(
+                    self.FOLLOW_additive_expression_in_shift_expression2001)
                 self.additive_expression()
                 self.following.pop()
                 if self.failed:
                     return
                 # C.g:512:24: ( ( '<<' | '>>' ) additive_expression )*
-                while True: #loop83
+                while True:  # loop83
                     alt83 = 2
                     LA83_0 = self.input.LA(1)
 
-                    if ((101 <= LA83_0 <= 102)) :
+                    if ((101 <= LA83_0 <= 102)):
                         alt83 = 1
 
-
                     if alt83 == 1:
                         # C.g:512:25: ( '<<' | '>>' ) additive_expression
                         if (101 <= self.input.LA(1) <= 102):
-                            self.input.consume();
+                            self.input.consume()
                             self.errorRecovery = False
                             self.failed = False
 
@@ -8794,24 +8658,18 @@ class CParser(Parser):
                             mse = MismatchedSetException(None, self.input)
                             self.recoverFromMismatchedSet(
                                 self.input, mse, self.FOLLOW_set_in_shift_expression2004
-                                )
+                            )
                             raise mse
 
-
-                        self.following.append(self.FOLLOW_additive_expression_in_shift_expression2010)
+                        self.following.append(
+                            self.FOLLOW_additive_expression_in_shift_expression2010)
                         self.additive_expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop83
-
-
-
-
-
+                        break  # loop83
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -8826,9 +8684,9 @@ class CParser(Parser):
 
     # $ANTLR end shift_expression
 
-
     # $ANTLR start statement
     # C.g:517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );
+
     def statement(self, ):
 
         statement_StartIndex = self.input.index()
@@ -8845,20 +8703,21 @@ class CParser(Parser):
                     if LA84 == 62:
                         LA84_43 = self.input.LA(3)
 
-                        if (self.synpred169()) :
+                        if (self.synpred169()):
                             alt84 = 3
-                        elif (self.synpred173()) :
+                        elif (self.synpred173()):
                             alt84 = 7
-                        elif (self.synpred174()) :
+                        elif (self.synpred174()):
                             alt84 = 8
-                        elif (True) :
+                        elif (True):
                             alt84 = 11
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 43, self.input)
+                            nvae = NoViableAltException(
+                                "517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 43, self.input)
 
                             raise nvae
 
@@ -8869,48 +8728,51 @@ class CParser(Parser):
                     elif LA84 == 66:
                         LA84_47 = self.input.LA(3)
 
-                        if (self.synpred169()) :
+                        if (self.synpred169()):
                             alt84 = 3
-                        elif (True) :
+                        elif (True):
                             alt84 = 11
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 47, self.input)
+                            nvae = NoViableAltException(
+                                "517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 47, self.input)
 
                             raise nvae
 
                     elif LA84 == IDENTIFIER:
                         LA84_53 = self.input.LA(3)
 
-                        if (self.synpred169()) :
+                        if (self.synpred169()):
                             alt84 = 3
-                        elif (True) :
+                        elif (True):
                             alt84 = 11
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 53, self.input)
+                            nvae = NoViableAltException(
+                                "517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 53, self.input)
 
                             raise nvae
 
                     elif LA84 == 25:
                         LA84_68 = self.input.LA(3)
 
-                        if (self.synpred169()) :
+                        if (self.synpred169()):
                             alt84 = 3
-                        elif (True) :
+                        elif (True):
                             alt84 = 11
                         else:
                             if self.backtracking > 0:
                                 self.failed = True
                                 return
 
-                            nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 68, self.input)
+                            nvae = NoViableAltException(
+                                "517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 68, self.input)
 
                             raise nvae
 
@@ -8921,7 +8783,8 @@ class CParser(Parser):
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 1, self.input)
+                        nvae = NoViableAltException(
+                            "517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 1, self.input)
 
                         raise nvae
 
@@ -8950,110 +8813,110 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 0, self.input)
+                    nvae = NoViableAltException(
+                        "517:1: statement : ( labeled_statement | compound_statement | expression_statement | selection_statement | iteration_statement | jump_statement | macro_statement | asm2_statement | asm1_statement | asm_statement | declaration );", 84, 0, self.input)
 
                     raise nvae
 
                 if alt84 == 1:
                     # C.g:518:4: labeled_statement
-                    self.following.append(self.FOLLOW_labeled_statement_in_statement2025)
+                    self.following.append(
+                        self.FOLLOW_labeled_statement_in_statement2025)
                     self.labeled_statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt84 == 2:
                     # C.g:519:4: compound_statement
-                    self.following.append(self.FOLLOW_compound_statement_in_statement2030)
+                    self.following.append(
+                        self.FOLLOW_compound_statement_in_statement2030)
                     self.compound_statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt84 == 3:
                     # C.g:520:4: expression_statement
-                    self.following.append(self.FOLLOW_expression_statement_in_statement2035)
+                    self.following.append(
+                        self.FOLLOW_expression_statement_in_statement2035)
                     self.expression_statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt84 == 4:
                     # C.g:521:4: selection_statement
-                    self.following.append(self.FOLLOW_selection_statement_in_statement2040)
+                    self.following.append(
+                        self.FOLLOW_selection_statement_in_statement2040)
                     self.selection_statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt84 == 5:
                     # C.g:522:4: iteration_statement
-                    self.following.append(self.FOLLOW_iteration_statement_in_statement2045)
+                    self.following.append(
+                        self.FOLLOW_iteration_statement_in_statement2045)
                     self.iteration_statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt84 == 6:
                     # C.g:523:4: jump_statement
-                    self.following.append(self.FOLLOW_jump_statement_in_statement2050)
+                    self.following.append(
+                        self.FOLLOW_jump_statement_in_statement2050)
                     self.jump_statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt84 == 7:
                     # C.g:524:4: macro_statement
-                    self.following.append(self.FOLLOW_macro_statement_in_statement2055)
+                    self.following.append(
+                        self.FOLLOW_macro_statement_in_statement2055)
                     self.macro_statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt84 == 8:
                     # C.g:525:4: asm2_statement
-                    self.following.append(self.FOLLOW_asm2_statement_in_statement2060)
+                    self.following.append(
+                        self.FOLLOW_asm2_statement_in_statement2060)
                     self.asm2_statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt84 == 9:
                     # C.g:526:4: asm1_statement
-                    self.following.append(self.FOLLOW_asm1_statement_in_statement2065)
+                    self.following.append(
+                        self.FOLLOW_asm1_statement_in_statement2065)
                     self.asm1_statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt84 == 10:
                     # C.g:527:4: asm_statement
-                    self.following.append(self.FOLLOW_asm_statement_in_statement2070)
+                    self.following.append(
+                        self.FOLLOW_asm_statement_in_statement2070)
                     self.asm_statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt84 == 11:
                     # C.g:528:4: declaration
-                    self.following.append(self.FOLLOW_declaration_in_statement2075)
+                    self.following.append(
+                        self.FOLLOW_declaration_in_statement2075)
                     self.declaration()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -9067,9 +8930,9 @@ class CParser(Parser):
 
     # $ANTLR end statement
 
-
     # $ANTLR start asm2_statement
     # C.g:531:1: asm2_statement : ( '__asm__' )? IDENTIFIER '(' (~ ( ';' ) )* ')' ';' ;
+
     def asm2_statement(self, ):
 
         asm2_statement_StartIndex = self.input.index()
@@ -9084,42 +8947,41 @@ class CParser(Parser):
                 alt85 = 2
                 LA85_0 = self.input.LA(1)
 
-                if (LA85_0 == 103) :
+                if (LA85_0 == 103):
                     alt85 = 1
                 if alt85 == 1:
                     # C.g:0:0: '__asm__'
-                    self.match(self.input, 103, self.FOLLOW_103_in_asm2_statement2086)
+                    self.match(self.input, 103,
+                               self.FOLLOW_103_in_asm2_statement2086)
                     if self.failed:
                         return
 
-
-
-                self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_asm2_statement2089)
+                self.match(self.input, IDENTIFIER,
+                           self.FOLLOW_IDENTIFIER_in_asm2_statement2089)
                 if self.failed:
                     return
-                self.match(self.input, 62, self.FOLLOW_62_in_asm2_statement2091)
+                self.match(self.input, 62,
+                           self.FOLLOW_62_in_asm2_statement2091)
                 if self.failed:
                     return
                 # C.g:532:30: (~ ( ';' ) )*
-                while True: #loop86
+                while True:  # loop86
                     alt86 = 2
                     LA86_0 = self.input.LA(1)
 
-                    if (LA86_0 == 63) :
+                    if (LA86_0 == 63):
                         LA86_1 = self.input.LA(2)
 
-                        if ((IDENTIFIER <= LA86_1 <= LINE_COMMAND) or (26 <= LA86_1 <= 117)) :
+                        if ((IDENTIFIER <= LA86_1 <= LINE_COMMAND) or (26 <= LA86_1 <= 117)):
                             alt86 = 1
 
-
-                    elif ((IDENTIFIER <= LA86_0 <= LINE_COMMAND) or (26 <= LA86_0 <= 62) or (64 <= LA86_0 <= 117)) :
+                    elif ((IDENTIFIER <= LA86_0 <= LINE_COMMAND) or (26 <= LA86_0 <= 62) or (64 <= LA86_0 <= 117)):
                         alt86 = 1
 
-
                     if alt86 == 1:
                         # C.g:532:31: ~ ( ';' )
                         if (IDENTIFIER <= self.input.LA(1) <= LINE_COMMAND) or (26 <= self.input.LA(1) <= 117):
-                            self.input.consume();
+                            self.input.consume()
                             self.errorRecovery = False
                             self.failed = False
 
@@ -9131,26 +8993,21 @@ class CParser(Parser):
                             mse = MismatchedSetException(None, self.input)
                             self.recoverFromMismatchedSet(
                                 self.input, mse, self.FOLLOW_set_in_asm2_statement2094
-                                )
+                            )
                             raise mse
 
-
-
-
                     else:
-                        break #loop86
+                        break  # loop86
 
-
-                self.match(self.input, 63, self.FOLLOW_63_in_asm2_statement2101)
+                self.match(self.input, 63,
+                           self.FOLLOW_63_in_asm2_statement2101)
                 if self.failed:
                     return
-                self.match(self.input, 25, self.FOLLOW_25_in_asm2_statement2103)
+                self.match(self.input, 25,
+                           self.FOLLOW_25_in_asm2_statement2103)
                 if self.failed:
                     return
 
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -9164,9 +9021,9 @@ class CParser(Parser):
 
     # $ANTLR end asm2_statement
 
-
     # $ANTLR start asm1_statement
     # C.g:535:1: asm1_statement : '_asm' '{' (~ ( '}' ) )* '}' ;
+
     def asm1_statement(self, ):
 
         asm1_statement_StartIndex = self.input.index()
@@ -9177,25 +9034,26 @@ class CParser(Parser):
 
                 # C.g:536:2: ( '_asm' '{' (~ ( '}' ) )* '}' )
                 # C.g:536:4: '_asm' '{' (~ ( '}' ) )* '}'
-                self.match(self.input, 104, self.FOLLOW_104_in_asm1_statement2115)
+                self.match(self.input, 104,
+                           self.FOLLOW_104_in_asm1_statement2115)
                 if self.failed:
                     return
-                self.match(self.input, 43, self.FOLLOW_43_in_asm1_statement2117)
+                self.match(self.input, 43,
+                           self.FOLLOW_43_in_asm1_statement2117)
                 if self.failed:
                     return
                 # C.g:536:15: (~ ( '}' ) )*
-                while True: #loop87
+                while True:  # loop87
                     alt87 = 2
                     LA87_0 = self.input.LA(1)
 
-                    if ((IDENTIFIER <= LA87_0 <= 43) or (45 <= LA87_0 <= 117)) :
+                    if ((IDENTIFIER <= LA87_0 <= 43) or (45 <= LA87_0 <= 117)):
                         alt87 = 1
 
-
                     if alt87 == 1:
                         # C.g:536:16: ~ ( '}' )
                         if (IDENTIFIER <= self.input.LA(1) <= 43) or (45 <= self.input.LA(1) <= 117):
-                            self.input.consume();
+                            self.input.consume()
                             self.errorRecovery = False
                             self.failed = False
 
@@ -9207,23 +9065,17 @@ class CParser(Parser):
                             mse = MismatchedSetException(None, self.input)
                             self.recoverFromMismatchedSet(
                                 self.input, mse, self.FOLLOW_set_in_asm1_statement2120
-                                )
+                            )
                             raise mse
 
-
-
-
                     else:
-                        break #loop87
+                        break  # loop87
 
-
-                self.match(self.input, 44, self.FOLLOW_44_in_asm1_statement2127)
+                self.match(self.input, 44,
+                           self.FOLLOW_44_in_asm1_statement2127)
                 if self.failed:
                     return
 
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -9237,9 +9089,9 @@ class CParser(Parser):
 
     # $ANTLR end asm1_statement
 
-
     # $ANTLR start asm_statement
     # C.g:539:1: asm_statement : '__asm' '{' (~ ( '}' ) )* '}' ;
+
     def asm_statement(self, ):
 
         asm_statement_StartIndex = self.input.index()
@@ -9250,25 +9102,25 @@ class CParser(Parser):
 
                 # C.g:540:2: ( '__asm' '{' (~ ( '}' ) )* '}' )
                 # C.g:540:4: '__asm' '{' (~ ( '}' ) )* '}'
-                self.match(self.input, 105, self.FOLLOW_105_in_asm_statement2138)
+                self.match(self.input, 105,
+                           self.FOLLOW_105_in_asm_statement2138)
                 if self.failed:
                     return
                 self.match(self.input, 43, self.FOLLOW_43_in_asm_statement2140)
                 if self.failed:
                     return
                 # C.g:540:16: (~ ( '}' ) )*
-                while True: #loop88
+                while True:  # loop88
                     alt88 = 2
                     LA88_0 = self.input.LA(1)
 
-                    if ((IDENTIFIER <= LA88_0 <= 43) or (45 <= LA88_0 <= 117)) :
+                    if ((IDENTIFIER <= LA88_0 <= 43) or (45 <= LA88_0 <= 117)):
                         alt88 = 1
 
-
                     if alt88 == 1:
                         # C.g:540:17: ~ ( '}' )
                         if (IDENTIFIER <= self.input.LA(1) <= 43) or (45 <= self.input.LA(1) <= 117):
-                            self.input.consume();
+                            self.input.consume()
                             self.errorRecovery = False
                             self.failed = False
 
@@ -9280,23 +9132,16 @@ class CParser(Parser):
                             mse = MismatchedSetException(None, self.input)
                             self.recoverFromMismatchedSet(
                                 self.input, mse, self.FOLLOW_set_in_asm_statement2143
-                                )
+                            )
                             raise mse
 
-
-
-
                     else:
-                        break #loop88
-
+                        break  # loop88
 
                 self.match(self.input, 44, self.FOLLOW_44_in_asm_statement2150)
                 if self.failed:
                     return
 
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -9310,9 +9155,9 @@ class CParser(Parser):
 
     # $ANTLR end asm_statement
 
-
     # $ANTLR start macro_statement
     # C.g:543:1: macro_statement : IDENTIFIER '(' ( declaration )* ( statement_list )? ( expression )? ')' ;
+
     def macro_statement(self, ):
 
         macro_statement_StartIndex = self.input.index()
@@ -9323,14 +9168,16 @@ class CParser(Parser):
 
                 # C.g:544:2: ( IDENTIFIER '(' ( declaration )* ( statement_list )? ( expression )? ')' )
                 # C.g:544:4: IDENTIFIER '(' ( declaration )* ( statement_list )? ( expression )? ')'
-                self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_macro_statement2162)
+                self.match(self.input, IDENTIFIER,
+                           self.FOLLOW_IDENTIFIER_in_macro_statement2162)
                 if self.failed:
                     return
-                self.match(self.input, 62, self.FOLLOW_62_in_macro_statement2164)
+                self.match(self.input, 62,
+                           self.FOLLOW_62_in_macro_statement2164)
                 if self.failed:
                     return
                 # C.g:544:19: ( declaration )*
-                while True: #loop89
+                while True:  # loop89
                     alt89 = 2
                     LA89 = self.input.LA(1)
                     if LA89 == IDENTIFIER:
@@ -9338,1904 +9185,1622 @@ class CParser(Parser):
                         if LA89 == 62:
                             LA89_45 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_47 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 66:
                             LA89_50 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_68 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_71 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_72 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_73 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_74 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_75 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_76 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_77 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_78 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_79 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_80 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_81 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_82 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_83 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_84 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_85 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_86 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 26:
                         LA89 = self.input.LA(2)
                         if LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_87 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_88 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_89 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_90 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_91 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_92 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_93 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_94 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_95 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_96 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_97 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_98 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_99 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_100 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 66:
                             LA89_101 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_102 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_103 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_104 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_105 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                         LA89 = self.input.LA(2)
                         if LA89 == 66:
                             LA89_106 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_107 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_108 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_109 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_110 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_111 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_112 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_113 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_114 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_115 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_116 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_117 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_118 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_119 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_120 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_121 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_122 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_123 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_124 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_125 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 34:
                         LA89 = self.input.LA(2)
                         if LA89 == 66:
                             LA89_126 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_127 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_128 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_129 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_130 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_131 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_132 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_133 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_134 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_135 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_136 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_137 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_138 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_139 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_140 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_141 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_142 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_143 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_144 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_145 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 35:
                         LA89 = self.input.LA(2)
                         if LA89 == 66:
                             LA89_146 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_147 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_148 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_149 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_150 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_151 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_152 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_153 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_154 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_155 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_156 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_157 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_158 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_159 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_160 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_161 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_162 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_163 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_164 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_165 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 36:
                         LA89 = self.input.LA(2)
                         if LA89 == 66:
                             LA89_166 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_167 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_168 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_169 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_170 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_171 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_172 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_173 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_174 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_175 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_176 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_177 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_178 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_179 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_180 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_181 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_182 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_183 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_184 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_185 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 37:
                         LA89 = self.input.LA(2)
                         if LA89 == 66:
                             LA89_186 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_187 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_188 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_189 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_190 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_191 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_192 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_193 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_194 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_195 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_196 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_197 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_198 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_199 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_200 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_201 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_202 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_203 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_204 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_205 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 38:
                         LA89 = self.input.LA(2)
                         if LA89 == 66:
                             LA89_206 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_207 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_208 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_209 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_210 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_211 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_212 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_213 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_214 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_215 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_216 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_217 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_218 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_219 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_220 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_221 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_222 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_223 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_224 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_225 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 39:
                         LA89 = self.input.LA(2)
                         if LA89 == 66:
                             LA89_226 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_227 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_228 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_229 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_230 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_231 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_232 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_233 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_234 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_235 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_236 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_237 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_238 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_239 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_240 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_241 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_242 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_243 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_244 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_245 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 40:
                         LA89 = self.input.LA(2)
                         if LA89 == 66:
                             LA89_246 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_247 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_248 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_249 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_250 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_251 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_252 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_253 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_254 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_255 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_256 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_257 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_258 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_259 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_260 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_261 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_262 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_263 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_264 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_265 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 41:
                         LA89 = self.input.LA(2)
                         if LA89 == 66:
                             LA89_266 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_267 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_268 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_269 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_270 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_271 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_272 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_273 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_274 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_275 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_276 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_277 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_278 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_279 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_280 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_281 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_282 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_283 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_284 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_285 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 42:
                         LA89 = self.input.LA(2)
                         if LA89 == 66:
                             LA89_286 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_287 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_288 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_289 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_290 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_291 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_292 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_293 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_294 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_295 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_296 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_297 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_298 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_299 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_300 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_301 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_302 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_303 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_304 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_305 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
                     elif LA89 == 45 or LA89 == 46:
                         LA89_40 = self.input.LA(2)
 
-                        if (LA89_40 == IDENTIFIER) :
+                        if (LA89_40 == IDENTIFIER):
                             LA89_306 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-                        elif (LA89_40 == 43) :
+                        elif (LA89_40 == 43):
                             LA89_307 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
-
                     elif LA89 == 48:
                         LA89_41 = self.input.LA(2)
 
-                        if (LA89_41 == 43) :
+                        if (LA89_41 == 43):
                             LA89_308 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-                        elif (LA89_41 == IDENTIFIER) :
+                        elif (LA89_41 == IDENTIFIER):
                             LA89_309 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
-
                     elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 58 or LA89 == 59 or LA89 == 60 or LA89 == 61:
                         LA89 = self.input.LA(2)
                         if LA89 == 66:
                             LA89_310 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 58:
                             LA89_311 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 59:
                             LA89_312 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 60:
                             LA89_313 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == IDENTIFIER:
                             LA89_314 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 62:
                             LA89_315 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 25:
                             LA89_316 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 29 or LA89 == 30 or LA89 == 31 or LA89 == 32 or LA89 == 33:
                             LA89_317 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 34:
                             LA89_318 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 35:
                             LA89_319 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 36:
                             LA89_320 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 37:
                             LA89_321 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 38:
                             LA89_322 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 39:
                             LA89_323 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 40:
                             LA89_324 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 41:
                             LA89_325 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 42:
                             LA89_326 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 45 or LA89 == 46:
                             LA89_327 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 48:
                             LA89_328 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
                         elif LA89 == 49 or LA89 == 50 or LA89 == 51 or LA89 == 52 or LA89 == 53 or LA89 == 54 or LA89 == 55 or LA89 == 56 or LA89 == 57 or LA89 == 61:
                             LA89_329 = self.input.LA(3)
 
-                            if (self.synpred181()) :
+                            if (self.synpred181()):
                                 alt89 = 1
 
-
-
-
                     if alt89 == 1:
                         # C.g:0:0: declaration
-                        self.following.append(self.FOLLOW_declaration_in_macro_statement2166)
+                        self.following.append(
+                            self.FOLLOW_declaration_in_macro_statement2166)
                         self.declaration()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
-                        break #loop89
-
+                        break  # loop89
 
                 # C.g:544:33: ( statement_list )?
                 alt90 = 2
@@ -11247,122 +10812,122 @@ class CParser(Parser):
                     elif LA90 == 62:
                         LA90_45 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == STRING_LITERAL:
                         LA90_46 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == IDENTIFIER:
                         LA90_47 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 64:
                         LA90_48 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 75:
                         LA90_49 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66:
                         LA90_50 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 76:
                         LA90_51 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_52 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_53 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 70:
                         LA90_54 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 71:
                         LA90_55 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 68:
                         LA90_56 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 69:
                         LA90_57 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 101 or LA90 == 102:
                         LA90_58 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 97 or LA90 == 98 or LA90 == 99 or LA90 == 100:
                         LA90_59 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 95 or LA90 == 96:
                         LA90_60 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 77:
                         LA90_61 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 94:
                         LA90_62 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 93:
                         LA90_63 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 92:
                         LA90_64 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 91:
                         LA90_65 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 90:
                         LA90_66 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 27:
                         LA90_67 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 28 or LA90 == 80 or LA90 == 81 or LA90 == 82 or LA90 == 83 or LA90 == 84 or LA90 == 85 or LA90 == 86 or LA90 == 87 or LA90 == 88 or LA90 == 89:
                         LA90_70 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                 elif LA90 == 25 or LA90 == 26 or LA90 == 29 or LA90 == 30 or LA90 == 31 or LA90 == 32 or LA90 == 33 or LA90 == 34 or LA90 == 35 or LA90 == 36 or LA90 == 37 or LA90 == 38 or LA90 == 39 or LA90 == 40 or LA90 == 41 or LA90 == 42 or LA90 == 43 or LA90 == 45 or LA90 == 46 or LA90 == 48 or LA90 == 49 or LA90 == 50 or LA90 == 51 or LA90 == 52 or LA90 == 53 or LA90 == 54 or LA90 == 55 or LA90 == 56 or LA90 == 57 or LA90 == 58 or LA90 == 59 or LA90 == 60 or LA90 == 61 or LA90 == 103 or LA90 == 104 or LA90 == 105 or LA90 == 106 or LA90 == 107 or LA90 == 108 or LA90 == 110 or LA90 == 111 or LA90 == 112 or LA90 == 113 or LA90 == 114 or LA90 == 115 or LA90 == 116 or LA90 == 117:
                     alt90 = 1
@@ -11371,112 +10936,112 @@ class CParser(Parser):
                     if LA90 == 64:
                         LA90_87 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 62:
                         LA90_88 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 75:
                         LA90_89 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66:
                         LA90_90 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 76:
                         LA90_91 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_92 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_93 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 28 or LA90 == 80 or LA90 == 81 or LA90 == 82 or LA90 == 83 or LA90 == 84 or LA90 == 85 or LA90 == 86 or LA90 == 87 or LA90 == 88 or LA90 == 89:
                         LA90_94 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 70:
                         LA90_95 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 71:
                         LA90_96 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 68:
                         LA90_97 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 69:
                         LA90_98 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 101 or LA90 == 102:
                         LA90_99 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 97 or LA90 == 98 or LA90 == 99 or LA90 == 100:
                         LA90_100 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 95 or LA90 == 96:
                         LA90_101 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 77:
                         LA90_102 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 94:
                         LA90_103 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 93:
                         LA90_104 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 92:
                         LA90_105 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 91:
                         LA90_106 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 90:
                         LA90_107 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 27:
                         LA90_108 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 25:
                         alt90 = 1
@@ -11485,226 +11050,226 @@ class CParser(Parser):
                     if LA90 == 64:
                         LA90_111 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 62:
                         LA90_112 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 75:
                         LA90_113 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66:
                         LA90_114 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 76:
                         LA90_115 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_116 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_117 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 70:
                         LA90_118 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 71:
                         LA90_119 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 68:
                         LA90_120 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 69:
                         LA90_121 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 101 or LA90 == 102:
                         LA90_122 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 97 or LA90 == 98 or LA90 == 99 or LA90 == 100:
                         LA90_123 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 95 or LA90 == 96:
                         LA90_124 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 77:
                         LA90_125 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 94:
                         LA90_126 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 93:
                         LA90_127 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 92:
                         LA90_128 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 91:
                         LA90_129 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 90:
                         LA90_130 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 27:
                         LA90_131 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 25:
                         alt90 = 1
                     elif LA90 == 28 or LA90 == 80 or LA90 == 81 or LA90 == 82 or LA90 == 83 or LA90 == 84 or LA90 == 85 or LA90 == 86 or LA90 == 87 or LA90 == 88 or LA90 == 89:
                         LA90_134 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                 elif LA90 == DECIMAL_LITERAL:
                     LA90 = self.input.LA(2)
                     if LA90 == 64:
                         LA90_135 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 62:
                         LA90_136 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 75:
                         LA90_137 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66:
                         LA90_138 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 76:
                         LA90_139 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_140 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_141 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 28 or LA90 == 80 or LA90 == 81 or LA90 == 82 or LA90 == 83 or LA90 == 84 or LA90 == 85 or LA90 == 86 or LA90 == 87 or LA90 == 88 or LA90 == 89:
                         LA90_142 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 70:
                         LA90_143 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 71:
                         LA90_144 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 68:
                         LA90_145 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 69:
                         LA90_146 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 101 or LA90 == 102:
                         LA90_147 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 97 or LA90 == 98 or LA90 == 99 or LA90 == 100:
                         LA90_148 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 95 or LA90 == 96:
                         LA90_149 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 77:
                         LA90_150 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 94:
                         LA90_151 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 93:
                         LA90_152 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 92:
                         LA90_153 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 91:
                         LA90_154 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 90:
                         LA90_155 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 27:
                         LA90_156 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 25:
                         alt90 = 1
@@ -11713,236 +11278,236 @@ class CParser(Parser):
                     if LA90 == 64:
                         LA90_159 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 62:
                         LA90_160 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 75:
                         LA90_161 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66:
                         LA90_162 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 76:
                         LA90_163 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_164 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_165 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 70:
                         LA90_166 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 71:
                         LA90_167 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 68:
                         LA90_168 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 69:
                         LA90_169 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 101 or LA90 == 102:
                         LA90_170 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 97 or LA90 == 98 or LA90 == 99 or LA90 == 100:
                         LA90_171 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 95 or LA90 == 96:
                         LA90_172 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 77:
                         LA90_173 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 94:
                         LA90_174 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 93:
                         LA90_175 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 92:
                         LA90_176 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 91:
                         LA90_177 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 90:
                         LA90_178 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 27:
                         LA90_179 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 25:
                         alt90 = 1
                     elif LA90 == 28 or LA90 == 80 or LA90 == 81 or LA90 == 82 or LA90 == 83 or LA90 == 84 or LA90 == 85 or LA90 == 86 or LA90 == 87 or LA90 == 88 or LA90 == 89:
                         LA90_181 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                 elif LA90 == STRING_LITERAL:
                     LA90 = self.input.LA(2)
                     if LA90 == IDENTIFIER:
                         LA90_183 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 64:
                         LA90_184 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 62:
                         LA90_185 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 75:
                         LA90_186 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66:
                         LA90_187 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 76:
                         LA90_188 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_189 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_190 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 28 or LA90 == 80 or LA90 == 81 or LA90 == 82 or LA90 == 83 or LA90 == 84 or LA90 == 85 or LA90 == 86 or LA90 == 87 or LA90 == 88 or LA90 == 89:
                         LA90_191 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == STRING_LITERAL:
                         LA90_192 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 70:
                         LA90_193 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 71:
                         LA90_194 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 68:
                         LA90_195 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 69:
                         LA90_196 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 101 or LA90 == 102:
                         LA90_197 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 97 or LA90 == 98 or LA90 == 99 or LA90 == 100:
                         LA90_198 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 95 or LA90 == 96:
                         LA90_199 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 77:
                         LA90_200 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 94:
                         LA90_201 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 93:
                         LA90_202 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 92:
                         LA90_203 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 91:
                         LA90_204 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 90:
                         LA90_205 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 27:
                         LA90_206 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 25:
                         alt90 = 1
@@ -11951,112 +11516,112 @@ class CParser(Parser):
                     if LA90 == 64:
                         LA90_209 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 62:
                         LA90_210 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 75:
                         LA90_211 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66:
                         LA90_212 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 76:
                         LA90_213 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_214 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_215 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 28 or LA90 == 80 or LA90 == 81 or LA90 == 82 or LA90 == 83 or LA90 == 84 or LA90 == 85 or LA90 == 86 or LA90 == 87 or LA90 == 88 or LA90 == 89:
                         LA90_216 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 70:
                         LA90_217 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 71:
                         LA90_218 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 68:
                         LA90_219 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 69:
                         LA90_220 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 101 or LA90 == 102:
                         LA90_221 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 97 or LA90 == 98 or LA90 == 99 or LA90 == 100:
                         LA90_222 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 95 or LA90 == 96:
                         LA90_223 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 77:
                         LA90_224 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 94:
                         LA90_225 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 93:
                         LA90_226 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 92:
                         LA90_227 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 91:
                         LA90_228 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 90:
                         LA90_229 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 27:
                         LA90_230 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 25:
                         alt90 = 1
@@ -12065,404 +11630,400 @@ class CParser(Parser):
                     if LA90 == IDENTIFIER:
                         LA90_233 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == HEX_LITERAL:
                         LA90_234 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == OCTAL_LITERAL:
                         LA90_235 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == DECIMAL_LITERAL:
                         LA90_236 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == CHARACTER_LITERAL:
                         LA90_237 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == STRING_LITERAL:
                         LA90_238 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == FLOATING_POINT_LITERAL:
                         LA90_239 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 62:
                         LA90_240 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_241 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_242 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66 or LA90 == 68 or LA90 == 69 or LA90 == 77 or LA90 == 78 or LA90 == 79:
                         LA90_243 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 74:
                         LA90_244 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 49 or LA90 == 50 or LA90 == 51 or LA90 == 52 or LA90 == 53 or LA90 == 54 or LA90 == 55 or LA90 == 56 or LA90 == 57 or LA90 == 58 or LA90 == 59 or LA90 == 60 or LA90 == 61:
                         LA90_245 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 34:
                         LA90_246 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 35:
                         LA90_247 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 36:
                         LA90_248 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 37:
                         LA90_249 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 38:
                         LA90_250 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 39:
                         LA90_251 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 40:
                         LA90_252 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 41:
                         LA90_253 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 42:
                         LA90_254 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 45 or LA90 == 46:
                         LA90_255 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 48:
                         LA90_256 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                 elif LA90 == 72:
                     LA90 = self.input.LA(2)
                     if LA90 == IDENTIFIER:
                         LA90_257 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == HEX_LITERAL:
                         LA90_258 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == OCTAL_LITERAL:
                         LA90_259 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == DECIMAL_LITERAL:
                         LA90_260 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == CHARACTER_LITERAL:
                         LA90_261 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == STRING_LITERAL:
                         LA90_262 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == FLOATING_POINT_LITERAL:
                         LA90_263 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 62:
                         LA90_264 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_265 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_266 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66 or LA90 == 68 or LA90 == 69 or LA90 == 77 or LA90 == 78 or LA90 == 79:
                         LA90_267 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 74:
                         LA90_268 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                 elif LA90 == 73:
                     LA90 = self.input.LA(2)
                     if LA90 == IDENTIFIER:
                         LA90_269 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == HEX_LITERAL:
                         LA90_270 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == OCTAL_LITERAL:
                         LA90_271 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == DECIMAL_LITERAL:
                         LA90_272 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == CHARACTER_LITERAL:
                         LA90_273 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == STRING_LITERAL:
                         LA90_274 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == FLOATING_POINT_LITERAL:
                         LA90_275 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 62:
                         LA90_276 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_277 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_278 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66 or LA90 == 68 or LA90 == 69 or LA90 == 77 or LA90 == 78 or LA90 == 79:
                         LA90_279 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 74:
                         LA90_280 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                 elif LA90 == 66 or LA90 == 68 or LA90 == 69 or LA90 == 77 or LA90 == 78 or LA90 == 79:
                     LA90 = self.input.LA(2)
                     if LA90 == 62:
                         LA90_281 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == IDENTIFIER:
                         LA90_282 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == HEX_LITERAL:
                         LA90_283 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == OCTAL_LITERAL:
                         LA90_284 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == DECIMAL_LITERAL:
                         LA90_285 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == CHARACTER_LITERAL:
                         LA90_286 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == STRING_LITERAL:
                         LA90_287 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == FLOATING_POINT_LITERAL:
                         LA90_288 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_289 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_290 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66 or LA90 == 68 or LA90 == 69 or LA90 == 77 or LA90 == 78 or LA90 == 79:
                         LA90_291 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 74:
                         LA90_292 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                 elif LA90 == 74:
                     LA90 = self.input.LA(2)
                     if LA90 == 62:
                         LA90_293 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == IDENTIFIER:
                         LA90_294 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == HEX_LITERAL:
                         LA90_295 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == OCTAL_LITERAL:
                         LA90_296 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == DECIMAL_LITERAL:
                         LA90_297 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == CHARACTER_LITERAL:
                         LA90_298 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == STRING_LITERAL:
                         LA90_299 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == FLOATING_POINT_LITERAL:
                         LA90_300 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 72:
                         LA90_301 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 73:
                         LA90_302 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 66 or LA90 == 68 or LA90 == 69 or LA90 == 77 or LA90 == 78 or LA90 == 79:
                         LA90_303 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                     elif LA90 == 74:
                         LA90_304 = self.input.LA(3)
 
-                        if (self.synpred182()) :
+                        if (self.synpred182()):
                             alt90 = 1
                 if alt90 == 1:
                     # C.g:0:0: statement_list
-                    self.following.append(self.FOLLOW_statement_list_in_macro_statement2170)
+                    self.following.append(
+                        self.FOLLOW_statement_list_in_macro_statement2170)
                     self.statement_list()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
                 # C.g:544:49: ( expression )?
                 alt91 = 2
                 LA91_0 = self.input.LA(1)
 
-                if ((IDENTIFIER <= LA91_0 <= FLOATING_POINT_LITERAL) or LA91_0 == 62 or LA91_0 == 66 or (68 <= LA91_0 <= 69) or (72 <= LA91_0 <= 74) or (77 <= LA91_0 <= 79)) :
+                if ((IDENTIFIER <= LA91_0 <= FLOATING_POINT_LITERAL) or LA91_0 == 62 or LA91_0 == 66 or (68 <= LA91_0 <= 69) or (72 <= LA91_0 <= 74) or (77 <= LA91_0 <= 79)):
                     alt91 = 1
                 if alt91 == 1:
                     # C.g:0:0: expression
-                    self.following.append(self.FOLLOW_expression_in_macro_statement2173)
+                    self.following.append(
+                        self.FOLLOW_expression_in_macro_statement2173)
                     self.expression()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
-                self.match(self.input, 63, self.FOLLOW_63_in_macro_statement2176)
+                self.match(self.input, 63,
+                           self.FOLLOW_63_in_macro_statement2176)
                 if self.failed:
                     return
 
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -12476,9 +12037,9 @@ class CParser(Parser):
 
     # $ANTLR end macro_statement
 
-
     # $ANTLR start labeled_statement
     # C.g:547:1: labeled_statement : ( IDENTIFIER ':' statement | 'case' constant_expression ':' statement | 'default' ':' statement );
+
     def labeled_statement(self, ):
 
         labeled_statement_StartIndex = self.input.index()
@@ -12501,61 +12062,68 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("547:1: labeled_statement : ( IDENTIFIER ':' statement | 'case' constant_expression ':' statement | 'default' ':' statement );", 92, 0, self.input)
+                    nvae = NoViableAltException(
+                        "547:1: labeled_statement : ( IDENTIFIER ':' statement | 'case' constant_expression ':' statement | 'default' ':' statement );", 92, 0, self.input)
 
                     raise nvae
 
                 if alt92 == 1:
                     # C.g:548:4: IDENTIFIER ':' statement
-                    self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_labeled_statement2188)
+                    self.match(self.input, IDENTIFIER,
+                               self.FOLLOW_IDENTIFIER_in_labeled_statement2188)
                     if self.failed:
                         return
-                    self.match(self.input, 47, self.FOLLOW_47_in_labeled_statement2190)
+                    self.match(self.input, 47,
+                               self.FOLLOW_47_in_labeled_statement2190)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_statement_in_labeled_statement2192)
+                    self.following.append(
+                        self.FOLLOW_statement_in_labeled_statement2192)
                     self.statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt92 == 2:
                     # C.g:549:4: 'case' constant_expression ':' statement
-                    self.match(self.input, 106, self.FOLLOW_106_in_labeled_statement2197)
+                    self.match(self.input, 106,
+                               self.FOLLOW_106_in_labeled_statement2197)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_constant_expression_in_labeled_statement2199)
+                    self.following.append(
+                        self.FOLLOW_constant_expression_in_labeled_statement2199)
                     self.constant_expression()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 47, self.FOLLOW_47_in_labeled_statement2201)
+                    self.match(self.input, 47,
+                               self.FOLLOW_47_in_labeled_statement2201)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_statement_in_labeled_statement2203)
+                    self.following.append(
+                        self.FOLLOW_statement_in_labeled_statement2203)
                     self.statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
                 elif alt92 == 3:
                     # C.g:550:4: 'default' ':' statement
-                    self.match(self.input, 107, self.FOLLOW_107_in_labeled_statement2208)
+                    self.match(self.input, 107,
+                               self.FOLLOW_107_in_labeled_statement2208)
                     if self.failed:
                         return
-                    self.match(self.input, 47, self.FOLLOW_47_in_labeled_statement2210)
+                    self.match(self.input, 47,
+                               self.FOLLOW_47_in_labeled_statement2210)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_statement_in_labeled_statement2212)
+                    self.following.append(
+                        self.FOLLOW_statement_in_labeled_statement2212)
                     self.statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -12574,10 +12142,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start compound_statement
     # C.g:553:1: compound_statement : '{' ( declaration )* ( statement_list )? '}' ;
+
     def compound_statement(self, ):
 
         retval = self.compound_statement_return()
@@ -12590,11 +12157,12 @@ class CParser(Parser):
 
                 # C.g:554:2: ( '{' ( declaration )* ( statement_list )? '}' )
                 # C.g:554:4: '{' ( declaration )* ( statement_list )? '}'
-                self.match(self.input, 43, self.FOLLOW_43_in_compound_statement2223)
+                self.match(self.input, 43,
+                           self.FOLLOW_43_in_compound_statement2223)
                 if self.failed:
                     return retval
                 # C.g:554:8: ( declaration )*
-                while True: #loop93
+                while True:  # loop93
                     alt93 = 2
                     LA93 = self.input.LA(1)
                     if LA93 == IDENTIFIER:
@@ -12602,1930 +12170,1645 @@ class CParser(Parser):
                         if LA93 == 62:
                             LA93_44 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_47 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 66:
                             LA93_48 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_49 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_50 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_51 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_52 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_53 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_54 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_55 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_56 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_57 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_58 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_59 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_60 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_61 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_62 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_63 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_64 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_65 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 26:
                         LA93 = self.input.LA(2)
                         if LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_86 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_87 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_88 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_89 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_90 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_91 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_92 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_93 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_94 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_95 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_96 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_97 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_98 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_99 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 66:
                             LA93_100 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_101 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_102 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_103 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_104 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                         LA93 = self.input.LA(2)
                         if LA93 == 66:
                             LA93_105 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_106 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_107 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_108 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_109 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_110 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_111 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_112 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_113 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_114 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_115 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_116 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_117 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_118 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_119 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_120 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_121 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_122 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_123 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_124 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 34:
                         LA93 = self.input.LA(2)
                         if LA93 == 66:
                             LA93_125 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_126 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_127 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_128 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_129 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_130 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_131 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_132 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_133 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_134 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_135 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_136 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_137 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_138 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_139 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_140 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_141 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_142 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_143 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_144 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 35:
                         LA93 = self.input.LA(2)
                         if LA93 == 66:
                             LA93_145 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_146 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_147 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_148 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_149 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_150 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_151 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_152 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_153 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_154 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_155 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_156 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_157 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_158 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_159 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_160 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_161 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_162 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_163 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_164 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 36:
                         LA93 = self.input.LA(2)
                         if LA93 == 66:
                             LA93_165 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_166 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_167 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_168 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_169 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_170 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_171 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_172 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_173 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_174 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_175 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_176 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_177 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_178 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_179 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_180 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_181 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_182 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_183 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_184 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 37:
                         LA93 = self.input.LA(2)
                         if LA93 == 66:
                             LA93_185 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_186 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_187 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_188 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_189 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_190 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_191 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_192 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_193 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_194 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_195 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_196 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_197 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_198 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_199 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_200 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_201 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_202 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_203 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_204 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 38:
                         LA93 = self.input.LA(2)
                         if LA93 == 66:
                             LA93_205 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_206 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_207 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_208 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_209 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_210 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_211 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_212 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_213 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_214 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_215 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_216 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_217 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_218 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_219 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_220 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_221 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_222 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_223 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_224 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 39:
                         LA93 = self.input.LA(2)
                         if LA93 == 66:
                             LA93_225 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_226 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_227 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_228 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_229 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_230 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_231 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_232 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_233 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_234 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_235 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_236 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_237 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_238 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_239 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_240 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_241 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_242 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_243 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_244 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 40:
                         LA93 = self.input.LA(2)
                         if LA93 == 66:
                             LA93_245 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_246 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_247 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_248 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_249 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_250 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_251 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_252 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_253 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_254 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_255 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_256 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_257 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_258 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_259 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_260 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_261 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_262 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_263 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_264 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 41:
                         LA93 = self.input.LA(2)
                         if LA93 == 66:
                             LA93_265 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_266 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_267 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_268 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_269 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_270 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_271 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_272 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_273 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_274 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_275 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_276 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_277 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_278 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_279 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_280 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_281 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_282 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_283 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_284 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 42:
                         LA93 = self.input.LA(2)
                         if LA93 == 66:
                             LA93_285 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_286 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_287 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_288 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_289 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_290 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_291 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_292 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_293 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_294 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_295 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_296 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_297 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_298 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_299 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_300 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_301 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_302 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_303 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_304 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
                     elif LA93 == 45 or LA93 == 46:
                         LA93_40 = self.input.LA(2)
 
-                        if (LA93_40 == IDENTIFIER) :
+                        if (LA93_40 == IDENTIFIER):
                             LA93_305 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-                        elif (LA93_40 == 43) :
+                        elif (LA93_40 == 43):
                             LA93_306 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
-
                     elif LA93 == 48:
                         LA93_41 = self.input.LA(2)
 
-                        if (LA93_41 == 43) :
+                        if (LA93_41 == 43):
                             LA93_307 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-                        elif (LA93_41 == IDENTIFIER) :
+                        elif (LA93_41 == IDENTIFIER):
                             LA93_308 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
-
                     elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 58 or LA93 == 59 or LA93 == 60 or LA93 == 61:
                         LA93 = self.input.LA(2)
                         if LA93 == 66:
                             LA93_309 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 58:
                             LA93_310 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 59:
                             LA93_311 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 60:
                             LA93_312 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == IDENTIFIER:
                             LA93_313 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 62:
                             LA93_314 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 25:
                             LA93_315 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 29 or LA93 == 30 or LA93 == 31 or LA93 == 32 or LA93 == 33:
                             LA93_316 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 34:
                             LA93_317 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 35:
                             LA93_318 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 36:
                             LA93_319 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 37:
                             LA93_320 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 38:
                             LA93_321 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 39:
                             LA93_322 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 40:
                             LA93_323 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 41:
                             LA93_324 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 42:
                             LA93_325 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 45 or LA93 == 46:
                             LA93_326 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 48:
                             LA93_327 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
                         elif LA93 == 49 or LA93 == 50 or LA93 == 51 or LA93 == 52 or LA93 == 53 or LA93 == 54 or LA93 == 55 or LA93 == 56 or LA93 == 57 or LA93 == 61:
                             LA93_328 = self.input.LA(3)
 
-                            if (self.synpred186()) :
+                            if (self.synpred186()):
                                 alt93 = 1
 
-
-
-
                     if alt93 == 1:
                         # C.g:0:0: declaration
-                        self.following.append(self.FOLLOW_declaration_in_compound_statement2225)
+                        self.following.append(
+                            self.FOLLOW_declaration_in_compound_statement2225)
                         self.declaration()
                         self.following.pop()
                         if self.failed:
                             return retval
 
-
                     else:
-                        break #loop93
-
+                        break  # loop93
 
                 # C.g:554:21: ( statement_list )?
                 alt94 = 2
                 LA94_0 = self.input.LA(1)
 
-                if ((IDENTIFIER <= LA94_0 <= FLOATING_POINT_LITERAL) or (25 <= LA94_0 <= 26) or (29 <= LA94_0 <= 43) or (45 <= LA94_0 <= 46) or (48 <= LA94_0 <= 62) or LA94_0 == 66 or (68 <= LA94_0 <= 69) or (72 <= LA94_0 <= 74) or (77 <= LA94_0 <= 79) or (103 <= LA94_0 <= 108) or (110 <= LA94_0 <= 117)) :
+                if ((IDENTIFIER <= LA94_0 <= FLOATING_POINT_LITERAL) or (25 <= LA94_0 <= 26) or (29 <= LA94_0 <= 43) or (45 <= LA94_0 <= 46) or (48 <= LA94_0 <= 62) or LA94_0 == 66 or (68 <= LA94_0 <= 69) or (72 <= LA94_0 <= 74) or (77 <= LA94_0 <= 79) or (103 <= LA94_0 <= 108) or (110 <= LA94_0 <= 117)):
                     alt94 = 1
                 if alt94 == 1:
                     # C.g:0:0: statement_list
-                    self.following.append(self.FOLLOW_statement_list_in_compound_statement2228)
+                    self.following.append(
+                        self.FOLLOW_statement_list_in_compound_statement2228)
                     self.statement_list()
                     self.following.pop()
                     if self.failed:
                         return retval
 
-
-
-                self.match(self.input, 44, self.FOLLOW_44_in_compound_statement2231)
+                self.match(self.input, 44,
+                           self.FOLLOW_44_in_compound_statement2231)
                 if self.failed:
                     return retval
 
-
-
                 retval.stop = self.input.LT(-1)
 
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -14539,9 +13822,9 @@ class CParser(Parser):
 
     # $ANTLR end compound_statement
 
-
     # $ANTLR start statement_list
     # C.g:557:1: statement_list : ( statement )+ ;
+
     def statement_list(self, ):
 
         statement_list_StartIndex = self.input.index()
@@ -14554,7 +13837,7 @@ class CParser(Parser):
                 # C.g:558:4: ( statement )+
                 # C.g:558:4: ( statement )+
                 cnt95 = 0
-                while True: #loop95
+                while True:  # loop95
                     alt95 = 2
                     LA95 = self.input.LA(1)
                     if LA95 == IDENTIFIER:
@@ -14562,330 +13845,283 @@ class CParser(Parser):
                         if LA95 == 62:
                             LA95_46 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 25 or LA95 == 29 or LA95 == 30 or LA95 == 31 or LA95 == 32 or LA95 == 33 or LA95 == 34 or LA95 == 35 or LA95 == 36 or LA95 == 37 or LA95 == 38 or LA95 == 39 or LA95 == 40 or LA95 == 41 or LA95 == 42 or LA95 == 45 or LA95 == 46 or LA95 == 47 or LA95 == 48 or LA95 == 49 or LA95 == 50 or LA95 == 51 or LA95 == 52 or LA95 == 53 or LA95 == 54 or LA95 == 55 or LA95 == 56 or LA95 == 57 or LA95 == 58 or LA95 == 59 or LA95 == 60 or LA95 == 61:
                             alt95 = 1
                         elif LA95 == STRING_LITERAL:
                             LA95_48 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == IDENTIFIER:
                             LA95_49 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 64:
                             LA95_50 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 75:
                             LA95_51 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66:
                             LA95_52 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 76:
                             LA95_53 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_54 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_55 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 70:
                             LA95_56 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 71:
                             LA95_57 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 68:
                             LA95_58 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 69:
                             LA95_59 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 101 or LA95 == 102:
                             LA95_60 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 97 or LA95 == 98 or LA95 == 99 or LA95 == 100:
                             LA95_61 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 95 or LA95 == 96:
                             LA95_62 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 77:
                             LA95_63 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 94:
                             LA95_64 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 93:
                             LA95_65 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 92:
                             LA95_66 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 91:
                             LA95_67 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 90:
                             LA95_68 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 27:
                             LA95_69 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 28 or LA95 == 80 or LA95 == 81 or LA95 == 82 or LA95 == 83 or LA95 == 84 or LA95 == 85 or LA95 == 86 or LA95 == 87 or LA95 == 88 or LA95 == 89:
                             LA95_88 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
-
                     elif LA95 == HEX_LITERAL:
                         LA95 = self.input.LA(2)
                         if LA95 == 64:
                             LA95_89 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 62:
                             LA95_90 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 75:
                             LA95_91 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66:
                             LA95_92 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 76:
                             LA95_93 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_94 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_95 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 28 or LA95 == 80 or LA95 == 81 or LA95 == 82 or LA95 == 83 or LA95 == 84 or LA95 == 85 or LA95 == 86 or LA95 == 87 or LA95 == 88 or LA95 == 89:
                             LA95_96 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 70:
                             LA95_97 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 71:
                             LA95_98 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 68:
                             LA95_99 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 69:
                             LA95_100 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 101 or LA95 == 102:
                             LA95_101 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 97 or LA95 == 98 or LA95 == 99 or LA95 == 100:
                             LA95_102 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 95 or LA95 == 96:
                             LA95_103 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 77:
                             LA95_104 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 94:
                             LA95_105 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 93:
                             LA95_106 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 92:
                             LA95_107 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 91:
                             LA95_108 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 90:
                             LA95_109 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 27:
                             LA95_110 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 25:
                             alt95 = 1
 
@@ -14894,157 +14130,135 @@ class CParser(Parser):
                         if LA95 == 64:
                             LA95_113 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 62:
                             LA95_114 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 75:
                             LA95_115 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66:
                             LA95_116 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 76:
                             LA95_117 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_118 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_119 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 70:
                             LA95_120 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 71:
                             LA95_121 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 68:
                             LA95_122 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 69:
                             LA95_123 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 101 or LA95 == 102:
                             LA95_124 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 97 or LA95 == 98 or LA95 == 99 or LA95 == 100:
                             LA95_125 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 95 or LA95 == 96:
                             LA95_126 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 77:
                             LA95_127 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 94:
                             LA95_128 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 93:
                             LA95_129 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 92:
                             LA95_130 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 91:
                             LA95_131 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 90:
                             LA95_132 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 27:
                             LA95_133 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 28 or LA95 == 80 or LA95 == 81 or LA95 == 82 or LA95 == 83 or LA95 == 84 or LA95 == 85 or LA95 == 86 or LA95 == 87 or LA95 == 88 or LA95 == 89:
                             LA95_135 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 25:
                             alt95 = 1
 
@@ -15053,157 +14267,135 @@ class CParser(Parser):
                         if LA95 == 64:
                             LA95_137 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 62:
                             LA95_138 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 75:
                             LA95_139 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66:
                             LA95_140 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 76:
                             LA95_141 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_142 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_143 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 28 or LA95 == 80 or LA95 == 81 or LA95 == 82 or LA95 == 83 or LA95 == 84 or LA95 == 85 or LA95 == 86 or LA95 == 87 or LA95 == 88 or LA95 == 89:
                             LA95_144 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 70:
                             LA95_145 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 71:
                             LA95_146 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 68:
                             LA95_147 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 69:
                             LA95_148 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 101 or LA95 == 102:
                             LA95_149 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 97 or LA95 == 98 or LA95 == 99 or LA95 == 100:
                             LA95_150 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 95 or LA95 == 96:
                             LA95_151 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 77:
                             LA95_152 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 94:
                             LA95_153 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 93:
                             LA95_154 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 92:
                             LA95_155 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 91:
                             LA95_156 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 90:
                             LA95_157 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 27:
                             LA95_158 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 25:
                             alt95 = 1
 
@@ -15212,157 +14404,135 @@ class CParser(Parser):
                         if LA95 == 64:
                             LA95_161 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 62:
                             LA95_162 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 75:
                             LA95_163 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66:
                             LA95_164 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 76:
                             LA95_165 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_166 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_167 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 28 or LA95 == 80 or LA95 == 81 or LA95 == 82 or LA95 == 83 or LA95 == 84 or LA95 == 85 or LA95 == 86 or LA95 == 87 or LA95 == 88 or LA95 == 89:
                             LA95_168 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 70:
                             LA95_169 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 71:
                             LA95_170 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 68:
                             LA95_171 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 69:
                             LA95_172 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 101 or LA95 == 102:
                             LA95_173 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 97 or LA95 == 98 or LA95 == 99 or LA95 == 100:
                             LA95_174 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 95 or LA95 == 96:
                             LA95_175 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 77:
                             LA95_176 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 94:
                             LA95_177 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 93:
                             LA95_178 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 92:
                             LA95_179 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 91:
                             LA95_180 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 90:
                             LA95_181 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 27:
                             LA95_182 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 25:
                             alt95 = 1
 
@@ -15371,867 +14541,742 @@ class CParser(Parser):
                         if LA95 == IDENTIFIER:
                             LA95_185 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 64:
                             LA95_186 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 62:
                             LA95_187 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 75:
                             LA95_188 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66:
                             LA95_189 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 76:
                             LA95_190 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_191 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_192 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 70:
                             LA95_193 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 71:
                             LA95_194 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 68:
                             LA95_195 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 69:
                             LA95_196 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 101 or LA95 == 102:
                             LA95_197 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 97 or LA95 == 98 or LA95 == 99 or LA95 == 100:
                             LA95_198 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 95 or LA95 == 96:
                             LA95_199 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 77:
                             LA95_200 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 94:
                             LA95_201 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 93:
                             LA95_202 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 92:
                             LA95_203 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 91:
                             LA95_204 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 90:
                             LA95_205 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 27:
                             LA95_206 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 25:
                             alt95 = 1
                         elif LA95 == STRING_LITERAL:
                             LA95_208 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 28 or LA95 == 80 or LA95 == 81 or LA95 == 82 or LA95 == 83 or LA95 == 84 or LA95 == 85 or LA95 == 86 or LA95 == 87 or LA95 == 88 or LA95 == 89:
                             LA95_209 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
-
                     elif LA95 == FLOATING_POINT_LITERAL:
                         LA95 = self.input.LA(2)
                         if LA95 == 64:
                             LA95_211 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 62:
                             LA95_212 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 75:
                             LA95_213 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66:
                             LA95_214 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 76:
                             LA95_215 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_216 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_217 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 70:
                             LA95_218 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 71:
                             LA95_219 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 68:
                             LA95_220 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 69:
                             LA95_221 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 101 or LA95 == 102:
                             LA95_222 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 97 or LA95 == 98 or LA95 == 99 or LA95 == 100:
                             LA95_223 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 95 or LA95 == 96:
                             LA95_224 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 77:
                             LA95_225 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 94:
                             LA95_226 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 93:
                             LA95_227 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 92:
                             LA95_228 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 91:
                             LA95_229 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 90:
                             LA95_230 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 27:
                             LA95_231 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 25:
                             alt95 = 1
                         elif LA95 == 28 or LA95 == 80 or LA95 == 81 or LA95 == 82 or LA95 == 83 or LA95 == 84 or LA95 == 85 or LA95 == 86 or LA95 == 87 or LA95 == 88 or LA95 == 89:
                             LA95_234 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
-
                     elif LA95 == 62:
                         LA95 = self.input.LA(2)
                         if LA95 == IDENTIFIER:
                             LA95_235 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == HEX_LITERAL:
                             LA95_236 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == OCTAL_LITERAL:
                             LA95_237 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == DECIMAL_LITERAL:
                             LA95_238 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == CHARACTER_LITERAL:
                             LA95_239 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == STRING_LITERAL:
                             LA95_240 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == FLOATING_POINT_LITERAL:
                             LA95_241 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 62:
                             LA95_242 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_243 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_244 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66 or LA95 == 68 or LA95 == 69 or LA95 == 77 or LA95 == 78 or LA95 == 79:
                             LA95_245 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 74:
                             LA95_246 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 49 or LA95 == 50 or LA95 == 51 or LA95 == 52 or LA95 == 53 or LA95 == 54 or LA95 == 55 or LA95 == 56 or LA95 == 57 or LA95 == 58 or LA95 == 59 or LA95 == 60 or LA95 == 61:
                             LA95_247 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 34:
                             LA95_248 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 35:
                             LA95_249 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 36:
                             LA95_250 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 37:
                             LA95_251 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 38:
                             LA95_252 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 39:
                             LA95_253 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 40:
                             LA95_254 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 41:
                             LA95_255 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 42:
                             LA95_256 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 45 or LA95 == 46:
                             LA95_257 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 48:
                             LA95_258 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
-
                     elif LA95 == 72:
                         LA95 = self.input.LA(2)
                         if LA95 == IDENTIFIER:
                             LA95_259 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == HEX_LITERAL:
                             LA95_260 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == OCTAL_LITERAL:
                             LA95_261 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == DECIMAL_LITERAL:
                             LA95_262 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == CHARACTER_LITERAL:
                             LA95_263 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == STRING_LITERAL:
                             LA95_264 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == FLOATING_POINT_LITERAL:
                             LA95_265 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 62:
                             LA95_266 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_267 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_268 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66 or LA95 == 68 or LA95 == 69 or LA95 == 77 or LA95 == 78 or LA95 == 79:
                             LA95_269 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 74:
                             LA95_270 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
-
                     elif LA95 == 73:
                         LA95 = self.input.LA(2)
                         if LA95 == IDENTIFIER:
                             LA95_271 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == HEX_LITERAL:
                             LA95_272 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == OCTAL_LITERAL:
                             LA95_273 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == DECIMAL_LITERAL:
                             LA95_274 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == CHARACTER_LITERAL:
                             LA95_275 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == STRING_LITERAL:
                             LA95_276 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == FLOATING_POINT_LITERAL:
                             LA95_277 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 62:
                             LA95_278 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_279 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_280 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66 or LA95 == 68 or LA95 == 69 or LA95 == 77 or LA95 == 78 or LA95 == 79:
                             LA95_281 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 74:
                             LA95_282 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
-
                     elif LA95 == 66 or LA95 == 68 or LA95 == 69 or LA95 == 77 or LA95 == 78 or LA95 == 79:
                         LA95 = self.input.LA(2)
                         if LA95 == 62:
                             LA95_283 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == IDENTIFIER:
                             LA95_284 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == HEX_LITERAL:
                             LA95_285 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == OCTAL_LITERAL:
                             LA95_286 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == DECIMAL_LITERAL:
                             LA95_287 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == CHARACTER_LITERAL:
                             LA95_288 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == STRING_LITERAL:
                             LA95_289 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == FLOATING_POINT_LITERAL:
                             LA95_290 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_291 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_292 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66 or LA95 == 68 or LA95 == 69 or LA95 == 77 or LA95 == 78 or LA95 == 79:
                             LA95_293 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 74:
                             LA95_294 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
-
                     elif LA95 == 74:
                         LA95 = self.input.LA(2)
                         if LA95 == 62:
                             LA95_295 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == IDENTIFIER:
                             LA95_296 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == HEX_LITERAL:
                             LA95_297 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == OCTAL_LITERAL:
                             LA95_298 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == DECIMAL_LITERAL:
                             LA95_299 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == CHARACTER_LITERAL:
                             LA95_300 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == STRING_LITERAL:
                             LA95_301 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == FLOATING_POINT_LITERAL:
                             LA95_302 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 72:
                             LA95_303 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 73:
                             LA95_304 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 66 or LA95 == 68 or LA95 == 69 or LA95 == 77 or LA95 == 78 or LA95 == 79:
                             LA95_305 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
                         elif LA95 == 74:
                             LA95_306 = self.input.LA(3)
 
-                            if (self.synpred188()) :
+                            if (self.synpred188()):
                                 alt95 = 1
 
-
-
                     elif LA95 == 25 or LA95 == 26 or LA95 == 29 or LA95 == 30 or LA95 == 31 or LA95 == 32 or LA95 == 33 or LA95 == 34 or LA95 == 35 or LA95 == 36 or LA95 == 37 or LA95 == 38 or LA95 == 39 or LA95 == 40 or LA95 == 41 or LA95 == 42 or LA95 == 43 or LA95 == 45 or LA95 == 46 or LA95 == 48 or LA95 == 49 or LA95 == 50 or LA95 == 51 or LA95 == 52 or LA95 == 53 or LA95 == 54 or LA95 == 55 or LA95 == 56 or LA95 == 57 or LA95 == 58 or LA95 == 59 or LA95 == 60 or LA95 == 61 or LA95 == 103 or LA95 == 104 or LA95 == 105 or LA95 == 106 or LA95 == 107 or LA95 == 108 or LA95 == 110 or LA95 == 111 or LA95 == 112 or LA95 == 113 or LA95 == 114 or LA95 == 115 or LA95 == 116 or LA95 == 117:
                         alt95 = 1
 
                     if alt95 == 1:
                         # C.g:0:0: statement
-                        self.following.append(self.FOLLOW_statement_in_statement_list2242)
+                        self.following.append(
+                            self.FOLLOW_statement_in_statement_list2242)
                         self.statement()
                         self.following.pop()
                         if self.failed:
                             return
 
-
                     else:
                         if cnt95 >= 1:
-                            break #loop95
+                            break  # loop95
 
                         if self.backtracking > 0:
                             self.failed = True
@@ -16242,11 +15287,6 @@ class CParser(Parser):
 
                     cnt95 += 1
 
-
-
-
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -16265,10 +15305,9 @@ class CParser(Parser):
             self.start = None
             self.stop = None
 
-
-
     # $ANTLR start expression_statement
     # C.g:561:1: expression_statement : ( ';' | expression ';' );
+
     def expression_statement(self, ):
 
         retval = self.expression_statement_return()
@@ -16283,41 +15322,42 @@ class CParser(Parser):
                 alt96 = 2
                 LA96_0 = self.input.LA(1)
 
-                if (LA96_0 == 25) :
+                if (LA96_0 == 25):
                     alt96 = 1
-                elif ((IDENTIFIER <= LA96_0 <= FLOATING_POINT_LITERAL) or LA96_0 == 62 or LA96_0 == 66 or (68 <= LA96_0 <= 69) or (72 <= LA96_0 <= 74) or (77 <= LA96_0 <= 79)) :
+                elif ((IDENTIFIER <= LA96_0 <= FLOATING_POINT_LITERAL) or LA96_0 == 62 or LA96_0 == 66 or (68 <= LA96_0 <= 69) or (72 <= LA96_0 <= 74) or (77 <= LA96_0 <= 79)):
                     alt96 = 2
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return retval
 
-                    nvae = NoViableAltException("561:1: expression_statement : ( ';' | expression ';' );", 96, 0, self.input)
+                    nvae = NoViableAltException(
+                        "561:1: expression_statement : ( ';' | expression ';' );", 96, 0, self.input)
 
                     raise nvae
 
                 if alt96 == 1:
                     # C.g:562:4: ';'
-                    self.match(self.input, 25, self.FOLLOW_25_in_expression_statement2254)
+                    self.match(self.input, 25,
+                               self.FOLLOW_25_in_expression_statement2254)
                     if self.failed:
                         return retval
 
-
                 elif alt96 == 2:
                     # C.g:563:4: expression ';'
-                    self.following.append(self.FOLLOW_expression_in_expression_statement2259)
+                    self.following.append(
+                        self.FOLLOW_expression_in_expression_statement2259)
                     self.expression()
                     self.following.pop()
                     if self.failed:
                         return retval
-                    self.match(self.input, 25, self.FOLLOW_25_in_expression_statement2261)
+                    self.match(self.input, 25,
+                               self.FOLLOW_25_in_expression_statement2261)
                     if self.failed:
                         return retval
 
-
                 retval.stop = self.input.LT(-1)
 
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -16331,15 +15371,14 @@ class CParser(Parser):
 
     # $ANTLR end expression_statement
 
-
     # $ANTLR start selection_statement
     # C.g:566:1: selection_statement : ( 'if' '(' e= expression ')' statement ( options {k=1; backtrack=false; } : 'else' statement )? | 'switch' '(' expression ')' statement );
+
     def selection_statement(self, ):
 
         selection_statement_StartIndex = self.input.index()
         e = None
 
-
         try:
             try:
                 if self.backtracking > 0 and self.alreadyParsedRule(self.input, 69):
@@ -16349,39 +15388,46 @@ class CParser(Parser):
                 alt98 = 2
                 LA98_0 = self.input.LA(1)
 
-                if (LA98_0 == 108) :
+                if (LA98_0 == 108):
                     alt98 = 1
-                elif (LA98_0 == 110) :
+                elif (LA98_0 == 110):
                     alt98 = 2
                 else:
                     if self.backtracking > 0:
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("566:1: selection_statement : ( 'if' '(' e= expression ')' statement ( options {k=1; backtrack=false; } : 'else' statement )? | 'switch' '(' expression ')' statement );", 98, 0, self.input)
+                    nvae = NoViableAltException(
+                        "566:1: selection_statement : ( 'if' '(' e= expression ')' statement ( options {k=1; backtrack=false; } : 'else' statement )? | 'switch' '(' expression ')' statement );", 98, 0, self.input)
 
                     raise nvae
 
                 if alt98 == 1:
                     # C.g:567:4: 'if' '(' e= expression ')' statement ( options {k=1; backtrack=false; } : 'else' statement )?
-                    self.match(self.input, 108, self.FOLLOW_108_in_selection_statement2272)
+                    self.match(self.input, 108,
+                               self.FOLLOW_108_in_selection_statement2272)
                     if self.failed:
                         return
-                    self.match(self.input, 62, self.FOLLOW_62_in_selection_statement2274)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_selection_statement2274)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_expression_in_selection_statement2278)
+                    self.following.append(
+                        self.FOLLOW_expression_in_selection_statement2278)
                     e = self.expression()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_selection_statement2280)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_selection_statement2280)
                     if self.failed:
                         return
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
+                        self.StorePredicateExpression(
+                            e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
-                    self.following.append(self.FOLLOW_statement_in_selection_statement2284)
+                    self.following.append(
+                        self.FOLLOW_statement_in_selection_statement2284)
                     self.statement()
                     self.following.pop()
                     if self.failed:
@@ -16390,47 +15436,48 @@ class CParser(Parser):
                     alt97 = 2
                     LA97_0 = self.input.LA(1)
 
-                    if (LA97_0 == 109) :
+                    if (LA97_0 == 109):
                         alt97 = 1
                     if alt97 == 1:
                         # C.g:567:200: 'else' statement
-                        self.match(self.input, 109, self.FOLLOW_109_in_selection_statement2299)
+                        self.match(self.input, 109,
+                                   self.FOLLOW_109_in_selection_statement2299)
                         if self.failed:
                             return
-                        self.following.append(self.FOLLOW_statement_in_selection_statement2301)
+                        self.following.append(
+                            self.FOLLOW_statement_in_selection_statement2301)
                         self.statement()
                         self.following.pop()
                         if self.failed:
                             return
 
-
-
-
-
                 elif alt98 == 2:
                     # C.g:568:4: 'switch' '(' expression ')' statement
-                    self.match(self.input, 110, self.FOLLOW_110_in_selection_statement2308)
+                    self.match(self.input, 110,
+                               self.FOLLOW_110_in_selection_statement2308)
                     if self.failed:
                         return
-                    self.match(self.input, 62, self.FOLLOW_62_in_selection_statement2310)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_selection_statement2310)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_expression_in_selection_statement2312)
+                    self.following.append(
+                        self.FOLLOW_expression_in_selection_statement2312)
                     self.expression()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_selection_statement2314)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_selection_statement2314)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_statement_in_selection_statement2316)
+                    self.following.append(
+                        self.FOLLOW_statement_in_selection_statement2316)
                     self.statement()
                     self.following.pop()
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -16444,15 +15491,14 @@ class CParser(Parser):
 
     # $ANTLR end selection_statement
 
-
     # $ANTLR start iteration_statement
     # C.g:571:1: iteration_statement : ( 'while' '(' e= expression ')' statement | 'do' statement 'while' '(' e= expression ')' ';' | 'for' '(' expression_statement e= expression_statement ( expression )? ')' statement );
+
     def iteration_statement(self, ):
 
         iteration_statement_StartIndex = self.input.index()
         e = None
 
-
         try:
             try:
                 if self.backtracking > 0 and self.alreadyParsedRule(self.input, 70):
@@ -16472,82 +15518,97 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("571:1: iteration_statement : ( 'while' '(' e= expression ')' statement | 'do' statement 'while' '(' e= expression ')' ';' | 'for' '(' expression_statement e= expression_statement ( expression )? ')' statement );", 100, 0, self.input)
+                    nvae = NoViableAltException(
+                        "571:1: iteration_statement : ( 'while' '(' e= expression ')' statement | 'do' statement 'while' '(' e= expression ')' ';' | 'for' '(' expression_statement e= expression_statement ( expression )? ')' statement );", 100, 0, self.input)
 
                     raise nvae
 
                 if alt100 == 1:
                     # C.g:572:4: 'while' '(' e= expression ')' statement
-                    self.match(self.input, 111, self.FOLLOW_111_in_iteration_statement2327)
+                    self.match(self.input, 111,
+                               self.FOLLOW_111_in_iteration_statement2327)
                     if self.failed:
                         return
-                    self.match(self.input, 62, self.FOLLOW_62_in_iteration_statement2329)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_iteration_statement2329)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_expression_in_iteration_statement2333)
+                    self.following.append(
+                        self.FOLLOW_expression_in_iteration_statement2333)
                     e = self.expression()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_iteration_statement2335)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_iteration_statement2335)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_statement_in_iteration_statement2337)
+                    self.following.append(
+                        self.FOLLOW_statement_in_iteration_statement2337)
                     self.statement()
                     self.following.pop()
                     if self.failed:
                         return
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
-
-
+                        self.StorePredicateExpression(
+                            e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
                 elif alt100 == 2:
                     # C.g:573:4: 'do' statement 'while' '(' e= expression ')' ';'
-                    self.match(self.input, 112, self.FOLLOW_112_in_iteration_statement2344)
+                    self.match(self.input, 112,
+                               self.FOLLOW_112_in_iteration_statement2344)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_statement_in_iteration_statement2346)
+                    self.following.append(
+                        self.FOLLOW_statement_in_iteration_statement2346)
                     self.statement()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 111, self.FOLLOW_111_in_iteration_statement2348)
+                    self.match(self.input, 111,
+                               self.FOLLOW_111_in_iteration_statement2348)
                     if self.failed:
                         return
-                    self.match(self.input, 62, self.FOLLOW_62_in_iteration_statement2350)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_iteration_statement2350)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_expression_in_iteration_statement2354)
+                    self.following.append(
+                        self.FOLLOW_expression_in_iteration_statement2354)
                     e = self.expression()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 63, self.FOLLOW_63_in_iteration_statement2356)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_iteration_statement2356)
                     if self.failed:
                         return
-                    self.match(self.input, 25, self.FOLLOW_25_in_iteration_statement2358)
+                    self.match(self.input, 25,
+                               self.FOLLOW_25_in_iteration_statement2358)
                     if self.failed:
                         return
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
-
-
+                        self.StorePredicateExpression(
+                            e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
                 elif alt100 == 3:
                     # C.g:574:4: 'for' '(' expression_statement e= expression_statement ( expression )? ')' statement
-                    self.match(self.input, 113, self.FOLLOW_113_in_iteration_statement2365)
+                    self.match(self.input, 113,
+                               self.FOLLOW_113_in_iteration_statement2365)
                     if self.failed:
                         return
-                    self.match(self.input, 62, self.FOLLOW_62_in_iteration_statement2367)
+                    self.match(self.input, 62,
+                               self.FOLLOW_62_in_iteration_statement2367)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_expression_statement_in_iteration_statement2369)
+                    self.following.append(
+                        self.FOLLOW_expression_statement_in_iteration_statement2369)
                     self.expression_statement()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_expression_statement_in_iteration_statement2373)
+                    self.following.append(
+                        self.FOLLOW_expression_statement_in_iteration_statement2373)
                     e = self.expression_statement()
                     self.following.pop()
                     if self.failed:
@@ -16556,31 +15617,30 @@ class CParser(Parser):
                     alt99 = 2
                     LA99_0 = self.input.LA(1)
 
-                    if ((IDENTIFIER <= LA99_0 <= FLOATING_POINT_LITERAL) or LA99_0 == 62 or LA99_0 == 66 or (68 <= LA99_0 <= 69) or (72 <= LA99_0 <= 74) or (77 <= LA99_0 <= 79)) :
+                    if ((IDENTIFIER <= LA99_0 <= FLOATING_POINT_LITERAL) or LA99_0 == 62 or LA99_0 == 66 or (68 <= LA99_0 <= 69) or (72 <= LA99_0 <= 74) or (77 <= LA99_0 <= 79)):
                         alt99 = 1
                     if alt99 == 1:
                         # C.g:0:0: expression
-                        self.following.append(self.FOLLOW_expression_in_iteration_statement2375)
+                        self.following.append(
+                            self.FOLLOW_expression_in_iteration_statement2375)
                         self.expression()
                         self.following.pop()
                         if self.failed:
                             return
 
-
-
-                    self.match(self.input, 63, self.FOLLOW_63_in_iteration_statement2378)
+                    self.match(self.input, 63,
+                               self.FOLLOW_63_in_iteration_statement2378)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_statement_in_iteration_statement2380)
+                    self.following.append(
+                        self.FOLLOW_statement_in_iteration_statement2380)
                     self.statement()
                     self.following.pop()
                     if self.failed:
                         return
                     if self.backtracking == 0:
-                        self.StorePredicateExpression(e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
-
-
-
+                        self.StorePredicateExpression(
+                            e.start.line, e.start.charPositionInLine, e.stop.line, e.stop.charPositionInLine, self.input.toString(e.start, e.stop))
 
             except RecognitionException as re:
                 self.reportError(re)
@@ -16595,9 +15655,9 @@ class CParser(Parser):
 
     # $ANTLR end iteration_statement
 
-
     # $ANTLR start jump_statement
     # C.g:577:1: jump_statement : ( 'goto' IDENTIFIER ';' | 'continue' ';' | 'break' ';' | 'return' ';' | 'return' expression ';' );
+
     def jump_statement(self, ):
 
         jump_statement_StartIndex = self.input.index()
@@ -16618,16 +15678,17 @@ class CParser(Parser):
                 elif LA101 == 117:
                     LA101_4 = self.input.LA(2)
 
-                    if (LA101_4 == 25) :
+                    if (LA101_4 == 25):
                         alt101 = 4
-                    elif ((IDENTIFIER <= LA101_4 <= FLOATING_POINT_LITERAL) or LA101_4 == 62 or LA101_4 == 66 or (68 <= LA101_4 <= 69) or (72 <= LA101_4 <= 74) or (77 <= LA101_4 <= 79)) :
+                    elif ((IDENTIFIER <= LA101_4 <= FLOATING_POINT_LITERAL) or LA101_4 == 62 or LA101_4 == 66 or (68 <= LA101_4 <= 69) or (72 <= LA101_4 <= 74) or (77 <= LA101_4 <= 79)):
                         alt101 = 5
                     else:
                         if self.backtracking > 0:
                             self.failed = True
                             return
 
-                        nvae = NoViableAltException("577:1: jump_statement : ( 'goto' IDENTIFIER ';' | 'continue' ';' | 'break' ';' | 'return' ';' | 'return' expression ';' );", 101, 4, self.input)
+                        nvae = NoViableAltException(
+                            "577:1: jump_statement : ( 'goto' IDENTIFIER ';' | 'continue' ';' | 'break' ';' | 'return' ';' | 'return' expression ';' );", 101, 4, self.input)
 
                         raise nvae
 
@@ -16636,69 +15697,76 @@ class CParser(Parser):
                         self.failed = True
                         return
 
-                    nvae = NoViableAltException("577:1: jump_statement : ( 'goto' IDENTIFIER ';' | 'continue' ';' | 'break' ';' | 'return' ';' | 'return' expression ';' );", 101, 0, self.input)
+                    nvae = NoViableAltException(
+                        "577:1: jump_statement : ( 'goto' IDENTIFIER ';' | 'continue' ';' | 'break' ';' | 'return' ';' | 'return' expression ';' );", 101, 0, self.input)
 
                     raise nvae
 
                 if alt101 == 1:
                     # C.g:578:4: 'goto' IDENTIFIER ';'
-                    self.match(self.input, 114, self.FOLLOW_114_in_jump_statement2393)
+                    self.match(self.input, 114,
+                               self.FOLLOW_114_in_jump_statement2393)
                     if self.failed:
                         return
-                    self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_jump_statement2395)
+                    self.match(self.input, IDENTIFIER,
+                               self.FOLLOW_IDENTIFIER_in_jump_statement2395)
                     if self.failed:
                         return
-                    self.match(self.input, 25, self.FOLLOW_25_in_jump_statement2397)
+                    self.match(self.input, 25,
+                               self.FOLLOW_25_in_jump_statement2397)
                     if self.failed:
                         return
 
-
                 elif alt101 == 2:
                     # C.g:579:4: 'continue' ';'
-                    self.match(self.input, 115, self.FOLLOW_115_in_jump_statement2402)
+                    self.match(self.input, 115,
+                               self.FOLLOW_115_in_jump_statement2402)
                     if self.failed:
                         return
-                    self.match(self.input, 25, self.FOLLOW_25_in_jump_statement2404)
+                    self.match(self.input, 25,
+                               self.FOLLOW_25_in_jump_statement2404)
                     if self.failed:
                         return
 
-
                 elif alt101 == 3:
                     # C.g:580:4: 'break' ';'
-                    self.match(self.input, 116, self.FOLLOW_116_in_jump_statement2409)
+                    self.match(self.input, 116,
+                               self.FOLLOW_116_in_jump_statement2409)
                     if self.failed:
                         return
-                    self.match(self.input, 25, self.FOLLOW_25_in_jump_statement2411)
+                    self.match(self.input, 25,
+                               self.FOLLOW_25_in_jump_statement2411)
                     if self.failed:
                         return
 
-
                 elif alt101 == 4:
                     # C.g:581:4: 'return' ';'
-                    self.match(self.input, 117, self.FOLLOW_117_in_jump_statement2416)
+                    self.match(self.input, 117,
+                               self.FOLLOW_117_in_jump_statement2416)
                     if self.failed:
                         return
-                    self.match(self.input, 25, self.FOLLOW_25_in_jump_statement2418)
+                    self.match(self.input, 25,
+                               self.FOLLOW_25_in_jump_statement2418)
                     if self.failed:
                         return
 
-
                 elif alt101 == 5:
                     # C.g:582:4: 'return' expression ';'
-                    self.match(self.input, 117, self.FOLLOW_117_in_jump_statement2423)
+                    self.match(self.input, 117,
+                               self.FOLLOW_117_in_jump_statement2423)
                     if self.failed:
                         return
-                    self.following.append(self.FOLLOW_expression_in_jump_statement2425)
+                    self.following.append(
+                        self.FOLLOW_expression_in_jump_statement2425)
                     self.expression()
                     self.following.pop()
                     if self.failed:
                         return
-                    self.match(self.input, 25, self.FOLLOW_25_in_jump_statement2427)
+                    self.match(self.input, 25,
+                               self.FOLLOW_25_in_jump_statement2427)
                     if self.failed:
                         return
 
-
-
             except RecognitionException as re:
                 self.reportError(re)
                 self.recover(self.input, re)
@@ -16716,18 +15784,17 @@ class CParser(Parser):
     def synpred2_fragment(self, ):
         # C.g:119:6: ( declaration_specifiers )
         # C.g:119:6: declaration_specifiers
-        self.following.append(self.FOLLOW_declaration_specifiers_in_synpred2100)
+        self.following.append(
+            self.FOLLOW_declaration_specifiers_in_synpred2100)
         self.declaration_specifiers()
         self.following.pop()
         if self.failed:
             return
 
-
     # $ANTLR end synpred2
 
-
-
     # $ANTLR start synpred4
+
     def synpred4_fragment(self, ):
         # C.g:119:4: ( ( declaration_specifiers )? declarator ( declaration )* '{' )
         # C.g:119:6: ( declaration_specifiers )? declarator ( declaration )* '{'
@@ -16741,134 +15808,132 @@ class CParser(Parser):
             if LA102 == 62:
                 LA102_21 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 29 or LA102 == 30 or LA102 == 31 or LA102 == 32 or LA102 == 33:
                 LA102_23 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 34:
                 LA102_24 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 35:
                 LA102_25 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 36:
                 LA102_26 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 37:
                 LA102_27 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 38:
                 LA102_28 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 39:
                 LA102_29 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 40:
                 LA102_30 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 41:
                 LA102_31 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 42:
                 LA102_32 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 45 or LA102 == 46:
                 LA102_33 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 48:
                 LA102_34 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == IDENTIFIER:
                 LA102_35 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 58:
                 LA102_36 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 66:
                 alt102 = 1
             elif LA102 == 59:
                 LA102_39 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 60:
                 LA102_40 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
             elif LA102 == 49 or LA102 == 50 or LA102 == 51 or LA102 == 52 or LA102 == 53 or LA102 == 54 or LA102 == 55 or LA102 == 56 or LA102 == 57 or LA102 == 61:
                 LA102_41 = self.input.LA(3)
 
-                if (self.synpred2()) :
+                if (self.synpred2()):
                     alt102 = 1
         elif LA102 == 58:
             LA102_14 = self.input.LA(2)
 
-            if (self.synpred2()) :
+            if (self.synpred2()):
                 alt102 = 1
         elif LA102 == 59:
             LA102_16 = self.input.LA(2)
 
-            if (self.synpred2()) :
+            if (self.synpred2()):
                 alt102 = 1
         elif LA102 == 60:
             LA102_17 = self.input.LA(2)
 
-            if (self.synpred2()) :
+            if (self.synpred2()):
                 alt102 = 1
         if alt102 == 1:
             # C.g:0:0: declaration_specifiers
-            self.following.append(self.FOLLOW_declaration_specifiers_in_synpred4100)
+            self.following.append(
+                self.FOLLOW_declaration_specifiers_in_synpred4100)
             self.declaration_specifiers()
             self.following.pop()
             if self.failed:
                 return
 
-
-
         self.following.append(self.FOLLOW_declarator_in_synpred4103)
         self.declarator()
         self.following.pop()
         if self.failed:
             return
         # C.g:119:41: ( declaration )*
-        while True: #loop103
+        while True:  # loop103
             alt103 = 2
             LA103_0 = self.input.LA(1)
 
-            if (LA103_0 == IDENTIFIER or LA103_0 == 26 or (29 <= LA103_0 <= 42) or (45 <= LA103_0 <= 46) or (48 <= LA103_0 <= 61)) :
+            if (LA103_0 == IDENTIFIER or LA103_0 == 26 or (29 <= LA103_0 <= 42) or (45 <= LA103_0 <= 46) or (48 <= LA103_0 <= 61)):
                 alt103 = 1
 
-
             if alt103 == 1:
                 # C.g:0:0: declaration
                 self.following.append(self.FOLLOW_declaration_in_synpred4105)
@@ -16877,21 +15942,17 @@ class CParser(Parser):
                 if self.failed:
                     return
 
-
             else:
-                break #loop103
-
+                break  # loop103
 
         self.match(self.input, 43, self.FOLLOW_43_in_synpred4108)
         if self.failed:
             return
 
-
     # $ANTLR end synpred4
 
-
-
     # $ANTLR start synpred5
+
     def synpred5_fragment(self, ):
         # C.g:120:4: ( declaration )
         # C.g:120:4: declaration
@@ -16901,42 +15962,38 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred5
 
-
-
     # $ANTLR start synpred7
+
     def synpred7_fragment(self, ):
         # C.g:146:6: ( declaration_specifiers )
         # C.g:146:6: declaration_specifiers
-        self.following.append(self.FOLLOW_declaration_specifiers_in_synpred7157)
+        self.following.append(
+            self.FOLLOW_declaration_specifiers_in_synpred7157)
         self.declaration_specifiers()
         self.following.pop()
         if self.failed:
             return
 
-
     # $ANTLR end synpred7
 
-
-
     # $ANTLR start synpred10
+
     def synpred10_fragment(self, ):
         # C.g:167:18: ( declaration_specifiers )
         # C.g:167:18: declaration_specifiers
-        self.following.append(self.FOLLOW_declaration_specifiers_in_synpred10207)
+        self.following.append(
+            self.FOLLOW_declaration_specifiers_in_synpred10207)
         self.declaration_specifiers()
         self.following.pop()
         if self.failed:
             return
 
-
     # $ANTLR end synpred10
 
-
-
     # $ANTLR start synpred14
+
     def synpred14_fragment(self, ):
         # C.g:184:7: ( type_specifier )
         # C.g:184:7: type_specifier
@@ -16946,12 +16003,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred14
 
-
-
     # $ANTLR start synpred15
+
     def synpred15_fragment(self, ):
         # C.g:185:13: ( type_qualifier )
         # C.g:185:13: type_qualifier
@@ -16961,12 +16016,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred15
 
-
-
     # $ANTLR start synpred33
+
     def synpred33_fragment(self, ):
         # C.g:225:16: ( type_qualifier )
         # C.g:225:16: type_qualifier
@@ -16976,58 +16029,53 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred33
 
-
-
     # $ANTLR start synpred34
+
     def synpred34_fragment(self, ):
         # C.g:225:4: ( IDENTIFIER ( type_qualifier )* declarator )
         # C.g:225:5: IDENTIFIER ( type_qualifier )* declarator
-        self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_synpred34442)
+        self.match(self.input, IDENTIFIER,
+                   self.FOLLOW_IDENTIFIER_in_synpred34442)
         if self.failed:
             return
         # C.g:225:16: ( type_qualifier )*
-        while True: #loop106
+        while True:  # loop106
             alt106 = 2
             LA106 = self.input.LA(1)
             if LA106 == 58:
                 LA106_2 = self.input.LA(2)
 
-                if (self.synpred33()) :
+                if (self.synpred33()):
                     alt106 = 1
 
-
             elif LA106 == 59:
                 LA106_3 = self.input.LA(2)
 
-                if (self.synpred33()) :
+                if (self.synpred33()):
                     alt106 = 1
 
-
             elif LA106 == 60:
                 LA106_4 = self.input.LA(2)
 
-                if (self.synpred33()) :
+                if (self.synpred33()):
                     alt106 = 1
 
-
             elif LA106 == 49 or LA106 == 50 or LA106 == 51 or LA106 == 52 or LA106 == 53 or LA106 == 54 or LA106 == 55 or LA106 == 56 or LA106 == 57 or LA106 == 61:
                 alt106 = 1
 
             if alt106 == 1:
                 # C.g:0:0: type_qualifier
-                self.following.append(self.FOLLOW_type_qualifier_in_synpred34444)
+                self.following.append(
+                    self.FOLLOW_type_qualifier_in_synpred34444)
                 self.type_qualifier()
                 self.following.pop()
                 if self.failed:
                     return
 
-
             else:
-                break #loop106
-
+                break  # loop106
 
         self.following.append(self.FOLLOW_declarator_in_synpred34447)
         self.declarator()
@@ -17035,12 +16083,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred34
 
-
-
     # $ANTLR start synpred39
+
     def synpred39_fragment(self, ):
         # C.g:253:6: ( type_qualifier )
         # C.g:253:6: type_qualifier
@@ -17050,12 +16096,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred39
 
-
-
     # $ANTLR start synpred40
+
     def synpred40_fragment(self, ):
         # C.g:253:23: ( type_specifier )
         # C.g:253:23: type_specifier
@@ -17065,12 +16109,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred40
 
-
-
     # $ANTLR start synpred66
+
     def synpred66_fragment(self, ):
         # C.g:297:4: ( ( pointer )? ( 'EFIAPI' )? ( 'EFI_BOOTSERVICE' )? ( 'EFI_RUNTIMESERVICE' )? direct_declarator )
         # C.g:297:4: ( pointer )? ( 'EFIAPI' )? ( 'EFI_BOOTSERVICE' )? ( 'EFI_RUNTIMESERVICE' )? direct_declarator
@@ -17078,7 +16120,7 @@ class CParser(Parser):
         alt111 = 2
         LA111_0 = self.input.LA(1)
 
-        if (LA111_0 == 66) :
+        if (LA111_0 == 66):
             alt111 = 1
         if alt111 == 1:
             # C.g:0:0: pointer
@@ -17088,13 +16130,11 @@ class CParser(Parser):
             if self.failed:
                 return
 
-
-
         # C.g:297:13: ( 'EFIAPI' )?
         alt112 = 2
         LA112_0 = self.input.LA(1)
 
-        if (LA112_0 == 58) :
+        if (LA112_0 == 58):
             alt112 = 1
         if alt112 == 1:
             # C.g:297:14: 'EFIAPI'
@@ -17102,13 +16142,11 @@ class CParser(Parser):
             if self.failed:
                 return
 
-
-
         # C.g:297:25: ( 'EFI_BOOTSERVICE' )?
         alt113 = 2
         LA113_0 = self.input.LA(1)
 
-        if (LA113_0 == 59) :
+        if (LA113_0 == 59):
             alt113 = 1
         if alt113 == 1:
             # C.g:297:26: 'EFI_BOOTSERVICE'
@@ -17116,13 +16154,11 @@ class CParser(Parser):
             if self.failed:
                 return
 
-
-
         # C.g:297:46: ( 'EFI_RUNTIMESERVICE' )?
         alt114 = 2
         LA114_0 = self.input.LA(1)
 
-        if (LA114_0 == 60) :
+        if (LA114_0 == 60):
             alt114 = 1
         if alt114 == 1:
             # C.g:297:47: 'EFI_RUNTIMESERVICE'
@@ -17130,20 +16166,16 @@ class CParser(Parser):
             if self.failed:
                 return
 
-
-
         self.following.append(self.FOLLOW_direct_declarator_in_synpred66802)
         self.direct_declarator()
         self.following.pop()
         if self.failed:
             return
 
-
     # $ANTLR end synpred66
 
-
-
     # $ANTLR start synpred67
+
     def synpred67_fragment(self, ):
         # C.g:303:15: ( declarator_suffix )
         # C.g:303:15: declarator_suffix
@@ -17153,12 +16185,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred67
 
-
-
     # $ANTLR start synpred69
+
     def synpred69_fragment(self, ):
         # C.g:304:9: ( 'EFIAPI' )
         # C.g:304:9: 'EFIAPI'
@@ -17166,12 +16196,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred69
 
-
-
     # $ANTLR start synpred70
+
     def synpred70_fragment(self, ):
         # C.g:304:35: ( declarator_suffix )
         # C.g:304:35: declarator_suffix
@@ -17181,12 +16209,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred70
 
-
-
     # $ANTLR start synpred73
+
     def synpred73_fragment(self, ):
         # C.g:310:9: ( '(' parameter_type_list ')' )
         # C.g:310:9: '(' parameter_type_list ')'
@@ -17202,12 +16228,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred73
 
-
-
     # $ANTLR start synpred74
+
     def synpred74_fragment(self, ):
         # C.g:311:9: ( '(' identifier_list ')' )
         # C.g:311:9: '(' identifier_list ')'
@@ -17223,12 +16247,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred74
 
-
-
     # $ANTLR start synpred75
+
     def synpred75_fragment(self, ):
         # C.g:316:8: ( type_qualifier )
         # C.g:316:8: type_qualifier
@@ -17238,12 +16260,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred75
 
-
-
     # $ANTLR start synpred76
+
     def synpred76_fragment(self, ):
         # C.g:316:24: ( pointer )
         # C.g:316:24: pointer
@@ -17253,12 +16273,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred76
 
-
-
     # $ANTLR start synpred77
+
     def synpred77_fragment(self, ):
         # C.g:316:4: ( '*' ( type_qualifier )+ ( pointer )? )
         # C.g:316:4: '*' ( type_qualifier )+ ( pointer )?
@@ -17267,26 +16285,25 @@ class CParser(Parser):
             return
         # C.g:316:8: ( type_qualifier )+
         cnt116 = 0
-        while True: #loop116
+        while True:  # loop116
             alt116 = 2
             LA116_0 = self.input.LA(1)
 
-            if ((49 <= LA116_0 <= 61)) :
+            if ((49 <= LA116_0 <= 61)):
                 alt116 = 1
 
-
             if alt116 == 1:
                 # C.g:0:0: type_qualifier
-                self.following.append(self.FOLLOW_type_qualifier_in_synpred77921)
+                self.following.append(
+                    self.FOLLOW_type_qualifier_in_synpred77921)
                 self.type_qualifier()
                 self.following.pop()
                 if self.failed:
                     return
 
-
             else:
                 if cnt116 >= 1:
-                    break #loop116
+                    break  # loop116
 
                 if self.backtracking > 0:
                     self.failed = True
@@ -17297,12 +16314,11 @@ class CParser(Parser):
 
             cnt116 += 1
 
-
         # C.g:316:24: ( pointer )?
         alt117 = 2
         LA117_0 = self.input.LA(1)
 
-        if (LA117_0 == 66) :
+        if (LA117_0 == 66):
             alt117 = 1
         if alt117 == 1:
             # C.g:0:0: pointer
@@ -17312,15 +16328,10 @@ class CParser(Parser):
             if self.failed:
                 return
 
-
-
-
-
     # $ANTLR end synpred77
 
-
-
     # $ANTLR start synpred78
+
     def synpred78_fragment(self, ):
         # C.g:317:4: ( '*' pointer )
         # C.g:317:4: '*' pointer
@@ -17333,12 +16344,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred78
 
-
-
     # $ANTLR start synpred81
+
     def synpred81_fragment(self, ):
         # C.g:326:32: ( 'OPTIONAL' )
         # C.g:326:32: 'OPTIONAL'
@@ -17346,12 +16355,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred81
 
-
-
     # $ANTLR start synpred82
+
     def synpred82_fragment(self, ):
         # C.g:326:27: ( ',' ( 'OPTIONAL' )? parameter_declaration )
         # C.g:326:27: ',' ( 'OPTIONAL' )? parameter_declaration
@@ -17362,10 +16369,10 @@ class CParser(Parser):
         alt119 = 2
         LA119_0 = self.input.LA(1)
 
-        if (LA119_0 == 53) :
+        if (LA119_0 == 53):
             LA119_1 = self.input.LA(2)
 
-            if (self.synpred81()) :
+            if (self.synpred81()):
                 alt119 = 1
         if alt119 == 1:
             # C.g:326:32: 'OPTIONAL'
@@ -17373,20 +16380,17 @@ class CParser(Parser):
             if self.failed:
                 return
 
-
-
-        self.following.append(self.FOLLOW_parameter_declaration_in_synpred82981)
+        self.following.append(
+            self.FOLLOW_parameter_declaration_in_synpred82981)
         self.parameter_declaration()
         self.following.pop()
         if self.failed:
             return
 
-
     # $ANTLR end synpred82
 
-
-
     # $ANTLR start synpred83
+
     def synpred83_fragment(self, ):
         # C.g:330:28: ( declarator )
         # C.g:330:28: declarator
@@ -17396,12 +16400,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred83
 
-
-
     # $ANTLR start synpred84
+
     def synpred84_fragment(self, ):
         # C.g:330:39: ( abstract_declarator )
         # C.g:330:39: abstract_declarator
@@ -17411,33 +16413,31 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred84
 
-
-
     # $ANTLR start synpred86
+
     def synpred86_fragment(self, ):
         # C.g:330:4: ( declaration_specifiers ( declarator | abstract_declarator )* ( 'OPTIONAL' )? )
         # C.g:330:4: declaration_specifiers ( declarator | abstract_declarator )* ( 'OPTIONAL' )?
-        self.following.append(self.FOLLOW_declaration_specifiers_in_synpred86994)
+        self.following.append(
+            self.FOLLOW_declaration_specifiers_in_synpred86994)
         self.declaration_specifiers()
         self.following.pop()
         if self.failed:
             return
         # C.g:330:27: ( declarator | abstract_declarator )*
-        while True: #loop120
+        while True:  # loop120
             alt120 = 3
             LA120 = self.input.LA(1)
             if LA120 == 66:
                 LA120_3 = self.input.LA(2)
 
-                if (self.synpred83()) :
+                if (self.synpred83()):
                     alt120 = 1
-                elif (self.synpred84()) :
+                elif (self.synpred84()):
                     alt120 = 2
 
-
             elif LA120 == IDENTIFIER or LA120 == 58 or LA120 == 59 or LA120 == 60:
                 alt120 = 1
             elif LA120 == 62:
@@ -17447,58 +16447,51 @@ class CParser(Parser):
                 elif LA120 == 58:
                     LA120_21 = self.input.LA(3)
 
-                    if (self.synpred83()) :
+                    if (self.synpred83()):
                         alt120 = 1
-                    elif (self.synpred84()) :
+                    elif (self.synpred84()):
                         alt120 = 2
 
-
                 elif LA120 == 66:
                     LA120_22 = self.input.LA(3)
 
-                    if (self.synpred83()) :
+                    if (self.synpred83()):
                         alt120 = 1
-                    elif (self.synpred84()) :
+                    elif (self.synpred84()):
                         alt120 = 2
 
-
                 elif LA120 == 59:
                     LA120_23 = self.input.LA(3)
 
-                    if (self.synpred83()) :
+                    if (self.synpred83()):
                         alt120 = 1
-                    elif (self.synpred84()) :
+                    elif (self.synpred84()):
                         alt120 = 2
 
-
                 elif LA120 == 60:
                     LA120_24 = self.input.LA(3)
 
-                    if (self.synpred83()) :
+                    if (self.synpred83()):
                         alt120 = 1
-                    elif (self.synpred84()) :
+                    elif (self.synpred84()):
                         alt120 = 2
 
-
                 elif LA120 == IDENTIFIER:
                     LA120_25 = self.input.LA(3)
 
-                    if (self.synpred83()) :
+                    if (self.synpred83()):
                         alt120 = 1
-                    elif (self.synpred84()) :
+                    elif (self.synpred84()):
                         alt120 = 2
 
-
                 elif LA120 == 62:
                     LA120_26 = self.input.LA(3)
 
-                    if (self.synpred83()) :
+                    if (self.synpred83()):
                         alt120 = 1
-                    elif (self.synpred84()) :
+                    elif (self.synpred84()):
                         alt120 = 2
 
-
-
             elif LA120 == 64:
                 alt120 = 2
 
@@ -17510,25 +16503,23 @@ class CParser(Parser):
                 if self.failed:
                     return
 
-
             elif alt120 == 2:
                 # C.g:330:39: abstract_declarator
-                self.following.append(self.FOLLOW_abstract_declarator_in_synpred86999)
+                self.following.append(
+                    self.FOLLOW_abstract_declarator_in_synpred86999)
                 self.abstract_declarator()
                 self.following.pop()
                 if self.failed:
                     return
 
-
             else:
-                break #loop120
-
+                break  # loop120
 
         # C.g:330:61: ( 'OPTIONAL' )?
         alt121 = 2
         LA121_0 = self.input.LA(1)
 
-        if (LA121_0 == 53) :
+        if (LA121_0 == 53):
             alt121 = 1
         if alt121 == 1:
             # C.g:330:62: 'OPTIONAL'
@@ -17536,19 +16527,15 @@ class CParser(Parser):
             if self.failed:
                 return
 
-
-
-
-
     # $ANTLR end synpred86
 
-
-
     # $ANTLR start synpred90
+
     def synpred90_fragment(self, ):
         # C.g:341:4: ( specifier_qualifier_list ( abstract_declarator )? )
         # C.g:341:4: specifier_qualifier_list ( abstract_declarator )?
-        self.following.append(self.FOLLOW_specifier_qualifier_list_in_synpred901046)
+        self.following.append(
+            self.FOLLOW_specifier_qualifier_list_in_synpred901046)
         self.specifier_qualifier_list()
         self.following.pop()
         if self.failed:
@@ -17557,40 +16544,35 @@ class CParser(Parser):
         alt122 = 2
         LA122_0 = self.input.LA(1)
 
-        if (LA122_0 == 62 or LA122_0 == 64 or LA122_0 == 66) :
+        if (LA122_0 == 62 or LA122_0 == 64 or LA122_0 == 66):
             alt122 = 1
         if alt122 == 1:
             # C.g:0:0: abstract_declarator
-            self.following.append(self.FOLLOW_abstract_declarator_in_synpred901048)
+            self.following.append(
+                self.FOLLOW_abstract_declarator_in_synpred901048)
             self.abstract_declarator()
             self.following.pop()
             if self.failed:
                 return
 
-
-
-
-
     # $ANTLR end synpred90
 
-
-
     # $ANTLR start synpred91
+
     def synpred91_fragment(self, ):
         # C.g:346:12: ( direct_abstract_declarator )
         # C.g:346:12: direct_abstract_declarator
-        self.following.append(self.FOLLOW_direct_abstract_declarator_in_synpred911067)
+        self.following.append(
+            self.FOLLOW_direct_abstract_declarator_in_synpred911067)
         self.direct_abstract_declarator()
         self.following.pop()
         if self.failed:
             return
 
-
     # $ANTLR end synpred91
 
-
-
     # $ANTLR start synpred93
+
     def synpred93_fragment(self, ):
         # C.g:351:6: ( '(' abstract_declarator ')' )
         # C.g:351:6: '(' abstract_declarator ')'
@@ -17606,27 +16588,24 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred93
 
-
-
     # $ANTLR start synpred94
+
     def synpred94_fragment(self, ):
         # C.g:351:65: ( abstract_declarator_suffix )
         # C.g:351:65: abstract_declarator_suffix
-        self.following.append(self.FOLLOW_abstract_declarator_suffix_in_synpred941098)
+        self.following.append(
+            self.FOLLOW_abstract_declarator_suffix_in_synpred941098)
         self.abstract_declarator_suffix()
         self.following.pop()
         if self.failed:
             return
 
-
     # $ANTLR end synpred94
 
-
-
     # $ANTLR start synpred109
+
     def synpred109_fragment(self, ):
         # C.g:386:4: ( '(' type_name ')' cast_expression )
         # C.g:386:4: '(' type_name ')' cast_expression
@@ -17647,12 +16626,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred109
 
-
-
     # $ANTLR start synpred114
+
     def synpred114_fragment(self, ):
         # C.g:395:4: ( 'sizeof' unary_expression )
         # C.g:395:4: 'sizeof' unary_expression
@@ -17665,19 +16642,18 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred114
 
-
-
     # $ANTLR start synpred117
+
     def synpred117_fragment(self, ):
         # C.g:409:13: ( '(' argument_expression_list ')' )
         # C.g:409:13: '(' argument_expression_list ')'
         self.match(self.input, 62, self.FOLLOW_62_in_synpred1171420)
         if self.failed:
             return
-        self.following.append(self.FOLLOW_argument_expression_list_in_synpred1171424)
+        self.following.append(
+            self.FOLLOW_argument_expression_list_in_synpred1171424)
         self.argument_expression_list()
         self.following.pop()
         if self.failed:
@@ -17686,19 +16662,18 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred117
 
-
-
     # $ANTLR start synpred118
+
     def synpred118_fragment(self, ):
         # C.g:410:13: ( '(' macro_parameter_list ')' )
         # C.g:410:13: '(' macro_parameter_list ')'
         self.match(self.input, 62, self.FOLLOW_62_in_synpred1181444)
         if self.failed:
             return
-        self.following.append(self.FOLLOW_macro_parameter_list_in_synpred1181446)
+        self.following.append(
+            self.FOLLOW_macro_parameter_list_in_synpred1181446)
         self.macro_parameter_list()
         self.following.pop()
         if self.failed:
@@ -17707,84 +16682,77 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred118
 
-
-
     # $ANTLR start synpred120
+
     def synpred120_fragment(self, ):
         # C.g:412:13: ( '*' IDENTIFIER )
         # C.g:412:13: '*' IDENTIFIER
         self.match(self.input, 66, self.FOLLOW_66_in_synpred1201482)
         if self.failed:
             return
-        self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_synpred1201486)
+        self.match(self.input, IDENTIFIER,
+                   self.FOLLOW_IDENTIFIER_in_synpred1201486)
         if self.failed:
             return
 
-
     # $ANTLR end synpred120
 
-
-
     # $ANTLR start synpred137
+
     def synpred137_fragment(self, ):
         # C.g:443:20: ( STRING_LITERAL )
         # C.g:443:20: STRING_LITERAL
-        self.match(self.input, STRING_LITERAL, self.FOLLOW_STRING_LITERAL_in_synpred1371683)
+        self.match(self.input, STRING_LITERAL,
+                   self.FOLLOW_STRING_LITERAL_in_synpred1371683)
         if self.failed:
             return
 
-
     # $ANTLR end synpred137
 
-
-
     # $ANTLR start synpred138
+
     def synpred138_fragment(self, ):
         # C.g:443:8: ( ( IDENTIFIER )* ( STRING_LITERAL )+ )
         # C.g:443:8: ( IDENTIFIER )* ( STRING_LITERAL )+
         # C.g:443:8: ( IDENTIFIER )*
-        while True: #loop125
+        while True:  # loop125
             alt125 = 2
             LA125_0 = self.input.LA(1)
 
-            if (LA125_0 == IDENTIFIER) :
+            if (LA125_0 == IDENTIFIER):
                 alt125 = 1
 
-
             if alt125 == 1:
                 # C.g:0:0: IDENTIFIER
-                self.match(self.input, IDENTIFIER, self.FOLLOW_IDENTIFIER_in_synpred1381680)
+                self.match(self.input, IDENTIFIER,
+                           self.FOLLOW_IDENTIFIER_in_synpred1381680)
                 if self.failed:
                     return
 
-
             else:
-                break #loop125
-
+                break  # loop125
 
         # C.g:443:20: ( STRING_LITERAL )+
         cnt126 = 0
-        while True: #loop126
+        while True:  # loop126
             alt126 = 2
             LA126_0 = self.input.LA(1)
 
-            if (LA126_0 == STRING_LITERAL) :
+            if (LA126_0 == STRING_LITERAL):
                 alt126 = 1
 
-
             if alt126 == 1:
                 # C.g:0:0: STRING_LITERAL
-                self.match(self.input, STRING_LITERAL, self.FOLLOW_STRING_LITERAL_in_synpred1381683)
+                self.match(self.input, STRING_LITERAL,
+                           self.FOLLOW_STRING_LITERAL_in_synpred1381683)
                 if self.failed:
                     return
 
-
             else:
                 if cnt126 >= 1:
-                    break #loop126
+                    break  # loop126
 
                 if self.backtracking > 0:
                     self.failed = True
@@ -17795,14 +16763,10 @@ class CParser(Parser):
 
             cnt126 += 1
 
-
-
-
     # $ANTLR end synpred138
 
-
-
     # $ANTLR start synpred142
+
     def synpred142_fragment(self, ):
         # C.g:458:4: ( lvalue assignment_operator assignment_expression )
         # C.g:458:4: lvalue assignment_operator assignment_expression
@@ -17811,38 +16775,37 @@ class CParser(Parser):
         self.following.pop()
         if self.failed:
             return
-        self.following.append(self.FOLLOW_assignment_operator_in_synpred1421746)
+        self.following.append(
+            self.FOLLOW_assignment_operator_in_synpred1421746)
         self.assignment_operator()
         self.following.pop()
         if self.failed:
             return
-        self.following.append(self.FOLLOW_assignment_expression_in_synpred1421748)
+        self.following.append(
+            self.FOLLOW_assignment_expression_in_synpred1421748)
         self.assignment_expression()
         self.following.pop()
         if self.failed:
             return
 
-
     # $ANTLR end synpred142
 
-
-
     # $ANTLR start synpred169
+
     def synpred169_fragment(self, ):
         # C.g:520:4: ( expression_statement )
         # C.g:520:4: expression_statement
-        self.following.append(self.FOLLOW_expression_statement_in_synpred1692035)
+        self.following.append(
+            self.FOLLOW_expression_statement_in_synpred1692035)
         self.expression_statement()
         self.following.pop()
         if self.failed:
             return
 
-
     # $ANTLR end synpred169
 
-
-
     # $ANTLR start synpred173
+
     def synpred173_fragment(self, ):
         # C.g:524:4: ( macro_statement )
         # C.g:524:4: macro_statement
@@ -17852,12 +16815,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred173
 
-
-
     # $ANTLR start synpred174
+
     def synpred174_fragment(self, ):
         # C.g:525:4: ( asm2_statement )
         # C.g:525:4: asm2_statement
@@ -17867,12 +16828,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred174
 
-
-
     # $ANTLR start synpred181
+
     def synpred181_fragment(self, ):
         # C.g:544:19: ( declaration )
         # C.g:544:19: declaration
@@ -17882,12 +16841,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred181
 
-
-
     # $ANTLR start synpred182
+
     def synpred182_fragment(self, ):
         # C.g:544:33: ( statement_list )
         # C.g:544:33: statement_list
@@ -17897,12 +16854,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred182
 
-
-
     # $ANTLR start synpred186
+
     def synpred186_fragment(self, ):
         # C.g:554:8: ( declaration )
         # C.g:554:8: declaration
@@ -17912,12 +16867,10 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred186
 
-
-
     # $ANTLR start synpred188
+
     def synpred188_fragment(self, ):
         # C.g:558:4: ( statement )
         # C.g:558:4: statement
@@ -17927,11 +16880,8 @@ class CParser(Parser):
         if self.failed:
             return
 
-
     # $ANTLR end synpred188
 
-
-
     def synpred69(self):
         self.backtracking += 1
         start = self.input.mark()
@@ -18382,35 +17332,42 @@ class CParser(Parser):
         self.failed = False
         return success
 
-
-
-
-
-    FOLLOW_external_declaration_in_translation_unit74 = frozenset([1, 4, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66])
+    FOLLOW_external_declaration_in_translation_unit74 = frozenset(
+        [1, 4, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66])
     FOLLOW_function_definition_in_external_declaration113 = frozenset([1])
     FOLLOW_declaration_in_external_declaration118 = frozenset([1])
     FOLLOW_macro_statement_in_external_declaration123 = frozenset([1, 25])
     FOLLOW_25_in_external_declaration126 = frozenset([1])
-    FOLLOW_declaration_specifiers_in_function_definition157 = frozenset([4, 58, 59, 60, 62, 66])
-    FOLLOW_declarator_in_function_definition160 = frozenset([4, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
-    FOLLOW_declaration_in_function_definition166 = frozenset([4, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_declaration_specifiers_in_function_definition157 = frozenset(
+        [4, 58, 59, 60, 62, 66])
+    FOLLOW_declarator_in_function_definition160 = frozenset(
+        [4, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_declaration_in_function_definition166 = frozenset(
+        [4, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
     FOLLOW_compound_statement_in_function_definition171 = frozenset([1])
     FOLLOW_compound_statement_in_function_definition180 = frozenset([1])
-    FOLLOW_26_in_declaration203 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66])
-    FOLLOW_declaration_specifiers_in_declaration207 = frozenset([4, 58, 59, 60, 62, 66])
+    FOLLOW_26_in_declaration203 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39,
+                                            40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66])
+    FOLLOW_declaration_specifiers_in_declaration207 = frozenset(
+        [4, 58, 59, 60, 62, 66])
     FOLLOW_init_declarator_list_in_declaration216 = frozenset([25])
     FOLLOW_25_in_declaration220 = frozenset([1])
-    FOLLOW_declaration_specifiers_in_declaration234 = frozenset([4, 25, 58, 59, 60, 62, 66])
+    FOLLOW_declaration_specifiers_in_declaration234 = frozenset(
+        [4, 25, 58, 59, 60, 62, 66])
     FOLLOW_init_declarator_list_in_declaration238 = frozenset([25])
     FOLLOW_25_in_declaration243 = frozenset([1])
-    FOLLOW_storage_class_specifier_in_declaration_specifiers264 = frozenset([1, 4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
-    FOLLOW_type_specifier_in_declaration_specifiers272 = frozenset([1, 4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
-    FOLLOW_type_qualifier_in_declaration_specifiers286 = frozenset([1, 4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_storage_class_specifier_in_declaration_specifiers264 = frozenset(
+        [1, 4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_type_specifier_in_declaration_specifiers272 = frozenset(
+        [1, 4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_type_qualifier_in_declaration_specifiers286 = frozenset(
+        [1, 4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
     FOLLOW_init_declarator_in_init_declarator_list308 = frozenset([1, 27])
     FOLLOW_27_in_init_declarator_list311 = frozenset([4, 58, 59, 60, 62, 66])
     FOLLOW_init_declarator_in_init_declarator_list313 = frozenset([1, 27])
     FOLLOW_declarator_in_init_declarator326 = frozenset([1, 28])
-    FOLLOW_28_in_init_declarator329 = frozenset([4, 5, 6, 7, 8, 9, 10, 43, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_28_in_init_declarator329 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 43, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_initializer_in_init_declarator331 = frozenset([1])
     FOLLOW_set_in_storage_class_specifier0 = frozenset([1])
     FOLLOW_34_in_type_specifier376 = frozenset([1])
@@ -18428,25 +17385,34 @@ class CParser(Parser):
     FOLLOW_IDENTIFIER_in_type_id467 = frozenset([1])
     FOLLOW_struct_or_union_in_struct_or_union_specifier494 = frozenset([4, 43])
     FOLLOW_IDENTIFIER_in_struct_or_union_specifier496 = frozenset([43])
-    FOLLOW_43_in_struct_or_union_specifier499 = frozenset([4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
-    FOLLOW_struct_declaration_list_in_struct_or_union_specifier501 = frozenset([44])
+    FOLLOW_43_in_struct_or_union_specifier499 = frozenset(
+        [4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_struct_declaration_list_in_struct_or_union_specifier501 = frozenset([
+                                                                               44])
     FOLLOW_44_in_struct_or_union_specifier503 = frozenset([1])
     FOLLOW_struct_or_union_in_struct_or_union_specifier508 = frozenset([4])
     FOLLOW_IDENTIFIER_in_struct_or_union_specifier510 = frozenset([1])
     FOLLOW_set_in_struct_or_union0 = frozenset([1])
-    FOLLOW_struct_declaration_in_struct_declaration_list537 = frozenset([1, 4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
-    FOLLOW_specifier_qualifier_list_in_struct_declaration549 = frozenset([4, 47, 58, 59, 60, 62, 66])
+    FOLLOW_struct_declaration_in_struct_declaration_list537 = frozenset(
+        [1, 4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_specifier_qualifier_list_in_struct_declaration549 = frozenset(
+        [4, 47, 58, 59, 60, 62, 66])
     FOLLOW_struct_declarator_list_in_struct_declaration551 = frozenset([25])
     FOLLOW_25_in_struct_declaration553 = frozenset([1])
-    FOLLOW_type_qualifier_in_specifier_qualifier_list566 = frozenset([1, 4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
-    FOLLOW_type_specifier_in_specifier_qualifier_list570 = frozenset([1, 4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_type_qualifier_in_specifier_qualifier_list566 = frozenset(
+        [1, 4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_type_specifier_in_specifier_qualifier_list570 = frozenset(
+        [1, 4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
     FOLLOW_struct_declarator_in_struct_declarator_list584 = frozenset([1, 27])
-    FOLLOW_27_in_struct_declarator_list587 = frozenset([4, 47, 58, 59, 60, 62, 66])
+    FOLLOW_27_in_struct_declarator_list587 = frozenset(
+        [4, 47, 58, 59, 60, 62, 66])
     FOLLOW_struct_declarator_in_struct_declarator_list589 = frozenset([1, 27])
     FOLLOW_declarator_in_struct_declarator602 = frozenset([1, 47])
-    FOLLOW_47_in_struct_declarator605 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_47_in_struct_declarator605 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_constant_expression_in_struct_declarator607 = frozenset([1])
-    FOLLOW_47_in_struct_declarator614 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_47_in_struct_declarator614 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_constant_expression_in_struct_declarator616 = frozenset([1])
     FOLLOW_48_in_enum_specifier634 = frozenset([43])
     FOLLOW_43_in_enum_specifier636 = frozenset([4])
@@ -18465,7 +17431,8 @@ class CParser(Parser):
     FOLLOW_27_in_enumerator_list680 = frozenset([4])
     FOLLOW_enumerator_in_enumerator_list682 = frozenset([1, 27])
     FOLLOW_IDENTIFIER_in_enumerator695 = frozenset([1, 28])
-    FOLLOW_28_in_enumerator698 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_28_in_enumerator698 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_constant_expression_in_enumerator700 = frozenset([1])
     FOLLOW_set_in_type_qualifier0 = frozenset([1])
     FOLLOW_pointer_in_declarator784 = frozenset([4, 58, 59, 60, 62])
@@ -18481,12 +17448,14 @@ class CParser(Parser):
     FOLLOW_declarator_in_direct_declarator834 = frozenset([63])
     FOLLOW_63_in_direct_declarator836 = frozenset([62, 64])
     FOLLOW_declarator_suffix_in_direct_declarator838 = frozenset([1, 62, 64])
-    FOLLOW_64_in_declarator_suffix852 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_64_in_declarator_suffix852 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_constant_expression_in_declarator_suffix854 = frozenset([65])
     FOLLOW_65_in_declarator_suffix856 = frozenset([1])
     FOLLOW_64_in_declarator_suffix866 = frozenset([65])
     FOLLOW_65_in_declarator_suffix868 = frozenset([1])
-    FOLLOW_62_in_declarator_suffix878 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_62_in_declarator_suffix878 = frozenset(
+        [4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
     FOLLOW_parameter_type_list_in_declarator_suffix880 = frozenset([63])
     FOLLOW_63_in_declarator_suffix882 = frozenset([1])
     FOLLOW_62_in_declarator_suffix892 = frozenset([4])
@@ -18494,8 +17463,10 @@ class CParser(Parser):
     FOLLOW_63_in_declarator_suffix896 = frozenset([1])
     FOLLOW_62_in_declarator_suffix906 = frozenset([63])
     FOLLOW_63_in_declarator_suffix908 = frozenset([1])
-    FOLLOW_66_in_pointer919 = frozenset([49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
-    FOLLOW_type_qualifier_in_pointer921 = frozenset([1, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_66_in_pointer919 = frozenset(
+        [49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_type_qualifier_in_pointer921 = frozenset(
+        [1, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
     FOLLOW_pointer_in_pointer924 = frozenset([1])
     FOLLOW_66_in_pointer930 = frozenset([66])
     FOLLOW_pointer_in_pointer932 = frozenset([1])
@@ -18505,109 +17476,165 @@ class CParser(Parser):
     FOLLOW_53_in_parameter_type_list954 = frozenset([67])
     FOLLOW_67_in_parameter_type_list958 = frozenset([1])
     FOLLOW_parameter_declaration_in_parameter_list971 = frozenset([1, 27])
-    FOLLOW_27_in_parameter_list974 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
-    FOLLOW_53_in_parameter_list977 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_27_in_parameter_list974 = frozenset(
+        [4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_53_in_parameter_list977 = frozenset(
+        [4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
     FOLLOW_parameter_declaration_in_parameter_list981 = frozenset([1, 27])
-    FOLLOW_declaration_specifiers_in_parameter_declaration994 = frozenset([1, 4, 53, 58, 59, 60, 62, 64, 66])
-    FOLLOW_declarator_in_parameter_declaration997 = frozenset([1, 4, 53, 58, 59, 60, 62, 64, 66])
-    FOLLOW_abstract_declarator_in_parameter_declaration999 = frozenset([1, 4, 53, 58, 59, 60, 62, 64, 66])
+    FOLLOW_declaration_specifiers_in_parameter_declaration994 = frozenset(
+        [1, 4, 53, 58, 59, 60, 62, 64, 66])
+    FOLLOW_declarator_in_parameter_declaration997 = frozenset(
+        [1, 4, 53, 58, 59, 60, 62, 64, 66])
+    FOLLOW_abstract_declarator_in_parameter_declaration999 = frozenset(
+        [1, 4, 53, 58, 59, 60, 62, 64, 66])
     FOLLOW_53_in_parameter_declaration1004 = frozenset([1])
     FOLLOW_pointer_in_parameter_declaration1013 = frozenset([4, 66])
     FOLLOW_IDENTIFIER_in_parameter_declaration1016 = frozenset([1])
     FOLLOW_IDENTIFIER_in_identifier_list1027 = frozenset([1, 27])
     FOLLOW_27_in_identifier_list1031 = frozenset([4])
     FOLLOW_IDENTIFIER_in_identifier_list1033 = frozenset([1, 27])
-    FOLLOW_specifier_qualifier_list_in_type_name1046 = frozenset([1, 62, 64, 66])
+    FOLLOW_specifier_qualifier_list_in_type_name1046 = frozenset([
+                                                                 1, 62, 64, 66])
     FOLLOW_abstract_declarator_in_type_name1048 = frozenset([1])
     FOLLOW_type_id_in_type_name1054 = frozenset([1])
     FOLLOW_pointer_in_abstract_declarator1065 = frozenset([1, 62, 64])
-    FOLLOW_direct_abstract_declarator_in_abstract_declarator1067 = frozenset([1])
-    FOLLOW_direct_abstract_declarator_in_abstract_declarator1073 = frozenset([1])
+    FOLLOW_direct_abstract_declarator_in_abstract_declarator1067 = frozenset([
+                                                                             1])
+    FOLLOW_direct_abstract_declarator_in_abstract_declarator1073 = frozenset([
+                                                                             1])
     FOLLOW_62_in_direct_abstract_declarator1086 = frozenset([62, 64, 66])
-    FOLLOW_abstract_declarator_in_direct_abstract_declarator1088 = frozenset([63])
+    FOLLOW_abstract_declarator_in_direct_abstract_declarator1088 = frozenset([
+                                                                             63])
     FOLLOW_63_in_direct_abstract_declarator1090 = frozenset([1, 62, 64])
-    FOLLOW_abstract_declarator_suffix_in_direct_abstract_declarator1094 = frozenset([1, 62, 64])
-    FOLLOW_abstract_declarator_suffix_in_direct_abstract_declarator1098 = frozenset([1, 62, 64])
+    FOLLOW_abstract_declarator_suffix_in_direct_abstract_declarator1094 = frozenset([
+                                                                                    1, 62, 64])
+    FOLLOW_abstract_declarator_suffix_in_direct_abstract_declarator1098 = frozenset([
+                                                                                    1, 62, 64])
     FOLLOW_64_in_abstract_declarator_suffix1110 = frozenset([65])
     FOLLOW_65_in_abstract_declarator_suffix1112 = frozenset([1])
-    FOLLOW_64_in_abstract_declarator_suffix1117 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_constant_expression_in_abstract_declarator_suffix1119 = frozenset([65])
+    FOLLOW_64_in_abstract_declarator_suffix1117 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_constant_expression_in_abstract_declarator_suffix1119 = frozenset([
+                                                                             65])
     FOLLOW_65_in_abstract_declarator_suffix1121 = frozenset([1])
     FOLLOW_62_in_abstract_declarator_suffix1126 = frozenset([63])
     FOLLOW_63_in_abstract_declarator_suffix1128 = frozenset([1])
-    FOLLOW_62_in_abstract_declarator_suffix1133 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
-    FOLLOW_parameter_type_list_in_abstract_declarator_suffix1135 = frozenset([63])
+    FOLLOW_62_in_abstract_declarator_suffix1133 = frozenset(
+        [4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_parameter_type_list_in_abstract_declarator_suffix1135 = frozenset([
+                                                                             63])
     FOLLOW_63_in_abstract_declarator_suffix1137 = frozenset([1])
     FOLLOW_assignment_expression_in_initializer1150 = frozenset([1])
-    FOLLOW_43_in_initializer1155 = frozenset([4, 5, 6, 7, 8, 9, 10, 43, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_43_in_initializer1155 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 43, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_initializer_list_in_initializer1157 = frozenset([27, 44])
     FOLLOW_27_in_initializer1159 = frozenset([44])
     FOLLOW_44_in_initializer1162 = frozenset([1])
     FOLLOW_initializer_in_initializer_list1173 = frozenset([1, 27])
-    FOLLOW_27_in_initializer_list1176 = frozenset([4, 5, 6, 7, 8, 9, 10, 43, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_27_in_initializer_list1176 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 43, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_initializer_in_initializer_list1178 = frozenset([1, 27])
-    FOLLOW_assignment_expression_in_argument_expression_list1196 = frozenset([1, 27, 53])
+    FOLLOW_assignment_expression_in_argument_expression_list1196 = frozenset([
+                                                                             1, 27, 53])
     FOLLOW_53_in_argument_expression_list1199 = frozenset([1, 27])
-    FOLLOW_27_in_argument_expression_list1204 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_assignment_expression_in_argument_expression_list1206 = frozenset([1, 27, 53])
+    FOLLOW_27_in_argument_expression_list1204 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_assignment_expression_in_argument_expression_list1206 = frozenset([
+                                                                             1, 27, 53])
     FOLLOW_53_in_argument_expression_list1209 = frozenset([1, 27])
-    FOLLOW_multiplicative_expression_in_additive_expression1225 = frozenset([1, 68, 69])
-    FOLLOW_68_in_additive_expression1229 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_multiplicative_expression_in_additive_expression1231 = frozenset([1, 68, 69])
-    FOLLOW_69_in_additive_expression1235 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_multiplicative_expression_in_additive_expression1237 = frozenset([1, 68, 69])
-    FOLLOW_cast_expression_in_multiplicative_expression1251 = frozenset([1, 66, 70, 71])
-    FOLLOW_66_in_multiplicative_expression1255 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_cast_expression_in_multiplicative_expression1257 = frozenset([1, 66, 70, 71])
-    FOLLOW_70_in_multiplicative_expression1261 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_cast_expression_in_multiplicative_expression1263 = frozenset([1, 66, 70, 71])
-    FOLLOW_71_in_multiplicative_expression1267 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_cast_expression_in_multiplicative_expression1269 = frozenset([1, 66, 70, 71])
-    FOLLOW_62_in_cast_expression1282 = frozenset([4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_multiplicative_expression_in_additive_expression1225 = frozenset([
+                                                                            1, 68, 69])
+    FOLLOW_68_in_additive_expression1229 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_multiplicative_expression_in_additive_expression1231 = frozenset([
+                                                                            1, 68, 69])
+    FOLLOW_69_in_additive_expression1235 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_multiplicative_expression_in_additive_expression1237 = frozenset([
+                                                                            1, 68, 69])
+    FOLLOW_cast_expression_in_multiplicative_expression1251 = frozenset([
+                                                                        1, 66, 70, 71])
+    FOLLOW_66_in_multiplicative_expression1255 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_cast_expression_in_multiplicative_expression1257 = frozenset([
+                                                                        1, 66, 70, 71])
+    FOLLOW_70_in_multiplicative_expression1261 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_cast_expression_in_multiplicative_expression1263 = frozenset([
+                                                                        1, 66, 70, 71])
+    FOLLOW_71_in_multiplicative_expression1267 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_cast_expression_in_multiplicative_expression1269 = frozenset([
+                                                                        1, 66, 70, 71])
+    FOLLOW_62_in_cast_expression1282 = frozenset(
+        [4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
     FOLLOW_type_name_in_cast_expression1284 = frozenset([63])
-    FOLLOW_63_in_cast_expression1286 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_63_in_cast_expression1286 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_cast_expression_in_cast_expression1288 = frozenset([1])
     FOLLOW_unary_expression_in_cast_expression1293 = frozenset([1])
     FOLLOW_postfix_expression_in_unary_expression1304 = frozenset([1])
-    FOLLOW_72_in_unary_expression1309 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_72_in_unary_expression1309 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_unary_expression_in_unary_expression1311 = frozenset([1])
-    FOLLOW_73_in_unary_expression1316 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_73_in_unary_expression1316 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_unary_expression_in_unary_expression1318 = frozenset([1])
-    FOLLOW_unary_operator_in_unary_expression1323 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_unary_operator_in_unary_expression1323 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_cast_expression_in_unary_expression1325 = frozenset([1])
-    FOLLOW_74_in_unary_expression1330 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_74_in_unary_expression1330 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_unary_expression_in_unary_expression1332 = frozenset([1])
     FOLLOW_74_in_unary_expression1337 = frozenset([62])
-    FOLLOW_62_in_unary_expression1339 = frozenset([4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_62_in_unary_expression1339 = frozenset(
+        [4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
     FOLLOW_type_name_in_unary_expression1341 = frozenset([63])
     FOLLOW_63_in_unary_expression1343 = frozenset([1])
-    FOLLOW_primary_expression_in_postfix_expression1367 = frozenset([1, 62, 64, 66, 72, 73, 75, 76])
-    FOLLOW_64_in_postfix_expression1383 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_primary_expression_in_postfix_expression1367 = frozenset(
+        [1, 62, 64, 66, 72, 73, 75, 76])
+    FOLLOW_64_in_postfix_expression1383 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_expression_in_postfix_expression1385 = frozenset([65])
-    FOLLOW_65_in_postfix_expression1387 = frozenset([1, 62, 64, 66, 72, 73, 75, 76])
+    FOLLOW_65_in_postfix_expression1387 = frozenset(
+        [1, 62, 64, 66, 72, 73, 75, 76])
     FOLLOW_62_in_postfix_expression1401 = frozenset([63])
-    FOLLOW_63_in_postfix_expression1405 = frozenset([1, 62, 64, 66, 72, 73, 75, 76])
-    FOLLOW_62_in_postfix_expression1420 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_63_in_postfix_expression1405 = frozenset(
+        [1, 62, 64, 66, 72, 73, 75, 76])
+    FOLLOW_62_in_postfix_expression1420 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_argument_expression_list_in_postfix_expression1424 = frozenset([63])
-    FOLLOW_63_in_postfix_expression1428 = frozenset([1, 62, 64, 66, 72, 73, 75, 76])
-    FOLLOW_62_in_postfix_expression1444 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_63_in_postfix_expression1428 = frozenset(
+        [1, 62, 64, 66, 72, 73, 75, 76])
+    FOLLOW_62_in_postfix_expression1444 = frozenset(
+        [4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
     FOLLOW_macro_parameter_list_in_postfix_expression1446 = frozenset([63])
-    FOLLOW_63_in_postfix_expression1448 = frozenset([1, 62, 64, 66, 72, 73, 75, 76])
+    FOLLOW_63_in_postfix_expression1448 = frozenset(
+        [1, 62, 64, 66, 72, 73, 75, 76])
     FOLLOW_75_in_postfix_expression1462 = frozenset([4])
-    FOLLOW_IDENTIFIER_in_postfix_expression1466 = frozenset([1, 62, 64, 66, 72, 73, 75, 76])
+    FOLLOW_IDENTIFIER_in_postfix_expression1466 = frozenset(
+        [1, 62, 64, 66, 72, 73, 75, 76])
     FOLLOW_66_in_postfix_expression1482 = frozenset([4])
-    FOLLOW_IDENTIFIER_in_postfix_expression1486 = frozenset([1, 62, 64, 66, 72, 73, 75, 76])
+    FOLLOW_IDENTIFIER_in_postfix_expression1486 = frozenset(
+        [1, 62, 64, 66, 72, 73, 75, 76])
     FOLLOW_76_in_postfix_expression1502 = frozenset([4])
-    FOLLOW_IDENTIFIER_in_postfix_expression1506 = frozenset([1, 62, 64, 66, 72, 73, 75, 76])
-    FOLLOW_72_in_postfix_expression1522 = frozenset([1, 62, 64, 66, 72, 73, 75, 76])
-    FOLLOW_73_in_postfix_expression1536 = frozenset([1, 62, 64, 66, 72, 73, 75, 76])
-    FOLLOW_parameter_declaration_in_macro_parameter_list1559 = frozenset([1, 27])
-    FOLLOW_27_in_macro_parameter_list1562 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
-    FOLLOW_parameter_declaration_in_macro_parameter_list1564 = frozenset([1, 27])
+    FOLLOW_IDENTIFIER_in_postfix_expression1506 = frozenset(
+        [1, 62, 64, 66, 72, 73, 75, 76])
+    FOLLOW_72_in_postfix_expression1522 = frozenset(
+        [1, 62, 64, 66, 72, 73, 75, 76])
+    FOLLOW_73_in_postfix_expression1536 = frozenset(
+        [1, 62, 64, 66, 72, 73, 75, 76])
+    FOLLOW_parameter_declaration_in_macro_parameter_list1559 = frozenset([
+                                                                         1, 27])
+    FOLLOW_27_in_macro_parameter_list1562 = frozenset(
+        [4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_parameter_declaration_in_macro_parameter_list1564 = frozenset([
+                                                                         1, 27])
     FOLLOW_set_in_unary_operator0 = frozenset([1])
     FOLLOW_IDENTIFIER_in_primary_expression1613 = frozenset([1])
     FOLLOW_constant_in_primary_expression1618 = frozenset([1])
-    FOLLOW_62_in_primary_expression1623 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_62_in_primary_expression1623 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_expression_in_primary_expression1625 = frozenset([63])
     FOLLOW_63_in_primary_expression1627 = frozenset([1])
     FOLLOW_HEX_LITERAL_in_constant1643 = frozenset([1])
@@ -18619,44 +17646,71 @@ class CParser(Parser):
     FOLLOW_IDENTIFIER_in_constant1688 = frozenset([1, 4])
     FOLLOW_FLOATING_POINT_LITERAL_in_constant1699 = frozenset([1])
     FOLLOW_assignment_expression_in_expression1715 = frozenset([1, 27])
-    FOLLOW_27_in_expression1718 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_27_in_expression1718 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_assignment_expression_in_expression1720 = frozenset([1, 27])
     FOLLOW_conditional_expression_in_constant_expression1733 = frozenset([1])
-    FOLLOW_lvalue_in_assignment_expression1744 = frozenset([28, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89])
-    FOLLOW_assignment_operator_in_assignment_expression1746 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_lvalue_in_assignment_expression1744 = frozenset(
+        [28, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89])
+    FOLLOW_assignment_operator_in_assignment_expression1746 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_assignment_expression_in_assignment_expression1748 = frozenset([1])
     FOLLOW_conditional_expression_in_assignment_expression1753 = frozenset([1])
     FOLLOW_unary_expression_in_lvalue1765 = frozenset([1])
     FOLLOW_set_in_assignment_operator0 = frozenset([1])
-    FOLLOW_logical_or_expression_in_conditional_expression1839 = frozenset([1, 90])
-    FOLLOW_90_in_conditional_expression1842 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_logical_or_expression_in_conditional_expression1839 = frozenset([
+                                                                           1, 90])
+    FOLLOW_90_in_conditional_expression1842 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_expression_in_conditional_expression1844 = frozenset([47])
-    FOLLOW_47_in_conditional_expression1846 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_conditional_expression_in_conditional_expression1848 = frozenset([1])
-    FOLLOW_logical_and_expression_in_logical_or_expression1863 = frozenset([1, 91])
-    FOLLOW_91_in_logical_or_expression1866 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_logical_and_expression_in_logical_or_expression1868 = frozenset([1, 91])
-    FOLLOW_inclusive_or_expression_in_logical_and_expression1881 = frozenset([1, 92])
-    FOLLOW_92_in_logical_and_expression1884 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_inclusive_or_expression_in_logical_and_expression1886 = frozenset([1, 92])
-    FOLLOW_exclusive_or_expression_in_inclusive_or_expression1899 = frozenset([1, 93])
-    FOLLOW_93_in_inclusive_or_expression1902 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_exclusive_or_expression_in_inclusive_or_expression1904 = frozenset([1, 93])
+    FOLLOW_47_in_conditional_expression1846 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_conditional_expression_in_conditional_expression1848 = frozenset([
+                                                                            1])
+    FOLLOW_logical_and_expression_in_logical_or_expression1863 = frozenset([
+                                                                           1, 91])
+    FOLLOW_91_in_logical_or_expression1866 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_logical_and_expression_in_logical_or_expression1868 = frozenset([
+                                                                           1, 91])
+    FOLLOW_inclusive_or_expression_in_logical_and_expression1881 = frozenset([
+                                                                             1, 92])
+    FOLLOW_92_in_logical_and_expression1884 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_inclusive_or_expression_in_logical_and_expression1886 = frozenset([
+                                                                             1, 92])
+    FOLLOW_exclusive_or_expression_in_inclusive_or_expression1899 = frozenset([
+                                                                              1, 93])
+    FOLLOW_93_in_inclusive_or_expression1902 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_exclusive_or_expression_in_inclusive_or_expression1904 = frozenset([
+                                                                              1, 93])
     FOLLOW_and_expression_in_exclusive_or_expression1917 = frozenset([1, 94])
-    FOLLOW_94_in_exclusive_or_expression1920 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_94_in_exclusive_or_expression1920 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_and_expression_in_exclusive_or_expression1922 = frozenset([1, 94])
     FOLLOW_equality_expression_in_and_expression1935 = frozenset([1, 77])
-    FOLLOW_77_in_and_expression1938 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_77_in_and_expression1938 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_equality_expression_in_and_expression1940 = frozenset([1, 77])
-    FOLLOW_relational_expression_in_equality_expression1952 = frozenset([1, 95, 96])
-    FOLLOW_set_in_equality_expression1955 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_relational_expression_in_equality_expression1961 = frozenset([1, 95, 96])
-    FOLLOW_shift_expression_in_relational_expression1975 = frozenset([1, 97, 98, 99, 100])
-    FOLLOW_set_in_relational_expression1978 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_shift_expression_in_relational_expression1988 = frozenset([1, 97, 98, 99, 100])
-    FOLLOW_additive_expression_in_shift_expression2001 = frozenset([1, 101, 102])
-    FOLLOW_set_in_shift_expression2004 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_additive_expression_in_shift_expression2010 = frozenset([1, 101, 102])
+    FOLLOW_relational_expression_in_equality_expression1952 = frozenset([
+                                                                        1, 95, 96])
+    FOLLOW_set_in_equality_expression1955 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_relational_expression_in_equality_expression1961 = frozenset([
+                                                                        1, 95, 96])
+    FOLLOW_shift_expression_in_relational_expression1975 = frozenset([
+                                                                     1, 97, 98, 99, 100])
+    FOLLOW_set_in_relational_expression1978 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_shift_expression_in_relational_expression1988 = frozenset([
+                                                                     1, 97, 98, 99, 100])
+    FOLLOW_additive_expression_in_shift_expression2001 = frozenset([
+                                                                   1, 101, 102])
+    FOLLOW_set_in_shift_expression2004 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_additive_expression_in_shift_expression2010 = frozenset([
+                                                                   1, 101, 102])
     FOLLOW_labeled_statement_in_statement2025 = frozenset([1])
     FOLLOW_compound_statement_in_statement2030 = frozenset([1])
     FOLLOW_expression_statement_in_statement2035 = frozenset([1])
@@ -18670,72 +17724,101 @@ class CParser(Parser):
     FOLLOW_declaration_in_statement2075 = frozenset([1])
     FOLLOW_103_in_asm2_statement2086 = frozenset([4])
     FOLLOW_IDENTIFIER_in_asm2_statement2089 = frozenset([62])
-    FOLLOW_62_in_asm2_statement2091 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
-    FOLLOW_set_in_asm2_statement2094 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_62_in_asm2_statement2091 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58,
+                                                59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_set_in_asm2_statement2094 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58,
+                                                 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_63_in_asm2_statement2101 = frozenset([25])
     FOLLOW_25_in_asm2_statement2103 = frozenset([1])
     FOLLOW_104_in_asm1_statement2115 = frozenset([43])
-    FOLLOW_43_in_asm1_statement2117 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
-    FOLLOW_set_in_asm1_statement2120 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_43_in_asm1_statement2117 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58,
+                                                59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_set_in_asm1_statement2120 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58,
+                                                 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_44_in_asm1_statement2127 = frozenset([1])
     FOLLOW_105_in_asm_statement2138 = frozenset([43])
-    FOLLOW_43_in_asm_statement2140 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
-    FOLLOW_set_in_asm_statement2143 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_43_in_asm_statement2140 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58,
+                                               59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_set_in_asm_statement2143 = frozenset([4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58,
+                                                59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_44_in_asm_statement2150 = frozenset([1])
     FOLLOW_IDENTIFIER_in_macro_statement2162 = frozenset([62])
-    FOLLOW_62_in_macro_statement2164 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
-    FOLLOW_declaration_in_macro_statement2166 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
-    FOLLOW_statement_list_in_macro_statement2170 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 63, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_62_in_macro_statement2164 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51,
+                                                 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_declaration_in_macro_statement2166 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49,
+                                                          50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_statement_list_in_macro_statement2170 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 63, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_expression_in_macro_statement2173 = frozenset([63])
     FOLLOW_63_in_macro_statement2176 = frozenset([1])
     FOLLOW_IDENTIFIER_in_labeled_statement2188 = frozenset([47])
-    FOLLOW_47_in_labeled_statement2190 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_47_in_labeled_statement2190 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50,
+                                                   51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_statement_in_labeled_statement2192 = frozenset([1])
-    FOLLOW_106_in_labeled_statement2197 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_106_in_labeled_statement2197 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_constant_expression_in_labeled_statement2199 = frozenset([47])
-    FOLLOW_47_in_labeled_statement2201 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_47_in_labeled_statement2201 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50,
+                                                   51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_statement_in_labeled_statement2203 = frozenset([1])
     FOLLOW_107_in_labeled_statement2208 = frozenset([47])
-    FOLLOW_47_in_labeled_statement2210 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_47_in_labeled_statement2210 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50,
+                                                   51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_statement_in_labeled_statement2212 = frozenset([1])
-    FOLLOW_43_in_compound_statement2223 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
-    FOLLOW_declaration_in_compound_statement2225 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_43_in_compound_statement2223 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 48, 49,
+                                                    50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_declaration_in_compound_statement2225 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 48,
+                                                             49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_statement_list_in_compound_statement2228 = frozenset([44])
     FOLLOW_44_in_compound_statement2231 = frozenset([1])
-    FOLLOW_statement_in_statement_list2242 = frozenset([1, 4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_statement_in_statement_list2242 = frozenset([1, 4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49,
+                                                       50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_25_in_expression_statement2254 = frozenset([1])
     FOLLOW_expression_in_expression_statement2259 = frozenset([25])
     FOLLOW_25_in_expression_statement2261 = frozenset([1])
     FOLLOW_108_in_selection_statement2272 = frozenset([62])
-    FOLLOW_62_in_selection_statement2274 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_62_in_selection_statement2274 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_expression_in_selection_statement2278 = frozenset([63])
-    FOLLOW_63_in_selection_statement2280 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_63_in_selection_statement2280 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50,
+                                                     51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_statement_in_selection_statement2284 = frozenset([1, 109])
-    FOLLOW_109_in_selection_statement2299 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_109_in_selection_statement2299 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49,
+                                                      50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_statement_in_selection_statement2301 = frozenset([1])
     FOLLOW_110_in_selection_statement2308 = frozenset([62])
-    FOLLOW_62_in_selection_statement2310 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_62_in_selection_statement2310 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_expression_in_selection_statement2312 = frozenset([63])
-    FOLLOW_63_in_selection_statement2314 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_63_in_selection_statement2314 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50,
+                                                     51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_statement_in_selection_statement2316 = frozenset([1])
     FOLLOW_111_in_iteration_statement2327 = frozenset([62])
-    FOLLOW_62_in_iteration_statement2329 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_62_in_iteration_statement2329 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_expression_in_iteration_statement2333 = frozenset([63])
-    FOLLOW_63_in_iteration_statement2335 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_63_in_iteration_statement2335 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50,
+                                                     51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_statement_in_iteration_statement2337 = frozenset([1])
-    FOLLOW_112_in_iteration_statement2344 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_112_in_iteration_statement2344 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49,
+                                                      50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_statement_in_iteration_statement2346 = frozenset([111])
     FOLLOW_111_in_iteration_statement2348 = frozenset([62])
-    FOLLOW_62_in_iteration_statement2350 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_62_in_iteration_statement2350 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_expression_in_iteration_statement2354 = frozenset([63])
     FOLLOW_63_in_iteration_statement2356 = frozenset([25])
     FOLLOW_25_in_iteration_statement2358 = frozenset([1])
     FOLLOW_113_in_iteration_statement2365 = frozenset([62])
-    FOLLOW_62_in_iteration_statement2367 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_expression_statement_in_iteration_statement2369 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
-    FOLLOW_expression_statement_in_iteration_statement2373 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 63, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_62_in_iteration_statement2367 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 25, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_expression_statement_in_iteration_statement2369 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 25, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_expression_statement_in_iteration_statement2373 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 63, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_expression_in_iteration_statement2375 = frozenset([63])
-    FOLLOW_63_in_iteration_statement2378 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
+    FOLLOW_63_in_iteration_statement2378 = frozenset([4, 5, 6, 7, 8, 9, 10, 25, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50,
+                                                     51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79, 103, 104, 105, 106, 107, 108, 110, 111, 112, 113, 114, 115, 116, 117])
     FOLLOW_statement_in_iteration_statement2380 = frozenset([1])
     FOLLOW_114_in_jump_statement2393 = frozenset([4])
     FOLLOW_IDENTIFIER_in_jump_statement2395 = frozenset([25])
@@ -18746,13 +17829,17 @@ class CParser(Parser):
     FOLLOW_25_in_jump_statement2411 = frozenset([1])
     FOLLOW_117_in_jump_statement2416 = frozenset([25])
     FOLLOW_25_in_jump_statement2418 = frozenset([1])
-    FOLLOW_117_in_jump_statement2423 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_117_in_jump_statement2423 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_expression_in_jump_statement2425 = frozenset([25])
     FOLLOW_25_in_jump_statement2427 = frozenset([1])
     FOLLOW_declaration_specifiers_in_synpred2100 = frozenset([1])
-    FOLLOW_declaration_specifiers_in_synpred4100 = frozenset([4, 58, 59, 60, 62, 66])
-    FOLLOW_declarator_in_synpred4103 = frozenset([4, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
-    FOLLOW_declaration_in_synpred4105 = frozenset([4, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_declaration_specifiers_in_synpred4100 = frozenset(
+        [4, 58, 59, 60, 62, 66])
+    FOLLOW_declarator_in_synpred4103 = frozenset(
+        [4, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_declaration_in_synpred4105 = frozenset(
+        [4, 26, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
     FOLLOW_43_in_synpred4108 = frozenset([1])
     FOLLOW_declaration_in_synpred5118 = frozenset([1])
     FOLLOW_declaration_specifiers_in_synpred7157 = frozenset([1])
@@ -18760,8 +17847,10 @@ class CParser(Parser):
     FOLLOW_type_specifier_in_synpred14272 = frozenset([1])
     FOLLOW_type_qualifier_in_synpred15286 = frozenset([1])
     FOLLOW_type_qualifier_in_synpred33444 = frozenset([1])
-    FOLLOW_IDENTIFIER_in_synpred34442 = frozenset([4, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66])
-    FOLLOW_type_qualifier_in_synpred34444 = frozenset([4, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66])
+    FOLLOW_IDENTIFIER_in_synpred34442 = frozenset(
+        [4, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66])
+    FOLLOW_type_qualifier_in_synpred34444 = frozenset(
+        [4, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 66])
     FOLLOW_declarator_in_synpred34447 = frozenset([1])
     FOLLOW_type_qualifier_in_synpred39566 = frozenset([1])
     FOLLOW_type_specifier_in_synpred40570 = frozenset([1])
@@ -18773,7 +17862,8 @@ class CParser(Parser):
     FOLLOW_declarator_suffix_in_synpred67821 = frozenset([1])
     FOLLOW_58_in_synpred69830 = frozenset([1])
     FOLLOW_declarator_suffix_in_synpred70838 = frozenset([1])
-    FOLLOW_62_in_synpred73878 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_62_in_synpred73878 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39,
+                                          40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
     FOLLOW_parameter_type_list_in_synpred73880 = frozenset([63])
     FOLLOW_63_in_synpred73882 = frozenset([1])
     FOLLOW_62_in_synpred74892 = frozenset([4])
@@ -18781,38 +17871,51 @@ class CParser(Parser):
     FOLLOW_63_in_synpred74896 = frozenset([1])
     FOLLOW_type_qualifier_in_synpred75921 = frozenset([1])
     FOLLOW_pointer_in_synpred76924 = frozenset([1])
-    FOLLOW_66_in_synpred77919 = frozenset([49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
-    FOLLOW_type_qualifier_in_synpred77921 = frozenset([1, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_66_in_synpred77919 = frozenset(
+        [49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_type_qualifier_in_synpred77921 = frozenset(
+        [1, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
     FOLLOW_pointer_in_synpred77924 = frozenset([1])
     FOLLOW_66_in_synpred78930 = frozenset([66])
     FOLLOW_pointer_in_synpred78932 = frozenset([1])
     FOLLOW_53_in_synpred81977 = frozenset([1])
-    FOLLOW_27_in_synpred82974 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
-    FOLLOW_53_in_synpred82977 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_27_in_synpred82974 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39,
+                                          40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_53_in_synpred82977 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39,
+                                          40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
     FOLLOW_parameter_declaration_in_synpred82981 = frozenset([1])
     FOLLOW_declarator_in_synpred83997 = frozenset([1])
     FOLLOW_abstract_declarator_in_synpred84999 = frozenset([1])
-    FOLLOW_declaration_specifiers_in_synpred86994 = frozenset([1, 4, 53, 58, 59, 60, 62, 64, 66])
-    FOLLOW_declarator_in_synpred86997 = frozenset([1, 4, 53, 58, 59, 60, 62, 64, 66])
-    FOLLOW_abstract_declarator_in_synpred86999 = frozenset([1, 4, 53, 58, 59, 60, 62, 64, 66])
+    FOLLOW_declaration_specifiers_in_synpred86994 = frozenset(
+        [1, 4, 53, 58, 59, 60, 62, 64, 66])
+    FOLLOW_declarator_in_synpred86997 = frozenset(
+        [1, 4, 53, 58, 59, 60, 62, 64, 66])
+    FOLLOW_abstract_declarator_in_synpred86999 = frozenset(
+        [1, 4, 53, 58, 59, 60, 62, 64, 66])
     FOLLOW_53_in_synpred861004 = frozenset([1])
-    FOLLOW_specifier_qualifier_list_in_synpred901046 = frozenset([1, 62, 64, 66])
+    FOLLOW_specifier_qualifier_list_in_synpred901046 = frozenset([
+                                                                 1, 62, 64, 66])
     FOLLOW_abstract_declarator_in_synpred901048 = frozenset([1])
     FOLLOW_direct_abstract_declarator_in_synpred911067 = frozenset([1])
     FOLLOW_62_in_synpred931086 = frozenset([62, 64, 66])
     FOLLOW_abstract_declarator_in_synpred931088 = frozenset([63])
     FOLLOW_63_in_synpred931090 = frozenset([1])
     FOLLOW_abstract_declarator_suffix_in_synpred941098 = frozenset([1])
-    FOLLOW_62_in_synpred1091282 = frozenset([4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
+    FOLLOW_62_in_synpred1091282 = frozenset(
+        [4, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61])
     FOLLOW_type_name_in_synpred1091284 = frozenset([63])
-    FOLLOW_63_in_synpred1091286 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_63_in_synpred1091286 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_cast_expression_in_synpred1091288 = frozenset([1])
-    FOLLOW_74_in_synpred1141330 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_74_in_synpred1141330 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_unary_expression_in_synpred1141332 = frozenset([1])
-    FOLLOW_62_in_synpred1171420 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_62_in_synpred1171420 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_argument_expression_list_in_synpred1171424 = frozenset([63])
     FOLLOW_63_in_synpred1171428 = frozenset([1])
-    FOLLOW_62_in_synpred1181444 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
+    FOLLOW_62_in_synpred1181444 = frozenset([4, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38,
+                                            39, 40, 41, 42, 45, 46, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 66])
     FOLLOW_macro_parameter_list_in_synpred1181446 = frozenset([63])
     FOLLOW_63_in_synpred1181448 = frozenset([1])
     FOLLOW_66_in_synpred1201482 = frozenset([4])
@@ -18820,8 +17923,10 @@ class CParser(Parser):
     FOLLOW_STRING_LITERAL_in_synpred1371683 = frozenset([1])
     FOLLOW_IDENTIFIER_in_synpred1381680 = frozenset([4, 9])
     FOLLOW_STRING_LITERAL_in_synpred1381683 = frozenset([1, 9])
-    FOLLOW_lvalue_in_synpred1421744 = frozenset([28, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89])
-    FOLLOW_assignment_operator_in_synpred1421746 = frozenset([4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
+    FOLLOW_lvalue_in_synpred1421744 = frozenset(
+        [28, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89])
+    FOLLOW_assignment_operator_in_synpred1421746 = frozenset(
+        [4, 5, 6, 7, 8, 9, 10, 62, 66, 68, 69, 72, 73, 74, 77, 78, 79])
     FOLLOW_assignment_expression_in_synpred1421748 = frozenset([1])
     FOLLOW_expression_statement_in_synpred1692035 = frozenset([1])
     FOLLOW_macro_statement_in_synpred1732055 = frozenset([1])
@@ -18830,4 +17935,3 @@ class CParser(Parser):
     FOLLOW_statement_list_in_synpred1822170 = frozenset([1])
     FOLLOW_declaration_in_synpred1862225 = frozenset([1])
     FOLLOW_statement_in_synpred1882242 = frozenset([1])
-
diff --git a/BaseTools/Source/Python/Eot/CParser4/CLexer.py b/BaseTools/Source/Python/Eot/CParser4/CLexer.py
index 54374fd6e82d..739152edf0f5 100644
--- a/BaseTools/Source/Python/Eot/CParser4/CLexer.py
+++ b/BaseTools/Source/Python/Eot/CParser4/CLexer.py
@@ -5,7 +5,7 @@ from typing.io import TextIO
 import sys
 
 
-## @file
+# @file
 # The file defines the parser for C source files.
 #
 # THIS FILE IS AUTO-GENENERATED. PLEASE DON NOT MODIFY THIS FILE.
@@ -424,7 +424,7 @@ class CLexer(Lexer):
 
     atn = ATNDeserializer().deserialize(serializedATN())
 
-    decisionsToDFA = [ DFA(ds, i) for i, ds in enumerate(atn.decisionToState) ]
+    decisionsToDFA = [DFA(ds, i) for i, ds in enumerate(atn.decisionToState)]
 
     T__0 = 1
     T__1 = 2
@@ -532,96 +532,99 @@ class CLexer(Lexer):
     LINE_COMMENT = 104
     LINE_COMMAND = 105
 
-    channelNames = [ u"DEFAULT_TOKEN_CHANNEL", u"HIDDEN" ]
+    channelNames = [u"DEFAULT_TOKEN_CHANNEL", u"HIDDEN"]
 
-    modeNames = [ "DEFAULT_MODE" ]
+    modeNames = ["DEFAULT_MODE"]
 
-    literalNames = [ "<INVALID>",
-            "'{'", "';'", "'typedef'", "','", "'='", "'extern'", "'static'",
-            "'auto'", "'register'", "'STATIC'", "'void'", "'char'", "'short'",
-            "'int'", "'long'", "'float'", "'double'", "'signed'", "'unsigned'",
-            "'}'", "'struct'", "'union'", "':'", "'enum'", "'const'", "'volatile'",
-            "'IN'", "'OUT'", "'OPTIONAL'", "'CONST'", "'UNALIGNED'", "'VOLATILE'",
-            "'GLOBAL_REMOVE_IF_UNREFERENCED'", "'EFIAPI'", "'EFI_BOOTSERVICE'",
-            "'EFI_RUNTIMESERVICE'", "'PACKED'", "'('", "')'", "'['", "']'",
-            "'*'", "'...'", "'+'", "'-'", "'/'", "'%'", "'++'", "'--'",
-            "'sizeof'", "'.'", "'->'", "'&'", "'~'", "'!'", "'*='", "'/='",
-            "'%='", "'+='", "'-='", "'<<='", "'>>='", "'&='", "'^='", "'|='",
-            "'?'", "'||'", "'&&'", "'|'", "'^'", "'=='", "'!='", "'<'",
-            "'>'", "'<='", "'>='", "'<<'", "'>>'", "'__asm__'", "'_asm'",
-            "'__asm'", "'case'", "'default'", "'if'", "'else'", "'switch'",
-            "'while'", "'do'", "'goto'", "'continue'", "'break'", "'return'" ]
+    literalNames = ["<INVALID>",
+                    "'{'", "';'", "'typedef'", "','", "'='", "'extern'", "'static'",
+                    "'auto'", "'register'", "'STATIC'", "'void'", "'char'", "'short'",
+                    "'int'", "'long'", "'float'", "'double'", "'signed'", "'unsigned'",
+                    "'}'", "'struct'", "'union'", "':'", "'enum'", "'const'", "'volatile'",
+                    "'IN'", "'OUT'", "'OPTIONAL'", "'CONST'", "'UNALIGNED'", "'VOLATILE'",
+                    "'GLOBAL_REMOVE_IF_UNREFERENCED'", "'EFIAPI'", "'EFI_BOOTSERVICE'",
+                    "'EFI_RUNTIMESERVICE'", "'PACKED'", "'('", "')'", "'['", "']'",
+                    "'*'", "'...'", "'+'", "'-'", "'/'", "'%'", "'++'", "'--'",
+                    "'sizeof'", "'.'", "'->'", "'&'", "'~'", "'!'", "'*='", "'/='",
+                    "'%='", "'+='", "'-='", "'<<='", "'>>='", "'&='", "'^='", "'|='",
+                    "'?'", "'||'", "'&&'", "'|'", "'^'", "'=='", "'!='", "'<'",
+                    "'>'", "'<='", "'>='", "'<<'", "'>>'", "'__asm__'", "'_asm'",
+                    "'__asm'", "'case'", "'default'", "'if'", "'else'", "'switch'",
+                    "'while'", "'do'", "'goto'", "'continue'", "'break'", "'return'"]
 
-    symbolicNames = [ "<INVALID>",
-            "IDENTIFIER", "CHARACTER_LITERAL", "STRING_LITERAL", "HEX_LITERAL",
-            "DECIMAL_LITERAL", "OCTAL_LITERAL", "FLOATING_POINT_LITERAL",
-            "WS", "BS", "UnicodeVocabulary", "COMMENT", "LINE_COMMENT",
-            "LINE_COMMAND" ]
+    symbolicNames = ["<INVALID>",
+                     "IDENTIFIER", "CHARACTER_LITERAL", "STRING_LITERAL", "HEX_LITERAL",
+                     "DECIMAL_LITERAL", "OCTAL_LITERAL", "FLOATING_POINT_LITERAL",
+                     "WS", "BS", "UnicodeVocabulary", "COMMENT", "LINE_COMMENT",
+                     "LINE_COMMAND"]
 
-    ruleNames = [ "T__0", "T__1", "T__2", "T__3", "T__4", "T__5", "T__6",
-                  "T__7", "T__8", "T__9", "T__10", "T__11", "T__12", "T__13",
-                  "T__14", "T__15", "T__16", "T__17", "T__18", "T__19",
-                  "T__20", "T__21", "T__22", "T__23", "T__24", "T__25",
-                  "T__26", "T__27", "T__28", "T__29", "T__30", "T__31",
-                  "T__32", "T__33", "T__34", "T__35", "T__36", "T__37",
-                  "T__38", "T__39", "T__40", "T__41", "T__42", "T__43",
-                  "T__44", "T__45", "T__46", "T__47", "T__48", "T__49",
-                  "T__50", "T__51", "T__52", "T__53", "T__54", "T__55",
-                  "T__56", "T__57", "T__58", "T__59", "T__60", "T__61",
-                  "T__62", "T__63", "T__64", "T__65", "T__66", "T__67",
-                  "T__68", "T__69", "T__70", "T__71", "T__72", "T__73",
-                  "T__74", "T__75", "T__76", "T__77", "T__78", "T__79",
-                  "T__80", "T__81", "T__82", "T__83", "T__84", "T__85",
-                  "T__86", "T__87", "T__88", "T__89", "T__90", "T__91",
-                  "IDENTIFIER", "LETTER", "CHARACTER_LITERAL", "STRING_LITERAL",
-                  "HEX_LITERAL", "DECIMAL_LITERAL", "OCTAL_LITERAL", "HexDigit",
-                  "IntegerTypeSuffix", "FLOATING_POINT_LITERAL", "Exponent",
-                  "FloatTypeSuffix", "EscapeSequence", "OctalEscape", "UnicodeEscape",
-                  "WS", "BS", "UnicodeVocabulary", "COMMENT", "LINE_COMMENT",
-                  "LINE_COMMAND" ]
+    ruleNames = ["T__0", "T__1", "T__2", "T__3", "T__4", "T__5", "T__6",
+                 "T__7", "T__8", "T__9", "T__10", "T__11", "T__12", "T__13",
+                 "T__14", "T__15", "T__16", "T__17", "T__18", "T__19",
+                 "T__20", "T__21", "T__22", "T__23", "T__24", "T__25",
+                 "T__26", "T__27", "T__28", "T__29", "T__30", "T__31",
+                 "T__32", "T__33", "T__34", "T__35", "T__36", "T__37",
+                 "T__38", "T__39", "T__40", "T__41", "T__42", "T__43",
+                 "T__44", "T__45", "T__46", "T__47", "T__48", "T__49",
+                 "T__50", "T__51", "T__52", "T__53", "T__54", "T__55",
+                 "T__56", "T__57", "T__58", "T__59", "T__60", "T__61",
+                 "T__62", "T__63", "T__64", "T__65", "T__66", "T__67",
+                 "T__68", "T__69", "T__70", "T__71", "T__72", "T__73",
+                 "T__74", "T__75", "T__76", "T__77", "T__78", "T__79",
+                 "T__80", "T__81", "T__82", "T__83", "T__84", "T__85",
+                 "T__86", "T__87", "T__88", "T__89", "T__90", "T__91",
+                 "IDENTIFIER", "LETTER", "CHARACTER_LITERAL", "STRING_LITERAL",
+                 "HEX_LITERAL", "DECIMAL_LITERAL", "OCTAL_LITERAL", "HexDigit",
+                 "IntegerTypeSuffix", "FLOATING_POINT_LITERAL", "Exponent",
+                 "FloatTypeSuffix", "EscapeSequence", "OctalEscape", "UnicodeEscape",
+                 "WS", "BS", "UnicodeVocabulary", "COMMENT", "LINE_COMMENT",
+                 "LINE_COMMAND"]
 
     grammarFileName = "C.g4"
 
     # @param  output= sys.stdout Type: TextIO
-    def __init__(self,input=None,output= sys.stdout):
+    def __init__(self, input=None, output=sys.stdout):
         super().__init__(input, output)
         self.checkVersion("4.7.1")
-        self._interp = LexerATNSimulator(self, self.atn, self.decisionsToDFA, PredictionContextCache())
+        self._interp = LexerATNSimulator(
+            self, self.atn, self.decisionsToDFA, PredictionContextCache())
         self._actions = None
         self._predicates = None
 
+    def printTokenInfo(self, line, offset, tokenText):
+        print(str(line) + ',' + str(offset) + ':' + str(tokenText))
 
-
-    def printTokenInfo(self,line,offset,tokenText):
-        print(str(line)+ ',' + str(offset) + ':' + str(tokenText))
-
-    def StorePredicateExpression(self,StartLine,StartOffset,EndLine,EndOffset,Text):
-        PredExp = CodeFragment.PredicateExpression(Text, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StorePredicateExpression(self, StartLine, StartOffset, EndLine, EndOffset, Text):
+        PredExp = CodeFragment.PredicateExpression(
+            Text, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.PredicateExpressionList.append(PredExp)
 
-    def StoreEnumerationDefinition(self,StartLine,StartOffset,EndLine,EndOffset,Text):
-        EnumDef = CodeFragment.EnumerationDefinition(Text, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StoreEnumerationDefinition(self, StartLine, StartOffset, EndLine, EndOffset, Text):
+        EnumDef = CodeFragment.EnumerationDefinition(
+            Text, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.EnumerationDefinitionList.append(EnumDef)
 
-    def StoreStructUnionDefinition(self,StartLine,StartOffset,EndLine,EndOffset,Text):
-        SUDef = CodeFragment.StructUnionDefinition(Text, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StoreStructUnionDefinition(self, StartLine, StartOffset, EndLine, EndOffset, Text):
+        SUDef = CodeFragment.StructUnionDefinition(
+            Text, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.StructUnionDefinitionList.append(SUDef)
 
-    def StoreTypedefDefinition(self,StartLine,StartOffset,EndLine,EndOffset,FromText,ToText):
-        Tdef = CodeFragment.TypedefDefinition(FromText, ToText, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StoreTypedefDefinition(self, StartLine, StartOffset, EndLine, EndOffset, FromText, ToText):
+        Tdef = CodeFragment.TypedefDefinition(
+            FromText, ToText, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.TypedefDefinitionList.append(Tdef)
 
-    def StoreFunctionDefinition(self,StartLine,StartOffset,EndLine,EndOffset,ModifierText,DeclText,LeftBraceLine,LeftBraceOffset,DeclLine,DeclOffset):
-        FuncDef = CodeFragment.FunctionDefinition(ModifierText, DeclText, (StartLine, StartOffset), (EndLine, EndOffset), (LeftBraceLine, LeftBraceOffset), (DeclLine, DeclOffset))
+    def StoreFunctionDefinition(self, StartLine, StartOffset, EndLine, EndOffset, ModifierText, DeclText, LeftBraceLine, LeftBraceOffset, DeclLine, DeclOffset):
+        FuncDef = CodeFragment.FunctionDefinition(ModifierText, DeclText, (StartLine, StartOffset), (
+            EndLine, EndOffset), (LeftBraceLine, LeftBraceOffset), (DeclLine, DeclOffset))
         FileProfile.FunctionDefinitionList.append(FuncDef)
 
-    def StoreVariableDeclaration(self,StartLine,StartOffset,EndLine,EndOffset,ModifierText,DeclText):
-        VarDecl = CodeFragment.VariableDeclaration(ModifierText, DeclText, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StoreVariableDeclaration(self, StartLine, StartOffset, EndLine, EndOffset, ModifierText, DeclText):
+        VarDecl = CodeFragment.VariableDeclaration(
+            ModifierText, DeclText, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.VariableDeclarationList.append(VarDecl)
 
-    def StoreFunctionCalling(self,StartLine,StartOffset,EndLine,EndOffset,FuncName,ParamList):
-        FuncCall = CodeFragment.FunctionCalling(FuncName, ParamList, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StoreFunctionCalling(self, StartLine, StartOffset, EndLine, EndOffset, FuncName, ParamList):
+        FuncCall = CodeFragment.FunctionCalling(
+            FuncName, ParamList, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.FunctionCallingList.append(FuncCall)
-
-
-
diff --git a/BaseTools/Source/Python/Eot/CParser4/CListener.py b/BaseTools/Source/Python/Eot/CParser4/CListener.py
index 46f7f1b3d1ca..866d5717d42b 100644
--- a/BaseTools/Source/Python/Eot/CParser4/CListener.py
+++ b/BaseTools/Source/Python/Eot/CParser4/CListener.py
@@ -5,7 +5,7 @@ if __name__ is not None and "." in __name__:
 else:
     from CParser import CParser
 
-## @file
+# @file
 # The file defines the parser for C source files.
 #
 # THIS FILE IS AUTO-GENERATED. PLEASE DO NOT MODIFY THIS FILE.
@@ -27,782 +27,710 @@ class CListener(ParseTreeListener):
 
     # Enter a parse tree produced by CParser#translation_unit.
     # @param  ctx Type: CParser.Translation_unitContext
-    def enterTranslation_unit(self,ctx):
+    def enterTranslation_unit(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#translation_unit.
     # @param  ctx Type: CParser.Translation_unitContext
-    def exitTranslation_unit(self,ctx):
+    def exitTranslation_unit(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#external_declaration.
     # @param  ctx Type: CParser.External_declarationContext
-    def enterExternal_declaration(self,ctx):
+    def enterExternal_declaration(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#external_declaration.
     # @param  ctx Type: CParser.External_declarationContext
-    def exitExternal_declaration(self,ctx):
+    def exitExternal_declaration(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#function_definition.
     # @param  ctx Type: CParser.Function_definitionContext
-    def enterFunction_definition(self,ctx):
+    def enterFunction_definition(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#function_definition.
     # @param  ctx Type: CParser.Function_definitionContext
-    def exitFunction_definition(self,ctx):
+    def exitFunction_definition(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#declaration_specifiers.
     # @param  ctx Type: CParser.Declaration_specifiersContext
-    def enterDeclaration_specifiers(self,ctx):
+    def enterDeclaration_specifiers(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#declaration_specifiers.
     # @param  ctx Type: CParser.Declaration_specifiersContext
-    def exitDeclaration_specifiers(self,ctx):
+    def exitDeclaration_specifiers(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#declaration.
     # @param  ctx Type: CParser.DeclarationContext
-    def enterDeclaration(self,ctx):
+    def enterDeclaration(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#declaration.
     # @param  ctx Type: CParser.DeclarationContext
-    def exitDeclaration(self,ctx):
+    def exitDeclaration(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#init_declarator_list.
     # @param  ctx Type: CParser.Init_declarator_listContext
-    def enterInit_declarator_list(self,ctx):
+    def enterInit_declarator_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#init_declarator_list.
     # @param  ctx Type: CParser.Init_declarator_listContext
-    def exitInit_declarator_list(self,ctx):
+    def exitInit_declarator_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#init_declarator.
     # @param  ctx Type: CParser.Init_declaratorContext
-    def enterInit_declarator(self,ctx):
+    def enterInit_declarator(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#init_declarator.
     # @param  ctx Type: CParser.Init_declaratorContext
-    def exitInit_declarator(self,ctx):
+    def exitInit_declarator(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#storage_class_specifier.
     # @param  ctx Type: CParser.Storage_class_specifierContext
-    def enterStorage_class_specifier(self,ctx):
+    def enterStorage_class_specifier(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#storage_class_specifier.
     # @param  ctx Type: CParser.Storage_class_specifierContext
-    def exitStorage_class_specifier(self,ctx):
+    def exitStorage_class_specifier(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#type_specifier.
     # @param  ctx Type: CParser.Type_specifierContext
-    def enterType_specifier(self,ctx):
+    def enterType_specifier(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#type_specifier.
     # @param  ctx Type: CParser.Type_specifierContext
-    def exitType_specifier(self,ctx):
+    def exitType_specifier(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#type_id.
     # @param  ctx Type: CParser.Type_idContext
-    def enterType_id(self,ctx):
+    def enterType_id(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#type_id.
     # @param  ctx Type: CParser.Type_idContext
-    def exitType_id(self,ctx):
+    def exitType_id(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#struct_or_union_specifier.
     # @param  ctx Type: CParser.Struct_or_union_specifierContext
-    def enterStruct_or_union_specifier(self,ctx):
+    def enterStruct_or_union_specifier(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#struct_or_union_specifier.
     # @param  ctx Type: CParser.Struct_or_union_specifierContext
-    def exitStruct_or_union_specifier(self,ctx):
+    def exitStruct_or_union_specifier(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#struct_or_union.
     # @param  ctx Type: CParser.Struct_or_unionContext
-    def enterStruct_or_union(self,ctx):
+    def enterStruct_or_union(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#struct_or_union.
     # @param  ctx Type: CParser.Struct_or_unionContext
-    def exitStruct_or_union(self,ctx):
+    def exitStruct_or_union(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#struct_declaration_list.
     # @param  ctx Type: CParser.Struct_declaration_listContext
-    def enterStruct_declaration_list(self,ctx):
+    def enterStruct_declaration_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#struct_declaration_list.
     # @param  ctx Type: CParser.Struct_declaration_listContext
-    def exitStruct_declaration_list(self,ctx):
+    def exitStruct_declaration_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#struct_declaration.
     # @param  ctx Type: CParser.Struct_declarationContext
-    def enterStruct_declaration(self,ctx):
+    def enterStruct_declaration(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#struct_declaration.
     # @param  ctx Type: CParser.Struct_declarationContext
-    def exitStruct_declaration(self,ctx):
+    def exitStruct_declaration(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#specifier_qualifier_list.
     # @param  ctx Type: CParser.Specifier_qualifier_listContext
-    def enterSpecifier_qualifier_list(self,ctx):
+    def enterSpecifier_qualifier_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#specifier_qualifier_list.
     # @param  ctx Type: CParser.Specifier_qualifier_listContext
-    def exitSpecifier_qualifier_list(self,ctx):
+    def exitSpecifier_qualifier_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#struct_declarator_list.
     # @param  ctx Type: CParser.Struct_declarator_listContext
-    def enterStruct_declarator_list(self,ctx):
+    def enterStruct_declarator_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#struct_declarator_list.
     # @param  ctx Type: CParser.Struct_declarator_listContext
-    def exitStruct_declarator_list(self,ctx):
+    def exitStruct_declarator_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#struct_declarator.
     # @param  ctx Type: CParser.Struct_declaratorContext
-    def enterStruct_declarator(self,ctx):
+    def enterStruct_declarator(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#struct_declarator.
     # @param  ctx Type: CParser.Struct_declaratorContext
-    def exitStruct_declarator(self,ctx):
+    def exitStruct_declarator(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#enum_specifier.
     # @param  ctx Type: CParser.Enum_specifierContext
-    def enterEnum_specifier(self,ctx):
+    def enterEnum_specifier(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#enum_specifier.
     # @param  ctx Type: CParser.Enum_specifierContext
-    def exitEnum_specifier(self,ctx):
+    def exitEnum_specifier(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#enumerator_list.
     # @param  ctx Type: CParser.Enumerator_listContext
-    def enterEnumerator_list(self,ctx):
+    def enterEnumerator_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#enumerator_list.
     # @param  ctx Type: CParser.Enumerator_listContext
-    def exitEnumerator_list(self,ctx):
+    def exitEnumerator_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#enumerator.
     # @param  ctx Type: CParser.EnumeratorContext
-    def enterEnumerator(self,ctx):
+    def enterEnumerator(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#enumerator.
     # @param  ctx Type: CParser.EnumeratorContext
-    def exitEnumerator(self,ctx):
+    def exitEnumerator(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#type_qualifier.
     # @param  ctx Type: CParser.Type_qualifierContext
-    def enterType_qualifier(self,ctx):
+    def enterType_qualifier(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#type_qualifier.
     # @param  ctx Type: CParser.Type_qualifierContext
-    def exitType_qualifier(self,ctx):
+    def exitType_qualifier(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#declarator.
     # @param  ctx Type: CParser.DeclaratorContext
-    def enterDeclarator(self,ctx):
+    def enterDeclarator(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#declarator.
     # @param  ctx Type: CParser.DeclaratorContext
-    def exitDeclarator(self,ctx):
+    def exitDeclarator(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#direct_declarator.
     # @param  ctx Type: CParser.Direct_declaratorContext
-    def enterDirect_declarator(self,ctx):
+    def enterDirect_declarator(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#direct_declarator.
     # @param  ctx Type: CParser.Direct_declaratorContext
-    def exitDirect_declarator(self,ctx):
+    def exitDirect_declarator(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#declarator_suffix.
     # @param  ctx Type: CParser.Declarator_suffixContext
-    def enterDeclarator_suffix(self,ctx):
+    def enterDeclarator_suffix(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#declarator_suffix.
     # @param  ctx Type: CParser.Declarator_suffixContext
-    def exitDeclarator_suffix(self,ctx):
+    def exitDeclarator_suffix(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#pointer.
     # @param  ctx Type: CParser.PointerContext
-    def enterPointer(self,ctx):
+    def enterPointer(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#pointer.
     # @param  ctx Type: CParser.PointerContext
-    def exitPointer(self,ctx):
+    def exitPointer(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#parameter_type_list.
     # @param  ctx Type: CParser.Parameter_type_listContext
-    def enterParameter_type_list(self,ctx):
+    def enterParameter_type_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#parameter_type_list.
     # @param  ctx Type: CParser.Parameter_type_listContext
-    def exitParameter_type_list(self,ctx):
+    def exitParameter_type_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#parameter_list.
     # @param  ctx Type: CParser.Parameter_listContext
-    def enterParameter_list(self,ctx):
+    def enterParameter_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#parameter_list.
     # @param  ctx Type: CParser.Parameter_listContext
-    def exitParameter_list(self,ctx):
+    def exitParameter_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#parameter_declaration.
     # @param  ctx Type: CParser.Parameter_declarationContext
-    def enterParameter_declaration(self,ctx):
+    def enterParameter_declaration(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#parameter_declaration.
     # @param  ctx Type: CParser.Parameter_declarationContext
-    def exitParameter_declaration(self,ctx):
+    def exitParameter_declaration(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#identifier_list.
     # @param  ctx Type: CParser.Identifier_listContext
-    def enterIdentifier_list(self,ctx):
+    def enterIdentifier_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#identifier_list.
     # @param  ctx Type: CParser.Identifier_listContext
-    def exitIdentifier_list(self,ctx):
+    def exitIdentifier_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#type_name.
     # @param  ctx Type: CParser.Type_nameContext
-    def enterType_name(self,ctx):
+    def enterType_name(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#type_name.
     # @param  ctx Type: CParser.Type_nameContext
-    def exitType_name(self,ctx):
+    def exitType_name(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#abstract_declarator.
     # @param  ctx Type: CParser.Abstract_declaratorContext
-    def enterAbstract_declarator(self,ctx):
+    def enterAbstract_declarator(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#abstract_declarator.
     # @param  ctx Type: CParser.Abstract_declaratorContext
-    def exitAbstract_declarator(self,ctx):
+    def exitAbstract_declarator(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#direct_abstract_declarator.
     # @param  ctx Type: CParser.Direct_abstract_declaratorContext
-    def enterDirect_abstract_declarator(self,ctx):
+    def enterDirect_abstract_declarator(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#direct_abstract_declarator.
     # @param  ctx Type: CParser.Direct_abstract_declaratorContext
-    def exitDirect_abstract_declarator(self,ctx):
+    def exitDirect_abstract_declarator(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#abstract_declarator_suffix.
     # @param  ctx Type: CParser.Abstract_declarator_suffixContext
-    def enterAbstract_declarator_suffix(self,ctx):
+    def enterAbstract_declarator_suffix(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#abstract_declarator_suffix.
     # @param  ctx Type: CParser.Abstract_declarator_suffixContext
-    def exitAbstract_declarator_suffix(self,ctx):
+    def exitAbstract_declarator_suffix(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#initializer.
     # @param  ctx Type: CParser.InitializerContext
-    def enterInitializer(self,ctx):
+    def enterInitializer(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#initializer.
     # @param  ctx Type: CParser.InitializerContext
-    def exitInitializer(self,ctx):
+    def exitInitializer(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#initializer_list.
     # @param  ctx Type: CParser.Initializer_listContext
-    def enterInitializer_list(self,ctx):
+    def enterInitializer_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#initializer_list.
     # @param  ctx Type: CParser.Initializer_listContext
-    def exitInitializer_list(self,ctx):
+    def exitInitializer_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#argument_expression_list.
     # @param  ctx Type: CParser.Argument_expression_listContext
-    def enterArgument_expression_list(self,ctx):
+    def enterArgument_expression_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#argument_expression_list.
     # @param  ctx Type: CParser.Argument_expression_listContext
-    def exitArgument_expression_list(self,ctx):
+    def exitArgument_expression_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#additive_expression.
     # @param  ctx Type: CParser.Additive_expressionContext
-    def enterAdditive_expression(self,ctx):
+    def enterAdditive_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#additive_expression.
     # @param  ctx Type: CParser.Additive_expressionContext
-    def exitAdditive_expression(self,ctx):
+    def exitAdditive_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#multiplicative_expression.
     # @param  ctx Type: CParser.Multiplicative_expressionContext
-    def enterMultiplicative_expression(self,ctx):
+    def enterMultiplicative_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#multiplicative_expression.
     # @param  ctx Type: CParser.Multiplicative_expressionContext
-    def exitMultiplicative_expression(self,ctx):
+    def exitMultiplicative_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#cast_expression.
     # @param  ctx Type: CParser.Cast_expressionContext
-    def enterCast_expression(self,ctx):
+    def enterCast_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#cast_expression.
     # @param  ctx Type: CParser.Cast_expressionContext
-    def exitCast_expression(self,ctx):
+    def exitCast_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#unary_expression.
     # @param  ctx Type: CParser.Unary_expressionContext
-    def enterUnary_expression(self,ctx):
+    def enterUnary_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#unary_expression.
     # @param  ctx Type: CParser.Unary_expressionContext
-    def exitUnary_expression(self,ctx):
+    def exitUnary_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#postfix_expression.
     # @param  ctx Type: CParser.Postfix_expressionContext
-    def enterPostfix_expression(self,ctx):
+    def enterPostfix_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#postfix_expression.
     # @param  ctx Type: CParser.Postfix_expressionContext
-    def exitPostfix_expression(self,ctx):
+    def exitPostfix_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#macro_parameter_list.
     # @param  ctx Type: CParser.Macro_parameter_listContext
-    def enterMacro_parameter_list(self,ctx):
+    def enterMacro_parameter_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#macro_parameter_list.
     # @param  ctx Type: CParser.Macro_parameter_listContext
-    def exitMacro_parameter_list(self,ctx):
+    def exitMacro_parameter_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#unary_operator.
     # @param  ctx Type: CParser.Unary_operatorContext
-    def enterUnary_operator(self,ctx):
+    def enterUnary_operator(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#unary_operator.
     # @param  ctx Type: CParser.Unary_operatorContext
-    def exitUnary_operator(self,ctx):
+    def exitUnary_operator(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#primary_expression.
     # @param  ctx Type: CParser.Primary_expressionContext
-    def enterPrimary_expression(self,ctx):
+    def enterPrimary_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#primary_expression.
     # @param  ctx Type: CParser.Primary_expressionContext
-    def exitPrimary_expression(self,ctx):
+    def exitPrimary_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#constant.
     # @param  ctx Type: CParser.ConstantContext
-    def enterConstant(self,ctx):
+    def enterConstant(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#constant.
     # @param  ctx Type: CParser.ConstantContext
-    def exitConstant(self,ctx):
+    def exitConstant(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#expression.
     # @param  ctx Type: CParser.ExpressionContext
-    def enterExpression(self,ctx):
+    def enterExpression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#expression.
     # @param  ctx Type: CParser.ExpressionContext
-    def exitExpression(self,ctx):
+    def exitExpression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#constant_expression.
     # @param  ctx Type: CParser.Constant_expressionContext
-    def enterConstant_expression(self,ctx):
+    def enterConstant_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#constant_expression.
     # @param  ctx Type: CParser.Constant_expressionContext
-    def exitConstant_expression(self,ctx):
+    def exitConstant_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#assignment_expression.
     # @param  ctx Type: CParser.Assignment_expressionContext
-    def enterAssignment_expression(self,ctx):
+    def enterAssignment_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#assignment_expression.
     # @param  ctx Type: CParser.Assignment_expressionContext
-    def exitAssignment_expression(self,ctx):
+    def exitAssignment_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#lvalue.
     # @param  ctx Type: CParser.LvalueContext
-    def enterLvalue(self,ctx):
+    def enterLvalue(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#lvalue.
     # @param  ctx Type: CParser.LvalueContext
-    def exitLvalue(self,ctx):
+    def exitLvalue(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#assignment_operator.
     # @param  ctx Type: CParser.Assignment_operatorContext
-    def enterAssignment_operator(self,ctx):
+    def enterAssignment_operator(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#assignment_operator.
     # @param  ctx Type: CParser.Assignment_operatorContext
-    def exitAssignment_operator(self,ctx):
+    def exitAssignment_operator(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#conditional_expression.
     # @param  ctx Type: CParser.Conditional_expressionContext
-    def enterConditional_expression(self,ctx):
+    def enterConditional_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#conditional_expression.
     # @param  ctx Type: CParser.Conditional_expressionContext
-    def exitConditional_expression(self,ctx):
+    def exitConditional_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#logical_or_expression.
     # @param  ctx Type: CParser.Logical_or_expressionContext
-    def enterLogical_or_expression(self,ctx):
+    def enterLogical_or_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#logical_or_expression.
     # @param  ctx Type: CParser.Logical_or_expressionContext
-    def exitLogical_or_expression(self,ctx):
+    def exitLogical_or_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#logical_and_expression.
     # @param  ctx Type: CParser.Logical_and_expressionContext
-    def enterLogical_and_expression(self,ctx):
+    def enterLogical_and_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#logical_and_expression.
     # @param  ctx Type: CParser.Logical_and_expressionContext
-    def exitLogical_and_expression(self,ctx):
+    def exitLogical_and_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#inclusive_or_expression.
     # @param  ctx Type: CParser.Inclusive_or_expressionContext
-    def enterInclusive_or_expression(self,ctx):
+    def enterInclusive_or_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#inclusive_or_expression.
     # @param  ctx Type: CParser.Inclusive_or_expressionContext
-    def exitInclusive_or_expression(self,ctx):
+    def exitInclusive_or_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#exclusive_or_expression.
     # @param  ctx Type: CParser.Exclusive_or_expressionContext
-    def enterExclusive_or_expression(self,ctx):
+    def enterExclusive_or_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#exclusive_or_expression.
     # @param  ctx Type: CParser.Exclusive_or_expressionContext
-    def exitExclusive_or_expression(self,ctx):
+    def exitExclusive_or_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#and_expression.
     # @param  ctx Type: CParser.And_expressionContext
-    def enterAnd_expression(self,ctx):
+    def enterAnd_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#and_expression.
     # @param  ctx Type: CParser.And_expressionContext
-    def exitAnd_expression(self,ctx):
+    def exitAnd_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#equality_expression.
     # @param  ctx Type: CParser.Equality_expressionContext
-    def enterEquality_expression(self,ctx):
+    def enterEquality_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#equality_expression.
     # @param  ctx Type: CParser.Equality_expressionContext
-    def exitEquality_expression(self,ctx):
+    def exitEquality_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#relational_expression.
     # @param  ctx Type: CParser.Relational_expressionContext
-    def enterRelational_expression(self,ctx):
+    def enterRelational_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#relational_expression.
     # @param  ctx Type: CParser.Relational_expressionContext
-    def exitRelational_expression(self,ctx):
+    def exitRelational_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#shift_expression.
     # @param  ctx Type: CParser.Shift_expressionContext
-    def enterShift_expression(self,ctx):
+    def enterShift_expression(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#shift_expression.
     # @param  ctx Type: CParser.Shift_expressionContext
-    def exitShift_expression(self,ctx):
+    def exitShift_expression(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#statement.
     # @param  ctx Type: CParser.StatementContext
-    def enterStatement(self,ctx):
+    def enterStatement(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#statement.
     # @param  ctx Type: CParser.StatementContext
-    def exitStatement(self,ctx):
+    def exitStatement(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#asm2_statement.
     # @param  ctx Type: CParser.Asm2_statementContext
-    def enterAsm2_statement(self,ctx):
+    def enterAsm2_statement(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#asm2_statement.
     # @param  ctx Type: CParser.Asm2_statementContext
-    def exitAsm2_statement(self,ctx):
+    def exitAsm2_statement(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#asm1_statement.
     # @param  ctx Type: CParser.Asm1_statementContext
-    def enterAsm1_statement(self,ctx):
+    def enterAsm1_statement(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#asm1_statement.
     # @param  ctx Type: CParser.Asm1_statementContext
-    def exitAsm1_statement(self,ctx):
+    def exitAsm1_statement(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#asm_statement.
     # @param  ctx Type: CParser.Asm_statementContext
-    def enterAsm_statement(self,ctx):
+    def enterAsm_statement(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#asm_statement.
     # @param  ctx Type: CParser.Asm_statementContext
-    def exitAsm_statement(self,ctx):
+    def exitAsm_statement(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#macro_statement.
     # @param  ctx Type: CParser.Macro_statementContext
-    def enterMacro_statement(self,ctx):
+    def enterMacro_statement(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#macro_statement.
     # @param  ctx Type: CParser.Macro_statementContext
-    def exitMacro_statement(self,ctx):
+    def exitMacro_statement(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#labeled_statement.
     # @param  ctx Type: CParser.Labeled_statementContext
-    def enterLabeled_statement(self,ctx):
+    def enterLabeled_statement(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#labeled_statement.
     # @param  ctx Type: CParser.Labeled_statementContext
-    def exitLabeled_statement(self,ctx):
+    def exitLabeled_statement(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#compound_statement.
     # @param  ctx Type: CParser.Compound_statementContext
-    def enterCompound_statement(self,ctx):
+    def enterCompound_statement(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#compound_statement.
     # @param  ctx Type: CParser.Compound_statementContext
-    def exitCompound_statement(self,ctx):
+    def exitCompound_statement(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#statement_list.
     # @param  ctx Type: CParser.Statement_listContext
-    def enterStatement_list(self,ctx):
+    def enterStatement_list(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#statement_list.
     # @param  ctx Type: CParser.Statement_listContext
-    def exitStatement_list(self,ctx):
+    def exitStatement_list(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#expression_statement.
     # @param  ctx Type: CParser.Expression_statementContext
-    def enterExpression_statement(self,ctx):
+    def enterExpression_statement(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#expression_statement.
     # @param  ctx Type: CParser.Expression_statementContext
-    def exitExpression_statement(self,ctx):
+    def exitExpression_statement(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#selection_statement.
     # @param  ctx Type: CParser.Selection_statementContext
-    def enterSelection_statement(self,ctx):
+    def enterSelection_statement(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#selection_statement.
     # @param  ctx Type: CParser.Selection_statementContext
-    def exitSelection_statement(self,ctx):
+    def exitSelection_statement(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#iteration_statement.
     # @param  ctx Type: CParser.Iteration_statementContext
-    def enterIteration_statement(self,ctx):
+    def enterIteration_statement(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#iteration_statement.
     # @param  ctx Type: CParser.Iteration_statementContext
-    def exitIteration_statement(self,ctx):
+    def exitIteration_statement(self, ctx):
         pass
 
-
     # Enter a parse tree produced by CParser#jump_statement.
     # @param  ctx Type: CParser.Jump_statementContext
-    def enterJump_statement(self,ctx):
+    def enterJump_statement(self, ctx):
         pass
 
     # Exit a parse tree produced by CParser#jump_statement.
     # @param  ctx Type: CParser.Jump_statementContext
-    def exitJump_statement(self,ctx):
+    def exitJump_statement(self, ctx):
         pass
-
-
diff --git a/BaseTools/Source/Python/Eot/CParser4/CParser.py b/BaseTools/Source/Python/Eot/CParser4/CParser.py
index 31d23d55aa57..22c17c66680a 100644
--- a/BaseTools/Source/Python/Eot/CParser4/CParser.py
+++ b/BaseTools/Source/Python/Eot/CParser4/CParser.py
@@ -6,7 +6,7 @@ from typing.io import TextIO
 import sys
 
 
-## @file
+# @file
 # The file defines the parser for C source files.
 #
 # THIS FILE IS AUTO-GENERATED. PLEASE DO NOT MODIFY THIS FILE.
@@ -22,6 +22,7 @@ import sys
 import Ecc.CodeFragment as CodeFragment
 import Ecc.FileProfile as FileProfile
 
+
 def serializedATN():
     with StringIO() as buf:
         buf.write("\3\u608b\ua72a\u8133\ub9ed\u417c\u3be7\u7786\u5964\3k")
@@ -475,61 +476,61 @@ def serializedATN():
         return buf.getvalue()
 
 
-class CParser ( Parser ):
+class CParser (Parser):
 
     grammarFileName = "C.g4"
 
     atn = ATNDeserializer().deserialize(serializedATN())
 
-    decisionsToDFA = [ DFA(ds, i) for i, ds in enumerate(atn.decisionToState) ]
+    decisionsToDFA = [DFA(ds, i) for i, ds in enumerate(atn.decisionToState)]
 
     sharedContextCache = PredictionContextCache()
 
-    literalNames = [ "<INVALID>", "'{'", "';'", "'typedef'", "','", "'='",
-                     "'extern'", "'static'", "'auto'", "'register'", "'STATIC'",
-                     "'void'", "'char'", "'short'", "'int'", "'long'", "'float'",
-                     "'double'", "'signed'", "'unsigned'", "'}'", "'struct'",
-                     "'union'", "':'", "'enum'", "'const'", "'volatile'",
-                     "'IN'", "'OUT'", "'OPTIONAL'", "'CONST'", "'UNALIGNED'",
-                     "'VOLATILE'", "'GLOBAL_REMOVE_IF_UNREFERENCED'", "'EFIAPI'",
-                     "'EFI_BOOTSERVICE'", "'EFI_RUNTIMESERVICE'", "'PACKED'",
-                     "'('", "')'", "'['", "']'", "'*'", "'...'", "'+'",
-                     "'-'", "'/'", "'%'", "'++'", "'--'", "'sizeof'", "'.'",
-                     "'->'", "'&'", "'~'", "'!'", "'*='", "'/='", "'%='",
-                     "'+='", "'-='", "'<<='", "'>>='", "'&='", "'^='", "'|='",
-                     "'?'", "'||'", "'&&'", "'|'", "'^'", "'=='", "'!='",
-                     "'<'", "'>'", "'<='", "'>='", "'<<'", "'>>'", "'__asm__'",
-                     "'_asm'", "'__asm'", "'case'", "'default'", "'if'",
-                     "'else'", "'switch'", "'while'", "'do'", "'goto'",
-                     "'continue'", "'break'", "'return'" ]
+    literalNames = ["<INVALID>", "'{'", "';'", "'typedef'", "','", "'='",
+                    "'extern'", "'static'", "'auto'", "'register'", "'STATIC'",
+                    "'void'", "'char'", "'short'", "'int'", "'long'", "'float'",
+                    "'double'", "'signed'", "'unsigned'", "'}'", "'struct'",
+                    "'union'", "':'", "'enum'", "'const'", "'volatile'",
+                    "'IN'", "'OUT'", "'OPTIONAL'", "'CONST'", "'UNALIGNED'",
+                    "'VOLATILE'", "'GLOBAL_REMOVE_IF_UNREFERENCED'", "'EFIAPI'",
+                    "'EFI_BOOTSERVICE'", "'EFI_RUNTIMESERVICE'", "'PACKED'",
+                    "'('", "')'", "'['", "']'", "'*'", "'...'", "'+'",
+                    "'-'", "'/'", "'%'", "'++'", "'--'", "'sizeof'", "'.'",
+                    "'->'", "'&'", "'~'", "'!'", "'*='", "'/='", "'%='",
+                    "'+='", "'-='", "'<<='", "'>>='", "'&='", "'^='", "'|='",
+                    "'?'", "'||'", "'&&'", "'|'", "'^'", "'=='", "'!='",
+                    "'<'", "'>'", "'<='", "'>='", "'<<'", "'>>'", "'__asm__'",
+                    "'_asm'", "'__asm'", "'case'", "'default'", "'if'",
+                    "'else'", "'switch'", "'while'", "'do'", "'goto'",
+                    "'continue'", "'break'", "'return'"]
 
-    symbolicNames = [ "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
-                      "<INVALID>", "IDENTIFIER", "CHARACTER_LITERAL", "STRING_LITERAL",
-                      "HEX_LITERAL", "DECIMAL_LITERAL", "OCTAL_LITERAL",
-                      "FLOATING_POINT_LITERAL", "WS", "BS", "UnicodeVocabulary",
-                      "COMMENT", "LINE_COMMENT", "LINE_COMMAND" ]
+    symbolicNames = ["<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "<INVALID>", "<INVALID>", "<INVALID>",
+                     "<INVALID>", "IDENTIFIER", "CHARACTER_LITERAL", "STRING_LITERAL",
+                     "HEX_LITERAL", "DECIMAL_LITERAL", "OCTAL_LITERAL",
+                     "FLOATING_POINT_LITERAL", "WS", "BS", "UnicodeVocabulary",
+                     "COMMENT", "LINE_COMMENT", "LINE_COMMAND"]
 
     RULE_translation_unit = 0
     RULE_external_declaration = 1
@@ -603,225 +604,224 @@ class CParser ( Parser ):
     RULE_iteration_statement = 69
     RULE_jump_statement = 70
 
-    ruleNames =  [ "translation_unit", "external_declaration", "function_definition",
-                   "declaration_specifiers", "declaration", "init_declarator_list",
-                   "init_declarator", "storage_class_specifier", "type_specifier",
-                   "type_id", "struct_or_union_specifier", "struct_or_union",
-                   "struct_declaration_list", "struct_declaration", "specifier_qualifier_list",
-                   "struct_declarator_list", "struct_declarator", "enum_specifier",
-                   "enumerator_list", "enumerator", "type_qualifier", "declarator",
-                   "direct_declarator", "declarator_suffix", "pointer",
-                   "parameter_type_list", "parameter_list", "parameter_declaration",
-                   "identifier_list", "type_name", "abstract_declarator",
-                   "direct_abstract_declarator", "abstract_declarator_suffix",
-                   "initializer", "initializer_list", "argument_expression_list",
-                   "additive_expression", "multiplicative_expression", "cast_expression",
-                   "unary_expression", "postfix_expression", "macro_parameter_list",
-                   "unary_operator", "primary_expression", "constant", "expression",
-                   "constant_expression", "assignment_expression", "lvalue",
-                   "assignment_operator", "conditional_expression", "logical_or_expression",
-                   "logical_and_expression", "inclusive_or_expression",
-                   "exclusive_or_expression", "and_expression", "equality_expression",
-                   "relational_expression", "shift_expression", "statement",
-                   "asm2_statement", "asm1_statement", "asm_statement",
-                   "macro_statement", "labeled_statement", "compound_statement",
-                   "statement_list", "expression_statement", "selection_statement",
-                   "iteration_statement", "jump_statement" ]
+    ruleNames = ["translation_unit", "external_declaration", "function_definition",
+                 "declaration_specifiers", "declaration", "init_declarator_list",
+                 "init_declarator", "storage_class_specifier", "type_specifier",
+                 "type_id", "struct_or_union_specifier", "struct_or_union",
+                 "struct_declaration_list", "struct_declaration", "specifier_qualifier_list",
+                 "struct_declarator_list", "struct_declarator", "enum_specifier",
+                 "enumerator_list", "enumerator", "type_qualifier", "declarator",
+                 "direct_declarator", "declarator_suffix", "pointer",
+                 "parameter_type_list", "parameter_list", "parameter_declaration",
+                 "identifier_list", "type_name", "abstract_declarator",
+                 "direct_abstract_declarator", "abstract_declarator_suffix",
+                 "initializer", "initializer_list", "argument_expression_list",
+                 "additive_expression", "multiplicative_expression", "cast_expression",
+                 "unary_expression", "postfix_expression", "macro_parameter_list",
+                 "unary_operator", "primary_expression", "constant", "expression",
+                 "constant_expression", "assignment_expression", "lvalue",
+                 "assignment_operator", "conditional_expression", "logical_or_expression",
+                 "logical_and_expression", "inclusive_or_expression",
+                 "exclusive_or_expression", "and_expression", "equality_expression",
+                 "relational_expression", "shift_expression", "statement",
+                 "asm2_statement", "asm1_statement", "asm_statement",
+                 "macro_statement", "labeled_statement", "compound_statement",
+                 "statement_list", "expression_statement", "selection_statement",
+                 "iteration_statement", "jump_statement"]
 
     EOF = Token.EOF
-    T__0=1
-    T__1=2
-    T__2=3
-    T__3=4
-    T__4=5
-    T__5=6
-    T__6=7
-    T__7=8
-    T__8=9
-    T__9=10
-    T__10=11
-    T__11=12
-    T__12=13
-    T__13=14
-    T__14=15
-    T__15=16
-    T__16=17
-    T__17=18
-    T__18=19
-    T__19=20
-    T__20=21
-    T__21=22
-    T__22=23
-    T__23=24
-    T__24=25
-    T__25=26
-    T__26=27
-    T__27=28
-    T__28=29
-    T__29=30
-    T__30=31
-    T__31=32
-    T__32=33
-    T__33=34
-    T__34=35
-    T__35=36
-    T__36=37
-    T__37=38
-    T__38=39
-    T__39=40
-    T__40=41
-    T__41=42
-    T__42=43
-    T__43=44
-    T__44=45
-    T__45=46
-    T__46=47
-    T__47=48
-    T__48=49
-    T__49=50
-    T__50=51
-    T__51=52
-    T__52=53
-    T__53=54
-    T__54=55
-    T__55=56
-    T__56=57
-    T__57=58
-    T__58=59
-    T__59=60
-    T__60=61
-    T__61=62
-    T__62=63
-    T__63=64
-    T__64=65
-    T__65=66
-    T__66=67
-    T__67=68
-    T__68=69
-    T__69=70
-    T__70=71
-    T__71=72
-    T__72=73
-    T__73=74
-    T__74=75
-    T__75=76
-    T__76=77
-    T__77=78
-    T__78=79
-    T__79=80
-    T__80=81
-    T__81=82
-    T__82=83
-    T__83=84
-    T__84=85
-    T__85=86
-    T__86=87
-    T__87=88
-    T__88=89
-    T__89=90
-    T__90=91
-    T__91=92
-    IDENTIFIER=93
-    CHARACTER_LITERAL=94
-    STRING_LITERAL=95
-    HEX_LITERAL=96
-    DECIMAL_LITERAL=97
-    OCTAL_LITERAL=98
-    FLOATING_POINT_LITERAL=99
-    WS=100
-    BS=101
-    UnicodeVocabulary=102
-    COMMENT=103
-    LINE_COMMENT=104
-    LINE_COMMAND=105
+    T__0 = 1
+    T__1 = 2
+    T__2 = 3
+    T__3 = 4
+    T__4 = 5
+    T__5 = 6
+    T__6 = 7
+    T__7 = 8
+    T__8 = 9
+    T__9 = 10
+    T__10 = 11
+    T__11 = 12
+    T__12 = 13
+    T__13 = 14
+    T__14 = 15
+    T__15 = 16
+    T__16 = 17
+    T__17 = 18
+    T__18 = 19
+    T__19 = 20
+    T__20 = 21
+    T__21 = 22
+    T__22 = 23
+    T__23 = 24
+    T__24 = 25
+    T__25 = 26
+    T__26 = 27
+    T__27 = 28
+    T__28 = 29
+    T__29 = 30
+    T__30 = 31
+    T__31 = 32
+    T__32 = 33
+    T__33 = 34
+    T__34 = 35
+    T__35 = 36
+    T__36 = 37
+    T__37 = 38
+    T__38 = 39
+    T__39 = 40
+    T__40 = 41
+    T__41 = 42
+    T__42 = 43
+    T__43 = 44
+    T__44 = 45
+    T__45 = 46
+    T__46 = 47
+    T__47 = 48
+    T__48 = 49
+    T__49 = 50
+    T__50 = 51
+    T__51 = 52
+    T__52 = 53
+    T__53 = 54
+    T__54 = 55
+    T__55 = 56
+    T__56 = 57
+    T__57 = 58
+    T__58 = 59
+    T__59 = 60
+    T__60 = 61
+    T__61 = 62
+    T__62 = 63
+    T__63 = 64
+    T__64 = 65
+    T__65 = 66
+    T__66 = 67
+    T__67 = 68
+    T__68 = 69
+    T__69 = 70
+    T__70 = 71
+    T__71 = 72
+    T__72 = 73
+    T__73 = 74
+    T__74 = 75
+    T__75 = 76
+    T__76 = 77
+    T__77 = 78
+    T__78 = 79
+    T__79 = 80
+    T__80 = 81
+    T__81 = 82
+    T__82 = 83
+    T__83 = 84
+    T__84 = 85
+    T__85 = 86
+    T__86 = 87
+    T__87 = 88
+    T__88 = 89
+    T__89 = 90
+    T__90 = 91
+    T__91 = 92
+    IDENTIFIER = 93
+    CHARACTER_LITERAL = 94
+    STRING_LITERAL = 95
+    HEX_LITERAL = 96
+    DECIMAL_LITERAL = 97
+    OCTAL_LITERAL = 98
+    FLOATING_POINT_LITERAL = 99
+    WS = 100
+    BS = 101
+    UnicodeVocabulary = 102
+    COMMENT = 103
+    LINE_COMMENT = 104
+    LINE_COMMAND = 105
 
     # @param  input Type: TokenStream
     # @param  output= sys.stdout Type: TextIO
-    def __init__(self,input,output= sys.stdout):
+    def __init__(self, input, output=sys.stdout):
         super().__init__(input, output)
         self.checkVersion("4.7.1")
-        self._interp = ParserATNSimulator(self, self.atn, self.decisionsToDFA, self.sharedContextCache)
+        self._interp = ParserATNSimulator(
+            self, self.atn, self.decisionsToDFA, self.sharedContextCache)
         self._predicates = None
 
+    def printTokenInfo(self, line, offset, tokenText):
+        print(str(line) + ',' + str(offset) + ':' + str(tokenText))
 
-
-
-    def printTokenInfo(self,line,offset,tokenText):
-        print(str(line)+ ',' + str(offset) + ':' + str(tokenText))
-
-    def StorePredicateExpression(self,StartLine,StartOffset,EndLine,EndOffset,Text):
-        PredExp = CodeFragment.PredicateExpression(Text, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StorePredicateExpression(self, StartLine, StartOffset, EndLine, EndOffset, Text):
+        PredExp = CodeFragment.PredicateExpression(
+            Text, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.PredicateExpressionList.append(PredExp)
 
-    def StoreEnumerationDefinition(self,StartLine,StartOffset,EndLine,EndOffset,Text):
-        EnumDef = CodeFragment.EnumerationDefinition(Text, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StoreEnumerationDefinition(self, StartLine, StartOffset, EndLine, EndOffset, Text):
+        EnumDef = CodeFragment.EnumerationDefinition(
+            Text, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.EnumerationDefinitionList.append(EnumDef)
 
-    def StoreStructUnionDefinition(self,StartLine,StartOffset,EndLine,EndOffset,Text):
-        SUDef = CodeFragment.StructUnionDefinition(Text, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StoreStructUnionDefinition(self, StartLine, StartOffset, EndLine, EndOffset, Text):
+        SUDef = CodeFragment.StructUnionDefinition(
+            Text, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.StructUnionDefinitionList.append(SUDef)
 
-    def StoreTypedefDefinition(self,StartLine,StartOffset,EndLine,EndOffset,FromText,ToText):
-        Tdef = CodeFragment.TypedefDefinition(FromText, ToText, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StoreTypedefDefinition(self, StartLine, StartOffset, EndLine, EndOffset, FromText, ToText):
+        Tdef = CodeFragment.TypedefDefinition(
+            FromText, ToText, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.TypedefDefinitionList.append(Tdef)
 
-    def StoreFunctionDefinition(self,StartLine,StartOffset,EndLine,EndOffset,ModifierText,DeclText,LeftBraceLine,LeftBraceOffset,DeclLine,DeclOffset):
-        FuncDef = CodeFragment.FunctionDefinition(ModifierText, DeclText, (StartLine, StartOffset), (EndLine, EndOffset), (LeftBraceLine, LeftBraceOffset), (DeclLine, DeclOffset))
+    def StoreFunctionDefinition(self, StartLine, StartOffset, EndLine, EndOffset, ModifierText, DeclText, LeftBraceLine, LeftBraceOffset, DeclLine, DeclOffset):
+        FuncDef = CodeFragment.FunctionDefinition(ModifierText, DeclText, (StartLine, StartOffset), (
+            EndLine, EndOffset), (LeftBraceLine, LeftBraceOffset), (DeclLine, DeclOffset))
         FileProfile.FunctionDefinitionList.append(FuncDef)
 
-    def StoreVariableDeclaration(self,StartLine,StartOffset,EndLine,EndOffset,ModifierText,DeclText):
-        VarDecl = CodeFragment.VariableDeclaration(ModifierText, DeclText, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StoreVariableDeclaration(self, StartLine, StartOffset, EndLine, EndOffset, ModifierText, DeclText):
+        VarDecl = CodeFragment.VariableDeclaration(
+            ModifierText, DeclText, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.VariableDeclarationList.append(VarDecl)
 
-    def StoreFunctionCalling(self,StartLine,StartOffset,EndLine,EndOffset,FuncName,ParamList):
-        FuncCall = CodeFragment.FunctionCalling(FuncName, ParamList, (StartLine, StartOffset), (EndLine, EndOffset))
+    def StoreFunctionCalling(self, StartLine, StartOffset, EndLine, EndOffset, FuncName, ParamList):
+        FuncCall = CodeFragment.FunctionCalling(
+            FuncName, ParamList, (StartLine, StartOffset), (EndLine, EndOffset))
         FileProfile.FunctionCallingList.append(FuncCall)
 
-
-
     class Translation_unitContext(ParserRuleContext):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def external_declaration(self,i=None):
+        def external_declaration(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.External_declarationContext)
             else:
-                return self.getTypedRuleContext(CParser.External_declarationContext,i)
-
+                return self.getTypedRuleContext(CParser.External_declarationContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_translation_unit
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterTranslation_unit" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterTranslation_unit"):
                 listener.enterTranslation_unit(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitTranslation_unit" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitTranslation_unit"):
                 listener.exitTranslation_unit(self)
 
-
-
-
     def translation_unit(self):
 
         localctx = CParser.Translation_unitContext(self, self._ctx, self.state)
         self.enterRule(localctx, 0, self.RULE_translation_unit)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 145
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while (((_la) & ~0x3f) == 0 and ((1 << _la) & ((1 << CParser.T__2) | (1 << CParser.T__5) | (1 << CParser.T__6) | (1 << CParser.T__7) | (1 << CParser.T__8) | (1 << CParser.T__9) | (1 << CParser.T__10) | (1 << CParser.T__11) | (1 << CParser.T__12) | (1 << CParser.T__13) | (1 << CParser.T__14) | (1 << CParser.T__15) | (1 << CParser.T__16) | (1 << CParser.T__17) | (1 << CParser.T__18) | (1 << CParser.T__20) | (1 << CParser.T__21) | (1 << CParser.T__23) | (1 << CParser.T__24) | (1 << CParser.T__25) | (1 << CParser.T__26) | (1 << CParser.T__27) | (1 << CParser.T__28) | (1 << CParser.T__29) | (1 << CParser.T__30) | (1 << CParser.T__31) | (1 << CParser.T__32) | (1 << CParser.T__33) | (1 << CParser.T__34) | (1 << CParser.T__35) | (1 << CParser.T__36) | (1 << CParser.T__37) | (1 << CParser.T__41))) != 0) or _la==CParser.IDENTIFIER:
+            while (((_la) & ~0x3f) == 0 and ((1 << _la) & ((1 << CParser.T__2) | (1 << CParser.T__5) | (1 << CParser.T__6) | (1 << CParser.T__7) | (1 << CParser.T__8) | (1 << CParser.T__9) | (1 << CParser.T__10) | (1 << CParser.T__11) | (1 << CParser.T__12) | (1 << CParser.T__13) | (1 << CParser.T__14) | (1 << CParser.T__15) | (1 << CParser.T__16) | (1 << CParser.T__17) | (1 << CParser.T__18) | (1 << CParser.T__20) | (1 << CParser.T__21) | (1 << CParser.T__23) | (1 << CParser.T__24) | (1 << CParser.T__25) | (1 << CParser.T__26) | (1 << CParser.T__27) | (1 << CParser.T__28) | (1 << CParser.T__29) | (1 << CParser.T__30) | (1 << CParser.T__31) | (1 << CParser.T__32) | (1 << CParser.T__33) | (1 << CParser.T__34) | (1 << CParser.T__35) | (1 << CParser.T__36) | (1 << CParser.T__37) | (1 << CParser.T__41))) != 0) or _la == CParser.IDENTIFIER:
                 self.state = 142
                 self.external_declaration()
                 self.state = 147
@@ -840,75 +840,67 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def declarator(self):
-            return self.getTypedRuleContext(CParser.DeclaratorContext,0)
-
+            return self.getTypedRuleContext(CParser.DeclaratorContext, 0)
 
         def declaration_specifiers(self):
-            return self.getTypedRuleContext(CParser.Declaration_specifiersContext,0)
-
+            return self.getTypedRuleContext(CParser.Declaration_specifiersContext, 0)
 
         # @param  i=None Type: int
-        def declaration(self,i=None):
+        def declaration(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.DeclarationContext)
             else:
-                return self.getTypedRuleContext(CParser.DeclarationContext,i)
-
+                return self.getTypedRuleContext(CParser.DeclarationContext, i)
 
         def function_definition(self):
-            return self.getTypedRuleContext(CParser.Function_definitionContext,0)
-
+            return self.getTypedRuleContext(CParser.Function_definitionContext, 0)
 
         def macro_statement(self):
-            return self.getTypedRuleContext(CParser.Macro_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Macro_statementContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_external_declaration
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterExternal_declaration" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterExternal_declaration"):
                 listener.enterExternal_declaration(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitExternal_declaration" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitExternal_declaration"):
                 listener.exitExternal_declaration(self)
 
-
-
-
     def external_declaration(self):
 
-        localctx = CParser.External_declarationContext(self, self._ctx, self.state)
+        localctx = CParser.External_declarationContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 2, self.RULE_external_declaration)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.state = 166
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,4,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 4, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 149
                 self._errHandler.sync(self)
-                la_ = self._interp.adaptivePredict(self._input,1,self._ctx)
+                la_ = self._interp.adaptivePredict(self._input, 1, self._ctx)
                 if la_ == 1:
                     self.state = 148
                     self.declaration_specifiers()
 
-
                 self.state = 151
                 self.declarator()
                 self.state = 155
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                while (((_la) & ~0x3f) == 0 and ((1 << _la) & ((1 << CParser.T__2) | (1 << CParser.T__5) | (1 << CParser.T__6) | (1 << CParser.T__7) | (1 << CParser.T__8) | (1 << CParser.T__9) | (1 << CParser.T__10) | (1 << CParser.T__11) | (1 << CParser.T__12) | (1 << CParser.T__13) | (1 << CParser.T__14) | (1 << CParser.T__15) | (1 << CParser.T__16) | (1 << CParser.T__17) | (1 << CParser.T__18) | (1 << CParser.T__20) | (1 << CParser.T__21) | (1 << CParser.T__23) | (1 << CParser.T__24) | (1 << CParser.T__25) | (1 << CParser.T__26) | (1 << CParser.T__27) | (1 << CParser.T__28) | (1 << CParser.T__29) | (1 << CParser.T__30) | (1 << CParser.T__31) | (1 << CParser.T__32) | (1 << CParser.T__33) | (1 << CParser.T__34) | (1 << CParser.T__35) | (1 << CParser.T__36))) != 0) or _la==CParser.IDENTIFIER:
+                while (((_la) & ~0x3f) == 0 and ((1 << _la) & ((1 << CParser.T__2) | (1 << CParser.T__5) | (1 << CParser.T__6) | (1 << CParser.T__7) | (1 << CParser.T__8) | (1 << CParser.T__9) | (1 << CParser.T__10) | (1 << CParser.T__11) | (1 << CParser.T__12) | (1 << CParser.T__13) | (1 << CParser.T__14) | (1 << CParser.T__15) | (1 << CParser.T__16) | (1 << CParser.T__17) | (1 << CParser.T__18) | (1 << CParser.T__20) | (1 << CParser.T__21) | (1 << CParser.T__23) | (1 << CParser.T__24) | (1 << CParser.T__25) | (1 << CParser.T__26) | (1 << CParser.T__27) | (1 << CParser.T__28) | (1 << CParser.T__29) | (1 << CParser.T__30) | (1 << CParser.T__31) | (1 << CParser.T__32) | (1 << CParser.T__33) | (1 << CParser.T__34) | (1 << CParser.T__35) | (1 << CParser.T__36))) != 0) or _la == CParser.IDENTIFIER:
                     self.state = 152
                     self.declaration()
                     self.state = 157
@@ -938,14 +930,12 @@ class CParser ( Parser ):
                 self.state = 164
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__1:
+                if _la == CParser.T__1:
                     self.state = 163
                     self.match(CParser.T__1)
 
-
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -958,7 +948,7 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
             self.ModifierText = ''
@@ -967,71 +957,64 @@ class CParser ( Parser ):
             self.LBOffset = 0
             self.DeclLine = 0
             self.DeclOffset = 0
-            self.d = None # Declaration_specifiersContext
-            self._declaration_specifiers = None # Declaration_specifiersContext
-            self._declarator = None # DeclaratorContext
-            self.a = None # Compound_statementContext
-            self.b = None # Compound_statementContext
+            self.d = None  # Declaration_specifiersContext
+            self._declaration_specifiers = None  # Declaration_specifiersContext
+            self._declarator = None  # DeclaratorContext
+            self.a = None  # Compound_statementContext
+            self.b = None  # Compound_statementContext
 
         def declarator(self):
-            return self.getTypedRuleContext(CParser.DeclaratorContext,0)
-
+            return self.getTypedRuleContext(CParser.DeclaratorContext, 0)
 
         def compound_statement(self):
-            return self.getTypedRuleContext(CParser.Compound_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Compound_statementContext, 0)
 
         def declaration_specifiers(self):
-            return self.getTypedRuleContext(CParser.Declaration_specifiersContext,0)
-
+            return self.getTypedRuleContext(CParser.Declaration_specifiersContext, 0)
 
         # @param  i=None Type: int
-        def declaration(self,i=None):
+        def declaration(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.DeclarationContext)
             else:
-                return self.getTypedRuleContext(CParser.DeclarationContext,i)
-
+                return self.getTypedRuleContext(CParser.DeclarationContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_function_definition
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterFunction_definition" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterFunction_definition"):
                 listener.enterFunction_definition(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitFunction_definition" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitFunction_definition"):
                 listener.exitFunction_definition(self)
 
-
-
-
     def function_definition(self):
 
-        localctx = CParser.Function_definitionContext(self, self._ctx, self.state)
+        localctx = CParser.Function_definitionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 4, self.RULE_function_definition)
 
-        ModifierText = '';
-        DeclText = '';
-        LBLine = 0;
-        LBOffset = 0;
-        DeclLine = 0;
-        DeclOffset = 0;
+        ModifierText = ''
+        DeclText = ''
+        LBLine = 0
+        LBOffset = 0
+        DeclLine = 0
+        DeclOffset = 0
 
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 169
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,5,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 5, self._ctx)
             if la_ == 1:
                 self.state = 168
                 localctx.d = localctx._declaration_specifiers = self.declaration_specifiers()
 
-
             self.state = 171
             localctx._declarator = self.declarator()
             self.state = 180
@@ -1047,7 +1030,7 @@ class CParser ( Parser ):
                     self.state = 175
                     self._errHandler.sync(self)
                     _la = self._input.LA(1)
-                    if not ((((_la) & ~0x3f) == 0 and ((1 << _la) & ((1 << CParser.T__2) | (1 << CParser.T__5) | (1 << CParser.T__6) | (1 << CParser.T__7) | (1 << CParser.T__8) | (1 << CParser.T__9) | (1 << CParser.T__10) | (1 << CParser.T__11) | (1 << CParser.T__12) | (1 << CParser.T__13) | (1 << CParser.T__14) | (1 << CParser.T__15) | (1 << CParser.T__16) | (1 << CParser.T__17) | (1 << CParser.T__18) | (1 << CParser.T__20) | (1 << CParser.T__21) | (1 << CParser.T__23) | (1 << CParser.T__24) | (1 << CParser.T__25) | (1 << CParser.T__26) | (1 << CParser.T__27) | (1 << CParser.T__28) | (1 << CParser.T__29) | (1 << CParser.T__30) | (1 << CParser.T__31) | (1 << CParser.T__32) | (1 << CParser.T__33) | (1 << CParser.T__34) | (1 << CParser.T__35) | (1 << CParser.T__36))) != 0) or _la==CParser.IDENTIFIER):
+                    if not ((((_la) & ~0x3f) == 0 and ((1 << _la) & ((1 << CParser.T__2) | (1 << CParser.T__5) | (1 << CParser.T__6) | (1 << CParser.T__7) | (1 << CParser.T__8) | (1 << CParser.T__9) | (1 << CParser.T__10) | (1 << CParser.T__11) | (1 << CParser.T__12) | (1 << CParser.T__13) | (1 << CParser.T__14) | (1 << CParser.T__15) | (1 << CParser.T__16) | (1 << CParser.T__17) | (1 << CParser.T__18) | (1 << CParser.T__20) | (1 << CParser.T__21) | (1 << CParser.T__23) | (1 << CParser.T__24) | (1 << CParser.T__25) | (1 << CParser.T__26) | (1 << CParser.T__27) | (1 << CParser.T__28) | (1 << CParser.T__29) | (1 << CParser.T__30) | (1 << CParser.T__31) | (1 << CParser.T__32) | (1 << CParser.T__33) | (1 << CParser.T__34) | (1 << CParser.T__35) | (1 << CParser.T__36))) != 0) or _la == CParser.IDENTIFIER):
                         break
 
                 self.state = 177
@@ -1060,24 +1043,30 @@ class CParser ( Parser ):
             else:
                 raise NoViableAltException(self)
 
-
             if localctx.d != None:
-                ModifierText = (None if localctx._declaration_specifiers is None else self._input.getText((localctx._declaration_specifiers.start,localctx._declaration_specifiers.stop)))
+                ModifierText = (None if localctx._declaration_specifiers is None else self._input.getText(
+                    (localctx._declaration_specifiers.start, localctx._declaration_specifiers.stop)))
             else:
                 ModifierText = ''
-            DeclText = (None if localctx._declarator is None else self._input.getText((localctx._declarator.start,localctx._declarator.stop)))
-            DeclLine = (None if localctx._declarator is None else localctx._declarator.start).line
-            DeclOffset = (None if localctx._declarator is None else localctx._declarator.start).column
+            DeclText = (None if localctx._declarator is None else self._input.getText(
+                (localctx._declarator.start, localctx._declarator.stop)))
+            DeclLine = (
+                None if localctx._declarator is None else localctx._declarator.start).line
+            DeclOffset = (
+                None if localctx._declarator is None else localctx._declarator.start).column
             if localctx.a != None:
                 LBLine = (None if localctx.a is None else localctx.a.start).line
-                LBOffset = (None if localctx.a is None else localctx.a.start).column
+                LBOffset = (
+                    None if localctx.a is None else localctx.a.start).column
             else:
                 LBLine = (None if localctx.b is None else localctx.b.start).line
-                LBOffset = (None if localctx.b is None else localctx.b.start).column
+                LBOffset = (
+                    None if localctx.b is None else localctx.b.start).column
 
             self._ctx.stop = self._input.LT(-1)
 
-            self.StoreFunctionDefinition(localctx.start.line, localctx.start.column, localctx.stop.line, localctx.stop.column, ModifierText, DeclText, LBLine, LBOffset, DeclLine, DeclOffset)
+            self.StoreFunctionDefinition(localctx.start.line, localctx.start.column, localctx.stop.line,
+                                         localctx.stop.column, ModifierText, DeclText, LBLine, LBOffset, DeclLine, DeclOffset)
 
         except RecognitionException as re:
             localctx.exception = re
@@ -1091,60 +1080,55 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def storage_class_specifier(self,i=None):
+        def storage_class_specifier(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Storage_class_specifierContext)
             else:
-                return self.getTypedRuleContext(CParser.Storage_class_specifierContext,i)
-
+                return self.getTypedRuleContext(CParser.Storage_class_specifierContext, i)
 
         # @param  i=None Type: int
-        def type_specifier(self,i=None):
+        def type_specifier(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Type_specifierContext)
             else:
-                return self.getTypedRuleContext(CParser.Type_specifierContext,i)
-
+                return self.getTypedRuleContext(CParser.Type_specifierContext, i)
 
         # @param  i=None Type: int
-        def type_qualifier(self,i=None):
+        def type_qualifier(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Type_qualifierContext)
             else:
-                return self.getTypedRuleContext(CParser.Type_qualifierContext,i)
-
+                return self.getTypedRuleContext(CParser.Type_qualifierContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_declaration_specifiers
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterDeclaration_specifiers" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterDeclaration_specifiers"):
                 listener.enterDeclaration_specifiers(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitDeclaration_specifiers" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitDeclaration_specifiers"):
                 listener.exitDeclaration_specifiers(self)
 
-
-
-
     def declaration_specifiers(self):
 
-        localctx = CParser.Declaration_specifiersContext(self, self._ctx, self.state)
+        localctx = CParser.Declaration_specifiersContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 6, self.RULE_declaration_specifiers)
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 187
             self._errHandler.sync(self)
             _alt = 1
-            while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
+            while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
                 if _alt == 1:
                     self.state = 187
                     self._errHandler.sync(self)
@@ -1164,12 +1148,11 @@ class CParser ( Parser ):
                     else:
                         raise NoViableAltException(self)
 
-
                 else:
                     raise NoViableAltException(self)
                 self.state = 189
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,9,self._ctx)
+                _alt = self._interp.adaptivePredict(self._input, 9, self._ctx)
 
         except RecognitionException as re:
             localctx.exception = re
@@ -1183,46 +1166,41 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
-            self.a = None # Token
-            self.b = None # Declaration_specifiersContext
-            self.c = None # Init_declarator_listContext
-            self.d = None # Token
-            self.s = None # Declaration_specifiersContext
-            self.t = None # Init_declarator_listContext
-            self.e = None # Token
+            self.a = None  # Token
+            self.b = None  # Declaration_specifiersContext
+            self.c = None  # Init_declarator_listContext
+            self.d = None  # Token
+            self.s = None  # Declaration_specifiersContext
+            self.t = None  # Init_declarator_listContext
+            self.e = None  # Token
 
         def init_declarator_list(self):
-            return self.getTypedRuleContext(CParser.Init_declarator_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Init_declarator_listContext, 0)
 
         def declaration_specifiers(self):
-            return self.getTypedRuleContext(CParser.Declaration_specifiersContext,0)
-
+            return self.getTypedRuleContext(CParser.Declaration_specifiersContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_declaration
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterDeclaration" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterDeclaration"):
                 listener.enterDeclaration(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitDeclaration" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitDeclaration"):
                 listener.exitDeclaration(self)
 
-
-
-
     def declaration(self):
 
         localctx = CParser.DeclarationContext(self, self._ctx, self.state)
         self.enterRule(localctx, 8, self.RULE_declaration)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.state = 206
             self._errHandler.sync(self)
@@ -1233,21 +1211,22 @@ class CParser ( Parser ):
                 localctx.a = self.match(CParser.T__2)
                 self.state = 193
                 self._errHandler.sync(self)
-                la_ = self._interp.adaptivePredict(self._input,10,self._ctx)
+                la_ = self._interp.adaptivePredict(self._input, 10, self._ctx)
                 if la_ == 1:
                     self.state = 192
                     localctx.b = self.declaration_specifiers()
 
-
                 self.state = 195
                 localctx.c = self.init_declarator_list()
                 self.state = 196
                 localctx.d = self.match(CParser.T__1)
 
                 if localctx.b is not None:
-                    self.StoreTypedefDefinition(localctx.a.line, localctx.a.column, (0 if localctx.d is None else localctx.d.line), localctx.d.column, (None if localctx.b is None else self._input.getText((localctx.b.start,localctx.b.stop))), (None if localctx.c is None else self._input.getText((localctx.c.start,localctx.c.stop))))
+                    self.StoreTypedefDefinition(localctx.a.line, localctx.a.column, (0 if localctx.d is None else localctx.d.line), localctx.d.column, (None if localctx.b is None else self._input.getText(
+                        (localctx.b.start, localctx.b.stop))), (None if localctx.c is None else self._input.getText((localctx.c.start, localctx.c.stop))))
                 else:
-                    self.StoreTypedefDefinition(localctx.a.line, localctx.a.column, (0 if localctx.d is None else localctx.d.line), localctx.d.column, '', (None if localctx.c is None else self._input.getText((localctx.c.start,localctx.c.stop))))
+                    self.StoreTypedefDefinition(localctx.a.line, localctx.a.column, (0 if localctx.d is None else localctx.d.line),
+                                                localctx.d.column, '', (None if localctx.c is None else self._input.getText((localctx.c.start, localctx.c.stop))))
 
                 pass
             elif token in [CParser.T__5, CParser.T__6, CParser.T__7, CParser.T__8, CParser.T__9, CParser.T__10, CParser.T__11, CParser.T__12, CParser.T__13, CParser.T__14, CParser.T__15, CParser.T__16, CParser.T__17, CParser.T__18, CParser.T__20, CParser.T__21, CParser.T__23, CParser.T__24, CParser.T__25, CParser.T__26, CParser.T__27, CParser.T__28, CParser.T__29, CParser.T__30, CParser.T__31, CParser.T__32, CParser.T__33, CParser.T__34, CParser.T__35, CParser.T__36, CParser.IDENTIFIER]:
@@ -1261,12 +1240,12 @@ class CParser ( Parser ):
                     self.state = 200
                     localctx.t = self.init_declarator_list()
 
-
                 self.state = 203
                 localctx.e = self.match(CParser.T__1)
 
                 if localctx.t is not None:
-                    self.StoreVariableDeclaration((None if localctx.s is None else localctx.s.start).line, (None if localctx.s is None else localctx.s.start).column, (None if localctx.t is None else localctx.t.start).line, (None if localctx.t is None else localctx.t.start).column, (None if localctx.s is None else self._input.getText((localctx.s.start,localctx.s.stop))), (None if localctx.t is None else self._input.getText((localctx.t.start,localctx.t.stop))))
+                    self.StoreVariableDeclaration((None if localctx.s is None else localctx.s.start).line, (None if localctx.s is None else localctx.s.start).column, (None if localctx.t is None else localctx.t.start).line, (
+                        None if localctx.t is None else localctx.t.start).column, (None if localctx.s is None else self._input.getText((localctx.s.start, localctx.s.stop))), (None if localctx.t is None else self._input.getText((localctx.t.start, localctx.t.stop))))
 
                 pass
             else:
@@ -1284,39 +1263,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def init_declarator(self,i=None):
+        def init_declarator(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Init_declaratorContext)
             else:
-                return self.getTypedRuleContext(CParser.Init_declaratorContext,i)
-
+                return self.getTypedRuleContext(CParser.Init_declaratorContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_init_declarator_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterInit_declarator_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterInit_declarator_list"):
                 listener.enterInit_declarator_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitInit_declarator_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitInit_declarator_list"):
                 listener.exitInit_declarator_list(self)
 
-
-
-
     def init_declarator_list(self):
 
-        localctx = CParser.Init_declarator_listContext(self, self._ctx, self.state)
+        localctx = CParser.Init_declarator_listContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 10, self.RULE_init_declarator_list)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 208
@@ -1324,7 +1300,7 @@ class CParser ( Parser ):
             self.state = 213
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__3:
+            while _la == CParser.T__3:
                 self.state = 209
                 self.match(CParser.T__3)
                 self.state = 210
@@ -1345,39 +1321,34 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def declarator(self):
-            return self.getTypedRuleContext(CParser.DeclaratorContext,0)
-
+            return self.getTypedRuleContext(CParser.DeclaratorContext, 0)
 
         def initializer(self):
-            return self.getTypedRuleContext(CParser.InitializerContext,0)
-
+            return self.getTypedRuleContext(CParser.InitializerContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_init_declarator
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterInit_declarator" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterInit_declarator"):
                 listener.enterInit_declarator(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitInit_declarator" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitInit_declarator"):
                 listener.exitInit_declarator(self)
 
-
-
-
     def init_declarator(self):
 
         localctx = CParser.Init_declaratorContext(self, self._ctx, self.state)
         self.enterRule(localctx, 12, self.RULE_init_declarator)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 216
@@ -1385,13 +1356,12 @@ class CParser ( Parser ):
             self.state = 219
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            if _la==CParser.T__4:
+            if _la == CParser.T__4:
                 self.state = 217
                 self.match(CParser.T__4)
                 self.state = 218
                 self.initializer()
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -1404,32 +1374,29 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
-
         def getRuleIndex(self):
             return CParser.RULE_storage_class_specifier
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterStorage_class_specifier" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterStorage_class_specifier"):
                 listener.enterStorage_class_specifier(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitStorage_class_specifier" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitStorage_class_specifier"):
                 listener.exitStorage_class_specifier(self)
 
-
-
-
     def storage_class_specifier(self):
 
-        localctx = CParser.Storage_class_specifierContext(self, self._ctx, self.state)
+        localctx = CParser.Storage_class_specifierContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 14, self.RULE_storage_class_specifier)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 221
@@ -1451,55 +1418,47 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
-            self.s = None # Struct_or_union_specifierContext
-            self.e = None # Enum_specifierContext
+            self.s = None  # Struct_or_union_specifierContext
+            self.e = None  # Enum_specifierContext
 
         def struct_or_union_specifier(self):
-            return self.getTypedRuleContext(CParser.Struct_or_union_specifierContext,0)
-
+            return self.getTypedRuleContext(CParser.Struct_or_union_specifierContext, 0)
 
         def enum_specifier(self):
-            return self.getTypedRuleContext(CParser.Enum_specifierContext,0)
-
+            return self.getTypedRuleContext(CParser.Enum_specifierContext, 0)
 
         def IDENTIFIER(self):
             return self.getToken(CParser.IDENTIFIER, 0)
 
         def declarator(self):
-            return self.getTypedRuleContext(CParser.DeclaratorContext,0)
-
+            return self.getTypedRuleContext(CParser.DeclaratorContext, 0)
 
         # @param  i=None Type: int
-        def type_qualifier(self,i=None):
+        def type_qualifier(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Type_qualifierContext)
             else:
-                return self.getTypedRuleContext(CParser.Type_qualifierContext,i)
-
+                return self.getTypedRuleContext(CParser.Type_qualifierContext, i)
 
         def type_id(self):
-            return self.getTypedRuleContext(CParser.Type_idContext,0)
-
+            return self.getTypedRuleContext(CParser.Type_idContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_type_specifier
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterType_specifier" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterType_specifier"):
                 listener.enterType_specifier(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitType_specifier" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitType_specifier"):
                 listener.exitType_specifier(self)
 
-
-
-
     def type_specifier(self):
 
         localctx = CParser.Type_specifierContext(self, self._ctx, self.state)
@@ -1507,7 +1466,7 @@ class CParser ( Parser ):
         try:
             self.state = 247
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,16,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 16, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 223
@@ -1568,7 +1527,8 @@ class CParser ( Parser ):
                 localctx.s = self.struct_or_union_specifier()
 
                 if localctx.s.stop is not None:
-                    self.StoreStructUnionDefinition((None if localctx.s is None else localctx.s.start).line, (None if localctx.s is None else localctx.s.start).column, (None if localctx.s is None else localctx.s.stop).line, (None if localctx.s is None else localctx.s.stop).column, (None if localctx.s is None else self._input.getText((localctx.s.start,localctx.s.stop))))
+                    self.StoreStructUnionDefinition((None if localctx.s is None else localctx.s.start).line, (None if localctx.s is None else localctx.s.start).column, (None if localctx.s is None else localctx.s.stop).line, (
+                        None if localctx.s is None else localctx.s.stop).column, (None if localctx.s is None else self._input.getText((localctx.s.start, localctx.s.stop))))
 
                 pass
 
@@ -1578,7 +1538,8 @@ class CParser ( Parser ):
                 localctx.e = self.enum_specifier()
 
                 if localctx.e.stop is not None:
-                    self.StoreEnumerationDefinition((None if localctx.e is None else localctx.e.start).line, (None if localctx.e is None else localctx.e.start).column, (None if localctx.e is None else localctx.e.stop).line, (None if localctx.e is None else localctx.e.stop).column, (None if localctx.e is None else self._input.getText((localctx.e.start,localctx.e.stop))))
+                    self.StoreEnumerationDefinition((None if localctx.e is None else localctx.e.start).line, (None if localctx.e is None else localctx.e.start).column, (None if localctx.e is None else localctx.e.stop).line, (
+                        None if localctx.e is None else localctx.e.stop).column, (None if localctx.e is None else self._input.getText((localctx.e.start, localctx.e.stop))))
 
                 pass
 
@@ -1588,14 +1549,15 @@ class CParser ( Parser ):
                 self.match(CParser.IDENTIFIER)
                 self.state = 242
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,15,self._ctx)
-                while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
-                    if _alt==1:
+                _alt = self._interp.adaptivePredict(self._input, 15, self._ctx)
+                while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
+                    if _alt == 1:
                         self.state = 239
                         self.type_qualifier()
                     self.state = 244
                     self._errHandler.sync(self)
-                    _alt = self._interp.adaptivePredict(self._input,15,self._ctx)
+                    _alt = self._interp.adaptivePredict(
+                        self._input, 15, self._ctx)
 
                 self.state = 245
                 self.declarator()
@@ -1607,7 +1569,6 @@ class CParser ( Parser ):
                 self.type_id()
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -1620,7 +1581,7 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
@@ -1631,18 +1592,15 @@ class CParser ( Parser ):
             return CParser.RULE_type_id
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterType_id" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterType_id"):
                 listener.enterType_id(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitType_id" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitType_id"):
                 listener.exitType_id(self)
 
-
-
-
     def type_id(self):
 
         localctx = CParser.Type_idContext(self, self._ctx, self.state)
@@ -1663,17 +1621,15 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def struct_or_union(self):
-            return self.getTypedRuleContext(CParser.Struct_or_unionContext,0)
-
+            return self.getTypedRuleContext(CParser.Struct_or_unionContext, 0)
 
         def struct_declaration_list(self):
-            return self.getTypedRuleContext(CParser.Struct_declaration_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Struct_declaration_listContext, 0)
 
         def IDENTIFIER(self):
             return self.getToken(CParser.IDENTIFIER, 0)
@@ -1682,27 +1638,25 @@ class CParser ( Parser ):
             return CParser.RULE_struct_or_union_specifier
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterStruct_or_union_specifier" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterStruct_or_union_specifier"):
                 listener.enterStruct_or_union_specifier(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitStruct_or_union_specifier" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitStruct_or_union_specifier"):
                 listener.exitStruct_or_union_specifier(self)
 
-
-
-
     def struct_or_union_specifier(self):
 
-        localctx = CParser.Struct_or_union_specifierContext(self, self._ctx, self.state)
+        localctx = CParser.Struct_or_union_specifierContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 20, self.RULE_struct_or_union_specifier)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.state = 262
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,18,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 18, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 251
@@ -1710,11 +1664,10 @@ class CParser ( Parser ):
                 self.state = 253
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.IDENTIFIER:
+                if _la == CParser.IDENTIFIER:
                     self.state = 252
                     self.match(CParser.IDENTIFIER)
 
-
                 self.state = 255
                 self.match(CParser.T__0)
                 self.state = 256
@@ -1731,7 +1684,6 @@ class CParser ( Parser ):
                 self.match(CParser.IDENTIFIER)
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -1744,37 +1696,33 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
-
         def getRuleIndex(self):
             return CParser.RULE_struct_or_union
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterStruct_or_union" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterStruct_or_union"):
                 listener.enterStruct_or_union(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitStruct_or_union" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitStruct_or_union"):
                 listener.exitStruct_or_union(self)
 
-
-
-
     def struct_or_union(self):
 
         localctx = CParser.Struct_or_unionContext(self, self._ctx, self.state)
         self.enterRule(localctx, 22, self.RULE_struct_or_union)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 264
             _la = self._input.LA(1)
-            if not(_la==CParser.T__20 or _la==CParser.T__21):
+            if not(_la == CParser.T__20 or _la == CParser.T__21):
                 self._errHandler.recoverInline(self)
             else:
                 self._errHandler.reportMatch(self)
@@ -1791,39 +1739,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def struct_declaration(self,i=None):
+        def struct_declaration(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Struct_declarationContext)
             else:
-                return self.getTypedRuleContext(CParser.Struct_declarationContext,i)
-
+                return self.getTypedRuleContext(CParser.Struct_declarationContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_struct_declaration_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterStruct_declaration_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterStruct_declaration_list"):
                 listener.enterStruct_declaration_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitStruct_declaration_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitStruct_declaration_list"):
                 listener.exitStruct_declaration_list(self)
 
-
-
-
     def struct_declaration_list(self):
 
-        localctx = CParser.Struct_declaration_listContext(self, self._ctx, self.state)
+        localctx = CParser.Struct_declaration_listContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 24, self.RULE_struct_declaration_list)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 267
@@ -1835,7 +1780,7 @@ class CParser ( Parser ):
                 self.state = 269
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if not ((((_la) & ~0x3f) == 0 and ((1 << _la) & ((1 << CParser.T__10) | (1 << CParser.T__11) | (1 << CParser.T__12) | (1 << CParser.T__13) | (1 << CParser.T__14) | (1 << CParser.T__15) | (1 << CParser.T__16) | (1 << CParser.T__17) | (1 << CParser.T__18) | (1 << CParser.T__20) | (1 << CParser.T__21) | (1 << CParser.T__23) | (1 << CParser.T__24) | (1 << CParser.T__25) | (1 << CParser.T__26) | (1 << CParser.T__27) | (1 << CParser.T__28) | (1 << CParser.T__29) | (1 << CParser.T__30) | (1 << CParser.T__31) | (1 << CParser.T__32) | (1 << CParser.T__33) | (1 << CParser.T__34) | (1 << CParser.T__35) | (1 << CParser.T__36))) != 0) or _la==CParser.IDENTIFIER):
+                if not ((((_la) & ~0x3f) == 0 and ((1 << _la) & ((1 << CParser.T__10) | (1 << CParser.T__11) | (1 << CParser.T__12) | (1 << CParser.T__13) | (1 << CParser.T__14) | (1 << CParser.T__15) | (1 << CParser.T__16) | (1 << CParser.T__17) | (1 << CParser.T__18) | (1 << CParser.T__20) | (1 << CParser.T__21) | (1 << CParser.T__23) | (1 << CParser.T__24) | (1 << CParser.T__25) | (1 << CParser.T__26) | (1 << CParser.T__27) | (1 << CParser.T__28) | (1 << CParser.T__29) | (1 << CParser.T__30) | (1 << CParser.T__31) | (1 << CParser.T__32) | (1 << CParser.T__33) | (1 << CParser.T__34) | (1 << CParser.T__35) | (1 << CParser.T__36))) != 0) or _la == CParser.IDENTIFIER):
                     break
 
         except RecognitionException as re:
@@ -1850,37 +1795,33 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def specifier_qualifier_list(self):
-            return self.getTypedRuleContext(CParser.Specifier_qualifier_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Specifier_qualifier_listContext, 0)
 
         def struct_declarator_list(self):
-            return self.getTypedRuleContext(CParser.Struct_declarator_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Struct_declarator_listContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_struct_declaration
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterStruct_declaration" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterStruct_declaration"):
                 listener.enterStruct_declaration(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitStruct_declaration" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitStruct_declaration"):
                 listener.exitStruct_declaration(self)
 
-
-
-
     def struct_declaration(self):
 
-        localctx = CParser.Struct_declarationContext(self, self._ctx, self.state)
+        localctx = CParser.Struct_declarationContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 26, self.RULE_struct_declaration)
         try:
             self.enterOuterAlt(localctx, 1)
@@ -1902,52 +1843,48 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def type_qualifier(self,i=None):
+        def type_qualifier(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Type_qualifierContext)
             else:
-                return self.getTypedRuleContext(CParser.Type_qualifierContext,i)
-
+                return self.getTypedRuleContext(CParser.Type_qualifierContext, i)
 
         # @param  i=None Type: int
-        def type_specifier(self,i=None):
+        def type_specifier(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Type_specifierContext)
             else:
-                return self.getTypedRuleContext(CParser.Type_specifierContext,i)
-
+                return self.getTypedRuleContext(CParser.Type_specifierContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_specifier_qualifier_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterSpecifier_qualifier_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterSpecifier_qualifier_list"):
                 listener.enterSpecifier_qualifier_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitSpecifier_qualifier_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitSpecifier_qualifier_list"):
                 listener.exitSpecifier_qualifier_list(self)
 
-
-
-
     def specifier_qualifier_list(self):
 
-        localctx = CParser.Specifier_qualifier_listContext(self, self._ctx, self.state)
+        localctx = CParser.Specifier_qualifier_listContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 28, self.RULE_specifier_qualifier_list)
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 277
             self._errHandler.sync(self)
             _alt = 1
-            while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
+            while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
                 if _alt == 1:
                     self.state = 277
                     self._errHandler.sync(self)
@@ -1963,12 +1900,11 @@ class CParser ( Parser ):
                     else:
                         raise NoViableAltException(self)
 
-
                 else:
                     raise NoViableAltException(self)
                 self.state = 279
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,21,self._ctx)
+                _alt = self._interp.adaptivePredict(self._input, 21, self._ctx)
 
         except RecognitionException as re:
             localctx.exception = re
@@ -1982,39 +1918,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def struct_declarator(self,i=None):
+        def struct_declarator(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Struct_declaratorContext)
             else:
-                return self.getTypedRuleContext(CParser.Struct_declaratorContext,i)
-
+                return self.getTypedRuleContext(CParser.Struct_declaratorContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_struct_declarator_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterStruct_declarator_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterStruct_declarator_list"):
                 listener.enterStruct_declarator_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitStruct_declarator_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitStruct_declarator_list"):
                 listener.exitStruct_declarator_list(self)
 
-
-
-
     def struct_declarator_list(self):
 
-        localctx = CParser.Struct_declarator_listContext(self, self._ctx, self.state)
+        localctx = CParser.Struct_declarator_listContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 30, self.RULE_struct_declarator_list)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 281
@@ -2022,7 +1955,7 @@ class CParser ( Parser ):
             self.state = 286
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__3:
+            while _la == CParser.T__3:
                 self.state = 282
                 self.match(CParser.T__3)
                 self.state = 283
@@ -2043,39 +1976,35 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def declarator(self):
-            return self.getTypedRuleContext(CParser.DeclaratorContext,0)
-
+            return self.getTypedRuleContext(CParser.DeclaratorContext, 0)
 
         def constant_expression(self):
-            return self.getTypedRuleContext(CParser.Constant_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Constant_expressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_struct_declarator
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterStruct_declarator" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterStruct_declarator"):
                 listener.enterStruct_declarator(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitStruct_declarator" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitStruct_declarator"):
                 listener.exitStruct_declarator(self)
 
-
-
-
     def struct_declarator(self):
 
-        localctx = CParser.Struct_declaratorContext(self, self._ctx, self.state)
+        localctx = CParser.Struct_declaratorContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 32, self.RULE_struct_declarator)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.state = 296
             self._errHandler.sync(self)
@@ -2087,13 +2016,12 @@ class CParser ( Parser ):
                 self.state = 292
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__22:
+                if _la == CParser.T__22:
                     self.state = 290
                     self.match(CParser.T__22)
                     self.state = 291
                     self.constant_expression()
 
-
                 pass
             elif token in [CParser.T__22]:
                 self.enterOuterAlt(localctx, 2)
@@ -2117,13 +2045,12 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def enumerator_list(self):
-            return self.getTypedRuleContext(CParser.Enumerator_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Enumerator_listContext, 0)
 
         def IDENTIFIER(self):
             return self.getToken(CParser.IDENTIFIER, 0)
@@ -2132,27 +2059,24 @@ class CParser ( Parser ):
             return CParser.RULE_enum_specifier
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterEnum_specifier" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterEnum_specifier"):
                 listener.enterEnum_specifier(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitEnum_specifier" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitEnum_specifier"):
                 listener.exitEnum_specifier(self)
 
-
-
-
     def enum_specifier(self):
 
         localctx = CParser.Enum_specifierContext(self, self._ctx, self.state)
         self.enterRule(localctx, 34, self.RULE_enum_specifier)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.state = 317
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,27,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 27, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 298
@@ -2164,11 +2088,10 @@ class CParser ( Parser ):
                 self.state = 302
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__3:
+                if _la == CParser.T__3:
                     self.state = 301
                     self.match(CParser.T__3)
 
-
                 self.state = 304
                 self.match(CParser.T__19)
                 pass
@@ -2186,11 +2109,10 @@ class CParser ( Parser ):
                 self.state = 311
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__3:
+                if _la == CParser.T__3:
                     self.state = 310
                     self.match(CParser.T__3)
 
-
                 self.state = 313
                 self.match(CParser.T__19)
                 pass
@@ -2203,7 +2125,6 @@ class CParser ( Parser ):
                 self.match(CParser.IDENTIFIER)
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -2216,34 +2137,30 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def enumerator(self,i=None):
+        def enumerator(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.EnumeratorContext)
             else:
-                return self.getTypedRuleContext(CParser.EnumeratorContext,i)
-
+                return self.getTypedRuleContext(CParser.EnumeratorContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_enumerator_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterEnumerator_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterEnumerator_list"):
                 listener.enterEnumerator_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitEnumerator_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitEnumerator_list"):
                 listener.exitEnumerator_list(self)
 
-
-
-
     def enumerator_list(self):
 
         localctx = CParser.Enumerator_listContext(self, self._ctx, self.state)
@@ -2254,16 +2171,16 @@ class CParser ( Parser ):
             self.enumerator()
             self.state = 324
             self._errHandler.sync(self)
-            _alt = self._interp.adaptivePredict(self._input,28,self._ctx)
-            while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
-                if _alt==1:
+            _alt = self._interp.adaptivePredict(self._input, 28, self._ctx)
+            while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
+                if _alt == 1:
                     self.state = 320
                     self.match(CParser.T__3)
                     self.state = 321
                     self.enumerator()
                 self.state = 326
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,28,self._ctx)
+                _alt = self._interp.adaptivePredict(self._input, 28, self._ctx)
 
         except RecognitionException as re:
             localctx.exception = re
@@ -2277,7 +2194,7 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
@@ -2285,30 +2202,26 @@ class CParser ( Parser ):
             return self.getToken(CParser.IDENTIFIER, 0)
 
         def constant_expression(self):
-            return self.getTypedRuleContext(CParser.Constant_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Constant_expressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_enumerator
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterEnumerator" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterEnumerator"):
                 listener.enterEnumerator(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitEnumerator" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitEnumerator"):
                 listener.exitEnumerator(self)
 
-
-
-
     def enumerator(self):
 
         localctx = CParser.EnumeratorContext(self, self._ctx, self.state)
         self.enterRule(localctx, 38, self.RULE_enumerator)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 327
@@ -2316,13 +2229,12 @@ class CParser ( Parser ):
             self.state = 330
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            if _la==CParser.T__4:
+            if _la == CParser.T__4:
                 self.state = 328
                 self.match(CParser.T__4)
                 self.state = 329
                 self.constant_expression()
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -2335,32 +2247,28 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
-
         def getRuleIndex(self):
             return CParser.RULE_type_qualifier
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterType_qualifier" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterType_qualifier"):
                 listener.enterType_qualifier(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitType_qualifier" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitType_qualifier"):
                 listener.exitType_qualifier(self)
 
-
-
-
     def type_qualifier(self):
 
         localctx = CParser.Type_qualifierContext(self, self._ctx, self.state)
         self.enterRule(localctx, 40, self.RULE_type_qualifier)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 332
@@ -2382,77 +2290,68 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def direct_declarator(self):
-            return self.getTypedRuleContext(CParser.Direct_declaratorContext,0)
-
+            return self.getTypedRuleContext(CParser.Direct_declaratorContext, 0)
 
         def pointer(self):
-            return self.getTypedRuleContext(CParser.PointerContext,0)
-
+            return self.getTypedRuleContext(CParser.PointerContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_declarator
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterDeclarator" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterDeclarator"):
                 listener.enterDeclarator(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitDeclarator" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitDeclarator"):
                 listener.exitDeclarator(self)
 
-
-
-
     def declarator(self):
 
         localctx = CParser.DeclaratorContext(self, self._ctx, self.state)
         self.enterRule(localctx, 42, self.RULE_declarator)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.state = 348
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,34,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 34, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 335
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__41:
+                if _la == CParser.T__41:
                     self.state = 334
                     self.pointer()
 
-
                 self.state = 338
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__33:
+                if _la == CParser.T__33:
                     self.state = 337
                     self.match(CParser.T__33)
 
-
                 self.state = 341
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__34:
+                if _la == CParser.T__34:
                     self.state = 340
                     self.match(CParser.T__34)
 
-
                 self.state = 344
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__35:
+                if _la == CParser.T__35:
                     self.state = 343
                     self.match(CParser.T__35)
 
-
                 self.state = 346
                 self.direct_declarator()
                 pass
@@ -2463,7 +2362,6 @@ class CParser ( Parser ):
                 self.pointer()
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -2476,7 +2374,7 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
@@ -2484,36 +2382,32 @@ class CParser ( Parser ):
             return self.getToken(CParser.IDENTIFIER, 0)
 
         # @param  i=None Type: int
-        def declarator_suffix(self,i=None):
+        def declarator_suffix(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Declarator_suffixContext)
             else:
-                return self.getTypedRuleContext(CParser.Declarator_suffixContext,i)
-
+                return self.getTypedRuleContext(CParser.Declarator_suffixContext, i)
 
         def declarator(self):
-            return self.getTypedRuleContext(CParser.DeclaratorContext,0)
-
+            return self.getTypedRuleContext(CParser.DeclaratorContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_direct_declarator
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterDirect_declarator" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterDirect_declarator"):
                 listener.enterDirect_declarator(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitDirect_declarator" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitDirect_declarator"):
                 listener.exitDirect_declarator(self)
 
-
-
-
     def direct_declarator(self):
 
-        localctx = CParser.Direct_declaratorContext(self, self._ctx, self.state)
+        localctx = CParser.Direct_declaratorContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 44, self.RULE_direct_declarator)
         try:
             self.state = 368
@@ -2525,14 +2419,15 @@ class CParser ( Parser ):
                 self.match(CParser.IDENTIFIER)
                 self.state = 354
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,35,self._ctx)
-                while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
-                    if _alt==1:
+                _alt = self._interp.adaptivePredict(self._input, 35, self._ctx)
+                while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
+                    if _alt == 1:
                         self.state = 351
                         self.declarator_suffix()
                     self.state = 356
                     self._errHandler.sync(self)
-                    _alt = self._interp.adaptivePredict(self._input,35,self._ctx)
+                    _alt = self._interp.adaptivePredict(
+                        self._input, 35, self._ctx)
 
                 pass
             elif token in [CParser.T__37]:
@@ -2541,12 +2436,11 @@ class CParser ( Parser ):
                 self.match(CParser.T__37)
                 self.state = 359
                 self._errHandler.sync(self)
-                la_ = self._interp.adaptivePredict(self._input,36,self._ctx)
+                la_ = self._interp.adaptivePredict(self._input, 36, self._ctx)
                 if la_ == 1:
                     self.state = 358
                     self.match(CParser.T__33)
 
-
                 self.state = 361
                 self.declarator()
                 self.state = 362
@@ -2554,7 +2448,7 @@ class CParser ( Parser ):
                 self.state = 364
                 self._errHandler.sync(self)
                 _alt = 1
-                while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
+                while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
                     if _alt == 1:
                         self.state = 363
                         self.declarator_suffix()
@@ -2563,7 +2457,8 @@ class CParser ( Parser ):
                         raise NoViableAltException(self)
                     self.state = 366
                     self._errHandler.sync(self)
-                    _alt = self._interp.adaptivePredict(self._input,37,self._ctx)
+                    _alt = self._interp.adaptivePredict(
+                        self._input, 37, self._ctx)
 
                 pass
             else:
@@ -2581,46 +2476,41 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def constant_expression(self):
-            return self.getTypedRuleContext(CParser.Constant_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Constant_expressionContext, 0)
 
         def parameter_type_list(self):
-            return self.getTypedRuleContext(CParser.Parameter_type_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Parameter_type_listContext, 0)
 
         def identifier_list(self):
-            return self.getTypedRuleContext(CParser.Identifier_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Identifier_listContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_declarator_suffix
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterDeclarator_suffix" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterDeclarator_suffix"):
                 listener.enterDeclarator_suffix(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitDeclarator_suffix" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitDeclarator_suffix"):
                 listener.exitDeclarator_suffix(self)
 
-
-
-
     def declarator_suffix(self):
 
-        localctx = CParser.Declarator_suffixContext(self, self._ctx, self.state)
+        localctx = CParser.Declarator_suffixContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 46, self.RULE_declarator_suffix)
         try:
             self.state = 386
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,39,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 39, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 370
@@ -2667,7 +2557,6 @@ class CParser ( Parser ):
                 self.match(CParser.T__38)
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -2680,38 +2569,33 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def type_qualifier(self,i=None):
+        def type_qualifier(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Type_qualifierContext)
             else:
-                return self.getTypedRuleContext(CParser.Type_qualifierContext,i)
-
+                return self.getTypedRuleContext(CParser.Type_qualifierContext, i)
 
         def pointer(self):
-            return self.getTypedRuleContext(CParser.PointerContext,0)
-
+            return self.getTypedRuleContext(CParser.PointerContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_pointer
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterPointer" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterPointer"):
                 listener.enterPointer(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitPointer" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitPointer"):
                 listener.exitPointer(self)
 
-
-
-
     def pointer(self):
 
         localctx = CParser.PointerContext(self, self._ctx, self.state)
@@ -2719,7 +2603,7 @@ class CParser ( Parser ):
         try:
             self.state = 400
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,42,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 42, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 388
@@ -2727,7 +2611,7 @@ class CParser ( Parser ):
                 self.state = 390
                 self._errHandler.sync(self)
                 _alt = 1
-                while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
+                while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
                     if _alt == 1:
                         self.state = 389
                         self.type_qualifier()
@@ -2736,16 +2620,16 @@ class CParser ( Parser ):
                         raise NoViableAltException(self)
                     self.state = 392
                     self._errHandler.sync(self)
-                    _alt = self._interp.adaptivePredict(self._input,40,self._ctx)
+                    _alt = self._interp.adaptivePredict(
+                        self._input, 40, self._ctx)
 
                 self.state = 395
                 self._errHandler.sync(self)
-                la_ = self._interp.adaptivePredict(self._input,41,self._ctx)
+                la_ = self._interp.adaptivePredict(self._input, 41, self._ctx)
                 if la_ == 1:
                     self.state = 394
                     self.pointer()
 
-
                 pass
 
             elif la_ == 2:
@@ -2762,7 +2646,6 @@ class CParser ( Parser ):
                 self.match(CParser.T__41)
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -2775,35 +2658,32 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def parameter_list(self):
-            return self.getTypedRuleContext(CParser.Parameter_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Parameter_listContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_parameter_type_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterParameter_type_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterParameter_type_list"):
                 listener.enterParameter_type_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitParameter_type_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitParameter_type_list"):
                 listener.exitParameter_type_list(self)
 
-
-
-
     def parameter_type_list(self):
 
-        localctx = CParser.Parameter_type_listContext(self, self._ctx, self.state)
+        localctx = CParser.Parameter_type_listContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 50, self.RULE_parameter_type_list)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 402
@@ -2811,21 +2691,19 @@ class CParser ( Parser ):
             self.state = 408
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            if _la==CParser.T__3:
+            if _la == CParser.T__3:
                 self.state = 403
                 self.match(CParser.T__3)
                 self.state = 405
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__28:
+                if _la == CParser.T__28:
                     self.state = 404
                     self.match(CParser.T__28)
 
-
                 self.state = 407
                 self.match(CParser.T__42)
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -2838,34 +2716,30 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def parameter_declaration(self,i=None):
+        def parameter_declaration(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Parameter_declarationContext)
             else:
-                return self.getTypedRuleContext(CParser.Parameter_declarationContext,i)
-
+                return self.getTypedRuleContext(CParser.Parameter_declarationContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_parameter_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterParameter_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterParameter_list"):
                 listener.enterParameter_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitParameter_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitParameter_list"):
                 listener.exitParameter_list(self)
 
-
-
-
     def parameter_list(self):
 
         localctx = CParser.Parameter_listContext(self, self._ctx, self.state)
@@ -2876,24 +2750,24 @@ class CParser ( Parser ):
             self.parameter_declaration()
             self.state = 418
             self._errHandler.sync(self)
-            _alt = self._interp.adaptivePredict(self._input,46,self._ctx)
-            while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
-                if _alt==1:
+            _alt = self._interp.adaptivePredict(self._input, 46, self._ctx)
+            while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
+                if _alt == 1:
                     self.state = 411
                     self.match(CParser.T__3)
                     self.state = 413
                     self._errHandler.sync(self)
-                    la_ = self._interp.adaptivePredict(self._input,45,self._ctx)
+                    la_ = self._interp.adaptivePredict(
+                        self._input, 45, self._ctx)
                     if la_ == 1:
                         self.state = 412
                         self.match(CParser.T__28)
 
-
                     self.state = 415
                     self.parameter_declaration()
                 self.state = 420
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,46,self._ctx)
+                _alt = self._interp.adaptivePredict(self._input, 46, self._ctx)
 
         except RecognitionException as re:
             localctx.exception = re
@@ -2907,66 +2781,60 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def declaration_specifiers(self):
-            return self.getTypedRuleContext(CParser.Declaration_specifiersContext,0)
-
+            return self.getTypedRuleContext(CParser.Declaration_specifiersContext, 0)
 
         # @param  i=None Type: int
-        def declarator(self,i=None):
+        def declarator(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.DeclaratorContext)
             else:
-                return self.getTypedRuleContext(CParser.DeclaratorContext,i)
-
+                return self.getTypedRuleContext(CParser.DeclaratorContext, i)
 
         # @param  i=None Type: int
-        def abstract_declarator(self,i=None):
+        def abstract_declarator(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Abstract_declaratorContext)
             else:
-                return self.getTypedRuleContext(CParser.Abstract_declaratorContext,i)
-
+                return self.getTypedRuleContext(CParser.Abstract_declaratorContext, i)
 
         def IDENTIFIER(self):
             return self.getToken(CParser.IDENTIFIER, 0)
 
         # @param  i=None Type: int
-        def pointer(self,i=None):
+        def pointer(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.PointerContext)
             else:
-                return self.getTypedRuleContext(CParser.PointerContext,i)
-
+                return self.getTypedRuleContext(CParser.PointerContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_parameter_declaration
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterParameter_declaration" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterParameter_declaration"):
                 listener.enterParameter_declaration(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitParameter_declaration" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitParameter_declaration"):
                 listener.exitParameter_declaration(self)
 
-
-
-
     def parameter_declaration(self):
 
-        localctx = CParser.Parameter_declarationContext(self, self._ctx, self.state)
+        localctx = CParser.Parameter_declarationContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 54, self.RULE_parameter_declaration)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.state = 439
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,51,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 51, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 421
@@ -2977,7 +2845,8 @@ class CParser ( Parser ):
                 while ((((_la - 34)) & ~0x3f) == 0 and ((1 << (_la - 34)) & ((1 << (CParser.T__33 - 34)) | (1 << (CParser.T__34 - 34)) | (1 << (CParser.T__35 - 34)) | (1 << (CParser.T__37 - 34)) | (1 << (CParser.T__39 - 34)) | (1 << (CParser.T__41 - 34)) | (1 << (CParser.IDENTIFIER - 34)))) != 0):
                     self.state = 424
                     self._errHandler.sync(self)
-                    la_ = self._interp.adaptivePredict(self._input,47,self._ctx)
+                    la_ = self._interp.adaptivePredict(
+                        self._input, 47, self._ctx)
                     if la_ == 1:
                         self.state = 422
                         self.declarator()
@@ -2988,7 +2857,6 @@ class CParser ( Parser ):
                         self.abstract_declarator()
                         pass
 
-
                     self.state = 428
                     self._errHandler.sync(self)
                     _la = self._input.LA(1)
@@ -2996,11 +2864,10 @@ class CParser ( Parser ):
                 self.state = 430
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__28:
+                if _la == CParser.T__28:
                     self.state = 429
                     self.match(CParser.T__28)
 
-
                 pass
 
             elif la_ == 2:
@@ -3008,7 +2875,7 @@ class CParser ( Parser ):
                 self.state = 435
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                while _la==CParser.T__41:
+                while _la == CParser.T__41:
                     self.state = 432
                     self.pointer()
                     self.state = 437
@@ -3019,7 +2886,6 @@ class CParser ( Parser ):
                 self.match(CParser.IDENTIFIER)
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -3032,12 +2898,12 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def IDENTIFIER(self,i=None):
+        def IDENTIFIER(self, i=None):
             if i is None:
                 return self.getTokens(CParser.IDENTIFIER)
             else:
@@ -3047,23 +2913,20 @@ class CParser ( Parser ):
             return CParser.RULE_identifier_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterIdentifier_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterIdentifier_list"):
                 listener.enterIdentifier_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitIdentifier_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitIdentifier_list"):
                 listener.exitIdentifier_list(self)
 
-
-
-
     def identifier_list(self):
 
         localctx = CParser.Identifier_listContext(self, self._ctx, self.state)
         self.enterRule(localctx, 56, self.RULE_identifier_list)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 441
@@ -3071,7 +2934,7 @@ class CParser ( Parser ):
             self.state = 446
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__3:
+            while _la == CParser.T__3:
                 self.state = 442
                 self.match(CParser.T__3)
                 self.state = 443
@@ -3092,47 +2955,41 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def specifier_qualifier_list(self):
-            return self.getTypedRuleContext(CParser.Specifier_qualifier_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Specifier_qualifier_listContext, 0)
 
         def abstract_declarator(self):
-            return self.getTypedRuleContext(CParser.Abstract_declaratorContext,0)
-
+            return self.getTypedRuleContext(CParser.Abstract_declaratorContext, 0)
 
         def type_id(self):
-            return self.getTypedRuleContext(CParser.Type_idContext,0)
-
+            return self.getTypedRuleContext(CParser.Type_idContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_type_name
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterType_name" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterType_name"):
                 listener.enterType_name(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitType_name" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitType_name"):
                 listener.exitType_name(self)
 
-
-
-
     def type_name(self):
 
         localctx = CParser.Type_nameContext(self, self._ctx, self.state)
         self.enterRule(localctx, 58, self.RULE_type_name)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.state = 454
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,54,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 54, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 449
@@ -3144,7 +3001,6 @@ class CParser ( Parser ):
                     self.state = 450
                     self.abstract_declarator()
 
-
                 pass
 
             elif la_ == 2:
@@ -3153,7 +3009,6 @@ class CParser ( Parser ):
                 self.type_id()
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -3166,37 +3021,33 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def pointer(self):
-            return self.getTypedRuleContext(CParser.PointerContext,0)
-
+            return self.getTypedRuleContext(CParser.PointerContext, 0)
 
         def direct_abstract_declarator(self):
-            return self.getTypedRuleContext(CParser.Direct_abstract_declaratorContext,0)
-
+            return self.getTypedRuleContext(CParser.Direct_abstract_declaratorContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_abstract_declarator
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterAbstract_declarator" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterAbstract_declarator"):
                 listener.enterAbstract_declarator(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitAbstract_declarator" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitAbstract_declarator"):
                 listener.exitAbstract_declarator(self)
 
-
-
-
     def abstract_declarator(self):
 
-        localctx = CParser.Abstract_declaratorContext(self, self._ctx, self.state)
+        localctx = CParser.Abstract_declaratorContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 60, self.RULE_abstract_declarator)
         try:
             self.state = 461
@@ -3208,12 +3059,11 @@ class CParser ( Parser ):
                 self.pointer()
                 self.state = 458
                 self._errHandler.sync(self)
-                la_ = self._interp.adaptivePredict(self._input,55,self._ctx)
+                la_ = self._interp.adaptivePredict(self._input, 55, self._ctx)
                 if la_ == 1:
                     self.state = 457
                     self.direct_abstract_declarator()
 
-
                 pass
             elif token in [CParser.T__37, CParser.T__39]:
                 self.enterOuterAlt(localctx, 2)
@@ -3235,46 +3085,43 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def abstract_declarator(self):
-            return self.getTypedRuleContext(CParser.Abstract_declaratorContext,0)
-
+            return self.getTypedRuleContext(CParser.Abstract_declaratorContext, 0)
 
         # @param  i=None Type: int
-        def abstract_declarator_suffix(self,i=None):
+        def abstract_declarator_suffix(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Abstract_declarator_suffixContext)
             else:
-                return self.getTypedRuleContext(CParser.Abstract_declarator_suffixContext,i)
-
+                return self.getTypedRuleContext(CParser.Abstract_declarator_suffixContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_direct_abstract_declarator
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterDirect_abstract_declarator" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterDirect_abstract_declarator"):
                 listener.enterDirect_abstract_declarator(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitDirect_abstract_declarator" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitDirect_abstract_declarator"):
                 listener.exitDirect_abstract_declarator(self)
 
-
-
     def direct_abstract_declarator(self):
 
-        localctx = CParser.Direct_abstract_declaratorContext(self, self._ctx, self.state)
+        localctx = CParser.Direct_abstract_declaratorContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 62, self.RULE_direct_abstract_declarator)
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 468
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,57,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 57, self._ctx)
             if la_ == 1:
                 self.state = 463
                 self.match(CParser.T__37)
@@ -3289,17 +3136,16 @@ class CParser ( Parser ):
                 self.abstract_declarator_suffix()
                 pass
 
-
             self.state = 473
             self._errHandler.sync(self)
-            _alt = self._interp.adaptivePredict(self._input,58,self._ctx)
-            while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
-                if _alt==1:
+            _alt = self._interp.adaptivePredict(self._input, 58, self._ctx)
+            while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
+                if _alt == 1:
                     self.state = 470
                     self.abstract_declarator_suffix()
                 self.state = 475
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,58,self._ctx)
+                _alt = self._interp.adaptivePredict(self._input, 58, self._ctx)
 
         except RecognitionException as re:
             localctx.exception = re
@@ -3313,42 +3159,38 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def constant_expression(self):
-            return self.getTypedRuleContext(CParser.Constant_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Constant_expressionContext, 0)
 
         def parameter_type_list(self):
-            return self.getTypedRuleContext(CParser.Parameter_type_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Parameter_type_listContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_abstract_declarator_suffix
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterAbstract_declarator_suffix" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterAbstract_declarator_suffix"):
                 listener.enterAbstract_declarator_suffix(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitAbstract_declarator_suffix" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitAbstract_declarator_suffix"):
                 listener.exitAbstract_declarator_suffix(self)
 
-
-
-
     def abstract_declarator_suffix(self):
 
-        localctx = CParser.Abstract_declarator_suffixContext(self, self._ctx, self.state)
+        localctx = CParser.Abstract_declarator_suffixContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 64, self.RULE_abstract_declarator_suffix)
         try:
             self.state = 488
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,59,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 59, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 476
@@ -3385,7 +3227,6 @@ class CParser ( Parser ):
                 self.match(CParser.T__38)
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -3398,39 +3239,34 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def assignment_expression(self):
-            return self.getTypedRuleContext(CParser.Assignment_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Assignment_expressionContext, 0)
 
         def initializer_list(self):
-            return self.getTypedRuleContext(CParser.Initializer_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Initializer_listContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_initializer
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterInitializer" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterInitializer"):
                 listener.enterInitializer(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitInitializer" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitInitializer"):
                 listener.exitInitializer(self)
 
-
-
-
     def initializer(self):
 
         localctx = CParser.InitializerContext(self, self._ctx, self.state)
         self.enterRule(localctx, 66, self.RULE_initializer)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.state = 498
             self._errHandler.sync(self)
@@ -3449,11 +3285,10 @@ class CParser ( Parser ):
                 self.state = 494
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__3:
+                if _la == CParser.T__3:
                     self.state = 493
                     self.match(CParser.T__3)
 
-
                 self.state = 496
                 self.match(CParser.T__19)
                 pass
@@ -3472,34 +3307,30 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def initializer(self,i=None):
+        def initializer(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.InitializerContext)
             else:
-                return self.getTypedRuleContext(CParser.InitializerContext,i)
-
+                return self.getTypedRuleContext(CParser.InitializerContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_initializer_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterInitializer_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterInitializer_list"):
                 listener.enterInitializer_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitInitializer_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitInitializer_list"):
                 listener.exitInitializer_list(self)
 
-
-
-
     def initializer_list(self):
 
         localctx = CParser.Initializer_listContext(self, self._ctx, self.state)
@@ -3510,16 +3341,16 @@ class CParser ( Parser ):
             self.initializer()
             self.state = 505
             self._errHandler.sync(self)
-            _alt = self._interp.adaptivePredict(self._input,62,self._ctx)
-            while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
-                if _alt==1:
+            _alt = self._interp.adaptivePredict(self._input, 62, self._ctx)
+            while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
+                if _alt == 1:
                     self.state = 501
                     self.match(CParser.T__3)
                     self.state = 502
                     self.initializer()
                 self.state = 507
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,62,self._ctx)
+                _alt = self._interp.adaptivePredict(self._input, 62, self._ctx)
 
         except RecognitionException as re:
             localctx.exception = re
@@ -3533,39 +3364,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def assignment_expression(self,i=None):
+        def assignment_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Assignment_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.Assignment_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.Assignment_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_argument_expression_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterArgument_expression_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterArgument_expression_list"):
                 listener.enterArgument_expression_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitArgument_expression_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitArgument_expression_list"):
                 listener.exitArgument_expression_list(self)
 
-
-
-
     def argument_expression_list(self):
 
-        localctx = CParser.Argument_expression_listContext(self, self._ctx, self.state)
+        localctx = CParser.Argument_expression_listContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 70, self.RULE_argument_expression_list)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 508
@@ -3573,15 +3401,14 @@ class CParser ( Parser ):
             self.state = 510
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            if _la==CParser.T__28:
+            if _la == CParser.T__28:
                 self.state = 509
                 self.match(CParser.T__28)
 
-
             self.state = 519
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__3:
+            while _la == CParser.T__3:
                 self.state = 512
                 self.match(CParser.T__3)
                 self.state = 513
@@ -3589,11 +3416,10 @@ class CParser ( Parser ):
                 self.state = 515
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                if _la==CParser.T__28:
+                if _la == CParser.T__28:
                     self.state = 514
                     self.match(CParser.T__28)
 
-
                 self.state = 521
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
@@ -3610,39 +3436,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def multiplicative_expression(self,i=None):
+        def multiplicative_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Multiplicative_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.Multiplicative_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.Multiplicative_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_additive_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterAdditive_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterAdditive_expression"):
                 listener.enterAdditive_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitAdditive_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitAdditive_expression"):
                 listener.exitAdditive_expression(self)
 
-
-
-
     def additive_expression(self):
 
-        localctx = CParser.Additive_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Additive_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 72, self.RULE_additive_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 522
@@ -3650,7 +3473,7 @@ class CParser ( Parser ):
             self.state = 529
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__43 or _la==CParser.T__44:
+            while _la == CParser.T__43 or _la == CParser.T__44:
                 self.state = 527
                 self._errHandler.sync(self)
                 token = self._input.LA(1)
@@ -3685,39 +3508,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def cast_expression(self,i=None):
+        def cast_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Cast_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.Cast_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.Cast_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_multiplicative_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterMultiplicative_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterMultiplicative_expression"):
                 listener.enterMultiplicative_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitMultiplicative_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitMultiplicative_expression"):
                 listener.exitMultiplicative_expression(self)
 
-
-
-
     def multiplicative_expression(self):
 
-        localctx = CParser.Multiplicative_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Multiplicative_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 74, self.RULE_multiplicative_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 532
@@ -3766,38 +3586,32 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def type_name(self):
-            return self.getTypedRuleContext(CParser.Type_nameContext,0)
-
+            return self.getTypedRuleContext(CParser.Type_nameContext, 0)
 
         def cast_expression(self):
-            return self.getTypedRuleContext(CParser.Cast_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Cast_expressionContext, 0)
 
         def unary_expression(self):
-            return self.getTypedRuleContext(CParser.Unary_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Unary_expressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_cast_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterCast_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterCast_expression"):
                 listener.enterCast_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitCast_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitCast_expression"):
                 listener.exitCast_expression(self)
 
-
-
-
     def cast_expression(self):
 
         localctx = CParser.Cast_expressionContext(self, self._ctx, self.state)
@@ -3805,7 +3619,7 @@ class CParser ( Parser ):
         try:
             self.state = 550
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,70,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 70, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 544
@@ -3824,7 +3638,6 @@ class CParser ( Parser ):
                 self.unary_expression()
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -3837,46 +3650,38 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def postfix_expression(self):
-            return self.getTypedRuleContext(CParser.Postfix_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Postfix_expressionContext, 0)
 
         def unary_expression(self):
-            return self.getTypedRuleContext(CParser.Unary_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Unary_expressionContext, 0)
 
         def unary_operator(self):
-            return self.getTypedRuleContext(CParser.Unary_operatorContext,0)
-
+            return self.getTypedRuleContext(CParser.Unary_operatorContext, 0)
 
         def cast_expression(self):
-            return self.getTypedRuleContext(CParser.Cast_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Cast_expressionContext, 0)
 
         def type_name(self):
-            return self.getTypedRuleContext(CParser.Type_nameContext,0)
-
+            return self.getTypedRuleContext(CParser.Type_nameContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_unary_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterUnary_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterUnary_expression"):
                 listener.enterUnary_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitUnary_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitUnary_expression"):
                 listener.exitUnary_expression(self)
 
-
-
-
     def unary_expression(self):
 
         localctx = CParser.Unary_expressionContext(self, self._ctx, self.state)
@@ -3884,7 +3689,7 @@ class CParser ( Parser ):
         try:
             self.state = 567
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,71,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 71, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 552
@@ -3935,7 +3740,6 @@ class CParser ( Parser ):
                 self.match(CParser.T__38)
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -3948,48 +3752,44 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
             self.FuncCallText = ''
-            self.p = None # Primary_expressionContext
-            self.a = None # Token
-            self.c = None # Argument_expression_listContext
-            self.b = None # Token
-            self.x = None # Token
-            self.y = None # Token
-            self.z = None # Token
+            self.p = None  # Primary_expressionContext
+            self.a = None  # Token
+            self.c = None  # Argument_expression_listContext
+            self.b = None  # Token
+            self.x = None  # Token
+            self.y = None  # Token
+            self.z = None  # Token
 
         def primary_expression(self):
-            return self.getTypedRuleContext(CParser.Primary_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Primary_expressionContext, 0)
 
         # @param  i=None Type: int
-        def expression(self,i=None):
+        def expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.ExpressionContext)
             else:
-                return self.getTypedRuleContext(CParser.ExpressionContext,i)
-
+                return self.getTypedRuleContext(CParser.ExpressionContext, i)
 
         # @param  i=None Type: int
-        def macro_parameter_list(self,i=None):
+        def macro_parameter_list(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Macro_parameter_listContext)
             else:
-                return self.getTypedRuleContext(CParser.Macro_parameter_listContext,i)
-
+                return self.getTypedRuleContext(CParser.Macro_parameter_listContext, i)
 
         # @param  i=None Type: int
-        def argument_expression_list(self,i=None):
+        def argument_expression_list(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Argument_expression_listContext)
             else:
-                return self.getTypedRuleContext(CParser.Argument_expression_listContext,i)
-
+                return self.getTypedRuleContext(CParser.Argument_expression_listContext, i)
 
         # @param  i=None Type: int
-        def IDENTIFIER(self,i=None):
+        def IDENTIFIER(self, i=None):
             if i is None:
                 return self.getTokens(CParser.IDENTIFIER)
             else:
@@ -3999,38 +3799,38 @@ class CParser ( Parser ):
             return CParser.RULE_postfix_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterPostfix_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterPostfix_expression"):
                 listener.enterPostfix_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitPostfix_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitPostfix_expression"):
                 listener.exitPostfix_expression(self)
 
-
-
-
     def postfix_expression(self):
 
-        localctx = CParser.Postfix_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Postfix_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 80, self.RULE_postfix_expression)
 
-        self.FuncCallText=''
+        self.FuncCallText = ''
 
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 569
             localctx.p = self.primary_expression()
-            self.FuncCallText += (None if localctx.p is None else self._input.getText((localctx.p.start,localctx.p.stop)))
+            self.FuncCallText += (None if localctx.p is None else self._input.getText(
+                (localctx.p.start, localctx.p.stop)))
             self.state = 600
             self._errHandler.sync(self)
-            _alt = self._interp.adaptivePredict(self._input,73,self._ctx)
-            while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
-                if _alt==1:
+            _alt = self._interp.adaptivePredict(self._input, 73, self._ctx)
+            while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
+                if _alt == 1:
                     self.state = 598
                     self._errHandler.sync(self)
-                    la_ = self._interp.adaptivePredict(self._input,72,self._ctx)
+                    la_ = self._interp.adaptivePredict(
+                        self._input, 72, self._ctx)
                     if la_ == 1:
                         self.state = 571
                         self.match(CParser.T__39)
@@ -4045,7 +3845,8 @@ class CParser ( Parser ):
                         self.match(CParser.T__37)
                         self.state = 576
                         localctx.a = self.match(CParser.T__38)
-                        self.StoreFunctionCalling((None if localctx.p is None else localctx.p.start).line, (None if localctx.p is None else localctx.p.start).column, (0 if localctx.a is None else localctx.a.line), localctx.a.column, self.FuncCallText, '')
+                        self.StoreFunctionCalling((None if localctx.p is None else localctx.p.start).line, (None if localctx.p is None else localctx.p.start).column, (
+                            0 if localctx.a is None else localctx.a.line), localctx.a.column, self.FuncCallText, '')
                         pass
 
                     elif la_ == 3:
@@ -4055,7 +3856,8 @@ class CParser ( Parser ):
                         localctx.c = self.argument_expression_list()
                         self.state = 580
                         localctx.b = self.match(CParser.T__38)
-                        self.StoreFunctionCalling((None if localctx.p is None else localctx.p.start).line, (None if localctx.p is None else localctx.p.start).column, (0 if localctx.b is None else localctx.b.line), localctx.b.column, self.FuncCallText, (None if localctx.c is None else self._input.getText((localctx.c.start,localctx.c.stop))))
+                        self.StoreFunctionCalling((None if localctx.p is None else localctx.p.start).line, (None if localctx.p is None else localctx.p.start).column, (
+                            0 if localctx.b is None else localctx.b.line), localctx.b.column, self.FuncCallText, (None if localctx.c is None else self._input.getText((localctx.c.start, localctx.c.stop))))
                         pass
 
                     elif la_ == 4:
@@ -4072,7 +3874,8 @@ class CParser ( Parser ):
                         self.match(CParser.T__50)
                         self.state = 588
                         localctx.x = self.match(CParser.IDENTIFIER)
-                        self.FuncCallText += '.' + (None if localctx.x is None else localctx.x.text)
+                        self.FuncCallText += '.' + \
+                            (None if localctx.x is None else localctx.x.text)
                         pass
 
                     elif la_ == 6:
@@ -4080,7 +3883,8 @@ class CParser ( Parser ):
                         self.match(CParser.T__41)
                         self.state = 591
                         localctx.y = self.match(CParser.IDENTIFIER)
-                        self.FuncCallText = (None if localctx.y is None else localctx.y.text)
+                        self.FuncCallText = (
+                            None if localctx.y is None else localctx.y.text)
                         pass
 
                     elif la_ == 7:
@@ -4088,7 +3892,8 @@ class CParser ( Parser ):
                         self.match(CParser.T__51)
                         self.state = 594
                         localctx.z = self.match(CParser.IDENTIFIER)
-                        self.FuncCallText += '->' + (None if localctx.z is None else localctx.z.text)
+                        self.FuncCallText += '->' + \
+                            (None if localctx.z is None else localctx.z.text)
                         pass
 
                     elif la_ == 8:
@@ -4101,10 +3906,9 @@ class CParser ( Parser ):
                         self.match(CParser.T__48)
                         pass
 
-
                 self.state = 602
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,73,self._ctx)
+                _alt = self._interp.adaptivePredict(self._input, 73, self._ctx)
 
         except RecognitionException as re:
             localctx.exception = re
@@ -4118,39 +3922,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def parameter_declaration(self,i=None):
+        def parameter_declaration(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Parameter_declarationContext)
             else:
-                return self.getTypedRuleContext(CParser.Parameter_declarationContext,i)
-
+                return self.getTypedRuleContext(CParser.Parameter_declarationContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_macro_parameter_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterMacro_parameter_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterMacro_parameter_list"):
                 listener.enterMacro_parameter_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitMacro_parameter_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitMacro_parameter_list"):
                 listener.exitMacro_parameter_list(self)
 
-
-
-
     def macro_parameter_list(self):
 
-        localctx = CParser.Macro_parameter_listContext(self, self._ctx, self.state)
+        localctx = CParser.Macro_parameter_listContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 82, self.RULE_macro_parameter_list)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 603
@@ -4158,7 +3959,7 @@ class CParser ( Parser ):
             self.state = 608
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__3:
+            while _la == CParser.T__3:
                 self.state = 604
                 self.match(CParser.T__3)
                 self.state = 605
@@ -4179,32 +3980,28 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
-
         def getRuleIndex(self):
             return CParser.RULE_unary_operator
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterUnary_operator" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterUnary_operator"):
                 listener.enterUnary_operator(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitUnary_operator" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitUnary_operator"):
                 listener.exitUnary_operator(self)
 
-
-
-
     def unary_operator(self):
 
         localctx = CParser.Unary_operatorContext(self, self._ctx, self.state)
         self.enterRule(localctx, 84, self.RULE_unary_operator)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 611
@@ -4226,7 +4023,7 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
@@ -4234,37 +4031,33 @@ class CParser ( Parser ):
             return self.getToken(CParser.IDENTIFIER, 0)
 
         def constant(self):
-            return self.getTypedRuleContext(CParser.ConstantContext,0)
-
+            return self.getTypedRuleContext(CParser.ConstantContext, 0)
 
         def expression(self):
-            return self.getTypedRuleContext(CParser.ExpressionContext,0)
-
+            return self.getTypedRuleContext(CParser.ExpressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_primary_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterPrimary_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterPrimary_expression"):
                 listener.enterPrimary_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitPrimary_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitPrimary_expression"):
                 listener.exitPrimary_expression(self)
 
-
-
-
     def primary_expression(self):
 
-        localctx = CParser.Primary_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Primary_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 86, self.RULE_primary_expression)
         try:
             self.state = 619
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,75,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 75, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 613
@@ -4287,7 +4080,6 @@ class CParser ( Parser ):
                 self.match(CParser.T__38)
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -4300,7 +4092,7 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
@@ -4317,14 +4109,14 @@ class CParser ( Parser ):
             return self.getToken(CParser.CHARACTER_LITERAL, 0)
 
         # @param  i=None Type: int
-        def IDENTIFIER(self,i=None):
+        def IDENTIFIER(self, i=None):
             if i is None:
                 return self.getTokens(CParser.IDENTIFIER)
             else:
                 return self.getToken(CParser.IDENTIFIER, i)
 
         # @param  i=None Type: int
-        def STRING_LITERAL(self,i=None):
+        def STRING_LITERAL(self, i=None):
             if i is None:
                 return self.getTokens(CParser.STRING_LITERAL)
             else:
@@ -4337,23 +4129,20 @@ class CParser ( Parser ):
             return CParser.RULE_constant
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterConstant" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterConstant"):
                 listener.enterConstant(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitConstant" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitConstant"):
                 listener.exitConstant(self)
 
-
-
-
     def constant(self):
 
         localctx = CParser.ConstantContext(self, self._ctx, self.state)
         self.enterRule(localctx, 88, self.RULE_constant)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.state = 647
             self._errHandler.sync(self)
@@ -4383,12 +4172,12 @@ class CParser ( Parser ):
                 self.state = 636
                 self._errHandler.sync(self)
                 _alt = 1
-                while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
+                while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
                     if _alt == 1:
                         self.state = 628
                         self._errHandler.sync(self)
                         _la = self._input.LA(1)
-                        while _la==CParser.IDENTIFIER:
+                        while _la == CParser.IDENTIFIER:
                             self.state = 625
                             self.match(CParser.IDENTIFIER)
                             self.state = 630
@@ -4398,7 +4187,7 @@ class CParser ( Parser ):
                         self.state = 632
                         self._errHandler.sync(self)
                         _alt = 1
-                        while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
+                        while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
                             if _alt == 1:
                                 self.state = 631
                                 self.match(CParser.STRING_LITERAL)
@@ -4407,19 +4196,20 @@ class CParser ( Parser ):
                                 raise NoViableAltException(self)
                             self.state = 634
                             self._errHandler.sync(self)
-                            _alt = self._interp.adaptivePredict(self._input,77,self._ctx)
-
+                            _alt = self._interp.adaptivePredict(
+                                self._input, 77, self._ctx)
 
                     else:
                         raise NoViableAltException(self)
                     self.state = 638
                     self._errHandler.sync(self)
-                    _alt = self._interp.adaptivePredict(self._input,78,self._ctx)
+                    _alt = self._interp.adaptivePredict(
+                        self._input, 78, self._ctx)
 
                 self.state = 643
                 self._errHandler.sync(self)
                 _la = self._input.LA(1)
-                while _la==CParser.IDENTIFIER:
+                while _la == CParser.IDENTIFIER:
                     self.state = 640
                     self.match(CParser.IDENTIFIER)
                     self.state = 645
@@ -4447,39 +4237,35 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def assignment_expression(self,i=None):
+        def assignment_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Assignment_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.Assignment_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.Assignment_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterExpression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterExpression"):
                 listener.enterExpression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitExpression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitExpression"):
                 listener.exitExpression(self)
 
-
-
-
     def expression(self):
 
         localctx = CParser.ExpressionContext(self, self._ctx, self.state)
         self.enterRule(localctx, 90, self.RULE_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 649
@@ -4487,7 +4273,7 @@ class CParser ( Parser ):
             self.state = 654
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__3:
+            while _la == CParser.T__3:
                 self.state = 650
                 self.match(CParser.T__3)
                 self.state = 651
@@ -4508,33 +4294,30 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def conditional_expression(self):
-            return self.getTypedRuleContext(CParser.Conditional_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Conditional_expressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_constant_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterConstant_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterConstant_expression"):
                 listener.enterConstant_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitConstant_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitConstant_expression"):
                 listener.exitConstant_expression(self)
 
-
-
-
     def constant_expression(self):
 
-        localctx = CParser.Constant_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Constant_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 92, self.RULE_constant_expression)
         try:
             self.enterOuterAlt(localctx, 1)
@@ -4552,50 +4335,44 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def lvalue(self):
-            return self.getTypedRuleContext(CParser.LvalueContext,0)
-
+            return self.getTypedRuleContext(CParser.LvalueContext, 0)
 
         def assignment_operator(self):
-            return self.getTypedRuleContext(CParser.Assignment_operatorContext,0)
-
+            return self.getTypedRuleContext(CParser.Assignment_operatorContext, 0)
 
         def assignment_expression(self):
-            return self.getTypedRuleContext(CParser.Assignment_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Assignment_expressionContext, 0)
 
         def conditional_expression(self):
-            return self.getTypedRuleContext(CParser.Conditional_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Conditional_expressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_assignment_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterAssignment_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterAssignment_expression"):
                 listener.enterAssignment_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitAssignment_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitAssignment_expression"):
                 listener.exitAssignment_expression(self)
 
-
-
-
     def assignment_expression(self):
 
-        localctx = CParser.Assignment_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Assignment_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 94, self.RULE_assignment_expression)
         try:
             self.state = 664
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,82,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 82, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 659
@@ -4612,7 +4389,6 @@ class CParser ( Parser ):
                 self.conditional_expression()
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -4625,30 +4401,26 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def unary_expression(self):
-            return self.getTypedRuleContext(CParser.Unary_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Unary_expressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_lvalue
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterLvalue" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterLvalue"):
                 listener.enterLvalue(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitLvalue" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitLvalue"):
                 listener.exitLvalue(self)
 
-
-
-
     def lvalue(self):
 
         localctx = CParser.LvalueContext(self, self._ctx, self.state)
@@ -4669,32 +4441,29 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
-
         def getRuleIndex(self):
             return CParser.RULE_assignment_operator
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterAssignment_operator" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterAssignment_operator"):
                 listener.enterAssignment_operator(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitAssignment_operator" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitAssignment_operator"):
                 listener.exitAssignment_operator(self)
 
-
-
-
     def assignment_operator(self):
 
-        localctx = CParser.Assignment_operatorContext(self, self._ctx, self.state)
+        localctx = CParser.Assignment_operatorContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 98, self.RULE_assignment_operator)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 668
@@ -4716,44 +4485,39 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
-            self.e = None # Logical_or_expressionContext
+            self.e = None  # Logical_or_expressionContext
 
         def logical_or_expression(self):
-            return self.getTypedRuleContext(CParser.Logical_or_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Logical_or_expressionContext, 0)
 
         def expression(self):
-            return self.getTypedRuleContext(CParser.ExpressionContext,0)
-
+            return self.getTypedRuleContext(CParser.ExpressionContext, 0)
 
         def conditional_expression(self):
-            return self.getTypedRuleContext(CParser.Conditional_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Conditional_expressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_conditional_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterConditional_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterConditional_expression"):
                 listener.enterConditional_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitConditional_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitConditional_expression"):
                 listener.exitConditional_expression(self)
 
-
-
-
     def conditional_expression(self):
 
-        localctx = CParser.Conditional_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Conditional_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 100, self.RULE_conditional_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 670
@@ -4761,7 +4525,7 @@ class CParser ( Parser ):
             self.state = 677
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            if _la==CParser.T__65:
+            if _la == CParser.T__65:
                 self.state = 671
                 self.match(CParser.T__65)
                 self.state = 672
@@ -4770,8 +4534,8 @@ class CParser ( Parser ):
                 self.match(CParser.T__22)
                 self.state = 674
                 self.conditional_expression()
-                self.StorePredicateExpression((None if localctx.e is None else localctx.e.start).line, (None if localctx.e is None else localctx.e.start).column, (None if localctx.e is None else localctx.e.stop).line, (None if localctx.e is None else localctx.e.stop).column, (None if localctx.e is None else self._input.getText((localctx.e.start,localctx.e.stop))))
-
+                self.StorePredicateExpression((None if localctx.e is None else localctx.e.start).line, (None if localctx.e is None else localctx.e.start).column, (None if localctx.e is None else localctx.e.stop).line, (
+                    None if localctx.e is None else localctx.e.stop).column, (None if localctx.e is None else self._input.getText((localctx.e.start, localctx.e.stop))))
 
         except RecognitionException as re:
             localctx.exception = re
@@ -4785,39 +4549,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def logical_and_expression(self,i=None):
+        def logical_and_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Logical_and_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.Logical_and_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.Logical_and_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_logical_or_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterLogical_or_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterLogical_or_expression"):
                 listener.enterLogical_or_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitLogical_or_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitLogical_or_expression"):
                 listener.exitLogical_or_expression(self)
 
-
-
-
     def logical_or_expression(self):
 
-        localctx = CParser.Logical_or_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Logical_or_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 102, self.RULE_logical_or_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 679
@@ -4825,7 +4586,7 @@ class CParser ( Parser ):
             self.state = 684
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__66:
+            while _la == CParser.T__66:
                 self.state = 680
                 self.match(CParser.T__66)
                 self.state = 681
@@ -4846,39 +4607,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def inclusive_or_expression(self,i=None):
+        def inclusive_or_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Inclusive_or_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.Inclusive_or_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.Inclusive_or_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_logical_and_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterLogical_and_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterLogical_and_expression"):
                 listener.enterLogical_and_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitLogical_and_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitLogical_and_expression"):
                 listener.exitLogical_and_expression(self)
 
-
-
-
     def logical_and_expression(self):
 
-        localctx = CParser.Logical_and_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Logical_and_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 104, self.RULE_logical_and_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 687
@@ -4886,7 +4644,7 @@ class CParser ( Parser ):
             self.state = 692
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__67:
+            while _la == CParser.T__67:
                 self.state = 688
                 self.match(CParser.T__67)
                 self.state = 689
@@ -4907,39 +4665,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def exclusive_or_expression(self,i=None):
+        def exclusive_or_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Exclusive_or_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.Exclusive_or_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.Exclusive_or_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_inclusive_or_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterInclusive_or_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterInclusive_or_expression"):
                 listener.enterInclusive_or_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitInclusive_or_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitInclusive_or_expression"):
                 listener.exitInclusive_or_expression(self)
 
-
-
-
     def inclusive_or_expression(self):
 
-        localctx = CParser.Inclusive_or_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Inclusive_or_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 106, self.RULE_inclusive_or_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 695
@@ -4947,7 +4702,7 @@ class CParser ( Parser ):
             self.state = 700
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__68:
+            while _la == CParser.T__68:
                 self.state = 696
                 self.match(CParser.T__68)
                 self.state = 697
@@ -4968,39 +4723,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def and_expression(self,i=None):
+        def and_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.And_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.And_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.And_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_exclusive_or_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterExclusive_or_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterExclusive_or_expression"):
                 listener.enterExclusive_or_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitExclusive_or_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitExclusive_or_expression"):
                 listener.exitExclusive_or_expression(self)
 
-
-
-
     def exclusive_or_expression(self):
 
-        localctx = CParser.Exclusive_or_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Exclusive_or_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 108, self.RULE_exclusive_or_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 703
@@ -5008,7 +4760,7 @@ class CParser ( Parser ):
             self.state = 708
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__69:
+            while _la == CParser.T__69:
                 self.state = 704
                 self.match(CParser.T__69)
                 self.state = 705
@@ -5029,39 +4781,35 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def equality_expression(self,i=None):
+        def equality_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Equality_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.Equality_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.Equality_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_and_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterAnd_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterAnd_expression"):
                 listener.enterAnd_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitAnd_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitAnd_expression"):
                 listener.exitAnd_expression(self)
 
-
-
-
     def and_expression(self):
 
         localctx = CParser.And_expressionContext(self, self._ctx, self.state)
         self.enterRule(localctx, 110, self.RULE_and_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 711
@@ -5069,7 +4817,7 @@ class CParser ( Parser ):
             self.state = 716
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__52:
+            while _la == CParser.T__52:
                 self.state = 712
                 self.match(CParser.T__52)
                 self.state = 713
@@ -5090,39 +4838,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def relational_expression(self,i=None):
+        def relational_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Relational_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.Relational_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.Relational_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_equality_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterEquality_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterEquality_expression"):
                 listener.enterEquality_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitEquality_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitEquality_expression"):
                 listener.exitEquality_expression(self)
 
-
-
-
     def equality_expression(self):
 
-        localctx = CParser.Equality_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Equality_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 112, self.RULE_equality_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 719
@@ -5130,10 +4875,10 @@ class CParser ( Parser ):
             self.state = 724
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__70 or _la==CParser.T__71:
+            while _la == CParser.T__70 or _la == CParser.T__71:
                 self.state = 720
                 _la = self._input.LA(1)
-                if not(_la==CParser.T__70 or _la==CParser.T__71):
+                if not(_la == CParser.T__70 or _la == CParser.T__71):
                     self._errHandler.recoverInline(self)
                 else:
                     self._errHandler.reportMatch(self)
@@ -5156,39 +4901,36 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def shift_expression(self,i=None):
+        def shift_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Shift_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.Shift_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.Shift_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_relational_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterRelational_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterRelational_expression"):
                 listener.enterRelational_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitRelational_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitRelational_expression"):
                 listener.exitRelational_expression(self)
 
-
-
-
     def relational_expression(self):
 
-        localctx = CParser.Relational_expressionContext(self, self._ctx, self.state)
+        localctx = CParser.Relational_expressionContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 114, self.RULE_relational_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 727
@@ -5222,39 +4964,35 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def additive_expression(self,i=None):
+        def additive_expression(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.Additive_expressionContext)
             else:
-                return self.getTypedRuleContext(CParser.Additive_expressionContext,i)
-
+                return self.getTypedRuleContext(CParser.Additive_expressionContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_shift_expression
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterShift_expression" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterShift_expression"):
                 listener.enterShift_expression(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitShift_expression" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitShift_expression"):
                 listener.exitShift_expression(self)
 
-
-
-
     def shift_expression(self):
 
         localctx = CParser.Shift_expressionContext(self, self._ctx, self.state)
         self.enterRule(localctx, 116, self.RULE_shift_expression)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 735
@@ -5262,10 +5000,10 @@ class CParser ( Parser ):
             self.state = 740
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            while _la==CParser.T__76 or _la==CParser.T__77:
+            while _la == CParser.T__76 or _la == CParser.T__77:
                 self.state = 736
                 _la = self._input.LA(1)
-                if not(_la==CParser.T__76 or _la==CParser.T__77):
+                if not(_la == CParser.T__76 or _la == CParser.T__77):
                     self._errHandler.recoverInline(self)
                 else:
                     self._errHandler.reportMatch(self)
@@ -5288,70 +5026,56 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def labeled_statement(self):
-            return self.getTypedRuleContext(CParser.Labeled_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Labeled_statementContext, 0)
 
         def compound_statement(self):
-            return self.getTypedRuleContext(CParser.Compound_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Compound_statementContext, 0)
 
         def expression_statement(self):
-            return self.getTypedRuleContext(CParser.Expression_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Expression_statementContext, 0)
 
         def selection_statement(self):
-            return self.getTypedRuleContext(CParser.Selection_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Selection_statementContext, 0)
 
         def iteration_statement(self):
-            return self.getTypedRuleContext(CParser.Iteration_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Iteration_statementContext, 0)
 
         def jump_statement(self):
-            return self.getTypedRuleContext(CParser.Jump_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Jump_statementContext, 0)
 
         def macro_statement(self):
-            return self.getTypedRuleContext(CParser.Macro_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Macro_statementContext, 0)
 
         def asm2_statement(self):
-            return self.getTypedRuleContext(CParser.Asm2_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Asm2_statementContext, 0)
 
         def asm1_statement(self):
-            return self.getTypedRuleContext(CParser.Asm1_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Asm1_statementContext, 0)
 
         def asm_statement(self):
-            return self.getTypedRuleContext(CParser.Asm_statementContext,0)
-
+            return self.getTypedRuleContext(CParser.Asm_statementContext, 0)
 
         def declaration(self):
-            return self.getTypedRuleContext(CParser.DeclarationContext,0)
-
+            return self.getTypedRuleContext(CParser.DeclarationContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_statement
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterStatement" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterStatement"):
                 listener.enterStatement(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitStatement" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitStatement"):
                 listener.exitStatement(self)
 
-
-
-
     def statement(self):
 
         localctx = CParser.StatementContext(self, self._ctx, self.state)
@@ -5359,7 +5083,7 @@ class CParser ( Parser ):
         try:
             self.state = 754
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,92,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 92, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 743
@@ -5426,7 +5150,6 @@ class CParser ( Parser ):
                 self.declaration()
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -5439,7 +5162,7 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
@@ -5450,52 +5173,48 @@ class CParser ( Parser ):
             return CParser.RULE_asm2_statement
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterAsm2_statement" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterAsm2_statement"):
                 listener.enterAsm2_statement(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitAsm2_statement" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitAsm2_statement"):
                 listener.exitAsm2_statement(self)
 
-
-
-
     def asm2_statement(self):
 
         localctx = CParser.Asm2_statementContext(self, self._ctx, self.state)
         self.enterRule(localctx, 120, self.RULE_asm2_statement)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 757
             self._errHandler.sync(self)
             _la = self._input.LA(1)
-            if _la==CParser.T__78:
+            if _la == CParser.T__78:
                 self.state = 756
                 self.match(CParser.T__78)
 
-
             self.state = 759
             self.match(CParser.IDENTIFIER)
             self.state = 760
             self.match(CParser.T__37)
             self.state = 764
             self._errHandler.sync(self)
-            _alt = self._interp.adaptivePredict(self._input,94,self._ctx)
-            while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
-                if _alt==1:
+            _alt = self._interp.adaptivePredict(self._input, 94, self._ctx)
+            while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
+                if _alt == 1:
                     self.state = 761
                     _la = self._input.LA(1)
-                    if _la <= 0 or _la==CParser.T__1:
+                    if _la <= 0 or _la == CParser.T__1:
                         self._errHandler.recoverInline(self)
                     else:
                         self._errHandler.reportMatch(self)
                         self.consume()
                 self.state = 766
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,94,self._ctx)
+                _alt = self._interp.adaptivePredict(self._input, 94, self._ctx)
 
             self.state = 767
             self.match(CParser.T__38)
@@ -5513,32 +5232,28 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
-
         def getRuleIndex(self):
             return CParser.RULE_asm1_statement
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterAsm1_statement" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterAsm1_statement"):
                 listener.enterAsm1_statement(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitAsm1_statement" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitAsm1_statement"):
                 listener.exitAsm1_statement(self)
 
-
-
-
     def asm1_statement(self):
 
         localctx = CParser.Asm1_statementContext(self, self._ctx, self.state)
         self.enterRule(localctx, 122, self.RULE_asm1_statement)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 770
@@ -5551,7 +5266,7 @@ class CParser ( Parser ):
             while (((_la) & ~0x3f) == 0 and ((1 << _la) & ((1 << CParser.T__0) | (1 << CParser.T__1) | (1 << CParser.T__2) | (1 << CParser.T__3) | (1 << CParser.T__4) | (1 << CParser.T__5) | (1 << CParser.T__6) | (1 << CParser.T__7) | (1 << CParser.T__8) | (1 << CParser.T__9) | (1 << CParser.T__10) | (1 << CParser.T__11) | (1 << CParser.T__12) | (1 << CParser.T__13) | (1 << CParser.T__14) | (1 << CParser.T__15) | (1 << CParser.T__16) | (1 << CParser.T__17) | (1 << CParser.T__18) | (1 << CParser.T__20) | (1 << CParser.T__21) | (1 << CParser.T__22) | (1 << CParser.T__23) | (1 << CParser.T__24) | (1 << CParser.T__25) | (1 << CParser.T__26) | (1 << CParser.T__27) | (1 << CParser.T__28) | (1 << CParser.T__29) | (1 << CParser.T__30) | (1 << CParser.T__31) | (1 << CParser.T__32) | (1 << CParser.T__33) | (1 << CParser.T__34) | (1 << CParser.T__35) | (1 << CParser.T__36) | (1 << CParser.T__37) | (1 << CParser.T__38) | (1 << CParser.T__39) | (1 << CParser.T__40) | (1 << CParser.T__41) | (1 << CParser.T__42) | (1 << CParser.T__43) | (1 << CParser.T__44) | (1 << CParser.T__45) | (1 << CParser.T__46) | (1 << CParser.T__47) | (1 << CParser.T__48) | (1 << CParser.T__49) | (1 << CParser.T__50) | (1 << CParser.T__51) | (1 << CParser.T__52) | (1 << CParser.T__53) | (1 << CParser.T__54) | (1 << CParser.T__55) | (1 << CParser.T__56) | (1 << CParser.T__57) | (1 << CParser.T__58) | (1 << CParser.T__59) | (1 << CParser.T__60) | (1 << CParser.T__61) | (1 << CParser.T__62))) != 0) or ((((_la - 64)) & ~0x3f) == 0 and ((1 << (_la - 64)) & ((1 << (CParser.T__63 - 64)) | (1 << (CParser.T__64 - 64)) | (1 << (CParser.T__65 - 64)) | (1 << (CParser.T__66 - 64)) | (1 << (CParser.T__67 - 64)) | (1 << (CParser.T__68 - 64)) | (1 << (CParser.T__69 - 64)) | (1 << (CParser.T__70 - 64)) | (1 << (CParser.T__71 - 64)) | (1 << (CParser.T__72 - 64)) | (1 << (CParser.T__73 - 64)) | (1 << (CParser.T__74 - 64)) | (1 << (CParser.T__75 - 64)) | (1 << (CParser.T__76 - 64)) | (1 << (CParser.T__77 - 64)) | (1 << (CParser.T__78 - 64)) | (1 << (CParser.T__79 - 64)) | (1 << (CParser.T__80 - 64)) | (1 << (CParser.T__81 - 64)) | (1 << (CParser.T__82 - 64)) | (1 << (CParser.T__83 - 64)) | (1 << (CParser.T__84 - 64)) | (1 << (CParser.T__85 - 64)) | (1 << (CParser.T__86 - 64)) | (1 << (CParser.T__87 - 64)) | (1 << (CParser.T__88 - 64)) | (1 << (CParser.T__89 - 64)) | (1 << (CParser.T__90 - 64)) | (1 << (CParser.T__91 - 64)) | (1 << (CParser.IDENTIFIER - 64)) | (1 << (CParser.CHARACTER_LITERAL - 64)) | (1 << (CParser.STRING_LITERAL - 64)) | (1 << (CParser.HEX_LITERAL - 64)) | (1 << (CParser.DECIMAL_LITERAL - 64)) | (1 << (CParser.OCTAL_LITERAL - 64)) | (1 << (CParser.FLOATING_POINT_LITERAL - 64)) | (1 << (CParser.WS - 64)) | (1 << (CParser.BS - 64)) | (1 << (CParser.UnicodeVocabulary - 64)) | (1 << (CParser.COMMENT - 64)) | (1 << (CParser.LINE_COMMENT - 64)) | (1 << (CParser.LINE_COMMAND - 64)))) != 0):
                 self.state = 772
                 _la = self._input.LA(1)
-                if _la <= 0 or _la==CParser.T__19:
+                if _la <= 0 or _la == CParser.T__19:
                     self._errHandler.recoverInline(self)
                 else:
                     self._errHandler.reportMatch(self)
@@ -5574,32 +5289,28 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
-
         def getRuleIndex(self):
             return CParser.RULE_asm_statement
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterAsm_statement" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterAsm_statement"):
                 listener.enterAsm_statement(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitAsm_statement" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitAsm_statement"):
                 listener.exitAsm_statement(self)
 
-
-
-
     def asm_statement(self):
 
         localctx = CParser.Asm_statementContext(self, self._ctx, self.state)
         self.enterRule(localctx, 124, self.RULE_asm_statement)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 780
@@ -5612,7 +5323,7 @@ class CParser ( Parser ):
             while (((_la) & ~0x3f) == 0 and ((1 << _la) & ((1 << CParser.T__0) | (1 << CParser.T__1) | (1 << CParser.T__2) | (1 << CParser.T__3) | (1 << CParser.T__4) | (1 << CParser.T__5) | (1 << CParser.T__6) | (1 << CParser.T__7) | (1 << CParser.T__8) | (1 << CParser.T__9) | (1 << CParser.T__10) | (1 << CParser.T__11) | (1 << CParser.T__12) | (1 << CParser.T__13) | (1 << CParser.T__14) | (1 << CParser.T__15) | (1 << CParser.T__16) | (1 << CParser.T__17) | (1 << CParser.T__18) | (1 << CParser.T__20) | (1 << CParser.T__21) | (1 << CParser.T__22) | (1 << CParser.T__23) | (1 << CParser.T__24) | (1 << CParser.T__25) | (1 << CParser.T__26) | (1 << CParser.T__27) | (1 << CParser.T__28) | (1 << CParser.T__29) | (1 << CParser.T__30) | (1 << CParser.T__31) | (1 << CParser.T__32) | (1 << CParser.T__33) | (1 << CParser.T__34) | (1 << CParser.T__35) | (1 << CParser.T__36) | (1 << CParser.T__37) | (1 << CParser.T__38) | (1 << CParser.T__39) | (1 << CParser.T__40) | (1 << CParser.T__41) | (1 << CParser.T__42) | (1 << CParser.T__43) | (1 << CParser.T__44) | (1 << CParser.T__45) | (1 << CParser.T__46) | (1 << CParser.T__47) | (1 << CParser.T__48) | (1 << CParser.T__49) | (1 << CParser.T__50) | (1 << CParser.T__51) | (1 << CParser.T__52) | (1 << CParser.T__53) | (1 << CParser.T__54) | (1 << CParser.T__55) | (1 << CParser.T__56) | (1 << CParser.T__57) | (1 << CParser.T__58) | (1 << CParser.T__59) | (1 << CParser.T__60) | (1 << CParser.T__61) | (1 << CParser.T__62))) != 0) or ((((_la - 64)) & ~0x3f) == 0 and ((1 << (_la - 64)) & ((1 << (CParser.T__63 - 64)) | (1 << (CParser.T__64 - 64)) | (1 << (CParser.T__65 - 64)) | (1 << (CParser.T__66 - 64)) | (1 << (CParser.T__67 - 64)) | (1 << (CParser.T__68 - 64)) | (1 << (CParser.T__69 - 64)) | (1 << (CParser.T__70 - 64)) | (1 << (CParser.T__71 - 64)) | (1 << (CParser.T__72 - 64)) | (1 << (CParser.T__73 - 64)) | (1 << (CParser.T__74 - 64)) | (1 << (CParser.T__75 - 64)) | (1 << (CParser.T__76 - 64)) | (1 << (CParser.T__77 - 64)) | (1 << (CParser.T__78 - 64)) | (1 << (CParser.T__79 - 64)) | (1 << (CParser.T__80 - 64)) | (1 << (CParser.T__81 - 64)) | (1 << (CParser.T__82 - 64)) | (1 << (CParser.T__83 - 64)) | (1 << (CParser.T__84 - 64)) | (1 << (CParser.T__85 - 64)) | (1 << (CParser.T__86 - 64)) | (1 << (CParser.T__87 - 64)) | (1 << (CParser.T__88 - 64)) | (1 << (CParser.T__89 - 64)) | (1 << (CParser.T__90 - 64)) | (1 << (CParser.T__91 - 64)) | (1 << (CParser.IDENTIFIER - 64)) | (1 << (CParser.CHARACTER_LITERAL - 64)) | (1 << (CParser.STRING_LITERAL - 64)) | (1 << (CParser.HEX_LITERAL - 64)) | (1 << (CParser.DECIMAL_LITERAL - 64)) | (1 << (CParser.OCTAL_LITERAL - 64)) | (1 << (CParser.FLOATING_POINT_LITERAL - 64)) | (1 << (CParser.WS - 64)) | (1 << (CParser.BS - 64)) | (1 << (CParser.UnicodeVocabulary - 64)) | (1 << (CParser.COMMENT - 64)) | (1 << (CParser.LINE_COMMENT - 64)) | (1 << (CParser.LINE_COMMAND - 64)))) != 0):
                 self.state = 782
                 _la = self._input.LA(1)
-                if _la <= 0 or _la==CParser.T__19:
+                if _la <= 0 or _la == CParser.T__19:
                     self._errHandler.recoverInline(self)
                 else:
                     self._errHandler.reportMatch(self)
@@ -5635,7 +5346,7 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
@@ -5643,42 +5354,36 @@ class CParser ( Parser ):
             return self.getToken(CParser.IDENTIFIER, 0)
 
         # @param  i=None Type: int
-        def declaration(self,i=None):
+        def declaration(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.DeclarationContext)
             else:
-                return self.getTypedRuleContext(CParser.DeclarationContext,i)
-
+                return self.getTypedRuleContext(CParser.DeclarationContext, i)
 
         def statement_list(self):
-            return self.getTypedRuleContext(CParser.Statement_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Statement_listContext, 0)
 
         def expression(self):
-            return self.getTypedRuleContext(CParser.ExpressionContext,0)
-
+            return self.getTypedRuleContext(CParser.ExpressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_macro_statement
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterMacro_statement" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterMacro_statement"):
                 listener.enterMacro_statement(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitMacro_statement" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitMacro_statement"):
                 listener.exitMacro_statement(self)
 
-
-
-
     def macro_statement(self):
 
         localctx = CParser.Macro_statementContext(self, self._ctx, self.state)
         self.enterRule(localctx, 126, self.RULE_macro_statement)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 790
@@ -5687,23 +5392,22 @@ class CParser ( Parser ):
             self.match(CParser.T__37)
             self.state = 795
             self._errHandler.sync(self)
-            _alt = self._interp.adaptivePredict(self._input,97,self._ctx)
-            while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
-                if _alt==1:
+            _alt = self._interp.adaptivePredict(self._input, 97, self._ctx)
+            while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
+                if _alt == 1:
                     self.state = 792
                     self.declaration()
                 self.state = 797
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,97,self._ctx)
+                _alt = self._interp.adaptivePredict(self._input, 97, self._ctx)
 
             self.state = 799
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,98,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 98, self._ctx)
             if la_ == 1:
                 self.state = 798
                 self.statement_list()
 
-
             self.state = 802
             self._errHandler.sync(self)
             _la = self._input.LA(1)
@@ -5711,7 +5415,6 @@ class CParser ( Parser ):
                 self.state = 801
                 self.expression()
 
-
             self.state = 804
             self.match(CParser.T__38)
         except RecognitionException as re:
@@ -5726,7 +5429,7 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
@@ -5734,32 +5437,28 @@ class CParser ( Parser ):
             return self.getToken(CParser.IDENTIFIER, 0)
 
         def statement(self):
-            return self.getTypedRuleContext(CParser.StatementContext,0)
-
+            return self.getTypedRuleContext(CParser.StatementContext, 0)
 
         def constant_expression(self):
-            return self.getTypedRuleContext(CParser.Constant_expressionContext,0)
-
+            return self.getTypedRuleContext(CParser.Constant_expressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_labeled_statement
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterLabeled_statement" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterLabeled_statement"):
                 listener.enterLabeled_statement(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitLabeled_statement" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitLabeled_statement"):
                 listener.exitLabeled_statement(self)
 
-
-
-
     def labeled_statement(self):
 
-        localctx = CParser.Labeled_statementContext(self, self._ctx, self.state)
+        localctx = CParser.Labeled_statementContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 128, self.RULE_labeled_statement)
         try:
             self.state = 817
@@ -5809,57 +5508,54 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def declaration(self,i=None):
+        def declaration(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.DeclarationContext)
             else:
-                return self.getTypedRuleContext(CParser.DeclarationContext,i)
-
+                return self.getTypedRuleContext(CParser.DeclarationContext, i)
 
         def statement_list(self):
-            return self.getTypedRuleContext(CParser.Statement_listContext,0)
-
+            return self.getTypedRuleContext(CParser.Statement_listContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_compound_statement
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterCompound_statement" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterCompound_statement"):
                 listener.enterCompound_statement(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitCompound_statement" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitCompound_statement"):
                 listener.exitCompound_statement(self)
 
-
-
-
     def compound_statement(self):
 
-        localctx = CParser.Compound_statementContext(self, self._ctx, self.state)
+        localctx = CParser.Compound_statementContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 130, self.RULE_compound_statement)
-        self._la = 0 # Token type
+        self._la = 0  # Token type
         try:
             self.enterOuterAlt(localctx, 1)
             self.state = 819
             self.match(CParser.T__0)
             self.state = 823
             self._errHandler.sync(self)
-            _alt = self._interp.adaptivePredict(self._input,101,self._ctx)
-            while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
-                if _alt==1:
+            _alt = self._interp.adaptivePredict(self._input, 101, self._ctx)
+            while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
+                if _alt == 1:
                     self.state = 820
                     self.declaration()
                 self.state = 825
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,101,self._ctx)
+                _alt = self._interp.adaptivePredict(
+                    self._input, 101, self._ctx)
 
             self.state = 827
             self._errHandler.sync(self)
@@ -5868,7 +5564,6 @@ class CParser ( Parser ):
                 self.state = 826
                 self.statement_list()
 
-
             self.state = 829
             self.match(CParser.T__19)
         except RecognitionException as re:
@@ -5883,34 +5578,30 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         # @param  i=None Type: int
-        def statement(self,i=None):
+        def statement(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.StatementContext)
             else:
-                return self.getTypedRuleContext(CParser.StatementContext,i)
-
+                return self.getTypedRuleContext(CParser.StatementContext, i)
 
         def getRuleIndex(self):
             return CParser.RULE_statement_list
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterStatement_list" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterStatement_list"):
                 listener.enterStatement_list(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitStatement_list" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitStatement_list"):
                 listener.exitStatement_list(self)
 
-
-
-
     def statement_list(self):
 
         localctx = CParser.Statement_listContext(self, self._ctx, self.state)
@@ -5920,7 +5611,7 @@ class CParser ( Parser ):
             self.state = 832
             self._errHandler.sync(self)
             _alt = 1
-            while _alt!=2 and _alt!=ATN.INVALID_ALT_NUMBER:
+            while _alt != 2 and _alt != ATN.INVALID_ALT_NUMBER:
                 if _alt == 1:
                     self.state = 831
                     self.statement()
@@ -5929,7 +5620,8 @@ class CParser ( Parser ):
                     raise NoViableAltException(self)
                 self.state = 834
                 self._errHandler.sync(self)
-                _alt = self._interp.adaptivePredict(self._input,103,self._ctx)
+                _alt = self._interp.adaptivePredict(
+                    self._input, 103, self._ctx)
 
         except RecognitionException as re:
             localctx.exception = re
@@ -5943,33 +5635,30 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
         def expression(self):
-            return self.getTypedRuleContext(CParser.ExpressionContext,0)
-
+            return self.getTypedRuleContext(CParser.ExpressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_expression_statement
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterExpression_statement" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterExpression_statement"):
                 listener.enterExpression_statement(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitExpression_statement" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitExpression_statement"):
                 listener.exitExpression_statement(self)
 
-
-
-
     def expression_statement(self):
 
-        localctx = CParser.Expression_statementContext(self, self._ctx, self.state)
+        localctx = CParser.Expression_statementContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 134, self.RULE_expression_statement)
         try:
             self.state = 840
@@ -6002,42 +5691,38 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
-            self.e = None # ExpressionContext
+            self.e = None  # ExpressionContext
 
         # @param  i=None Type: int
-        def statement(self,i=None):
+        def statement(self, i=None):
             if i is None:
                 return self.getTypedRuleContexts(CParser.StatementContext)
             else:
-                return self.getTypedRuleContext(CParser.StatementContext,i)
-
+                return self.getTypedRuleContext(CParser.StatementContext, i)
 
         def expression(self):
-            return self.getTypedRuleContext(CParser.ExpressionContext,0)
-
+            return self.getTypedRuleContext(CParser.ExpressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_selection_statement
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterSelection_statement" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterSelection_statement"):
                 listener.enterSelection_statement(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitSelection_statement" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitSelection_statement"):
                 listener.exitSelection_statement(self)
 
-
-
-
     def selection_statement(self):
 
-        localctx = CParser.Selection_statementContext(self, self._ctx, self.state)
+        localctx = CParser.Selection_statementContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 136, self.RULE_selection_statement)
         try:
             self.state = 858
@@ -6053,19 +5738,19 @@ class CParser ( Parser ):
                 localctx.e = self.expression()
                 self.state = 845
                 self.match(CParser.T__38)
-                self.StorePredicateExpression((None if localctx.e is None else localctx.e.start).line, (None if localctx.e is None else localctx.e.start).column, (None if localctx.e is None else localctx.e.stop).line, (None if localctx.e is None else localctx.e.stop).column, (None if localctx.e is None else self._input.getText((localctx.e.start,localctx.e.stop))))
+                self.StorePredicateExpression((None if localctx.e is None else localctx.e.start).line, (None if localctx.e is None else localctx.e.start).column, (None if localctx.e is None else localctx.e.stop).line, (
+                    None if localctx.e is None else localctx.e.stop).column, (None if localctx.e is None else self._input.getText((localctx.e.start, localctx.e.stop))))
                 self.state = 847
                 self.statement()
                 self.state = 850
                 self._errHandler.sync(self)
-                la_ = self._interp.adaptivePredict(self._input,105,self._ctx)
+                la_ = self._interp.adaptivePredict(self._input, 105, self._ctx)
                 if la_ == 1:
                     self.state = 848
                     self.match(CParser.T__84)
                     self.state = 849
                     self.statement()
 
-
                 pass
             elif token in [CParser.T__85]:
                 self.enterOuterAlt(localctx, 2)
@@ -6095,38 +5780,34 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
-            self.e = None # ExpressionContext
+            self.e = None  # ExpressionContext
 
         def statement(self):
-            return self.getTypedRuleContext(CParser.StatementContext,0)
-
+            return self.getTypedRuleContext(CParser.StatementContext, 0)
 
         def expression(self):
-            return self.getTypedRuleContext(CParser.ExpressionContext,0)
-
+            return self.getTypedRuleContext(CParser.ExpressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_iteration_statement
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterIteration_statement" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterIteration_statement"):
                 listener.enterIteration_statement(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitIteration_statement" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitIteration_statement"):
                 listener.exitIteration_statement(self)
 
-
-
-
     def iteration_statement(self):
 
-        localctx = CParser.Iteration_statementContext(self, self._ctx, self.state)
+        localctx = CParser.Iteration_statementContext(
+            self, self._ctx, self.state)
         self.enterRule(localctx, 138, self.RULE_iteration_statement)
         try:
             self.state = 876
@@ -6144,7 +5825,8 @@ class CParser ( Parser ):
                 self.match(CParser.T__38)
                 self.state = 864
                 self.statement()
-                self.StorePredicateExpression((None if localctx.e is None else localctx.e.start).line, (None if localctx.e is None else localctx.e.start).column, (None if localctx.e is None else localctx.e.stop).line, (None if localctx.e is None else localctx.e.stop).column, (None if localctx.e is None else self._input.getText((localctx.e.start,localctx.e.stop))))
+                self.StorePredicateExpression((None if localctx.e is None else localctx.e.start).line, (None if localctx.e is None else localctx.e.start).column, (None if localctx.e is None else localctx.e.stop).line, (
+                    None if localctx.e is None else localctx.e.stop).column, (None if localctx.e is None else self._input.getText((localctx.e.start, localctx.e.stop))))
                 pass
             elif token in [CParser.T__87]:
                 self.enterOuterAlt(localctx, 2)
@@ -6162,7 +5844,8 @@ class CParser ( Parser ):
                 self.match(CParser.T__38)
                 self.state = 873
                 self.match(CParser.T__1)
-                self.StorePredicateExpression((None if localctx.e is None else localctx.e.start).line, (None if localctx.e is None else localctx.e.start).column, (None if localctx.e is None else localctx.e.stop).line, (None if localctx.e is None else localctx.e.stop).column, (None if localctx.e is None else self._input.getText((localctx.e.start,localctx.e.stop))))
+                self.StorePredicateExpression((None if localctx.e is None else localctx.e.start).line, (None if localctx.e is None else localctx.e.start).column, (None if localctx.e is None else localctx.e.stop).line, (
+                    None if localctx.e is None else localctx.e.stop).column, (None if localctx.e is None else self._input.getText((localctx.e.start, localctx.e.stop))))
                 pass
             else:
                 raise NoViableAltException(self)
@@ -6179,7 +5862,7 @@ class CParser ( Parser ):
 
         # @param  parent=None Type: ParserRuleContext
         # @param  invokingState=-1 Type: int
-        def __init__(self,parser,parent=None,invokingState=-1):
+        def __init__(self, parser, parent=None, invokingState=-1):
             super().__init__(parent, invokingState)
             self.parser = parser
 
@@ -6187,25 +5870,21 @@ class CParser ( Parser ):
             return self.getToken(CParser.IDENTIFIER, 0)
 
         def expression(self):
-            return self.getTypedRuleContext(CParser.ExpressionContext,0)
-
+            return self.getTypedRuleContext(CParser.ExpressionContext, 0)
 
         def getRuleIndex(self):
             return CParser.RULE_jump_statement
 
         # @param  listener Type: ParseTreeListener
-        def enterRule(self,listener):
-            if hasattr( listener, "enterJump_statement" ):
+        def enterRule(self, listener):
+            if hasattr(listener, "enterJump_statement"):
                 listener.enterJump_statement(self)
 
         # @param  listener Type: ParseTreeListener
-        def exitRule(self,listener):
-            if hasattr( listener, "exitJump_statement" ):
+        def exitRule(self, listener):
+            if hasattr(listener, "exitJump_statement"):
                 listener.exitJump_statement(self)
 
-
-
-
     def jump_statement(self):
 
         localctx = CParser.Jump_statementContext(self, self._ctx, self.state)
@@ -6213,7 +5892,7 @@ class CParser ( Parser ):
         try:
             self.state = 891
             self._errHandler.sync(self)
-            la_ = self._interp.adaptivePredict(self._input,108,self._ctx)
+            la_ = self._interp.adaptivePredict(self._input, 108, self._ctx)
             if la_ == 1:
                 self.enterOuterAlt(localctx, 1)
                 self.state = 878
@@ -6258,7 +5937,6 @@ class CParser ( Parser ):
                 self.match(CParser.T__1)
                 pass
 
-
         except RecognitionException as re:
             localctx.exception = re
             self._errHandler.reportError(self, re)
@@ -6266,8 +5944,3 @@ class CParser ( Parser ):
         finally:
             self.exitRule()
         return localctx
-
-
-
-
-
diff --git a/BaseTools/Source/Python/Eot/CodeFragment.py b/BaseTools/Source/Python/Eot/CodeFragment.py
index 94c3f52d5c20..5ef7cb3557f4 100644
--- a/BaseTools/Source/Python/Eot/CodeFragment.py
+++ b/BaseTools/Source/Python/Eot/CodeFragment.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # fragments of source file
 #
 #  Copyright (c) 2007 - 2010, Intel Corporation. All rights reserved.<BR>
@@ -7,11 +7,11 @@
 #
 
 
-## The description of comment contents and start & end position
+# The description of comment contents and start & end position
 #
 #
-class Comment :
-    ## The constructor
+class Comment:
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Str         The message to record
@@ -25,11 +25,13 @@ class Comment :
         self.EndPos = End
         self.Type = CommentType
 
-## The description of preprocess directives and start & end position
+# The description of preprocess directives and start & end position
 #
 #
-class PP_Directive :
-    ## The constructor
+
+
+class PP_Directive:
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Str         The message to record
@@ -41,11 +43,13 @@ class PP_Directive :
         self.StartPos = Begin
         self.EndPos = End
 
-## The description of assignment expression and start & end position
+# The description of assignment expression and start & end position
 #
 #
-class AssignmentExpression :
-    ## The constructor
+
+
+class AssignmentExpression:
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Str         The message to record
@@ -59,11 +63,13 @@ class AssignmentExpression :
         self.StartPos = Begin
         self.EndPos = End
 
-## The description of predicate expression and start & end position
+# The description of predicate expression and start & end position
 #
 #
-class PredicateExpression :
-    ## The constructor
+
+
+class PredicateExpression:
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Str         The message to record
@@ -75,11 +81,13 @@ class PredicateExpression :
         self.StartPos = Begin
         self.EndPos = End
 
-## The description of function definition and start & end position
+# The description of function definition and start & end position
 #
 #
-class FunctionDefinition :
-    ## The constructor
+
+
+class FunctionDefinition:
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Str         The message to record
@@ -95,11 +103,13 @@ class FunctionDefinition :
         self.LeftBracePos = LBPos
         self.NamePos = NamePos
 
-## The description of variable declaration and start & end position
+# The description of variable declaration and start & end position
 #
 #
-class VariableDeclaration :
-    ## The constructor
+
+
+class VariableDeclaration:
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Str         The message to record
@@ -112,11 +122,13 @@ class VariableDeclaration :
         self.StartPos = Begin
         self.EndPos = End
 
-## The description of enum definition and start & end position
+# The description of enum definition and start & end position
 #
 #
-class EnumerationDefinition :
-    ## The constructor
+
+
+class EnumerationDefinition:
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Str         The message to record
@@ -128,11 +140,13 @@ class EnumerationDefinition :
         self.StartPos = Begin
         self.EndPos = End
 
-## The description of struct/union definition and start & end position
+# The description of struct/union definition and start & end position
 #
 #
-class StructUnionDefinition :
-    ## The constructor
+
+
+class StructUnionDefinition:
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Str         The message to record
@@ -144,11 +158,13 @@ class StructUnionDefinition :
         self.StartPos = Begin
         self.EndPos = End
 
-## The description of 'Typedef' definition and start & end position
+# The description of 'Typedef' definition and start & end position
 #
 #
-class TypedefDefinition :
-    ## The constructor
+
+
+class TypedefDefinition:
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Str         The message to record
@@ -161,11 +177,13 @@ class TypedefDefinition :
         self.StartPos = Begin
         self.EndPos = End
 
-## The description of function calling definition and start & end position
+# The description of function calling definition and start & end position
 #
 #
+
+
 class FunctionCalling:
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Str         The message to record
diff --git a/BaseTools/Source/Python/Eot/CodeFragmentCollector.py b/BaseTools/Source/Python/Eot/CodeFragmentCollector.py
index a5c1ceeaea32..a9e2d18567a9 100644
--- a/BaseTools/Source/Python/Eot/CodeFragmentCollector.py
+++ b/BaseTools/Source/Python/Eot/CodeFragmentCollector.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # preprocess source file
 #
 #  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -30,21 +30,21 @@ from Eot.CodeFragment import PP_Directive
 from Eot.ParserWarning import Warning
 
 
-##define T_CHAR_SPACE                ' '
-##define T_CHAR_NULL                 '\0'
-##define T_CHAR_CR                   '\r'
-##define T_CHAR_TAB                  '\t'
-##define T_CHAR_LF                   '\n'
-##define T_CHAR_SLASH                '/'
-##define T_CHAR_BACKSLASH            '\\'
-##define T_CHAR_DOUBLE_QUOTE         '\"'
-##define T_CHAR_SINGLE_QUOTE         '\''
-##define T_CHAR_STAR                 '*'
-##define T_CHAR_HASH                 '#'
+# define T_CHAR_SPACE                ' '
+# define T_CHAR_NULL                 '\0'
+# define T_CHAR_CR                   '\r'
+# define T_CHAR_TAB                  '\t'
+# define T_CHAR_LF                   '\n'
+# define T_CHAR_SLASH                '/'
+# define T_CHAR_BACKSLASH            '\\'
+# define T_CHAR_DOUBLE_QUOTE         '\"'
+# define T_CHAR_SINGLE_QUOTE         '\''
+# define T_CHAR_STAR                 '*'
+# define T_CHAR_HASH                 '#'
 
-(T_CHAR_SPACE, T_CHAR_NULL, T_CHAR_CR, T_CHAR_TAB, T_CHAR_LF, T_CHAR_SLASH, \
-T_CHAR_BACKSLASH, T_CHAR_DOUBLE_QUOTE, T_CHAR_SINGLE_QUOTE, T_CHAR_STAR, T_CHAR_HASH) = \
-(' ', '\0', '\r', '\t', '\n', '/', '\\', '\"', '\'', '*', '#')
+(T_CHAR_SPACE, T_CHAR_NULL, T_CHAR_CR, T_CHAR_TAB, T_CHAR_LF, T_CHAR_SLASH,
+ T_CHAR_BACKSLASH, T_CHAR_DOUBLE_QUOTE, T_CHAR_SINGLE_QUOTE, T_CHAR_STAR, T_CHAR_HASH) = \
+    (' ', '\0', '\r', '\t', '\n', '/', '\\', '\"', '\'', '*', '#')
 
 SEPERATOR_TUPLE = ('=', '|', ',', '{', '}')
 
@@ -52,15 +52,17 @@ SEPERATOR_TUPLE = ('=', '|', ',', '{', '}')
 
 (T_PP_INCLUDE, T_PP_DEFINE, T_PP_OTHERS) = (0, 1, 2)
 
-## The collector for source code fragments.
+# The collector for source code fragments.
 #
 # PreprocessFile method should be called prior to ParseFile
 #
 # GetNext*** procedures mean these procedures will get next token first, then make judgement.
 # Get*** procedures mean these procedures will make judgement on current token only.
 #
+
+
 class CodeFragmentCollector:
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  FileName    The file that to be parsed
@@ -75,7 +77,7 @@ class CodeFragmentCollector:
         self.__Token = ""
         self.__SkippedChars = ""
 
-    ## __EndOfFile() method
+    # __EndOfFile() method
     #
     #   Judge current buffer pos is at file end
     #
@@ -93,7 +95,7 @@ class CodeFragmentCollector:
         else:
             return False
 
-    ## __EndOfLine() method
+    # __EndOfLine() method
     #
     #   Judge current buffer pos is at line end
     #
@@ -102,13 +104,14 @@ class CodeFragmentCollector:
     #   @retval False       Current File buffer position is NOT at line end
     #
     def __EndOfLine(self):
-        SizeOfCurrentLine = len(self.Profile.FileLinesList[self.CurrentLineNumber - 1])
+        SizeOfCurrentLine = len(
+            self.Profile.FileLinesList[self.CurrentLineNumber - 1])
         if self.CurrentOffsetWithinLine >= SizeOfCurrentLine - 1:
             return True
         else:
             return False
 
-    ## Rewind() method
+    # Rewind() method
     #
     #   Reset file data buffer to the initial state
     #
@@ -118,7 +121,7 @@ class CodeFragmentCollector:
         self.CurrentLineNumber = 1
         self.CurrentOffsetWithinLine = 0
 
-    ## __UndoOneChar() method
+    # __UndoOneChar() method
     #
     #   Go back one char in the file buffer
     #
@@ -137,7 +140,7 @@ class CodeFragmentCollector:
             self.CurrentOffsetWithinLine -= 1
         return True
 
-    ## __GetOneChar() method
+    # __GetOneChar() method
     #
     #   Move forward one char in the file buffer
     #
@@ -145,12 +148,12 @@ class CodeFragmentCollector:
     #
     def __GetOneChar(self):
         if self.CurrentOffsetWithinLine == len(self.Profile.FileLinesList[self.CurrentLineNumber - 1]) - 1:
-                self.CurrentLineNumber += 1
-                self.CurrentOffsetWithinLine = 0
+            self.CurrentLineNumber += 1
+            self.CurrentOffsetWithinLine = 0
         else:
-                self.CurrentOffsetWithinLine += 1
+            self.CurrentOffsetWithinLine += 1
 
-    ## __CurrentChar() method
+    # __CurrentChar() method
     #
     #   Get the char pointed to by the file buffer pointer
     #
@@ -158,11 +161,12 @@ class CodeFragmentCollector:
     #   @retval Char        Current char
     #
     def __CurrentChar(self):
-        CurrentChar = self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine]
+        CurrentChar = self.Profile.FileLinesList[self.CurrentLineNumber -
+                                                 1][self.CurrentOffsetWithinLine]
 
         return CurrentChar
 
-    ## __NextChar() method
+    # __NextChar() method
     #
     #   Get the one char pass the char pointed to by the file buffer pointer
     #
@@ -175,7 +179,7 @@ class CodeFragmentCollector:
         else:
             return self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine + 1]
 
-    ## __SetCurrentCharValue() method
+    # __SetCurrentCharValue() method
     #
     #   Modify the value of current char
     #
@@ -183,9 +187,10 @@ class CodeFragmentCollector:
     #   @param  Value       The new value of current char
     #
     def __SetCurrentCharValue(self, Value):
-        self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine] = Value
+        self.Profile.FileLinesList[self.CurrentLineNumber -
+                                   1][self.CurrentOffsetWithinLine] = Value
 
-    ## __SetCharValue() method
+    # __SetCharValue() method
     #
     #   Modify the value of current char
     #
@@ -195,7 +200,7 @@ class CodeFragmentCollector:
     def __SetCharValue(self, Line, Offset, Value):
         self.Profile.FileLinesList[Line - 1][Offset] = Value
 
-    ## __CurrentLine() method
+    # __CurrentLine() method
     #
     #   Get the list that contains current line contents
     #
@@ -205,7 +210,7 @@ class CodeFragmentCollector:
     def __CurrentLine(self):
         return self.Profile.FileLinesList[self.CurrentLineNumber - 1]
 
-    ## __InsertComma() method
+    # __InsertComma() method
     #
     #   Insert ',' to replace PP
     #
@@ -214,9 +219,9 @@ class CodeFragmentCollector:
     #
     def __InsertComma(self, Line):
 
-
         if self.Profile.FileLinesList[Line - 1][0] != T_CHAR_HASH:
-            BeforeHashPart = str(self.Profile.FileLinesList[Line - 1]).split(T_CHAR_HASH)[0]
+            BeforeHashPart = str(
+                self.Profile.FileLinesList[Line - 1]).split(T_CHAR_HASH)[0]
             if BeforeHashPart.rstrip().endswith(T_CHAR_COMMA) or BeforeHashPart.rstrip().endswith(';'):
                 return
 
@@ -229,9 +234,10 @@ class CodeFragmentCollector:
         if str(self.Profile.FileLinesList[Line]).lstrip().startswith(',') or str(self.Profile.FileLinesList[Line]).lstrip().startswith(';'):
             return
 
-        self.Profile.FileLinesList[Line - 1].insert(self.CurrentOffsetWithinLine, ',')
+        self.Profile.FileLinesList[Line -
+                                   1].insert(self.CurrentOffsetWithinLine, ',')
 
-    ## PreprocessFileWithClear() method
+    # PreprocessFileWithClear() method
     #
     # Run a preprocess for the file to clean all comments
     #
@@ -249,7 +255,8 @@ class CodeFragmentCollector:
         InString = False
         InCharLiteral = False
 
-        self.Profile.FileLinesList = [list(s) for s in self.Profile.FileLinesListFromFile]
+        self.Profile.FileLinesList = [
+            list(s) for s in self.Profile.FileLinesListFromFile]
         while not self.__EndOfFile():
 
             if not InComment and self.__CurrentChar() == T_CHAR_DOUBLE_QUOTE:
@@ -266,7 +273,8 @@ class CodeFragmentCollector:
                     else:
                         PPExtend = False
 
-                EndLinePos = (self.CurrentLineNumber, self.CurrentOffsetWithinLine)
+                EndLinePos = (self.CurrentLineNumber,
+                              self.CurrentOffsetWithinLine)
 
                 if InComment and DoubleSlashComment:
                     InComment = False
@@ -284,7 +292,8 @@ class CodeFragmentCollector:
                     CurrentLine = "".join(self.__CurrentLine())
                     if CurrentLine.rstrip(T_CHAR_LF).rstrip(T_CHAR_CR).endswith(T_CHAR_BACKSLASH):
                         SlashIndex = CurrentLine.rindex(T_CHAR_BACKSLASH)
-                        self.__SetCharValue(self.CurrentLineNumber, SlashIndex, T_CHAR_SPACE)
+                        self.__SetCharValue(
+                            self.CurrentLineNumber, SlashIndex, T_CHAR_SPACE)
 
                 self.CurrentLineNumber += 1
                 self.CurrentOffsetWithinLine = 0
@@ -303,7 +312,8 @@ class CodeFragmentCollector:
                     if self.__CurrentChar() == T_CHAR_SLASH and self.__NextChar() == T_CHAR_SLASH:
                         InComment = False
                         HashComment = False
-                        PPDirectiveObj.EndPos = (self.CurrentLineNumber, self.CurrentOffsetWithinLine - 1)
+                        PPDirectiveObj.EndPos = (
+                            self.CurrentLineNumber, self.CurrentOffsetWithinLine - 1)
                         FileProfile.PPDirectiveList.append(PPDirectiveObj)
                         PPDirectiveObj = None
                         continue
@@ -321,13 +331,14 @@ class CodeFragmentCollector:
             elif self.__CurrentChar() == T_CHAR_HASH and not InString and not InCharLiteral:
                 InComment = True
                 HashComment = True
-                PPDirectiveObj = PP_Directive('', (self.CurrentLineNumber, self.CurrentOffsetWithinLine), None)
+                PPDirectiveObj = PP_Directive(
+                    '', (self.CurrentLineNumber, self.CurrentOffsetWithinLine), None)
             # check for /* comment start
             elif self.__CurrentChar() == T_CHAR_SLASH and self.__NextChar() == T_CHAR_STAR:
 
-                self.__SetCurrentCharValue( T_CHAR_SPACE)
+                self.__SetCurrentCharValue(T_CHAR_SPACE)
                 self.__GetOneChar()
-                self.__SetCurrentCharValue( T_CHAR_SPACE)
+                self.__SetCurrentCharValue(T_CHAR_SPACE)
                 self.__GetOneChar()
                 InComment = True
             else:
@@ -340,7 +351,7 @@ class CodeFragmentCollector:
             FileProfile.PPDirectiveList.append(PPDirectiveObj)
         self.Rewind()
 
-    ## ParseFile() method
+    # ParseFile() method
     #
     #   Parse the file profile buffer to extract fd, fv ... information
     #   Exception will be raised if syntax error found
@@ -350,7 +361,8 @@ class CodeFragmentCollector:
     def ParseFile(self):
         self.PreprocessFileWithClear()
         # restore from ListOfList to ListOfString
-        self.Profile.FileLinesList = ["".join(list) for list in self.Profile.FileLinesList]
+        self.Profile.FileLinesList = [
+            "".join(list) for list in self.Profile.FileLinesList]
         FileStringContents = ''
         for fileLine in self.Profile.FileLinesList:
             FileStringContents += fileLine
@@ -360,7 +372,7 @@ class CodeFragmentCollector:
         parser = CParser(tStream)
         parser.translation_unit()
 
-    ## CleanFileProfileBuffer() method
+    # CleanFileProfileBuffer() method
     #
     #   Reset all contents of the profile of a file
     #
@@ -375,7 +387,7 @@ class CodeFragmentCollector:
         FileProfile.TypedefDefinitionList = []
         FileProfile.FunctionCallingList = []
 
-    ## PrintFragments() method
+    # PrintFragments() method
     #
     #   Print the contents of the profile of a file
     #
@@ -387,7 +399,8 @@ class CodeFragmentCollector:
         print('/************** ASSIGNMENTS *************/')
         print('/****************************************/')
         for assign in FileProfile.AssignmentExpressionList:
-            print(str(assign.StartPos) + assign.Name + assign.Operator + assign.Value)
+            print(str(assign.StartPos) + assign.Name +
+                  assign.Operator + assign.Value)
 
         print('/****************************************/')
         print('/********* PREPROCESS DIRECTIVES ********/')
@@ -399,13 +412,14 @@ class CodeFragmentCollector:
         print('/********* VARIABLE DECLARATIONS ********/')
         print('/****************************************/')
         for var in FileProfile.VariableDeclarationList:
-            print(str(var.StartPos) + var.Modifier + ' '+ var.Declarator)
+            print(str(var.StartPos) + var.Modifier + ' ' + var.Declarator)
 
         print('/****************************************/')
         print('/********* FUNCTION DEFINITIONS *********/')
         print('/****************************************/')
         for func in FileProfile.FunctionDefinitionList:
-            print(str(func.StartPos) + func.Modifier + ' '+ func.Declarator + ' ' + str(func.NamePos))
+            print(str(func.StartPos) + func.Modifier + ' ' +
+                  func.Declarator + ' ' + str(func.NamePos))
 
         print('/****************************************/')
         print('/************ ENUMERATIONS **************/')
@@ -425,6 +439,7 @@ class CodeFragmentCollector:
         for typedef in FileProfile.TypedefDefinitionList:
             print(str(typedef.StartPos) + typedef.ToType)
 
+
 ##
 #
 # This acts like the main() function for the script, unless it is 'import'ed into another
diff --git a/BaseTools/Source/Python/Eot/Database.py b/BaseTools/Source/Python/Eot/Database.py
index fca08b96bbdf..46d2773272c9 100644
--- a/BaseTools/Source/Python/Eot/Database.py
+++ b/BaseTools/Source/Python/Eot/Database.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create a database used by EOT tool
 #
 # Copyright (c) 2007 - 2014, Intel Corporation. All rights reserved.<BR>
@@ -9,7 +9,8 @@
 # Import Modules
 #
 import sqlite3
-import Common.LongFilePathOs as os, time
+import Common.LongFilePathOs as os
+import time
 
 import Common.EdkLogger as EdkLogger
 import CommonDataClass.DataClass as DataClass
@@ -30,14 +31,16 @@ from Table.TableQuery import TableQuery
 #
 DATABASE_PATH = "Eot.db"
 
-## Database class
+# Database class
 #
 # This class defined the EOT database
 # During the phase of initialization, the database will create all tables and
 # insert all records of table DataModel
 #
+
+
 class Database(object):
-    ## The constructor
+    # The constructor
     #
     #   @param  self:      The object pointer
     #   @param  DbPath:    The file path of the database
@@ -58,7 +61,7 @@ class Database(object):
         self.TblQuery = None
         self.TblQuery2 = None
 
-    ## InitDatabase() method
+    # InitDatabase() method
     #  1. Delete all old existing tables
     #  2. Create new tables
     #  3. Initialize table DataModel
@@ -66,7 +69,7 @@ class Database(object):
     #  @param self: The object pointer
     #  @param NewDatabase: Check if it needs to create a new database
     #
-    def InitDatabase(self, NewDatabase = True):
+    def InitDatabase(self, NewDatabase=True):
         EdkLogger.verbose("\nInitialize EOT database started ...")
         #
         # Drop all old existing tables
@@ -74,7 +77,7 @@ class Database(object):
         if NewDatabase:
             if os.path.exists(self.DbPath):
                 os.remove(self.DbPath)
-        self.Conn = sqlite3.connect(self.DbPath, isolation_level = 'DEFERRED')
+        self.Conn = sqlite3.connect(self.DbPath, isolation_level='DEFERRED')
         self.Conn.execute("PRAGMA page_size=8192")
         self.Conn.execute("PRAGMA synchronous=OFF")
         # to avoid non-ascii character conversion error
@@ -129,7 +132,7 @@ class Database(object):
 
         EdkLogger.verbose("Initialize EOT database ... DONE!")
 
-    ## QueryTable() method
+    # QueryTable() method
     #
     #  Query a table
     #
@@ -139,7 +142,7 @@ class Database(object):
     def QueryTable(self, Table):
         Table.Query()
 
-    ## Close() method
+    # Close() method
     #
     # Commit all first
     # Close the connection and cursor
@@ -152,7 +155,7 @@ class Database(object):
         self.Cur.close()
         self.Conn.close()
 
-    ## InsertOneFile() method
+    # InsertOneFile() method
     #
     # Insert one file's information to the database
     # 1. Create a record in TableFile
@@ -167,37 +170,40 @@ class Database(object):
     #
     def InsertOneFile(self, File):
         # Insert a record for file
-        FileID = self.TblFile.Insert(File.Name, File.ExtName, File.Path, File.FullPath, Model = File.Model, TimeStamp = File.TimeStamp)
+        FileID = self.TblFile.Insert(
+            File.Name, File.ExtName, File.Path, File.FullPath, Model=File.Model, TimeStamp=File.TimeStamp)
         IdTable = TableIdentifier(self.Cur)
         IdTable.Table = "Identifier%s" % FileID
         IdTable.Create()
 
         # Insert function of file
         for Function in File.FunctionList:
-            FunctionID = self.TblFunction.Insert(Function.Header, Function.Modifier, Function.Name, Function.ReturnStatement, \
-                                    Function.StartLine, Function.StartColumn, Function.EndLine, Function.EndColumn, \
-                                    Function.BodyStartLine, Function.BodyStartColumn, FileID, \
-                                    Function.FunNameStartLine, Function.FunNameStartColumn)
+            FunctionID = self.TblFunction.Insert(Function.Header, Function.Modifier, Function.Name, Function.ReturnStatement,
+                                                 Function.StartLine, Function.StartColumn, Function.EndLine, Function.EndColumn,
+                                                 Function.BodyStartLine, Function.BodyStartColumn, FileID,
+                                                 Function.FunNameStartLine, Function.FunNameStartColumn)
 
             # Insert Identifier of function
             for Identifier in Function.IdentifierList:
-                IdentifierID = IdTable.Insert(Identifier.Modifier, Identifier.Type, Identifier.Name, Identifier.Value, Identifier.Model, \
-                                        FileID, FunctionID, Identifier.StartLine, Identifier.StartColumn, Identifier.EndLine, Identifier.EndColumn)
+                IdentifierID = IdTable.Insert(Identifier.Modifier, Identifier.Type, Identifier.Name, Identifier.Value, Identifier.Model,
+                                              FileID, FunctionID, Identifier.StartLine, Identifier.StartColumn, Identifier.EndLine, Identifier.EndColumn)
         # Insert Identifier of file
         for Identifier in File.IdentifierList:
-            IdentifierID = IdTable.Insert(Identifier.Modifier, Identifier.Type, Identifier.Name, Identifier.Value, Identifier.Model, \
-                                    FileID, -1, Identifier.StartLine, Identifier.StartColumn, Identifier.EndLine, Identifier.EndColumn)
+            IdentifierID = IdTable.Insert(Identifier.Modifier, Identifier.Type, Identifier.Name, Identifier.Value, Identifier.Model,
+                                          FileID, -1, Identifier.StartLine, Identifier.StartColumn, Identifier.EndLine, Identifier.EndColumn)
 
-        EdkLogger.verbose("Insert information from file %s ... DONE!" % File.FullPath)
+        EdkLogger.verbose(
+            "Insert information from file %s ... DONE!" % File.FullPath)
 
-    ## UpdateIdentifierBelongsToFunction() method
+    # UpdateIdentifierBelongsToFunction() method
     #
     #  Update the field "BelongsToFunction" for each Identifier
     #
     #  @param self: The object pointer
     #
     def UpdateIdentifierBelongsToFunction(self):
-        EdkLogger.verbose("Update 'BelongsToFunction' for Identifiers started ...")
+        EdkLogger.verbose(
+            "Update 'BelongsToFunction' for Identifiers started ...")
 
         SqlCommand = """select ID, BelongsToFile, StartLine, EndLine from Function"""
         Records = self.TblFunction.Exec(SqlCommand)
@@ -210,11 +216,12 @@ class Database(object):
             EndLine = Record[3]
 
             SqlCommand = """Update Identifier%s set BelongsToFunction = %s where BelongsToFile = %s and StartLine > %s and EndLine < %s""" % \
-                        (BelongsToFile, FunctionID, BelongsToFile, StartLine, EndLine)
+                (BelongsToFile, FunctionID, BelongsToFile, StartLine, EndLine)
             self.TblIdentifier.Exec(SqlCommand)
 
             SqlCommand = """Update Identifier%s set BelongsToFunction = %s, Model = %s where BelongsToFile = %s and Model = %s and EndLine = %s""" % \
-                         (BelongsToFile, FunctionID, DataClass.MODEL_IDENTIFIER_FUNCTION_HEADER, BelongsToFile, DataClass.MODEL_IDENTIFIER_COMMENT, StartLine - 1)
+                         (BelongsToFile, FunctionID, DataClass.MODEL_IDENTIFIER_FUNCTION_HEADER,
+                          BelongsToFile, DataClass.MODEL_IDENTIFIER_COMMENT, StartLine - 1)
             self.TblIdentifier.Exec(SqlCommand)
 
 
@@ -226,18 +233,25 @@ class Database(object):
 if __name__ == '__main__':
     EdkLogger.Initialize()
     EdkLogger.SetLevel(EdkLogger.DEBUG_0)
-    EdkLogger.verbose("Start at " + time.strftime('%H:%M:%S', time.localtime()))
+    EdkLogger.verbose(
+        "Start at " + time.strftime('%H:%M:%S', time.localtime()))
 
     Db = Database(DATABASE_PATH)
     Db.InitDatabase()
     Db.QueryTable(Db.TblDataModel)
 
-    identifier1 = DataClass.IdentifierClass(-1, '', '', "i''1", 'aaa', DataClass.MODEL_IDENTIFIER_COMMENT, 1, -1, 32,  43,  54,  43)
-    identifier2 = DataClass.IdentifierClass(-1, '', '', 'i1', 'aaa', DataClass.MODEL_IDENTIFIER_COMMENT, 1, -1, 15,  43,  20,  43)
-    identifier3 = DataClass.IdentifierClass(-1, '', '', 'i1', 'aaa', DataClass.MODEL_IDENTIFIER_COMMENT, 1, -1, 55,  43,  58,  43)
-    identifier4 = DataClass.IdentifierClass(-1, '', '', "i1'", 'aaa', DataClass.MODEL_IDENTIFIER_COMMENT, 1, -1, 77,  43,  88,  43)
-    fun1 = DataClass.FunctionClass(-1, '', '', 'fun1', '', 21, 2, 60,  45, 1, 23, 0, [], [])
-    file = DataClass.FileClass(-1, 'F1', 'c', 'C:\\', 'C:\\F1.exe', DataClass.MODEL_FILE_C, '2007-12-28', [fun1], [identifier1, identifier2, identifier3, identifier4], [])
+    identifier1 = DataClass.IdentifierClass(-1, '', '', "i''1", 'aaa',
+                                            DataClass.MODEL_IDENTIFIER_COMMENT, 1, -1, 32,  43,  54,  43)
+    identifier2 = DataClass.IdentifierClass(-1, '', '', 'i1', 'aaa',
+                                            DataClass.MODEL_IDENTIFIER_COMMENT, 1, -1, 15,  43,  20,  43)
+    identifier3 = DataClass.IdentifierClass(-1, '', '', 'i1', 'aaa',
+                                            DataClass.MODEL_IDENTIFIER_COMMENT, 1, -1, 55,  43,  58,  43)
+    identifier4 = DataClass.IdentifierClass(-1, '', '', "i1'", 'aaa',
+                                            DataClass.MODEL_IDENTIFIER_COMMENT, 1, -1, 77,  43,  88,  43)
+    fun1 = DataClass.FunctionClass(-1, '', '',
+                                   'fun1', '', 21, 2, 60,  45, 1, 23, 0, [], [])
+    file = DataClass.FileClass(-1, 'F1', 'c', 'C:\\', 'C:\\F1.exe', DataClass.MODEL_FILE_C,
+                               '2007-12-28', [fun1], [identifier1, identifier2, identifier3, identifier4], [])
     Db.InsertOneFile(file)
 
     Db.QueryTable(Db.TblFile)
@@ -246,4 +260,3 @@ if __name__ == '__main__':
 
     Db.Close()
     EdkLogger.verbose("End at " + time.strftime('%H:%M:%S', time.localtime()))
-
diff --git a/BaseTools/Source/Python/Eot/EotGlobalData.py b/BaseTools/Source/Python/Eot/EotGlobalData.py
index 3218f86f441c..be9783707d61 100644
--- a/BaseTools/Source/Python/Eot/EotGlobalData.py
+++ b/BaseTools/Source/Python/Eot/EotGlobalData.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to save global datas
 #
 # Copyright (c) 2008 - 2014, Intel Corporation. All rights reserved.<BR>
@@ -44,7 +44,8 @@ gOP_UN_DISPATCHED = open(gUN_DISPATCHED_LOG, 'w+')
 
 # Log file for unmatched variables in function calling
 gUN_MATCHED_IN_LIBRARY_CALLING_LOG = 'Log_UnMatchedInLibraryCalling.log'
-gOP_UN_MATCHED_IN_LIBRARY_CALLING = open(gUN_MATCHED_IN_LIBRARY_CALLING_LOG, 'w+')
+gOP_UN_MATCHED_IN_LIBRARY_CALLING = open(
+    gUN_MATCHED_IN_LIBRARY_CALLING_LOG, 'w+')
 
 # Log file for order of dispatched PEIM/DRIVER
 gDISPATCH_ORDER_LOG = 'Log_DispatchOrder.log'
diff --git a/BaseTools/Source/Python/Eot/EotMain.py b/BaseTools/Source/Python/Eot/EotMain.py
index 791fcdfeaed8..f9989bc3de54 100644
--- a/BaseTools/Source/Python/Eot/EotMain.py
+++ b/BaseTools/Source/Python/Eot/EotMain.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to be the main entrance of EOT tool
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -9,7 +9,9 @@
 # Import Modules
 #
 from __future__ import absolute_import
-import Common.LongFilePathOs as os, time, glob
+import Common.LongFilePathOs as os
+import time
+import glob
 import Common.EdkLogger as EdkLogger
 import Eot.EotGlobalData as EotGlobalData
 from optparse import OptionParser
@@ -36,6 +38,7 @@ from GenFds.AprioriSection import DXE_APRIORI_GUID, PEI_APRIORI_GUID
 gGuidStringFormat = "%08X-%04X-%04X-%02X%02X-%02X%02X%02X%02X%02X%02X"
 gIndention = -4
 
+
 class Image(array):
     _HEADER_ = struct.Struct("")
     _HEADER_SIZE_ = _HEADER_.size
@@ -52,7 +55,7 @@ class Image(array):
         self._LEN_ = None
         self._OFF_ = None
 
-        self._SubImages = sdict() # {offset: Image()}
+        self._SubImages = sdict()  # {offset: Image()}
 
         array.__init__(self)
 
@@ -66,7 +69,7 @@ class Image(array):
         return Len
 
     def _Unpack(self):
-        self.extend(self._BUF_[self._OFF_ : self._OFF_ + self._LEN_])
+        self.extend(self._BUF_[self._OFF_: self._OFF_ + self._LEN_])
         return len(self)
 
     def _Pack(self, PadByte=0xFF):
@@ -106,18 +109,20 @@ class Image(array):
 
     Data = property(_GetData, _SetData)
 
-## CompressedImage() class
+# CompressedImage() class
 #
 #  A class for Compressed Image
 #
+
+
 class CompressedImage(Image):
     # UncompressedLength = 4-byte
     # CompressionType = 1-byte
     _HEADER_ = struct.Struct("1I 1B")
     _HEADER_SIZE_ = _HEADER_.size
 
-    _ORIG_SIZE_     = struct.Struct("1I")
-    _CMPRS_TYPE_    = struct.Struct("4x 1B")
+    _ORIG_SIZE_ = struct.Struct("1I")
+    _CMPRS_TYPE_ = struct.Struct("4x 1B")
 
     def __init__(self, CompressedData=None, CompressionType=None, UncompressedLength=None):
         Image.__init__(self)
@@ -130,7 +135,8 @@ class CompressedImage(Image):
 
     def __str__(self):
         global gIndention
-        S = "algorithm=%s uncompressed=%x" % (self.CompressionType, self.UncompressedLength)
+        S = "algorithm=%s uncompressed=%x" % (
+            self.CompressionType, self.UncompressedLength)
         for Sec in self.Sections:
             S += '\n' + str(Sec)
 
@@ -175,10 +181,12 @@ class CompressedImage(Image):
     CompressionType = property(_GetCompressionType, _SetCompressionType)
     Sections = property(_GetSections)
 
-## Ui() class
+# Ui() class
 #
 #  A class for Ui
 #
+
+
 class Ui(Image):
     _HEADER_ = struct.Struct("")
     _HEADER_SIZE_ = 0
@@ -192,7 +200,7 @@ class Ui(Image):
     def _Unpack(self):
         # keep header in this Image object
         self.empty()
-        self.extend(self._BUF_[self._OFF_ : self._OFF_ + self._LEN_])
+        self.extend(self._BUF_[self._OFF_: self._OFF_ + self._LEN_])
         return len(self)
 
     def _GetUiString(self):
@@ -200,42 +208,44 @@ class Ui(Image):
 
     String = property(_GetUiString)
 
-## Depex() class
+# Depex() class
 #
 #  A class for Depex
 #
+
+
 class Depex(Image):
     _HEADER_ = struct.Struct("")
     _HEADER_SIZE_ = 0
 
-    _GUID_          = struct.Struct("1I2H8B")
-    _OPCODE_        = struct.Struct("1B")
+    _GUID_ = struct.Struct("1I2H8B")
+    _OPCODE_ = struct.Struct("1B")
 
     _OPCODE_STRING_ = {
-        0x00    :   "BEFORE",
-        0x01    :   "AFTER",
-        0x02    :   "PUSH",
-        0x03    :   "AND",
-        0x04    :   "OR",
-        0x05    :   "NOT",
-        0x06    :   "TRUE",
-        0x07    :   "FALSE",
-        0x08    :   "END",
-        0x09    :   "SOR"
+        0x00:   "BEFORE",
+        0x01:   "AFTER",
+        0x02:   "PUSH",
+        0x03:   "AND",
+        0x04:   "OR",
+        0x05:   "NOT",
+        0x06:   "TRUE",
+        0x07:   "FALSE",
+        0x08:   "END",
+        0x09:   "SOR"
     }
 
     _NEXT_ = {
-        -1      :   _OPCODE_,   # first one in depex must be an opcdoe
-        0x00    :   _GUID_,     #"BEFORE",
-        0x01    :   _GUID_,     #"AFTER",
-        0x02    :   _GUID_,     #"PUSH",
-        0x03    :   _OPCODE_,   #"AND",
-        0x04    :   _OPCODE_,   #"OR",
-        0x05    :   _OPCODE_,   #"NOT",
-        0x06    :   _OPCODE_,   #"TRUE",
-        0x07    :   _OPCODE_,   #"FALSE",
-        0x08    :   None,       #"END",
-        0x09    :   _OPCODE_,   #"SOR"
+        -1:   _OPCODE_,   # first one in depex must be an opcode
+        0x00:   _GUID_,  # "BEFORE",
+        0x01:   _GUID_,  # "AFTER",
+        0x02:   _GUID_,  # "PUSH",
+        0x03:   _OPCODE_,  # "AND",
+        0x04:   _OPCODE_,  # "OR",
+        0x05:   _OPCODE_,  # "NOT",
+        0x06:   _OPCODE_,  # "TRUE",
+        0x07:   _OPCODE_,  # "FALSE",
+        0x08:   None,  # "END",
+        0x09:   _OPCODE_,  # "SOR"
     }
 
     def __init__(self):
@@ -260,7 +270,7 @@ class Depex(Image):
     def _Unpack(self):
         # keep header in this Image object
         self.empty()
-        self.extend(self._BUF_[self._OFF_ : self._OFF_ + self._LEN_])
+        self.extend(self._BUF_[self._OFF_: self._OFF_ + self._LEN_])
         return len(self)
 
     def _GetExpression(self):
@@ -289,6 +299,8 @@ class Depex(Image):
 #
 #  A class for Firmware Volume
 #
+
+
 class FirmwareVolume(Image):
     # Read FvLength, Attributes, HeaderLength, Checksum
     _HEADER_ = struct.Struct("16x 1I2H8B 1Q 4x 1I 1H 1H")
@@ -351,11 +363,15 @@ class FirmwareVolume(Image):
                     DepexList.append(Guid)
                 continue
             elif Item == 0x03 or Item == 0x04:
-                DepexStack.append(eval(str(DepexStack.pop()) + ' ' + Depex._OPCODE_STRING_[Item].lower() + ' ' + str(DepexStack.pop())))
-                DepexList.append(str(DepexList.pop()) + ' ' + Depex._OPCODE_STRING_[Item].upper() + ' ' + str(DepexList.pop()))
+                DepexStack.append(eval(str(DepexStack.pop(
+                )) + ' ' + Depex._OPCODE_STRING_[Item].lower() + ' ' + str(DepexStack.pop())))
+                DepexList.append(str(DepexList.pop(
+                )) + ' ' + Depex._OPCODE_STRING_[Item].upper() + ' ' + str(DepexList.pop()))
             elif Item == 0x05:
-                DepexStack.append(eval(Depex._OPCODE_STRING_[Item].lower() + ' ' + str(DepexStack.pop())))
-                DepexList.append(Depex._OPCODE_STRING_[Item].lower() + ' ' + str(DepexList.pop()))
+                DepexStack.append(
+                    eval(Depex._OPCODE_STRING_[Item].lower() + ' ' + str(DepexStack.pop())))
+                DepexList.append(Depex._OPCODE_STRING_[
+                                 Item].lower() + ' ' + str(DepexList.pop()))
             elif Item == 0x06:
                 DepexStack.append(True)
                 DepexList.append('TRUE')
@@ -405,12 +421,14 @@ class FirmwareVolume(Image):
 
         # Parse SEC_CORE first
         if FfsSecCoreGuid is not None:
-            self.OrderedFfsDict[FfsSecCoreGuid] = self.UnDispatchedFfsDict.pop(FfsSecCoreGuid)
+            self.OrderedFfsDict[FfsSecCoreGuid] = self.UnDispatchedFfsDict.pop(
+                FfsSecCoreGuid)
             self.LoadPpi(Db, FfsSecCoreGuid)
 
         # Parse PEI first
         if FfsPeiCoreGuid is not None:
-            self.OrderedFfsDict[FfsPeiCoreGuid] = self.UnDispatchedFfsDict.pop(FfsPeiCoreGuid)
+            self.OrderedFfsDict[FfsPeiCoreGuid] = self.UnDispatchedFfsDict.pop(
+                FfsPeiCoreGuid)
             self.LoadPpi(Db, FfsPeiCoreGuid)
             if FfsPeiPrioriGuid is not None:
                 # Load PEIM described in priori file
@@ -421,18 +439,21 @@ class FirmwareVolume(Image):
                         GuidStruct = struct.Struct('1I2H8B')
                         Start = 4
                         while len(Section) > Start:
-                            Guid = GuidStruct.unpack_from(Section[Start : Start + 16])
+                            Guid = GuidStruct.unpack_from(
+                                Section[Start: Start + 16])
                             GuidString = gGuidStringFormat % Guid
                             Start = Start + 16
                             if GuidString in self.UnDispatchedFfsDict:
-                                self.OrderedFfsDict[GuidString] = self.UnDispatchedFfsDict.pop(GuidString)
+                                self.OrderedFfsDict[GuidString] = self.UnDispatchedFfsDict.pop(
+                                    GuidString)
                                 self.LoadPpi(Db, GuidString)
 
         self.DisPatchPei(Db)
 
         # Parse DXE then
         if FfsDxeCoreGuid is not None:
-            self.OrderedFfsDict[FfsDxeCoreGuid] = self.UnDispatchedFfsDict.pop(FfsDxeCoreGuid)
+            self.OrderedFfsDict[FfsDxeCoreGuid] = self.UnDispatchedFfsDict.pop(
+                FfsDxeCoreGuid)
             self.LoadProtocol(Db, FfsDxeCoreGuid)
             if FfsDxePrioriGuid is not None:
                 # Load PEIM described in priori file
@@ -443,11 +464,13 @@ class FirmwareVolume(Image):
                         GuidStruct = struct.Struct('1I2H8B')
                         Start = 4
                         while len(Section) > Start:
-                            Guid = GuidStruct.unpack_from(Section[Start : Start + 16])
+                            Guid = GuidStruct.unpack_from(
+                                Section[Start: Start + 16])
                             GuidString = gGuidStringFormat % Guid
                             Start = Start + 16
                             if GuidString in self.UnDispatchedFfsDict:
-                                self.OrderedFfsDict[GuidString] = self.UnDispatchedFfsDict.pop(GuidString)
+                                self.OrderedFfsDict[GuidString] = self.UnDispatchedFfsDict.pop(
+                                    GuidString)
                                 self.LoadProtocol(Db, GuidString)
 
         self.DisPatchDxe(Db)
@@ -503,21 +526,24 @@ class FirmwareVolume(Image):
                     # Find Depex
                     if Section.Type == 0x13:
                         IsFoundDepex = True
-                        CouldBeLoaded, DepexString, FileDepex = self.ParseDepex(Section._SubImages[4], 'Protocol')
+                        CouldBeLoaded, DepexString, FileDepex = self.ParseDepex(
+                            Section._SubImages[4], 'Protocol')
                         break
                     if Section.Type == 0x01:
                         CompressSections = Section._SubImages[4]
                         for CompressSection in CompressSections.Sections:
                             if CompressSection.Type == 0x13:
                                 IsFoundDepex = True
-                                CouldBeLoaded, DepexString, FileDepex = self.ParseDepex(CompressSection._SubImages[4], 'Protocol')
+                                CouldBeLoaded, DepexString, FileDepex = self.ParseDepex(
+                                    CompressSection._SubImages[4], 'Protocol')
                                 break
                             if CompressSection.Type == 0x02:
                                 NewSections = CompressSection._SubImages[4]
                                 for NewSection in NewSections.Sections:
                                     if NewSection.Type == 0x13:
                                         IsFoundDepex = True
-                                        CouldBeLoaded, DepexString, FileDepex = self.ParseDepex(NewSection._SubImages[4], 'Protocol')
+                                        CouldBeLoaded, DepexString, FileDepex = self.ParseDepex(
+                                            NewSection._SubImages[4], 'Protocol')
                                         break
 
                 # Not find Depex
@@ -532,7 +558,8 @@ class FirmwareVolume(Image):
                     NewFfs = self.UnDispatchedFfsDict.pop(FfsID)
                     NewFfs.Depex = DepexString
                     if FileDepex is not None:
-                        ScheduleList.insert(FileDepex[1], FfsID, NewFfs, FileDepex[0])
+                        ScheduleList.insert(
+                            FileDepex[1], FfsID, NewFfs, FileDepex[0])
                     else:
                         ScheduleList[FfsID] = NewFfs
                 else:
@@ -565,19 +592,22 @@ class FirmwareVolume(Image):
                 # Get Depex
                 for Section in Ffs.Sections.values():
                     if Section.Type == 0x1B:
-                        CouldBeLoaded, DepexString, FileDepex = self.ParseDepex(Section._SubImages[4], 'Ppi')
+                        CouldBeLoaded, DepexString, FileDepex = self.ParseDepex(
+                            Section._SubImages[4], 'Ppi')
                         break
                     if Section.Type == 0x01:
                         CompressSections = Section._SubImages[4]
                         for CompressSection in CompressSections.Sections:
                             if CompressSection.Type == 0x1B:
-                                CouldBeLoaded, DepexString, FileDepex = self.ParseDepex(CompressSection._SubImages[4], 'Ppi')
+                                CouldBeLoaded, DepexString, FileDepex = self.ParseDepex(
+                                    CompressSection._SubImages[4], 'Ppi')
                                 break
                             if CompressSection.Type == 0x02:
                                 NewSections = CompressSection._SubImages[4]
                                 for NewSection in NewSections.Sections:
                                     if NewSection.Type == 0x1B:
-                                        CouldBeLoaded, DepexString, FileDepex = self.ParseDepex(NewSection._SubImages[4], 'Ppi')
+                                        CouldBeLoaded, DepexString, FileDepex = self.ParseDepex(
+                                            NewSection._SubImages[4], 'Ppi')
                                         break
 
                 # Append New Ffs
@@ -593,13 +623,14 @@ class FirmwareVolume(Image):
         if IsInstalled:
             self.DisPatchPei(Db)
 
-
     def __str__(self):
         global gIndention
         gIndention += 4
         FvInfo = '\n' + ' ' * gIndention
-        FvInfo += "[FV:%s] file_system=%s size=%x checksum=%s\n" % (self.Name, self.FileSystemGuid, self.Size, self.Checksum)
-        FfsInfo = "\n".join([str(self.FfsDict[FfsId]) for FfsId in self.FfsDict])
+        FvInfo += "[FV:%s] file_system=%s size=%x checksum=%s\n" % (
+            self.Name, self.FileSystemGuid, self.Size, self.Checksum)
+        FfsInfo = "\n".join([str(self.FfsDict[FfsId])
+                            for FfsId in self.FfsDict])
         gIndention -= 4
         return FvInfo + FfsInfo
 
@@ -617,18 +648,20 @@ class FirmwareVolume(Image):
             FfsObj.frombuffer(self, FfsStartAddress)
             FfsId = repr(FfsObj)
             if ((self.Attributes & 0x00000800) != 0 and len(FfsObj) == 0xFFFFFF) \
-                or ((self.Attributes & 0x00000800) == 0 and len(FfsObj) == 0):
+                    or ((self.Attributes & 0x00000800) == 0 and len(FfsObj) == 0):
                 if LastFfsObj is not None:
-                    LastFfsObj.FreeSpace = EndOfFv - LastFfsObj._OFF_ - len(LastFfsObj)
+                    LastFfsObj.FreeSpace = EndOfFv - \
+                        LastFfsObj._OFF_ - len(LastFfsObj)
             else:
                 if FfsId in self.FfsDict:
                     EdkLogger.error("FV", 0, "Duplicate GUID in FFS",
-                                    ExtraData="\t%s @ %s\n\t%s @ %s" \
+                                    ExtraData="\t%s @ %s\n\t%s @ %s"
                                     % (FfsObj.Guid, FfsObj.Offset,
                                        self.FfsDict[FfsId].Guid, self.FfsDict[FfsId].Offset))
                 self.FfsDict[FfsId] = FfsObj
                 if LastFfsObj is not None:
-                    LastFfsObj.FreeSpace = FfsStartAddress - LastFfsObj._OFF_ - len(LastFfsObj)
+                    LastFfsObj.FreeSpace = FfsStartAddress - \
+                        LastFfsObj._OFF_ - len(LastFfsObj)
 
             FfsStartAddress += len(FfsObj)
             #
@@ -659,21 +692,23 @@ class FirmwareVolume(Image):
     HeaderSize = property(_GetHeaderLength)
     FileSystemGuid = property(_GetFileSystemGuid)
 
-## GuidDefinedImage() class
+# GuidDefinedImage() class
 #
 #  A class for GUID Defined Image
 #
+
+
 class GuidDefinedImage(Image):
     _HEADER_ = struct.Struct("1I2H8B 1H 1H")
     _HEADER_SIZE_ = _HEADER_.size
 
-    _GUID_          = struct.Struct("1I2H8B")
-    _DATA_OFFSET_   = struct.Struct("16x 1H")
-    _ATTR_          = struct.Struct("18x 1H")
+    _GUID_ = struct.Struct("1I2H8B")
+    _DATA_OFFSET_ = struct.Struct("16x 1H")
+    _ATTR_ = struct.Struct("18x 1H")
 
-    CRC32_GUID          = "FC1BCDB0-7D31-49AA-936A-A4600D9DD083"
+    CRC32_GUID = "FC1BCDB0-7D31-49AA-936A-A4600D9DD083"
     TIANO_COMPRESS_GUID = 'A31280AD-481E-41B6-95E8-127F4C984779'
-    LZMA_COMPRESS_GUID  = 'EE4E5898-3914-4259-9D6E-DC7BD79403CF'
+    LZMA_COMPRESS_GUID = 'EE4E5898-3914-4259-9D6E-DC7BD79403CF'
 
     def __init__(self, SectionDefinitionGuid=None, DataOffset=None, Attributes=None, Data=None):
         Image.__init__(self)
@@ -695,7 +730,7 @@ class GuidDefinedImage(Image):
     def _Unpack(self):
         # keep header in this Image object
         self.empty()
-        self.extend(self._BUF_[self._OFF_ : self._OFF_ + self._LEN_])
+        self.extend(self._BUF_[self._OFF_: self._OFF_ + self._LEN_])
         return len(self)
 
     def _SetAttribute(self, Attribute):
@@ -781,35 +816,37 @@ class GuidDefinedImage(Image):
     DataOffset = property(_GetDataOffset, _SetDataOffset)
     Sections = property(_GetSections)
 
-## Section() class
+# Section() class
 #
 #  A class for Section
 #
+
+
 class Section(Image):
     _TypeName = {
-        0x00    :   "<unknown>",
-        0x01    :   "COMPRESSION",
-        0x02    :   "GUID_DEFINED",
-        0x10    :   "PE32",
-        0x11    :   "PIC",
-        0x12    :   "TE",
-        0x13    :   "DXE_DEPEX",
-        0x14    :   "VERSION",
-        0x15    :   "USER_INTERFACE",
-        0x16    :   "COMPATIBILITY16",
-        0x17    :   "FIRMWARE_VOLUME_IMAGE",
-        0x18    :   "FREEFORM_SUBTYPE_GUID",
-        0x19    :   "RAW",
-        0x1B    :   "PEI_DEPEX"
+        0x00:   "<unknown>",
+        0x01:   "COMPRESSION",
+        0x02:   "GUID_DEFINED",
+        0x10:   "PE32",
+        0x11:   "PIC",
+        0x12:   "TE",
+        0x13:   "DXE_DEPEX",
+        0x14:   "VERSION",
+        0x15:   "USER_INTERFACE",
+        0x16:   "COMPATIBILITY16",
+        0x17:   "FIRMWARE_VOLUME_IMAGE",
+        0x18:   "FREEFORM_SUBTYPE_GUID",
+        0x19:   "RAW",
+        0x1B:   "PEI_DEPEX"
     }
 
     _SectionSubImages = {
-        0x01    :   CompressedImage,
-        0x02    :   GuidDefinedImage,
-        0x17    :   FirmwareVolume,
-        0x13    :   Depex,
-        0x1B    :   Depex,
-        0x15    :   Ui
+        0x01:   CompressedImage,
+        0x02:   GuidDefinedImage,
+        0x17:   FirmwareVolume,
+        0x13:   Depex,
+        0x1B:   Depex,
+        0x15:   Ui
     }
 
     # Size = 3-byte
@@ -819,8 +856,8 @@ class Section(Image):
 
     # SubTypeGuid
     # _FREE_FORM_SUBTYPE_GUID_HEADER_ = struct.Struct("1I2H8B")
-    _SIZE_          = struct.Struct("3B")
-    _TYPE_          = struct.Struct("3x 1B")
+    _SIZE_ = struct.Struct("3B")
+    _TYPE_ = struct.Struct("3x 1B")
 
     def __init__(self, Type=None, Size=None):
         Image.__init__(self)
@@ -835,9 +872,11 @@ class Section(Image):
         gIndention += 4
         SectionInfo = ' ' * gIndention
         if self.Type in self._TypeName:
-            SectionInfo += "[SECTION:%s] offset=%x size=%x" % (self._TypeName[self.Type], self._OFF_, self.Size)
+            SectionInfo += "[SECTION:%s] offset=%x size=%x" % (
+                self._TypeName[self.Type], self._OFF_, self.Size)
         else:
-            SectionInfo += "[SECTION:%x<unknown>] offset=%x size=%x " % (self.Type, self._OFF_, self.Size)
+            SectionInfo += "[SECTION:%x<unknown>] offset=%x size=%x " % (
+                self.Type, self._OFF_, self.Size)
         for Offset in self._SubImages.keys():
             SectionInfo += ", " + str(self._SubImages[Offset])
         gIndention -= 4
@@ -851,10 +890,11 @@ class Section(Image):
 
         if Type not in self._SectionSubImages:
             # no need to extract sub-image, keep all in this Image object
-            self.extend(self._BUF_[self._OFF_ : self._OFF_ + Size])
+            self.extend(self._BUF_[self._OFF_: self._OFF_ + Size])
         else:
             # keep header in this Image object
-            self.extend(self._BUF_[self._OFF_ : self._OFF_ + self._HEADER_SIZE_])
+            self.extend(
+                self._BUF_[self._OFF_: self._OFF_ + self._HEADER_SIZE_])
             #
             # use new Image object to represent payload, which may be another kind
             # of image such as PE32
@@ -862,7 +902,8 @@ class Section(Image):
             PayloadOffset = self._HEADER_SIZE_
             PayloadLen = self.Size - self._HEADER_SIZE_
             Payload = self._SectionSubImages[self.Type]()
-            Payload.frombuffer(self._BUF_, self._OFF_ + self._HEADER_SIZE_, PayloadLen)
+            Payload.frombuffer(self._BUF_, self._OFF_ +
+                               self._HEADER_SIZE_, PayloadLen)
             self._SubImages[PayloadOffset] = Payload
 
         return Size
@@ -907,51 +948,53 @@ class Section(Image):
     Size = property(_GetSize, _SetSize)
     Alignment = property(_GetAlignment, _SetAlignment)
 
-## Ffs() class
+# Ffs() class
 #
 #  A class for Ffs Section
 #
+
+
 class Ffs(Image):
     _FfsFormat = "24B%(payload_size)sB"
     # skip IntegrityCheck
     _HEADER_ = struct.Struct("1I2H8B 2x 1B 1B 3B 1B")
     _HEADER_SIZE_ = _HEADER_.size
 
-    _NAME_      = struct.Struct("1I2H8B")
+    _NAME_ = struct.Struct("1I2H8B")
     _INT_CHECK_ = struct.Struct("16x 1H")
-    _TYPE_      = struct.Struct("18x 1B")
-    _ATTR_      = struct.Struct("19x 1B")
-    _SIZE_      = struct.Struct("20x 3B")
-    _STATE_     = struct.Struct("23x 1B")
+    _TYPE_ = struct.Struct("18x 1B")
+    _ATTR_ = struct.Struct("19x 1B")
+    _SIZE_ = struct.Struct("20x 3B")
+    _STATE_ = struct.Struct("23x 1B")
 
-    FFS_ATTRIB_FIXED              = 0x04
-    FFS_ATTRIB_DATA_ALIGNMENT     = 0x38
-    FFS_ATTRIB_CHECKSUM           = 0x40
+    FFS_ATTRIB_FIXED = 0x04
+    FFS_ATTRIB_DATA_ALIGNMENT = 0x38
+    FFS_ATTRIB_CHECKSUM = 0x40
 
     _TypeName = {
-        0x00    :   "<unknown>",
-        0x01    :   "RAW",
-        0x02    :   "FREEFORM",
-        0x03    :   "SECURITY_CORE",
-        0x04    :   "PEI_CORE",
-        0x05    :   "DXE_CORE",
-        0x06    :   "PEIM",
-        0x07    :   "DRIVER",
-        0x08    :   "COMBINED_PEIM_DRIVER",
-        0x09    :   "APPLICATION",
-        0x0A    :   "SMM",
-        0x0B    :   "FIRMWARE_VOLUME_IMAGE",
-        0x0C    :   "COMBINED_SMM_DXE",
-        0x0D    :   "SMM_CORE",
-        0x0E    :   "MM_STANDALONE",
-        0x0F    :   "MM_CORE_STANDALONE",
-        0xc0    :   "OEM_MIN",
-        0xdf    :   "OEM_MAX",
-        0xe0    :   "DEBUG_MIN",
-        0xef    :   "DEBUG_MAX",
-        0xf0    :   "FFS_MIN",
-        0xff    :   "FFS_MAX",
-        0xf0    :   "FFS_PAD",
+        0x00:   "<unknown>",
+        0x01:   "RAW",
+        0x02:   "FREEFORM",
+        0x03:   "SECURITY_CORE",
+        0x04:   "PEI_CORE",
+        0x05:   "DXE_CORE",
+        0x06:   "PEIM",
+        0x07:   "DRIVER",
+        0x08:   "COMBINED_PEIM_DRIVER",
+        0x09:   "APPLICATION",
+        0x0A:   "SMM",
+        0x0B:   "FIRMWARE_VOLUME_IMAGE",
+        0x0C:   "COMBINED_SMM_DXE",
+        0x0D:   "SMM_CORE",
+        0x0E:   "MM_STANDALONE",
+        0x0F:   "MM_CORE_STANDALONE",
+        0xc0:   "OEM_MIN",
+        0xdf:   "OEM_MAX",
+        0xe0:   "DEBUG_MIN",
+        0xef:   "DEBUG_MAX",
+        0xf0:   "FFS_MIN",
+        0xff:   "FFS_MAX",
+        0xf0:   "FFS_PAD",
     }
 
     def __init__(self):
@@ -968,9 +1011,11 @@ class Ffs(Image):
         gIndention += 4
         Indention = ' ' * gIndention
         FfsInfo = Indention
-        FfsInfo +=  "[FFS:%s] offset=%x size=%x guid=%s free_space=%x alignment=%s\n" % \
-                    (Ffs._TypeName[self.Type], self._OFF_, self.Size, self.Guid, self.FreeSpace, self.Alignment)
-        SectionInfo = '\n'.join([str(self.Sections[Offset]) for Offset in self.Sections.keys()])
+        FfsInfo += "[FFS:%s] offset=%x size=%x guid=%s free_space=%x alignment=%s\n" % \
+            (Ffs._TypeName[self.Type], self._OFF_, self.Size,
+             self.Guid, self.FreeSpace, self.Alignment)
+        SectionInfo = '\n'.join([str(self.Sections[Offset])
+                                for Offset in self.Sections.keys()])
         gIndention -= 4
         return FfsInfo + SectionInfo + "\n"
 
@@ -984,7 +1029,7 @@ class Ffs(Image):
         Size1, Size2, Size3 = self._SIZE_.unpack_from(self._BUF_, self._OFF_)
         Size = Size1 + (Size2 << 8) + (Size3 << 16)
         self.empty()
-        self.extend(self._BUF_[self._OFF_ : self._OFF_ + Size])
+        self.extend(self._BUF_[self._OFF_: self._OFF_ + Size])
 
         # Pad FFS may use the same GUID. We need to avoid it.
         if self.Type == 0xf0:
@@ -1001,8 +1046,8 @@ class Ffs(Image):
                 SectionObj.frombuffer(self, SectionStartAddress)
                 #f = open(repr(SectionObj), 'wb')
                 #SectionObj.Size = 0
-                #SectionObj.tofile(f)
-                #f.close()
+                # SectionObj.tofile(f)
+                # f.close()
                 self.Sections[SectionStartAddress] = SectionObj
                 SectionStartAddress += len(SectionObj)
                 SectionStartAddress = (SectionStartAddress + 3) & (~3)
@@ -1076,7 +1121,7 @@ class Ffs(Image):
     State = property(_GetState, _SetState)
 
 
-## MultipleFv() class
+# MultipleFv() class
 #
 #  A class for Multiple FV
 #
@@ -1101,18 +1146,20 @@ class MultipleFv(FirmwareVolume):
             self.BasicInfo.append([Fv.Name, Fv.FileSystemGuid, Fv.Size])
             self.FfsDict.update(Fv.FfsDict)
 
-## Class Eot
+# Class Eot
 #
 # This class is used to define Eot main entrance
 #
 # @param object:          Inherited from object class
 #
+
+
 class Eot(object):
-    ## The constructor
+    # The constructor
     #
     #   @param  self:      The object pointer
     #
-    def __init__(self, CommandLineOption=True, IsInit=True, SourceFileList=None, \
+    def __init__(self, CommandLineOption=True, IsInit=True, SourceFileList=None,
                  IncludeDirList=None, DecFileList=None, GuidList=None, LogFile=None,
                  FvFileList="", MapFileList="", Report='Report.html', Dispatch=None):
         # Version and Copyright
@@ -1136,10 +1183,13 @@ class Eot(object):
             if "EDK_SOURCE" not in os.environ:
                 pass
             else:
-                EotGlobalData.gEDK_SOURCE = os.path.normpath(os.getenv("EDK_SOURCE"))
+                EotGlobalData.gEDK_SOURCE = os.path.normpath(
+                    os.getenv("EDK_SOURCE"))
         else:
-            EotGlobalData.gEFI_SOURCE = os.path.normpath(os.getenv("EFI_SOURCE"))
-            EotGlobalData.gEDK_SOURCE = os.path.join(EotGlobalData.gEFI_SOURCE, 'Edk')
+            EotGlobalData.gEFI_SOURCE = os.path.normpath(
+                os.getenv("EFI_SOURCE"))
+            EotGlobalData.gEDK_SOURCE = os.path.join(
+                EotGlobalData.gEFI_SOURCE, 'Edk')
 
         if "WORKSPACE" not in os.environ:
             EdkLogger.error("EOT", BuildToolError.ATTRIBUTE_NOT_AVAILABLE, "Environment variable not found",
@@ -1159,16 +1209,19 @@ class Eot(object):
             for FvFile in GetSplitValueList(self.FvFileList, ' '):
                 FvFile = os.path.normpath(FvFile)
                 if not os.path.isfile(FvFile):
-                    EdkLogger.error("Eot", EdkLogger.EOT_ERROR, "Can not find file %s " % FvFile)
+                    EdkLogger.error("Eot", EdkLogger.EOT_ERROR,
+                                    "Can not find file %s " % FvFile)
                 EotGlobalData.gFV_FILE.append(FvFile)
         else:
-            EdkLogger.error("Eot", EdkLogger.EOT_ERROR, "The fv file list of target platform was not specified")
+            EdkLogger.error("Eot", EdkLogger.EOT_ERROR,
+                            "The fv file list of target platform was not specified")
 
         if self.MapFileList:
             for MapFile in GetSplitValueList(self.MapFileList, ' '):
                 MapFile = os.path.normpath(MapFile)
                 if not os.path.isfile(MapFile):
-                    EdkLogger.error("Eot", EdkLogger.EOT_ERROR, "Can not find file %s " % MapFile)
+                    EdkLogger.error("Eot", EdkLogger.EOT_ERROR,
+                                    "Can not find file %s " % MapFile)
                 EotGlobalData.gMAP_FILE.append(MapFile)
 
         # Generate source file list
@@ -1214,7 +1267,7 @@ class Eot(object):
         # Close Database
         EotGlobalData.gDb.Close()
 
-    ## ParseDecFile() method
+    # ParseDecFile() method
     #
     #  parse DEC file and get all GUID names with GUID values as {GuidName : GuidValue}
     #  The Dict is stored in EotGlobalData.gGuidDict
@@ -1227,17 +1280,18 @@ class Eot(object):
             path = os.path.normpath(DecFileList)
             lfr = open(path, 'rb')
             for line in lfr:
-                path = os.path.normpath(os.path.join(EotGlobalData.gWORKSPACE, line.strip()))
+                path = os.path.normpath(os.path.join(
+                    EotGlobalData.gWORKSPACE, line.strip()))
                 if os.path.exists(path):
                     dfr = open(path, 'rb')
                     for line in dfr:
                         line = CleanString(line)
                         list = line.split('=')
                         if len(list) == 2:
-                            EotGlobalData.gGuidDict[list[0].strip()] = GuidStructureStringToGuidString(list[1].strip())
+                            EotGlobalData.gGuidDict[list[0].strip(
+                            )] = GuidStructureStringToGuidString(list[1].strip())
 
-
-    ## ParseGuidList() method
+    # ParseGuidList() method
     #
     #  Parse Guid list and get all GUID names with GUID values as {GuidName : GuidValue}
     #  The Dict is stored in EotGlobalData.gGuidDict
@@ -1245,6 +1299,7 @@ class Eot(object):
     #  @param self: The object pointer
     #  @param GuidList: A list of all GUID and its value
     #
+
     def ParseGuidList(self, GuidList):
         Path = os.path.join(EotGlobalData.gWORKSPACE, GuidList)
         if os.path.isfile(Path):
@@ -1253,7 +1308,7 @@ class Eot(object):
                     (GuidName, GuidValue) = Line.split()
                     EotGlobalData.gGuidDict[GuidName] = GuidValue
 
-    ## ConvertLogFile() method
+    # ConvertLogFile() method
     #
     #  Parse a real running log file to get real dispatch order
     #  The result is saved to old file name + '.new'
@@ -1273,11 +1328,11 @@ class Eot(object):
                 line = line.replace('.efi', '')
                 index = line.find("Loading PEIM at ")
                 if index > -1:
-                    newline.append(line[index + 55 : ])
+                    newline.append(line[index + 55:])
                     continue
                 index = line.find("Loading driver at ")
                 if index > -1:
-                    newline.append(line[index + 57 : ])
+                    newline.append(line[index + 57:])
                     continue
 
         for line in newline:
@@ -1288,7 +1343,7 @@ class Eot(object):
         if lfw:
             lfw.close()
 
-    ## GenerateSourceFileList() method
+    # GenerateSourceFileList() method
     #
     #  Generate a list of all source files
     #  1. Search the file list one by one
@@ -1314,8 +1369,9 @@ class Eot(object):
         if SourceFileList:
             sfl = open(SourceFileList, 'r')
             for line in sfl:
-                line = os.path.normpath(os.path.join(EotGlobalData.gWORKSPACE, line.strip()))
-                if line[-2:].upper() == '.C' or  line[-2:].upper() == '.H':
+                line = os.path.normpath(os.path.join(
+                    EotGlobalData.gWORKSPACE, line.strip()))
+                if line[-2:].upper() == '.C' or line[-2:].upper() == '.H':
                     if line not in mCurrentSourceFileList:
                         mCurrentSourceFileList.append(line)
                         mSourceFileList.append(line)
@@ -1324,7 +1380,8 @@ class Eot(object):
                     if mCurrentInfFile != '':
                         mFileList[mCurrentInfFile] = mCurrentSourceFileList
                         mCurrentSourceFileList = []
-                    mCurrentInfFile = os.path.normpath(os.path.join(EotGlobalData.gWORKSPACE, line))
+                    mCurrentInfFile = os.path.normpath(
+                        os.path.join(EotGlobalData.gWORKSPACE, line))
                     EotGlobalData.gOP_INF.write('%s\n' % mCurrentInfFile)
             if mCurrentInfFile not in mFileList:
                 mFileList[mCurrentInfFile] = mCurrentSourceFileList
@@ -1335,13 +1392,15 @@ class Eot(object):
             for line in ifl:
                 if not line.strip():
                     continue
-                newline = os.path.normpath(os.path.join(EotGlobalData.gWORKSPACE, line.strip()))
+                newline = os.path.normpath(os.path.join(
+                    EotGlobalData.gWORKSPACE, line.strip()))
                 for Root, Dirs, Files in os.walk(str(newline)):
                     for File in Files:
                         FullPath = os.path.normpath(os.path.join(Root, File))
                         if FullPath not in mSourceFileList and File[-2:].upper() == '.H':
                             mSourceFileList.append(FullPath)
-                            EotGlobalData.gOP_SOURCE_FILES.write('%s\n' % FullPath)
+                            EotGlobalData.gOP_SOURCE_FILES.write(
+                                '%s\n' % FullPath)
                         if FullPath not in mDecFileList and File.upper().find('.DEC') > -1:
                             mDecFileList.append(FullPath)
 
@@ -1351,7 +1410,7 @@ class Eot(object):
         EotGlobalData.gINF_FILES = mFileList
         EotGlobalData.gOP_INF.close()
 
-    ## GenerateReport() method
+    # GenerateReport() method
     #
     #  Generate final HTML report
     #
@@ -1362,7 +1421,7 @@ class Eot(object):
         Rep = Report(self.Report, EotGlobalData.gFV, self.Dispatch)
         Rep.GenerateReport()
 
-    ## LoadMapInfo() method
+    # LoadMapInfo() method
     #
     #  Load map files and parse them
     #
@@ -1373,7 +1432,7 @@ class Eot(object):
             EdkLogger.quiet("Parsing Map file ... ")
             EotGlobalData.gMap = ParseMapFile(EotGlobalData.gMAP_FILE)
 
-    ## LoadFvInfo() method
+    # LoadFvInfo() method
     #
     #  Load FV binary files and parse them
     #
@@ -1385,9 +1444,10 @@ class Eot(object):
         EotGlobalData.gFV.Dispatch(EotGlobalData.gDb)
 
         for Protocol in EotGlobalData.gProtocolList:
-            EotGlobalData.gOP_UN_MATCHED_IN_LIBRARY_CALLING.write('%s\n' %Protocol)
+            EotGlobalData.gOP_UN_MATCHED_IN_LIBRARY_CALLING.write(
+                '%s\n' % Protocol)
 
-    ## GenerateReportDatabase() method
+    # GenerateReportDatabase() method
     #
     #  Generate data for the information needed by report
     #  1. Update name, macro and value of all found PPI/PROTOCOL GUID
@@ -1396,7 +1456,8 @@ class Eot(object):
     #  @param self: The object pointer
     #
     def GenerateReportDatabase(self):
-        EdkLogger.quiet("Generating the cross-reference table of GUID for Ppi/Protocol ... ")
+        EdkLogger.quiet(
+            "Generating the cross-reference table of GUID for Ppi/Protocol ... ")
 
         # Update Protocol/Ppi Guid
         SqlCommand = """select DISTINCT GuidName from Report"""
@@ -1410,12 +1471,13 @@ class Eot(object):
             # Find guid value defined in Dec file
             if GuidName in EotGlobalData.gGuidDict:
                 GuidValue = EotGlobalData.gGuidDict[GuidName]
-                SqlCommand = """update Report set GuidMacro = '%s', GuidValue = '%s' where GuidName = '%s'""" %(GuidMacro, GuidValue, GuidName)
+                SqlCommand = """update Report set GuidMacro = '%s', GuidValue = '%s' where GuidName = '%s'""" % (
+                    GuidMacro, GuidValue, GuidName)
                 EotGlobalData.gDb.TblReport.Exec(SqlCommand)
                 continue
 
             # Search defined Macros for guid name
-            SqlCommand ="""select DISTINCT Value, Modifier from Query where Name like '%s'""" % GuidName
+            SqlCommand = """select DISTINCT Value, Modifier from Query where Name like '%s'""" % GuidName
             GuidMacroSet = EotGlobalData.gDb.TblReport.Exec(SqlCommand)
             # Ignore NULL result
             if not GuidMacroSet:
@@ -1424,14 +1486,18 @@ class Eot(object):
             if not GuidMacro:
                 continue
             # Find Guid value of Guid Macro
-            SqlCommand ="""select DISTINCT Value from Query2 where Value like '%%%s%%' and Model = %s""" % (GuidMacro, MODEL_IDENTIFIER_MACRO_DEFINE)
+            SqlCommand = """select DISTINCT Value from Query2 where Value like '%%%s%%' and Model = %s""" % (
+                GuidMacro, MODEL_IDENTIFIER_MACRO_DEFINE)
             GuidValueSet = EotGlobalData.gDb.TblReport.Exec(SqlCommand)
             if GuidValueSet != []:
                 GuidValue = GuidValueSet[0][0]
-                GuidValue = GuidValue[GuidValue.find(GuidMacro) + len(GuidMacro) :]
-                GuidValue = GuidValue.lower().replace('\\', '').replace('\r', '').replace('\n', '').replace('l', '').strip()
+                GuidValue = GuidValue[GuidValue.find(
+                    GuidMacro) + len(GuidMacro):]
+                GuidValue = GuidValue.lower().replace('\\', '').replace(
+                    '\r', '').replace('\n', '').replace('l', '').strip()
                 GuidValue = GuidStructureStringToGuidString(GuidValue)
-                SqlCommand = """update Report set GuidMacro = '%s', GuidValue = '%s' where GuidName = '%s'""" %(GuidMacro, GuidValue, GuidName)
+                SqlCommand = """update Report set GuidMacro = '%s', GuidValue = '%s' where GuidName = '%s'""" % (
+                    GuidMacro, GuidValue, GuidName)
                 EotGlobalData.gDb.TblReport.Exec(SqlCommand)
                 continue
 
@@ -1444,7 +1510,7 @@ class Eot(object):
             if Record[1] == 'Protocol':
                 EotGlobalData.gProtocolList[Record[0].lower()] = -2
 
-    ## GenerateQueryTable() method
+    # GenerateQueryTable() method
     #
     #  Generate two tables improve query performance
     #
@@ -1462,7 +1528,7 @@ class Eot(object):
                             % (Identifier[0], MODEL_IDENTIFIER_MACRO_DEFINE)
             EotGlobalData.gDb.TblReport.Exec(SqlCommand)
 
-    ## ParseExecutionOrder() method
+    # ParseExecutionOrder() method
     #
     #  Get final execution order
     #  1. Search all PPI
@@ -1474,7 +1540,7 @@ class Eot(object):
         EdkLogger.quiet("Searching Ppi/Protocol ... ")
         for Identifier in EotGlobalData.gIdentifierTableList:
             ModuleID, ModuleName, ModuleGuid, SourceFileID, SourceFileFullPath, ItemName, ItemType, ItemMode, GuidName, GuidMacro, GuidValue, BelongsToFunction, Enabled = \
-            -1, '', '', -1, '', '', '', '', '', '', '', '', 0
+                -1, '', '', -1, '', '', '', '', '', '', '', '', 0
 
             SourceFileID = Identifier[0].replace('Identifier', '')
             SourceFileFullPath = Identifier[1]
@@ -1485,85 +1551,108 @@ class Eot(object):
             SqlCommand = """select Value, Name, BelongsToFile, StartLine, EndLine from %s
                             where (Name like '%%%s%%' or Name like '%%%s%%' or Name like '%%%s%%') and Model = %s""" \
                             % (Identifier, '.InstallPpi', '->InstallPpi', 'PeiInstallPpi', MODEL_IDENTIFIER_FUNCTION_CALLING)
-            SearchPpi(SqlCommand, Identifier, SourceFileID, SourceFileFullPath, ItemMode)
+            SearchPpi(SqlCommand, Identifier, SourceFileID,
+                      SourceFileFullPath, ItemMode)
 
             ItemMode = 'Produced'
             SqlCommand = """select Value, Name, BelongsToFile, StartLine, EndLine from %s
                             where (Name like '%%%s%%' or Name like '%%%s%%') and Model = %s""" \
                             % (Identifier, '.ReInstallPpi', '->ReInstallPpi', MODEL_IDENTIFIER_FUNCTION_CALLING)
-            SearchPpi(SqlCommand, Identifier, SourceFileID, SourceFileFullPath, ItemMode, 2)
+            SearchPpi(SqlCommand, Identifier, SourceFileID,
+                      SourceFileFullPath, ItemMode, 2)
 
-            SearchPpiCallFunction(Identifier, SourceFileID, SourceFileFullPath, ItemMode)
+            SearchPpiCallFunction(Identifier, SourceFileID,
+                                  SourceFileFullPath, ItemMode)
 
             ItemMode = 'Consumed'
             SqlCommand = """select Value, Name, BelongsToFile, StartLine, EndLine from %s
                             where (Name like '%%%s%%' or Name like '%%%s%%') and Model = %s""" \
                             % (Identifier, '.LocatePpi', '->LocatePpi', MODEL_IDENTIFIER_FUNCTION_CALLING)
-            SearchPpi(SqlCommand, Identifier, SourceFileID, SourceFileFullPath, ItemMode)
+            SearchPpi(SqlCommand, Identifier, SourceFileID,
+                      SourceFileFullPath, ItemMode)
 
-            SearchFunctionCalling(Identifier, SourceFileID, SourceFileFullPath, 'Ppi', ItemMode)
+            SearchFunctionCalling(Identifier, SourceFileID,
+                                  SourceFileFullPath, 'Ppi', ItemMode)
 
             ItemMode = 'Callback'
             SqlCommand = """select Value, Name, BelongsToFile, StartLine, EndLine from %s
                             where (Name like '%%%s%%' or Name like '%%%s%%') and Model = %s""" \
                             % (Identifier, '.NotifyPpi', '->NotifyPpi', MODEL_IDENTIFIER_FUNCTION_CALLING)
-            SearchPpi(SqlCommand, Identifier, SourceFileID, SourceFileFullPath, ItemMode)
+            SearchPpi(SqlCommand, Identifier, SourceFileID,
+                      SourceFileFullPath, ItemMode)
 
             # Find Protocols
             ItemMode = 'Produced'
             SqlCommand = """select Value, Name, BelongsToFile, StartLine, EndLine from %s
                             where (Name like '%%%s%%' or Name like '%%%s%%' or Name like '%%%s%%' or Name like '%%%s%%') and Model = %s""" \
                             % (Identifier, '.InstallProtocolInterface', '.ReInstallProtocolInterface', '->InstallProtocolInterface', '->ReInstallProtocolInterface', MODEL_IDENTIFIER_FUNCTION_CALLING)
-            SearchProtocols(SqlCommand, Identifier, SourceFileID, SourceFileFullPath, ItemMode, 1)
+            SearchProtocols(SqlCommand, Identifier, SourceFileID,
+                            SourceFileFullPath, ItemMode, 1)
 
             SqlCommand = """select Value, Name, BelongsToFile, StartLine, EndLine from %s
                             where (Name like '%%%s%%' or Name like '%%%s%%') and Model = %s""" \
                             % (Identifier, '.InstallMultipleProtocolInterfaces', '->InstallMultipleProtocolInterfaces', MODEL_IDENTIFIER_FUNCTION_CALLING)
-            SearchProtocols(SqlCommand, Identifier, SourceFileID, SourceFileFullPath, ItemMode, 2)
+            SearchProtocols(SqlCommand, Identifier, SourceFileID,
+                            SourceFileFullPath, ItemMode, 2)
 
-            SearchFunctionCalling(Identifier, SourceFileID, SourceFileFullPath, 'Protocol', ItemMode)
+            SearchFunctionCalling(Identifier, SourceFileID,
+                                  SourceFileFullPath, 'Protocol', ItemMode)
 
             ItemMode = 'Consumed'
             SqlCommand = """select Value, Name, BelongsToFile, StartLine, EndLine from %s
                             where (Name like '%%%s%%' or Name like '%%%s%%') and Model = %s""" \
                             % (Identifier, '.LocateProtocol', '->LocateProtocol', MODEL_IDENTIFIER_FUNCTION_CALLING)
-            SearchProtocols(SqlCommand, Identifier, SourceFileID, SourceFileFullPath, ItemMode, 0)
+            SearchProtocols(SqlCommand, Identifier, SourceFileID,
+                            SourceFileFullPath, ItemMode, 0)
 
             SqlCommand = """select Value, Name, BelongsToFile, StartLine, EndLine from %s
                             where (Name like '%%%s%%' or Name like '%%%s%%') and Model = %s""" \
                             % (Identifier, '.HandleProtocol', '->HandleProtocol', MODEL_IDENTIFIER_FUNCTION_CALLING)
-            SearchProtocols(SqlCommand, Identifier, SourceFileID, SourceFileFullPath, ItemMode, 1)
+            SearchProtocols(SqlCommand, Identifier, SourceFileID,
+                            SourceFileFullPath, ItemMode, 1)
 
-            SearchFunctionCalling(Identifier, SourceFileID, SourceFileFullPath, 'Protocol', ItemMode)
+            SearchFunctionCalling(Identifier, SourceFileID,
+                                  SourceFileFullPath, 'Protocol', ItemMode)
 
             ItemMode = 'Callback'
             SqlCommand = """select Value, Name, BelongsToFile, StartLine, EndLine from %s
                             where (Name like '%%%s%%' or Name like '%%%s%%') and Model = %s""" \
                             % (Identifier, '.RegisterProtocolNotify', '->RegisterProtocolNotify', MODEL_IDENTIFIER_FUNCTION_CALLING)
-            SearchProtocols(SqlCommand, Identifier, SourceFileID, SourceFileFullPath, ItemMode, 0)
+            SearchProtocols(SqlCommand, Identifier, SourceFileID,
+                            SourceFileFullPath, ItemMode, 0)
 
-            SearchFunctionCalling(Identifier, SourceFileID, SourceFileFullPath, 'Protocol', ItemMode)
+            SearchFunctionCalling(Identifier, SourceFileID,
+                                  SourceFileFullPath, 'Protocol', ItemMode)
 
         # Hard Code
-        EotGlobalData.gDb.TblReport.Insert(-2, '', '', -1, '', '', 'Ppi', 'Produced', 'gEfiSecPlatformInformationPpiGuid', '', '', '', 0)
-        EotGlobalData.gDb.TblReport.Insert(-2, '', '', -1, '', '', 'Ppi', 'Produced', 'gEfiNtLoadAsDllPpiGuid', '', '', '', 0)
-        EotGlobalData.gDb.TblReport.Insert(-2, '', '', -1, '', '', 'Ppi', 'Produced', 'gNtPeiLoadFileGuid', '', '', '', 0)
-        EotGlobalData.gDb.TblReport.Insert(-2, '', '', -1, '', '', 'Ppi', 'Produced', 'gPeiNtAutoScanPpiGuid', '', '', '', 0)
-        EotGlobalData.gDb.TblReport.Insert(-2, '', '', -1, '', '', 'Ppi', 'Produced', 'gNtFwhPpiGuid', '', '', '', 0)
-        EotGlobalData.gDb.TblReport.Insert(-2, '', '', -1, '', '', 'Ppi', 'Produced', 'gPeiNtThunkPpiGuid', '', '', '', 0)
-        EotGlobalData.gDb.TblReport.Insert(-2, '', '', -1, '', '', 'Ppi', 'Produced', 'gPeiPlatformTypePpiGuid', '', '', '', 0)
-        EotGlobalData.gDb.TblReport.Insert(-2, '', '', -1, '', '', 'Ppi', 'Produced', 'gPeiFrequencySelectionCpuPpiGuid', '', '', '', 0)
-        EotGlobalData.gDb.TblReport.Insert(-2, '', '', -1, '', '', 'Ppi', 'Produced', 'gPeiCachePpiGuid', '', '', '', 0)
+        EotGlobalData.gDb.TblReport.Insert(-2, '', '', -1, '', '', 'Ppi',
+                                           'Produced', 'gEfiSecPlatformInformationPpiGuid', '', '', '', 0)
+        EotGlobalData.gDb.TblReport.Insert(-2, '', '', -1, '', '',
+                                           'Ppi', 'Produced', 'gEfiNtLoadAsDllPpiGuid', '', '', '', 0)
+        EotGlobalData.gDb.TblReport.Insert(-2, '', '', -1, '', '',
+                                           'Ppi', 'Produced', 'gNtPeiLoadFileGuid', '', '', '', 0)
+        EotGlobalData.gDb.TblReport.Insert(-2, '', '', -1, '', '',
+                                           'Ppi', 'Produced', 'gPeiNtAutoScanPpiGuid', '', '', '', 0)
+        EotGlobalData.gDb.TblReport.Insert(-2, '', '', -1, '',
+                                           '', 'Ppi', 'Produced', 'gNtFwhPpiGuid', '', '', '', 0)
+        EotGlobalData.gDb.TblReport.Insert(-2, '', '', -1, '', '',
+                                           'Ppi', 'Produced', 'gPeiNtThunkPpiGuid', '', '', '', 0)
+        EotGlobalData.gDb.TblReport.Insert(-2, '', '', -1, '', '',
+                                           'Ppi', 'Produced', 'gPeiPlatformTypePpiGuid', '', '', '', 0)
+        EotGlobalData.gDb.TblReport.Insert(-2, '', '', -1, '', '', 'Ppi',
+                                           'Produced', 'gPeiFrequencySelectionCpuPpiGuid', '', '', '', 0)
+        EotGlobalData.gDb.TblReport.Insert(-2, '', '', -1, '', '',
+                                           'Ppi', 'Produced', 'gPeiCachePpiGuid', '', '', '', 0)
 
         EotGlobalData.gDb.Conn.commit()
 
-
-    ## BuildDatabase() methoc
+    # BuildDatabase() methoc
     #
     #  Build the database for target
     #
     #  @param self: The object pointer
     #
+
     def BuildDatabase(self):
         # Clean report table
         EotGlobalData.gDb.TblReport.Drop()
@@ -1576,9 +1665,10 @@ class Eot(object):
             c.CreateCCodeDB(EotGlobalData.gSOURCE_FILES)
             EdkLogger.quiet("Building database for source code done!")
 
-        EotGlobalData.gIdentifierTableList = GetTableList((MODEL_FILE_C, MODEL_FILE_H), 'Identifier', EotGlobalData.gDb)
+        EotGlobalData.gIdentifierTableList = GetTableList(
+            (MODEL_FILE_C, MODEL_FILE_H), 'Identifier', EotGlobalData.gDb)
 
-    ## BuildMetaDataFileDatabase() method
+    # BuildMetaDataFileDatabase() method
     #
     #  Build the database for meta data files
     #
@@ -1590,13 +1680,13 @@ class Eot(object):
         for InfFile in Inf_Files:
             if not InfFile:
                 continue
-            EdkLogger.quiet("Parsing %s ..."  % str(InfFile))
+            EdkLogger.quiet("Parsing %s ..." % str(InfFile))
             EdkInfParser(InfFile, EotGlobalData.gDb, Inf_Files[InfFile])
 
         EotGlobalData.gDb.Conn.commit()
         EdkLogger.quiet("Building database for meta data files done!")
 
-    ## ParseOption() method
+    # ParseOption() method
     #
     #  Parse command line options
     #
@@ -1632,7 +1722,7 @@ class Eot(object):
         if Options.keepdatabase:
             self.IsInit = False
 
-    ## SetLogLevel() method
+    # SetLogLevel() method
     #
     #  Set current log level of the tool based on args
     #
@@ -1649,7 +1739,7 @@ class Eot(object):
         else:
             EdkLogger.SetLevel(EdkLogger.INFO)
 
-    ## EotOptionParser() method
+    # EotOptionParser() method
     #
     #  Using standard Python module optparse to parse command line option of this tool.
     #
@@ -1659,38 +1749,43 @@ class Eot(object):
     #  @retval Args  Target of build command
     #
     def EotOptionParser(self):
-        Parser = OptionParser(description = self.Copyright, version = self.Version, prog = "Eot.exe", usage = "%prog [options]")
+        Parser = OptionParser(description=self.Copyright,
+                              version=self.Version, prog="Eot.exe", usage="%prog [options]")
         Parser.add_option("-m", "--makefile filename", action="store", type="string", dest='MakeFile',
-            help="Specify a makefile for the platform.")
+                          help="Specify a makefile for the platform.")
         Parser.add_option("-c", "--dsc filename", action="store", type="string", dest="DscFile",
-            help="Specify a dsc file for the platform.")
+                          help="Specify a dsc file for the platform.")
         Parser.add_option("-f", "--fv filename", action="store", type="string", dest="FvFileList",
-            help="Specify fv file list, quoted by \"\".")
+                          help="Specify fv file list, quoted by \"\".")
         Parser.add_option("-a", "--map filename", action="store", type="string", dest="MapFileList",
-            help="Specify map file list, quoted by \"\".")
+                          help="Specify map file list, quoted by \"\".")
         Parser.add_option("-s", "--source files", action="store", type="string", dest="SourceFileList",
-            help="Specify source file list by a file")
+                          help="Specify source file list by a file")
         Parser.add_option("-i", "--include dirs", action="store", type="string", dest="IncludeDirList",
-            help="Specify include dir list by a file")
+                          help="Specify include dir list by a file")
         Parser.add_option("-e", "--dec files", action="store", type="string", dest="DecFileList",
-            help="Specify dec file list by a file")
+                          help="Specify dec file list by a file")
         Parser.add_option("-g", "--guid list", action="store", type="string", dest="GuidList",
-            help="Specify guid file list by a file")
+                          help="Specify guid file list by a file")
         Parser.add_option("-l", "--log filename", action="store", type="string", dest="LogFile",
-            help="Specify real execution log file")
+                          help="Specify real execution log file")
 
-        Parser.add_option("-k", "--keepdatabase", action="store_true", type=None, help="The existing Eot database will not be cleaned except report information if this option is specified.")
+        Parser.add_option("-k", "--keepdatabase", action="store_true", type=None,
+                          help="The existing Eot database will not be cleaned except report information if this option is specified.")
 
-        Parser.add_option("-q", "--quiet", action="store_true", type=None, help="Disable all messages except FATAL ERRORS.")
-        Parser.add_option("-v", "--verbose", action="store_true", type=None, help="Turn on verbose output with informational messages printed, "\
-                                                                                   "including library instances selected, final dependency expression, "\
-                                                                                   "and warning messages, etc.")
-        Parser.add_option("-d", "--debug", action="store", type="int", help="Enable debug messages at specified level.")
+        Parser.add_option("-q", "--quiet", action="store_true",
+                          type=None, help="Disable all messages except FATAL ERRORS.")
+        Parser.add_option("-v", "--verbose", action="store_true", type=None, help="Turn on verbose output with informational messages printed, "
+                          "including library instances selected, final dependency expression, "
+                          "and warning messages, etc.")
+        Parser.add_option("-d", "--debug", action="store", type="int",
+                          help="Enable debug messages at specified level.")
 
-        (Opt, Args)=Parser.parse_args()
+        (Opt, Args) = Parser.parse_args()
 
         return (Opt, Args)
 
+
 ##
 #
 # This acts like the main() function for the script, unless it is 'import'ed into another
@@ -1700,7 +1795,8 @@ if __name__ == '__main__':
     # Initialize log system
     EdkLogger.Initialize()
     EdkLogger.IsRaiseError = False
-    EdkLogger.quiet(time.strftime("%H:%M:%S, %b.%d %Y ", time.localtime()) + "[00:00]" + "\n")
+    EdkLogger.quiet(time.strftime("%H:%M:%S, %b.%d %Y ",
+                    time.localtime()) + "[00:00]" + "\n")
 
     StartTime = time.clock()
     Eot = Eot(CommandLineOption=False,
@@ -1709,5 +1805,7 @@ if __name__ == '__main__':
               FvFileList=r'C:\TestEot\FVRECOVERY.Fv')
     FinishTime = time.clock()
 
-    BuildDuration = time.strftime("%M:%S", time.gmtime(int(round(FinishTime - StartTime))))
-    EdkLogger.quiet("\n%s [%s]" % (time.strftime("%H:%M:%S, %b.%d %Y", time.localtime()), BuildDuration))
+    BuildDuration = time.strftime(
+        "%M:%S", time.gmtime(int(round(FinishTime - StartTime))))
+    EdkLogger.quiet("\n%s [%s]" % (time.strftime(
+        "%H:%M:%S, %b.%d %Y", time.localtime()), BuildDuration))
diff --git a/BaseTools/Source/Python/Eot/EotToolError.py b/BaseTools/Source/Python/Eot/EotToolError.py
index d4f4f3a18664..6dc146abbf20 100644
--- a/BaseTools/Source/Python/Eot/EotToolError.py
+++ b/BaseTools/Source/Python/Eot/EotToolError.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Standardized Error Handling infrastructures.
 #
 # Copyright (c) 2008 - 2010, Intel Corporation. All rights reserved.<BR>
@@ -10,6 +10,5 @@ ERROR_1 = 1000
 
 # Error message
 gEccErrorMessage = {
-    ERROR_1 : "RESERVED"
-    }
-
+    ERROR_1: "RESERVED"
+}
diff --git a/BaseTools/Source/Python/Eot/FileProfile.py b/BaseTools/Source/Python/Eot/FileProfile.py
index 1f3ec156d06c..1c4145d8dd09 100644
--- a/BaseTools/Source/Python/Eot/FileProfile.py
+++ b/BaseTools/Source/Python/Eot/FileProfile.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # fragments of source file
 #
 #  Copyright (c) 2007 - 2014, Intel Corporation. All rights reserved.<BR>
@@ -27,15 +27,17 @@ StructUnionDefinitionList = []
 TypedefDefinitionList = []
 FunctionCallingList = []
 
-## Class FileProfile
+# Class FileProfile
 #
 # record file data when parsing source
 #
 # May raise Exception when opening file.
 #
-class FileProfile :
 
-    ## The constructor
+
+class FileProfile:
+
+    # The constructor
     #
     #   @param  self: The object pointer
     #   @param  FileName: The file that to be parsed
diff --git a/BaseTools/Source/Python/Eot/Identification.py b/BaseTools/Source/Python/Eot/Identification.py
index 31d47602e519..1f5aac258839 100644
--- a/BaseTools/Source/Python/Eot/Identification.py
+++ b/BaseTools/Source/Python/Eot/Identification.py
@@ -1,10 +1,10 @@
-## @file
+# @file
 # This file is used to define the identification of INF/DEC/DSC files
 #
 # Copyright (c) 2007, Intel Corporation. All rights reserved.<BR>
 # SPDX-License-Identifier: BSD-2-Clause-Patent
 
-## Identification
+# Identification
 #
 # This class defined basic Identification information structure which is used by INF/DEC/DSC files
 #
@@ -22,27 +22,28 @@ class Identification(object):
         self.FileRelativePath = ''
         self.PackagePath = ''
 
-    ## GetFileName
+    # GetFileName
     #
     # Reserved
     #
     def GetFileName(self, FileFullPath, FileRelativePath):
         pass
 
-    ## GetFileName
+    # GetFileName
     #
     # Reserved
     #
     def GetFileFullPath(self, FileName, FileRelativePath):
         pass
 
-    ## GetFileName
+    # GetFileName
     #
     # Reserved
     #
     def GetFileRelativePath(self, FileName, FileFullPath):
         pass
 
+
 ##
 #
 # This acts like the main() function for the script, unless it is 'import'ed into another
diff --git a/BaseTools/Source/Python/Eot/InfParserLite.py b/BaseTools/Source/Python/Eot/InfParserLite.py
index 2c6bc50b6cd0..10ca7ad3a471 100644
--- a/BaseTools/Source/Python/Eot/InfParserLite.py
+++ b/BaseTools/Source/Python/Eot/InfParserLite.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to parse INF file of EDK project
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -21,21 +21,23 @@ from Eot.Parser import *
 from Eot import Database
 from Eot import EotGlobalData
 
-## EdkInfParser() class
+# EdkInfParser() class
 #
 # This class defined basic INF object which is used by inheriting
 #
 # @param object:       Inherited from object class
 #
+
+
 class EdkInfParser(object):
-    ## The constructor
+    # The constructor
     #
     #  @param  self: The object pointer
     #  @param  Filename: INF file name
     #  @param  Database: Eot database
     #  @param  SourceFileList: A list for all source file belonging this INF file
     #
-    def __init__(self, Filename = None, Database = None, SourceFileList = None):
+    def __init__(self, Filename=None, Database=None, SourceFileList=None):
         self.Identification = Identification()
         self.Sources = []
         self.Macros = {}
@@ -51,27 +53,30 @@ class EdkInfParser(object):
 
         if SourceFileList:
             for Item in SourceFileList:
-                self.TblInf.Insert(MODEL_EFI_SOURCE_FILE, Item, '', '', '', '', 'COMMON', -1, self.FileID, -1, -1, -1, -1, 0)
+                self.TblInf.Insert(MODEL_EFI_SOURCE_FILE, Item, '', '',
+                                   '', '', 'COMMON', -1, self.FileID, -1, -1, -1, -1, 0)
 
-
-    ## LoadInffile() method
+    # LoadInffile() method
     #
     #  Load INF file and insert a record in database
     #
     #  @param  self: The object pointer
     #  @param Filename:  Input value for filename of Inf file
     #
-    def LoadInfFile(self, Filename = None):
+
+    def LoadInfFile(self, Filename=None):
         # Insert a record for file
         Filename = NormPath(Filename)
         self.Identification.FileFullPath = Filename
-        (self.Identification.FileRelativePath, self.Identification.FileName) = os.path.split(Filename)
+        (self.Identification.FileRelativePath,
+         self.Identification.FileName) = os.path.split(Filename)
 
         self.FileID = self.TblFile.InsertFile(Filename, MODEL_FILE_INF)
 
-        self.ParseInf(PreProcess(Filename, False), self.Identification.FileRelativePath, Filename)
+        self.ParseInf(PreProcess(Filename, False),
+                      self.Identification.FileRelativePath, Filename)
 
-    ## ParserSource() method
+    # ParserSource() method
     #
     #  Parse Source section and insert records in database
     #
@@ -91,9 +96,10 @@ class EdkInfParser(object):
             for Item in SectionItemList:
                 if CurrentSection.upper() == 'defines'.upper():
                     (Name, Value) = AddToSelfMacro(self.Macros, Item[0])
-                    self.TblInf.Insert(MODEL_META_DATA_HEADER, Name, Value, Third, '', '', Arch, -1, self.FileID, Item[1], -1, Item[1], -1, 0)
+                    self.TblInf.Insert(MODEL_META_DATA_HEADER, Name, Value, Third,
+                                       '', '', Arch, -1, self.FileID, Item[1], -1, Item[1], -1, 0)
 
-    ## ParseInf() method
+    # ParseInf() method
     #
     #  Parse INF file and get sections information
     #
@@ -102,9 +108,9 @@ class EdkInfParser(object):
     #  @param FileRelativePath: relative path of the file
     #  @param Filename: file name of INF file
     #
-    def ParseInf(self, Lines = [], FileRelativePath = '', Filename = ''):
+    def ParseInf(self, Lines=[], FileRelativePath='', Filename=''):
         IfDefList, SectionItemList, CurrentSection, ArchList, ThirdList, IncludeFiles = \
-        [], [], TAB_UNKNOWN, [], [], []
+            [], [], TAB_UNKNOWN, [], [], []
         LineNo = 0
 
         for Line in Lines:
@@ -112,7 +118,8 @@ class EdkInfParser(object):
             if Line == '':
                 continue
             if Line.startswith(TAB_SECTION_START) and Line.endswith(TAB_SECTION_END):
-                self.ParserSource(CurrentSection, SectionItemList, ArchList, ThirdList)
+                self.ParserSource(
+                    CurrentSection, SectionItemList, ArchList, ThirdList)
 
                 # Parse the new section
                 SectionItemList = []
@@ -120,18 +127,21 @@ class EdkInfParser(object):
                 ThirdList = []
                 # Parse section name
                 CurrentSection = ''
-                LineList = GetSplitValueList(Line[len(TAB_SECTION_START):len(Line) - len(TAB_SECTION_END)], TAB_COMMA_SPLIT)
+                LineList = GetSplitValueList(Line[len(TAB_SECTION_START):len(
+                    Line) - len(TAB_SECTION_END)], TAB_COMMA_SPLIT)
                 for Item in LineList:
                     ItemList = GetSplitValueList(Item, TAB_SPLIT)
                     if CurrentSection == '':
                         CurrentSection = ItemList[0]
                     else:
                         if CurrentSection != ItemList[0]:
-                            EdkLogger.error("Parser", PARSER_ERROR, "Different section names '%s' and '%s' are found in one section definition, this is not allowed." % (CurrentSection, ItemList[0]), File=Filename, Line=LineNo)
+                            EdkLogger.error("Parser", PARSER_ERROR, "Different section names '%s' and '%s' are found in one section definition, this is not allowed." % (
+                                CurrentSection, ItemList[0]), File=Filename, Line=LineNo)
                     ItemList.append('')
                     ItemList.append('')
                     if len(ItemList) > 5:
-                        RaiseParserError(Line, CurrentSection, Filename, '', LineNo)
+                        RaiseParserError(Line, CurrentSection,
+                                         Filename, '', LineNo)
                     else:
                         ArchList.append(ItemList[1].upper())
                         ThirdList.append(ItemList[2])
@@ -143,6 +153,4 @@ class EdkInfParser(object):
             # End of parse
 
         self.ParserSource(CurrentSection, SectionItemList, ArchList, ThirdList)
-        #End of For
-
-
+        # End of For
diff --git a/BaseTools/Source/Python/Eot/Parser.py b/BaseTools/Source/Python/Eot/Parser.py
index f204051d01f7..e02c2267dafa 100644
--- a/BaseTools/Source/Python/Eot/Parser.py
+++ b/BaseTools/Source/Python/Eot/Parser.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define common parsing related functions used in parsing
 # Inf/Dsc/Makefile process
 #
@@ -10,7 +10,8 @@
 # Import Modules
 #
 from __future__ import absolute_import
-import Common.LongFilePathOs as os, re
+import Common.LongFilePathOs as os
+import re
 import Common.EdkLogger as EdkLogger
 from Common.DataType import *
 from CommonDataClass.DataClass import *
@@ -21,10 +22,12 @@ from Common.LongFilePathSupport import OpenLongFilePath as open
 
 import subprocess
 
-## DeCompress
+# DeCompress
 #
 # Call external decompress tool to decompress the fv section
 #
+
+
 def DeCompress(Method, Input):
     # Write the input to a temp file
     open('_Temp.bin', 'wb').write(Input)
@@ -37,7 +40,8 @@ def DeCompress(Method, Input):
         cmd = r'TianoCompress -d -o _New.bin _Temp.bin'
 
     # Call tool to create the decompressed output file
-    Process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
+    Process = subprocess.Popen(
+        cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
     Process.communicate()[0]
 
     # Return the beffer of New.bin
@@ -45,7 +49,7 @@ def DeCompress(Method, Input):
         return open('_New.bin', 'rb').read()
 
 
-## PreProcess() method
+# PreProcess() method
 #
 #  Pre process a file
 #
@@ -58,7 +62,7 @@ def DeCompress(Method, Input):
 #
 #  @return Lines: The file contents after removing comments
 #
-def PreProcess(Filename, MergeMultipleLines = True, LineNo = -1):
+def PreProcess(Filename, MergeMultipleLines=True, LineNo=-1):
     Lines = []
     Filename = os.path.normpath(Filename)
     if not os.path.isfile(Filename):
@@ -100,7 +104,8 @@ def PreProcess(Filename, MergeMultipleLines = True, LineNo = -1):
                 IsFindBlockCode = False
                 continue
             if Line[-1] == TAB_SLASH:
-                ReservedLine = ReservedLine +  TAB_SPACE_SPLIT + Line[0:-1].strip()
+                ReservedLine = ReservedLine + \
+                    TAB_SPACE_SPLIT + Line[0:-1].strip()
                 ReservedLineLength = ReservedLineLength + 1
                 IsFindBlockCode = True
                 continue
@@ -109,18 +114,20 @@ def PreProcess(Filename, MergeMultipleLines = True, LineNo = -1):
 
     return Lines
 
-## AddToGlobalMacro() method
+# AddToGlobalMacro() method
 #
 #  Add a macro to EotGlobalData.gMACRO
 #
 #  @param  Name: Name of the macro
 #  @param  Value: Value of the macro
 #
+
+
 def AddToGlobalMacro(Name, Value):
     Value = ReplaceMacro(Value, EotGlobalData.gMACRO, True)
     EotGlobalData.gMACRO[Name] = Value
 
-## AddToSelfMacro() method
+# AddToSelfMacro() method
 #
 #  Parse a line of macro definition and add it to a macro set
 #
@@ -130,6 +137,8 @@ def AddToGlobalMacro(Name, Value):
 #  @return Name: Name of macro
 #  @return Value: Value of macro
 #
+
+
 def AddToSelfMacro(SelfMacro, Line):
     Name, Value = '', ''
     List = GetSplitValueList(Line, TAB_EQUAL_SPLIT, 1)
@@ -142,7 +151,7 @@ def AddToSelfMacro(SelfMacro, Line):
 
     return (Name, Value)
 
-## GetIncludeListOfFile() method
+# GetIncludeListOfFile() method
 #
 #  Get the include path list for a source file
 #
@@ -156,6 +165,8 @@ def AddToSelfMacro(SelfMacro, Line):
 #
 #  @return IncludeList: A list of include directories
 #
+
+
 def GetIncludeListOfFile(WorkSpace, Filepath, Db):
     IncludeList = []
     Filepath = os.path.normpath(Filepath)
@@ -179,7 +190,7 @@ def GetIncludeListOfFile(WorkSpace, Filepath, Db):
 
     return IncludeList
 
-## GetTableList() method
+# GetTableList() method
 #
 #  Search table file and find all small tables
 #
@@ -189,9 +200,12 @@ def GetIncludeListOfFile(WorkSpace, Filepath, Db):
 #
 #  @return TableList: A list of tables
 #
+
+
 def GetTableList(FileModelList, Table, Db):
     TableList = []
-    SqlCommand = """select ID, FullPath from File where Model in %s""" % str(FileModelList)
+    SqlCommand = """select ID, FullPath from File where Model in %s""" % str(
+        FileModelList)
     RecordSet = Db.TblFile.Exec(SqlCommand)
     for Record in RecordSet:
         TableName = Table + str(Record[0])
@@ -199,7 +213,7 @@ def GetTableList(FileModelList, Table, Db):
 
     return TableList
 
-## GetAllIncludeDir() method
+# GetAllIncludeDir() method
 #
 #  Find all Include directories
 #
@@ -207,6 +221,8 @@ def GetTableList(FileModelList, Table, Db):
 #
 #  @return IncludeList: A list of include directories
 #
+
+
 def GetAllIncludeDirs(Db):
     IncludeList = []
     SqlCommand = """select distinct Value1 from Inf where Model = %s order by Value1""" % MODEL_EFI_INCLUDE
@@ -217,7 +233,7 @@ def GetAllIncludeDirs(Db):
 
     return IncludeList
 
-## GetAllIncludeFiles() method
+# GetAllIncludeFiles() method
 #
 #  Find all Include files
 #
@@ -225,6 +241,8 @@ def GetAllIncludeDirs(Db):
 #
 #  @return IncludeFileList: A list of include files
 #
+
+
 def GetAllIncludeFiles(Db):
     IncludeList = GetAllIncludeDirs(Db)
     IncludeFileList = []
@@ -238,7 +256,7 @@ def GetAllIncludeFiles(Db):
 
     return IncludeFileList
 
-## GetAllSourceFiles() method
+# GetAllSourceFiles() method
 #
 #  Find all source files
 #
@@ -246,6 +264,8 @@ def GetAllIncludeFiles(Db):
 #
 #  @return SourceFileList: A list of source files
 #
+
+
 def GetAllSourceFiles(Db):
     SourceFileList = []
     SqlCommand = """select distinct Value1 from Inf where Model = %s order by Value1""" % MODEL_EFI_SOURCE_FILE
@@ -256,7 +276,7 @@ def GetAllSourceFiles(Db):
 
     return SourceFileList
 
-## GetAllFiles() method
+# GetAllFiles() method
 #
 #  Find all files, both source files and include files
 #
@@ -264,6 +284,8 @@ def GetAllSourceFiles(Db):
 #
 #  @return FileList: A list of files
 #
+
+
 def GetAllFiles(Db):
     FileList = []
     IncludeFileList = GetAllIncludeFiles(Db)
@@ -277,7 +299,7 @@ def GetAllFiles(Db):
 
     return FileList
 
-## ParseConditionalStatement() method
+# ParseConditionalStatement() method
 #
 #  Parse conditional statement
 #
@@ -288,10 +310,13 @@ def GetAllFiles(Db):
 #  @retval True: Find keyword of conditional statement
 #  @retval False: Not find keyword of conditional statement
 #
+
+
 def ParseConditionalStatement(Line, Macros, StatusSet):
     NewLine = Line.upper()
     if NewLine.find(TAB_IF_EXIST.upper()) > -1:
-        IfLine = Line[NewLine.find(TAB_IF_EXIST) + len(TAB_IF_EXIST) + 1:].strip()
+        IfLine = Line[NewLine.find(TAB_IF_EXIST) +
+                      len(TAB_IF_EXIST) + 1:].strip()
         IfLine = ReplaceMacro(IfLine, EotGlobalData.gMACRO, True)
         IfLine = ReplaceMacro(IfLine, Macros, True)
         IfLine = IfLine.replace("\"", '')
@@ -308,7 +333,8 @@ def ParseConditionalStatement(Line, Macros, StatusSet):
         StatusSet.append([Status])
         return True
     if NewLine.find(TAB_IF_N_DEF.upper()) > -1:
-        IfLine = Line[NewLine.find(TAB_IF_N_DEF) + len(TAB_IF_N_DEF) + 1:].strip()
+        IfLine = Line[NewLine.find(TAB_IF_N_DEF) +
+                      len(TAB_IF_N_DEF) + 1:].strip()
         Status = False
         if IfLine not in Macros and IfLine not in EotGlobalData.gMACRO:
             Status = True
@@ -320,7 +346,8 @@ def ParseConditionalStatement(Line, Macros, StatusSet):
         StatusSet.append([Status])
         return True
     if NewLine.find(TAB_ELSE_IF.upper()) > -1:
-        IfLine = Line[NewLine.find(TAB_ELSE_IF) + len(TAB_ELSE_IF) + 1:].strip()
+        IfLine = Line[NewLine.find(TAB_ELSE_IF) +
+                      len(TAB_ELSE_IF) + 1:].strip()
         Status = ParseConditionalStatementMacros(IfLine, Macros)
         StatusSet[-1].append(Status)
         return True
@@ -336,7 +363,7 @@ def ParseConditionalStatement(Line, Macros, StatusSet):
 
     return False
 
-## ParseConditionalStatement() method
+# ParseConditionalStatement() method
 #
 #  Parse conditional statement with Macros
 #
@@ -345,6 +372,8 @@ def ParseConditionalStatement(Line, Macros, StatusSet):
 #
 #  @return Line: New line after replacing macros
 #
+
+
 def ParseConditionalStatementMacros(Line, Macros):
     if Line.upper().find('DEFINED(') > -1 or Line.upper().find('EXIST') > -1:
         return False
@@ -354,7 +383,7 @@ def ParseConditionalStatementMacros(Line, Macros):
     Line = Line.replace("||", "or")
     return eval(Line)
 
-## GetConditionalStatementStatus() method
+# GetConditionalStatementStatus() method
 #
 #  1. Assume the latest status as True
 #  2. Pop the top status of status set, previous status
@@ -364,6 +393,8 @@ def ParseConditionalStatementMacros(Line, Macros):
 #
 #  @return Status: The final status
 #
+
+
 def GetConditionalStatementStatus(StatusSet):
     Status = True
     for Item in StatusSet:
@@ -371,7 +402,7 @@ def GetConditionalStatementStatus(StatusSet):
 
     return Status
 
-## SearchBelongsToFunction() method
+# SearchBelongsToFunction() method
 #
 #  Search all functions belong to the file
 #
@@ -381,15 +412,18 @@ def GetConditionalStatementStatus(StatusSet):
 #
 #  @return: The found function
 #
+
+
 def SearchBelongsToFunction(BelongsToFile, StartLine, EndLine):
-    SqlCommand = """select ID, Name from Function where BelongsToFile = %s and StartLine <= %s and EndLine >= %s""" %(BelongsToFile, StartLine, EndLine)
+    SqlCommand = """select ID, Name from Function where BelongsToFile = %s and StartLine <= %s and EndLine >= %s""" % (
+        BelongsToFile, StartLine, EndLine)
     RecordSet = EotGlobalData.gDb.TblFunction.Exec(SqlCommand)
     if RecordSet != []:
         return RecordSet[0][0], RecordSet[0][1]
     else:
         return -1, ''
 
-## SearchPpiCallFunction() method
+# SearchPpiCallFunction() method
 #
 #  Search all used PPI calling function 'PeiServicesReInstallPpi' and 'PeiServicesInstallPpi'
 #  Store the result to database
@@ -399,6 +433,8 @@ def SearchBelongsToFunction(BelongsToFile, StartLine, EndLine):
 #  @param SourceFileFullPath: Source file full path
 #  @param ItemMode: Mode of the item
 #
+
+
 def SearchPpiCallFunction(Identifier, SourceFileID, SourceFileFullPath, ItemMode):
     ItemName, ItemType, GuidName, GuidMacro, GuidValue = '', 'Ppi', '', '', ''
     SqlCommand = """select Value, Name, BelongsToFile, StartLine, EndLine from %s
@@ -410,13 +446,15 @@ def SearchPpiCallFunction(Identifier, SourceFileID, SourceFileFullPath, ItemMode
     for Record in RecordSet:
         Index = 0
         BelongsToFile, StartLine, EndLine = Record[2], Record[3], Record[4]
-        BelongsToFunctionID, BelongsToFunction = SearchBelongsToFunction(BelongsToFile, StartLine, EndLine)
+        BelongsToFunctionID, BelongsToFunction = SearchBelongsToFunction(
+            BelongsToFile, StartLine, EndLine)
         VariableList = Record[0].split(',')
         for Variable in VariableList:
             Variable = Variable.strip()
             # Get index of the variable
             if Variable.find('[') > -1:
-                Index = int(Variable[Variable.find('[') + 1 : Variable.find(']')])
+                Index = int(Variable[Variable.find(
+                    '[') + 1: Variable.find(']')])
                 Variable = Variable[:Variable.find('[')]
             # Get variable name
             if Variable.startswith('&'):
@@ -434,10 +472,12 @@ def SearchPpiCallFunction(Identifier, SourceFileID, SourceFileFullPath, ItemMode
                     if len(NewVariableValueList) > 1:
                         NewVariableValue = NewVariableValueList[1].strip()
                         if NewVariableValue.startswith('&'):
-                            Db.Insert(-1, '', '', SourceFileID, SourceFileFullPath, ItemName, ItemType, ItemMode, NewVariableValue[1:], GuidMacro, GuidValue, BelongsToFunction, 0)
+                            Db.Insert(-1, '', '', SourceFileID, SourceFileFullPath, ItemName, ItemType,
+                                      ItemMode, NewVariableValue[1:], GuidMacro, GuidValue, BelongsToFunction, 0)
                             continue
                         else:
-                            EotGlobalData.gOP_UN_MATCHED.write('%s, %s, %s, %s, %s, %s\n' % (ItemType, ItemMode, SourceFileID, SourceFileFullPath, StartLine, NewParameter))
+                            EotGlobalData.gOP_UN_MATCHED.write('%s, %s, %s, %s, %s, %s\n' % (
+                                ItemType, ItemMode, SourceFileID, SourceFileFullPath, StartLine, NewParameter))
 
     ItemName, ItemType, GuidName, GuidMacro, GuidValue = '', 'Ppi', '', '', ''
     SqlCommand = """select Value, Name, BelongsToFile, StartLine, EndLine from %s
@@ -458,12 +498,14 @@ def SearchPpiCallFunction(Identifier, SourceFileID, SourceFileFullPath, ItemMode
             continue
         Index = 0
         BelongsToFile, StartLine, EndLine = Record[2], Record[3], Record[4]
-        BelongsToFunctionID, BelongsToFunction = SearchBelongsToFunction(BelongsToFile, StartLine, EndLine)
-        Variable = Record[0].replace('PeiServicesInstallPpi', '').replace('(', '').replace(')', '').replace('&', '').strip()
+        BelongsToFunctionID, BelongsToFunction = SearchBelongsToFunction(
+            BelongsToFile, StartLine, EndLine)
+        Variable = Record[0].replace('PeiServicesInstallPpi', '').replace(
+            '(', '').replace(')', '').replace('&', '').strip()
         Variable = Variable[Variable.find(',') + 1:].strip()
         # Get index of the variable
         if Variable.find('[') > -1:
-            Index = int(Variable[Variable.find('[') + 1 : Variable.find(']')])
+            Index = int(Variable[Variable.find('[') + 1: Variable.find(']')])
             Variable = Variable[:Variable.find('[')]
         # Get variable name
         if Variable.startswith('&'):
@@ -480,12 +522,14 @@ def SearchPpiCallFunction(Identifier, SourceFileID, SourceFileFullPath, ItemMode
                 if len(NewVariableValueList) > 1:
                     NewVariableValue = NewVariableValueList[1].strip()
                     if NewVariableValue.startswith('&'):
-                        Db.Insert(-1, '', '', SourceFileID, SourceFileFullPath, ItemName, ItemType, ItemMode, NewVariableValue[1:], GuidMacro, GuidValue, BelongsToFunction, 0)
+                        Db.Insert(-1, '', '', SourceFileID, SourceFileFullPath, ItemName, ItemType,
+                                  ItemMode, NewVariableValue[1:], GuidMacro, GuidValue, BelongsToFunction, 0)
                         continue
                     else:
-                        EotGlobalData.gOP_UN_MATCHED.write('%s, %s, %s, %s, %s, %s\n' % (ItemType, ItemMode, SourceFileID, SourceFileFullPath, StartLine, NewParameter))
+                        EotGlobalData.gOP_UN_MATCHED.write('%s, %s, %s, %s, %s, %s\n' % (
+                            ItemType, ItemMode, SourceFileID, SourceFileFullPath, StartLine, NewParameter))
 
-## SearchPpis() method
+# SearchPpis() method
 #
 #  Search all used PPI calling function
 #  Store the result to database
@@ -497,7 +541,9 @@ def SearchPpiCallFunction(Identifier, SourceFileID, SourceFileFullPath, ItemMode
 #  @param ItemMode: Mode of the item
 #  @param PpiMode: Mode of PPI
 #
-def SearchPpi(SqlCommand, Table, SourceFileID, SourceFileFullPath, ItemMode, PpiMode = 1):
+
+
+def SearchPpi(SqlCommand, Table, SourceFileID, SourceFileFullPath, ItemMode, PpiMode=1):
     ItemName, ItemType, GuidName, GuidMacro, GuidValue = '', 'Ppi', '', '', ''
     BelongsToFunctionID, BelongsToFunction = -1, ''
     Db = EotGlobalData.gDb.TblReport
@@ -506,7 +552,8 @@ def SearchPpi(SqlCommand, Table, SourceFileID, SourceFileFullPath, ItemMode, Ppi
         Parameter = GetPpiParameter(Record[0], PpiMode)
         BelongsToFile, StartLine, EndLine = Record[2], Record[3], Record[4]
         # Get BelongsToFunction
-        BelongsToFunctionID, BelongsToFunction = SearchBelongsToFunction(BelongsToFile, StartLine, EndLine)
+        BelongsToFunctionID, BelongsToFunction = SearchBelongsToFunction(
+            BelongsToFile, StartLine, EndLine)
 
         # Default is Not Found
         IsFound = False
@@ -514,29 +561,34 @@ def SearchPpi(SqlCommand, Table, SourceFileID, SourceFileFullPath, ItemMode, Ppi
         # For Consumed Ppi
         if ItemMode == 'Consumed':
             if Parameter.startswith('g'):
-                Db.Insert(-1, '', '', SourceFileID, SourceFileFullPath, ItemName, ItemType, ItemMode, Parameter, GuidMacro, GuidValue, BelongsToFunction, 0)
+                Db.Insert(-1, '', '', SourceFileID, SourceFileFullPath, ItemName, ItemType,
+                          ItemMode, Parameter, GuidMacro, GuidValue, BelongsToFunction, 0)
             else:
-                EotGlobalData.gOP_UN_MATCHED.write('%s, %s, %s, %s, %s, %s\n' % (ItemType, ItemMode, SourceFileID, SourceFileFullPath, StartLine, Parameter))
+                EotGlobalData.gOP_UN_MATCHED.write('%s, %s, %s, %s, %s, %s\n' % (
+                    ItemType, ItemMode, SourceFileID, SourceFileFullPath, StartLine, Parameter))
             continue
 
         # Direct Parameter.Guid
-        SqlCommand = """select Value from %s where (Name like '%%%s.Guid%%' or Name like '%%%s->Guid%%') and Model = %s""" % (Table, Parameter, Parameter, MODEL_IDENTIFIER_ASSIGNMENT_EXPRESSION)
+        SqlCommand = """select Value from %s where (Name like '%%%s.Guid%%' or Name like '%%%s->Guid%%') and Model = %s""" % (
+            Table, Parameter, Parameter, MODEL_IDENTIFIER_ASSIGNMENT_EXPRESSION)
         NewRecordSet = Db.Exec(SqlCommand)
         for NewRecord in NewRecordSet:
             GuidName = GetParameterName(NewRecord[0])
-            Db.Insert(-1, '', '', SourceFileID, SourceFileFullPath, ItemName, ItemType, ItemMode, GuidName, GuidMacro, GuidValue, BelongsToFunction, 0)
+            Db.Insert(-1, '', '', SourceFileID, SourceFileFullPath, ItemName, ItemType,
+                      ItemMode, GuidName, GuidMacro, GuidValue, BelongsToFunction, 0)
             IsFound = True
 
         # Defined Parameter
         if not IsFound:
             Key = Parameter
             if Key.rfind(' ') > -1:
-                Key = Key[Key.rfind(' ') : ].strip().replace('&', '')
+                Key = Key[Key.rfind(' '):].strip().replace('&', '')
             Value = FindKeyValue(EotGlobalData.gDb.TblFile, Table, Key)
             List = GetSplitValueList(Value.replace('\n', ''), TAB_COMMA_SPLIT)
             if len(List) > 1:
                 GuidName = GetParameterName(List[1])
-                Db.Insert(-1, '', '', SourceFileID, SourceFileFullPath, ItemName, ItemType, ItemMode, GuidName, GuidMacro, GuidValue, BelongsToFunction, 0)
+                Db.Insert(-1, '', '', SourceFileID, SourceFileFullPath, ItemName, ItemType,
+                          ItemMode, GuidName, GuidMacro, GuidValue, BelongsToFunction, 0)
                 IsFound = True
 
         # A list Parameter
@@ -545,14 +597,18 @@ def SearchPpi(SqlCommand, Table, SourceFileID, SourceFileFullPath, ItemMode, Ppi
             End = Parameter.find(']')
             if Start > -1 and End > -1 and Start < End:
                 try:
-                    Index = int(Parameter[Start + 1 : End])
-                    Parameter = Parameter[0 : Start]
-                    SqlCommand = """select Value from %s where Name = '%s' and Model = %s""" % (Table, Parameter, MODEL_IDENTIFIER_VARIABLE)
+                    Index = int(Parameter[Start + 1: End])
+                    Parameter = Parameter[0: Start]
+                    SqlCommand = """select Value from %s where Name = '%s' and Model = %s""" % (
+                        Table, Parameter, MODEL_IDENTIFIER_VARIABLE)
                     NewRecordSet = Db.Exec(SqlCommand)
                     for NewRecord in NewRecordSet:
-                        NewParameter = GetSplitValueList(NewRecord[0], '}')[Index]
-                        GuidName = GetPpiParameter(NewParameter[NewParameter.find('{') : ])
-                        Db.Insert(-1, '', '', SourceFileID, SourceFileFullPath, ItemName, ItemType, ItemMode, GuidName, GuidMacro, GuidValue, BelongsToFunction, 0)
+                        NewParameter = GetSplitValueList(
+                            NewRecord[0], '}')[Index]
+                        GuidName = GetPpiParameter(
+                            NewParameter[NewParameter.find('{'):])
+                        Db.Insert(-1, '', '', SourceFileID, SourceFileFullPath, ItemName, ItemType,
+                                  ItemMode, GuidName, GuidMacro, GuidValue, BelongsToFunction, 0)
                         IsFound = True
                 except Exception:
                     pass
@@ -565,19 +621,22 @@ def SearchPpi(SqlCommand, Table, SourceFileID, SourceFileFullPath, ItemMode, Ppi
             NewRecordSet = Db.Exec(SqlCommand)
             for NewRecord in NewRecordSet:
                 Table = 'Identifier' + str(NewRecord[0])
-                SqlCommand = """select Value from %s where Name = '%s' and Modifier = 'EFI_PEI_PPI_DESCRIPTOR' and Model = %s""" % (Table, Parameter, MODEL_IDENTIFIER_VARIABLE)
+                SqlCommand = """select Value from %s where Name = '%s' and Modifier = 'EFI_PEI_PPI_DESCRIPTOR' and Model = %s""" % (
+                    Table, Parameter, MODEL_IDENTIFIER_VARIABLE)
                 PpiSet = Db.Exec(SqlCommand)
                 if PpiSet != []:
                     GuidName = GetPpiParameter(PpiSet[0][0])
                     if GuidName != '':
-                        Db.Insert(-1, '', '', SourceFileID, SourceFileFullPath, ItemName, ItemType, ItemMode, GuidName, GuidMacro, GuidValue, BelongsToFunction, 0)
+                        Db.Insert(-1, '', '', SourceFileID, SourceFileFullPath, ItemName, ItemType,
+                                  ItemMode, GuidName, GuidMacro, GuidValue, BelongsToFunction, 0)
                         IsFound = True
                         break
 
         if not IsFound:
-            EotGlobalData.gOP_UN_MATCHED.write('%s, %s, %s, %s, %s, %s\n' % (ItemType, ItemMode, SourceFileID, SourceFileFullPath, StartLine, Parameter))
+            EotGlobalData.gOP_UN_MATCHED.write('%s, %s, %s, %s, %s, %s\n' % (
+                ItemType, ItemMode, SourceFileID, SourceFileFullPath, StartLine, Parameter))
 
-## SearchProtocols() method
+# SearchProtocols() method
 #
 #  Search all used PROTOCOL calling function
 #  Store the result to database
@@ -589,6 +648,8 @@ def SearchPpi(SqlCommand, Table, SourceFileID, SourceFileFullPath, ItemMode, Ppi
 #  @param ItemMode: Mode of the item
 #  @param ProtocolMode: Mode of PROTOCOL
 #
+
+
 def SearchProtocols(SqlCommand, Table, SourceFileID, SourceFileFullPath, ItemMode, ProtocolMode):
     ItemName, ItemType, GuidName, GuidMacro, GuidValue = '', 'Protocol', '', '', ''
     BelongsToFunctionID, BelongsToFunction = -1, ''
@@ -598,7 +659,8 @@ def SearchProtocols(SqlCommand, Table, SourceFileID, SourceFileFullPath, ItemMod
         Parameter = ''
         BelongsToFile, StartLine, EndLine = Record[2], Record[3], Record[4]
         # Get BelongsToFunction
-        BelongsToFunctionID, BelongsToFunction = SearchBelongsToFunction(BelongsToFile, StartLine, EndLine)
+        BelongsToFunctionID, BelongsToFunction = SearchBelongsToFunction(
+            BelongsToFile, StartLine, EndLine)
 
         # Default is Not Found
         IsFound = False
@@ -607,7 +669,8 @@ def SearchProtocols(SqlCommand, Table, SourceFileID, SourceFileFullPath, ItemMod
             Parameter = GetProtocolParameter(Record[0], ProtocolMode)
             if Parameter.startswith('g') or Parameter.endswith('Guid') or Parameter == 'ShellEnvProtocol' or Parameter == 'ShellInterfaceProtocol':
                 GuidName = GetParameterName(Parameter)
-                Db.Insert(-1, '', '', SourceFileID, SourceFileFullPath, ItemName, ItemType, ItemMode, GuidName, GuidMacro, GuidValue, BelongsToFunction, 0)
+                Db.Insert(-1, '', '', SourceFileID, SourceFileFullPath, ItemName, ItemType,
+                          ItemMode, GuidName, GuidMacro, GuidValue, BelongsToFunction, 0)
                 IsFound = True
 
         if ProtocolMode == 2:
@@ -615,22 +678,27 @@ def SearchProtocols(SqlCommand, Table, SourceFileID, SourceFileFullPath, ItemMod
             for Protocol in Protocols:
                 if Protocol.startswith('&') and Protocol.endswith('Guid'):
                     GuidName = GetParameterName(Protocol)
-                    Db.Insert(-1, '', '', SourceFileID, SourceFileFullPath, ItemName, ItemType, ItemMode, GuidName, GuidMacro, GuidValue, BelongsToFunction, 0)
+                    Db.Insert(-1, '', '', SourceFileID, SourceFileFullPath, ItemName, ItemType,
+                              ItemMode, GuidName, GuidMacro, GuidValue, BelongsToFunction, 0)
                     IsFound = True
                 else:
-                    NewValue = FindKeyValue(EotGlobalData.gDb.TblFile, Table, Protocol)
+                    NewValue = FindKeyValue(
+                        EotGlobalData.gDb.TblFile, Table, Protocol)
                     if Protocol != NewValue and NewValue.endswith('Guid'):
                         GuidName = GetParameterName(NewValue)
-                        Db.Insert(-1, '', '', SourceFileID, SourceFileFullPath, ItemName, ItemType, ItemMode, GuidName, GuidMacro, GuidValue, BelongsToFunction, 0)
+                        Db.Insert(-1, '', '', SourceFileID, SourceFileFullPath, ItemName, ItemType,
+                                  ItemMode, GuidName, GuidMacro, GuidValue, BelongsToFunction, 0)
                         IsFound = True
 
         if not IsFound:
             if BelongsToFunction in EotGlobalData.gProducedProtocolLibrary or BelongsToFunction in EotGlobalData.gConsumedProtocolLibrary:
-                EotGlobalData.gOP_UN_MATCHED_IN_LIBRARY_CALLING.write('%s, %s, %s, %s, %s, %s, %s\n' % (ItemType, ItemMode, SourceFileID, SourceFileFullPath, StartLine, Parameter, BelongsToFunction))
+                EotGlobalData.gOP_UN_MATCHED_IN_LIBRARY_CALLING.write('%s, %s, %s, %s, %s, %s, %s\n' % (
+                    ItemType, ItemMode, SourceFileID, SourceFileFullPath, StartLine, Parameter, BelongsToFunction))
             else:
-                EotGlobalData.gOP_UN_MATCHED.write('%s, %s, %s, %s, %s, %s\n' % (ItemType, ItemMode, SourceFileID, SourceFileFullPath, StartLine, Parameter))
+                EotGlobalData.gOP_UN_MATCHED.write('%s, %s, %s, %s, %s, %s\n' % (
+                    ItemType, ItemMode, SourceFileID, SourceFileFullPath, StartLine, Parameter))
 
-## SearchFunctionCalling() method
+# SearchFunctionCalling() method
 #
 #  Search all used PPI/PROTOCOL calling function by library
 #  Store the result to database
@@ -642,6 +710,8 @@ def SearchProtocols(SqlCommand, Table, SourceFileID, SourceFileFullPath, ItemMod
 #  @param ItemType: Type of the item, PPI or PROTOCOL
 #  @param ItemMode: Mode of item
 #
+
+
 def SearchFunctionCalling(Table, SourceFileID, SourceFileFullPath, ItemType, ItemMode):
     LibraryList = {}
     Db = EotGlobalData.gDb.TblReport
@@ -675,13 +745,15 @@ def SearchFunctionCalling(Table, SourceFileID, SourceFileFullPath, ItemType, Ite
             for Parameter in Parameters:
                 if Parameter.startswith('g') or Parameter.endswith('Guid') or Parameter == 'ShellEnvProtocol' or Parameter == 'ShellInterfaceProtocol':
                     GuidName = GetParameterName(Parameter)
-                    Db.Insert(-1, '', '', SourceFileID, SourceFileFullPath, ItemName, ItemType, ItemMode, GuidName, GuidMacro, GuidValue, BelongsToFunction, 0)
+                    Db.Insert(-1, '', '', SourceFileID, SourceFileFullPath, ItemName, ItemType,
+                              ItemMode, GuidName, GuidMacro, GuidValue, BelongsToFunction, 0)
                     IsFound = True
 
             if not IsFound:
-                EotGlobalData.gOP_UN_MATCHED.write('%s, %s, %s, %s, %s, %s\n' % (ItemType, ItemMode, SourceFileID, SourceFileFullPath, StartLine, Parameter))
+                EotGlobalData.gOP_UN_MATCHED.write('%s, %s, %s, %s, %s, %s\n' % (
+                    ItemType, ItemMode, SourceFileID, SourceFileFullPath, StartLine, Parameter))
 
-## FindProtocols() method
+# FindProtocols() method
 #
 #  Find defined protocols
 #
@@ -693,14 +765,14 @@ def SearchFunctionCalling(Table, SourceFileID, SourceFileFullPath, ItemType, Ite
 #  @param ItemType: Type of the item, PPI or PROTOCOL
 #  @param ItemMode: Mode of item
 #
-#def FindProtocols(Db, SqlCommand, Table, SourceFileID, SourceFileFullPath, ItemName, ItemType, ItemMode, GuidName, GuidMacro, GuidValue):
+# def FindProtocols(Db, SqlCommand, Table, SourceFileID, SourceFileFullPath, ItemName, ItemType, ItemMode, GuidName, GuidMacro, GuidValue):
 #    BelongsToFunction = ''
 #    RecordSet = Db.Exec(SqlCommand)
 #    for Record in RecordSet:
 #        IsFound = True
 #        Parameter = GetProtocolParameter(Record[0])
 
-## GetProtocolParameter() method
+# GetProtocolParameter() method
 #
 # Parse string of protocol and find parameters
 #
@@ -709,10 +781,12 @@ def SearchFunctionCalling(Table, SourceFileID, SourceFileFullPath, ItemType, Ite
 #
 #  @return: call common GetParameter
 #
-def GetProtocolParameter(Parameter, Index = 1):
+
+
+def GetProtocolParameter(Parameter, Index=1):
     return GetParameter(Parameter, Index)
 
-## GetPpiParameter() method
+# GetPpiParameter() method
 #
 # Parse string of ppi and find parameters
 #
@@ -721,10 +795,12 @@ def GetProtocolParameter(Parameter, Index = 1):
 #
 #  @return: call common GetParameter
 #
-def GetPpiParameter(Parameter, Index = 1):
+
+
+def GetPpiParameter(Parameter, Index=1):
     return GetParameter(Parameter, Index)
 
-## GetParameter() method
+# GetParameter() method
 #
 # Get a parameter by index
 #
@@ -733,7 +809,9 @@ def GetPpiParameter(Parameter, Index = 1):
 #
 #  @return Parameter: The found parameter
 #
-def GetParameter(Parameter, Index = 1):
+
+
+def GetParameter(Parameter, Index=1):
     ParameterList = GetSplitValueList(Parameter, TAB_COMMA_SPLIT)
     if len(ParameterList) > Index:
         Parameter = GetParameterName(ParameterList[Index])
@@ -742,7 +820,7 @@ def GetParameter(Parameter, Index = 1):
 
     return ''
 
-## GetParameterName() method
+# GetParameterName() method
 #
 # Get a parameter name
 #
@@ -750,13 +828,15 @@ def GetParameter(Parameter, Index = 1):
 #
 #  @return: The name of parameter
 #
+
+
 def GetParameterName(Parameter):
     if isinstance(Parameter, type('')) and Parameter.startswith('&'):
         return Parameter[1:].replace('{', '').replace('}', '').replace('\r', '').replace('\n', '').strip()
     else:
         return Parameter.strip()
 
-## FindKeyValue() method
+# FindKeyValue() method
 #
 # Find key value of a variable
 #
@@ -766,8 +846,11 @@ def GetParameterName(Parameter):
 #
 #  @return Value: The value of the keyword
 #
+
+
 def FindKeyValue(Db, Table, Key):
-    SqlCommand = """select Value from %s where Name = '%s' and (Model = %s or Model = %s)""" % (Table, Key, MODEL_IDENTIFIER_VARIABLE, MODEL_IDENTIFIER_ASSIGNMENT_EXPRESSION)
+    SqlCommand = """select Value from %s where Name = '%s' and (Model = %s or Model = %s)""" % (
+        Table, Key, MODEL_IDENTIFIER_VARIABLE, MODEL_IDENTIFIER_ASSIGNMENT_EXPRESSION)
     RecordSet = Db.Exec(SqlCommand)
     Value = ''
     for Record in RecordSet:
@@ -779,7 +862,7 @@ def FindKeyValue(Db, Table, Key):
     else:
         return Key
 
-## ParseMapFile() method
+# ParseMapFile() method
 #
 #  Parse map files to get a dict of 'ModuleName' : {FunName : FunAddress}
 #
@@ -787,6 +870,8 @@ def FindKeyValue(Db, Table, Key):
 #
 #  @return AllMaps: An object of all map files
 #
+
+
 def ParseMapFile(Files):
     AllMaps = {}
     CurrentModule = ''
@@ -819,7 +904,7 @@ def ParseMapFile(Files):
 
     return AllMaps
 
-## ConvertGuid
+# ConvertGuid
 #
 #  Convert a GUID to a GUID with all upper letters
 #
@@ -827,6 +912,8 @@ def ParseMapFile(Files):
 #
 #  @param newGuid: The GUID with all upper letters.
 #
+
+
 def ConvertGuid(guid):
     numList = ['0', '1', '2', '3', '4', '5', '6', '7', '8', '9']
     newGuid = ''
@@ -844,7 +931,7 @@ def ConvertGuid(guid):
 
     return newGuid
 
-## ConvertGuid2() method
+# ConvertGuid2() method
 #
 #  Convert a GUID to a GUID with new string instead of old string
 #
@@ -854,12 +941,15 @@ def ConvertGuid(guid):
 #
 #  @param newGuid: The GUID after replacement
 #
+
+
 def ConvertGuid2(guid, old, new):
     newGuid = ConvertGuid(guid)
     newGuid = newGuid.replace(old, new)
 
     return newGuid
 
+
 ##
 #
 # This acts like the main() function for the script, unless it is 'import'ed into another
diff --git a/BaseTools/Source/Python/Eot/ParserWarning.py b/BaseTools/Source/Python/Eot/ParserWarning.py
index e84990a4909c..bbee973c96c9 100644
--- a/BaseTools/Source/Python/Eot/ParserWarning.py
+++ b/BaseTools/Source/Python/Eot/ParserWarning.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Warning information of Eot
 #
 #  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -6,14 +6,14 @@
 #  SPDX-License-Identifier: BSD-2-Clause-Patent
 #
 class Warning (Exception):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Str         The message to record
     #   @param  File        The FDF name
     #   @param  Line        The Line number that error occurs
     #
-    def __init__(self, Str, File = None, Line = None):
+    def __init__(self, Str, File=None, Line=None):
         self.message = Str
         self.FileName = File
         self.LineNumber = Line
diff --git a/BaseTools/Source/Python/Eot/Report.py b/BaseTools/Source/Python/Eot/Report.py
index 9d99fe22a0f1..4e39e1b072f6 100644
--- a/BaseTools/Source/Python/Eot/Report.py
+++ b/BaseTools/Source/Python/Eot/Report.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create report for Eot tool
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -13,20 +13,22 @@ import Common.LongFilePathOs as os
 from . import EotGlobalData
 from Common.LongFilePathSupport import OpenLongFilePath as open
 
-## Report() class
+# Report() class
 #
 #  This class defined Report
 #
 #  @param object: Inherited from object class
 #
+
+
 class Report(object):
-    ## The constructor
+    # The constructor
     #
     #  @param  self: The object pointer
     #  @param  ReportName: name of the report
     #  @param  FvObj: FV object after parsing FV images
     #
-    def __init__(self, ReportName = 'Report.html', FvObj = None, DispatchName=None):
+    def __init__(self, ReportName='Report.html', FvObj=None, DispatchName=None):
         self.ReportName = ReportName
         self.Op = open(ReportName, 'w+')
         self.DispatchList = None
@@ -39,7 +41,7 @@ class Report(object):
         if EotGlobalData.gMACRO['EFI_SOURCE'] == '':
             EotGlobalData.gMACRO['EFI_SOURCE'] = EotGlobalData.gMACRO['EDK_SOURCE']
 
-    ## WriteLn() method
+    # WriteLn() method
     #
     #  Write a line in the report
     #
@@ -49,7 +51,7 @@ class Report(object):
     def WriteLn(self, Line):
         self.Op.write('%s\n' % Line)
 
-    ## GenerateReport() method
+    # GenerateReport() method
     #
     #  A caller to generate report
     #
@@ -62,7 +64,7 @@ class Report(object):
         self.Op.close()
         self.GenerateUnDispatchedList()
 
-    ## GenerateUnDispatchedList() method
+    # GenerateUnDispatchedList() method
     #
     #  Create a list for not dispatched items
     #
@@ -72,9 +74,10 @@ class Report(object):
         FvObj = self.FvObj
         EotGlobalData.gOP_UN_DISPATCHED.write('%s\n' % FvObj.Name)
         for Item in FvObj.UnDispatchedFfsDict.keys():
-            EotGlobalData.gOP_UN_DISPATCHED.write('%s\n' % FvObj.UnDispatchedFfsDict[Item])
+            EotGlobalData.gOP_UN_DISPATCHED.write(
+                '%s\n' % FvObj.UnDispatchedFfsDict[Item])
 
-    ## GenerateFv() method
+    # GenerateFv() method
     #
     #  Generate FV information
     #
@@ -98,7 +101,7 @@ class Report(object):
     <td>%s</td>
     <td>%s</td>
     <td>%s</td>
-  </tr>"""  % (FvName, FvGuid, FvSize)
+  </tr>""" % (FvName, FvGuid, FvSize)
             self.WriteLn(Content)
 
         Content = """    <td colspan="3"><table width="100%%"  border="1">
@@ -125,7 +128,7 @@ class Report(object):
   </tr>"""
         self.WriteLn(Content)
 
-    ## GenerateDepex() method
+    # GenerateDepex() method
     #
     #  Generate Depex information
     #
@@ -138,7 +141,8 @@ class Report(object):
         DepexString = ''
         for Item in ItemList:
             if Item not in NonGuidList:
-                SqlCommand = """select DISTINCT GuidName from Report where GuidValue like '%s' and ItemMode = 'Produced' group by GuidName""" % (Item)
+                SqlCommand = """select DISTINCT GuidName from Report where GuidValue like '%s' and ItemMode = 'Produced' group by GuidName""" % (
+                    Item)
                 RecordSet = EotGlobalData.gDb.TblReport.Exec(SqlCommand)
                 if RecordSet != []:
                     Item = RecordSet[0][0]
@@ -149,7 +153,7 @@ class Report(object):
                 </tr>""" % (DepexString)
         self.WriteLn(Content)
 
-    ## GeneratePpi() method
+    # GeneratePpi() method
     #
     #  Generate PPI information
     #
@@ -161,7 +165,7 @@ class Report(object):
     def GeneratePpi(self, Name, Guid, Type):
         self.GeneratePpiProtocol('Ppi', Name, Guid, Type, self.PpiIndex)
 
-    ## GenerateProtocol() method
+    # GenerateProtocol() method
     #
     #  Generate PROTOCOL information
     #
@@ -171,9 +175,10 @@ class Report(object):
     #  @param Type: Type of a GUID
     #
     def GenerateProtocol(self, Name, Guid, Type):
-        self.GeneratePpiProtocol('Protocol', Name, Guid, Type, self.ProtocolIndex)
+        self.GeneratePpiProtocol(
+            'Protocol', Name, Guid, Type, self.ProtocolIndex)
 
-    ## GeneratePpiProtocol() method
+    # GeneratePpiProtocol() method
     #
     #  Generate PPI/PROTOCOL information
     #
@@ -201,7 +206,8 @@ class Report(object):
                                 select DISTINCT BelongsToFile from Inf
                                 where Value1 like '%s')""" % Record[0]
                 ModuleSet = EotGlobalData.gDb.TblReport.Exec(SqlCommand)
-                Inf = ModuleSet[0][0].replace(EotGlobalData.gMACRO['WORKSPACE'], '.')
+                Inf = ModuleSet[0][0].replace(
+                    EotGlobalData.gMACRO['WORKSPACE'], '.')
                 Function = Record[1]
                 Address = ''
                 for Item in EotGlobalData.gMap:
@@ -220,7 +226,7 @@ class Report(object):
                     </tr>""" % ('Callback', Inf, Function, Address)
                 self.WriteLn(Content)
 
-    ## GenerateFfs() method
+    # GenerateFfs() method
     #
     #  Generate FFS information
     #
@@ -274,13 +280,15 @@ class Report(object):
 
             if self.DispatchList:
                 if FfsObj.Type in [0x04, 0x06]:
-                    self.DispatchList.write("%s %s %s %s\n" % (FfsGuid, "P", FfsName, FfsPath))
+                    self.DispatchList.write("%s %s %s %s\n" % (
+                        FfsGuid, "P", FfsName, FfsPath))
                 if FfsObj.Type in [0x05, 0x07, 0x08, 0x0A]:
-                    self.DispatchList.write("%s %s %s %s\n" % (FfsGuid, "D", FfsName, FfsPath))
+                    self.DispatchList.write("%s %s %s %s\n" % (
+                        FfsGuid, "D", FfsName, FfsPath))
 
             self.WriteLn(Content)
 
-            EotGlobalData.gOP_DISPATCH_ORDER.write('%s\n' %FfsName)
+            EotGlobalData.gOP_DISPATCH_ORDER.write('%s\n' % FfsName)
 
             if FfsObj.Depex != '':
                 Content = """          <tr>
@@ -321,12 +329,13 @@ class Report(object):
                     CName = Record[4]
                     Guid = Record[3]
                     Type = Record[1]
-                    self.GeneratePpiProtocol(Type, Name, Guid, 'Consumed', CName)
+                    self.GeneratePpiProtocol(
+                        Type, Name, Guid, 'Consumed', CName)
 
                 Content = """            </table></td>
           </tr>"""
                 self.WriteLn(Content)
-            #End of Consumed Ppi/Protocol
+            # End of Consumed Ppi/Protocol
 
             # Find Produced Ppi/Protocol
             SqlCommand = """select ModuleName, ItemType, GuidName, GuidValue, GuidMacro from Report
@@ -354,7 +363,8 @@ class Report(object):
                     CName = Record[4]
                     Guid = Record[3]
                     Type = Record[1]
-                    self.GeneratePpiProtocol(Type, Name, Guid, 'Produced', CName)
+                    self.GeneratePpiProtocol(
+                        Type, Name, Guid, 'Produced', CName)
 
                 Content = """            </table></td>
           </tr>"""
@@ -366,7 +376,7 @@ class Report(object):
         </tr>"""
             self.WriteLn(Content)
 
-    ## GenerateTail() method
+    # GenerateTail() method
     #
     #  Generate end tags of HTML report
     #
@@ -378,7 +388,7 @@ class Report(object):
 </html>"""
         self.WriteLn(Tail)
 
-    ## GenerateHeader() method
+    # GenerateHeader() method
     #
     #  Generate start tags of HTML report
     #
@@ -446,6 +456,7 @@ function funOnMouseOut()
 <table width="100%%"  border="1">"""
         self.WriteLn(Header)
 
+
 ##
 #
 # This acts like the main() function for the script, unless it is 'import'ed into another
diff --git a/BaseTools/Source/Python/Eot/__init__.py b/BaseTools/Source/Python/Eot/__init__.py
index ba0b8c81fa30..0a328824eb9f 100644
--- a/BaseTools/Source/Python/Eot/__init__.py
+++ b/BaseTools/Source/Python/Eot/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'Eot' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/Eot/c.py b/BaseTools/Source/Python/Eot/c.py
index dd9530fed6d0..95d396a0cde9 100644
--- a/BaseTools/Source/Python/Eot/c.py
+++ b/BaseTools/Source/Python/Eot/c.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # preprocess source file
 #
 #  Copyright (c) 2007 - 2014, Intel Corporation. All rights reserved.<BR>
@@ -27,46 +27,55 @@ IncludePathListDict = {}
 ComplexTypeDict = {}
 SUDict = {}
 
-## GetFuncDeclPattern() method
+# GetFuncDeclPattern() method
 #
 #  Get the pattern of function declaration
 #
 #  @return p:    the pattern of function declaration
 #
+
+
 def GetFuncDeclPattern():
-    p = re.compile(r'(EFIAPI|EFI_BOOT_SERVICE|EFI_RUNTIME_SERVICE)?\s*[_\w]+\s*\(.*\).*', re.DOTALL)
+    p = re.compile(
+        r'(EFIAPI|EFI_BOOT_SERVICE|EFI_RUNTIME_SERVICE)?\s*[_\w]+\s*\(.*\).*', re.DOTALL)
     return p
 
-## GetArrayPattern() method
+# GetArrayPattern() method
 #
 #  Get the pattern of array
 #
 #  @return p:    the pattern of array
 #
+
+
 def GetArrayPattern():
     p = re.compile(r'[_\w]*\s*[\[.*\]]+')
     return p
 
-## GetTypedefFuncPointerPattern() method
+# GetTypedefFuncPointerPattern() method
 #
 #  Get the pattern of function pointer
 #
 #  @return p:    the pattern of function pointer
 #
+
+
 def GetTypedefFuncPointerPattern():
     p = re.compile('[_\w\s]*\([\w\s]*\*+\s*[_\w]+\s*\)\s*\(.*\)', re.DOTALL)
     return p
 
-## GetDB() method
+# GetDB() method
 #
 #  Get global database instance
 #
 #  @return EotGlobalData.gDb:    the global database instance
 #
+
+
 def GetDB():
     return EotGlobalData.gDb
 
-## PrintErrorMsg() method
+# PrintErrorMsg() method
 #
 #  print error message
 #
@@ -75,6 +84,8 @@ def GetDB():
 #  @param TableName: table name of error found
 #  @param ItemId: id of item
 #
+
+
 def PrintErrorMsg(ErrorType, Msg, TableName, ItemId):
     Msg = Msg.replace('\n', '').replace('\r', '')
     MsgPartList = Msg.split()
@@ -82,9 +93,10 @@ def PrintErrorMsg(ErrorType, Msg, TableName, ItemId):
     for Part in MsgPartList:
         Msg += Part
         Msg += ' '
-    GetDB().TblReport.Insert(ErrorType, OtherMsg = Msg, BelongsToTable = TableName, BelongsToItem = ItemId)
+    GetDB().TblReport.Insert(ErrorType, OtherMsg=Msg,
+                             BelongsToTable=TableName, BelongsToItem=ItemId)
 
-## GetIdType() method
+# GetIdType() method
 #
 #  Find type of input string
 #
@@ -92,6 +104,8 @@ def PrintErrorMsg(ErrorType, Msg, TableName, ItemId):
 #
 #  @return Type: The type of the string
 #
+
+
 def GetIdType(Str):
     Type = DataClass.MODEL_UNKNOWN
     Str = Str.replace('#', '# ')
@@ -112,22 +126,26 @@ def GetIdType(Str):
         Type = DataClass.MODEL_UNKNOWN
     return Type
 
-## GetIdentifierList() method
+# GetIdentifierList() method
 #
 #  Get id of all files
 #
 #  @return IdList: The list of all id of files
 #
+
+
 def GetIdentifierList():
     IdList = []
 
     for pp in FileProfile.PPDirectiveList:
         Type = GetIdType(pp.Content)
-        IdPP = DataClass.IdentifierClass(-1, '', '', '', pp.Content, Type, -1, -1, pp.StartPos[0], pp.StartPos[1], pp.EndPos[0], pp.EndPos[1])
+        IdPP = DataClass.IdentifierClass(-1, '', '', '', pp.Content, Type, -
+                                         1, -1, pp.StartPos[0], pp.StartPos[1], pp.EndPos[0], pp.EndPos[1])
         IdList.append(IdPP)
 
     for ae in FileProfile.AssignmentExpressionList:
-        IdAE = DataClass.IdentifierClass(-1, ae.Operator, '', ae.Name, ae.Value, DataClass.MODEL_IDENTIFIER_ASSIGNMENT_EXPRESSION, -1, -1, ae.StartPos[0], ae.StartPos[1], ae.EndPos[0], ae.EndPos[1])
+        IdAE = DataClass.IdentifierClass(-1, ae.Operator, '', ae.Name, ae.Value, DataClass.MODEL_IDENTIFIER_ASSIGNMENT_EXPRESSION, -
+                                         1, -1, ae.StartPos[0], ae.StartPos[1], ae.EndPos[0], ae.EndPos[1])
         IdList.append(IdAE)
 
     FuncDeclPattern = GetFuncDeclPattern()
@@ -147,9 +165,11 @@ def GetIdentifierList():
                 Index = 0
                 while Index < len(FuncNamePartList) - 1:
                     var.Modifier += ' ' + FuncNamePartList[Index]
-                    var.Declarator = var.Declarator.lstrip().lstrip(FuncNamePartList[Index])
+                    var.Declarator = var.Declarator.lstrip().lstrip(
+                        FuncNamePartList[Index])
                     Index += 1
-            IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', var.Declarator, '', DataClass.MODEL_IDENTIFIER_FUNCTION_DECLARATION, -1, -1, var.StartPos[0], var.StartPos[1], var.EndPos[0], var.EndPos[1])
+            IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', var.Declarator, '', DataClass.MODEL_IDENTIFIER_FUNCTION_DECLARATION, -
+                                              1, -1, var.StartPos[0], var.StartPos[1], var.EndPos[0], var.EndPos[1])
             IdList.append(IdVar)
             continue
 
@@ -162,7 +182,8 @@ def GetIdentifierList():
                     var.Modifier += ' ' + Name[LSBPos:]
                     Name = Name[0:LSBPos]
 
-                IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', Name, (len(DeclList) > 1 and [DeclList[1]]or [''])[0], DataClass.MODEL_IDENTIFIER_VARIABLE, -1, -1, var.StartPos[0], var.StartPos[1], var.EndPos[0], var.EndPos[1])
+                IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', Name, (len(DeclList) > 1 and [DeclList[1]] or [''])[
+                                                  0], DataClass.MODEL_IDENTIFIER_VARIABLE, -1, -1, var.StartPos[0], var.StartPos[1], var.EndPos[0], var.EndPos[1])
                 IdList.append(IdVar)
         else:
             DeclList = var.Declarator.split('=')
@@ -171,7 +192,8 @@ def GetIdentifierList():
                 LSBPos = var.Declarator.find('[')
                 var.Modifier += ' ' + Name[LSBPos:]
                 Name = Name[0:LSBPos]
-            IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', Name, (len(DeclList) > 1 and [DeclList[1]]or [''])[0], DataClass.MODEL_IDENTIFIER_VARIABLE, -1, -1, var.StartPos[0], var.StartPos[1], var.EndPos[0], var.EndPos[1])
+            IdVar = DataClass.IdentifierClass(-1, var.Modifier, '', Name, (len(DeclList) > 1 and [DeclList[1]] or [''])[
+                                              0], DataClass.MODEL_IDENTIFIER_VARIABLE, -1, -1, var.StartPos[0], var.StartPos[1], var.EndPos[0], var.EndPos[1])
             IdList.append(IdVar)
 
     for enum in FileProfile.EnumerationDefinitionList:
@@ -179,7 +201,8 @@ def GetIdentifierList():
         RBPos = enum.Content.find('}')
         Name = enum.Content[4:LBPos].strip()
         Value = enum.Content[LBPos+1:RBPos]
-        IdEnum = DataClass.IdentifierClass(-1, '', '', Name, Value, DataClass.MODEL_IDENTIFIER_ENUMERATE, -1, -1, enum.StartPos[0], enum.StartPos[1], enum.EndPos[0], enum.EndPos[1])
+        IdEnum = DataClass.IdentifierClass(-1, '', '', Name, Value, DataClass.MODEL_IDENTIFIER_ENUMERATE, -
+                                           1, -1, enum.StartPos[0], enum.StartPos[1], enum.EndPos[0], enum.EndPos[1])
         IdList.append(IdEnum)
 
     for su in FileProfile.StructUnionDefinitionList:
@@ -196,7 +219,8 @@ def GetIdentifierList():
         else:
             Name = su.Content[SkipLen:LBPos].strip()
             Value = su.Content[LBPos+1:RBPos]
-        IdPE = DataClass.IdentifierClass(-1, '', '', Name, Value, Type, -1, -1, su.StartPos[0], su.StartPos[1], su.EndPos[0], su.EndPos[1])
+        IdPE = DataClass.IdentifierClass(-1, '', '', Name, Value, Type, -
+                                         1, -1, su.StartPos[0], su.StartPos[1], su.EndPos[0], su.EndPos[1])
         IdList.append(IdPE)
 
     TdFuncPointerPattern = GetTypedefFuncPointerPattern()
@@ -219,15 +243,17 @@ def GetIdentifierList():
             Name = TmpStr[0:RBPos]
             Value = 'FP' + TmpStr[RBPos + 1:]
 
-        IdTd = DataClass.IdentifierClass(-1, Modifier, '', Name, Value, DataClass.MODEL_IDENTIFIER_TYPEDEF, -1, -1, td.StartPos[0], td.StartPos[1], td.EndPos[0], td.EndPos[1])
+        IdTd = DataClass.IdentifierClass(-1, Modifier, '', Name, Value, DataClass.MODEL_IDENTIFIER_TYPEDEF, -
+                                         1, -1, td.StartPos[0], td.StartPos[1], td.EndPos[0], td.EndPos[1])
         IdList.append(IdTd)
 
     for funcCall in FileProfile.FunctionCallingList:
-        IdFC = DataClass.IdentifierClass(-1, '', '', funcCall.FuncName, funcCall.ParamList, DataClass.MODEL_IDENTIFIER_FUNCTION_CALLING, -1, -1, funcCall.StartPos[0], funcCall.StartPos[1], funcCall.EndPos[0], funcCall.EndPos[1])
+        IdFC = DataClass.IdentifierClass(-1, '', '', funcCall.FuncName, funcCall.ParamList, DataClass.MODEL_IDENTIFIER_FUNCTION_CALLING, -
+                                         1, -1, funcCall.StartPos[0], funcCall.StartPos[1], funcCall.EndPos[0], funcCall.EndPos[1])
         IdList.append(IdFC)
     return IdList
 
-## GetParamList() method
+# GetParamList() method
 #
 #  Get a list of parameters
 #
@@ -237,7 +263,9 @@ def GetIdentifierList():
 #
 #  @return ParamIdList: A list of parameters
 #
-def GetParamList(FuncDeclarator, FuncNameLine = 0, FuncNameOffset = 0):
+
+
+def GetParamList(FuncDeclarator, FuncNameLine=0, FuncNameOffset=0):
     ParamIdList = []
     DeclSplitList = FuncDeclarator.split('(')
     if len(DeclSplitList) < 2:
@@ -253,7 +281,7 @@ def GetParamList(FuncDeclarator, FuncNameLine = 0, FuncNameOffset = 0):
         Start += FuncName.find('\n', Start)
         Start += 1
     OffsetSkipped += len(FuncName[Start:])
-    OffsetSkipped += 1 #skip '('
+    OffsetSkipped += 1  # skip '('
     ParamBeginLine = FuncNameLine + LineSkipped
     ParamBeginOffset = OffsetSkipped
     for p in ParamStr.split(','):
@@ -289,19 +317,22 @@ def GetParamList(FuncDeclarator, FuncNameLine = 0, FuncNameOffset = 0):
 
         ParamEndLine = ParamBeginLine + LineSkipped
         ParamEndOffset = OffsetSkipped
-        IdParam = DataClass.IdentifierClass(-1, ParamModifier, '', ParamName, '', DataClass.MODEL_IDENTIFIER_PARAMETER, -1, -1, ParamBeginLine, ParamBeginOffset, ParamEndLine, ParamEndOffset)
+        IdParam = DataClass.IdentifierClass(-1, ParamModifier, '', ParamName, '', DataClass.MODEL_IDENTIFIER_PARAMETER, -
+                                            1, -1, ParamBeginLine, ParamBeginOffset, ParamEndLine, ParamEndOffset)
         ParamIdList.append(IdParam)
         ParamBeginLine = ParamEndLine
-        ParamBeginOffset = OffsetSkipped + 1 #skip ','
+        ParamBeginOffset = OffsetSkipped + 1  # skip ','
 
     return ParamIdList
 
-## GetFunctionList()
+# GetFunctionList()
 #
 #  Get a list of functions
 #
 #  @return FuncObjList: A list of function objects
 #
+
+
 def GetFunctionList():
     FuncObjList = []
     for FuncDef in FileProfile.FunctionDefinitionList:
@@ -325,17 +356,20 @@ def GetFunctionList():
                 FuncDef.Modifier += ' ' + FuncNamePartList[Index]
                 Index += 1
 
-        FuncObj = DataClass.FunctionClass(-1, FuncDef.Declarator, FuncDef.Modifier, FuncName.strip(), '', FuncDef.StartPos[0], FuncDef.StartPos[1], FuncDef.EndPos[0], FuncDef.EndPos[1], FuncDef.LeftBracePos[0], FuncDef.LeftBracePos[1], -1, ParamIdList, [])
+        FuncObj = DataClass.FunctionClass(-1, FuncDef.Declarator, FuncDef.Modifier, FuncName.strip(
+        ), '', FuncDef.StartPos[0], FuncDef.StartPos[1], FuncDef.EndPos[0], FuncDef.EndPos[1], FuncDef.LeftBracePos[0], FuncDef.LeftBracePos[1], -1, ParamIdList, [])
         FuncObjList.append(FuncObj)
 
     return FuncObjList
 
-## CreateCCodeDB() method
+# CreateCCodeDB() method
 #
 #  Create database for all c code
 #
 #  @param FileNameList: A list of all c code file names
 #
+
+
 def CreateCCodeDB(FileNameList):
     FileObjList = []
     ParseErrorFileList = []
@@ -346,7 +380,8 @@ def CreateCCodeDB(FileNameList):
                 continue
             ParsedFiles[FullName.lower()] = 1
             EdkLogger.info("Parsing " + FullName)
-            model = FullName.endswith('c') and DataClass.MODEL_FILE_C or DataClass.MODEL_FILE_H
+            model = FullName.endswith(
+                'c') and DataClass.MODEL_FILE_C or DataClass.MODEL_FILE_H
             collector = CodeFragmentCollector.CodeFragmentCollector(FullName)
             try:
                 collector.ParseFile()
@@ -356,12 +391,14 @@ def CreateCCodeDB(FileNameList):
             DirName = os.path.dirname(FullName)
             Ext = os.path.splitext(BaseName)[1].lstrip('.')
             ModifiedTime = os.path.getmtime(FullName)
-            FileObj = DataClass.FileClass(-1, BaseName, Ext, DirName, FullName, model, ModifiedTime, GetFunctionList(), GetIdentifierList(), [])
+            FileObj = DataClass.FileClass(-1, BaseName, Ext, DirName, FullName,
+                                          model, ModifiedTime, GetFunctionList(), GetIdentifierList(), [])
             FileObjList.append(FileObj)
             collector.CleanFileProfileBuffer()
 
     if len(ParseErrorFileList) > 0:
-        EdkLogger.info("Found unrecoverable error during parsing:\n\t%s\n" % "\n\t".join(ParseErrorFileList))
+        EdkLogger.info("Found unrecoverable error during parsing:\n\t%s\n" %
+                       "\n\t".join(ParseErrorFileList))
 
     Db = EotGlobalData.gDb
     for file in FileObjList:
@@ -369,6 +406,7 @@ def CreateCCodeDB(FileNameList):
 
     Db.UpdateIdentifierBelongsToFunction()
 
+
 ##
 #
 # This acts like the main() function for the script, unless it is 'import'ed into another
diff --git a/BaseTools/Source/Python/FMMT/FMMT.py b/BaseTools/Source/Python/FMMT/FMMT.py
index 10800e776a72..e566df65e5fa 100644
--- a/BaseTools/Source/Python/FMMT/FMMT.py
+++ b/BaseTools/Source/Python/FMMT/FMMT.py
@@ -41,35 +41,37 @@ parser.add_argument("-c", "--ConfigFilePath", dest="ConfigFilePath", nargs='+',
                         If do not provide, FMMT tool will search the inputfile folder for FmmtConf.ini firstly, if not found,\
                         the FmmtConf.ini saved in FMMT tool's folder will be used as default.")
 
+
 def print_banner():
     print("")
 
+
 class FMMT():
     def __init__(self) -> None:
         self.firmware_packet = {}
 
-    def SetConfigFilePath(self, configfilepath:str) -> str:
+    def SetConfigFilePath(self, configfilepath: str) -> str:
         os.environ['FmmtConfPath'] = os.path.abspath(configfilepath)
 
-    def SetDestPath(self, inputfile:str) -> str:
+    def SetDestPath(self, inputfile: str) -> str:
         os.environ['FmmtConfPath'] = ''
         self.dest_path = os.path.dirname(os.path.abspath(inputfile))
         old_env = os.environ['PATH']
         os.environ['PATH'] = self.dest_path + os.pathsep + old_env
 
-    def CheckFfsName(self, FfsName:str) -> str:
+    def CheckFfsName(self, FfsName: str) -> str:
         try:
             return uuid.UUID(FfsName)
         except:
             return FfsName
 
-    def GetFvName(self, FvName:str) -> str:
+    def GetFvName(self, FvName: str) -> str:
         try:
             return uuid.UUID(FvName)
         except:
             return FvName
 
-    def View(self, inputfile: str, layoutfilename: str=None, outputfile: str=None) -> None:
+    def View(self, inputfile: str, layoutfilename: str = None, outputfile: str = None) -> None:
         # ViewFile(inputfile, ROOT_TYPE, logfile, outputfile)
         self.SetDestPath(inputfile)
         filetype = os.path.splitext(inputfile)[1].lower()
@@ -85,38 +87,43 @@ class FMMT():
             ROOT_TYPE = ROOT_TREE
         ViewFile(inputfile, ROOT_TYPE, layoutfilename, outputfile)
 
-    def Delete(self, inputfile: str, TargetFfs_name: str, outputfile: str, Fv_name: str=None) -> None:
+    def Delete(self, inputfile: str, TargetFfs_name: str, outputfile: str, Fv_name: str = None) -> None:
         self.SetDestPath(inputfile)
         if Fv_name:
-            DeleteFfs(inputfile, self.CheckFfsName(TargetFfs_name), outputfile, self.GetFvName(Fv_name))
+            DeleteFfs(inputfile, self.CheckFfsName(TargetFfs_name),
+                      outputfile, self.GetFvName(Fv_name))
         else:
             DeleteFfs(inputfile, self.CheckFfsName(TargetFfs_name), outputfile)
 
-    def Extract(self, inputfile: str, Ffs_name: str, outputfile: str, Fv_name: str=None) -> None:
+    def Extract(self, inputfile: str, Ffs_name: str, outputfile: str, Fv_name: str = None) -> None:
         self.SetDestPath(inputfile)
         if Fv_name:
-            ExtractFfs(inputfile, self.CheckFfsName(Ffs_name), outputfile, self.GetFvName(Fv_name))
+            ExtractFfs(inputfile, self.CheckFfsName(Ffs_name),
+                       outputfile, self.GetFvName(Fv_name))
         else:
             ExtractFfs(inputfile, self.CheckFfsName(Ffs_name), outputfile)
 
     def Add(self, inputfile: str, Fv_name: str, newffsfile: str, outputfile: str) -> None:
         self.SetDestPath(inputfile)
-        AddNewFfs(inputfile, self.CheckFfsName(Fv_name), newffsfile, outputfile)
+        AddNewFfs(inputfile, self.CheckFfsName(
+            Fv_name), newffsfile, outputfile)
 
-    def Replace(self,inputfile: str, Ffs_name: str, newffsfile: str, outputfile: str, Fv_name: str=None) -> None:
+    def Replace(self, inputfile: str, Ffs_name: str, newffsfile: str, outputfile: str, Fv_name: str = None) -> None:
         self.SetDestPath(inputfile)
         if Fv_name:
-            ReplaceFfs(inputfile, self.CheckFfsName(Ffs_name), newffsfile, outputfile, self.GetFvName(Fv_name))
+            ReplaceFfs(inputfile, self.CheckFfsName(Ffs_name),
+                       newffsfile, outputfile, self.GetFvName(Fv_name))
         else:
-            ReplaceFfs(inputfile, self.CheckFfsName(Ffs_name), newffsfile, outputfile)
+            ReplaceFfs(inputfile, self.CheckFfsName(
+                Ffs_name), newffsfile, outputfile)
 
 
 def main():
-    args=parser.parse_args()
-    status=0
+    args = parser.parse_args()
+    status = 0
 
     try:
-        fmmt=FMMT()
+        fmmt = FMMT()
         if args.ConfigFilePath:
             fmmt.SetConfigFilePath(args.ConfigFilePath[0])
         if args.View:
@@ -126,21 +133,25 @@ def main():
                 fmmt.View(args.View[0])
         elif args.Delete:
             if len(args.Delete) == 4:
-                fmmt.Delete(args.Delete[0],args.Delete[2],args.Delete[3],args.Delete[1])
+                fmmt.Delete(args.Delete[0], args.Delete[2],
+                            args.Delete[3], args.Delete[1])
             else:
-                fmmt.Delete(args.Delete[0],args.Delete[1],args.Delete[2])
+                fmmt.Delete(args.Delete[0], args.Delete[1], args.Delete[2])
         elif args.Extract:
             if len(args.Extract) == 4:
-                fmmt.Extract(args.Extract[0],args.Extract[2],args.Extract[3], args.Extract[1])
+                fmmt.Extract(args.Extract[0], args.Extract[2],
+                             args.Extract[3], args.Extract[1])
             else:
-                fmmt.Extract(args.Extract[0],args.Extract[1],args.Extract[2])
+                fmmt.Extract(args.Extract[0], args.Extract[1], args.Extract[2])
         elif args.Add:
-            fmmt.Add(args.Add[0],args.Add[1],args.Add[2],args.Add[3])
+            fmmt.Add(args.Add[0], args.Add[1], args.Add[2], args.Add[3])
         elif args.Replace:
             if len(args.Replace) == 5:
-                fmmt.Replace(args.Replace[0],args.Replace[2],args.Replace[3],args.Replace[4],args.Replace[1])
+                fmmt.Replace(args.Replace[0], args.Replace[2],
+                             args.Replace[3], args.Replace[4], args.Replace[1])
             else:
-                fmmt.Replace(args.Replace[0],args.Replace[1],args.Replace[2],args.Replace[3])
+                fmmt.Replace(args.Replace[0], args.Replace[1],
+                             args.Replace[2], args.Replace[3])
         else:
             parser.print_help()
     except Exception as e:
diff --git a/BaseTools/Source/Python/FMMT/__init__.py b/BaseTools/Source/Python/FMMT/__init__.py
index 04e6ec098d0e..ed81c69d67b0 100644
--- a/BaseTools/Source/Python/FMMT/__init__.py
+++ b/BaseTools/Source/Python/FMMT/__init__.py
@@ -1,6 +1,6 @@
-## @file
+# @file
 # This file is used to define the FMMT dependent external tool.
 #
 # Copyright (c) 2021-, Intel Corporation. All rights reserved.<BR>
 # SPDX-License-Identifier: BSD-2-Clause-Patent
-##
\ No newline at end of file
+##
diff --git a/BaseTools/Source/Python/FMMT/core/BinaryFactoryProduct.py b/BaseTools/Source/Python/FMMT/core/BinaryFactoryProduct.py
index 2d4e6d9276d7..16a3ddbc0ae6 100644
--- a/BaseTools/Source/Python/FMMT/core/BinaryFactoryProduct.py
+++ b/BaseTools/Source/Python/FMMT/core/BinaryFactoryProduct.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to implement of the various bianry parser.
 #
 # Copyright (c) 2021-, Intel Corporation. All rights reserved.<BR>
@@ -29,19 +29,23 @@ SEC_FV_TREE = 'SEC_FV_IMAGE'
 BINARY_DATA = 'BINARY'
 Fv_count = 0
 
-## Abstract factory
+# Abstract factory
+
+
 class BinaryFactory():
-    type:list = []
+    type: list = []
 
     def Create_Product():
         pass
 
+
 class BinaryProduct():
-    ## Use GuidTool to decompress data.
+    # Use GuidTool to decompress data.
     def DeCompressData(self, GuidTool, Section_Data: bytes, FileName) -> bytes:
         guidtool = GUIDTools().__getitem__(struct2stream(GuidTool))
         if not guidtool.ifexist:
-            logger.error("GuidTool {} is not found when decompressing {} file.\n".format(guidtool.command, FileName))
+            logger.error("GuidTool {} is not found when decompressing {} file.\n".format(
+                guidtool.command, FileName))
             raise Exception("Process Failed: GuidTool not found!")
         DecompressedData = guidtool.unpack(Section_Data)
         return DecompressedData
@@ -49,33 +53,38 @@ class BinaryProduct():
     def ParserData():
         pass
 
+
 class SectionFactory(BinaryFactory):
     type = [SECTION_TREE]
 
     def Create_Product():
         return SectionProduct()
 
+
 class FfsFactory(BinaryFactory):
     type = [ROOT_SECTION_TREE, FFS_TREE]
 
     def Create_Product():
         return FfsProduct()
 
+
 class FvFactory(BinaryFactory):
     type = [ROOT_FFS_TREE, FV_TREE, SEC_FV_TREE]
 
     def Create_Product():
         return FvProduct()
 
+
 class FdFactory(BinaryFactory):
     type = [ROOT_FV_TREE, ROOT_TREE]
 
     def Create_Product():
         return FdProduct()
 
+
 class SectionProduct(BinaryProduct):
-    ## Decompress the compressed section.
-    def ParserData(self, Section_Tree, whole_Data: bytes, Rel_Whole_Offset: int=0) -> None:
+    # Decompress the compressed section.
+    def ParserData(self, Section_Tree, whole_Data: bytes, Rel_Whole_Offset: int = 0) -> None:
         if Section_Tree.Data.Type == 0x01:
             Section_Tree.Data.OriData = Section_Tree.Data.Data
             self.ParserSection(Section_Tree, b'')
@@ -83,8 +92,10 @@ class SectionProduct(BinaryProduct):
         elif Section_Tree.Data.Type == 0x02:
             Section_Tree.Data.OriData = Section_Tree.Data.Data
             DeCompressGuidTool = Section_Tree.Data.ExtHeader.SectionDefinitionGuid
-            Section_Tree.Data.Data = self.DeCompressData(DeCompressGuidTool, Section_Tree.Data.Data, Section_Tree.Parent.Data.Name)
-            Section_Tree.Data.Size = len(Section_Tree.Data.Data) + Section_Tree.Data.HeaderLength
+            Section_Tree.Data.Data = self.DeCompressData(
+                DeCompressGuidTool, Section_Tree.Data.Data, Section_Tree.Parent.Data.Name)
+            Section_Tree.Data.Size = len(
+                Section_Tree.Data.Data) + Section_Tree.Data.HeaderLength
             self.ParserSection(Section_Tree, b'')
         elif Section_Tree.Data.Type == 0x03:
             Section_Tree.Data.OriData = Section_Tree.Data.Data
@@ -93,16 +104,17 @@ class SectionProduct(BinaryProduct):
         elif Section_Tree.Data.Type == 0x17:
             global Fv_count
             Sec_Fv_Info = FvNode(Fv_count, Section_Tree.Data.Data)
-            Sec_Fv_Tree = BIOSTREE('FV'+ str(Fv_count))
+            Sec_Fv_Tree = BIOSTREE('FV' + str(Fv_count))
             Sec_Fv_Tree.type = SEC_FV_TREE
             Sec_Fv_Tree.Data = Sec_Fv_Info
             Sec_Fv_Tree.Data.HOffset = Section_Tree.Data.DOffset
-            Sec_Fv_Tree.Data.DOffset = Sec_Fv_Tree.Data.HOffset + Sec_Fv_Tree.Data.Header.HeaderLength
+            Sec_Fv_Tree.Data.DOffset = Sec_Fv_Tree.Data.HOffset + \
+                Sec_Fv_Tree.Data.Header.HeaderLength
             Sec_Fv_Tree.Data.Data = Section_Tree.Data.Data[Sec_Fv_Tree.Data.Header.HeaderLength:]
             Section_Tree.insertChild(Sec_Fv_Tree)
             Fv_count += 1
 
-    def ParserSection(self, ParTree, Whole_Data: bytes, Rel_Whole_Offset: int=0) -> None:
+    def ParserSection(self, ParTree, Whole_Data: bytes, Rel_Whole_Offset: int = 0) -> None:
         Rel_Offset = 0
         Section_Offset = 0
         # Get the Data from parent tree, if do not have the tree then get it from the whole_data.
@@ -118,8 +130,10 @@ class SectionProduct(BinaryProduct):
             Section_Info = SectionNode(Whole_Data[Rel_Offset:])
             Section_Tree = BIOSTREE(Section_Info.Name)
             Section_Tree.type = SECTION_TREE
-            Section_Info.Data = Whole_Data[Rel_Offset+Section_Info.HeaderLength: Rel_Offset+Section_Info.Size]
-            Section_Info.DOffset = Section_Offset + Section_Info.HeaderLength + Rel_Whole_Offset
+            Section_Info.Data = Whole_Data[Rel_Offset +
+                                           Section_Info.HeaderLength: Rel_Offset+Section_Info.Size]
+            Section_Info.DOffset = Section_Offset + \
+                Section_Info.HeaderLength + Rel_Whole_Offset
             Section_Info.HOffset = Section_Offset + Rel_Whole_Offset
             Section_Info.ROffset = Rel_Offset
             if Section_Info.Header.Type == 0:
@@ -127,11 +141,14 @@ class SectionProduct(BinaryProduct):
             # The final Section in parent Section does not need to add padding, else must be 4-bytes align with parent Section start offset
             Pad_Size = 0
             if (Rel_Offset+Section_Info.HeaderLength+len(Section_Info.Data) != Data_Size):
-                Pad_Size = GetPadSize(Section_Info.Size, SECTION_COMMON_ALIGNMENT)
+                Pad_Size = GetPadSize(
+                    Section_Info.Size, SECTION_COMMON_ALIGNMENT)
                 Section_Info.PadData = Pad_Size * b'\x00'
             if Section_Info.Header.Type == 0x02:
-                Section_Info.DOffset = Section_Offset + Section_Info.ExtHeader.DataOffset + Rel_Whole_Offset
-                Section_Info.Data = Whole_Data[Rel_Offset+Section_Info.ExtHeader.DataOffset: Rel_Offset+Section_Info.Size]
+                Section_Info.DOffset = Section_Offset + \
+                    Section_Info.ExtHeader.DataOffset + Rel_Whole_Offset
+                Section_Info.Data = Whole_Data[Rel_Offset +
+                                               Section_Info.ExtHeader.DataOffset: Rel_Offset+Section_Info.Size]
             if Section_Info.Header.Type == 0x14:
                 ParTree.Data.Version = Section_Info.ExtHeader.GetVersionString()
             if Section_Info.Header.Type == 0x15:
@@ -144,9 +161,10 @@ class SectionProduct(BinaryProduct):
             Section_Tree.Data = Section_Info
             ParTree.insertChild(Section_Tree)
 
+
 class FfsProduct(BinaryProduct):
     # ParserFFs / GetSection
-    def ParserData(self, ParTree, Whole_Data: bytes, Rel_Whole_Offset: int=0) -> None:
+    def ParserData(self, ParTree, Whole_Data: bytes, Rel_Whole_Offset: int = 0) -> None:
         Rel_Offset = 0
         Section_Offset = 0
         # Get the Data from parent tree, if do not have the tree then get it from the whole_data.
@@ -162,8 +180,10 @@ class FfsProduct(BinaryProduct):
             Section_Info = SectionNode(Whole_Data[Rel_Offset:])
             Section_Tree = BIOSTREE(Section_Info.Name)
             Section_Tree.type = SECTION_TREE
-            Section_Info.Data = Whole_Data[Rel_Offset+Section_Info.HeaderLength: Rel_Offset+Section_Info.Size]
-            Section_Info.DOffset = Section_Offset + Section_Info.HeaderLength + Rel_Whole_Offset
+            Section_Info.Data = Whole_Data[Rel_Offset +
+                                           Section_Info.HeaderLength: Rel_Offset+Section_Info.Size]
+            Section_Info.DOffset = Section_Offset + \
+                Section_Info.HeaderLength + Rel_Whole_Offset
             Section_Info.HOffset = Section_Offset + Rel_Whole_Offset
             Section_Info.ROffset = Rel_Offset
             if Section_Info.Header.Type == 0:
@@ -171,11 +191,14 @@ class FfsProduct(BinaryProduct):
             # The final Section in Ffs does not need to add padding, else must be 4-bytes align with Ffs start offset
             Pad_Size = 0
             if (Rel_Offset+Section_Info.HeaderLength+len(Section_Info.Data) != Data_Size):
-                Pad_Size = GetPadSize(Section_Info.Size, SECTION_COMMON_ALIGNMENT)
+                Pad_Size = GetPadSize(
+                    Section_Info.Size, SECTION_COMMON_ALIGNMENT)
                 Section_Info.PadData = Pad_Size * b'\x00'
             if Section_Info.Header.Type == 0x02:
-                Section_Info.DOffset = Section_Offset + Section_Info.ExtHeader.DataOffset + Rel_Whole_Offset
-                Section_Info.Data = Whole_Data[Rel_Offset+Section_Info.ExtHeader.DataOffset: Rel_Offset+Section_Info.Size]
+                Section_Info.DOffset = Section_Offset + \
+                    Section_Info.ExtHeader.DataOffset + Rel_Whole_Offset
+                Section_Info.Data = Whole_Data[Rel_Offset +
+                                               Section_Info.ExtHeader.DataOffset: Rel_Offset+Section_Info.Size]
             # If Section is Version or UI type, it saves the version and UI info of its parent Ffs.
             if Section_Info.Header.Type == 0x14:
                 ParTree.Data.Version = Section_Info.ExtHeader.GetVersionString()
@@ -189,9 +212,10 @@ class FfsProduct(BinaryProduct):
             Section_Tree.Data = Section_Info
             ParTree.insertChild(Section_Tree)
 
+
 class FvProduct(BinaryProduct):
     ##  ParserFv / GetFfs
-    def ParserData(self, ParTree, Whole_Data: bytes, Rel_Whole_Offset: int=0) -> None:
+    def ParserData(self, ParTree, Whole_Data: bytes, Rel_Whole_Offset: int = 0) -> None:
         Ffs_Offset = 0
         Rel_Offset = 0
         # Get the Data from parent tree, if do not have the tree then get it from the whole_data.
@@ -221,8 +245,10 @@ class FvProduct(BinaryProduct):
                 Ffs_Info.ROffset = Rel_Offset
                 if Ffs_Info.Name == PADVECTOR:
                     Ffs_Tree.type = FFS_PAD
-                    Ffs_Info.Data = Whole_Data[Rel_Offset+Ffs_Info.Header.HeaderLength: Rel_Offset+Ffs_Info.Size]
-                    Ffs_Info.Size = len(Ffs_Info.Data) + Ffs_Info.Header.HeaderLength
+                    Ffs_Info.Data = Whole_Data[Rel_Offset +
+                                               Ffs_Info.Header.HeaderLength: Rel_Offset+Ffs_Info.Size]
+                    Ffs_Info.Size = len(Ffs_Info.Data) + \
+                        Ffs_Info.Header.HeaderLength
                     # if current Ffs is the final ffs of Fv and full of b'\xff', define it with Free_Space
                     if struct2stream(Ffs_Info.Header).replace(b'\xff', b'') == b'':
                         Ffs_Tree.type = FFS_FREE_SPACE
@@ -231,7 +257,8 @@ class FvProduct(BinaryProduct):
                         ParTree.Data.Free_Space = Ffs_Info.Size
                 else:
                     Ffs_Tree.type = FFS_TREE
-                    Ffs_Info.Data = Whole_Data[Rel_Offset+Ffs_Info.Header.HeaderLength: Rel_Offset+Ffs_Info.Size]
+                    Ffs_Info.Data = Whole_Data[Rel_Offset +
+                                               Ffs_Info.Header.HeaderLength: Rel_Offset+Ffs_Info.Size]
                 # The final Ffs in Fv does not need to add padding, else must be 8-bytes align with Fv start offset
                 Pad_Size = 0
                 if Ffs_Tree.type != FFS_FREE_SPACE and (Rel_Offset+Ffs_Info.Header.HeaderLength+len(Ffs_Info.Data) != Data_Size):
@@ -242,11 +269,12 @@ class FvProduct(BinaryProduct):
                 Ffs_Tree.Data = Ffs_Info
                 ParTree.insertChild(Ffs_Tree)
 
+
 class FdProduct(BinaryProduct):
     type = [ROOT_FV_TREE, ROOT_TREE]
 
-    ## Create DataTree with first level /fv Info, then parser each Fv.
-    def ParserData(self, WholeFvTree, whole_data: bytes=b'', offset: int=0) -> None:
+    # Create DataTree with first level /fv Info, then parser each Fv.
+    def ParserData(self, WholeFvTree, whole_data: bytes = b'', offset: int = 0) -> None:
         # Get all Fv image in Fd with offset and length
         Fd_Struct = self.GetFvFromFd(whole_data)
         data_size = len(whole_data)
@@ -254,7 +282,7 @@ class FdProduct(BinaryProduct):
         global Fv_count
         # If the first Fv image is the Binary Fv, add it into the tree.
         if Fd_Struct[0][1] != 0:
-            Binary_node = BIOSTREE('BINARY'+ str(Binary_count))
+            Binary_node = BIOSTREE('BINARY' + str(Binary_count))
             Binary_node.type = BINARY_DATA
             Binary_node.Data = BinaryNode(str(Binary_count))
             Binary_node.Data.Data = whole_data[:Fd_Struct[0][1]]
@@ -263,56 +291,66 @@ class FdProduct(BinaryProduct):
             WholeFvTree.insertChild(Binary_node)
             Binary_count += 1
         # Add the first collected Fv image into the tree.
-        Cur_node = BIOSTREE(Fd_Struct[0][0]+ str(Fv_count))
+        Cur_node = BIOSTREE(Fd_Struct[0][0] + str(Fv_count))
         Cur_node.type = Fd_Struct[0][0]
-        Cur_node.Data = FvNode(Fv_count, whole_data[Fd_Struct[0][1]:Fd_Struct[0][1]+Fd_Struct[0][2][0]])
+        Cur_node.Data = FvNode(
+            Fv_count, whole_data[Fd_Struct[0][1]:Fd_Struct[0][1]+Fd_Struct[0][2][0]])
         Cur_node.Data.HOffset = Fd_Struct[0][1] + offset
         Cur_node.Data.DOffset = Cur_node.Data.HOffset+Cur_node.Data.Header.HeaderLength
-        Cur_node.Data.Data = whole_data[Fd_Struct[0][1]+Cur_node.Data.Header.HeaderLength:Fd_Struct[0][1]+Cur_node.Data.Size]
+        Cur_node.Data.Data = whole_data[Fd_Struct[0][1] +
+                                        Cur_node.Data.Header.HeaderLength:Fd_Struct[0][1]+Cur_node.Data.Size]
         WholeFvTree.insertChild(Cur_node)
         Fv_count += 1
         Fv_num = len(Fd_Struct)
         # Add all the collected Fv image and the Binary Fv image between them into the tree.
         for i in range(Fv_num-1):
             if Fd_Struct[i][1]+Fd_Struct[i][2][0] != Fd_Struct[i+1][1]:
-                Binary_node = BIOSTREE('BINARY'+ str(Binary_count))
+                Binary_node = BIOSTREE('BINARY' + str(Binary_count))
                 Binary_node.type = BINARY_DATA
                 Binary_node.Data = BinaryNode(str(Binary_count))
-                Binary_node.Data.Data = whole_data[Fd_Struct[i][1]+Fd_Struct[i][2][0]:Fd_Struct[i+1][1]]
+                Binary_node.Data.Data = whole_data[Fd_Struct[i]
+                                                   [1]+Fd_Struct[i][2][0]:Fd_Struct[i+1][1]]
                 Binary_node.Data.Size = len(Binary_node.Data.Data)
-                Binary_node.Data.HOffset = Fd_Struct[i][1]+Fd_Struct[i][2][0] + offset
+                Binary_node.Data.HOffset = Fd_Struct[i][1] + \
+                    Fd_Struct[i][2][0] + offset
                 WholeFvTree.insertChild(Binary_node)
                 Binary_count += 1
-            Cur_node = BIOSTREE(Fd_Struct[i+1][0]+ str(Fv_count))
+            Cur_node = BIOSTREE(Fd_Struct[i+1][0] + str(Fv_count))
             Cur_node.type = Fd_Struct[i+1][0]
-            Cur_node.Data = FvNode(Fv_count, whole_data[Fd_Struct[i+1][1]:Fd_Struct[i+1][1]+Fd_Struct[i+1][2][0]])
+            Cur_node.Data = FvNode(
+                Fv_count, whole_data[Fd_Struct[i+1][1]:Fd_Struct[i+1][1]+Fd_Struct[i+1][2][0]])
             Cur_node.Data.HOffset = Fd_Struct[i+1][1] + offset
             Cur_node.Data.DOffset = Cur_node.Data.HOffset+Cur_node.Data.Header.HeaderLength
-            Cur_node.Data.Data = whole_data[Fd_Struct[i+1][1]+Cur_node.Data.Header.HeaderLength:Fd_Struct[i+1][1]+Cur_node.Data.Size]
+            Cur_node.Data.Data = whole_data[Fd_Struct[i+1][1] +
+                                            Cur_node.Data.Header.HeaderLength:Fd_Struct[i+1][1]+Cur_node.Data.Size]
             WholeFvTree.insertChild(Cur_node)
             Fv_count += 1
         # If the final Fv image is the Binary Fv, add it into the tree
         if Fd_Struct[-1][1] + Fd_Struct[-1][2][0] != data_size:
-            Binary_node = BIOSTREE('BINARY'+ str(Binary_count))
+            Binary_node = BIOSTREE('BINARY' + str(Binary_count))
             Binary_node.type = BINARY_DATA
             Binary_node.Data = BinaryNode(str(Binary_count))
-            Binary_node.Data.Data = whole_data[Fd_Struct[-1][1]+Fd_Struct[-1][2][0]:]
+            Binary_node.Data.Data = whole_data[Fd_Struct[-1]
+                                               [1]+Fd_Struct[-1][2][0]:]
             Binary_node.Data.Size = len(Binary_node.Data.Data)
-            Binary_node.Data.HOffset = Fd_Struct[-1][1]+Fd_Struct[-1][2][0] + offset
+            Binary_node.Data.HOffset = Fd_Struct[-1][1] + \
+                Fd_Struct[-1][2][0] + offset
             WholeFvTree.insertChild(Binary_node)
             Binary_count += 1
 
-    ## Get the first level Fv from Fd file.
-    def GetFvFromFd(self, whole_data: bytes=b'') -> list:
+    # Get the first level Fv from Fd file.
+    def GetFvFromFd(self, whole_data: bytes = b'') -> list:
         Fd_Struct = []
         data_size = len(whole_data)
         cur_index = 0
         # Get all the EFI_FIRMWARE_FILE_SYSTEM2_GUID_BYTE FV image offset and length.
         while cur_index < data_size:
             if EFI_FIRMWARE_FILE_SYSTEM2_GUID_BYTE in whole_data[cur_index:]:
-                target_index = whole_data[cur_index:].index(EFI_FIRMWARE_FILE_SYSTEM2_GUID_BYTE) + cur_index
+                target_index = whole_data[cur_index:].index(
+                    EFI_FIRMWARE_FILE_SYSTEM2_GUID_BYTE) + cur_index
                 if whole_data[target_index+24:target_index+28] == FVH_SIGNATURE:
-                    Fd_Struct.append([FV_TREE, target_index - 16, unpack("Q", whole_data[target_index+16:target_index+24])])
+                    Fd_Struct.append(
+                        [FV_TREE, target_index - 16, unpack("Q", whole_data[target_index+16:target_index+24])])
                     cur_index = Fd_Struct[-1][1] + Fd_Struct[-1][2][0]
                 else:
                     cur_index = target_index + 16
@@ -322,9 +360,11 @@ class FdProduct(BinaryProduct):
         # Get all the EFI_FIRMWARE_FILE_SYSTEM3_GUID_BYTE FV image offset and length.
         while cur_index < data_size:
             if EFI_FIRMWARE_FILE_SYSTEM3_GUID_BYTE in whole_data[cur_index:]:
-                target_index = whole_data[cur_index:].index(EFI_FIRMWARE_FILE_SYSTEM3_GUID_BYTE) + cur_index
+                target_index = whole_data[cur_index:].index(
+                    EFI_FIRMWARE_FILE_SYSTEM3_GUID_BYTE) + cur_index
                 if whole_data[target_index+24:target_index+28] == FVH_SIGNATURE:
-                    Fd_Struct.append([FV_TREE, target_index - 16, unpack("Q", whole_data[target_index+16:target_index+24])])
+                    Fd_Struct.append(
+                        [FV_TREE, target_index - 16, unpack("Q", whole_data[target_index+16:target_index+24])])
                     cur_index = Fd_Struct[-1][1] + Fd_Struct[-1][2][0]
                 else:
                     cur_index = target_index + 16
@@ -334,28 +374,31 @@ class FdProduct(BinaryProduct):
         # Get all the EFI_SYSTEM_NVDATA_FV_GUID_BYTE FV image offset and length.
         while cur_index < data_size:
             if EFI_SYSTEM_NVDATA_FV_GUID_BYTE in whole_data[cur_index:]:
-                target_index = whole_data[cur_index:].index(EFI_SYSTEM_NVDATA_FV_GUID_BYTE) + cur_index
+                target_index = whole_data[cur_index:].index(
+                    EFI_SYSTEM_NVDATA_FV_GUID_BYTE) + cur_index
                 if whole_data[target_index+24:target_index+28] == FVH_SIGNATURE:
-                    Fd_Struct.append([DATA_FV_TREE, target_index - 16, unpack("Q", whole_data[target_index+16:target_index+24])])
+                    Fd_Struct.append(
+                        [DATA_FV_TREE, target_index - 16, unpack("Q", whole_data[target_index+16:target_index+24])])
                     cur_index = Fd_Struct[-1][1] + Fd_Struct[-1][2][0]
                 else:
                     cur_index = target_index + 16
             else:
                 cur_index = data_size
         # Sort all the collect Fv image with offset.
-        Fd_Struct.sort(key=lambda x:x[1])
+        Fd_Struct.sort(key=lambda x: x[1])
         tmp_struct = copy.deepcopy(Fd_Struct)
         tmp_index = 0
         Fv_num = len(Fd_Struct)
         # Remove the Fv image included in another Fv image.
-        for i in range(1,Fv_num):
+        for i in range(1, Fv_num):
             if tmp_struct[i][1]+tmp_struct[i][2][0] < tmp_struct[i-1][1]+tmp_struct[i-1][2][0]:
                 Fd_Struct.remove(Fd_Struct[i-tmp_index])
                 tmp_index += 1
         return Fd_Struct
 
+
 class ParserEntry():
-    FactoryTable:dict = {
+    FactoryTable: dict = {
         SECTION_TREE: SectionFactory,
         ROOT_SECTION_TREE: FfsFactory,
         FFS_TREE: FfsFactory,
@@ -377,4 +420,4 @@ class ParserEntry():
     def DataParser(self, Tree, Data: bytes, Offset: int) -> None:
         TargetFactory = self.GetTargetFactory(Tree.type)
         if TargetFactory:
-            self.Generate_Product(TargetFactory, Tree, Data, Offset)
\ No newline at end of file
+            self.Generate_Product(TargetFactory, Tree, Data, Offset)
diff --git a/BaseTools/Source/Python/FMMT/core/BiosTree.py b/BaseTools/Source/Python/FMMT/core/BiosTree.py
index d8fa4743354a..ab9cc6995e12 100644
--- a/BaseTools/Source/Python/FMMT/core/BiosTree.py
+++ b/BaseTools/Source/Python/FMMT/core/BiosTree.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define the Bios layout tree structure and related operations.
 #
 # Copyright (c) 2021-, Intel Corporation. All rights reserved.<BR>
@@ -27,6 +27,7 @@ FvType = [FV_TREE, SEC_FV_TREE]
 FfsType = FFS_TREE
 SecType = SECTION_TREE
 
+
 class BIOSTREE:
     def __init__(self, NodeName: str) -> None:
         self.key = NodeName
@@ -52,7 +53,7 @@ class BIOSTREE:
         return False
 
     # FvTree.insertChild()
-    def insertChild(self, newNode, pos: int=None) -> None:
+    def insertChild(self, newNode, pos: int = None) -> None:
         if len(self.Child) == 0:
             self.Child.append(newNode)
         else:
@@ -112,39 +113,48 @@ class BIOSTREE:
             self = self.Parent
         return BiosTreePath
 
-    def parserTree(self, TargetDict: dict=None, Info: list=None, space: int=0, ParFvId="") -> None:
+    def parserTree(self, TargetDict: dict = None, Info: list = None, space: int = 0, ParFvId="") -> None:
         Key = list(TargetDict.keys())[0]
         if TargetDict[Key]["Type"] in RootType:
             Info.append("Image File: {}".format(Key))
-            Info.append("FilesNum: {}".format(TargetDict.get(Key).get('FilesNum')))
+            Info.append("FilesNum: {}".format(
+                TargetDict.get(Key).get('FilesNum')))
             Info.append("\n")
         elif TargetDict[Key]["Type"] in FvType:
             space += 2
             if TargetDict[Key]["Type"] == SEC_FV_TREE:
-                Info.append("{}Child FV named {} of {}".format(space*" ", Key, ParFvId))
+                Info.append("{}Child FV named {} of {}".format(
+                    space*" ", Key, ParFvId))
                 space += 2
             else:
                 Info.append("FvId: {}".format(Key))
                 ParFvId = Key
-            Info.append("{}FvNameGuid: {}".format(space*" ", TargetDict.get(Key).get('FvNameGuid')))
-            Info.append("{}Attributes: {}".format(space*" ", TargetDict.get(Key).get('Attributes')))
-            Info.append("{}Total Volume Size: {}".format(space*" ", TargetDict.get(Key).get('Size')))
-            Info.append("{}Free Volume Size: {}".format(space*" ", TargetDict.get(Key).get('FreeSize')))
-            Info.append("{}Volume Offset: {}".format(space*" ", TargetDict.get(Key).get('Offset')))
-            Info.append("{}FilesNum: {}".format(space*" ", TargetDict.get(Key).get('FilesNum')))
+            Info.append("{}FvNameGuid: {}".format(
+                space*" ", TargetDict.get(Key).get('FvNameGuid')))
+            Info.append("{}Attributes: {}".format(
+                space*" ", TargetDict.get(Key).get('Attributes')))
+            Info.append("{}Total Volume Size: {}".format(
+                space*" ", TargetDict.get(Key).get('Size')))
+            Info.append("{}Free Volume Size: {}".format(
+                space*" ", TargetDict.get(Key).get('FreeSize')))
+            Info.append("{}Volume Offset: {}".format(
+                space*" ", TargetDict.get(Key).get('Offset')))
+            Info.append("{}FilesNum: {}".format(
+                space*" ", TargetDict.get(Key).get('FilesNum')))
         elif TargetDict[Key]["Type"] in FfsType:
             space += 2
             if TargetDict.get(Key).get('UiName') != "b''":
-                Info.append("{}File: {} / {}".format(space*" ", Key, TargetDict.get(Key).get('UiName')))
+                Info.append("{}File: {} / {}".format(space*" ",
+                            Key, TargetDict.get(Key).get('UiName')))
             else:
                 Info.append("{}File: {}".format(space*" ", Key))
         if "Files" in list(TargetDict[Key].keys()):
             for item in TargetDict[Key]["Files"]:
                 self.parserTree(item, Info, space, ParFvId)
 
-    def ExportTree(self,TreeInfo: dict=None) -> dict:
+    def ExportTree(self, TreeInfo: dict = None) -> dict:
         if TreeInfo is None:
-            TreeInfo =collections.OrderedDict()
+            TreeInfo = collections.OrderedDict()
 
         if self.type == ROOT_TREE or self.type == ROOT_FV_TREE or self.type == ROOT_FFS_TREE or self.type == ROOT_SECTION_TREE:
             key = str(self.key)
@@ -152,7 +162,7 @@ class BIOSTREE:
             TreeInfo[self.key]["Name"] = key
             TreeInfo[self.key]["Type"] = self.type
             TreeInfo[self.key]["FilesNum"] = len(self.Child)
-        elif self.type == FV_TREE or  self.type == SEC_FV_TREE:
+        elif self.type == FV_TREE or self.type == SEC_FV_TREE:
             key = str(self.Data.FvId)
             TreeInfo[key] = collections.OrderedDict()
             TreeInfo[key]["Name"] = key
@@ -179,7 +189,8 @@ class BIOSTREE:
             TreeInfo[key] = collections.OrderedDict()
             TreeInfo[key]["Name"] = key
             TreeInfo[key]["Type"] = self.type
-            TreeInfo[key]["Size"] = hex(len(self.Data.OriData) + self.Data.HeaderLength)
+            TreeInfo[key]["Size"] = hex(
+                len(self.Data.OriData) + self.Data.HeaderLength)
             TreeInfo[key]["DecompressedSize"] = hex(self.Data.Size)
             TreeInfo[key]["Offset"] = hex(self.Data.HOffset)
             TreeInfo[key]["FilesNum"] = len(self.Child)
@@ -193,6 +204,6 @@ class BIOSTREE:
             TreeInfo[key]["FilesNum"] = len(self.Child)
 
         for item in self.Child:
-            TreeInfo[key].setdefault('Files',[]).append( item.ExportTree())
+            TreeInfo[key].setdefault('Files', []).append(item.ExportTree())
 
-        return TreeInfo
\ No newline at end of file
+        return TreeInfo
diff --git a/BaseTools/Source/Python/FMMT/core/BiosTreeNode.py b/BaseTools/Source/Python/FMMT/core/BiosTreeNode.py
index 20447766c821..360d9477d79a 100644
--- a/BaseTools/Source/Python/FMMT/core/BiosTreeNode.py
+++ b/BaseTools/Source/Python/FMMT/core/BiosTreeNode.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define the BIOS Tree Node.
 #
 # Copyright (c) 2021-, Intel Corporation. All rights reserved.<BR>
@@ -12,24 +12,25 @@ from utils.FmmtLogger import FmmtLogger as logger
 import uuid
 
 SectionHeaderType = {
-    0x01:'EFI_COMPRESSION_SECTION',
-    0x02:'EFI_GUID_DEFINED_SECTION',
-    0x03:'EFI_SECTION_DISPOSABLE',
-    0x10:'EFI_SECTION_PE32',
-    0x11:'EFI_SECTION_PIC',
-    0x12:'EFI_SECTION_TE',
-    0x13:'EFI_SECTION_DXE_DEPEX',
-    0x14:'EFI_SECTION_VERSION',
-    0x15:'EFI_SECTION_USER_INTERFACE',
-    0x16:'EFI_SECTION_COMPATIBILITY16',
-    0x17:'EFI_SECTION_FIRMWARE_VOLUME_IMAGE',
-    0x18:'EFI_FREEFORM_SUBTYPE_GUID_SECTION',
-    0x19:'EFI_SECTION_RAW',
-    0x1B:'EFI_SECTION_PEI_DEPEX',
-    0x1C:'EFI_SECTION_MM_DEPEX'
+    0x01: 'EFI_COMPRESSION_SECTION',
+    0x02: 'EFI_GUID_DEFINED_SECTION',
+    0x03: 'EFI_SECTION_DISPOSABLE',
+    0x10: 'EFI_SECTION_PE32',
+    0x11: 'EFI_SECTION_PIC',
+    0x12: 'EFI_SECTION_TE',
+    0x13: 'EFI_SECTION_DXE_DEPEX',
+    0x14: 'EFI_SECTION_VERSION',
+    0x15: 'EFI_SECTION_USER_INTERFACE',
+    0x16: 'EFI_SECTION_COMPATIBILITY16',
+    0x17: 'EFI_SECTION_FIRMWARE_VOLUME_IMAGE',
+    0x18: 'EFI_FREEFORM_SUBTYPE_GUID_SECTION',
+    0x19: 'EFI_SECTION_RAW',
+    0x1B: 'EFI_SECTION_PEI_DEPEX',
+    0x1C: 'EFI_SECTION_MM_DEPEX'
 }
 HeaderType = [0x01, 0x02, 0x14, 0x15, 0x18]
 
+
 class BinaryNode:
     def __init__(self, name: str) -> None:
         self.Size = 0
@@ -37,6 +38,7 @@ class BinaryNode:
         self.HOffset = 0
         self.Data = b''
 
+
 class FvNode:
     def __init__(self, name, buffer: bytes) -> None:
         self.Header = EFI_FIRMWARE_VOLUME_HEADER.from_buffer_copy(buffer)
@@ -45,21 +47,27 @@ class FvNode:
         self.FvId = "FV" + str(name)
         self.Name = "FV" + str(name)
         if self.Header.ExtHeaderOffset:
-            self.ExtHeader = EFI_FIRMWARE_VOLUME_EXT_HEADER.from_buffer_copy(buffer[self.Header.ExtHeaderOffset:])
-            self.Name =  uuid.UUID(bytes_le=struct2stream(self.ExtHeader.FvName))
+            self.ExtHeader = EFI_FIRMWARE_VOLUME_EXT_HEADER.from_buffer_copy(
+                buffer[self.Header.ExtHeaderOffset:])
+            self.Name = uuid.UUID(
+                bytes_le=struct2stream(self.ExtHeader.FvName))
             self.ExtEntryOffset = self.Header.ExtHeaderOffset + 20
             if self.ExtHeader.ExtHeaderSize != 20:
                 self.ExtEntryExist = 1
-                self.ExtEntry = EFI_FIRMWARE_VOLUME_EXT_ENTRY.from_buffer_copy(buffer[self.ExtEntryOffset:])
+                self.ExtEntry = EFI_FIRMWARE_VOLUME_EXT_ENTRY.from_buffer_copy(
+                    buffer[self.ExtEntryOffset:])
                 self.ExtTypeExist = 1
                 if self.ExtEntry.ExtEntryType == 0x01:
                     nums = (self.ExtEntry.ExtEntrySize - 8) // 16
-                    self.ExtEntry = Refine_FV_EXT_ENTRY_OEM_TYPE_Header(nums).from_buffer_copy(buffer[self.ExtEntryOffset:])
+                    self.ExtEntry = Refine_FV_EXT_ENTRY_OEM_TYPE_Header(
+                        nums).from_buffer_copy(buffer[self.ExtEntryOffset:])
                 elif self.ExtEntry.ExtEntryType == 0x02:
                     nums = self.ExtEntry.ExtEntrySize - 20
-                    self.ExtEntry = Refine_FV_EXT_ENTRY_GUID_TYPE_Header(nums).from_buffer_copy(buffer[self.ExtEntryOffset:])
+                    self.ExtEntry = Refine_FV_EXT_ENTRY_GUID_TYPE_Header(
+                        nums).from_buffer_copy(buffer[self.ExtEntryOffset:])
                 elif self.ExtEntry.ExtEntryType == 0x03:
-                    self.ExtEntry = EFI_FIRMWARE_VOLUME_EXT_ENTRY_USED_SIZE_TYPE.from_buffer_copy(buffer[self.ExtEntryOffset:])
+                    self.ExtEntry = EFI_FIRMWARE_VOLUME_EXT_ENTRY_USED_SIZE_TYPE.from_buffer_copy(
+                        buffer[self.ExtEntryOffset:])
                 else:
                     self.ExtTypeExist = 0
             else:
@@ -71,7 +79,8 @@ class FvNode:
         self.ROffset = 0
         self.Data = b''
         if self.Header.Signature != 1213613663:
-            logger.error('Invalid Fv Header! Fv {} signature {} is not "_FVH".'.format(struct2stream(self.Header), self.Header.Signature))
+            logger.error('Invalid Fv Header! Fv {} signature {} is not "_FVH".'.format(
+                struct2stream(self.Header), self.Header.Signature))
             raise Exception("Process Failed: Fv Header Signature!")
         self.PadData = b''
         self.Free_Space = 0
@@ -85,7 +94,8 @@ class FvNode:
         for i in range(Size):
             Sum += int(Header[i*2: i*2 + 2].hex(), 16)
         if Sum & 0xffff:
-            self.Header.Checksum = 0x10000 - (Sum - self.Header.Checksum) % 0x10000
+            self.Header.Checksum = 0x10000 - \
+                (Sum - self.Header.Checksum) % 0x10000
 
     def ModFvExt(self) -> None:
         # If used space changes and self.ExtEntry.UsedSize exists, self.ExtEntry.UsedSize need to be changed.
@@ -103,18 +113,22 @@ class FvNode:
         if self.Header.ExtHeaderOffset:
             ExtHeaderData = struct2stream(self.ExtHeader)
             ExtHeaderDataOffset = self.Header.ExtHeaderOffset - self.HeaderLength
-            self.Data = self.Data[:ExtHeaderDataOffset] + ExtHeaderData + self.Data[ExtHeaderDataOffset+20:]
+            self.Data = self.Data[:ExtHeaderDataOffset] + \
+                ExtHeaderData + self.Data[ExtHeaderDataOffset+20:]
         if self.Header.ExtHeaderOffset and self.ExtEntryExist:
             ExtHeaderEntryData = struct2stream(self.ExtEntry)
             ExtHeaderEntryDataOffset = self.Header.ExtHeaderOffset + 20 - self.HeaderLength
-            self.Data = self.Data[:ExtHeaderEntryDataOffset] + ExtHeaderEntryData + self.Data[ExtHeaderEntryDataOffset+len(ExtHeaderEntryData):]
+            self.Data = self.Data[:ExtHeaderEntryDataOffset] + ExtHeaderEntryData + \
+                self.Data[ExtHeaderEntryDataOffset+len(ExtHeaderEntryData):]
+
 
 class FfsNode:
     def __init__(self, buffer: bytes) -> None:
         self.Header = EFI_FFS_FILE_HEADER.from_buffer_copy(buffer)
         # self.Attributes = unpack("<B", buffer[21:22])[0]
         if self.Header.FFS_FILE_SIZE != 0 and self.Header.Attributes != 0xff and self.Header.Attributes & 0x01 == 1:
-            logger.error('Error Ffs Header! Ffs {} Header Size and Attributes is not matched!'.format(uuid.UUID(bytes_le=struct2stream(self.Header.Name))))
+            logger.error('Error Ffs Header! Ffs {} Header Size and Attributes is not matched!'.format(
+                uuid.UUID(bytes_le=struct2stream(self.Header.Name))))
             raise Exception("Process Failed: Error Ffs Header!")
         if self.Header.FFS_FILE_SIZE == 0 and self.Header.Attributes & 0x01 == 1:
             self.Header = EFI_FFS_FILE_HEADER2.from_buffer_copy(buffer)
@@ -141,6 +155,7 @@ class FfsNode:
             Header = self.Header.IntegrityCheck.Checksum.Header + 0x100 - HeaderSum % 0x100
             self.Header.IntegrityCheck.Checksum.Header = Header % 0x100
 
+
 class SectionNode:
     def __init__(self, buffer: bytes) -> None:
         if buffer[0:3] != b'\xff\xff\xff':
@@ -154,7 +169,8 @@ class SectionNode:
         else:
             self.Name = "SECTION"
         if self.Header.Type in HeaderType:
-            self.ExtHeader = self.GetExtHeader(self.Header.Type, buffer[self.Header.Common_Header_Size():], (self.Header.SECTION_SIZE-self.Header.Common_Header_Size()))
+            self.ExtHeader = self.GetExtHeader(self.Header.Type, buffer[self.Header.Common_Header_Size(
+            ):], (self.Header.SECTION_SIZE-self.Header.Common_Header_Size()))
             self.HeaderLength = self.Header.Common_Header_Size() + self.ExtHeader.ExtHeaderSize()
         else:
             self.ExtHeader = None
@@ -171,7 +187,7 @@ class SectionNode:
         self.IsPadSection = False
         self.SectionMaxAlignment = SECTION_COMMON_ALIGNMENT  # 4-align
 
-    def GetExtHeader(self, Type: int, buffer: bytes, nums: int=0) -> None:
+    def GetExtHeader(self, Type: int, buffer: bytes, nums: int = 0) -> None:
         if Type == 0x01:
             return EFI_COMPRESSION_SECTION.from_buffer_copy(buffer)
         elif Type == 0x02:
@@ -183,6 +199,7 @@ class SectionNode:
         elif Type == 0x18:
             return EFI_FREEFORM_SUBTYPE_GUID_SECTION.from_buffer_copy(buffer)
 
+
 class FreeSpaceNode:
     def __init__(self, buffer: bytes) -> None:
         self.Name = 'Free_Space'
@@ -191,4 +208,4 @@ class FreeSpaceNode:
         self.HOffset = 0
         self.DOffset = 0
         self.ROffset = 0
-        self.PadData = b''
\ No newline at end of file
+        self.PadData = b''
diff --git a/BaseTools/Source/Python/FMMT/core/FMMTOperation.py b/BaseTools/Source/Python/FMMT/core/FMMTOperation.py
index c2cc2e246740..19fb9177e6bf 100644
--- a/BaseTools/Source/Python/FMMT/core/FMMTOperation.py
+++ b/BaseTools/Source/Python/FMMT/core/FMMTOperation.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define the functions to operate bios binary file.
 #
 # Copyright (c) 2021-, Intel Corporation. All rights reserved.<BR>
@@ -13,7 +13,9 @@ global Fv_count
 Fv_count = 0
 
 # The ROOT_TYPE can be 'ROOT_TREE', 'ROOT_FV_TREE', 'ROOT_FFS_TREE', 'ROOT_SECTION_TREE'
-def ViewFile(inputfile: str, ROOT_TYPE: str, layoutfile: str=None, outputfile: str=None) -> None:
+
+
+def ViewFile(inputfile: str, ROOT_TYPE: str, layoutfile: str = None, outputfile: str = None) -> None:
     if not os.path.exists(inputfile):
         logger.error("Invalid inputfile, can not open {}.".format(inputfile))
         raise Exception("Process Failed: Invalid inputfile!")
@@ -36,9 +38,11 @@ def ViewFile(inputfile: str, ROOT_TYPE: str, layoutfile: str=None, outputfile: s
             layoutfilename = layoutfile
             layoutfileformat = os.path.splitext(layoutfile)[1][1:].lower()
         else:
-            layoutfilename = "Layout_{}{}".format(os.path.basename(inputfile),".{}".format(layoutfile.lower()))
+            layoutfilename = "Layout_{}{}".format(os.path.basename(
+                inputfile), ".{}".format(layoutfile.lower()))
             layoutfileformat = layoutfile.lower()
-        GetFormatter(layoutfileformat).dump(InfoDict, FmmtParser.BinaryInfo, layoutfilename)
+        GetFormatter(layoutfileformat).dump(
+            InfoDict, FmmtParser.BinaryInfo, layoutfilename)
     # 4. Data Encapsulation
     if outputfile:
         logger.debug('Start encapsulating data......')
@@ -47,7 +51,8 @@ def ViewFile(inputfile: str, ROOT_TYPE: str, layoutfile: str=None, outputfile: s
             f.write(FmmtParser.FinalData)
         logger.debug('Encapsulated data is saved in {}.'.format(outputfile))
 
-def DeleteFfs(inputfile: str, TargetFfs_name: str, outputfile: str, Fv_name: str=None) -> None:
+
+def DeleteFfs(inputfile: str, TargetFfs_name: str, outputfile: str, Fv_name: str = None) -> None:
     if not os.path.exists(inputfile):
         logger.error("Invalid inputfile, can not open {}.".format(inputfile))
         raise Exception("Process Failed: Invalid inputfile!")
@@ -60,7 +65,8 @@ def DeleteFfs(inputfile: str, TargetFfs_name: str, outputfile: str, Fv_name: str
     FmmtParser.ParserFromRoot(FmmtParser.WholeFvTree, whole_data)
     logger.debug('Done!')
     # 3. Data Modify
-    FmmtParser.WholeFvTree.FindNode(TargetFfs_name, FmmtParser.WholeFvTree.Findlist)
+    FmmtParser.WholeFvTree.FindNode(
+        TargetFfs_name, FmmtParser.WholeFvTree.Findlist)
     # Choose the Specfic DeleteFfs with Fv info
     if Fv_name:
         for item in FmmtParser.WholeFvTree.Findlist:
@@ -81,6 +87,7 @@ def DeleteFfs(inputfile: str, TargetFfs_name: str, outputfile: str, Fv_name: str
             f.write(FmmtParser.FinalData)
         logger.debug('Encapsulated data is saved in {}.'.format(outputfile))
 
+
 def AddNewFfs(inputfile: str, Fv_name: str, newffsfile: str, outputfile: str) -> None:
     if not os.path.exists(inputfile):
         logger.error("Invalid inputfile, can not open {}.".format(inputfile))
@@ -109,11 +116,14 @@ def AddNewFfs(inputfile: str, Fv_name: str, newffsfile: str, outputfile: str) ->
             TargetFfsPad = TargetFv.Child[-1]
             logger.debug('Parsing newffsfile data......')
             if TargetFfsPad.type == FFS_FREE_SPACE:
-                NewFmmtParser.ParserFromRoot(NewFmmtParser.WholeFvTree, new_ffs_data, TargetFfsPad.Data.HOffset)
+                NewFmmtParser.ParserFromRoot(
+                    NewFmmtParser.WholeFvTree, new_ffs_data, TargetFfsPad.Data.HOffset)
             else:
-                NewFmmtParser.ParserFromRoot(NewFmmtParser.WholeFvTree, new_ffs_data, TargetFfsPad.Data.HOffset+TargetFfsPad.Data.Size)
+                NewFmmtParser.ParserFromRoot(
+                    NewFmmtParser.WholeFvTree, new_ffs_data, TargetFfsPad.Data.HOffset+TargetFfsPad.Data.Size)
             logger.debug('Done!')
-            FfsMod = FvHandler(NewFmmtParser.WholeFvTree.Child[0], TargetFfsPad)
+            FfsMod = FvHandler(
+                NewFmmtParser.WholeFvTree.Child[0], TargetFfsPad)
             Status = FfsMod.AddFfs()
     else:
         logger.error('Target Fv not found!!!')
@@ -125,7 +135,8 @@ def AddNewFfs(inputfile: str, Fv_name: str, newffsfile: str, outputfile: str) ->
             f.write(FmmtParser.FinalData)
         logger.debug('Encapsulated data is saved in {}.'.format(outputfile))
 
-def ReplaceFfs(inputfile: str, Ffs_name: str, newffsfile: str, outputfile: str, Fv_name: str=None) -> None:
+
+def ReplaceFfs(inputfile: str, Ffs_name: str, newffsfile: str, outputfile: str, Fv_name: str = None) -> None:
     if not os.path.exists(inputfile):
         logger.error("Invalid inputfile, can not open {}.".format(inputfile))
         raise Exception("Process Failed: Invalid inputfile!")
@@ -146,7 +157,8 @@ def ReplaceFfs(inputfile: str, Ffs_name: str, newffsfile: str, outputfile: str,
     Status = False
     # 3. Data Modify
     new_ffs = newFmmtParser.WholeFvTree.Child[0]
-    new_ffs.Data.PadData = GetPadSize(new_ffs.Data.Size, FFS_COMMON_ALIGNMENT) * b'\xff'
+    new_ffs.Data.PadData = GetPadSize(
+        new_ffs.Data.Size, FFS_COMMON_ALIGNMENT) * b'\xff'
     FmmtParser.WholeFvTree.FindNode(Ffs_name, FmmtParser.WholeFvTree.Findlist)
     if Fv_name:
         for item in FmmtParser.WholeFvTree.Findlist:
@@ -166,7 +178,8 @@ def ReplaceFfs(inputfile: str, Ffs_name: str, newffsfile: str, outputfile: str,
             f.write(FmmtParser.FinalData)
         logger.debug('Encapsulated data is saved in {}.'.format(outputfile))
 
-def ExtractFfs(inputfile: str, Ffs_name: str, outputfile: str, Fv_name: str=None) -> None:
+
+def ExtractFfs(inputfile: str, Ffs_name: str, outputfile: str, Fv_name: str = None) -> None:
     if not os.path.exists(inputfile):
         logger.error("Invalid inputfile, can not open {}.".format(inputfile))
         raise Exception("Process Failed: Invalid inputfile!")
@@ -189,7 +202,8 @@ def ExtractFfs(inputfile: str, Ffs_name: str, outputfile: str, Fv_name: str=None
         if TargetFv.Data.Header.Attributes & EFI_FVB2_ERASE_POLARITY:
             TargetNode.Data.Header.State = c_uint8(
                 ~TargetNode.Data.Header.State)
-        FinalData = struct2stream(TargetNode.Data.Header) + TargetNode.Data.Data
+        FinalData = struct2stream(
+            TargetNode.Data.Header) + TargetNode.Data.Data
         with open(outputfile, "wb") as f:
             f.write(FinalData)
         logger.debug('Extract ffs data is saved in {}.'.format(outputfile))
diff --git a/BaseTools/Source/Python/FMMT/core/FMMTParser.py b/BaseTools/Source/Python/FMMT/core/FMMTParser.py
index e76ac5118509..cd275801ba0c 100644
--- a/BaseTools/Source/Python/FMMT/core/FMMTParser.py
+++ b/BaseTools/Source/Python/FMMT/core/FMMTParser.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define the interface of Bios Parser.
 #
 # Copyright (c) 2021-, Intel Corporation. All rights reserved.<BR>
@@ -11,6 +11,7 @@ from core.BiosTree import *
 from core.GuidTools import *
 from utils.FmmtLogger import FmmtLogger as logger
 
+
 class FMMTParser:
     def __init__(self, name: str, TYPE: str) -> None:
         self.WholeFvTree = BIOSTREE(name)
@@ -18,8 +19,8 @@ class FMMTParser:
         self.FinalData = b''
         self.BinaryInfo = []
 
-    ## Parser the nodes in WholeTree.
-    def ParserFromRoot(self, WholeFvTree=None, whole_data: bytes=b'', Reloffset: int=0) -> None:
+    # Parse the nodes in WholeTree.
+    def ParserFromRoot(self, WholeFvTree=None, whole_data: bytes = b'', Reloffset: int = 0) -> None:
         if WholeFvTree.type == ROOT_TREE or WholeFvTree.type == ROOT_FV_TREE:
             ParserEntry().DataParser(self.WholeFvTree, whole_data, Reloffset)
         else:
@@ -27,7 +28,7 @@ class FMMTParser:
         for Child in WholeFvTree.Child:
             self.ParserFromRoot(Child, "")
 
-    ## Encapuslation all the data in tree into self.FinalData
+    # Encapsulate all the data in the tree into self.FinalData
     def Encapsulation(self, rootTree, CompressStatus: bool) -> None:
         # If current node is Root node, skip it.
         if rootTree.type == ROOT_TREE or rootTree.type == ROOT_FV_TREE or rootTree.type == ROOT_FFS_TREE or rootTree.type == ROOT_SECTION_TREE:
@@ -38,7 +39,8 @@ class FMMTParser:
             rootTree.Child = []
         # If current node do not have Child and ExtHeader, just add its Header and Data.
         elif rootTree.type == DATA_FV_TREE or rootTree.type == FFS_PAD:
-            self.FinalData += struct2stream(rootTree.Data.Header) + rootTree.Data.Data + rootTree.Data.PadData
+            self.FinalData += struct2stream(rootTree.Data.Header) + \
+                rootTree.Data.Data + rootTree.Data.PadData
             if rootTree.isFinalChild():
                 ParTree = rootTree.Parent
                 if ParTree.type != 'ROOT':
@@ -49,7 +51,8 @@ class FMMTParser:
             if rootTree.HasChild():
                 self.FinalData += struct2stream(rootTree.Data.Header)
             else:
-                self.FinalData += struct2stream(rootTree.Data.Header) + rootTree.Data.Data + rootTree.Data.PadData
+                self.FinalData += struct2stream(rootTree.Data.Header) + \
+                    rootTree.Data.Data + rootTree.Data.PadData
                 if rootTree.isFinalChild():
                     ParTree = rootTree.Parent
                     if ParTree.type != 'ROOT':
@@ -60,15 +63,18 @@ class FMMTParser:
             if rootTree.Data.OriData == b'' or (rootTree.Data.OriData != b'' and CompressStatus):
                 if rootTree.HasChild():
                     if rootTree.Data.ExtHeader:
-                        self.FinalData += struct2stream(rootTree.Data.Header) + struct2stream(rootTree.Data.ExtHeader)
+                        self.FinalData += struct2stream(
+                            rootTree.Data.Header) + struct2stream(rootTree.Data.ExtHeader)
                     else:
                         self.FinalData += struct2stream(rootTree.Data.Header)
                 else:
                     Data = rootTree.Data.Data
                     if rootTree.Data.ExtHeader:
-                        self.FinalData += struct2stream(rootTree.Data.Header) + struct2stream(rootTree.Data.ExtHeader) + Data + rootTree.Data.PadData
+                        self.FinalData += struct2stream(rootTree.Data.Header) + struct2stream(
+                            rootTree.Data.ExtHeader) + Data + rootTree.Data.PadData
                     else:
-                        self.FinalData += struct2stream(rootTree.Data.Header) + Data + rootTree.Data.PadData
+                        self.FinalData += struct2stream(
+                            rootTree.Data.Header) + Data + rootTree.Data.PadData
                     if rootTree.isFinalChild():
                         ParTree = rootTree.Parent
                         self.FinalData += ParTree.Data.PadData
@@ -77,9 +83,11 @@ class FMMTParser:
                 Data = rootTree.Data.OriData
                 rootTree.Child = []
                 if rootTree.Data.ExtHeader:
-                    self.FinalData += struct2stream(rootTree.Data.Header) + struct2stream(rootTree.Data.ExtHeader) + Data + rootTree.Data.PadData
+                    self.FinalData += struct2stream(rootTree.Data.Header) + struct2stream(
+                        rootTree.Data.ExtHeader) + Data + rootTree.Data.PadData
                 else:
-                    self.FinalData += struct2stream(rootTree.Data.Header) + Data + rootTree.Data.PadData
+                    self.FinalData += struct2stream(
+                        rootTree.Data.Header) + Data + rootTree.Data.PadData
                 if rootTree.isFinalChild():
                     ParTree = rootTree.Parent
                     self.FinalData += ParTree.Data.PadData
diff --git a/BaseTools/Source/Python/FMMT/core/FvHandler.py b/BaseTools/Source/Python/FMMT/core/FvHandler.py
index c81541ec18b1..1a59767afd2f 100644
--- a/BaseTools/Source/Python/FMMT/core/FvHandler.py
+++ b/BaseTools/Source/Python/FMMT/core/FvHandler.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to the implementation of Bios layout handler.
 #
 # Copyright (c) 2021-, Intel Corporation. All rights reserved.<BR>
@@ -13,7 +13,8 @@ from utils.FmmtLogger import FmmtLogger as logger
 
 EFI_FVB2_ERASE_POLARITY = 0x00000800
 
-def ChangeSize(TargetTree, size_delta: int=0) -> None:
+
+def ChangeSize(TargetTree, size_delta: int = 0) -> None:
     # If Size increase delta, then should be: size_delta = -delta
     if type(TargetTree.Data.Header) == type(EFI_FFS_FILE_HEADER2()) or type(TargetTree.Data.Header) == type(EFI_COMMON_SECTION_HEADER2()):
         TargetTree.Data.Size -= size_delta
@@ -22,14 +23,16 @@ def ChangeSize(TargetTree, size_delta: int=0) -> None:
         OriSize = TargetTree.Data.Header.SECTION_SIZE
         OriSize -= size_delta
         TargetTree.Data.Header.Size[0] = OriSize % (16**2)
-        TargetTree.Data.Header.Size[1] = OriSize % (16**4) //(16**2)
+        TargetTree.Data.Header.Size[1] = OriSize % (16**4) // (16**2)
         TargetTree.Data.Header.Size[2] = OriSize // (16**4)
     else:
         TargetTree.Data.Size -= size_delta
         TargetTree.Data.Header.Size[0] = TargetTree.Data.Size % (16**2)
-        TargetTree.Data.Header.Size[1] = TargetTree.Data.Size % (16**4) //(16**2)
+        TargetTree.Data.Header.Size[1] = TargetTree.Data.Size % (
+            16**4) // (16**2)
         TargetTree.Data.Header.Size[2] = TargetTree.Data.Size // (16**4)
 
+
 def ModifyFfsType(TargetFfs) -> None:
     if type(TargetFfs.Data.Header) == type(EFI_FFS_FILE_HEADER()) and TargetFfs.Data.Size > 0xFFFFFF:
         ExtendSize = TargetFfs.Data.Header.FFS_FILE_SIZE + 8
@@ -37,10 +40,12 @@ def ModifyFfsType(TargetFfs) -> None:
         New_Header.Name = TargetFfs.Data.Header.Name
         New_Header.IntegrityCheck = TargetFfs.Data.Header.IntegrityCheck
         New_Header.Type = TargetFfs.Data.Header.Type
-        New_Header.Attributes = TargetFfs.Data.Header.Attributes | 0x01  # set the Attribute with FFS_ATTRIB_LARGE_FILE (0x01)
+        # set the Attribute with FFS_ATTRIB_LARGE_FILE (0x01)
+        New_Header.Attributes = TargetFfs.Data.Header.Attributes | 0x01
         NewSize = 0
-        New_Header.Size[0] = NewSize % (16**2)    # minus the delta size of Header
-        New_Header.Size[1] = NewSize % (16**4) //(16**2)
+        # minus the delta size of Header
+        New_Header.Size[0] = NewSize % (16**2)
+        New_Header.Size[1] = NewSize % (16**4) // (16**2)
         New_Header.Size[2] = NewSize // (16**4)
         New_Header.State = TargetFfs.Data.Header.State
         New_Header.ExtendedSize = ExtendSize
@@ -53,9 +58,11 @@ def ModifyFfsType(TargetFfs) -> None:
         New_Header.Name = TargetFfs.Data.Header.Name
         New_Header.IntegrityCheck = TargetFfs.Data.Header.IntegrityCheck
         New_Header.Type = TargetFfs.Data.Header.Type
-        New_Header.Attributes = TargetFfs.Data.Header.Attributes - 1  # remove the FFS_ATTRIB_LARGE_FILE (0x01) from Attribute
-        New_Header.Size[0] = (TargetFfs.Data.Size - 8) % (16**2)    # minus the delta size of Header
-        New_Header.Size[1] = (TargetFfs.Data.Size - 8) % (16**4) //(16**2)
+        # remove the FFS_ATTRIB_LARGE_FILE (0x01) from Attribute
+        New_Header.Attributes = TargetFfs.Data.Header.Attributes - 1
+        # minus the delta size of Header
+        New_Header.Size[0] = (TargetFfs.Data.Size - 8) % (16**2)
+        New_Header.Size[1] = (TargetFfs.Data.Size - 8) % (16**4) // (16**2)
         New_Header.Size[2] = (TargetFfs.Data.Size - 8) // (16**4)
         New_Header.State = TargetFfs.Data.Header.State
         TargetFfs.Data.Header = New_Header
@@ -68,19 +75,24 @@ def ModifyFfsType(TargetFfs) -> None:
                 if type(item.Data.Header) == type(EFI_FFS_FILE_HEADER2()):
                     NeedChange = False
             if NeedChange:
-                TargetFfs.Parent.Data.Header.FileSystemGuid = ModifyGuidFormat("8c8ce578-8a3d-4f1c-9935-896185c32dd3")
+                TargetFfs.Parent.Data.Header.FileSystemGuid = ModifyGuidFormat(
+                    "8c8ce578-8a3d-4f1c-9935-896185c32dd3")
 
     if type(TargetFfs.Data.Header) == type(EFI_FFS_FILE_HEADER2()):
         TarParent = TargetFfs.Parent
         while TarParent:
             if TarParent.type == FV_TREE and struct2stream(TarParent.Data.Header.FileSystemGuid) == EFI_FIRMWARE_FILE_SYSTEM2_GUID_BYTE:
-                TarParent.Data.Header.FileSystemGuid = ModifyGuidFormat("5473C07A-3DCB-4dca-BD6F-1E9689E7349A")
+                TarParent.Data.Header.FileSystemGuid = ModifyGuidFormat(
+                    "5473C07A-3DCB-4dca-BD6F-1E9689E7349A")
             TarParent = TarParent.Parent
 
+
 def PadSectionModify(PadSection, Offset) -> None:
     # Offset > 0, Size decrease; Offset < 0, Size increase;
     ChangeSize(PadSection, Offset)
-    PadSection.Data.Data = (PadSection.Data.Size - PadSection.Data.HeaderLength) * b'\xff'
+    PadSection.Data.Data = (PadSection.Data.Size -
+                            PadSection.Data.HeaderLength) * b'\xff'
+
 
 def ModifySectionType(TargetSection) -> None:
     # If Section Size is increased larger than 0xFFFFFF, need modify Section Header from EFI_COMMON_SECTION_HEADER to EFI_COMMON_SECTION_HEADER2.
@@ -88,8 +100,9 @@ def ModifySectionType(TargetSection) -> None:
         New_Header = EFI_COMMON_SECTION_HEADER2()
         New_Header.Type = TargetSection.Data.Header.Type
         NewSize = 0xFFFFFF
-        New_Header.Size[0] = NewSize % (16**2)    # minus the delta size of Header
-        New_Header.Size[1] = NewSize % (16**4) //(16**2)
+        # minus the delta size of Header
+        New_Header.Size[0] = NewSize % (16**2)
+        New_Header.Size[1] = NewSize % (16**4) // (16**2)
         New_Header.Size[2] = NewSize // (16**4)
         New_Header.ExtendedSize = TargetSection.Data.Size + 4
         TargetSection.Data.Header = New_Header
@@ -106,8 +119,9 @@ def ModifySectionType(TargetSection) -> None:
     elif type(TargetSection.Data.Header) == type(EFI_COMMON_SECTION_HEADER2()) and TargetSection.Data.Size < 0xFFFFFF:
         New_Header = EFI_COMMON_SECTION_HEADER()
         New_Header.Type = TargetSection.Data.Header.Type
-        New_Header.Size[0] = (TargetSection.Data.Size - 4) % (16**2)    # minus the delta size of Header
-        New_Header.Size[1] = (TargetSection.Data.Size - 4) % (16**4) //(16**2)
+        # minus the delta size of Header
+        New_Header.Size[0] = (TargetSection.Data.Size - 4) % (16**2)
+        New_Header.Size[1] = (TargetSection.Data.Size - 4) % (16**4) // (16**2)
         New_Header.Size[2] = (TargetSection.Data.Size - 4) // (16**4)
         TargetSection.Data.Header = New_Header
         TargetSection.Data.Size = TargetSection.Data.Header.SECTION_SIZE
@@ -120,6 +134,7 @@ def ModifySectionType(TargetSection) -> None:
             NewPadSection = SectionNode(b'\x00\x00\x00\x19')
             SecParent.insertChild(NewPadSection, Target_index)
 
+
 def ModifyFvExtData(TreeNode) -> None:
     FvExtData = b''
     if TreeNode.Data.Header.ExtHeaderOffset:
@@ -130,19 +145,24 @@ def ModifyFvExtData(TreeNode) -> None:
         FvExtData += FvExtEntry
     if FvExtData:
         InfoNode = TreeNode.Child[0]
-        InfoNode.Data.Data = FvExtData + InfoNode.Data.Data[TreeNode.Data.ExtHeader.ExtHeaderSize:]
+        InfoNode.Data.Data = FvExtData + \
+            InfoNode.Data.Data[TreeNode.Data.ExtHeader.ExtHeaderSize:]
         InfoNode.Data.ModCheckSum()
 
+
 def ModifyFvSystemGuid(TargetFv) -> None:
     if struct2stream(TargetFv.Data.Header.FileSystemGuid) == EFI_FIRMWARE_FILE_SYSTEM2_GUID_BYTE:
-        TargetFv.Data.Header.FileSystemGuid = ModifyGuidFormat("5473C07A-3DCB-4dca-BD6F-1E9689E7349A")
+        TargetFv.Data.Header.FileSystemGuid = ModifyGuidFormat(
+            "5473C07A-3DCB-4dca-BD6F-1E9689E7349A")
     TargetFv.Data.ModCheckSum()
     TargetFv.Data.Data = b''
     for item in TargetFv.Child:
         if item.type == FFS_FREE_SPACE:
             TargetFv.Data.Data += item.Data.Data + item.Data.PadData
         else:
-            TargetFv.Data.Data += struct2stream(item.Data.Header)+ item.Data.Data + item.Data.PadData
+            TargetFv.Data.Data += struct2stream(item.Data.Header) + \
+                item.Data.Data + item.Data.PadData
+
 
 class FvHandler:
     def __init__(self, NewFfs, TargetFfs) -> None:
@@ -151,7 +171,7 @@ class FvHandler:
         self.Status = False
         self.Remain_New_Free_Space = 0
 
-    ## Use for Compress the Section Data
+    # Used to compress the Section Data
     def CompressData(self, TargetTree) -> None:
         TreePath = TargetTree.GetTreePath()
         pos = len(TreePath)
@@ -159,7 +179,8 @@ class FvHandler:
         while pos:
             if not self.Status:
                 if TreePath[pos-1].type == SECTION_TREE and TreePath[pos-1].Data.Type == 0x02:
-                    self.CompressSectionData(TreePath[pos-1], None, TreePath[pos-1].Data.ExtHeader.SectionDefinitionGuid)
+                    self.CompressSectionData(
+                        TreePath[pos-1], None, TreePath[pos-1].Data.ExtHeader.SectionDefinitionGuid)
                 else:
                     if pos == len(TreePath):
                         self.CompressSectionData(TreePath[pos-1], pos)
@@ -174,15 +195,19 @@ class FvHandler:
             # Update current node data as adding all the header and data of its child node.
             for item in temp_save_child:
                 if item.type == SECTION_TREE and not item.Data.OriData and item.Data.ExtHeader:
-                    NewData += struct2stream(item.Data.Header) + struct2stream(item.Data.ExtHeader) + item.Data.Data + item.Data.PadData
+                    NewData += struct2stream(item.Data.Header) + struct2stream(
+                        item.Data.ExtHeader) + item.Data.Data + item.Data.PadData
                 elif item.type == SECTION_TREE and item.Data.OriData and not item.Data.ExtHeader:
-                    NewData += struct2stream(item.Data.Header) + item.Data.OriData + item.Data.PadData
+                    NewData += struct2stream(item.Data.Header) + \
+                        item.Data.OriData + item.Data.PadData
                 elif item.type == SECTION_TREE and item.Data.OriData and item.Data.ExtHeader:
-                    NewData += struct2stream(item.Data.Header) + struct2stream(item.Data.ExtHeader) + item.Data.OriData + item.Data.PadData
+                    NewData += struct2stream(item.Data.Header) + struct2stream(
+                        item.Data.ExtHeader) + item.Data.OriData + item.Data.PadData
                 elif item.type == FFS_FREE_SPACE:
                     NewData += item.Data.Data + item.Data.PadData
                 else:
-                    NewData += struct2stream(item.Data.Header) + item.Data.Data + item.Data.PadData
+                    NewData += struct2stream(item.Data.Header) + \
+                        item.Data.Data + item.Data.PadData
             # If node is FFS_TREE, update Pad data and Header info.
             # Remain_New_Free_Space is used for move more free space into lst level Fv.
             if TargetTree.type == FFS_TREE:
@@ -205,11 +230,13 @@ class FvHandler:
                         TargetTree.Data.Data += self.Remain_New_Free_Space * b'\xff'
                         New_Free_Space = BIOSTREE('FREE_SPACE')
                         New_Free_Space.type = FFS_FREE_SPACE
-                        New_Free_Space.Data = FreeSpaceNode(b'\xff' * self.Remain_New_Free_Space)
+                        New_Free_Space.Data = FreeSpaceNode(
+                            b'\xff' * self.Remain_New_Free_Space)
                         TargetTree.insertChild(New_Free_Space)
                     self.Remain_New_Free_Space = 0
                 if TargetTree.type == SEC_FV_TREE:
-                    Size_delta = len(NewData) + self.Remain_New_Free_Space - len(TargetTree.Data.Data)
+                    Size_delta = len(
+                        NewData) + self.Remain_New_Free_Space - len(TargetTree.Data.Data)
                     TargetTree.Data.Header.FvLength += Size_delta
                 TargetTree.Data.ModFvExt()
                 TargetTree.Data.ModFvSize()
@@ -223,32 +250,39 @@ class FvHandler:
                 Size_delta = len(NewData) - len(TargetTree.Data.Data)
                 ChangeSize(TargetTree, -Size_delta)
                 if TargetTree.NextRel:
-                    Delta_Pad_Size = len(TargetTree.Data.PadData) - New_Pad_Size
+                    Delta_Pad_Size = len(
+                        TargetTree.Data.PadData) - New_Pad_Size
                     self.Remain_New_Free_Space += Delta_Pad_Size
                     TargetTree.Data.PadData = b'\x00' * New_Pad_Size
             TargetTree.Data.Data = NewData
         if GuidTool:
             guidtool = GUIDTools().__getitem__(struct2stream(GuidTool))
             if not guidtool.ifexist:
-                logger.error("GuidTool {} is not found when decompressing {} file.\n".format(guidtool.command, TargetTree.Parent.Data.Name))
+                logger.error("GuidTool {} is not found when decompressing {} file.\n".format(
+                    guidtool.command, TargetTree.Parent.Data.Name))
                 raise Exception("Process Failed: GuidTool not found!")
             CompressedData = guidtool.pack(TargetTree.Data.Data)
             if len(CompressedData) < len(TargetTree.Data.OriData):
-                New_Pad_Size = GetPadSize(len(CompressedData), SECTION_COMMON_ALIGNMENT)
+                New_Pad_Size = GetPadSize(
+                    len(CompressedData), SECTION_COMMON_ALIGNMENT)
                 Size_delta = len(CompressedData) - len(TargetTree.Data.OriData)
                 ChangeSize(TargetTree, -Size_delta)
                 if TargetTree.NextRel:
                     TargetTree.Data.PadData = b'\x00' * New_Pad_Size
-                    self.Remain_New_Free_Space = len(TargetTree.Data.OriData) + len(TargetTree.Data.PadData) - len(CompressedData) - New_Pad_Size
+                    self.Remain_New_Free_Space = len(TargetTree.Data.OriData) + len(
+                        TargetTree.Data.PadData) - len(CompressedData) - New_Pad_Size
                 else:
                     TargetTree.Data.PadData = b''
-                    self.Remain_New_Free_Space = len(TargetTree.Data.OriData) - len(CompressedData)
+                    self.Remain_New_Free_Space = len(
+                        TargetTree.Data.OriData) - len(CompressedData)
                 TargetTree.Data.OriData = CompressedData
             elif len(CompressedData) == len(TargetTree.Data.OriData):
                 TargetTree.Data.OriData = CompressedData
             elif len(CompressedData) > len(TargetTree.Data.OriData):
-                New_Pad_Size = GetPadSize(len(CompressedData), SECTION_COMMON_ALIGNMENT)
-                self.Remain_New_Free_Space = len(CompressedData) + New_Pad_Size - len(TargetTree.Data.OriData) - len(TargetTree.Data.PadData)
+                New_Pad_Size = GetPadSize(
+                    len(CompressedData), SECTION_COMMON_ALIGNMENT)
+                self.Remain_New_Free_Space = len(CompressedData) + New_Pad_Size - len(
+                    TargetTree.Data.OriData) - len(TargetTree.Data.PadData)
                 self.ModifyTest(TargetTree, self.Remain_New_Free_Space)
                 self.Status = True
 
@@ -271,7 +305,7 @@ class FvHandler:
                         self.Status = False
                     else:
                         BlockSize = ParTree.Data.Header.BlockMap[0].Length
-                        New_Add_Len = BlockSize - Needed_Space%BlockSize
+                        New_Add_Len = BlockSize - Needed_Space % BlockSize
                         if New_Add_Len % BlockSize:
                             ParTree.Child[-1].Data.Data = b'\xff' * New_Add_Len
                             ParTree.Data.Free_Space = New_Add_Len
@@ -286,7 +320,8 @@ class FvHandler:
                     if item.type == FFS_FREE_SPACE:
                         ParTree.Data.Data += item.Data.Data + item.Data.PadData
                     else:
-                        ParTree.Data.Data += struct2stream(item.Data.Header)+ item.Data.Data + item.Data.PadData
+                        ParTree.Data.Data += struct2stream(
+                            item.Data.Header) + item.Data.Data + item.Data.PadData
                 ParTree.Data.ModFvExt()
                 ParTree.Data.ModFvSize()
                 ParTree.Data.ModExtHeaderData()
@@ -300,22 +335,28 @@ class FvHandler:
                 for item in ParTree.Child:
                     if item.Data.OriData:
                         if item.Data.ExtHeader:
-                            ParTree.Data.Data += struct2stream(item.Data.Header) + struct2stream(item.Data.ExtHeader) + item.Data.OriData + item.Data.PadData
+                            ParTree.Data.Data += struct2stream(item.Data.Header) + struct2stream(
+                                item.Data.ExtHeader) + item.Data.OriData + item.Data.PadData
                         else:
-                            ParTree.Data.Data += struct2stream(item.Data.Header)+ item.Data.OriData + item.Data.PadData
+                            ParTree.Data.Data += struct2stream(
+                                item.Data.Header) + item.Data.OriData + item.Data.PadData
                     else:
                         if item.Data.ExtHeader:
-                            ParTree.Data.Data += struct2stream(item.Data.Header) + struct2stream(item.Data.ExtHeader) + item.Data.Data + item.Data.PadData
+                            ParTree.Data.Data += struct2stream(item.Data.Header) + struct2stream(
+                                item.Data.ExtHeader) + item.Data.Data + item.Data.PadData
                         else:
-                            ParTree.Data.Data += struct2stream(item.Data.Header)+ item.Data.Data + item.Data.PadData
+                            ParTree.Data.Data += struct2stream(
+                                item.Data.Header) + item.Data.Data + item.Data.PadData
                 ChangeSize(ParTree, -Needed_Space)
                 ModifyFfsType(ParTree)
                 # Recalculate pad data, update needed space with Delta_Pad_Size.
                 Needed_Space += ParTree.Data.HeaderLength - OriHeaderLen
-                New_Pad_Size = GetPadSize(ParTree.Data.Size, FFS_COMMON_ALIGNMENT)
+                New_Pad_Size = GetPadSize(
+                    ParTree.Data.Size, FFS_COMMON_ALIGNMENT)
                 Delta_Pad_Size = New_Pad_Size - len(ParTree.Data.PadData)
                 Needed_Space += Delta_Pad_Size
-                ParTree.Data.PadData = b'\xff' * GetPadSize(ParTree.Data.Size, FFS_COMMON_ALIGNMENT)
+                ParTree.Data.PadData = b'\xff' * \
+                    GetPadSize(ParTree.Data.Size, FFS_COMMON_ALIGNMENT)
                 ParTree.Data.ModCheckSum()
             # If current node is a Section node
             elif ParTree.type == SECTION_TREE:
@@ -325,50 +366,61 @@ class FvHandler:
                 # Update its data as adding all the header and data of its child node.
                 for item in ParTree.Child:
                     if item.type == SECTION_TREE and item.Data.ExtHeader and item.Data.Type != 0x02:
-                        ParTree.Data.Data += struct2stream(item.Data.Header) + struct2stream(item.Data.ExtHeader) + item.Data.Data + item.Data.PadData
+                        ParTree.Data.Data += struct2stream(item.Data.Header) + struct2stream(
+                            item.Data.ExtHeader) + item.Data.Data + item.Data.PadData
                     elif item.type == SECTION_TREE and item.Data.ExtHeader and item.Data.Type == 0x02:
-                        ParTree.Data.Data += struct2stream(item.Data.Header) + struct2stream(item.Data.ExtHeader) + item.Data.OriData + item.Data.PadData
+                        ParTree.Data.Data += struct2stream(item.Data.Header) + struct2stream(
+                            item.Data.ExtHeader) + item.Data.OriData + item.Data.PadData
                     else:
-                        ParTree.Data.Data += struct2stream(item.Data.Header) + item.Data.Data + item.Data.PadData
+                        ParTree.Data.Data += struct2stream(
+                            item.Data.Header) + item.Data.Data + item.Data.PadData
                 # If the current section is guided section
                 if ParTree.Data.Type == 0x02:
-                    guidtool = GUIDTools().__getitem__(struct2stream(ParTree.Data.ExtHeader.SectionDefinitionGuid))
+                    guidtool = GUIDTools().__getitem__(struct2stream(
+                        ParTree.Data.ExtHeader.SectionDefinitionGuid))
                     if not guidtool.ifexist:
-                        logger.error("GuidTool {} is not found when decompressing {} file.\n".format(guidtool.command, ParTree.Parent.Data.Name))
+                        logger.error("GuidTool {} is not found when decompressing {} file.\n".format(
+                            guidtool.command, ParTree.Parent.Data.Name))
                         raise Exception("Process Failed: GuidTool not found!")
                     # Recompress current data, and recalculate the needed space
                     CompressedData = guidtool.pack(ParTree.Data.Data)
-                    Needed_Space = len(CompressedData) - len(ParTree.Data.OriData)
+                    Needed_Space = len(CompressedData) - \
+                        len(ParTree.Data.OriData)
                     ParTree.Data.OriData = CompressedData
                     New_Size = ParTree.Data.HeaderLength + len(CompressedData)
                     ParTree.Data.Header.Size[0] = New_Size % (16**2)
-                    ParTree.Data.Header.Size[1] = New_Size % (16**4) //(16**2)
+                    ParTree.Data.Header.Size[1] = New_Size % (16**4) // (16**2)
                     ParTree.Data.Header.Size[2] = New_Size // (16**4)
                     ParTree.Data.Size = ParTree.Data.Header.SECTION_SIZE
                     ModifySectionType(ParTree)
                     Needed_Space += ParTree.Data.HeaderLength - OriHeaderLen
                     # Update needed space with Delta_Pad_Size
                     if ParTree.NextRel:
-                        New_Pad_Size = GetPadSize(ParTree.Data.Size, SECTION_COMMON_ALIGNMENT)
-                        Delta_Pad_Size = New_Pad_Size - len(ParTree.Data.PadData)
+                        New_Pad_Size = GetPadSize(
+                            ParTree.Data.Size, SECTION_COMMON_ALIGNMENT)
+                        Delta_Pad_Size = New_Pad_Size - \
+                            len(ParTree.Data.PadData)
                         ParTree.Data.PadData = b'\x00' * New_Pad_Size
                         Needed_Space += Delta_Pad_Size
                     else:
                         ParTree.Data.PadData = b''
                     if Needed_Space < 0:
-                        self.Remain_New_Free_Space = len(ParTree.Data.OriData) - len(CompressedData)
+                        self.Remain_New_Free_Space = len(
+                            ParTree.Data.OriData) - len(CompressedData)
                 # If current section is not guided section
                 elif Needed_Space:
                     ChangeSize(ParTree, -Needed_Space)
                     ModifySectionType(ParTree)
                     # Update needed space with Delta_Pad_Size
                     Needed_Space += ParTree.Data.HeaderLength - OriHeaderLen
-                    New_Pad_Size = GetPadSize(ParTree.Data.Size, SECTION_COMMON_ALIGNMENT)
+                    New_Pad_Size = GetPadSize(
+                        ParTree.Data.Size, SECTION_COMMON_ALIGNMENT)
                     Delta_Pad_Size = New_Pad_Size - len(ParTree.Data.PadData)
                     Needed_Space += Delta_Pad_Size
                     ParTree.Data.PadData = b'\x00' * New_Pad_Size
             NewParTree = ParTree.Parent
-            ROOT_TYPE = [ROOT_FV_TREE, ROOT_FFS_TREE, ROOT_SECTION_TREE, ROOT_TREE]
+            ROOT_TYPE = [ROOT_FV_TREE, ROOT_FFS_TREE,
+                         ROOT_SECTION_TREE, ROOT_TREE]
             if NewParTree and NewParTree.type not in ROOT_TYPE:
                 self.ModifyTest(NewParTree, Needed_Space)
         # If current node have enough space, will recompress all the related node data, return true.
@@ -381,16 +433,20 @@ class FvHandler:
         TargetFv = self.TargetFfs.Parent
         # If the Fv Header Attributes is EFI_FVB2_ERASE_POLARITY, Child Ffs Header State need be reversed.
         if TargetFv.Data.Header.Attributes & EFI_FVB2_ERASE_POLARITY:
-                self.NewFfs.Data.Header.State = c_uint8(
-                    ~self.NewFfs.Data.Header.State)
+            self.NewFfs.Data.Header.State = c_uint8(
+                ~self.NewFfs.Data.Header.State)
         # NewFfs parsing will not calculate the PadSize, thus recalculate.
-        self.NewFfs.Data.PadData = b'\xff' * GetPadSize(self.NewFfs.Data.Size, FFS_COMMON_ALIGNMENT)
+        self.NewFfs.Data.PadData = b'\xff' * \
+            GetPadSize(self.NewFfs.Data.Size, FFS_COMMON_ALIGNMENT)
         if self.NewFfs.Data.Size >= self.TargetFfs.Data.Size:
-            Needed_Space = self.NewFfs.Data.Size + len(self.NewFfs.Data.PadData) - self.TargetFfs.Data.Size - len(self.TargetFfs.Data.PadData)
+            Needed_Space = self.NewFfs.Data.Size + \
+                len(self.NewFfs.Data.PadData) - self.TargetFfs.Data.Size - \
+                len(self.TargetFfs.Data.PadData)
             # If TargetFv have enough free space, just move part of the free space to NewFfs.
             if TargetFv.Data.Free_Space >= Needed_Space:
                 # Modify TargetFv Child info and BiosTree.
-                TargetFv.Child[-1].Data.Data = b'\xff' * (TargetFv.Data.Free_Space - Needed_Space)
+                TargetFv.Child[-1].Data.Data = b'\xff' * \
+                    (TargetFv.Data.Free_Space - Needed_Space)
                 TargetFv.Data.Free_Space -= Needed_Space
                 Target_index = TargetFv.Child.index(self.TargetFfs)
                 TargetFv.Child.remove(self.TargetFfs)
@@ -413,7 +469,7 @@ class FvHandler:
                     # Recalculate TargetFv needed space to keep it match the BlockSize setting.
                     Needed_Space -= TargetFv.Data.Free_Space
                     BlockSize = TargetFv.Data.Header.BlockMap[0].Length
-                    New_Add_Len = BlockSize - Needed_Space%BlockSize
+                    New_Add_Len = BlockSize - Needed_Space % BlockSize
                     Target_index = TargetFv.Child.index(self.TargetFfs)
                     if New_Add_Len % BlockSize:
                         TargetFv.Child[-1].Data.Data = b'\xff' * New_Add_Len
@@ -431,7 +487,8 @@ class FvHandler:
                         if item.type == FFS_FREE_SPACE:
                             TargetFv.Data.Data += item.Data.Data + item.Data.PadData
                         else:
-                            TargetFv.Data.Data += struct2stream(item.Data.Header)+ item.Data.Data + item.Data.PadData
+                            TargetFv.Data.Data += struct2stream(
+                                item.Data.Header) + item.Data.Data + item.Data.PadData
                     TargetFv.Data.Size += Needed_Space
                     # Modify TargetFv Data Header and ExtHeader info.
                     TargetFv.Data.Header.FvLength = TargetFv.Data.Size
@@ -477,9 +534,12 @@ class FvHandler:
     def AddFfs(self) -> bool:
         logger.debug('Start Adding Process......')
         # NewFfs parsing will not calculate the PadSize, thus recalculate.
-        self.NewFfs.Data.PadData = b'\xff' * GetPadSize(self.NewFfs.Data.Size, FFS_COMMON_ALIGNMENT)
+        self.NewFfs.Data.PadData = b'\xff' * \
+            GetPadSize(self.NewFfs.Data.Size, FFS_COMMON_ALIGNMENT)
         if self.TargetFfs.type == FFS_FREE_SPACE:
-            TargetLen = self.NewFfs.Data.Size + len(self.NewFfs.Data.PadData) - self.TargetFfs.Data.Size - len(self.TargetFfs.Data.PadData)
+            TargetLen = self.NewFfs.Data.Size + \
+                len(self.NewFfs.Data.PadData) - self.TargetFfs.Data.Size - \
+                len(self.TargetFfs.Data.PadData)
             TargetFv = self.TargetFfs.Parent
             # If the Fv Header Attributes is EFI_FVB2_ERASE_POLARITY, Child Ffs Header State need be reversed.
             if TargetFv.Data.Header.Attributes & EFI_FVB2_ERASE_POLARITY:
@@ -512,7 +572,7 @@ class FvHandler:
                 elif TargetFv.type == SEC_FV_TREE:
                     # Recalculate TargetFv needed space to keep it match the BlockSize setting.
                     BlockSize = TargetFv.Data.Header.BlockMap[0].Length
-                    New_Add_Len = BlockSize - TargetLen%BlockSize
+                    New_Add_Len = BlockSize - TargetLen % BlockSize
                     if New_Add_Len % BlockSize:
                         self.TargetFfs.Data.Data = b'\xff' * New_Add_Len
                         self.TargetFfs.Data.Size = New_Add_Len
@@ -530,7 +590,8 @@ class FvHandler:
                         if item.type == FFS_FREE_SPACE:
                             TargetFv.Data.Data += item.Data.Data + item.Data.PadData
                         else:
-                            TargetFv.Data.Data += struct2stream(item.Data.Header)+ item.Data.Data + item.Data.PadData
+                            TargetFv.Data.Data += struct2stream(
+                                item.Data.Header) + item.Data.Data + item.Data.PadData
                     # Encapsulate the Fv Data for update.
                     TargetFv.Data.Size += TargetLen
                     TargetFv.Data.Header.FvLength = TargetFv.Data.Size
@@ -552,7 +613,7 @@ class FvHandler:
                 self.Status = False
             elif TargetFv.type == SEC_FV_TREE:
                 BlockSize = TargetFv.Data.Header.BlockMap[0].Length
-                New_Add_Len = BlockSize - TargetLen%BlockSize
+                New_Add_Len = BlockSize - TargetLen % BlockSize
                 if New_Add_Len % BlockSize:
                     New_Free_Space = BIOSTREE('FREE_SPACE')
                     New_Free_Space.type = FFS_FREE_SPACE
@@ -570,7 +631,8 @@ class FvHandler:
                     if item.type == FFS_FREE_SPACE:
                         TargetFv.Data.Data += item.Data.Data + item.Data.PadData
                     else:
-                        TargetFv.Data.Data += struct2stream(item.Data.Header)+ item.Data.Data + item.Data.PadData
+                        TargetFv.Data.Data += struct2stream(
+                            item.Data.Header) + item.Data.Data + item.Data.PadData
                 TargetFv.Data.Size += TargetLen
                 TargetFv.Data.Header.FvLength = TargetFv.Data.Size
                 TargetFv.Data.ModFvExt()
@@ -596,7 +658,8 @@ class FvHandler:
                 Used_Size = Delete_Fv.Data.Size - Delete_Fv.Data.Free_Space - Add_Free_Space
                 BlockSize = Delete_Fv.Data.Header.BlockMap[0].Length
                 New_Free_Space = BlockSize - Used_Size % BlockSize
-                self.Remain_New_Free_Space += Delete_Fv.Data.Free_Space + Add_Free_Space - New_Free_Space
+                self.Remain_New_Free_Space += Delete_Fv.Data.Free_Space + \
+                    Add_Free_Space - New_Free_Space
                 Delete_Fv.Child[-1].Data.Data = New_Free_Space * b'\xff'
                 Delete_Fv.Data.Free_Space = New_Free_Space
             # If Fv is lst level Fv, new free space will be merged with origin free space.
diff --git a/BaseTools/Source/Python/FMMT/core/GuidTools.py b/BaseTools/Source/Python/FMMT/core/GuidTools.py
index a25681709bc8..ea196daa9c8a 100644
--- a/BaseTools/Source/Python/FMMT/core/GuidTools.py
+++ b/BaseTools/Source/Python/FMMT/core/GuidTools.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define the FMMT dependent external tool management class.
 #
 # Copyright (c) 2021-, Intel Corporation. All rights reserved.<BR>
@@ -15,8 +15,10 @@ from FirmwareStorageFormat.Common import *
 from utils.FmmtLogger import FmmtLogger as logger
 import subprocess
 
+
 def ExecuteCommand(cmd: list) -> None:
-    subprocess.run(cmd,stdout=subprocess.DEVNULL)
+    subprocess.run(cmd, stdout=subprocess.DEVNULL)
+
 
 class GUIDTool:
     def __init__(self, guid: str, short_name: str, command: str) -> None:
@@ -39,7 +41,7 @@ class GUIDTool:
                 file.write(buffer)
                 file.close()
                 command = [tool, '-e', '-o', ToolOuputFile,
-                                  ToolInputFile]
+                           ToolInputFile]
                 ExecuteCommand(command)
                 buf = open(ToolOuputFile, "rb")
                 res_buffer = buf.read()
@@ -57,7 +59,6 @@ class GUIDTool:
             logger.info("Its GUID is: %s" % self.guid)
             return ""
 
-
     def unpack(self, buffer: bytes) -> bytes:
         """
         buffer: remove common header
@@ -85,10 +86,12 @@ class GUIDTool:
                     shutil.rmtree(tmp)
                 return res_buffer
         else:
-            logger.error("Error parsing section: EFI_SECTION_GUID_DEFINED cannot be parsed at this time.")
+            logger.error(
+                "Error parsing section: EFI_SECTION_GUID_DEFINED cannot be parsed at this time.")
             logger.info("Its GUID is: %s" % self.guid)
             return ""
 
+
 class GUIDTools:
     '''
     GUIDTools is responsible for reading FMMTConfig.ini, verify the tools and provide interfaces to access those tools.
@@ -101,19 +104,22 @@ class GUIDTools:
         struct2stream(ModifyGuidFormat("3d532050-5cda-4fd0-879e-0f7f630d5afb")): GUIDTool("3d532050-5cda-4fd0-879e-0f7f630d5afb", "BROTLI", "BrotliCompress"),
     }
 
-    def __init__(self, tooldef_file: str=None) -> None:
+    def __init__(self, tooldef_file: str = None) -> None:
         self.dir = os.path.join(os.path.dirname(__file__), "..")
-        self.tooldef_file = tooldef_file if tooldef_file else os.path.join(self.dir, "FmmtConf.ini")
+        self.tooldef_file = tooldef_file if tooldef_file else os.path.join(
+            self.dir, "FmmtConf.ini")
         self.tooldef = dict()
 
     def SetConfigFile(self) -> None:
         if os.environ['FmmtConfPath']:
-            self.tooldef_file = os.path.join(os.environ['FmmtConfPath'], 'FmmtConf.ini')
+            self.tooldef_file = os.path.join(
+                os.environ['FmmtConfPath'], 'FmmtConf.ini')
         else:
             PathList = os.environ['PATH']
             for CurrentPath in PathList:
                 if os.path.exists(os.path.join(CurrentPath, 'FmmtConf.ini')):
-                    self.tooldef_file = os.path.join(CurrentPath, 'FmmtConf.ini')
+                    self.tooldef_file = os.path.join(
+                        CurrentPath, 'FmmtConf.ini')
                     break
 
     def VerifyTools(self, guidtool) -> None:
@@ -128,10 +134,13 @@ class GUIDTools:
         if os.path.isabs(cmd):
             if not os.path.exists(cmd):
                 if guidtool not in self.default_tools:
-                    logger.error("Tool Not found %s, which causes compress/uncompress process error." % cmd)
-                    logger.error("Please goto edk2 repo in current console, run 'edksetup.bat rebuild' command, and try again.\n")
+                    logger.error(
+                        "Tool Not found %s, which causes compress/uncompress process error." % cmd)
+                    logger.error(
+                        "Please goto edk2 repo in current console, run 'edksetup.bat rebuild' command, and try again.\n")
                 else:
-                    logger.error("Tool Not found %s, which causes compress/uncompress process error." % cmd)
+                    logger.error(
+                        "Tool Not found %s, which causes compress/uncompress process error." % cmd)
             else:
                 guidtool.ifexist = True
         else:
@@ -141,10 +150,13 @@ class GUIDTools:
                     break
             else:
                 if guidtool not in self.default_tools:
-                    logger.error("Tool Not found %s, which causes compress/uncompress process error." % cmd)
-                    logger.error("Please goto edk2 repo in current console, run 'edksetup.bat rebuild' command, and try again.\n")
+                    logger.error(
+                        "Tool Not found %s, which causes compress/uncompress process error." % cmd)
+                    logger.error(
+                        "Please goto edk2 repo in current console, run 'edksetup.bat rebuild' command, and try again.\n")
                 else:
-                    logger.error("Tool Not found %s, which causes compress/uncompress process error." % cmd)
+                    logger.error(
+                        "Tool Not found %s, which causes compress/uncompress process error." % cmd)
 
     def LoadingTools(self) -> None:
         self.SetConfigFile()
@@ -155,7 +167,8 @@ class GUIDTools:
                 try:
                     if not line.startswith("#"):
                         guid, short_name, command = line.split()
-                        new_format_guid = struct2stream(ModifyGuidFormat(guid.strip()))
+                        new_format_guid = struct2stream(
+                            ModifyGuidFormat(guid.strip()))
                         self.tooldef[new_format_guid] = GUIDTool(
                             guid.strip(), short_name.strip(), command.strip())
                 except:
@@ -175,5 +188,5 @@ class GUIDTools:
             logger.error("{} GuidTool is not defined!".format(guid))
             raise Exception("Process Failed: is not defined!")
 
+
 guidtools = GUIDTools()
-
diff --git a/BaseTools/Source/Python/FMMT/utils/FmmtLogger.py b/BaseTools/Source/Python/FMMT/utils/FmmtLogger.py
index 385f098310a0..df2c30477bfb 100644
--- a/BaseTools/Source/Python/FMMT/utils/FmmtLogger.py
+++ b/BaseTools/Source/Python/FMMT/utils/FmmtLogger.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define the Fmmt Logger.
 #
 # Copyright (c) 2021-, Intel Corporation. All rights reserved.<BR>
@@ -17,12 +17,12 @@ if os.path.exists(logfile):
 FmmtLogger = logging.getLogger('FMMT')
 FmmtLogger.setLevel(logging.DEBUG)
 
-log_stream_handler=logging.StreamHandler(sys.stdout)
-log_file_handler=logging.FileHandler(logfile)
+log_stream_handler = logging.StreamHandler(sys.stdout)
+log_file_handler = logging.FileHandler(logfile)
 log_stream_handler.setLevel(logging.INFO)
 
-stream_format=logging.Formatter("%(levelname)-8s: %(message)s")
-file_format=logging.Formatter("%(levelname)-8s: %(message)s")
+stream_format = logging.Formatter("%(levelname)-8s: %(message)s")
+file_format = logging.Formatter("%(levelname)-8s: %(message)s")
 
 log_stream_handler.setFormatter(stream_format)
 log_file_handler.setFormatter(file_format)
diff --git a/BaseTools/Source/Python/FMMT/utils/FvLayoutPrint.py b/BaseTools/Source/Python/FMMT/utils/FvLayoutPrint.py
index 7dafcae3b583..eadef2766380 100644
--- a/BaseTools/Source/Python/FMMT/utils/FvLayoutPrint.py
+++ b/BaseTools/Source/Python/FMMT/utils/FvLayoutPrint.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define the printer for Bios layout.
 #
 # Copyright (c) 2021-, Intel Corporation. All rights reserved.<BR>
@@ -6,6 +6,7 @@
 ##
 from utils.FmmtLogger import FmmtLogger as logger
 
+
 def GetFormatter(layout_format: str):
     if layout_format == 'json':
         return JsonFormatter()
@@ -16,12 +17,14 @@ def GetFormatter(layout_format: str):
     else:
         return TxtFormatter()
 
+
 class Formatter(object):
-    def dump(self, layoutdict, layoutlist, outputfile: str=None) -> None:
+    def dump(self, layoutdict, layoutlist, outputfile: str = None) -> None:
         raise NotImplemented
 
+
 class JsonFormatter(Formatter):
-    def dump(self,layoutdict: dict, layoutlist: list, outputfile: str=None) -> None:
+    def dump(self, layoutdict: dict, layoutlist: list, outputfile: str = None) -> None:
         try:
             import json
         except:
@@ -29,27 +32,31 @@ class JsonFormatter(Formatter):
             return
         print(outputfile)
         if outputfile:
-            with open(outputfile,"w") as fw:
+            with open(outputfile, "w") as fw:
                 json.dump(layoutdict, fw, indent=2)
         else:
-            print(json.dumps(layoutdict,indent=2))
+            print(json.dumps(layoutdict, indent=2))
+
 
 class TxtFormatter(Formatter):
-    def LogPrint(self,layoutlist: list) -> None:
+    def LogPrint(self, layoutlist: list) -> None:
         for item in layoutlist:
             print(item)
         print('\n')
 
-    def dump(self,layoutdict: dict, layoutlist: list, outputfile: str=None) -> None:
-        logger.info('Binary Layout Info is saved in {} file.'.format(outputfile))
+    def dump(self, layoutdict: dict, layoutlist: list, outputfile: str = None) -> None:
+        logger.info(
+            'Binary Layout Info is saved in {} file.'.format(outputfile))
         with open(outputfile, "w") as f:
             for item in layoutlist:
                 f.writelines(item + '\n')
 
+
 class YamlFormatter(Formatter):
-    def dump(self,layoutdict, layoutlist, outputfile = None):
+    def dump(self, layoutdict, layoutlist, outputfile=None):
         TxtFormatter().dump(layoutdict, layoutlist, outputfile)
 
+
 class HtmlFormatter(Formatter):
-    def dump(self,layoutdict, layoutlist, outputfile = None):
-        TxtFormatter().dump(layoutdict, layoutlist, outputfile)
\ No newline at end of file
+    def dump(self, layoutdict, layoutlist, outputfile=None):
+        TxtFormatter().dump(layoutdict, layoutlist, outputfile)
diff --git a/BaseTools/Source/Python/FirmwareStorageFormat/Common.py b/BaseTools/Source/Python/FirmwareStorageFormat/Common.py
index 5082268a0063..f45df799cd0b 100644
--- a/BaseTools/Source/Python/FirmwareStorageFormat/Common.py
+++ b/BaseTools/Source/Python/FirmwareStorageFormat/Common.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define the common C struct and functions.
 #
 # Copyright (c) 2021-, Intel Corporation. All rights reserved.<BR>
@@ -12,24 +12,28 @@ import uuid
 # EFI_FIRMWARE_FILE_SYSTEM3_GUID = uuid.UUID('{5473C07A-3DCB-4dca-BD6F-1E9689E7349A}')
 # EFI_FFS_VOLUME_TOP_FILE_GUID = uuid.UUID('{1BA0062E-C779-4582-8566-336AE8F78F09}')
 
-EFI_FIRMWARE_FILE_SYSTEM2_GUID = uuid.UUID("8c8ce578-8a3d-4f1c-9935-896185c32dd3")
+EFI_FIRMWARE_FILE_SYSTEM2_GUID = uuid.UUID(
+    "8c8ce578-8a3d-4f1c-9935-896185c32dd3")
 EFI_FIRMWARE_FILE_SYSTEM2_GUID_BYTE = b'x\xe5\x8c\x8c=\x8a\x1cO\x995\x89a\x85\xc3-\xd3'
 # EFI_FIRMWARE_FILE_SYSTEM2_GUID_BYTE = EFI_FIRMWARE_FILE_SYSTEM2_GUID.bytes
-EFI_FIRMWARE_FILE_SYSTEM3_GUID = uuid.UUID("5473C07A-3DCB-4dca-BD6F-1E9689E7349A")
+EFI_FIRMWARE_FILE_SYSTEM3_GUID = uuid.UUID(
+    "5473C07A-3DCB-4dca-BD6F-1E9689E7349A")
 # EFI_FIRMWARE_FILE_SYSTEM3_GUID_BYTE = b'x\xe5\x8c\x8c=\x8a\x1cO\x995\x89a\x85\xc3-\xd3'
 EFI_FIRMWARE_FILE_SYSTEM3_GUID_BYTE = b'z\xc0sT\xcb=\xcaM\xbdo\x1e\x96\x89\xe74\x9a'
 EFI_SYSTEM_NVDATA_FV_GUID = uuid.UUID("fff12b8d-7696-4c8b-a985-2747075b4f50")
 EFI_SYSTEM_NVDATA_FV_GUID_BYTE = b"\x8d+\xf1\xff\x96v\x8bL\xa9\x85'G\x07[OP"
-EFI_FFS_VOLUME_TOP_FILE_GUID = uuid.UUID("1ba0062e-c779-4582-8566-336ae8f78f09")
+EFI_FFS_VOLUME_TOP_FILE_GUID = uuid.UUID(
+    "1ba0062e-c779-4582-8566-336ae8f78f09")
 EFI_FFS_VOLUME_TOP_FILE_GUID_BYTE = b'.\x06\xa0\x1by\xc7\x82E\x85f3j\xe8\xf7\x8f\t'
 ZEROVECTOR_BYTE = b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'
 PADVECTOR = uuid.UUID("ffffffff-ffff-ffff-ffff-ffffffffffff")
 FVH_SIGNATURE = b'_FVH'
 
-#Alignment
+# Alignment
 SECTION_COMMON_ALIGNMENT = 4
 FFS_COMMON_ALIGNMENT = 8
 
+
 class GUID(Structure):
     _pack_ = 1
     _fields_ = [
@@ -56,11 +60,12 @@ class GUID(Structure):
                 rt = rt & (self.Guid4[i] == otherguid.Guid4[i])
         return rt
 
+
 def ModifyGuidFormat(target_guid: str) -> GUID:
     target_guid = target_guid.replace('-', '')
     target_list = []
-    start = [0,8,12,16,18,20,22,24,26,28,30]
-    end = [8,12,16,18,20,22,24,26,28,30,32]
+    start = [0, 8, 12, 16, 18, 20, 22, 24, 26, 28, 30]
+    end = [8, 12, 16, 18, 20, 22, 24, 26, 28, 30, 32]
     num = len(start)
     for pos in range(num):
         new_value = int(target_guid[start[pos]:end[pos]], 16)
@@ -77,7 +82,6 @@ def struct2stream(s) -> bytes:
     return p.contents.raw
 
 
-
 def GetPadSize(Size: int, alignment: int) -> int:
     if Size % alignment == 0:
         return 0
diff --git a/BaseTools/Source/Python/FirmwareStorageFormat/FfsFileHeader.py b/BaseTools/Source/Python/FirmwareStorageFormat/FfsFileHeader.py
index e9c619d2240e..2b37dc8b797b 100644
--- a/BaseTools/Source/Python/FirmwareStorageFormat/FfsFileHeader.py
+++ b/BaseTools/Source/Python/FirmwareStorageFormat/FfsFileHeader.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define the Ffs Header C Struct.
 #
 # Copyright (c) 2021-, Intel Corporation. All rights reserved.<BR>
@@ -11,6 +11,7 @@ from FirmwareStorageFormat.Common import *
 EFI_FFS_FILE_HEADER_LEN = 24
 EFI_FFS_FILE_HEADER2_LEN = 32
 
+
 class CHECK_SUM(Structure):
     _pack_ = 1
     _fields_ = [
@@ -18,6 +19,7 @@ class CHECK_SUM(Structure):
         ('File',                     c_uint8),
     ]
 
+
 class EFI_FFS_INTEGRITY_CHECK(Union):
     _pack_ = 1
     _fields_ = [
@@ -45,6 +47,7 @@ class EFI_FFS_FILE_HEADER(Structure):
     def HeaderLength(self) -> int:
         return 24
 
+
 class EFI_FFS_FILE_HEADER2(Structure):
     _pack_ = 1
     _fields_ = [
diff --git a/BaseTools/Source/Python/FirmwareStorageFormat/FvHeader.py b/BaseTools/Source/Python/FirmwareStorageFormat/FvHeader.py
index 078beda9e5c1..e293da55b5f3 100644
--- a/BaseTools/Source/Python/FirmwareStorageFormat/FvHeader.py
+++ b/BaseTools/Source/Python/FirmwareStorageFormat/FvHeader.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define the FV Header C Struct.
 #
 # Copyright (c) 2021-, Intel Corporation. All rights reserved.<BR>
@@ -9,6 +9,7 @@ from struct import *
 from ctypes import *
 from FirmwareStorageFormat.Common import *
 
+
 class EFI_FV_BLOCK_MAP_ENTRY(Structure):
     _pack_ = 1
     _fields_ = [
@@ -30,7 +31,8 @@ class EFI_FIRMWARE_VOLUME_HEADER(Structure):
         ('Reserved',             c_uint8),
         ('Revision',             c_uint8),
         ('BlockMap',             ARRAY(EFI_FV_BLOCK_MAP_ENTRY, 1)),
-        ]
+    ]
+
 
 def Refine_FV_Header(nums):
     class EFI_FIRMWARE_VOLUME_HEADER(Structure):
@@ -46,33 +48,38 @@ def Refine_FV_Header(nums):
             ('Reserved',             c_uint8),
             ('Revision',             c_uint8),
             ('BlockMap',             ARRAY(EFI_FV_BLOCK_MAP_ENTRY, nums)),
-            ]
+        ]
     return EFI_FIRMWARE_VOLUME_HEADER
 
+
 class EFI_FIRMWARE_VOLUME_EXT_HEADER(Structure):
     _fields_ = [
         ('FvName',               GUID),
         ('ExtHeaderSize',        c_uint32)
-        ]
+    ]
+
 
 class EFI_FIRMWARE_VOLUME_EXT_ENTRY(Structure):
     _fields_ = [
         ('ExtEntrySize',         c_uint16),
         ('ExtEntryType',         c_uint16)
-        ]
+    ]
+
 
 class EFI_FIRMWARE_VOLUME_EXT_ENTRY_OEM_TYPE_0(Structure):
     _fields_ = [
         ('Hdr',                  EFI_FIRMWARE_VOLUME_EXT_ENTRY),
         ('TypeMask',             c_uint32)
-        ]
+    ]
+
 
 class EFI_FIRMWARE_VOLUME_EXT_ENTRY_OEM_TYPE(Structure):
     _fields_ = [
         ('Hdr',                  EFI_FIRMWARE_VOLUME_EXT_ENTRY),
         ('TypeMask',             c_uint32),
         ('Types',                ARRAY(GUID, 1))
-        ]
+    ]
+
 
 def Refine_FV_EXT_ENTRY_OEM_TYPE_Header(nums: int) -> EFI_FIRMWARE_VOLUME_EXT_ENTRY_OEM_TYPE:
     class EFI_FIRMWARE_VOLUME_EXT_ENTRY_OEM_TYPE(Structure):
@@ -83,18 +90,21 @@ def Refine_FV_EXT_ENTRY_OEM_TYPE_Header(nums: int) -> EFI_FIRMWARE_VOLUME_EXT_EN
         ]
     return EFI_FIRMWARE_VOLUME_EXT_ENTRY_OEM_TYPE(Structure)
 
+
 class EFI_FIRMWARE_VOLUME_EXT_ENTRY_GUID_TYPE_0(Structure):
     _fields_ = [
         ('Hdr',                  EFI_FIRMWARE_VOLUME_EXT_ENTRY),
         ('FormatType',           GUID)
-        ]
+    ]
+
 
 class EFI_FIRMWARE_VOLUME_EXT_ENTRY_GUID_TYPE(Structure):
     _fields_ = [
         ('Hdr',                  EFI_FIRMWARE_VOLUME_EXT_ENTRY),
         ('FormatType',           GUID),
         ('Data',                 ARRAY(c_uint8, 1))
-        ]
+    ]
+
 
 def Refine_FV_EXT_ENTRY_GUID_TYPE_Header(nums: int) -> EFI_FIRMWARE_VOLUME_EXT_ENTRY_GUID_TYPE:
     class EFI_FIRMWARE_VOLUME_EXT_ENTRY_GUID_TYPE(Structure):
@@ -105,8 +115,9 @@ def Refine_FV_EXT_ENTRY_GUID_TYPE_Header(nums: int) -> EFI_FIRMWARE_VOLUME_EXT_E
         ]
     return EFI_FIRMWARE_VOLUME_EXT_ENTRY_GUID_TYPE(Structure)
 
+
 class EFI_FIRMWARE_VOLUME_EXT_ENTRY_USED_SIZE_TYPE(Structure):
     _fields_ = [
         ('Hdr',                  EFI_FIRMWARE_VOLUME_EXT_ENTRY),
         ('UsedSize',             c_uint32)
-        ]
+    ]
diff --git a/BaseTools/Source/Python/FirmwareStorageFormat/SectionHeader.py b/BaseTools/Source/Python/FirmwareStorageFormat/SectionHeader.py
index ee6a63679d89..f22e86f19365 100644
--- a/BaseTools/Source/Python/FirmwareStorageFormat/SectionHeader.py
+++ b/BaseTools/Source/Python/FirmwareStorageFormat/SectionHeader.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define the Section Header C Struct.
 #
 # Copyright (c) 2021-, Intel Corporation. All rights reserved.<BR>
@@ -11,6 +11,7 @@ from FirmwareStorageFormat.Common import *
 EFI_COMMON_SECTION_HEADER_LEN = 4
 EFI_COMMON_SECTION_HEADER2_LEN = 8
 
+
 class EFI_COMMON_SECTION_HEADER(Structure):
     _pack_ = 1
     _fields_ = [
@@ -25,6 +26,7 @@ class EFI_COMMON_SECTION_HEADER(Structure):
     def Common_Header_Size(self) -> int:
         return 4
 
+
 class EFI_COMMON_SECTION_HEADER2(Structure):
     _pack_ = 1
     _fields_ = [
@@ -40,6 +42,7 @@ class EFI_COMMON_SECTION_HEADER2(Structure):
     def Common_Header_Size(self) -> int:
         return 8
 
+
 class EFI_COMPRESSION_SECTION(Structure):
     _pack_ = 1
     _fields_ = [
@@ -50,6 +53,7 @@ class EFI_COMPRESSION_SECTION(Structure):
     def ExtHeaderSize(self) -> int:
         return 5
 
+
 class EFI_FREEFORM_SUBTYPE_GUID_SECTION(Structure):
     _pack_ = 1
     _fields_ = [
@@ -59,6 +63,7 @@ class EFI_FREEFORM_SUBTYPE_GUID_SECTION(Structure):
     def ExtHeaderSize(self) -> int:
         return 16
 
+
 class EFI_GUID_DEFINED_SECTION(Structure):
     _pack_ = 1
     _fields_ = [
@@ -70,6 +75,7 @@ class EFI_GUID_DEFINED_SECTION(Structure):
     def ExtHeaderSize(self) -> int:
         return 20
 
+
 def Get_USER_INTERFACE_Header(nums: int):
     class EFI_SECTION_USER_INTERFACE(Structure):
         _pack_ = 1
@@ -89,6 +95,7 @@ def Get_USER_INTERFACE_Header(nums: int):
 
     return EFI_SECTION_USER_INTERFACE
 
+
 def Get_VERSION_Header(nums: int):
     class EFI_SECTION_VERSION(Structure):
         _pack_ = 1
diff --git a/BaseTools/Source/Python/FirmwareStorageFormat/__init__.py b/BaseTools/Source/Python/FirmwareStorageFormat/__init__.py
index 335653c6cc60..94330f5ae148 100644
--- a/BaseTools/Source/Python/FirmwareStorageFormat/__init__.py
+++ b/BaseTools/Source/Python/FirmwareStorageFormat/__init__.py
@@ -1,6 +1,6 @@
-## @file
+# @file
 # This file is used to define the Firmware Storage Format.
 #
 # Copyright (c) 2021-, Intel Corporation. All rights reserved.<BR>
 # SPDX-License-Identifier: BSD-2-Clause-Patent
-##
\ No newline at end of file
+##
diff --git a/BaseTools/Source/Python/GenFds/AprioriSection.py b/BaseTools/Source/Python/GenFds/AprioriSection.py
index 9f64c613eb8f..c1432394335e 100644
--- a/BaseTools/Source/Python/GenFds/AprioriSection.py
+++ b/BaseTools/Source/Python/GenFds/AprioriSection.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # process APRIORI file data and generate PEI/DXE APRIORI file
 #
 #  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -24,11 +24,13 @@ from Common.DataType import TAB_COMMON
 DXE_APRIORI_GUID = "FC510EE7-FFDC-11D4-BD41-0080C73C8881"
 PEI_APRIORI_GUID = "1B45CC0A-156A-428A-AF62-49864DA0E6E6"
 
-## process APRIORI file data and generate PEI/DXE APRIORI file
+# process APRIORI file data and generate PEI/DXE APRIORI file
 #
 #
+
+
 class AprioriSection (object):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -37,7 +39,7 @@ class AprioriSection (object):
         self.FfsList = []
         self.AprioriType = ""
 
-    ## GenFfs() method
+    # GenFfs() method
     #
     #   Generate FFS for APRIORI file
     #
@@ -46,7 +48,7 @@ class AprioriSection (object):
     #   @param  Dict        dictionary contains macro and its value
     #   @retval string      Generated file name
     #
-    def GenFfs (self, FvName, Dict = None, IsMakefile = False):
+    def GenFfs(self, FvName, Dict=None, IsMakefile=False):
         if Dict is None:
             Dict = {}
         Buffer = BytesIO()
@@ -55,16 +57,16 @@ class AprioriSection (object):
         else:
             AprioriFileGuid = DXE_APRIORI_GUID
 
-        OutputAprFilePath = os.path.join (GenFdsGlobalVariable.WorkSpaceDir, \
-                                   GenFdsGlobalVariable.FfsDir,\
-                                   AprioriFileGuid + FvName)
+        OutputAprFilePath = os.path.join(GenFdsGlobalVariable.WorkSpaceDir,
+                                         GenFdsGlobalVariable.FfsDir,
+                                         AprioriFileGuid + FvName)
         if not os.path.exists(OutputAprFilePath):
             os.makedirs(OutputAprFilePath)
 
-        OutputAprFileName = os.path.join( OutputAprFilePath, \
-                                       AprioriFileGuid + FvName + '.Apri' )
-        AprFfsFileName = os.path.join (OutputAprFilePath,\
-                                    AprioriFileGuid + FvName + '.Ffs')
+        OutputAprFileName = os.path.join(OutputAprFilePath,
+                                         AprioriFileGuid + FvName + '.Apri')
+        AprFfsFileName = os.path.join(OutputAprFilePath,
+                                      AprioriFileGuid + FvName + '.Ffs')
 
         Dict.update(self.DefineVarDict)
         InfFileName = None
@@ -78,19 +80,22 @@ class AprioriSection (object):
 
                 if Arch:
                     Dict['$(ARCH)'] = Arch
-                InfFileName = GenFdsGlobalVariable.MacroExtend(InfFileName, Dict, Arch)
+                InfFileName = GenFdsGlobalVariable.MacroExtend(
+                    InfFileName, Dict, Arch)
 
                 if Arch:
-                    Inf = GenFdsGlobalVariable.WorkSpace.BuildObject[PathClass(InfFileName, GenFdsGlobalVariable.WorkSpaceDir), Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
+                    Inf = GenFdsGlobalVariable.WorkSpace.BuildObject[PathClass(
+                        InfFileName, GenFdsGlobalVariable.WorkSpaceDir), Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
                     Guid = Inf.Guid
                 else:
-                    Inf = GenFdsGlobalVariable.WorkSpace.BuildObject[PathClass(InfFileName, GenFdsGlobalVariable.WorkSpaceDir), TAB_COMMON, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
+                    Inf = GenFdsGlobalVariable.WorkSpace.BuildObject[PathClass(
+                        InfFileName, GenFdsGlobalVariable.WorkSpaceDir), TAB_COMMON, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
                     Guid = Inf.Guid
 
                     if not Inf.Module.Binaries:
                         EdkLoggerError("GenFds", RESOURCE_NOT_AVAILABLE,
-                                        "INF %s not found in build ARCH %s!" \
-                                        % (InfFileName, GenFdsGlobalVariable.ArchList))
+                                       "INF %s not found in build ARCH %s!"
+                                       % (InfFileName, GenFdsGlobalVariable.ArchList))
 
             GuidPart = Guid.split('-')
             Buffer.write(pack('I', int(GuidPart[0], 16)))
@@ -107,15 +112,16 @@ class AprioriSection (object):
 
         SaveFileOnChange(OutputAprFileName, Buffer.getvalue())
 
-        RawSectionFileName = os.path.join( OutputAprFilePath, \
-                                       AprioriFileGuid + FvName + '.raw' )
+        RawSectionFileName = os.path.join(OutputAprFilePath,
+                                          AprioriFileGuid + FvName + '.raw')
         MakefilePath = None
         if IsMakefile:
             if not InfFileName:
                 return None
             MakefilePath = InfFileName, Arch
-        GenFdsGlobalVariable.GenerateSection(RawSectionFileName, [OutputAprFileName], 'EFI_SECTION_RAW', IsMakefile=IsMakefile)
+        GenFdsGlobalVariable.GenerateSection(RawSectionFileName, [
+                                             OutputAprFileName], 'EFI_SECTION_RAW', IsMakefile=IsMakefile)
         GenFdsGlobalVariable.GenerateFfs(AprFfsFileName, [RawSectionFileName],
-                                        'EFI_FV_FILETYPE_FREEFORM', AprioriFileGuid, MakefilePath=MakefilePath)
+                                         'EFI_FV_FILETYPE_FREEFORM', AprioriFileGuid, MakefilePath=MakefilePath)
 
         return AprFfsFileName
diff --git a/BaseTools/Source/Python/GenFds/Capsule.py b/BaseTools/Source/Python/GenFds/Capsule.py
index f4bfc74e551c..50fa5722fc91 100644
--- a/BaseTools/Source/Python/GenFds/Capsule.py
+++ b/BaseTools/Source/Python/GenFds/Capsule.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # generate capsule
 #
 #  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -24,13 +24,16 @@ from Common.DataType import TAB_LINE_BREAK
 WIN_CERT_REVISION = 0x0200
 WIN_CERT_TYPE_EFI_GUID = 0x0EF1
 EFI_CERT_TYPE_PKCS7_GUID = uuid.UUID('{4aafd29d-68df-49ee-8aa9-347d375665a7}')
-EFI_CERT_TYPE_RSA2048_SHA256_GUID = uuid.UUID('{a7717414-c616-4977-9420-844712a735bf}')
+EFI_CERT_TYPE_RSA2048_SHA256_GUID = uuid.UUID(
+    '{a7717414-c616-4977-9420-844712a735bf}')
 
-## create inf file describes what goes into capsule and call GenFv to generate capsule
+# create inf file describes what goes into capsule and call GenFv to generate capsule
 #
 #
+
+
 class Capsule (CapsuleClassObject):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -42,7 +45,7 @@ class Capsule (CapsuleClassObject):
         self.BlockNum = None
         self.CapsuleName = None
 
-    ## Generate FMP capsule
+    # Generate FMP capsule
     #
     #   @retval string      Generated Capsule file path
     #
@@ -60,10 +63,12 @@ class Capsule (CapsuleClassObject):
         #
         # Use FMP capsule GUID: 6DCBD5ED-E82D-4C44-BDA1-7194199AD92A
         #
-        Header.write(PackGUID('6DCBD5ED-E82D-4C44-BDA1-7194199AD92A'.split('-')))
+        Header.write(
+            PackGUID('6DCBD5ED-E82D-4C44-BDA1-7194199AD92A'.split('-')))
         HdrSize = 0
         if 'CAPSULE_HEADER_SIZE' in self.TokensDict:
-            Header.write(pack('=I', int(self.TokensDict['CAPSULE_HEADER_SIZE'], 16)))
+            Header.write(
+                pack('=I', int(self.TokensDict['CAPSULE_HEADER_SIZE'], 16)))
             HdrSize = int(self.TokensDict['CAPSULE_HEADER_SIZE'], 16)
         else:
             Header.write(pack('=I', 0x20))
@@ -89,11 +94,14 @@ class Capsule (CapsuleClassObject):
         #
         FwMgrHdr = BytesIO()
         if 'CAPSULE_HEADER_INIT_VERSION' in self.TokensDict:
-            FwMgrHdr.write(pack('=I', int(self.TokensDict['CAPSULE_HEADER_INIT_VERSION'], 16)))
+            FwMgrHdr.write(
+                pack('=I', int(self.TokensDict['CAPSULE_HEADER_INIT_VERSION'], 16)))
         else:
             FwMgrHdr.write(pack('=I', 0x00000001))
-        FwMgrHdr.write(pack('=HH', len(self.CapsuleDataList), len(self.FmpPayloadList)))
-        FwMgrHdrSize = 4+2+2+8*(len(self.CapsuleDataList)+len(self.FmpPayloadList))
+        FwMgrHdr.write(pack('=HH', len(self.CapsuleDataList),
+                       len(self.FmpPayloadList)))
+        FwMgrHdrSize = 4+2+2+8 * \
+            (len(self.CapsuleDataList)+len(self.FmpPayloadList))
 
         #
         # typedef struct _WIN_CERTIFICATE {
@@ -143,27 +151,37 @@ class Capsule (CapsuleClassObject):
                 for Obj in fmp.VendorCodeFile:
                     fmp.VendorCodeFile = Obj.GenCapsuleSubItem()
             if fmp.Certificate_Guid:
-                ExternalTool, ExternalOption = FindExtendTool([], GenFdsGlobalVariable.ArchList, fmp.Certificate_Guid)
+                ExternalTool, ExternalOption = FindExtendTool(
+                    [], GenFdsGlobalVariable.ArchList, fmp.Certificate_Guid)
                 CmdOption = ''
                 CapInputFile = fmp.ImageFile
                 if not os.path.isabs(fmp.ImageFile):
-                    CapInputFile = os.path.join(GenFdsGlobalVariable.WorkSpaceDir, fmp.ImageFile)
-                CapOutputTmp = os.path.join(GenFdsGlobalVariable.FvDir, self.UiCapsuleName) + '.tmp'
+                    CapInputFile = os.path.join(
+                        GenFdsGlobalVariable.WorkSpaceDir, fmp.ImageFile)
+                CapOutputTmp = os.path.join(
+                    GenFdsGlobalVariable.FvDir, self.UiCapsuleName) + '.tmp'
                 if ExternalTool is None:
-                    EdkLogger.error("GenFds", GENFDS_ERROR, "No tool found with GUID %s" % fmp.Certificate_Guid)
+                    EdkLogger.error(
+                        "GenFds", GENFDS_ERROR, "No tool found with GUID %s" % fmp.Certificate_Guid)
                 else:
                     CmdOption += ExternalTool
                 if ExternalOption:
                     CmdOption = CmdOption + ' ' + ExternalOption
-                CmdOption += ' -e ' + ' --monotonic-count ' + str(fmp.MonotonicCount) + ' -o ' + CapOutputTmp + ' ' + CapInputFile
+                CmdOption += ' -e ' + ' --monotonic-count ' + \
+                    str(fmp.MonotonicCount) + ' -o ' + \
+                    CapOutputTmp + ' ' + CapInputFile
                 CmdList = CmdOption.split()
-                GenFdsGlobalVariable.CallExternalTool(CmdList, "Failed to generate FMP auth capsule")
+                GenFdsGlobalVariable.CallExternalTool(
+                    CmdList, "Failed to generate FMP auth capsule")
                 if uuid.UUID(fmp.Certificate_Guid) == EFI_CERT_TYPE_PKCS7_GUID:
-                    dwLength = 4 + 2 + 2 + 16 + os.path.getsize(CapOutputTmp) - os.path.getsize(CapInputFile)
+                    dwLength = 4 + 2 + 2 + 16 + \
+                        os.path.getsize(CapOutputTmp) - \
+                        os.path.getsize(CapInputFile)
                 else:
                     dwLength = 4 + 2 + 2 + 16 + 16 + 256 + 256
                 fmp.ImageFile = CapOutputTmp
-                AuthData = [fmp.MonotonicCount, dwLength, WIN_CERT_REVISION, WIN_CERT_TYPE_EFI_GUID, fmp.Certificate_Guid]
+                AuthData = [fmp.MonotonicCount, dwLength, WIN_CERT_REVISION,
+                            WIN_CERT_TYPE_EFI_GUID, fmp.Certificate_Guid]
                 fmp.Buffer = fmp.GenCapsuleSubItem(AuthData)
             else:
                 fmp.Buffer = fmp.GenCapsuleSubItem()
@@ -181,11 +199,12 @@ class Capsule (CapsuleClassObject):
         #
         # Generate FMP capsule file
         #
-        CapOutputFile = os.path.join(GenFdsGlobalVariable.FvDir, self.UiCapsuleName) + '.Cap'
+        CapOutputFile = os.path.join(
+            GenFdsGlobalVariable.FvDir, self.UiCapsuleName) + '.Cap'
         SaveFileOnChange(CapOutputFile, Header.getvalue(), True)
         return CapOutputFile
 
-    ## Generate capsule
+    # Generate capsule
     #
     #   @param  self        The object pointer
     #   @retval string      Generated Capsule file path
@@ -194,9 +213,10 @@ class Capsule (CapsuleClassObject):
         if self.UiCapsuleName.upper() + 'cap' in GenFdsGlobalVariable.ImageBinDict:
             return GenFdsGlobalVariable.ImageBinDict[self.UiCapsuleName.upper() + 'cap']
 
-        GenFdsGlobalVariable.InfLogger( "\nGenerate %s Capsule" %self.UiCapsuleName)
+        GenFdsGlobalVariable.InfLogger(
+            "\nGenerate %s Capsule" % self.UiCapsuleName)
         if ('CAPSULE_GUID' in self.TokensDict and
-            uuid.UUID(self.TokensDict['CAPSULE_GUID']) == uuid.UUID('6DCBD5ED-E82D-4C44-BDA1-7194199AD92A')):
+                uuid.UUID(self.TokensDict['CAPSULE_GUID']) == uuid.UUID('6DCBD5ED-E82D-4C44-BDA1-7194199AD92A')):
             return self.GenFmpCapsule()
 
         CapInfFile = self.GenCapInf()
@@ -207,44 +227,47 @@ class Capsule (CapsuleClassObject):
             FileName = CapsuleDataObj.GenCapsuleSubItem()
             CapsuleDataObj.CapsuleName = None
             CapFileList.append(FileName)
-            CapInfFile.append("EFI_FILE_NAME = " + \
-                                   FileName      + \
-                                   TAB_LINE_BREAK)
+            CapInfFile.append("EFI_FILE_NAME = " +
+                              FileName +
+                              TAB_LINE_BREAK)
         SaveFileOnChange(self.CapInfFileName, ''.join(CapInfFile), False)
         #
         # Call GenFv tool to generate capsule
         #
-        CapOutputFile = os.path.join(GenFdsGlobalVariable.FvDir, self.UiCapsuleName)
+        CapOutputFile = os.path.join(
+            GenFdsGlobalVariable.FvDir, self.UiCapsuleName)
         CapOutputFile = CapOutputFile + '.Cap'
         GenFdsGlobalVariable.GenerateFirmwareVolume(
-                                CapOutputFile,
-                                [self.CapInfFileName],
-                                Capsule=True,
-                                FfsList=CapFileList
-                                )
+            CapOutputFile,
+            [self.CapInfFileName],
+            Capsule=True,
+            FfsList=CapFileList
+        )
 
-        GenFdsGlobalVariable.VerboseLogger( "\nGenerate %s Capsule Successfully" %self.UiCapsuleName)
+        GenFdsGlobalVariable.VerboseLogger(
+            "\nGenerate %s Capsule Successfully" % self.UiCapsuleName)
         GenFdsGlobalVariable.SharpCounter = 0
-        GenFdsGlobalVariable.ImageBinDict[self.UiCapsuleName.upper() + 'cap'] = CapOutputFile
+        GenFdsGlobalVariable.ImageBinDict[self.UiCapsuleName.upper(
+        ) + 'cap'] = CapOutputFile
         return CapOutputFile
 
-    ## Generate inf file for capsule
+    # Generate inf file for capsule
     #
     #   @param  self        The object pointer
     #   @retval file        inf file object
     #
     def GenCapInf(self):
         self.CapInfFileName = os.path.join(GenFdsGlobalVariable.FvDir,
-                                   self.UiCapsuleName +  "_Cap" + '.inf')
+                                           self.UiCapsuleName + "_Cap" + '.inf')
         CapInfFile = []
 
         CapInfFile.append("[options]" + TAB_LINE_BREAK)
 
         for Item in self.TokensDict:
-            CapInfFile.append("EFI_"                    + \
-                                  Item                      + \
-                                  ' = '                     + \
-                                  self.TokensDict[Item]     + \
-                                  TAB_LINE_BREAK)
+            CapInfFile.append("EFI_" +
+                              Item +
+                              ' = ' +
+                              self.TokensDict[Item] +
+                              TAB_LINE_BREAK)
 
         return CapInfFile
diff --git a/BaseTools/Source/Python/GenFds/CapsuleData.py b/BaseTools/Source/Python/GenFds/CapsuleData.py
index ebbde7f8708c..f9b1569635aa 100644
--- a/BaseTools/Source/Python/GenFds/CapsuleData.py
+++ b/BaseTools/Source/Python/GenFds/CapsuleData.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # generate capsule
 #
 #  Copyright (c) 2007-2018, Intel Corporation. All rights reserved.<BR>
@@ -17,35 +17,39 @@ import os
 from Common.Misc import SaveFileOnChange
 import uuid
 
-## base class for capsule data
+# base class for capsule data
 #
 #
+
+
 class CapsuleData:
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     def __init__(self):
         pass
 
-    ## generate capsule data
+    # generate capsule data
     #
     #   @param  self        The object pointer
     def GenCapsuleSubItem(self):
         pass
 
-## FFS class for capsule data
+# FFS class for capsule data
 #
 #
+
+
 class CapsuleFfs (CapsuleData):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
-    def __init__(self) :
+    def __init__(self):
         self.Ffs = None
         self.FvName = None
 
-    ## generate FFS capsule data
+    # generate FFS capsule data
     #
     #   @param  self        The object pointer
     #   @retval string      Generated file name
@@ -54,20 +58,22 @@ class CapsuleFfs (CapsuleData):
         FfsFile = self.Ffs.GenFfs()
         return FfsFile
 
-## FV class for capsule data
+# FV class for capsule data
 #
 #
+
+
 class CapsuleFv (CapsuleData):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
-    def __init__(self) :
+    def __init__(self):
         self.Ffs = None
         self.FvName = None
         self.CapsuleName = None
 
-    ## generate FV capsule data
+    # generate FV capsule data
     #
     #   @param  self        The object pointer
     #   @retval string      Generated file name
@@ -75,7 +81,8 @@ class CapsuleFv (CapsuleData):
     def GenCapsuleSubItem(self):
         if self.FvName.find('.fv') == -1:
             if self.FvName.upper() in GenFdsGlobalVariable.FdfParser.Profile.FvDict:
-                FvObj = GenFdsGlobalVariable.FdfParser.Profile.FvDict[self.FvName.upper()]
+                FvObj = GenFdsGlobalVariable.FdfParser.Profile.FvDict[self.FvName.upper(
+                )]
                 FdBuffer = BytesIO()
                 FvObj.CapsuleName = self.CapsuleName
                 FvFile = FvObj.AddToBuffer(FdBuffer)
@@ -86,20 +93,22 @@ class CapsuleFv (CapsuleData):
             FvFile = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.FvName)
             return FvFile
 
-## FD class for capsule data
+# FD class for capsule data
 #
 #
+
+
 class CapsuleFd (CapsuleData):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
-    def __init__(self) :
+    def __init__(self):
         self.Ffs = None
         self.FdName = None
         self.CapsuleName = None
 
-    ## generate FD capsule data
+    # generate FD capsule data
     #
     #   @param  self        The object pointer
     #   @retval string      Generated file name
@@ -107,26 +116,29 @@ class CapsuleFd (CapsuleData):
     def GenCapsuleSubItem(self):
         if self.FdName.find('.fd') == -1:
             if self.FdName.upper() in GenFdsGlobalVariable.FdfParser.Profile.FdDict:
-                FdObj = GenFdsGlobalVariable.FdfParser.Profile.FdDict[self.FdName.upper()]
+                FdObj = GenFdsGlobalVariable.FdfParser.Profile.FdDict[self.FdName.upper(
+                )]
                 FdFile = FdObj.GenFd()
                 return FdFile
         else:
             FdFile = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.FdName)
             return FdFile
 
-## AnyFile class for capsule data
+# AnyFile class for capsule data
 #
 #
+
+
 class CapsuleAnyFile (CapsuleData):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
-    def __init__(self) :
+    def __init__(self):
         self.Ffs = None
         self.FileName = None
 
-    ## generate AnyFile capsule data
+    # generate AnyFile capsule data
     #
     #   @param  self        The object pointer
     #   @retval string      Generated file name
@@ -134,19 +146,21 @@ class CapsuleAnyFile (CapsuleData):
     def GenCapsuleSubItem(self):
         return self.FileName
 
-## Afile class for capsule data
+# Afile class for capsule data
 #
 #
+
+
 class CapsuleAfile (CapsuleData):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
-    def __init__(self) :
+    def __init__(self):
         self.Ffs = None
         self.FileName = None
 
-    ## generate Afile capsule data
+    # generate Afile capsule data
     #
     #   @param  self        The object pointer
     #   @retval string      Generated file name
@@ -154,6 +168,7 @@ class CapsuleAfile (CapsuleData):
     def GenCapsuleSubItem(self):
         return self.FileName
 
+
 class CapsulePayload(CapsuleData):
     '''Generate payload file, the header is defined below:
     #pragma pack(1)
@@ -167,6 +182,7 @@ class CapsulePayload(CapsuleData):
         UINT64 UpdateHardwareInstance; //Introduced in v2
     } EFI_FIRMWARE_MANAGEMENT_CAPSULE_IMAGE_HEADER;
     '''
+
     def __init__(self):
         self.UiName = None
         self.Version = None
@@ -201,28 +217,29 @@ class CapsulePayload(CapsuleData):
         #
         Guid = self.ImageTypeId.split('-')
         Buffer = pack('=ILHHBBBBBBBBBBBBIIQ',
-                       int(self.Version, 16),
-                       int(Guid[0], 16),
-                       int(Guid[1], 16),
-                       int(Guid[2], 16),
-                       int(Guid[3][-4:-2], 16),
-                       int(Guid[3][-2:], 16),
-                       int(Guid[4][-12:-10], 16),
-                       int(Guid[4][-10:-8], 16),
-                       int(Guid[4][-8:-6], 16),
-                       int(Guid[4][-6:-4], 16),
-                       int(Guid[4][-4:-2], 16),
-                       int(Guid[4][-2:], 16),
-                       int(self.ImageIndex, 16),
-                       0,
-                       0,
-                       0,
-                       ImageFileSize,
-                       VendorFileSize,
-                       int(self.HardwareInstance, 16)
-                       )
+                      int(self.Version, 16),
+                      int(Guid[0], 16),
+                      int(Guid[1], 16),
+                      int(Guid[2], 16),
+                      int(Guid[3][-4:-2], 16),
+                      int(Guid[3][-2:], 16),
+                      int(Guid[4][-12:-10], 16),
+                      int(Guid[4][-10:-8], 16),
+                      int(Guid[4][-8:-6], 16),
+                      int(Guid[4][-6:-4], 16),
+                      int(Guid[4][-4:-2], 16),
+                      int(Guid[4][-2:], 16),
+                      int(self.ImageIndex, 16),
+                      0,
+                      0,
+                      0,
+                      ImageFileSize,
+                      VendorFileSize,
+                      int(self.HardwareInstance, 16)
+                      )
         if AuthData:
-            Buffer += pack('QIHH', AuthData[0], AuthData[1], AuthData[2], AuthData[3])
+            Buffer += pack('QIHH', AuthData[0],
+                           AuthData[1], AuthData[2], AuthData[3])
             Buffer += uuid.UUID(AuthData[4]).bytes_le
 
         #
diff --git a/BaseTools/Source/Python/GenFds/CompressSection.py b/BaseTools/Source/Python/GenFds/CompressSection.py
index e62280fc16c2..f248cb4ceaa1 100644
--- a/BaseTools/Source/Python/GenFds/CompressSection.py
+++ b/BaseTools/Source/Python/GenFds/CompressSection.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # process compress section generation
 #
 #  Copyright (c) 2007 - 2017, Intel Corporation. All rights reserved.<BR>
@@ -18,25 +18,27 @@ from .GenFdsGlobalVariable import GenFdsGlobalVariable
 from CommonDataClass.FdfClass import CompressSectionClassObject
 from Common.DataType import *
 
-## generate compress section
+# generate compress section
 #
 #
-class CompressSection (CompressSectionClassObject) :
 
-    ## compress types: PI standard and non PI standard
+
+class CompressSection (CompressSectionClassObject):
+
+    # compress types: PI standard and non PI standard
     CompTypeDict = {
-        'PI_STD'  : 'PI_STD',
-        'PI_NONE' : 'PI_NONE'
+        'PI_STD': 'PI_STD',
+        'PI_NONE': 'PI_NONE'
     }
 
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
     def __init__(self):
         CompressSectionClassObject.__init__(self)
 
-    ## GenSection() method
+    # GenSection() method
     #
     #   Generate compressed section
     #
@@ -49,7 +51,7 @@ class CompressSection (CompressSectionClassObject) :
     #   @param  Dict        dictionary contains macro and its value
     #   @retval tuple       (Generated file name, section alignment)
     #
-    def GenSection(self, OutputPath, ModuleName, SecNum, KeyStringList, FfsInf = None, Dict = None, IsMakefile = False):
+    def GenSection(self, OutputPath, ModuleName, SecNum, KeyStringList, FfsInf=None, Dict=None, IsMakefile=False):
 
         if FfsInf is not None:
             self.CompType = FfsInf.__ExtendMacro__(self.CompType)
@@ -63,12 +65,13 @@ class CompressSection (CompressSectionClassObject) :
             Dict = {}
         for Sect in self.SectionList:
             Index = Index + 1
-            SecIndex = '%s.%d' %(SecNum, Index)
-            ReturnSectList, AlignValue = Sect.GenSection(OutputPath, ModuleName, SecIndex, KeyStringList, FfsInf, Dict, IsMakefile=IsMakefile)
+            SecIndex = '%s.%d' % (SecNum, Index)
+            ReturnSectList, AlignValue = Sect.GenSection(
+                OutputPath, ModuleName, SecIndex, KeyStringList, FfsInf, Dict, IsMakefile=IsMakefile)
             if AlignValue is not None:
                 if MaxAlign is None:
                     MaxAlign = AlignValue
-                if GenFdsGlobalVariable.GetAlignment (AlignValue) > GenFdsGlobalVariable.GetAlignment (MaxAlign):
+                if GenFdsGlobalVariable.GetAlignment(AlignValue) > GenFdsGlobalVariable.GetAlignment(MaxAlign):
                     MaxAlign = AlignValue
             if ReturnSectList != []:
                 if AlignValue is None:
@@ -78,19 +81,18 @@ class CompressSection (CompressSectionClassObject) :
                     SectAlign.append(AlignValue)
 
         OutputFile = OutputPath + \
-                     os.sep     + \
-                     ModuleName + \
-                     SUP_MODULE_SEC      + \
-                     SecNum     + \
-                     SectionSuffix['COMPRESS']
+            os.sep + \
+            ModuleName + \
+            SUP_MODULE_SEC + \
+            SecNum + \
+            SectionSuffix['COMPRESS']
         OutputFile = os.path.normpath(OutputFile)
         DummyFile = OutputFile + '.dummy'
-        GenFdsGlobalVariable.GenerateSection(DummyFile, SectFiles, InputAlign=SectAlign, IsMakefile=IsMakefile)
+        GenFdsGlobalVariable.GenerateSection(
+            DummyFile, SectFiles, InputAlign=SectAlign, IsMakefile=IsMakefile)
 
         GenFdsGlobalVariable.GenerateSection(OutputFile, [DummyFile], Section.Section.SectionType['COMPRESS'],
                                              CompressionType=self.CompTypeDict[self.CompType], IsMakefile=IsMakefile)
         OutputFileList = []
         OutputFileList.append(OutputFile)
         return OutputFileList, self.Alignment
-
-
diff --git a/BaseTools/Source/Python/GenFds/DataSection.py b/BaseTools/Source/Python/GenFds/DataSection.py
index 5af3ee7b7f7c..d7c859649863 100644
--- a/BaseTools/Source/Python/GenFds/DataSection.py
+++ b/BaseTools/Source/Python/GenFds/DataSection.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # process data section generation
 #
 #  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -20,18 +20,20 @@ from Common.Misc import PeImageClass
 from Common.LongFilePathSupport import CopyLongFilePath
 from Common.DataType import *
 
-## generate data section
+# generate data section
 #
 #
+
+
 class DataSection (DataSectionClassObject):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
     def __init__(self):
         DataSectionClassObject.__init__(self)
 
-    ## GenSection() method
+    # GenSection() method
     #
     #   Generate compressed section
     #
@@ -44,24 +46,28 @@ class DataSection (DataSectionClassObject):
     #   @param  Dict        dictionary contains macro and its value
     #   @retval tuple       (Generated file name list, section alignment)
     #
-    def GenSection(self, OutputPath, ModuleName, SecNum, keyStringList, FfsFile = None, Dict = None, IsMakefile = False):
+    def GenSection(self, OutputPath, ModuleName, SecNum, keyStringList, FfsFile=None, Dict=None, IsMakefile=False):
         #
         # Prepare the parameter of GenSection
         #
         if Dict is None:
             Dict = {}
         if FfsFile is not None:
-            self.SectFileName = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.SectFileName)
-            self.SectFileName = GenFdsGlobalVariable.MacroExtend(self.SectFileName, Dict, FfsFile.CurrentArch)
+            self.SectFileName = GenFdsGlobalVariable.ReplaceWorkspaceMacro(
+                self.SectFileName)
+            self.SectFileName = GenFdsGlobalVariable.MacroExtend(
+                self.SectFileName, Dict, FfsFile.CurrentArch)
         else:
-            self.SectFileName = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.SectFileName)
-            self.SectFileName = GenFdsGlobalVariable.MacroExtend(self.SectFileName, Dict)
+            self.SectFileName = GenFdsGlobalVariable.ReplaceWorkspaceMacro(
+                self.SectFileName)
+            self.SectFileName = GenFdsGlobalVariable.MacroExtend(
+                self.SectFileName, Dict)
 
         """Check Section file exist or not !"""
 
         if not os.path.exists(self.SectFileName):
-            self.SectFileName = os.path.join (GenFdsGlobalVariable.WorkSpaceDir,
-                                              self.SectFileName)
+            self.SectFileName = os.path.join(GenFdsGlobalVariable.WorkSpaceDir,
+                                             self.SectFileName)
 
         """Copy Map file to Ffs output"""
         Filename = GenFdsGlobalVariable.MacroExtend(self.SectFileName)
@@ -72,13 +78,14 @@ class DataSection (DataSectionClassObject):
                 if GenFdsGlobalVariable.CopyList == []:
                     GenFdsGlobalVariable.CopyList = [(MapFile, CopyMapFile)]
                 else:
-                    GenFdsGlobalVariable.CopyList.append((MapFile, CopyMapFile))
+                    GenFdsGlobalVariable.CopyList.append(
+                        (MapFile, CopyMapFile))
             else:
                 if os.path.exists(MapFile):
                     if not os.path.exists(CopyMapFile) or (os.path.getmtime(MapFile) > os.path.getmtime(CopyMapFile)):
                         CopyLongFilePath(MapFile, CopyMapFile)
 
-        #Get PE Section alignment when align is set to AUTO
+        # Get PE Section alignment when align is set to AUTO
         if self.Alignment == 'Auto' and self.SecType in (BINARY_FILE_TYPE_TE, BINARY_FILE_TYPE_PE32):
             self.Alignment = "0"
         NoStrip = True
@@ -89,29 +96,31 @@ class DataSection (DataSectionClassObject):
         if not NoStrip:
             FileBeforeStrip = os.path.join(OutputPath, ModuleName + '.efi')
             if not os.path.exists(FileBeforeStrip) or \
-                (os.path.getmtime(self.SectFileName) > os.path.getmtime(FileBeforeStrip)):
+                    (os.path.getmtime(self.SectFileName) > os.path.getmtime(FileBeforeStrip)):
                 CopyLongFilePath(self.SectFileName, FileBeforeStrip)
             StrippedFile = os.path.join(OutputPath, ModuleName + '.stripped')
             GenFdsGlobalVariable.GenerateFirmwareImage(
-                    StrippedFile,
-                    [GenFdsGlobalVariable.MacroExtend(self.SectFileName, Dict)],
-                    Strip=True,
-                    IsMakefile = IsMakefile
-                )
+                StrippedFile,
+                [GenFdsGlobalVariable.MacroExtend(self.SectFileName, Dict)],
+                Strip=True,
+                IsMakefile=IsMakefile
+            )
             self.SectFileName = StrippedFile
 
         if self.SecType == BINARY_FILE_TYPE_TE:
-            TeFile = os.path.join( OutputPath, ModuleName + 'Te.raw')
+            TeFile = os.path.join(OutputPath, ModuleName + 'Te.raw')
             GenFdsGlobalVariable.GenerateFirmwareImage(
-                    TeFile,
-                    [GenFdsGlobalVariable.MacroExtend(self.SectFileName, Dict)],
-                    Type='te',
-                    IsMakefile = IsMakefile
-                )
+                TeFile,
+                [GenFdsGlobalVariable.MacroExtend(self.SectFileName, Dict)],
+                Type='te',
+                IsMakefile=IsMakefile
+            )
             self.SectFileName = TeFile
 
-        OutputFile = os.path.join (OutputPath, ModuleName + SUP_MODULE_SEC + SecNum + SectionSuffix.get(self.SecType))
+        OutputFile = os.path.join(
+            OutputPath, ModuleName + SUP_MODULE_SEC + SecNum + SectionSuffix.get(self.SecType))
         OutputFile = os.path.normpath(OutputFile)
-        GenFdsGlobalVariable.GenerateSection(OutputFile, [self.SectFileName], Section.Section.SectionType.get(self.SecType), IsMakefile = IsMakefile)
+        GenFdsGlobalVariable.GenerateSection(OutputFile, [
+                                             self.SectFileName], Section.Section.SectionType.get(self.SecType), IsMakefile=IsMakefile)
         FileList = [OutputFile]
         return FileList, self.Alignment
diff --git a/BaseTools/Source/Python/GenFds/DepexSection.py b/BaseTools/Source/Python/GenFds/DepexSection.py
index 6cabac38c496..fabbcba79e37 100644
--- a/BaseTools/Source/Python/GenFds/DepexSection.py
+++ b/BaseTools/Source/Python/GenFds/DepexSection.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # process depex section generation
 #
 #  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -20,11 +20,13 @@ from Common.BuildToolError import *
 from Common.Misc import PathClass
 from Common.DataType import *
 
-## generate data section
+# generate data section
 #
 #
+
+
 class DepexSection (DepexSectionClassObject):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -39,11 +41,11 @@ class DepexSection (DepexSectionClassObject):
                                                                     GenFdsGlobalVariable.ToolChainTag)
             for Inf in GenFdsGlobalVariable.FdfParser.Profile.InfList:
                 ModuleData = GenFdsGlobalVariable.WorkSpace.BuildObject[
-                                                            PathClass(Inf, GenFdsGlobalVariable.WorkSpaceDir),
-                                                            Arch,
-                                                            GenFdsGlobalVariable.TargetName,
-                                                            GenFdsGlobalVariable.ToolChainTag
-                                                            ]
+                    PathClass(Inf, GenFdsGlobalVariable.WorkSpaceDir),
+                    Arch,
+                    GenFdsGlobalVariable.TargetName,
+                    GenFdsGlobalVariable.ToolChainTag
+                ]
                 for Pkg in ModuleData.Packages:
                     if Pkg not in PkgList:
                         PkgList.append(Pkg)
@@ -56,7 +58,7 @@ class DepexSection (DepexSectionClassObject):
                     return PkgDb.Guids[CName]
         return None
 
-    ## GenSection() method
+    # GenSection() method
     #
     #   Generate compressed section
     #
@@ -69,9 +71,10 @@ class DepexSection (DepexSectionClassObject):
     #   @param  Dict        dictionary contains macro and its value
     #   @retval tuple       (Generated file name list, section alignment)
     #
-    def GenSection(self, OutputPath, ModuleName, SecNum, keyStringList, FfsFile = None, Dict = None, IsMakefile = False):
+    def GenSection(self, OutputPath, ModuleName, SecNum, keyStringList, FfsFile=None, Dict=None, IsMakefile=False):
         if self.ExpressionProcessed == False:
-            self.Expression = self.Expression.replace("\n", " ").replace("\r", " ")
+            self.Expression = self.Expression.replace(
+                "\n", " ").replace("\r", " ")
             ExpList = self.Expression.split()
 
             for Exp in ExpList:
@@ -88,24 +91,27 @@ class DepexSection (DepexSectionClassObject):
 
         if self.DepexType == 'PEI_DEPEX_EXP':
             ModuleType = SUP_MODULE_PEIM
-            SecType    = BINARY_FILE_TYPE_PEI_DEPEX
+            SecType = BINARY_FILE_TYPE_PEI_DEPEX
         elif self.DepexType == 'DXE_DEPEX_EXP':
             ModuleType = SUP_MODULE_DXE_DRIVER
-            SecType    = BINARY_FILE_TYPE_DXE_DEPEX
+            SecType = BINARY_FILE_TYPE_DXE_DEPEX
         elif self.DepexType == 'SMM_DEPEX_EXP':
             ModuleType = SUP_MODULE_DXE_SMM_DRIVER
-            SecType    = BINARY_FILE_TYPE_SMM_DEPEX
+            SecType = BINARY_FILE_TYPE_SMM_DEPEX
         else:
             EdkLogger.error("GenFds", FORMAT_INVALID,
                             "Depex type %s is not valid for module %s" % (self.DepexType, ModuleName))
 
-        InputFile = os.path.join (OutputPath, ModuleName + SUP_MODULE_SEC + SecNum + '.depex')
+        InputFile = os.path.join(
+            OutputPath, ModuleName + SUP_MODULE_SEC + SecNum + '.depex')
         InputFile = os.path.normpath(InputFile)
         Depex = DependencyExpression(self.Expression, ModuleType)
         Depex.Generate(InputFile)
 
-        OutputFile = os.path.join (OutputPath, ModuleName + SUP_MODULE_SEC + SecNum + '.dpx')
+        OutputFile = os.path.join(
+            OutputPath, ModuleName + SUP_MODULE_SEC + SecNum + '.dpx')
         OutputFile = os.path.normpath(OutputFile)
 
-        GenFdsGlobalVariable.GenerateSection(OutputFile, [InputFile], Section.Section.SectionType.get (SecType), IsMakefile=IsMakefile)
+        GenFdsGlobalVariable.GenerateSection(OutputFile, [
+                                             InputFile], Section.Section.SectionType.get(SecType), IsMakefile=IsMakefile)
         return [OutputFile], self.Alignment
diff --git a/BaseTools/Source/Python/GenFds/EfiSection.py b/BaseTools/Source/Python/GenFds/EfiSection.py
index fd58391dac99..8db984e83c71 100644
--- a/BaseTools/Source/Python/GenFds/EfiSection.py
+++ b/BaseTools/Source/Python/GenFds/EfiSection.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # process rule section generation
 #
 #  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -24,19 +24,21 @@ from Common.LongFilePathSupport import OpenLongFilePath as open
 from Common.LongFilePathSupport import CopyLongFilePath
 from Common.DataType import *
 
-## generate rule section
+# generate rule section
 #
 #
+
+
 class EfiSection (EfiSectionClassObject):
 
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
     def __init__(self):
-          EfiSectionClassObject.__init__(self)
+        EfiSectionClassObject.__init__(self)
 
-    ## GenSection() method
+    # GenSection() method
     #
     #   Generate rule section
     #
@@ -49,12 +51,12 @@ class EfiSection (EfiSectionClassObject):
     #   @param  Dict        dictionary contains macro and its value
     #   @retval tuple       (Generated file name list, section alignment)
     #
-    def GenSection(self, OutputPath, ModuleName, SecNum, KeyStringList, FfsInf = None, Dict = None, IsMakefile = False) :
+    def GenSection(self, OutputPath, ModuleName, SecNum, KeyStringList, FfsInf=None, Dict=None, IsMakefile=False):
 
         if self.FileName is not None and self.FileName.startswith('PCD('):
             self.FileName = GenFdsGlobalVariable.GetPcdValue(self.FileName)
         """Prepare the parameter of GenSection"""
-        if FfsInf is not None :
+        if FfsInf is not None:
             InfFileName = FfsInf.InfFileName
             SectionType = FfsInf.__ExtendMacro__(self.SectionType)
             Filename = FfsInf.__ExtendMacro__(self.FileName)
@@ -72,7 +74,8 @@ class EfiSection (EfiSectionClassObject):
                 elif FfsInf.ShadowFromInfFile is not None:
                     NoStrip = FfsInf.ShadowFromInfFile
         else:
-            EdkLogger.error("GenFds", GENFDS_ERROR, "Module %s apply rule for None!" %ModuleName)
+            EdkLogger.error("GenFds", GENFDS_ERROR,
+                            "Module %s apply rule for None!" % ModuleName)
 
         """If the file name was pointed out, add it in FileList"""
         FileList = []
@@ -84,7 +87,8 @@ class EfiSection (EfiSectionClassObject):
             if os.path.isabs(Filename):
                 Filename = os.path.normpath(Filename)
             else:
-                Filename = os.path.normpath(os.path.join(FfsInf.EfiOutputPath, Filename))
+                Filename = os.path.normpath(
+                    os.path.join(FfsInf.EfiOutputPath, Filename))
 
             if not self.Optional:
                 FileList.append(Filename)
@@ -95,8 +99,9 @@ class EfiSection (EfiSectionClassObject):
                 if '.depex' in SuffixMap:
                     FileList.append(Filename)
         else:
-            FileList, IsSect = Section.Section.GetFileList(FfsInf, self.FileType, self.FileExtension, Dict, IsMakefile=IsMakefile, SectionType=SectionType)
-            if IsSect :
+            FileList, IsSect = Section.Section.GetFileList(
+                FfsInf, self.FileType, self.FileExtension, Dict, IsMakefile=IsMakefile, SectionType=SectionType)
+            if IsSect:
                 return FileList, self.Alignment
 
         Index = 0
@@ -120,18 +125,20 @@ class EfiSection (EfiSectionClassObject):
                     BuildNumTuple = tuple()
 
                 Num = SecNum
-                OutputFile = os.path.join( OutputPath, ModuleName + SUP_MODULE_SEC + str(Num) + SectionSuffix.get(SectionType))
+                OutputFile = os.path.join(
+                    OutputPath, ModuleName + SUP_MODULE_SEC + str(Num) + SectionSuffix.get(SectionType))
                 GenFdsGlobalVariable.GenerateSection(OutputFile, [], 'EFI_SECTION_VERSION',
-                                                    #Ui=StringData,
-                                                    Ver=BuildNum,
-                                                    IsMakefile=IsMakefile)
+                                                     # Ui=StringData,
+                                                     Ver=BuildNum,
+                                                     IsMakefile=IsMakefile)
                 OutputFileList.append(OutputFile)
 
             elif FileList != []:
                 for File in FileList:
                     Index = Index + 1
-                    Num = '%s.%d' %(SecNum, Index)
-                    OutputFile = os.path.join(OutputPath, ModuleName + SUP_MODULE_SEC + Num + SectionSuffix.get(SectionType))
+                    Num = '%s.%d' % (SecNum, Index)
+                    OutputFile = os.path.join(
+                        OutputPath, ModuleName + SUP_MODULE_SEC + Num + SectionSuffix.get(SectionType))
                     f = open(File, 'r')
                     VerString = f.read()
                     f.close()
@@ -139,9 +146,9 @@ class EfiSection (EfiSectionClassObject):
                     if BuildNum is not None and BuildNum != '':
                         BuildNumTuple = ('-j', BuildNum)
                     GenFdsGlobalVariable.GenerateSection(OutputFile, [], 'EFI_SECTION_VERSION',
-                                                        #Ui=VerString,
-                                                        Ver=BuildNum,
-                                                        IsMakefile=IsMakefile)
+                                                         # Ui=VerString,
+                                                         Ver=BuildNum,
+                                                         IsMakefile=IsMakefile)
                     OutputFileList.append(OutputFile)
 
             else:
@@ -152,19 +159,22 @@ class EfiSection (EfiSectionClassObject):
                     BuildNumTuple = tuple()
                 BuildNumString = ' ' + ' '.join(BuildNumTuple)
 
-                #if VerString == '' and
+                # if VerString == '' and
                 if BuildNumString == '':
-                    if self.Optional == True :
-                        GenFdsGlobalVariable.VerboseLogger( "Optional Section don't exist!")
+                    if self.Optional == True:
+                        GenFdsGlobalVariable.VerboseLogger(
+                            "Optional Section don't exist!")
                         return [], None
                     else:
-                        EdkLogger.error("GenFds", GENFDS_ERROR, "File: %s miss Version Section value" %InfFileName)
+                        EdkLogger.error(
+                            "GenFds", GENFDS_ERROR, "File: %s miss Version Section value" % InfFileName)
                 Num = SecNum
-                OutputFile = os.path.join( OutputPath, ModuleName + SUP_MODULE_SEC + str(Num) + SectionSuffix.get(SectionType))
+                OutputFile = os.path.join(
+                    OutputPath, ModuleName + SUP_MODULE_SEC + str(Num) + SectionSuffix.get(SectionType))
                 GenFdsGlobalVariable.GenerateSection(OutputFile, [], 'EFI_SECTION_VERSION',
-                                                    #Ui=VerString,
-                                                    Ver=BuildNum,
-                                                    IsMakefile=IsMakefile)
+                                                     # Ui=VerString,
+                                                     Ver=BuildNum,
+                                                     IsMakefile=IsMakefile)
                 OutputFileList.append(OutputFile)
 
         #
@@ -181,7 +191,8 @@ class EfiSection (EfiSectionClassObject):
                 Num = SecNum
                 if IsMakefile and StringData == ModuleNameStr:
                     StringData = "$(MODULE_NAME)"
-                OutputFile = os.path.join( OutputPath, ModuleName + SUP_MODULE_SEC + str(Num) + SectionSuffix.get(SectionType))
+                OutputFile = os.path.join(
+                    OutputPath, ModuleName + SUP_MODULE_SEC + str(Num) + SectionSuffix.get(SectionType))
                 GenFdsGlobalVariable.GenerateSection(OutputFile, [], 'EFI_SECTION_USER_INTERFACE',
                                                      Ui=StringData, IsMakefile=IsMakefile)
                 OutputFileList.append(OutputFile)
@@ -189,15 +200,16 @@ class EfiSection (EfiSectionClassObject):
             elif FileList != []:
                 for File in FileList:
                     Index = Index + 1
-                    Num = '%s.%d' %(SecNum, Index)
-                    OutputFile = os.path.join(OutputPath, ModuleName + SUP_MODULE_SEC + Num + SectionSuffix.get(SectionType))
+                    Num = '%s.%d' % (SecNum, Index)
+                    OutputFile = os.path.join(
+                        OutputPath, ModuleName + SUP_MODULE_SEC + Num + SectionSuffix.get(SectionType))
                     f = open(File, 'r')
                     UiString = f.read()
                     f.close()
                     if IsMakefile and UiString == ModuleNameStr:
                         UiString = "$(MODULE_NAME)"
                     GenFdsGlobalVariable.GenerateSection(OutputFile, [], 'EFI_SECTION_USER_INTERFACE',
-                                                        Ui=UiString, IsMakefile=IsMakefile)
+                                                         Ui=UiString, IsMakefile=IsMakefile)
                     OutputFileList.append(OutputFile)
             else:
                 if StringData is not None and len(StringData) > 0:
@@ -205,16 +217,19 @@ class EfiSection (EfiSectionClassObject):
                 else:
                     UiTuple = tuple()
 
-                    if self.Optional == True :
-                        GenFdsGlobalVariable.VerboseLogger( "Optional Section don't exist!")
+                    if self.Optional == True:
+                        GenFdsGlobalVariable.VerboseLogger(
+                            "Optional Section don't exist!")
                         return '', None
                     else:
-                        EdkLogger.error("GenFds", GENFDS_ERROR, "File: %s miss UI Section value" %InfFileName)
+                        EdkLogger.error(
+                            "GenFds", GENFDS_ERROR, "File: %s miss UI Section value" % InfFileName)
 
                 Num = SecNum
                 if IsMakefile and StringData == ModuleNameStr:
                     StringData = "$(MODULE_NAME)"
-                OutputFile = os.path.join( OutputPath, ModuleName + SUP_MODULE_SEC + str(Num) + SectionSuffix.get(SectionType))
+                OutputFile = os.path.join(
+                    OutputPath, ModuleName + SUP_MODULE_SEC + str(Num) + SectionSuffix.get(SectionType))
                 GenFdsGlobalVariable.GenerateSection(OutputFile, [], 'EFI_SECTION_USER_INTERFACE',
                                                      Ui=StringData, IsMakefile=IsMakefile)
                 OutputFileList.append(OutputFile)
@@ -226,15 +241,17 @@ class EfiSection (EfiSectionClassObject):
             """If File List is empty"""
             if FileList == []:
                 if self.Optional == True:
-                    GenFdsGlobalVariable.VerboseLogger("Optional Section don't exist!")
+                    GenFdsGlobalVariable.VerboseLogger(
+                        "Optional Section don't exist!")
                     return [], None
                 else:
-                    EdkLogger.error("GenFds", GENFDS_ERROR, "Output file for %s section could not be found for %s" % (SectionType, InfFileName))
+                    EdkLogger.error("GenFds", GENFDS_ERROR, "Output file for %s section could not be found for %s" % (
+                        SectionType, InfFileName))
 
             elif len(FileList) > 1:
                 EdkLogger.error("GenFds", GENFDS_ERROR,
                                 "Files suffixed with %s are not allowed to have more than one file in %s[Binaries] section" % (
-                                self.FileExtension, InfFileName))
+                                    self.FileExtension, InfFileName))
             else:
                 for File in FileList:
                     File = GenFdsGlobalVariable.MacroExtend(File, Dict)
@@ -242,77 +259,89 @@ class EfiSection (EfiSectionClassObject):
 
         else:
             """If File List is empty"""
-            if FileList == [] :
+            if FileList == []:
                 if self.Optional == True:
-                    GenFdsGlobalVariable.VerboseLogger("Optional Section don't exist!")
+                    GenFdsGlobalVariable.VerboseLogger(
+                        "Optional Section don't exist!")
                     return [], None
                 else:
-                    EdkLogger.error("GenFds", GENFDS_ERROR, "Output file for %s section could not be found for %s" % (SectionType, InfFileName))
+                    EdkLogger.error("GenFds", GENFDS_ERROR, "Output file for %s section could not be found for %s" % (
+                        SectionType, InfFileName))
 
             else:
                 """Convert the File to Section file one by one """
                 for File in FileList:
                     """ Copy Map file to FFS output path """
                     Index = Index + 1
-                    Num = '%s.%d' %(SecNum, Index)
-                    OutputFile = os.path.join( OutputPath, ModuleName + SUP_MODULE_SEC + Num + SectionSuffix.get(SectionType))
+                    Num = '%s.%d' % (SecNum, Index)
+                    OutputFile = os.path.join(
+                        OutputPath, ModuleName + SUP_MODULE_SEC + Num + SectionSuffix.get(SectionType))
                     File = GenFdsGlobalVariable.MacroExtend(File, Dict)
 
-                    #Get PE Section alignment when align is set to AUTO
+                    # Get PE Section alignment when align is set to AUTO
                     if self.Alignment == 'Auto' and (SectionType == BINARY_FILE_TYPE_PE32 or SectionType == BINARY_FILE_TYPE_TE):
                         Align = "0"
                     if File[(len(File)-4):] == '.efi' and FfsInf.InfModule.BaseName == os.path.basename(File)[:-4]:
                         MapFile = File.replace('.efi', '.map')
-                        CopyMapFile = os.path.join(OutputPath, ModuleName + '.map')
+                        CopyMapFile = os.path.join(
+                            OutputPath, ModuleName + '.map')
                         if IsMakefile:
                             if GenFdsGlobalVariable.CopyList == []:
-                                GenFdsGlobalVariable.CopyList = [(MapFile, CopyMapFile)]
+                                GenFdsGlobalVariable.CopyList = [
+                                    (MapFile, CopyMapFile)]
                             else:
-                                GenFdsGlobalVariable.CopyList.append((MapFile, CopyMapFile))
+                                GenFdsGlobalVariable.CopyList.append(
+                                    (MapFile, CopyMapFile))
                         else:
                             if os.path.exists(MapFile):
                                 if not os.path.exists(CopyMapFile) or \
-                                       (os.path.getmtime(MapFile) > os.path.getmtime(CopyMapFile)):
+                                        (os.path.getmtime(MapFile) > os.path.getmtime(CopyMapFile)):
                                     CopyLongFilePath(MapFile, CopyMapFile)
 
                     if not NoStrip:
-                        FileBeforeStrip = os.path.join(OutputPath, ModuleName + '.efi')
+                        FileBeforeStrip = os.path.join(
+                            OutputPath, ModuleName + '.efi')
                         if IsMakefile:
                             if GenFdsGlobalVariable.CopyList == []:
-                                GenFdsGlobalVariable.CopyList = [(File, FileBeforeStrip)]
+                                GenFdsGlobalVariable.CopyList = [
+                                    (File, FileBeforeStrip)]
                             else:
-                                GenFdsGlobalVariable.CopyList.append((File, FileBeforeStrip))
+                                GenFdsGlobalVariable.CopyList.append(
+                                    (File, FileBeforeStrip))
                         else:
                             if not os.path.exists(FileBeforeStrip) or \
-                                (os.path.getmtime(File) > os.path.getmtime(FileBeforeStrip)):
+                                    (os.path.getmtime(File) > os.path.getmtime(FileBeforeStrip)):
                                 CopyLongFilePath(File, FileBeforeStrip)
-                        StrippedFile = os.path.join(OutputPath, ModuleName + '.stripped')
+                        StrippedFile = os.path.join(
+                            OutputPath, ModuleName + '.stripped')
                         GenFdsGlobalVariable.GenerateFirmwareImage(
-                                StrippedFile,
-                                [File],
-                                Strip=True,
-                                IsMakefile = IsMakefile
-                            )
+                            StrippedFile,
+                            [File],
+                            Strip=True,
+                            IsMakefile=IsMakefile
+                        )
                         File = StrippedFile
 
                     """For TE Section call GenFw to generate TE image"""
 
                     if SectionType == BINARY_FILE_TYPE_TE:
-                        TeFile = os.path.join( OutputPath, ModuleName + 'Te.raw')
+                        TeFile = os.path.join(
+                            OutputPath, ModuleName + 'Te.raw')
                         GenFdsGlobalVariable.GenerateFirmwareImage(
-                                TeFile,
-                                [File],
-                                Type='te',
-                                IsMakefile = IsMakefile
-                            )
+                            TeFile,
+                            [File],
+                            Type='te',
+                            IsMakefile=IsMakefile
+                        )
                         File = TeFile
 
                     """Call GenSection"""
                     GenFdsGlobalVariable.GenerateSection(OutputFile,
-                                                        [File],
-                                                        Section.Section.SectionType.get (SectionType),
-                                                        IsMakefile=IsMakefile
-                                                        )
+                                                         [File],
+                                                         Section.Section.SectionType.get(
+                                                             SectionType),
+                                                         IsMakefile=IsMakefile
+                                                         )
                     OutputFileList.append(OutputFile)
 
         return OutputFileList, Align
diff --git a/BaseTools/Source/Python/GenFds/Fd.py b/BaseTools/Source/Python/GenFds/Fd.py
index 973936b6f273..03196f8b0d60 100644
--- a/BaseTools/Source/Python/GenFds/Fd.py
+++ b/BaseTools/Source/Python/GenFds/Fd.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # process FD generation
 #
 #  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -23,40 +23,46 @@ from Common.BuildToolError import *
 from Common.Misc import SaveFileOnChange
 from Common.DataType import BINARY_FILE_TYPE_FV
 
-## generate FD
+# generate FD
 #
 #
+
+
 class FD(FDClassObject):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
     def __init__(self):
         FDClassObject.__init__(self)
 
-    ## GenFd() method
+    # GenFd() method
     #
     #   Generate FD
     #
     #   @retval string      Generated FD file name
     #
-    def GenFd (self, Flag = False):
+    def GenFd(self, Flag=False):
         if self.FdUiName.upper() + 'fd' in GenFdsGlobalVariable.ImageBinDict:
             return GenFdsGlobalVariable.ImageBinDict[self.FdUiName.upper() + 'fd']
 
         #
         # Print Information
         #
-        FdFileName = os.path.join(GenFdsGlobalVariable.FvDir, self.FdUiName + '.fd')
+        FdFileName = os.path.join(
+            GenFdsGlobalVariable.FvDir, self.FdUiName + '.fd')
         if not Flag:
-            GenFdsGlobalVariable.InfLogger("\nFd File Name:%s (%s)" %(self.FdUiName, FdFileName))
+            GenFdsGlobalVariable.InfLogger(
+                "\nFd File Name:%s (%s)" % (self.FdUiName, FdFileName))
 
         Offset = 0x00
         for item in self.BlockSizeList:
-            Offset = Offset + item[0]  * item[1]
+            Offset = Offset + item[0] * item[1]
         if Offset != self.Size:
-            EdkLogger.error("GenFds", GENFDS_ERROR, 'FD %s Size not consistent with block array' % self.FdUiName)
-        GenFdsGlobalVariable.VerboseLogger('Following Fv will be add to Fd !!!')
+            EdkLogger.error(
+                "GenFds", GENFDS_ERROR, 'FD %s Size not consistent with block array' % self.FdUiName)
+        GenFdsGlobalVariable.VerboseLogger(
+            'Following Fv will be add to Fd !!!')
         for FvObj in GenFdsGlobalVariable.FdfParser.Profile.FvDict:
             GenFdsGlobalVariable.VerboseLogger(FvObj)
 
@@ -70,21 +76,23 @@ class FD(FDClassObject):
             PreviousRegionStart = -1
             PreviousRegionSize = 1
 
-            for RegionObj in self.RegionList :
+            for RegionObj in self.RegionList:
                 if RegionObj.RegionType == 'CAPSULE':
                     continue
                 if RegionObj.Offset + RegionObj.Size <= PreviousRegionStart:
                     pass
-                elif RegionObj.Offset <= PreviousRegionStart or (RegionObj.Offset >=PreviousRegionStart and RegionObj.Offset < PreviousRegionStart + PreviousRegionSize):
+                elif RegionObj.Offset <= PreviousRegionStart or (RegionObj.Offset >= PreviousRegionStart and RegionObj.Offset < PreviousRegionStart + PreviousRegionSize):
                     pass
                 elif RegionObj.Offset > PreviousRegionStart + PreviousRegionSize:
                     if not Flag:
-                        GenFdsGlobalVariable.InfLogger('Padding region starting from offset 0x%X, with size 0x%X' %(PreviousRegionStart + PreviousRegionSize, RegionObj.Offset - (PreviousRegionStart + PreviousRegionSize)))
+                        GenFdsGlobalVariable.InfLogger('Padding region starting from offset 0x%X, with size 0x%X' % (
+                            PreviousRegionStart + PreviousRegionSize, RegionObj.Offset - (PreviousRegionStart + PreviousRegionSize)))
                     PadRegion = Region.Region()
                     PadRegion.Offset = PreviousRegionStart + PreviousRegionSize
                     PadRegion.Size = RegionObj.Offset - PadRegion.Offset
                     if not Flag:
-                        PadRegion.AddToBuffer(TempFdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFdsGlobalVariable.ImageBinDict, self.DefineVarDict)
+                        PadRegion.AddToBuffer(TempFdBuffer, self.BaseAddress, self.BlockSizeList,
+                                              self.ErasePolarity, GenFdsGlobalVariable.ImageBinDict, self.DefineVarDict)
                 PreviousRegionStart = RegionObj.Offset
                 PreviousRegionSize = RegionObj.Size
                 #
@@ -92,29 +100,33 @@ class FD(FDClassObject):
                 #
                 if PreviousRegionSize > self.Size:
                     pass
-                GenFdsGlobalVariable.VerboseLogger('Call each region\'s AddToBuffer function')
-                RegionObj.AddToBuffer (TempFdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFdsGlobalVariable.ImageBinDict, self.DefineVarDict)
+                GenFdsGlobalVariable.VerboseLogger(
+                    'Call each region\'s AddToBuffer function')
+                RegionObj.AddToBuffer(TempFdBuffer, self.BaseAddress, self.BlockSizeList,
+                                      self.ErasePolarity, GenFdsGlobalVariable.ImageBinDict, self.DefineVarDict)
 
         FdBuffer = BytesIO()
         PreviousRegionStart = -1
         PreviousRegionSize = 1
-        for RegionObj in self.RegionList :
+        for RegionObj in self.RegionList:
             if RegionObj.Offset + RegionObj.Size <= PreviousRegionStart:
                 EdkLogger.error("GenFds", GENFDS_ERROR,
-                                'Region offset 0x%X in wrong order with Region starting from 0x%X, size 0x%X\nRegions in FDF must have offsets appear in ascending order.'\
+                                'Region offset 0x%X in wrong order with Region starting from 0x%X, size 0x%X\nRegions in FDF must have offsets appear in ascending order.'
                                 % (RegionObj.Offset, PreviousRegionStart, PreviousRegionSize))
-            elif RegionObj.Offset <= PreviousRegionStart or (RegionObj.Offset >=PreviousRegionStart and RegionObj.Offset < PreviousRegionStart + PreviousRegionSize):
+            elif RegionObj.Offset <= PreviousRegionStart or (RegionObj.Offset >= PreviousRegionStart and RegionObj.Offset < PreviousRegionStart + PreviousRegionSize):
                 EdkLogger.error("GenFds", GENFDS_ERROR,
-                                'Region offset 0x%X overlaps with Region starting from 0x%X, size 0x%X' \
+                                'Region offset 0x%X overlaps with Region starting from 0x%X, size 0x%X'
                                 % (RegionObj.Offset, PreviousRegionStart, PreviousRegionSize))
             elif RegionObj.Offset > PreviousRegionStart + PreviousRegionSize:
                 if not Flag:
-                    GenFdsGlobalVariable.InfLogger('Padding region starting from offset 0x%X, with size 0x%X' %(PreviousRegionStart + PreviousRegionSize, RegionObj.Offset - (PreviousRegionStart + PreviousRegionSize)))
+                    GenFdsGlobalVariable.InfLogger('Padding region starting from offset 0x%X, with size 0x%X' % (
+                        PreviousRegionStart + PreviousRegionSize, RegionObj.Offset - (PreviousRegionStart + PreviousRegionSize)))
                 PadRegion = Region.Region()
                 PadRegion.Offset = PreviousRegionStart + PreviousRegionSize
                 PadRegion.Size = RegionObj.Offset - PadRegion.Offset
                 if not Flag:
-                    PadRegion.AddToBuffer(FdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFdsGlobalVariable.ImageBinDict, self.DefineVarDict)
+                    PadRegion.AddToBuffer(FdBuffer, self.BaseAddress, self.BlockSizeList,
+                                          self.ErasePolarity, GenFdsGlobalVariable.ImageBinDict, self.DefineVarDict)
             PreviousRegionStart = RegionObj.Offset
             PreviousRegionSize = RegionObj.Size
             #
@@ -127,29 +139,25 @@ class FD(FDClassObject):
             #
             # Call each region's AddToBuffer function
             #
-            GenFdsGlobalVariable.VerboseLogger('Call each region\'s AddToBuffer function')
-            RegionObj.AddToBuffer (FdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity, GenFdsGlobalVariable.ImageBinDict, self.DefineVarDict, Flag=Flag)
+            GenFdsGlobalVariable.VerboseLogger(
+                'Call each region\'s AddToBuffer function')
+            RegionObj.AddToBuffer(FdBuffer, self.BaseAddress, self.BlockSizeList, self.ErasePolarity,
+                                  GenFdsGlobalVariable.ImageBinDict, self.DefineVarDict, Flag=Flag)
         #
         # Write the buffer contents to Fd file
         #
-        GenFdsGlobalVariable.VerboseLogger('Write the buffer contents to Fd file')
+        GenFdsGlobalVariable.VerboseLogger(
+            'Write the buffer contents to Fd file')
         if not Flag:
             SaveFileOnChange(FdFileName, FdBuffer.getvalue())
         FdBuffer.close()
-        GenFdsGlobalVariable.ImageBinDict[self.FdUiName.upper() + 'fd'] = FdFileName
+        GenFdsGlobalVariable.ImageBinDict[self.FdUiName.upper(
+        ) + 'fd'] = FdFileName
         return FdFileName
 
-    ## generate flash map file
+    # generate flash map file
     #
     #   @param  self        The object pointer
     #
-    def GenFlashMap (self):
+    def GenFlashMap(self):
         pass
-
-
-
-
-
-
-
-
diff --git a/BaseTools/Source/Python/GenFds/FdfParser.py b/BaseTools/Source/Python/GenFds/FdfParser.py
index 5c8263f9bcc9..0b83d3b98c50 100644
--- a/BaseTools/Source/Python/GenFds/FdfParser.py
+++ b/BaseTools/Source/Python/GenFds/FdfParser.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # parse FDF file
 #
 #  Copyright (c) 2007 - 2021, Intel Corporation. All rights reserved.<BR>
@@ -58,37 +58,47 @@ T_CHAR_DOUBLE_QUOTE = '\"'
 T_CHAR_SINGLE_QUOTE = '\''
 T_CHAR_BRACE_R = '}'
 
-SEPARATORS = {TAB_EQUAL_SPLIT, TAB_VALUE_SPLIT, TAB_COMMA_SPLIT, '{', T_CHAR_BRACE_R}
+SEPARATORS = {TAB_EQUAL_SPLIT, TAB_VALUE_SPLIT,
+              TAB_COMMA_SPLIT, '{', T_CHAR_BRACE_R}
 ALIGNMENTS = {"Auto", "8", "16", "32", "64", "128", "512", "1K", "4K", "32K", "64K", "128K",
-                                    "256K", "512K", "1M", "2M", "4M", "8M", "16M"}
+              "256K", "512K", "1M", "2M", "4M", "8M", "16M"}
 ALIGNMENT_NOAUTO = ALIGNMENTS - {"Auto"}
 CR_LB_SET = {T_CHAR_CR, TAB_LINE_BREAK}
 
-RegionSizePattern = compile("\s*(?P<base>(?:0x|0X)?[a-fA-F0-9]+)\s*\|\s*(?P<size>(?:0x|0X)?[a-fA-F0-9]+)\s*")
-RegionSizeGuidPattern = compile("\s*(?P<base>\w+\.\w+[\.\w\[\]]*)\s*\|\s*(?P<size>\w+\.\w+[\.\w\[\]]*)\s*")
+RegionSizePattern = compile(
+    "\s*(?P<base>(?:0x|0X)?[a-fA-F0-9]+)\s*\|\s*(?P<size>(?:0x|0X)?[a-fA-F0-9]+)\s*")
+RegionSizeGuidPattern = compile(
+    "\s*(?P<base>\w+\.\w+[\.\w\[\]]*)\s*\|\s*(?P<size>\w+\.\w+[\.\w\[\]]*)\s*")
 RegionOffsetPcdPattern = compile("\s*(?P<base>\w+\.\w+[\.\w\[\]]*)\s*$")
-ShortcutPcdPattern = compile("\s*\w+\s*=\s*(?P<value>(?:0x|0X)?[a-fA-F0-9]+)\s*\|\s*(?P<name>\w+\.\w+)\s*")
+ShortcutPcdPattern = compile(
+    "\s*\w+\s*=\s*(?P<value>(?:0x|0X)?[a-fA-F0-9]+)\s*\|\s*(?P<name>\w+\.\w+)\s*")
 BaseAddrValuePattern = compile('^0[xX][0-9a-fA-F]+')
 FileExtensionPattern = compile(r'([a-zA-Z][a-zA-Z0-9]*)')
-TokenFindPattern = compile(r'([a-zA-Z0-9\-]+|\$\(TARGET\)|\*)_([a-zA-Z0-9\-]+|\$\(TOOL_CHAIN_TAG\)|\*)_([a-zA-Z0-9\-]+|\$\(ARCH\)|\*)')
+TokenFindPattern = compile(
+    r'([a-zA-Z0-9\-]+|\$\(TARGET\)|\*)_([a-zA-Z0-9\-]+|\$\(TOOL_CHAIN_TAG\)|\*)_([a-zA-Z0-9\-]+|\$\(ARCH\)|\*)')
 AllIncludeFileList = []
 
 # Get the closest parent
-def GetParentAtLine (Line):
+
+
+def GetParentAtLine(Line):
     for Profile in AllIncludeFileList:
         if Profile.IsLineInFile(Line):
             return Profile
     return None
 
 # Check include loop
-def IsValidInclude (File, Line):
+
+
+def IsValidInclude(File, Line):
     for Profile in AllIncludeFileList:
         if Profile.IsLineInFile(Line) and Profile.FileName == File:
             return False
 
     return True
 
-def GetRealFileLine (File, Line):
+
+def GetRealFileLine(File, Line):
     InsertedLines = 0
     for Profile in AllIncludeFileList:
         if Profile.IsLineInFile(Line):
@@ -98,19 +108,21 @@ def GetRealFileLine (File, Line):
 
     return (File, Line - InsertedLines)
 
-## The exception class that used to report error messages when parsing FDF
+# The exception class that used to report error messages when parsing FDF
 #
 # Currently the "ToolName" is set to be "FdfParser".
 #
+
+
 class Warning (Exception):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Str         The message to record
     #   @param  File        The FDF name
     #   @param  Line        The Line number that error occurs
     #
-    def __init__(self, Str, File = None, Line = None):
+    def __init__(self, Str, File=None, Line=None):
         FileLineTuple = GetRealFileLine(File, Line)
         self.FileName = FileLineTuple[0]
         self.LineNumber = FileLineTuple[1]
@@ -126,25 +138,31 @@ class Warning (Exception):
     @staticmethod
     def Expected(Str, File, Line):
         return Warning("expected {}".format(Str), File, Line)
+
     @staticmethod
     def ExpectedEquals(File, Line):
         return Warning.Expected("'='", File, Line)
+
     @staticmethod
     def ExpectedCurlyOpen(File, Line):
         return Warning.Expected("'{'", File, Line)
+
     @staticmethod
     def ExpectedCurlyClose(File, Line):
         return Warning.Expected("'}'", File, Line)
+
     @staticmethod
     def ExpectedBracketClose(File, Line):
         return Warning.Expected("']'", File, Line)
 
-## The Include file content class that used to record file data when parsing include file
+# The Include file content class that used to record file data when parsing include file
 #
 # May raise Exception when opening file.
 #
+
+
 class IncludeFileProfile:
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  FileName    The file that to be parsed
@@ -164,7 +182,7 @@ class IncludeFileProfile:
         self.InsertStartLineNumber = None
         self.InsertAdjust = 0
         self.IncludeFileList = []
-        self.Level = 1 # first level include file
+        self.Level = 1  # first level include file
 
     def GetTotalLines(self):
         TotalLines = self.InsertAdjust + len(self.FileLinesList)
@@ -181,7 +199,7 @@ class IncludeFileProfile:
         return False
 
     def GetLineInFile(self, Line):
-        if not self.IsLineInFile (Line):
+        if not self.IsLineInFile(Line):
             return (self.FileName, -1)
 
         InsertedLines = self.InsertStartLineNumber
@@ -194,12 +212,14 @@ class IncludeFileProfile:
 
         return (self.FileName, Line - InsertedLines + 1)
 
-## The FDF content class that used to record file data when parsing FDF
+# The FDF content class that used to record file data when parsing FDF
 #
 # May raise Exception when opening file.
 #
+
+
 class FileProfile:
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  FileName    The file that to be parsed
@@ -217,7 +237,7 @@ class FileProfile:
         self.PcdDict = OrderedDict()
         self.PcdLocalDict = OrderedDict()
         self.InfList = []
-        self.InfDict = {'ArchTBD':[]}
+        self.InfDict = {'ArchTBD': []}
         # ECC will use this Dict and List information
         self.PcdFileLineDict = {}
         self.InfFileLineList = []
@@ -230,7 +250,7 @@ class FileProfile:
         self.OptRomDict = {}
         self.FmpPayloadDict = {}
 
-## The syntax parser for FDF
+# The syntax parser for FDF
 #
 # PreprocessFile method should be called prior to ParseFile
 # CycleReferenceCheck method can detect cycles in FDF contents
@@ -238,8 +258,10 @@ class FileProfile:
 # GetNext*** procedures mean these procedures will get next token first, then make judgement.
 # Get*** procedures mean these procedures will make judgement on current token only.
 #
+
+
 class FdfParser:
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  FileName    The file that to be parsed
@@ -266,7 +288,7 @@ class FdfParser:
         if GenFdsGlobalVariable.WorkSpaceDir == '':
             GenFdsGlobalVariable.WorkSpaceDir = os.getenv("WORKSPACE")
 
-    ## _SkipWhiteSpace() method
+    # _SkipWhiteSpace() method
     #
     #   Skip white spaces from current char.
     #
@@ -281,7 +303,7 @@ class FdfParser:
                 return
         return
 
-    ## _EndOfFile() method
+    # _EndOfFile() method
     #
     #   Judge current buffer pos is at file end
     #
@@ -298,7 +320,7 @@ class FdfParser:
             return True
         return False
 
-    ## _EndOfLine() method
+    # _EndOfLine() method
     #
     #   Judge current buffer pos is at line end
     #
@@ -309,12 +331,13 @@ class FdfParser:
     def _EndOfLine(self):
         if self.CurrentLineNumber > len(self.Profile.FileLinesList):
             return True
-        SizeOfCurrentLine = len(self.Profile.FileLinesList[self.CurrentLineNumber - 1])
+        SizeOfCurrentLine = len(
+            self.Profile.FileLinesList[self.CurrentLineNumber - 1])
         if self.CurrentOffsetWithinLine >= SizeOfCurrentLine:
             return True
         return False
 
-    ## Rewind() method
+    # Rewind() method
     #
     #   Reset file data buffer to the initial state
     #
@@ -322,11 +345,11 @@ class FdfParser:
     #   @param  DestLine    Optional new destination line number.
     #   @param  DestOffset  Optional new destination offset.
     #
-    def Rewind(self, DestLine = 1, DestOffset = 0):
+    def Rewind(self, DestLine=1, DestOffset=0):
         self.CurrentLineNumber = DestLine
         self.CurrentOffsetWithinLine = DestOffset
 
-    ## _UndoOneChar() method
+    # _UndoOneChar() method
     #
     #   Go back one char in the file buffer
     #
@@ -344,7 +367,7 @@ class FdfParser:
             self.CurrentOffsetWithinLine -= 1
         return True
 
-    ## _GetOneChar() method
+    # _GetOneChar() method
     #
     #   Move forward one char in the file buffer
     #
@@ -357,7 +380,7 @@ class FdfParser:
         else:
             self.CurrentOffsetWithinLine += 1
 
-    ## _CurrentChar() method
+    # _CurrentChar() method
     #
     #   Get the char pointed to by the file buffer pointer
     #
@@ -367,7 +390,7 @@ class FdfParser:
     def _CurrentChar(self):
         return self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine]
 
-    ## _NextChar() method
+    # _NextChar() method
     #
     #   Get the one char pass the char pointed to by the file buffer pointer
     #
@@ -379,7 +402,7 @@ class FdfParser:
             return self.Profile.FileLinesList[self.CurrentLineNumber][0]
         return self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine + 1]
 
-    ## _SetCurrentCharValue() method
+    # _SetCurrentCharValue() method
     #
     #   Modify the value of current char
     #
@@ -387,9 +410,10 @@ class FdfParser:
     #   @param  Value       The new value of current char
     #
     def _SetCurrentCharValue(self, Value):
-        self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine] = Value
+        self.Profile.FileLinesList[self.CurrentLineNumber -
+                                   1][self.CurrentOffsetWithinLine] = Value
 
-    ## _CurrentLine() method
+    # _CurrentLine() method
     #
     #   Get the list that contains current line contents
     #
@@ -400,12 +424,14 @@ class FdfParser:
         return self.Profile.FileLinesList[self.CurrentLineNumber - 1]
 
     def _StringToList(self):
-        self.Profile.FileLinesList = [list(s) for s in self.Profile.FileLinesList]
+        self.Profile.FileLinesList = [list(s)
+                                      for s in self.Profile.FileLinesList]
         if not self.Profile.FileLinesList:
-            EdkLogger.error('FdfParser', FILE_READ_FAILURE, 'The file is empty!', File=self.FileName)
+            EdkLogger.error('FdfParser', FILE_READ_FAILURE,
+                            'The file is empty!', File=self.FileName)
         self.Profile.FileLinesList[-1].append(' ')
 
-    def _ReplaceFragment(self, StartPos, EndPos, Value = ' '):
+    def _ReplaceFragment(self, StartPos, EndPos, Value=' '):
         if StartPos[0] == EndPos[0]:
             Offset = StartPos[1]
             while Offset <= EndPos[1]:
@@ -437,9 +463,11 @@ class FdfParser:
 
         MacroDict = {}
         if not self._MacroDict[self._CurSection[0], self._CurSection[1], self._CurSection[2]]:
-            self._MacroDict[self._CurSection[0], self._CurSection[1], self._CurSection[2]] = MacroDict
+            self._MacroDict[self._CurSection[0],
+                            self._CurSection[1], self._CurSection[2]] = MacroDict
         else:
-            MacroDict = self._MacroDict[self._CurSection[0], self._CurSection[1], self._CurSection[2]]
+            MacroDict = self._MacroDict[self._CurSection[0],
+                                        self._CurSection[1], self._CurSection[2]]
         MacroDict[Macro] = Value
 
     def _GetMacroValue(self, Macro):
@@ -451,9 +479,9 @@ class FdfParser:
 
         if self._CurSection:
             MacroDict = self._MacroDict[
-                        self._CurSection[0],
-                        self._CurSection[1],
-                        self._CurSection[2]
+                self._CurSection[0],
+                self._CurSection[1],
+                self._CurSection[2]
             ]
             if MacroDict and Macro in MacroDict:
                 return MacroDict[Macro]
@@ -471,7 +499,8 @@ class FdfParser:
         # [Rule]: don't take rule section into account, macro is not allowed in this section
         # [OptionRom.DriverName]
         self._CurSection = []
-        Section = Section.strip()[1:-1].upper().replace(' ', '').strip(TAB_SPLIT)
+        Section = Section.strip(
+        )[1:-1].upper().replace(' ', '').strip(TAB_SPLIT)
         ItemList = Section.split(TAB_SPLIT)
         Item = ItemList[0]
         if Item == '' or Item == 'RULE':
@@ -484,7 +513,7 @@ class FdfParser:
         elif len(ItemList) > 0:
             self._CurSection = [ItemList[0], 'DUMMY', TAB_COMMON]
 
-    ## PreprocessFile() method
+    # PreprocessFile() method
     #
     #   Preprocess file contents, replace comments with spaces.
     #   In the end, rewind the file buffer pointer to the beginning
@@ -545,10 +574,11 @@ class FdfParser:
                 self._GetOneChar()
 
         # restore from ListOfList to ListOfString
-        self.Profile.FileLinesList = ["".join(list) for list in self.Profile.FileLinesList]
+        self.Profile.FileLinesList = [
+            "".join(list) for list in self.Profile.FileLinesList]
         self.Rewind()
 
-    ## PreprocessIncludeFile() method
+    # PreprocessIncludeFile() method
     #
     #   Preprocess file contents, replace !include statements with file contents.
     #   In the end, rewind the file buffer pointer to the beginning
@@ -563,10 +593,12 @@ class FdfParser:
 
             if self._Token == TAB_DEFINE:
                 if not self._GetNextToken():
-                    raise Warning.Expected("Macro name", self.FileName, self.CurrentLineNumber)
+                    raise Warning.Expected(
+                        "Macro name", self.FileName, self.CurrentLineNumber)
                 Macro = self._Token
                 if not self._IsToken(TAB_EQUAL_SPLIT):
-                    raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                    raise Warning.ExpectedEquals(
+                        self.FileName, self.CurrentLineNumber)
                 Value = self._GetExpression()
                 MacroDict[Macro] = Value
 
@@ -575,7 +607,8 @@ class FdfParser:
                 IncludeLine = self.CurrentLineNumber
                 IncludeOffset = self.CurrentOffsetWithinLine - len(TAB_INCLUDE)
                 if not self._GetNextToken():
-                    raise Warning.Expected("include file name", self.FileName, self.CurrentLineNumber)
+                    raise Warning.Expected(
+                        "include file name", self.FileName, self.CurrentLineNumber)
                 IncFileName = self._Token
                 PreIndex = 0
                 StartPos = IncFileName.find('$(', PreIndex)
@@ -587,13 +620,15 @@ class FdfParser:
                         if Macro in MacroDict:
                             MacroVal = MacroDict[Macro]
                     if MacroVal is not None:
-                        IncFileName = IncFileName.replace('$(' + Macro + ')', MacroVal, 1)
+                        IncFileName = IncFileName.replace(
+                            '$(' + Macro + ')', MacroVal, 1)
                         if MacroVal.find('$(') != -1:
                             PreIndex = StartPos
                         else:
                             PreIndex = StartPos + len(MacroVal)
                     else:
-                        raise Warning("The Macro %s is not defined" %Macro, self.FileName, self.CurrentLineNumber)
+                        raise Warning("The Macro %s is not defined" %
+                                      Macro, self.FileName, self.CurrentLineNumber)
                     StartPos = IncFileName.find('$(', PreIndex)
                     EndPos = IncFileName.find(')', StartPos+2)
 
@@ -601,7 +636,8 @@ class FdfParser:
                 #
                 # First search the include file under the same directory as FDF file
                 #
-                IncludedFile1 = PathClass(IncludedFile, os.path.dirname(self.FileName))
+                IncludedFile1 = PathClass(
+                    IncludedFile, os.path.dirname(self.FileName))
                 ErrorCode = IncludedFile1.Validate()[0]
                 if ErrorCode != 0:
                     #
@@ -618,14 +654,16 @@ class FdfParser:
                         #
                         # Also search file under the WORKSPACE directory
                         #
-                        IncludedFile1 = PathClass(IncludedFile, GlobalData.gWorkspace)
+                        IncludedFile1 = PathClass(
+                            IncludedFile, GlobalData.gWorkspace)
                         ErrorCode = IncludedFile1.Validate()[0]
                         if ErrorCode != 0:
-                            raise Warning("The include file does not exist under below directories: \n%s\n%s\n%s\n"%(os.path.dirname(self.FileName), PlatformDir, GlobalData.gWorkspace),
+                            raise Warning("The include file does not exist under below directories: \n%s\n%s\n%s\n" % (os.path.dirname(self.FileName), PlatformDir, GlobalData.gWorkspace),
                                           self.FileName, self.CurrentLineNumber)
 
-                if not IsValidInclude (IncludedFile1.Path, self.CurrentLineNumber):
-                    raise Warning("The include file {0} is causing a include loop.\n".format (IncludedFile1.Path), self.FileName, self.CurrentLineNumber)
+                if not IsValidInclude(IncludedFile1.Path, self.CurrentLineNumber):
+                    raise Warning("The include file {0} is causing a include loop.\n".format(
+                        IncludedFile1.Path), self.FileName, self.CurrentLineNumber)
 
                 IncFileProfile = IncludeFileProfile(IncludedFile1.Path)
 
@@ -633,7 +671,7 @@ class FdfParser:
                 CurrentOffset = self.CurrentOffsetWithinLine
                 # list index of the insertion, note that line number is 'CurrentLine + 1'
                 InsertAtLine = CurrentLine
-                ParentProfile = GetParentAtLine (CurrentLine)
+                ParentProfile = GetParentAtLine(CurrentLine)
                 if ParentProfile is not None:
                     ParentProfile.IncludeFileList.insert(0, IncFileProfile)
                     IncFileProfile.Level = ParentProfile.Level + 1
@@ -642,7 +680,8 @@ class FdfParser:
                 if self._GetNextToken():
                     if self.CurrentLineNumber == CurrentLine:
                         RemainingLine = self._CurrentLine()[CurrentOffset:]
-                        self.Profile.FileLinesList.insert(self.CurrentLineNumber, RemainingLine)
+                        self.Profile.FileLinesList.insert(
+                            self.CurrentLineNumber, RemainingLine)
                         IncFileProfile.InsertAdjust += 1
                         self.CurrentLineNumber += 1
                         self.CurrentOffsetWithinLine = 0
@@ -659,8 +698,8 @@ class FdfParser:
                 TempList = list(self.Profile.FileLinesList[IncludeLine - 1])
                 TempList.insert(IncludeOffset, TAB_COMMENT_SPLIT)
                 self.Profile.FileLinesList[IncludeLine - 1] = ''.join(TempList)
-            if Processed: # Nested and back-to-back support
-                self.Rewind(DestLine = IncFileProfile.InsertStartLineNumber - 1)
+            if Processed:  # Nested and back-to-back support
+                self.Rewind(DestLine=IncFileProfile.InsertStartLineNumber - 1)
                 Processed = False
         # Preprocess done.
         self.Rewind()
@@ -676,7 +715,7 @@ class FdfParser:
 
         return True
 
-    ## PreprocessConditionalStatement() method
+    # PreprocessConditionalStatement() method
     #
     #   Preprocess conditional statement.
     #   In the end, rewind the file buffer pointer to the beginning
@@ -697,7 +736,8 @@ class FdfParser:
                         self._SkipToToken(TAB_SECTION_END)
                         Header += self._SkippedChars
                     if Header.find('$(') != -1:
-                        raise Warning("macro cannot be used in section header", self.FileName, self.CurrentLineNumber)
+                        raise Warning(
+                            "macro cannot be used in section header", self.FileName, self.CurrentLineNumber)
                     self._SectionHeaderParser(Header)
                     continue
                 # Replace macros except in RULE section or out of section
@@ -712,7 +752,8 @@ class FdfParser:
                         MacroName = CurLine[StartPos+2: EndPos]
                         MacroValue = self._GetMacroValue(MacroName)
                         if MacroValue is not None:
-                            CurLine = CurLine.replace('$(' + MacroName + ')', MacroValue, 1)
+                            CurLine = CurLine.replace(
+                                '$(' + MacroName + ')', MacroValue, 1)
                             if MacroValue.find('$(') != -1:
                                 PreIndex = StartPos
                             else:
@@ -727,18 +768,23 @@ class FdfParser:
             if self._Token == TAB_DEFINE:
                 if self._GetIfListCurrentItemStat(IfList):
                     if not self._CurSection:
-                        raise Warning("macro cannot be defined in Rule section or out of section", self.FileName, self.CurrentLineNumber)
+                        raise Warning(
+                            "macro cannot be defined in Rule section or out of section", self.FileName, self.CurrentLineNumber)
                     DefineLine = self.CurrentLineNumber - 1
-                    DefineOffset = self.CurrentOffsetWithinLine - len(TAB_DEFINE)
+                    DefineOffset = self.CurrentOffsetWithinLine - \
+                        len(TAB_DEFINE)
                     if not self._GetNextToken():
-                        raise Warning.Expected("Macro name", self.FileName, self.CurrentLineNumber)
+                        raise Warning.Expected(
+                            "Macro name", self.FileName, self.CurrentLineNumber)
                     Macro = self._Token
                     if not self._IsToken(TAB_EQUAL_SPLIT):
-                        raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                        raise Warning.ExpectedEquals(
+                            self.FileName, self.CurrentLineNumber)
 
                     Value = self._GetExpression()
                     self._SetMacroValue(Macro, Value)
-                    self._WipeOffArea.append(((DefineLine, DefineOffset), (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - 1)))
+                    self._WipeOffArea.append(
+                        ((DefineLine, DefineOffset), (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - 1)))
             elif self._Token == 'SET':
                 if not self._GetIfListCurrentItemStat(IfList):
                     continue
@@ -747,91 +793,115 @@ class FdfParser:
                 PcdPair = self._GetNextPcdSettings()
                 PcdName = "%s.%s" % (PcdPair[1], PcdPair[0])
                 if not self._IsToken(TAB_EQUAL_SPLIT):
-                    raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                    raise Warning.ExpectedEquals(
+                        self.FileName, self.CurrentLineNumber)
 
                 Value = self._GetExpression()
-                Value = self._EvaluateConditional(Value, self.CurrentLineNumber, 'eval', True)
+                Value = self._EvaluateConditional(
+                    Value, self.CurrentLineNumber, 'eval', True)
 
                 self._PcdDict[PcdName] = Value
 
                 self.Profile.PcdDict[PcdPair] = Value
                 self.SetPcdLocalation(PcdPair)
-                FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
+                FileLineTuple = GetRealFileLine(
+                    self.FileName, self.CurrentLineNumber)
                 self.Profile.PcdFileLineDict[PcdPair] = FileLineTuple
 
-                self._WipeOffArea.append(((SetLine, SetOffset), (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - 1)))
+                self._WipeOffArea.append(
+                    ((SetLine, SetOffset), (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - 1)))
             elif self._Token in {TAB_IF_DEF, TAB_IF_N_DEF, TAB_IF}:
-                IfStartPos = (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - len(self._Token))
+                IfStartPos = (self.CurrentLineNumber - 1,
+                              self.CurrentOffsetWithinLine - len(self._Token))
                 IfList.append([IfStartPos, None, None])
 
                 CondLabel = self._Token
                 Expression = self._GetExpression()
 
                 if CondLabel == TAB_IF:
-                    ConditionSatisfied = self._EvaluateConditional(Expression, IfList[-1][0][0] + 1, 'eval')
+                    ConditionSatisfied = self._EvaluateConditional(
+                        Expression, IfList[-1][0][0] + 1, 'eval')
                 else:
-                    ConditionSatisfied = self._EvaluateConditional(Expression, IfList[-1][0][0] + 1, 'in')
+                    ConditionSatisfied = self._EvaluateConditional(
+                        Expression, IfList[-1][0][0] + 1, 'in')
                     if CondLabel == TAB_IF_N_DEF:
                         ConditionSatisfied = not ConditionSatisfied
 
                 BranchDetermined = ConditionSatisfied
-                IfList[-1] = [IfList[-1][0], ConditionSatisfied, BranchDetermined]
+                IfList[-1] = [IfList[-1][0],
+                              ConditionSatisfied, BranchDetermined]
                 if ConditionSatisfied:
-                    self._WipeOffArea.append((IfList[-1][0], (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - 1)))
+                    self._WipeOffArea.append(
+                        (IfList[-1][0], (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - 1)))
             elif self._Token in {TAB_ELSE_IF, TAB_ELSE}:
-                ElseStartPos = (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - len(self._Token))
+                ElseStartPos = (self.CurrentLineNumber - 1,
+                                self.CurrentOffsetWithinLine - len(self._Token))
                 if len(IfList) <= 0:
-                    raise Warning("Missing !if statement", self.FileName, self.CurrentLineNumber)
+                    raise Warning("Missing !if statement",
+                                  self.FileName, self.CurrentLineNumber)
 
                 if IfList[-1][1]:
                     IfList[-1] = [ElseStartPos, False, True]
-                    self._WipeOffArea.append((ElseStartPos, (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - 1)))
+                    self._WipeOffArea.append(
+                        (ElseStartPos, (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - 1)))
                 else:
                     self._WipeOffArea.append((IfList[-1][0], ElseStartPos))
                     IfList[-1] = [ElseStartPos, True, IfList[-1][2]]
                     if self._Token == TAB_ELSE_IF:
                         Expression = self._GetExpression()
-                        ConditionSatisfied = self._EvaluateConditional(Expression, IfList[-1][0][0] + 1, 'eval')
-                        IfList[-1] = [IfList[-1][0], ConditionSatisfied, IfList[-1][2]]
+                        ConditionSatisfied = self._EvaluateConditional(
+                            Expression, IfList[-1][0][0] + 1, 'eval')
+                        IfList[-1] = [IfList[-1][0],
+                                      ConditionSatisfied, IfList[-1][2]]
 
                     if IfList[-1][1]:
                         if IfList[-1][2]:
                             IfList[-1][1] = False
                         else:
                             IfList[-1][2] = True
-                            self._WipeOffArea.append((IfList[-1][0], (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - 1)))
+                            self._WipeOffArea.append(
+                                (IfList[-1][0], (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - 1)))
             elif self._Token == '!endif':
                 if len(IfList) <= 0:
-                    raise Warning("Missing !if statement", self.FileName, self.CurrentLineNumber)
+                    raise Warning("Missing !if statement",
+                                  self.FileName, self.CurrentLineNumber)
                 if IfList[-1][1]:
-                    self._WipeOffArea.append(((self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - len('!endif')), (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - 1)))
+                    self._WipeOffArea.append(((self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - len(
+                        '!endif')), (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - 1)))
                 else:
-                    self._WipeOffArea.append((IfList[-1][0], (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - 1)))
+                    self._WipeOffArea.append(
+                        (IfList[-1][0], (self.CurrentLineNumber - 1, self.CurrentOffsetWithinLine - 1)))
 
                 IfList.pop()
             elif not IfList:    # Don't use PCDs inside conditional directive
                 if self.CurrentLineNumber <= RegionLayoutLine:
                     # Don't try the same line twice
                     continue
-                SetPcd = ShortcutPcdPattern.match(self.Profile.FileLinesList[self.CurrentLineNumber - 1])
+                SetPcd = ShortcutPcdPattern.match(
+                    self.Profile.FileLinesList[self.CurrentLineNumber - 1])
                 if SetPcd:
                     self._PcdDict[SetPcd.group('name')] = SetPcd.group('value')
                     RegionLayoutLine = self.CurrentLineNumber
                     continue
-                RegionSize = RegionSizePattern.match(self.Profile.FileLinesList[self.CurrentLineNumber - 1])
+                RegionSize = RegionSizePattern.match(
+                    self.Profile.FileLinesList[self.CurrentLineNumber - 1])
                 if not RegionSize:
                     RegionLayoutLine = self.CurrentLineNumber
                     continue
-                RegionSizeGuid = RegionSizeGuidPattern.match(self.Profile.FileLinesList[self.CurrentLineNumber])
+                RegionSizeGuid = RegionSizeGuidPattern.match(
+                    self.Profile.FileLinesList[self.CurrentLineNumber])
                 if not RegionSizeGuid:
                     RegionLayoutLine = self.CurrentLineNumber + 1
                     continue
-                self._PcdDict[RegionSizeGuid.group('base')] = RegionSize.group('base')
-                self._PcdDict[RegionSizeGuid.group('size')] = RegionSize.group('size')
+                self._PcdDict[RegionSizeGuid.group(
+                    'base')] = RegionSize.group('base')
+                self._PcdDict[RegionSizeGuid.group(
+                    'size')] = RegionSize.group('size')
                 RegionLayoutLine = self.CurrentLineNumber + 1
 
         if IfList:
-            raise Warning("Missing !endif", self.FileName, self.CurrentLineNumber)
+            raise Warning("Missing !endif", self.FileName,
+                          self.CurrentLineNumber)
         self.Rewind()
 
     def _CollectMacroPcd(self):
@@ -852,9 +922,9 @@ class FdfParser:
 
             # Section macro
             ScopeMacro = self._MacroDict[
-                        self._CurSection[0],
-                        self._CurSection[1],
-                        self._CurSection[2]
+                self._CurSection[0],
+                self._CurSection[1],
+                self._CurSection[2]
             ]
             if ScopeMacro:
                 MacroDict.update(ScopeMacro)
@@ -871,7 +941,7 @@ class FdfParser:
 
         return MacroDict
 
-    def _EvaluateConditional(self, Expression, Line, Op = None, Value = None):
+    def _EvaluateConditional(self, Expression, Line, Op=None, Value=None):
         MacroPcdDict = self._CollectMacroPcd()
         if Op == 'eval':
             try:
@@ -885,8 +955,8 @@ class FdfParser:
                 # the precise number of line and return the evaluation result
                 #
                 EdkLogger.warn('Parser', "Suspicious expression: %s" % str(Excpt),
-                                File=self.FileName, ExtraData=self._CurrentLine(),
-                                Line=Line)
+                               File=self.FileName, ExtraData=self._CurrentLine(),
+                               Line=Line)
                 return Excpt.result
             except Exception as Excpt:
                 if hasattr(Excpt, 'Pcd'):
@@ -895,7 +965,8 @@ class FdfParser:
                         raise Warning("Cannot use this PCD (%s) in an expression as"
                                       " it must be defined in a [PcdsFixedAtBuild] or [PcdsFeatureFlag] section"
                                       " of the DSC file (%s), and it is currently defined in this section:"
-                                      " %s, line #: %d." % (Excpt.Pcd, GlobalData.gPlatformOtherPcds['DSCFILE'], Info[0], Info[1]),
+                                      " %s, line #: %d." % (
+                                          Excpt.Pcd, GlobalData.gPlatformOtherPcds['DSCFILE'], Info[0], Info[1]),
                                       self.FileName, Line)
                     else:
                         raise Warning("PCD (%s) is not defined in DSC file (%s)" % (Excpt.Pcd, GlobalData.gPlatformOtherPcds['DSCFILE']),
@@ -907,7 +978,7 @@ class FdfParser:
                 Expression = Expression[2:-1]
             return Expression in MacroPcdDict
 
-    ## _IsToken() method
+    # _IsToken() method
     #
     #   Check whether input string is found from current char position along
     #   If found, the string value is put into self._Token
@@ -918,23 +989,26 @@ class FdfParser:
     #   @retval True        Successfully find string, file buffer pointer moved forward
     #   @retval False       Not able to find string, file buffer pointer not changed
     #
-    def _IsToken(self, String, IgnoreCase = False):
+    def _IsToken(self, String, IgnoreCase=False):
         self._SkipWhiteSpace()
 
         # Only consider the same line, no multi-line token allowed
         StartPos = self.CurrentOffsetWithinLine
         index = -1
         if IgnoreCase:
-            index = self._CurrentLine()[self.CurrentOffsetWithinLine: ].upper().find(String.upper())
+            index = self._CurrentLine()[self.CurrentOffsetWithinLine:].upper().find(
+                String.upper())
         else:
-            index = self._CurrentLine()[self.CurrentOffsetWithinLine: ].find(String)
+            index = self._CurrentLine(
+            )[self.CurrentOffsetWithinLine:].find(String)
         if index == 0:
             self.CurrentOffsetWithinLine += len(String)
-            self._Token = self._CurrentLine()[StartPos: self.CurrentOffsetWithinLine]
+            self._Token = self._CurrentLine(
+            )[StartPos: self.CurrentOffsetWithinLine]
             return True
         return False
 
-    ## _IsKeyword() method
+    # _IsKeyword() method
     #
     #   Check whether input keyword is found from current char position along, whole word only!
     #   If found, the string value is put into self._Token
@@ -945,22 +1019,26 @@ class FdfParser:
     #   @retval True        Successfully find string, file buffer pointer moved forward
     #   @retval False       Not able to find string, file buffer pointer not changed
     #
-    def _IsKeyword(self, KeyWord, IgnoreCase = False):
+    def _IsKeyword(self, KeyWord, IgnoreCase=False):
         self._SkipWhiteSpace()
 
         # Only consider the same line, no multi-line token allowed
         StartPos = self.CurrentOffsetWithinLine
         index = -1
         if IgnoreCase:
-            index = self._CurrentLine()[self.CurrentOffsetWithinLine: ].upper().find(KeyWord.upper())
+            index = self._CurrentLine()[self.CurrentOffsetWithinLine:].upper().find(
+                KeyWord.upper())
         else:
-            index = self._CurrentLine()[self.CurrentOffsetWithinLine: ].find(KeyWord)
+            index = self._CurrentLine(
+            )[self.CurrentOffsetWithinLine:].find(KeyWord)
         if index == 0:
-            followingChar = self._CurrentLine()[self.CurrentOffsetWithinLine + len(KeyWord)]
+            followingChar = self._CurrentLine(
+            )[self.CurrentOffsetWithinLine + len(KeyWord)]
             if not str(followingChar).isspace() and followingChar not in SEPARATORS:
                 return False
             self.CurrentOffsetWithinLine += len(KeyWord)
-            self._Token = self._CurrentLine()[StartPos: self.CurrentOffsetWithinLine]
+            self._Token = self._CurrentLine(
+            )[StartPos: self.CurrentOffsetWithinLine]
             return True
         return False
 
@@ -969,12 +1047,13 @@ class FdfParser:
         Index = len(Line) - 1
         while Line[Index] in CR_LB_SET:
             Index -= 1
-        ExpressionString = self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine:Index+1]
+        ExpressionString = self.Profile.FileLinesList[self.CurrentLineNumber -
+                                                      1][self.CurrentOffsetWithinLine:Index+1]
         self.CurrentOffsetWithinLine += len(ExpressionString)
         ExpressionString = ExpressionString.strip()
         return ExpressionString
 
-    ## _GetNextWord() method
+    # _GetNextWord() method
     #
     #   Get next C name from file lines
     #   If found, the string value is put into self._Token
@@ -995,13 +1074,14 @@ class FdfParser:
             while not self._EndOfLine():
                 TempChar = self._CurrentChar()
                 if (TempChar >= 'a' and TempChar <= 'z') or (TempChar >= 'A' and TempChar <= 'Z') \
-                or (TempChar >= '0' and TempChar <= '9') or TempChar == '_' or TempChar == '-':
+                        or (TempChar >= '0' and TempChar <= '9') or TempChar == '_' or TempChar == '-':
                     self._GetOneChar()
 
                 else:
                     break
 
-            self._Token = self._CurrentLine()[StartPos: self.CurrentOffsetWithinLine]
+            self._Token = self._CurrentLine(
+            )[StartPos: self.CurrentOffsetWithinLine]
             return True
 
         return False
@@ -1018,18 +1098,19 @@ class FdfParser:
             while not self._EndOfLine():
                 TempChar = self._CurrentChar()
                 if (TempChar >= 'a' and TempChar <= 'z') or (TempChar >= 'A' and TempChar <= 'Z') \
-                or (TempChar >= '0' and TempChar <= '9') or TempChar == '_' or TempChar == '-' or TempChar == TAB_SECTION_START or TempChar == TAB_SECTION_END:
+                        or (TempChar >= '0' and TempChar <= '9') or TempChar == '_' or TempChar == '-' or TempChar == TAB_SECTION_START or TempChar == TAB_SECTION_END:
                     self._GetOneChar()
 
                 else:
                     break
 
-            self._Token = self._CurrentLine()[StartPos: self.CurrentOffsetWithinLine]
+            self._Token = self._CurrentLine(
+            )[StartPos: self.CurrentOffsetWithinLine]
             return True
 
         return False
 
-    ## _GetNextToken() method
+    # _GetNextToken() method
     #
     #   Get next token unit before a separator
     #   If found, the string value is put into self._Token
@@ -1073,7 +1154,7 @@ class FdfParser:
         else:
             return False
 
-    ## _GetNextGuid() method
+    # _GetNextGuid() method
     #
     #   Get next token unit before a separator
     #   If found, the GUID string is put into self._Token
@@ -1103,14 +1184,17 @@ class FdfParser:
         try:
             ValueNumber = int(Value, 0)
         except:
-            EdkLogger.error("FdfParser", FORMAT_INVALID, "The value is not valid dec or hex number for %s." % Name)
+            EdkLogger.error("FdfParser", FORMAT_INVALID,
+                            "The value is not valid dec or hex number for %s." % Name)
         if ValueNumber < 0:
-            EdkLogger.error("FdfParser", FORMAT_INVALID, "The value can't be set to negative value for %s." % Name)
+            EdkLogger.error("FdfParser", FORMAT_INVALID,
+                            "The value can't be set to negative value for %s." % Name)
         if ValueNumber > MAX_VAL_TYPE[Scope]:
-            EdkLogger.error("FdfParser", FORMAT_INVALID, "Too large value for %s." % Name)
+            EdkLogger.error("FdfParser", FORMAT_INVALID,
+                            "Too large value for %s." % Name)
         return True
 
-    ## _UndoToken() method
+    # _UndoToken() method
     #
     #   Go back one token unit in file buffer
     #
@@ -1123,7 +1207,6 @@ class FdfParser:
                 self._GetOneChar()
                 return
 
-
         StartPos = self.CurrentOffsetWithinLine
         CurrentLine = self.CurrentLineNumber
         while CurrentLine == self.CurrentLineNumber:
@@ -1143,7 +1226,7 @@ class FdfParser:
 
         self._GetOneChar()
 
-    ## _GetNextHexNumber() method
+    # _GetNextHexNumber() method
     #
     #   Get next HEX data before a separator
     #   If found, the HEX data is put into self._Token
@@ -1161,7 +1244,7 @@ class FdfParser:
             self._UndoToken()
             return False
 
-    ## _GetNextDecimalNumber() method
+    # _GetNextDecimalNumber() method
     #
     #   Get next decimal data before a separator
     #   If found, the decimal data is put into self._Token
@@ -1181,25 +1264,28 @@ class FdfParser:
 
     def _GetNextPcdSettings(self):
         if not self._GetNextWord():
-            raise Warning.Expected("<PcdTokenSpaceCName>", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "<PcdTokenSpaceCName>", self.FileName, self.CurrentLineNumber)
         pcdTokenSpaceCName = self._Token
 
         if not self._IsToken(TAB_SPLIT):
             raise Warning.Expected(".", self.FileName, self.CurrentLineNumber)
 
         if not self._GetNextWord():
-            raise Warning.Expected("<PcdCName>", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "<PcdCName>", self.FileName, self.CurrentLineNumber)
         pcdCName = self._Token
 
         Fields = []
         while self._IsToken(TAB_SPLIT):
             if not self._GetNextPcdWord():
-                raise Warning.Expected("Pcd Fields", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "Pcd Fields", self.FileName, self.CurrentLineNumber)
             Fields.append(self._Token)
 
-        return (pcdCName, pcdTokenSpaceCName,TAB_SPLIT.join(Fields))
+        return (pcdCName, pcdTokenSpaceCName, TAB_SPLIT.join(Fields))
 
-    ## _GetStringData() method
+    # _GetStringData() method
     #
     #   Get string contents quoted in ""
     #   If found, the decimal data is put into self._Token
@@ -1228,7 +1314,7 @@ class FdfParser:
         self._Token = self._SkippedChars.rstrip(QuoteToUse)
         return True
 
-    ## _SkipToToken() method
+    # _SkipToToken() method
     #
     #   Search forward in file buffer for the string
     #   The skipped chars are put into self._SkippedChars
@@ -1239,16 +1325,18 @@ class FdfParser:
     #   @retval True        Successfully find the string, file buffer pointer moved forward
     #   @retval False       Not able to find the string, file buffer pointer not changed
     #
-    def _SkipToToken(self, String, IgnoreCase = False):
+    def _SkipToToken(self, String, IgnoreCase=False):
         StartPos = self.GetFileBufferPos()
 
         self._SkippedChars = ""
         while not self._EndOfFile():
             index = -1
             if IgnoreCase:
-                index = self._CurrentLine()[self.CurrentOffsetWithinLine: ].upper().find(String.upper())
+                index = self._CurrentLine()[self.CurrentOffsetWithinLine:].upper().find(
+                    String.upper())
             else:
-                index = self._CurrentLine()[self.CurrentOffsetWithinLine: ].find(String)
+                index = self._CurrentLine(
+                )[self.CurrentOffsetWithinLine:].find(String)
             if index == 0:
                 self.CurrentOffsetWithinLine += len(String)
                 self._SkippedChars += String
@@ -1260,7 +1348,7 @@ class FdfParser:
         self._SkippedChars = ""
         return False
 
-    ## GetFileBufferPos() method
+    # GetFileBufferPos() method
     #
     #   Return the tuple of current line and offset within the line
     #
@@ -1270,7 +1358,7 @@ class FdfParser:
     def GetFileBufferPos(self):
         return (self.CurrentLineNumber, self.CurrentOffsetWithinLine)
 
-    ## SetFileBufferPos() method
+    # SetFileBufferPos() method
     #
     #   Restore the file buffer position
     #
@@ -1280,7 +1368,7 @@ class FdfParser:
     def SetFileBufferPos(self, Pos):
         (self.CurrentLineNumber, self.CurrentOffsetWithinLine) = Pos
 
-    ## Preprocess() method
+    # Preprocess() method
     #
     #   Preprocess comment, conditional directive, include directive, replace macro.
     #   Exception will be raised if syntax error found
@@ -1297,12 +1385,13 @@ class FdfParser:
         self._StringToList()
         for Pos in self._WipeOffArea:
             self._ReplaceFragment(Pos[0], Pos[1])
-        self.Profile.FileLinesList = ["".join(list) for list in self.Profile.FileLinesList]
+        self.Profile.FileLinesList = [
+            "".join(list) for list in self.Profile.FileLinesList]
 
         while self._GetDefines():
             pass
 
-    ## ParseFile() method
+    # ParseFile() method
     #
     #   Parse the file profile buffer to extract fd, fv ... information
     #   Exception will be raised if syntax error found
@@ -1321,19 +1410,20 @@ class FdfParser:
 
         except Warning as X:
             self._UndoToken()
-            #'\n\tGot Token: \"%s\" from File %s\n' % (self._Token, FileLineTuple[0]) + \
+            # '\n\tGot Token: \"%s\" from File %s\n' % (self._Token, FileLineTuple[0]) + \
             # At this point, the closest parent would be the included file itself
             Profile = GetParentAtLine(X.OriginalLineNumber)
             if Profile is not None:
                 X.Message += ' near line %d, column %d: %s' \
-                % (X.LineNumber, 0, Profile.FileLinesList[X.LineNumber-1])
+                    % (X.LineNumber, 0, Profile.FileLinesList[X.LineNumber-1])
             else:
-                FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
+                FileLineTuple = GetRealFileLine(
+                    self.FileName, self.CurrentLineNumber)
                 X.Message += ' near line %d, column %d: %s' \
-                % (FileLineTuple[1], self.CurrentOffsetWithinLine + 1, self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine:].rstrip(TAB_LINE_BREAK).rstrip(T_CHAR_CR))
+                    % (FileLineTuple[1], self.CurrentOffsetWithinLine + 1, self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine:].rstrip(TAB_LINE_BREAK).rstrip(T_CHAR_CR))
             raise
 
-    ## SectionParser() method
+    # SectionParser() method
     #
     #   Parse the file section info
     #   Exception will be raised if syntax error found
@@ -1344,10 +1434,11 @@ class FdfParser:
     def SectionParser(self, section):
         S = section.upper()
         if not S.startswith("[DEFINES") and not S.startswith("[FD.") and not S.startswith("[FV.") and not S.startswith("[CAPSULE.") \
-             and not S.startswith("[RULE.") and not S.startswith("[OPTIONROM.") and not S.startswith('[FMPPAYLOAD.'):
-            raise Warning("Unknown section or section appear sequence error (The correct sequence should be [DEFINES], [FD.], [FV.], [Capsule.], [Rule.], [OptionRom.], [FMPPAYLOAD.])", self.FileName, self.CurrentLineNumber)
+                and not S.startswith("[RULE.") and not S.startswith("[OPTIONROM.") and not S.startswith('[FMPPAYLOAD.'):
+            raise Warning(
+                "Unknown section or section appear sequence error (The correct sequence should be [DEFINES], [FD.], [FV.], [Capsule.], [Rule.], [OptionRom.], [FMPPAYLOAD.])", self.FileName, self.CurrentLineNumber)
 
-    ## _GetDefines() method
+    # _GetDefines() method
     #
     #   Get Defines section contents and store its data into AllMacrosList
     #
@@ -1367,13 +1458,16 @@ class FdfParser:
 
         self._UndoToken()
         if not self._IsToken("[DEFINES", True):
-            FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
-            #print 'Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
+            FileLineTuple = GetRealFileLine(
+                self.FileName, self.CurrentLineNumber)
+            # print 'Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
             #        % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine:], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine)
-            raise Warning.Expected("[DEFINES", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "[DEFINES", self.FileName, self.CurrentLineNumber)
 
         if not self._IsToken(TAB_SECTION_END):
-            raise Warning.ExpectedBracketClose(self.FileName, self.CurrentLineNumber)
+            raise Warning.ExpectedBracketClose(
+                self.FileName, self.CurrentLineNumber)
 
         while self._GetNextWord():
             # handle the SET statement
@@ -1385,25 +1479,28 @@ class FdfParser:
             Macro = self._Token
 
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
             if not self._GetNextToken() or self._Token.startswith(TAB_SECTION_START):
-                raise Warning.Expected("MACRO value", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "MACRO value", self.FileName, self.CurrentLineNumber)
             Value = self._Token
 
         return False
 
-    ##_GetError() method
+    # _GetError() method
     def _GetError(self):
-        #save the Current information
+        # save the Current information
         CurrentLine = self.CurrentLineNumber
         CurrentOffset = self.CurrentOffsetWithinLine
         while self._GetNextToken():
             if self._Token == TAB_ERROR:
-                EdkLogger.error('FdfParser', ERROR_STATEMENT, self._CurrentLine().replace(TAB_ERROR, '', 1), File=self.FileName, Line=self.CurrentLineNumber)
+                EdkLogger.error('FdfParser', ERROR_STATEMENT, self._CurrentLine().replace(
+                    TAB_ERROR, '', 1), File=self.FileName, Line=self.CurrentLineNumber)
         self.CurrentLineNumber = CurrentLine
         self.CurrentOffsetWithinLine = CurrentOffset
 
-    ## _GetFd() method
+    # _GetFd() method
     #
     #   Get FD section contents and store its data into FD dictionary of self.Profile
     #
@@ -1418,52 +1515,61 @@ class FdfParser:
         S = self._Token.upper()
         if S.startswith(TAB_SECTION_START) and not S.startswith("[FD."):
             if not S.startswith("[FV.") and not S.startswith('[FMPPAYLOAD.') and not S.startswith("[CAPSULE.") \
-                and not S.startswith("[RULE.") and not S.startswith("[OPTIONROM."):
-                raise Warning("Unknown section", self.FileName, self.CurrentLineNumber)
+                    and not S.startswith("[RULE.") and not S.startswith("[OPTIONROM."):
+                raise Warning("Unknown section", self.FileName,
+                              self.CurrentLineNumber)
             self._UndoToken()
             return False
 
         self._UndoToken()
         if not self._IsToken("[FD.", True):
-            FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
-            #print 'Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
+            FileLineTuple = GetRealFileLine(
+                self.FileName, self.CurrentLineNumber)
+            # print 'Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
             #        % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine:], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine)
-            raise Warning.Expected("[FD.]", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "[FD.]", self.FileName, self.CurrentLineNumber)
 
         FdName = self._GetUiName()
         if FdName == "":
-            if len (self.Profile.FdDict) == 0:
+            if len(self.Profile.FdDict) == 0:
                 FdName = GenFdsGlobalVariable.PlatformName
                 if FdName == "" and GlobalData.gActivePlatform:
                     FdName = GlobalData.gActivePlatform.PlatformName
                 self.Profile.FdNameNotSet = True
             else:
-                raise Warning.Expected("FdName in [FD.] section", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "FdName in [FD.] section", self.FileName, self.CurrentLineNumber)
         self.CurrentFdName = FdName.upper()
 
         if self.CurrentFdName in self.Profile.FdDict:
-            raise Warning("Unexpected the same FD name", self.FileName, self.CurrentLineNumber)
+            raise Warning("Unexpected the same FD name",
+                          self.FileName, self.CurrentLineNumber)
 
         if not self._IsToken(TAB_SECTION_END):
-            raise Warning.ExpectedBracketClose(self.FileName, self.CurrentLineNumber)
+            raise Warning.ExpectedBracketClose(
+                self.FileName, self.CurrentLineNumber)
 
         FdObj = FD()
         FdObj.FdUiName = self.CurrentFdName
         self.Profile.FdDict[self.CurrentFdName] = FdObj
 
-        if len (self.Profile.FdDict) > 1 and self.Profile.FdNameNotSet:
-            raise Warning.Expected("all FDs have their name", self.FileName, self.CurrentLineNumber)
+        if len(self.Profile.FdDict) > 1 and self.Profile.FdNameNotSet:
+            raise Warning.Expected(
+                "all FDs have their name", self.FileName, self.CurrentLineNumber)
 
         Status = self._GetCreateFile(FdObj)
         if not Status:
-            raise Warning("FD name error", self.FileName, self.CurrentLineNumber)
+            raise Warning("FD name error", self.FileName,
+                          self.CurrentLineNumber)
 
         while self._GetTokenStatements(FdObj):
             pass
         for Attr in ("BaseAddress", "Size", "ErasePolarity"):
             if getattr(FdObj, Attr) is None:
                 self._GetNextToken()
-                raise Warning("Keyword %s missing" % Attr, self.FileName, self.CurrentLineNumber)
+                raise Warning("Keyword %s missing" %
+                              Attr, self.FileName, self.CurrentLineNumber)
 
         if not FdObj.BlockSizeList:
             FdObj.BlockSizeList.append((1, FdObj.Size, None))
@@ -1473,13 +1579,14 @@ class FdfParser:
         self._GetSetStatements(FdObj)
 
         if not self._GetRegionLayout(FdObj):
-            raise Warning.Expected("region layout", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "region layout", self.FileName, self.CurrentLineNumber)
 
         while self._GetRegionLayout(FdObj):
             pass
         return True
 
-    ## _GetUiName() method
+    # _GetUiName() method
     #
     #   Return the UI name of a section
     #
@@ -1493,7 +1600,7 @@ class FdfParser:
 
         return Name
 
-    ## _GetCreateFile() method
+    # _GetCreateFile() method
     #
     #   Return the output file name of object
     #
@@ -1504,20 +1611,23 @@ class FdfParser:
     def _GetCreateFile(self, Obj):
         if self._IsKeyword("CREATE_FILE"):
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._GetNextToken():
-                raise Warning.Expected("file name", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "file name", self.FileName, self.CurrentLineNumber)
 
             FileName = self._Token
             Obj.CreateFileName = FileName
 
         return True
 
-    def SetPcdLocalation(self,pcdpair):
-        self.Profile.PcdLocalDict[pcdpair] = (self.Profile.FileName,self.CurrentLineNumber)
+    def SetPcdLocalation(self, pcdpair):
+        self.Profile.PcdLocalDict[pcdpair] = (
+            self.Profile.FileName, self.CurrentLineNumber)
 
-    ## _GetTokenStatements() method
+    # _GetTokenStatements() method
     #
     #   Get token statements
     #
@@ -1527,10 +1637,12 @@ class FdfParser:
     def _GetTokenStatements(self, Obj):
         if self._IsKeyword("BaseAddress"):
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._GetNextHexNumber():
-                raise Warning.Expected("Hex base address", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "Hex base address", self.FileName, self.CurrentLineNumber)
 
             Obj.BaseAddress = self._Token
 
@@ -1539,16 +1651,19 @@ class FdfParser:
                 Obj.BaseAddressPcd = pcdPair
                 self.Profile.PcdDict[pcdPair] = Obj.BaseAddress
                 self.SetPcdLocalation(pcdPair)
-                FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
+                FileLineTuple = GetRealFileLine(
+                    self.FileName, self.CurrentLineNumber)
                 self.Profile.PcdFileLineDict[pcdPair] = FileLineTuple
             return True
 
         if self._IsKeyword("Size"):
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._GetNextHexNumber():
-                raise Warning.Expected("Hex size", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "Hex size", self.FileName, self.CurrentLineNumber)
 
             Size = self._Token
             if self._IsToken(TAB_VALUE_SPLIT):
@@ -1556,27 +1671,31 @@ class FdfParser:
                 Obj.SizePcd = pcdPair
                 self.Profile.PcdDict[pcdPair] = Size
                 self.SetPcdLocalation(pcdPair)
-                FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
+                FileLineTuple = GetRealFileLine(
+                    self.FileName, self.CurrentLineNumber)
                 self.Profile.PcdFileLineDict[pcdPair] = FileLineTuple
             Obj.Size = int(Size, 0)
             return True
 
         if self._IsKeyword("ErasePolarity"):
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._GetNextToken():
-                raise Warning.Expected("Erase Polarity", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "Erase Polarity", self.FileName, self.CurrentLineNumber)
 
             if not self._Token in {"1", "0"}:
-                raise Warning.Expected("1 or 0 Erase Polarity", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "1 or 0 Erase Polarity", self.FileName, self.CurrentLineNumber)
 
             Obj.ErasePolarity = self._Token
             return True
 
         return self._GetBlockStatements(Obj)
 
-    ## _GetAddressStatements() method
+    # _GetAddressStatements() method
     #
     #   Get address statements
     #
@@ -1588,25 +1707,29 @@ class FdfParser:
     def _GetAddressStatements(self, Obj):
         if self._IsKeyword("BsBaseAddress"):
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._GetNextDecimalNumber() and not self._GetNextHexNumber():
-                raise Warning.Expected("address", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "address", self.FileName, self.CurrentLineNumber)
 
             BsAddress = int(self._Token, 0)
             Obj.BsBaseAddress = BsAddress
 
         if self._IsKeyword("RtBaseAddress"):
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._GetNextDecimalNumber() and not self._GetNextHexNumber():
-                raise Warning.Expected("address", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "address", self.FileName, self.CurrentLineNumber)
 
             RtAddress = int(self._Token, 0)
             Obj.RtBaseAddress = RtAddress
 
-    ## _GetBlockStatements() method
+    # _GetBlockStatements() method
     #
     #   Get block statements
     #
@@ -1620,10 +1743,11 @@ class FdfParser:
 
             Item = Obj.BlockSizeList[-1]
             if Item[0] is None or Item[1] is None:
-                raise Warning.Expected("block statement", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "block statement", self.FileName, self.CurrentLineNumber)
         return IsBlock
 
-    ## _GetBlockStatement() method
+    # _GetBlockStatement() method
     #
     #   Get block statement
     #
@@ -1640,7 +1764,8 @@ class FdfParser:
             raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
 
         if not self._GetNextHexNumber() and not self._GetNextDecimalNumber():
-            raise Warning.Expected("Hex or Integer block size", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "Hex or Integer block size", self.FileName, self.CurrentLineNumber)
 
         BlockSize = self._Token
         BlockSizePcd = None
@@ -1649,24 +1774,27 @@ class FdfParser:
             BlockSizePcd = PcdPair
             self.Profile.PcdDict[PcdPair] = BlockSize
             self.SetPcdLocalation(PcdPair)
-            FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
+            FileLineTuple = GetRealFileLine(
+                self.FileName, self.CurrentLineNumber)
             self.Profile.PcdFileLineDict[PcdPair] = FileLineTuple
         BlockSize = int(BlockSize, 0)
 
         BlockNumber = None
         if self._IsKeyword("NumBlocks"):
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._GetNextDecimalNumber() and not self._GetNextHexNumber():
-                raise Warning.Expected("block numbers", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "block numbers", self.FileName, self.CurrentLineNumber)
 
             BlockNumber = int(self._Token, 0)
 
         Obj.BlockSizeList.append((BlockSize, BlockNumber, BlockSizePcd))
         return True
 
-    ## _GetDefineStatements() method
+    # _GetDefineStatements() method
     #
     #   Get define statements
     #
@@ -1679,7 +1807,7 @@ class FdfParser:
         while self._GetDefineStatement(Obj):
             pass
 
-    ## _GetDefineStatement() method
+    # _GetDefineStatement() method
     #
     #   Get define statement
     #
@@ -1693,10 +1821,12 @@ class FdfParser:
             self._GetNextToken()
             Macro = self._Token
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._GetNextToken():
-                raise Warning.Expected("value", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "value", self.FileName, self.CurrentLineNumber)
 
             Value = self._Token
             Macro = '$(' + Macro + ')'
@@ -1705,7 +1835,7 @@ class FdfParser:
 
         return False
 
-    ## _GetSetStatements() method
+    # _GetSetStatements() method
     #
     #   Get set statements
     #
@@ -1718,7 +1848,7 @@ class FdfParser:
         while self._GetSetStatement(Obj):
             pass
 
-    ## _GetSetStatement() method
+    # _GetSetStatement() method
     #
     #   Get set statement
     #
@@ -1732,22 +1862,25 @@ class FdfParser:
             PcdPair = self._GetNextPcdSettings()
 
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
 
             Value = self._GetExpression()
-            Value = self._EvaluateConditional(Value, self.CurrentLineNumber, 'eval', True)
+            Value = self._EvaluateConditional(
+                Value, self.CurrentLineNumber, 'eval', True)
 
             if Obj:
                 Obj.SetVarDict[PcdPair] = Value
             self.Profile.PcdDict[PcdPair] = Value
             self.SetPcdLocalation(PcdPair)
-            FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
+            FileLineTuple = GetRealFileLine(
+                self.FileName, self.CurrentLineNumber)
             self.Profile.PcdFileLineDict[PcdPair] = FileLineTuple
             return True
 
         return False
 
-    ## _CalcRegionExpr(self)
+    # _CalcRegionExpr(self)
     #
     #   Calculate expression for offset or size of a region
     #
@@ -1778,7 +1911,7 @@ class FdfParser:
             self.SetFileBufferPos(StartPos)
             return None
 
-    ## _GetRegionLayout() method
+    # _GetRegionLayout() method
     #
     #   Get region layout for FD
     #
@@ -1797,11 +1930,13 @@ class FdfParser:
         theFd.RegionList.append(RegionObj)
 
         if not self._IsToken(TAB_VALUE_SPLIT):
-            raise Warning.Expected("'|'", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected("'|'", self.FileName,
+                                   self.CurrentLineNumber)
 
         Size = self._CalcRegionExpr()
         if Size is None:
-            raise Warning.Expected("Region Size", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "Region Size", self.FileName, self.CurrentLineNumber)
         RegionObj.Size = Size
 
         if not self._GetNextWord():
@@ -1818,17 +1953,22 @@ class FdfParser:
                            RegionOffsetPcdPattern.match(self._CurrentLine()[self.CurrentOffsetWithinLine:]))
             if IsRegionPcd:
                 RegionObj.PcdOffset = self._GetNextPcdSettings()
-                self.Profile.PcdDict[RegionObj.PcdOffset] = "0x%08X" % (RegionObj.Offset + int(theFd.BaseAddress, 0))
+                self.Profile.PcdDict[RegionObj.PcdOffset] = "0x%08X" % (
+                    RegionObj.Offset + int(theFd.BaseAddress, 0))
                 self.SetPcdLocalation(RegionObj.PcdOffset)
-                self._PcdDict['%s.%s' % (RegionObj.PcdOffset[1], RegionObj.PcdOffset[0])] = "0x%x" % RegionObj.Offset
-                FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
+                self._PcdDict['%s.%s' % (
+                    RegionObj.PcdOffset[1], RegionObj.PcdOffset[0])] = "0x%x" % RegionObj.Offset
+                FileLineTuple = GetRealFileLine(
+                    self.FileName, self.CurrentLineNumber)
                 self.Profile.PcdFileLineDict[RegionObj.PcdOffset] = FileLineTuple
                 if self._IsToken(TAB_VALUE_SPLIT):
                     RegionObj.PcdSize = self._GetNextPcdSettings()
                     self.Profile.PcdDict[RegionObj.PcdSize] = "0x%08X" % RegionObj.Size
                     self.SetPcdLocalation(RegionObj.PcdSize)
-                    self._PcdDict['%s.%s' % (RegionObj.PcdSize[1], RegionObj.PcdSize[0])] = "0x%x" % RegionObj.Size
-                    FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
+                    self._PcdDict['%s.%s' % (
+                        RegionObj.PcdSize[1], RegionObj.PcdSize[0])] = "0x%x" % RegionObj.Size
+                    FileLineTuple = GetRealFileLine(
+                        self.FileName, self.CurrentLineNumber)
                     self.Profile.PcdFileLineDict[RegionObj.PcdSize] = FileLineTuple
 
             if not self._GetNextWord():
@@ -1875,7 +2015,7 @@ class FdfParser:
 
         return True
 
-    ## _GetRegionFvType() method
+    # _GetRegionFvType() method
     #
     #   Get region fv data for region
     #
@@ -1884,13 +2024,15 @@ class FdfParser:
     #
     def _GetRegionFvType(self, RegionObj):
         if not self._IsKeyword(BINARY_FILE_TYPE_FV):
-            raise Warning.Expected("'FV'", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected("'FV'", self.FileName,
+                                   self.CurrentLineNumber)
 
         if not self._IsToken(TAB_EQUAL_SPLIT):
             raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
 
         if not self._GetNextToken():
-            raise Warning.Expected("FV name", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "FV name", self.FileName, self.CurrentLineNumber)
 
         RegionObj.RegionType = BINARY_FILE_TYPE_FV
         RegionObj.RegionDataList.append((self._Token).upper())
@@ -1898,14 +2040,16 @@ class FdfParser:
         while self._IsKeyword(BINARY_FILE_TYPE_FV):
 
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._GetNextToken():
-                raise Warning.Expected("FV name", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "FV name", self.FileName, self.CurrentLineNumber)
 
             RegionObj.RegionDataList.append((self._Token).upper())
 
-    ## _GetRegionCapType() method
+    # _GetRegionCapType() method
     #
     #   Get region capsule data for region
     #
@@ -1914,13 +2058,15 @@ class FdfParser:
     #
     def _GetRegionCapType(self, RegionObj):
         if not self._IsKeyword("CAPSULE"):
-            raise Warning.Expected("'CAPSULE'", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "'CAPSULE'", self.FileName, self.CurrentLineNumber)
 
         if not self._IsToken(TAB_EQUAL_SPLIT):
             raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
 
         if not self._GetNextToken():
-            raise Warning.Expected("CAPSULE name", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "CAPSULE name", self.FileName, self.CurrentLineNumber)
 
         RegionObj.RegionType = "CAPSULE"
         RegionObj.RegionDataList.append(self._Token)
@@ -1928,14 +2074,16 @@ class FdfParser:
         while self._IsKeyword("CAPSULE"):
 
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._GetNextToken():
-                raise Warning.Expected("CAPSULE name", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "CAPSULE name", self.FileName, self.CurrentLineNumber)
 
             RegionObj.RegionDataList.append(self._Token)
 
-    ## _GetRegionFileType() method
+    # _GetRegionFileType() method
     #
     #   Get region file data for region
     #
@@ -1944,13 +2092,15 @@ class FdfParser:
     #
     def _GetRegionFileType(self, RegionObj):
         if not self._IsKeyword("FILE"):
-            raise Warning.Expected("'FILE'", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "'FILE'", self.FileName, self.CurrentLineNumber)
 
         if not self._IsToken(TAB_EQUAL_SPLIT):
             raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
 
         if not self._GetNextToken():
-            raise Warning.Expected("File name", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "File name", self.FileName, self.CurrentLineNumber)
 
         RegionObj.RegionType = "FILE"
         RegionObj.RegionDataList.append(self._Token)
@@ -1958,14 +2108,16 @@ class FdfParser:
         while self._IsKeyword("FILE"):
 
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._GetNextToken():
-                raise Warning.Expected("FILE name", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "FILE name", self.FileName, self.CurrentLineNumber)
 
             RegionObj.RegionDataList.append(self._Token)
 
-    ## _GetRegionDataType() method
+    # _GetRegionDataType() method
     #
     #   Get region array data for region
     #
@@ -1974,41 +2126,49 @@ class FdfParser:
     #
     def _GetRegionDataType(self, RegionObj):
         if not self._IsKeyword("DATA"):
-            raise Warning.Expected("Region Data type", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "Region Data type", self.FileName, self.CurrentLineNumber)
 
         if not self._IsToken(TAB_EQUAL_SPLIT):
             raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
 
         if not self._IsToken("{"):
-            raise Warning.ExpectedCurlyOpen(self.FileName, self.CurrentLineNumber)
+            raise Warning.ExpectedCurlyOpen(
+                self.FileName, self.CurrentLineNumber)
 
         if not self._GetNextHexNumber():
-            raise Warning.Expected("Hex byte", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "Hex byte", self.FileName, self.CurrentLineNumber)
 
         if len(self._Token) > 18:
-            raise Warning("Hex string can't be converted to a valid UINT64 value", self.FileName, self.CurrentLineNumber)
+            raise Warning("Hex string can't be converted to a valid UINT64 value",
+                          self.FileName, self.CurrentLineNumber)
 
         # convert hex string value to byte hex string array
         AllString = self._Token
-        AllStrLen = len (AllString)
+        AllStrLen = len(AllString)
         DataString = ""
         while AllStrLen > 4:
-            DataString = DataString + "0x" + AllString[AllStrLen - 2: AllStrLen] + TAB_COMMA_SPLIT
-            AllStrLen  = AllStrLen - 2
+            DataString = DataString + "0x" + \
+                AllString[AllStrLen - 2: AllStrLen] + TAB_COMMA_SPLIT
+            AllStrLen = AllStrLen - 2
         DataString = DataString + AllString[:AllStrLen] + TAB_COMMA_SPLIT
 
         # byte value array
-        if len (self._Token) <= 4:
+        if len(self._Token) <= 4:
             while self._IsToken(TAB_COMMA_SPLIT):
                 if not self._GetNextHexNumber():
-                    raise Warning("Invalid Hex number", self.FileName, self.CurrentLineNumber)
+                    raise Warning("Invalid Hex number",
+                                  self.FileName, self.CurrentLineNumber)
                 if len(self._Token) > 4:
-                    raise Warning("Hex byte(must be 2 digits) too long", self.FileName, self.CurrentLineNumber)
+                    raise Warning("Hex byte(must be 2 digits) too long",
+                                  self.FileName, self.CurrentLineNumber)
                 DataString += self._Token
                 DataString += TAB_COMMA_SPLIT
 
         if not self._IsToken(T_CHAR_BRACE_R):
-            raise Warning.ExpectedCurlyClose(self.FileName, self.CurrentLineNumber)
+            raise Warning.ExpectedCurlyClose(
+                self.FileName, self.CurrentLineNumber)
 
         DataString = DataString.rstrip(TAB_COMMA_SPLIT)
         RegionObj.RegionType = "DATA"
@@ -2017,43 +2177,51 @@ class FdfParser:
         while self._IsKeyword("DATA"):
 
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._IsToken("{"):
-                raise Warning.ExpectedCurlyOpen(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedCurlyOpen(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._GetNextHexNumber():
-                raise Warning.Expected("Hex byte", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "Hex byte", self.FileName, self.CurrentLineNumber)
 
             if len(self._Token) > 18:
-                raise Warning("Hex string can't be converted to a valid UINT64 value", self.FileName, self.CurrentLineNumber)
+                raise Warning("Hex string can't be converted to a valid UINT64 value",
+                              self.FileName, self.CurrentLineNumber)
 
             # convert hex string value to byte hex string array
             AllString = self._Token
-            AllStrLen = len (AllString)
+            AllStrLen = len(AllString)
             DataString = ""
             while AllStrLen > 4:
-                DataString = DataString + "0x" + AllString[AllStrLen - 2: AllStrLen] + TAB_COMMA_SPLIT
-                AllStrLen  = AllStrLen - 2
+                DataString = DataString + "0x" + \
+                    AllString[AllStrLen - 2: AllStrLen] + TAB_COMMA_SPLIT
+                AllStrLen = AllStrLen - 2
             DataString = DataString + AllString[:AllStrLen] + TAB_COMMA_SPLIT
 
             # byte value array
-            if len (self._Token) <= 4:
+            if len(self._Token) <= 4:
                 while self._IsToken(TAB_COMMA_SPLIT):
                     if not self._GetNextHexNumber():
-                        raise Warning("Invalid Hex number", self.FileName, self.CurrentLineNumber)
+                        raise Warning("Invalid Hex number",
+                                      self.FileName, self.CurrentLineNumber)
                     if len(self._Token) > 4:
-                        raise Warning("Hex byte(must be 2 digits) too long", self.FileName, self.CurrentLineNumber)
+                        raise Warning("Hex byte(must be 2 digits) too long",
+                                      self.FileName, self.CurrentLineNumber)
                     DataString += self._Token
                     DataString += TAB_COMMA_SPLIT
 
             if not self._IsToken(T_CHAR_BRACE_R):
-                raise Warning.ExpectedCurlyClose(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedCurlyClose(
+                    self.FileName, self.CurrentLineNumber)
 
             DataString = DataString.rstrip(TAB_COMMA_SPLIT)
             RegionObj.RegionDataList.append(DataString)
 
-    ## _GetFv() method
+    # _GetFv() method
     #
     #   Get FV section contents and store its data into FV dictionary of self.Profile
     #
@@ -2073,23 +2241,27 @@ class FdfParser:
 
         self._UndoToken()
         if not self._IsToken("[FV.", True):
-            FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
-            #print 'Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
+            FileLineTuple = GetRealFileLine(
+                self.FileName, self.CurrentLineNumber)
+            # print 'Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
             #        % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine:], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine)
-            raise Warning("Unknown Keyword '%s'" % self._Token, self.FileName, self.CurrentLineNumber)
+            raise Warning("Unknown Keyword '%s'" % self._Token,
+                          self.FileName, self.CurrentLineNumber)
 
         FvName = self._GetUiName()
         self.CurrentFvName = FvName.upper()
 
         if not self._IsToken(TAB_SECTION_END):
-            raise Warning.ExpectedBracketClose(self.FileName, self.CurrentLineNumber)
+            raise Warning.ExpectedBracketClose(
+                self.FileName, self.CurrentLineNumber)
 
         FvObj = FV(Name=self.CurrentFvName)
         self.Profile.FvDict[self.CurrentFvName] = FvObj
 
         Status = self._GetCreateFile(FvObj)
         if not Status:
-            raise Warning("FV name error", self.FileName, self.CurrentLineNumber)
+            raise Warning("FV name error", self.FileName,
+                          self.CurrentLineNumber)
 
         self._GetDefineStatements(FvObj)
 
@@ -2099,13 +2271,14 @@ class FdfParser:
             self._GetSetStatements(FvObj)
 
             if not (self._GetBlockStatement(FvObj) or self._GetFvBaseAddress(FvObj) or
-                self._GetFvForceRebase(FvObj) or self._GetFvAlignment(FvObj) or
-                self._GetFvAttributes(FvObj) or self._GetFvNameGuid(FvObj) or
-                self._GetFvExtEntryStatement(FvObj) or self._GetFvNameString(FvObj)):
+                    self._GetFvForceRebase(FvObj) or self._GetFvAlignment(FvObj) or
+                    self._GetFvAttributes(FvObj) or self._GetFvNameGuid(FvObj) or
+                    self._GetFvExtEntryStatement(FvObj) or self._GetFvNameString(FvObj)):
                 break
 
         if FvObj.FvNameString == 'TRUE' and not FvObj.FvNameGuid:
-            raise Warning("FvNameString found but FvNameGuid was not found", self.FileName, self.CurrentLineNumber)
+            raise Warning("FvNameString found but FvNameGuid was not found",
+                          self.FileName, self.CurrentLineNumber)
 
         self._GetAprioriSection(FvObj)
         self._GetAprioriSection(FvObj)
@@ -2118,7 +2291,7 @@ class FdfParser:
 
         return True
 
-    ## _GetFvAlignment() method
+    # _GetFvAlignment() method
     #
     #   Get alignment for FV
     #
@@ -2135,17 +2308,19 @@ class FdfParser:
             raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
 
         if not self._GetNextToken():
-            raise Warning.Expected("alignment value", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "alignment value", self.FileName, self.CurrentLineNumber)
 
-        if self._Token.upper() not in {"1", "2", "4", "8", "16", "32", "64", "128", "256", "512", \
-                                        "1K", "2K", "4K", "8K", "16K", "32K", "64K", "128K", "256K", "512K", \
-                                        "1M", "2M", "4M", "8M", "16M", "32M", "64M", "128M", "256M", "512M", \
-                                        "1G", "2G"}:
-            raise Warning("Unknown alignment value '%s'" % self._Token, self.FileName, self.CurrentLineNumber)
+        if self._Token.upper() not in {"1", "2", "4", "8", "16", "32", "64", "128", "256", "512",
+                                       "1K", "2K", "4K", "8K", "16K", "32K", "64K", "128K", "256K", "512K",
+                                       "1M", "2M", "4M", "8M", "16M", "32M", "64M", "128M", "256M", "512M",
+                                       "1G", "2G"}:
+            raise Warning("Unknown alignment value '%s'" %
+                          self._Token, self.FileName, self.CurrentLineNumber)
         Obj.FvAlignment = self._Token
         return True
 
-    ## _GetFvBaseAddress() method
+    # _GetFvBaseAddress() method
     #
     #   Get BaseAddress for FV
     #
@@ -2162,14 +2337,16 @@ class FdfParser:
             raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
 
         if not self._GetNextToken():
-            raise Warning.Expected("FV base address value", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "FV base address value", self.FileName, self.CurrentLineNumber)
 
         if not BaseAddrValuePattern.match(self._Token.upper()):
-            raise Warning("Unknown FV base address value '%s'" % self._Token, self.FileName, self.CurrentLineNumber)
+            raise Warning("Unknown FV base address value '%s'" %
+                          self._Token, self.FileName, self.CurrentLineNumber)
         Obj.FvBaseAddress = self._Token
         return True
 
-    ## _GetFvForceRebase() method
+    # _GetFvForceRebase() method
     #
     #   Get FvForceRebase for FV
     #
@@ -2186,10 +2363,12 @@ class FdfParser:
             raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
 
         if not self._GetNextToken():
-            raise Warning.Expected("FvForceRebase value", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected("FvForceRebase value",
+                                   self.FileName, self.CurrentLineNumber)
 
         if self._Token.upper() not in {"TRUE", "FALSE", "0", "0X0", "0X00", "1", "0X1", "0X01"}:
-            raise Warning("Unknown FvForceRebase value '%s'" % self._Token, self.FileName, self.CurrentLineNumber)
+            raise Warning("Unknown FvForceRebase value '%s'" %
+                          self._Token, self.FileName, self.CurrentLineNumber)
 
         if self._Token.upper() in {"TRUE", "1", "0X1", "0X01"}:
             Obj.FvForceRebase = True
@@ -2200,8 +2379,7 @@ class FdfParser:
 
         return True
 
-
-    ## _GetFvAttributes() method
+    # _GetFvAttributes() method
     #
     #   Get attributes for FV
     #
@@ -2209,31 +2387,34 @@ class FdfParser:
     #   @param  Obj         for whom attribute is got
     #   @retval None
     #
+
     def _GetFvAttributes(self, FvObj):
         IsWordToken = False
         while self._GetNextWord():
             IsWordToken = True
             name = self._Token
-            if name not in {"ERASE_POLARITY", "MEMORY_MAPPED", \
-                           "STICKY_WRITE", "LOCK_CAP", "LOCK_STATUS", "WRITE_ENABLED_CAP", \
-                           "WRITE_DISABLED_CAP", "WRITE_STATUS", "READ_ENABLED_CAP", \
-                           "READ_DISABLED_CAP", "READ_STATUS", "READ_LOCK_CAP", \
-                           "READ_LOCK_STATUS", "WRITE_LOCK_CAP", "WRITE_LOCK_STATUS", \
-                           "WRITE_POLICY_RELIABLE", "WEAK_ALIGNMENT", "FvUsedSizeEnable"}:
+            if name not in {"ERASE_POLARITY", "MEMORY_MAPPED",
+                            "STICKY_WRITE", "LOCK_CAP", "LOCK_STATUS", "WRITE_ENABLED_CAP",
+                            "WRITE_DISABLED_CAP", "WRITE_STATUS", "READ_ENABLED_CAP",
+                            "READ_DISABLED_CAP", "READ_STATUS", "READ_LOCK_CAP",
+                            "READ_LOCK_STATUS", "WRITE_LOCK_CAP", "WRITE_LOCK_STATUS",
+                            "WRITE_POLICY_RELIABLE", "WEAK_ALIGNMENT", "FvUsedSizeEnable"}:
                 self._UndoToken()
                 return False
 
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._GetNextToken() or self._Token.upper() not in {"TRUE", "FALSE", "1", "0"}:
-                raise Warning.Expected("TRUE/FALSE (1/0)", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "TRUE/FALSE (1/0)", self.FileName, self.CurrentLineNumber)
 
             FvObj.FvAttributeDict[name] = self._Token
 
         return IsWordToken
 
-    ## _GetFvNameGuid() method
+    # _GetFvNameGuid() method
     #
     #   Get FV GUID for FV
     #
@@ -2249,9 +2430,11 @@ class FdfParser:
             raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
 
         if not self._GetNextGuid():
-            raise Warning.Expected("GUID value", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "GUID value", self.FileName, self.CurrentLineNumber)
         if self._Token in GlobalData.gGuidDict:
-            self._Token = GuidStructureStringToGuidString(GlobalData.gGuidDict[self._Token]).upper()
+            self._Token = GuidStructureStringToGuidString(
+                GlobalData.gGuidDict[self._Token]).upper()
 
         FvObj.FvNameGuid = self._Token
 
@@ -2265,7 +2448,8 @@ class FdfParser:
             raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
 
         if not self._GetNextToken() or self._Token.upper() not in {'TRUE', 'FALSE'}:
-            raise Warning.Expected("TRUE or FALSE for FvNameString", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "TRUE or FALSE for FvNameString", self.FileName, self.CurrentLineNumber)
 
         FvObj.FvNameString = self._Token
 
@@ -2275,73 +2459,88 @@ class FdfParser:
         if not (self._IsKeyword("FV_EXT_ENTRY") or self._IsKeyword("FV_EXT_ENTRY_TYPE")):
             return False
 
-        if not self._IsKeyword ("TYPE"):
-            raise Warning.Expected("'TYPE'", self.FileName, self.CurrentLineNumber)
+        if not self._IsKeyword("TYPE"):
+            raise Warning.Expected(
+                "'TYPE'", self.FileName, self.CurrentLineNumber)
 
         if not self._IsToken(TAB_EQUAL_SPLIT):
             raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
 
         if not self._GetNextHexNumber() and not self._GetNextDecimalNumber():
-            raise Warning.Expected("Hex FV extension entry type value At Line ", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "Hex FV extension entry type value At Line ", self.FileName, self.CurrentLineNumber)
 
         FvObj.FvExtEntryTypeValue.append(self._Token)
 
         if not self._IsToken("{"):
-            raise Warning.ExpectedCurlyOpen(self.FileName, self.CurrentLineNumber)
+            raise Warning.ExpectedCurlyOpen(
+                self.FileName, self.CurrentLineNumber)
 
         if not self._IsKeyword("FILE") and not self._IsKeyword("DATA"):
-            raise Warning.Expected("'FILE' or 'DATA'", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "'FILE' or 'DATA'", self.FileName, self.CurrentLineNumber)
 
         FvObj.FvExtEntryType.append(self._Token)
 
         if self._Token == 'DATA':
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._IsToken("{"):
-                raise Warning.ExpectedCurlyOpen(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedCurlyOpen(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._GetNextHexNumber():
-                raise Warning.Expected("Hex byte", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "Hex byte", self.FileName, self.CurrentLineNumber)
 
             if len(self._Token) > 4:
-                raise Warning("Hex byte(must be 2 digits) too long", self.FileName, self.CurrentLineNumber)
+                raise Warning("Hex byte(must be 2 digits) too long",
+                              self.FileName, self.CurrentLineNumber)
 
             DataString = self._Token
             DataString += TAB_COMMA_SPLIT
 
             while self._IsToken(TAB_COMMA_SPLIT):
                 if not self._GetNextHexNumber():
-                    raise Warning("Invalid Hex number", self.FileName, self.CurrentLineNumber)
+                    raise Warning("Invalid Hex number",
+                                  self.FileName, self.CurrentLineNumber)
                 if len(self._Token) > 4:
-                    raise Warning("Hex byte(must be 2 digits) too long", self.FileName, self.CurrentLineNumber)
+                    raise Warning("Hex byte(must be 2 digits) too long",
+                                  self.FileName, self.CurrentLineNumber)
                 DataString += self._Token
                 DataString += TAB_COMMA_SPLIT
 
             if not self._IsToken(T_CHAR_BRACE_R):
-                raise Warning.ExpectedCurlyClose(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedCurlyClose(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._IsToken(T_CHAR_BRACE_R):
-                raise Warning.ExpectedCurlyClose(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedCurlyClose(
+                    self.FileName, self.CurrentLineNumber)
 
             DataString = DataString.rstrip(TAB_COMMA_SPLIT)
             FvObj.FvExtEntryData.append(DataString)
 
         if self._Token == 'FILE':
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._GetNextToken():
-                raise Warning.Expected("FV Extension Entry file path At Line ", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "FV Extension Entry file path At Line ", self.FileName, self.CurrentLineNumber)
 
             FvObj.FvExtEntryData.append(self._Token)
 
             if not self._IsToken(T_CHAR_BRACE_R):
-                raise Warning.ExpectedCurlyClose(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedCurlyClose(
+                    self.FileName, self.CurrentLineNumber)
 
         return True
 
-    ## _GetAprioriSection() method
+    # _GetAprioriSection() method
     #
     #   Get token statements
     #
@@ -2355,11 +2554,13 @@ class FdfParser:
             return False
 
         if not self._IsKeyword("PEI") and not self._IsKeyword("DXE"):
-            raise Warning.Expected("Apriori file type", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected("Apriori file type",
+                                   self.FileName, self.CurrentLineNumber)
         AprType = self._Token
 
         if not self._IsToken("{"):
-            raise Warning.ExpectedCurlyOpen(self.FileName, self.CurrentLineNumber)
+            raise Warning.ExpectedCurlyOpen(
+                self.FileName, self.CurrentLineNumber)
 
         AprSectionObj = AprioriSection()
         AprSectionObj.AprioriType = AprType
@@ -2373,7 +2574,8 @@ class FdfParser:
                 break
 
         if not self._IsToken(T_CHAR_BRACE_R):
-            raise Warning.ExpectedCurlyClose(self.FileName, self.CurrentLineNumber)
+            raise Warning.ExpectedCurlyClose(
+                self.FileName, self.CurrentLineNumber)
 
         FvObj.AprioriSectionList.append(AprSectionObj)
         return True
@@ -2386,36 +2588,42 @@ class FdfParser:
         self._GetInfOptions(ffsInf)
 
         if not self._GetNextToken():
-            raise Warning.Expected("INF file path", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "INF file path", self.FileName, self.CurrentLineNumber)
         ffsInf.InfFileName = self._Token
         if not ffsInf.InfFileName.endswith('.inf'):
-            raise Warning.Expected(".inf file path", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                ".inf file path", self.FileName, self.CurrentLineNumber)
 
         ffsInf.CurrentLineNum = self.CurrentLineNumber
         ffsInf.CurrentLineContent = self._CurrentLine()
 
-        #Replace $(SAPCE) with real space
+        # Replace $(SPACE) with real space
         ffsInf.InfFileName = ffsInf.InfFileName.replace('$(SPACE)', ' ')
 
         if ffsInf.InfFileName.replace(TAB_WORKSPACE, '').find('$') == -1:
-            #do case sensitive check for file path
-            ErrorCode, ErrorInfo = PathClass(NormPath(ffsInf.InfFileName), GenFdsGlobalVariable.WorkSpaceDir).Validate()
+            # do case sensitive check for file path
+            ErrorCode, ErrorInfo = PathClass(
+                NormPath(ffsInf.InfFileName), GenFdsGlobalVariable.WorkSpaceDir).Validate()
             if ErrorCode != 0:
                 EdkLogger.error("GenFds", ErrorCode, ExtraData=ErrorInfo)
 
         NewFileName = ffsInf.InfFileName
         if ffsInf.OverrideGuid:
-            NewFileName = ProcessDuplicatedInf(PathClass(ffsInf.InfFileName,GenFdsGlobalVariable.WorkSpaceDir), ffsInf.OverrideGuid, GenFdsGlobalVariable.WorkSpaceDir).Path
+            NewFileName = ProcessDuplicatedInf(PathClass(
+                ffsInf.InfFileName, GenFdsGlobalVariable.WorkSpaceDir), ffsInf.OverrideGuid, GenFdsGlobalVariable.WorkSpaceDir).Path
 
         if not NewFileName in self.Profile.InfList:
             self.Profile.InfList.append(NewFileName)
-            FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
+            FileLineTuple = GetRealFileLine(
+                self.FileName, self.CurrentLineNumber)
             self.Profile.InfFileLineList.append(FileLineTuple)
             if ffsInf.UseArch:
                 if ffsInf.UseArch not in self.Profile.InfDict:
                     self.Profile.InfDict[ffsInf.UseArch] = [ffsInf.InfFileName]
                 else:
-                    self.Profile.InfDict[ffsInf.UseArch].append(ffsInf.InfFileName)
+                    self.Profile.InfDict[ffsInf.UseArch].append(
+                        ffsInf.InfFileName)
             else:
                 self.Profile.InfDict['ArchTBD'].append(ffsInf.InfFileName)
 
@@ -2425,10 +2633,11 @@ class FdfParser:
             elif self._IsKeyword('RELOCS_RETAINED'):
                 ffsInf.KeepReloc = True
             else:
-                raise Warning("Unknown reloc strip flag '%s'" % self._Token, self.FileName, self.CurrentLineNumber)
+                raise Warning("Unknown reloc strip flag '%s'" %
+                              self._Token, self.FileName, self.CurrentLineNumber)
         return ffsInf
 
-    ## _GetInfStatement() method
+    # _GetInfStatement() method
     #
     #   Get INF statements
     #
@@ -2450,7 +2659,7 @@ class FdfParser:
             Obj.FfsList.append(ffsInf)
         return True
 
-    ## _GetInfOptions() method
+    # _GetInfOptions() method
     #
     #   Get options for INF
     #
@@ -2460,48 +2669,59 @@ class FdfParser:
     def _GetInfOptions(self, FfsInfObj):
         if self._IsKeyword("FILE_GUID"):
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
             if not self._GetNextGuid():
-                raise Warning.Expected("GUID value", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "GUID value", self.FileName, self.CurrentLineNumber)
             if self._Token in GlobalData.gGuidDict:
-                self._Token = GuidStructureStringToGuidString(GlobalData.gGuidDict[self._Token]).upper()
+                self._Token = GuidStructureStringToGuidString(
+                    GlobalData.gGuidDict[self._Token]).upper()
             FfsInfObj.OverrideGuid = self._Token
 
         if self._IsKeyword("RuleOverride"):
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
             if not self._GetNextToken():
-                raise Warning.Expected("Rule name", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "Rule name", self.FileName, self.CurrentLineNumber)
             FfsInfObj.Rule = self._Token
 
         if self._IsKeyword("VERSION"):
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
             if not self._GetNextToken():
-                raise Warning.Expected("Version", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "Version", self.FileName, self.CurrentLineNumber)
 
             if self._GetStringData():
                 FfsInfObj.Version = self._Token
 
         if self._IsKeyword(BINARY_FILE_TYPE_UI):
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
             if not self._GetNextToken():
-                raise Warning.Expected("UI name", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "UI name", self.FileName, self.CurrentLineNumber)
 
             if self._GetStringData():
                 FfsInfObj.Ui = self._Token
 
         if self._IsKeyword("USE"):
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
             if not self._GetNextToken():
-                raise Warning.Expected("ARCH name", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "ARCH name", self.FileName, self.CurrentLineNumber)
             FfsInfObj.UseArch = self._Token
 
-
         if self._GetNextToken():
-            p = compile(r'([a-zA-Z0-9\-]+|\$\(TARGET\)|\*)_([a-zA-Z0-9\-]+|\$\(TOOL_CHAIN_TAG\)|\*)_([a-zA-Z0-9\-]+|\$\(ARCH\))')
+            p = compile(
+                r'([a-zA-Z0-9\-]+|\$\(TARGET\)|\*)_([a-zA-Z0-9\-]+|\$\(TOOL_CHAIN_TAG\)|\*)_([a-zA-Z0-9\-]+|\$\(ARCH\))')
             if p.match(self._Token) and p.match(self._Token).span()[1] == len(self._Token):
                 FfsInfObj.KeyStringList.append(self._Token)
                 if not self._IsToken(TAB_COMMA_SPLIT):
@@ -2512,13 +2732,14 @@ class FdfParser:
 
             while self._GetNextToken():
                 if not p.match(self._Token):
-                    raise Warning.Expected("KeyString \"Target_Tag_Arch\"", self.FileName, self.CurrentLineNumber)
+                    raise Warning.Expected(
+                        "KeyString \"Target_Tag_Arch\"", self.FileName, self.CurrentLineNumber)
                 FfsInfObj.KeyStringList.append(self._Token)
 
                 if not self._IsToken(TAB_COMMA_SPLIT):
                     break
 
-    ## _GetFileStatement() method
+    # _GetFileStatement() method
     #
     #   Get FILE statements
     #
@@ -2527,12 +2748,13 @@ class FdfParser:
     #   @retval True        Successfully find FILE statement
     #   @retval False       Not able to find FILE statement
     #
-    def _GetFileStatement(self, Obj, ForCapsule = False):
+    def _GetFileStatement(self, Obj, ForCapsule=False):
         if not self._IsKeyword("FILE"):
             return False
 
         if not self._GetNextWord():
-            raise Warning.Expected("FFS type", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "FFS type", self.FileName, self.CurrentLineNumber)
 
         if ForCapsule and self._Token == 'DATA':
             self._UndoToken()
@@ -2547,17 +2769,21 @@ class FdfParser:
 
         if not self._GetNextGuid():
             if not self._GetNextWord():
-                raise Warning.Expected("File GUID", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "File GUID", self.FileName, self.CurrentLineNumber)
             if self._Token == 'PCD':
                 if not self._IsToken("("):
-                    raise Warning.Expected("'('", self.FileName, self.CurrentLineNumber)
+                    raise Warning.Expected(
+                        "'('", self.FileName, self.CurrentLineNumber)
                 PcdPair = self._GetNextPcdSettings()
                 if not self._IsToken(")"):
-                    raise Warning.Expected("')'", self.FileName, self.CurrentLineNumber)
+                    raise Warning.Expected(
+                        "')'", self.FileName, self.CurrentLineNumber)
                 self._Token = 'PCD('+PcdPair[1]+TAB_SPLIT+PcdPair[0]+')'
 
         if self._Token in GlobalData.gGuidDict:
-            self._Token = GuidStructureStringToGuidString(GlobalData.gGuidDict[self._Token]).upper()
+            self._Token = GuidStructureStringToGuidString(
+                GlobalData.gGuidDict[self._Token]).upper()
         FfsFileObj.NameGuid = self._Token
 
         self._GetFilePart(FfsFileObj)
@@ -2571,7 +2797,7 @@ class FdfParser:
 
         return True
 
-    ## _FileCouldHaveRelocFlag() method
+    # _FileCouldHaveRelocFlag() method
     #
     #   Check whether reloc strip flag can be set for a file type.
     #
@@ -2580,13 +2806,13 @@ class FdfParser:
     #   @retval False       No way to have it
     #
     @staticmethod
-    def _FileCouldHaveRelocFlag (FileType):
+    def _FileCouldHaveRelocFlag(FileType):
         if FileType in {SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM, SUP_MODULE_MM_CORE_STANDALONE, 'PEI_DXE_COMBO'}:
             return True
         else:
             return False
 
-    ## _SectionCouldHaveRelocFlag() method
+    # _SectionCouldHaveRelocFlag() method
     #
     #   Check whether reloc strip flag can be set for a section type.
     #
@@ -2595,13 +2821,13 @@ class FdfParser:
     #   @retval False       No way to have it
     #
     @staticmethod
-    def _SectionCouldHaveRelocFlag (SectionType):
+    def _SectionCouldHaveRelocFlag(SectionType):
         if SectionType in {BINARY_FILE_TYPE_TE, BINARY_FILE_TYPE_PE32}:
             return True
         else:
             return False
 
-    ## _GetFilePart() method
+    # _GetFilePart() method
     #
     #   Get components for FILE statement
     #
@@ -2619,26 +2845,33 @@ class FdfParser:
                     else:
                         FfsFileObj.KeepReloc = True
                 else:
-                    raise Warning("File type %s could not have reloc strip flag%d" % (FfsFileObj.FvFileType, self.CurrentLineNumber), self.FileName, self.CurrentLineNumber)
+                    raise Warning("File type %s could not have reloc strip flag%d" % (
+                        FfsFileObj.FvFileType, self.CurrentLineNumber), self.FileName, self.CurrentLineNumber)
 
             if not self._IsToken("{"):
-                raise Warning.ExpectedCurlyOpen(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedCurlyOpen(
+                    self.FileName, self.CurrentLineNumber)
 
         if not self._GetNextToken():
-            raise Warning.Expected("File name or section data", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "File name or section data", self.FileName, self.CurrentLineNumber)
 
         if self._Token == BINARY_FILE_TYPE_FV:
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
             if not self._GetNextToken():
-                raise Warning.Expected("FV name", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "FV name", self.FileName, self.CurrentLineNumber)
             FfsFileObj.FvName = self._Token
 
         elif self._Token == "FD":
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
             if not self._GetNextToken():
-                raise Warning.Expected("FD name", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "FD name", self.FileName, self.CurrentLineNumber)
             FfsFileObj.FdName = self._Token
 
         elif self._Token in {TAB_DEFINE, "APRIORI", "SECTION"}:
@@ -2656,9 +2889,10 @@ class FdfParser:
             self._VerifyFile(FfsFileObj.FileName)
 
         if not self._IsToken(T_CHAR_BRACE_R):
-            raise Warning.ExpectedCurlyClose(self.FileName, self.CurrentLineNumber)
+            raise Warning.ExpectedCurlyClose(
+                self.FileName, self.CurrentLineNumber)
 
-    ## _GetRAWData() method
+    # _GetRAWData() method
     #
     #   Get RAW data for FILE statement
     #
@@ -2672,20 +2906,24 @@ class FdfParser:
             AlignValue = None
             if self._GetAlignment():
                 if self._Token not in ALIGNMENTS:
-                    raise Warning("Incorrect alignment '%s'" % self._Token, self.FileName, self.CurrentLineNumber)
-                #For FFS, Auto is default option same to ""
+                    raise Warning("Incorrect alignment '%s'" %
+                                  self._Token, self.FileName, self.CurrentLineNumber)
+                # For FFS, Auto is default option same to ""
                 if not self._Token == "Auto":
                     AlignValue = self._Token
             if not self._GetNextToken():
-                raise Warning.Expected("Filename value", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "Filename value", self.FileName, self.CurrentLineNumber)
 
             FileName = self._Token.replace('$(SPACE)', ' ')
             if FileName == T_CHAR_BRACE_R:
                 self._UndoToken()
-                raise Warning.Expected("Filename value", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "Filename value", self.FileName, self.CurrentLineNumber)
 
             self._VerifyFile(FileName)
-            File = PathClass(NormPath(FileName), GenFdsGlobalVariable.WorkSpaceDir)
+            File = PathClass(NormPath(FileName),
+                             GenFdsGlobalVariable.WorkSpaceDir)
             FfsFileObj.FileName.append(File.Path)
             FfsFileObj.SubAlignment.append(AlignValue)
 
@@ -2698,7 +2936,7 @@ class FdfParser:
         if len(FfsFileObj.FileName) == 1:
             FfsFileObj.FileName = FfsFileObj.FileName[0]
 
-    ## _GetFileOpts() method
+    # _GetFileOpts() method
     #
     #   Get options for FILE statement
     #
@@ -2712,7 +2950,8 @@ class FdfParser:
                 if self._IsToken(TAB_COMMA_SPLIT):
                     while self._GetNextToken():
                         if not TokenFindPattern.match(self._Token):
-                            raise Warning.Expected("KeyString \"Target_Tag_Arch\"", self.FileName, self.CurrentLineNumber)
+                            raise Warning.Expected(
+                                "KeyString \"Target_Tag_Arch\"", self.FileName, self.CurrentLineNumber)
                         FfsFileObj.KeyStringList.append(self._Token)
 
                         if not self._IsToken(TAB_COMMA_SPLIT):
@@ -2729,12 +2968,13 @@ class FdfParser:
 
         if self._GetAlignment():
             if self._Token not in ALIGNMENTS:
-                raise Warning("Incorrect alignment '%s'" % self._Token, self.FileName, self.CurrentLineNumber)
-            #For FFS, Auto is default option same to ""
+                raise Warning("Incorrect alignment '%s'" %
+                              self._Token, self.FileName, self.CurrentLineNumber)
+            # For FFS, Auto is default option same to ""
             if not self._Token == "Auto":
                 FfsFileObj.Alignment = self._Token
 
-    ## _GetAlignment() method
+    # _GetAlignment() method
     #
     #   Return the alignment value
     #
@@ -2745,15 +2985,17 @@ class FdfParser:
     def _GetAlignment(self):
         if self._IsKeyword("Align", True):
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._GetNextToken():
-                raise Warning.Expected("alignment value", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "alignment value", self.FileName, self.CurrentLineNumber)
             return True
 
         return False
 
-    ## _GetSectionData() method
+    # _GetSectionData() method
     #
     #   Get section data for FILE statement
     #
@@ -2769,7 +3011,7 @@ class FdfParser:
             if not IsLeafSection and not IsEncapSection:
                 break
 
-    ## _GetLeafSection() method
+    # _GetLeafSection() method
     #
     #   Get leaf section for Obj
     #
@@ -2783,33 +3025,40 @@ class FdfParser:
 
         if not self._IsKeyword("SECTION"):
             if len(Obj.SectionList) == 0:
-                raise Warning.Expected("SECTION", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "SECTION", self.FileName, self.CurrentLineNumber)
             else:
                 return False
 
         AlignValue = None
         if self._GetAlignment():
             if self._Token not in ALIGNMENTS:
-                raise Warning("Incorrect alignment '%s'" % self._Token, self.FileName, self.CurrentLineNumber)
+                raise Warning("Incorrect alignment '%s'" %
+                              self._Token, self.FileName, self.CurrentLineNumber)
             AlignValue = self._Token
 
         BuildNum = None
         if self._IsKeyword("BUILD_NUM"):
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._GetNextToken():
-                raise Warning.Expected("Build number value", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "Build number value", self.FileName, self.CurrentLineNumber)
 
             BuildNum = self._Token
 
         if self._IsKeyword("VERSION"):
             if AlignValue == 'Auto':
-                raise Warning("Auto alignment can only be used in PE32 or TE section ", self.FileName, self.CurrentLineNumber)
+                raise Warning("Auto alignment can only be used in PE32 or TE section ",
+                              self.FileName, self.CurrentLineNumber)
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
             if not self._GetNextToken():
-                raise Warning.Expected("version", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "version", self.FileName, self.CurrentLineNumber)
             VerSectionObj = VerSection()
             VerSectionObj.Alignment = AlignValue
             VerSectionObj.BuildNum = BuildNum
@@ -2821,11 +3070,14 @@ class FdfParser:
 
         elif self._IsKeyword(BINARY_FILE_TYPE_UI):
             if AlignValue == 'Auto':
-                raise Warning("Auto alignment can only be used in PE32 or TE section ", self.FileName, self.CurrentLineNumber)
+                raise Warning("Auto alignment can only be used in PE32 or TE section ",
+                              self.FileName, self.CurrentLineNumber)
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
             if not self._GetNextToken():
-                raise Warning.Expected("UI", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "UI", self.FileName, self.CurrentLineNumber)
             UiSectionObj = UiSection()
             UiSectionObj.Alignment = AlignValue
             if self._GetStringData():
@@ -2836,11 +3088,14 @@ class FdfParser:
 
         elif self._IsKeyword("FV_IMAGE"):
             if AlignValue == 'Auto':
-                raise Warning("Auto alignment can only be used in PE32 or TE section ", self.FileName, self.CurrentLineNumber)
+                raise Warning("Auto alignment can only be used in PE32 or TE section ",
+                              self.FileName, self.CurrentLineNumber)
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
             if not self._GetNextToken():
-                raise Warning.Expected("FV name or FV file path", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "FV name or FV file path", self.FileName, self.CurrentLineNumber)
 
             FvName = self._Token
             FvObj = None
@@ -2862,7 +3117,8 @@ class FdfParser:
                         break
 
                 if not self._IsToken(T_CHAR_BRACE_R):
-                    raise Warning.ExpectedCurlyClose(self.FileName, self.CurrentLineNumber)
+                    raise Warning.ExpectedCurlyClose(
+                        self.FileName, self.CurrentLineNumber)
 
             FvImageSectionObj = FvImageSection()
             FvImageSectionObj.Alignment = AlignValue
@@ -2877,35 +3133,43 @@ class FdfParser:
 
         elif self._IsKeyword("PEI_DEPEX_EXP") or self._IsKeyword("DXE_DEPEX_EXP") or self._IsKeyword("SMM_DEPEX_EXP"):
             if AlignValue == 'Auto':
-                raise Warning("Auto alignment can only be used in PE32 or TE section ", self.FileName, self.CurrentLineNumber)
+                raise Warning("Auto alignment can only be used in PE32 or TE section ",
+                              self.FileName, self.CurrentLineNumber)
             DepexSectionObj = DepexSection()
             DepexSectionObj.Alignment = AlignValue
             DepexSectionObj.DepexType = self._Token
 
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
             if not self._IsToken("{"):
-                raise Warning.ExpectedCurlyOpen(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedCurlyOpen(
+                    self.FileName, self.CurrentLineNumber)
             if not self._SkipToToken(T_CHAR_BRACE_R):
-                raise Warning.Expected("Depex expression ending '}'", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "Depex expression ending '}'", self.FileName, self.CurrentLineNumber)
 
-            DepexSectionObj.Expression = self._SkippedChars.rstrip(T_CHAR_BRACE_R)
+            DepexSectionObj.Expression = self._SkippedChars.rstrip(
+                T_CHAR_BRACE_R)
             Obj.SectionList.append(DepexSectionObj)
 
         else:
             if not self._GetNextWord():
-                raise Warning.Expected("section type", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "section type", self.FileName, self.CurrentLineNumber)
 
             # Encapsulation section appear, UndoToken and return
             if self._Token == "COMPRESS" or self._Token == "GUIDED":
                 self.SetFileBufferPos(OldPos)
                 return False
 
-            if self._Token not in {"COMPAT16", BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE", "RAW", BINARY_FILE_TYPE_DXE_DEPEX,\
-                               BINARY_FILE_TYPE_UI, "VERSION", BINARY_FILE_TYPE_PEI_DEPEX, "SUBTYPE_GUID", BINARY_FILE_TYPE_SMM_DEPEX}:
-                raise Warning("Unknown section type '%s'" % self._Token, self.FileName, self.CurrentLineNumber)
-            if AlignValue == 'Auto'and (not self._Token == BINARY_FILE_TYPE_PE32) and (not self._Token == BINARY_FILE_TYPE_TE):
-                raise Warning("Auto alignment can only be used in PE32 or TE section ", self.FileName, self.CurrentLineNumber)
+            if self._Token not in {"COMPAT16", BINARY_FILE_TYPE_PE32, BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE", "RAW", BINARY_FILE_TYPE_DXE_DEPEX,
+                                   BINARY_FILE_TYPE_UI, "VERSION", BINARY_FILE_TYPE_PEI_DEPEX, "SUBTYPE_GUID", BINARY_FILE_TYPE_SMM_DEPEX}:
+                raise Warning("Unknown section type '%s'" %
+                              self._Token, self.FileName, self.CurrentLineNumber)
+            if AlignValue == 'Auto' and (not self._Token == BINARY_FILE_TYPE_PE32) and (not self._Token == BINARY_FILE_TYPE_TE):
+                raise Warning("Auto alignment can only be used in PE32 or TE section ",
+                              self.FileName, self.CurrentLineNumber)
 
             # DataSection
             DataSectionObj = DataSection()
@@ -2919,11 +3183,13 @@ class FdfParser:
                     else:
                         DataSectionObj.KeepReloc = True
                 else:
-                    raise Warning("File type %s, section type %s, could not have reloc strip flag%d" % (Obj.FvFileType, DataSectionObj.SecType, self.CurrentLineNumber), self.FileName, self.CurrentLineNumber)
+                    raise Warning("File type %s, section type %s, could not have reloc strip flag%d" % (
+                        Obj.FvFileType, DataSectionObj.SecType, self.CurrentLineNumber), self.FileName, self.CurrentLineNumber)
 
             if self._IsToken(TAB_EQUAL_SPLIT):
                 if not self._GetNextToken():
-                    raise Warning.Expected("section file path", self.FileName, self.CurrentLineNumber)
+                    raise Warning.Expected(
+                        "section file path", self.FileName, self.CurrentLineNumber)
                 DataSectionObj.SectFileName = self._Token
                 self._VerifyFile(DataSectionObj.SectFileName)
             else:
@@ -2934,7 +3200,7 @@ class FdfParser:
 
         return True
 
-    ## _VerifyFile
+    # _VerifyFile
     #
     #    Check if file exists or not:
     #      If current phase if GenFds, the file must exist;
@@ -2945,11 +3211,12 @@ class FdfParser:
         if FileName.replace(TAB_WORKSPACE, '').find('$') != -1:
             return
         if not GlobalData.gAutoGenPhase or not self._GetMacroValue(TAB_DSC_DEFINES_OUTPUT_DIRECTORY) in FileName:
-            ErrorCode, ErrorInfo = PathClass(NormPath(FileName), GenFdsGlobalVariable.WorkSpaceDir).Validate()
+            ErrorCode, ErrorInfo = PathClass(
+                NormPath(FileName), GenFdsGlobalVariable.WorkSpaceDir).Validate()
             if ErrorCode != 0:
                 EdkLogger.error("GenFds", ErrorCode, ExtraData=ErrorInfo)
 
-    ## _GetCglSection() method
+    # _GetCglSection() method
     #
     #   Get compressed or GUIDed section for Obj
     #
@@ -2959,7 +3226,7 @@ class FdfParser:
     #   @retval True        Successfully find section statement
     #   @retval False       Not able to find section statement
     #
-    def _GetCglSection(self, Obj, AlignValue = None):
+    def _GetCglSection(self, Obj, AlignValue=None):
 
         if self._IsKeyword("COMPRESS"):
             type = "PI_STD"
@@ -2967,7 +3234,8 @@ class FdfParser:
                 type = self._Token
 
             if not self._IsToken("{"):
-                raise Warning.ExpectedCurlyOpen(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedCurlyOpen(
+                    self.FileName, self.CurrentLineNumber)
 
             CompressSectionObj = CompressSection()
             CompressSectionObj.Alignment = AlignValue
@@ -2979,9 +3247,9 @@ class FdfParser:
                 if not IsLeafSection and not IsEncapSection:
                     break
 
-
             if not self._IsToken(T_CHAR_BRACE_R):
-                raise Warning.ExpectedCurlyClose(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedCurlyClose(
+                    self.FileName, self.CurrentLineNumber)
             Obj.SectionList.append(CompressSectionObj)
             return True
 
@@ -2989,12 +3257,14 @@ class FdfParser:
             GuidValue = None
             if self._GetNextGuid():
                 if self._Token in GlobalData.gGuidDict:
-                    self._Token = GuidStructureStringToGuidString(GlobalData.gGuidDict[self._Token]).upper()
+                    self._Token = GuidStructureStringToGuidString(
+                        GlobalData.gGuidDict[self._Token]).upper()
                 GuidValue = self._Token
 
             AttribDict = self._GetGuidAttrib()
             if not self._IsToken("{"):
-                raise Warning.ExpectedCurlyOpen(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedCurlyOpen(
+                    self.FileName, self.CurrentLineNumber)
             GuidSectionObj = GuidSection()
             GuidSectionObj.Alignment = AlignValue
             GuidSectionObj.NameGuid = GuidValue
@@ -3010,14 +3280,15 @@ class FdfParser:
                     break
 
             if not self._IsToken(T_CHAR_BRACE_R):
-                raise Warning.ExpectedCurlyClose(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedCurlyClose(
+                    self.FileName, self.CurrentLineNumber)
             Obj.SectionList.append(GuidSectionObj)
 
             return True
 
         return False
 
-    ## _GetGuidAttri() method
+    # _GetGuidAttri() method
     #
     #   Get attributes for GUID section
     #
@@ -3030,14 +3301,16 @@ class FdfParser:
         AttribDict["AUTH_STATUS_VALID"] = "NONE"
         AttribDict["EXTRA_HEADER_SIZE"] = -1
         while self._IsKeyword("PROCESSING_REQUIRED") or self._IsKeyword("AUTH_STATUS_VALID") \
-            or self._IsKeyword("EXTRA_HEADER_SIZE"):
+                or self._IsKeyword("EXTRA_HEADER_SIZE"):
             AttribKey = self._Token
 
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._GetNextToken():
-                raise Warning.Expected("TRUE(1)/FALSE(0)/Number", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "TRUE(1)/FALSE(0)/Number", self.FileName, self.CurrentLineNumber)
             elif AttribKey == "EXTRA_HEADER_SIZE":
                 Base = 10
                 if self._Token[0:2].upper() == "0X":
@@ -3046,14 +3319,16 @@ class FdfParser:
                     AttribDict[AttribKey] = int(self._Token, Base)
                     continue
                 except ValueError:
-                    raise Warning.Expected("Number", self.FileName, self.CurrentLineNumber)
+                    raise Warning.Expected(
+                        "Number", self.FileName, self.CurrentLineNumber)
             elif self._Token.upper() not in {"TRUE", "FALSE", "1", "0"}:
-                raise Warning.Expected("TRUE/FALSE (1/0)", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "TRUE/FALSE (1/0)", self.FileName, self.CurrentLineNumber)
             AttribDict[AttribKey] = self._Token
 
         return AttribDict
 
-    ## _GetEncapsulationSec() method
+    # _GetEncapsulationSec() method
     #
     #   Get encapsulation section for FILE
     #
@@ -3066,14 +3341,16 @@ class FdfParser:
         OldPos = self.GetFileBufferPos()
         if not self._IsKeyword("SECTION"):
             if len(FfsFileObj.SectionList) == 0:
-                raise Warning.Expected("SECTION", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "SECTION", self.FileName, self.CurrentLineNumber)
             else:
                 return False
 
         AlignValue = None
         if self._GetAlignment():
             if self._Token not in ALIGNMENT_NOAUTO:
-                raise Warning("Incorrect alignment '%s'" % self._Token, self.FileName, self.CurrentLineNumber)
+                raise Warning("Incorrect alignment '%s'" %
+                              self._Token, self.FileName, self.CurrentLineNumber)
             AlignValue = self._Token
 
         if not self._GetCglSection(FfsFileObj, AlignValue):
@@ -3095,35 +3372,44 @@ class FdfParser:
         self._SkipToToken("[FMPPAYLOAD.", True)
         FmpUiName = self._GetUiName().upper()
         if FmpUiName in self.Profile.FmpPayloadDict:
-            raise Warning("Duplicated FMP UI name found: %s" % FmpUiName, self.FileName, self.CurrentLineNumber)
+            raise Warning("Duplicated FMP UI name found: %s" %
+                          FmpUiName, self.FileName, self.CurrentLineNumber)
 
         FmpData = CapsulePayload()
         FmpData.UiName = FmpUiName
 
         if not self._IsToken(TAB_SECTION_END):
-            raise Warning.ExpectedBracketClose(self.FileName, self.CurrentLineNumber)
+            raise Warning.ExpectedBracketClose(
+                self.FileName, self.CurrentLineNumber)
 
         if not self._GetNextToken():
-            raise Warning("The FMP payload section is empty!", self.FileName, self.CurrentLineNumber)
-        FmpKeyList = ['IMAGE_HEADER_INIT_VERSION', 'IMAGE_TYPE_ID', 'IMAGE_INDEX', 'HARDWARE_INSTANCE', 'CERTIFICATE_GUID', 'MONOTONIC_COUNT']
+            raise Warning("The FMP payload section is empty!",
+                          self.FileName, self.CurrentLineNumber)
+        FmpKeyList = ['IMAGE_HEADER_INIT_VERSION', 'IMAGE_TYPE_ID', 'IMAGE_INDEX',
+                      'HARDWARE_INSTANCE', 'CERTIFICATE_GUID', 'MONOTONIC_COUNT']
         while self._Token in FmpKeyList:
             Name = self._Token
             FmpKeyList.remove(Name)
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
             if Name == 'IMAGE_TYPE_ID':
                 if not self._GetNextGuid():
-                    raise Warning.Expected("GUID value for IMAGE_TYPE_ID.", self.FileName, self.CurrentLineNumber)
+                    raise Warning.Expected(
+                        "GUID value for IMAGE_TYPE_ID.", self.FileName, self.CurrentLineNumber)
                 FmpData.ImageTypeId = self._Token
             elif Name == 'CERTIFICATE_GUID':
                 if not self._GetNextGuid():
-                    raise Warning.Expected("GUID value for CERTIFICATE_GUID.", self.FileName, self.CurrentLineNumber)
+                    raise Warning.Expected(
+                        "GUID value for CERTIFICATE_GUID.", self.FileName, self.CurrentLineNumber)
                 FmpData.Certificate_Guid = self._Token
                 if UUID(FmpData.Certificate_Guid) != EFI_CERT_TYPE_RSA2048_SHA256_GUID and UUID(FmpData.Certificate_Guid) != EFI_CERT_TYPE_PKCS7_GUID:
-                    raise Warning("Only support EFI_CERT_TYPE_RSA2048_SHA256_GUID or EFI_CERT_TYPE_PKCS7_GUID for CERTIFICATE_GUID.", self.FileName, self.CurrentLineNumber)
+                    raise Warning("Only support EFI_CERT_TYPE_RSA2048_SHA256_GUID or EFI_CERT_TYPE_PKCS7_GUID for CERTIFICATE_GUID.",
+                                  self.FileName, self.CurrentLineNumber)
             else:
                 if not self._GetNextToken():
-                    raise Warning.Expected("value of %s" % Name, self.FileName, self.CurrentLineNumber)
+                    raise Warning.Expected(
+                        "value of %s" % Name, self.FileName, self.CurrentLineNumber)
                 Value = self._Token
                 if Name == 'IMAGE_HEADER_INIT_VERSION':
                     if FdfParser._Verify(Name, Value, 'UINT8'):
@@ -3138,31 +3424,37 @@ class FdfParser:
                     if FdfParser._Verify(Name, Value, 'UINT64'):
                         FmpData.MonotonicCount = Value
                         if FmpData.MonotonicCount.upper().startswith('0X'):
-                            FmpData.MonotonicCount = int(FmpData.MonotonicCount, 16)
+                            FmpData.MonotonicCount = int(
+                                FmpData.MonotonicCount, 16)
                         else:
-                            FmpData.MonotonicCount = int(FmpData.MonotonicCount)
+                            FmpData.MonotonicCount = int(
+                                FmpData.MonotonicCount)
             if not self._GetNextToken():
                 break
         else:
             self._UndoToken()
 
         if (FmpData.MonotonicCount and not FmpData.Certificate_Guid) or (not FmpData.MonotonicCount and FmpData.Certificate_Guid):
-            EdkLogger.error("FdfParser", FORMAT_INVALID, "CERTIFICATE_GUID and MONOTONIC_COUNT must be work as a pair.")
+            EdkLogger.error("FdfParser", FORMAT_INVALID,
+                            "CERTIFICATE_GUID and MONOTONIC_COUNT must be work as a pair.")
 
         # Only the IMAGE_TYPE_ID is required item
         if FmpKeyList and 'IMAGE_TYPE_ID' in FmpKeyList:
-            raise Warning("'IMAGE_TYPE_ID' in FMP payload section.", self.FileName, self.CurrentLineNumber)
+            raise Warning("'IMAGE_TYPE_ID' in FMP payload section.",
+                          self.FileName, self.CurrentLineNumber)
         # get the Image file and Vendor code file
         self._GetFMPCapsuleData(FmpData)
         if not FmpData.ImageFile:
-            raise Warning("Missing image file in FMP payload section.", self.FileName, self.CurrentLineNumber)
+            raise Warning("Missing image file in FMP payload section.",
+                          self.FileName, self.CurrentLineNumber)
         # check whether more than one Vendor code file
         if len(FmpData.VendorCodeFile) > 1:
-            raise Warning("Vendor code file max of 1 per FMP payload section.", self.FileName, self.CurrentLineNumber)
+            raise Warning("Vendor code file max of 1 per FMP payload section.",
+                          self.FileName, self.CurrentLineNumber)
         self.Profile.FmpPayloadDict[FmpUiName] = FmpData
         return True
 
-    ## _GetCapsule() method
+    # _GetCapsule() method
     #
     #   Get capsule section contents and store its data into capsule list of self.Profile
     #
@@ -3182,28 +3474,34 @@ class FdfParser:
 
         self._UndoToken()
         if not self._IsToken("[CAPSULE.", True):
-            FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
-            #print 'Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
+            FileLineTuple = GetRealFileLine(
+                self.FileName, self.CurrentLineNumber)
+            # print 'Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
             #        % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine:], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine)
-            raise Warning.Expected("[Capsule.]", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "[Capsule.]", self.FileName, self.CurrentLineNumber)
 
         CapsuleObj = Capsule()
 
         CapsuleName = self._GetUiName()
         if not CapsuleName:
-            raise Warning.Expected("capsule name", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "capsule name", self.FileName, self.CurrentLineNumber)
 
         CapsuleObj.UiCapsuleName = CapsuleName.upper()
 
         if not self._IsToken(TAB_SECTION_END):
-            raise Warning.ExpectedBracketClose(self.FileName, self.CurrentLineNumber)
+            raise Warning.ExpectedBracketClose(
+                self.FileName, self.CurrentLineNumber)
 
         if self._IsKeyword("CREATE_FILE"):
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._GetNextToken():
-                raise Warning.Expected("file name", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "file name", self.FileName, self.CurrentLineNumber)
 
             CapsuleObj.CreateFile = self._Token
 
@@ -3211,7 +3509,7 @@ class FdfParser:
         self.Profile.CapsuleDict[CapsuleObj.UiCapsuleName] = CapsuleObj
         return True
 
-    ## _GetCapsuleStatements() method
+    # _GetCapsuleStatements() method
     #
     #   Get statements for capsule
     #
@@ -3224,7 +3522,7 @@ class FdfParser:
         self._GetSetStatements(Obj)
         self._GetCapsuleData(Obj)
 
-    ## _GetCapsuleTokens() method
+    # _GetCapsuleTokens() method
     #
     #   Get token statements for capsule
     #
@@ -3237,30 +3535,38 @@ class FdfParser:
         while self._Token in {"CAPSULE_GUID", "CAPSULE_HEADER_SIZE", "CAPSULE_FLAGS", "OEM_CAPSULE_FLAGS", "CAPSULE_HEADER_INIT_VERSION"}:
             Name = self._Token.strip()
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
             if not self._GetNextToken():
-                raise Warning.Expected("value", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "value", self.FileName, self.CurrentLineNumber)
             if Name == 'CAPSULE_FLAGS':
                 if not self._Token in {"PersistAcrossReset", "PopulateSystemTable", "InitiateReset"}:
-                    raise Warning.Expected("PersistAcrossReset, PopulateSystemTable, or InitiateReset", self.FileName, self.CurrentLineNumber)
+                    raise Warning.Expected(
+                        "PersistAcrossReset, PopulateSystemTable, or InitiateReset", self.FileName, self.CurrentLineNumber)
                 Value = self._Token.strip()
                 while self._IsToken(TAB_COMMA_SPLIT):
                     Value += TAB_COMMA_SPLIT
                     if not self._GetNextToken():
-                        raise Warning.Expected("value", self.FileName, self.CurrentLineNumber)
+                        raise Warning.Expected(
+                            "value", self.FileName, self.CurrentLineNumber)
                     if not self._Token in {"PersistAcrossReset", "PopulateSystemTable", "InitiateReset"}:
-                        raise Warning.Expected("PersistAcrossReset, PopulateSystemTable, or InitiateReset", self.FileName, self.CurrentLineNumber)
+                        raise Warning.Expected(
+                            "PersistAcrossReset, PopulateSystemTable, or InitiateReset", self.FileName, self.CurrentLineNumber)
                     Value += self._Token.strip()
             elif Name == 'OEM_CAPSULE_FLAGS':
                 Value = self._Token.strip()
                 if not Value.upper().startswith('0X'):
-                    raise Warning.Expected("hex value starting with 0x", self.FileName, self.CurrentLineNumber)
+                    raise Warning.Expected(
+                        "hex value starting with 0x", self.FileName, self.CurrentLineNumber)
                 try:
                     Value = int(Value, 0)
                 except ValueError:
-                    raise Warning.Expected("hex string failed to convert to value", self.FileName, self.CurrentLineNumber)
+                    raise Warning.Expected(
+                        "hex string failed to convert to value", self.FileName, self.CurrentLineNumber)
                 if not 0x0000 <= Value <= 0xFFFF:
-                    raise Warning.Expected("hex value between 0x0000 and 0xFFFF", self.FileName, self.CurrentLineNumber)
+                    raise Warning.Expected(
+                        "hex value between 0x0000 and 0xFFFF", self.FileName, self.CurrentLineNumber)
                 Value = self._Token.strip()
             else:
                 Value = self._Token.strip()
@@ -3269,7 +3575,7 @@ class FdfParser:
                 return False
         self._UndoToken()
 
-    ## _GetCapsuleData() method
+    # _GetCapsuleData() method
     #
     #   Get capsule data for capsule
     #
@@ -3288,7 +3594,7 @@ class FdfParser:
             if not (IsInf or IsFile or IsFv or IsFd or IsAnyFile or IsAfile or IsFmp):
                 break
 
-    ## _GetFMPCapsuleData() method
+    # _GetFMPCapsuleData() method
     #
     #   Get capsule data for FMP capsule
     #
@@ -3303,7 +3609,7 @@ class FdfParser:
             if not (IsFv or IsFd or IsAnyFile):
                 break
 
-    ## _GetFvStatement() method
+    # _GetFvStatement() method
     #
     #   Get FV for capsule
     #
@@ -3312,7 +3618,7 @@ class FdfParser:
     #   @retval True        Successfully find a FV statement
     #   @retval False       Not able to find a FV statement
     #
-    def _GetFvStatement(self, CapsuleObj, FMPCapsule = False):
+    def _GetFvStatement(self, CapsuleObj, FMPCapsule=False):
         if not self._IsKeyword(BINARY_FILE_TYPE_FV):
             return False
 
@@ -3320,10 +3626,12 @@ class FdfParser:
             raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
 
         if not self._GetNextToken():
-            raise Warning.Expected("FV name", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "FV name", self.FileName, self.CurrentLineNumber)
 
         if self._Token.upper() not in self.Profile.FvDict:
-            raise Warning("FV name does not exist", self.FileName, self.CurrentLineNumber)
+            raise Warning("FV name does not exist",
+                          self.FileName, self.CurrentLineNumber)
 
         myCapsuleFv = CapsuleFv()
         myCapsuleFv.FvName = self._Token
@@ -3336,7 +3644,7 @@ class FdfParser:
             CapsuleObj.CapsuleDataList.append(myCapsuleFv)
         return True
 
-    ## _GetFdStatement() method
+    # _GetFdStatement() method
     #
     #   Get FD for capsule
     #
@@ -3345,7 +3653,7 @@ class FdfParser:
     #   @retval True        Successfully find a FD statement
     #   @retval False       Not able to find a FD statement
     #
-    def _GetFdStatement(self, CapsuleObj, FMPCapsule = False):
+    def _GetFdStatement(self, CapsuleObj, FMPCapsule=False):
         if not self._IsKeyword("FD"):
             return False
 
@@ -3353,10 +3661,12 @@ class FdfParser:
             raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
 
         if not self._GetNextToken():
-            raise Warning.Expected("FD name", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "FD name", self.FileName, self.CurrentLineNumber)
 
         if self._Token.upper() not in self.Profile.FdDict:
-            raise Warning("FD name does not exist", self.FileName, self.CurrentLineNumber)
+            raise Warning("FD name does not exist",
+                          self.FileName, self.CurrentLineNumber)
 
         myCapsuleFd = CapsuleFd()
         myCapsuleFd.FdName = self._Token
@@ -3382,10 +3692,12 @@ class FdfParser:
             raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
 
         if not self._GetNextToken():
-            raise Warning.Expected("payload name after FMP_PAYLOAD =", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "payload name after FMP_PAYLOAD =", self.FileName, self.CurrentLineNumber)
         Payload = self._Token.upper()
         if Payload not in self.Profile.FmpPayloadDict:
-            raise Warning("This FMP Payload does not exist: %s" % self._Token, self.FileName, self.CurrentLineNumber)
+            raise Warning("This FMP Payload does not exist: %s" %
+                          self._Token, self.FileName, self.CurrentLineNumber)
         CapsuleObj.FmpPayloadList.append(self.Profile.FmpPayloadDict[Payload])
         return True
 
@@ -3401,17 +3713,19 @@ class FdfParser:
             raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
 
         if not self._GetNextToken():
-            raise Warning.Expected("File name", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "File name", self.FileName, self.CurrentLineNumber)
 
         AnyFileName = self._Token
         self._VerifyFile(AnyFileName)
 
         if not os.path.isabs(AnyFileName):
-            AnyFileName = mws.join(GenFdsGlobalVariable.WorkSpaceDir, AnyFileName)
+            AnyFileName = mws.join(
+                GenFdsGlobalVariable.WorkSpaceDir, AnyFileName)
 
         return AnyFileName
 
-    ## _GetAnyFileStatement() method
+    # _GetAnyFileStatement() method
     #
     #   Get AnyFile for capsule
     #
@@ -3420,7 +3734,7 @@ class FdfParser:
     #   @retval True        Successfully find a Anyfile statement
     #   @retval False       Not able to find a AnyFile statement
     #
-    def _GetAnyFileStatement(self, CapsuleObj, FMPCapsule = False):
+    def _GetAnyFileStatement(self, CapsuleObj, FMPCapsule=False):
         AnyFileName = self._ParseRawFileStatement()
         if not AnyFileName:
             return False
@@ -3436,7 +3750,7 @@ class FdfParser:
             CapsuleObj.CapsuleDataList.append(myCapsuleAnyFile)
         return True
 
-    ## _GetAfileStatement() method
+    # _GetAfileStatement() method
     #
     #   Get Afile for capsule
     #
@@ -3453,13 +3767,14 @@ class FdfParser:
             raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
 
         if not self._GetNextToken():
-            raise Warning.Expected("Afile name", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "Afile name", self.FileName, self.CurrentLineNumber)
 
         AfileName = self._Token
         AfileBaseName = os.path.basename(AfileName)
 
-        if os.path.splitext(AfileBaseName)[1]  not in {".bin", ".BIN", ".Bin", ".dat", ".DAT", ".Dat", ".data", ".DATA", ".Data"}:
-            raise Warning('invalid binary file type, should be one of "bin",BINARY_FILE_TYPE_BIN,"Bin","dat","DAT","Dat","data","DATA","Data"', \
+        if os.path.splitext(AfileBaseName)[1] not in {".bin", ".BIN", ".Bin", ".dat", ".DAT", ".Dat", ".data", ".DATA", ".Data"}:
+            raise Warning('invalid binary file type, should be one of "bin",BINARY_FILE_TYPE_BIN,"Bin","dat","DAT","Dat","data","DATA","Data"',
                           self.FileName, self.CurrentLineNumber)
 
         if not os.path.isabs(AfileName):
@@ -3467,7 +3782,8 @@ class FdfParser:
             self._VerifyFile(AfileName)
         else:
             if not os.path.exists(AfileName):
-                raise Warning('%s does not exist' % AfileName, self.FileName, self.CurrentLineNumber)
+                raise Warning('%s does not exist' % AfileName,
+                              self.FileName, self.CurrentLineNumber)
             else:
                 pass
 
@@ -3476,7 +3792,7 @@ class FdfParser:
         CapsuleObj.CapsuleDataList.append(myCapsuleAfile)
         return True
 
-    ## _GetRule() method
+    # _GetRule() method
     #
     #   Get Rule section contents and store its data into rule list of self.Profile
     #
@@ -3495,13 +3811,16 @@ class FdfParser:
             return False
         self._UndoToken()
         if not self._IsToken("[Rule.", True):
-            FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
-            #print 'Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
+            FileLineTuple = GetRealFileLine(
+                self.FileName, self.CurrentLineNumber)
+            # print 'Parsing String: %s in File %s, At line: %d, Offset Within Line: %d' \
             #        % (self.Profile.FileLinesList[self.CurrentLineNumber - 1][self.CurrentOffsetWithinLine:], FileLineTuple[0], FileLineTuple[1], self.CurrentOffsetWithinLine)
-            raise Warning.Expected("[Rule.]", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "[Rule.]", self.FileName, self.CurrentLineNumber)
 
         if not self._SkipToToken(TAB_SPLIT):
-            raise Warning.Expected("'.'", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected("'.'", self.FileName,
+                                   self.CurrentLineNumber)
 
         Arch = self._SkippedChars.rstrip(TAB_SPLIT)
 
@@ -3510,33 +3829,35 @@ class FdfParser:
         TemplateName = ""
         if self._IsToken(TAB_SPLIT):
             if not self._GetNextWord():
-                raise Warning.Expected("template name", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "template name", self.FileName, self.CurrentLineNumber)
             TemplateName = self._Token
 
         if not self._IsToken(TAB_SECTION_END):
-            raise Warning.ExpectedBracketClose(self.FileName, self.CurrentLineNumber)
+            raise Warning.ExpectedBracketClose(
+                self.FileName, self.CurrentLineNumber)
 
         RuleObj = self._GetRuleFileStatements()
         RuleObj.Arch = Arch.upper()
         RuleObj.ModuleType = ModuleType
         RuleObj.TemplateName = TemplateName
         if TemplateName == '':
-            self.Profile.RuleDict['RULE'             + \
-                              TAB_SPLIT              + \
-                              Arch.upper()           + \
-                              TAB_SPLIT              + \
-                              ModuleType.upper()     ] = RuleObj
+            self.Profile.RuleDict['RULE' +
+                                  TAB_SPLIT +
+                                  Arch.upper() +
+                                  TAB_SPLIT +
+                                  ModuleType.upper()] = RuleObj
         else:
-            self.Profile.RuleDict['RULE'             + \
-                              TAB_SPLIT              + \
-                              Arch.upper()           + \
-                              TAB_SPLIT              + \
-                              ModuleType.upper()     + \
-                              TAB_SPLIT              + \
-                              TemplateName.upper() ] = RuleObj
+            self.Profile.RuleDict['RULE' +
+                                  TAB_SPLIT +
+                                  Arch.upper() +
+                                  TAB_SPLIT +
+                                  ModuleType.upper() +
+                                  TAB_SPLIT +
+                                  TemplateName.upper()] = RuleObj
         return True
 
-    ## _GetModuleType() method
+    # _GetModuleType() method
     #
     #   Return the module type
     #
@@ -3545,7 +3866,8 @@ class FdfParser:
     #
     def _GetModuleType(self):
         if not self._GetNextWord():
-            raise Warning.Expected("Module type", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "Module type", self.FileName, self.CurrentLineNumber)
         if self._Token.upper() not in {
                 SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM,
                 SUP_MODULE_DXE_CORE, SUP_MODULE_DXE_DRIVER,
@@ -3562,10 +3884,11 @@ class FdfParser:
                 EDK_COMPONENT_TYPE_APPLICATION, "ACPITABLE",
                 SUP_MODULE_SMM_CORE, SUP_MODULE_MM_STANDALONE,
                 SUP_MODULE_MM_CORE_STANDALONE}:
-            raise Warning("Unknown Module type '%s'" % self._Token, self.FileName, self.CurrentLineNumber)
+            raise Warning("Unknown Module type '%s'" %
+                          self._Token, self.FileName, self.CurrentLineNumber)
         return self._Token
 
-    ## _GetFileExtension() method
+    # _GetFileExtension() method
     #
     #   Return the file extension
     #
@@ -3574,7 +3897,8 @@ class FdfParser:
     #
     def _GetFileExtension(self):
         if not self._IsToken(TAB_SPLIT):
-            raise Warning.Expected("'.'", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected("'.'", self.FileName,
+                                   self.CurrentLineNumber)
 
         Ext = ""
         if self._GetNextToken():
@@ -3582,12 +3906,14 @@ class FdfParser:
                 Ext = self._Token
                 return TAB_SPLIT + Ext
             else:
-                raise Warning("Unknown file extension '%s'" % self._Token, self.FileName, self.CurrentLineNumber)
+                raise Warning("Unknown file extension '%s'" %
+                              self._Token, self.FileName, self.CurrentLineNumber)
 
         else:
-            raise Warning.Expected("file extension", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "file extension", self.FileName, self.CurrentLineNumber)
 
-    ## _GetRuleFileStatement() method
+    # _GetRuleFileStatement() method
     #
     #   Get rule contents
     #
@@ -3596,35 +3922,42 @@ class FdfParser:
     #
     def _GetRuleFileStatements(self):
         if not self._IsKeyword("FILE"):
-            raise Warning.Expected("FILE", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected("FILE", self.FileName,
+                                   self.CurrentLineNumber)
 
         if not self._GetNextWord():
-            raise Warning.Expected("FFS type", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "FFS type", self.FileName, self.CurrentLineNumber)
 
         Type = self._Token.strip().upper()
         if Type not in {"RAW", "FREEFORM", SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM,
                         "PEI_DXE_COMBO", "DRIVER", SUP_MODULE_DXE_CORE, EDK_COMPONENT_TYPE_APPLICATION,
                         "FV_IMAGE", "SMM", SUP_MODULE_SMM_CORE, SUP_MODULE_MM_STANDALONE,
                         SUP_MODULE_MM_CORE_STANDALONE}:
-            raise Warning("Unknown FV type '%s'" % self._Token, self.FileName, self.CurrentLineNumber)
+            raise Warning("Unknown FV type '%s'" % self._Token,
+                          self.FileName, self.CurrentLineNumber)
 
         if not self._IsToken(TAB_EQUAL_SPLIT):
             raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
 
         if not self._IsKeyword("$(NAMED_GUID)"):
             if not self._GetNextWord():
-                NamedGuid = self._CurrentLine()[self.CurrentOffsetWithinLine:].split()[0].strip()
+                NamedGuid = self._CurrentLine()[self.CurrentOffsetWithinLine:].split()[
+                    0].strip()
                 if GlobalData.gGuidPatternEnd.match(NamedGuid):
                     self.CurrentOffsetWithinLine += len(NamedGuid)
                     self._Token = NamedGuid
                 else:
-                    raise Warning.Expected("$(NAMED_GUID)", self.FileName, self.CurrentLineNumber)
+                    raise Warning.Expected(
+                        "$(NAMED_GUID)", self.FileName, self.CurrentLineNumber)
             if self._Token == 'PCD':
                 if not self._IsToken("("):
-                    raise Warning.Expected("'('", self.FileName, self.CurrentLineNumber)
+                    raise Warning.Expected(
+                        "'('", self.FileName, self.CurrentLineNumber)
                 PcdPair = self._GetNextPcdSettings()
                 if not self._IsToken(")"):
-                    raise Warning.Expected("')'", self.FileName, self.CurrentLineNumber)
+                    raise Warning.Expected(
+                        "')'", self.FileName, self.CurrentLineNumber)
                 self._Token = 'PCD('+PcdPair[1]+TAB_SPLIT+PcdPair[0]+')'
 
         NameGuid = self._Token
@@ -3637,7 +3970,8 @@ class FdfParser:
                 else:
                     KeepReloc = True
             else:
-                raise Warning("File type %s could not have reloc strip flag%d" % (Type, self.CurrentLineNumber), self.FileName, self.CurrentLineNumber)
+                raise Warning("File type %s could not have reloc strip flag%d" % (
+                    Type, self.CurrentLineNumber), self.FileName, self.CurrentLineNumber)
 
         KeyStringList = []
         if self._GetNextToken():
@@ -3646,7 +3980,8 @@ class FdfParser:
                 if self._IsToken(TAB_COMMA_SPLIT):
                     while self._GetNextToken():
                         if not TokenFindPattern.match(self._Token):
-                            raise Warning.Expected("KeyString \"Target_Tag_Arch\"", self.FileName, self.CurrentLineNumber)
+                            raise Warning.Expected(
+                                "KeyString \"Target_Tag_Arch\"", self.FileName, self.CurrentLineNumber)
                         KeyStringList.append(self._Token)
 
                         if not self._IsToken(TAB_COMMA_SPLIT):
@@ -3655,7 +3990,6 @@ class FdfParser:
             else:
                 self._UndoToken()
 
-
         Fixed = False
         if self._IsKeyword("Fixed", True):
             Fixed = True
@@ -3667,8 +4001,9 @@ class FdfParser:
         AlignValue = ""
         if self._GetAlignment():
             if self._Token not in ALIGNMENTS:
-                raise Warning("Incorrect alignment '%s'" % self._Token, self.FileName, self.CurrentLineNumber)
-            #For FFS, Auto is default option same to ""
+                raise Warning("Incorrect alignment '%s'" %
+                              self._Token, self.FileName, self.CurrentLineNumber)
+            # For FFS, Auto is default option same to ""
             if not self._Token == "Auto":
                 AlignValue = self._Token
 
@@ -3691,25 +4026,27 @@ class FdfParser:
                     break
 
             if not self._IsToken(T_CHAR_BRACE_R):
-                raise Warning.ExpectedCurlyClose(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedCurlyClose(
+                    self.FileName, self.CurrentLineNumber)
 
             return NewRule
 
         else:
             # Simple file rule expected
             if not self._GetNextWord():
-                raise Warning.Expected("leaf section type", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "leaf section type", self.FileName, self.CurrentLineNumber)
 
             SectionName = self._Token
 
             if SectionName not in {
                     "COMPAT16", BINARY_FILE_TYPE_PE32,
                     BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE",
-                    "RAW",BINARY_FILE_TYPE_DXE_DEPEX, BINARY_FILE_TYPE_UI,
+                    "RAW", BINARY_FILE_TYPE_DXE_DEPEX, BINARY_FILE_TYPE_UI,
                     BINARY_FILE_TYPE_PEI_DEPEX, "VERSION", "SUBTYPE_GUID",
                     BINARY_FILE_TYPE_SMM_DEPEX}:
-                raise Warning("Unknown leaf section name '%s'" % SectionName, self.FileName, self.CurrentLineNumber)
-
+                raise Warning("Unknown leaf section name '%s'" %
+                              SectionName, self.FileName, self.CurrentLineNumber)
 
             if self._IsKeyword("Fixed", True):
                 Fixed = True
@@ -3720,16 +4057,19 @@ class FdfParser:
             SectAlignment = ""
             if self._GetAlignment():
                 if self._Token not in ALIGNMENTS:
-                    raise Warning("Incorrect alignment '%s'" % self._Token, self.FileName, self.CurrentLineNumber)
+                    raise Warning("Incorrect alignment '%s'" %
+                                  self._Token, self.FileName, self.CurrentLineNumber)
                 if self._Token == 'Auto' and (not SectionName == BINARY_FILE_TYPE_PE32) and (not SectionName == BINARY_FILE_TYPE_TE):
-                    raise Warning("Auto alignment can only be used in PE32 or TE section ", self.FileName, self.CurrentLineNumber)
+                    raise Warning("Auto alignment can only be used in PE32 or TE section ",
+                                  self.FileName, self.CurrentLineNumber)
                 SectAlignment = self._Token
 
             Ext = None
             if self._IsToken(TAB_VALUE_SPLIT):
                 Ext = self._GetFileExtension()
             elif not self._GetNextToken():
-                raise Warning.Expected("File name", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "File name", self.FileName, self.CurrentLineNumber)
 
             NewRule = RuleSimpleFile()
             NewRule.SectionType = SectionName
@@ -3746,7 +4086,7 @@ class FdfParser:
             NewRule.FileName = self._Token
             return NewRule
 
-    ## _GetEfiSection() method
+    # _GetEfiSection() method
     #
     #   Get section list for Rule
     #
@@ -3759,7 +4099,8 @@ class FdfParser:
         OldPos = self.GetFileBufferPos()
         EfiSectionObj = EfiSection()
         if not self._GetNextWord():
-            CurrentLine = self._CurrentLine()[self.CurrentOffsetWithinLine:].split()[0].strip()
+            CurrentLine = self._CurrentLine()[self.CurrentOffsetWithinLine:].split()[
+                0].strip()
             if self._Token == '{' and Obj.FvFileType == "RAW" and TAB_SPLIT in CurrentLine:
                 if self._IsToken(TAB_VALUE_SPLIT):
                     EfiSectionObj.FileExtension = self._GetFileExtension()
@@ -3773,11 +4114,11 @@ class FdfParser:
         SectionName = self._Token
 
         if SectionName not in {
-                    "COMPAT16", BINARY_FILE_TYPE_PE32,
-                    BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE",
-                    "RAW",BINARY_FILE_TYPE_DXE_DEPEX, BINARY_FILE_TYPE_UI,
-                    BINARY_FILE_TYPE_PEI_DEPEX, "VERSION", "SUBTYPE_GUID",
-                    BINARY_FILE_TYPE_SMM_DEPEX, BINARY_FILE_TYPE_GUID}:
+            "COMPAT16", BINARY_FILE_TYPE_PE32,
+            BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_TE, "FV_IMAGE",
+            "RAW", BINARY_FILE_TYPE_DXE_DEPEX, BINARY_FILE_TYPE_UI,
+            BINARY_FILE_TYPE_PEI_DEPEX, "VERSION", "SUBTYPE_GUID",
+                BINARY_FILE_TYPE_SMM_DEPEX, BINARY_FILE_TYPE_GUID}:
             self._UndoToken()
             return False
 
@@ -3802,18 +4143,21 @@ class FdfParser:
                         break
 
                 if not self._IsToken(T_CHAR_BRACE_R):
-                    raise Warning.ExpectedCurlyClose(self.FileName, self.CurrentLineNumber)
+                    raise Warning.ExpectedCurlyClose(
+                        self.FileName, self.CurrentLineNumber)
                 FvImageSectionObj.Fv = FvObj
                 FvImageSectionObj.FvName = None
 
             else:
                 if not self._IsKeyword(BINARY_FILE_TYPE_FV):
-                    raise Warning.Expected("'FV'", self.FileName, self.CurrentLineNumber)
+                    raise Warning.Expected(
+                        "'FV'", self.FileName, self.CurrentLineNumber)
                 FvImageSectionObj.FvFileType = self._Token
 
                 if self._GetAlignment():
                     if self._Token not in ALIGNMENT_NOAUTO:
-                        raise Warning("Incorrect alignment '%s'" % self._Token, self.FileName, self.CurrentLineNumber)
+                        raise Warning("Incorrect alignment '%s'" %
+                                      self._Token, self.FileName, self.CurrentLineNumber)
                     FvImageSectionObj.Alignment = self._Token
 
                 if self._IsToken(TAB_VALUE_SPLIT):
@@ -3830,7 +4174,8 @@ class FdfParser:
                     else:
                         self._UndoToken()
                 else:
-                    raise Warning.Expected("FV file name", self.FileName, self.CurrentLineNumber)
+                    raise Warning.Expected(
+                        "FV file name", self.FileName, self.CurrentLineNumber)
 
             Obj.SectionList.append(FvImageSectionObj)
             return True
@@ -3838,55 +4183,69 @@ class FdfParser:
         EfiSectionObj.SectionType = SectionName
 
         if not self._GetNextToken():
-            raise Warning.Expected("file type", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "file type", self.FileName, self.CurrentLineNumber)
 
         if self._Token == "STRING":
             if not self._RuleSectionCouldHaveString(EfiSectionObj.SectionType):
-                raise Warning("%s section could NOT have string data%d" % (EfiSectionObj.SectionType, self.CurrentLineNumber), self.FileName, self.CurrentLineNumber)
+                raise Warning("%s section could NOT have string data%d" % (
+                    EfiSectionObj.SectionType, self.CurrentLineNumber), self.FileName, self.CurrentLineNumber)
 
             if not self._IsToken(TAB_EQUAL_SPLIT):
-                raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedEquals(
+                    self.FileName, self.CurrentLineNumber)
 
             if not self._GetNextToken():
-                raise Warning.Expected("Quoted String", self.FileName, self.CurrentLineNumber)
+                raise Warning.Expected(
+                    "Quoted String", self.FileName, self.CurrentLineNumber)
 
             if self._GetStringData():
                 EfiSectionObj.StringData = self._Token
 
             if self._IsKeyword("BUILD_NUM"):
                 if not self._RuleSectionCouldHaveBuildNum(EfiSectionObj.SectionType):
-                    raise Warning("%s section could NOT have BUILD_NUM%d" % (EfiSectionObj.SectionType, self.CurrentLineNumber), self.FileName, self.CurrentLineNumber)
+                    raise Warning("%s section could NOT have BUILD_NUM%d" % (
+                        EfiSectionObj.SectionType, self.CurrentLineNumber), self.FileName, self.CurrentLineNumber)
 
                 if not self._IsToken(TAB_EQUAL_SPLIT):
-                    raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                    raise Warning.ExpectedEquals(
+                        self.FileName, self.CurrentLineNumber)
                 if not self._GetNextToken():
-                    raise Warning.Expected("Build number", self.FileName, self.CurrentLineNumber)
+                    raise Warning.Expected(
+                        "Build number", self.FileName, self.CurrentLineNumber)
                 EfiSectionObj.BuildNum = self._Token
 
         else:
             EfiSectionObj.FileType = self._Token
-            self._CheckRuleSectionFileType(EfiSectionObj.SectionType, EfiSectionObj.FileType)
+            self._CheckRuleSectionFileType(
+                EfiSectionObj.SectionType, EfiSectionObj.FileType)
 
         if self._IsKeyword("Optional"):
             if not self._RuleSectionCouldBeOptional(EfiSectionObj.SectionType):
-                raise Warning("%s section could NOT be optional%d" % (EfiSectionObj.SectionType, self.CurrentLineNumber), self.FileName, self.CurrentLineNumber)
+                raise Warning("%s section could NOT be optional%d" % (
+                    EfiSectionObj.SectionType, self.CurrentLineNumber), self.FileName, self.CurrentLineNumber)
             EfiSectionObj.Optional = True
 
             if self._IsKeyword("BUILD_NUM"):
                 if not self._RuleSectionCouldHaveBuildNum(EfiSectionObj.SectionType):
-                    raise Warning("%s section could NOT have BUILD_NUM%d" % (EfiSectionObj.SectionType, self.CurrentLineNumber), self.FileName, self.CurrentLineNumber)
+                    raise Warning("%s section could NOT have BUILD_NUM%d" % (
+                        EfiSectionObj.SectionType, self.CurrentLineNumber), self.FileName, self.CurrentLineNumber)
 
                 if not self._IsToken(TAB_EQUAL_SPLIT):
-                    raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                    raise Warning.ExpectedEquals(
+                        self.FileName, self.CurrentLineNumber)
                 if not self._GetNextToken():
-                    raise Warning.Expected("Build number", self.FileName, self.CurrentLineNumber)
+                    raise Warning.Expected(
+                        "Build number", self.FileName, self.CurrentLineNumber)
                 EfiSectionObj.BuildNum = self._Token
 
         if self._GetAlignment():
             if self._Token not in ALIGNMENTS:
-                raise Warning("Incorrect alignment '%s'" % self._Token, self.FileName, self.CurrentLineNumber)
+                raise Warning("Incorrect alignment '%s'" %
+                              self._Token, self.FileName, self.CurrentLineNumber)
             if self._Token == 'Auto' and (not SectionName == BINARY_FILE_TYPE_PE32) and (not SectionName == BINARY_FILE_TYPE_TE):
-                raise Warning("Auto alignment can only be used in PE32 or TE section ", self.FileName, self.CurrentLineNumber)
+                raise Warning("Auto alignment can only be used in PE32 or TE section ",
+                              self.FileName, self.CurrentLineNumber)
             EfiSectionObj.Alignment = self._Token
 
         if self._IsKeyword('RELOCS_STRIPPED') or self._IsKeyword('RELOCS_RETAINED'):
@@ -3896,10 +4255,11 @@ class FdfParser:
                 else:
                     EfiSectionObj.KeepReloc = True
                 if Obj.KeepReloc is not None and Obj.KeepReloc != EfiSectionObj.KeepReloc:
-                    raise Warning("Section type %s has reloc strip flag conflict with Rule" % EfiSectionObj.SectionType, self.FileName, self.CurrentLineNumber)
+                    raise Warning("Section type %s has reloc strip flag conflict with Rule" %
+                                  EfiSectionObj.SectionType, self.FileName, self.CurrentLineNumber)
             else:
-                raise Warning("Section type %s could not have reloc strip flag" % EfiSectionObj.SectionType, self.FileName, self.CurrentLineNumber)
-
+                raise Warning("Section type %s could not have reloc strip flag" %
+                              EfiSectionObj.SectionType, self.FileName, self.CurrentLineNumber)
 
         if self._IsToken(TAB_VALUE_SPLIT):
             EfiSectionObj.FileExtension = self._GetFileExtension()
@@ -3918,23 +4278,27 @@ class FdfParser:
 
                     if self._Token == 'PCD':
                         if not self._IsToken("("):
-                            raise Warning.Expected("'('", self.FileName, self.CurrentLineNumber)
+                            raise Warning.Expected(
+                                "'('", self.FileName, self.CurrentLineNumber)
                         PcdPair = self._GetNextPcdSettings()
                         if not self._IsToken(")"):
-                            raise Warning.Expected("')'", self.FileName, self.CurrentLineNumber)
-                        self._Token = 'PCD('+PcdPair[1]+TAB_SPLIT+PcdPair[0]+')'
+                            raise Warning.Expected(
+                                "')'", self.FileName, self.CurrentLineNumber)
+                        self._Token = 'PCD(' + \
+                            PcdPair[1]+TAB_SPLIT+PcdPair[0]+')'
 
                 EfiSectionObj.FileName = self._Token
 
             else:
                 self._UndoToken()
         else:
-            raise Warning.Expected("section file name", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected("section file name",
+                                   self.FileName, self.CurrentLineNumber)
 
         Obj.SectionList.append(EfiSectionObj)
         return True
 
-    ## _RuleSectionCouldBeOptional() method
+    # _RuleSectionCouldBeOptional() method
     #
     #   Get whether a section could be optional
     #
@@ -3949,7 +4313,7 @@ class FdfParser:
         else:
             return False
 
-    ## _RuleSectionCouldHaveBuildNum() method
+    # _RuleSectionCouldHaveBuildNum() method
     #
     #   Get whether a section could have build number information
     #
@@ -3964,7 +4328,7 @@ class FdfParser:
         else:
             return False
 
-    ## _RuleSectionCouldHaveString() method
+    # _RuleSectionCouldHaveString() method
     #
     #   Get whether a section could have string
     #
@@ -3979,7 +4343,7 @@ class FdfParser:
         else:
             return False
 
-    ## _CheckRuleSectionFileType() method
+    # _CheckRuleSectionFileType() method
     #
     #   Get whether a section matches a file type
     #
@@ -3991,36 +4355,46 @@ class FdfParser:
         WarningString = "Incorrect section file type '%s'"
         if SectionType == "COMPAT16":
             if FileType not in {"COMPAT16", "SEC_COMPAT16"}:
-                raise Warning(WarningString % FileType, self.FileName, self.CurrentLineNumber)
+                raise Warning(WarningString %
+                              FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == BINARY_FILE_TYPE_PE32:
             if FileType not in {BINARY_FILE_TYPE_PE32, "SEC_PE32"}:
-                raise Warning(WarningString % FileType, self.FileName, self.CurrentLineNumber)
+                raise Warning(WarningString %
+                              FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == BINARY_FILE_TYPE_PIC:
             if FileType not in {BINARY_FILE_TYPE_PIC, BINARY_FILE_TYPE_PIC}:
-                raise Warning(WarningString % FileType, self.FileName, self.CurrentLineNumber)
+                raise Warning(WarningString %
+                              FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == BINARY_FILE_TYPE_TE:
             if FileType not in {BINARY_FILE_TYPE_TE, "SEC_TE"}:
-                raise Warning(WarningString % FileType, self.FileName, self.CurrentLineNumber)
+                raise Warning(WarningString %
+                              FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == "RAW":
             if FileType not in {BINARY_FILE_TYPE_BIN, "SEC_BIN", "RAW", "ASL", "ACPI"}:
-                raise Warning(WarningString % FileType, self.FileName, self.CurrentLineNumber)
+                raise Warning(WarningString %
+                              FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == BINARY_FILE_TYPE_DXE_DEPEX or SectionType == BINARY_FILE_TYPE_SMM_DEPEX:
             if FileType not in {BINARY_FILE_TYPE_DXE_DEPEX, "SEC_DXE_DEPEX", BINARY_FILE_TYPE_SMM_DEPEX}:
-                raise Warning(WarningString % FileType, self.FileName, self.CurrentLineNumber)
+                raise Warning(WarningString %
+                              FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == BINARY_FILE_TYPE_UI:
             if FileType not in {BINARY_FILE_TYPE_UI, "SEC_UI"}:
-                raise Warning(WarningString % FileType, self.FileName, self.CurrentLineNumber)
+                raise Warning(WarningString %
+                              FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == "VERSION":
             if FileType not in {"VERSION", "SEC_VERSION"}:
-                raise Warning(WarningString % FileType, self.FileName, self.CurrentLineNumber)
+                raise Warning(WarningString %
+                              FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == BINARY_FILE_TYPE_PEI_DEPEX:
             if FileType not in {BINARY_FILE_TYPE_PEI_DEPEX, "SEC_PEI_DEPEX"}:
-                raise Warning(WarningString % FileType, self.FileName, self.CurrentLineNumber)
+                raise Warning(WarningString %
+                              FileType, self.FileName, self.CurrentLineNumber)
         elif SectionType == BINARY_FILE_TYPE_GUID:
             if FileType not in {BINARY_FILE_TYPE_PE32, "SEC_GUID"}:
-                raise Warning(WarningString % FileType, self.FileName, self.CurrentLineNumber)
+                raise Warning(WarningString %
+                              FileType, self.FileName, self.CurrentLineNumber)
 
-    ## _GetRuleEncapsulationSection() method
+    # _GetRuleEncapsulationSection() method
     #
     #   Get encapsulation section for Rule
     #
@@ -4036,20 +4410,23 @@ class FdfParser:
                 Type = self._Token
 
             if not self._IsToken("{"):
-                raise Warning.ExpectedCurlyOpen(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedCurlyOpen(
+                    self.FileName, self.CurrentLineNumber)
 
             CompressSectionObj = CompressSection()
 
             CompressSectionObj.CompType = Type
             # Recursive sections...
             while True:
-                IsEncapsulate = self._GetRuleEncapsulationSection(CompressSectionObj)
+                IsEncapsulate = self._GetRuleEncapsulationSection(
+                    CompressSectionObj)
                 IsLeaf = self._GetEfiSection(CompressSectionObj)
                 if not IsEncapsulate and not IsLeaf:
                     break
 
             if not self._IsToken(T_CHAR_BRACE_R):
-                raise Warning.ExpectedCurlyClose(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedCurlyClose(
+                    self.FileName, self.CurrentLineNumber)
             theRule.SectionList.append(CompressSectionObj)
 
             return True
@@ -4058,7 +4435,8 @@ class FdfParser:
             GuidValue = None
             if self._GetNextGuid():
                 if self._Token in GlobalData.gGuidDict:
-                    self._Token = GuidStructureStringToGuidString(GlobalData.gGuidDict[self._Token]).upper()
+                    self._Token = GuidStructureStringToGuidString(
+                        GlobalData.gGuidDict[self._Token]).upper()
                 GuidValue = self._Token
 
             if self._IsKeyword("$(NAMED_GUID)"):
@@ -4067,7 +4445,8 @@ class FdfParser:
             AttribDict = self._GetGuidAttrib()
 
             if not self._IsToken("{"):
-                raise Warning.ExpectedCurlyOpen(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedCurlyOpen(
+                    self.FileName, self.CurrentLineNumber)
             GuidSectionObj = GuidSection()
             GuidSectionObj.NameGuid = GuidValue
             GuidSectionObj.SectionType = "GUIDED"
@@ -4077,20 +4456,22 @@ class FdfParser:
 
             # Efi sections...
             while True:
-                IsEncapsulate = self._GetRuleEncapsulationSection(GuidSectionObj)
+                IsEncapsulate = self._GetRuleEncapsulationSection(
+                    GuidSectionObj)
                 IsLeaf = self._GetEfiSection(GuidSectionObj)
                 if not IsEncapsulate and not IsLeaf:
                     break
 
             if not self._IsToken(T_CHAR_BRACE_R):
-                raise Warning.ExpectedCurlyClose(self.FileName, self.CurrentLineNumber)
+                raise Warning.ExpectedCurlyClose(
+                    self.FileName, self.CurrentLineNumber)
             theRule.SectionList.append(GuidSectionObj)
 
             return True
 
         return False
 
-    ## _GetOptionRom() method
+    # _GetOptionRom() method
     #
     #   Get OptionROM section contents and store its data into OptionROM list of self.Profile
     #
@@ -4110,12 +4491,14 @@ class FdfParser:
 
         self._UndoToken()
         if not self._IsToken("[OptionRom.", True):
-            raise Warning("Unknown Keyword '%s'" % self._Token, self.FileName, self.CurrentLineNumber)
+            raise Warning("Unknown Keyword '%s'" % self._Token,
+                          self.FileName, self.CurrentLineNumber)
 
         OptRomName = self._GetUiName()
 
         if not self._IsToken(TAB_SECTION_END):
-            raise Warning.ExpectedBracketClose(self.FileName, self.CurrentLineNumber)
+            raise Warning.ExpectedBracketClose(
+                self.FileName, self.CurrentLineNumber)
 
         OptRomObj = OPTIONROM(OptRomName)
         self.Profile.OptRomDict[OptRomName] = OptRomObj
@@ -4128,7 +4511,7 @@ class FdfParser:
 
         return True
 
-    ## _GetOptRomInfStatement() method
+    # _GetOptRomInfStatement() method
     #
     #   Get INF statements
     #
@@ -4145,37 +4528,41 @@ class FdfParser:
         self._GetInfOptions(ffsInf)
 
         if not self._GetNextToken():
-            raise Warning.Expected("INF file path", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "INF file path", self.FileName, self.CurrentLineNumber)
         ffsInf.InfFileName = self._Token
         if ffsInf.InfFileName.replace(TAB_WORKSPACE, '').find('$') == -1:
-            #check for file path
-            ErrorCode, ErrorInfo = PathClass(NormPath(ffsInf.InfFileName), GenFdsGlobalVariable.WorkSpaceDir).Validate()
+            # check for file path
+            ErrorCode, ErrorInfo = PathClass(
+                NormPath(ffsInf.InfFileName), GenFdsGlobalVariable.WorkSpaceDir).Validate()
             if ErrorCode != 0:
                 EdkLogger.error("GenFds", ErrorCode, ExtraData=ErrorInfo)
 
         NewFileName = ffsInf.InfFileName
         if ffsInf.OverrideGuid:
-            NewFileName = ProcessDuplicatedInf(PathClass(ffsInf.InfFileName,GenFdsGlobalVariable.WorkSpaceDir), ffsInf.OverrideGuid, GenFdsGlobalVariable.WorkSpaceDir).Path
+            NewFileName = ProcessDuplicatedInf(PathClass(
+                ffsInf.InfFileName, GenFdsGlobalVariable.WorkSpaceDir), ffsInf.OverrideGuid, GenFdsGlobalVariable.WorkSpaceDir).Path
 
         if not NewFileName in self.Profile.InfList:
             self.Profile.InfList.append(NewFileName)
-            FileLineTuple = GetRealFileLine(self.FileName, self.CurrentLineNumber)
+            FileLineTuple = GetRealFileLine(
+                self.FileName, self.CurrentLineNumber)
             self.Profile.InfFileLineList.append(FileLineTuple)
             if ffsInf.UseArch:
                 if ffsInf.UseArch not in self.Profile.InfDict:
                     self.Profile.InfDict[ffsInf.UseArch] = [ffsInf.InfFileName]
                 else:
-                    self.Profile.InfDict[ffsInf.UseArch].append(ffsInf.InfFileName)
+                    self.Profile.InfDict[ffsInf.UseArch].append(
+                        ffsInf.InfFileName)
             else:
                 self.Profile.InfDict['ArchTBD'].append(ffsInf.InfFileName)
 
-
-        self._GetOptRomOverrides (ffsInf)
+        self._GetOptRomOverrides(ffsInf)
 
         Obj.FfsList.append(ffsInf)
         return True
 
-    ## _GetOptRomOverrides() method
+    # _GetOptRomOverrides() method
     #
     #   Get overrides for OptROM INF & FILE
     #
@@ -4188,55 +4575,67 @@ class FdfParser:
             while True:
                 if self._IsKeyword("PCI_VENDOR_ID"):
                     if not self._IsToken(TAB_EQUAL_SPLIT):
-                        raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                        raise Warning.ExpectedEquals(
+                            self.FileName, self.CurrentLineNumber)
                     if not self._GetNextHexNumber():
-                        raise Warning.Expected("Hex vendor id", self.FileName, self.CurrentLineNumber)
+                        raise Warning.Expected(
+                            "Hex vendor id", self.FileName, self.CurrentLineNumber)
                     Overrides.PciVendorId = self._Token
                     continue
 
                 if self._IsKeyword("PCI_CLASS_CODE"):
                     if not self._IsToken(TAB_EQUAL_SPLIT):
-                        raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                        raise Warning.ExpectedEquals(
+                            self.FileName, self.CurrentLineNumber)
                     if not self._GetNextHexNumber():
-                        raise Warning.Expected("Hex class code", self.FileName, self.CurrentLineNumber)
+                        raise Warning.Expected(
+                            "Hex class code", self.FileName, self.CurrentLineNumber)
                     Overrides.PciClassCode = self._Token
                     continue
 
                 if self._IsKeyword("PCI_DEVICE_ID"):
                     if not self._IsToken(TAB_EQUAL_SPLIT):
-                        raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                        raise Warning.ExpectedEquals(
+                            self.FileName, self.CurrentLineNumber)
                     # Get a list of PCI IDs
                     Overrides.PciDeviceId = ""
                     while (self._GetNextHexNumber()):
-                        Overrides.PciDeviceId = "{} {}".format(Overrides.PciDeviceId, self._Token)
+                        Overrides.PciDeviceId = "{} {}".format(
+                            Overrides.PciDeviceId, self._Token)
                     if not Overrides.PciDeviceId:
-                        raise Warning.Expected("one or more Hex device ids", self.FileName, self.CurrentLineNumber)
+                        raise Warning.Expected(
+                            "one or more Hex device ids", self.FileName, self.CurrentLineNumber)
                     continue
 
                 if self._IsKeyword("PCI_REVISION"):
                     if not self._IsToken(TAB_EQUAL_SPLIT):
-                        raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                        raise Warning.ExpectedEquals(
+                            self.FileName, self.CurrentLineNumber)
                     if not self._GetNextHexNumber():
-                        raise Warning.Expected("Hex revision", self.FileName, self.CurrentLineNumber)
+                        raise Warning.Expected(
+                            "Hex revision", self.FileName, self.CurrentLineNumber)
                     Overrides.PciRevision = self._Token
                     continue
 
                 if self._IsKeyword("PCI_COMPRESS"):
                     if not self._IsToken(TAB_EQUAL_SPLIT):
-                        raise Warning.ExpectedEquals(self.FileName, self.CurrentLineNumber)
+                        raise Warning.ExpectedEquals(
+                            self.FileName, self.CurrentLineNumber)
                     if not self._GetNextToken():
-                        raise Warning.Expected("TRUE/FALSE for compress", self.FileName, self.CurrentLineNumber)
+                        raise Warning.Expected(
+                            "TRUE/FALSE for compress", self.FileName, self.CurrentLineNumber)
                     Overrides.NeedCompress = self._Token.upper() == 'TRUE'
                     continue
 
                 if self._IsToken(T_CHAR_BRACE_R):
                     break
                 else:
-                    EdkLogger.error("FdfParser", FORMAT_INVALID, File=self.FileName, Line=self.CurrentLineNumber)
+                    EdkLogger.error("FdfParser", FORMAT_INVALID,
+                                    File=self.FileName, Line=self.CurrentLineNumber)
 
             Obj.OverrideAttribs = Overrides
 
-    ## _GetOptRomFileStatement() method
+    # _GetOptRomFileStatement() method
     #
     #   Get FILE statements
     #
@@ -4252,15 +4651,18 @@ class FdfParser:
         FfsFileObj = OptRomFileStatement()
 
         if not self._IsKeyword("EFI") and not self._IsKeyword(BINARY_FILE_TYPE_BIN):
-            raise Warning.Expected("Binary type (EFI/BIN)", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "Binary type (EFI/BIN)", self.FileName, self.CurrentLineNumber)
         FfsFileObj.FileType = self._Token
 
         if not self._GetNextToken():
-            raise Warning.Expected("File path", self.FileName, self.CurrentLineNumber)
+            raise Warning.Expected(
+                "File path", self.FileName, self.CurrentLineNumber)
         FfsFileObj.FileName = self._Token
         if FfsFileObj.FileName.replace(TAB_WORKSPACE, '').find('$') == -1:
-            #check for file path
-            ErrorCode, ErrorInfo = PathClass(NormPath(FfsFileObj.FileName), GenFdsGlobalVariable.WorkSpaceDir).Validate()
+            # check for file path
+            ErrorCode, ErrorInfo = PathClass(
+                NormPath(FfsFileObj.FileName), GenFdsGlobalVariable.WorkSpaceDir).Validate()
             if ErrorCode != 0:
                 EdkLogger.error("GenFds", ErrorCode, ExtraData=ErrorInfo)
 
@@ -4271,7 +4673,7 @@ class FdfParser:
 
         return True
 
-    ## _GetCapInFd() method
+    # _GetCapInFd() method
     #
     #   Get Cap list contained in FD
     #
@@ -4279,7 +4681,7 @@ class FdfParser:
     #   @param  FdName      FD name
     #   @retval CapList     List of Capsule in FD
     #
-    def _GetCapInFd (self, FdName):
+    def _GetCapInFd(self, FdName):
         CapList = []
         if FdName.upper() in self.Profile.FdDict:
             FdObj = self.Profile.FdDict[FdName.upper()]
@@ -4292,7 +4694,7 @@ class FdfParser:
                             CapList.append(elementRegionData.upper())
         return CapList
 
-    ## _GetReferencedFdCapTuple() method
+    # _GetReferencedFdCapTuple() method
     #
     #   Get FV and FD list referenced by a capsule image
     #
@@ -4301,12 +4703,12 @@ class FdfParser:
     #   @param  RefFdList   referenced FD by section
     #   @param  RefFvList   referenced FV by section
     #
-    def _GetReferencedFdCapTuple(self, CapObj, RefFdList = [], RefFvList = []):
+    def _GetReferencedFdCapTuple(self, CapObj, RefFdList=[], RefFvList=[]):
         for CapsuleDataObj in CapObj.CapsuleDataList:
             if hasattr(CapsuleDataObj, 'FvName') and CapsuleDataObj.FvName is not None and CapsuleDataObj.FvName.upper() not in RefFvList:
-                RefFvList.append (CapsuleDataObj.FvName.upper())
+                RefFvList.append(CapsuleDataObj.FvName.upper())
             elif hasattr(CapsuleDataObj, 'FdName') and CapsuleDataObj.FdName is not None and CapsuleDataObj.FdName.upper() not in RefFdList:
-                RefFdList.append (CapsuleDataObj.FdName.upper())
+                RefFdList.append(CapsuleDataObj.FdName.upper())
             elif CapsuleDataObj.Ffs is not None:
                 if isinstance(CapsuleDataObj.Ffs, FileStatement):
                     if CapsuleDataObj.Ffs.FvName is not None and CapsuleDataObj.Ffs.FvName.upper() not in RefFvList:
@@ -4314,9 +4716,10 @@ class FdfParser:
                     elif CapsuleDataObj.Ffs.FdName is not None and CapsuleDataObj.Ffs.FdName.upper() not in RefFdList:
                         RefFdList.append(CapsuleDataObj.Ffs.FdName.upper())
                     else:
-                        self._GetReferencedFdFvTupleFromSection(CapsuleDataObj.Ffs, RefFdList, RefFvList)
+                        self._GetReferencedFdFvTupleFromSection(
+                            CapsuleDataObj.Ffs, RefFdList, RefFvList)
 
-    ## _GetFvInFd() method
+    # _GetFvInFd() method
     #
     #   Get FV list contained in FD
     #
@@ -4324,7 +4727,7 @@ class FdfParser:
     #   @param  FdName      FD name
     #   @retval FvList      list of FV in FD
     #
-    def _GetFvInFd (self, FdName):
+    def _GetFvInFd(self, FdName):
         FvList = []
         if FdName.upper() in self.Profile.FdDict:
             FdObj = self.Profile.FdDict[FdName.upper()]
@@ -4337,7 +4740,7 @@ class FdfParser:
                             FvList.append(elementRegionData.upper())
         return FvList
 
-    ## _GetReferencedFdFvTuple() method
+    # _GetReferencedFdFvTuple() method
     #
     #   Get FD and FV list referenced by a FFS file
     #
@@ -4346,7 +4749,7 @@ class FdfParser:
     #   @param  RefFdList   referenced FD by section
     #   @param  RefFvList   referenced FV by section
     #
-    def _GetReferencedFdFvTuple(self, FvObj, RefFdList = [], RefFvList = []):
+    def _GetReferencedFdFvTuple(self, FvObj, RefFdList=[], RefFvList=[]):
         for FfsObj in FvObj.FfsList:
             if isinstance(FfsObj, FileStatement):
                 if FfsObj.FvName is not None and FfsObj.FvName.upper() not in RefFvList:
@@ -4354,9 +4757,10 @@ class FdfParser:
                 elif FfsObj.FdName is not None and FfsObj.FdName.upper() not in RefFdList:
                     RefFdList.append(FfsObj.FdName.upper())
                 else:
-                    self._GetReferencedFdFvTupleFromSection(FfsObj, RefFdList, RefFvList)
+                    self._GetReferencedFdFvTupleFromSection(
+                        FfsObj, RefFdList, RefFvList)
 
-    ## _GetReferencedFdFvTupleFromSection() method
+    # _GetReferencedFdFvTupleFromSection() method
     #
     #   Get FD and FV list referenced by a FFS section
     #
@@ -4365,7 +4769,7 @@ class FdfParser:
     #   @param  FdList      referenced FD by section
     #   @param  FvList      referenced FV by section
     #
-    def _GetReferencedFdFvTupleFromSection(self, FfsFile, FdList = [], FvList = []):
+    def _GetReferencedFdFvTupleFromSection(self, FfsFile, FdList=[], FvList=[]):
         SectionStack = list(FfsFile.SectionList)
         while SectionStack != []:
             SectionObj = SectionStack.pop()
@@ -4379,7 +4783,7 @@ class FdfParser:
             if isinstance(SectionObj, CompressSection) or isinstance(SectionObj, GuidSection):
                 SectionStack.extend(SectionObj.SectionList)
 
-    ## CycleReferenceCheck() method
+    # CycleReferenceCheck() method
     #
     #   Check whether cycle reference exists in FDF
     #
@@ -4391,7 +4795,7 @@ class FdfParser:
         #
         # Check the cycle between FV and FD image
         #
-        MaxLength = len (self.Profile.FvDict)
+        MaxLength = len(self.Profile.FvDict)
         for FvName in self.Profile.FvDict:
             LogStr = "\nCycle Reference Checking for FV: %s\n" % FvName
             RefFvStack = set(FvName)
@@ -4414,11 +4818,13 @@ class FdfParser:
                     if RefFdName in FdAnalyzedList:
                         continue
 
-                    LogStr += "FV %s contains FD %s\n" % (FvNameFromStack, RefFdName)
+                    LogStr += "FV %s contains FD %s\n" % (
+                        FvNameFromStack, RefFdName)
                     FvInFdList = self._GetFvInFd(RefFdName)
                     if FvInFdList != []:
                         for FvNameInFd in FvInFdList:
-                            LogStr += "FD %s contains FV %s\n" % (RefFdName, FvNameInFd)
+                            LogStr += "FD %s contains FV %s\n" % (
+                                RefFdName, FvNameInFd)
                             if FvNameInFd not in RefFvStack:
                                 RefFvStack.add(FvNameInFd)
 
@@ -4428,7 +4834,8 @@ class FdfParser:
                     FdAnalyzedList.add(RefFdName)
 
                 for RefFvName in RefFvList:
-                    LogStr += "FV %s contains FV %s\n" % (FvNameFromStack, RefFvName)
+                    LogStr += "FV %s contains FV %s\n" % (
+                        FvNameFromStack, RefFvName)
                     if RefFvName not in RefFvStack:
                         RefFvStack.add(RefFvName)
 
@@ -4439,7 +4846,7 @@ class FdfParser:
         #
         # Check the cycle between Capsule and FD image
         #
-        MaxLength = len (self.Profile.CapsuleDict)
+        MaxLength = len(self.Profile.CapsuleDict)
         for CapName in self.Profile.CapsuleDict:
             #
             # Capsule image to be checked.
@@ -4464,14 +4871,16 @@ class FdfParser:
 
                 FvListLength = 0
                 FdListLength = 0
-                while FvListLength < len (RefFvList) or FdListLength < len (RefFdList):
+                while FvListLength < len(RefFvList) or FdListLength < len(RefFdList):
                     for RefFdName in RefFdList:
                         if RefFdName in FdAnalyzedList:
                             continue
 
-                        LogStr += "Capsule %s contains FD %s\n" % (CapNameFromStack, RefFdName)
+                        LogStr += "Capsule %s contains FD %s\n" % (
+                            CapNameFromStack, RefFdName)
                         for CapNameInFd in self._GetCapInFd(RefFdName):
-                            LogStr += "FD %s contains Capsule %s\n" % (RefFdName, CapNameInFd)
+                            LogStr += "FD %s contains Capsule %s\n" % (
+                                RefFdName, CapNameInFd)
                             if CapNameInFd not in RefCapStack:
                                 RefCapStack.append(CapNameInFd)
 
@@ -4480,7 +4889,8 @@ class FdfParser:
                                 return True
 
                         for FvNameInFd in self._GetFvInFd(RefFdName):
-                            LogStr += "FD %s contains FV %s\n" % (RefFdName, FvNameInFd)
+                            LogStr += "FD %s contains FV %s\n" % (
+                                RefFdName, FvNameInFd)
                             if FvNameInFd not in RefFvList:
                                 RefFvList.append(FvNameInFd)
 
@@ -4488,25 +4898,28 @@ class FdfParser:
                     #
                     # the number of the parsed FV and FD image
                     #
-                    FvListLength = len (RefFvList)
-                    FdListLength = len (RefFdList)
+                    FvListLength = len(RefFvList)
+                    FdListLength = len(RefFdList)
                     for RefFvName in RefFvList:
                         if RefFvName in FvAnalyzedList:
                             continue
-                        LogStr += "Capsule %s contains FV %s\n" % (CapNameFromStack, RefFvName)
+                        LogStr += "Capsule %s contains FV %s\n" % (
+                            CapNameFromStack, RefFvName)
                         if RefFvName.upper() in self.Profile.FvDict:
                             FvObj = self.Profile.FvDict[RefFvName.upper()]
                         else:
                             continue
-                        self._GetReferencedFdFvTuple(FvObj, RefFdList, RefFvList)
+                        self._GetReferencedFdFvTuple(
+                            FvObj, RefFdList, RefFvList)
                         FvAnalyzedList.add(RefFvName)
 
         return False
 
-    def GetAllIncludedFile (self):
+    def GetAllIncludedFile(self):
         global AllIncludeFileList
         return AllIncludeFileList
 
+
 if __name__ == "__main__":
     import sys
     try:
@@ -4523,4 +4936,3 @@ if __name__ == "__main__":
         print(str(X))
     else:
         print("Success!")
-
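
If you would rather spot-check a file locally than read every hunk, the changes are mechanical enough to re-derive with autopep8 and diff. A minimal sketch using autopep8's Python API with default options; the exact options used to generate this series are not recorded in the patch itself, so treat the snippet below as an assumption, not the authoritative recipe:

    # Hypothetical verification helper, not part of this patch.
    import autopep8

    def matches_autopep8(path):
        # True when the file on disk already matches what autopep8
        # would emit with its default fixes (the whitespace and
        # line-wrapping changes seen in the hunks above).
        with open(path) as f:
            source = f.read()
        return autopep8.fix_code(source) == source

    print(matches_autopep8(
        "BaseTools/Source/Python/GenFds/FdfParser.py"))
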
diff --git a/BaseTools/Source/Python/GenFds/Ffs.py b/BaseTools/Source/Python/GenFds/Ffs.py
index 4e58df279b14..272fabc41e03 100644
--- a/BaseTools/Source/Python/GenFds/Ffs.py
+++ b/BaseTools/Source/Python/GenFds/Ffs.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # process FFS generation
 #
 #  Copyright (c) 2007-2018, Intel Corporation. All rights reserved.<BR>
@@ -13,37 +13,37 @@ from Common.DataType import *
 
 # mapping between FILE type in FDF and file type for GenFfs
 FdfFvFileTypeToFileType = {
-    SUP_MODULE_SEC               : 'EFI_FV_FILETYPE_SECURITY_CORE',
-    SUP_MODULE_PEI_CORE          : 'EFI_FV_FILETYPE_PEI_CORE',
-    SUP_MODULE_PEIM              : 'EFI_FV_FILETYPE_PEIM',
-    SUP_MODULE_DXE_CORE          : 'EFI_FV_FILETYPE_DXE_CORE',
-    'FREEFORM'          : 'EFI_FV_FILETYPE_FREEFORM',
-    'DRIVER'            : 'EFI_FV_FILETYPE_DRIVER',
-    'APPLICATION'       : 'EFI_FV_FILETYPE_APPLICATION',
-    'FV_IMAGE'          : 'EFI_FV_FILETYPE_FIRMWARE_VOLUME_IMAGE',
-    'RAW'               : 'EFI_FV_FILETYPE_RAW',
-    'PEI_DXE_COMBO'     : 'EFI_FV_FILETYPE_COMBINED_PEIM_DRIVER',
-    'SMM'               : 'EFI_FV_FILETYPE_SMM',
-    SUP_MODULE_SMM_CORE          : 'EFI_FV_FILETYPE_SMM_CORE',
-    SUP_MODULE_MM_STANDALONE     : 'EFI_FV_FILETYPE_MM_STANDALONE',
-    SUP_MODULE_MM_CORE_STANDALONE : 'EFI_FV_FILETYPE_MM_CORE_STANDALONE'
+    SUP_MODULE_SEC: 'EFI_FV_FILETYPE_SECURITY_CORE',
+    SUP_MODULE_PEI_CORE: 'EFI_FV_FILETYPE_PEI_CORE',
+    SUP_MODULE_PEIM: 'EFI_FV_FILETYPE_PEIM',
+    SUP_MODULE_DXE_CORE: 'EFI_FV_FILETYPE_DXE_CORE',
+    'FREEFORM': 'EFI_FV_FILETYPE_FREEFORM',
+    'DRIVER': 'EFI_FV_FILETYPE_DRIVER',
+    'APPLICATION': 'EFI_FV_FILETYPE_APPLICATION',
+    'FV_IMAGE': 'EFI_FV_FILETYPE_FIRMWARE_VOLUME_IMAGE',
+    'RAW': 'EFI_FV_FILETYPE_RAW',
+    'PEI_DXE_COMBO': 'EFI_FV_FILETYPE_COMBINED_PEIM_DRIVER',
+    'SMM': 'EFI_FV_FILETYPE_SMM',
+    SUP_MODULE_SMM_CORE: 'EFI_FV_FILETYPE_SMM_CORE',
+    SUP_MODULE_MM_STANDALONE: 'EFI_FV_FILETYPE_MM_STANDALONE',
+    SUP_MODULE_MM_CORE_STANDALONE: 'EFI_FV_FILETYPE_MM_CORE_STANDALONE'
 }
 
 # mapping between section type in FDF and file suffix
 SectionSuffix = {
-    BINARY_FILE_TYPE_PE32                 : '.pe32',
-    BINARY_FILE_TYPE_PIC                  : '.pic',
-    BINARY_FILE_TYPE_TE                   : '.te',
-    BINARY_FILE_TYPE_DXE_DEPEX            : '.dpx',
-    'VERSION'              : '.ver',
-    BINARY_FILE_TYPE_UI                   : '.ui',
-    'COMPAT16'             : '.com16',
-    'RAW'                  : '.raw',
+    BINARY_FILE_TYPE_PE32: '.pe32',
+    BINARY_FILE_TYPE_PIC: '.pic',
+    BINARY_FILE_TYPE_TE: '.te',
+    BINARY_FILE_TYPE_DXE_DEPEX: '.dpx',
+    'VERSION': '.ver',
+    BINARY_FILE_TYPE_UI: '.ui',
+    'COMPAT16': '.com16',
+    'RAW': '.raw',
     'FREEFORM_SUBTYPE_GUID': '.guid',
-    'SUBTYPE_GUID'         : '.guid',
-    'FV_IMAGE'             : 'fv.sec',
-    'COMPRESS'             : '.com',
-    'GUIDED'               : '.guided',
-    BINARY_FILE_TYPE_PEI_DEPEX            : '.dpx',
-    BINARY_FILE_TYPE_SMM_DEPEX            : '.dpx'
+    'SUBTYPE_GUID': '.guid',
+    'FV_IMAGE': 'fv.sec',
+    'COMPRESS': '.com',
+    'GUIDED': '.guided',
+    BINARY_FILE_TYPE_PEI_DEPEX: '.dpx',
+    BINARY_FILE_TYPE_SMM_DEPEX: '.dpx'
 }
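
The Ffs.py hunks above are all one class of fix: the column-aligned padding before the dictionary colons is dropped (pycodestyle E203, whitespace before ':'), leaving a single space after each colon. A before/after sketch with placeholder names, just to make the rewrite explicit:

    # Illustrative only; the variable names are placeholders,
    # not the real EDK2 tables.
    # Before: colons padded so the values line up in a column.
    SUFFIX_ALIGNED = {
        'PE32'    : '.pe32',
        'VERSION' : '.ver',
    }

    # After autopep8: no space before the colon, one space after.
    SUFFIX_PEP8 = {
        'PE32': '.pe32',
        'VERSION': '.ver',
    }
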
diff --git a/BaseTools/Source/Python/GenFds/FfsFileStatement.py b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
index 1c6e59bac75c..92758df91e5a 100644
--- a/BaseTools/Source/Python/GenFds/FfsFileStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsFileStatement.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # process FFS generation from FILE statement
 #
 #  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -23,11 +23,13 @@ from .Ffs import FdfFvFileTypeToFileType
 from .GenFdsGlobalVariable import GenFdsGlobalVariable
 import shutil
 
-## generate FFS from FILE
+# generate FFS from FILE
 #
 #
+
+
 class FileStatement (FileStatementClassObject):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -39,7 +41,7 @@ class FileStatement (FileStatementClassObject):
         self.InfFileName = None
         self.SubAlignment = None
 
-    ## GenFfs() method
+    # GenFfs() method
     #
     #   Generate FFS
     #
@@ -49,19 +51,19 @@ class FileStatement (FileStatementClassObject):
     #   @param  FvParentAddr Parent Fv base address
     #   @retval string       Generated FFS file name
     #
-    def GenFfs(self, Dict = None, FvChildAddr=[], FvParentAddr=None, IsMakefile=False, FvName=None):
+    def GenFfs(self, Dict=None, FvChildAddr=[], FvParentAddr=None, IsMakefile=False, FvName=None):
 
         if self.NameGuid and self.NameGuid.startswith('PCD('):
             PcdValue = GenFdsGlobalVariable.GetPcdValue(self.NameGuid)
             if len(PcdValue) == 0:
-                EdkLogger.error("GenFds", GENFDS_ERROR, '%s NOT defined.' \
-                            % (self.NameGuid))
+                EdkLogger.error("GenFds", GENFDS_ERROR, '%s NOT defined.'
+                                % (self.NameGuid))
             if PcdValue.startswith('{'):
                 PcdValue = GuidStructureByteArrayToGuidString(PcdValue)
             RegistryGuidStr = PcdValue
             if len(RegistryGuidStr) == 0:
-                EdkLogger.error("GenFds", GENFDS_ERROR, 'GUID value for %s in wrong format.' \
-                            % (self.NameGuid))
+                EdkLogger.error("GenFds", GENFDS_ERROR, 'GUID value for %s in wrong format.'
+                                % (self.NameGuid))
             self.NameGuid = RegistryGuidStr
 
         Str = self.NameGuid
@@ -81,15 +83,19 @@ class FileStatement (FileStatementClassObject):
         if self.FvName:
             Buffer = BytesIO()
             if self.FvName.upper() not in GenFdsGlobalVariable.FdfParser.Profile.FvDict:
-                EdkLogger.error("GenFds", GENFDS_ERROR, "FV (%s) is NOT described in FDF file!" % (self.FvName))
-            Fv = GenFdsGlobalVariable.FdfParser.Profile.FvDict.get(self.FvName.upper())
+                EdkLogger.error(
+                    "GenFds", GENFDS_ERROR, "FV (%s) is NOT described in FDF file!" % (self.FvName))
+            Fv = GenFdsGlobalVariable.FdfParser.Profile.FvDict.get(
+                self.FvName.upper())
             FileName = Fv.AddToBuffer(Buffer)
             SectionFiles = [FileName]
 
         elif self.FdName:
             if self.FdName.upper() not in GenFdsGlobalVariable.FdfParser.Profile.FdDict:
-                EdkLogger.error("GenFds", GENFDS_ERROR, "FD (%s) is NOT described in FDF file!" % (self.FdName))
-            Fd = GenFdsGlobalVariable.FdfParser.Profile.FdDict.get(self.FdName.upper())
+                EdkLogger.error(
+                    "GenFds", GENFDS_ERROR, "FD (%s) is NOT described in FDF file!" % (self.FdName))
+            Fd = GenFdsGlobalVariable.FdfParser.Profile.FdDict.get(
+                self.FdName.upper())
             FileName = Fd.GenFd()
             SectionFiles = [FileName]
 
@@ -103,37 +109,44 @@ class FileStatement (FileStatementClassObject):
                         try:
                             f = open(File, 'rb')
                         except:
-                            GenFdsGlobalVariable.ErrorLogger("Error opening RAW file %s." % (File))
+                            GenFdsGlobalVariable.ErrorLogger(
+                                "Error opening RAW file %s." % (File))
                         Content = f.read()
                         f.close()
                         AlignValue = 1
                         if self.SubAlignment[Index]:
-                            AlignValue = GenFdsGlobalVariable.GetAlignment(self.SubAlignment[Index])
+                            AlignValue = GenFdsGlobalVariable.GetAlignment(
+                                self.SubAlignment[Index])
                         if AlignValue > MaxAlignValue:
                             MaxAlignIndex = Index
                             MaxAlignValue = AlignValue
                         FileContent.write(Content)
                         if len(FileContent.getvalue()) % AlignValue != 0:
-                            Size = AlignValue - len(FileContent.getvalue()) % AlignValue
+                            Size = AlignValue - \
+                                len(FileContent.getvalue()) % AlignValue
                             for i in range(0, Size):
                                 FileContent.write(pack('B', 0xFF))
 
                     if FileContent.getvalue() != b'':
-                        OutputRAWFile = os.path.join(GenFdsGlobalVariable.FfsDir, self.NameGuid, self.NameGuid + '.raw')
-                        SaveFileOnChange(OutputRAWFile, FileContent.getvalue(), True)
+                        OutputRAWFile = os.path.join(
+                            GenFdsGlobalVariable.FfsDir, self.NameGuid, self.NameGuid + '.raw')
+                        SaveFileOnChange(
+                            OutputRAWFile, FileContent.getvalue(), True)
                         self.FileName = OutputRAWFile
                         self.SubAlignment = self.SubAlignment[MaxAlignIndex]
 
                 if self.Alignment and self.SubAlignment:
-                    if GenFdsGlobalVariable.GetAlignment (self.Alignment) < GenFdsGlobalVariable.GetAlignment (self.SubAlignment):
+                    if GenFdsGlobalVariable.GetAlignment(self.Alignment) < GenFdsGlobalVariable.GetAlignment(self.SubAlignment):
                         self.Alignment = self.SubAlignment
                 elif self.SubAlignment:
                     self.Alignment = self.SubAlignment
 
-            self.FileName = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.FileName)
-            #Replace $(SAPCE) with real space
+            self.FileName = GenFdsGlobalVariable.ReplaceWorkspaceMacro(
+                self.FileName)
+            # Replace $(SAPCE) with real space
             self.FileName = self.FileName.replace('$(SPACE)', ' ')
-            SectionFiles = [GenFdsGlobalVariable.MacroExtend(self.FileName, Dict)]
+            SectionFiles = [
+                GenFdsGlobalVariable.MacroExtend(self.FileName, Dict)]
 
         else:
             SectionFiles = []
@@ -141,7 +154,7 @@ class FileStatement (FileStatementClassObject):
             SectionAlignments = []
             for section in self.SectionList:
                 Index = Index + 1
-                SecIndex = '%d' %Index
+                SecIndex = '%d' % Index
                 # process the inside FvImage from FvSection or GuidSection
                 if FvChildAddr != []:
                     if isinstance(section, FvImageSection):
@@ -153,7 +166,8 @@ class FileStatement (FileStatementClassObject):
 
                 if self.KeepReloc == False:
                     section.KeepReloc = False
-                sectList, align = section.GenSection(OutputDir, self.NameGuid, SecIndex, self.KeyStringList, None, Dict)
+                sectList, align = section.GenSection(
+                    OutputDir, self.NameGuid, SecIndex, self.KeyStringList, None, Dict)
                 if sectList != []:
                     for sect in sectList:
                         SectionFiles.append(sect)
@@ -164,12 +178,13 @@ class FileStatement (FileStatementClassObject):
         #
         FfsFileOutput = os.path.join(OutputDir, self.NameGuid + '.ffs')
         GenFdsGlobalVariable.GenerateFfs(FfsFileOutput, SectionFiles,
-                                         FdfFvFileTypeToFileType.get(self.FvFileType),
+                                         FdfFvFileTypeToFileType.get(
+                                             self.FvFileType),
                                          self.NameGuid,
                                          Fixed=self.Fixed,
                                          CheckSum=self.CheckSum,
                                          Align=self.Alignment,
                                          SectionAlign=SectionAlignments
-                                        )
+                                         )
 
         return FfsFileOutput
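
Most of the remaining churn in this file (and in FfsInfStatement.py below) is the other dominant fix: calls longer than 79 columns (pycodestyle E501) are wrapped, with the arguments hung under the opening parenthesis. A minimal placeholder example of that rewrite:

    # Illustrative only; print() stands in for the EdkLogger.error()
    # call sites rewritten above.
    message = "FD (%s) is NOT described in FDF file!"

    # Before (sketch of the pre-patch shape; the real call sites
    # exceeded the 79-column limit on a single line):
    # print("GenFds", "GENFDS_ERROR", message % "FV_RECOVERY", "Module.inf")

    # After autopep8: arguments wrapped under the opening parenthesis.
    print(
        "GenFds", "GENFDS_ERROR",
        message % "FV_RECOVERY", "Module.inf")
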
diff --git a/BaseTools/Source/Python/GenFds/FfsInfStatement.py b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
index 568efb6d7685..17095e5edf1d 100644
--- a/BaseTools/Source/Python/GenFds/FfsInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/FfsInfStatement.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # process FFS generation from INF statement
 #
 #  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -16,7 +16,7 @@ import Common.LongFilePathOs as os
 from io import BytesIO
 from struct import *
 from .GenFdsGlobalVariable import GenFdsGlobalVariable
-from .Ffs import SectionSuffix,FdfFvFileTypeToFileType
+from .Ffs import SectionSuffix, FdfFvFileTypeToFileType
 import subprocess
 import sys
 from . import Section
@@ -46,11 +46,13 @@ from Common.Misc import SaveFileOnChange
 from Common.Expression import *
 from Common.DataType import *
 
-## generate FFS from INF
+# generate FFS from INF
 #
 #
+
+
 class FfsInfStatement(FfsInfStatementClassObject):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -73,22 +75,25 @@ class FfsInfStatement(FfsInfStatementClassObject):
         self.MacroDict = {}
         self.Depex = False
 
-    ## GetFinalTargetSuffixMap() method
+    # GetFinalTargetSuffixMap() method
     #
     #    Get final build target list
     def GetFinalTargetSuffixMap(self):
         if not self.InfModule or not self.CurrentArch:
             return []
         if not self.FinalTargetSuffixMap:
-            FinalBuildTargetList = GenFdsGlobalVariable.GetModuleCodaTargetList(self.InfModule, self.CurrentArch)
+            FinalBuildTargetList = GenFdsGlobalVariable.GetModuleCodaTargetList(
+                self.InfModule, self.CurrentArch)
             for File in FinalBuildTargetList:
-                self.FinalTargetSuffixMap.setdefault(os.path.splitext(File)[1], []).append(File)
+                self.FinalTargetSuffixMap.setdefault(
+                    os.path.splitext(File)[1], []).append(File)
 
             # Check if current INF module has DEPEX
             if '.depex' not in self.FinalTargetSuffixMap and self.InfModule.ModuleType != SUP_MODULE_USER_DEFINED and self.InfModule.ModuleType != SUP_MODULE_HOST_APPLICATION \
-                and not self.InfModule.DxsFile and not self.InfModule.LibraryClass:
+                    and not self.InfModule.DxsFile and not self.InfModule.LibraryClass:
                 ModuleType = self.InfModule.ModuleType
-                PlatformDataBase = GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, self.CurrentArch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
+                PlatformDataBase = GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform,
+                                                                              self.CurrentArch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
 
                 if ModuleType != SUP_MODULE_USER_DEFINED and ModuleType != SUP_MODULE_HOST_APPLICATION:
                     for LibraryClass in PlatformDataBase.LibraryClasses.GetKeys():
@@ -131,31 +136,34 @@ class FfsInfStatement(FfsInfStatementClassObject):
                             LibraryPath = Module.LibraryClasses[LibName]
                         if not LibraryPath:
                             continue
-                        LibraryModule = GenFdsGlobalVariable.WorkSpace.BuildObject[LibraryPath, self.CurrentArch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
+                        LibraryModule = GenFdsGlobalVariable.WorkSpace.BuildObject[LibraryPath,
+                                                                                   self.CurrentArch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
                         LibraryInstance[LibName] = LibraryModule
                         DependencyList.append(LibraryModule)
                 if DepexList:
                     Dpx = DependencyExpression(DepexList, ModuleType, True)
                     if len(Dpx.PostfixNotation) != 0:
                         # It means this module has DEPEX
-                        self.FinalTargetSuffixMap['.depex'] = [os.path.join(self.EfiOutputPath, self.BaseName) + '.depex']
+                        self.FinalTargetSuffixMap['.depex'] = [os.path.join(
+                            self.EfiOutputPath, self.BaseName) + '.depex']
         return self.FinalTargetSuffixMap
 
-    ## __InfParse() method
+    # __InfParse() method
     #
     #   Parse inf file to get module information
     #
     #   @param  self        The object pointer
     #   @param  Dict        dictionary contains macro and value pair
     #
-    def __InfParse__(self, Dict = None, IsGenFfs=False):
+    def __InfParse__(self, Dict=None, IsGenFfs=False):
 
-        GenFdsGlobalVariable.VerboseLogger( " Begine parsing INf file : %s" %self.InfFileName)
+        GenFdsGlobalVariable.VerboseLogger(
+            " Begine parsing INf file : %s" % self.InfFileName)
 
         self.InfFileName = self.InfFileName.replace('$(WORKSPACE)', '')
         if len(self.InfFileName) > 1 and self.InfFileName[0] == '\\' and self.InfFileName[1] == '\\':
             pass
-        elif self.InfFileName[0] == '\\' or self.InfFileName[0] == '/' :
+        elif self.InfFileName[0] == '\\' or self.InfFileName[0] == '/':
             self.InfFileName = self.InfFileName[1:]
 
         if self.InfFileName.find('$') == -1:
@@ -163,14 +171,16 @@ class FfsInfStatement(FfsInfStatementClassObject):
             if not os.path.exists(InfPath):
                 InfPath = GenFdsGlobalVariable.ReplaceWorkspaceMacro(InfPath)
                 if not os.path.exists(InfPath):
-                    EdkLogger.error("GenFds", GENFDS_ERROR, "Non-existant Module %s !" % (self.InfFileName))
+                    EdkLogger.error(
+                        "GenFds", GENFDS_ERROR, "Non-existant Module %s !" % (self.InfFileName))
 
         self.CurrentArch = self.GetCurrentArch()
         #
         # Get the InfClass object
         #
 
-        PathClassObj = PathClass(self.InfFileName, GenFdsGlobalVariable.WorkSpaceDir)
+        PathClassObj = PathClass(
+            self.InfFileName, GenFdsGlobalVariable.WorkSpaceDir)
         ErrorCode, ErrorInfo = PathClassObj.Validate(".inf")
         if ErrorCode != 0:
             EdkLogger.error("GenFds", ErrorCode, ExtraData=ErrorInfo)
@@ -180,10 +190,12 @@ class FfsInfStatement(FfsInfStatementClassObject):
         #
         InfLowerPath = str(PathClassObj).lower()
         if self.OverrideGuid:
-            PathClassObj = ProcessDuplicatedInf(PathClassObj, self.OverrideGuid, GenFdsGlobalVariable.WorkSpaceDir)
+            PathClassObj = ProcessDuplicatedInf(
+                PathClassObj, self.OverrideGuid, GenFdsGlobalVariable.WorkSpaceDir)
         if self.CurrentArch is not None:
 
-            Inf = GenFdsGlobalVariable.WorkSpace.BuildObject[PathClassObj, self.CurrentArch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
+            Inf = GenFdsGlobalVariable.WorkSpace.BuildObject[PathClassObj, self.CurrentArch,
+                                                             GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
             #
             # Set Ffs BaseName, ModuleGuid, ModuleType, Version, OutputPath
             #
@@ -201,7 +213,8 @@ class FfsInfStatement(FfsInfStatementClassObject):
                 self.ShadowFromInfFile = Inf.Shadow
 
         else:
-            Inf = GenFdsGlobalVariable.WorkSpace.BuildObject[PathClassObj, TAB_COMMON, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
+            Inf = GenFdsGlobalVariable.WorkSpace.BuildObject[PathClassObj, TAB_COMMON,
+                                                             GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
             self.BaseName = Inf.BaseName
             self.ModuleGuid = Inf.Guid
             self.ModuleType = Inf.ModuleType
@@ -212,27 +225,31 @@ class FfsInfStatement(FfsInfStatementClassObject):
             self.SourceFileList = Inf.Sources
             if self.BinFileList == []:
                 EdkLogger.error("GenFds", GENFDS_ERROR,
-                                "INF %s specified in FDF could not be found in build ARCH %s!" \
+                                "INF %s specified in FDF could not be found in build ARCH %s!"
                                 % (self.InfFileName, GenFdsGlobalVariable.ArchList))
 
         if self.OverrideGuid:
             self.ModuleGuid = self.OverrideGuid
 
         if len(self.SourceFileList) != 0 and not self.InDsc:
-            EdkLogger.warn("GenFds", GENFDS_ERROR, "Module %s NOT found in DSC file; Is it really a binary module?" % (self.InfFileName))
+            EdkLogger.warn("GenFds", GENFDS_ERROR, "Module %s NOT found in DSC file; Is it really a binary module?" % (
+                self.InfFileName))
 
         if self.ModuleType == SUP_MODULE_SMM_CORE and int(self.PiSpecVersion, 16) < 0x0001000A:
-            EdkLogger.error("GenFds", FORMAT_NOT_SUPPORTED, "SMM_CORE module type can't be used in the module with PI_SPECIFICATION_VERSION less than 0x0001000A", File=self.InfFileName)
+            EdkLogger.error("GenFds", FORMAT_NOT_SUPPORTED,
+                            "SMM_CORE module type can't be used in the module with PI_SPECIFICATION_VERSION less than 0x0001000A", File=self.InfFileName)
 
         if self.ModuleType == SUP_MODULE_MM_CORE_STANDALONE and int(self.PiSpecVersion, 16) < 0x00010032:
-            EdkLogger.error("GenFds", FORMAT_NOT_SUPPORTED, "MM_CORE_STANDALONE module type can't be used in the module with PI_SPECIFICATION_VERSION less than 0x00010032", File=self.InfFileName)
+            EdkLogger.error("GenFds", FORMAT_NOT_SUPPORTED,
+                            "MM_CORE_STANDALONE module type can't be used in the module with PI_SPECIFICATION_VERSION less than 0x00010032", File=self.InfFileName)
 
         if Inf._Defs is not None and len(Inf._Defs) > 0:
             self.OptRomDefs.update(Inf._Defs)
 
         self.PatchPcds = []
         InfPcds = Inf.Pcds
-        Platform = GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, self.CurrentArch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
+        Platform = GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform,
+                                                              self.CurrentArch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
         FdfPcdDict = GenFdsGlobalVariable.FdfParser.Profile.PcdDict
         PlatformPcds = Platform.Pcds
 
@@ -282,15 +299,19 @@ class FfsInfStatement(FfsInfStatementClassObject):
             # Support Flexible PCD format
             if DefaultValue:
                 try:
-                    DefaultValue = ValueExpressionEx(DefaultValue, Pcd.DatumType, Platform._GuidDict)(True)
+                    DefaultValue = ValueExpressionEx(
+                        DefaultValue, Pcd.DatumType, Platform._GuidDict)(True)
                 except BadExpression:
-                    EdkLogger.error("GenFds", GENFDS_ERROR, 'PCD [%s.%s] Value "%s"' %(Pcd.TokenSpaceGuidCName, Pcd.TokenCName, DefaultValue), File=self.InfFileName)
+                    EdkLogger.error("GenFds", GENFDS_ERROR, 'PCD [%s.%s] Value "%s"' % (
+                        Pcd.TokenSpaceGuidCName, Pcd.TokenCName, DefaultValue), File=self.InfFileName)
 
             if Pcd.InfDefaultValue:
                 try:
-                    Pcd.InfDefaultValue = ValueExpressionEx(Pcd.InfDefaultValue, Pcd.DatumType, Platform._GuidDict)(True)
+                    Pcd.InfDefaultValue = ValueExpressionEx(
+                        Pcd.InfDefaultValue, Pcd.DatumType, Platform._GuidDict)(True)
                 except BadExpression:
-                    EdkLogger.error("GenFds", GENFDS_ERROR, 'PCD [%s.%s] Value "%s"' %(Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.DefaultValue), File=self.InfFileName)
+                    EdkLogger.error("GenFds", GENFDS_ERROR, 'PCD [%s.%s] Value "%s"' % (
+                        Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.DefaultValue), File=self.InfFileName)
 
             # Check value, if value are equal, no need to patch
             if Pcd.DatumType == TAB_VOID:
@@ -325,12 +346,12 @@ class FfsInfStatement(FfsInfStatementClassObject):
             # Check the Pcd size and data type
             if Pcd.DatumType == TAB_VOID:
                 if int(MaxDatumSize) > int(Pcd.MaxDatumSize):
-                    EdkLogger.error("GenFds", GENFDS_ERROR, "The size of VOID* type PCD '%s.%s' exceeds its maximum size %d bytes." \
+                    EdkLogger.error("GenFds", GENFDS_ERROR, "The size of VOID* type PCD '%s.%s' exceeds its maximum size %d bytes."
                                     % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, int(MaxDatumSize) - int(Pcd.MaxDatumSize)))
             else:
                 if PcdValueInDscOrFdf > MAX_VAL_TYPE[Pcd.DatumType] \
-                    or PcdValueInImg > MAX_VAL_TYPE[Pcd.DatumType]:
-                    EdkLogger.error("GenFds", GENFDS_ERROR, "The size of %s type PCD '%s.%s' doesn't match its data type." \
+                        or PcdValueInImg > MAX_VAL_TYPE[Pcd.DatumType]:
+                    EdkLogger.error("GenFds", GENFDS_ERROR, "The size of %s type PCD '%s.%s' doesn't match its data type."
                                     % (Pcd.DatumType, Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
             self.PatchPcds.append((Pcd, DefaultValue))
 
@@ -343,8 +364,10 @@ class FfsInfStatement(FfsInfStatementClassObject):
         GenFdsGlobalVariable.VerboseLogger("BaseName : %s" % self.BaseName)
         GenFdsGlobalVariable.VerboseLogger("ModuleGuid : %s" % self.ModuleGuid)
         GenFdsGlobalVariable.VerboseLogger("ModuleType : %s" % self.ModuleType)
-        GenFdsGlobalVariable.VerboseLogger("VersionString : %s" % self.VersionString)
-        GenFdsGlobalVariable.VerboseLogger("InfFileName :%s" % self.InfFileName)
+        GenFdsGlobalVariable.VerboseLogger(
+            "VersionString : %s" % self.VersionString)
+        GenFdsGlobalVariable.VerboseLogger(
+            "InfFileName :%s" % self.InfFileName)
 
         #
         # Set OutputPath = ${WorkSpace}\Build\Fv\Ffs\${ModuleGuid}+ ${ModuleName}\
@@ -353,15 +376,16 @@ class FfsInfStatement(FfsInfStatementClassObject):
             Rule = self.__GetRule__()
             if GlobalData.gGuidPatternEnd.match(Rule.NameGuid):
                 self.ModuleGuid = Rule.NameGuid
-        self.OutputPath = os.path.join(GenFdsGlobalVariable.FfsDir, \
+        self.OutputPath = os.path.join(GenFdsGlobalVariable.FfsDir,
                                        self.ModuleGuid + self.BaseName)
-        if not os.path.exists(self.OutputPath) :
+        if not os.path.exists(self.OutputPath):
             os.makedirs(self.OutputPath)
 
         self.EfiOutputPath, self.EfiDebugPath = self.__GetEFIOutPutPath__()
-        GenFdsGlobalVariable.VerboseLogger( "ModuelEFIPath: " + self.EfiOutputPath)
+        GenFdsGlobalVariable.VerboseLogger(
+            "ModuelEFIPath: " + self.EfiOutputPath)
 
-    ## PatchEfiFile
+    # PatchEfiFile
     #
     #  Patch EFI file with patch PCD
     #
@@ -386,13 +410,13 @@ class FfsInfStatement(FfsInfStatementClassObject):
         # Generate path to patched output file
         #
         Basename = os.path.basename(EfiFile)
-        Output = os.path.normpath (os.path.join(self.OutputPath, Basename))
+        Output = os.path.normpath(os.path.join(self.OutputPath, Basename))
 
         #
         # If this file has already been patched, then return the path to the patched file
         #
         if self.PatchedBinFile == Output:
-          return Output
+            return Output
 
         #
         # If a different file from the same module has already been patched, then generate an error
@@ -401,7 +425,8 @@ class FfsInfStatement(FfsInfStatementClassObject):
             EdkLogger.error("GenFds", GENFDS_ERROR,
                             'Only one binary file can be patched:\n'
                             '  a binary file has been patched: %s\n'
-                            '  current file: %s' % (self.PatchedBinFile, EfiFile),
+                            '  current file: %s' % (
+                                self.PatchedBinFile, EfiFile),
                             File=self.InfFileName)
 
         #
@@ -413,9 +438,11 @@ class FfsInfStatement(FfsInfStatementClassObject):
         # Apply patches to patched output file
         #
         for Pcd, Value in self.PatchPcds:
-            RetVal, RetStr = PatchBinaryFile(Output, int(Pcd.Offset, 0), Pcd.DatumType, Value, Pcd.MaxDatumSize)
+            RetVal, RetStr = PatchBinaryFile(Output, int(
+                Pcd.Offset, 0), Pcd.DatumType, Value, Pcd.MaxDatumSize)
             if RetVal:
-                EdkLogger.error("GenFds", GENFDS_ERROR, RetStr, File=self.InfFileName)
+                EdkLogger.error("GenFds", GENFDS_ERROR,
+                                RetStr, File=self.InfFileName)
 
         #
         # Save the path of the patched output file
@@ -427,7 +454,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
         #
         return Output
 
-    ## GenFfs() method
+    # GenFfs() method
     #
     #   Generate FFS
     #
@@ -437,7 +464,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
     #   @param  FvParentAddr Parent Fv base address
     #   @retval string       Generated FFS file name
     #
-    def GenFfs(self, Dict = None, FvChildAddr = [], FvParentAddr=None, IsMakefile=False, FvName=None):
+    def GenFfs(self, Dict=None, FvChildAddr=[], FvParentAddr=None, IsMakefile=False, FvName=None):
         #
         # Parse Inf file get Module related information
         #
@@ -445,8 +472,8 @@ class FfsInfStatement(FfsInfStatementClassObject):
             Dict = {}
         self.__InfParse__(Dict, IsGenFfs=True)
         Arch = self.GetCurrentArch()
-        SrcFile = mws.join( GenFdsGlobalVariable.WorkSpaceDir, self.InfFileName);
-        DestFile = os.path.join( self.OutputPath, self.ModuleGuid + '.ffs')
+        SrcFile = mws.join(GenFdsGlobalVariable.WorkSpaceDir, self.InfFileName)
+        DestFile = os.path.join(self.OutputPath, self.ModuleGuid + '.ffs')
 
         SrcFileDir = "."
         SrcPath = os.path.dirname(SrcFile)
@@ -457,18 +484,18 @@ class FfsInfStatement(FfsInfStatementClassObject):
         DestFileBase, DestFileExt = os.path.splitext(DestFileName)
         self.MacroDict = {
             # source file
-            "${src}"      :   SrcFile,
-            "${s_path}"   :   SrcPath,
-            "${s_dir}"    :   SrcFileDir,
-            "${s_name}"   :   SrcFileName,
-            "${s_base}"   :   SrcFileBase,
-            "${s_ext}"    :   SrcFileExt,
+            "${src}":   SrcFile,
+            "${s_path}":   SrcPath,
+            "${s_dir}":   SrcFileDir,
+            "${s_name}":   SrcFileName,
+            "${s_base}":   SrcFileBase,
+            "${s_ext}":   SrcFileExt,
             # destination file
-            "${dst}"      :   DestFile,
-            "${d_path}"   :   DestPath,
-            "${d_name}"   :   DestFileName,
-            "${d_base}"   :   DestFileBase,
-            "${d_ext}"    :   DestFileExt
+            "${dst}":   DestFile,
+            "${d_path}":   DestPath,
+            "${d_name}":   DestFileName,
+            "${d_base}":   DestFileBase,
+            "${d_ext}":   DestFileExt
         }
         #
         # Allow binary type module not specify override rule in FDF file.
@@ -483,7 +510,8 @@ class FfsInfStatement(FfsInfStatementClassObject):
         # Get the rule of how to generate Ffs file
         #
         Rule = self.__GetRule__()
-        GenFdsGlobalVariable.VerboseLogger( "Packing binaries from inf file : %s" %self.InfFileName)
+        GenFdsGlobalVariable.VerboseLogger(
+            "Packing binaries from inf file : %s" % self.InfFileName)
         #
         # Convert Fv File Type for PI1.1 SMM driver.
         #
@@ -495,7 +523,8 @@ class FfsInfStatement(FfsInfStatementClassObject):
         #
         if self.ModuleType == SUP_MODULE_DXE_SMM_DRIVER and int(self.PiSpecVersion, 16) < 0x0001000A:
             if Rule.FvFileType == 'SMM' or Rule.FvFileType == SUP_MODULE_SMM_CORE:
-                EdkLogger.error("GenFds", FORMAT_NOT_SUPPORTED, "Framework SMM module doesn't support SMM or SMM_CORE FV file type", File=self.InfFileName)
+                EdkLogger.error("GenFds", FORMAT_NOT_SUPPORTED,
+                                "Framework SMM module doesn't support SMM or SMM_CORE FV file type", File=self.InfFileName)
         #
         # For the rule only has simpleFile
         #
@@ -503,23 +532,29 @@ class FfsInfStatement(FfsInfStatementClassObject):
         if self.IsBinaryModule:
             IsMakefile = False
         if IsMakefile:
-            PathClassObj = PathClass(self.InfFileName, GenFdsGlobalVariable.WorkSpaceDir)
+            PathClassObj = PathClass(
+                self.InfFileName, GenFdsGlobalVariable.WorkSpaceDir)
             if self.OverrideGuid:
-                PathClassObj = ProcessDuplicatedInf(PathClassObj, self.OverrideGuid, GenFdsGlobalVariable.WorkSpaceDir)
+                PathClassObj = ProcessDuplicatedInf(
+                    PathClassObj, self.OverrideGuid, GenFdsGlobalVariable.WorkSpaceDir)
             MakefilePath = PathClassObj.Path, Arch
-        if isinstance (Rule, RuleSimpleFile.RuleSimpleFile):
-            SectionOutputList = self.__GenSimpleFileSection__(Rule, IsMakefile=IsMakefile)
-            FfsOutput = self.__GenSimpleFileFfs__(Rule, SectionOutputList, MakefilePath=MakefilePath)
+        if isinstance(Rule, RuleSimpleFile.RuleSimpleFile):
+            SectionOutputList = self.__GenSimpleFileSection__(
+                Rule, IsMakefile=IsMakefile)
+            FfsOutput = self.__GenSimpleFileFfs__(
+                Rule, SectionOutputList, MakefilePath=MakefilePath)
             return FfsOutput
         #
         # For Rule has ComplexFile
         #
         elif isinstance(Rule, RuleComplexFile.RuleComplexFile):
-            InputSectList, InputSectAlignments = self.__GenComplexFileSection__(Rule, FvChildAddr, FvParentAddr, IsMakefile=IsMakefile)
-            FfsOutput = self.__GenComplexFileFfs__(Rule, InputSectList, InputSectAlignments, MakefilePath=MakefilePath)
+            InputSectList, InputSectAlignments = self.__GenComplexFileSection__(
+                Rule, FvChildAddr, FvParentAddr, IsMakefile=IsMakefile)
+            FfsOutput = self.__GenComplexFileFfs__(
+                Rule, InputSectList, InputSectAlignments, MakefilePath=MakefilePath)
             return FfsOutput
 
-    ## __ExtendMacro__() method
+    # __ExtendMacro__() method
     #
     #   Replace macro with its value
     #
@@ -527,26 +562,26 @@ class FfsInfStatement(FfsInfStatementClassObject):
     #   @param  String      The string to be replaced
     #   @retval string      Macro replaced string
     #
-    def __ExtendMacro__ (self, String):
+    def __ExtendMacro__(self, String):
         MacroDict = {
-            '$(INF_OUTPUT)'  : self.EfiOutputPath,
-            '$(MODULE_NAME)' : self.BaseName,
+            '$(INF_OUTPUT)': self.EfiOutputPath,
+            '$(MODULE_NAME)': self.BaseName,
             '$(BUILD_NUMBER)': self.BuildNum,
-            '$(INF_VERSION)' : self.VersionString,
-            '$(NAMED_GUID)'  : self.ModuleGuid
+            '$(INF_VERSION)': self.VersionString,
+            '$(NAMED_GUID)': self.ModuleGuid
         }
         String = GenFdsGlobalVariable.MacroExtend(String, MacroDict)
         String = GenFdsGlobalVariable.MacroExtend(String, self.MacroDict)
         return String
 
-    ## __GetRule__() method
+    # __GetRule__() method
     #
     #   Get correct rule for generating FFS for this INF
     #
     #   @param  self        The object pointer
     #   @retval Rule        Rule object
     #
-    def __GetRule__ (self) :
+    def __GetRule__(self):
         CurrentArchList = []
         if self.CurrentArch is None:
             CurrentArchList = ['common']
@@ -554,44 +589,48 @@ class FfsInfStatement(FfsInfStatementClassObject):
             CurrentArchList.append(self.CurrentArch)
 
         for CurrentArch in CurrentArchList:
-            RuleName = 'RULE'              + \
-                       '.'                 + \
+            RuleName = 'RULE' + \
+                       '.' + \
                        CurrentArch.upper() + \
-                       '.'                 + \
+                       '.' + \
                        self.ModuleType.upper()
             if self.Rule is not None:
                 RuleName = RuleName + \
-                           '.'      + \
-                           self.Rule.upper()
+                    '.' + \
+                    self.Rule.upper()
 
-            Rule = GenFdsGlobalVariable.FdfParser.Profile.RuleDict.get(RuleName)
+            Rule = GenFdsGlobalVariable.FdfParser.Profile.RuleDict.get(
+                RuleName)
             if Rule is not None:
-                GenFdsGlobalVariable.VerboseLogger ("Want To Find Rule Name is : " + RuleName)
+                GenFdsGlobalVariable.VerboseLogger(
+                    "Want To Find Rule Name is : " + RuleName)
                 return Rule
 
-        RuleName = 'RULE'      + \
-                   '.'         + \
-                   TAB_COMMON    + \
-                   '.'         + \
+        RuleName = 'RULE' + \
+                   '.' + \
+                   TAB_COMMON + \
+                   '.' + \
                    self.ModuleType.upper()
 
         if self.Rule is not None:
             RuleName = RuleName + \
-                       '.'      + \
-                       self.Rule.upper()
+                '.' + \
+                self.Rule.upper()
 
-        GenFdsGlobalVariable.VerboseLogger ('Trying to apply common rule %s for INF %s' % (RuleName, self.InfFileName))
+        GenFdsGlobalVariable.VerboseLogger(
+            'Trying to apply common rule %s for INF %s' % (RuleName, self.InfFileName))
 
         Rule = GenFdsGlobalVariable.FdfParser.Profile.RuleDict.get(RuleName)
         if Rule is not None:
-            GenFdsGlobalVariable.VerboseLogger ("Want To Find Rule Name is : " + RuleName)
+            GenFdsGlobalVariable.VerboseLogger(
+                "Want To Find Rule Name is : " + RuleName)
             return Rule
 
-        if Rule is None :
-            EdkLogger.error("GenFds", GENFDS_ERROR, 'Don\'t Find common rule %s for INF %s' \
+        if Rule is None:
+            EdkLogger.error("GenFds", GENFDS_ERROR, 'Don\'t Find common rule %s for INF %s'
                             % (RuleName, self.InfFileName))
 
-    ## __GetPlatformArchList__() method
+    # __GetPlatformArchList__() method
     #
     #   Get Arch list this INF built under
     #
@@ -600,13 +639,15 @@ class FfsInfStatement(FfsInfStatementClassObject):
     #
     def __GetPlatformArchList__(self):
 
-        InfFileKey = os.path.normpath(mws.join(GenFdsGlobalVariable.WorkSpaceDir, self.InfFileName))
+        InfFileKey = os.path.normpath(
+            mws.join(GenFdsGlobalVariable.WorkSpaceDir, self.InfFileName))
         DscArchList = []
-        for Arch in GenFdsGlobalVariable.ArchList :
-            PlatformDataBase = GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
-            if  PlatformDataBase is not None:
+        for Arch in GenFdsGlobalVariable.ArchList:
+            PlatformDataBase = GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform,
+                                                                          Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
+            if PlatformDataBase is not None:
                 if InfFileKey in PlatformDataBase.Modules:
-                    DscArchList.append (Arch)
+                    DscArchList.append(Arch)
                 else:
                     #
                     # BaseTools support build same module more than once, the module path with FILE_GUID overridden has
@@ -615,19 +656,19 @@ class FfsInfStatement(FfsInfStatementClassObject):
                     #
                     for key in PlatformDataBase.Modules:
                         if InfFileKey == str((PlatformDataBase.Modules[key]).MetaFile.Path):
-                            DscArchList.append (Arch)
+                            DscArchList.append(Arch)
                             break
 
         return DscArchList
 
-    ## GetCurrentArch() method
+    # GetCurrentArch() method
     #
     #   Get Arch list of the module from this INF is to be placed into flash
     #
     #   @param  self        The object pointer
     #   @retval list        Arch list
     #
-    def GetCurrentArch(self) :
+    def GetCurrentArch(self):
 
         TargetArchList = GenFdsGlobalVariable.ArchList
 
@@ -635,8 +676,9 @@ class FfsInfStatement(FfsInfStatementClassObject):
 
         CurArchList = TargetArchList
         if PlatformArchList != []:
-            CurArchList = list(set (TargetArchList) & set (PlatformArchList))
-        GenFdsGlobalVariable.VerboseLogger ("Valid target architecture(s) is : " + " ".join(CurArchList))
+            CurArchList = list(set(TargetArchList) & set(PlatformArchList))
+        GenFdsGlobalVariable.VerboseLogger(
+            "Valid target architecture(s) is : " + " ".join(CurArchList))
 
         ArchList = []
         if self.KeyStringList != []:
@@ -654,12 +696,13 @@ class FfsInfStatement(FfsInfStatementClassObject):
         if self.UseArch is not None:
             UseArchList = []
             UseArchList.append(self.UseArch)
-            ArchList = list(set (UseArchList) & set (ArchList))
+            ArchList = list(set(UseArchList) & set(ArchList))
 
         self.InfFileName = NormPath(self.InfFileName)
         if len(PlatformArchList) == 0:
             self.InDsc = False
-            PathClassObj = PathClass(self.InfFileName, GenFdsGlobalVariable.WorkSpaceDir)
+            PathClassObj = PathClass(
+                self.InfFileName, GenFdsGlobalVariable.WorkSpaceDir)
             ErrorCode, ErrorInfo = PathClassObj.Validate(".inf")
             if ErrorCode != 0:
                 EdkLogger.error("GenFds", ErrorCode, ExtraData=ErrorInfo)
@@ -668,14 +711,16 @@ class FfsInfStatement(FfsInfStatementClassObject):
             return Arch
         elif len(ArchList) > 1:
             if len(PlatformArchList) == 0:
-                EdkLogger.error("GenFds", GENFDS_ERROR, "GenFds command line option has multiple ARCHs %s. Not able to determine which ARCH is valid for Module %s !" % (str(ArchList), self.InfFileName))
+                EdkLogger.error("GenFds", GENFDS_ERROR, "GenFds command line option has multiple ARCHs %s. Not able to determine which ARCH is valid for Module %s !" % (
+                    str(ArchList), self.InfFileName))
             else:
-                EdkLogger.error("GenFds", GENFDS_ERROR, "Module built under multiple ARCHs %s. Not able to determine which output to put into flash for Module %s !" % (str(ArchList), self.InfFileName))
+                EdkLogger.error("GenFds", GENFDS_ERROR, "Module built under multiple ARCHs %s. Not able to determine which output to put into flash for Module %s !" % (
+                    str(ArchList), self.InfFileName))
         else:
-            EdkLogger.error("GenFds", GENFDS_ERROR, "Module %s appears under ARCH %s in platform %s, but current deduced ARCH is %s, so NO build output could be put into flash." \
-                            % (self.InfFileName, str(PlatformArchList), GenFdsGlobalVariable.ActivePlatform, str(set (UseArchList) & set (TargetArchList))))
+            EdkLogger.error("GenFds", GENFDS_ERROR, "Module %s appears under ARCH %s in platform %s, but current deduced ARCH is %s, so NO build output could be put into flash."
+                            % (self.InfFileName, str(PlatformArchList), GenFdsGlobalVariable.ActivePlatform, str(set(UseArchList) & set(TargetArchList))))
 
-    ## __GetEFIOutPutPath__() method
+    # __GetEFIOutPutPath__() method
     #
     #   Get the output path for generated files
     #
@@ -702,16 +747,16 @@ class FfsInfStatement(FfsInfStatementClassObject):
                                   'OUTPUT'
                                   )
         DebugPath = os.path.join(GenFdsGlobalVariable.OutputDirDict[Arch],
-                                  Arch,
-                                  ModulePath,
-                                  FileName,
-                                  'DEBUG'
-                                  )
+                                 Arch,
+                                 ModulePath,
+                                 FileName,
+                                 'DEBUG'
+                                 )
         OutputPath = os.path.abspath(OutputPath)
         DebugPath = os.path.abspath(DebugPath)
         return OutputPath, DebugPath
 
-    ## __GenSimpleFileSection__() method
+    # __GenSimpleFileSection__() method
     #
     #   Generate section by specified file name or a list of files with file extension
     #
@@ -719,7 +764,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
     #   @param  Rule        The rule object used to generate section
     #   @retval string      File name of the generated section file
     #
-    def __GenSimpleFileSection__(self, Rule, IsMakefile = False):
+    def __GenSimpleFileSection__(self, Rule, IsMakefile=False):
         #
         # Prepare the parameter of GenSection
         #
@@ -731,9 +776,11 @@ class FfsInfStatement(FfsInfStatementClassObject):
             if os.path.isabs(GenSecInputFile):
                 GenSecInputFile = os.path.normpath(GenSecInputFile)
             else:
-                GenSecInputFile = os.path.normpath(os.path.join(self.EfiOutputPath, GenSecInputFile))
+                GenSecInputFile = os.path.normpath(
+                    os.path.join(self.EfiOutputPath, GenSecInputFile))
         else:
-            FileList, IsSect = Section.Section.GetFileList(self, '', Rule.FileExtension)
+            FileList, IsSect = Section.Section.GetFileList(
+                self, '', Rule.FileExtension)
 
         Index = 1
         SectionType = Rule.SectionType
@@ -748,7 +795,8 @@ class FfsInfStatement(FfsInfStatementClassObject):
         #
         if self.ModuleType == SUP_MODULE_DXE_SMM_DRIVER and int(self.PiSpecVersion, 16) < 0x0001000A:
             if SectionType == BINARY_FILE_TYPE_SMM_DEPEX:
-                EdkLogger.error("GenFds", FORMAT_NOT_SUPPORTED, "Framework SMM module doesn't support SMM_DEPEX section type", File=self.InfFileName)
+                EdkLogger.error("GenFds", FORMAT_NOT_SUPPORTED,
+                                "Framework SMM module doesn't support SMM_DEPEX section type", File=self.InfFileName)
         NoStrip = True
         if self.ModuleType in (SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM):
             if self.KeepReloc is not None:
@@ -758,98 +806,112 @@ class FfsInfStatement(FfsInfStatementClassObject):
             elif self.ShadowFromInfFile is not None:
                 NoStrip = self.ShadowFromInfFile
 
-        if FileList != [] :
+        if FileList != []:
             for File in FileList:
 
-                SecNum = '%d' %Index
-                GenSecOutputFile= self.__ExtendMacro__(Rule.NameGuid) + \
-                              SectionSuffix[SectionType] + SUP_MODULE_SEC + SecNum
+                SecNum = '%d' % Index
+                GenSecOutputFile = self.__ExtendMacro__(Rule.NameGuid) + \
+                    SectionSuffix[SectionType] + SUP_MODULE_SEC + SecNum
                 Index = Index + 1
                 OutputFile = os.path.join(self.OutputPath, GenSecOutputFile)
-                File = GenFdsGlobalVariable.MacroExtend(File, Dict, self.CurrentArch)
+                File = GenFdsGlobalVariable.MacroExtend(
+                    File, Dict, self.CurrentArch)
 
-                #Get PE Section alignment when align is set to AUTO
+                # Get PE Section alignment when align is set to AUTO
                 if self.Alignment == 'Auto' and (SectionType == BINARY_FILE_TYPE_PE32 or SectionType == BINARY_FILE_TYPE_TE):
-                    ImageObj = PeImageClass (File)
+                    ImageObj = PeImageClass(File)
                     if ImageObj.SectionAlignment < 0x400:
-                        self.Alignment = str (ImageObj.SectionAlignment)
+                        self.Alignment = str(ImageObj.SectionAlignment)
                     elif ImageObj.SectionAlignment < 0x100000:
-                        self.Alignment = str (ImageObj.SectionAlignment // 0x400) + 'K'
+                        self.Alignment = str(
+                            ImageObj.SectionAlignment // 0x400) + 'K'
                     else:
-                        self.Alignment = str (ImageObj.SectionAlignment // 0x100000) + 'M'
+                        self.Alignment = str(
+                            ImageObj.SectionAlignment // 0x100000) + 'M'
 
                 if not NoStrip:
-                    FileBeforeStrip = os.path.join(self.OutputPath, ModuleName + '.reloc')
+                    FileBeforeStrip = os.path.join(
+                        self.OutputPath, ModuleName + '.reloc')
                     if not os.path.exists(FileBeforeStrip) or \
-                           (os.path.getmtime(File) > os.path.getmtime(FileBeforeStrip)):
+                            (os.path.getmtime(File) > os.path.getmtime(FileBeforeStrip)):
                         CopyLongFilePath(File, FileBeforeStrip)
-                    StrippedFile = os.path.join(self.OutputPath, ModuleName + '.stipped')
+                    StrippedFile = os.path.join(
+                        self.OutputPath, ModuleName + '.stipped')
                     GenFdsGlobalVariable.GenerateFirmwareImage(
-                            StrippedFile,
-                            [File],
-                            Strip=True,
-                            IsMakefile=IsMakefile
-                        )
+                        StrippedFile,
+                        [File],
+                        Strip=True,
+                        IsMakefile=IsMakefile
+                    )
                     File = StrippedFile
 
                 if SectionType == BINARY_FILE_TYPE_TE:
-                    TeFile = os.path.join( self.OutputPath, self.ModuleGuid + 'Te.raw')
+                    TeFile = os.path.join(
+                        self.OutputPath, self.ModuleGuid + 'Te.raw')
                     GenFdsGlobalVariable.GenerateFirmwareImage(
-                            TeFile,
-                            [File],
-                            Type='te',
-                            IsMakefile=IsMakefile
-                        )
+                        TeFile,
+                        [File],
+                        Type='te',
+                        IsMakefile=IsMakefile
+                    )
                     File = TeFile
-                GenFdsGlobalVariable.GenerateSection(OutputFile, [File], Section.Section.SectionType[SectionType], IsMakefile=IsMakefile)
+                GenFdsGlobalVariable.GenerateSection(
+                    OutputFile, [File], Section.Section.SectionType[SectionType], IsMakefile=IsMakefile)
                 OutputFileList.append(OutputFile)
         else:
-            SecNum = '%d' %Index
-            GenSecOutputFile= self.__ExtendMacro__(Rule.NameGuid) + \
-                              SectionSuffix[SectionType] + SUP_MODULE_SEC + SecNum
+            SecNum = '%d' % Index
+            GenSecOutputFile = self.__ExtendMacro__(Rule.NameGuid) + \
+                SectionSuffix[SectionType] + SUP_MODULE_SEC + SecNum
             OutputFile = os.path.join(self.OutputPath, GenSecOutputFile)
-            GenSecInputFile = GenFdsGlobalVariable.MacroExtend(GenSecInputFile, Dict, self.CurrentArch)
+            GenSecInputFile = GenFdsGlobalVariable.MacroExtend(
+                GenSecInputFile, Dict, self.CurrentArch)
 
-            #Get PE Section alignment when align is set to AUTO
+            # Get PE Section alignment when align is set to AUTO
             if self.Alignment == 'Auto' and (SectionType == BINARY_FILE_TYPE_PE32 or SectionType == BINARY_FILE_TYPE_TE):
-                ImageObj = PeImageClass (GenSecInputFile)
+                ImageObj = PeImageClass(GenSecInputFile)
                 if ImageObj.SectionAlignment < 0x400:
-                    self.Alignment = str (ImageObj.SectionAlignment)
+                    self.Alignment = str(ImageObj.SectionAlignment)
                 elif ImageObj.SectionAlignment < 0x100000:
-                    self.Alignment = str (ImageObj.SectionAlignment // 0x400) + 'K'
+                    self.Alignment = str(
+                        ImageObj.SectionAlignment // 0x400) + 'K'
                 else:
-                    self.Alignment = str (ImageObj.SectionAlignment // 0x100000) + 'M'
+                    self.Alignment = str(
+                        ImageObj.SectionAlignment // 0x100000) + 'M'
 
             if not NoStrip:
-                FileBeforeStrip = os.path.join(self.OutputPath, ModuleName + '.reloc')
+                FileBeforeStrip = os.path.join(
+                    self.OutputPath, ModuleName + '.reloc')
                 if not os.path.exists(FileBeforeStrip) or \
-                       (os.path.getmtime(GenSecInputFile) > os.path.getmtime(FileBeforeStrip)):
+                        (os.path.getmtime(GenSecInputFile) > os.path.getmtime(FileBeforeStrip)):
                     CopyLongFilePath(GenSecInputFile, FileBeforeStrip)
 
-                StrippedFile = os.path.join(self.OutputPath, ModuleName + '.stipped')
+                StrippedFile = os.path.join(
+                    self.OutputPath, ModuleName + '.stipped')
                 GenFdsGlobalVariable.GenerateFirmwareImage(
-                        StrippedFile,
-                        [GenSecInputFile],
-                        Strip=True,
-                        IsMakefile=IsMakefile
-                    )
+                    StrippedFile,
+                    [GenSecInputFile],
+                    Strip=True,
+                    IsMakefile=IsMakefile
+                )
                 GenSecInputFile = StrippedFile
 
             if SectionType == BINARY_FILE_TYPE_TE:
-                TeFile = os.path.join( self.OutputPath, self.ModuleGuid + 'Te.raw')
+                TeFile = os.path.join(
+                    self.OutputPath, self.ModuleGuid + 'Te.raw')
                 GenFdsGlobalVariable.GenerateFirmwareImage(
-                        TeFile,
-                        [GenSecInputFile],
-                        Type='te',
-                        IsMakefile=IsMakefile
-                    )
+                    TeFile,
+                    [GenSecInputFile],
+                    Type='te',
+                    IsMakefile=IsMakefile
+                )
                 GenSecInputFile = TeFile
-            GenFdsGlobalVariable.GenerateSection(OutputFile, [GenSecInputFile], Section.Section.SectionType[SectionType], IsMakefile=IsMakefile)
+            GenFdsGlobalVariable.GenerateSection(OutputFile, [
+                                                 GenSecInputFile], Section.Section.SectionType[SectionType], IsMakefile=IsMakefile)
             OutputFileList.append(OutputFile)
 
         return OutputFileList
 
-    ## __GenSimpleFileFfs__() method
+    # __GenSimpleFileFfs__() method
     #
     #   Generate FFS
     #
@@ -858,11 +920,11 @@ class FfsInfStatement(FfsInfStatementClassObject):
     #   @param  InputFileList        The output file list from GenSection
     #   @retval string      Generated FFS file name
     #
-    def __GenSimpleFileFfs__(self, Rule, InputFileList, MakefilePath = None):
-        FfsOutput = self.OutputPath                     + \
-                    os.sep                              + \
-                    self.__ExtendMacro__(Rule.NameGuid) + \
-                    '.ffs'
+    def __GenSimpleFileFfs__(self, Rule, InputFileList, MakefilePath=None):
+        FfsOutput = self.OutputPath + \
+            os.sep + \
+            self.__ExtendMacro__(Rule.NameGuid) + \
+            '.ffs'
 
         GenFdsGlobalVariable.VerboseLogger(self.__ExtendMacro__(Rule.NameGuid))
         InputSection = []
@@ -874,14 +936,14 @@ class FfsInfStatement(FfsInfStatementClassObject):
         if Rule.NameGuid is not None and Rule.NameGuid.startswith('PCD('):
             PcdValue = GenFdsGlobalVariable.GetPcdValue(Rule.NameGuid)
             if len(PcdValue) == 0:
-                EdkLogger.error("GenFds", GENFDS_ERROR, '%s NOT defined.' \
-                            % (Rule.NameGuid))
+                EdkLogger.error("GenFds", GENFDS_ERROR, '%s NOT defined.'
+                                % (Rule.NameGuid))
             if PcdValue.startswith('{'):
                 PcdValue = GuidStructureByteArrayToGuidString(PcdValue)
             RegistryGuidStr = PcdValue
             if len(RegistryGuidStr) == 0:
-                EdkLogger.error("GenFds", GENFDS_ERROR, 'GUID value for %s in wrong format.' \
-                            % (Rule.NameGuid))
+                EdkLogger.error("GenFds", GENFDS_ERROR, 'GUID value for %s in wrong format.'
+                                % (Rule.NameGuid))
             self.ModuleGuid = RegistryGuidStr
 
             GenFdsGlobalVariable.GenerateFfs(FfsOutput, InputSection,
@@ -893,7 +955,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
                                              )
         return FfsOutput
 
-    ## __GenComplexFileSection__() method
+    # __GenComplexFileSection__() method
     #
     #   Generate section by sections in Rule
     #
@@ -903,7 +965,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
     #   @param  FvParentAddr Parent Fv base address
     #   @retval string       File name of the generated section file
     #
-    def __GenComplexFileSection__(self, Rule, FvChildAddr, FvParentAddr, IsMakefile = False):
+    def __GenComplexFileSection__(self, Rule, FvChildAddr, FvParentAddr, IsMakefile=False):
         if self.ModuleType in (SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM, SUP_MODULE_MM_CORE_STANDALONE):
             if Rule.KeepReloc is not None:
                 self.KeepRelocFromRule = Rule.KeepReloc
@@ -913,33 +975,39 @@ class FfsInfStatement(FfsInfStatementClassObject):
         HasGeneratedFlag = False
         if self.PcdIsDriver == 'PEI_PCD_DRIVER':
             if self.IsBinaryModule:
-                PcdExDbFileName = os.path.join(GenFdsGlobalVariable.FvDir, "PEIPcdDataBase.raw")
+                PcdExDbFileName = os.path.join(
+                    GenFdsGlobalVariable.FvDir, "PEIPcdDataBase.raw")
             else:
-                PcdExDbFileName = os.path.join(self.EfiOutputPath, "PEIPcdDataBase.raw")
-            PcdExDbSecName = os.path.join(self.OutputPath, "PEIPcdDataBaseSec.raw")
+                PcdExDbFileName = os.path.join(
+                    self.EfiOutputPath, "PEIPcdDataBase.raw")
+            PcdExDbSecName = os.path.join(
+                self.OutputPath, "PEIPcdDataBaseSec.raw")
             GenFdsGlobalVariable.GenerateSection(PcdExDbSecName,
                                                  [PcdExDbFileName],
                                                  "EFI_SECTION_RAW",
-                                                 IsMakefile = IsMakefile
+                                                 IsMakefile=IsMakefile
                                                  )
             SectFiles.append(PcdExDbSecName)
             SectAlignments.append(None)
         elif self.PcdIsDriver == 'DXE_PCD_DRIVER':
             if self.IsBinaryModule:
-                PcdExDbFileName = os.path.join(GenFdsGlobalVariable.FvDir, "DXEPcdDataBase.raw")
+                PcdExDbFileName = os.path.join(
+                    GenFdsGlobalVariable.FvDir, "DXEPcdDataBase.raw")
             else:
-                PcdExDbFileName = os.path.join(self.EfiOutputPath, "DXEPcdDataBase.raw")
-            PcdExDbSecName = os.path.join(self.OutputPath, "DXEPcdDataBaseSec.raw")
+                PcdExDbFileName = os.path.join(
+                    self.EfiOutputPath, "DXEPcdDataBase.raw")
+            PcdExDbSecName = os.path.join(
+                self.OutputPath, "DXEPcdDataBaseSec.raw")
             GenFdsGlobalVariable.GenerateSection(PcdExDbSecName,
-                                                [PcdExDbFileName],
-                                                "EFI_SECTION_RAW",
-                                                IsMakefile = IsMakefile
-                                                )
+                                                 [PcdExDbFileName],
+                                                 "EFI_SECTION_RAW",
+                                                 IsMakefile=IsMakefile
+                                                 )
             SectFiles.append(PcdExDbSecName)
             SectAlignments.append(None)
         for Sect in Rule.SectionList:
-            SecIndex = '%d' %Index
-            SectList  = []
+            SecIndex = '%d' % Index
+            SectList = []
             #
             # Convert Fv Section Type for PI1.1 SMM driver.
             #
@@ -951,7 +1019,8 @@ class FfsInfStatement(FfsInfStatementClassObject):
             #
             if self.ModuleType == SUP_MODULE_DXE_SMM_DRIVER and int(self.PiSpecVersion, 16) < 0x0001000A:
                 if Sect.SectionType == BINARY_FILE_TYPE_SMM_DEPEX:
-                    EdkLogger.error("GenFds", FORMAT_NOT_SUPPORTED, "Framework SMM module doesn't support SMM_DEPEX section type", File=self.InfFileName)
+                    EdkLogger.error("GenFds", FORMAT_NOT_SUPPORTED,
+                                    "Framework SMM module doesn't support SMM_DEPEX section type", File=self.InfFileName)
             #
             # process the inside FvImage from FvSection or GuidSection
             #
@@ -964,75 +1033,89 @@ class FfsInfStatement(FfsInfStatementClassObject):
                 Sect.FvParentAddr = FvParentAddr
 
             if Rule.KeyStringList != []:
-                SectList, Align = Sect.GenSection(self.OutputPath, self.ModuleGuid, SecIndex, Rule.KeyStringList, self, IsMakefile = IsMakefile)
-            else :
-                SectList, Align = Sect.GenSection(self.OutputPath, self.ModuleGuid, SecIndex, self.KeyStringList, self, IsMakefile = IsMakefile)
+                SectList, Align = Sect.GenSection(
+                    self.OutputPath, self.ModuleGuid, SecIndex, Rule.KeyStringList, self, IsMakefile=IsMakefile)
+            else:
+                SectList, Align = Sect.GenSection(
+                    self.OutputPath, self.ModuleGuid, SecIndex, self.KeyStringList, self, IsMakefile=IsMakefile)
 
             if not HasGeneratedFlag:
                 UniVfrOffsetFileSection = ""
-                ModuleFileName = mws.join(GenFdsGlobalVariable.WorkSpaceDir, self.InfFileName)
-                InfData = GenFdsGlobalVariable.WorkSpace.BuildObject[PathClass(ModuleFileName), self.CurrentArch]
+                ModuleFileName = mws.join(
+                    GenFdsGlobalVariable.WorkSpaceDir, self.InfFileName)
+                InfData = GenFdsGlobalVariable.WorkSpace.BuildObject[PathClass(
+                    ModuleFileName), self.CurrentArch]
                 #
                 # Search the source list in InfData to find if there are .vfr file exist.
                 #
                 VfrUniBaseName = {}
                 VfrUniOffsetList = []
                 for SourceFile in InfData.Sources:
-                    if SourceFile.Type.upper() == ".VFR" :
+                    if SourceFile.Type.upper() == ".VFR":
                         #
                         # search the .map file to find the offset of vfr binary in the PE32+/TE file.
                         #
-                        VfrUniBaseName[SourceFile.BaseName] = (SourceFile.BaseName + "Bin")
-                    if SourceFile.Type.upper() == ".UNI" :
+                        VfrUniBaseName[SourceFile.BaseName] = (
+                            SourceFile.BaseName + "Bin")
+                    if SourceFile.Type.upper() == ".UNI":
                         #
                         # search the .map file to find the offset of Uni strings binary in the PE32+/TE file.
                         #
-                        VfrUniBaseName["UniOffsetName"] = (self.BaseName + "Strings")
-
+                        VfrUniBaseName["UniOffsetName"] = (
+                            self.BaseName + "Strings")
 
                 if len(VfrUniBaseName) > 0:
                     if IsMakefile:
                         if InfData.BuildType != 'UEFI_HII':
-                            UniVfrOffsetFileName = os.path.join(self.OutputPath, self.BaseName + '.offset')
-                            UniVfrOffsetFileSection = os.path.join(self.OutputPath, self.BaseName + 'Offset' + '.raw')
+                            UniVfrOffsetFileName = os.path.join(
+                                self.OutputPath, self.BaseName + '.offset')
+                            UniVfrOffsetFileSection = os.path.join(
+                                self.OutputPath, self.BaseName + 'Offset' + '.raw')
                             UniVfrOffsetFileNameList = []
-                            UniVfrOffsetFileNameList.append(UniVfrOffsetFileName)
-                            TrimCmd = "Trim --Vfr-Uni-Offset -o %s --ModuleName=%s --DebugDir=%s " % (UniVfrOffsetFileName, self.BaseName, self.EfiDebugPath)
+                            UniVfrOffsetFileNameList.append(
+                                UniVfrOffsetFileName)
+                            TrimCmd = "Trim --Vfr-Uni-Offset -o %s --ModuleName=%s --DebugDir=%s " % (
+                                UniVfrOffsetFileName, self.BaseName, self.EfiDebugPath)
                             GenFdsGlobalVariable.SecCmdList.append(TrimCmd)
                             GenFdsGlobalVariable.GenerateSection(UniVfrOffsetFileSection,
-                                                                [UniVfrOffsetFileName],
-                                                                "EFI_SECTION_RAW",
-                                                                IsMakefile = True
-                                                                )
+                                                                 [UniVfrOffsetFileName],
+                                                                 "EFI_SECTION_RAW",
+                                                                 IsMakefile=True
+                                                                 )
                     else:
-                        VfrUniOffsetList = self.__GetBuildOutputMapFileVfrUniInfo(VfrUniBaseName)
+                        VfrUniOffsetList = self.__GetBuildOutputMapFileVfrUniInfo(
+                            VfrUniBaseName)
                         #
                         # Generate the Raw data of raw section
                         #
                         if VfrUniOffsetList:
-                            UniVfrOffsetFileName = os.path.join(self.OutputPath, self.BaseName + '.offset')
-                            UniVfrOffsetFileSection = os.path.join(self.OutputPath, self.BaseName + 'Offset' + '.raw')
-                            FfsInfStatement.__GenUniVfrOffsetFile (VfrUniOffsetList, UniVfrOffsetFileName)
+                            UniVfrOffsetFileName = os.path.join(
+                                self.OutputPath, self.BaseName + '.offset')
+                            UniVfrOffsetFileSection = os.path.join(
+                                self.OutputPath, self.BaseName + 'Offset' + '.raw')
+                            FfsInfStatement.__GenUniVfrOffsetFile(
+                                VfrUniOffsetList, UniVfrOffsetFileName)
                             UniVfrOffsetFileNameList = []
-                            UniVfrOffsetFileNameList.append(UniVfrOffsetFileName)
+                            UniVfrOffsetFileNameList.append(
+                                UniVfrOffsetFileName)
                             """Call GenSection"""
 
                             GenFdsGlobalVariable.GenerateSection(UniVfrOffsetFileSection,
                                                                  UniVfrOffsetFileNameList,
                                                                  "EFI_SECTION_RAW"
                                                                  )
-                            #os.remove(UniVfrOffsetFileName)
+                            # os.remove(UniVfrOffsetFileName)
                     if UniVfrOffsetFileSection:
                         SectList.append(UniVfrOffsetFileSection)
                         HasGeneratedFlag = True
 
-            for SecName in  SectList :
+            for SecName in SectList:
                 SectFiles.append(SecName)
                 SectAlignments.append(Align)
             Index = Index + 1
         return SectFiles, SectAlignments
 
-    ## __GenComplexFileFfs__() method
+    # __GenComplexFileFfs__() method
     #
     #   Generate FFS
     #
@@ -1041,32 +1124,32 @@ class FfsInfStatement(FfsInfStatementClassObject):
     #   @param  InputFileList        The output file list from GenSection
     #   @retval string      Generated FFS file name
     #
-    def __GenComplexFileFfs__(self, Rule, InputFile, Alignments, MakefilePath = None):
+    def __GenComplexFileFfs__(self, Rule, InputFile, Alignments, MakefilePath=None):
 
         if Rule.NameGuid is not None and Rule.NameGuid.startswith('PCD('):
             PcdValue = GenFdsGlobalVariable.GetPcdValue(Rule.NameGuid)
             if len(PcdValue) == 0:
-                EdkLogger.error("GenFds", GENFDS_ERROR, '%s NOT defined.' \
-                            % (Rule.NameGuid))
+                EdkLogger.error("GenFds", GENFDS_ERROR, '%s NOT defined.'
+                                % (Rule.NameGuid))
             if PcdValue.startswith('{'):
                 PcdValue = GuidStructureByteArrayToGuidString(PcdValue)
             RegistryGuidStr = PcdValue
             if len(RegistryGuidStr) == 0:
-                EdkLogger.error("GenFds", GENFDS_ERROR, 'GUID value for %s in wrong format.' \
-                            % (Rule.NameGuid))
+                EdkLogger.error("GenFds", GENFDS_ERROR, 'GUID value for %s in wrong format.'
+                                % (Rule.NameGuid))
             self.ModuleGuid = RegistryGuidStr
 
-        FfsOutput = os.path.join( self.OutputPath, self.ModuleGuid + '.ffs')
+        FfsOutput = os.path.join(self.OutputPath, self.ModuleGuid + '.ffs')
         GenFdsGlobalVariable.GenerateFfs(FfsOutput, InputFile,
-                                             FdfFvFileTypeToFileType[Rule.FvFileType],
-                                             self.ModuleGuid, Fixed=Rule.Fixed,
-                                             CheckSum=Rule.CheckSum, Align=Rule.Alignment,
-                                             SectionAlign=Alignments,
-                                             MakefilePath=MakefilePath
-                                             )
+                                         FdfFvFileTypeToFileType[Rule.FvFileType],
+                                         self.ModuleGuid, Fixed=Rule.Fixed,
+                                         CheckSum=Rule.CheckSum, Align=Rule.Alignment,
+                                         SectionAlign=Alignments,
+                                         MakefilePath=MakefilePath
+                                         )
         return FfsOutput
 
-    ## __GetBuildOutputMapFileVfrUniInfo() method
+    # __GetBuildOutputMapFileVfrUniInfo() method
     #
     #   Find the offset of UNI/INF object offset in the EFI image file.
     #
@@ -1079,7 +1162,7 @@ class FfsInfStatement(FfsInfStatementClassObject):
         EfiFileName = os.path.join(self.EfiOutputPath, self.BaseName + ".efi")
         return GetVariableOffset(MapFileName, EfiFileName, list(VfrUniBaseName.values()))
 
-    ## __GenUniVfrOffsetFile() method
+    # __GenUniVfrOffsetFile() method
     #
     #   Generate the offset file for the module which contain VFR or UNI file.
     #
@@ -1101,8 +1184,8 @@ class FfsInfStatement(FfsInfStatementClassObject):
                 #
                 UniGuid = b'\xe0\xc5\x13\x89\xf63\x86M\x9b\xf1C\xef\x89\xfc\x06f'
                 fStringIO.write(UniGuid)
-                UniValue = pack ('Q', int (Item[1], 16))
-                fStringIO.write (UniValue)
+                UniValue = pack('Q', int(Item[1], 16))
+                fStringIO.write(UniValue)
             else:
                 #
                 # VFR binary offset in image.
@@ -1111,18 +1194,17 @@ class FfsInfStatement(FfsInfStatementClassObject):
                 #
                 VfrGuid = b'\xb4|\xbc\xd0Gj_I\xaa\x11q\x07F\xda\x06\xa2'
                 fStringIO.write(VfrGuid)
-                type (Item[1])
-                VfrValue = pack ('Q', int (Item[1], 16))
-                fStringIO.write (VfrValue)
+                type(Item[1])
+                VfrValue = pack('Q', int(Item[1], 16))
+                fStringIO.write(VfrValue)
 
         #
         # write data into file.
         #
-        try :
+        try:
             SaveFileOnChange(UniVfrOffsetFileName, fStringIO.getvalue())
         except:
-            EdkLogger.error("GenFds", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the file been locked or using by other applications." %UniVfrOffsetFileName, None)
-
-        fStringIO.close ()
-
+            EdkLogger.error("GenFds", FILE_WRITE_FAILURE,
+                            "Write data to file %s failed, please check whether the file been locked or using by other applications." % UniVfrOffsetFileName, None)
 
+        fStringIO.close()
diff --git a/BaseTools/Source/Python/GenFds/Fv.py b/BaseTools/Source/Python/GenFds/Fv.py
index 16c944a0bd79..578b1ad0e196 100644
--- a/BaseTools/Source/Python/GenFds/Fv.py
+++ b/BaseTools/Source/Python/GenFds/Fv.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # process FV generation
 #
 #  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -23,11 +23,13 @@ from Common.DataType import *
 
 FV_UI_EXT_ENTY_GUID = 'A67DF1FA-8DE8-4E98-AF09-4BDF2EFFBC7C'
 
-## generate FV
+# generate FV
 #
 #
+
+
 class FV (object):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -58,7 +60,7 @@ class FV (object):
         self.FvExtEntryTypeValue = []
         self.FvExtEntryType = []
         self.FvExtEntryData = []
-    ## AddToBuffer()
+    # AddToBuffer()
     #
     #   Generate Fv and add it to the Buffer
     #
@@ -71,7 +73,8 @@ class FV (object):
     #   @param  MacroDict   macro value pair
     #   @retval string      Generated FV file path
     #
-    def AddToBuffer (self, Buffer, BaseAddress=None, BlockSize= None, BlockNum=None, ErasePloarity='1',  MacroDict = None, Flag=False):
+
+    def AddToBuffer(self, Buffer, BaseAddress=None, BlockSize=None, BlockNum=None, ErasePloarity='1',  MacroDict=None, Flag=False):
         if BaseAddress is None and self.UiFvName.upper() + 'fv' in GenFdsGlobalVariable.ImageBinDict:
             return GenFdsGlobalVariable.ImageBinDict[self.UiFvName.upper() + 'fv']
         if MacroDict is None:
@@ -91,16 +94,19 @@ class FV (object):
                             elif RegionData.upper() + 'fv' in GenFdsGlobalVariable.ImageBinDict:
                                 continue
                             elif self.UiFvName.upper() == RegionData.upper():
-                                GenFdsGlobalVariable.ErrorLogger("Capsule %s in FD region can't contain a FV %s in FD region." % (self.CapsuleName, self.UiFvName.upper()))
+                                GenFdsGlobalVariable.ErrorLogger("Capsule %s in FD region can't contain a FV %s in FD region." % (
+                                    self.CapsuleName, self.UiFvName.upper()))
         if not Flag:
-            GenFdsGlobalVariable.InfLogger( "\nGenerating %s FV" %self.UiFvName)
+            GenFdsGlobalVariable.InfLogger(
+                "\nGenerating %s FV" % self.UiFvName)
         GenFdsGlobalVariable.LargeFileInFvFlags.append(False)
         FFSGuid = None
 
         if self.FvBaseAddress is not None:
             BaseAddress = self.FvBaseAddress
         if not Flag:
-            self._InitializeInf(BaseAddress, BlockSize, BlockNum, ErasePloarity)
+            self._InitializeInf(BaseAddress, BlockSize,
+                                BlockNum, ErasePloarity)
         #
         # First Process the Apriori section
         #
@@ -109,13 +115,14 @@ class FV (object):
         GenFdsGlobalVariable.VerboseLogger('First generate Apriori file !')
         FfsFileList = []
         for AprSection in self.AprioriSectionList:
-            FileName = AprSection.GenFfs (self.UiFvName, MacroDict, IsMakefile=Flag)
+            FileName = AprSection.GenFfs(
+                self.UiFvName, MacroDict, IsMakefile=Flag)
             FfsFileList.append(FileName)
             # Add Apriori file name to Inf file
             if not Flag:
-                self.FvInfFile.append("EFI_FILE_NAME = " + \
-                                            FileName          + \
-                                            TAB_LINE_BREAK)
+                self.FvInfFile.append("EFI_FILE_NAME = " +
+                                      FileName +
+                                      TAB_LINE_BREAK)
 
         # Process Modules in FfsList
         for FfsFile in self.FfsList:
@@ -124,12 +131,13 @@ class FV (object):
                     continue
             if GenFdsGlobalVariable.EnableGenfdsMultiThread and GenFdsGlobalVariable.ModuleFile and GenFdsGlobalVariable.ModuleFile.Path.find(os.path.normpath(FfsFile.InfFileName)) == -1:
                 continue
-            FileName = FfsFile.GenFfs(MacroDict, FvParentAddr=BaseAddress, IsMakefile=Flag, FvName=self.UiFvName)
+            FileName = FfsFile.GenFfs(
+                MacroDict, FvParentAddr=BaseAddress, IsMakefile=Flag, FvName=self.UiFvName)
             FfsFileList.append(FileName)
             if not Flag:
-                self.FvInfFile.append("EFI_FILE_NAME = " + \
-                                            FileName          + \
-                                            TAB_LINE_BREAK)
+                self.FvInfFile.append("EFI_FILE_NAME = " +
+                                      FileName +
+                                      TAB_LINE_BREAK)
         if not Flag:
             FvInfFile = ''.join(self.FvInfFile)
             SaveFileOnChange(self.InfFileName, FvInfFile, False)
@@ -143,28 +151,31 @@ class FV (object):
             FvOutputFile = self.CreateFileName
 
         if Flag:
-            GenFdsGlobalVariable.ImageBinDict[self.UiFvName.upper() + 'fv'] = FvOutputFile
+            GenFdsGlobalVariable.ImageBinDict[self.UiFvName.upper(
+            ) + 'fv'] = FvOutputFile
             return FvOutputFile
 
-        FvInfoFileName = os.path.join(GenFdsGlobalVariable.FfsDir, self.UiFvName + '.inf')
+        FvInfoFileName = os.path.join(
+            GenFdsGlobalVariable.FfsDir, self.UiFvName + '.inf')
         if not Flag:
-            CopyLongFilePath(GenFdsGlobalVariable.FvAddressFileName, FvInfoFileName)
+            CopyLongFilePath(
+                GenFdsGlobalVariable.FvAddressFileName, FvInfoFileName)
             OrigFvInfo = None
-            if os.path.exists (FvInfoFileName):
+            if os.path.exists(FvInfoFileName):
                 OrigFvInfo = open(FvInfoFileName, 'r').read()
             if GenFdsGlobalVariable.LargeFileInFvFlags[-1]:
                 FFSGuid = GenFdsGlobalVariable.EFI_FIRMWARE_FILE_SYSTEM3_GUID
             GenFdsGlobalVariable.GenerateFirmwareVolume(
-                                    FvOutputFile,
-                                    [self.InfFileName],
-                                    AddressFile=FvInfoFileName,
-                                    FfsList=FfsFileList,
-                                    ForceRebase=self.FvForceRebase,
-                                    FileSystemGuid=FFSGuid
-                                    )
+                FvOutputFile,
+                [self.InfFileName],
+                AddressFile=FvInfoFileName,
+                FfsList=FfsFileList,
+                ForceRebase=self.FvForceRebase,
+                FileSystemGuid=FFSGuid
+            )
 
             NewFvInfo = None
-            if os.path.exists (FvInfoFileName):
+            if os.path.exists(FvInfoFileName):
                 NewFvInfo = open(FvInfoFileName, 'r').read()
             if NewFvInfo is not None and NewFvInfo != OrigFvInfo:
                 FvChildAddr = []
@@ -173,28 +184,29 @@ class FV (object):
                 AddrKeyFound = False
                 for AddrString in AddrStrings:
                     if AddrKeyFound:
-                        #get base address for the inside FvImage
-                        FvChildAddr.append (AddrString)
-                    elif AddrString.find ("[FV_BASE_ADDRESS]") != -1:
+                        # get base address for the inside FvImage
+                        FvChildAddr.append(AddrString)
+                    elif AddrString.find("[FV_BASE_ADDRESS]") != -1:
                         AddrKeyFound = True
                 AddFileObj.close()
 
                 if FvChildAddr != []:
                     # Update Ffs again
                     for FfsFile in self.FfsList:
-                        FileName = FfsFile.GenFfs(MacroDict, FvChildAddr, BaseAddress, IsMakefile=Flag, FvName=self.UiFvName)
+                        FileName = FfsFile.GenFfs(
+                            MacroDict, FvChildAddr, BaseAddress, IsMakefile=Flag, FvName=self.UiFvName)
 
                     if GenFdsGlobalVariable.LargeFileInFvFlags[-1]:
-                        FFSGuid = GenFdsGlobalVariable.EFI_FIRMWARE_FILE_SYSTEM3_GUID;
-                    #Update GenFv again
+                        FFSGuid = GenFdsGlobalVariable.EFI_FIRMWARE_FILE_SYSTEM3_GUID
+                    # Update GenFv again
                     GenFdsGlobalVariable.GenerateFirmwareVolume(
-                                                FvOutputFile,
-                                                [self.InfFileName],
-                                                AddressFile=FvInfoFileName,
-                                                FfsList=FfsFileList,
-                                                ForceRebase=self.FvForceRebase,
-                                                FileSystemGuid=FFSGuid
-                                                )
+                        FvOutputFile,
+                        [self.InfFileName],
+                        AddressFile=FvInfoFileName,
+                        FfsList=FfsFileList,
+                        ForceRebase=self.FvForceRebase,
+                        FileSystemGuid=FFSGuid
+                    )
 
             #
             # Write the Fv contents to Buffer
@@ -205,35 +217,42 @@ class FV (object):
                 FvHeaderBuffer = FvFileObj.read(0x48)
                 Signature = FvHeaderBuffer[0x28:0x32]
                 if Signature and Signature.startswith(b'_FVH'):
-                    GenFdsGlobalVariable.VerboseLogger("\nGenerate %s FV Successfully" % self.UiFvName)
+                    GenFdsGlobalVariable.VerboseLogger(
+                        "\nGenerate %s FV Successfully" % self.UiFvName)
                     GenFdsGlobalVariable.SharpCounter = 0
 
                     FvFileObj.seek(0)
                     Buffer.write(FvFileObj.read())
                     # FV alignment position.
-                    FvAlignmentValue = 1 << (ord(FvHeaderBuffer[0x2E:0x2F]) & 0x1F)
+                    FvAlignmentValue = 1 << (
+                        ord(FvHeaderBuffer[0x2E:0x2F]) & 0x1F)
                     if FvAlignmentValue >= 0x400:
                         if FvAlignmentValue >= 0x100000:
                             if FvAlignmentValue >= 0x1000000:
-                            #The max alignment supported by FFS is 16M.
+                                # The max alignment supported by FFS is 16M.
                                 self.FvAlignment = "16M"
                             else:
-                                self.FvAlignment = str(FvAlignmentValue // 0x100000) + "M"
+                                self.FvAlignment = str(
+                                    FvAlignmentValue // 0x100000) + "M"
                         else:
-                            self.FvAlignment = str(FvAlignmentValue // 0x400) + "K"
+                            self.FvAlignment = str(
+                                FvAlignmentValue // 0x400) + "K"
                     else:
                         # FvAlignmentValue is less than 1K
-                        self.FvAlignment = str (FvAlignmentValue)
+                        self.FvAlignment = str(FvAlignmentValue)
                     FvFileObj.close()
-                    GenFdsGlobalVariable.ImageBinDict[self.UiFvName.upper() + 'fv'] = FvOutputFile
+                    GenFdsGlobalVariable.ImageBinDict[self.UiFvName.upper(
+                    ) + 'fv'] = FvOutputFile
                     GenFdsGlobalVariable.LargeFileInFvFlags.pop()
                 else:
-                    GenFdsGlobalVariable.ErrorLogger("Invalid FV file %s." % self.UiFvName)
+                    GenFdsGlobalVariable.ErrorLogger(
+                        "Invalid FV file %s." % self.UiFvName)
             else:
-                GenFdsGlobalVariable.ErrorLogger("Failed to generate %s FV file." %self.UiFvName)
+                GenFdsGlobalVariable.ErrorLogger(
+                    "Failed to generate %s FV file." % self.UiFvName)
         return FvOutputFile
 
-    ## _GetBlockSize()
+    # _GetBlockSize()
     #
     #   Calculate FV's block size
     #   Inherit block size from FD if no block size specified in FV
@@ -256,7 +275,7 @@ class FV (object):
                             return True
         return False
 
-    ## _InitializeInf()
+    # _InitializeInf()
     #
     #   Initialize the inf file to create FV
     #
@@ -266,12 +285,12 @@ class FV (object):
     #   @param  BlockNum    How many blocks in FV
     #   @param  ErasePolarity      Flash erase polarity
     #
-    def _InitializeInf (self, BaseAddress = None, BlockSize= None, BlockNum = None, ErasePloarity='1'):
+    def _InitializeInf(self, BaseAddress=None, BlockSize=None, BlockNum=None, ErasePloarity='1'):
         #
         # Create FV inf file
         #
         self.InfFileName = os.path.join(GenFdsGlobalVariable.FvDir,
-                                   self.UiFvName + '.inf')
+                                        self.UiFvName + '.inf')
         self.FvInfFile = []
 
         #
@@ -279,79 +298,81 @@ class FV (object):
         #
         self.FvInfFile.append("[options]" + TAB_LINE_BREAK)
         if BaseAddress is not None:
-            self.FvInfFile.append("EFI_BASE_ADDRESS = " + \
-                                       BaseAddress          + \
-                                       TAB_LINE_BREAK)
+            self.FvInfFile.append("EFI_BASE_ADDRESS = " +
+                                  BaseAddress +
+                                  TAB_LINE_BREAK)
 
         if BlockSize is not None:
-            self.FvInfFile.append("EFI_BLOCK_SIZE = " + \
-                                      '0x%X' %BlockSize    + \
-                                      TAB_LINE_BREAK)
+            self.FvInfFile.append("EFI_BLOCK_SIZE = " +
+                                  '0x%X' % BlockSize +
+                                  TAB_LINE_BREAK)
             if BlockNum is not None:
-                self.FvInfFile.append("EFI_NUM_BLOCKS   = "  + \
-                                      ' 0x%X' %BlockNum    + \
+                self.FvInfFile.append("EFI_NUM_BLOCKS   = " +
+                                      ' 0x%X' % BlockNum +
                                       TAB_LINE_BREAK)
         else:
             if self.BlockSizeList == []:
                 if not self._GetBlockSize():
-                    #set default block size is 1
-                    self.FvInfFile.append("EFI_BLOCK_SIZE  = 0x1" + TAB_LINE_BREAK)
+                    # set default block size is 1
+                    self.FvInfFile.append(
+                        "EFI_BLOCK_SIZE  = 0x1" + TAB_LINE_BREAK)
 
             for BlockSize in self.BlockSizeList:
                 if BlockSize[0] is not None:
-                    self.FvInfFile.append("EFI_BLOCK_SIZE  = "  + \
-                                          '0x%X' %BlockSize[0]    + \
+                    self.FvInfFile.append("EFI_BLOCK_SIZE  = " +
+                                          '0x%X' % BlockSize[0] +
                                           TAB_LINE_BREAK)
 
                 if BlockSize[1] is not None:
-                    self.FvInfFile.append("EFI_NUM_BLOCKS   = "  + \
-                                          ' 0x%X' %BlockSize[1]    + \
+                    self.FvInfFile.append("EFI_NUM_BLOCKS   = " +
+                                          ' 0x%X' % BlockSize[1] +
                                           TAB_LINE_BREAK)
 
         if self.BsBaseAddress is not None:
-            self.FvInfFile.append('EFI_BOOT_DRIVER_BASE_ADDRESS = ' + \
-                                       '0x%X' %self.BsBaseAddress)
+            self.FvInfFile.append('EFI_BOOT_DRIVER_BASE_ADDRESS = ' +
+                                  '0x%X' % self.BsBaseAddress)
         if self.RtBaseAddress is not None:
-            self.FvInfFile.append('EFI_RUNTIME_DRIVER_BASE_ADDRESS = ' + \
-                                      '0x%X' %self.RtBaseAddress)
+            self.FvInfFile.append('EFI_RUNTIME_DRIVER_BASE_ADDRESS = ' +
+                                  '0x%X' % self.RtBaseAddress)
         #
         # Add attribute
         #
         self.FvInfFile.append("[attributes]" + TAB_LINE_BREAK)
 
-        self.FvInfFile.append("EFI_ERASE_POLARITY   = "       + \
-                                          ' %s' %ErasePloarity    + \
-                                          TAB_LINE_BREAK)
+        self.FvInfFile.append("EFI_ERASE_POLARITY   = " +
+                              ' %s' % ErasePloarity +
+                              TAB_LINE_BREAK)
         if not (self.FvAttributeDict is None):
             for FvAttribute in self.FvAttributeDict.keys():
                 if FvAttribute == "FvUsedSizeEnable":
                     if self.FvAttributeDict[FvAttribute].upper() in ('TRUE', '1'):
                         self.UsedSizeEnable = True
                     continue
-                self.FvInfFile.append("EFI_"            + \
-                                          FvAttribute       + \
-                                          ' = '             + \
-                                          self.FvAttributeDict[FvAttribute] + \
-                                          TAB_LINE_BREAK )
+                self.FvInfFile.append("EFI_" +
+                                      FvAttribute +
+                                      ' = ' +
+                                      self.FvAttributeDict[FvAttribute] +
+                                      TAB_LINE_BREAK)
         if self.FvAlignment is not None:
-            self.FvInfFile.append("EFI_FVB2_ALIGNMENT_"     + \
-                                       self.FvAlignment.strip() + \
-                                       " = TRUE"                + \
-                                       TAB_LINE_BREAK)
+            self.FvInfFile.append("EFI_FVB2_ALIGNMENT_" +
+                                  self.FvAlignment.strip() +
+                                  " = TRUE" +
+                                  TAB_LINE_BREAK)
 
         #
         # Generate FV extension header file
         #
         if not self.FvNameGuid:
             if len(self.FvExtEntryType) > 0 or self.UsedSizeEnable:
-                GenFdsGlobalVariable.ErrorLogger("FV Extension Header Entries declared for %s with no FvNameGuid declaration." % (self.UiFvName))
+                GenFdsGlobalVariable.ErrorLogger(
+                    "FV Extension Header Entries declared for %s with no FvNameGuid declaration." % (self.UiFvName))
         else:
             TotalSize = 16 + 4
             Buffer = bytearray()
             if self.UsedSizeEnable:
                 TotalSize += (4 + 4)
-                ## define EFI_FV_EXT_TYPE_USED_SIZE_TYPE 0x03
-                #typedef  struct
+                # define EFI_FV_EXT_TYPE_USED_SIZE_TYPE 0x03
+                # typedef  struct
                 # {
                 #    EFI_FIRMWARE_VOLUME_EXT_ENTRY Hdr;
                 #    UINT32 UsedSize;
@@ -376,34 +397,41 @@ class FV (object):
                            + PackGUID(Guid)
                            + self.UiFvName.encode('utf-8'))
 
-            for Index in range (0, len(self.FvExtEntryType)):
+            for Index in range(0, len(self.FvExtEntryType)):
                 if self.FvExtEntryType[Index] == 'FILE':
                     # check if the path is absolute or relative
                     if os.path.isabs(self.FvExtEntryData[Index]):
-                        FileFullPath = os.path.normpath(self.FvExtEntryData[Index])
+                        FileFullPath = os.path.normpath(
+                            self.FvExtEntryData[Index])
                     else:
-                        FileFullPath = os.path.normpath(os.path.join(GenFdsGlobalVariable.WorkSpaceDir, self.FvExtEntryData[Index]))
+                        FileFullPath = os.path.normpath(os.path.join(
+                            GenFdsGlobalVariable.WorkSpaceDir, self.FvExtEntryData[Index]))
                     # check if the file path exists or not
                     if not os.path.isfile(FileFullPath):
-                        GenFdsGlobalVariable.ErrorLogger("Error opening FV Extension Header Entry file %s." % (self.FvExtEntryData[Index]))
-                    FvExtFile = open (FileFullPath, 'rb')
+                        GenFdsGlobalVariable.ErrorLogger(
+                            "Error opening FV Extension Header Entry file %s." % (self.FvExtEntryData[Index]))
+                    FvExtFile = open(FileFullPath, 'rb')
                     FvExtFile.seek(0, 2)
                     Size = FvExtFile.tell()
                     if Size >= 0x10000:
-                        GenFdsGlobalVariable.ErrorLogger("The size of FV Extension Header Entry file %s exceeds 0x10000." % (self.FvExtEntryData[Index]))
+                        GenFdsGlobalVariable.ErrorLogger(
+                            "The size of FV Extension Header Entry file %s exceeds 0x10000." % (self.FvExtEntryData[Index]))
                     TotalSize += (Size + 4)
                     FvExtFile.seek(0)
-                    Buffer += pack('HH', (Size + 4), int(self.FvExtEntryTypeValue[Index], 16))
+                    Buffer += pack('HH', (Size + 4),
+                                   int(self.FvExtEntryTypeValue[Index], 16))
                     Buffer += FvExtFile.read()
                     FvExtFile.close()
                 if self.FvExtEntryType[Index] == 'DATA':
                     ByteList = self.FvExtEntryData[Index].split(',')
-                    Size = len (ByteList)
+                    Size = len(ByteList)
                     if Size >= 0x10000:
-                        GenFdsGlobalVariable.ErrorLogger("The size of FV Extension Header Entry data %s exceeds 0x10000." % (self.FvExtEntryData[Index]))
+                        GenFdsGlobalVariable.ErrorLogger(
+                            "The size of FV Extension Header Entry data %s exceeds 0x10000." % (self.FvExtEntryData[Index]))
                     TotalSize += (Size + 4)
-                    Buffer += pack('HH', (Size + 4), int(self.FvExtEntryTypeValue[Index], 16))
-                    for Index1 in range (0, Size):
+                    Buffer += pack('HH', (Size + 4),
+                                   int(self.FvExtEntryTypeValue[Index], 16))
+                    for Index1 in range(0, Size):
                         Buffer += pack('B', int(ByteList[Index1], 16))
 
             Guid = self.FvNameGuid.split('-')
@@ -413,17 +441,19 @@ class FV (object):
             # Generate FV extension header file if the total size is not zero
             #
             if TotalSize > 0:
-                FvExtHeaderFileName = os.path.join(GenFdsGlobalVariable.FvDir, self.UiFvName + '.ext')
+                FvExtHeaderFileName = os.path.join(
+                    GenFdsGlobalVariable.FvDir, self.UiFvName + '.ext')
                 FvExtHeaderFile = BytesIO()
                 FvExtHeaderFile.write(Buffer)
-                Changed = SaveFileOnChange(FvExtHeaderFileName, FvExtHeaderFile.getvalue(), True)
+                Changed = SaveFileOnChange(
+                    FvExtHeaderFileName, FvExtHeaderFile.getvalue(), True)
                 FvExtHeaderFile.close()
                 if Changed:
-                  if os.path.exists (self.InfFileName):
-                    os.remove (self.InfFileName)
-                self.FvInfFile.append("EFI_FV_EXT_HEADER_FILE_NAME = "      + \
-                                           FvExtHeaderFileName                  + \
-                                           TAB_LINE_BREAK)
+                    if os.path.exists(self.InfFileName):
+                        os.remove(self.InfFileName)
+                self.FvInfFile.append("EFI_FV_EXT_HEADER_FILE_NAME = " +
+                                      FvExtHeaderFileName +
+                                      TAB_LINE_BREAK)
 
         #
         # Add [Files]
diff --git a/BaseTools/Source/Python/GenFds/FvImageSection.py b/BaseTools/Source/Python/GenFds/FvImageSection.py
index ff2d5ca3c053..4fade8d888ea 100644
--- a/BaseTools/Source/Python/GenFds/FvImageSection.py
+++ b/BaseTools/Source/Python/GenFds/FvImageSection.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # process FV image section generation
 #
 #  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -22,19 +22,21 @@ from Common import EdkLogger
 from Common.BuildToolError import *
 from Common.DataType import *
 
-## generate FV image section
+# generate FV image section
 #
 #
+
+
 class FvImageSection(FvImageSectionClassObject):
 
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
     def __init__(self):
         FvImageSectionClassObject.__init__(self)
 
-    ## GenSection() method
+    # GenSection() method
     #
     #   Generate FV image section
     #
@@ -47,14 +49,15 @@ class FvImageSection(FvImageSectionClassObject):
     #   @param  Dict        dictionary contains macro and its value
     #   @retval tuple       (Generated file name, section alignment)
     #
-    def GenSection(self, OutputPath, ModuleName, SecNum, KeyStringList, FfsInf = None, Dict = None, IsMakefile = False):
+    def GenSection(self, OutputPath, ModuleName, SecNum, KeyStringList, FfsInf=None, Dict=None, IsMakefile=False):
 
         OutputFileList = []
         if Dict is None:
             Dict = {}
         if self.FvFileType is not None:
-            FileList, IsSect = Section.Section.GetFileList(FfsInf, self.FvFileType, self.FvFileExtension)
-            if IsSect :
+            FileList, IsSect = Section.Section.GetFileList(
+                FfsInf, self.FvFileType, self.FvFileExtension)
+            if IsSect:
                 return FileList, self.Alignment
 
             Num = SecNum
@@ -63,36 +66,39 @@ class FvImageSection(FvImageSectionClassObject):
             for FvFileName in FileList:
                 FvAlignmentValue = 0
                 if os.path.isfile(FvFileName):
-                    FvFileObj = open (FvFileName, 'rb')
+                    FvFileObj = open(FvFileName, 'rb')
                     FvFileObj.seek(0)
                     # PI FvHeader is 0x48 byte
                     FvHeaderBuffer = FvFileObj.read(0x48)
                     # FV alignment position.
                     if isinstance(FvHeaderBuffer[0x2E], str):
-                        FvAlignmentValue = 1 << (ord(FvHeaderBuffer[0x2E]) & 0x1F)
+                        FvAlignmentValue = 1 << (
+                            ord(FvHeaderBuffer[0x2E]) & 0x1F)
                     else:
                         FvAlignmentValue = 1 << (FvHeaderBuffer[0x2E] & 0x1F)
                     FvFileObj.close()
                 if FvAlignmentValue > MaxFvAlignment:
                     MaxFvAlignment = FvAlignmentValue
 
-                OutputFile = os.path.join(OutputPath, ModuleName + SUP_MODULE_SEC + Num + SectionSuffix.get("FV_IMAGE"))
-                GenFdsGlobalVariable.GenerateSection(OutputFile, [FvFileName], 'EFI_SECTION_FIRMWARE_VOLUME_IMAGE', IsMakefile=IsMakefile)
+                OutputFile = os.path.join(
+                    OutputPath, ModuleName + SUP_MODULE_SEC + Num + SectionSuffix.get("FV_IMAGE"))
+                GenFdsGlobalVariable.GenerateSection(
+                    OutputFile, [FvFileName], 'EFI_SECTION_FIRMWARE_VOLUME_IMAGE', IsMakefile=IsMakefile)
                 OutputFileList.append(OutputFile)
 
             # MaxFvAlignment is larger than or equal to 1K
             if MaxFvAlignment >= 0x400:
                 if MaxFvAlignment >= 0x100000:
-                    #The max alignment supported by FFS is 16M.
+                    # The max alignment supported by FFS is 16M.
                     if MaxFvAlignment >= 0x1000000:
                         self.Alignment = "16M"
                     else:
                         self.Alignment = str(MaxFvAlignment // 0x100000) + "M"
                 else:
-                    self.Alignment = str (MaxFvAlignment // 0x400) + "K"
+                    self.Alignment = str(MaxFvAlignment // 0x400) + "K"
             else:
                 # MaxFvAlignment is less than 1K
-                self.Alignment = str (MaxFvAlignment)
+                self.Alignment = str(MaxFvAlignment)
 
             return OutputFileList, self.Alignment
         #
@@ -105,54 +111,65 @@ class FvImageSection(FvImageSectionClassObject):
                 self.Fv = Fv
                 if not self.FvAddr and self.Fv.BaseAddress:
                     self.FvAddr = self.Fv.BaseAddress
-                FvFileName = Fv.AddToBuffer(Buffer, self.FvAddr, MacroDict = Dict, Flag=IsMakefile)
+                FvFileName = Fv.AddToBuffer(
+                    Buffer, self.FvAddr, MacroDict=Dict, Flag=IsMakefile)
                 if Fv.FvAlignment is not None:
                     if self.Alignment is None:
                         self.Alignment = Fv.FvAlignment
                     else:
-                        if GenFdsGlobalVariable.GetAlignment (Fv.FvAlignment) > GenFdsGlobalVariable.GetAlignment (self.Alignment):
+                        if GenFdsGlobalVariable.GetAlignment(Fv.FvAlignment) > GenFdsGlobalVariable.GetAlignment(self.Alignment):
                             self.Alignment = Fv.FvAlignment
             else:
                 if self.FvFileName is not None:
-                    FvFileName = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.FvFileName)
+                    FvFileName = GenFdsGlobalVariable.ReplaceWorkspaceMacro(
+                        self.FvFileName)
                     if os.path.isfile(FvFileName):
-                        FvFileObj = open (FvFileName, 'rb')
+                        FvFileObj = open(FvFileName, 'rb')
                         FvFileObj.seek(0)
                         # PI FvHeader is 0x48 byte
                         FvHeaderBuffer = FvFileObj.read(0x48)
                         # FV alignment position.
                         if isinstance(FvHeaderBuffer[0x2E], str):
-                            FvAlignmentValue = 1 << (ord(FvHeaderBuffer[0x2E]) & 0x1F)
+                            FvAlignmentValue = 1 << (
+                                ord(FvHeaderBuffer[0x2E]) & 0x1F)
                         else:
-                            FvAlignmentValue = 1 << (FvHeaderBuffer[0x2E] & 0x1F)
+                            FvAlignmentValue = 1 << (
+                                FvHeaderBuffer[0x2E] & 0x1F)
                         # FvAlignmentValue is larger than or equal to 1K
                         if FvAlignmentValue >= 0x400:
                             if FvAlignmentValue >= 0x100000:
-                                #The max alignment supported by FFS is 16M.
+                                # The max alignment supported by FFS is 16M.
                                 if FvAlignmentValue >= 0x1000000:
                                     self.Alignment = "16M"
                                 else:
-                                    self.Alignment = str(FvAlignmentValue // 0x100000) + "M"
+                                    self.Alignment = str(
+                                        FvAlignmentValue // 0x100000) + "M"
                             else:
-                                self.Alignment = str (FvAlignmentValue // 0x400) + "K"
+                                self.Alignment = str(
+                                    FvAlignmentValue // 0x400) + "K"
                         else:
                             # FvAlignmentValue is less than 1K
-                            self.Alignment = str (FvAlignmentValue)
+                            self.Alignment = str(FvAlignmentValue)
                         FvFileObj.close()
                     else:
-                        if len (mws.getPkgPath()) == 0:
-                            EdkLogger.error("GenFds", FILE_NOT_FOUND, "%s is not found in WORKSPACE: %s" % self.FvFileName, GenFdsGlobalVariable.WorkSpaceDir)
+                        if len(mws.getPkgPath()) == 0:
+                            EdkLogger.error("GenFds", FILE_NOT_FOUND, "%s is not found in WORKSPACE: %s" %
+                                            self.FvFileName, GenFdsGlobalVariable.WorkSpaceDir)
                         else:
-                            EdkLogger.error("GenFds", FILE_NOT_FOUND, "%s is not found in packages path:\n\t%s" % (self.FvFileName, '\n\t'.join(mws.getPkgPath())))
+                            EdkLogger.error("GenFds", FILE_NOT_FOUND, "%s is not found in packages path:\n\t%s" % (
+                                self.FvFileName, '\n\t'.join(mws.getPkgPath())))
 
                 else:
-                    EdkLogger.error("GenFds", GENFDS_ERROR, "FvImageSection Failed! %s NOT found in FDF" % self.FvName)
+                    EdkLogger.error(
+                        "GenFds", GENFDS_ERROR, "FvImageSection Failed! %s NOT found in FDF" % self.FvName)
 
             #
             # Prepare the parameter of GenSection
             #
-            OutputFile = os.path.join(OutputPath, ModuleName + SUP_MODULE_SEC + SecNum + SectionSuffix.get("FV_IMAGE"))
-            GenFdsGlobalVariable.GenerateSection(OutputFile, [FvFileName], 'EFI_SECTION_FIRMWARE_VOLUME_IMAGE', IsMakefile=IsMakefile)
+            OutputFile = os.path.join(
+                OutputPath, ModuleName + SUP_MODULE_SEC + SecNum + SectionSuffix.get("FV_IMAGE"))
+            GenFdsGlobalVariable.GenerateSection(
+                OutputFile, [FvFileName], 'EFI_SECTION_FIRMWARE_VOLUME_IMAGE', IsMakefile=IsMakefile)
             OutputFileList.append(OutputFile)
 
             return OutputFileList, self.Alignment
diff --git a/BaseTools/Source/Python/GenFds/GenFds.py b/BaseTools/Source/Python/GenFds/GenFds.py
index 17b71b7cd347..856d4c84c435 100644
--- a/BaseTools/Source/Python/GenFds/GenFds.py
+++ b/BaseTools/Source/Python/GenFds/GenFds.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # generate flash image
 #
 #  Copyright (c) 2007 - 2019, Intel Corporation. All rights reserved.<BR>
@@ -20,7 +20,7 @@ from linecache import getlines
 from io import BytesIO
 
 import Common.LongFilePathOs as os
-from Common.TargetTxtClassObject import TargetTxtDict,gDefaultTargetTxtFile
+from Common.TargetTxtClassObject import TargetTxtDict, gDefaultTargetTxtFile
 from Common.DataType import *
 import Common.GlobalData as GlobalData
 from Common import EdkLogger
@@ -43,7 +43,7 @@ versionNumber = "1.0" + ' ' + gBUILD_VERSION
 __version__ = "%prog Version " + versionNumber
 __copyright__ = "Copyright (c) 2007 - 2018, Intel Corporation  All rights reserved."
 
-## Tool entrance method
+# Tool entrance method
 #
 # This method mainly dispatch specific methods per the command line options.
 # If no error found, return zero value so the caller of this tool can know
@@ -52,12 +52,15 @@ __copyright__ = "Copyright (c) 2007 - 2018, Intel Corporation  All rights reserv
 #   @retval 0     Tool was successful
 #   @retval 1     Tool failed
 #
+
+
 def main():
     global Options
     Options = myOptionParser()
     EdkLogger.Initialize()
     return GenFdsApi(OptionsToCommandDict(Options))
 
+
 def resetFdsGlobalVariable():
     GenFdsGlobalVariable.FvDir = ''
     GenFdsGlobalVariable.OutputDirDict = {}
@@ -91,7 +94,7 @@ def resetFdsGlobalVariable():
     GenFdsGlobalVariable.GuidToolDefinition = {}
     GenFdsGlobalVariable.FfsCmdDict = {}
     GenFdsGlobalVariable.SecCmdList = []
-    GenFdsGlobalVariable.CopyList   = []
+    GenFdsGlobalVariable.CopyList = []
     GenFdsGlobalVariable.ModuleFile = ''
     GenFdsGlobalVariable.EnableGenfdsMultiThread = True
 
@@ -104,6 +107,7 @@ def resetFdsGlobalVariable():
     # FvName, FdName, CapName in FDF, Image file name
     GenFdsGlobalVariable.ImageBinDict = {}
 
+
 def GenFdsApi(FdsCommandDict, WorkSpaceDataBase=None):
     global Workspace
     Workspace = ""
@@ -127,17 +131,19 @@ def GenFdsApi(FdsCommandDict, WorkSpaceDataBase=None):
         else:
             EdkLogger.SetLevel(EdkLogger.INFO)
 
-        if not FdsCommandDict.get("Workspace",os.environ.get('WORKSPACE')):
+        if not FdsCommandDict.get("Workspace", os.environ.get('WORKSPACE')):
             EdkLogger.error("GenFds", OPTION_MISSING, "WORKSPACE not defined",
                             ExtraData="Please use '-w' switch to pass it or set the WORKSPACE environment variable.")
-        elif not os.path.exists(FdsCommandDict.get("Workspace",os.environ.get('WORKSPACE'))):
+        elif not os.path.exists(FdsCommandDict.get("Workspace", os.environ.get('WORKSPACE'))):
             EdkLogger.error("GenFds", PARAMETER_INVALID, "WORKSPACE is invalid",
                             ExtraData="Please use '-w' switch to pass it or set the WORKSPACE environment variable.")
         else:
-            Workspace = os.path.normcase(FdsCommandDict.get("Workspace",os.environ.get('WORKSPACE')))
+            Workspace = os.path.normcase(FdsCommandDict.get(
+                "Workspace", os.environ.get('WORKSPACE')))
             GenFdsGlobalVariable.WorkSpaceDir = Workspace
             if FdsCommandDict.get("debug"):
-                GenFdsGlobalVariable.VerboseLogger("Using Workspace:" + Workspace)
+                GenFdsGlobalVariable.VerboseLogger(
+                    "Using Workspace:" + Workspace)
             if FdsCommandDict.get("GenfdsMultiThread"):
                 GenFdsGlobalVariable.EnableGenfdsMultiThread = True
             else:
@@ -150,46 +156,58 @@ def GenFdsApi(FdsCommandDict, WorkSpaceDataBase=None):
 
         if FdsCommandDict.get("fdf_file"):
             FdfFilename = FdsCommandDict.get("fdf_file")[0].Path
-            FdfFilename = GenFdsGlobalVariable.ReplaceWorkspaceMacro(FdfFilename)
+            FdfFilename = GenFdsGlobalVariable.ReplaceWorkspaceMacro(
+                FdfFilename)
 
             if FdfFilename[0:2] == '..':
                 FdfFilename = os.path.abspath(FdfFilename)
             if not os.path.isabs(FdfFilename):
-                FdfFilename = mws.join(GenFdsGlobalVariable.WorkSpaceDir, FdfFilename)
+                FdfFilename = mws.join(
+                    GenFdsGlobalVariable.WorkSpaceDir, FdfFilename)
             if not os.path.exists(FdfFilename):
-                EdkLogger.error("GenFds", FILE_NOT_FOUND, ExtraData=FdfFilename)
+                EdkLogger.error("GenFds", FILE_NOT_FOUND,
+                                ExtraData=FdfFilename)
 
             GenFdsGlobalVariable.FdfFile = FdfFilename
-            GenFdsGlobalVariable.FdfFileTimeStamp = os.path.getmtime(FdfFilename)
+            GenFdsGlobalVariable.FdfFileTimeStamp = os.path.getmtime(
+                FdfFilename)
         else:
             EdkLogger.error("GenFds", OPTION_MISSING, "Missing FDF filename")
 
         if FdsCommandDict.get("build_target"):
-            GenFdsGlobalVariable.TargetName = FdsCommandDict.get("build_target")
+            GenFdsGlobalVariable.TargetName = FdsCommandDict.get(
+                "build_target")
 
         if FdsCommandDict.get("toolchain_tag"):
-            GenFdsGlobalVariable.ToolChainTag = FdsCommandDict.get("toolchain_tag")
+            GenFdsGlobalVariable.ToolChainTag = FdsCommandDict.get(
+                "toolchain_tag")
 
         if FdsCommandDict.get("active_platform"):
             ActivePlatform = FdsCommandDict.get("active_platform")
-            ActivePlatform = GenFdsGlobalVariable.ReplaceWorkspaceMacro(ActivePlatform)
+            ActivePlatform = GenFdsGlobalVariable.ReplaceWorkspaceMacro(
+                ActivePlatform)
 
             if ActivePlatform[0:2] == '..':
                 ActivePlatform = os.path.abspath(ActivePlatform)
 
-            if not os.path.isabs (ActivePlatform):
-                ActivePlatform = mws.join(GenFdsGlobalVariable.WorkSpaceDir, ActivePlatform)
+            if not os.path.isabs(ActivePlatform):
+                ActivePlatform = mws.join(
+                    GenFdsGlobalVariable.WorkSpaceDir, ActivePlatform)
 
             if not os.path.exists(ActivePlatform):
-                EdkLogger.error("GenFds", FILE_NOT_FOUND, "ActivePlatform doesn't exist!")
+                EdkLogger.error("GenFds", FILE_NOT_FOUND,
+                                "ActivePlatform doesn't exist!")
         else:
-            EdkLogger.error("GenFds", OPTION_MISSING, "Missing active platform")
+            EdkLogger.error("GenFds", OPTION_MISSING,
+                            "Missing active platform")
 
-        GenFdsGlobalVariable.ActivePlatform = PathClass(NormPath(ActivePlatform))
+        GenFdsGlobalVariable.ActivePlatform = PathClass(
+            NormPath(ActivePlatform))
 
         if FdsCommandDict.get("conf_directory"):
             # Get alternate Conf location, if it is absolute, then just use the absolute directory name
-            ConfDirectoryPath = os.path.normpath(FdsCommandDict.get("conf_directory"))
+            ConfDirectoryPath = os.path.normpath(
+                FdsCommandDict.get("conf_directory"))
             if ConfDirectoryPath.startswith('"'):
                 ConfDirectoryPath = ConfDirectoryPath[1:]
             if ConfDirectoryPath.endswith('"'):
@@ -197,17 +215,20 @@ def GenFdsApi(FdsCommandDict, WorkSpaceDataBase=None):
             if not os.path.isabs(ConfDirectoryPath):
                 # Since alternate directory name is not absolute, the alternate directory is located within the WORKSPACE
                 # This also handles someone specifying the Conf directory in the workspace. Using --conf=Conf
-                ConfDirectoryPath = os.path.join(GenFdsGlobalVariable.WorkSpaceDir, ConfDirectoryPath)
+                ConfDirectoryPath = os.path.join(
+                    GenFdsGlobalVariable.WorkSpaceDir, ConfDirectoryPath)
         else:
             if "CONF_PATH" in os.environ:
                 ConfDirectoryPath = os.path.normcase(os.environ["CONF_PATH"])
             else:
                 # Get standard WORKSPACE/Conf, use the absolute path to the WORKSPACE/Conf
-                ConfDirectoryPath = mws.join(GenFdsGlobalVariable.WorkSpaceDir, 'Conf')
+                ConfDirectoryPath = mws.join(
+                    GenFdsGlobalVariable.WorkSpaceDir, 'Conf')
         GenFdsGlobalVariable.ConfDir = ConfDirectoryPath
         if not GlobalData.gConfDirectory:
             GlobalData.gConfDirectory = GenFdsGlobalVariable.ConfDir
-        BuildConfigurationFile = os.path.normpath(os.path.join(ConfDirectoryPath, gDefaultTargetTxtFile))
+        BuildConfigurationFile = os.path.normpath(
+            os.path.join(ConfDirectoryPath, gDefaultTargetTxtFile))
         if os.path.isfile(BuildConfigurationFile) == True:
             # if no build target given in command line, get it from target.txt
             TargetObj = TargetTxtDict()
@@ -215,21 +236,25 @@ def GenFdsApi(FdsCommandDict, WorkSpaceDataBase=None):
             if not GenFdsGlobalVariable.TargetName:
                 BuildTargetList = TargetTxt.TargetTxtDictionary[TAB_TAT_DEFINES_TARGET]
                 if len(BuildTargetList) != 1:
-                    EdkLogger.error("GenFds", OPTION_VALUE_INVALID, ExtraData="Only allows one instance for Target.")
+                    EdkLogger.error("GenFds", OPTION_VALUE_INVALID,
+                                    ExtraData="Only allows one instance for Target.")
                 GenFdsGlobalVariable.TargetName = BuildTargetList[0]
 
             # if no tool chain given in command line, get it from target.txt
             if not GenFdsGlobalVariable.ToolChainTag:
                 ToolChainList = TargetTxt.TargetTxtDictionary[TAB_TAT_DEFINES_TOOL_CHAIN_TAG]
                 if ToolChainList is None or len(ToolChainList) == 0:
-                    EdkLogger.error("GenFds", RESOURCE_NOT_AVAILABLE, ExtraData="No toolchain given. Don't know how to build.")
+                    EdkLogger.error("GenFds", RESOURCE_NOT_AVAILABLE,
+                                    ExtraData="No toolchain given. Don't know how to build.")
                 if len(ToolChainList) != 1:
-                    EdkLogger.error("GenFds", OPTION_VALUE_INVALID, ExtraData="Only allows one instance for ToolChain.")
+                    EdkLogger.error("GenFds", OPTION_VALUE_INVALID,
+                                    ExtraData="Only allows one instance for ToolChain.")
                 GenFdsGlobalVariable.ToolChainTag = ToolChainList[0]
         else:
-            EdkLogger.error("GenFds", FILE_NOT_FOUND, ExtraData=BuildConfigurationFile)
+            EdkLogger.error("GenFds", FILE_NOT_FOUND,
+                            ExtraData=BuildConfigurationFile)
 
-        #Set global flag for build mode
+        # Set global flag for build mode
         GlobalData.gIgnoreSource = FdsCommandDict.get("IgnoreSources")
 
         if FdsCommandDict.get("macro"):
@@ -241,11 +266,14 @@ def GenFdsApi(FdsCommandDict, WorkSpaceDataBase=None):
                 List = Pair.split('=')
                 if len(List) == 2:
                     if not List[1].strip():
-                        EdkLogger.error("GenFds", OPTION_VALUE_INVALID, ExtraData="No Value given for Macro %s" %List[0])
+                        EdkLogger.error(
+                            "GenFds", OPTION_VALUE_INVALID, ExtraData="No Value given for Macro %s" % List[0])
                     if List[0].strip() in ["WORKSPACE", "TARGET", "TOOLCHAIN"]:
-                        GlobalData.gGlobalDefines[List[0].strip()] = List[1].strip()
+                        GlobalData.gGlobalDefines[List[0].strip(
+                        )] = List[1].strip()
                     else:
-                        GlobalData.gCommandLineDefines[List[0].strip()] = List[1].strip()
+                        GlobalData.gCommandLineDefines[List[0].strip(
+                        )] = List[1].strip()
                 else:
                     GlobalData.gCommandLineDefines[List[0].strip()] = "TRUE"
         os.environ["WORKSPACE"] = Workspace
@@ -259,7 +287,8 @@ def GenFdsApi(FdsCommandDict, WorkSpaceDataBase=None):
             GlobalData.gGlobalDefines['TOOL_CHAIN_TAG'] = GenFdsGlobalVariable.ToolChainTag
 
         """call Workspace build create database"""
-        GlobalData.gDatabasePath = os.path.normpath(os.path.join(ConfDirectoryPath, GlobalData.gDatabasePath))
+        GlobalData.gDatabasePath = os.path.normpath(
+            os.path.join(ConfDirectoryPath, GlobalData.gDatabasePath))
 
         if WorkSpaceDataBase:
             BuildWorkSpace = WorkSpaceDataBase
@@ -274,27 +303,35 @@ def GenFdsApi(FdsCommandDict, WorkSpaceDataBase=None):
         if FdsCommandDict.get("build_architecture_list"):
             ArchList = FdsCommandDict.get("build_architecture_list").split(',')
         else:
-            ArchList = BuildWorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, TAB_COMMON, FdsCommandDict.get("build_target"), FdsCommandDict.get("toolchain_tag")].SupArchList
+            ArchList = BuildWorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, TAB_COMMON, FdsCommandDict.get(
+                "build_target"), FdsCommandDict.get("toolchain_tag")].SupArchList
 
-        TargetArchList = set(BuildWorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, TAB_COMMON, FdsCommandDict.get("build_target"), FdsCommandDict.get("toolchain_tag")].SupArchList) & set(ArchList)
+        TargetArchList = set(BuildWorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, TAB_COMMON, FdsCommandDict.get(
+            "build_target"), FdsCommandDict.get("toolchain_tag")].SupArchList) & set(ArchList)
         if len(TargetArchList) == 0:
-            EdkLogger.error("GenFds", GENFDS_ERROR, "Target ARCH %s not in platform supported ARCH %s" % (str(ArchList), str(BuildWorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, TAB_COMMON].SupArchList)))
+            EdkLogger.error("GenFds", GENFDS_ERROR, "Target ARCH %s not in platform supported ARCH %s" % (str(
+                ArchList), str(BuildWorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, TAB_COMMON].SupArchList)))
 
         for Arch in ArchList:
-            GenFdsGlobalVariable.OutputDirFromDscDict[Arch] = NormPath(BuildWorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, FdsCommandDict.get("build_target"), FdsCommandDict.get("toolchain_tag")].OutputDirectory)
+            GenFdsGlobalVariable.OutputDirFromDscDict[Arch] = NormPath(BuildWorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, FdsCommandDict.get(
+                "build_target"), FdsCommandDict.get("toolchain_tag")].OutputDirectory)
 
         # assign platform name based on last entry in ArchList
-        GenFdsGlobalVariable.PlatformName = BuildWorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, ArchList[-1], FdsCommandDict.get("build_target"), FdsCommandDict.get("toolchain_tag")].PlatformName
+        GenFdsGlobalVariable.PlatformName = BuildWorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform,
+                                                                       ArchList[-1], FdsCommandDict.get("build_target"), FdsCommandDict.get("toolchain_tag")].PlatformName
 
         if FdsCommandDict.get("platform_build_directory"):
-            OutputDirFromCommandLine = GenFdsGlobalVariable.ReplaceWorkspaceMacro(FdsCommandDict.get("platform_build_directory"))
-            if not os.path.isabs (OutputDirFromCommandLine):
-                OutputDirFromCommandLine = os.path.join(GenFdsGlobalVariable.WorkSpaceDir, OutputDirFromCommandLine)
+            OutputDirFromCommandLine = GenFdsGlobalVariable.ReplaceWorkspaceMacro(
+                FdsCommandDict.get("platform_build_directory"))
+            if not os.path.isabs(OutputDirFromCommandLine):
+                OutputDirFromCommandLine = os.path.join(
+                    GenFdsGlobalVariable.WorkSpaceDir, OutputDirFromCommandLine)
             for Arch in ArchList:
                 GenFdsGlobalVariable.OutputDirDict[Arch] = OutputDirFromCommandLine
         else:
             for Arch in ArchList:
-                GenFdsGlobalVariable.OutputDirDict[Arch] = os.path.join(GenFdsGlobalVariable.OutputDirFromDscDict[Arch], GenFdsGlobalVariable.TargetName + '_' + GenFdsGlobalVariable.ToolChainTag)
+                GenFdsGlobalVariable.OutputDirDict[Arch] = os.path.join(
+                    GenFdsGlobalVariable.OutputDirFromDscDict[Arch], GenFdsGlobalVariable.TargetName + '_' + GenFdsGlobalVariable.ToolChainTag)
 
         for Key in GenFdsGlobalVariable.OutputDirDict:
             OutputDir = GenFdsGlobalVariable.OutputDirDict[Key]
@@ -302,7 +339,8 @@ def GenFdsApi(FdsCommandDict, WorkSpaceDataBase=None):
                 OutputDir = os.path.abspath(OutputDir)
 
             if OutputDir[1] != ':':
-                OutputDir = os.path.join (GenFdsGlobalVariable.WorkSpaceDir, OutputDir)
+                OutputDir = os.path.join(
+                    GenFdsGlobalVariable.WorkSpaceDir, OutputDir)
 
             if not os.path.exists(OutputDir):
                 EdkLogger.error("GenFds", FILE_NOT_FOUND, ExtraData=OutputDir)
@@ -316,7 +354,8 @@ def GenFdsApi(FdsCommandDict, WorkSpaceDataBase=None):
             FdfParserObj.ParseFile()
 
         if FdfParserObj.CycleReferenceCheck():
-            EdkLogger.error("GenFds", FORMAT_NOT_SUPPORTED, "Cycle Reference Detected in FDF file")
+            EdkLogger.error("GenFds", FORMAT_NOT_SUPPORTED,
+                            "Cycle Reference Detected in FDF file")
 
         if FdsCommandDict.get("fd"):
             if FdsCommandDict.get("fd")[0].upper() in FdfParserObj.Profile.FdDict:
@@ -347,7 +386,8 @@ def GenFdsApi(FdsCommandDict, WorkSpaceDataBase=None):
 
         """Modify images from build output if the feature of loading driver at fixed address is on."""
         if GenFdsGlobalVariable.FixedLoadAddress:
-            GenFds.PreprocessImage(BuildWorkSpace, GenFdsGlobalVariable.ActivePlatform)
+            GenFds.PreprocessImage(
+                BuildWorkSpace, GenFdsGlobalVariable.ActivePlatform)
 
         # Record the FV Region info that may specific in the FD
         if FdfParserObj.Profile.FvDict and FdfParserObj.Profile.FdDict:
@@ -359,13 +399,16 @@ def GenFdsApi(FdsCommandDict, WorkSpaceDataBase=None):
                         for RegionData in RegionObj.RegionDataList:
                             if FvObj.UiFvName.upper() == RegionData.upper():
                                 if not FvObj.BaseAddress:
-                                    FvObj.BaseAddress = '0x%x' % (int(FdObj.BaseAddress, 0) + RegionObj.Offset)
+                                    FvObj.BaseAddress = '0x%x' % (
+                                        int(FdObj.BaseAddress, 0) + RegionObj.Offset)
                                 if FvObj.FvRegionInFD:
                                     if FvObj.FvRegionInFD != RegionObj.Size:
-                                        EdkLogger.error("GenFds", FORMAT_INVALID, "The FV %s's region is specified in multiple FD with different value." %FvObj.UiFvName)
+                                        EdkLogger.error(
+                                            "GenFds", FORMAT_INVALID, "The FV %s's region is specified in multiple FD with different value." % FvObj.UiFvName)
                                 else:
                                     FvObj.FvRegionInFD = RegionObj.Size
-                                    RegionObj.BlockInfoOfRegion(FdObj.BlockSizeList, FvObj)
+                                    RegionObj.BlockInfoOfRegion(
+                                        FdObj.BlockSizeList, FvObj)
 
         """Call GenFds"""
         GenFds.GenFd('', FdfParserObj, BuildWorkSpace, ArchList)
@@ -377,7 +420,8 @@ def GenFdsApi(FdsCommandDict, WorkSpaceDataBase=None):
         GenFds.DisplayFvSpaceInfo(FdfParserObj)
 
     except Warning as X:
-        EdkLogger.error(X.ToolName, FORMAT_INVALID, File=X.FileName, Line=X.LineNumber, ExtraData=X.Message, RaiseError=False)
+        EdkLogger.error(X.ToolName, FORMAT_INVALID, File=X.FileName,
+                        Line=X.LineNumber, ExtraData=X.Message, RaiseError=False)
         ReturnCode = FORMAT_INVALID
     except FatalError as X:
         if FdsCommandDict.get("debug") is not None:
@@ -387,18 +431,19 @@ def GenFdsApi(FdsCommandDict, WorkSpaceDataBase=None):
     except:
         import traceback
         EdkLogger.error(
-                    "\nPython",
-                    CODE_ERROR,
-                    "Tools code failure",
-                    ExtraData="Please send email to %s for help, attaching following call stack trace!\n" % MSG_EDKII_MAIL_ADDR,
-                    RaiseError=False
-                    )
+            "\nPython",
+            CODE_ERROR,
+            "Tools code failure",
+            ExtraData="Please send email to %s for help, attaching following call stack trace!\n" % MSG_EDKII_MAIL_ADDR,
+            RaiseError=False
+        )
         EdkLogger.quiet(traceback.format_exc())
         ReturnCode = CODE_ERROR
     finally:
         ClearDuplicatedInf()
     return ReturnCode
 
+
 def OptionsToCommandDict(Options):
     FdsCommandDict = {}
     FdsCommandDict["verbose"] = Options.verbose
@@ -407,7 +452,8 @@ def OptionsToCommandDict(Options):
     FdsCommandDict["debug"] = Options.debug
     FdsCommandDict["Workspace"] = Options.Workspace
     FdsCommandDict["GenfdsMultiThread"] = not Options.NoGenfdsMultiThread
-    FdsCommandDict["fdf_file"] = [PathClass(Options.filename)] if Options.filename else []
+    FdsCommandDict["fdf_file"] = [
+        PathClass(Options.filename)] if Options.filename else []
     FdsCommandDict["build_target"] = Options.BuildTarget
     FdsCommandDict["toolchain_tag"] = Options.ToolChain
     FdsCommandDict["active_platform"] = Options.activePlatform
@@ -424,65 +470,88 @@ def OptionsToCommandDict(Options):
 
 
 gParamCheck = []
+
+
 def SingleCheckCallback(option, opt_str, value, parser):
     if option not in gParamCheck:
         setattr(parser.values, option.dest, value)
         gParamCheck.append(option)
     else:
-        parser.error("Option %s only allows one instance in command line!" % option)
+        parser.error(
+            "Option %s only allows one instance in command line!" % option)
 
-## Parse command line options
+# Parse command line options
 #
 # Using standard Python module optparse to parse command line option of this tool.
 #
 #   @retval Opt   A optparse.Values object containing the parsed options
 #
+
+
 def myOptionParser():
     usage = "%prog [options] -f input_file -a arch_list -b build_target -p active_platform -t tool_chain_tag -D \"MacroName [= MacroValue]\""
-    Parser = OptionParser(usage=usage, description=__copyright__, version="%prog " + str(versionNumber))
-    Parser.add_option("-f", "--file", dest="filename", type="string", help="Name of FDF file to convert", action="callback", callback=SingleCheckCallback)
-    Parser.add_option("-a", "--arch", dest="archList", help="comma separated list containing one or more of: IA32, X64, IPF, ARM, AARCH64 or EBC which should be built, overrides target.txt?s TARGET_ARCH")
-    Parser.add_option("-q", "--quiet", action="store_true", type=None, help="Disable all messages except FATAL ERRORS.")
-    Parser.add_option("-v", "--verbose", action="store_true", type=None, help="Turn on verbose output with informational messages printed.")
-    Parser.add_option("-d", "--debug", action="store", type="int", help="Enable debug messages at specified level.")
+    Parser = OptionParser(usage=usage, description=__copyright__,
+                          version="%prog " + str(versionNumber))
+    Parser.add_option("-f", "--file", dest="filename", type="string",
+                      help="Name of FDF file to convert", action="callback", callback=SingleCheckCallback)
+    Parser.add_option("-a", "--arch", dest="archList",
+                      help="comma separated list containing one or more of: IA32, X64, IPF, ARM, AARCH64 or EBC which should be built, overrides target.txt?s TARGET_ARCH")
+    Parser.add_option("-q", "--quiet", action="store_true",
+                      type=None, help="Disable all messages except FATAL ERRORS.")
+    Parser.add_option("-v", "--verbose", action="store_true", type=None,
+                      help="Turn on verbose output with informational messages printed.")
+    Parser.add_option("-d", "--debug", action="store", type="int",
+                      help="Enable debug messages at specified level.")
     Parser.add_option("-p", "--platform", type="string", dest="activePlatform", help="Set the ACTIVE_PLATFORM, overrides target.txt ACTIVE_PLATFORM setting.",
                       action="callback", callback=SingleCheckCallback)
     Parser.add_option("-w", "--workspace", type="string", dest="Workspace", default=os.environ.get('WORKSPACE'), help="Set the WORKSPACE",
                       action="callback", callback=SingleCheckCallback)
     Parser.add_option("-o", "--outputDir", type="string", dest="outputDir", help="Name of Build Output directory",
                       action="callback", callback=SingleCheckCallback)
-    Parser.add_option("-r", "--rom_image", dest="uiFdName", help="Build the image using the [FD] section named by FdUiName.")
-    Parser.add_option("-i", "--FvImage", dest="uiFvName", help="Build the FV image using the [FV] section named by UiFvName")
-    Parser.add_option("-C", "--CapsuleImage", dest="uiCapName", help="Build the Capsule image using the [Capsule] section named by UiCapName")
+    Parser.add_option("-r", "--rom_image", dest="uiFdName",
+                      help="Build the image using the [FD] section named by FdUiName.")
+    Parser.add_option("-i", "--FvImage", dest="uiFvName",
+                      help="Build the FV image using the [FV] section named by UiFvName")
+    Parser.add_option("-C", "--CapsuleImage", dest="uiCapName",
+                      help="Build the Capsule image using the [Capsule] section named by UiCapName")
     Parser.add_option("-b", "--buildtarget", type="string", dest="BuildTarget", help="Set the build TARGET, overrides target.txt TARGET setting.",
                       action="callback", callback=SingleCheckCallback)
     Parser.add_option("-t", "--tagname", type="string", dest="ToolChain", help="Using the tools: TOOL_CHAIN_TAG name to build the platform.",
                       action="callback", callback=SingleCheckCallback)
-    Parser.add_option("-D", "--define", action="append", type="string", dest="Macros", help="Macro: \"Name [= Value]\".")
-    Parser.add_option("-s", "--specifyaddress", dest="FixedAddress", action="store_true", type=None, help="Specify driver load address.")
-    Parser.add_option("--conf", action="store", type="string", dest="ConfDirectory", help="Specify the customized Conf directory.")
-    Parser.add_option("--ignore-sources", action="store_true", dest="IgnoreSources", default=False, help="Focus to a binary build and ignore all source files")
-    Parser.add_option("--pcd", action="append", dest="OptionPcd", help="Set PCD value by command line. Format: \"PcdName=Value\" ")
-    Parser.add_option("--genfds-multi-thread", action="store_true", dest="GenfdsMultiThread", default=True, help="Enable GenFds multi thread to generate ffs file.")
-    Parser.add_option("--no-genfds-multi-thread", action="store_true", dest="NoGenfdsMultiThread", default=False, help="Disable GenFds multi thread to generate ffs file.")
+    Parser.add_option("-D", "--define", action="append", type="string",
+                      dest="Macros", help="Macro: \"Name [= Value]\".")
+    Parser.add_option("-s", "--specifyaddress", dest="FixedAddress",
+                      action="store_true", type=None, help="Specify driver load address.")
+    Parser.add_option("--conf", action="store", type="string",
+                      dest="ConfDirectory", help="Specify the customized Conf directory.")
+    Parser.add_option("--ignore-sources", action="store_true", dest="IgnoreSources",
+                      default=False, help="Focus to a binary build and ignore all source files")
+    Parser.add_option("--pcd", action="append", dest="OptionPcd",
+                      help="Set PCD value by command line. Format: \"PcdName=Value\" ")
+    Parser.add_option("--genfds-multi-thread", action="store_true", dest="GenfdsMultiThread",
+                      default=True, help="Enable GenFds multi thread to generate ffs file.")
+    Parser.add_option("--no-genfds-multi-thread", action="store_true", dest="NoGenfdsMultiThread",
+                      default=False, help="Disable GenFds multi thread to generate ffs file.")
 
     Options, _ = Parser.parse_args()
     return Options
 
-## The class implementing the EDK2 flash image generation process
+# The class implementing the EDK2 flash image generation process
 #
 #   This process includes:
 #       1. Collect workspace information, includes platform and module information
 #       2. Call methods of Fd class to generate FD
 #       3. Call methods of Fv class to generate FV that not belong to FD
 #
+
+
 class GenFds(object):
     FdfParsef = None
     OnlyGenerateThisFd = None
     OnlyGenerateThisFv = None
     OnlyGenerateThisCap = None
 
-    ## GenFd()
+    # GenFd()
     #
     #   @param  OutputDir           Output directory
     #   @param  FdfParserObject     FDF contents parser
@@ -490,18 +559,21 @@ class GenFds(object):
     #   @param  ArchList            The Arch list of platform
     #
     @staticmethod
-    def GenFd (OutputDir, FdfParserObject, WorkSpace, ArchList):
-        GenFdsGlobalVariable.SetDir ('', FdfParserObject, WorkSpace, ArchList)
+    def GenFd(OutputDir, FdfParserObject, WorkSpace, ArchList):
+        GenFdsGlobalVariable.SetDir('', FdfParserObject, WorkSpace, ArchList)
 
-        GenFdsGlobalVariable.VerboseLogger(" Generate all Fd images and their required FV and Capsule images!")
+        GenFdsGlobalVariable.VerboseLogger(
+            " Generate all Fd images and their required FV and Capsule images!")
         if GenFds.OnlyGenerateThisCap is not None and GenFds.OnlyGenerateThisCap.upper() in GenFdsGlobalVariable.FdfParser.Profile.CapsuleDict:
-            CapsuleObj = GenFdsGlobalVariable.FdfParser.Profile.CapsuleDict[GenFds.OnlyGenerateThisCap.upper()]
+            CapsuleObj = GenFdsGlobalVariable.FdfParser.Profile.CapsuleDict[GenFds.OnlyGenerateThisCap.upper(
+            )]
             if CapsuleObj is not None:
                 CapsuleObj.GenCapsule()
                 return
 
         if GenFds.OnlyGenerateThisFd is not None and GenFds.OnlyGenerateThisFd.upper() in GenFdsGlobalVariable.FdfParser.Profile.FdDict:
-            FdObj = GenFdsGlobalVariable.FdfParser.Profile.FdDict[GenFds.OnlyGenerateThisFd.upper()]
+            FdObj = GenFdsGlobalVariable.FdfParser.Profile.FdDict[GenFds.OnlyGenerateThisFd.upper(
+            )]
             if FdObj is not None:
                 FdObj.GenFd()
                 return
@@ -511,7 +583,8 @@ class GenFds(object):
 
         GenFdsGlobalVariable.VerboseLogger("\n Generate other FV images! ")
         if GenFds.OnlyGenerateThisFv is not None and GenFds.OnlyGenerateThisFv.upper() in GenFdsGlobalVariable.FdfParser.Profile.FvDict:
-            FvObj = GenFdsGlobalVariable.FdfParser.Profile.FvDict[GenFds.OnlyGenerateThisFv.upper()]
+            FvObj = GenFdsGlobalVariable.FdfParser.Profile.FvDict[GenFds.OnlyGenerateThisFv.upper(
+            )]
             if FvObj is not None:
                 Buffer = BytesIO()
                 FvObj.AddToBuffer(Buffer)
@@ -525,18 +598,21 @@ class GenFds(object):
 
         if GenFds.OnlyGenerateThisFv is None and GenFds.OnlyGenerateThisFd is None and GenFds.OnlyGenerateThisCap is None:
             if GenFdsGlobalVariable.FdfParser.Profile.CapsuleDict != {}:
-                GenFdsGlobalVariable.VerboseLogger("\n Generate other Capsule images!")
+                GenFdsGlobalVariable.VerboseLogger(
+                    "\n Generate other Capsule images!")
                 for CapsuleObj in GenFdsGlobalVariable.FdfParser.Profile.CapsuleDict.values():
                     CapsuleObj.GenCapsule()
 
             if GenFdsGlobalVariable.FdfParser.Profile.OptRomDict != {}:
-                GenFdsGlobalVariable.VerboseLogger("\n Generate all Option ROM!")
+                GenFdsGlobalVariable.VerboseLogger(
+                    "\n Generate all Option ROM!")
                 for OptRomObj in GenFdsGlobalVariable.FdfParser.Profile.OptRomDict.values():
                     OptRomObj.AddToBuffer(None)
 
     @staticmethod
     def GenFfsMakefile(OutputDir, FdfParserObject, WorkSpace, ArchList, GlobalData):
-        GenFdsGlobalVariable.SetEnv(FdfParserObject, WorkSpace, ArchList, GlobalData)
+        GenFdsGlobalVariable.SetEnv(
+            FdfParserObject, WorkSpace, ArchList, GlobalData)
         for FdObj in GenFdsGlobalVariable.FdfParser.Profile.FdDict.values():
             FdObj.GenFd(Flag=True)
 
@@ -549,7 +625,7 @@ class GenFds(object):
 
         return GenFdsGlobalVariable.FfsCmdDict
 
-    ## GetFvBlockSize()
+    # GetFvBlockSize()
     #
     #   @param  FvObj           Whose block size to get
     #   @retval int             Block size value
@@ -559,7 +635,8 @@ class GenFds(object):
         DefaultBlockSize = 0x1
         FdObj = None
         if GenFds.OnlyGenerateThisFd is not None and GenFds.OnlyGenerateThisFd.upper() in GenFdsGlobalVariable.FdfParser.Profile.FdDict:
-            FdObj = GenFdsGlobalVariable.FdfParser.Profile.FdDict[GenFds.OnlyGenerateThisFd.upper()]
+            FdObj = GenFdsGlobalVariable.FdfParser.Profile.FdDict[GenFds.OnlyGenerateThisFd.upper(
+            )]
         if FdObj is None:
             for ElementFd in GenFdsGlobalVariable.FdfParser.Profile.FdDict.values():
                 for ElementRegion in ElementFd.RegionList:
@@ -575,16 +652,16 @@ class GenFds(object):
             return DefaultBlockSize
         else:
             for ElementRegion in FdObj.RegionList:
-                    if ElementRegion.RegionType == BINARY_FILE_TYPE_FV:
-                        for ElementRegionData in ElementRegion.RegionDataList:
-                            if ElementRegionData is not None and ElementRegionData.upper() == FvObj.UiFvName:
-                                if FvObj.BlockSizeList != []:
-                                    return FvObj.BlockSizeList[0][0]
-                                else:
-                                    return ElementRegion.BlockSizeOfRegion(ElementFd.BlockSizeList)
+                if ElementRegion.RegionType == BINARY_FILE_TYPE_FV:
+                    for ElementRegionData in ElementRegion.RegionDataList:
+                        if ElementRegionData is not None and ElementRegionData.upper() == FvObj.UiFvName:
+                            if FvObj.BlockSizeList != []:
+                                return FvObj.BlockSizeList[0][0]
+                            else:
+                                return ElementRegion.BlockSizeOfRegion(ElementFd.BlockSizeList)
             return DefaultBlockSize
 
-    ## DisplayFvSpaceInfo()
+    # DisplayFvSpaceInfo()
     #
     #   @param  FvObj           Whose block size to get
     #   @retval None
@@ -597,7 +674,8 @@ class GenFds(object):
         for FvName in FdfParserObject.Profile.FvDict:
             if len(FvName) > MaxFvNameLength:
                 MaxFvNameLength = len(FvName)
-            FvSpaceInfoFileName = os.path.join(GenFdsGlobalVariable.FvDir, FvName.upper() + '.Fv.map')
+            FvSpaceInfoFileName = os.path.join(
+                GenFdsGlobalVariable.FvDir, FvName.upper() + '.Fv.map')
             if os.path.exists(FvSpaceInfoFileName):
                 FileLinesList = getlines(FvSpaceInfoFileName)
                 TotalFound = False
@@ -631,14 +709,18 @@ class GenFds(object):
             if UsedSizeValue == TotalSizeValue:
                 Percentage = '100'
             else:
-                Percentage = str((UsedSizeValue + 0.0) / TotalSizeValue)[0:4].lstrip('0.')
+                Percentage = str((UsedSizeValue + 0.0) /
+                                 TotalSizeValue)[0:4].lstrip('0.')
 
-            GenFdsGlobalVariable.InfLogger(Name + ' ' + '[' + Percentage + '%Full] '\
-                                           + str(TotalSizeValue) + ' (' + hex(TotalSizeValue) + ')' + ' total, '\
-                                           + str(UsedSizeValue) + ' (' + hex(UsedSizeValue) + ')' + ' used, '\
+            GenFdsGlobalVariable.InfLogger(Name + ' ' + '[' + Percentage + '%Full] '
+                                           + str(TotalSizeValue) + ' (' +
+                                           hex(TotalSizeValue) +
+                                           ')' + ' total, '
+                                           + str(UsedSizeValue) + ' (' +
+                                           hex(UsedSizeValue) + ')' + ' used, '
                                            + str(FreeSizeValue) + ' (' + hex(FreeSizeValue) + ')' + ' free')
 
-    ## PreprocessImage()
+    # PreprocessImage()
     #
     #   @param  BuildDb         Database from build meta data files
     #   @param  DscFile         modules from dsc file will be preprocessed
@@ -646,7 +728,8 @@ class GenFds(object):
     #
     @staticmethod
     def PreprocessImage(BuildDb, DscFile):
-        PcdDict = BuildDb.BuildObject[DscFile, TAB_COMMON, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag].Pcds
+        PcdDict = BuildDb.BuildObject[DscFile, TAB_COMMON,
+                                      GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag].Pcds
         PcdValue = ''
         for Key in PcdDict:
             PcdObj = PcdDict[Key]
@@ -665,14 +748,17 @@ class GenFds(object):
         if Int64PcdValue > 0:
             TopAddress = Int64PcdValue
 
-        ModuleDict = BuildDb.BuildObject[DscFile, TAB_COMMON, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag].Modules
+        ModuleDict = BuildDb.BuildObject[DscFile, TAB_COMMON,
+                                         GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag].Modules
         for Key in ModuleDict:
-            ModuleObj = BuildDb.BuildObject[Key, TAB_COMMON, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
+            ModuleObj = BuildDb.BuildObject[Key, TAB_COMMON,
+                                            GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
             print(ModuleObj.BaseName + ' ' + ModuleObj.ModuleType)
 
     @staticmethod
     def GenerateGuidXRefFile(BuildDb, ArchList, FdfParserObj):
-        GuidXRefFileName = os.path.join(GenFdsGlobalVariable.FvDir, "Guid.xref")
+        GuidXRefFileName = os.path.join(
+            GenFdsGlobalVariable.FvDir, "Guid.xref")
         GuidXRefFile = []
         PkgGuidDict = {}
         GuidDict = {}
@@ -680,8 +766,10 @@ class GenFds(object):
         FileGuidList = []
         VariableGuidSet = set()
         for Arch in ArchList:
-            PlatformDataBase = BuildDb.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
-            PkgList = GenFdsGlobalVariable.WorkSpace.GetPackageList(GenFdsGlobalVariable.ActivePlatform, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag)
+            PlatformDataBase = BuildDb.BuildObject[GenFdsGlobalVariable.ActivePlatform,
+                                                   Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
+            PkgList = GenFdsGlobalVariable.WorkSpace.GetPackageList(
+                GenFdsGlobalVariable.ActivePlatform, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag)
             for P in PkgList:
                 PkgGuidDict.update(P.Guids)
             for Name, Guid in PlatformDataBase.Pcds:
@@ -689,33 +777,40 @@ class GenFds(object):
                 if Pcd.Type in [TAB_PCDS_DYNAMIC_HII, TAB_PCDS_DYNAMIC_EX_HII]:
                     for SkuId in Pcd.SkuInfoList:
                         Sku = Pcd.SkuInfoList[SkuId]
-                        if Sku.VariableGuid in VariableGuidSet:continue
+                        if Sku.VariableGuid in VariableGuidSet:
+                            continue
                         VariableGuidSet.add(Sku.VariableGuid)
                         if Sku.VariableGuid and Sku.VariableGuid in PkgGuidDict.keys():
                             GuidDict[Sku.VariableGuid] = PkgGuidDict[Sku.VariableGuid]
             for ModuleFile in PlatformDataBase.Modules:
-                Module = BuildDb.BuildObject[ModuleFile, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
+                Module = BuildDb.BuildObject[ModuleFile, Arch,
+                                             GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
                 if Module in ModuleList:
                     continue
                 else:
                     ModuleList.append(Module)
                 if GlobalData.gGuidPattern.match(ModuleFile.BaseName):
-                    GuidXRefFile.append("%s %s\n" % (ModuleFile.BaseName, Module.BaseName))
+                    GuidXRefFile.append("%s %s\n" %
+                                        (ModuleFile.BaseName, Module.BaseName))
                 else:
-                    GuidXRefFile.append("%s %s\n" % (Module.Guid, Module.BaseName))
+                    GuidXRefFile.append("%s %s\n" %
+                                        (Module.Guid, Module.BaseName))
                 GuidDict.update(Module.Protocols)
                 GuidDict.update(Module.Guids)
                 GuidDict.update(Module.Ppis)
             for FvName in FdfParserObj.Profile.FvDict:
                 for FfsObj in FdfParserObj.Profile.FvDict[FvName].FfsList:
                     if not isinstance(FfsObj, FileStatement):
-                        InfPath = PathClass(NormPath(mws.join(GenFdsGlobalVariable.WorkSpaceDir, FfsObj.InfFileName)))
-                        FdfModule = BuildDb.BuildObject[InfPath, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
+                        InfPath = PathClass(
+                            NormPath(mws.join(GenFdsGlobalVariable.WorkSpaceDir, FfsObj.InfFileName)))
+                        FdfModule = BuildDb.BuildObject[InfPath, Arch,
+                                                        GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
                         if FdfModule in ModuleList:
                             continue
                         else:
                             ModuleList.append(FdfModule)
-                        GuidXRefFile.append("%s %s\n" % (FdfModule.Guid, FdfModule.BaseName))
+                        GuidXRefFile.append("%s %s\n" % (
+                            FdfModule.Guid, FdfModule.BaseName))
                         GuidDict.update(FdfModule.Protocols)
                         GuidDict.update(FdfModule.Guids)
                         GuidDict.update(FdfModule.Ppis)
@@ -726,21 +821,25 @@ class GenFds(object):
                         else:
                             FileGuidList.append(FileStatementGuid)
                         Name = []
-                        FfsPath = os.path.join(GenFdsGlobalVariable.FvDir, 'Ffs')
-                        FfsPath = glob(os.path.join(FfsPath, FileStatementGuid) + TAB_STAR)
+                        FfsPath = os.path.join(
+                            GenFdsGlobalVariable.FvDir, 'Ffs')
+                        FfsPath = glob(os.path.join(
+                            FfsPath, FileStatementGuid) + TAB_STAR)
                         if not FfsPath:
                             continue
                         if not os.path.exists(FfsPath[0]):
                             continue
                         MatchDict = {}
-                        ReFileEnds = compile('\S+(.ui)$|\S+(fv.sec.txt)$|\S+(.pe32.txt)$|\S+(.te.txt)$|\S+(.pic.txt)$|\S+(.raw.txt)$|\S+(.ffs.txt)$')
+                        ReFileEnds = compile(
+                            '\S+(.ui)$|\S+(fv.sec.txt)$|\S+(.pe32.txt)$|\S+(.te.txt)$|\S+(.pic.txt)$|\S+(.raw.txt)$|\S+(.ffs.txt)$')
                         FileList = os.listdir(FfsPath[0])
                         for File in FileList:
                             Match = ReFileEnds.search(File)
                             if Match:
                                 for Index in range(1, 8):
                                     if Match.group(Index) and Match.group(Index) in MatchDict:
-                                        MatchDict[Match.group(Index)].append(File)
+                                        MatchDict[Match.group(
+                                            Index)].append(File)
                                     elif Match.group(Index):
                                         MatchDict[Match.group(Index)] = [File]
                         if not MatchDict:
@@ -751,7 +850,8 @@ class GenFds(object):
                                     F.read()
                                     length = F.tell()
                                     F.seek(4)
-                                    TmpStr = unpack('%dh' % ((length - 4) // 2), F.read())
+                                    TmpStr = unpack(
+                                        '%dh' % ((length - 4) // 2), F.read())
                                     Name = ''.join(chr(c) for c in TmpStr[:-1])
                         else:
                             FileList = []
@@ -775,26 +875,29 @@ class GenFds(object):
                         if not Name:
                             continue
 
-                        Name = ' '.join(Name) if isinstance(Name, type([])) else Name
-                        GuidXRefFile.append("%s %s\n" %(FileStatementGuid, Name))
+                        Name = ' '.join(Name) if isinstance(
+                            Name, type([])) else Name
+                        GuidXRefFile.append("%s %s\n" %
+                                            (FileStatementGuid, Name))
 
        # Append GUIDs, Protocols, and PPIs to the Xref file
         GuidXRefFile.append("\n")
         for key, item in GuidDict.items():
-            GuidXRefFile.append("%s %s\n" % (GuidStructureStringToGuidString(item).upper(), key))
+            GuidXRefFile.append("%s %s\n" % (
+                GuidStructureStringToGuidString(item).upper(), key))
 
         if GuidXRefFile:
             GuidXRefFile = ''.join(GuidXRefFile)
             SaveFileOnChange(GuidXRefFileName, GuidXRefFile, False)
-            GenFdsGlobalVariable.InfLogger("\nGUID cross reference file can be found at %s" % GuidXRefFileName)
+            GenFdsGlobalVariable.InfLogger(
+                "\nGUID cross reference file can be found at %s" % GuidXRefFileName)
         elif os.path.exists(GuidXRefFileName):
             os.remove(GuidXRefFileName)
 
 
 if __name__ == '__main__':
     r = main()
-    ## 0-127 is a safe return range, and 1 is a standard default error
+    # 0-127 is a safe return range, and 1 is a standard default error
     if r < 0 or r > 127:
         r = 1
     exit(r)
-
diff --git a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
index d7668ba681aa..8b68d60aca68 100644
--- a/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
+++ b/BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Global variables for GenFds
 #
 #  Copyright (c) 2007 - 2021, Intel Corporation. All rights reserved.<BR>
@@ -15,28 +15,30 @@ from __future__ import absolute_import
 import Common.LongFilePathOs as os
 import sys
 from sys import stdout
-from subprocess import PIPE,Popen
+from subprocess import PIPE, Popen
 from struct import Struct
 from array import array
 
-from Common.BuildToolError import COMMAND_FAILURE,GENFDS_ERROR
+from Common.BuildToolError import COMMAND_FAILURE, GENFDS_ERROR
 from Common import EdkLogger
 from Common.Misc import SaveFileOnChange
 
 from Common.TargetTxtClassObject import TargetTxtDict
-from Common.ToolDefClassObject import ToolDefDict,gDefaultToolsDefFile
+from Common.ToolDefClassObject import ToolDefDict, gDefaultToolsDefFile
 from AutoGen.BuildEngine import ToolBuildRule
 import Common.DataType as DataType
-from Common.Misc import PathClass,CreateDirectory
+from Common.Misc import PathClass, CreateDirectory
 from Common.LongFilePathSupport import OpenLongFilePath as open
 from Common.MultipleWorkspace import MultipleWorkspace as mws
 import Common.GlobalData as GlobalData
 from Common.BuildToolError import *
 from AutoGen.AutoGen import CalculatePriorityValue
 
-## Global variables
+# Global variables
 #
 #
+
+
 class GenFdsGlobalVariable:
     FvDir = ''
     OutputDirDict = {}
@@ -70,7 +72,7 @@ class GenFdsGlobalVariable:
     GuidToolDefinition = {}
     FfsCmdDict = {}
     SecCmdList = []
-    CopyList   = []
+    CopyList = []
     ModuleFile = ''
     EnableGenfdsMultiThread = True
 
@@ -92,7 +94,7 @@ class GenFdsGlobalVariable:
     # FvName, FdName, CapName in FDF, Image file name
     ImageBinDict = {}
 
-    ## LoadBuildRule
+    # LoadBuildRule
     #
     @staticmethod
     def _LoadBuildRule():
@@ -101,24 +103,28 @@ class GenFdsGlobalVariable:
         BuildRule = ToolBuildRule()
         GenFdsGlobalVariable.__BuildRuleDatabase = BuildRule.ToolBuildRule
         TargetObj = TargetTxtDict()
-        ToolDefinitionFile = TargetObj.Target.TargetTxtDictionary[DataType.TAB_TAT_DEFINES_TOOL_CHAIN_CONF]
+        ToolDefinitionFile = TargetObj.Target.TargetTxtDictionary[
+            DataType.TAB_TAT_DEFINES_TOOL_CHAIN_CONF]
         if ToolDefinitionFile == '':
-            ToolDefinitionFile =  os.path.join('Conf', gDefaultToolsDefFile)
+            ToolDefinitionFile = os.path.join('Conf', gDefaultToolsDefFile)
         if os.path.isfile(ToolDefinitionFile):
-            ToolDefObj = ToolDefDict((os.path.join(os.getenv("WORKSPACE"), "Conf")))
+            ToolDefObj = ToolDefDict(
+                (os.path.join(os.getenv("WORKSPACE"), "Conf")))
             ToolDefinition = ToolDefObj.ToolDef.ToolsDefTxtDatabase
             if DataType.TAB_TOD_DEFINES_BUILDRULEFAMILY in ToolDefinition \
                and GenFdsGlobalVariable.ToolChainTag in ToolDefinition[DataType.TAB_TOD_DEFINES_BUILDRULEFAMILY] \
                and ToolDefinition[DataType.TAB_TOD_DEFINES_BUILDRULEFAMILY][GenFdsGlobalVariable.ToolChainTag]:
-                GenFdsGlobalVariable.BuildRuleFamily = ToolDefinition[DataType.TAB_TOD_DEFINES_BUILDRULEFAMILY][GenFdsGlobalVariable.ToolChainTag]
+                GenFdsGlobalVariable.BuildRuleFamily = ToolDefinition[
+                    DataType.TAB_TOD_DEFINES_BUILDRULEFAMILY][GenFdsGlobalVariable.ToolChainTag]
 
             if DataType.TAB_TOD_DEFINES_FAMILY in ToolDefinition \
                and GenFdsGlobalVariable.ToolChainTag in ToolDefinition[DataType.TAB_TOD_DEFINES_FAMILY] \
                and ToolDefinition[DataType.TAB_TOD_DEFINES_FAMILY][GenFdsGlobalVariable.ToolChainTag]:
-                GenFdsGlobalVariable.ToolChainFamily = ToolDefinition[DataType.TAB_TOD_DEFINES_FAMILY][GenFdsGlobalVariable.ToolChainTag]
+                GenFdsGlobalVariable.ToolChainFamily = ToolDefinition[
+                    DataType.TAB_TOD_DEFINES_FAMILY][GenFdsGlobalVariable.ToolChainTag]
         return GenFdsGlobalVariable.__BuildRuleDatabase
 
-    ## GetBuildRules
+    # GetBuildRules
     #    @param Inf: object of InfBuildData
     #    @param Arch: current arch
     #
@@ -144,44 +150,48 @@ class GenFdsGlobalVariable:
         )
         BinDir = os.path.join(GenFdsGlobalVariable.OutputDirDict[Arch], Arch)
         Macro = {
-        "WORKSPACE":GenFdsGlobalVariable.WorkSpaceDir,
-        "MODULE_NAME":Inf.BaseName,
-        "MODULE_GUID":Inf.Guid,
-        "MODULE_VERSION":Inf.Version,
-        "MODULE_TYPE":Inf.ModuleType,
-        "MODULE_FILE":str(PathClassObj),
-        "MODULE_FILE_BASE_NAME":PathClassObj.BaseName,
-        "MODULE_RELATIVE_DIR":PathClassObj.SubDir,
-        "MODULE_DIR":PathClassObj.SubDir,
-        "BASE_NAME":Inf.BaseName,
-        "ARCH":Arch,
-        "TOOLCHAIN":GenFdsGlobalVariable.ToolChainTag,
-        "TOOLCHAIN_TAG":GenFdsGlobalVariable.ToolChainTag,
-        "TOOL_CHAIN_TAG":GenFdsGlobalVariable.ToolChainTag,
-        "TARGET":GenFdsGlobalVariable.TargetName,
-        "BUILD_DIR":GenFdsGlobalVariable.OutputDirDict[Arch],
-        "BIN_DIR":BinDir,
-        "LIB_DIR":BinDir,
-        "MODULE_BUILD_DIR":BuildDir,
-        "OUTPUT_DIR":os.path.join(BuildDir, "OUTPUT"),
-        "DEBUG_DIR":os.path.join(BuildDir, "DEBUG")
+            "WORKSPACE": GenFdsGlobalVariable.WorkSpaceDir,
+            "MODULE_NAME": Inf.BaseName,
+            "MODULE_GUID": Inf.Guid,
+            "MODULE_VERSION": Inf.Version,
+            "MODULE_TYPE": Inf.ModuleType,
+            "MODULE_FILE": str(PathClassObj),
+            "MODULE_FILE_BASE_NAME": PathClassObj.BaseName,
+            "MODULE_RELATIVE_DIR": PathClassObj.SubDir,
+            "MODULE_DIR": PathClassObj.SubDir,
+            "BASE_NAME": Inf.BaseName,
+            "ARCH": Arch,
+            "TOOLCHAIN": GenFdsGlobalVariable.ToolChainTag,
+            "TOOLCHAIN_TAG": GenFdsGlobalVariable.ToolChainTag,
+            "TOOL_CHAIN_TAG": GenFdsGlobalVariable.ToolChainTag,
+            "TARGET": GenFdsGlobalVariable.TargetName,
+            "BUILD_DIR": GenFdsGlobalVariable.OutputDirDict[Arch],
+            "BIN_DIR": BinDir,
+            "LIB_DIR": BinDir,
+            "MODULE_BUILD_DIR": BuildDir,
+            "OUTPUT_DIR": os.path.join(BuildDir, "OUTPUT"),
+            "DEBUG_DIR": os.path.join(BuildDir, "DEBUG")
         }
 
         BuildRules = {}
         for Type in BuildRuleDatabase.FileTypeList:
-            #first try getting build rule by BuildRuleFamily
-            RuleObject = BuildRuleDatabase[Type, Inf.BuildType, Arch, GenFdsGlobalVariable.BuildRuleFamily]
+            # first try getting build rule by BuildRuleFamily
+            RuleObject = BuildRuleDatabase[Type, Inf.BuildType,
+                                           Arch, GenFdsGlobalVariable.BuildRuleFamily]
             if not RuleObject:
                 # build type is always module type, but ...
                 if Inf.ModuleType != Inf.BuildType:
-                    RuleObject = BuildRuleDatabase[Type, Inf.ModuleType, Arch, GenFdsGlobalVariable.BuildRuleFamily]
-            #second try getting build rule by ToolChainFamily
+                    RuleObject = BuildRuleDatabase[Type, Inf.ModuleType,
+                                                   Arch, GenFdsGlobalVariable.BuildRuleFamily]
+            # second try getting build rule by ToolChainFamily
             if not RuleObject:
-                RuleObject = BuildRuleDatabase[Type, Inf.BuildType, Arch, GenFdsGlobalVariable.ToolChainFamily]
+                RuleObject = BuildRuleDatabase[Type, Inf.BuildType,
+                                               Arch, GenFdsGlobalVariable.ToolChainFamily]
                 if not RuleObject:
                     # build type is always module type, but ...
                     if Inf.ModuleType != Inf.BuildType:
-                        RuleObject = BuildRuleDatabase[Type, Inf.ModuleType, Arch, GenFdsGlobalVariable.ToolChainFamily]
+                        RuleObject = BuildRuleDatabase[Type, Inf.ModuleType,
+                                                       Arch, GenFdsGlobalVariable.ToolChainFamily]
             if not RuleObject:
                 continue
             RuleObject = RuleObject.Instantiate(Macro)
@@ -190,7 +200,7 @@ class GenFdsGlobalVariable:
                 BuildRules[Ext] = RuleObject
         return BuildRules
 
-    ## GetModuleCodaTargetList
+    # GetModuleCodaTargetList
     #
     #    @param Inf: object of InfBuildData
     #    @param Arch: current arch
@@ -207,7 +217,7 @@ class GenFdsGlobalVariable:
         if not Inf.IsBinaryModule:
             for File in Inf.Sources:
                 if File.TagName in {"", DataType.TAB_STAR, GenFdsGlobalVariable.ToolChainTag} and \
-                    File.ToolChainFamily in {"", DataType.TAB_STAR, GenFdsGlobalVariable.ToolChainFamily}:
+                        File.ToolChainFamily in {"", DataType.TAB_STAR, GenFdsGlobalVariable.ToolChainFamily}:
                     FileList.append((File, DataType.TAB_UNKNOWN_FILE))
 
         for File in Inf.Binaries:
@@ -270,7 +280,7 @@ class GenFdsGlobalVariable:
 
         return list(TargetList)
 
-    ## SetDir()
+    # SetDir()
     #
     #   @param  OutputDir           Output directory
     #   @param  FdfParser           FDF contents parser
@@ -278,21 +288,25 @@ class GenFdsGlobalVariable:
     #   @param  ArchList            The Arch list of platform
     #
     @staticmethod
-    def SetDir (OutputDir, FdfParser, WorkSpace, ArchList):
-        GenFdsGlobalVariable.VerboseLogger("GenFdsGlobalVariable.OutputDir:%s" % OutputDir)
+    def SetDir(OutputDir, FdfParser, WorkSpace, ArchList):
+        GenFdsGlobalVariable.VerboseLogger(
+            "GenFdsGlobalVariable.OutputDir:%s" % OutputDir)
         GenFdsGlobalVariable.FdfParser = FdfParser
         GenFdsGlobalVariable.WorkSpace = WorkSpace
-        GenFdsGlobalVariable.FvDir = os.path.join(GenFdsGlobalVariable.OutputDirDict[ArchList[0]], DataType.TAB_FV_DIRECTORY)
+        GenFdsGlobalVariable.FvDir = os.path.join(
+            GenFdsGlobalVariable.OutputDirDict[ArchList[0]], DataType.TAB_FV_DIRECTORY)
         if not os.path.exists(GenFdsGlobalVariable.FvDir):
             os.makedirs(GenFdsGlobalVariable.FvDir)
-        GenFdsGlobalVariable.FfsDir = os.path.join(GenFdsGlobalVariable.FvDir, 'Ffs')
+        GenFdsGlobalVariable.FfsDir = os.path.join(
+            GenFdsGlobalVariable.FvDir, 'Ffs')
         if not os.path.exists(GenFdsGlobalVariable.FfsDir):
             os.makedirs(GenFdsGlobalVariable.FfsDir)
 
         #
         # Create FV Address inf file
         #
-        GenFdsGlobalVariable.FvAddressFileName = os.path.join(GenFdsGlobalVariable.FfsDir, 'FvAddress.inf')
+        GenFdsGlobalVariable.FvAddressFileName = os.path.join(
+            GenFdsGlobalVariable.FfsDir, 'FvAddress.inf')
         FvAddressFile = open(GenFdsGlobalVariable.FvAddressFileName, 'w')
         #
         # Add [Options]
@@ -301,23 +315,25 @@ class GenFdsGlobalVariable:
         BsAddress = '0'
         for Arch in ArchList:
             if GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag].BsBaseAddress:
-                BsAddress = GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag].BsBaseAddress
+                BsAddress = GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform,
+                                                                       Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag].BsBaseAddress
                 break
 
-        FvAddressFile.writelines("EFI_BOOT_DRIVER_BASE_ADDRESS = " + \
-                                       BsAddress + \
-                                       DataType.TAB_LINE_BREAK)
+        FvAddressFile.writelines("EFI_BOOT_DRIVER_BASE_ADDRESS = " +
+                                 BsAddress +
+                                 DataType.TAB_LINE_BREAK)
 
         RtAddress = '0'
         for Arch in reversed(ArchList):
-            temp = GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag].RtBaseAddress
+            temp = GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform,
+                                                              Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag].RtBaseAddress
             if temp:
                 RtAddress = temp
                 break
 
-        FvAddressFile.writelines("EFI_RUNTIME_DRIVER_BASE_ADDRESS = " + \
-                                       RtAddress + \
-                                       DataType.TAB_LINE_BREAK)
+        FvAddressFile.writelines("EFI_RUNTIME_DRIVER_BASE_ADDRESS = " +
+                                 RtAddress +
+                                 DataType.TAB_LINE_BREAK)
 
         FvAddressFile.close()
 
@@ -330,31 +346,34 @@ class GenFdsGlobalVariable:
         GenFdsGlobalVariable.ToolChainTag = GlobalData.gGlobalDefines["TOOL_CHAIN_TAG"]
         GenFdsGlobalVariable.TargetName = GlobalData.gGlobalDefines["TARGET"]
         GenFdsGlobalVariable.ActivePlatform = GlobalData.gActivePlatform
-        GenFdsGlobalVariable.ConfDir  = GlobalData.gConfDirectory
+        GenFdsGlobalVariable.ConfDir = GlobalData.gConfDirectory
         GenFdsGlobalVariable.EnableGenfdsMultiThread = GlobalData.gEnableGenfdsMultiThread
         for Arch in ArchList:
             GenFdsGlobalVariable.OutputDirDict[Arch] = os.path.normpath(
                 os.path.join(GlobalData.gWorkspace,
                              WorkSpace.Db.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, GlobalData.gGlobalDefines['TARGET'],
-                             GlobalData.gGlobalDefines['TOOLCHAIN']].OutputDirectory,
-                             GlobalData.gGlobalDefines['TARGET'] +'_' + GlobalData.gGlobalDefines['TOOLCHAIN']))
+                                                      GlobalData.gGlobalDefines['TOOLCHAIN']].OutputDirectory,
+                             GlobalData.gGlobalDefines['TARGET'] + '_' + GlobalData.gGlobalDefines['TOOLCHAIN']))
             GenFdsGlobalVariable.OutputDirFromDscDict[Arch] = os.path.normpath(
-                             WorkSpace.Db.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch,
-                             GlobalData.gGlobalDefines['TARGET'], GlobalData.gGlobalDefines['TOOLCHAIN']].OutputDirectory)
+                WorkSpace.Db.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch,
+                                         GlobalData.gGlobalDefines['TARGET'], GlobalData.gGlobalDefines['TOOLCHAIN']].OutputDirectory)
             GenFdsGlobalVariable.PlatformName = WorkSpace.Db.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch,
-                                                                      GlobalData.gGlobalDefines['TARGET'],
-                                                                      GlobalData.gGlobalDefines['TOOLCHAIN']].PlatformName
-        GenFdsGlobalVariable.FvDir = os.path.join(GenFdsGlobalVariable.OutputDirDict[ArchList[0]], DataType.TAB_FV_DIRECTORY)
+                                                                         GlobalData.gGlobalDefines['TARGET'],
+                                                                         GlobalData.gGlobalDefines['TOOLCHAIN']].PlatformName
+        GenFdsGlobalVariable.FvDir = os.path.join(
+            GenFdsGlobalVariable.OutputDirDict[ArchList[0]], DataType.TAB_FV_DIRECTORY)
         if not os.path.exists(GenFdsGlobalVariable.FvDir):
             os.makedirs(GenFdsGlobalVariable.FvDir)
-        GenFdsGlobalVariable.FfsDir = os.path.join(GenFdsGlobalVariable.FvDir, 'Ffs')
+        GenFdsGlobalVariable.FfsDir = os.path.join(
+            GenFdsGlobalVariable.FvDir, 'Ffs')
         if not os.path.exists(GenFdsGlobalVariable.FfsDir):
             os.makedirs(GenFdsGlobalVariable.FfsDir)
 
         #
         # Create FV Address inf file
         #
-        GenFdsGlobalVariable.FvAddressFileName = os.path.join(GenFdsGlobalVariable.FfsDir, 'FvAddress.inf')
+        GenFdsGlobalVariable.FvAddressFileName = os.path.join(
+            GenFdsGlobalVariable.FfsDir, 'FvAddress.inf')
         FvAddressFile = open(GenFdsGlobalVariable.FvAddressFileName, 'w')
         #
         # Add [Options]
@@ -368,8 +387,8 @@ class GenFdsGlobalVariable:
             if BsAddress:
                 break
 
-        FvAddressFile.writelines("EFI_BOOT_DRIVER_BASE_ADDRESS = " + \
-                                 BsAddress + \
+        FvAddressFile.writelines("EFI_BOOT_DRIVER_BASE_ADDRESS = " +
+                                 BsAddress +
                                  DataType.TAB_LINE_BREAK)
 
         RtAddress = '0'
@@ -381,13 +400,13 @@ class GenFdsGlobalVariable:
                 RtAddress = temp
                 break
 
-        FvAddressFile.writelines("EFI_RUNTIME_DRIVER_BASE_ADDRESS = " + \
-                                 RtAddress + \
+        FvAddressFile.writelines("EFI_RUNTIME_DRIVER_BASE_ADDRESS = " +
+                                 RtAddress +
                                  DataType.TAB_LINE_BREAK)
 
         FvAddressFile.close()
 
-    ## ReplaceWorkspaceMacro()
+    # ReplaceWorkspaceMacro()
     #
     #   @param  String           String that may contain macro
     #
@@ -402,7 +421,7 @@ class GenFdsGlobalVariable:
             Str = mws.join(GenFdsGlobalVariable.WorkSpaceDir, String)
         return os.path.normpath(Str)
 
-    ## Check if the input files are newer than output files
+    # Check if the input files are newer than output files
     #
     #   @param  Output          Path of output file
     #   @param  Input           Path list of input files
@@ -446,10 +465,10 @@ class GenFdsGlobalVariable:
             Cmd += ("--dummy", DummyFile)
         if GuidHdrLen:
             Cmd += ("-l", GuidHdrLen)
-        #Add each guided attribute
+        # Add each guided attribute
         for Attr in GuidAttr:
             Cmd += ("-r", Attr)
-        #Section Align is only for dummy section without section type
+        # Section Align is only for dummy section without section type
         for SecAlign in InputAlign:
             Cmd += ("--sectionalign", SecAlign)
 
@@ -462,31 +481,36 @@ class GenFdsGlobalVariable:
                     Cmd += ("-n", '"' + Ui + '"')
                 Cmd += ("-o", Output)
                 if ' '.join(Cmd).strip() not in GenFdsGlobalVariable.SecCmdList:
-                    GenFdsGlobalVariable.SecCmdList.append(' '.join(Cmd).strip())
+                    GenFdsGlobalVariable.SecCmdList.append(
+                        ' '.join(Cmd).strip())
             else:
                 SectionData = array('B', [0, 0, 0, 0])
-                SectionData.fromlist(array('B',Ui.encode('utf-16-le')).tolist())
+                SectionData.fromlist(
+                    array('B', Ui.encode('utf-16-le')).tolist())
                 SectionData.append(0)
                 SectionData.append(0)
                 Len = len(SectionData)
-                GenFdsGlobalVariable.SectionHeader.pack_into(SectionData, 0, Len & 0xff, (Len >> 8) & 0xff, (Len >> 16) & 0xff, 0x15)
-
+                GenFdsGlobalVariable.SectionHeader.pack_into(
+                    SectionData, 0, Len & 0xff, (Len >> 8) & 0xff, (Len >> 16) & 0xff, 0x15)
 
                 DirName = os.path.dirname(Output)
                 if not CreateDirectory(DirName):
-                    EdkLogger.error(None, FILE_CREATE_FAILURE, "Could not create directory %s" % DirName)
+                    EdkLogger.error(None, FILE_CREATE_FAILURE,
+                                    "Could not create directory %s" % DirName)
                 else:
                     if DirName == '':
                         DirName = os.getcwd()
                     if not os.access(DirName, os.W_OK):
-                        EdkLogger.error(None, PERMISSION_FAILURE, "Do not have write permission on directory %s" % DirName)
+                        EdkLogger.error(
+                            None, PERMISSION_FAILURE, "Do not have write permission on directory %s" % DirName)
 
                 try:
                     with open(Output, "wb") as Fd:
                         SectionData.tofile(Fd)
                         Fd.flush()
                 except IOError as X:
-                    EdkLogger.error(None, FILE_CREATE_FAILURE, ExtraData='IOError %s' % X)
+                    EdkLogger.error(None, FILE_CREATE_FAILURE,
+                                    ExtraData='IOError %s' % X)
 
         elif Ver:
             Cmd += ("-n", Ver)
@@ -497,11 +521,13 @@ class GenFdsGlobalVariable:
             SaveFileOnChange(CommandFile, ' '.join(Cmd), False)
             if IsMakefile:
                 if ' '.join(Cmd).strip() not in GenFdsGlobalVariable.SecCmdList:
-                    GenFdsGlobalVariable.SecCmdList.append(' '.join(Cmd).strip())
+                    GenFdsGlobalVariable.SecCmdList.append(
+                        ' '.join(Cmd).strip())
             else:
                 if not GenFdsGlobalVariable.NeedsUpdate(Output, list(Input) + [CommandFile]):
                     return
-                GenFdsGlobalVariable.CallExternalTool(Cmd, "Failed to generate section")
+                GenFdsGlobalVariable.CallExternalTool(
+                    Cmd, "Failed to generate section")
         else:
             Cmd += ("-o", Output)
             Cmd += Input
@@ -513,38 +539,42 @@ class GenFdsGlobalVariable:
                 else:
                     Cmd = ['-test', '-e', Input[0], "&&"] + Cmd
                 if ' '.join(Cmd).strip() not in GenFdsGlobalVariable.SecCmdList:
-                    GenFdsGlobalVariable.SecCmdList.append(' '.join(Cmd).strip())
+                    GenFdsGlobalVariable.SecCmdList.append(
+                        ' '.join(Cmd).strip())
             elif GenFdsGlobalVariable.NeedsUpdate(Output, list(Input) + [CommandFile]):
-                GenFdsGlobalVariable.DebugLogger(EdkLogger.DEBUG_5, "%s needs update because of newer %s" % (Output, Input))
-                GenFdsGlobalVariable.CallExternalTool(Cmd, "Failed to generate section")
+                GenFdsGlobalVariable.DebugLogger(
+                    EdkLogger.DEBUG_5, "%s needs update because of newer %s" % (Output, Input))
+                GenFdsGlobalVariable.CallExternalTool(
+                    Cmd, "Failed to generate section")
                 if (os.path.getsize(Output) >= GenFdsGlobalVariable.LARGE_FILE_SIZE and
-                    GenFdsGlobalVariable.LargeFileInFvFlags):
+                        GenFdsGlobalVariable.LargeFileInFvFlags):
                     GenFdsGlobalVariable.LargeFileInFvFlags[-1] = True
 
     @staticmethod
-    def GetAlignment (AlignString):
+    def GetAlignment(AlignString):
         if not AlignString:
             return 0
         if AlignString.endswith('K'):
-            return int (AlignString.rstrip('K')) * 1024
+            return int(AlignString.rstrip('K')) * 1024
         if AlignString.endswith('M'):
-            return int (AlignString.rstrip('M')) * 1024 * 1024
+            return int(AlignString.rstrip('M')) * 1024 * 1024
         if AlignString.endswith('G'):
-            return int (AlignString.rstrip('G')) * 1024 * 1024 * 1024
-        return int (AlignString)
+            return int(AlignString.rstrip('G')) * 1024 * 1024 * 1024
+        return int(AlignString)
 
     @staticmethod
     def GenerateFfs(Output, Input, Type, Guid, Fixed=False, CheckSum=False, Align=None,
                     SectionAlign=None, MakefilePath=None):
         Cmd = ["GenFfs", "-t", Type, "-g", Guid]
-        mFfsValidAlign = ["0", "8", "16", "128", "512", "1K", "4K", "32K", "64K", "128K", "256K", "512K", "1M", "2M", "4M", "8M", "16M"]
+        mFfsValidAlign = ["0", "8", "16", "128", "512", "1K", "4K", "32K",
+                          "64K", "128K", "256K", "512K", "1M", "2M", "4M", "8M", "16M"]
         if Fixed == True:
             Cmd.append("-x")
         if CheckSum:
             Cmd.append("-s")
         if Align:
             if Align not in mFfsValidAlign:
-                Align = GenFdsGlobalVariable.GetAlignment (Align)
+                Align = GenFdsGlobalVariable.GetAlignment(Align)
                 for index in range(0, len(mFfsValidAlign) - 1):
                     if ((Align > GenFdsGlobalVariable.GetAlignment(mFfsValidAlign[index])) and (Align <= GenFdsGlobalVariable.GetAlignment(mFfsValidAlign[index + 1]))):
                         break
@@ -563,23 +593,27 @@ class GenFdsGlobalVariable:
         CommandFile = Output + '.txt'
         SaveFileOnChange(CommandFile, ' '.join(Cmd), False)
 
-        GenFdsGlobalVariable.DebugLogger(EdkLogger.DEBUG_5, "%s needs update because of newer %s" % (Output, Input))
+        GenFdsGlobalVariable.DebugLogger(
+            EdkLogger.DEBUG_5, "%s needs update because of newer %s" % (Output, Input))
         if MakefilePath:
             if (tuple(Cmd), tuple(GenFdsGlobalVariable.SecCmdList), tuple(GenFdsGlobalVariable.CopyList)) not in GenFdsGlobalVariable.FfsCmdDict:
-                GenFdsGlobalVariable.FfsCmdDict[tuple(Cmd), tuple(GenFdsGlobalVariable.SecCmdList), tuple(GenFdsGlobalVariable.CopyList)] = MakefilePath
+                GenFdsGlobalVariable.FfsCmdDict[tuple(Cmd), tuple(
+                    GenFdsGlobalVariable.SecCmdList), tuple(GenFdsGlobalVariable.CopyList)] = MakefilePath
             GenFdsGlobalVariable.SecCmdList = []
             GenFdsGlobalVariable.CopyList = []
         else:
             if not GenFdsGlobalVariable.NeedsUpdate(Output, list(Input) + [CommandFile]):
                 return
-            GenFdsGlobalVariable.CallExternalTool(Cmd, "Failed to generate FFS")
+            GenFdsGlobalVariable.CallExternalTool(
+                Cmd, "Failed to generate FFS")
 
     @staticmethod
     def GenerateFirmwareVolume(Output, Input, BaseAddress=None, ForceRebase=None, Capsule=False, Dump=False,
                                AddressFile=None, MapFile=None, FfsList=[], FileSystemGuid=None):
         if not GenFdsGlobalVariable.NeedsUpdate(Output, Input+FfsList):
             return
-        GenFdsGlobalVariable.DebugLogger(EdkLogger.DEBUG_5, "%s needs update because of newer %s" % (Output, Input))
+        GenFdsGlobalVariable.DebugLogger(
+            EdkLogger.DEBUG_5, "%s needs update because of newer %s" % (Output, Input))
 
         Cmd = ["GenFv"]
         if BaseAddress:
@@ -612,7 +646,8 @@ class GenFdsGlobalVariable:
                               Align=None, Padding=None, Convert=False, IsMakefile=False):
         if not GenFdsGlobalVariable.NeedsUpdate(Output, Input) and not IsMakefile:
             return
-        GenFdsGlobalVariable.DebugLogger(EdkLogger.DEBUG_5, "%s needs update because of newer %s" % (Output, Input))
+        GenFdsGlobalVariable.DebugLogger(
+            EdkLogger.DEBUG_5, "%s needs update because of newer %s" % (Output, Input))
 
         Cmd = ["GenFw"]
         if Type.lower() == "te":
@@ -641,11 +676,12 @@ class GenFdsGlobalVariable:
             if " ".join(Cmd).strip() not in GenFdsGlobalVariable.SecCmdList:
                 GenFdsGlobalVariable.SecCmdList.append(" ".join(Cmd).strip())
         else:
-            GenFdsGlobalVariable.CallExternalTool(Cmd, "Failed to generate firmware image")
+            GenFdsGlobalVariable.CallExternalTool(
+                Cmd, "Failed to generate firmware image")
 
     @staticmethod
     def GenerateOptionRom(Output, EfiInput, BinaryInput, Compress=False, ClassCode=None,
-                        Revision=None, DeviceId=None, VendorId=None, IsMakefile=False):
+                          Revision=None, DeviceId=None, VendorId=None, IsMakefile=False):
         InputList = []
         Cmd = ["EfiRom"]
         if EfiInput:
@@ -657,18 +693,19 @@ class GenFdsGlobalVariable:
 
             for EfiFile in EfiInput:
                 Cmd.append(EfiFile)
-                InputList.append (EfiFile)
+                InputList.append(EfiFile)
 
         if BinaryInput:
             Cmd.append("-b")
             for BinFile in BinaryInput:
                 Cmd.append(BinFile)
-                InputList.append (BinFile)
+                InputList.append(BinFile)
 
         # Check List
         if not GenFdsGlobalVariable.NeedsUpdate(Output, InputList) and not IsMakefile:
             return
-        GenFdsGlobalVariable.DebugLogger(EdkLogger.DEBUG_5, "%s needs update because of newer %s" % (Output, InputList))
+        GenFdsGlobalVariable.DebugLogger(
+            EdkLogger.DEBUG_5, "%s needs update because of newer %s" % (Output, InputList))
 
         if ClassCode:
             Cmd += ("-l", ClassCode)
@@ -684,13 +721,15 @@ class GenFdsGlobalVariable:
             if " ".join(Cmd).strip() not in GenFdsGlobalVariable.SecCmdList:
                 GenFdsGlobalVariable.SecCmdList.append(" ".join(Cmd).strip())
         else:
-            GenFdsGlobalVariable.CallExternalTool(Cmd, "Failed to generate option rom")
+            GenFdsGlobalVariable.CallExternalTool(
+                Cmd, "Failed to generate option rom")
 
     @staticmethod
     def GuidTool(Output, Input, ToolPath, Options='', returnValue=[], IsMakefile=False):
         if not GenFdsGlobalVariable.NeedsUpdate(Output, Input) and not IsMakefile:
             return
-        GenFdsGlobalVariable.DebugLogger(EdkLogger.DEBUG_5, "%s needs update because of newer %s" % (Output, Input))
+        GenFdsGlobalVariable.DebugLogger(
+            EdkLogger.DEBUG_5, "%s needs update because of newer %s" % (Output, Input))
 
         Cmd = [ToolPath, ]
         Cmd += Options.split(' ')
@@ -700,80 +739,87 @@ class GenFdsGlobalVariable:
             if " ".join(Cmd).strip() not in GenFdsGlobalVariable.SecCmdList:
                 GenFdsGlobalVariable.SecCmdList.append(" ".join(Cmd).strip())
         else:
-            GenFdsGlobalVariable.CallExternalTool(Cmd, "Failed to call " + ToolPath, returnValue)
+            GenFdsGlobalVariable.CallExternalTool(
+                Cmd, "Failed to call " + ToolPath, returnValue)
 
     @staticmethod
-    def CallExternalTool (cmd, errorMess, returnValue=[]):
+    def CallExternalTool(cmd, errorMess, returnValue=[]):
 
         if type(cmd) not in (tuple, list):
-            GenFdsGlobalVariable.ErrorLogger("ToolError!  Invalid parameter type in call to CallExternalTool")
+            GenFdsGlobalVariable.ErrorLogger(
+                "ToolError!  Invalid parameter type in call to CallExternalTool")
 
         if GenFdsGlobalVariable.DebugLevel != -1:
             cmd += ('--debug', str(GenFdsGlobalVariable.DebugLevel))
-            GenFdsGlobalVariable.InfLogger (cmd)
+            GenFdsGlobalVariable.InfLogger(cmd)
 
         if GenFdsGlobalVariable.VerboseMode:
             cmd += ('-v',)
-            GenFdsGlobalVariable.InfLogger (cmd)
+            GenFdsGlobalVariable.InfLogger(cmd)
         else:
-            stdout.write ('#')
+            stdout.write('#')
             stdout.flush()
             GenFdsGlobalVariable.SharpCounter = GenFdsGlobalVariable.SharpCounter + 1
             if GenFdsGlobalVariable.SharpCounter % GenFdsGlobalVariable.SharpNumberPerLine == 0:
                 stdout.write('\n')
 
         try:
-            PopenObject = Popen(' '.join(cmd), stdout=PIPE, stderr=PIPE, shell=True)
+            PopenObject = Popen(' '.join(cmd), stdout=PIPE,
+                                stderr=PIPE, shell=True)
         except Exception as X:
-            EdkLogger.error("GenFds", COMMAND_FAILURE, ExtraData="%s: %s" % (str(X), cmd[0]))
+            EdkLogger.error("GenFds", COMMAND_FAILURE,
+                            ExtraData="%s: %s" % (str(X), cmd[0]))
         (out, error) = PopenObject.communicate()
 
         while PopenObject.returncode is None:
             PopenObject.wait()
         if returnValue != [] and returnValue[0] != 0:
-            #get command return value
+            # get command return value
             returnValue[0] = PopenObject.returncode
             return
         if PopenObject.returncode != 0 or GenFdsGlobalVariable.VerboseMode or GenFdsGlobalVariable.DebugLevel != -1:
-            GenFdsGlobalVariable.InfLogger ("Return Value = %d" % PopenObject.returncode)
-            GenFdsGlobalVariable.InfLogger(out.decode(encoding='utf-8', errors='ignore'))
-            GenFdsGlobalVariable.InfLogger(error.decode(encoding='utf-8', errors='ignore'))
+            GenFdsGlobalVariable.InfLogger(
+                "Return Value = %d" % PopenObject.returncode)
+            GenFdsGlobalVariable.InfLogger(
+                out.decode(encoding='utf-8', errors='ignore'))
+            GenFdsGlobalVariable.InfLogger(
+                error.decode(encoding='utf-8', errors='ignore'))
             if PopenObject.returncode != 0:
                 print("###", cmd)
                 EdkLogger.error("GenFds", COMMAND_FAILURE, errorMess)
 
     @staticmethod
-    def VerboseLogger (msg):
+    def VerboseLogger(msg):
         EdkLogger.verbose(msg)
 
     @staticmethod
-    def InfLogger (msg):
+    def InfLogger(msg):
         EdkLogger.info(msg)
 
     @staticmethod
-    def ErrorLogger (msg, File=None, Line=None, ExtraData=None):
+    def ErrorLogger(msg, File=None, Line=None, ExtraData=None):
         EdkLogger.error('GenFds', GENFDS_ERROR, msg, File, Line, ExtraData)
 
     @staticmethod
-    def DebugLogger (Level, msg):
+    def DebugLogger(Level, msg):
         EdkLogger.debug(Level, msg)
 
-    ## MacroExtend()
+    # MacroExtend()
     #
     #   @param  Str           String that may contain macro
     #   @param  MacroDict     Dictionary that contains macro value pair
     #
     @staticmethod
-    def MacroExtend (Str, MacroDict=None, Arch=DataType.TAB_COMMON):
+    def MacroExtend(Str, MacroDict=None, Arch=DataType.TAB_COMMON):
         if Str is None:
             return None
 
         Dict = {'$(WORKSPACE)': GenFdsGlobalVariable.WorkSpaceDir,
-#                '$(OUTPUT_DIRECTORY)': GenFdsGlobalVariable.OutputDirFromDsc,
+                #                '$(OUTPUT_DIRECTORY)': GenFdsGlobalVariable.OutputDirFromDsc,
                 '$(TARGET)': GenFdsGlobalVariable.TargetName,
                 '$(TOOL_CHAIN_TAG)': GenFdsGlobalVariable.ToolChainTag,
                 '$(SPACE)': ' '
-               }
+                }
 
         if Arch != DataType.TAB_COMMON and Arch in GenFdsGlobalVariable.ArchList:
             OutputDir = GenFdsGlobalVariable.OutputDirFromDscDict[Arch]
@@ -787,22 +833,23 @@ class GenFdsGlobalVariable:
 
         for key in Dict:
             if Str.find(key) >= 0:
-                Str = Str.replace (key, Dict[key])
+                Str = Str.replace(key, Dict[key])
 
         if Str.find('$(ARCH)') >= 0:
             if len(GenFdsGlobalVariable.ArchList) == 1:
                 Str = Str.replace('$(ARCH)', GenFdsGlobalVariable.ArchList[0])
             else:
-                EdkLogger.error("GenFds", GENFDS_ERROR, "No way to determine $(ARCH) for %s" % Str)
+                EdkLogger.error("GenFds", GENFDS_ERROR,
+                                "No way to determine $(ARCH) for %s" % Str)
 
         return Str
 
-    ## GetPcdValue()
+    # GetPcdValue()
     #
     #   @param  PcdPattern           pattern that labels a PCD.
     #
     @staticmethod
-    def GetPcdValue (PcdPattern):
+    def GetPcdValue(PcdPattern):
         if PcdPattern is None:
             return None
         if PcdPattern.startswith('PCD('):
@@ -813,15 +860,18 @@ class GenFdsGlobalVariable:
         TokenCName = PcdPair[1]
 
         for Arch in GenFdsGlobalVariable.ArchList:
-            Platform = GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
+            Platform = GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform,
+                                                                  Arch, GenFdsGlobalVariable.TargetName, GenFdsGlobalVariable.ToolChainTag]
             PcdDict = Platform.Pcds
             for Key in PcdDict:
                 PcdObj = PcdDict[Key]
                 if (PcdObj.TokenCName == TokenCName) and (PcdObj.TokenSpaceGuidCName == TokenSpace):
                     if PcdObj.Type != DataType.TAB_PCDS_FIXED_AT_BUILD:
-                        EdkLogger.error("GenFds", GENFDS_ERROR, "%s is not FixedAtBuild type." % PcdPattern)
+                        EdkLogger.error(
+                            "GenFds", GENFDS_ERROR, "%s is not FixedAtBuild type." % PcdPattern)
                     if PcdObj.DatumType != DataType.TAB_VOID:
-                        EdkLogger.error("GenFds", GENFDS_ERROR, "%s is not VOID* datum type." % PcdPattern)
+                        EdkLogger.error(
+                            "GenFds", GENFDS_ERROR, "%s is not VOID* datum type." % PcdPattern)
 
                     return PcdObj.DefaultValue
 
@@ -834,15 +884,17 @@ class GenFdsGlobalVariable:
                     PcdObj = PcdDict[Key]
                     if (PcdObj.TokenCName == TokenCName) and (PcdObj.TokenSpaceGuidCName == TokenSpace):
                         if PcdObj.Type != DataType.TAB_PCDS_FIXED_AT_BUILD:
-                            EdkLogger.error("GenFds", GENFDS_ERROR, "%s is not FixedAtBuild type." % PcdPattern)
+                            EdkLogger.error(
+                                "GenFds", GENFDS_ERROR, "%s is not FixedAtBuild type." % PcdPattern)
                         if PcdObj.DatumType != DataType.TAB_VOID:
-                            EdkLogger.error("GenFds", GENFDS_ERROR, "%s is not VOID* datum type." % PcdPattern)
+                            EdkLogger.error(
+                                "GenFds", GENFDS_ERROR, "%s is not VOID* datum type." % PcdPattern)
 
                         return PcdObj.DefaultValue
 
         return ''
 
-## FindExtendTool()
+# FindExtendTool()
 #
 #  Find location of tools to process data
 #
@@ -850,6 +902,8 @@ class GenFdsGlobalVariable:
 #  @param  CurrentArchList  Arch list
 #  @param  NameGuid         The Guid name
 #
+
+
 def FindExtendTool(KeyStringList, CurrentArchList, NameGuid):
     if GenFdsGlobalVariable.GuidToolDefinition:
         if NameGuid in GenFdsGlobalVariable.GuidToolDefinition:
@@ -863,7 +917,8 @@ def FindExtendTool(KeyStringList, CurrentArchList, NameGuid):
         Target = GenFdsGlobalVariable.TargetName
         ToolChain = GenFdsGlobalVariable.ToolChainTag
         if ToolChain not in ToolDb['TOOL_CHAIN_TAG']:
-            EdkLogger.error("GenFds", GENFDS_ERROR, "Can not find external tool because tool tag %s is not defined in tools_def.txt!" % ToolChain)
+            EdkLogger.error(
+                "GenFds", GENFDS_ERROR, "Can not find external tool because tool tag %s is not defined in tools_def.txt!" % ToolChain)
         KeyStringList = [Target + '_' + ToolChain + '_' + CurrentArchList[0]]
         for Arch in CurrentArchList:
             if Target + '_' + ToolChain + '_' + Arch not in KeyStringList:
@@ -876,13 +931,15 @@ def FindExtendTool(KeyStringList, CurrentArchList, NameGuid):
         MatchPathItem = None
         MatchOptionsItem = None
         for KeyString in KeyStringList:
-            KeyStringBuildTarget, KeyStringToolChain, KeyStringArch = KeyString.split('_')
+            KeyStringBuildTarget, KeyStringToolChain, KeyStringArch = KeyString.split(
+                '_')
             if KeyStringArch != Arch:
                 continue
             for Item in ToolDef.ToolsDefTxtDictionary:
                 if len(Item.split('_')) < 5:
                     continue
-                ItemTarget, ItemToolChain, ItemArch, ItemTool, ItemAttr = Item.split('_')
+                ItemTarget, ItemToolChain, ItemArch, ItemTool, ItemAttr = Item.split(
+                    '_')
                 if ItemTarget == DataType.TAB_STAR:
                     ItemTarget = KeyStringBuildTarget
                 if ItemToolChain == DataType.TAB_STAR:
@@ -915,7 +972,8 @@ def FindExtendTool(KeyStringList, CurrentArchList, NameGuid):
             for Item in ToolDef.ToolsDefTxtDictionary:
                 if len(Item.split('_')) < 5:
                     continue
-                ItemTarget, ItemToolChain, ItemArch, ItemTool, ItemAttr = Item.split('_')
+                ItemTarget, ItemToolChain, ItemArch, ItemTool, ItemAttr = Item.split(
+                    '_')
                 if ItemTarget == DataType.TAB_STAR:
                     ItemTarget = KeyStringBuildTarget
                 if ItemToolChain == DataType.TAB_STAR:
@@ -952,14 +1010,17 @@ def FindExtendTool(KeyStringList, CurrentArchList, NameGuid):
         MatchPathItem = None
         MatchOptionsItem = None
         for KeyString in KeyStringList:
-            KeyStringBuildTarget, KeyStringToolChain, KeyStringArch = KeyString.split('_')
+            KeyStringBuildTarget, KeyStringToolChain, KeyStringArch = KeyString.split(
+                '_')
             if KeyStringArch != Arch:
                 continue
-            Platform = GenFdsGlobalVariable.WorkSpace.BuildObject[GenFdsGlobalVariable.ActivePlatform, Arch, KeyStringBuildTarget, KeyStringToolChain]
+            Platform = GenFdsGlobalVariable.WorkSpace.BuildObject[
+                GenFdsGlobalVariable.ActivePlatform, Arch, KeyStringBuildTarget, KeyStringToolChain]
             for Item in Platform.BuildOptions:
                 if len(Item[1].split('_')) < 5:
                     continue
-                ItemTarget, ItemToolChain, ItemArch, ItemTool, ItemAttr = Item[1].split('_')
+                ItemTarget, ItemToolChain, ItemArch, ItemTool, ItemAttr = Item[1].split(
+                    '_')
                 if ItemTarget == DataType.TAB_STAR:
                     ItemTarget = KeyStringBuildTarget
                 if ItemToolChain == DataType.TAB_STAR:
@@ -992,7 +1053,8 @@ def FindExtendTool(KeyStringList, CurrentArchList, NameGuid):
             for Item in Platform.BuildOptions:
                 if len(Item[1].split('_')) < 5:
                     continue
-                ItemTarget, ItemToolChain, ItemArch, ItemTool, ItemAttr = Item[1].split('_')
+                ItemTarget, ItemToolChain, ItemArch, ItemTool, ItemAttr = Item[1].split(
+                    '_')
                 if ItemTarget == DataType.TAB_STAR:
                     ItemTarget = KeyStringBuildTarget
                 if ItemToolChain == DataType.TAB_STAR:
@@ -1023,5 +1085,6 @@ def FindExtendTool(KeyStringList, CurrentArchList, NameGuid):
         ToolPathTmp = Platform.BuildOptions[MatchPathItem]
     if MatchOptionsItem:
         ToolOption = Platform.BuildOptions[MatchOptionsItem]
-    GenFdsGlobalVariable.GuidToolDefinition[NameGuid] = (ToolPathTmp, ToolOption)
+    GenFdsGlobalVariable.GuidToolDefinition[NameGuid] = (
+        ToolPathTmp, ToolOption)
     return ToolPathTmp, ToolOption
diff --git a/BaseTools/Source/Python/GenFds/GuidSection.py b/BaseTools/Source/Python/GenFds/GuidSection.py
index 8db6e2feb3e4..5efd7c645091 100644
--- a/BaseTools/Source/Python/GenFds/GuidSection.py
+++ b/BaseTools/Source/Python/GenFds/GuidSection.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # process GUIDed section generation
 #
 #  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -25,19 +25,21 @@ from .FvImageSection import FvImageSection
 from Common.LongFilePathSupport import OpenLongFilePath as open
 from Common.DataType import *
 
-## generate GUIDed section
+# generate GUIDed section
 #
 #
-class GuidSection(GuidSectionClassObject) :
 
-    ## The constructor
+
+class GuidSection(GuidSectionClassObject):
+
+    # The constructor
     #
     #   @param  self        The object pointer
     #
     def __init__(self):
         GuidSectionClassObject.__init__(self)
 
-    ## GenSection() method
+    # GenSection() method
     #
     #   Generate GUIDed section
     #
@@ -75,10 +77,10 @@ class GuidSection(GuidSectionClassObject) :
 
         if self.ProcessRequired in ("TRUE", "1"):
             if self.FvAddr != []:
-                #no use FvAddr when the image is processed.
+                # no use FvAddr when the image is processed.
                 self.FvAddr = []
             if self.FvParentAddr is not None:
-                #no use Parent Addr when the image is processed.
+                # no use Parent Addr when the image is processed.
                 self.FvParentAddr = None
 
         for Sect in self.SectionList:
@@ -92,7 +94,8 @@ class GuidSection(GuidSectionClassObject) :
             elif isinstance(Sect, GuidSection):
                 Sect.FvAddr = self.FvAddr
                 Sect.FvParentAddr = self.FvParentAddr
-            ReturnSectList, align = Sect.GenSection(OutputPath, ModuleName, SecIndex, KeyStringList, FfsInf, Dict, IsMakefile=IsMakefile)
+            ReturnSectList, align = Sect.GenSection(
+                OutputPath, ModuleName, SecIndex, KeyStringList, FfsInf, Dict, IsMakefile=IsMakefile)
             if isinstance(Sect, GuidSection):
                 if Sect.IncludeFvSection:
                     self.IncludeFvSection = Sect.IncludeFvSection
@@ -100,7 +103,7 @@ class GuidSection(GuidSectionClassObject) :
             if align is not None:
                 if MaxAlign is None:
                     MaxAlign = align
-                if GenFdsGlobalVariable.GetAlignment (align) > GenFdsGlobalVariable.GetAlignment (MaxAlign):
+                if GenFdsGlobalVariable.GetAlignment(align) > GenFdsGlobalVariable.GetAlignment(MaxAlign):
                     MaxAlign = align
             if ReturnSectList != []:
                 if align is None:
@@ -113,50 +116,55 @@ class GuidSection(GuidSectionClassObject) :
             if self.Alignment is None:
                 self.Alignment = MaxAlign
             else:
-                if GenFdsGlobalVariable.GetAlignment (MaxAlign) > GenFdsGlobalVariable.GetAlignment (self.Alignment):
+                if GenFdsGlobalVariable.GetAlignment(MaxAlign) > GenFdsGlobalVariable.GetAlignment(self.Alignment):
                     self.Alignment = MaxAlign
 
         OutputFile = OutputPath + \
-                     os.sep + \
-                     ModuleName + \
-                     SUP_MODULE_SEC + \
-                     SecNum + \
-                     SectionSuffix['GUIDED']
+            os.sep + \
+            ModuleName + \
+            SUP_MODULE_SEC + \
+            SecNum + \
+            SectionSuffix['GUIDED']
         OutputFile = os.path.normpath(OutputFile)
 
         ExternalTool = None
         ExternalOption = None
         if self.NameGuid is not None:
-            ExternalTool, ExternalOption = FindExtendTool(self.KeyStringList, self.CurrentArchList, self.NameGuid)
+            ExternalTool, ExternalOption = FindExtendTool(
+                self.KeyStringList, self.CurrentArchList, self.NameGuid)
 
         #
         # If not have GUID , call default
         # GENCRC32 section
         #
-        if self.NameGuid is None :
-            GenFdsGlobalVariable.VerboseLogger("Use GenSection function Generate CRC32 Section")
-            GenFdsGlobalVariable.GenerateSection(OutputFile, SectFile, Section.Section.SectionType[self.SectionType], InputAlign=SectAlign, IsMakefile=IsMakefile)
+        if self.NameGuid is None:
+            GenFdsGlobalVariable.VerboseLogger(
+                "Use GenSection function Generate CRC32 Section")
+            GenFdsGlobalVariable.GenerateSection(
+                OutputFile, SectFile, Section.Section.SectionType[self.SectionType], InputAlign=SectAlign, IsMakefile=IsMakefile)
             OutputFileList = []
             OutputFileList.append(OutputFile)
             return OutputFileList, self.Alignment
-        #or GUID not in External Tool List
+        # or GUID not in External Tool List
         elif ExternalTool is None:
-            EdkLogger.error("GenFds", GENFDS_ERROR, "No tool found with GUID %s" % self.NameGuid)
+            EdkLogger.error("GenFds", GENFDS_ERROR,
+                            "No tool found with GUID %s" % self.NameGuid)
         else:
             DummyFile = OutputFile + ".dummy"
             #
             # Call GenSection with DUMMY section type.
             #
-            GenFdsGlobalVariable.GenerateSection(DummyFile, SectFile, InputAlign=SectAlign, IsMakefile=IsMakefile)
+            GenFdsGlobalVariable.GenerateSection(
+                DummyFile, SectFile, InputAlign=SectAlign, IsMakefile=IsMakefile)
             #
             # Use external tool process the Output
             #
             TempFile = OutputPath + \
-                       os.sep + \
-                       ModuleName + \
-                       SUP_MODULE_SEC + \
-                       SecNum + \
-                       '.tmp'
+                os.sep + \
+                ModuleName + \
+                SUP_MODULE_SEC + \
+                SecNum + \
+                '.tmp'
             TempFile = os.path.normpath(TempFile)
             #
             # Remove temp file if its time stamp is older than dummy file
@@ -172,15 +180,16 @@ class GuidSection(GuidSectionClassObject) :
                 CmdOption = CmdOption + ' ' + ExternalOption
             if not GenFdsGlobalVariable.EnableGenfdsMultiThread:
                 if self.ProcessRequired not in ("TRUE", "1") and self.IncludeFvSection and not FvAddrIsSet and self.FvParentAddr is not None:
-                    #FirstCall is only set for the encapsulated flash FV image without process required attribute.
+                    # FirstCall is only set for the encapsulated flash FV image without process required attribute.
                     FirstCall = True
                 #
                 # Call external tool
                 #
                 ReturnValue = [1]
                 if FirstCall:
-                    #first try to call the guided tool with -z option and CmdOption for the no process required guided tool.
-                    GenFdsGlobalVariable.GuidTool(TempFile, [DummyFile], ExternalTool, '-z' + ' ' + CmdOption, ReturnValue)
+                    # first try to call the guided tool with -z option and CmdOption for the no process required guided tool.
+                    GenFdsGlobalVariable.GuidTool(
+                        TempFile, [DummyFile], ExternalTool, '-z' + ' ' + CmdOption, ReturnValue)
 
                 #
                 # when no call or first call failed, ReturnValue are not 1.
@@ -189,14 +198,16 @@ class GuidSection(GuidSectionClassObject) :
                 if ReturnValue[0] != 0:
                     FirstCall = False
                     ReturnValue[0] = 0
-                    GenFdsGlobalVariable.GuidTool(TempFile, [DummyFile], ExternalTool, CmdOption)
+                    GenFdsGlobalVariable.GuidTool(
+                        TempFile, [DummyFile], ExternalTool, CmdOption)
                 #
                 # There is external tool which does not follow standard rule which return nonzero if tool fails
                 # The output file has to be checked
                 #
 
-                if not os.path.exists(TempFile) :
-                    EdkLogger.error("GenFds", COMMAND_FAILURE, 'Fail to call %s, no output file was generated' % ExternalTool)
+                if not os.path.exists(TempFile):
+                    EdkLogger.error(
+                        "GenFds", COMMAND_FAILURE, 'Fail to call %s, no output file was generated' % ExternalTool)
 
                 FileHandleIn = open(DummyFile, 'rb')
                 FileHandleIn.seek(0, 2)
@@ -219,7 +230,7 @@ class GuidSection(GuidSectionClassObject) :
                         BufferOut = FileHandleOut.read()
                         if BufferIn == BufferOut[TempFileSize - InputFileSize:]:
                             HeaderLength = str(TempFileSize - InputFileSize)
-                    #auto sec guided attribute with process required
+                    # auto sec guided attribute with process required
                     if HeaderLength is None:
                         Attribute.append('PROCESSING_REQUIRED')
 
@@ -228,7 +239,8 @@ class GuidSection(GuidSectionClassObject) :
 
                 if FirstCall and 'PROCESSING_REQUIRED' in Attribute:
                     # Guided data by -z option on first call is the process required data. Call the guided tool with the real option.
-                    GenFdsGlobalVariable.GuidTool(TempFile, [DummyFile], ExternalTool, CmdOption)
+                    GenFdsGlobalVariable.GuidTool(
+                        TempFile, [DummyFile], ExternalTool, CmdOption)
 
                 #
                 # Call Gensection Add Section Header
@@ -243,8 +255,9 @@ class GuidSection(GuidSectionClassObject) :
                                                      Guid=self.NameGuid, GuidAttr=Attribute, GuidHdrLen=HeaderLength)
 
             else:
-                #add input file for GenSec get PROCESSING_REQUIRED
-                GenFdsGlobalVariable.GuidTool(TempFile, [DummyFile], ExternalTool, CmdOption, IsMakefile=IsMakefile)
+                # add input file for GenSec get PROCESSING_REQUIRED
+                GenFdsGlobalVariable.GuidTool(
+                    TempFile, [DummyFile], ExternalTool, CmdOption, IsMakefile=IsMakefile)
                 Attribute = []
                 HeaderLength = None
                 if self.ExtraHeaderSize != -1:
@@ -273,6 +286,3 @@ class GuidSection(GuidSectionClassObject) :
             if IsMakefile and self.Alignment is not None and self.Alignment.strip() == '0':
                 self.Alignment = '1'
             return OutputFileList, self.Alignment
-
-
-
diff --git a/BaseTools/Source/Python/GenFds/OptRomFileStatement.py b/BaseTools/Source/Python/GenFds/OptRomFileStatement.py
index 1bd4d4572a17..0caa55210af1 100644
--- a/BaseTools/Source/Python/GenFds/OptRomFileStatement.py
+++ b/BaseTools/Source/Python/GenFds/OptRomFileStatement.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # process OptionROM generation from FILE statement
 #
 #  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -16,8 +16,10 @@ from .GenFdsGlobalVariable import GenFdsGlobalVariable
 ##
 #
 #
+
+
 class OptRomFileStatement:
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -26,7 +28,7 @@ class OptRomFileStatement:
         self.FileType = None
         self.OverrideAttribs = None
 
-    ## GenFfs() method
+    # GenFfs() method
     #
     #   Generate FFS
     #
@@ -34,15 +36,13 @@ class OptRomFileStatement:
     #   @param  Dict        dictionary contains macro and value pair
     #   @retval string      Generated FFS file name
     #
-    def GenFfs(self, Dict = None, IsMakefile=False):
+    def GenFfs(self, Dict=None, IsMakefile=False):
 
         if Dict is None:
             Dict = {}
 
         if self.FileName is not None:
-            self.FileName = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.FileName)
+            self.FileName = GenFdsGlobalVariable.ReplaceWorkspaceMacro(
+                self.FileName)
 
         return self.FileName
-
-
-
diff --git a/BaseTools/Source/Python/GenFds/OptRomInfStatement.py b/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
index 8b570ed6bcbc..e4b3328d718b 100644
--- a/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
+++ b/BaseTools/Source/Python/GenFds/OptRomInfStatement.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # process OptionROM generation from INF statement
 #
 #  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -23,8 +23,10 @@ from .GenFdsGlobalVariable import GenFdsGlobalVariable
 ##
 #
 #
+
+
 class OptRomInfStatement (FfsInfStatement):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -32,7 +34,7 @@ class OptRomInfStatement (FfsInfStatement):
         FfsInfStatement.__init__(self)
         self.OverrideAttribs = None
 
-    ## __GetOptRomParams() method
+    # __GetOptRomParams() method
     #
     #   Parse inf file to get option ROM related parameters
     #
@@ -43,31 +45,37 @@ class OptRomInfStatement (FfsInfStatement):
             self.OverrideAttribs = OverrideAttribs()
 
         if self.OverrideAttribs.NeedCompress is None:
-            self.OverrideAttribs.NeedCompress = self.OptRomDefs.get ('PCI_COMPRESS')
+            self.OverrideAttribs.NeedCompress = self.OptRomDefs.get(
+                'PCI_COMPRESS')
             if self.OverrideAttribs.NeedCompress is not None:
                 if self.OverrideAttribs.NeedCompress.upper() not in ('TRUE', 'FALSE'):
-                    GenFdsGlobalVariable.ErrorLogger( "Expected TRUE/FALSE for PCI_COMPRESS: %s" %self.InfFileName)
+                    GenFdsGlobalVariable.ErrorLogger(
+                        "Expected TRUE/FALSE for PCI_COMPRESS: %s" % self.InfFileName)
                 self.OverrideAttribs.NeedCompress = \
                     self.OverrideAttribs.NeedCompress.upper() == 'TRUE'
 
         if self.OverrideAttribs.PciVendorId is None:
-            self.OverrideAttribs.PciVendorId = self.OptRomDefs.get ('PCI_VENDOR_ID')
+            self.OverrideAttribs.PciVendorId = self.OptRomDefs.get(
+                'PCI_VENDOR_ID')
 
         if self.OverrideAttribs.PciClassCode is None:
-            self.OverrideAttribs.PciClassCode = self.OptRomDefs.get ('PCI_CLASS_CODE')
+            self.OverrideAttribs.PciClassCode = self.OptRomDefs.get(
+                'PCI_CLASS_CODE')
 
         if self.OverrideAttribs.PciDeviceId is None:
-            self.OverrideAttribs.PciDeviceId = self.OptRomDefs.get ('PCI_DEVICE_ID')
+            self.OverrideAttribs.PciDeviceId = self.OptRomDefs.get(
+                'PCI_DEVICE_ID')
 
         if self.OverrideAttribs.PciRevision is None:
-            self.OverrideAttribs.PciRevision = self.OptRomDefs.get ('PCI_REVISION')
+            self.OverrideAttribs.PciRevision = self.OptRomDefs.get(
+                'PCI_REVISION')
 
 #        InfObj = GenFdsGlobalVariable.WorkSpace.BuildObject[self.PathClassObj, self.CurrentArch]
 #        RecordList = InfObj._RawData[MODEL_META_DATA_HEADER, InfObj._Arch, InfObj._Platform]
 #        for Record in RecordList:
 #            Record = ReplaceMacros(Record, GlobalData.gEdkGlobal, False)
 #            Name = Record[0]
-    ## GenFfs() method
+    # GenFfs() method
     #
     #   Generate FFS
     #
@@ -85,21 +93,24 @@ class OptRomInfStatement (FfsInfStatement):
         # Get the rule of how to generate Ffs file
         #
         Rule = self.__GetRule__()
-        GenFdsGlobalVariable.VerboseLogger( "Packing binaries from inf file : %s" %self.InfFileName)
+        GenFdsGlobalVariable.VerboseLogger(
+            "Packing binaries from inf file : %s" % self.InfFileName)
         #
         # For the rule only has simpleFile
         #
-        if isinstance (Rule, RuleSimpleFile.RuleSimpleFile) :
-            EfiOutputList = self.__GenSimpleFileSection__(Rule, IsMakefile=IsMakefile)
+        if isinstance(Rule, RuleSimpleFile.RuleSimpleFile):
+            EfiOutputList = self.__GenSimpleFileSection__(
+                Rule, IsMakefile=IsMakefile)
             return EfiOutputList
         #
         # For Rule has ComplexFile
         #
         elif isinstance(Rule, RuleComplexFile.RuleComplexFile):
-            EfiOutputList = self.__GenComplexFileSection__(Rule, IsMakefile=IsMakefile)
+            EfiOutputList = self.__GenComplexFileSection__(
+                Rule, IsMakefile=IsMakefile)
             return EfiOutputList
 
-    ## __GenSimpleFileSection__() method
+    # __GenSimpleFileSection__() method
     #
     #   Get .efi files according to simple rule.
     #
@@ -107,7 +118,7 @@ class OptRomInfStatement (FfsInfStatement):
     #   @param  Rule        The rule object used to generate section
     #   @retval string      File name of the generated section file
     #
-    def __GenSimpleFileSection__(self, Rule, IsMakefile = False):
+    def __GenSimpleFileSection__(self, Rule, IsMakefile=False):
         #
         # Prepare the parameter of GenSection
         #
@@ -117,12 +128,12 @@ class OptRomInfStatement (FfsInfStatement):
             GenSecInputFile = self.__ExtendMacro__(Rule.FileName)
             OutputFileList.append(GenSecInputFile)
         else:
-            OutputFileList, IsSect = Section.Section.GetFileList(self, '', Rule.FileExtension)
+            OutputFileList, IsSect = Section.Section.GetFileList(
+                self, '', Rule.FileExtension)
 
         return OutputFileList
 
-
-    ## __GenComplexFileSection__() method
+    # __GenComplexFileSection__() method
     #
     #   Get .efi by sections in complex Rule
     #
@@ -130,6 +141,7 @@ class OptRomInfStatement (FfsInfStatement):
     #   @param  Rule        The rule object used to generate section
     #   @retval string      File name of the generated section file
     #
+
     def __GenComplexFileSection__(self, Rule, IsMakefile=False):
 
         OutputFileList = []
@@ -139,14 +151,16 @@ class OptRomInfStatement (FfsInfStatement):
                     GenSecInputFile = self.__ExtendMacro__(Sect.FileName)
                     OutputFileList.append(GenSecInputFile)
                 else:
-                    FileList, IsSect = Section.Section.GetFileList(self, '', Sect.FileExtension)
+                    FileList, IsSect = Section.Section.GetFileList(
+                        self, '', Sect.FileExtension)
                     OutputFileList.extend(FileList)
 
         return OutputFileList
 
+
 class OverrideAttribs:
 
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
diff --git a/BaseTools/Source/Python/GenFds/OptionRom.py b/BaseTools/Source/Python/GenFds/OptionRom.py
index 61d669de8d08..f20c08edaf38 100644
--- a/BaseTools/Source/Python/GenFds/OptionRom.py
+++ b/BaseTools/Source/Python/GenFds/OptionRom.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # process OptionROM generation
 #
 #  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -23,16 +23,18 @@ from Common.BuildToolError import *
 ##
 #
 #
+
+
 class OPTIONROM (OptionRomClassObject):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
-    def __init__(self, Name = ""):
+    def __init__(self, Name=""):
         OptionRomClassObject.__init__(self)
         self.DriverName = Name
 
-    ## AddToBuffer()
+    # AddToBuffer()
     #
     #   Generate Option ROM
     #
@@ -40,26 +42,29 @@ class OPTIONROM (OptionRomClassObject):
     #   @param  Buffer      The buffer generated OptROM data will be put
     #   @retval string      Generated OptROM file path
     #
-    def AddToBuffer (self, Buffer, Flag=False) :
+    def AddToBuffer(self, Buffer, Flag=False):
         if not Flag:
-            GenFdsGlobalVariable.InfLogger( "\nGenerating %s Option ROM ..." %self.DriverName)
+            GenFdsGlobalVariable.InfLogger(
+                "\nGenerating %s Option ROM ..." % self.DriverName)
 
         EfiFileList = []
         BinFileList = []
 
         # Process Modules in FfsList
-        for FfsFile in self.FfsList :
+        for FfsFile in self.FfsList:
 
             if isinstance(FfsFile, OptRomInfStatement.OptRomInfStatement):
                 FilePathNameList = FfsFile.GenFfs(IsMakefile=Flag)
                 if len(FilePathNameList) == 0:
-                    EdkLogger.error("GenFds", GENFDS_ERROR, "Module %s not produce .efi files, so NO file could be put into option ROM." % (FfsFile.InfFileName))
+                    EdkLogger.error("GenFds", GENFDS_ERROR, "Module %s not produce .efi files, so NO file could be put into option ROM." % (
+                        FfsFile.InfFileName))
                 if FfsFile.OverrideAttribs is None:
                     EfiFileList.extend(FilePathNameList)
                 else:
                     FileName = os.path.basename(FilePathNameList[0])
-                    TmpOutputDir = os.path.join(GenFdsGlobalVariable.FvDir, self.DriverName, FfsFile.CurrentArch)
-                    if not os.path.exists(TmpOutputDir) :
+                    TmpOutputDir = os.path.join(
+                        GenFdsGlobalVariable.FvDir, self.DriverName, FfsFile.CurrentArch)
+                    if not os.path.exists(TmpOutputDir):
                         os.makedirs(TmpOutputDir)
                     TmpOutputFile = os.path.join(TmpOutputDir, FileName+'.tmp')
 
@@ -71,14 +76,15 @@ class OPTIONROM (OptionRomClassObject):
                                                            FfsFile.OverrideAttribs.PciRevision,
                                                            FfsFile.OverrideAttribs.PciDeviceId,
                                                            FfsFile.OverrideAttribs.PciVendorId,
-                                                           IsMakefile = Flag)
+                                                           IsMakefile=Flag)
                     BinFileList.append(TmpOutputFile)
             else:
                 FilePathName = FfsFile.GenFfs(IsMakefile=Flag)
                 if FfsFile.OverrideAttribs is not None:
                     FileName = os.path.basename(FilePathName)
-                    TmpOutputDir = os.path.join(GenFdsGlobalVariable.FvDir, self.DriverName, FfsFile.CurrentArch)
-                    if not os.path.exists(TmpOutputDir) :
+                    TmpOutputDir = os.path.join(
+                        GenFdsGlobalVariable.FvDir, self.DriverName, FfsFile.CurrentArch)
+                    if not os.path.exists(TmpOutputDir):
                         os.makedirs(TmpOutputDir)
                     TmpOutputFile = os.path.join(TmpOutputDir, FileName+'.tmp')
 
@@ -105,20 +111,22 @@ class OPTIONROM (OptionRomClassObject):
         OutputFile = OutputFile + '.rom'
 
         GenFdsGlobalVariable.GenerateOptionRom(
-                                OutputFile,
-                                EfiFileList,
-                                BinFileList,
-                                IsMakefile=Flag)
+            OutputFile,
+            EfiFileList,
+            BinFileList,
+            IsMakefile=Flag)
 
         if not Flag:
-            GenFdsGlobalVariable.InfLogger( "\nGenerate %s Option ROM Successfully" %self.DriverName)
+            GenFdsGlobalVariable.InfLogger(
+                "\nGenerate %s Option ROM Successfully" % self.DriverName)
         GenFdsGlobalVariable.SharpCounter = 0
 
         return OutputFile
 
+
 class OverrideAttribs:
 
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
diff --git a/BaseTools/Source/Python/GenFds/Region.py b/BaseTools/Source/Python/GenFds/Region.py
index e95cfcf965d2..67c8879a4f29 100644
--- a/BaseTools/Source/Python/GenFds/Region.py
+++ b/BaseTools/Source/Python/GenFds/Region.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # process FD Region generation
 #
 #  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -22,12 +22,14 @@ from Common.LongFilePathSupport import OpenLongFilePath as open
 from Common.MultipleWorkspace import MultipleWorkspace as mws
 from Common.DataType import BINARY_FILE_TYPE_FV
 
-## generate Region
+# generate Region
 #
 #
+
+
 class Region(object):
 
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
@@ -40,7 +42,7 @@ class Region(object):
         self.RegionType = None
         self.RegionDataList = []
 
-    ## PadBuffer()
+    # PadBuffer()
     #
     #   Add padding bytes to the Buffer
     #
@@ -52,14 +54,14 @@ class Region(object):
 
     def PadBuffer(self, Buffer, ErasePolarity, Size):
         if Size > 0:
-            if (ErasePolarity == '1') :
+            if (ErasePolarity == '1'):
                 PadByte = pack('B', 0xFF)
             else:
                 PadByte = pack('B', 0)
             for i in range(0, Size):
                 Buffer.write(PadByte)
 
-    ## AddToBuffer()
+    # AddToBuffer()
     #
     #   Add region data to the Buffer
     #
@@ -78,7 +80,8 @@ class Region(object):
         if MacroDict is None:
             MacroDict = {}
         if not Flag:
-            GenFdsGlobalVariable.InfLogger('\nGenerate Region at Offset 0x%X' % self.Offset)
+            GenFdsGlobalVariable.InfLogger(
+                '\nGenerate Region at Offset 0x%X' % self.Offset)
             GenFdsGlobalVariable.InfLogger("   Region Size = 0x%X" % Size)
         GenFdsGlobalVariable.SharpCounter = 0
         if Flag and (self.RegionType != BINARY_FILE_TYPE_FV):
@@ -94,13 +97,17 @@ class Region(object):
             for RegionData in self.RegionDataList:
                 FileName = None
                 if RegionData.endswith(".fv"):
-                    RegionData = GenFdsGlobalVariable.MacroExtend(RegionData, MacroDict)
+                    RegionData = GenFdsGlobalVariable.MacroExtend(
+                        RegionData, MacroDict)
                     if not Flag:
-                        GenFdsGlobalVariable.InfLogger('   Region FV File Name = .fv : %s' % RegionData)
-                    if RegionData[1] != ':' :
-                        RegionData = mws.join (GenFdsGlobalVariable.WorkSpaceDir, RegionData)
+                        GenFdsGlobalVariable.InfLogger(
+                            '   Region FV File Name = .fv : %s' % RegionData)
+                    if RegionData[1] != ':':
+                        RegionData = mws.join(
+                            GenFdsGlobalVariable.WorkSpaceDir, RegionData)
                     if not os.path.exists(RegionData):
-                        EdkLogger.error("GenFds", FILE_NOT_FOUND, ExtraData=RegionData)
+                        EdkLogger.error("GenFds", FILE_NOT_FOUND,
+                                        ExtraData=RegionData)
 
                     FileName = RegionData
                 elif RegionData.upper() + 'fv' in ImageBinDict:
@@ -113,17 +120,20 @@ class Region(object):
                     #
                     FvObj = None
                     if RegionData.upper() in GenFdsGlobalVariable.FdfParser.Profile.FvDict:
-                        FvObj = GenFdsGlobalVariable.FdfParser.Profile.FvDict[RegionData.upper()]
+                        FvObj = GenFdsGlobalVariable.FdfParser.Profile.FvDict[RegionData.upper(
+                        )]
 
-                    if FvObj is not None :
+                    if FvObj is not None:
                         if not Flag:
-                            GenFdsGlobalVariable.InfLogger('   Region Name = FV')
+                            GenFdsGlobalVariable.InfLogger(
+                                '   Region Name = FV')
                         #
                         # Call GenFv tool
                         #
                         self.BlockInfoOfRegion(BlockSizeList, FvObj)
                         self.FvAddress = self.FvAddress + FvOffset
-                        FvAlignValue = GenFdsGlobalVariable.GetAlignment(FvObj.FvAlignment)
+                        FvAlignValue = GenFdsGlobalVariable.GetAlignment(
+                            FvObj.FvAlignment)
                         if self.FvAddress % FvAlignValue != 0:
                             EdkLogger.error("GenFds", GENFDS_ERROR,
                                             "FV (%s) is NOT %s Aligned!" % (FvObj.UiFvName, FvObj.FvAlignment))
@@ -131,7 +141,8 @@ class Region(object):
                         FvBaseAddress = '0x%X' % self.FvAddress
                         BlockSize = None
                         BlockNum = None
-                        FvObj.AddToBuffer(FvBuffer, FvBaseAddress, BlockSize, BlockNum, ErasePolarity, Flag=Flag)
+                        FvObj.AddToBuffer(
+                            FvBuffer, FvBaseAddress, BlockSize, BlockNum, ErasePolarity, Flag=Flag)
                         if Flag:
                             continue
 
@@ -149,7 +160,8 @@ class Region(object):
                         Size = Size - FvBufferLen
                         continue
                     else:
-                        EdkLogger.error("GenFds", GENFDS_ERROR, "FV (%s) is NOT described in FDF file!" % (RegionData))
+                        EdkLogger.error(
+                            "GenFds", GENFDS_ERROR, "FV (%s) is NOT described in FDF file!" % (RegionData))
                 #
                 # Add the exist Fv image into FD buffer
                 #
@@ -158,7 +170,7 @@ class Region(object):
                         FileLength = os.stat(FileName)[ST_SIZE]
                         if FileLength > Size:
                             EdkLogger.error("GenFds", GENFDS_ERROR,
-                                            "Size of FV File (%s) is larger than Region Size 0x%X specified." \
+                                            "Size of FV File (%s) is larger than Region Size 0x%X specified."
                                             % (RegionData, Size))
                         BinFile = open(FileName, 'rb')
                         Buffer.write(BinFile.read())
@@ -176,12 +188,16 @@ class Region(object):
             #
             for RegionData in self.RegionDataList:
                 if RegionData.endswith(".cap"):
-                    RegionData = GenFdsGlobalVariable.MacroExtend(RegionData, MacroDict)
-                    GenFdsGlobalVariable.InfLogger('   Region CAPSULE Image Name = .cap : %s' % RegionData)
-                    if RegionData[1] != ':' :
-                        RegionData = mws.join (GenFdsGlobalVariable.WorkSpaceDir, RegionData)
+                    RegionData = GenFdsGlobalVariable.MacroExtend(
+                        RegionData, MacroDict)
+                    GenFdsGlobalVariable.InfLogger(
+                        '   Region CAPSULE Image Name = .cap : %s' % RegionData)
+                    if RegionData[1] != ':':
+                        RegionData = mws.join(
+                            GenFdsGlobalVariable.WorkSpaceDir, RegionData)
                     if not os.path.exists(RegionData):
-                        EdkLogger.error("GenFds", FILE_NOT_FOUND, ExtraData=RegionData)
+                        EdkLogger.error("GenFds", FILE_NOT_FOUND,
+                                        ExtraData=RegionData)
 
                     FileName = RegionData
                 elif RegionData.upper() + 'cap' in ImageBinDict:
@@ -193,18 +209,21 @@ class Region(object):
                     #
                     CapsuleObj = None
                     if RegionData.upper() in GenFdsGlobalVariable.FdfParser.Profile.CapsuleDict:
-                        CapsuleObj = GenFdsGlobalVariable.FdfParser.Profile.CapsuleDict[RegionData.upper()]
+                        CapsuleObj = GenFdsGlobalVariable.FdfParser.Profile.CapsuleDict[RegionData.upper(
+                        )]
 
-                    if CapsuleObj is not None :
+                    if CapsuleObj is not None:
                         CapsuleObj.CapsuleName = RegionData.upper()
-                        GenFdsGlobalVariable.InfLogger('   Region Name = CAPSULE')
+                        GenFdsGlobalVariable.InfLogger(
+                            '   Region Name = CAPSULE')
                         #
                         # Call GenFv tool to generate Capsule Image
                         #
                         FileName = CapsuleObj.GenCapsule()
                         CapsuleObj.CapsuleName = None
                     else:
-                        EdkLogger.error("GenFds", GENFDS_ERROR, "Capsule (%s) is NOT described in FDF file!" % (RegionData))
+                        EdkLogger.error(
+                            "GenFds", GENFDS_ERROR, "Capsule (%s) is NOT described in FDF file!" % (RegionData))
 
                 #
                 # Add the capsule image into FD buffer
@@ -212,7 +231,7 @@ class Region(object):
                 FileLength = os.stat(FileName)[ST_SIZE]
                 if FileLength > Size:
                     EdkLogger.error("GenFds", GENFDS_ERROR,
-                                    "Size 0x%X of Capsule File (%s) is larger than Region Size 0x%X specified." \
+                                    "Size 0x%X of Capsule File (%s) is larger than Region Size 0x%X specified."
                                     % (FileLength, RegionData, Size))
                 BinFile = open(FileName, 'rb')
                 Buffer.write(BinFile.read())
@@ -228,24 +247,29 @@ class Region(object):
                 if self.RegionType == 'INF':
                     RegionData.__InfParse__(None)
                     if len(RegionData.BinFileList) != 1:
-                        EdkLogger.error('GenFds', GENFDS_ERROR, 'INF in FD region can only contain one binary: %s' % RegionData)
+                        EdkLogger.error(
+                            'GenFds', GENFDS_ERROR, 'INF in FD region can only contain one binary: %s' % RegionData)
                     File = RegionData.BinFileList[0]
                     RegionData = RegionData.PatchEfiFile(File.Path, File.Type)
                 else:
-                    RegionData = GenFdsGlobalVariable.MacroExtend(RegionData, MacroDict)
-                    if RegionData[1] != ':' :
-                        RegionData = mws.join (GenFdsGlobalVariable.WorkSpaceDir, RegionData)
+                    RegionData = GenFdsGlobalVariable.MacroExtend(
+                        RegionData, MacroDict)
+                    if RegionData[1] != ':':
+                        RegionData = mws.join(
+                            GenFdsGlobalVariable.WorkSpaceDir, RegionData)
                     if not os.path.exists(RegionData):
-                        EdkLogger.error("GenFds", FILE_NOT_FOUND, ExtraData=RegionData)
+                        EdkLogger.error("GenFds", FILE_NOT_FOUND,
+                                        ExtraData=RegionData)
                 #
                 # Add the file image into FD buffer
                 #
                 FileLength = os.stat(RegionData)[ST_SIZE]
                 if FileLength > Size:
                     EdkLogger.error("GenFds", GENFDS_ERROR,
-                                    "Size of File (%s) is larger than Region Size 0x%X specified." \
+                                    "Size of File (%s) is larger than Region Size 0x%X specified."
                                     % (RegionData, Size))
-                GenFdsGlobalVariable.InfLogger('   Region File Name = %s' % RegionData)
+                GenFdsGlobalVariable.InfLogger(
+                    '   Region File Name = %s' % RegionData)
                 BinFile = open(RegionData, 'rb')
                 Buffer.write(BinFile.read())
                 BinFile.close()
@@ -255,16 +279,17 @@ class Region(object):
             #
             self.PadBuffer(Buffer, ErasePolarity, Size)
 
-        if self.RegionType == 'DATA' :
+        if self.RegionType == 'DATA':
             GenFdsGlobalVariable.InfLogger('   Region Name = DATA')
             DataSize = 0
             for RegionData in self.RegionDataList:
                 Data = RegionData.split(',')
                 DataSize = DataSize + len(Data)
                 if DataSize > Size:
-                   EdkLogger.error("GenFds", GENFDS_ERROR, "Size of DATA is larger than Region Size ")
+                    EdkLogger.error("GenFds", GENFDS_ERROR,
+                                    "Size of DATA is larger than Region Size ")
                 else:
-                    for item in Data :
+                    for item in Data:
                         Buffer.write(pack('B', int(item, 16)))
                 Size = Size - DataSize
             #
@@ -276,7 +301,7 @@ class Region(object):
             GenFdsGlobalVariable.InfLogger('   Region Name = None')
             self.PadBuffer(Buffer, ErasePolarity, Size)
 
-    ## BlockSizeOfRegion()
+    # BlockSizeOfRegion()
     #
     #   @param  BlockSizeList        List of block information
     #   @param  FvObj                The object for FV
@@ -296,7 +321,8 @@ class Region(object):
             else:
                 # region ended within current blocks
                 if self.Offset + self.Size <= End:
-                    ExpectedList.append((BlockSize, (RemindingSize + BlockSize - 1) // BlockSize))
+                    ExpectedList.append(
+                        (BlockSize, (RemindingSize + BlockSize - 1) // BlockSize))
                     break
                 # region not ended yet
                 else:
@@ -337,12 +363,9 @@ class Region(object):
                                     % FvObj.UiFvName, ExtraData=ExpectedListData)
                 elif Item[1] != ExpectedList[Index][1]:
                     if (Item[1] < ExpectedList[Index][1]) and (Index == len(FvObj.BlockSizeList) - 1):
-                        break;
+                        break
                     else:
                         EdkLogger.error("GenFds", GENFDS_ERROR, "BlockStatements of FV %s are not align with FD's, suggested FV BlockStatement"
                                         % FvObj.UiFvName, ExtraData=ExpectedListData)
                 else:
                     Index += 1
-
-
-
diff --git a/BaseTools/Source/Python/GenFds/Rule.py b/BaseTools/Source/Python/GenFds/Rule.py
index 6561c0efd80f..79ea99441d90 100644
--- a/BaseTools/Source/Python/GenFds/Rule.py
+++ b/BaseTools/Source/Python/GenFds/Rule.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Rule object for generating FFS
 #
 #  Copyright (c) 2007, Intel Corporation. All rights reserved.<BR>
@@ -11,11 +11,13 @@
 #
 from CommonDataClass.FdfClass import RuleClassObject
 
-## Rule base class
+# Rule base class
 #
 #
+
+
 class Rule(RuleClassObject):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
diff --git a/BaseTools/Source/Python/GenFds/RuleComplexFile.py b/BaseTools/Source/Python/GenFds/RuleComplexFile.py
index 198f4f0a9ab3..ed5dd4ee371c 100644
--- a/BaseTools/Source/Python/GenFds/RuleComplexFile.py
+++ b/BaseTools/Source/Python/GenFds/RuleComplexFile.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Complex Rule object for generating FFS
 #
 #  Copyright (c) 2007, Intel Corporation. All rights reserved.<BR>
@@ -11,13 +11,15 @@
 #
 from __future__ import absolute_import
 from . import Rule
-from  CommonDataClass.FdfClass import RuleComplexFileClassObject
+from CommonDataClass.FdfClass import RuleComplexFileClassObject
 
-## complex rule
+# complex rule
 #
 #
-class RuleComplexFile(RuleComplexFileClassObject) :
-    ## The constructor
+
+
+class RuleComplexFile(RuleComplexFileClassObject):
+    # The constructor
     #
     #   @param  self        The object pointer
     #
diff --git a/BaseTools/Source/Python/GenFds/RuleSimpleFile.py b/BaseTools/Source/Python/GenFds/RuleSimpleFile.py
index 772c768cc982..0b705db8ed1b 100644
--- a/BaseTools/Source/Python/GenFds/RuleSimpleFile.py
+++ b/BaseTools/Source/Python/GenFds/RuleSimpleFile.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Simple Rule object for generating FFS
 #
 #  Copyright (c) 2007, Intel Corporation. All rights reserved.<BR>
@@ -13,11 +13,13 @@ from __future__ import absolute_import
 from . import Rule
 from CommonDataClass.FdfClass import RuleSimpleFileClassObject
 
-## simple rule
+# simple rule
 #
 #
-class RuleSimpleFile (RuleSimpleFileClassObject) :
-    ## The constructor
+
+
+class RuleSimpleFile (RuleSimpleFileClassObject):
+    # The constructor
     #
     #   @param  self        The object pointer
     #
diff --git a/BaseTools/Source/Python/GenFds/Section.py b/BaseTools/Source/Python/GenFds/Section.py
index 447828c8e588..9fa4534f0fde 100644
--- a/BaseTools/Source/Python/GenFds/Section.py
+++ b/BaseTools/Source/Python/GenFds/Section.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # section base class
 #
 #  Copyright (c) 2007-2018, Intel Corporation. All rights reserved.<BR>
@@ -12,75 +12,78 @@
 from __future__ import absolute_import
 from CommonDataClass.FdfClass import SectionClassObject
 from .GenFdsGlobalVariable import GenFdsGlobalVariable
-import Common.LongFilePathOs as os, glob
+import Common.LongFilePathOs as os
+import glob
 from Common import EdkLogger
 from Common.BuildToolError import *
 from Common.DataType import *
 
-## section base class
+# section base class
 #
 #
+
+
 class Section (SectionClassObject):
     SectionType = {
-        'RAW'       : 'EFI_SECTION_RAW',
-        'FREEFORM'  : 'EFI_SECTION_FREEFORM_SUBTYPE_GUID',
-        BINARY_FILE_TYPE_PE32      : 'EFI_SECTION_PE32',
-        BINARY_FILE_TYPE_PIC       : 'EFI_SECTION_PIC',
-        BINARY_FILE_TYPE_TE        : 'EFI_SECTION_TE',
-        'FV_IMAGE'  : 'EFI_SECTION_FIRMWARE_VOLUME_IMAGE',
-        BINARY_FILE_TYPE_DXE_DEPEX : 'EFI_SECTION_DXE_DEPEX',
-        BINARY_FILE_TYPE_PEI_DEPEX : 'EFI_SECTION_PEI_DEPEX',
-        'GUIDED'    : 'EFI_SECTION_GUID_DEFINED',
-        'COMPRESS'  : 'EFI_SECTION_COMPRESSION',
-        BINARY_FILE_TYPE_UI        : 'EFI_SECTION_USER_INTERFACE',
-        BINARY_FILE_TYPE_SMM_DEPEX : 'EFI_SECTION_SMM_DEPEX'
+        'RAW': 'EFI_SECTION_RAW',
+        'FREEFORM': 'EFI_SECTION_FREEFORM_SUBTYPE_GUID',
+        BINARY_FILE_TYPE_PE32: 'EFI_SECTION_PE32',
+        BINARY_FILE_TYPE_PIC: 'EFI_SECTION_PIC',
+        BINARY_FILE_TYPE_TE: 'EFI_SECTION_TE',
+        'FV_IMAGE': 'EFI_SECTION_FIRMWARE_VOLUME_IMAGE',
+        BINARY_FILE_TYPE_DXE_DEPEX: 'EFI_SECTION_DXE_DEPEX',
+        BINARY_FILE_TYPE_PEI_DEPEX: 'EFI_SECTION_PEI_DEPEX',
+        'GUIDED': 'EFI_SECTION_GUID_DEFINED',
+        'COMPRESS': 'EFI_SECTION_COMPRESSION',
+        BINARY_FILE_TYPE_UI: 'EFI_SECTION_USER_INTERFACE',
+        BINARY_FILE_TYPE_SMM_DEPEX: 'EFI_SECTION_SMM_DEPEX'
     }
 
     BinFileType = {
-        BINARY_FILE_TYPE_GUID          : '.guid',
-        'ACPI'          : '.acpi',
-        'ASL'           : '.asl' ,
-        BINARY_FILE_TYPE_UEFI_APP      : '.app',
-        BINARY_FILE_TYPE_LIB           : '.lib',
-        BINARY_FILE_TYPE_PE32          : '.pe32',
-        BINARY_FILE_TYPE_PIC           : '.pic',
-        BINARY_FILE_TYPE_PEI_DEPEX     : '.depex',
-        'SEC_PEI_DEPEX' : '.depex',
-        BINARY_FILE_TYPE_TE            : '.te',
-        BINARY_FILE_TYPE_UNI_VER       : '.ver',
-        BINARY_FILE_TYPE_VER           : '.ver',
-        BINARY_FILE_TYPE_UNI_UI        : '.ui',
-        BINARY_FILE_TYPE_UI            : '.ui',
-        BINARY_FILE_TYPE_BIN           : '.bin',
-        'RAW'           : '.raw',
-        'COMPAT16'      : '.comp16',
-        BINARY_FILE_TYPE_FV            : '.fv'
+        BINARY_FILE_TYPE_GUID: '.guid',
+        'ACPI': '.acpi',
+        'ASL': '.asl',
+        BINARY_FILE_TYPE_UEFI_APP: '.app',
+        BINARY_FILE_TYPE_LIB: '.lib',
+        BINARY_FILE_TYPE_PE32: '.pe32',
+        BINARY_FILE_TYPE_PIC: '.pic',
+        BINARY_FILE_TYPE_PEI_DEPEX: '.depex',
+        'SEC_PEI_DEPEX': '.depex',
+        BINARY_FILE_TYPE_TE: '.te',
+        BINARY_FILE_TYPE_UNI_VER: '.ver',
+        BINARY_FILE_TYPE_VER: '.ver',
+        BINARY_FILE_TYPE_UNI_UI: '.ui',
+        BINARY_FILE_TYPE_UI: '.ui',
+        BINARY_FILE_TYPE_BIN: '.bin',
+        'RAW': '.raw',
+        'COMPAT16': '.comp16',
+        BINARY_FILE_TYPE_FV: '.fv'
     }
 
     SectFileType = {
-        'SEC_GUID'      : '.sec' ,
-        'SEC_PE32'      : '.sec' ,
-        'SEC_PIC'       : '.sec',
-        'SEC_TE'        : '.sec',
-        'SEC_VER'       : '.sec',
-        'SEC_UI'        : '.sec',
-        'SEC_COMPAT16'  : '.sec',
-        'SEC_BIN'       : '.sec'
+        'SEC_GUID': '.sec',
+        'SEC_PE32': '.sec',
+        'SEC_PIC': '.sec',
+        'SEC_TE': '.sec',
+        'SEC_VER': '.sec',
+        'SEC_UI': '.sec',
+        'SEC_COMPAT16': '.sec',
+        'SEC_BIN': '.sec'
     }
 
     ToolGuid = {
-        '0xa31280ad-0x481e-0x41b6-0x95e8-0x127f-0x4c984779' : 'TianoCompress',
-        '0xee4e5898-0x3914-0x4259-0x9d6e-0xdc7b-0xd79403cf' : 'LzmaCompress'
+        '0xa31280ad-0x481e-0x41b6-0x95e8-0x127f-0x4c984779': 'TianoCompress',
+        '0xee4e5898-0x3914-0x4259-0x9d6e-0xdc7b-0xd79403cf': 'LzmaCompress'
     }
 
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
     def __init__(self):
         SectionClassObject.__init__(self)
 
-    ## GenSection() method
+    # GenSection() method
     #
     #   virtual function
     #
@@ -92,10 +95,10 @@ class Section (SectionClassObject):
     #   @param  FfsInf      FfsInfStatement object that contains this section data
     #   @param  Dict        dictionary contains macro and its value
     #
-    def GenSection(self, OutputPath, GuidName, SecNum, keyStringList, FfsInf = None, Dict = None):
+    def GenSection(self, OutputPath, GuidName, SecNum, keyStringList, FfsInf=None, Dict=None):
         pass
 
-    ## GetFileList() method
+    # GetFileList() method
     #
     #   Generate compressed section
     #
@@ -106,12 +109,12 @@ class Section (SectionClassObject):
     #   @param  Dict        dictionary contains macro and its value
     #   @retval tuple       (File list, boolean)
     #
-    def GetFileList(FfsInf, FileType, FileExtension, Dict = None, IsMakefile=False, SectionType=None):
+    def GetFileList(FfsInf, FileType, FileExtension, Dict=None, IsMakefile=False, SectionType=None):
         IsSect = FileType in Section.SectFileType
 
         if FileExtension is not None:
             Suffix = FileExtension
-        elif IsSect :
+        elif IsSect:
             Suffix = Section.SectionType.get(FileType)
         else:
             Suffix = Section.BinFileType.get(FileType)
@@ -122,17 +125,21 @@ class Section (SectionClassObject):
         if FileType is not None:
             for File in FfsInf.BinFileList:
                 if File.Arch == TAB_ARCH_COMMON or FfsInf.CurrentArch == File.Arch:
-                    if File.Type == FileType or (int(FfsInf.PiSpecVersion, 16) >= 0x0001000A \
+                    if File.Type == FileType or (int(FfsInf.PiSpecVersion, 16) >= 0x0001000A
                                                  and FileType == 'DXE_DPEX' and File.Type == BINARY_FILE_TYPE_SMM_DEPEX) \
-                                                 or (FileType == BINARY_FILE_TYPE_TE and File.Type == BINARY_FILE_TYPE_PE32):
+                            or (FileType == BINARY_FILE_TYPE_TE and File.Type == BINARY_FILE_TYPE_PE32):
                         if TAB_STAR in FfsInf.TargetOverrideList or File.Target == TAB_STAR or File.Target in FfsInf.TargetOverrideList or FfsInf.TargetOverrideList == []:
-                            FileList.append(FfsInf.PatchEfiFile(File.Path, File.Type))
+                            FileList.append(
+                                FfsInf.PatchEfiFile(File.Path, File.Type))
                         else:
-                            GenFdsGlobalVariable.InfLogger ("\nBuild Target \'%s\' of File %s is not in the Scope of %s specified by INF %s in FDF" %(File.Target, File.File, FfsInf.TargetOverrideList, FfsInf.InfFileName))
+                            GenFdsGlobalVariable.InfLogger("\nBuild Target \'%s\' of File %s is not in the Scope of %s specified by INF %s in FDF" % (
+                                File.Target, File.File, FfsInf.TargetOverrideList, FfsInf.InfFileName))
                     else:
-                        GenFdsGlobalVariable.VerboseLogger ("\nFile Type \'%s\' of File %s in %s is not same with file type \'%s\' from Rule in FDF" %(File.Type, File.File, FfsInf.InfFileName, FileType))
+                        GenFdsGlobalVariable.VerboseLogger("\nFile Type \'%s\' of File %s in %s is not same with file type \'%s\' from Rule in FDF" % (
+                            File.Type, File.File, FfsInf.InfFileName, FileType))
                 else:
-                    GenFdsGlobalVariable.InfLogger ("\nCurrent ARCH \'%s\' of File %s is not in the Support Arch Scope of %s specified by INF %s in FDF" %(FfsInf.CurrentArch, File.File, File.Arch, FfsInf.InfFileName))
+                    GenFdsGlobalVariable.InfLogger("\nCurrent ARCH \'%s\' of File %s is not in the Support Arch Scope of %s specified by INF %s in FDF" % (
+                        FfsInf.CurrentArch, File.File, File.Arch, FfsInf.InfFileName))
 
         elif FileType is None and SectionType == BINARY_FILE_TYPE_RAW:
             for File in FfsInf.BinFileList:
@@ -145,8 +152,8 @@ class Section (SectionClassObject):
                 if Suffix in SuffixMap:
                     FileList.extend(SuffixMap[Suffix])
 
-        #Process the file lists is alphabetical for a same section type
-        if len (FileList) > 1:
+        # Process the file lists is alphabetical for a same section type
+        if len(FileList) > 1:
             FileList.sort()
 
         return FileList, IsSect
diff --git a/BaseTools/Source/Python/GenFds/UiSection.py b/BaseTools/Source/Python/GenFds/UiSection.py
index f643058bd637..964bf4de9622 100644
--- a/BaseTools/Source/Python/GenFds/UiSection.py
+++ b/BaseTools/Source/Python/GenFds/UiSection.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # process UI section generation
 #
 #  Copyright (c) 2007 - 2017, Intel Corporation. All rights reserved.<BR>
@@ -19,19 +19,21 @@ from CommonDataClass.FdfClass import UiSectionClassObject
 from Common.LongFilePathSupport import OpenLongFilePath as open
 from Common.DataType import *
 
-## generate UI section
+# generate UI section
 #
 #
+
+
 class UiSection (UiSectionClassObject):
 
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
     def __init__(self):
         UiSectionClassObject.__init__(self)
 
-    ## GenSection() method
+    # GenSection() method
     #
     #   Generate UI section
     #
@@ -44,7 +46,7 @@ class UiSection (UiSectionClassObject):
     #   @param  Dict        dictionary contains macro and its value
     #   @retval tuple       (Generated file name, section alignment)
     #
-    def GenSection(self, OutputPath, ModuleName, SecNum, KeyStringList, FfsInf=None, Dict=None, IsMakefile = False):
+    def GenSection(self, OutputPath, ModuleName, SecNum, KeyStringList, FfsInf=None, Dict=None, IsMakefile=False):
         #
         # Prepare the parameter of GenSection
         #
@@ -53,21 +55,24 @@ class UiSection (UiSectionClassObject):
             self.StringData = FfsInf.__ExtendMacro__(self.StringData)
             self.FileName = FfsInf.__ExtendMacro__(self.FileName)
 
-        OutputFile = os.path.join(OutputPath, ModuleName + SUP_MODULE_SEC + SecNum + SectionSuffix.get(BINARY_FILE_TYPE_UI))
+        OutputFile = os.path.join(
+            OutputPath, ModuleName + SUP_MODULE_SEC + SecNum + SectionSuffix.get(BINARY_FILE_TYPE_UI))
 
-        if self.StringData is not None :
+        if self.StringData is not None:
             NameString = self.StringData
         elif self.FileName is not None:
             if Dict is None:
                 Dict = {}
-            FileNameStr = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.FileName)
+            FileNameStr = GenFdsGlobalVariable.ReplaceWorkspaceMacro(
+                self.FileName)
             FileNameStr = GenFdsGlobalVariable.MacroExtend(FileNameStr, Dict)
             FileObj = open(FileNameStr, 'r')
             NameString = FileObj.read()
             FileObj.close()
         else:
             NameString = ''
-        GenFdsGlobalVariable.GenerateSection(OutputFile, None, 'EFI_SECTION_USER_INTERFACE', Ui=NameString, IsMakefile=IsMakefile)
+        GenFdsGlobalVariable.GenerateSection(
+            OutputFile, None, 'EFI_SECTION_USER_INTERFACE', Ui=NameString, IsMakefile=IsMakefile)
 
         OutputFileList = []
         OutputFileList.append(OutputFile)
diff --git a/BaseTools/Source/Python/GenFds/VerSection.py b/BaseTools/Source/Python/GenFds/VerSection.py
index 7280e80cb4ef..7cd2e30d4a2b 100644
--- a/BaseTools/Source/Python/GenFds/VerSection.py
+++ b/BaseTools/Source/Python/GenFds/VerSection.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # process Version section generation
 #
 #  Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -17,19 +17,21 @@ from CommonDataClass.FdfClass import VerSectionClassObject
 from Common.LongFilePathSupport import OpenLongFilePath as open
 from Common.DataType import SUP_MODULE_SEC
 
-## generate version section
+# generate version section
 #
 #
+
+
 class VerSection (VerSectionClassObject):
 
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #
     def __init__(self):
         VerSectionClassObject.__init__(self)
 
-    ## GenSection() method
+    # GenSection() method
     #
     #   Generate version section
     #
@@ -42,7 +44,7 @@ class VerSection (VerSectionClassObject):
     #   @param  Dict        dictionary contains macro and its value
     #   @retval tuple       (Generated file name, section alignment)
     #
-    def GenSection(self, OutputPath, ModuleName, SecNum, KeyStringList, FfsInf=None, Dict=None, IsMakefile = False):
+    def GenSection(self, OutputPath, ModuleName, SecNum, KeyStringList, FfsInf=None, Dict=None, IsMakefile=False):
         #
         # Prepare the parameter of GenSection
         #
@@ -63,7 +65,8 @@ class VerSection (VerSectionClassObject):
         elif self.FileName:
             if Dict is None:
                 Dict = {}
-            FileNameStr = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.FileName)
+            FileNameStr = GenFdsGlobalVariable.ReplaceWorkspaceMacro(
+                self.FileName)
             FileNameStr = GenFdsGlobalVariable.MacroExtend(FileNameStr, Dict)
             FileObj = open(FileNameStr, 'r')
             StringData = FileObj.read()
diff --git a/BaseTools/Source/Python/GenFds/__init__.py b/BaseTools/Source/Python/GenFds/__init__.py
index 09ea47ea5710..083caeeba2d5 100644
--- a/BaseTools/Source/Python/GenFds/__init__.py
+++ b/BaseTools/Source/Python/GenFds/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'GenFds' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
index d962ab0adda7..e589e7e2db64 100644
--- a/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
+++ b/BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Generate PCD table for 'Patchable In Module' type PCD with given .map file.
 #    The Patch PCD table like:
 #
@@ -10,7 +10,7 @@
 #
 #
 
-#======================================  External Libraries ========================================
+# ======================================  External Libraries ========================================
 from __future__ import print_function
 import optparse
 import Common.LongFilePathOs as os
@@ -28,10 +28,12 @@ __version_number__ = ("0.10" + " " + gBUILD_VERSION)
 __version__ = "%prog Version " + __version_number__
 __copyright__ = "Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved."
 
-#======================================  Internal Libraries ========================================
+# ======================================  Internal Libraries ========================================
+
+# ============================================== Code ===============================================
+symRe = re.compile(
+    '^([\da-fA-F]+):([\da-fA-F]+) +([\.\-:\\\\\w\?@\$<>]+) +([\da-fA-F]+)', re.UNICODE)
 
-#============================================== Code ===============================================
-symRe = re.compile('^([\da-fA-F]+):([\da-fA-F]+) +([\.\-:\\\\\w\?@\$<>]+) +([\da-fA-F]+)', re.UNICODE)
 
 def parsePcdInfoFromMapFile(mapfilepath, efifilepath):
     """ Parse map file to get binary patch pcd information
@@ -47,25 +49,28 @@ def parsePcdInfoFromMapFile(mapfilepath, efifilepath):
     except:
         return None
 
-    if len(lines) == 0: return None
+    if len(lines) == 0:
+        return None
     firstline = lines[0].strip()
     if re.match('^\s*Address\s*Size\s*Align\s*Out\s*In\s*Symbol\s*$', firstline):
         return _parseForXcodeAndClang9(lines, efifilepath)
     if (firstline.startswith("Archive member included ") and
-        firstline.endswith(" file (symbol)")):
+            firstline.endswith(" file (symbol)")):
         return _parseForGCC(lines, efifilepath)
     if firstline.startswith("# Path:"):
         return _parseForXcodeAndClang9(lines, efifilepath)
     return _parseGeneral(lines, efifilepath)
 
+
 def _parseForXcodeAndClang9(lines, efifilepath):
-    valuePattern = re.compile('^([\da-fA-FxX]+)([\s\S]*)([_]*_gPcd_BinaryPatch_([\w]+))')
+    valuePattern = re.compile(
+        '^([\da-fA-FxX]+)([\s\S]*)([_]*_gPcd_BinaryPatch_([\w]+))')
     status = 0
     pcds = []
     for line in lines:
         line = line.strip()
-        if status == 0 and (re.match('^\s*Address\s*Size\s*Align\s*Out\s*In\s*Symbol\s*$', line) \
-            or line == "# Symbols:"):
+        if status == 0 and (re.match('^\s*Address\s*Size\s*Align\s*Out\s*In\s*Symbol\s*$', line)
+                            or line == "# Symbols:"):
             status = 1
             continue
         if status == 1 and len(line) != 0:
@@ -75,6 +80,7 @@ def _parseForXcodeAndClang9(lines, efifilepath):
                     pcds.append((m.groups(0)[3], int(m.groups(0)[0], 16)))
     return pcds
 
+
 def _parseForGCC(lines, efifilepath):
     """ Parse map file generated by GCC linker """
     dataPattern = re.compile('^.data._gPcd_BinaryPatch_([\w_\d]+)$')
@@ -91,7 +97,7 @@ def _parseForGCC(lines, efifilepath):
         elif status == 1 and line == 'Linker script and memory map':
             status = 2
             continue
-        elif status ==2 and line == 'START GROUP':
+        elif status == 2 and line == 'START GROUP':
             status = 3
             continue
 
@@ -107,13 +113,14 @@ def _parseForGCC(lines, efifilepath):
                     PcdName = m.groups(0)[0]
                     m = pcdPatternGcc.match(lines[index + 1].strip())
                     if m is not None:
-                        bpcds.append((PcdName, int(m.groups(0)[0], 16), int(sections[-1][1], 16), sections[-1][0]))
+                        bpcds.append((PcdName, int(m.groups(0)[0], 16), int(
+                            sections[-1][1], 16), sections[-1][0]))
 
     # get section information from efi file
     efisecs = PeImageClass(efifilepath).SectionHeaderList
     if efisecs is None or len(efisecs) == 0:
         return None
-    #redirection
+    # redirection
     redirection = 0
     for efisec in efisecs:
         for section in sections:
@@ -124,16 +131,18 @@ def _parseForGCC(lines, efifilepath):
         for efisec in efisecs:
             if pcd[1] >= efisec[1] and pcd[1] < efisec[1]+efisec[3]:
                 #assert efisec[0].strip() == pcd[3].strip() and efisec[1] + redirection == pcd[2], "There are some differences between map file and efi file"
-                pcds.append([pcd[0], efisec[2] + pcd[1] - efisec[1] - redirection, efisec[0]])
+                pcds.append([pcd[0], efisec[2] + pcd[1] -
+                            efisec[1] - redirection, efisec[0]])
     return pcds
 
+
 def _parseGeneral(lines, efifilepath):
     """ For MSFT, ICC, EBC
     @param lines    line array for map file
 
     @return a list which element hold (PcdName, Offset, SectionName)
     """
-    status = 0    #0 - beginning of file; 1 - PE section definition; 2 - symbol table
+    status = 0  # 0 - beginning of file; 1 - PE section definition; 2 - symbol table
     secs = []    # key = section name
     bPcds = []
     symPattern = re.compile('^[_]+gPcd_BinaryPatch_([\w]+)')
@@ -153,7 +162,8 @@ def _parseGeneral(lines, efifilepath):
             m = secReGeneral.match(line)
             assert m is not None, "Fail to parse the section in map file , line is %s" % line
             sec_no, sec_start, sec_length, sec_name, sec_class = m.groups(0)
-            secs.append([int(sec_no, 16), int(sec_start, 16), int(sec_length, 16), sec_name, sec_class])
+            secs.append([int(sec_no, 16), int(sec_start, 16),
+                        int(sec_length, 16), sec_name, sec_class])
         if status == 2 and len(line) != 0:
             m = symRe.match(line)
             assert m is not None, "Fail to parse the symbol in map file, line is %s" % line
@@ -166,9 +176,11 @@ def _parseGeneral(lines, efifilepath):
                 # fond a binary pcd entry in map file
                 for sec in secs:
                     if sec[0] == sec_no and (sym_offset >= sec[1] and sym_offset < sec[1] + sec[2]):
-                        bPcds.append([m2.groups(0)[0], sec[3], sym_offset, vir_addr, sec_no])
+                        bPcds.append([m2.groups(0)[0], sec[3],
+                                     sym_offset, vir_addr, sec_no])
 
-    if len(bPcds) == 0: return None
+    if len(bPcds) == 0:
+        return None
 
     # get section information from efi file
     efisecs = PeImageClass(efifilepath).SectionHeaderList
@@ -186,6 +198,7 @@ def _parseGeneral(lines, efifilepath):
                 pcds.append([pcd[0], efisec[2] + pcd[2], efisec[0]])
     return pcds
 
+
 def generatePcdTable(list, pcdpath):
     try:
         f = open(pcdpath, 'w')
@@ -195,15 +208,18 @@ def generatePcdTable(list, pcdpath):
     f.write('PCD Name                       Offset    Section Name\r\n')
 
     for pcditem in list:
-        f.write('%-30s 0x%-08X %-6s\r\n' % (pcditem[0], pcditem[1], pcditem[2]))
+        f.write('%-30s 0x%-08X %-6s\r\n' %
+                (pcditem[0], pcditem[1], pcditem[2]))
     f.close()
 
-    #print 'Success to generate Binary Patch PCD table at %s!' % pcdpath
+    # print 'Success to generate Binary Patch PCD table at %s!' % pcdpath
+
 
 if __name__ == '__main__':
     UsageString = "%prog -m <MapFile> -e <EfiFile> -o <OutFile>"
     AdditionalNotes = "\nPCD table is generated in file name with .BinaryPcdTable.txt postfix"
-    parser = optparse.OptionParser(description=__copyright__, version=__version__, usage=UsageString)
+    parser = optparse.OptionParser(
+        description=__copyright__, version=__version__, usage=UsageString)
     parser.add_option('-m', '--mapfile', action='store', dest='mapfile',
                       help='Absolute path of module map file.')
     parser.add_option('-e', '--efifile', action='store', dest='efifile',
@@ -221,7 +237,8 @@ if __name__ == '__main__':
             if options.outfile is not None:
                 generatePcdTable(list, options.outfile)
             else:
-                generatePcdTable(list, options.mapfile.replace('.map', '.BinaryPcdTable.txt'))
+                generatePcdTable(list, options.mapfile.replace(
+                    '.map', '.BinaryPcdTable.txt'))
         else:
             print('Fail to generate Patch PCD Table based on map file and efi file')
     else:
diff --git a/BaseTools/Source/Python/GenPatchPcdTable/__init__.py b/BaseTools/Source/Python/GenPatchPcdTable/__init__.py
index 70b46a525de7..6105f5ed5a9f 100644
--- a/BaseTools/Source/Python/GenPatchPcdTable/__init__.py
+++ b/BaseTools/Source/Python/GenPatchPcdTable/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'GenPatchPcdTable' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py b/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
index d35cd792704c..8ede157c7e5b 100644
--- a/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
+++ b/BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Patch value into the binary file.
 #
 # Copyright (c) 2010 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -25,7 +25,7 @@ __version_number__ = ("0.10" + " " + gBUILD_VERSION)
 __version__ = "%prog Version " + __version_number__
 __copyright__ = "Copyright (c) 2010 - 2018, Intel Corporation. All rights reserved."
 
-## PatchBinaryFile method
+# PatchBinaryFile method
 #
 # This method mainly patches the data into binary file.
 #
@@ -38,12 +38,14 @@ __copyright__ = "Copyright (c) 2010 - 2018, Intel Corporation. All rights reserv
 # @retval 0     File is updated successfully.
 # @retval not 0 File is updated failed.
 #
+
+
 def PatchBinaryFile(FileName, ValueOffset, TypeName, ValueString, MaxSize=0):
     #
     # Length of Binary File
     #
     FileHandle = open(FileName, 'rb')
-    FileHandle.seek (0, 2)
+    FileHandle.seek(0, 2)
     FileLength = FileHandle.tell()
     FileHandle.close()
     #
@@ -104,7 +106,7 @@ def PatchBinaryFile(FileName, ValueOffset, TypeName, ValueString, MaxSize=0):
                 ValueNumber = 1
             elif ValueString == 'FALSE':
                 ValueNumber = 0
-            ValueNumber = int (ValueString, 0)
+            ValueNumber = int(ValueString, 0)
             if ValueNumber != 0:
                 ValueNumber = 1
         except:
@@ -118,7 +120,7 @@ def PatchBinaryFile(FileName, ValueOffset, TypeName, ValueString, MaxSize=0):
         # Get PCD value for UINT* data type
         #
         try:
-            ValueNumber = int (ValueString, 0)
+            ValueNumber = int(ValueString, 0)
         except:
             return PARAMETER_INVALID, "PCD Value %s is not valid dec or hex string." % (ValueString)
         #
@@ -149,7 +151,7 @@ def PatchBinaryFile(FileName, ValueOffset, TypeName, ValueString, MaxSize=0):
             #
             # Patch {0x1, 0x2, ...} byte by byte
             #
-            ValueList = ValueString[1 : len(ValueString) - 1].split(',')
+            ValueList = ValueString[1: len(ValueString) - 1].split(',')
             Index = 0
             try:
                 for ByteString in ValueList:
@@ -191,13 +193,15 @@ def PatchBinaryFile(FileName, ValueOffset, TypeName, ValueString, MaxSize=0):
         FileHandle.close()
     return 0, "Patch Value into File %s successfully." % (FileName)
 
-## Parse command line options
+# Parse command line options
 #
 # Using standard Python module optparse to parse command line option of this tool.
 #
 # @retval Options   A optparse.Values object containing the parsed options
 # @retval InputFile Path of file to be trimmed
 #
+
+
 def Options():
     OptionList = [
         make_option("-f", "--offset", dest="PcdOffset", action="store", type="int",
@@ -214,25 +218,28 @@ def Options():
                           help="Run with debug information"),
         make_option("-q", "--quiet", dest="LogLevel", action="store_const", const=EdkLogger.QUIET,
                           help="Run quietly"),
-        make_option("-?", action="help", help="show this help message and exit"),
+        make_option("-?", action="help",
+                    help="show this help message and exit"),
     ]
 
     # use clearer usage to override default usage message
     UsageString = "%prog -f Offset -u Value -t Type [-s MaxSize] <input_file>"
 
-    Parser = OptionParser(description=__copyright__, version=__version__, option_list=OptionList, usage=UsageString)
+    Parser = OptionParser(description=__copyright__, version=__version__,
+                          option_list=OptionList, usage=UsageString)
     Parser.set_defaults(LogLevel=EdkLogger.INFO)
 
     Options, Args = Parser.parse_args()
 
     # error check
     if len(Args) == 0:
-        EdkLogger.error("PatchPcdValue", PARAMETER_INVALID, ExtraData=Parser.get_usage())
+        EdkLogger.error("PatchPcdValue", PARAMETER_INVALID,
+                        ExtraData=Parser.get_usage())
 
     InputFile = Args[len(Args) - 1]
     return Options, InputFile
 
-## Entrance method
+# Entrance method
 #
 # This method mainly dispatch specific methods per the command line options.
 # If no error found, return zero value so the caller of this tool can know
@@ -241,6 +248,8 @@ def Options():
 # @retval 0     Tool was successful
 # @retval 1     Tool failed
 #
+
+
 def Main():
     try:
         #
@@ -252,22 +261,27 @@ def Main():
             EdkLogger.SetLevel(CommandOptions.LogLevel + 1)
         else:
             EdkLogger.SetLevel(CommandOptions.LogLevel)
-        if not os.path.exists (InputFile):
-            EdkLogger.error("PatchPcdValue", FILE_NOT_FOUND, ExtraData=InputFile)
+        if not os.path.exists(InputFile):
+            EdkLogger.error("PatchPcdValue", FILE_NOT_FOUND,
+                            ExtraData=InputFile)
             return 1
         if CommandOptions.PcdOffset is None or CommandOptions.PcdValue is None or CommandOptions.PcdTypeName is None:
-            EdkLogger.error("PatchPcdValue", OPTION_MISSING, ExtraData="PcdOffset or PcdValue of PcdTypeName is not specified.")
+            EdkLogger.error("PatchPcdValue", OPTION_MISSING,
+                            ExtraData="PcdOffset or PcdValue of PcdTypeName is not specified.")
             return 1
         if CommandOptions.PcdTypeName.upper() not in TAB_PCD_NUMERIC_TYPES_VOID:
-            EdkLogger.error("PatchPcdValue", PARAMETER_INVALID, ExtraData="PCD type %s is not valid." % (CommandOptions.PcdTypeName))
+            EdkLogger.error("PatchPcdValue", PARAMETER_INVALID,
+                            ExtraData="PCD type %s is not valid." % (CommandOptions.PcdTypeName))
             return 1
         if CommandOptions.PcdTypeName.upper() == TAB_VOID and CommandOptions.PcdMaxSize is None:
-            EdkLogger.error("PatchPcdValue", OPTION_MISSING, ExtraData="PcdMaxSize is not specified for VOID* type PCD.")
+            EdkLogger.error("PatchPcdValue", OPTION_MISSING,
+                            ExtraData="PcdMaxSize is not specified for VOID* type PCD.")
             return 1
         #
         # Patch value into binary image.
         #
-        ReturnValue, ErrorInfo = PatchBinaryFile (InputFile, CommandOptions.PcdOffset, CommandOptions.PcdTypeName, CommandOptions.PcdValue, CommandOptions.PcdMaxSize)
+        ReturnValue, ErrorInfo = PatchBinaryFile(
+            InputFile, CommandOptions.PcdOffset, CommandOptions.PcdTypeName, CommandOptions.PcdValue, CommandOptions.PcdMaxSize)
         if ReturnValue != 0:
             EdkLogger.error("PatchPcdValue", ReturnValue, ExtraData=ErrorInfo)
             return 1
@@ -275,6 +289,7 @@ def Main():
     except:
         return 1
 
+
 if __name__ == '__main__':
     r = Main()
     sys.exit(r)
diff --git a/BaseTools/Source/Python/PatchPcdValue/__init__.py b/BaseTools/Source/Python/PatchPcdValue/__init__.py
index 08275ed6ed00..eda5835ac31e 100644
--- a/BaseTools/Source/Python/PatchPcdValue/__init__.py
+++ b/BaseTools/Source/Python/PatchPcdValue/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'PatchPcdValue' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
index 5d4c3a8599ff..d13091c042ba 100644
--- a/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
+++ b/BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This tool adds EFI_FIRMWARE_IMAGE_AUTHENTICATION for a binary.
 #
 # This tool only support CertType - EFI_CERT_TYPE_PKCS7_GUID
@@ -27,15 +27,15 @@ from Common.BuildVersion import gBUILD_VERSION
 #
 # Globals for help information
 #
-__prog__      = 'Pkcs7Sign'
-__version__   = '%s Version %s' % (__prog__, '0.9 ' + gBUILD_VERSION)
+__prog__ = 'Pkcs7Sign'
+__version__ = '%s Version %s' % (__prog__, '0.9 ' + gBUILD_VERSION)
 __copyright__ = 'Copyright (c) 2016, Intel Corporation. All rights reserved.'
-__usage__     = '%s -e|-d [options] <input_file>' % (__prog__)
+__usage__ = '%s -e|-d [options] <input_file>' % (__prog__)
 
 #
 # GUID for PKCS7 from UEFI Specification
 #
-WIN_CERT_REVISION      = 0x0200
+WIN_CERT_REVISION = 0x0200
 WIN_CERT_TYPE_EFI_GUID = 0x0EF1
 EFI_CERT_TYPE_PKCS7_GUID = uuid.UUID('{4aafd29d-68df-49ee-8aa9-347d375665a7}')
 
@@ -67,214 +67,246 @@ TEST_OTHER_PUBLIC_CERT_FILENAME = 'TestSub.pub.pem'
 TEST_TRUSTED_PUBLIC_CERT_FILENAME = 'TestRoot.pub.pem'
 
 if __name__ == '__main__':
-  #
-  # Create command line argument parser object
-  #
-  parser = argparse.ArgumentParser(prog=__prog__, usage=__usage__, description=__copyright__, conflict_handler='resolve')
-  group = parser.add_mutually_exclusive_group(required=True)
-  group.add_argument("-e", action="store_true", dest='Encode', help='encode file')
-  group.add_argument("-d", action="store_true", dest='Decode', help='decode file')
-  group.add_argument("--version", action='version', version=__version__)
-  parser.add_argument("-o", "--output", dest='OutputFile', type=str, metavar='filename', help="specify the output filename", required=True)
-  parser.add_argument("--signer-private-cert", dest='SignerPrivateCertFile', type=argparse.FileType('rb'), help="specify the signer private cert filename.  If not specified, a test signer private cert is used.")
-  parser.add_argument("--other-public-cert", dest='OtherPublicCertFile', type=argparse.FileType('rb'), help="specify the other public cert filename.  If not specified, a test other public cert is used.")
-  parser.add_argument("--trusted-public-cert", dest='TrustedPublicCertFile', type=argparse.FileType('rb'), help="specify the trusted public cert filename.  If not specified, a test trusted public cert is used.")
-  parser.add_argument("--monotonic-count", dest='MonotonicCountStr', type=str, help="specify the MonotonicCount in FMP capsule.  If not specified, 0 is used.")
-  parser.add_argument("--signature-size", dest='SignatureSizeStr', type=str, help="specify the signature size for decode process.")
-  parser.add_argument("-v", "--verbose", dest='Verbose', action="store_true", help="increase output messages")
-  parser.add_argument("-q", "--quiet", dest='Quiet', action="store_true", help="reduce output messages")
-  parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=range(0, 10), default=0, help="set debug level")
-  parser.add_argument(metavar="input_file", dest='InputFile', type=argparse.FileType('rb'), help="specify the input filename")
-
-  #
-  # Parse command line arguments
-  #
-  args = parser.parse_args()
-
-  #
-  # Generate file path to Open SSL command
-  #
-  OpenSslCommand = 'openssl'
-  try:
-    OpenSslPath = os.environ['OPENSSL_PATH']
-    OpenSslCommand = os.path.join(OpenSslPath, OpenSslCommand)
-    if ' ' in OpenSslCommand:
-      OpenSslCommand = '"' + OpenSslCommand + '"'
-  except:
-    pass
-
-  #
-  # Verify that Open SSL command is available
-  #
-  try:
-    Process = subprocess.Popen('%s version' % (OpenSslCommand), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
-  except:
-    print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
-    sys.exit(1)
-
-  Version = Process.communicate()
-  if Process.returncode != 0:
-    print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
-    sys.exit(Process.returncode)
-  print(Version[0].decode())
-
-  #
-  # Read input file into a buffer and save input filename
-  #
-  args.InputFileName   = args.InputFile.name
-  args.InputFileBuffer = args.InputFile.read()
-  args.InputFile.close()
-
-  #
-  # Save output filename and check if path exists
-  #
-  OutputDir = os.path.dirname(args.OutputFile)
-  if not os.path.exists(OutputDir):
-    print('ERROR: The output path does not exist: %s' % OutputDir)
-    sys.exit(1)
-  args.OutputFileName = args.OutputFile
-
-  try:
-    if args.MonotonicCountStr.upper().startswith('0X'):
-      args.MonotonicCountValue = int(args.MonotonicCountStr, 16)
-    else:
-      args.MonotonicCountValue = int(args.MonotonicCountStr)
-  except:
-    args.MonotonicCountValue = int(0)
-
-  if args.Encode:
     #
-    # Save signer private cert filename and close private cert file
+    # Create command line argument parser object
     #
+    parser = argparse.ArgumentParser(
+        prog=__prog__, usage=__usage__, description=__copyright__, conflict_handler='resolve')
+    group = parser.add_mutually_exclusive_group(required=True)
+    group.add_argument("-e", action="store_true",
+                       dest='Encode', help='encode file')
+    group.add_argument("-d", action="store_true",
+                       dest='Decode', help='decode file')
+    group.add_argument("--version", action='version', version=__version__)
+    parser.add_argument("-o", "--output", dest='OutputFile', type=str,
+                        metavar='filename', help="specify the output filename", required=True)
+    parser.add_argument("--signer-private-cert", dest='SignerPrivateCertFile', type=argparse.FileType('rb'),
+                        help="specify the signer private cert filename.  If not specified, a test signer private cert is used.")
+    parser.add_argument("--other-public-cert", dest='OtherPublicCertFile', type=argparse.FileType('rb'),
+                        help="specify the other public cert filename.  If not specified, a test other public cert is used.")
+    parser.add_argument("--trusted-public-cert", dest='TrustedPublicCertFile', type=argparse.FileType('rb'),
+                        help="specify the trusted public cert filename.  If not specified, a test trusted public cert is used.")
+    parser.add_argument("--monotonic-count", dest='MonotonicCountStr', type=str,
+                        help="specify the MonotonicCount in FMP capsule.  If not specified, 0 is used.")
+    parser.add_argument("--signature-size", dest='SignatureSizeStr',
+                        type=str, help="specify the signature size for decode process.")
+    parser.add_argument("-v", "--verbose", dest='Verbose',
+                        action="store_true", help="increase output messages")
+    parser.add_argument("-q", "--quiet", dest='Quiet',
+                        action="store_true", help="reduce output messages")
+    parser.add_argument("--debug", dest='Debug', type=int,
+                        metavar='[0-9]', choices=range(0, 10), default=0, help="set debug level")
+    parser.add_argument(metavar="input_file", dest='InputFile', type=argparse.FileType(
+        'rb'), help="specify the input filename")
+
+    #
+    # Parse command line arguments
+    #
+    args = parser.parse_args()
+
+    #
+    # Generate file path to Open SSL command
+    #
+    OpenSslCommand = 'openssl'
     try:
-      args.SignerPrivateCertFileName = args.SignerPrivateCertFile.name
-      args.SignerPrivateCertFile.close()
+        OpenSslPath = os.environ['OPENSSL_PATH']
+        OpenSslCommand = os.path.join(OpenSslPath, OpenSslCommand)
+        if ' ' in OpenSslCommand:
+            OpenSslCommand = '"' + OpenSslCommand + '"'
     except:
-      try:
-        #
-        # Get path to currently executing script or executable
-        #
-        if hasattr(sys, 'frozen'):
-            Pkcs7ToolPath = sys.executable
-        else:
-            Pkcs7ToolPath = sys.argv[0]
-        if Pkcs7ToolPath.startswith('"'):
-            Pkcs7ToolPath = Pkcs7ToolPath[1:]
-        if Pkcs7ToolPath.endswith('"'):
-            Pkcs7ToolPath = RsaToolPath[:-1]
-        args.SignerPrivateCertFileName = os.path.join(os.path.dirname(os.path.realpath(Pkcs7ToolPath)), TEST_SIGNER_PRIVATE_CERT_FILENAME)
-        args.SignerPrivateCertFile = open(args.SignerPrivateCertFileName, 'rb')
-        args.SignerPrivateCertFile.close()
-      except:
-        print('ERROR: test signer private cert file %s missing' % (args.SignerPrivateCertFileName))
-        sys.exit(1)
+        pass
 
     #
-    # Save other public cert filename and close public cert file
+    # Verify that Open SSL command is available
     #
     try:
-      args.OtherPublicCertFileName = args.OtherPublicCertFile.name
-      args.OtherPublicCertFile.close()
+        Process = subprocess.Popen('%s version' % (
+            OpenSslCommand), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
     except:
-      try:
-        #
-        # Get path to currently executing script or executable
-        #
-        if hasattr(sys, 'frozen'):
-            Pkcs7ToolPath = sys.executable
-        else:
-            Pkcs7ToolPath = sys.argv[0]
-        if Pkcs7ToolPath.startswith('"'):
-            Pkcs7ToolPath = Pkcs7ToolPath[1:]
-        if Pkcs7ToolPath.endswith('"'):
-            Pkcs7ToolPath = RsaToolPath[:-1]
-        args.OtherPublicCertFileName = os.path.join(os.path.dirname(os.path.realpath(Pkcs7ToolPath)), TEST_OTHER_PUBLIC_CERT_FILENAME)
-        args.OtherPublicCertFile = open(args.OtherPublicCertFileName, 'rb')
-        args.OtherPublicCertFile.close()
-      except:
-        print('ERROR: test other public cert file %s missing' % (args.OtherPublicCertFileName))
+        print(
+            'ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
         sys.exit(1)
 
-    format = "%dsQ" % len(args.InputFileBuffer)
-    FullInputFileBuffer = struct.pack(format, args.InputFileBuffer, args.MonotonicCountValue)
-
-    #
-    # Sign the input file using the specified private key and capture signature from STDOUT
-    #
-    Process = subprocess.Popen('%s smime -sign -binary -signer "%s" -outform DER -md sha256 -certfile "%s"' % (OpenSslCommand, args.SignerPrivateCertFileName, args.OtherPublicCertFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
-    Signature = Process.communicate(input=FullInputFileBuffer)[0]
+    Version = Process.communicate()
     if Process.returncode != 0:
-      sys.exit(Process.returncode)
+        print(
+            'ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
+        sys.exit(Process.returncode)
+    print(Version[0].decode())
 
     #
-    # Write output file that contains Signature, and Input data
+    # Read input file into a buffer and save input filename
     #
-    args.OutputFile = open(args.OutputFileName, 'wb')
-    args.OutputFile.write(Signature)
-    args.OutputFile.write(args.InputFileBuffer)
-    args.OutputFile.close()
+    args.InputFileName = args.InputFile.name
+    args.InputFileBuffer = args.InputFile.read()
+    args.InputFile.close()
 
-  if args.Decode:
     #
-    # Save trusted public cert filename and close public cert file
+    # Save output filename and check if path exists
     #
+    OutputDir = os.path.dirname(args.OutputFile)
+    if not os.path.exists(OutputDir):
+        print('ERROR: The output path does not exist: %s' % OutputDir)
+        sys.exit(1)
+    args.OutputFileName = args.OutputFile
+
     try:
-      args.TrustedPublicCertFileName = args.TrustedPublicCertFile.name
-      args.TrustedPublicCertFile.close()
+        if args.MonotonicCountStr.upper().startswith('0X'):
+            args.MonotonicCountValue = int(args.MonotonicCountStr, 16)
+        else:
+            args.MonotonicCountValue = int(args.MonotonicCountStr)
     except:
-      try:
+        args.MonotonicCountValue = int(0)
+
+    if args.Encode:
         #
-        # Get path to currently executing script or executable
+        # Save signer private cert filename and close private cert file
         #
-        if hasattr(sys, 'frozen'):
-            Pkcs7ToolPath = sys.executable
-        else:
-            Pkcs7ToolPath = sys.argv[0]
-        if Pkcs7ToolPath.startswith('"'):
-            Pkcs7ToolPath = Pkcs7ToolPath[1:]
-        if Pkcs7ToolPath.endswith('"'):
-            Pkcs7ToolPath = RsaToolPath[:-1]
-        args.TrustedPublicCertFileName = os.path.join(os.path.dirname(os.path.realpath(Pkcs7ToolPath)), TEST_TRUSTED_PUBLIC_CERT_FILENAME)
-        args.TrustedPublicCertFile = open(args.TrustedPublicCertFileName, 'rb')
-        args.TrustedPublicCertFile.close()
-      except:
-        print('ERROR: test trusted public cert file %s missing' % (args.TrustedPublicCertFileName))
-        sys.exit(1)
+        try:
+            args.SignerPrivateCertFileName = args.SignerPrivateCertFile.name
+            args.SignerPrivateCertFile.close()
+        except:
+            try:
+                #
+                # Get path to currently executing script or executable
+                #
+                if hasattr(sys, 'frozen'):
+                    Pkcs7ToolPath = sys.executable
+                else:
+                    Pkcs7ToolPath = sys.argv[0]
+                if Pkcs7ToolPath.startswith('"'):
+                    Pkcs7ToolPath = Pkcs7ToolPath[1:]
+                if Pkcs7ToolPath.endswith('"'):
+                    Pkcs7ToolPath = Pkcs7ToolPath[:-1]
+                args.SignerPrivateCertFileName = os.path.join(os.path.dirname(
+                    os.path.realpath(Pkcs7ToolPath)), TEST_SIGNER_PRIVATE_CERT_FILENAME)
+                args.SignerPrivateCertFile = open(
+                    args.SignerPrivateCertFileName, 'rb')
+                args.SignerPrivateCertFile.close()
+            except:
+                print('ERROR: test signer private cert file %s missing' %
+                      (args.SignerPrivateCertFileName))
+                sys.exit(1)
 
-    if not args.SignatureSizeStr:
-      print("ERROR: please use the option --signature-size to specify the size of the signature data!")
-      sys.exit(1)
-    else:
-      if args.SignatureSizeStr.upper().startswith('0X'):
-        SignatureSize = int(args.SignatureSizeStr, 16)
-      else:
-        SignatureSize = int(args.SignatureSizeStr)
-    if SignatureSize < 0:
-        print("ERROR: The value of option --signature-size can't be set to negative value!")
-        sys.exit(1)
-    elif SignatureSize > len(args.InputFileBuffer):
-        print("ERROR: The value of option --signature-size is exceed the size of the input file !")
-        sys.exit(1)
+        #
+        # Save other public cert filename and close public cert file
+        #
+        try:
+            args.OtherPublicCertFileName = args.OtherPublicCertFile.name
+            args.OtherPublicCertFile.close()
+        except:
+            try:
+                #
+                # Get path to currently executing script or executable
+                #
+                if hasattr(sys, 'frozen'):
+                    Pkcs7ToolPath = sys.executable
+                else:
+                    Pkcs7ToolPath = sys.argv[0]
+                if Pkcs7ToolPath.startswith('"'):
+                    Pkcs7ToolPath = Pkcs7ToolPath[1:]
+                if Pkcs7ToolPath.endswith('"'):
+                    Pkcs7ToolPath = Pkcs7ToolPath[:-1]
+                args.OtherPublicCertFileName = os.path.join(os.path.dirname(
+                    os.path.realpath(Pkcs7ToolPath)), TEST_OTHER_PUBLIC_CERT_FILENAME)
+                args.OtherPublicCertFile = open(
+                    args.OtherPublicCertFileName, 'rb')
+                args.OtherPublicCertFile.close()
+            except:
+                print('ERROR: test other public cert file %s missing' %
+                      (args.OtherPublicCertFileName))
+                sys.exit(1)
+
+        format = "%dsQ" % len(args.InputFileBuffer)
+        FullInputFileBuffer = struct.pack(
+            format, args.InputFileBuffer, args.MonotonicCountValue)
 
-    args.SignatureBuffer = args.InputFileBuffer[0:SignatureSize]
-    args.InputFileBuffer = args.InputFileBuffer[SignatureSize:]
+        #
+        # Sign the input file using the specified private key and capture signature from STDOUT
+        #
+        Process = subprocess.Popen('%s smime -sign -binary -signer "%s" -outform DER -md sha256 -certfile "%s"' % (OpenSslCommand,
+                                   args.SignerPrivateCertFileName, args.OtherPublicCertFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
+        Signature = Process.communicate(input=FullInputFileBuffer)[0]
+        if Process.returncode != 0:
+            sys.exit(Process.returncode)
 
-    format = "%dsQ" % len(args.InputFileBuffer)
-    FullInputFileBuffer = struct.pack(format, args.InputFileBuffer, args.MonotonicCountValue)
+        #
+        # Write output file that contains Signature, and Input data
+        #
+        args.OutputFile = open(args.OutputFileName, 'wb')
+        args.OutputFile.write(Signature)
+        args.OutputFile.write(args.InputFileBuffer)
+        args.OutputFile.close()
 
-    #
-    # Save output file contents from input file
-    #
-    open(args.OutputFileName, 'wb').write(FullInputFileBuffer)
+    if args.Decode:
+        #
+        # Save trusted public cert filename and close public cert file
+        #
+        try:
+            args.TrustedPublicCertFileName = args.TrustedPublicCertFile.name
+            args.TrustedPublicCertFile.close()
+        except:
+            try:
+                #
+                # Get path to currently executing script or executable
+                #
+                if hasattr(sys, 'frozen'):
+                    Pkcs7ToolPath = sys.executable
+                else:
+                    Pkcs7ToolPath = sys.argv[0]
+                if Pkcs7ToolPath.startswith('"'):
+                    Pkcs7ToolPath = Pkcs7ToolPath[1:]
+                if Pkcs7ToolPath.endswith('"'):
+                    Pkcs7ToolPath = Pkcs7ToolPath[:-1]
+                args.TrustedPublicCertFileName = os.path.join(os.path.dirname(
+                    os.path.realpath(Pkcs7ToolPath)), TEST_TRUSTED_PUBLIC_CERT_FILENAME)
+                args.TrustedPublicCertFile = open(
+                    args.TrustedPublicCertFileName, 'rb')
+                args.TrustedPublicCertFile.close()
+            except:
+                print('ERROR: test trusted public cert file %s missing' %
+                      (args.TrustedPublicCertFileName))
+                sys.exit(1)
+
+        if not args.SignatureSizeStr:
+            print(
+                "ERROR: please use the option --signature-size to specify the size of the signature data!")
+            sys.exit(1)
+        else:
+            if args.SignatureSizeStr.upper().startswith('0X'):
+                SignatureSize = int(args.SignatureSizeStr, 16)
+            else:
+                SignatureSize = int(args.SignatureSizeStr)
+        if SignatureSize < 0:
+            print(
+                "ERROR: The value of option --signature-size can't be set to a negative value!")
+            sys.exit(1)
+        elif SignatureSize > len(args.InputFileBuffer):
+            print(
+                "ERROR: The value of option --signature-size exceeds the size of the input file!")
+            sys.exit(1)
+
+        args.SignatureBuffer = args.InputFileBuffer[0:SignatureSize]
+        args.InputFileBuffer = args.InputFileBuffer[SignatureSize:]
+
+        format = "%dsQ" % len(args.InputFileBuffer)
+        FullInputFileBuffer = struct.pack(
+            format, args.InputFileBuffer, args.MonotonicCountValue)
 
-    #
-    # Verify signature
-    #
-    Process = subprocess.Popen('%s smime -verify -inform DER -content %s -CAfile %s' % (OpenSslCommand, args.OutputFileName, args.TrustedPublicCertFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
-    Process.communicate(input=args.SignatureBuffer)[0]
-    if Process.returncode != 0:
-      print('ERROR: Verification failed')
-      os.remove (args.OutputFileName)
-      sys.exit(Process.returncode)
+        #
+        # Save output file contents from input file
+        #
+        open(args.OutputFileName, 'wb').write(FullInputFileBuffer)
 
-    open(args.OutputFileName, 'wb').write(args.InputFileBuffer)
+        #
+        # Verify signature
+        #
+        Process = subprocess.Popen('%s smime -verify -inform DER -content %s -CAfile %s' % (OpenSslCommand, args.OutputFileName,
+                                   args.TrustedPublicCertFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
+        Process.communicate(input=args.SignatureBuffer)[0]
+        if Process.returncode != 0:
+            print('ERROR: Verification failed')
+            os.remove(args.OutputFileName)
+            sys.exit(Process.returncode)
+
+        open(args.OutputFileName, 'wb').write(args.InputFileBuffer)
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
index 6c9b8c464e4d..abdb99ac4083 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This tool can be used to generate new RSA 2048 bit private/public key pairs
 # in a PEM file format using OpenSSL command line utilities that are installed
 # on the path specified by the system environment variable OPENSSL_PATH.
@@ -27,144 +27,159 @@ from Common.BuildVersion import gBUILD_VERSION
 #
 # Globals for help information
 #
-__prog__      = 'Rsa2048Sha256GenerateKeys'
-__version__   = '%s Version %s' % (__prog__, '0.9 ' + gBUILD_VERSION)
+__prog__ = 'Rsa2048Sha256GenerateKeys'
+__version__ = '%s Version %s' % (__prog__, '0.9 ' + gBUILD_VERSION)
 __copyright__ = 'Copyright (c) 2013 - 2018, Intel Corporation. All rights reserved.'
-__usage__     = '%s [options]' % (__prog__)
+__usage__ = '%s [options]' % (__prog__)
 
 
 if __name__ == '__main__':
-  #
-  # Create command line argument parser object
-  #
-  parser = argparse.ArgumentParser(prog=__prog__, usage=__usage__, description=__copyright__, conflict_handler='resolve')
-  group = parser.add_mutually_exclusive_group(required=True)
-  group.add_argument("--version", action='version', version=__version__)
-  group.add_argument("-o", "--output", dest='OutputFile', type=argparse.FileType('wb'), metavar='filename', nargs='*', help="specify the output private key filename in PEM format")
-  group.add_argument("-i", "--input", dest='InputFile', type=argparse.FileType('rb'), metavar='filename', nargs='*', help="specify the input private key filename in PEM format")
-  parser.add_argument("--public-key-hash", dest='PublicKeyHashFile', type=argparse.FileType('wb'), help="specify the public key hash filename that is SHA 256 hash of 2048 bit RSA public key in binary format")
-  parser.add_argument("--public-key-hash-c", dest='PublicKeyHashCFile', type=argparse.FileType('wb'), help="specify the public key hash filename that is SHA 256 hash of 2048 bit RSA public key in C structure format")
-  parser.add_argument("-v", "--verbose", dest='Verbose', action="store_true", help="increase output messages")
-  parser.add_argument("-q", "--quiet", dest='Quiet', action="store_true", help="reduce output messages")
-  parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=range(0, 10), default=0, help="set debug level")
+    #
+    # Create command line argument parser object
+    #
+    parser = argparse.ArgumentParser(
+        prog=__prog__, usage=__usage__, description=__copyright__, conflict_handler='resolve')
+    group = parser.add_mutually_exclusive_group(required=True)
+    group.add_argument("--version", action='version', version=__version__)
+    group.add_argument("-o", "--output", dest='OutputFile', type=argparse.FileType('wb'),
+                       metavar='filename', nargs='*', help="specify the output private key filename in PEM format")
+    group.add_argument("-i", "--input", dest='InputFile', type=argparse.FileType('rb'),
+                       metavar='filename', nargs='*', help="specify the input private key filename in PEM format")
+    parser.add_argument("--public-key-hash", dest='PublicKeyHashFile', type=argparse.FileType('wb'),
+                        help="specify the public key hash filename that is SHA 256 hash of 2048 bit RSA public key in binary format")
+    parser.add_argument("--public-key-hash-c", dest='PublicKeyHashCFile', type=argparse.FileType('wb'),
+                        help="specify the public key hash filename that is SHA 256 hash of 2048 bit RSA public key in C structure format")
+    parser.add_argument("-v", "--verbose", dest='Verbose',
+                        action="store_true", help="increase output messages")
+    parser.add_argument("-q", "--quiet", dest='Quiet',
+                        action="store_true", help="reduce output messages")
+    parser.add_argument("--debug", dest='Debug', type=int,
+                        metavar='[0-9]', choices=range(0, 10), default=0, help="set debug level")
 
-  #
-  # Parse command line arguments
-  #
-  args = parser.parse_args()
+    #
+    # Parse command line arguments
+    #
+    args = parser.parse_args()
 
-  #
-  # Generate file path to Open SSL command
-  #
-  OpenSslCommand = 'openssl'
-  try:
-    OpenSslPath = os.environ['OPENSSL_PATH']
-    OpenSslCommand = os.path.join(OpenSslPath, OpenSslCommand)
-    if ' ' in OpenSslCommand:
-      OpenSslCommand = '"' + OpenSslCommand + '"'
-  except:
-    pass
+    #
+    # Generate file path to Open SSL command
+    #
+    OpenSslCommand = 'openssl'
+    try:
+        OpenSslPath = os.environ['OPENSSL_PATH']
+        OpenSslCommand = os.path.join(OpenSslPath, OpenSslCommand)
+        if ' ' in OpenSslCommand:
+            OpenSslCommand = '"' + OpenSslCommand + '"'
+    except:
+        pass
 
-  #
-  # Verify that Open SSL command is available
-  #
-  try:
-    Process = subprocess.Popen('%s version' % (OpenSslCommand), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
-  except:
-    print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
-    sys.exit(1)
+    #
+    # Verify that Open SSL command is available
+    #
+    try:
+        Process = subprocess.Popen('%s version' % (
+            OpenSslCommand), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
+    except:
+        print(
+            'ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
+        sys.exit(1)
 
-  Version = Process.communicate()
-  if Process.returncode != 0:
-    print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
-    sys.exit(Process.returncode)
-  print(Version[0].decode())
-
-  args.PemFileName = []
-
-  #
-  # Check for output file argument
-  #
-  if args.OutputFile is not None:
-    for Item in args.OutputFile:
-      #
-      # Save PEM filename and close output file
-      #
-      args.PemFileName.append(Item.name)
-      Item.close()
-
-      #
-      # Generate private key and save it to output file in a PEM file format
-      #
-      Process = subprocess.Popen('%s genrsa -out %s 2048' % (OpenSslCommand, Item.name), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
-      Process.communicate()
-      if Process.returncode != 0:
-        print('ERROR: RSA 2048 key generation failed')
+    Version = Process.communicate()
+    if Process.returncode != 0:
+        print(
+            'ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
         sys.exit(Process.returncode)
+    print(Version[0].decode())
 
-  #
-  # Check for input file argument
-  #
-  if args.InputFile is not None:
-    for Item in args.InputFile:
-      #
-      # Save PEM filename and close input file
-      #
-      args.PemFileName.append(Item.name)
-      Item.close()
+    args.PemFileName = []
 
-  PublicKeyHash = bytearray()
-  for Item in args.PemFileName:
     #
-    # Extract public key from private key into STDOUT
+    # Check for output file argument
     #
-    Process = subprocess.Popen('%s rsa -in %s -modulus -noout' % (OpenSslCommand, Item), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
-    PublicKeyHexString = Process.communicate()[0].decode().split(b'=')[1].strip()
-    if Process.returncode != 0:
-      print('ERROR: Unable to extract public key from private key')
-      sys.exit(Process.returncode)
-    PublicKey = bytearray()
-    for Index in range (0, len(PublicKeyHexString), 2):
-      PublicKey = PublicKey + PublicKeyHexString[Index:Index + 2]
+    if args.OutputFile is not None:
+        for Item in args.OutputFile:
+            #
+            # Save PEM filename and close output file
+            #
+            args.PemFileName.append(Item.name)
+            Item.close()
+
+            #
+            # Generate private key and save it to output file in a PEM file format
+            #
+            Process = subprocess.Popen('%s genrsa -out %s 2048' % (
+                OpenSslCommand, Item.name), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
+            Process.communicate()
+            if Process.returncode != 0:
+                print('ERROR: RSA 2048 key generation failed')
+                sys.exit(Process.returncode)
 
     #
-    # Generate SHA 256 hash of RSA 2048 bit public key into STDOUT
+    # Check for input file argument
     #
-    Process = subprocess.Popen('%s dgst -sha256 -binary' % (OpenSslCommand), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
-    Process.stdin.write (PublicKey)
-    PublicKeyHash = PublicKeyHash + Process.communicate()[0].decode()
-    if Process.returncode != 0:
-      print('ERROR: Unable to extract SHA 256 hash of public key')
-      sys.exit(Process.returncode)
+    if args.InputFile is not None:
+        for Item in args.InputFile:
+            #
+            # Save PEM filename and close input file
+            #
+            args.PemFileName.append(Item.name)
+            Item.close()
+
+    PublicKeyHash = bytearray()
+    for Item in args.PemFileName:
+        #
+        # Extract public key from private key into STDOUT
+        #
+        Process = subprocess.Popen('%s rsa -in %s -modulus -noout' % (
+            OpenSslCommand, Item), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
+        PublicKeyHexString = Process.communicate()[0].decode().split(b'=')[
+            1].strip()
+        if Process.returncode != 0:
+            print('ERROR: Unable to extract public key from private key')
+            sys.exit(Process.returncode)
+        PublicKey = bytearray()
+        for Index in range(0, len(PublicKeyHexString), 2):
+            PublicKey = PublicKey + PublicKeyHexString[Index:Index + 2]
 
-  #
-  # Write SHA 256 hash of 2048 bit binary public key to public key hash file
-  #
-  try:
-    args.PublicKeyHashFile.write (PublicKeyHash)
-    args.PublicKeyHashFile.close ()
-  except:
-    pass
+        #
+        # Generate SHA 256 hash of RSA 2048 bit public key into STDOUT
+        #
+        Process = subprocess.Popen('%s dgst -sha256 -binary' % (OpenSslCommand),
+                                   stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
+        Process.stdin.write(PublicKey)
+        PublicKeyHash = PublicKeyHash + Process.communicate()[0].decode()
+        if Process.returncode != 0:
+            print('ERROR: Unable to extract SHA 256 hash of public key')
+            sys.exit(Process.returncode)
 
-  #
-  # Convert public key hash to a C structure string
-  #
-  PublicKeyHashC = '{'
-  for Item in PublicKeyHash:
-    PublicKeyHashC = PublicKeyHashC + '0x%02x, ' % (Item)
-  PublicKeyHashC = PublicKeyHashC[:-2] + '}'
+    #
+    # Write SHA 256 hash of 2048 bit binary public key to public key hash file
+    #
+    try:
+        args.PublicKeyHashFile.write(PublicKeyHash)
+        args.PublicKeyHashFile.close()
+    except:
+        pass
 
-  #
-  # Write SHA 256 of 2048 bit binary public key to public key hash C structure file
-  #
-  try:
-    args.PublicKeyHashCFile.write (bytes(PublicKeyHashC))
-    args.PublicKeyHashCFile.close ()
-  except:
-    pass
+    #
+    # Convert public key hash to a C structure string
+    #
+    PublicKeyHashC = '{'
+    for Item in PublicKeyHash:
+        PublicKeyHashC = PublicKeyHashC + '0x%02x, ' % (Item)
+    PublicKeyHashC = PublicKeyHashC[:-2] + '}'
 
-  #
-  # If verbose is enabled display the public key in C structure format
-  #
-  if args.Verbose:
-    print('PublicKeySha256 = ' + PublicKeyHashC)
+    #
+    # Write SHA 256 of 2048 bit binary public key to public key hash C structure file
+    #
+    try:
+        args.PublicKeyHashCFile.write(bytes(PublicKeyHashC))
+        args.PublicKeyHashCFile.close()
+    except:
+        pass
+
+    #
+    # If verbose is enabled display the public key in C structure format
+    #
+    if args.Verbose:
+        print('PublicKeySha256 = ' + PublicKeyHashC)
diff --git a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
index df05826282eb..fb224cb83dac 100644
--- a/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
+++ b/BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This tool encodes and decodes GUIDed FFS sections or FMP capsule for a GUID type of
 # EFI_CERT_TYPE_RSA2048_SHA256_GUID defined in the UEFI 2.4 Specification as
 #   {0xa7717414, 0xc616, 0x4977, {0x94, 0x20, 0x84, 0x47, 0x12, 0xa7, 0x35, 0xbf}}
@@ -25,15 +25,16 @@ from Common.BuildVersion import gBUILD_VERSION
 #
 # Globals for help information
 #
-__prog__      = 'Rsa2048Sha256Sign'
-__version__   = '%s Version %s' % (__prog__, '0.9 ' + gBUILD_VERSION)
+__prog__ = 'Rsa2048Sha256Sign'
+__version__ = '%s Version %s' % (__prog__, '0.9 ' + gBUILD_VERSION)
 __copyright__ = 'Copyright (c) 2013 - 2018, Intel Corporation. All rights reserved.'
-__usage__     = '%s -e|-d [options] <input_file>' % (__prog__)
+__usage__ = '%s -e|-d [options] <input_file>' % (__prog__)
 
 #
 # GUID for SHA 256 Hash Algorithm from UEFI Specification
 #
-EFI_HASH_ALGORITHM_SHA256_GUID = uuid.UUID('{51aa59de-fdf2-4ea3-bc63-875fb7842ee9}')
+EFI_HASH_ALGORITHM_SHA256_GUID = uuid.UUID(
+    '{51aa59de-fdf2-4ea3-bc63-875fb7842ee9}')
 
 #
 # Structure definition to unpack EFI_CERT_BLOCK_RSA_2048_SHA256 from UEFI 2.4 Specification
@@ -44,7 +45,8 @@ EFI_HASH_ALGORITHM_SHA256_GUID = uuid.UUID('{51aa59de-fdf2-4ea3-bc63-875fb7842ee
 #     UINT8 Signature[256];
 #   } EFI_CERT_BLOCK_RSA_2048_SHA256;
 #
-EFI_CERT_BLOCK_RSA_2048_SHA256        = collections.namedtuple('EFI_CERT_BLOCK_RSA_2048_SHA256', ['HashType', 'PublicKey', 'Signature'])
+EFI_CERT_BLOCK_RSA_2048_SHA256 = collections.namedtuple(
+    'EFI_CERT_BLOCK_RSA_2048_SHA256', ['HashType', 'PublicKey', 'Signature'])
 EFI_CERT_BLOCK_RSA_2048_SHA256_STRUCT = struct.Struct('16s256s256s')
 
 #
@@ -53,183 +55,205 @@ EFI_CERT_BLOCK_RSA_2048_SHA256_STRUCT = struct.Struct('16s256s256s')
 TEST_SIGNING_PRIVATE_KEY_FILENAME = 'TestSigningPrivateKey.pem'
 
 if __name__ == '__main__':
-  #
-  # Create command line argument parser object
-  #
-  parser = argparse.ArgumentParser(prog=__prog__, usage=__usage__, description=__copyright__, conflict_handler='resolve')
-  group = parser.add_mutually_exclusive_group(required=True)
-  group.add_argument("-e", action="store_true", dest='Encode', help='encode file')
-  group.add_argument("-d", action="store_true", dest='Decode', help='decode file')
-  group.add_argument("--version", action='version', version=__version__)
-  parser.add_argument("-o", "--output", dest='OutputFile', type=str, metavar='filename', help="specify the output filename", required=True)
-  parser.add_argument("--monotonic-count", dest='MonotonicCountStr', type=str, help="specify the MonotonicCount in FMP capsule.")
-  parser.add_argument("--private-key", dest='PrivateKeyFile', type=argparse.FileType('rb'), help="specify the private key filename.  If not specified, a test signing key is used.")
-  parser.add_argument("-v", "--verbose", dest='Verbose', action="store_true", help="increase output messages")
-  parser.add_argument("-q", "--quiet", dest='Quiet', action="store_true", help="reduce output messages")
-  parser.add_argument("--debug", dest='Debug', type=int, metavar='[0-9]', choices=range(0, 10), default=0, help="set debug level")
-  parser.add_argument(metavar="input_file", dest='InputFile', type=argparse.FileType('rb'), help="specify the input filename")
-
-  #
-  # Parse command line arguments
-  #
-  args = parser.parse_args()
-
-  #
-  # Generate file path to Open SSL command
-  #
-  OpenSslCommand = 'openssl'
-  try:
-    OpenSslPath = os.environ['OPENSSL_PATH']
-    OpenSslCommand = os.path.join(OpenSslPath, OpenSslCommand)
-    if ' ' in OpenSslCommand:
-      OpenSslCommand = '"' + OpenSslCommand + '"'
-  except:
-    pass
-
-  #
-  # Verify that Open SSL command is available
-  #
-  try:
-    Process = subprocess.Popen('%s version' % (OpenSslCommand), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
-  except:
-    print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
-    sys.exit(1)
-
-  Version = Process.communicate()
-  if Process.returncode != 0:
-    print('ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
-    sys.exit(Process.returncode)
-  print(Version[0].decode('utf-8'))
-
-  #
-  # Read input file into a buffer and save input filename
-  #
-  args.InputFileName   = args.InputFile.name
-  args.InputFileBuffer = args.InputFile.read()
-  args.InputFile.close()
-
-  #
-  # Save output filename and check if path exists
-  #
-  OutputDir = os.path.dirname(args.OutputFile)
-  if not os.path.exists(OutputDir):
-    print('ERROR: The output path does not exist: %s' % OutputDir)
-    sys.exit(1)
-  args.OutputFileName = args.OutputFile
-
-  #
-  # Save private key filename and close private key file
-  #
-  try:
-    args.PrivateKeyFileName = args.PrivateKeyFile.name
-    args.PrivateKeyFile.close()
-  except:
-    try:
-      #
-      # Get path to currently executing script or executable
-      #
-      if hasattr(sys, 'frozen'):
-          RsaToolPath = sys.executable
-      else:
-          RsaToolPath = sys.argv[0]
-      if RsaToolPath.startswith('"'):
-          RsaToolPath = RsaToolPath[1:]
-      if RsaToolPath.endswith('"'):
-          RsaToolPath = RsaToolPath[:-1]
-      args.PrivateKeyFileName = os.path.join(os.path.dirname(os.path.realpath(RsaToolPath)), TEST_SIGNING_PRIVATE_KEY_FILENAME)
-      args.PrivateKeyFile = open(args.PrivateKeyFileName, 'rb')
-      args.PrivateKeyFile.close()
-    except:
-      print('ERROR: test signing private key file %s missing' % (args.PrivateKeyFileName))
-      sys.exit(1)
-
-  #
-  # Extract public key from private key into STDOUT
-  #
-  Process = subprocess.Popen('%s rsa -in "%s" -modulus -noout' % (OpenSslCommand, args.PrivateKeyFileName), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
-  PublicKeyHexString = Process.communicate()[0].split(b'=')[1].strip()
-  PublicKeyHexString = PublicKeyHexString.decode('utf-8')
-  PublicKey = ''
-  while len(PublicKeyHexString) > 0:
-    PublicKey = PublicKey + PublicKeyHexString[0:2]
-    PublicKeyHexString=PublicKeyHexString[2:]
-  if Process.returncode != 0:
-    sys.exit(Process.returncode)
-
-  if args.MonotonicCountStr:
+    #
+    # Create command line argument parser object
+    #
+    parser = argparse.ArgumentParser(
+        prog=__prog__, usage=__usage__, description=__copyright__, conflict_handler='resolve')
+    group = parser.add_mutually_exclusive_group(required=True)
+    group.add_argument("-e", action="store_true",
+                       dest='Encode', help='encode file')
+    group.add_argument("-d", action="store_true",
+                       dest='Decode', help='decode file')
+    group.add_argument("--version", action='version', version=__version__)
+    parser.add_argument("-o", "--output", dest='OutputFile', type=str,
+                        metavar='filename', help="specify the output filename", required=True)
+    parser.add_argument("--monotonic-count", dest='MonotonicCountStr',
+                        type=str, help="specify the MonotonicCount in FMP capsule.")
+    parser.add_argument("--private-key", dest='PrivateKeyFile', type=argparse.FileType('rb'),
+                        help="specify the private key filename.  If not specified, a test signing key is used.")
+    parser.add_argument("-v", "--verbose", dest='Verbose',
+                        action="store_true", help="increase output messages")
+    parser.add_argument("-q", "--quiet", dest='Quiet',
+                        action="store_true", help="reduce output messages")
+    parser.add_argument("--debug", dest='Debug', type=int,
+                        metavar='[0-9]', choices=range(0, 10), default=0, help="set debug level")
+    parser.add_argument(metavar="input_file", dest='InputFile', type=argparse.FileType(
+        'rb'), help="specify the input filename")
+
+    #
+    # Parse command line arguments
+    #
+    args = parser.parse_args()
+
+    #
+    # Generate file path to Open SSL command
+    #
+    OpenSslCommand = 'openssl'
     try:
-      if args.MonotonicCountStr.upper().startswith('0X'):
-        args.MonotonicCountValue = int(args.MonotonicCountStr, 16)
-      else:
-        args.MonotonicCountValue = int(args.MonotonicCountStr)
+        OpenSslPath = os.environ['OPENSSL_PATH']
+        OpenSslCommand = os.path.join(OpenSslPath, OpenSslCommand)
+        if ' ' in OpenSslCommand:
+            OpenSslCommand = '"' + OpenSslCommand + '"'
     except:
         pass
 
-  if args.Encode:
-    FullInputFileBuffer = args.InputFileBuffer
-    if args.MonotonicCountStr:
-      format = "%dsQ" % len(args.InputFileBuffer)
-      FullInputFileBuffer = struct.pack(format, args.InputFileBuffer, args.MonotonicCountValue)
     #
-    # Sign the input file using the specified private key and capture signature from STDOUT
+    # Verify that Open SSL command is available
     #
-    Process = subprocess.Popen('%s dgst -sha256 -sign "%s"' % (OpenSslCommand, args.PrivateKeyFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
-    Signature = Process.communicate(input=FullInputFileBuffer)[0]
+    try:
+        Process = subprocess.Popen('%s version' % (
+            OpenSslCommand), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
+    except:
+        print(
+            'ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
+        sys.exit(1)
+
+    Version = Process.communicate()
     if Process.returncode != 0:
-      sys.exit(Process.returncode)
+        print(
+            'ERROR: Open SSL command not available.  Please verify PATH or set OPENSSL_PATH')
+        sys.exit(Process.returncode)
+    print(Version[0].decode('utf-8'))
 
     #
-    # Write output file that contains hash GUID, Public Key, Signature, and Input data
+    # Read input file into a buffer and save input filename
     #
-    args.OutputFile = open(args.OutputFileName, 'wb')
-    args.OutputFile.write(EFI_HASH_ALGORITHM_SHA256_GUID.bytes_le)
-    args.OutputFile.write(bytearray.fromhex(str(PublicKey)))
-    args.OutputFile.write(Signature)
-    args.OutputFile.write(args.InputFileBuffer)
-    args.OutputFile.close()
+    args.InputFileName = args.InputFile.name
+    args.InputFileBuffer = args.InputFile.read()
+    args.InputFile.close()
 
-  if args.Decode:
     #
-    # Parse Hash Type, Public Key, and Signature from the section header
+    # Save output filename and check if path exists
     #
-    Header = EFI_CERT_BLOCK_RSA_2048_SHA256._make(EFI_CERT_BLOCK_RSA_2048_SHA256_STRUCT.unpack_from(args.InputFileBuffer))
-    args.InputFileBuffer = args.InputFileBuffer[EFI_CERT_BLOCK_RSA_2048_SHA256_STRUCT.size:]
+    OutputDir = os.path.dirname(args.OutputFile)
+    if not os.path.exists(OutputDir):
+        print('ERROR: The output path does not exist: %s' % OutputDir)
+        sys.exit(1)
+    args.OutputFileName = args.OutputFile
 
     #
-    # Verify that the Hash Type matches the expected SHA256 type
-    #
-    if uuid.UUID(bytes_le = Header.HashType) != EFI_HASH_ALGORITHM_SHA256_GUID:
-      print('ERROR: unsupport hash GUID')
-      sys.exit(1)
+    # Save private key filename and close private key file
+    #
+    try:
+        args.PrivateKeyFileName = args.PrivateKeyFile.name
+        args.PrivateKeyFile.close()
+    except:
+        try:
+            #
+            # Get path to currently executing script or executable
+            #
+            if hasattr(sys, 'frozen'):
+                RsaToolPath = sys.executable
+            else:
+                RsaToolPath = sys.argv[0]
+            if RsaToolPath.startswith('"'):
+                RsaToolPath = RsaToolPath[1:]
+            if RsaToolPath.endswith('"'):
+                RsaToolPath = RsaToolPath[:-1]
+            args.PrivateKeyFileName = os.path.join(os.path.dirname(
+                os.path.realpath(RsaToolPath)), TEST_SIGNING_PRIVATE_KEY_FILENAME)
+            args.PrivateKeyFile = open(args.PrivateKeyFileName, 'rb')
+            args.PrivateKeyFile.close()
+        except:
+            print('ERROR: test signing private key file %s missing' %
+                  (args.PrivateKeyFileName))
+            sys.exit(1)
 
     #
-    # Verify the public key
+    # Extract public key from private key into STDOUT
     #
-    if Header.PublicKey != bytearray.fromhex(PublicKey):
-      print('ERROR: Public key in input file does not match public key from private key file')
-      sys.exit(1)
+    Process = subprocess.Popen('%s rsa -in "%s" -modulus -noout' % (OpenSslCommand,
+                               args.PrivateKeyFileName), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
+    PublicKeyHexString = Process.communicate()[0].split(b'=')[1].strip()
+    PublicKeyHexString = PublicKeyHexString.decode('utf-8')
+    PublicKey = ''
+    while len(PublicKeyHexString) > 0:
+        PublicKey = PublicKey + PublicKeyHexString[0:2]
+        PublicKeyHexString = PublicKeyHexString[2:]
+    if Process.returncode != 0:
+        sys.exit(Process.returncode)
 
-    FullInputFileBuffer = args.InputFileBuffer
     if args.MonotonicCountStr:
-      format = "%dsQ" % len(args.InputFileBuffer)
-      FullInputFileBuffer = struct.pack(format, args.InputFileBuffer, args.MonotonicCountValue)
+        try:
+            if args.MonotonicCountStr.upper().startswith('0X'):
+                args.MonotonicCountValue = int(args.MonotonicCountStr, 16)
+            else:
+                args.MonotonicCountValue = int(args.MonotonicCountStr)
+        except:
+            pass
 
-    #
-    # Write Signature to output file
-    #
-    open(args.OutputFileName, 'wb').write(Header.Signature)
+    if args.Encode:
+        FullInputFileBuffer = args.InputFileBuffer
+        if args.MonotonicCountStr:
+            format = "%dsQ" % len(args.InputFileBuffer)
+            FullInputFileBuffer = struct.pack(
+                format, args.InputFileBuffer, args.MonotonicCountValue)
+        #
+        # Sign the input file using the specified private key and capture signature from STDOUT
+        #
+        Process = subprocess.Popen('%s dgst -sha256 -sign "%s"' % (OpenSslCommand, args.PrivateKeyFileName),
+                                   stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
+        Signature = Process.communicate(input=FullInputFileBuffer)[0]
+        if Process.returncode != 0:
+            sys.exit(Process.returncode)
 
-    #
-    # Verify signature
-    #
-    Process = subprocess.Popen('%s dgst -sha256 -prverify "%s" -signature %s' % (OpenSslCommand, args.PrivateKeyFileName, args.OutputFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
-    Process.communicate(input=FullInputFileBuffer)
-    if Process.returncode != 0:
-      print('ERROR: Verification failed')
-      os.remove (args.OutputFileName)
-      sys.exit(Process.returncode)
+        #
+        # Write output file that contains hash GUID, Public Key, Signature, and Input data
+        #
+        args.OutputFile = open(args.OutputFileName, 'wb')
+        args.OutputFile.write(EFI_HASH_ALGORITHM_SHA256_GUID.bytes_le)
+        args.OutputFile.write(bytearray.fromhex(str(PublicKey)))
+        args.OutputFile.write(Signature)
+        args.OutputFile.write(args.InputFileBuffer)
+        args.OutputFile.close()
 
-    #
-    # Save output file contents from input file
-    #
-    open(args.OutputFileName, 'wb').write(args.InputFileBuffer)
+    if args.Decode:
+        #
+        # Parse Hash Type, Public Key, and Signature from the section header
+        #
+        Header = EFI_CERT_BLOCK_RSA_2048_SHA256._make(
+            EFI_CERT_BLOCK_RSA_2048_SHA256_STRUCT.unpack_from(args.InputFileBuffer))
+        args.InputFileBuffer = args.InputFileBuffer[EFI_CERT_BLOCK_RSA_2048_SHA256_STRUCT.size:]
+
+        #
+        # Verify that the Hash Type matches the expected SHA256 type
+        #
+        if uuid.UUID(bytes_le=Header.HashType) != EFI_HASH_ALGORITHM_SHA256_GUID:
+            print('ERROR: unsupported hash GUID')
+            sys.exit(1)
+
+        #
+        # Verify the public key
+        #
+        if Header.PublicKey != bytearray.fromhex(PublicKey):
+            print(
+                'ERROR: Public key in input file does not match public key from private key file')
+            sys.exit(1)
+
+        FullInputFileBuffer = args.InputFileBuffer
+        if args.MonotonicCountStr:
+            format = "%dsQ" % len(args.InputFileBuffer)
+            FullInputFileBuffer = struct.pack(
+                format, args.InputFileBuffer, args.MonotonicCountValue)
+
+        #
+        # Write Signature to output file
+        #
+        open(args.OutputFileName, 'wb').write(Header.Signature)
+
+        #
+        # Verify signature
+        #
+        Process = subprocess.Popen('%s dgst -sha256 -prverify "%s" -signature %s' % (OpenSslCommand, args.PrivateKeyFileName,
+                                   args.OutputFileName), stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
+        Process.communicate(input=FullInputFileBuffer)
+        if Process.returncode != 0:
+            print('ERROR: Verification failed')
+            os.remove(args.OutputFileName)
+            sys.exit(Process.returncode)
+
+        #
+        # Save output file contents from input file
+        #
+        open(args.OutputFileName, 'wb').write(args.InputFileBuffer)
diff --git a/BaseTools/Source/Python/Split/Split.py b/BaseTools/Source/Python/Split/Split.py
index e70d5c22c468..92d3ada0850e 100644
--- a/BaseTools/Source/Python/Split/Split.py
+++ b/BaseTools/Source/Python/Split/Split.py
@@ -92,16 +92,20 @@ def getFileSize(filename):
 
     return length
 
-def getoutputfileabs(inputfile, prefix, outputfile,index):
+
+def getoutputfileabs(inputfile, prefix, outputfile, index):
     inputfile = os.path.abspath(inputfile)
     if outputfile is None:
         if prefix is None:
-            outputfileabs = os.path.join(os.path.dirname(inputfile), "{}{}".format(os.path.basename(inputfile),index))
+            outputfileabs = os.path.join(os.path.dirname(
+                inputfile), "{}{}".format(os.path.basename(inputfile), index))
         else:
             if os.path.isabs(prefix):
-                outputfileabs = os.path.join(prefix, "{}{}".format(os.path.basename(inputfile),index))
+                outputfileabs = os.path.join(prefix, "{}{}".format(
+                    os.path.basename(inputfile), index))
             else:
-                outputfileabs = os.path.join(os.getcwd(), prefix, "{}{}".format(os.path.basename(inputfile),index))
+                outputfileabs = os.path.join(os.getcwd(), prefix, "{}{}".format(
+                    os.path.basename(inputfile), index))
     elif not os.path.isabs(outputfile):
         if prefix is None:
             outputfileabs = os.path.join(os.getcwd(), outputfile)
@@ -114,6 +118,7 @@ def getoutputfileabs(inputfile, prefix, outputfile,index):
         outputfileabs = outputfile
     return outputfileabs
 
+
 def splitFile(inputfile, position, outputdir=None, outputfile1=None, outputfile2=None):
     '''
     Split the inputfile into outputfile1 and outputfile2 from the position.
@@ -132,12 +137,12 @@ def splitFile(inputfile, position, outputdir=None, outputfile1=None, outputfile2
     # Create dir for the output files
     try:
 
-        outputfile1 = getoutputfileabs(inputfile, outputdir, outputfile1,1)
+        outputfile1 = getoutputfileabs(inputfile, outputdir, outputfile1, 1)
         outputfolder = os.path.dirname(outputfile1)
         if not os.path.exists(outputfolder):
             os.makedirs(outputfolder)
 
-        outputfile2 = getoutputfileabs(inputfile, outputdir, outputfile2,2)
+        outputfile2 = getoutputfileabs(inputfile, outputdir, outputfile2, 2)
         outputfolder = os.path.dirname(outputfile2)
         if not os.path.exists(outputfolder):
             os.makedirs(outputfolder)
diff --git a/BaseTools/Source/Python/Table/Table.py b/BaseTools/Source/Python/Table/Table.py
index 7a60313e9524..c179334d8fc5 100644
--- a/BaseTools/Source/Python/Table/Table.py
+++ b/BaseTools/Source/Python/Table/Table.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create/update/query/erase a common table
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -10,7 +10,7 @@
 #
 import Common.EdkLogger as EdkLogger
 
-## TableFile
+# TableFile
 #
 # This class defined a common table
 #
@@ -19,13 +19,15 @@ import Common.EdkLogger as EdkLogger
 # @param Cursor:     Cursor of the database
 # @param TableName:  Name of the table
 #
+
+
 class Table(object):
     def __init__(self, Cursor):
         self.Cur = Cursor
         self.Table = ''
         self.ID = 0
 
-    ## Create table
+    # Create table
     #
     # Create a table
     #
@@ -34,14 +36,14 @@ class Table(object):
         self.ID = 0
         EdkLogger.verbose(SqlCommand + " ... DONE!")
 
-    ## Insert table
+    # Insert table
     #
     # Insert a record into a table
     #
     def Insert(self, SqlCommand):
         self.Exec(SqlCommand)
 
-    ## Query table
+    # Query table
     #
     # Query all records of the table
     #
@@ -53,10 +55,11 @@ class Table(object):
             EdkLogger.verbose(str(Rs))
 
         TotalCount = self.GetCount()
-        EdkLogger.verbose("*** Total %s records in table %s ***" % (TotalCount, self.Table) )
+        EdkLogger.verbose("*** Total %s records in table %s ***" %
+                          (TotalCount, self.Table))
         EdkLogger.verbose("Query tabel %s DONE!" % self.Table)
 
-    ## Drop a table
+    # Drop a table
     #
     # Drop the table
     #
@@ -65,7 +68,7 @@ class Table(object):
         self.Cur.execute(SqlCommand)
         EdkLogger.verbose("Drop tabel %s ... DONE!" % self.Table)
 
-    ## Get count
+    # Get count
     #
     # Get a count of all records of the table
     #
@@ -77,7 +80,7 @@ class Table(object):
         for Item in self.Cur:
             return Item[0]
 
-    ## Generate ID
+    # Generate ID
     #
     # Generate an ID if input ID is -1
     #
@@ -91,14 +94,14 @@ class Table(object):
 
         return self.ID
 
-    ## Init the ID of the table
+    # Init the ID of the table
     #
     # Init the ID of the table
     #
     def InitID(self):
         self.ID = self.GetCount()
 
-    ## Exec
+    # Exec
     #
     # Exec Sql Command, return result
     #
diff --git a/BaseTools/Source/Python/Table/TableDataModel.py b/BaseTools/Source/Python/Table/TableDataModel.py
index 3855807452f7..8b609fa2c99b 100644
--- a/BaseTools/Source/Python/Table/TableDataModel.py
+++ b/BaseTools/Source/Python/Table/TableDataModel.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create/update/query/erase table for data models
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -14,19 +14,21 @@ import CommonDataClass.DataClass as DataClass
 from Table.Table import Table
 from Common.StringUtils import ConvertToSqlString
 
-## TableDataModel
+# TableDataModel
 #
 # This class defined a table used for data model
 #
 # @param object:       Inherited from object class
 #
 #
+
+
 class TableDataModel(Table):
     def __init__(self, Cursor):
         Table.__init__(self, Cursor)
         self.Table = 'DataModel'
 
-    ## Create table
+    # Create table
     #
     # Create table DataModel
     #
@@ -43,7 +45,7 @@ class TableDataModel(Table):
                                                       )""" % self.Table
         Table.Create(self, SqlCommand)
 
-    ## Insert table
+    # Insert table
     #
     # Insert a record into table DataModel
     #
@@ -55,12 +57,13 @@ class TableDataModel(Table):
     def Insert(self, CrossIndex, Name, Description):
         self.ID = self.ID + 1
         (Name, Description) = ConvertToSqlString((Name, Description))
-        SqlCommand = """insert into %s values(%s, %s, '%s', '%s')""" % (self.Table, self.ID, CrossIndex, Name, Description)
+        SqlCommand = """insert into %s values(%s, %s, '%s', '%s')""" % (
+            self.Table, self.ID, CrossIndex, Name, Description)
         Table.Insert(self, SqlCommand)
 
         return self.ID
 
-    ## Init table
+    # Init table
     #
     # Create all default records of table DataModel
     #
@@ -73,7 +76,7 @@ class TableDataModel(Table):
             self.Insert(CrossIndex, Name, Description)
         EdkLogger.verbose("Initialize table DataModel ... DONE!")
 
-    ## Get CrossIndex
+    # Get CrossIndex
     #
     # Get a model's cross index from its name
     #
diff --git a/BaseTools/Source/Python/Table/TableDec.py b/BaseTools/Source/Python/Table/TableDec.py
index 04aa7aaad8b1..c155034a69c7 100644
--- a/BaseTools/Source/Python/Table/TableDec.py
+++ b/BaseTools/Source/Python/Table/TableDec.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create/update/query/erase table for dec datas
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -14,19 +14,21 @@ import CommonDataClass.DataClass as DataClass
 from Table.Table import Table
 from Common.StringUtils import ConvertToSqlString
 
-## TableDec
+# TableDec
 #
 # This class defined a table used for data model
 #
 # @param object:       Inherited from object class
 #
 #
+
+
 class TableDec(Table):
     def __init__(self, Cursor):
         Table.__init__(self, Cursor)
         self.Table = 'Dec'
 
-    ## Create table
+    # Create table
     #
     # Create table Dec
     #
@@ -61,7 +63,7 @@ class TableDec(Table):
                                                       )""" % self.Table
         Table.Create(self, SqlCommand)
 
-    ## Insert table
+    # Insert table
     #
     # Insert a record into table Dec
     #
@@ -81,14 +83,15 @@ class TableDec(Table):
     #
     def Insert(self, Model, Value1, Value2, Value3, Value4, Value5, Arch, BelongsToItem, BelongsToFile, StartLine, StartColumn, EndLine, EndColumn, Enabled):
         self.ID = self.ID + 1
-        (Value1, Value2, Value3, Arch) = ConvertToSqlString((Value1, Value2, Value3, Arch))
+        (Value1, Value2, Value3, Arch) = ConvertToSqlString(
+            (Value1, Value2, Value3, Arch))
         SqlCommand = """insert into %s values(%s, %s, '%s', '%s', '%s', '%s', %s, %s, %s, %s, %s, %s, %s)""" \
                      % (self.Table, self.ID, Model, Value1, Value2, Value3, Arch, BelongsToItem, BelongsToFile, StartLine, StartColumn, EndLine, EndColumn, Enabled)
         Table.Insert(self, SqlCommand)
 
         return self.ID
 
-    ## Query table
+    # Query table
     #
     # @param Model:  The Model of Record
     #
diff --git a/BaseTools/Source/Python/Table/TableDsc.py b/BaseTools/Source/Python/Table/TableDsc.py
index deda001ca47b..b58c8ccb8a4b 100644
--- a/BaseTools/Source/Python/Table/TableDsc.py
+++ b/BaseTools/Source/Python/Table/TableDsc.py
@@ -1,5 +1,5 @@
 from __future__ import absolute_import
-## @file
+# @file
 # This file is used to create/update/query/erase table for dsc datas
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -14,19 +14,21 @@ import CommonDataClass.DataClass as DataClass
 from Table.Table import Table
 from Common.StringUtils import ConvertToSqlString
 
-## TableDsc
+# TableDsc
 #
 # This class defined a table used for data model
 #
 # @param object:       Inherited from object class
 #
 #
+
+
 class TableDsc(Table):
     def __init__(self, Cursor):
         Table.__init__(self, Cursor)
         self.Table = 'Dsc'
 
-    ## Create table
+    # Create table
     #
     # Create table Dsc
     #
@@ -61,7 +63,7 @@ class TableDsc(Table):
                                                       )""" % self.Table
         Table.Create(self, SqlCommand)
 
-    ## Insert table
+    # Insert table
     #
     # Insert a record into table Dsc
     #
@@ -81,14 +83,15 @@ class TableDsc(Table):
     #
     def Insert(self, Model, Value1, Value2, Value3, Arch, BelongsToItem, BelongsToFile, StartLine, StartColumn, EndLine, EndColumn, Enabled):
         self.ID = self.ID + 1
-        (Value1, Value2, Value3, Arch) = ConvertToSqlString((Value1, Value2, Value3, Arch))
+        (Value1, Value2, Value3, Arch) = ConvertToSqlString(
+            (Value1, Value2, Value3, Arch))
         SqlCommand = """insert into %s values(%s, %s, '%s', '%s', '%s', '%s', %s, %s, %s, %s, %s, %s, %s)""" \
                      % (self.Table, self.ID, Model, Value1, Value2, Value3, Arch, BelongsToItem, BelongsToFile, StartLine, StartColumn, EndLine, EndColumn, Enabled)
         Table.Insert(self, SqlCommand)
 
         return self.ID
 
-    ## Query table
+    # Query table
     #
     # @param Model:  The Model of Record
     #
diff --git a/BaseTools/Source/Python/Table/TableEotReport.py b/BaseTools/Source/Python/Table/TableEotReport.py
index 72bc11f6dbc2..ff23ef108c69 100644
--- a/BaseTools/Source/Python/Table/TableEotReport.py
+++ b/BaseTools/Source/Python/Table/TableEotReport.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create/update/query/erase table for ECC reports
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -10,25 +10,28 @@
 #
 from __future__ import absolute_import
 import Common.EdkLogger as EdkLogger
-import Common.LongFilePathOs as os, time
+import Common.LongFilePathOs as os
+import time
 from Table.Table import Table
 from Common.StringUtils import ConvertToSqlString2
 import Eot.EotToolError as EotToolError
 import Eot.EotGlobalData as EotGlobalData
 
-## TableReport
+# TableReport
 #
 # This class defined a table used for data model
 #
 # @param object:       Inherited from object class
 #
 #
+
+
 class TableEotReport(Table):
     def __init__(self, Cursor):
         Table.__init__(self, Cursor)
         self.Table = 'Report'
 
-    ## Create table
+    # Create table
     #
     # Create table report
     #
@@ -51,16 +54,16 @@ class TableEotReport(Table):
                                                       )""" % self.Table
         Table.Create(self, SqlCommand)
 
-    ## Insert table
+    # Insert table
     #
     # Insert a record into table report
     #
     #
-    def Insert(self, ModuleID = -1, ModuleName = '', ModuleGuid = '', SourceFileID = -1, SourceFileFullPath = '', \
-               ItemName = '', ItemType = '', ItemMode = '', GuidName = '', GuidMacro = '', GuidValue = '', BelongsToFunction = '', Enabled = 0):
+    def Insert(self, ModuleID=-1, ModuleName='', ModuleGuid='', SourceFileID=-1, SourceFileFullPath='',
+               ItemName='', ItemType='', ItemMode='', GuidName='', GuidMacro='', GuidValue='', BelongsToFunction='', Enabled=0):
         self.ID = self.ID + 1
         SqlCommand = """insert into %s values(%s, %s, '%s', '%s', %s, '%s', '%s', '%s', '%s', '%s', '%s', '%s', '%s', %s)""" \
-                     % (self.Table, self.ID, ModuleID, ModuleName, ModuleGuid, SourceFileID, SourceFileFullPath, \
+                     % (self.Table, self.ID, ModuleID, ModuleName, ModuleGuid, SourceFileID, SourceFileFullPath,
                         ItemName, ItemType, ItemMode, GuidName, GuidMacro, GuidValue, BelongsToFunction, Enabled)
         Table.Insert(self, SqlCommand)
 
diff --git a/BaseTools/Source/Python/Table/TableFdf.py b/BaseTools/Source/Python/Table/TableFdf.py
index 964ddd7a683c..08fcd0c1ef46 100644
--- a/BaseTools/Source/Python/Table/TableFdf.py
+++ b/BaseTools/Source/Python/Table/TableFdf.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create/update/query/erase table for fdf datas
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -14,19 +14,21 @@ import CommonDataClass.DataClass as DataClass
 from Table.Table import Table
 from Common.StringUtils import ConvertToSqlString
 
-## TableFdf
+# TableFdf
 #
 # This class defined a table used for data model
 #
 # @param object:       Inherited from object class
 #
 #
+
+
 class TableFdf(Table):
     def __init__(self, Cursor):
         Table.__init__(self, Cursor)
         self.Table = 'Fdf'
 
-    ## Create table
+    # Create table
     #
     # Create table Fdf
     #
@@ -62,7 +64,7 @@ class TableFdf(Table):
                                                       )""" % self.Table
         Table.Create(self, SqlCommand)
 
-    ## Insert table
+    # Insert table
     #
     # Insert a record into table Fdf
     #
@@ -82,14 +84,15 @@ class TableFdf(Table):
     #
     def Insert(self, Model, Value1, Value2, Value3, Scope1, Scope2, BelongsToItem, BelongsToFile, StartLine, StartColumn, EndLine, EndColumn, Enabled):
         self.ID = self.ID + 1
-        (Value1, Value2, Value3, Scope1, Scope2) = ConvertToSqlString((Value1, Value2, Value3, Scope1, Scope2))
+        (Value1, Value2, Value3, Scope1, Scope2) = ConvertToSqlString(
+            (Value1, Value2, Value3, Scope1, Scope2))
         SqlCommand = """insert into %s values(%s, %s, '%s', '%s', '%s', '%s', '%s', %s, %s, %s, %s, %s, %s, %s)""" \
                      % (self.Table, self.ID, Model, Value1, Value2, Value3, Scope1, Scope2, BelongsToItem, BelongsToFile, StartLine, StartColumn, EndLine, EndColumn, Enabled)
         Table.Insert(self, SqlCommand)
 
         return self.ID
 
-    ## Query table
+    # Query table
     #
     # @param Model:  The Model of Record
     #
diff --git a/BaseTools/Source/Python/Table/TableFile.py b/BaseTools/Source/Python/Table/TableFile.py
index c54c389872bd..8fbff54a0973 100644
--- a/BaseTools/Source/Python/Table/TableFile.py
+++ b/BaseTools/Source/Python/Table/TableFile.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create/update/query/erase table for files
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -15,18 +15,20 @@ from Common.StringUtils import ConvertToSqlString
 import Common.LongFilePathOs as os
 from CommonDataClass.DataClass import FileClass
 
-## TableFile
+# TableFile
 #
 # This class defined a table used for file
 #
 # @param object:       Inherited from object class
 #
+
+
 class TableFile(Table):
     def __init__(self, Cursor):
         Table.__init__(self, Cursor)
         self.Table = 'File'
 
-    ## Create table
+    # Create table
     #
     # Create table File
     #
@@ -49,7 +51,7 @@ class TableFile(Table):
                                                       )""" % self.Table
         Table.Create(self, SqlCommand)
 
-    ## Insert table
+    # Insert table
     #
     # Insert a record into table File
     #
@@ -63,13 +65,14 @@ class TableFile(Table):
     #
     def Insert(self, Name, ExtName, Path, FullPath, Model, TimeStamp):
         self.ID = self.ID + 1
-        (Name, ExtName, Path, FullPath) = ConvertToSqlString((Name, ExtName, Path, FullPath))
+        (Name, ExtName, Path, FullPath) = ConvertToSqlString(
+            (Name, ExtName, Path, FullPath))
         SqlCommand = """insert into %s values(%s, '%s', '%s', '%s', '%s', %s, '%s')""" \
-                                           % (self.Table, self.ID, Name, ExtName, Path, FullPath, Model, TimeStamp)
+            % (self.Table, self.ID, Name, ExtName, Path, FullPath, Model, TimeStamp)
         Table.Insert(self, SqlCommand)
 
         return self.ID
-    ## InsertFile
+    # InsertFile
     #
     # Insert one file to table
     #
@@ -78,21 +81,24 @@ class TableFile(Table):
     #
     # @retval FileID:       The ID after record is inserted
     #
+
     def InsertFile(self, FileFullPath, Model):
         (Filepath, Name) = os.path.split(FileFullPath)
         (Root, Ext) = os.path.splitext(FileFullPath)
         TimeStamp = os.stat(FileFullPath)[8]
-        File = FileClass(-1, Name, Ext, Filepath, FileFullPath, Model, '', [], [], [])
+        File = FileClass(-1, Name, Ext, Filepath,
+                         FileFullPath, Model, '', [], [], [])
         return self.Insert(File.Name, File.ExtName, File.Path, File.FullPath, File.Model, TimeStamp)
 
-    ## Get ID of a given file
+    # Get ID of a given file
     #
     #   @param  FilePath    Path of file
     #
     #   @retval ID          ID value of given file in the table
     #
     def GetFileId(self, File):
-        QueryScript = "select ID from %s where FullPath = '%s'" % (self.Table, str(File))
+        QueryScript = "select ID from %s where FullPath = '%s'" % (
+            self.Table, str(File))
         RecordList = self.Exec(QueryScript)
         if len(RecordList) == 0:
             return None
diff --git a/BaseTools/Source/Python/Table/TableFunction.py b/BaseTools/Source/Python/Table/TableFunction.py
index 69d240b9a7d6..429769b426a4 100644
--- a/BaseTools/Source/Python/Table/TableFunction.py
+++ b/BaseTools/Source/Python/Table/TableFunction.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create/update/query/erase table for functions
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -13,18 +13,20 @@ import Common.EdkLogger as EdkLogger
 from Table.Table import Table
 from Common.StringUtils import ConvertToSqlString
 
-## TableFunction
+# TableFunction
 #
 # This class defined a table used for function
 #
 # @param Table:       Inherited from Table class
 #
+
+
 class TableFunction(Table):
     def __init__(self, Cursor):
         Table.__init__(self, Cursor)
         self.Table = 'Function'
 
-    ## Create table
+    # Create table
     #
     # Create table Function
     #
@@ -61,7 +63,7 @@ class TableFunction(Table):
                                                       )""" % self.Table
         Table.Create(self, SqlCommand)
 
-    ## Insert table
+    # Insert table
     #
     # Insert a record into table Function
     #
@@ -82,9 +84,10 @@ class TableFunction(Table):
     #
     def Insert(self, Header, Modifier, Name, ReturnStatement, StartLine, StartColumn, EndLine, EndColumn, BodyStartLine, BodyStartColumn, BelongsToFile, FunNameStartLine, FunNameStartColumn):
         self.ID = self.ID + 1
-        (Header, Modifier, Name, ReturnStatement) = ConvertToSqlString((Header, Modifier, Name, ReturnStatement))
+        (Header, Modifier, Name, ReturnStatement) = ConvertToSqlString(
+            (Header, Modifier, Name, ReturnStatement))
         SqlCommand = """insert into %s values(%s, '%s', '%s', '%s', '%s', %s, %s, %s, %s, %s, %s, %s, %s, %s)""" \
-                                    % (self.Table, self.ID, Header, Modifier, Name, ReturnStatement, StartLine, StartColumn, EndLine, EndColumn, BodyStartLine, BodyStartColumn, BelongsToFile, FunNameStartLine, FunNameStartColumn)
+            % (self.Table, self.ID, Header, Modifier, Name, ReturnStatement, StartLine, StartColumn, EndLine, EndColumn, BodyStartLine, BodyStartColumn, BelongsToFile, FunNameStartLine, FunNameStartColumn)
         Table.Insert(self, SqlCommand)
 
         return self.ID
diff --git a/BaseTools/Source/Python/Table/TableIdentifier.py b/BaseTools/Source/Python/Table/TableIdentifier.py
index 99f5023e965e..a34e1b5a76ba 100644
--- a/BaseTools/Source/Python/Table/TableIdentifier.py
+++ b/BaseTools/Source/Python/Table/TableIdentifier.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create/update/query/erase table for Identifiers
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -13,19 +13,21 @@ import Common.EdkLogger as EdkLogger
 from Common.StringUtils import ConvertToSqlString
 from Table.Table import Table
 
-## TableIdentifier
+# TableIdentifier
 #
 # This class defined a table used for Identifier
 #
 # @param object:       Inherited from object class
 #
 #
+
+
 class TableIdentifier(Table):
     def __init__(self, Cursor):
         Table.__init__(self, Cursor)
         self.Table = 'Identifier'
 
-    ## Create table
+    # Create table
     #
     # Create table Identifier
     #
@@ -58,7 +60,7 @@ class TableIdentifier(Table):
                                                      )""" % self.Table
         Table.Create(self, SqlCommand)
 
-    ## Insert table
+    # Insert table
     #
     # Insert a record into table Identifier
     #
@@ -77,9 +79,10 @@ class TableIdentifier(Table):
     #
     def Insert(self, Modifier, Type, Name, Value, Model, BelongsToFile, BelongsToFunction, StartLine, StartColumn, EndLine, EndColumn):
         self.ID = self.ID + 1
-        (Modifier, Type, Name, Value) = ConvertToSqlString((Modifier, Type, Name, Value))
+        (Modifier, Type, Name, Value) = ConvertToSqlString(
+            (Modifier, Type, Name, Value))
         SqlCommand = """insert into %s values(%s, '%s', '%s', '%s', '%s', %s, %s, %s, %s, %s, %s, %s)""" \
-                                           % (self.Table, self.ID, Modifier, Type, Name, Value, Model, BelongsToFile, BelongsToFunction, StartLine, StartColumn, EndLine, EndColumn)
+            % (self.Table, self.ID, Modifier, Type, Name, Value, Model, BelongsToFile, BelongsToFunction, StartLine, StartColumn, EndLine, EndColumn)
         Table.Insert(self, SqlCommand)
 
         return self.ID
diff --git a/BaseTools/Source/Python/Table/TableInf.py b/BaseTools/Source/Python/Table/TableInf.py
index 54af88c37da5..9a38cf68ba0f 100644
--- a/BaseTools/Source/Python/Table/TableInf.py
+++ b/BaseTools/Source/Python/Table/TableInf.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create/update/query/erase table for inf datas
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -14,19 +14,21 @@ import CommonDataClass.DataClass as DataClass
 from Table.Table import Table
 from Common.StringUtils import ConvertToSqlString
 
-## TableInf
+# TableInf
 #
 # This class defined a table used for data model
 #
 # @param object:       Inherited from object class
 #
 #
+
+
 class TableInf(Table):
     def __init__(self, Cursor):
         Table.__init__(self, Cursor)
         self.Table = 'Inf'
 
-    ## Create table
+    # Create table
     #
     # Create table Inf
     #
@@ -65,7 +67,7 @@ class TableInf(Table):
                                                       )""" % self.Table
         Table.Create(self, SqlCommand)
 
-    ## Insert table
+    # Insert table
     #
     # Insert a record into table Inf
     #
@@ -87,14 +89,15 @@ class TableInf(Table):
     #
     def Insert(self, Model, Value1, Value2, Value3, Value4, Value5, Arch, BelongsToItem, BelongsToFile, StartLine, StartColumn, EndLine, EndColumn, Enabled):
         self.ID = self.ID + 1
-        (Value1, Value2, Value3, Value4, Value5, Arch) = ConvertToSqlString((Value1, Value2, Value3, Value4, Value5, Arch))
+        (Value1, Value2, Value3, Value4, Value5, Arch) = ConvertToSqlString(
+            (Value1, Value2, Value3, Value4, Value5, Arch))
         SqlCommand = """insert into %s values(%s, %s, '%s', '%s', '%s', '%s', '%s', '%s', %s, %s, %s, %s, %s, %s, %s)""" \
                      % (self.Table, self.ID, Model, Value1, Value2, Value3, Value4, Value5, Arch, BelongsToItem, BelongsToFile, StartLine, StartColumn, EndLine, EndColumn, Enabled)
         Table.Insert(self, SqlCommand)
 
         return self.ID
 
-    ## Query table
+    # Query table
     #
     # @param Model:  The Model of Record
     #
diff --git a/BaseTools/Source/Python/Table/TablePcd.py b/BaseTools/Source/Python/Table/TablePcd.py
index f1e9f578173e..2966e8f0d622 100644
--- a/BaseTools/Source/Python/Table/TablePcd.py
+++ b/BaseTools/Source/Python/Table/TablePcd.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create/update/query/erase table for pcds
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -13,19 +13,21 @@ import Common.EdkLogger as EdkLogger
 from Table.Table import Table
 from Common.StringUtils import ConvertToSqlString
 
-## TablePcd
+# TablePcd
 #
 # This class defined a table used for pcds
 #
 # @param object:       Inherited from object class
 #
 #
+
+
 class TablePcd(Table):
     def __init__(self, Cursor):
         Table.__init__(self, Cursor)
         self.Table = 'Pcd'
 
-    ## Create table
+    # Create table
     #
     # Create table Pcd
     #
@@ -58,7 +60,7 @@ class TablePcd(Table):
                                                       )""" % self.Table
         Table.Create(self, SqlCommand)
 
-    ## Insert table
+    # Insert table
     #
     # Insert a record into table Pcd
     #
@@ -77,9 +79,10 @@ class TablePcd(Table):
     #
     def Insert(self, CName, TokenSpaceGuidCName, Token, DatumType, Model, BelongsToFile, BelongsToFunction, StartLine, StartColumn, EndLine, EndColumn):
         self.ID = self.ID + 1
-        (CName, TokenSpaceGuidCName, DatumType) = ConvertToSqlString((CName, TokenSpaceGuidCName, DatumType))
+        (CName, TokenSpaceGuidCName, DatumType) = ConvertToSqlString(
+            (CName, TokenSpaceGuidCName, DatumType))
         SqlCommand = """insert into %s values(%s, '%s', '%s', %s, '%s', %s, %s, %s, %s, %s, %s, %s)""" \
-                                           % (self.Table, self.ID, CName, TokenSpaceGuidCName, Token, DatumType, Model, BelongsToFile, BelongsToFunction, StartLine, StartColumn, EndLine, EndColumn)
+            % (self.Table, self.ID, CName, TokenSpaceGuidCName, Token, DatumType, Model, BelongsToFile, BelongsToFunction, StartLine, StartColumn, EndLine, EndColumn)
         Table.Insert(self, SqlCommand)
 
         return self.ID
diff --git a/BaseTools/Source/Python/Table/TableQuery.py b/BaseTools/Source/Python/Table/TableQuery.py
index 3e66fbfc9da8..ca4844e0f034 100644
--- a/BaseTools/Source/Python/Table/TableQuery.py
+++ b/BaseTools/Source/Python/Table/TableQuery.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create/update/query/erase table for Queries
 #
 # Copyright (c) 2008, Intel Corporation. All rights reserved.<BR>
@@ -13,19 +13,21 @@ import Common.EdkLogger as EdkLogger
 from Common.StringUtils import ConvertToSqlString
 from Table.Table import Table
 
-## TableQuery
+# TableQuery
 #
 # This class defined a table used for Query
 #
 # @param object:       Inherited from object class
 #
 #
+
+
 class TableQuery(Table):
     def __init__(self, Cursor):
         Table.__init__(self, Cursor)
         self.Table = 'Query'
 
-    ## Create table
+    # Create table
     #
     # Create table Query
     #
@@ -44,7 +46,7 @@ class TableQuery(Table):
                                                      )""" % self.Table
         Table.Create(self, SqlCommand)
 
-    ## Insert table
+    # Insert table
     #
     # Insert a record into table Query
     #
@@ -57,7 +59,7 @@ class TableQuery(Table):
     def Insert(self, Name, Modifier, Value, Model):
         self.ID = self.ID + 1
         SqlCommand = """insert into %s values(%s, '%s', '%s', '%s', %s)""" \
-                                           % (self.Table, self.ID, Name, Modifier, Value, Model)
+            % (self.Table, self.ID, Name, Modifier, Value, Model)
         Table.Insert(self, SqlCommand)
 
         return self.ID
diff --git a/BaseTools/Source/Python/Table/TableReport.py b/BaseTools/Source/Python/Table/TableReport.py
index 0a77787d8f48..da32bb89094c 100644
--- a/BaseTools/Source/Python/Table/TableReport.py
+++ b/BaseTools/Source/Python/Table/TableReport.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create/update/query/erase table for ECC reports
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -10,26 +10,29 @@
 #
 from __future__ import absolute_import
 import Common.EdkLogger as EdkLogger
-import Common.LongFilePathOs as os, time
+import Common.LongFilePathOs as os
+import time
 from Table.Table import Table
 from Common.StringUtils import ConvertToSqlString2
 import Ecc.EccToolError as EccToolError
 import Ecc.EccGlobalData as EccGlobalData
 from Common.LongFilePathSupport import OpenLongFilePath as open
 
-## TableReport
+# TableReport
 #
 # This class defined a table used for data model
 #
 # @param object:       Inherited from object class
 #
 #
+
+
 class TableReport(Table):
     def __init__(self, Cursor):
         Table.__init__(self, Cursor)
         self.Table = 'Report'
 
-    ## Create table
+    # Create table
     #
     # Create table report
     #
@@ -51,7 +54,7 @@ class TableReport(Table):
                                                       )""" % self.Table
         Table.Create(self, SqlCommand)
 
-    ## Insert table
+    # Insert table
     #
     # Insert a record into table report
     #
@@ -63,7 +66,7 @@ class TableReport(Table):
     # @param Enabled:        If this error enabled
     # @param Corrected:      if this error corrected
     #
-    def Insert(self, ErrorID, OtherMsg='', BelongsToTable='', BelongsToItem= -1, Enabled=0, Corrected= -1):
+    def Insert(self, ErrorID, OtherMsg='', BelongsToTable='', BelongsToItem=-1, Enabled=0, Corrected=-1):
         self.ID = self.ID + 1
         SqlCommand = """insert into %s values(%s, %s, '%s', '%s', %s, %s, %s)""" \
                      % (self.Table, self.ID, ErrorID, ConvertToSqlString2(OtherMsg), BelongsToTable, BelongsToItem, Enabled, Corrected)
@@ -71,7 +74,7 @@ class TableReport(Table):
 
         return self.ID
 
-    ## Query table
+    # Query table
     #
     # @retval:       A recordSet of all found records
     #
@@ -80,14 +83,14 @@ class TableReport(Table):
                         where Enabled > -1 order by ErrorID, BelongsToItem""" % (self.Table)
         return self.Exec(SqlCommand)
 
-    ## Update table
+    # Update table
     #
     def UpdateBelongsToItemByFile(self, ItemID=-1, File=""):
         SqlCommand = """update Report set BelongsToItem=%s where BelongsToTable='File' and BelongsToItem=-2
                         and OtherMsg like '%%%s%%'""" % (ItemID, File)
         return self.Exec(SqlCommand)
 
-    ## Convert to CSV
+    # Convert to CSV
     #
     # Get all enabled records from table report and save them to a .csv file
     #
@@ -96,7 +99,8 @@ class TableReport(Table):
     def ToCSV(self, Filename='Report.csv'):
         try:
             File = open(Filename, 'w+')
-            File.write("""No, Error Code, Error Message, File, LineNo, Other Error Message\n""")
+            File.write(
+                """No, Error Code, Error Message, File, LineNo, Other Error Message\n""")
             RecordSet = self.Query()
             Index = 0
             for Record in RecordSet:
@@ -116,12 +120,15 @@ class TableReport(Table):
                                  """ % (BelongsToTable, BelongsToItem)
                 NewRecord = self.Exec(SqlCommand)
                 if NewRecord != []:
-                    File.write("""%s,%s,"%s",%s,%s,"%s"\n""" % (Index, ErrorID, EccToolError.gEccErrorMessage[ErrorID], NewRecord[0][1], NewRecord[0][0], OtherMsg))
-                    EdkLogger.quiet("%s(%s): [%s]%s %s" % (NewRecord[0][1], NewRecord[0][0], ErrorID, EccToolError.gEccErrorMessage[ErrorID], OtherMsg))
+                    File.write("""%s,%s,"%s",%s,%s,"%s"\n""" % (
+                        Index, ErrorID, EccToolError.gEccErrorMessage[ErrorID], NewRecord[0][1], NewRecord[0][0], OtherMsg))
+                    EdkLogger.quiet("%s(%s): [%s]%s %s" % (
+                        NewRecord[0][1], NewRecord[0][0], ErrorID, EccToolError.gEccErrorMessage[ErrorID], OtherMsg))
 
             File.close()
         except IOError:
-            NewFilename = 'Report_' + time.strftime("%Y%m%d_%H%M%S.csv", time.localtime())
-            EdkLogger.warn("ECC", "The report file %s is locked by other progress, use %s instead!" % (Filename, NewFilename))
+            NewFilename = 'Report_' + \
+                time.strftime("%Y%m%d_%H%M%S.csv", time.localtime())
+            EdkLogger.warn("ECC", "The report file %s is locked by other progress, use %s instead!" % (
+                Filename, NewFilename))
             self.ToCSV(NewFilename)
-
diff --git a/BaseTools/Source/Python/Table/__init__.py b/BaseTools/Source/Python/Table/__init__.py
index e901914f30a8..caa82844e967 100644
--- a/BaseTools/Source/Python/Table/__init__.py
+++ b/BaseTools/Source/Python/Table/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'Table' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/TargetTool/TargetTool.py b/BaseTools/Source/Python/TargetTool/TargetTool.py
index 7f2479f0f0ac..61276661b647 100644
--- a/BaseTools/Source/Python/TargetTool/TargetTool.py
+++ b/BaseTools/Source/Python/TargetTool/TargetTool.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Target Tool Parser
 #
 #  Copyright (c) 2007 - 2021, Intel Corporation. All rights reserved.<BR>
@@ -21,23 +21,25 @@ from Common.TargetTxtClassObject import gDefaultTargetTxtFile
 
 # To Do 1.set clean, 2. add item, if the line is disabled.
 
+
 class TargetTool():
     def __init__(self, opt, args):
         self.WorkSpace = os.path.normpath(os.getenv('WORKSPACE'))
-        self.Opt       = opt
-        self.Arg       = args[0]
-        self.FileName  = os.path.normpath(os.path.join(self.WorkSpace, 'Conf', gDefaultTargetTxtFile))
+        self.Opt = opt
+        self.Arg = args[0]
+        self.FileName = os.path.normpath(os.path.join(
+            self.WorkSpace, 'Conf', gDefaultTargetTxtFile))
         if os.path.isfile(self.FileName) == False:
             print("%s does not exist." % self.FileName)
             sys.exit(1)
         self.TargetTxtDictionary = {
-            TAB_TAT_DEFINES_ACTIVE_PLATFORM                            : None,
-            TAB_TAT_DEFINES_TOOL_CHAIN_CONF                            : None,
-            TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER               : None,
-            TAB_TAT_DEFINES_TARGET                                     : None,
-            TAB_TAT_DEFINES_TOOL_CHAIN_TAG                             : None,
-            TAB_TAT_DEFINES_TARGET_ARCH                                : None,
-            TAB_TAT_DEFINES_BUILD_RULE_CONF                            : None,
+            TAB_TAT_DEFINES_ACTIVE_PLATFORM: None,
+            TAB_TAT_DEFINES_TOOL_CHAIN_CONF: None,
+            TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER: None,
+            TAB_TAT_DEFINES_TARGET: None,
+            TAB_TAT_DEFINES_TOOL_CHAIN_TAG: None,
+            TAB_TAT_DEFINES_TARGET_ARCH: None,
+            TAB_TAT_DEFINES_BUILD_RULE_CONF: None,
         }
         self.LoadTargetTxtFile(self.FileName)
 
@@ -45,7 +47,8 @@ class TargetTool():
         if os.path.exists(filename) and os.path.isfile(filename):
             return self.ConvertTextFileToDict(filename, '#', '=')
         else:
-            raise ParseError('LoadTargetTxtFile() : No Target.txt file exists.')
+            raise ParseError(
+                'LoadTargetTxtFile() : No Target.txt file exists.')
             return 1
 
 #
@@ -63,11 +66,12 @@ class TargetTool():
                     Key = LineList[0].strip()
                     if Key.startswith(CommentCharacter) == False and Key in self.TargetTxtDictionary:
                         if Key == TAB_TAT_DEFINES_ACTIVE_PLATFORM or Key == TAB_TAT_DEFINES_TOOL_CHAIN_CONF \
-                          or Key == TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER \
-                          or Key == TAB_TAT_DEFINES_ACTIVE_MODULE:
-                            self.TargetTxtDictionary[Key] = LineList[1].replace('\\', '/').strip()
+                                or Key == TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER \
+                                or Key == TAB_TAT_DEFINES_ACTIVE_MODULE:
+                            self.TargetTxtDictionary[Key] = LineList[1].replace(
+                                '\\', '/').strip()
                         elif Key == TAB_TAT_DEFINES_TARGET or Key == TAB_TAT_DEFINES_TARGET_ARCH \
-                          or Key == TAB_TAT_DEFINES_TOOL_CHAIN_TAG or Key == TAB_TAT_DEFINES_BUILD_RULE_CONF:
+                                or Key == TAB_TAT_DEFINES_TOOL_CHAIN_TAG or Key == TAB_TAT_DEFINES_BUILD_RULE_CONF:
                             self.TargetTxtDictionary[Key] = LineList[1].split()
             f.close()
             return 0
@@ -76,10 +80,11 @@ class TargetTool():
             traceback.print_exception(last_type, last_value, last_tb)
 
     def Print(self):
-        errMsg  = ''
+        errMsg = ''
         for Key in self.TargetTxtDictionary:
             if isinstance(self.TargetTxtDictionary[Key], type([])):
-                print("%-30s = %s" % (Key, ''.join(elem + ' ' for elem in self.TargetTxtDictionary[Key])))
+                print("%-30s = %s" % (Key, ''.join(elem +
+                      ' ' for elem in self.TargetTxtDictionary[Key])))
             elif self.TargetTxtDictionary[Key] is None:
                 errMsg += "  Missing %s configuration information, please use TargetTool to set value!" % Key + os.linesep
             else:
@@ -91,7 +96,8 @@ class TargetTool():
     def RWFile(self, CommentCharacter, KeySplitCharacter, Num):
         try:
             fr = open(self.FileName, 'r')
-            fw = open(os.path.normpath(os.path.join(self.WorkSpace, 'Conf\\targetnew.txt')), 'w')
+            fw = open(os.path.normpath(os.path.join(
+                self.WorkSpace, 'Conf\\targetnew.txt')), 'w')
 
             existKeys = []
             for Line in fr:
@@ -105,7 +111,8 @@ class TargetTool():
                             if Key not in existKeys:
                                 existKeys.append(Key)
                             else:
-                                print("Warning: Found duplicate key item in original configuration files!")
+                                print(
+                                    "Warning: Found duplicate key item in original configuration files!")
 
                             if Num == 0:
                                 Line = "%-30s = \n" % Key
@@ -116,7 +123,8 @@ class TargetTool():
                             fw.write(Line)
             for key in self.TargetTxtDictionary:
                 if key not in existKeys:
-                    print("Warning: %s does not exist in original configuration file" % key)
+                    print(
+                        "Warning: %s does not exist in original configuration file" % key)
                     Line = GetConfigureKeyValue(self, key)
                     if Line is None:
                         Line = "%-30s = " % key
@@ -125,12 +133,14 @@ class TargetTool():
             fr.close()
             fw.close()
             os.remove(self.FileName)
-            os.rename(os.path.normpath(os.path.join(self.WorkSpace, 'Conf\\targetnew.txt')), self.FileName)
+            os.rename(os.path.normpath(os.path.join(
+                self.WorkSpace, 'Conf\\targetnew.txt')), self.FileName)
 
         except:
             last_type, last_value, last_tb = sys.exc_info()
             traceback.print_exception(last_type, last_value, last_tb)
 
+
 def GetConfigureKeyValue(self, Key):
     Line = None
     if Key == TAB_TAT_DEFINES_ACTIVE_PLATFORM and self.Opt.DSCFILE is not None:
@@ -141,7 +151,8 @@ def GetConfigureKeyValue(self, Key):
             EdkLogger.error("TargetTool", BuildToolError.FILE_NOT_FOUND,
                             "DSC file %s does not exist!" % self.Opt.DSCFILE, RaiseError=False)
     elif Key == TAB_TAT_DEFINES_TOOL_CHAIN_CONF and self.Opt.TOOL_DEFINITION_FILE is not None:
-        tooldefFullPath = os.path.join(self.WorkSpace, self.Opt.TOOL_DEFINITION_FILE)
+        tooldefFullPath = os.path.join(
+            self.WorkSpace, self.Opt.TOOL_DEFINITION_FILE)
         if os.path.exists(tooldefFullPath):
             Line = "%-30s = %s\n" % (Key, self.Opt.TOOL_DEFINITION_FILE)
         else:
@@ -155,13 +166,16 @@ def GetConfigureKeyValue(self, Key):
     elif Key == TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER and self.Opt.NUM is not None:
         Line = "%-30s = %s\n" % (Key, str(self.Opt.NUM))
     elif Key == TAB_TAT_DEFINES_TARGET and self.Opt.TARGET is not None:
-        Line = "%-30s = %s\n" % (Key, ''.join(elem + ' ' for elem in self.Opt.TARGET))
+        Line = "%-30s = %s\n" % (Key, ''.join(elem +
+                                 ' ' for elem in self.Opt.TARGET))
     elif Key == TAB_TAT_DEFINES_TARGET_ARCH and self.Opt.TARGET_ARCH is not None:
-        Line = "%-30s = %s\n" % (Key, ''.join(elem + ' ' for elem in self.Opt.TARGET_ARCH))
+        Line = "%-30s = %s\n" % (Key, ''.join(elem +
+                                 ' ' for elem in self.Opt.TARGET_ARCH))
     elif Key == TAB_TAT_DEFINES_TOOL_CHAIN_TAG and self.Opt.TOOL_CHAIN_TAG is not None:
         Line = "%-30s = %s\n" % (Key, self.Opt.TOOL_CHAIN_TAG)
     elif Key == TAB_TAT_DEFINES_BUILD_RULE_CONF and self.Opt.BUILD_RULE_FILE is not None:
-        buildruleFullPath = os.path.join(self.WorkSpace, self.Opt.BUILD_RULE_FILE)
+        buildruleFullPath = os.path.join(
+            self.WorkSpace, self.Opt.BUILD_RULE_FILE)
         if os.path.exists(buildruleFullPath):
             Line = "%-30s = %s\n" % (Key, self.Opt.BUILD_RULE_FILE)
         else:
@@ -169,6 +183,7 @@ def GetConfigureKeyValue(self, Key):
                             "Build rule file %s does not exist!" % self.Opt.BUILD_RULE_FILE, RaiseError=False)
     return Line
 
+
 VersionNumber = ("0.01" + " " + gBUILD_VERSION)
 __version__ = "%prog Version " + VersionNumber
 __copyright__ = "Copyright (c) 2007 - 2018, Intel Corporation  All rights reserved."
@@ -179,42 +194,51 @@ __usage__ = "%prog [options] {args} \
 \n Set    replace the default configuration with expected value specified by option."
 
 gParamCheck = []
+
+
 def SingleCheckCallback(option, opt_str, value, parser):
     if option not in gParamCheck:
         setattr(parser.values, option.dest, value)
         gParamCheck.append(option)
     else:
-        parser.error("Option %s only allows one instance in command line!" % option)
+        parser.error(
+            "Option %s only allows one instance in command line!" % option)
+
 
 def RangeCheckCallback(option, opt_str, value, parser):
     if option not in gParamCheck:
         gParamCheck.append(option)
         if value < 1 or value > 8:
-            parser.error("The count of multi-thread is not in valid range of 1 ~ 8.")
+            parser.error(
+                "The count of multi-thread is not in valid range of 1 ~ 8.")
         else:
             setattr(parser.values, option.dest, value)
     else:
-        parser.error("Option %s only allows one instance in command line!" % option)
+        parser.error(
+            "Option %s only allows one instance in command line!" % option)
+
 
 def MyOptionParser():
-    parser = OptionParser(version=__version__, prog="TargetTool.exe", usage=__usage__, description=__copyright__)
+    parser = OptionParser(version=__version__, prog="TargetTool.exe",
+                          usage=__usage__, description=__copyright__)
     parser.add_option("-a", "--arch", action="append", dest="TARGET_ARCH",
-        help="ARCHS is one of list: IA32, X64, ARM, AARCH64 or EBC, which replaces target.txt's TARGET_ARCH definition. To specify more archs, please repeat this option. 0 will clear this setting in target.txt and can't combine with other value.")
+                      help="ARCHS is one of list: IA32, X64, ARM, AARCH64 or EBC, which replaces target.txt's TARGET_ARCH definition. To specify more archs, please repeat this option. 0 will clear this setting in target.txt and can't combine with other value.")
     parser.add_option("-p", "--platform", action="callback", type="string", dest="DSCFILE", callback=SingleCheckCallback,
-        help="Specify a DSC file, which replace target.txt's ACTIVE_PLATFORM definition. 0 will clear this setting in target.txt and can't combine with other value.")
+                      help="Specify a DSC file, which replace target.txt's ACTIVE_PLATFORM definition. 0 will clear this setting in target.txt and can't combine with other value.")
     parser.add_option("-c", "--tooldef", action="callback", type="string", dest="TOOL_DEFINITION_FILE", callback=SingleCheckCallback,
-        help="Specify the WORKSPACE relative path of tool_def.txt file, which replace target.txt's TOOL_CHAIN_CONF definition. 0 will clear this setting in target.txt and can't combine with other value.")
+                      help="Specify the WORKSPACE relative path of tool_def.txt file, which replace target.txt's TOOL_CHAIN_CONF definition. 0 will clear this setting in target.txt and can't combine with other value.")
     parser.add_option("-t", "--target", action="append", type="choice", choices=['DEBUG', 'RELEASE', '0'], dest="TARGET",
-        help="TARGET is one of list: DEBUG, RELEASE, which replaces target.txt's TARGET definition. To specify more TARGET, please repeat this option. 0 will clear this setting in target.txt and can't combine with other value.")
+                      help="TARGET is one of list: DEBUG, RELEASE, which replaces target.txt's TARGET definition. To specify more TARGET, please repeat this option. 0 will clear this setting in target.txt and can't combine with other value.")
     parser.add_option("-n", "--tagname", action="callback", type="string", dest="TOOL_CHAIN_TAG", callback=SingleCheckCallback,
-        help="Specify the Tool Chain Tagname, which replaces target.txt's TOOL_CHAIN_TAG definition. 0 will clear this setting in target.txt and can't combine with other value.")
+                      help="Specify the Tool Chain Tagname, which replaces target.txt's TOOL_CHAIN_TAG definition. 0 will clear this setting in target.txt and can't combine with other value.")
     parser.add_option("-r", "--buildrule", action="callback", type="string", dest="BUILD_RULE_FILE", callback=SingleCheckCallback,
-        help="Specify the build rule configure file, which replaces target.txt's BUILD_RULE_CONF definition. If not specified, the default value Conf/build_rule.txt will be set.")
+                      help="Specify the build rule configure file, which replaces target.txt's BUILD_RULE_CONF definition. If not specified, the default value Conf/build_rule.txt will be set.")
     parser.add_option("-m", "--multithreadnum", action="callback", type="int", dest="NUM", callback=RangeCheckCallback,
-        help="Specify the multi-thread number which replace target.txt's MAX_CONCURRENT_THREAD_NUMBER. If the value is less than 2, MULTIPLE_THREAD will be disabled. If the value is larger than 1, MULTIPLE_THREAD will be enabled.")
-    (opt, args)=parser.parse_args()
+                      help="Specify the multi-thread number which replace target.txt's MAX_CONCURRENT_THREAD_NUMBER. If the value is less than 2, MULTIPLE_THREAD will be disabled. If the value is larger than 1, MULTIPLE_THREAD will be enabled.")
+    (opt, args) = parser.parse_args()
     return (opt, args)
 
+
 if __name__ == '__main__':
     EdkLogger.Initialize()
     EdkLogger.SetLevel(EdkLogger.QUIET)
@@ -232,12 +256,14 @@ if __name__ == '__main__':
     if opt.TARGET is not None and len(opt.TARGET) > 1:
         for elem in opt.TARGET:
             if elem == '0':
-                print("0 will clear the TARGET setting in target.txt and can't combine with other value.")
+                print(
+                    "0 will clear the TARGET setting in target.txt and can't combine with other value.")
                 sys.exit(1)
     if opt.TARGET_ARCH is not None and len(opt.TARGET_ARCH) > 1:
         for elem in opt.TARGET_ARCH:
             if elem == '0':
-                print("0 will clear the TARGET_ARCH setting in target.txt and can't combine with other value.")
+                print(
+                    "0 will clear the TARGET_ARCH setting in target.txt and can't combine with other value.")
                 sys.exit(1)
 
     try:
@@ -252,4 +278,3 @@ if __name__ == '__main__':
     except Exception as e:
         last_type, last_value, last_tb = sys.exc_info()
         traceback.print_exception(last_type, last_value, last_tb)
-
diff --git a/BaseTools/Source/Python/TargetTool/__init__.py b/BaseTools/Source/Python/TargetTool/__init__.py
index ae712b44aa9c..48b55bd4f367 100644
--- a/BaseTools/Source/Python/TargetTool/__init__.py
+++ b/BaseTools/Source/Python/TargetTool/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'TargetTool' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/Trim/Trim.py b/BaseTools/Source/Python/Trim/Trim.py
index c479f7d2b2e7..cffca972d219 100644
--- a/BaseTools/Source/Python/Trim/Trim.py
+++ b/BaseTools/Source/Python/Trim/Trim.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Trim files preprocessed by compiler
 #
 # Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -27,43 +27,51 @@ __version_number__ = ("0.10" + " " + gBUILD_VERSION)
 __version__ = "%prog Version " + __version_number__
 __copyright__ = "Copyright (c) 2007-2018, Intel Corporation. All rights reserved."
 
-## Regular expression for matching Line Control directive like "#line xxx"
+# Regular expression for matching Line Control directive like "#line xxx"
 gLineControlDirective = re.compile('^\s*#(?:line)?\s+([0-9]+)\s+"*([^"]*)"')
-## Regular expression for matching "typedef struct"
-gTypedefPattern = re.compile("^\s*typedef\s+struct(\s+\w+)?\s*[{]*$", re.MULTILINE)
-## Regular expression for matching "#pragma pack"
+# Regular expression for matching "typedef struct"
+gTypedefPattern = re.compile(
+    "^\s*typedef\s+struct(\s+\w+)?\s*[{]*$", re.MULTILINE)
+# Regular expression for matching "#pragma pack"
 gPragmaPattern = re.compile("^\s*#pragma\s+pack", re.MULTILINE)
-## Regular expression for matching "typedef"
+# Regular expression for matching "typedef"
 gTypedef_SinglePattern = re.compile("^\s*typedef", re.MULTILINE)
-## Regular expression for matching "typedef struct, typedef union, struct, union"
-gTypedef_MulPattern = re.compile("^\s*(typedef)?\s+(struct|union)(\s+\w+)?\s*[{]*$", re.MULTILINE)
+# Regular expression for matching "typedef struct, typedef union, struct, union"
+gTypedef_MulPattern = re.compile(
+    "^\s*(typedef)?\s+(struct|union)(\s+\w+)?\s*[{]*$", re.MULTILINE)
 
 #
 # The following number pattern match will only match if following criteria is met:
 # There is leading non-(alphanumeric or _) character, and no following alphanumeric or _
 # as the pattern is greedily match, so it is ok for the gDecNumberPattern or gHexNumberPattern to grab the maximum match
 #
-## Regular expression for matching HEX number
-gHexNumberPattern = re.compile("(?<=[^a-zA-Z0-9_])(0[xX])([0-9a-fA-F]+)(U(?=$|[^a-zA-Z0-9_]))?")
-## Regular expression for matching decimal number with 'U' postfix
-gDecNumberPattern = re.compile("(?<=[^a-zA-Z0-9_])([0-9]+)U(?=$|[^a-zA-Z0-9_])")
-## Regular expression for matching constant with 'ULL' 'LL' postfix
-gLongNumberPattern = re.compile("(?<=[^a-zA-Z0-9_])(0[xX][0-9a-fA-F]+|[0-9]+)U?LL(?=$|[^a-zA-Z0-9_])")
+# Regular expression for matching HEX number
+gHexNumberPattern = re.compile(
+    "(?<=[^a-zA-Z0-9_])(0[xX])([0-9a-fA-F]+)(U(?=$|[^a-zA-Z0-9_]))?")
+# Regular expression for matching decimal number with 'U' postfix
+gDecNumberPattern = re.compile(
+    "(?<=[^a-zA-Z0-9_])([0-9]+)U(?=$|[^a-zA-Z0-9_])")
+# Regular expression for matching constant with 'ULL' 'LL' postfix
+gLongNumberPattern = re.compile(
+    "(?<=[^a-zA-Z0-9_])(0[xX][0-9a-fA-F]+|[0-9]+)U?LL(?=$|[^a-zA-Z0-9_])")
 
-## Regular expression for matching "Include ()" in asl file
-gAslIncludePattern = re.compile("^(\s*)[iI]nclude\s*\(\"?([^\"\(\)]+)\"\)", re.MULTILINE)
-## Regular expression for matching C style #include "XXX.asl" in asl file
-gAslCIncludePattern = re.compile(r'^(\s*)#include\s*[<"]\s*([-\\/\w.]+)\s*([>"])', re.MULTILINE)
-## Patterns used to convert EDK conventions to EDK2 ECP conventions
+# Regular expression for matching "Include ()" in asl file
+gAslIncludePattern = re.compile(
+    "^(\s*)[iI]nclude\s*\(\"?([^\"\(\)]+)\"\)", re.MULTILINE)
+# Regular expression for matching C style #include "XXX.asl" in asl file
+gAslCIncludePattern = re.compile(
+    r'^(\s*)#include\s*[<"]\s*([-\\/\w.]+)\s*([>"])', re.MULTILINE)
+# Patterns used to convert EDK conventions to EDK2 ECP conventions
 
-## Regular expression for finding header file inclusions
-gIncludePattern = re.compile(r"^[ \t]*[%]?[ \t]*include(?:[ \t]*(?:\\(?:\r\n|\r|\n))*[ \t]*)*(?:\(?[\"<]?[ \t]*)([-\w.\\/() \t]+)(?:[ \t]*[\">]?\)?)", re.MULTILINE | re.UNICODE | re.IGNORECASE)
+# Regular expression for finding header file inclusions
+gIncludePattern = re.compile(
+    r"^[ \t]*[%]?[ \t]*include(?:[ \t]*(?:\\(?:\r\n|\r|\n))*[ \t]*)*(?:\(?[\"<]?[ \t]*)([-\w.\\/() \t]+)(?:[ \t]*[\">]?\)?)", re.MULTILINE | re.UNICODE | re.IGNORECASE)
 
 
-## file cache to avoid circular include in ASL file
+# file cache to avoid circular include in ASL file
 gIncludedAslFile = []
 
-## Trim preprocessed source code
+# Trim preprocessed source code
 #
 # Remove extra content made by preprocessor. The preprocessor must enable the
 # line number generation option when preprocessing.
@@ -72,6 +80,8 @@ gIncludedAslFile = []
 # @param  Target    File to store the trimmed content
 # @param  Convert   If True, convert standard HEX format to MASM format
 #
+
+
 def TrimPreprocessedFile(Source, Target, ConvertHex, TrimLong):
     CreateDirectory(os.path.dirname(Target))
     try:
@@ -80,7 +90,8 @@ def TrimPreprocessedFile(Source, Target, ConvertHex, TrimLong):
     except IOError:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Source)
     except:
-        EdkLogger.error("Trim", AUTOGEN_ERROR, "TrimPreprocessedFile: Error while processing file", File=Source)
+        EdkLogger.error("Trim", AUTOGEN_ERROR,
+                        "TrimPreprocessedFile: Error while processing file", File=Source)
 
     PreprocessedFile = ""
     InjectedFile = ""
@@ -183,7 +194,7 @@ def TrimPreprocessedFile(Source, Target, ConvertHex, TrimLong):
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
 
-## Trim preprocessed VFR file
+# Trim preprocessed VFR file
 #
 # Remove extra content made by preprocessor. The preprocessor doesn't need to
 # enable line number generation option when preprocessing.
@@ -191,6 +202,8 @@ def TrimPreprocessedFile(Source, Target, ConvertHex, TrimLong):
 # @param  Source    File to be trimmed
 # @param  Target    File to store the trimmed content
 #
+
+
 def TrimPreprocessedVfr(Source, Target):
     CreateDirectory(os.path.dirname(Target))
 
@@ -248,7 +261,7 @@ def TrimPreprocessedVfr(Source, Target):
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
 
-## Read the content  ASL file, including ASL included, recursively
+# Read the content  ASL file, including ASL included, recursively
 #
 # @param  Source            File to be read
 # @param  Indent            Spaces before the Include() statement
@@ -257,7 +270,9 @@ def TrimPreprocessedVfr(Source, Target):
 #                           first for the included file; otherwise, only the path specified
 #                           in the IncludePathList will be searched.
 #
-def DoInclude(Source, Indent='', IncludePathList=[], LocalSearchPath=None, IncludeFileList = None, filetype=None):
+
+
+def DoInclude(Source, Indent='', IncludePathList=[], LocalSearchPath=None, IncludeFileList=None, filetype=None):
     NewFileContent = []
     if IncludeFileList is None:
         IncludeFileList = []
@@ -287,12 +302,11 @@ def DoInclude(Source, Indent='', IncludePathList=[], LocalSearchPath=None, Inclu
         EdkLogger.warn("Trim", FILE_OPEN_FAILURE, ExtraData=Source)
         return []
 
-
     # avoid A "include" B and B "include" A
     IncludeFile = os.path.abspath(os.path.normpath(IncludeFile))
     if IncludeFile in gIncludedAslFile:
         EdkLogger.warn("Trim", "Circular include",
-                       ExtraData= "%s -> %s" % (" -> ".join(gIncludedAslFile), IncludeFile))
+                       ExtraData="%s -> %s" % (" -> ".join(gIncludedAslFile), IncludeFile))
         return []
     gIncludedAslFile.append(IncludeFile)
     IncludeFileList.append(IncludeFile.strip())
@@ -312,7 +326,8 @@ def DoInclude(Source, Indent='', IncludePathList=[], LocalSearchPath=None, Inclu
                     LocalSearchPath = os.path.dirname(IncludeFile)
             CurrentIndent = Indent + Result[0][0]
             IncludedFile = Result[0][1]
-            NewFileContent.extend(DoInclude(IncludedFile, CurrentIndent, IncludePathList, LocalSearchPath,IncludeFileList,filetype))
+            NewFileContent.extend(DoInclude(
+                IncludedFile, CurrentIndent, IncludePathList, LocalSearchPath, IncludeFileList, filetype))
             NewFileContent.append("\n")
         elif filetype == "ASM":
             Result = gIncludePattern.findall(Line)
@@ -324,7 +339,8 @@ def DoInclude(Source, Indent='', IncludePathList=[], LocalSearchPath=None, Inclu
 
             IncludedFile = IncludedFile.strip()
             IncludedFile = os.path.normpath(IncludedFile)
-            NewFileContent.extend(DoInclude(IncludedFile, '', IncludePathList, LocalSearchPath,IncludeFileList,filetype))
+            NewFileContent.extend(DoInclude(
+                IncludedFile, '', IncludePathList, LocalSearchPath, IncludeFileList, filetype))
             NewFileContent.append("\n")
 
     gIncludedAslFile.pop()
@@ -332,7 +348,7 @@ def DoInclude(Source, Indent='', IncludePathList=[], LocalSearchPath=None, Inclu
     return NewFileContent
 
 
-## Trim ASL file
+# Trim ASL file
 #
 # Replace ASL include statement with the content the included file
 #
@@ -340,7 +356,7 @@ def DoInclude(Source, Indent='', IncludePathList=[], LocalSearchPath=None, Inclu
 # @param  Target          File to store the trimmed content
 # @param  IncludePathFile The file to log the external include path
 #
-def TrimAslFile(Source, Target, IncludePathFile,AslDeps = False):
+def TrimAslFile(Source, Target, IncludePathFile, AslDeps=False):
     CreateDirectory(os.path.dirname(Target))
 
     SourceDir = os.path.dirname(Source)
@@ -363,16 +379,20 @@ def TrimAslFile(Source, Target, IncludePathFile,AslDeps = False):
                 FileLines = File.readlines()
             for Line in FileLines:
                 LineNum += 1
-                if Line.startswith("/I") or Line.startswith ("-I"):
+                if Line.startswith("/I") or Line.startswith("-I"):
                     IncludePathList.append(Line[2:].strip())
                 else:
-                    EdkLogger.warn("Trim", "Invalid include line in include list file.", IncludePathFile, LineNum)
+                    EdkLogger.warn(
+                        "Trim", "Invalid include line in include list file.", IncludePathFile, LineNum)
         except:
-            EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=IncludePathFile)
+            EdkLogger.error("Trim", FILE_OPEN_FAILURE,
+                            ExtraData=IncludePathFile)
     AslIncludes = []
-    Lines = DoInclude(Source, '', IncludePathList,IncludeFileList=AslIncludes,filetype='ASL')
-    AslIncludes = [item for item in AslIncludes if item !=Source]
-    SaveFileOnChange(os.path.join(os.path.dirname(Target),os.path.basename(Source))+".trim.deps", " \\\n".join([Source+":"] +AslIncludes),False)
+    Lines = DoInclude(Source, '', IncludePathList,
+                      IncludeFileList=AslIncludes, filetype='ASL')
+    AslIncludes = [item for item in AslIncludes if item != Source]
+    SaveFileOnChange(os.path.join(os.path.dirname(Target), os.path.basename(
+        Source))+".trim.deps", " \\\n".join([Source+":"] + AslIncludes), False)
 
     #
     # Undef MIN and MAX to avoid collision in ASL source code
@@ -386,7 +406,7 @@ def TrimAslFile(Source, Target, IncludePathFile,AslDeps = False):
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
 
-## Trim ASM file
+# Trim ASM file
 #
 # Output ASM include statement with the content the included file
 #
@@ -394,6 +414,8 @@ def TrimAslFile(Source, Target, IncludePathFile,AslDeps = False):
 # @param  Target          File to store the trimmed content
 # @param  IncludePathFile The file to log the external include path
 #
+
+
 def TrimAsmFile(Source, Target, IncludePathFile):
     CreateDirectory(os.path.dirname(Target))
 
@@ -416,17 +438,21 @@ def TrimAsmFile(Source, Target, IncludePathFile):
                 FileLines = File.readlines()
             for Line in FileLines:
                 LineNum += 1
-                if Line.startswith("/I") or Line.startswith ("-I"):
+                if Line.startswith("/I") or Line.startswith("-I"):
                     IncludePathList.append(Line[2:].strip())
                 else:
-                    EdkLogger.warn("Trim", "Invalid include line in include list file.", IncludePathFile, LineNum)
+                    EdkLogger.warn(
+                        "Trim", "Invalid include line in include list file.", IncludePathFile, LineNum)
         except:
-            EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=IncludePathFile)
+            EdkLogger.error("Trim", FILE_OPEN_FAILURE,
+                            ExtraData=IncludePathFile)
     AsmIncludes = []
-    Lines = DoInclude(Source, '', IncludePathList,IncludeFileList=AsmIncludes,filetype='ASM')
+    Lines = DoInclude(Source, '', IncludePathList,
+                      IncludeFileList=AsmIncludes, filetype='ASM')
     AsmIncludes = [item for item in AsmIncludes if item != Source]
     if AsmIncludes:
-        SaveFileOnChange(os.path.join(os.path.dirname(Target),os.path.basename(Source))+".trim.deps", " \\\n".join([Source+":"] +AsmIncludes),False)
+        SaveFileOnChange(os.path.join(os.path.dirname(Target), os.path.basename(
+            Source))+".trim.deps", " \\\n".join([Source+":"] + AsmIncludes), False)
     # save all lines trimmed
     try:
         with open(Target, 'w') as File:
@@ -434,6 +460,7 @@ def TrimAsmFile(Source, Target, IncludePathFile):
     except:
         EdkLogger.error("Trim", FILE_OPEN_FAILURE, ExtraData=Target)
 
+
 def GenerateVfrBinSec(ModuleName, DebugDir, OutputFile):
     VfrNameList = []
     if os.path.isdir(DebugDir):
@@ -441,9 +468,9 @@ def GenerateVfrBinSec(ModuleName, DebugDir, OutputFile):
             for FileName in Files:
                 Name, Ext = os.path.splitext(FileName)
                 if Ext == '.c' and Name != 'AutoGen':
-                    VfrNameList.append (Name + 'Bin')
+                    VfrNameList.append(Name + 'Bin')
 
-    VfrNameList.append (ModuleName + 'Strings')
+    VfrNameList.append(ModuleName + 'Strings')
 
     EfiFileName = os.path.join(DebugDir, ModuleName + '.efi')
     MapFileName = os.path.join(DebugDir, ModuleName + '.map')
@@ -455,7 +482,8 @@ def GenerateVfrBinSec(ModuleName, DebugDir, OutputFile):
     try:
         fInputfile = open(OutputFile, "wb+")
     except:
-        EdkLogger.error("Trim", FILE_OPEN_FAILURE, "File open failed for %s" %OutputFile, None)
+        EdkLogger.error("Trim", FILE_OPEN_FAILURE,
+                        "File open failed for %s" % OutputFile, None)
 
     # Use a instance of BytesIO to cache data
     fStringIO = BytesIO()
@@ -469,8 +497,8 @@ def GenerateVfrBinSec(ModuleName, DebugDir, OutputFile):
             #
             UniGuid = b'\xe0\xc5\x13\x89\xf63\x86M\x9b\xf1C\xef\x89\xfc\x06f'
             fStringIO.write(UniGuid)
-            UniValue = pack ('Q', int (Item[1], 16))
-            fStringIO.write (UniValue)
+            UniValue = pack('Q', int(Item[1], 16))
+            fStringIO.write(UniValue)
         else:
             #
             # VFR binary offset in image.
@@ -479,23 +507,24 @@ def GenerateVfrBinSec(ModuleName, DebugDir, OutputFile):
             #
             VfrGuid = b'\xb4|\xbc\xd0Gj_I\xaa\x11q\x07F\xda\x06\xa2'
             fStringIO.write(VfrGuid)
-            type (Item[1])
-            VfrValue = pack ('Q', int (Item[1], 16))
-            fStringIO.write (VfrValue)
+            type(Item[1])
+            VfrValue = pack('Q', int(Item[1], 16))
+            fStringIO.write(VfrValue)
 
     #
     # write data into file.
     #
-    try :
-        fInputfile.write (fStringIO.getvalue())
+    try:
+        fInputfile.write(fStringIO.getvalue())
     except:
-        EdkLogger.error("Trim", FILE_WRITE_FAILURE, "Write data to file %s failed, please check whether the file been locked or using by other applications." %OutputFile, None)
+        EdkLogger.error("Trim", FILE_WRITE_FAILURE,
+                        "Write data to file %s failed, please check whether the file been locked or using by other applications." % OutputFile, None)
 
-    fStringIO.close ()
-    fInputfile.close ()
+    fStringIO.close()
+    fInputfile.close()
 
 
-## Parse command line options
+# Parse command line options
 #
 # Using standard Python module optparse to parse command line option of this tool.
 #
@@ -509,13 +538,13 @@ def Options():
         make_option("-r", "--vfr-file", dest="FileType", const="Vfr", action="store_const",
                           help="The input file is preprocessed VFR file"),
         make_option("--Vfr-Uni-Offset", dest="FileType", const="VfrOffsetBin", action="store_const",
-                          help="The input file is EFI image"),
+                    help="The input file is EFI image"),
         make_option("--asl-deps", dest="AslDeps", const="True", action="store_const",
-                          help="Generate Asl dependent files."),
+                    help="Generate Asl dependent files."),
         make_option("-a", "--asl-file", dest="FileType", const="Asl", action="store_const",
                           help="The input file is ASL file"),
-        make_option( "--asm-file", dest="FileType", const="Asm", action="store_const",
-                          help="The input file is asm file"),
+        make_option("--asm-file", dest="FileType", const="Asm", action="store_const",
+                    help="The input file is asm file"),
         make_option("-c", "--convert-hex", dest="ConvertHex", action="store_true",
                           help="Convert standard hex format (0xabcd) to MASM format (abcdh)"),
 
@@ -525,22 +554,25 @@ def Options():
                           help="The input file is include path list to search for ASL include file"),
         make_option("-o", "--output", dest="OutputFile",
                           help="File to store the trimmed content"),
-        make_option("--ModuleName", dest="ModuleName", help="The module's BASE_NAME"),
+        make_option("--ModuleName", dest="ModuleName",
+                    help="The module's BASE_NAME"),
         make_option("--DebugDir", dest="DebugDir",
-                          help="Debug Output directory to store the output files"),
+                    help="Debug Output directory to store the output files"),
         make_option("-v", "--verbose", dest="LogLevel", action="store_const", const=EdkLogger.VERBOSE,
                           help="Run verbosely"),
         make_option("-d", "--debug", dest="LogLevel", type="int",
                           help="Run with debug information"),
         make_option("-q", "--quiet", dest="LogLevel", action="store_const", const=EdkLogger.QUIET,
                           help="Run quietly"),
-        make_option("-?", action="help", help="show this help message and exit"),
+        make_option("-?", action="help",
+                    help="show this help message and exit"),
     ]
 
     # use clearer usage to override default usage message
     UsageString = "%prog [-s|-r|-a|--Vfr-Uni-Offset] [-c] [-v|-d <debug_level>|-q] [-i <include_path_file>] [-o <output_file>] [--ModuleName <ModuleName>] [--DebugDir <DebugDir>] [<input_file>]"
 
-    Parser = OptionParser(description=__copyright__, version=__version__, option_list=OptionList, usage=UsageString)
+    Parser = OptionParser(description=__copyright__, version=__version__,
+                          option_list=OptionList, usage=UsageString)
     Parser.set_defaults(FileType="Vfr")
     Parser.set_defaults(ConvertHex=False)
     Parser.set_defaults(LogLevel=EdkLogger.INFO)
@@ -552,16 +584,18 @@ def Options():
         if len(Args) == 0:
             return Options, ''
         elif len(Args) > 1:
-            EdkLogger.error("Trim", OPTION_NOT_SUPPORTED, ExtraData=Parser.get_usage())
+            EdkLogger.error("Trim", OPTION_NOT_SUPPORTED,
+                            ExtraData=Parser.get_usage())
     if len(Args) == 0:
         EdkLogger.error("Trim", OPTION_MISSING, ExtraData=Parser.get_usage())
     if len(Args) > 1:
-        EdkLogger.error("Trim", OPTION_NOT_SUPPORTED, ExtraData=Parser.get_usage())
+        EdkLogger.error("Trim", OPTION_NOT_SUPPORTED,
+                        ExtraData=Parser.get_usage())
 
     InputFile = Args[0]
     return Options, InputFile
 
-## Entrance method
+# Entrance method
 #
 # This method mainly dispatch specific methods per the command line options.
 # If no error found, return zero value so the caller of this tool can know
@@ -570,6 +604,8 @@ def Options():
 # @retval 0     Tool was successful
 # @retval 1     Tool failed
 #
+
+
 def Main():
     try:
         EdkLogger.Initialize()
@@ -584,44 +620,54 @@ def Main():
     try:
         if CommandOptions.FileType == "Vfr":
             if CommandOptions.OutputFile is None:
-                CommandOptions.OutputFile = os.path.splitext(InputFile)[0] + '.iii'
+                CommandOptions.OutputFile = os.path.splitext(InputFile)[
+                    0] + '.iii'
             TrimPreprocessedVfr(InputFile, CommandOptions.OutputFile)
         elif CommandOptions.FileType == "Asl":
             if CommandOptions.OutputFile is None:
-                CommandOptions.OutputFile = os.path.splitext(InputFile)[0] + '.iii'
-            TrimAslFile(InputFile, CommandOptions.OutputFile, CommandOptions.IncludePathFile,CommandOptions.AslDeps)
+                CommandOptions.OutputFile = os.path.splitext(InputFile)[
+                    0] + '.iii'
+            TrimAslFile(InputFile, CommandOptions.OutputFile,
+                        CommandOptions.IncludePathFile, CommandOptions.AslDeps)
         elif CommandOptions.FileType == "VfrOffsetBin":
-            GenerateVfrBinSec(CommandOptions.ModuleName, CommandOptions.DebugDir, CommandOptions.OutputFile)
+            GenerateVfrBinSec(CommandOptions.ModuleName,
+                              CommandOptions.DebugDir, CommandOptions.OutputFile)
         elif CommandOptions.FileType == "Asm":
-            TrimAsmFile(InputFile, CommandOptions.OutputFile, CommandOptions.IncludePathFile)
-        else :
+            TrimAsmFile(InputFile, CommandOptions.OutputFile,
+                        CommandOptions.IncludePathFile)
+        else:
             if CommandOptions.OutputFile is None:
-                CommandOptions.OutputFile = os.path.splitext(InputFile)[0] + '.iii'
-            TrimPreprocessedFile(InputFile, CommandOptions.OutputFile, CommandOptions.ConvertHex, CommandOptions.TrimLong)
+                CommandOptions.OutputFile = os.path.splitext(InputFile)[
+                    0] + '.iii'
+            TrimPreprocessedFile(InputFile, CommandOptions.OutputFile,
+                                 CommandOptions.ConvertHex, CommandOptions.TrimLong)
     except FatalError as X:
         import platform
         import traceback
         if CommandOptions is not None and CommandOptions.LogLevel <= EdkLogger.DEBUG_9:
-            EdkLogger.quiet("(Python %s on %s) " % (platform.python_version(), sys.platform) + traceback.format_exc())
+            EdkLogger.quiet("(Python %s on %s) " % (
+                platform.python_version(), sys.platform) + traceback.format_exc())
         return 1
     except:
         import traceback
         import platform
         EdkLogger.error(
-                    "\nTrim",
-                    CODE_ERROR,
-                    "Unknown fatal error when trimming [%s]" % InputFile,
-                    ExtraData="\n(Please send email to %s for help, attaching following call stack trace!)\n" % MSG_EDKII_MAIL_ADDR,
-                    RaiseError=False
-                    )
-        EdkLogger.quiet("(Python %s on %s) " % (platform.python_version(), sys.platform) + traceback.format_exc())
+            "\nTrim",
+            CODE_ERROR,
+            "Unknown fatal error when trimming [%s]" % InputFile,
+            ExtraData="\n(Please send email to %s for help, attaching following call stack trace!)\n" % MSG_EDKII_MAIL_ADDR,
+            RaiseError=False
+        )
+        EdkLogger.quiet("(Python %s on %s) " % (
+            platform.python_version(), sys.platform) + traceback.format_exc())
         return 1
 
     return 0
 
+
 if __name__ == '__main__':
     r = Main()
-    ## 0-127 is a safe return range, and 1 is a standard default error
-    if r < 0 or r > 127: r = 1
+    # 0-127 is a safe return range, and 1 is a standard default error
+    if r < 0 or r > 127:
+        r = 1
     sys.exit(r)
-
diff --git a/BaseTools/Source/Python/UPT/BuildVersion.py b/BaseTools/Source/Python/UPT/BuildVersion.py
index 1b144dc1b666..de0225985950 100644
--- a/BaseTools/Source/Python/UPT/BuildVersion.py
+++ b/BaseTools/Source/Python/UPT/BuildVersion.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # This file is for build version number auto generation
 #
diff --git a/BaseTools/Source/Python/UPT/Core/DependencyRules.py b/BaseTools/Source/Python/UPT/Core/DependencyRules.py
index 0c801c72d614..2b6b088b4809 100644
--- a/BaseTools/Source/Python/UPT/Core/DependencyRules.py
+++ b/BaseTools/Source/Python/UPT/Core/DependencyRules.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is for installed package information database operations
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -26,11 +26,11 @@ from Library import GlobalData
 from Logger.ToolError import FatalError
 from Logger.ToolError import EDK1_INF_ERROR
 from Logger.ToolError import UNKNOWN_ERROR
-(DEPEX_CHECK_SUCCESS, DEPEX_CHECK_MODULE_NOT_FOUND, \
-DEPEX_CHECK_PACKAGE_NOT_FOUND, DEPEX_CHECK_DP_NOT_FOUND) = (0, 1, 2, 3)
+(DEPEX_CHECK_SUCCESS, DEPEX_CHECK_MODULE_NOT_FOUND,
+ DEPEX_CHECK_PACKAGE_NOT_FOUND, DEPEX_CHECK_DP_NOT_FOUND) = (0, 1, 2, 3)
 
 
-## DependencyRules
+# DependencyRules
 #
 # This class represents the dependency rule check mechanism
 #
@@ -42,10 +42,12 @@ class DependencyRules(object):
         self.WsPkgList = GetWorkspacePackage()
         self.WsModuleList = GetWorkspaceModule()
 
-        self.PkgsToBeDepend = [(PkgInfo[1], PkgInfo[2]) for PkgInfo in self.WsPkgList]
+        self.PkgsToBeDepend = [(PkgInfo[1], PkgInfo[2])
+                               for PkgInfo in self.WsPkgList]
 
         # Add package info from the DIST to be installed.
-        self.PkgsToBeDepend.extend(self.GenToBeInstalledPkgList(ToBeInstalledPkgList))
+        self.PkgsToBeDepend.extend(
+            self.GenToBeInstalledPkgList(ToBeInstalledPkgList))
 
     def GenToBeInstalledPkgList(self, ToBeInstalledPkgList):
         if not ToBeInstalledPkgList:
@@ -57,7 +59,7 @@ class DependencyRules(object):
 
         return RtnList
 
-    ## Check whether a module exists by checking the Guid+Version+Name+Path combination
+    # Check whether a module exists by checking the Guid+Version+Name+Path combination
     #
     # @param Guid:  Guid of a module
     # @param Version: Version of a module
@@ -68,14 +70,15 @@ class DependencyRules(object):
     def CheckModuleExists(self, Guid, Version, Name, Path):
         Logger.Verbose(ST.MSG_CHECK_MODULE_EXIST)
         ModuleList = self.IpiDb.GetModInPackage(Guid, Version, Name, Path)
-        ModuleList.extend(self.IpiDb.GetStandaloneModule(Guid, Version, Name, Path))
+        ModuleList.extend(self.IpiDb.GetStandaloneModule(
+            Guid, Version, Name, Path))
         Logger.Verbose(ST.MSG_CHECK_MODULE_EXIST_FINISH)
         if len(ModuleList) > 0:
             return True
         else:
             return False
 
-    ## Check whether a module depex satisfied.
+    # Check whether a module depex satisfied.
     #
     # @param ModuleObj: A module object
     # @param DpObj: A distribution object
@@ -103,7 +106,7 @@ class DependencyRules(object):
                 for GuidVerPair in DpObj.PackageSurfaceArea.keys():
                     if Dep.GetGuid() == GuidVerPair[0]:
                         if Dep.GetVersion() is None or \
-                        len(Dep.GetVersion()) == 0:
+                                len(Dep.GetVersion()) == 0:
                             Result = True
                             break
                         if Dep.GetVersion() == GuidVerPair[1]:
@@ -114,14 +117,14 @@ class DependencyRules(object):
                     break
 
         if not Result:
-            Logger.Error("CheckModuleDepex", UNKNOWN_ERROR, \
-                         ST.ERR_DEPENDENCY_NOT_MATCH % (ModuleObj.GetName(), \
-                                                        Dep.GetPackageFilePath(), \
-                                                        Dep.GetGuid(), \
+            Logger.Error("CheckModuleDepex", UNKNOWN_ERROR,
+                         ST.ERR_DEPENDENCY_NOT_MATCH % (ModuleObj.GetName(),
+                                                        Dep.GetPackageFilePath(),
+                                                        Dep.GetGuid(),
                                                         Dep.GetVersion()))
         return Result
 
-    ## Check whether a package exists in a package list specified by PkgsToBeDepend.
+    # Check whether a package exists in a package list specified by PkgsToBeDepend.
     #
     # @param Guid: Guid of a package
     # @param Version: Version of a package
@@ -148,7 +151,7 @@ class DependencyRules(object):
         Logger.Verbose(ST.MSG_CHECK_PACKAGE_FINISH)
         return Found
 
-    ## Check whether a package depex satisfied.
+    # Check whether a package depex satisfied.
     #
     # @param PkgObj: A package object
     # @param DpObj: A distribution object
@@ -165,7 +168,7 @@ class DependencyRules(object):
                 return False
         return True
 
-    ## Check whether a DP exists.
+    # Check whether a DP exists.
     #
     # @param Guid: Guid of a Distribution
     # @param Version: Version of a Distribution
@@ -182,7 +185,7 @@ class DependencyRules(object):
         Logger.Verbose(ST.MSG_CHECK_DP_FINISH)
         return Found
 
-    ## Check whether a DP depex satisfied by current workspace for Install
+    # Check whether a DP depex satisfied by current workspace for Install
     #
     # @param DpObj:  A distribution object
     # @return: True if distribution depex satisfied
@@ -208,24 +211,26 @@ class DependencyRules(object):
 
         return True, DpObj
 
-
-    ## Check whether a DP depex satisfied by current workspace
+    # Check whether a DP depex satisfied by current workspace
     #  (excluding the original distribution's packages to be replaced) for Replace
     #
     # @param DpObj:  A distribution object
     # @param OrigDpGuid: The original distribution's Guid
     # @param OrigDpVersion: The original distribution's Version
     #
+
     def ReplaceCheckNewDpDepex(self, DpObj, OrigDpGuid, OrigDpVersion):
-        self.PkgsToBeDepend = [(PkgInfo[1], PkgInfo[2]) for PkgInfo in self.WsPkgList]
-        OrigDpPackageList = self.IpiDb.GetPackageListFromDp(OrigDpGuid, OrigDpVersion)
+        self.PkgsToBeDepend = [(PkgInfo[1], PkgInfo[2])
+                               for PkgInfo in self.WsPkgList]
+        OrigDpPackageList = self.IpiDb.GetPackageListFromDp(
+            OrigDpGuid, OrigDpVersion)
         for OrigPkgInfo in OrigDpPackageList:
             Guid, Version = OrigPkgInfo[0], OrigPkgInfo[1]
             if (Guid, Version) in self.PkgsToBeDepend:
                 self.PkgsToBeDepend.remove((Guid, Version))
         return self.CheckDpDepexSatisfied(DpObj)
 
-    ## Check whether a DP depex satisfied by current workspace.
+    # Check whether a DP depex satisfied by current workspace.
     #
     # @param DpObj:  A distribution object
     #
@@ -246,7 +251,7 @@ class DependencyRules(object):
 
         return True
 
-    ## Check whether a DP could be removed from current workspace.
+    # Check whether a DP could be removed from current workspace.
     #
     # @param DpGuid:  File's guid
     # @param DpVersion: File's version
@@ -293,7 +298,7 @@ class DependencyRules(object):
         #
         for (PkgGuid, PkgVersion, InstallPath) in DpPackageList:
             Logger.Warn("UPT",
-                        ST.WARN_INSTALLED_PACKAGE_NOT_FOUND%(PkgGuid, PkgVersion, InstallPath))
+                        ST.WARN_INSTALLED_PACKAGE_NOT_FOUND % (PkgGuid, PkgVersion, InstallPath))
 
         #
         # check modules to see if has dependency on package of current DP
@@ -304,8 +309,7 @@ class DependencyRules(object):
                 DependModuleList.append(Module)
         return (Removable, DependModuleList)
 
-
-    ## Check whether a DP could be replaced by a distribution containing NewDpPkgList
+    # Check whether a DP could be replaced by a distribution containing NewDpPkgList
     # from current workspace.
     #
     # @param OrigDpGuid:  original Dp's Guid
@@ -314,6 +318,7 @@ class DependencyRules(object):
     # @retval Replaceable: True if distribution could be replaced, False Else
     # @retval DependModuleList: the list of modules that make distribution can not be replaced
     #
+
     def CheckDpDepexForReplace(self, OrigDpGuid, OrigDpVersion, NewDpPkgList):
         Replaceable = True
         DependModuleList = []
@@ -333,7 +338,8 @@ class DependencyRules(object):
         #
         # get packages in current Dp and find the install path
         # List of item (PkgGuid, PkgVersion, InstallPath)
-        DpPackageList = self.IpiDb.GetPackageListFromDp(OrigDpGuid, OrigDpVersion)
+        DpPackageList = self.IpiDb.GetPackageListFromDp(
+            OrigDpGuid, OrigDpVersion)
         DpPackagePathList = []
         WorkSP = GlobalData.gWORKSPACE
         for (PkgName, PkgGuid, PkgVersion, DecFile) in self.WsPkgList:
@@ -358,7 +364,7 @@ class DependencyRules(object):
         #
         for (PkgGuid, PkgVersion, InstallPath) in DpPackageList:
             Logger.Warn("UPT",
-                        ST.WARN_INSTALLED_PACKAGE_NOT_FOUND%(PkgGuid, PkgVersion, InstallPath))
+                        ST.WARN_INSTALLED_PACKAGE_NOT_FOUND % (PkgGuid, PkgVersion, InstallPath))
 
         #
         # check modules to see if it can be satisfied by package not belong to removed DP
@@ -370,7 +376,7 @@ class DependencyRules(object):
         return (Replaceable, DependModuleList)
 
 
-## check whether module depends on packages in DpPackagePathList, return True
+# check whether module depends on packages in DpPackagePathList, return True
 # if found, False else
 #
 # @param Path: a module path
@@ -382,7 +388,8 @@ def VerifyRemoveModuleDep(Path, DpPackagePathList):
     try:
         for Item in GetPackagePath(Path):
             if Item in DpPackagePathList:
-                DecPath = os.path.normpath(os.path.join(GlobalData.gWORKSPACE, Item))
+                DecPath = os.path.normpath(
+                    os.path.join(GlobalData.gWORKSPACE, Item))
                 Logger.Info(ST.MSG_MODULE_DEPEND_ON % (Path, DecPath))
                 return False
         else:
@@ -390,7 +397,7 @@ def VerifyRemoveModuleDep(Path, DpPackagePathList):
     except FatalError as ErrCode:
         if ErrCode.message == EDK1_INF_ERROR:
             Logger.Warn("UPT",
-                        ST.WRN_EDK1_INF_FOUND%Path)
+                        ST.WRN_EDK1_INF_FOUND % Path)
             return True
         else:
             return True
@@ -399,6 +406,8 @@ def VerifyRemoveModuleDep(Path, DpPackagePathList):
 #
 # Get Dependency package path from an Inf file path
 #
+
+
 def GetPackagePath(InfPath):
     PackagePath = []
     if os.path.exists(InfPath):
@@ -419,7 +428,7 @@ def GetPackagePath(InfPath):
 
     return PackagePath
 
-## check whether module depends on packages in DpPackagePathList and can not be satisfied by OtherPkgList
+# check whether module depends on packages in DpPackagePathList and can not be satisfied by OtherPkgList
 #
 # @param Path: a module path
 # @param DpPackagePathList:  a list of Package Paths
@@ -428,11 +437,14 @@ def GetPackagePath(InfPath):
 #           True:  either module doesn't depend on DpPackagePathList or module depends on DpPackagePathList
 #                 but can be satisfied by OtherPkgList
 #
+
+
 def VerifyReplaceModuleDep(Path, DpPackagePathList, OtherPkgList):
     try:
         for Item in GetPackagePath(Path):
             if Item in DpPackagePathList:
-                DecPath = os.path.normpath(os.path.join(GlobalData.gWORKSPACE, Item))
+                DecPath = os.path.normpath(
+                    os.path.join(GlobalData.gWORKSPACE, Item))
                 Name, Guid, Version = GetPkgInfoFromDec(DecPath)
                 if (Guid, Version) not in OtherPkgList:
                     Logger.Info(ST.MSG_MODULE_DEPEND_ON % (Path, DecPath))
@@ -442,7 +454,7 @@ def VerifyReplaceModuleDep(Path, DpPackagePathList, OtherPkgList):
     except FatalError as ErrCode:
         if ErrCode.message == EDK1_INF_ERROR:
             Logger.Warn("UPT",
-                        ST.WRN_EDK1_INF_FOUND%Path)
+                        ST.WRN_EDK1_INF_FOUND % Path)
             return True
         else:
             return True
diff --git a/BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py b/BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py
index 5ee79b6317bc..5fa211c9088a 100644
--- a/BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py
+++ b/BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define a class object to describe a distribution package
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -28,12 +28,14 @@ from Object.POM.CommonObject import CommonHeaderObject
 from Object.POM.CommonObject import MiscFileObject
 from Common.MultipleWorkspace import MultipleWorkspace as mws
 
-## DistributionPackageHeaderClass
+# DistributionPackageHeaderClass
 #
 # @param IdentificationObject: Identification Object
 # @param CommonHeaderObject: Common Header Object
 #
-class DistributionPackageHeaderObject(IdentificationObject, \
+
+
+class DistributionPackageHeaderObject(IdentificationObject,
                                       CommonHeaderObject):
     def __init__(self):
         IdentificationObject.__init__(self)
@@ -78,10 +80,12 @@ class DistributionPackageHeaderObject(IdentificationObject, \
     def GetXmlSpecification(self):
         return self.XmlSpecification
 
-## DistributionPackageClass
+# DistributionPackageClass
 #
 # @param object: DistributionPackageClass
 #
+
+
 class DistributionPackageClass(object):
     def __init__(self):
         self.Header = DistributionPackageHeaderObject()
@@ -98,7 +102,7 @@ class DistributionPackageClass(object):
         self.UserExtensions = []
         self.FileList = []
 
-    ## Get all included packages and modules for a distribution package
+    # Get all included packages and modules for a distribution package
     #
     # @param WorkspaceDir:  WorkspaceDir
     # @param PackageList:   A list of all packages
@@ -115,7 +119,8 @@ class DistributionPackageClass(object):
             for PackageFile in PackageList:
                 PackageFileFullPath = mws.join(Root, PackageFile)
                 WorkspaceDir = mws.getWs(Root, PackageFile)
-                DecObj = DecPomAlignment(PackageFileFullPath, WorkspaceDir, CheckMulDec=True)
+                DecObj = DecPomAlignment(
+                    PackageFileFullPath, WorkspaceDir, CheckMulDec=True)
                 PackageObj = DecObj
                 #
                 # Parser inf file one bye one
@@ -127,10 +132,10 @@ class DistributionPackageClass(object):
                     if ModuleList and WsRelPath in ModuleList:
                         Logger.Error("UPT",
                                      OPTION_VALUE_INVALID,
-                                     ST.ERR_NOT_STANDALONE_MODULE_ERROR%\
+                                     ST.ERR_NOT_STANDALONE_MODULE_ERROR %
                                      (WsRelPath, PackageFile))
-                    Filename = os.path.normpath\
-                    (os.path.join(PackageObj.GetRelaPath(), File))
+                    Filename = os.path.normpath(
+                        os.path.join(PackageObj.GetRelaPath(), File))
                     os.path.splitext(Filename)
                     #
                     # Call INF parser to generate Inf Object.
@@ -138,27 +143,27 @@ class DistributionPackageClass(object):
                     # Inf class in InfPomAlignment.
                     #
                     try:
-                        ModuleObj = InfPomAlignment(Filename, WorkspaceDir, PackageObj.GetPackagePath())
+                        ModuleObj = InfPomAlignment(
+                            Filename, WorkspaceDir, PackageObj.GetPackagePath())
 
                         #
                         # Add module to package
                         #
                         ModuleDict = PackageObj.GetModuleDict()
-                        ModuleDict[(ModuleObj.GetGuid(), \
-                                    ModuleObj.GetVersion(), \
-                                    ModuleObj.GetName(), \
+                        ModuleDict[(ModuleObj.GetGuid(),
+                                    ModuleObj.GetVersion(),
+                                    ModuleObj.GetName(),
                                     ModuleObj.GetCombinePath())] = ModuleObj
                         PackageObj.SetModuleDict(ModuleDict)
                     except FatalError as ErrCode:
                         if ErrCode.message == EDK1_INF_ERROR:
                             Logger.Warn("UPT",
-                                        ST.WRN_EDK1_INF_FOUND%Filename)
+                                        ST.WRN_EDK1_INF_FOUND % Filename)
                         else:
                             raise
 
-                self.PackageSurfaceArea\
-                [(PackageObj.GetGuid(), PackageObj.GetVersion(), \
-                  PackageObj.GetCombinePath())] = PackageObj
+                self.PackageSurfaceArea[(PackageObj.GetGuid(), PackageObj.GetVersion(),
+                                         PackageObj.GetCombinePath())] = PackageObj
 
         #
         # Get Modules
@@ -169,7 +174,8 @@ class DistributionPackageClass(object):
                 WorkspaceDir = mws.getWs(Root, ModuleFile)
 
                 try:
-                    ModuleObj = InfPomAlignment(ModuleFileFullPath, WorkspaceDir)
+                    ModuleObj = InfPomAlignment(
+                        ModuleFileFullPath, WorkspaceDir)
                     ModuleKey = (ModuleObj.GetGuid(),
                                  ModuleObj.GetVersion(),
                                  ModuleObj.GetName(),
@@ -179,7 +185,7 @@ class DistributionPackageClass(object):
                     if ErrCode.message == EDK1_INF_ERROR:
                         Logger.Error("UPT",
                                      EDK1_INF_ERROR,
-                                     ST.WRN_EDK1_INF_FOUND%ModuleFileFullPath,
+                                     ST.WRN_EDK1_INF_FOUND % ModuleFileFullPath,
                                      ExtraData=ST.ERR_NOT_SUPPORTED_SA_MODULE)
                     else:
                         raise
@@ -187,7 +193,7 @@ class DistributionPackageClass(object):
         # Recover WorkspaceDir
         WorkspaceDir = Root
 
-    ## Get all files included for a distribution package, except tool/misc of
+    # Get all files included for a distribution package, except tool/misc of
     # distribution level
     #
     # @retval DistFileList  A list of filepath for NonMetaDataFile, relative to workspace
@@ -204,15 +210,19 @@ class DistributionPackageClass(object):
             MetaDataFileList.append(Path)
             IncludePathList = Package.GetIncludePathList()
             for IncludePath in IncludePathList:
-                SearchPath = os.path.normpath(os.path.join(os.path.dirname(FullPath), IncludePath))
-                AddPath = os.path.normpath(os.path.join(PackagePath, IncludePath))
-                self.FileList += GetNonMetaDataFiles(SearchPath, ['CVS', '.svn'], False, AddPath)
+                SearchPath = os.path.normpath(os.path.join(
+                    os.path.dirname(FullPath), IncludePath))
+                AddPath = os.path.normpath(
+                    os.path.join(PackagePath, IncludePath))
+                self.FileList += GetNonMetaDataFiles(
+                    SearchPath, ['CVS', '.svn'], False, AddPath)
             #
             # Add the miscellaneous files on DEC file
             #
             for MiscFileObj in Package.GetMiscFileList():
                 for FileObj in MiscFileObj.GetFileList():
-                    MiscFileFullPath = os.path.normpath(os.path.join(PackagePath, FileObj.GetURI()))
+                    MiscFileFullPath = os.path.normpath(
+                        os.path.join(PackagePath, FileObj.GetURI()))
                     if MiscFileFullPath not in self.FileList:
                         self.FileList.append(MiscFileFullPath)
 
@@ -222,20 +232,23 @@ class DistributionPackageClass(object):
                 Module = ModuleDict[Guid, Version, Name, Path]
                 ModulePath = Module.GetModulePath()
                 FullPath = Module.GetFullPath()
-                PkgRelPath = os.path.normpath(os.path.join(PackagePath, ModulePath))
+                PkgRelPath = os.path.normpath(
+                    os.path.join(PackagePath, ModulePath))
                 MetaDataFileList.append(Path)
                 SkipList = ['CVS', '.svn']
                 NonMetaDataFileList = []
                 if Module.UniFileClassObject:
                     for UniFile in Module.UniFileClassObject.IncFileList:
                         OriPath = os.path.normpath(os.path.dirname(FullPath))
-                        UniFilePath = os.path.normpath(os.path.join(PkgRelPath, UniFile.Path[len(OriPath) + 1:]))
+                        UniFilePath = os.path.normpath(os.path.join(
+                            PkgRelPath, UniFile.Path[len(OriPath) + 1:]))
                         if UniFilePath not in SkipModulesUniList:
                             SkipModulesUniList.append(UniFilePath)
                     for IncludeFile in Module.UniFileClassObject.IncludePathList:
                         if IncludeFile not in SkipModulesUniList:
                             SkipModulesUniList.append(IncludeFile)
-                NonMetaDataFileList = GetNonMetaDataFiles(os.path.dirname(FullPath), SkipList, False, PkgRelPath)
+                NonMetaDataFileList = GetNonMetaDataFiles(
+                    os.path.dirname(FullPath), SkipList, False, PkgRelPath)
                 for NonMetaDataFile in NonMetaDataFileList:
                     if NonMetaDataFile not in self.FileList:
                         self.FileList.append(NonMetaDataFile)
@@ -249,10 +262,12 @@ class DistributionPackageClass(object):
             if Module.UniFileClassObject:
                 for UniFile in Module.UniFileClassObject.IncFileList:
                     OriPath = os.path.normpath(os.path.dirname(FullPath))
-                    UniFilePath = os.path.normpath(os.path.join(ModulePath, UniFile.Path[len(OriPath) + 1:]))
+                    UniFilePath = os.path.normpath(os.path.join(
+                        ModulePath, UniFile.Path[len(OriPath) + 1:]))
                     if UniFilePath not in SkipModulesUniList:
                         SkipModulesUniList.append(UniFilePath)
-            NonMetaDataFileList = GetNonMetaDataFiles(os.path.dirname(FullPath), SkipList, False, ModulePath)
+            NonMetaDataFileList = GetNonMetaDataFiles(
+                os.path.dirname(FullPath), SkipList, False, ModulePath)
             for NonMetaDataFile in NonMetaDataFileList:
                 if NonMetaDataFile not in self.FileList:
                     self.FileList.append(NonMetaDataFile)
@@ -261,7 +276,4 @@ class DistributionPackageClass(object):
             if SkipModuleUni in self.FileList:
                 self.FileList.remove(SkipModuleUni)
 
-        return  self.FileList, MetaDataFileList
-
-
-
+        return self.FileList, MetaDataFileList
diff --git a/BaseTools/Source/Python/UPT/Core/FileHook.py b/BaseTools/Source/Python/UPT/Core/FileHook.py
index 20712065f742..0de02b6a19be 100644
--- a/BaseTools/Source/Python/UPT/Core/FileHook.py
+++ b/BaseTools/Source/Python/UPT/Core/FileHook.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file hooks file and directory creation and removal
 #
 # Copyright (c) 2014 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -18,19 +18,21 @@ from time import sleep
 from Library import GlobalData
 
 __built_in_remove__ = os.remove
-__built_in_mkdir__  = os.mkdir
-__built_in_rmdir__  = os.rmdir
-__built_in_chmod__  = os.chmod
-__built_in_open__   = open
+__built_in_mkdir__ = os.mkdir
+__built_in_rmdir__ = os.rmdir
+__built_in_chmod__ = os.chmod
+__built_in_open__ = open
 
-_RMFILE      = 0
-_MKFILE      = 1
-_RMDIR       = 2
-_MKDIR       = 3
-_CHMOD       = 4
+_RMFILE = 0
+_MKFILE = 1
+_RMDIR = 2
+_MKDIR = 3
+_CHMOD = 4
 
 gBACKUPFILE = 'file.backup'
-gEXCEPTION_LIST = ['Conf'+os.sep+'DistributionPackageDatabase.db', '.tmp', gBACKUPFILE]
+gEXCEPTION_LIST = ['Conf'+os.sep +
+                   'DistributionPackageDatabase.db', '.tmp', gBACKUPFILE]
+
 
 class _PathInfo:
     def __init__(self, action, path, mode=-1):
@@ -38,6 +40,7 @@ class _PathInfo:
         self.path = path
         self.mode = mode
 
+
 class RecoverMgr:
     def __init__(self, workspace):
         self.rlist = []
@@ -101,12 +104,13 @@ class RecoverMgr:
             item = self.rlist[index]
             exist = os.path.exists(item.path)
             if item.action == _MKFILE and exist:
-                #if not os.access(item.path, os.W_OK):
+                # if not os.access(item.path, os.W_OK):
                 #    os.chmod(item.path, S_IWUSR)
                 __built_in_remove__(item.path)
             elif item.action == _RMFILE and not exist:
                 if not self.zip:
-                    self.zip = zipfile.ZipFile(self.zipfile, 'r', zipfile.ZIP_DEFLATED)
+                    self.zip = zipfile.ZipFile(
+                        self.zipfile, 'r', zipfile.ZIP_DEFLATED)
                 arcname = os.path.normpath(item.path)
                 arcname = arcname[len(self.workspace)+1:].encode('utf_8')
                 if os.sep != "/" and os.sep in arcname:
@@ -145,7 +149,8 @@ class RecoverMgr:
     # Check if path needs to be hooked
     def _tryhook(self, path):
         path = os.path.normpath(path)
-        works = self.workspace if str(self.workspace).endswith(os.sep) else (self.workspace  + os.sep)
+        works = self.workspace if str(self.workspace).endswith(
+            os.sep) else (self.workspace + os.sep)
         if not path.startswith(works):
             return ''
         for exceptdir in gEXCEPTION_LIST:
@@ -154,40 +159,47 @@ class RecoverMgr:
                 return ''
         return path[len(self.workspace)+1:]
 
+
 def _hookrm(path):
     if GlobalData.gRECOVERMGR:
         GlobalData.gRECOVERMGR.bkrmfile(path)
     else:
         __built_in_remove__(path)
 
+
 def _hookmkdir(path, mode=0o777):
     if GlobalData.gRECOVERMGR:
         GlobalData.gRECOVERMGR.bkmkdir(path, mode)
     else:
         __built_in_mkdir__(path, mode)
 
+
 def _hookrmdir(path):
     if GlobalData.gRECOVERMGR:
         GlobalData.gRECOVERMGR.bkrmdir(path)
     else:
         __built_in_rmdir__(path)
 
+
 def _hookmkfile(path, mode='r', bufsize=-1):
     if GlobalData.gRECOVERMGR:
         return GlobalData.gRECOVERMGR.bkmkfile(path, mode, bufsize)
     return __built_in_open__(path, mode, bufsize)
 
+
 def _hookchmod(path, mode):
     if GlobalData.gRECOVERMGR:
         GlobalData.gRECOVERMGR.bkchmod(path, mode)
     else:
         __built_in_chmod__(path, mode)
 
+
 def SetRecoverMgr(mgr):
     GlobalData.gRECOVERMGR = mgr
 
-os.remove   = _hookrm
-os.mkdir    = _hookmkdir
-os.rmdir    = _hookrmdir
-os.chmod    = _hookchmod
-__FileHookOpen__    = _hookmkfile
+
+os.remove = _hookrm
+os.mkdir = _hookmkdir
+os.rmdir = _hookrmdir
+os.chmod = _hookchmod
+__FileHookOpen__ = _hookmkfile
diff --git a/BaseTools/Source/Python/UPT/Core/IpiDb.py b/BaseTools/Source/Python/UPT/Core/IpiDb.py
index 69895e11cd04..feb81e49aad6 100644
--- a/BaseTools/Source/Python/UPT/Core/IpiDb.py
+++ b/BaseTools/Source/Python/UPT/Core/IpiDb.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is for installed package information database operations
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -23,7 +23,7 @@ from Logger.ToolError import UPT_ALREADY_RUNNING_ERROR
 from Logger.ToolError import UPT_DB_UPDATE_ERROR
 import platform as pf
 
-## IpiDb
+# IpiDb
 #
 # This class represents the installed package information database
 # Add/Remove/Get installed distribution package information here.
@@ -33,12 +33,15 @@ import platform as pf
 # @param DbPath:      A string for the path of the database
 #
 #
+
+
 class IpiDatabase(object):
     def __init__(self, DbPath, Workspace):
         Dir = os.path.dirname(DbPath)
         if not os.path.isdir(Dir):
             os.mkdir(Dir)
-        self.Conn = sqlite3.connect(u''.join(DbPath), isolation_level='DEFERRED')
+        self.Conn = sqlite3.connect(
+            u''.join(DbPath), isolation_level='DEFERRED')
         self.Conn.execute("PRAGMA page_size=4096")
         self.Conn.execute("PRAGMA synchronous=OFF")
         self.Cur = self.Conn.cursor()
@@ -51,10 +54,10 @@ class IpiDatabase(object):
         self.DummyTable = 'Dummy'
         self.Workspace = os.path.normpath(Workspace)
 
-    ## Initialize build database
+    # Initialize build database
     #
     #
-    def InitDatabase(self, SkipLock = False):
+    def InitDatabase(self, SkipLock=False):
         Logger.Verbose(ST.MSG_INIT_IPI_START)
         if not SkipLock:
             try:
@@ -158,7 +161,7 @@ class IpiDatabase(object):
     def Commit(self):
         self.Conn.commit()
 
-    ## Add a distribution install information from DpObj
+    # Add a distribution install information from DpObj
     #
     # @param DpObj:
     # @param NewDpPkgFileName: New DpPkg File Name
@@ -171,7 +174,7 @@ class IpiDatabase(object):
                 PkgGuid = PkgKey[0]
                 PkgVersion = PkgKey[1]
                 PkgInstallPath = PkgKey[2]
-                self._AddPackage(PkgGuid, PkgVersion, DpObj.Header.GetGuid(), \
+                self._AddPackage(PkgGuid, PkgVersion, DpObj.Header.GetGuid(),
                                  DpObj.Header.GetVersion(), PkgInstallPath)
                 PkgObj = DpObj.PackageSurfaceArea[PkgKey]
                 for ModKey in PkgObj.GetModuleDict().keys():
@@ -180,18 +183,19 @@ class IpiDatabase(object):
                     ModName = ModKey[2]
                     ModInstallPath = ModKey[3]
                     ModInstallPath = \
-                    os.path.normpath(os.path.join(PkgInstallPath, ModInstallPath))
-                    self._AddModuleInPackage(ModGuid, ModVersion, ModName, PkgGuid, \
+                        os.path.normpath(os.path.join(
+                            PkgInstallPath, ModInstallPath))
+                    self._AddModuleInPackage(ModGuid, ModVersion, ModName, PkgGuid,
                                              PkgVersion, ModInstallPath)
                     ModObj = PkgObj.GetModuleDict()[ModKey]
                     for Dep in ModObj.GetPackageDependencyList():
                         DepexGuid = Dep.GetGuid()
                         DepexVersion = Dep.GetVersion()
-                        self._AddModuleDepex(ModGuid, ModVersion, ModName, ModInstallPath, \
+                        self._AddModuleDepex(ModGuid, ModVersion, ModName, ModInstallPath,
                                              DepexGuid, DepexVersion)
                 for (FilePath, Md5Sum) in PkgObj.FileList:
-                    self._AddDpFilePathList(DpObj.Header.GetGuid(), \
-                                            DpObj.Header.GetVersion(), FilePath, \
+                    self._AddDpFilePathList(DpObj.Header.GetGuid(),
+                                            DpObj.Header.GetVersion(), FilePath,
                                             Md5Sum)
 
             for ModKey in DpObj.ModuleSurfaceArea.keys():
@@ -199,46 +203,46 @@ class IpiDatabase(object):
                 ModVersion = ModKey[1]
                 ModName = ModKey[2]
                 ModInstallPath = ModKey[3]
-                self._AddStandaloneModule(ModGuid, ModVersion, ModName, \
-                                          DpObj.Header.GetGuid(), \
-                                          DpObj.Header.GetVersion(), \
+                self._AddStandaloneModule(ModGuid, ModVersion, ModName,
+                                          DpObj.Header.GetGuid(),
+                                          DpObj.Header.GetVersion(),
                                           ModInstallPath)
                 ModObj = DpObj.ModuleSurfaceArea[ModKey]
                 for Dep in ModObj.GetPackageDependencyList():
                     DepexGuid = Dep.GetGuid()
                     DepexVersion = Dep.GetVersion()
-                    self._AddModuleDepex(ModGuid, ModVersion, ModName, ModInstallPath, \
+                    self._AddModuleDepex(ModGuid, ModVersion, ModName, ModInstallPath,
                                          DepexGuid, DepexVersion)
                 for (Path, Md5Sum) in ModObj.FileList:
-                    self._AddDpFilePathList(DpObj.Header.GetGuid(), \
-                                            DpObj.Header.GetVersion(), \
+                    self._AddDpFilePathList(DpObj.Header.GetGuid(),
+                                            DpObj.Header.GetVersion(),
                                             Path, Md5Sum)
 
             #
             # add tool/misc files
             #
             for (Path, Md5Sum) in DpObj.FileList:
-                self._AddDpFilePathList(DpObj.Header.GetGuid(), \
+                self._AddDpFilePathList(DpObj.Header.GetGuid(),
                                         DpObj.Header.GetVersion(), Path, Md5Sum)
 
-            self._AddDp(DpObj.Header.GetGuid(), DpObj.Header.GetVersion(), \
+            self._AddDp(DpObj.Header.GetGuid(), DpObj.Header.GetVersion(),
                         NewDpPkgFileName, DpPkgFileName, RePackage)
 
         except sqlite3.IntegrityError as DetailMsg:
             Logger.Error("UPT",
                          UPT_DB_UPDATE_ERROR,
                          ST.ERR_UPT_DB_UPDATE_ERROR,
-                         ExtraData = DetailMsg
+                         ExtraData=DetailMsg
                          )
 
-    ## Add a distribution install information
+    # Add a distribution install information
     #
     # @param Guid         Guid of the distribution package
     # @param Version      Version of the distribution package
     # @param NewDpFileName the saved filename of distribution package file
     # @param DistributionFileName the filename of distribution package file
     #
-    def _AddDp(self, Guid, Version, NewDpFileName, DistributionFileName, \
+    def _AddDp(self, Guid, Version, NewDpFileName, DistributionFileName,
                RePackage):
 
         if Version is None or len(Version.strip()) == 0:
@@ -253,19 +257,19 @@ class IpiDatabase(object):
             PkgFileName = NewDpFileName
         CurrentTime = time.time()
         SqlCommand = \
-        """insert into %s values('%s', '%s', %s, '%s', '%s', '%s')""" % \
-        (self.DpTable, Guid, Version, CurrentTime, PkgFileName, \
-         DistributionFileName, str(RePackage).upper())
+            """insert into %s values('%s', '%s', %s, '%s', '%s', '%s')""" % \
+            (self.DpTable, Guid, Version, CurrentTime, PkgFileName,
+             DistributionFileName, str(RePackage).upper())
         self.Cur.execute(SqlCommand)
 
-
-    ## Add a file list from DP
+    # Add a file list from DP
     #
     # @param DpGuid: A DpGuid
     # @param DpVersion: A DpVersion
     # @param Path: A Path
     # @param Path: A Md5Sum
     #
+
     def _AddDpFilePathList(self, DpGuid, DpVersion, Path, Md5Sum):
         Path = os.path.normpath(Path)
         if pf.system() == 'Windows':
@@ -275,11 +279,11 @@ class IpiDatabase(object):
             if Path.startswith(self.Workspace + os.sep):
                 Path = Path[len(self.Workspace)+1:]
         SqlCommand = """insert into %s values('%s', '%s', '%s', '%s')""" % \
-        (self.DpFileListTable, Path, DpGuid, DpVersion, Md5Sum)
+            (self.DpFileListTable, Path, DpGuid, DpVersion, Md5Sum)
 
         self.Cur.execute(SqlCommand)
 
-    ## Add a package install information
+    # Add a package install information
     #
     # @param Guid: A package guid
     # @param Version: A package version
@@ -303,11 +307,11 @@ class IpiDatabase(object):
         #
         CurrentTime = time.time()
         SqlCommand = \
-        """insert into %s values('%s', '%s', %s, '%s', '%s', '%s')""" % \
-        (self.PkgTable, Guid, Version, CurrentTime, DpGuid, DpVersion, Path)
+            """insert into %s values('%s', '%s', %s, '%s', '%s', '%s')""" % \
+            (self.PkgTable, Guid, Version, CurrentTime, DpGuid, DpVersion, Path)
         self.Cur.execute(SqlCommand)
 
-    ## Add a module that from a package install information
+    # Add a module that from a package install information
     #
     # @param Guid:    Module Guid
     # @param Version: Module version
@@ -316,7 +320,7 @@ class IpiDatabase(object):
     # @param PkgVersion: Package version
     # @param Path:    Package relative path that module installs
     #
-    def _AddModuleInPackage(self, Guid, Version, Name, PkgGuid=None, \
+    def _AddModuleInPackage(self, Guid, Version, Name, PkgGuid=None,
                             PkgVersion=None, Path=''):
 
         if Version is None or len(Version.strip()) == 0:
@@ -338,12 +342,12 @@ class IpiDatabase(object):
         #
         CurrentTime = time.time()
         SqlCommand = \
-        """insert into %s values('%s', '%s', '%s', %s, '%s', '%s', '%s')""" % \
-        (self.ModInPkgTable, Guid, Version, Name, CurrentTime, PkgGuid, PkgVersion, \
-         Path)
+            """insert into %s values('%s', '%s', '%s', %s, '%s', '%s', '%s')""" % \
+            (self.ModInPkgTable, Guid, Version, Name, CurrentTime, PkgGuid, PkgVersion,
+             Path)
         self.Cur.execute(SqlCommand)
 
-    ## Add a module that is standalone install information
+    # Add a module that is standalone install information
     #
     # @param Guid: a module Guid
     # @param Version: a module Version
@@ -352,7 +356,7 @@ class IpiDatabase(object):
     # @param DpVersion: a DpVersion
     # @param Path: path
     #
-    def _AddStandaloneModule(self, Guid, Version, Name, DpGuid=None, \
+    def _AddStandaloneModule(self, Guid, Version, Name, DpGuid=None,
                              DpVersion=None, Path=''):
 
         if Version is None or len(Version.strip()) == 0:
@@ -369,12 +373,12 @@ class IpiDatabase(object):
         #
         CurrentTime = time.time()
         SqlCommand = \
-        """insert into %s values('%s', '%s', '%s', %s, '%s', '%s', '%s')""" % \
-        (self.StandaloneModTable, Guid, Version, Name, CurrentTime, DpGuid, \
-         DpVersion, Path)
+            """insert into %s values('%s', '%s', '%s', %s, '%s', '%s', '%s')""" % \
+            (self.StandaloneModTable, Guid, Version, Name, CurrentTime, DpGuid,
+             DpVersion, Path)
         self.Cur.execute(SqlCommand)
 
-    ## Add a module depex
+    # Add a module depex
     #
     # @param Guid: a module Guid
     # @param Version: a module Version
@@ -382,7 +386,7 @@ class IpiDatabase(object):
     # @param DepexGuid: a module DepexGuid
     # @param DepexVersion: a module DepexVersion
     #
-    def _AddModuleDepex(self, Guid, Version, Name, Path, DepexGuid=None, \
+    def _AddModuleDepex(self, Guid, Version, Name, Path, DepexGuid=None,
                         DepexVersion=None):
 
         if DepexGuid is None or len(DepexGuid.strip()) == 0:
@@ -400,10 +404,10 @@ class IpiDatabase(object):
         # Add module depex information to DB.
         #
         SqlCommand = """insert into %s values('%s', '%s', '%s', '%s', '%s', '%s')"""\
-         % (self.ModDepexTable, Guid, Version, Name, Path, DepexGuid, DepexVersion)
+            % (self.ModDepexTable, Guid, Version, Name, Path, DepexGuid, DepexVersion)
         self.Cur.execute(SqlCommand)
 
-    ## Remove a distribution install information, if no version specified,
+    # Remove a distribution install information, if no version specified,
     # remove all DPs with this Guid.
     #
     # @param DpGuid: guid of dpex
@@ -416,7 +420,7 @@ class IpiDatabase(object):
         # delete from ModDepex the standalone module's dependency
         #
         SqlCommand = \
-        """delete from ModDepexInfo where ModDepexInfo.ModuleGuid in
+            """delete from ModDepexInfo where ModDepexInfo.ModuleGuid in
         (select ModuleGuid from StandaloneModInfo as B where B.DpGuid = '%s'
         and B.DpVersion = '%s')
         and ModDepexInfo.ModuleVersion in
@@ -428,7 +432,8 @@ class IpiDatabase(object):
         and ModDepexInfo.InstallPath in
         (select InstallPath from StandaloneModInfo as B
         where B.DpGuid = '%s' and B.DpVersion = '%s') """ % \
-        (DpGuid, DpVersion, DpGuid, DpVersion, DpGuid, DpVersion, DpGuid, DpVersion)
+            (DpGuid, DpVersion, DpGuid, DpVersion,
+             DpGuid, DpVersion, DpGuid, DpVersion)
 
         self.Cur.execute(SqlCommand)
         #
@@ -437,7 +442,7 @@ class IpiDatabase(object):
         for Pkg in PkgList:
 
             SqlCommand = \
-            """delete from ModDepexInfo where ModDepexInfo.ModuleGuid in
+                """delete from ModDepexInfo where ModDepexInfo.ModuleGuid in
             (select ModuleGuid from ModInPkgInfo
             where ModInPkgInfo.PackageGuid ='%s' and
             ModInPkgInfo.PackageVersion = '%s')
@@ -460,44 +465,44 @@ class IpiDatabase(object):
         # delete the standalone module
         #
         SqlCommand = \
-        """delete from %s where DpGuid ='%s' and DpVersion = '%s'""" % \
-        (self.StandaloneModTable, DpGuid, DpVersion)
+            """delete from %s where DpGuid ='%s' and DpVersion = '%s'""" % \
+            (self.StandaloneModTable, DpGuid, DpVersion)
         self.Cur.execute(SqlCommand)
         #
         # delete the from pkg module
         #
         for Pkg in PkgList:
             SqlCommand = \
-            """delete from %s where %s.PackageGuid ='%s'
+                """delete from %s where %s.PackageGuid ='%s'
             and %s.PackageVersion = '%s'""" % \
-            (self.ModInPkgTable, self.ModInPkgTable, Pkg[0], \
-             self.ModInPkgTable, Pkg[1])
+                (self.ModInPkgTable, self.ModInPkgTable, Pkg[0],
+                 self.ModInPkgTable, Pkg[1])
             self.Cur.execute(SqlCommand)
         #
         # delete packages
         #
         SqlCommand = \
-        """delete from %s where DpGuid ='%s' and DpVersion = '%s'""" % \
-        (self.PkgTable, DpGuid, DpVersion)
+            """delete from %s where DpGuid ='%s' and DpVersion = '%s'""" % \
+            (self.PkgTable, DpGuid, DpVersion)
         self.Cur.execute(SqlCommand)
         #
         # delete file list from DP
         #
         SqlCommand = \
-        """delete from %s where DpGuid ='%s' and DpVersion = '%s'""" % \
-        (self.DpFileListTable, DpGuid, DpVersion)
+            """delete from %s where DpGuid ='%s' and DpVersion = '%s'""" % \
+            (self.DpFileListTable, DpGuid, DpVersion)
         self.Cur.execute(SqlCommand)
         #
         # delete DP
         #
         SqlCommand = \
-        """delete from %s where DpGuid ='%s' and DpVersion = '%s'""" % \
-        (self.DpTable, DpGuid, DpVersion)
+            """delete from %s where DpGuid ='%s' and DpVersion = '%s'""" % \
+            (self.DpTable, DpGuid, DpVersion)
         self.Cur.execute(SqlCommand)
 
-        #self.Conn.commit()
+        # self.Conn.commit()
 
-    ## Get a list of distribution install information.
+    # Get a list of distribution install information.
     #
     # @param Guid: distribution package guid
     # @param Version: distribution package version
@@ -509,15 +514,15 @@ class IpiDatabase(object):
             Logger.Verbose(ST.MSG_GET_DP_INSTALL_LIST)
             (DpGuid, DpVersion) = (Guid, Version)
             SqlCommand = """select * from %s where DpGuid ='%s'""" % \
-            (self.DpTable, DpGuid)
+                (self.DpTable, DpGuid)
             self.Cur.execute(SqlCommand)
 
         else:
             Logger.Verbose(ST.MSG_GET_DP_INSTALL_INFO_START)
             (DpGuid, DpVersion) = (Guid, Version)
             SqlCommand = \
-            """select * from %s where DpGuid ='%s' and DpVersion = '%s'""" % \
-            (self.DpTable, DpGuid, DpVersion)
+                """select * from %s where DpGuid ='%s' and DpVersion = '%s'""" % \
+                (self.DpTable, DpGuid, DpVersion)
             self.Cur.execute(SqlCommand)
 
         DpList = []
@@ -531,13 +536,14 @@ class IpiDatabase(object):
         Logger.Verbose(ST.MSG_GET_DP_INSTALL_INFO_FINISH)
         return DpList
 
-    ## Get a list of distribution install dirs
+    # Get a list of distribution install dirs
     #
     # @param Guid: distribution package guid
     # @param Version: distribution package version
     #
     def GetDpInstallDirList(self, Guid, Version):
-        SqlCommand = """select InstallPath from PkgInfo where DpGuid = '%s' and DpVersion = '%s'""" % (Guid, Version)
+        SqlCommand = """select InstallPath from PkgInfo where DpGuid = '%s' and DpVersion = '%s'""" % (
+            Guid, Version)
         self.Cur.execute(SqlCommand)
         DirList = []
         for Result in self.Cur:
@@ -553,18 +559,18 @@ class IpiDatabase(object):
 
         return DirList
 
-
-    ## Get a list of distribution install file path information.
+    # Get a list of distribution install file path information.
     #
     # @param Guid: distribution package guid
     # @param Version: distribution package version
     #
+
     def GetDpFileList(self, Guid, Version):
 
         (DpGuid, DpVersion) = (Guid, Version)
         SqlCommand = \
-        """select * from %s where DpGuid ='%s' and DpVersion = '%s'""" % \
-        (self.DpFileListTable, DpGuid, DpVersion)
+            """select * from %s where DpGuid ='%s' and DpVersion = '%s'""" % \
+            (self.DpFileListTable, DpGuid, DpVersion)
         self.Cur.execute(SqlCommand)
 
         PathList = []
@@ -575,7 +581,7 @@ class IpiDatabase(object):
 
         return PathList
 
-    ## Get files' repackage attribute if present that are installed into current workspace
+    # Get files' repackage attribute if present that are installed into current workspace
     #
     # @retval FileDict:  a Dict of file, key is file path, value is (DpGuid, DpVersion, NewDpFileName, RePackage)
     #
@@ -603,13 +609,13 @@ class IpiDatabase(object):
 
         return FileDict
 
-    ## Get (Guid, Version) from distribution file name information.
+    # Get (Guid, Version) from distribution file name information.
     #
     # @param DistributionFile: Distribution File
     #
     def GetDpByName(self, DistributionFile):
         SqlCommand = """select * from %s where NewPkgFileName = '%s'""" % \
-        (self.DpTable, DistributionFile)
+            (self.DpTable, DistributionFile)
         self.Cur.execute(SqlCommand)
 
         for Result in self.Cur:
@@ -621,7 +627,7 @@ class IpiDatabase(object):
         else:
             return (None, None, None)
 
-    ## Get a list of package information.
+    # Get a list of package information.
     #
     # @param Guid: package guid
     # @param Version: package version
@@ -632,22 +638,22 @@ class IpiDatabase(object):
 
             (PackageGuid, PackageVersion) = (Guid, Version)
             SqlCommand = """select * from %s where PackageGuid ='%s'
-            and PackageVersion = '%s'""" % (self.PkgTable, PackageGuid, \
+            and PackageVersion = '%s'""" % (self.PkgTable, PackageGuid,
                                             PackageVersion)
             self.Cur.execute(SqlCommand)
 
         elif Version is None or len(Version.strip()) == 0:
 
             SqlCommand = """select * from %s where PackageGuid ='%s'""" % \
-            (self.PkgTable, Guid)
+                (self.PkgTable, Guid)
             self.Cur.execute(SqlCommand)
         else:
             (PackageGuid, PackageVersion) = (Guid, Version)
             SqlCommand = """select * from %s where PackageGuid ='%s' and
             PackageVersion = '%s'
                             and DpGuid = '%s' and DpVersion = '%s'""" % \
-                            (self.PkgTable, PackageGuid, PackageVersion, \
-                             DpGuid, DpVersion)
+                (self.PkgTable, PackageGuid, PackageVersion,
+                 DpGuid, DpVersion)
             self.Cur.execute(SqlCommand)
 
         PkgList = []
@@ -656,31 +662,32 @@ class IpiDatabase(object):
             PkgVersion = PkgInfo[1]
             InstallTime = PkgInfo[2]
             InstallPath = PkgInfo[5]
-            PkgList.append((PkgGuid, PkgVersion, InstallTime, DpGuid, \
+            PkgList.append((PkgGuid, PkgVersion, InstallTime, DpGuid,
                             DpVersion, InstallPath))
 
         return PkgList
 
-
-    ## Get a list of module in package information.
+    # Get a list of module in package information.
     #
     # @param Guid: A module guid
     # @param Version: A module version
     #
+
     def GetModInPackage(self, Guid, Version, Name, Path, PkgGuid='', PkgVersion=''):
-        (ModuleGuid, ModuleVersion, ModuleName, InstallPath) = (Guid, Version, Name, Path)
+        (ModuleGuid, ModuleVersion, ModuleName,
+         InstallPath) = (Guid, Version, Name, Path)
         if PkgVersion == '' or PkgGuid == '':
             SqlCommand = """select * from %s where ModuleGuid ='%s' and
             ModuleVersion = '%s' and InstallPath = '%s'
-            and ModuleName = '%s'""" % (self.ModInPkgTable, ModuleGuid, \
-                                       ModuleVersion, InstallPath, ModuleName)
+            and ModuleName = '%s'""" % (self.ModInPkgTable, ModuleGuid,
+                                        ModuleVersion, InstallPath, ModuleName)
             self.Cur.execute(SqlCommand)
         else:
             SqlCommand = """select * from %s where ModuleGuid ='%s' and
             ModuleVersion = '%s' and InstallPath = '%s'
             and ModuleName = '%s' and PackageGuid ='%s'
             and PackageVersion = '%s'
-                            """ % (self.ModInPkgTable, ModuleGuid, \
+                            """ % (self.ModInPkgTable, ModuleGuid,
                                    ModuleVersion, InstallPath, ModuleName, PkgGuid, PkgVersion)
             self.Cur.execute(SqlCommand)
 
@@ -690,29 +697,30 @@ class IpiDatabase(object):
             ModVersion = ModInfo[1]
             InstallTime = ModInfo[2]
             InstallPath = ModInfo[5]
-            ModList.append((ModGuid, ModVersion, InstallTime, PkgGuid, \
+            ModList.append((ModGuid, ModVersion, InstallTime, PkgGuid,
                             PkgVersion, InstallPath))
 
         return ModList
 
-    ## Get a list of module standalone.
+    # Get a list of module standalone.
     #
     # @param Guid: A module guid
     # @param Version: A module version
     #
     def GetStandaloneModule(self, Guid, Version, Name, Path, DpGuid='', DpVersion=''):
-        (ModuleGuid, ModuleVersion, ModuleName, InstallPath) = (Guid, Version, Name, Path)
+        (ModuleGuid, ModuleVersion, ModuleName,
+         InstallPath) = (Guid, Version, Name, Path)
         if DpGuid == '':
             SqlCommand = """select * from %s where ModuleGuid ='%s' and
             ModuleVersion = '%s' and InstallPath = '%s'
-            and ModuleName = '%s'""" % (self.StandaloneModTable, ModuleGuid, \
-                                       ModuleVersion, InstallPath, ModuleName)
+            and ModuleName = '%s'""" % (self.StandaloneModTable, ModuleGuid,
+                                        ModuleVersion, InstallPath, ModuleName)
             self.Cur.execute(SqlCommand)
 
         else:
             SqlCommand = """select * from %s where ModuleGuid ='%s' and
             ModuleVersion = '%s' and InstallPath = '%s' and ModuleName = '%s' and DpGuid ='%s' and DpVersion = '%s'
-                            """ % (self.StandaloneModTable, ModuleGuid, \
+                            """ % (self.StandaloneModTable, ModuleGuid,
                                    ModuleVersion, ModuleName, InstallPath, DpGuid, DpVersion)
             self.Cur.execute(SqlCommand)
 
@@ -722,12 +730,12 @@ class IpiDatabase(object):
             ModVersion = ModInfo[1]
             InstallTime = ModInfo[2]
             InstallPath = ModInfo[5]
-            ModList.append((ModGuid, ModVersion, InstallTime, DpGuid, \
+            ModList.append((ModGuid, ModVersion, InstallTime, DpGuid,
                             DpVersion, InstallPath))
 
         return ModList
 
-    ## Get a list of module information that comes from DP.
+    # Get a list of module information that comes from DP.
     #
     # @param DpGuid: A Distribution Guid
     # @param DpVersion: A Distribution version
@@ -746,7 +754,7 @@ class IpiDatabase(object):
 
         return PathList
 
-    ## Get a list of package information.
+    # Get a list of package information.
     #
     # @param DpGuid: A Distribution Guid
     # @param DpVersion: A Distribution version
@@ -766,7 +774,7 @@ class IpiDatabase(object):
 
         return PkgList
 
-    ## Get a list of modules that depends on package information from a DP.
+    # Get a list of modules that depends on package information from a DP.
     #
     # @param DpGuid: A Distribution Guid
     # @param DpVersion: A Distribution version
@@ -790,8 +798,8 @@ class IpiDatabase(object):
             t1.ModuleVersion = t2.ModuleVersion and t2.DepexGuid ='%s'
             and (t2.DepexVersion = '%s' or t2.DepexVersion = 'N/A') and
             t1.PackageGuid != '%s' and t1.PackageVersion != '%s'
-                        """ % (self.ModInPkgTable, \
-                               self.ModDepexTable, Pkg[0], Pkg[1], Pkg[0], \
+                        """ % (self.ModInPkgTable,
+                               self.ModDepexTable, Pkg[0], Pkg[1], Pkg[0],
                                Pkg[1])
             self.Cur.execute(SqlCommand)
             for ModInfo in self.Cur:
@@ -802,17 +810,17 @@ class IpiDatabase(object):
 
             #
             # get all modules from standalone modules that depends on current
-            #Pkg (Guid match, Version match or NA) but not in current dp
+            # Pkg (Guid match, Version match or NA) but not in current dp
             #
             SqlCommand = \
-            """select t1.ModuleGuid, t1.ModuleVersion, t1.InstallPath
+                """select t1.ModuleGuid, t1.ModuleVersion, t1.InstallPath
             from %s as t1, %s as t2 where t1.ModuleGuid = t2.ModuleGuid and
             t1.ModuleVersion = t2.ModuleVersion and t2.DepexGuid ='%s'
             and (t2.DepexVersion = '%s' or t2.DepexVersion = 'N/A') and
                             t1.DpGuid != '%s' and t1.DpVersion != '%s'
                         """ % \
-                        (self.StandaloneModTable, self.ModDepexTable, Pkg[0], \
-                         Pkg[1], DpGuid, DpVersion)
+                (self.StandaloneModTable, self.ModDepexTable, Pkg[0],
+                 Pkg[1], DpGuid, DpVersion)
             self.Cur.execute(SqlCommand)
             for ModInfo in self.Cur:
                 ModGuid = ModInfo[0]
@@ -820,10 +828,9 @@ class IpiDatabase(object):
                 InstallPath = ModInfo[2]
                 ModList.append((ModGuid, ModVersion, InstallPath))
 
-
         return ModList
 
-    ## Get Dp's list of modules.
+    # Get Dp's list of modules.
     #
     # @param DpGuid: A Distribution Guid
     # @param DpVersion: A Distribution version
@@ -845,13 +852,13 @@ class IpiDatabase(object):
 
         return ModList
 
-
-    ## Get a module depex
+    # Get a module depex
     #
     # @param DpGuid: A module Guid
     # @param DpVersion: A module version
     # @param Path:
     #
+
     def GetModuleDepex(self, Guid, Version, Path):
 
         #
@@ -862,7 +869,6 @@ class IpiDatabase(object):
                             """ % (self.ModDepexTable, Guid, Version, Path)
         self.Cur.execute(SqlCommand)
 
-
         DepexList = []
         for DepInfo in self.Cur:
             DepexGuid = DepInfo[3]
@@ -871,7 +877,7 @@ class IpiDatabase(object):
 
         return DepexList
 
-    ## Inventory the distribution installed to current workspace
+    # Inventory the distribution installed to current workspace
     #
     # Inventory the distribution installed to current workspace
     #
@@ -889,7 +895,7 @@ class IpiDatabase(object):
 
         return DpInfoList
 
-    ## Close entire database
+    # Close entire database
     #
     # Close the connection and cursor
     #
@@ -906,7 +912,7 @@ class IpiDatabase(object):
         self.Cur.close()
         self.Conn.close()
 
-    ## Convert To Sql String
+    # Convert To Sql String
     #
     # 1. Replace "'" with "''" in each item of StringList
     #
@@ -916,7 +922,3 @@ class IpiDatabase(object):
         if self.DpTable:
             pass
         return list(map(lambda s: s.replace("'", "''"), StringList))
-
-
-
-
diff --git a/BaseTools/Source/Python/UPT/Core/PackageFile.py b/BaseTools/Source/Python/UPT/Core/PackageFile.py
index c9157e84d6ff..79b287cdb093 100644
--- a/BaseTools/Source/Python/UPT/Core/PackageFile.py
+++ b/BaseTools/Source/Python/UPT/Core/PackageFile.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # PackageFile class represents the zip file of a distribution package.
 #
@@ -40,29 +40,29 @@ class PackageFile:
         if Mode not in ["r", "w", "a"]:
             Mode = "r"
         try:
-            self._ZipFile = zipfile.ZipFile(FileName, Mode, \
+            self._ZipFile = zipfile.ZipFile(FileName, Mode,
                                             zipfile.ZIP_DEFLATED)
             self._Files = {}
             for Filename in self._ZipFile.namelist():
                 self._Files[os.path.normpath(Filename)] = Filename
         except BaseException as Xstr:
             Logger.Error("PackagingTool", FILE_OPEN_FAILURE,
-                            ExtraData="%s (%s)" % (FileName, str(Xstr)))
+                         ExtraData="%s (%s)" % (FileName, str(Xstr)))
 
         BadFile = self._ZipFile.testzip()
         if BadFile is not None:
             Logger.Error("PackagingTool", FILE_CHECKSUM_FAILURE,
-                            ExtraData="[%s] in %s" % (BadFile, FileName))
+                         ExtraData="[%s] in %s" % (BadFile, FileName))
 
     def GetZipFile(self):
         return self._ZipFile
 
-    ## Get file name
+    # Get file name
     #
     def __str__(self):
         return self._FileName
 
-    ## Extract the file
+    # Extract the file
     #
     # @param To:  the destination file
     #
@@ -73,7 +73,7 @@ class PackageFile:
             Logger.Info(Msg)
             self.Extract(FileN, ToFile)
 
-    ## Extract the file
+    # Extract the file
     #
     # @param File:  the extracted file
     # @param ToFile:  the destination file
@@ -88,7 +88,7 @@ class PackageFile:
 
         return ''
 
-    ## Extract the file
+    # Extract the file
     #
     # @param Which:  the source path
     # @param ToDest:  the destination path
@@ -97,34 +97,34 @@ class PackageFile:
         Which = os.path.normpath(Which)
         if Which not in self._Files:
             Logger.Error("PackagingTool", FILE_NOT_FOUND,
-                            ExtraData="[%s] in %s" % (Which, self._FileName))
+                         ExtraData="[%s] in %s" % (Which, self._FileName))
         try:
             FileContent = self._ZipFile.read(self._Files[Which])
         except BaseException as Xstr:
             Logger.Error("PackagingTool", FILE_DECOMPRESS_FAILURE,
-                            ExtraData="[%s] in %s (%s)" % (Which, \
-                                                           self._FileName, \
-                                                           str(Xstr)))
+                         ExtraData="[%s] in %s (%s)" % (Which,
+                                                        self._FileName,
+                                                        str(Xstr)))
         try:
             CreateDirectory(os.path.dirname(ToDest))
             if os.path.exists(ToDest) and not os.access(ToDest, os.W_OK):
-                Logger.Warn("PackagingTool", \
+                Logger.Warn("PackagingTool",
                             ST.WRN_FILE_NOT_OVERWRITTEN % ToDest)
                 return
             else:
                 ToFile = __FileHookOpen__(ToDest, 'wb')
         except BaseException as Xstr:
             Logger.Error("PackagingTool", FILE_OPEN_FAILURE,
-                            ExtraData="%s (%s)" % (ToDest, str(Xstr)))
+                         ExtraData="%s (%s)" % (ToDest, str(Xstr)))
 
         try:
             ToFile.write(FileContent)
             ToFile.close()
         except BaseException as Xstr:
             Logger.Error("PackagingTool", FILE_WRITE_FAILURE,
-                            ExtraData="%s (%s)" % (ToDest, str(Xstr)))
+                         ExtraData="%s (%s)" % (ToDest, str(Xstr)))
 
-    ## Remove the file
+    # Remove the file
     #
     # @param Files:  the removed files
     #
@@ -139,12 +139,12 @@ class PackageFile:
             SinF = os.path.normpath(SinF)
             if SinF not in self._Files:
                 Logger.Error("PackagingTool", FILE_NOT_FOUND,
-                                ExtraData="%s is not in %s!" % \
-                                (SinF, self._FileName))
+                             ExtraData="%s is not in %s!" %
+                             (SinF, self._FileName))
             self._Files.pop(SinF)
         self._ZipFile.close()
 
-        self._ZipFile = zipfile.ZipFile(self._FileName, "w", \
+        self._ZipFile = zipfile.ZipFile(self._FileName, "w",
                                         zipfile.ZIP_DEFLATED)
         Cwd = os.getcwd()
         os.chdir(TmpDir)
@@ -152,7 +152,7 @@ class PackageFile:
         os.chdir(Cwd)
         RemoveDirectory(TmpDir, True)
 
-    ## Pack the files under Top directory, the directory shown in the zipFile start from BaseDir,
+    # Pack the files under Top directory, the directory shown in the zipFile start from BaseDir,
     # BaseDir should be the parent directory of the Top directory, for example,
     # Pack(Workspace\Dir1, Workspace) will pack files under Dir1, and the path in the zipfile will
     # start from Workspace
@@ -162,13 +162,13 @@ class PackageFile:
     #
     def Pack(self, Top, BaseDir):
         if not os.path.isdir(Top):
-            Logger.Error("PackagingTool", FILE_UNKNOWN_ERROR, \
-                         "%s is not a directory!" %Top)
+            Logger.Error("PackagingTool", FILE_UNKNOWN_ERROR,
+                         "%s is not a directory!" % Top)
 
         FilesToPack = []
         Cwd = os.getcwd()
         os.chdir(BaseDir)
-        RelaDir = Top[Top.upper().find(BaseDir.upper()).\
+        RelaDir = Top[Top.upper().find(BaseDir.upper()).
                       join(len(BaseDir).join(1)):]
 
         for Root, Dirs, Files in os.walk(RelaDir):
@@ -193,7 +193,7 @@ class PackageFile:
         self.PackFiles(FilesToPack)
         os.chdir(Cwd)
 
-    ## Pack the file
+    # Pack the file
     #
     # @param Files:  the files to pack
     #
@@ -204,7 +204,7 @@ class PackageFile:
             self.PackFile(File)
             os.chdir(Cwd)
 
-    ## Pack the file
+    # Pack the file
     #
     # @param File:  the files to pack
     # @param ArcName:  the Arc Name
@@ -224,9 +224,9 @@ class PackageFile:
             self._ZipFile.write(File, ArcName)
         except BaseException as Xstr:
             Logger.Error("PackagingTool", FILE_COMPRESS_FAILURE,
-                            ExtraData="%s (%s)" % (File, str(Xstr)))
+                         ExtraData="%s (%s)" % (File, str(Xstr)))
 
-    ## Write data to the packed file
+    # Write data to the packed file
     #
     # @param Data:  data to write
     # @param ArcName:  the Arc Name
@@ -238,13 +238,10 @@ class PackageFile:
             self._ZipFile.writestr(ArcName, Data)
         except BaseException as Xstr:
             Logger.Error("PackagingTool", FILE_COMPRESS_FAILURE,
-                            ExtraData="%s (%s)" % (ArcName, str(Xstr)))
+                         ExtraData="%s (%s)" % (ArcName, str(Xstr)))
 
-    ## Close file
+    # Close file
     #
     #
     def Close(self):
         self._ZipFile.close()
-
-
-
diff --git a/BaseTools/Source/Python/UPT/Core/__init__.py b/BaseTools/Source/Python/UPT/Core/__init__.py
index 9af96f9c3c13..201bcf236305 100644
--- a/BaseTools/Source/Python/UPT/Core/__init__.py
+++ b/BaseTools/Source/Python/UPT/Core/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'Library' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py b/BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py
index 91952a5feec7..14360fe7e1e8 100644
--- a/BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py
+++ b/BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py
@@ -1,4 +1,4 @@
-## @file GenDecFile.py
+# @file GenDecFile.py
 #
 # This file contained the logical of transfer package object to DEC files.
 #
@@ -61,6 +61,7 @@ import Library.DataType as DT
 from Library.UniClassObject import FormatUniEntry
 from Library.StringUtils import GetUniFileName
 
+
 def GenPcd(Package, Content):
     #
     # generate [Pcd] section
@@ -121,15 +122,17 @@ def GenPcd(Package, Content):
         SortedArch = ' '.join(ArchList)
         if SortedArch in NewSectionDict:
             NewSectionDict[SortedArch] = \
-            NewSectionDict[SortedArch] + [Statement]
+                NewSectionDict[SortedArch] + [Statement]
         else:
             NewSectionDict[SortedArch] = [Statement]
 
     for ValidUsage in ValidUsageDict:
-        Content += GenSection(ValidUsage, ValidUsageDict[ValidUsage], True, True)
+        Content += GenSection(ValidUsage,
+                              ValidUsageDict[ValidUsage], True, True)
 
     return Content
 
+
 def GenPcdErrorMsgSection(Package, Content):
     if not Package.PcdErrorCommentDict:
         return Content
@@ -139,7 +142,8 @@ def GenPcdErrorMsgSection(Package, Content):
     #
     Content += END_OF_LINE + END_OF_LINE
     SectionComment = TAB_COMMENT_SPLIT + END_OF_LINE
-    SectionComment += TAB_COMMENT_SPLIT + TAB_SPACE_SPLIT + TAB_PCD_ERROR_SECTION_COMMENT + END_OF_LINE
+    SectionComment += TAB_COMMENT_SPLIT + TAB_SPACE_SPLIT + \
+        TAB_PCD_ERROR_SECTION_COMMENT + END_OF_LINE
     SectionComment += TAB_COMMENT_SPLIT + END_OF_LINE
     TokenSpcCNameList = []
 
@@ -152,20 +156,22 @@ def GenPcdErrorMsgSection(Package, Content):
 
     for TokenSpcCNameItem in TokenSpcCNameList:
         SectionName = TAB_COMMENT_SPLIT + TAB_SPACE_SPLIT + TAB_SECTION_START + TAB_PCD_ERROR + \
-                      TAB_SPLIT + TokenSpcCNameItem + TAB_SECTION_END + END_OF_LINE
+            TAB_SPLIT + TokenSpcCNameItem + TAB_SECTION_END + END_OF_LINE
         Content += SectionComment
         Content += SectionName
         for (TokenSpcCName, ErrorNumber) in Package.PcdErrorCommentDict:
             if TokenSpcCNameItem == TokenSpcCName:
-                PcdErrorMsg = GetLocalValue(Package.PcdErrorCommentDict[(TokenSpcCName, ErrorNumber)])
+                PcdErrorMsg = GetLocalValue(
+                    Package.PcdErrorCommentDict[(TokenSpcCName, ErrorNumber)])
                 SectionItem = TAB_COMMENT_SPLIT + TAB_SPACE_SPLIT + TAB_SPACE_SPLIT + \
-                              ErrorNumber + TAB_SPACE_SPLIT + TAB_VALUE_SPLIT + TAB_SPACE_SPLIT + \
-                              PcdErrorMsg + END_OF_LINE
+                    ErrorNumber + TAB_SPACE_SPLIT + TAB_VALUE_SPLIT + TAB_SPACE_SPLIT + \
+                    PcdErrorMsg + END_OF_LINE
                 Content += SectionItem
 
     Content += TAB_COMMENT_SPLIT
     return Content
 
+
 def GenGuidProtocolPpi(Package, Content):
     #
     # generate [Guids] section
@@ -202,7 +208,7 @@ def GenGuidProtocolPpi(Package, Content):
         SortedArch = ' '.join(ArchList)
         if SortedArch in NewSectionDict:
             NewSectionDict[SortedArch] = \
-            NewSectionDict[SortedArch] + [Statement]
+                NewSectionDict[SortedArch] + [Statement]
         else:
             NewSectionDict[SortedArch] = [Statement]
 
@@ -242,7 +248,7 @@ def GenGuidProtocolPpi(Package, Content):
         SortedArch = ' '.join(ArchList)
         if SortedArch in NewSectionDict:
             NewSectionDict[SortedArch] = \
-            NewSectionDict[SortedArch] + [Statement]
+                NewSectionDict[SortedArch] + [Statement]
         else:
             NewSectionDict[SortedArch] = [Statement]
 
@@ -282,7 +288,7 @@ def GenGuidProtocolPpi(Package, Content):
         SortedArch = ' '.join(ArchList)
         if SortedArch in NewSectionDict:
             NewSectionDict[SortedArch] = \
-            NewSectionDict[SortedArch] + [Statement]
+                NewSectionDict[SortedArch] + [Statement]
         else:
             NewSectionDict[SortedArch] = [Statement]
 
@@ -290,13 +296,15 @@ def GenGuidProtocolPpi(Package, Content):
 
     return Content
 
-## Transfer Package Object to Dec files
+# Transfer Package Object to Dec files
 #
 # Transfer all contents of a standard Package Object to a Dec file
 #
 # @param Package:  A Package
 #
-def PackageToDec(Package, DistHeader = None):
+
+
+def PackageToDec(Package, DistHeader=None):
     #
     # Init global information for the file
     #
@@ -333,9 +341,9 @@ def PackageToDec(Package, DistHeader = None):
     #
     # Generate header comment section of DEC file
     #
-    Content += GenHeaderCommentSection(PackageAbstract, \
-                                       PackageDescription, \
-                                       PackageCopyright, \
+    Content += GenHeaderCommentSection(PackageAbstract,
+                                       PackageDescription,
+                                       PackageCopyright,
                                        PackageLicense).replace('\r\n', '\n')
 
     #
@@ -343,9 +351,11 @@ def PackageToDec(Package, DistHeader = None):
     #
     for UserExtension in Package.GetUserExtensionList():
         if UserExtension.GetUserID() == TAB_BINARY_HEADER_USERID \
-        and UserExtension.GetIdentifier() == TAB_BINARY_HEADER_IDENTIFIER:
-            PackageBinaryAbstract = GetLocalValue(UserExtension.GetBinaryAbstract())
-            PackageBinaryDescription = GetLocalValue(UserExtension.GetBinaryDescription())
+                and UserExtension.GetIdentifier() == TAB_BINARY_HEADER_IDENTIFIER:
+            PackageBinaryAbstract = GetLocalValue(
+                UserExtension.GetBinaryAbstract())
+            PackageBinaryDescription = GetLocalValue(
+                UserExtension.GetBinaryDescription())
             PackageBinaryCopyright = ''
             PackageBinaryLicense = ''
             for (Lang, Copyright) in UserExtension.GetBinaryCopyright():
@@ -353,23 +363,23 @@ def PackageToDec(Package, DistHeader = None):
             for (Lang, License) in UserExtension.GetBinaryLicense():
                 PackageBinaryLicense = License
             if PackageBinaryAbstract and PackageBinaryDescription and \
-            PackageBinaryCopyright and PackageBinaryLicense:
+                    PackageBinaryCopyright and PackageBinaryLicense:
                 Content += GenHeaderCommentSection(PackageBinaryAbstract,
-                                           PackageBinaryDescription,
-                                           PackageBinaryCopyright,
-                                           PackageBinaryLicense,
-                                           True)
+                                                   PackageBinaryDescription,
+                                                   PackageBinaryCopyright,
+                                                   PackageBinaryLicense,
+                                                   True)
 
     #
     # Generate PACKAGE_UNI_FILE for the Package
     #
-    FileHeader = GenHeaderCommentSection(PackageAbstract, PackageDescription, PackageCopyright, PackageLicense, False, \
+    FileHeader = GenHeaderCommentSection(PackageAbstract, PackageDescription, PackageCopyright, PackageLicense, False,
                                          TAB_COMMENT_EDK1_SPLIT)
     GenPackageUNIEncodeFile(Package, FileHeader)
 
     #
     # for each section, maintain a dict, sorted arch will be its key,
-    #statement list will be its data
+    # statement list will be its data
     # { 'Arch1 Arch2 Arch3': [statement1, statement2],
     #   'Arch1' : [statement1, statement3]
     #  }
@@ -379,31 +389,36 @@ def PackageToDec(Package, DistHeader = None):
     # generate [Defines] section
     #
     LeftOffset = 31
-    NewSectionDict = {TAB_ARCH_COMMON : []}
+    NewSectionDict = {TAB_ARCH_COMMON: []}
     SpecialItemList = []
 
-    Statement = (u'%s ' % TAB_DEC_DEFINES_DEC_SPECIFICATION).ljust(LeftOffset) + u'= %s' % '0x00010017'
+    Statement = (u'%s ' % TAB_DEC_DEFINES_DEC_SPECIFICATION).ljust(
+        LeftOffset) + u'= %s' % '0x00010017'
     SpecialItemList.append(Statement)
 
     BaseName = Package.GetBaseName()
     if BaseName.startswith('.') or BaseName.startswith('-'):
         BaseName = '_' + BaseName
-    Statement = (u'%s ' % TAB_DEC_DEFINES_PACKAGE_NAME).ljust(LeftOffset) + u'= %s' % BaseName
+    Statement = (u'%s ' % TAB_DEC_DEFINES_PACKAGE_NAME).ljust(
+        LeftOffset) + u'= %s' % BaseName
     SpecialItemList.append(Statement)
 
-    Statement = (u'%s ' % TAB_DEC_DEFINES_PACKAGE_VERSION).ljust(LeftOffset) + u'= %s' % Package.GetVersion()
+    Statement = (u'%s ' % TAB_DEC_DEFINES_PACKAGE_VERSION).ljust(
+        LeftOffset) + u'= %s' % Package.GetVersion()
     SpecialItemList.append(Statement)
 
-    Statement = (u'%s ' % TAB_DEC_DEFINES_PACKAGE_GUID).ljust(LeftOffset) + u'= %s' % Package.GetGuid()
+    Statement = (u'%s ' % TAB_DEC_DEFINES_PACKAGE_GUID).ljust(
+        LeftOffset) + u'= %s' % Package.GetGuid()
     SpecialItemList.append(Statement)
 
     if Package.UNIFlag:
-        Statement = (u'%s ' % TAB_DEC_DEFINES_PKG_UNI_FILE).ljust(LeftOffset) + u'= %s' % Package.GetBaseName() + '.uni'
+        Statement = (u'%s ' % TAB_DEC_DEFINES_PKG_UNI_FILE).ljust(
+            LeftOffset) + u'= %s' % Package.GetBaseName() + '.uni'
         SpecialItemList.append(Statement)
 
     for SortedArch in NewSectionDict:
         NewSectionDict[SortedArch] = \
-        NewSectionDict[SortedArch] + SpecialItemList
+            NewSectionDict[SortedArch] + SpecialItemList
     Content += GenSection('Defines', NewSectionDict)
 
     #
@@ -418,7 +433,7 @@ def PackageToDec(Package, DistHeader = None):
             SortedArch = ' '.join(ArchList)
             if SortedArch in NewSectionDict:
                 NewSectionDict[SortedArch] = \
-                NewSectionDict[SortedArch] + [ConvertPath(Statement)]
+                    NewSectionDict[SortedArch] + [ConvertPath(Statement)]
             else:
                 NewSectionDict[SortedArch] = [ConvertPath(Statement)]
 
@@ -452,12 +467,12 @@ def PackageToDec(Package, DistHeader = None):
         #
         if LibraryClass.GetSupModuleList():
             Statement += \
-            GenDecTailComment(LibraryClass.GetSupModuleList())
+                GenDecTailComment(LibraryClass.GetSupModuleList())
         ArchList = sorted(LibraryClass.GetSupArchList())
         SortedArch = ' '.join(ArchList)
         if SortedArch in NewSectionDict:
             NewSectionDict[SortedArch] = \
-            NewSectionDict[SortedArch] + [Statement]
+                NewSectionDict[SortedArch] + [Statement]
         else:
             NewSectionDict[SortedArch] = [Statement]
 
@@ -476,7 +491,7 @@ def PackageToDec(Package, DistHeader = None):
     NewSectionDict = {}
     for UserExtension in Package.GetUserExtensionList():
         if UserExtension.GetUserID() == TAB_BINARY_HEADER_USERID and \
-            UserExtension.GetIdentifier() == TAB_BINARY_HEADER_IDENTIFIER:
+                UserExtension.GetIdentifier() == TAB_BINARY_HEADER_IDENTIFIER:
             continue
 
         # Generate Private Section first
@@ -513,15 +528,18 @@ def PackageToDec(Package, DistHeader = None):
 
     SaveFileOnChange(ContainerFile, Content, False)
     if DistHeader.ReadOnly:
-        os.chmod(ContainerFile, stat.S_IRUSR|stat.S_IRGRP|stat.S_IROTH)
+        os.chmod(ContainerFile, stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH)
     else:
-        os.chmod(ContainerFile, stat.S_IRUSR|stat.S_IRGRP|stat.S_IROTH|stat.S_IWUSR|stat.S_IWGRP|stat.S_IWOTH)
+        os.chmod(ContainerFile, stat.S_IRUSR | stat.S_IRGRP |
+                 stat.S_IROTH | stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH)
     return ContainerFile
 
-## GenPackageUNIEncodeFile
+# GenPackageUNIEncodeFile
 # GenPackageUNIEncodeFile, default is a UCS-2LE encode file
 #
-def GenPackageUNIEncodeFile(PackageObject, UniFileHeader = '', Encoding=TAB_ENCODING_UTF16LE):
+
+
+def GenPackageUNIEncodeFile(PackageObject, UniFileHeader='', Encoding=TAB_ENCODING_UTF16LE):
     GenUNIFlag = False
     OnlyLANGUAGE_EN_X = True
     BinaryAbstract = []
@@ -538,7 +556,7 @@ def GenPackageUNIEncodeFile(PackageObject, UniFileHeader = '', Encoding=TAB_ENCO
 
     for UserExtension in PackageObject.GetUserExtensionList():
         if UserExtension.GetUserID() == TAB_BINARY_HEADER_USERID \
-        and UserExtension.GetIdentifier() == TAB_BINARY_HEADER_IDENTIFIER:
+                and UserExtension.GetIdentifier() == TAB_BINARY_HEADER_IDENTIFIER:
             for (Key, Value) in UserExtension.GetBinaryAbstract():
                 if Key == TAB_LANGUAGE_EN_X:
                     GenUNIFlag = True
@@ -577,25 +595,30 @@ def GenPackageUNIEncodeFile(PackageObject, UniFileHeader = '', Encoding=TAB_ENCO
     if not os.path.exists(os.path.dirname(PackageObject.GetFullPath())):
         os.makedirs(os.path.dirname(PackageObject.GetFullPath()))
 
-    ContainerFile = GetUniFileName(os.path.dirname(PackageObject.GetFullPath()), PackageObject.GetBaseName())
+    ContainerFile = GetUniFileName(os.path.dirname(
+        PackageObject.GetFullPath()), PackageObject.GetBaseName())
 
     Content = UniFileHeader + '\r\n'
     Content += '\r\n'
 
-    Content += FormatUniEntry('#string ' + TAB_DEC_PACKAGE_ABSTRACT, PackageObject.GetAbstract(), ContainerFile) + '\r\n'
+    Content += FormatUniEntry('#string ' + TAB_DEC_PACKAGE_ABSTRACT,
+                              PackageObject.GetAbstract(), ContainerFile) + '\r\n'
 
     Content += FormatUniEntry('#string ' + TAB_DEC_PACKAGE_DESCRIPTION, PackageObject.GetDescription(), ContainerFile) \
-    + '\r\n'
+        + '\r\n'
 
-    Content += FormatUniEntry('#string ' + TAB_DEC_BINARY_ABSTRACT, BinaryAbstract, ContainerFile) + '\r\n'
+    Content += FormatUniEntry('#string ' + TAB_DEC_BINARY_ABSTRACT,
+                              BinaryAbstract, ContainerFile) + '\r\n'
 
-    Content += FormatUniEntry('#string ' + TAB_DEC_BINARY_DESCRIPTION, BinaryDescription, ContainerFile) + '\r\n'
+    Content += FormatUniEntry('#string ' + TAB_DEC_BINARY_DESCRIPTION,
+                              BinaryDescription, ContainerFile) + '\r\n'
 
     PromptGenList = []
     HelpTextGenList = []
     for Pcd in PackageObject.GetPcdList():
         # Generate Prompt for each Pcd
-        PcdPromptStrName = '#string ' + 'STR_' + Pcd.GetTokenSpaceGuidCName() + '_' + Pcd.GetCName() + '_PROMPT '
+        PcdPromptStrName = '#string ' + 'STR_' + Pcd.GetTokenSpaceGuidCName() + '_' + \
+            Pcd.GetCName() + '_PROMPT '
         TokenValueList = []
         for TxtObj in Pcd.GetPromptList():
             Lang = TxtObj.GetLang()
@@ -606,12 +629,14 @@ def GenPackageUNIEncodeFile(PackageObject, UniFileHeader = '', Encoding=TAB_ENCO
             if (PcdPromptStrName, Lang) not in PromptGenList:
                 TokenValueList.append((Lang, PromptStr))
                 PromptGenList.append((PcdPromptStrName, Lang))
-        PromptString = FormatUniEntry(PcdPromptStrName, TokenValueList, ContainerFile) + '\r\n'
+        PromptString = FormatUniEntry(
+            PcdPromptStrName, TokenValueList, ContainerFile) + '\r\n'
         if PromptString not in Content:
             Content += PromptString
 
         # Generate Help String for each Pcd
-        PcdHelpStrName = '#string ' + 'STR_' + Pcd.GetTokenSpaceGuidCName() + '_' + Pcd.GetCName() + '_HELP '
+        PcdHelpStrName = '#string ' + 'STR_' + Pcd.GetTokenSpaceGuidCName() + '_' + \
+            Pcd.GetCName() + '_HELP '
         TokenValueList = []
         for TxtObj in Pcd.GetHelpTextList():
             Lang = TxtObj.GetLang()
@@ -622,7 +647,8 @@ def GenPackageUNIEncodeFile(PackageObject, UniFileHeader = '', Encoding=TAB_ENCO
             if (PcdHelpStrName, Lang) not in HelpTextGenList:
                 TokenValueList.append((Lang, HelpStr))
                 HelpTextGenList.append((PcdHelpStrName, Lang))
-        HelpTextString = FormatUniEntry(PcdHelpStrName, TokenValueList, ContainerFile) + '\r\n'
+        HelpTextString = FormatUniEntry(
+            PcdHelpStrName, TokenValueList, ContainerFile) + '\r\n'
         if HelpTextString not in Content:
             Content += HelpTextString
 
@@ -633,7 +659,8 @@ def GenPackageUNIEncodeFile(PackageObject, UniFileHeader = '', Encoding=TAB_ENCO
                 PcdErrStrName = '#string ' + TAB_STR_TOKENCNAME + TAB_UNDERLINE_SPLIT + Pcd.GetTokenSpaceGuidCName() \
                     + TAB_UNDERLINE_SPLIT + TAB_STR_TOKENERR \
                     + TAB_UNDERLINE_SPLIT + ErrorNo[2:]
-                PcdErrString = FormatUniEntry(PcdErrStrName, PcdError.GetErrorMessageList(), ContainerFile) + '\r\n'
+                PcdErrString = FormatUniEntry(
+                    PcdErrStrName, PcdError.GetErrorMessageList(), ContainerFile) + '\r\n'
                 if PcdErrString not in Content:
                     Content += PcdErrString
 
@@ -647,18 +674,21 @@ def GenPackageUNIEncodeFile(PackageObject, UniFileHeader = '', Encoding=TAB_ENCO
 
     return ContainerFile
 
-## GenPcdErrComment
+# GenPcdErrComment
 #
 #  @param PcdErrObject:  PcdErrorObject
 #
 #  @retval CommentStr:   Generated comment lines, with prefix "#"
 #
-def GenPcdErrComment (PcdErrObject):
+
+
+def GenPcdErrComment(PcdErrObject):
     CommentStr = ''
     ErrorCode = PcdErrObject.GetErrorNumber()
     ValidValueRange = PcdErrObject.GetValidValueRange()
     if ValidValueRange:
-        CommentStr = TAB_COMMENT_SPLIT + TAB_SPACE_SPLIT + TAB_PCD_VALIDRANGE + TAB_SPACE_SPLIT
+        CommentStr = TAB_COMMENT_SPLIT + TAB_SPACE_SPLIT + \
+            TAB_PCD_VALIDRANGE + TAB_SPACE_SPLIT
         if ErrorCode:
             CommentStr += ErrorCode + TAB_SPACE_SPLIT + TAB_VALUE_SPLIT + TAB_SPACE_SPLIT
         CommentStr += ValidValueRange + END_OF_LINE
@@ -666,18 +696,19 @@ def GenPcdErrComment (PcdErrObject):
     ValidValue = PcdErrObject.GetValidValue()
     if ValidValue:
         ValidValueList = \
-        [Value for Value in ValidValue.split(TAB_SPACE_SPLIT) if Value]
-        CommentStr = TAB_COMMENT_SPLIT + TAB_SPACE_SPLIT + TAB_PCD_VALIDLIST + TAB_SPACE_SPLIT
+            [Value for Value in ValidValue.split(TAB_SPACE_SPLIT) if Value]
+        CommentStr = TAB_COMMENT_SPLIT + TAB_SPACE_SPLIT + \
+            TAB_PCD_VALIDLIST + TAB_SPACE_SPLIT
         if ErrorCode:
             CommentStr += ErrorCode + TAB_SPACE_SPLIT + TAB_VALUE_SPLIT + TAB_SPACE_SPLIT
         CommentStr += TAB_COMMA_SPLIT.join(ValidValueList) + END_OF_LINE
 
     Expression = PcdErrObject.GetExpression()
     if Expression:
-        CommentStr = TAB_COMMENT_SPLIT + TAB_SPACE_SPLIT + TAB_PCD_EXPRESSION + TAB_SPACE_SPLIT
+        CommentStr = TAB_COMMENT_SPLIT + TAB_SPACE_SPLIT + \
+            TAB_PCD_EXPRESSION + TAB_SPACE_SPLIT
         if ErrorCode:
             CommentStr += ErrorCode + TAB_SPACE_SPLIT + TAB_VALUE_SPLIT + TAB_SPACE_SPLIT
         CommentStr += Expression + END_OF_LINE
 
     return CommentStr
-
diff --git a/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py b/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
index dd6184e0471e..fec699e2d22d 100644
--- a/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
+++ b/BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py
@@ -1,4 +1,4 @@
-## @file GenInfFile.py
+# @file GenInfFile.py
 #
 # This file contained the logical of transfer package object to INF files.
 #
@@ -38,7 +38,7 @@ from Library.UniClassObject import FormatUniEntry
 from Library.StringUtils import GetUniFileName
 
 
-## Transfer Module Object to Inf files
+# Transfer Module Object to Inf files
 #
 # Transfer all contents of a standard Module Object to an Inf file
 # @param ModuleObject: A Module Object
@@ -112,9 +112,11 @@ def ModuleToInf(ModuleObject, PackageObject=None, DistHeader=None):
     #
     for UserExtension in ModuleObject.GetUserExtensionList():
         if UserExtension.GetUserID() == DT.TAB_BINARY_HEADER_USERID \
-        and UserExtension.GetIdentifier() == DT.TAB_BINARY_HEADER_IDENTIFIER:
-            ModuleBinaryAbstract = GetLocalValue(UserExtension.GetBinaryAbstract())
-            ModuleBinaryDescription = GetLocalValue(UserExtension.GetBinaryDescription())
+                and UserExtension.GetIdentifier() == DT.TAB_BINARY_HEADER_IDENTIFIER:
+            ModuleBinaryAbstract = GetLocalValue(
+                UserExtension.GetBinaryAbstract())
+            ModuleBinaryDescription = GetLocalValue(
+                UserExtension.GetBinaryDescription())
             ModuleBinaryCopyright = ''
             ModuleBinaryLicense = ''
             for (Lang, Copyright) in UserExtension.GetBinaryCopyright():
@@ -122,17 +124,17 @@ def ModuleToInf(ModuleObject, PackageObject=None, DistHeader=None):
             for (Lang, License) in UserExtension.GetBinaryLicense():
                 ModuleBinaryLicense = License
             if ModuleBinaryAbstract and ModuleBinaryDescription and \
-            ModuleBinaryCopyright and ModuleBinaryLicense:
+                    ModuleBinaryCopyright and ModuleBinaryLicense:
                 Content += GenHeaderCommentSection(ModuleBinaryAbstract,
-                                           ModuleBinaryDescription,
-                                           ModuleBinaryCopyright,
-                                           ModuleBinaryLicense,
-                                           True)
+                                                   ModuleBinaryDescription,
+                                                   ModuleBinaryCopyright,
+                                                   ModuleBinaryLicense,
+                                                   True)
 
     #
     # Generate MODULE_UNI_FILE for module
     #
-    FileHeader = GenHeaderCommentSection(ModuleAbstract, ModuleDescription, ModuleCopyright, ModuleLicense, False, \
+    FileHeader = GenHeaderCommentSection(ModuleAbstract, ModuleDescription, ModuleCopyright, ModuleLicense, False,
                                          DT.TAB_COMMENT_EDK1_SPLIT)
     ModuleUniFile = GenModuleUNIEncodeFile(ModuleObject, FileHeader)
     if ModuleUniFile:
@@ -172,19 +174,25 @@ def ModuleToInf(ModuleObject, PackageObject=None, DistHeader=None):
     #
     # generate [Event], [BootMode], [Hob] section
     #
-    Content += GenSpecialSections(ModuleObject.GetEventList(), 'Event', __UserExtensionsContent)
-    Content += GenSpecialSections(ModuleObject.GetBootModeList(), 'BootMode', __UserExtensionsContent)
-    Content += GenSpecialSections(ModuleObject.GetHobList(), 'Hob', __UserExtensionsContent)
+    Content += GenSpecialSections(ModuleObject.GetEventList(),
+                                  'Event', __UserExtensionsContent)
+    Content += GenSpecialSections(ModuleObject.GetBootModeList(),
+                                  'BootMode', __UserExtensionsContent)
+    Content += GenSpecialSections(ModuleObject.GetHobList(),
+                                  'Hob', __UserExtensionsContent)
     SaveFileOnChange(ContainerFile, Content, False)
     if DistHeader.ReadOnly:
-        os.chmod(ContainerFile, stat.S_IRUSR|stat.S_IRGRP|stat.S_IROTH)
+        os.chmod(ContainerFile, stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH)
     else:
-        os.chmod(ContainerFile, stat.S_IRUSR|stat.S_IRGRP|stat.S_IROTH|stat.S_IWUSR|stat.S_IWGRP|stat.S_IWOTH)
+        os.chmod(ContainerFile, stat.S_IRUSR | stat.S_IRGRP |
+                 stat.S_IROTH | stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH)
     return ContainerFile
 
-## GenModuleUNIEncodeFile
+# GenModuleUNIEncodeFile
 # GenModuleUNIEncodeFile, default is a UCS-2LE encode file
 #
+
+
 def GenModuleUNIEncodeFile(ModuleObject, UniFileHeader='', Encoding=DT.TAB_ENCODING_UTF16LE):
     GenUNIFlag = False
     OnlyLANGUAGE_EN_X = True
@@ -202,7 +210,7 @@ def GenModuleUNIEncodeFile(ModuleObject, UniFileHeader='', Encoding=DT.TAB_ENCOD
 
     for UserExtension in ModuleObject.GetUserExtensionList():
         if UserExtension.GetUserID() == DT.TAB_BINARY_HEADER_USERID \
-        and UserExtension.GetIdentifier() == DT.TAB_BINARY_HEADER_IDENTIFIER:
+                and UserExtension.GetIdentifier() == DT.TAB_BINARY_HEADER_IDENTIFIER:
             for (Key, Value) in UserExtension.GetBinaryAbstract():
                 if Key == DT.TAB_LANGUAGE_EN_X:
                     GenUNIFlag = True
@@ -216,14 +224,14 @@ def GenModuleUNIEncodeFile(ModuleObject, UniFileHeader='', Encoding=DT.TAB_ENCOD
                     OnlyLANGUAGE_EN_X = False
                 BinaryDescription.append((Key, Value))
 
-
     if not GenUNIFlag:
         return
     elif OnlyLANGUAGE_EN_X:
         return
     else:
         ModuleObject.UNIFlag = True
-    ContainerFile = GetUniFileName(os.path.dirname(ModuleObject.GetFullPath()), ModuleObject.GetBaseName())
+    ContainerFile = GetUniFileName(os.path.dirname(
+        ModuleObject.GetFullPath()), ModuleObject.GetBaseName())
 
     if not os.path.exists(os.path.dirname(ModuleObject.GetFullPath())):
         os.makedirs(os.path.dirname(ModuleObject.GetFullPath()))
@@ -231,16 +239,18 @@ def GenModuleUNIEncodeFile(ModuleObject, UniFileHeader='', Encoding=DT.TAB_ENCOD
     Content = UniFileHeader + '\r\n'
     Content += '\r\n'
 
-    Content += FormatUniEntry('#string ' + DT.TAB_INF_ABSTRACT, ModuleObject.GetAbstract(), ContainerFile) + '\r\n'
+    Content += FormatUniEntry('#string ' + DT.TAB_INF_ABSTRACT,
+                              ModuleObject.GetAbstract(), ContainerFile) + '\r\n'
 
     Content += FormatUniEntry('#string ' + DT.TAB_INF_DESCRIPTION, ModuleObject.GetDescription(), ContainerFile) \
-            + '\r\n'
+        + '\r\n'
 
-    BinaryAbstractString = FormatUniEntry('#string ' + DT.TAB_INF_BINARY_ABSTRACT, BinaryAbstract, ContainerFile)
+    BinaryAbstractString = FormatUniEntry(
+        '#string ' + DT.TAB_INF_BINARY_ABSTRACT, BinaryAbstract, ContainerFile)
     if BinaryAbstractString:
         Content += BinaryAbstractString + '\r\n'
 
-    BinaryDescriptionString = FormatUniEntry('#string ' + DT.TAB_INF_BINARY_DESCRIPTION, BinaryDescription, \
+    BinaryDescriptionString = FormatUniEntry('#string ' + DT.TAB_INF_BINARY_DESCRIPTION, BinaryDescription,
                                              ContainerFile)
     if BinaryDescriptionString:
         Content += BinaryDescriptionString + '\r\n'
@@ -255,6 +265,8 @@ def GenModuleUNIEncodeFile(ModuleObject, UniFileHeader='', Encoding=DT.TAB_ENCOD
         ModuleObject.FileList.append((ContainerFile, Md5Sum))
 
     return ContainerFile
+
+
 def GenDefines(ModuleObject):
     #
     # generate [Defines] section
@@ -270,7 +282,7 @@ def GenDefines(ModuleObject):
         for Statement in DefinesDict:
             if len(Statement.split(DT.TAB_EQUAL_SPLIT)) > 1:
                 Statement = (u'%s ' % Statement.split(DT.TAB_EQUAL_SPLIT, 1)[0]).ljust(LeftOffset) \
-                             + u'= %s' % Statement.split(DT.TAB_EQUAL_SPLIT, 1)[1].lstrip()
+                    + u'= %s' % Statement.split(DT.TAB_EQUAL_SPLIT, 1)[1].lstrip()
             SortedArch = DT.TAB_ARCH_COMMON
             if Statement.strip().startswith(DT.TAB_INF_DEFINES_CUSTOM_MAKEFILE):
                 pos = Statement.find(DT.TAB_VALUE_SPLIT)
@@ -285,51 +297,56 @@ def GenDefines(ModuleObject):
     SpecialStatementList = []
 
     # TAB_INF_DEFINES_INF_VERSION
-    Statement = (u'%s ' % DT.TAB_INF_DEFINES_INF_VERSION).ljust(LeftOffset) + u'= %s' % '0x00010017'
+    Statement = (u'%s ' % DT.TAB_INF_DEFINES_INF_VERSION).ljust(
+        LeftOffset) + u'= %s' % '0x00010017'
     SpecialStatementList.append(Statement)
 
     # BaseName
     BaseName = ModuleObject.GetBaseName()
     if BaseName.startswith('.') or BaseName.startswith('-'):
         BaseName = '_' + BaseName
-    Statement = (u'%s ' % DT.TAB_INF_DEFINES_BASE_NAME).ljust(LeftOffset) + u'= %s' % BaseName
+    Statement = (u'%s ' % DT.TAB_INF_DEFINES_BASE_NAME).ljust(
+        LeftOffset) + u'= %s' % BaseName
     SpecialStatementList.append(Statement)
 
     # TAB_INF_DEFINES_FILE_GUID
-    Statement = (u'%s ' % DT.TAB_INF_DEFINES_FILE_GUID).ljust(LeftOffset) + u'= %s' % ModuleObject.GetGuid()
+    Statement = (u'%s ' % DT.TAB_INF_DEFINES_FILE_GUID).ljust(
+        LeftOffset) + u'= %s' % ModuleObject.GetGuid()
     SpecialStatementList.append(Statement)
 
     # TAB_INF_DEFINES_VERSION_STRING
-    Statement = (u'%s ' % DT.TAB_INF_DEFINES_VERSION_STRING).ljust(LeftOffset) + u'= %s' % ModuleObject.GetVersion()
+    Statement = (u'%s ' % DT.TAB_INF_DEFINES_VERSION_STRING).ljust(
+        LeftOffset) + u'= %s' % ModuleObject.GetVersion()
     SpecialStatementList.append(Statement)
 
     # TAB_INF_DEFINES_VERSION_STRING
     if ModuleObject.UNIFlag:
         Statement = (u'%s ' % DT.TAB_INF_DEFINES_MODULE_UNI_FILE).ljust(LeftOffset) + \
-                    u'= %s' % ModuleObject.GetModuleUniFile()
+            u'= %s' % ModuleObject.GetModuleUniFile()
         SpecialStatementList.append(Statement)
 
     # TAB_INF_DEFINES_MODULE_TYPE
     if ModuleObject.GetModuleType():
-        Statement = (u'%s ' % DT.TAB_INF_DEFINES_MODULE_TYPE).ljust(LeftOffset) + u'= %s' % ModuleObject.GetModuleType()
+        Statement = (u'%s ' % DT.TAB_INF_DEFINES_MODULE_TYPE).ljust(
+            LeftOffset) + u'= %s' % ModuleObject.GetModuleType()
         SpecialStatementList.append(Statement)
 
     # TAB_INF_DEFINES_PCD_IS_DRIVER
     if ModuleObject.GetPcdIsDriver():
         Statement = (u'%s ' % DT.TAB_INF_DEFINES_PCD_IS_DRIVER).ljust(LeftOffset) + \
-                    u'= %s' % ModuleObject.GetPcdIsDriver()
+            u'= %s' % ModuleObject.GetPcdIsDriver()
         SpecialStatementList.append(Statement)
 
     # TAB_INF_DEFINES_UEFI_SPECIFICATION_VERSION
     if ModuleObject.GetUefiSpecificationVersion():
         Statement = (u'%s ' % DT.TAB_INF_DEFINES_UEFI_SPECIFICATION_VERSION).ljust(LeftOffset) + \
-                    u'= %s' % ModuleObject.GetUefiSpecificationVersion()
+            u'= %s' % ModuleObject.GetUefiSpecificationVersion()
         SpecialStatementList.append(Statement)
 
     # TAB_INF_DEFINES_PI_SPECIFICATION_VERSION
     if ModuleObject.GetPiSpecificationVersion():
         Statement = (u'%s ' % DT.TAB_INF_DEFINES_PI_SPECIFICATION_VERSION).ljust(LeftOffset) + \
-                    u'= %s' % ModuleObject.GetPiSpecificationVersion()
+            u'= %s' % ModuleObject.GetPiSpecificationVersion()
         SpecialStatementList.append(Statement)
 
     # LibraryClass
@@ -337,9 +354,11 @@ def GenDefines(ModuleObject):
         if LibraryClass.GetUsage() == DT.USAGE_ITEM_PRODUCES or \
            LibraryClass.GetUsage() == DT.USAGE_ITEM_SOMETIMES_PRODUCES:
             Statement = (u'%s ' % DT.TAB_INF_DEFINES_LIBRARY_CLASS).ljust(LeftOffset) + \
-                        u'= %s' % LibraryClass.GetLibraryClass()
+                u'= %s' % LibraryClass.GetLibraryClass()
             if LibraryClass.GetSupModuleList():
-                Statement += '|' + DT.TAB_SPACE_SPLIT.join(l for l in LibraryClass.GetSupModuleList())
+                Statement += '|' + \
+                    DT.TAB_SPACE_SPLIT.join(
+                        l for l in LibraryClass.GetSupModuleList())
             SpecialStatementList.append(Statement)
 
     # Spec Item
@@ -359,7 +378,8 @@ def GenDefines(ModuleObject):
         Destructor = Extern.GetDestructor()
         HelpStringList = Extern.GetHelpTextList()
         FFE = Extern.GetFeatureFlag()
-        ExternList.append([ArchList, EntryPoint, UnloadImage, Constructor, Destructor, FFE, HelpStringList])
+        ExternList.append([ArchList, EntryPoint, UnloadImage,
+                          Constructor, Destructor, FFE, HelpStringList])
     #
     # Add VALID_ARCHITECTURES information
     #
@@ -368,17 +388,22 @@ def GenDefines(ModuleObject):
         ValidArchStatement = '\n' + '# ' + '\n'
         ValidArchStatement += '# The following information is for reference only and not required by the build tools.\n'
         ValidArchStatement += '# ' + '\n'
-        ValidArchStatement += '# VALID_ARCHITECTURES = %s' % (' '.join(ModuleObject.SupArchList)) + '\n'
+        ValidArchStatement += '# VALID_ARCHITECTURES = %s' % (
+            ' '.join(ModuleObject.SupArchList)) + '\n'
         ValidArchStatement += '# '
     if DT.TAB_ARCH_COMMON not in NewSectionDict:
         NewSectionDict[DT.TAB_ARCH_COMMON] = []
-    NewSectionDict[DT.TAB_ARCH_COMMON] = NewSectionDict[DT.TAB_ARCH_COMMON] + SpecialStatementList
-    GenMetaFileMisc.AddExternToDefineSec(NewSectionDict, DT.TAB_ARCH_COMMON, ExternList)
+    NewSectionDict[DT.TAB_ARCH_COMMON] = NewSectionDict[DT.TAB_ARCH_COMMON] + \
+        SpecialStatementList
+    GenMetaFileMisc.AddExternToDefineSec(
+        NewSectionDict, DT.TAB_ARCH_COMMON, ExternList)
     if ValidArchStatement is not None:
-        NewSectionDict[DT.TAB_ARCH_COMMON] = NewSectionDict[DT.TAB_ARCH_COMMON] + [ValidArchStatement]
+        NewSectionDict[DT.TAB_ARCH_COMMON] = NewSectionDict[DT.TAB_ARCH_COMMON] + \
+            [ValidArchStatement]
     Content += GenSection('Defines', NewSectionDict)
     return Content
 
+
 def GenLibraryClasses(ModuleObject):
     #
     # generate [LibraryClasses] section
@@ -451,6 +476,7 @@ def GenLibraryClasses(ModuleObject):
 
     return Content
 
+
 def GenPackages(ModuleObject):
     Content = ''
     #
@@ -496,6 +522,7 @@ def GenPackages(ModuleObject):
     Content += GenSection('Packages', NewSectionDict)
     return Content
 
+
 def GenSources(ModuleObject):
     #
     # generate [Sources] section
@@ -508,7 +535,8 @@ def GenSources(ModuleObject):
         FeatureFlag = Source.GetFeatureFlag()
         SupArchList = sorted(Source.GetSupArchList())
         SortedArch = ' '.join(SupArchList)
-        Statement = GenSourceStatement(ConvertPath(SourceFile), Family, FeatureFlag)
+        Statement = GenSourceStatement(
+            ConvertPath(SourceFile), Family, FeatureFlag)
         if SortedArch in NewSectionDict:
             NewSectionDict[SortedArch] = NewSectionDict[SortedArch] + [Statement]
         else:
@@ -517,6 +545,7 @@ def GenSources(ModuleObject):
 
     return Content
 
+
 def GenDepex(ModuleObject):
     #
     # generate [Depex] section
@@ -540,7 +569,8 @@ def GenDepex(ModuleObject):
         else:
             for ModuleType in SupModList:
                 for Arch in SupArchList:
-                    KeyList.append(ConvertArchForInstall(Arch) + '.' + ModuleType)
+                    KeyList.append(ConvertArchForInstall(
+                        Arch) + '.' + ModuleType)
         for Key in KeyList:
             if Key in NewSectionDict:
                 NewSectionDict[Key] = NewSectionDict[Key] + [Statement]
@@ -549,15 +579,17 @@ def GenDepex(ModuleObject):
     Content += GenSection('Depex', NewSectionDict, False)
 
     return Content
-## GenUserExtensions
+# GenUserExtensions
 #
 # GenUserExtensions
 #
+
+
 def GenUserExtensions(ModuleObject):
     NewSectionDict = {}
     for UserExtension in ModuleObject.GetUserExtensionList():
         if UserExtension.GetUserID() == DT.TAB_BINARY_HEADER_USERID and \
-            UserExtension.GetIdentifier() == DT.TAB_BINARY_HEADER_IDENTIFIER:
+                UserExtension.GetIdentifier() == DT.TAB_BINARY_HEADER_IDENTIFIER:
             continue
         if UserExtension.GetIdentifier() == 'Depex':
             continue
@@ -601,6 +633,8 @@ def GenUserExtensions(ModuleObject):
 #
 #  @retval Statement: The generated statement for source
 #
+
+
 def GenSourceStatement(SourceFile, Family, FeatureFlag, TagName=None,
                        ToolCode=None, HelpStr=None):
     Statement = ''
@@ -632,6 +666,8 @@ def GenSourceStatement(SourceFile, Family, FeatureFlag, TagName=None,
 #  @param Value:     (Target, Family, TagName, Comment)
 #
 #
+
+
 def GenBinaryStatement(Key, Value, SubTypeGuidValue=None):
     (FileName, FileType, FFE, SortedArch) = Key
     if SortedArch:
@@ -666,11 +702,13 @@ def GenBinaryStatement(Key, Value, SubTypeGuidValue=None):
         elif Target:
             Statement += '|' + Target
     return Statement
-## GenGuidSections
+# GenGuidSections
 #
 #  @param GuidObjList: List of GuidObject
 #  @retVal Content: The generated section contents
 #
+
+
 def GenGuidSections(GuidObjList):
     #
     # generate [Guids] section
@@ -729,11 +767,13 @@ def GenGuidSections(GuidObjList):
 
     return Content
 
-## GenProtocolPPiSections
+# GenProtocolPPiSections
 #
 #  @param ObjList: List of ProtocolObject or Ppi Object
 #  @retVal Content: The generated section contents
 #
+
+
 def GenProtocolPPiSections(ObjList, IsProtocol):
     Content = ''
     Dict = Sdict()
@@ -791,9 +831,11 @@ def GenProtocolPPiSections(ObjList, IsProtocol):
 
     return Content
 
-## GenPcdSections
+# GenPcdSections
 #
 #
+
+
 def GenPcdSections(ModuleObject):
     Content = ''
     if not GlobalData.gIS_BINARY_INF:
@@ -868,9 +910,11 @@ def GenPcdSections(ModuleObject):
 
     return Content
 
-## GenPcdSections
+# GenPcdSections
 #
 #
+
+
 def GenAsBuiltPacthPcdSections(ModuleObject):
     PatchPcdDict = {}
     for BinaryFile in ModuleObject.GetBinaryFileList():
@@ -894,10 +938,11 @@ def GenAsBuiltPacthPcdSections(ModuleObject):
             if TokenSpaceName == '' or PcdCName == '':
                 Logger.Error("Upt",
                              ToolError.RESOURCE_NOT_AVAILABLE,
-                             ST.ERR_INSTALL_FILE_DEC_FILE_ERROR % (TokenSpaceGuidValue, Token),
+                             ST.ERR_INSTALL_FILE_DEC_FILE_ERROR % (
+                                 TokenSpaceGuidValue, Token),
                              File=ModuleObject.GetFullPath())
             Statement = HelpString + TokenSpaceName + '.' + PcdCName + ' | ' + PcdValue + ' | ' + \
-                         PcdOffset + DT.TAB_SPACE_SPLIT
+                PcdOffset + DT.TAB_SPACE_SPLIT
             #
             # Use binary file's Arch to be Pcd's Arch
             #
@@ -919,9 +964,11 @@ def GenAsBuiltPacthPcdSections(ModuleObject):
                     else:
                         PatchPcdDict[Arch] = [Statement]
     return GenSection(DT.TAB_INF_PATCH_PCD, PatchPcdDict)
-## GenPcdSections
+# GenPcdSections
 #
 #
+
+
 def GenAsBuiltPcdExSections(ModuleObject):
     PcdExDict = {}
     for BinaryFile in ModuleObject.GetBinaryFileList():
@@ -942,10 +989,12 @@ def GenAsBuiltPcdExSections(ModuleObject):
             if TokenSpaceName == '' or PcdCName == '':
                 Logger.Error("Upt",
                              ToolError.RESOURCE_NOT_AVAILABLE,
-                             ST.ERR_INSTALL_FILE_DEC_FILE_ERROR % (TokenSpaceGuidValue, Token),
+                             ST.ERR_INSTALL_FILE_DEC_FILE_ERROR % (
+                                 TokenSpaceGuidValue, Token),
                              File=ModuleObject.GetFullPath())
 
-            Statement = HelpString + TokenSpaceName + DT.TAB_SPLIT + PcdCName + DT.TAB_SPACE_SPLIT
+            Statement = HelpString + TokenSpaceName + \
+                DT.TAB_SPLIT + PcdCName + DT.TAB_SPACE_SPLIT
 
             #
             # Use binary file's Arch to be Pcd's Arch
@@ -969,9 +1018,11 @@ def GenAsBuiltPcdExSections(ModuleObject):
                         PcdExDict[Arch] = [Statement]
     return GenSection('PcdEx', PcdExDict)
 
-## GenSpecialSections
+# GenSpecialSections
 #  generate special sections for Event/BootMode/Hob
 #
+
+
 def GenSpecialSections(ObjectList, SectionName, UserExtensionsContent=''):
     #
     # generate section
@@ -1017,13 +1068,15 @@ def GenSpecialSections(ObjectList, SectionName, UserExtensionsContent=''):
         SupArch = sorted(Obj.GetSupArchList())
         SortedArch = ' '.join(SupArch)
         if SortedArch in NewSectionDict:
-            NewSectionDict[SortedArch] = NewSectionDict[SortedArch] + [NewStateMent]
+            NewSectionDict[SortedArch] = NewSectionDict[SortedArch] + \
+                [NewStateMent]
         else:
             NewSectionDict[SortedArch] = [NewStateMent]
     SectionContent = GenSection(SectionName, NewSectionDict)
     SectionContent = SectionContent.strip()
     if SectionContent:
-        Content = '# ' + ('\n' + '# ').join(GetSplitValueList(SectionContent, '\n'))
+        Content = '# ' + \
+            ('\n' + '# ').join(GetSplitValueList(SectionContent, '\n'))
         Content = Content.lstrip()
     #
     # add a return to differentiate it between other possible sections
@@ -1031,9 +1084,11 @@ def GenSpecialSections(ObjectList, SectionName, UserExtensionsContent=''):
     if Content:
         Content += '\n'
     return Content
-## GenBuildOptions
+# GenBuildOptions
 #
 #
+
+
 def GenBuildOptions(ModuleObject):
     Content = ''
     if not ModuleObject.BinaryModule:
@@ -1047,7 +1102,8 @@ def GenBuildOptions(ModuleObject):
                 continue
             for Arch in BuildOptionDict:
                 if Arch in NewSectionDict:
-                    NewSectionDict[Arch] = NewSectionDict[Arch] + [BuildOptionDict[Arch]]
+                    NewSectionDict[Arch] = NewSectionDict[Arch] + \
+                        [BuildOptionDict[Arch]]
                 else:
                     NewSectionDict[Arch] = [BuildOptionDict[Arch]]
         Content = GenSection('BuildOptions', NewSectionDict)
@@ -1076,9 +1132,11 @@ def GenBuildOptions(ModuleObject):
         Content = GenSection('BuildOptions', BuildOptionDict)
 
     return Content
-## GenBinaries
+# GenBinaries
 #
 #
+
+
 def GenBinaries(ModuleObject):
     NewSectionDict = {}
     BinariesDict = []
@@ -1110,7 +1168,8 @@ def GenBinaries(ModuleObject):
                 BinariesDict[Key] = []
             else:
                 if FileType == 'SUBTYPE_GUID' and FileNameObj.GetGuidValue():
-                    Statement = GenBinaryStatement(Key, None, FileNameObj.GetGuidValue())
+                    Statement = GenBinaryStatement(
+                        Key, None, FileNameObj.GetGuidValue())
                 else:
                     Statement = GenBinaryStatement(Key, None)
                 if SortedArch in NewSectionDict:
diff --git a/BaseTools/Source/Python/UPT/GenMetaFile/GenMetaFileMisc.py b/BaseTools/Source/Python/UPT/GenMetaFile/GenMetaFileMisc.py
index c7146977dc84..7dcea267bf4c 100644
--- a/BaseTools/Source/Python/UPT/GenMetaFile/GenMetaFileMisc.py
+++ b/BaseTools/Source/Python/UPT/GenMetaFile/GenMetaFileMisc.py
@@ -1,4 +1,4 @@
-## @file GenMetaFileMisc.py
+# @file GenMetaFileMisc.py
 #
 # This file contained the miscellaneous routines for GenMetaFile usage.
 #
@@ -21,57 +21,69 @@ from Parser.DecParser import Dec
 #  @param Arch:     string of source file family field
 #  @param ExternList:  string of source file FeatureFlag field
 #
+
+
 def AddExternToDefineSec(SectionDict, Arch, ExternList):
     LeftOffset = 31
     for ArchList, EntryPoint, UnloadImage, Constructor, Destructor, FFE, HelpStringList in ExternList:
         if Arch or ArchList:
             if EntryPoint:
-                Statement = (u'%s ' % DT.TAB_INF_DEFINES_ENTRY_POINT).ljust(LeftOffset) + u'= %s' % EntryPoint
+                Statement = (u'%s ' % DT.TAB_INF_DEFINES_ENTRY_POINT).ljust(
+                    LeftOffset) + u'= %s' % EntryPoint
                 if FFE:
                     Statement += ' | %s' % FFE
                 if len(HelpStringList) > 0:
-                    Statement = HelpStringList[0].GetString() + '\n' + Statement
+                    Statement = HelpStringList[0].GetString(
+                    ) + '\n' + Statement
                 if len(HelpStringList) > 1:
                     Statement = Statement + HelpStringList[1].GetString()
                 SectionDict[Arch] = SectionDict[Arch] + [Statement]
 
             if UnloadImage:
-                Statement = (u'%s ' % DT.TAB_INF_DEFINES_UNLOAD_IMAGE).ljust(LeftOffset) + u'= %s' % UnloadImage
+                Statement = (u'%s ' % DT.TAB_INF_DEFINES_UNLOAD_IMAGE).ljust(
+                    LeftOffset) + u'= %s' % UnloadImage
                 if FFE:
                     Statement += ' | %s' % FFE
 
                 if len(HelpStringList) > 0:
-                    Statement = HelpStringList[0].GetString() + '\n' + Statement
+                    Statement = HelpStringList[0].GetString(
+                    ) + '\n' + Statement
                 if len(HelpStringList) > 1:
                     Statement = Statement + HelpStringList[1].GetString()
                 SectionDict[Arch] = SectionDict[Arch] + [Statement]
 
             if Constructor:
-                Statement = (u'%s ' % DT.TAB_INF_DEFINES_CONSTRUCTOR).ljust(LeftOffset) + u'= %s' % Constructor
+                Statement = (u'%s ' % DT.TAB_INF_DEFINES_CONSTRUCTOR).ljust(
+                    LeftOffset) + u'= %s' % Constructor
                 if FFE:
                     Statement += ' | %s' % FFE
 
                 if len(HelpStringList) > 0:
-                    Statement = HelpStringList[0].GetString() + '\n' + Statement
+                    Statement = HelpStringList[0].GetString(
+                    ) + '\n' + Statement
                 if len(HelpStringList) > 1:
                     Statement = Statement + HelpStringList[1].GetString()
                 SectionDict[Arch] = SectionDict[Arch] + [Statement]
 
             if Destructor:
-                Statement = (u'%s ' % DT.TAB_INF_DEFINES_DESTRUCTOR).ljust(LeftOffset) + u'= %s' % Destructor
+                Statement = (u'%s ' % DT.TAB_INF_DEFINES_DESTRUCTOR).ljust(
+                    LeftOffset) + u'= %s' % Destructor
                 if FFE:
                     Statement += ' | %s' % FFE
 
                 if len(HelpStringList) > 0:
-                    Statement = HelpStringList[0].GetString() + '\n' + Statement
+                    Statement = HelpStringList[0].GetString(
+                    ) + '\n' + Statement
                 if len(HelpStringList) > 1:
                     Statement = Statement + HelpStringList[1].GetString()
                 SectionDict[Arch] = SectionDict[Arch] + [Statement]
 
-## ObtainPcdName
+# ObtainPcdName
 #
 # Using TokenSpaceGuidValue and Token to obtain PcdName from DEC file
 #
+
+
 def ObtainPcdName(Packages, TokenSpaceGuidValue, Token):
     TokenSpaceGuidName = ''
     PcdCName = ''
@@ -149,11 +161,13 @@ def ObtainPcdName(Packages, TokenSpaceGuidValue, Token):
 
     return TokenSpaceGuidName, PcdCName
 
-## _TransferDict
+# _TransferDict
 #  transfer dict that using (Statement, SortedArch) as key,
 #  (GenericComment, UsageComment) as value into a dict that using SortedArch as
 #  key and NewStatement as value
 #
+
+
 def TransferDict(OrigDict, Type=None):
     NewDict = {}
     LeftOffset = 0
@@ -175,7 +189,8 @@ def TransferDict(OrigDict, Type=None):
             NewStateMent = Comment + Statement
         else:
             if LeftOffset:
-                NewStateMent = Statement.ljust(LeftOffset) + ' ' + Comment.rstrip('\n')
+                NewStateMent = Statement.ljust(
+                    LeftOffset) + ' ' + Comment.rstrip('\n')
             else:
                 NewStateMent = Statement + ' ' + Comment.rstrip('\n')
 
@@ -185,4 +200,3 @@ def TransferDict(OrigDict, Type=None):
             NewDict[SortedArch] = [NewStateMent]
 
     return NewDict
-
diff --git a/BaseTools/Source/Python/UPT/GenMetaFile/GenXmlFile.py b/BaseTools/Source/Python/UPT/GenMetaFile/GenXmlFile.py
index 50d79973b948..94fc35b1101f 100644
--- a/BaseTools/Source/Python/UPT/GenMetaFile/GenXmlFile.py
+++ b/BaseTools/Source/Python/UPT/GenMetaFile/GenXmlFile.py
@@ -1,4 +1,4 @@
-## @file GenXmlFile.py
+# @file GenXmlFile.py
 #
 # This file contained the logical of generate XML files.
 #
diff --git a/BaseTools/Source/Python/UPT/GenMetaFile/__init__.py b/BaseTools/Source/Python/UPT/GenMetaFile/__init__.py
index 508c6d060045..1b771c1edf87 100644
--- a/BaseTools/Source/Python/UPT/GenMetaFile/__init__.py
+++ b/BaseTools/Source/Python/UPT/GenMetaFile/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'Library' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/UPT/InstallPkg.py b/BaseTools/Source/Python/UPT/InstallPkg.py
index e4c7565441ba..576b0ab38358 100644
--- a/BaseTools/Source/Python/UPT/InstallPkg.py
+++ b/BaseTools/Source/Python/UPT/InstallPkg.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Install distribution package.
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -51,15 +51,17 @@ from Core.PackageFile import CreateDirectory
 from Core.DependencyRules import DependencyRules
 from Library import GlobalData
 
-## InstallNewPackage
+# InstallNewPackage
 #
 # @param WorkspaceDir:   Workspace Directory
 # @param Path:           Package Path
 # @param CustomPath:     whether need to customize path at first
 #
-def InstallNewPackage(WorkspaceDir, Path, CustomPath = False):
+
+
+def InstallNewPackage(WorkspaceDir, Path, CustomPath=False):
     if os.path.isabs(Path):
-        Logger.Info(ST.MSG_RELATIVE_PATH_ONLY%Path)
+        Logger.Info(ST.MSG_RELATIVE_PATH_ONLY % Path)
     elif CustomPath:
         Logger.Info(ST.MSG_NEW_PKG_PATH)
     else:
@@ -67,7 +69,7 @@ def InstallNewPackage(WorkspaceDir, Path, CustomPath = False):
         Path = os.path.normpath(Path)
         FullPath = os.path.normpath(os.path.join(WorkspaceDir, Path))
         if os.path.exists(FullPath):
-            Logger.Info(ST.ERR_DIR_ALREADY_EXIST%FullPath)
+            Logger.Info(ST.ERR_DIR_ALREADY_EXIST % FullPath)
         else:
             return Path
 
@@ -78,22 +80,24 @@ def InstallNewPackage(WorkspaceDir, Path, CustomPath = False):
     Input = Input.replace('\r', '').replace('\n', '')
     return InstallNewPackage(WorkspaceDir, Input, False)
 
-## InstallNewModule
+# InstallNewModule
 #
 # @param WorkspaceDir:   Workspace Directory
 # @param Path:           Standalone Module Path
 # @param PathList:       The already installed standalone module Path list
 #
-def InstallNewModule(WorkspaceDir, Path, PathList = None):
+
+
+def InstallNewModule(WorkspaceDir, Path, PathList=None):
     if PathList is None:
         PathList = []
     Path = ConvertPath(Path)
     Path = os.path.normpath(Path)
     FullPath = os.path.normpath(os.path.join(WorkspaceDir, Path))
     if os.path.exists(FullPath) and FullPath not in PathList:
-        Logger.Info(ST.ERR_DIR_ALREADY_EXIST%Path)
+        Logger.Info(ST.ERR_DIR_ALREADY_EXIST % Path)
     elif Path == FullPath:
-        Logger.Info(ST.MSG_RELATIVE_PATH_ONLY%FullPath)
+        Logger.Info(ST.MSG_RELATIVE_PATH_ONLY % FullPath)
     else:
         return Path
 
@@ -105,7 +109,7 @@ def InstallNewModule(WorkspaceDir, Path, PathList = None):
     return InstallNewModule(WorkspaceDir, Input, PathList)
 
 
-## InstallNewFile
+# InstallNewFile
 #
 # @param WorkspaceDir:   Workspace Direction
 # @param File:      File
@@ -113,7 +117,7 @@ def InstallNewModule(WorkspaceDir, Path, PathList = None):
 def InstallNewFile(WorkspaceDir, File):
     FullPath = os.path.normpath(os.path.join(WorkspaceDir, File))
     if os.path.exists(FullPath):
-        Logger.Info(ST.ERR_FILE_ALREADY_EXIST %File)
+        Logger.Info(ST.ERR_FILE_ALREADY_EXIST % File)
         Input = stdin.readline()
         Input = Input.replace('\r', '').replace('\n', '')
         if Input == '':
@@ -123,10 +127,12 @@ def InstallNewFile(WorkspaceDir, File):
     else:
         return File
 
-## UnZipDp
+# UnZipDp
 #
 # UnZipDp
 #
+
+
 def UnZipDp(WorkspaceDir, DpPkgFileName, Index=1):
     ContentZipFile = None
     Logger.Quiet(ST.MSG_UZIP_PARSE_XML)
@@ -134,11 +140,14 @@ def UnZipDp(WorkspaceDir, DpPkgFileName, Index=1):
 
     DpDescFileName, ContentFileName = GetDPFile(DistFile.GetZipFile())
 
-    TempDir = os.path.normpath(os.path.join(WorkspaceDir, "Conf/.tmp%s" % str(Index)))
+    TempDir = os.path.normpath(os.path.join(
+        WorkspaceDir, "Conf/.tmp%s" % str(Index)))
     GlobalData.gUNPACK_DIR.append(TempDir)
-    DistPkgFile = DistFile.UnpackFile(DpDescFileName, os.path.normpath(os.path.join(TempDir, DpDescFileName)))
+    DistPkgFile = DistFile.UnpackFile(
+        DpDescFileName, os.path.normpath(os.path.join(TempDir, DpDescFileName)))
     if not DistPkgFile:
-        Logger.Error("InstallPkg", FILE_NOT_FOUND, ST.ERR_FILE_BROKEN %DpDescFileName)
+        Logger.Error("InstallPkg", FILE_NOT_FOUND,
+                     ST.ERR_FILE_BROKEN % DpDescFileName)
 
     #
     # Generate distpkg
@@ -153,10 +162,11 @@ def UnZipDp(WorkspaceDir, DpPkgFileName, Index=1):
     #
     # unzip contents.zip file
     #
-    ContentFile = DistFile.UnpackFile(ContentFileName, os.path.normpath(os.path.join(TempDir, ContentFileName)))
+    ContentFile = DistFile.UnpackFile(
+        ContentFileName, os.path.normpath(os.path.join(TempDir, ContentFileName)))
     if not ContentFile:
         Logger.Error("InstallPkg", FILE_NOT_FOUND,
-            ST.ERR_FILE_BROKEN % ContentFileName)
+                     ST.ERR_FILE_BROKEN % ContentFileName)
 
     #
     # Get file size
@@ -174,14 +184,16 @@ def UnZipDp(WorkspaceDir, DpPkgFileName, Index=1):
         if DistPkg.Header.Signature != Md5Signature.hexdigest():
             ContentZipFile.Close()
             Logger.Error("InstallPkg", FILE_CHECKSUM_FAILURE,
-                ExtraData=ContentFile)
+                         ExtraData=ContentFile)
 
     return DistPkg, ContentZipFile, DpPkgFileName, DistFile
 
-## GetPackageList
+# GetPackageList
 #
 # GetPackageList
 #
+
+
 def GetPackageList(DistPkg, Dep, WorkspaceDir, Options, ContentZipFile, ModuleList, PackageList):
     NewDict = Sdict()
     for Guid, Version, Path in DistPkg.PackageSurfaceArea:
@@ -192,9 +204,11 @@ def GetPackageList(DistPkg, Dep, WorkspaceDir, Options, ContentZipFile, ModuleLi
 #             Logger.Info(ST.WRN_PACKAGE_EXISTED %(Guid, Version))
         if Options.UseGuidedPkgPath:
             GuidedPkgPath = "%s_%s_%s" % (Package.GetName(), Guid, Version)
-            NewPackagePath = InstallNewPackage(WorkspaceDir, GuidedPkgPath, Options.CustomPath)
+            NewPackagePath = InstallNewPackage(
+                WorkspaceDir, GuidedPkgPath, Options.CustomPath)
         else:
-            NewPackagePath = InstallNewPackage(WorkspaceDir, PackagePath, Options.CustomPath)
+            NewPackagePath = InstallNewPackage(
+                WorkspaceDir, PackagePath, Options.CustomPath)
         InstallPackageContent(PackagePath, NewPackagePath, Package, ContentZipFile, Dep, WorkspaceDir, ModuleList,
                               DistPkg.Header.ReadOnly)
         PackageList.append(Package)
@@ -216,10 +230,12 @@ def GetPackageList(DistPkg, Dep, WorkspaceDir, Options, ContentZipFile, ModuleLi
 
     return NewDict
 
-## GetModuleList
+# GetModuleList
 #
 # GetModuleList
 #
+
+
 def GetModuleList(DistPkg, Dep, WorkspaceDir, ContentZipFile, ModuleList):
     #
     # ModulePathList will keep track of the standalone module path that
@@ -240,16 +256,18 @@ def GetModuleList(DistPkg, Dep, WorkspaceDir, ContentZipFile, ModuleList):
         Module = DistPkg.ModuleSurfaceArea[Guid, Version, Name, Path]
         Logger.Info(ST.MSG_INSTALL_MODULE % Module.GetName())
         if Dep.CheckModuleExists(Guid, Version, Name, Path):
-            Logger.Quiet(ST.WRN_MODULE_EXISTED %Path)
+            Logger.Quiet(ST.WRN_MODULE_EXISTED % Path)
         #
         # here check for the multiple inf share the same module path cases:
         # they should be installed into the same directory
         #
         ModuleFullPath = \
-        os.path.normpath(os.path.join(WorkspaceDir, ModulePath))
+            os.path.normpath(os.path.join(WorkspaceDir, ModulePath))
         if ModuleFullPath not in ModulePathList:
-            NewModulePath = InstallNewModule(WorkspaceDir, ModulePath, ModulePathList)
-            NewModuleFullPath = os.path.normpath(os.path.join(WorkspaceDir, NewModulePath))
+            NewModulePath = InstallNewModule(
+                WorkspaceDir, ModulePath, ModulePathList)
+            NewModuleFullPath = os.path.normpath(
+                os.path.join(WorkspaceDir, NewModulePath))
             ModulePathList.append(NewModuleFullPath)
         else:
             NewModulePath = ModulePath
@@ -259,7 +277,8 @@ def GetModuleList(DistPkg, Dep, WorkspaceDir, ContentZipFile, ModuleList):
         #
         # Update module
         #
-        Module.SetModulePath(Module.GetModulePath().replace(Path, NewModulePath, 1))
+        Module.SetModulePath(
+            Module.GetModulePath().replace(Path, NewModulePath, 1))
 
         NewDict[Guid, Version, Name, Module.GetModulePath()] = Module
 
@@ -289,6 +308,8 @@ def GetModuleList(DistPkg, Dep, WorkspaceDir, ContentZipFile, ModuleList):
 ##
 # Get all protocol/ppi/guid CNames and pcd name from all dependent DEC file
 #
+
+
 def GetDepProtocolPpiGuidPcdNames(DePackageObjList):
     #
     # [[Dec1Protocol1, Dec1Protocol2...], [Dec2Protocols...],...]
@@ -340,12 +361,13 @@ def GetDepProtocolPpiGuidPcdNames(DePackageObjList):
 
         DependentPcdNames.append(PcdNames)
 
-
     return DependentProtocolCNames, DependentPpiCNames, DependentGuidCNames, DependentPcdNames
 
 ##
 # Check if protocol CName is redefined
 #
+
+
 def CheckProtoclCNameRedefined(Module, DependentProtocolCNames):
     for ProtocolInModule in Module.GetProtocolList():
         IsCNameDefined = False
@@ -353,15 +375,16 @@ def CheckProtoclCNameRedefined(Module, DependentProtocolCNames):
             if ProtocolInModule.GetCName() in PackageProtocolCNames:
                 if IsCNameDefined:
                     Logger.Error("\nUPT", FORMAT_INVALID,
-                                 File = Module.GetFullPath(),
-                                 ExtraData = \
-                                 ST.ERR_INF_PARSER_ITEM_DUPLICATE_IN_DEC % ProtocolInModule.GetCName())
+                                 File=Module.GetFullPath(),
+                                 ExtraData=ST.ERR_INF_PARSER_ITEM_DUPLICATE_IN_DEC % ProtocolInModule.GetCName())
                 else:
                     IsCNameDefined = True
 
 ##
 # Check if Ppi CName is redefined
 #
+
+
 def CheckPpiCNameRedefined(Module, DependentPpiCNames):
     for PpiInModule in Module.GetPpiList():
         IsCNameDefined = False
@@ -369,14 +392,16 @@ def CheckPpiCNameRedefined(Module, DependentPpiCNames):
             if PpiInModule.GetCName() in PackagePpiCNames:
                 if IsCNameDefined:
                     Logger.Error("\nUPT", FORMAT_INVALID,
-                                 File = Module.GetFullPath(),
-                                 ExtraData = ST.ERR_INF_PARSER_ITEM_DUPLICATE_IN_DEC % PpiInModule.GetCName())
+                                 File=Module.GetFullPath(),
+                                 ExtraData=ST.ERR_INF_PARSER_ITEM_DUPLICATE_IN_DEC % PpiInModule.GetCName())
                 else:
                     IsCNameDefined = True
 
 ##
 # Check if Guid CName is redefined
 #
+
+
 def CheckGuidCNameRedefined(Module, DependentGuidCNames):
     for GuidInModule in Module.GetGuidList():
         IsCNameDefined = False
@@ -384,15 +409,16 @@ def CheckGuidCNameRedefined(Module, DependentGuidCNames):
             if GuidInModule.GetCName() in PackageGuidCNames:
                 if IsCNameDefined:
                     Logger.Error("\nUPT", FORMAT_INVALID,
-                                 File = Module.GetFullPath(),
-                                 ExtraData = \
-                                 ST.ERR_INF_PARSER_ITEM_DUPLICATE_IN_DEC % GuidInModule.GetCName())
+                                 File=Module.GetFullPath(),
+                                 ExtraData=ST.ERR_INF_PARSER_ITEM_DUPLICATE_IN_DEC % GuidInModule.GetCName())
                 else:
                     IsCNameDefined = True
 
 ##
 # Check if PcdName is redefined
 #
+
+
 def CheckPcdNameRedefined(Module, DependentPcdNames):
     PcdObjs = []
     if not Module.GetBinaryFileList():
@@ -403,20 +429,23 @@ def CheckPcdNameRedefined(Module, DependentPcdNames):
             PcdObjs += AsBuild.GetPatchPcdList() + AsBuild.GetPcdExList()
 
     for PcdObj in PcdObjs:
-        PcdName = '.'.join([PcdObj.GetTokenSpaceGuidCName(), PcdObj.GetCName()])
+        PcdName = '.'.join(
+            [PcdObj.GetTokenSpaceGuidCName(), PcdObj.GetCName()])
         IsPcdNameDefined = False
         for PcdNames in DependentPcdNames:
             if PcdName in PcdNames:
                 if IsPcdNameDefined:
                     Logger.Error("\nUPT", FORMAT_INVALID,
-                                 File = Module.GetFullPath(),
-                                 ExtraData = ST.ERR_INF_PARSER_ITEM_DUPLICATE_IN_DEC % PcdName)
+                                 File=Module.GetFullPath(),
+                                 ExtraData=ST.ERR_INF_PARSER_ITEM_DUPLICATE_IN_DEC % PcdName)
                 else:
                     IsPcdNameDefined = True
 
 ##
 # Check if any Protocol/Ppi/Guid and Pcd name is redefined in its dependent DEC files
 #
+
+
 def CheckCNameInModuleRedefined(Module, DistPkg):
     DePackageObjList = []
     #
@@ -431,18 +460,20 @@ def CheckCNameInModuleRedefined(Module, DistPkg):
                     DePackageObjList.append(DistPkg.PackageSurfaceArea[Key])
 
     DependentProtocolCNames, DependentPpiCNames, DependentGuidCNames, DependentPcdNames = \
-    GetDepProtocolPpiGuidPcdNames(DePackageObjList)
+        GetDepProtocolPpiGuidPcdNames(DePackageObjList)
 
     CheckProtoclCNameRedefined(Module, DependentProtocolCNames)
     CheckPpiCNameRedefined(Module, DependentPpiCNames)
     CheckGuidCNameRedefined(Module, DependentGuidCNames)
     CheckPcdNameRedefined(Module, DependentPcdNames)
 
-## GenToolMisc
+# GenToolMisc
 #
 # GenToolMisc
 #
 #
+
+
 def GenToolMisc(DistPkg, WorkspaceDir, ContentZipFile):
     ToolObject = DistPkg.Tools
     MiscObject = DistPkg.MiscellaneousFiles
@@ -473,20 +504,21 @@ def GenToolMisc(DistPkg, WorkspaceDir, ContentZipFile):
         File = ConvertPath(FileObject.GetURI())
         ToFile = os.path.normpath(os.path.join(RootDir, File))
         if os.path.exists(ToFile):
-            Logger.Info( ST.WRN_FILE_EXISTED % ToFile )
+            Logger.Info(ST.WRN_FILE_EXISTED % ToFile)
             #
             # ask for user input the new file name
             #
-            Logger.Info( ST.MSG_NEW_FILE_NAME)
+            Logger.Info(ST.MSG_NEW_FILE_NAME)
             Input = stdin.readline()
             Input = Input.replace('\r', '').replace('\n', '')
             OrigPath = os.path.split(ToFile)[0]
             ToFile = os.path.normpath(os.path.join(OrigPath, Input))
         FromFile = os.path.join(FileObject.GetURI())
-        Md5Sum = InstallFile(ContentZipFile, FromFile, ToFile, DistPkg.Header.ReadOnly, FileObject.GetExecutable())
+        Md5Sum = InstallFile(ContentZipFile, FromFile, ToFile,
+                             DistPkg.Header.ReadOnly, FileObject.GetExecutable())
         DistPkg.FileList.append((ToFile, Md5Sum))
 
-## Tool entrance method
+# Tool entrance method
 #
 # This method mainly dispatch specific methods per the command line options.
 # If no error found, return zero value so the caller of this tool can know
@@ -494,12 +526,15 @@ def GenToolMisc(DistPkg, WorkspaceDir, ContentZipFile):
 #
 # @param  Options: command Options
 #
-def Main(Options = None):
+
+
+def Main(Options=None):
     try:
         DataBase = GlobalData.gDB
         WorkspaceDir = GlobalData.gWORKSPACE
         if not Options.PackageFile:
-            Logger.Error("InstallPkg", OPTION_MISSING, ExtraData=ST.ERR_SPECIFY_PACKAGE)
+            Logger.Error("InstallPkg", OPTION_MISSING,
+                         ExtraData=ST.ERR_SPECIFY_PACKAGE)
 
         # Get all Dist Info
         DistInfoList = []
@@ -509,7 +544,8 @@ def Main(Options = None):
             #
             # unzip dist.pkg file
             #
-            DistInfoList.append(UnZipDp(WorkspaceDir, ToBeInstalledDist, Index))
+            DistInfoList.append(
+                UnZipDp(WorkspaceDir, ToBeInstalledDist, Index))
             DistPkgList.append(DistInfoList[-1][0])
             Index += 1
 
@@ -534,24 +570,26 @@ def Main(Options = None):
     except FatalError as XExcept:
         ReturnCode = XExcept.args[0]
         if Logger.GetLevel() <= Logger.DEBUG_9:
-            Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + format_exc())
+            Logger.Quiet(ST.MSG_PYTHON_ON %
+                         (python_version(), platform) + format_exc())
 
     except KeyboardInterrupt:
         ReturnCode = ABORT_ERROR
         if Logger.GetLevel() <= Logger.DEBUG_9:
-            Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + format_exc())
+            Logger.Quiet(ST.MSG_PYTHON_ON %
+                         (python_version(), platform) + format_exc())
 
     except:
         ReturnCode = CODE_ERROR
         Logger.Error(
-                    "\nInstallPkg",
-                    CODE_ERROR,
-                    ST.ERR_UNKNOWN_FATAL_INSTALL_ERR % Options.PackageFile,
-                    ExtraData=ST.MSG_SEARCH_FOR_HELP % ST.MSG_EDKII_MAIL_ADDR,
-                    RaiseError=False
-                    )
+            "\nInstallPkg",
+            CODE_ERROR,
+            ST.ERR_UNKNOWN_FATAL_INSTALL_ERR % Options.PackageFile,
+            ExtraData=ST.MSG_SEARCH_FOR_HELP % ST.MSG_EDKII_MAIL_ADDR,
+            RaiseError=False
+        )
         Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(),
-            platform) + format_exc())
+                                         platform) + format_exc())
     finally:
         Logger.Quiet(ST.MSG_REMOVE_TEMP_FILE_STARTED)
         for ToBeInstalledDist in DistInfoList:
@@ -578,6 +616,8 @@ def Main(Options = None):
 # @param WorkspaceDir:  The workspace directory
 # @retval NewDpPkgFileName: The exact backup file name
 #
+
+
 def BackupDist(DpPkgFileName, Guid, Version, WorkspaceDir):
     DistFileName = os.path.split(DpPkgFileName)[1]
     DestDir = os.path.normpath(os.path.join(WorkspaceDir, GlobalData.gUPT_DIR))
@@ -591,7 +631,7 @@ def BackupDist(DpPkgFileName, Guid, Version, WorkspaceDir):
             #
             # ask for user input the new file name
             #
-            Logger.Info( ST.MSG_NEW_FILE_NAME_FOR_DIST)
+            Logger.Info(ST.MSG_NEW_FILE_NAME_FOR_DIST)
             Input = stdin.readline()
             Input = Input.replace('\r', '').replace('\n', '')
             DestFile = os.path.normpath(os.path.join(DestDir, Input))
@@ -599,19 +639,21 @@ def BackupDist(DpPkgFileName, Guid, Version, WorkspaceDir):
     NewDpPkgFileName = DestFile[DestFile.find(DestDir) + len(DestDir) + 1:]
     return NewDpPkgFileName
 
-## CheckInstallDpx method
+# CheckInstallDpx method
 #
 #  check whether distribution could be installed
 #
 #   @param  Dep: the DependencyRules instance that used to check dependency
 #   @param  DistPkg: the distribution object
 #
+
+
 def CheckInstallDpx(Dep, DistPkg, DistPkgFileName):
     #
     # Check distribution package installed or not
     #
     if Dep.CheckDpExists(DistPkg.Header.GetGuid(),
-        DistPkg.Header.GetVersion()):
+                         DistPkg.Header.GetVersion()):
         Logger.Error("InstallPkg",
                      UPT_ALREADY_INSTALLED_ERROR,
                      ST.WRN_DIST_PKG_INSTALLED % os.path.basename(DistPkgFileName))
@@ -621,10 +663,10 @@ def CheckInstallDpx(Dep, DistPkg, DistPkgFileName):
     #
     if not Dep.CheckInstallDpDepexSatisfied(DistPkg):
         Logger.Error("InstallPkg", UNKNOWN_ERROR,
-            ST.ERR_PACKAGE_NOT_MATCH_DEPENDENCY,
-            ExtraData=DistPkg.Header.Name)
+                     ST.ERR_PACKAGE_NOT_MATCH_DEPENDENCY,
+                     ExtraData=DistPkg.Header.Name)
 
-## InstallModuleContent method
+# InstallModuleContent method
 #
 # If this is standalone module, then Package should be none,
 # ModulePath should be ''
@@ -637,19 +679,22 @@ def CheckInstallDpx(Dep, DistPkg, DistPkgFileName):
 #   @param  ModuleList: ModuleList
 #   @param  Package: Package
 #
+
+
 def InstallModuleContent(FromPath, NewPath, ModulePath, Module, ContentZipFile,
-    WorkspaceDir, ModuleList, Package = None, ReadOnly = False):
+                         WorkspaceDir, ModuleList, Package=None, ReadOnly=False):
 
     if NewPath.startswith("\\") or NewPath.startswith("/"):
         NewPath = NewPath[1:]
 
     if not IsValidInstallPath(NewPath):
-        Logger.Error("UPT", FORMAT_INVALID, ST.ERR_FILE_NAME_INVALIDE%NewPath)
+        Logger.Error("UPT", FORMAT_INVALID,
+                     ST.ERR_FILE_NAME_INVALIDE % NewPath)
 
     NewModuleFullPath = os.path.normpath(os.path.join(WorkspaceDir, NewPath,
-        ConvertPath(ModulePath)))
+                                                      ConvertPath(ModulePath)))
     Module.SetFullPath(os.path.normpath(os.path.join(NewModuleFullPath,
-        ConvertPath(Module.GetName()) + '.inf')))
+                                                     ConvertPath(Module.GetName()) + '.inf')))
     Module.FileList = []
 
     for MiscFile in Module.GetMiscFileList():
@@ -661,12 +706,15 @@ def InstallModuleContent(FromPath, NewPath, ModulePath, Module, ContentZipFile,
                 File = File[1:]
 
             if not IsValidInstallPath(File):
-                Logger.Error("UPT", FORMAT_INVALID, ST.ERR_FILE_NAME_INVALIDE%File)
+                Logger.Error("UPT", FORMAT_INVALID,
+                             ST.ERR_FILE_NAME_INVALIDE % File)
 
             FromFile = os.path.join(FromPath, ModulePath, File)
             Executable = Item.GetExecutable()
-            ToFile = os.path.normpath(os.path.join(NewModuleFullPath, ConvertPath(File)))
-            Md5Sum = InstallFile(ContentZipFile, FromFile, ToFile, ReadOnly, Executable)
+            ToFile = os.path.normpath(os.path.join(
+                NewModuleFullPath, ConvertPath(File)))
+            Md5Sum = InstallFile(ContentZipFile, FromFile,
+                                 ToFile, ReadOnly, Executable)
             if Package and ((ToFile, Md5Sum) not in Package.FileList):
                 Package.FileList.append((ToFile, Md5Sum))
             elif Package:
@@ -679,10 +727,12 @@ def InstallModuleContent(FromPath, NewPath, ModulePath, Module, ContentZipFile,
             File = File[1:]
 
         if not IsValidInstallPath(File):
-            Logger.Error("UPT", FORMAT_INVALID, ST.ERR_FILE_NAME_INVALIDE%File)
+            Logger.Error("UPT", FORMAT_INVALID,
+                         ST.ERR_FILE_NAME_INVALIDE % File)
 
         FromFile = os.path.join(FromPath, ModulePath, File)
-        ToFile = os.path.normpath(os.path.join(NewModuleFullPath, ConvertPath(File)))
+        ToFile = os.path.normpath(os.path.join(
+            NewModuleFullPath, ConvertPath(File)))
         Md5Sum = InstallFile(ContentZipFile, FromFile, ToFile, ReadOnly)
         if Package and ((ToFile, Md5Sum) not in Package.FileList):
             Package.FileList.append((ToFile, Md5Sum))
@@ -698,10 +748,12 @@ def InstallModuleContent(FromPath, NewPath, ModulePath, Module, ContentZipFile,
                 File = File[1:]
 
             if not IsValidInstallPath(File):
-                Logger.Error("UPT", FORMAT_INVALID, ST.ERR_FILE_NAME_INVALIDE%File)
+                Logger.Error("UPT", FORMAT_INVALID,
+                             ST.ERR_FILE_NAME_INVALIDE % File)
 
             FromFile = os.path.join(FromPath, ModulePath, File)
-            ToFile = os.path.normpath(os.path.join(NewModuleFullPath, ConvertPath(File)))
+            ToFile = os.path.normpath(os.path.join(
+                NewModuleFullPath, ConvertPath(File)))
             Md5Sum = InstallFile(ContentZipFile, FromFile, ToFile, ReadOnly)
             if Package and ((ToFile, Md5Sum) not in Package.FileList):
                 Package.FileList.append((ToFile, Md5Sum))
@@ -713,10 +765,12 @@ def InstallModuleContent(FromPath, NewPath, ModulePath, Module, ContentZipFile,
     InstallModuleContentZipFile(ContentZipFile, FromPath, ModulePath, WorkspaceDir, NewPath, Module, Package, ReadOnly,
                                 ModuleList)
 
-## InstallModuleContentZipFile
+# InstallModuleContentZipFile
 #
 # InstallModuleContentZipFile
 #
+
+
 def InstallModuleContentZipFile(ContentZipFile, FromPath, ModulePath, WorkspaceDir, NewPath, Module, Package, ReadOnly,
                                 ModuleList):
     #
@@ -731,11 +785,12 @@ def InstallModuleContentZipFile(ContentZipFile, FromPath, ModulePath, WorkspaceD
                     FileName = FileName[1:]
 
                 if not IsValidInstallPath(FileName):
-                    Logger.Error("UPT", FORMAT_INVALID, ST.ERR_FILE_NAME_INVALIDE%FileName)
+                    Logger.Error("UPT", FORMAT_INVALID,
+                                 ST.ERR_FILE_NAME_INVALIDE % FileName)
 
                 FromFile = FileName
                 ToFile = os.path.normpath(os.path.join(WorkspaceDir,
-                        ConvertPath(FileName.replace(FromPath, NewPath, 1))))
+                                                       ConvertPath(FileName.replace(FromPath, NewPath, 1))))
                 CheckList = copy.copy(Module.FileList)
                 if Package:
                     CheckList += Package.FileList
@@ -743,7 +798,8 @@ def InstallModuleContentZipFile(ContentZipFile, FromPath, ModulePath, WorkspaceD
                     if Item[0] == ToFile:
                         break
                 else:
-                    Md5Sum = InstallFile(ContentZipFile, FromFile, ToFile, ReadOnly)
+                    Md5Sum = InstallFile(
+                        ContentZipFile, FromFile, ToFile, ReadOnly)
                     if Package and ((ToFile, Md5Sum) not in Package.FileList):
                         Package.FileList.append((ToFile, Md5Sum))
                     elif Package:
@@ -753,13 +809,15 @@ def InstallModuleContentZipFile(ContentZipFile, FromPath, ModulePath, WorkspaceD
 
     ModuleList.append((Module, Package))
 
-## FileUnderPath
+# FileUnderPath
 #  Check whether FileName started with directory specified by CheckPath
 #
 # @param FileName: the FileName need to be checked
 # @param CheckPath: the path need to be checked against
 # @return:  True or False
 #
+
+
 def FileUnderPath(FileName, CheckPath):
     FileName = FileName.replace('\\', '/')
     FileName = os.path.normpath(FileName)
@@ -774,35 +832,40 @@ def FileUnderPath(FileName, CheckPath):
 
     return False
 
-## InstallFile
+# InstallFile
 #  Extract File from Zipfile, set file attribute, and return the Md5Sum
 #
 # @return:  True or False
 #
+
+
 def InstallFile(ContentZipFile, FromFile, ToFile, ReadOnly, Executable=False):
     if os.path.exists(os.path.normpath(ToFile)):
         pass
     else:
         if not ContentZipFile or not ContentZipFile.UnpackFile(FromFile, ToFile):
-            Logger.Error("UPT", FILE_NOT_FOUND, ST.ERR_INSTALL_FILE_FROM_EMPTY_CONTENT % FromFile)
+            Logger.Error("UPT", FILE_NOT_FOUND,
+                         ST.ERR_INSTALL_FILE_FROM_EMPTY_CONTENT % FromFile)
 
         if ReadOnly:
             if not Executable:
                 chmod(ToFile, stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH)
             else:
-                chmod(ToFile, stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH | stat.S_IEXEC | stat.S_IXGRP | stat.S_IXOTH)
+                chmod(ToFile, stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH |
+                      stat.S_IEXEC | stat.S_IXGRP | stat.S_IXOTH)
         elif Executable:
             chmod(ToFile, stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH | stat.S_IWUSR | stat.S_IWGRP |
                   stat.S_IWOTH | stat.S_IEXEC | stat.S_IXGRP | stat.S_IXOTH)
         else:
-            chmod(ToFile, stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH | stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH)
+            chmod(ToFile, stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH |
+                  stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH)
 
     Md5Signature = md5(__FileHookOpen__(str(ToFile), 'rb').read())
     Md5Sum = Md5Signature.hexdigest()
 
     return Md5Sum
 
-## InstallPackageContent method
+# InstallPackageContent method
 #
 #   @param  FromPath: FromPath
 #   @param  ToPath: ToPath
@@ -812,8 +875,10 @@ def InstallFile(ContentZipFile, FromFile, ToFile, ReadOnly, Executable=False):
 #   @param  WorkspaceDir: WorkspaceDir
 #   @param  ModuleList: ModuleList
 #
+
+
 def InstallPackageContent(FromPath, ToPath, Package, ContentZipFile, Dep,
-    WorkspaceDir, ModuleList, ReadOnly = False):
+                          WorkspaceDir, ModuleList, ReadOnly=False):
     if Dep:
         pass
     Package.FileList = []
@@ -822,13 +887,14 @@ def InstallPackageContent(FromPath, ToPath, Package, ContentZipFile, Dep,
         ToPath = ToPath[1:]
 
     if not IsValidInstallPath(ToPath):
-        Logger.Error("UPT", FORMAT_INVALID, ST.ERR_FILE_NAME_INVALIDE%ToPath)
+        Logger.Error("UPT", FORMAT_INVALID, ST.ERR_FILE_NAME_INVALIDE % ToPath)
 
     if FromPath.startswith("\\") or FromPath.startswith("/"):
         FromPath = FromPath[1:]
 
     if not IsValidInstallPath(FromPath):
-        Logger.Error("UPT", FORMAT_INVALID, ST.ERR_FILE_NAME_INVALIDE%FromPath)
+        Logger.Error("UPT", FORMAT_INVALID,
+                     ST.ERR_FILE_NAME_INVALIDE % FromPath)
 
     PackageFullPath = os.path.normpath(os.path.join(WorkspaceDir, ToPath))
     for MiscFile in Package.GetMiscFileList():
@@ -838,12 +904,14 @@ def InstallPackageContent(FromPath, ToPath, Package, ContentZipFile, Dep,
                 FileName = FileName[1:]
 
             if not IsValidInstallPath(FileName):
-                Logger.Error("UPT", FORMAT_INVALID, ST.ERR_FILE_NAME_INVALIDE%FileName)
+                Logger.Error("UPT", FORMAT_INVALID,
+                             ST.ERR_FILE_NAME_INVALIDE % FileName)
 
             FromFile = os.path.join(FromPath, FileName)
             Executable = Item.GetExecutable()
-            ToFile =  (os.path.join(PackageFullPath, ConvertPath(FileName)))
-            Md5Sum = InstallFile(ContentZipFile, FromFile, ToFile, ReadOnly, Executable)
+            ToFile = (os.path.join(PackageFullPath, ConvertPath(FileName)))
+            Md5Sum = InstallFile(ContentZipFile, FromFile,
+                                 ToFile, ReadOnly, Executable)
             if (ToFile, Md5Sum) not in Package.FileList:
                 Package.FileList.append((ToFile, Md5Sum))
     PackageIncludeArchList = []
@@ -853,23 +921,27 @@ def InstallPackageContent(FromPath, ToPath, Package, ContentZipFile, Dep,
             FileName = FileName[1:]
 
         if not IsValidInstallPath(FileName):
-            Logger.Error("UPT", FORMAT_INVALID, ST.ERR_FILE_NAME_INVALIDE%FileName)
+            Logger.Error("UPT", FORMAT_INVALID,
+                         ST.ERR_FILE_NAME_INVALIDE % FileName)
 
         FromFile = os.path.join(FromPath, FileName)
-        ToFile = os.path.normpath(os.path.join(PackageFullPath, ConvertPath(FileName)))
+        ToFile = os.path.normpath(os.path.join(
+            PackageFullPath, ConvertPath(FileName)))
         RetFile = ContentZipFile.UnpackFile(FromFile, ToFile)
         if RetFile == '':
             #
             # a non-exist path in Zipfile will return '', which means an include directory in our case
             # save the information for later DEC creation usage and also create the directory
             #
-            PackageIncludeArchList.append([Item.GetFilePath(), Item.GetSupArchList()])
+            PackageIncludeArchList.append(
+                [Item.GetFilePath(), Item.GetSupArchList()])
             CreateDirectory(ToFile)
             continue
         if ReadOnly:
-            chmod(ToFile, stat.S_IRUSR|stat.S_IRGRP|stat.S_IROTH)
+            chmod(ToFile, stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH)
         else:
-            chmod(ToFile, stat.S_IRUSR|stat.S_IRGRP|stat.S_IROTH|stat.S_IWUSR|stat.S_IWGRP|stat.S_IWOTH)
+            chmod(ToFile, stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH |
+                  stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH)
         Md5Signature = md5(__FileHookOpen__(str(ToFile), 'rb').read())
         Md5Sum = Md5Signature.hexdigest()
         if (ToFile, Md5Sum) not in Package.FileList:
@@ -882,10 +954,12 @@ def InstallPackageContent(FromPath, ToPath, Package, ContentZipFile, Dep,
             FileName = FileName[1:]
 
         if not IsValidInstallPath(FileName):
-            Logger.Error("UPT", FORMAT_INVALID, ST.ERR_FILE_NAME_INVALIDE%FileName)
+            Logger.Error("UPT", FORMAT_INVALID,
+                         ST.ERR_FILE_NAME_INVALIDE % FileName)
 
         FromFile = os.path.join(FromPath, FileName)
-        ToFile = os.path.normpath(os.path.join(PackageFullPath, ConvertPath(FileName)))
+        ToFile = os.path.normpath(os.path.join(
+            PackageFullPath, ConvertPath(FileName)))
         Md5Sum = InstallFile(ContentZipFile, FromFile, ToFile, ReadOnly)
         if (ToFile, Md5Sum) not in Package.FileList:
             Package.FileList.append((ToFile, Md5Sum))
@@ -894,9 +968,9 @@ def InstallPackageContent(FromPath, ToPath, Package, ContentZipFile, Dep,
     # Update package
     #
     Package.SetPackagePath(Package.GetPackagePath().replace(FromPath,
-        ToPath, 1))
+                                                            ToPath, 1))
     Package.SetFullPath(os.path.normpath(os.path.join(PackageFullPath,
-        ConvertPath(Package.GetName()) + '.dec')))
+                                                      ConvertPath(Package.GetName()) + '.dec')))
 
     #
     # Install files in module
@@ -906,12 +980,14 @@ def InstallPackageContent(FromPath, ToPath, Package, ContentZipFile, Dep,
     for ModuleGuid, ModuleVersion, ModuleName, ModulePath in ModuleDict:
         Module = ModuleDict[ModuleGuid, ModuleVersion, ModuleName, ModulePath]
         InstallModuleContent(FromPath, ToPath, ModulePath, Module,
-            ContentZipFile, WorkspaceDir, ModuleList, Package, ReadOnly)
+                             ContentZipFile, WorkspaceDir, ModuleList, Package, ReadOnly)
 
-## GetDPFile method
+# GetDPFile method
 #
 #   @param  ZipFile: A ZipFile
 #
+
+
 def GetDPFile(ZipFile):
     ContentFile = ''
     DescFile = ''
@@ -928,16 +1004,18 @@ def GetDPFile(ZipFile):
             continue
 
         Logger.Error("PackagingTool", FILE_TYPE_MISMATCH,
-            ExtraData=ST.ERR_DIST_FILE_TOOMANY)
+                     ExtraData=ST.ERR_DIST_FILE_TOOMANY)
     if not DescFile or not ContentFile:
         Logger.Error("PackagingTool", FILE_UNKNOWN_ERROR,
-            ExtraData=ST.ERR_DIST_FILE_TOOFEW)
+                     ExtraData=ST.ERR_DIST_FILE_TOOFEW)
     return DescFile, ContentFile
 
-## InstallDp method
+# InstallDp method
 #
 #   Install the distribution to current workspace
 #
+
+
 def InstallDp(DistPkg, DpPkgFileName, ContentZipFile, Options, Dep, WorkspaceDir, DataBase):
     #
     # PackageList, ModuleList record the information for the meta-data
@@ -948,7 +1026,8 @@ def InstallDp(DistPkg, DpPkgFileName, ContentZipFile, Options, Dep, WorkspaceDir
     DistPkg.PackageSurfaceArea = GetPackageList(DistPkg, Dep, WorkspaceDir, Options,
                                                 ContentZipFile, ModuleList, PackageList)
 
-    DistPkg.ModuleSurfaceArea = GetModuleList(DistPkg, Dep, WorkspaceDir, ContentZipFile, ModuleList)
+    DistPkg.ModuleSurfaceArea = GetModuleList(
+        DistPkg, Dep, WorkspaceDir, ContentZipFile, ModuleList)
 
     GenToolMisc(DistPkg, WorkspaceDir, ContentZipFile)
 
@@ -956,12 +1035,12 @@ def InstallDp(DistPkg, DpPkgFileName, ContentZipFile, Options, Dep, WorkspaceDir
     # copy "Distribution File" to directory $(WORKSPACE)/conf/upt
     #
     DistFileName = os.path.split(DpPkgFileName)[1]
-    NewDpPkgFileName = BackupDist(DpPkgFileName, DistPkg.Header.GetGuid(), DistPkg.Header.GetVersion(), WorkspaceDir)
+    NewDpPkgFileName = BackupDist(DpPkgFileName, DistPkg.Header.GetGuid(
+    ), DistPkg.Header.GetVersion(), WorkspaceDir)
 
     #
     # update database
     #
     Logger.Quiet(ST.MSG_UPDATE_PACKAGE_DATABASE)
     DataBase.AddDPObject(DistPkg, NewDpPkgFileName, DistFileName,
-                   DistPkg.Header.RePackage)
-
+                         DistPkg.Header.RePackage)
diff --git a/BaseTools/Source/Python/UPT/InventoryWs.py b/BaseTools/Source/Python/UPT/InventoryWs.py
index 955b2e510eb2..438af59454b5 100644
--- a/BaseTools/Source/Python/UPT/InventoryWs.py
+++ b/BaseTools/Source/Python/UPT/InventoryWs.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Inventory workspace's distribution package information.
 #
 # Copyright (c) 2014 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -23,12 +23,14 @@ import Logger.Log as Logger
 
 from Library import GlobalData
 
-## InventoryDistInstalled
+# InventoryDistInstalled
 #
 # This method retrieves the installed distribution information from the internal UPT database
 #
 # @param  DataBase: the UPT database
 #
+
+
 def InventoryDistInstalled(DataBase):
     DistInstalled = DataBase.InventoryDistInstalled()
 
@@ -47,7 +49,8 @@ def InventoryDistInstalled(DataBase):
     for (DpGuid, DpVersion, DpOriginalName, DpAliasFileName) in DistInstalled:
         MaxGuidlen = max(MaxGuidlen, len(DpGuid))
         MaxVerlen = max(MaxVerlen, len(DpVersion))
-        MaxDpAliasFileNameLen = max(MaxDpAliasFileNameLen, len(DpAliasFileName))
+        MaxDpAliasFileNameLen = max(
+            MaxDpAliasFileNameLen, len(DpAliasFileName))
         MaxDpOrigFileNamelen = max(MaxDpOrigFileNamelen, len(DpOriginalName))
 
     OutMsgFmt = "%-*s\t%-*s\t%-*s\t%-s"
@@ -62,15 +65,15 @@ def InventoryDistInstalled(DataBase):
 
     for (DpGuid, DpVersion, DpFileName, DpAliasFileName) in DistInstalled:
         OutMsg = OutMsgFmt % (MaxDpAliasFileNameLen,
-                            DpAliasFileName,
-                            MaxGuidlen,
-                            DpGuid,
-                            MaxVerlen,
-                            DpVersion,
-                            DpFileName)
+                              DpAliasFileName,
+                              MaxGuidlen,
+                              DpGuid,
+                              MaxVerlen,
+                              DpVersion,
+                              DpFileName)
         Logger.Info(OutMsg)
 
-## Tool entrance method
+# Tool entrance method
 #
 # This method mainly dispatch specific methods per the command line options.
 # If no error found, return zero value so the caller of this tool can know
@@ -78,7 +81,9 @@ def InventoryDistInstalled(DataBase):
 #
 # @param  Options: command Options
 #
-def Main(Options = None):
+
+
+def Main(Options=None):
     if Options:
         pass
 
@@ -89,21 +94,23 @@ def Main(Options = None):
     except FatalError as XExcept:
         ReturnCode = XExcept.args[0]
         if Logger.GetLevel() <= Logger.DEBUG_9:
-            Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + format_exc())
+            Logger.Quiet(ST.MSG_PYTHON_ON %
+                         (python_version(), platform) + format_exc())
     except KeyboardInterrupt:
         ReturnCode = ABORT_ERROR
         if Logger.GetLevel() <= Logger.DEBUG_9:
-            Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + format_exc())
+            Logger.Quiet(ST.MSG_PYTHON_ON %
+                         (python_version(), platform) + format_exc())
     except:
         ReturnCode = CODE_ERROR
         Logger.Error("\nInventoryWs",
-                    CODE_ERROR,
-                    ST.ERR_UNKNOWN_FATAL_INVENTORYWS_ERR,
-                    ExtraData=ST.MSG_SEARCH_FOR_HELP % ST.MSG_EDKII_MAIL_ADDR,
-                    RaiseError=False
-                    )
+                     CODE_ERROR,
+                     ST.ERR_UNKNOWN_FATAL_INVENTORYWS_ERR,
+                     ExtraData=ST.MSG_SEARCH_FOR_HELP % ST.MSG_EDKII_MAIL_ADDR,
+                     RaiseError=False
+                     )
         Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(),
-            platform) + format_exc())
+                                         platform) + format_exc())
 
     if ReturnCode == 0:
         Logger.Quiet(ST.MSG_FINISH)
diff --git a/BaseTools/Source/Python/UPT/Library/CommentGenerating.py b/BaseTools/Source/Python/UPT/Library/CommentGenerating.py
index bded508f565a..2666a8c4e9aa 100644
--- a/BaseTools/Source/Python/UPT/Library/CommentGenerating.py
+++ b/BaseTools/Source/Python/UPT/Library/CommentGenerating.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define comment generating interface
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -30,33 +30,38 @@ from Library.DataType import TAB_STAR
 from Library.DataType import TAB_PCD_PROMPT
 from Library.UniClassObject import ConvertSpecialUnicodes
 from Library.Misc import GetLocalValue
-## GenTailCommentLines
+# GenTailCommentLines
 #
 # @param TailCommentLines:  the tail comment lines that need to be generated
 # @param LeadingSpaceNum:   the number of leading space needed for non-first
 #                            line tail comment
 #
-def GenTailCommentLines (TailCommentLines, LeadingSpaceNum = 0):
+
+
+def GenTailCommentLines(TailCommentLines, LeadingSpaceNum=0):
     TailCommentLines = TailCommentLines.rstrip(END_OF_LINE)
     CommentStr = TAB_SPACE_SPLIT*2 + TAB_SPECIAL_COMMENT + TAB_SPACE_SPLIT + \
-    (END_OF_LINE + LeadingSpaceNum * TAB_SPACE_SPLIT + TAB_SPACE_SPLIT*2 + TAB_SPECIAL_COMMENT + \
-     TAB_SPACE_SPLIT).join(GetSplitValueList(TailCommentLines, END_OF_LINE))
+        (END_OF_LINE + LeadingSpaceNum * TAB_SPACE_SPLIT + TAB_SPACE_SPLIT*2 + TAB_SPECIAL_COMMENT +
+         TAB_SPACE_SPLIT).join(GetSplitValueList(TailCommentLines, END_OF_LINE))
 
     return CommentStr
 
-## GenGenericComment
+# GenGenericComment
 #
 # @param CommentLines:   Generic comment Text, maybe Multiple Lines
 #
-def GenGenericComment (CommentLines):
+
+
+def GenGenericComment(CommentLines):
     if not CommentLines:
         return ''
     CommentLines = CommentLines.rstrip(END_OF_LINE)
-    CommentStr = TAB_SPECIAL_COMMENT + TAB_SPACE_SPLIT + (END_OF_LINE + TAB_COMMENT_SPLIT + TAB_SPACE_SPLIT).join\
-    (GetSplitValueList(CommentLines, END_OF_LINE)) + END_OF_LINE
+    CommentStr = TAB_SPECIAL_COMMENT + TAB_SPACE_SPLIT + \
+        (END_OF_LINE + TAB_COMMENT_SPLIT + TAB_SPACE_SPLIT).join(
+            GetSplitValueList(CommentLines, END_OF_LINE)) + END_OF_LINE
     return CommentStr
 
-## GenGenericCommentF
+# GenGenericCommentF
 #
 #  similar to GenGenericComment but will remove <EOL> at end of comment once,
 #  and for line with only <EOL>, '#\n' will be generated instead of '# \n'
@@ -64,7 +69,9 @@ def GenGenericComment (CommentLines):
 # @param CommentLines:   Generic comment Text, maybe Multiple Lines
 # @return CommentStr:    Generated comment line
 #
-def GenGenericCommentF (CommentLines, NumOfPound=1, IsPrompt=False, IsInfLibraryClass=False):
+
+
+def GenGenericCommentF(CommentLines, NumOfPound=1, IsPrompt=False, IsInfLibraryClass=False):
     if not CommentLines:
         return ''
     #
@@ -76,7 +83,7 @@ def GenGenericCommentF (CommentLines, NumOfPound=1, IsPrompt=False, IsInfLibrary
     CommentStr = ''
     if IsPrompt:
         CommentStr += TAB_COMMENT_SPLIT * NumOfPound + TAB_SPACE_SPLIT + TAB_PCD_PROMPT + TAB_SPACE_SPLIT + \
-        CommentLines.replace(END_OF_LINE, '') + END_OF_LINE
+            CommentLines.replace(END_OF_LINE, '') + END_OF_LINE
     else:
         CommentLineList = GetSplitValueList(CommentLines, END_OF_LINE)
         FindLibraryClass = False
@@ -93,16 +100,19 @@ def GenGenericCommentF (CommentLines, NumOfPound=1, IsPrompt=False, IsInfLibrary
                 CommentStr += TAB_COMMENT_SPLIT * NumOfPound + END_OF_LINE
             else:
                 if FindLibraryClass and Line.find(u'@libraryclass ') > -1:
-                    CommentStr += TAB_COMMENT_SPLIT * NumOfPound + TAB_SPACE_SPLIT + Line + END_OF_LINE
+                    CommentStr += TAB_COMMENT_SPLIT * NumOfPound + \
+                        TAB_SPACE_SPLIT + Line + END_OF_LINE
                 elif FindLibraryClass:
-                    CommentStr += TAB_COMMENT_SPLIT * NumOfPound + TAB_SPACE_SPLIT * 16 + Line + END_OF_LINE
+                    CommentStr += TAB_COMMENT_SPLIT * NumOfPound + \
+                        TAB_SPACE_SPLIT * 16 + Line + END_OF_LINE
                 else:
-                    CommentStr += TAB_COMMENT_SPLIT * NumOfPound + TAB_SPACE_SPLIT + Line + END_OF_LINE
+                    CommentStr += TAB_COMMENT_SPLIT * NumOfPound + \
+                        TAB_SPACE_SPLIT + Line + END_OF_LINE
 
     return CommentStr
 
 
-## GenHeaderCommentSection
+# GenHeaderCommentSection
 #
 # Generate Header comment sections
 #
@@ -111,7 +121,7 @@ def GenGenericCommentF (CommentLines, NumOfPound=1, IsPrompt=False, IsInfLibrary
 # @param Copyright     possible multiple copyright lines
 # @param License       possible multiple license lines
 #
-def GenHeaderCommentSection(Abstract, Description, Copyright, License, IsBinaryHeader=False, \
+def GenHeaderCommentSection(Abstract, Description, Copyright, License, IsBinaryHeader=False,
                             CommChar=TAB_COMMENT_SPLIT):
     Content = ''
 
@@ -124,21 +134,21 @@ def GenHeaderCommentSection(Abstract, Description, Copyright, License, IsBinaryH
         Content += CommChar * 2 + TAB_SPACE_SPLIT + TAB_BINARY_HEADER_COMMENT + '\r\n'
     elif CommChar == TAB_COMMENT_EDK1_SPLIT:
         Content += CommChar + TAB_SPACE_SPLIT + TAB_COMMENT_EDK1_START + TAB_STAR + TAB_SPACE_SPLIT +\
-         TAB_HEADER_COMMENT + '\r\n'
+            TAB_HEADER_COMMENT + '\r\n'
     else:
         Content += CommChar * 2 + TAB_SPACE_SPLIT + TAB_HEADER_COMMENT + '\r\n'
     if Abstract:
         Abstract = Abstract.rstrip('\r\n')
-        Content += CommChar + TAB_SPACE_SPLIT + ('\r\n' + CommChar + TAB_SPACE_SPLIT).join(GetSplitValueList\
-                                                                                                (Abstract, '\n'))
+        Content += CommChar + TAB_SPACE_SPLIT + ('\r\n' + CommChar + TAB_SPACE_SPLIT).join(GetSplitValueList
+                                                                                           (Abstract, '\n'))
         Content += '\r\n' + CommChar + '\r\n'
     else:
         Content += CommChar + '\r\n'
 
     if Description:
         Description = Description.rstrip('\r\n')
-        Content += CommChar + TAB_SPACE_SPLIT + ('\r\n' + CommChar + TAB_SPACE_SPLIT).join(GetSplitValueList\
-                                                  (Description, '\n'))
+        Content += CommChar + TAB_SPACE_SPLIT + ('\r\n' + CommChar + TAB_SPACE_SPLIT).join(GetSplitValueList
+                                                                                           (Description, '\n'))
         Content += '\r\n' + CommChar + '\r\n'
 
     #
@@ -146,14 +156,14 @@ def GenHeaderCommentSection(Abstract, Description, Copyright, License, IsBinaryH
     #
     if Copyright:
         Copyright = Copyright.rstrip('\r\n')
-        Content += CommChar + TAB_SPACE_SPLIT + ('\r\n' + CommChar + TAB_SPACE_SPLIT).join\
-        (GetSplitValueList(Copyright, '\n'))
+        Content += CommChar + TAB_SPACE_SPLIT + \
+            ('\r\n' + CommChar + TAB_SPACE_SPLIT).join(GetSplitValueList(Copyright, '\n'))
         Content += '\r\n' + CommChar + '\r\n'
 
     if License:
         License = License.rstrip('\r\n')
-        Content += CommChar + TAB_SPACE_SPLIT + ('\r\n' + CommChar + TAB_SPACE_SPLIT).join(GetSplitValueList\
-                                                  (License, '\n'))
+        Content += CommChar + TAB_SPACE_SPLIT + ('\r\n' + CommChar + TAB_SPACE_SPLIT).join(GetSplitValueList
+                                                                                           (License, '\n'))
         Content += '\r\n' + CommChar + '\r\n'
 
     if CommChar == TAB_COMMENT_EDK1_SPLIT:
@@ -164,26 +174,28 @@ def GenHeaderCommentSection(Abstract, Description, Copyright, License, IsBinaryH
     return Content
 
 
-## GenInfPcdTailComment
+# GenInfPcdTailComment
 #  Generate Pcd tail comment for Inf, this would be one line comment
 #
 # @param Usage:            Usage type
 # @param TailCommentText:  Comment text for tail comment
 #
-def GenInfPcdTailComment (Usage, TailCommentText):
+def GenInfPcdTailComment(Usage, TailCommentText):
     if (Usage == ITEM_UNDEFINED) and (not TailCommentText):
         return ''
 
     CommentLine = TAB_SPACE_SPLIT.join([Usage, TailCommentText])
     return GenTailCommentLines(CommentLine)
 
-## GenInfProtocolPPITailComment
+# GenInfProtocolPPITailComment
 #  Generate Protocol/PPI tail comment for Inf
 #
 # @param Usage:            Usage type
 # @param TailCommentText:  Comment text for tail comment
 #
-def GenInfProtocolPPITailComment (Usage, Notify, TailCommentText):
+
+
+def GenInfProtocolPPITailComment(Usage, Notify, TailCommentText):
     if (not Notify) and (Usage == ITEM_UNDEFINED) and (not TailCommentText):
         return ''
 
@@ -195,16 +207,18 @@ def GenInfProtocolPPITailComment (Usage, Notify, TailCommentText):
     CommentLine += TAB_SPACE_SPLIT.join([Usage, TailCommentText])
     return GenTailCommentLines(CommentLine)
 
-## GenInfGuidTailComment
+# GenInfGuidTailComment
 #  Generate Guid tail comment for Inf
 #
 # @param Usage:            Usage type
 # @param TailCommentText:  Comment text for tail comment
 #
-def GenInfGuidTailComment (Usage, GuidTypeList, VariableName, TailCommentText):
+
+
+def GenInfGuidTailComment(Usage, GuidTypeList, VariableName, TailCommentText):
     GuidType = GuidTypeList[0]
     if (Usage == ITEM_UNDEFINED) and (GuidType == ITEM_UNDEFINED) and \
-        (not TailCommentText):
+            (not TailCommentText):
         return ''
 
     FirstLine = Usage + " ## " + GuidType
@@ -214,16 +228,18 @@ def GenInfGuidTailComment (Usage, GuidTypeList, VariableName, TailCommentText):
     CommentLine = TAB_SPACE_SPLIT.join([FirstLine, TailCommentText])
     return GenTailCommentLines(CommentLine)
 
-## GenDecGuidTailComment
+# GenDecGuidTailComment
 #
 # @param SupModuleList:  Supported module type list
 #
-def GenDecTailComment (SupModuleList):
+
+
+def GenDecTailComment(SupModuleList):
     CommentLine = TAB_SPACE_SPLIT.join(SupModuleList)
     return GenTailCommentLines(CommentLine)
 
 
-## _GetHelpStr
+# _GetHelpStr
 #  get HelpString from a list of HelpTextObject, the priority refer to
 #  related HLD
 #
diff --git a/BaseTools/Source/Python/UPT/Library/CommentParsing.py b/BaseTools/Source/Python/UPT/Library/CommentParsing.py
index 7ba9830d34ac..511b2861a8d7 100644
--- a/BaseTools/Source/Python/UPT/Library/CommentParsing.py
+++ b/BaseTools/Source/Python/UPT/Library/CommentParsing.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define comment parsing interface
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -44,7 +44,7 @@ from Logger.ToolError import FORMAT_INVALID
 from Logger.ToolError import FORMAT_NOT_SUPPORTED
 from Logger import StringTable as ST
 
-## ParseHeaderCommentSection
+# ParseHeaderCommentSection
 #
 # Parse Header comment section lines, extract Abstract, Description, Copyright
 # , License lines
@@ -52,7 +52,9 @@ from Logger import StringTable as ST
 # @param CommentList:   List of (Comment, LineNumber)
 # @param FileName:      FileName of the comment
 #
-def ParseHeaderCommentSection(CommentList, FileName = None, IsBinaryHeader = False):
+
+
+def ParseHeaderCommentSection(CommentList, FileName=None, IsBinaryHeader=False):
     Abstract = ''
     Description = ''
     Copyright = ''
@@ -79,7 +81,8 @@ def ParseHeaderCommentSection(CommentList, FileName = None, IsBinaryHeader = Fal
         LineNo = Item[1]
 
         if not Line.startswith(TAB_COMMENT_SPLIT) and Line:
-            Logger.Error("\nUPT", FORMAT_INVALID, ST.ERR_INVALID_COMMENT_FORMAT, FileName, Item[1])
+            Logger.Error("\nUPT", FORMAT_INVALID,
+                         ST.ERR_INVALID_COMMENT_FORMAT, FileName, Item[1])
         Comment = CleanString2(Line)[1]
         Comment = Comment.strip()
         #
@@ -87,7 +90,7 @@ def ParseHeaderCommentSection(CommentList, FileName = None, IsBinaryHeader = Fal
         # indication of different block; or in the position that Abstract should be, also keep it
         # as it indicates that no abstract
         #
-        if not Comment and HeaderCommentStage not in [HEADER_COMMENT_LICENSE, \
+        if not Comment and HeaderCommentStage not in [HEADER_COMMENT_LICENSE,
                                                       HEADER_COMMENT_DESCRIPTION, HEADER_COMMENT_ABSTRACT]:
             continue
 
@@ -105,7 +108,8 @@ def ParseHeaderCommentSection(CommentList, FileName = None, IsBinaryHeader = Fal
                     HeaderCommentStage = HEADER_COMMENT_DESCRIPTION
                 elif _IsCopyrightLine(Comment):
                     Result, ErrMsg = _ValidateCopyright(Comment)
-                    ValidateCopyright(Result, ST.WRN_INVALID_COPYRIGHT, FileName, LineNo, ErrMsg)
+                    ValidateCopyright(
+                        Result, ST.WRN_INVALID_COPYRIGHT, FileName, LineNo, ErrMsg)
                     Copyright += Comment + EndOfLine
                     HeaderCommentStage = HEADER_COMMENT_COPYRIGHT
                 else:
@@ -117,7 +121,8 @@ def ParseHeaderCommentSection(CommentList, FileName = None, IsBinaryHeader = Fal
                 #
                 if _IsCopyrightLine(Comment):
                     Result, ErrMsg = _ValidateCopyright(Comment)
-                    ValidateCopyright(Result, ST.WRN_INVALID_COPYRIGHT, FileName, LineNo, ErrMsg)
+                    ValidateCopyright(
+                        Result, ST.WRN_INVALID_COPYRIGHT, FileName, LineNo, ErrMsg)
                     Copyright += Comment + EndOfLine
                     HeaderCommentStage = HEADER_COMMENT_COPYRIGHT
                 else:
@@ -125,7 +130,8 @@ def ParseHeaderCommentSection(CommentList, FileName = None, IsBinaryHeader = Fal
             elif HeaderCommentStage == HEADER_COMMENT_COPYRIGHT:
                 if _IsCopyrightLine(Comment):
                     Result, ErrMsg = _ValidateCopyright(Comment)
-                    ValidateCopyright(Result, ST.WRN_INVALID_COPYRIGHT, FileName, LineNo, ErrMsg)
+                    ValidateCopyright(
+                        Result, ST.WRN_INVALID_COPYRIGHT, FileName, LineNo, ErrMsg)
                     Copyright += Comment + EndOfLine
                 else:
                     #
@@ -144,14 +150,16 @@ def ParseHeaderCommentSection(CommentList, FileName = None, IsBinaryHeader = Fal
 
     return Abstract.strip(), Description.strip(), Copyright.strip(), License.strip()
 
-## _IsCopyrightLine
+# _IsCopyrightLine
 # check whether current line is copyright line, the criteria is whether there is case insensitive keyword "Copyright"
 # followed by zero or more white space characters followed by a "(" character
 #
 # @param LineContent:  the line need to be checked
 # @return: True if current line is copyright line, False else
 #
-def _IsCopyrightLine (LineContent):
+
+
+def _IsCopyrightLine(LineContent):
     LineContent = LineContent.upper()
     Result = False
 
@@ -161,13 +169,15 @@ def _IsCopyrightLine (LineContent):
 
     return Result
 
-## ParseGenericComment
+# ParseGenericComment
 #
 # @param GenericComment: Generic comment list, element of
 #                        (CommentLine, LineNum)
 # @param ContainerFile:  Input value for filename of Dec file
 #
-def ParseGenericComment (GenericComment, ContainerFile=None, SkipTag=None):
+
+
+def ParseGenericComment(GenericComment, ContainerFile=None, SkipTag=None):
     if ContainerFile:
         pass
     HelpTxt = None
@@ -188,13 +198,15 @@ def ParseGenericComment (GenericComment, ContainerFile=None, SkipTag=None):
 
     return HelpTxt
 
-## ParsePcdErrorCode
+# ParsePcdErrorCode
 #
 # @param Value: original ErrorCode value
 # @param ContainerFile: Input value for filename of Dec file
 # @param LineNum: Line Num
 #
-def ParsePcdErrorCode (Value = None, ContainerFile = None, LineNum = None):
+
+
+def ParsePcdErrorCode(Value=None, ContainerFile=None, LineNum=None):
     try:
         if Value.strip().startswith((TAB_HEX_START, TAB_CAPHEX_START)):
             Base = 16
@@ -203,28 +215,30 @@ def ParsePcdErrorCode (Value = None, ContainerFile = None, LineNum = None):
         ErrorCode = int(Value, Base)
         if ErrorCode > PCD_ERR_CODE_MAX_SIZE or ErrorCode < 0:
             Logger.Error('Parser',
-                        FORMAT_NOT_SUPPORTED,
-                        "The format %s of ErrorCode is not valid, should be UNIT32 type or long type" % Value,
-                        File = ContainerFile,
-                        Line = LineNum)
+                         FORMAT_NOT_SUPPORTED,
+                         "The format %s of ErrorCode is not valid, should be UNIT32 type or long type" % Value,
+                         File=ContainerFile,
+                         Line=LineNum)
         ErrorCode = '0x%x' % ErrorCode
         return ErrorCode
     except ValueError as XStr:
         if XStr:
             pass
         Logger.Error('Parser',
-                    FORMAT_NOT_SUPPORTED,
-                    "The format %s of ErrorCode is not valid, should be UNIT32 type or long type" % Value,
-                    File = ContainerFile,
-                    Line = LineNum)
+                     FORMAT_NOT_SUPPORTED,
+                     "The format %s of ErrorCode is not valid, should be UNIT32 type or long type" % Value,
+                     File=ContainerFile,
+                     Line=LineNum)
 
-## ParseDecPcdGenericComment
+# ParseDecPcdGenericComment
 #
 # @param GenericComment: Generic comment list, element of (CommentLine,
 #                         LineNum)
 # @param ContainerFile:  Input value for filename of Dec file
 #
-def ParseDecPcdGenericComment (GenericComment, ContainerFile, TokenSpaceGuidCName, CName, MacroReplaceDict):
+
+
+def ParseDecPcdGenericComment(GenericComment, ContainerFile, TokenSpaceGuidCName, CName, MacroReplaceDict):
     HelpStr = ''
     PromptStr = ''
     PcdErr = None
@@ -239,19 +253,20 @@ def ParseDecPcdGenericComment (GenericComment, ContainerFile, TokenSpaceGuidCNam
         # To replace Macro
         #
         MACRO_PATTERN = '[\t\s]*\$\([A-Z][_A-Z0-9]*\)'
-        MatchedStrs =  re.findall(MACRO_PATTERN, Comment)
+        MatchedStrs = re.findall(MACRO_PATTERN, Comment)
         for MatchedStr in MatchedStrs:
             if MatchedStr:
                 Macro = MatchedStr.strip().lstrip('$(').rstrip(')').strip()
                 if Macro in MacroReplaceDict:
-                    Comment = Comment.replace(MatchedStr, MacroReplaceDict[Macro])
+                    Comment = Comment.replace(
+                        MatchedStr, MacroReplaceDict[Macro])
         if Comment.startswith(TAB_PCD_VALIDRANGE):
             if ValidValueNum > 0 or ExpressionNum > 0:
                 Logger.Error('Parser',
                              FORMAT_NOT_SUPPORTED,
                              ST.WRN_MULTI_PCD_RANGES,
-                             File = ContainerFile,
-                             Line = LineNum)
+                             File=ContainerFile,
+                             Line=LineNum)
             else:
                 PcdErr = PcdErrorObject()
                 PcdErr.SetTokenSpaceGuidCName(TokenSpaceGuidCName)
@@ -264,30 +279,32 @@ def ParseDecPcdGenericComment (GenericComment, ContainerFile, TokenSpaceGuidCNam
             if Valid:
                 ValueList = ValidRange.split(TAB_VALUE_SPLIT)
                 if len(ValueList) > 1:
-                    PcdErr.SetValidValueRange((TAB_VALUE_SPLIT.join(ValueList[1:])).strip())
-                    PcdErr.SetErrorNumber(ParsePcdErrorCode(ValueList[0], ContainerFile, LineNum))
+                    PcdErr.SetValidValueRange(
+                        (TAB_VALUE_SPLIT.join(ValueList[1:])).strip())
+                    PcdErr.SetErrorNumber(ParsePcdErrorCode(
+                        ValueList[0], ContainerFile, LineNum))
                 else:
                     PcdErr.SetValidValueRange(ValidRange)
                 PcdErrList.append(PcdErr)
             else:
                 Logger.Error("Parser",
-                         FORMAT_NOT_SUPPORTED,
-                         Cause,
-                         ContainerFile,
-                         LineNum)
+                             FORMAT_NOT_SUPPORTED,
+                             Cause,
+                             ContainerFile,
+                             LineNum)
         elif Comment.startswith(TAB_PCD_VALIDLIST):
             if ValidRangeNum > 0 or ExpressionNum > 0:
                 Logger.Error('Parser',
                              FORMAT_NOT_SUPPORTED,
                              ST.WRN_MULTI_PCD_RANGES,
-                             File = ContainerFile,
-                             Line = LineNum)
+                             File=ContainerFile,
+                             Line=LineNum)
             elif ValidValueNum > 0:
                 Logger.Error('Parser',
                              FORMAT_NOT_SUPPORTED,
                              ST.WRN_MULTI_PCD_VALIDVALUE,
-                             File = ContainerFile,
-                             Line = LineNum)
+                             File=ContainerFile,
+                             Line=LineNum)
             else:
                 PcdErr = PcdErrorObject()
                 PcdErr.SetTokenSpaceGuidCName(TokenSpaceGuidCName)
@@ -295,30 +312,34 @@ def ParseDecPcdGenericComment (GenericComment, ContainerFile, TokenSpaceGuidCNam
                 PcdErr.SetFileLine(Comment)
                 PcdErr.SetLineNum(LineNum)
                 ValidValueNum += 1
-                ValidValueExpr = Comment.replace(TAB_PCD_VALIDLIST, "", 1).strip()
+                ValidValueExpr = Comment.replace(
+                    TAB_PCD_VALIDLIST, "", 1).strip()
             Valid, Cause = _CheckListExpression(ValidValueExpr)
             if Valid:
-                ValidValue = Comment.replace(TAB_PCD_VALIDLIST, "", 1).replace(TAB_COMMA_SPLIT, TAB_SPACE_SPLIT)
+                ValidValue = Comment.replace(TAB_PCD_VALIDLIST, "", 1).replace(
+                    TAB_COMMA_SPLIT, TAB_SPACE_SPLIT)
                 ValueList = ValidValue.split(TAB_VALUE_SPLIT)
                 if len(ValueList) > 1:
-                    PcdErr.SetValidValue((TAB_VALUE_SPLIT.join(ValueList[1:])).strip())
-                    PcdErr.SetErrorNumber(ParsePcdErrorCode(ValueList[0], ContainerFile, LineNum))
+                    PcdErr.SetValidValue(
+                        (TAB_VALUE_SPLIT.join(ValueList[1:])).strip())
+                    PcdErr.SetErrorNumber(ParsePcdErrorCode(
+                        ValueList[0], ContainerFile, LineNum))
                 else:
                     PcdErr.SetValidValue(ValidValue)
                 PcdErrList.append(PcdErr)
             else:
                 Logger.Error("Parser",
-                         FORMAT_NOT_SUPPORTED,
-                         Cause,
-                         ContainerFile,
-                         LineNum)
+                             FORMAT_NOT_SUPPORTED,
+                             Cause,
+                             ContainerFile,
+                             LineNum)
         elif Comment.startswith(TAB_PCD_EXPRESSION):
             if ValidRangeNum > 0 or ValidValueNum > 0:
                 Logger.Error('Parser',
                              FORMAT_NOT_SUPPORTED,
                              ST.WRN_MULTI_PCD_RANGES,
-                             File = ContainerFile,
-                             Line = LineNum)
+                             File=ContainerFile,
+                             Line=LineNum)
             else:
                 PcdErr = PcdErrorObject()
                 PcdErr.SetTokenSpaceGuidCName(TokenSpaceGuidCName)
@@ -331,24 +352,26 @@ def ParseDecPcdGenericComment (GenericComment, ContainerFile, TokenSpaceGuidCNam
             if Valid:
                 ValueList = Expression.split(TAB_VALUE_SPLIT)
                 if len(ValueList) > 1:
-                    PcdErr.SetExpression((TAB_VALUE_SPLIT.join(ValueList[1:])).strip())
-                    PcdErr.SetErrorNumber(ParsePcdErrorCode(ValueList[0], ContainerFile, LineNum))
+                    PcdErr.SetExpression(
+                        (TAB_VALUE_SPLIT.join(ValueList[1:])).strip())
+                    PcdErr.SetErrorNumber(ParsePcdErrorCode(
+                        ValueList[0], ContainerFile, LineNum))
                 else:
                     PcdErr.SetExpression(Expression)
                 PcdErrList.append(PcdErr)
             else:
                 Logger.Error("Parser",
-                         FORMAT_NOT_SUPPORTED,
-                         Cause,
-                         ContainerFile,
-                         LineNum)
+                             FORMAT_NOT_SUPPORTED,
+                             Cause,
+                             ContainerFile,
+                             LineNum)
         elif Comment.startswith(TAB_PCD_PROMPT):
             if PromptStr:
                 Logger.Error('Parser',
                              FORMAT_NOT_SUPPORTED,
                              ST.WRN_MULTI_PCD_PROMPT,
-                             File = ContainerFile,
-                             Line = LineNum)
+                             File=ContainerFile,
+                             Line=LineNum)
             PromptStr = Comment.replace(TAB_PCD_PROMPT, "", 1).strip()
         else:
             if Comment:
@@ -363,14 +386,16 @@ def ParseDecPcdGenericComment (GenericComment, ContainerFile, TokenSpaceGuidCNam
 
     return HelpStr, PcdErrList, PromptStr
 
-## ParseDecPcdTailComment
+# ParseDecPcdTailComment
 #
 # @param TailCommentList:    Tail comment list of Pcd, item of format (Comment, LineNum)
 # @param ContainerFile:      Input value for filename of Dec file
 # @retVal SupModuleList:  The supported module type list detected
 # @retVal HelpStr:  The generic help text string detected
 #
-def ParseDecPcdTailComment (TailCommentList, ContainerFile):
+
+
+def ParseDecPcdTailComment(TailCommentList, ContainerFile):
     assert(len(TailCommentList) == 1)
     TailComment = TailCommentList[0][0]
     LineNum = TailCommentList[0][1]
@@ -399,7 +424,7 @@ def ParseDecPcdTailComment (TailCommentList, ContainerFile):
         elif Mod not in SUP_MODULE_LIST:
             Logger.Error("UPT",
                          FORMAT_INVALID,
-                         ST.WRN_INVALID_MODULE_TYPE%Mod,
+                         ST.WRN_INVALID_MODULE_TYPE % Mod,
                          ContainerFile,
                          LineNum)
         else:
@@ -407,10 +432,12 @@ def ParseDecPcdTailComment (TailCommentList, ContainerFile):
 
     return SupModuleList, HelpStr
 
-## _CheckListExpression
+# _CheckListExpression
 #
 # @param Expression: Pcd value list expression
 #
+
+
 def _CheckListExpression(Expression):
     ListExpr = ''
     if TAB_VALUE_SPLIT in Expression:
@@ -420,10 +447,12 @@ def _CheckListExpression(Expression):
 
     return IsValidListExpr(ListExpr)
 
-## _CheckExpression
+# _CheckExpression
 #
 # @param Expression: Pcd value expression
 #
+
+
 def _CheckExpression(Expression):
     Expr = ''
     if TAB_VALUE_SPLIT in Expression:
@@ -432,10 +461,12 @@ def _CheckExpression(Expression):
         Expr = Expression
     return IsValidLogicalExpr(Expr, True)
 
-## _CheckRangeExpression
+# _CheckRangeExpression
 #
 # @param Expression:    Pcd range expression
 #
+
+
 def _CheckRangeExpression(Expression):
     RangeExpr = ''
     if TAB_VALUE_SPLIT in Expression:
@@ -445,21 +476,25 @@ def _CheckRangeExpression(Expression):
 
     return IsValidRangeExpr(RangeExpr)
 
-## ValidateCopyright
+# ValidateCopyright
 #
 #
 #
+
+
 def ValidateCopyright(Result, ErrType, FileName, LineNo, ErrMsg):
     if not Result:
         Logger.Warn("\nUPT", ErrType, FileName, LineNo, ErrMsg)
 
-## _ValidateCopyright
+# _ValidateCopyright
 #
 # @param Line:    Line that contains copyright information, # stripped
 #
 # @retval Result: True if line is conformed to Spec format, False else
 # @retval ErrMsg: the detailed error description
 #
+
+
 def _ValidateCopyright(Line):
     if Line:
         pass
@@ -468,14 +503,16 @@ def _ValidateCopyright(Line):
 
     return Result, ErrMsg
 
-def GenerateTokenList (Comment):
+
+def GenerateTokenList(Comment):
     #
     # Tokenize Comment using '#' and ' ' as token separators
     #
     ReplacedComment = None
     while Comment != ReplacedComment:
         ReplacedComment = Comment
-        Comment = Comment.replace('##', '#').replace('  ', ' ').replace(' ', '#').strip('# ')
+        Comment = Comment.replace('##', '#').replace(
+            '  ', ' ').replace(' ', '#').strip('# ')
     return Comment.split('#')
 
 
@@ -485,7 +522,7 @@ def GenerateTokenList (Comment):
 # RemoveTokens  - A list of tokens to remove from help text
 # ParseVariable - True for parsing [Guids].  Otherwise False
 #
-def ParseComment (Comment, UsageTokens, TypeTokens, RemoveTokens, ParseVariable):
+def ParseComment(Comment, UsageTokens, TypeTokens, RemoveTokens, ParseVariable):
     #
     # Initialize return values
     #
@@ -503,7 +540,7 @@ def ParseComment (Comment, UsageTokens, TypeTokens, RemoveTokens, ParseVariable)
         #
         List = Comment.split(':', 1)
         if len(List) > 1:
-            SubList = GenerateTokenList (List[0].strip())
+            SubList = GenerateTokenList(List[0].strip())
             if len(SubList) in [1, 2] and SubList[-1] == 'Variable':
                 if List[1].strip().find('L"') == 0:
                     Comment = List[0].strip() + ':' + List[1].strip()
@@ -524,7 +561,7 @@ def ParseComment (Comment, UsageTokens, TypeTokens, RemoveTokens, ParseVariable)
                 String = Comment[Start:]
                 End = String[2:].find('"')
         if End >= 0:
-            SubList = GenerateTokenList (Comment[:Start])
+            SubList = GenerateTokenList(Comment[:Start])
             if len(SubList) < 2:
                 Comment = Comment[:Start] + String[End + 3:]
                 String = String[:End + 3]
@@ -540,7 +577,7 @@ def ParseComment (Comment, UsageTokens, TypeTokens, RemoveTokens, ParseVariable)
     #
     # Tokenize Comment using '#' and ' ' as token separators
     #
-    List = GenerateTokenList (Comment)
+    List = GenerateTokenList(Comment)
 
     #
     # Search first two tokens for Usage and Type and remove any matching tokens
diff --git a/BaseTools/Source/Python/UPT/Library/DataType.py b/BaseTools/Source/Python/UPT/Library/DataType.py
index 2033149aa6dc..a5e0bb154e33 100644
--- a/BaseTools/Source/Python/UPT/Library/DataType.py
+++ b/BaseTools/Source/Python/UPT/Library/DataType.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define class for data type structure
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -42,18 +42,18 @@ USAGE_LIST = ["CONSUMES",
               "SOMETIMES_PRODUCES"]
 
 TAB_LANGUAGE_EN_US = 'en-US'
-TAB_LANGUAGE_ENG   = 'eng'
-TAB_LANGUAGE_EN    = 'en'
-TAB_LANGUAGE_EN_X  = 'en-x-tianocore'
+TAB_LANGUAGE_ENG = 'eng'
+TAB_LANGUAGE_EN = 'en'
+TAB_LANGUAGE_EN_X = 'en-x-tianocore'
 
-USAGE_ITEM_PRODUCES           = 'PRODUCES'
+USAGE_ITEM_PRODUCES = 'PRODUCES'
 USAGE_ITEM_SOMETIMES_PRODUCES = 'SOMETIMES_PRODUCES'
-USAGE_ITEM_CONSUMES           = 'CONSUMES'
+USAGE_ITEM_CONSUMES = 'CONSUMES'
 USAGE_ITEM_SOMETIMES_CONSUMES = 'SOMETIMES_CONSUMES'
-USAGE_ITEM_TO_START           = 'TO_START'
-USAGE_ITEM_BY_START           = 'BY_START'
-USAGE_ITEM_NOTIFY             = 'NOTIFY'
-USAGE_ITEM_UNDEFINED          = 'UNDEFINED'
+USAGE_ITEM_TO_START = 'TO_START'
+USAGE_ITEM_BY_START = 'BY_START'
+USAGE_ITEM_NOTIFY = 'NOTIFY'
+USAGE_ITEM_UNDEFINED = 'UNDEFINED'
 
 USAGE_CONSUMES_LIST = [USAGE_ITEM_CONSUMES,
                        'CONSUMED',
@@ -90,103 +90,103 @@ TAB_STR_TOKENERR = 'ERR'
 # Dictionary of usage tokens and their synonyms
 #
 ALL_USAGE_TOKENS = {
-  "PRODUCES"           : "PRODUCES",
-  "PRODUCED"           : "PRODUCES",
-  "ALWAYS_PRODUCES"    : "PRODUCES",
-  "ALWAYS_PRODUCED"    : "PRODUCES",
-  "SOMETIMES_PRODUCES" : "SOMETIMES_PRODUCES",
-  "SOMETIMES_PRODUCED" : "SOMETIMES_PRODUCES",
-  "CONSUMES"           : "CONSUMES",
-  "CONSUMED"           : "CONSUMES",
-  "ALWAYS_CONSUMES"    : "CONSUMES",
-  "ALWAYS_CONSUMED"    : "CONSUMES",
-  "SOMETIMES_CONSUMES" : "SOMETIMES_CONSUMES",
-  "SOMETIMES_CONSUMED" : "SOMETIMES_CONSUMES",
-  "SOMETIME_CONSUMES"  : "SOMETIMES_CONSUMES",
-  "UNDEFINED"          : "UNDEFINED"
-  }
+    "PRODUCES": "PRODUCES",
+    "PRODUCED": "PRODUCES",
+    "ALWAYS_PRODUCES": "PRODUCES",
+    "ALWAYS_PRODUCED": "PRODUCES",
+    "SOMETIMES_PRODUCES": "SOMETIMES_PRODUCES",
+    "SOMETIMES_PRODUCED": "SOMETIMES_PRODUCES",
+    "CONSUMES": "CONSUMES",
+    "CONSUMED": "CONSUMES",
+    "ALWAYS_CONSUMES": "CONSUMES",
+    "ALWAYS_CONSUMED": "CONSUMES",
+    "SOMETIMES_CONSUMES": "SOMETIMES_CONSUMES",
+    "SOMETIMES_CONSUMED": "SOMETIMES_CONSUMES",
+    "SOMETIME_CONSUMES": "SOMETIMES_CONSUMES",
+    "UNDEFINED": "UNDEFINED"
+}
 
 PROTOCOL_USAGE_TOKENS = {
-  "TO_START"           : "TO_START",
-  "BY_START"           : "BY_START"
-  }
+    "TO_START": "TO_START",
+    "BY_START": "BY_START"
+}
 
-PROTOCOL_USAGE_TOKENS.update (ALL_USAGE_TOKENS)
+PROTOCOL_USAGE_TOKENS.update(ALL_USAGE_TOKENS)
 
 #
 # Dictionary of GUID type tokens
 #
 GUID_TYPE_TOKENS = {
-  "Event"          : "Event",
-  "File"           : "File",
-  "FV"             : "FV",
-  "GUID"           : "GUID",
-  "Guid"           : "GUID",
-  "HII"            : "HII",
-  "HOB"            : "HOB",
-  "Hob"            : "HOB",
-  "Hob:"           : "HOB",
-  "SystemTable"    : "SystemTable",
-  "TokenSpaceGuid" : "TokenSpaceGuid",
-  "UNDEFINED"      : "UNDEFINED"
-  }
+    "Event": "Event",
+    "File": "File",
+    "FV": "FV",
+    "GUID": "GUID",
+    "Guid": "GUID",
+    "HII": "HII",
+    "HOB": "HOB",
+    "Hob": "HOB",
+    "Hob:": "HOB",
+    "SystemTable": "SystemTable",
+    "TokenSpaceGuid": "TokenSpaceGuid",
+    "UNDEFINED": "UNDEFINED"
+}
 
 #
 # Dictionary of Protocol Notify tokens and their synonyms
 #
 PROTOCOL_NOTIFY_TOKENS = {
-  "NOTIFY"          : "NOTIFY",
-  "PROTOCOL_NOTIFY" : "NOTIFY",
-  "UNDEFINED"       : "UNDEFINED"
-  }
+    "NOTIFY": "NOTIFY",
+    "PROTOCOL_NOTIFY": "NOTIFY",
+    "UNDEFINED": "UNDEFINED"
+}
 
 #
 # Dictionary of PPI Notify tokens and their synonyms
 #
 PPI_NOTIFY_TOKENS = {
-  "NOTIFY"     : "NOTIFY",
-  "PPI_NOTIFY" : "NOTIFY",
-  "UNDEFINED"  : "UNDEFINED"
-  }
+    "NOTIFY": "NOTIFY",
+    "PPI_NOTIFY": "NOTIFY",
+    "UNDEFINED": "UNDEFINED"
+}
 
 EVENT_TOKENS = {
-  "EVENT_TYPE_PERIODIC_TIMER" : "EVENT_TYPE_PERIODIC_TIMER",
-  "EVENT_TYPE_RELATIVE_TIMER" : "EVENT_TYPE_RELATIVE_TIMER",
-  "UNDEFINED"                 : "UNDEFINED"
-  }
+    "EVENT_TYPE_PERIODIC_TIMER": "EVENT_TYPE_PERIODIC_TIMER",
+    "EVENT_TYPE_RELATIVE_TIMER": "EVENT_TYPE_RELATIVE_TIMER",
+    "UNDEFINED": "UNDEFINED"
+}
 
 BOOTMODE_TOKENS = {
-  "FULL"                  : "FULL",
-  "MINIMAL"               : "MINIMAL",
-  "NO_CHANGE"             : "NO_CHANGE",
-  "DIAGNOSTICS"           : "DIAGNOSTICS",
-  "DEFAULT"               : "DEFAULT",
-  "S2_RESUME"             : "S2_RESUME",
-  "S3_RESUME"             : "S3_RESUME",
-  "S4_RESUME"             : "S4_RESUME",
-  "S5_RESUME"             : "S5_RESUME",
-  "FLASH_UPDATE"          : "FLASH_UPDATE",
-  "RECOVERY_FULL"         : "RECOVERY_FULL",
-  "RECOVERY_MINIMAL"      : "RECOVERY_MINIMAL",
-  "RECOVERY_NO_CHANGE"    : "RECOVERY_NO_CHANGE",
-  "RECOVERY_DIAGNOSTICS"  : "RECOVERY_DIAGNOSTICS",
-  "RECOVERY_DEFAULT"      : "RECOVERY_DEFAULT",
-  "RECOVERY_S2_RESUME"    : "RECOVERY_S2_RESUME",
-  "RECOVERY_S3_RESUME"    : "RECOVERY_S3_RESUME",
-  "RECOVERY_S4_RESUME"    : "RECOVERY_S4_RESUME",
-  "RECOVERY_S5_RESUME"    : "RECOVERY_S5_RESUME",
-  "RECOVERY_FLASH_UPDATE" : "RECOVERY_FLASH_UPDATE",
-  "UNDEFINED"             : "UNDEFINED"
-  }
+    "FULL": "FULL",
+    "MINIMAL": "MINIMAL",
+    "NO_CHANGE": "NO_CHANGE",
+    "DIAGNOSTICS": "DIAGNOSTICS",
+    "DEFAULT": "DEFAULT",
+    "S2_RESUME": "S2_RESUME",
+    "S3_RESUME": "S3_RESUME",
+    "S4_RESUME": "S4_RESUME",
+    "S5_RESUME": "S5_RESUME",
+    "FLASH_UPDATE": "FLASH_UPDATE",
+    "RECOVERY_FULL": "RECOVERY_FULL",
+    "RECOVERY_MINIMAL": "RECOVERY_MINIMAL",
+    "RECOVERY_NO_CHANGE": "RECOVERY_NO_CHANGE",
+    "RECOVERY_DIAGNOSTICS": "RECOVERY_DIAGNOSTICS",
+    "RECOVERY_DEFAULT": "RECOVERY_DEFAULT",
+    "RECOVERY_S2_RESUME": "RECOVERY_S2_RESUME",
+    "RECOVERY_S3_RESUME": "RECOVERY_S3_RESUME",
+    "RECOVERY_S4_RESUME": "RECOVERY_S4_RESUME",
+    "RECOVERY_S5_RESUME": "RECOVERY_S5_RESUME",
+    "RECOVERY_FLASH_UPDATE": "RECOVERY_FLASH_UPDATE",
+    "UNDEFINED": "UNDEFINED"
+}
 
 HOB_TOKENS = {
-  "PHIT"                : "PHIT",
-  "MEMORY_ALLOCATION"   : "MEMORY_ALLOCATION",
-  "LOAD_PEIM"           : "LOAD_PEIM",
-  "RESOURCE_DESCRIPTOR" : "RESOURCE_DESCRIPTOR",
-  "FIRMWARE_VOLUME"     : "FIRMWARE_VOLUME",
-  "UNDEFINED"           : "UNDEFINED"
-  }
+    "PHIT": "PHIT",
+    "MEMORY_ALLOCATION": "MEMORY_ALLOCATION",
+    "LOAD_PEIM": "LOAD_PEIM",
+    "RESOURCE_DESCRIPTOR": "RESOURCE_DESCRIPTOR",
+    "FIRMWARE_VOLUME": "FIRMWARE_VOLUME",
+    "UNDEFINED": "UNDEFINED"
+}
 
 ##
 # Usage List Items for Protocol
@@ -260,12 +260,13 @@ GUID_TYPE_LIST = ["Event", "File", "FV", "GUID", "HII", "HOB",
 # PCD Usage Type List of Package
 #
 PCD_USAGE_TYPE_LIST_OF_PACKAGE = ["FeatureFlag", "PatchableInModule",
-                             "FixedAtBuild", "Dynamic", "DynamicEx"]
+                                  "FixedAtBuild", "Dynamic", "DynamicEx"]
 
 ##
 # PCD Usage Type List of Module
 #
-PCD_USAGE_TYPE_LIST_OF_MODULE = ["FEATUREPCD", "PATCHPCD", "FIXEDPCD", "PCD", "PCDEX"]
+PCD_USAGE_TYPE_LIST_OF_MODULE = ["FEATUREPCD",
+                                 "PATCHPCD", "FIXEDPCD", "PCD", "PCDEX"]
 ##
 # PCD Usage Type List of UPT
 #
@@ -282,14 +283,14 @@ BINARY_FILE_TYPE_LIST = ["PE32", "PIC", "TE", "DXE_DEPEX", "VER", "UI", "COMPAT1
                          "DISPOSABLE"
                          ]
 BINARY_FILE_TYPE_LIST_IN_UDP = \
-                        ["GUID", "FREEFORM",
-                         "UEFI_IMAGE", "PE32", "PIC",
-                         "PEI_DEPEX",
-                         "DXE_DEPEX",
-                         "SMM_DEPEX",
-                         "FV", "TE",
-                         "BIN", "VER", "UI"
-                         ]
+    ["GUID", "FREEFORM",
+     "UEFI_IMAGE", "PE32", "PIC",
+     "PEI_DEPEX",
+     "DXE_DEPEX",
+     "SMM_DEPEX",
+     "FV", "TE",
+     "BIN", "VER", "UI"
+     ]
 
 SUBTYPE_GUID_BINARY_FILE_TYPE = "FREEFORM"
 ##
@@ -300,28 +301,28 @@ SUBTYPE_GUID_BINARY_FILE_TYPE = "FREEFORM"
 # are required. The COMPONENT_TYPE definition is case sensitive.
 #
 COMPONENT_TYPE_LIST = [
-                       "APPLICATION",
-                       "ACPITABLE",
-                       "APRIORI",
-                       "BINARY",
-                       "BS_DRIVER",
-                       "CONFIG",
-                       "FILE",
-                       "FVIMAGEFILE",
-                       "LIBRARY",
-                       "LOGO",
-                       "LEGACY16",
-                       "MICROCODE",
-                       "PE32_PEIM",
-                       "PEI_CORE",
-                       "RAWFILE",
-                       "RT_DRIVER",
-                       "SAL_RT_DRIVER",
-                       "SECURITY_CORE",
-                       "COMBINED_PEIM_DRIVER",
-                       "PIC_PEIM",
-                       "RELOCATABLE_PEIM"
-                       ]
+    "APPLICATION",
+    "ACPITABLE",
+    "APRIORI",
+    "BINARY",
+    "BS_DRIVER",
+    "CONFIG",
+    "FILE",
+    "FVIMAGEFILE",
+    "LIBRARY",
+    "LOGO",
+    "LEGACY16",
+    "MICROCODE",
+    "PE32_PEIM",
+    "PEI_CORE",
+    "RAWFILE",
+    "RT_DRIVER",
+    "SAL_RT_DRIVER",
+    "SECURITY_CORE",
+    "COMBINED_PEIM_DRIVER",
+    "PIC_PEIM",
+    "RELOCATABLE_PEIM"
+]
 
 ##
 # Common Definitions
@@ -343,7 +344,7 @@ TAB_COLON_SPLIT = ':'
 TAB_SECTION_START = '['
 TAB_SECTION_END = ']'
 TAB_OPTION_START = '<'
-TAB_OPTION_END  = '>'
+TAB_OPTION_END = '>'
 TAB_SLASH = '\\'
 TAB_BACK_SLASH = '/'
 TAB_SPECIAL_COMMENT = '##'
@@ -370,7 +371,7 @@ TAB_ARCH_ARM = 'ARM'
 TAB_ARCH_EBC = 'EBC'
 
 ARCH_LIST = \
-[TAB_ARCH_IA32, TAB_ARCH_X64, TAB_ARCH_IPF, TAB_ARCH_ARM, TAB_ARCH_EBC]
+    [TAB_ARCH_IA32, TAB_ARCH_X64, TAB_ARCH_IPF, TAB_ARCH_ARM, TAB_ARCH_EBC]
 
 SUP_MODULE_BASE = 'BASE'
 SUP_MODULE_SEC = 'SEC'
@@ -387,12 +388,12 @@ SUP_MODULE_USER_DEFINED = 'USER_DEFINED'
 SUP_MODULE_SMM_CORE = 'SMM_CORE'
 
 SUP_MODULE_LIST = \
-[SUP_MODULE_BASE, SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM, \
-SUP_MODULE_DXE_CORE, SUP_MODULE_DXE_DRIVER, \
-                   SUP_MODULE_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_SAL_DRIVER, \
-                   SUP_MODULE_DXE_SMM_DRIVER, SUP_MODULE_UEFI_DRIVER, \
-                   SUP_MODULE_UEFI_APPLICATION, SUP_MODULE_USER_DEFINED, \
-                   SUP_MODULE_SMM_CORE]
+    [SUP_MODULE_BASE, SUP_MODULE_SEC, SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM,
+     SUP_MODULE_DXE_CORE, SUP_MODULE_DXE_DRIVER,
+     SUP_MODULE_DXE_RUNTIME_DRIVER, SUP_MODULE_DXE_SAL_DRIVER,
+     SUP_MODULE_DXE_SMM_DRIVER, SUP_MODULE_UEFI_DRIVER,
+     SUP_MODULE_UEFI_APPLICATION, SUP_MODULE_USER_DEFINED,
+     SUP_MODULE_SMM_CORE]
 SUP_MODULE_LIST_STRING = TAB_VALUE_SPLIT.join(l for l in SUP_MODULE_LIST)
 
 EDK_COMPONENT_TYPE_LIBRARY = 'LIBRARY'
@@ -405,7 +406,7 @@ EDK_COMPONENT_TYPE_BS_DRIVER = 'BS_DRIVER'
 EDK_COMPONENT_TYPE_RT_DRIVER = 'RT_DRIVER'
 EDK_COMPONENT_TYPE_SAL_RT_DRIVER = 'SAL_RT_DRIVER'
 EDK_COMPONENT_TYPE_APPLICATION = 'APPLICATION'
-EDK_NAME   = 'EDK'
+EDK_NAME = 'EDK'
 EDKII_NAME = 'EDKII'
 
 BINARY_FILE_TYPE_FW = 'FW'
@@ -432,9 +433,9 @@ BINARY_FILE_TYPE_UI_LIST = [BINARY_FILE_TYPE_UNI_UI,
                             BINARY_FILE_TYPE_UI
                             ]
 BINARY_FILE_TYPE_VER_LIST = [BINARY_FILE_TYPE_UNI_VER,
-                            BINARY_FILE_TYPE_SEC_VER,
-                            BINARY_FILE_TYPE_VER
-                            ]
+                             BINARY_FILE_TYPE_SEC_VER,
+                             BINARY_FILE_TYPE_VER
+                             ]
 
 DEPEX_SECTION_LIST = ['<PEI_DEPEX>',
                       '<DXE_DEPEX>',
@@ -527,117 +528,117 @@ TAB_PCDS_DYNAMIC_HII = 'DynamicHii'
 
 TAB_PTR_TYPE_PCD = 'VOID*'
 
-PCD_DYNAMIC_TYPE_LIST = [TAB_PCDS_DYNAMIC, TAB_PCDS_DYNAMIC_DEFAULT, \
+PCD_DYNAMIC_TYPE_LIST = [TAB_PCDS_DYNAMIC, TAB_PCDS_DYNAMIC_DEFAULT,
                          TAB_PCDS_DYNAMIC_VPD, TAB_PCDS_DYNAMIC_HII]
-PCD_DYNAMIC_EX_TYPE_LIST = [TAB_PCDS_DYNAMIC_EX, TAB_PCDS_DYNAMIC_EX_DEFAULT, \
+PCD_DYNAMIC_EX_TYPE_LIST = [TAB_PCDS_DYNAMIC_EX, TAB_PCDS_DYNAMIC_EX_DEFAULT,
                             TAB_PCDS_DYNAMIC_EX_VPD, TAB_PCDS_DYNAMIC_EX_HII]
 
-## Dynamic-ex PCD types
+# Dynamic-ex PCD types
 #
-gDYNAMIC_EX_PCD = [TAB_PCDS_DYNAMIC_EX, TAB_PCDS_DYNAMIC_EX_DEFAULT, \
-                 TAB_PCDS_DYNAMIC_EX_VPD, TAB_PCDS_DYNAMIC_EX_HII]
+gDYNAMIC_EX_PCD = [TAB_PCDS_DYNAMIC_EX, TAB_PCDS_DYNAMIC_EX_DEFAULT,
+                   TAB_PCDS_DYNAMIC_EX_VPD, TAB_PCDS_DYNAMIC_EX_HII]
 
 TAB_PCDS_FIXED_AT_BUILD_NULL = TAB_PCDS + TAB_PCDS_FIXED_AT_BUILD
 TAB_PCDS_FIXED_AT_BUILD_COMMON = TAB_PCDS + TAB_PCDS_FIXED_AT_BUILD + \
-TAB_SPLIT + TAB_ARCH_COMMON
+    TAB_SPLIT + TAB_ARCH_COMMON
 TAB_PCDS_FIXED_AT_BUILD_IA32 = TAB_PCDS + TAB_PCDS_FIXED_AT_BUILD + \
-TAB_SPLIT + TAB_ARCH_IA32
+    TAB_SPLIT + TAB_ARCH_IA32
 TAB_PCDS_FIXED_AT_BUILD_X64 = TAB_PCDS + TAB_PCDS_FIXED_AT_BUILD + \
-TAB_SPLIT + TAB_ARCH_X64
+    TAB_SPLIT + TAB_ARCH_X64
 TAB_PCDS_FIXED_AT_BUILD_IPF = TAB_PCDS + TAB_PCDS_FIXED_AT_BUILD + \
-TAB_SPLIT + TAB_ARCH_IPF
+    TAB_SPLIT + TAB_ARCH_IPF
 TAB_PCDS_FIXED_AT_BUILD_ARM = TAB_PCDS + TAB_PCDS_FIXED_AT_BUILD + \
-TAB_SPLIT + TAB_ARCH_ARM
+    TAB_SPLIT + TAB_ARCH_ARM
 TAB_PCDS_FIXED_AT_BUILD_EBC = TAB_PCDS + TAB_PCDS_FIXED_AT_BUILD + \
-TAB_SPLIT + TAB_ARCH_EBC
+    TAB_SPLIT + TAB_ARCH_EBC
 
 TAB_PCDS_PATCHABLE_IN_MODULE_NULL = TAB_PCDS + TAB_PCDS_PATCHABLE_IN_MODULE
 TAB_PCDS_PATCHABLE_IN_MODULE_COMMON = TAB_PCDS + TAB_PCDS_PATCHABLE_IN_MODULE \
-+ TAB_SPLIT + TAB_ARCH_COMMON
+    + TAB_SPLIT + TAB_ARCH_COMMON
 TAB_PCDS_PATCHABLE_IN_MODULE_IA32 = TAB_PCDS + TAB_PCDS_PATCHABLE_IN_MODULE + \
-TAB_SPLIT + TAB_ARCH_IA32
+    TAB_SPLIT + TAB_ARCH_IA32
 TAB_PCDS_PATCHABLE_IN_MODULE_X64 = TAB_PCDS + TAB_PCDS_PATCHABLE_IN_MODULE + \
-TAB_SPLIT + TAB_ARCH_X64
+    TAB_SPLIT + TAB_ARCH_X64
 TAB_PCDS_PATCHABLE_IN_MODULE_IPF = TAB_PCDS + TAB_PCDS_PATCHABLE_IN_MODULE + \
-TAB_SPLIT + TAB_ARCH_IPF
+    TAB_SPLIT + TAB_ARCH_IPF
 TAB_PCDS_PATCHABLE_IN_MODULE_ARM = TAB_PCDS + TAB_PCDS_PATCHABLE_IN_MODULE + \
-TAB_SPLIT + TAB_ARCH_ARM
+    TAB_SPLIT + TAB_ARCH_ARM
 TAB_PCDS_PATCHABLE_IN_MODULE_EBC = TAB_PCDS + TAB_PCDS_PATCHABLE_IN_MODULE + \
-TAB_SPLIT + TAB_ARCH_EBC
+    TAB_SPLIT + TAB_ARCH_EBC
 
 TAB_PCDS_FEATURE_FLAG_NULL = TAB_PCDS + TAB_PCDS_FEATURE_FLAG
 TAB_PCDS_FEATURE_FLAG_COMMON = TAB_PCDS + TAB_PCDS_FEATURE_FLAG + TAB_SPLIT \
-+ TAB_ARCH_COMMON
+    + TAB_ARCH_COMMON
 TAB_PCDS_FEATURE_FLAG_IA32 = TAB_PCDS + TAB_PCDS_FEATURE_FLAG + TAB_SPLIT + \
-TAB_ARCH_IA32
+    TAB_ARCH_IA32
 TAB_PCDS_FEATURE_FLAG_X64 = TAB_PCDS + TAB_PCDS_FEATURE_FLAG + TAB_SPLIT + \
-TAB_ARCH_X64
+    TAB_ARCH_X64
 TAB_PCDS_FEATURE_FLAG_IPF = TAB_PCDS + TAB_PCDS_FEATURE_FLAG + TAB_SPLIT + \
-TAB_ARCH_IPF
+    TAB_ARCH_IPF
 TAB_PCDS_FEATURE_FLAG_ARM = TAB_PCDS + TAB_PCDS_FEATURE_FLAG + TAB_SPLIT + \
-TAB_ARCH_ARM
+    TAB_ARCH_ARM
 TAB_PCDS_FEATURE_FLAG_EBC = TAB_PCDS + TAB_PCDS_FEATURE_FLAG + TAB_SPLIT + \
-TAB_ARCH_EBC
+    TAB_ARCH_EBC
 
 TAB_PCDS_DYNAMIC_EX_NULL = TAB_PCDS + TAB_PCDS_DYNAMIC_EX
 TAB_PCDS_DYNAMIC_EX_DEFAULT_NULL = TAB_PCDS + TAB_PCDS_DYNAMIC_EX_DEFAULT
 TAB_PCDS_DYNAMIC_EX_HII_NULL = TAB_PCDS + TAB_PCDS_DYNAMIC_EX_HII
 TAB_PCDS_DYNAMIC_EX_VPD_NULL = TAB_PCDS + TAB_PCDS_DYNAMIC_EX_VPD
 TAB_PCDS_DYNAMIC_EX_COMMON = TAB_PCDS + TAB_PCDS_DYNAMIC_EX + TAB_SPLIT + \
-TAB_ARCH_COMMON
+    TAB_ARCH_COMMON
 TAB_PCDS_DYNAMIC_EX_IA32 = TAB_PCDS + TAB_PCDS_DYNAMIC_EX + TAB_SPLIT + \
-TAB_ARCH_IA32
+    TAB_ARCH_IA32
 TAB_PCDS_DYNAMIC_EX_X64 = TAB_PCDS + TAB_PCDS_DYNAMIC_EX + TAB_SPLIT + \
-TAB_ARCH_X64
+    TAB_ARCH_X64
 TAB_PCDS_DYNAMIC_EX_IPF = TAB_PCDS + TAB_PCDS_DYNAMIC_EX + TAB_SPLIT + \
-TAB_ARCH_IPF
+    TAB_ARCH_IPF
 TAB_PCDS_DYNAMIC_EX_ARM = TAB_PCDS + TAB_PCDS_DYNAMIC_EX + TAB_SPLIT + \
-TAB_ARCH_ARM
+    TAB_ARCH_ARM
 TAB_PCDS_DYNAMIC_EX_EBC = TAB_PCDS + TAB_PCDS_DYNAMIC_EX + TAB_SPLIT + \
-TAB_ARCH_EBC
+    TAB_ARCH_EBC
 
 TAB_PCDS_DYNAMIC_NULL = TAB_PCDS + TAB_PCDS_DYNAMIC
 TAB_PCDS_DYNAMIC_DEFAULT_NULL = TAB_PCDS + TAB_PCDS_DYNAMIC_DEFAULT
 TAB_PCDS_DYNAMIC_HII_NULL = TAB_PCDS + TAB_PCDS_DYNAMIC_HII
 TAB_PCDS_DYNAMIC_VPD_NULL = TAB_PCDS + TAB_PCDS_DYNAMIC_VPD
 TAB_PCDS_DYNAMIC_COMMON = TAB_PCDS + TAB_PCDS_DYNAMIC + TAB_SPLIT + \
-TAB_ARCH_COMMON
+    TAB_ARCH_COMMON
 TAB_PCDS_DYNAMIC_IA32 = TAB_PCDS + TAB_PCDS_DYNAMIC + TAB_SPLIT + TAB_ARCH_IA32
 TAB_PCDS_DYNAMIC_X64 = TAB_PCDS + TAB_PCDS_DYNAMIC + TAB_SPLIT + TAB_ARCH_X64
 TAB_PCDS_DYNAMIC_IPF = TAB_PCDS + TAB_PCDS_DYNAMIC + TAB_SPLIT + TAB_ARCH_IPF
 TAB_PCDS_DYNAMIC_ARM = TAB_PCDS + TAB_PCDS_DYNAMIC + TAB_SPLIT + TAB_ARCH_ARM
 TAB_PCDS_DYNAMIC_EBC = TAB_PCDS + TAB_PCDS_DYNAMIC + TAB_SPLIT + TAB_ARCH_EBC
 
-TAB_PCD_DYNAMIC_TYPE_LIST = [TAB_PCDS_DYNAMIC_DEFAULT_NULL, \
-                             TAB_PCDS_DYNAMIC_VPD_NULL, \
+TAB_PCD_DYNAMIC_TYPE_LIST = [TAB_PCDS_DYNAMIC_DEFAULT_NULL,
+                             TAB_PCDS_DYNAMIC_VPD_NULL,
                              TAB_PCDS_DYNAMIC_HII_NULL]
-TAB_PCD_DYNAMIC_EX_TYPE_LIST = [TAB_PCDS_DYNAMIC_EX_DEFAULT_NULL, \
-                                TAB_PCDS_DYNAMIC_EX_VPD_NULL, \
+TAB_PCD_DYNAMIC_EX_TYPE_LIST = [TAB_PCDS_DYNAMIC_EX_DEFAULT_NULL,
+                                TAB_PCDS_DYNAMIC_EX_VPD_NULL,
                                 TAB_PCDS_DYNAMIC_EX_HII_NULL]
 
 TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_PEI_PAGE_SIZE = \
-'PcdLoadFixAddressPeiCodePageNumber'
+    'PcdLoadFixAddressPeiCodePageNumber'
 TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_PEI_PAGE_SIZE_DATA_TYPE = 'UINT32'
 TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_DXE_PAGE_SIZE = \
-'PcdLoadFixAddressBootTimeCodePageNumber'
+    'PcdLoadFixAddressBootTimeCodePageNumber'
 TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_DXE_PAGE_SIZE_DATA_TYPE = 'UINT32'
 TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_RUNTIME_PAGE_SIZE = \
-'PcdLoadFixAddressRuntimeCodePageNumber'
+    'PcdLoadFixAddressRuntimeCodePageNumber'
 TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_RUNTIME_PAGE_SIZE_DATA_TYPE = 'UINT32'
 TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_SMM_PAGE_SIZE = \
-'PcdLoadFixAddressSmmCodePageNumber'
+    'PcdLoadFixAddressSmmCodePageNumber'
 TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_SMM_PAGE_SIZE_DATA_TYPE = 'UINT32'
 TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_LIST = \
-[TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_PEI_PAGE_SIZE, \
-TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_DXE_PAGE_SIZE, \
-TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_RUNTIME_PAGE_SIZE, \
-TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_SMM_PAGE_SIZE]
-PCD_SECTION_LIST = [TAB_PCDS_FIXED_AT_BUILD_NULL.upper(), \
-                    TAB_PCDS_PATCHABLE_IN_MODULE_NULL.upper(), \
-                    TAB_PCDS_FEATURE_FLAG_NULL.upper(), \
-                    TAB_PCDS_DYNAMIC_EX_NULL.upper(), \
+    [TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_PEI_PAGE_SIZE,
+     TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_DXE_PAGE_SIZE,
+     TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_RUNTIME_PAGE_SIZE,
+     TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_SMM_PAGE_SIZE]
+PCD_SECTION_LIST = [TAB_PCDS_FIXED_AT_BUILD_NULL.upper(),
+                    TAB_PCDS_PATCHABLE_IN_MODULE_NULL.upper(),
+                    TAB_PCDS_FEATURE_FLAG_NULL.upper(),
+                    TAB_PCDS_DYNAMIC_EX_NULL.upper(),
                     TAB_PCDS_DYNAMIC_NULL.upper()]
-INF_PCD_SECTION_LIST = ["FixedPcd".upper(), "FeaturePcd".upper(), \
+INF_PCD_SECTION_LIST = ["FixedPcd".upper(), "FeaturePcd".upper(),
                         "PatchPcd".upper(), "Pcd".upper(), "PcdEx".upper()]
 
 TAB_DEPEX = 'Depex'
@@ -692,7 +693,7 @@ TAB_INF_DEFINES_EFI_SPECIFICATION_VERSION = 'EFI_SPECIFICATION_VERSION'
 TAB_INF_DEFINES_UEFI_SPECIFICATION_VERSION = 'UEFI_SPECIFICATION_VERSION'
 TAB_INF_DEFINES_PI_SPECIFICATION_VERSION = 'PI_SPECIFICATION_VERSION'
 TAB_INF_DEFINES_EDK_RELEASE_VERSION = 'EDK_RELEASE_VERSION'
-TAB_INF_DEFINES_MODULE_UNI_FILE    = 'MODULE_UNI_FILE'
+TAB_INF_DEFINES_MODULE_UNI_FILE = 'MODULE_UNI_FILE'
 TAB_INF_DEFINES_BINARY_MODULE = 'BINARY_MODULE'
 TAB_INF_DEFINES_LIBRARY_CLASS = 'LIBRARY_CLASS'
 TAB_INF_DEFINES_COMPONENT_TYPE = 'COMPONENT_TYPE'
@@ -702,7 +703,7 @@ TAB_INF_DEFINES_BUILD_TYPE = 'BUILD_TYPE'
 TAB_INF_DEFINES_FFS_EXT = 'FFS_EXT'
 TAB_INF_DEFINES_FV_EXT = 'FV_EXT'
 TAB_INF_DEFINES_SOURCE_FV = 'SOURCE_FV'
-TAB_INF_DEFINES_PACKAGE   = 'PACKAGE'
+TAB_INF_DEFINES_PACKAGE = 'PACKAGE'
 TAB_INF_DEFINES_VERSION_NUMBER = 'VERSION_NUMBER'
 TAB_INF_DEFINES_VERSION = 'VERSION'
 TAB_INF_DEFINES_VERSION_STRING = 'VERSION_STRING'
@@ -712,11 +713,11 @@ TAB_INF_DEFINES_ENTRY_POINT = 'ENTRY_POINT'
 TAB_INF_DEFINES_UNLOAD_IMAGE = 'UNLOAD_IMAGE'
 TAB_INF_DEFINES_CONSTRUCTOR = 'CONSTRUCTOR'
 TAB_INF_DEFINES_DESTRUCTOR = 'DESTRUCTOR'
-TAB_INF_DEFINES_PCI_VENDOR_ID  = 'PCI_VENDOR_ID'
-TAB_INF_DEFINES_PCI_DEVICE_ID  = 'PCI_DEVICE_ID'
+TAB_INF_DEFINES_PCI_VENDOR_ID = 'PCI_VENDOR_ID'
+TAB_INF_DEFINES_PCI_DEVICE_ID = 'PCI_DEVICE_ID'
 TAB_INF_DEFINES_PCI_CLASS_CODE = 'PCI_CLASS_CODE'
-TAB_INF_DEFINES_PCI_REVISION   = 'PCI_REVISION'
-TAB_INF_DEFINES_PCI_COMPRESS   = 'PCI_COMPRESS'
+TAB_INF_DEFINES_PCI_REVISION = 'PCI_REVISION'
+TAB_INF_DEFINES_PCI_COMPRESS = 'PCI_COMPRESS'
 TAB_INF_DEFINES_DEFINE = 'DEFINE'
 TAB_INF_DEFINES_SPEC = 'SPEC'
 TAB_INF_DEFINES_UEFI_HII_RESOURCE_SECTION = 'UEFI_HII_RESOURCE_SECTION'
@@ -744,7 +745,7 @@ TAB_DEC_DEFINES_DEC_SPECIFICATION = 'DEC_SPECIFICATION'
 TAB_DEC_DEFINES_PACKAGE_NAME = 'PACKAGE_NAME'
 TAB_DEC_DEFINES_PACKAGE_GUID = 'PACKAGE_GUID'
 TAB_DEC_DEFINES_PACKAGE_VERSION = 'PACKAGE_VERSION'
-TAB_DEC_DEFINES_PKG_UNI_FILE    = 'PACKAGE_UNI_FILE'
+TAB_DEC_DEFINES_PKG_UNI_FILE = 'PACKAGE_UNI_FILE'
 TAB_DEC_PACKAGE_ABSTRACT = 'STR_PACKAGE_ABSTRACT'
 TAB_DEC_PACKAGE_DESCRIPTION = 'STR_PACKAGE_DESCRIPTION'
 TAB_DEC_PACKAGE_LICENSE = 'STR_PACKAGE_LICENSE'
@@ -829,12 +830,12 @@ DATABASE_PATH = ":memory:"
 #
 # used by ECC
 #
-MODIFIER_LIST = ['IN', 'OUT', 'OPTIONAL', 'UNALIGNED', 'EFI_RUNTIMESERVICE', \
+MODIFIER_LIST = ['IN', 'OUT', 'OPTIONAL', 'UNALIGNED', 'EFI_RUNTIMESERVICE',
                  'EFI_BOOTSERVICE', 'EFIAPI']
 #
 # Dependency Expression
 #
-DEPEX_SUPPORTED_OPCODE = ["BEFORE", "AFTER", "PUSH", "AND", "OR", "NOT", \
+DEPEX_SUPPORTED_OPCODE = ["BEFORE", "AFTER", "PUSH", "AND", "OR", "NOT",
                           "END", "SOR", "TRUE", "FALSE", '(', ')']
 
 TAB_STATIC_LIBRARY = "STATIC-LIBRARY-FILE"
@@ -851,13 +852,13 @@ TAB_DEFAULT_BINARY_FILE = "_BINARY_FILE_"
 # inf files
 #
 HEADER_COMMENT_NOT_STARTED = -1
-HEADER_COMMENT_STARTED     = 0
-HEADER_COMMENT_FILE        = 1
-HEADER_COMMENT_ABSTRACT    = 2
+HEADER_COMMENT_STARTED = 0
+HEADER_COMMENT_FILE = 1
+HEADER_COMMENT_ABSTRACT = 2
 HEADER_COMMENT_DESCRIPTION = 3
-HEADER_COMMENT_COPYRIGHT   = 4
-HEADER_COMMENT_LICENSE     = 5
-HEADER_COMMENT_END         = 6
+HEADER_COMMENT_COPYRIGHT = 4
+HEADER_COMMENT_LICENSE = 5
+HEADER_COMMENT_END = 6
 
 #
 # Static values for data models
diff --git a/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py b/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py
index 7718ca12e5cf..da2d2da73d73 100644
--- a/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py
+++ b/BaseTools/Source/Python/UPT/Library/ExpressionValidate.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to check PCD logical expression
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -16,13 +16,15 @@ from __future__ import print_function
 import re
 from Logger import StringTable as ST
 
-## IsValidBareCString
+# IsValidBareCString
 #
 # Check if String is comprised by whitespace(0x20), !(0x21), 0x23 - 0x7E
 # or '\n', '\t', '\f', '\r', '\b', '\0', '\\'
 #
 # @param String: string to be checked
 #
+
+
 def IsValidBareCString(String):
     EscapeList = ['n', 't', 'f', 'r', 'b', '0', '\\', '"']
     PreChar = ''
@@ -38,7 +40,7 @@ def IsValidBareCString(String):
         else:
             IntChar = ord(Char)
             if IntChar != 0x20 and IntChar != 0x09 and IntChar != 0x21 \
-                and (IntChar < 0x23 or IntChar > 0x7e):
+                    and (IntChar < 0x23 or IntChar > 0x7e):
                 return False
         PreChar = Char
 
@@ -47,6 +49,7 @@ def IsValidBareCString(String):
         return False
     return True
 
+
 def _ValidateToken(Token):
     Token = Token.strip()
     Index = Token.find("\"")
@@ -54,31 +57,36 @@ def _ValidateToken(Token):
         return IsValidBareCString(Token[Index+1:-1])
     return True
 
-## _ExprError
+# _ExprError
 #
 # @param      Exception:    Exception
 #
+
+
 class _ExprError(Exception):
-    def __init__(self, Error = ''):
+    def __init__(self, Error=''):
         Exception.__init__(self)
         self.Error = Error
 
-## _ExprBase
+# _ExprBase
 #
+
+
 class _ExprBase:
     HEX_PATTERN = '[\t\s]*0[xX][a-fA-F0-9]+'
     INT_PATTERN = '[\t\s]*[0-9]+'
     MACRO_PATTERN = '[\t\s]*\$\(([A-Z][_A-Z0-9]*)\)'
     PCD_PATTERN = \
-    '[\t\s]*[_a-zA-Z][a-zA-Z0-9_]*[\t\s]*\.[\t\s]*[_a-zA-Z][a-zA-Z0-9_]*'
+        '[\t\s]*[_a-zA-Z][a-zA-Z0-9_]*[\t\s]*\.[\t\s]*[_a-zA-Z][a-zA-Z0-9_]*'
     QUOTED_PATTERN = '[\t\s]*L?"[^"]*"'
     BOOL_PATTERN = '[\t\s]*(true|True|TRUE|false|False|FALSE)'
+
     def __init__(self, Token):
         self.Token = Token
         self.Index = 0
         self.Len = len(Token)
 
-    ## SkipWhitespace
+    # SkipWhitespace
     #
     def SkipWhitespace(self):
         for Char in self.Token[self.Index:]:
@@ -86,7 +94,7 @@ class _ExprBase:
                 break
             self.Index += 1
 
-    ## IsCurrentOp
+    # IsCurrentOp
     #
     # @param      OpList:   option list
     #
@@ -95,11 +103,11 @@ class _ExprBase:
         LetterOp = ["EQ", "NE", "GE", "LE", "GT", "LT", "NOT", "and", "AND",
                     "or", "OR", "XOR"]
         OpMap = {
-            '|' : '|',
-            '&' : '&',
-            '!' : '=',
-            '>' : '=',
-            '<' : '='
+            '|': '|',
+            '&': '&',
+            '!': '=',
+            '>': '=',
+            '<': '='
         }
 
         for Operator in OpList:
@@ -107,10 +115,10 @@ class _ExprBase:
                 continue
 
             self.Index += len(Operator)
-            Char = self.Token[self.Index : self.Index + 1]
+            Char = self.Token[self.Index: self.Index + 1]
 
             if (Operator in LetterOp and (Char == '_' or Char.isalnum())) \
-                or (Operator in OpMap and OpMap[Operator] == Char):
+                    or (Operator in OpMap and OpMap[Operator] == Char):
                 self.Index -= len(Operator)
                 break
 
@@ -118,10 +126,12 @@ class _ExprBase:
 
         return False
 
-## _LogicalExpressionParser
+# _LogicalExpressionParser
 #
 # @param      _ExprBase:   _ExprBase object
 #
+
+
 class _LogicalExpressionParser(_ExprBase):
     #
     # STRINGITEM can only be logical field according to spec
@@ -147,21 +157,21 @@ class _LogicalExpressionParser(_ExprBase):
         for Match in MatchList:
             if Match and Match.start() == 0:
                 if not _ValidateToken(
-                            self.Token[self.Index:self.Index+Match.end()]
-                        ):
+                    self.Token[self.Index:self.Index+Match.end()]
+                ):
                     return False
 
                 self.Index += Match.end()
                 if self.Token[self.Index - 1] == '"':
                     return True
                 if self.Token[self.Index:self.Index+1] == '_' or \
-                    self.Token[self.Index:self.Index+1].isalnum():
+                        self.Token[self.Index:self.Index+1].isalnum():
                     self.Index -= Match.end()
                     return False
 
                 Token = self.Token[self.Index - Match.end():self.Index]
                 if Token.strip() in ["EQ", "NE", "GE", "LE", "GT", "LT",
-                    "NOT", "and", "AND", "or", "OR", "XOR"]:
+                                     "NOT", "and", "AND", "or", "OR", "XOR"]:
                     self.Index -= Match.end()
                     return False
 
@@ -192,7 +202,6 @@ class _LogicalExpressionParser(_ExprBase):
 
         return self._CheckToken([Match1, Match2, Match3, Match4])
 
-
     def IsAtomicItem(self):
         #
         # Macro
@@ -208,18 +217,18 @@ class _LogicalExpressionParser(_ExprBase):
         # Quoted string
         #
         Match3 = re.compile(self.QUOTED_PATTERN).\
-            match(self.Token[self.Index:].replace('\\\\', '//').\
+            match(self.Token[self.Index:].replace('\\\\', '//').
                   replace('\\\"', '\\\''))
 
         return self._CheckToken([Match1, Match2, Match3])
 
-    ## A || B
+    # A || B
     #
     def LogicalExpression(self):
         Ret = self.SpecNot()
         while self.IsCurrentOp(['||', 'OR', 'or', '&&', 'AND', 'and', 'XOR', 'xor', '^']):
             if self.Token[self.Index-1] == '|' and self.Parens <= 0:
-                raise  _ExprError(ST.ERR_EXPR_OR % self.Token)
+                raise _ExprError(ST.ERR_EXPR_OR % self.Token)
             if Ret not in [self.ARITH, self.LOGICAL, self.REALLOGICAL, self.STRINGITEM]:
                 raise _ExprError(ST.ERR_EXPR_LOGICAL % self.Token)
             Ret = self.SpecNot()
@@ -253,7 +262,7 @@ class _LogicalExpressionParser(_ExprBase):
         Ret = self.Factor()
         while self.IsCurrentOp(["+", "-", "&", "|", "^", "XOR", "xor"]):
             if self.Token[self.Index-1] == '|' and self.Parens <= 0:
-                raise  _ExprError(ST.ERR_EXPR_OR)
+                raise _ExprError(ST.ERR_EXPR_OR)
             if Ret == self.STRINGITEM or Ret == self.REALLOGICAL:
                 raise _ExprError(ST.ERR_EXPR_LOGICAL % self.Token)
             Ret = self.Factor()
@@ -262,14 +271,14 @@ class _LogicalExpressionParser(_ExprBase):
             Ret = self.ARITH
         return Ret
 
-    ## Factor
+    # Factor
     #
     def Factor(self):
         if self.IsCurrentOp(["("]):
             self.Parens += 1
             Ret = self.LogicalExpression()
             if not self.IsCurrentOp([")"]):
-                raise _ExprError(ST.ERR_EXPR_RIGHT_PAREN % \
+                raise _ExprError(ST.ERR_EXPR_RIGHT_PAREN %
                                  (self.Token, self.Token[self.Index:]))
             self.Parens -= 1
             return Ret
@@ -281,10 +290,10 @@ class _LogicalExpressionParser(_ExprBase):
         elif self.IsAtomicNumVal():
             return self.ARITH
         else:
-            raise _ExprError(ST.ERR_EXPR_FACTOR % \
+            raise _ExprError(ST.ERR_EXPR_FACTOR %
                              (self.Token[self.Index:], self.Token))
 
-    ## IsValidLogicalExpression
+    # IsValidLogicalExpression
     #
     def IsValidLogicalExpression(self):
         if self.Len == 0:
@@ -296,16 +305,19 @@ class _LogicalExpressionParser(_ExprBase):
             return False, XExcept.Error
         self.SkipWhitespace()
         if self.Index != self.Len:
-            return False, (ST.ERR_EXPR_BOOLEAN % \
+            return False, (ST.ERR_EXPR_BOOLEAN %
                            (self.Token[self.Index:], self.Token))
         return True, ''
 
-## _ValidRangeExpressionParser
+# _ValidRangeExpressionParser
 #
+
+
 class _ValidRangeExpressionParser(_ExprBase):
     INT_RANGE_PATTERN = '[\t\s]*[0-9]+[\t\s]*-[\t\s]*[0-9]+'
     HEX_RANGE_PATTERN = \
         '[\t\s]*0[xX][a-fA-F0-9]+[\t\s]*-[\t\s]*0[xX][a-fA-F0-9]+'
+
     def __init__(self, Token):
         _ExprBase.__init__(self, Token)
         self.Parens = 0
@@ -314,7 +326,7 @@ class _ValidRangeExpressionParser(_ExprBase):
         self.IsParenHappen = False
         self.IsLogicalOpHappen = False
 
-    ## IsValidRangeExpression
+    # IsValidRangeExpression
     #
     def IsValidRangeExpression(self):
         if self.Len == 0:
@@ -330,7 +342,7 @@ class _ValidRangeExpressionParser(_ExprBase):
             return False, (ST.ERR_EXPR_RANGE % self.Token)
         return True, ''
 
-    ## RangeExpression
+    # RangeExpression
     #
     def RangeExpression(self):
         Ret = self.Unary()
@@ -346,7 +358,7 @@ class _ValidRangeExpressionParser(_ExprBase):
 
         return Ret
 
-    ## Unary
+    # Unary
     #
     def Unary(self):
         if self.IsCurrentOp(["NOT"]):
@@ -354,7 +366,7 @@ class _ValidRangeExpressionParser(_ExprBase):
 
         return self.ValidRange()
 
-    ## ValidRange
+    # ValidRange
     #
     def ValidRange(self):
         Ret = -1
@@ -363,7 +375,8 @@ class _ValidRangeExpressionParser(_ExprBase):
             self.IsParenHappen = True
             self.Parens += 1
             if self.Parens > 1:
-                raise _ExprError(ST.ERR_EXPR_RANGE_DOUBLE_PAREN_NESTED % self.Token)
+                raise _ExprError(
+                    ST.ERR_EXPR_RANGE_DOUBLE_PAREN_NESTED % self.Token)
             Ret = self.RangeExpression()
             if not self.IsCurrentOp([")"]):
                 raise _ExprError(ST.ERR_EXPR_RIGHT_PAREN % self.Token)
@@ -385,14 +398,15 @@ class _ValidRangeExpressionParser(_ExprBase):
                 self.Index += IntMatch.end()
                 Ret = self.INT
             else:
-                raise _ExprError(ST.ERR_EXPR_RANGE_FACTOR % (self.Token[self.Index:], self.Token))
+                raise _ExprError(ST.ERR_EXPR_RANGE_FACTOR %
+                                 (self.Token[self.Index:], self.Token))
         else:
             IntRangeMatch = re.compile(
                 self.INT_RANGE_PATTERN).match(self.Token[self.Index:]
-            )
+                                              )
             HexRangeMatch = re.compile(
                 self.HEX_RANGE_PATTERN).match(self.Token[self.Index:]
-            )
+                                              )
             if HexRangeMatch and HexRangeMatch.start() == 0:
                 self.Index += HexRangeMatch.end()
                 Ret = self.HEX
@@ -404,10 +418,13 @@ class _ValidRangeExpressionParser(_ExprBase):
 
         return Ret
 
-## _ValidListExpressionParser
+# _ValidListExpressionParser
 #
+
+
 class _ValidListExpressionParser(_ExprBase):
     VALID_LIST_PATTERN = '(0[xX][0-9a-fA-F]+|[0-9]+)([\t\s]*,[\t\s]*(0[xX][0-9a-fA-F]+|[0-9]+))*'
+
     def __init__(self, Token):
         _ExprBase.__init__(self, Token)
         self.NUM = 1
@@ -430,7 +447,8 @@ class _ValidListExpressionParser(_ExprBase):
     def ListExpression(self):
         Ret = -1
         self.SkipWhitespace()
-        ListMatch = re.compile(self.VALID_LIST_PATTERN).match(self.Token[self.Index:])
+        ListMatch = re.compile(self.VALID_LIST_PATTERN).match(
+            self.Token[self.Index:])
         if ListMatch and ListMatch.start() == 0:
             self.Index += ListMatch.end()
             Ret = self.NUM
@@ -439,13 +457,15 @@ class _ValidListExpressionParser(_ExprBase):
 
         return Ret
 
-## _StringTestParser
+# _StringTestParser
 #
+
+
 class _StringTestParser(_ExprBase):
     def __init__(self, Token):
         _ExprBase.__init__(self, Token)
 
-    ## IsValidStringTest
+    # IsValidStringTest
     #
     def IsValidStringTest(self):
         if self.Len == 0:
@@ -456,11 +476,11 @@ class _StringTestParser(_ExprBase):
             return False, XExcept.Error
         return True, ''
 
-    ## StringItem
+    # StringItem
     #
     def StringItem(self):
         Match1 = re.compile(self.QUOTED_PATTERN)\
-            .match(self.Token[self.Index:].replace('\\\\', '//')\
+            .match(self.Token[self.Index:].replace('\\\\', '//')
                    .replace('\\\"', '\\\''))
         Match2 = re.compile(self.MACRO_PATTERN).match(self.Token[self.Index:])
         Match3 = re.compile(self.PCD_PATTERN).match(self.Token[self.Index:])
@@ -468,30 +488,30 @@ class _StringTestParser(_ExprBase):
         for Match in MatchList:
             if Match and Match.start() == 0:
                 if not _ValidateToken(
-                            self.Token[self.Index:self.Index+Match.end()]
-                        ):
-                    raise _ExprError(ST.ERR_EXPR_STRING_ITEM % \
+                    self.Token[self.Index:self.Index+Match.end()]
+                ):
+                    raise _ExprError(ST.ERR_EXPR_STRING_ITEM %
                                      (self.Token, self.Token[self.Index:]))
                 self.Index += Match.end()
                 Token = self.Token[self.Index - Match.end():self.Index]
                 if Token.strip() in ["EQ", "NE"]:
-                    raise _ExprError(ST.ERR_EXPR_STRING_ITEM % \
-                             (self.Token, self.Token[self.Index:]))
+                    raise _ExprError(ST.ERR_EXPR_STRING_ITEM %
+                                     (self.Token, self.Token[self.Index:]))
                 return
         else:
-            raise _ExprError(ST.ERR_EXPR_STRING_ITEM % \
+            raise _ExprError(ST.ERR_EXPR_STRING_ITEM %
                              (self.Token, self.Token[self.Index:]))
 
-    ## StringTest
+    # StringTest
     #
     def StringTest(self):
         self.StringItem()
         if not self.IsCurrentOp(["==", "EQ", "!=", "NE"]):
-            raise _ExprError(ST.ERR_EXPR_EQUALITY % \
+            raise _ExprError(ST.ERR_EXPR_EQUALITY %
                              (self.Token[self.Index:], self.Token))
         self.StringItem()
         if self.Index != self.Len:
-            raise _ExprError(ST.ERR_EXPR_BOOLEAN % \
+            raise _ExprError(ST.ERR_EXPR_BOOLEAN %
                              (self.Token[self.Index:], self.Token))
 
 ##
@@ -499,6 +519,8 @@ class _StringTestParser(_ExprBase):
 #
 # @param Token: string test token
 #
+
+
 def IsValidStringTest(Token, Flag=False):
     #
     # Not do the check right now, keep the implementation for future enhancement.
@@ -526,6 +548,8 @@ def IsValidLogicalExpr(Token, Flag=False):
 #
 # @param Token: range expression token
 #
+
+
 def IsValidRangeExpr(Token):
     return _ValidRangeExpressionParser(Token).IsValidRangeExpression()
 
@@ -534,6 +558,8 @@ def IsValidRangeExpr(Token):
 #
 # @param Token: value list expression token
 #
+
+
 def IsValidListExpr(Token):
     return _ValidListExpressionParser(Token).IsValidListExpression()
 
@@ -542,6 +568,8 @@ def IsValidListExpr(Token):
 #
 # @param Token: feature flag expression
 #
+
+
 def IsValidFeatureFlagExp(Token, Flag=False):
     #
     # Not do the check right now, keep the implementation for future enhancement.
@@ -559,9 +587,8 @@ def IsValidFeatureFlagExp(Token, Flag=False):
             return False, Cause
         return True, ""
 
+
 if __name__ == '__main__':
-#    print IsValidRangeExpr('LT 9')
-    print(_LogicalExpressionParser('gCrownBayTokenSpaceGuid.PcdPciDevice1BridgeAddressLE0').IsValidLogicalExpression())
-
-
-
+    #    print IsValidRangeExpr('LT 9')
+    print(_LogicalExpressionParser(
+        'gCrownBayTokenSpaceGuid.PcdPciDevice1BridgeAddressLE0').IsValidLogicalExpression())
diff --git a/BaseTools/Source/Python/UPT/Library/GlobalData.py b/BaseTools/Source/Python/UPT/Library/GlobalData.py
index 8e88fb1e3fde..ae76425f20d9 100644
--- a/BaseTools/Source/Python/UPT/Library/GlobalData.py
+++ b/BaseTools/Source/Python/UPT/Library/GlobalData.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define common static strings and global data used by UPT
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
diff --git a/BaseTools/Source/Python/UPT/Library/Misc.py b/BaseTools/Source/Python/UPT/Library/Misc.py
index 77ba3584e000..f6880e02b450 100644
--- a/BaseTools/Source/Python/UPT/Library/Misc.py
+++ b/BaseTools/Source/Python/UPT/Library/Misc.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Common routines used by all tools
 #
 # Copyright (c) 2011 - 2019, Intel Corporation. All rights reserved.<BR>
@@ -46,11 +46,13 @@ from Object.POM.CommonObject import TextObject
 from Core.FileHook import __FileHookOpen__
 from Common.MultipleWorkspace import MultipleWorkspace as mws
 
-## Convert GUID string in xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx style to C
+# Convert GUID string in xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx style to C
 # structure style
 #
 # @param      Guid:    The GUID string
 #
+
+
 def GuidStringToGuidStructureString(Guid):
     GuidList = Guid.split('-')
     Result = '{'
@@ -62,18 +64,20 @@ def GuidStringToGuidStructureString(Guid):
     Result += '}}'
     return Result
 
-## Check whether GUID string is of format xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
+# Check whether GUID string is of format xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
 #
 # @param      GuidValue:   The GUID value
 #
+
+
 def CheckGuidRegFormat(GuidValue):
-    ## Regular expression used to find out register format of GUID
+    # Regular expression used to find out register format of GUID
     #
     RegFormatGuidPattern = re.compile("^\s*([0-9a-fA-F]){8}-"
-                                       "([0-9a-fA-F]){4}-"
-                                       "([0-9a-fA-F]){4}-"
-                                       "([0-9a-fA-F]){4}-"
-                                       "([0-9a-fA-F]){12}\s*$")
+                                      "([0-9a-fA-F]){4}-"
+                                      "([0-9a-fA-F]){4}-"
+                                      "([0-9a-fA-F]){4}-"
+                                      "([0-9a-fA-F]){12}\s*$")
 
     if RegFormatGuidPattern.match(GuidValue):
         return True
@@ -81,38 +85,40 @@ def CheckGuidRegFormat(GuidValue):
         return False
 
 
-## Convert GUID string in C structure style to
+# Convert GUID string in C structure style to
 # xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
 #
 # @param      GuidValue:   The GUID value in C structure format
 #
 def GuidStructureStringToGuidString(GuidValue):
     GuidValueString = GuidValue.lower().replace("{", "").replace("}", "").\
-    replace(" ", "").replace(";", "")
+        replace(" ", "").replace(";", "")
     GuidValueList = GuidValueString.split(",")
     if len(GuidValueList) != 11:
         return ''
     try:
         return "%08x-%04x-%04x-%02x%02x-%02x%02x%02x%02x%02x%02x" % (
-                int(GuidValueList[0], 16),
-                int(GuidValueList[1], 16),
-                int(GuidValueList[2], 16),
-                int(GuidValueList[3], 16),
-                int(GuidValueList[4], 16),
-                int(GuidValueList[5], 16),
-                int(GuidValueList[6], 16),
-                int(GuidValueList[7], 16),
-                int(GuidValueList[8], 16),
-                int(GuidValueList[9], 16),
-                int(GuidValueList[10], 16)
-                )
+            int(GuidValueList[0], 16),
+            int(GuidValueList[1], 16),
+            int(GuidValueList[2], 16),
+            int(GuidValueList[3], 16),
+            int(GuidValueList[4], 16),
+            int(GuidValueList[5], 16),
+            int(GuidValueList[6], 16),
+            int(GuidValueList[7], 16),
+            int(GuidValueList[8], 16),
+            int(GuidValueList[9], 16),
+            int(GuidValueList[10], 16)
+        )
     except BaseException:
         return ''
 
-## Create directories
+# Create directories
 #
 # @param      Directory:   The directory name
 #
+
+
 def CreateDirectory(Directory):
     if Directory is None or Directory.strip() == "":
         return True
@@ -123,13 +129,15 @@ def CreateDirectory(Directory):
         return False
     return True
 
-## Remove directories, including files and sub-directories in it
+# Remove directories, including files and sub-directories in it
 #
 # @param      Directory:   The directory name
 #
+
+
 def RemoveDirectory(Directory, Recursively=False):
     if Directory is None or Directory.strip() == "" or not \
-    os.path.exists(Directory):
+            os.path.exists(Directory):
         return
     if Recursively:
         CurrentDirectory = getcwd()
@@ -142,7 +150,7 @@ def RemoveDirectory(Directory, Recursively=False):
         chdir(CurrentDirectory)
     rmdir(Directory)
 
-## Store content in file
+# Store content in file
 #
 # This method is used to save file only when its content is changed. This is
 # quite useful for "make" system to decide what will be re-built and what
@@ -153,6 +161,8 @@ def RemoveDirectory(Directory, Recursively=False):
 # @param      IsBinaryFile:    The flag indicating if the file is binary file
 #                              or not
 #
+
+
 def SaveFileOnChange(File, Content, IsBinaryFile=True):
     if os.path.exists(File):
         if IsBinaryFile:
@@ -186,11 +196,13 @@ def SaveFileOnChange(File, Content, IsBinaryFile=True):
 
     return True
 
-## Get all files of a directory
+# Get all files of a directory
 #
 # @param Root:       Root dir
 # @param SkipList :  The files need be skipped
 #
+
+
 def GetFiles(Root, SkipList=None, FullPath=True):
     OriPath = os.path.normpath(Root)
     FileList = []
@@ -215,7 +227,7 @@ def GetFiles(Root, SkipList=None, FullPath=True):
 
     return FileList
 
-## Get all non-metadata files of a directory
+# Get all non-metadata files of a directory
 #
 # @param Root:       Root Dir
 # @param SkipList :  List of path need be skipped
@@ -223,6 +235,8 @@ def GetFiles(Root, SkipList=None, FullPath=True):
 # @param PrefixPath: the path that need to be added to the files found
 # @return: the list of files found
 #
+
+
 def GetNonMetaDataFiles(Root, SkipList, FullPath, PrefixPath):
     FileList = GetFiles(Root, SkipList, FullPath)
     NewFileList = []
@@ -232,15 +246,18 @@ def GetNonMetaDataFiles(Root, SkipList, FullPath, PrefixPath):
         # skip '.dec', '.inf', '.dsc', '.fdf' files
         #
         if ExtName.lower() not in ['.dec', '.inf', '.dsc', '.fdf']:
-            NewFileList.append(os.path.normpath(os.path.join(PrefixPath, File)))
+            NewFileList.append(os.path.normpath(
+                os.path.join(PrefixPath, File)))
 
     return NewFileList
 
-## Check if given file exists or not
+# Check if given file exists or not
 #
 # @param      File:    File name or path to be checked
 # @param      Dir:     The directory the file is relative to
 #
+
+
 def ValidFile(File, Ext=None):
     File = File.replace('\\', '/')
     if Ext is not None:
@@ -251,12 +268,14 @@ def ValidFile(File, Ext=None):
         return False
     return True
 
-## RealPath
+# RealPath
 #
 # @param      File:    File name or path to be checked
 # @param      Dir:     The directory the file is relative to
 # @param      OverrideDir:     The override directory
 #
+
+
 def RealPath(File, Dir='', OverrideDir=''):
     NewFile = os.path.normpath(os.path.join(Dir, File))
     NewFile = GlobalData.gALL_FILES[NewFile]
@@ -265,22 +284,24 @@ def RealPath(File, Dir='', OverrideDir=''):
         NewFile = GlobalData.gALL_FILES[NewFile]
     return NewFile
 
-## RealPath2
+# RealPath2
 #
 # @param      File:    File name or path to be checked
 # @param      Dir:     The directory the file is relative to
 # @param      OverrideDir:     The override directory
 #
+
+
 def RealPath2(File, Dir='', OverrideDir=''):
     if OverrideDir:
-        NewFile = GlobalData.gALL_FILES[os.path.normpath(os.path.join\
-                                                        (OverrideDir, File))]
+        NewFile = GlobalData.gALL_FILES[os.path.normpath(os.path.join
+                                                         (OverrideDir, File))]
         if NewFile:
             if OverrideDir[-1] == os.path.sep:
                 return NewFile[len(OverrideDir):], NewFile[0:len(OverrideDir)]
             else:
                 return NewFile[len(OverrideDir) + 1:], \
-            NewFile[0:len(OverrideDir)]
+                    NewFile[0:len(OverrideDir)]
 
     NewFile = GlobalData.gALL_FILES[os.path.normpath(os.path.join(Dir, File))]
     if NewFile:
@@ -294,10 +315,12 @@ def RealPath2(File, Dir='', OverrideDir=''):
 
     return None, None
 
-## CommonPath
+# CommonPath
 #
 # @param PathList: PathList
 #
+
+
 def CommonPath(PathList):
     Path1 = min(PathList).split(os.path.sep)
     Path2 = max(PathList).split(os.path.sep)
@@ -306,11 +329,13 @@ def CommonPath(PathList):
             return os.path.sep.join(Path1[:Index])
     return os.path.sep.join(Path1)
 
-## PathClass
+# PathClass
 #
+
+
 class PathClass(object):
     def __init__(self, File='', Root='', AlterRoot='', Type='', IsBinary=False,
-                 Arch='COMMON', ToolChainFamily='', Target='', TagName='', \
+                 Arch='COMMON', ToolChainFamily='', Target='', TagName='',
                  ToolCode=''):
         self.Arch = Arch
         self.File = str(File)
@@ -366,14 +391,14 @@ class PathClass(object):
 
         self._Key = None
 
-    ## Convert the object of this class to a string
+    # Convert the object of this class to a string
     #
     #  Convert member Path of the class to a string
     #
     def __str__(self):
         return self.Path
 
-    ## Override __eq__ function
+    # Override __eq__ function
     #
     # Check whether PathClass are the same
     #
@@ -383,27 +408,28 @@ class PathClass(object):
         else:
             return self.Path == str(Other)
 
-    ## Override __hash__ function
+    # Override __hash__ function
     #
     # Use Path as key in hash table
     #
     def __hash__(self):
         return hash(self.Path)
 
-    ## _GetFileKey
+    # _GetFileKey
     #
     def _GetFileKey(self):
         if self._Key is None:
             self._Key = self.Path.upper()
         return self._Key
-    ## Validate
+    # Validate
     #
+
     def Validate(self, Type='', CaseSensitive=True):
         if GlobalData.gCASE_INSENSITIVE:
             CaseSensitive = False
         if Type and Type.lower() != self.Type:
             return ToolError.FILE_TYPE_MISMATCH, '%s (expect %s but got %s)' % \
-        (self.File, Type, self.Type)
+                (self.File, Type, self.Type)
 
         RealFile, RealRoot = RealPath2(self.File, self.Root, self.AlterRoot)
         if not RealRoot and not RealFile:
@@ -417,12 +443,12 @@ class PathClass(object):
         ErrorCode = 0
         ErrorInfo = ''
         if RealRoot != self.Root or RealFile != self.File:
-            if CaseSensitive and (RealFile != self.File or \
-                                  (RealRoot != self.Root and RealRoot != \
+            if CaseSensitive and (RealFile != self.File or
+                                  (RealRoot != self.Root and RealRoot !=
                                    self.AlterRoot)):
                 ErrorCode = ToolError.FILE_CASE_MISMATCH
                 ErrorInfo = self.File + '\n\t' + RealFile + \
-                 " [in file system]"
+                    " [in file system]"
 
             self.SubDir, self.Name = os.path.split(RealFile)
             self.BaseName, self.Ext = os.path.splitext(self.Name)
@@ -437,10 +463,12 @@ class PathClass(object):
 
     Key = property(_GetFileKey)
 
-## Get current workspace
+# Get current workspace
 #
 #  get WORKSPACE from environment variable if present,if not use current working directory as WORKSPACE
 #
+
+
 def GetWorkspace():
     #
     # check WORKSPACE
@@ -463,7 +491,7 @@ def GetWorkspace():
 
     return WorkspaceDir, mws.PACKAGES_PATH
 
-## Get relative path
+# Get relative path
 #
 #  use full path and workspace to get relative path
 #  the destination of this function is mainly to resolve the root path issue(like c: or c:\)
@@ -471,22 +499,28 @@ def GetWorkspace():
 #  @param Fullpath: a string of fullpath
 #  @param Workspace: a string of workspace
 #
+
+
 def GetRelativePath(Fullpath, Workspace):
 
     RelativePath = ''
     if Workspace.endswith(os.sep):
-        RelativePath = Fullpath[Fullpath.upper().find(Workspace.upper())+len(Workspace):]
+        RelativePath = Fullpath[Fullpath.upper().find(
+            Workspace.upper())+len(Workspace):]
     else:
-        RelativePath = Fullpath[Fullpath.upper().find(Workspace.upper())+len(Workspace)+1:]
+        RelativePath = Fullpath[Fullpath.upper().find(
+            Workspace.upper())+len(Workspace)+1:]
 
     return RelativePath
 
-## Check whether all module types are in list
+# Check whether all module types are in list
 #
 # check whether all module types (SUP_MODULE_LIST) are in list
 #
 # @param ModuleList:  a list of ModuleType
 #
+
+
 def IsAllModuleList(ModuleList):
     NewModuleList = [Module.upper() for Module in ModuleList]
     for Module in SUP_MODULE_LIST:
@@ -495,12 +529,14 @@ def IsAllModuleList(ModuleList):
     else:
         return True
 
-## Dictionary that use comment(GenericComment, TailComment) as value,
+# Dictionary that use comment(GenericComment, TailComment) as value,
 # if a new comment which key already in the dic is inserted, then the
 # comment will be merged.
 # Key is (Statement, SupArch), when TailComment is added, it will ident
 # according to Statement
 #
+
+
 class MergeCommentDict(dict):
     ## []= operator
     #
@@ -509,18 +545,18 @@ class MergeCommentDict(dict):
         if Key in self:
             OrigVal1, OrigVal2 = dict.__getitem__(self, Key)
             Statement = Key[0]
-            dict.__setitem__(self, Key, (OrigVal1 + GenericComment, OrigVal2 \
+            dict.__setitem__(self, Key, (OrigVal1 + GenericComment, OrigVal2
                                          + len(Statement) * ' ' + TailComment))
         else:
             dict.__setitem__(self, Key, (GenericComment, TailComment))
 
-    ## =[] operator
+    # =[] operator
     #
     def __getitem__(self, Key):
         return dict.__getitem__(self, Key)
 
 
-## GenDummyHelpTextObj
+# GenDummyHelpTextObj
 #
 # @retval HelpTxt:   Generated dummy help text object
 #
@@ -530,7 +566,7 @@ def GenDummyHelpTextObj():
     HelpTxt.SetString(' ')
     return HelpTxt
 
-## ConvertVersionToDecimal, the minor version should be within 0 - 99
+# ConvertVersionToDecimal, the minor version should be within 0 - 99
 # <HexVersion>          ::=  "0x" <Major> <Minor>
 # <Major>               ::=  (a-fA-F0-9){4}
 # <Minor>               ::=  (a-fA-F0-9){4}
@@ -539,6 +575,8 @@ def GenDummyHelpTextObj():
 # @param StringIn:  The string contains version defined in INF file.
 #                   It can be Decimal or Hex
 #
+
+
 def ConvertVersionToDecimal(StringIn):
     if IsValidHexVersion(StringIn):
         Value = int(StringIn, 16)
@@ -559,12 +597,14 @@ def ConvertVersionToDecimal(StringIn):
             #
             return StringIn
 
-## GetHelpStringByRemoveHashKey
+# GetHelpStringByRemoveHashKey
 #
 # Remove hash key at the header of string and return the remain.
 #
 # @param String:  The string need to be processed.
 #
+
+
 def GetHelpStringByRemoveHashKey(String):
     ReturnString = ''
     PattenRemoveHashKey = re.compile(r"^[#+\s]+", re.DOTALL)
@@ -585,7 +625,7 @@ def GetHelpStringByRemoveHashKey(String):
 
     return ReturnString
 
-## ConvPathFromAbsToRel
+# ConvPathFromAbsToRel
 #
 # Get relative file path from absolute path.
 #
@@ -593,6 +633,8 @@ def GetHelpStringByRemoveHashKey(String):
 # @param Root:  The string contain the parent path of Path in.
 #
 #
+
+
 def ConvPathFromAbsToRel(Path, Root):
     Path = os.path.normpath(Path)
     Root = os.path.normpath(Root)
@@ -608,13 +650,15 @@ def ConvPathFromAbsToRel(Path, Root):
     else:
         return Path
 
-## ConvertPath
+# ConvertPath
 #
 # Convert special characters to '_', '\' to '/'
 # return converted path: Test!1.inf -> Test_1.inf
 #
 # @param Path: Path to be converted
 #
+
+
 def ConvertPath(Path):
     RetPath = ''
     for Char in Path.strip():
@@ -626,7 +670,7 @@ def ConvertPath(Path):
             RetPath = RetPath + '_'
     return RetPath
 
-## ConvertSpec
+# ConvertSpec
 #
 # during install, convert the Spec string extract from UPD into INF allowable definition,
 # the difference is period is allowed in the former (not the first letter) but not in the latter.
@@ -634,6 +678,8 @@ def ConvertPath(Path):
 #
 # @param SpecStr: SpecStr to be converted
 #
+
+
 def ConvertSpec(SpecStr):
     RetStr = ''
     for Char in SpecStr:
@@ -645,7 +691,7 @@ def ConvertSpec(SpecStr):
     return RetStr
 
 
-## IsEqualList
+# IsEqualList
 #
 # Judge two lists are identical(contain same item).
 # The rule is elements in List A are in List B and elements in List B are in List A.
@@ -669,7 +715,7 @@ def IsEqualList(ListA, ListB):
 
     return True
 
-## ConvertArchList
+# ConvertArchList
 #
 # Convert item in ArchList if the start character is lower case.
 # In UDP spec, Arch is only allowed as: [A-Z]([a-zA-Z0-9])*
@@ -678,6 +724,8 @@ def IsEqualList(ListA, ListB):
 #
 # @return NewList  The ArchList been converted.
 #
+
+
 def ConvertArchList(ArchList):
     NewArchList = []
     if not ArchList:
@@ -693,7 +741,7 @@ def ConvertArchList(ArchList):
 
     return NewArchList
 
-## ProcessLineExtender
+# ProcessLineExtender
 #
 # Process the LineExtender of Line in LineList.
 # If one line ends with a line extender, then it will be combined together with next line.
@@ -702,6 +750,8 @@ def ConvertArchList(ArchList):
 #
 # @return NewList  The ArchList been processed.
 #
+
+
 def ProcessLineExtender(LineList):
     NewList = []
     Count = 0
@@ -716,7 +766,7 @@ def ProcessLineExtender(LineList):
 
     return NewList
 
-## ProcessEdkComment
+# ProcessEdkComment
 #
 # Process EDK style comment in LineList: c style /* */ comment or cpp style // comment
 #
@@ -726,6 +776,8 @@ def ProcessLineExtender(LineList):
 # @return LineList  The LineList been processed.
 # @return FirstPos  Where Edk comment is first found, -1 if not found
 #
+
+
 def ProcessEdkComment(LineList):
     FindEdkBlockComment = False
     Count = 0
@@ -769,7 +821,7 @@ def ProcessEdkComment(LineList):
 
     return LineList, FirstPos
 
-## GetLibInstanceInfo
+# GetLibInstanceInfo
 #
 # Get the information from Library Instance INF file.
 #
@@ -777,6 +829,8 @@ def ProcessEdkComment(LineList):
 # @param WorkSpace. The WorkSpace directory used to combined with INF file path.
 #
 # @return GUID, Version
+
+
 def GetLibInstanceInfo(String, WorkSpace, LineNo):
 
     FileGuidString = ""
@@ -795,7 +849,8 @@ def GetLibInstanceInfo(String, WorkSpace, LineNo):
     #
     # Validate file name exist.
     #
-    FullFileName = os.path.normpath(os.path.realpath(os.path.join(WorkSpace, String)))
+    FullFileName = os.path.normpath(
+        os.path.realpath(os.path.join(WorkSpace, String)))
     if not (ValidFile(FullFileName)):
         Logger.Error("InfParser",
                      ToolError.FORMAT_INVALID,
@@ -812,7 +867,8 @@ def GetLibInstanceInfo(String, WorkSpace, LineNo):
     else:
         Logger.Error("InfParser",
                      ToolError.FORMAT_INVALID,
-                     ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID % (String),
+                     ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID % (
+                         String),
                      File=GlobalData.gINF_MODULE_NAME,
                      Line=LineNo,
                      ExtraData=OriginalString)
@@ -855,7 +911,7 @@ def GetLibInstanceInfo(String, WorkSpace, LineNo):
 
         return FileGuidString, VerString
 
-## GetLocalValue
+# GetLocalValue
 #
 # Generate the local value for INF and DEC file. If Lang attribute not present, then use this value.
 # If present, and there is no element without the Lang attribute, and one of the elements has the rfc1766 code is
@@ -867,6 +923,8 @@ def GetLibInstanceInfo(String, WorkSpace, LineNo):
 # @param UseFirstValue: True to use the first value, False to use the last value
 #
 # @return LocalValue
+
+
 def GetLocalValue(ValueList, UseFirstValue=False):
     Value1 = ''
     Value2 = ''
@@ -919,7 +977,7 @@ def GetLocalValue(ValueList, UseFirstValue=False):
     return ''
 
 
-## GetCharIndexOutStr
+# GetCharIndexOutStr
 #
 # Get comment character index outside a string
 #
@@ -941,19 +999,21 @@ def GetCharIndexOutStr(CommentCharacter, Line):
     for Index in range(0, len(Line)):
         if Line[Index] == '"':
             InString = not InString
-        elif Line[Index] == CommentCharacter and InString :
+        elif Line[Index] == CommentCharacter and InString:
             pass
-        elif Line[Index] == CommentCharacter and (Index +1) < len(Line) and Line[Index+1] == CommentCharacter \
-            and not InString :
+        elif Line[Index] == CommentCharacter and (Index + 1) < len(Line) and Line[Index+1] == CommentCharacter \
+                and not InString:
             return Index
     return -1
 
-## ValidateUNIFilePath
+# ValidateUNIFilePath
 #
 # Check the UNI file path
 #
 # @param FilePath: The UNI file path
 #
+
+
 def ValidateUNIFilePath(Path):
     Suffix = Path[Path.rfind(TAB_SPLIT):]
 
@@ -962,18 +1022,18 @@ def ValidateUNIFilePath(Path):
     #
     if Suffix not in TAB_UNI_FILE_SUFFIXS:
         Logger.Error("Unicode File Parser",
-                        ToolError.FORMAT_INVALID,
-                        Message=ST.ERR_UNI_FILE_SUFFIX_WRONG,
-                        ExtraData=Path)
+                     ToolError.FORMAT_INVALID,
+                     Message=ST.ERR_UNI_FILE_SUFFIX_WRONG,
+                     ExtraData=Path)
 
     #
     # Check if '..' in the file name(without suffix)
     #
     if (TAB_SPLIT + TAB_SPLIT) in Path:
         Logger.Error("Unicode File Parser",
-                        ToolError.FORMAT_INVALID,
-                        Message=ST.ERR_UNI_FILE_NAME_INVALID,
-                        ExtraData=Path)
+                     ToolError.FORMAT_INVALID,
+                     Message=ST.ERR_UNI_FILE_NAME_INVALID,
+                     ExtraData=Path)
 
     #
     # Check if the file name is valid according to the DEC and INF specification
@@ -983,7 +1043,6 @@ def ValidateUNIFilePath(Path):
     InvalidCh = re.sub(Pattern, '', FileName)
     if InvalidCh:
         Logger.Error("Unicode File Parser",
-                        ToolError.FORMAT_INVALID,
-                        Message=ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID,
-                        ExtraData=Path)
-
+                     ToolError.FORMAT_INVALID,
+                     Message=ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID,
+                     ExtraData=Path)
diff --git a/BaseTools/Source/Python/UPT/Library/ParserValidate.py b/BaseTools/Source/Python/UPT/Library/ParserValidate.py
index 62f406141cc6..202466adf5de 100644
--- a/BaseTools/Source/Python/UPT/Library/ParserValidate.py
+++ b/BaseTools/Source/Python/UPT/Library/ParserValidate.py
@@ -1,4 +1,4 @@
-## @file ParserValidate.py
+# @file ParserValidate.py
 # Functions for parser validation
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -23,26 +23,30 @@ from Library.ExpressionValidate import IsValidBareCString
 from Library.ExpressionValidate import IsValidFeatureFlagExp
 from Common.MultipleWorkspace import MultipleWorkspace as mws
 
-## __HexDigit() method
+# __HexDigit() method
 #
 # Whether char input is a Hex data bit
 #
 # @param  TempChar:    The char to test
 #
+
+
 def __HexDigit(TempChar):
     if (TempChar >= 'a' and TempChar <= 'f') or \
-    (TempChar >= 'A' and TempChar <= 'F') \
+        (TempChar >= 'A' and TempChar <= 'F') \
             or (TempChar >= '0' and TempChar <= '9'):
         return True
     else:
         return False
 
-## IsValidHex() method
+# IsValidHex() method
 #
 # Whether char input is a Hex data.
 #
 # @param  TempChar:    The char to test
 #
+
+
 def IsValidHex(HexStr):
     if not HexStr.upper().startswith("0X"):
         return False
@@ -52,7 +56,7 @@ def IsValidHex(HexStr):
     else:
         return False
 
-## Judge the input string is valid bool type or not.
+# Judge the input string is valid bool type or not.
 #
 # <TRUE>                  ::=  {"TRUE"} {"true"} {"True"} {"0x1"} {"0x01"}
 # <FALSE>                 ::=  {"FALSE"} {"false"} {"False"} {"0x0"} {"0x00"}
@@ -60,6 +64,8 @@ def IsValidHex(HexStr):
 #
 # @param    BoolString:    A string contained the value need to be judged.
 #
+
+
 def IsValidBoolType(BoolString):
     #
     # Valid True
@@ -74,10 +80,10 @@ def IsValidBoolType(BoolString):
     # Valid False
     #
     elif BoolString == 'FALSE' or \
-         BoolString == 'False' or \
-         BoolString == 'false' or \
-         BoolString == '0x0' or \
-         BoolString == '0x00':
+            BoolString == 'False' or \
+            BoolString == 'false' or \
+            BoolString == '0x0' or \
+            BoolString == '0x00':
         return True
     #
     # Invalid bool type
@@ -85,29 +91,35 @@ def IsValidBoolType(BoolString):
     else:
         return False
 
-## Is Valid Module Type List or not
+# Is Valid Module Type List or not
 #
 # @param      ModuleTypeList:  A list contain ModuleType strings need to be
 # judged.
 #
+
+
 def IsValidInfMoudleTypeList(ModuleTypeList):
     for ModuleType in ModuleTypeList:
         return IsValidInfMoudleType(ModuleType)
 
-## Is Valid Module Type or not
+# Is Valid Module Type or not
 #
 # @param      ModuleType:  A string contain ModuleType need to be judged.
 #
+
+
 def IsValidInfMoudleType(ModuleType):
     if ModuleType in MODULE_LIST:
         return True
     else:
         return False
 
-## Is Valid Component Type or not
+# Is Valid Component Type or not
 #
 # @param      ComponentType:  A string contain ComponentType need to be judged.
 #
+
+
 def IsValidInfComponentType(ComponentType):
     if ComponentType.upper() in COMPONENT_TYPE_LIST:
         return True
@@ -115,7 +127,7 @@ def IsValidInfComponentType(ComponentType):
         return False
 
 
-## Is valid Tool Family or not
+# Is valid Tool Family or not
 #
 # @param   ToolFamily:   A string contain Tool Family need to be judged.
 # Family := [A-Z]([a-zA-Z0-9])*
@@ -126,12 +138,14 @@ def IsValidToolFamily(ToolFamily):
         return False
     return True
 
-## Is valid Tool TagName or not
+# Is valid Tool TagName or not
 #
 # The TagName sample is MYTOOLS and VS2005.
 #
 # @param   TagName:   A string contain Tool TagName need to be judged.
 #
+
+
 def IsValidToolTagName(TagName):
     if TagName.strip() == '':
         return True
@@ -141,7 +155,7 @@ def IsValidToolTagName(TagName):
         return False
     return True
 
-## Is valid arch or not
+# Is valid arch or not
 #
 # @param Arch   The arch string need to be validated
 # <OA>                  ::=  (a-zA-Z)(A-Za-z0-9){0,}
@@ -149,6 +163,8 @@ def IsValidToolTagName(TagName):
 #                            {"common"}
 # @param   Arch:   Input arch
 #
+
+
 def IsValidArch(Arch):
     if Arch == 'common':
         return True
@@ -157,13 +173,15 @@ def IsValidArch(Arch):
         return False
     return True
 
-## Is valid family or not
+# Is valid family or not
 #
 # <Family>        ::=  {"MSFT"} {"GCC"} {"INTEL"} {<Usr>} {"*"}
 # <Usr>           ::=  [A-Z][A-Za-z0-9]{0,}
 #
 # @param family:   The family string need to be validated
 #
+
+
 def IsValidFamily(Family):
     Family = Family.strip()
     if Family == '*':
@@ -177,10 +195,12 @@ def IsValidFamily(Family):
         return False
     return True
 
-## Is valid build option name or not
+# Is valid build option name or not
 #
 # @param BuildOptionName:   The BuildOptionName string need to be validated
 #
+
+
 def IsValidBuildOptionName(BuildOptionName):
     if not BuildOptionName:
         return False
@@ -207,24 +227,28 @@ def IsValidBuildOptionName(BuildOptionName):
 
     return True
 
-## IsValidToken
+# IsValidToken
 #
 # Check if pattern string matches total token
 #
 # @param ReString:     regular string
 # @param Token:        Token to be matched
 #
+
+
 def IsValidToken(ReString, Token):
     Match = re.compile(ReString).match(Token)
     return Match and Match.start() == 0 and Match.end() == len(Token)
 
-## IsValidPath
+# IsValidPath
 #
 # Check if path exist
 #
 # @param Path: Absolute path or relative path to be checked
 # @param Root: Root path
 #
+
+
 def IsValidPath(Path, Root):
     Path = Path.strip()
     OrigPath = Path.replace('\\', '/')
@@ -269,7 +293,7 @@ def IsValidPath(Path, Root):
 
     return True
 
-## IsValidInstallPath
+# IsValidInstallPath
 #
 # Check if an install path valid or not.
 #
@@ -277,6 +301,8 @@ def IsValidPath(Path, Root):
 #
 # @param Path: path to be checked
 #
+
+
 def IsValidInstallPath(Path):
     if platform.platform().find("Windows") >= 0:
         if os.path.isabs(Path):
@@ -295,7 +321,7 @@ def IsValidInstallPath(Path):
     return True
 
 
-## IsValidCFormatGuid
+# IsValidCFormatGuid
 #
 # Check if GUID format has the from of {8,4,4,{2,2,2,2,2,2,2,2}}
 #
@@ -356,19 +382,21 @@ def IsValidCFormatGuid(Guid):
 
     return SepValue == '}}' and Value == ''
 
-## IsValidPcdType
+# IsValidPcdType
 #
 # Check whether the PCD type is valid
 #
 # @param PcdTypeString: The PcdType string need to be checked.
 #
+
+
 def IsValidPcdType(PcdTypeString):
     if PcdTypeString.upper() in PCD_USAGE_TYPE_LIST_OF_MODULE:
         return True
     else:
         return False
 
-## IsValidWord
+# IsValidWord
 #
 # Check whether the word is valid.
 # <Word>   ::=  (a-zA-Z0-9_)(a-zA-Z0-9_-){0,} Alphanumeric characters with
@@ -378,6 +406,8 @@ def IsValidPcdType(PcdTypeString):
 #
 # @param Word:  The word string need to be checked.
 #
+
+
 def IsValidWord(Word):
     if not Word:
         return False
@@ -404,7 +434,7 @@ def IsValidWord(Word):
     return True
 
 
-## IsValidSimpleWord
+# IsValidSimpleWord
 #
 # Check whether the SimpleWord is valid.
 # <SimpleWord>          ::=  (a-zA-Z0-9)(a-zA-Z0-9_-){0,}
@@ -424,13 +454,15 @@ def IsValidSimpleWord(Word):
 
     return True
 
-## IsValidDecVersion
+# IsValidDecVersion
 #
 # Check whether the decimal version is valid.
 # <DecVersion>          ::=  (0-9){1,} ["." (0-9){1,}]
 #
 # @param Word:  The word string need to be checked.
 #
+
+
 def IsValidDecVersion(Word):
     if Word.find('.') > -1:
         ReIsValidDecVersion = re.compile(r"[0-9]+\.?[0-9]+$")
@@ -440,7 +472,7 @@ def IsValidDecVersion(Word):
         return False
     return True
 
-## IsValidHexVersion
+# IsValidHexVersion
 #
 # Check whether the hex version is valid.
 # <HexVersion>          ::=  "0x" <Major> <Minor>
@@ -449,6 +481,8 @@ def IsValidDecVersion(Word):
 #
 # @param Word:  The word string need to be checked.
 #
+
+
 def IsValidHexVersion(Word):
     ReIsValidHexVersion = re.compile(r"[0][xX][0-9A-Fa-f]{8}$", re.DOTALL)
     if ReIsValidHexVersion.match(Word) is None:
@@ -456,13 +490,15 @@ def IsValidHexVersion(Word):
 
     return True
 
-## IsValidBuildNumber
+# IsValidBuildNumber
 #
 # Check whether the BUILD_NUMBER is valid.
 # ["BUILD_NUMBER" "=" <Integer>{1,4} <EOL>]
 #
 # @param Word:  The BUILD_NUMBER string need to be checked.
 #
+
+
 def IsValidBuildNumber(Word):
     ReIsValieBuildNumber = re.compile(r"[0-9]{1,4}$", re.DOTALL)
     if ReIsValieBuildNumber.match(Word) is None:
@@ -470,12 +506,14 @@ def IsValidBuildNumber(Word):
 
     return True
 
-## IsValidDepex
+# IsValidDepex
 #
 # Check whether the Depex is valid.
 #
 # @param Word:  The Depex string need to be checked.
 #
+
+
 def IsValidDepex(Word):
     Index = Word.upper().find("PUSH")
     if Index > -1:
@@ -487,7 +525,7 @@ def IsValidDepex(Word):
 
     return True
 
-## IsValidNormalizedString
+# IsValidNormalizedString
 #
 # Check
 # <NormalizedString>    ::=  <DblQuote> [{<Word>} {<Space>}]{1,} <DblQuote>
@@ -495,6 +533,8 @@ def IsValidDepex(Word):
 #
 # @param String: string to be checked
 #
+
+
 def IsValidNormalizedString(String):
     if String == '':
         return True
@@ -513,12 +553,14 @@ def IsValidNormalizedString(String):
 
     return True
 
-## IsValidIdString
+# IsValidIdString
 #
 # Check whether the IdString is valid.
 #
 # @param IdString:  The IdString need to be checked.
 #
+
+
 def IsValidIdString(String):
     if IsValidSimpleWord(String.strip()):
         return True
@@ -533,7 +575,7 @@ def IsValidIdString(String):
 
     return False
 
-## IsValidVersionString
+# IsValidVersionString
 #
 # Check whether the VersionString is valid.
 # <AsciiString>           ::=  [ [<WhiteSpace>]{0,} [<AsciiChars>]{0,} ] {0,}
@@ -544,6 +586,8 @@ def IsValidIdString(String):
 #
 # @param VersionString:  The VersionString need to be checked.
 #
+
+
 def IsValidVersionString(VersionString):
     VersionString = VersionString.strip()
     for Char in VersionString:
@@ -552,12 +596,14 @@ def IsValidVersionString(VersionString):
 
     return True
 
-## IsValidPcdValue
+# IsValidPcdValue
 #
 # Check whether the PcdValue is valid.
 #
 # @param VersionString:  The PcdValue need to be checked.
 #
+
+
 def IsValidPcdValue(PcdValue):
     for Char in PcdValue:
         if Char == '\n' or Char == '\t' or Char == '\f':
@@ -615,7 +661,7 @@ def IsValidPcdValue(PcdValue):
         return True
 
     ReIsValidByteHex = re.compile(r"^\s*0x[0-9a-fA-F]{1,2}\s*$", re.DOTALL)
-    if PcdValue.strip().startswith('{') and PcdValue.strip().endswith('}') :
+    if PcdValue.strip().startswith('{') and PcdValue.strip().endswith('}'):
         StringValue = PcdValue.strip().lstrip('{').rstrip('}')
         ValueList = StringValue.split(',')
         AllValidFlag = True
@@ -640,12 +686,14 @@ def IsValidPcdValue(PcdValue):
 
     return False
 
-## IsValidCVariableName
+# IsValidCVariableName
 #
 # Check whether the PcdValue is valid.
 #
 # @param VersionString:  The PcdValue need to be checked.
 #
+
+
 def IsValidCVariableName(CName):
     ReIsValidCName = re.compile(r"^[A-Za-z_][0-9A-Za-z_]*$", re.DOTALL)
     if ReIsValidCName.match(CName) is None:
@@ -653,7 +701,7 @@ def IsValidCVariableName(CName):
 
     return True
 
-## IsValidIdentifier
+# IsValidIdentifier
 #
 # <Identifier> ::= <NonDigit> <Chars>{0,}
 # <Chars> ::= (a-zA-Z0-9_)
@@ -661,6 +709,8 @@ def IsValidCVariableName(CName):
 #
 # @param Ident: identifier to be checked
 #
+
+
 def IsValidIdentifier(Ident):
     ReIdent = re.compile(r"^[A-Za-z_][0-9A-Za-z_]*$", re.DOTALL)
     if ReIdent.match(Ident) is None:
@@ -668,12 +718,14 @@ def IsValidIdentifier(Ident):
 
     return True
 
-## IsValidDecVersionVal
+# IsValidDecVersionVal
 #
 # {(0-9){1,} "." (0-99)}
 #
 # @param Ver: version to be checked
 #
+
+
 def IsValidDecVersionVal(Ver):
     ReVersion = re.compile(r"[0-9]+(\.[0-9]{1,2})$")
 
@@ -683,7 +735,7 @@ def IsValidDecVersionVal(Ver):
     return True
 
 
-## IsValidLibName
+# IsValidLibName
 #
 # (A-Z)(a-zA-Z0-9){0,} and could not be "NULL"
 #
@@ -701,6 +753,8 @@ def IsValidLibName(LibName):
 # <UserId> ::= (a-zA-Z)(a-zA-Z0-9_.){0,}
 # Words that contain period "." must be encapsulated in double quotation marks.
 #
+
+
 def IsValidUserId(UserId):
     UserId = UserId.strip()
     Quoted = False
@@ -719,6 +773,8 @@ def IsValidUserId(UserId):
 #
 # Check if a UTF16-LE file has a BOM header
 #
+
+
 def CheckUTF16FileHeader(File):
     FileIn = open(File, 'rb').read(2)
     if FileIn != b'\xff\xfe':
diff --git a/BaseTools/Source/Python/UPT/Library/Parsing.py b/BaseTools/Source/Python/UPT/Library/Parsing.py
index 6fb133745e36..2ac5ae82e0cd 100644
--- a/BaseTools/Source/Python/UPT/Library/Parsing.py
+++ b/BaseTools/Source/Python/UPT/Library/Parsing.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define common parsing related functions used in parsing
 # INF/DEC/DSC process
 #
@@ -41,7 +41,7 @@ from . import GlobalData
 
 gPKG_INFO_DICT = {}
 
-## GetBuildOption
+# GetBuildOption
 #
 # Parse a string with format "[<Family>:]<ToolFlag>=Flag"
 # Return (Family, ToolFlag, Flag)
@@ -49,22 +49,24 @@ gPKG_INFO_DICT = {}
 # @param String:  String with BuildOption statement
 # @param File:    The file which defines build option, used in error report
 #
-def GetBuildOption(String, File, LineNo= -1):
+
+
+def GetBuildOption(String, File, LineNo=-1):
     (Family, ToolChain, Flag) = ('', '', '')
     if String.find(DataType.TAB_EQUAL_SPLIT) < 0:
-        RaiseParserError(String, 'BuildOptions', File, \
+        RaiseParserError(String, 'BuildOptions', File,
                          '[<Family>:]<ToolFlag>=Flag', LineNo)
     else:
         List = GetSplitValueList(String, DataType.TAB_EQUAL_SPLIT, MaxSplit=1)
         if List[0].find(':') > -1:
-            Family = List[0][ : List[0].find(':')].strip()
-            ToolChain = List[0][List[0].find(':') + 1 : ].strip()
+            Family = List[0][: List[0].find(':')].strip()
+            ToolChain = List[0][List[0].find(':') + 1:].strip()
         else:
             ToolChain = List[0].strip()
         Flag = List[1].strip()
     return (Family, ToolChain, Flag)
 
-## Get Library Class
+# Get Library Class
 #
 # Get Library of Dsc as <LibraryClassKeyWord>|<LibraryInstance>
 #
@@ -72,23 +74,25 @@ def GetBuildOption(String, File, LineNo= -1):
 # @param ContainerFile:  The file which describes the library class, used for
 #                        error report
 #
-def GetLibraryClass(Item, ContainerFile, WorkspaceDir, LineNo= -1):
+
+
+def GetLibraryClass(Item, ContainerFile, WorkspaceDir, LineNo=-1):
     List = GetSplitValueList(Item[0])
     SupMod = DataType.SUP_MODULE_LIST_STRING
     if len(List) != 2:
-        RaiseParserError(Item[0], 'LibraryClasses', ContainerFile, \
+        RaiseParserError(Item[0], 'LibraryClasses', ContainerFile,
                          '<LibraryClassKeyWord>|<LibraryInstance>')
     else:
-        CheckFileType(List[1], '.Inf', ContainerFile, \
+        CheckFileType(List[1], '.Inf', ContainerFile,
                       'library class instance', Item[0], LineNo)
-        CheckFileExist(WorkspaceDir, List[1], ContainerFile, \
+        CheckFileExist(WorkspaceDir, List[1], ContainerFile,
                        'LibraryClasses', Item[0], LineNo)
         if Item[1] != '':
             SupMod = Item[1]
 
     return (List[0], List[1], SupMod)
 
-## Get Library Class
+# Get Library Class
 #
 # Get Library of Dsc as <LibraryClassKeyWord>[|<LibraryInstance>]
 # [|<TokenSpaceGuidCName>.<PcdCName>]
@@ -97,29 +101,30 @@ def GetLibraryClass(Item, ContainerFile, WorkspaceDir, LineNo= -1):
 # @param ContainerFile:  The file which describes the library class, used for
 #                        error report
 #
-def GetLibraryClassOfInf(Item, ContainerFile, WorkspaceDir, LineNo= -1):
+
+
+def GetLibraryClassOfInf(Item, ContainerFile, WorkspaceDir, LineNo=-1):
     ItemList = GetSplitValueList((Item[0] + DataType.TAB_VALUE_SPLIT * 2))
     SupMod = DataType.SUP_MODULE_LIST_STRING
 
     if len(ItemList) > 5:
-        RaiseParserError\
-        (Item[0], 'LibraryClasses', ContainerFile, \
-         '<LibraryClassKeyWord>[|<LibraryInstance>]\
+        RaiseParserError(Item[0], 'LibraryClasses', ContainerFile,
+                         '<LibraryClassKeyWord>[|<LibraryInstance>]\
          [|<TokenSpaceGuidCName>.<PcdCName>]')
     else:
-        CheckFileType(ItemList[1], '.Inf', ContainerFile, 'LibraryClasses', \
+        CheckFileType(ItemList[1], '.Inf', ContainerFile, 'LibraryClasses',
                       Item[0], LineNo)
-        CheckFileExist(WorkspaceDir, ItemList[1], ContainerFile, \
+        CheckFileExist(WorkspaceDir, ItemList[1], ContainerFile,
                        'LibraryClasses', Item[0], LineNo)
         if ItemList[2] != '':
-            CheckPcdTokenInfo(ItemList[2], 'LibraryClasses', \
+            CheckPcdTokenInfo(ItemList[2], 'LibraryClasses',
                               ContainerFile, LineNo)
         if Item[1] != '':
             SupMod = Item[1]
 
     return (ItemList[0], ItemList[1], ItemList[2], SupMod)
 
-## CheckPcdTokenInfo
+# CheckPcdTokenInfo
 #
 # Check if PcdTokenInfo is following <TokenSpaceGuidCName>.<PcdCName>
 #
@@ -127,7 +132,9 @@ def GetLibraryClassOfInf(Item, ContainerFile, WorkspaceDir, LineNo= -1):
 # @param Section:          Used for error report
 # @param File:             Used for error report
 #
-def CheckPcdTokenInfo(TokenInfoString, Section, File, LineNo= -1):
+
+
+def CheckPcdTokenInfo(TokenInfoString, Section, File, LineNo=-1):
     Format = '<TokenSpaceGuidCName>.<PcdCName>'
     if TokenInfoString != '' and TokenInfoString is not None:
         TokenInfoList = GetSplitValueList(TokenInfoString, DataType.TAB_SPLIT)
@@ -136,7 +143,7 @@ def CheckPcdTokenInfo(TokenInfoString, Section, File, LineNo= -1):
 
     RaiseParserError(TokenInfoString, Section, File, Format, LineNo)
 
-## Get Pcd
+# Get Pcd
 #
 # Get Pcd of Dsc as <PcdTokenSpaceGuidCName>.<TokenCName>|<Value>
 # [|<Type>|<MaximumDatumSize>]
@@ -147,12 +154,14 @@ def CheckPcdTokenInfo(TokenInfoString, Section, File, LineNo= -1):
 #                        report
 
 #
-def GetPcd(Item, Type, ContainerFile, LineNo= -1):
+
+
+def GetPcd(Item, Type, ContainerFile, LineNo=-1):
     TokenGuid, TokenName, Value, MaximumDatumSize, Token = '', '', '', '', ''
     List = GetSplitValueList(Item + DataType.TAB_VALUE_SPLIT * 2)
 
     if len(List) < 4 or len(List) > 6:
-        RaiseParserError(Item, 'Pcds' + Type, ContainerFile, \
+        RaiseParserError(Item, 'Pcds' + Type, ContainerFile,
                          '<PcdTokenSpaceGuidCName>.<TokenCName>|<Value>\
                          [|<Type>|<MaximumDatumSize>]', LineNo)
     else:
@@ -165,7 +174,7 @@ def GetPcd(Item, Type, ContainerFile, LineNo= -1):
 
     return (TokenName, TokenGuid, Value, MaximumDatumSize, Token, Type)
 
-## Get FeatureFlagPcd
+# Get FeatureFlagPcd
 #
 # Get FeatureFlagPcd of Dsc as <PcdTokenSpaceGuidCName>.<TokenCName>|TRUE/FALSE
 #
@@ -174,12 +183,14 @@ def GetPcd(Item, Type, ContainerFile, LineNo= -1):
 # @param ContainerFile:  The file which describes the pcd, used for error
 #                        report
 #
-def GetFeatureFlagPcd(Item, Type, ContainerFile, LineNo= -1):
+
+
+def GetFeatureFlagPcd(Item, Type, ContainerFile, LineNo=-1):
     TokenGuid, TokenName, Value = '', '', ''
     List = GetSplitValueList(Item)
     if len(List) != 2:
-        RaiseParserError(Item, 'Pcds' + Type, ContainerFile, \
-                         '<PcdTokenSpaceGuidCName>.<TokenCName>|TRUE/FALSE', \
+        RaiseParserError(Item, 'Pcds' + Type, ContainerFile,
+                         '<PcdTokenSpaceGuidCName>.<TokenCName>|TRUE/FALSE',
                          LineNo)
     else:
         Value = List[1]
@@ -188,7 +199,7 @@ def GetFeatureFlagPcd(Item, Type, ContainerFile, LineNo= -1):
 
     return (TokenName, TokenGuid, Value, Type)
 
-## Get DynamicDefaultPcd
+# Get DynamicDefaultPcd
 #
 # Get DynamicDefaultPcd of Dsc as <PcdTokenSpaceGuidCName>.<TokenCName>
 # |<Value>[|<DatumTyp>[|<MaxDatumSize>]]
@@ -198,11 +209,13 @@ def GetFeatureFlagPcd(Item, Type, ContainerFile, LineNo= -1):
 # @param ContainerFile:  The file which describes the pcd, used for error
 #                        report
 #
-def GetDynamicDefaultPcd(Item, Type, ContainerFile, LineNo= -1):
+
+
+def GetDynamicDefaultPcd(Item, Type, ContainerFile, LineNo=-1):
     TokenGuid, TokenName, Value, DatumTyp, MaxDatumSize = '', '', '', '', ''
     List = GetSplitValueList(Item + DataType.TAB_VALUE_SPLIT * 2)
     if len(List) < 4 or len(List) > 8:
-        RaiseParserError(Item, 'Pcds' + Type, ContainerFile, \
+        RaiseParserError(Item, 'Pcds' + Type, ContainerFile,
                          '<PcdTokenSpaceGuidCName>.<TokenCName>|<Value>\
                          [|<DatumTyp>[|<MaxDatumSize>]]', LineNo)
     else:
@@ -214,7 +227,7 @@ def GetDynamicDefaultPcd(Item, Type, ContainerFile, LineNo= -1):
 
     return (TokenName, TokenGuid, Value, DatumTyp, MaxDatumSize, Type)
 
-## Get DynamicHiiPcd
+# Get DynamicHiiPcd
 #
 # Get DynamicHiiPcd of Dsc as <PcdTokenSpaceGuidCName>.<TokenCName>|<String>|
 # <VariableGuidCName>|<VariableOffset>[|<DefaultValue>[|<MaximumDatumSize>]]
@@ -224,24 +237,26 @@ def GetDynamicDefaultPcd(Item, Type, ContainerFile, LineNo= -1):
 # @param ContainerFile:  The file which describes the pcd, used for error
 #                        report
 #
-def GetDynamicHiiPcd(Item, Type, ContainerFile, LineNo= -1):
+
+
+def GetDynamicHiiPcd(Item, Type, ContainerFile, LineNo=-1):
     TokenGuid, TokenName, List1, List2, List3, List4, List5 = \
-    '', '', '', '', '', '', ''
+        '', '', '', '', '', '', ''
     List = GetSplitValueList(Item + DataType.TAB_VALUE_SPLIT * 2)
     if len(List) < 6 or len(List) > 8:
-        RaiseParserError(Item, 'Pcds' + Type, ContainerFile, \
+        RaiseParserError(Item, 'Pcds' + Type, ContainerFile,
                          '<PcdTokenSpaceGuidCName>.<TokenCName>|<String>|\
                          <VariableGuidCName>|<VariableOffset>[|<DefaultValue>\
                          [|<MaximumDatumSize>]]', LineNo)
     else:
         List1, List2, List3, List4, List5 = \
-        List[1], List[2], List[3], List[4], List[5]
+            List[1], List[2], List[3], List[4], List[5]
     if CheckPcdTokenInfo(List[0], 'Pcds' + Type, ContainerFile, LineNo):
         (TokenGuid, TokenName) = GetSplitValueList(List[0], DataType.TAB_SPLIT)
 
     return (TokenName, TokenGuid, List1, List2, List3, List4, List5, Type)
 
-## Get DynamicVpdPcd
+# Get DynamicVpdPcd
 #
 # Get DynamicVpdPcd of Dsc as <PcdTokenSpaceGuidCName>.<TokenCName>|
 # <VpdOffset>[|<MaximumDatumSize>]
@@ -251,11 +266,13 @@ def GetDynamicHiiPcd(Item, Type, ContainerFile, LineNo= -1):
 # @param ContainerFile:  The file which describes the pcd, used for error
 #                        report
 #
-def GetDynamicVpdPcd(Item, Type, ContainerFile, LineNo= -1):
+
+
+def GetDynamicVpdPcd(Item, Type, ContainerFile, LineNo=-1):
     TokenGuid, TokenName, List1, List2 = '', '', '', ''
     List = GetSplitValueList(Item + DataType.TAB_VALUE_SPLIT)
     if len(List) < 3 or len(List) > 4:
-        RaiseParserError(Item, 'Pcds' + Type, ContainerFile, \
+        RaiseParserError(Item, 'Pcds' + Type, ContainerFile,
                          '<PcdTokenSpaceGuidCName>.<TokenCName>|<VpdOffset>\
                          [|<MaximumDatumSize>]', LineNo)
     else:
@@ -265,7 +282,7 @@ def GetDynamicVpdPcd(Item, Type, ContainerFile, LineNo= -1):
 
     return (TokenName, TokenGuid, List1, List2, Type)
 
-## GetComponent
+# GetComponent
 #
 # Parse block of the components defined in dsc file
 # Set KeyValues as [ ['component name', [lib1, lib2, lib3],
@@ -274,10 +291,12 @@ def GetDynamicVpdPcd(Item, Type, ContainerFile, LineNo= -1):
 # @param Lines:             The content to be parsed
 # @param KeyValues:         To store data after parsing
 #
+
+
 def GetComponent(Lines, KeyValues):
-    (FindBlock, FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag, \
-     FindPcdsPatchableInModule, FindPcdsFixedAtBuild, FindPcdsDynamic, \
-     FindPcdsDynamicEx) = (False, False, False, False, False, False, False, \
+    (FindBlock, FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag,
+     FindPcdsPatchableInModule, FindPcdsFixedAtBuild, FindPcdsDynamic,
+     FindPcdsDynamicEx) = (False, False, False, False, False, False, False,
                            False)
     ListItem = None
     LibraryClassItem = []
@@ -290,7 +309,7 @@ def GetComponent(Lines, KeyValues):
         # Ignore !include statement
         #
         if Line.upper().find(DataType.TAB_INCLUDE.upper() + ' ') > -1 or \
-        Line.upper().find(DataType.TAB_DEFINE + ' ') > -1:
+                Line.upper().find(DataType.TAB_DEFINE + ' ') > -1:
             continue
 
         if FindBlock == False:
@@ -300,7 +319,7 @@ def GetComponent(Lines, KeyValues):
             #
             if Line.endswith('{'):
                 FindBlock = True
-                ListItem = CleanString(Line.rsplit('{', 1)[0], \
+                ListItem = CleanString(Line.rsplit('{', 1)[0],
                                        DataType.TAB_COMMENT_SPLIT)
 
         #
@@ -308,57 +327,57 @@ def GetComponent(Lines, KeyValues):
         #
         if FindBlock:
             if Line.find('<LibraryClasses>') != -1:
-                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag, \
-                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild, \
+                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag,
+                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild,
                  FindPcdsDynamic, FindPcdsDynamicEx) = \
-                 (True, False, False, False, False, False, False)
+                    (True, False, False, False, False, False, False)
                 continue
             if Line.find('<BuildOptions>') != -1:
-                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag, \
-                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild, \
+                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag,
+                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild,
                  FindPcdsDynamic, FindPcdsDynamicEx) = \
-                 (False, True, False, False, False, False, False)
+                    (False, True, False, False, False, False, False)
                 continue
             if Line.find('<PcdsFeatureFlag>') != -1:
-                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag, \
-                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild, \
+                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag,
+                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild,
                  FindPcdsDynamic, FindPcdsDynamicEx) = \
-                 (False, False, True, False, False, False, False)
+                    (False, False, True, False, False, False, False)
                 continue
             if Line.find('<PcdsPatchableInModule>') != -1:
-                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag, \
-                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild, \
+                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag,
+                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild,
                  FindPcdsDynamic, FindPcdsDynamicEx) = \
-                 (False, False, False, True, False, False, False)
+                    (False, False, False, True, False, False, False)
                 continue
             if Line.find('<PcdsFixedAtBuild>') != -1:
-                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag, \
-                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild, \
+                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag,
+                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild,
                  FindPcdsDynamic, FindPcdsDynamicEx) = \
-                 (False, False, False, False, True, False, False)
+                    (False, False, False, False, True, False, False)
                 continue
             if Line.find('<PcdsDynamic>') != -1:
-                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag, \
-                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild, \
+                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag,
+                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild,
                  FindPcdsDynamic, FindPcdsDynamicEx) = \
-                 (False, False, False, False, False, True, False)
+                    (False, False, False, False, False, True, False)
                 continue
             if Line.find('<PcdsDynamicEx>') != -1:
-                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag, \
-                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild, \
+                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag,
+                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild,
                  FindPcdsDynamic, FindPcdsDynamicEx) = \
-                 (False, False, False, False, False, False, True)
+                    (False, False, False, False, False, False, True)
                 continue
             if Line.endswith('}'):
                 #
                 # find '}' at line tail
                 #
-                KeyValues.append([ListItem, LibraryClassItem, \
+                KeyValues.append([ListItem, LibraryClassItem,
                                   BuildOption, Pcd])
-                (FindBlock, FindLibraryClass, FindBuildOption, \
-                 FindPcdsFeatureFlag, FindPcdsPatchableInModule, \
+                (FindBlock, FindLibraryClass, FindBuildOption,
+                 FindPcdsFeatureFlag, FindPcdsPatchableInModule,
                  FindPcdsFixedAtBuild, FindPcdsDynamic, FindPcdsDynamicEx) = \
-                 (False, False, False, False, False, False, False, False)
+                    (False, False, False, False, False, False, False, False)
                 LibraryClassItem, BuildOption, Pcd = [], [], []
                 continue
 
@@ -382,25 +401,27 @@ def GetComponent(Lines, KeyValues):
 
     return True
 
-## GetExec
+# GetExec
 #
 # Parse a string with format "InfFilename [EXEC = ExecFilename]"
 # Return (InfFilename, ExecFilename)
 #
 # @param String:  String with EXEC statement
 #
+
+
 def GetExec(String):
     InfFilename = ''
     ExecFilename = ''
     if String.find('EXEC') > -1:
-        InfFilename = String[ : String.find('EXEC')].strip()
-        ExecFilename = String[String.find('EXEC') + len('EXEC') : ].strip()
+        InfFilename = String[: String.find('EXEC')].strip()
+        ExecFilename = String[String.find('EXEC') + len('EXEC'):].strip()
     else:
         InfFilename = String.strip()
 
     return (InfFilename, ExecFilename)
 
-## GetComponents
+# GetComponents
 #
 # Parse block of the components defined in dsc file
 # Set KeyValues as [ ['component name', [lib1, lib2, lib3], [bo1, bo2, bo3],
@@ -413,13 +434,15 @@ def GetExec(String):
 #
 # @retval True Get component successfully
 #
+
+
 def GetComponents(Lines, KeyValues, CommentCharacter):
     if Lines.find(DataType.TAB_SECTION_END) > -1:
         Lines = Lines.split(DataType.TAB_SECTION_END, 1)[1]
-    (FindBlock, FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag, \
-     FindPcdsPatchableInModule, FindPcdsFixedAtBuild, FindPcdsDynamic, \
+    (FindBlock, FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag,
+     FindPcdsPatchableInModule, FindPcdsFixedAtBuild, FindPcdsDynamic,
      FindPcdsDynamicEx) = \
-     (False, False, False, False, False, False, False, False)
+        (False, False, False, False, False, False, False, False)
     ListItem = None
     LibraryClassItem = []
     BuildOption = []
@@ -438,64 +461,65 @@ def GetComponents(Lines, KeyValues, CommentCharacter):
             #
             if Line.endswith('{'):
                 FindBlock = True
-                ListItem = CleanString(Line.rsplit('{', 1)[0], CommentCharacter)
+                ListItem = CleanString(Line.rsplit('{', 1)[
+                                       0], CommentCharacter)
 
         #
         # Parse a block content
         #
         if FindBlock:
             if Line.find('<LibraryClasses>') != -1:
-                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag, \
-                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild, \
+                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag,
+                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild,
                  FindPcdsDynamic, FindPcdsDynamicEx) = \
-                 (True, False, False, False, False, False, False)
+                    (True, False, False, False, False, False, False)
                 continue
             if Line.find('<BuildOptions>') != -1:
-                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag, \
-                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild, \
+                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag,
+                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild,
                  FindPcdsDynamic, FindPcdsDynamicEx) = \
-                 (False, True, False, False, False, False, False)
+                    (False, True, False, False, False, False, False)
                 continue
             if Line.find('<PcdsFeatureFlag>') != -1:
-                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag, \
-                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild, \
+                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag,
+                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild,
                  FindPcdsDynamic, FindPcdsDynamicEx) = \
-                 (False, False, True, False, False, False, False)
+                    (False, False, True, False, False, False, False)
                 continue
             if Line.find('<PcdsPatchableInModule>') != -1:
-                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag, \
-                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild, \
+                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag,
+                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild,
                  FindPcdsDynamic, FindPcdsDynamicEx) = \
-                 (False, False, False, True, False, False, False)
+                    (False, False, False, True, False, False, False)
                 continue
             if Line.find('<PcdsFixedAtBuild>') != -1:
-                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag, \
-                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild, \
+                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag,
+                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild,
                  FindPcdsDynamic, FindPcdsDynamicEx) = \
-                 (False, False, False, False, True, False, False)
+                    (False, False, False, False, True, False, False)
                 continue
             if Line.find('<PcdsDynamic>') != -1:
-                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag, \
-                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild, \
+                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag,
+                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild,
                  FindPcdsDynamic, FindPcdsDynamicEx) = \
-                 (False, False, False, False, False, True, False)
+                    (False, False, False, False, False, True, False)
                 continue
             if Line.find('<PcdsDynamicEx>') != -1:
-                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag, \
-                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild, \
+                (FindLibraryClass, FindBuildOption, FindPcdsFeatureFlag,
+                 FindPcdsPatchableInModule, FindPcdsFixedAtBuild,
                  FindPcdsDynamic, FindPcdsDynamicEx) = \
-                 (False, False, False, False, False, False, True)
+                    (False, False, False, False, False, False, True)
                 continue
             if Line.endswith('}'):
                 #
                 # find '}' at line tail
                 #
-                KeyValues.append([ListItem, LibraryClassItem, BuildOption, \
+                KeyValues.append([ListItem, LibraryClassItem, BuildOption,
                                   Pcd])
-                (FindBlock, FindLibraryClass, FindBuildOption, \
-                 FindPcdsFeatureFlag, FindPcdsPatchableInModule, \
+                (FindBlock, FindLibraryClass, FindBuildOption,
+                 FindPcdsFeatureFlag, FindPcdsPatchableInModule,
                  FindPcdsFixedAtBuild, FindPcdsDynamic, FindPcdsDynamicEx) = \
-                 (False, False, False, False, False, False, False, False)
+                    (False, False, False, False, False, False, False, False)
                 LibraryClassItem, BuildOption, Pcd = [], [], []
                 continue
 
@@ -519,7 +543,7 @@ def GetComponents(Lines, KeyValues, CommentCharacter):
 
     return True
 
-## Get Source
+# Get Source
 #
 # Get Source of Inf as <Filename>[|<Family>[|<TagName>[|<ToolCode>
 # [|<PcdFeatureFlag>]]]]
@@ -529,22 +553,24 @@ def GetComponents(Lines, KeyValues, CommentCharacter):
 # @param ContainerFile:  The file which describes the library class, used
 #                        for error report
 #
-def GetSource(Item, ContainerFile, FileRelativePath, LineNo= -1):
+
+
+def GetSource(Item, ContainerFile, FileRelativePath, LineNo=-1):
     ItemNew = Item + DataType.TAB_VALUE_SPLIT * 4
     List = GetSplitValueList(ItemNew)
     if len(List) < 5 or len(List) > 9:
-        RaiseParserError(Item, 'Sources', ContainerFile, \
+        RaiseParserError(Item, 'Sources', ContainerFile,
                          '<Filename>[|<Family>[|<TagName>[|<ToolCode>\
                          [|<PcdFeatureFlag>]]]]', LineNo)
     List[0] = NormPath(List[0])
-    CheckFileExist(FileRelativePath, List[0], ContainerFile, 'Sources', \
+    CheckFileExist(FileRelativePath, List[0], ContainerFile, 'Sources',
                    Item, LineNo)
     if List[4] != '':
         CheckPcdTokenInfo(List[4], 'Sources', ContainerFile, LineNo)
 
     return (List[0], List[1], List[2], List[3], List[4])
 
-## Get Binary
+# Get Binary
 #
 # Get Binary of Inf as <Filename>[|<Family>[|<TagName>[|<ToolCode>
 # [|<PcdFeatureFlag>]]]]
@@ -554,11 +580,13 @@ def GetSource(Item, ContainerFile, FileRelativePath, LineNo= -1):
 # @param ContainerFile:  The file which describes the library class,
 #                        used for error report
 #
-def GetBinary(Item, ContainerFile, LineNo= -1):
+
+
+def GetBinary(Item, ContainerFile, LineNo=-1):
     ItemNew = Item + DataType.TAB_VALUE_SPLIT
     List = GetSplitValueList(ItemNew)
     if len(List) < 3 or len(List) > 5:
-        RaiseParserError(Item, 'Binaries', ContainerFile, \
+        RaiseParserError(Item, 'Binaries', ContainerFile,
                          "<FileType>|<Filename>[|<Target>\
                          [|<TokenSpaceGuidCName>.<PcdCName>]]", LineNo)
 
@@ -569,7 +597,7 @@ def GetBinary(Item, ContainerFile, LineNo= -1):
     elif len(List) == 3:
         return (List[0], List[1], List[2], '')
 
-## Get Guids/Protocols/Ppis
+# Get Guids/Protocols/Ppis
 #
 # Get Guids/Protocols/Ppis of Inf as <GuidCName>[|<PcdFeatureFlag>]
 #
@@ -578,12 +606,14 @@ def GetBinary(Item, ContainerFile, LineNo= -1):
 # @param ContainerFile:  The file which describes the library class,
 #                        used for error report
 #
+
+
 def GetGuidsProtocolsPpisOfInf(Item):
     ItemNew = Item + DataType.TAB_VALUE_SPLIT
     List = GetSplitValueList(ItemNew)
     return (List[0], List[1])
 
-## Get Guids/Protocols/Ppis
+# Get Guids/Protocols/Ppis
 #
 # Get Guids/Protocols/Ppis of Dec as <GuidCName>=<GuidValue>
 #
@@ -592,29 +622,31 @@ def GetGuidsProtocolsPpisOfInf(Item):
 # @param ContainerFile:  The file which describes the library class,
 # used for error report
 #
-def GetGuidsProtocolsPpisOfDec(Item, Type, ContainerFile, LineNo= -1):
+
+
+def GetGuidsProtocolsPpisOfDec(Item, Type, ContainerFile, LineNo=-1):
     List = GetSplitValueList(Item, DataType.TAB_EQUAL_SPLIT)
     if len(List) != 2:
-        RaiseParserError(Item, Type, ContainerFile, '<CName>=<GuidValue>', \
+        RaiseParserError(Item, Type, ContainerFile, '<CName>=<GuidValue>',
                          LineNo)
     #
-    #convert C-Format Guid to Register Format
+    # convert C-Format Guid to Register Format
     #
     if List[1][0] == '{' and List[1][-1] == '}':
         RegisterFormatGuid = GuidStructureStringToGuidString(List[1])
         if RegisterFormatGuid == '':
-            RaiseParserError(Item, Type, ContainerFile, \
+            RaiseParserError(Item, Type, ContainerFile,
                              'CFormat or RegisterFormat', LineNo)
     else:
         if CheckGuidRegFormat(List[1]):
             RegisterFormatGuid = List[1]
         else:
-            RaiseParserError(Item, Type, ContainerFile, \
+            RaiseParserError(Item, Type, ContainerFile,
                              'CFormat or RegisterFormat', LineNo)
 
     return (List[0], RegisterFormatGuid)
 
-## GetPackage
+# GetPackage
 #
 # Get Package of Inf as <PackagePath>[|<PcdFeatureFlag>]
 #
@@ -623,18 +655,20 @@ def GetGuidsProtocolsPpisOfDec(Item, Type, ContainerFile, LineNo= -1):
 # @param ContainerFile:  The file which describes the library class,
 #                        used for error report
 #
-def GetPackage(Item, ContainerFile, FileRelativePath, LineNo= -1):
+
+
+def GetPackage(Item, ContainerFile, FileRelativePath, LineNo=-1):
     ItemNew = Item + DataType.TAB_VALUE_SPLIT
     List = GetSplitValueList(ItemNew)
     CheckFileType(List[0], '.Dec', ContainerFile, 'package', List[0], LineNo)
-    CheckFileExist(FileRelativePath, List[0], ContainerFile, 'Packages', \
+    CheckFileExist(FileRelativePath, List[0], ContainerFile, 'Packages',
                    List[0], LineNo)
     if List[1] != '':
         CheckPcdTokenInfo(List[1], 'Packages', ContainerFile, LineNo)
 
     return (List[0], List[1])
 
-## Get Pcd Values of Inf
+# Get Pcd Values of Inf
 #
 # Get Pcd of Inf as <TokenSpaceGuidCName>.<PcdCName>[|<Value>]
 #
@@ -642,6 +676,8 @@ def GetPackage(Item, ContainerFile, FileRelativePath, LineNo= -1):
 # @param Type:  The type of Pcd
 # @param File:  The file which describes the pcd, used for error report
 #
+
+
 def GetPcdOfInf(Item, Type, File, LineNo):
     Format = '<TokenSpaceGuidCName>.<PcdCName>[|<Value>]'
     TokenGuid, TokenName, Value, InfType = '', '', '', ''
@@ -671,7 +707,7 @@ def GetPcdOfInf(Item, Type, File, LineNo):
     return (TokenGuid, TokenName, Value, InfType)
 
 
-## Get Pcd Values of Dec
+# Get Pcd Values of Dec
 #
 # Get Pcd of Dec as <TokenSpcCName>.<TokenCName>|<Value>|<DatumType>|<Token>
 # @param Item:  Pcd item
@@ -679,7 +715,7 @@ def GetPcdOfInf(Item, Type, File, LineNo):
 # @param File:  Dec file
 # @param LineNo:  Line number
 #
-def GetPcdOfDec(Item, Type, File, LineNo= -1):
+def GetPcdOfDec(Item, Type, File, LineNo=-1):
     Format = '<TokenSpaceGuidCName>.<PcdCName>|<Value>|<DatumType>|<Token>'
     TokenGuid, TokenName, Value, DatumType, Token = '', '', '', '', ''
     List = GetSplitValueList(Item)
@@ -698,7 +734,7 @@ def GetPcdOfDec(Item, Type, File, LineNo= -1):
 
     return (TokenGuid, TokenName, Value, DatumType, Token, Type)
 
-## Parse DEFINE statement
+# Parse DEFINE statement
 #
 # Get DEFINE macros
 #
@@ -711,21 +747,23 @@ def GetPcdOfDec(Item, Type, File, LineNo= -1):
 # @param SectionModel:  DEFINE section model
 # @param Arch:   DEFINE arch
 #
-def ParseDefine(LineValue, StartLine, Table, FileID, SectionName, \
+
+
+def ParseDefine(LineValue, StartLine, Table, FileID, SectionName,
                 SectionModel, Arch):
-    Logger.Debug(Logger.DEBUG_2, ST.MSG_DEFINE_STATEMENT_FOUND % (LineValue, \
+    Logger.Debug(Logger.DEBUG_2, ST.MSG_DEFINE_STATEMENT_FOUND % (LineValue,
                                                                   SectionName))
     Define = \
-    GetSplitValueList(CleanString\
-                      (LineValue[LineValue.upper().\
-                                 find(DataType.TAB_DEFINE.upper() + ' ') + \
-                                 len(DataType.TAB_DEFINE + ' ') : ]), \
-                                 DataType.TAB_EQUAL_SPLIT, 1)
-    Table.Insert(DataType.MODEL_META_DATA_DEFINE, Define[0], Define[1], '', \
-                 '', '', Arch, SectionModel, FileID, StartLine, -1, \
+        GetSplitValueList(CleanString
+                          (LineValue[LineValue.upper().
+                                     find(DataType.TAB_DEFINE.upper() + ' ') +
+                                     len(DataType.TAB_DEFINE + ' '):]),
+                          DataType.TAB_EQUAL_SPLIT, 1)
+    Table.Insert(DataType.MODEL_META_DATA_DEFINE, Define[0], Define[1], '',
+                 '', '', Arch, SectionModel, FileID, StartLine, -1,
                  StartLine, -1, 0)
 
-## InsertSectionItems
+# InsertSectionItems
 #
 # Insert item data of a section to a dict
 #
@@ -736,7 +774,9 @@ def ParseDefine(LineValue, StartLine, Table, FileID, SectionName, \
 # @param ThirdList:   Third list
 # @param RecordSet:   Record set
 #
-def InsertSectionItems(Model, SectionItemList, ArchList, \
+
+
+def InsertSectionItems(Model, SectionItemList, ArchList,
                        ThirdList, RecordSet):
     #
     # Insert each item data of a section
@@ -750,7 +790,7 @@ def InsertSectionItems(Model, SectionItemList, ArchList, \
         Records = RecordSet[Model]
         for SectionItem in SectionItemList:
             LineValue, StartLine, Comment = SectionItem[0], \
-            SectionItem[1], SectionItem[2]
+                SectionItem[1], SectionItem[2]
 
             Logger.Debug(4, ST.MSG_PARSING % LineValue)
             #
@@ -767,24 +807,28 @@ def InsertSectionItems(Model, SectionItemList, ArchList, \
         if RecordSet != {}:
             RecordSet[Model] = Records
 
-## GenMetaDatSectionItem
+# GenMetaDatSectionItem
 #
 # @param Key:    A key
 # @param Value:  A value
 # @param List:   A list
 #
+
+
 def GenMetaDatSectionItem(Key, Value, List):
     if Key not in List:
         List[Key] = [Value]
     else:
         List[Key].append(Value)
 
-## GetPkgInfoFromDec
+# GetPkgInfoFromDec
 #
 # get package name, guid, version info from dec files
 #
 # @param Path:   File path
 #
+
+
 def GetPkgInfoFromDec(Path):
     PkgName = None
     PkgGuid = None
@@ -815,7 +859,7 @@ def GetPkgInfoFromDec(Path):
         return None, None, None
 
 
-## GetWorkspacePackage
+# GetWorkspacePackage
 #
 # Get a list of workspace package information.
 #
@@ -837,8 +881,8 @@ def GetWorkspacePackage():
                     continue
                 Ext = os.path.splitext(FileSp)[1]
                 if Ext.lower() in ['.dec']:
-                    DecFileList.append\
-                    (os.path.normpath(os.path.join(Root, FileSp)))
+                    DecFileList.append(os.path.normpath(
+                        os.path.join(Root, FileSp)))
     #
     # abstract package guid, version info from DecFile List
     #
@@ -850,10 +894,12 @@ def GetWorkspacePackage():
 
     return PkgList
 
-## GetWorkspaceModule
+# GetWorkspaceModule
 #
 # Get a list of workspace modules.
 #
+
+
 def GetWorkspaceModule():
     InfFileList = []
     WorkspaceDir = GlobalData.gWORKSPACE
@@ -872,18 +918,20 @@ def GetWorkspaceModule():
                 continue
             Ext = os.path.splitext(FileSp)[1]
             if Ext.lower() in ['.inf']:
-                InfFileList.append\
-                (os.path.normpath(os.path.join(Root, FileSp)))
+                InfFileList.append(os.path.normpath(
+                    os.path.join(Root, FileSp)))
 
     return InfFileList
 
-## MacroParser used to parse macro definition
+# MacroParser used to parse macro definition
 #
 # @param Line:            The content contain linestring and line number
 # @param FileName:        The meta-file file name
 # @param SectionType:     Section for the Line belong to
 # @param FileLocalMacros: A list contain Macro defined in [Defines] section.
 #
+
+
 def MacroParser(Line, FileName, SectionType, FileLocalMacros):
     MacroDefPattern = re.compile("^(DEFINE)[ \t]+")
     LineContent = Line[0]
@@ -895,17 +943,17 @@ def MacroParser(Line, FileName, SectionType, FileLocalMacros):
         #
         return None, None
 
-    TokenList = GetSplitValueList(LineContent[Match.end(1):], \
+    TokenList = GetSplitValueList(LineContent[Match.end(1):],
                                   DataType.TAB_EQUAL_SPLIT, 1)
     #
     # Syntax check
     #
     if not TokenList[0]:
         Logger.Error('Parser', FORMAT_INVALID, ST.ERR_MACRONAME_NOGIVEN,
-                        ExtraData=LineContent, File=FileName, Line=LineNo)
+                     ExtraData=LineContent, File=FileName, Line=LineNo)
     if len(TokenList) < 2:
         Logger.Error('Parser', FORMAT_INVALID, ST.ERR_MACROVALUE_NOGIVEN,
-                        ExtraData=LineContent, File=FileName, Line=LineNo)
+                     ExtraData=LineContent, File=FileName, Line=LineNo)
 
     Name, Value = TokenList
 
@@ -945,7 +993,7 @@ def MacroParser(Line, FileName, SectionType, FileLocalMacros):
 
     return Name, Value
 
-## GenSection
+# GenSection
 #
 # generate section contents
 #
@@ -956,21 +1004,26 @@ def MacroParser(Line, FileName, SectionType, FileLocalMacros):
 #                       separated by space,
 #                       value is statement
 #
+
+
 def GenSection(SectionName, SectionDict, SplitArch=True, NeedBlankLine=False):
     Content = ''
     for SectionAttrs in SectionDict:
         StatementList = SectionDict[SectionAttrs]
         if SectionAttrs and SectionName != 'Defines' and SectionAttrs.strip().upper() != DataType.TAB_ARCH_COMMON:
             if SplitArch:
-                ArchList = GetSplitValueList(SectionAttrs, DataType.TAB_SPACE_SPLIT)
+                ArchList = GetSplitValueList(
+                    SectionAttrs, DataType.TAB_SPACE_SPLIT)
             else:
                 if SectionName != 'UserExtensions':
-                    ArchList = GetSplitValueList(SectionAttrs, DataType.TAB_COMMENT_SPLIT)
+                    ArchList = GetSplitValueList(
+                        SectionAttrs, DataType.TAB_COMMENT_SPLIT)
                 else:
                     ArchList = [SectionAttrs]
             for Index in range(0, len(ArchList)):
                 ArchList[Index] = ConvertArchForInstall(ArchList[Index])
-            Section = '[' + SectionName + '.' + (', ' + SectionName + '.').join(ArchList) + ']'
+            Section = '[' + SectionName + '.' + \
+                (', ' + SectionName + '.').join(ArchList) + ']'
         else:
             Section = '[' + SectionName + ']'
         Content += '\n' + Section + '\n'
@@ -998,16 +1051,18 @@ def GenSection(SectionName, SectionDict, SplitArch=True, NeedBlankLine=False):
         return ''
     return Content
 
-## ConvertArchForInstall
+# ConvertArchForInstall
 # if Arch.upper() is in "IA32", "X64", "IPF", and "EBC", it must be upper case.  "common" must be lower case.
 # Anything else, the case must be preserved
 #
 # @param Arch: the arch string that need to be converted, it should be stripped before pass in
 # @return: the arch string that get converted
 #
+
+
 def ConvertArchForInstall(Arch):
     if Arch.upper() in [DataType.TAB_ARCH_IA32, DataType.TAB_ARCH_X64,
-                                   DataType.TAB_ARCH_IPF, DataType.TAB_ARCH_EBC]:
+                        DataType.TAB_ARCH_IPF, DataType.TAB_ARCH_EBC]:
         Arch = Arch.upper()
     elif Arch.upper() == DataType.TAB_ARCH_COMMON:
         Arch = Arch.lower()
diff --git a/BaseTools/Source/Python/UPT/Library/StringUtils.py b/BaseTools/Source/Python/UPT/Library/StringUtils.py
index fbc5177caf5a..df9d10b0bb83 100644
--- a/BaseTools/Source/Python/UPT/Library/StringUtils.py
+++ b/BaseTools/Source/Python/UPT/Library/StringUtils.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define common string related functions used in parsing
 # process
 #
@@ -25,7 +25,7 @@ from Logger import StringTable as ST
 #
 gMACRO_PATTERN = re.compile("\$\(([_A-Z][_A-Z0-9]*)\)", re.UNICODE)
 
-## GetSplitValueList
+# GetSplitValueList
 #
 # Get a value list from a string with multiple values split with SplitTag
 # The default SplitTag is DataType.TAB_VALUE_SPLIT
@@ -36,10 +36,12 @@ gMACRO_PATTERN = re.compile("\$\(([_A-Z][_A-Z0-9]*)\)", re.UNICODE)
 # @param MaxSplit:  The max number of split values, default is -1
 #
 #
-def GetSplitValueList(String, SplitTag=DataType.TAB_VALUE_SPLIT, MaxSplit= -1):
+
+
+def GetSplitValueList(String, SplitTag=DataType.TAB_VALUE_SPLIT, MaxSplit=-1):
     return list(map(lambda l: l.strip(), String.split(SplitTag, MaxSplit)))
 
-## MergeArches
+# MergeArches
 #
 # Find a key's all arches in dict, add the new arch to the list
 # If not exist any arch, set the arch directly
@@ -48,13 +50,15 @@ def GetSplitValueList(String, SplitTag=DataType.TAB_VALUE_SPLIT, MaxSplit= -1):
 # @param Key:   The input value for Key
 # @param Arch:  The Arch to be added or merged
 #
+
+
 def MergeArches(Dict, Key, Arch):
     if Key in Dict.keys():
         Dict[Key].append(Arch)
     else:
         Dict[Key] = Arch.split()
 
-## GenDefines
+# GenDefines
 #
 # Parse a string with format "DEFINE <VarName> = <PATH>"
 # Generate a map Defines[VarName] = PATH
@@ -64,10 +68,12 @@ def MergeArches(Dict, Key, Arch):
 # @param Arch:     Supported Arch
 # @param Defines:  DEFINE statement to be parsed
 #
+
+
 def GenDefines(String, Arch, Defines):
     if String.find(DataType.TAB_DEFINE + ' ') > -1:
         List = String.replace(DataType.TAB_DEFINE + ' ', '').\
-        split(DataType.TAB_EQUAL_SPLIT)
+            split(DataType.TAB_EQUAL_SPLIT)
         if len(List) == 2:
             Defines[(CleanString(List[0]), Arch)] = CleanString(List[1])
             return 0
@@ -75,7 +81,7 @@ def GenDefines(String, Arch, Defines):
             return -1
     return 1
 
-## GetLibraryClassesWithModuleType
+# GetLibraryClassesWithModuleType
 #
 # Get Library Class definition when no module type defined
 #
@@ -84,6 +90,8 @@ def GenDefines(String, Arch, Defines):
 # @param KeyValues:         To store data after parsing
 # @param CommentCharacter:  Comment char, used to ignore comment content
 #
+
+
 def GetLibraryClassesWithModuleType(Lines, Key, KeyValues, CommentCharacter):
     NewKey = SplitModuleType(Key)
     Lines = Lines.split(DataType.TAB_SECTION_END, 1)[1]
@@ -95,7 +103,7 @@ def GetLibraryClassesWithModuleType(Lines, Key, KeyValues, CommentCharacter):
 
     return True
 
-## GetDynamics
+# GetDynamics
 #
 # Get Dynamic Pcds
 #
@@ -104,6 +112,8 @@ def GetLibraryClassesWithModuleType(Lines, Key, KeyValues, CommentCharacter):
 # @param KeyValues:         To store data after parsing
 # @param CommentCharacter:  Comment char, used to ignore comment content
 #
+
+
 def GetDynamics(Lines, Key, KeyValues, CommentCharacter):
     #
     # Get SkuId Name List
@@ -115,11 +125,12 @@ def GetDynamics(Lines, Key, KeyValues, CommentCharacter):
     for Line in LineList:
         Line = CleanString(Line, CommentCharacter)
         if Line != '' and Line[0] != CommentCharacter:
-            KeyValues.append([CleanString(Line, CommentCharacter), SkuIdNameList[1]])
+            KeyValues.append(
+                [CleanString(Line, CommentCharacter), SkuIdNameList[1]])
 
     return True
 
-## SplitModuleType
+# SplitModuleType
 #
 # Split ModuleType out of section defien to get key
 # [LibraryClass.Arch.ModuleType|ModuleType|ModuleType] -> [
@@ -127,6 +138,8 @@ def GetDynamics(Lines, Key, KeyValues, CommentCharacter):
 #
 # @param Key:  String to be parsed
 #
+
+
 def SplitModuleType(Key):
     KeyList = Key.split(DataType.TAB_SPLIT)
     #
@@ -146,7 +159,7 @@ def SplitModuleType(Key):
 
     return ReturnValue
 
-## Replace macro in string
+# Replace macro in string
 #
 # This method replace macros used in given string. The macros are given in a
 # dictionary.
@@ -157,6 +170,8 @@ def SplitModuleType(Key):
 # @param Line:              The content contain line string and line number
 # @param FileName:        The meta-file file name
 #
+
+
 def ReplaceMacro(String, MacroDefinitions=None, SelfReplacement=False, Line=None, FileName=None, Flag=False):
     LastString = String
     if MacroDefinitions is None:
@@ -191,20 +206,22 @@ def ReplaceMacro(String, MacroDefinitions=None, SelfReplacement=False, Line=None
             if Macro not in MacroDefinitions:
                 if SelfReplacement:
                     String = String.replace("$(%s)" % Macro, '')
-                    Logger.Debug(5, "Delete undefined MACROs in file %s line %d: %s!" % (FileName, Line[1], Line[0]))
+                    Logger.Debug(5, "Delete undefined MACROs in file %s line %d: %s!" % (
+                        FileName, Line[1], Line[0]))
                 continue
             if not HaveQuotedMacroFlag:
-                String = String.replace("$(%s)" % Macro, MacroDefinitions[Macro])
+                String = String.replace(
+                    "$(%s)" % Macro, MacroDefinitions[Macro])
             else:
                 Count = 0
                 for QuotedStringItem in QuotedStringList:
                     Count += 1
                     if Count % 2 != 0:
                         QuotedStringList[Count - 1] = QuotedStringList[Count - 1].replace("$(%s)" % Macro,
-                                                                        MacroDefinitions[Macro])
+                                                                                          MacroDefinitions[Macro])
                     elif Count == len(QuotedStringList) and Count % 2 == 0:
                         QuotedStringList[Count - 1] = QuotedStringList[Count - 1].replace("$(%s)" % Macro,
-                                                                        MacroDefinitions[Macro])
+                                                                                          MacroDefinitions[Macro])
 
         RetString = ''
         if HaveQuotedMacroFlag:
@@ -227,7 +244,7 @@ def ReplaceMacro(String, MacroDefinitions=None, SelfReplacement=False, Line=None
 
     return String
 
-## NormPath
+# NormPath
 #
 # Create a normal path
 # And replace DEFINE in the path
@@ -235,6 +252,8 @@ def ReplaceMacro(String, MacroDefinitions=None, SelfReplacement=False, Line=None
 # @param Path:     The input value for Path to be converted
 # @param Defines:  A set for DEFINE statement
 #
+
+
 def NormPath(Path, Defines=None):
     IsRelativePath = False
     if Defines is None:
@@ -256,7 +275,7 @@ def NormPath(Path, Defines=None):
         Path = os.path.join('.', Path)
     return Path
 
-## CleanString
+# CleanString
 #
 # Remove comments in a string
 # Remove spaces
@@ -265,6 +284,8 @@ def NormPath(Path, Defines=None):
 # @param CommentCharacter:  Comment char, used to ignore comment content,
 #                           default is DataType.TAB_COMMENT_SPLIT
 #
+
+
 def CleanString(Line, CommentCharacter=DataType.TAB_COMMENT_SPLIT, AllowCppStyleComment=False):
     #
     # remove whitespace
@@ -292,7 +313,7 @@ def CleanString(Line, CommentCharacter=DataType.TAB_COMMENT_SPLIT, AllowCppStyle
 
     return Line
 
-## CleanString2
+# CleanString2
 #
 # Split comments in a string
 # Remove spaces
@@ -301,6 +322,8 @@ def CleanString(Line, CommentCharacter=DataType.TAB_COMMENT_SPLIT, AllowCppStyle
 # @param CommentCharacter:  Comment char, used to ignore comment content,
 #                           default is DataType.TAB_COMMENT_SPLIT
 #
+
+
 def CleanString2(Line, CommentCharacter=DataType.TAB_COMMENT_SPLIT, AllowCppStyleComment=False):
     #
     # remove whitespace
@@ -337,7 +360,7 @@ def CleanString2(Line, CommentCharacter=DataType.TAB_COMMENT_SPLIT, AllowCppStyl
 
     return Line, Comment
 
-## GetMultipleValuesOfKeyFromLines
+# GetMultipleValuesOfKeyFromLines
 #
 # Parse multiple strings to clean comment and spaces
 # The result is saved to KeyValues
@@ -347,6 +370,8 @@ def CleanString2(Line, CommentCharacter=DataType.TAB_COMMENT_SPLIT, AllowCppStyl
 # @param KeyValues:         To store data after parsing
 # @param CommentCharacter:  Comment char, used to ignore comment content
 #
+
+
 def GetMultipleValuesOfKeyFromLines(Lines, Key, KeyValues, CommentCharacter):
     if Key:
         pass
@@ -360,7 +385,7 @@ def GetMultipleValuesOfKeyFromLines(Lines, Key, KeyValues, CommentCharacter):
             KeyValues += [Line]
     return True
 
-## GetDefineValue
+# GetDefineValue
 #
 # Parse a DEFINE statement to get defined value
 # DEFINE Key Value
@@ -369,13 +394,15 @@ def GetMultipleValuesOfKeyFromLines(Lines, Key, KeyValues, CommentCharacter):
 # @param Key:               The key of DEFINE statement
 # @param CommentCharacter:  Comment char, used to ignore comment content
 #
+
+
 def GetDefineValue(String, Key, CommentCharacter):
     if CommentCharacter:
         pass
     String = CleanString(String)
-    return String[String.find(Key + ' ') + len(Key + ' ') : ]
+    return String[String.find(Key + ' ') + len(Key + ' '):]
 
-## GetSingleValueOfKeyFromLines
+# GetSingleValueOfKeyFromLines
 #
 # Parse multiple strings as below to get value of each definition line
 # Key1 = Value1
@@ -393,7 +420,9 @@ def GetDefineValue(String, Key, CommentCharacter):
 #                              values. Key1 = Value1|Value2, '|' is the value
 #                              split char
 #
-def GetSingleValueOfKeyFromLines(Lines, Dictionary, CommentCharacter, KeySplitCharacter, \
+
+
+def GetSingleValueOfKeyFromLines(Lines, Dictionary, CommentCharacter, KeySplitCharacter,
                                  ValueSplitFlag, ValueSplitCharacter):
     Lines = Lines.split('\n')
     Keys = []
@@ -408,12 +437,14 @@ def GetSingleValueOfKeyFromLines(Lines, Dictionary, CommentCharacter, KeySplitCh
         if Line.find(DataType.TAB_INF_DEFINES_DEFINE + ' ') > -1:
             if '' in DefineValues:
                 DefineValues.remove('')
-            DefineValues.append(GetDefineValue(Line, DataType.TAB_INF_DEFINES_DEFINE, CommentCharacter))
+            DefineValues.append(GetDefineValue(
+                Line, DataType.TAB_INF_DEFINES_DEFINE, CommentCharacter))
             continue
         if Line.find(DataType.TAB_INF_DEFINES_SPEC + ' ') > -1:
             if '' in SpecValues:
                 SpecValues.remove('')
-            SpecValues.append(GetDefineValue(Line, DataType.TAB_INF_DEFINES_SPEC, CommentCharacter))
+            SpecValues.append(GetDefineValue(
+                Line, DataType.TAB_INF_DEFINES_SPEC, CommentCharacter))
             continue
 
         #
@@ -428,9 +459,11 @@ def GetSingleValueOfKeyFromLines(Lines, Dictionary, CommentCharacter, KeySplitCh
                 #
                 LineList[1] = CleanString(LineList[1], CommentCharacter)
                 if ValueSplitFlag:
-                    Value = list(map(lambda x: x.strip(), LineList[1].split(ValueSplitCharacter)))
+                    Value = list(
+                        map(lambda x: x.strip(), LineList[1].split(ValueSplitCharacter)))
                 else:
-                    Value = CleanString(LineList[1], CommentCharacter).splitlines()
+                    Value = CleanString(
+                        LineList[1], CommentCharacter).splitlines()
 
                 if Key[0] in Dictionary:
                     if Key[0] not in Keys:
@@ -450,7 +483,7 @@ def GetSingleValueOfKeyFromLines(Lines, Dictionary, CommentCharacter, KeySplitCh
 
     return True
 
-## The content to be parsed
+# The content to be parsed
 #
 # Do pre-check for a file before it is parsed
 # Check $()
@@ -460,6 +493,8 @@ def GetSingleValueOfKeyFromLines(Lines, Dictionary, CommentCharacter, KeySplitCh
 # @param FileContent:    File content to be parsed
 # @param SupSectionTag:  Used for error report
 #
+
+
 def PreCheck(FileName, FileContent, SupSectionTag):
     if SupSectionTag:
         pass
@@ -482,7 +517,8 @@ def PreCheck(FileName, FileContent, SupSectionTag):
         #
         if Line.find('$') > -1:
             if Line.find('$(') < 0 or Line.find(')') < 0:
-                Logger.Error("Parser", FORMAT_INVALID, Line=LineNo, File=FileName, RaiseError=Logger.IS_RAISE_ERROR)
+                Logger.Error("Parser", FORMAT_INVALID, Line=LineNo,
+                             File=FileName, RaiseError=Logger.IS_RAISE_ERROR)
         #
         # Check []
         #
@@ -491,18 +527,20 @@ def PreCheck(FileName, FileContent, SupSectionTag):
             # Only get one '[' or one ']'
             #
             if not (Line.find('[') > -1 and Line.find(']') > -1):
-                Logger.Error("Parser", FORMAT_INVALID, Line=LineNo, File=FileName, RaiseError=Logger.IS_RAISE_ERROR)
+                Logger.Error("Parser", FORMAT_INVALID, Line=LineNo,
+                             File=FileName, RaiseError=Logger.IS_RAISE_ERROR)
         #
         # Regenerate FileContent
         #
         NewFileContent = NewFileContent + Line + '\r\n'
 
     if IsFailed:
-        Logger.Error("Parser", FORMAT_INVALID, Line=LineNo, File=FileName, RaiseError=Logger.IS_RAISE_ERROR)
+        Logger.Error("Parser", FORMAT_INVALID, Line=LineNo,
+                     File=FileName, RaiseError=Logger.IS_RAISE_ERROR)
 
     return NewFileContent
 
-## CheckFileType
+# CheckFileType
 #
 # Check if the Filename is including ExtName
 # Return True if it exists
@@ -516,20 +554,23 @@ def PreCheck(FileName, FileContent, SupSectionTag):
 # @param Line:               The line in container file which defines the file
 #                            to be checked
 #
-def CheckFileType(CheckFilename, ExtName, ContainerFilename, SectionName, Line, LineNo= -1):
+
+
+def CheckFileType(CheckFilename, ExtName, ContainerFilename, SectionName, Line, LineNo=-1):
     if CheckFilename != '' and CheckFilename is not None:
         (Root, Ext) = os.path.splitext(CheckFilename)
         if Ext.upper() != ExtName.upper() and Root:
             ContainerFile = open(ContainerFilename, 'r').read()
             if LineNo == -1:
                 LineNo = GetLineNo(ContainerFile, Line)
-            ErrorMsg = ST.ERR_SECTIONNAME_INVALID % (SectionName, CheckFilename, ExtName)
-            Logger.Error("Parser", PARSER_ERROR, ErrorMsg, Line=LineNo, \
+            ErrorMsg = ST.ERR_SECTIONNAME_INVALID % (
+                SectionName, CheckFilename, ExtName)
+            Logger.Error("Parser", PARSER_ERROR, ErrorMsg, Line=LineNo,
                          File=ContainerFilename, RaiseError=Logger.IS_RAISE_ERROR)
 
     return True
 
-## CheckFileExist
+# CheckFileExist
 #
 # Check if the file exists
 # Return True if it exists
@@ -543,7 +584,9 @@ def CheckFileType(CheckFilename, ExtName, ContainerFilename, SectionName, Line,
 # @param Line:               The line in container file which defines the
 #                            file to be checked
 #
-def CheckFileExist(WorkspaceDir, CheckFilename, ContainerFilename, SectionName, Line, LineNo= -1):
+
+
+def CheckFileExist(WorkspaceDir, CheckFilename, ContainerFilename, SectionName, Line, LineNo=-1):
     CheckFile = ''
     if CheckFilename != '' and CheckFilename is not None:
         CheckFile = WorkspaceFile(WorkspaceDir, CheckFilename)
@@ -553,16 +596,18 @@ def CheckFileExist(WorkspaceDir, CheckFilename, ContainerFilename, SectionName,
                 LineNo = GetLineNo(ContainerFile, Line)
             ErrorMsg = ST.ERR_CHECKFILE_NOTFOUND % (CheckFile, SectionName)
             Logger.Error("Parser", PARSER_ERROR, ErrorMsg,
-                            File=ContainerFilename, Line=LineNo, RaiseError=Logger.IS_RAISE_ERROR)
+                         File=ContainerFilename, Line=LineNo, RaiseError=Logger.IS_RAISE_ERROR)
     return CheckFile
 
-## GetLineNo
+# GetLineNo
 #
 # Find the index of a line in a file
 #
 # @param FileContent:  Search scope
 # @param Line:         Search key
 #
+
+
 def GetLineNo(FileContent, Line, IsIgnoreComment=True):
     LineList = FileContent.splitlines()
     for Index in range(len(LineList)):
@@ -577,7 +622,7 @@ def GetLineNo(FileContent, Line, IsIgnoreComment=True):
 
     return -1
 
-## RaiseParserError
+# RaiseParserError
 #
 # Raise a parser error
 #
@@ -586,31 +631,37 @@ def GetLineNo(FileContent, Line, IsIgnoreComment=True):
 # @param File:     File which has the string
 # @param Format:   Correct format
 #
-def RaiseParserError(Line, Section, File, Format='', LineNo= -1):
+
+
+def RaiseParserError(Line, Section, File, Format='', LineNo=-1):
     if LineNo == -1:
         LineNo = GetLineNo(open(os.path.normpath(File), 'r').read(), Line)
     ErrorMsg = ST.ERR_INVALID_NOTFOUND % (Line, Section)
     if Format != '':
         Format = "Correct format is " + Format
-    Logger.Error("Parser", PARSER_ERROR, ErrorMsg, File=File, Line=LineNo, \
+    Logger.Error("Parser", PARSER_ERROR, ErrorMsg, File=File, Line=LineNo,
                  ExtraData=Format, RaiseError=Logger.IS_RAISE_ERROR)
 
-## WorkspaceFile
+# WorkspaceFile
 #
 # Return a full path with workspace dir
 #
 # @param WorkspaceDir:  Workspace dir
 # @param Filename:      Relative file name
 #
+
+
 def WorkspaceFile(WorkspaceDir, Filename):
     return os.path.join(NormPath(WorkspaceDir), NormPath(Filename))
 
-## Split string
+# Split string
 #
 # Remove '"' which startswith and endswith string
 #
 # @param String:  The string need to be split
 #
+
+
 def SplitString(String):
     if String.startswith('\"'):
         String = String[1:]
@@ -618,31 +669,37 @@ def SplitString(String):
         String = String[:-1]
     return String
 
-## Convert To Sql String
+# Convert To Sql String
 #
 # Replace "'" with "''" in each item of StringList
 #
 # @param StringList:  A list for strings to be converted
 #
+
+
 def ConvertToSqlString(StringList):
     return list(map(lambda s: s.replace("'", "''"), StringList))
 
-## Convert To Sql String
+# Convert To Sql String
 #
 # Replace "'" with "''" in the String
 #
 # @param String:  A String to be converted
 #
+
+
 def ConvertToSqlString2(String):
     return String.replace("'", "''")
 
-## GetStringOfList
+# GetStringOfList
 #
 # Get String of a List
 #
 # @param Lines: string list
 # @param Split: split character
 #
+
+
 def GetStringOfList(List, Split=' '):
     if not isinstance(List, type([])):
         return List
@@ -651,27 +708,32 @@ def GetStringOfList(List, Split=' '):
         Str = Str + Item + Split
     return Str.strip()
 
-## Get HelpTextList
+# Get HelpTextList
 #
 # Get HelpTextList from HelpTextClassList
 #
 # @param HelpTextClassList: Help Text Class List
 #
+
+
 def GetHelpTextList(HelpTextClassList):
     List = []
     if HelpTextClassList:
         for HelpText in HelpTextClassList:
             if HelpText.String.endswith('\n'):
-                HelpText.String = HelpText.String[0: len(HelpText.String) - len('\n')]
+                HelpText.String = HelpText.String[0: len(
+                    HelpText.String) - len('\n')]
                 List.extend(HelpText.String.split('\n'))
     return List
 
-## Get String Array Length
+# Get String Array Length
 #
 # Get String Array Length
 #
 # @param String: the source string
 #
+
+
 def StringArrayLength(String):
     if String.startswith('L"'):
         return (len(String) - 3 + 1) * 2
@@ -680,7 +742,7 @@ def StringArrayLength(String):
     else:
         return len(String.split()) + 1
 
-## RemoveDupOption
+# RemoveDupOption
 #
 # Remove Dup Option
 #
@@ -688,6 +750,8 @@ def StringArrayLength(String):
 # @param Which: Which flag
 # @param Against: Against flag
 #
+
+
 def RemoveDupOption(OptionString, Which="/I", Against=None):
     OptionList = OptionString.split()
     ValueList = []
@@ -707,7 +771,7 @@ def RemoveDupOption(OptionString, Which="/I", Against=None):
             ValueList.append(Val)
     return " ".join(OptionList)
 
-## Check if the string is HexDgit
+# Check if the string is HexDgit
 #
 # Return true if all characters in the string are digits and there is at
 # least one character
@@ -715,6 +779,8 @@ def RemoveDupOption(OptionString, Which="/I", Against=None):
 # , false otherwise.
 # @param string: input string
 #
+
+
 def IsHexDigit(Str):
     try:
         int(Str, 10)
@@ -728,7 +794,7 @@ def IsHexDigit(Str):
                 return False
     return False
 
-## Check if the string is HexDgit and its integer value within limit of UINT32
+# Check if the string is HexDgit and its integer value within limit of UINT32
 #
 # Return true if all characters in the string are digits and there is at
 # least one character
@@ -736,6 +802,8 @@ def IsHexDigit(Str):
 # , false otherwise.
 # @param string: input string
 #
+
+
 def IsHexDigitUINT32(Str):
     try:
         Value = int(Str, 10)
@@ -751,7 +819,7 @@ def IsHexDigitUINT32(Str):
                 return False
     return False
 
-## CleanSpecialChar
+# CleanSpecialChar
 #
 # The ASCII text files of type INF, DEC, INI are edited by developers,
 # and may contain characters that cannot be directly translated to strings that
@@ -759,20 +827,25 @@ def IsHexDigitUINT32(Str):
 # (0x00-0x08, TAB [0x09], 0x0B, 0x0C, 0x0E-0x1F, 0x80-0xFF)
 # must be converted to a space character[0x20] as part of the parsing process.
 #
+
+
 def ConvertSpecialChar(Lines):
     RetLines = []
     for line in Lines:
-        ReMatchSpecialChar = re.compile(r"[\x00-\x08]|\x09|\x0b|\x0c|[\x0e-\x1f]|[\x7f-\xff]")
+        ReMatchSpecialChar = re.compile(
+            r"[\x00-\x08]|\x09|\x0b|\x0c|[\x0e-\x1f]|[\x7f-\xff]")
         RetLines.append(ReMatchSpecialChar.sub(' ', line))
 
     return RetLines
 
-## __GetTokenList
+# __GetTokenList
 #
 # Assume Str is a valid feature flag expression.
 # Return a list which contains tokens: alpha numeric token and other token
 # Whitespace are not stripped
 #
+
+
 def __GetTokenList(Str):
     InQuote = False
     Token = ''
@@ -819,13 +892,15 @@ def __GetTokenList(Str):
         List.append(TokenOP)
     return List
 
-## ConvertNEToNOTEQ
+# ConvertNEToNOTEQ
 #
 # Convert NE operator to NOT EQ
 # For example: 1 NE 2 -> 1 NOT EQ 2
 #
 # @param Expr: Feature flag expression to be converted
 #
+
+
 def ConvertNEToNOTEQ(Expr):
     List = __GetTokenList(Expr)
     for Index in range(len(List)):
@@ -833,13 +908,15 @@ def ConvertNEToNOTEQ(Expr):
             List[Index] = 'NOT EQ'
     return ''.join(List)
 
-## ConvertNOTEQToNE
+# ConvertNOTEQToNE
 #
 # Convert NOT EQ operator to NE
 # For example: 1 NOT NE 2 -> 1 NE 2
 #
 # @param Expr: Feature flag expression to be converted
 #
+
+
 def ConvertNOTEQToNE(Expr):
     List = __GetTokenList(Expr)
     HasNOT = False
@@ -860,7 +937,7 @@ def ConvertNOTEQToNE(Expr):
 
     return ''.join(RetList)
 
-## SplitPcdEntry
+# SplitPcdEntry
 #
 # Split an PCD entry string to Token.CName and PCD value and FFE.
 # NOTE: PCD Value and FFE can contain "|" in its expression. And in INF specification, have below rule.
@@ -871,6 +948,8 @@ def ConvertNOTEQToNE(Expr):
 #
 # @return List     [PcdTokenCName, Value, FFE]
 #
+
+
 def SplitPcdEntry(String):
     if not String:
         return ['', '', ''], False
@@ -923,11 +1002,13 @@ def SplitPcdEntry(String):
 
     return ['', '', ''], False
 
-## Check if two arches matched?
+# Check if two arches matched?
 #
 # @param Arch1
 # @param Arch2
 #
+
+
 def IsMatchArch(Arch1, Arch2):
     if 'COMMON' in Arch1 or 'COMMON' in Arch2:
         return True
@@ -953,6 +1034,8 @@ def IsMatchArch(Arch1, Arch2):
 # Search all files in FilePath to find the FileName with the largest index
 # Return the FileName with index +1 under the FilePath
 #
+
+
 def GetUniFileName(FilePath, FileName):
     Files = []
     try:
diff --git a/BaseTools/Source/Python/UPT/Library/UniClassObject.py b/BaseTools/Source/Python/UPT/Library/UniClassObject.py
index 8c44dc225277..c06799ed66d1 100644
--- a/BaseTools/Source/Python/UPT/Library/UniClassObject.py
+++ b/BaseTools/Source/Python/UPT/Library/UniClassObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Collect all defined strings in multiple uni files.
 #
 # Copyright (c) 2014 - 2019, Intel Corporation. All rights reserved.<BR>
@@ -13,7 +13,9 @@ from __future__ import print_function
 ##
 # Import Modules
 #
-import os, codecs, re
+import os
+import codecs
+import re
 import shlex
 from Logger import ToolError
 from Logger import Log as EdkLogger
@@ -42,40 +44,40 @@ NULL = u'\u0000'
 TAB = u'\t'
 BACK_SPLASH = u'\\'
 
-gLANG_CONV_TABLE = {'eng':'en', 'fra':'fr', \
-                 'aar':'aa', 'abk':'ab', 'ave':'ae', 'afr':'af', 'aka':'ak', 'amh':'am', \
-                 'arg':'an', 'ara':'ar', 'asm':'as', 'ava':'av', 'aym':'ay', 'aze':'az', \
-                 'bak':'ba', 'bel':'be', 'bul':'bg', 'bih':'bh', 'bis':'bi', 'bam':'bm', \
-                 'ben':'bn', 'bod':'bo', 'bre':'br', 'bos':'bs', 'cat':'ca', 'che':'ce', \
-                 'cha':'ch', 'cos':'co', 'cre':'cr', 'ces':'cs', 'chu':'cu', 'chv':'cv', \
-                 'cym':'cy', 'dan':'da', 'deu':'de', 'div':'dv', 'dzo':'dz', 'ewe':'ee', \
-                 'ell':'el', 'epo':'eo', 'spa':'es', 'est':'et', 'eus':'eu', 'fas':'fa', \
-                 'ful':'ff', 'fin':'fi', 'fij':'fj', 'fao':'fo', 'fry':'fy', 'gle':'ga', \
-                 'gla':'gd', 'glg':'gl', 'grn':'gn', 'guj':'gu', 'glv':'gv', 'hau':'ha', \
-                 'heb':'he', 'hin':'hi', 'hmo':'ho', 'hrv':'hr', 'hat':'ht', 'hun':'hu', \
-                 'hye':'hy', 'her':'hz', 'ina':'ia', 'ind':'id', 'ile':'ie', 'ibo':'ig', \
-                 'iii':'ii', 'ipk':'ik', 'ido':'io', 'isl':'is', 'ita':'it', 'iku':'iu', \
-                 'jpn':'ja', 'jav':'jv', 'kat':'ka', 'kon':'kg', 'kik':'ki', 'kua':'kj', \
-                 'kaz':'kk', 'kal':'kl', 'khm':'km', 'kan':'kn', 'kor':'ko', 'kau':'kr', \
-                 'kas':'ks', 'kur':'ku', 'kom':'kv', 'cor':'kw', 'kir':'ky', 'lat':'la', \
-                 'ltz':'lb', 'lug':'lg', 'lim':'li', 'lin':'ln', 'lao':'lo', 'lit':'lt', \
-                 'lub':'lu', 'lav':'lv', 'mlg':'mg', 'mah':'mh', 'mri':'mi', 'mkd':'mk', \
-                 'mal':'ml', 'mon':'mn', 'mar':'mr', 'msa':'ms', 'mlt':'mt', 'mya':'my', \
-                 'nau':'na', 'nob':'nb', 'nde':'nd', 'nep':'ne', 'ndo':'ng', 'nld':'nl', \
-                 'nno':'nn', 'nor':'no', 'nbl':'nr', 'nav':'nv', 'nya':'ny', 'oci':'oc', \
-                 'oji':'oj', 'orm':'om', 'ori':'or', 'oss':'os', 'pan':'pa', 'pli':'pi', \
-                 'pol':'pl', 'pus':'ps', 'por':'pt', 'que':'qu', 'roh':'rm', 'run':'rn', \
-                 'ron':'ro', 'rus':'ru', 'kin':'rw', 'san':'sa', 'srd':'sc', 'snd':'sd', \
-                 'sme':'se', 'sag':'sg', 'sin':'si', 'slk':'sk', 'slv':'sl', 'smo':'sm', \
-                 'sna':'sn', 'som':'so', 'sqi':'sq', 'srp':'sr', 'ssw':'ss', 'sot':'st', \
-                 'sun':'su', 'swe':'sv', 'swa':'sw', 'tam':'ta', 'tel':'te', 'tgk':'tg', \
-                 'tha':'th', 'tir':'ti', 'tuk':'tk', 'tgl':'tl', 'tsn':'tn', 'ton':'to', \
-                 'tur':'tr', 'tso':'ts', 'tat':'tt', 'twi':'tw', 'tah':'ty', 'uig':'ug', \
-                 'ukr':'uk', 'urd':'ur', 'uzb':'uz', 'ven':'ve', 'vie':'vi', 'vol':'vo', \
-                 'wln':'wa', 'wol':'wo', 'xho':'xh', 'yid':'yi', 'yor':'yo', 'zha':'za', \
-                 'zho':'zh', 'zul':'zu'}
+gLANG_CONV_TABLE = {'eng': 'en', 'fra': 'fr',
+                    'aar': 'aa', 'abk': 'ab', 'ave': 'ae', 'afr': 'af', 'aka': 'ak', 'amh': 'am',
+                    'arg': 'an', 'ara': 'ar', 'asm': 'as', 'ava': 'av', 'aym': 'ay', 'aze': 'az',
+                    'bak': 'ba', 'bel': 'be', 'bul': 'bg', 'bih': 'bh', 'bis': 'bi', 'bam': 'bm',
+                    'ben': 'bn', 'bod': 'bo', 'bre': 'br', 'bos': 'bs', 'cat': 'ca', 'che': 'ce',
+                    'cha': 'ch', 'cos': 'co', 'cre': 'cr', 'ces': 'cs', 'chu': 'cu', 'chv': 'cv',
+                    'cym': 'cy', 'dan': 'da', 'deu': 'de', 'div': 'dv', 'dzo': 'dz', 'ewe': 'ee',
+                    'ell': 'el', 'epo': 'eo', 'spa': 'es', 'est': 'et', 'eus': 'eu', 'fas': 'fa',
+                    'ful': 'ff', 'fin': 'fi', 'fij': 'fj', 'fao': 'fo', 'fry': 'fy', 'gle': 'ga',
+                    'gla': 'gd', 'glg': 'gl', 'grn': 'gn', 'guj': 'gu', 'glv': 'gv', 'hau': 'ha',
+                    'heb': 'he', 'hin': 'hi', 'hmo': 'ho', 'hrv': 'hr', 'hat': 'ht', 'hun': 'hu',
+                    'hye': 'hy', 'her': 'hz', 'ina': 'ia', 'ind': 'id', 'ile': 'ie', 'ibo': 'ig',
+                    'iii': 'ii', 'ipk': 'ik', 'ido': 'io', 'isl': 'is', 'ita': 'it', 'iku': 'iu',
+                    'jpn': 'ja', 'jav': 'jv', 'kat': 'ka', 'kon': 'kg', 'kik': 'ki', 'kua': 'kj',
+                    'kaz': 'kk', 'kal': 'kl', 'khm': 'km', 'kan': 'kn', 'kor': 'ko', 'kau': 'kr',
+                    'kas': 'ks', 'kur': 'ku', 'kom': 'kv', 'cor': 'kw', 'kir': 'ky', 'lat': 'la',
+                    'ltz': 'lb', 'lug': 'lg', 'lim': 'li', 'lin': 'ln', 'lao': 'lo', 'lit': 'lt',
+                    'lub': 'lu', 'lav': 'lv', 'mlg': 'mg', 'mah': 'mh', 'mri': 'mi', 'mkd': 'mk',
+                    'mal': 'ml', 'mon': 'mn', 'mar': 'mr', 'msa': 'ms', 'mlt': 'mt', 'mya': 'my',
+                    'nau': 'na', 'nob': 'nb', 'nde': 'nd', 'nep': 'ne', 'ndo': 'ng', 'nld': 'nl',
+                    'nno': 'nn', 'nor': 'no', 'nbl': 'nr', 'nav': 'nv', 'nya': 'ny', 'oci': 'oc',
+                    'oji': 'oj', 'orm': 'om', 'ori': 'or', 'oss': 'os', 'pan': 'pa', 'pli': 'pi',
+                    'pol': 'pl', 'pus': 'ps', 'por': 'pt', 'que': 'qu', 'roh': 'rm', 'run': 'rn',
+                    'ron': 'ro', 'rus': 'ru', 'kin': 'rw', 'san': 'sa', 'srd': 'sc', 'snd': 'sd',
+                    'sme': 'se', 'sag': 'sg', 'sin': 'si', 'slk': 'sk', 'slv': 'sl', 'smo': 'sm',
+                    'sna': 'sn', 'som': 'so', 'sqi': 'sq', 'srp': 'sr', 'ssw': 'ss', 'sot': 'st',
+                    'sun': 'su', 'swe': 'sv', 'swa': 'sw', 'tam': 'ta', 'tel': 'te', 'tgk': 'tg',
+                    'tha': 'th', 'tir': 'ti', 'tuk': 'tk', 'tgl': 'tl', 'tsn': 'tn', 'ton': 'to',
+                    'tur': 'tr', 'tso': 'ts', 'tat': 'tt', 'twi': 'tw', 'tah': 'ty', 'uig': 'ug',
+                    'ukr': 'uk', 'urd': 'ur', 'uzb': 'uz', 'ven': 've', 'vie': 'vi', 'vol': 'vo',
+                    'wln': 'wa', 'wol': 'wo', 'xho': 'xh', 'yid': 'yi', 'yor': 'yo', 'zha': 'za',
+                    'zho': 'zh', 'zul': 'zu'}
 
-## Convert a python unicode string to a normal string
+# Convert a python unicode string to a normal string
 #
 # Convert a python unicode string to a normal string
 # UniToStr(u'I am a string') is 'I am a string'
@@ -84,10 +86,12 @@ gLANG_CONV_TABLE = {'eng':'en', 'fra':'fr', \
 #
 # @retval:     The formatted normal string
 #
+
+
 def UniToStr(Uni):
     return repr(Uni)[2:-1]
 
-## Convert a unicode string to a Hex list
+# Convert a unicode string to a Hex list
 #
 # Convert a unicode string to a Hex list
 # UniToHexList('ABC') is ['0x41', '0x00', '0x42', '0x00', '0x43', '0x00']
@@ -96,6 +100,8 @@ def UniToStr(Uni):
 #
 # @retval List:  The formatted hex list
 #
+
+
 def UniToHexList(Uni):
     List = []
     for Item in Uni:
@@ -104,7 +110,7 @@ def UniToHexList(Uni):
         List.append('0x' + Temp[0:2])
     return List
 
-## Convert special unicode characters
+# Convert special unicode characters
 #
 # Convert special characters to (c), (r) and (tm).
 #
@@ -112,6 +118,8 @@ def UniToHexList(Uni):
 #
 # @retval NewUni:  The converted unicode string
 #
+
+
 def ConvertSpecialUnicodes(Uni):
     OldUni = NewUni = Uni
     NewUni = NewUni.replace(u'\u00A9', '(c)')
@@ -121,7 +129,7 @@ def ConvertSpecialUnicodes(Uni):
         NewUni = OldUni
     return NewUni
 
-## GetLanguageCode1766
+# GetLanguageCode1766
 #
 # Check the language code read from .UNI file and convert RFC 4646 codes to RFC 1766 codes
 # RFC 1766 language codes supported in compatibility mode
@@ -131,6 +139,8 @@ def ConvertSpecialUnicodes(Uni):
 #
 # @retval LangName:  Valid language code in RFC 1766 format or None
 #
+
+
 def GetLanguageCode1766(LangName, File=None):
     return LangName
 
@@ -145,9 +155,9 @@ def GetLanguageCode1766(LangName, File=None):
             return LangName
         else:
             EdkLogger.Error("Unicode File Parser",
-                             ToolError.FORMAT_INVALID,
-                             "Invalid RFC 1766 language code : %s" % LangName,
-                             File)
+                            ToolError.FORMAT_INVALID,
+                            "Invalid RFC 1766 language code : %s" % LangName,
+                            File)
     elif length == 5:
         if LangName[0:2].isalpha() and LangName[2] == '-':
             for Key in gLANG_CONV_TABLE.keys():
@@ -164,11 +174,11 @@ def GetLanguageCode1766(LangName, File=None):
                     return Key
 
     EdkLogger.Error("Unicode File Parser",
-                             ToolError.FORMAT_INVALID,
-                             "Invalid RFC 4646 language code : %s" % LangName,
-                             File)
+                    ToolError.FORMAT_INVALID,
+                    "Invalid RFC 4646 language code : %s" % LangName,
+                    File)
 
-## GetLanguageCode
+# GetLanguageCode
 #
 # Check the language code read from .UNI file and convert RFC 1766 codes to RFC 4646 codes if appropriate
 # RFC 1766 language codes supported in compatibility mode
@@ -178,6 +188,8 @@ def GetLanguageCode1766(LangName, File=None):
 #
 # @retval LangName:  Valid lanugage code in RFC 4646 format or None
 #
+
+
 def GetLanguageCode(LangName, IsCompatibleMode, File):
     length = len(LangName)
     if IsCompatibleMode:
@@ -188,9 +200,9 @@ def GetLanguageCode(LangName, IsCompatibleMode, File):
             return LangName
         else:
             EdkLogger.Error("Unicode File Parser",
-                             ToolError.FORMAT_INVALID,
-                             "Invalid RFC 1766 language code : %s" % LangName,
-                             File)
+                            ToolError.FORMAT_INVALID,
+                            "Invalid RFC 1766 language code : %s" % LangName,
+                            File)
     if (LangName[0] == 'X' or LangName[0] == 'x') and LangName[1] == '-':
         return LangName
     if length == 2:
@@ -209,11 +221,11 @@ def GetLanguageCode(LangName, IsCompatibleMode, File):
             return LangName
 
     EdkLogger.Error("Unicode File Parser",
-                             ToolError.FORMAT_INVALID,
-                             "Invalid RFC 4646 language code : %s" % LangName,
-                             File)
+                    ToolError.FORMAT_INVALID,
+                    "Invalid RFC 4646 language code : %s" % LangName,
+                    File)
 
-## FormatUniEntry
+# FormatUniEntry
 #
 # Formatted the entry in Uni file.
 #
@@ -222,6 +234,8 @@ def GetLanguageCode(LangName, IsCompatibleMode, File):
 # @param ContainerFile   ContainerFile.
 #
 # @return formatted entry
+
+
 def FormatUniEntry(StrTokenName, TokenValueList, ContainerFile):
     SubContent = ''
     PreFormatLength = 40
@@ -243,21 +257,24 @@ def FormatUniEntry(StrTokenName, TokenValueList, ContainerFile):
         for SubValue in ValueList:
             if SubValue.strip():
                 SubValueContent += \
-                ' ' * (PreFormatLength + len('#language en-US ')) + '\"%s\\n\"' % SubValue.strip() + '\r\n'
+                    ' ' * (PreFormatLength + len('#language en-US ')) + \
+                    '\"%s\\n\"' % SubValue.strip() + '\r\n'
         SubValueContent = SubValueContent[(PreFormatLength + len('#language en-US ')):SubValueContent.rfind('\\n')] \
-        + '\"' + '\r\n'
+            + '\"' + '\r\n'
         SubContent += ' '*PreFormatLength + '#language %-5s ' % Lang + SubValueContent
     if SubContent:
-        SubContent = StrTokenName + ' '*(PreFormatLength - len(StrTokenName)) + SubContent[PreFormatLength:]
+        SubContent = StrTokenName + ' ' * \
+            (PreFormatLength - len(StrTokenName)) + \
+            SubContent[PreFormatLength:]
     return SubContent
 
 
-## StringDefClassObject
+# StringDefClassObject
 #
 # A structure for language definition
 #
 class StringDefClassObject(object):
-    def __init__(self, Name = None, Value = None, Referenced = False, Token = None, UseOtherLangDef = ''):
+    def __init__(self, Name=None, Value=None, Referenced=False, Token=None, UseOtherLangDef=''):
         self.StringName = ''
         self.StringNameByteList = []
         self.StringValue = ''
@@ -279,12 +296,12 @@ class StringDefClassObject(object):
 
     def __str__(self):
         return repr(self.StringName) + ' ' + \
-               repr(self.Token) + ' ' + \
-               repr(self.Referenced) + ' ' + \
-               repr(self.StringValue) + ' ' + \
-               repr(self.UseOtherLangDef)
+            repr(self.Token) + ' ' + \
+            repr(self.Referenced) + ' ' + \
+            repr(self.StringValue) + ' ' + \
+            repr(self.UseOtherLangDef)
 
-    def UpdateValue(self, Value = None):
+    def UpdateValue(self, Value=None):
         if Value is not None:
             if self.StringValue:
                 self.StringValue = self.StringValue + '\r\n' + Value
@@ -293,21 +310,26 @@ class StringDefClassObject(object):
             self.StringValueByteList = UniToHexList(self.StringValue)
             self.Length = len(self.StringValueByteList)
 
-## UniFileClassObject
+# UniFileClassObject
 #
 # A structure for .uni file definition
 #
+
+
 class UniFileClassObject(object):
-    def __init__(self, FileList = None, IsCompatibleMode = False, IncludePathList = None):
+    def __init__(self, FileList=None, IsCompatibleMode=False, IncludePathList=None):
         self.FileList = FileList
         self.File = None
         self.IncFileList = FileList
         self.UniFileHeader = ''
         self.Token = 2
-        self.LanguageDef = []                   #[ [u'LanguageIdentifier', u'PrintableName'], ... ]
-        self.OrderedStringList = {}             #{ u'LanguageIdentifier' : [StringDefClassObject]  }
-        self.OrderedStringDict = {}             #{ u'LanguageIdentifier' : {StringName:(IndexInList)}  }
-        self.OrderedStringListByToken = {}      #{ u'LanguageIdentifier' : {Token: StringDefClassObject} }
+        self.LanguageDef = []  # [ [u'LanguageIdentifier', u'PrintableName'], ... ]
+        # { u'LanguageIdentifier' : [StringDefClassObject]  }
+        self.OrderedStringList = {}
+        # { u'LanguageIdentifier' : {StringName:(IndexInList)}  }
+        self.OrderedStringDict = {}
+        # { u'LanguageIdentifier' : {Token: StringDefClassObject} }
+        self.OrderedStringListByToken = {}
         self.IsCompatibleMode = IsCompatibleMode
         if not IncludePathList:
             self.IncludePathList = []
@@ -323,11 +345,14 @@ class UniFileClassObject(object):
         Lang = shlex.split(Line.split(u"//")[0])
         if len(Lang) != 3:
             try:
-                FileIn = codecs.open(File.Path, mode='rb', encoding='utf_8').readlines()
+                FileIn = codecs.open(File.Path, mode='rb',
+                                     encoding='utf_8').readlines()
             except UnicodeError as Xstr:
-                FileIn = codecs.open(File.Path, mode='rb', encoding='utf_16').readlines()
+                FileIn = codecs.open(File.Path, mode='rb',
+                                     encoding='utf_16').readlines()
             except UnicodeError as Xstr:
-                FileIn = codecs.open(File.Path, mode='rb', encoding='utf_16_le').readlines()
+                FileIn = codecs.open(File.Path, mode='rb',
+                                     encoding='utf_16_le').readlines()
             except:
                 EdkLogger.Error("Unicode File Parser",
                                 ToolError.FILE_OPEN_FAILURE,
@@ -335,12 +360,13 @@ class UniFileClassObject(object):
                                 ExtraData=File)
             LineNo = GetLineNo(FileIn, Line, False)
             EdkLogger.Error("Unicode File Parser",
-                             ToolError.PARSER_ERROR,
-                             "Wrong language definition",
-                             ExtraData="""%s\n\t*Correct format is like '#langdef en-US "English"'""" % Line,
-                             File = File, Line = LineNo)
+                            ToolError.PARSER_ERROR,
+                            "Wrong language definition",
+                            ExtraData="""%s\n\t*Correct format is like '#langdef en-US "English"'""" % Line,
+                            File=File, Line=LineNo)
         else:
-            LangName = GetLanguageCode(Lang[1], self.IsCompatibleMode, self.File)
+            LangName = GetLanguageCode(
+                Lang[1], self.IsCompatibleMode, self.File)
             LangPrintName = Lang[2]
 
         IsLangInDef = False
@@ -355,8 +381,10 @@ class UniFileClassObject(object):
         #
         # Add language string
         #
-        self.AddStringToList(u'$LANGUAGE_NAME', LangName, LangName, 0, True, Index=0)
-        self.AddStringToList(u'$PRINTABLE_LANGUAGE_NAME', LangName, LangPrintName, 1, True, Index=1)
+        self.AddStringToList(u'$LANGUAGE_NAME', LangName,
+                             LangName, 0, True, Index=0)
+        self.AddStringToList(u'$PRINTABLE_LANGUAGE_NAME',
+                             LangName, LangPrintName, 1, True, Index=1)
 
         if not IsLangInDef:
             #
@@ -365,18 +393,19 @@ class UniFileClassObject(object):
             #
             FirstLangName = self.LanguageDef[0][0]
             if LangName != FirstLangName:
-                for Index in range (2, len (self.OrderedStringList[FirstLangName])):
+                for Index in range(2, len(self.OrderedStringList[FirstLangName])):
                     Item = self.OrderedStringList[FirstLangName][Index]
                     if Item.UseOtherLangDef != '':
                         OtherLang = Item.UseOtherLangDef
                     else:
                         OtherLang = FirstLangName
-                    self.OrderedStringList[LangName].append (StringDefClassObject(Item.StringName,
-                                                                                  '',
-                                                                                  Item.Referenced,
-                                                                                  Item.Token,
-                                                                                  OtherLang))
-                    self.OrderedStringDict[LangName][Item.StringName] = len(self.OrderedStringList[LangName]) - 1
+                    self.OrderedStringList[LangName].append(StringDefClassObject(Item.StringName,
+                                                                                 '',
+                                                                                 Item.Referenced,
+                                                                                 Item.Token,
+                                                                                 OtherLang))
+                    self.OrderedStringDict[LangName][Item.StringName] = len(
+                        self.OrderedStringList[LangName]) - 1
         return True
 
     #
@@ -392,27 +421,30 @@ class UniFileClassObject(object):
             MatchString = re.match('[A-Z0-9_]+', Name, re.UNICODE)
             if MatchString is None or MatchString.end(0) != len(Name):
                 EdkLogger.Error("Unicode File Parser",
-                             ToolError.FORMAT_INVALID,
-                             'The string token name %s in UNI file %s must be upper case character.' %(Name, self.File))
+                                ToolError.FORMAT_INVALID,
+                                'The string token name %s in UNI file %s must be upper case character.' % (Name, self.File))
         LanguageList = Item.split(u'#language ')
         for IndexI in range(len(LanguageList)):
             if IndexI == 0:
                 continue
             else:
                 Language = LanguageList[IndexI].split()[0]
-                #.replace(u'\r\n', u'')
+                # .replace(u'\r\n', u'')
                 Value = \
-                LanguageList[IndexI][LanguageList[IndexI].find(u'\"') + len(u'\"') : LanguageList[IndexI].rfind(u'\"')]
-                Language = GetLanguageCode(Language, self.IsCompatibleMode, self.File)
+                    LanguageList[IndexI][LanguageList[IndexI].find(
+                        u'\"') + len(u'\"'): LanguageList[IndexI].rfind(u'\"')]
+                Language = GetLanguageCode(
+                    Language, self.IsCompatibleMode, self.File)
                 self.AddStringToList(Name, Language, Value)
 
     #
     # Get include file list and load them
     #
-    def GetIncludeFile(self, Item, Dir = None):
+    def GetIncludeFile(self, Item, Dir=None):
         if Dir:
             pass
-        FileName = Item[Item.find(u'!include ') + len(u'!include ') :Item.find(u' ', len(u'!include '))][1:-1]
+        FileName = Item[Item.find(
+            u'!include ') + len(u'!include '):Item.find(u' ', len(u'!include '))][1:-1]
         self.LoadUniFile(FileName)
 
     #
@@ -421,8 +453,8 @@ class UniFileClassObject(object):
     def PreProcess(self, File, IsIncludeFile=False):
         if not os.path.exists(File.Path) or not os.path.isfile(File.Path):
             EdkLogger.Error("Unicode File Parser",
-                             ToolError.FILE_NOT_FOUND,
-                             ExtraData=File.Path)
+                            ToolError.FILE_NOT_FOUND,
+                            ExtraData=File.Path)
 
         #
         # Check file header of the Uni file
@@ -432,14 +464,17 @@ class UniFileClassObject(object):
 #                             ExtraData='The file %s is either invalid UTF-16LE or it is missing the BOM.' % File.Path)
 
         try:
-            FileIn = codecs.open(File.Path, mode='rb', encoding='utf_8').readlines()
+            FileIn = codecs.open(File.Path, mode='rb',
+                                 encoding='utf_8').readlines()
         except UnicodeError as Xstr:
-            FileIn = codecs.open(File.Path, mode='rb', encoding='utf_16').readlines()
+            FileIn = codecs.open(File.Path, mode='rb',
+                                 encoding='utf_16').readlines()
         except UnicodeError:
-            FileIn = codecs.open(File.Path, mode='rb', encoding='utf_16_le').readlines()
+            FileIn = codecs.open(File.Path, mode='rb',
+                                 encoding='utf_16_le').readlines()
         except:
-            EdkLogger.Error("Unicode File Parser", ToolError.FILE_OPEN_FAILURE, ExtraData=File.Path)
-
+            EdkLogger.Error("Unicode File Parser",
+                            ToolError.FILE_OPEN_FAILURE, ExtraData=File.Path)
 
         #
         # get the file header
@@ -456,7 +491,7 @@ class UniFileClassObject(object):
             if Line == u'':
                 continue
             if Line.startswith(DT.TAB_COMMENT_EDK1_SPLIT) and (Line.find(DT.TAB_HEADER_COMMENT) > -1) \
-                and not HeaderEnd and not HeaderStart:
+                    and not HeaderEnd and not HeaderStart:
                 HeaderStart = True
             if not Line.startswith(DT.TAB_COMMENT_EDK1_SPLIT) and HeaderStart and not HeaderEnd:
                 HeaderEnd = True
@@ -494,7 +529,8 @@ class UniFileClassObject(object):
                 # there should be only one line feed character between them
                 #
                 if MultiLineFeedExits:
-                    EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
+                    EdkLogger.Error("Unicode File Parser",
+                                    ToolError.FORMAT_INVALID, ExtraData=File.Path)
                 continue
 
             MultiLineFeedExits = False
@@ -509,8 +545,8 @@ class UniFileClassObject(object):
                     FileIn[LineCount-1] = Line
                     FileIn[LineCount] = '\r\n'
                     LineCount -= 1
-                    for Index in range (LineCount + 1, len (FileIn) - 1):
-                        if (Index == len(FileIn) -1):
+                    for Index in range(LineCount + 1, len(FileIn) - 1):
+                        if (Index == len(FileIn) - 1):
                             FileIn[Index] = '\r\n'
                         else:
                             FileIn[Index] = FileIn[Index + 1]
@@ -521,9 +557,11 @@ class UniFileClassObject(object):
                     if Line[CommIndex+1] == u'/':
                         Line = Line[:CommIndex].strip()
                     else:
-                        EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
+                        EdkLogger.Error(
+                            "Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
                 else:
-                    EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
+                    EdkLogger.Error("Unicode File Parser",
+                                    ToolError.FORMAT_INVALID, ExtraData=File.Path)
 
             Line = Line.replace(UNICODE_WIDE_CHAR, WIDE_CHAR)
             Line = Line.replace(UNICODE_NARROW_CHAR, NARROW_CHAR)
@@ -545,7 +583,7 @@ class UniFileClassObject(object):
                 if not Line.endswith('"'):
                     EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID,
                                     ExtraData='''The line %s misses '"' at the end of it in file %s'''
-                                                 % (LineCount, File.Path))
+                                    % (LineCount, File.Path))
 
             #
             # Between Name entry and Language entry can not contain line feed
@@ -624,31 +662,34 @@ class UniFileClassObject(object):
         #
 
         if not IsIncludeFile and not Lines:
-            EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, \
-                Message=ST.ERR_UNIPARSE_NO_SECTION_EXIST, \
-                ExtraData=File.Path)
+            EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID,
+                            Message=ST.ERR_UNIPARSE_NO_SECTION_EXIST,
+                            ExtraData=File.Path)
 
         NewLines = []
         StrName = u''
         ExistStrNameList = []
         for Line in Lines:
             if StrName and not StrName.split()[1].startswith(DT.TAB_STR_TOKENCNAME + DT.TAB_UNDERLINE_SPLIT):
-                EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, \
-                                Message=ST.ERR_UNIPARSE_STRNAME_FORMAT_ERROR % StrName.split()[1], \
+                EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID,
+                                Message=ST.ERR_UNIPARSE_STRNAME_FORMAT_ERROR % StrName.split()[
+                                    1],
                                 ExtraData=File.Path)
 
             if StrName and len(StrName.split()[1].split(DT.TAB_UNDERLINE_SPLIT)) == 4:
-                StringTokenList = StrName.split()[1].split(DT.TAB_UNDERLINE_SPLIT)
-                if (StringTokenList[3].upper() in [DT.TAB_STR_TOKENPROMPT, DT.TAB_STR_TOKENHELP] and \
-                    StringTokenList[3] not in [DT.TAB_STR_TOKENPROMPT, DT.TAB_STR_TOKENHELP]) or \
-                    (StringTokenList[2].upper() == DT.TAB_STR_TOKENERR and StringTokenList[2] != DT.TAB_STR_TOKENERR):
-                    EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, \
-                                Message=ST.ERR_UNIPARSE_STRTOKEN_FORMAT_ERROR % StrName.split()[1], \
-                                ExtraData=File.Path)
+                StringTokenList = StrName.split()[1].split(
+                    DT.TAB_UNDERLINE_SPLIT)
+                if (StringTokenList[3].upper() in [DT.TAB_STR_TOKENPROMPT, DT.TAB_STR_TOKENHELP] and
+                        StringTokenList[3] not in [DT.TAB_STR_TOKENPROMPT, DT.TAB_STR_TOKENHELP]) or \
+                        (StringTokenList[2].upper() == DT.TAB_STR_TOKENERR and StringTokenList[2] != DT.TAB_STR_TOKENERR):
+                    EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID,
+                                    Message=ST.ERR_UNIPARSE_STRTOKEN_FORMAT_ERROR % StrName.split()[
+                                        1],
+                                    ExtraData=File.Path)
 
             if Line.count(u'#language') > 1:
-                EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, \
-                                Message=ST.ERR_UNIPARSE_SEP_LANGENTRY_LINE % Line, \
+                EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID,
+                                Message=ST.ERR_UNIPARSE_SEP_LANGENTRY_LINE % Line,
                                 ExtraData=File.Path)
 
             if Line.startswith(u'//'):
@@ -661,74 +702,88 @@ class UniFileClassObject(object):
                     NewLines.append(Line[:Line.find(u'"')].strip())
                     NewLines.append(Line[Line.find(u'"'):])
                 else:
-                    EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
+                    EdkLogger.Error("Unicode File Parser",
+                                    ToolError.FORMAT_INVALID, ExtraData=File.Path)
             elif Line.startswith(u'#string'):
                 if len(Line.split()) == 2:
                     StrName = Line
                     if StrName:
                         if StrName.split()[1] not in ExistStrNameList:
                             ExistStrNameList.append(StrName.split()[1].strip())
-                        elif StrName.split()[1] in [DT.TAB_INF_ABSTRACT, DT.TAB_INF_DESCRIPTION, \
-                                                    DT.TAB_INF_BINARY_ABSTRACT, DT.TAB_INF_BINARY_DESCRIPTION, \
-                                                    DT.TAB_DEC_PACKAGE_ABSTRACT, DT.TAB_DEC_PACKAGE_DESCRIPTION, \
+                        elif StrName.split()[1] in [DT.TAB_INF_ABSTRACT, DT.TAB_INF_DESCRIPTION,
+                                                    DT.TAB_INF_BINARY_ABSTRACT, DT.TAB_INF_BINARY_DESCRIPTION,
+                                                    DT.TAB_DEC_PACKAGE_ABSTRACT, DT.TAB_DEC_PACKAGE_DESCRIPTION,
                                                     DT.TAB_DEC_BINARY_ABSTRACT, DT.TAB_DEC_BINARY_DESCRIPTION]:
-                            EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, \
-                                            Message=ST.ERR_UNIPARSE_MULTI_ENTRY_EXIST % StrName.split()[1], \
+                            EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID,
+                                            Message=ST.ERR_UNIPARSE_MULTI_ENTRY_EXIST % StrName.split()[
+                                                1],
                                             ExtraData=File.Path)
                     continue
                 elif len(Line.split()) == 4 and Line.find(u'#language') > 0:
                     if Line[Line.find(u'#language')-1] != ' ' or \
                        Line[Line.find(u'#language')+len(u'#language')] != u' ':
-                        EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
+                        EdkLogger.Error(
+                            "Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
 
                     if Line.find(u'"') > 0:
-                        EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
+                        EdkLogger.Error(
+                            "Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
 
                     StrName = Line.split()[0] + u' ' + Line.split()[1]
                     if StrName:
                         if StrName.split()[1] not in ExistStrNameList:
                             ExistStrNameList.append(StrName.split()[1].strip())
-                        elif StrName.split()[1] in [DT.TAB_INF_ABSTRACT, DT.TAB_INF_DESCRIPTION, \
-                                                    DT.TAB_INF_BINARY_ABSTRACT, DT.TAB_INF_BINARY_DESCRIPTION, \
-                                                    DT.TAB_DEC_PACKAGE_ABSTRACT, DT.TAB_DEC_PACKAGE_DESCRIPTION, \
+                        elif StrName.split()[1] in [DT.TAB_INF_ABSTRACT, DT.TAB_INF_DESCRIPTION,
+                                                    DT.TAB_INF_BINARY_ABSTRACT, DT.TAB_INF_BINARY_DESCRIPTION,
+                                                    DT.TAB_DEC_PACKAGE_ABSTRACT, DT.TAB_DEC_PACKAGE_DESCRIPTION,
                                                     DT.TAB_DEC_BINARY_ABSTRACT, DT.TAB_DEC_BINARY_DESCRIPTION]:
-                            EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, \
-                                            Message=ST.ERR_UNIPARSE_MULTI_ENTRY_EXIST % StrName.split()[1], \
+                            EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID,
+                                            Message=ST.ERR_UNIPARSE_MULTI_ENTRY_EXIST % StrName.split()[
+                                                1],
                                             ExtraData=File.Path)
                     if IsIncludeFile:
                         if StrName not in NewLines:
-                            NewLines.append((Line[:Line.find(u'#language')]).strip())
+                            NewLines.append(
+                                (Line[:Line.find(u'#language')]).strip())
                     else:
-                        NewLines.append((Line[:Line.find(u'#language')]).strip())
+                        NewLines.append(
+                            (Line[:Line.find(u'#language')]).strip())
                     NewLines.append((Line[Line.find(u'#language'):]).strip())
                 elif len(Line.split()) > 4 and Line.find(u'#language') > 0 and Line.find(u'"') > 0:
                     if Line[Line.find(u'#language')-1] != u' ' or \
                        Line[Line.find(u'#language')+len(u'#language')] != u' ':
-                        EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
+                        EdkLogger.Error(
+                            "Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
 
                     if Line[Line.find(u'"')-1] != u' ':
-                        EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
+                        EdkLogger.Error(
+                            "Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
 
                     StrName = Line.split()[0] + u' ' + Line.split()[1]
                     if StrName:
                         if StrName.split()[1] not in ExistStrNameList:
                             ExistStrNameList.append(StrName.split()[1].strip())
-                        elif StrName.split()[1] in [DT.TAB_INF_ABSTRACT, DT.TAB_INF_DESCRIPTION, \
-                                                    DT.TAB_INF_BINARY_ABSTRACT, DT.TAB_INF_BINARY_DESCRIPTION, \
-                                                    DT.TAB_DEC_PACKAGE_ABSTRACT, DT.TAB_DEC_PACKAGE_DESCRIPTION, \
+                        elif StrName.split()[1] in [DT.TAB_INF_ABSTRACT, DT.TAB_INF_DESCRIPTION,
+                                                    DT.TAB_INF_BINARY_ABSTRACT, DT.TAB_INF_BINARY_DESCRIPTION,
+                                                    DT.TAB_DEC_PACKAGE_ABSTRACT, DT.TAB_DEC_PACKAGE_DESCRIPTION,
                                                     DT.TAB_DEC_BINARY_ABSTRACT, DT.TAB_DEC_BINARY_DESCRIPTION]:
-                            EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, \
-                                            Message=ST.ERR_UNIPARSE_MULTI_ENTRY_EXIST % StrName.split()[1], \
+                            EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID,
+                                            Message=ST.ERR_UNIPARSE_MULTI_ENTRY_EXIST % StrName.split()[
+                                                1],
                                             ExtraData=File.Path)
                     if IsIncludeFile:
                         if StrName not in NewLines:
-                            NewLines.append((Line[:Line.find(u'#language')]).strip())
+                            NewLines.append(
+                                (Line[:Line.find(u'#language')]).strip())
                     else:
-                        NewLines.append((Line[:Line.find(u'#language')]).strip())
-                    NewLines.append((Line[Line.find(u'#language'):Line.find(u'"')]).strip())
+                        NewLines.append(
+                            (Line[:Line.find(u'#language')]).strip())
+                    NewLines.append(
+                        (Line[Line.find(u'#language'):Line.find(u'"')]).strip())
                     NewLines.append((Line[Line.find(u'"'):]).strip())
                 else:
-                    EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
+                    EdkLogger.Error("Unicode File Parser",
+                                    ToolError.FORMAT_INVALID, ExtraData=File.Path)
             elif Line.startswith(u'#language'):
                 if len(Line.split()) == 2:
                     if IsIncludeFile:
@@ -746,23 +801,27 @@ class UniFileClassObject(object):
                     NewLines.append((Line[:Line.find(u'"')]).strip())
                     NewLines.append((Line[Line.find(u'"'):]).strip())
                 else:
-                    EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
+                    EdkLogger.Error("Unicode File Parser",
+                                    ToolError.FORMAT_INVALID, ExtraData=File.Path)
             elif Line.startswith(u'"'):
-                if u'#string' in Line  or u'#language' in Line:
-                    EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
+                if u'#string' in Line or u'#language' in Line:
+                    EdkLogger.Error("Unicode File Parser",
+                                    ToolError.FORMAT_INVALID, ExtraData=File.Path)
                 NewLines.append(Line)
             else:
                 print(Line)
-                EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, ExtraData=File.Path)
+                EdkLogger.Error("Unicode File Parser",
+                                ToolError.FORMAT_INVALID, ExtraData=File.Path)
 
         if StrName and not StrName.split()[1].startswith(u'STR_'):
-            EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, \
-                                Message=ST.ERR_UNIPARSE_STRNAME_FORMAT_ERROR % StrName.split()[1], \
-                                ExtraData=File.Path)
+            EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID,
+                            Message=ST.ERR_UNIPARSE_STRNAME_FORMAT_ERROR % StrName.split()[
+                                1],
+                            ExtraData=File.Path)
 
         if StrName and not NewLines:
-            EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, \
-                            Message=ST.ERR_UNI_MISS_LANGENTRY % StrName, \
+            EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID,
+                            Message=ST.ERR_UNI_MISS_LANGENTRY % StrName,
                             ExtraData=File.Path)
 
         #
@@ -785,33 +844,34 @@ class UniFileClassObject(object):
                     DescriptionPosition = ExistStrNameList.index(StrName)
 
         OrderList = sorted([AbstractPosition, DescriptionPosition])
-        BinaryOrderList = sorted([BinaryAbstractPosition, BinaryDescriptionPosition])
+        BinaryOrderList = sorted(
+            [BinaryAbstractPosition, BinaryDescriptionPosition])
         Min = OrderList[0]
         Max = OrderList[1]
         BinaryMin = BinaryOrderList[0]
         BinaryMax = BinaryOrderList[1]
         if BinaryDescriptionPosition > -1:
-            if not(BinaryDescriptionPosition == BinaryMax and BinaryAbstractPosition == BinaryMin and \
+            if not(BinaryDescriptionPosition == BinaryMax and BinaryAbstractPosition == BinaryMin and
                    BinaryMax > Max):
-                EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, \
-                                Message=ST.ERR_UNIPARSE_ENTRY_ORDER_WRONG, \
+                EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID,
+                                Message=ST.ERR_UNIPARSE_ENTRY_ORDER_WRONG,
                                 ExtraData=File.Path)
         elif BinaryAbstractPosition > -1:
             if not(BinaryAbstractPosition > Max):
-                EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, \
-                                Message=ST.ERR_UNIPARSE_ENTRY_ORDER_WRONG, \
+                EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID,
+                                Message=ST.ERR_UNIPARSE_ENTRY_ORDER_WRONG,
                                 ExtraData=File.Path)
 
-        if  DescriptionPosition > -1:
-            if not(DescriptionPosition == Max and AbstractPosition == Min and \
+        if DescriptionPosition > -1:
+            if not(DescriptionPosition == Max and AbstractPosition == Min and
                    DescriptionPosition > AbstractPosition):
-                EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID, \
-                                Message=ST.ERR_UNIPARSE_ENTRY_ORDER_WRONG, \
+                EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID,
+                                Message=ST.ERR_UNIPARSE_ENTRY_ORDER_WRONG,
                                 ExtraData=File.Path)
 
         if not self.UniFileHeader:
             EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID,
-                            Message = ST.ERR_NO_SOURCE_HEADER,
+                            Message=ST.ERR_NO_SOURCE_HEADER,
                             ExtraData=File.Path)
 
         return NewLines
@@ -819,7 +879,7 @@ class UniFileClassObject(object):
     #
     # Load a .uni file
     #
-    def LoadUniFile(self, File = None):
+    def LoadUniFile(self, File=None):
         if File is None:
             EdkLogger.Error("Unicode File Parser",
                             ToolError.PARSER_ERROR,
@@ -867,34 +927,39 @@ class UniFileClassObject(object):
             #     "Mi segunda secuencia 2"
             #
             if Line.find(u'#string ') >= 0 and Line.find(u'#language ') < 0 and \
-                SecondLine.find(u'#string ') < 0 and SecondLine.find(u'#language ') >= 0 and \
-                ThirdLine.find(u'#string ') < 0 and ThirdLine.find(u'#language ') < 0:
+                    SecondLine.find(u'#string ') < 0 and SecondLine.find(u'#language ') >= 0 and \
+                    ThirdLine.find(u'#string ') < 0 and ThirdLine.find(u'#language ') < 0:
                 if Line.find('"') > 0 or SecondLine.find('"') > 0:
                     EdkLogger.Error("Unicode File Parser", ToolError.FORMAT_INVALID,
-                                Message=ST.ERR_UNIPARSE_DBLQUOTE_UNMATCHED,
-                                ExtraData=File.Path)
+                                    Message=ST.ERR_UNIPARSE_DBLQUOTE_UNMATCHED,
+                                    ExtraData=File.Path)
 
-                Name = Line[Line.find(u'#string ') + len(u'#string ') : ].strip(' ')
-                Language = SecondLine[SecondLine.find(u'#language ') + len(u'#language ') : ].strip(' ')
+                Name = Line[Line.find(u'#string ') +
+                            len(u'#string '):].strip(' ')
+                Language = SecondLine[SecondLine.find(
+                    u'#language ') + len(u'#language '):].strip(' ')
                 for IndexJ in range(IndexI + 2, len(Lines)):
                     if Lines[IndexJ].find(u'#string ') < 0 and Lines[IndexJ].find(u'#language ') < 0 and \
-                    Lines[IndexJ].strip().startswith(u'"') and Lines[IndexJ].strip().endswith(u'"'):
+                            Lines[IndexJ].strip().startswith(u'"') and Lines[IndexJ].strip().endswith(u'"'):
                         if Lines[IndexJ][-2] == ' ':
                             CombineToken = True
                         if CombineToken:
                             if Lines[IndexJ].strip()[1:-1].strip():
-                                Value = Value + Lines[IndexJ].strip()[1:-1].rstrip() + ' '
+                                Value = Value + \
+                                    Lines[IndexJ].strip()[1:-1].rstrip() + ' '
                             else:
                                 Value = Value + Lines[IndexJ].strip()[1:-1]
                             CombineToken = False
                         else:
-                            Value = Value + Lines[IndexJ].strip()[1:-1] + '\r\n'
+                            Value = Value + \
+                                Lines[IndexJ].strip()[1:-1] + '\r\n'
                     else:
                         IndexI = IndexJ
                         break
                 if Value.endswith('\r\n'):
                     Value = Value[: Value.rfind('\r\n')]
-                Language = GetLanguageCode(Language, self.IsCompatibleMode, self.File)
+                Language = GetLanguageCode(
+                    Language, self.IsCompatibleMode, self.File)
                 self.AddStringToList(Name, Language, Value)
                 continue
 
@@ -911,7 +976,7 @@ class UniFileClassObject(object):
     #
     # Add a string to list
     #
-    def AddStringToList(self, Name, Language, Value, Token = 0, Referenced = False, UseOtherLangDef = '', Index = -1):
+    def AddStringToList(self, Name, Language, Value, Token=0, Referenced=False, UseOtherLangDef='', Index=-1):
         for LangNameItem in self.LanguageDef:
             if Language == LangNameItem[0]:
                 break
@@ -953,7 +1018,8 @@ class UniFileClassObject(object):
                                                                                         Referenced,
                                                                                         Token,
                                                                                         OtherLangDef))
-                        self.OrderedStringDict[LangName[0]][Name] = len(self.OrderedStringList[LangName[0]]) - 1
+                        self.OrderedStringDict[LangName[0]][Name] = len(
+                            self.OrderedStringList[LangName[0]]) - 1
             else:
                 self.OrderedStringList[Language].insert(Index, StringDefClassObject(Name,
                                                                                     Value,
@@ -1015,7 +1081,7 @@ class UniFileClassObject(object):
         # Use small token for all referred string stoken.
         #
         RefToken = 0
-        for Index in range (0, len (self.OrderedStringList[FirstLangName])):
+        for Index in range(0, len(self.OrderedStringList[FirstLangName])):
             FirstLangItem = self.OrderedStringList[FirstLangName][Index]
             if FirstLangItem.Referenced == True:
                 for LangNameItem in self.LanguageDef:
@@ -1030,7 +1096,7 @@ class UniFileClassObject(object):
         # Use big token for all unreferred string stoken.
         #
         UnRefToken = 0
-        for Index in range (0, len (self.OrderedStringList[FirstLangName])):
+        for Index in range(0, len(self.OrderedStringList[FirstLangName])):
             FirstLangItem = self.OrderedStringList[FirstLangName][Index]
             if FirstLangItem.Referenced == False:
                 for LangNameItem in self.LanguageDef:
@@ -1045,7 +1111,7 @@ class UniFileClassObject(object):
     #
     def ShowMe(self):
         print(self.LanguageDef)
-        #print self.OrderedStringList
+        # print self.OrderedStringList
         for Item in self.OrderedStringList:
             print(Item)
             for Member in self.OrderedStringList[Item]:
@@ -1060,15 +1126,18 @@ class UniFileClassObject(object):
 
         if not os.path.exists(FilaPath) or not os.path.isfile(FilaPath):
             EdkLogger.Error("Unicode File Parser",
-                             ToolError.FILE_NOT_FOUND,
-                             ExtraData=FilaPath)
+                            ToolError.FILE_NOT_FOUND,
+                            ExtraData=FilaPath)
         try:
-            FileIn = codecs.open(FilaPath, mode='rb', encoding='utf_8').readlines()
+            FileIn = codecs.open(FilaPath, mode='rb',
+                                 encoding='utf_8').readlines()
         except UnicodeError as Xstr:
-            FileIn = codecs.open(FilaPath, mode='rb', encoding='utf_16').readlines()
+            FileIn = codecs.open(FilaPath, mode='rb',
+                                 encoding='utf_16').readlines()
         except UnicodeError:
-            FileIn = codecs.open(FilaPath, mode='rb', encoding='utf_16_le').readlines()
+            FileIn = codecs.open(FilaPath, mode='rb',
+                                 encoding='utf_16_le').readlines()
         except:
-            EdkLogger.Error("Unicode File Parser", ToolError.FILE_OPEN_FAILURE, ExtraData=FilaPath)
+            EdkLogger.Error("Unicode File Parser",
+                            ToolError.FILE_OPEN_FAILURE, ExtraData=FilaPath)
         return FileIn
-
diff --git a/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py b/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py
index 94e97fa45c12..8cbf8aa6b76a 100644
--- a/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py
+++ b/BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This is an XML API that uses a syntax similar to XPath, but it is written in
 # standard python so that no extra python packages are required to use it.
 #
@@ -20,13 +20,15 @@ import codecs
 from Logger.ToolError import PARSER_ERROR
 import Logger.Log as Logger
 
-## Create a element of XML
+# Create a element of XML
 #
 # @param Name
 # @param String
 # @param NodeList
 # @param AttributeList
 #
+
+
 def CreateXmlElement(Name, String, NodeList, AttributeList):
     Doc = xml.dom.minidom.Document()
     Element = Doc.createElement(Name)
@@ -51,7 +53,7 @@ def CreateXmlElement(Name, String, NodeList, AttributeList):
 
     return Element
 
-## Get a list of XML nodes using XPath style syntax.
+# Get a list of XML nodes using XPath style syntax.
 #
 # Return a list of XML DOM nodes from the root Dom specified by XPath String.
 # If the input Dom or String is not valid, then an empty list is returned.
@@ -59,6 +61,8 @@ def CreateXmlElement(Name, String, NodeList, AttributeList):
 # @param  Dom                The root XML DOM node.
 # @param  String             A XPath style path.
 #
+
+
 def XmlList(Dom, String):
     if String is None or String == "" or Dom is None or Dom == "":
         return []
@@ -74,7 +78,7 @@ def XmlList(Dom, String):
         ChildNodes = []
         for Node in Nodes:
             if Node.nodeType == Node.ELEMENT_NODE and Node.tagName == \
-            TagList[Index]:
+                    TagList[Index]:
                 if Index < End:
                     ChildNodes.extend(Node.childNodes)
                 else:
@@ -86,7 +90,7 @@ def XmlList(Dom, String):
     return Nodes
 
 
-## Get a single XML node using XPath style syntax.
+# Get a single XML node using XPath style syntax.
 #
 # Return a single XML DOM node from the root Dom specified by XPath String.
 # If the input Dom or String is not valid, then an empty string is returned.
@@ -95,7 +99,7 @@ def XmlList(Dom, String):
 # @param  String             A XPath style path.
 #
 def XmlNode(Dom, String):
-    if String is None or String == ""  or Dom is None or Dom == "":
+    if String is None or String == "" or Dom is None or Dom == "":
         return None
     if Dom.nodeType == Dom.DOCUMENT_NODE:
         Dom = Dom.documentElement
@@ -118,7 +122,7 @@ def XmlNode(Dom, String):
     return None
 
 
-## Get a single XML element using XPath style syntax.
+# Get a single XML element using XPath style syntax.
 #
 # Return a single XML element from the root Dom specified by XPath String.
 # If the input Dom or String is not valid, then an empty string is returned.
@@ -132,7 +136,7 @@ def XmlElement(Dom, String):
     except BaseException:
         return ""
 
-## Get a single XML element using XPath style syntax.
+# Get a single XML element using XPath style syntax.
 #
 # Similar with XmlElement, but do not strip all the leading and tailing space
 # and newline, instead just remove the newline and spaces introduced by
@@ -141,6 +145,8 @@ def XmlElement(Dom, String):
 # @param  Dom                The root XML DOM object.
 # @param  Strin              A XPath style path.
 #
+
+
 def XmlElement2(Dom, String):
     try:
         HelpStr = XmlNode(Dom, String).firstChild.data
@@ -151,7 +157,7 @@ def XmlElement2(Dom, String):
         return ""
 
 
-## Get a single XML element of the current node.
+# Get a single XML element of the current node.
 #
 # Return a single XML element specified by the current root Dom.
 # If the input Dom is not valid, then an empty string is returned.
@@ -165,7 +171,7 @@ def XmlElementData(Dom):
         return ""
 
 
-## Get a list of XML elements using XPath style syntax.
+# Get a list of XML elements using XPath style syntax.
 #
 # Return a list of XML elements from the root Dom specified by XPath String.
 # If the input Dom or String is not valid, then an empty list is returned.
@@ -177,7 +183,7 @@ def XmlElementList(Dom, String):
     return list(map(XmlElementData, XmlList(Dom, String)))
 
 
-## Get the XML attribute of the current node.
+# Get the XML attribute of the current node.
 #
 # Return a single XML attribute named Attribute from the current root Dom.
 # If the input Dom or Attribute is not valid, then an empty string is returned.
@@ -192,7 +198,7 @@ def XmlAttribute(Dom, Attribute):
         return ''
 
 
-## Get the XML node name of the current node.
+# Get the XML node name of the current node.
 #
 # Return a single XML node name from the current root Dom.
 # If the input Dom is not valid, then an empty string is returned.
@@ -205,13 +211,15 @@ def XmlNodeName(Dom):
     except BaseException:
         return ''
 
-## Parse an XML file.
+# Parse an XML file.
 #
 # Parse the input XML file named FileName and return a XML DOM it stands for.
 # If the input File is not a valid XML file, then an empty string is returned.
 #
 # @param  FileName           The XML file name.
 #
+
+
 def XmlParseFile(FileName):
     try:
         XmlFile = codecs.open(FileName, 'rb')
@@ -220,4 +228,5 @@ def XmlParseFile(FileName):
         return Dom
     except BaseException as XExcept:
         XmlFile.close()
-        Logger.Error('\nUPT', PARSER_ERROR, XExcept, File=FileName, RaiseError=True)
+        Logger.Error('\nUPT', PARSER_ERROR, XExcept,
+                     File=FileName, RaiseError=True)
diff --git a/BaseTools/Source/Python/UPT/Library/Xml/__init__.py b/BaseTools/Source/Python/UPT/Library/Xml/__init__.py
index 172e498451b8..03dedeed636e 100644
--- a/BaseTools/Source/Python/UPT/Library/Xml/__init__.py
+++ b/BaseTools/Source/Python/UPT/Library/Xml/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'Library' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/UPT/Library/__init__.py b/BaseTools/Source/Python/UPT/Library/__init__.py
index 07b5b75dd51e..bb3ceeeb7572 100644
--- a/BaseTools/Source/Python/UPT/Library/__init__.py
+++ b/BaseTools/Source/Python/UPT/Library/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'Library' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/UPT/Logger/Log.py b/BaseTools/Source/Python/UPT/Logger/Log.py
index a2e32a6236ac..425bcc7c45a4 100644
--- a/BaseTools/Source/Python/UPT/Logger/Log.py
+++ b/BaseTools/Source/Python/UPT/Logger/Log.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file implements the log mechanism for Python tools.
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -10,7 +10,7 @@
 Logger
 '''
 
-## Import modules
+# Import modules
 from sys import argv
 from sys import stdout
 from sys import stderr
@@ -42,12 +42,12 @@ DEBUG_7 = 8
 DEBUG_8 = 9
 DEBUG_9 = 10
 VERBOSE = 15
-INFO    = 20
-WARN    = 30
-QUIET   = 40
+INFO = 20
+WARN = 30
+QUIET = 40
 QUIET_1 = 41
-ERROR   = 50
-SILENT  = 60
+ERROR = 50
+SILENT = 60
 
 IS_RAISE_ERROR = True
 SUPRESS_ERROR = False
@@ -59,15 +59,15 @@ _TOOL_NAME = os.path.basename(argv[0])
 #
 # For validation purpose
 #
-_LOG_LEVELS = [DEBUG_0, DEBUG_1, DEBUG_2, DEBUG_3, DEBUG_4, DEBUG_5, DEBUG_6, \
-              DEBUG_7, DEBUG_8, DEBUG_9, VERBOSE, WARN, INFO, ERROR, QUIET, \
-              QUIET_1, SILENT]
+_LOG_LEVELS = [DEBUG_0, DEBUG_1, DEBUG_2, DEBUG_3, DEBUG_4, DEBUG_5, DEBUG_6,
+               DEBUG_7, DEBUG_8, DEBUG_9, VERBOSE, WARN, INFO, ERROR, QUIET,
+               QUIET_1, SILENT]
 #
 # For DEBUG level (All DEBUG_0~9 are applicable)
 #
 _DEBUG_LOGGER = getLogger("tool_debug")
-_DEBUG_FORMATTER = Formatter("[%(asctime)s.%(msecs)d]: %(message)s", \
-                            datefmt="%H:%M:%S")
+_DEBUG_FORMATTER = Formatter("[%(asctime)s.%(msecs)d]: %(message)s",
+                             datefmt="%H:%M:%S")
 #
 # For VERBOSE, INFO, WARN level
 #
@@ -83,10 +83,10 @@ _ERROR_FORMATTER = Formatter("%(message)s")
 # String templates for ERROR/WARN/DEBUG log message
 #
 _ERROR_MESSAGE_TEMPLATE = \
-('\n\n%(tool)s...\n%(file)s(%(line)s): error %(errorcode)04X: %(msg)s\n\t%(extra)s')
+    ('\n\n%(tool)s...\n%(file)s(%(line)s): error %(errorcode)04X: %(msg)s\n\t%(extra)s')
 
 __ERROR_MESSAGE_TEMPLATE_WITHOUT_FILE = \
-'\n\n%(tool)s...\n : error %(errorcode)04X: %(msg)s\n\t%(extra)s'
+    '\n\n%(tool)s...\n : error %(errorcode)04X: %(msg)s\n\t%(extra)s'
 
 _WARNING_MESSAGE_TEMPLATE = '%(tool)s...\n%(file)s(%(line)s): warning: %(msg)s'
 _WARNING_MESSAGE_TEMPLATE_WITHOUT_FILE = '%(tool)s: : warning: %(msg)s'
@@ -104,15 +104,19 @@ def Info(msg, *args, **kwargs):
 #
 # Log information which should be always put out
 #
+
+
 def Quiet(msg, *args, **kwargs):
     _ERROR_LOGGER.error(msg, *args, **kwargs)
 
-## Log debug message
+# Log debug message
 #
 #   @param  Level       DEBUG level (DEBUG0~9)
 #   @param  Message     Debug information
 #   @param  ExtraData   More information associated with "Message"
 #
+
+
 def Debug(Level, Message, ExtraData=None):
     if _DEBUG_LOGGER.level > Level:
         return
@@ -123,9 +127,9 @@ def Debug(Level, Message, ExtraData=None):
     #
     CallerStack = extract_stack()[-2]
     TemplateDict = {
-        "file"      : CallerStack[0],
-        "line"      : CallerStack[1],
-        "msg"       : Message,
+        "file": CallerStack[0],
+        "line": CallerStack[1],
+        "msg": Message,
     }
 
     if ExtraData is not None:
@@ -135,14 +139,16 @@ def Debug(Level, Message, ExtraData=None):
 
     _DEBUG_LOGGER.log(Level, LogText)
 
-## Log verbose message
+# Log verbose message
 #
 #   @param  Message     Verbose information
 #
+
+
 def Verbose(Message):
     return _INFO_LOGGER.log(VERBOSE, Message)
 
-## Log warning message
+# Log warning message
 #
 #   Warning messages are those which might be wrong but won't fail the tool.
 #
@@ -153,6 +159,8 @@ def Verbose(Message):
 #   @param  Line        The line number in the "File" which caused the warning.
 #   @param  ExtraData   More information associated with "Message"
 #
+
+
 def Warn(ToolName, Message, File=None, Line=None, ExtraData=None):
     if _INFO_LOGGER.level > WARN:
         return
@@ -168,10 +176,10 @@ def Warn(ToolName, Message, File=None, Line=None, ExtraData=None):
         Line = "%d" % Line
 
     TemplateDict = {
-        "tool"      : ToolName,
-        "file"      : File,
-        "line"      : Line,
-        "msg"       : Message,
+        "tool": ToolName,
+        "file": File,
+        "line": Line,
+        "msg": Message,
     }
 
     if File is not None:
@@ -189,7 +197,7 @@ def Warn(ToolName, Message, File=None, Line=None, ExtraData=None):
     if GlobalData.gWARNING_AS_ERROR == True:
         raise FatalError(WARNING_AS_ERROR)
 
-## Log ERROR message
+# Log ERROR message
 #
 # Once an error messages is logged, the tool's execution will be broken by
 # raising an exception. If you don't want to break the execution later, you
@@ -205,7 +213,9 @@ def Warn(ToolName, Message, File=None, Line=None, ExtraData=None):
 #   @param  RaiseError  Raise an exception to break the tool's execution if
 #                       it's True. This is the default behavior.
 #
-def Error(ToolName, ErrorCode, Message=None, File=None, Line=None, \
+
+
+def Error(ToolName, ErrorCode, Message=None, File=None, Line=None,
           ExtraData=None, RaiseError=IS_RAISE_ERROR):
     if ToolName:
         pass
@@ -224,16 +234,16 @@ def Error(ToolName, ErrorCode, Message=None, File=None, Line=None, \
         ExtraData = ""
 
     TemplateDict = {
-        "tool"      : _TOOL_NAME,
-        "file"      : File,
-        "line"      : Line,
-        "errorcode" : ErrorCode,
-        "msg"       : Message,
-        "extra"     : ExtraData
+        "tool": _TOOL_NAME,
+        "file": File,
+        "line": Line,
+        "errorcode": ErrorCode,
+        "msg": Message,
+        "extra": ExtraData
     }
 
     if File is not None:
-        LogText =  _ERROR_MESSAGE_TEMPLATE % TemplateDict
+        LogText = _ERROR_MESSAGE_TEMPLATE % TemplateDict
     else:
         LogText = __ERROR_MESSAGE_TEMPLATE_WITHOUT_FILE % TemplateDict
 
@@ -243,7 +253,7 @@ def Error(ToolName, ErrorCode, Message=None, File=None, Line=None, \
         raise FatalError(ErrorCode)
 
 
-## Initialize log system
+# Initialize log system
 #
 def Initialize():
     #
@@ -272,33 +282,39 @@ def Initialize():
     _ERROR_LOGGER.addHandler(_ErrorCh)
 
 
-## Set log level
+# Set log level
 #
 #   @param  Level   One of log level in _LogLevel
 #
 def SetLevel(Level):
     if Level not in _LOG_LEVELS:
-        Info("Not supported log level (%d). Use default level instead." % \
+        Info("Not supported log level (%d). Use default level instead." %
              Level)
         Level = INFO
     _DEBUG_LOGGER.setLevel(Level)
     _INFO_LOGGER.setLevel(Level)
     _ERROR_LOGGER.setLevel(Level)
 
-## Get current log level
+# Get current log level
 #
+
+
 def GetLevel():
     return _INFO_LOGGER.getEffectiveLevel()
 
-## Raise up warning as error
+# Raise up warning as error
 #
+
+
 def SetWarningAsError():
     GlobalData.gWARNING_AS_ERROR = True
 
-## Specify a file to store the log message as well as put on console
+# Specify a file to store the log message as well as put on console
 #
 #   @param  LogFile     The file path used to store the log message
 #
+
+
 def SetLogFile(LogFile):
     if os.path.exists(LogFile):
         remove(LogFile)
@@ -314,6 +330,3 @@ def SetLogFile(LogFile):
     _Ch = FileHandler(LogFile)
     _Ch.setFormatter(_ERROR_FORMATTER)
     _ERROR_LOGGER.addHandler(_Ch)
-
-
-
diff --git a/BaseTools/Source/Python/UPT/Logger/StringTable.py b/BaseTools/Source/Python/UPT/Logger/StringTable.py
index 13c015844ea0..317885048dfb 100644
--- a/BaseTools/Source/Python/UPT/Logger/StringTable.py
+++ b/BaseTools/Source/Python/UPT/Logger/StringTable.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define strings used in the UPT tool
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -16,7 +16,7 @@ import gettext
 # string table starts here...
 #
 
-## strings are classified as following types
+# strings are classified as following types
 #    MSG_...: it is a message string
 #    ERR_...: it is a error string
 #    WRN_...: it is a warning string
@@ -26,24 +26,25 @@ import gettext
 _ = gettext.gettext
 
 MSG_USAGE_STRING = _("\n"
-    "UEFI Packaging Tool (UEFIPT)\n"
-    "%prog [options]"
-    )
+                     "UEFI Packaging Tool (UEFIPT)\n"
+                     "%prog [options]"
+                     )
 
 ##
 # Version and Copyright
 #
 MSG_VERSION_NUMBER = _("1.1")
-MSG_VERSION = _("UEFI Packaging Tool (UEFIPT) - Revision " + \
+MSG_VERSION = _("UEFI Packaging Tool (UEFIPT) - Revision " +
                 MSG_VERSION_NUMBER)
-MSG_COPYRIGHT = _("Copyright (c) 2011 - 2018 Intel Corporation All Rights Reserved.")
+MSG_COPYRIGHT = _(
+    "Copyright (c) 2011 - 2018 Intel Corporation All Rights Reserved.")
 MSG_VERSION_COPYRIGHT = _("\n  %s\n  %s" % (MSG_VERSION, MSG_COPYRIGHT))
 MSG_USAGE = _("%s [options]\n%s" % ("UPT", MSG_VERSION_COPYRIGHT))
-MSG_DESCRIPTION = _("The UEFIPT is used to create, " + \
-                    "install or remove a UEFI Distribution Package. " + \
-                    "If WORKSPACE environment variable is present, " + \
-                    "then UPT will install packages to the location specified by WORKSPACE, " + \
-                    "otherwise UPT will install packages to the current directory. " + \
+MSG_DESCRIPTION = _("The UEFIPT is used to create, " +
+                    "install or remove a UEFI Distribution Package. " +
+                    "If WORKSPACE environment variable is present, " +
+                    "then UPT will install packages to the location specified by WORKSPACE, " +
+                    "otherwise UPT will install packages to the current directory. " +
                     "Option -n will override this default installation location")
 
 #
@@ -56,23 +57,23 @@ ERR_INF_PARSER_HEADER_MISSGING = _(
 ERR_INF_PARSER_UNKNOWN_SECTION = _("An unknown section was found. "
                                    "It must be corrected before continuing. ")
 ERR_INF_PARSER_NO_SECTION_ERROR = _("No section was found. "
-                            "A section must be included before continuing.")
+                                    "A section must be included before continuing.")
 ERR_INF_PARSER_BUILD_OPTION_FORMAT_INVALID = \
     _("Build Option format incorrect.")
 ERR_INF_PARSER_BINARY_ITEM_FORMAT_INVALID = _(
-     "The format of binary %s item is incorrect. "
-     "It should contain at least %d elements.")
+    "The format of binary %s item is incorrect. "
+    "It should contain at least %d elements.")
 ERR_INF_PARSER_BINARY_ITEM_FORMAT_INVALID_MAX = _(
-     "The format of binary %s item is invalid, "
-     "it should contain not more than %d elements.")
+    "The format of binary %s item is invalid, "
+    "it should contain not more than %d elements.")
 ERR_INF_PARSER_BINARY_ITEM_INVALID_FILETYPE = _(
-     "The Binary FileType is incorrect. It should in %s")
+    "The Binary FileType is incorrect. It should in %s")
 ERR_INF_PARSER_BINARY_ITEM_FILE_NOT_EXIST = _(
-     "The Binary File: %s not exist.")
+    "The Binary File: %s not exist.")
 ERR_INF_PARSER_BINARY_ITEM_FILENAME_NOT_EXIST = _(
-     "The Binary File Name item not exist")
+    "The Binary File Name item not exist")
 ERR_INF_PARSER_BINARY_VER_TYPE = _(
-     "Only this type is allowed: \"%s\".")
+    "Only this type is allowed: \"%s\".")
 ERR_INF_PARSER_MULTI_DEFINE_SECTION = \
     _("Multiple define sections found. "
       "It must be corrected before continuing.")
@@ -91,8 +92,8 @@ ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID = \
       "or has an incorrect file name of the directory containing the INF or DEC file: %s. "
       "It must be corrected before continuing")
 ERR_INF_PARSER_DEFINE_SHADOW_INVALID = \
-   _("The SHADOW keyword is only valid for"
-                                       " SEC, PEI_CORE and PEIM module types.")
+    _("The SHADOW keyword is only valid for"
+      " SEC, PEI_CORE and PEIM module types.")
 ERR_INF_PARSER_DEFINE_SECTION_HEADER_INVALID = \
     _("The format of the section header is incorrect")
 ERR_INF_PARSER_DEPEX_SECTION_INVALID = \
@@ -103,15 +104,15 @@ ERR_INF_PARSER_DEPEX_SECTION_INVALID_FOR_LIBRARY_CLASS = \
     _("A library class can't have a Depex section when its supported module type list is not defined.")
 ERR_INF_PARSER_DEPEX_SECTION_INVALID_FOR_DRIVER = \
     _("A driver can't have a Depex section when its module type is UEFI_DRIVER.")
-ERR_INF_PARSER_DEPEX_SECTION_NOT_DETERMINED  = \
+ERR_INF_PARSER_DEPEX_SECTION_NOT_DETERMINED = \
     _("Cannot determine the module's Depex type. The Depex's module types are conflict")
 ERR_INF_PARSER_DEFINE_SECTION_MUST_ITEM_NOT_EXIST = _(
-                "No %s found in INF file, please check it.")
+    "No %s found in INF file, please check it.")
 ERR_INF_PARSER_DEPEX_SECTION_MODULE_TYPE_ERROR = \
     _("The module type of [Depex] section is invalid, not support type of %s")
 ERR_INF_PARSER_DEPEX_SECTION_CONTENT_MISSING = \
     _("Missing content in: %s")
-ERR_INF_PARSER_DEPEX_SECTION_CONTENT_ERROR  = \
+ERR_INF_PARSER_DEPEX_SECTION_CONTENT_ERROR = \
     _("The [Depex] section contains invalid content: %s")
 ERR_INF_PARSER_DEPEX_SECTION_SEC_TYPE_ERROR = \
     _("The format is incorrect. The section type keyword of the content in the"
@@ -186,7 +187,7 @@ ERR_INF_PARSER_CNAME_MISSING = \
     _("Missing CName. Specify a valid C variable name.")
 ERR_INF_PARSER_DEFINE_SECTION_KEYWORD_INVALID = \
     _("The Define section contains an invalid keyword:  \"%s\"."
-    "It must be corrected before continuing.")
+      "It must be corrected before continuing.")
 ERR_INF_PARSER_FILE_MISS_DEFINE = \
     _("The following file listed in the module "
       "directory is not listed in the INF: %s")
@@ -197,66 +198,75 @@ ERR_INF_PARSER_VER_EXIST_BOTH_NUM_STR = \
     _("The INF file %s defines both VERSION_NUMBER and VERSION_STRING, "
       "using VERSION_STRING")
 ERR_INF_PARSER_NOT_SUPPORT_EDKI_INF = _("EDKI INF is not supported")
-ERR_INF_PARSER_EDKI_COMMENT_IN_EDKII = _("The EDKI style comment is not supported in EDKII modules")
+ERR_INF_PARSER_EDKI_COMMENT_IN_EDKII = _(
+    "The EDKI style comment is not supported in EDKII modules")
 
 ERR_INF_PARSER_FEATUREPCD_USAGE_INVALID = _("The usage for FeaturePcd can only"
-    " be type of \"CONSUMES\".")
+                                            " be type of \"CONSUMES\".")
 
 ERR_INF_PARSER_DEFINE_ITEM_NO_NAME = _("No name specified")
 ERR_INF_PARSER_DEFINE_ITEM_NO_VALUE = _("No value specified")
 
 ERR_INF_PARSER_MODULETYPE_INVALID = _("Drivers and applications are not allowed to have a MODULE_TYPE of \"BASE\". "
-"Only libraries are permitted to a have a MODULE_TYPE of \"BASE\".")
-ERR_INF_GET_PKG_DEPENDENCY_FAIL = _("Failed to get PackageDependencies information from file %s")
-ERR_INF_NO_PKG_DEPENDENCY_INFO = _("There are no packages defined that use the AsBuilt PCD information.")
+                                      "Only libraries are permitted to a have a MODULE_TYPE of \"BASE\".")
+ERR_INF_GET_PKG_DEPENDENCY_FAIL = _(
+    "Failed to get PackageDependencies information from file %s")
+ERR_INF_NO_PKG_DEPENDENCY_INFO = _(
+    "There are no packages defined that use the AsBuilt PCD information.")
 
 #
 # Item duplicate
 #
 ERR_INF_PARSER_ITEM_DUPLICATE_IN_DEC = \
-_('"%s" is redefined in its dependent DEC files')
+    _('"%s" is redefined in its dependent DEC files')
 ERR_INF_PARSER_ITEM_DUPLICATE = _("%s define duplicated! "
                                   "It must be corrected before continuing.")
 ERR_INF_PARSER_ITEM_DUPLICATE_COMMON = _("%s define duplicated! Item listed"
-"in an architectural section must not be listed in the common architectural"
-"section.It must be corrected before continuing.")
+                                         "in an architectural section must not be listed in the common architectural"
+                                         "section.It must be corrected before continuing.")
 ERR_INF_PARSER_UE_SECTION_DUPLICATE_ERROR = \
-_("%s define duplicated! Each UserExtensions section header must have a "
-  "unique set of UserId, IdString and Arch values. "
-  "It must be corrected before continuing.")
+    _("%s define duplicated! Each UserExtensions section header must have a "
+      "unique set of UserId, IdString and Arch values. "
+      "It must be corrected before continuing.")
 
 ERR_INF_PARSER_DEFINE_LIB_NAME_INVALID = \
-_("The name 'NULL' for LibraryClass is a reserved word."
-"Please don't use it.")
+    _("The name 'NULL' for LibraryClass is a reserved word."
+      "Please don't use it.")
 
 ERR_GLOBAL_MARCO_INVALID = \
-_("Using global MACRO in INF/DEC is not permitted: %s . "
-"It must be corrected before continuing.")
+    _("Using global MACRO in INF/DEC is not permitted: %s . "
+      "It must be corrected before continuing.")
 
 ERR_MARCO_DEFINITION_MISS_ERROR = \
-_("MACRO expand incorrectly, can not find the MACRO definition. "
-"It must be corrected before continuing.")
+    _("MACRO expand incorrectly, can not find the MACRO definition. "
+      "It must be corrected before continuing.")
 
 #
 # AsBuilt related
 #
 ERR_LIB_CONTATIN_ASBUILD_AND_COMMON = _("A binary INF file should not contain both AsBuilt LIB_INSTANCES information "
                                         "and a common library entry.")
-ERR_LIB_INSTANCE_MISS_GUID = _("Could not get FILE_GUID definition from instance INF file.")
+ERR_LIB_INSTANCE_MISS_GUID = _(
+    "Could not get FILE_GUID definition from instance INF file.")
 
 ERR_BO_CONTATIN_ASBUILD_AND_COMMON = _("A binary INF file should contain either AsBuilt information "
                                        "or a common build option entry, not both.")
 
-ERR_ASBUILD_PCD_SECTION_TYPE = _("The AsBuilt INF file contains a PCD section type that is not permitted: %s.")
-ERR_ASBUILD_PATCHPCD_FORMAT_INVALID = _("The AsBuilt PatchPcd entry must contain 3 elements: PcdName|Value|Offset")
-ERR_ASBUILD_PCDEX_FORMAT_INVALID = _("The AsBuilt PcdEx entry must contain one element: PcdName")
+ERR_ASBUILD_PCD_SECTION_TYPE = _(
+    "The AsBuilt INF file contains a PCD section type that is not permitted: %s.")
+ERR_ASBUILD_PATCHPCD_FORMAT_INVALID = _(
+    "The AsBuilt PatchPcd entry must contain 3 elements: PcdName|Value|Offset")
+ERR_ASBUILD_PCDEX_FORMAT_INVALID = _(
+    "The AsBuilt PcdEx entry must contain one element: PcdName")
 ERR_ASBUILD_PCD_VALUE_INVALID = \
     _("The AsBuilt PCD value %s is incorrect or not align with its datum type %s. "
       "It must be corrected before continuing.")
-ERR_ASBUILD_PCD_TOKENSPACE_GUID_VALUE_MISS = _("Package file value could not be retrieved for %s.")
-ERR_ASBUILD_PCD_DECLARITION_MISS = _("PCD Declaration in DEC files could not be found for: %s.")
+ERR_ASBUILD_PCD_TOKENSPACE_GUID_VALUE_MISS = _(
+    "Package file value could not be retrieved for %s.")
+ERR_ASBUILD_PCD_DECLARITION_MISS = _(
+    "PCD Declaration in DEC files could not be found for: %s.")
 ERR_ASBUILD_PCD_OFFSET_FORMAT_INVALID = _("PCD offset format invalid, number of (0-4294967295) or"
-"Hex number of UINT32 allowed : %s.")
+                                          "Hex number of UINT32 allowed : %s.")
 
 #
 # XML parser related strings
@@ -267,13 +277,13 @@ ERR_XML_INVALID_VARIABLENAME = \
     _("The VariableName of the GUID in the XML tree does not conform to the packaging specification.  "
       "Only a Hex Byte Array of UCS-2 format or L\"string\" is allowed): %s %s %s")
 ERR_XML_INVALID_LIB_SUPMODLIST = _("The LIBRARY_CLASS entry %s must have the list appended using the format as: \n"
-"BASE SEC PEI_CORE PEIM DXE_CORE DXE_DRIVER SMM_CORE DXE_SMM_DRIVER DXE_RUNTIME_DRIVER "
-"DXE_SAL_DRIVER UEFI_DRIVER UEFI_APPLICATION USER_DEFINED\n Current is %s.")
+                                   "BASE SEC PEI_CORE PEIM DXE_CORE DXE_DRIVER SMM_CORE DXE_SMM_DRIVER DXE_RUNTIME_DRIVER "
+                                   "DXE_SAL_DRIVER UEFI_DRIVER UEFI_APPLICATION USER_DEFINED\n Current is %s.")
 ERR_XML_INVALID_EXTERN_SUPARCHLIST = \
     _("There is a mismatch of SupArchList %s between the EntryPoint, UnloadImage, Constructor, "
       "and Destructor elements in the ModuleSurfaceArea.ModuleProperties: SupArchList: %s. ")
 ERR_XML_INVALID_EXTERN_SUPMODLIST = _("The SupModList attribute of the CONSTRUCTOR or DESTRUCTOR element: %s does not "
-"match the Supported Module Types listed after LIBRARY_CLASS = <Keyword> | %s")
+                                      "match the Supported Module Types listed after LIBRARY_CLASS = <Keyword> | %s")
 ERR_XML_INVALID_EXTERN_SUPMODLIST_NOT_LIB = _("The module is not a library module. "
                                               "The MODULE_TYPE : %s listed in the ModuleSurfaceArea.Header "
                                               "must match the SupModList attribute %s")
@@ -285,76 +295,79 @@ ERR_XML_INVALID_BINARY_FILE_TYPE = _("Invalid binary file type %s.")
 MSG_DISTRIBUTION_PACKAGE_FILE_EXISTS = _(
     "The distribution package file %s already exists.\nPress Y to override it."
     " To exit the application, press any other key.")
-MSG_CHECK_MODULE_EXIST         = _(
+MSG_CHECK_MODULE_EXIST = _(
     "\nChecking to see if module exists in workspace started ...")
-MSG_CHECK_MODULE_EXIST_FINISH  = \
+MSG_CHECK_MODULE_EXIST_FINISH = \
     _("Checking to see if  module exists in workspace ... Done.")
-MSG_CHECK_MODULE_DEPEX_START   = _(
+MSG_CHECK_MODULE_DEPEX_START = _(
     "\nChecking to see if module depex met by workspace started ...")
-MSG_CHECK_MODULE_DEPEX_FINISH  = _(
+MSG_CHECK_MODULE_DEPEX_FINISH = _(
     "Checking to see if module depex met by workspace ... Done.")
-MSG_CHECK_PACKAGE_START        = _(
+MSG_CHECK_PACKAGE_START = _(
     "\nChecking to see if  package exists in workspace started ...")
-MSG_CHECK_PACKAGE_FINISH       = _(
+MSG_CHECK_PACKAGE_FINISH = _(
     "Checking to see if  package exists in workspace ... Done.")
-MSG_CHECK_DP_START             = \
+MSG_CHECK_DP_START = \
     _("\nChecking to see if DP exists in workspace ... Done.")
-MSG_CHECK_DP_FINISH            = _("Check DP exists in workspace ... Done.")
-MSG_MODULE_DEPEND_ON           = _("Module %s depends on Package %s")
-MSG_INIT_IPI_START             = _("\nInitialize IPI database started ...")
-MSG_INIT_IPI_FINISH            = _("Initialize IPI database ... Done.")
-MSG_GET_DP_INSTALL_LIST        = _(
+MSG_CHECK_DP_FINISH = _("Check DP exists in workspace ... Done.")
+MSG_MODULE_DEPEND_ON = _("Module %s depends on Package %s")
+MSG_INIT_IPI_START = _("\nInitialize IPI database started ...")
+MSG_INIT_IPI_FINISH = _("Initialize IPI database ... Done.")
+MSG_GET_DP_INSTALL_LIST = _(
     "\nGetting list of DP install information started ...")
-MSG_GET_DP_INSTALL_INFO_START  = _(
+MSG_GET_DP_INSTALL_INFO_START = _(
     "\nGetting list of DP install information started ...")
 MSG_GET_DP_INSTALL_INFO_FINISH = _("Getting DP install information ... Done.")
-MSG_UZIP_PARSE_XML             = _(
+MSG_UZIP_PARSE_XML = _(
     "Unzipping and parsing distribution package XML file ... ")
-MSG_INSTALL_PACKAGE            = _("Installing package ... %s")
-MSG_INSTALL_MODULE             = _("Installing module ... %s")
-MSG_NEW_FILE_NAME_FOR_DIST      = _(
+MSG_INSTALL_PACKAGE = _("Installing package ... %s")
+MSG_INSTALL_MODULE = _("Installing module ... %s")
+MSG_NEW_FILE_NAME_FOR_DIST = _(
     "Provide new filename for distribution file to be saved:\n")
-MSG_UPDATE_PACKAGE_DATABASE    = _("Update Distribution Package Database ...")
-MSG_PYTHON_ON                  = _("(Python %s on %s) ")
-MSG_EDKII_MAIL_ADDR            = 'devel@edk2.groups.io'
-MSG_SEARCH_FOR_HELP            = _(
+MSG_UPDATE_PACKAGE_DATABASE = _("Update Distribution Package Database ...")
+MSG_PYTHON_ON = _("(Python %s on %s) ")
+MSG_EDKII_MAIL_ADDR = 'devel@edk2.groups.io'
+MSG_SEARCH_FOR_HELP = _(
     "\n(Please send email to %s for\n"
     " help, attach the following call stack trace.)\n")
-MSG_REMOVE_TEMP_FILE_STARTED   = _("Removing temp files started ... ")
-MSG_REMOVE_TEMP_FILE_DONE   = _("Removing temp files ... Done.")
-MSG_FINISH                     = _("Successfully Done.")
-MSG_COMPRESS_DISTRIBUTION_PKG  = _("Compressing Distribution Package File ...")
-MSG_CONFIRM_REMOVE             = _(
+MSG_REMOVE_TEMP_FILE_STARTED = _("Removing temp files started ... ")
+MSG_REMOVE_TEMP_FILE_DONE = _("Removing temp files ... Done.")
+MSG_FINISH = _("Successfully Done.")
+MSG_COMPRESS_DISTRIBUTION_PKG = _("Compressing Distribution Package File ...")
+MSG_CONFIRM_REMOVE = _(
     "Some packages or modules depend on this distribution package.\n"
     "Do you really want to remove it?")
-MSG_CONFIRM_REMOVE2            = _(
+MSG_CONFIRM_REMOVE2 = _(
     "This file has been modified: %s. Do you want to remove it?"
     "Press Y to remove or other key to keep it")
-MSG_CONFIRM_REMOVE3            = _(
+MSG_CONFIRM_REMOVE3 = _(
     "This is a newly created file: %s.  Are you sure you want to remove it?  "
     "Press Y to remove or any other key to keep it")
-MSG_USER_DELETE_OP             = _(
+MSG_USER_DELETE_OP = _(
     "Press Y to delete all files or press any other key to quit:")
-MSG_REMOVE_FILE                = _("Removing file: %s ...")
+MSG_REMOVE_FILE = _("Removing file: %s ...")
 
-MSG_INITIALIZE_ECC_STARTED     = _("\nInitialize ECC database started ...")
-MSG_INITIALIZE_ECC_DONE        = _("Initialize ECC database ... Done.")
-MSG_DEFINE_STATEMENT_FOUND     = _("DEFINE statement '%s' found in section %s")
-MSG_PARSING                    = _("Parsing %s ...")
+MSG_INITIALIZE_ECC_STARTED = _("\nInitialize ECC database started ...")
+MSG_INITIALIZE_ECC_DONE = _("Initialize ECC database ... Done.")
+MSG_DEFINE_STATEMENT_FOUND = _("DEFINE statement '%s' found in section %s")
+MSG_PARSING = _("Parsing %s ...")
 
-MSG_REPKG_CONFLICT             = \
-_("Repackaging is not allowed on this file: %s. "
-  "It was installed from distribution %s(Guid %s Version %s).")
+MSG_REPKG_CONFLICT = \
+    _("Repackaging is not allowed on this file: %s. "
+      "It was installed from distribution %s(Guid %s Version %s).")
 
-MSG_INVALID_MODULE_INTRODUCED  = _("Some modules are not valid after removal.")
-MSG_CHECK_LOG_FILE             = _("Please check log file %s for full list")
-MSG_NEW_FILE_NAME      = _(
+MSG_INVALID_MODULE_INTRODUCED = _("Some modules are not valid after removal.")
+MSG_CHECK_LOG_FILE = _("Please check log file %s for full list")
+MSG_NEW_FILE_NAME = _(
     "Provide new filename:\n")
-MSG_RELATIVE_PATH_ONLY = _("Please specify a relative path, full path is not allowed: %s")
-MSG_NEW_PKG_PATH  = _(
+MSG_RELATIVE_PATH_ONLY = _(
+    "Please specify a relative path, full path is not allowed: %s")
+MSG_NEW_PKG_PATH = _(
     "Select package location.  To quit with no input, press [Enter].")
-MSG_CHECK_DP_FOR_REPLACE = _("Verifying the dependency rule for replacement of distributions:\n %s replaces %s")
-MSG_CHECK_DP_FOR_INSTALL = _("Verifying the dependency rule for installation of distribution:\n %s")
+MSG_CHECK_DP_FOR_REPLACE = _(
+    "Verifying the dependency rule for replacement of distributions:\n %s replaces %s")
+MSG_CHECK_DP_FOR_INSTALL = _(
+    "Verifying the dependency rule for installation of distribution:\n %s")
 MSG_REPLACE_ALREADY_INSTALLED_DP = _("Distribution with the same GUID/Version is already installed, "
                                      "replace would result in two instances, which is not allowed")
 MSG_RECOVER_START = _('An error was detected, recovery started ...')
@@ -364,398 +377,416 @@ MSG_RECOVER_FAIL = _('Recovery failed.')
 # Error related strings.
 #
 
-ERR_DEPENDENCY_NOT_MATCH         = _(
+ERR_DEPENDENCY_NOT_MATCH = _(
     "Module %s's dependency on package %s (GUID %s Version %s) "
     "cannot be satisfied")
-ERR_MODULE_NOT_INSTALLED         = _(
+ERR_MODULE_NOT_INSTALLED = _(
     "This module is not installed in the workspace: %s\n")
-ERR_DIR_ALREADY_EXIST            = _(
+ERR_DIR_ALREADY_EXIST = _(
     "This directory already exists: %s.\n"
     "Select another location.  Press [Enter] with no input to quit:")
-ERR_USER_INTERRUPT               = _("The user has paused the application")
-ERR_DIST_FILE_TOOMANY            = _(
+ERR_USER_INTERRUPT = _("The user has paused the application")
+ERR_DIST_FILE_TOOMANY = _(
     "Only one .content and one .pkg file in ZIP file are allowed.")
-ERR_DIST_FILE_TOOFEW             = _(
+ERR_DIST_FILE_TOOFEW = _(
     "Must have one .content and one .pkg file in the ZIP file.")
-ERR_FILE_ALREADY_EXIST           = _(
+ERR_FILE_ALREADY_EXIST = _(
     "This file already exists: %s.\n"
     "Select another path to continue. To quit with no input press [Enter]:")
-ERR_SPECIFY_PACKAGE              = _(
+ERR_SPECIFY_PACKAGE = _(
     "One distribution package must be specified")
-ERR_FILE_BROKEN                  = _(
+ERR_FILE_BROKEN = _(
     "This file is invalid in the distribution package: %s")
 ERR_PACKAGE_NOT_MATCH_DEPENDENCY = _(
     "This distribution package does not meet the dependency requirements")
-ERR_UNKNOWN_FATAL_INSTALL_ERR    = \
-_("Unknown unrecoverable error when installing: %s")
-ERR_UNKNOWN_FATAL_REPLACE_ERR    = \
-_("Unknown unrecoverable error during replacement of distributions: %s replaces %s")
-ERR_OPTION_NOT_FOUND             = _("Options not found")
-ERR_INVALID_PACKAGE_NAME         = _("Incorrect package name: %s. ")
-ERR_INVALID_PACKAGE_PATH         = \
-_("Incorrect package path: %s. The path must be a relative path.")
-ERR_NOT_FOUND                    = _("This was not found: %s")
-ERR_INVALID_MODULE_NAME          = _("This is not a valid module name: %s")
-ERR_INVALID_METAFILE_PATH        = _('This file must be in sub-directory of WORKSPACE: %s.')
-ERR_INVALID_MODULE_PATH          = \
-_("Incorrect module path: %s. The path must be a relative path.")
-ERR_UNKNOWN_FATAL_CREATING_ERR   = _("Unknown error when creating: %s")
-ERR_PACKAGE_NOT_INSTALLED        = _(
+ERR_UNKNOWN_FATAL_INSTALL_ERR = \
+    _("Unknown unrecoverable error when installing: %s")
+ERR_UNKNOWN_FATAL_REPLACE_ERR = \
+    _("Unknown unrecoverable error during replacement of distributions: %s replaces %s")
+ERR_OPTION_NOT_FOUND = _("Options not found")
+ERR_INVALID_PACKAGE_NAME = _("Incorrect package name: %s. ")
+ERR_INVALID_PACKAGE_PATH = \
+    _("Incorrect package path: %s. The path must be a relative path.")
+ERR_NOT_FOUND = _("This was not found: %s")
+ERR_INVALID_MODULE_NAME = _("This is not a valid module name: %s")
+ERR_INVALID_METAFILE_PATH = _(
+    'This file must be in sub-directory of WORKSPACE: %s.')
+ERR_INVALID_MODULE_PATH = \
+    _("Incorrect module path: %s. The path must be a relative path.")
+ERR_UNKNOWN_FATAL_CREATING_ERR = _("Unknown error when creating: %s")
+ERR_PACKAGE_NOT_INSTALLED = _(
     "This distribution package not installed: %s")
-ERR_DISTRIBUTION_NOT_INSTALLED   = _(
+ERR_DISTRIBUTION_NOT_INSTALLED = _(
     "The distribution package is not installed.")
-ERR_UNKNOWN_FATAL_REMOVING_ERR   = _("Unknown error when removing package")
-ERR_UNKNOWN_FATAL_INVENTORYWS_ERR   = _("Unknown error when inventorying WORKSPACE")
-ERR_NOT_CONFIGURE_WORKSPACE_ENV  = _(
+ERR_UNKNOWN_FATAL_REMOVING_ERR = _("Unknown error when removing package")
+ERR_UNKNOWN_FATAL_INVENTORYWS_ERR = _(
+    "Unknown error when inventorying WORKSPACE")
+ERR_NOT_CONFIGURE_WORKSPACE_ENV = _(
     "The WORKSPACE environment variable must be configured.")
-ERR_NO_TEMPLATE_FILE             = _("This package information data file is not found: %s")
-ERR_DEBUG_LEVEL                  = _(
+ERR_NO_TEMPLATE_FILE = _("This package information data file is not found: %s")
+ERR_DEBUG_LEVEL = _(
     "Not supported debug level. Use default level instead.")
-ERR_REQUIRE_T_OPTION             = _(
+ERR_REQUIRE_T_OPTION = _(
     "Option -t is required during distribution creation.")
-ERR_REQUIRE_O_OPTION             = _(
+ERR_REQUIRE_O_OPTION = _(
     "Option -o is required during distribution replacement.")
-ERR_REQUIRE_U_OPTION             = _(
+ERR_REQUIRE_U_OPTION = _(
     "Option -u is required during distribution replacement.")
-ERR_REQUIRE_I_C_R_OPTION         = _(
+ERR_REQUIRE_I_C_R_OPTION = _(
     "Options -i, -c and -r are mutually exclusive.")
-ERR_I_C_EXCLUSIVE                = \
-_("Option -c and -i are mutually exclusive.")
-ERR_I_R_EXCLUSIVE                = \
-_("Option -i and -r are mutually exclusive.")
-ERR_C_R_EXCLUSIVE                = \
-_("Option -c and -r are mutually exclusive.")
-ERR_U_ICR_EXCLUSIVE                = \
-_("Option -u and -c/-i/-r are mutually exclusive.")
+ERR_I_C_EXCLUSIVE = \
+    _("Option -c and -i are mutually exclusive.")
+ERR_I_R_EXCLUSIVE = \
+    _("Option -i and -r are mutually exclusive.")
+ERR_C_R_EXCLUSIVE = \
+    _("Option -c and -r are mutually exclusive.")
+ERR_U_ICR_EXCLUSIVE = \
+    _("Option -u and -c/-i/-r are mutually exclusive.")
 
-ERR_L_OA_EXCLUSIVE                = \
-_("Option -l and -c/-i/-r/-u are mutually exclusive.")
+ERR_L_OA_EXCLUSIVE = \
+    _("Option -l and -c/-i/-r/-u are mutually exclusive.")
 
-ERR_FAILED_LOAD                  = _("Failed to load %s\n\t%s")
+ERR_FAILED_LOAD = _("Failed to load %s\n\t%s")
 ERR_PLACEHOLDER_DIFFERENT_REPEAT = _(
     "${%s} has different repeat time from others.")
-ERR_KEY_NOTALLOWED               = _("This keyword is not allowed: %s")
-ERR_NOT_FOUND_ENVIRONMENT        = _("Environment variable not found")
-ERR_WORKSPACE_NOTEXIST           = _("WORKSPACE doesn't exist")
-ERR_SPACE_NOTALLOWED             = _(
+ERR_KEY_NOTALLOWED = _("This keyword is not allowed: %s")
+ERR_NOT_FOUND_ENVIRONMENT = _("Environment variable not found")
+ERR_WORKSPACE_NOTEXIST = _("WORKSPACE doesn't exist")
+ERR_SPACE_NOTALLOWED = _(
     "Whitespace characters are not allowed in the WORKSPACE path. ")
-ERR_MACRONAME_NOGIVEN            = _("No MACRO name given")
-ERR_MACROVALUE_NOGIVEN           = _("No MACRO value given")
-ERR_MACRONAME_INVALID            = _("Incorrect MACRO name: %s")
-ERR_MACROVALUE_INVALID            = _("Incorrect MACRO value: %s")
-ERR_NAME_ONLY_DEFINE             = _(
+ERR_MACRONAME_NOGIVEN = _("No MACRO name given")
+ERR_MACROVALUE_NOGIVEN = _("No MACRO value given")
+ERR_MACRONAME_INVALID = _("Incorrect MACRO name: %s")
+ERR_MACROVALUE_INVALID = _("Incorrect MACRO value: %s")
+ERR_NAME_ONLY_DEFINE = _(
     "This variable can only be defined via environment variable: %s")
-ERR_EDK_GLOBAL_SAMENAME          = _(
+ERR_EDK_GLOBAL_SAMENAME = _(
     "EDK_GLOBAL defined a macro with the same name as one defined by 'DEFINE'")
-ERR_SECTIONNAME_INVALID          = _(
+ERR_SECTIONNAME_INVALID = _(
     "An incorrect section name was found: %s. 'The correct file is '%s' .")
-ERR_CHECKFILE_NOTFOUND           = _(
+ERR_CHECKFILE_NOTFOUND = _(
     "Can't find file '%s' defined in section '%s'")
-ERR_INVALID_NOTFOUND             = _(
+ERR_INVALID_NOTFOUND = _(
     "Incorrect statement '%s' was found in section '%s'")
-ERR_TEMPLATE_NOTFOUND            = _("This package information data file is not found: %s")
-ERR_SECTION_NAME_INVALID         = _('Incorrect section name: %s')
-ERR_SECTION_REDEFINE             = _(
+ERR_TEMPLATE_NOTFOUND = _(
+    "This package information data file is not found: %s")
+ERR_SECTION_NAME_INVALID = _('Incorrect section name: %s')
+ERR_SECTION_REDEFINE = _(
     "This section already defined: %s.")
-ERR_SECTION_NAME_NONE            = \
+ERR_SECTION_NAME_NONE = \
     _('The section needs to be specified first.')
-ERR_KEYWORD_INVALID              = _('Invalid keyword: %s')
-ERR_VALUE_INVALID                = _("Invalid \"%s\" value in section [%s].")
-ERR_FILELIST_LOCATION            = _(
+ERR_KEYWORD_INVALID = _('Invalid keyword: %s')
+ERR_VALUE_INVALID = _("Invalid \"%s\" value in section [%s].")
+ERR_FILELIST_LOCATION = _(
     'The directory "%s" must contain this file: "%s".')
-ERR_KEYWORD_REDEFINE             = _(
+ERR_KEYWORD_REDEFINE = _(
     "Keyword in this section can only be used once: %s.")
-ERR_FILELIST_EXIST               = _(
+ERR_FILELIST_EXIST = _(
     'This file does not exist: %s.')
-ERR_COPYRIGHT_CONTENT            = _(
+ERR_COPYRIGHT_CONTENT = _(
     "The copyright content must contain the word \"Copyright\" (case insensitive).")
-ERR_WRONG_FILELIST_FORMAT        = \
-_('File list format is incorrect.'
-  'The correct format is: filename|key=value[|key=value]')
-ERR_FILELIST_ATTR                = _(
+ERR_WRONG_FILELIST_FORMAT = \
+    _('File list format is incorrect.'
+      'The correct format is: filename|key=value[|key=value]')
+ERR_FILELIST_ATTR = _(
     "The value of attribute \"%s\" includes illegal character.")
-ERR_UNKNOWN_FILELIST_ATTR        = _(
+ERR_UNKNOWN_FILELIST_ATTR = _(
     'Unknown attribute name: %s.')
-ERR_EMPTY_VALUE                  = _("Empty value is not allowed")
-ERR_KEYWORD_MANDATORY            = _('This keyword is mandatory: %s')
-ERR_BOOLEAN_VALUE                = _(
+ERR_EMPTY_VALUE = _("Empty value is not allowed")
+ERR_KEYWORD_MANDATORY = _('This keyword is mandatory: %s')
+ERR_BOOLEAN_VALUE = _(
     'Value of key [%s] must be true or false, current: [%s]')
-ERR_GUID_VALUE                   = _(
+ERR_GUID_VALUE = _(
     'GUID must have the format of 8-4-4-4-12 with HEX value. '
     'Current value: [%s]')
-ERR_VERSION_VALUE                = _(
+ERR_VERSION_VALUE = _(
     'The value of key [%s] must be a decimal number. Found: [%s]')
-ERR_VERSION_XMLSPEC              = _(
+ERR_VERSION_XMLSPEC = _(
     'XmlSpecification value must be 1.1, current: %s.')
 
-ERR_INVALID_GUID                 = _("Incorrect GUID value string: %s")
+ERR_INVALID_GUID = _("Incorrect GUID value string: %s")
 
-ERR_FILE_NOT_FOUND               = \
+ERR_FILE_NOT_FOUND = \
     _("File or directory not found in workspace")
-ERR_FILE_OPEN_FAILURE            = _("Could not open file")
-ERR_FILE_WRITE_FAILURE           = _("Could not write file.")
-ERR_FILE_PARSE_FAILURE           = _("Could not parse file")
-ERR_FILE_READ_FAILURE            = _("Could not read file")
-ERR_FILE_CREATE_FAILURE          = _("Could not create file")
-ERR_FILE_CHECKSUM_FAILURE        = _("Checksum of file is incorrect")
-ERR_FILE_COMPRESS_FAILURE        = _("File compression did not correctly")
-ERR_FILE_DECOMPRESS_FAILURE      = \
+ERR_FILE_OPEN_FAILURE = _("Could not open file")
+ERR_FILE_WRITE_FAILURE = _("Could not write file.")
+ERR_FILE_PARSE_FAILURE = _("Could not parse file")
+ERR_FILE_READ_FAILURE = _("Could not read file")
+ERR_FILE_CREATE_FAILURE = _("Could not create file")
+ERR_FILE_CHECKSUM_FAILURE = _("Checksum of file is incorrect")
+ERR_FILE_COMPRESS_FAILURE = _("File compression did not correctly")
+ERR_FILE_DECOMPRESS_FAILURE = \
     _("File decompression did not complete correctly")
-ERR_FILE_MOVE_FAILURE            = _("Move file did not complete successfully")
-ERR_FILE_DELETE_FAILURE          = _("File could not be deleted")
-ERR_FILE_COPY_FAILURE            = _("File did not copy correctly")
-ERR_FILE_POSITIONING_FAILURE     = _("Could not find file seek position")
-ERR_FILE_TYPE_MISMATCH           = _("Incorrect file type")
-ERR_FILE_CASE_MISMATCH           = _("File name case mismatch")
-ERR_FILE_DUPLICATED              = _("Duplicate file found")
-ERR_FILE_UNKNOWN_ERROR           = _("Unknown error encountered on file")
-ERR_FILE_NAME_INVALIDE           = _("This file name is invalid, it must not be an absolute path or "
-                                     "contain a period \".\" or \"..\":  %s.")
-ERR_OPTION_UNKNOWN               = _("Unknown option")
-ERR_OPTION_MISSING               = _("Missing option")
-ERR_OPTION_CONFLICT              = _("Options conflict")
-ERR_OPTION_VALUE_INVALID         = _("Invalid option value")
-ERR_OPTION_DEPRECATED            = _("Deprecated option")
-ERR_OPTION_NOT_SUPPORTED         = _("Unsupported option")
-ERR_OPTION_UNKNOWN_ERROR         = _("Unknown error when processing options")
-ERR_PARAMETER_INVALID            = _("Invalid parameter")
-ERR_PARAMETER_MISSING            = _("Missing parameter")
-ERR_PARAMETER_UNKNOWN_ERROR      = _("Unknown error in parameters")
-ERR_FORMAT_INVALID               = _("Invalid syntax/format")
-ERR_FORMAT_NOT_SUPPORTED         = _("Syntax/format not supported")
-ERR_FORMAT_UNKNOWN               = _("Unknown format")
-ERR_FORMAT_UNKNOWN_ERROR         = _("Unknown error in syntax/format ")
-ERR_RESOURCE_NOT_AVAILABLE       = _("Not available")
-ERR_RESOURCE_ALLOCATE_FAILURE    = _("A resource allocation has failed")
-ERR_RESOURCE_FULL                = _("Full")
-ERR_RESOURCE_OVERFLOW            = _("Overflow")
-ERR_RESOURCE_UNDERRUN            = _("Underrun")
-ERR_RESOURCE_UNKNOWN_ERROR       = _("Unknown error")
-ERR_ATTRIBUTE_NOT_AVAILABLE      = _("Not available")
-ERR_ATTRIBUTE_RETRIEVE_FAILURE   = _("Unable to retrieve")
-ERR_ATTRIBUTE_SET_FAILURE        = _("Unable to set")
-ERR_ATTRIBUTE_UPDATE_FAILURE     = _("Unable to update")
-ERR_ATTRIBUTE_ACCESS_DENIED      = _("Access denied")
-ERR_ATTRIBUTE_UNKNOWN_ERROR      = _("Unknown error when accessing")
-ERR_COMMAND_FAILURE              = _("Unable to execute command")
-ERR_IO_NOT_READY                 = _("Not ready")
-ERR_IO_BUSY                      = _("Busy")
-ERR_IO_TIMEOUT                   = _("Timeout")
-ERR_IO_UNKNOWN_ERROR             = _("Unknown error in IO operation")
-ERR_UNKNOWN_ERROR                = _("Unknown error")
-ERR_UPT_ALREADY_INSTALLED_ERROR  = _("Already installed")
-ERR_UPT_ENVIRON_MISSING_ERROR    = _("Environ missing")
-ERR_UPT_REPKG_ERROR              = _("File not allowed for RePackage")
-ERR_UPT_DB_UPDATE_ERROR          = _("Update database did not complete successfully")
-ERR_UPT_INI_PARSE_ERROR          = _("INI file parse error")
-ERR_COPYRIGHT_MISSING            = \
-_("Header comment section must have copyright information")
-ERR_LICENSE_MISSING              = \
-_("Header comment section must have license information")
-ERR_INVALID_BINARYHEADER_FORMAT  = \
-_("Binary Header comment section must have abstract,description,copyright,license information")
+ERR_FILE_MOVE_FAILURE = _("Move file did not complete successfully")
+ERR_FILE_DELETE_FAILURE = _("File could not be deleted")
+ERR_FILE_COPY_FAILURE = _("File did not copy correctly")
+ERR_FILE_POSITIONING_FAILURE = _("Could not find file seek position")
+ERR_FILE_TYPE_MISMATCH = _("Incorrect file type")
+ERR_FILE_CASE_MISMATCH = _("File name case mismatch")
+ERR_FILE_DUPLICATED = _("Duplicate file found")
+ERR_FILE_UNKNOWN_ERROR = _("Unknown error encountered on file")
+ERR_FILE_NAME_INVALIDE = _("This file name is invalid, it must not be an absolute path or "
+                           "contain a period \".\" or \"..\":  %s.")
+ERR_OPTION_UNKNOWN = _("Unknown option")
+ERR_OPTION_MISSING = _("Missing option")
+ERR_OPTION_CONFLICT = _("Options conflict")
+ERR_OPTION_VALUE_INVALID = _("Invalid option value")
+ERR_OPTION_DEPRECATED = _("Deprecated option")
+ERR_OPTION_NOT_SUPPORTED = _("Unsupported option")
+ERR_OPTION_UNKNOWN_ERROR = _("Unknown error when processing options")
+ERR_PARAMETER_INVALID = _("Invalid parameter")
+ERR_PARAMETER_MISSING = _("Missing parameter")
+ERR_PARAMETER_UNKNOWN_ERROR = _("Unknown error in parameters")
+ERR_FORMAT_INVALID = _("Invalid syntax/format")
+ERR_FORMAT_NOT_SUPPORTED = _("Syntax/format not supported")
+ERR_FORMAT_UNKNOWN = _("Unknown format")
+ERR_FORMAT_UNKNOWN_ERROR = _("Unknown error in syntax/format ")
+ERR_RESOURCE_NOT_AVAILABLE = _("Not available")
+ERR_RESOURCE_ALLOCATE_FAILURE = _("A resource allocation has failed")
+ERR_RESOURCE_FULL = _("Full")
+ERR_RESOURCE_OVERFLOW = _("Overflow")
+ERR_RESOURCE_UNDERRUN = _("Underrun")
+ERR_RESOURCE_UNKNOWN_ERROR = _("Unknown error")
+ERR_ATTRIBUTE_NOT_AVAILABLE = _("Not available")
+ERR_ATTRIBUTE_RETRIEVE_FAILURE = _("Unable to retrieve")
+ERR_ATTRIBUTE_SET_FAILURE = _("Unable to set")
+ERR_ATTRIBUTE_UPDATE_FAILURE = _("Unable to update")
+ERR_ATTRIBUTE_ACCESS_DENIED = _("Access denied")
+ERR_ATTRIBUTE_UNKNOWN_ERROR = _("Unknown error when accessing")
+ERR_COMMAND_FAILURE = _("Unable to execute command")
+ERR_IO_NOT_READY = _("Not ready")
+ERR_IO_BUSY = _("Busy")
+ERR_IO_TIMEOUT = _("Timeout")
+ERR_IO_UNKNOWN_ERROR = _("Unknown error in IO operation")
+ERR_UNKNOWN_ERROR = _("Unknown error")
+ERR_UPT_ALREADY_INSTALLED_ERROR = _("Already installed")
+ERR_UPT_ENVIRON_MISSING_ERROR = _("Environ missing")
+ERR_UPT_REPKG_ERROR = _("File not allowed for RePackage")
+ERR_UPT_DB_UPDATE_ERROR = _("Update database did not complete successfully")
+ERR_UPT_INI_PARSE_ERROR = _("INI file parse error")
+ERR_COPYRIGHT_MISSING = \
+    _("Header comment section must have copyright information")
+ERR_LICENSE_MISSING = \
+    _("Header comment section must have license information")
+ERR_INVALID_BINARYHEADER_FORMAT = \
+    _("Binary Header comment section must have abstract,description,copyright,license information")
 ERR_MULTIPLE_BINARYHEADER_EXIST = \
-_("the inf file at most support one BinaryHeader at the fileheader section.")
-ERR_INVALID_COMMENT_FORMAT       = _("Comment must start with #")
-ERR_USER_ABORT                   = _("User has stopped the application")
-ERR_DIST_EXT_ERROR               = \
-_("Distribution file extension should be '.dist'. Current given: '%s'.")
-ERR_DIST_FILENAME_ONLY_FOR_REMOVE               = \
-_("Only distribution filename without path allowed during remove. Current given: '%s'.")
-ERR_NOT_STANDALONE_MODULE_ERROR  = \
+    _("the inf file at most support one BinaryHeader at the fileheader section.")
+ERR_INVALID_COMMENT_FORMAT = _("Comment must start with #")
+ERR_USER_ABORT = _("User has stopped the application")
+ERR_DIST_EXT_ERROR = \
+    _("Distribution file extension should be '.dist'. Current given: '%s'.")
+ERR_DIST_FILENAME_ONLY_FOR_REMOVE = \
+    _("Only distribution filename without path allowed during remove. Current given: '%s'.")
+ERR_NOT_STANDALONE_MODULE_ERROR = \
     _("Module %s is not a standalone module (found in Package %s)")
-ERR_UPT_ALREADY_RUNNING_ERROR    = \
+ERR_UPT_ALREADY_RUNNING_ERROR = \
     _("UPT is already running, only one instance is allowed")
-ERR_MUL_DEC_ERROR = _("Multiple DEC files found within one package directory tree %s: %s, %s")
-ERR_INSTALL_FILE_FROM_EMPTY_CONTENT = _("Error file to be installed is not found in content file: %s")
+ERR_MUL_DEC_ERROR = _(
+    "Multiple DEC files found within one package directory tree %s: %s, %s")
+ERR_INSTALL_FILE_FROM_EMPTY_CONTENT = _(
+    "Error file to be installed is not found in content file: %s")
 ERR_INSTALL_FILE_DEC_FILE_ERROR = _("Could not obtain the TokenSpaceGuidCName and the PcdCName from the DEC files "
-"that the package depends on for this pcd entry: TokenValue: %s Token: %s")
-ERR_NOT_SUPPORTED_SA_MODULE = _("Stand-alone module distribution does not allow EDK 1 INF")
-ERR_INSTALL_DIST_NOT_FOUND               = \
-_("Distribution file to be installed is not found in current working directory or workspace: %s")
-ERR_REPLACE_DIST_NOT_FOUND               = \
-_("Distribution file for replace function was not found in the current working directory or workspace: %s")
-ERR_DIST_FILENAME_ONLY_FOR_REPLACE_ORIG               = \
-_("Only a distribution file name without a path is allowed for "
-  "the distribution to be replaced during replace. Current given: '%s'.")
+                                    "that the package depends on for this pcd entry: TokenValue: %s Token: %s")
+ERR_NOT_SUPPORTED_SA_MODULE = _(
+    "Stand-alone module distribution does not allow EDK 1 INF")
+ERR_INSTALL_DIST_NOT_FOUND = \
+    _("Distribution file to be installed is not found in current working directory or workspace: %s")
+ERR_REPLACE_DIST_NOT_FOUND = \
+    _("Distribution file for replace function was not found in the current working directory or workspace: %s")
+ERR_DIST_FILENAME_ONLY_FOR_REPLACE_ORIG = \
+    _("Only a distribution file name without a path is allowed for "
+      "the distribution to be replaced during replace. Current given: '%s'.")
 ERR_UNIPARSE_DBLQUOTE_UNMATCHED = \
-_("Only Language entry can contain a couple of matched quote in one line")
-ERR_UNIPARSE_NO_SECTION_EXIST = _("No PackageDef or ModuleDef section exists in the UNI file.")
-ERR_UNIPARSE_STRNAME_FORMAT_ERROR = _("The String Token Name %s must start with \"STR_\"")
-ERR_UNIPARSE_SEP_LANGENTRY_LINE = _("Each <LangEntry> should be in a separate line :%s.")
+    _("Only Language entry can contain a couple of matched quote in one line")
+ERR_UNIPARSE_NO_SECTION_EXIST = _(
+    "No PackageDef or ModuleDef section exists in the UNI file.")
+ERR_UNIPARSE_STRNAME_FORMAT_ERROR = _(
+    "The String Token Name %s must start with \"STR_\"")
+ERR_UNIPARSE_SEP_LANGENTRY_LINE = _(
+    "Each <LangEntry> should be in a separate line :%s.")
 ERR_UNIPARSE_MULTI_ENTRY_EXIST = \
-_("There are same entries : %s in the UNI file, every kind of entry should be only one.")
+    _("There are same entries : %s in the UNI file, every kind of entry should be only one.")
 ERR_UNIPARSE_ENTRY_ORDER_WRONG = \
-_("The string entry order in UNI file should be <AbstractStrings>, <DescriptionStrings>, \
+    _("The string entry order in UNI file should be <AbstractStrings>, <DescriptionStrings>, \
 <BinaryAbstractStrings>, <BinaryDescriptionStrings>.")
-ERR_UNIPARSE_STRTOKEN_FORMAT_ERROR = _("The String Token Type %s must be one of the '_PROMPT', '_HELP' and '_ERR_'.")
-ERR_UNIPARSE_LINEFEED_UNDER_EXIST = _("Line feed should not exist under this line: %s.")
-ERR_UNIPARSE_LINEFEED_UP_EXIST = _("Line feed should not exist up this line: %s.")
+ERR_UNIPARSE_STRTOKEN_FORMAT_ERROR = _(
+    "The String Token Type %s must be one of the '_PROMPT', '_HELP' and '_ERR_'.")
+ERR_UNIPARSE_LINEFEED_UNDER_EXIST = _(
+    "Line feed should not exist under this line: %s.")
+ERR_UNIPARSE_LINEFEED_UP_EXIST = _(
+    "Line feed should not exist up this line: %s.")
 ERR_UNI_MISS_STRING_ENTRY = _("String entry missed in this Entry, %s.")
 ERR_UNI_MISS_LANGENTRY = _("Language entry missed in this Entry, %s.")
-ERR_BINARY_HEADER_ORDER           = _("Binary header must follow the file header.")
-ERR_NO_SOURCE_HEADER              = _("File header statement \"## @file\" must exist at the first place.")
-ERR_UNI_FILE_SUFFIX_WRONG = _("The UNI file must have an extension of '.uni', '.UNI' or '.Uni'")
-ERR_UNI_FILE_NAME_INVALID = _("The use of '..', '../' and './' in the UNI file is prohibited.")
+ERR_BINARY_HEADER_ORDER = _("Binary header must follow the file header.")
+ERR_NO_SOURCE_HEADER = _(
+    "File header statement \"## @file\" must exist at the first place.")
+ERR_UNI_FILE_SUFFIX_WRONG = _(
+    "The UNI file must have an extension of '.uni', '.UNI' or '.Uni'")
+ERR_UNI_FILE_NAME_INVALID = _(
+    "The use of '..', '../' and './' in the UNI file is prohibited.")
 ERR_UNI_SUBGUID_VALUE_DEFINE_DEC_NOT_FOUND = _("There are no DEC file to define the GUID value for \
 this GUID CName: '%s'.")
 
 #
 # Expression error message
 #
-ERR_EXPR_RIGHT_PAREN            = \
-_('Missing ")" in expression "%s".')
-ERR_EXPR_FACTOR                 = \
-_('"%s" is expected to be HEX, integer, macro, quoted string or PcdName in '
-  'expression "%s".')
-ERR_EXPR_STRING_ITEM            = \
-_('"%s" is expected to be HEX, integer, macro, quoted string or PcdName in '
-  'expression [%s].')
-ERR_EXPR_EQUALITY               = \
-_('"%s" is expected to be ==, EQ, != or NE  in expression "%s".')
-ERR_EXPR_BOOLEAN                = \
-_('The string "%s" in expression "%s" can not be recognized as a part of the logical expression.')
-ERR_EXPR_EMPTY                  = _('Boolean value cannot be empty.')
-ERR_EXPRESS_EMPTY               = _('Expression can not be empty.')
-ERR_EXPR_LOGICAL                = \
-_('The following is not a valid logical expression: "%s".')
-ERR_EXPR_OR                     = _('The expression: "%s" must be encapsulated in open "(" and close ")" '
-                                    'parenthesis when using | or ||.')
-ERR_EXPR_RANGE                  = \
-_('The following is not a valid range expression: "%s".')
-ERR_EXPR_RANGE_FACTOR           = \
-_('"%s" is expected to be HEX, integer in valid range expression "%s".')
+ERR_EXPR_RIGHT_PAREN = \
+    _('Missing ")" in expression "%s".')
+ERR_EXPR_FACTOR = \
+    _('"%s" is expected to be HEX, integer, macro, quoted string or PcdName in '
+      'expression "%s".')
+ERR_EXPR_STRING_ITEM = \
+    _('"%s" is expected to be HEX, integer, macro, quoted string or PcdName in '
+      'expression [%s].')
+ERR_EXPR_EQUALITY = \
+    _('"%s" is expected to be ==, EQ, != or NE  in expression "%s".')
+ERR_EXPR_BOOLEAN = \
+    _('The string "%s" in expression "%s" can not be recognized as a part of the logical expression.')
+ERR_EXPR_EMPTY = _('Boolean value cannot be empty.')
+ERR_EXPRESS_EMPTY = _('Expression can not be empty.')
+ERR_EXPR_LOGICAL = \
+    _('The following is not a valid logical expression: "%s".')
+ERR_EXPR_OR = _('The expression: "%s" must be encapsulated in open "(" and close ")" '
+                'parenthesis when using | or ||.')
+ERR_EXPR_RANGE = \
+    _('The following is not a valid range expression: "%s".')
+ERR_EXPR_RANGE_FACTOR = \
+    _('"%s" is expected to be HEX, integer in valid range expression "%s".')
 ERR_EXPR_RANGE_DOUBLE_PAREN_NESTED = \
-_('Double parentheses nested is not allowed in valid range expression: "%s".')
-ERR_EXPR_RANGE_EMPTY            = _('Valid range can not be empty.')
-ERR_EXPR_LIST_EMPTY             = _('Valid list can not be empty.')
-ERR_PAREN_NOT_USED              = _('Parenthesis must be used on both sides of "OR", "AND" in valid range : %s.')
-ERR_EXPR_LIST                   = \
-_('The following is not a valid list expression: "%s".')
+    _('Double parentheses nested is not allowed in valid range expression: "%s".')
+ERR_EXPR_RANGE_EMPTY = _('Valid range can not be empty.')
+ERR_EXPR_LIST_EMPTY = _('Valid list can not be empty.')
+ERR_PAREN_NOT_USED = _(
+    'Parenthesis must be used on both sides of "OR", "AND" in valid range : %s.')
+ERR_EXPR_LIST = \
+    _('The following is not a valid list expression: "%s".')
 
 
 # DEC parser error message
 #
-ERR_DECPARSE_STATEMENT_EMPTY        = \
-_('Must have at least one statement in section %s.')
-ERR_DECPARSE_DEFINE_DEFINED         = \
-_('%s already defined in define section.')
-ERR_DECPARSE_DEFINE_SECNAME         = \
-_('No arch and others can be followed for define section.')
-ERR_DECPARSE_DEFINE_MULTISEC        = \
-_('The DEC file does not allow multiple define sections.')
-ERR_DECPARSE_DEFINE_REQUIRED        = \
-_("Field [%s] is required in define section.")
-ERR_DECPARSE_DEFINE_FORMAT          = \
-_("Wrong define section format, must be KEY = Value.")
-ERR_DECPARSE_DEFINE_UNKNOWKEY       = \
-_("Unknown key [%s] in define section.")
-ERR_DECPARSE_DEFINE_SPEC            = \
-_("Specification value must be HEX numbers or decimal numbers.")
-ERR_DECPARSE_DEFINE_PKGNAME         = \
-_("Package name must be AlphaNumeric characters.")
-ERR_DECPARSE_DEFINE_PKGGUID         = \
-_("GUID format error, must be HEX value with form 8-4-4-4-12.")
-ERR_DECPARSE_DEFINE_PKGVERSION      = \
-_("Version number must be decimal number.")
-ERR_DECPARSE_DEFINE_PKGVUNI         = \
-_("UNI file name format error or file does not exist.")
-ERR_DECPARSE_INCLUDE                = \
-_("Incorrect path: [%s].")
-ERR_DECPARSE_LIBCLASS_SPLIT         = \
-_("Library class format error, must be Libraryclass|Headerpath.")
-ERR_DECPARSE_LIBCLASS_EMPTY         = \
-_("Class name or file name must not be empty.")
-ERR_DECPARSE_LIBCLASS_LIB           = \
-_("Class name format error, must start with upper case letter followed with "
-  "zero or more alphanumeric characters.")
-ERR_DECPARSE_LIBCLASS_PATH_EXT      = _("File name must be end with .h.")
-ERR_DECPARSE_LIBCLASS_PATH_DOT      = _("Path must not include '..'.")
-ERR_DECPARSE_LIBCLASS_PATH_EXIST    = _("File name [%s] does not exist.")
-ERR_DECPARSE_PCD_CVAR_GUID          = \
-_("TokenSpaceGuidCName must be valid C variable format.")
-ERR_DECPARSE_PCD_SPLIT              = \
-_("Incorrect PcdName. The format must be TokenSpaceGuidCName.PcdCName"
-                                        "|PcdData|PcdType|Token.")
-ERR_DECPARSE_PCD_NAME               = \
-_("Incorrect PCD name. The correct format must be "
-  "<TokenSpaceGuidCName>.<PcdCName>.")
-ERR_DECPARSE_PCD_CVAR_PCDCNAME      = \
-_("PcdCName must be valid C variable format.")
-ERR_DECPARSE_PCD_TYPE               = \
-_('Incorrect PCD data type. A PCD data type  must be one of '
-  '"UINT8", "UINT16", "UINT32", "UINT64", "VOID*", "BOOLEAN".')
-ERR_DECPARSE_PCD_VOID               = \
-_("Incorrect  value [%s] of type [%s].  Value  must be printable and in the "
-  "form of{...} for array, or ""..."" for string, or L""..."""
-  "for unicode string.")
-ERR_DECPARSE_PCD_VALUE_EMPTY        = \
-_("Pcd value can not be empty.")
-ERR_DECPARSE_PCD_BOOL               = \
-_("Invalid value [%s] of type [%s]; must be expression, TRUE, FALSE, 0 or 1.")
-ERR_DECPARSE_PCD_INT                = _("Incorrect value [%s] of type [%s]."\
-" Value must be a hexadecimal, decimal or octal in C language format.")
-ERR_DECPARSE_PCD_INT_NEGTIVE        = _("Incorrect value [%s] of type [%s];"
-                                        " must not be signed number.")
-ERR_DECPARSE_PCD_INT_EXCEED         = _("Incorrect value [%s] of type [%s]; "
-                                    "the number is too long for this type.")
-ERR_DECPARSE_PCD_FEATUREFLAG        = \
-_("PcdFeatureFlag only allow BOOLEAN type.")
-ERR_DECPARSE_PCD_TOKEN              = \
-_("An incorrect PCD token found: [%s].  "
-  "It must start with 0x followed by 1 - 8 hexadecimal. ")
-ERR_DECPARSE_PCD_TOKEN_INT          = _("Incorrect token number [%s].  "
-     "This token number exceeds the maximal value of unsigned 32.")
-ERR_DECPARSE_PCD_TOKEN_UNIQUE       = _("Token number must be unique to the token space: %s.")
-ERR_DECPARSE_CGUID                  = \
-_("No GUID name or value specified, must be <CName> = <GuidValueInCFormat>.")
-ERR_DECPARSE_CGUID_NAME             = \
-_("No GUID name specified, must be <CName> = <GuidValueInCFormat>.")
-ERR_DECPARSE_CGUID_GUID             = \
-_("No GUID value specified, must be <CName> = <GuidValueInCFormat>.")
-ERR_DECPARSE_CGUID_GUIDFORMAT       = \
-_("Incorrect GUID value format, must be <GuidValueInCFormat:"
-  "{8,4,4,{2,2,2,2,2,2,2,2}}>.")
-ERR_DECPARSE_CGUID_NOT_FOUND = _("Unable to find the GUID value of this GUID CName : '%s'.")
-ERR_DECPARSE_FILEOPEN               = _("Unable to open: [%s].")
-ERR_DECPARSE_SECTION_EMPTY          = _("Empty sections are not allowed.")
-ERR_DECPARSE_SECTION_UE             = _("Incorrect UserExtensions format. "
+ERR_DECPARSE_STATEMENT_EMPTY = \
+    _('Must have at least one statement in section %s.')
+ERR_DECPARSE_DEFINE_DEFINED = \
+    _('%s already defined in define section.')
+ERR_DECPARSE_DEFINE_SECNAME = \
+    _('No arch and others can be followed for define section.')
+ERR_DECPARSE_DEFINE_MULTISEC = \
+    _('The DEC file does not allow multiple define sections.')
+ERR_DECPARSE_DEFINE_REQUIRED = \
+    _("Field [%s] is required in define section.")
+ERR_DECPARSE_DEFINE_FORMAT = \
+    _("Wrong define section format, must be KEY = Value.")
+ERR_DECPARSE_DEFINE_UNKNOWKEY = \
+    _("Unknown key [%s] in define section.")
+ERR_DECPARSE_DEFINE_SPEC = \
+    _("Specification value must be HEX numbers or decimal numbers.")
+ERR_DECPARSE_DEFINE_PKGNAME = \
+    _("Package name must be AlphaNumeric characters.")
+ERR_DECPARSE_DEFINE_PKGGUID = \
+    _("GUID format error, must be HEX value with form 8-4-4-4-12.")
+ERR_DECPARSE_DEFINE_PKGVERSION = \
+    _("Version number must be decimal number.")
+ERR_DECPARSE_DEFINE_PKGVUNI = \
+    _("UNI file name format error or file does not exist.")
+ERR_DECPARSE_INCLUDE = \
+    _("Incorrect path: [%s].")
+ERR_DECPARSE_LIBCLASS_SPLIT = \
+    _("Library class format error, must be Libraryclass|Headerpath.")
+ERR_DECPARSE_LIBCLASS_EMPTY = \
+    _("Class name or file name must not be empty.")
+ERR_DECPARSE_LIBCLASS_LIB = \
+    _("Class name format error, must start with upper case letter followed with "
+      "zero or more alphanumeric characters.")
+ERR_DECPARSE_LIBCLASS_PATH_EXT = _("File name must be end with .h.")
+ERR_DECPARSE_LIBCLASS_PATH_DOT = _("Path must not include '..'.")
+ERR_DECPARSE_LIBCLASS_PATH_EXIST = _("File name [%s] does not exist.")
+ERR_DECPARSE_PCD_CVAR_GUID = \
+    _("TokenSpaceGuidCName must be valid C variable format.")
+ERR_DECPARSE_PCD_SPLIT = \
+    _("Incorrect PcdName. The format must be TokenSpaceGuidCName.PcdCName"
+      "|PcdData|PcdType|Token.")
+ERR_DECPARSE_PCD_NAME = \
+    _("Incorrect PCD name. The correct format must be "
+      "<TokenSpaceGuidCName>.<PcdCName>.")
+ERR_DECPARSE_PCD_CVAR_PCDCNAME = \
+    _("PcdCName must be valid C variable format.")
+ERR_DECPARSE_PCD_TYPE = \
+    _('Incorrect PCD data type. A PCD data type  must be one of '
+      '"UINT8", "UINT16", "UINT32", "UINT64", "VOID*", "BOOLEAN".')
+ERR_DECPARSE_PCD_VOID = \
+    _("Incorrect  value [%s] of type [%s].  Value  must be printable and in the "
+      "form of{...} for array, or ""..."" for string, or L""..."""
+      "for unicode string.")
+ERR_DECPARSE_PCD_VALUE_EMPTY = \
+    _("Pcd value can not be empty.")
+ERR_DECPARSE_PCD_BOOL = \
+    _("Invalid value [%s] of type [%s]; must be expression, TRUE, FALSE, 0 or 1.")
+ERR_DECPARSE_PCD_INT = _("Incorrect value [%s] of type [%s]."
+                         " Value must be a hexadecimal, decimal or octal in C language format.")
+ERR_DECPARSE_PCD_INT_NEGTIVE = _("Incorrect value [%s] of type [%s];"
+                                 " must not be signed number.")
+ERR_DECPARSE_PCD_INT_EXCEED = _("Incorrect value [%s] of type [%s]; "
+                                "the number is too long for this type.")
+ERR_DECPARSE_PCD_FEATUREFLAG = \
+    _("PcdFeatureFlag only allow BOOLEAN type.")
+ERR_DECPARSE_PCD_TOKEN = \
+    _("An incorrect PCD token found: [%s].  "
+      "It must start with 0x followed by 1 - 8 hexadecimal. ")
+ERR_DECPARSE_PCD_TOKEN_INT = _("Incorrect token number [%s].  "
+                               "This token number exceeds the maximal value of unsigned 32.")
+ERR_DECPARSE_PCD_TOKEN_UNIQUE = _(
+    "Token number must be unique to the token space: %s.")
+ERR_DECPARSE_CGUID = \
+    _("No GUID name or value specified, must be <CName> = <GuidValueInCFormat>.")
+ERR_DECPARSE_CGUID_NAME = \
+    _("No GUID name specified, must be <CName> = <GuidValueInCFormat>.")
+ERR_DECPARSE_CGUID_GUID = \
+    _("No GUID value specified, must be <CName> = <GuidValueInCFormat>.")
+ERR_DECPARSE_CGUID_GUIDFORMAT = \
+    _("Incorrect GUID value format, must be <GuidValueInCFormat:"
+      "{8,4,4,{2,2,2,2,2,2,2,2}}>.")
+ERR_DECPARSE_CGUID_NOT_FOUND = _(
+    "Unable to find the GUID value of this GUID CName : '%s'.")
+ERR_DECPARSE_FILEOPEN = _("Unable to open: [%s].")
+ERR_DECPARSE_SECTION_EMPTY = _("Empty sections are not allowed.")
+ERR_DECPARSE_SECTION_UE = _("Incorrect UserExtensions format. "
                             "Must be UserExtenxions.UserId.IdString[.Arch]+.")
-ERR_DECPARSE_SECTION_UE_USERID      = _("Invalid UserId, must be underscore"
-                                        "or alphanumeric characters.")
-ERR_DECPARSE_SECTION_UE_IDSTRING    = \
+ERR_DECPARSE_SECTION_UE_USERID = _("Invalid UserId, must be underscore"
+                                   "or alphanumeric characters.")
+ERR_DECPARSE_SECTION_UE_IDSTRING = \
     _("Incorrect IdString, must be \" ... \".")
-ERR_DECPARSE_ARCH                   = \
-_("Unknown arch, must be 'common' or start with upper case letter followed by"
-                            " zero or more upper case letters and numbers.")
-ERR_DECPARSE_SECTION_COMMA          = _("Section cannot end with comma.")
-ERR_DECPARSE_SECTION_COMMON         = \
-_("'COMMON' must not be used with specific ARCHs in the same section.")
-ERR_DECPARSE_SECTION_IDENTIFY       = \
-_("Section header must start with and end with brackets[].")
-ERR_DECPARSE_SECTION_SUBEMPTY       = \
-_("Missing a sub-section name in section: [%s]. "
-  "All sub-sections need to have names. ")
-ERR_DECPARSE_SECTION_SUBTOOMANY     = _("Too many DOT splits in [%s].")
-ERR_DECPARSE_SECTION_UNKNOW         = _("Section name [%s] unknown.")
-ERR_DECPARSE_SECTION_FEATUREFLAG    = \
-_("[%s] must not be in the same section as other types of PCD.")
-ERR_DECPARSE_MACRO_PAIR             = _("No macro name/value given.")
-ERR_DECPARSE_MACRO_NAME             = _("No macro name given.")
-ERR_DECPARSE_MACRO_NAME_UPPER       = \
-_("Macro name must start with upper case letter followed "
-"by zero or more upper case letters or numbers.  Current macro name is: [%s].")
-ERR_DECPARSE_SECTION_NAME           = \
-_('Cannot mix different section names %s.')
-ERR_DECPARSE_BACKSLASH              = \
-_('Backslash must be the last character on a line and '
-                                        'preceded by a space character.')
-ERR_DECPARSE_BACKSLASH_EMPTY        = \
-_('Empty line after previous line that has backslash is not allowed.')
-ERR_DECPARSE_REDEFINE               = _(
+ERR_DECPARSE_ARCH = \
+    _("Unknown arch, must be 'common' or start with upper case letter followed by"
+      " zero or more upper case letters and numbers.")
+ERR_DECPARSE_SECTION_COMMA = _("Section cannot end with comma.")
+ERR_DECPARSE_SECTION_COMMON = \
+    _("'COMMON' must not be used with specific ARCHs in the same section.")
+ERR_DECPARSE_SECTION_IDENTIFY = \
+    _("Section header must start with and end with brackets[].")
+ERR_DECPARSE_SECTION_SUBEMPTY = \
+    _("Missing a sub-section name in section: [%s]. "
+      "All sub-sections need to have names. ")
+ERR_DECPARSE_SECTION_SUBTOOMANY = _("Too many DOT splits in [%s].")
+ERR_DECPARSE_SECTION_UNKNOW = _("Section name [%s] unknown.")
+ERR_DECPARSE_SECTION_FEATUREFLAG = \
+    _("[%s] must not be in the same section as other types of PCD.")
+ERR_DECPARSE_MACRO_PAIR = _("No macro name/value given.")
+ERR_DECPARSE_MACRO_NAME = _("No macro name given.")
+ERR_DECPARSE_MACRO_NAME_UPPER = \
+    _("Macro name must start with upper case letter followed "
+      "by zero or more upper case letters or numbers.  Current macro name is: [%s].")
+ERR_DECPARSE_SECTION_NAME = \
+    _('Cannot mix different section names %s.')
+ERR_DECPARSE_BACKSLASH = \
+    _('Backslash must be the last character on a line and '
+      'preceded by a space character.')
+ERR_DECPARSE_BACKSLASH_EMPTY = \
+    _('Empty line after previous line that has backslash is not allowed.')
+ERR_DECPARSE_REDEFINE = _(
     "\"%s\" already defined in line %d.")
-ERR_DECPARSE_MACRO_RESOLVE          = _("Macro %s in %s cannot be resolved.")
-ERR_DECPARSE_UE_DUPLICATE           = \
+ERR_DECPARSE_MACRO_RESOLVE = _("Macro %s in %s cannot be resolved.")
+ERR_DECPARSE_UE_DUPLICATE = \
     _("Duplicated UserExtensions header found.")
 ERR_DECPARSE_PCDERRORMSG_MISS_VALUE_SPLIT = \
     _("Missing '|' between Pcd's error code and Pcd's error message.")
@@ -763,36 +794,38 @@ ERR_DECPARSE_PCD_MISS_ERRORMSG = \
     _("Missing Pcd's error message.")
 ERR_DECPARSE_PCD_UNMATCHED_ERRORCODE = \
     _("There is no error message matched with this Pcd error code : %s in both DEC and UNI file.")
-ERR_DECPARSE_PCD_NODEFINED = _("The PCD : %s used in the Expression is undefined.")
+ERR_DECPARSE_PCD_NODEFINED = _(
+    "The PCD : %s used in the Expression is undefined.")
 #
 # Used to print the current line content which cause error raise.
 # Be attached to the end of every error message above.
 #
-ERR_DECPARSE_LINE                   = _(" Parsing line: \"%s\".")
+ERR_DECPARSE_LINE = _(" Parsing line: \"%s\".")
 
 #
 # Warning related strings.
 #
-WRN_PACKAGE_EXISTED       = _(
+WRN_PACKAGE_EXISTED = _(
     "A package with this GUID and Version already exists: "
     "GUID %s, Version %s.")
-WRN_MODULE_EXISTED        = _("This module already exists: %s")
-WRN_FILE_EXISTED          = _("This file already exists: %s")
-WRN_FILE_NOT_OVERWRITTEN  = \
-_("This file already exist and cannot be overwritten: %s")
-WRN_DIST_PKG_INSTALLED = _("This distribution package %s has previously been installed.")
-WRN_DIST_NOT_FOUND         = _(
+WRN_MODULE_EXISTED = _("This module already exists: %s")
+WRN_FILE_EXISTED = _("This file already exists: %s")
+WRN_FILE_NOT_OVERWRITTEN = \
+    _("This file already exist and cannot be overwritten: %s")
+WRN_DIST_PKG_INSTALLED = _(
+    "This distribution package %s has previously been installed.")
+WRN_DIST_NOT_FOUND = _(
     "Distribution is not found at location %s")
-WRN_MULTI_PCD_RANGES      = _(
+WRN_MULTI_PCD_RANGES = _(
     "A PCD can only have one type of @ValidRange, @ValidList, and @Expression comment")
-WRN_MULTI_PCD_VALIDVALUE  = _(
+WRN_MULTI_PCD_VALIDVALUE = _(
     "A PCD can only have one of @ValidList comment")
-WRN_MULTI_PCD_PROMPT      = _(
+WRN_MULTI_PCD_PROMPT = _(
     "A PCD can only have one of @Prompt comment")
-WRN_MISSING_USAGE                = _("Missing usage")
-WRN_INVALID_GUID_TYPE            = _("This is and incorrect Guid type: %s")
-WRN_MISSING_GUID_TYPE            = _("Missing Guid Type")
-WRN_INVALID_USAGE                = _("This is an incorrect Usage: %s")
+WRN_MISSING_USAGE = _("Missing usage")
+WRN_INVALID_GUID_TYPE = _("This is and incorrect Guid type: %s")
+WRN_MISSING_GUID_TYPE = _("Missing Guid Type")
+WRN_INVALID_USAGE = _("This is an incorrect Usage: %s")
 WRN_INF_PARSER_MODULE_INVALID_HOB_TYPE = \
     _("This is an incorrect HOB type: %s")
 WRN_INF_PARSER_MODULE_INVALID_EVENT_TYPE = \
@@ -817,37 +850,37 @@ WARN_CUSTOMPATH_OVERRIDE_USEGUIDEDPATH = \
 #
 # Help related strings.
 #
-HLP_PRINT_DEBUG_INFO             = _(
+HLP_PRINT_DEBUG_INFO = _(
     "Print DEBUG statements, where DEBUG_LEVEL is 0-9")
 HLP_PRINT_INFORMATIONAL_STATEMENT = _("Print informational statements")
-HLP_RETURN_NO_DISPLAY            = _(
+HLP_RETURN_NO_DISPLAY = _(
     "Returns only the exit code, informational and error messages are"
     " not displayed")
-HLP_RETURN_AND_DISPLAY           = _(
+HLP_RETURN_AND_DISPLAY = _(
     "Returns the exit code and displays  error messages only")
 HLP_SPECIFY_PACKAGE_NAME_INSTALL = _(
     "Specify the UEFI Distribution Package filename to install")
-HLP_SPECIFY_PACKAGE_NAME_CREATE  = _(
+HLP_SPECIFY_PACKAGE_NAME_CREATE = _(
     "Specify the UEFI Distribution Package filename to create")
-HLP_SPECIFY_PACKAGE_NAME_REMOVE  = _(
+HLP_SPECIFY_PACKAGE_NAME_REMOVE = _(
     "Specify the UEFI Distribution Package filename to remove")
 HLP_SPECIFY_TEMPLATE_NAME_CREATE = _(
     "Specify Package Information Data filename to create package")
-HLP_SPECIFY_DEC_NAME_CREATE      = _(
+HLP_SPECIFY_DEC_NAME_CREATE = _(
     "Specify dec file names to create package")
-HLP_SPECIFY_INF_NAME_CREATE      = _(
+HLP_SPECIFY_INF_NAME_CREATE = _(
     "Specify inf file names to create package")
-HLP_LIST_DIST_INSTALLED      = _(
+HLP_LIST_DIST_INSTALLED = _(
     "List the UEFI Distribution Packages that have been installed")
-HLP_NO_SUPPORT_GUI               = _(
+HLP_NO_SUPPORT_GUI = _(
     "Starting the tool in graphical mode is not supported in this version")
-HLP_DISABLE_PROMPT               = _(
+HLP_DISABLE_PROMPT = _(
     "Disable user prompts for removing modified files. Valid only when -r is present")
-HLP_CUSTOM_PATH_PROMPT           = _(
+HLP_CUSTOM_PATH_PROMPT = _(
     "Enable user prompting for alternate installation directories")
-HLP_SKIP_LOCK_CHECK              = _(
+HLP_SKIP_LOCK_CHECK = _(
     "Skip the check for multiple instances")
-HLP_SPECIFY_PACKAGE_NAME_REPLACE  = _(
+HLP_SPECIFY_PACKAGE_NAME_REPLACE = _(
     "Specify the UEFI Distribution Package file name to replace the existing file name")
 HLP_SPECIFY_PACKAGE_NAME_TO_BE_REPLACED = _(
     "Specify the UEFI Distribution Package file name to be replaced")
@@ -856,5 +889,7 @@ HLP_USE_GUIDED_PATHS = _(
 HLP_TEST_INSTALL = _(
     "Specify the UEFI Distribution Package filenames to install")
 
-MSG_TEST_INSTALL_PASS = _("All distribution package file are satisfied for dependence check.")
-MSG_TEST_INSTALL_FAIL = _("NOT all distribution package file are satisfied for dependence check.")
+MSG_TEST_INSTALL_PASS = _(
+    "All distribution package file are satisfied for dependence check.")
+MSG_TEST_INSTALL_FAIL = _(
+    "NOT all distribution package file are satisfied for dependence check.")
diff --git a/BaseTools/Source/Python/UPT/Logger/ToolError.py b/BaseTools/Source/Python/UPT/Logger/ToolError.py
index 925982beb346..32a7fcfcb977 100644
--- a/BaseTools/Source/Python/UPT/Logger/ToolError.py
+++ b/BaseTools/Source/Python/UPT/Logger/ToolError.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Standardized Error Handling infrastructures.
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -94,78 +94,79 @@ UPT_MUL_DEC_ERROR = 0xD004
 UPT_DB_UPDATE_ERROR = 0xD005
 UPT_INI_PARSE_ERROR = 0xE000
 
-## Error message of each error code
+# Error message of each error code
 #
 gERROR_MESSAGE = {
-    FILE_NOT_FOUND          :   ST.ERR_FILE_NOT_FOUND,
-    FILE_OPEN_FAILURE       :   ST.ERR_FILE_OPEN_FAILURE,
-    FILE_WRITE_FAILURE      :   ST.ERR_FILE_WRITE_FAILURE,
-    FILE_PARSE_FAILURE      :   ST.ERR_FILE_PARSE_FAILURE,
-    FILE_READ_FAILURE       :   ST.ERR_FILE_READ_FAILURE,
-    FILE_CREATE_FAILURE     :   ST.ERR_FILE_CREATE_FAILURE,
-    FILE_CHECKSUM_FAILURE   :   ST.ERR_FILE_CHECKSUM_FAILURE,
-    FILE_COMPRESS_FAILURE   :   ST.ERR_FILE_COMPRESS_FAILURE,
-    FILE_DECOMPRESS_FAILURE :   ST.ERR_FILE_DECOMPRESS_FAILURE,
-    FILE_MOVE_FAILURE       :   ST.ERR_FILE_MOVE_FAILURE,
-    FILE_DELETE_FAILURE     :   ST.ERR_FILE_DELETE_FAILURE,
-    FILE_COPY_FAILURE       :   ST.ERR_FILE_COPY_FAILURE,
+    FILE_NOT_FOUND:   ST.ERR_FILE_NOT_FOUND,
+    FILE_OPEN_FAILURE:   ST.ERR_FILE_OPEN_FAILURE,
+    FILE_WRITE_FAILURE:   ST.ERR_FILE_WRITE_FAILURE,
+    FILE_PARSE_FAILURE:   ST.ERR_FILE_PARSE_FAILURE,
+    FILE_READ_FAILURE:   ST.ERR_FILE_READ_FAILURE,
+    FILE_CREATE_FAILURE:   ST.ERR_FILE_CREATE_FAILURE,
+    FILE_CHECKSUM_FAILURE:   ST.ERR_FILE_CHECKSUM_FAILURE,
+    FILE_COMPRESS_FAILURE:   ST.ERR_FILE_COMPRESS_FAILURE,
+    FILE_DECOMPRESS_FAILURE:   ST.ERR_FILE_DECOMPRESS_FAILURE,
+    FILE_MOVE_FAILURE:   ST.ERR_FILE_MOVE_FAILURE,
+    FILE_DELETE_FAILURE:   ST.ERR_FILE_DELETE_FAILURE,
+    FILE_COPY_FAILURE:   ST.ERR_FILE_COPY_FAILURE,
     FILE_POSITIONING_FAILURE:   ST.ERR_FILE_POSITIONING_FAILURE,
-    FILE_ALREADY_EXIST      :   ST.ERR_FILE_ALREADY_EXIST,
-    FILE_TYPE_MISMATCH      :   ST.ERR_FILE_TYPE_MISMATCH ,
-    FILE_CASE_MISMATCH      :   ST.ERR_FILE_CASE_MISMATCH,
-    FILE_DUPLICATED         :   ST.ERR_FILE_DUPLICATED,
-    FILE_UNKNOWN_ERROR      :   ST.ERR_FILE_UNKNOWN_ERROR,
+    FILE_ALREADY_EXIST:   ST.ERR_FILE_ALREADY_EXIST,
+    FILE_TYPE_MISMATCH:   ST.ERR_FILE_TYPE_MISMATCH,
+    FILE_CASE_MISMATCH:   ST.ERR_FILE_CASE_MISMATCH,
+    FILE_DUPLICATED:   ST.ERR_FILE_DUPLICATED,
+    FILE_UNKNOWN_ERROR:   ST.ERR_FILE_UNKNOWN_ERROR,
 
-    OPTION_UNKNOWN          :   ST.ERR_OPTION_UNKNOWN,
-    OPTION_MISSING          :   ST.ERR_OPTION_MISSING,
-    OPTION_CONFLICT         :   ST.ERR_OPTION_CONFLICT,
-    OPTION_VALUE_INVALID    :   ST.ERR_OPTION_VALUE_INVALID,
-    OPTION_DEPRECATED       :   ST.ERR_OPTION_DEPRECATED,
-    OPTION_NOT_SUPPORTED    :   ST.ERR_OPTION_NOT_SUPPORTED,
-    OPTION_UNKNOWN_ERROR    :   ST.ERR_OPTION_UNKNOWN_ERROR,
+    OPTION_UNKNOWN:   ST.ERR_OPTION_UNKNOWN,
+    OPTION_MISSING:   ST.ERR_OPTION_MISSING,
+    OPTION_CONFLICT:   ST.ERR_OPTION_CONFLICT,
+    OPTION_VALUE_INVALID:   ST.ERR_OPTION_VALUE_INVALID,
+    OPTION_DEPRECATED:   ST.ERR_OPTION_DEPRECATED,
+    OPTION_NOT_SUPPORTED:   ST.ERR_OPTION_NOT_SUPPORTED,
+    OPTION_UNKNOWN_ERROR:   ST.ERR_OPTION_UNKNOWN_ERROR,
 
-    PARAMETER_INVALID       :   ST.ERR_PARAMETER_INVALID,
-    PARAMETER_MISSING       :   ST.ERR_PARAMETER_MISSING,
-    PARAMETER_UNKNOWN_ERROR :   ST.ERR_PARAMETER_UNKNOWN_ERROR,
+    PARAMETER_INVALID:   ST.ERR_PARAMETER_INVALID,
+    PARAMETER_MISSING:   ST.ERR_PARAMETER_MISSING,
+    PARAMETER_UNKNOWN_ERROR:   ST.ERR_PARAMETER_UNKNOWN_ERROR,
 
-    FORMAT_INVALID          :   ST.ERR_FORMAT_INVALID,
-    FORMAT_NOT_SUPPORTED    :   ST.ERR_FORMAT_NOT_SUPPORTED,
-    FORMAT_UNKNOWN          :   ST.ERR_FORMAT_UNKNOWN,
-    FORMAT_UNKNOWN_ERROR    :   ST.ERR_FORMAT_UNKNOWN_ERROR,
+    FORMAT_INVALID:   ST.ERR_FORMAT_INVALID,
+    FORMAT_NOT_SUPPORTED:   ST.ERR_FORMAT_NOT_SUPPORTED,
+    FORMAT_UNKNOWN:   ST.ERR_FORMAT_UNKNOWN,
+    FORMAT_UNKNOWN_ERROR:   ST.ERR_FORMAT_UNKNOWN_ERROR,
 
-    RESOURCE_NOT_AVAILABLE  :   ST.ERR_RESOURCE_NOT_AVAILABLE,
-    RESOURCE_ALLOCATE_FAILURE : ST.ERR_RESOURCE_ALLOCATE_FAILURE,
-    RESOURCE_FULL           :   ST.ERR_RESOURCE_FULL,
-    RESOURCE_OVERFLOW       :   ST.ERR_RESOURCE_OVERFLOW,
-    RESOURCE_UNDERRUN       :   ST.ERR_RESOURCE_UNDERRUN,
-    RESOURCE_UNKNOWN_ERROR  :   ST.ERR_RESOURCE_UNKNOWN_ERROR,
+    RESOURCE_NOT_AVAILABLE:   ST.ERR_RESOURCE_NOT_AVAILABLE,
+    RESOURCE_ALLOCATE_FAILURE: ST.ERR_RESOURCE_ALLOCATE_FAILURE,
+    RESOURCE_FULL:   ST.ERR_RESOURCE_FULL,
+    RESOURCE_OVERFLOW:   ST.ERR_RESOURCE_OVERFLOW,
+    RESOURCE_UNDERRUN:   ST.ERR_RESOURCE_UNDERRUN,
+    RESOURCE_UNKNOWN_ERROR:   ST.ERR_RESOURCE_UNKNOWN_ERROR,
 
-    ATTRIBUTE_NOT_AVAILABLE :   ST.ERR_ATTRIBUTE_NOT_AVAILABLE,
-    ATTRIBUTE_RETRIEVE_FAILURE : ST.ERR_ATTRIBUTE_RETRIEVE_FAILURE,
-    ATTRIBUTE_SET_FAILURE   :   ST.ERR_ATTRIBUTE_SET_FAILURE,
+    ATTRIBUTE_NOT_AVAILABLE:   ST.ERR_ATTRIBUTE_NOT_AVAILABLE,
+    ATTRIBUTE_RETRIEVE_FAILURE: ST.ERR_ATTRIBUTE_RETRIEVE_FAILURE,
+    ATTRIBUTE_SET_FAILURE:   ST.ERR_ATTRIBUTE_SET_FAILURE,
     ATTRIBUTE_UPDATE_FAILURE:   ST.ERR_ATTRIBUTE_UPDATE_FAILURE,
-    ATTRIBUTE_ACCESS_DENIED :   ST.ERR_ATTRIBUTE_ACCESS_DENIED,
-    ATTRIBUTE_UNKNOWN_ERROR :   ST.ERR_ATTRIBUTE_UNKNOWN_ERROR,
+    ATTRIBUTE_ACCESS_DENIED:   ST.ERR_ATTRIBUTE_ACCESS_DENIED,
+    ATTRIBUTE_UNKNOWN_ERROR:   ST.ERR_ATTRIBUTE_UNKNOWN_ERROR,
 
-    COMMAND_FAILURE         :   ST.ERR_COMMAND_FAILURE,
+    COMMAND_FAILURE:   ST.ERR_COMMAND_FAILURE,
 
-    IO_NOT_READY            :   ST.ERR_IO_NOT_READY,
-    IO_BUSY                 :   ST.ERR_IO_BUSY,
-    IO_TIMEOUT              :   ST.ERR_IO_TIMEOUT,
-    IO_UNKNOWN_ERROR        :   ST.ERR_IO_UNKNOWN_ERROR,
+    IO_NOT_READY:   ST.ERR_IO_NOT_READY,
+    IO_BUSY:   ST.ERR_IO_BUSY,
+    IO_TIMEOUT:   ST.ERR_IO_TIMEOUT,
+    IO_UNKNOWN_ERROR:   ST.ERR_IO_UNKNOWN_ERROR,
 
-    UNKNOWN_ERROR           :   ST.ERR_UNKNOWN_ERROR,
+    UNKNOWN_ERROR:   ST.ERR_UNKNOWN_ERROR,
 
-    UPT_ALREADY_INSTALLED_ERROR : ST.ERR_UPT_ALREADY_INSTALLED_ERROR,
-    UPT_ENVIRON_MISSING_ERROR   : ST.ERR_UPT_ENVIRON_MISSING_ERROR,
-    UPT_REPKG_ERROR             : ST.ERR_UPT_REPKG_ERROR,
-    UPT_ALREADY_RUNNING_ERROR   : ST.ERR_UPT_ALREADY_RUNNING_ERROR,
-    UPT_MUL_DEC_ERROR           : ST.ERR_MUL_DEC_ERROR,
-    UPT_INI_PARSE_ERROR     :   ST.ERR_UPT_INI_PARSE_ERROR,
+    UPT_ALREADY_INSTALLED_ERROR: ST.ERR_UPT_ALREADY_INSTALLED_ERROR,
+    UPT_ENVIRON_MISSING_ERROR: ST.ERR_UPT_ENVIRON_MISSING_ERROR,
+    UPT_REPKG_ERROR: ST.ERR_UPT_REPKG_ERROR,
+    UPT_ALREADY_RUNNING_ERROR: ST.ERR_UPT_ALREADY_RUNNING_ERROR,
+    UPT_MUL_DEC_ERROR: ST.ERR_MUL_DEC_ERROR,
+    UPT_INI_PARSE_ERROR:   ST.ERR_UPT_INI_PARSE_ERROR,
 }
 
-## Exception indicating a fatal error
+# Exception indicating a fatal error
 #
+
+
 class FatalError(Exception):
     pass
-
diff --git a/BaseTools/Source/Python/UPT/Logger/__init__.py b/BaseTools/Source/Python/UPT/Logger/__init__.py
index e53198451794..cee19ef24060 100644
--- a/BaseTools/Source/Python/UPT/Logger/__init__.py
+++ b/BaseTools/Source/Python/UPT/Logger/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'Logger' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/UPT/MkPkg.py b/BaseTools/Source/Python/UPT/MkPkg.py
index c6d4731ed655..e7912b44c6a1 100644
--- a/BaseTools/Source/Python/UPT/MkPkg.py
+++ b/BaseTools/Source/Python/UPT/MkPkg.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Install distribution package.
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -46,27 +46,32 @@ from Core.DistributionPackageClass import DistributionPackageClass
 from Core.PackageFile import PackageFile
 from Common.MultipleWorkspace import MultipleWorkspace as mws
 
-## CheckForExistingDp
+# CheckForExistingDp
 #
 # Check if there is a same name DP file existing
 # @param Path: The path to be checked
 #
+
+
 def CheckForExistingDp(Path):
     if os.path.exists(Path):
         Logger.Info(ST.MSG_DISTRIBUTION_PACKAGE_FILE_EXISTS % Path)
         Input = stdin.readline()
         Input = Input.replace('\r', '').replace('\n', '')
         if Input.upper() != "Y":
-            Logger.Error("\nMkPkg", ABORT_ERROR, ST.ERR_USER_ABORT, RaiseError=True)
+            Logger.Error("\nMkPkg", ABORT_ERROR,
+                         ST.ERR_USER_ABORT, RaiseError=True)
 
-## Tool entrance method
+# Tool entrance method
 #
 # This method mainly dispatch specific methods per the command line options.
 # If no error found, return zero value so the caller of this tool can know
 # if it's executed successfully or not.
 #
 #
-def Main(Options = None):
+
+
+def Main(Options=None):
     if Options is None:
         Logger.Error("\nMkPkg", OPTION_UNKNOWN_ERROR, ST.ERR_OPTION_NOT_FOUND)
     try:
@@ -78,7 +83,8 @@ def Main(Options = None):
         # Init PackFileToCreate
         #
         if not Options.PackFileToCreate:
-            Logger.Error("\nMkPkg", OPTION_UNKNOWN_ERROR, ST.ERR_OPTION_NOT_FOUND)
+            Logger.Error("\nMkPkg", OPTION_UNKNOWN_ERROR,
+                         ST.ERR_OPTION_NOT_FOUND)
 
         #
         # Handle if the distribution package file already exists
@@ -88,11 +94,13 @@ def Main(Options = None):
         #
         # Check package file existing and valid
         #
-        CheckFileList('.DEC', Options.PackageFileList, ST.ERR_INVALID_PACKAGE_NAME, ST.ERR_INVALID_PACKAGE_PATH)
+        CheckFileList('.DEC', Options.PackageFileList,
+                      ST.ERR_INVALID_PACKAGE_NAME, ST.ERR_INVALID_PACKAGE_PATH)
         #
         # Check module file existing and valid
         #
-        CheckFileList('.INF', Options.ModuleFileList, ST.ERR_INVALID_MODULE_NAME, ST.ERR_INVALID_MODULE_PATH)
+        CheckFileList('.INF', Options.ModuleFileList,
+                      ST.ERR_INVALID_MODULE_NAME, ST.ERR_INVALID_MODULE_PATH)
 
         #
         # Get list of files that installed with RePackage attribute available
@@ -130,15 +138,17 @@ def Main(Options = None):
                 # strings in your desired encoding before passing them to
                 # write().
                 #
-                FromFile = os.path.normpath(FileObject.GetURI()).encode('utf_8')
+                FromFile = os.path.normpath(
+                    FileObject.GetURI()).encode('utf_8')
                 FileFullPath = mws.join(WorkspaceDir, FromFile)
                 if FileFullPath in RePkgDict:
-                    (DpGuid, DpVersion, DpName, Repackage) = RePkgDict[FileFullPath]
+                    (DpGuid, DpVersion, DpName,
+                     Repackage) = RePkgDict[FileFullPath]
                     if not Repackage:
                         Logger.Error("\nMkPkg",
                                      UPT_REPKG_ERROR,
                                      ST.ERR_UPT_REPKG_ERROR,
-                                     ExtraData=ST.MSG_REPKG_CONFLICT %\
+                                     ExtraData=ST.MSG_REPKG_CONFLICT %
                                      (FileFullPath, DpGuid, DpVersion, DpName)
                                      )
                     else:
@@ -155,7 +165,7 @@ def Main(Options = None):
             DistPkg.Header.Guid = str(uuid4())
             DistPkg.Header.Version = '1.0'
 
-        DistPkg.GetDistributionPackage(WorkspaceDir, Options.PackageFileList, \
+        DistPkg.GetDistributionPackage(WorkspaceDir, Options.PackageFileList,
                                        Options.ModuleFileList)
         FileList, MetaDataFileList = DistPkg.GetDistributionFileList()
         for File in FileList + MetaDataFileList:
@@ -165,14 +175,14 @@ def Main(Options = None):
             # be repackaged
             #
             if FileFullPath in RePkgDict:
-                (DpGuid, DpVersion, DpName, Repackage) = RePkgDict[FileFullPath]
+                (DpGuid, DpVersion, DpName,
+                 Repackage) = RePkgDict[FileFullPath]
                 if not Repackage:
                     Logger.Error("\nMkPkg",
                                  UPT_REPKG_ERROR,
                                  ST.ERR_UPT_REPKG_ERROR,
-                                 ExtraData = \
-                                 ST.MSG_REPKG_CONFLICT %(FileFullPath, DpName, \
-                                                         DpGuid, DpVersion)
+                                 ExtraData=ST.MSG_REPKG_CONFLICT % (FileFullPath, DpName,
+                                                                    DpGuid, DpVersion)
                                  )
                 else:
                     DistPkg.Header.RePackage = True
@@ -190,7 +200,8 @@ def Main(Options = None):
         #
         # Add Md5Signature
         #
-        DistPkg.Header.Signature = md5(open(str(ContentFile), 'rb').read()).hexdigest()
+        DistPkg.Header.Signature = md5(
+            open(str(ContentFile), 'rb').read()).hexdigest()
         #
         # Add current Date
         #
@@ -210,25 +221,25 @@ def Main(Options = None):
     except FatalError as XExcept:
         ReturnCode = XExcept.args[0]
         if Logger.GetLevel() <= Logger.DEBUG_9:
-            Logger.Quiet(ST.MSG_PYTHON_ON % \
+            Logger.Quiet(ST.MSG_PYTHON_ON %
                          (python_version(), platform) + format_exc())
     except KeyboardInterrupt:
         ReturnCode = ABORT_ERROR
         if Logger.GetLevel() <= Logger.DEBUG_9:
-            Logger.Quiet(ST.MSG_PYTHON_ON % \
+            Logger.Quiet(ST.MSG_PYTHON_ON %
                          (python_version(), platform) + format_exc())
     except OSError:
         pass
     except:
         Logger.Error(
-                    "\nMkPkg",
-                    CODE_ERROR,
-                    ST.ERR_UNKNOWN_FATAL_CREATING_ERR % \
-                    Options.PackFileToCreate,
-                    ExtraData=ST.MSG_SEARCH_FOR_HELP % ST.MSG_EDKII_MAIL_ADDR,
-                    RaiseError=False
-                    )
-        Logger.Quiet(ST.MSG_PYTHON_ON % \
+            "\nMkPkg",
+            CODE_ERROR,
+            ST.ERR_UNKNOWN_FATAL_CREATING_ERR %
+            Options.PackFileToCreate,
+            ExtraData=ST.MSG_SEARCH_FOR_HELP % ST.MSG_EDKII_MAIL_ADDR,
+            RaiseError=False
+        )
+        Logger.Quiet(ST.MSG_PYTHON_ON %
                      (python_version(), platform) + format_exc())
         ReturnCode = CODE_ERROR
     finally:
@@ -240,7 +251,7 @@ def Main(Options = None):
     return ReturnCode
 
 
-## CheckFileList
+# CheckFileList
 #
 # @param QualifiedExt:             QualifiedExt
 # @param FileList:                 FileList
@@ -255,7 +266,7 @@ def CheckFileList(QualifiedExt, FileList, ErrorStringExt, ErrorStringFullPath):
     for Item in FileList:
         Ext = os.path.splitext(Item)[1]
         if Ext.upper() != QualifiedExt.upper():
-            Logger.Error("\nMkPkg", OPTION_VALUE_INVALID, \
+            Logger.Error("\nMkPkg", OPTION_VALUE_INVALID,
                          ErrorStringExt % Item)
 
         Item = os.path.normpath(Item)
@@ -266,9 +277,9 @@ def CheckFileList(QualifiedExt, FileList, ErrorStringExt, ErrorStringFullPath):
             Logger.Error("\nMkPkg", OPTION_VALUE_INVALID,
                          ErrorStringFullPath % Item)
         elif not IsValidPath(Item, WorkspaceDir):
-            Logger.Error("\nMkPkg", OPTION_VALUE_INVALID, \
+            Logger.Error("\nMkPkg", OPTION_VALUE_INVALID,
                          ErrorStringExt % Item)
 
         if not os.path.split(Item)[0]:
-            Logger.Error("\nMkPkg", OPTION_VALUE_INVALID, \
+            Logger.Error("\nMkPkg", OPTION_VALUE_INVALID,
                          ST.ERR_INVALID_METAFILE_PATH % Item)
diff --git a/BaseTools/Source/Python/UPT/Object/POM/CommonObject.py b/BaseTools/Source/Python/UPT/Object/POM/CommonObject.py
index ae8fe8306dbd..061a3864df4f 100644
--- a/BaseTools/Source/Python/UPT/Object/POM/CommonObject.py
+++ b/BaseTools/Source/Python/UPT/Object/POM/CommonObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define common items of class object
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -10,10 +10,12 @@ Common Object
 '''
 from Library.DataType import TAB_LANGUAGE_EN_US
 
-## HelpTextObject
+# HelpTextObject
 #
 # @param object:       Inherited from object class
 #
+
+
 class HelpTextObject(object):
     def __init__(self):
         self.HelpText = TextObject()
@@ -24,10 +26,12 @@ class HelpTextObject(object):
     def GetHelpText(self):
         return self.HelpText
 
-## HelpTextListObject
+# HelpTextListObject
 #
 # @param object:       Inherited from object class
 #
+
+
 class HelpTextListObject(object):
     def __init__(self):
         self.HelpTextList = []
@@ -38,10 +42,12 @@ class HelpTextListObject(object):
     def GetHelpTextList(self):
         return self.HelpTextList
 
-## PromptListObject
+# PromptListObject
 #
 # @param object:       Inherited from object class
 #
+
+
 class PromptListObject(object):
     def __init__(self):
         self.PromptList = []
@@ -52,7 +58,7 @@ class PromptListObject(object):
     def GetPromptList(self):
         return self.PromptList
 
-## CommonPropertiesObject
+# CommonPropertiesObject
 #
 # This class defined common attribution used in Module/Platform/Package files
 #
@@ -63,6 +69,8 @@ class PromptListObject(object):
 # @param HelpText:     Input value for HelpText, default is ''
 # @param HelpTextList: Input value for HelpTextList, default is []
 #
+
+
 class CommonPropertiesObject(HelpTextObject, HelpTextListObject):
     def __init__(self):
         self.Usage = []
@@ -96,12 +104,14 @@ class CommonPropertiesObject(HelpTextObject, HelpTextListObject):
     def GetGuidValue(self):
         return self.GuidValue
 
-## CommonHeaderObject
+# CommonHeaderObject
 #
 # This class defined common header items used in Module/Platform/Package files
 #
 # @param object:          Inherited from object class
 #
+
+
 class CommonHeaderObject(object):
     def __init__(self):
         self.AbstractList = []
@@ -145,12 +155,14 @@ class CommonHeaderObject(object):
     def GetLicense(self):
         return self.LicenseList
 
-## BinaryHeaderObject
+# BinaryHeaderObject
 #
 # This class defined Binary header items used in Module/Platform/Package files
 #
 # @param object:          Inherited from object class
 #
+
+
 class BinaryHeaderObject(object):
     def __init__(self):
         self.BinaryHeaderAbstractList = []
@@ -194,12 +206,14 @@ class BinaryHeaderObject(object):
     def GetBinaryHeaderLicense(self):
         return self.BinaryHeaderLicenseList
 
-## ClonedRecordObject
+# ClonedRecordObject
 #
 # This class defined ClonedRecord items used in Module/Platform/Package files
 #
 # @param object:        Inherited from object class
 #
+
+
 class ClonedRecordObject(object):
     def __init__(self):
         self.IdNum = 0
@@ -245,12 +259,14 @@ class ClonedRecordObject(object):
     def GetModuleVersion(self):
         return self.ModuleVersion
 
-## TextObject
+# TextObject
 #
 # This class defined Text item used in PKG file
 #
 # @param object:     Inherited from object class
 #
+
+
 class TextObject(object):
     def __init__(self):
         self.Lang = TAB_LANGUAGE_EN_US
@@ -268,12 +284,14 @@ class TextObject(object):
     def GetString(self):
         return self.String
 
-## FileNameObject
+# FileNameObject
 #
 # This class defined File item used in module, for binary files
 #
 # @param CommonPropertiesObject:   Inherited from CommonPropertiesObject class
 #
+
+
 class FileNameObject(CommonPropertiesObject):
     def __init__(self):
         self.FileType = ''
@@ -292,12 +310,14 @@ class FileNameObject(CommonPropertiesObject):
     def GetFilename(self):
         return self.Filename
 
-## FileObject
+# FileObject
 #
 # This class defined File item used in PKG file
 #
 # @param object:   Inherited from object class
 #
+
+
 class FileObject(object):
     def __init__(self):
         self.Executable = ''
@@ -327,6 +347,8 @@ class FileObject(object):
 #
 # @param CommonHeaderObject:   Inherited from CommonHeaderObject class
 #
+
+
 class MiscFileObject(CommonHeaderObject):
     def __init__(self):
         self.Name = ''
@@ -348,15 +370,19 @@ class MiscFileObject(CommonHeaderObject):
 ##
 # ToolsObject
 #
+
+
 class ToolsObject(MiscFileObject):
     pass
 
-## GuidVersionObject
+# GuidVersionObject
 #
 # This class defined GUID/Version items used in PKG file
 #
 # @param object:     Inherited from object class
 #
+
+
 class GuidVersionObject(object):
     def __init__(self):
         self.Guid = ''
@@ -374,12 +400,14 @@ class GuidVersionObject(object):
     def GetVersion(self):
         return self.Version
 
-## IdentificationObject
+# IdentificationObject
 #
 # This class defined Identification items used in Module/Platform/Package files
 #
 # @param object:    Inherited from object class
 #
+
+
 class IdentificationObject(GuidVersionObject):
     def __init__(self):
         self.Name = ''
@@ -440,13 +468,15 @@ class IdentificationObject(GuidVersionObject):
     def GetCombinePath(self):
         return self.CombinePath
 
-## GuidProtocolPpiCommonObject
+# GuidProtocolPpiCommonObject
 #
 # This class defined Guid, Protocol and Ppi like items used in
 # Module/Platform/Package files
 #
 # @param CommonPropertiesObject:    Inherited from CommonPropertiesObject class
 #
+
+
 class GuidProtocolPpiCommonObject(CommonPropertiesObject):
     def __init__(self):
         self.Name = ''
@@ -479,17 +509,20 @@ class GuidProtocolPpiCommonObject(CommonPropertiesObject):
     def GetSupModuleList(self):
         return self.SupModuleList
 
-## GuidObject
+# GuidObject
 #
 # This class defined Guid item used in Module/Platform/Package files
 #
 # @param GuidProtocolPpiCommonObject:  GuidProtocolPpiCommonObject
 #
+
+
 class GuidObject(GuidProtocolPpiCommonObject):
     def __init__(self):
         self.VariableName = ''
         self.GuidTypeList = []
         GuidProtocolPpiCommonObject.__init__(self)
+
     def SetVariableName(self, VariableName):
         self.VariableName = VariableName
 
@@ -502,54 +535,64 @@ class GuidObject(GuidProtocolPpiCommonObject):
     def GetGuidTypeList(self):
         return self.GuidTypeList
 
-## ProtocolObject
+# ProtocolObject
 #
 # This class defined Protocol item used in Module/Platform/Package files
 #
 # @param GuidProtocolPpiCommonObject:  Inherited from
 #                                      GuidProtocolPpiCommonObject
 #
+
+
 class ProtocolObject(GuidProtocolPpiCommonObject):
     def __init__(self):
         self.Notify = False
         GuidProtocolPpiCommonObject.__init__(self)
+
     def SetNotify(self, Notify):
         self.Notify = Notify
 
     def GetNotify(self):
         return self.Notify
 
-## PpiObject
+# PpiObject
 #
 # This class defined Ppi item used in Module/Platform/Package files
 #
 # @param GuidProtocolPpiCommonObject:  Inherited from
 #                                      GuidProtocolPpiCommonObject
 #
+
+
 class PpiObject(GuidProtocolPpiCommonObject):
     def __init__(self):
         self.Notify = False
         GuidProtocolPpiCommonObject.__init__(self)
+
     def SetNotify(self, Notify):
         self.Notify = Notify
 
     def GetNotify(self):
         return self.Notify
 
-## DefineObject
+# DefineObject
 #
 # This class defined item DEFINE used in Module/Platform/Package files
 #
 # @param object:  Inherited from object class
 #
+
+
 class DefineClass(object):
     def __init__(self):
         self.Define = {}
 
-## UserExtensionObject
+# UserExtensionObject
 #
 # @param object:  Inherited from object class
 #
+
+
 class UserExtensionObject(object):
     def __init__(self):
         self.UserID = ''
@@ -684,12 +727,14 @@ class UserExtensionObject(object):
     def GetBinariesDict(self):
         return self.BinariesDict
 
-## LibraryClassObject
+# LibraryClassObject
 #
 # This class defined Library item used in Module/Platform/Package files
 #
 # @param CommonPropertiesObject:  Inherited from CommonPropertiesObject class
 #
+
+
 class LibraryClassObject(CommonPropertiesObject):
     def __init__(self):
         self.LibraryClass = ''
@@ -723,7 +768,7 @@ class LibraryClassObject(CommonPropertiesObject):
         return self.RecommendedInstance
 
 
-## PcdErrorObject
+# PcdErrorObject
 #
 # @param object:  Inherited from object class
 #
@@ -801,7 +846,7 @@ class PcdErrorObject(object):
         return self.LineNum
 
 
-## IncludeObject
+# IncludeObject
 #
 # This class defined Include item used in Module/Platform/Package files
 #
@@ -839,7 +884,7 @@ class IncludeObject(CommonPropertiesObject):
     def GetComment(self):
         return self.Comment
 
-## PcdObject
+# PcdObject
 #
 # This class defined Pcd item used in Module/Platform/Package files
 #
@@ -855,6 +900,8 @@ class IncludeObject(CommonPropertiesObject):
 # @param SkuInfoList:          Input value for SkuInfoList, default is {}
 # @param SupModuleList:        Input value for SupModuleList, default is []
 #
+
+
 class PcdObject(CommonPropertiesObject, HelpTextListObject, PromptListObject):
     def __init__(self):
         self.PcdCName = ''
diff --git a/BaseTools/Source/Python/UPT/Object/POM/ModuleObject.py b/BaseTools/Source/Python/UPT/Object/POM/ModuleObject.py
index 6e515a2c3fe4..6aaae4c1aff2 100644
--- a/BaseTools/Source/Python/UPT/Object/POM/ModuleObject.py
+++ b/BaseTools/Source/Python/UPT/Object/POM/ModuleObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define a class object to describe a module
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -38,6 +38,8 @@ class BootModeObject(CommonPropertiesObject, HelpTextListObject):
 ##
 # EventObject
 #
+
+
 class EventObject(CommonPropertiesObject, HelpTextListObject):
     def __init__(self):
         self.EventType = ''
@@ -53,6 +55,8 @@ class EventObject(CommonPropertiesObject, HelpTextListObject):
 ##
 # HobObject
 #
+
+
 class HobObject(CommonPropertiesObject, HelpTextListObject):
     def __init__(self):
         self.HobType = ''
@@ -68,6 +72,8 @@ class HobObject(CommonPropertiesObject, HelpTextListObject):
 ##
 # SpecObject
 #
+
+
 class SpecObject(object):
     def __init__(self):
         self.Spec = ''
@@ -85,10 +91,12 @@ class SpecObject(object):
     def GetVersion(self):
         return self.Version
 
-## ModuleHeaderObject
+# ModuleHeaderObject
 #
 # This class defined header items used in Module file
 #
+
+
 class ModuleHeaderObject(IdentificationObject, CommonHeaderObject, BinaryHeaderObject):
     def __init__(self):
         self.IsLibrary = False
@@ -211,6 +219,8 @@ class ModuleHeaderObject(IdentificationObject, CommonHeaderObject, BinaryHeaderO
 ##
 # SourceFileObject
 #
+
+
 class SourceFileObject(CommonPropertiesObject):
     def __init__(self):
         CommonPropertiesObject.__init__(self)
@@ -224,7 +234,7 @@ class SourceFileObject(CommonPropertiesObject):
         self.SourceFile = SourceFile
 
     def GetSourceFile(self):
-        return  self.SourceFile
+        return self.SourceFile
 
     def SetTagName(self, TagName):
         self.TagName = TagName
@@ -284,22 +294,27 @@ class AsBuildLibraryClassObject(object):
 
     def SetLibGuid(self, LibGuid):
         self.LibGuid = LibGuid
+
     def GetLibGuid(self):
         return self.LibGuid
 
     def SetLibVersion(self, LibVersion):
         self.LibVersion = LibVersion
+
     def GetLibVersion(self):
         return self.LibVersion
 
     def SetSupArchList(self, SupArchList):
         self.SupArchList = SupArchList
+
     def GetSupArchList(self):
         return self.SupArchList
 
 ##
 # AsBuiltObject
 #
+
+
 class AsBuiltObject(object):
     def __init__(self):
         #
@@ -347,6 +362,8 @@ class AsBuiltObject(object):
 # BinaryBuildFlag, this object will include those fields that are not
 # covered by the UPT Spec BinaryFile field
 #
+
+
 class BinaryBuildFlagObject(object):
     def __init__(self):
         self.Target = ''
@@ -374,12 +391,15 @@ class BinaryBuildFlagObject(object):
 
     def SetAsBuiltOptionFlags(self, AsBuiltOptionFlags):
         self.AsBuiltOptionFlags = AsBuiltOptionFlags
+
     def GetAsBuiltOptionFlags(self):
         return self.AsBuiltOptionFlags
 
 ##
 # ExternObject
 #
+
+
 class ExternObject(CommonPropertiesObject):
     def __init__(self):
         self.EntryPoint = ''
@@ -415,12 +435,15 @@ class ExternObject(CommonPropertiesObject):
 
     def SetSupModList(self, SupModList):
         self.SupModList = SupModList
+
     def GetSupModList(self):
         return self.SupModList
 
 ##
 # DepexObject
 #
+
+
 class DepexObject(CommonPropertiesObject):
     def __init__(self):
         self.Depex = ''
@@ -442,6 +465,8 @@ class DepexObject(CommonPropertiesObject):
 ##
 # PackageDependencyObject
 #
+
+
 class PackageDependencyObject(GuidVersionObject, CommonPropertiesObject):
     def __init__(self):
         self.Package = ''
@@ -464,6 +489,8 @@ class PackageDependencyObject(GuidVersionObject, CommonPropertiesObject):
 ##
 # BuildOptionObject
 #
+
+
 class BuildOptionObject(CommonPropertiesObject):
     def __init__(self):
         CommonPropertiesObject.__init__(self)
@@ -478,6 +505,8 @@ class BuildOptionObject(CommonPropertiesObject):
 ##
 # ModuleObject
 #
+
+
 class ModuleObject(ModuleHeaderObject):
     def __init__(self):
         #
diff --git a/BaseTools/Source/Python/UPT/Object/POM/PackageObject.py b/BaseTools/Source/Python/UPT/Object/POM/PackageObject.py
index fcc2a3bea94a..8fc202959a1a 100644
--- a/BaseTools/Source/Python/UPT/Object/POM/PackageObject.py
+++ b/BaseTools/Source/Python/UPT/Object/POM/PackageObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define a class object to describe a package
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -18,8 +18,10 @@ from Object.POM.CommonObject import CommonHeaderObject
 from Object.POM.CommonObject import BinaryHeaderObject
 from Library.Misc import Sdict
 
-## StandardIncludeFileObject
+# StandardIncludeFileObject
 #
+
+
 class StandardIncludeFileObject(CommonPropertiesObject):
     def __init__(self):
         CommonPropertiesObject.__init__(self)
@@ -31,14 +33,18 @@ class StandardIncludeFileObject(CommonPropertiesObject):
     def GetIncludeFile(self):
         return self.IncludeFile
 
-## PackageIncludeFileObject
+# PackageIncludeFileObject
 #
+
+
 class PackageIncludeFileObject(StandardIncludeFileObject):
     pass
 
 ##
 # PackageObject
 #
+
+
 class PackageObject(IdentificationObject, CommonHeaderObject, BinaryHeaderObject):
     def __init__(self):
         IdentificationObject.__init__(self)
@@ -189,4 +195,3 @@ class PackageObject(IdentificationObject, CommonHeaderObject, BinaryHeaderObject
 
     def GetModuleFileList(self):
         return self.ModuleFileList
-
diff --git a/BaseTools/Source/Python/UPT/Object/POM/__init__.py b/BaseTools/Source/Python/UPT/Object/POM/__init__.py
index b36e7f6585c1..ceb3226afc0c 100644
--- a/BaseTools/Source/Python/UPT/Object/POM/__init__.py
+++ b/BaseTools/Source/Python/UPT/Object/POM/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'Object' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/DecObject.py b/BaseTools/Source/Python/UPT/Object/Parser/DecObject.py
index 2a7fcf1697e5..28d5153a6af8 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/DecObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/DecObject.py
@@ -1,6 +1,6 @@
-## @file
+# @file
 # This file is used to define class objects for DEC file. It will consumed by
-#DecParser
+# DecParser
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
 #
@@ -10,7 +10,7 @@
 DecObject
 '''
 
-## Import modules
+# Import modules
 #
 import os.path
 
@@ -25,51 +25,55 @@ from Library.DataType import TAB_USER_EXTENSIONS
 from Library.DataType import TAB_PCDS
 from Library.DataType import TAB_ARCH_COMMON
 
-## _DecComments
+# _DecComments
 #
 # Base class for all data objects which have head and tail comments
 #
+
+
 class _DecComments:
 
-    ##constructor
+    # constructor
     #
     def __init__(self):
         self._HeadComment = []
         self._TailComment = []
 
-    ## GetComments
+    # GetComments
     #
     def GetComments(self):
         return self._HeadComment, self._TailComment
 
-    ## GetHeadComment
+    # GetHeadComment
     #
     def GetHeadComment(self):
         return self._HeadComment
 
-    ## SetHeadComment
+    # SetHeadComment
     #
     # @param Comment: comment content
     #
     def SetHeadComment(self, Comment):
         self._HeadComment = Comment
 
-    ## GetTailComment
+    # GetTailComment
     #
     def GetTailComment(self):
         return self._TailComment
 
-    ## SetTailComment
+    # SetTailComment
     #
     # @param Comment: comment content
     #
     def SetTailComment(self, Comment):
         self._TailComment = Comment
 
-## _DecBaseObject
+# _DecBaseObject
 #
 # Base class that hold common info
 #
+
+
 class _DecBaseObject(_DecComments):
     def __init__(self, PkgFullName):
         _DecComments.__init__(self)
@@ -82,27 +86,27 @@ class _DecBaseObject(_DecComments):
         self._PackagePath, self._FileName = os.path.split(PkgFullName)
         self._SecName = ''
 
-    ## GetSectionName
+    # GetSectionName
     #
     def GetSectionName(self):
         return self._SecName
 
-    ## GetPackagePath
+    # GetPackagePath
     #
     def GetPackagePath(self):
         return self._PackagePath
 
-    ## GetPackageFile
+    # GetPackageFile
     #
     def GetPackageFile(self):
         return self._FileName
 
-    ## GetPackageFullName
+    # GetPackageFullName
     #
     def GetPackageFullName(self):
         return self._PkgFullName
 
-    ## AddItem
+    # AddItem
     # Add sub-item to current object, sub-class should override it if needed
     #
     # @param Item: Sub-item to be added
@@ -122,7 +126,7 @@ class _DecBaseObject(_DecComments):
             ArchModule.append(Ele[1])
         Item.ArchAndModuleType = ArchModule
 
-    ## _GetItemByArch
+    # _GetItemByArch
     # Helper class used by sub-class
     # @param Arch:  arch
     #
@@ -132,7 +136,7 @@ class _DecBaseObject(_DecComments):
             return []
         return self.ValueDict[Arch]
 
-    ## _GetAllItems
+    # _GetAllItems
     # Get all items, union all arches, items in returned list are unique
     #
     def _GetAllItems(self):
@@ -143,10 +147,12 @@ class _DecBaseObject(_DecComments):
                     Retlst.append(Item)
         return Retlst
 
-## _DecItemBaseObject
+# _DecItemBaseObject
 #
 # Module type and arch the item belongs to
 #
+
+
 class _DecItemBaseObject(_DecComments):
     def __init__(self):
         _DecComments.__init__(self)
@@ -155,7 +161,7 @@ class _DecItemBaseObject(_DecComments):
         #
         self.ArchAndModuleType = []
 
-    ## GetArchList
+    # GetArchList
     #
     def GetArchList(self):
         ArchSet = set()
@@ -163,10 +169,12 @@ class _DecItemBaseObject(_DecComments):
             ArchSet.add(Arch)
         return list(ArchSet)
 
-## DecDefineObject
+# DecDefineObject
 #
 # Class to hold define section information
 #
+
+
 class DecDefineObject(_DecBaseObject):
     def __init__(self, PkgFullName):
         _DecBaseObject.__init__(self, PkgFullName)
@@ -177,7 +185,7 @@ class DecDefineObject(_DecBaseObject):
         self._PkgVersion = ''
         self._PkgUniFile = ''
 
-    ## GetPackageSpecification
+    # GetPackageSpecification
     #
     def GetPackageSpecification(self):
         return self._DecSpec
@@ -185,7 +193,7 @@ class DecDefineObject(_DecBaseObject):
     def SetPackageSpecification(self, DecSpec):
         self._DecSpec = DecSpec
 
-    ## GetPackageName
+    # GetPackageName
     #
     def GetPackageName(self):
         return self._PkgName
@@ -193,7 +201,7 @@ class DecDefineObject(_DecBaseObject):
     def SetPackageName(self, PkgName):
         self._PkgName = PkgName
 
-    ## GetPackageGuid
+    # GetPackageGuid
     #
     def GetPackageGuid(self):
         return self._PkgGuid
@@ -201,7 +209,7 @@ class DecDefineObject(_DecBaseObject):
     def SetPackageGuid(self, PkgGuid):
         self._PkgGuid = PkgGuid
 
-    ## GetPackageVersion
+    # GetPackageVersion
     #
     def GetPackageVersion(self):
         return self._PkgVersion
@@ -209,7 +217,7 @@ class DecDefineObject(_DecBaseObject):
     def SetPackageVersion(self, PkgVersion):
         self._PkgVersion = PkgVersion
 
-    ## GetPackageUniFile
+    # GetPackageUniFile
     #
     def GetPackageUniFile(self):
         return self._PkgUniFile
@@ -217,109 +225,119 @@ class DecDefineObject(_DecBaseObject):
     def SetPackageUniFile(self, PkgUniFile):
         self._PkgUniFile = PkgUniFile
 
-    ## GetDefines
+    # GetDefines
     #
     def GetDefines(self):
         return self._GetItemByArch(TAB_ARCH_COMMON)
 
-    ## GetAllDefines
+    # GetAllDefines
     #
     def GetAllDefines(self):
         return self._GetAllItems()
 
-## DecDefineItemObject
+# DecDefineItemObject
 #
 # Each item of define section
 #
+
+
 class DecDefineItemObject(_DecItemBaseObject):
     def __init__(self):
         _DecItemBaseObject.__init__(self)
         self.Key = ''
         self.Value = ''
 
-    ## __hash__
+    # __hash__
     #
     def __hash__(self):
         return hash(self.Key + self.Value)
 
-    ## __eq__
+    # __eq__
     #
     def __eq__(self, Other):
         return id(self) == id(Other)
 
-    ## __str__
+    # __str__
     #
     def __str__(self):
         return str(self.ArchAndModuleType) + '\n' + self.Key + \
             ' = ' + self.Value
 
-## DecIncludeObject
+# DecIncludeObject
 #
 # Class to hold include section info
 #
+
+
 class DecIncludeObject(_DecBaseObject):
     def __init__(self, PkgFullName):
         _DecBaseObject.__init__(self, PkgFullName)
         self._SecName = TAB_INCLUDES.upper()
 
-    ## GetIncludes
+    # GetIncludes
     #
     def GetIncludes(self, Arch=TAB_ARCH_COMMON):
         return self._GetItemByArch(Arch)
 
-    ## GetAllIncludes
+    # GetAllIncludes
     #
     def GetAllIncludes(self):
         return self._GetAllItems()
 
-## DecIncludeItemObject
+# DecIncludeItemObject
 #
 # Item of include section
 #
+
+
 class DecIncludeItemObject(_DecItemBaseObject):
     def __init__(self, File, Root):
         self.File = File
         self.Root = Root
         _DecItemBaseObject.__init__(self)
 
-    ## __hash__
+    # __hash__
     #
     def __hash__(self):
         return hash(self.File)
 
-    ## __eq__
+    # __eq__
     #
     def __eq__(self, Other):
         return id(self) == id(Other)
 
-    ## __str__
+    # __str__
     #
     def __str__(self):
         return self.File
 
-## DecLibraryclassObject
+# DecLibraryclassObject
 #
 # Class to hold library class section info
 #
+
+
 class DecLibraryclassObject(_DecBaseObject):
     def __init__(self, PkgFullName):
         _DecBaseObject.__init__(self, PkgFullName)
         self._PackagePath, self._FileName = os.path.split(PkgFullName)
         self._SecName = TAB_LIBRARY_CLASSES.upper()
 
-    ## GetLibraryclasses
+    # GetLibraryclasses
     #
     def GetLibraryclasses(self, Arch=TAB_ARCH_COMMON):
         return self._GetItemByArch(Arch)
 
-    ## GetAllLibraryclasses
+    # GetAllLibraryclasses
     #
     def GetAllLibraryclasses(self):
         return self._GetAllItems()
 
-## DecLibraryclassItemObject
+# DecLibraryclassItemObject
 # Item of library class section
 #
+
+
 class DecLibraryclassItemObject(_DecItemBaseObject):
     def __init__(self, Libraryclass, File, Root):
         _DecItemBaseObject.__init__(self)
@@ -327,30 +345,32 @@ class DecLibraryclassItemObject(_DecItemBaseObject):
         self.Root = Root
         self.Libraryclass = Libraryclass
 
-    ## __hash__
+    # __hash__
     #
     def __hash__(self):
         return hash(self.Libraryclass + self.File)
 
-    ## __eq__
+    # __eq__
     #
     def __eq__(self, Other):
         return id(self) == id(Other)
 
-    ## __str__
+    # __str__
     #
     def __str__(self):
         return self.Libraryclass + '|' + self.File
 
-## DecPcdObject
+# DecPcdObject
 # Class to hold PCD section
 #
+
+
 class DecPcdObject(_DecBaseObject):
     def __init__(self, PkgFullName):
         _DecBaseObject.__init__(self, PkgFullName)
         self._SecName = TAB_PCDS.upper()
 
-    ## AddItem
+    # AddItem
     #
     # Diff from base class
     #
@@ -371,7 +391,7 @@ class DecPcdObject(_DecBaseObject):
             ArchModule.append([Type, Arch])
         Item.ArchAndModuleType = ArchModule
 
-    ## GetPcds
+    # GetPcds
     #
     # @param PcdType: PcdType
     # @param Arch: Arch
@@ -383,7 +403,7 @@ class DecPcdObject(_DecBaseObject):
             return []
         return self.ValueDict[PcdType, Arch]
 
-    ## GetPcdsByType
+    # GetPcdsByType
     #
     # @param PcdType: PcdType
     #
@@ -398,12 +418,14 @@ class DecPcdObject(_DecBaseObject):
                     Retlst.append(Item)
         return Retlst
 
-## DecPcdItemObject
+# DecPcdItemObject
 #
 # Item of PCD section
 #
 # @param _DecItemBaseObject: _DecItemBaseObject object
 #
+
+
 class DecPcdItemObject(_DecItemBaseObject):
     def __init__(self, Guid, Name, Value, DatumType,
                  Token, MaxDatumSize=''):
@@ -415,17 +437,17 @@ class DecPcdItemObject(_DecItemBaseObject):
         self.TokenValue = Token
         self.MaxDatumSize = MaxDatumSize
 
-    ## __hash__
+    # __hash__
     #
     def __hash__(self):
         return hash(self.TokenSpaceGuidCName + self.TokenCName)
 
-    ## __eq__
+    # __eq__
     #
     def __eq__(self, Other):
         return id(self) == id(Other)
 
-    ## GetArchListOfType
+    # GetArchListOfType
     #
     # @param PcdType: PcdType
     #
@@ -438,35 +460,39 @@ class DecPcdItemObject(_DecItemBaseObject):
             ItemSet.add(Arch)
         return list(ItemSet)
 
-## DecGuidObjectBase
+# DecGuidObjectBase
 #
 # Base class for PPI, Protocol, and GUID.
 # Hold same data but has different method for clarification in sub-class
 #
 # @param _DecBaseObject: Dec Base Object
 #
+
+
 class DecGuidObjectBase(_DecBaseObject):
     def __init__(self, PkgFullName):
         _DecBaseObject.__init__(self, PkgFullName)
 
-    ## GetGuidStyleItems
+    # GetGuidStyleItems
     #
     # @param Arch: Arch
     #
     def GetGuidStyleItems(self, Arch=TAB_ARCH_COMMON):
         return self._GetItemByArch(Arch)
 
-    ## GetGuidStyleAllItems
+    # GetGuidStyleAllItems
     #
     def GetGuidStyleAllItems(self):
         return self._GetAllItems()
 
-## DecGuidItemObject
+# DecGuidItemObject
 #
 # Item of GUID, PPI and Protocol section
 #
 # @param _DecItemBaseObject: Dec Item Base Object
 #
+
+
 class DecGuidItemObject(_DecItemBaseObject):
     def __init__(self, CName, GuidCValue, GuidString):
         _DecItemBaseObject.__init__(self)
@@ -474,103 +500,111 @@ class DecGuidItemObject(_DecItemBaseObject):
         self.GuidCValue = GuidCValue
         self.GuidString = GuidString
 
-    ## __hash__
+    # __hash__
     #
     def __hash__(self):
         return hash(self.GuidCName)
 
-    ## __eq__
+    # __eq__
     #
     def __eq__(self, Other):
         return id(self) == id(Other)
 
-    ## __str__
+    # __str__
     #
     def __str__(self):
         return self.GuidCName + ' = ' + self.GuidCValue
 
-## DecGuidObject
+# DecGuidObject
 #
 # Class for GUID section
 #
 # @param DecGuidObjectBase: Dec Guid Object Base
 #
+
+
 class DecGuidObject(DecGuidObjectBase):
     def __init__(self, PkgFullName):
         DecGuidObjectBase.__init__(self, PkgFullName)
         self._SecName = TAB_GUIDS.upper()
 
-    ## GetGuids
+    # GetGuids
     #
     # @param Arch: Arch
     #
     def GetGuids(self, Arch=TAB_ARCH_COMMON):
         return self._GetItemByArch(Arch)
 
-    ## GetAllGuids
+    # GetAllGuids
     #
     def GetAllGuids(self):
         return self._GetAllItems()
 
-## DecPpiObject
+# DecPpiObject
 #
 # Class for PPI section
 #
 # @param DecGuidObjectBase: Dec Guid Object Base
 #
+
+
 class DecPpiObject(DecGuidObjectBase):
     def __init__(self, PkgFullName):
         DecGuidObjectBase.__init__(self, PkgFullName)
         self._SecName = TAB_PPIS.upper()
 
-    ## GetPpis
+    # GetPpis
     #
     # @param Arch: Arch
     #
     def GetPpis(self, Arch=TAB_ARCH_COMMON):
         return self._GetItemByArch(Arch)
 
-    ## GetAllPpis
+    # GetAllPpis
     #
     def GetAllPpis(self):
         return self._GetAllItems()
 
-## DecProtocolObject
+# DecProtocolObject
 #
 # Class for protocol section
 #
 # @param DecGuidObjectBase: Dec Guid Object Base
 #
+
+
 class DecProtocolObject(DecGuidObjectBase):
     def __init__(self, PkgFullName):
         DecGuidObjectBase.__init__(self, PkgFullName)
         self._SecName = TAB_PROTOCOLS.upper()
 
-    ## GetProtocols
+    # GetProtocols
     #
     # @param Arch: Arch
     #
     def GetProtocols(self, Arch=TAB_ARCH_COMMON):
         return self._GetItemByArch(Arch)
 
-    ## GetAllProtocols
+    # GetAllProtocols
     #
     def GetAllProtocols(self):
         return self._GetAllItems()
 
-## DecUserExtensionObject
+# DecUserExtensionObject
 #
 # Class for user extension section
 #
 # @param _DecBaseObject: Dec Guid Object Base
 #
+
+
 class DecUserExtensionObject(_DecBaseObject):
     def __init__(self, PkgFullName):
         _DecBaseObject.__init__(self, PkgFullName)
         self._SecName = TAB_USER_EXTENSIONS.upper()
         self.ItemList = []
 
-    ## GetProtocols
+    # GetProtocols
     #
     # @param Item: Item
     # @param Scope: Scope
@@ -582,13 +616,13 @@ class DecUserExtensionObject(_DecBaseObject):
             return
         self.ItemList.append(Item)
 
-    ## GetAllUserExtensions
+    # GetAllUserExtensions
     #
     def GetAllUserExtensions(self):
         return self.ItemList
 
 
-## DecUserExtensionItemObject
+# DecUserExtensionItemObject
 # Item for user extension section
 #
 # @param _DecItemBaseObject: Dec Item Base Object
@@ -599,7 +633,3 @@ class DecUserExtensionItemObject(_DecItemBaseObject):
         self.UserString = ''
         self.UserId = ''
         self.IdString = ''
-
-
-
-
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py
index 6b55d01ab2a2..dd6210c697a3 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define class objects of INF file [Binaries] section.
 # It will consumed by InfParser.
 #
@@ -39,33 +39,41 @@ class InfBianryItem():
 
     def SetFileName(self, FileName):
         self.FileName = FileName
+
     def GetFileName(self):
         return self.FileName
 
     def SetTarget(self, Target):
         self.Target = Target
+
     def GetTarget(self):
         return self.Target
 
     def SetFeatureFlagExp(self, FeatureFlagExp):
         self.FeatureFlagExp = FeatureFlagExp
+
     def GetFeatureFlagExp(self):
         return self.FeatureFlagExp
 
     def SetHelpString(self, HelpString):
         self.HelpString = HelpString
+
     def GetHelpString(self):
         return self.HelpString
 
     def SetType(self, Type):
         self.Type = Type
+
     def GetType(self):
         return self.Type
+
     def SetSupArchList(self, SupArchList):
         self.SupArchList = SupArchList
+
     def GetSupArchList(self):
         return self.SupArchList
 
+
 class InfBianryVerItem(InfBianryItem, CurrentLine):
     def __init__(self):
         InfBianryItem.__init__(self)
@@ -74,9 +82,11 @@ class InfBianryVerItem(InfBianryItem, CurrentLine):
 
     def SetVerTypeName(self, VerTypeName):
         self.VerTypeName = VerTypeName
+
     def GetVerTypeName(self):
         return self.VerTypeName
 
+
 class InfBianryUiItem(InfBianryItem, CurrentLine):
     def __init__(self):
         InfBianryItem.__init__(self)
@@ -85,9 +95,11 @@ class InfBianryUiItem(InfBianryItem, CurrentLine):
 
     def SetUiTypeName(self, UiTypeName):
         self.UiTypeName = UiTypeName
+
     def GetVerTypeName(self):
         return self.UiTypeName
 
+
 class InfBianryCommonItem(InfBianryItem, CurrentLine):
     def __init__(self):
         self.CommonType = ''
@@ -99,21 +111,25 @@ class InfBianryCommonItem(InfBianryItem, CurrentLine):
 
     def SetCommonType(self, CommonType):
         self.CommonType = CommonType
+
     def GetCommonType(self):
         return self.CommonType
 
     def SetTagName(self, TagName):
         self.TagName = TagName
+
     def GetTagName(self):
         return self.TagName
 
     def SetFamily(self, Family):
         self.Family = Family
+
     def GetFamily(self):
         return self.Family
 
     def SetGuidValue(self, GuidValue):
         self.GuidValue = GuidValue
+
     def GetGuidValue(self):
         return self.GuidValue
 
@@ -121,6 +137,8 @@ class InfBianryCommonItem(InfBianryItem, CurrentLine):
 #
 #
 #
+
+
 class InfBinariesObject(InfSectionCommonDef):
     def __init__(self):
         self.Binaries = Sdict()
@@ -130,7 +148,7 @@ class InfBinariesObject(InfSectionCommonDef):
         self.Macros = {}
         InfSectionCommonDef.__init__(self)
 
-    ## CheckVer
+    # CheckVer
     #
     #
     def CheckVer(self, Ver, __SupArchList):
@@ -150,7 +168,8 @@ class InfBinariesObject(InfSectionCommonDef):
             if len(VerContent) < 2:
                 Logger.Error("InfParser",
                              ToolError.FORMAT_INVALID,
-                             ST.ERR_INF_PARSER_BINARY_ITEM_FORMAT_INVALID % (VerContent[0], 2),
+                             ST.ERR_INF_PARSER_BINARY_ITEM_FORMAT_INVALID % (
+                                 VerContent[0], 2),
                              File=VerCurrentLine.GetFileName(),
                              Line=VerCurrentLine.GetLineNo(),
                              ExtraData=VerCurrentLine.GetLineString())
@@ -158,7 +177,8 @@ class InfBinariesObject(InfSectionCommonDef):
             if len(VerContent) > 4:
                 Logger.Error("InfParser",
                              ToolError.FORMAT_INVALID,
-                             ST.ERR_INF_PARSER_BINARY_ITEM_FORMAT_INVALID_MAX % (VerContent[0], 4),
+                             ST.ERR_INF_PARSER_BINARY_ITEM_FORMAT_INVALID_MAX % (
+                                 VerContent[0], 4),
                              File=VerCurrentLine.GetFileName(),
                              Line=VerCurrentLine.GetLineNo(),
                              ExtraData=VerCurrentLine.GetLineString())
@@ -187,7 +207,8 @@ class InfBinariesObject(InfSectionCommonDef):
                 if not (ValidFile(FullFileName) or ValidFile(VerContent[1])):
                     Logger.Error("InfParser",
                                  ToolError.FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_BINARY_ITEM_FILE_NOT_EXIST % (VerContent[1]),
+                                 ST.ERR_INF_PARSER_BINARY_ITEM_FILE_NOT_EXIST % (
+                                     VerContent[1]),
                                  File=VerCurrentLine.GetFileName(),
                                  Line=VerCurrentLine.GetLineNo(),
                                  ExtraData=VerCurrentLine.GetLineString())
@@ -199,14 +220,15 @@ class InfBinariesObject(InfSectionCommonDef):
                 else:
                     Logger.Error("InfParser",
                                  ToolError.FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID % (VerContent[1]),
+                                 ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID % (
+                                     VerContent[1]),
                                  File=VerCurrentLine.GetFileName(),
                                  Line=VerCurrentLine.GetLineNo(),
                                  ExtraData=VerCurrentLine.GetLineString())
                     return False
                 if IsValidFileFlag:
                     VerContent[0] = ConvPathFromAbsToRel(VerContent[0],
-                                            GlobalData.gINF_MODULE_DIR)
+                                                         GlobalData.gINF_MODULE_DIR)
                     InfBianryVerItemObj.SetFileName(VerContent[1])
             if len(VerContent) >= 3:
                 #
@@ -224,12 +246,13 @@ class InfBinariesObject(InfSectionCommonDef):
                 #
                 # Validate Feature Flag Express
                 #
-                FeatureFlagRtv = IsValidFeatureFlagExp(VerContent[3].\
+                FeatureFlagRtv = IsValidFeatureFlagExp(VerContent[3].
                                                        strip())
                 if not FeatureFlagRtv[0]:
                     Logger.Error("InfParser",
                                  ToolError.FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID % (FeatureFlagRtv[1]),
+                                 ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID % (
+                                     FeatureFlagRtv[1]),
                                  File=VerCurrentLine.GetFileName(),
                                  Line=VerCurrentLine.GetLineNo(),
                                  ExtraData=VerCurrentLine.GetLineString())
@@ -275,7 +298,7 @@ class InfBinariesObject(InfSectionCommonDef):
                     BinariesList.append((InfBianryVerItemObj, VerComment))
                     self.Binaries[InfBianryVerItemObj] = BinariesList
 
-    ## ParseCommonBinary
+    # ParseCommonBinary
     #
     # ParseCommonBinary
     #
@@ -295,7 +318,8 @@ class InfBinariesObject(InfSectionCommonDef):
                 if len(ItemContent) < 3:
                     Logger.Error("InfParser",
                                  ToolError.FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_BINARY_ITEM_FORMAT_INVALID % (ItemContent[0], 3),
+                                 ST.ERR_INF_PARSER_BINARY_ITEM_FORMAT_INVALID % (
+                                     ItemContent[0], 3),
                                  File=CurrentLineOfItem.GetFileName(),
                                  Line=CurrentLineOfItem.GetLineNo(),
                                  ExtraData=CurrentLineOfItem.GetLineString())
@@ -304,7 +328,8 @@ class InfBinariesObject(InfSectionCommonDef):
                 if len(ItemContent) < 2:
                     Logger.Error("InfParser",
                                  ToolError.FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_BINARY_ITEM_FORMAT_INVALID % (ItemContent[0], 2),
+                                 ST.ERR_INF_PARSER_BINARY_ITEM_FORMAT_INVALID % (
+                                     ItemContent[0], 2),
                                  File=CurrentLineOfItem.GetFileName(),
                                  Line=CurrentLineOfItem.GetLineNo(),
                                  ExtraData=CurrentLineOfItem.GetLineString())
@@ -313,7 +338,8 @@ class InfBinariesObject(InfSectionCommonDef):
             if len(ItemContent) > 7:
                 Logger.Error("InfParser",
                              ToolError.FORMAT_INVALID,
-                             ST.ERR_INF_PARSER_BINARY_ITEM_FORMAT_INVALID_MAX % (ItemContent[0], 7),
+                             ST.ERR_INF_PARSER_BINARY_ITEM_FORMAT_INVALID_MAX % (
+                                 ItemContent[0], 7),
                              File=CurrentLineOfItem.GetFileName(),
                              Line=CurrentLineOfItem.GetLineNo(),
                              ExtraData=CurrentLineOfItem.GetLineString())
@@ -333,7 +359,7 @@ class InfBinariesObject(InfSectionCommonDef):
                 if BinaryFileType not in DT.BINARY_FILE_TYPE_LIST:
                     Logger.Error("InfParser",
                                  ToolError.FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_BINARY_ITEM_INVALID_FILETYPE % \
+                                 ST.ERR_INF_PARSER_BINARY_ITEM_INVALID_FILETYPE %
                                  (DT.BINARY_FILE_TYPE_LIST.__str__()),
                                  File=CurrentLineOfItem.GetFileName(),
                                  Line=CurrentLineOfItem.GetLineNo(),
@@ -345,7 +371,7 @@ class InfBinariesObject(InfSectionCommonDef):
                 if BinaryFileType == 'LIB' or BinaryFileType == 'UEFI_APP':
                     Logger.Error("InfParser",
                                  ToolError.FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_BINARY_ITEM_INVALID_FILETYPE % \
+                                 ST.ERR_INF_PARSER_BINARY_ITEM_INVALID_FILETYPE %
                                  (DT.BINARY_FILE_TYPE_LIST.__str__()),
                                  File=CurrentLineOfItem.GetFileName(),
                                  Line=CurrentLineOfItem.GetLineNo(),
@@ -360,11 +386,11 @@ class InfBinariesObject(InfSectionCommonDef):
                         FileName = ItemContent[2]
                     else:
                         Logger.Error("InfParser",
-                                 ToolError.FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_BINARY_ITEM_FILENAME_NOT_EXIST,
-                                 File=CurrentLineOfItem.GetFileName(),
-                                 Line=CurrentLineOfItem.GetLineNo(),
-                                 ExtraData=CurrentLineOfItem.GetLineString())
+                                     ToolError.FORMAT_INVALID,
+                                     ST.ERR_INF_PARSER_BINARY_ITEM_FILENAME_NOT_EXIST,
+                                     File=CurrentLineOfItem.GetFileName(),
+                                     Line=CurrentLineOfItem.GetLineNo(),
+                                     ExtraData=CurrentLineOfItem.GetLineString())
                 else:
                     FileName = ItemContent[1]
                 #
@@ -375,7 +401,8 @@ class InfBinariesObject(InfSectionCommonDef):
                 if not (ValidFile(FullFileName) or ValidFile(FileName)):
                     Logger.Error("InfParser",
                                  ToolError.FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_BINARY_ITEM_FILE_NOT_EXIST % (FileName),
+                                 ST.ERR_INF_PARSER_BINARY_ITEM_FILE_NOT_EXIST % (
+                                     FileName),
                                  File=CurrentLineOfItem.GetFileName(),
                                  Line=CurrentLineOfItem.GetLineNo(),
                                  ExtraData=CurrentLineOfItem.GetLineString())
@@ -386,14 +413,16 @@ class InfBinariesObject(InfSectionCommonDef):
                     IsValidFileFlag = True
                 else:
                     Logger.Error("InfParser",
-                                ToolError.FORMAT_INVALID,
-                                ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID % (FileName),
-                                File=CurrentLineOfItem.GetFileName(),
-                                Line=CurrentLineOfItem.GetLineNo(),
-                                ExtraData=CurrentLineOfItem.GetLineString())
+                                 ToolError.FORMAT_INVALID,
+                                 ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID % (
+                                     FileName),
+                                 File=CurrentLineOfItem.GetFileName(),
+                                 Line=CurrentLineOfItem.GetLineNo(),
+                                 ExtraData=CurrentLineOfItem.GetLineString())
                     return False
                 if IsValidFileFlag:
-                    ItemContent[0] = ConvPathFromAbsToRel(ItemContent[0], GlobalData.gINF_MODULE_DIR)
+                    ItemContent[0] = ConvPathFromAbsToRel(
+                        ItemContent[0], GlobalData.gINF_MODULE_DIR)
                     InfBianryCommonItemObj.SetFileName(FileName)
             if len(ItemContent) >= 3:
                 #
@@ -423,7 +452,8 @@ class InfBinariesObject(InfSectionCommonDef):
                     if ItemContent[4].strip() != '':
                         Logger.Error("InfParser",
                                      ToolError.FORMAT_INVALID,
-                                     ST.ERR_INF_PARSER_TAGNAME_NOT_PERMITTED % (ItemContent[4]),
+                                     ST.ERR_INF_PARSER_TAGNAME_NOT_PERMITTED % (
+                                         ItemContent[4]),
                                      File=CurrentLineOfItem.GetFileName(),
                                      Line=CurrentLineOfItem.GetLineNo(),
                                      ExtraData=CurrentLineOfItem.GetLineString())
@@ -445,11 +475,13 @@ class InfBinariesObject(InfSectionCommonDef):
                     #
                     # Validate Feature Flag Express
                     #
-                    FeatureFlagRtv = IsValidFeatureFlagExp(ItemContent[5].strip())
+                    FeatureFlagRtv = IsValidFeatureFlagExp(
+                        ItemContent[5].strip())
                     if not FeatureFlagRtv[0]:
                         Logger.Error("InfParser",
                                      ToolError.FORMAT_INVALID,
-                                     ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID % (FeatureFlagRtv[1]),
+                                     ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID % (
+                                         FeatureFlagRtv[1]),
                                      File=CurrentLineOfItem.GetFileName(),
                                      Line=CurrentLineOfItem.GetLineNo(),
                                      ExtraData=CurrentLineOfItem.GetLineString())
@@ -458,7 +490,8 @@ class InfBinariesObject(InfSectionCommonDef):
                     if ItemContent[5].strip() != '':
                         Logger.Error("InfParser",
                                      ToolError.FORMAT_INVALID,
-                                     ST.ERR_INF_PARSER_TAGNAME_NOT_PERMITTED % (ItemContent[5]),
+                                     ST.ERR_INF_PARSER_TAGNAME_NOT_PERMITTED % (
+                                         ItemContent[5]),
                                      File=CurrentLineOfItem.GetFileName(),
                                      Line=CurrentLineOfItem.GetLineNo(),
                                      ExtraData=CurrentLineOfItem.GetLineString())
@@ -466,11 +499,11 @@ class InfBinariesObject(InfSectionCommonDef):
             if len(ItemContent) == 7:
                 if ItemContent[6].strip() == '':
                     Logger.Error("InfParser",
-                                     ToolError.FORMAT_INVALID,
-                                     ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_MISSING,
-                                     File=CurrentLineOfItem.GetFileName(),
-                                     Line=CurrentLineOfItem.GetLineNo(),
-                                     ExtraData=CurrentLineOfItem.GetLineString())
+                                 ToolError.FORMAT_INVALID,
+                                 ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_MISSING,
+                                 File=CurrentLineOfItem.GetFileName(),
+                                 Line=CurrentLineOfItem.GetLineNo(),
+                                 ExtraData=CurrentLineOfItem.GetLineString())
                 #
                 # Validate Feature Flag Express
                 #
@@ -478,7 +511,8 @@ class InfBinariesObject(InfSectionCommonDef):
                 if not FeatureFlagRtv[0]:
                     Logger.Error("InfParser",
                                  ToolError.FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID % (FeatureFlagRtv[1]),
+                                 ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID % (
+                                     FeatureFlagRtv[1]),
                                  File=CurrentLineOfItem.GetFileName(),
                                  Line=CurrentLineOfItem.GetLineNo(),
                                  ExtraData=CurrentLineOfItem.GetLineString())
@@ -554,7 +588,8 @@ class InfBinariesObject(InfSectionCommonDef):
                     if len(UiContent) < 2:
                         Logger.Error("InfParser",
                                      ToolError.FORMAT_INVALID,
-                                     ST.ERR_INF_PARSER_BINARY_ITEM_FORMAT_INVALID % (UiContent[0], 2),
+                                     ST.ERR_INF_PARSER_BINARY_ITEM_FORMAT_INVALID % (
+                                         UiContent[0], 2),
                                      File=UiCurrentLine.GetFileName(),
                                      Line=UiCurrentLine.GetLineNo(),
                                      ExtraData=UiCurrentLine.GetLineString())
@@ -563,7 +598,8 @@ class InfBinariesObject(InfSectionCommonDef):
                     if len(UiContent) > 4:
                         Logger.Error("InfParser",
                                      ToolError.FORMAT_INVALID,
-                                     ST.ERR_INF_PARSER_BINARY_ITEM_FORMAT_INVALID_MAX % (UiContent[0], 4),
+                                     ST.ERR_INF_PARSER_BINARY_ITEM_FORMAT_INVALID_MAX % (
+                                         UiContent[0], 4),
                                      File=UiCurrentLine.GetFileName(),
                                      Line=UiCurrentLine.GetLineNo(),
                                      ExtraData=UiCurrentLine.GetLineString())
@@ -576,7 +612,8 @@ class InfBinariesObject(InfSectionCommonDef):
                         if UiContent[0] != 'UI':
                             Logger.Error("InfParser",
                                          ToolError.FORMAT_INVALID,
-                                         ST.ERR_INF_PARSER_BINARY_VER_TYPE % ('UI'),
+                                         ST.ERR_INF_PARSER_BINARY_VER_TYPE % (
+                                             'UI'),
                                          File=UiCurrentLine.GetFileName(),
                                          Line=UiCurrentLine.GetLineNo(),
                                          ExtraData=UiCurrentLine.GetLineString())
@@ -590,7 +627,8 @@ class InfBinariesObject(InfSectionCommonDef):
                         if not (ValidFile(FullFileName) or ValidFile(UiContent[1])):
                             Logger.Error("InfParser",
                                          ToolError.FORMAT_INVALID,
-                                         ST.ERR_INF_PARSER_BINARY_ITEM_FILE_NOT_EXIST % (UiContent[1]),
+                                         ST.ERR_INF_PARSER_BINARY_ITEM_FILE_NOT_EXIST % (
+                                             UiContent[1]),
                                          File=UiCurrentLine.GetFileName(),
                                          Line=UiCurrentLine.GetLineNo(),
                                          ExtraData=UiCurrentLine.GetLineString())
@@ -602,13 +640,15 @@ class InfBinariesObject(InfSectionCommonDef):
                         else:
                             Logger.Error("InfParser",
                                          ToolError.FORMAT_INVALID,
-                                         ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID % (UiContent[1]),
+                                         ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID % (
+                                             UiContent[1]),
                                          File=UiCurrentLine.GetFileName(),
                                          Line=UiCurrentLine.GetLineNo(),
                                          ExtraData=UiCurrentLine.GetLineString())
                             return False
                         if IsValidFileFlag:
-                            UiContent[0] = ConvPathFromAbsToRel(UiContent[0], GlobalData.gINF_MODULE_DIR)
+                            UiContent[0] = ConvPathFromAbsToRel(
+                                UiContent[0], GlobalData.gINF_MODULE_DIR)
                             InfBianryUiItemObj.SetFileName(UiContent[1])
                     if len(UiContent) >= 3:
                         #
@@ -626,11 +666,13 @@ class InfBinariesObject(InfSectionCommonDef):
                         #
                         # Validate Feature Flag Express
                         #
-                        FeatureFlagRtv = IsValidFeatureFlagExp(UiContent[3].strip())
+                        FeatureFlagRtv = IsValidFeatureFlagExp(
+                            UiContent[3].strip())
                         if not FeatureFlagRtv[0]:
                             Logger.Error("InfParser",
                                          ToolError.FORMAT_INVALID,
-                                         ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID % (FeatureFlagRtv[1]),
+                                         ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID % (
+                                             FeatureFlagRtv[1]),
                                          File=UiCurrentLine.GetFileName(),
                                          Line=UiCurrentLine.GetLineNo(),
                                          ExtraData=UiCurrentLine.GetLineString())
@@ -669,11 +711,13 @@ class InfBinariesObject(InfSectionCommonDef):
                     if InfBianryUiItemObj is not None:
                         if (InfBianryUiItemObj) in self.Binaries:
                             BinariesList = self.Binaries[InfBianryUiItemObj]
-                            BinariesList.append((InfBianryUiItemObj, UiComment))
+                            BinariesList.append(
+                                (InfBianryUiItemObj, UiComment))
                             self.Binaries[InfBianryUiItemObj] = BinariesList
                         else:
                             BinariesList = []
-                            BinariesList.append((InfBianryUiItemObj, UiComment))
+                            BinariesList.append(
+                                (InfBianryUiItemObj, UiComment))
                             self.Binaries[InfBianryUiItemObj] = BinariesList
         if Ver is not None and len(Ver) > 0:
             self.CheckVer(Ver, __SupArchList)
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfBuildOptionObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfBuildOptionObject.py
index fdba5db98311..a00a93436f78 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfBuildOptionObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfBuildOptionObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define class objects of INF file [BuildOptions] section.
 # It will consumed by InfParser.
 #
@@ -14,19 +14,22 @@ from Library import GlobalData
 
 from Object.Parser.InfCommonObject import InfSectionCommonDef
 
+
 class InfBuildOptionItem():
     def __init__(self):
-        self.Content     = ''
+        self.Content = ''
         self.SupArchList = []
         self.AsBuildList = []
 
     def SetContent(self, Content):
         self.Content = Content
+
     def GetContent(self):
         return self.Content
 
     def SetSupArchList(self, SupArchList):
         self.SupArchList = SupArchList
+
     def GetSupArchList(self):
         return self.SupArchList
 
@@ -35,11 +38,12 @@ class InfBuildOptionItem():
     #
     def SetAsBuildList(self, AsBuildList):
         self.AsBuildList = AsBuildList
+
     def GetAsBuildList(self):
         return self.AsBuildList
 
 
-## INF BuildOption section
+# INF BuildOption section
 #  Macro define is not permitted for this section.
 #
 #
@@ -47,7 +51,7 @@ class InfBuildOptionsObject(InfSectionCommonDef):
     def __init__(self):
         self.BuildOptions = []
         InfSectionCommonDef.__init__(self)
-    ## SetBuildOptions function
+    # SetBuildOptions function
     #
     # For BuildOptionName, need to validate its format
     # For BuildOptionValue, just ignore it.
@@ -61,7 +65,8 @@ class InfBuildOptionsObject(InfSectionCommonDef):
     # @return True          Build options set/validate successfully
     # @return False         Build options set/validate failed
     #
-    def SetBuildOptions(self, BuildOptCont, ArchList = None, SectionContent = ''):
+
+    def SetBuildOptions(self, BuildOptCont, ArchList=None, SectionContent=''):
 
         if not GlobalData.gIS_BINARY_INF:
 
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfCommonObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfCommonObject.py
index aa23d0878890..05871d9320a9 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfCommonObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfCommonObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define common class objects for INF file.
 # It will consumed by InfParser
 #
@@ -10,7 +10,7 @@
 InfCommonObject
 '''
 
-## InfLineCommentObject
+# InfLineCommentObject
 #
 #  Comment Object for any line in the INF file
 #
@@ -19,6 +19,8 @@ InfCommonObject
 #  #
 #  Line # TailComment
 #
+
+
 class InfLineCommentObject():
     def __init__(self):
         self.HeaderComments = ''
@@ -36,46 +38,48 @@ class InfLineCommentObject():
     def GetTailComments(self):
         return self.TailComments
 
-## CurrentLine
+# CurrentLine
 #
+
+
 class CurrentLine():
     def __init__(self):
         self.LineNo = ''
         self.LineString = ''
         self.FileName = ''
 
-    ## SetLineNo
+    # SetLineNo
     #
     # @param LineNo: LineNo
     #
     def SetLineNo(self, LineNo):
         self.LineNo = LineNo
 
-    ## GetLineNo
+    # GetLineNo
     #
     def GetLineNo(self):
         return self.LineNo
 
-    ## SetLineString
+    # SetLineString
     #
     # @param LineString: Line String content
     #
     def SetLineString(self, LineString):
         self.LineString = LineString
 
-    ## GetLineString
+    # GetLineString
     #
     def GetLineString(self):
         return self.LineString
 
-    ## SetFileName
+    # SetFileName
     #
     # @param FileName: File Name
     #
     def SetFileName(self, FileName):
         self.FileName = FileName
 
-    ## GetFileName
+    # GetFileName
     #
     def GetFileName(self):
         return self.FileName
@@ -83,6 +87,8 @@ class CurrentLine():
 ##
 # Inf Section common data
 #
+
+
 class InfSectionCommonDef():
     def __init__(self):
         #
@@ -93,43 +99,43 @@ class InfSectionCommonDef():
         # data
         #
         self.HeaderComments = ''
-        self.TailComments   = ''
+        self.TailComments = ''
         #
         # The support arch list of this section
         #
-        self.SupArchList  = []
+        self.SupArchList = []
 
         #
         # Store all section content
         # Key is supported Arch
         #
-        self.AllContent   = {}
+        self.AllContent = {}
 
-    ## SetHeaderComments
+    # SetHeaderComments
     #
     # @param HeaderComments: HeaderComments
     #
     def SetHeaderComments(self, HeaderComments):
         self.HeaderComments = HeaderComments
 
-    ## GetHeaderComments
+    # GetHeaderComments
     #
     def GetHeaderComments(self):
         return self.HeaderComments
 
-    ## SetTailComments
+    # SetTailComments
     #
     # @param TailComments: TailComments
     #
     def SetTailComments(self, TailComments):
         self.TailComments = TailComments
 
-    ## GetTailComments
+    # GetTailComments
     #
     def GetTailComments(self):
         return self.TailComments
 
-    ## SetSupArchList
+    # SetSupArchList
     #
     # @param Arch: Arch
     #
@@ -137,12 +143,12 @@ class InfSectionCommonDef():
         if Arch not in self.SupArchList:
             self.SupArchList.append(Arch)
 
-    ## GetSupArchList
+    # GetSupArchList
     #
     def GetSupArchList(self):
         return self.SupArchList
 
-    ## SetAllContent
+    # SetAllContent
     #
     # @param ArchList: ArchList
     # @param Content: Content
@@ -150,7 +156,7 @@ class InfSectionCommonDef():
     def SetAllContent(self, Content):
         self.AllContent = Content
 
-    ## GetAllContent
+    # GetAllContent
     #
     def GetAllContent(self):
         return self.AllContent
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfDefineCommonObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfDefineCommonObject.py
index 738a4c2dbf41..16e804e40f98 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfDefineCommonObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfDefineCommonObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define common class objects of [Defines] section for INF file.
 # It will consumed by InfParser
 #
@@ -12,37 +12,48 @@ InfDefineCommonObject
 
 from Object.Parser.InfCommonObject import InfLineCommentObject
 
-## InfDefineImageExeParamItem
+# InfDefineImageExeParamItem
 #
+
+
 class InfDefineImageExeParamItem():
     def __init__(self):
-        self.CName  = ''
+        self.CName = ''
         self.FeatureFlagExp = ''
         self.Comments = InfLineCommentObject()
 
     def SetCName(self, CName):
         self.CName = CName
+
     def GetCName(self):
         return self.CName
+
     def SetFeatureFlagExp(self, FeatureFlagExp):
         self.FeatureFlagExp = FeatureFlagExp
+
     def GetFeatureFlagExp(self):
         return self.FeatureFlagExp
 
-## InfDefineEntryPointItem
+# InfDefineEntryPointItem
 #
+
+
 class InfDefineEntryPointItem(InfDefineImageExeParamItem):
     def __init__(self):
         InfDefineImageExeParamItem.__init__(self)
 
-## InfDefineUnloadImageItem
+# InfDefineUnloadImageItem
 #
+
+
 class InfDefineUnloadImageItem(InfDefineImageExeParamItem):
     def __init__(self):
         InfDefineImageExeParamItem.__init__(self)
 
-## InfDefineConstructorItem
+# InfDefineConstructorItem
 #
+
+
 class InfDefineConstructorItem(InfDefineImageExeParamItem):
     def __init__(self):
         InfDefineImageExeParamItem.__init__(self)
@@ -50,11 +61,14 @@ class InfDefineConstructorItem(InfDefineImageExeParamItem):
 
     def SetSupModList(self, SupModList):
         self.SupModList = SupModList
+
     def GetSupModList(self):
         return self.SupModList
 
-## InfDefineDestructorItem
+# InfDefineDestructorItem
 #
+
+
 class InfDefineDestructorItem(InfDefineImageExeParamItem):
     def __init__(self):
         InfDefineImageExeParamItem.__init__(self)
@@ -62,11 +76,14 @@ class InfDefineDestructorItem(InfDefineImageExeParamItem):
 
     def SetSupModList(self, SupModList):
         self.SupModList = SupModList
+
     def GetSupModList(self):
         return self.SupModList
 
-## InfDefineLibraryItem
+# InfDefineLibraryItem
 #
+
+
 class InfDefineLibraryItem():
     def __init__(self):
         self.LibraryName = ''
@@ -75,9 +92,12 @@ class InfDefineLibraryItem():
 
     def SetLibraryName(self, Name):
         self.LibraryName = Name
+
     def GetLibraryName(self):
         return self.LibraryName
+
     def SetTypes(self, Type):
         self.Types = Type
+
     def GetTypes(self):
         return self.Types
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py
index a1b691ff0300..aa3d20ae2fa4 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define class objects of [Defines] section for INF file.
 # It will consumed by InfParser
 #
@@ -44,20 +44,22 @@ from Object.Parser.InfDefineCommonObject import InfDefineUnloadImageItem
 from Object.Parser.InfDefineCommonObject import InfDefineConstructorItem
 from Object.Parser.InfDefineCommonObject import InfDefineDestructorItem
 
+
 class InfDefSectionOptionRomInfo():
     def __init__(self):
-        self.PciVendorId                = None
-        self.PciDeviceId                = None
-        self.PciClassCode               = None
-        self.PciRevision                = None
-        self.PciCompress                = None
-        self.CurrentLine                = ['', -1, '']
+        self.PciVendorId = None
+        self.PciDeviceId = None
+        self.PciClassCode = None
+        self.PciRevision = None
+        self.PciCompress = None
+        self.CurrentLine = ['', -1, '']
+
     def SetPciVendorId(self, PciVendorId, Comments):
         #
         # Value has been set before.
         #
         if self.PciVendorId is not None:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND%(DT.TAB_INF_DEFINES_PCI_VENDOR_ID),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND % (DT.TAB_INF_DEFINES_PCI_VENDOR_ID),
                        LineInfo=self.CurrentLine)
             return False
         #
@@ -69,7 +71,7 @@ class InfDefSectionOptionRomInfo():
             self.PciVendorId.Comments = Comments
             return True
         else:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(PciVendorId),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (PciVendorId),
                        LineInfo=self.CurrentLine)
             return False
 
@@ -81,7 +83,7 @@ class InfDefSectionOptionRomInfo():
         # Value has been set before.
         #
         if self.PciDeviceId is not None:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND%(DT.TAB_INF_DEFINES_PCI_DEVICE_ID),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND % (DT.TAB_INF_DEFINES_PCI_DEVICE_ID),
                        LineInfo=self.CurrentLine)
             return False
         #
@@ -93,7 +95,7 @@ class InfDefSectionOptionRomInfo():
             self.PciDeviceId.Comments = Comments
             return True
         else:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(PciDeviceId),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (PciDeviceId),
                        LineInfo=self.CurrentLine)
             return False
 
@@ -105,7 +107,7 @@ class InfDefSectionOptionRomInfo():
         # Value has been set before.
         #
         if self.PciClassCode is not None:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND%(DT.TAB_INF_DEFINES_PCI_CLASS_CODE),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND % (DT.TAB_INF_DEFINES_PCI_CLASS_CODE),
                        LineInfo=self.CurrentLine)
             return False
         #
@@ -117,7 +119,7 @@ class InfDefSectionOptionRomInfo():
             self.PciClassCode.Comments = Comments
             return True
         else:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%\
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID %
                        (PciClassCode),
                        LineInfo=self.CurrentLine)
             return False
@@ -130,7 +132,7 @@ class InfDefSectionOptionRomInfo():
         # Value has been set before.
         #
         if self.PciRevision is not None:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND%(DT.TAB_INF_DEFINES_PCI_REVISION),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND % (DT.TAB_INF_DEFINES_PCI_REVISION),
                        LineInfo=self.CurrentLine)
             return False
         #
@@ -142,7 +144,7 @@ class InfDefSectionOptionRomInfo():
             self.PciRevision.Comments = Comments
             return True
         else:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(PciRevision),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (PciRevision),
                        LineInfo=self.CurrentLine)
             return False
 
@@ -154,7 +156,7 @@ class InfDefSectionOptionRomInfo():
         # Value has been set before.
         #
         if self.PciCompress is not None:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND%(DT.TAB_INF_DEFINES_PCI_COMPRESS),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND % (DT.TAB_INF_DEFINES_PCI_COMPRESS),
                        LineInfo=self.CurrentLine)
             return False
 
@@ -167,41 +169,44 @@ class InfDefSectionOptionRomInfo():
             self.PciCompress.Comments = Comments
             return True
         else:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(PciCompress),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (PciCompress),
                        LineInfo=self.CurrentLine)
             return False
+
     def GetPciCompress(self):
         return self.PciCompress
 ##
 # INF [Define] section Object
 #
+
+
 class InfDefSection(InfDefSectionOptionRomInfo):
     def __init__(self):
-        self.BaseName                   = None
-        self.FileGuid                   = None
-        self.ModuleType                 = None
-        self.ModuleUniFileName          = None
-        self.InfVersion                 = None
-        self.EdkReleaseVersion          = None
-        self.UefiSpecificationVersion   = None
-        self.PiSpecificationVersion     = None
-        self.LibraryClass               = []
-        self.Package                    = None
-        self.VersionString              = None
-        self.PcdIsDriver                = None
-        self.EntryPoint                 = []
-        self.UnloadImages               = []
-        self.Constructor                = []
-        self.Destructor                 = []
-        self.Shadow                     = None
-        self.CustomMakefile             = []
-        self.Specification              = []
-        self.UefiHiiResourceSection     = None
-        self.DpxSource                  = []
-        self.CurrentLine                = ['', -1, '']
+        self.BaseName = None
+        self.FileGuid = None
+        self.ModuleType = None
+        self.ModuleUniFileName = None
+        self.InfVersion = None
+        self.EdkReleaseVersion = None
+        self.UefiSpecificationVersion = None
+        self.PiSpecificationVersion = None
+        self.LibraryClass = []
+        self.Package = None
+        self.VersionString = None
+        self.PcdIsDriver = None
+        self.EntryPoint = []
+        self.UnloadImages = []
+        self.Constructor = []
+        self.Destructor = []
+        self.Shadow = None
+        self.CustomMakefile = []
+        self.Specification = []
+        self.UefiHiiResourceSection = None
+        self.DpxSource = []
+        self.CurrentLine = ['', -1, '']
         InfDefSectionOptionRomInfo.__init__(self)
 
-    ## SetHeadComment
+    # SetHeadComment
     #
     # @param BaseName: BaseName
     #
@@ -210,7 +215,7 @@ class InfDefSection(InfDefSectionOptionRomInfo):
         # Value has been set before.
         #
         if self.BaseName is not None:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND%(DT.TAB_INF_DEFINES_BASE_NAME),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND % (DT.TAB_INF_DEFINES_BASE_NAME),
                        LineInfo=self.CurrentLine)
             return False
         if not (BaseName == '' or BaseName is None):
@@ -220,16 +225,16 @@ class InfDefSection(InfDefSectionOptionRomInfo):
                 self.BaseName.Comments = Comments
                 return True
             else:
-                ErrorInInf(ST.ERR_INF_PARSER_DEFINE_NAME_INVALID%(BaseName),
+                ErrorInInf(ST.ERR_INF_PARSER_DEFINE_NAME_INVALID % (BaseName),
                            LineInfo=self.CurrentLine)
                 return False
 
-    ## GetBaseName
+    # GetBaseName
     #
     def GetBaseName(self):
         return self.BaseName
 
-    ## SetFileGuid
+    # SetFileGuid
     #
     # @param FileGuid: FileGuid
     #
@@ -238,8 +243,8 @@ class InfDefSection(InfDefSectionOptionRomInfo):
         # Value has been set before.
         #
         if self.FileGuid is not None:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND\
-                       %(DT.TAB_INF_DEFINES_FILE_GUID),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND
+                       % (DT.TAB_INF_DEFINES_FILE_GUID),
                        LineInfo=self.CurrentLine)
             return False
         #
@@ -251,16 +256,16 @@ class InfDefSection(InfDefSectionOptionRomInfo):
             self.FileGuid.Comments = Comments
             return True
         else:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_GUID_INVALID%(FileGuid),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_GUID_INVALID % (FileGuid),
                        LineInfo=self.CurrentLine)
             return False
 
-    ## GetFileGuid
+    # GetFileGuid
     #
     def GetFileGuid(self):
         return self.FileGuid
 
-    ## SetModuleType
+    # SetModuleType
     #
     # @param ModuleType: ModuleType
     #
@@ -269,8 +274,8 @@ class InfDefSection(InfDefSectionOptionRomInfo):
         # Value has been set before.
         #
         if self.ModuleType is not None:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND\
-                       %(DT.TAB_INF_DEFINES_MODULE_TYPE),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND
+                       % (DT.TAB_INF_DEFINES_MODULE_TYPE),
                        LineInfo=self.CurrentLine)
             return False
         #
@@ -286,17 +291,17 @@ class InfDefSection(InfDefSectionOptionRomInfo):
             self.ModuleType.Comments = Comments
             return True
         else:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_MODULETYPE_INVALID%\
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_MODULETYPE_INVALID %
                        (ModuleType),
                        LineInfo=self.CurrentLine)
             return False
 
-    ## GetModuleType
+    # GetModuleType
     #
     def GetModuleType(self):
         return self.ModuleType
 
-    ## SetModuleUniFileName
+    # SetModuleUniFileName
     #
     # @param ModuleUniFileName: ModuleUniFileName
     #
@@ -304,16 +309,16 @@ class InfDefSection(InfDefSectionOptionRomInfo):
         if Comments:
             pass
         if self.ModuleUniFileName is not None:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND%(DT.TAB_INF_DEFINES_MODULE_UNI_FILE),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND % (DT.TAB_INF_DEFINES_MODULE_UNI_FILE),
                        LineInfo=self.CurrentLine)
         self.ModuleUniFileName = ModuleUniFileName
 
-    ## GetModuleType
+    # GetModuleType
     #
     def GetModuleUniFileName(self):
         return self.ModuleUniFileName
 
-    ## SetInfVersion
+    # SetInfVersion
     #
     # @param InfVersion: InfVersion
     #
@@ -322,8 +327,8 @@ class InfDefSection(InfDefSectionOptionRomInfo):
         # Value has been set before.
         #
         if self.InfVersion is not None:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND\
-                       %(DT.TAB_INF_DEFINES_INF_VERSION),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND
+                       % (DT.TAB_INF_DEFINES_INF_VERSION),
                        LineInfo=self.CurrentLine)
             return False
         #
@@ -340,7 +345,7 @@ class InfDefSection(InfDefSectionOptionRomInfo):
                            ErrorCode=ToolError.EDK1_INF_ERROR,
                            LineInfo=self.CurrentLine)
         else:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(InfVersion),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (InfVersion),
                        LineInfo=self.CurrentLine)
             return False
 
@@ -349,12 +354,12 @@ class InfDefSection(InfDefSectionOptionRomInfo):
         self.InfVersion.Comments = Comments
         return True
 
-    ## GetInfVersion
+    # GetInfVersion
     #
     def GetInfVersion(self):
         return self.InfVersion
 
-    ## SetEdkReleaseVersion
+    # SetEdkReleaseVersion
     #
     # @param EdkReleaseVersion: EdkReleaseVersion
     #
@@ -363,8 +368,8 @@ class InfDefSection(InfDefSectionOptionRomInfo):
         # Value has been set before.
         #
         if self.EdkReleaseVersion is not None:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND\
-                       %(DT.TAB_INF_DEFINES_EDK_RELEASE_VERSION),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND
+                       % (DT.TAB_INF_DEFINES_EDK_RELEASE_VERSION),
                        LineInfo=self.CurrentLine)
             return False
         #
@@ -377,17 +382,17 @@ class InfDefSection(InfDefSectionOptionRomInfo):
             self.EdkReleaseVersion.Comments = Comments
             return True
         else:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID\
-                       %(EdkReleaseVersion),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID
+                       % (EdkReleaseVersion),
                        LineInfo=self.CurrentLine)
             return False
 
-    ## GetEdkReleaseVersion
+    # GetEdkReleaseVersion
     #
     def GetEdkReleaseVersion(self):
         return self.EdkReleaseVersion
 
-    ## SetUefiSpecificationVersion
+    # SetUefiSpecificationVersion
     #
     # @param UefiSpecificationVersion: UefiSpecificationVersion
     #
@@ -396,8 +401,8 @@ class InfDefSection(InfDefSectionOptionRomInfo):
         # Value has been set before.
         #
         if self.UefiSpecificationVersion is not None:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND\
-                       %(DT.TAB_INF_DEFINES_UEFI_SPECIFICATION_VERSION),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND
+                       % (DT.TAB_INF_DEFINES_UEFI_SPECIFICATION_VERSION),
                        LineInfo=self.CurrentLine)
             return False
         #
@@ -410,17 +415,17 @@ class InfDefSection(InfDefSectionOptionRomInfo):
             self.UefiSpecificationVersion.Comments = Comments
             return True
         else:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID\
-                       %(UefiSpecificationVersion),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID
+                       % (UefiSpecificationVersion),
                        LineInfo=self.CurrentLine)
             return False
 
-    ## GetUefiSpecificationVersion
+    # GetUefiSpecificationVersion
     #
     def GetUefiSpecificationVersion(self):
         return self.UefiSpecificationVersion
 
-    ## SetPiSpecificationVersion
+    # SetPiSpecificationVersion
     #
     # @param PiSpecificationVersion: PiSpecificationVersion
     #
@@ -429,8 +434,8 @@ class InfDefSection(InfDefSectionOptionRomInfo):
         # Value has been set before.
         #
         if self.PiSpecificationVersion is not None:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND\
-                       %(DT.TAB_INF_DEFINES_PI_SPECIFICATION_VERSION),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND
+                       % (DT.TAB_INF_DEFINES_PI_SPECIFICATION_VERSION),
                        LineInfo=self.CurrentLine)
             return False
         #
@@ -443,17 +448,17 @@ class InfDefSection(InfDefSectionOptionRomInfo):
             self.PiSpecificationVersion.Comments = Comments
             return True
         else:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID\
-                       %(PiSpecificationVersion),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID
+                       % (PiSpecificationVersion),
                        LineInfo=self.CurrentLine)
             return False
 
-    ## GetPiSpecificationVersion
+    # GetPiSpecificationVersion
     #
     def GetPiSpecificationVersion(self):
         return self.PiSpecificationVersion
 
-    ## SetLibraryClass
+    # SetLibraryClass
     #
     # @param LibraryClass: LibraryClass
     #
@@ -470,13 +475,13 @@ class InfDefSection(InfDefSectionOptionRomInfo):
                 TypeList = [Type for Type in TypeList if Type != '']
                 for Item in TypeList:
                     if Item not in DT.MODULE_LIST:
-                        ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(Item),
+                        ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (Item),
                                    LineInfo=self.CurrentLine)
                         return False
                 InfDefineLibraryItemObj.SetTypes(TypeList)
             self.LibraryClass.append(InfDefineLibraryItemObj)
         else:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(Name),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (Name),
                        LineInfo=self.CurrentLine)
             return False
 
@@ -490,20 +495,19 @@ class InfDefSection(InfDefSectionOptionRomInfo):
         # Value has been set before.
         #
         if self.VersionString is not None:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND\
-                       %(DT.TAB_INF_DEFINES_VERSION_STRING),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND
+                       % (DT.TAB_INF_DEFINES_VERSION_STRING),
                        LineInfo=self.CurrentLine)
             return False
         if not IsValidDecVersion(VersionString):
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID\
-                       %(VersionString),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID
+                       % (VersionString),
                        LineInfo=self.CurrentLine)
         self.VersionString = InfDefMember()
         self.VersionString.SetValue(VersionString)
         self.VersionString.Comments = Comments
         return True
 
-
     def GetVersionString(self):
         return self.VersionString
 
@@ -512,8 +516,8 @@ class InfDefSection(InfDefSectionOptionRomInfo):
         # Value has been set before.
         #
         if self.PcdIsDriver is not None:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND\
-                       %(DT.TAB_INF_DEFINES_PCD_IS_DRIVER),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND
+                       % (DT.TAB_INF_DEFINES_PCD_IS_DRIVER),
                        LineInfo=self.CurrentLine)
             return False
         if PcdIsDriver == 'PEI_PCD_DRIVER' or PcdIsDriver == 'DXE_PCD_DRIVER':
@@ -522,7 +526,7 @@ class InfDefSection(InfDefSectionOptionRomInfo):
             self.PcdIsDriver.Comments = Comments
             return True
         else:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(PcdIsDriver),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (PcdIsDriver),
                        LineInfo=self.CurrentLine)
             return False
 
@@ -541,13 +545,13 @@ class InfDefSection(InfDefSectionOptionRomInfo):
         ValueList[0:len(TokenList)] = TokenList
         InfDefineEntryPointItemObj = InfDefineEntryPointItem()
         if not IsValidCVariableName(ValueList[0]):
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%\
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID %
                        (ValueList[0]),
                        LineInfo=self.CurrentLine)
         InfDefineEntryPointItemObj.SetCName(ValueList[0])
         if len(ValueList) == 2:
             if ValueList[1].strip() == '':
-                ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%\
+                ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID %
                            (ValueList[1]),
                            LineInfo=self.CurrentLine)
             #
@@ -555,12 +559,12 @@ class InfDefSection(InfDefSectionOptionRomInfo):
             #
             FeatureFlagRtv = IsValidFeatureFlagExp(ValueList[1].strip())
             if not FeatureFlagRtv[0]:
-                ErrorInInf(ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID%\
+                ErrorInInf(ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID %
                            (FeatureFlagRtv[1]),
                            LineInfo=self.CurrentLine)
             InfDefineEntryPointItemObj.SetFeatureFlagExp(ValueList[1])
         if len(ValueList) > 2:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(EntryPoint),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (EntryPoint),
                        LineInfo=self.CurrentLine)
         InfDefineEntryPointItemObj.Comments = Comments
         self.EntryPoint.append(InfDefineEntryPointItemObj)
@@ -580,24 +584,24 @@ class InfDefSection(InfDefSectionOptionRomInfo):
         ValueList[0:len(TokenList)] = TokenList
         InfDefineUnloadImageItemObj = InfDefineUnloadImageItem()
         if not IsValidCVariableName(ValueList[0]):
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(ValueList[0]),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (ValueList[0]),
                        LineInfo=self.CurrentLine)
         InfDefineUnloadImageItemObj.SetCName(ValueList[0])
         if len(ValueList) == 2:
             if ValueList[1].strip() == '':
-                ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(ValueList[1]),
+                ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (ValueList[1]),
                            LineInfo=self.CurrentLine)
             #
             # Validate FFE
             #
             FeatureFlagRtv = IsValidFeatureFlagExp(ValueList[1].strip())
             if not FeatureFlagRtv[0]:
-                ErrorInInf(ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID%(FeatureFlagRtv[1]),
+                ErrorInInf(ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID % (FeatureFlagRtv[1]),
                            LineInfo=self.CurrentLine)
             InfDefineUnloadImageItemObj.SetFeatureFlagExp(ValueList[1])
 
         if len(ValueList) > 2:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(UnloadImages),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (UnloadImages),
                        LineInfo=self.CurrentLine)
         InfDefineUnloadImageItemObj.Comments = Comments
         self.UnloadImages.append(InfDefineUnloadImageItemObj)
@@ -617,34 +621,34 @@ class InfDefSection(InfDefSectionOptionRomInfo):
         ValueList[0:len(TokenList)] = TokenList
         InfDefineConstructorItemObj = InfDefineConstructorItem()
         if not IsValidCVariableName(ValueList[0]):
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(ValueList[0]),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (ValueList[0]),
                        LineInfo=self.CurrentLine)
         InfDefineConstructorItemObj.SetCName(ValueList[0])
         if len(ValueList) >= 2:
             ModList = GetSplitValueList(ValueList[1], ' ')
             if ValueList[1].strip() == '':
-                ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(ValueList[1]),
+                ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (ValueList[1]),
                            LineInfo=self.CurrentLine)
             for ModItem in ModList:
                 if ModItem not in DT.MODULE_LIST:
-                    ErrorInInf(ST.ERR_INF_PARSER_DEFINE_MODULETYPE_INVALID%(ModItem),
+                    ErrorInInf(ST.ERR_INF_PARSER_DEFINE_MODULETYPE_INVALID % (ModItem),
                                LineInfo=self.CurrentLine)
             InfDefineConstructorItemObj.SetSupModList(ModList)
         if len(ValueList) == 3:
             if ValueList[2].strip() == '':
-                ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(ValueList[2]),
+                ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (ValueList[2]),
                            LineInfo=self.CurrentLine)
             #
             # Validate FFE
             #
             FeatureFlagRtv = IsValidFeatureFlagExp(ValueList[2].strip())
             if not FeatureFlagRtv[0]:
-                ErrorInInf(ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID%(FeatureFlagRtv[2]),
+                ErrorInInf(ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID % (FeatureFlagRtv[2]),
                            LineInfo=self.CurrentLine)
             InfDefineConstructorItemObj.SetFeatureFlagExp(ValueList[2])
 
         if len(ValueList) > 3:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(Constructor),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (Constructor),
                        LineInfo=self.CurrentLine)
         InfDefineConstructorItemObj.Comments = Comments
         self.Constructor.append(InfDefineConstructorItemObj)
@@ -664,34 +668,34 @@ class InfDefSection(InfDefSectionOptionRomInfo):
         ValueList[0:len(TokenList)] = TokenList
         InfDefineDestructorItemObj = InfDefineDestructorItem()
         if not IsValidCVariableName(ValueList[0]):
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(ValueList[0]),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (ValueList[0]),
                        LineInfo=self.CurrentLine)
         InfDefineDestructorItemObj.SetCName(ValueList[0])
         if len(ValueList) >= 2:
             ModList = GetSplitValueList(ValueList[1].strip(), ' ')
             if ValueList[1].strip() == '':
-                ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(ValueList[1]),
+                ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (ValueList[1]),
                            LineInfo=self.CurrentLine)
             for ModItem in ModList:
                 if ModItem not in DT.MODULE_LIST:
-                    ErrorInInf(ST.ERR_INF_PARSER_DEFINE_MODULETYPE_INVALID%(ModItem),
+                    ErrorInInf(ST.ERR_INF_PARSER_DEFINE_MODULETYPE_INVALID % (ModItem),
                                LineInfo=self.CurrentLine)
             InfDefineDestructorItemObj.SetSupModList(ModList)
         if len(ValueList) == 3:
             if ValueList[2].strip() == '':
-                ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(ValueList[2]),
+                ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (ValueList[2]),
                            LineInfo=self.CurrentLine)
             #
             # Validate FFE
             #
             FeatureFlagRtv = IsValidFeatureFlagExp(ValueList[2].strip())
             if not FeatureFlagRtv[0]:
-                ErrorInInf(ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID%(FeatureFlagRtv[1]),
+                ErrorInInf(ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID % (FeatureFlagRtv[1]),
                            LineInfo=self.CurrentLine)
             InfDefineDestructorItemObj.SetFeatureFlagExp(ValueList[2])
 
         if len(ValueList) > 3:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(Destructor),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (Destructor),
                        LineInfo=self.CurrentLine)
 
         InfDefineDestructorItemObj.Comments = Comments
@@ -705,7 +709,7 @@ class InfDefSection(InfDefSectionOptionRomInfo):
         # Value has been set before.
         #
         if self.Shadow is not None:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND%(DT.TAB_INF_DEFINES_SHADOW),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND % (DT.TAB_INF_DEFINES_SHADOW),
                        LineInfo=self.CurrentLine)
             return False
         if (IsValidBoolType(Shadow)):
@@ -714,9 +718,10 @@ class InfDefSection(InfDefSectionOptionRomInfo):
             self.Shadow.Comments = Comments
             return True
         else:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(Shadow),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (Shadow),
                        LineInfo=self.CurrentLine)
             return False
+
     def GetShadow(self):
         return self.Shadow
 
@@ -736,7 +741,7 @@ class InfDefSection(InfDefSectionOptionRomInfo):
             Family = Family.strip()
             if Family != '':
                 if not IsValidFamily(Family):
-                    ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(Family),
+                    ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (Family),
                                LineInfo=self.CurrentLine)
                     return False
             #
@@ -747,11 +752,12 @@ class InfDefSection(InfDefSectionOptionRomInfo):
             if IsValidPath(FileName, ModulePath):
                 IsValidFileFlag = True
             else:
-                ErrorInInf(ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID%(FileName),
+                ErrorInInf(ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID % (FileName),
                            LineInfo=self.CurrentLine)
                 return False
             if IsValidFileFlag:
-                FileName = ConvPathFromAbsToRel(FileName, GlobalData.gINF_MODULE_DIR)
+                FileName = ConvPathFromAbsToRel(
+                    FileName, GlobalData.gINF_MODULE_DIR)
                 self.CustomMakefile.append((Family, FileName, Comments))
                 IsValidFileFlag = False
             return True
@@ -785,11 +791,11 @@ class InfDefSection(InfDefSectionOptionRomInfo):
                 self.Specification.append((Name, Version, Comments))
                 return True
             else:
-                ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(Version),
+                ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (Version),
                            LineInfo=self.CurrentLine)
                 return False
         else:
-            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(Name),
+            ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (Name),
                        LineInfo=self.CurrentLine)
             return False
         return True
@@ -807,7 +813,7 @@ class InfDefSection(InfDefSectionOptionRomInfo):
         #
         if self.UefiHiiResourceSection is not None:
             ErrorInInf(ST.ERR_INF_PARSER_DEFINE_ITEM_MORE_THAN_ONE_FOUND
-                       %(DT.TAB_INF_DEFINES_UEFI_HII_RESOURCE_SECTION),
+                       % (DT.TAB_INF_DEFINES_UEFI_HII_RESOURCE_SECTION),
                        LineInfo=self.CurrentLine)
             return False
         if not (UefiHiiResourceSection == '' or UefiHiiResourceSection is None):
@@ -817,7 +823,7 @@ class InfDefSection(InfDefSectionOptionRomInfo):
                 self.UefiHiiResourceSection.Comments = Comments
                 return True
             else:
-                ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(UefiHiiResourceSection),
+                ErrorInInf(ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (UefiHiiResourceSection),
                            LineInfo=self.CurrentLine)
                 return False
         else:
@@ -835,12 +841,12 @@ class InfDefSection(InfDefSectionOptionRomInfo):
         if IsValidPath(DpxSource, ModulePath):
             IsValidFileFlag = True
         else:
-            ErrorInInf(ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID%(DpxSource),
+            ErrorInInf(ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID % (DpxSource),
                        LineInfo=self.CurrentLine)
             return False
         if IsValidFileFlag:
             DpxSource = ConvPathFromAbsToRel(DpxSource,
-                            GlobalData.gINF_MODULE_DIR)
+                                             GlobalData.gINF_MODULE_DIR)
             self.DpxSource.append((DpxSource, Comments))
             IsValidFileFlag = False
         return True
@@ -848,69 +854,79 @@ class InfDefSection(InfDefSectionOptionRomInfo):
     def GetDpxSource(self):
         return self.DpxSource
 
+
 gFUNCTION_MAPPING_FOR_DEFINE_SECTION = {
     #
     # Required Fields
     #
-    DT.TAB_INF_DEFINES_BASE_NAME                   : InfDefSection.SetBaseName,
-    DT.TAB_INF_DEFINES_FILE_GUID                   : InfDefSection.SetFileGuid,
-    DT.TAB_INF_DEFINES_MODULE_TYPE                 : InfDefSection.SetModuleType,
+    DT.TAB_INF_DEFINES_BASE_NAME: InfDefSection.SetBaseName,
+    DT.TAB_INF_DEFINES_FILE_GUID: InfDefSection.SetFileGuid,
+    DT.TAB_INF_DEFINES_MODULE_TYPE: InfDefSection.SetModuleType,
     #
     # Required by EDKII style INF file
     #
-    DT.TAB_INF_DEFINES_INF_VERSION                 : InfDefSection.SetInfVersion,
+    DT.TAB_INF_DEFINES_INF_VERSION: InfDefSection.SetInfVersion,
     #
     # Optional Fields
     #
-    DT.TAB_INF_DEFINES_MODULE_UNI_FILE             : InfDefSection.SetModuleUniFileName,
-    DT.TAB_INF_DEFINES_EDK_RELEASE_VERSION         : InfDefSection.SetEdkReleaseVersion,
-    DT.TAB_INF_DEFINES_UEFI_SPECIFICATION_VERSION  : InfDefSection.SetUefiSpecificationVersion,
-    DT.TAB_INF_DEFINES_PI_SPECIFICATION_VERSION    : InfDefSection.SetPiSpecificationVersion,
-    DT.TAB_INF_DEFINES_LIBRARY_CLASS               : InfDefSection.SetLibraryClass,
-    DT.TAB_INF_DEFINES_VERSION_STRING              : InfDefSection.SetVersionString,
-    DT.TAB_INF_DEFINES_PCD_IS_DRIVER               : InfDefSection.SetPcdIsDriver,
-    DT.TAB_INF_DEFINES_ENTRY_POINT                 : InfDefSection.SetEntryPoint,
-    DT.TAB_INF_DEFINES_UNLOAD_IMAGE                : InfDefSection.SetUnloadImages,
-    DT.TAB_INF_DEFINES_CONSTRUCTOR                 : InfDefSection.SetConstructor,
-    DT.TAB_INF_DEFINES_DESTRUCTOR                  : InfDefSection.SetDestructor,
-    DT.TAB_INF_DEFINES_SHADOW                      : InfDefSection.SetShadow,
-    DT.TAB_INF_DEFINES_PCI_VENDOR_ID               : InfDefSection.SetPciVendorId,
-    DT.TAB_INF_DEFINES_PCI_DEVICE_ID               : InfDefSection.SetPciDeviceId,
-    DT.TAB_INF_DEFINES_PCI_CLASS_CODE              : InfDefSection.SetPciClassCode,
-    DT.TAB_INF_DEFINES_PCI_REVISION                : InfDefSection.SetPciRevision,
-    DT.TAB_INF_DEFINES_PCI_COMPRESS                : InfDefSection.SetPciCompress,
-    DT.TAB_INF_DEFINES_CUSTOM_MAKEFILE             : InfDefSection.SetCustomMakefile,
-    DT.TAB_INF_DEFINES_SPEC                        : InfDefSection.SetSpecification,
-    DT.TAB_INF_DEFINES_UEFI_HII_RESOURCE_SECTION   : InfDefSection.SetUefiHiiResourceSection,
-    DT.TAB_INF_DEFINES_DPX_SOURCE                  : InfDefSection.SetDpxSource
+    DT.TAB_INF_DEFINES_MODULE_UNI_FILE: InfDefSection.SetModuleUniFileName,
+    DT.TAB_INF_DEFINES_EDK_RELEASE_VERSION: InfDefSection.SetEdkReleaseVersion,
+    DT.TAB_INF_DEFINES_UEFI_SPECIFICATION_VERSION: InfDefSection.SetUefiSpecificationVersion,
+    DT.TAB_INF_DEFINES_PI_SPECIFICATION_VERSION: InfDefSection.SetPiSpecificationVersion,
+    DT.TAB_INF_DEFINES_LIBRARY_CLASS: InfDefSection.SetLibraryClass,
+    DT.TAB_INF_DEFINES_VERSION_STRING: InfDefSection.SetVersionString,
+    DT.TAB_INF_DEFINES_PCD_IS_DRIVER: InfDefSection.SetPcdIsDriver,
+    DT.TAB_INF_DEFINES_ENTRY_POINT: InfDefSection.SetEntryPoint,
+    DT.TAB_INF_DEFINES_UNLOAD_IMAGE: InfDefSection.SetUnloadImages,
+    DT.TAB_INF_DEFINES_CONSTRUCTOR: InfDefSection.SetConstructor,
+    DT.TAB_INF_DEFINES_DESTRUCTOR: InfDefSection.SetDestructor,
+    DT.TAB_INF_DEFINES_SHADOW: InfDefSection.SetShadow,
+    DT.TAB_INF_DEFINES_PCI_VENDOR_ID: InfDefSection.SetPciVendorId,
+    DT.TAB_INF_DEFINES_PCI_DEVICE_ID: InfDefSection.SetPciDeviceId,
+    DT.TAB_INF_DEFINES_PCI_CLASS_CODE: InfDefSection.SetPciClassCode,
+    DT.TAB_INF_DEFINES_PCI_REVISION: InfDefSection.SetPciRevision,
+    DT.TAB_INF_DEFINES_PCI_COMPRESS: InfDefSection.SetPciCompress,
+    DT.TAB_INF_DEFINES_CUSTOM_MAKEFILE: InfDefSection.SetCustomMakefile,
+    DT.TAB_INF_DEFINES_SPEC: InfDefSection.SetSpecification,
+    DT.TAB_INF_DEFINES_UEFI_HII_RESOURCE_SECTION: InfDefSection.SetUefiHiiResourceSection,
+    DT.TAB_INF_DEFINES_DPX_SOURCE: InfDefSection.SetDpxSource
 }
 
-## InfDefMember
+# InfDefMember
 #
 #
+
+
 class InfDefMember():
     def __init__(self, Name='', Value=''):
         self.Comments = InfLineCommentObject()
-        self.Name  = Name
+        self.Name = Name
         self.Value = Value
         self.CurrentLine = CurrentLine()
+
     def GetName(self):
         return self.Name
+
     def SetName(self, Name):
         self.Name = Name
+
     def GetValue(self):
         return self.Value
+
     def SetValue(self, Value):
         self.Value = Value
 
-## InfDefObject
+# InfDefObject
 #
 #
+
+
 class InfDefObject(InfSectionCommonDef):
     def __init__(self):
         self.Defines = Sdict()
         InfSectionCommonDef.__init__(self)
-    def SetDefines(self, DefineContent, Arch = None):
+
+    def SetDefines(self, DefineContent, Arch=None):
         #
         # Validate Arch
         #
@@ -926,16 +942,19 @@ class InfDefObject(InfSectionCommonDef):
             Value = InfDefMemberObj.GetValue()
             if Name == DT.TAB_INF_DEFINES_MODULE_UNI_FILE:
                 ValidateUNIFilePath(Value)
-                Value = os.path.join(os.path.dirname(InfDefMemberObj.CurrentLine.FileName), Value)
+                Value = os.path.join(os.path.dirname(
+                    InfDefMemberObj.CurrentLine.FileName), Value)
                 if not os.path.isfile(Value) or not os.path.exists(Value):
                     LineInfo[0] = InfDefMemberObj.CurrentLine.GetFileName()
                     LineInfo[1] = InfDefMemberObj.CurrentLine.GetLineNo()
                     LineInfo[2] = InfDefMemberObj.CurrentLine.GetLineString()
-                    ErrorInInf(ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID%(Name),
-                                   LineInfo=LineInfo)
+                    ErrorInInf(ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID % (Name),
+                               LineInfo=LineInfo)
             InfLineCommentObj = InfLineCommentObject()
-            InfLineCommentObj.SetHeaderComments(InfDefMemberObj.Comments.GetHeaderComments())
-            InfLineCommentObj.SetTailComments(InfDefMemberObj.Comments.GetTailComments())
+            InfLineCommentObj.SetHeaderComments(
+                InfDefMemberObj.Comments.GetHeaderComments())
+            InfLineCommentObj.SetTailComments(
+                InfDefMemberObj.Comments.GetTailComments())
             if Name == 'COMPONENT_TYPE':
                 ErrorInInf(ST.ERR_INF_PARSER_NOT_SUPPORT_EDKI_INF,
                            ErrorCode=ToolError.EDK1_INF_ERROR,
@@ -961,7 +980,7 @@ class InfDefObject(InfSectionCommonDef):
                     # Found the process function from mapping table.
                     #
                     if Name not in gFUNCTION_MAPPING_FOR_DEFINE_SECTION.keys():
-                        ErrorInInf(ST.ERR_INF_PARSER_DEFINE_SECTION_KEYWORD_INVALID%(Name),
+                        ErrorInInf(ST.ERR_INF_PARSER_DEFINE_SECTION_KEYWORD_INVALID % (Name),
                                    LineInfo=LineInfo)
                     else:
                         ProcessFunc = gFUNCTION_MAPPING_FOR_DEFINE_SECTION[Name]
@@ -978,7 +997,7 @@ class InfDefObject(InfSectionCommonDef):
                     # Found the process function from mapping table.
                     #
                     if Name not in gFUNCTION_MAPPING_FOR_DEFINE_SECTION.keys():
-                        ErrorInInf(ST.ERR_INF_PARSER_DEFINE_SECTION_KEYWORD_INVALID%(Name),
+                        ErrorInInf(ST.ERR_INF_PARSER_DEFINE_SECTION_KEYWORD_INVALID % (Name),
                                    LineInfo=LineInfo)
                     #
                     # Found the process function from mapping table.
@@ -999,4 +1018,3 @@ class InfDefObject(InfSectionCommonDef):
 
     def GetDefines(self):
         return self.Defines
-
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfDepexObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfDepexObject.py
index 0de663291c11..870ca3317f16 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfDepexObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfDepexObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define class objects of INF file [Depex] section.
 # It will consumed by InfParser.
 #
@@ -19,6 +19,7 @@ from Logger import StringTable as ST
 from Object.Parser.InfCommonObject import InfSectionCommonDef
 from Library.ParserValidate import IsValidArch
 
+
 class InfDepexContentItem():
     def __init__(self):
         self.SectionType = ''
@@ -26,11 +27,13 @@ class InfDepexContentItem():
 
     def SetSectionType(self, SectionType):
         self.SectionType = SectionType
+
     def GetSectionType(self):
         return self.SectionType
 
     def SetSectionString(self, SectionString):
         self.SectionString = SectionString
+
     def GetSectionString(self):
         return self.SectionString
 
@@ -46,38 +49,46 @@ class InfDepexItem():
 
     def SetFeatureFlagExp(self, FeatureFlagExp):
         self.FeatureFlagExp = FeatureFlagExp
+
     def GetFeatureFlagExp(self):
         return self.FeatureFlagExp
 
     def SetSupArch(self, Arch):
         self.SupArch = Arch
+
     def GetSupArch(self):
         return self.SupArch
 
     def SetHelpString(self, HelpString):
         self.HelpString = HelpString
+
     def GetHelpString(self):
         return self.HelpString
 
     def SetModuleType(self, Type):
         self.ModuleType = Type
+
     def GetModuleType(self):
         return self.ModuleType
 
     def SetDepexConent(self, Content):
         self.DepexContent = Content
+
     def GetDepexContent(self):
         return self.DepexContent
 
     def SetInfDepexContentItemList(self, InfDepexContentItemList):
         self.InfDepexContentItemList = InfDepexContentItemList
+
     def GetInfDepexContentItemList(self):
         return self.InfDepexContentItemList
 
-## InfDepexObject
+# InfDepexObject
 #
 #
 #
+
+
 class InfDepexObject(InfSectionCommonDef):
     def __init__(self):
         self.Depex = []
@@ -112,7 +123,8 @@ class InfDepexObject(InfSectionCommonDef):
                 else:
                     Logger.Error("InfParser",
                                  ToolError.FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_DEPEX_SECTION_MODULE_TYPE_ERROR % (ModuleType),
+                                 ST.ERR_INF_PARSER_DEPEX_SECTION_MODULE_TYPE_ERROR % (
+                                     ModuleType),
                                  File=GlobalData.gINF_MODULE_NAME,
                                  Line=KeyItem[2])
 
@@ -127,7 +139,8 @@ class InfDepexObject(InfSectionCommonDef):
             for Line in DepexContent:
                 LineContent = Line[0].strip()
                 if LineContent.find(DT.TAB_COMMENT_SPLIT) > -1:
-                    LineContent = LineContent[:LineContent.find(DT.TAB_COMMENT_SPLIT)]
+                    LineContent = LineContent[:LineContent.find(
+                        DT.TAB_COMMENT_SPLIT)]
                 if LineContent:
                     DepexString = DepexString + LineContent + DT.END_OF_LINE
                 continue
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py
index 3e0bc8044003..997f8da17f63 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define class objects of INF file [Guids] section.
 # It will consumed by InfParser.
 #
@@ -20,6 +20,7 @@ import Logger.Log as Logger
 from Logger import ToolError
 from Logger import StringTable as ST
 
+
 class InfGuidItemCommentContent():
     def __init__(self):
         #
@@ -45,24 +46,29 @@ class InfGuidItemCommentContent():
 
     def SetUsageItem(self, UsageItem):
         self.UsageItem = UsageItem
+
     def GetUsageItem(self):
         return self.UsageItem
 
     def SetGuidTypeItem(self, GuidTypeItem):
         self.GuidTypeItem = GuidTypeItem
+
     def GetGuidTypeItem(self):
         return self.GuidTypeItem
 
     def SetVariableNameItem(self, VariableNameItem):
         self.VariableNameItem = VariableNameItem
+
     def GetVariableNameItem(self):
         return self.VariableNameItem
 
     def SetHelpStringItem(self, HelpStringItem):
         self.HelpStringItem = HelpStringItem
+
     def GetHelpStringItem(self):
         return self.HelpStringItem
 
+
 class InfGuidItem():
     def __init__(self):
         self.Name = ''
@@ -75,33 +81,39 @@ class InfGuidItem():
 
     def SetName(self, Name):
         self.Name = Name
+
     def GetName(self):
         return self.Name
 
     def SetFeatureFlagExp(self, FeatureFlagExp):
         self.FeatureFlagExp = FeatureFlagExp
+
     def GetFeatureFlagExp(self):
         return self.FeatureFlagExp
 
     def SetCommentList(self, CommentList):
         self.CommentList = CommentList
+
     def GetCommentList(self):
         return self.CommentList
 
     def SetSupArchList(self, SupArchList):
         self.SupArchList = SupArchList
+
     def GetSupArchList(self):
         return self.SupArchList
 
-## ParseComment
+# ParseComment
 #
 # ParseComment
 #
+
+
 def ParseGuidComment(CommentsList, InfGuidItemObj):
     #
     # Get/Set Usage and HelpString
     #
-    if CommentsList is not None and len(CommentsList) != 0 :
+    if CommentsList is not None and len(CommentsList) != 0:
         CommentInsList = []
         PreUsage = None
         PreGuidType = None
@@ -111,14 +123,14 @@ def ParseGuidComment(CommentsList, InfGuidItemObj):
         for CommentItem in CommentsList:
             Count = Count + 1
             CommentItemUsage, \
-            CommentItemGuidType, \
-            CommentItemVarString, \
-            CommentItemHelpText = \
-                    ParseComment(CommentItem,
-                                 DT.ALL_USAGE_TOKENS,
-                                 DT.GUID_TYPE_TOKENS,
-                                 [],
-                                 True)
+                CommentItemGuidType, \
+                CommentItemVarString, \
+                CommentItemHelpText = \
+                ParseComment(CommentItem,
+                             DT.ALL_USAGE_TOKENS,
+                             DT.GUID_TYPE_TOKENS,
+                             [],
+                             True)
 
             if CommentItemHelpText is None:
                 CommentItemHelpText = ''
@@ -158,7 +170,8 @@ def ParseGuidComment(CommentsList, InfGuidItemObj):
                 CommentItemIns.SetGuidTypeItem(CommentItemGuidType)
                 CommentItemIns.SetVariableNameItem(CommentItemVarString)
                 if CommentItemHelpText == '' or CommentItemHelpText.endswith(DT.END_OF_LINE):
-                    CommentItemHelpText = CommentItemHelpText.strip(DT.END_OF_LINE)
+                    CommentItemHelpText = CommentItemHelpText.strip(
+                        DT.END_OF_LINE)
                 CommentItemIns.SetHelpStringItem(CommentItemHelpText)
                 CommentInsList.append(CommentItemIns)
 
@@ -186,7 +199,8 @@ def ParseGuidComment(CommentsList, InfGuidItemObj):
                 CommentItemIns.SetGuidTypeItem(CommentItemGuidType)
                 CommentItemIns.SetVariableNameItem(CommentItemVarString)
                 if CommentItemHelpText == '' or CommentItemHelpText.endswith(DT.END_OF_LINE):
-                    CommentItemHelpText = CommentItemHelpText.strip(DT.END_OF_LINE)
+                    CommentItemHelpText = CommentItemHelpText.strip(
+                        DT.END_OF_LINE)
                 CommentItemIns.SetHelpStringItem(CommentItemHelpText)
                 CommentInsList.append(CommentItemIns)
 
@@ -212,10 +226,12 @@ def ParseGuidComment(CommentsList, InfGuidItemObj):
 
     return InfGuidItemObj
 
-## InfGuidObject
+# InfGuidObject
 #
 # InfGuidObject
 #
+
+
 class InfGuidObject():
     def __init__(self):
         self.Guids = Sdict()
@@ -224,7 +240,7 @@ class InfGuidObject():
         #
         self.Macros = {}
 
-    def SetGuid(self, GuidList, Arch = None):
+    def SetGuid(self, GuidList, Arch=None):
         __SupportArchList = []
         for ArchItem in Arch:
             #
@@ -252,7 +268,7 @@ class InfGuidObject():
                 if not IsValidCVariableName(Item[0]):
                     Logger.Error("InfParser",
                                  ToolError.FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_INVALID_CNAME%(Item[0]),
+                                 ST.ERR_INF_PARSER_INVALID_CNAME % (Item[0]),
                                  File=CurrentLineOfItem[2],
                                  Line=CurrentLineOfItem[1],
                                  ExtraData=CurrentLineOfItem[0])
@@ -285,7 +301,8 @@ class InfGuidObject():
                 if not FeatureFlagRtv[0]:
                     Logger.Error("InfParser",
                                  ToolError.FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID%(FeatureFlagRtv[1]),
+                                 ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID % (
+                                     FeatureFlagRtv[1]),
                                  File=CurrentLineOfItem[2],
                                  Line=CurrentLineOfItem[1],
                                  ExtraData=CurrentLineOfItem[0])
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfHeaderObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfHeaderObject.py
index 087edca93bc3..67b7b3267731 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfHeaderObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfHeaderObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define class objects of INF file header.
 # It will consumed by InfParser.
 #
@@ -10,7 +10,7 @@
 InfHeaderObject
 '''
 
-## INF file header object
+# INF file header object
 #
 # A sample file header
 #
@@ -24,15 +24,17 @@ InfHeaderObject
 # # License
 # #
 #
+
+
 class InfHeaderObject():
     def __init__(self):
-        self.FileName    = ''
-        self.Abstract    = ''
+        self.FileName = ''
+        self.Abstract = ''
         self.Description = ''
-        self.Copyright   = ''
-        self.License     = ''
+        self.Copyright = ''
+        self.License = ''
 
-    ## SetFileName
+    # SetFileName
     #
     # @param FileName: File Name
     #
@@ -43,12 +45,12 @@ class InfHeaderObject():
         else:
             return False
 
-    ## GetFileName
+    # GetFileName
     #
     def GetFileName(self):
         return self.FileName
 
-    ## SetAbstract
+    # SetAbstract
     #
     # @param Abstract: Abstract
     #
@@ -59,12 +61,12 @@ class InfHeaderObject():
         else:
             return False
 
-    ## GetAbstract
+    # GetAbstract
     #
     def GetAbstract(self):
         return self.Abstract
 
-    ## SetDescription
+    # SetDescription
     #
     # @param Description: Description content
     #
@@ -75,12 +77,12 @@ class InfHeaderObject():
         else:
             return False
 
-    ## GetAbstract
+    # GetAbstract
     #
     def GetDescription(self):
         return self.Description
 
-    ## SetCopyright
+    # SetCopyright
     #
     # @param Copyright: Copyright content
     #
@@ -91,12 +93,12 @@ class InfHeaderObject():
         else:
             return False
 
-    ## GetCopyright
+    # GetCopyright
     #
     def GetCopyright(self):
         return self.Copyright
 
-    ## SetCopyright
+    # SetCopyright
     #
     # @param License: License content
     #
@@ -107,7 +109,7 @@ class InfHeaderObject():
         else:
             return False
 
-    ## GetLicense
+    # GetLicense
     #
     def GetLicense(self):
         return self.License
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py
index 2e56028318a1..423fb15fe43f 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define class objects of INF file [LibraryClasses] section.
 # It will consumed by InfParser.
 #
@@ -20,10 +20,12 @@ from Object.Parser.InfCommonObject import CurrentLine
 from Library.ExpressionValidate import IsValidFeatureFlagExp
 from Library.ParserValidate import IsValidLibName
 
-## GetArchModuleType
+# GetArchModuleType
 #
 # Get Arch List and ModuleType List
 #
+
+
 def GetArchModuleType(KeyList):
     __SupArchList = []
     __SupModuleList = []
@@ -62,26 +64,31 @@ class InfLibraryClassItem():
 
     def SetLibName(self, LibName):
         self.LibName = LibName
+
     def GetLibName(self):
         return self.LibName
 
     def SetHelpString(self, HelpString):
         self.HelpString = HelpString
+
     def GetHelpString(self):
         return self.HelpString
 
     def SetFeatureFlagExp(self, FeatureFlagExp):
         self.FeatureFlagExp = FeatureFlagExp
+
     def GetFeatureFlagExp(self):
         return self.FeatureFlagExp
 
     def SetSupArchList(self, SupArchList):
         self.SupArchList = SupArchList
+
     def GetSupArchList(self):
         return self.SupArchList
 
     def SetSupModuleList(self, SupModuleList):
         self.SupModuleList = SupModuleList
+
     def GetSupModuleList(self):
         return self.SupModuleList
 
@@ -90,18 +97,22 @@ class InfLibraryClassItem():
     #
     def SetFileGuid(self, FileGuid):
         self.FileGuid = FileGuid
+
     def GetFileGuid(self):
         return self.FileGuid
 
     def SetVersion(self, Version):
         self.Version = Version
+
     def GetVersion(self):
         return self.Version
 
-## INF LibraryClass Section
+# INF LibraryClass Section
 #
 #
 #
+
+
 class InfLibraryClassObject():
     def __init__(self):
         self.LibraryClasses = Sdict()
@@ -110,7 +121,7 @@ class InfLibraryClassObject():
         #
         self.Macros = {}
 
-    ##SetLibraryClasses
+    # SetLibraryClasses
     #
     #
     # @param HelpString:     It can be a common comment or contain a recommend
@@ -139,15 +150,16 @@ class InfLibraryClassObject():
                                 LibItemObj.SetLibName(LibItem[0])
                             else:
                                 Logger.Error("InfParser",
-                                         ToolError.FORMAT_INVALID,
-                                         ST.ERR_INF_PARSER_DEFINE_LIB_NAME_INVALID,
-                                         File=GlobalData.gINF_MODULE_NAME,
-                                         Line=LibItemObj.CurrentLine.GetLineNo(),
-                                         ExtraData=LibItemObj.CurrentLine.GetLineString())
+                                             ToolError.FORMAT_INVALID,
+                                             ST.ERR_INF_PARSER_DEFINE_LIB_NAME_INVALID,
+                                             File=GlobalData.gINF_MODULE_NAME,
+                                             Line=LibItemObj.CurrentLine.GetLineNo(),
+                                             ExtraData=LibItemObj.CurrentLine.GetLineString())
                         else:
                             Logger.Error("InfParser",
                                          ToolError.FORMAT_INVALID,
-                                         ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (LibItem[0]),
+                                         ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (
+                                             LibItem[0]),
                                          File=GlobalData.gINF_MODULE_NAME,
                                          Line=LibItemObj.CurrentLine.GetLineNo(),
                                          ExtraData=LibItemObj.CurrentLine.GetLineString())
@@ -173,7 +185,8 @@ class InfLibraryClassObject():
                     if not FeatureFlagRtv[0]:
                         Logger.Error("InfParser",
                                      ToolError.FORMAT_INVALID,
-                                     ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID % (FeatureFlagRtv[1]),
+                                     ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID % (
+                                         FeatureFlagRtv[1]),
                                      File=GlobalData.gINF_MODULE_NAME,
                                      Line=LibItemObj.CurrentLine.GetLineNo(),
                                      ExtraData=LibItemObj.CurrentLine.GetLineString())
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py b/BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py
index 469d6fbb15ab..a07219d5b5cd 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define class objects of INF file miscellaneous.
 # Include BootMode/HOB/Event and others. It will consumed by InfParser.
 #
@@ -20,6 +20,8 @@ from Library.Misc import Sdict
 ##
 # BootModeObject
 #
+
+
 class InfBootModeObject():
     def __init__(self):
         self.SupportedBootModes = ''
@@ -28,21 +30,26 @@ class InfBootModeObject():
 
     def SetSupportedBootModes(self, SupportedBootModes):
         self.SupportedBootModes = SupportedBootModes
+
     def GetSupportedBootModes(self):
         return self.SupportedBootModes
 
     def SetHelpString(self, HelpString):
         self.HelpString = HelpString
+
     def GetHelpString(self):
         return self.HelpString
 
     def SetUsage(self, Usage):
         self.Usage = Usage
+
     def GetUsage(self):
         return self.Usage
 ##
 # EventObject
 #
+
+
 class InfEventObject():
     def __init__(self):
         self.EventType = ''
@@ -57,16 +64,20 @@ class InfEventObject():
 
     def SetHelpString(self, HelpString):
         self.HelpString = HelpString
+
     def GetHelpString(self):
         return self.HelpString
 
     def SetUsage(self, Usage):
         self.Usage = Usage
+
     def GetUsage(self):
         return self.Usage
 ##
 # HobObject
 #
+
+
 class InfHobObject():
     def __init__(self):
         self.HobType = ''
@@ -82,28 +93,33 @@ class InfHobObject():
 
     def SetUsage(self, Usage):
         self.Usage = Usage
+
     def GetUsage(self):
         return self.Usage
 
     def SetSupArchList(self, ArchList):
         self.SupArchList = ArchList
+
     def GetSupArchList(self):
         return self.SupArchList
 
     def SetHelpString(self, HelpString):
         self.HelpString = HelpString
+
     def GetHelpString(self):
         return self.HelpString
 
 ##
 # InfSpecialCommentObject
 #
+
+
 class InfSpecialCommentObject(InfSectionCommonDef):
     def __init__(self):
         self.SpecialComments = Sdict()
         InfSectionCommonDef.__init__(self)
 
-    def SetSpecialComments(self, SepcialSectionList = None, Type = ''):
+    def SetSpecialComments(self, SepcialSectionList=None, Type=''):
         if Type == DT.TYPE_HOB_SECTION or \
            Type == DT.TYPE_EVENT_SECTION or \
            Type == DT.TYPE_BOOTMODE_SECTION:
@@ -123,8 +139,7 @@ class InfSpecialCommentObject(InfSectionCommonDef):
         return self.SpecialComments
 
 
-
-## ErrorInInf
+# ErrorInInf
 #
 # An encapsulate of Error for INF parser.
 #
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py
index 0e8fc7d98b3f..fe75340e8f37 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define class objects of INF file [Packages] section.
 # It will consumed by InfParser.
 #
@@ -19,38 +19,43 @@ from Library.Misc import Sdict
 from Library.ParserValidate import IsValidPath
 from Library.ExpressionValidate import IsValidFeatureFlagExp
 
+
 class InfPackageItem():
     def __init__(self,
-                 PackageName = '',
-                 FeatureFlagExp = '',
-                 HelpString = ''):
-        self.PackageName    = PackageName
+                 PackageName='',
+                 FeatureFlagExp='',
+                 HelpString=''):
+        self.PackageName = PackageName
         self.FeatureFlagExp = FeatureFlagExp
-        self.HelpString     = HelpString
-        self.SupArchList    = []
+        self.HelpString = HelpString
+        self.SupArchList = []
 
     def SetPackageName(self, PackageName):
         self.PackageName = PackageName
+
     def GetPackageName(self):
         return self.PackageName
 
     def SetFeatureFlagExp(self, FeatureFlagExp):
         self.FeatureFlagExp = FeatureFlagExp
+
     def GetFeatureFlagExp(self):
         return self.FeatureFlagExp
 
     def SetHelpString(self, HelpString):
         self.HelpString = HelpString
+
     def GetHelpString(self):
         return self.HelpString
 
     def SetSupArchList(self, SupArchList):
         self.SupArchList = SupArchList
+
     def GetSupArchList(self):
         return self.SupArchList
 
 
-## INF package section
+# INF package section
 #
 #
 #
@@ -60,11 +65,11 @@ class InfPackageObject():
         #
         # Macro defined in this section should be only used in this section.
         #
-        self.Macros         = {}
+        self.Macros = {}
 
-    def SetPackages(self, PackageData, Arch = None):
+    def SetPackages(self, PackageData, Arch=None):
         IsValidFileFlag = False
-        SupArchList     = []
+        SupArchList = []
         for ArchItem in Arch:
             #
             # Validate Arch
@@ -94,7 +99,8 @@ class InfPackageObject():
                 else:
                     Logger.Error("InfParser",
                                  ToolError.FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID%(PackageItem[0]),
+                                 ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID % (
+                                     PackageItem[0]),
                                  File=CurrentLineOfPackItem[2],
                                  Line=CurrentLineOfPackItem[1],
                                  ExtraData=CurrentLineOfPackItem[0])
@@ -119,7 +125,8 @@ class InfPackageObject():
                 if not FeatureFlagRtv[0]:
                     Logger.Error("InfParser",
                                  ToolError.FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID%(FeatureFlagRtv[1]),
+                                 ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID % (
+                                     FeatureFlagRtv[1]),
                                  File=CurrentLineOfPackItem[2],
                                  Line=CurrentLineOfPackItem[1],
                                  ExtraData=CurrentLineOfPackItem[0])
@@ -176,6 +183,6 @@ class InfPackageObject():
 
         return True
 
-    def GetPackages(self, Arch = None):
+    def GetPackages(self, Arch=None):
         if Arch is None:
             return self.Packages
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py
index fd8065fab5f4..717834a4aec0 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define class objects of INF file [Pcds] section.
 # It will consumed by InfParser.
 #
@@ -33,6 +33,7 @@ from Parser.DecParser import Dec
 
 from Object.Parser.InfPackagesObject import InfPackageItem
 
+
 def ValidateArch(ArchItem, PcdTypeItem1, LineNo, SupArchDict, SupArchList):
     #
     # Validate Arch
@@ -46,7 +47,8 @@ def ValidateArch(ArchItem, PcdTypeItem1, LineNo, SupArchDict, SupArchList):
             if not IsValidArch(ArchItemNew):
                 Logger.Error("InfParser",
                              ToolError.FORMAT_INVALID,
-                             ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (ArchItemNew),
+                             ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (
+                                 ArchItemNew),
                              File=GlobalData.gINF_MODULE_NAME,
                              Line=LineNo,
                              ExtraData=ArchItemNew)
@@ -56,6 +58,7 @@ def ValidateArch(ArchItem, PcdTypeItem1, LineNo, SupArchDict, SupArchList):
 
     return SupArchList, SupArchDict
 
+
 def ParsePcdComment(CommentList, PcdTypeItem, PcdItemObj):
     CommentInsList = []
     PreUsage = None
@@ -67,10 +70,10 @@ def ParsePcdComment(CommentList, PcdTypeItem, PcdItemObj):
     for CommentItem in CommentList:
         Count = Count + 1
         CommentItemUsage, CommentType, CommentString, CommentItemHelpText = ParseComment(CommentItem,
-                                                                             DT.ALL_USAGE_TOKENS,
-                                                                             {},
-                                                                             [],
-                                                                             False)
+                                                                                         DT.ALL_USAGE_TOKENS,
+                                                                                         {},
+                                                                                         [],
+                                                                                         False)
         if CommentType and CommentString:
             pass
 
@@ -162,6 +165,7 @@ def ParsePcdComment(CommentList, PcdTypeItem, PcdItemObj):
 
     return PcdItemObj
 
+
 class InfPcdItemCommentContent():
     def __init__(self):
         #
@@ -175,15 +179,17 @@ class InfPcdItemCommentContent():
 
     def SetUsageItem(self, UsageItem):
         self.UsageItem = UsageItem
+
     def GetUsageItem(self):
         return self.UsageItem
 
     def SetHelpStringItem(self, HelpStringItem):
         self.HelpStringItem = HelpStringItem
+
     def GetHelpStringItem(self):
         return self.HelpStringItem
 
-## InfPcdItem
+# InfPcdItem
 #
 # This class defined Pcd item used in Module files
 #
@@ -199,6 +205,8 @@ class InfPcdItemCommentContent():
 # @param SkuInfoList:          Input value for SkuInfoList, default is {}
 # @param SupModuleList:        Input value for SupModuleList, default is []
 #
+
+
 class InfPcdItem():
     def __init__(self):
         self.CName = ''
@@ -219,71 +227,85 @@ class InfPcdItem():
 
     def SetCName(self, CName):
         self.CName = CName
+
     def GetCName(self):
         return self.CName
 
     def SetToken(self, Token):
         self.Token = Token
+
     def GetToken(self):
         return self.Token
 
     def SetTokenSpaceGuidCName(self, TokenSpaceGuidCName):
         self.TokenSpaceGuidCName = TokenSpaceGuidCName
+
     def GetTokenSpaceGuidCName(self):
         return self.TokenSpaceGuidCName
 
     def SetTokenSpaceGuidValue(self, TokenSpaceGuidValue):
         self.TokenSpaceGuidValue = TokenSpaceGuidValue
+
     def GetTokenSpaceGuidValue(self):
         return self.TokenSpaceGuidValue
 
     def SetDatumType(self, DatumType):
         self.DatumType = DatumType
+
     def GetDatumType(self):
         return self.DatumType
 
     def SetMaxDatumSize(self, MaxDatumSize):
         self.MaxDatumSize = MaxDatumSize
+
     def GetMaxDatumSize(self):
         return self.MaxDatumSize
 
     def SetDefaultValue(self, DefaultValue):
         self.DefaultValue = DefaultValue
+
     def GetDefaultValue(self):
         return self.DefaultValue
 
     def SetPcdErrorsList(self, PcdErrorsList):
         self.PcdErrorsList = PcdErrorsList
+
     def GetPcdErrorsList(self):
         return self.PcdErrorsList
 
     def SetItemType(self, ItemType):
         self.ItemType = ItemType
+
     def GetItemType(self):
         return self.ItemType
 
     def SetSupModuleList(self, SupModuleList):
         self.SupModuleList = SupModuleList
+
     def GetSupModuleList(self):
         return self.SupModuleList
 
     def SetHelpStringList(self, HelpStringList):
         self.HelpStringList = HelpStringList
+
     def GetHelpStringList(self):
         return self.HelpStringList
 
     def SetFeatureFlagExp(self, FeatureFlagExp):
         self.FeatureFlagExp = FeatureFlagExp
+
     def GetFeatureFlagExp(self):
         return self.FeatureFlagExp
 
     def SetSupportArchList(self, ArchList):
         self.SupArchList = ArchList
+
     def GetSupportArchList(self):
         return self.SupArchList
 
     def SetOffset(self, Offset):
         self.Offset = Offset
+
     def GetOffset(self):
         return self.Offset
 
@@ -297,6 +319,8 @@ class InfPcdItem():
 #
 #
 #
+
+
 class InfPcdObject():
     def __init__(self, FileName):
         self.Pcds = Sdict()
@@ -315,7 +339,8 @@ class InfPcdObject():
         SupArchDict = {}
         PcdTypeItem = ''
         for (PcdTypeItem1, ArchItem, LineNo) in KeysList:
-            SupArchList, SupArchDict = ValidateArch(ArchItem, PcdTypeItem1, LineNo, SupArchDict, SupArchList)
+            SupArchList, SupArchDict = ValidateArch(
+                ArchItem, PcdTypeItem1, LineNo, SupArchDict, SupArchList)
 
             #
             # Validate PcdType
@@ -326,7 +351,8 @@ class InfPcdObject():
                 if not IsValidPcdType(PcdTypeItem1):
                     Logger.Error("InfParser",
                                  ToolError.FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_PCD_SECTION_TYPE_ERROR % (DT.PCD_USAGE_TYPE_LIST_OF_MODULE),
+                                 ST.ERR_INF_PARSER_PCD_SECTION_TYPE_ERROR % (
+                                     DT.PCD_USAGE_TYPE_LIST_OF_MODULE),
                                  File=GlobalData.gINF_MODULE_NAME,
                                  Line=LineNo,
                                  ExtraData=PcdTypeItem1)
@@ -341,14 +367,16 @@ class InfPcdObject():
                 PcdItem = PcdItem[0]
 
                 if CommentList is not None and len(CommentList) != 0:
-                    PcdItemObj = ParsePcdComment(CommentList, PcdTypeItem, PcdItemObj)
+                    PcdItemObj = ParsePcdComment(
+                        CommentList, PcdTypeItem, PcdItemObj)
                 else:
                     CommentItemIns = InfPcdItemCommentContent()
                     CommentItemIns.SetUsageItem(DT.ITEM_UNDEFINED)
                     PcdItemObj.SetHelpStringList([CommentItemIns])
 
                 if len(PcdItem) >= 1 and len(PcdItem) <= 3:
-                    PcdItemObj = SetPcdName(PcdItem, CurrentLineOfPcdItem, PcdItemObj)
+                    PcdItemObj = SetPcdName(
+                        PcdItem, CurrentLineOfPcdItem, PcdItemObj)
 
                 if len(PcdItem) >= 2 and len(PcdItem) <= 3:
                     #
@@ -385,7 +413,8 @@ class InfPcdObject():
                     if not FeatureFlagRtv[0]:
                         Logger.Error("InfParser",
                                      ToolError.FORMAT_INVALID,
-                                     ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID % (FeatureFlagRtv[1]),
+                                     ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID % (
+                                         FeatureFlagRtv[1]),
                                      File=CurrentLineOfPcdItem[2],
                                      Line=CurrentLineOfPcdItem[1],
                                      ExtraData=CurrentLineOfPcdItem[0])
@@ -429,9 +458,11 @@ class InfPcdObject():
                 CommentItemIns = InfPcdItemCommentContent()
                 CommentItemIns.SetHelpStringItem(CommentString)
                 CommentItemIns.SetUsageItem(CommentString)
-                PcdItemObj.SetHelpStringList(PcdItemObj.GetHelpStringList() + [CommentItemIns])
+                PcdItemObj.SetHelpStringList(
+                    PcdItemObj.GetHelpStringList() + [CommentItemIns])
                 if PcdItemObj.GetValidUsage():
-                    PcdItemObj.SetValidUsage(PcdItemObj.GetValidUsage() + DT.TAB_VALUE_SPLIT + CommentString)
+                    PcdItemObj.SetValidUsage(
+                        PcdItemObj.GetValidUsage() + DT.TAB_VALUE_SPLIT + CommentString)
                 else:
                     PcdItemObj.SetValidUsage(CommentString)
 
@@ -444,10 +475,10 @@ class InfPcdObject():
             # Set Value/DatumType/OffSet/Token
             #
             PcdItemObj = SetValueDatumTypeMaxSizeToken(PcdItem,
-                                                      CurrentLineOfPcdItem,
-                                                      PcdItemObj,
-                                                      KeysList[0][1],
-                                                      PackageInfo)
+                                                       CurrentLineOfPcdItem,
+                                                       PcdItemObj,
+                                                       KeysList[0][1],
+                                                       PackageInfo)
 
             PcdTypeItem = KeysList[0][0]
             if (PcdTypeItem, PcdItemObj) in self.Pcds:
@@ -462,6 +493,7 @@ class InfPcdObject():
     def GetPcds(self):
         return self.Pcds
 
+
 def ParserPcdInfoInDec(String):
     ValueList = GetSplitValueList(String, DT.TAB_VALUE_SPLIT, 3)
 
@@ -470,6 +502,7 @@ def ParserPcdInfoInDec(String):
     #
     return ValueList[2], ValueList[3]
 
+
 def SetValueDatumTypeMaxSizeToken(PcdItem, CurrentLineOfPcdItem, PcdItemObj, Arch, PackageInfo=None):
     #
     # Package information not been generated currently, we need to parser INF file to get information.
@@ -477,7 +510,8 @@ def SetValueDatumTypeMaxSizeToken(PcdItem, CurrentLineOfPcdItem, PcdItemObj, Arc
     if not PackageInfo:
         PackageInfo = []
         InfFileName = CurrentLineOfPcdItem[2]
-        PackageInfoList = GetPackageListInfo(InfFileName, GlobalData.gWORKSPACE, -1)
+        PackageInfoList = GetPackageListInfo(
+            InfFileName, GlobalData.gWORKSPACE, -1)
         for PackageInfoListItem in PackageInfoList:
             PackageInfoIns = InfPackageItem()
             PackageInfoIns.SetPackageName(PackageInfoListItem)
@@ -491,7 +525,8 @@ def SetValueDatumTypeMaxSizeToken(PcdItem, CurrentLineOfPcdItem, PcdItemObj, Arc
         #
         # Open DEC file to get information
         #
-        FullFileName = os.path.normpath(os.path.realpath(os.path.join(GlobalData.gWORKSPACE, PackageName)))
+        FullFileName = os.path.normpath(os.path.realpath(
+            os.path.join(GlobalData.gWORKSPACE, PackageName)))
 
         DecParser = None
         if FullFileName not in GlobalData.gPackageDict:
@@ -506,7 +541,7 @@ def SetValueDatumTypeMaxSizeToken(PcdItem, CurrentLineOfPcdItem, PcdItemObj, Arc
         DecPcdsDict = DecParser.GetPcdSectionObject().ValueDict
         for Key in DecPcdsDict.keys():
             if (Key[0] == 'PCDSDYNAMICEX' and PcdItemObj.GetItemType() == 'PcdEx') and \
-                (Key[1] == 'COMMON' or Key[1] == Arch):
+                    (Key[1] == 'COMMON' or Key[1] == Arch):
                 for PcdInDec in DecPcdsDict[Key]:
                     if PcdInDec.TokenCName == PcdItemObj.CName and \
                        PcdInDec.TokenSpaceGuidCName == PcdItemObj.TokenSpaceGuidCName:
@@ -516,7 +551,7 @@ def SetValueDatumTypeMaxSizeToken(PcdItem, CurrentLineOfPcdItem, PcdItemObj, Arc
                         PcdItemObj.SetDefaultValue(PcdInDec.DefaultValue)
 
             if (Key[0] == 'PCDSPATCHABLEINMODULE' and PcdItemObj.GetItemType() == 'PatchPcd') and \
-           (Key[1] == 'COMMON' or Key[1] == Arch):
+                    (Key[1] == 'COMMON' or Key[1] == Arch):
                 for PcdInDec in DecPcdsDict[Key]:
                     if PcdInDec.TokenCName == PcdItemObj.CName and \
                        PcdInDec.TokenSpaceGuidCName == PcdItemObj.TokenSpaceGuidCName:
@@ -526,7 +561,8 @@ def SetValueDatumTypeMaxSizeToken(PcdItem, CurrentLineOfPcdItem, PcdItemObj, Arc
 
         if PcdItemObj.GetDatumType() == 'VOID*':
             if len(PcdItem) > 1:
-                PcdItemObj.SetMaxDatumSize('%s' % (len(GetSplitValueList(PcdItem[1], DT.TAB_COMMA_SPLIT))))
+                PcdItemObj.SetMaxDatumSize(
+                    '%s' % (len(GetSplitValueList(PcdItem[1], DT.TAB_COMMA_SPLIT))))
 
         DecGuidsDict = DecParser.GetGuidSectionObject().ValueDict
         for Key in DecGuidsDict.keys():
@@ -555,35 +591,38 @@ def SetValueDatumTypeMaxSizeToken(PcdItem, CurrentLineOfPcdItem, PcdItemObj, Arc
             PcdItemObj.SetDefaultValue(PcdItem[1])
         else:
             Logger.Error("InfParser",
-                     ToolError.FORMAT_INVALID,
-                     ST.ERR_ASBUILD_PCD_VALUE_INVALID % ("\"" + PcdItem[1] + "\"", "\"" +
-                                                       PcdItemObj.GetDatumType() + "\""),
-                     File=CurrentLineOfPcdItem[2],
-                     Line=CurrentLineOfPcdItem[1],
-                     ExtraData=CurrentLineOfPcdItem[0])
+                         ToolError.FORMAT_INVALID,
+                         ST.ERR_ASBUILD_PCD_VALUE_INVALID % ("\"" + PcdItem[1] + "\"", "\"" +
+                                                             PcdItemObj.GetDatumType() + "\""),
+                         File=CurrentLineOfPcdItem[2],
+                         Line=CurrentLineOfPcdItem[1],
+                         ExtraData=CurrentLineOfPcdItem[0])
         #
         # validate offset
         #
         if PcdItemObj.GetItemType().upper() == DT.TAB_INF_PATCH_PCD.upper():
             if not IsHexDigitUINT32(PcdItem[2]):
                 Logger.Error("InfParser",
-                         ToolError.FORMAT_INVALID,
-                         ST.ERR_ASBUILD_PCD_OFFSET_FORMAT_INVALID % ("\"" + PcdItem[2] + "\""),
-                         File=CurrentLineOfPcdItem[2],
-                         Line=CurrentLineOfPcdItem[1],
-                         ExtraData=CurrentLineOfPcdItem[0])
+                             ToolError.FORMAT_INVALID,
+                             ST.ERR_ASBUILD_PCD_OFFSET_FORMAT_INVALID % (
+                                 "\"" + PcdItem[2] + "\""),
+                             File=CurrentLineOfPcdItem[2],
+                             Line=CurrentLineOfPcdItem[1],
+                             ExtraData=CurrentLineOfPcdItem[0])
             PcdItemObj.SetOffset(PcdItem[2])
 
     if PcdItemObj.GetToken() == '' or PcdItemObj.GetDatumType() == '':
         Logger.Error("InfParser",
                      ToolError.FORMAT_INVALID,
-                     ST.ERR_ASBUILD_PCD_DECLARITION_MISS % ("\"" + PcdItem[0] + "\""),
+                     ST.ERR_ASBUILD_PCD_DECLARITION_MISS % (
+                         "\"" + PcdItem[0] + "\""),
                      File=CurrentLineOfPcdItem[2],
                      Line=CurrentLineOfPcdItem[1],
                      ExtraData=CurrentLineOfPcdItem[0])
 
     return PcdItemObj
 
+
 def ValidatePcdValueOnDatumType(Value, Type):
 
     Value = Value.strip()
@@ -618,11 +657,11 @@ def ValidatePcdValueOnDatumType(Value, Type):
 
         if not ReIsValidUint8z.match(Value) and Type == 'UINT8':
             return False
-        elif not ReIsValidUint16z.match(Value) and  Type == 'UINT16':
+        elif not ReIsValidUint16z.match(Value) and Type == 'UINT16':
             return False
-        elif not ReIsValidUint32z.match(Value) and  Type == 'UINT32':
+        elif not ReIsValidUint32z.match(Value) and Type == 'UINT32':
             return False
-        elif not ReIsValidUint64z.match(Value) and  Type == 'UINT64':
+        elif not ReIsValidUint64z.match(Value) and Type == 'UINT64':
             return False
     else:
         #
@@ -632,6 +671,7 @@ def ValidatePcdValueOnDatumType(Value, Type):
 
     return True
 
+
 def SetPcdName(PcdItem, CurrentLineOfPcdItem, PcdItemObj):
     #
     # Only PCD Name specified
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py
index 1968c365732d..f73506623c20 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define class objects of INF file [Ppis] section.
 # It will consumed by InfParser.
 #
@@ -20,6 +20,7 @@ import Logger.Log as Logger
 from Logger import ToolError
 from Logger import StringTable as ST
 
+
 def ParsePpiComment(CommentsList, InfPpiItemObj):
     PreNotify = None
     PreUsage = None
@@ -30,14 +31,14 @@ def ParsePpiComment(CommentsList, InfPpiItemObj):
     for CommentItem in CommentsList:
         Count = Count + 1
         CommentItemUsage, \
-        CommentItemNotify, \
-        CommentItemString, \
-        CommentItemHelpText = \
-                ParseComment(CommentItem,
-                             DT.ALL_USAGE_TOKENS,
-                             DT.PPI_NOTIFY_TOKENS,
-                             ['PPI'],
-                             False)
+            CommentItemNotify, \
+            CommentItemString, \
+            CommentItemHelpText = \
+            ParseComment(CommentItem,
+                         DT.ALL_USAGE_TOKENS,
+                         DT.PPI_NOTIFY_TOKENS,
+                         ['PPI'],
+                         False)
 
         #
         # To avoid PyLint error
@@ -134,6 +135,7 @@ def ParsePpiComment(CommentsList, InfPpiItemObj):
 
     return InfPpiItemObj
 
+
 class InfPpiItemCommentContent():
     def __init__(self):
         #
@@ -149,50 +151,60 @@ class InfPpiItemCommentContent():
 
     def SetUsage(self, UsageItem):
         self.UsageItem = UsageItem
+
     def GetUsage(self):
         return self.UsageItem
 
     def SetNotify(self, Notify):
         if Notify != DT.ITEM_UNDEFINED:
             self.Notify = 'true'
+
     def GetNotify(self):
         return self.Notify
 
     def SetHelpStringItem(self, HelpStringItem):
         self.HelpStringItem = HelpStringItem
+
     def GetHelpStringItem(self):
         return self.HelpStringItem
 
+
 class InfPpiItem():
     def __init__(self):
-        self.Name             = ''
-        self.FeatureFlagExp   = ''
-        self.SupArchList      = []
-        self.CommentList      = []
+        self.Name = ''
+        self.FeatureFlagExp = ''
+        self.SupArchList = []
+        self.CommentList = []
 
     def SetName(self, Name):
         self.Name = Name
+
     def GetName(self):
         return self.Name
 
     def SetSupArchList(self, SupArchList):
         self.SupArchList = SupArchList
+
     def GetSupArchList(self):
         return self.SupArchList
 
     def SetCommentList(self, CommentList):
         self.CommentList = CommentList
+
     def GetCommentList(self):
         return self.CommentList
 
     def SetFeatureFlagExp(self, FeatureFlagExp):
         self.FeatureFlagExp = FeatureFlagExp
+
     def GetFeatureFlagExp(self):
         return self.FeatureFlagExp
 ##
 #
 #
 #
+
+
 class InfPpiObject():
     def __init__(self):
         self.Ppis = Sdict()
@@ -201,7 +213,7 @@ class InfPpiObject():
         #
         self.Macros = {}
 
-    def SetPpi(self, PpiList, Arch = None):
+    def SetPpi(self, PpiList, Arch=None):
         __SupArchList = []
         for ArchItem in Arch:
             #
@@ -228,7 +240,7 @@ class InfPpiObject():
                 if not IsValidCVariableName(Item[0]):
                     Logger.Error("InfParser",
                                  ToolError.FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_INVALID_CNAME%(Item[0]),
+                                 ST.ERR_INF_PARSER_INVALID_CNAME % (Item[0]),
                                  File=CurrentLineOfItem[2],
                                  Line=CurrentLineOfItem[1],
                                  ExtraData=CurrentLineOfItem[0])
@@ -265,7 +277,8 @@ class InfPpiObject():
                 if not FeatureFlagRtv[0]:
                     Logger.Error("InfParser",
                                  ToolError.FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID%(FeatureFlagRtv[1]),
+                                 ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID % (
+                                     FeatureFlagRtv[1]),
                                  File=CurrentLineOfItem[2],
                                  Line=CurrentLineOfItem[1],
                                  ExtraData=CurrentLineOfItem[0])
@@ -332,6 +345,5 @@ class InfPpiObject():
 
         return True
 
-
     def GetPpi(self):
         return self.Ppis
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py
index 80bfca607754..cf3798dc5317 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define class objects of INF file [Protocols] section.
 # It will consumed by InfParser.
 #
@@ -21,6 +21,7 @@ from Object.Parser.InfMisc import ErrorInInf
 from Library import DataType as DT
 from Logger import StringTable as ST
 
+
 def ParseProtocolComment(CommentsList, InfProtocolItemObj):
     CommentInsList = []
     PreUsage = None
@@ -31,14 +32,14 @@ def ParseProtocolComment(CommentsList, InfProtocolItemObj):
     for CommentItem in CommentsList:
         Count = Count + 1
         CommentItemUsage, \
-        CommentItemNotify, \
-        CommentItemString, \
-        CommentItemHelpText = \
-                ParseComment(CommentItem,
-                             DT.PROTOCOL_USAGE_TOKENS,
-                             DT.PROTOCOL_NOTIFY_TOKENS,
-                             ['PROTOCOL'],
-                             False)
+            CommentItemNotify, \
+            CommentItemString, \
+            CommentItemHelpText = \
+            ParseComment(CommentItem,
+                         DT.PROTOCOL_USAGE_TOKENS,
+                         DT.PROTOCOL_NOTIFY_TOKENS,
+                         ['PROTOCOL'],
+                         False)
 
         if CommentItemString:
             pass
@@ -123,6 +124,7 @@ def ParseProtocolComment(CommentsList, InfProtocolItemObj):
 
     return InfProtocolItemObj
 
+
 class InfProtocolItemCommentContent():
     def __init__(self):
         #
@@ -138,20 +140,24 @@ class InfProtocolItemCommentContent():
 
     def SetUsageItem(self, UsageItem):
         self.UsageItem = UsageItem
+
     def GetUsageItem(self):
         return self.UsageItem
 
     def SetNotify(self, Notify):
         if Notify != DT.ITEM_UNDEFINED:
             self.Notify = 'true'
+
     def GetNotify(self):
         return self.Notify
 
     def SetHelpStringItem(self, HelpStringItem):
         self.HelpStringItem = HelpStringItem
+
     def GetHelpStringItem(self):
         return self.HelpStringItem
 
+
 class InfProtocolItem():
     def __init__(self):
         self.Name = ''
@@ -161,21 +167,25 @@ class InfProtocolItem():
 
     def SetName(self, Name):
         self.Name = Name
+
     def GetName(self):
         return self.Name
 
     def SetFeatureFlagExp(self, FeatureFlagExp):
         self.FeatureFlagExp = FeatureFlagExp
+
     def GetFeatureFlagExp(self):
         return self.FeatureFlagExp
 
     def SetSupArchList(self, SupArchList):
         self.SupArchList = SupArchList
+
     def GetSupArchList(self):
         return self.SupArchList
 
     def SetCommentList(self, CommentList):
         self.CommentList = CommentList
+
     def GetCommentList(self):
         return self.CommentList
 
@@ -183,6 +193,8 @@ class InfProtocolItem():
 #
 #
 #
+
+
 class InfProtocolObject():
     def __init__(self):
         self.Protocols = Sdict()
@@ -191,7 +203,7 @@ class InfProtocolObject():
         #
         self.Macros = {}
 
-    def SetProtocol(self, ProtocolContent, Arch = None,):
+    def SetProtocol(self, ProtocolContent, Arch=None,):
         __SupArchList = []
         for ArchItem in Arch:
             #
@@ -209,7 +221,8 @@ class InfProtocolObject():
             if len(Item) == 3:
                 CommentsList = Item[1]
             CurrentLineOfItem = Item[2]
-            LineInfo = (CurrentLineOfItem[2], CurrentLineOfItem[1], CurrentLineOfItem[0])
+            LineInfo = (
+                CurrentLineOfItem[2], CurrentLineOfItem[1], CurrentLineOfItem[0])
             Item = Item[0]
             InfProtocolItemObj = InfProtocolItem()
             if len(Item) >= 1 and len(Item) <= 2:
@@ -217,7 +230,7 @@ class InfProtocolObject():
                 # Only CName contained
                 #
                 if not IsValidCVariableName(Item[0]):
-                    ErrorInInf(ST.ERR_INF_PARSER_INVALID_CNAME%(Item[0]),
+                    ErrorInInf(ST.ERR_INF_PARSER_INVALID_CNAME % (Item[0]),
                                LineInfo=LineInfo)
                 if (Item[0] != ''):
                     InfProtocolItemObj.SetName(Item[0])
@@ -239,7 +252,7 @@ class InfProtocolObject():
                 #
                 FeatureFlagRtv = IsValidFeatureFlagExp(Item[1].strip())
                 if not FeatureFlagRtv[0]:
-                    ErrorInInf(ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID%(FeatureFlagRtv[1]),
+                    ErrorInInf(ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID % (FeatureFlagRtv[1]),
                                LineInfo=LineInfo)
                 InfProtocolItemObj.SetFeatureFlagExp(Item[1])
 
@@ -254,7 +267,8 @@ class InfProtocolObject():
             # Get/Set Usage and HelpString for Protocol entry
             #
             if CommentsList is not None and len(CommentsList) != 0:
-                InfProtocolItemObj = ParseProtocolComment(CommentsList, InfProtocolItemObj)
+                InfProtocolItemObj = ParseProtocolComment(
+                    CommentsList, InfProtocolItemObj)
             else:
                 CommentItemIns = InfProtocolItemCommentContent()
                 CommentItemIns.SetUsageItem(DT.ITEM_UNDEFINED)
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py
index 75ea209c48ac..5827afe5f50c 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define class objects of INF file [Sources] section.
 # It will consumed by InfParser.
 #
@@ -24,9 +24,11 @@ from Library.Misc import ValidFile
 from Library.ParserValidate import IsValidFamily
 from Library.ParserValidate import IsValidPath
 
-## __GenSourceInstance
+# __GenSourceInstance
 #
 #
+
+
 def GenSourceInstance(Item, CurrentLineOfItem, ItemObj):
 
     IsValidFileFlag = False
@@ -53,7 +55,8 @@ def GenSourceInstance(Item, CurrentLineOfItem, ItemObj):
             if not FeatureFlagRtv[0]:
                 Logger.Error("InfParser",
                              ToolError.FORMAT_INVALID,
-                             ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID%(FeatureFlagRtv[1]),
+                             ST.ERR_INF_PARSER_FEATURE_FLAG_EXP_SYNTAX_INVLID % (
+                                 FeatureFlagRtv[1]),
                              File=CurrentLineOfItem[2],
                              Line=CurrentLineOfItem[1],
                              ExtraData=CurrentLineOfItem[0])
@@ -64,7 +67,8 @@ def GenSourceInstance(Item, CurrentLineOfItem, ItemObj):
             else:
                 Logger.Error("InfParser",
                              ToolError.FORMAT_INVALID,
-                             ST.ERR_INF_PARSER_TOOLCODE_NOT_PERMITTED%(Item[2]),
+                             ST.ERR_INF_PARSER_TOOLCODE_NOT_PERMITTED % (
+                                 Item[2]),
                              File=CurrentLineOfItem[2],
                              Line=CurrentLineOfItem[1],
                              ExtraData=CurrentLineOfItem[0])
@@ -74,7 +78,8 @@ def GenSourceInstance(Item, CurrentLineOfItem, ItemObj):
             else:
                 Logger.Error("InfParser",
                              ToolError.FORMAT_INVALID,
-                             ST.ERR_INF_PARSER_TAGNAME_NOT_PERMITTED%(Item[2]),
+                             ST.ERR_INF_PARSER_TAGNAME_NOT_PERMITTED % (
+                                 Item[2]),
                              File=CurrentLineOfItem[2],
                              Line=CurrentLineOfItem[1],
                              ExtraData=CurrentLineOfItem[0])
@@ -89,7 +94,8 @@ def GenSourceInstance(Item, CurrentLineOfItem, ItemObj):
             else:
                 Logger.Error("InfParser",
                              ToolError.FORMAT_INVALID,
-                             ST.ERR_INF_PARSER_SOURCE_SECTION_FAMILY_INVALID%(Item[1]),
+                             ST.ERR_INF_PARSER_SOURCE_SECTION_FAMILY_INVALID % (
+                                 Item[1]),
                              File=CurrentLineOfItem[2],
                              Line=CurrentLineOfItem[1],
                              ExtraData=CurrentLineOfItem[0])
@@ -97,11 +103,12 @@ def GenSourceInstance(Item, CurrentLineOfItem, ItemObj):
             #
             # Validate file name exist.
             #
-            FullFileName = os.path.normpath(os.path.realpath(os.path.join(GlobalData.gINF_MODULE_DIR, Item[0])))
+            FullFileName = os.path.normpath(os.path.realpath(
+                os.path.join(GlobalData.gINF_MODULE_DIR, Item[0])))
             if not (ValidFile(FullFileName) or ValidFile(Item[0])):
                 Logger.Error("InfParser",
                              ToolError.FORMAT_INVALID,
-                             ST.ERR_FILELIST_EXIST%(Item[0]),
+                             ST.ERR_FILELIST_EXIST % (Item[0]),
                              File=CurrentLineOfItem[2],
                              Line=CurrentLineOfItem[1],
                              ExtraData=CurrentLineOfItem[0])
@@ -115,7 +122,8 @@ def GenSourceInstance(Item, CurrentLineOfItem, ItemObj):
             else:
                 Logger.Error("InfParser",
                              ToolError.FORMAT_INVALID,
-                             ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID%(Item[0]),
+                             ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID % (
+                                 Item[0]),
                              File=CurrentLineOfItem[2],
                              Line=CurrentLineOfItem[1],
                              ExtraData=CurrentLineOfItem[0])
@@ -132,74 +140,86 @@ def GenSourceInstance(Item, CurrentLineOfItem, ItemObj):
 
     return ItemObj
 
-## InfSourcesItemObject()
+# InfSourcesItemObject()
 #
 #
+
+
 class InfSourcesItemObject():
-    def __init__(self, \
-                 SourceFileName = '', \
-                 Family = '', \
-                 TagName = '', \
-                 ToolCode = '', \
-                 FeatureFlagExp = ''):
+    def __init__(self,
+                 SourceFileName='',
+                 Family='',
+                 TagName='',
+                 ToolCode='',
+                 FeatureFlagExp=''):
         self.SourceFileName = SourceFileName
-        self.Family         = Family
-        self.TagName        = TagName
-        self.ToolCode       = ToolCode
+        self.Family = Family
+        self.TagName = TagName
+        self.ToolCode = ToolCode
         self.FeatureFlagExp = FeatureFlagExp
-        self.HeaderString   = ''
-        self.TailString     = ''
-        self.SupArchList    = []
+        self.HeaderString = ''
+        self.TailString = ''
+        self.SupArchList = []
 
     def SetSourceFileName(self, SourceFilename):
         self.SourceFileName = SourceFilename
+
     def GetSourceFileName(self):
         return self.SourceFileName
 
     def SetFamily(self, Family):
         self.Family = Family
+
     def GetFamily(self):
         return self.Family
 
     def SetTagName(self, TagName):
         self.TagName = TagName
+
     def GetTagName(self):
         return self.TagName
 
     def SetToolCode(self, ToolCode):
         self.ToolCode = ToolCode
+
     def GetToolCode(self):
         return self.ToolCode
 
     def SetFeatureFlagExp(self, FeatureFlagExp):
         self.FeatureFlagExp = FeatureFlagExp
+
     def GetFeatureFlagExp(self):
         return self.FeatureFlagExp
 
     def SetHeaderString(self, HeaderString):
         self.HeaderString = HeaderString
+
     def GetHeaderString(self):
         return self.HeaderString
 
     def SetTailString(self, TailString):
         self.TailString = TailString
+
     def GetTailString(self):
         return self.TailString
 
     def SetSupArchList(self, SupArchList):
         self.SupArchList = SupArchList
+
     def GetSupArchList(self):
         return self.SupArchList
 ##
 #
 #
 #
+
+
 class InfSourcesObject(InfSectionCommonDef):
     def __init__(self):
         self.Sources = Sdict()
         InfSectionCommonDef.__init__(self)
 
-    def SetSources(self, SourceList, Arch = None):
+    def SetSources(self, SourceList, Arch=None):
         __SupArchList = []
         for ArchItem in Arch:
             #
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py b/BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py
index ce017dbebb45..bab6ecca94b8 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define class objects of INF file [UserExtension] section.
 # It will consumed by InfParser.
 #
@@ -17,33 +17,38 @@ from Library import GlobalData
 
 from Library.Misc import Sdict
 
+
 class InfUserExtensionItem():
     def __init__(self,
-                 Content = '',
-                 UserId = '',
-                 IdString = ''):
-        self.Content  = Content
-        self.UserId   = UserId
+                 Content='',
+                 UserId='',
+                 IdString=''):
+        self.Content = Content
+        self.UserId = UserId
         self.IdString = IdString
         self.SupArchList = []
 
     def SetContent(self, Content):
         self.Content = Content
+
     def GetContent(self):
         return self.Content
 
     def SetUserId(self, UserId):
         self.UserId = UserId
+
     def GetUserId(self):
         return self.UserId
 
     def SetIdString(self, IdString):
         self.IdString = IdString
+
     def GetIdString(self):
         return self.IdString
 
     def SetSupArchList(self, SupArchList):
         self.SupArchList = SupArchList
+
     def GetSupArchList(self):
         return self.SupArchList
 
@@ -51,6 +56,8 @@ class InfUserExtensionItem():
 #
 #
 #
+
+
 class InfUserExtensionObject():
     def __init__(self):
         self.UserExtension = Sdict()
@@ -111,8 +118,9 @@ class InfUserExtensionObject():
                 #
                 Logger.Error('InfParser',
                              ToolError.FORMAT_INVALID,
-                             ST.ERR_INF_PARSER_UE_SECTION_DUPLICATE_ERROR%\
-                             (IdContentItem[0] + '.' + IdContentItem[1] + '.' + IdContentItem[2]),
+                             ST.ERR_INF_PARSER_UE_SECTION_DUPLICATE_ERROR %
+                             (IdContentItem[0] + '.' +
+                              IdContentItem[1] + '.' + IdContentItem[2]),
                              File=GlobalData.gINF_MODULE_NAME,
                              Line=LineNo,
                              ExtraData=None)
diff --git a/BaseTools/Source/Python/UPT/Object/Parser/__init__.py b/BaseTools/Source/Python/UPT/Object/Parser/__init__.py
index 268ce7ca1e99..3fec9ece77c0 100644
--- a/BaseTools/Source/Python/UPT/Object/Parser/__init__.py
+++ b/BaseTools/Source/Python/UPT/Object/Parser/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'Object' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/UPT/Object/__init__.py b/BaseTools/Source/Python/UPT/Object/__init__.py
index 53db4406dc73..45bbfca726d5 100644
--- a/BaseTools/Source/Python/UPT/Object/__init__.py
+++ b/BaseTools/Source/Python/UPT/Object/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'Object' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/UPT/Parser/DecParser.py b/BaseTools/Source/Python/UPT/Parser/DecParser.py
index 8dfa12d8268b..4b0c6f540935 100644
--- a/BaseTools/Source/Python/UPT/Parser/DecParser.py
+++ b/BaseTools/Source/Python/UPT/Parser/DecParser.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to parse DEC file. It will consumed by DecParser
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -7,7 +7,7 @@
 '''
 DecParser
 '''
-## Import modules
+# Import modules
 #
 import Logger.Log as Logger
 from Logger.ToolError import FILE_PARSE_FAILURE
@@ -60,6 +60,8 @@ from Library.CommentParsing import ParsePcdErrorCode
 ##
 # _DecBase class for parsing
 #
+
+
 class _DecBase:
     def __init__(self, RawData):
         self._RawData = RawData
@@ -76,20 +78,20 @@ class _DecBase:
     def GetLocalMacro(self):
         return self._LocalMacro
 
-    ## BlockStart
+    # BlockStart
     #
     # Called if a new section starts
     #
     def BlockStart(self):
         self._LocalMacro = {}
 
-    ## _CheckReDefine
+    # _CheckReDefine
     #
     # @param Key: to be checked if multi-defined
     # @param Scope: Format: [[SectionName, Arch], ...].
     #               If scope is none, use global scope
     #
-    def _CheckReDefine(self, Key, Scope = None):
+    def _CheckReDefine(self, Key, Scope=None):
         if not Scope:
             Scope = self._RawData.CurrentScope
             return
@@ -113,7 +115,8 @@ class _DecBase:
                         # Key in common cannot be redefined in other arches
                         # [:-1] means stripping arch info
                         if Other[:-1] == SubValue[:-1]:
-                            self._LoggerError(ST.ERR_DECPARSE_REDEFINE % (Key, Value[1]))
+                            self._LoggerError(
+                                ST.ERR_DECPARSE_REDEFINE % (Key, Value[1]))
                             return
                     continue
                 CommonScope = []
@@ -123,11 +126,12 @@ class _DecBase:
                 # Cannot be redefined if this key already defined in COMMON Or defined in same arch
                 #
                 if SubValue in Value[0] or CommonScope in Value[0]:
-                    self._LoggerError(ST.ERR_DECPARSE_REDEFINE % (Key, Value[1]))
+                    self._LoggerError(ST.ERR_DECPARSE_REDEFINE %
+                                      (Key, Value[1]))
                     return
         self._ItemDict[Key].append([SecArch, self._RawData.LineIndex])
 
-    ## CheckRequiredFields
+    # CheckRequiredFields
     # Some sections need to check if some fields exist, define section for example
     # Derived class can re-implement, top parser will call this function after all parsing done
     #
@@ -136,7 +140,7 @@ class _DecBase:
             pass
         return True
 
-    ## IsItemRequired
+    # IsItemRequired
     # In DEC spec, sections must have at least one statement except user
     # extension.
     # For example: "[guids" [<attribs>] "]" <EOL> <statements>+
@@ -149,23 +153,23 @@ class _DecBase:
 
     def _LoggerError(self, ErrorString):
         Logger.Error(TOOL_NAME, FILE_PARSE_FAILURE, File=self._RawData.Filename,
-                     Line = self._RawData.LineIndex,
+                     Line=self._RawData.LineIndex,
                      ExtraData=ErrorString + ST.ERR_DECPARSE_LINE % self._RawData.CurrentLine)
 
     def _ReplaceMacro(self, String):
         if gMACRO_PATTERN.findall(String):
             String = ReplaceMacro(String, self._LocalMacro, False,
-                                  FileName = self._RawData.Filename,
-                                  Line = ['', self._RawData.LineIndex])
+                                  FileName=self._RawData.Filename,
+                                  Line=['', self._RawData.LineIndex])
             String = ReplaceMacro(String, self._RawData.Macros, False,
-                                  FileName = self._RawData.Filename,
-                                  Line = ['', self._RawData.LineIndex])
+                                  FileName=self._RawData.Filename,
+                                  Line=['', self._RawData.LineIndex])
             MacroUsed = gMACRO_PATTERN.findall(String)
             if MacroUsed:
                 Logger.Error(TOOL_NAME, FILE_PARSE_FAILURE,
                              File=self._RawData.Filename,
-                             Line = self._RawData.LineIndex,
-                             ExtraData = ST.ERR_DECPARSE_MACRO_RESOLVE % (str(MacroUsed), String))
+                             Line=self._RawData.LineIndex,
+                             ExtraData=ST.ERR_DECPARSE_MACRO_RESOLVE % (str(MacroUsed), String))
         return String
 
     def _MacroParser(self, String):
@@ -184,7 +188,7 @@ class _DecBase:
         else:
             self._LocalMacro[TokenList[0]] = self._ReplaceMacro(TokenList[1])
 
-    ## _ParseItem
+    # _ParseItem
     #
     # Parse specified item, this function must be derived by subclass
     #
@@ -196,14 +200,14 @@ class _DecBase:
         #
         return None
 
-
-    ## _TailCommentStrategy
+    # _TailCommentStrategy
     #
     # This function can be derived to parse tail comment
     # default is it will not consume any lines
     #
     # @param Comment: Comment of current line
     #
+
     def _TailCommentStrategy(self, Comment):
         if Comment:
             pass
@@ -211,7 +215,7 @@ class _DecBase:
             pass
         return False
 
-    ## _StopCurrentParsing
+    # _StopCurrentParsing
     #
     # Called in Parse if current parsing should be stopped when encounter some
     # keyword
@@ -224,7 +228,7 @@ class _DecBase:
             pass
         return Line[0] == DT.TAB_SECTION_START and Line[-1] == DT.TAB_SECTION_END
 
-    ## _TryBackSlash
+    # _TryBackSlash
     #
     # Split comment and DEC content, concatenate lines if end of char is '\'
     #
@@ -281,7 +285,7 @@ class _DecBase:
 
         return CatLine, CommentList
 
-    ## Parse
+    # Parse
     # This is a template method in which other member functions which might
     # override by sub class are called. It is responsible for reading file
     # line by line, and call other member functions to parse. This function
@@ -291,24 +295,24 @@ class _DecBase:
         HeadComments = []
         TailComments = []
 
-        #======================================================================
+        # ======================================================================
         # CurComments may pointer to HeadComments or TailComments
-        #======================================================================
+        # ======================================================================
         CurComments = HeadComments
         CurObj = None
         ItemNum = 0
         FromBuf = False
 
-        #======================================================================
+        # ======================================================================
         # Used to report error information if empty section found
-        #======================================================================
+        # ======================================================================
         Index = self._RawData.LineIndex
         LineStr = self._RawData.CurrentLine
         while not self._RawData.IsEndOfFile() or self._RawData.NextLine:
             if self._RawData.NextLine:
-                #==============================================================
+                # ==============================================================
                 # Have processed line in buffer
-                #==============================================================
+                # ==============================================================
                 Line = self._RawData.NextLine
                 HeadComments.extend(self._RawData.HeadComment)
                 TailComments.extend(self._RawData.TailComment)
@@ -316,16 +320,16 @@ class _DecBase:
                 Comment = ''
                 FromBuf = True
             else:
-                #==============================================================
+                # ==============================================================
                 # No line in buffer, read next line
-                #==============================================================
+                # ==============================================================
                 Line, Comment = CleanString(self._RawData.GetNextLine())
                 FromBuf = False
             if Line:
                 if not FromBuf and CurObj and TailComments:
-                    #==========================================================
+                    # ==========================================================
                     # Set tail comments to previous statement if not empty.
-                    #==========================================================
+                    # ==========================================================
                     CurObj.SetTailComment(CurObj.GetTailComment()+TailComments)
 
                 if not FromBuf:
@@ -335,15 +339,15 @@ class _DecBase:
                 if Comment:
                     Comments = [(Comment, self._RawData.LineIndex)]
 
-                #==============================================================
+                # ==============================================================
                 # Try if last char of line has backslash
-                #==============================================================
+                # ==============================================================
                 Line, Comments = self._TryBackSlash(Line, Comments)
                 CurComments.extend(Comments)
 
-                #==============================================================
+                # ==============================================================
                 # Macro found
-                #==============================================================
+                # ==============================================================
                 if Line.startswith('DEFINE '):
                     self._MacroParser(Line)
                     del HeadComments[:]
@@ -352,10 +356,10 @@ class _DecBase:
                     continue
 
                 if self._StopCurrentParsing(Line):
-                    #==========================================================
+                    # ==========================================================
                     # This line does not belong to this parse,
                     # Save it, can be used by next parse
-                    #==========================================================
+                    # ==========================================================
                     self._RawData.SetNext(Line, HeadComments, TailComments)
                     break
 
@@ -371,9 +375,9 @@ class _DecBase:
                     CurObj = None
             else:
                 if id(CurComments) == id(TailComments):
-                    #==========================================================
+                    # ==========================================================
                     # Check if this comment belongs to tail comment
-                    #==========================================================
+                    # ==========================================================
                     if not self._TailCommentStrategy(Comment):
                         CurComments = HeadComments
 
@@ -384,15 +388,17 @@ class _DecBase:
 
         if self._IsStatementRequired() and ItemNum == 0:
             Logger.Error(
-                    TOOL_NAME, FILE_PARSE_FAILURE,
-                    File=self._RawData.Filename,
-                    Line=Index,
-                    ExtraData=ST.ERR_DECPARSE_STATEMENT_EMPTY % LineStr
+                TOOL_NAME, FILE_PARSE_FAILURE,
+                File=self._RawData.Filename,
+                Line=Index,
+                ExtraData=ST.ERR_DECPARSE_STATEMENT_EMPTY % LineStr
             )
 
-## _DecDefine
+# _DecDefine
 # Parse define section
 #
+
+
 class _DecDefine(_DecBase):
     def __init__(self, RawData):
         _DecBase.__init__(self, RawData)
@@ -404,11 +410,11 @@ class _DecDefine(_DecBase):
         # Each field has a function to validate
         #
         self.DefineValidation = {
-            DT.TAB_DEC_DEFINES_DEC_SPECIFICATION   :   self._SetDecSpecification,
-            DT.TAB_DEC_DEFINES_PACKAGE_NAME        :   self._SetPackageName,
-            DT.TAB_DEC_DEFINES_PACKAGE_GUID        :   self._SetPackageGuid,
-            DT.TAB_DEC_DEFINES_PACKAGE_VERSION     :   self._SetPackageVersion,
-            DT.TAB_DEC_DEFINES_PKG_UNI_FILE        :   self._SetPackageUni,
+            DT.TAB_DEC_DEFINES_DEC_SPECIFICATION:   self._SetDecSpecification,
+            DT.TAB_DEC_DEFINES_PACKAGE_NAME:   self._SetPackageName,
+            DT.TAB_DEC_DEFINES_PACKAGE_GUID:   self._SetPackageGuid,
+            DT.TAB_DEC_DEFINES_PACKAGE_VERSION:   self._SetPackageVersion,
+            DT.TAB_DEC_DEFINES_PKG_UNI_FILE:   self._SetPackageUni,
         }
 
     def BlockStart(self):
@@ -416,7 +422,7 @@ class _DecDefine(_DecBase):
         if self._DefSecNum > 1:
             self._LoggerError(ST.ERR_DECPARSE_DEFINE_MULTISEC)
 
-    ## CheckRequiredFields
+    # CheckRequiredFields
     #
     # Check required fields: DEC_SPECIFICATION, PACKAGE_NAME
     #                        PACKAGE_GUID, PACKAGE_VERSION
@@ -452,14 +458,15 @@ class _DecDefine(_DecBase):
             self.DefineValidation[TokenList[0]](TokenList[1])
 
         DefineItem = DecDefineItemObject()
-        DefineItem.Key   = TokenList[0]
+        DefineItem.Key = TokenList[0]
         DefineItem.Value = TokenList[1]
         self.ItemObject.AddItem(DefineItem, self._RawData.CurrentScope)
         return DefineItem
 
     def _SetDecSpecification(self, Token):
         if self.ItemObject.GetPackageSpecification():
-            self._LoggerError(ST.ERR_DECPARSE_DEFINE_DEFINED % DT.TAB_DEC_DEFINES_DEC_SPECIFICATION)
+            self._LoggerError(ST.ERR_DECPARSE_DEFINE_DEFINED %
+                              DT.TAB_DEC_DEFINES_DEC_SPECIFICATION)
         if not IsValidToken('0[xX][0-9a-fA-F]{8}', Token):
             if not IsValidDecVersionVal(Token):
                 self._LoggerError(ST.ERR_DECPARSE_DEFINE_SPEC)
@@ -467,21 +474,24 @@ class _DecDefine(_DecBase):
 
     def _SetPackageName(self, Token):
         if self.ItemObject.GetPackageName():
-            self._LoggerError(ST.ERR_DECPARSE_DEFINE_DEFINED % DT.TAB_DEC_DEFINES_PACKAGE_NAME)
+            self._LoggerError(ST.ERR_DECPARSE_DEFINE_DEFINED %
+                              DT.TAB_DEC_DEFINES_PACKAGE_NAME)
         if not IsValidWord(Token):
             self._LoggerError(ST.ERR_DECPARSE_DEFINE_PKGNAME)
         self.ItemObject.SetPackageName(Token)
 
     def _SetPackageGuid(self, Token):
         if self.ItemObject.GetPackageGuid():
-            self._LoggerError(ST.ERR_DECPARSE_DEFINE_DEFINED % DT.TAB_DEC_DEFINES_PACKAGE_GUID)
+            self._LoggerError(ST.ERR_DECPARSE_DEFINE_DEFINED %
+                              DT.TAB_DEC_DEFINES_PACKAGE_GUID)
         if not CheckGuidRegFormat(Token):
             self._LoggerError(ST.ERR_DECPARSE_DEFINE_PKGGUID)
         self.ItemObject.SetPackageGuid(Token)
 
     def _SetPackageVersion(self, Token):
         if self.ItemObject.GetPackageVersion():
-            self._LoggerError(ST.ERR_DECPARSE_DEFINE_DEFINED % DT.TAB_DEC_DEFINES_PACKAGE_VERSION)
+            self._LoggerError(ST.ERR_DECPARSE_DEFINE_DEFINED %
+                              DT.TAB_DEC_DEFINES_PACKAGE_VERSION)
         if not IsValidToken(VERSION_PATTERN, Token):
             self._LoggerError(ST.ERR_DECPARSE_DEFINE_PKGVERSION)
         else:
@@ -491,13 +501,16 @@ class _DecDefine(_DecBase):
 
     def _SetPackageUni(self, Token):
         if self.ItemObject.GetPackageUniFile():
-            self._LoggerError(ST.ERR_DECPARSE_DEFINE_DEFINED % DT.TAB_DEC_DEFINES_PKG_UNI_FILE)
+            self._LoggerError(ST.ERR_DECPARSE_DEFINE_DEFINED %
+                              DT.TAB_DEC_DEFINES_PKG_UNI_FILE)
         self.ItemObject.SetPackageUniFile(Token)
 
-## _DecInclude
+# _DecInclude
 #
 # Parse include section
 #
+
+
 class _DecInclude(_DecBase):
     def __init__(self, RawData):
         _DecBase.__init__(self, RawData)
@@ -509,14 +522,17 @@ class _DecInclude(_DecBase):
         if not IsValidPath(Line, self._RawData.PackagePath):
             self._LoggerError(ST.ERR_DECPARSE_INCLUDE % Line)
 
-        Item = DecIncludeItemObject(StripRoot(self._RawData.PackagePath, Line), self._RawData.PackagePath)
+        Item = DecIncludeItemObject(
+            StripRoot(self._RawData.PackagePath, Line), self._RawData.PackagePath)
         self.ItemObject.AddItem(Item, self._RawData.CurrentScope)
         return Item
 
-## _DecLibraryclass
+# _DecLibraryclass
 #
 # Parse library class section
 #
+
+
 class _DecLibraryclass(_DecBase):
     def __init__(self, RawData):
         _DecBase.__init__(self, RawData)
@@ -552,10 +568,12 @@ class _DecLibraryclass(_DecBase):
         self.ItemObject.AddItem(Item, self._RawData.CurrentScope)
         return Item
 
-## _DecPcd
+# _DecPcd
 #
 # Parse PCD section
 #
+
+
 class _DecPcd(_DecBase):
     def __init__(self, RawData):
         _DecBase.__init__(self, RawData)
@@ -621,7 +639,7 @@ class _DecPcd(_DecBase):
         IntToken = int(Token, 0)
         if (Guid, IntToken) in self.TokenMap:
             if self.TokenMap[Guid, IntToken] != CName:
-                self._LoggerError(ST.ERR_DECPARSE_PCD_TOKEN_UNIQUE%(Token))
+                self._LoggerError(ST.ERR_DECPARSE_PCD_TOKEN_UNIQUE % (Token))
         else:
             self.TokenMap[Guid, IntToken] = CName
 
@@ -629,10 +647,12 @@ class _DecPcd(_DecBase):
         self.ItemObject.AddItem(Item, self._RawData.CurrentScope)
         return Item
 
-## _DecGuid
+# _DecGuid
 #
 # Parse GUID, PPI, Protocol section
 #
+
+
 class _DecGuid(_DecBase):
     def __init__(self, RawData):
         _DecBase.__init__(self, RawData)
@@ -640,11 +660,11 @@ class _DecGuid(_DecBase):
         self.PpiObj = DecPpiObject(RawData.Filename)
         self.ProtocolObj = DecProtocolObject(RawData.Filename)
         self.ObjectDict = \
-        {
-            DT.TAB_GUIDS.upper()     :   self.GuidObj,
-            DT.TAB_PPIS.upper()      :   self.PpiObj,
-            DT.TAB_PROTOCOLS.upper() :   self.ProtocolObj
-        }
+            {
+                DT.TAB_GUIDS.upper():   self.GuidObj,
+                DT.TAB_PPIS.upper():   self.PpiObj,
+                DT.TAB_PROTOCOLS.upper():   self.ProtocolObj
+            }
 
     def GetDataObject(self):
         if self._RawData.CurrentScope:
@@ -697,10 +717,12 @@ class _DecGuid(_DecBase):
         ItemObject.AddItem(Item, self._RawData.CurrentScope)
         return Item
 
-## _DecUserExtension
+# _DecUserExtension
 #
 # Parse user extension section
 #
+
+
 class _DecUserExtension(_DecBase):
     def __init__(self, RawData):
         _DecBase.__init__(self, RawData)
@@ -739,12 +761,14 @@ class _DecUserExtension(_DecBase):
                 Item.UserString = Line
         return Item
 
-## Dec
+# Dec
 #
 # Top dec parser
 #
+
+
 class Dec(_DecBase, _DecComments):
-    def __init__(self, DecFile, Parse = True):
+    def __init__(self, DecFile, Parse=True):
         try:
             Content = ConvertSpecialChar(open(DecFile, 'r').readlines())
         except BaseException:
@@ -777,29 +801,29 @@ class Dec(_DecBase, _DecComments):
         self.BinaryHeadComment = []
         self.PcdErrorCommentDict = {}
 
-        self._Define    = _DecDefine(RawData)
-        self._Include   = _DecInclude(RawData)
-        self._Guid      = _DecGuid(RawData)
-        self._LibClass  = _DecLibraryclass(RawData)
-        self._Pcd       = _DecPcd(RawData)
-        self._UserEx    = _DecUserExtension(RawData)
+        self._Define = _DecDefine(RawData)
+        self._Include = _DecInclude(RawData)
+        self._Guid = _DecGuid(RawData)
+        self._LibClass = _DecLibraryclass(RawData)
+        self._Pcd = _DecPcd(RawData)
+        self._UserEx = _DecUserExtension(RawData)
 
         #
         # DEC file supported data types (one type per section)
         #
         self._SectionParser = {
-            DT.TAB_DEC_DEFINES.upper()                     :   self._Define,
-            DT.TAB_INCLUDES.upper()                        :   self._Include,
-            DT.TAB_LIBRARY_CLASSES.upper()                 :   self._LibClass,
-            DT.TAB_GUIDS.upper()                           :   self._Guid,
-            DT.TAB_PPIS.upper()                            :   self._Guid,
-            DT.TAB_PROTOCOLS.upper()                       :   self._Guid,
-            DT.TAB_PCDS_FIXED_AT_BUILD_NULL.upper()        :   self._Pcd,
-            DT.TAB_PCDS_PATCHABLE_IN_MODULE_NULL.upper()   :   self._Pcd,
-            DT.TAB_PCDS_FEATURE_FLAG_NULL.upper()          :   self._Pcd,
-            DT.TAB_PCDS_DYNAMIC_NULL.upper()               :   self._Pcd,
-            DT.TAB_PCDS_DYNAMIC_EX_NULL.upper()            :   self._Pcd,
-            DT.TAB_USER_EXTENSIONS.upper()                 :   self._UserEx
+            DT.TAB_DEC_DEFINES.upper():   self._Define,
+            DT.TAB_INCLUDES.upper():   self._Include,
+            DT.TAB_LIBRARY_CLASSES.upper():   self._LibClass,
+            DT.TAB_GUIDS.upper():   self._Guid,
+            DT.TAB_PPIS.upper():   self._Guid,
+            DT.TAB_PROTOCOLS.upper():   self._Guid,
+            DT.TAB_PCDS_FIXED_AT_BUILD_NULL.upper():   self._Pcd,
+            DT.TAB_PCDS_PATCHABLE_IN_MODULE_NULL.upper():   self._Pcd,
+            DT.TAB_PCDS_FEATURE_FLAG_NULL.upper():   self._Pcd,
+            DT.TAB_PCDS_DYNAMIC_NULL.upper():   self._Pcd,
+            DT.TAB_PCDS_DYNAMIC_EX_NULL.upper():   self._Pcd,
+            DT.TAB_USER_EXTENSIONS.upper():   self._UserEx
         }
 
         if Parse:
@@ -832,29 +856,36 @@ class Dec(_DecBase, _DecComments):
         while not self._RawData.IsEndOfFile():
             self._RawData.CurrentLine = self._RawData.GetNextLine()
             if self._RawData.CurrentLine.startswith(DT.TAB_COMMENT_SPLIT) and \
-                DT.TAB_SECTION_START in self._RawData.CurrentLine and \
-                DT.TAB_SECTION_END in self._RawData.CurrentLine:
-                self._RawData.CurrentLine = self._RawData.CurrentLine.replace(DT.TAB_COMMENT_SPLIT, '').strip()
+                    DT.TAB_SECTION_START in self._RawData.CurrentLine and \
+                    DT.TAB_SECTION_END in self._RawData.CurrentLine:
+                self._RawData.CurrentLine = self._RawData.CurrentLine.replace(
+                    DT.TAB_COMMENT_SPLIT, '').strip()
 
                 if self._RawData.CurrentLine[0] == DT.TAB_SECTION_START and \
-                    self._RawData.CurrentLine[-1] == DT.TAB_SECTION_END:
+                        self._RawData.CurrentLine[-1] == DT.TAB_SECTION_END:
                     RawSection = self._RawData.CurrentLine[1:-1].strip()
                     if RawSection.upper().startswith(DT.TAB_PCD_ERROR.upper()+'.'):
-                        TokenSpaceGuidCName = RawSection.split(DT.TAB_PCD_ERROR+'.')[1].strip()
+                        TokenSpaceGuidCName = RawSection.split(
+                            DT.TAB_PCD_ERROR+'.')[1].strip()
                         continue
 
             if TokenSpaceGuidCName and self._RawData.CurrentLine.startswith(DT.TAB_COMMENT_SPLIT):
-                self._RawData.CurrentLine = self._RawData.CurrentLine.replace(DT.TAB_COMMENT_SPLIT, '').strip()
+                self._RawData.CurrentLine = self._RawData.CurrentLine.replace(
+                    DT.TAB_COMMENT_SPLIT, '').strip()
                 if self._RawData.CurrentLine != '':
                     if DT.TAB_VALUE_SPLIT not in self._RawData.CurrentLine:
-                        self._LoggerError(ST.ERR_DECPARSE_PCDERRORMSG_MISS_VALUE_SPLIT)
+                        self._LoggerError(
+                            ST.ERR_DECPARSE_PCDERRORMSG_MISS_VALUE_SPLIT)
 
-                    PcdErrorNumber, PcdErrorMsg = GetSplitValueList(self._RawData.CurrentLine, DT.TAB_VALUE_SPLIT, 1)
-                    PcdErrorNumber = ParsePcdErrorCode(PcdErrorNumber, self._RawData.Filename, self._RawData.LineIndex)
+                    PcdErrorNumber, PcdErrorMsg = GetSplitValueList(
+                        self._RawData.CurrentLine, DT.TAB_VALUE_SPLIT, 1)
+                    PcdErrorNumber = ParsePcdErrorCode(
+                        PcdErrorNumber, self._RawData.Filename, self._RawData.LineIndex)
                     if not PcdErrorMsg.strip():
                         self._LoggerError(ST.ERR_DECPARSE_PCD_MISS_ERRORMSG)
 
-                    self.PcdErrorCommentDict[(TokenSpaceGuidCName, PcdErrorNumber)] = PcdErrorMsg.strip()
+                    self.PcdErrorCommentDict[(
+                        TokenSpaceGuidCName, PcdErrorNumber)] = PcdErrorMsg.strip()
             else:
                 TokenSpaceGuidCName = ''
 
@@ -873,7 +904,7 @@ class Dec(_DecBase, _DecComments):
                 break
 
             if Comment and Comment.startswith(DT.TAB_SPECIAL_COMMENT) and Comment.find(DT.TAB_HEADER_COMMENT) > 0 \
-                and not Comment[2:Comment.find(DT.TAB_HEADER_COMMENT)].strip():
+                    and not Comment[2:Comment.find(DT.TAB_HEADER_COMMENT)].strip():
                 IsFileHeader = True
                 IsBinaryHeader = False
                 FileHeaderLineIndex = self._RawData.LineIndex
@@ -882,12 +913,12 @@ class Dec(_DecBase, _DecComments):
             # Get license information before '@file'
             #
             if not IsFileHeader and not IsBinaryHeader and Comment and Comment.startswith(DT.TAB_COMMENT_SPLIT) and \
-            DT.TAB_BINARY_HEADER_COMMENT not in Comment:
+                    DT.TAB_BINARY_HEADER_COMMENT not in Comment:
                 self._HeadComment.append((Comment, self._RawData.LineIndex))
 
             if Comment and IsFileHeader and \
-            not(Comment.startswith(DT.TAB_SPECIAL_COMMENT) \
-            and Comment.find(DT.TAB_BINARY_HEADER_COMMENT) > 0):
+                not(Comment.startswith(DT.TAB_SPECIAL_COMMENT)
+                    and Comment.find(DT.TAB_BINARY_HEADER_COMMENT) > 0):
                 self._HeadComment.append((Comment, self._RawData.LineIndex))
             #
             # Double '#' indicates end of header comments
@@ -897,13 +928,14 @@ class Dec(_DecBase, _DecComments):
                 continue
 
             if Comment and Comment.startswith(DT.TAB_SPECIAL_COMMENT) \
-            and Comment.find(DT.TAB_BINARY_HEADER_COMMENT) > 0:
+                    and Comment.find(DT.TAB_BINARY_HEADER_COMMENT) > 0:
                 IsBinaryHeader = True
                 IsFileHeader = False
                 BinaryHeaderLineIndex = self._RawData.LineIndex
 
             if Comment and IsBinaryHeader:
-                self.BinaryHeadComment.append((Comment, self._RawData.LineIndex))
+                self.BinaryHeadComment.append(
+                    (Comment, self._RawData.LineIndex))
             #
             # Double '#' indicates end of header comments
             #
@@ -918,7 +950,7 @@ class Dec(_DecBase, _DecComments):
             self._LoggerError(ST.ERR_BINARY_HEADER_ORDER)
 
         if FileHeaderLineIndex == -1:
-#            self._LoggerError(ST.ERR_NO_SOURCE_HEADER)
+            #            self._LoggerError(ST.ERR_NO_SOURCE_HEADER)
             Logger.Error(TOOL_NAME, FORMAT_INVALID,
                          ST.ERR_NO_SOURCE_HEADER,
                          File=self._RawData.Filename)
@@ -949,7 +981,8 @@ class Dec(_DecBase, _DecComments):
             if Token.upper() != DT.TAB_USER_EXTENSIONS.upper():
                 self._LoggerError(ST.ERR_DECPARSE_SECTION_UE)
             UserExtension = Token.upper()
-            Par.AssertChar(DT.TAB_SPLIT, ST.ERR_DECPARSE_SECTION_UE, self._RawData.LineIndex)
+            Par.AssertChar(DT.TAB_SPLIT, ST.ERR_DECPARSE_SECTION_UE,
+                           self._RawData.LineIndex)
 
             #
             # UserID
@@ -958,7 +991,8 @@ class Dec(_DecBase, _DecComments):
             if not IsValidUserId(Token):
                 self._LoggerError(ST.ERR_DECPARSE_SECTION_UE_USERID)
             UserId = Token
-            Par.AssertChar(DT.TAB_SPLIT, ST.ERR_DECPARSE_SECTION_UE, self._RawData.LineIndex)
+            Par.AssertChar(DT.TAB_SPLIT, ST.ERR_DECPARSE_SECTION_UE,
+                           self._RawData.LineIndex)
             #
             # IdString
             #
@@ -974,7 +1008,7 @@ class Dec(_DecBase, _DecComments):
                     self._LoggerError(ST.ERR_DECPARSE_ARCH)
             ArchList.add(Arch)
             if [UserExtension, UserId, IdString, Arch] not in \
-                self._RawData.CurrentScope:
+                    self._RawData.CurrentScope:
                 self._RawData.CurrentScope.append(
                     [UserExtension, UserId, IdString, Arch]
                 )
@@ -986,7 +1020,7 @@ class Dec(_DecBase, _DecComments):
         if 'COMMON' in ArchList and len(ArchList) > 1:
             self._LoggerError(ST.ERR_DECPARSE_SECTION_COMMON)
 
-    ## Section header parser
+    # Section header parser
     #
     # The section header is always in following format:
     #
@@ -1014,7 +1048,8 @@ class Dec(_DecBase, _DecComments):
         ArchList = set()
         for Item in GetSplitValueList(RawSection, DT.TAB_COMMA_SPLIT):
             if Item == '':
-                self._LoggerError(ST.ERR_DECPARSE_SECTION_SUBEMPTY % self._RawData.CurrentLine)
+                self._LoggerError(ST.ERR_DECPARSE_SECTION_SUBEMPTY %
+                                  self._RawData.CurrentLine)
 
             ItemList = GetSplitValueList(Item, DT.TAB_SPLIT)
             #
@@ -1033,7 +1068,8 @@ class Dec(_DecBase, _DecComments):
                 self._LoggerError(ST.ERR_DECPARSE_SECTION_SUBTOOMANY % Item)
 
             if DT.TAB_PCDS_FEATURE_FLAG_NULL.upper() in SectionNames and len(SectionNames) > 1:
-                self._LoggerError(ST.ERR_DECPARSE_SECTION_FEATUREFLAG % DT.TAB_PCDS_FEATURE_FLAG_NULL)
+                self._LoggerError(ST.ERR_DECPARSE_SECTION_FEATUREFLAG %
+                                  DT.TAB_PCDS_FEATURE_FLAG_NULL)
             #
             # S1 is always Arch
             #
@@ -1053,39 +1089,55 @@ class Dec(_DecBase, _DecComments):
         if 'COMMON' in ArchList and len(ArchList) > 1:
             self._LoggerError(ST.ERR_DECPARSE_SECTION_COMMON)
         if len(SectionNames) == 0:
-            self._LoggerError(ST.ERR_DECPARSE_SECTION_SUBEMPTY % self._RawData.CurrentLine)
+            self._LoggerError(ST.ERR_DECPARSE_SECTION_SUBEMPTY %
+                              self._RawData.CurrentLine)
         if len(SectionNames) != 1:
             for Sec in SectionNames:
                 if not Sec.startswith(DT.TAB_PCDS.upper()):
-                    self._LoggerError(ST.ERR_DECPARSE_SECTION_NAME % str(SectionNames))
+                    self._LoggerError(
+                        ST.ERR_DECPARSE_SECTION_NAME % str(SectionNames))
 
     def GetDefineSectionMacro(self):
         return self._Define.GetLocalMacro()
+
     def GetDefineSectionObject(self):
         return self._Define.GetDataObject()
+
     def GetIncludeSectionObject(self):
         return self._Include.GetDataObject()
+
     def GetGuidSectionObject(self):
         return self._Guid.GetGuidObject()
+
     def GetProtocolSectionObject(self):
         return self._Guid.GetProtocolObject()
+
     def GetPpiSectionObject(self):
         return self._Guid.GetPpiObject()
+
     def GetLibraryClassSectionObject(self):
         return self._LibClass.GetDataObject()
+
     def GetPcdSectionObject(self):
         return self._Pcd.GetDataObject()
+
     def GetUserExtensionSectionObject(self):
         return self._UserEx.GetDataObject()
+
     def GetPackageSpecification(self):
         return self._Define.GetDataObject().GetPackageSpecification()
+
     def GetPackageName(self):
         return self._Define.GetDataObject().GetPackageName()
+
     def GetPackageGuid(self):
         return self._Define.GetDataObject().GetPackageGuid()
+
     def GetPackageVersion(self):
         return self._Define.GetDataObject().GetPackageVersion()
+
     def GetPackageUniFile(self):
         return self._Define.GetDataObject().GetPackageUniFile()
+
     def GetPrivateSections(self):
         return self._Private
diff --git a/BaseTools/Source/Python/UPT/Parser/DecParserMisc.py b/BaseTools/Source/Python/UPT/Parser/DecParserMisc.py
index 27990467d1c5..8b1e949a5f95 100644
--- a/BaseTools/Source/Python/UPT/Parser/DecParserMisc.py
+++ b/BaseTools/Source/Python/UPT/Parser/DecParserMisc.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define helper class and function for DEC parser
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -9,7 +9,7 @@
 DecParserMisc
 '''
 
-## Import modules
+# Import modules
 #
 import os
 import Logger.Log as Logger
@@ -30,9 +30,11 @@ CVAR_PATTERN = '[_a-zA-Z][a-zA-Z0-9_]*'
 PCD_TOKEN_PATTERN = '(0[xX]0*[a-fA-F0-9]{1,8})|([0-9]+)'
 MACRO_PATTERN = '[A-Z][_A-Z0-9]*'
 
-## FileContent
+# FileContent
 # Class to hold DEC file information
 #
+
+
 class FileContent:
     def __init__(self, Filename, FileContent2):
         self.Filename = Filename
@@ -72,7 +74,7 @@ class FileContent:
         return self.LineIndex >= self.FileLines
 
 
-## StripRoot
+# StripRoot
 #
 # Strip root path
 #
@@ -92,7 +94,7 @@ def StripRoot(Root, Path):
         return Path
     return OrigPath
 
-## CleanString
+# CleanString
 #
 # Split comments in a string
 # Remove spaces
@@ -101,7 +103,9 @@ def StripRoot(Root, Path):
 # @param CommentCharacter:  Comment char, used to ignore comment content,
 #                           default is DataType.TAB_COMMENT_SPLIT
 #
-def CleanString(Line, CommentCharacter=TAB_COMMENT_SPLIT, \
+
+
+def CleanString(Line, CommentCharacter=TAB_COMMENT_SPLIT,
                 AllowCppStyleComment=False):
     #
     # remove whitespace
@@ -129,7 +133,7 @@ def CleanString(Line, CommentCharacter=TAB_COMMENT_SPLIT, \
     return Line, Comment
 
 
-## IsValidNumValUint8
+# IsValidNumValUint8
 #
 # Check if Token is NumValUint8: <NumValUint8> ::= {<ShortNum>} {<UINT8>} {<Expression>}
 #
@@ -157,13 +161,15 @@ def IsValidNumValUint8(Token):
     else:
         return True
 
-## IsValidNList
+# IsValidNList
 #
 # Check if Value has the format of <NumValUint8> ["," <NumValUint8>]{0,}
 # <NumValUint8> ::= {<ShortNum>} {<UINT8>} {<Expression>}
 #
 # @param Value: Value to be checked
 #
+
+
 def IsValidNList(Value):
     Par = ParserHelper(Value)
     if Par.End():
@@ -180,12 +186,14 @@ def IsValidNList(Value):
             break
     return Par.End()
 
-## IsValidCArray
+# IsValidCArray
 #
 # check Array is valid
 #
 # @param Array:    The input Array
 #
+
+
 def IsValidCArray(Array):
     Par = ParserHelper(Array)
     if not Par.Expect('{'):
@@ -212,13 +220,15 @@ def IsValidCArray(Array):
             return False
     return Par.End()
 
-## IsValidPcdDatum
+# IsValidPcdDatum
 #
 # check PcdDatum is valid
 #
 # @param Type:    The pcd Type
 # @param Value:    The pcd Value
 #
+
+
 def IsValidPcdDatum(Type, Value):
     if not Value:
         return False, ST.ERR_DECPARSE_PCD_VALUE_EMPTY
@@ -227,13 +237,13 @@ def IsValidPcdDatum(Type, Value):
     if Type not in ["UINT8", "UINT16", "UINT32", "UINT64", "VOID*", "BOOLEAN"]:
         return False, ST.ERR_DECPARSE_PCD_TYPE
     if Type == "VOID*":
-        if not ((Value.startswith('L"') or Value.startswith('"') and \
+        if not ((Value.startswith('L"') or Value.startswith('"') and
                  Value.endswith('"'))
-                or (IsValidCArray(Value)) or (IsValidCFormatGuid(Value)) \
+                or (IsValidCArray(Value)) or (IsValidCFormatGuid(Value))
                 or (IsValidNList(Value)) or (CheckGuidRegFormat(Value))
-               ):
+                ):
             return False, ST.ERR_DECPARSE_PCD_VOID % (Value, Type)
-        RealString = Value[Value.find('"') + 1 :-1]
+        RealString = Value[Value.find('"') + 1:-1]
         if RealString:
             if not IsValidBareCString(RealString):
                 return False, ST.ERR_DECPARSE_PCD_VOID % (Value, Type)
@@ -252,7 +262,7 @@ def IsValidPcdDatum(Type, Value):
         try:
             StrVal = Value
             if Value and not Value.startswith('0x') \
-                and not Value.startswith('0X'):
+                    and not Value.startswith('0X'):
                 Value = Value.lstrip('0')
                 if not Value:
                     return True, ""
@@ -268,8 +278,10 @@ def IsValidPcdDatum(Type, Value):
 
     return True, ""
 
-## ParserHelper
+# ParserHelper
 #
+
+
 class ParserHelper:
     def __init__(self, String, File=''):
         self._String = String
@@ -277,7 +289,7 @@ class ParserHelper:
         self._Index = 0
         self._File = File
 
-    ## End
+    # End
     #
     # End
     #
@@ -285,7 +297,7 @@ class ParserHelper:
         self.__SkipWhitespace()
         return self._Index >= self._StrLen
 
-    ## __SkipWhitespace
+    # __SkipWhitespace
     #
     # Skip whitespace
     #
@@ -295,7 +307,7 @@ class ParserHelper:
                 break
             self._Index += 1
 
-    ## Expect
+    # Expect
     #
     # Expect char in string
     #
@@ -314,7 +326,7 @@ class ParserHelper:
         #
         return False
 
-    ## GetToken
+    # GetToken
     #
     # Get token until encounter StopChar, front whitespace is consumed
     #
@@ -338,7 +350,7 @@ class ParserHelper:
                 LastChar = Char
         return self._String[PreIndex:self._Index]
 
-    ## AssertChar
+    # AssertChar
     #
     # Assert char at current index of string is AssertChar, or will report
     # error message
@@ -352,7 +364,7 @@ class ParserHelper:
             Logger.Error(TOOL_NAME, FILE_PARSE_FAILURE, File=self._File,
                          Line=ErrorLineNum, ExtraData=ErrorString)
 
-    ## AssertEnd
+    # AssertEnd
     #
     # @param ErrorString: ErrorString
     # @param ErrorLineNum: ErrorLineNum
diff --git a/BaseTools/Source/Python/UPT/Parser/InfAsBuiltProcess.py b/BaseTools/Source/Python/UPT/Parser/InfAsBuiltProcess.py
index 992b609120f8..04142396f3e6 100644
--- a/BaseTools/Source/Python/UPT/Parser/InfAsBuiltProcess.py
+++ b/BaseTools/Source/Python/UPT/Parser/InfAsBuiltProcess.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to provide method for process AsBuilt INF file. It will consumed by InfParser
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -7,7 +7,7 @@
 '''
 InfAsBuiltProcess
 '''
-## Import modules
+# Import modules
 #
 
 import os
@@ -27,7 +27,7 @@ from Parser.InfParserMisc import InfExpandMacro
 
 from Library import DataType as DT
 
-## GetLibInstanceInfo
+# GetLibInstanceInfo
 #
 # Get the information from Library Instance INF file.
 #
@@ -35,6 +35,8 @@ from Library import DataType as DT
 # @param WorkSpace. The WorkSpace directory used to combined with INF file path.
 #
 # @return GUID, Version
+
+
 def GetLibInstanceInfo(String, WorkSpace, LineNo, CurrentInfFileName):
 
     FileGuidString = ""
@@ -54,10 +56,10 @@ def GetLibInstanceInfo(String, WorkSpace, LineNo, CurrentInfFileName):
     # To deal with library instance specified by GUID and version
     #
     RegFormatGuidPattern = re.compile("\s*([0-9a-fA-F]){8}-"
-                                       "([0-9a-fA-F]){4}-"
-                                       "([0-9a-fA-F]){4}-"
-                                       "([0-9a-fA-F]){4}-"
-                                       "([0-9a-fA-F]){12}\s*")
+                                      "([0-9a-fA-F]){4}-"
+                                      "([0-9a-fA-F]){4}-"
+                                      "([0-9a-fA-F]){4}-"
+                                      "([0-9a-fA-F]){12}\s*")
     VersionPattern = re.compile('[\t\s]*\d+(\.\d+)?[\t\s]*')
     GuidMatchedObj = RegFormatGuidPattern.search(String)
 
@@ -72,8 +74,8 @@ def GetLibInstanceInfo(String, WorkSpace, LineNo, CurrentInfFileName):
     #
     # To deal with library instance specified by file name
     #
-    FileLinesList = GetFileLineContent(String, WorkSpace, LineNo, OriginalString)
-
+    FileLinesList = GetFileLineContent(
+        String, WorkSpace, LineNo, OriginalString)
 
     ReFindFileGuidPattern = re.compile("^\s*FILE_GUID\s*=.*$")
     ReFindVerStringPattern = re.compile("^\s*VERSION_STRING\s*=.*$")
@@ -91,7 +93,7 @@ def GetLibInstanceInfo(String, WorkSpace, LineNo, CurrentInfFileName):
 
     return FileGuidString, VerString
 
-## GetPackageListInfo
+# GetPackageListInfo
 #
 # Get the package information from INF file.
 #
@@ -99,6 +101,8 @@ def GetLibInstanceInfo(String, WorkSpace, LineNo, CurrentInfFileName):
 # @param WorkSpace. The WorkSpace directory used to combined with INF file path.
 #
 # @return GUID, Version
+
+
 def GetPackageListInfo(FileNameString, WorkSpace, LineNo):
     PackageInfoList = []
     DefineSectionMacros = {}
@@ -169,7 +173,8 @@ def GetPackageListInfo(FileNameString, WorkSpace, LineNo):
             #
             # Replace with Local section Macro and [Defines] section Macro.
             #
-            Line = InfExpandMacro(Line, (FileNameString, Line, LineNo), DefineSectionMacros, PackageSectionMacros, True)
+            Line = InfExpandMacro(Line, (FileNameString, Line, LineNo),
+                                  DefineSectionMacros, PackageSectionMacros, True)
 
             Line = GetSplitValueList(Line, "#", 1)[0]
             Line = GetSplitValueList(Line, "|", 1)[0]
@@ -177,6 +182,7 @@ def GetPackageListInfo(FileNameString, WorkSpace, LineNo):
 
     return PackageInfoList
 
+
 def GetFileLineContent(FileName, WorkSpace, LineNo, OriginalString):
 
     if not LineNo:
@@ -185,7 +191,8 @@ def GetFileLineContent(FileName, WorkSpace, LineNo, OriginalString):
     #
     # Validate file name exist.
     #
-    FullFileName = os.path.normpath(os.path.realpath(os.path.join(WorkSpace, FileName)))
+    FullFileName = os.path.normpath(
+        os.path.realpath(os.path.join(WorkSpace, FileName)))
     if not (ValidFile(FullFileName)):
         return []
 
@@ -203,7 +210,8 @@ def GetFileLineContent(FileName, WorkSpace, LineNo, OriginalString):
         try:
             FileLinesList = Inputfile.readlines()
         except BaseException:
-            Logger.Error("InfParser", ToolError.FILE_READ_FAILURE, ST.ERR_FILE_OPEN_FAILURE, File=FullFileName)
+            Logger.Error("InfParser", ToolError.FILE_READ_FAILURE,
+                         ST.ERR_FILE_OPEN_FAILURE, File=FullFileName)
         finally:
             Inputfile.close()
     except BaseException:
@@ -220,10 +228,12 @@ def GetFileLineContent(FileName, WorkSpace, LineNo, OriginalString):
 # Get all INF files from current workspace
 #
 #
+
+
 def GetInfsFromWorkSpace(WorkSpace):
     InfFiles = []
     for top, dirs, files in os.walk(WorkSpace):
-        dirs = dirs # just for pylint
+        dirs = dirs  # just for pylint
         for File in files:
             if File.upper().endswith(".INF"):
                 InfFiles.append(os.path.join(top, File))
@@ -234,6 +244,8 @@ def GetInfsFromWorkSpace(WorkSpace):
 # Get GUID and version from library instance file
 #
 #
+
+
 def GetGuidVerFormLibInstance(Guid, Version, WorkSpace, CurrentInfFileName):
     for InfFile in GetInfsFromWorkSpace(WorkSpace):
         try:
@@ -270,14 +282,13 @@ def GetGuidVerFormLibInstance(Guid, Version, WorkSpace, CurrentInfFileName):
                 VerString = GetSplitValueList(VerString, '=', 1)[1]
 
             if FileGuidString.strip().upper() == Guid.upper() and \
-                VerString.strip().upper() == Version.upper():
+                    VerString.strip().upper() == Version.upper():
                 return Guid, Version
 
         except BaseException:
-            Logger.Error("InfParser", ToolError.FILE_READ_FAILURE, ST.ERR_FILE_OPEN_FAILURE, File=InfFile)
+            Logger.Error("InfParser", ToolError.FILE_READ_FAILURE,
+                         ST.ERR_FILE_OPEN_FAILURE, File=InfFile)
         finally:
             InfFileObj.close()
 
     return '', ''
-
-
diff --git a/BaseTools/Source/Python/UPT/Parser/InfBinarySectionParser.py b/BaseTools/Source/Python/UPT/Parser/InfBinarySectionParser.py
index 58b53276ec0e..dec21e622d80 100644
--- a/BaseTools/Source/Python/UPT/Parser/InfBinarySectionParser.py
+++ b/BaseTools/Source/Python/UPT/Parser/InfBinarySectionParser.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file contained the parser for [Binaries] sections in INF file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -23,8 +23,9 @@ from Object.Parser.InfCommonObject import InfLineCommentObject
 from Object.Parser.InfCommonObject import CurrentLine
 from Parser.InfParserMisc import InfParserSectionRoot
 
+
 class InfBinarySectionParser(InfParserSectionRoot):
-    ## InfBinaryParser
+    # InfBinaryParser
     #
     #
     def InfBinaryParser(self, SectionString, InfSectionObject, FileName):
@@ -32,12 +33,12 @@ class InfBinarySectionParser(InfParserSectionRoot):
         # Macro defined in this section
         #
         SectionMacros = {}
-        ValueList     = []
+        ValueList = []
         #
         # For UI (UI, SEC_UI, UNI_UI) binaries
         # One and only one UI section can be included
         #
-        UiBinaryList  = []
+        UiBinaryList = []
         #
         # For Version (VER, SEC_VER, UNI_VER).
         # One and only one VER section on be included
@@ -48,9 +49,9 @@ class InfBinarySectionParser(InfParserSectionRoot):
         #
         ComBinaryList = []
 
-        StillCommentFalg  = False
-        HeaderComments    = []
-        LineComment       = None
+        StillCommentFalg = False
+        HeaderComments = []
+        LineComment = None
 
         AllSectionContent = ''
         #
@@ -58,7 +59,7 @@ class InfBinarySectionParser(InfParserSectionRoot):
         #
         for Line in SectionString:
             BinLineContent = Line[0]
-            BinLineNo      = Line[1]
+            BinLineNo = Line[1]
 
             if BinLineContent.strip() == '':
                 continue
@@ -104,8 +105,10 @@ class InfBinarySectionParser(InfParserSectionRoot):
             # Find Tail comment.
             #
             if BinLineContent.find(DT.TAB_COMMENT_SPLIT) > -1:
-                TailComments = BinLineContent[BinLineContent.find(DT.TAB_COMMENT_SPLIT):]
-                BinLineContent = BinLineContent[:BinLineContent.find(DT.TAB_COMMENT_SPLIT)]
+                TailComments = BinLineContent[BinLineContent.find(
+                    DT.TAB_COMMENT_SPLIT):]
+                BinLineContent = BinLineContent[:BinLineContent.find(
+                    DT.TAB_COMMENT_SPLIT)]
                 if LineComment is None:
                     LineComment = InfLineCommentObject()
                 LineComment.SetTailComments(TailComments)
@@ -114,9 +117,9 @@ class InfBinarySectionParser(InfParserSectionRoot):
             # Find Macro
             #
             MacroDef = MacroParser((BinLineContent, BinLineNo),
-                                      FileName,
-                                      DT.MODEL_EFI_BINARY_FILE,
-                                      self.FileLocalMacros)
+                                   FileName,
+                                   DT.MODEL_EFI_BINARY_FILE,
+                                   self.FileLocalMacros)
             if MacroDef[0] is not None:
                 SectionMacros[MacroDef[0]] = MacroDef[1]
                 LineComment = None
@@ -157,8 +160,8 @@ class InfBinarySectionParser(InfParserSectionRoot):
             # Should equal to VER/SEC_VER/UNI_VER
             #
             elif ValueList[0] == DT.BINARY_FILE_TYPE_UNI_VER or \
-               ValueList[0] == DT.BINARY_FILE_TYPE_SEC_VER or \
-               ValueList[0] == DT.BINARY_FILE_TYPE_VER:
+                    ValueList[0] == DT.BINARY_FILE_TYPE_SEC_VER or \
+                    ValueList[0] == DT.BINARY_FILE_TYPE_VER:
                 if len(ValueList) == 2:
                     TokenList = GetSplitValueList(ValueList[1],
                                                   DT.TAB_VALUE_SPLIT,
@@ -178,8 +181,8 @@ class InfBinarySectionParser(InfParserSectionRoot):
                                                       5)
                     else:
                         TokenList = GetSplitValueList(ValueList[1],
-                              DT.TAB_VALUE_SPLIT,
-                              4)
+                                                      DT.TAB_VALUE_SPLIT,
+                                                      4)
 
                     NewValueList = []
                     NewValueList.append(ValueList[0])
@@ -195,9 +198,6 @@ class InfBinarySectionParser(InfParserSectionRoot):
                                           LineComment,
                                           CurrentLineObj))
 
-
-
-
             ValueList = []
             LineComment = None
             TailComments = ''
@@ -220,7 +220,7 @@ class InfBinarySectionParser(InfParserSectionRoot):
                                           ArchList):
             Logger.Error('InfParser',
                          FORMAT_INVALID,
-                         ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR%("[Binaries]"),
+                         ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % (
+                             "[Binaries]"),
                          File=FileName,
                          Line=Item[3])
-
diff --git a/BaseTools/Source/Python/UPT/Parser/InfBuildOptionSectionParser.py b/BaseTools/Source/Python/UPT/Parser/InfBuildOptionSectionParser.py
index e3b48e9f44f6..7b9f794a656f 100644
--- a/BaseTools/Source/Python/UPT/Parser/InfBuildOptionSectionParser.py
+++ b/BaseTools/Source/Python/UPT/Parser/InfBuildOptionSectionParser.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file contained the parser for BuildOption sections in INF file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -25,22 +25,23 @@ from Library.ParserValidate import IsValidFamily
 from Library.ParserValidate import IsValidBuildOptionName
 from Parser.InfParserMisc import InfParserSectionRoot
 
+
 class InfBuildOptionSectionParser(InfParserSectionRoot):
-    ## InfBuildOptionParser
+    # InfBuildOptionParser
     #
     #
     def InfBuildOptionParser(self, SectionString, InfSectionObject, FileName):
 
         BuildOptionList = []
-        SectionContent  = ''
+        SectionContent = ''
 
         if not GlobalData.gIS_BINARY_INF:
-            ValueList       = []
-            LineNo          = 0
+            ValueList = []
+            LineNo = 0
 
             for Line in SectionString:
                 LineContent = Line[0]
-                LineNo      = Line[1]
+                LineNo = Line[1]
                 TailComments = ''
                 ReplaceFlag = False
 
@@ -58,10 +59,13 @@ class InfBuildOptionSectionParser(InfParserSectionRoot):
                 # Find Tail comment.
                 #
                 if LineContent.find(DT.TAB_COMMENT_SPLIT) > -1:
-                    TailComments = LineContent[LineContent.find(DT.TAB_COMMENT_SPLIT):]
-                    LineContent = LineContent[:LineContent.find(DT.TAB_COMMENT_SPLIT)]
+                    TailComments = LineContent[LineContent.find(
+                        DT.TAB_COMMENT_SPLIT):]
+                    LineContent = LineContent[:LineContent.find(
+                        DT.TAB_COMMENT_SPLIT)]
 
-                TokenList = GetSplitValueList(LineContent, DT.TAB_DEQUAL_SPLIT, 1)
+                TokenList = GetSplitValueList(
+                    LineContent, DT.TAB_DEQUAL_SPLIT, 1)
                 if len(TokenList) == 2:
                     #
                     # "Replace" type build option
@@ -69,7 +73,8 @@ class InfBuildOptionSectionParser(InfParserSectionRoot):
                     TokenList.append('True')
                     ReplaceFlag = True
                 else:
-                    TokenList = GetSplitValueList(LineContent, DT.TAB_EQUAL_SPLIT, 1)
+                    TokenList = GetSplitValueList(
+                        LineContent, DT.TAB_EQUAL_SPLIT, 1)
                     #
                     # "Append" type build option
                     #
@@ -98,7 +103,8 @@ class InfBuildOptionSectionParser(InfParserSectionRoot):
                 else:
                     EqualString = ' == '
 
-                SectionContent += ValueList[0] + EqualString + ValueList[1] + TailComments + DT.END_OF_LINE
+                SectionContent += ValueList[0] + EqualString + \
+                    ValueList[1] + TailComments + DT.END_OF_LINE
 
                 Family = GetSplitValueList(ValueList[0], DT.TAB_COLON_SPLIT, 1)
                 if len(Family) == 2:
@@ -129,7 +135,8 @@ class InfBuildOptionSectionParser(InfParserSectionRoot):
                 ValueList = []
                 continue
         else:
-            BuildOptionList = InfAsBuiltBuildOptionParser(SectionString, FileName)
+            BuildOptionList = InfAsBuiltBuildOptionParser(
+                SectionString, FileName)
 
         #
         # Current section archs
@@ -146,13 +153,16 @@ class InfBuildOptionSectionParser(InfParserSectionRoot):
         if not InfSectionObject.SetBuildOptions(BuildOptionList, ArchList, SectionContent):
             Logger.Error('InfParser',
                          FORMAT_INVALID,
-                         ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR%("[BuilOptions]"),
+                         ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % (
+                             "[BuilOptions]"),
                          File=FileName,
                          Line=LastItem[3])
 
-## InfBuildOptionParser
+# InfBuildOptionParser
 #
 #
+
+
 def InfAsBuiltBuildOptionParser(SectionString, FileName):
     BuildOptionList = []
     #
@@ -164,7 +174,7 @@ def InfAsBuiltBuildOptionParser(SectionString, FileName):
     for Line in SectionString:
         Count += 1
         LineContent = Line[0]
-        LineNo      = Line[1]
+        LineNo = Line[1]
 
         #
         # The last line
@@ -172,7 +182,8 @@ def InfAsBuiltBuildOptionParser(SectionString, FileName):
         if len(SectionString) == Count:
             if LineContent.strip().startswith("##") and AsBuildOptionFlag:
                 BuildOptionList.append(BuildOptionItem)
-                BuildOptionList.append([GetHelpStringByRemoveHashKey(LineContent)])
+                BuildOptionList.append(
+                    [GetHelpStringByRemoveHashKey(LineContent)])
             elif LineContent.strip().startswith("#") and AsBuildOptionFlag:
                 BuildOptionInfo = GetHelpStringByRemoveHashKey(LineContent)
                 BuildOptionItem.append(BuildOptionInfo)
@@ -195,11 +206,11 @@ def InfAsBuiltBuildOptionParser(SectionString, FileName):
 
         if not LineContent.strip().startswith("#"):
             Logger.Error('InfParser',
-                        FORMAT_INVALID,
-                        ST.ERR_BO_CONTATIN_ASBUILD_AND_COMMON,
-                        File=FileName,
-                        Line=LineNo,
-                        ExtraData=LineContent)
+                         FORMAT_INVALID,
+                         ST.ERR_BO_CONTATIN_ASBUILD_AND_COMMON,
+                         File=FileName,
+                         Line=LineNo,
+                         ExtraData=LineContent)
 
         if IsAsBuildOptionInfo(LineContent):
             AsBuildOptionFlag = True
diff --git a/BaseTools/Source/Python/UPT/Parser/InfDefineSectionParser.py b/BaseTools/Source/Python/UPT/Parser/InfDefineSectionParser.py
index a63e40e61787..49fd1a26f40e 100644
--- a/BaseTools/Source/Python/UPT/Parser/InfDefineSectionParser.py
+++ b/BaseTools/Source/Python/UPT/Parser/InfDefineSectionParser.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file contained the parser for define sections in INF file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -26,14 +26,17 @@ from Object.Parser.InfMisc import ErrorInInf
 from Logger import StringTable as ST
 from Parser.InfParserMisc import InfParserSectionRoot
 
-## __GetValidateArchList
+# __GetValidateArchList
 #
 #
+
+
 def GetValidateArchList(LineContent):
 
     TempArch = ''
     ArchList = []
-    ValidateAcrhPatten = re.compile(r"^\s*#\s*VALID_ARCHITECTURES\s*=\s*.*$", re.DOTALL)
+    ValidateAcrhPatten = re.compile(
+        r"^\s*#\s*VALID_ARCHITECTURES\s*=\s*.*$", re.DOTALL)
 
     if ValidateAcrhPatten.match(LineContent):
         TempArch = GetSplitValueList(LineContent, DT.TAB_EQUAL_SPLIT, 1)[1]
@@ -50,6 +53,7 @@ def GetValidateArchList(LineContent):
 
     return ArchList
 
+
 class InfDefinSectionParser(InfParserSectionRoot):
     def InfDefineParser(self, SectionString, InfSectionObject, FileName, SectionComment):
 
@@ -58,12 +62,12 @@ class InfDefinSectionParser(InfParserSectionRoot):
         #
         # Parser Defines section content and fill self._ContentList dict.
         #
-        StillCommentFalg  = False
+        StillCommentFalg = False
         HeaderComments = []
         SectionContent = ''
-        ArchList       = []
-        _ContentList   = []
-        _ValueList     = []
+        ArchList = []
+        _ContentList = []
+        _ValueList = []
         #
         # Add WORKSPACE to global Marco dict.
         #
@@ -71,14 +75,14 @@ class InfDefinSectionParser(InfParserSectionRoot):
 
         for Line in SectionString:
             LineContent = Line[0]
-            LineNo      = Line[1]
-            TailComments   = ''
-            LineComment    = None
+            LineNo = Line[1]
+            TailComments = ''
+            LineComment = None
 
-            LineInfo       = ['', -1, '']
-            LineInfo[0]    = FileName
-            LineInfo[1]    = LineNo
-            LineInfo[2]    = LineContent
+            LineInfo = ['', -1, '']
+            LineInfo[0] = FileName
+            LineInfo[1] = LineNo
+            LineInfo[2] = LineContent
 
             if LineContent.strip() == '':
                 continue
@@ -125,8 +129,10 @@ class InfDefinSectionParser(InfParserSectionRoot):
             # Find Tail comment.
             #
             if LineContent.find(DT.TAB_COMMENT_SPLIT) > -1:
-                TailComments = LineContent[LineContent.find(DT.TAB_COMMENT_SPLIT):]
-                LineContent = LineContent[:LineContent.find(DT.TAB_COMMENT_SPLIT)]
+                TailComments = LineContent[LineContent.find(
+                    DT.TAB_COMMENT_SPLIT):]
+                LineContent = LineContent[:LineContent.find(
+                    DT.TAB_COMMENT_SPLIT)]
                 if LineComment is None:
                     LineComment = InfLineCommentObject()
                 LineComment.SetTailComments(TailComments)
@@ -168,8 +174,10 @@ class InfDefinSectionParser(InfParserSectionRoot):
 
             InfDefMemberObj = InfDefMember(Name, Value)
             if (LineComment is not None):
-                InfDefMemberObj.Comments.SetHeaderComments(LineComment.GetHeaderComments())
-                InfDefMemberObj.Comments.SetTailComments(LineComment.GetTailComments())
+                InfDefMemberObj.Comments.SetHeaderComments(
+                    LineComment.GetHeaderComments())
+                InfDefMemberObj.Comments.SetTailComments(
+                    LineComment.GetTailComments())
 
             InfDefMemberObj.CurrentLine.SetFileName(self.FullPath)
             InfDefMemberObj.CurrentLine.SetLineString(LineContent)
@@ -188,4 +196,3 @@ class InfDefinSectionParser(InfParserSectionRoot):
         InfSectionObject.SetAllContent(SectionContent)
 
         InfSectionObject.SetDefines(_ContentList, Arch=ArchList)
-
diff --git a/BaseTools/Source/Python/UPT/Parser/InfDepexSectionParser.py b/BaseTools/Source/Python/UPT/Parser/InfDepexSectionParser.py
index a2e836e482b9..bdad3c79c5bd 100644
--- a/BaseTools/Source/Python/UPT/Parser/InfDepexSectionParser.py
+++ b/BaseTools/Source/Python/UPT/Parser/InfDepexSectionParser.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file contained the parser for [Depex] sections in INF file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -20,8 +20,9 @@ from Library import DataType as DT
 from Library.Misc import GetSplitValueList
 from Parser.InfParserMisc import InfParserSectionRoot
 
+
 class InfDepexSectionParser(InfParserSectionRoot):
-    ## InfDepexParser
+    # InfDepexParser
     #
     # For now, only separate Depex String and comments.
     # Have two types of section header.
@@ -31,13 +32,13 @@ class InfDepexSectionParser(InfParserSectionRoot):
     def InfDepexParser(self, SectionString, InfSectionObject, FileName):
         DepexContent = []
         DepexComment = []
-        ValueList    = []
+        ValueList = []
         #
         # Parse section content
         #
         for Line in SectionString:
             LineContent = Line[0]
-            LineNo      = Line[1]
+            LineNo = Line[1]
 
             #
             # Found comment
@@ -59,7 +60,6 @@ class InfDepexSectionParser(InfParserSectionRoot):
                 DepexComment.append((LineContent[CommentCount:], LineNo))
                 LineContent = LineContent[:CommentCount-1]
 
-
             CommentCount = -1
             DepexContent.append((LineContent, LineNo))
 
@@ -90,9 +90,10 @@ class InfDepexSectionParser(InfParserSectionRoot):
             else:
                 FormatCommentLn = CommentItem[1] + 1
 
-        if not InfSectionObject.SetDepex(DepexContent, KeyList = KeyList, CommentList = NewCommentList):
+        if not InfSectionObject.SetDepex(DepexContent, KeyList=KeyList, CommentList=NewCommentList):
             Logger.Error('InfParser',
                          FORMAT_INVALID,
-                         ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR%("[Depex]"),
+                         ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % (
+                             "[Depex]"),
                          File=FileName,
                          Line=LastItem[3])
diff --git a/BaseTools/Source/Python/UPT/Parser/InfGuidPpiProtocolSectionParser.py b/BaseTools/Source/Python/UPT/Parser/InfGuidPpiProtocolSectionParser.py
index 9b83c0473ba6..2821007a5319 100644
--- a/BaseTools/Source/Python/UPT/Parser/InfGuidPpiProtocolSectionParser.py
+++ b/BaseTools/Source/Python/UPT/Parser/InfGuidPpiProtocolSectionParser.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file contained the parser for [Guids], [Ppis], [Protocols] sections in INF file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -25,8 +25,9 @@ from Library.ParserValidate import IsValidUserId
 from Library.ParserValidate import IsValidArch
 from Parser.InfParserMisc import InfParserSectionRoot
 
+
 class InfGuidPpiProtocolSectionParser(InfParserSectionRoot):
-    ## InfGuidParser
+    # InfGuidParser
     #
     #
     def InfGuidParser(self, SectionString, InfSectionObject, FileName):
@@ -58,10 +59,10 @@ class InfGuidPpiProtocolSectionParser(InfParserSectionRoot):
                 #
                 if LineContent.find(DT.TAB_COMMENT_SPLIT) > -1:
                     CommentsList.append((
-                            LineContent[LineContent.find(DT.TAB_COMMENT_SPLIT):],
-                            LineNo))
+                        LineContent[LineContent.find(DT.TAB_COMMENT_SPLIT):],
+                        LineNo))
                     LineContent = \
-                            LineContent[:LineContent.find(DT.TAB_COMMENT_SPLIT)]
+                        LineContent[:LineContent.find(DT.TAB_COMMENT_SPLIT)]
 
             if LineContent != '':
                 #
@@ -77,7 +78,8 @@ class InfGuidPpiProtocolSectionParser(InfParserSectionRoot):
                     ValueList = []
                     continue
 
-                TokenList = GetSplitValueList(LineContent, DT.TAB_VALUE_SPLIT, 1)
+                TokenList = GetSplitValueList(
+                    LineContent, DT.TAB_VALUE_SPLIT, 1)
                 ValueList[0:len(TokenList)] = TokenList
 
                 #
@@ -85,11 +87,10 @@ class InfGuidPpiProtocolSectionParser(InfParserSectionRoot):
                 #
                 ValueList = [InfExpandMacro(Value, (FileName, LineContent, LineNo),
                                             self.FileLocalMacros, SectionMacros, True)
-                            for Value in ValueList]
+                             for Value in ValueList]
 
                 CurrentLineVar = (LineContent, LineNo, FileName)
 
-
             if len(ValueList) >= 1:
                 GuidList.append((ValueList, CommentsList, CurrentLineVar))
                 CommentsList = []
@@ -109,11 +110,12 @@ class InfGuidPpiProtocolSectionParser(InfParserSectionRoot):
         if not InfSectionObject.SetGuid(GuidList, Arch=ArchList):
             Logger.Error('InfParser',
                          FORMAT_INVALID,
-                         ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % ("[Guid]"),
+                         ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % (
+                             "[Guid]"),
                          File=FileName,
                          Line=LineIndex)
 
-    ## InfPpiParser
+    # InfPpiParser
     #
     #
     def InfPpiParser(self, SectionString, InfSectionObject, FileName):
@@ -145,10 +147,10 @@ class InfGuidPpiProtocolSectionParser(InfParserSectionRoot):
                 #
                 if LineContent.find(DT.TAB_COMMENT_SPLIT) > -1:
                     CommentsList.append((
-                            LineContent[LineContent.find(DT.TAB_COMMENT_SPLIT):],
-                            LineNo))
+                        LineContent[LineContent.find(DT.TAB_COMMENT_SPLIT):],
+                        LineNo))
                     LineContent = \
-                            LineContent[:LineContent.find(DT.TAB_COMMENT_SPLIT)]
+                        LineContent[:LineContent.find(DT.TAB_COMMENT_SPLIT)]
 
             if LineContent != '':
                 #
@@ -164,14 +166,15 @@ class InfGuidPpiProtocolSectionParser(InfParserSectionRoot):
                     CommentsList = []
                     continue
 
-                TokenList = GetSplitValueList(LineContent, DT.TAB_VALUE_SPLIT, 1)
+                TokenList = GetSplitValueList(
+                    LineContent, DT.TAB_VALUE_SPLIT, 1)
                 ValueList[0:len(TokenList)] = TokenList
 
                 #
                 # Replace with Local section Macro and [Defines] section Macro.
                 #
                 ValueList = [InfExpandMacro(Value, (FileName, LineContent, LineNo), self.FileLocalMacros, SectionMacros)
-                            for Value in ValueList]
+                             for Value in ValueList]
 
                 CurrentLineVar = (LineContent, LineNo, FileName)
 
@@ -194,11 +197,12 @@ class InfGuidPpiProtocolSectionParser(InfParserSectionRoot):
         if not InfSectionObject.SetPpi(PpiList, Arch=ArchList):
             Logger.Error('InfParser',
                          FORMAT_INVALID,
-                         ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % ("[Ppis]"),
+                         ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % (
+                             "[Ppis]"),
                          File=FileName,
                          Line=LineIndex)
 
-    ## InfUserExtensionParser
+    # InfUserExtensionParser
     #
     #
     def InfUserExtensionParser(self, SectionString, InfSectionObject, FileName):
@@ -245,7 +249,8 @@ class InfGuidPpiProtocolSectionParser(InfParserSectionRoot):
                 if not IsValidUserId(UserId):
                     Logger.Error('InfParser',
                                  FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_UE_SECTION_USER_ID_ERROR % (Item[1]),
+                                 ST.ERR_INF_PARSER_UE_SECTION_USER_ID_ERROR % (
+                                     Item[1]),
                                  File=GlobalData.gINF_MODULE_NAME,
                                  Line=SectionLineNo,
                                  ExtraData=None)
@@ -253,7 +258,8 @@ class InfGuidPpiProtocolSectionParser(InfParserSectionRoot):
                 if not IsValidIdString(IdString):
                     Logger.Error('InfParser',
                                  FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_UE_SECTION_ID_STRING_ERROR % (IdString),
+                                 ST.ERR_INF_PARSER_UE_SECTION_ID_STRING_ERROR % (
+                                     IdString),
                                  File=GlobalData.gINF_MODULE_NAME, Line=SectionLineNo,
                                  ExtraData=None)
                 IdContentList.append((UserId, IdString, Arch))
@@ -272,7 +278,7 @@ class InfGuidPpiProtocolSectionParser(InfParserSectionRoot):
                 Logger.Error('InfParser',
                              FORMAT_INVALID,
                              ST.ERR_INF_PARSER_UE_SECTION_DUPLICATE_ERROR % (
-                                                                    IdString),
+                                 IdString),
                              File=GlobalData.gINF_MODULE_NAME,
                              Line=SectionLineNo,
                              ExtraData=None)
@@ -281,10 +287,10 @@ class InfGuidPpiProtocolSectionParser(InfParserSectionRoot):
         if not InfSectionObject.SetUserExtension(UserExtensionContent,
                                                  IdContent=IdContentList,
                                                  LineNo=SectionLineNo):
-            Logger.Error\
-            ('InfParser', FORMAT_INVALID, \
-             ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % ("[UserExtension]"), \
-             File=FileName, Line=LastItem[4])
+            Logger.Error('InfParser', FORMAT_INVALID,
+                         ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % (
+                             "[UserExtension]"),
+                         File=FileName, Line=LastItem[4])
 
     def InfProtocolParser(self, SectionString, InfSectionObject, FileName):
         #
@@ -315,10 +321,10 @@ class InfGuidPpiProtocolSectionParser(InfParserSectionRoot):
                 #
                 if LineContent.find(DT.TAB_COMMENT_SPLIT) > -1:
                     CommentsList.append((
-                            LineContent[LineContent.find(DT.TAB_COMMENT_SPLIT):],
-                            LineNo))
+                        LineContent[LineContent.find(DT.TAB_COMMENT_SPLIT):],
+                        LineNo))
                     LineContent = \
-                            LineContent[:LineContent.find(DT.TAB_COMMENT_SPLIT)]
+                        LineContent[:LineContent.find(DT.TAB_COMMENT_SPLIT)]
 
             if LineContent != '':
                 #
@@ -334,14 +340,15 @@ class InfGuidPpiProtocolSectionParser(InfParserSectionRoot):
                     CommentsList = []
                     continue
 
-                TokenList = GetSplitValueList(LineContent, DT.TAB_VALUE_SPLIT, 1)
+                TokenList = GetSplitValueList(
+                    LineContent, DT.TAB_VALUE_SPLIT, 1)
                 ValueList[0:len(TokenList)] = TokenList
 
                 #
                 # Replace with Local section Macro and [Defines] section Macro.
                 #
                 ValueList = [InfExpandMacro(Value, (FileName, LineContent, LineNo), self.FileLocalMacros, SectionMacros)
-                            for Value in ValueList]
+                             for Value in ValueList]
 
                 CurrentLineVar = (LineContent, LineNo, FileName)
 
@@ -362,7 +369,7 @@ class InfGuidPpiProtocolSectionParser(InfParserSectionRoot):
                 ArchList.append(Item[1])
 
         if not InfSectionObject.SetProtocol(ProtocolList, Arch=ArchList):
-            Logger.Error\
-            ('InfParser', FORMAT_INVALID, \
-             ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % ("[Protocol]"), \
-             File=FileName, Line=LineIndex)
+            Logger.Error('InfParser', FORMAT_INVALID,
+                         ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % (
+                             "[Protocol]"),
+                         File=FileName, Line=LineIndex)
diff --git a/BaseTools/Source/Python/UPT/Parser/InfLibrarySectionParser.py b/BaseTools/Source/Python/UPT/Parser/InfLibrarySectionParser.py
index f2070b51a42f..61fb75c717d1 100644
--- a/BaseTools/Source/Python/UPT/Parser/InfLibrarySectionParser.py
+++ b/BaseTools/Source/Python/UPT/Parser/InfLibrarySectionParser.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file contained the parser for [Libraries] sections in INF file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -25,8 +25,9 @@ from Parser.InfParserMisc import IsLibInstanceInfo
 from Parser.InfAsBuiltProcess import GetLibInstanceInfo
 from Parser.InfParserMisc import InfParserSectionRoot
 
+
 class InfLibrarySectionParser(InfParserSectionRoot):
-    ## InfLibraryParser
+    # InfLibraryParser
     #
     #
     def InfLibraryParser(self, SectionString, InfSectionObject, FileName):
@@ -88,8 +89,10 @@ class InfLibrarySectionParser(InfParserSectionRoot):
                 # Find Tail comment.
                 #
                 if LibLineContent.find(DT.TAB_COMMENT_SPLIT) > -1:
-                    LibTailComments = LibLineContent[LibLineContent.find(DT.TAB_COMMENT_SPLIT):]
-                    LibLineContent = LibLineContent[:LibLineContent.find(DT.TAB_COMMENT_SPLIT)]
+                    LibTailComments = LibLineContent[LibLineContent.find(
+                        DT.TAB_COMMENT_SPLIT):]
+                    LibLineContent = LibLineContent[:LibLineContent.find(
+                        DT.TAB_COMMENT_SPLIT)]
                     if LibLineComment is None:
                         LibLineComment = InfLineCommentObject()
                     LibLineComment.SetTailComments(LibTailComments)
@@ -107,7 +110,8 @@ class InfLibrarySectionParser(InfParserSectionRoot):
                     LibHeaderComments = []
                     continue
 
-                TokenList = GetSplitValueList(LibLineContent, DT.TAB_VALUE_SPLIT, 1)
+                TokenList = GetSplitValueList(
+                    LibLineContent, DT.TAB_VALUE_SPLIT, 1)
                 ValueList[0:len(TokenList)] = TokenList
 
                 #
@@ -115,7 +119,7 @@ class InfLibrarySectionParser(InfParserSectionRoot):
                 #
                 ValueList = [InfExpandMacro(Value, (FileName, LibLineContent, LibLineNo),
                                             self.FileLocalMacros, SectionMacros, True)
-                                            for Value in ValueList]
+                             for Value in ValueList]
 
                 LibraryList.append((ValueList, LibLineComment,
                                     (LibLineContent, LibLineNo, FileName)))
@@ -137,14 +141,16 @@ class InfLibrarySectionParser(InfParserSectionRoot):
             if not InfSectionObject.SetLibraryClasses(LibraryList, KeyList=KeyList):
                 Logger.Error('InfParser',
                              FORMAT_INVALID,
-                             ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % ("[Library]"),
+                             ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % (
+                                 "[Library]"),
                              File=FileName,
                              Line=Item[3])
         #
         # For Binary INF
         #
         else:
-            self.InfAsBuiltLibraryParser(SectionString, InfSectionObject, FileName)
+            self.InfAsBuiltLibraryParser(
+                SectionString, InfSectionObject, FileName)
 
     def InfAsBuiltLibraryParser(self, SectionString, InfSectionObject, FileName):
         LibraryList = []
@@ -159,18 +165,19 @@ class InfLibrarySectionParser(InfParserSectionRoot):
 
             if not LineContent.strip().startswith("#"):
                 Logger.Error('InfParser',
-                            FORMAT_INVALID,
-                            ST.ERR_LIB_CONTATIN_ASBUILD_AND_COMMON,
-                            File=FileName,
-                            Line=LineNo,
-                            ExtraData=LineContent)
+                             FORMAT_INVALID,
+                             ST.ERR_LIB_CONTATIN_ASBUILD_AND_COMMON,
+                             File=FileName,
+                             Line=LineNo,
+                             ExtraData=LineContent)
 
             if IsLibInstanceInfo(LineContent):
                 LibInsFlag = True
                 continue
 
             if LibInsFlag:
-                LibGuid, LibVer = GetLibInstanceInfo(LineContent, GlobalData.gWORKSPACE, LineNo, FileName)
+                LibGuid, LibVer = GetLibInstanceInfo(
+                    LineContent, GlobalData.gWORKSPACE, LineNo, FileName)
                 #
                 # If the VERSION_STRING is missing from the INF file, tool should default to "0".
                 #
@@ -192,6 +199,7 @@ class InfLibrarySectionParser(InfParserSectionRoot):
         if not InfSectionObject.SetLibraryClasses(LibraryList, KeyList=KeyList):
             Logger.Error('InfParser',
                          FORMAT_INVALID,
-                         ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % ("[Library]"),
+                         ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % (
+                             "[Library]"),
                          File=FileName,
                          Line=Item[3])
diff --git a/BaseTools/Source/Python/UPT/Parser/InfPackageSectionParser.py b/BaseTools/Source/Python/UPT/Parser/InfPackageSectionParser.py
index f43241034d22..d040f50c4d74 100644
--- a/BaseTools/Source/Python/UPT/Parser/InfPackageSectionParser.py
+++ b/BaseTools/Source/Python/UPT/Parser/InfPackageSectionParser.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file contained the parser for [Packages] sections in INF file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -22,8 +22,9 @@ from Library.Misc import GetSplitValueList
 from Object.Parser.InfCommonObject import InfLineCommentObject
 from Parser.InfParserMisc import InfParserSectionRoot
 
+
 class InfPackageSectionParser(InfParserSectionRoot):
-    ## InfPackageParser
+    # InfPackageParser
     #
     #
     def InfPackageParser(self, SectionString, InfSectionObject, FileName):
@@ -31,17 +32,17 @@ class InfPackageSectionParser(InfParserSectionRoot):
         # Macro defined in this section
         #
         SectionMacros = {}
-        ValueList     = []
-        PackageList   = []
-        StillCommentFalg  = False
-        HeaderComments    = []
-        LineComment       = None
+        ValueList = []
+        PackageList = []
+        StillCommentFalg = False
+        HeaderComments = []
+        LineComment = None
         #
         # Parse section content
         #
         for Line in SectionString:
             PkgLineContent = Line[0]
-            PkgLineNo      = Line[1]
+            PkgLineNo = Line[1]
 
             if PkgLineContent.strip() == '':
                 continue
@@ -81,8 +82,10 @@ class InfPackageSectionParser(InfParserSectionRoot):
             # Find Tail comment.
             #
             if PkgLineContent.find(DT.TAB_COMMENT_SPLIT) > -1:
-                TailComments = PkgLineContent[PkgLineContent.find(DT.TAB_COMMENT_SPLIT):]
-                PkgLineContent = PkgLineContent[:PkgLineContent.find(DT.TAB_COMMENT_SPLIT)]
+                TailComments = PkgLineContent[PkgLineContent.find(
+                    DT.TAB_COMMENT_SPLIT):]
+                PkgLineContent = PkgLineContent[:PkgLineContent.find(
+                    DT.TAB_COMMENT_SPLIT)]
                 if LineComment is None:
                     LineComment = InfLineCommentObject()
                 LineComment.SetTailComments(TailComments)
@@ -99,7 +102,8 @@ class InfPackageSectionParser(InfParserSectionRoot):
                 HeaderComments = []
                 continue
 
-            TokenList = GetSplitValueList(PkgLineContent, DT.TAB_VALUE_SPLIT, 1)
+            TokenList = GetSplitValueList(
+                PkgLineContent, DT.TAB_VALUE_SPLIT, 1)
             ValueList[0:len(TokenList)] = TokenList
 
             #
@@ -107,7 +111,7 @@ class InfPackageSectionParser(InfParserSectionRoot):
             #
             ValueList = [InfExpandMacro(Value, (FileName, PkgLineContent, PkgLineNo),
                                         self.FileLocalMacros, SectionMacros, True)
-                                        for Value in ValueList]
+                         for Value in ValueList]
 
             PackageList.append((ValueList, LineComment,
                                 (PkgLineContent, PkgLineNo, FileName)))
@@ -125,10 +129,10 @@ class InfPackageSectionParser(InfParserSectionRoot):
             if Item[1] not in ArchList:
                 ArchList.append(Item[1])
 
-        if not InfSectionObject.SetPackages(PackageList, Arch = ArchList):
+        if not InfSectionObject.SetPackages(PackageList, Arch=ArchList):
             Logger.Error('InfParser',
                          FORMAT_INVALID,
-                         ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR\
-                         %("[Packages]"),
+                         ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR
+                         % ("[Packages]"),
                          File=FileName,
                          Line=Item[3])
diff --git a/BaseTools/Source/Python/UPT/Parser/InfParser.py b/BaseTools/Source/Python/UPT/Parser/InfParser.py
index 2072be6e42f9..a4cbad7feb5f 100644
--- a/BaseTools/Source/Python/UPT/Parser/InfParser.py
+++ b/BaseTools/Source/Python/UPT/Parser/InfParser.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file contained the parser for INF file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -38,9 +38,11 @@ from Parser.InfSectionParser import InfSectionParser
 from Parser.InfParserMisc import gINF_SECTION_DEF
 from Parser.InfParserMisc import IsBinaryInf
 
-## OpenInfFile
+# OpenInfFile
 #
 #
+
+
 def OpenInfFile(Filename):
     FileLinesList = []
 
@@ -63,7 +65,7 @@ def OpenInfFile(Filename):
 
     return FileLinesList
 
-## InfParser
+# InfParser
 #
 # This class defined the structure used in InfParser object
 #
@@ -73,19 +75,21 @@ def OpenInfFile(Filename):
 # @param WorkspaceDir:      Input value for current workspace directory,
 #                           default is None
 #
+
+
 class InfParser(InfSectionParser):
 
-    def __init__(self, Filename = None, WorkspaceDir = None):
+    def __init__(self, Filename=None, WorkspaceDir=None):
 
         #
         # Call parent class construct function
         #
         InfSectionParser.__init__()
 
-        self.WorkspaceDir    = WorkspaceDir
-        self.SupArchList     = DT.ARCH_LIST
-        self.EventList    = []
-        self.HobList      = []
+        self.WorkspaceDir = WorkspaceDir
+        self.SupArchList = DT.ARCH_LIST
+        self.EventList = []
+        self.HobList = []
         self.BootModeList = []
 
         #
@@ -94,7 +98,7 @@ class InfParser(InfSectionParser):
         if Filename is not None:
             self.ParseInfFile(Filename)
 
-    ## Parse INF file
+    # Parse INF file
     #
     # Parse the file if it exists
     #
@@ -113,18 +117,18 @@ class InfParser(InfSectionParser):
         #
         # Initialize common data
         #
-        LineNo             = 0
-        CurrentSection     = DT.MODEL_UNKNOWN
-        SectionLines       = []
+        LineNo = 0
+        CurrentSection = DT.MODEL_UNKNOWN
+        SectionLines = []
 
         #
         # Flags
         #
         HeaderCommentStart = False
-        HeaderCommentEnd   = False
+        HeaderCommentEnd = False
         HeaderStarLineNo = -1
         BinaryHeaderCommentStart = False
-        BinaryHeaderCommentEnd   = False
+        BinaryHeaderCommentEnd = False
         BinaryHeaderStarLineNo = -1
 
         #
@@ -136,17 +140,17 @@ class InfParser(InfSectionParser):
         #
         # Parse file content
         #
-        CommentBlock       = []
+        CommentBlock = []
 
         #
         # Variables for Event/Hob/BootMode
         #
-        self.EventList    = []
-        self.HobList      = []
+        self.EventList = []
+        self.HobList = []
         self.BootModeList = []
         SectionType = ''
 
-        FileLinesList = OpenInfFile (Filename)
+        FileLinesList = OpenInfFile(Filename)
 
         #
         # One INF file can only has one [Defines] section.
@@ -178,8 +182,8 @@ class InfParser(InfSectionParser):
         InfSectionCommonDefObj = None
 
         for Line in FileLinesList:
-            LineNo   = LineNo + 1
-            Line     = Line.strip()
+            LineNo = LineNo + 1
+            Line = Line.strip()
             if (LineNo < len(FileLinesList) - 1):
                 NextLine = FileLinesList[LineNo].strip()
 
@@ -209,28 +213,30 @@ class InfParser(InfSectionParser):
             # Collect Header content.
             #
             if (Line.startswith(DT.TAB_COMMENT_SPLIT) and CurrentSection == DT.MODEL_META_DATA_FILE_HEADER) and\
-                HeaderCommentStart and not Line.startswith(DT.TAB_SPECIAL_COMMENT) and not\
-                HeaderCommentEnd and NextLine != '':
+                    HeaderCommentStart and not Line.startswith(DT.TAB_SPECIAL_COMMENT) and not\
+                    HeaderCommentEnd and NextLine != '':
                 SectionLines.append((Line, LineNo))
                 continue
             #
             # Header content end
             #
             if (Line.startswith(DT.TAB_SPECIAL_COMMENT) or not Line.strip().startswith("#")) and HeaderCommentStart \
-                and not HeaderCommentEnd:
+                    and not HeaderCommentEnd:
                 HeaderCommentEnd = True
                 BinaryHeaderCommentStart = False
-                BinaryHeaderCommentEnd   = False
+                BinaryHeaderCommentEnd = False
                 HeaderCommentStart = False
                 if Line.find(DT.TAB_BINARY_HEADER_COMMENT) > -1:
-                    self.InfHeaderParser(SectionLines, self.InfHeader, self.FileName)
+                    self.InfHeaderParser(
+                        SectionLines, self.InfHeader, self.FileName)
                     SectionLines = []
                 else:
                     SectionLines.append((Line, LineNo))
                     #
                     # Call Header comment parser.
                     #
-                    self.InfHeaderParser(SectionLines, self.InfHeader, self.FileName)
+                    self.InfHeaderParser(
+                        SectionLines, self.InfHeader, self.FileName)
                     SectionLines = []
                     continue
 
@@ -239,7 +245,7 @@ class InfParser(InfSectionParser):
             #
             if Line.startswith(DT.TAB_SPECIAL_COMMENT) and \
                 (Line.find(DT.TAB_BINARY_HEADER_COMMENT) > -1) and \
-                not BinaryHeaderCommentStart:
+                    not BinaryHeaderCommentStart:
                 SectionLines = []
                 CurrentSection = DT.MODEL_META_DATA_FILE_HEADER
                 #
@@ -255,7 +261,7 @@ class InfParser(InfSectionParser):
             # check whether there are more than one binary header exist
             #
             if Line.startswith(DT.TAB_SPECIAL_COMMENT) and BinaryHeaderCommentStart and \
-                not BinaryHeaderCommentEnd and (Line.find(DT.TAB_BINARY_HEADER_COMMENT) > -1):
+                    not BinaryHeaderCommentEnd and (Line.find(DT.TAB_BINARY_HEADER_COMMENT) > -1):
                 Logger.Error('Parser',
                              FORMAT_INVALID,
                              ST.ERR_MULTIPLE_BINARYHEADER_EXIST,
@@ -265,23 +271,24 @@ class InfParser(InfSectionParser):
             # Collect Binary Header content.
             #
             if (Line.startswith(DT.TAB_COMMENT_SPLIT) and CurrentSection == DT.MODEL_META_DATA_FILE_HEADER) and\
-                BinaryHeaderCommentStart and not Line.startswith(DT.TAB_SPECIAL_COMMENT) and not\
-                BinaryHeaderCommentEnd and NextLine != '':
+                    BinaryHeaderCommentStart and not Line.startswith(DT.TAB_SPECIAL_COMMENT) and not\
+                    BinaryHeaderCommentEnd and NextLine != '':
                 SectionLines.append((Line, LineNo))
                 continue
             #
             # Binary Header content end
             #
             if (Line.startswith(DT.TAB_SPECIAL_COMMENT) or not Line.strip().startswith(DT.TAB_COMMENT_SPLIT)) and \
-                BinaryHeaderCommentStart and not BinaryHeaderCommentEnd:
+                    BinaryHeaderCommentStart and not BinaryHeaderCommentEnd:
                 SectionLines.append((Line, LineNo))
                 BinaryHeaderCommentStart = False
                 #
                 # Call Binary Header comment parser.
                 #
-                self.InfHeaderParser(SectionLines, self.InfBinaryHeader, self.FileName, True)
+                self.InfHeaderParser(
+                    SectionLines, self.InfBinaryHeader, self.FileName, True)
                 SectionLines = []
-                BinaryHeaderCommentEnd   = True
+                BinaryHeaderCommentEnd = True
                 continue
             #
             # Find a new section tab
@@ -300,7 +307,7 @@ class InfParser(InfSectionParser):
             #
             # Encountered a section. start with '[' and end with ']'
             #
-            if (Line.startswith(DT.TAB_SECTION_START) and \
+            if (Line.startswith(DT.TAB_SECTION_START) and
                Line.find(DT.TAB_SECTION_END) > -1) or LastSectionFalg:
 
                 HeaderCommentEnd = True
@@ -323,14 +330,15 @@ class InfParser(InfSectionParser):
                     # Keep last time section header content for section parser
                     # usage.
                     #
-                    self.LastSectionHeaderContent = deepcopy(self.SectionHeaderContent)
+                    self.LastSectionHeaderContent = deepcopy(
+                        self.SectionHeaderContent)
 
                     #
                     # TailComments in section define.
                     #
                     TailComments = ''
                     CommentIndex = Line.find(DT.TAB_COMMENT_SPLIT)
-                    if  CommentIndex > -1:
+                    if CommentIndex > -1:
                         TailComments = Line[CommentIndex:]
                         Line = Line[:CommentIndex]
 
@@ -345,8 +353,8 @@ class InfParser(InfSectionParser):
                     #
                     if CurrentSection == DT.MODEL_META_DATA_DEFINE:
                         DefineSectionParsedFlag = self._CallSectionParsers(CurrentSection,
-                                                                   DefineSectionParsedFlag, SectionLines,
-                                                                   InfSectionCommonDefObj, LineNo)
+                                                                           DefineSectionParsedFlag, SectionLines,
+                                                                           InfSectionCommonDefObj, LineNo)
                     #
                     # Compare the new section name with current
                     #
@@ -354,7 +362,8 @@ class InfParser(InfSectionParser):
 
                     self._CheckSectionHeaders(Line, LineNo)
 
-                    SectionType = _ConvertSecNameToType(self.SectionHeaderContent[0][0])
+                    SectionType = _ConvertSecNameToType(
+                        self.SectionHeaderContent[0][0])
 
                 if not FirstSectionStartFlag:
                     CurrentSection = SectionType
@@ -366,7 +375,8 @@ class InfParser(InfSectionParser):
                 continue
 
             if LastSectionFalg:
-                SectionLines, CurrentSection = self._ProcessLastSection(SectionLines, Line, LineNo, CurrentSection)
+                SectionLines, CurrentSection = self._ProcessLastSection(
+                    SectionLines, Line, LineNo, CurrentSection)
 
             #
             # End of section content collect.
@@ -374,7 +384,7 @@ class InfParser(InfSectionParser):
             #
             if NewSectionStartFlag or LastSectionFalg:
                 if CurrentSection != DT.MODEL_META_DATA_DEFINE or \
-                    (LastSectionFalg and CurrentSection == DT.MODEL_META_DATA_DEFINE):
+                        (LastSectionFalg and CurrentSection == DT.MODEL_META_DATA_DEFINE):
                     DefineSectionParsedFlag = self._CallSectionParsers(CurrentSection,
                                                                        DefineSectionParsedFlag, SectionLines,
                                                                        InfSectionCommonDefObj, LineNo)
@@ -387,14 +397,14 @@ class InfParser(InfSectionParser):
 
         if HeaderStarLineNo == -1:
             Logger.Error("InfParser",
-                        FORMAT_INVALID,
-                        ST.ERR_NO_SOURCE_HEADER,
-                        File=self.FullPath)
-        if BinaryHeaderStarLineNo > -1 and HeaderStarLineNo > -1  and HeaderStarLineNo > BinaryHeaderStarLineNo:
+                         FORMAT_INVALID,
+                         ST.ERR_NO_SOURCE_HEADER,
+                         File=self.FullPath)
+        if BinaryHeaderStarLineNo > -1 and HeaderStarLineNo > -1 and HeaderStarLineNo > BinaryHeaderStarLineNo:
             Logger.Error("InfParser",
-                        FORMAT_INVALID,
-                        ST.ERR_BINARY_HEADER_ORDER,
-                        File=self.FullPath)
+                         FORMAT_INVALID,
+                         ST.ERR_BINARY_HEADER_ORDER,
+                         File=self.FullPath)
         #
         # EDKII INF should not have EDKI style comment
         #
@@ -411,7 +421,7 @@ class InfParser(InfSectionParser):
         #
         self._ExtractEventHobBootMod(FileLinesList)
 
-    ## _CheckSectionHeaders
+    # _CheckSectionHeaders
     #
     #
     def _CheckSectionHeaders(self, Line, LineNo):
@@ -429,10 +439,10 @@ class InfParser(InfSectionParser):
                 # check.
                 #
                 if SectionItem[0].strip().upper() == DT.TAB_INF_FIXED_PCD.upper() or \
-                    SectionItem[0].strip().upper() == DT.TAB_INF_PATCH_PCD.upper() or \
-                    SectionItem[0].strip().upper() == DT.TAB_INF_PCD_EX.upper() or \
-                    SectionItem[0].strip().upper() == DT.TAB_INF_PCD.upper() or \
-                    SectionItem[0].strip().upper() == DT.TAB_INF_FEATURE_PCD.upper():
+                        SectionItem[0].strip().upper() == DT.TAB_INF_PATCH_PCD.upper() or \
+                        SectionItem[0].strip().upper() == DT.TAB_INF_PCD_EX.upper() or \
+                        SectionItem[0].strip().upper() == DT.TAB_INF_PCD.upper() or \
+                        SectionItem[0].strip().upper() == DT.TAB_INF_FEATURE_PCD.upper():
                     ArchList = GetSplitValueList(SectionItem[1].strip(), ' ')
                 else:
                     ArchList = [SectionItem[1].strip()]
@@ -441,10 +451,11 @@ class InfParser(InfSectionParser):
                     if (not IsValidArch(Arch)) and \
                         (SectionItem[0].strip().upper() != DT.TAB_DEPEX.upper()) and \
                         (SectionItem[0].strip().upper() != DT.TAB_USER_EXTENSIONS.upper()) and \
-                        (SectionItem[0].strip().upper() != DT.TAB_COMMON_DEFINES.upper()):
+                            (SectionItem[0].strip().upper() != DT.TAB_COMMON_DEFINES.upper()):
                         Logger.Error("InfParser",
                                      FORMAT_INVALID,
-                                     ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(SectionItem[1]),
+                                     ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (
+                                         SectionItem[1]),
                                      File=self.FullPath,
                                      Line=LineNo, ExtraData=Line)
                 #
@@ -454,15 +465,16 @@ class InfParser(InfSectionParser):
                 if (self.SectionHeaderContent[0][0].upper() in ChkModSectionList):
                     if SectionItem[2].strip().upper():
                         MoudleTypeList = GetSplitValueList(
-                                    SectionItem[2].strip().upper())
+                            SectionItem[2].strip().upper())
                         if (not IsValidInfMoudleTypeList(MoudleTypeList)):
                             Logger.Error("InfParser",
                                          FORMAT_INVALID,
-                                         ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID%(SectionItem[2]),
+                                         ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (
+                                             SectionItem[2]),
                                          File=self.FullPath, Line=LineNo,
                                          ExtraData=Line)
 
-    ## _CallSectionParsers
+    # _CallSectionParsers
     #
     #
     def _CallSectionParsers(self, CurrentSection, DefineSectionParsedFlag,
@@ -479,7 +491,7 @@ class InfParser(InfSectionParser):
                              PARSER_ERROR,
                              ST.ERR_INF_PARSER_MULTI_DEFINE_SECTION,
                              File=self.FullPath,
-                             RaiseError = Logger.IS_RAISE_ERROR)
+                             RaiseError=Logger.IS_RAISE_ERROR)
 
         elif CurrentSection == DT.MODEL_META_DATA_BUILD_OPTION:
             self.InfBuildOptionParser(SectionLines,
@@ -499,10 +511,10 @@ class InfParser(InfSectionParser):
         # [Pcd] Sections, put it together
         #
         elif CurrentSection == DT.MODEL_PCD_FIXED_AT_BUILD or \
-             CurrentSection == DT.MODEL_PCD_PATCHABLE_IN_MODULE or \
-             CurrentSection == DT.MODEL_PCD_FEATURE_FLAG or \
-             CurrentSection == DT.MODEL_PCD_DYNAMIC_EX or \
-             CurrentSection == DT.MODEL_PCD_DYNAMIC:
+                CurrentSection == DT.MODEL_PCD_PATCHABLE_IN_MODULE or \
+                CurrentSection == DT.MODEL_PCD_FEATURE_FLAG or \
+                CurrentSection == DT.MODEL_PCD_DYNAMIC_EX or \
+                CurrentSection == DT.MODEL_PCD_DYNAMIC:
             self.InfPcdParser(SectionLines,
                               self.InfPcdSection,
                               self.FullPath)
@@ -550,13 +562,13 @@ class InfParser(InfSectionParser):
                              PARSER_ERROR,
                              ST.ERR_INF_PARSER_UNKNOWN_SECTION,
                              File=self.FullPath, Line=LineNo,
-                             RaiseError = Logger.IS_RAISE_ERROR)
+                             RaiseError=Logger.IS_RAISE_ERROR)
             else:
                 Logger.Error("Parser",
                              PARSER_ERROR,
                              ST.ERR_INF_PARSER_NO_SECTION_ERROR,
                              File=self.FullPath, Line=LineNo,
-                             RaiseError = Logger.IS_RAISE_ERROR)
+                             RaiseError=Logger.IS_RAISE_ERROR)
 
         return DefineSectionParsedFlag
 
@@ -564,9 +576,9 @@ class InfParser(InfSectionParser):
         SpecialSectionStart = False
         CheckLocation = False
         GFindSpecialCommentRe = \
-        re.compile(r"""#(?:\s*)\[(.*?)\](?:.*)""", re.DOTALL)
+            re.compile(r"""#(?:\s*)\[(.*?)\](?:.*)""", re.DOTALL)
         GFindNewSectionRe2 = \
-        re.compile(r"""#?(\s*)\[(.*?)\](.*)""", re.DOTALL)
+            re.compile(r"""#?(\s*)\[(.*?)\](.*)""", re.DOTALL)
         LineNum = 0
         Element = []
         for Line in FileLinesList:
@@ -605,8 +617,8 @@ class InfParser(InfSectionParser):
                     else:
                         if not Line.startswith(DT.TAB_COMMENT_SPLIT):
                             Logger.Warn("Parser",
-                                         ST.WARN_SPECIAL_SECTION_LOCATION_WRONG,
-                                         File=self.FullPath, Line=LineNum)
+                                        ST.WARN_SPECIAL_SECTION_LOCATION_WRONG,
+                                        File=self.FullPath, Line=LineNum)
                             SpecialSectionStart = False
                             CheckLocation = False
                             Element = []
@@ -618,8 +630,8 @@ class InfParser(InfSectionParser):
                             CheckLocation = False
                         elif Line:
                             Logger.Warn("Parser",
-                                         ST.WARN_SPECIAL_SECTION_LOCATION_WRONG,
-                                         File=self.FullPath, Line=LineNum)
+                                        ST.WARN_SPECIAL_SECTION_LOCATION_WRONG,
+                                        File=self.FullPath, Line=LineNum)
                             CheckLocation = False
 
         if len(self.BootModeList) >= 1:
@@ -639,9 +651,10 @@ class InfParser(InfSectionParser):
                                          self.InfSpecialCommentSection,
                                          self.FileName,
                                          DT.TYPE_HOB_SECTION)
-    ## _ProcessLastSection
+    # _ProcessLastSection
     #
     #
+
     def _ProcessLastSection(self, SectionLines, Line, LineNo, CurrentSection):
         #
         # The last line is a section header. will discard it.
@@ -658,7 +671,7 @@ class InfParser(InfSectionParser):
                              File=self.FullPath,
                              Line=LineNo,
                              ExtraData=Line,
-                             RaiseError = Logger.IS_RAISE_ERROR
+                             RaiseError=Logger.IS_RAISE_ERROR
                              )
             else:
                 CurrentSection = gINF_SECTION_DEF[TemSectionName]
@@ -666,9 +679,11 @@ class InfParser(InfSectionParser):
 
         return SectionLines, CurrentSection
 
-## _ConvertSecNameToType
+# _ConvertSecNameToType
 #
 #
+
+
 def _ConvertSecNameToType(SectionName):
     SectionType = ''
     if SectionName.upper() not in gINF_SECTION_DEF.keys():
@@ -677,4 +692,3 @@ def _ConvertSecNameToType(SectionName):
         SectionType = gINF_SECTION_DEF[SectionName.upper()]
 
     return SectionType
-
diff --git a/BaseTools/Source/Python/UPT/Parser/InfParserMisc.py b/BaseTools/Source/Python/UPT/Parser/InfParserMisc.py
index d01ae9aa0246..58dcf9cb062b 100644
--- a/BaseTools/Source/Python/UPT/Parser/InfParserMisc.py
+++ b/BaseTools/Source/Python/UPT/Parser/InfParserMisc.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file contained the miscellaneous functions for INF parser
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -32,31 +32,31 @@ from Logger.StringTable import ERR_MARCO_DEFINITION_MISS_ERROR
 # Sections can exist in INF file
 #
 gINF_SECTION_DEF = {
-       DT.TAB_UNKNOWN.upper()          : DT.MODEL_UNKNOWN,
-       DT.TAB_HEADER.upper()           : DT.MODEL_META_DATA_FILE_HEADER,
-       DT.TAB_INF_DEFINES.upper()      : DT.MODEL_META_DATA_DEFINE,
-       DT.TAB_BUILD_OPTIONS.upper()    : DT.MODEL_META_DATA_BUILD_OPTION,
-       DT.TAB_LIBRARY_CLASSES.upper()  : DT.MODEL_EFI_LIBRARY_CLASS,
-       DT.TAB_PACKAGES.upper()         : DT.MODEL_META_DATA_PACKAGE,
-       DT.TAB_INF_FIXED_PCD.upper()    : DT.MODEL_PCD_FIXED_AT_BUILD,
-       DT.TAB_INF_PATCH_PCD.upper()    : DT.MODEL_PCD_PATCHABLE_IN_MODULE,
-       DT.TAB_INF_FEATURE_PCD.upper()  : DT.MODEL_PCD_FEATURE_FLAG,
-       DT.TAB_INF_PCD_EX.upper()       : DT.MODEL_PCD_DYNAMIC_EX,
-       DT.TAB_INF_PCD.upper()          : DT.MODEL_PCD_DYNAMIC,
-       DT.TAB_SOURCES.upper()          : DT.MODEL_EFI_SOURCE_FILE,
-       DT.TAB_GUIDS.upper()            : DT.MODEL_EFI_GUID,
-       DT.TAB_PROTOCOLS.upper()        : DT.MODEL_EFI_PROTOCOL,
-       DT.TAB_PPIS.upper()             : DT.MODEL_EFI_PPI,
-       DT.TAB_DEPEX.upper()            : DT.MODEL_EFI_DEPEX,
-       DT.TAB_BINARIES.upper()         : DT.MODEL_EFI_BINARY_FILE,
-       DT.TAB_USER_EXTENSIONS.upper()  : DT.MODEL_META_DATA_USER_EXTENSION
-       #
-       # EDK1 section
-       # TAB_NMAKE.upper()            : MODEL_META_DATA_NMAKE
-       #
-       }
+    DT.TAB_UNKNOWN.upper(): DT.MODEL_UNKNOWN,
+    DT.TAB_HEADER.upper(): DT.MODEL_META_DATA_FILE_HEADER,
+    DT.TAB_INF_DEFINES.upper(): DT.MODEL_META_DATA_DEFINE,
+    DT.TAB_BUILD_OPTIONS.upper(): DT.MODEL_META_DATA_BUILD_OPTION,
+    DT.TAB_LIBRARY_CLASSES.upper(): DT.MODEL_EFI_LIBRARY_CLASS,
+    DT.TAB_PACKAGES.upper(): DT.MODEL_META_DATA_PACKAGE,
+    DT.TAB_INF_FIXED_PCD.upper(): DT.MODEL_PCD_FIXED_AT_BUILD,
+    DT.TAB_INF_PATCH_PCD.upper(): DT.MODEL_PCD_PATCHABLE_IN_MODULE,
+    DT.TAB_INF_FEATURE_PCD.upper(): DT.MODEL_PCD_FEATURE_FLAG,
+    DT.TAB_INF_PCD_EX.upper(): DT.MODEL_PCD_DYNAMIC_EX,
+    DT.TAB_INF_PCD.upper(): DT.MODEL_PCD_DYNAMIC,
+    DT.TAB_SOURCES.upper(): DT.MODEL_EFI_SOURCE_FILE,
+    DT.TAB_GUIDS.upper(): DT.MODEL_EFI_GUID,
+    DT.TAB_PROTOCOLS.upper(): DT.MODEL_EFI_PROTOCOL,
+    DT.TAB_PPIS.upper(): DT.MODEL_EFI_PPI,
+    DT.TAB_DEPEX.upper(): DT.MODEL_EFI_DEPEX,
+    DT.TAB_BINARIES.upper(): DT.MODEL_EFI_BINARY_FILE,
+    DT.TAB_USER_EXTENSIONS.upper(): DT.MODEL_META_DATA_USER_EXTENSION
+    #
+    # EDK1 section
+    # TAB_NMAKE.upper()            : MODEL_META_DATA_NMAKE
+    #
+}
 
-## InfExpandMacro
+# InfExpandMacro
 #
 # Expand MACRO definition with MACROs defined in [Defines] section and specific section.
 # The MACROs defined in specific section has high priority and will be expanded firstly.
@@ -66,6 +66,8 @@ gINF_SECTION_DEF = {
 # @param SectionMacros MACROs defined in INF specific section
 # @param Flag          If the flag set to True, need to skip macros in a quoted string
 #
+
+
 def InfExpandMacro(Content, LineInfo, GlobalMacros=None, SectionMacros=None, Flag=False):
     if GlobalMacros is None:
         GlobalMacros = {}
@@ -85,21 +87,21 @@ def InfExpandMacro(Content, LineInfo, GlobalMacros=None, SectionMacros=None, Fla
     #
     # First, replace MARCOs with value defined in specific section
     #
-    Content = ReplaceMacro (Content,
-                            SectionMacros,
-                            False,
-                            (LineContent, LineNo),
-                            FileName,
-                            Flag)
+    Content = ReplaceMacro(Content,
+                           SectionMacros,
+                           False,
+                           (LineContent, LineNo),
+                           FileName,
+                           Flag)
     #
     # Then replace MARCOs with value defined in [Defines] section
     #
-    Content = ReplaceMacro (Content,
-                            GlobalMacros,
-                            False,
-                            (LineContent, LineNo),
-                            FileName,
-                            Flag)
+    Content = ReplaceMacro(Content,
+                           GlobalMacros,
+                           False,
+                           (LineContent, LineNo),
+                           FileName,
+                           Flag)
 
     MacroUsed = gMACRO_PATTERN.findall(Content)
     #
@@ -109,18 +111,18 @@ def InfExpandMacro(Content, LineInfo, GlobalMacros=None, SectionMacros=None, Fla
         return Content
     else:
         for Macro in MacroUsed:
-            gQuotedMacro = re.compile(".*\".*\$\(%s\).*\".*"%(Macro))
+            gQuotedMacro = re.compile(".*\".*\$\(%s\).*\".*" % (Macro))
             if not gQuotedMacro.match(Content):
                 #
                 # Still have MACROs can't be expanded.
                 #
-                ErrorInInf (ERR_MARCO_DEFINITION_MISS_ERROR,
-                            LineInfo=NewLineInfo)
+                ErrorInInf(ERR_MARCO_DEFINITION_MISS_ERROR,
+                           LineInfo=NewLineInfo)
 
     return Content
 
 
-## IsBinaryInf
+# IsBinaryInf
 #
 # Judge whether the INF file is Binary INF or Common INF
 #
@@ -146,7 +148,7 @@ def IsBinaryInf(FileLineList):
     return False
 
 
-## IsLibInstanceInfo
+# IsLibInstanceInfo
 #
 # Judge whether the string contain the information of ## @LIB_INSTANCES.
 #
@@ -162,7 +164,7 @@ def IsLibInstanceInfo(String):
         return False
 
 
-## IsAsBuildOptionInfo
+# IsAsBuildOptionInfo
 #
 # Judge whether the string contain the information of ## @ASBUILD.
 #
@@ -197,20 +199,20 @@ class InfParserSectionRoot(object):
 
         self.FullPath = ''
 
-        self.InfDefSection              = None
-        self.InfBuildOptionSection      = None
-        self.InfLibraryClassSection     = None
-        self.InfPackageSection          = None
-        self.InfPcdSection              = None
-        self.InfSourcesSection          = None
-        self.InfUserExtensionSection    = None
-        self.InfProtocolSection         = None
-        self.InfPpiSection              = None
-        self.InfGuidSection             = None
-        self.InfDepexSection            = None
-        self.InfPeiDepexSection         = None
-        self.InfDxeDepexSection         = None
-        self.InfSmmDepexSection         = None
-        self.InfBinariesSection         = None
-        self.InfHeader                  = None
-        self.InfSpecialCommentSection   = None
+        self.InfDefSection = None
+        self.InfBuildOptionSection = None
+        self.InfLibraryClassSection = None
+        self.InfPackageSection = None
+        self.InfPcdSection = None
+        self.InfSourcesSection = None
+        self.InfUserExtensionSection = None
+        self.InfProtocolSection = None
+        self.InfPpiSection = None
+        self.InfGuidSection = None
+        self.InfDepexSection = None
+        self.InfPeiDepexSection = None
+        self.InfDxeDepexSection = None
+        self.InfSmmDepexSection = None
+        self.InfBinariesSection = None
+        self.InfHeader = None
+        self.InfSpecialCommentSection = None
diff --git a/BaseTools/Source/Python/UPT/Parser/InfPcdSectionParser.py b/BaseTools/Source/Python/UPT/Parser/InfPcdSectionParser.py
index 6954742bf20f..1ea4770a3d99 100644
--- a/BaseTools/Source/Python/UPT/Parser/InfPcdSectionParser.py
+++ b/BaseTools/Source/Python/UPT/Parser/InfPcdSectionParser.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file contained the parser for [Pcds] sections in INF file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -23,8 +23,9 @@ from Library import GlobalData
 from Library.StringUtils import SplitPcdEntry
 from Parser.InfParserMisc import InfParserSectionRoot
 
+
 class InfPcdSectionParser(InfParserSectionRoot):
-    ## Section PCD related parser
+    # Section PCD related parser
     #
     # For 5 types of PCD list below, all use this function.
     # 'FixedPcd', 'FeaturePcd', 'PatchPcd', 'Pcd', 'PcdEx'
@@ -35,7 +36,7 @@ class InfPcdSectionParser(InfParserSectionRoot):
     #
     def InfPcdParser(self, SectionString, InfSectionObject, FileName):
         KeysList = []
-        PcdList   = []
+        PcdList = []
         CommentsList = []
         ValueList = []
         #
@@ -47,10 +48,10 @@ class InfPcdSectionParser(InfParserSectionRoot):
                 KeysList.append((Item[0], Item[1], Item[3]))
                 LineIndex = Item[3]
 
-            if (Item[0].upper() == DT.TAB_INF_FIXED_PCD.upper() or \
-                Item[0].upper() == DT.TAB_INF_FEATURE_PCD.upper() or \
-                Item[0].upper() == DT.TAB_INF_PCD.upper()) and GlobalData.gIS_BINARY_INF:
-                Logger.Error('InfParser', FORMAT_INVALID, ST.ERR_ASBUILD_PCD_SECTION_TYPE%("\"" + Item[0] + "\""),
+            if (Item[0].upper() == DT.TAB_INF_FIXED_PCD.upper() or
+                Item[0].upper() == DT.TAB_INF_FEATURE_PCD.upper() or
+                    Item[0].upper() == DT.TAB_INF_PCD.upper()) and GlobalData.gIS_BINARY_INF:
+                Logger.Error('InfParser', FORMAT_INVALID, ST.ERR_ASBUILD_PCD_SECTION_TYPE % ("\"" + Item[0] + "\""),
                              File=FileName, Line=LineIndex)
 
         #
@@ -63,7 +64,7 @@ class InfPcdSectionParser(InfParserSectionRoot):
             SectionMacros = {}
             for Line in SectionString:
                 PcdLineContent = Line[0]
-                PcdLineNo      = Line[1]
+                PcdLineNo = Line[1]
                 if PcdLineContent.strip() == '':
                     CommentsList = []
                     continue
@@ -77,9 +78,11 @@ class InfPcdSectionParser(InfParserSectionRoot):
                     #
                     if PcdLineContent.find(DT.TAB_COMMENT_SPLIT) > -1:
                         CommentsList.append((
-                                PcdLineContent[PcdLineContent.find(DT.TAB_COMMENT_SPLIT):],
-                                PcdLineNo))
-                        PcdLineContent = PcdLineContent[:PcdLineContent.find(DT.TAB_COMMENT_SPLIT)]
+                            PcdLineContent[PcdLineContent.find(
+                                DT.TAB_COMMENT_SPLIT):],
+                            PcdLineNo))
+                        PcdLineContent = PcdLineContent[:PcdLineContent.find(
+                            DT.TAB_COMMENT_SPLIT)]
 
                 if PcdLineContent != '':
                     #
@@ -109,10 +112,11 @@ class InfPcdSectionParser(InfParserSectionRoot):
                     #
                     ValueList = [InfExpandMacro(Value, (FileName, PcdLineContent, PcdLineNo),
                                                 self.FileLocalMacros, SectionMacros, True)
-                                for Value in ValueList]
+                                 for Value in ValueList]
 
                 if len(ValueList) >= 1:
-                    PcdList.append((ValueList, CommentsList, (PcdLineContent, PcdLineNo, FileName)))
+                    PcdList.append(
+                        (ValueList, CommentsList, (PcdLineContent, PcdLineNo, FileName)))
                     ValueList = []
                     CommentsList = []
                 continue
@@ -122,7 +126,7 @@ class InfPcdSectionParser(InfParserSectionRoot):
         else:
             for Line in SectionString:
                 LineContent = Line[0].strip()
-                LineNo      = Line[1]
+                LineNo = Line[1]
 
                 if LineContent == '':
                     CommentsList = []
@@ -135,7 +139,7 @@ class InfPcdSectionParser(InfParserSectionRoot):
                 # Have comments at tail.
                 #
                 CommentIndex = LineContent.find(DT.TAB_COMMENT_SPLIT)
-                if  CommentIndex > -1:
+                if CommentIndex > -1:
                     CommentsList.append(LineContent[CommentIndex+1:])
                     LineContent = LineContent[:CommentIndex]
 
@@ -163,16 +167,17 @@ class InfPcdSectionParser(InfParserSectionRoot):
                                      ExtraData=LineContent)
                 ValueList[0:len(TokenList)] = TokenList
                 if len(ValueList) >= 1:
-                    PcdList.append((ValueList, CommentsList, (LineContent, LineNo, FileName)))
+                    PcdList.append(
+                        (ValueList, CommentsList, (LineContent, LineNo, FileName)))
                     ValueList = []
                     CommentsList = []
                 continue
 
-        if not InfSectionObject.SetPcds(PcdList, KeysList = KeysList,
-                                        PackageInfo = self.InfPackageSection.GetPackages()):
+        if not InfSectionObject.SetPcds(PcdList, KeysList=KeysList,
+                                        PackageInfo=self.InfPackageSection.GetPackages()):
             Logger.Error('InfParser',
                          FORMAT_INVALID,
-                         ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR%("[PCD]"),
+                         ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % (
+                             "[PCD]"),
                          File=FileName,
                          Line=LineIndex)
-
diff --git a/BaseTools/Source/Python/UPT/Parser/InfSectionParser.py b/BaseTools/Source/Python/UPT/Parser/InfSectionParser.py
index 474d37379d2b..5f5a02dc1370 100644
--- a/BaseTools/Source/Python/UPT/Parser/InfSectionParser.py
+++ b/BaseTools/Source/Python/UPT/Parser/InfSectionParser.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file contained the parser for sections in INF file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -53,10 +53,12 @@ from Parser.InfBinarySectionParser import InfBinarySectionParser
 from Parser.InfPcdSectionParser import InfPcdSectionParser
 from Parser.InfDepexSectionParser import InfDepexSectionParser
 
-## GetSpecialStr2
+# GetSpecialStr2
 #
 # GetSpecialStr2
 #
+
+
 def GetSpecialStr2(ItemList, FileName, LineNo, SectionString):
     Str2 = ''
     #
@@ -68,13 +70,14 @@ def GetSpecialStr2(ItemList, FileName, LineNo, SectionString):
         # section can has more than 2 items in section header string,
         # others should report error.
         #
-        if not (ItemList[0].upper() == DT.TAB_LIBRARY_CLASSES.upper() or \
-                ItemList[0].upper() == DT.TAB_DEPEX.upper() or \
+        if not (ItemList[0].upper() == DT.TAB_LIBRARY_CLASSES.upper() or
+                ItemList[0].upper() == DT.TAB_DEPEX.upper() or
                 ItemList[0].upper() == DT.TAB_USER_EXTENSIONS.upper()):
             if ItemList[2] != '':
                 Logger.Error('Parser',
                              FORMAT_INVALID,
-                             ST.ERR_INF_PARSER_SOURCE_SECTION_SECTIONNAME_INVALID % (SectionString),
+                             ST.ERR_INF_PARSER_SOURCE_SECTION_SECTIONNAME_INVALID % (
+                                 SectionString),
                              File=FileName,
                              Line=LineNo,
                              ExtraData=SectionString)
@@ -87,7 +90,7 @@ def GetSpecialStr2(ItemList, FileName, LineNo, SectionString):
         #
         if not ItemList[0].upper() == DT.TAB_USER_EXTENSIONS.upper() or ItemList[0].upper() == DT.TAB_DEPEX.upper():
             if ItemList[3] != '':
-                Logger.Error('Parser', FORMAT_INVALID, ST.ERR_INF_PARSER_SOURCE_SECTION_SECTIONNAME_INVALID \
+                Logger.Error('Parser', FORMAT_INVALID, ST.ERR_INF_PARSER_SOURCE_SECTION_SECTIONNAME_INVALID
                              % (SectionString), File=FileName, Line=LineNo, ExtraData=SectionString)
 
         if not ItemList[0].upper() == DT.TAB_USER_EXTENSIONS.upper():
@@ -96,14 +99,16 @@ def GetSpecialStr2(ItemList, FileName, LineNo, SectionString):
             Str2 = ItemList[2]
 
     elif len(ItemList) > 4:
-        Logger.Error('Parser', FORMAT_INVALID, ST.ERR_INF_PARSER_SOURCE_SECTION_SECTIONNAME_INVALID \
+        Logger.Error('Parser', FORMAT_INVALID, ST.ERR_INF_PARSER_SOURCE_SECTION_SECTIONNAME_INVALID
                      % (SectionString), File=FileName, Line=LineNo, ExtraData=SectionString)
 
     return Str2
 
-## ProcessUseExtHeader
+# ProcessUseExtHeader
 #
 #
+
+
 def ProcessUseExtHeader(ItemList):
     NewItemList = []
     AppendContent = ''
@@ -138,10 +143,12 @@ def ProcessUseExtHeader(ItemList):
 
     return True, NewItemList
 
-## GetArch
+# GetArch
 #
 # GetArch
 #
+
+
 def GetArch(ItemList, ArchList, FileName, LineNo, SectionString):
     #
     # S1 is always Arch
@@ -165,10 +172,12 @@ def GetArch(ItemList, ArchList, FileName, LineNo, SectionString):
 
     return Arch, ArchList
 
-## InfSectionParser
+# InfSectionParser
 #
 # Inherit from object
 #
+
+
 class InfSectionParser(InfDefinSectionParser,
                        InfBuildOptionSectionParser,
                        InfSourceSectionParser,
@@ -183,7 +192,7 @@ class InfSectionParser(InfDefinSectionParser,
     #
     MetaFiles = {}
 
-    ## Factory method
+    # Factory method
     #
     # One file, one parser object. This factory method makes sure that there's
     # only one object constructed for one meta file.
@@ -248,16 +257,18 @@ class InfSectionParser(InfDefinSectionParser,
     #
     # File Header content parser
     #
-    def InfHeaderParser(self, Content, InfHeaderObject2, FileName, IsBinaryHeader = False):
+    def InfHeaderParser(self, Content, InfHeaderObject2, FileName, IsBinaryHeader=False):
         if IsBinaryHeader:
-            (Abstract, Description, Copyright, License) = ParseHeaderCommentSection(Content, FileName, True)
+            (Abstract, Description, Copyright, License) = ParseHeaderCommentSection(
+                Content, FileName, True)
             if not Abstract or not Description or not Copyright or not License:
                 Logger.Error('Parser',
                              FORMAT_INVALID,
                              ST.ERR_INVALID_BINARYHEADER_FORMAT,
                              File=FileName)
         else:
-            (Abstract, Description, Copyright, License) = ParseHeaderCommentSection(Content, FileName)
+            (Abstract, Description, Copyright,
+             License) = ParseHeaderCommentSection(Content, FileName)
         #
         # Not process file name now, for later usage.
         #
@@ -272,10 +283,7 @@ class InfSectionParser(InfDefinSectionParser,
         InfHeaderObject2.SetCopyright(Copyright)
         InfHeaderObject2.SetLicense(License)
 
-
-
-
-    ## Section header parser
+    # Section header parser
     #
     #   The section header is always in following format:
     #
@@ -283,23 +291,25 @@ class InfSectionParser(InfDefinSectionParser,
     #
     # @param String    A string contained the content need to be parsed.
     #
+
     def SectionHeaderParser(self, SectionString, FileName, LineNo):
         _Scope = []
         _SectionName = ''
         ArchList = set()
         _ValueList = []
         _PcdNameList = [DT.TAB_INF_FIXED_PCD.upper(),
-                             DT.TAB_INF_FEATURE_PCD.upper(),
-                             DT.TAB_INF_PATCH_PCD.upper(),
-                             DT.TAB_INF_PCD.upper(),
-                             DT.TAB_INF_PCD_EX.upper()
-                             ]
+                        DT.TAB_INF_FEATURE_PCD.upper(),
+                        DT.TAB_INF_PATCH_PCD.upper(),
+                        DT.TAB_INF_PCD.upper(),
+                        DT.TAB_INF_PCD_EX.upper()
+                        ]
         SectionString = SectionString.strip()
         for Item in GetSplitValueList(SectionString[1:-1], DT.TAB_COMMA_SPLIT):
             if Item == '':
                 Logger.Error('Parser',
                              FORMAT_INVALID,
-                             ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % (""),
+                             ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % (
+                                 ""),
                              File=FileName,
                              Line=LineNo,
                              ExtraData=SectionString)
@@ -317,10 +327,11 @@ class InfSectionParser(InfDefinSectionParser,
                                  Line=LineNo,
                                  ExtraData=SectionString)
             elif _PcdNameList[1] in [_SectionName.upper(), ItemList[0].upper()] and \
-                (_SectionName.upper()!= ItemList[0].upper()):
+                    (_SectionName.upper() != ItemList[0].upper()):
                 Logger.Error('Parser',
                              FORMAT_INVALID,
-                             ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % (""),
+                             ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % (
+                                 ""),
                              File=FileName,
                              Line=LineNo,
                              ExtraData=SectionString)
@@ -340,7 +351,8 @@ class InfSectionParser(InfDefinSectionParser,
             #
             # Get Arch
             #
-            Str1, ArchList = GetArch(ItemList, ArchList, FileName, LineNo, SectionString)
+            Str1, ArchList = GetArch(
+                ItemList, ArchList, FileName, LineNo, SectionString)
 
             #
             # For [Defines] section, do special check.
@@ -349,7 +361,8 @@ class InfSectionParser(InfDefinSectionParser,
                 if len(ItemList) != 1:
                     Logger.Error('Parser',
                                  FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (SectionString),
+                                 ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (
+                                     SectionString),
                                  File=FileName, Line=LineNo, ExtraData=SectionString)
 
             #
@@ -362,7 +375,8 @@ class InfSectionParser(InfDefinSectionParser,
                 if not RetValue[0]:
                     Logger.Error('Parser',
                                  FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (SectionString),
+                                 ST.ERR_INF_PARSER_DEFINE_FROMAT_INVALID % (
+                                     SectionString),
                                  File=FileName, Line=LineNo, ExtraData=SectionString)
                 else:
                     ItemList = RetValue[1]
@@ -377,12 +391,14 @@ class InfSectionParser(InfDefinSectionParser,
             #
             if ItemList[0].upper() == DT.TAB_LIBRARY_CLASSES.upper() and len(ItemList) == 3:
                 if ItemList[2] != '':
-                    ModuleTypeList = GetSplitValueList(ItemList[2], DT.TAB_VALUE_SPLIT)
+                    ModuleTypeList = GetSplitValueList(
+                        ItemList[2], DT.TAB_VALUE_SPLIT)
                     for Item in ModuleTypeList:
                         if Item.strip() not in DT.MODULE_LIST:
                             Logger.Error('Parser',
                                          FORMAT_INVALID,
-                                         ST.ERR_INF_PARSER_DEFINE_MODULETYPE_INVALID % (Item),
+                                         ST.ERR_INF_PARSER_DEFINE_MODULETYPE_INVALID % (
+                                             Item),
                                          File=FileName,
                                          Line=LineNo,
                                          ExtraData=SectionString)
@@ -412,18 +428,20 @@ class InfSectionParser(InfDefinSectionParser,
                     _ValueList.append([_SectionName, Str1, Str2, LineNo])
                 else:
                     if len(ItemList) == 4:
-                        _ValueList.append([_SectionName, Str1, Str2, ItemList[3], LineNo])
+                        _ValueList.append(
+                            [_SectionName, Str1, Str2, ItemList[3], LineNo])
 
         self.SectionHeaderContent = deepcopy(_ValueList)
 
-    ## GenSpecialSectionList
+    # GenSpecialSectionList
     #
     #  @param SpecialSectionList: a list of list, of which item's format
     #                             (Comment, LineNum)
     #  @param ContainerFile:      Input value for filename of Inf file
     #
-    def InfSpecialCommentParser (self, SpecialSectionList, InfSectionObject, ContainerFile, SectionType):
-        ReFindSpecialCommentRe = re.compile(r"""#(?:\s*)\[(.*?)\](?:.*)""", re.DOTALL)
+    def InfSpecialCommentParser(self, SpecialSectionList, InfSectionObject, ContainerFile, SectionType):
+        ReFindSpecialCommentRe = re.compile(
+            r"""#(?:\s*)\[(.*?)\](?:.*)""", re.DOTALL)
         ReFindHobArchRe = re.compile(r"""[Hh][Oo][Bb]\.([^,]*)""", re.DOTALL)
         if self.FileName:
             pass
@@ -450,7 +468,8 @@ class InfSectionParser(InfDefinSectionParser,
                     ArchList.append(Arch)
             CommentSoFar = ''
             for Index in range(1, len(List)):
-                Result = ParseComment(List[Index], DT.ALL_USAGE_TOKENS, TokenDict, [], False)
+                Result = ParseComment(
+                    List[Index], DT.ALL_USAGE_TOKENS, TokenDict, [], False)
                 Usage = Result[0]
                 Type = Result[1]
                 HelpText = Result[3]
@@ -488,6 +507,7 @@ class InfSectionParser(InfDefinSectionParser,
                                                    SectionType):
             Logger.Error('InfParser',
                          FORMAT_INVALID,
-                         ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % (SectionType),
+                         ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % (
+                             SectionType),
                          ContainerFile
                          )
diff --git a/BaseTools/Source/Python/UPT/Parser/InfSourceSectionParser.py b/BaseTools/Source/Python/UPT/Parser/InfSourceSectionParser.py
index 916df7ee1a34..6d82913b58c8 100644
--- a/BaseTools/Source/Python/UPT/Parser/InfSourceSectionParser.py
+++ b/BaseTools/Source/Python/UPT/Parser/InfSourceSectionParser.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file contained the parser for [Sources] sections in INF file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -22,21 +22,22 @@ from Library.Misc import GetSplitValueList
 from Object.Parser.InfCommonObject import InfLineCommentObject
 from Parser.InfParserMisc import InfParserSectionRoot
 
+
 class InfSourceSectionParser(InfParserSectionRoot):
-    ## InfSourceParser
+    # InfSourceParser
     #
     #
     def InfSourceParser(self, SectionString, InfSectionObject, FileName):
         SectionMacros = {}
-        ValueList     = []
-        SourceList    = []
-        StillCommentFalg  = False
-        HeaderComments    = []
-        LineComment       = None
-        SectionContent  = ''
+        ValueList = []
+        SourceList = []
+        StillCommentFalg = False
+        HeaderComments = []
+        LineComment = None
+        SectionContent = ''
         for Line in SectionString:
             SrcLineContent = Line[0]
-            SrcLineNo      = Line[1]
+            SrcLineNo = Line[1]
 
             if SrcLineContent.strip() == '':
                 continue
@@ -78,8 +79,10 @@ class InfSourceSectionParser(InfParserSectionRoot):
             # Find Tail comment.
             #
             if SrcLineContent.find(DT.TAB_COMMENT_SPLIT) > -1:
-                TailComments = SrcLineContent[SrcLineContent.find(DT.TAB_COMMENT_SPLIT):]
-                SrcLineContent = SrcLineContent[:SrcLineContent.find(DT.TAB_COMMENT_SPLIT)]
+                TailComments = SrcLineContent[SrcLineContent.find(
+                    DT.TAB_COMMENT_SPLIT):]
+                SrcLineContent = SrcLineContent[:SrcLineContent.find(
+                    DT.TAB_COMMENT_SPLIT)]
                 if LineComment is None:
                     LineComment = InfLineCommentObject()
                 LineComment.SetTailComments(TailComments)
@@ -101,11 +104,12 @@ class InfSourceSectionParser(InfParserSectionRoot):
             # Replace with Local section Macro and [Defines] section Macro.
             #
             SrcLineContent = InfExpandMacro(SrcLineContent,
-                                         (FileName, SrcLineContent, SrcLineNo),
-                                         self.FileLocalMacros,
-                                         SectionMacros)
+                                            (FileName, SrcLineContent, SrcLineNo),
+                                            self.FileLocalMacros,
+                                            SectionMacros)
 
-            TokenList = GetSplitValueList(SrcLineContent, DT.TAB_VALUE_SPLIT, 4)
+            TokenList = GetSplitValueList(
+                SrcLineContent, DT.TAB_VALUE_SPLIT, 4)
             ValueList[0:len(TokenList)] = TokenList
 
             #
@@ -131,9 +135,10 @@ class InfSourceSectionParser(InfParserSectionRoot):
                 InfSectionObject.SetSupArchList(Item[1])
 
         InfSectionObject.SetAllContent(SectionContent)
-        if not InfSectionObject.SetSources(SourceList, Arch = ArchList):
+        if not InfSectionObject.SetSources(SourceList, Arch=ArchList):
             Logger.Error('InfParser',
                          FORMAT_INVALID,
-                         ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % ("[Sources]"),
+                         ST.ERR_INF_PARSER_MODULE_SECTION_TYPE_ERROR % (
+                             "[Sources]"),
                          File=FileName,
                          Line=Item[3])
diff --git a/BaseTools/Source/Python/UPT/Parser/__init__.py b/BaseTools/Source/Python/UPT/Parser/__init__.py
index 785b56df6e2a..539a99527964 100644
--- a/BaseTools/Source/Python/UPT/Parser/__init__.py
+++ b/BaseTools/Source/Python/UPT/Parser/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'Parser' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
index da92fe5d3eb2..393d2e86a833 100644
--- a/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
+++ b/BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py
@@ -1,4 +1,4 @@
-## @file DecPomAlignment.py
+# @file DecPomAlignment.py
 # This file contained the adapter for convert INF parser object to POM Object
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -86,12 +86,12 @@ from Object.POM.CommonObject import MiscFileObject
 from Object.POM.CommonObject import FileObject
 
 
-## DecPomAlignment
+# DecPomAlignment
 #
 # Inherited from PackageObject
 #
 class DecPomAlignment(PackageObject):
-    def __init__(self, Filename, WorkspaceDir = None, CheckMulDec = False):
+    def __init__(self, Filename, WorkspaceDir=None, CheckMulDec=False):
         PackageObject.__init__(self)
         self.UserExtensions = ''
         self.WorkspaceDir = WorkspaceDir
@@ -111,7 +111,7 @@ class DecPomAlignment(PackageObject):
         #
         self.DecToPackage()
 
-    ## Load Dec file
+    # Load Dec file
     #
     # Load the file if it exists
     #
@@ -131,7 +131,7 @@ class DecPomAlignment(PackageObject):
 
         self.DecParser = Dec(Filename)
 
-    ## Transfer to Package Object
+    # Transfer to Package Object
     #
     # Transfer all contents of a Dec file to a standard Package Object
     #
@@ -187,7 +187,7 @@ class DecPomAlignment(PackageObject):
         #
         self.GenUserExtensions()
 
-    ## Generate user extension
+    # Generate user extension
     #
     #
     def GenUserExtensions(self):
@@ -221,12 +221,14 @@ class DecPomAlignment(PackageObject):
         # Add Private sections to UserExtension
         if self.DecParser.GetPrivateSections():
             PrivateUserExtension = UserExtensionObject()
-            PrivateUserExtension.SetStatement(self.DecParser.GetPrivateSections())
+            PrivateUserExtension.SetStatement(
+                self.DecParser.GetPrivateSections())
             PrivateUserExtension.SetIdentifier(DT.TAB_PRIVATE)
             PrivateUserExtension.SetUserID(DT.TAB_INTEL)
-            self.SetUserExtensionList(self.GetUserExtensionList() + [PrivateUserExtension])
+            self.SetUserExtensionList(
+                self.GetUserExtensionList() + [PrivateUserExtension])
 
-    ## Generate miscellaneous files on DEC file
+    # Generate miscellaneous files on DEC file
     #
     #
     def GenMiscFiles(self, Content):
@@ -241,16 +243,18 @@ class DecPomAlignment(PackageObject):
                 if IsValidPath(FileName, self.GetRelaPath()):
                     FileObj = FileObject()
                     FileObj.SetURI(FileName)
-                    MiscFileObj.SetFileList(MiscFileObj.GetFileList()+[FileObj])
+                    MiscFileObj.SetFileList(
+                        MiscFileObj.GetFileList()+[FileObj])
                 else:
                     Logger.Error("InfParser",
                                  FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID%(Line),
+                                 ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID % (
+                                     Line),
                                  File=self.GetFileName(),
                                  ExtraData=Line)
         self.SetMiscFileList(self.GetMiscFileList()+[MiscFileObj])
 
-    ## Generate Package Header
+    # Generate Package Header
     #
     # Gen Package Header of Dec as <Key> = <Value>
     #
@@ -268,9 +272,9 @@ class DecPomAlignment(PackageObject):
             #
             # put items into Dict except for PackageName, Guid, Version, DEC_SPECIFICATION
             #
-            SkipItemList = [TAB_DEC_DEFINES_PACKAGE_NAME, \
-                TAB_DEC_DEFINES_PACKAGE_GUID, TAB_DEC_DEFINES_PACKAGE_VERSION, \
-                TAB_DEC_DEFINES_DEC_SPECIFICATION, TAB_DEC_DEFINES_PKG_UNI_FILE]
+            SkipItemList = [TAB_DEC_DEFINES_PACKAGE_NAME,
+                            TAB_DEC_DEFINES_PACKAGE_GUID, TAB_DEC_DEFINES_PACKAGE_VERSION,
+                            TAB_DEC_DEFINES_DEC_SPECIFICATION, TAB_DEC_DEFINES_PKG_UNI_FILE]
             if Item.Key in SkipItemList:
                 continue
             DefinesDict['%s = %s' % (Item.Key, Item.Value)] = TAB_ARCH_COMMON
@@ -284,7 +288,8 @@ class DecPomAlignment(PackageObject):
         if DefObj.GetPackageUniFile():
             ValidateUNIFilePath(DefObj.GetPackageUniFile())
             self.UniFileClassObject = \
-            UniFileClassObject([PathClass(os.path.join(DefObj.GetPackagePath(), DefObj.GetPackageUniFile()))])
+                UniFileClassObject(
+                    [PathClass(os.path.join(DefObj.GetPackagePath(), DefObj.GetPackageUniFile()))])
         else:
             self.UniFileClassObject = None
 
@@ -322,9 +327,9 @@ class DecPomAlignment(PackageObject):
         if self.DecParser.BinaryHeadComment:
             Abstract, Description, Copyright, License = \
                 ParseHeaderCommentSection(self.DecParser.BinaryHeadComment,
-                                      ContainerFile, True)
+                                          ContainerFile, True)
 
-            if not Abstract  or not Description or not Copyright or not License:
+            if not Abstract or not Description or not Copyright or not License:
                 Logger.Error('MkPkg',
                              FORMAT_INVALID,
                              ST.ERR_INVALID_BINARYHEADER_FORMAT,
@@ -338,7 +343,7 @@ class DecPomAlignment(PackageObject):
         BinaryAbstractList = []
         BinaryDescriptionList = []
 
-        #Get Binary header from UNI file
+        # Get Binary header from UNI file
         # Initialize the UniStrDict dictionary, top keys are language codes
         UniStrDict = {}
         if self.UniFileClassObject:
@@ -348,19 +353,20 @@ class DecPomAlignment(PackageObject):
                     Lang = GetLanguageCode1766(Lang)
                     if StringDefClassObject.StringName == TAB_DEC_BINARY_ABSTRACT:
                         if (Lang, ConvertSpecialUnicodes(StringDefClassObject.StringValue)) \
-                        not in self.GetBinaryHeaderAbstract():
-                            BinaryAbstractList.append((Lang, ConvertSpecialUnicodes(StringDefClassObject.StringValue)))
+                                not in self.GetBinaryHeaderAbstract():
+                            BinaryAbstractList.append(
+                                (Lang, ConvertSpecialUnicodes(StringDefClassObject.StringValue)))
                     if StringDefClassObject.StringName == TAB_DEC_BINARY_DESCRIPTION:
                         if (Lang, ConvertSpecialUnicodes(StringDefClassObject.StringValue)) \
-                        not in self.GetBinaryHeaderDescription():
+                                not in self.GetBinaryHeaderDescription():
                             BinaryDescriptionList.append((Lang,
                                                           ConvertSpecialUnicodes(StringDefClassObject.StringValue)))
-        #Combine Binary header from DEC file and UNI file
+        # Combine Binary header from DEC file and UNI file
         BinaryAbstractList = self.GetBinaryHeaderAbstract() + BinaryAbstractList
         BinaryDescriptionList = self.GetBinaryHeaderDescription() + BinaryDescriptionList
         BinaryCopyrightList = self.GetBinaryHeaderCopyright()
         BinaryLicenseList = self.GetBinaryHeaderLicense()
-        #Generate the UserExtensionObject for TianoCore."BinaryHeader"
+        # Generate the UserExtensionObject for TianoCore."BinaryHeader"
         if BinaryAbstractList or BinaryDescriptionList or BinaryCopyrightList or BinaryLicenseList:
             BinaryUserExtension = UserExtensionObject()
             BinaryUserExtension.SetBinaryAbstract(BinaryAbstractList)
@@ -369,15 +375,16 @@ class DecPomAlignment(PackageObject):
             BinaryUserExtension.SetBinaryLicense(BinaryLicenseList)
             BinaryUserExtension.SetIdentifier(TAB_BINARY_HEADER_IDENTIFIER)
             BinaryUserExtension.SetUserID(TAB_BINARY_HEADER_USERID)
-            self.SetUserExtensionList(self.GetUserExtensionList() + [BinaryUserExtension])
+            self.SetUserExtensionList(
+                self.GetUserExtensionList() + [BinaryUserExtension])
 
-
-    ## GenIncludes
+    # GenIncludes
     #
     # Gen Includes of Dec
     #
     # @param ContainerFile: The Dec file full path
     #
+
     def GenIncludes(self, ContainerFile):
         if ContainerFile:
             pass
@@ -393,7 +400,8 @@ class DecPomAlignment(PackageObject):
                 if Item.GetArchList() == [TAB_ARCH_COMMON] or IncludesDict[IncludePath] == [TAB_ARCH_COMMON]:
                     IncludesDict[IncludePath] = [TAB_ARCH_COMMON]
                 else:
-                    IncludesDict[IncludePath] = IncludesDict[IncludePath] + Item.GetArchList()
+                    IncludesDict[IncludePath] = IncludesDict[IncludePath] + \
+                        Item.GetArchList()
             else:
                 IncludesDict[IncludePath] = Item.GetArchList()
 
@@ -421,7 +429,7 @@ class DecPomAlignment(PackageObject):
         # to remove the extra path separator '\'
         # as this list is used to search the supported Arch info
         #
-        for IndexN in range (0, len(IncludePathList)):
+        for IndexN in range(0, len(IncludePathList)):
             IncludePathList[IndexN] = os.path.normpath(IncludePathList[IndexN])
         IncludePathList.sort()
         IncludePathList.reverse()
@@ -434,16 +442,19 @@ class DecPomAlignment(PackageObject):
 
         IncludeFileList = []
         for Path in NonOverLapList:
-            FileList = GetFiles(os.path.join(PackagePath, Path), ['CVS', '.svn'], False)
-            IncludeFileList += [os.path.normpath(os.path.join(Path, File)) for File in FileList]
+            FileList = GetFiles(os.path.join(PackagePath, Path), [
+                                'CVS', '.svn'], False)
+            IncludeFileList += [os.path.normpath(os.path.join(Path, File))
+                                for File in FileList]
         for Includefile in IncludeFileList:
             ExtName = os.path.splitext(Includefile)[1]
             if ExtName.upper() == '.DEC' and self.CheckMulDec:
                 Logger.Error('MkPkg',
                              UPT_MUL_DEC_ERROR,
-                             ST.ERR_MUL_DEC_ERROR%(os.path.dirname(ContainerFile),
-                                                   os.path.basename(ContainerFile),
-                                                   Includefile))
+                             ST.ERR_MUL_DEC_ERROR % (os.path.dirname(ContainerFile),
+                                                     os.path.basename(
+                                 ContainerFile),
+                                 Includefile))
 
             FileCombinePath = os.path.dirname(Includefile)
             Include = IncludeObject()
@@ -471,9 +482,10 @@ class DecPomAlignment(PackageObject):
             Include.SetFilePath(IncludePath)
             Include.SetSupArchList(Item.GetArchList())
             PackagePathList.append(Include)
-        self.SetPackageIncludeFileList(PackagePathList + PackageIncludeFileList)
+        self.SetPackageIncludeFileList(
+            PackagePathList + PackageIncludeFileList)
 
-    ## GenPpis
+    # GenPpis
     #
     # Gen Ppis of Dec
     # <CName>=<GuidValue>
@@ -489,6 +501,7 @@ class DecPomAlignment(PackageObject):
         Factory = None
         if Type == TAB_GUIDS:
             Obj = self.DecParser.GetGuidSectionObject()
+
             def CreateGuidObject():
                 Object = GuidObject()
                 Object.SetGuidTypeList([])
@@ -522,7 +535,7 @@ class DecPomAlignment(PackageObject):
         for Item in Obj.GetGuidStyleAllItems():
             Name = Item.GuidCName
             Value = Item.GuidString
-            HelpTxt = ParseGenericComment(Item.GetHeadComment() + \
+            HelpTxt = ParseGenericComment(Item.GetHeadComment() +
                                           Item.GetTailComment())
 
             ListObject = Factory()
@@ -537,7 +550,7 @@ class DecPomAlignment(PackageObject):
             DeclarationsList.append(ListObject)
 
         #
-        #GuidTypeList is abstracted from help
+        # GuidTypeList is abstracted from help
         #
         if Type == TAB_GUIDS:
             self.SetGuidList(self.GetGuidList() + DeclarationsList)
@@ -546,7 +559,7 @@ class DecPomAlignment(PackageObject):
         elif Type == TAB_PPIS:
             self.SetPpiList(self.GetPpiList() + DeclarationsList)
 
-    ## GenLibraryClasses
+    # GenLibraryClasses
     #
     # Gen LibraryClasses of Dec
     # <CName>=<GuidValue>
@@ -565,7 +578,7 @@ class DecPomAlignment(PackageObject):
             LibraryClass.SetLibraryClass(Item.Libraryclass)
             LibraryClass.SetSupArchList(Item.GetArchList())
             LibraryClass.SetIncludeHeader(Item.File)
-            HelpTxt = ParseGenericComment(Item.GetHeadComment() + \
+            HelpTxt = ParseGenericComment(Item.GetHeadComment() +
                                           Item.GetTailComment(), None, '@libraryclass')
             if HelpTxt:
                 if self.UniFileClassObject:
@@ -573,10 +586,10 @@ class DecPomAlignment(PackageObject):
                 LibraryClass.SetHelpTextList([HelpTxt])
             LibraryClassDeclarations.append(LibraryClass)
 
-        self.SetLibraryClassList(self.GetLibraryClassList() + \
+        self.SetLibraryClassList(self.GetLibraryClassList() +
                                  LibraryClassDeclarations)
 
-    ## GenPcds
+    # GenPcds
     #
     # Gen Pcds of Dec
     # <TokenSpcCName>.<TokenCName>|<Value>|<DatumType>|<Token>
@@ -617,15 +630,15 @@ class DecPomAlignment(PackageObject):
                     StrList = StringDefClassObject.StringName.split('_')
                     # StringName format is STR_<TOKENSPACECNAME>_<PCDCNAME>_PROMPT
                     if len(StrList) == 4 and StrList[0] == TAB_STR_TOKENCNAME and StrList[3] == TAB_STR_TOKENPROMPT:
-                        PromptStrList.append((GetLanguageCode1766(Lang), StringDefClassObject.StringName, \
+                        PromptStrList.append((GetLanguageCode1766(Lang), StringDefClassObject.StringName,
                                               StringDefClassObject.StringValue))
                     # StringName format is STR_<TOKENSPACECNAME>_<PCDCNAME>_HELP
                     if len(StrList) == 4 and StrList[0] == TAB_STR_TOKENCNAME and StrList[3] == TAB_STR_TOKENHELP:
-                        HelpStrList.append((GetLanguageCode1766(Lang), StringDefClassObject.StringName, \
+                        HelpStrList.append((GetLanguageCode1766(Lang), StringDefClassObject.StringName,
                                             StringDefClassObject.StringValue))
                     # StringName format is STR_<TOKENSPACECNAME>_ERR_##
                     if len(StrList) == 4 and StrList[0] == TAB_STR_TOKENCNAME and StrList[2] == TAB_STR_TOKENERR:
-                        PcdErrStrList.append((GetLanguageCode1766(Lang), StringDefClassObject.StringName, \
+                        PcdErrStrList.append((GetLanguageCode1766(Lang), StringDefClassObject.StringName,
                                               StringDefClassObject.StringValue))
         #
         # For each PCD type
@@ -638,13 +651,13 @@ class DecPomAlignment(PackageObject):
             #
             for Item in PcdObj.GetPcdsByType(PcdType.upper()):
                 PcdDeclaration = GenPcdDeclaration(
-                        ContainerFile,
-                        (Item.TokenSpaceGuidCName, Item.TokenCName,
-                        Item.DefaultValue, Item.DatumType, Item.TokenValue,
-                        Type, Item.GetHeadComment(), Item.GetTailComment(), ''),
-                        Language,
-                        self.DecParser.GetDefineSectionMacro()
-                        )
+                    ContainerFile,
+                    (Item.TokenSpaceGuidCName, Item.TokenCName,
+                     Item.DefaultValue, Item.DatumType, Item.TokenValue,
+                     Type, Item.GetHeadComment(), Item.GetTailComment(), ''),
+                    Language,
+                    self.DecParser.GetDefineSectionMacro()
+                )
                 PcdDeclaration.SetSupArchList(Item.GetArchListOfType(PcdType))
 
                 #
@@ -652,38 +665,41 @@ class DecPomAlignment(PackageObject):
                 #
                 for PcdErr in PcdDeclaration.GetPcdErrorsList():
                     if (PcdDeclaration.GetTokenSpaceGuidCName(), PcdErr.GetErrorNumber()) \
-                        in self.DecParser.PcdErrorCommentDict:
-                        Key = (PcdDeclaration.GetTokenSpaceGuidCName(), PcdErr.GetErrorNumber())
-                        PcdErr.SetErrorMessageList(PcdErr.GetErrorMessageList() + \
-                                                      [(Language, self.DecParser.PcdErrorCommentDict[Key])])
+                            in self.DecParser.PcdErrorCommentDict:
+                        Key = (PcdDeclaration.GetTokenSpaceGuidCName(),
+                               PcdErr.GetErrorNumber())
+                        PcdErr.SetErrorMessageList(PcdErr.GetErrorMessageList() +
+                                                   [(Language, self.DecParser.PcdErrorCommentDict[Key])])
 
                 for Index in range(0, len(PromptStrList)):
                     StrNameList = PromptStrList[Index][1].split('_')
                     if StrNameList[1].lower() == Item.TokenSpaceGuidCName.lower() and \
-                    StrNameList[2].lower() == Item.TokenCName.lower():
+                            StrNameList[2].lower() == Item.TokenCName.lower():
                         TxtObj = TextObject()
                         TxtObj.SetLang(PromptStrList[Index][0])
                         TxtObj.SetString(PromptStrList[Index][2])
                         for Prompt in PcdDeclaration.GetPromptList():
                             if Prompt.GetLang() == TxtObj.GetLang() and \
-                                Prompt.GetString() == TxtObj.GetString():
+                                    Prompt.GetString() == TxtObj.GetString():
                                 break
                         else:
-                            PcdDeclaration.SetPromptList(PcdDeclaration.GetPromptList() + [TxtObj])
+                            PcdDeclaration.SetPromptList(
+                                PcdDeclaration.GetPromptList() + [TxtObj])
 
                 for Index in range(0, len(HelpStrList)):
                     StrNameList = HelpStrList[Index][1].split('_')
                     if StrNameList[1].lower() == Item.TokenSpaceGuidCName.lower() and \
-                    StrNameList[2].lower() == Item.TokenCName.lower():
+                            StrNameList[2].lower() == Item.TokenCName.lower():
                         TxtObj = TextObject()
                         TxtObj.SetLang(HelpStrList[Index][0])
                         TxtObj.SetString(HelpStrList[Index][2])
                         for HelpStrObj in PcdDeclaration.GetHelpTextList():
                             if HelpStrObj.GetLang() == TxtObj.GetLang() and \
-                                HelpStrObj.GetString() == TxtObj.GetString():
+                                    HelpStrObj.GetString() == TxtObj.GetString():
                                 break
                         else:
-                            PcdDeclaration.SetHelpTextList(PcdDeclaration.GetHelpTextList() + [TxtObj])
+                            PcdDeclaration.SetHelpTextList(
+                                PcdDeclaration.GetHelpTextList() + [TxtObj])
 
                 #
                 # Get PCD error message from UNI file
@@ -691,12 +707,12 @@ class DecPomAlignment(PackageObject):
                 for Index in range(0, len(PcdErrStrList)):
                     StrNameList = PcdErrStrList[Index][1].split('_')
                     if StrNameList[1].lower() == Item.TokenSpaceGuidCName.lower() and \
-                        StrNameList[2].lower() == TAB_STR_TOKENERR.lower():
+                            StrNameList[2].lower() == TAB_STR_TOKENERR.lower():
                         for PcdErr in PcdDeclaration.GetPcdErrorsList():
                             if PcdErr.GetErrorNumber().lower() == (TAB_HEX_START + StrNameList[3]).lower() and \
-                                (PcdErrStrList[Index][0], PcdErrStrList[Index][2]) not in PcdErr.GetErrorMessageList():
-                                PcdErr.SetErrorMessageList(PcdErr.GetErrorMessageList() + \
-                                                            [(PcdErrStrList[Index][0], PcdErrStrList[Index][2])])
+                                    (PcdErrStrList[Index][0], PcdErrStrList[Index][2]) not in PcdErr.GetErrorMessageList():
+                                PcdErr.SetErrorMessageList(PcdErr.GetErrorMessageList() +
+                                                           [(PcdErrStrList[Index][0], PcdErrStrList[Index][2])])
 
                 #
                 # Check to prevent missing error message if a Pcd has the error code.
@@ -790,9 +806,10 @@ class DecPomAlignment(PackageObject):
             for MatchedItem in MatchedList:
                 if MatchedItem not in self.PcdDefaultValueDict:
                     Logger.Error("Dec File Parser", FORMAT_INVALID, Message=ST.ERR_DECPARSE_PCD_NODEFINED % MatchedItem,
-                                     File=self.FullPath)
+                                 File=self.FullPath)
 
-                ReplaceValue = ReplaceValue.replace(MatchedItem, self.PcdDefaultValueDict[MatchedItem])
+                ReplaceValue = ReplaceValue.replace(
+                    MatchedItem, self.PcdDefaultValueDict[MatchedItem])
 
         return ReplaceValue
 
@@ -802,11 +819,12 @@ class DecPomAlignment(PackageObject):
     def CheckPcdValue(self):
         for Pcd in self.GetPcdList():
             self.PcdDefaultValueDict[TAB_SPLIT.join((Pcd.GetTokenSpaceGuidCName(), Pcd.GetCName())).strip()] = \
-            Pcd.GetDefaultValue()
+                Pcd.GetDefaultValue()
 
         for Pcd in self.GetPcdList():
             ValidationExpressions = []
-            PcdGuidName = TAB_SPLIT.join((Pcd.GetTokenSpaceGuidCName(), Pcd.GetCName()))
+            PcdGuidName = TAB_SPLIT.join(
+                (Pcd.GetTokenSpaceGuidCName(), Pcd.GetCName()))
             Valids = Pcd.GetPcdErrorsList()
             for Valid in Valids:
                 Expression = Valid.GetExpression()
@@ -819,28 +837,35 @@ class DecPomAlignment(PackageObject):
                     if QuotedMatchedObj:
                         MatchedStr = QuotedMatchedObj.group().strip()
                         if MatchedStr.startswith('L'):
-                            Expression = Expression.replace(MatchedStr, MatchedStr[1:].strip())
+                            Expression = Expression.replace(
+                                MatchedStr, MatchedStr[1:].strip())
 
                     Expression = self.ReplaceForEval(Expression, IsExpr=True)
                     Expression = Expression.replace(PcdGuidName, 'x')
-                    Message = self.GetEnErrorMessage(Valid.GetErrorMessageList())
+                    Message = self.GetEnErrorMessage(
+                        Valid.GetErrorMessageList())
                     ValidationExpressions.append((Expression, Message))
 
                 ValidList = Valid.GetValidValue()
                 if ValidList:
-                    ValidValue = 'x in %s' % [eval(v) for v in ValidList.split(' ') if v]
-                    Message = self.GetEnErrorMessage(Valid.GetErrorMessageList())
+                    ValidValue = 'x in %s' % [
+                        eval(v) for v in ValidList.split(' ') if v]
+                    Message = self.GetEnErrorMessage(
+                        Valid.GetErrorMessageList())
                     ValidationExpressions.append((ValidValue, Message))
 
                 ValidValueRange = Valid.GetValidValueRange()
                 if ValidValueRange:
-                    ValidValueRange = self.ReplaceForEval(ValidValueRange, IsRange=True)
+                    ValidValueRange = self.ReplaceForEval(
+                        ValidValueRange, IsRange=True)
                     if ValidValueRange.find('-') >= 0:
-                        ValidValueRange = ValidValueRange.replace('-', '<= x <=')
+                        ValidValueRange = ValidValueRange.replace(
+                            '-', '<= x <=')
                     elif not ValidValueRange.startswith('x ') and not ValidValueRange.startswith('not ') \
-                        and not ValidValueRange.startswith('not(') and not ValidValueRange.startswith('('):
+                            and not ValidValueRange.startswith('not(') and not ValidValueRange.startswith('('):
                         ValidValueRange = 'x %s' % ValidValueRange
-                    Message = self.GetEnErrorMessage(Valid.GetErrorMessageList())
+                    Message = self.GetEnErrorMessage(
+                        Valid.GetErrorMessageList())
                     ValidationExpressions.append((ValidValueRange, Message))
 
             DefaultValue = self.PcdDefaultValueDict[PcdGuidName.strip()]
@@ -852,31 +877,32 @@ class DecPomAlignment(PackageObject):
             if QuotedMatchedObj:
                 MatchedStr = QuotedMatchedObj.group().strip()
                 if MatchedStr.startswith('L'):
-                    DefaultValue = DefaultValue.replace(MatchedStr, MatchedStr[1:].strip())
+                    DefaultValue = DefaultValue.replace(
+                        MatchedStr, MatchedStr[1:].strip())
 
             try:
                 DefaultValue = eval(DefaultValue.replace('TRUE', 'True').replace('true', 'True')
-                                        .replace('FALSE', 'False').replace('false', 'False'))
+                                    .replace('FALSE', 'False').replace('false', 'False'))
             except BaseException:
                 pass
 
             for (Expression, Msg) in ValidationExpressions:
                 try:
-                    if not eval(Expression, {'x':DefaultValue}):
-                        Logger.Error("Dec File Parser", FORMAT_INVALID, ExtraData='%s, value = %s' %\
+                    if not eval(Expression, {'x': DefaultValue}):
+                        Logger.Error("Dec File Parser", FORMAT_INVALID, ExtraData='%s, value = %s' %
                                      (PcdGuidName, DefaultValue), Message=Msg, File=self.FullPath)
                 except TypeError:
-                    Logger.Error("Dec File Parser", FORMAT_INVALID, ExtraData=PcdGuidName, \
-                                    Message=Msg, File=self.FullPath)
+                    Logger.Error("Dec File Parser", FORMAT_INVALID, ExtraData=PcdGuidName,
+                                 Message=Msg, File=self.FullPath)
 
-    ## GenModuleFileList
+    # GenModuleFileList
     #
     def GenModuleFileList(self, ContainerFile):
         ModuleFileList = []
         ContainerFileName = os.path.basename(ContainerFile)
         ContainerFilePath = os.path.dirname(ContainerFile)
         for Item in GetFiles(ContainerFilePath,
-                        ['CVS', '.svn'] + self.GetIncludePathList(), False):
+                             ['CVS', '.svn'] + self.GetIncludePathList(), False):
             ExtName = os.path.splitext(Item)[1]
             if ExtName.lower() == '.inf':
                 ModuleFileList.append(Item)
@@ -885,13 +911,13 @@ class DecPomAlignment(PackageObject):
                     continue
                 Logger.Error('MkPkg',
                              UPT_MUL_DEC_ERROR,
-                             ST.ERR_MUL_DEC_ERROR%(ContainerFilePath,
-                                                   ContainerFileName,
-                                                   Item))
+                             ST.ERR_MUL_DEC_ERROR % (ContainerFilePath,
+                                                     ContainerFileName,
+                                                     Item))
 
         self.SetModuleFileList(ModuleFileList)
 
-    ## Show detailed information of Package
+    # Show detailed information of Package
     #
     # Print all members and their values of Package class
     #
@@ -901,12 +927,12 @@ class DecPomAlignment(PackageObject):
         print('\nVersion =', self.GetVersion())
         print('\nGuid =', self.GetGuid())
 
-        print('\nStandardIncludes = %d ' \
-            % len(self.GetStandardIncludeFileList()), end=' ')
+        print('\nStandardIncludes = %d '
+              % len(self.GetStandardIncludeFileList()), end=' ')
         for Item in self.GetStandardIncludeFileList():
             print(Item.GetFilePath(), '  ', Item.GetSupArchList())
-        print('\nPackageIncludes = %d \n' \
-            % len(self.GetPackageIncludeFileList()), end=' ')
+        print('\nPackageIncludes = %d \n'
+              % len(self.GetPackageIncludeFileList()), end=' ')
         for Item in self.GetPackageIncludeFileList():
             print(Item.GetFilePath(), '  ', Item.GetSupArchList())
 
@@ -921,16 +947,16 @@ class DecPomAlignment(PackageObject):
             print(Item.GetCName(), Item.GetGuid(), Item.GetSupArchList())
         print('\nLibraryClasses =', self.GetLibraryClassList())
         for Item in self.GetLibraryClassList():
-            print(Item.GetLibraryClass(), Item.GetRecommendedInstance(), \
-            Item.GetSupArchList())
+            print(Item.GetLibraryClass(), Item.GetRecommendedInstance(),
+                  Item.GetSupArchList())
         print('\nPcds =', self.GetPcdList())
         for Item in self.GetPcdList():
-            print('CName=', Item.GetCName(), 'TokenSpaceGuidCName=', \
-                Item.GetTokenSpaceGuidCName(), \
-                'DefaultValue=', Item.GetDefaultValue(), \
-                'ValidUsage=', Item.GetValidUsage(), \
-                'SupArchList', Item.GetSupArchList(), \
-                'Token=', Item.GetToken(), 'DatumType=', Item.GetDatumType())
+            print('CName=', Item.GetCName(), 'TokenSpaceGuidCName=',
+                  Item.GetTokenSpaceGuidCName(),
+                  'DefaultValue=', Item.GetDefaultValue(),
+                  'ValidUsage=', Item.GetValidUsage(),
+                  'SupArchList', Item.GetSupArchList(),
+                  'Token=', Item.GetToken(), 'DatumType=', Item.GetDatumType())
 
         for Item in self.GetMiscFileList():
             print(Item.GetName())
@@ -938,7 +964,7 @@ class DecPomAlignment(PackageObject):
                 print(FileObjectItem.GetURI())
         print('****************\n')
 
-## GenPcdDeclaration
+# GenPcdDeclaration
 #
 # @param ContainerFile:   File name of the DEC file
 # @param PcdInfo:         Pcd information, of format (TokenGuidCName,
@@ -946,6 +972,8 @@ class DecPomAlignment(PackageObject):
 #                         GenericComment, TailComment, Arch)
 # @param Language: The language of HelpText, Prompt
 #
+
+
 def GenPcdDeclaration(ContainerFile, PcdInfo, Language, MacroReplaceDict):
     HelpStr = ''
     PromptStr = ''
@@ -979,7 +1007,7 @@ def GenPcdDeclaration(ContainerFile, PcdInfo, Language, MacroReplaceDict):
 
     if TailComment:
         SupModuleList, TailHelpStr = ParseDecPcdTailComment(TailComment,
-                                                        ContainerFile)
+                                                            ContainerFile)
         if SupModuleList:
             Pcd.SetSupModuleList(SupModuleList)
 
diff --git a/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py b/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py
index 9c406e5f49e3..e57c1749f2a1 100644
--- a/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py
+++ b/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py
@@ -1,4 +1,4 @@
-## @file InfPomAlignment.py
+# @file InfPomAlignment.py
 # This file contained the adapter for convert INF parser object to POM Object
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -47,12 +47,14 @@ from Parser import InfParser
 from PomAdapter.DecPomAlignment import DecPomAlignment
 from Common.MultipleWorkspace import MultipleWorkspace as mws
 
-## InfPomAlignment
+# InfPomAlignment
 #
 # Inherit from ModuleObject
 #
+
+
 class InfPomAlignment(ModuleObject):
-    ## Construct of InfPomAlignment
+    # Construct of InfPomAlignment
     # Skip means that UPT don't care the syntax of INF, this may be the not
     # distributed INF files during creation or the INF files checked for
     # dependency rule during remove.
@@ -110,7 +112,7 @@ class InfPomAlignment(ModuleObject):
         self._GenGuidProtocolPpis(DT.TAB_PPIS)
         self._GenDepexes()
 
-    ## Convert [Defines] section content to InfDefObject
+    # Convert [Defines] section content to InfDefObject
     #
     # Convert [Defines] section content to InfDefObject
     #
@@ -152,7 +154,8 @@ class InfPomAlignment(ModuleObject):
         ModulePath = os.path.split(CombinePath)[0]
         ModuleRelativePath = ModulePath
         if self.GetPackagePath() != '':
-            ModuleRelativePath = GetRelativePath(ModulePath, self.GetPackagePath())
+            ModuleRelativePath = GetRelativePath(
+                ModulePath, self.GetPackagePath())
         self.SetModulePath(ModuleRelativePath)
         #
         # For Define Seciton Items.
@@ -173,7 +176,8 @@ class InfPomAlignment(ModuleObject):
         SpecList = DefineObj.GetSpecification()
         NewSpecList = []
         for SpecItem in SpecList:
-            NewSpecList.append((SpecItem[0], ConvertVersionToDecimal(SpecItem[1])))
+            NewSpecList.append(
+                (SpecItem[0], ConvertVersionToDecimal(SpecItem[1])))
         self.SetSpecList(NewSpecList)
 
         #
@@ -205,7 +209,8 @@ class InfPomAlignment(ModuleObject):
         else:
             self.SetBaseName(DefineObj.GetBaseName().GetValue())
         if DefineObj.GetModuleUniFileName():
-            self.UniFileClassObject = UniFileClassObject([PathClass(DefineObj.GetModuleUniFileName())])
+            self.UniFileClassObject = UniFileClassObject(
+                [PathClass(DefineObj.GetModuleUniFileName())])
         else:
             self.UniFileClassObject = None
         if DefineObj.GetInfVersion() is None:
@@ -248,7 +253,8 @@ class InfPomAlignment(ModuleObject):
         if DefineObj.GetShadow():
             ModuleTypeValue = DefineObj.GetModuleType().GetValue()
             if not (ModuleTypeValue == 'SEC' or ModuleTypeValue == 'PEI_CORE' or ModuleTypeValue == 'PEIM'):
-                Logger.Error("InfParser", FORMAT_INVALID, ST.ERR_INF_PARSER_DEFINE_SHADOW_INVALID, File=self.FullPath)
+                Logger.Error("InfParser", FORMAT_INVALID,
+                             ST.ERR_INF_PARSER_DEFINE_SHADOW_INVALID, File=self.FullPath)
 
         if DefineObj.GetPcdIsDriver() is not None:
             self.SetPcdIsDriver(DefineObj.GetPcdIsDriver().GetValue())
@@ -284,7 +290,8 @@ class InfPomAlignment(ModuleObject):
             UserExtension.SetDefinesDict(DefinesDictNew)
             UserExtension.SetIdentifier('DefineModifiers')
             UserExtension.SetUserID('EDK2')
-            self.SetUserExtensionList(self.GetUserExtensionList() + [UserExtension])
+            self.SetUserExtensionList(
+                self.GetUserExtensionList() + [UserExtension])
         #
         # Get all meta-file header information
         # the record is list of items formatted:
@@ -311,15 +318,18 @@ class InfPomAlignment(ModuleObject):
         #
         InfBinaryHeaderObj = self.Parser.InfBinaryHeader
         if InfBinaryHeaderObj.GetAbstract():
-            self.SetBinaryHeaderAbstract((Lang, InfBinaryHeaderObj.GetAbstract()))
+            self.SetBinaryHeaderAbstract(
+                (Lang, InfBinaryHeaderObj.GetAbstract()))
         if InfBinaryHeaderObj.GetDescription():
-            self.SetBinaryHeaderDescription((Lang, InfBinaryHeaderObj.GetDescription()))
+            self.SetBinaryHeaderDescription(
+                (Lang, InfBinaryHeaderObj.GetDescription()))
         if InfBinaryHeaderObj.GetCopyright():
-            self.SetBinaryHeaderCopyright(('', InfBinaryHeaderObj.GetCopyright()))
+            self.SetBinaryHeaderCopyright(
+                ('', InfBinaryHeaderObj.GetCopyright()))
         if InfBinaryHeaderObj.GetLicense():
             self.SetBinaryHeaderLicense(('', InfBinaryHeaderObj.GetLicense()))
 
-    ## GenModuleHeaderLibClass
+    # GenModuleHeaderLibClass
     #
     #
     def _GenModuleHeaderLibClass(self, DefineObj, ArchList):
@@ -334,9 +344,10 @@ class InfPomAlignment(ModuleObject):
             Lib.SetSupArchList(ArchList)
             self.SetLibraryClassList(self.GetLibraryClassList() + [Lib])
             self.SetIsLibrary(True)
-            self.SetIsLibraryModList(self.GetIsLibraryModList() + SupModuleList)
+            self.SetIsLibraryModList(
+                self.GetIsLibraryModList() + SupModuleList)
 
-    ## GenModuleHeaderExterns
+    # GenModuleHeaderExterns
     #
     #
     def _GenModuleHeaderExterns(self, DefineObj):
@@ -382,7 +393,7 @@ class InfPomAlignment(ModuleObject):
             Image.SetDestructor(DestructorItem.GetCName())
             self.SetExternList(self.GetExternList() + [Image])
 
-    ## GenModuleHeaderExterns
+    # GenModuleHeaderExterns
     # BootMode/HOB/Event
     #
     def _GenSpecialComments(self):
@@ -421,7 +432,8 @@ class InfPomAlignment(ModuleObject):
                 BootModeList = []
                 for Item in SpecialCommentsList[Key]:
                     BootMode = BootModeObject()
-                    BootMode.SetSupportedBootModes(Item.GetSupportedBootModes())
+                    BootMode.SetSupportedBootModes(
+                        Item.GetSupportedBootModes())
                     BootMode.SetUsage(Item.GetUsage())
                     if Item.GetHelpString():
                         HelpTextObj = CommonObject.TextObject()
@@ -432,7 +444,7 @@ class InfPomAlignment(ModuleObject):
                     BootModeList.append(BootMode)
                 self.SetBootModeList(BootModeList)
 
-    ## GenBuildOptions
+    # GenBuildOptions
     #
     # Gen BuildOptions of Inf
     # [<Family>:]<ToolFlag>=Flag
@@ -460,14 +472,15 @@ class InfPomAlignment(ModuleObject):
             UserExtension.SetBuildOptionDict(BuildOptionDict)
             UserExtension.SetIdentifier('BuildOptionModifiers')
             UserExtension.SetUserID('EDK2')
-            self.SetUserExtensionList(self.GetUserExtensionList() + [UserExtension])
+            self.SetUserExtensionList(
+                self.GetUserExtensionList() + [UserExtension])
         else:
             #
             # Not process this information, will be processed in GenBinaries()
             #
             pass
 
-    ## GenLibraryClasses
+    # GenLibraryClasses
     #
     # Get LibraryClass of Inf
     # <LibraryClassKeyWord>|<LibraryInstance>
@@ -487,7 +500,8 @@ class InfPomAlignment(ModuleObject):
                     LibraryClass.SetLibraryClass(Item.GetLibName())
                     LibraryClass.SetRecommendedInstance(None)
                     LibraryClass.SetFeatureFlag(Item.GetFeatureFlagExp())
-                    LibraryClass.SetSupArchList(ConvertArchList(Item.GetSupArchList()))
+                    LibraryClass.SetSupArchList(
+                        ConvertArchList(Item.GetSupArchList()))
                     LibraryClass.SetSupModuleList(Item.GetSupModuleList())
                     HelpStringObj = Item.GetHelpString()
                     if HelpStringObj is not None:
@@ -498,9 +512,10 @@ class InfPomAlignment(ModuleObject):
                             HelpTextHeaderObj.SetLang(DT.TAB_LANGUAGE_EN_X)
                         HelpTextHeaderObj.SetString(CommentString)
                         LibraryClass.SetHelpTextList([HelpTextHeaderObj])
-                    self.SetLibraryClassList(self.GetLibraryClassList() + [LibraryClass])
+                    self.SetLibraryClassList(
+                        self.GetLibraryClassList() + [LibraryClass])
 
-    ## GenPackages
+    # GenPackages
     #
     # Gen Packages of Inf
     #
@@ -521,11 +536,15 @@ class InfPomAlignment(ModuleObject):
             # Need package information for dependency check usage
             #
             PackageDependency = PackageDependencyObject()
-            PackageDependency.SetPackageFilePath(NormPath(PackageItemObj.GetPackageName()))
-            PackageDependency.SetSupArchList(ConvertArchList(PackageItemObj.GetSupArchList()))
-            PackageDependency.SetFeatureFlag(PackageItemObj.GetFeatureFlagExp())
+            PackageDependency.SetPackageFilePath(
+                NormPath(PackageItemObj.GetPackageName()))
+            PackageDependency.SetSupArchList(
+                ConvertArchList(PackageItemObj.GetSupArchList()))
+            PackageDependency.SetFeatureFlag(
+                PackageItemObj.GetFeatureFlagExp())
 
-            PkgInfo = GetPkgInfoFromDec(mws.join(self.WorkSpace, NormPath(PackageItemObj.GetPackageName())))
+            PkgInfo = GetPkgInfoFromDec(
+                mws.join(self.WorkSpace, NormPath(PackageItemObj.GetPackageName())))
             if PkgInfo[1] and PkgInfo[2]:
                 PackageDependency.SetGuid(PkgInfo[1])
                 PackageDependency.SetVersion(PkgInfo[2])
@@ -539,7 +558,7 @@ class InfPomAlignment(ModuleObject):
             PackageDependencyList.append(PackageDependency)
             self.SetPackageDependencyList(PackageDependencyList)
 
-    ## GenPcds
+    # GenPcds
     #
     # Gen Pcds of Inf
     # <TokenSpaceGuidCName>.<PcdCName>[|<Value> [|<FFE>]]
@@ -565,22 +584,25 @@ class InfPomAlignment(ModuleObject):
                         for CommentItem in CommentList:
                             Pcd = CommonObject.PcdObject()
                             Pcd.SetCName(PcdItemObj.GetCName())
-                            Pcd.SetTokenSpaceGuidCName(PcdItemObj.GetTokenSpaceGuidCName())
+                            Pcd.SetTokenSpaceGuidCName(
+                                PcdItemObj.GetTokenSpaceGuidCName())
                             Pcd.SetDefaultValue(PcdItemObj.GetDefaultValue())
                             Pcd.SetItemType(PcdType)
                             Pcd.SetValidUsage(CommentItem.GetUsageItem())
                             Pcd.SetFeatureFlag(PcdItemObj.GetFeatureFlagExp())
-                            Pcd.SetSupArchList(ConvertArchList(PcdItemObj.GetSupportArchList()))
+                            Pcd.SetSupArchList(ConvertArchList(
+                                PcdItemObj.GetSupportArchList()))
                             HelpTextObj = CommonObject.TextObject()
                             if self.UniFileClassObject:
                                 HelpTextObj.SetLang(DT.TAB_LANGUAGE_EN_X)
-                            HelpTextObj.SetString(CommentItem.GetHelpStringItem())
+                            HelpTextObj.SetString(
+                                CommentItem.GetHelpStringItem())
                             Pcd.SetHelpTextList([HelpTextObj])
                             PcdList = self.GetPcdList()
                             PcdList.append(Pcd)
                 self.SetPcdList(PcdList)
 
-    ## GenSources
+    # GenSources
     #
     # Gen Sources of Inf
     # <Filename>[|<Family>[|<TagName>[|<ToolCode>[|<PcdFeatureFlag>]]]]
@@ -615,11 +637,11 @@ class InfPomAlignment(ModuleObject):
 
         self.SetSourceFileList(self.GetSourceFileList() + SourceList)
 
-
-    ## GenUserExtensions
+    # GenUserExtensions
     #
     # Gen UserExtensions of Inf
     #
+
     def _GenUserExtensions(self):
         #
         # UserExtensions
@@ -645,19 +667,21 @@ class InfPomAlignment(ModuleObject):
                     self._GenMiscFiles(UserExtensionDataObj.GetContent())
                 UserExtension.SetIdentifier(Identifier)
                 UserExtension.SetStatement(UserExtensionDataObj.GetContent())
-                UserExtension.SetSupArchList(ConvertArchList(UserExtensionDataObj.GetSupArchList()))
-                self.SetUserExtensionList(self.GetUserExtensionList() + [UserExtension])
+                UserExtension.SetSupArchList(ConvertArchList(
+                    UserExtensionDataObj.GetSupArchList()))
+                self.SetUserExtensionList(
+                    self.GetUserExtensionList() + [UserExtension])
 
         #
         #  Gen UserExtensions of TianoCore."BinaryHeader"
         #
 
-        #Get Binary header from INF file
+        # Get Binary header from INF file
         BinaryAbstractList = self.BinaryHeaderAbstractList
         BinaryDescriptionList = self.BinaryHeaderDescriptionList
         BinaryCopyrightList = self.BinaryHeaderCopyrightList
         BinaryLicenseList = self.BinaryHeaderLicenseList
-        #Get Binary header from UNI file
+        # Get Binary header from UNI file
         # Initialize UniStrDict, the top keys are language codes
         UniStrDict = {}
         if self.UniFileClassObject:
@@ -666,9 +690,11 @@ class InfPomAlignment(ModuleObject):
                 for StringDefClassObject in UniStrDict[Lang]:
                     Lang = GetLanguageCode1766(Lang)
                     if StringDefClassObject.StringName == DT.TAB_INF_BINARY_ABSTRACT:
-                        BinaryAbstractList.append((Lang, ConvertSpecialUnicodes(StringDefClassObject.StringValue)))
+                        BinaryAbstractList.append(
+                            (Lang, ConvertSpecialUnicodes(StringDefClassObject.StringValue)))
                     if StringDefClassObject.StringName == DT.TAB_INF_BINARY_DESCRIPTION:
-                        BinaryDescriptionList.append((Lang, ConvertSpecialUnicodes(StringDefClassObject.StringValue)))
+                        BinaryDescriptionList.append(
+                            (Lang, ConvertSpecialUnicodes(StringDefClassObject.StringValue)))
         if BinaryAbstractList or BinaryDescriptionList or BinaryCopyrightList or BinaryLicenseList:
             BinaryUserExtension = CommonObject.UserExtensionObject()
             BinaryUserExtension.SetBinaryAbstract(BinaryAbstractList)
@@ -677,7 +703,8 @@ class InfPomAlignment(ModuleObject):
             BinaryUserExtension.SetBinaryLicense(BinaryLicenseList)
             BinaryUserExtension.SetIdentifier(DT.TAB_BINARY_HEADER_IDENTIFIER)
             BinaryUserExtension.SetUserID(DT.TAB_BINARY_HEADER_USERID)
-            self.SetUserExtensionList(self.GetUserExtensionList() + [BinaryUserExtension])
+            self.SetUserExtensionList(
+                self.GetUserExtensionList() + [BinaryUserExtension])
 
     def _GenDepexesList(self, SmmDepexList, DxeDepexList, PeiDepexList):
         if SmmDepexList:
@@ -687,7 +714,7 @@ class InfPomAlignment(ModuleObject):
         if PeiDepexList:
             self.SetPeiDepex(PeiDepexList)
 
-    ## GenDepexes
+    # GenDepexes
     #
     # Gen Depex of Inf
     #
@@ -737,7 +764,8 @@ class InfPomAlignment(ModuleObject):
                     ModuleType = self.ModuleType
                 if ModuleType not in DT.VALID_DEPEX_MODULE_TYPE_LIST:
                     Logger.Error("\nMkPkg", PARSER_ERROR,
-                                 ST.ERR_INF_PARSER_DEPEX_SECTION_MODULE_TYPE_ERROR % (ModuleType),
+                                 ST.ERR_INF_PARSER_DEPEX_SECTION_MODULE_TYPE_ERROR % (
+                                     ModuleType),
                                  self.GetFullPath(), RaiseError=True)
                 if ModuleType != self.ModuleType:
                     Logger.Error("\nMkPkg", PARSER_ERROR, ST.ERR_INF_PARSER_DEPEX_SECTION_NOT_DETERMINED,
@@ -757,7 +785,8 @@ class InfPomAlignment(ModuleObject):
                     HelpIns = CommonObject.TextObject()
                     if self.UniFileClassObject:
                         HelpIns.SetLang(DT.TAB_LANGUAGE_EN_X)
-                    HelpIns.SetString(GetHelpStringByRemoveHashKey(Depex.HelpString))
+                    HelpIns.SetString(
+                        GetHelpStringByRemoveHashKey(Depex.HelpString))
                     DepexIns.SetHelpText(HelpIns)
 
                 if ModuleType in SMM_LIST:
@@ -773,11 +802,11 @@ class InfPomAlignment(ModuleObject):
                         Logger.Error("\nMkPkg", PARSER_ERROR, ST.ERR_INF_PARSER_DEPEX_SECTION_INVALID_FOR_DRIVER,
                                      self.GetFullPath(), RaiseError=True)
 
-            #End of for ModuleType in ModuleTypeList
+            # End of for ModuleType in ModuleTypeList
             self._GenDepexesList(SmmDepexList, DxeDepexList, PeiDepexList)
-        #End of for Depex in DepexData
+        # End of for Depex in DepexData
 
-    ## GenBinaries
+    # GenBinaries
     #
     # Gen Binary of Inf, must be called after Pcd/Library is generated
     # <FileType>|<Filename>|<Target>[|<TokenSpaceGuidCName>.<PcdCName>]
@@ -829,7 +858,8 @@ class InfPomAlignment(ModuleObject):
         #
         # PatchPcd and PcdEx
         #
-        AsBuildIns = self._GenAsBuiltPcds(self.Parser.InfPcdSection.GetPcds(), AsBuildIns)
+        AsBuildIns = self._GenAsBuiltPcds(
+            self.Parser.InfPcdSection.GetPcds(), AsBuildIns)
 
         #
         # Parse the DEC file that contains the GUID value of the GUID CName which is used by
@@ -843,13 +873,15 @@ class InfPomAlignment(ModuleObject):
                 TempPath = ModulePath
                 ModulePath = os.path.dirname(ModulePath)
             PackageName = TempPath
-            DecFilePath = os.path.normpath(os.path.join(WorkSpace, PackageName))
+            DecFilePath = os.path.normpath(
+                os.path.join(WorkSpace, PackageName))
             if DecFilePath:
                 for File in os.listdir(DecFilePath):
                     if File.upper().endswith('.DEC'):
-                        DecFileFullPath = os.path.normpath(os.path.join(DecFilePath, File))
-                        DecObjList.append(DecPomAlignment(DecFileFullPath, self.WorkSpace))
-
+                        DecFileFullPath = os.path.normpath(
+                            os.path.join(DecFilePath, File))
+                        DecObjList.append(DecPomAlignment(
+                            DecFileFullPath, self.WorkSpace))
 
         BinariesDict, AsBuildIns, BinaryFileObjectList = GenBinaryData(BinaryData, BinaryObj,
                                                                        BinariesDict,
@@ -882,9 +914,10 @@ class InfPomAlignment(ModuleObject):
             UserExtension.SetBinariesDict(BinariesDict2)
             UserExtension.SetIdentifier('BinaryFileModifiers')
             UserExtension.SetUserID('EDK2')
-            self.SetUserExtensionList(self.GetUserExtensionList() + [UserExtension])
+            self.SetUserExtensionList(
+                self.GetUserExtensionList() + [UserExtension])
 
-    ## GenAsBuiltPcds
+    # GenAsBuiltPcds
     #
     #
     def _GenAsBuiltPcds(self, PcdList, AsBuildIns):
@@ -902,16 +935,18 @@ class InfPomAlignment(ModuleObject):
                 if PcdItemObj.GetTokenSpaceGuidValue() == '' and self.BinaryModule:
                     Logger.Error("\nMkPkg",
                                  PARSER_ERROR,
-                                 ST.ERR_ASBUILD_PCD_TOKENSPACE_GUID_VALUE_MISS % \
+                                 ST.ERR_ASBUILD_PCD_TOKENSPACE_GUID_VALUE_MISS %
                                  (PcdItemObj.GetTokenSpaceGuidCName()),
                                  self.GetFullPath(), RaiseError=True)
                 else:
-                    Pcd.SetTokenSpaceGuidValue(PcdItemObj.GetTokenSpaceGuidValue())
+                    Pcd.SetTokenSpaceGuidValue(
+                        PcdItemObj.GetTokenSpaceGuidValue())
                 if (PcdItemObj.GetToken() == '' or PcdItemObj.GetDatumType() == '') and self.BinaryModule:
                     Logger.Error("\nMkPkg",
                                  PARSER_ERROR,
-                                 ST.ERR_ASBUILD_PCD_DECLARITION_MISS % \
-                                 (PcdItemObj.GetTokenSpaceGuidCName() + '.' + PcdItemObj.GetCName()),
+                                 ST.ERR_ASBUILD_PCD_DECLARITION_MISS %
+                                 (PcdItemObj.GetTokenSpaceGuidCName() +
+                                  '.' + PcdItemObj.GetCName()),
                                  self.GetFullPath(), RaiseError=True)
                 Pcd.SetToken(PcdItemObj.GetToken())
                 Pcd.SetDatumType(PcdItemObj.GetDatumType())
@@ -920,7 +955,8 @@ class InfPomAlignment(ModuleObject):
                 Pcd.SetOffset(PcdItemObj.GetOffset())
                 Pcd.SetItemType(PcdItem[0])
                 Pcd.SetFeatureFlag(PcdItemObj.GetFeatureFlagExp())
-                Pcd.SetSupArchList(ConvertArchList(PcdItemObj.GetSupportArchList()))
+                Pcd.SetSupArchList(ConvertArchList(
+                    PcdItemObj.GetSupportArchList()))
                 Pcd.SetValidUsage(PcdItemObj.GetValidUsage())
                 for CommentItem in PcdItemObj.GetHelpStringList():
                     HelpTextObj = CommonObject.TextObject()
@@ -939,7 +975,8 @@ class InfPomAlignment(ModuleObject):
                 Pcd.SetDefaultValue(PcdItemObj.GetDefaultValue())
                 Pcd.SetItemType(PcdItem[0])
                 Pcd.SetFeatureFlag(PcdItemObj.GetFeatureFlagExp())
-                Pcd.SetSupArchList(ConvertArchList(PcdItemObj.GetSupportArchList()))
+                Pcd.SetSupArchList(ConvertArchList(
+                    PcdItemObj.GetSupportArchList()))
                 Pcd.SetValidUsage(PcdItemObj.GetValidUsage())
                 for CommentItem in PcdItemObj.GetHelpStringList():
                     HelpTextObj = CommonObject.TextObject()
@@ -953,7 +990,7 @@ class InfPomAlignment(ModuleObject):
 
         return AsBuildIns
 
-    ## GenGuidProtocolPpis
+    # GenGuidProtocolPpis
     #
     # Gen Guids/Protocol/Ppis of INF
     # <CName>=<GuidValue>
@@ -980,12 +1017,15 @@ class InfPomAlignment(ModuleObject):
                 if CommentList:
                     for GuidComentItem in CommentList:
                         ListObject = CommonObject.GuidObject()
-                        ListObject.SetGuidTypeList([GuidComentItem.GetGuidTypeItem()])
-                        ListObject.SetVariableName(GuidComentItem.GetVariableNameItem())
+                        ListObject.SetGuidTypeList(
+                            [GuidComentItem.GetGuidTypeItem()])
+                        ListObject.SetVariableName(
+                            GuidComentItem.GetVariableNameItem())
                         ListObject.SetUsage(GuidComentItem.GetUsageItem())
                         ListObject.SetName(Item.GetName())
                         ListObject.SetCName(Item.GetName())
-                        ListObject.SetSupArchList(ConvertArchList(Item.GetSupArchList()))
+                        ListObject.SetSupArchList(
+                            ConvertArchList(Item.GetSupArchList()))
                         ListObject.SetFeatureFlag(Item.GetFeatureFlagExp())
                         HelpString = GuidComentItem.GetHelpStringItem()
                         if HelpString.strip():
@@ -1003,7 +1043,8 @@ class InfPomAlignment(ModuleObject):
                 for CommentItem in CommentList:
                     ListObject = CommonObject.ProtocolObject()
                     ListObject.SetCName(Item.GetName())
-                    ListObject.SetSupArchList(ConvertArchList(Item.GetSupArchList()))
+                    ListObject.SetSupArchList(
+                        ConvertArchList(Item.GetSupArchList()))
                     ListObject.SetFeatureFlag(Item.GetFeatureFlagExp())
                     ListObject.SetNotify(CommentItem.GetNotify())
                     ListObject.SetUsage(CommentItem.GetUsageItem())
@@ -1022,7 +1063,8 @@ class InfPomAlignment(ModuleObject):
                 for CommentItem in CommentList:
                     ListObject = CommonObject.PpiObject()
                     ListObject.SetCName(Item.GetName())
-                    ListObject.SetSupArchList(ConvertArchList(Item.GetSupArchList()))
+                    ListObject.SetSupArchList(
+                        ConvertArchList(Item.GetSupArchList()))
                     ListObject.SetFeatureFlag(Item.GetFeatureFlagExp())
                     ListObject.SetNotify(CommentItem.GetNotify())
                     ListObject.SetUsage(CommentItem.GetUsage())
@@ -1042,7 +1084,7 @@ class InfPomAlignment(ModuleObject):
         elif Type == DT.TAB_PPIS:
             self.SetPpiList(self.GetPpiList() + GuidProtocolPpiList)
 
-    ## GenMiscFiles
+    # GenMiscFiles
     #
     # Gen MiscellaneousFiles of Inf
     #
@@ -1060,12 +1102,13 @@ class InfPomAlignment(ModuleObject):
                 if IsValidPath(FileName, GlobalData.gINF_MODULE_DIR):
                     FileObj = CommonObject.FileObject()
                     FileObj.SetURI(FileName)
-                    MiscFileObj.SetFileList(MiscFileObj.GetFileList()+[FileObj])
+                    MiscFileObj.SetFileList(
+                        MiscFileObj.GetFileList()+[FileObj])
                 else:
                     Logger.Error("InfParser",
                                  FORMAT_INVALID,
-                                 ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID%(Line),
+                                 ST.ERR_INF_PARSER_FILE_NOT_EXIST_OR_NAME_INVALID % (
+                                     Line),
                                  File=GlobalData.gINF_MODULE_NAME,
                                  ExtraData=Line)
         self.SetMiscFileList(self.GetMiscFileList()+[MiscFileObj])
-
diff --git a/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py b/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py
index 08a6b257dbef..b93ddee7ccb7 100644
--- a/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py
+++ b/BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py
@@ -1,4 +1,4 @@
-## @file InfPomAlignmentMisc.py
+# @file InfPomAlignmentMisc.py
 # This file contained the routines for InfPomAlignment
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -23,7 +23,7 @@ from Library.Misc import CheckGuidRegFormat
 from Logger import StringTable as ST
 
 
-## GenModuleHeaderUserExt
+# GenModuleHeaderUserExt
 #
 #
 def GenModuleHeaderUserExt(DefineObj, ArchString):
@@ -120,7 +120,8 @@ def GenModuleHeaderUserExt(DefineObj, ArchString):
             #
             if len(CustomMakefileItem) == 3:
                 if CustomMakefileItem[0] != '':
-                    Value = CustomMakefileItem[0] + ' | ' + CustomMakefileItem[1]
+                    Value = CustomMakefileItem[0] + \
+                        ' | ' + CustomMakefileItem[1]
                 else:
                     Value = CustomMakefileItem[1]
 
@@ -146,7 +147,7 @@ def GenModuleHeaderUserExt(DefineObj, ArchString):
     return DefinesDictNew
 
 
-## Generate the define statement that will be put into userextension
+# Generate the define statement that will be put into userextension
 #  Not support comments.
 #
 # @param HeaderComment: the original header comment (# not removed)
@@ -162,10 +163,12 @@ def _GenInfDefineStateMent(HeaderComment, Name, Value, TailComment):
 
     return Statement
 
-## GenBinaryData
+# GenBinaryData
 #
 #
-def GenBinaryData(BinaryData, BinaryObj, BinariesDict, AsBuildIns, BinaryFileObjectList, \
+
+
+def GenBinaryData(BinaryData, BinaryObj, BinariesDict, AsBuildIns, BinaryFileObjectList,
                   SupArchList, BinaryModule, DecObjList=None):
     if BinaryModule:
         pass
@@ -206,16 +209,16 @@ def GenBinaryData(BinaryData, BinaryObj, BinariesDict, AsBuildIns, BinaryFileObj
             if not CheckGuidRegFormat(ItemObj.GetGuidValue()):
                 if not DecObjList:
                     if DT.TAB_HORIZON_LINE_SPLIT in ItemObj.GetGuidValue() or \
-                        DT.TAB_COMMA_SPLIT in ItemObj.GetGuidValue():
+                            DT.TAB_COMMA_SPLIT in ItemObj.GetGuidValue():
                         Logger.Error("\nMkPkg",
-                                 FORMAT_INVALID,
-                                 ST.ERR_DECPARSE_DEFINE_PKGGUID,
-                                 ExtraData=ItemObj.GetGuidValue(),
-                                 RaiseError=True)
+                                     FORMAT_INVALID,
+                                     ST.ERR_DECPARSE_DEFINE_PKGGUID,
+                                     ExtraData=ItemObj.GetGuidValue(),
+                                     RaiseError=True)
                     else:
                         Logger.Error("\nMkPkg",
                                      FORMAT_INVALID,
-                                     ST.ERR_UNI_SUBGUID_VALUE_DEFINE_DEC_NOT_FOUND % \
+                                     ST.ERR_UNI_SUBGUID_VALUE_DEFINE_DEC_NOT_FOUND %
                                      (ItemObj.GetGuidValue()),
                                      RaiseError=True)
                 else:
@@ -227,10 +230,10 @@ def GenBinaryData(BinaryData, BinaryObj, BinariesDict, AsBuildIns, BinaryFileObj
 
                     if not FileNameObj.GetGuidValue():
                         Logger.Error("\nMkPkg",
-                                         FORMAT_INVALID,
-                                         ST.ERR_DECPARSE_CGUID_NOT_FOUND % \
-                                         (ItemObj.GetGuidValue()),
-                                         RaiseError=True)
+                                     FORMAT_INVALID,
+                                     ST.ERR_DECPARSE_CGUID_NOT_FOUND %
+                                     (ItemObj.GetGuidValue()),
+                                     RaiseError=True)
             else:
                 FileNameObj.SetGuidValue(ItemObj.GetGuidValue().strip())
 
diff --git a/BaseTools/Source/Python/UPT/PomAdapter/__init__.py b/BaseTools/Source/Python/UPT/PomAdapter/__init__.py
index a7c7e9dbf70c..1143abb3afd8 100644
--- a/BaseTools/Source/Python/UPT/PomAdapter/__init__.py
+++ b/BaseTools/Source/Python/UPT/PomAdapter/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'Parser' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/UPT/ReplacePkg.py b/BaseTools/Source/Python/UPT/ReplacePkg.py
index 03b91dab8455..02b5c7141054 100644
--- a/BaseTools/Source/Python/UPT/ReplacePkg.py
+++ b/BaseTools/Source/Python/UPT/ReplacePkg.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Replace distribution package.
 #
 # Copyright (c) 2014 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -30,7 +30,7 @@ from InstallPkg import InstallDp
 from RmPkg import GetInstalledDpInfo
 from RmPkg import RemoveDist
 
-## Tool entrance method
+# Tool entrance method
 #
 # This method mainly dispatch specific methods per the command line options.
 # If no error found, return zero value so the caller of this tool can know
@@ -38,15 +38,18 @@ from RmPkg import RemoveDist
 #
 # @param  Options: command Options
 #
-def Main(Options = None):
+
+
+def Main(Options=None):
     ContentZipFile, DistFile = None, None
     try:
         DataBase = GlobalData.gDB
         WorkspaceDir = GlobalData.gWORKSPACE
         Dep = DependencyRules(DataBase)
-        DistPkg, ContentZipFile, DpPkgFileName, DistFile = UnZipDp(WorkspaceDir, Options.PackFileToReplace)
+        DistPkg, ContentZipFile, DpPkgFileName, DistFile = UnZipDp(
+            WorkspaceDir, Options.PackFileToReplace)
 
-        StoredDistFile, OrigDpGuid, OrigDpVersion = GetInstalledDpInfo(Options.PackFileToBeReplaced, \
+        StoredDistFile, OrigDpGuid, OrigDpVersion = GetInstalledDpInfo(Options.PackFileToBeReplaced,
                                                                        Dep, DataBase, WorkspaceDir)
 
         #
@@ -57,35 +60,38 @@ def Main(Options = None):
         #
         # Remove the old distribution
         #
-        RemoveDist(OrigDpGuid, OrigDpVersion, StoredDistFile, DataBase, WorkspaceDir, Options.Yes)
+        RemoveDist(OrigDpGuid, OrigDpVersion, StoredDistFile,
+                   DataBase, WorkspaceDir, Options.Yes)
 
         #
         # Install the new distribution
         #
-        InstallDp(DistPkg, DpPkgFileName, ContentZipFile, Options, Dep, WorkspaceDir, DataBase)
+        InstallDp(DistPkg, DpPkgFileName, ContentZipFile,
+                  Options, Dep, WorkspaceDir, DataBase)
         ReturnCode = 0
 
     except FatalError as XExcept:
         ReturnCode = XExcept.args[0]
         if Logger.GetLevel() <= Logger.DEBUG_9:
             Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(),
-                platform) + format_exc())
+                                             platform) + format_exc())
     except KeyboardInterrupt:
         ReturnCode = ABORT_ERROR
         if Logger.GetLevel() <= Logger.DEBUG_9:
             Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(),
-                platform) + format_exc())
+                                             platform) + format_exc())
     except:
         ReturnCode = CODE_ERROR
         Logger.Error(
-                    "\nReplacePkg",
-                    CODE_ERROR,
-                    ST.ERR_UNKNOWN_FATAL_REPLACE_ERR % (Options.PackFileToReplace, Options.PackFileToBeReplaced),
-                    ExtraData=ST.MSG_SEARCH_FOR_HELP % ST.MSG_EDKII_MAIL_ADDR,
-                    RaiseError=False
-                    )
+            "\nReplacePkg",
+            CODE_ERROR,
+            ST.ERR_UNKNOWN_FATAL_REPLACE_ERR % (
+                Options.PackFileToReplace, Options.PackFileToBeReplaced),
+            ExtraData=ST.MSG_SEARCH_FOR_HELP % ST.MSG_EDKII_MAIL_ADDR,
+            RaiseError=False
+        )
         Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(),
-            platform) + format_exc())
+                                         platform) + format_exc())
 
     finally:
         Logger.Quiet(ST.MSG_REMOVE_TEMP_FILE_STARTED)
@@ -103,13 +109,15 @@ def Main(Options = None):
 
     return ReturnCode
 
+
 def CheckReplaceDpx(Dep, DistPkg, OrigDpGuid, OrigDpVersion):
     NewDpPkgList = []
     for PkgInfo in DistPkg.PackageSurfaceArea:
         Guid, Version = PkgInfo[0], PkgInfo[1]
         NewDpPkgList.append((Guid, Version))
 
-    NewDpInfo = "%s %s" % (DistPkg.Header.GetGuid(), DistPkg.Header.GetVersion())
+    NewDpInfo = "%s %s" % (DistPkg.Header.GetGuid(),
+                           DistPkg.Header.GetVersion())
     OrigDpInfo = "%s %s" % (OrigDpGuid, OrigDpVersion)
 
     #
@@ -118,25 +126,25 @@ def CheckReplaceDpx(Dep, DistPkg, OrigDpGuid, OrigDpVersion):
     if (NewDpInfo != OrigDpInfo):
         if Dep.CheckDpExists(DistPkg.Header.GetGuid(), DistPkg.Header.GetVersion()):
             Logger.Error("\nReplacePkg", UPT_ALREADY_INSTALLED_ERROR,
-                ST.WRN_DIST_PKG_INSTALLED,
-                ExtraData=ST.MSG_REPLACE_ALREADY_INSTALLED_DP)
+                         ST.WRN_DIST_PKG_INSTALLED,
+                         ExtraData=ST.MSG_REPLACE_ALREADY_INSTALLED_DP)
 
     #
     # check whether the original distribution could be replaced by new distribution
     #
-    Logger.Verbose(ST.MSG_CHECK_DP_FOR_REPLACE%(NewDpInfo, OrigDpInfo))
-    DepInfoResult = Dep.CheckDpDepexForReplace(OrigDpGuid, OrigDpVersion, NewDpPkgList)
+    Logger.Verbose(ST.MSG_CHECK_DP_FOR_REPLACE % (NewDpInfo, OrigDpInfo))
+    DepInfoResult = Dep.CheckDpDepexForReplace(
+        OrigDpGuid, OrigDpVersion, NewDpPkgList)
     Replaceable = DepInfoResult[0]
     if not Replaceable:
         Logger.Error("\nReplacePkg", UNKNOWN_ERROR,
-            ST.ERR_PACKAGE_NOT_MATCH_DEPENDENCY)
+                     ST.ERR_PACKAGE_NOT_MATCH_DEPENDENCY)
 
     #
     # check whether new distribution could be installed by dependency rule
     #
-    Logger.Verbose(ST.MSG_CHECK_DP_FOR_INSTALL%str(NewDpInfo))
+    Logger.Verbose(ST.MSG_CHECK_DP_FOR_INSTALL % str(NewDpInfo))
     if not Dep.ReplaceCheckNewDpDepex(DistPkg, OrigDpGuid, OrigDpVersion):
         Logger.Error("\nReplacePkg", UNKNOWN_ERROR,
-            ST.ERR_PACKAGE_NOT_MATCH_DEPENDENCY,
-            ExtraData=DistPkg.Header.Name)
-
+                     ST.ERR_PACKAGE_NOT_MATCH_DEPENDENCY,
+                     ExtraData=DistPkg.Header.Name)
diff --git a/BaseTools/Source/Python/UPT/RmPkg.py b/BaseTools/Source/Python/UPT/RmPkg.py
index cf37e2bdcabf..2ca9a020b186 100644
--- a/BaseTools/Source/Python/UPT/RmPkg.py
+++ b/BaseTools/Source/Python/UPT/RmPkg.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Install distribution package.
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -32,7 +32,7 @@ from Logger.ToolError import CODE_ERROR
 from Logger.ToolError import FatalError
 
 
-## CheckDpDepex
+# CheckDpDepex
 #
 # Check if the Depex is satisfied
 # @param Dep: Dep
@@ -57,13 +57,14 @@ def CheckDpDepex(Dep, Guid, Version, WorkspaceDir):
             # also generate a log file for reference
             #
             Logger.Info(ST.MSG_INVALID_MODULE_INTRODUCED)
-            LogFilePath = os.path.normpath(os.path.join(WorkspaceDir, GlobalData.gINVALID_MODULE_FILE))
+            LogFilePath = os.path.normpath(os.path.join(
+                WorkspaceDir, GlobalData.gINVALID_MODULE_FILE))
             Logger.Info(ST.MSG_CHECK_LOG_FILE % LogFilePath)
             try:
                 LogFile = open(LogFilePath, 'w')
                 try:
                     for ModulePath in DependModuleList:
-                        LogFile.write("%s\n"%ModulePath)
+                        LogFile.write("%s\n" % ModulePath)
                         Logger.Info(ModulePath)
                 except IOError:
                     Logger.Warn("\nRmPkg", ST.ERR_FILE_WRITE_FAILURE,
@@ -74,13 +75,15 @@ def CheckDpDepex(Dep, Guid, Version, WorkspaceDir):
             finally:
                 LogFile.close()
 
-## Remove Path
+# Remove Path
 #
 # removing readonly file on windows will get "Access is denied"
 # error, so before removing, change the mode to be writeable
 #
 # @param Path: The Path to be removed
 #
+
+
 def RemovePath(Path):
     Logger.Info(ST.MSG_REMOVE_FILE % Path)
     if not os.access(Path, os.W_OK):
@@ -90,16 +93,18 @@ def RemovePath(Path):
         os.removedirs(os.path.split(Path)[0])
     except OSError:
         pass
-## GetCurrentFileList
+# GetCurrentFileList
 #
 # @param DataBase: DataBase of UPT
 # @param Guid: Guid of Dp
 # @param Version: Version of Dp
 # @param WorkspaceDir: Workspace Dir
 #
+
+
 def GetCurrentFileList(DataBase, Guid, Version, WorkspaceDir):
     NewFileList = []
-    for Dir in  DataBase.GetDpInstallDirList(Guid, Version):
+    for Dir in DataBase.GetDpInstallDirList(Guid, Version):
         RootDir = os.path.normpath(os.path.join(WorkspaceDir, Dir))
         for Root, Dirs, Files in os.walk(RootDir):
             Logger.Debug(0, Dirs)
@@ -110,7 +115,7 @@ def GetCurrentFileList(DataBase, Guid, Version, WorkspaceDir):
     return NewFileList
 
 
-## Tool entrance method
+# Tool entrance method
 #
 # This method mainly dispatch specific methods per the command line options.
 # If no error found, return zero value so the caller of this tool can know
@@ -118,7 +123,7 @@ def GetCurrentFileList(DataBase, Guid, Version, WorkspaceDir):
 #
 # @param  Options: command option
 #
-def Main(Options = None):
+def Main(Options=None):
 
     try:
         DataBase = GlobalData.gDB
@@ -135,7 +140,8 @@ def Main(Options = None):
         #
         # Get the Dp information
         #
-        StoredDistFile, Guid, Version = GetInstalledDpInfo(Options.DistributionFile, Dep, DataBase, WorkspaceDir)
+        StoredDistFile, Guid, Version = GetInstalledDpInfo(
+            Options.DistributionFile, Dep, DataBase, WorkspaceDir)
 
         #
         # Check Dp depex
@@ -145,7 +151,8 @@ def Main(Options = None):
         #
         # remove distribution
         #
-        RemoveDist(Guid, Version, StoredDistFile, DataBase, WorkspaceDir, Options.Yes)
+        RemoveDist(Guid, Version, StoredDistFile,
+                   DataBase, WorkspaceDir, Options.Yes)
 
         Logger.Quiet(ST.MSG_FINISH)
 
@@ -154,27 +161,27 @@ def Main(Options = None):
     except FatalError as XExcept:
         ReturnCode = XExcept.args[0]
         if Logger.GetLevel() <= Logger.DEBUG_9:
-            Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + \
+            Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) +
                          format_exc())
     except KeyboardInterrupt:
         ReturnCode = ABORT_ERROR
         if Logger.GetLevel() <= Logger.DEBUG_9:
-            Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + \
+            Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) +
                          format_exc())
     except:
         Logger.Error(
-                    "\nRmPkg",
-                    CODE_ERROR,
-                    ST.ERR_UNKNOWN_FATAL_REMOVING_ERR,
-                    ExtraData=ST.MSG_SEARCH_FOR_HELP % ST.MSG_EDKII_MAIL_ADDR,
-                    RaiseError=False
-                    )
-        Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + \
+            "\nRmPkg",
+            CODE_ERROR,
+            ST.ERR_UNKNOWN_FATAL_REMOVING_ERR,
+            ExtraData=ST.MSG_SEARCH_FOR_HELP % ST.MSG_EDKII_MAIL_ADDR,
+            RaiseError=False
+        )
+        Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) +
                      format_exc())
         ReturnCode = CODE_ERROR
     return ReturnCode
 
-## GetInstalledDpInfo method
+# GetInstalledDpInfo method
 #
 # Get the installed distribution information
 #
@@ -186,10 +193,14 @@ def Main(Options = None):
 # @retval Guid: the Guid of the distribution
 # @retval Version: the Version of distribution
 #
+
+
 def GetInstalledDpInfo(DistributionFile, Dep, DataBase, WorkspaceDir):
-    (Guid, Version, NewDpFileName) = DataBase.GetDpByName(os.path.split(DistributionFile)[1])
+    (Guid, Version, NewDpFileName) = DataBase.GetDpByName(
+        os.path.split(DistributionFile)[1])
     if not Guid:
-        Logger.Error("RmPkg", UNKNOWN_ERROR, ST.ERR_PACKAGE_NOT_INSTALLED % DistributionFile)
+        Logger.Error("RmPkg", UNKNOWN_ERROR,
+                     ST.ERR_PACKAGE_NOT_INSTALLED % DistributionFile)
 
     #
     # Check Dp existing
@@ -200,14 +211,15 @@ def GetInstalledDpInfo(DistributionFile, Dep, DataBase, WorkspaceDir):
     # Check for Distribution files existence in /conf/upt, if not exist,
     # Warn user and go on.
     #
-    StoredDistFile = os.path.normpath(os.path.join(WorkspaceDir, GlobalData.gUPT_DIR, NewDpFileName))
+    StoredDistFile = os.path.normpath(os.path.join(
+        WorkspaceDir, GlobalData.gUPT_DIR, NewDpFileName))
     if not os.path.isfile(StoredDistFile):
-        Logger.Warn("RmPkg", ST.WRN_DIST_NOT_FOUND%StoredDistFile)
+        Logger.Warn("RmPkg", ST.WRN_DIST_NOT_FOUND % StoredDistFile)
         StoredDistFile = None
 
     return StoredDistFile, Guid, Version
 
-## RemoveDist method
+# RemoveDist method
 #
 # remove a distribution
 #
@@ -218,6 +230,8 @@ def GetInstalledDpInfo(DistributionFile, Dep, DataBase, WorkspaceDir):
 # @param  WorkspaceDir: work space directory
 # @param  ForceRemove: whether user want to remove file even it is modified
 #
+
+
 def RemoveDist(Guid, Version, StoredDistFile, DataBase, WorkspaceDir, ForceRemove):
     #
     # Get Current File List
diff --git a/BaseTools/Source/Python/UPT/TestInstall.py b/BaseTools/Source/Python/UPT/TestInstall.py
index 1adc19260d89..9d1a5880f9e6 100644
--- a/BaseTools/Source/Python/UPT/TestInstall.py
+++ b/BaseTools/Source/Python/UPT/TestInstall.py
@@ -31,6 +31,8 @@ from sys import platform
 #
 # @param  Options: command Options
 #
+
+
 def Main(Options=None):
     ContentZipFile, DistFile = None, None
     ReturnCode = 0
@@ -39,11 +41,13 @@ def Main(Options=None):
         DataBase = GlobalData.gDB
         WorkspaceDir = GlobalData.gWORKSPACE
         if not Options.DistFiles:
-            Logger.Error("TestInstallPkg", TE.OPTION_MISSING, ExtraData=ST.ERR_SPECIFY_PACKAGE)
+            Logger.Error("TestInstallPkg", TE.OPTION_MISSING,
+                         ExtraData=ST.ERR_SPECIFY_PACKAGE)
 
         DistPkgList = []
         for DistFile in Options.DistFiles:
-            DistPkg, ContentZipFile, __, DistFile = UnZipDp(WorkspaceDir, DistFile)
+            DistPkg, ContentZipFile, __, DistFile = UnZipDp(
+                WorkspaceDir, DistFile)
             DistPkgList.append(DistPkg)
 
         #
@@ -65,18 +69,20 @@ def Main(Options=None):
     except TE.FatalError as XExcept:
         ReturnCode = XExcept.args[0]
         if Logger.GetLevel() <= Logger.DEBUG_9:
-            Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + format_exc())
+            Logger.Quiet(ST.MSG_PYTHON_ON %
+                         (python_version(), platform) + format_exc())
 
     except Exception as x:
         ReturnCode = TE.CODE_ERROR
         Logger.Error(
-                    "\nTestInstallPkg",
-                    TE.CODE_ERROR,
-                    ST.ERR_UNKNOWN_FATAL_INSTALL_ERR % Options.DistFiles,
-                    ExtraData=ST.MSG_SEARCH_FOR_HELP % ST.MSG_EDKII_MAIL_ADDR,
-                    RaiseError=False
-                    )
-        Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + format_exc())
+            "\nTestInstallPkg",
+            TE.CODE_ERROR,
+            ST.ERR_UNKNOWN_FATAL_INSTALL_ERR % Options.DistFiles,
+            ExtraData=ST.MSG_SEARCH_FOR_HELP % ST.MSG_EDKII_MAIL_ADDR,
+            RaiseError=False
+        )
+        Logger.Quiet(ST.MSG_PYTHON_ON %
+                     (python_version(), platform) + format_exc())
 
     finally:
         Logger.Quiet(ST.MSG_REMOVE_TEMP_FILE_STARTED)
@@ -91,4 +97,3 @@ def Main(Options=None):
     if ReturnCode == 0:
         Logger.Quiet(ST.MSG_FINISH)
     return ReturnCode
-
diff --git a/BaseTools/Source/Python/UPT/UPT.py b/BaseTools/Source/Python/UPT/UPT.py
index 480f389d7d03..75b1a3bac21f 100644
--- a/BaseTools/Source/Python/UPT/UPT.py
+++ b/BaseTools/Source/Python/UPT/UPT.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 # This file is the main entry for UPT
 #
@@ -13,6 +13,35 @@ UPT
 
 ## import modules
 #
+from BuildVersion import gBUILD_VERSION
+from Core.IpiDb import IpiDatabase
+from Library import GlobalData
+from Library.Misc import GetWorkspace
+import TestInstall
+import ReplacePkg
+import InventoryWs
+import RmPkg
+import InstallPkg
+import MkPkg
+from Common.MultipleWorkspace import MultipleWorkspace as mws
+from Logger.ToolError import UPT_ALREADY_INSTALLED_ERROR
+from Logger.ToolError import FatalError
+from Logger.ToolError import OPTION_CONFLICT
+from Logger.ToolError import FILE_TYPE_MISMATCH
+from Logger.ToolError import OPTION_MISSING
+from Logger.ToolError import FILE_NOT_FOUND
+from Logger.StringTable import MSG_USAGE
+from Logger.StringTable import MSG_DESCRIPTION
+from Logger.StringTable import MSG_VERSION
+import Logger.Log as Logger
+from Logger import StringTable as ST
+from platform import python_version
+from traceback import format_exc
+from optparse import OptionParser
+import platform as pf
+from sys import platform
+import os.path
+from Core import FileHook
 import locale
 import sys
 from imp import reload
@@ -20,57 +49,29 @@ encoding = locale.getdefaultlocale()[1]
 if encoding:
     reload(sys)
     sys.setdefaultencoding(encoding)
-from Core import FileHook
-import os.path
-from sys import platform
-import platform as pf
-from optparse import OptionParser
-from traceback import format_exc
-from platform import python_version
 
-from Logger import StringTable as ST
-import Logger.Log as Logger
-from Logger.StringTable import MSG_VERSION
-from Logger.StringTable import MSG_DESCRIPTION
-from Logger.StringTable import MSG_USAGE
-from Logger.ToolError import FILE_NOT_FOUND
-from Logger.ToolError import OPTION_MISSING
-from Logger.ToolError import FILE_TYPE_MISMATCH
-from Logger.ToolError import OPTION_CONFLICT
-from Logger.ToolError import FatalError
-from Logger.ToolError import UPT_ALREADY_INSTALLED_ERROR
-from Common.MultipleWorkspace import MultipleWorkspace as mws
 
-import MkPkg
-import InstallPkg
-import RmPkg
-import InventoryWs
-import ReplacePkg
-import TestInstall
-from Library.Misc import GetWorkspace
-from Library import GlobalData
-from Core.IpiDb import IpiDatabase
-from BuildVersion import gBUILD_VERSION
-
-## CheckConflictOption
+# CheckConflictOption
 #
 # CheckConflictOption
 #
+
 def CheckConflictOption(Opt):
     if (Opt.PackFileToCreate or Opt.PackFileToInstall or Opt.PackFileToRemove or Opt.PackFileToReplace) \
-    and Opt.InventoryWs:
+            and Opt.InventoryWs:
         Logger.Error("UPT", OPTION_CONFLICT, ExtraData=ST.ERR_L_OA_EXCLUSIVE)
     elif Opt.PackFileToReplace and (Opt.PackFileToCreate or Opt.PackFileToInstall or Opt.PackFileToRemove):
         Logger.Error("UPT", OPTION_CONFLICT, ExtraData=ST.ERR_U_ICR_EXCLUSIVE)
     elif (Opt.PackFileToCreate and Opt.PackFileToInstall and Opt.PackFileToRemove):
-        Logger.Error("UPT", OPTION_CONFLICT, ExtraData=ST.ERR_REQUIRE_I_C_R_OPTION)
+        Logger.Error("UPT", OPTION_CONFLICT,
+                     ExtraData=ST.ERR_REQUIRE_I_C_R_OPTION)
     elif Opt.PackFileToCreate and Opt.PackFileToInstall:
         Logger.Error("UPT", OPTION_CONFLICT, ExtraData=ST.ERR_I_C_EXCLUSIVE)
     elif Opt.PackFileToInstall and Opt.PackFileToRemove:
         Logger.Error("UPT", OPTION_CONFLICT, ExtraData=ST.ERR_I_R_EXCLUSIVE)
-    elif Opt.PackFileToCreate and  Opt.PackFileToRemove:
+    elif Opt.PackFileToCreate and Opt.PackFileToRemove:
         Logger.Error("UPT", OPTION_CONFLICT, ExtraData=ST.ERR_C_R_EXCLUSIVE)
-    elif Opt.TestDistFiles and (Opt.PackFileToCreate or Opt.PackFileToInstall \
+    elif Opt.TestDistFiles and (Opt.PackFileToCreate or Opt.PackFileToInstall
                                 or Opt.PackFileToRemove or Opt.PackFileToReplace):
         Logger.Error("UPT", OPTION_CONFLICT, ExtraData=ST.ERR_C_R_EXCLUSIVE)
 
@@ -78,8 +79,10 @@ def CheckConflictOption(Opt):
         Logger.Warn("UPT", ST.WARN_CUSTOMPATH_OVERRIDE_USEGUIDEDPATH)
         Opt.UseGuidedPkgPath = False
 
-## SetLogLevel
+# SetLogLevel
 #
+
+
 def SetLogLevel(Opt):
     if Opt.opt_verbose:
         Logger.SetLevel(Logger.VERBOSE)
@@ -96,24 +99,29 @@ def SetLogLevel(Opt):
     else:
         Logger.SetLevel(Logger.INFO)
 
-## Main
+# Main
 #
 # Main
 #
+
+
 def Main():
     Logger.Initialize()
 
     Parser = OptionParser(version=(MSG_VERSION + ' Build ' + gBUILD_VERSION), description=MSG_DESCRIPTION,
                           prog="UPT.exe", usage=MSG_USAGE)
 
-    Parser.add_option("-d", "--debug", action="store", type="int", dest="debug_level", help=ST.HLP_PRINT_DEBUG_INFO)
+    Parser.add_option("-d", "--debug", action="store", type="int",
+                      dest="debug_level", help=ST.HLP_PRINT_DEBUG_INFO)
 
     Parser.add_option("-v", "--verbose", action="store_true", dest="opt_verbose",
                       help=ST.HLP_PRINT_INFORMATIONAL_STATEMENT)
 
-    Parser.add_option("-s", "--silent", action="store_true", dest="opt_slient", help=ST.HLP_RETURN_NO_DISPLAY)
+    Parser.add_option("-s", "--silent", action="store_true",
+                      dest="opt_slient", help=ST.HLP_RETURN_NO_DISPLAY)
 
-    Parser.add_option("-q", "--quiet", action="store_true", dest="opt_quiet", help=ST.HLP_RETURN_AND_DISPLAY)
+    Parser.add_option("-q", "--quiet", action="store_true",
+                      dest="opt_quiet", help=ST.HLP_RETURN_AND_DISPLAY)
 
     Parser.add_option("-i", "--install", action="append", type="string", dest="Install_Distribution_Package_File",
                       help=ST.HLP_SPECIFY_PACKAGE_NAME_INSTALL)
@@ -136,11 +144,14 @@ def Main():
     Parser.add_option("-l", "--list", action="store_true", dest="List_Dist_Installed",
                       help=ST.HLP_LIST_DIST_INSTALLED)
 
-    Parser.add_option("-f", "--force", action="store_true", dest="Yes", help=ST.HLP_DISABLE_PROMPT)
+    Parser.add_option("-f", "--force", action="store_true",
+                      dest="Yes", help=ST.HLP_DISABLE_PROMPT)
 
-    Parser.add_option("-n", "--custom-path", action="store_true", dest="CustomPath", help=ST.HLP_CUSTOM_PATH_PROMPT)
+    Parser.add_option("-n", "--custom-path", action="store_true",
+                      dest="CustomPath", help=ST.HLP_CUSTOM_PATH_PROMPT)
 
-    Parser.add_option("-x", "--free-lock", action="store_true", dest="SkipLock", help=ST.HLP_SKIP_LOCK_CHECK)
+    Parser.add_option("-x", "--free-lock", action="store_true",
+                      dest="SkipLock", help=ST.HLP_SKIP_LOCK_CHECK)
 
     Parser.add_option("-u", "--replace", action="store", type="string", dest="Replace_Distribution_Package_File",
                       help=ST.HLP_SPECIFY_PACKAGE_NAME_REPLACE)
@@ -148,7 +159,8 @@ def Main():
     Parser.add_option("-o", "--original", action="store", type="string", dest="Original_Distribution_Package_File",
                       help=ST.HLP_SPECIFY_PACKAGE_NAME_TO_BE_REPLACED)
 
-    Parser.add_option("--use-guided-paths", action="store_true", dest="Use_Guided_Paths", help=ST.HLP_USE_GUIDED_PATHS)
+    Parser.add_option("--use-guided-paths", action="store_true",
+                      dest="Use_Guided_Paths", help=ST.HLP_USE_GUIDED_PATHS)
 
     Parser.add_option("-j", "--test-install", action="append", type="string",
                       dest="Test_Install_Distribution_Package_Files", help=ST.HLP_TEST_INSTALL)
@@ -176,7 +188,8 @@ def Main():
         GlobalData.gWORKSPACE, GlobalData.gPACKAGE_PATH = GetWorkspace()
     except FatalError as XExcept:
         if Logger.GetLevel() <= Logger.DEBUG_9:
-            Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + format_exc())
+            Logger.Quiet(ST.MSG_PYTHON_ON %
+                         (python_version(), platform) + format_exc())
         return XExcept.args[0]
 
     # Support WORKSPACE is a long path
@@ -197,7 +210,7 @@ def Main():
     Mgr = FileHook.RecoverMgr(WorkspaceDir)
     FileHook.SetRecoverMgr(Mgr)
 
-    GlobalData.gDB = IpiDatabase(os.path.normpath(os.path.join(WorkspaceDir, \
+    GlobalData.gDB = IpiDatabase(os.path.normpath(os.path.join(WorkspaceDir,
                                                                "Conf/DistributionPackageDatabase.db")), WorkspaceDir)
     GlobalData.gDB.InitDatabase(Opt.SkipLock)
 
@@ -213,24 +226,30 @@ def Main():
             if Opt.PackageInformationDataFile:
                 if not os.path.exists(Opt.PackageInformationDataFile):
                     if not os.path.exists(os.path.join(WorkspaceDir, Opt.PackageInformationDataFile)):
-                        Logger.Error("\nUPT", FILE_NOT_FOUND, ST.ERR_NO_TEMPLATE_FILE % Opt.PackageInformationDataFile)
+                        Logger.Error(
+                            "\nUPT", FILE_NOT_FOUND, ST.ERR_NO_TEMPLATE_FILE % Opt.PackageInformationDataFile)
                     else:
-                        Opt.PackageInformationDataFile = os.path.join(WorkspaceDir, Opt.PackageInformationDataFile)
+                        Opt.PackageInformationDataFile = os.path.join(
+                            WorkspaceDir, Opt.PackageInformationDataFile)
             else:
-                Logger.Error("UPT", OPTION_MISSING, ExtraData=ST.ERR_REQUIRE_T_OPTION)
+                Logger.Error("UPT", OPTION_MISSING,
+                             ExtraData=ST.ERR_REQUIRE_T_OPTION)
             if not Opt.PackFileToCreate.endswith('.dist'):
-                Logger.Error("CreatePkg", FILE_TYPE_MISMATCH, ExtraData=ST.ERR_DIST_EXT_ERROR % Opt.PackFileToCreate)
+                Logger.Error("CreatePkg", FILE_TYPE_MISMATCH,
+                             ExtraData=ST.ERR_DIST_EXT_ERROR % Opt.PackFileToCreate)
             RunModule = MkPkg.Main
 
         elif Opt.PackFileToInstall:
             AbsPath = []
             for Item in Opt.PackFileToInstall:
                 if not Item.endswith('.dist'):
-                    Logger.Error("InstallPkg", FILE_TYPE_MISMATCH, ExtraData=ST.ERR_DIST_EXT_ERROR % Item)
+                    Logger.Error("InstallPkg", FILE_TYPE_MISMATCH,
+                                 ExtraData=ST.ERR_DIST_EXT_ERROR % Item)
 
                 AbsPath.append(GetFullPathDist(Item, WorkspaceDir))
                 if not AbsPath:
-                    Logger.Error("InstallPkg", FILE_NOT_FOUND, ST.ERR_INSTALL_DIST_NOT_FOUND % Item)
+                    Logger.Error("InstallPkg", FILE_NOT_FOUND,
+                                 ST.ERR_INSTALL_DIST_NOT_FOUND % Item)
 
             Opt.PackFileToInstall = AbsPath
             setattr(Opt, 'PackageFile', Opt.PackFileToInstall)
@@ -238,7 +257,8 @@ def Main():
 
         elif Opt.PackFileToRemove:
             if not Opt.PackFileToRemove.endswith('.dist'):
-                Logger.Error("RemovePkg", FILE_TYPE_MISMATCH, ExtraData=ST.ERR_DIST_EXT_ERROR % Opt.PackFileToRemove)
+                Logger.Error("RemovePkg", FILE_TYPE_MISMATCH,
+                             ExtraData=ST.ERR_DIST_EXT_ERROR % Opt.PackFileToRemove)
             head, tail = os.path.split(Opt.PackFileToRemove)
             if head or not tail:
                 Logger.Error("RemovePkg",
@@ -251,13 +271,16 @@ def Main():
             RunModule = InventoryWs.Main
 
         elif Opt.PackFileToBeReplaced and not Opt.PackFileToReplace:
-            Logger.Error("ReplacePkg", OPTION_MISSING, ExtraData=ST.ERR_REQUIRE_U_OPTION)
+            Logger.Error("ReplacePkg", OPTION_MISSING,
+                         ExtraData=ST.ERR_REQUIRE_U_OPTION)
 
         elif Opt.PackFileToReplace:
             if not Opt.PackFileToReplace.endswith('.dist'):
-                Logger.Error("ReplacePkg", FILE_TYPE_MISMATCH, ExtraData=ST.ERR_DIST_EXT_ERROR % Opt.PackFileToReplace)
+                Logger.Error("ReplacePkg", FILE_TYPE_MISMATCH,
+                             ExtraData=ST.ERR_DIST_EXT_ERROR % Opt.PackFileToReplace)
             if not Opt.PackFileToBeReplaced:
-                Logger.Error("ReplacePkg", OPTION_MISSING, ExtraData=ST.ERR_REQUIRE_O_OPTION)
+                Logger.Error("ReplacePkg", OPTION_MISSING,
+                             ExtraData=ST.ERR_REQUIRE_O_OPTION)
             if not Opt.PackFileToBeReplaced.endswith('.dist'):
                 Logger.Error("ReplacePkg",
                              FILE_TYPE_MISMATCH,
@@ -271,7 +294,8 @@ def Main():
 
             AbsPath = GetFullPathDist(Opt.PackFileToReplace, WorkspaceDir)
             if not AbsPath:
-                Logger.Error("ReplacePkg", FILE_NOT_FOUND, ST.ERR_REPLACE_DIST_NOT_FOUND % Opt.PackFileToReplace)
+                Logger.Error("ReplacePkg", FILE_NOT_FOUND,
+                             ST.ERR_REPLACE_DIST_NOT_FOUND % Opt.PackFileToReplace)
 
             Opt.PackFileToReplace = AbsPath
             RunModule = ReplacePkg.Main
@@ -279,7 +303,8 @@ def Main():
         elif Opt.Test_Install_Distribution_Package_Files:
             for Dist in Opt.Test_Install_Distribution_Package_Files:
                 if not Dist.endswith('.dist'):
-                    Logger.Error("TestInstall", FILE_TYPE_MISMATCH, ExtraData=ST.ERR_DIST_EXT_ERROR % Dist)
+                    Logger.Error("TestInstall", FILE_TYPE_MISMATCH,
+                                 ExtraData=ST.ERR_DIST_EXT_ERROR % Dist)
 
             setattr(Opt, 'DistFiles', Opt.Test_Install_Distribution_Package_Files)
             RunModule = TestInstall.Main
@@ -292,7 +317,7 @@ def Main():
     except FatalError as XExcept:
         ReturnCode = XExcept.args[0]
         if Logger.GetLevel() <= Logger.DEBUG_9:
-            Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) + \
+            Logger.Quiet(ST.MSG_PYTHON_ON % (python_version(), platform) +
                          format_exc())
     finally:
         try:
@@ -313,7 +338,7 @@ def Main():
 
     return ReturnCode
 
-## GetFullPathDist
+# GetFullPathDist
 #
 #  This function will check DistFile existence, if not absolute path, then try current working directory,
 #  then $(WORKSPACE),and return the AbsPath. If file doesn't find, then return None
@@ -322,6 +347,8 @@ def Main():
 # @param WorkspaceDir:   Workspace Directory
 # @return AbsPath:       The Absolute path of the distribution file if existed, None else
 #
+
+
 def GetFullPathDist(DistFile, WorkspaceDir):
     if os.path.isabs(DistFile):
         if not (os.path.exists(DistFile) and os.path.isfile(DistFile)):
@@ -337,6 +364,7 @@ def GetFullPathDist(DistFile, WorkspaceDir):
 
         return AbsPath
 
+
 if __name__ == '__main__':
     RETVAL = Main()
     #
diff --git a/BaseTools/Source/Python/UPT/UnitTest/CommentGeneratingUnitTest.py b/BaseTools/Source/Python/UPT/UnitTest/CommentGeneratingUnitTest.py
index dc67dc615a44..249219fbdcd7 100644
--- a/BaseTools/Source/Python/UPT/UnitTest/CommentGeneratingUnitTest.py
+++ b/BaseTools/Source/Python/UPT/UnitTest/CommentGeneratingUnitTest.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file contain unit test for CommentParsing
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -34,6 +34,8 @@ from Library.Misc import CreateDirectory
 #
 # Test _GetHelpStr
 #
+
+
 class _GetHelpStrTest(unittest.TestCase):
     def setUp(self):
         pass
@@ -507,6 +509,8 @@ Guid1|FFE1
 #
 # Test GenProtocolPPiSections
 #
+
+
 class GenProtocolPPiSectionsTest(unittest.TestCase):
     def setUp(self):
         pass
@@ -536,18 +540,18 @@ class GenProtocolPPiSectionsTest(unittest.TestCase):
         return Object
 
     #    Usage        Notify    Help    INF Comment
-    #1   UNDEFINED    true    Present    ## UNDEFINED ## NOTIFY # Help
-    #2   UNDEFINED    true    Not Present    ## UNDEFINED ## NOTIFY
-    #3   UNDEFINED    false    Present    ## UNDEFINED # Help
-    #4   UNDEFINED    false     Not Present    ## UNDEFINED
-    #5   UNDEFINED    Not Present    Present    # Help
-    #6   UNDEFINED    Not Present    Not Present    <empty>
-    #7   Other        true    Present    ## Other ## NOTIFY # Help
-    #8   Other        true    Not Present    ## Other ## NOTIFY
-    #9   Other        false    Present    ## Other # Help
-    #A   Other        false     Not Present    ## Other
-    #B   Other        Not Present    Present    ## Other # Help
-    #C   Other        Not Present    Not Present    ## Other
+    # 1   UNDEFINED    true    Present    ## UNDEFINED ## NOTIFY # Help
+    # 2   UNDEFINED    true    Not Present    ## UNDEFINED ## NOTIFY
+    # 3   UNDEFINED    false    Present    ## UNDEFINED # Help
+    # 4   UNDEFINED    false     Not Present    ## UNDEFINED
+    # 5   UNDEFINED    Not Present    Present    # Help
+    # 6   UNDEFINED    Not Present    Not Present    <empty>
+    # 7   Other        true    Present    ## Other ## NOTIFY # Help
+    # 8   Other        true    Not Present    ## Other ## NOTIFY
+    # 9   Other        false    Present    ## Other # Help
+    # A   Other        false     Not Present    ## Other
+    # B   Other        Not Present    Present    ## Other # Help
+    # C   Other        Not Present    Not Present    ## Other
 
     def testNormalCase1(self):
         ObjectList = []
@@ -560,10 +564,9 @@ class GenProtocolPPiSectionsTest(unittest.TestCase):
         HelpStr = 'Help'
         IsProtocol = True
         Object = self.ObjectFactory(CName, FFE, Usage, Notify,
-                                 HelpStr, IsProtocol)
+                                    HelpStr, IsProtocol)
         ObjectList.append(Object)
 
-
         Result = GenProtocolPPiSections(ObjectList, IsProtocol)
         Expected = '''[Protocols]
 Guid1|FFE1 ## UNDEFINED ## NOTIFY # Help'''
@@ -572,10 +575,9 @@ Guid1|FFE1 ## UNDEFINED ## NOTIFY # Help'''
         IsProtocol = False
         ObjectList = []
         Object = self.ObjectFactory(CName, FFE, Usage, Notify,
-                                 HelpStr, IsProtocol)
+                                    HelpStr, IsProtocol)
         ObjectList.append(Object)
 
-
         Result = GenProtocolPPiSections(ObjectList, IsProtocol)
         Expected = '''[Ppis]
 Guid1|FFE1 ## UNDEFINED ## NOTIFY # Help'''
@@ -592,10 +594,9 @@ Guid1|FFE1 ## UNDEFINED ## NOTIFY # Help'''
         HelpStr = ''
         IsProtocol = True
         Object = self.ObjectFactory(CName, FFE, Usage, Notify,
-                                 HelpStr, IsProtocol)
+                                    HelpStr, IsProtocol)
         ObjectList.append(Object)
 
-
         Result = GenProtocolPPiSections(ObjectList, IsProtocol)
         Expected = '''[Protocols]
 Guid1|FFE1 ## UNDEFINED ## NOTIFY'''
@@ -612,10 +613,9 @@ Guid1|FFE1 ## UNDEFINED ## NOTIFY'''
         HelpStr = 'Help'
         IsProtocol = True
         Object = self.ObjectFactory(CName, FFE, Usage, Notify,
-                                 HelpStr, IsProtocol)
+                                    HelpStr, IsProtocol)
         ObjectList.append(Object)
 
-
         Result = GenProtocolPPiSections(ObjectList, IsProtocol)
         Expected = '''[Protocols]
 Guid1|FFE1 ## UNDEFINED # Help'''
@@ -632,10 +632,9 @@ Guid1|FFE1 ## UNDEFINED # Help'''
         HelpStr = ''
         IsProtocol = True
         Object = self.ObjectFactory(CName, FFE, Usage, Notify,
-                                 HelpStr, IsProtocol)
+                                    HelpStr, IsProtocol)
         ObjectList.append(Object)
 
-
         Result = GenProtocolPPiSections(ObjectList, IsProtocol)
         Expected = '''[Protocols]
 Guid1|FFE1 ## UNDEFINED'''
@@ -652,10 +651,9 @@ Guid1|FFE1 ## UNDEFINED'''
         HelpStr = 'Help'
         IsProtocol = True
         Object = self.ObjectFactory(CName, FFE, Usage, Notify,
-                                 HelpStr, IsProtocol)
+                                    HelpStr, IsProtocol)
         ObjectList.append(Object)
 
-
         Result = GenProtocolPPiSections(ObjectList, IsProtocol)
         Expected = '''[Protocols]
 Guid1|FFE1 # Help'''
@@ -672,10 +670,9 @@ Guid1|FFE1 # Help'''
         HelpStr = ''
         IsProtocol = True
         Object = self.ObjectFactory(CName, FFE, Usage, Notify,
-                                 HelpStr, IsProtocol)
+                                    HelpStr, IsProtocol)
         ObjectList.append(Object)
 
-
         Result = GenProtocolPPiSections(ObjectList, IsProtocol)
         Expected = '''[Protocols]
 Guid1|FFE1'''
@@ -692,10 +689,9 @@ Guid1|FFE1'''
         HelpStr = 'Help'
         IsProtocol = True
         Object = self.ObjectFactory(CName, FFE, Usage, Notify,
-                                 HelpStr, IsProtocol)
+                                    HelpStr, IsProtocol)
         ObjectList.append(Object)
 
-
         Result = GenProtocolPPiSections(ObjectList, IsProtocol)
         Expected = '''[Protocols]
 Guid1|FFE1 ## PRODUCES ## NOTIFY # Help'''
@@ -712,10 +708,9 @@ Guid1|FFE1 ## PRODUCES ## NOTIFY # Help'''
         HelpStr = ''
         IsProtocol = True
         Object = self.ObjectFactory(CName, FFE, Usage, Notify,
-                                 HelpStr, IsProtocol)
+                                    HelpStr, IsProtocol)
         ObjectList.append(Object)
 
-
         Result = GenProtocolPPiSections(ObjectList, IsProtocol)
         Expected = '''[Protocols]
 Guid1|FFE1 ## PRODUCES ## NOTIFY'''
@@ -732,10 +727,9 @@ Guid1|FFE1 ## PRODUCES ## NOTIFY'''
         HelpStr = 'Help'
         IsProtocol = True
         Object = self.ObjectFactory(CName, FFE, Usage, Notify,
-                                 HelpStr, IsProtocol)
+                                    HelpStr, IsProtocol)
         ObjectList.append(Object)
 
-
         Result = GenProtocolPPiSections(ObjectList, IsProtocol)
         Expected = '''[Protocols]
 Guid1|FFE1 ## PRODUCES # Help'''
@@ -752,10 +746,9 @@ Guid1|FFE1 ## PRODUCES # Help'''
         HelpStr = ''
         IsProtocol = True
         Object = self.ObjectFactory(CName, FFE, Usage, Notify,
-                                 HelpStr, IsProtocol)
+                                    HelpStr, IsProtocol)
         ObjectList.append(Object)
 
-
         Result = GenProtocolPPiSections(ObjectList, IsProtocol)
         Expected = '''[Protocols]
 Guid1|FFE1 ## PRODUCES'''
@@ -772,10 +765,9 @@ Guid1|FFE1 ## PRODUCES'''
         HelpStr = 'Help'
         IsProtocol = True
         Object = self.ObjectFactory(CName, FFE, Usage, Notify,
-                                 HelpStr, IsProtocol)
+                                    HelpStr, IsProtocol)
         ObjectList.append(Object)
 
-
         Result = GenProtocolPPiSections(ObjectList, IsProtocol)
         Expected = '''[Protocols]
 Guid1|FFE1 ## PRODUCES # Help'''
@@ -792,10 +784,9 @@ Guid1|FFE1 ## PRODUCES # Help'''
         HelpStr = ''
         IsProtocol = True
         Object = self.ObjectFactory(CName, FFE, Usage, Notify,
-                                 HelpStr, IsProtocol)
+                                    HelpStr, IsProtocol)
         ObjectList.append(Object)
 
-
         Result = GenProtocolPPiSections(ObjectList, IsProtocol)
         Expected = '''[Protocols]
 Guid1|FFE1 ## PRODUCES'''
@@ -804,6 +795,8 @@ Guid1|FFE1 ## PRODUCES'''
 #
 # Test GenPcdSections
 #
+
+
 class GenPcdSectionsTest(unittest.TestCase):
     def setUp(self):
         pass
@@ -832,12 +825,11 @@ class GenPcdSectionsTest(unittest.TestCase):
 
         return Object
 
-
     #    Usage        Help    INF Comment
-    #1   UNDEFINED    Present    # Help
-    #2   UNDEFINED    Not Present    <empty>
-    #3   Other        Present    ## Other # Help
-    #4   Other        Not Present    ## Other
+    # 1   UNDEFINED    Present    # Help
+    # 2   UNDEFINED    Not Present    <empty>
+    # 3   Other        Present    ## Other # Help
+    # 4   Other        Not Present    ## Other
 
     def testNormalCase1(self):
         ObjectList = []
@@ -1244,7 +1236,6 @@ class GenHobSectionsTest(unittest.TestCase):
         Usage = 'UNDEFINED'
         Str = '\nNew Stack HoB'
 
-
         Object = self.ObjectFactory(SupArchList, Type, Usage, Str)
         ObjectList.append(Object)
 
@@ -1266,7 +1257,6 @@ class GenHobSectionsTest(unittest.TestCase):
         Usage = 'UNDEFINED'
         Str = '\nNew Stack HoB\n\nTail Comment'
 
-
         Object = self.ObjectFactory(SupArchList, Type, Usage, Str)
         ObjectList.append(Object)
 
@@ -1290,7 +1280,6 @@ class GenHobSectionsTest(unittest.TestCase):
         Usage = 'UNDEFINED'
         Str = '\n\n'
 
-
         Object = self.ObjectFactory(SupArchList, Type, Usage, Str)
         ObjectList.append(Object)
 
@@ -1372,6 +1361,8 @@ class GenHobSectionsTest(unittest.TestCase):
 #
 # Test GenGenericCommentF
 #
+
+
 class GenGenericCommentFTest(unittest.TestCase):
     def setUp(self):
         pass
@@ -1409,6 +1400,7 @@ class GenGenericCommentFTest(unittest.TestCase):
         Expected = '# coment line 1\n# coment line 2\n'
         self.assertEqual(Result, Expected)
 
+
 if __name__ == '__main__':
     Logger.Initialize()
     unittest.main()
diff --git a/BaseTools/Source/Python/UPT/UnitTest/CommentParsingUnitTest.py b/BaseTools/Source/Python/UPT/UnitTest/CommentParsingUnitTest.py
index 4a3f6db5e3b2..1f31c96e518b 100644
--- a/BaseTools/Source/Python/UPT/UnitTest/CommentParsingUnitTest.py
+++ b/BaseTools/Source/Python/UPT/UnitTest/CommentParsingUnitTest.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file contain unit test for CommentParsing
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -9,9 +9,9 @@ import unittest
 
 import Logger.Log as Logger
 from Library.CommentParsing import ParseHeaderCommentSection, \
-                                   ParseGenericComment, \
-                                   ParseDecPcdGenericComment, \
-                                   ParseDecPcdTailComment
+    ParseGenericComment, \
+    ParseDecPcdGenericComment, \
+    ParseDecPcdTailComment
 from Library.CommentParsing import _IsCopyrightLine
 from Library.StringUtils import GetSplitValueList
 from Library.DataType import TAB_SPACE_SPLIT
@@ -20,6 +20,8 @@ from Library.DataType import TAB_LANGUAGE_EN_US
 #
 # Test ParseHeaderCommentSection
 #
+
+
 class ParseHeaderCommentSectionTest(unittest.TestCase):
     def setUp(self):
         pass
@@ -32,7 +34,7 @@ class ParseHeaderCommentSectionTest(unittest.TestCase):
     #
     def testNormalCase1(self):
         TestCommentLines1 = \
-        '''# License1
+            '''# License1
         # License2
         #
         ## @file
@@ -74,7 +76,7 @@ class ParseHeaderCommentSectionTest(unittest.TestCase):
     #
     def testNormalCase2(self):
         TestCommentLines2 = \
-        ''' # License1
+            ''' # License1
         # License2
         #
         ## @file
@@ -110,14 +112,14 @@ class ParseHeaderCommentSectionTest(unittest.TestCase):
         ExpectedLicense = 'License1\nLicense2'
         self.assertEqual(License, ExpectedLicense)
 
-
     #
     # Normal case2: have license/copyright/license above @file,
     # but no abstract/description
     #
+
     def testNormalCase3(self):
         TestCommentLines3 = \
-        ''' # License1
+            ''' # License1
         # License2
         #
         ## @file
@@ -160,7 +162,7 @@ class ParseHeaderCommentSectionTest(unittest.TestCase):
     #
     def testNormalCase4(self):
         TestCommentLines = \
-        '''
+            '''
         ## @file
         # Abstract
         #
@@ -202,7 +204,7 @@ class ParseHeaderCommentSectionTest(unittest.TestCase):
     #
     def testNormalCase5(self):
         TestCommentLines = \
-        '''
+            '''
         ## @file
         # Abstract
         #
@@ -248,7 +250,7 @@ class ParseHeaderCommentSectionTest(unittest.TestCase):
     #
     def testNormalCase6(self):
         TestCommentLines = \
-        '''
+            '''
         ## @file
         # Abstract
         #
@@ -296,7 +298,7 @@ class ParseHeaderCommentSectionTest(unittest.TestCase):
     #
     def testNormalCase7(self):
         TestCommentLines = \
-        '''
+            '''
         ## @file
         #
         # Description
@@ -343,7 +345,7 @@ class ParseHeaderCommentSectionTest(unittest.TestCase):
     #
     def testNormalCase8(self):
         TestCommentLines = \
-        '''
+            '''
         ## @file
         # Abstact
         #
@@ -383,7 +385,7 @@ class ParseHeaderCommentSectionTest(unittest.TestCase):
     #
     def testErrorCase1(self):
         TestCommentLines = \
-        '''
+            '''
         ## @file
         # Abstract
         #
@@ -410,7 +412,7 @@ class ParseHeaderCommentSectionTest(unittest.TestCase):
     #
     def testErrorCase2(self):
         TestCommentLines = \
-        '''
+            '''
         ## @file
         # Abstract
         #
@@ -437,6 +439,8 @@ class ParseHeaderCommentSectionTest(unittest.TestCase):
 #
 # Test ParseGenericComment
 #
+
+
 class ParseGenericCommentTest(unittest.TestCase):
     def setUp(self):
         pass
@@ -449,7 +453,7 @@ class ParseGenericCommentTest(unittest.TestCase):
     #
     def testNormalCase1(self):
         TestCommentLines = \
-        '''# hello world'''
+            '''# hello world'''
 
         CommentList = GetSplitValueList(TestCommentLines, "\n")
         LineNum = 0
@@ -458,7 +462,8 @@ class ParseGenericCommentTest(unittest.TestCase):
             LineNum += 1
             TestCommentLinesList.append((Comment, LineNum))
 
-        HelptxtObj = ParseGenericComment(TestCommentLinesList, 'testNormalCase1')
+        HelptxtObj = ParseGenericComment(
+            TestCommentLinesList, 'testNormalCase1')
         self.failIf(not HelptxtObj)
         self.assertEqual(HelptxtObj.GetString(), 'hello world')
         self.assertEqual(HelptxtObj.GetLang(), TAB_LANGUAGE_EN_US)
@@ -468,7 +473,7 @@ class ParseGenericCommentTest(unittest.TestCase):
     #
     def testNormalCase2(self):
         TestCommentLines = \
-        '''## hello world
+            '''## hello world
         # second line'''
 
         CommentList = GetSplitValueList(TestCommentLines, "\n")
@@ -478,7 +483,8 @@ class ParseGenericCommentTest(unittest.TestCase):
             LineNum += 1
             TestCommentLinesList.append((Comment, LineNum))
 
-        HelptxtObj = ParseGenericComment(TestCommentLinesList, 'testNormalCase2')
+        HelptxtObj = ParseGenericComment(
+            TestCommentLinesList, 'testNormalCase2')
         self.failIf(not HelptxtObj)
         self.assertEqual(HelptxtObj.GetString(),
                          'hello world\n' + 'second line')
@@ -489,7 +495,7 @@ class ParseGenericCommentTest(unittest.TestCase):
     #
     def testNormalCase3(self):
         TestCommentLines = \
-        '''## hello world
+            '''## hello world
         This is not comment line'''
 
         CommentList = GetSplitValueList(TestCommentLines, "\n")
@@ -499,7 +505,8 @@ class ParseGenericCommentTest(unittest.TestCase):
             LineNum += 1
             TestCommentLinesList.append((Comment, LineNum))
 
-        HelptxtObj = ParseGenericComment(TestCommentLinesList, 'testNormalCase3')
+        HelptxtObj = ParseGenericComment(
+            TestCommentLinesList, 'testNormalCase3')
         self.failIf(not HelptxtObj)
         self.assertEqual(HelptxtObj.GetString(),
                          'hello world\n\n')
@@ -508,6 +515,8 @@ class ParseGenericCommentTest(unittest.TestCase):
 #
 # Test ParseDecPcdGenericComment
 #
+
+
 class ParseDecPcdGenericCommentTest(unittest.TestCase):
     def setUp(self):
         pass
@@ -520,7 +529,7 @@ class ParseDecPcdGenericCommentTest(unittest.TestCase):
     #
     def testNormalCase1(self):
         TestCommentLines = \
-        '''## hello world
+            '''## hello world
         # second line'''
 
         CommentList = GetSplitValueList(TestCommentLines, "\n")
@@ -537,13 +546,13 @@ class ParseDecPcdGenericCommentTest(unittest.TestCase):
         self.assertEqual(HelpTxt,
                          'hello world\n' + 'second line')
 
-
     #
     # Normal case2: comments with valid list
     #
+
     def testNormalCase2(self):
         TestCommentLines = \
-        '''## hello world
+            '''## hello world
         # second line
         # @ValidList 1, 2, 3
         # other line'''
@@ -562,8 +571,8 @@ class ParseDecPcdGenericCommentTest(unittest.TestCase):
         self.assertEqual(HelpTxt,
                          'hello world\n' + 'second line\n' + 'other line')
         ExpectedList = GetSplitValueList('1 2 3', TAB_SPACE_SPLIT)
-        ActualList = [item for item in \
-            GetSplitValueList(PcdErr.GetValidValue(), TAB_SPACE_SPLIT) if item]
+        ActualList = [item for item in
+                      GetSplitValueList(PcdErr.GetValidValue(), TAB_SPACE_SPLIT) if item]
         self.assertEqual(ExpectedList, ActualList)
         self.failIf(PcdErr.GetExpression())
         self.failIf(PcdErr.GetValidValueRange())
@@ -573,7 +582,7 @@ class ParseDecPcdGenericCommentTest(unittest.TestCase):
     #
     def testNormalCase3(self):
         TestCommentLines = \
-        '''## hello world
+            '''## hello world
         # second line
         # @ValidRange LT 1 AND GT 2
         # other line'''
@@ -600,7 +609,7 @@ class ParseDecPcdGenericCommentTest(unittest.TestCase):
     #
     def testNormalCase4(self):
         TestCommentLines = \
-        '''## hello world
+            '''## hello world
         # second line
         # @Expression LT 1 AND GT 2
         # other line'''
@@ -627,7 +636,7 @@ class ParseDecPcdGenericCommentTest(unittest.TestCase):
     #
     def testNormalCase5(self):
         TestCommentLines = \
-        '''# @Expression LT 1 AND GT 2'''
+            '''# @Expression LT 1 AND GT 2'''
 
         CommentList = GetSplitValueList(TestCommentLines, "\n")
         LineNum = 0
@@ -649,7 +658,7 @@ class ParseDecPcdGenericCommentTest(unittest.TestCase):
     #
     def testNormalCase6(self):
         TestCommentLines = \
-        '''#'''
+            '''#'''
 
         CommentList = GetSplitValueList(TestCommentLines, "\n")
         LineNum = 0
@@ -663,15 +672,14 @@ class ParseDecPcdGenericCommentTest(unittest.TestCase):
         self.assertEqual(HelpTxt, '\n')
         self.failIf(PcdErr)
 
-
-
     #
     # Error case1: comments with both expression and valid list, use later
     # ignore the former and with a warning message
     #
+
     def testErrorCase1(self):
         TestCommentLines = \
-        '''## hello world
+            '''## hello world
         # second line
         # @ValidList 1, 2, 3
         # @Expression LT 1 AND GT 2
@@ -692,6 +700,8 @@ class ParseDecPcdGenericCommentTest(unittest.TestCase):
 #
 # Test ParseDecPcdTailComment
 #
+
+
 class ParseDecPcdTailCommentTest(unittest.TestCase):
     def setUp(self):
         pass
@@ -704,7 +714,7 @@ class ParseDecPcdTailCommentTest(unittest.TestCase):
     #
     def testNormalCase1(self):
         TestCommentLines = \
-        '''## #hello world'''
+            '''## #hello world'''
 
         CommentList = GetSplitValueList(TestCommentLines, "\n")
         LineNum = 0
@@ -725,7 +735,7 @@ class ParseDecPcdTailCommentTest(unittest.TestCase):
     #
     def testNormalCase2(self):
         TestCommentLines = \
-        '''## BASE #hello world'''
+            '''## BASE #hello world'''
 
         CommentList = GetSplitValueList(TestCommentLines, "\n")
         LineNum = 0
@@ -748,7 +758,7 @@ class ParseDecPcdTailCommentTest(unittest.TestCase):
     #
     def testNormalCase3(self):
         TestCommentLines = \
-        '''## BASE  UEFI_APPLICATION #hello world'''
+            '''## BASE  UEFI_APPLICATION #hello world'''
 
         CommentList = GetSplitValueList(TestCommentLines, "\n")
         LineNum = 0
@@ -771,7 +781,7 @@ class ParseDecPcdTailCommentTest(unittest.TestCase):
     #
     def testNormalCase4(self):
         TestCommentLines = \
-        '''## BASE  UEFI_APPLICATION'''
+            '''## BASE  UEFI_APPLICATION'''
 
         CommentList = GetSplitValueList(TestCommentLines, "\n")
         LineNum = 0
@@ -792,7 +802,7 @@ class ParseDecPcdTailCommentTest(unittest.TestCase):
     #
     def testNormalCase5(self):
         TestCommentLines = \
-        ''' # 1 = 128MB, 2 = 256MB, 3 = MAX'''
+            ''' # 1 = 128MB, 2 = 256MB, 3 = MAX'''
 
         CommentList = GetSplitValueList(TestCommentLines, "\n")
         LineNum = 0
@@ -808,14 +818,14 @@ class ParseDecPcdTailCommentTest(unittest.TestCase):
                          '1 = 128MB, 2 = 256MB, 3 = MAX')
         self.failIf(SupModeList)
 
-
     #
     # Error case2: comments with supModList contains valid and invalid
     # module type
     #
+
     def testErrorCase2(self):
         TestCommentLines = \
-        '''## BASE INVALID_MODULE_TYPE #hello world'''
+            '''## BASE INVALID_MODULE_TYPE #hello world'''
 
         CommentList = GetSplitValueList(TestCommentLines, "\n")
         LineNum = 0
@@ -912,6 +922,7 @@ class _IsCopyrightLineTest(unittest.TestCase):
         Result = _IsCopyrightLine(Line)
         self.failIf(Result)
 
+
 if __name__ == '__main__':
     Logger.Initialize()
     unittest.main()
diff --git a/BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py b/BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py
index b9f7dfe52a6d..01adb668d4bc 100644
--- a/BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py
+++ b/BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file contain unit test for DecParser
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -20,6 +20,8 @@ from Library.ParserValidate import IsValidCFormatGuid
 #
 # Test tool function
 #
+
+
 def TestToolFuncs():
     assert IsValidCArray('{0x1, 0x23}')
 
@@ -46,8 +48,11 @@ def TestToolFuncs():
     assert not IsValidPcdDatum('UNKNOWNTYPE', '0xabc')[0]
     assert not IsValidPcdDatum('UINT8', 'not number')[0]
 
-    assert( IsValidCFormatGuid('{ 0xfa0b1735 , 0x87a0, 0x4193, {0xb2, 0x66 , 0x53, 0x8c , 0x38, 0xaf, 0x48, 0xce }}'))
-    assert( not IsValidCFormatGuid('{ 0xfa0b1735 , 0x87a0, 0x4193, {0xb2, 0x66 , 0x53, 0x8c , 0x38, 0xaf, 0x48, 0xce }} 0xaa'))
+    assert(IsValidCFormatGuid(
+        '{ 0xfa0b1735 , 0x87a0, 0x4193, {0xb2, 0x66 , 0x53, 0x8c , 0x38, 0xaf, 0x48, 0xce }}'))
+    assert(not IsValidCFormatGuid(
+        '{ 0xfa0b1735 , 0x87a0, 0x4193, {0xb2, 0x66 , 0x53, 0x8c , 0x38, 0xaf, 0x48, 0xce }} 0xaa'))
+
 
 def TestTemplate(TestString, TestFunc):
     Path = os.path.join(os.getcwd(), 'test.dec')
@@ -75,6 +80,8 @@ def TestTemplate(TestString, TestFunc):
 # This function test right syntax DEC file
 # @retval: parser object
 #
+
+
 def TestOK(Path, TestString):
     try:
         Parser = Dec(Path)
@@ -84,6 +91,8 @@ def TestOK(Path, TestString):
 
 # This function test wrong syntax DEC file
 # if parser checked wrong syntax, exception thrown and it's expected result
+
+
 def TestError(Path, TestString):
     try:
         Dec(Path)
@@ -92,6 +101,7 @@ def TestError(Path, TestString):
         return True
     raise 'Bug!!! Wrong syntax in DEC file, but passed by DEC parser!!\n' + TestString
 
+
 def TestDecDefine():
     TestString = '''
     [Defines]
@@ -119,6 +129,7 @@ def TestDecDefine():
     '''
     assert TestTemplate(TestString, TestError)
 
+
 def TestDecInclude():
     TestString = '''
     [Defines]
@@ -163,6 +174,7 @@ def TestDecInclude():
 
     os.removedirs('Include/Ia32')
 
+
 def TestDecGuidPpiProtocol():
     TestString = '''
     [Defines]
@@ -205,6 +217,7 @@ def TestDecGuidPpiProtocol():
     assert Items[0].GuidCName == 'gEfiPeiMasterBootModePpiGuid'
     assert Items[0].GuidCValue == '{ 0x7408d748, 0xfc8c, 0x4ee6, {0x92, 0x88, 0xc4, 0xbe, 0xc0, 0x92, 0xa4, 0x10 } }'
 
+
 def TestDecPcd():
     TestString = '''
     [Defines]
@@ -246,6 +259,7 @@ def TestDecPcd():
     assert len(Items) == 4
     assert len(Obj.GetPcdsByType('PcdsPatchableInModule')) == 2
 
+
 def TestDecUserExtension():
     TestString = '''
     [Defines]
@@ -264,6 +278,7 @@ def TestDecUserExtension():
     assert len(Items[0].ArchAndModuleType) == 1
     assert ['MyID', '"TestString"', 'IA32'] in Items[0].ArchAndModuleType
 
+
 if __name__ == '__main__':
     import Logger.Logger
     Logger.Logger.Initialize()
@@ -275,5 +290,3 @@ if __name__ == '__main__':
     unittest.FunctionTestCase(TestDecUserExtension).runTest()
 
     print('All tests passed...')
-
-
diff --git a/BaseTools/Source/Python/UPT/UnitTest/DecParserUnitTest.py b/BaseTools/Source/Python/UPT/UnitTest/DecParserUnitTest.py
index bac127f8e2ac..22d4db05e2e0 100644
--- a/BaseTools/Source/Python/UPT/UnitTest/DecParserUnitTest.py
+++ b/BaseTools/Source/Python/UPT/UnitTest/DecParserUnitTest.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file contain unit test for DecParser
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -24,6 +24,8 @@ from Object.Parser.DecObject import _DecComments
 #
 # Test CleanString
 #
+
+
 class CleanStringTestCase(unittest.TestCase):
     def testCleanString(self):
         Line, Comment = CleanString('')
@@ -43,13 +45,16 @@ class CleanStringTestCase(unittest.TestCase):
         self.assertEqual(Comment, '# and comment')
 
     def testCleanStringCpp(self):
-        Line, Comment = CleanString('line // and comment', AllowCppStyleComment = True)
+        Line, Comment = CleanString(
+            'line // and comment', AllowCppStyleComment=True)
         self.assertEqual(Line, 'line')
         self.assertEqual(Comment, '# and comment')
 
 #
 # Test _DecBase._MacroParser function
 #
+
+
 class MacroParserTestCase(unittest.TestCase):
     def setUp(self):
         self.dec = _DecBase(FileContent('dummy', []))
@@ -61,7 +66,8 @@ class MacroParserTestCase(unittest.TestCase):
 
     def testErrorMacro1(self):
         # Raise fatal error, macro name must be upper case letter
-        self.assertRaises(FatalError, self.dec._MacroParser, 'DEFINE not_upper_case = test2')
+        self.assertRaises(FatalError, self.dec._MacroParser,
+                          'DEFINE not_upper_case = test2')
 
     def testErrorMacro2(self):
         # No macro name given
@@ -70,6 +76,8 @@ class MacroParserTestCase(unittest.TestCase):
 #
 # Test _DecBase._TryBackSlash function
 #
+
+
 class TryBackSlashTestCase(unittest.TestCase):
     def setUp(self):
         Content = [
@@ -92,34 +100,43 @@ class TryBackSlashTestCase(unittest.TestCase):
         #
         # Right case, assert return values
         #
-        ConcatLine, CommentList = self.dec._TryBackSlash(self.dec._RawData.GetNextLine(), [])
+        ConcatLine, CommentList = self.dec._TryBackSlash(
+            self.dec._RawData.GetNextLine(), [])
         self.assertEqual(ConcatLine, 'test no backslash')
         self.assertEqual(CommentList, [])
 
-        ConcatLine, CommentList = self.dec._TryBackSlash(self.dec._RawData.GetNextLine(), [])
+        ConcatLine, CommentList = self.dec._TryBackSlash(
+            self.dec._RawData.GetNextLine(), [])
         self.assertEqual(CommentList, [])
-        self.assertEqual(ConcatLine, 'test with backslash continue second line')
+        self.assertEqual(
+            ConcatLine, 'test with backslash continue second line')
 
         #
         # Error cases, assert raise exception
         #
-        self.assertRaises(FatalError, self.dec._TryBackSlash, self.dec._RawData.GetNextLine(), [])
-        self.assertRaises(FatalError, self.dec._TryBackSlash, self.dec._RawData.GetNextLine(), [])
+        self.assertRaises(FatalError, self.dec._TryBackSlash,
+                          self.dec._RawData.GetNextLine(), [])
+        self.assertRaises(FatalError, self.dec._TryBackSlash,
+                          self.dec._RawData.GetNextLine(), [])
 
 #
 # Test _DecBase.Parse function
 #
+
+
 class DataItem(_DecComments):
     def __init__(self):
         _DecComments.__init__(self)
         self.String = ''
 
+
 class Data(_DecComments):
     def __init__(self):
         _DecComments.__init__(self)
         # List of DataItem
         self.ItemList = []
 
+
 class TestInner(_DecBase):
     def __init__(self, RawData):
         _DecBase.__init__(self, RawData)
@@ -137,6 +154,7 @@ class TestInner(_DecBase):
     def _TailCommentStrategy(self, Comment):
         return Comment.find('@comment') != -1
 
+
 class TestTop(_DecBase):
     def __init__(self, RawData):
         _DecBase.__init__(self, RawData)
@@ -153,13 +171,14 @@ class TestTop(_DecBase):
         self.ItemObject.append(TestParser.ItemObject)
         return TestParser.ItemObject
 
+
 class ParseTestCase(unittest.TestCase):
     def setUp(self):
         pass
 
     def testParse(self):
         Content = \
-        '''# Top comment
+            '''# Top comment
         [TOP]
           # sub1 head comment
           (test item has both head and tail comment) # sub1 tail comment
@@ -187,7 +206,8 @@ class ParseTestCase(unittest.TestCase):
         self.assertEqual(len(data.ItemList), 3)
 
         dataitem = data.ItemList[0]
-        self.assertEqual(dataitem.String, '(test item has both head and tail comment)')
+        self.assertEqual(
+            dataitem.String, '(test item has both head and tail comment)')
         # Comment content
         self.assertEqual(dataitem._HeadComment[0][0], '# sub1 head comment')
         self.assertEqual(dataitem._TailComment[0][0], '# sub1 tail comment')
@@ -196,10 +216,12 @@ class ParseTestCase(unittest.TestCase):
         self.assertEqual(dataitem._TailComment[0][1], 4)
 
         dataitem = data.ItemList[1]
-        self.assertEqual(dataitem.String, '(test item has head and special tail comment)')
+        self.assertEqual(
+            dataitem.String, '(test item has head and special tail comment)')
         # Comment content
         self.assertEqual(dataitem._HeadComment[0][0], '# sub2 head comment')
-        self.assertEqual(dataitem._TailComment[0][0], '# @comment test TailCommentStrategy branch')
+        self.assertEqual(
+            dataitem._TailComment[0][0], '# @comment test TailCommentStrategy branch')
         # Comment line number
         self.assertEqual(dataitem._HeadComment[0][1], 5)
         self.assertEqual(dataitem._TailComment[0][1], 7)
@@ -225,6 +247,8 @@ class ParseTestCase(unittest.TestCase):
 #
 # Test _DecDefine._ParseItem
 #
+
+
 class DecDefineTestCase(unittest.TestCase):
     def GetObj(self, Content):
         Obj = _DecDefine(FileContent('dummy', Content.splitlines()))
@@ -251,6 +275,8 @@ class DecDefineTestCase(unittest.TestCase):
 #
 # Test _DecLibraryclass._ParseItem
 #
+
+
 class DecLibraryTestCase(unittest.TestCase):
     def GetObj(self, Content):
         Obj = _DecLibraryclass(FileContent('dummy', Content.splitlines()))
@@ -266,7 +292,8 @@ class DecLibraryTestCase(unittest.TestCase):
         self.assertRaises(FatalError, obj._ParseItem)
 
     def testLibclassNaming(self):
-        obj = self.GetObj('lowercase_efiRuntimeLib|Include/Library/UefiRuntimeLib.h')
+        obj = self.GetObj(
+            'lowercase_efiRuntimeLib|Include/Library/UefiRuntimeLib.h')
         self.assertRaises(FatalError, obj._ParseItem)
 
     def testLibclassExt(self):
@@ -280,6 +307,8 @@ class DecLibraryTestCase(unittest.TestCase):
 #
 # Test _DecPcd._ParseItem
 #
+
+
 class DecPcdTestCase(unittest.TestCase):
     def GetObj(self, Content):
         Obj = _DecPcd(FileContent('dummy', Content.splitlines()))
@@ -288,7 +317,8 @@ class DecPcdTestCase(unittest.TestCase):
         return Obj
 
     def testOK(self):
-        item = self.GetObj('gEfiMdePkgTokenSpaceGuid.PcdComponentNameDisable|FALSE|BOOLEAN|0x0000000d')._ParseItem()
+        item = self.GetObj(
+            'gEfiMdePkgTokenSpaceGuid.PcdComponentNameDisable|FALSE|BOOLEAN|0x0000000d')._ParseItem()
         self.assertEqual(item.TokenSpaceGuidCName, 'gEfiMdePkgTokenSpaceGuid')
         self.assertEqual(item.TokenCName, 'PcdComponentNameDisable')
         self.assertEqual(item.DefaultValue, 'FALSE')
@@ -296,31 +326,39 @@ class DecPcdTestCase(unittest.TestCase):
         self.assertEqual(item.TokenValue, '0x0000000d')
 
     def testNoCvar(self):
-        obj = self.GetObj('123ai.PcdComponentNameDisable|FALSE|BOOLEAN|0x0000000d')
+        obj = self.GetObj(
+            '123ai.PcdComponentNameDisable|FALSE|BOOLEAN|0x0000000d')
         self.assertRaises(FatalError, obj._ParseItem)
 
     def testSplit(self):
-        obj = self.GetObj('gEfiMdePkgTokenSpaceGuid.PcdComponentNameDisable FALSE|BOOLEAN|0x0000000d')
+        obj = self.GetObj(
+            'gEfiMdePkgTokenSpaceGuid.PcdComponentNameDisable FALSE|BOOLEAN|0x0000000d')
         self.assertRaises(FatalError, obj._ParseItem)
 
-        obj = self.GetObj('gEfiMdePkgTokenSpaceGuid.PcdComponentNameDisable|FALSE|BOOLEAN|0x0000000d | abc')
+        obj = self.GetObj(
+            'gEfiMdePkgTokenSpaceGuid.PcdComponentNameDisable|FALSE|BOOLEAN|0x0000000d | abc')
         self.assertRaises(FatalError, obj._ParseItem)
 
     def testUnknownType(self):
-        obj = self.GetObj('gEfiMdePkgTokenSpaceGuid.PcdComponentNameDisable|FALSE|unknown|0x0000000d')
+        obj = self.GetObj(
+            'gEfiMdePkgTokenSpaceGuid.PcdComponentNameDisable|FALSE|unknown|0x0000000d')
         self.assertRaises(FatalError, obj._ParseItem)
 
     def testVoid(self):
-        obj = self.GetObj('gEfiMdePkgTokenSpaceGuid.PcdComponentNameDisable|abc|VOID*|0x0000000d')
+        obj = self.GetObj(
+            'gEfiMdePkgTokenSpaceGuid.PcdComponentNameDisable|abc|VOID*|0x0000000d')
         self.assertRaises(FatalError, obj._ParseItem)
 
     def testUINT(self):
-        obj = self.GetObj('gEfiMdePkgTokenSpaceGuid.PcdComponentNameDisable|0xabc|UINT8|0x0000000d')
+        obj = self.GetObj(
+            'gEfiMdePkgTokenSpaceGuid.PcdComponentNameDisable|0xabc|UINT8|0x0000000d')
         self.assertRaises(FatalError, obj._ParseItem)
 
 #
 # Test _DecInclude._ParseItem
 #
+
+
 class DecIncludeTestCase(unittest.TestCase):
     #
     # Test code to be added
@@ -330,6 +368,8 @@ class DecIncludeTestCase(unittest.TestCase):
 #
 # Test _DecGuid._ParseItem
 #
+
+
 class DecGuidTestCase(unittest.TestCase):
     def GetObj(self, Content):
         Obj = _DecGuid(FileContent('dummy', Content.splitlines()))
@@ -341,12 +381,15 @@ class DecGuidTestCase(unittest.TestCase):
         item = self.GetObj('gEfiIpSecProtocolGuid={ 0xdfb386f7, 0xe100, 0x43ad,'
                            ' {0x9c, 0x9a, 0xed, 0x90, 0xd0, 0x8a, 0x5e, 0x12 }}')._ParseItem()
         self.assertEqual(item.GuidCName, 'gEfiIpSecProtocolGuid')
-        self.assertEqual(item.GuidCValue, '{ 0xdfb386f7, 0xe100, 0x43ad, {0x9c, 0x9a, 0xed, 0x90, 0xd0, 0x8a, 0x5e, 0x12 }}')
+        self.assertEqual(
+            item.GuidCValue, '{ 0xdfb386f7, 0xe100, 0x43ad, {0x9c, 0x9a, 0xed, 0x90, 0xd0, 0x8a, 0x5e, 0x12 }}')
 
     def testGuidString(self):
-        item = self.GetObj('gEfiIpSecProtocolGuid=1E73767F-8F52-4603-AEB4-F29B510B6766')._ParseItem()
+        item = self.GetObj(
+            'gEfiIpSecProtocolGuid=1E73767F-8F52-4603-AEB4-F29B510B6766')._ParseItem()
         self.assertEqual(item.GuidCName, 'gEfiIpSecProtocolGuid')
-        self.assertEqual(item.GuidCValue, '1E73767F-8F52-4603-AEB4-F29B510B6766')
+        self.assertEqual(
+            item.GuidCValue, '1E73767F-8F52-4603-AEB4-F29B510B6766')
 
     def testNoValue1(self):
         obj = self.GetObj('gEfiIpSecProtocolGuid')
@@ -363,10 +406,13 @@ class DecGuidTestCase(unittest.TestCase):
 #
 # Test Dec.__init__
 #
+
+
 class DecDecInitTestCase(unittest.TestCase):
     def testNoDecFile(self):
         self.assertRaises(FatalError, Dec, 'No_Such_File')
 
+
 class TmpFile:
     def __init__(self, File):
         self.File = File
@@ -388,11 +434,13 @@ class TmpFile:
 #
 # Test Dec._UserExtentionSectionParser
 #
+
+
 class DecUESectionTestCase(unittest.TestCase):
     def setUp(self):
         self.File = TmpFile('test.dec')
         self.File.Write(
-'''[userextensions.intel."myid"]
+            '''[userextensions.intel."myid"]
 [userextensions.intel."myid".IA32]
 [userextensions.intel."myid".IA32,]
 [userextensions.intel."myid]
@@ -409,7 +457,8 @@ class DecUESectionTestCase(unittest.TestCase):
         dec._RawData.CurrentLine = CleanString(dec._RawData.GetNextLine())[0]
         dec._UserExtentionSectionParser()
         self.assertEqual(len(dec._RawData.CurrentScope), 1)
-        self.assertEqual(dec._RawData.CurrentScope[0][0], 'userextensions'.upper())
+        self.assertEqual(
+            dec._RawData.CurrentScope[0][0], 'userextensions'.upper())
         self.assertEqual(dec._RawData.CurrentScope[0][1], 'intel')
         self.assertEqual(dec._RawData.CurrentScope[0][2], '"myid"')
         self.assertEqual(dec._RawData.CurrentScope[0][3], 'COMMON')
@@ -418,7 +467,8 @@ class DecUESectionTestCase(unittest.TestCase):
         dec._RawData.CurrentLine = CleanString(dec._RawData.GetNextLine())[0]
         dec._UserExtentionSectionParser()
         self.assertEqual(len(dec._RawData.CurrentScope), 1)
-        self.assertEqual(dec._RawData.CurrentScope[0][0], 'userextensions'.upper())
+        self.assertEqual(
+            dec._RawData.CurrentScope[0][0], 'userextensions'.upper())
         self.assertEqual(dec._RawData.CurrentScope[0][1], 'intel')
         self.assertEqual(dec._RawData.CurrentScope[0][2], '"myid"')
         self.assertEqual(dec._RawData.CurrentScope[0][3], 'IA32')
@@ -434,11 +484,13 @@ class DecUESectionTestCase(unittest.TestCase):
 #
 # Test Dec._SectionHeaderParser
 #
+
+
 class DecSectionTestCase(unittest.TestCase):
     def setUp(self):
         self.File = TmpFile('test.dec')
         self.File.Write(
-'''[no section start or end
+            '''[no section start or end
 [,] # empty sub-section
 [unknow_section_name]
 [Includes.IA32.other] # no third one
@@ -446,7 +498,7 @@ class DecSectionTestCase(unittest.TestCase):
 [Includes.IA32, Includes.IA32]
 [Includes, Includes.IA32] # common cannot be with other arch
 [Includes.IA32, PcdsFeatureFlag] # different section name
-'''     )
+''')
 
     def tearDown(self):
         self.File.Remove()
@@ -457,7 +509,7 @@ class DecSectionTestCase(unittest.TestCase):
         dec._RawData.CurrentLine = CleanString(dec._RawData.GetNextLine())[0]
         self.assertRaises(FatalError, dec._SectionHeaderParser)
 
-        #[,] # empty sub-section
+        # [,] # empty sub-section
         dec._RawData.CurrentLine = CleanString(dec._RawData.GetNextLine())[0]
         self.assertRaises(FatalError, dec._SectionHeaderParser)
 
@@ -491,11 +543,13 @@ class DecSectionTestCase(unittest.TestCase):
 #
 # Test Dec._ParseDecComment
 #
+
+
 class DecDecCommentTestCase(unittest.TestCase):
     def testDecHeadComment(self):
         File = TmpFile('test.dec')
         File.Write(
-       '''# abc
+            '''# abc
           ##''')
         dec = Dec('test.dec', False)
         dec.ParseDecComment()
@@ -509,7 +563,7 @@ class DecDecCommentTestCase(unittest.TestCase):
     def testNoDoubleComment(self):
         File = TmpFile('test.dec')
         File.Write(
-       '''# abc
+            '''# abc
           #
           [section_start]''')
         dec = Dec('test.dec', False)
@@ -521,8 +575,8 @@ class DecDecCommentTestCase(unittest.TestCase):
         self.assertEqual(dec._HeadComment[1][1], 2)
         File.Remove()
 
+
 if __name__ == '__main__':
     import Logger.Logger
     Logger.Logger.Initialize()
     unittest.main()
-
diff --git a/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py b/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py
index 12e3045f3753..4b707b960fc1 100644
--- a/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py
+++ b/BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file contain unit test for Test [Binary] section part of InfParser
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -23,14 +23,14 @@ import Library.GlobalData as Global
 # Only has 1 element, binary item Type
 #
 SectionStringsCommonItem1 = \
-"""
+    """
 GUID
 """
 #
 # Have 2 elements, binary item Type and FileName
 #
 SectionStringsCommonItem2 = \
-"""
+    """
 GUID | Test/Test.guid
 """
 
@@ -38,7 +38,7 @@ GUID | Test/Test.guid
 # Have 3 elements, Type | FileName | Target | Family | TagName | FeatureFlagExp
 #
 SectionStringsCommonItem3 = \
-"""
+    """
 GUID | Test/Test.guid | DEBUG
 """
 
@@ -47,7 +47,7 @@ GUID | Test/Test.guid | DEBUG
 # Target with MACRO defined in [Define] section
 #
 SectionStringsCommonItem4 = \
-"""
+    """
 GUID | Test/Test.guid | $(TARGET)
 """
 
@@ -56,7 +56,7 @@ GUID | Test/Test.guid | $(TARGET)
 # FileName with MACRO defined in [Binary] section
 #
 SectionStringsCommonItem5 = \
-"""
+    """
 DEFINE BINARY_FILE_PATH = Test
 GUID | $(BINARY_FILE_PATH)/Test.guid | $(TARGET)
 """
@@ -65,7 +65,7 @@ GUID | $(BINARY_FILE_PATH)/Test.guid | $(TARGET)
 # Have 4 elements, Type | FileName | Target | Family
 #
 SectionStringsCommonItem6 = \
-"""
+    """
 GUID | Test/Test.guid | DEBUG | *
 """
 
@@ -73,7 +73,7 @@ GUID | Test/Test.guid | DEBUG | *
 # Have 4 elements, Type | FileName | Target | Family
 #
 SectionStringsCommonItem7 = \
-"""
+    """
 GUID | Test/Test.guid | DEBUG | MSFT
 """
 
@@ -81,7 +81,7 @@ GUID | Test/Test.guid | DEBUG | MSFT
 # Have 5 elements, Type | FileName | Target | Family | TagName
 #
 SectionStringsCommonItem8 = \
-"""
+    """
 GUID | Test/Test.guid | DEBUG | MSFT | TEST
 """
 
@@ -89,7 +89,7 @@ GUID | Test/Test.guid | DEBUG | MSFT | TEST
 # Have 6 elements, Type | FileName | Target | Family | TagName | FFE
 #
 SectionStringsCommonItem9 = \
-"""
+    """
 GUID | Test/Test.guid | DEBUG | MSFT | TEST | TRUE
 """
 
@@ -98,28 +98,27 @@ GUID | Test/Test.guid | DEBUG | MSFT | TEST | TRUE
 # Test wrong format
 #
 SectionStringsCommonItem10 = \
-"""
+    """
 GUID | Test/Test.guid | DEBUG | MSFT | TEST | TRUE | OVERFLOW
 """
 
 #-------------end of common binary item test input----------------------------#
 
 
-
 #-------------start of VER type binary item test input------------------------#
 
 #
 # Has 1 element, error format
 #
 SectionStringsVerItem1 = \
-"""
+    """
 VER
 """
 #
 # Have 5 elements, error format(Maximum elements amount is 4)
 #
 SectionStringsVerItem2 = \
-"""
+    """
 VER | Test/Test.ver | * | TRUE | OverFlow
 """
 
@@ -127,7 +126,7 @@ VER | Test/Test.ver | * | TRUE | OverFlow
 # Have 2 elements, Type | FileName
 #
 SectionStringsVerItem3 = \
-"""
+    """
 VER | Test/Test.ver
 """
 
@@ -135,7 +134,7 @@ VER | Test/Test.ver
 # Have 3 elements, Type | FileName | Target
 #
 SectionStringsVerItem4 = \
-"""
+    """
 VER | Test/Test.ver | DEBUG
 """
 
@@ -143,7 +142,7 @@ VER | Test/Test.ver | DEBUG
 # Have 4 elements, Type | FileName | Target | FeatureFlagExp
 #
 SectionStringsVerItem5 = \
-"""
+    """
 VER | Test/Test.ver | DEBUG | TRUE
 """
 
@@ -151,7 +150,7 @@ VER | Test/Test.ver | DEBUG | TRUE
 # Exist 2 VER items, both opened.
 #
 SectionStringsVerItem6 = \
-"""
+    """
 VER | Test/Test.ver | * | TRUE
 VER | Test/Test2.ver | * | TRUE
 """
@@ -161,7 +160,7 @@ VER | Test/Test2.ver | * | TRUE
 # Exist 2 VER items, only 1 opened.
 #
 SectionStringsVerItem7 = \
-"""
+    """
 VER | Test/Test.ver | * | TRUE
 VER | Test/Test2.ver | * | FALSE
 """
@@ -175,19 +174,19 @@ VER | Test/Test2.ver | * | FALSE
 # Test only one UI section can exist
 #
 SectionStringsUiItem1 = \
-"""
+    """
 UI | Test/Test.ui | * | TRUE
 UI | Test/Test2.ui | * | TRUE
 """
 
 SectionStringsUiItem2 = \
-"""
+    """
 UI | Test/Test.ui | * | TRUE
 SEC_UI | Test/Test2.ui | * | TRUE
 """
 
 SectionStringsUiItem3 = \
-"""
+    """
 UI | Test/Test.ui | * | TRUE
 UI | Test/Test2.ui | * | FALSE
 """
@@ -196,14 +195,14 @@ UI | Test/Test2.ui | * | FALSE
 # Has 1 element, error format
 #
 SectionStringsUiItem4 = \
-"""
+    """
 UI
 """
 #
 # Have 5 elements, error format(Maximum elements amount is 4)
 #
 SectionStringsUiItem5 = \
-"""
+    """
 UI | Test/Test.ui | * | TRUE | OverFlow
 """
 
@@ -211,7 +210,7 @@ UI | Test/Test.ui | * | TRUE | OverFlow
 # Have 2 elements, Type | FileName
 #
 SectionStringsUiItem6 = \
-"""
+    """
 UI | Test/Test.ui
 """
 
@@ -219,7 +218,7 @@ UI | Test/Test.ui
 # Have 3 elements, Type | FileName | Target
 #
 SectionStringsUiItem7 = \
-"""
+    """
 UI | Test/Test.ui | DEBUG
 """
 
@@ -227,7 +226,7 @@ UI | Test/Test.ui | DEBUG
 # Have 4 elements, Type | FileName | Target | FeatureFlagExp
 #
 SectionStringsUiItem8 = \
-"""
+    """
 UI | Test/Test.ui | DEBUG | TRUE
 """
 #---------------end of UI type binary item test input-------------------------#
@@ -238,6 +237,8 @@ gFileName = "BinarySectionTest.inf"
 ##
 # Construct SectionString for call section parser usage.
 #
+
+
 def StringToSectionString(String):
     Lines = String.split('\n')
     LineNo = 0
@@ -250,6 +251,7 @@ def StringToSectionString(String):
 
     return SectionString
 
+
 def PrepareTest(String):
     SectionString = StringToSectionString(String)
     ItemList = []
@@ -263,7 +265,7 @@ def PrepareTest(String):
             #
             FileName = os.path.normpath(os.path.realpath(ValueList[1].strip()))
             try:
-                TempFile  = open (FileName, "w")
+                TempFile = open(FileName, "w")
                 TempFile.close()
             except:
                 print("File Create Error")
@@ -277,6 +279,7 @@ def PrepareTest(String):
 
     return ItemList
 
+
 if __name__ == '__main__':
     Logger.Initialize()
 
@@ -290,26 +293,26 @@ if __name__ == '__main__':
     # For All Ui test
     #
     UiStringList = [
-                    SectionStringsUiItem1,
-                    SectionStringsUiItem2,
-                    SectionStringsUiItem3,
-                    SectionStringsUiItem4,
-                    SectionStringsUiItem5,
-                    SectionStringsUiItem6,
-                    SectionStringsUiItem7,
-                    SectionStringsUiItem8
-                    ]
+        SectionStringsUiItem1,
+        SectionStringsUiItem2,
+        SectionStringsUiItem3,
+        SectionStringsUiItem4,
+        SectionStringsUiItem5,
+        SectionStringsUiItem6,
+        SectionStringsUiItem7,
+        SectionStringsUiItem8
+    ]
 
     for Item in UiStringList:
         Ui = PrepareTest(Item)
         if Item == SectionStringsUiItem4 or Item == SectionStringsUiItem5:
             try:
-                InfBinariesInstance.SetBinary(Ui = Ui, ArchList = ArchList)
+                InfBinariesInstance.SetBinary(Ui=Ui, ArchList=ArchList)
             except Logger.FatalError:
                 pass
         else:
             try:
-                InfBinariesInstance.SetBinary(Ui = Ui, ArchList = ArchList)
+                InfBinariesInstance.SetBinary(Ui=Ui, ArchList=ArchList)
             except:
                 AllPassedFlag = False
 
@@ -317,27 +320,27 @@ if __name__ == '__main__':
     # For All Ver Test
     #
     VerStringList = [
-                     SectionStringsVerItem1,
-                     SectionStringsVerItem2,
-                     SectionStringsVerItem3,
-                     SectionStringsVerItem4,
-                     SectionStringsVerItem5,
-                     SectionStringsVerItem6,
-                     SectionStringsVerItem7
-                     ]
+        SectionStringsVerItem1,
+        SectionStringsVerItem2,
+        SectionStringsVerItem3,
+        SectionStringsVerItem4,
+        SectionStringsVerItem5,
+        SectionStringsVerItem6,
+        SectionStringsVerItem7
+    ]
     for Item in VerStringList:
         Ver = PrepareTest(Item)
         if Item == SectionStringsVerItem1 or \
            Item == SectionStringsVerItem2:
 
             try:
-                InfBinariesInstance.SetBinary(Ver = Ver, ArchList = ArchList)
+                InfBinariesInstance.SetBinary(Ver=Ver, ArchList=ArchList)
             except:
                 pass
 
         else:
             try:
-                InfBinariesInstance.SetBinary(Ver = Ver, ArchList = ArchList)
+                InfBinariesInstance.SetBinary(Ver=Ver, ArchList=ArchList)
             except:
                 AllPassedFlag = False
 
@@ -345,17 +348,17 @@ if __name__ == '__main__':
     # For All Common Test
     #
     CommonStringList = [
-                     SectionStringsCommonItem1,
-                     SectionStringsCommonItem2,
-                     SectionStringsCommonItem3,
-                     SectionStringsCommonItem4,
-                     SectionStringsCommonItem5,
-                     SectionStringsCommonItem6,
-                     SectionStringsCommonItem7,
-                     SectionStringsCommonItem8,
-                     SectionStringsCommonItem9,
-                     SectionStringsCommonItem10
-                     ]
+        SectionStringsCommonItem1,
+        SectionStringsCommonItem2,
+        SectionStringsCommonItem3,
+        SectionStringsCommonItem4,
+        SectionStringsCommonItem5,
+        SectionStringsCommonItem6,
+        SectionStringsCommonItem7,
+        SectionStringsCommonItem8,
+        SectionStringsCommonItem9,
+        SectionStringsCommonItem10
+    ]
 
     for Item in CommonStringList:
         CommonBin = PrepareTest(Item)
@@ -363,19 +366,19 @@ if __name__ == '__main__':
            Item == SectionStringsCommonItem1:
 
             try:
-                InfBinariesInstance.SetBinary(CommonBinary = CommonBin, ArchList = ArchList)
+                InfBinariesInstance.SetBinary(
+                    CommonBinary=CommonBin, ArchList=ArchList)
             except:
                 pass
 
         else:
             try:
-                InfBinariesInstance.SetBinary(Ver = Ver, ArchList = ArchList)
+                InfBinariesInstance.SetBinary(Ver=Ver, ArchList=ArchList)
             except:
                 print("Test Failed!")
                 AllPassedFlag = False
 
-    if AllPassedFlag :
+    if AllPassedFlag:
         print('All tests passed...')
     else:
         print('Some unit test failed!')
-
diff --git a/BaseTools/Source/Python/UPT/Xml/CommonXml.py b/BaseTools/Source/Python/UPT/Xml/CommonXml.py
index cfadacf4aaaf..d7dc8435d1a2 100644
--- a/BaseTools/Source/Python/UPT/Xml/CommonXml.py
+++ b/BaseTools/Source/Python/UPT/Xml/CommonXml.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to parse a PCD file of .PKG file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -41,6 +41,8 @@ import Library.DataType as DataType
 ##
 # ClonedFromXml
 #
+
+
 class ClonedFromXml(object):
     def __init__(self):
         self.GUID = ''
@@ -85,9 +87,11 @@ class CommonDefinesXml(object):
             pass
         self.Usage = XmlAttribute(Item, 'Usage')
         self.SupArchList = \
-        [Arch for Arch in GetSplitValueList(XmlAttribute(Item, 'SupArchList'), DataType.TAB_SPACE_SPLIT) if Arch]
+            [Arch for Arch in GetSplitValueList(XmlAttribute(
+                Item, 'SupArchList'), DataType.TAB_SPACE_SPLIT) if Arch]
         self.SupModList = \
-        [Mod for Mod in GetSplitValueList(XmlAttribute(Item, 'SupModList'), DataType.TAB_SPACE_SPLIT) if Mod]
+            [Mod for Mod in GetSplitValueList(XmlAttribute(
+                Item, 'SupModList'), DataType.TAB_SPACE_SPLIT) if Mod]
         self.FeatureFlag = ConvertNOTEQToNE(XmlAttribute(Item, 'FeatureFlag'))
 
     def ToXml(self):
@@ -95,11 +99,13 @@ class CommonDefinesXml(object):
 
     def __str__(self):
         return "Usage = %s SupArchList = %s SupModList = %s FeatureFlag = %s" \
-                % (self.Usage, self.SupArchList, self.SupModList, self.FeatureFlag)
+            % (self.Usage, self.SupArchList, self.SupModList, self.FeatureFlag)
 
 ##
 # PromptXml
 #
+
+
 class PromptXml(object):
     def __init__(self):
         self.Prompt = ''
@@ -115,12 +121,15 @@ class PromptXml(object):
         if self.Prompt:
             pass
         return CreateXmlElement('%s' % Key, Prompt.GetString(), [], [['Lang', Prompt.GetLang()]])
+
     def __str__(self):
         return "Prompt = %s Lang = %s" % (self.Prompt, self.Lang)
 
 ##
 # HelpTextXml
 #
+
+
 class HelpTextXml(object):
     def __init__(self):
         self.HelpText = ''
@@ -136,12 +145,15 @@ class HelpTextXml(object):
         if self.HelpText:
             pass
         return CreateXmlElement('%s' % Key, HelpText.GetString(), [], [['Lang', HelpText.GetLang()]])
+
     def __str__(self):
         return "HelpText = %s Lang = %s" % (self.HelpText, self.Lang)
 
 ##
 # HeaderXml
 #
+
+
 class HeaderXml(object):
     def __init__(self):
         self.Name = ''
@@ -159,26 +171,32 @@ class HeaderXml(object):
             if IsStandAlongModule:
                 XmlTreeLevel = ['DistributionPackage', 'ModuleSurfaceArea']
             else:
-                XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea', 'ModuleSurfaceArea']
-            CheckDict = {'Header':''}
+                XmlTreeLevel = ['DistributionPackage',
+                                'PackageSurfaceArea', 'ModuleSurfaceArea']
+            CheckDict = {'Header': ''}
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
         self.Name = XmlElement(Item, '%s/Name' % Key)
-        self.BaseName = XmlAttribute(XmlNode(Item, '%s/Name' % Key), 'BaseName')
+        self.BaseName = XmlAttribute(
+            XmlNode(Item, '%s/Name' % Key), 'BaseName')
         self.GUID = XmlElement(Item, '%s/GUID' % Key)
         self.Version = XmlAttribute(XmlNode(Item, '%s/GUID' % Key), 'Version')
 
         for SubItem in XmlList(Item, '%s/Abstract' % Key):
             HeaderAbstractLang = XmlAttribute(SubItem, 'Lang')
-            self.AbstractList.append((HeaderAbstractLang, XmlElement(SubItem, '%s/Abstract' % Key)))
+            self.AbstractList.append(
+                (HeaderAbstractLang, XmlElement(SubItem, '%s/Abstract' % Key)))
         for SubItem in XmlList(Item, '%s/Description' % Key):
             HeaderDescriptionLang = XmlAttribute(SubItem, 'Lang')
-            self.DescriptionList.append((HeaderDescriptionLang, XmlElement(SubItem, '%s/Description' % Key)))
+            self.DescriptionList.append(
+                (HeaderDescriptionLang, XmlElement(SubItem, '%s/Description' % Key)))
         for SubItem in XmlList(Item, '%s/Copyright' % Key):
             HeaderCopyrightLang = XmlAttribute(SubItem, 'Lang')
-            self.CopyrightList.append((HeaderCopyrightLang, XmlElement(SubItem, '%s/Copyright' % Key)))
+            self.CopyrightList.append(
+                (HeaderCopyrightLang, XmlElement(SubItem, '%s/Copyright' % Key)))
         for SubItem in XmlList(Item, '%s/License' % Key):
             HeaderLicenseLang = XmlAttribute(SubItem, 'Lang')
-            self.LicenseList.append((HeaderLicenseLang, XmlElement(SubItem, '%s/License' % Key)))
+            self.LicenseList.append(
+                (HeaderLicenseLang, XmlElement(SubItem, '%s/License' % Key)))
         ModuleHeader = ModuleObject()
         ModuleHeader.SetName(self.Name)
         ModuleHeader.SetBaseName(self.BaseName)
@@ -193,8 +211,10 @@ class HeaderXml(object):
     def ToXml(self, Header, Key):
         if self.GUID:
             pass
-        Element1 = CreateXmlElement('Name', Header.GetName(), [], [['BaseName', Header.GetBaseName()]])
-        Element2 = CreateXmlElement('GUID', Header.GetGuid(), [], [['Version', Header.GetVersion()]])
+        Element1 = CreateXmlElement('Name', Header.GetName(), [], [
+                                    ['BaseName', Header.GetBaseName()]])
+        Element2 = CreateXmlElement('GUID', Header.GetGuid(), [], [
+                                    ['Version', Header.GetVersion()]])
         NodeList = [Element1,
                     Element2,
                     ]
@@ -226,10 +246,12 @@ class HeaderXml(object):
                 NodeList.append(CreateXmlElement('License', Value, [], []))
         for (Lang, Value) in Header.GetAbstract() + UNIInfAbstractList:
             if Value:
-                NodeList.append(CreateXmlElement('Abstract', Value, [], [['Lang', Lang]]))
+                NodeList.append(CreateXmlElement(
+                    'Abstract', Value, [], [['Lang', Lang]]))
         for (Lang, Value) in Header.GetDescription() + UNIInfDescriptionList:
             if Value:
-                NodeList.append(CreateXmlElement('Description', Value, [], [['Lang', Lang]]))
+                NodeList.append(CreateXmlElement(
+                    'Description', Value, [], [['Lang', Lang]]))
 
         AttributeList = []
         Root = CreateXmlElement('%s' % Key, '', NodeList, AttributeList)
@@ -238,11 +260,13 @@ class HeaderXml(object):
     def __str__(self):
         return "Name = %s BaseName = %s GUID = %s Version = %s Copyright = %s \
         License = %s Abstract = %s Description = %s" % \
-        (self.Name, self.BaseName, self.GUID, self.Version, self.CopyrightList, \
-         self.LicenseList, self.AbstractList, self.DescriptionList)
+            (self.Name, self.BaseName, self.GUID, self.Version, self.CopyrightList,
+             self.LicenseList, self.AbstractList, self.DescriptionList)
 ##
 # DistributionPackageHeaderXml
 #
+
+
 class DistributionPackageHeaderXml(object):
     def __init__(self):
         self.Header = HeaderXml()
@@ -289,19 +313,21 @@ class DistributionPackageHeaderXml(object):
     def ToXml(self, DistributionPackageHeader, Key):
         if self.Header:
             pass
-        Element1 = CreateXmlElement('Name', \
-                                    DistributionPackageHeader.GetName(), [], \
-                                    [['BaseName', \
-                                    DistributionPackageHeader.GetBaseName()]])
-        Element2 = CreateXmlElement('GUID', \
-                                    DistributionPackageHeader.GetGuid(), [], \
-                                    [['Version', \
-                                    DistributionPackageHeader.GetVersion()]])
+        Element1 = CreateXmlElement('Name',
+                                    DistributionPackageHeader.GetName(), [],
+                                    [['BaseName',
+                                      DistributionPackageHeader.GetBaseName()]])
+        Element2 = CreateXmlElement('GUID',
+                                    DistributionPackageHeader.GetGuid(), [],
+                                    [['Version',
+                                      DistributionPackageHeader.GetVersion()]])
         AttributeList = []
         if DistributionPackageHeader.ReadOnly != '':
-            AttributeList.append(['ReadOnly', str(DistributionPackageHeader.ReadOnly).lower()])
+            AttributeList.append(
+                ['ReadOnly', str(DistributionPackageHeader.ReadOnly).lower()])
         if DistributionPackageHeader.RePackage != '':
-            AttributeList.append(['RePackage', str(DistributionPackageHeader.RePackage).lower()])
+            AttributeList.append(
+                ['RePackage', str(DistributionPackageHeader.RePackage).lower()])
         if DistributionPackageHeader.GetAbstract():
             DPAbstract = DistributionPackageHeader.GetAbstract()[0][1]
         else:
@@ -327,7 +353,7 @@ class DistributionPackageHeaderXml(object):
                     ['Abstract', DPAbstract],
                     ['Description', DPDescription],
                     ['Signature', DistributionPackageHeader.Signature],
-                    ['XmlSpecification', \
+                    ['XmlSpecification',
                      DistributionPackageHeader.XmlSpecification],
                     ]
         Root = CreateXmlElement('%s' % Key, '', NodeList, AttributeList)
@@ -336,11 +362,13 @@ class DistributionPackageHeaderXml(object):
     def __str__(self):
         return "ReadOnly = %s RePackage = %s Vendor = %s Date = %s \
         Signature = %s XmlSpecification = %s %s" % \
-        (self.ReadOnly, self.RePackage, self.Vendor, self.Date, \
-         self.Signature, self.XmlSpecification, self.Header)
+            (self.ReadOnly, self.RePackage, self.Vendor, self.Date,
+             self.Signature, self.XmlSpecification, self.Header)
 ##
 # PackageHeaderXml
 #
+
+
 class PackageHeaderXml(object):
     def __init__(self):
         self.Header = HeaderXml()
@@ -366,9 +394,9 @@ class PackageHeaderXml(object):
     def ToXml(self, PackageObject2, Key):
         if self.PackagePath:
             pass
-        Element1 = CreateXmlElement('Name', PackageObject2.GetName(), [], \
-                         [['BaseName', PackageObject2.GetBaseName()]])
-        Element2 = CreateXmlElement('GUID', PackageObject2.GetGuid(), [], \
+        Element1 = CreateXmlElement('Name', PackageObject2.GetName(), [],
+                                    [['BaseName', PackageObject2.GetBaseName()]])
+        Element2 = CreateXmlElement('GUID', PackageObject2.GetGuid(), [],
                                     [['Version', PackageObject2.GetVersion()]])
         NodeList = [Element1,
                     Element2
@@ -395,17 +423,20 @@ class PackageHeaderXml(object):
         # Get Abstract and Description from DEC File Header
         for (Lang, Value) in PackageObject2.GetCopyright():
             if Value:
-                NodeList.append(CreateXmlElement(DataType.TAB_HEADER_COPYRIGHT, Value, [], []))
+                NodeList.append(CreateXmlElement(
+                    DataType.TAB_HEADER_COPYRIGHT, Value, [], []))
         for (Lang, Value) in PackageObject2.GetLicense():
             if Value:
-                NodeList.append(CreateXmlElement(DataType.TAB_HEADER_LICENSE, Value, [], []))
+                NodeList.append(CreateXmlElement(
+                    DataType.TAB_HEADER_LICENSE, Value, [], []))
         for (Lang, Value) in PackageObject2.GetAbstract() + UNIPackageAbrstractList:
             if Value:
-                NodeList.append(CreateXmlElement(DataType.TAB_HEADER_ABSTRACT, Value, [], [['Lang', Lang]]))
+                NodeList.append(CreateXmlElement(
+                    DataType.TAB_HEADER_ABSTRACT, Value, [], [['Lang', Lang]]))
         for (Lang, Value) in PackageObject2.GetDescription() + UNIPackageDescriptionList:
             if Value:
-                NodeList.append(CreateXmlElement(DataType.TAB_HEADER_DESCRIPTION, Value, [], [['Lang', Lang]]))
-
+                NodeList.append(CreateXmlElement(
+                    DataType.TAB_HEADER_DESCRIPTION, Value, [], [['Lang', Lang]]))
 
         NodeList.append(['PackagePath', PackageObject2.GetPackagePath()])
         AttributeList = []
@@ -419,6 +450,8 @@ class PackageHeaderXml(object):
 ##
 # MiscellaneousFileXml
 #
+
+
 class MiscellaneousFileXml(object):
     def __init__(self):
         self.Header = HeaderXml()
@@ -426,6 +459,7 @@ class MiscellaneousFileXml(object):
     ##
     # This API is used for Package or Module's MiscellaneousFile section
     #
+
     def FromXml(self, Item, Key):
         if not Item:
             return None
@@ -434,7 +468,8 @@ class MiscellaneousFileXml(object):
         self.Header.FromXml(NewItem, 'Header')
         for SubItem in XmlList(Item, '%s/Filename' % Key):
             Filename = XmlElement(SubItem, '%s/Filename' % Key)
-            Executable = XmlAttribute(XmlNode(SubItem, '%s/Filename' % Key), 'Executable')
+            Executable = XmlAttribute(
+                XmlNode(SubItem, '%s/Filename' % Key), 'Executable')
             if Executable.upper() == "TRUE":
                 Executable = True
             elif Executable.upper() == "FALSE":
@@ -458,6 +493,7 @@ class MiscellaneousFileXml(object):
     ##
     # This API is used for DistP's tool section
     #
+
     def FromXml2(self, Item, Key):
         if Item is None:
             return None
@@ -466,7 +502,8 @@ class MiscellaneousFileXml(object):
         for SubItem in XmlList(Item, '%s/Filename' % Key):
             Filename = XmlElement(SubItem, '%s/Filename' % Key)
             Executable = \
-            XmlAttribute(XmlNode(SubItem, '%s/Filename' % Key), 'Executable')
+                XmlAttribute(XmlNode(SubItem, '%s/Filename' %
+                             Key), 'Executable')
             OsType = XmlAttribute(XmlNode(SubItem, '%s/Filename' % Key), 'OS')
             if Executable.upper() == "TRUE":
                 Executable = True
@@ -518,17 +555,17 @@ class MiscellaneousFileXml(object):
                         ['License', DPLicense],
                         ['Abstract', DPAbstract],
                         ['Description', DPDescription],
-                       ]
+                        ]
             for File in MiscFile.GetFileList():
-                NodeList.append\
-                (CreateXmlElement\
-                 ('Filename', File.GetURI(), [], \
-                  [['Executable', str(File.GetExecutable()).lower()]]))
+                NodeList.append(CreateXmlElement
+                                ('Filename', File.GetURI(), [],
+                                 [['Executable', str(File.GetExecutable()).lower()]]))
             Root = CreateXmlElement('%s' % Key, '', NodeList, [])
             return Root
     ##
     # This API is used for DistP's tool section
     #
+
     def ToXml2(self, MiscFile, Key):
         if self.Header:
             pass
@@ -554,15 +591,14 @@ class MiscellaneousFileXml(object):
                         ['License', DPLicense],
                         ['Abstract', DPAbstract],
                         ['Description', DPDescription],
-                       ]
+                        ]
             HeaderNode = CreateXmlElement('Header', '', NodeList, [])
             NodeList = [HeaderNode]
             for File in MiscFile.GetFileList():
-                NodeList.append\
-                (CreateXmlElement\
-                 ('Filename', File.GetURI(), [], \
-                  [['Executable', str(File.GetExecutable()).lower()], \
-                   ['OS', File.GetOS()]]))
+                NodeList.append(CreateXmlElement
+                                ('Filename', File.GetURI(), [],
+                                 [['Executable', str(File.GetExecutable()).lower()],
+                                  ['OS', File.GetOS()]]))
             Root = CreateXmlElement('%s' % Key, '', NodeList, [])
             return Root
 
@@ -574,6 +610,8 @@ class MiscellaneousFileXml(object):
 ##
 # UserExtensionsXml
 #
+
+
 class UserExtensionsXml(object):
     def __init__(self):
         self.UserId = ''
@@ -605,22 +643,23 @@ class UserExtensionsXml(object):
         self.UserId = XmlAttribute(XmlNode(Item, '%s' % Key), 'UserId')
         self.Identifier = XmlAttribute(XmlNode(Item, '%s' % Key), 'Identifier')
         if self.UserId == DataType.TAB_BINARY_HEADER_USERID \
-        and self.Identifier == DataType.TAB_BINARY_HEADER_IDENTIFIER:
+                and self.Identifier == DataType.TAB_BINARY_HEADER_IDENTIFIER:
             for SubItem in XmlList(Item, '%s/BinaryAbstract' % Key):
                 BinaryAbstractLang = XmlAttribute(SubItem, 'Lang')
-                self.BinaryAbstractList.append((BinaryAbstractLang, XmlElement(SubItem, '%s/BinaryAbstract' % Key)))
+                self.BinaryAbstractList.append(
+                    (BinaryAbstractLang, XmlElement(SubItem, '%s/BinaryAbstract' % Key)))
             for SubItem in XmlList(Item, '%s/BinaryDescription' % Key):
                 BinaryDescriptionLang = XmlAttribute(SubItem, 'Lang')
                 self.BinaryDescriptionList.append((BinaryDescriptionLang,
-                                                       XmlElement(SubItem, '%s/BinaryDescription' % Key)))
+                                                   XmlElement(SubItem, '%s/BinaryDescription' % Key)))
             for SubItem in XmlList(Item, '%s/BinaryCopyright' % Key):
                 BinaryCopyrightLang = XmlAttribute(SubItem, 'Lang')
                 self.BinaryCopyrightList.append((BinaryCopyrightLang,
-                                                     XmlElement(SubItem, '%s/BinaryCopyright' % Key)))
+                                                 XmlElement(SubItem, '%s/BinaryCopyright' % Key)))
             for SubItem in XmlList(Item, '%s/BinaryLicense' % Key):
                 BinaryLicenseLang = XmlAttribute(SubItem, 'Lang')
                 self.BinaryLicenseList.append((BinaryLicenseLang,
-                                                   XmlElement(SubItem, '%s/BinaryLicense' % Key)))
+                                               XmlElement(SubItem, '%s/BinaryLicense' % Key)))
 
         DefineItem = XmlNode(Item, '%s/Define' % Key)
         for SubItem in XmlList(DefineItem, 'Define/Statement'):
@@ -629,12 +668,14 @@ class UserExtensionsXml(object):
         BuildOptionItem = XmlNode(Item, '%s/BuildOption' % Key)
         for SubItem in XmlList(BuildOptionItem, 'BuildOption/Statement'):
             Statement = XmlElement(SubItem, '%s/Statement' % Key)
-            Arch = XmlAttribute(XmlNode(SubItem, '%s/Statement' % Key), 'SupArchList')
+            Arch = XmlAttribute(
+                XmlNode(SubItem, '%s/Statement' % Key), 'SupArchList')
             self.BuildOptionDict[Arch] = Statement
         IncludesItem = XmlNode(Item, '%s/Includes' % Key)
         for SubItem in XmlList(IncludesItem, 'Includes/Statement'):
             Statement = XmlElement(SubItem, '%s/Statement' % Key)
-            Arch = XmlAttribute(XmlNode(SubItem, '%s/Statement' % Key), 'SupArchList')
+            Arch = XmlAttribute(
+                XmlNode(SubItem, '%s/Statement' % Key), 'SupArchList')
             self.IncludesDict[Statement] = Arch
         SourcesItem = XmlNode(Item, '%s/Sources' % Key)
         Tmp = UserExtensionSourceXml()
@@ -646,7 +687,8 @@ class UserExtensionsXml(object):
         self.BinariesDict = BinariesDict
         self.Statement = XmlElement(Item, 'UserExtensions')
         SupArch = XmlAttribute(XmlNode(Item, '%s' % Key), 'SupArchList')
-        self.SupArchList = [Arch for Arch in GetSplitValueList(SupArch, DataType.TAB_SPACE_SPLIT) if Arch]
+        self.SupArchList = [Arch for Arch in GetSplitValueList(
+            SupArch, DataType.TAB_SPACE_SPLIT) if Arch]
         UserExtension = UserExtensionObject()
         UserExtension.SetUserID(self.UserId)
         UserExtension.SetIdentifier(self.Identifier)
@@ -668,35 +710,39 @@ class UserExtensionsXml(object):
             pass
         AttributeList = [['UserId', str(UserExtension.GetUserID())],
                          ['Identifier', str(UserExtension.GetIdentifier())],
-                         ['SupArchList', \
+                         ['SupArchList',
                           GetStringOfList(UserExtension.GetSupArchList())],
-                        ]
-        Root = CreateXmlElement('%s' % Key, UserExtension.GetStatement(), [], \
-                                    AttributeList)
+                         ]
+        Root = CreateXmlElement('%s' % Key, UserExtension.GetStatement(), [],
+                                AttributeList)
         if UserExtension.GetIdentifier() == DataType.TAB_BINARY_HEADER_IDENTIFIER and \
-        UserExtension.GetUserID() == DataType.TAB_BINARY_HEADER_USERID:
+                UserExtension.GetUserID() == DataType.TAB_BINARY_HEADER_USERID:
             for (Lang, Value) in UserExtension.GetBinaryAbstract():
                 if Value:
-                    ChildElement = CreateXmlElement('BinaryAbstract', Value, [], [['Lang', Lang]])
+                    ChildElement = CreateXmlElement(
+                        'BinaryAbstract', Value, [], [['Lang', Lang]])
                     Root.appendChild(ChildElement)
             for (Lang, Value) in UserExtension.GetBinaryDescription():
                 if Value:
-                    ChildElement = CreateXmlElement('BinaryDescription', Value, [], [['Lang', Lang]])
+                    ChildElement = CreateXmlElement(
+                        'BinaryDescription', Value, [], [['Lang', Lang]])
                     Root.appendChild(ChildElement)
             for (Lang, Value) in UserExtension.GetBinaryCopyright():
                 if Value:
-                    ChildElement = CreateXmlElement('BinaryCopyright', Value, [], [])
+                    ChildElement = CreateXmlElement(
+                        'BinaryCopyright', Value, [], [])
                     Root.appendChild(ChildElement)
             for (Lang, Value) in UserExtension.GetBinaryLicense():
                 if Value:
-                    ChildElement = CreateXmlElement('BinaryLicense', Value, [], [])
+                    ChildElement = CreateXmlElement(
+                        'BinaryLicense', Value, [], [])
                     Root.appendChild(ChildElement)
 
         NodeList = []
         DefineDict = UserExtension.GetDefinesDict()
         if DefineDict:
             for Item in DefineDict.keys():
-                NodeList.append(CreateXmlElement\
+                NodeList.append(CreateXmlElement
                                 ('Statement', Item, [], []))
             DefineElement = CreateXmlElement('Define', '', NodeList, [])
             Root.appendChild(DefineElement)
@@ -704,18 +750,18 @@ class UserExtensionsXml(object):
         BuildOptionDict = UserExtension.GetBuildOptionDict()
         if BuildOptionDict:
             for Item in BuildOptionDict.keys():
-                NodeList.append(CreateXmlElement\
-                                ('Statement', BuildOptionDict[Item], [], \
+                NodeList.append(CreateXmlElement
+                                ('Statement', BuildOptionDict[Item], [],
                                  [['SupArchList', Item]]))
             BuildOptionElement = \
-            CreateXmlElement('BuildOption', '', NodeList, [])
+                CreateXmlElement('BuildOption', '', NodeList, [])
             Root.appendChild(BuildOptionElement)
         NodeList = []
         IncludesDict = UserExtension.GetIncludesDict()
         if IncludesDict:
             for Item in IncludesDict.keys():
-                NodeList.append(CreateXmlElement\
-                                ('Statement', Item, [], \
+                NodeList.append(CreateXmlElement
+                                ('Statement', Item, [],
                                  [['SupArchList', IncludesDict[Item]]]))
             IncludesElement = CreateXmlElement('Includes', '', NodeList, [])
             Root.appendChild(IncludesElement)
@@ -740,6 +786,8 @@ class UserExtensionsXml(object):
 ##
 # UserExtensionSourceXml
 #
+
+
 class UserExtensionSourceXml(object):
     def __init__(self):
         self.UserExtensionSource = ''
@@ -758,13 +806,13 @@ class UserExtensionSourceXml(object):
             SupArchStr = XmlElement(SubItem, 'SourceFile/SupArchList')
             DictKey = (FileName, Family, FeatureFlag, SupArchStr)
             ValueList = []
-            for ValueNodeItem in XmlList(SubItem, \
+            for ValueNodeItem in XmlList(SubItem,
                                          'SourceFile/SourceFileOtherAttr'):
-                TagName = XmlElement(ValueNodeItem, \
+                TagName = XmlElement(ValueNodeItem,
                                      'SourceFileOtherAttr/TagName')
-                ToolCode = XmlElement(ValueNodeItem, \
+                ToolCode = XmlElement(ValueNodeItem,
                                       'SourceFileOtherAttr/ToolCode')
-                Comment = XmlElement(ValueNodeItem, \
+                Comment = XmlElement(ValueNodeItem,
                                      'SourceFileOtherAttr/Comment')
                 if (TagName == ' ') and (ToolCode == ' ') and (Comment == ' '):
                     TagName = ''
@@ -795,10 +843,10 @@ class UserExtensionSourceXml(object):
                 ValueNodeList.append(["TagName", TagName])
                 ValueNodeList.append(["ToolCode", ToolCode])
                 ValueNodeList.append(["Comment", Comment])
-                ValueNodeXml = CreateXmlElement('SourceFileOtherAttr', '', \
+                ValueNodeXml = CreateXmlElement('SourceFileOtherAttr', '',
                                                 ValueNodeList, [])
                 SourceFileNodeList.append(ValueNodeXml)
-            SourceFileNodeXml = CreateXmlElement('SourceFile', '', \
+            SourceFileNodeXml = CreateXmlElement('SourceFile', '',
                                                  SourceFileNodeList, [])
             SourcesNodeList.append(SourceFileNodeXml)
         Root = CreateXmlElement('%s' % Key, '', SourcesNodeList, [])
@@ -807,6 +855,8 @@ class UserExtensionSourceXml(object):
 ##
 # UserExtensionBinaryXml
 #
+
+
 class UserExtensionBinaryXml(object):
     def __init__(self):
         self.UserExtensionBinary = ''
@@ -824,15 +874,15 @@ class UserExtensionBinaryXml(object):
             SupArch = XmlElement(SubItem, 'Binary/SupArchList')
             DictKey = (FileName, FileType, ConvertNOTEQToNE(FFE), SupArch)
             ValueList = []
-            for ValueNodeItem in XmlList(SubItem, \
+            for ValueNodeItem in XmlList(SubItem,
                                          'Binary/BinaryFileOtherAttr'):
-                Target = XmlElement(ValueNodeItem, \
+                Target = XmlElement(ValueNodeItem,
                                     'BinaryFileOtherAttr/Target')
-                Family = XmlElement(ValueNodeItem, \
+                Family = XmlElement(ValueNodeItem,
                                     'BinaryFileOtherAttr/Family')
-                TagName = XmlElement(ValueNodeItem, \
+                TagName = XmlElement(ValueNodeItem,
                                      'BinaryFileOtherAttr/TagName')
-                Comment = XmlElement(ValueNodeItem, \
+                Comment = XmlElement(ValueNodeItem,
                                      'BinaryFileOtherAttr/Comment')
                 if (Target == ' ') and (Family == ' ') and \
                    (TagName == ' ') and (Comment == ' '):
@@ -868,7 +918,7 @@ class UserExtensionBinaryXml(object):
                 ValueNodeList.append(["Family", Family])
                 ValueNodeList.append(["TagName", TagName])
                 ValueNodeList.append(["Comment", Comment])
-                ValueNodeXml = CreateXmlElement('BinaryFileOtherAttr', '', \
+                ValueNodeXml = CreateXmlElement('BinaryFileOtherAttr', '',
                                                 ValueNodeList, [])
                 FileNodeList.append(ValueNodeXml)
             FileNodeXml = CreateXmlElement('Binary', '', FileNodeList, [])
@@ -879,6 +929,8 @@ class UserExtensionBinaryXml(object):
 ##
 # LibraryClassXml
 #
+
+
 class LibraryClassXml(object):
     def __init__(self):
         self.Keyword = ''
@@ -905,7 +957,8 @@ class LibraryClassXml(object):
             LibraryClass.SetUsage(self.CommonDefines.Usage)
         LibraryClass.SetSupArchList(self.CommonDefines.SupArchList)
         LibraryClass.SetSupModuleList(self.CommonDefines.SupModList)
-        LibraryClass.SetFeatureFlag(ConvertNOTEQToNE(self.CommonDefines.FeatureFlag))
+        LibraryClass.SetFeatureFlag(
+            ConvertNOTEQToNE(self.CommonDefines.FeatureFlag))
         LibraryClass.SetHelpTextList(GetHelpTextList(self.HelpText))
         return LibraryClass
 
@@ -913,10 +966,11 @@ class LibraryClassXml(object):
         if self.HeaderFile:
             pass
         AttributeList = \
-        [['Keyword', LibraryClass.GetLibraryClass()],
-         ['SupArchList', GetStringOfList(LibraryClass.GetSupArchList())],
-         ['SupModList', GetStringOfList(LibraryClass.GetSupModuleList())]
-         ]
+            [['Keyword', LibraryClass.GetLibraryClass()],
+             ['SupArchList', GetStringOfList(LibraryClass.GetSupArchList())],
+                ['SupModList', GetStringOfList(
+                    LibraryClass.GetSupModuleList())]
+             ]
         NodeList = [['HeaderFile', LibraryClass.GetIncludeHeader()]]
         for Item in LibraryClass.GetHelpTextList():
             Tmp = HelpTextXml()
@@ -929,11 +983,12 @@ class LibraryClassXml(object):
             pass
         FeatureFlag = ConvertNEToNOTEQ(LibraryClass.GetFeatureFlag())
         AttributeList = \
-        [['Usage', LibraryClass.GetUsage()], \
-         ['SupArchList', GetStringOfList(LibraryClass.GetSupArchList())], \
-         ['SupModList', GetStringOfList(LibraryClass.GetSupModuleList())], \
-         ['FeatureFlag', FeatureFlag]
-         ]
+            [['Usage', LibraryClass.GetUsage()],
+             ['SupArchList', GetStringOfList(LibraryClass.GetSupArchList())],
+                ['SupModList', GetStringOfList(
+                    LibraryClass.GetSupModuleList())],
+                ['FeatureFlag', FeatureFlag]
+             ]
         NodeList = [['Keyword', LibraryClass.GetLibraryClass()], ]
         for Item in LibraryClass.GetHelpTextList():
             Tmp = HelpTextXml()
@@ -943,8 +998,8 @@ class LibraryClassXml(object):
 
     def __str__(self):
         Str = "Keyword = %s HeaderFile = %s RecommendedInstanceGuid = %s RecommendedInstanceVersion = %s %s" % \
-              (self.Keyword, self.HeaderFile, self.RecommendedInstanceGuid, self.RecommendedInstanceVersion, \
-              self.CommonDefines)
+              (self.Keyword, self.HeaderFile, self.RecommendedInstanceGuid, self.RecommendedInstanceVersion,
+               self.CommonDefines)
         for Item in self.HelpText:
             Str = Str + "\n\t" + str(Item)
         return Str
@@ -952,6 +1007,8 @@ class LibraryClassXml(object):
 ##
 # FilenameXml
 #
+
+
 class FilenameXml(object):
     def __init__(self):
         self.FileType = ''
@@ -982,16 +1039,18 @@ class FilenameXml(object):
     def ToXml(self, Filename, Key):
         if self.Filename:
             pass
-        AttributeList = [['SupArchList', \
+        AttributeList = [['SupArchList',
                           GetStringOfList(Filename.GetSupArchList())],
                          ['FileType', Filename.GetFileType()],
-                         ['FeatureFlag', ConvertNEToNOTEQ(Filename.GetFeatureFlag())],
+                         ['FeatureFlag', ConvertNEToNOTEQ(
+                             Filename.GetFeatureFlag())],
                          ['GUID', Filename.GetGuidValue()]
-                        ]
-        Root = CreateXmlElement('%s' % Key, Filename.GetFilename(), [], AttributeList)
+                         ]
+        Root = CreateXmlElement(
+            '%s' % Key, Filename.GetFilename(), [], AttributeList)
 
         return Root
 
     def __str__(self):
         return "FileType = %s Filename = %s %s" \
-             % (self.FileType, self.Filename, self.CommonDefines)
+            % (self.FileType, self.Filename, self.CommonDefines)
diff --git a/BaseTools/Source/Python/UPT/Xml/GuidProtocolPpiXml.py b/BaseTools/Source/Python/UPT/Xml/GuidProtocolPpiXml.py
index d6645fc23cf0..dbfc6876c1c7 100644
--- a/BaseTools/Source/Python/UPT/Xml/GuidProtocolPpiXml.py
+++ b/BaseTools/Source/Python/UPT/Xml/GuidProtocolPpiXml.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to parse a xml file of .PKG file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -28,8 +28,10 @@ from Xml.CommonXml import HelpTextXml
 from Xml.XmlParserMisc import GetHelpTextList
 
 ##
-#GUID/Protocol/Ppi Common
+# GUID/Protocol/Ppi Common
 #
+
+
 class GuidProtocolPpiXml(object):
     def __init__(self, Mode):
         self.UiName = ''
@@ -78,18 +80,22 @@ class GuidProtocolPpiXml(object):
         if self.GuidValue:
             pass
         AttributeList = \
-        [['Usage', GetStringOfList(GuidProtocolPpi.GetUsage())], \
-         ['UiName', GuidProtocolPpi.GetName()], \
-         ['GuidType', GetStringOfList(GuidProtocolPpi.GetGuidTypeList())], \
-         ['Notify', str(GuidProtocolPpi.GetNotify()).lower()], \
-         ['SupArchList', GetStringOfList(GuidProtocolPpi.GetSupArchList())], \
-         ['SupModList', GetStringOfList(GuidProtocolPpi.GetSupModuleList())], \
-         ['FeatureFlag', ConvertNEToNOTEQ(GuidProtocolPpi.GetFeatureFlag())]
-        ]
+            [['Usage', GetStringOfList(GuidProtocolPpi.GetUsage())],
+             ['UiName', GuidProtocolPpi.GetName()],
+                ['GuidType', GetStringOfList(
+                    GuidProtocolPpi.GetGuidTypeList())],
+                ['Notify', str(GuidProtocolPpi.GetNotify()).lower()],
+                ['SupArchList', GetStringOfList(
+                    GuidProtocolPpi.GetSupArchList())],
+                ['SupModList', GetStringOfList(
+                    GuidProtocolPpi.GetSupModuleList())],
+                ['FeatureFlag', ConvertNEToNOTEQ(
+                    GuidProtocolPpi.GetFeatureFlag())]
+             ]
         NodeList = [['CName', GuidProtocolPpi.GetCName()],
                     ['GuidValue', GuidProtocolPpi.GetGuid()],
                     ['VariableName', GuidProtocolPpi.VariableName]
-                   ]
+                    ]
         for Item in GuidProtocolPpi.GetHelpTextList():
             Tmp = HelpTextXml()
             NodeList.append(Tmp.ToXml(Item))
@@ -99,15 +105,17 @@ class GuidProtocolPpiXml(object):
 
     def __str__(self):
         Str = \
-        "UiName = %s Notify = %s GuidTypes = %s CName = %s GuidValue = %s %s" \
-        % (self.UiName, self.Notify, self.GuidTypes, self.CName, \
-           self.GuidValue, self.CommonDefines)
+            "UiName = %s Notify = %s GuidTypes = %s CName = %s GuidValue = %s %s" \
+            % (self.UiName, self.Notify, self.GuidTypes, self.CName,
+               self.GuidValue, self.CommonDefines)
         for Item in self.HelpText:
             Str = Str + "\n\t" + str(Item)
         return Str
 ##
-#GUID Xml
+# GUID Xml
 #
+
+
 class GuidXml(GuidProtocolPpiXml):
     def __init__(self, Mode):
         GuidProtocolPpiXml.__init__(self, Mode)
@@ -127,7 +135,8 @@ class GuidXml(GuidProtocolPpiXml):
             if self.GuidType:
                 GuidProtocolPpi.SetGuidTypeList([self.GuidType])
             GuidProtocolPpi.SetSupArchList(self.CommonDefines.SupArchList)
-            GuidProtocolPpi.SetFeatureFlag(ConvertNOTEQToNE(self.CommonDefines.FeatureFlag))
+            GuidProtocolPpi.SetFeatureFlag(
+                ConvertNOTEQToNE(self.CommonDefines.FeatureFlag))
             GuidProtocolPpi.SetCName(self.CName)
             GuidProtocolPpi.SetVariableName(self.VariableName)
         return GuidProtocolPpi
@@ -135,27 +144,29 @@ class GuidXml(GuidProtocolPpiXml):
     def ToXml(self, GuidProtocolPpi, Key):
         if self.Mode == 'Package':
             AttributeList = \
-            [['GuidType', \
-              GetStringOfList(GuidProtocolPpi.GetGuidTypeList())], \
-              ['SupArchList', \
-               GetStringOfList(GuidProtocolPpi.GetSupArchList())], \
-               ['SupModList', \
-                GetStringOfList(GuidProtocolPpi.GetSupModuleList())],
-            ]
+                [['GuidType',
+                  GetStringOfList(GuidProtocolPpi.GetGuidTypeList())],
+                 ['SupArchList',
+                    GetStringOfList(GuidProtocolPpi.GetSupArchList())],
+                    ['SupModList',
+                     GetStringOfList(GuidProtocolPpi.GetSupModuleList())],
+                 ]
             NodeList = [['CName', GuidProtocolPpi.GetCName()],
                         ['GuidValue', GuidProtocolPpi.GetGuid()],
-                       ]
+                        ]
         else:
             AttributeList = \
-            [['Usage', GetStringOfList(GuidProtocolPpi.GetUsage())], \
-             ['GuidType', GetStringOfList(GuidProtocolPpi.GetGuidTypeList())],\
-              ['SupArchList', \
-               GetStringOfList(GuidProtocolPpi.GetSupArchList())], \
-               ['FeatureFlag', ConvertNEToNOTEQ(GuidProtocolPpi.GetFeatureFlag())]
-            ]
+                [['Usage', GetStringOfList(GuidProtocolPpi.GetUsage())],
+                 ['GuidType', GetStringOfList(
+                     GuidProtocolPpi.GetGuidTypeList())],
+                    ['SupArchList',
+                     GetStringOfList(GuidProtocolPpi.GetSupArchList())],
+                    ['FeatureFlag', ConvertNEToNOTEQ(
+                        GuidProtocolPpi.GetFeatureFlag())]
+                 ]
             NodeList = [['CName', GuidProtocolPpi.GetCName()],
                         ['VariableName', GuidProtocolPpi.GetVariableName()]
-                       ]
+                        ]
 
         for Item in GuidProtocolPpi.GetHelpTextList():
             Tmp = HelpTextXml()
@@ -164,8 +175,10 @@ class GuidXml(GuidProtocolPpiXml):
 
         return Root
 ##
-#Protocol Xml
+# Protocol Xml
 #
+
+
 class ProtocolXml(GuidProtocolPpiXml):
     def __init__(self, Mode):
         GuidProtocolPpiXml.__init__(self, Mode)
@@ -188,7 +201,8 @@ class ProtocolXml(GuidProtocolPpiXml):
             else:
                 GuidProtocolPpi.SetNotify('')
             GuidProtocolPpi.SetSupArchList(self.CommonDefines.SupArchList)
-            GuidProtocolPpi.SetFeatureFlag(ConvertNOTEQToNE(self.CommonDefines.FeatureFlag))
+            GuidProtocolPpi.SetFeatureFlag(
+                ConvertNOTEQToNE(self.CommonDefines.FeatureFlag))
             GuidProtocolPpi.SetCName(self.CName)
 
         return GuidProtocolPpi
@@ -196,25 +210,26 @@ class ProtocolXml(GuidProtocolPpiXml):
     def ToXml(self, GuidProtocolPpi, Key):
         if self.Mode == 'Package':
             AttributeList = \
-            [['SupArchList', \
-              GetStringOfList(GuidProtocolPpi.GetSupArchList())], \
-              ['SupModList', \
-               GetStringOfList(GuidProtocolPpi.GetSupModuleList())], \
-               ['FeatureFlag', GuidProtocolPpi.GetFeatureFlag()]
-            ]
+                [['SupArchList',
+                  GetStringOfList(GuidProtocolPpi.GetSupArchList())],
+                 ['SupModList',
+                    GetStringOfList(GuidProtocolPpi.GetSupModuleList())],
+                    ['FeatureFlag', GuidProtocolPpi.GetFeatureFlag()]
+                 ]
             NodeList = [['CName', GuidProtocolPpi.GetCName()],
                         ['GuidValue', GuidProtocolPpi.GetGuid()],
-                       ]
+                        ]
         else:
             AttributeList = \
-            [['Usage', GetStringOfList(GuidProtocolPpi.GetUsage())], \
-             ['Notify', str(GuidProtocolPpi.GetNotify()).lower()], \
-             ['SupArchList', \
-              GetStringOfList(GuidProtocolPpi.GetSupArchList())], \
-              ['FeatureFlag', ConvertNEToNOTEQ(GuidProtocolPpi.GetFeatureFlag())]
-            ]
+                [['Usage', GetStringOfList(GuidProtocolPpi.GetUsage())],
+                 ['Notify', str(GuidProtocolPpi.GetNotify()).lower()],
+                    ['SupArchList',
+                     GetStringOfList(GuidProtocolPpi.GetSupArchList())],
+                    ['FeatureFlag', ConvertNEToNOTEQ(
+                        GuidProtocolPpi.GetFeatureFlag())]
+                 ]
             NodeList = [['CName', GuidProtocolPpi.GetCName()],
-                       ]
+                        ]
 
         for Item in GuidProtocolPpi.GetHelpTextList():
             Tmp = HelpTextXml()
@@ -223,8 +238,10 @@ class ProtocolXml(GuidProtocolPpiXml):
 
         return Root
 ##
-#Ppi Xml
+# Ppi Xml
 #
+
+
 class PpiXml(GuidProtocolPpiXml):
     def __init__(self, Mode):
         GuidProtocolPpiXml.__init__(self, Mode)
@@ -246,7 +263,8 @@ class PpiXml(GuidProtocolPpiXml):
             else:
                 GuidProtocolPpi.SetNotify('')
             GuidProtocolPpi.SetSupArchList(self.CommonDefines.SupArchList)
-            GuidProtocolPpi.SetFeatureFlag(ConvertNOTEQToNE(self.CommonDefines.FeatureFlag))
+            GuidProtocolPpi.SetFeatureFlag(
+                ConvertNOTEQToNE(self.CommonDefines.FeatureFlag))
             GuidProtocolPpi.SetCName(self.CName)
 
         return GuidProtocolPpi
@@ -254,22 +272,23 @@ class PpiXml(GuidProtocolPpiXml):
     def ToXml(self, GuidProtocolPpi, Key):
         if self.Mode == 'Package':
             AttributeList = \
-            [['SupArchList', \
-              GetStringOfList(GuidProtocolPpi.GetSupArchList())],
-            ]
+                [['SupArchList',
+                  GetStringOfList(GuidProtocolPpi.GetSupArchList())],
+                 ]
             NodeList = [['CName', GuidProtocolPpi.GetCName()],
                         ['GuidValue', GuidProtocolPpi.GetGuid()],
-                       ]
+                        ]
         else:
             AttributeList = \
-            [['Usage', GetStringOfList(GuidProtocolPpi.GetUsage())], \
-             ['Notify', str(GuidProtocolPpi.GetNotify()).lower()], \
-             ['SupArchList', \
-              GetStringOfList(GuidProtocolPpi.GetSupArchList())], \
-              ['FeatureFlag', ConvertNEToNOTEQ(GuidProtocolPpi.GetFeatureFlag())]
-            ]
+                [['Usage', GetStringOfList(GuidProtocolPpi.GetUsage())],
+                 ['Notify', str(GuidProtocolPpi.GetNotify()).lower()],
+                    ['SupArchList',
+                     GetStringOfList(GuidProtocolPpi.GetSupArchList())],
+                    ['FeatureFlag', ConvertNEToNOTEQ(
+                        GuidProtocolPpi.GetFeatureFlag())]
+                 ]
             NodeList = [['CName', GuidProtocolPpi.GetCName()],
-                       ]
+                        ]
 
         for Item in GuidProtocolPpi.GetHelpTextList():
             Tmp = HelpTextXml()
diff --git a/BaseTools/Source/Python/UPT/Xml/IniToXml.py b/BaseTools/Source/Python/UPT/Xml/IniToXml.py
index 3dc4001313de..ce5366d755a2 100644
--- a/BaseTools/Source/Python/UPT/Xml/IniToXml.py
+++ b/BaseTools/Source/Python/UPT/Xml/IniToXml.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is for converting package information data file to xml file.
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -28,38 +28,46 @@ from Library.StringUtils import ConvertSpecialChar
 from Library.ParserValidate import IsValidPath
 from Library import GlobalData
 
-## log error:
+# log error:
 #
 # @param error: error
 # @param File: File
 # @param Line: Line
 #
+
+
 def IniParseError(Error, File, Line):
     Logger.Error("UPT", UPT_INI_PARSE_ERROR, File=File,
                  Line=Line, ExtraData=Error)
 
-## __ValidatePath
+# __ValidatePath
 #
 # @param Path: Path to be checked
 #
+
+
 def __ValidatePath(Path, Root):
     Path = Path.strip()
     if os.path.isabs(Path) or not IsValidPath(Path, Root):
         return False, ST.ERR_FILELIST_LOCATION % (Root, Path)
     return True, ''
 
-## ValidateMiscFile
+# ValidateMiscFile
 #
 # @param Filename: File to be checked
 #
+
+
 def ValidateMiscFile(Filename):
     Root = GlobalData.gWORKSPACE
     return __ValidatePath(Filename, Root)
 
-## ValidateToolsFile
+# ValidateToolsFile
 #
 # @param Filename: File to be checked
 #
+
+
 def ValidateToolsFile(Filename):
     Valid, Cause = False, ''
     if not Valid and 'EDK_TOOLS_PATH' in os.environ:
@@ -68,13 +76,15 @@ def ValidateToolsFile(Filename):
         Valid, Cause = __ValidatePath(Filename, GlobalData.gWORKSPACE)
     return Valid, Cause
 
-## ParseFileList
+# ParseFileList
 #
 # @param Line: Line
 # @param Map: Map
 # @param CurrentKey: CurrentKey
 # @param PathFunc: Path validate function
 #
+
+
 def ParseFileList(Line, Map, CurrentKey, PathFunc):
     FileList = ["", {}]
     TokenList = Line.split(TAB_VALUE_SPLIT)
@@ -107,11 +117,13 @@ def ParseFileList(Line, Map, CurrentKey, PathFunc):
         Map[CurrentKey].append(FileList)
     return True, ''
 
-## Create header XML file
+# Create header XML file
 #
 # @param DistMap: DistMap
 # @param Root: Root
 #
+
+
 def CreateHeaderXml(DistMap, Root):
     Element1 = CreateXmlElement('Name', DistMap['Name'],
                                 [], [['BaseName', DistMap['BaseName']]])
@@ -133,12 +145,14 @@ def CreateHeaderXml(DistMap, Root):
     Root.appendChild(CreateXmlElement('DistributionHeader', '',
                                       NodeList, AttributeList))
 
-## Create tools XML file
+# Create tools XML file
 #
 # @param Map: Map
 # @param Root: Root
 # @param Tag: Tag
 #
+
+
 def CreateToolsXml(Map, Root, Tag):
     #
     # Check if all elements in this section are empty
@@ -154,7 +168,7 @@ def CreateToolsXml(Map, Root, Tag):
                 ['License', Map['License']],
                 ['Abstract', Map['Abstract']],
                 ['Description', Map['Description']],
-               ]
+                ]
     HeaderNode = CreateXmlElement('Header', '', NodeList, [])
     NodeList = [HeaderNode]
 
@@ -165,12 +179,14 @@ def CreateToolsXml(Map, Root, Tag):
         NodeList.append(CreateXmlElement('Filename', File[0], [], AttrList))
     Root.appendChild(CreateXmlElement(Tag, '', NodeList, []))
 
-## ValidateValues
+# ValidateValues
 #
 # @param Key: Key
 # @param Value: Value
 # @param SectionName: SectionName
 #
+
+
 def ValidateValues(Key, Value, SectionName):
     if SectionName == 'DistributionHeader':
         Valid, Cause = ValidateRegValues(Key, Value)
@@ -185,24 +201,26 @@ def ValidateValues(Key, Value, SectionName):
             return Valid, ST.ERR_VALUE_INVALID % (Key, SectionName)
     return True, ''
 
-## ValidateRegValues
+# ValidateRegValues
 #
 # @param Key: Key
 # @param Value: Value
 #
+
+
 def ValidateRegValues(Key, Value):
     ValidateMap = {
-        'ReadOnly'  :
+        'ReadOnly':
             ('true|false', ST.ERR_BOOLEAN_VALUE % (Key, Value)),
-        'RePackage' :
+        'RePackage':
             ('true|false', ST.ERR_BOOLEAN_VALUE % (Key, Value)),
-        'GUID'      :
+        'GUID':
             ('[a-fA-F0-9]{8}-[a-fA-F0-9]{4}-[a-fA-F0-9]{4}'
-            '-[a-fA-F0-9]{4}-[a-fA-F0-9]{12}',
-            ST.ERR_GUID_VALUE % Value),
-        'Version'   :   ('[0-9]+(\.[0-9]+)?', ST.ERR_VERSION_VALUE % \
-                         (Key, Value)),
-        'XmlSpecification' : ('1\.1', ST.ERR_VERSION_XMLSPEC % Value)
+             '-[a-fA-F0-9]{4}-[a-fA-F0-9]{12}',
+             ST.ERR_GUID_VALUE % Value),
+        'Version':   ('[0-9]+(\.[0-9]+)?', ST.ERR_VERSION_VALUE %
+                      (Key, Value)),
+        'XmlSpecification': ('1\.1', ST.ERR_VERSION_XMLSPEC % Value)
     }
     if Key not in ValidateMap:
         return True, ''
@@ -212,10 +230,12 @@ def ValidateRegValues(Key, Value):
         return True, ''
     return False, Elem[1]
 
-## __ValidateDistHeaderName
+# __ValidateDistHeaderName
 #
 # @param Name: Name
 #
+
+
 def __ValidateDistHeaderName(Name):
     if len(Name) < 1:
         return False
@@ -225,10 +245,12 @@ def __ValidateDistHeaderName(Name):
             return False
     return True
 
-## __ValidateDistHeaderBaseName
+# __ValidateDistHeaderBaseName
 #
 # @param BaseName: BaseName
 #
+
+
 def __ValidateDistHeaderBaseName(BaseName):
     if not BaseName:
         return False
@@ -241,78 +263,88 @@ def __ValidateDistHeaderBaseName(BaseName):
             return False
     return True
 
-## __ValidateDistHeaderAbstract
+# __ValidateDistHeaderAbstract
 #
 # @param Abstract: Abstract
 #
+
+
 def __ValidateDistHeaderAbstract(Abstract):
     return '\t' not in Abstract and len(Abstract.splitlines()) == 1
 
-## __ValidateOtherHeaderAbstract
+# __ValidateOtherHeaderAbstract
 #
 # @param Abstract: Abstract
 #
+
+
 def __ValidateOtherHeaderAbstract(Abstract):
     return __ValidateDistHeaderAbstract(Abstract)
 
-## __ValidateDistHeader
+# __ValidateDistHeader
 #
 # @param Key: Key
 # @param Value: Value
 #
+
+
 def __ValidateDistHeader(Key, Value):
     ValidateMap = {
-        'Name'      : __ValidateDistHeaderName,
-        'BaseName'  : __ValidateDistHeaderBaseName,
-        'Abstract'  : __ValidateDistHeaderAbstract,
-        'Vendor'    : __ValidateDistHeaderAbstract
+        'Name': __ValidateDistHeaderName,
+        'BaseName': __ValidateDistHeaderBaseName,
+        'Abstract': __ValidateDistHeaderAbstract,
+        'Vendor': __ValidateDistHeaderAbstract
     }
     return not (Value and Key in ValidateMap and not ValidateMap[Key](Value))
 
-## __ValidateOtherHeader
+# __ValidateOtherHeader
 #
 # @param Key: Key
 # @param Value: Value
 #
+
+
 def __ValidateOtherHeader(Key, Value):
     ValidateMap = {
-        'Name'      : __ValidateDistHeaderName,
-        'Abstract'  : __ValidateOtherHeaderAbstract
+        'Name': __ValidateDistHeaderName,
+        'Abstract': __ValidateOtherHeaderAbstract
     }
     return not (Value and Key in ValidateMap and not ValidateMap[Key](Value))
 
-## Convert ini file to xml file
+# Convert ini file to xml file
 #
 # @param IniFile
 #
+
+
 def IniToXml(IniFile):
     if not os.path.exists(IniFile):
         Logger.Error("UPT", FILE_NOT_FOUND, ST.ERR_TEMPLATE_NOTFOUND % IniFile)
 
-    DistMap = {'ReadOnly' : '', 'RePackage' : '', 'Name' : '',
-               'BaseName' : '', 'GUID' : '', 'Version' : '', 'Vendor' : '',
-               'Date' : '', 'Copyright' : '', 'License' : '', 'Abstract' : '',
-               'Description' : '', 'Signature' : '', 'XmlSpecification' : ''
-                }
+    DistMap = {'ReadOnly': '', 'RePackage': '', 'Name': '',
+               'BaseName': '', 'GUID': '', 'Version': '', 'Vendor': '',
+               'Date': '', 'Copyright': '', 'License': '', 'Abstract': '',
+               'Description': '', 'Signature': '', 'XmlSpecification': ''
+               }
 
-    ToolsMap = {'Name' : '', 'Copyright' : '', 'License' : '',
-                'Abstract' : '', 'Description' : '', 'FileList' : []}
+    ToolsMap = {'Name': '', 'Copyright': '', 'License': '',
+                'Abstract': '', 'Description': '', 'FileList': []}
     #
     # Only FileList is a list: [['file1', {}], ['file2', {}], ...]
     #
-    MiscMap = {'Name' : '', 'Copyright' : '', 'License' : '',
-               'Abstract' : '', 'Description' : '', 'FileList' : []}
+    MiscMap = {'Name': '', 'Copyright': '', 'License': '',
+               'Abstract': '', 'Description': '', 'FileList': []}
 
     SectionMap = {
-                   'DistributionHeader' : DistMap,
-                   'ToolsHeader' : ToolsMap,
-                   'MiscellaneousFilesHeader' : MiscMap
-                   }
+        'DistributionHeader': DistMap,
+        'ToolsHeader': ToolsMap,
+        'MiscellaneousFilesHeader': MiscMap
+    }
 
     PathValidator = {
-                'ToolsHeader' : ValidateToolsFile,
-                'MiscellaneousFilesHeader' : ValidateMiscFile
-                }
+        'ToolsHeader': ValidateToolsFile,
+        'MiscellaneousFilesHeader': ValidateMiscFile
+    }
 
     ParsedSection = []
 
@@ -332,11 +364,11 @@ def IniToXml(IniFile):
             SectionName = Line[1:-1].strip()
             if SectionName not in SectionMap:
                 IniParseError(ST.ERR_SECTION_NAME_INVALID % SectionName,
-                      IniFile, Index+1)
+                              IniFile, Index+1)
 
             if SectionName in ParsedSection:
                 IniParseError(ST.ERR_SECTION_REDEFINE % SectionName,
-                      IniFile, Index+1)
+                              IniFile, Index+1)
             else:
                 ParsedSection.append(SectionName)
 
@@ -419,7 +451,7 @@ def IniToXml(IniFile):
     return CreateXml(DistMap, ToolsMap, MiscMap, IniFile)
 
 
-## CheckMdtKeys
+# CheckMdtKeys
 #
 # @param MdtDistKeys: All mandatory keys
 # @param DistMap: Dist content
@@ -428,7 +460,8 @@ def IniToXml(IniFile):
 # @param Maps: Tools and Misc section name and map. (('section_name', map),*)
 #
 def CheckMdtKeys(DistMap, IniFile, LastIndex, Maps):
-    MdtDistKeys = ['Name', 'GUID', 'Version', 'Vendor', 'Copyright', 'License', 'Abstract', 'XmlSpecification']
+    MdtDistKeys = ['Name', 'GUID', 'Version', 'Vendor',
+                   'Copyright', 'License', 'Abstract', 'XmlSpecification']
     for Key in MdtDistKeys:
         if Key not in DistMap or DistMap[Key] == '':
             IniParseError(ST.ERR_KEYWORD_MANDATORY % Key, IniFile, LastIndex+1)
@@ -460,22 +493,26 @@ def CheckMdtKeys(DistMap, IniFile, LastIndex, Maps):
                 NonEmptyKey += 1
 
         if NonEmptyKey > 0 and not Map['FileList']:
-            IniParseError(ST.ERR_KEYWORD_MANDATORY % (Item[0] + '.FileList'), IniFile, LastIndex+1)
+            IniParseError(ST.ERR_KEYWORD_MANDATORY %
+                          (Item[0] + '.FileList'), IniFile, LastIndex+1)
 
         if NonEmptyKey > 0 and not Map['Name']:
-            IniParseError(ST.ERR_KEYWORD_MANDATORY % (Item[0] + '.Name'), IniFile, LastIndex+1)
+            IniParseError(ST.ERR_KEYWORD_MANDATORY %
+                          (Item[0] + '.Name'), IniFile, LastIndex+1)
 
-## CreateXml
+# CreateXml
 #
 # @param DistMap:  Dist Content
 # @param ToolsMap: Tools Content
 # @param MiscMap:  Misc Content
 # @param IniFile:  Ini File
 #
+
+
 def CreateXml(DistMap, ToolsMap, MiscMap, IniFile):
     Attrs = [['xmlns', 'http://www.uefi.org/2011/1.1'],
              ['xmlns:xsi', 'http:/www.w3.org/2001/XMLSchema-instance'],
-            ]
+             ]
     Root = CreateXmlElement('DistributionPackage', '', [], Attrs)
     CreateHeaderXml(DistMap, Root)
     CreateToolsXml(ToolsMap, Root, 'Tools')
@@ -489,8 +526,7 @@ def CreateXml(DistMap, ToolsMap, MiscMap, IniFile):
     File = open(FileName, 'w')
 
     try:
-        File.write(Root.toprettyxml(indent = '  '))
+        File.write(Root.toprettyxml(indent='  '))
     finally:
         File.close()
     return FileName
-
diff --git a/BaseTools/Source/Python/UPT/Xml/ModuleSurfaceAreaXml.py b/BaseTools/Source/Python/UPT/Xml/ModuleSurfaceAreaXml.py
index 741da908047c..f1548725219e 100644
--- a/BaseTools/Source/Python/UPT/Xml/ModuleSurfaceAreaXml.py
+++ b/BaseTools/Source/Python/UPT/Xml/ModuleSurfaceAreaXml.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to parse a Module file of .PKG file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -48,7 +48,7 @@ from Xml.XmlParserMisc import GetHelpTextList
 from Library import GlobalData
 from Library.Misc import GetSplitValueList
 
-##   BinaryFileXml
+# BinaryFileXml
 #
 #    represent the following XML item
 #
@@ -62,6 +62,8 @@ from Library.Misc import GetSplitValueList
 #    <AsBuilt> ... </AsBuilt> {0,}
 #    </BinaryFile> {1,}
 #
+
+
 class BinaryFileXml(object):
     def __init__(self):
         self.FileNames = []
@@ -117,7 +119,8 @@ class BinaryFileXml(object):
                 BuildFlagList = []
                 for SubItem in XmlList(Item, '%s/AsBuilt/BuildFlags' % Key):
                     BuildFlag = BuildFlagXml()
-                    BuildFlagList.append(BuildFlag.FromXml2(SubItem, 'BuildFlags'))
+                    BuildFlagList.append(
+                        BuildFlag.FromXml2(SubItem, 'BuildFlags'))
                 AsBuilt.SetBuildFlagsList(BuildFlagList)
                 AsBuiltList.append(AsBuilt)
             BinaryFile.SetAsBuiltList(AsBuiltList)
@@ -156,17 +159,20 @@ class BinaryFileXml(object):
         for LibGuidVer in LibGuidVerList:
             if LibGuidVer.GetLibGuid() and IsMatchArch(LibGuidVer.GetSupArchList(), SupportArch):
                 GuiVerElem = \
-                CreateXmlElement('GUID', LibGuidVer.GetLibGuid(), [], [['Version', LibGuidVer.GetLibVersion()]])
+                    CreateXmlElement('GUID', LibGuidVer.GetLibGuid(), [], [
+                                     ['Version', LibGuidVer.GetLibVersion()]])
                 GuiVerElemList.append(GuiVerElem)
         if len(GuiVerElemList) > 0:
-            LibGuidVerElem = CreateXmlElement('LibraryInstances', '', GuiVerElemList, [])
+            LibGuidVerElem = CreateXmlElement(
+                'LibraryInstances', '', GuiVerElemList, [])
             AsBuiltNodeList.append(LibGuidVerElem)
 
         for BuildFlag in BuildFlagList:
             if IsMatchArch(BuildFlag.GetSupArchList(), SupportArch):
                 for Item in BuildFlag.GetAsBuildList():
                     Tmp = BuildFlagXml()
-                    Elem = CreateXmlElement('BuildFlags', ''.join(Item), [], [])
+                    Elem = CreateXmlElement(
+                        'BuildFlags', ''.join(Item), [], [])
                     AsBuiltNodeList.append(Elem)
 
         if len(AsBuiltNodeList) > 0:
@@ -194,6 +200,8 @@ class BinaryFileXml(object):
 ##
 # PackageXml
 #
+
+
 class PackageXml(object):
     def __init__(self):
         self.Description = ''
@@ -211,7 +219,8 @@ class PackageXml(object):
         PackageDependency.SetPackage(self.Description)
         PackageDependency.SetGuid(self.Guid)
         PackageDependency.SetVersion(self.Version)
-        PackageDependency.SetFeatureFlag(ConvertNOTEQToNE(self.CommonDefines.FeatureFlag))
+        PackageDependency.SetFeatureFlag(
+            ConvertNOTEQToNE(self.CommonDefines.FeatureFlag))
         PackageDependency.SetSupArchList(self.CommonDefines.SupArchList)
 
         return PackageDependency
@@ -223,7 +232,8 @@ class PackageXml(object):
                          ['FeatureFlag', ConvertNEToNOTEQ(PackageDependency.GetFeatureFlag())], ]
         Element1 = CreateXmlElement('GUID', PackageDependency.GetGuid(), [],
                                     [['Version', PackageDependency.GetVersion()]])
-        NodeList = [['Description', PackageDependency.GetPackage()], Element1, ]
+        NodeList = [['Description', PackageDependency.GetPackage()],
+                    Element1, ]
         Root = CreateXmlElement('%s' % Key, '', NodeList, AttributeList)
 
         return Root
@@ -235,6 +245,8 @@ class PackageXml(object):
 ##
 # ExternXml
 #
+
+
 class ExternXml(object):
     def __init__(self):
         self.CommonDefines = CommonDefinesXml()
@@ -290,6 +302,8 @@ class ExternXml(object):
 ##
 # DepexXml
 #
+
+
 class DepexXml(object):
     def __init__(self):
         self.CommonDefines = CommonDefinesXml()
@@ -337,6 +351,8 @@ class DepexXml(object):
 ##
 # BootModeXml
 #
+
+
 class BootModeXml(object):
     def __init__(self):
         self.SupportedBootModes = ''
@@ -345,7 +361,7 @@ class BootModeXml(object):
 
     def FromXml(self, Item, Key):
         self.SupportedBootModes = \
-        XmlElement(Item, '%s/SupportedBootModes' % Key)
+            XmlElement(Item, '%s/SupportedBootModes' % Key)
         self.CommonDefines.FromXml(Item, Key)
         for HelpTextItem in XmlList(Item, '%s/HelpText' % Key):
             HelpTextObj = HelpTextXml()
@@ -372,13 +388,16 @@ class BootModeXml(object):
         return Root
 
     def __str__(self):
-        Str = "SupportedBootModes = %s %s" % (self.SupportedBootModes, self.CommonDefines)
+        Str = "SupportedBootModes = %s %s" % (
+            self.SupportedBootModes, self.CommonDefines)
         for Item in self.HelpText:
             Str = Str + '\n\t' + str(Item)
         return Str
 ##
 # EventXml
 #
+
+
 class EventXml(object):
     def __init__(self):
         self.EventType = ''
@@ -407,7 +426,7 @@ class EventXml(object):
             pass
         AttributeList = [['EventType', Event.GetEventType()],
                          ['Usage', Event.GetUsage()],
-                        ]
+                         ]
         NodeList = []
         for Item in Event.GetHelpTextList():
             Tmp = HelpTextXml()
@@ -424,6 +443,8 @@ class EventXml(object):
 ##
 # HobXml
 #
+
+
 class HobXml(object):
     def __init__(self):
         self.HobType = ''
@@ -471,6 +492,8 @@ class HobXml(object):
 ##
 # SourceFileXml
 #
+
+
 class SourceFileXml(object):
     def __init__(self):
         self.SourceFile = ''
@@ -483,7 +506,8 @@ class SourceFileXml(object):
         self.SourceFile = XmlElement(Item, 'Filename')
         self.CommonDefines.FromXml(Item, Key)
 
-        self.CommonDefines.FeatureFlag = ConvertNOTEQToNE(self.CommonDefines.FeatureFlag)
+        self.CommonDefines.FeatureFlag = ConvertNOTEQToNE(
+            self.CommonDefines.FeatureFlag)
 
         SourceFile = SourceFileObject()
         SourceFile.SetSourceFile(self.SourceFile)
@@ -500,12 +524,15 @@ class SourceFileXml(object):
         AttributeList = [['SupArchList', GetStringOfList(SourceFile.GetSupArchList())],
                          ['Family', SourceFile.GetFamily()],
                          ['FeatureFlag', FeatureFlag], ]
-        Root = CreateXmlElement('%s' % Key, SourceFile.GetSourceFile(), [], AttributeList)
+        Root = CreateXmlElement(
+            '%s' % Key, SourceFile.GetSourceFile(), [], AttributeList)
         return Root
 
 ##
 # ModulePropertyXml
 #
+
+
 class ModulePropertyXml(object):
     def __init__(self):
         self.CommonDefines = CommonDefinesXml()
@@ -525,11 +552,14 @@ class ModulePropertyXml(object):
         self.ModuleType = XmlElement(Item, '%s/ModuleType' % Key)
         self.Path = XmlElement(Item, '%s/Path' % Key)
         self.PcdIsDriver = XmlElement(Item, '%s/PcdIsDriver' % Key)
-        self.UefiSpecificationVersion = XmlElement(Item, '%s/UefiSpecificationVersion' % Key)
-        self.PiSpecificationVersion = XmlElement(Item, '%s/PiSpecificationVersion' % Key)
+        self.UefiSpecificationVersion = XmlElement(
+            Item, '%s/UefiSpecificationVersion' % Key)
+        self.PiSpecificationVersion = XmlElement(
+            Item, '%s/PiSpecificationVersion' % Key)
         for SubItem in XmlList(Item, '%s/Specification' % Key):
             Specification = XmlElement(SubItem, '/Specification')
-            Version = XmlAttribute(XmlNode(SubItem, '/Specification'), 'Version')
+            Version = XmlAttribute(
+                XmlNode(SubItem, '/Specification'), 'Version')
             self.SpecificationList.append((Specification, Version))
         for SubItem in XmlList(Item, '%s/BootMode' % Key):
             Axml = BootModeXml()
@@ -558,21 +588,23 @@ class ModulePropertyXml(object):
 
         return Header, self.BootModes, self.Events, self.HOBs
 
-
     def ToXml(self, Header, BootModes, Events, Hobs, Key):
         if self.ModuleType:
             pass
-        AttributeList = [['SupArchList', GetStringOfList(Header.GetSupArchList())], ]
+        AttributeList = [
+            ['SupArchList', GetStringOfList(Header.GetSupArchList())], ]
 
         NodeList = [['ModuleType', Header.GetModuleType()],
                     ['Path', Header.GetModulePath()],
                     ['PcdIsDriver', Header.GetPcdIsDriver()],
-                    ['UefiSpecificationVersion', Header.GetUefiSpecificationVersion()],
+                    ['UefiSpecificationVersion',
+                        Header.GetUefiSpecificationVersion()],
                     ['PiSpecificationVersion', Header.GetPiSpecificationVersion()],
-                   ]
+                    ]
         for Item in Header.GetSpecList():
             Spec, Version = Item
-            SpecElem = CreateXmlElement('Specification', Spec, [], [['Version', Version]])
+            SpecElem = CreateXmlElement(
+                'Specification', Spec, [], [['Version', Version]])
             NodeList.append(SpecElem)
 
         for Item in BootModes:
@@ -591,9 +623,9 @@ class ModulePropertyXml(object):
     def __str__(self):
         Str = "ModuleType = %s Path = %s PcdIsDriver = %s UefiSpecificationVersion = %s PiSpecificationVersion = %s \
                Specification = %s SpecificationVersion = %s %s" % \
-        (self.ModuleType, self.Path, self.PcdIsDriver, \
-         self.UefiSpecificationVersion, self.PiSpecificationVersion, \
-         self.SpecificationList, self.SpecificationVersion, self.CommonDefines)
+            (self.ModuleType, self.Path, self.PcdIsDriver,
+             self.UefiSpecificationVersion, self.PiSpecificationVersion,
+             self.SpecificationList, self.SpecificationVersion, self.CommonDefines)
         for Item in self.BootModes:
             Str = Str + '\n\t' + str(Item)
         for Item in self.Events:
@@ -605,6 +637,8 @@ class ModulePropertyXml(object):
 ##
 # ModuleXml
 #
+
+
 class ModuleSurfaceAreaXml(object):
     def __init__(self, Package=''):
         self.Module = None
@@ -649,7 +683,8 @@ class ModuleSurfaceAreaXml(object):
         #
         # MiscellaneousFile
         Tmp = MiscellaneousFileXml()
-        MiscFileList = Tmp.FromXml(XmlNode(Item, '/ModuleSurfaceArea/MiscellaneousFiles'), 'MiscellaneousFiles')
+        MiscFileList = Tmp.FromXml(
+            XmlNode(Item, '/ModuleSurfaceArea/MiscellaneousFiles'), 'MiscellaneousFiles')
         if MiscFileList:
             Module.SetMiscFileList([MiscFileList])
         else:
@@ -661,7 +696,8 @@ class ModuleSurfaceAreaXml(object):
         for Item in XmlList(Item, '/ModuleSurfaceArea/UserExtensions'):
             Tmp = UserExtensionsXml()
             UserExtension = Tmp.FromXml(Item, 'UserExtensions')
-            Module.SetUserExtensionList(Module.GetUserExtensionList() + [UserExtension])
+            Module.SetUserExtensionList(
+                Module.GetUserExtensionList() + [UserExtension])
 
         return Module
 
@@ -671,7 +707,8 @@ class ModuleSurfaceAreaXml(object):
         # Header
         #
         Tmp = HeaderXml()
-        Module = Tmp.FromXml(XmlNode(Item, '/%s/Header' % Key), 'Header', True, IsStandAlongModule)
+        Module = Tmp.FromXml(XmlNode(Item, '/%s/Header' %
+                             Key), 'Header', True, IsStandAlongModule)
         Module.SetBinaryModule(IsBinaryModule)
 
         if IsBinaryModule:
@@ -682,7 +719,8 @@ class ModuleSurfaceAreaXml(object):
         #
         Tmp = ModulePropertyXml()
         (Module, BootModes, Events, HOBs) = \
-        Tmp.FromXml(XmlNode(Item, '/ModuleSurfaceArea/ModuleProperties'), 'ModuleProperties', Module)
+            Tmp.FromXml(XmlNode(
+                Item, '/ModuleSurfaceArea/ModuleProperties'), 'ModuleProperties', Module)
         Module.SetBootModeList(BootModes)
         Module.SetEventList(Events)
         Module.SetHobList(HOBs)
@@ -690,7 +728,8 @@ class ModuleSurfaceAreaXml(object):
         # ClonedFrom
         #
         Tmp = ClonedFromXml()
-        ClonedFrom = Tmp.FromXml(XmlNode(Item, '/ModuleSurfaceArea/ClonedFrom'), 'ClonedFrom')
+        ClonedFrom = Tmp.FromXml(
+            XmlNode(Item, '/ModuleSurfaceArea/ClonedFrom'), 'ClonedFrom')
         if ClonedFrom:
             Module.SetClonedFrom(ClonedFrom)
 
@@ -700,7 +739,8 @@ class ModuleSurfaceAreaXml(object):
         for SubItem in XmlList(Item, '/ModuleSurfaceArea/LibraryClassDefinitions/LibraryClass'):
             Tmp = LibraryClassXml()
             LibraryClass = Tmp.FromXml(SubItem, 'LibraryClass')
-            Module.SetLibraryClassList(Module.GetLibraryClassList() + [LibraryClass])
+            Module.SetLibraryClassList(
+                Module.GetLibraryClassList() + [LibraryClass])
 
         if XmlList(Item, '/ModuleSurfaceArea/LibraryClassDefinitions') and \
            not XmlList(Item, '/ModuleSurfaceArea/LibraryClassDefinitions/LibraryClass'):
@@ -715,7 +755,7 @@ class ModuleSurfaceAreaXml(object):
             Module.SetSourceFileList(Module.GetSourceFileList() + [SourceFile])
 
         if XmlList(Item, '/ModuleSurfaceArea/SourceFiles') and \
-           not XmlList(Item, '/ModuleSurfaceArea/SourceFiles/Filename') :
+           not XmlList(Item, '/ModuleSurfaceArea/SourceFiles/Filename'):
             Module.SetSourceFileList([None])
 
         #
@@ -727,7 +767,7 @@ class ModuleSurfaceAreaXml(object):
             Module.SetBinaryFileList(Module.GetBinaryFileList() + [BinaryFile])
 
         if XmlList(Item, '/ModuleSurfaceArea/BinaryFiles') and \
-           not XmlList(Item, '/ModuleSurfaceArea/BinaryFiles/BinaryFile') :
+           not XmlList(Item, '/ModuleSurfaceArea/BinaryFiles/BinaryFile'):
             Module.SetBinaryFileList([None])
         #
         # PackageDependencies
@@ -735,7 +775,8 @@ class ModuleSurfaceAreaXml(object):
         for SubItem in XmlList(Item, '/ModuleSurfaceArea/PackageDependencies/Package'):
             Tmp = PackageXml()
             PackageDependency = Tmp.FromXml(SubItem, 'Package')
-            Module.SetPackageDependencyList(Module.GetPackageDependencyList() + [PackageDependency])
+            Module.SetPackageDependencyList(
+                Module.GetPackageDependencyList() + [PackageDependency])
 
         if XmlList(Item, '/ModuleSurfaceArea/PackageDependencies') and \
            not XmlList(Item, '/ModuleSurfaceArea/PackageDependencies/Package'):
@@ -758,7 +799,8 @@ class ModuleSurfaceAreaXml(object):
         for SubItem in XmlList(Item, '/ModuleSurfaceArea/Protocols/Protocol'):
             Tmp = ProtocolXml('Module')
             GuidProtocolPpi = Tmp.FromXml(SubItem, 'Protocol')
-            Module.SetProtocolList(Module.GetProtocolList() + [GuidProtocolPpi])
+            Module.SetProtocolList(
+                Module.GetProtocolList() + [GuidProtocolPpi])
 
         if XmlList(Item, '/ModuleSurfaceArea/Protocols') and not XmlList(Item, '/ModuleSurfaceArea/Protocols/Protocol'):
             Module.SetProtocolList([None])
@@ -795,7 +837,7 @@ class ModuleSurfaceAreaXml(object):
                 Module.SetPcdList(Module.GetPcdList() + [PcdEntry])
 
             if XmlList(Item, '/ModuleSurfaceArea/PcdCoded') and \
-                not XmlList(Item, '/ModuleSurfaceArea/PcdCoded/PcdEntry'):
+                    not XmlList(Item, '/ModuleSurfaceArea/PcdCoded/PcdEntry'):
                 Module.SetPcdList([None])
 
         Module = self.FromXml2(Item, Module)
@@ -824,21 +866,24 @@ class ModuleSurfaceAreaXml(object):
         # ModuleProperties
         #
         Tmp = ModulePropertyXml()
-        DomModule.appendChild(Tmp.ToXml(Module, Module.GetBootModeList(), Module.GetEventList(), Module.GetHobList(), \
+        DomModule.appendChild(Tmp.ToXml(Module, Module.GetBootModeList(), Module.GetEventList(), Module.GetHobList(),
                                         'ModuleProperties'))
         #
         # ClonedFrom
         #
         Tmp = ClonedFromXml()
         if Module.GetClonedFrom():
-            DomModule.appendChild(Tmp.ToXml(Module.GetClonedFrom(), 'ClonedFrom'))
+            DomModule.appendChild(
+                Tmp.ToXml(Module.GetClonedFrom(), 'ClonedFrom'))
         #
         # LibraryClass
         #
-        LibraryClassNode = CreateXmlElement('LibraryClassDefinitions', '', [], [])
+        LibraryClassNode = CreateXmlElement(
+            'LibraryClassDefinitions', '', [], [])
         for LibraryClass in Module.GetLibraryClassList():
             Tmp = LibraryClassXml()
-            LibraryClassNode.appendChild(Tmp.ToXml2(LibraryClass, 'LibraryClass'))
+            LibraryClassNode.appendChild(
+                Tmp.ToXml2(LibraryClass, 'LibraryClass'))
         DomModule.appendChild(LibraryClassNode)
         #
         # SourceFile
@@ -859,10 +904,12 @@ class ModuleSurfaceAreaXml(object):
         #
         # PackageDependencies
         #
-        PackageDependencyNode = CreateXmlElement('PackageDependencies', '', [], [])
+        PackageDependencyNode = CreateXmlElement(
+            'PackageDependencies', '', [], [])
         for PackageDependency in Module.GetPackageDependencyList():
             Tmp = PackageXml()
-            PackageDependencyNode.appendChild(Tmp.ToXml(PackageDependency, 'Package'))
+            PackageDependencyNode.appendChild(
+                Tmp.ToXml(PackageDependency, 'Package'))
         DomModule.appendChild(PackageDependencyNode)
 
         #
@@ -871,7 +918,8 @@ class ModuleSurfaceAreaXml(object):
         GuidProtocolPpiNode = CreateXmlElement('Guids', '', [], [])
         for GuidProtocolPpi in Module.GetGuidList():
             Tmp = GuidXml('Module')
-            GuidProtocolPpiNode.appendChild(Tmp.ToXml(GuidProtocolPpi, 'GuidCName'))
+            GuidProtocolPpiNode.appendChild(
+                Tmp.ToXml(GuidProtocolPpi, 'GuidCName'))
         DomModule.appendChild(GuidProtocolPpiNode)
 
         #
@@ -880,7 +928,8 @@ class ModuleSurfaceAreaXml(object):
         GuidProtocolPpiNode = CreateXmlElement('Protocols', '', [], [])
         for GuidProtocolPpi in Module.GetProtocolList():
             Tmp = ProtocolXml('Module')
-            GuidProtocolPpiNode.appendChild(Tmp.ToXml(GuidProtocolPpi, 'Protocol'))
+            GuidProtocolPpiNode.appendChild(
+                Tmp.ToXml(GuidProtocolPpi, 'Protocol'))
         DomModule.appendChild(GuidProtocolPpiNode)
 
         #
@@ -937,20 +986,24 @@ class ModuleSurfaceAreaXml(object):
         #
         if Module.GetMiscFileList():
             Tmp = MiscellaneousFileXml()
-            DomModule.appendChild(Tmp.ToXml(Module.GetMiscFileList()[0], 'MiscellaneousFiles'))
+            DomModule.appendChild(
+                Tmp.ToXml(Module.GetMiscFileList()[0], 'MiscellaneousFiles'))
         #
         # UserExtensions
         #
         if Module.GetUserExtensionList():
             for UserExtension in Module.GetUserExtensionList():
                 Tmp = UserExtensionsXml()
-                DomModule.appendChild(Tmp.ToXml(UserExtension, 'UserExtensions'))
+                DomModule.appendChild(
+                    Tmp.ToXml(UserExtension, 'UserExtensions'))
 
         return DomModule
 
 ##
 # BuildFlagXml used to generate BuildFlag for <AsBuilt>
 #
+
+
 class BuildFlagXml(object):
     def __init__(self):
         self.Target = ''
diff --git a/BaseTools/Source/Python/UPT/Xml/PackageSurfaceAreaXml.py b/BaseTools/Source/Python/UPT/Xml/PackageSurfaceAreaXml.py
index 103939023bcf..9e1df1faf56d 100644
--- a/BaseTools/Source/Python/UPT/Xml/PackageSurfaceAreaXml.py
+++ b/BaseTools/Source/Python/UPT/Xml/PackageSurfaceAreaXml.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to parse a Package file of .PKG file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -35,6 +35,8 @@ from Xml.PcdXml import PcdEntryXml
 ##
 # IndustryStandardHeaderXml
 #
+
+
 class IndustryStandardHeaderXml(object):
     def __init__(self):
         self.HeaderFile = ''
@@ -72,6 +74,8 @@ class IndustryStandardHeaderXml(object):
 ##
 # PackageIncludeHeaderXml
 #
+
+
 class PackageIncludeHeaderXml(object):
     def __init__(self):
         self.HeaderFile = ''
@@ -80,7 +84,8 @@ class PackageIncludeHeaderXml(object):
 
     def FromXml(self, Item, Key):
         self.HeaderFile = XmlElement(Item, '%s/HeaderFile' % Key)
-        self.CommonDefines.FromXml(XmlNode(Item, '%s/HeaderFile' % Key), 'HeaderFile')
+        self.CommonDefines.FromXml(
+            XmlNode(Item, '%s/HeaderFile' % Key), 'HeaderFile')
         for HelpTextItem in XmlList(Item, '%s/HelpText' % Key):
             HelpTextObj = HelpTextXml()
             HelpTextObj.FromXml(HelpTextItem, '%s/HelpText' % Key)
@@ -98,10 +103,11 @@ class PackageIncludeHeaderXml(object):
     def ToXml(self, PackageIncludeHeader, Key):
         if self.HeaderFile:
             pass
-        AttributeList = [['SupArchList', GetStringOfList(PackageIncludeHeader.GetSupArchList())], \
+        AttributeList = [['SupArchList', GetStringOfList(PackageIncludeHeader.GetSupArchList())],
                          ['SupModList', GetStringOfList(PackageIncludeHeader.GetSupModuleList())], ]
 
-        HeaderFileNode = CreateXmlElement('HeaderFile', PackageIncludeHeader.FilePath, [], AttributeList)
+        HeaderFileNode = CreateXmlElement(
+            'HeaderFile', PackageIncludeHeader.FilePath, [], AttributeList)
 
         NodeList = [HeaderFileNode]
         for Item in PackageIncludeHeader.GetHelpTextList():
@@ -121,6 +127,8 @@ class PackageIncludeHeaderXml(object):
 ##
 # PcdCheckXml
 #
+
+
 class PcdCheckXml(object):
     def __init__(self):
         self.PcdCheck = ''
@@ -144,6 +152,8 @@ class PcdCheckXml(object):
 ##
 # PackageSurfaceAreaXml
 #
+
+
 class PackageSurfaceAreaXml(object):
     def __init__(self):
         self.Package = None
@@ -159,13 +169,15 @@ class PackageSurfaceAreaXml(object):
         # Header
         #
         Tmp = PackageHeaderXml()
-        Tmp.FromXml(XmlNode(Item, '/PackageSurfaceArea/Header'), 'Header', Package)
+        Tmp.FromXml(XmlNode(Item, '/PackageSurfaceArea/Header'),
+                    'Header', Package)
         #
         # ClonedFrom
         #
         Tmp = ClonedFromXml()
         if XmlNode(Item, '/PackageSurfaceArea/ClonedFrom'):
-            ClonedFrom = Tmp.FromXml(XmlNode(Item, '/PackageSurfaceArea/ClonedFrom'), 'ClonedFrom')
+            ClonedFrom = Tmp.FromXml(
+                XmlNode(Item, '/PackageSurfaceArea/ClonedFrom'), 'ClonedFrom')
             Package.SetClonedFromList([ClonedFrom])
         #
         # LibraryClass
@@ -174,7 +186,8 @@ class PackageSurfaceAreaXml(object):
         for SubItem in XmlList(Item, '/PackageSurfaceArea/LibraryClassDeclarations/LibraryClass'):
             Tmp = LibraryClassXml()
             LibraryClass = Tmp.FromXml(SubItem, 'LibraryClass')
-            Package.SetLibraryClassList(Package.GetLibraryClassList() + [LibraryClass])
+            Package.SetLibraryClassList(
+                Package.GetLibraryClassList() + [LibraryClass])
 
         if XmlList(Item, '/PackageSurfaceArea/LibraryClassDeclarations') and \
            not XmlList(Item, '/PackageSurfaceArea/LibraryClassDeclarations/LibraryClass'):
@@ -186,20 +199,21 @@ class PackageSurfaceAreaXml(object):
         for SubItem in XmlList(Item, '/PackageSurfaceArea/IndustryStandardIncludes/IndustryStandardHeader'):
             Tmp = IndustryStandardHeaderXml()
             Include = Tmp.FromXml(SubItem, 'IndustryStandardHeader')
-            Package.SetStandardIncludeFileList(Package.GetStandardIncludeFileList() + [Include])
+            Package.SetStandardIncludeFileList(
+                Package.GetStandardIncludeFileList() + [Include])
 
         if XmlList(Item, '/PackageSurfaceArea/IndustryStandardIncludes') and \
-        not XmlList(Item, '/PackageSurfaceArea/IndustryStandardIncludes/IndustryStandardHeader'):
+                not XmlList(Item, '/PackageSurfaceArea/IndustryStandardIncludes/IndustryStandardHeader'):
             Package.SetStandardIncludeFileList([None])
 
-
         #
         # PackageHeader
         #
         for SubItem in XmlList(Item, '/PackageSurfaceArea/PackageIncludes/PackageHeader'):
             Tmp = PackageIncludeHeaderXml()
             Include = Tmp.FromXml(SubItem, 'PackageHeader')
-            Package.SetPackageIncludeFileList(Package.GetPackageIncludeFileList() + [Include])
+            Package.SetPackageIncludeFileList(
+                Package.GetPackageIncludeFileList() + [Include])
 
         if XmlList(Item, '/PackageSurfaceArea/PackageIncludes') and not \
            XmlList(Item, '/PackageSurfaceArea/PackageIncludes/PackageHeader'):
@@ -223,7 +237,8 @@ class PackageSurfaceAreaXml(object):
         for SubItem in XmlList(Item, '/PackageSurfaceArea/ProtocolDeclarations/Entry'):
             Tmp = ProtocolXml('Package')
             GuidProtocolPpi = Tmp.FromXml(SubItem, 'Entry')
-            Package.SetProtocolList(Package.GetProtocolList() + [GuidProtocolPpi])
+            Package.SetProtocolList(
+                Package.GetProtocolList() + [GuidProtocolPpi])
 
         if XmlList(Item, '/PackageSurfaceArea/ProtocolDeclarations') and not \
            XmlList(Item, '/PackageSurfaceArea/ProtocolDeclarations/Entry'):
@@ -256,8 +271,7 @@ class PackageSurfaceAreaXml(object):
                 PcdErrorMessageList = PcdErrorObj.GetErrorMessageList()
                 if PcdErrorMessageList:
                     Package.PcdErrorCommentDict[(PcdEntry.GetTokenSpaceGuidCName(), PcdErrorObj.GetErrorNumber())] = \
-                    PcdErrorMessageList
-
+                        PcdErrorMessageList
 
         if XmlList(Item, '/PackageSurfaceArea/PcdDeclarations') and not \
            XmlList(Item, '/PackageSurfaceArea/PcdDeclarations/PcdEntry'):
@@ -277,13 +291,15 @@ class PackageSurfaceAreaXml(object):
         for SubItem in XmlList(Item, '/PackageSurfaceArea/Modules/ModuleSurfaceArea'):
             Tmp = ModuleSurfaceAreaXml()
             Module = Tmp.FromXml(SubItem, 'ModuleSurfaceArea')
-            ModuleDictKey = (Module.GetGuid(), Module.GetVersion(), Module.GetName(), Module.GetModulePath())
+            ModuleDictKey = (Module.GetGuid(), Module.GetVersion(),
+                             Module.GetName(), Module.GetModulePath())
             Package.ModuleDict[ModuleDictKey] = Module
         #
         # MiscellaneousFile
         #
         Tmp = MiscellaneousFileXml()
-        MiscFileList = Tmp.FromXml(XmlNode(Item, '/PackageSurfaceArea/MiscellaneousFiles'), 'MiscellaneousFiles')
+        MiscFileList = Tmp.FromXml(
+            XmlNode(Item, '/PackageSurfaceArea/MiscellaneousFiles'), 'MiscellaneousFiles')
         if MiscFileList:
             Package.SetMiscFileList([MiscFileList])
         else:
@@ -317,30 +333,37 @@ class PackageSurfaceAreaXml(object):
         #
         Tmp = ClonedFromXml()
         if Package.GetClonedFromList() != []:
-            DomPackage.appendChild(Tmp.ToXml(Package.GetClonedFromList[0], 'ClonedFrom'))
+            DomPackage.appendChild(
+                Tmp.ToXml(Package.GetClonedFromList[0], 'ClonedFrom'))
         #
         # LibraryClass
         #
-        LibraryClassNode = CreateXmlElement('LibraryClassDeclarations', '', [], [])
+        LibraryClassNode = CreateXmlElement(
+            'LibraryClassDeclarations', '', [], [])
         for LibraryClass in Package.GetLibraryClassList():
             Tmp = LibraryClassXml()
-            LibraryClassNode.appendChild(Tmp.ToXml(LibraryClass, 'LibraryClass'))
+            LibraryClassNode.appendChild(
+                Tmp.ToXml(LibraryClass, 'LibraryClass'))
         DomPackage.appendChild(LibraryClassNode)
         #
         # IndustryStandardHeader
         #
-        IndustryStandardHeaderNode = CreateXmlElement('IndustryStandardIncludes', '', [], [])
+        IndustryStandardHeaderNode = CreateXmlElement(
+            'IndustryStandardIncludes', '', [], [])
         for Include in Package.GetStandardIncludeFileList():
             Tmp = IndustryStandardHeaderXml()
-            IndustryStandardHeaderNode.appendChild(Tmp.ToXml(Include, 'IndustryStandardHeader'))
+            IndustryStandardHeaderNode.appendChild(
+                Tmp.ToXml(Include, 'IndustryStandardHeader'))
         DomPackage.appendChild(IndustryStandardHeaderNode)
         #
         # PackageHeader
         #
-        PackageIncludeHeaderNode = CreateXmlElement('PackageIncludes', '', [], [])
+        PackageIncludeHeaderNode = CreateXmlElement(
+            'PackageIncludes', '', [], [])
         for Include in Package.GetPackageIncludeFileList():
             Tmp = PackageIncludeHeaderXml()
-            PackageIncludeHeaderNode.appendChild(Tmp.ToXml(Include, 'PackageHeader'))
+            PackageIncludeHeaderNode.appendChild(
+                Tmp.ToXml(Include, 'PackageHeader'))
         DomPackage.appendChild(PackageIncludeHeaderNode)
         ModuleNode = CreateXmlElement('Modules', '', [], [])
         for Module in Package.GetModuleDict().values():
@@ -353,18 +376,18 @@ class PackageSurfaceAreaXml(object):
         GuidProtocolPpiNode = CreateXmlElement('GuidDeclarations', '', [], [])
         for GuidProtocolPpi in Package.GetGuidList():
             Tmp = GuidXml('Package')
-            GuidProtocolPpiNode.appendChild(Tmp.ToXml\
+            GuidProtocolPpiNode.appendChild(Tmp.ToXml
                                             (GuidProtocolPpi, 'Entry'))
         DomPackage.appendChild(GuidProtocolPpiNode)
         #
         # Protocol
         #
         GuidProtocolPpiNode = \
-        CreateXmlElement('ProtocolDeclarations', '', [], [])
+            CreateXmlElement('ProtocolDeclarations', '', [], [])
         for GuidProtocolPpi in Package.GetProtocolList():
             Tmp = ProtocolXml('Package')
-            GuidProtocolPpiNode.appendChild\
-            (Tmp.ToXml(GuidProtocolPpi, 'Entry'))
+            GuidProtocolPpiNode.appendChild(
+                Tmp.ToXml(GuidProtocolPpi, 'Entry'))
         DomPackage.appendChild(GuidProtocolPpiNode)
         #
         # Ppi
@@ -372,8 +395,8 @@ class PackageSurfaceAreaXml(object):
         GuidProtocolPpiNode = CreateXmlElement('PpiDeclarations', '', [], [])
         for GuidProtocolPpi in Package.GetPpiList():
             Tmp = PpiXml('Package')
-            GuidProtocolPpiNode.appendChild\
-            (Tmp.ToXml(GuidProtocolPpi, 'Entry'))
+            GuidProtocolPpiNode.appendChild(
+                Tmp.ToXml(GuidProtocolPpi, 'Entry'))
         DomPackage.appendChild(GuidProtocolPpiNode)
         #
         # PcdEntry
@@ -389,7 +412,8 @@ class PackageSurfaceAreaXml(object):
         #
         Tmp = MiscellaneousFileXml()
         if Package.GetMiscFileList():
-            DomPackage.appendChild(Tmp.ToXml(Package.GetMiscFileList()[0], 'MiscellaneousFiles'))
+            DomPackage.appendChild(
+                Tmp.ToXml(Package.GetMiscFileList()[0], 'MiscellaneousFiles'))
 
         #
         # UserExtensions
@@ -397,6 +421,7 @@ class PackageSurfaceAreaXml(object):
         if Package.GetUserExtensionList():
             for UserExtension in Package.GetUserExtensionList():
                 Tmp = UserExtensionsXml()
-                DomPackage.appendChild(Tmp.ToXml(UserExtension, 'UserExtensions'))
+                DomPackage.appendChild(
+                    Tmp.ToXml(UserExtension, 'UserExtensions'))
 
         return DomPackage
diff --git a/BaseTools/Source/Python/UPT/Xml/PcdXml.py b/BaseTools/Source/Python/UPT/Xml/PcdXml.py
index bbcee45a0132..22fa048afbc3 100644
--- a/BaseTools/Source/Python/UPT/Xml/PcdXml.py
+++ b/BaseTools/Source/Python/UPT/Xml/PcdXml.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to parse a PCD file of .PKG file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -35,6 +35,8 @@ import re
 ##
 # PcdErrorXml
 #
+
+
 class PcdErrorXml(object):
     def __init__(self):
         self.ValidValueList = ''
@@ -47,14 +49,15 @@ class PcdErrorXml(object):
     def FromXml(self, Item, Key):
         self.ValidValueList = XmlElement(Item, '%s/ValidValueList' % Key)
         self.ValidValueListLang = \
-        XmlAttribute(XmlNode(Item, '%s/ValidValueList' % Key), 'Lang')
-        self.ValidValueRange = self.TransferValidEpxr2ValidRange(XmlElement(Item, '%s/ValidValueRange' % Key))
+            XmlAttribute(XmlNode(Item, '%s/ValidValueList' % Key), 'Lang')
+        self.ValidValueRange = self.TransferValidEpxr2ValidRange(
+            XmlElement(Item, '%s/ValidValueRange' % Key))
         self.Expression = XmlElement(Item, '%s/Expression' % Key)
         self.ErrorNumber = XmlElement(Item, '%s/ErrorNumber' % Key)
         for ErrMsg in XmlList(Item, '%s/ErrorMessage' % Key):
             ErrorMessageString = XmlElement(ErrMsg, 'ErrorMessage')
             ErrorMessageLang = \
-            XmlAttribute(XmlNode(ErrMsg, 'ErrorMessage'), 'Lang')
+                XmlAttribute(XmlNode(ErrMsg, 'ErrorMessage'), 'Lang')
             self.ErrorMessage.append((ErrorMessageLang, ErrorMessageString))
 
         Error = PcdErrorObject()
@@ -74,16 +77,16 @@ class PcdErrorXml(object):
         NodeList = []
         if PcdError.GetValidValue():
             Element1 = \
-            CreateXmlElement('ValidValueList', PcdError.GetValidValue(), [], \
-                             [['Lang', PcdError.GetValidValueLang()]])
+                CreateXmlElement('ValidValueList', PcdError.GetValidValue(), [],
+                                 [['Lang', PcdError.GetValidValueLang()]])
             NodeList.append(Element1)
         if PcdError.GetValidValueRange():
             TansferedRangeStr = self.TransferValidRange2Expr(PcdError.GetTokenSpaceGuidCName(),
                                                              PcdError.GetCName(),
                                                              PcdError.GetValidValueRange())
             Element1 = \
-            CreateXmlElement('ValidValueRange', \
-                             TansferedRangeStr, [], [])
+                CreateXmlElement('ValidValueRange',
+                                 TansferedRangeStr, [], [])
             NodeList.append(Element1)
         if PcdError.GetExpression():
             NodeList.append(['Expression', PcdError.GetExpression()])
@@ -91,7 +94,8 @@ class PcdErrorXml(object):
             NodeList.append(['ErrorNumber', PcdError.GetErrorNumber()])
         for Item in PcdError.GetErrorMessageList():
             Element = \
-            CreateXmlElement('ErrorMessage', Item[1], [], [['Lang', Item[0]]])
+                CreateXmlElement('ErrorMessage', Item[1], [], [
+                                 ['Lang', Item[0]]])
             NodeList.append(Element)
         Root = CreateXmlElement('%s' % Key, '', NodeList, AttributeList)
 
@@ -137,7 +141,8 @@ class PcdErrorXml(object):
         for MatchStr in HexMatchedList:
             RangeItemList = MatchStr.strip().split('-')
             TransferedRangeStr = '(%s GE %s) AND (%s LE %s)' % \
-                (PcdName, RangeItemList[0].strip(), PcdName, RangeItemList[1].strip())
+                (PcdName, RangeItemList[0].strip(),
+                 PcdName, RangeItemList[1].strip())
             ValidRange = ValidRange.replace(MatchStr, TransferedRangeStr)
         #
         # Convert INT1 format range
@@ -148,7 +153,8 @@ class PcdErrorXml(object):
         for MatchStr in IntMatchedList:
             RangeItemList = MatchStr.strip().split('-')
             TransferedRangeStr = '(%s GE %s) AND (%s LE %s)' % \
-                (PcdName, RangeItemList[0].strip(), PcdName, RangeItemList[1].strip())
+                (PcdName, RangeItemList[0].strip(),
+                 PcdName, RangeItemList[1].strip())
             ValidRange = ValidRange.replace(MatchStr, TransferedRangeStr)
 
         return ValidRange
@@ -158,18 +164,20 @@ class PcdErrorXml(object):
             pass
 
         PCD_PATTERN = \
-        '[\t\s]*[_a-zA-Z][a-zA-Z0-9_]*[\t\s]*\.[\t\s]*[_a-zA-Z][a-zA-Z0-9_]*[\t\s]*'
+            '[\t\s]*[_a-zA-Z][a-zA-Z0-9_]*[\t\s]*\.[\t\s]*[_a-zA-Z][a-zA-Z0-9_]*[\t\s]*'
         IntPattern1 = \
-        '[\t\s]*\([\t\s]*'+PCD_PATTERN+'[\t\s]+GE[\t\s]+\d+[\t\s]*\)[\t\s]+AND[\t\s]+\([\t\s]*'+\
-        PCD_PATTERN+'[\t\s]+LE[\t\s]+\d+[\t\s]*\)'
+            '[\t\s]*\([\t\s]*'+PCD_PATTERN+'[\t\s]+GE[\t\s]+\d+[\t\s]*\)[\t\s]+AND[\t\s]+\([\t\s]*' +\
+            PCD_PATTERN+'[\t\s]+LE[\t\s]+\d+[\t\s]*\)'
         IntPattern1 = IntPattern1.replace(' ', '')
-        IntPattern2 = '[\t\s]*'+PCD_PATTERN+'[\t\s]+(LT|GT|LE|GE|XOR|EQ)[\t\s]+\d+[\t\s]*'
+        IntPattern2 = '[\t\s]*'+PCD_PATTERN + \
+            '[\t\s]+(LT|GT|LE|GE|XOR|EQ)[\t\s]+\d+[\t\s]*'
 
         HexPattern1 = \
-        '[\t\s]*\([\t\s]*'+PCD_PATTERN+'[\t\s]+GE[\t\s]+0[xX][0-9a-fA-F]+[\t\s]*\)[\t\s]+AND[\t\s]+\([\t\s]*'+\
-        PCD_PATTERN+'[\t\s]+LE[\t\s]+0[xX][0-9a-fA-F]+[\t\s]*\)'
+            '[\t\s]*\([\t\s]*'+PCD_PATTERN+'[\t\s]+GE[\t\s]+0[xX][0-9a-fA-F]+[\t\s]*\)[\t\s]+AND[\t\s]+\([\t\s]*' +\
+            PCD_PATTERN+'[\t\s]+LE[\t\s]+0[xX][0-9a-fA-F]+[\t\s]*\)'
         HexPattern1 = HexPattern1.replace(' ', '')
-        HexPattern2 = '[\t\s]*'+PCD_PATTERN+'[\t\s]+(LT|GT|LE|GE|XOR|EQ)[\t\s]+0[xX][0-9a-zA-Z]+[\t\s]*'
+        HexPattern2 = '[\t\s]*'+PCD_PATTERN + \
+            '[\t\s]+(LT|GT|LE|GE|XOR|EQ)[\t\s]+0[xX][0-9a-zA-Z]+[\t\s]*'
 
         #
         # Do the Hex1 conversion
@@ -180,7 +188,8 @@ class PcdErrorXml(object):
             #
             # To match items on both sides of '-'
             #
-            RangeItemList = re.compile('[\t\s]*0[xX][0-9a-fA-F]+[\t\s]*').findall(HexMatchedItem)
+            RangeItemList = re.compile(
+                '[\t\s]*0[xX][0-9a-fA-F]+[\t\s]*').findall(HexMatchedItem)
             if RangeItemList and len(RangeItemList) == 2:
                 HexRangeDict[HexMatchedItem] = RangeItemList
 
@@ -204,7 +213,8 @@ class PcdErrorXml(object):
             #
             # To match items on both sides of '-'
             #
-            RangeItemList = re.compile('[\t\s]*\d+[\t\s]*').findall(MatchedItem)
+            RangeItemList = re.compile(
+                '[\t\s]*\d+[\t\s]*').findall(MatchedItem)
             if RangeItemList and len(RangeItemList) == 2:
                 IntRangeDict[MatchedItem] = RangeItemList
 
@@ -236,17 +246,17 @@ class PcdErrorXml(object):
 
         return ValidRangeExpr
 
-
-
     def __str__(self):
         return "ValidValueList = %s ValidValueListLang = %s ValidValueRange \
         = %s Expression = %s ErrorNumber = %s %s" % \
-        (self.ValidValueList, self.ValidValueListLang, self.ValidValueRange, \
-         self.Expression, self.ErrorNumber, self.ErrorMessage)
+            (self.ValidValueList, self.ValidValueListLang, self.ValidValueRange,
+             self.Expression, self.ErrorNumber, self.ErrorMessage)
 
 ##
 # PcdEntryXml
 #
+
+
 class PcdEntryXml(object):
     def __init__(self):
         self.PcdItemType = ''
@@ -272,12 +282,12 @@ class PcdEntryXml(object):
     #
     def FromXml(self, Item, Key):
         self.PcdItemType = \
-        XmlAttribute(XmlNode(Item, '%s' % Key), 'PcdItemType')
+            XmlAttribute(XmlNode(Item, '%s' % Key), 'PcdItemType')
         self.PcdUsage = XmlAttribute(XmlNode(Item, '%s' % Key), 'PcdUsage')
         self.TokenSpaceGuidCName = \
-        XmlElement(Item, '%s/TokenSpaceGuidCname' % Key)
+            XmlElement(Item, '%s/TokenSpaceGuidCname' % Key)
         self.TokenSpaceGuidValue = \
-        XmlElement(Item, '%s/TokenSpaceGuidValue' % Key)
+            XmlElement(Item, '%s/TokenSpaceGuidValue' % Key)
         self.Token = XmlElement(Item, '%s/Token' % Key)
         self.CName = XmlElement(Item, '%s/CName' % Key)
         self.PcdCName = XmlElement(Item, '%s/PcdCName' % Key)
@@ -315,7 +325,8 @@ class PcdEntryXml(object):
         PcdEntry.SetValidUsage(self.ValidUsage)
         PcdEntry.SetDefaultValue(self.DefaultValue)
         PcdEntry.SetMaxDatumSize(self.MaxDatumSize)
-        PcdEntry.SetFeatureFlag(ConvertNOTEQToNE(self.CommonDefines.FeatureFlag))
+        PcdEntry.SetFeatureFlag(ConvertNOTEQToNE(
+            self.CommonDefines.FeatureFlag))
         PcdEntry.SetItemType(self.PcdItemType)
 
         PcdEntry.SetHelpTextList(GetHelpTextList(self.HelpText))
@@ -325,9 +336,10 @@ class PcdEntryXml(object):
     ##
     # Package will use FromXml2
     #
+
     def FromXml2(self, Item, Key):
         self.TokenSpaceGuidCName = \
-        XmlElement(Item, '%s/TokenSpaceGuidCname' % Key)
+            XmlElement(Item, '%s/TokenSpaceGuidCname' % Key)
         self.Token = XmlElement(Item, '%s/Token' % Key)
         self.CName = XmlElement(Item, '%s/CName' % Key)
         self.DatumType = XmlElement(Item, '%s/DatumType' % Key)
@@ -360,7 +372,8 @@ class PcdEntryXml(object):
         PcdEntry.SetValidUsage(self.ValidUsage)
         PcdEntry.SetDefaultValue(self.DefaultValue)
         PcdEntry.SetMaxDatumSize(self.MaxDatumSize)
-        PcdEntry.SetFeatureFlag(ConvertNOTEQToNE(self.CommonDefines.FeatureFlag))
+        PcdEntry.SetFeatureFlag(ConvertNOTEQToNE(
+            self.CommonDefines.FeatureFlag))
 
         PcdEntry.SetPromptList(GetPromptList(self.Prompt))
         PcdEntry.SetHelpTextList(GetHelpTextList(self.HelpText))
@@ -373,10 +386,10 @@ class PcdEntryXml(object):
     #
     def FromXml3(self, Item, Key):
         self.PcdItemType = \
-        XmlAttribute(XmlNode(Item, '%s' % Key), 'PcdItemType')
+            XmlAttribute(XmlNode(Item, '%s' % Key), 'PcdItemType')
         self.PcdUsage = XmlAttribute(XmlNode(Item, '%s' % Key), 'PcdUsage')
         self.TokenSpaceGuidCName = \
-        XmlElement(Item, '%s/TokenSpaceGuidCName' % Key)
+            XmlElement(Item, '%s/TokenSpaceGuidCName' % Key)
         self.CName = XmlElement(Item, '%s/CName' % Key)
         self.DefaultValue = XmlElement(Item, '%s/DefaultValue' % Key)
         self.CommonDefines.FromXml(XmlNode(Item, '%s' % Key), Key)
@@ -397,7 +410,8 @@ class PcdEntryXml(object):
         PcdEntry.SetCName(self.CName)
         PcdEntry.SetValidUsage(self.PcdUsage)
         PcdEntry.SetDefaultValue(self.DefaultValue)
-        PcdEntry.SetFeatureFlag(ConvertNOTEQToNE(self.CommonDefines.FeatureFlag))
+        PcdEntry.SetFeatureFlag(ConvertNOTEQToNE(
+            self.CommonDefines.FeatureFlag))
         PcdEntry.SetItemType(self.PcdItemType)
 
         PcdEntry.SetHelpTextList(GetHelpTextList(self.HelpText))
@@ -412,11 +426,11 @@ class PcdEntryXml(object):
         DefaultValue = ConvertNEToNOTEQ(PcdEntry.GetDefaultValue())
 
         AttributeList = \
-        [['SupArchList', GetStringOfList(PcdEntry.GetSupArchList())], \
-         ['PcdUsage', PcdEntry.GetValidUsage()], \
-         ['PcdItemType', PcdEntry.GetItemType()], \
-         ['FeatureFlag', PcdEntry.GetFeatureFlag()],
-        ]
+            [['SupArchList', GetStringOfList(PcdEntry.GetSupArchList())],
+             ['PcdUsage', PcdEntry.GetValidUsage()],
+                ['PcdItemType', PcdEntry.GetItemType()],
+                ['FeatureFlag', PcdEntry.GetFeatureFlag()],
+             ]
         NodeList = [['TokenSpaceGuidCname', PcdEntry.GetTokenSpaceGuidCName()],
                     ['TokenSpaceGuidValue', PcdEntry.GetTokenSpaceGuidValue()],
                     ['Token', PcdEntry.GetToken()],
@@ -426,7 +440,7 @@ class PcdEntryXml(object):
                     ['DefaultValue', DefaultValue],
                     ['MaxDatumSize', PcdEntry.GetMaxDatumSize()],
                     ['Offset', PcdEntry.GetOffset()],
-                   ]
+                    ]
 
         for Item in PcdEntry.GetHelpTextList():
             Tmp = HelpTextXml()
@@ -441,6 +455,7 @@ class PcdEntryXml(object):
     ##
     # Package will use ToXml2
     #
+
     def ToXml2(self, PcdEntry, Key):
         if self.PcdCName:
             pass
@@ -448,9 +463,9 @@ class PcdEntryXml(object):
         DefaultValue = ConvertNEToNOTEQ(PcdEntry.GetDefaultValue())
 
         AttributeList = \
-        [['SupArchList', GetStringOfList(PcdEntry.GetSupArchList())], \
-         ['SupModList', GetStringOfList(PcdEntry.GetSupModuleList())]
-        ]
+            [['SupArchList', GetStringOfList(PcdEntry.GetSupArchList())],
+             ['SupModList', GetStringOfList(PcdEntry.GetSupModuleList())]
+             ]
         NodeList = [['TokenSpaceGuidCname', PcdEntry.GetTokenSpaceGuidCName()],
                     ['Token', PcdEntry.GetToken()],
                     ['CName', PcdEntry.GetCName()],
@@ -458,7 +473,7 @@ class PcdEntryXml(object):
                     ['ValidUsage', GetStringOfList(PcdEntry.GetValidUsage())],
                     ['DefaultValue', DefaultValue],
                     ['MaxDatumSize', PcdEntry.GetMaxDatumSize()],
-                   ]
+                    ]
         for Item in PcdEntry.GetPromptList():
             Tmp = PromptXml()
             NodeList.append(Tmp.ToXml(Item))
@@ -477,6 +492,7 @@ class PcdEntryXml(object):
     ##
     # Module will use ToXml3
     #
+
     def ToXml3(self, PcdEntry, Key):
         if self.PcdCName:
             pass
@@ -484,15 +500,15 @@ class PcdEntryXml(object):
         DefaultValue = ConvertNEToNOTEQ(PcdEntry.GetDefaultValue())
 
         AttributeList = \
-        [['SupArchList', GetStringOfList(PcdEntry.GetSupArchList())], \
-         ['PcdUsage', PcdEntry.GetValidUsage()], \
-         ['PcdItemType', PcdEntry.GetItemType()], \
-         ['FeatureFlag', ConvertNEToNOTEQ(PcdEntry.GetFeatureFlag())],
-        ]
+            [['SupArchList', GetStringOfList(PcdEntry.GetSupArchList())],
+             ['PcdUsage', PcdEntry.GetValidUsage()],
+                ['PcdItemType', PcdEntry.GetItemType()],
+                ['FeatureFlag', ConvertNEToNOTEQ(PcdEntry.GetFeatureFlag())],
+             ]
         NodeList = [['CName', PcdEntry.GetCName()],
                     ['TokenSpaceGuidCName', PcdEntry.GetTokenSpaceGuidCName()],
                     ['DefaultValue', DefaultValue],
-                   ]
+                    ]
 
         for Item in PcdEntry.GetHelpTextList():
             Tmp = HelpTextXml()
@@ -517,14 +533,14 @@ class PcdEntryXml(object):
         AttributeList = []
 
         NodeList = [
-                    ['TokenSpaceGuidValue', PcdEntry.GetTokenSpaceGuidValue()],
-                    ['PcdCName', PcdEntry.GetCName()],
-                    ['Token', PcdEntry.GetToken()],
-                    ['DatumType', PcdEntry.GetDatumType()],
-                    ['MaxDatumSize', PcdEntry.GetMaxDatumSize()],
-                    ['Value', DefaultValue],
-                    ['Offset', PcdEntry.GetOffset()]
-                   ]
+            ['TokenSpaceGuidValue', PcdEntry.GetTokenSpaceGuidValue()],
+            ['PcdCName', PcdEntry.GetCName()],
+            ['Token', PcdEntry.GetToken()],
+            ['DatumType', PcdEntry.GetDatumType()],
+            ['MaxDatumSize', PcdEntry.GetMaxDatumSize()],
+            ['Value', DefaultValue],
+            ['Offset', PcdEntry.GetOffset()]
+        ]
 
         for Item in PcdEntry.GetHelpTextList():
             Tmp = HelpTextXml()
@@ -537,17 +553,16 @@ class PcdEntryXml(object):
 
         return Root
 
-
     def __str__(self):
         Str = \
-        ('PcdItemType = %s PcdUsage = %s TokenSpaceGuidCName = %s \
+            ('PcdItemType = %s PcdUsage = %s TokenSpaceGuidCName = %s \
         TokenSpaceGuidValue = %s Token = %s CName = %s PcdCName = %s \
         DatumType = %s ValidUsage = %s DefaultValue = %s MaxDatumSize = %s \
         Value = %s Offset = %s %s') % \
-        (self.PcdItemType, self.PcdUsage, self.TokenSpaceGuidCName, \
-         self.TokenSpaceGuidValue, self.Token, self.CName, self.PcdCName, \
-         self.DatumType, self.ValidUsage, self.DefaultValue, \
-         self.MaxDatumSize, self.Value, self.Offset, self.CommonDefines)
+            (self.PcdItemType, self.PcdUsage, self.TokenSpaceGuidCName,
+             self.TokenSpaceGuidValue, self.Token, self.CName, self.PcdCName,
+             self.DatumType, self.ValidUsage, self.DefaultValue,
+             self.MaxDatumSize, self.Value, self.Offset, self.CommonDefines)
         for Item in self.HelpText:
             Str = Str + "\n\t" + str(Item)
         for Item in self.PcdError:
diff --git a/BaseTools/Source/Python/UPT/Xml/XmlParser.py b/BaseTools/Source/Python/UPT/Xml/XmlParser.py
index 8e22a280f655..789c8b958c75 100644
--- a/BaseTools/Source/Python/UPT/Xml/XmlParser.py
+++ b/BaseTools/Source/Python/UPT/Xml/XmlParser.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to parse a xml file of .PKG file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -48,12 +48,14 @@ import Logger.Log as Logger
 ##
 # DistributionPackageXml
 #
+
+
 class DistributionPackageXml(object):
     def __init__(self):
         self.DistP = DistributionPackageClass()
         self.Pkg = ''
 
-    ## ValidateDistributionPackage
+    # ValidateDistributionPackage
     #
     # Check if any required item is missing in DistributionPackage
     #
@@ -64,7 +66,7 @@ class DistributionPackageXml(object):
             # Check DistributionPackage -> DistributionHeader
             #
             XmlTreeLevel = ['DistributionPackage', '']
-            CheckDict = {'DistributionHeader':self.DistP.Header }
+            CheckDict = {'DistributionHeader': self.DistP.Header}
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
             if self.DistP.Header:
@@ -110,7 +112,8 @@ class DistributionPackageXml(object):
             # Check Each Module
             #
             for Key in self.DistP.ModuleSurfaceArea:
-                ValidateMS(self.DistP.ModuleSurfaceArea[Key], ['DistributionPackage', 'ModuleSurfaceArea'])
+                ValidateMS(self.DistP.ModuleSurfaceArea[Key], [
+                           'DistributionPackage', 'ModuleSurfaceArea'])
 
             #
             # Check Each Tool
@@ -133,16 +136,19 @@ class DistributionPackageXml(object):
             # Check Each Misc File
             #
             if self.DistP.MiscellaneousFiles:
-                XmlTreeLevel = ['DistributionPackage', 'MiscellaneousFiles', 'Header']
+                XmlTreeLevel = ['DistributionPackage',
+                                'MiscellaneousFiles', 'Header']
                 CheckDict = {'Name': self.DistP.MiscellaneousFiles.GetName(), }
                 IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
                 if not self.DistP.MiscellaneousFiles.GetFileList():
-                    XmlTreeLevel = ['DistributionPackage', 'MiscellaneousFiles']
+                    XmlTreeLevel = [
+                        'DistributionPackage', 'MiscellaneousFiles']
                     CheckDict = {'FileName': None, }
                     IsRequiredItemListNull(CheckDict, XmlTreeLevel)
                 for Item in self.DistP.MiscellaneousFiles.GetFileList():
-                    XmlTreeLevel = ['DistributionPackage', 'MiscellaneousFiles']
+                    XmlTreeLevel = [
+                        'DistributionPackage', 'MiscellaneousFiles']
                     CheckDict = {'FileName': Item.GetURI(), }
                     IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
@@ -154,7 +160,6 @@ class DistributionPackageXml(object):
                 CheckDict = {'UserId': Item.GetUserID(), }
                 IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
-
     def FromXml(self, Filename=None):
         if Filename is not None:
             self.DistP = DistributionPackageClass()
@@ -168,7 +173,8 @@ class DistributionPackageXml(object):
             #
             Tmp = DistributionPackageHeaderXml()
             DistributionPackageHeader = \
-            Tmp.FromXml(XmlNode(self.Pkg, '/DistributionPackage/DistributionHeader'), 'DistributionHeader')
+                Tmp.FromXml(XmlNode(
+                    self.Pkg, '/DistributionPackage/DistributionHeader'), 'DistributionHeader')
             self.DistP.Header = DistributionPackageHeader
             #
             # Parse each PackageSurfaceArea
@@ -176,40 +182,43 @@ class DistributionPackageXml(object):
             for Item in XmlList(self.Pkg, '/DistributionPackage/PackageSurfaceArea'):
                 Psa = PackageSurfaceAreaXml()
                 Package = Psa.FromXml(Item, 'PackageSurfaceArea')
-                self.DistP.PackageSurfaceArea[(Package.GetGuid(), \
-                                               Package.GetVersion(), \
+                self.DistP.PackageSurfaceArea[(Package.GetGuid(),
+                                               Package.GetVersion(),
                                                Package.GetPackagePath())] = \
-                                               Package
+                    Package
             #
             # Parse each ModuleSurfaceArea
             #
             for Item in XmlList(self.Pkg, '/DistributionPackage/ModuleSurfaceArea'):
                 Msa = ModuleSurfaceAreaXml()
                 Module = Msa.FromXml(Item, 'ModuleSurfaceArea', True)
-                ModuleKey = (Module.GetGuid(), Module.GetVersion(), Module.GetName(), Module.GetModulePath())
+                ModuleKey = (Module.GetGuid(), Module.GetVersion(),
+                             Module.GetName(), Module.GetModulePath())
                 self.DistP.ModuleSurfaceArea[ModuleKey] = Module
 
             #
             # Parse Tools
             #
             Tmp = MiscellaneousFileXml()
-            self.DistP.Tools = Tmp.FromXml2(XmlNode(self.Pkg, '/DistributionPackage/Tools'), 'Tools')
+            self.DistP.Tools = Tmp.FromXml2(
+                XmlNode(self.Pkg, '/DistributionPackage/Tools'), 'Tools')
 
             #
             # Parse MiscFiles
             #
             Tmp = MiscellaneousFileXml()
             self.DistP.MiscellaneousFiles = \
-            Tmp.FromXml2(XmlNode(self.Pkg, \
-                                 '/DistributionPackage/MiscellaneousFiles'), \
-                                 'MiscellaneousFiles')
+                Tmp.FromXml2(XmlNode(self.Pkg,
+                                     '/DistributionPackage/MiscellaneousFiles'),
+                             'MiscellaneousFiles')
 
             #
             # Parse UserExtensions
             #
             for Item in XmlList(self.Pkg, '/DistributionPackage/UserExtensions'):
                 Tmp = UserExtensionsXml()
-                self.DistP.UserExtensions.append(Tmp.FromXml2(Item, 'UserExtensions'))
+                self.DistP.UserExtensions.append(
+                    Tmp.FromXml2(Item, 'UserExtensions'))
 
             #
             # Check Required Items for XML
@@ -264,7 +273,6 @@ class DistributionPackageXml(object):
 
             XmlContent = Root.toprettyxml(indent='  ')
 
-
             #
             # Remove empty element
             #
@@ -280,58 +288,62 @@ class DistributionPackageXml(object):
             # Remove SupArchList="COMMON" or "common"
             #
             XmlContent = \
-            re.sub(r'[\s\r\n]*SupArchList[\s\r\n]*=[\s\r\n]*"[\s\r\n]*COMMON'
-            '[\s\r\n]*"', '', XmlContent)
+                re.sub(r'[\s\r\n]*SupArchList[\s\r\n]*=[\s\r\n]*"[\s\r\n]*COMMON'
+                       '[\s\r\n]*"', '', XmlContent)
             XmlContent = \
-            re.sub(r'[\s\r\n]*SupArchList[\s\r\n]*=[\s\r\n]*"[\s\r\n]*common'
-            '[\s\r\n]*"', '', XmlContent)
+                re.sub(r'[\s\r\n]*SupArchList[\s\r\n]*=[\s\r\n]*"[\s\r\n]*common'
+                       '[\s\r\n]*"', '', XmlContent)
             #
             # Remove <SupArchList> COMMON </SupArchList>
             #
             XmlContent = \
-            re.sub(r'[\s\r\n]*<SupArchList>[\s\r\n]*COMMON[\s\r\n]*'
-            '</SupArchList>[\s\r\n]*', '', XmlContent)
+                re.sub(r'[\s\r\n]*<SupArchList>[\s\r\n]*COMMON[\s\r\n]*'
+                       '</SupArchList>[\s\r\n]*', '', XmlContent)
 
             #
             # Remove <SupArchList> common </SupArchList>
             #
             XmlContent = \
-            re.sub(r'[\s\r\n]*<SupArchList>[\s\r\n]*'
-            'common[\s\r\n]*</SupArchList>[\s\r\n]*', '', XmlContent)
+                re.sub(r'[\s\r\n]*<SupArchList>[\s\r\n]*'
+                       'common[\s\r\n]*</SupArchList>[\s\r\n]*', '', XmlContent)
 
             #
             # Remove SupModList="COMMON" or "common"
             #
             XmlContent = \
-            re.sub(r'[\s\r\n]*SupModList[\s\r\n]*=[\s\r\n]*"[\s\r\n]*COMMON'
-            '[\s\r\n]*"', '', XmlContent)
+                re.sub(r'[\s\r\n]*SupModList[\s\r\n]*=[\s\r\n]*"[\s\r\n]*COMMON'
+                       '[\s\r\n]*"', '', XmlContent)
             XmlContent = \
-            re.sub(r'[\s\r\n]*SupModList[\s\r\n]*=[\s\r\n]*"[\s\r\n]*common'
-            '[\s\r\n]*"', '', XmlContent)
+                re.sub(r'[\s\r\n]*SupModList[\s\r\n]*=[\s\r\n]*"[\s\r\n]*common'
+                       '[\s\r\n]*"', '', XmlContent)
 
             return XmlContent
 
         return ''
 
-## ValidateMS
+# ValidateMS
 #
 # Check if any required item is missing in ModuleSurfaceArea
 #
 # @param Module: The ModuleSurfaceArea to be checked
 # @param XmlTreeLevel: The top level of Module
 #
+
+
 def ValidateMS(Module, TopXmlTreeLevel):
     ValidateMS1(Module, TopXmlTreeLevel)
     ValidateMS2(Module, TopXmlTreeLevel)
     ValidateMS3(Module, TopXmlTreeLevel)
 
-## ValidateMS1
+# ValidateMS1
 #
 # Check if any required item is missing in ModuleSurfaceArea
 #
 # @param Module: The ModuleSurfaceArea to be checked
 # @param XmlTreeLevel: The top level of Module
 #
+
+
 def ValidateMS1(Module, TopXmlTreeLevel):
     #
     # Check Guids -> GuidCName
@@ -339,21 +351,22 @@ def ValidateMS1(Module, TopXmlTreeLevel):
     XmlTreeLevel = TopXmlTreeLevel + ['Guids']
     for Item in Module.GetGuidList():
         if Item is None:
-            CheckDict = {'GuidCName':''}
+            CheckDict = {'GuidCName': ''}
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     XmlTreeLevel = TopXmlTreeLevel + ['Guids', 'GuidCName']
     for Item in Module.GetGuidList():
-        CheckDict = {'CName':Item.GetCName(),
-                     'GuidType':Item.GetGuidTypeList(),
-                     'Usage':Item.GetUsage()}
+        CheckDict = {'CName': Item.GetCName(),
+                     'GuidType': Item.GetGuidTypeList(),
+                     'Usage': Item.GetUsage()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
         if Item.GetVariableName():
             Result = ConvertVariableName(Item.GetVariableName())
             if Result is None:
                 Msg = "->".join(Node for Node in XmlTreeLevel)
-                ErrorMsg = ERR_XML_INVALID_VARIABLENAME % (Item.GetVariableName(), Item.GetCName(), Msg)
+                ErrorMsg = ERR_XML_INVALID_VARIABLENAME % (
+                    Item.GetVariableName(), Item.GetCName(), Msg)
                 Logger.Error('\nUPT', PARSER_ERROR, ErrorMsg, RaiseError=True)
             else:
                 Item.SetVariableName(Result)
@@ -364,13 +377,13 @@ def ValidateMS1(Module, TopXmlTreeLevel):
     XmlTreeLevel = TopXmlTreeLevel + ['Protocols']
     for Item in Module.GetProtocolList():
         if Item is None:
-            CheckDict = {'Protocol':''}
+            CheckDict = {'Protocol': ''}
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     XmlTreeLevel = TopXmlTreeLevel + ['Protocols', 'Protocol']
     for Item in Module.GetProtocolList():
-        CheckDict = {'CName':Item.GetCName(),
-                     'Usage':Item.GetUsage()}
+        CheckDict = {'CName': Item.GetCName(),
+                     'Usage': Item.GetUsage()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     #
@@ -379,13 +392,13 @@ def ValidateMS1(Module, TopXmlTreeLevel):
     XmlTreeLevel = TopXmlTreeLevel + ['PPIs']
     for Item in Module.GetPpiList():
         if Item is None:
-            CheckDict = {'Ppi':''}
+            CheckDict = {'Ppi': ''}
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     XmlTreeLevel = TopXmlTreeLevel + ['PPIs', 'Ppi']
     for Item in Module.GetPpiList():
-        CheckDict = {'CName':Item.GetCName(),
-                     'Usage':Item.GetUsage()}
+        CheckDict = {'CName': Item.GetCName(),
+                     'Usage': Item.GetUsage()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     #
@@ -394,15 +407,15 @@ def ValidateMS1(Module, TopXmlTreeLevel):
     XmlTreeLevel = TopXmlTreeLevel + ['PcdCoded']
     for Item in Module.GetPcdList():
         if Item is None:
-            CheckDict = {'PcdEntry':''}
+            CheckDict = {'PcdEntry': ''}
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     XmlTreeLevel = TopXmlTreeLevel + ['PcdCoded', 'PcdEntry']
     for Item in Module.GetPcdList():
-        CheckDict = {'TokenSpaceGuidCname':Item.GetTokenSpaceGuidCName(),
-                     'CName':Item.GetCName(),
-                     'PcdUsage':Item.GetValidUsage(),
-                     'PcdItemType':Item.GetItemType()}
+        CheckDict = {'TokenSpaceGuidCname': Item.GetTokenSpaceGuidCName(),
+                     'CName': Item.GetCName(),
+                     'PcdUsage': Item.GetValidUsage(),
+                     'PcdItemType': Item.GetItemType()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     #
@@ -411,7 +424,7 @@ def ValidateMS1(Module, TopXmlTreeLevel):
     XmlTreeLevel = TopXmlTreeLevel + ['Externs']
     for Item in Module.GetExternList():
         if Item is None:
-            CheckDict = {'Extern':''}
+            CheckDict = {'Extern': ''}
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     #
@@ -426,7 +439,8 @@ def ValidateMS1(Module, TopXmlTreeLevel):
             if not IsEqualList(Item.SupArchList, Module.SupArchList):
                 Logger.Error('\nUPT',
                              PARSER_ERROR,
-                             ERR_XML_INVALID_EXTERN_SUPARCHLIST % (str(Item.SupArchList), str(Module.SupArchList)),
+                             ERR_XML_INVALID_EXTERN_SUPARCHLIST % (
+                                 str(Item.SupArchList), str(Module.SupArchList)),
                              RaiseError=True)
 
     #
@@ -434,7 +448,8 @@ def ValidateMS1(Module, TopXmlTreeLevel):
     #
     XmlTreeLevel = TopXmlTreeLevel + ['UserExtensions']
     for Item in Module.GetUserExtensionList():
-        CheckDict = {'UserId':Item.GetUserID(), 'Identifier':Item.GetIdentifier()}
+        CheckDict = {'UserId': Item.GetUserID(
+        ), 'Identifier': Item.GetIdentifier()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     #
@@ -448,13 +463,15 @@ def ValidateMS1(Module, TopXmlTreeLevel):
         for File in Item.GetFileList():
             CheckDict = {'Filename': File.GetURI(), }
 
-## ValidateMS2
+# ValidateMS2
 #
 # Check if any required item is missing in ModuleSurfaceArea
 #
 # @param Module: The ModuleSurfaceArea to be checked
 # @param XmlTreeLevel: The top level of Module
 #
+
+
 def ValidateMS2(Module, TopXmlTreeLevel):
     #
     # Check Header
@@ -471,20 +488,21 @@ def ValidateMS2(Module, TopXmlTreeLevel):
     # Check ModuleProperties
     #
     XmlTreeLevel = TopXmlTreeLevel + ['ModuleProperties']
-    CheckDict = {'ModuleType':Module.GetModuleType(),
-                 'Path':Module.GetModulePath()}
+    CheckDict = {'ModuleType': Module.GetModuleType(),
+                 'Path': Module.GetModulePath()}
     IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     if not IsValidInstallPath(Module.GetModulePath()):
-        Logger.Error("UPT", FORMAT_INVALID, ERR_FILE_NAME_INVALIDE % Module.GetModulePath())
+        Logger.Error("UPT", FORMAT_INVALID, ERR_FILE_NAME_INVALIDE %
+                     Module.GetModulePath())
 
     #
     # Check ModuleProperties->BootMode
     #
     XmlTreeLevel = TopXmlTreeLevel + ['ModuleProperties'] + ['BootMode']
     for Item in Module.GetBootModeList():
-        CheckDict = {'Usage':Item.GetUsage(),
-                     'SupportedBootModes':Item.GetSupportedBootModes()}
+        CheckDict = {'Usage': Item.GetUsage(),
+                     'SupportedBootModes': Item.GetSupportedBootModes()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     #
@@ -492,8 +510,8 @@ def ValidateMS2(Module, TopXmlTreeLevel):
     #
     XmlTreeLevel = TopXmlTreeLevel + ['ModuleProperties'] + ['Event']
     for Item in Module.GetEventList():
-        CheckDict = {'Usage':Item.GetUsage(),
-                     'EventType':Item.GetEventType()}
+        CheckDict = {'Usage': Item.GetUsage(),
+                     'EventType': Item.GetEventType()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     #
@@ -501,8 +519,8 @@ def ValidateMS2(Module, TopXmlTreeLevel):
     #
     XmlTreeLevel = TopXmlTreeLevel + ['ModuleProperties'] + ['HOB']
     for Item in Module.GetHobList():
-        CheckDict = {'Usage':Item.GetUsage(),
-                     'HobType':Item.GetHobType()}
+        CheckDict = {'Usage': Item.GetUsage(),
+                     'HobType': Item.GetHobType()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     #
@@ -513,11 +531,11 @@ def ValidateMS2(Module, TopXmlTreeLevel):
     if Module.ModuleType == "UEFI_RUNTIME_DRIVER":
         Module.ModuleType = "DXE_RUNTIME_DRIVER"
         DxeObj = DepexObject()
-        DxeObj.SetDepex("gEfiBdsArchProtocolGuid AND \ngEfiCpuArchProtocolGuid AND\n" + \
-                        "gEfiMetronomeArchProtocolGuid AND \ngEfiMonotonicCounterArchProtocolGuid AND\n" + \
-                        "gEfiRealTimeClockArchProtocolGuid AND \ngEfiResetArchProtocolGuid AND\n" + \
-                        "gEfiRuntimeArchProtocolGuid AND \ngEfiSecurityArchProtocolGuid AND\n" + \
-                        "gEfiTimerArchProtocolGuid AND \ngEfiVariableWriteArchProtocolGuid AND\n" + \
+        DxeObj.SetDepex("gEfiBdsArchProtocolGuid AND \ngEfiCpuArchProtocolGuid AND\n" +
+                        "gEfiMetronomeArchProtocolGuid AND \ngEfiMonotonicCounterArchProtocolGuid AND\n" +
+                        "gEfiRealTimeClockArchProtocolGuid AND \ngEfiResetArchProtocolGuid AND\n" +
+                        "gEfiRuntimeArchProtocolGuid AND \ngEfiSecurityArchProtocolGuid AND\n" +
+                        "gEfiTimerArchProtocolGuid AND \ngEfiVariableWriteArchProtocolGuid AND\n" +
                         "gEfiVariableArchProtocolGuid AND \ngEfiWatchdogTimerArchProtocolGuid")
         DxeObj.SetModuleType(['DXE_RUNTIME_DRIVER'])
         Module.PeiDepex = []
@@ -531,16 +549,17 @@ def ValidateMS2(Module, TopXmlTreeLevel):
     XmlTreeLevel = TopXmlTreeLevel + ['LibraryClassDefinitions']
     for Item in Module.GetLibraryClassList():
         if Item is None:
-            CheckDict = {'LibraryClass':''}
+            CheckDict = {'LibraryClass': ''}
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
-    XmlTreeLevel = TopXmlTreeLevel + ['LibraryClassDefinitions', 'LibraryClass']
+    XmlTreeLevel = TopXmlTreeLevel + \
+        ['LibraryClassDefinitions', 'LibraryClass']
 
     IsLibraryModule = False
     LibrarySupModList = []
     for Item in Module.GetLibraryClassList():
-        CheckDict = {'Keyword':Item.GetLibraryClass(),
-                     'Usage':Item.GetUsage()}
+        CheckDict = {'Keyword': Item.GetLibraryClass(),
+                     'Usage': Item.GetUsage()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
         #
         # If the LibraryClass:SupModList is not "UNDEFINED" the LIBRARY_CLASS entry must have the list
@@ -559,14 +578,14 @@ def ValidateMS2(Module, TopXmlTreeLevel):
                 if not IsValidInfMoudleType(SupModule):
                     Logger.Error('\nUPT',
                                  PARSER_ERROR,
-                                 ERR_XML_INVALID_LIB_SUPMODLIST % (Item.LibraryClass, str(SupModule)),
+                                 ERR_XML_INVALID_LIB_SUPMODLIST % (
+                                     Item.LibraryClass, str(SupModule)),
                                  RaiseError=True)
 
         if Item.Usage == 'PRODUCES' or Item.Usage == 'SOMETIMES_PRODUCES':
             IsLibraryModule = True
             LibrarySupModList = Item.SupModuleList
 
-
     #
     # For Library modules (indicated by a LIBRARY_CLASS statement in the [Defines] section)
     # If the SupModList attribute of the CONSTRUCTOR or DESTRUCTOR element does not match the Supported Module
@@ -580,9 +599,10 @@ def ValidateMS2(Module, TopXmlTreeLevel):
                 if hasattr(Item, 'SupModList') and len(Item.SupModList) > 0 and \
                    not IsEqualList(Item.SupModList, LibrarySupModList):
                     Logger.Error('\nUPT',
-                         PARSER_ERROR,
-                         ERR_XML_INVALID_EXTERN_SUPMODLIST % (str(Item.SupModList), str(LibrarySupModList)),
-                         RaiseError=True)
+                                 PARSER_ERROR,
+                                 ERR_XML_INVALID_EXTERN_SUPMODLIST % (
+                                     str(Item.SupModList), str(LibrarySupModList)),
+                                 RaiseError=True)
 
     #
     # If the module is not a library module, the MODULE_TYPE listed in the ModuleSurfaceArea.Header must match the
@@ -594,21 +614,22 @@ def ValidateMS2(Module, TopXmlTreeLevel):
             if hasattr(Item, 'SupModList') and len(Item.SupModList) > 0 and \
                not IsEqualList(Item.SupModList, [Module.ModuleType]):
                 Logger.Error('\nUPT',
-                     PARSER_ERROR,
-                     ERR_XML_INVALID_EXTERN_SUPMODLIST_NOT_LIB % (str(Module.ModuleType), str(Item.SupModList)),
-                     RaiseError=True)
+                             PARSER_ERROR,
+                             ERR_XML_INVALID_EXTERN_SUPMODLIST_NOT_LIB % (
+                                 str(Module.ModuleType), str(Item.SupModList)),
+                             RaiseError=True)
     #
     # Check SourceFiles
     #
     XmlTreeLevel = TopXmlTreeLevel + ['SourceFiles']
     for Item in Module.GetSourceFileList():
         if Item is None:
-            CheckDict = {'Filename':''}
+            CheckDict = {'Filename': ''}
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     XmlTreeLevel = TopXmlTreeLevel + ['SourceFiles']
     for Item in Module.GetSourceFileList():
-        CheckDict = {'Filename':Item.GetSourceFile()}
+        CheckDict = {'Filename': Item.GetSourceFile()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     for ItemCount in range(len(Module.GetBinaryFileList())):
@@ -617,13 +638,15 @@ def ValidateMS2(Module, TopXmlTreeLevel):
             Item.FileNamList[0].FileType = 'SUBTYPE_GUID'
             Module.GetBinaryFileList()[ItemCount] = Item
 
-## ValidateMS3
+# ValidateMS3
 #
 # Check if any required item is missing in ModuleSurfaceArea
 #
 # @param Module: The ModuleSurfaceArea to be checked
 # @param XmlTreeLevel: The top level of Module
 #
+
+
 def ValidateMS3(Module, TopXmlTreeLevel):
     #
     # Check PackageDependencies -> Package
@@ -631,12 +654,12 @@ def ValidateMS3(Module, TopXmlTreeLevel):
     XmlTreeLevel = TopXmlTreeLevel + ['PackageDependencies']
     for Item in Module.GetPackageDependencyList():
         if Item is None:
-            CheckDict = {'Package':''}
+            CheckDict = {'Package': ''}
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     XmlTreeLevel = TopXmlTreeLevel + ['PackageDependencies', 'Package']
     for Item in Module.GetPackageDependencyList():
-        CheckDict = {'GUID':Item.GetGuid()}
+        CheckDict = {'GUID': Item.GetGuid()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     #
@@ -645,50 +668,53 @@ def ValidateMS3(Module, TopXmlTreeLevel):
     for Item in Module.GetBinaryFileList():
         if Item is None:
             XmlTreeLevel = TopXmlTreeLevel + ['BinaryFiles']
-            CheckDict = {'BinaryFile':''}
+            CheckDict = {'BinaryFile': ''}
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
         if not Item.GetFileNameList():
             XmlTreeLevel = TopXmlTreeLevel + ['BinaryFiles', 'BinaryFile']
-            CheckDict = {'Filename':''}
+            CheckDict = {'Filename': ''}
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
         XmlTreeLevel = TopXmlTreeLevel + ['BinaryFiles', 'BinaryFile']
         for File in Item.GetFileNameList():
-            CheckDict = {'Filename':File.GetFilename(),
-                         'FileType':File.GetFileType()}
+            CheckDict = {'Filename': File.GetFilename(),
+                         'FileType': File.GetFileType()}
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
         for AsBuilt in Item.GetAsBuiltList():
             #
             # Check LibInstance
             #
             if len(AsBuilt.LibraryInstancesList) == 1 and not AsBuilt.LibraryInstancesList[0]:
-                CheckDict = {'GUID':''}
-                XmlTreeLevel = TopXmlTreeLevel + ['BinaryFiles', 'BinaryFile', 'AsBuilt', 'LibraryInstances']
+                CheckDict = {'GUID': ''}
+                XmlTreeLevel = TopXmlTreeLevel + \
+                    ['BinaryFiles', 'BinaryFile', 'AsBuilt', 'LibraryInstances']
                 IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
             for LibItem in AsBuilt.LibraryInstancesList:
-                CheckDict = {'Guid':LibItem.Guid,
-                             'Version':LibItem.Version}
-                XmlTreeLevel = TopXmlTreeLevel + ['BinaryFiles', 'BinaryFile', 'AsBuilt', 'LibraryInstances']
+                CheckDict = {'Guid': LibItem.Guid,
+                             'Version': LibItem.Version}
+                XmlTreeLevel = TopXmlTreeLevel + \
+                    ['BinaryFiles', 'BinaryFile', 'AsBuilt', 'LibraryInstances']
                 IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
             #
             # Check PatchPcd
             #
             for PatchPcdItem in AsBuilt.PatchPcdList:
-                CheckDict = {'TokenSpaceGuidValue':PatchPcdItem.TokenSpaceGuidValue,
-                             'PcdCName':PatchPcdItem.PcdCName,
-                             'Token':PatchPcdItem.Token,
-                             'DatumType':PatchPcdItem.DatumType,
-                             'Value':PatchPcdItem.DefaultValue,
-                             'Offset':PatchPcdItem.Offset}
-                XmlTreeLevel = TopXmlTreeLevel + ['BinaryFiles', 'BinaryFile', 'AsBuilt', 'PatchPcdValue']
+                CheckDict = {'TokenSpaceGuidValue': PatchPcdItem.TokenSpaceGuidValue,
+                             'PcdCName': PatchPcdItem.PcdCName,
+                             'Token': PatchPcdItem.Token,
+                             'DatumType': PatchPcdItem.DatumType,
+                             'Value': PatchPcdItem.DefaultValue,
+                             'Offset': PatchPcdItem.Offset}
+                XmlTreeLevel = TopXmlTreeLevel + \
+                    ['BinaryFiles', 'BinaryFile', 'AsBuilt', 'PatchPcdValue']
                 IsRequiredItemListNull(CheckDict, XmlTreeLevel)
                 #
                 # Check PcdError
                 #
                 for PcdErrorItem in PatchPcdItem.PcdErrorsList:
-                    CheckDict = {'ErrorNumber':PcdErrorItem.ErrorNumber}
+                    CheckDict = {'ErrorNumber': PcdErrorItem.ErrorNumber}
                     XmlTreeLevel = TopXmlTreeLevel + ['BinaryFiles', 'BinaryFile', 'AsBuilt',
                                                       'PatchPcdValue', 'PcdError']
                     IsRequiredItemListNull(CheckDict, XmlTreeLevel)
@@ -696,16 +722,17 @@ def ValidateMS3(Module, TopXmlTreeLevel):
             # Check PcdEx
             #
             for PcdExItem in AsBuilt.PcdExValueList:
-                CheckDict = {'TokenSpaceGuidValue':PcdExItem.TokenSpaceGuidValue,
-                             'Token':PcdExItem.Token,
-                             'DatumType':PcdExItem.DatumType}
-                XmlTreeLevel = TopXmlTreeLevel + ['BinaryFiles', 'BinaryFile', 'AsBuilt', 'PcdExValue']
+                CheckDict = {'TokenSpaceGuidValue': PcdExItem.TokenSpaceGuidValue,
+                             'Token': PcdExItem.Token,
+                             'DatumType': PcdExItem.DatumType}
+                XmlTreeLevel = TopXmlTreeLevel + \
+                    ['BinaryFiles', 'BinaryFile', 'AsBuilt', 'PcdExValue']
                 IsRequiredItemListNull(CheckDict, XmlTreeLevel)
                 #
                 # Check PcdError
                 #
                 for PcdErrorItem in PcdExItem.PcdErrorsList:
-                    CheckDict = {'ErrorNumber':PcdErrorItem.ErrorNumber}
+                    CheckDict = {'ErrorNumber': PcdErrorItem.ErrorNumber}
                     XmlTreeLevel = TopXmlTreeLevel + ['BinaryFiles', 'BinaryFile', 'AsBuilt',
                                                       'PcdExValue', 'PcdError']
                     IsRequiredItemListNull(CheckDict, XmlTreeLevel)
@@ -714,7 +741,7 @@ def ValidateMS3(Module, TopXmlTreeLevel):
     #
     XmlTreeLevel = TopXmlTreeLevel + ['SmmDepex']
     for Item in Module.GetSmmDepex():
-        CheckDict = {'Expression':Item.GetDepex()}
+        CheckDict = {'Expression': Item.GetDepex()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     #
@@ -722,7 +749,7 @@ def ValidateMS3(Module, TopXmlTreeLevel):
     #
     XmlTreeLevel = TopXmlTreeLevel + ['PeiDepex']
     for Item in Module.GetPeiDepex():
-        CheckDict = {'Expression':Item.GetDepex()}
+        CheckDict = {'Expression': Item.GetDepex()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     #
@@ -730,7 +757,7 @@ def ValidateMS3(Module, TopXmlTreeLevel):
     #
     XmlTreeLevel = TopXmlTreeLevel + ['DxeDepex']
     for Item in Module.GetDxeDepex():
-        CheckDict = {'Expression':Item.GetDepex()}
+        CheckDict = {'Expression': Item.GetDepex()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     #
@@ -738,13 +765,16 @@ def ValidateMS3(Module, TopXmlTreeLevel):
     #
     XmlTreeLevel = TopXmlTreeLevel + ['UserExtensions']
     for Item in Module.GetUserExtensionList():
-        CheckDict = {'UserId':Item.GetUserID(), 'Identifier':Item.GetIdentifier()}
+        CheckDict = {'UserId': Item.GetUserID(
+        ), 'Identifier': Item.GetIdentifier()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
-## ValidatePS1
+# ValidatePS1
 #
 # ValidatePS1
 #
+
+
 def ValidatePS1(Package):
     #
     # Check DistributionPackage -> PackageSurfaceArea -> Header
@@ -759,7 +789,8 @@ def ValidatePS1(Package):
 
     IsRequiredItemListNull(CheckDict, XmlTreeLevel)
     if not IsValidInstallPath(Package.GetPackagePath()):
-        Logger.Error("UPT", FORMAT_INVALID, ERR_FILE_NAME_INVALIDE % Package.GetPackagePath())
+        Logger.Error("UPT", FORMAT_INVALID, ERR_FILE_NAME_INVALIDE %
+                     Package.GetPackagePath())
 
     #
     # Check DistributionPackage -> PackageSurfaceArea -> ClonedFrom
@@ -779,134 +810,154 @@ def ValidatePS1(Package):
     #
     # Check DistributionPackage -> PackageSurfaceArea -> LibraryClassDeclarations -> LibraryClass
     #
-    XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea', 'LibraryClassDeclarations']
+    XmlTreeLevel = ['DistributionPackage',
+                    'PackageSurfaceArea', 'LibraryClassDeclarations']
     for Item in Package.GetLibraryClassList():
         if Item is None:
-            CheckDict = {'LibraryClass':''}
+            CheckDict = {'LibraryClass': ''}
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
-    XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea', 'LibraryClassDeclarations', 'LibraryClass']
+    XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea',
+                    'LibraryClassDeclarations', 'LibraryClass']
     for Item in Package.GetLibraryClassList():
-        CheckDict = {'Keyword':Item.GetLibraryClass(),
-                     'HeaderFile':Item.GetIncludeHeader()}
+        CheckDict = {'Keyword': Item.GetLibraryClass(),
+                     'HeaderFile': Item.GetIncludeHeader()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     #
     # Check DistributionPackage -> PackageSurfaceArea -> IndustryStandardIncludes -> IndustryStandardHeader
     #
-    XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea', 'IndustryStandardIncludes']
+    XmlTreeLevel = ['DistributionPackage',
+                    'PackageSurfaceArea', 'IndustryStandardIncludes']
     for Item in Package.GetStandardIncludeFileList():
         if Item is None:
-            CheckDict = {'IndustryStandardHeader':''}
+            CheckDict = {'IndustryStandardHeader': ''}
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
-    XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea', 'IndustryStandardIncludes', 'IndustryStandardHeader']
+    XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea',
+                    'IndustryStandardIncludes', 'IndustryStandardHeader']
     for Item in Package.GetStandardIncludeFileList():
-        CheckDict = {'HeaderFile':Item.GetFilePath()}
+        CheckDict = {'HeaderFile': Item.GetFilePath()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     #
     # Check DistributionPackage -> PackageSurfaceArea -> PackageIncludes -> PackageHeader
     #
-    XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea', 'PackageIncludes']
+    XmlTreeLevel = ['DistributionPackage',
+                    'PackageSurfaceArea', 'PackageIncludes']
     for Item in Package.GetPackageIncludeFileList():
         if Item is None:
-            CheckDict = {'PackageHeader':''}
+            CheckDict = {'PackageHeader': ''}
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
-    XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea', 'PackageIncludes', 'PackageHeader']
+    XmlTreeLevel = ['DistributionPackage',
+                    'PackageSurfaceArea', 'PackageIncludes', 'PackageHeader']
     for Item in Package.GetPackageIncludeFileList():
-        CheckDict = {'HeaderFile':Item.GetFilePath()}
+        CheckDict = {'HeaderFile': Item.GetFilePath()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
-## ValidatePS2
+# ValidatePS2
 #
 # ValidatePS2
 #
+
+
 def ValidatePS2(Package):
     #
     # Check DistributionPackage -> PackageSurfaceArea -> Modules -> ModuleSurfaceArea
     #
-    XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea', 'Modules', 'ModuleSurfaceArea']
+    XmlTreeLevel = ['DistributionPackage',
+                    'PackageSurfaceArea', 'Modules', 'ModuleSurfaceArea']
     for Item in Package.GetModuleDict().values():
         ValidateMS(Item, XmlTreeLevel)
 
     #
     # Check DistributionPackage -> PackageSurfaceArea -> GuidDeclarations Entry
     #
-    XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea', 'GuidDeclarations']
+    XmlTreeLevel = ['DistributionPackage',
+                    'PackageSurfaceArea', 'GuidDeclarations']
     for Item in Package.GetGuidList():
         if Item is None:
-            CheckDict = {'Entry':''}
+            CheckDict = {'Entry': ''}
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
-    XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea', 'GuidDeclarations', 'Entry']
+    XmlTreeLevel = ['DistributionPackage',
+                    'PackageSurfaceArea', 'GuidDeclarations', 'Entry']
     for Item in Package.GetGuidList():
-        CheckDict = {'CName':Item.GetCName(),
-                     'GuidValue':Item.GetGuid()}
+        CheckDict = {'CName': Item.GetCName(),
+                     'GuidValue': Item.GetGuid()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     #
     # Check DistributionPackage -> PackageSurfaceArea -> ProtocolDeclarations -> Entry
     #
-    XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea', 'ProtocolDeclarations']
+    XmlTreeLevel = ['DistributionPackage',
+                    'PackageSurfaceArea', 'ProtocolDeclarations']
     for Item in Package.GetProtocolList():
         if Item is None:
-            CheckDict = {'Entry':''}
+            CheckDict = {'Entry': ''}
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
-    XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea', 'ProtocolDeclarations', 'Entry']
+    XmlTreeLevel = ['DistributionPackage',
+                    'PackageSurfaceArea', 'ProtocolDeclarations', 'Entry']
     for Item in Package.GetProtocolList():
-        CheckDict = {'CName':Item.GetCName(),
-                     'GuidValue':Item.GetGuid()}
+        CheckDict = {'CName': Item.GetCName(),
+                     'GuidValue': Item.GetGuid()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     #
     # Check DistributionPackage -> PackageSurfaceArea -> PpiDeclarations -> Entry
     #
-    XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea', 'PpiDeclarations']
+    XmlTreeLevel = ['DistributionPackage',
+                    'PackageSurfaceArea', 'PpiDeclarations']
     for Item in Package.GetPpiList():
         if Item is None:
-            CheckDict = {'Entry':''}
+            CheckDict = {'Entry': ''}
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
-    XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea', 'PpiDeclarations', 'Entry']
+    XmlTreeLevel = ['DistributionPackage',
+                    'PackageSurfaceArea', 'PpiDeclarations', 'Entry']
     for Item in Package.GetPpiList():
-        CheckDict = {'CName':Item.GetCName(),
-                     'GuidValue':Item.GetGuid()}
+        CheckDict = {'CName': Item.GetCName(),
+                     'GuidValue': Item.GetGuid()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     #
     # Check DistributionPackage -> PackageSurfaceArea -> PcdDeclarations -> Entry
     #
-    XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea', 'PcdDeclarations']
+    XmlTreeLevel = ['DistributionPackage',
+                    'PackageSurfaceArea', 'PcdDeclarations']
     for Item in Package.GetPcdList():
         if Item is None:
-            CheckDict = {'PcdEntry':''}
+            CheckDict = {'PcdEntry': ''}
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
-    XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea', 'PcdDeclarations', 'PcdEntry']
+    XmlTreeLevel = ['DistributionPackage',
+                    'PackageSurfaceArea', 'PcdDeclarations', 'PcdEntry']
     for Item in Package.GetPcdList():
-        CheckDict = {'TokenSpaceGuidCname':Item.GetTokenSpaceGuidCName(),
-                     'Token':Item.GetToken(),
-                     'CName':Item.GetCName(),
-                     'DatumType':Item.GetDatumType(),
-                     'ValidUsage':Item.GetValidUsage(),
-                     'DefaultValue':Item.GetDefaultValue()}
+        CheckDict = {'TokenSpaceGuidCname': Item.GetTokenSpaceGuidCName(),
+                     'Token': Item.GetToken(),
+                     'CName': Item.GetCName(),
+                     'DatumType': Item.GetDatumType(),
+                     'ValidUsage': Item.GetValidUsage(),
+                     'DefaultValue': Item.GetDefaultValue()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     #
     # Check DistributionPackage -> PackageSurfaceArea -> UserExtensions
     #
-    XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea', 'UserExtensions']
+    XmlTreeLevel = ['DistributionPackage',
+                    'PackageSurfaceArea', 'UserExtensions']
     for Item in Package.GetUserExtensionList():
-        CheckDict = {'UserId':Item.GetUserID(), 'Identifier':Item.GetIdentifier()}
+        CheckDict = {'UserId': Item.GetUserID(
+        ), 'Identifier': Item.GetIdentifier()}
         IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
     #
     # Check DistributionPackage -> PackageSurfaceArea -> MiscellaneousFiles -> Filename
     #
-    XmlTreeLevel = ['DistributionPackage', 'PackageSurfaceArea', 'MiscellaneousFiles']
+    XmlTreeLevel = ['DistributionPackage',
+                    'PackageSurfaceArea', 'MiscellaneousFiles']
     for Item in Package.GetMiscFileList():
         if not Item.GetFileList():
             CheckDict = {'Filename': '', }
@@ -915,12 +966,14 @@ def ValidatePS2(Package):
             CheckDict = {'Filename': File.GetURI(), }
             IsRequiredItemListNull(CheckDict, XmlTreeLevel)
 
-## ValidatePackageSurfaceArea
+# ValidatePackageSurfaceArea
 #
 # Check if any required item is missing in  PackageSurfaceArea
 #
 # @param Package: The PackageSurfaceArea to be checked
 #
+
+
 def ValidatePackageSurfaceArea(Package):
     ValidatePS1(Package)
     ValidatePS2(Package)
diff --git a/BaseTools/Source/Python/UPT/Xml/XmlParserMisc.py b/BaseTools/Source/Python/UPT/Xml/XmlParserMisc.py
index 48381ee8c6bf..3680f667ae57 100644
--- a/BaseTools/Source/Python/UPT/Xml/XmlParserMisc.py
+++ b/BaseTools/Source/Python/UPT/Xml/XmlParserMisc.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to parse an XML file of a .PKG file
 #
 # Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -14,7 +14,7 @@ from Logger.StringTable import ERR_XML_PARSER_REQUIRED_ITEM_MISSING
 from Logger.ToolError import PARSER_ERROR
 import Logger.Log as Logger
 
-## ConvertVariableName()
+# ConvertVariableName()
 # Convert VariableName to be L"string",
 # input of UCS-2 format Hex Array or L"string" (C style.) could be converted successfully,
 # others will not.
@@ -22,6 +22,8 @@ import Logger.Log as Logger
 # @param VariableName: string need to be converted
 # @retval: the L quoted string converted if success, else None will be returned
 #
+
+
 def ConvertVariableName(VariableName):
     VariableName = VariableName.strip()
     #
@@ -34,7 +36,7 @@ def ConvertVariableName(VariableName):
     # check for Hex Array, it should be little endian even number of hex numbers
     #
     ValueList = VariableName.split(' ')
-    if len(ValueList)%2 == 1:
+    if len(ValueList) % 2 == 1:
         return None
 
     TransferedStr = ''
@@ -49,18 +51,20 @@ def ConvertVariableName(VariableName):
 
         if FirstByte not in range(0x20, 0x7F):
             return None
-        TransferedStr += ('%c')%FirstByte
+        TransferedStr += ('%c') % FirstByte
         Index = Index + 2
 
     return 'L"' + TransferedStr + '"'
 
-## IsRequiredItemListNull
+# IsRequiredItemListNull
 #
 # Check if a required XML section item/attribute is NULL
 #
 # @param ItemList:     The list of items to be checked
 # @param XmlTreeLevel: The error message tree level
 #
+
+
 def IsRequiredItemListNull(ItemDict, XmlTreeLevel):
     for Key in ItemDict:
         if not ItemDict[Key]:
@@ -68,10 +72,12 @@ def IsRequiredItemListNull(ItemDict, XmlTreeLevel):
             ErrorMsg = ERR_XML_PARSER_REQUIRED_ITEM_MISSING % (Key, Msg)
             Logger.Error('\nUPT', PARSER_ERROR, ErrorMsg, RaiseError=True)
 
-## Get help text
+# Get help text
 #
 # @param HelpText
 #
+
+
 def GetHelpTextList(HelpText):
     HelpTextList = []
     for HelT in HelpText:
@@ -81,10 +87,12 @@ def GetHelpTextList(HelpText):
         HelpTextList.append(HelpTextObj)
     return HelpTextList
 
-## Get Prompt text
+# Get Prompt text
 #
 # @param Prompt
 #
+
+
 def GetPromptList(Prompt):
     PromptList = []
     for SubPrompt in Prompt:
diff --git a/BaseTools/Source/Python/UPT/Xml/__init__.py b/BaseTools/Source/Python/UPT/Xml/__init__.py
index 172e498451b8..03dedeed636e 100644
--- a/BaseTools/Source/Python/UPT/Xml/__init__.py
+++ b/BaseTools/Source/Python/UPT/Xml/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'Library' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/Workspace/BuildClassObject.py b/BaseTools/Source/Python/Workspace/BuildClassObject.py
index ef873720f455..08c32ca08c2c 100644
--- a/BaseTools/Source/Python/Workspace/BuildClassObject.py
+++ b/BaseTools/Source/Python/Workspace/BuildClassObject.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to define each component of the build database
 #
 # Copyright (c) 2007 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -10,7 +10,7 @@ from Common.DataType import *
 import collections
 import re
 from collections import OrderedDict
-from Common.Misc import CopyDict,ArrayIndex
+from Common.Misc import CopyDict, ArrayIndex
 import copy
 from CommonDataClass.DataClass import *
 import Common.EdkLogger as EdkLogger
@@ -19,7 +19,7 @@ from Common.BuildToolError import OPTION_VALUE_INVALID
 from Common.caching import cached_property
 StructPattern = re.compile(r'[_a-zA-Z][0-9A-Za-z_\[\]]*$')
 
-## PcdClassObject
+# PcdClassObject
 #
 # This Class is used for PcdObject
 #
@@ -45,8 +45,10 @@ StructPattern = re.compile(r'[_a-zA-Z][0-9A-Za-z_\[\]]*$')
 # @var IsOverrided:          To store value for IsOverrided
 # @var Phase:                To store value for Phase, default is "DXE"
 #
+
+
 class PcdClassObject(object):
-    def __init__(self, Name = None, Guid = None, Type = None, DatumType = None, Value = None, Token = None, MaxDatumSize = None, SkuInfoList = None, IsOverrided = False, GuidValue = None, validateranges = None, validlists = None, expressions = None, IsDsc = False, UserDefinedDefaultStoresFlag = False):
+    def __init__(self, Name=None, Guid=None, Type=None, DatumType=None, Value=None, Token=None, MaxDatumSize=None, SkuInfoList=None, IsOverrided=False, GuidValue=None, validateranges=None, validlists=None, expressions=None, IsDsc=False, UserDefinedDefaultStoresFlag=False):
         self.TokenCName = Name
         self.TokenSpaceGuidCName = Guid
         self.TokenSpaceGuidValue = GuidValue
@@ -72,7 +74,7 @@ class PcdClassObject(object):
             self.DscDefaultValue = Value
         self.PcdValueFromComm = ""
         self.PcdValueFromFdf = ""
-        self.PcdValueFromComponents = {} #{ModuleGuid:value, file_path,lineNo}
+        self.PcdValueFromComponents = {}  # {ModuleGuid:value, file_path,lineNo}
         self.CustomAttribute = {}
         self.UserDefinedDefaultStoresFlag = UserDefinedDefaultStoresFlag
         self._Capacity = None
@@ -86,7 +88,8 @@ class PcdClassObject(object):
                 maxsize = item.lstrip("[").rstrip("]").strip()
                 if not maxsize:
                     maxsize = "-1"
-                maxsize = str(int(maxsize,16)) if maxsize.startswith(("0x","0X")) else maxsize
+                maxsize = str(int(maxsize, 16)) if maxsize.startswith(
+                    ("0x", "0X")) else maxsize
                 self._Capacity.append(maxsize)
             if hasattr(self, "SkuOverrideValues"):
                 for sku in self.SkuOverrideValues:
@@ -98,10 +101,11 @@ class PcdClassObject(object):
                             for i in range(len(deme)):
                                 if int(deme[i].lstrip("[").rstrip("]").strip()) >= int(self._Capacity[i]):
                                     if self._Capacity[i] != "-1":
-                                        firstfieldinfo = list(fieldinfo.values())[0]
+                                        firstfieldinfo = list(
+                                            fieldinfo.values())[0]
                                         EdkLogger.error('Build', OPTION_VALUE_INVALID, "For Pcd %s, Array Index exceed the Array size. From %s Line %s \n " %
-                                    (".".join((self.TokenSpaceGuidCName, self.TokenCName)), firstfieldinfo[1],firstfieldinfo[2] ))
-            if hasattr(self,"DefaultValues"):
+                                                        (".".join((self.TokenSpaceGuidCName, self.TokenCName)), firstfieldinfo[1], firstfieldinfo[2]))
+            if hasattr(self, "DefaultValues"):
                 for demesionattr in self.DefaultValues:
                     fieldinfo = self.DefaultValues[demesionattr]
                     deme = ArrayIndex.findall(demesionattr)
@@ -110,7 +114,7 @@ class PcdClassObject(object):
                             if self._Capacity[i] != "-1":
                                 firstfieldinfo = list(fieldinfo.values())[0]
                                 EdkLogger.error('Build', OPTION_VALUE_INVALID, "For Pcd %s, Array Index exceed the Array size. From %s Line %s \n " %
-                                    (".".join((self.TokenSpaceGuidCName, self.TokenCName)), firstfieldinfo[1],firstfieldinfo[2] ))
+                                                (".".join((self.TokenSpaceGuidCName, self.TokenCName)), firstfieldinfo[1], firstfieldinfo[2]))
         return self._Capacity
 
     def PcdArraySize(self):
@@ -120,12 +124,13 @@ class PcdClassObject(object):
         for de in self.Capacity:
             size = size * int(de)
         return size
+
     @property
     def DatumType(self):
         return self._DatumType
 
     @DatumType.setter
-    def DatumType(self,DataType):
+    def DatumType(self, DataType):
         self._DatumType = DataType
         self._Capacity = None
 
@@ -135,6 +140,7 @@ class PcdClassObject(object):
             return self._DatumType[:self._DatumType.index("[")]
         else:
             return self._DatumType
+
     def IsArray(self):
         return True if len(self.Capacity) else False
 
@@ -153,31 +159,32 @@ class PcdClassObject(object):
     @staticmethod
     def GetPcdMaxSizeWorker(PcdString, MaxSize):
         if PcdString.startswith("{") and PcdString.endswith("}"):
-            return  max([len(PcdString.split(",")),MaxSize])
+            return max([len(PcdString.split(",")), MaxSize])
 
         if PcdString.startswith("\"") or PcdString.startswith("\'"):
-            return  max([len(PcdString)-2+1,MaxSize])
+            return max([len(PcdString)-2+1, MaxSize])
 
         if PcdString.startswith("L\""):
-            return  max([2*(len(PcdString)-3+1),MaxSize])
+            return max([2*(len(PcdString)-3+1), MaxSize])
 
-        return max([len(PcdString),MaxSize])
+        return max([len(PcdString), MaxSize])
 
-    ## Get the maximum number of bytes
+    # Get the maximum number of bytes
     def GetPcdMaxSize(self):
         if self.DatumType in TAB_PCD_NUMERIC_TYPES:
             return MAX_SIZE_TYPE[self.DatumType]
 
         MaxSize = int(self.MaxDatumSize, 10) if self.MaxDatumSize else 0
         if self.PcdValueFromFdf:
-            MaxSize = self.GetPcdMaxSizeWorker(self.PcdValueFromFdf,MaxSize)
+            MaxSize = self.GetPcdMaxSizeWorker(self.PcdValueFromFdf, MaxSize)
         if self.PcdValueFromComm:
-            MaxSize = self.GetPcdMaxSizeWorker(self.PcdValueFromComm,MaxSize)
+            MaxSize = self.GetPcdMaxSizeWorker(self.PcdValueFromComm, MaxSize)
         if hasattr(self, "DefaultValueFromDec"):
-            MaxSize = self.GetPcdMaxSizeWorker(self.DefaultValueFromDec,MaxSize)
+            MaxSize = self.GetPcdMaxSizeWorker(
+                self.DefaultValueFromDec, MaxSize)
         return MaxSize
 
-    ## Get the number of bytes
+    # Get the number of bytes
     def GetPcdSize(self):
         if self.DatumType in TAB_PCD_NUMERIC_TYPES:
             return MAX_SIZE_TYPE[self.DatumType]
@@ -190,14 +197,14 @@ class PcdClassObject(object):
         else:
             return len(self.DefaultValue) - 1
 
-
-    ## Convert the class to a string
+    # Convert the class to a string
     #
     #  Convert each member of the class to string
     #  Organize to a single line format string
     #
     #  @retval Rtn Formatted String
     #
+
     def __str__(self):
         Rtn = '\tTokenCName=' + str(self.TokenCName) + ', ' + \
               'TokenSpaceGuidCName=' + str(self.TokenSpaceGuidCName) + ', ' + \
@@ -212,7 +219,7 @@ class PcdClassObject(object):
 
         return Rtn
 
-    ## Override __eq__ function
+    # Override __eq__ function
     #
     # Check whether pcds are the same
     #
@@ -222,7 +229,7 @@ class PcdClassObject(object):
     def __eq__(self, Other):
         return Other and self.TokenCName == Other.TokenCName and self.TokenSpaceGuidCName == Other.TokenSpaceGuidCName
 
-    ## Override __hash__ function
+    # Override __hash__ function
     #
     # Use (TokenCName, TokenSpaceGuidCName) as key in hash table
     #
@@ -233,14 +240,15 @@ class PcdClassObject(object):
 
     @cached_property
     def _fullname(self):
-        return ".".join((self.TokenSpaceGuidCName,self.TokenCName))
+        return ".".join((self.TokenSpaceGuidCName, self.TokenCName))
 
-    def __lt__(self,pcd):
+    def __lt__(self, pcd):
         return self._fullname < pcd._fullname
-    def __gt__(self,pcd):
+
+    def __gt__(self, pcd):
         return self._fullname > pcd._fullname
 
-    def sharedcopy(self,new_pcd):
+    def sharedcopy(self, new_pcd):
         new_pcd.TokenCName = self.TokenCName
         new_pcd.TokenSpaceGuidCName = self.TokenSpaceGuidCName
         new_pcd.TokenSpaceGuidValue = self.TokenSpaceGuidValue
@@ -265,16 +273,18 @@ class PcdClassObject(object):
         new_pcd.validateranges = [item for item in self.validateranges]
         new_pcd.validlists = [item for item in self.validlists]
         new_pcd.expressions = [item for item in self.expressions]
-        new_pcd.SkuInfoList = {key: copy.deepcopy(skuobj) for key,skuobj in self.SkuInfoList.items()}
+        new_pcd.SkuInfoList = {key: copy.deepcopy(
+            skuobj) for key, skuobj in self.SkuInfoList.items()}
         return new_pcd
 
-    def __deepcopy__(self,memo):
+    def __deepcopy__(self, memo):
         new_pcd = PcdClassObject()
         self.sharedcopy(new_pcd)
         return new_pcd
 
+
 class StructurePcd(PcdClassObject):
-    def __init__(self, StructuredPcdIncludeFile=None, Packages=None, Name=None, Guid=None, Type=None, DatumType=None, Value=None, Token=None, MaxDatumSize=None, SkuInfoList=None, IsOverrided=False, GuidValue=None, validateranges=None, validlists=None, expressions=None,default_store = TAB_DEFAULT_STORES_DEFAULT):
+    def __init__(self, StructuredPcdIncludeFile=None, Packages=None, Name=None, Guid=None, Type=None, DatumType=None, Value=None, Token=None, MaxDatumSize=None, SkuInfoList=None, IsOverrided=False, GuidValue=None, validateranges=None, validlists=None, expressions=None, default_store=TAB_DEFAULT_STORES_DEFAULT):
         if SkuInfoList is None:
             SkuInfoList = {}
         if validateranges is None:
@@ -285,8 +295,10 @@ class StructurePcd(PcdClassObject):
             expressions = []
         if Packages is None:
             Packages = []
-        super(StructurePcd, self).__init__(Name, Guid, Type, DatumType, Value, Token, MaxDatumSize, SkuInfoList, IsOverrided, GuidValue, validateranges, validlists, expressions)
-        self.StructuredPcdIncludeFile = [] if StructuredPcdIncludeFile is None else StructuredPcdIncludeFile
+        super(StructurePcd, self).__init__(Name, Guid, Type, DatumType, Value, Token, MaxDatumSize,
+                                           SkuInfoList, IsOverrided, GuidValue, validateranges, validlists, expressions)
+        self.StructuredPcdIncludeFile = [
+        ] if StructuredPcdIncludeFile is None else StructuredPcdIncludeFile
         self.PackageDecs = Packages
         self.DefaultStoreName = [default_store]
         self.DefaultValues = OrderedDict()
@@ -300,41 +312,49 @@ class StructurePcd(PcdClassObject):
         self.ValueChain = set()
         self.PcdFieldValueFromComm = OrderedDict()
         self.PcdFieldValueFromFdf = OrderedDict()
-        self.DefaultFromDSC=None
+        self.DefaultFromDSC = None
         self.PcdFiledValueFromDscComponent = OrderedDict()
+
     def __repr__(self):
         return self.TypeName
 
-    def AddDefaultValue (self, FieldName, Value, FileName="", LineNo=0,DimensionAttr ="-1"):
+    def AddDefaultValue(self, FieldName, Value, FileName="", LineNo=0, DimensionAttr="-1"):
         if DimensionAttr not in self.DefaultValues:
             self.DefaultValues[DimensionAttr] = collections.OrderedDict()
         if FieldName in self.DefaultValues[DimensionAttr]:
             del self.DefaultValues[DimensionAttr][FieldName]
-        self.DefaultValues[DimensionAttr][FieldName] = [Value.strip(), FileName, LineNo]
+        self.DefaultValues[DimensionAttr][FieldName] = [
+            Value.strip(), FileName, LineNo]
         return self.DefaultValues[DimensionAttr][FieldName]
 
-    def SetDecDefaultValue(self, DefaultValue,decpath=None,lineno=None):
+    def SetDecDefaultValue(self, DefaultValue, decpath=None, lineno=None):
         self.DefaultValueFromDec = DefaultValue
-        self.DefaultValueFromDecInfo = (decpath,lineno)
-    def AddOverrideValue (self, FieldName, Value, SkuName, DefaultStoreName, FileName="", LineNo=0, DimensionAttr = '-1'):
+        self.DefaultValueFromDecInfo = (decpath, lineno)
+
+    def AddOverrideValue(self, FieldName, Value, SkuName, DefaultStoreName, FileName="", LineNo=0, DimensionAttr='-1'):
         if SkuName not in self.SkuOverrideValues:
             self.SkuOverrideValues[SkuName] = OrderedDict()
         if DefaultStoreName not in self.SkuOverrideValues[SkuName]:
             self.SkuOverrideValues[SkuName][DefaultStoreName] = OrderedDict()
         if DimensionAttr not in self.SkuOverrideValues[SkuName][DefaultStoreName]:
-            self.SkuOverrideValues[SkuName][DefaultStoreName][DimensionAttr] = collections.OrderedDict()
+            self.SkuOverrideValues[SkuName][DefaultStoreName][DimensionAttr] = collections.OrderedDict(
+            )
         if FieldName in self.SkuOverrideValues[SkuName][DefaultStoreName][DimensionAttr]:
             del self.SkuOverrideValues[SkuName][DefaultStoreName][DimensionAttr][FieldName]
-        self.SkuOverrideValues[SkuName][DefaultStoreName][DimensionAttr][FieldName] = [Value.strip(), FileName, LineNo]
+        self.SkuOverrideValues[SkuName][DefaultStoreName][DimensionAttr][FieldName] = [
+            Value.strip(), FileName, LineNo]
         return self.SkuOverrideValues[SkuName][DefaultStoreName][DimensionAttr][FieldName]
 
-    def AddComponentOverrideValue(self,FieldName, Value, ModuleGuid, FileName="", LineNo=0, DimensionAttr = '-1'):
-        self.PcdFiledValueFromDscComponent.setdefault(ModuleGuid, OrderedDict())
-        self.PcdFiledValueFromDscComponent[ModuleGuid].setdefault(DimensionAttr,OrderedDict())
-        self.PcdFiledValueFromDscComponent[ModuleGuid][DimensionAttr][FieldName] =  [Value.strip(), FileName, LineNo]
+    def AddComponentOverrideValue(self, FieldName, Value, ModuleGuid, FileName="", LineNo=0, DimensionAttr='-1'):
+        self.PcdFiledValueFromDscComponent.setdefault(
+            ModuleGuid, OrderedDict())
+        self.PcdFiledValueFromDscComponent[ModuleGuid].setdefault(
+            DimensionAttr, OrderedDict())
+        self.PcdFiledValueFromDscComponent[ModuleGuid][DimensionAttr][FieldName] = [
+            Value.strip(), FileName, LineNo]
         return self.PcdFiledValueFromDscComponent[ModuleGuid][DimensionAttr][FieldName]
 
-    def SetPcdMode (self, PcdMode):
+    def SetPcdMode(self, PcdMode):
         self.PcdMode = PcdMode
 
     def copy(self, PcdObject):
@@ -343,7 +363,7 @@ class StructurePcd(PcdClassObject):
         self.TokenSpaceGuidValue = PcdObject.TokenSpaceGuidValue if PcdObject.TokenSpaceGuidValue else self.TokenSpaceGuidValue
         self.Type = PcdObject.Type if PcdObject.Type else self.Type
         self._DatumType = PcdObject.DatumType if PcdObject.DatumType else self.DatumType
-        self.DefaultValue = PcdObject.DefaultValue if  PcdObject.DefaultValue else self.DefaultValue
+        self.DefaultValue = PcdObject.DefaultValue if PcdObject.DefaultValue else self.DefaultValue
         self.TokenValue = PcdObject.TokenValue if PcdObject.TokenValue else self.TokenValue
         self.MaxDatumSize = PcdObject.MaxDatumSize if PcdObject.MaxDatumSize else self.MaxDatumSize
         self.SkuInfoList = PcdObject.SkuInfoList if PcdObject.SkuInfoList else self.SkuInfoList
@@ -377,7 +397,7 @@ class StructurePcd(PcdClassObject):
             self.PcdFieldValueFromFdf = PcdObject.PcdFieldValueFromFdf if PcdObject.PcdFieldValueFromFdf else self.PcdFieldValueFromFdf
             self.PcdFiledValueFromDscComponent = PcdObject.PcdFiledValueFromDscComponent if PcdObject.PcdFiledValueFromDscComponent else self.PcdFiledValueFromDscComponent
 
-    def __deepcopy__(self,memo):
+    def __deepcopy__(self, memo):
         new_pcd = StructurePcd()
         self.sharedcopy(new_pcd)
 
@@ -387,53 +407,58 @@ class StructurePcd(PcdClassObject):
         new_pcd.StructName = self.DatumType
         new_pcd.PcdDefineLineNo = self.PcdDefineLineNo
         new_pcd.PkgPath = self.PkgPath
-        new_pcd.StructuredPcdIncludeFile = [item for item in self.StructuredPcdIncludeFile]
+        new_pcd.StructuredPcdIncludeFile = [
+            item for item in self.StructuredPcdIncludeFile]
         new_pcd.PackageDecs = [item for item in self.PackageDecs]
         new_pcd.DefaultValues = CopyDict(self.DefaultValues)
-        new_pcd.DefaultFromDSC=CopyDict(self.DefaultFromDSC)
+        new_pcd.DefaultFromDSC = CopyDict(self.DefaultFromDSC)
         new_pcd.SkuOverrideValues = CopyDict(self.SkuOverrideValues)
         new_pcd.PcdFieldValueFromComm = CopyDict(self.PcdFieldValueFromComm)
         new_pcd.PcdFieldValueFromFdf = CopyDict(self.PcdFieldValueFromFdf)
-        new_pcd.PcdFiledValueFromDscComponent = CopyDict(self.PcdFiledValueFromDscComponent)
+        new_pcd.PcdFiledValueFromDscComponent = CopyDict(
+            self.PcdFiledValueFromDscComponent)
         new_pcd.ValueChain = {item for item in self.ValueChain}
         return new_pcd
 
-LibraryClassObject = namedtuple('LibraryClassObject', ['LibraryClass','SupModList'])
+
+LibraryClassObject = namedtuple(
+    'LibraryClassObject', ['LibraryClass', 'SupModList'])
+
 
 class BuildData(object):
     # dict used to convert PCD type in database to string used by build tool
 
     _PCD_TYPE_STRING_ = {
-        MODEL_PCD_FIXED_AT_BUILD        :   TAB_PCDS_FIXED_AT_BUILD,
-        MODEL_PCD_PATCHABLE_IN_MODULE   :   TAB_PCDS_PATCHABLE_IN_MODULE,
-        MODEL_PCD_FEATURE_FLAG          :   TAB_PCDS_FEATURE_FLAG,
-        MODEL_PCD_DYNAMIC               :   TAB_PCDS_DYNAMIC,
-        MODEL_PCD_DYNAMIC_DEFAULT       :   TAB_PCDS_DYNAMIC,
-        MODEL_PCD_DYNAMIC_HII           :   TAB_PCDS_DYNAMIC_HII,
-        MODEL_PCD_DYNAMIC_VPD           :   TAB_PCDS_DYNAMIC_VPD,
-        MODEL_PCD_DYNAMIC_EX            :   TAB_PCDS_DYNAMIC_EX,
-        MODEL_PCD_DYNAMIC_EX_DEFAULT    :   TAB_PCDS_DYNAMIC_EX,
-        MODEL_PCD_DYNAMIC_EX_HII        :   TAB_PCDS_DYNAMIC_EX_HII,
-        MODEL_PCD_DYNAMIC_EX_VPD        :   TAB_PCDS_DYNAMIC_EX_VPD,
+        MODEL_PCD_FIXED_AT_BUILD:   TAB_PCDS_FIXED_AT_BUILD,
+        MODEL_PCD_PATCHABLE_IN_MODULE:   TAB_PCDS_PATCHABLE_IN_MODULE,
+        MODEL_PCD_FEATURE_FLAG:   TAB_PCDS_FEATURE_FLAG,
+        MODEL_PCD_DYNAMIC:   TAB_PCDS_DYNAMIC,
+        MODEL_PCD_DYNAMIC_DEFAULT:   TAB_PCDS_DYNAMIC,
+        MODEL_PCD_DYNAMIC_HII:   TAB_PCDS_DYNAMIC_HII,
+        MODEL_PCD_DYNAMIC_VPD:   TAB_PCDS_DYNAMIC_VPD,
+        MODEL_PCD_DYNAMIC_EX:   TAB_PCDS_DYNAMIC_EX,
+        MODEL_PCD_DYNAMIC_EX_DEFAULT:   TAB_PCDS_DYNAMIC_EX,
+        MODEL_PCD_DYNAMIC_EX_HII:   TAB_PCDS_DYNAMIC_EX_HII,
+        MODEL_PCD_DYNAMIC_EX_VPD:   TAB_PCDS_DYNAMIC_EX_VPD,
     }
 
     def UpdatePcdTypeDict(self):
-        if GlobalData.gCommandLineDefines.get(TAB_DSC_DEFINES_PCD_DYNAMIC_AS_DYNAMICEX,"FALSE").upper() == "TRUE":
+        if GlobalData.gCommandLineDefines.get(TAB_DSC_DEFINES_PCD_DYNAMIC_AS_DYNAMICEX, "FALSE").upper() == "TRUE":
             self._PCD_TYPE_STRING_ = {
-                MODEL_PCD_FIXED_AT_BUILD        :   TAB_PCDS_FIXED_AT_BUILD,
-                MODEL_PCD_PATCHABLE_IN_MODULE   :   TAB_PCDS_PATCHABLE_IN_MODULE,
-                MODEL_PCD_FEATURE_FLAG          :   TAB_PCDS_FEATURE_FLAG,
-                MODEL_PCD_DYNAMIC               :   TAB_PCDS_DYNAMIC_EX,
-                MODEL_PCD_DYNAMIC_DEFAULT       :   TAB_PCDS_DYNAMIC_EX,
-                MODEL_PCD_DYNAMIC_HII           :   TAB_PCDS_DYNAMIC_EX_HII,
-                MODEL_PCD_DYNAMIC_VPD           :   TAB_PCDS_DYNAMIC_EX_VPD,
-                MODEL_PCD_DYNAMIC_EX            :   TAB_PCDS_DYNAMIC_EX,
-                MODEL_PCD_DYNAMIC_EX_DEFAULT    :   TAB_PCDS_DYNAMIC_EX,
-                MODEL_PCD_DYNAMIC_EX_HII        :   TAB_PCDS_DYNAMIC_EX_HII,
-                MODEL_PCD_DYNAMIC_EX_VPD        :   TAB_PCDS_DYNAMIC_EX_VPD,
+                MODEL_PCD_FIXED_AT_BUILD:   TAB_PCDS_FIXED_AT_BUILD,
+                MODEL_PCD_PATCHABLE_IN_MODULE:   TAB_PCDS_PATCHABLE_IN_MODULE,
+                MODEL_PCD_FEATURE_FLAG:   TAB_PCDS_FEATURE_FLAG,
+                MODEL_PCD_DYNAMIC:   TAB_PCDS_DYNAMIC_EX,
+                MODEL_PCD_DYNAMIC_DEFAULT:   TAB_PCDS_DYNAMIC_EX,
+                MODEL_PCD_DYNAMIC_HII:   TAB_PCDS_DYNAMIC_EX_HII,
+                MODEL_PCD_DYNAMIC_VPD:   TAB_PCDS_DYNAMIC_EX_VPD,
+                MODEL_PCD_DYNAMIC_EX:   TAB_PCDS_DYNAMIC_EX,
+                MODEL_PCD_DYNAMIC_EX_DEFAULT:   TAB_PCDS_DYNAMIC_EX,
+                MODEL_PCD_DYNAMIC_EX_HII:   TAB_PCDS_DYNAMIC_EX_HII,
+                MODEL_PCD_DYNAMIC_EX_VPD:   TAB_PCDS_DYNAMIC_EX_VPD,
             }
 
-    ## Convert the class to a string
+    # Convert the class to a string
     #
     #  Convert member MetaFile of the class to a string
     #
@@ -442,7 +467,7 @@ class BuildData(object):
     def __str__(self):
         return str(self.MetaFile)
 
-    ## Override __eq__ function
+    # Override __eq__ function
     #
     # Check whether ModuleBuildClassObjects are the same
     #
@@ -452,7 +477,7 @@ class BuildData(object):
     def __eq__(self, Other):
         return self.MetaFile == Other
 
-    ## Override __hash__ function
+    # Override __hash__ function
     #
     # Use MetaFile as key in hash table
     #
@@ -461,7 +486,7 @@ class BuildData(object):
     def __hash__(self):
         return hash(self.MetaFile)
 
-## ModuleBuildClassObject
+# ModuleBuildClassObject
 #
 # This Class defines ModuleBuildClass
 #
@@ -505,41 +530,43 @@ class BuildData(object):
 #                              { [BuildOptionKey] : BuildOptionValue}
 # @var Depex:                  To store value for Depex
 #
+
+
 class ModuleBuildClassObject(BuildData):
     def __init__(self):
-        self.AutoGenVersion          = 0
-        self.MetaFile                = ''
-        self.BaseName                = ''
-        self.ModuleType              = ''
-        self.Guid                    = ''
-        self.Version                 = ''
-        self.PcdIsDriver             = ''
-        self.BinaryModule            = ''
-        self.Shadow                  = ''
-        self.CustomMakefile          = {}
-        self.Specification           = {}
-        self.LibraryClass            = []
-        self.ModuleEntryPointList    = []
-        self.ModuleUnloadImageList   = []
-        self.ConstructorList         = []
-        self.DestructorList          = []
+        self.AutoGenVersion = 0
+        self.MetaFile = ''
+        self.BaseName = ''
+        self.ModuleType = ''
+        self.Guid = ''
+        self.Version = ''
+        self.PcdIsDriver = ''
+        self.BinaryModule = ''
+        self.Shadow = ''
+        self.CustomMakefile = {}
+        self.Specification = {}
+        self.LibraryClass = []
+        self.ModuleEntryPointList = []
+        self.ModuleUnloadImageList = []
+        self.ConstructorList = []
+        self.DestructorList = []
 
-        self.Binaries                = []
-        self.Sources                 = []
-        self.LibraryClasses          = OrderedDict()
-        self.Libraries               = []
-        self.Protocols               = []
-        self.Ppis                    = []
-        self.Guids                   = []
-        self.Includes                = []
-        self.Packages                = []
-        self.Pcds                    = {}
-        self.BuildOptions            = {}
-        self.Depex                   = {}
-        self.StrPcdSet               = []
-        self.StrPcdOverallValue      = {}
+        self.Binaries = []
+        self.Sources = []
+        self.LibraryClasses = OrderedDict()
+        self.Libraries = []
+        self.Protocols = []
+        self.Ppis = []
+        self.Guids = []
+        self.Includes = []
+        self.Packages = []
+        self.Pcds = {}
+        self.BuildOptions = {}
+        self.Depex = {}
+        self.StrPcdSet = []
+        self.StrPcdOverallValue = {}
 
-## PackageBuildClassObject
+# PackageBuildClassObject
 #
 # This Class defines PackageBuildClass
 #
@@ -562,21 +589,23 @@ class ModuleBuildClassObject(BuildData):
 # @var Pcds:            To store value for Pcds, it is a set structure as
 #                       { [(PcdCName, PcdGuidCName)] : PcdClassObject}
 #
+
+
 class PackageBuildClassObject(BuildData):
     def __init__(self):
-        self.MetaFile                = ''
-        self.PackageName             = ''
-        self.Guid                    = ''
-        self.Version                 = ''
+        self.MetaFile = ''
+        self.PackageName = ''
+        self.Guid = ''
+        self.Version = ''
 
-        self.Protocols               = {}
-        self.Ppis                    = {}
-        self.Guids                   = {}
-        self.Includes                = []
-        self.LibraryClasses          = {}
-        self.Pcds                    = {}
+        self.Protocols = {}
+        self.Ppis = {}
+        self.Guids = {}
+        self.Includes = []
+        self.LibraryClasses = {}
+        self.Pcds = {}
 
-## PlatformBuildClassObject
+# PlatformBuildClassObject
 #
 # This Class defines PlatformBuildClass
 #
@@ -603,21 +632,23 @@ class PackageBuildClassObject(BuildData):
 # @var BuildOptions:      To store value for BuildOptions, it is a set structure as
 #                         { [BuildOptionKey] : BuildOptionValue }
 #
+
+
 class PlatformBuildClassObject(BuildData):
     def __init__(self):
-        self.MetaFile                = ''
-        self.PlatformName            = ''
-        self.Guid                    = ''
-        self.Version                 = ''
-        self.DscSpecification        = ''
-        self.OutputDirectory         = ''
-        self.FlashDefinition         = ''
-        self.BuildNumber             = ''
+        self.MetaFile = ''
+        self.PlatformName = ''
+        self.Guid = ''
+        self.Version = ''
+        self.DscSpecification = ''
+        self.OutputDirectory = ''
+        self.FlashDefinition = ''
+        self.BuildNumber = ''
 
-        self.SkuIds                  = {}
-        self.Modules                 = []
-        self.LibraryInstances        = []
-        self.LibraryClasses          = {}
-        self.Libraries               = {}
-        self.Pcds                    = {}
-        self.BuildOptions            = {}
+        self.SkuIds = {}
+        self.Modules = []
+        self.LibraryInstances = []
+        self.LibraryClasses = {}
+        self.Libraries = {}
+        self.Pcds = {}
+        self.BuildOptions = {}
diff --git a/BaseTools/Source/Python/Workspace/DecBuildData.py b/BaseTools/Source/Python/Workspace/DecBuildData.py
index eeb7c490ac8c..5f23904d9985 100644
--- a/BaseTools/Source/Python/Workspace/DecBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DecBuildData.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create a database used by build tool
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -15,11 +15,13 @@ from Workspace.BuildClassObject import PackageBuildClassObject, StructurePcd, Pc
 from Common.GlobalData import gGlobalDefines
 from re import compile
 
-## Platform build information from DEC file
+# Platform build information from DEC file
 #
 #  This class is used to retrieve information stored in database and convert them
 # into PackageBuildClassObject form for easier use for AutoGen.
 #
+
+
 class DecBuildData(PackageBuildClassObject):
 
     # dict used to convert part of [Defines] to members of DecBuildData directly
@@ -27,13 +29,13 @@ class DecBuildData(PackageBuildClassObject):
         #
         # Required Fields
         #
-        TAB_DEC_DEFINES_PACKAGE_NAME                : "_PackageName",
-        TAB_DEC_DEFINES_PACKAGE_GUID                : "_Guid",
-        TAB_DEC_DEFINES_PACKAGE_VERSION             : "_Version",
-        TAB_DEC_DEFINES_PKG_UNI_FILE                : "_PkgUniFile",
+        TAB_DEC_DEFINES_PACKAGE_NAME: "_PackageName",
+        TAB_DEC_DEFINES_PACKAGE_GUID: "_Guid",
+        TAB_DEC_DEFINES_PACKAGE_VERSION: "_Version",
+        TAB_DEC_DEFINES_PKG_UNI_FILE: "_PkgUniFile",
     }
 
-    ## Constructor of DecBuildData
+    # Constructor of DecBuildData
     #
     #  Initialize object of DecBuildData
     #
@@ -63,43 +65,43 @@ class DecBuildData(PackageBuildClassObject):
     def __getitem__(self, key):
         return self.__dict__[self._PROPERTY_[key]]
 
-    ## "in" test support
+    # "in" test support
     def __contains__(self, key):
         return key in self._PROPERTY_
 
-    ## Set all internal used members of DecBuildData to None
+    # Set all internal used members of DecBuildData to None
     def _Clear(self):
-        self._Header            = None
-        self._PackageName       = None
-        self._Guid              = None
-        self._Version           = None
-        self._PkgUniFile        = None
-        self._Protocols         = None
-        self._Ppis              = None
-        self._Guids             = None
-        self._Includes          = None
-        self._CommonIncludes    = None
-        self._LibraryClasses    = None
-        self._Pcds              = None
-        self._MacroDict         = None
-        self._PrivateProtocols  = None
-        self._PrivatePpis       = None
-        self._PrivateGuids      = None
-        self._PrivateIncludes   = None
+        self._Header = None
+        self._PackageName = None
+        self._Guid = None
+        self._Version = None
+        self._PkgUniFile = None
+        self._Protocols = None
+        self._Ppis = None
+        self._Guids = None
+        self._Includes = None
+        self._CommonIncludes = None
+        self._LibraryClasses = None
+        self._Pcds = None
+        self._MacroDict = None
+        self._PrivateProtocols = None
+        self._PrivatePpis = None
+        self._PrivateGuids = None
+        self._PrivateIncludes = None
 
-    ## Get current effective macros
+    # Get current effective macros
     @property
     def _Macros(self):
         if self._MacroDict is None:
             self._MacroDict = dict(gGlobalDefines)
         return self._MacroDict
 
-    ## Get architecture
+    # Get architecture
     @property
     def Arch(self):
         return self._Arch
 
-    ## Retrieve all information in [Defines] section
+    # Retrieve all information in [Defines] section
     #
     #   (Retrieving all [Defines] information in one-shot is just to save time.)
     #
@@ -111,27 +113,29 @@ class DecBuildData(PackageBuildClassObject):
                 self[Name] = Record[2]
         self._Header = 'DUMMY'
 
-    ## Retrieve package name
+    # Retrieve package name
     @property
     def PackageName(self):
         if self._PackageName is None:
             if self._Header is None:
                 self._GetHeaderInfo()
             if self._PackageName is None:
-                EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE, "No PACKAGE_NAME", File=self.MetaFile)
+                EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE,
+                                "No PACKAGE_NAME", File=self.MetaFile)
         return self._PackageName
 
-    ## Retrieve file guid
+    # Retrieve file guid
     @property
     def PackageName(self):
         if self._Guid is None:
             if self._Header is None:
                 self._GetHeaderInfo()
             if self._Guid is None:
-                EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE, "No PACKAGE_GUID", File=self.MetaFile)
+                EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE,
+                                "No PACKAGE_GUID", File=self.MetaFile)
         return self._Guid
 
-    ## Retrieve package version
+    # Retrieve package version
     @property
     def Version(self):
         if self._Version is None:
@@ -141,7 +145,7 @@ class DecBuildData(PackageBuildClassObject):
                 self._Version = ''
         return self._Version
 
-    ## Retrieve protocol definitions (name/value pairs)
+    # Retrieve protocol definitions (name/value pairs)
     @property
     def Protocols(self):
         if self._Protocols is None:
@@ -162,12 +166,14 @@ class DecBuildData(PackageBuildClassObject):
                         PrivateNameList.append(Name)
                         PrivateProtocolDict[Arch, Name] = Guid
                     if Name in PublicNameList:
-                        EdkLogger.error('build', OPTION_CONFLICT, "Can't determine %s's attribute, it is both defined as Private and non-Private attribute in DEC file." % Name, File=self.MetaFile, Line=LineNo)
+                        EdkLogger.error(
+                            'build', OPTION_CONFLICT, "Can't determine %s's attribute, it is both defined as Private and non-Private attribute in DEC file." % Name, File=self.MetaFile, Line=LineNo)
                 else:
                     if Name not in PublicNameList:
                         PublicNameList.append(Name)
                     if Name in PrivateNameList:
-                        EdkLogger.error('build', OPTION_CONFLICT, "Can't determine %s's attribute, it is both defined as Private and non-Private attribute in DEC file." % Name, File=self.MetaFile, Line=LineNo)
+                        EdkLogger.error(
+                            'build', OPTION_CONFLICT, "Can't determine %s's attribute, it is both defined as Private and non-Private attribute in DEC file." % Name, File=self.MetaFile, Line=LineNo)
                 if Name not in NameList:
                     NameList.append(Name)
                 ProtocolDict[Arch, Name] = Guid
@@ -184,7 +190,7 @@ class DecBuildData(PackageBuildClassObject):
                 self._PrivateProtocols[Name] = PrivateProtocolDict[self._Arch, Name]
         return self._Protocols
 
-    ## Retrieve PPI definitions (name/value pairs)
+    # Retrieve PPI definitions (name/value pairs)
     @property
     def Ppis(self):
         if self._Ppis is None:
@@ -205,12 +211,14 @@ class DecBuildData(PackageBuildClassObject):
                         PrivateNameList.append(Name)
                         PrivatePpiDict[Arch, Name] = Guid
                     if Name in PublicNameList:
-                        EdkLogger.error('build', OPTION_CONFLICT, "Can't determine %s's attribute, it is both defined as Private and non-Private attribute in DEC file." % Name, File=self.MetaFile, Line=LineNo)
+                        EdkLogger.error(
+                            'build', OPTION_CONFLICT, "Can't determine %s's attribute, it is both defined as Private and non-Private attribute in DEC file." % Name, File=self.MetaFile, Line=LineNo)
                 else:
                     if Name not in PublicNameList:
                         PublicNameList.append(Name)
                     if Name in PrivateNameList:
-                        EdkLogger.error('build', OPTION_CONFLICT, "Can't determine %s's attribute, it is both defined as Private and non-Private attribute in DEC file." % Name, File=self.MetaFile, Line=LineNo)
+                        EdkLogger.error(
+                            'build', OPTION_CONFLICT, "Can't determine %s's attribute, it is both defined as Private and non-Private attribute in DEC file." % Name, File=self.MetaFile, Line=LineNo)
                 if Name not in NameList:
                     NameList.append(Name)
                 PpiDict[Arch, Name] = Guid
@@ -227,7 +235,7 @@ class DecBuildData(PackageBuildClassObject):
                 self._PrivatePpis[Name] = PrivatePpiDict[self._Arch, Name]
         return self._Ppis
 
-    ## Retrieve GUID definitions (name/value pairs)
+    # Retrieve GUID definitions (name/value pairs)
     @property
     def Guids(self):
         if self._Guids is None:
@@ -248,12 +256,14 @@ class DecBuildData(PackageBuildClassObject):
                         PrivateNameList.append(Name)
                         PrivateGuidDict[Arch, Name] = Guid
                     if Name in PublicNameList:
-                        EdkLogger.error('build', OPTION_CONFLICT, "Can't determine %s's attribute, it is both defined as Private and non-Private attribute in DEC file." % Name, File=self.MetaFile, Line=LineNo)
+                        EdkLogger.error(
+                            'build', OPTION_CONFLICT, "Can't determine %s's attribute, it is both defined as Private and non-Private attribute in DEC file." % Name, File=self.MetaFile, Line=LineNo)
                 else:
                     if Name not in PublicNameList:
                         PublicNameList.append(Name)
                     if Name in PrivateNameList:
-                        EdkLogger.error('build', OPTION_CONFLICT, "Can't determine %s's attribute, it is both defined as Private and non-Private attribute in DEC file." % Name, File=self.MetaFile, Line=LineNo)
+                        EdkLogger.error(
+                            'build', OPTION_CONFLICT, "Can't determine %s's attribute, it is both defined as Private and non-Private attribute in DEC file." % Name, File=self.MetaFile, Line=LineNo)
                 if Name not in NameList:
                     NameList.append(Name)
                 GuidDict[Arch, Name] = Guid
@@ -270,7 +280,7 @@ class DecBuildData(PackageBuildClassObject):
                 self._PrivateGuids[Name] = PrivateGuidDict[self._Arch, Name]
         return self._Guids
 
-    ## Retrieve public include paths declared in this package
+    # Retrieve public include paths declared in this package
     @property
     def Includes(self):
         if self._Includes is None or self._CommonIncludes is None:
@@ -281,12 +291,14 @@ class DecBuildData(PackageBuildClassObject):
             RecordList = self._RawData[MODEL_EFI_INCLUDE, self._Arch]
             Macros = self._Macros
             for Record in RecordList:
-                File = PathClass(NormPath(Record[0], Macros), self._PackageDir, Arch=self._Arch)
+                File = PathClass(
+                    NormPath(Record[0], Macros), self._PackageDir, Arch=self._Arch)
                 LineNo = Record[-1]
                 # validate the path
                 ErrorCode, ErrorInfo = File.Validate()
                 if ErrorCode != 0:
-                    EdkLogger.error('build', ErrorCode, ExtraData=ErrorInfo, File=self.MetaFile, Line=LineNo)
+                    EdkLogger.error(
+                        'build', ErrorCode, ExtraData=ErrorInfo, File=self.MetaFile, Line=LineNo)
 
                 # avoid duplicate include path
                 if File not in self._Includes:
@@ -295,17 +307,19 @@ class DecBuildData(PackageBuildClassObject):
                     if File not in self._PrivateIncludes:
                         self._PrivateIncludes.append(File)
                     if File in PublicInclues:
-                        EdkLogger.error('build', OPTION_CONFLICT, "Can't determine %s's attribute, it is both defined as Private and non-Private attribute in DEC file." % File, File=self.MetaFile, Line=LineNo)
+                        EdkLogger.error(
+                            'build', OPTION_CONFLICT, "Can't determine %s's attribute, it is both defined as Private and non-Private attribute in DEC file." % File, File=self.MetaFile, Line=LineNo)
                 else:
                     if File not in PublicInclues:
                         PublicInclues.append(File)
                     if File in self._PrivateIncludes:
-                        EdkLogger.error('build', OPTION_CONFLICT, "Can't determine %s's attribute, it is both defined as Private and non-Private attribute in DEC file." % File, File=self.MetaFile, Line=LineNo)
+                        EdkLogger.error(
+                            'build', OPTION_CONFLICT, "Can't determine %s's attribute, it is both defined as Private and non-Private attribute in DEC file." % File, File=self.MetaFile, Line=LineNo)
                 if Record[3] == TAB_COMMON:
                     self._CommonIncludes.append(File)
         return self._Includes
 
-    ## Retrieve library class declarations (not used in build at present)
+    # Retrieve library class declarations (not used in build at present)
     @property
     def LibraryClasses(self):
         if self._LibraryClasses is None:
@@ -318,11 +332,13 @@ class DecBuildData(PackageBuildClassObject):
             RecordList = self._RawData[MODEL_EFI_LIBRARY_CLASS, self._Arch]
             Macros = self._Macros
             for LibraryClass, File, Dummy, Arch, PrivateFlag, ID, LineNo in RecordList:
-                File = PathClass(NormPath(File, Macros), self._PackageDir, Arch=self._Arch)
+                File = PathClass(NormPath(File, Macros),
+                                 self._PackageDir, Arch=self._Arch)
                 # check the file validation
                 ErrorCode, ErrorInfo = File.Validate()
                 if ErrorCode != 0:
-                    EdkLogger.error('build', ErrorCode, ExtraData=ErrorInfo, File=self.MetaFile, Line=LineNo)
+                    EdkLogger.error(
+                        'build', ErrorCode, ExtraData=ErrorInfo, File=self.MetaFile, Line=LineNo)
                 LibraryClassSet.add(LibraryClass)
                 LibraryClassDict[Arch, LibraryClass] = File
             self._LibraryClasses = OrderedDict()
@@ -330,7 +346,7 @@ class DecBuildData(PackageBuildClassObject):
                 self._LibraryClasses[LibraryClass] = LibraryClassDict[self._Arch, LibraryClass]
         return self._LibraryClasses
 
-    ## Retrieve PCD declarations
+    # Retrieve PCD declarations
     @property
     def Pcds(self):
         if self._Pcds is None:
@@ -342,7 +358,7 @@ class DecBuildData(PackageBuildClassObject):
             self._Pcds.update(self._GetPcd(MODEL_PCD_DYNAMIC_EX))
         return self._Pcds
 
-    def ParsePcdName(self,TokenCName):
+    def ParsePcdName(self, TokenCName):
         TokenCName = TokenCName.strip()
         if TokenCName.startswith("["):
             if "." in TokenCName:
@@ -355,7 +371,7 @@ class DecBuildData(PackageBuildClassObject):
             Demesionattr = ""
             Fields = TokenCName
 
-        return Demesionattr,Fields
+        return Demesionattr, Fields
 
     def ProcessStructurePcd(self, StructurePcdRawDataSet):
         s_pcd_set = OrderedDict()
@@ -372,25 +388,30 @@ class DecBuildData(PackageBuildClassObject):
                 if not item.TokenCName:
                     continue
                 if "<HeaderFiles>" in item.TokenCName:
-                    struct_pcd.StructuredPcdIncludeFile.append(item.DefaultValue)
+                    struct_pcd.StructuredPcdIncludeFile.append(
+                        item.DefaultValue)
                 elif "<Packages>" in item.TokenCName:
                     dep_pkgs.append(item.DefaultValue)
                 elif item.DatumType == item.TokenCName:
                     struct_pcd.copy(item)
-                    struct_pcd.TokenValue = struct_pcd.TokenValue.strip("{").strip()
-                    struct_pcd.TokenSpaceGuidCName, struct_pcd.TokenCName = pcdname.split(".")
+                    struct_pcd.TokenValue = struct_pcd.TokenValue.strip(
+                        "{").strip()
+                    struct_pcd.TokenSpaceGuidCName, struct_pcd.TokenCName = pcdname.split(
+                        ".")
                     struct_pcd.PcdDefineLineNo = LineNo
                     struct_pcd.PkgPath = self.MetaFile.File
-                    struct_pcd.SetDecDefaultValue(item.DefaultValue,self.MetaFile.File,LineNo)
+                    struct_pcd.SetDecDefaultValue(
+                        item.DefaultValue, self.MetaFile.File, LineNo)
                 else:
                     DemesionAttr, Fields = self.ParsePcdName(item.TokenCName)
-                    struct_pcd.AddDefaultValue(Fields, item.DefaultValue, self.MetaFile.File, LineNo,DemesionAttr)
+                    struct_pcd.AddDefaultValue(
+                        Fields, item.DefaultValue, self.MetaFile.File, LineNo, DemesionAttr)
 
             struct_pcd.PackageDecs = dep_pkgs
             str_pcd_set.append(struct_pcd)
         return str_pcd_set
 
-    ## Retrieve PCD declarations for given type
+    # Retrieve PCD declarations for given type
     def _GetPcd(self, Type):
         Pcds = OrderedDict()
         #
@@ -420,38 +441,43 @@ class DecBuildData(PackageBuildClassObject):
                 continue
 
             DefaultValue, DatumType, TokenNumber = AnalyzePcdData(Setting)
-            validateranges, validlists, expressions = self._RawData.GetValidExpression(TokenSpaceGuid, PcdCName)
+            validateranges, validlists, expressions = self._RawData.GetValidExpression(
+                TokenSpaceGuid, PcdCName)
             PcdObj = PcdClassObject(
-                                        PcdCName,
-                                        TokenSpaceGuid,
-                                        self._PCD_TYPE_STRING_[Type],
-                                        DatumType,
-                                        DefaultValue,
-                                        TokenNumber,
-                                        '',
-                                        {},
-                                        False,
-                                        None,
-                                        list(validateranges),
-                                        list(validlists),
-                                        list(expressions)
-                                        )
+                PcdCName,
+                TokenSpaceGuid,
+                self._PCD_TYPE_STRING_[Type],
+                DatumType,
+                DefaultValue,
+                TokenNumber,
+                '',
+                {},
+                False,
+                None,
+                list(validateranges),
+                list(validlists),
+                list(expressions)
+            )
             DefinitionPosition[PcdObj] = (self.MetaFile.File, LineNo)
             if "." in TokenSpaceGuid:
                 StrPcdSet.append((PcdObj, LineNo))
             else:
-                Pcds[PcdCName, TokenSpaceGuid, self._PCD_TYPE_STRING_[Type]] = PcdObj
+                Pcds[PcdCName, TokenSpaceGuid,
+                     self._PCD_TYPE_STRING_[Type]] = PcdObj
 
         StructurePcds = self.ProcessStructurePcd(StrPcdSet)
         for pcd in StructurePcds:
-            Pcds[pcd.TokenCName, pcd.TokenSpaceGuidCName, self._PCD_TYPE_STRING_[Type]] = pcd
+            Pcds[pcd.TokenCName, pcd.TokenSpaceGuidCName,
+                 self._PCD_TYPE_STRING_[Type]] = pcd
         StructPattern = compile(r'[_a-zA-Z][0-9A-Za-z_]*$')
         for pcd in Pcds.values():
             if pcd.DatumType not in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64, TAB_VOID, "BOOLEAN"]:
                 if not pcd.IsAggregateDatumType():
-                    EdkLogger.error('build', FORMAT_INVALID, "DatumType only support BOOLEAN, UINT8, UINT16, UINT32, UINT64, VOID* or a valid struct name.", DefinitionPosition[pcd][0], DefinitionPosition[pcd][1])
+                    EdkLogger.error('build', FORMAT_INVALID, "DatumType only support BOOLEAN, UINT8, UINT16, UINT32, UINT64, VOID* or a valid struct name.",
+                                    DefinitionPosition[pcd][0], DefinitionPosition[pcd][1])
                 elif not pcd.IsArray() and not pcd.StructuredPcdIncludeFile:
-                    EdkLogger.error("build", PCD_STRUCTURE_PCD_ERROR, "The structure Pcd %s.%s header file is not found in %s line %s \n" % (pcd.TokenSpaceGuidCName, pcd.TokenCName, pcd.DefinitionPosition[0], pcd.DefinitionPosition[1] ))
+                    EdkLogger.error("build", PCD_STRUCTURE_PCD_ERROR, "The structure Pcd %s.%s header file is not found in %s line %s \n" % (
+                        pcd.TokenSpaceGuidCName, pcd.TokenCName, pcd.DefinitionPosition[0], pcd.DefinitionPosition[1]))
         return Pcds
 
     @property
diff --git a/BaseTools/Source/Python/Workspace/DscBuildData.py b/BaseTools/Source/Python/Workspace/DscBuildData.py
index e9f68384b429..fe1f8369a220 100644
--- a/BaseTools/Source/Python/Workspace/DscBuildData.py
+++ b/BaseTools/Source/Python/Workspace/DscBuildData.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create a database used by build tool
 #
 # Copyright (c) 2008 - 2020, Intel Corporation. All rights reserved.<BR>
@@ -6,12 +6,13 @@
 # SPDX-License-Identifier: BSD-2-Clause-Patent
 #
 
-## Platform build information from DSC file
+# Platform build information from DSC file
 #
 #  This class is used to retrieve information stored in database and convert them
 # into PlatformBuildClassObject form for easier use for AutoGen.
 #
 from __future__ import print_function
 from __future__ import absolute_import
+from AutoGen.GenMake import gIncludePattern
 from Common.StringUtils import *
 from Common.DataType import *
@@ -19,15 +20,15 @@ from Common.Misc import *
 from types import *
 from Common.Expression import *
 from CommonDataClass.CommonClass import SkuInfoClass
-from Common.TargetTxtClassObject import TargetTxtDict,gDefaultTargetTxtFile
-from Common.ToolDefClassObject import ToolDefDict,gDefaultToolsDefFile
+from Common.TargetTxtClassObject import TargetTxtDict, gDefaultTargetTxtFile
+from Common.ToolDefClassObject import ToolDefDict, gDefaultToolsDefFile
 from .MetaDataTable import *
 from .MetaFileTable import *
 from .MetaFileParser import *
 
 from .WorkspaceCommon import GetDeclaredPcd
 from Common.Misc import AnalyzeDscPcd
-from Common.Misc import ProcessDuplicatedInf,RemoveCComments,ArrayIndex
+from Common.Misc import ProcessDuplicatedInf, RemoveCComments, ArrayIndex
 import re
 from Common.Parsing import IsValidWord
 from Common.VariableAttributes import VariableAttributes
@@ -38,11 +39,12 @@ from Common.Misc import SaveFileOnChange
 from Workspace.BuildClassObject import PlatformBuildClassObject, StructurePcd, PcdClassObject, ModuleBuildClassObject
 from collections import OrderedDict, defaultdict
 
-def _IsFieldValueAnArray (Value):
+
+def _IsFieldValueAnArray(Value):
     Value = Value.strip()
     if Value.startswith(TAB_GUID) and Value.endswith(')'):
         return True
-    if Value.startswith('L"') and Value.endswith('"')  and len(list(Value[2:-1])) > 1:
+    if Value.startswith('L"') and Value.endswith('"') and len(list(Value[2:-1])) > 1:
         return True
     if Value[0] == '"' and Value[-1] == '"' and len(list(Value[1:-1])) > 1:
         return True
@@ -54,6 +56,7 @@ def _IsFieldValueAnArray (Value):
         return True
     return False
 
+
 PcdValueInitName = 'PcdValueInit'
 PcdValueCommonName = 'PcdValueCommon'
 
@@ -109,13 +112,12 @@ LIBS = -lCommon
 
 variablePattern = re.compile(r'[\t\s]*0[xX][a-fA-F0-9]+$')
 SkuIdPattern = re.compile(r'^[a-zA-Z_][a-zA-Z0-9_]*$')
-## regular expressions for finding decimal and hex numbers
+# regular expressions for finding decimal and hex numbers
 Pattern = re.compile('^[1-9]\d*|0$')
 HexPattern = re.compile(r'0[xX][0-9a-fA-F]+$')
-## Regular expression for finding header file inclusions
-from AutoGen.GenMake import gIncludePattern
+# Regular expression for finding header file inclusions
 
-## Find dependencies for one source file
+# Find dependencies for one source file
 #
 #  By searching recursively "#include" directive in file, find out all the
 #  files needed by given source file. The dependecies will be only searched
@@ -125,6 +127,8 @@ from AutoGen.GenMake import gIncludePattern
 #
 #   @retval     list            The list of files the given source file depends on
 #
+
+
 def GetDependencyList(FileStack, SearchPathList):
     DepDb = dict()
     DependencySet = set(FileStack)
@@ -139,7 +143,8 @@ def GetDependencyList(FileStack, SearchPathList):
                 Fd = open(F, 'r')
                 FileContent = Fd.read()
             except BaseException as X:
-                EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=F + "\n\t" + str(X))
+                EdkLogger.error("build", FILE_OPEN_FAILURE,
+                                ExtraData=F + "\n\t" + str(X))
             finally:
                 if "Fd" in dir(locals()):
                     Fd.close()
@@ -179,6 +184,7 @@ def GetDependencyList(FileStack, SearchPathList):
 
     return DependencyList
 
+
 class DscBuildData(PlatformBuildClassObject):
 
     # dict used to convert part of [Defines] to members of DscBuildData directly
@@ -186,19 +192,19 @@ class DscBuildData(PlatformBuildClassObject):
         #
         # Required Fields
         #
-        TAB_DSC_DEFINES_PLATFORM_NAME           :   "_PlatformName",
-        TAB_DSC_DEFINES_PLATFORM_GUID           :   "_Guid",
-        TAB_DSC_DEFINES_PLATFORM_VERSION        :   "_Version",
-        TAB_DSC_DEFINES_DSC_SPECIFICATION       :   "_DscSpecification",
+        TAB_DSC_DEFINES_PLATFORM_NAME:   "_PlatformName",
+        TAB_DSC_DEFINES_PLATFORM_GUID:   "_Guid",
+        TAB_DSC_DEFINES_PLATFORM_VERSION:   "_Version",
+        TAB_DSC_DEFINES_DSC_SPECIFICATION:   "_DscSpecification",
         # TAB_DSC_DEFINES_OUTPUT_DIRECTORY        :   "_OutputDirectory",
         # TAB_DSC_DEFINES_SUPPORTED_ARCHITECTURES :   "_SupArchList",
         # TAB_DSC_DEFINES_BUILD_TARGETS           :   "_BuildTargets",
-        TAB_DSC_DEFINES_SKUID_IDENTIFIER        :   "_SkuName",
+        TAB_DSC_DEFINES_SKUID_IDENTIFIER:   "_SkuName",
         # TAB_DSC_DEFINES_FLASH_DEFINITION        :   "_FlashDefinition",
-        TAB_DSC_DEFINES_BUILD_NUMBER            :   "_BuildNumber",
-        TAB_DSC_DEFINES_MAKEFILE_NAME           :   "_MakefileName",
-        TAB_DSC_DEFINES_BS_BASE_ADDRESS         :   "_BsBaseAddress",
-        TAB_DSC_DEFINES_RT_BASE_ADDRESS         :   "_RtBaseAddress",
+        TAB_DSC_DEFINES_BUILD_NUMBER:   "_BuildNumber",
+        TAB_DSC_DEFINES_MAKEFILE_NAME:   "_MakefileName",
+        TAB_DSC_DEFINES_BS_BASE_ADDRESS:   "_BsBaseAddress",
+        TAB_DSC_DEFINES_RT_BASE_ADDRESS:   "_RtBaseAddress",
         # TAB_DSC_DEFINES_RFC_LANGUAGES           :   "_RFCLanguages",
         # TAB_DSC_DEFINES_ISO_LANGUAGES           :   "_ISOLanguages",
     }
@@ -206,7 +212,7 @@ class DscBuildData(PlatformBuildClassObject):
     # used to compose dummy library class name for those forced library instances
     _NullLibraryNumber = 0
 
-    ## Constructor of DscBuildData
+    # Constructor of DscBuildData
     #
     #  Initialize object of DscBuildData
     #
@@ -226,10 +232,12 @@ class DscBuildData(PlatformBuildClassObject):
         self._Toolchain = Toolchain
         self._ToolChainFamily = None
         self._Clear()
-        self.WorkspaceDir = os.getenv("WORKSPACE") if os.getenv("WORKSPACE") else ""
+        self.WorkspaceDir = os.getenv(
+            "WORKSPACE") if os.getenv("WORKSPACE") else ""
         self.DefaultStores = None
         self.SkuIdMgr = SkuClass(self.SkuName, self.SkuIds)
         self.UpdatePcdTypeDict()
+
     @property
     def OutputPath(self):
         if os.getenv("WORKSPACE"):
@@ -245,46 +253,46 @@ class DscBuildData(PlatformBuildClassObject):
     def __getitem__(self, key):
         return self.__dict__[self._PROPERTY_[key]]
 
-    ## "in" test support
+    # "in" test support
     def __contains__(self, key):
         return key in self._PROPERTY_
 
-    ## Set all internal used members of DscBuildData to None
+    # Set all internal used members of DscBuildData to None
     def _Clear(self):
-        self._Header            = None
-        self._PlatformName      = None
-        self._Guid              = None
-        self._Version           = None
-        self._DscSpecification  = None
-        self._OutputDirectory   = None
-        self._SupArchList       = None
-        self._BuildTargets      = None
-        self._SkuName           = None
-        self._PcdInfoFlag       = None
-        self._VarCheckFlag      = None
-        self._FlashDefinition   = None
-        self._Prebuild          = None
-        self._Postbuild         = None
-        self._BuildNumber       = None
-        self._MakefileName      = None
-        self._BsBaseAddress     = None
-        self._RtBaseAddress     = None
-        self._SkuIds            = None
-        self._Modules           = None
-        self._LibraryInstances  = None
-        self._LibraryClasses    = None
-        self._Pcds              = None
-        self._DecPcds           = None
-        self._BuildOptions      = None
+        self._Header = None
+        self._PlatformName = None
+        self._Guid = None
+        self._Version = None
+        self._DscSpecification = None
+        self._OutputDirectory = None
+        self._SupArchList = None
+        self._BuildTargets = None
+        self._SkuName = None
+        self._PcdInfoFlag = None
+        self._VarCheckFlag = None
+        self._FlashDefinition = None
+        self._Prebuild = None
+        self._Postbuild = None
+        self._BuildNumber = None
+        self._MakefileName = None
+        self._BsBaseAddress = None
+        self._RtBaseAddress = None
+        self._SkuIds = None
+        self._Modules = None
+        self._LibraryInstances = None
+        self._LibraryClasses = None
+        self._Pcds = None
+        self._DecPcds = None
+        self._BuildOptions = None
         self._ModuleTypeOptions = None
-        self._LoadFixAddress    = None
-        self._RFCLanguages      = None
-        self._ISOLanguages      = None
-        self._VpdToolGuid       = None
-        self._MacroDict         = None
-        self.DefaultStores      = None
+        self._LoadFixAddress = None
+        self._RFCLanguages = None
+        self._ISOLanguages = None
+        self._VpdToolGuid = None
+        self._MacroDict = None
+        self.DefaultStores = None
 
-    ## Get current effective macros
+    # Get current effective macros
     @property
     def _Macros(self):
         if self._MacroDict is None:
@@ -294,15 +302,16 @@ class DscBuildData(PlatformBuildClassObject):
             self._MacroDict.update(GlobalData.gCommandLineDefines)
         return self._MacroDict
 
-    ## Get architecture
+    # Get architecture
     @property
     def Arch(self):
         return self._Arch
+
     @property
     def Dir(self):
         return self.MetaFile.Dir
 
-    ## Retrieve all information in [Defines] section
+    # Retrieve all information in [Defines] section
     #
     #   (Retrieving all [Defines] information in one-shot is just to save time.)
     #
@@ -320,7 +329,8 @@ class DscBuildData(PlatformBuildClassObject):
                                     File=self.MetaFile, Line=Record[-1],
                                     ExtraData=self._OutputDirectory)
             elif Name == TAB_DSC_DEFINES_FLASH_DEFINITION:
-                self._FlashDefinition = PathClass(NormPath(Record[2], self._Macros), GlobalData.gWorkspace)
+                self._FlashDefinition = PathClass(
+                    NormPath(Record[2], self._Macros), GlobalData.gWorkspace)
                 ErrorCode, ErrorInfo = self._FlashDefinition.Validate('.fdf')
                 if ErrorCode != 0:
                     EdkLogger.error('build', ErrorCode, File=self.MetaFile, Line=Record[-1],
@@ -330,7 +340,7 @@ class DscBuildData(PlatformBuildClassObject):
                 if Record[2][0] == '"':
                     if Record[2][-1] != '"':
                         EdkLogger.error('build', FORMAT_INVALID, 'Missing double quotes in the end of %s statement.' % TAB_DSC_PREBUILD,
-                                    File=self.MetaFile, Line=Record[-1])
+                                        File=self.MetaFile, Line=Record[-1])
                     PrebuildValue = Record[2][1:-1]
                 self._Prebuild = PrebuildValue
             elif Name == TAB_DSC_POSTBUILD:
@@ -338,11 +348,12 @@ class DscBuildData(PlatformBuildClassObject):
                 if Record[2][0] == '"':
                     if Record[2][-1] != '"':
                         EdkLogger.error('build', FORMAT_INVALID, 'Missing double quotes in the end of %s statement.' % TAB_DSC_POSTBUILD,
-                                    File=self.MetaFile, Line=Record[-1])
+                                        File=self.MetaFile, Line=Record[-1])
                     PostbuildValue = Record[2][1:-1]
                 self._Postbuild = PostbuildValue
             elif Name == TAB_DSC_DEFINES_SUPPORTED_ARCHITECTURES:
-                self._SupArchList = GetSplitValueList(Record[2], TAB_VALUE_SPLIT)
+                self._SupArchList = GetSplitValueList(
+                    Record[2], TAB_VALUE_SPLIT)
             elif Name == TAB_DSC_DEFINES_BUILD_TARGETS:
                 self._BuildTargets = GetSplitValueList(Record[2])
             elif Name == TAB_DSC_DEFINES_SKUID_IDENTIFIER:
@@ -356,9 +367,10 @@ class DscBuildData(PlatformBuildClassObject):
                 self._VarCheckFlag = Record[2]
             elif Name == TAB_FIX_LOAD_TOP_MEMORY_ADDRESS:
                 try:
-                    self._LoadFixAddress = int (Record[2], 0)
+                    self._LoadFixAddress = int(Record[2], 0)
                 except:
-                    EdkLogger.error("build", PARAMETER_INVALID, "FIX_LOAD_TOP_MEMORY_ADDRESS %s is not valid dec or hex string" % (Record[2]))
+                    EdkLogger.error(
+                        "build", PARAMETER_INVALID, "FIX_LOAD_TOP_MEMORY_ADDRESS %s is not valid dec or hex string" % (Record[2]))
             elif Name == TAB_DSC_DEFINES_RFC_LANGUAGES:
                 if not Record[2] or Record[2][0] != '"' or Record[2][-1] != '"' or len(Record[2]) == 1:
                     EdkLogger.error('build', FORMAT_NOT_SUPPORTED, 'language code for RFC_LANGUAGES must have double quotes around it, for example: RFC_LANGUAGES = "en-us;zh-hans"',
@@ -367,7 +379,8 @@ class DscBuildData(PlatformBuildClassObject):
                 if not LanguageCodes:
                     EdkLogger.error('build', FORMAT_NOT_SUPPORTED, 'one or more RFC4646 format language code must be provided for RFC_LANGUAGES statement',
                                     File=self.MetaFile, Line=Record[-1])
-                LanguageList = GetSplitValueList(LanguageCodes, TAB_SEMI_COLON_SPLIT)
+                LanguageList = GetSplitValueList(
+                    LanguageCodes, TAB_SEMI_COLON_SPLIT)
                 # check whether there is empty entries in the list
                 if None in LanguageList:
                     EdkLogger.error('build', FORMAT_NOT_SUPPORTED, 'one or more empty language code is in RFC_LANGUAGES statement',
@@ -390,7 +403,8 @@ class DscBuildData(PlatformBuildClassObject):
                 self._ISOLanguages = LanguageList
             elif Name == TAB_DSC_DEFINES_VPD_AUTHENTICATED_VARIABLE_STORE:
                 if TAB_DSC_DEFINES_VPD_AUTHENTICATED_VARIABLE_STORE not in gCommandLineDefines:
-                    gCommandLineDefines[TAB_DSC_DEFINES_VPD_AUTHENTICATED_VARIABLE_STORE] = Record[2].strip()
+                    gCommandLineDefines[TAB_DSC_DEFINES_VPD_AUTHENTICATED_VARIABLE_STORE] = Record[2].strip(
+                    )
 
             elif Name == TAB_DSC_DEFINES_VPD_TOOL_GUID:
                 #
@@ -400,88 +414,97 @@ class DscBuildData(PlatformBuildClassObject):
                 try:
                     uuid.UUID(Record[2])
                 except:
-                    EdkLogger.error("build", FORMAT_INVALID, "Invalid GUID format for VPD_TOOL_GUID", File=self.MetaFile)
+                    EdkLogger.error(
+                        "build", FORMAT_INVALID, "Invalid GUID format for VPD_TOOL_GUID", File=self.MetaFile)
                 self._VpdToolGuid = Record[2]
             elif Name == TAB_DSC_DEFINES_PCD_DYNAMIC_AS_DYNAMICEX:
                 if TAB_DSC_DEFINES_PCD_DYNAMIC_AS_DYNAMICEX not in gCommandLineDefines:
-                    gCommandLineDefines[TAB_DSC_DEFINES_PCD_DYNAMIC_AS_DYNAMICEX] = Record[2].strip()
+                    gCommandLineDefines[TAB_DSC_DEFINES_PCD_DYNAMIC_AS_DYNAMICEX] = Record[2].strip(
+                    )
             elif Name in self:
                 self[Name] = Record[2]
         # set _Header to non-None in order to avoid database re-querying
         self._Header = 'DUMMY'
 
-    ## Retrieve platform name
+    # Retrieve platform name
     @property
     def PlatformName(self):
         if self._PlatformName is None:
             if self._Header is None:
                 self._GetHeaderInfo()
             if self._PlatformName is None:
-                EdkLogger.error('build', ATTRIBUTE_NOT_AVAILABLE, "No PLATFORM_NAME", File=self.MetaFile)
+                EdkLogger.error('build', ATTRIBUTE_NOT_AVAILABLE,
+                                "No PLATFORM_NAME", File=self.MetaFile)
         return self._PlatformName
 
     @property
     def Platform(self):
         return self.PlatformName
 
-    ## Retrieve file guid
+    # Retrieve file guid
     @property
     def Guid(self):
         if self._Guid is None:
             if self._Header is None:
                 self._GetHeaderInfo()
             if self._Guid is None:
-                EdkLogger.error('build', ATTRIBUTE_NOT_AVAILABLE, "No PLATFORM_GUID", File=self.MetaFile)
+                EdkLogger.error('build', ATTRIBUTE_NOT_AVAILABLE,
+                                "No PLATFORM_GUID", File=self.MetaFile)
         return self._Guid
 
-    ## Retrieve platform version
+    # Retrieve platform version
     @property
     def Version(self):
         if self._Version is None:
             if self._Header is None:
                 self._GetHeaderInfo()
             if self._Version is None:
-                EdkLogger.error('build', ATTRIBUTE_NOT_AVAILABLE, "No PLATFORM_VERSION", File=self.MetaFile)
+                EdkLogger.error('build', ATTRIBUTE_NOT_AVAILABLE,
+                                "No PLATFORM_VERSION", File=self.MetaFile)
         return self._Version
 
-    ## Retrieve platform description file version
+    # Retrieve platform description file version
     @property
     def DscSpecification(self):
         if self._DscSpecification is None:
             if self._Header is None:
                 self._GetHeaderInfo()
             if self._DscSpecification is None:
-                EdkLogger.error('build', ATTRIBUTE_NOT_AVAILABLE, "No DSC_SPECIFICATION", File=self.MetaFile)
+                EdkLogger.error('build', ATTRIBUTE_NOT_AVAILABLE,
+                                "No DSC_SPECIFICATION", File=self.MetaFile)
         return self._DscSpecification
 
-    ## Retrieve OUTPUT_DIRECTORY
+    # Retrieve OUTPUT_DIRECTORY
     @property
     def OutputDirectory(self):
         if self._OutputDirectory is None:
             if self._Header is None:
                 self._GetHeaderInfo()
             if self._OutputDirectory is None:
-                self._OutputDirectory = os.path.join("Build", self._PlatformName)
+                self._OutputDirectory = os.path.join(
+                    "Build", self._PlatformName)
         return self._OutputDirectory
 
-    ## Retrieve SUPPORTED_ARCHITECTURES
+    # Retrieve SUPPORTED_ARCHITECTURES
     @property
     def SupArchList(self):
         if self._SupArchList is None:
             if self._Header is None:
                 self._GetHeaderInfo()
             if self._SupArchList is None:
-                EdkLogger.error('build', ATTRIBUTE_NOT_AVAILABLE, "No SUPPORTED_ARCHITECTURES", File=self.MetaFile)
+                EdkLogger.error('build', ATTRIBUTE_NOT_AVAILABLE,
+                                "No SUPPORTED_ARCHITECTURES", File=self.MetaFile)
         return self._SupArchList
 
-    ## Retrieve BUILD_TARGETS
+    # Retrieve BUILD_TARGETS
     @property
     def BuildTargets(self):
         if self._BuildTargets is None:
             if self._Header is None:
                 self._GetHeaderInfo()
             if self._BuildTargets is None:
-                EdkLogger.error('build', ATTRIBUTE_NOT_AVAILABLE, "No BUILD_TARGETS", File=self.MetaFile)
+                EdkLogger.error('build', ATTRIBUTE_NOT_AVAILABLE,
+                                "No BUILD_TARGETS", File=self.MetaFile)
         return self._BuildTargets
 
     @property
@@ -512,7 +535,7 @@ class DscBuildData(PlatformBuildClassObject):
                 self._SkuName = TAB_DEFAULT
         return self._SkuName
 
-    ## Override SKUID_IDENTIFIER
+    # Override SKUID_IDENTIFIER
     @SkuName.setter
     def SkuName(self, Value):
         self._SkuName = Value
@@ -544,7 +567,7 @@ class DscBuildData(PlatformBuildClassObject):
                 self._Postbuild = ''
         return self._Postbuild
 
-    ## Retrieve FLASH_DEFINITION
+    # Retrieve FLASH_DEFINITION
     @property
     def BuildNumber(self):
         if self._BuildNumber is None:
@@ -554,7 +577,7 @@ class DscBuildData(PlatformBuildClassObject):
                 self._BuildNumber = ''
         return self._BuildNumber
 
-    ## Retrieve MAKEFILE_NAME
+    # Retrieve MAKEFILE_NAME
     @property
     def MakefileName(self):
         if self._MakefileName is None:
@@ -564,7 +587,7 @@ class DscBuildData(PlatformBuildClassObject):
                 self._MakefileName = ''
         return self._MakefileName
 
-    ## Retrieve BsBaseAddress
+    # Retrieve BsBaseAddress
     @property
     def BsBaseAddress(self):
         if self._BsBaseAddress is None:
@@ -574,7 +597,7 @@ class DscBuildData(PlatformBuildClassObject):
                 self._BsBaseAddress = ''
         return self._BsBaseAddress
 
-    ## Retrieve RtBaseAddress
+    # Retrieve RtBaseAddress
     @property
     def RtBaseAddress(self):
         if self._RtBaseAddress is None:
@@ -584,7 +607,7 @@ class DscBuildData(PlatformBuildClassObject):
                 self._RtBaseAddress = ''
         return self._RtBaseAddress
 
-    ## Retrieve the top address for the load fix address
+    # Retrieve the top address for the load fix address
     @property
     def LoadFixAddress(self):
         if self._LoadFixAddress is None:
@@ -592,30 +615,36 @@ class DscBuildData(PlatformBuildClassObject):
                 self._GetHeaderInfo()
 
             if self._LoadFixAddress is None:
-                self._LoadFixAddress = self._Macros.get(TAB_FIX_LOAD_TOP_MEMORY_ADDRESS, '0')
+                self._LoadFixAddress = self._Macros.get(
+                    TAB_FIX_LOAD_TOP_MEMORY_ADDRESS, '0')
 
             try:
-                self._LoadFixAddress = int (self._LoadFixAddress, 0)
+                self._LoadFixAddress = int(self._LoadFixAddress, 0)
             except:
-                EdkLogger.error("build", PARAMETER_INVALID, "FIX_LOAD_TOP_MEMORY_ADDRESS %s is not valid dec or hex string" % (self._LoadFixAddress))
+                EdkLogger.error("build", PARAMETER_INVALID, "FIX_LOAD_TOP_MEMORY_ADDRESS %s is not valid dec or hex string" % (
+                    self._LoadFixAddress))
 
         #
         # If command line defined, should override the value in DSC file.
         #
         if 'FIX_LOAD_TOP_MEMORY_ADDRESS' in GlobalData.gCommandLineDefines:
             try:
-                self._LoadFixAddress = int(GlobalData.gCommandLineDefines['FIX_LOAD_TOP_MEMORY_ADDRESS'], 0)
+                self._LoadFixAddress = int(
+                    GlobalData.gCommandLineDefines['FIX_LOAD_TOP_MEMORY_ADDRESS'], 0)
             except:
-                EdkLogger.error("build", PARAMETER_INVALID, "FIX_LOAD_TOP_MEMORY_ADDRESS %s is not valid dec or hex string" % (GlobalData.gCommandLineDefines['FIX_LOAD_TOP_MEMORY_ADDRESS']))
+                EdkLogger.error("build", PARAMETER_INVALID, "FIX_LOAD_TOP_MEMORY_ADDRESS %s is not valid dec or hex string" % (
+                    GlobalData.gCommandLineDefines['FIX_LOAD_TOP_MEMORY_ADDRESS']))
 
         if self._LoadFixAddress < 0:
-            EdkLogger.error("build", PARAMETER_INVALID, "FIX_LOAD_TOP_MEMORY_ADDRESS is set to the invalid negative value 0x%x" % (self._LoadFixAddress))
+            EdkLogger.error("build", PARAMETER_INVALID, "FIX_LOAD_TOP_MEMORY_ADDRESS is set to the invalid negative value 0x%x" % (
+                self._LoadFixAddress))
         if self._LoadFixAddress != 0xFFFFFFFFFFFFFFFF and self._LoadFixAddress % 0x1000 != 0:
-            EdkLogger.error("build", PARAMETER_INVALID, "FIX_LOAD_TOP_MEMORY_ADDRESS is set to the invalid unaligned 4K value 0x%x" % (self._LoadFixAddress))
+            EdkLogger.error("build", PARAMETER_INVALID, "FIX_LOAD_TOP_MEMORY_ADDRESS is set to the invalid unaligned 4K value 0x%x" % (
+                self._LoadFixAddress))
 
         return self._LoadFixAddress
 
-    ## Retrieve RFCLanguage filter
+    # Retrieve RFCLanguage filter
     @property
     def RFCLanguages(self):
         if self._RFCLanguages is None:
@@ -625,7 +654,7 @@ class DscBuildData(PlatformBuildClassObject):
                 self._RFCLanguages = []
         return self._RFCLanguages
 
-    ## Retrieve ISOLanguage filter
+    # Retrieve ISOLanguage filter
     @property
     def ISOLanguages(self):
         if self._ISOLanguages is None:
@@ -635,7 +664,7 @@ class DscBuildData(PlatformBuildClassObject):
                 self._ISOLanguages = []
         return self._ISOLanguages
 
-    ## Retrieve the GUID string for VPD tool
+    # Retrieve the GUID string for VPD tool
     @property
     def VpdToolGuid(self):
         if self._VpdToolGuid is None:
@@ -645,7 +674,7 @@ class DscBuildData(PlatformBuildClassObject):
                 self._VpdToolGuid = ''
         return self._VpdToolGuid
 
-    ## Retrieve [SkuIds] section information
+    # Retrieve [SkuIds] section information
     @property
     def SkuIds(self):
         if self._SkuIds is None:
@@ -664,7 +693,8 @@ class DscBuildData(PlatformBuildClassObject):
                 if not SkuIdPattern.match(Record[1]) or (Record[2] and not SkuIdPattern.match(Record[2])):
                     EdkLogger.error('build', FORMAT_INVALID, "The format of the Sku ID name is invalid. The correct format is '(a-zA-Z_)(a-zA-Z0-9_)*'",
                                     File=self.MetaFile, Line=Record[-1])
-                self._SkuIds[Record[1].upper()] = (str(DscBuildData.ToInt(Record[0])), Record[1].upper(), Record[2].upper())
+                self._SkuIds[Record[1].upper()] = (
+                    str(DscBuildData.ToInt(Record[0])), Record[1].upper(), Record[2].upper())
             if TAB_DEFAULT not in self._SkuIds:
                 self._SkuIds[TAB_DEFAULT] = ("0", TAB_DEFAULT, TAB_DEFAULT)
             if TAB_COMMON not in self._SkuIds:
@@ -692,9 +722,11 @@ class DscBuildData(PlatformBuildClassObject):
                 if not IsValidWord(Record[1]):
                     EdkLogger.error('build', FORMAT_INVALID, "The format of the DefaultStores ID name is invalid. The correct format is '(a-zA-Z0-9_)(a-zA-Z0-9_-.)*'",
                                     File=self.MetaFile, Line=Record[-1])
-                self.DefaultStores[Record[1].upper()] = (DscBuildData.ToInt(Record[0]), Record[1].upper())
+                self.DefaultStores[Record[1].upper()] = (
+                    DscBuildData.ToInt(Record[0]), Record[1].upper())
             if TAB_DEFAULT_STORES_DEFAULT not in self.DefaultStores:
-                self.DefaultStores[TAB_DEFAULT_STORES_DEFAULT] = (0, TAB_DEFAULT_STORES_DEFAULT)
+                self.DefaultStores[TAB_DEFAULT_STORES_DEFAULT] = (
+                    0, TAB_DEFAULT_STORES_DEFAULT)
             GlobalData.gDefaultStores = sorted(self.DefaultStores.keys())
         return self.DefaultStores
 
@@ -704,32 +736,38 @@ class DscBuildData(PlatformBuildClassObject):
         Components = {}
         for Record in RecordList:
             ModuleId = Record[6]
-            file_guid = self._RawData[MODEL_META_DATA_HEADER, self._Arch, None, ModuleId]
+            file_guid = self._RawData[MODEL_META_DATA_HEADER,
+                                      self._Arch, None, ModuleId]
             file_guid_str = file_guid[0][2] if file_guid else "NULL"
-            ModuleFile = PathClass(NormPath(Record[0], Macros), GlobalData.gWorkspace, Arch=self._Arch)
-            if self._Arch != TAB_ARCH_COMMON and (file_guid_str,str(ModuleFile)) in Components:
-                self._RawData.DisableOverrideComponent(Components[(file_guid_str,str(ModuleFile))])
-            Components[(file_guid_str,str(ModuleFile))] = ModuleId
+            ModuleFile = PathClass(
+                NormPath(Record[0], Macros), GlobalData.gWorkspace, Arch=self._Arch)
+            if self._Arch != TAB_ARCH_COMMON and (file_guid_str, str(ModuleFile)) in Components:
+                self._RawData.DisableOverrideComponent(
+                    Components[(file_guid_str, str(ModuleFile))])
+            Components[(file_guid_str, str(ModuleFile))] = ModuleId
         self._RawData._PostProcessed = False
 
-    ## Retrieve packages this Platform depends on
+    # Retrieve packages this Platform depends on
     @cached_property
     def Packages(self):
         RetVal = set()
         RecordList = self._RawData[MODEL_META_DATA_PACKAGE, self._Arch]
         Macros = self._Macros
         for Record in RecordList:
-            File = PathClass(NormPath(Record[0], Macros), GlobalData.gWorkspace, Arch=self._Arch)
+            File = PathClass(
+                NormPath(Record[0], Macros), GlobalData.gWorkspace, Arch=self._Arch)
             # check the file validation
             ErrorCode, ErrorInfo = File.Validate('.dec')
             if ErrorCode != 0:
                 LineNo = Record[-1]
-                EdkLogger.error('build', ErrorCode, ExtraData=ErrorInfo, File=self.MetaFile, Line=LineNo)
+                EdkLogger.error(
+                    'build', ErrorCode, ExtraData=ErrorInfo, File=self.MetaFile, Line=LineNo)
             # parse this package now. we need it to get protocol/ppi/guid value
-            RetVal.add(self._Bdb[File, self._Arch, self._Target, self._Toolchain])
+            RetVal.add(self._Bdb[File, self._Arch,
+                       self._Target, self._Toolchain])
         return RetVal
 
-    ## Retrieve [Components] section information
+    # Retrieve [Components] section information
     @property
     def Modules(self):
         if self._Modules is not None:
@@ -739,7 +777,8 @@ class DscBuildData(PlatformBuildClassObject):
         RecordList = self._RawData[MODEL_META_DATA_COMPONENT, self._Arch]
         Macros = self._Macros
         for Record in RecordList:
-            ModuleFile = PathClass(NormPath(Record[0], Macros), GlobalData.gWorkspace, Arch=self._Arch)
+            ModuleFile = PathClass(
+                NormPath(Record[0], Macros), GlobalData.gWorkspace, Arch=self._Arch)
             ModuleId = Record[6]
             LineNo = Record[7]
 
@@ -749,15 +788,18 @@ class DscBuildData(PlatformBuildClassObject):
                 EdkLogger.error('build', ErrorCode, File=self.MetaFile, Line=LineNo,
                                 ExtraData=ErrorInfo)
 
-            ModuleBuildData = self._Bdb[ModuleFile, self._Arch, self._Target, self._Toolchain]
+            ModuleBuildData = self._Bdb[ModuleFile,
+                                        self._Arch, self._Target, self._Toolchain]
             Module = ModuleBuildClassObject()
             Module.MetaFile = ModuleFile
             Module.Guid = ModuleBuildData.Guid
             # get module private library instance
-            RecordList = self._RawData[MODEL_EFI_LIBRARY_CLASS, self._Arch, None, ModuleId]
+            RecordList = self._RawData[MODEL_EFI_LIBRARY_CLASS,
+                                       self._Arch, None, ModuleId]
             for Record in RecordList:
                 LibraryClass = Record[0]
-                LibraryPath = PathClass(NormPath(Record[1], Macros), GlobalData.gWorkspace, Arch=self._Arch)
+                LibraryPath = PathClass(
+                    NormPath(Record[1], Macros), GlobalData.gWorkspace, Arch=self._Arch)
                 LineNo = Record[-1]
 
                 # check the file validation
@@ -769,13 +811,14 @@ class DscBuildData(PlatformBuildClassObject):
                 if LibraryClass == '' or LibraryClass == 'NULL':
                     self._NullLibraryNumber += 1
                     LibraryClass = 'NULL%d' % self._NullLibraryNumber
-                    EdkLogger.verbose("Found forced library for %s\n\t%s [%s]" % (ModuleFile, LibraryPath, LibraryClass))
+                    EdkLogger.verbose("Found forced library for %s\n\t%s [%s]" % (
+                        ModuleFile, LibraryPath, LibraryClass))
                 Module.LibraryClasses[LibraryClass] = LibraryPath
                 if LibraryPath not in self.LibraryInstances:
                     self.LibraryInstances.append(LibraryPath)
             S_PcdSet = []
             # get module private PCD setting
-            for Type in [MODEL_PCD_FIXED_AT_BUILD, MODEL_PCD_PATCHABLE_IN_MODULE, \
+            for Type in [MODEL_PCD_FIXED_AT_BUILD, MODEL_PCD_PATCHABLE_IN_MODULE,
                          MODEL_PCD_FEATURE_FLAG, MODEL_PCD_DYNAMIC, MODEL_PCD_DYNAMIC_EX]:
                 RecordList = self._RawData[Type, self._Arch, None, ModuleId]
                 for TokenSpaceGuid, PcdCName, Setting, Dummy1, Dummy2, Dummy3, Dummy4, Dummy5 in RecordList:
@@ -788,45 +831,52 @@ class DscBuildData(PlatformBuildClassObject):
                         MaxDatumSize = ''
                     TypeString = self._PCD_TYPE_STRING_[Type]
 
-                    TCName,PCName,DimensionAttr,Field = self.ParsePcdNameStruct(TokenSpaceGuid, PcdCName)
+                    TCName, PCName, DimensionAttr, Field = self.ParsePcdNameStruct(
+                        TokenSpaceGuid, PcdCName)
 
                     if ("." in TokenSpaceGuid or "[" in PcdCName):
-                        S_PcdSet.append([ TCName,PCName,DimensionAttr,Field, ModuleBuildData.Guid, "", Dummy5, AnalyzePcdExpression(Setting)[0]])
+                        S_PcdSet.append([TCName, PCName, DimensionAttr, Field,
+                                        ModuleBuildData.Guid, "", Dummy5, AnalyzePcdExpression(Setting)[0]])
                         DefaultValue = ''
-                    if ( PCName,TCName) not in Module.Pcds:
+                    if (PCName, TCName) not in Module.Pcds:
                         Pcd = PcdClassObject(
-                                PCName,
-                                TCName,
-                                TypeString,
-                                '',
-                                DefaultValue,
-                                '',
-                                MaxDatumSize,
-                                {},
-                                False,
-                                None,
-                                IsDsc=True)
+                            PCName,
+                            TCName,
+                            TypeString,
+                            '',
+                            DefaultValue,
+                            '',
+                            MaxDatumSize,
+                            {},
+                            False,
+                            None,
+                            IsDsc=True)
                         Module.Pcds[PCName, TCName] = Pcd
 
             Module.StrPcdSet = S_PcdSet
-            for TCName,PCName, _,_,_,_,_,_ in S_PcdSet:
-                if (PCName,TCName) in Module.Pcds:
-                    Module.StrPcdOverallValue[(PCName,TCName)] = Module.Pcds[(PCName,TCName)].DefaultValue, self.MetaFile,Dummy5
+            for TCName, PCName, _, _, _, _, _, _ in S_PcdSet:
+                if (PCName, TCName) in Module.Pcds:
+                    Module.StrPcdOverallValue[(PCName, TCName)] = Module.Pcds[(
+                        PCName, TCName)].DefaultValue, self.MetaFile, Dummy5
             # get module private build options
-            RecordList = self._RawData[MODEL_META_DATA_BUILD_OPTION, self._Arch, None, ModuleId]
+            RecordList = self._RawData[MODEL_META_DATA_BUILD_OPTION,
+                                       self._Arch, None, ModuleId]
             for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4, Dummy5 in RecordList:
                 if (ToolChainFamily, ToolChain) not in Module.BuildOptions:
                     Module.BuildOptions[ToolChainFamily, ToolChain] = Option
                 else:
                     OptionString = Module.BuildOptions[ToolChainFamily, ToolChain]
-                    Module.BuildOptions[ToolChainFamily, ToolChain] = OptionString + " " + Option
+                    Module.BuildOptions[ToolChainFamily,
+                                        ToolChain] = OptionString + " " + Option
 
-            RecordList = self._RawData[MODEL_META_DATA_HEADER, self._Arch, None, ModuleId]
+            RecordList = self._RawData[MODEL_META_DATA_HEADER,
+                                       self._Arch, None, ModuleId]
             if RecordList:
                 if len(RecordList) != 1:
                     EdkLogger.error('build', OPTION_UNKNOWN, 'Only FILE_GUID can be listed in <Defines> section.',
                                     File=self.MetaFile, ExtraData=str(ModuleFile), Line=LineNo)
-                ModuleFile = ProcessDuplicatedInf(ModuleFile, RecordList[0][2], GlobalData.gWorkspace)
+                ModuleFile = ProcessDuplicatedInf(
+                    ModuleFile, RecordList[0][2], GlobalData.gWorkspace)
                 ModuleFile.Arch = self._Arch
                 Module.Guid = RecordList[0][2]
                 for item in Module.StrPcdSet:
@@ -834,14 +884,14 @@ class DscBuildData(PlatformBuildClassObject):
             self._Modules[ModuleFile] = Module
         return self._Modules
 
-    ## Retrieve all possible library instances used in this platform
+    # Retrieve all possible library instances used in this platform
     @property
     def LibraryInstances(self):
         if self._LibraryInstances is None:
             self.LibraryClasses
         return self._LibraryInstances
 
-    ## Retrieve [LibraryClasses] information
+    # Retrieve [LibraryClasses] information
     @property
     def LibraryClasses(self):
         if self._LibraryClasses is None:
@@ -853,16 +903,19 @@ class DscBuildData(PlatformBuildClassObject):
             LibraryClassDict = tdict(True, 3)
             # track all library class names
             LibraryClassSet = set()
-            RecordList = self._RawData[MODEL_EFI_LIBRARY_CLASS, self._Arch, None, -1]
+            RecordList = self._RawData[MODEL_EFI_LIBRARY_CLASS,
+                                       self._Arch, None, -1]
             Macros = self._Macros
             for Record in RecordList:
                 LibraryClass, LibraryInstance, Dummy, Arch, ModuleType, Dummy, Dummy, LineNo = Record
                 if LibraryClass == '' or LibraryClass == 'NULL':
                     self._NullLibraryNumber += 1
                     LibraryClass = 'NULL%d' % self._NullLibraryNumber
-                    EdkLogger.verbose("Found forced library for arch=%s\n\t%s [%s]" % (Arch, LibraryInstance, LibraryClass))
+                    EdkLogger.verbose("Found forced library for arch=%s\n\t%s [%s]" % (
+                        Arch, LibraryInstance, LibraryClass))
                 LibraryClassSet.add(LibraryClass)
-                LibraryInstance = PathClass(NormPath(LibraryInstance, Macros), GlobalData.gWorkspace, Arch=self._Arch)
+                LibraryInstance = PathClass(
+                    NormPath(LibraryInstance, Macros), GlobalData.gWorkspace, Arch=self._Arch)
                 # check the file validation
                 ErrorCode, ErrorInfo = LibraryInstance.Validate('.inf')
                 if ErrorCode != 0:
@@ -872,7 +925,8 @@ class DscBuildData(PlatformBuildClassObject):
                 if ModuleType != TAB_COMMON and ModuleType not in SUP_MODULE_LIST:
                     EdkLogger.error('build', OPTION_UNKNOWN, "Unknown module type [%s]" % ModuleType,
                                     File=self.MetaFile, ExtraData=LibraryInstance, Line=LineNo)
-                LibraryClassDict[Arch, ModuleType, LibraryClass] = LibraryInstance
+                LibraryClassDict[Arch, ModuleType,
+                                 LibraryClass] = LibraryInstance
                 if LibraryInstance not in self._LibraryInstances:
                     self._LibraryInstances.append(LibraryInstance)
 
@@ -881,14 +935,17 @@ class DscBuildData(PlatformBuildClassObject):
             for LibraryClass in LibraryClassSet:
                 # try all possible module types
                 for ModuleType in SUP_MODULE_LIST:
-                    LibraryInstance = LibraryClassDict[self._Arch, ModuleType, LibraryClass]
+                    LibraryInstance = LibraryClassDict[self._Arch,
+                                                       ModuleType, LibraryClass]
                     if LibraryInstance is None:
                         continue
-                    self._LibraryClasses[LibraryClass, ModuleType] = LibraryInstance
+                    self._LibraryClasses[LibraryClass,
+                                         ModuleType] = LibraryInstance
 
             RecordList = self._RawData[MODEL_EFI_LIBRARY_INSTANCE, self._Arch]
             for Record in RecordList:
-                File = PathClass(NormPath(Record[0], Macros), GlobalData.gWorkspace, Arch=self._Arch)
+                File = PathClass(
+                    NormPath(Record[0], Macros), GlobalData.gWorkspace, Arch=self._Arch)
                 LineNo = Record[-1]
                 # check the file validation
                 ErrorCode, ErrorInfo = File.Validate('.inf')
@@ -902,7 +959,8 @@ class DscBuildData(PlatformBuildClassObject):
                 # to parse it here. (self._Bdb[] will trigger a file parse if it
                 # hasn't been parsed)
                 #
-                Library = self._Bdb[File, self._Arch, self._Target, self._Toolchain]
+                Library = self._Bdb[File, self._Arch,
+                                    self._Target, self._Toolchain]
                 self._LibraryClasses[Library.BaseName, ':dummy:'] = Library
         return self._LibraryClasses
 
@@ -915,21 +973,26 @@ class DscBuildData(PlatformBuildClassObject):
 
             PkgSet = set()
             for Inf in FdfInfList:
-                ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch=self._Arch)
+                ModuleFile = PathClass(
+                    NormPath(Inf), GlobalData.gWorkspace, Arch=self._Arch)
                 if ModuleFile in self._Modules:
                     continue
-                ModuleData = self._Bdb[ModuleFile, self._Arch, self._Target, self._Toolchain]
+                ModuleData = self._Bdb[ModuleFile,
+                                       self._Arch, self._Target, self._Toolchain]
                 PkgSet.update(ModuleData.Packages)
             if self.Packages:
                 PkgSet.update(self.Packages)
-            self._DecPcds, self._GuidDict = GetDeclaredPcd(self, self._Bdb, self._Arch, self._Target, self._Toolchain, PkgSet)
+            self._DecPcds, self._GuidDict = GetDeclaredPcd(
+                self, self._Bdb, self._Arch, self._Target, self._Toolchain, PkgSet)
             self._GuidDict.update(GlobalData.gPlatformPcds)
 
         if (PcdCName, TokenSpaceGuid) not in self._DecPcds:
             EdkLogger.error('build', PARSER_ERROR,
-                            "Pcd (%s.%s) defined in DSC is not declared in DEC files referenced in INF files in FDF. Arch: ['%s']" % (TokenSpaceGuid, PcdCName, self._Arch),
+                            "Pcd (%s.%s) defined in DSC is not declared in DEC files referenced in INF files in FDF. Arch: ['%s']" % (
+                                TokenSpaceGuid, PcdCName, self._Arch),
                             File=self.MetaFile, Line=LineNo)
-        ValueList, IsValid, Index = AnalyzeDscPcd(Setting, PcdType, self._DecPcds[PcdCName, TokenSpaceGuid].DatumType)
+        ValueList, IsValid, Index = AnalyzeDscPcd(
+            Setting, PcdType, self._DecPcds[PcdCName, TokenSpaceGuid].DatumType)
         if not IsValid:
             if PcdType not in [MODEL_PCD_FEATURE_FLAG, MODEL_PCD_FIXED_AT_BUILD]:
                 EdkLogger.error('build', FORMAT_INVALID, "Pcd format incorrect.", File=self.MetaFile, Line=LineNo,
@@ -937,16 +1000,17 @@ class DscBuildData(PlatformBuildClassObject):
             else:
                 if ValueList[2] == '-1':
                     EdkLogger.error('build', FORMAT_INVALID, "Pcd format incorrect.", File=self.MetaFile, Line=LineNo,
-                                ExtraData="%s.%s|%s" % (TokenSpaceGuid, PcdCName, Setting))
+                                    ExtraData="%s.%s|%s" % (TokenSpaceGuid, PcdCName, Setting))
         if ValueList[Index]:
             DatumType = self._DecPcds[PcdCName, TokenSpaceGuid].DatumType
             if "{CODE(" not in ValueList[Index]:
                 try:
-                    ValueList[Index] = ValueExpressionEx(ValueList[Index], DatumType, self._GuidDict)(True)
+                    ValueList[Index] = ValueExpressionEx(
+                        ValueList[Index], DatumType, self._GuidDict)(True)
                 except BadExpression as Value:
                     EdkLogger.error('Parser', FORMAT_INVALID, Value, File=self.MetaFile, Line=LineNo,
                                     ExtraData="PCD [%s.%s] Value \"%s\" " % (
-                                    TokenSpaceGuid, PcdCName, ValueList[Index]))
+                                        TokenSpaceGuid, PcdCName, ValueList[Index]))
                 except EvaluationException as Excpt:
                     if hasattr(Excpt, 'Pcd'):
                         if Excpt.Pcd in GlobalData.gPlatformOtherPcds:
@@ -962,7 +1026,8 @@ class DscBuildData(PlatformBuildClassObject):
                                         File=self.MetaFile, Line=LineNo)
 
         if ValueList[Index]:
-            Valid, ErrStr = CheckPcdDatum(self._DecPcds[PcdCName, TokenSpaceGuid].DatumType, ValueList[Index])
+            Valid, ErrStr = CheckPcdDatum(
+                self._DecPcds[PcdCName, TokenSpaceGuid].DatumType, ValueList[Index])
             if not Valid:
                 EdkLogger.error('build', FORMAT_INVALID, ErrStr, File=self.MetaFile, Line=LineNo,
                                 ExtraData="%s.%s" % (TokenSpaceGuid, PcdCName))
@@ -970,14 +1035,16 @@ class DscBuildData(PlatformBuildClassObject):
                 if self._DecPcds[PcdCName, TokenSpaceGuid].DatumType.strip() != ValueList[1].strip():
                     DecPcd = self._DecPcds[PcdCName, TokenSpaceGuid]
                     EdkLogger.error('build', FORMAT_INVALID,
-                                    "Pcd datumtype used in DSC file is not the same as its declaration. DatumType:%s"%DecPcd.DatumType,
+                                    "Pcd datumtype used in DSC file is not the same as its declaration. DatumType:%s" % DecPcd.DatumType,
                                     File=self.MetaFile, Line=LineNo,
-                                    ExtraData="Dsc:%s.%s|%s\n    Dec:%s.%s|%s|%s|%s" % (TokenSpaceGuid, PcdCName, Setting, TokenSpaceGuid, \
-                                    PcdCName, DecPcd.DefaultValue, DecPcd.DatumType, DecPcd.TokenValue))
+                                    ExtraData="Dsc:%s.%s|%s\n    Dec:%s.%s|%s|%s|%s" % (TokenSpaceGuid, PcdCName, Setting, TokenSpaceGuid,
+                                                                                        PcdCName, DecPcd.DefaultValue, DecPcd.DatumType, DecPcd.TokenValue))
         if (TokenSpaceGuid + '.' + PcdCName) in GlobalData.gPlatformPcds:
             if GlobalData.gPlatformPcds[TokenSpaceGuid + '.' + PcdCName] != ValueList[Index]:
-                GlobalData.gPlatformPcds[TokenSpaceGuid + '.' + PcdCName] = ValueList[Index]
-            GlobalData.gPlatformFinalPcds[TokenSpaceGuid + '.' + PcdCName] = ValueList[Index]
+                GlobalData.gPlatformPcds[TokenSpaceGuid +
+                                         '.' + PcdCName] = ValueList[Index]
+            GlobalData.gPlatformFinalPcds[TokenSpaceGuid +
+                                          '.' + PcdCName] = ValueList[Index]
         return ValueList
 
     def _FilterPcdBySkuUsage(self, Pcds):
@@ -986,38 +1053,47 @@ class DscBuildData(PlatformBuildClassObject):
         if sku_usage == SkuClass.SINGLE:
             for pcdname in Pcds:
                 pcd = Pcds[pcdname]
-                Pcds[pcdname].SkuInfoList = {TAB_DEFAULT:pcd.SkuInfoList[skuid] for skuid in pcd.SkuInfoList if skuid in available_sku}
+                Pcds[pcdname].SkuInfoList = {TAB_DEFAULT: pcd.SkuInfoList[skuid]
+                                             for skuid in pcd.SkuInfoList if skuid in available_sku}
                 if isinstance(pcd, StructurePcd) and pcd.SkuOverrideValues:
-                    Pcds[pcdname].SkuOverrideValues = {TAB_DEFAULT:pcd.SkuOverrideValues[skuid] for skuid in pcd.SkuOverrideValues if skuid in available_sku}
+                    Pcds[pcdname].SkuOverrideValues = {
+                        TAB_DEFAULT: pcd.SkuOverrideValues[skuid] for skuid in pcd.SkuOverrideValues if skuid in available_sku}
         else:
             for pcdname in Pcds:
                 pcd = Pcds[pcdname]
-                Pcds[pcdname].SkuInfoList = {skuid:pcd.SkuInfoList[skuid] for skuid in pcd.SkuInfoList if skuid in available_sku}
+                Pcds[pcdname].SkuInfoList = {skuid: pcd.SkuInfoList[skuid]
+                                             for skuid in pcd.SkuInfoList if skuid in available_sku}
                 if isinstance(pcd, StructurePcd) and pcd.SkuOverrideValues:
-                    Pcds[pcdname].SkuOverrideValues = {skuid:pcd.SkuOverrideValues[skuid] for skuid in pcd.SkuOverrideValues if skuid in available_sku}
+                    Pcds[pcdname].SkuOverrideValues = {
+                        skuid: pcd.SkuOverrideValues[skuid] for skuid in pcd.SkuOverrideValues if skuid in available_sku}
         return Pcds
 
     def CompleteHiiPcdsDefaultStores(self, Pcds):
-        HiiPcd = [Pcds[pcd] for pcd in Pcds if Pcds[pcd].Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]]
+        HiiPcd = [Pcds[pcd] for pcd in Pcds if Pcds[pcd].Type in [self._PCD_TYPE_STRING_[
+            MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]]
         DefaultStoreMgr = DefaultStore(self.DefaultStores)
         for pcd in HiiPcd:
             for skuid in pcd.SkuInfoList:
                 skuobj = pcd.SkuInfoList.get(skuid)
                 if TAB_DEFAULT_STORES_DEFAULT not in skuobj.DefaultStoreDict:
-                    PcdDefaultStoreSet = set(defaultstorename  for defaultstorename in skuobj.DefaultStoreDict)
-                    mindefaultstorename = DefaultStoreMgr.GetMin(PcdDefaultStoreSet)
+                    PcdDefaultStoreSet = set(
+                        defaultstorename for defaultstorename in skuobj.DefaultStoreDict)
+                    mindefaultstorename = DefaultStoreMgr.GetMin(
+                        PcdDefaultStoreSet)
                     skuobj.DefaultStoreDict[TAB_DEFAULT_STORES_DEFAULT] = skuobj.DefaultStoreDict[mindefaultstorename]
         return Pcds
 
     def RecoverCommandLinePcd(self):
         def UpdateCommandLineValue(pcd):
             if pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
-                                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
+                            self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
                 pcd.PcdValueFromComm = pcd.DefaultValue
             elif pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
-                pcd.PcdValueFromComm = pcd.SkuInfoList.get(TAB_DEFAULT).HiiDefaultValue
+                pcd.PcdValueFromComm = pcd.SkuInfoList.get(
+                    TAB_DEFAULT).HiiDefaultValue
             else:
-                pcd.PcdValueFromComm = pcd.SkuInfoList.get(TAB_DEFAULT).DefaultValue
+                pcd.PcdValueFromComm = pcd.SkuInfoList.get(
+                    TAB_DEFAULT).DefaultValue
         for pcd in self._Pcds:
             if isinstance(self._Pcds[pcd], StructurePcd) and (self._Pcds[pcd].PcdValueFromComm or self._Pcds[pcd].PcdFieldValueFromComm):
                 UpdateCommandLineValue(self._Pcds[pcd])
@@ -1029,7 +1105,8 @@ class DscBuildData(PlatformBuildClassObject):
                     continue
                 (pcdname, pcdvalue) = pcd.split('=')
                 if not pcdvalue:
-                    EdkLogger.error('build', AUTOGEN_ERROR, "No Value specified for the PCD %s." % (pcdname))
+                    EdkLogger.error(
+                        'build', AUTOGEN_ERROR, "No Value specified for the PCD %s." % (pcdname))
                 if '.' in pcdname:
                     (Name1, Name2) = pcdname.split('.', 1)
                     if "." in Name2:
@@ -1048,7 +1125,7 @@ class DscBuildData(PlatformBuildClassObject):
                             HasTokenSpace = True
                             TokenCName = Name2
                             TokenSpaceGuidCName = Name1
-                            FieldName =""
+                            FieldName = ""
                         else:
                             FieldName = Name2
                             TokenCName = Name1
@@ -1070,48 +1147,59 @@ class DscBuildData(PlatformBuildClassObject):
                         PcdItem = self.DecPcds[key]
                         if TokenCName == PcdItem.TokenCName:
                             if not PcdItem.TokenSpaceGuidCName in TokenSpaceGuidCNameList:
-                                if len (TokenSpaceGuidCNameList) < 1:
-                                    TokenSpaceGuidCNameList.append(PcdItem.TokenSpaceGuidCName)
+                                if len(TokenSpaceGuidCNameList) < 1:
+                                    TokenSpaceGuidCNameList.append(
+                                        PcdItem.TokenSpaceGuidCName)
                                     TokenSpaceGuidCName = PcdItem.TokenSpaceGuidCName
                                     PcdDatumType = PcdItem.DatumType
                                     FoundFlag = True
                                 else:
                                     EdkLogger.error(
-                                            'build',
-                                             AUTOGEN_ERROR,
-                                            "The Pcd %s is found under multiple different TokenSpaceGuid: %s and %s." % (DisplayName, PcdItem.TokenSpaceGuidCName, TokenSpaceGuidCNameList[0])
-                                            )
+                                        'build',
+                                        AUTOGEN_ERROR,
+                                        "The Pcd %s is found under multiple different TokenSpaceGuid: %s and %s." % (
+                                            DisplayName, PcdItem.TokenSpaceGuidCName, TokenSpaceGuidCNameList[0])
+                                    )
                 else:
                     if (TokenCName, TokenSpaceGuidCName) in self.DecPcds:
-                        PcdDatumType = self.DecPcds[(TokenCName, TokenSpaceGuidCName)].DatumType
+                        PcdDatumType = self.DecPcds[(
+                            TokenCName, TokenSpaceGuidCName)].DatumType
                         FoundFlag = True
                 if not FoundFlag:
                     if HasTokenSpace:
-                        EdkLogger.error('build', AUTOGEN_ERROR, "The Pcd %s.%s is not found in the DEC file." % (TokenSpaceGuidCName, DisplayName))
+                        EdkLogger.error('build', AUTOGEN_ERROR, "The Pcd %s.%s is not found in the DEC file." % (
+                            TokenSpaceGuidCName, DisplayName))
                     else:
-                        EdkLogger.error('build', AUTOGEN_ERROR, "The Pcd %s is not found in the DEC file." % (DisplayName))
-                pcdvalue = pcdvalue.replace("\\\\\\'", '\\\\\\"').replace('\\\'', '\'').replace('\\\\\\"', "\\'")
+                        EdkLogger.error(
+                            'build', AUTOGEN_ERROR, "The Pcd %s is not found in the DEC file." % (DisplayName))
+                pcdvalue = pcdvalue.replace("\\\\\\'", '\\\\\\"').replace(
+                    '\\\'', '\'').replace('\\\\\\"', "\\'")
                 if FieldName:
-                    pcdvalue = DscBuildData.HandleFlexiblePcd(TokenSpaceGuidCName, TokenCName, pcdvalue, PcdDatumType, self._GuidDict, FieldName)
+                    pcdvalue = DscBuildData.HandleFlexiblePcd(
+                        TokenSpaceGuidCName, TokenCName, pcdvalue, PcdDatumType, self._GuidDict, FieldName)
                 else:
-                    pcdvalue = DscBuildData.HandleFlexiblePcd(TokenSpaceGuidCName, TokenCName, pcdvalue, PcdDatumType, self._GuidDict)
+                    pcdvalue = DscBuildData.HandleFlexiblePcd(
+                        TokenSpaceGuidCName, TokenCName, pcdvalue, PcdDatumType, self._GuidDict)
                     IsValid, Cause = CheckPcdDatum(PcdDatumType, pcdvalue)
                     if not IsValid:
-                        EdkLogger.error("build", FORMAT_INVALID, Cause, ExtraData="%s.%s" % (TokenSpaceGuidCName, TokenCName))
-                GlobalData.BuildOptionPcd[i] = (TokenSpaceGuidCName, TokenCName, FieldName, pcdvalue, ("build command options", 1))
+                        EdkLogger.error("build", FORMAT_INVALID, Cause, ExtraData="%s.%s" % (
+                            TokenSpaceGuidCName, TokenCName))
+                GlobalData.BuildOptionPcd[i] = (
+                    TokenSpaceGuidCName, TokenCName, FieldName, pcdvalue, ("build command options", 1))
 
         if GlobalData.BuildOptionPcd:
-            inf_objs = [item for item in self._Bdb._CACHE_.values() if item.Arch == self.Arch and item.MetaFile.Ext.lower() == '.inf']
+            inf_objs = [item for item in self._Bdb._CACHE_.values(
+            ) if item.Arch == self.Arch and item.MetaFile.Ext.lower() == '.inf']
             for pcd in GlobalData.BuildOptionPcd:
                 (TokenSpaceGuidCName, TokenCName, FieldName, pcdvalue, _) = pcd
                 for BuildData in inf_objs:
                     for key in BuildData.Pcds:
                         PcdItem = BuildData.Pcds[key]
-                        if (TokenSpaceGuidCName, TokenCName) == (PcdItem.TokenSpaceGuidCName, PcdItem.TokenCName) and FieldName =="":
+                        if (TokenSpaceGuidCName, TokenCName) == (PcdItem.TokenSpaceGuidCName, PcdItem.TokenCName) and FieldName == "":
                             PcdItem.DefaultValue = pcdvalue
                             PcdItem.PcdValueFromComm = pcdvalue
-        #In command line, the latter full assign value in commandLine should override the former field assign value.
-        #For example, --pcd Token.pcd.field="" --pcd Token.pcd=H"{}"
+        # In command line, the latter full assign value in commandLine should override the former field assign value.
+        # For example, --pcd Token.pcd.field="" --pcd Token.pcd=H"{}"
         delete_assign = []
         field_assign = {}
         if GlobalData.BuildOptionPcd:
@@ -1123,7 +1211,8 @@ class DscBuildData(PlatformBuildClassObject):
                     field_assign[TokenSpaceGuid, Token].append(pcdTuple)
                 else:
                     if (TokenSpaceGuid, Token) in field_assign:
-                        delete_assign.extend(field_assign[TokenSpaceGuid, Token])
+                        delete_assign.extend(
+                            field_assign[TokenSpaceGuid, Token])
                         field_assign[TokenSpaceGuid, Token] = []
             for item in delete_assign:
                 GlobalData.BuildOptionPcd.remove(item)
@@ -1140,7 +1229,8 @@ class DscBuildData(PlatformBuildClassObject):
             if FieldName and not IsArray:
                 return PcdValue
             try:
-                PcdValue = ValueExpressionEx(PcdValue[1:], PcdDatumType, GuidDict)(True)
+                PcdValue = ValueExpressionEx(
+                    PcdValue[1:], PcdDatumType, GuidDict)(True)
             except BadExpression as Value:
                 EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s",  %s' %
                                 (TokenSpaceGuidCName, TokenCName, PcdValue, Value))
@@ -1151,7 +1241,8 @@ class DscBuildData(PlatformBuildClassObject):
             if FieldName and not IsArray:
                 return PcdValue
             try:
-                PcdValue = ValueExpressionEx(PcdValue, PcdDatumType, GuidDict)(True)
+                PcdValue = ValueExpressionEx(
+                    PcdValue, PcdDatumType, GuidDict)(True)
             except BadExpression as Value:
                 EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s",  %s' %
                                 (TokenSpaceGuidCName, TokenCName, PcdValue, Value))
@@ -1163,7 +1254,8 @@ class DscBuildData(PlatformBuildClassObject):
             if FieldName and not IsArray:
                 return PcdValue
             try:
-                PcdValue = ValueExpressionEx(PcdValue, PcdDatumType, GuidDict)(True)
+                PcdValue = ValueExpressionEx(
+                    PcdValue, PcdDatumType, GuidDict)(True)
             except BadExpression as Value:
                 EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s",  %s' %
                                 (TokenSpaceGuidCName, TokenCName, PcdValue, Value))
@@ -1192,13 +1284,14 @@ class DscBuildData(PlatformBuildClassObject):
                 if not IsArray:
                     return PcdValue
             try:
-                PcdValue = ValueExpressionEx(PcdValue, PcdDatumType, GuidDict)(True)
+                PcdValue = ValueExpressionEx(
+                    PcdValue, PcdDatumType, GuidDict)(True)
             except BadExpression as Value:
                 EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s",  %s' %
                                 (TokenSpaceGuidCName, TokenCName, PcdValue, Value))
         return PcdValue
 
-    ## Retrieve all PCD settings in platform
+    # Retrieve all PCD settings in platform
     @property
     def Pcds(self):
         if self._Pcds is None:
@@ -1210,14 +1303,16 @@ class DscBuildData(PlatformBuildClassObject):
             self._Pcds.update(self._GetDynamicPcd(MODEL_PCD_DYNAMIC_DEFAULT))
             self._Pcds.update(self._GetDynamicHiiPcd(MODEL_PCD_DYNAMIC_HII))
             self._Pcds.update(self._GetDynamicVpdPcd(MODEL_PCD_DYNAMIC_VPD))
-            self._Pcds.update(self._GetDynamicPcd(MODEL_PCD_DYNAMIC_EX_DEFAULT))
+            self._Pcds.update(self._GetDynamicPcd(
+                MODEL_PCD_DYNAMIC_EX_DEFAULT))
             self._Pcds.update(self._GetDynamicHiiPcd(MODEL_PCD_DYNAMIC_EX_HII))
             self._Pcds.update(self._GetDynamicVpdPcd(MODEL_PCD_DYNAMIC_EX_VPD))
 
             self._Pcds = self.CompletePcdValues(self._Pcds)
             self._Pcds = self.OverrideByFdfOverAll(self._Pcds)
             self._Pcds = self.OverrideByCommOverAll(self._Pcds)
-            self._Pcds = self.UpdateStructuredPcds(MODEL_PCD_TYPE_LIST, self._Pcds)
+            self._Pcds = self.UpdateStructuredPcds(
+                MODEL_PCD_TYPE_LIST, self._Pcds)
             self._Pcds = self.CompleteHiiPcdsDefaultStores(self._Pcds)
             self._Pcds = self._FilterPcdBySkuUsage(self._Pcds)
 
@@ -1233,7 +1328,8 @@ class DscBuildData(PlatformBuildClassObject):
             # Retrieve build option for EDKII and EDK style module
             #
             for CodeBase in (EDKII_NAME, EDK_NAME):
-                RecordList = self._RawData[MODEL_META_DATA_BUILD_OPTION, self._Arch, CodeBase]
+                RecordList = self._RawData[MODEL_META_DATA_BUILD_OPTION,
+                                           self._Arch, CodeBase]
                 for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4, Dummy5 in RecordList:
                     if Dummy3.upper() != TAB_COMMON:
                         continue
@@ -1247,6 +1343,7 @@ class DscBuildData(PlatformBuildClassObject):
                         if ' ' + Option not in self._BuildOptions[CurKey]:
                             self._BuildOptions[CurKey] += ' ' + Option
         return self._BuildOptions
+
     def GetBuildOptionsByPkg(self, Module, ModuleType):
 
         local_pkg = os.path.split(Module.LocalPkg())[0]
@@ -1254,10 +1351,10 @@ class DscBuildData(PlatformBuildClassObject):
             self._ModuleTypeOptions = OrderedDict()
         if ModuleType not in self._ModuleTypeOptions:
             options = OrderedDict()
-            self._ModuleTypeOptions[ ModuleType] = options
+            self._ModuleTypeOptions[ModuleType] = options
             RecordList = self._RawData[MODEL_META_DATA_BUILD_OPTION, self._Arch]
             for ToolChainFamily, ToolChain, Option, Dummy1, Dummy2, Dummy3, Dummy4, Dummy5 in RecordList:
-                if Dummy2 not in (TAB_COMMON,local_pkg.upper(),"EDKII"):
+                if Dummy2 not in (TAB_COMMON, local_pkg.upper(), "EDKII"):
                     continue
                 Type = Dummy3
                 if Type.upper() == ModuleType.upper():
@@ -1268,6 +1365,7 @@ class DscBuildData(PlatformBuildClassObject):
                         if ' ' + Option not in options[Key]:
                             options[Key] += ' ' + Option
         return self._ModuleTypeOptions[ModuleType]
+
     def GetBuildOptionsByModuleType(self, Edk, ModuleType):
         if self._ModuleTypeOptions is None:
             self._ModuleTypeOptions = OrderedDict()
@@ -1297,29 +1395,31 @@ class DscBuildData(PlatformBuildClassObject):
         return structure_pcd_data
 
     @staticmethod
-    def OverrideByFdf(StruPcds,workspace):
+    def OverrideByFdf(StruPcds, workspace):
         if GlobalData.gFdfParser is None:
             return StruPcds
         StructurePcdInFdf = OrderedDict()
         fdfpcd = GlobalData.gFdfParser.Profile.PcdDict
         fdfpcdlocation = GlobalData.gFdfParser.Profile.PcdLocalDict
-        for item in fdfpcd :
-            if len(item[2]) and (item[0],item[1]) in StruPcds:
-                StructurePcdInFdf[(item[1],item[0],item[2] )] = fdfpcd[item]
-        GlobalPcds = {(item[0],item[1]) for item in StructurePcdInFdf}
+        for item in fdfpcd:
+            if len(item[2]) and (item[0], item[1]) in StruPcds:
+                StructurePcdInFdf[(item[1], item[0], item[2])] = fdfpcd[item]
+        GlobalPcds = {(item[0], item[1]) for item in StructurePcdInFdf}
         for Pcd in StruPcds.values():
-            if (Pcd.TokenSpaceGuidCName,Pcd.TokenCName) not in GlobalPcds:
+            if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) not in GlobalPcds:
                 continue
             FieldValues = OrderedDict()
             for item in StructurePcdInFdf:
-                if (Pcd.TokenSpaceGuidCName,Pcd.TokenCName) == (item[0],item[1]) and item[2]:
+                if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) == (item[0], item[1]) and item[2]:
                     FieldValues[item[2]] = StructurePcdInFdf[item]
             for field in FieldValues:
                 if field not in Pcd.PcdFieldValueFromFdf:
-                    Pcd.PcdFieldValueFromFdf[field] = ["","",""]
+                    Pcd.PcdFieldValueFromFdf[field] = ["", "", ""]
                 Pcd.PcdFieldValueFromFdf[field][0] = FieldValues[field]
-                Pcd.PcdFieldValueFromFdf[field][1] = os.path.relpath(fdfpcdlocation[(Pcd.TokenCName,Pcd.TokenSpaceGuidCName,field)][0],workspace)
-                Pcd.PcdFieldValueFromFdf[field][2] = fdfpcdlocation[(Pcd.TokenCName,Pcd.TokenSpaceGuidCName,field)][1]
+                Pcd.PcdFieldValueFromFdf[field][1] = os.path.relpath(
+                    fdfpcdlocation[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName, field)][0], workspace)
+                Pcd.PcdFieldValueFromFdf[field][2] = fdfpcdlocation[(
+                    Pcd.TokenCName, Pcd.TokenSpaceGuidCName, field)][1]
 
         return StruPcds
 
@@ -1328,7 +1428,8 @@ class DscBuildData(PlatformBuildClassObject):
         StructurePcdInCom = OrderedDict()
         for item in GlobalData.BuildOptionPcd:
             if len(item) == 5 and (item[1], item[0]) in StruPcds:
-                StructurePcdInCom[(item[0], item[1], item[2] )] = (item[3], item[4])
+                StructurePcdInCom[(item[0], item[1], item[2])
+                                  ] = (item[3], item[4])
         GlobalPcds = {(item[0], item[1]) for item in StructurePcdInCom}
         for Pcd in StruPcds.values():
             if (Pcd.TokenSpaceGuidCName, Pcd.TokenCName) not in GlobalPcds:
@@ -1345,7 +1446,7 @@ class DscBuildData(PlatformBuildClassObject):
                 Pcd.PcdFieldValueFromComm[field][2] = FieldValues[field][1][1]
         return StruPcds
 
-    def OverrideByCommOverAll(self,AllPcds):
+    def OverrideByCommOverAll(self, AllPcds):
         def CheckStructureInComm(commpcds):
             if not commpcds:
                 return False
@@ -1356,7 +1457,8 @@ class DscBuildData(PlatformBuildClassObject):
         if CheckStructureInComm(GlobalData.BuildOptionPcd):
             StructurePcdInCom = OrderedDict()
             for item in GlobalData.BuildOptionPcd:
-                StructurePcdInCom[(item[0], item[1], item[2] )] = (item[3], item[4])
+                StructurePcdInCom[(item[0], item[1], item[2])
+                                  ] = (item[3], item[4])
             for item in StructurePcdInCom:
                 if not item[2]:
                     NoFiledValues[(item[0], item[1])] = StructurePcdInCom[item]
@@ -1367,24 +1469,32 @@ class DscBuildData(PlatformBuildClassObject):
             if (Name, Guid) in AllPcds:
                 Pcd = AllPcds.get((Name, Guid))
                 if isinstance(self._DecPcds.get((Pcd.TokenCName, Pcd.TokenSpaceGuidCName), None), StructurePcd):
-                    self._DecPcds.get((Pcd.TokenCName, Pcd.TokenSpaceGuidCName)).PcdValueFromComm = NoFiledValues[(Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0]
+                    self._DecPcds.get((Pcd.TokenCName, Pcd.TokenSpaceGuidCName)).PcdValueFromComm = NoFiledValues[(
+                        Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0]
                 else:
-                    Pcd.PcdValueFromComm = NoFiledValues[(Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0]
-                    Pcd.DefaultValue = NoFiledValues[(Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0]
+                    Pcd.PcdValueFromComm = NoFiledValues[(
+                        Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0]
+                    Pcd.DefaultValue = NoFiledValues[(
+                        Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0]
                     for sku in Pcd.SkuInfoList:
                         SkuInfo = Pcd.SkuInfoList[sku]
                         if SkuInfo.DefaultValue:
-                            SkuInfo.DefaultValue = NoFiledValues[(Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0]
+                            SkuInfo.DefaultValue = NoFiledValues[(
+                                Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0]
                         else:
-                            SkuInfo.HiiDefaultValue = NoFiledValues[(Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0]
+                            SkuInfo.HiiDefaultValue = NoFiledValues[(
+                                Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0]
                             for defaultstore in SkuInfo.DefaultStoreDict:
-                                SkuInfo.DefaultStoreDict[defaultstore] = NoFiledValues[(Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0]
+                                SkuInfo.DefaultStoreDict[defaultstore] = NoFiledValues[(
+                                    Pcd.TokenSpaceGuidCName, Pcd.TokenCName)][0]
                     if Pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII]]:
                         if Pcd.DatumType == TAB_VOID:
                             if not Pcd.MaxDatumSize:
                                 Pcd.MaxDatumSize = '0'
-                            CurrentSize = int(Pcd.MaxDatumSize, 16) if Pcd.MaxDatumSize.upper().startswith("0X") else int(Pcd.MaxDatumSize)
-                            OptionSize = len((StringToArray(Pcd.PcdValueFromComm)).split(","))
+                            CurrentSize = int(Pcd.MaxDatumSize, 16) if Pcd.MaxDatumSize.upper(
+                            ).startswith("0X") else int(Pcd.MaxDatumSize)
+                            OptionSize = len(
+                                (StringToArray(Pcd.PcdValueFromComm)).split(","))
                             MaxSize = max(CurrentSize, OptionSize)
                             Pcd.MaxDatumSize = str(MaxSize)
             else:
@@ -1392,30 +1502,36 @@ class DscBuildData(PlatformBuildClassObject):
                 if PcdInDec:
                     PcdInDec.PcdValueFromComm = NoFiledValues[(Guid, Name)][0]
                     if PcdInDec.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
-                                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE],
-                                        self._PCD_TYPE_STRING_[MODEL_PCD_FEATURE_FLAG],
-                                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC],
-                                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX]]:
+                                         self._PCD_TYPE_STRING_[
+                                             MODEL_PCD_PATCHABLE_IN_MODULE],
+                                         self._PCD_TYPE_STRING_[
+                                             MODEL_PCD_FEATURE_FLAG],
+                                         self._PCD_TYPE_STRING_[
+                                             MODEL_PCD_DYNAMIC],
+                                         self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX]]:
                         self._Pcds[Name, Guid] = copy.deepcopy(PcdInDec)
-                        self._Pcds[Name, Guid].DefaultValue = NoFiledValues[( Guid, Name)][0]
+                        self._Pcds[Name, Guid].DefaultValue = NoFiledValues[(
+                            Guid, Name)][0]
                     if PcdInDec.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC],
-                                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX]]:
-                        self._Pcds[Name, Guid].SkuInfoList = {TAB_DEFAULT:SkuInfoClass(TAB_DEFAULT, self.SkuIds[TAB_DEFAULT][0], '', '', '', '', '', NoFiledValues[( Guid, Name)][0])}
+                                         self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX]]:
+                        self._Pcds[Name, Guid].SkuInfoList = {TAB_DEFAULT: SkuInfoClass(
+                            TAB_DEFAULT, self.SkuIds[TAB_DEFAULT][0], '', '', '', '', '', NoFiledValues[(Guid, Name)][0])}
         return AllPcds
 
-    def OverrideByFdfOverAll(self,AllPcds):
+    def OverrideByFdfOverAll(self, AllPcds):
 
         if GlobalData.gFdfParser is None:
             return AllPcds
         NoFiledValues = GlobalData.gFdfParser.Profile.PcdDict
-        for Name,Guid,Field in NoFiledValues:
+        for Name, Guid, Field in NoFiledValues:
             if len(Field):
                 continue
-            Value = NoFiledValues[(Name,Guid,Field)]
-            if (Name,Guid) in AllPcds:
-                Pcd = AllPcds.get((Name,Guid))
-                if isinstance(self._DecPcds.get((Pcd.TokenCName,Pcd.TokenSpaceGuidCName), None),StructurePcd):
-                    self._DecPcds.get((Pcd.TokenCName,Pcd.TokenSpaceGuidCName)).PcdValueFromComm = Value
+            Value = NoFiledValues[(Name, Guid, Field)]
+            if (Name, Guid) in AllPcds:
+                Pcd = AllPcds.get((Name, Guid))
+                if isinstance(self._DecPcds.get((Pcd.TokenCName, Pcd.TokenSpaceGuidCName), None), StructurePcd):
+                    self._DecPcds.get(
+                        (Pcd.TokenCName, Pcd.TokenSpaceGuidCName)).PcdValueFromComm = Value
                 else:
                     Pcd.PcdValueFromComm = Value
                     Pcd.DefaultValue = Value
@@ -1431,22 +1547,25 @@ class DscBuildData(PlatformBuildClassObject):
                         if Pcd.DatumType == TAB_VOID:
                             if not Pcd.MaxDatumSize:
                                 Pcd.MaxDatumSize = '0'
-                            CurrentSize = int(Pcd.MaxDatumSize,16) if Pcd.MaxDatumSize.upper().startswith("0X") else int(Pcd.MaxDatumSize)
-                            OptionSize = len((StringToArray(Pcd.PcdValueFromComm)).split(","))
+                            CurrentSize = int(Pcd.MaxDatumSize, 16) if Pcd.MaxDatumSize.upper(
+                            ).startswith("0X") else int(Pcd.MaxDatumSize)
+                            OptionSize = len(
+                                (StringToArray(Pcd.PcdValueFromComm)).split(","))
                             MaxSize = max(CurrentSize, OptionSize)
                             Pcd.MaxDatumSize = str(MaxSize)
             else:
-                PcdInDec = self.DecPcds.get((Name,Guid))
+                PcdInDec = self.DecPcds.get((Name, Guid))
                 if PcdInDec:
                     PcdInDec.PcdValueFromFdf = Value
                     if PcdInDec.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
-                                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE],
-                                        self._PCD_TYPE_STRING_[MODEL_PCD_FEATURE_FLAG]]:
+                                         self._PCD_TYPE_STRING_[
+                                             MODEL_PCD_PATCHABLE_IN_MODULE],
+                                         self._PCD_TYPE_STRING_[MODEL_PCD_FEATURE_FLAG]]:
                         self._Pcds[Name, Guid] = copy.deepcopy(PcdInDec)
                         self._Pcds[Name, Guid].DefaultValue = Value
         return AllPcds
 
-    def ParsePcdNameStruct(self,NamePart1,NamePart2):
+    def ParsePcdNameStruct(self, NamePart1, NamePart2):
         TokenSpaceCName = PcdCName = DimensionAttr = Field = ""
         if "." in NamePart1:
             TokenSpaceCName, TempPcdCName = NamePart1.split(".")
@@ -1464,22 +1583,23 @@ class DscBuildData(PlatformBuildClassObject):
             else:
                 PcdCName = NamePart2
 
-        return TokenSpaceCName,PcdCName,DimensionAttr,Field
+        return TokenSpaceCName, PcdCName, DimensionAttr, Field
 
     def UpdateStructuredPcds(self, TypeList, AllPcds):
 
         DynamicPcdType = [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_DEFAULT],
-                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],
-                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_VPD],
-                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_DEFAULT],
-                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII],
-                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_VPD]]
+                          self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],
+                          self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_VPD],
+                          self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_DEFAULT],
+                          self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII],
+                          self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_VPD]]
 
         Pcds = AllPcds
         DefaultStoreMgr = DefaultStore(self.DefaultStores)
         SkuIds = self.SkuIds
-        self.SkuIdMgr.AvailableSkuIdSet.update({TAB_DEFAULT:0})
-        DefaultStores = {storename for pcdobj in AllPcds.values() for skuobj in pcdobj.SkuInfoList.values() for storename in skuobj.DefaultStoreDict}
+        self.SkuIdMgr.AvailableSkuIdSet.update({TAB_DEFAULT: 0})
+        DefaultStores = {storename for pcdobj in AllPcds.values(
+        ) for skuobj in pcdobj.SkuInfoList.values() for storename in skuobj.DefaultStoreDict}
         DefaultStores.add(TAB_DEFAULT_STORES_DEFAULT)
 
         S_PcdSet = []
@@ -1495,28 +1615,34 @@ class DscBuildData(PlatformBuildClassObject):
             SkuName = TAB_DEFAULT if SkuName == TAB_COMMON else SkuName
             if SkuName not in SkuIds:
                 continue
-            TCName,PCName,DimensionAttr,Field = self.ParsePcdNameStruct(TokenSpaceGuid, PcdCName)
-            pcd_in_dec = self._DecPcds.get((PCName,TCName), None)
+            TCName, PCName, DimensionAttr, Field = self.ParsePcdNameStruct(
+                TokenSpaceGuid, PcdCName)
+            pcd_in_dec = self._DecPcds.get((PCName, TCName), None)
             if pcd_in_dec is None:
                 EdkLogger.error('build', PARSER_ERROR,
-                            "Pcd (%s.%s) defined in DSC is not declared in DEC files. Arch: ['%s']" % (TCName, PCName, self._Arch),
-                            File=self.MetaFile, Line = Dummy5)
+                                "Pcd (%s.%s) defined in DSC is not declared in DEC files. Arch: ['%s']" % (
+                                    TCName, PCName, self._Arch),
+                                File=self.MetaFile, Line=Dummy5)
             if SkuName in SkuIds and ("." in TokenSpaceGuid or "[" in PcdCName):
-                if not isinstance (pcd_in_dec, StructurePcd):
+                if not isinstance(pcd_in_dec, StructurePcd):
                     EdkLogger.error('build', PARSER_ERROR,
-                                "Pcd (%s.%s) is not declared as Structure PCD in DEC files. Arch: ['%s']" % (TCName, PCName, self._Arch),
-                                File=self.MetaFile, Line = Dummy5)
+                                    "Pcd (%s.%s) is not declared as Structure PCD in DEC files. Arch: ['%s']" % (
+                                        TCName, PCName, self._Arch),
+                                    File=self.MetaFile, Line=Dummy5)
 
-                S_PcdSet.append([ TCName,PCName,DimensionAttr,Field, SkuName, default_store, Dummy5, AnalyzePcdExpression(Setting)[0]])
+                S_PcdSet.append([TCName, PCName, DimensionAttr, Field, SkuName,
+                                default_store, Dummy5, AnalyzePcdExpression(Setting)[0]])
         ModuleScopeOverallValue = {}
         for m in self.Modules.values():
             mguid = m.Guid
             if m.StrPcdSet:
                 S_PcdSet.extend(m.StrPcdSet)
                 mguid = m.StrPcdSet[0][4]
-            for (PCName,TCName) in m.StrPcdOverallValue:
-                Value, dsc_file, lineNo = m.StrPcdOverallValue[(PCName,TCName)]
-                ModuleScopeOverallValue.setdefault((PCName,TCName),{})[mguid] = Value, dsc_file, lineNo
+            for (PCName, TCName) in m.StrPcdOverallValue:
+                Value, dsc_file, lineNo = m.StrPcdOverallValue[(
+                    PCName, TCName)]
+                ModuleScopeOverallValue.setdefault((PCName, TCName), {})[
+                    mguid] = Value, dsc_file, lineNo
         # handle pcd value override
         StrPcdSet = DscBuildData.GetStructurePcdInfo(S_PcdSet)
         S_pcd_set = OrderedDict()
@@ -1528,17 +1654,23 @@ class DscBuildData(PlatformBuildClassObject):
             if str_pcd_obj:
                 str_pcd_obj_str.copy(str_pcd_obj)
                 if str_pcd_obj.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
-                    str_pcd_obj_str.DefaultFromDSC = {skuname:{defaultstore: str_pcd_obj.SkuInfoList[skuname].DefaultStoreDict.get(defaultstore, str_pcd_obj.SkuInfoList[skuname].HiiDefaultValue) for defaultstore in DefaultStores} for skuname in str_pcd_obj.SkuInfoList}
+                    str_pcd_obj_str.DefaultFromDSC = {skuname: {defaultstore: str_pcd_obj.SkuInfoList[skuname].DefaultStoreDict.get(
+                        defaultstore, str_pcd_obj.SkuInfoList[skuname].HiiDefaultValue) for defaultstore in DefaultStores} for skuname in str_pcd_obj.SkuInfoList}
                 else:
-                    str_pcd_obj_str.DefaultFromDSC = {skuname:{defaultstore: str_pcd_obj.SkuInfoList[skuname].DefaultStoreDict.get(defaultstore, str_pcd_obj.SkuInfoList[skuname].DefaultValue) for defaultstore in DefaultStores} for skuname in str_pcd_obj.SkuInfoList}
+                    str_pcd_obj_str.DefaultFromDSC = {skuname: {defaultstore: str_pcd_obj.SkuInfoList[skuname].DefaultStoreDict.get(
+                        defaultstore, str_pcd_obj.SkuInfoList[skuname].DefaultValue) for defaultstore in DefaultStores} for skuname in str_pcd_obj.SkuInfoList}
             for str_pcd_data in StrPcdSet[str_pcd]:
                 if str_pcd_data[4] in SkuIds:
-                    str_pcd_obj_str.AddOverrideValue(str_pcd_data[3], str(str_pcd_data[7]), TAB_DEFAULT if str_pcd_data[4] == TAB_COMMON else str_pcd_data[4], TAB_DEFAULT_STORES_DEFAULT if str_pcd_data[5] == TAB_COMMON else str_pcd_data[5], self.MetaFile.File if self.WorkspaceDir not in self.MetaFile.File else self.MetaFile.File[len(self.WorkspaceDir) if self.WorkspaceDir.endswith(os.path.sep) else len(self.WorkspaceDir)+1:], LineNo=str_pcd_data[6],DimensionAttr = str_pcd_data[2])
+                    str_pcd_obj_str.AddOverrideValue(str_pcd_data[3], str(str_pcd_data[7]), TAB_DEFAULT if str_pcd_data[4] == TAB_COMMON else str_pcd_data[4], TAB_DEFAULT_STORES_DEFAULT if str_pcd_data[5] == TAB_COMMON else str_pcd_data[5],
+                                                     self.MetaFile.File if self.WorkspaceDir not in self.MetaFile.File else self.MetaFile.File[len(self.WorkspaceDir) if self.WorkspaceDir.endswith(os.path.sep) else len(self.WorkspaceDir)+1:], LineNo=str_pcd_data[6], DimensionAttr=str_pcd_data[2])
                 elif GlobalData.gGuidPattern.match(str_pcd_data[4]):
-                    str_pcd_obj_str.AddComponentOverrideValue(str_pcd_data[3], str(str_pcd_data[7]), str_pcd_data[4].replace("-","S"), self.MetaFile.File if self.WorkspaceDir not in self.MetaFile.File else self.MetaFile.File[len(self.WorkspaceDir) if self.WorkspaceDir.endswith(os.path.sep) else len(self.WorkspaceDir)+1:], LineNo=str_pcd_data[6],DimensionAttr = str_pcd_data[2])
-                    PcdComponentValue = ModuleScopeOverallValue.get((str_pcd_obj_str.TokenCName,str_pcd_obj_str.TokenSpaceGuidCName))
+                    str_pcd_obj_str.AddComponentOverrideValue(str_pcd_data[3], str(str_pcd_data[7]), str_pcd_data[4].replace("-", "S"), self.MetaFile.File if self.WorkspaceDir not in self.MetaFile.File else self.MetaFile.File[len(
+                        self.WorkspaceDir) if self.WorkspaceDir.endswith(os.path.sep) else len(self.WorkspaceDir)+1:], LineNo=str_pcd_data[6], DimensionAttr=str_pcd_data[2])
+                    PcdComponentValue = ModuleScopeOverallValue.get(
+                        (str_pcd_obj_str.TokenCName, str_pcd_obj_str.TokenSpaceGuidCName))
                     for module_guid in PcdComponentValue:
-                        str_pcd_obj_str.PcdValueFromComponents[module_guid.replace("-","S")] = PcdComponentValue[module_guid]
+                        str_pcd_obj_str.PcdValueFromComponents[module_guid.replace(
+                            "-", "S")] = PcdComponentValue[module_guid]
             S_pcd_set[str_pcd[1], str_pcd[0]] = str_pcd_obj_str
 
         # Add the Structure PCD that only defined in DEC, don't have override in DSC file
@@ -1551,9 +1683,11 @@ class DscBuildData(PlatformBuildClassObject):
                     if str_pcd_obj:
                         str_pcd_obj_str.copy(str_pcd_obj)
                         if str_pcd_obj.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
-                            str_pcd_obj_str.DefaultFromDSC = {skuname:{defaultstore: str_pcd_obj.SkuInfoList[skuname].DefaultStoreDict.get(defaultstore, str_pcd_obj.SkuInfoList[skuname].HiiDefaultValue) for defaultstore in DefaultStores} for skuname in str_pcd_obj.SkuInfoList}
+                            str_pcd_obj_str.DefaultFromDSC = {skuname: {defaultstore: str_pcd_obj.SkuInfoList[skuname].DefaultStoreDict.get(
+                                defaultstore, str_pcd_obj.SkuInfoList[skuname].HiiDefaultValue) for defaultstore in DefaultStores} for skuname in str_pcd_obj.SkuInfoList}
                         else:
-                            str_pcd_obj_str.DefaultFromDSC = {skuname:{defaultstore: str_pcd_obj.SkuInfoList[skuname].DefaultStoreDict.get(defaultstore, str_pcd_obj.SkuInfoList[skuname].DefaultValue) for defaultstore in DefaultStores} for skuname in str_pcd_obj.SkuInfoList}
+                            str_pcd_obj_str.DefaultFromDSC = {skuname: {defaultstore: str_pcd_obj.SkuInfoList[skuname].DefaultStoreDict.get(
+                                defaultstore, str_pcd_obj.SkuInfoList[skuname].DefaultValue) for defaultstore in DefaultStores} for skuname in str_pcd_obj.SkuInfoList}
                     S_pcd_set[Pcd] = str_pcd_obj_str
         if S_pcd_set:
             GlobalData.gStructurePcd[self.Arch] = S_pcd_set.copy()
@@ -1570,12 +1704,13 @@ class DscBuildData(PlatformBuildClassObject):
                             NoDefault = True
                             break
                         nextskuid = self.SkuIdMgr.GetNextSkuId(nextskuid)
-                    stru_pcd.SkuOverrideValues[skuid] = copy.deepcopy(stru_pcd.SkuOverrideValues[nextskuid]) if not NoDefault else copy.deepcopy({defaultstorename: stru_pcd.DefaultValues for defaultstorename in DefaultStores} if DefaultStores else {}) #{TAB_DEFAULT_STORES_DEFAULT:stru_pcd.DefaultValues})
+                    stru_pcd.SkuOverrideValues[skuid] = copy.deepcopy(stru_pcd.SkuOverrideValues[nextskuid]) if not NoDefault else copy.deepcopy(
+                        {defaultstorename: stru_pcd.DefaultValues for defaultstorename in DefaultStores} if DefaultStores else {})  # {TAB_DEFAULT_STORES_DEFAULT:stru_pcd.DefaultValues})
                     if not NoDefault:
                         stru_pcd.ValueChain.add((skuid, ''))
                 if 'DEFAULT' in stru_pcd.SkuOverrideValues and not GlobalData.gPcdSkuOverrides.get((stru_pcd.TokenCName, stru_pcd.TokenSpaceGuidCName)):
                     GlobalData.gPcdSkuOverrides.update(
-                        {(stru_pcd.TokenCName, stru_pcd.TokenSpaceGuidCName): {'DEFAULT':stru_pcd.SkuOverrideValues['DEFAULT']}})
+                        {(stru_pcd.TokenCName, stru_pcd.TokenSpaceGuidCName): {'DEFAULT': stru_pcd.SkuOverrideValues['DEFAULT']}})
             if stru_pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
                 for skuid in SkuIds:
                     nextskuid = skuid
@@ -1588,14 +1723,17 @@ class DscBuildData(PlatformBuildClassObject):
                             nextskuid = self.SkuIdMgr.GetNextSkuId(nextskuid)
                     if NoDefault:
                         continue
-                    PcdDefaultStoreSet = set(defaultstorename  for defaultstorename in stru_pcd.SkuOverrideValues[nextskuid])
-                    mindefaultstorename = DefaultStoreMgr.GetMin(PcdDefaultStoreSet)
+                    PcdDefaultStoreSet = set(
+                        defaultstorename for defaultstorename in stru_pcd.SkuOverrideValues[nextskuid])
+                    mindefaultstorename = DefaultStoreMgr.GetMin(
+                        PcdDefaultStoreSet)
 
                     for defaultstoreid in DefaultStores:
                         if defaultstoreid not in stru_pcd.SkuOverrideValues[skuid]:
-                            stru_pcd.SkuOverrideValues[skuid][defaultstoreid] = CopyDict(stru_pcd.SkuOverrideValues[nextskuid][mindefaultstorename])
+                            stru_pcd.SkuOverrideValues[skuid][defaultstoreid] = CopyDict(
+                                stru_pcd.SkuOverrideValues[nextskuid][mindefaultstorename])
                             stru_pcd.ValueChain.add((skuid, defaultstoreid))
-        S_pcd_set = DscBuildData.OverrideByFdf(S_pcd_set,self.WorkspaceDir)
+        S_pcd_set = DscBuildData.OverrideByFdf(S_pcd_set, self.WorkspaceDir)
         S_pcd_set = DscBuildData.OverrideByComm(S_pcd_set)
 
         # Create a tool to caculate structure pcd value
@@ -1610,21 +1748,24 @@ class DscBuildData(PlatformBuildClassObject):
                 if str_pcd_obj.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],
                                         self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
                     if skuname not in str_pcd_obj.SkuInfoList:
-                        str_pcd_obj.SkuInfoList[skuname] = SkuInfoClass(SkuIdName=skuname, SkuId=self.SkuIds[skuname][0], HiiDefaultValue=PcdValue, DefaultStore = {StoreName:PcdValue})
+                        str_pcd_obj.SkuInfoList[skuname] = SkuInfoClass(
+                            SkuIdName=skuname, SkuId=self.SkuIds[skuname][0], HiiDefaultValue=PcdValue, DefaultStore={StoreName: PcdValue})
                     else:
                         str_pcd_obj.SkuInfoList[skuname].HiiDefaultValue = PcdValue
-                        str_pcd_obj.SkuInfoList[skuname].DefaultStoreDict.update({StoreName:PcdValue})
+                        str_pcd_obj.SkuInfoList[skuname].DefaultStoreDict.update(
+                            {StoreName: PcdValue})
                 elif str_pcd_obj.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
-                                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
+                                          self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
                     if skuname in (self.SkuIdMgr.SystemSkuId, TAB_DEFAULT, TAB_COMMON):
                         str_pcd_obj.DefaultValue = PcdValue
                     else:
-                        #Module Scope Structure Pcd
-                        moduleguid = skuname.replace("S","-")
+                        # Module Scope Structure Pcd
+                        moduleguid = skuname.replace("S", "-")
                         if GlobalData.gGuidPattern.match(moduleguid):
                             for component in self.Modules.values():
                                 if component.Guid == moduleguid:
-                                    component.Pcds[(PcdName, PcdGuid)].DefaultValue = PcdValue
+                                    component.Pcds[(
+                                        PcdName, PcdGuid)].DefaultValue = PcdValue
 
                 else:
                     if skuname not in str_pcd_obj.SkuInfoList:
@@ -1635,25 +1776,32 @@ class DscBuildData(PlatformBuildClassObject):
                                 NoDefault = True
                                 break
                             nextskuid = self.SkuIdMgr.GetNextSkuId(nextskuid)
-                        str_pcd_obj.SkuInfoList[skuname] = copy.deepcopy(str_pcd_obj.SkuInfoList[nextskuid]) if not NoDefault else SkuInfoClass(SkuIdName=skuname, SkuId=self.SkuIds[skuname][0], DefaultValue=PcdValue)
+                        str_pcd_obj.SkuInfoList[skuname] = copy.deepcopy(str_pcd_obj.SkuInfoList[nextskuid]) if not NoDefault else SkuInfoClass(
+                            SkuIdName=skuname, SkuId=self.SkuIds[skuname][0], DefaultValue=PcdValue)
                         str_pcd_obj.SkuInfoList[skuname].SkuId = self.SkuIds[skuname][0]
                         str_pcd_obj.SkuInfoList[skuname].SkuIdName = skuname
                     else:
                         str_pcd_obj.SkuInfoList[skuname].DefaultValue = PcdValue
             for str_pcd_obj in S_pcd_set.values():
                 if str_pcd_obj.Type not in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],
-                                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
+                                            self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
                     continue
-                PcdDefaultStoreSet = set(defaultstorename for skuobj in str_pcd_obj.SkuInfoList.values() for defaultstorename in skuobj.DefaultStoreDict)
+                PcdDefaultStoreSet = set(defaultstorename for skuobj in str_pcd_obj.SkuInfoList.values(
+                ) for defaultstorename in skuobj.DefaultStoreDict)
                 DefaultStoreObj = DefaultStore(self._GetDefaultStores())
-                mindefaultstorename = DefaultStoreObj.GetMin(PcdDefaultStoreSet)
-                str_pcd_obj.SkuInfoList[self.SkuIdMgr.SystemSkuId].HiiDefaultValue = str_pcd_obj.SkuInfoList[self.SkuIdMgr.SystemSkuId].DefaultStoreDict[mindefaultstorename]
+                mindefaultstorename = DefaultStoreObj.GetMin(
+                    PcdDefaultStoreSet)
+                str_pcd_obj.SkuInfoList[self.SkuIdMgr.SystemSkuId].HiiDefaultValue = str_pcd_obj.SkuInfoList[
+                    self.SkuIdMgr.SystemSkuId].DefaultStoreDict[mindefaultstorename]
 
             for str_pcd_obj in S_pcd_set.values():
 
-                str_pcd_obj.MaxDatumSize = DscBuildData.GetStructurePcdMaxSize(str_pcd_obj)
-                Pcds[str_pcd_obj.TokenCName, str_pcd_obj.TokenSpaceGuidCName] = str_pcd_obj
-                Pcds[str_pcd_obj.TokenCName, str_pcd_obj.TokenSpaceGuidCName].CustomAttribute['IsStru']=True
+                str_pcd_obj.MaxDatumSize = DscBuildData.GetStructurePcdMaxSize(
+                    str_pcd_obj)
+                Pcds[str_pcd_obj.TokenCName,
+                     str_pcd_obj.TokenSpaceGuidCName] = str_pcd_obj
+                Pcds[str_pcd_obj.TokenCName,
+                     str_pcd_obj.TokenSpaceGuidCName].CustomAttribute['IsStru'] = True
 
             for pcdkey in Pcds:
                 pcd = Pcds[pcdkey]
@@ -1663,31 +1811,36 @@ class DscBuildData(PlatformBuildClassObject):
                 elif TAB_DEFAULT in pcd.SkuInfoList and TAB_COMMON in pcd.SkuInfoList:
                     del pcd.SkuInfoList[TAB_COMMON]
 
-        list(map(self.FilterSkuSettings, [Pcds[pcdkey] for pcdkey in Pcds if Pcds[pcdkey].Type in DynamicPcdType]))
+        list(map(self.FilterSkuSettings, [
+             Pcds[pcdkey] for pcdkey in Pcds if Pcds[pcdkey].Type in DynamicPcdType]))
         return Pcds
+
     @cached_property
     def PlatformUsedPcds(self):
         FdfInfList = []
         if GlobalData.gFdfParser:
             FdfInfList = GlobalData.gFdfParser.Profile.InfList
-        FdfModuleList = [PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch=self._Arch) for Inf in FdfInfList]
+        FdfModuleList = [PathClass(
+            NormPath(Inf), GlobalData.gWorkspace, Arch=self._Arch) for Inf in FdfInfList]
         AllModulePcds = set()
         ModuleSet = set(list(self._Modules.keys()) + FdfModuleList)
         for ModuleFile in ModuleSet:
-            ModuleData = self._Bdb[ModuleFile, self._Arch, self._Target, self._Toolchain]
+            ModuleData = self._Bdb[ModuleFile,
+                                   self._Arch, self._Target, self._Toolchain]
             AllModulePcds = AllModulePcds | ModuleData.PcdsName
         for ModuleFile in self.LibraryInstances:
-            ModuleData = self._Bdb.CreateBuildObject(ModuleFile, self._Arch, self._Target, self._Toolchain)
+            ModuleData = self._Bdb.CreateBuildObject(
+                ModuleFile, self._Arch, self._Target, self._Toolchain)
             AllModulePcds = AllModulePcds | ModuleData.PcdsName
         return AllModulePcds
 
-    #Filter the StrucutrePcd that is not used by any module in dsc file and fdf file.
+    # Filter the StrucutrePcd that is not used by any module in dsc file and fdf file.
     def FilterStrcturePcd(self, S_pcd_set):
         UnusedStruPcds = set(S_pcd_set.keys()) - self.PlatformUsedPcds
         for (Token, TokenSpaceGuid) in UnusedStruPcds:
             del S_pcd_set[(Token, TokenSpaceGuid)]
 
-    ## Retrieve non-dynamic PCD settings
+    # Retrieve non-dynamic PCD settings
     #
     #   @param  Type    PCD type
     #
@@ -1711,7 +1864,7 @@ class DscBuildData(PlatformBuildClassObject):
             SkuName = TAB_DEFAULT if SkuName == TAB_COMMON else SkuName
             if SkuName not in AvailableSkuIdSet:
                 EdkLogger.error('build ', PARAMETER_INVALID, 'Sku %s is not defined in [SkuIds] section' % SkuName,
-                                            File=self.MetaFile, Line=Dummy5)
+                                File=self.MetaFile, Line=Dummy5)
             if SkuName in (self.SkuIdMgr.SystemSkuId, TAB_DEFAULT, TAB_COMMON):
                 if "." not in TokenSpaceGuid and "[" not in PcdCName and (PcdCName, TokenSpaceGuid, SkuName, Dummy5) not in PcdList:
                     PcdList.append((PcdCName, TokenSpaceGuid, SkuName, Dummy5))
@@ -1721,7 +1874,8 @@ class DscBuildData(PlatformBuildClassObject):
             Setting = PcdDict[self._Arch, PcdCName, TokenSpaceGuid, SkuName]
             if Setting is None:
                 continue
-            PcdValue, DatumType, MaxDatumSize = self._ValidatePcd(PcdCName, TokenSpaceGuid, Setting, Type, Dummy4)
+            PcdValue, DatumType, MaxDatumSize = self._ValidatePcd(
+                PcdCName, TokenSpaceGuid, Setting, Type, Dummy4)
             if MaxDatumSize:
                 if int(MaxDatumSize, 0) > 0xFFFF:
                     EdkLogger.error('build', FORMAT_INVALID, "The size value must not exceed the maximum value of 0xFFFF (UINT16) for %s." % ".".join((TokenSpaceGuid, PcdCName)),
@@ -1730,47 +1884,51 @@ class DscBuildData(PlatformBuildClassObject):
                     EdkLogger.error('build', FORMAT_INVALID, "The size value can't be set to negative value for %s." % ".".join((TokenSpaceGuid, PcdCName)),
                                     File=self.MetaFile, Line=Dummy4)
             if (PcdCName, TokenSpaceGuid) in PcdValueDict:
-                PcdValueDict[PcdCName, TokenSpaceGuid][SkuName] = (PcdValue, DatumType, MaxDatumSize,Dummy4)
+                PcdValueDict[PcdCName, TokenSpaceGuid][SkuName] = (
+                    PcdValue, DatumType, MaxDatumSize, Dummy4)
             else:
-                PcdValueDict[PcdCName, TokenSpaceGuid] = {SkuName:(PcdValue, DatumType, MaxDatumSize,Dummy4)}
+                PcdValueDict[PcdCName, TokenSpaceGuid] = {
+                    SkuName: (PcdValue, DatumType, MaxDatumSize, Dummy4)}
 
         for ((PcdCName, TokenSpaceGuid), PcdSetting) in PcdValueDict.items():
             if self.SkuIdMgr.SystemSkuId in PcdSetting:
-                PcdValue, DatumType, MaxDatumSize,_ = PcdSetting[self.SkuIdMgr.SystemSkuId]
+                PcdValue, DatumType, MaxDatumSize, _ = PcdSetting[self.SkuIdMgr.SystemSkuId]
             elif TAB_DEFAULT in PcdSetting:
-                PcdValue, DatumType, MaxDatumSize,_  = PcdSetting[TAB_DEFAULT]
+                PcdValue, DatumType, MaxDatumSize, _ = PcdSetting[TAB_DEFAULT]
             elif TAB_COMMON in PcdSetting:
-                PcdValue, DatumType, MaxDatumSize,_  = PcdSetting[TAB_COMMON]
+                PcdValue, DatumType, MaxDatumSize, _ = PcdSetting[TAB_COMMON]
             else:
                 PcdValue = None
                 DatumType = None
                 MaxDatumSize = None
 
             Pcds[PcdCName, TokenSpaceGuid] = PcdClassObject(
-                                                PcdCName,
-                                                TokenSpaceGuid,
-                                                self._PCD_TYPE_STRING_[Type],
-                                                DatumType,
-                                                PcdValue,
-                                                '',
-                                                MaxDatumSize,
-                                                {},
-                                                False,
-                                                None,
-                                                IsDsc=True)
+                PcdCName,
+                TokenSpaceGuid,
+                self._PCD_TYPE_STRING_[Type],
+                DatumType,
+                PcdValue,
+                '',
+                MaxDatumSize,
+                {},
+                False,
+                None,
+                IsDsc=True)
             for SkuName in PcdValueDict[PcdCName, TokenSpaceGuid]:
                 Settings = PcdValueDict[PcdCName, TokenSpaceGuid][SkuName]
                 if SkuName not in Pcds[PcdCName, TokenSpaceGuid].DscRawValue:
                     Pcds[PcdCName, TokenSpaceGuid].DscRawValue[SkuName] = {}
                     Pcds[PcdCName, TokenSpaceGuid].DscRawValueInfo[SkuName] = {}
                 Pcds[PcdCName, TokenSpaceGuid].DscRawValue[SkuName][TAB_DEFAULT_STORES_DEFAULT] = Settings[0]
-                Pcds[PcdCName, TokenSpaceGuid].DscRawValueInfo[SkuName][TAB_DEFAULT_STORES_DEFAULT] = (self.MetaFile.File,Settings[3])
+                Pcds[PcdCName, TokenSpaceGuid].DscRawValueInfo[SkuName][TAB_DEFAULT_STORES_DEFAULT] = (
+                    self.MetaFile.File, Settings[3])
         return Pcds
 
     @staticmethod
     def GetStructurePcdMaxSize(str_pcd):
         pcd_default_value = str_pcd.DefaultValue
-        sku_values = [skuobj.HiiDefaultValue if str_pcd.Type in [DscBuildData._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], DscBuildData._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]] else skuobj.DefaultValue for skuobj in str_pcd.SkuInfoList.values()]
+        sku_values = [skuobj.HiiDefaultValue if str_pcd.Type in [DscBuildData._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],
+                                                                 DscBuildData._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]] else skuobj.DefaultValue for skuobj in str_pcd.SkuInfoList.values()]
         sku_values.append(pcd_default_value)
 
         def get_length(value):
@@ -1788,7 +1946,7 @@ class DscBuildData(PlatformBuildClassObject):
                 if (Value[0] == '{' and Value[-1] == '}'):
                     return len(Value.split(","))
                 if Value.startswith("L'") and Value.endswith("'") and len(list(Value[2:-1])) > 1:
-                    return  len(list(Value[2:-1]))
+                    return len(list(Value[2:-1]))
                 if Value[0] == "'" and Value[-1] == "'" and len(list(Value[1:-1])) > 1:
                     return len(Value) - 2
             return len(Value)
@@ -1796,18 +1954,20 @@ class DscBuildData(PlatformBuildClassObject):
         return str(max(get_length(item) for item in sku_values))
 
     @staticmethod
-    def ExecuteCommand (Command):
+    def ExecuteCommand(Command):
         try:
-            Process = subprocess.Popen(Command, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
+            Process = subprocess.Popen(
+                Command, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
         except:
-            EdkLogger.error('Build', COMMAND_FAILURE, 'Can not execute command: %s' % Command)
+            EdkLogger.error('Build', COMMAND_FAILURE,
+                            'Can not execute command: %s' % Command)
         Result = Process.communicate()
         return Process.returncode, Result[0].decode(errors='ignore'), Result[1].decode(errors='ignore')
 
     @staticmethod
     def IntToCString(Value, ValueSize):
         Result = '"'
-        if not isinstance (Value, str):
+        if not isinstance(Value, str):
             for Index in range(0, ValueSize):
                 Result = Result + '\\x%02x' % (Value & 0xff)
                 Value = Value >> 8
@@ -1816,38 +1976,51 @@ class DscBuildData(PlatformBuildClassObject):
 
     def GenerateSizeFunction(self, Pcd):
         CApp = "// Default Value in Dec \n"
-        CApp = CApp + "void Cal_%s_%s_Size(UINT32 *Size){\n" % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
+        CApp = CApp + \
+            "void Cal_%s_%s_Size(UINT32 *Size){\n" % (
+                Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
 
         if Pcd.IsArray() and Pcd.Capacity[-1] != "-1":
-            CApp += "  *Size = (sizeof (%s) > *Size ? sizeof (%s) : *Size);\n" % (Pcd.DatumType,Pcd.DatumType)
+            CApp += "  *Size = (sizeof (%s) > *Size ? sizeof (%s) : *Size);\n" % (
+                Pcd.DatumType, Pcd.DatumType)
         else:
             if "{CODE(" in Pcd.DefaultValueFromDec:
-                CApp += "  *Size = (sizeof (%s_%s_INIT_Value) > *Size ? sizeof (%s_%s_INIT_Value) : *Size);\n" % (Pcd.TokenSpaceGuidCName,Pcd.TokenCName,Pcd.TokenSpaceGuidCName,Pcd.TokenCName)
+                CApp += "  *Size = (sizeof (%s_%s_INIT_Value) > *Size ? sizeof (%s_%s_INIT_Value) : *Size);\n" % (
+                    Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
             if Pcd.Type in PCD_DYNAMIC_TYPE_SET | PCD_DYNAMIC_EX_TYPE_SET:
                 for skuname in Pcd.SkuInfoList:
                     skuobj = Pcd.SkuInfoList[skuname]
                     if skuobj.VariableName:
                         for defaultstore in skuobj.DefaultStoreDict:
-                            pcddef = self.GetPcdDscRawDefaultValue(Pcd,skuname,defaultstore)
+                            pcddef = self.GetPcdDscRawDefaultValue(
+                                Pcd, skuname, defaultstore)
                             if pcddef:
                                 if "{CODE(" in pcddef:
-                                    CApp += "  *Size = (sizeof (%s_%s_%s_%s_Value) > *Size ? sizeof (%s_%s_%s_%s_Value) : *Size);\n" % (Pcd.TokenSpaceGuidCName,Pcd.TokenCName,skuname,defaultstore,Pcd.TokenSpaceGuidCName,Pcd.TokenCName,skuname,defaultstore)
+                                    CApp += "  *Size = (sizeof (%s_%s_%s_%s_Value) > *Size ? sizeof (%s_%s_%s_%s_Value) : *Size);\n" % (
+                                        Pcd.TokenSpaceGuidCName, Pcd.TokenCName, skuname, defaultstore, Pcd.TokenSpaceGuidCName, Pcd.TokenCName, skuname, defaultstore)
                                 else:
-                                    CApp += "  *Size = %s > *Size ? %s : *Size;\n" % (self.GetStructurePcdMaxSize(Pcd),self.GetStructurePcdMaxSize(Pcd))
+                                    CApp += "  *Size = %s > *Size ? %s : *Size;\n" % (
+                                        self.GetStructurePcdMaxSize(Pcd), self.GetStructurePcdMaxSize(Pcd))
                     else:
-                        pcddef = self.GetPcdDscRawDefaultValue(Pcd,skuname,TAB_DEFAULT_STORES_DEFAULT)
+                        pcddef = self.GetPcdDscRawDefaultValue(
+                            Pcd, skuname, TAB_DEFAULT_STORES_DEFAULT)
                         if pcddef:
-                            if  "{CODE(" in pcddef:
-                                CApp += "  *Size = (sizeof (%s_%s_%s_%s_Value) > *Size ? sizeof (%s_%s_%s_%s_Value) : *Size);\n" % (Pcd.TokenSpaceGuidCName,Pcd.TokenCName,skuname,TAB_DEFAULT_STORES_DEFAULT,Pcd.TokenSpaceGuidCName,Pcd.TokenCName,skuname,TAB_DEFAULT_STORES_DEFAULT)
+                            if "{CODE(" in pcddef:
+                                CApp += "  *Size = (sizeof (%s_%s_%s_%s_Value) > *Size ? sizeof (%s_%s_%s_%s_Value) : *Size);\n" % (Pcd.TokenSpaceGuidCName,
+                                                                                                                                    Pcd.TokenCName, skuname, TAB_DEFAULT_STORES_DEFAULT, Pcd.TokenSpaceGuidCName, Pcd.TokenCName, skuname, TAB_DEFAULT_STORES_DEFAULT)
                             else:
-                                CApp += "  *Size = %s > *Size ? %s : *Size;\n" % (self.GetStructurePcdMaxSize(Pcd),self.GetStructurePcdMaxSize(Pcd))
+                                CApp += "  *Size = %s > *Size ? %s : *Size;\n" % (
+                                    self.GetStructurePcdMaxSize(Pcd), self.GetStructurePcdMaxSize(Pcd))
             else:
-                pcddef = self.GetPcdDscRawDefaultValue(Pcd,TAB_DEFAULT,TAB_DEFAULT_STORES_DEFAULT)
+                pcddef = self.GetPcdDscRawDefaultValue(
+                    Pcd, TAB_DEFAULT, TAB_DEFAULT_STORES_DEFAULT)
                 if pcddef:
                     if "{CODE(" in pcddef:
-                        CApp += "  *Size = (sizeof (%s_%s_%s_%s_Value) > *Size ? sizeof (%s_%s_%s_%s_Value) : *Size);\n" % (Pcd.TokenSpaceGuidCName,Pcd.TokenCName,TAB_DEFAULT,TAB_DEFAULT_STORES_DEFAULT,Pcd.TokenSpaceGuidCName,Pcd.TokenCName,TAB_DEFAULT,TAB_DEFAULT_STORES_DEFAULT)
+                        CApp += "  *Size = (sizeof (%s_%s_%s_%s_Value) > *Size ? sizeof (%s_%s_%s_%s_Value) : *Size);\n" % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName,
+                                                                                                                            TAB_DEFAULT, TAB_DEFAULT_STORES_DEFAULT, Pcd.TokenSpaceGuidCName, Pcd.TokenCName, TAB_DEFAULT, TAB_DEFAULT_STORES_DEFAULT)
                     else:
-                        CApp += "  *Size = %s > *Size ? %s : *Size;\n" % (self.GetStructurePcdMaxSize(Pcd),self.GetStructurePcdMaxSize(Pcd))
+                        CApp += "  *Size = %s > *Size ? %s : *Size;\n" % (
+                            self.GetStructurePcdMaxSize(Pcd), self.GetStructurePcdMaxSize(Pcd))
         ActualCap = []
         for index in Pcd.DefaultValues:
             if index:
@@ -1857,33 +2030,41 @@ class DscBuildData(PlatformBuildClassObject):
                 continue
             for FieldName in FieldList:
                 FieldName = "." + FieldName
-                IsArray = _IsFieldValueAnArray(FieldList[FieldName.strip(".")][0])
+                IsArray = _IsFieldValueAnArray(
+                    FieldList[FieldName.strip(".")][0])
                 if IsArray and not (FieldList[FieldName.strip(".")][0].startswith('{GUID') and FieldList[FieldName.strip(".")][0].endswith('}')):
                     try:
-                        Value = ValueExpressionEx(FieldList[FieldName.strip(".")][0], TAB_VOID, self._GuidDict)(True)
+                        Value = ValueExpressionEx(
+                            FieldList[FieldName.strip(".")][0], TAB_VOID, self._GuidDict)(True)
                     except BadExpression:
                         EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " %
                                         (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName.strip('.'))), FieldList[FieldName.strip(".")][1], FieldList[FieldName.strip(".")][2]))
                     Value, ValueSize = ParseFieldValue(Value)
                     if not Pcd.IsArray():
-                        CApp = CApp + '  __FLEXIBLE_SIZE(*Size, %s, %s, %d / __ARRAY_ELEMENT_SIZE(%s, %s) + ((%d %% __ARRAY_ELEMENT_SIZE(%s, %s)) ? 1 : 0));  // From %s Line %d Value %s \n' % (Pcd.DatumType, FieldName.strip("."), ValueSize, Pcd.DatumType, FieldName.strip("."), ValueSize, Pcd.DatumType, FieldName.strip("."), FieldList[FieldName.strip(".")][1], FieldList[FieldName.strip(".")][2], FieldList[FieldName.strip(".")][0]);
+                        CApp = CApp + '  __FLEXIBLE_SIZE(*Size, %s, %s, %d / __ARRAY_ELEMENT_SIZE(%s, %s) + ((%d %% __ARRAY_ELEMENT_SIZE(%s, %s)) ? 1 : 0));  // From %s Line %d Value %s \n' % (Pcd.DatumType, FieldName.strip(
+                            "."), ValueSize, Pcd.DatumType, FieldName.strip("."), ValueSize, Pcd.DatumType, FieldName.strip("."), FieldList[FieldName.strip(".")][1], FieldList[FieldName.strip(".")][2], FieldList[FieldName.strip(".")][0])
                 else:
                     NewFieldName = ''
                     FieldName_ori = FieldName.strip('.')
-                    while '[' in  FieldName:
-                        NewFieldName = NewFieldName + FieldName.split('[', 1)[0] + '[0]'
-                        Array_Index = int(FieldName.split('[', 1)[1].split(']', 1)[0])
+                    while '[' in FieldName:
+                        NewFieldName = NewFieldName + \
+                            FieldName.split('[', 1)[0] + '[0]'
+                        Array_Index = int(FieldName.split(
+                            '[', 1)[1].split(']', 1)[0])
                         FieldName = FieldName.split(']', 1)[1]
                     FieldName = NewFieldName + FieldName
                     while '[' in FieldName and not Pcd.IsArray():
                         FieldName = FieldName.rsplit('[', 1)[0]
-                        CApp = CApp + '  __FLEXIBLE_SIZE(*Size, %s, %s, %d); // From %s Line %d Value %s\n' % (Pcd.DatumType, FieldName.strip("."), Array_Index + 1, FieldList[FieldName_ori][1], FieldList[FieldName_ori][2], FieldList[FieldName_ori][0])
+                        CApp = CApp + '  __FLEXIBLE_SIZE(*Size, %s, %s, %d); // From %s Line %d Value %s\n' % (Pcd.DatumType, FieldName.strip(
+                            "."), Array_Index + 1, FieldList[FieldName_ori][1], FieldList[FieldName_ori][2], FieldList[FieldName_ori][0])
         flexisbale_size_statement_cache = set()
         for skuname in Pcd.SkuOverrideValues:
             if skuname == TAB_COMMON:
                 continue
             for defaultstorenameitem in Pcd.SkuOverrideValues[skuname]:
-                CApp = CApp + "// SkuName: %s,  DefaultStoreName: %s \n" % (skuname, defaultstorenameitem)
+                CApp = CApp + \
+                    "// SkuName: %s,  DefaultStoreName: %s \n" % (
+                        skuname, defaultstorenameitem)
                 for index in Pcd.SkuOverrideValues[skuname][defaultstorenameitem]:
                     if index:
                         ActualCap.append(index)
@@ -1896,85 +2077,107 @@ class DscBuildData(PlatformBuildClassObject):
                                 continue
                             flexisbale_size_statement_cache.add(fieldinfo)
                             FieldName = "." + FieldName
-                            IsArray = _IsFieldValueAnArray(FieldList[FieldName.strip(".")][0])
+                            IsArray = _IsFieldValueAnArray(
+                                FieldList[FieldName.strip(".")][0])
                             if IsArray and not (FieldList[FieldName.strip(".")][0].startswith('{GUID') and FieldList[FieldName.strip(".")][0].endswith('}')):
                                 try:
-                                    Value = ValueExpressionEx(FieldList[FieldName.strip(".")][0], TAB_VOID, self._GuidDict)(True)
+                                    Value = ValueExpressionEx(
+                                        FieldList[FieldName.strip(".")][0], TAB_VOID, self._GuidDict)(True)
                                 except BadExpression:
                                     EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " %
                                                     (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName.strip('.'))), FieldList[FieldName.strip(".")][1], FieldList[FieldName.strip(".")][2]))
                                 Value, ValueSize = ParseFieldValue(Value)
                                 if not Pcd.IsArray():
-                                    CApp = CApp + '  __FLEXIBLE_SIZE(*Size, %s, %s, %d / __ARRAY_ELEMENT_SIZE(%s, %s) + ((%d %% __ARRAY_ELEMENT_SIZE(%s, %s)) ? 1 : 0)); // From %s Line %d Value %s\n' % (Pcd.DatumType, FieldName.strip("."), ValueSize, Pcd.DatumType, FieldName.strip("."), ValueSize, Pcd.DatumType, FieldName.strip("."), FieldList[FieldName.strip(".")][1], FieldList[FieldName.strip(".")][2], FieldList[FieldName.strip(".")][0]);
+                                    CApp = CApp + '  __FLEXIBLE_SIZE(*Size, %s, %s, %d / __ARRAY_ELEMENT_SIZE(%s, %s) + ((%d %% __ARRAY_ELEMENT_SIZE(%s, %s)) ? 1 : 0)); // From %s Line %d Value %s\n' % (Pcd.DatumType, FieldName.strip(
+                                        "."), ValueSize, Pcd.DatumType, FieldName.strip("."), ValueSize, Pcd.DatumType, FieldName.strip("."), FieldList[FieldName.strip(".")][1], FieldList[FieldName.strip(".")][2], FieldList[FieldName.strip(".")][0])
                             else:
                                 NewFieldName = ''
                                 FieldName_ori = FieldName.strip('.')
-                                while '[' in  FieldName:
-                                    NewFieldName = NewFieldName + FieldName.split('[', 1)[0] + '[0]'
-                                    Array_Index = int(FieldName.split('[', 1)[1].split(']', 1)[0])
+                                while '[' in FieldName:
+                                    NewFieldName = NewFieldName + \
+                                        FieldName.split('[', 1)[0] + '[0]'
+                                    Array_Index = int(FieldName.split(
+                                        '[', 1)[1].split(']', 1)[0])
                                     FieldName = FieldName.split(']', 1)[1]
                                 FieldName = NewFieldName + FieldName
                                 while '[' in FieldName and not Pcd.IsArray():
                                     FieldName = FieldName.rsplit('[', 1)[0]
-                                    CApp = CApp + '  __FLEXIBLE_SIZE(*Size, %s, %s, %d); // From %s Line %d Value %s \n' % (Pcd.DatumType, FieldName.strip("."), Array_Index + 1, FieldList[FieldName_ori][1], FieldList[FieldName_ori][2], FieldList[FieldName_ori][0])
+                                    CApp = CApp + '  __FLEXIBLE_SIZE(*Size, %s, %s, %d); // From %s Line %d Value %s \n' % (Pcd.DatumType, FieldName.strip(
+                                        "."), Array_Index + 1, FieldList[FieldName_ori][1], FieldList[FieldName_ori][2], FieldList[FieldName_ori][0])
         if Pcd.PcdFieldValueFromFdf:
             CApp = CApp + "// From fdf \n"
         for FieldName in Pcd.PcdFieldValueFromFdf:
             FieldName = "." + FieldName
-            IsArray = _IsFieldValueAnArray(Pcd.PcdFieldValueFromFdf[FieldName.strip(".")][0])
+            IsArray = _IsFieldValueAnArray(
+                Pcd.PcdFieldValueFromFdf[FieldName.strip(".")][0])
             if IsArray and not (Pcd.PcdFieldValueFromFdf[FieldName.strip(".")][0].startswith('{GUID') and Pcd.PcdFieldValueFromFdf[FieldName.strip(".")][0].endswith('}')):
                 try:
-                    Value = ValueExpressionEx(Pcd.PcdFieldValueFromFdf[FieldName.strip(".")][0], TAB_VOID, self._GuidDict)(True)
+                    Value = ValueExpressionEx(Pcd.PcdFieldValueFromFdf[FieldName.strip(
+                        ".")][0], TAB_VOID, self._GuidDict)(True)
                 except BadExpression:
                     EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " %
                                     (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName.strip('.'))), Pcd.PcdFieldValueFromFdf[FieldName.strip(".")][1], Pcd.PcdFieldValueFromFdf[FieldName.strip(".")][2]))
                 Value, ValueSize = ParseFieldValue(Value)
                 if not Pcd.IsArray():
-                    CApp = CApp + '  __FLEXIBLE_SIZE(*Size, %s, %s, %d / __ARRAY_ELEMENT_SIZE(%s, %s) + ((%d %% __ARRAY_ELEMENT_SIZE(%s, %s)) ? 1 : 0)); // From %s Line %d Value %s\n' % (Pcd.DatumType, FieldName.strip("."), ValueSize, Pcd.DatumType, FieldName.strip("."), ValueSize, Pcd.DatumType, FieldName.strip("."), Pcd.PcdFieldValueFromFdf[FieldName.strip(".")][1], Pcd.PcdFieldValueFromFdf[FieldName.strip(".")][2], Pcd.PcdFieldValueFromFdf[FieldName.strip(".")][0]);
+                    CApp = CApp + '  __FLEXIBLE_SIZE(*Size, %s, %s, %d / __ARRAY_ELEMENT_SIZE(%s, %s) + ((%d %% __ARRAY_ELEMENT_SIZE(%s, %s)) ? 1 : 0)); // From %s Line %d Value %s\n' % (Pcd.DatumType, FieldName.strip("."), ValueSize, Pcd.DatumType, FieldName.strip(
+                        "."), ValueSize, Pcd.DatumType, FieldName.strip("."), Pcd.PcdFieldValueFromFdf[FieldName.strip(".")][1], Pcd.PcdFieldValueFromFdf[FieldName.strip(".")][2], Pcd.PcdFieldValueFromFdf[FieldName.strip(".")][0])
             else:
                 NewFieldName = ''
                 FieldName_ori = FieldName.strip('.')
-                while '[' in  FieldName:
-                    NewFieldName = NewFieldName + FieldName.split('[', 1)[0] + '[0]'
-                    Array_Index = int(FieldName.split('[', 1)[1].split(']', 1)[0])
+                while '[' in FieldName:
+                    NewFieldName = NewFieldName + \
+                        FieldName.split('[', 1)[0] + '[0]'
+                    Array_Index = int(FieldName.split(
+                        '[', 1)[1].split(']', 1)[0])
                     FieldName = FieldName.split(']', 1)[1]
                 FieldName = NewFieldName + FieldName
                 while '[' in FieldName:
                     FieldName = FieldName.rsplit('[', 1)[0]
-                    CApp = CApp + '  __FLEXIBLE_SIZE(*Size, %s, %s, %d); // From %s Line %s Value %s \n' % (Pcd.DatumType, FieldName.strip("."), Array_Index + 1, Pcd.PcdFieldValueFromFdf[FieldName_ori][1], Pcd.PcdFieldValueFromFdf[FieldName_ori][2], Pcd.PcdFieldValueFromFdf[FieldName_ori][0])
+                    CApp = CApp + '  __FLEXIBLE_SIZE(*Size, %s, %s, %d); // From %s Line %s Value %s \n' % (Pcd.DatumType, FieldName.strip(
+                        "."), Array_Index + 1, Pcd.PcdFieldValueFromFdf[FieldName_ori][1], Pcd.PcdFieldValueFromFdf[FieldName_ori][2], Pcd.PcdFieldValueFromFdf[FieldName_ori][0])
         if Pcd.PcdFieldValueFromComm:
             CApp = CApp + "// From Command Line \n"
         for FieldName in Pcd.PcdFieldValueFromComm:
             FieldName = "." + FieldName
-            IsArray = _IsFieldValueAnArray(Pcd.PcdFieldValueFromComm[FieldName.strip(".")][0])
+            IsArray = _IsFieldValueAnArray(
+                Pcd.PcdFieldValueFromComm[FieldName.strip(".")][0])
             if IsArray and not (Pcd.PcdFieldValueFromComm[FieldName.strip(".")][0].startswith('{GUID') and Pcd.PcdFieldValueFromComm[FieldName.strip(".")][0].endswith('}')):
                 try:
-                    Value = ValueExpressionEx(Pcd.PcdFieldValueFromComm[FieldName.strip(".")][0], TAB_VOID, self._GuidDict)(True)
+                    Value = ValueExpressionEx(Pcd.PcdFieldValueFromComm[FieldName.strip(
+                        ".")][0], TAB_VOID, self._GuidDict)(True)
                 except BadExpression:
                     EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " %
                                     (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName.strip('.'))), Pcd.PcdFieldValueFromComm[FieldName.strip(".")][1], Pcd.PcdFieldValueFromComm[FieldName.strip(".")][2]))
                 Value, ValueSize = ParseFieldValue(Value)
                 if not Pcd.IsArray():
-                    CApp = CApp + '  __FLEXIBLE_SIZE(*Size, %s, %s, %d / __ARRAY_ELEMENT_SIZE(%s, %s) + ((%d %% __ARRAY_ELEMENT_SIZE(%s, %s)) ? 1 : 0)); // From %s Line %d Value %s\n' % (Pcd.DatumType, FieldName.strip("."), ValueSize, Pcd.DatumType, FieldName.strip("."), ValueSize, Pcd.DatumType, FieldName.strip("."), Pcd.PcdFieldValueFromComm[FieldName.strip(".")][1], Pcd.PcdFieldValueFromComm[FieldName.strip(".")][2], Pcd.PcdFieldValueFromComm[FieldName.strip(".")][0]);
+                    CApp = CApp + '  __FLEXIBLE_SIZE(*Size, %s, %s, %d / __ARRAY_ELEMENT_SIZE(%s, %s) + ((%d %% __ARRAY_ELEMENT_SIZE(%s, %s)) ? 1 : 0)); // From %s Line %d Value %s\n' % (Pcd.DatumType, FieldName.strip("."), ValueSize, Pcd.DatumType, FieldName.strip(
+                        "."), ValueSize, Pcd.DatumType, FieldName.strip("."), Pcd.PcdFieldValueFromComm[FieldName.strip(".")][1], Pcd.PcdFieldValueFromComm[FieldName.strip(".")][2], Pcd.PcdFieldValueFromComm[FieldName.strip(".")][0])
             else:
                 NewFieldName = ''
                 FieldName_ori = FieldName.strip('.')
-                while '[' in  FieldName:
-                    NewFieldName = NewFieldName + FieldName.split('[', 1)[0] + '[0]'
-                    Array_Index = int(FieldName.split('[', 1)[1].split(']', 1)[0])
+                while '[' in FieldName:
+                    NewFieldName = NewFieldName + \
+                        FieldName.split('[', 1)[0] + '[0]'
+                    Array_Index = int(FieldName.split(
+                        '[', 1)[1].split(']', 1)[0])
                     FieldName = FieldName.split(']', 1)[1]
                 FieldName = NewFieldName + FieldName
                 while '[' in FieldName and not Pcd.IsArray():
                     FieldName = FieldName.rsplit('[', 1)[0]
-                    CApp = CApp + '  __FLEXIBLE_SIZE(*Size, %s, %s, %d); // From %s Line %d Value %s \n' % (Pcd.DatumType, FieldName.strip("."), Array_Index + 1, Pcd.PcdFieldValueFromComm[FieldName_ori][1], Pcd.PcdFieldValueFromComm[FieldName_ori][2], Pcd.PcdFieldValueFromComm[FieldName_ori][0])
+                    CApp = CApp + '  __FLEXIBLE_SIZE(*Size, %s, %s, %d); // From %s Line %d Value %s \n' % (Pcd.DatumType, FieldName.strip(
+                        "."), Array_Index + 1, Pcd.PcdFieldValueFromComm[FieldName_ori][1], Pcd.PcdFieldValueFromComm[FieldName_ori][2], Pcd.PcdFieldValueFromComm[FieldName_ori][0])
         if Pcd.GetPcdMaxSize():
-            CApp = CApp + "  *Size = (%d > *Size ? %d : *Size); // The Pcd maxsize is %d \n" % (Pcd.GetPcdMaxSize(), Pcd.GetPcdMaxSize(), Pcd.GetPcdMaxSize())
+            CApp = CApp + "  *Size = (%d > *Size ? %d : *Size); // The Pcd maxsize is %d \n" % (
+                Pcd.GetPcdMaxSize(), Pcd.GetPcdMaxSize(), Pcd.GetPcdMaxSize())
         ArraySizeByAssign = self.CalculateActualCap(ActualCap)
         if ArraySizeByAssign > 1:
-            CApp = CApp + "  *Size = (%d > *Size ? %d : *Size); \n" % (ArraySizeByAssign, ArraySizeByAssign)
+            CApp = CApp + \
+                "  *Size = (%d > *Size ? %d : *Size); \n" % (ArraySizeByAssign,
+                                                             ArraySizeByAssign)
         CApp = CApp + "}\n"
         return CApp
-    def CalculateActualCap(self,ActualCap):
+
+    def CalculateActualCap(self, ActualCap):
         if not ActualCap:
             return 1
         maxsize = 1
@@ -1986,22 +2189,23 @@ class DscBuildData(PlatformBuildClassObject):
                 if not index_num:
                     # Not support flexiable pcd array assignment
                     return 1
-                index_num = int(index_num,16) if index_num.startswith(("0x","0X")) else int(index_num)
+                index_num = int(index_num, 16) if index_num.startswith(
+                    ("0x", "0X")) else int(index_num)
                 rt = rt * (index_num+1)
-            if rt  >maxsize:
+            if rt > maxsize:
                 maxsize = rt
 
         return maxsize
 
     @staticmethod
-    def GenerateSizeStatments(Pcd,skuname,defaultstorename):
+    def GenerateSizeStatments(Pcd, skuname, defaultstorename):
         if Pcd.IsArray():
             r_datatype = [Pcd.BaseDatumType]
             lastoneisEmpty = False
             for dem in Pcd.Capacity:
                 if lastoneisEmpty:
                     EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s.  " %
-                                        (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))))
+                                    (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName))))
                 if dem == '0' or dem == "-1":
                     r_datatype.append("[1]")
                     lastoneisEmpty = True
@@ -2009,45 +2213,54 @@ class DscBuildData(PlatformBuildClassObject):
                     r_datatype.append("[" + dem + "]")
 
             if Pcd.Type in [MODEL_PCD_DYNAMIC_EX_HII, MODEL_PCD_DYNAMIC_HII]:
-                PcdDefValue = Pcd.SkuInfoList.get(skuname).DefaultStoreDict.get(defaultstorename)
-            elif Pcd.Type in [MODEL_PCD_DYNAMIC_EX_DEFAULT,MODEL_PCD_DYNAMIC_VPD,MODEL_PCD_DYNAMIC_DEFAULT,MODEL_PCD_DYNAMIC_EX_VPD]:
+                PcdDefValue = Pcd.SkuInfoList.get(
+                    skuname).DefaultStoreDict.get(defaultstorename)
+            elif Pcd.Type in [MODEL_PCD_DYNAMIC_EX_DEFAULT, MODEL_PCD_DYNAMIC_VPD, MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_VPD]:
                 PcdDefValue = Pcd.SkuInfoList.get(skuname).DefaultValue
             else:
                 PcdDefValue = Pcd.DefaultValue
             if lastoneisEmpty:
                 if "{CODE(" not in PcdDefValue:
-                    sizebasevalue_plus = "(%s / sizeof(%s) + 1)" % ((DscBuildData.GetStructurePcdMaxSize(Pcd), Pcd.BaseDatumType))
-                    sizebasevalue = "(%s / sizeof(%s))" % ((DscBuildData.GetStructurePcdMaxSize(Pcd), Pcd.BaseDatumType))
+                    sizebasevalue_plus = "(%s / sizeof(%s) + 1)" % (
+                        (DscBuildData.GetStructurePcdMaxSize(Pcd), Pcd.BaseDatumType))
+                    sizebasevalue = "(%s / sizeof(%s))" % (
+                        (DscBuildData.GetStructurePcdMaxSize(Pcd), Pcd.BaseDatumType))
                     sizeof = "sizeof(%s)" % Pcd.BaseDatumType
-                    CApp = '  int ArraySize = %s %% %s ? %s : %s ;\n' % ( (DscBuildData.GetStructurePcdMaxSize(Pcd), sizeof, sizebasevalue_plus, sizebasevalue))
+                    CApp = '  int ArraySize = %s %% %s ? %s : %s ;\n' % (
+                        (DscBuildData.GetStructurePcdMaxSize(Pcd), sizeof, sizebasevalue_plus, sizebasevalue))
                     CApp += '  Size = ArraySize * sizeof(%s); \n' % Pcd.BaseDatumType
                 else:
                     CApp = "  Size = 0;\n"
             else:
-                CApp = '  Size = sizeof(%s);\n' % ("".join(r_datatype) )
+                CApp = '  Size = sizeof(%s);\n' % ("".join(r_datatype))
         else:
             CApp = '  Size = sizeof(%s);\n' % (Pcd.DatumType)
-        CApp = CApp + '  Cal_%s_%s_Size(&Size);\n' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
+        CApp = CApp + \
+            '  Cal_%s_%s_Size(&Size);\n' % (
+                Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
         return CApp
 
-    def GetIndicator(self,index,FieldName,Pcd):
+    def GetIndicator(self, index, FieldName, Pcd):
         def cleanupindex(indexstr):
             return indexstr.strip("[").strip("]").strip()
         index_elements = ArrayIndex.findall(index)
         pcd_capacity = Pcd.Capacity
         if index:
             indicator = "(Pcd"
-            if len(pcd_capacity)>2:
-                for i in range(0,len(index_elements)):
+            if len(pcd_capacity) > 2:
+                for i in range(0, len(index_elements)):
                     index_ele = index_elements[i]
                     index_num = index_ele.strip("[").strip("]").strip()
-                    if i == len(index_elements) -2:
-                        indicator += "+ %d*Size/sizeof(%s)/%d + %s)" %(int(cleanupindex(index_elements[i+1])),Pcd.BaseDatumType,reduce(lambda x,y: int(x)*int(y),pcd_capacity[:-1]), cleanupindex(index_elements[i]))
+                    if i == len(index_elements) - 2:
+                        indicator += "+ %d*Size/sizeof(%s)/%d + %s)" % (int(cleanupindex(index_elements[i+1])), Pcd.BaseDatumType, reduce(
+                            lambda x, y: int(x)*int(y), pcd_capacity[:-1]), cleanupindex(index_elements[i]))
                         break
                     else:
-                        indicator += " + %d*%s*Size/sizeof(%s)/%d" %(int(cleanupindex(index_elements[i])),reduce(lambda x,y: int(x)*int(y),pcd_capacity[i+1:-1]),Pcd.BaseDatumType,reduce(lambda x,y: int(x)*int(y),pcd_capacity[:-1]))
+                        indicator += " + %d*%s*Size/sizeof(%s)/%d" % (int(cleanupindex(index_elements[i])), reduce(lambda x, y: int(
+                            x)*int(y), pcd_capacity[i+1:-1]), Pcd.BaseDatumType, reduce(lambda x, y: int(x)*int(y), pcd_capacity[:-1]))
             elif len(pcd_capacity) == 2:
-                indicator += "+ %d*Size/sizeof(%s)/%d + %s)" %(int(cleanupindex(index_elements[0])),Pcd.BaseDatumType,int(pcd_capacity[0]), index_elements[1].strip("[").strip("]").strip())
+                indicator += "+ %d*Size/sizeof(%s)/%d + %s)" % (int(cleanupindex(index_elements[0])), Pcd.BaseDatumType, int(
+                    pcd_capacity[0]), index_elements[1].strip("[").strip("]").strip())
             elif len(pcd_capacity) == 1:
                 index_ele = index_elements[0]
                 index_num = index_ele.strip("[").strip("]").strip()
@@ -2058,16 +2271,18 @@ class DscBuildData(PlatformBuildClassObject):
             indicator += "->" + FieldName
         return indicator
 
-    def GetStarNum(self,Pcd):
+    def GetStarNum(self, Pcd):
         if not Pcd.IsArray():
             return 1
         elif Pcd.IsSimpleTypeArray():
             return len(Pcd.Capacity)
         else:
             return len(Pcd.Capacity) + 1
+
     def GenerateDefaultValueAssignFunction(self, Pcd):
         CApp = "// Default value in Dec \n"
-        CApp = CApp + "void Assign_%s_%s_Default_Value(%s *Pcd){\n" % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.BaseDatumType)
+        CApp = CApp + "void Assign_%s_%s_Default_Value(%s *Pcd){\n" % (
+            Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.BaseDatumType)
         CApp = CApp + '  UINT32  FieldSize;\n'
         CApp = CApp + '  CHAR8   *Value;\n'
         CApp = CApp + ' UINT32 PcdArraySize;\n'
@@ -2075,12 +2290,13 @@ class DscBuildData(PlatformBuildClassObject):
         IsArray = _IsFieldValueAnArray(Pcd.DefaultValueFromDec)
         if IsArray:
             try:
-                DefaultValueFromDec = ValueExpressionEx(Pcd.DefaultValueFromDec, TAB_VOID)(True)
+                DefaultValueFromDec = ValueExpressionEx(
+                    Pcd.DefaultValueFromDec, TAB_VOID)(True)
             except BadExpression:
                 EdkLogger.error("Build", FORMAT_INVALID, "Invalid value format for %s.%s, from DEC: %s" %
                                 (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, DefaultValueFromDec))
         DefaultValueFromDec = StringToArray(DefaultValueFromDec)
-        Value, ValueSize = ParseFieldValue (DefaultValueFromDec)
+        Value, ValueSize = ParseFieldValue(DefaultValueFromDec)
         if IsArray:
             #
             # Use memcpy() to copy value into field
@@ -2089,24 +2305,34 @@ class DscBuildData(PlatformBuildClassObject):
                 pcdarraysize = Pcd.PcdArraySize()
                 if "{CODE(" in Pcd.DefaultValueFromDec:
                     if Pcd.Capacity[-1] != "-1":
-                        CApp = CApp + '__STATIC_ASSERT(sizeof(%s_%s_INIT_Value) < %d * sizeof(%s), "Pcd %s.%s Value in Dec exceed the array capability %s"); // From  %s Line %s \n ' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName,pcdarraysize,Pcd.BaseDatumType,Pcd.TokenSpaceGuidCName, Pcd.TokenCName,Pcd.DatumType,Pcd.DefaultValueFromDecInfo[0],Pcd.DefaultValueFromDecInfo[1])
-                    CApp = CApp + ' PcdArraySize = sizeof(%s_%s_INIT_Value);\n ' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
-                    CApp = CApp + '  memcpy (Pcd, %s_%s_INIT_Value,PcdArraySize);\n ' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
+                        CApp = CApp + '__STATIC_ASSERT(sizeof(%s_%s_INIT_Value) < %d * sizeof(%s), "Pcd %s.%s Value in Dec exceed the array capability %s"); // From  %s Line %s \n ' % (
+                            Pcd.TokenSpaceGuidCName, Pcd.TokenCName, pcdarraysize, Pcd.BaseDatumType, Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.DatumType, Pcd.DefaultValueFromDecInfo[0], Pcd.DefaultValueFromDecInfo[1])
+                    CApp = CApp + ' PcdArraySize = sizeof(%s_%s_INIT_Value);\n ' % (
+                        Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
+                    CApp = CApp + '  memcpy (Pcd, %s_%s_INIT_Value,PcdArraySize);\n ' % (
+                        Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
                 else:
                     if Pcd.Capacity[-1] != "-1":
-                        CApp = CApp + '__STATIC_ASSERT(%d < %d * sizeof(%s), "Pcd %s.%s Value in Dec exceed the array capability %s"); // From %s Line %s \n' % (ValueSize,pcdarraysize,Pcd.BaseDatumType,Pcd.TokenSpaceGuidCName, Pcd.TokenCName,Pcd.DatumType,Pcd.DefaultValueFromDecInfo[0],Pcd.DefaultValueFromDecInfo[1])
+                        CApp = CApp + '__STATIC_ASSERT(%d < %d * sizeof(%s), "Pcd %s.%s Value in Dec exceed the array capability %s"); // From %s Line %s \n' % (
+                            ValueSize, pcdarraysize, Pcd.BaseDatumType, Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.DatumType, Pcd.DefaultValueFromDecInfo[0], Pcd.DefaultValueFromDecInfo[1])
                     CApp = CApp + ' PcdArraySize = %d;\n' % ValueSize
-                    CApp = CApp + '  Value     = %s; // From DEC Default Value %s\n' % (DscBuildData.IntToCString(Value, ValueSize), Pcd.DefaultValueFromDec)
+                    CApp = CApp + '  Value     = %s; // From DEC Default Value %s\n' % (
+                        DscBuildData.IntToCString(Value, ValueSize), Pcd.DefaultValueFromDec)
                     CApp = CApp + '  memcpy (Pcd, Value, PcdArraySize);\n'
             else:
                 if "{CODE(" in Pcd.DefaultValueFromDec:
-                    CApp = CApp + '  PcdArraySize = sizeof(%s_%s_INIT_Value);\n ' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
-                    CApp = CApp + '  memcpy (Pcd, &%s_%s_INIT_Value,PcdArraySize);\n ' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
+                    CApp = CApp + '  PcdArraySize = sizeof(%s_%s_INIT_Value);\n ' % (
+                        Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
+                    CApp = CApp + '  memcpy (Pcd, &%s_%s_INIT_Value,PcdArraySize);\n ' % (
+                        Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
                 else:
-                    CApp = CApp + '  Value     = %s; // From DEC Default Value %s\n' % (DscBuildData.IntToCString(Value, ValueSize), Pcd.DefaultValueFromDec)
+                    CApp = CApp + '  Value     = %s; // From DEC Default Value %s\n' % (
+                        DscBuildData.IntToCString(Value, ValueSize), Pcd.DefaultValueFromDec)
                     CApp = CApp + '  memcpy (Pcd, Value, %d);\n' % (ValueSize)
         elif isinstance(Value, str):
-            CApp = CApp + '  Pcd = %s; // From DEC Default Value %s\n' % (Value, Pcd.DefaultValueFromDec)
+            CApp = CApp + \
+                '  Pcd = %s; // From DEC Default Value %s\n' % (
+                    Value, Pcd.DefaultValueFromDec)
         for index in Pcd.DefaultValues:
             FieldList = Pcd.DefaultValues[index]
             if not FieldList:
@@ -2115,100 +2341,128 @@ class DscBuildData(PlatformBuildClassObject):
                 IsArray = _IsFieldValueAnArray(FieldList[FieldName][0])
                 if IsArray:
                     try:
-                        FieldList[FieldName][0] = ValueExpressionEx(FieldList[FieldName][0], TAB_VOID, self._GuidDict)(True)
+                        FieldList[FieldName][0] = ValueExpressionEx(
+                            FieldList[FieldName][0], TAB_VOID, self._GuidDict)(True)
                     except BadExpression:
                         EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " %
                                         (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[FieldName][2]))
 
                 try:
-                    Value, ValueSize = ParseFieldValue (FieldList[FieldName][0])
+                    Value, ValueSize = ParseFieldValue(FieldList[FieldName][0])
                 except Exception:
-                    EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " % (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[FieldName][2]))
+                    EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " % (".".join(
+                        (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[FieldName][2]))
 
-                indicator = self.GetIndicator(index, FieldName,Pcd)
+                indicator = self.GetIndicator(index, FieldName, Pcd)
                 if IsArray:
                     #
                     # Use memcpy() to copy value into field
                     #
-                    CApp = CApp + '  FieldSize = __FIELD_SIZE(%s, %s);\n' % (Pcd.BaseDatumType, FieldName)
-                    CApp = CApp + '  Value     = %s; // From %s Line %d Value %s\n' % (DscBuildData.IntToCString(Value, ValueSize), FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
-                    CApp = CApp + '  __STATIC_ASSERT((__FIELD_SIZE(%s, %s) >= %d) || (__FIELD_SIZE(%s, %s) == 0), "Input buffer exceeds the buffer array"); // From %s Line %d Value %s\n' % (Pcd.BaseDatumType, FieldName, ValueSize, Pcd.BaseDatumType, FieldName, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
-                    CApp = CApp + '  memcpy (&%s, Value, (FieldSize > 0 && FieldSize < %d) ? FieldSize : %d);\n' % (indicator, ValueSize, ValueSize)
+                    CApp = CApp + \
+                        '  FieldSize = __FIELD_SIZE(%s, %s);\n' % (
+                            Pcd.BaseDatumType, FieldName)
+                    CApp = CApp + '  Value     = %s; // From %s Line %d Value %s\n' % (DscBuildData.IntToCString(
+                        Value, ValueSize), FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                    CApp = CApp + '  __STATIC_ASSERT((__FIELD_SIZE(%s, %s) >= %d) || (__FIELD_SIZE(%s, %s) == 0), "Input buffer exceeds the buffer array"); // From %s Line %d Value %s\n' % (
+                        Pcd.BaseDatumType, FieldName, ValueSize, Pcd.BaseDatumType, FieldName, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                    CApp = CApp + '  memcpy (&%s, Value, (FieldSize > 0 && FieldSize < %d) ? FieldSize : %d);\n' % (
+                        indicator, ValueSize, ValueSize)
                 elif isinstance(Value, str):
-                    CApp = CApp + '  %s = %s; // From %s Line %d Value %s\n' % (indicator, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                    CApp = CApp + '  %s = %s; // From %s Line %d Value %s\n' % (
+                        indicator, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
                 else:
                     if '[' in FieldName and ']' in FieldName:
                         Index = int(FieldName.split('[')[1].split(']')[0])
-                        CApp = CApp + '  __STATIC_ASSERT((%d < __ARRAY_SIZE(Pcd->%s)) || (__ARRAY_SIZE(Pcd->%s) == 0), "array index exceeds the array number"); // From %s Line %d Index of %s\n' % (Index, FieldName.split('[')[0], FieldName.split('[')[0], FieldList[FieldName][1], FieldList[FieldName][2], FieldName)
+                        CApp = CApp + '  __STATIC_ASSERT((%d < __ARRAY_SIZE(Pcd->%s)) || (__ARRAY_SIZE(Pcd->%s) == 0), "array index exceeds the array number"); // From %s Line %d Index of %s\n' % (
+                            Index, FieldName.split('[')[0], FieldName.split('[')[0], FieldList[FieldName][1], FieldList[FieldName][2], FieldName)
                     if ValueSize > 4:
-                        CApp = CApp + '  %s = %dULL; // From %s Line %d Value %s\n' % (indicator, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                        CApp = CApp + '  %s = %dULL; // From %s Line %d Value %s\n' % (
+                            indicator, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
                     else:
-                        CApp = CApp + '  %s = %d; // From %s Line %d Value %s\n' % (indicator, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                        CApp = CApp + '  %s = %d; // From %s Line %d Value %s\n' % (
+                            indicator, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
         CApp = CApp + "}\n"
         return CApp
 
     @staticmethod
     def GenerateDefaultValueAssignStatement(Pcd):
-        CApp = '  Assign_%s_%s_Default_Value(Pcd);\n' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
+        CApp = '  Assign_%s_%s_Default_Value(Pcd);\n' % (
+            Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
         return CApp
 
-    def GetPcdDscRawDefaultValue(self,Pcd, SkuName,DefaultStoreName):
+    def GetPcdDscRawDefaultValue(self, Pcd, SkuName, DefaultStoreName):
         if Pcd.Type in PCD_DYNAMIC_TYPE_SET or Pcd.Type in PCD_DYNAMIC_EX_TYPE_SET:
             if (SkuName, DefaultStoreName) == (TAB_DEFAULT, TAB_DEFAULT_STORES_DEFAULT):
-                pcddefaultvalue = Pcd.DefaultFromDSC.get(TAB_DEFAULT, {}).get(TAB_DEFAULT_STORES_DEFAULT) if Pcd.DefaultFromDSC else None
+                pcddefaultvalue = Pcd.DefaultFromDSC.get(TAB_DEFAULT, {}).get(
+                    TAB_DEFAULT_STORES_DEFAULT) if Pcd.DefaultFromDSC else None
             else:
-                pcddefaultvalue = Pcd.DscRawValue.get(SkuName, {}).get(DefaultStoreName)
+                pcddefaultvalue = Pcd.DscRawValue.get(
+                    SkuName, {}).get(DefaultStoreName)
         else:
-            pcddefaultvalue = Pcd.DscRawValue.get(SkuName, {}).get(TAB_DEFAULT_STORES_DEFAULT)
+            pcddefaultvalue = Pcd.DscRawValue.get(
+                SkuName, {}).get(TAB_DEFAULT_STORES_DEFAULT)
 
         return pcddefaultvalue
-    def GetPcdDscRawValueInfo(self,Pcd, SkuName,DefaultStoreName):
-        DscValueInfo = Pcd.DscRawValueInfo.get(SkuName, {}).get(DefaultStoreName)
+
+    def GetPcdDscRawValueInfo(self, Pcd, SkuName, DefaultStoreName):
+        DscValueInfo = Pcd.DscRawValueInfo.get(
+            SkuName, {}).get(DefaultStoreName)
         if DscValueInfo:
-            dscfilepath,lineno = DscValueInfo
+            dscfilepath, lineno = DscValueInfo
         else:
             dscfilepath = self.MetaFile.File
             lineno = ""
-        return dscfilepath,lineno
+        return dscfilepath, lineno
 
     def GenerateInitValueFunction(self, Pcd, SkuName, DefaultStoreName):
-        CApp = "// Value in Dsc for Sku: %s, DefaultStore %s\n" % (SkuName, DefaultStoreName)
-        CApp = CApp + "void Assign_%s_%s_%s_%s_Value(%s *Pcd){\n" % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, SkuName, DefaultStoreName, Pcd.BaseDatumType)
+        CApp = "// Value in Dsc for Sku: %s, DefaultStore %s\n" % (
+            SkuName, DefaultStoreName)
+        CApp = CApp + "void Assign_%s_%s_%s_%s_Value(%s *Pcd){\n" % (
+            Pcd.TokenSpaceGuidCName, Pcd.TokenCName, SkuName, DefaultStoreName, Pcd.BaseDatumType)
         CApp = CApp + '  UINT32  FieldSize;\n'
         CApp = CApp + '  CHAR8   *Value;\n'
         CApp = CApp + ' UINT32 PcdArraySize;\n'
 
-        CApp = CApp + "// SkuName: %s,  DefaultStoreName: %s \n" % (TAB_DEFAULT, TAB_DEFAULT_STORES_DEFAULT)
+        CApp = CApp + \
+            "// SkuName: %s,  DefaultStoreName: %s \n" % (
+                TAB_DEFAULT, TAB_DEFAULT_STORES_DEFAULT)
         inherit_OverrideValues = Pcd.SkuOverrideValues[SkuName]
-        dscfilepath,lineno = self.GetPcdDscRawValueInfo(Pcd, SkuName, DefaultStoreName)
+        dscfilepath, lineno = self.GetPcdDscRawValueInfo(
+            Pcd, SkuName, DefaultStoreName)
         if lineno:
-            valuefrom = "%s Line %s" % (dscfilepath,str(lineno))
+            valuefrom = "%s Line %s" % (dscfilepath, str(lineno))
         else:
             valuefrom = dscfilepath
 
-        pcddefaultvalue = self.GetPcdDscRawDefaultValue(Pcd, SkuName, DefaultStoreName)
+        pcddefaultvalue = self.GetPcdDscRawDefaultValue(
+            Pcd, SkuName, DefaultStoreName)
         if pcddefaultvalue:
             FieldList = pcddefaultvalue
             IsArray = _IsFieldValueAnArray(FieldList)
             if IsArray:
                 if "{CODE(" not in FieldList:
                     try:
-                        FieldList = ValueExpressionEx(FieldList, TAB_VOID)(True)
+                        FieldList = ValueExpressionEx(
+                            FieldList, TAB_VOID)(True)
                     except BadExpression:
                         EdkLogger.error("Build", FORMAT_INVALID, "Invalid value format for %s.%s, from DSC: %s" %
                                         (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldList))
-            Value, ValueSize = ParseFieldValue (FieldList)
+            Value, ValueSize = ParseFieldValue(FieldList)
 
             if (SkuName, DefaultStoreName) == (TAB_DEFAULT, TAB_DEFAULT_STORES_DEFAULT):
                 if isinstance(Value, str):
                     if "{CODE(" in Value:
                         if Pcd.IsArray() and Pcd.Capacity[-1] != "-1":
                             pcdarraysize = Pcd.PcdArraySize()
-                            CApp = CApp + '__STATIC_ASSERT(sizeof(%s_%s_%s_%s_Value) < %d * sizeof(%s), "Pcd %s.%s Value in Dsc exceed the array capability %s"); // From %s \n' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName,SkuName, DefaultStoreName,pcdarraysize,Pcd.BaseDatumType,Pcd.TokenSpaceGuidCName, Pcd.TokenCName,Pcd.DatumType, valuefrom)
-                        CApp = CApp+ ' PcdArraySize = sizeof(%s_%s_%s_%s_Value);\n ' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName,SkuName, DefaultStoreName)
-                        CApp = CApp + '  memcpy (Pcd, &%s_%s_%s_%s_Value,PcdArraySize);\n ' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName,SkuName, DefaultStoreName)
+                            CApp = CApp + '__STATIC_ASSERT(sizeof(%s_%s_%s_%s_Value) < %d * sizeof(%s), "Pcd %s.%s Value in Dsc exceed the array capability %s"); // From %s \n' % (
+                                Pcd.TokenSpaceGuidCName, Pcd.TokenCName, SkuName, DefaultStoreName, pcdarraysize, Pcd.BaseDatumType, Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.DatumType, valuefrom)
+                        CApp = CApp + ' PcdArraySize = sizeof(%s_%s_%s_%s_Value);\n ' % (
+                            Pcd.TokenSpaceGuidCName, Pcd.TokenCName, SkuName, DefaultStoreName)
+                        CApp = CApp + '  memcpy (Pcd, &%s_%s_%s_%s_Value,PcdArraySize);\n ' % (
+                            Pcd.TokenSpaceGuidCName, Pcd.TokenCName, SkuName, DefaultStoreName)
                     else:
-                        CApp = CApp + '  Pcd = %s; // From DSC Default Value %s\n' % (Value, Pcd.DefaultFromDSC.get(TAB_DEFAULT, {}).get(TAB_DEFAULT_STORES_DEFAULT, Pcd.DefaultValue) if Pcd.DefaultFromDSC else Pcd.DefaultValue)
+                        CApp = CApp + '  Pcd = %s; // From DSC Default Value %s\n' % (Value, Pcd.DefaultFromDSC.get(
+                            TAB_DEFAULT, {}).get(TAB_DEFAULT_STORES_DEFAULT, Pcd.DefaultValue) if Pcd.DefaultFromDSC else Pcd.DefaultValue)
                 elif IsArray:
                     #
                     # Use memcpy() to copy value into field
@@ -2217,32 +2471,46 @@ class DscBuildData(PlatformBuildClassObject):
                         pcdarraysize = Pcd.PcdArraySize()
                         if "{CODE(" in pcddefaultvalue:
                             if Pcd.Capacity[-1] != "-1":
-                                CApp = CApp + '__STATIC_ASSERT(sizeof(%s_%s_%s_%s_Value) < %d * sizeof(%s), "Pcd %s.%s Value in Dsc exceed the array capability %s"); // From  %s \n' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName,SkuName, DefaultStoreName,pcdarraysize,Pcd.BaseDatumType,Pcd.TokenSpaceGuidCName, Pcd.TokenCName,Pcd.DatumType,valuefrom)
-                            CApp = CApp + ' PcdArraySize = sizeof(%s_%s_%s_%s_Value);\n ' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName,SkuName, DefaultStoreName)
-                            CApp = CApp + '  memcpy (Pcd, %s_%s_%s_%s_Value, PcdArraySize);\n' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName,SkuName, DefaultStoreName)
+                                CApp = CApp + '__STATIC_ASSERT(sizeof(%s_%s_%s_%s_Value) < %d * sizeof(%s), "Pcd %s.%s Value in Dsc exceed the array capability %s"); // From  %s \n' % (
+                                    Pcd.TokenSpaceGuidCName, Pcd.TokenCName, SkuName, DefaultStoreName, pcdarraysize, Pcd.BaseDatumType, Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.DatumType, valuefrom)
+                            CApp = CApp + ' PcdArraySize = sizeof(%s_%s_%s_%s_Value);\n ' % (
+                                Pcd.TokenSpaceGuidCName, Pcd.TokenCName, SkuName, DefaultStoreName)
+                            CApp = CApp + '  memcpy (Pcd, %s_%s_%s_%s_Value, PcdArraySize);\n' % (
+                                Pcd.TokenSpaceGuidCName, Pcd.TokenCName, SkuName, DefaultStoreName)
                         else:
                             if Pcd.Capacity[-1] != "-1":
-                                CApp = CApp + '__STATIC_ASSERT(%d < %d * sizeof(%s), "Pcd %s.%s Value in Dsc exceed the array capability %s"); // From  %s \n' % (ValueSize,pcdarraysize,Pcd.BaseDatumType,Pcd.TokenSpaceGuidCName, Pcd.TokenCName,Pcd.DatumType,valuefrom)
+                                CApp = CApp + '__STATIC_ASSERT(%d < %d * sizeof(%s), "Pcd %s.%s Value in Dsc exceed the array capability %s"); // From  %s \n' % (
+                                    ValueSize, pcdarraysize, Pcd.BaseDatumType, Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.DatumType, valuefrom)
                             CApp = CApp + ' PcdArraySize = %d;\n' % ValueSize
-                            CApp = CApp + '  Value     = %s; // From DSC Default Value %s\n' % (DscBuildData.IntToCString(Value, ValueSize), Pcd.DefaultFromDSC.get(TAB_DEFAULT, {}).get(TAB_DEFAULT_STORES_DEFAULT, Pcd.DefaultValue) if Pcd.DefaultFromDSC else Pcd.DefaultValue)
-                            CApp = CApp + '  memcpy (Pcd, Value, PcdArraySize);\n'
+                            CApp = CApp + '  Value     = %s; // From DSC Default Value %s\n' % (DscBuildData.IntToCString(Value, ValueSize), Pcd.DefaultFromDSC.get(
+                                TAB_DEFAULT, {}).get(TAB_DEFAULT_STORES_DEFAULT, Pcd.DefaultValue) if Pcd.DefaultFromDSC else Pcd.DefaultValue)
+                            CApp = CApp + \
+                                '  memcpy (Pcd, Value, PcdArraySize);\n'
                     else:
                         if "{CODE(" in pcddefaultvalue:
-                            CApp = CApp + '  PcdArraySize = %d < sizeof(%s) * %d ? %d: sizeof(%s) * %d;\n ' % (ValueSize,Pcd.BaseDatumType,pcdarraysize,ValueSize,Pcd.BaseDatumType,pcdarraysize)
-                            CApp = CApp + '  memcpy (Pcd, &%s_%s_%s_%s_Value, PcdArraySize);\n' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName,SkuName, DefaultStoreName)
+                            CApp = CApp + '  PcdArraySize = %d < sizeof(%s) * %d ? %d: sizeof(%s) * %d;\n ' % (
+                                ValueSize, Pcd.BaseDatumType, pcdarraysize, ValueSize, Pcd.BaseDatumType, pcdarraysize)
+                            CApp = CApp + '  memcpy (Pcd, &%s_%s_%s_%s_Value, PcdArraySize);\n' % (
+                                Pcd.TokenSpaceGuidCName, Pcd.TokenCName, SkuName, DefaultStoreName)
                         else:
-                            CApp = CApp + '  Value     = %s; // From DSC Default Value %s\n' % (DscBuildData.IntToCString(Value, ValueSize), Pcd.DefaultFromDSC.get(TAB_DEFAULT, {}).get(TAB_DEFAULT_STORES_DEFAULT, Pcd.DefaultValue) if Pcd.DefaultFromDSC else Pcd.DefaultValue)
-                            CApp = CApp + '  memcpy (Pcd, Value, %d);\n' % (ValueSize)
+                            CApp = CApp + '  Value     = %s; // From DSC Default Value %s\n' % (DscBuildData.IntToCString(Value, ValueSize), Pcd.DefaultFromDSC.get(
+                                TAB_DEFAULT, {}).get(TAB_DEFAULT_STORES_DEFAULT, Pcd.DefaultValue) if Pcd.DefaultFromDSC else Pcd.DefaultValue)
+                            CApp = CApp + \
+                                '  memcpy (Pcd, Value, %d);\n' % (ValueSize)
             else:
                 if isinstance(Value, str):
                     if "{CODE(" in Value:
                         if Pcd.IsArray() and Pcd.Capacity[-1] != "-1":
                             pcdarraysize = Pcd.PcdArraySize()
-                            CApp = CApp + '__STATIC_ASSERT(sizeof(%s_%s_%s_%s_Value) < %d * sizeof(%s), "Pcd %s.%s Value in Dsc exceed the array capability %s"); // From %s \n' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName,SkuName, DefaultStoreName,pcdarraysize,Pcd.BaseDatumType,Pcd.TokenSpaceGuidCName, Pcd.TokenCName,Pcd.DatumType,valuefrom)
-                        CApp = CApp + ' PcdArraySize = sizeof(%s_%s_%s_%s_Value);\n '% (Pcd.TokenSpaceGuidCName, Pcd.TokenCName,SkuName, DefaultStoreName)
-                        CApp = CApp + '  memcpy (Pcd, &%s_%s_%s_%s_Value, PcdArraySize);\n' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName,SkuName, DefaultStoreName)
+                            CApp = CApp + '__STATIC_ASSERT(sizeof(%s_%s_%s_%s_Value) < %d * sizeof(%s), "Pcd %s.%s Value in Dsc exceed the array capability %s"); // From %s \n' % (
+                                Pcd.TokenSpaceGuidCName, Pcd.TokenCName, SkuName, DefaultStoreName, pcdarraysize, Pcd.BaseDatumType, Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.DatumType, valuefrom)
+                        CApp = CApp + ' PcdArraySize = sizeof(%s_%s_%s_%s_Value);\n ' % (
+                            Pcd.TokenSpaceGuidCName, Pcd.TokenCName, SkuName, DefaultStoreName)
+                        CApp = CApp + '  memcpy (Pcd, &%s_%s_%s_%s_Value, PcdArraySize);\n' % (
+                            Pcd.TokenSpaceGuidCName, Pcd.TokenCName, SkuName, DefaultStoreName)
                     else:
-                        CApp = CApp + '  Pcd = %s; // From DSC Default Value %s\n' % (Value, Pcd.DscRawValue.get(SkuName, {}).get(DefaultStoreName))
+                        CApp = CApp + '  Pcd = %s; // From DSC Default Value %s\n' % (
+                            Value, Pcd.DscRawValue.get(SkuName, {}).get(DefaultStoreName))
                 elif IsArray:
                     #
                     # Use memcpy() to copy value into field
@@ -2251,22 +2519,32 @@ class DscBuildData(PlatformBuildClassObject):
                         pcdarraysize = Pcd.PcdArraySize()
                         if "{CODE(" in pcddefaultvalue:
                             if Pcd.Capacity[-1] != "-1":
-                                CApp = CApp + '__STATIC_ASSERT(sizeof(%s_%s_%s_%s_Value) < %d * sizeof(%s), "Pcd %s.%s Value in Dsc exceed the array capability %s"); // From  %s \n' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName,SkuName, DefaultStoreName,pcdarraysize,Pcd.BaseDatumType,Pcd.TokenSpaceGuidCName, Pcd.TokenCName,Pcd.DatumType,valuefrom)
-                            CApp + ' PcdArraySize = sizeof(%s_%s_%s_%s_Value);\n ' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName,SkuName, DefaultStoreName)
-                            CApp = CApp + '  memcpy (Pcd, %s_%s_%s_%s_Value, PcdArraySize);\n' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName,SkuName, DefaultStoreName)
+                                CApp = CApp + '__STATIC_ASSERT(sizeof(%s_%s_%s_%s_Value) < %d * sizeof(%s), "Pcd %s.%s Value in Dsc exceed the array capability %s"); // From  %s \n' % (
+                                    Pcd.TokenSpaceGuidCName, Pcd.TokenCName, SkuName, DefaultStoreName, pcdarraysize, Pcd.BaseDatumType, Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.DatumType, valuefrom)
+                            CApp + ' PcdArraySize = sizeof(%s_%s_%s_%s_Value);\n ' % (
+                                Pcd.TokenSpaceGuidCName, Pcd.TokenCName, SkuName, DefaultStoreName)
+                            CApp = CApp + '  memcpy (Pcd, %s_%s_%s_%s_Value, PcdArraySize);\n' % (
+                                Pcd.TokenSpaceGuidCName, Pcd.TokenCName, SkuName, DefaultStoreName)
                         else:
                             if Pcd.Capacity[-1] != "-1":
-                                CApp = CApp + '__STATIC_ASSERT(%d < %d * sizeof(%s), "Pcd %s.%s Value in Dsc exceed the array capability %s"); // From  %s \n' % (ValueSize,pcdarraysize,Pcd.BaseDatumType,Pcd.TokenSpaceGuidCName, Pcd.TokenCName,Pcd.DatumType,valuefrom)
+                                CApp = CApp + '__STATIC_ASSERT(%d < %d * sizeof(%s), "Pcd %s.%s Value in Dsc exceed the array capability %s"); // From  %s \n' % (
+                                    ValueSize, pcdarraysize, Pcd.BaseDatumType, Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.DatumType, valuefrom)
                             CApp = CApp + ' PcdArraySize = %d;\n' % ValueSize
-                            CApp = CApp + '  Value     = %s; // From DSC Default Value %s\n' % (DscBuildData.IntToCString(Value, ValueSize), Pcd.DscRawValue.get(TAB_DEFAULT, {}).get(TAB_DEFAULT_STORES_DEFAULT, Pcd.DefaultValue) if Pcd.DefaultFromDSC else Pcd.DefaultValue)
-                            CApp = CApp + '  memcpy (Pcd, Value, PcdArraySize);\n'
+                            CApp = CApp + '  Value     = %s; // From DSC Default Value %s\n' % (DscBuildData.IntToCString(Value, ValueSize), Pcd.DscRawValue.get(
+                                TAB_DEFAULT, {}).get(TAB_DEFAULT_STORES_DEFAULT, Pcd.DefaultValue) if Pcd.DefaultFromDSC else Pcd.DefaultValue)
+                            CApp = CApp + \
+                                '  memcpy (Pcd, Value, PcdArraySize);\n'
                     else:
                         if "{CODE(" in pcddefaultvalue:
-                            CApp = CApp + '  PcdArraySize = %d < sizeof(%s) * %d ? %d: sizeof(%s) * %d;\n ' % (ValueSize,Pcd.BaseDatumType,pcdarraysize,ValueSize,Pcd.BaseDatumType,pcdarraysize)
-                            CApp = CApp + '  memcpy (Pcd, &%s_%s_%s_%s_Value, PcdArraySize);\n' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName,SkuName, DefaultStoreName)
+                            CApp = CApp + '  PcdArraySize = %d < sizeof(%s) * %d ? %d: sizeof(%s) * %d;\n ' % (
+                                ValueSize, Pcd.BaseDatumType, pcdarraysize, ValueSize, Pcd.BaseDatumType, pcdarraysize)
+                            CApp = CApp + '  memcpy (Pcd, &%s_%s_%s_%s_Value, PcdArraySize);\n' % (
+                                Pcd.TokenSpaceGuidCName, Pcd.TokenCName, SkuName, DefaultStoreName)
                         else:
-                            CApp = CApp + '  Value     = %s; // From DSC Default Value %s\n' % (DscBuildData.IntToCString(Value, ValueSize), Pcd.DscRawValue.get(SkuName, {}).get(DefaultStoreName))
-                            CApp = CApp + '  memcpy (Pcd, Value, %d);\n' % (ValueSize)
+                            CApp = CApp + '  Value     = %s; // From DSC Default Value %s\n' % (DscBuildData.IntToCString(
+                                Value, ValueSize), Pcd.DscRawValue.get(SkuName, {}).get(DefaultStoreName))
+                            CApp = CApp + \
+                                '  memcpy (Pcd, Value, %d);\n' % (ValueSize)
 
         inheritvalue = inherit_OverrideValues.get(DefaultStoreName)
         if not inheritvalue:
@@ -2275,49 +2553,63 @@ class DscBuildData(PlatformBuildClassObject):
             FieldList = inheritvalue[index]
             if not FieldList:
                 continue
-            if (SkuName, DefaultStoreName) == (TAB_DEFAULT, TAB_DEFAULT_STORES_DEFAULT) or (( (SkuName, '') not in Pcd.ValueChain) and ( (SkuName, DefaultStoreName) not in Pcd.ValueChain )):
+            if (SkuName, DefaultStoreName) == (TAB_DEFAULT, TAB_DEFAULT_STORES_DEFAULT) or (((SkuName, '') not in Pcd.ValueChain) and ((SkuName, DefaultStoreName) not in Pcd.ValueChain)):
                 for FieldName in FieldList:
-                    indicator = self.GetIndicator(index, FieldName,Pcd)
+                    indicator = self.GetIndicator(index, FieldName, Pcd)
                     IsArray = _IsFieldValueAnArray(FieldList[FieldName][0])
                     if IsArray:
                         try:
-                            FieldList[FieldName][0] = ValueExpressionEx(FieldList[FieldName][0], TAB_VOID, self._GuidDict)(True)
+                            FieldList[FieldName][0] = ValueExpressionEx(
+                                FieldList[FieldName][0], TAB_VOID, self._GuidDict)(True)
                         except BadExpression:
                             EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " %
                                             (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[FieldName][2]))
                     try:
-                        Value, ValueSize = ParseFieldValue (FieldList[FieldName][0])
+                        Value, ValueSize = ParseFieldValue(
+                            FieldList[FieldName][0])
                     except Exception:
-                        EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " % (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[FieldName][2]))
+                        EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " % (".".join(
+                            (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[FieldName][2]))
                     if isinstance(Value, str):
-                        CApp = CApp + '  Pcd->%s = %s; // From %s Line %d Value %s\n' % (FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                        CApp = CApp + '  Pcd->%s = %s; // From %s Line %d Value %s\n' % (
+                            FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
                     elif IsArray:
-                    #
-                    # Use memcpy() to copy value into field
-                    #
-                        CApp = CApp + '  FieldSize = __FIELD_SIZE(%s, %s);\n' % (Pcd.BaseDatumType, FieldName)
-                        CApp = CApp + '  Value     = %s; // From %s Line %d Value %s\n' % (DscBuildData.IntToCString(Value, ValueSize), FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
-                        CApp = CApp + '  __STATIC_ASSERT((__FIELD_SIZE(%s, %s) >= %d) || (__FIELD_SIZE(%s, %s) == 0), "Input buffer exceeds the buffer array"); // From %s Line %d Value %s\n' % (Pcd.BaseDatumType, FieldName, ValueSize, Pcd.BaseDatumType, FieldName, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
-                        CApp = CApp + '  memcpy (&%s, Value, (FieldSize > 0 && FieldSize < %d) ? FieldSize : %d);\n' % (indicator, ValueSize, ValueSize)
+                        #
+                        # Use memcpy() to copy value into field
+                        #
+                        CApp = CApp + \
+                            '  FieldSize = __FIELD_SIZE(%s, %s);\n' % (
+                                Pcd.BaseDatumType, FieldName)
+                        CApp = CApp + '  Value     = %s; // From %s Line %d Value %s\n' % (DscBuildData.IntToCString(
+                            Value, ValueSize), FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                        CApp = CApp + '  __STATIC_ASSERT((__FIELD_SIZE(%s, %s) >= %d) || (__FIELD_SIZE(%s, %s) == 0), "Input buffer exceeds the buffer array"); // From %s Line %d Value %s\n' % (
+                            Pcd.BaseDatumType, FieldName, ValueSize, Pcd.BaseDatumType, FieldName, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                        CApp = CApp + '  memcpy (&%s, Value, (FieldSize > 0 && FieldSize < %d) ? FieldSize : %d);\n' % (
+                            indicator, ValueSize, ValueSize)
                     else:
                         if '[' in FieldName and ']' in FieldName:
                             Index = int(FieldName.split('[')[1].split(']')[0])
-                            CApp = CApp + '  __STATIC_ASSERT((%d < __ARRAY_SIZE(Pcd->%s)) || (__ARRAY_SIZE(Pcd->%s) == 0), "array index exceeds the array number"); // From %s Line %d Index of %s\n' % (Index, FieldName.split('[')[0], FieldName.split('[')[0], FieldList[FieldName][1], FieldList[FieldName][2], FieldName)
+                            CApp = CApp + '  __STATIC_ASSERT((%d < __ARRAY_SIZE(Pcd->%s)) || (__ARRAY_SIZE(Pcd->%s) == 0), "array index exceeds the array number"); // From %s Line %d Index of %s\n' % (
+                                Index, FieldName.split('[')[0], FieldName.split('[')[0], FieldList[FieldName][1], FieldList[FieldName][2], FieldName)
                         if ValueSize > 4:
-                            CApp = CApp + '  %s = %dULL; // From %s Line %d Value %s\n' % (indicator, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                            CApp = CApp + '  %s = %dULL; // From %s Line %d Value %s\n' % (
+                                indicator, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
                         else:
-                            CApp = CApp + '  %s = %d; // From %s Line %d Value %s\n' % (indicator, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                            CApp = CApp + '  %s = %d; // From %s Line %d Value %s\n' % (
+                                indicator, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
         CApp = CApp + "}\n"
         return CApp
 
     @staticmethod
     def GenerateInitValueStatement(Pcd, SkuName, DefaultStoreName):
-        CApp = '  Assign_%s_%s_%s_%s_Value(Pcd);\n' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, SkuName, DefaultStoreName)
+        CApp = '  Assign_%s_%s_%s_%s_Value(Pcd);\n' % (
+            Pcd.TokenSpaceGuidCName, Pcd.TokenCName, SkuName, DefaultStoreName)
         return CApp
 
     def GenerateCommandLineValue(self, Pcd):
         CApp = "// Value in CommandLine\n"
-        CApp = CApp + "void Assign_%s_%s_CommandLine_Value(%s *Pcd){\n" % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.BaseDatumType)
+        CApp = CApp + "void Assign_%s_%s_CommandLine_Value(%s *Pcd){\n" % (
+            Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.BaseDatumType)
         CApp = CApp + '  UINT32  FieldSize;\n'
         CApp = CApp + '  CHAR8   *Value;\n'
 
@@ -2329,53 +2621,67 @@ class DscBuildData(PlatformBuildClassObject):
                 IsArray = _IsFieldValueAnArray(FieldList)
                 if IsArray:
                     try:
-                        FieldList = ValueExpressionEx(FieldList, TAB_VOID)(True)
+                        FieldList = ValueExpressionEx(
+                            FieldList, TAB_VOID)(True)
                     except BadExpression:
                         EdkLogger.error("Build", FORMAT_INVALID, "Invalid value format for %s.%s, from Command: %s" %
                                         (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldList))
-                Value, ValueSize = ParseFieldValue (FieldList)
+                Value, ValueSize = ParseFieldValue(FieldList)
 
                 if isinstance(Value, str):
-                    CApp = CApp + '  Pcd = %s; // From Command Line \n' % (Value)
+                    CApp = CApp + \
+                        '  Pcd = %s; // From Command Line \n' % (Value)
                 elif IsArray:
-                #
-                # Use memcpy() to copy value into field
-                #
-                    CApp = CApp + '  Value     = %s; // From Command Line.\n' % (DscBuildData.IntToCString(Value, ValueSize))
+                    #
+                    # Use memcpy() to copy value into field
+                    #
+                    CApp = CApp + '  Value     = %s; // From Command Line.\n' % (
+                        DscBuildData.IntToCString(Value, ValueSize))
                     CApp = CApp + '  memcpy (Pcd, Value, %d);\n' % (ValueSize)
                 continue
             for FieldName in FieldList:
                 IsArray = _IsFieldValueAnArray(FieldList[FieldName][0])
                 if IsArray:
                     try:
-                        FieldList[FieldName][0] = ValueExpressionEx(FieldList[FieldName][0], TAB_VOID, self._GuidDict)(True)
+                        FieldList[FieldName][0] = ValueExpressionEx(
+                            FieldList[FieldName][0], TAB_VOID, self._GuidDict)(True)
                     except BadExpression:
                         EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " %
                                         (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[FieldName][2]))
                     except:
                         print("error")
                 try:
-                    Value, ValueSize = ParseFieldValue (FieldList[FieldName][0])
+                    Value, ValueSize = ParseFieldValue(FieldList[FieldName][0])
                 except Exception:
-                    EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " % (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[FieldName][2]))
+                    EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " % (".".join(
+                        (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[FieldName][2]))
                 if isinstance(Value, str):
-                    CApp = CApp + '  Pcd->%s = %s; // From %s Line %d Value %s\n' % (FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                    CApp = CApp + '  Pcd->%s = %s; // From %s Line %d Value %s\n' % (
+                        FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
                 elif IsArray:
-                #
-                # Use memcpy() to copy value into field
-                #
-                    CApp = CApp + '  FieldSize = __FIELD_SIZE(%s, %s);\n' % (Pcd.BaseDatumType, FieldName)
-                    CApp = CApp + '  Value     = %s; // From %s Line %d Value %s\n' % (DscBuildData.IntToCString(Value, ValueSize), FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
-                    CApp = CApp + '  __STATIC_ASSERT((__FIELD_SIZE(%s, %s) >= %d) || (__FIELD_SIZE(%s, %s) == 0), "Input buffer exceeds the buffer array"); // From %s Line %d Value %s\n' % (Pcd.BaseDatumType, FieldName, ValueSize, Pcd.BaseDatumType, FieldName, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
-                    CApp = CApp + '  memcpy (&Pcd->%s, Value, (FieldSize > 0 && FieldSize < %d) ? FieldSize : %d);\n' % (FieldName, ValueSize, ValueSize)
+                    #
+                    # Use memcpy() to copy value into field
+                    #
+                    CApp = CApp + \
+                        '  FieldSize = __FIELD_SIZE(%s, %s);\n' % (
+                            Pcd.BaseDatumType, FieldName)
+                    CApp = CApp + '  Value     = %s; // From %s Line %d Value %s\n' % (DscBuildData.IntToCString(
+                        Value, ValueSize), FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                    CApp = CApp + '  __STATIC_ASSERT((__FIELD_SIZE(%s, %s) >= %d) || (__FIELD_SIZE(%s, %s) == 0), "Input buffer exceeds the buffer array"); // From %s Line %d Value %s\n' % (
+                        Pcd.BaseDatumType, FieldName, ValueSize, Pcd.BaseDatumType, FieldName, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                    CApp = CApp + '  memcpy (&Pcd->%s, Value, (FieldSize > 0 && FieldSize < %d) ? FieldSize : %d);\n' % (
+                        FieldName, ValueSize, ValueSize)
                 else:
                     if '[' in FieldName and ']' in FieldName:
                         Index = int(FieldName.split('[')[1].split(']')[0])
-                        CApp = CApp + '  __STATIC_ASSERT((%d < __ARRAY_SIZE(Pcd->%s)) || (__ARRAY_SIZE(Pcd->%s) == 0), "array index exceeds the array number"); // From %s Line %d Index of %s\n' % (Index, FieldName.split('[')[0], FieldName.split('[')[0], FieldList[FieldName][1], FieldList[FieldName][2], FieldName)
+                        CApp = CApp + '  __STATIC_ASSERT((%d < __ARRAY_SIZE(Pcd->%s)) || (__ARRAY_SIZE(Pcd->%s) == 0), "array index exceeds the array number"); // From %s Line %d Index of %s\n' % (
+                            Index, FieldName.split('[')[0], FieldName.split('[')[0], FieldList[FieldName][1], FieldList[FieldName][2], FieldName)
                     if ValueSize > 4:
-                        CApp = CApp + '  Pcd->%s = %dULL; // From %s Line %d Value %s\n' % (FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                        CApp = CApp + '  Pcd->%s = %dULL; // From %s Line %d Value %s\n' % (
+                            FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
                     else:
-                        CApp = CApp + '  Pcd->%s = %d; // From %s Line %d Value %s\n' % (FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                        CApp = CApp + '  Pcd->%s = %d; // From %s Line %d Value %s\n' % (
+                            FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
         CApp = CApp + "}\n"
         return CApp
 
@@ -2383,31 +2689,36 @@ class DscBuildData(PlatformBuildClassObject):
         CApp = "// Value in Dsc Module scope \n"
         for ModuleGuid in Pcd.PcdFiledValueFromDscComponent:
 
-            CApp = CApp + "void Assign_%s_%s_%s_Value(%s *Pcd){\n" % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, ModuleGuid,Pcd.BaseDatumType)
+            CApp = CApp + "void Assign_%s_%s_%s_Value(%s *Pcd){\n" % (
+                Pcd.TokenSpaceGuidCName, Pcd.TokenCName, ModuleGuid, Pcd.BaseDatumType)
             CApp = CApp + '  UINT32  FieldSize;\n'
             CApp = CApp + '  CHAR8   *Value;\n'
-            pcddefaultvalue, file_path,lineNo = Pcd.PcdValueFromComponents.get(ModuleGuid,(None,None,None))
+            pcddefaultvalue, file_path, lineNo = Pcd.PcdValueFromComponents.get(
+                ModuleGuid, (None, None, None))
 
             if pcddefaultvalue:
                 IsArray = _IsFieldValueAnArray(pcddefaultvalue)
                 if IsArray:
                     try:
-                        FieldList = ValueExpressionEx(pcddefaultvalue, TAB_VOID)(True)
+                        FieldList = ValueExpressionEx(
+                            pcddefaultvalue, TAB_VOID)(True)
                     except BadExpression:
                         EdkLogger.error("Build", FORMAT_INVALID, "Invalid value format for %s.%s, from %s Line %s: %s" %
                                         (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, file_path, lineNo, FieldList))
-                Value, ValueSize = ParseFieldValue (FieldList)
+                Value, ValueSize = ParseFieldValue(FieldList)
 
                 if isinstance(Value, str):
-                    CApp = CApp + '  Pcd = %s; // From %s Line %s \n' % (Value, file_path, lineNo)
+                    CApp = CApp + \
+                        '  Pcd = %s; // From %s Line %s \n' % (
+                            Value, file_path, lineNo)
                 elif IsArray:
-                #
-                # Use memcpy() to copy value into field
-                #
-                    CApp = CApp + '  Value     = %s; // From %s Line %s.\n' % (DscBuildData.IntToCString(Value, ValueSize), file_path, lineNo)
+                    #
+                    # Use memcpy() to copy value into field
+                    #
+                    CApp = CApp + '  Value     = %s; // From %s Line %s.\n' % (
+                        DscBuildData.IntToCString(Value, ValueSize), file_path, lineNo)
                     CApp = CApp + '  memcpy (Pcd, Value, %d);\n' % (ValueSize)
 
-
             PcdFiledValue = Pcd.PcdFiledValueFromDscComponent.get(ModuleGuid)
             for index in PcdFiledValue:
                 FieldList = PcdFiledValue[index]
@@ -2417,118 +2728,151 @@ class DscBuildData(PlatformBuildClassObject):
                     IsArray = _IsFieldValueAnArray(FieldList[FieldName][0])
                     if IsArray:
                         try:
-                            FieldList[FieldName][0] = ValueExpressionEx(FieldList[FieldName][0], TAB_VOID, self._GuidDict)(True)
+                            FieldList[FieldName][0] = ValueExpressionEx(
+                                FieldList[FieldName][0], TAB_VOID, self._GuidDict)(True)
                         except BadExpression:
                             EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " %
                                             (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[FieldName][2]))
                         except:
                             print("error")
                     try:
-                        Value, ValueSize = ParseFieldValue (FieldList[FieldName][0])
+                        Value, ValueSize = ParseFieldValue(
+                            FieldList[FieldName][0])
                     except Exception:
-                        EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " % (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[FieldName][2]))
+                        EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " % (".".join(
+                            (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[FieldName][2]))
                     if isinstance(Value, str):
-                        CApp = CApp + '  Pcd->%s = %s; // From %s Line %d Value %s\n' % (FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                        CApp = CApp + '  Pcd->%s = %s; // From %s Line %d Value %s\n' % (
+                            FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
                     elif IsArray:
-                    #
-                    # Use memcpy() to copy value into field
-                    #
-                        CApp = CApp + '  FieldSize = __FIELD_SIZE(%s, %s);\n' % (Pcd.BaseDatumType, FieldName)
-                        CApp = CApp + '  Value     = %s; // From %s Line %d Value %s\n' % (DscBuildData.IntToCString(Value, ValueSize), FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
-                        CApp = CApp + '  __STATIC_ASSERT((__FIELD_SIZE(%s, %s) >= %d) || (__FIELD_SIZE(%s, %s) == 0), "Input buffer exceeds the buffer array"); // From %s Line %d Value %s\n' % (Pcd.BaseDatumType, FieldName, ValueSize, Pcd.BaseDatumType, FieldName, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
-                        CApp = CApp + '  memcpy (&Pcd->%s, Value, (FieldSize > 0 && FieldSize < %d) ? FieldSize : %d);\n' % (FieldName, ValueSize, ValueSize)
+                        #
+                        # Use memcpy() to copy value into field
+                        #
+                        CApp = CApp + \
+                            '  FieldSize = __FIELD_SIZE(%s, %s);\n' % (
+                                Pcd.BaseDatumType, FieldName)
+                        CApp = CApp + '  Value     = %s; // From %s Line %d Value %s\n' % (DscBuildData.IntToCString(
+                            Value, ValueSize), FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                        CApp = CApp + '  __STATIC_ASSERT((__FIELD_SIZE(%s, %s) >= %d) || (__FIELD_SIZE(%s, %s) == 0), "Input buffer exceeds the buffer array"); // From %s Line %d Value %s\n' % (
+                            Pcd.BaseDatumType, FieldName, ValueSize, Pcd.BaseDatumType, FieldName, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                        CApp = CApp + '  memcpy (&Pcd->%s, Value, (FieldSize > 0 && FieldSize < %d) ? FieldSize : %d);\n' % (
+                            FieldName, ValueSize, ValueSize)
                     else:
                         if '[' in FieldName and ']' in FieldName:
                             Index = int(FieldName.split('[')[1].split(']')[0])
-                            CApp = CApp + '  __STATIC_ASSERT((%d < __ARRAY_SIZE(Pcd->%s)) || (__ARRAY_SIZE(Pcd->%s) == 0), "array index exceeds the array number"); // From %s Line %d Index of %s\n' % (Index, FieldName.split('[')[0], FieldName.split('[')[0], FieldList[FieldName][1], FieldList[FieldName][2], FieldName)
+                            CApp = CApp + '  __STATIC_ASSERT((%d < __ARRAY_SIZE(Pcd->%s)) || (__ARRAY_SIZE(Pcd->%s) == 0), "array index exceeds the array number"); // From %s Line %d Index of %s\n' % (
+                                Index, FieldName.split('[')[0], FieldName.split('[')[0], FieldList[FieldName][1], FieldList[FieldName][2], FieldName)
                         if ValueSize > 4:
-                            CApp = CApp + '  Pcd->%s = %dULL; // From %s Line %d Value %s\n' % (FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                            CApp = CApp + '  Pcd->%s = %dULL; // From %s Line %d Value %s\n' % (
+                                FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
                         else:
-                            CApp = CApp + '  Pcd->%s = %d; // From %s Line %d Value %s\n' % (FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                            CApp = CApp + '  Pcd->%s = %d; // From %s Line %d Value %s\n' % (
+                                FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
             CApp = CApp + "}\n"
         return CApp
 
     @staticmethod
     def GenerateCommandLineValueStatement(Pcd):
-        CApp = '  Assign_%s_%s_CommandLine_Value(Pcd);\n' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
+        CApp = '  Assign_%s_%s_CommandLine_Value(Pcd);\n' % (
+            Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
         return CApp
-    def GenerateFdfValue(self,Pcd):
+
+    def GenerateFdfValue(self, Pcd):
         CApp = "// Value in Fdf\n"
-        CApp = CApp + "void Assign_%s_%s_Fdf_Value(%s *Pcd){\n" % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName,Pcd.BaseDatumType)
+        CApp = CApp + "void Assign_%s_%s_Fdf_Value(%s *Pcd){\n" % (
+            Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.BaseDatumType)
         CApp = CApp + '  UINT32  FieldSize;\n'
         CApp = CApp + '  CHAR8   *Value;\n'
 
         pcddefaultvalue = Pcd.PcdValueFromFdf
-        for FieldList in [pcddefaultvalue,Pcd.PcdFieldValueFromFdf]:
+        for FieldList in [pcddefaultvalue, Pcd.PcdFieldValueFromFdf]:
             if not FieldList:
                 continue
             if pcddefaultvalue and FieldList == pcddefaultvalue:
                 IsArray = _IsFieldValueAnArray(FieldList)
                 if IsArray:
                     try:
-                        FieldList = ValueExpressionEx(FieldList, TAB_VOID)(True)
+                        FieldList = ValueExpressionEx(
+                            FieldList, TAB_VOID)(True)
                     except BadExpression:
                         EdkLogger.error("Build", FORMAT_INVALID, "Invalid value format for %s.%s, from Fdf: %s" %
                                         (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldList))
-                Value, ValueSize = ParseFieldValue (FieldList)
+                Value, ValueSize = ParseFieldValue(FieldList)
 
                 if isinstance(Value, str):
                     CApp = CApp + '  Pcd = %s; // From Fdf \n' % (Value)
                 elif IsArray:
-                #
-                # Use memcpy() to copy value into field
-                #
-                    CApp = CApp + '  Value     = %s; // From Fdf .\n' % (DscBuildData.IntToCString(Value, ValueSize))
+                    #
+                    # Use memcpy() to copy value into field
+                    #
+                    CApp = CApp + \
+                        '  Value     = %s; // From Fdf .\n' % (
+                            DscBuildData.IntToCString(Value, ValueSize))
                     CApp = CApp + '  memcpy (Pcd, Value, %d);\n' % (ValueSize)
                 continue
             for FieldName in FieldList:
                 IsArray = _IsFieldValueAnArray(FieldList[FieldName][0])
                 if IsArray:
                     try:
-                        FieldList[FieldName][0] = ValueExpressionEx(FieldList[FieldName][0], TAB_VOID, self._GuidDict)(True)
+                        FieldList[FieldName][0] = ValueExpressionEx(
+                            FieldList[FieldName][0], TAB_VOID, self._GuidDict)(True)
                     except BadExpression:
                         EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " %
                                         (".".join((Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[FieldName][2]))
                     except:
                         print("error")
                 try:
-                    Value, ValueSize = ParseFieldValue (FieldList[FieldName][0])
+                    Value, ValueSize = ParseFieldValue(FieldList[FieldName][0])
                 except Exception:
-                    EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " % (".".join((Pcd.TokenSpaceGuidCName,Pcd.TokenCName,FieldName)),FieldList[FieldName][1], FieldList[FieldName][2]))
+                    EdkLogger.error('Build', FORMAT_INVALID, "Invalid value format for %s. From %s Line %d " % (".".join(
+                        (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, FieldName)), FieldList[FieldName][1], FieldList[FieldName][2]))
                 if isinstance(Value, str):
-                    CApp = CApp + '  Pcd->%s = %s; // From %s Line %d Value %s\n' % (FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                    CApp = CApp + '  Pcd->%s = %s; // From %s Line %d Value %s\n' % (
+                        FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
                 elif IsArray:
-                #
-                # Use memcpy() to copy value into field
-                #
-                    CApp = CApp + '  FieldSize = __FIELD_SIZE(%s, %s);\n' % (Pcd.BaseDatumType, FieldName)
-                    CApp = CApp + '  Value     = %s; // From %s Line %d Value %s\n' % (DscBuildData.IntToCString(Value, ValueSize), FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
-                    CApp = CApp + '  __STATIC_ASSERT((__FIELD_SIZE(%s, %s) >= %d) || (__FIELD_SIZE(%s, %s) == 0), "Input buffer exceeds the buffer array"); // From %s Line %d Value %s\n' % (Pcd.BaseDatumType, FieldName, ValueSize, Pcd.BaseDatumType, FieldName, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
-                    CApp = CApp + '  memcpy (&Pcd->%s, Value, (FieldSize > 0 && FieldSize < %d) ? FieldSize : %d);\n' % (FieldName, ValueSize, ValueSize)
+                    #
+                    # Use memcpy() to copy value into field
+                    #
+                    CApp = CApp + \
+                        '  FieldSize = __FIELD_SIZE(%s, %s);\n' % (
+                            Pcd.BaseDatumType, FieldName)
+                    CApp = CApp + '  Value     = %s; // From %s Line %d Value %s\n' % (DscBuildData.IntToCString(
+                        Value, ValueSize), FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                    CApp = CApp + '  __STATIC_ASSERT((__FIELD_SIZE(%s, %s) >= %d) || (__FIELD_SIZE(%s, %s) == 0), "Input buffer exceeds the buffer array"); // From %s Line %d Value %s\n' % (
+                        Pcd.BaseDatumType, FieldName, ValueSize, Pcd.BaseDatumType, FieldName, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                    CApp = CApp + '  memcpy (&Pcd->%s, Value, (FieldSize > 0 && FieldSize < %d) ? FieldSize : %d);\n' % (
+                        FieldName, ValueSize, ValueSize)
                 else:
                     if '[' in FieldName and ']' in FieldName:
                         Index = int(FieldName.split('[')[1].split(']')[0])
-                        CApp = CApp + '  __STATIC_ASSERT((%d < __ARRAY_SIZE(Pcd->%s)) || (__ARRAY_SIZE(Pcd->%s) == 0), "array index exceeds the array number"); // From %s Line %d Index of %s\n' % (Index, FieldName.split('[')[0], FieldName.split('[')[0], FieldList[FieldName][1], FieldList[FieldName][2], FieldName)
+                        CApp = CApp + '  __STATIC_ASSERT((%d < __ARRAY_SIZE(Pcd->%s)) || (__ARRAY_SIZE(Pcd->%s) == 0), "array index exceeds the array number"); // From %s Line %d Index of %s\n' % (
+                            Index, FieldName.split('[')[0], FieldName.split('[')[0], FieldList[FieldName][1], FieldList[FieldName][2], FieldName)
                     if ValueSize > 4:
-                        CApp = CApp + '  Pcd->%s = %dULL; // From %s Line %d Value %s\n' % (FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                        CApp = CApp + '  Pcd->%s = %dULL; // From %s Line %d Value %s\n' % (
+                            FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
                     else:
-                        CApp = CApp + '  Pcd->%s = %d; // From %s Line %s Value %s\n' % (FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
+                        CApp = CApp + '  Pcd->%s = %d; // From %s Line %s Value %s\n' % (
+                            FieldName, Value, FieldList[FieldName][1], FieldList[FieldName][2], FieldList[FieldName][0])
         CApp = CApp + "}\n"
         return CApp
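
A side note on the memcpy() size expression generated just above: it copies the smaller of the field size and the value size, with a FieldSize of 0 (a flexible array member) meaning the whole value is copied. A minimal Python restatement of that clamp, with invented sizes in the sample calls:

    def bytes_to_copy(field_size, value_size):
        # Mirrors (FieldSize > 0 && FieldSize < ValueSize) ? FieldSize : ValueSize
        return field_size if 0 < field_size < value_size else value_size

    print(bytes_to_copy(8, 16))   # fixed 8-byte field, 16-byte value -> copy 8
    print(bytes_to_copy(0, 16))   # flexible array member (size 0)    -> copy 16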
 
     @staticmethod
     def GenerateFdfValueStatement(Pcd):
-        CApp = '  Assign_%s_%s_Fdf_Value(Pcd);\n' % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
+        CApp = '  Assign_%s_%s_Fdf_Value(Pcd);\n' % (
+            Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
         return CApp
 
     @staticmethod
     def GenerateModuleValueStatement(module_guid, Pcd):
-        CApp = "  Assign_%s_%s_%s_Value(Pcd);\n" % (Pcd.TokenSpaceGuidCName, Pcd.TokenCName, module_guid)
+        CApp = "  Assign_%s_%s_%s_Value(Pcd);\n" % (
+            Pcd.TokenSpaceGuidCName, Pcd.TokenCName, module_guid)
         return CApp
-    def GenerateModuleScopeInitializeFunc(self,SkuName, Pcd,  InitByteValue, CApp):
+
+    def GenerateModuleScopeInitializeFunc(self, SkuName, Pcd,  InitByteValue, CApp):
         for module_guid in Pcd.PcdFiledValueFromDscComponent:
             CApp = CApp + 'void\n'
-            CApp = CApp + 'Initialize_%s_%s_%s_%s(\n' % (module_guid, TAB_DEFAULT_STORES_DEFAULT, Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
+            CApp = CApp + 'Initialize_%s_%s_%s_%s(\n' % (
+                module_guid, TAB_DEFAULT_STORES_DEFAULT, Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
             CApp = CApp + '  void\n'
             CApp = CApp + '  )\n'
             CApp = CApp + '{\n'
@@ -2538,16 +2882,19 @@ class DscBuildData(PlatformBuildClassObject):
             CApp = CApp + '  UINT32  OriginalSize;\n'
             CApp = CApp + '  VOID    *OriginalPcd;\n'
 
-            CApp = CApp + '  %s      *Pcd;  // From %s Line %d \n' % (Pcd.BaseDatumType,Pcd.PkgPath, Pcd.PcdDefineLineNo)
+            CApp = CApp + '  %s      *Pcd;  // From %s Line %d \n' % (
+                Pcd.BaseDatumType, Pcd.PkgPath, Pcd.PcdDefineLineNo)
 
             CApp = CApp + '\n'
 
             PcdDefaultValue = StringToArray(Pcd.DefaultValueFromDec.strip())
-            InitByteValue += '%s.%s.%s.%s|%s|%s\n' % (module_guid, TAB_DEFAULT_STORES_DEFAULT, Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.DatumType, PcdDefaultValue)
+            InitByteValue += '%s.%s.%s.%s|%s|%s\n' % (module_guid, TAB_DEFAULT_STORES_DEFAULT,
+                                                      Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.DatumType, PcdDefaultValue)
             #
             # Get current PCD value and size
             #
-            CApp = CApp + '  OriginalPcd = PcdGetPtr (%s, %s, %s, %s, &OriginalSize);\n' % (module_guid, TAB_DEFAULT_STORES_DEFAULT, Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
+            CApp = CApp + '  OriginalPcd = PcdGetPtr (%s, %s, %s, %s, &OriginalSize);\n' % (
+                module_guid, TAB_DEFAULT_STORES_DEFAULT, Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
 
             #
             # Determine the size of the PCD.  For simple structures, sizeof(TYPE) provides
@@ -2557,18 +2904,25 @@ class DscBuildData(PlatformBuildClassObject):
             # in a structure.  The size formula for this case is:
             # OFFSET_OF(FlexbleArrayField) + sizeof(FlexibleArray[0]) * (HighestIndex + 1)
             #
-            CApp = CApp + DscBuildData.GenerateSizeStatments(Pcd,SkuName,TAB_DEFAULT_STORES_DEFAULT)
+            CApp = CApp + \
+                DscBuildData.GenerateSizeStatments(
+                    Pcd, SkuName, TAB_DEFAULT_STORES_DEFAULT)
             if Pcd.IsArray() and Pcd.Capacity[-1] != "-1":
-                CApp = CApp + '  OriginalSize = OriginalSize < sizeof(%s) * %d? OriginalSize:sizeof(%s) * %d; \n' % (Pcd.BaseDatumType,Pcd.PcdArraySize(),Pcd.BaseDatumType,Pcd.PcdArraySize())
-                CApp = CApp + '  Size = sizeof(%s) * %d; \n' % (Pcd.BaseDatumType,Pcd.PcdArraySize())
+                CApp = CApp + '  OriginalSize = OriginalSize < sizeof(%s) * %d? OriginalSize:sizeof(%s) * %d; \n' % (
+                    Pcd.BaseDatumType, Pcd.PcdArraySize(), Pcd.BaseDatumType, Pcd.PcdArraySize())
+                CApp = CApp + \
+                    '  Size = sizeof(%s) * %d; \n' % (Pcd.BaseDatumType,
+                                                      Pcd.PcdArraySize())
 
             #
             # Allocate and zero buffer for the PCD
             # Must handle cases where current value is smaller, larger, or same size
             # Always keep that larger one as the current size
             #
-            CApp = CApp + '  Size = (OriginalSize > Size ? OriginalSize : Size);\n'
-            CApp = CApp + '  Pcd     = (%s *)malloc (Size);\n' % (Pcd.BaseDatumType,)
+            CApp = CApp + \
+                '  Size = (OriginalSize > Size ? OriginalSize : Size);\n'
+            CApp = CApp + \
+                '  Pcd     = (%s *)malloc (Size);\n' % (Pcd.BaseDatumType,)
             CApp = CApp + '  memset (Pcd, 0, Size);\n'
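
The Size fed to the malloc() above follows the flexible-array rule from the comment earlier: offset of the flexible array field plus sizeof(one element) times (highest index used + 1). A small sketch of that arithmetic, using made-up struct layout numbers rather than anything from the patch:

    # Hypothetical layout: the flexible array starts at byte 8, each element is 4 bytes.
    OFFSET_OF_FLEXIBLE_FIELD = 8
    SIZEOF_FLEXIBLE_ELEMENT = 4

    def pcd_buffer_size(highest_index_used):
        # OFFSET_OF(FlexibleArrayField) + sizeof(FlexibleArray[0]) * (HighestIndex + 1)
        return OFFSET_OF_FLEXIBLE_FIELD + SIZEOF_FLEXIBLE_ELEMENT * (highest_index_used + 1)

    print(pcd_buffer_size(3))   # 8 + 4 * (3 + 1) == 24 bytes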
 
             #
@@ -2582,15 +2936,18 @@ class DscBuildData(PlatformBuildClassObject):
             CApp = CApp + DscBuildData.GenerateDefaultValueAssignStatement(Pcd)
 
             CApp = CApp + "// SkuName: %s,  DefaultStoreName: STANDARD \n" % self.SkuIdMgr.SystemSkuId
-            CApp = CApp + DscBuildData.GenerateInitValueStatement(Pcd, self.SkuIdMgr.SystemSkuId, TAB_DEFAULT_STORES_DEFAULT)
-            CApp = CApp + DscBuildData.GenerateModuleValueStatement(module_guid,Pcd)
+            CApp = CApp + DscBuildData.GenerateInitValueStatement(
+                Pcd, self.SkuIdMgr.SystemSkuId, TAB_DEFAULT_STORES_DEFAULT)
+            CApp = CApp + \
+                DscBuildData.GenerateModuleValueStatement(module_guid, Pcd)
             CApp = CApp + DscBuildData.GenerateFdfValueStatement(Pcd)
             CApp = CApp + DscBuildData.GenerateCommandLineValueStatement(Pcd)
 
             #
             # Set new PCD value and size
             #
-            CApp = CApp + '  PcdSetPtr (%s, %s, %s, %s, Size, (void *)Pcd);\n' % (module_guid, TAB_DEFAULT_STORES_DEFAULT, Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
+            CApp = CApp + '  PcdSetPtr (%s, %s, %s, %s, Size, (void *)Pcd);\n' % (
+                module_guid, TAB_DEFAULT_STORES_DEFAULT, Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
 
             #
             # Free PCD
@@ -2598,17 +2955,18 @@ class DscBuildData(PlatformBuildClassObject):
             CApp = CApp + '  free (Pcd);\n'
             CApp = CApp + '}\n'
             CApp = CApp + '\n'
-        return InitByteValue,CApp
+        return InitByteValue, CApp
 
     def GenerateInitializeFunc(self, SkuName, DefaultStore, Pcd, InitByteValue, CApp):
-        OverrideValues = {DefaultStore:{}}
+        OverrideValues = {DefaultStore: {}}
         if Pcd.SkuOverrideValues:
             OverrideValues = Pcd.SkuOverrideValues[SkuName]
         if not OverrideValues:
-            OverrideValues = {TAB_DEFAULT_STORES_DEFAULT:Pcd.DefaultValues}
+            OverrideValues = {TAB_DEFAULT_STORES_DEFAULT: Pcd.DefaultValues}
         for DefaultStoreName in OverrideValues:
             CApp = CApp + 'void\n'
-            CApp = CApp + 'Initialize_%s_%s_%s_%s(\n' % (SkuName, DefaultStoreName, Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
+            CApp = CApp + 'Initialize_%s_%s_%s_%s(\n' % (
+                SkuName, DefaultStoreName, Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
             CApp = CApp + '  void\n'
             CApp = CApp + '  )\n'
             CApp = CApp + '{\n'
@@ -2618,18 +2976,21 @@ class DscBuildData(PlatformBuildClassObject):
             CApp = CApp + '  UINT32  OriginalSize;\n'
             CApp = CApp + '  VOID    *OriginalPcd;\n'
 
-            CApp = CApp + '  %s      *Pcd;  // From %s Line %d \n' % (Pcd.BaseDatumType,Pcd.PkgPath, Pcd.PcdDefineLineNo)
+            CApp = CApp + '  %s      *Pcd;  // From %s Line %d \n' % (
+                Pcd.BaseDatumType, Pcd.PkgPath, Pcd.PcdDefineLineNo)
 
             CApp = CApp + '\n'
 
             PcdDefaultValue = StringToArray(Pcd.DefaultValueFromDec.strip())
 
-            InitByteValue += '%s.%s.%s.%s|%s|%s\n' % (SkuName, DefaultStoreName, Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.DatumType, PcdDefaultValue)
+            InitByteValue += '%s.%s.%s.%s|%s|%s\n' % (
+                SkuName, DefaultStoreName, Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Pcd.DatumType, PcdDefaultValue)
 
             #
             # Get current PCD value and size
             #
-            CApp = CApp + '  OriginalPcd = PcdGetPtr (%s, %s, %s, %s, &OriginalSize);\n' % (SkuName, DefaultStoreName, Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
+            CApp = CApp + '  OriginalPcd = PcdGetPtr (%s, %s, %s, %s, &OriginalSize);\n' % (
+                SkuName, DefaultStoreName, Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
 
             #
             # Determine the size of the PCD.  For simple structures, sizeof(TYPE) provides
@@ -2639,18 +3000,25 @@ class DscBuildData(PlatformBuildClassObject):
             # in a structure.  The size formula for this case is:
             # OFFSET_OF(FlexbleArrayField) + sizeof(FlexibleArray[0]) * (HighestIndex + 1)
             #
-            CApp = CApp + DscBuildData.GenerateSizeStatments(Pcd,SkuName,DefaultStoreName)
+            CApp = CApp + \
+                DscBuildData.GenerateSizeStatments(
+                    Pcd, SkuName, DefaultStoreName)
             if Pcd.IsArray() and Pcd.Capacity[-1] != "-1":
-                CApp = CApp + '  OriginalSize = OriginalSize < sizeof(%s) * %d? OriginalSize:sizeof(%s) * %d; \n' % (Pcd.BaseDatumType,Pcd.PcdArraySize(),Pcd.BaseDatumType,Pcd.PcdArraySize())
-                CApp = CApp + '  Size = sizeof(%s) * %d; \n' % (Pcd.BaseDatumType,Pcd.PcdArraySize())
+                CApp = CApp + '  OriginalSize = OriginalSize < sizeof(%s) * %d? OriginalSize:sizeof(%s) * %d; \n' % (
+                    Pcd.BaseDatumType, Pcd.PcdArraySize(), Pcd.BaseDatumType, Pcd.PcdArraySize())
+                CApp = CApp + \
+                    '  Size = sizeof(%s) * %d; \n' % (Pcd.BaseDatumType,
+                                                      Pcd.PcdArraySize())
 
             #
             # Allocate and zero buffer for the PCD
             # Must handle cases where current value is smaller, larger, or same size
             # Always keep that larger one as the current size
             #
-            CApp = CApp + '  Size = (OriginalSize > Size ? OriginalSize : Size);\n'
-            CApp = CApp + '  Pcd     = (%s *)malloc (Size);\n' % (Pcd.BaseDatumType,)
+            CApp = CApp + \
+                '  Size = (OriginalSize > Size ? OriginalSize : Size);\n'
+            CApp = CApp + \
+                '  Pcd     = (%s *)malloc (Size);\n' % (Pcd.BaseDatumType,)
             CApp = CApp + '  memset (Pcd, 0, Size);\n'
 
             #
@@ -2663,23 +3031,30 @@ class DscBuildData(PlatformBuildClassObject):
             #
             CApp = CApp + DscBuildData.GenerateDefaultValueAssignStatement(Pcd)
             if Pcd.Type not in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
-                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
+                                self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
                 for skuname in self.SkuIdMgr.GetSkuChain(SkuName):
-                    storeset = [DefaultStoreName] if DefaultStoreName == TAB_DEFAULT_STORES_DEFAULT else [TAB_DEFAULT_STORES_DEFAULT, DefaultStoreName]
+                    storeset = [DefaultStoreName] if DefaultStoreName == TAB_DEFAULT_STORES_DEFAULT else [
+                        TAB_DEFAULT_STORES_DEFAULT, DefaultStoreName]
                     for defaultstorenameitem in storeset:
-                        CApp = CApp + "// SkuName: %s,  DefaultStoreName: %s \n" % (skuname, defaultstorenameitem)
-                        CApp = CApp + DscBuildData.GenerateInitValueStatement(Pcd, skuname, defaultstorenameitem)
+                        CApp = CApp + \
+                            "// SkuName: %s,  DefaultStoreName: %s \n" % (
+                                skuname, defaultstorenameitem)
+                        CApp = CApp + \
+                            DscBuildData.GenerateInitValueStatement(
+                                Pcd, skuname, defaultstorenameitem)
                     if skuname == SkuName:
                         break
             else:
                 CApp = CApp + "// SkuName: %s,  DefaultStoreName: STANDARD \n" % self.SkuIdMgr.SystemSkuId
-                CApp = CApp + DscBuildData.GenerateInitValueStatement(Pcd, self.SkuIdMgr.SystemSkuId, TAB_DEFAULT_STORES_DEFAULT)
+                CApp = CApp + DscBuildData.GenerateInitValueStatement(
+                    Pcd, self.SkuIdMgr.SystemSkuId, TAB_DEFAULT_STORES_DEFAULT)
             CApp = CApp + DscBuildData.GenerateFdfValueStatement(Pcd)
             CApp = CApp + DscBuildData.GenerateCommandLineValueStatement(Pcd)
             #
             # Set new PCD value and size
             #
-            CApp = CApp + '  PcdSetPtr (%s, %s, %s, %s, Size, (void *)Pcd);\n' % (SkuName, DefaultStoreName, Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
+            CApp = CApp + '  PcdSetPtr (%s, %s, %s, %s, Size, (void *)Pcd);\n' % (
+                SkuName, DefaultStoreName, Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
 
             #
             # Free PCD
@@ -2699,37 +3074,48 @@ class DscBuildData(PlatformBuildClassObject):
 
         Value = Pcd.DefaultValueFromDec
         if "{CODE(" in Pcd.DefaultValueFromDec:
-            realvalue = Pcd.DefaultValueFromDec.strip()[6:-2] # "{CODE(").rstrip(")}"
-            CApp += "static %s %s_%s_INIT_Value%s = %s;\n" % (Pcd.BaseDatumType,Pcd.TokenSpaceGuidCName,Pcd.TokenCName,Demesion,realvalue)
+            # "{CODE(").rstrip(")}"
+            realvalue = Pcd.DefaultValueFromDec.strip()[6:-2]
+            CApp += "static %s %s_%s_INIT_Value%s = %s;\n" % (
+                Pcd.BaseDatumType, Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Demesion, realvalue)
 
         if Pcd.Type in PCD_DYNAMIC_TYPE_SET | PCD_DYNAMIC_EX_TYPE_SET:
             for skuname in Pcd.SkuInfoList:
                 skuinfo = Pcd.SkuInfoList[skuname]
                 if skuinfo.VariableName:
                     for defaultstore in skuinfo.DefaultStoreDict:
-                        pcddscrawdefaultvalue = self.GetPcdDscRawDefaultValue(Pcd, skuname, defaultstore)
+                        pcddscrawdefaultvalue = self.GetPcdDscRawDefaultValue(
+                            Pcd, skuname, defaultstore)
                         if pcddscrawdefaultvalue:
                             Value = skuinfo.DefaultStoreDict[defaultstore]
                             if "{CODE(" in Value:
-                                realvalue = Value.strip()[6:-2] # "{CODE(").rstrip(")}"
-                                CApp += "static %s %s_%s_%s_%s_Value%s = %s;\n" % (Pcd.BaseDatumType,Pcd.TokenSpaceGuidCName,Pcd.TokenCName,skuname,defaultstore,Demesion,realvalue)
+                                # "{CODE(").rstrip(")}"
+                                realvalue = Value.strip()[6:-2]
+                                CApp += "static %s %s_%s_%s_%s_Value%s = %s;\n" % (
+                                    Pcd.BaseDatumType, Pcd.TokenSpaceGuidCName, Pcd.TokenCName, skuname, defaultstore, Demesion, realvalue)
                 else:
-                    pcddscrawdefaultvalue = self.GetPcdDscRawDefaultValue(Pcd, skuname, TAB_DEFAULT_STORES_DEFAULT)
+                    pcddscrawdefaultvalue = self.GetPcdDscRawDefaultValue(
+                        Pcd, skuname, TAB_DEFAULT_STORES_DEFAULT)
                     if pcddscrawdefaultvalue:
                         Value = skuinfo.DefaultValue
                         if "{CODE(" in Value:
-                            realvalue = Value.strip()[6:-2] # "{CODE(").rstrip(")}"
-                            CApp += "static %s %s_%s_%s_%s_Value%s = %s;\n" % (Pcd.BaseDatumType,Pcd.TokenSpaceGuidCName,Pcd.TokenCName,skuname,TAB_DEFAULT_STORES_DEFAULT,Demesion,realvalue)
+                            # "{CODE(").rstrip(")}"
+                            realvalue = Value.strip()[6:-2]
+                            CApp += "static %s %s_%s_%s_%s_Value%s = %s;\n" % (
+                                Pcd.BaseDatumType, Pcd.TokenSpaceGuidCName, Pcd.TokenCName, skuname, TAB_DEFAULT_STORES_DEFAULT, Demesion, realvalue)
         else:
-            pcddscrawdefaultvalue = self.GetPcdDscRawDefaultValue(Pcd, TAB_DEFAULT, TAB_DEFAULT_STORES_DEFAULT)
+            pcddscrawdefaultvalue = self.GetPcdDscRawDefaultValue(
+                Pcd, TAB_DEFAULT, TAB_DEFAULT_STORES_DEFAULT)
             if pcddscrawdefaultvalue:
                 if "{CODE(" in Pcd.DefaultValue:
-                    realvalue = Pcd.DefaultValue.strip()[6:-2] # "{CODE(").rstrip(")}"
-                    CApp += "static %s %s_%s_DEFAULT_STANDARD_Value%s = %s;\n" % (Pcd.BaseDatumType,Pcd.TokenSpaceGuidCName,Pcd.TokenCName,Demesion,realvalue)
+                    # "{CODE(").rstrip(")}"
+                    realvalue = Pcd.DefaultValue.strip()[6:-2]
+                    CApp += "static %s %s_%s_DEFAULT_STANDARD_Value%s = %s;\n" % (
+                        Pcd.BaseDatumType, Pcd.TokenSpaceGuidCName, Pcd.TokenCName, Demesion, realvalue)
 
         return CApp
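
For readers puzzling over the repeated [6:-2] slices above: they peel the '{CODE(' prefix and ')}' suffix off a C-code default value. A tiny standalone illustration (the value itself is invented):

    value = '{CODE({0x1, 0x2, 0x3})}'
    realvalue = value.strip()[6:-2]   # drop the leading '{CODE(' and the trailing ')}'
    print(realvalue)                  # {0x1, 0x2, 0x3}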
 
-    def SkuOverrideValuesEmpty(self,OverrideValues):
+    def SkuOverrideValuesEmpty(self, OverrideValues):
         if not OverrideValues:
             return True
         for key in OverrideValues:
@@ -2743,14 +3129,15 @@ class DscBuildData(PlatformBuildClassObject):
         i = 0
         while i < len(ccflaglist):
             item = ccflaglist[i].strip()
-            if item in (r"/D", r"/U","-D","-U"):
-                ccflags.add(" ".join((ccflaglist[i],ccflaglist[i+1])))
+            if item in (r"/D", r"/U", "-D", "-U"):
+                ccflags.add(" ".join((ccflaglist[i], ccflaglist[i+1])))
                 i = i+1
-            elif item.startswith((r"/D", r"/U","-D","-U")):
+            elif item.startswith((r"/D", r"/U", "-D", "-U")):
                 ccflags.add(item)
-            i +=1
+            i += 1
         return ccflags
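
To see what ParseCCFlags collects, here is the same grouping logic as a free function together with a sample call; the flag list is made up:

    def parse_cc_flags(ccflaglist):
        # Keep only /D, /U, -D, -U style options; join "-D NAME" pairs into one item.
        ccflags = set()
        i = 0
        while i < len(ccflaglist):
            item = ccflaglist[i].strip()
            if item in (r"/D", r"/U", "-D", "-U"):
                ccflags.add(" ".join((ccflaglist[i], ccflaglist[i + 1])))
                i = i + 1
            elif item.startswith((r"/D", r"/U", "-D", "-U")):
                ccflags.add(item)
            i += 1
        return ccflags

    print(parse_cc_flags(["-D", "FOO=1", "-DBAR", "-O2"]))
    # {'-D FOO=1', '-DBAR'}   (options such as -O2 are ignored)
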
-    def GenerateByteArrayValue (self, StructuredPcds):
+
+    def GenerateByteArrayValue(self, StructuredPcds):
         #
         # Generate/Compile/Run C application to determine if there are any flexible array members
         #
@@ -2773,10 +3160,10 @@ class DscBuildData(PlatformBuildClassObject):
         for PcdName in sorted(StructuredPcds.keys()):
             Pcd = StructuredPcds[PcdName]
 
-            #create void void Cal_tocken_cname_Size functions
+            # create void void Cal_tocken_cname_Size functions
             CApp = CApp + self.GenerateSizeFunction(Pcd)
 
-            #create void Assign_ functions
+            # create void Assign_ functions
 
             # From DEC
             CApp = CApp + self.GenerateDefaultValueAssignFunction(Pcd)
@@ -2787,30 +3174,36 @@ class DscBuildData(PlatformBuildClassObject):
 
             # From Dsc Global setting
             if self.SkuOverrideValuesEmpty(Pcd.SkuOverrideValues) or Pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
-                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
-                CApp = CApp + self.GenerateInitValueFunction(Pcd, self.SkuIdMgr.SystemSkuId, TAB_DEFAULT_STORES_DEFAULT)
+                                                                                  self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
+                CApp = CApp + self.GenerateInitValueFunction(
+                    Pcd, self.SkuIdMgr.SystemSkuId, TAB_DEFAULT_STORES_DEFAULT)
             else:
                 for SkuName in self.SkuIdMgr.SkuOverrideOrder():
                     if SkuName not in Pcd.SkuOverrideValues:
                         continue
                     for DefaultStoreName in Pcd.SkuOverrideValues[SkuName]:
-                        CApp = CApp + self.GenerateInitValueFunction(Pcd, SkuName, DefaultStoreName)
+                        CApp = CApp + \
+                            self.GenerateInitValueFunction(
+                                Pcd, SkuName, DefaultStoreName)
 
             # From Dsc module scope setting
             CApp = CApp + self.GenerateModuleScopeValue(Pcd)
 
-            #create Initialize_ functions
+            # create Initialize_ functions
             if self.SkuOverrideValuesEmpty(Pcd.SkuOverrideValues) or Pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD],
-                        self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
-                InitByteValue, CApp = self.GenerateInitializeFunc(self.SkuIdMgr.SystemSkuId, TAB_DEFAULT_STORES_DEFAULT, Pcd, InitByteValue, CApp)
-                InitByteValue, CApp =  self.GenerateModuleScopeInitializeFunc(self.SkuIdMgr.SystemSkuId,Pcd,InitByteValue,CApp)
+                                                                                  self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
+                InitByteValue, CApp = self.GenerateInitializeFunc(
+                    self.SkuIdMgr.SystemSkuId, TAB_DEFAULT_STORES_DEFAULT, Pcd, InitByteValue, CApp)
+                InitByteValue, CApp = self.GenerateModuleScopeInitializeFunc(
+                    self.SkuIdMgr.SystemSkuId, Pcd, InitByteValue, CApp)
             else:
                 for SkuName in self.SkuIdMgr.SkuOverrideOrder():
                     if SkuName not in Pcd.SkuOverrideValues:
                         continue
                     for DefaultStoreName in Pcd.DefaultStoreName:
                         Pcd = StructuredPcds[PcdName]
-                        InitByteValue, CApp = self.GenerateInitializeFunc(SkuName, DefaultStoreName, Pcd, InitByteValue, CApp)
+                        InitByteValue, CApp = self.GenerateInitializeFunc(
+                            SkuName, DefaultStoreName, Pcd, InitByteValue, CApp)
 
         CApp = CApp + 'VOID\n'
         CApp = CApp + 'PcdEntryPoint(\n'
@@ -2819,15 +3212,18 @@ class DscBuildData(PlatformBuildClassObject):
         CApp = CApp + '{\n'
         for Pcd in StructuredPcds.values():
             if self.SkuOverrideValuesEmpty(Pcd.SkuOverrideValues) or Pcd.Type in [self._PCD_TYPE_STRING_[MODEL_PCD_FIXED_AT_BUILD], self._PCD_TYPE_STRING_[MODEL_PCD_PATCHABLE_IN_MODULE]]:
-                CApp = CApp + '  Initialize_%s_%s_%s_%s();\n' % (self.SkuIdMgr.SystemSkuId, TAB_DEFAULT_STORES_DEFAULT, Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
+                CApp = CApp + '  Initialize_%s_%s_%s_%s();\n' % (self.SkuIdMgr.SystemSkuId,
+                                                                 TAB_DEFAULT_STORES_DEFAULT, Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
                 for ModuleGuid in Pcd.PcdFiledValueFromDscComponent:
-                    CApp += "  Initialize_%s_%s_%s_%s();\n" % (ModuleGuid,TAB_DEFAULT_STORES_DEFAULT ,Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
+                    CApp += "  Initialize_%s_%s_%s_%s();\n" % (ModuleGuid, TAB_DEFAULT_STORES_DEFAULT,
+                                                               Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
             else:
                 for SkuName in self.SkuIdMgr.SkuOverrideOrder():
                     if SkuName not in self.SkuIdMgr.AvailableSkuIdSet:
                         continue
                     for DefaultStoreName in Pcd.SkuOverrideValues[SkuName]:
-                        CApp = CApp + '  Initialize_%s_%s_%s_%s();\n' % (SkuName, DefaultStoreName, Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
+                        CApp = CApp + '  Initialize_%s_%s_%s_%s();\n' % (SkuName, DefaultStoreName,
+                                                                         Pcd.TokenSpaceGuidCName, Pcd.TokenCName)
         CApp = CApp + '}\n'
 
         CApp = CApp + PcdMainCEntry + '\n'
@@ -2840,11 +3236,13 @@ class DscBuildData(PlatformBuildClassObject):
         # start generating makefile
         MakeApp = PcdMakefileHeader
         if sys.platform == "win32":
-            MakeApp = MakeApp + 'APPFILE = %s\%s.exe\n' % (self.OutputPath, PcdValueInitName) + 'APPNAME = %s\n' % (PcdValueInitName) + 'OBJECTS = %s\%s.obj %s.obj\n' % (self.OutputPath, PcdValueInitName, os.path.join(self.OutputPath, PcdValueCommonName)) + 'INC = '
+            MakeApp = MakeApp + 'APPFILE = %s\%s.exe\n' % (self.OutputPath, PcdValueInitName) + 'APPNAME = %s\n' % (
+                PcdValueInitName) + 'OBJECTS = %s\%s.obj %s.obj\n' % (self.OutputPath, PcdValueInitName, os.path.join(self.OutputPath, PcdValueCommonName)) + 'INC = '
         else:
             MakeApp = MakeApp + PcdGccMakefile
             MakeApp = MakeApp + 'APPFILE = %s/%s\n' % (self.OutputPath, PcdValueInitName) + 'APPNAME = %s\n' % (PcdValueInitName) + 'OBJECTS = %s/%s.o %s.o\n' % (self.OutputPath, PcdValueInitName, os.path.join(self.OutputPath, PcdValueCommonName)) + \
-                      'include $(MAKEROOT)/Makefiles/app.makefile\n' + 'TOOL_INCLUDE +='
+                'include $(MAKEROOT)/Makefiles/app.makefile\n' + \
+                'TOOL_INCLUDE +='
 
         IncSearchList = []
         PlatformInc = OrderedDict()
@@ -2854,15 +3252,19 @@ class DscBuildData(PlatformBuildClassObject):
             if Cache.Includes:
                 if str(Cache.MetaFile.Path) not in PlatformInc:
                     PlatformInc[str(Cache.MetaFile.Path)] = []
-                    PlatformInc[str(Cache.MetaFile.Path)].append (os.path.dirname(Cache.MetaFile.Path))
-                    PlatformInc[str(Cache.MetaFile.Path)].extend (Cache.CommonIncludes)
+                    PlatformInc[str(Cache.MetaFile.Path)].append(
+                        os.path.dirname(Cache.MetaFile.Path))
+                    PlatformInc[str(Cache.MetaFile.Path)].extend(
+                        Cache.CommonIncludes)
 
         PcdDependDEC = []
         for Pcd in StructuredPcds.values():
             for PackageDec in Pcd.PackageDecs:
-                Package = os.path.normpath(mws.join(GlobalData.gWorkspace, PackageDec))
+                Package = os.path.normpath(
+                    mws.join(GlobalData.gWorkspace, PackageDec))
                 if not os.path.exists(Package):
-                    EdkLogger.error('Build', RESOURCE_NOT_AVAILABLE, "The dependent Package %s of PCD %s.%s is not exist." % (PackageDec, Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
+                    EdkLogger.error('Build', RESOURCE_NOT_AVAILABLE, "The dependent Package %s of PCD %s.%s is not exist." % (
+                        PackageDec, Pcd.TokenSpaceGuidCName, Pcd.TokenCName))
                 if Package not in PcdDependDEC:
                     PcdDependDEC.append(Package)
 
@@ -2873,7 +3275,7 @@ class DscBuildData(PlatformBuildClassObject):
                         #
                         # Get list of files in potential -I include path
                         #
-                        FileList = os.listdir (str(inc))
+                        FileList = os.listdir(str(inc))
                         #
                         # Skip -I include path if one of the include files required
                         # by PcdValueInit.c are present in the include paths from
@@ -2881,12 +3283,12 @@ class DscBuildData(PlatformBuildClassObject):
                         # files from the host compiler.
                         #
                         if 'stdio.h' in FileList:
-                          continue
+                            continue
                         if 'stdlib.h' in FileList:
-                          continue
+                            continue
                         if 'string.h' in FileList:
-                          continue
-                        MakeApp += '-I'  + str(inc) + ' '
+                            continue
+                        MakeApp += '-I' + str(inc) + ' '
                         IncSearchList.append(inc)
         MakeApp = MakeApp + '\n'
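
A compact way to picture the -I filtering above, kept pure so it can run anywhere: 'candidates' stands in for the directory listings, and both paths in the sample call are invented.

    HOST_HEADERS = {'stdio.h', 'stdlib.h', 'string.h'}

    def include_flags(candidates):
        # candidates: {include path: set of file names found in it}
        flags = []
        for path, files in candidates.items():
            if HOST_HEADERS & set(files):
                continue              # this path shadows host headers, skip it
            flags.append('-I' + path)
        return ' '.join(flags)

    print(include_flags({
        '/ws/MdePkg/Include': {'Uefi.h', 'Base.h'},
        '/toolchain/include': {'stdio.h', 'stdlib.h'},
    }))
    # -I/ws/MdePkg/Include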
 
@@ -2910,52 +3312,71 @@ class DscBuildData(PlatformBuildClassObject):
                     if 'COMMON' not in BuildOptions:
                         BuildOptions['COMMON'] = set()
                     if Arch == TAB_STAR:
-                        BuildOptions['COMMON']|= self.ParseCCFlags(self.BuildOptions[Options])
+                        BuildOptions['COMMON'] |= self.ParseCCFlags(
+                            self.BuildOptions[Options])
                     if Arch in self.SupArchList:
                         if Arch not in BuildOptions:
                             BuildOptions[Arch] = set()
-                        BuildOptions[Arch] |= self.ParseCCFlags(self.BuildOptions[Options])
+                        BuildOptions[Arch] |= self.ParseCCFlags(
+                            self.BuildOptions[Options])
 
         if BuildOptions:
-            ArchBuildOptions = {arch:flags for arch,flags in BuildOptions.items() if arch != 'COMMON'}
+            ArchBuildOptions = {arch: flags for arch,
+                                flags in BuildOptions.items() if arch != 'COMMON'}
             if len(ArchBuildOptions.keys()) == 1:
                 BuildOptions['COMMON'] |= (list(ArchBuildOptions.values())[0])
             elif len(ArchBuildOptions.keys()) > 1:
-                CommonBuildOptions = reduce(lambda x,y: x&y, ArchBuildOptions.values())
+                CommonBuildOptions = reduce(
+                    lambda x, y: x & y, ArchBuildOptions.values())
                 BuildOptions['COMMON'] |= CommonBuildOptions
-            ValueList = [item for item in BuildOptions['COMMON'] if item.startswith((r"/U","-U"))]
-            ValueList.extend([item for item in BuildOptions['COMMON'] if item.startswith((r"/D", "-D"))])
+            ValueList = [item for item in BuildOptions['COMMON']
+                         if item.startswith((r"/U", "-U"))]
+            ValueList.extend(
+                [item for item in BuildOptions['COMMON'] if item.startswith((r"/D", "-D"))])
             CC_FLAGS += " ".join(ValueList)
         MakeApp += CC_FLAGS
 
         if sys.platform == "win32":
             MakeApp = MakeApp + PcdMakefileEnd
-            MakeApp = MakeApp + AppTarget % ("""\tcopy $(APPLICATION) $(APPFILE) /y """)
+            MakeApp = MakeApp + \
+                AppTarget % ("""\tcopy $(APPLICATION) $(APPFILE) /y """)
         else:
-            MakeApp = MakeApp + AppTarget % ("""\tcp -p $(APPLICATION) $(APPFILE) """)
+            MakeApp = MakeApp + \
+                AppTarget % ("""\tcp -p $(APPLICATION) $(APPFILE) """)
         MakeApp = MakeApp + '\n'
         IncludeFileFullPaths = []
         for includefile in IncludeFiles:
             for includepath in IncSearchList:
                 includefullpath = os.path.join(str(includepath), includefile)
                 if os.path.exists(includefullpath):
-                    IncludeFileFullPaths.append(os.path.normpath(includefullpath))
+                    IncludeFileFullPaths.append(
+                        os.path.normpath(includefullpath))
                     break
         SearchPathList = []
-        SearchPathList.append(os.path.normpath(mws.join(GlobalData.gGlobalDefines["EDK_TOOLS_PATH"], "BaseTools/Source/C/Include")))
-        SearchPathList.append(os.path.normpath(mws.join(GlobalData.gGlobalDefines["EDK_TOOLS_PATH"], "BaseTools/Source/C/Common")))
+        SearchPathList.append(os.path.normpath(mws.join(
+            GlobalData.gGlobalDefines["EDK_TOOLS_PATH"], "BaseTools/Source/C/Include")))
+        SearchPathList.append(os.path.normpath(mws.join(
+            GlobalData.gGlobalDefines["EDK_TOOLS_PATH"], "BaseTools/Source/C/Common")))
         SearchPathList.extend(str(item) for item in IncSearchList)
         IncFileList = GetDependencyList(IncludeFileFullPaths, SearchPathList)
         for include_file in IncFileList:
             MakeApp += "$(OBJECTS) : %s\n" % include_file
         if sys.platform == "win32":
-            PcdValueCommonPath = os.path.normpath(mws.join(GlobalData.gGlobalDefines["EDK_TOOLS_PATH"], "Source\C\Common\PcdValueCommon.c"))
-            MakeApp = MakeApp + '%s\PcdValueCommon.c : %s\n' % (self.OutputPath, PcdValueCommonPath)
+            PcdValueCommonPath = os.path.normpath(mws.join(
+                GlobalData.gGlobalDefines["EDK_TOOLS_PATH"], "Source\C\Common\PcdValueCommon.c"))
+            MakeApp = MakeApp + \
+                '%s\PcdValueCommon.c : %s\n' % (
+                    self.OutputPath, PcdValueCommonPath)
             MakeApp = MakeApp + '\tcopy /y %s $@\n' % (PcdValueCommonPath)
         else:
-            PcdValueCommonPath = os.path.normpath(mws.join(GlobalData.gGlobalDefines["EDK_TOOLS_PATH"], "Source/C/Common/PcdValueCommon.c"))
-            MakeApp = MakeApp + '%s/PcdValueCommon.c : %s\n' % (self.OutputPath, PcdValueCommonPath)
-            MakeApp = MakeApp + '\tcp -p -f %s %s/PcdValueCommon.c\n' % (PcdValueCommonPath, self.OutputPath)
+            PcdValueCommonPath = os.path.normpath(mws.join(
+                GlobalData.gGlobalDefines["EDK_TOOLS_PATH"], "Source/C/Common/PcdValueCommon.c"))
+            MakeApp = MakeApp + \
+                '%s/PcdValueCommon.c : %s\n' % (self.OutputPath,
+                                                PcdValueCommonPath)
+            MakeApp = MakeApp + \
+                '\tcp -p -f %s %s/PcdValueCommon.c\n' % (
+                    PcdValueCommonPath, self.OutputPath)
         MakeFileName = os.path.join(self.OutputPath, 'Makefile')
         MakeApp += "$(OBJECTS) : %s\n" % MakeFileName
         SaveFileOnChange(MakeFileName, MakeApp, False)
@@ -2967,48 +3388,52 @@ class DscBuildData(PlatformBuildClassObject):
 
         Dest_PcdValueInitExe = PcdValueInitName
         if not sys.platform == "win32":
-            Dest_PcdValueInitExe = os.path.join(self.OutputPath, PcdValueInitName)
+            Dest_PcdValueInitExe = os.path.join(
+                self.OutputPath, PcdValueInitName)
         else:
-            Dest_PcdValueInitExe = os.path.join(self.OutputPath, PcdValueInitName) +".exe"
+            Dest_PcdValueInitExe = os.path.join(
+                self.OutputPath, PcdValueInitName) + ".exe"
 
-        #start building the structure pcd value tool
+        # start building the structure pcd value tool
         Messages = ''
         if sys.platform == "win32":
             MakeCommand = 'nmake -f %s' % (MakeFileName)
-            returncode, StdOut, StdErr = DscBuildData.ExecuteCommand (MakeCommand)
+            returncode, StdOut, StdErr = DscBuildData.ExecuteCommand(
+                MakeCommand)
             Messages = StdOut
         else:
             MakeCommand = 'make -f %s' % (MakeFileName)
-            returncode, StdOut, StdErr = DscBuildData.ExecuteCommand (MakeCommand)
+            returncode, StdOut, StdErr = DscBuildData.ExecuteCommand(
+                MakeCommand)
             Messages = StdErr
 
-        EdkLogger.verbose ('%s\n%s\n%s' % (MakeCommand, StdOut, StdErr))
+        EdkLogger.verbose('%s\n%s\n%s' % (MakeCommand, StdOut, StdErr))
         Messages = Messages.split('\n')
         MessageGroup = []
         if returncode != 0:
             CAppBaseFileName = os.path.join(self.OutputPath, PcdValueInitName)
-            File = open (CAppBaseFileName + '.c', 'r')
+            File = open(CAppBaseFileName + '.c', 'r')
             FileData = File.readlines()
             File.close()
             for Message in Messages:
                 if " error" in Message or "warning" in Message:
                     try:
                         FileInfo = Message.strip().split('(')
-                        if len (FileInfo) > 1:
-                            FileName = FileInfo [0]
-                            FileLine = FileInfo [1].split (')')[0]
+                        if len(FileInfo) > 1:
+                            FileName = FileInfo[0]
+                            FileLine = FileInfo[1].split(')')[0]
                         else:
                             FileInfo = Message.strip().split(':')
                             if len(FileInfo) < 2:
                                 continue
-                            FileName = FileInfo [0]
-                            FileLine = FileInfo [1]
+                            FileName = FileInfo[0]
+                            FileLine = FileInfo[1]
                     except:
                         continue
                     if "PcdValueInit.c" not in FileName:
                         continue
                     if FileLine.isdigit():
-                        error_line = FileData[int (FileLine) - 1]
+                        error_line = FileData[int(FileLine) - 1]
                         if r"//" in error_line:
                             c_line, dsc_line = error_line.split(r"//")
                         else:
@@ -3025,33 +3450,39 @@ class DscBuildData(PlatformBuildClassObject):
                                     Index = message_itmes.index(item)
                                     message_itmes[Index] = dsc_line.strip()
                                     break
-                            MessageGroup.append(":".join(message_itmes[Index:]).strip())
+                            MessageGroup.append(
+                                ":".join(message_itmes[Index:]).strip())
                             continue
                     else:
                         MessageGroup.append(Message)
             if MessageGroup:
-                EdkLogger.error("build", PCD_STRUCTURE_PCD_ERROR, "\n".join(MessageGroup) )
+                EdkLogger.error("build", PCD_STRUCTURE_PCD_ERROR,
+                                "\n".join(MessageGroup))
             else:
-                EdkLogger.error('Build', COMMAND_FAILURE, 'Can not execute command: %s\n%s\n%s' % (MakeCommand, StdOut, StdErr))
+                EdkLogger.error('Build', COMMAND_FAILURE, 'Can not execute command: %s\n%s\n%s' % (
+                    MakeCommand, StdOut, StdErr))
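
The re-pointing of compiler diagnostics above relies on each generated C line carrying a trailing '// From <dsc> Line <n>' marker. A condensed sketch of that idea (the real code also scans for the PcdValueInit.c item in the message); the diagnostic text and DSC name are invented:

    generated_line = '  Pcd->Field = 0x10; // From Platform.dsc Line 120 Value 0x10'
    message = 'PcdValueInit.c(42): error C2220: some diagnostic text'

    c_line, dsc_line = generated_line.split(r"//")
    items = message.split(':')
    items[0] = dsc_line.strip()       # swap the C location for the DSC origin
    print(':'.join(items))
    # From Platform.dsc Line 120 Value 0x10: error C2220: some diagnostic text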
 
-        #start executing the structure pcd value tool
+        # start executing the structure pcd value tool
         if DscBuildData.NeedUpdateOutput(OutputValueFile, Dest_PcdValueInitExe, InputValueFile):
-            Command = Dest_PcdValueInitExe + ' -i %s -o %s' % (InputValueFile, OutputValueFile)
-            returncode, StdOut, StdErr = DscBuildData.ExecuteCommand (Command)
-            EdkLogger.verbose ('%s\n%s\n%s' % (Command, StdOut, StdErr))
+            Command = Dest_PcdValueInitExe + \
+                ' -i %s -o %s' % (InputValueFile, OutputValueFile)
+            returncode, StdOut, StdErr = DscBuildData.ExecuteCommand(Command)
+            EdkLogger.verbose('%s\n%s\n%s' % (Command, StdOut, StdErr))
             if returncode != 0:
-                EdkLogger.warn('Build', COMMAND_FAILURE, 'Can not collect output from command: %s\n%s\n%s\n' % (Command, StdOut, StdErr))
+                EdkLogger.warn('Build', COMMAND_FAILURE, 'Can not collect output from command: %s\n%s\n%s\n' % (
+                    Command, StdOut, StdErr))
 
-        #start update structure pcd final value
-        File = open (OutputValueFile, 'r')
+        # start update structure pcd final value
+        File = open(OutputValueFile, 'r')
         FileBuffer = File.readlines()
         File.close()
 
         StructurePcdSet = []
         for Pcd in FileBuffer:
-            PcdValue = Pcd.split ('|')
-            PcdInfo = PcdValue[0].split ('.')
-            StructurePcdSet.append((PcdInfo[0], PcdInfo[1], PcdInfo[2], PcdInfo[3], PcdValue[2].strip()))
+            PcdValue = Pcd.split('|')
+            PcdInfo = PcdValue[0].split('.')
+            StructurePcdSet.append(
+                (PcdInfo[0], PcdInfo[1], PcdInfo[2], PcdInfo[3], PcdValue[2].strip()))
         return StructurePcdSet
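
Each line handed back by the generated tool follows the '<sku>.<store>.<guid>.<name>|<type>|<value>' shape built up in InitByteValue, which is why the split('|') / split('.') unpacking above works. A one-line example with an invented PCD:

    line = 'DEFAULT.STANDARD.gExampleTokenSpaceGuid.PcdExample|VOID*|{0x01, 0x02}'

    pcd_value = line.split('|')
    pcd_info = pcd_value[0].split('.')
    print((pcd_info[0], pcd_info[1], pcd_info[2], pcd_info[3], pcd_value[2].strip()))
    # ('DEFAULT', 'STANDARD', 'gExampleTokenSpaceGuid', 'PcdExample', '{0x01, 0x02}')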
 
     @staticmethod
@@ -3064,7 +3495,7 @@ class DscBuildData(PlatformBuildClassObject):
             return True
         return False
 
-    ## Retrieve dynamic PCD settings
+    # Retrieve dynamic PCD settings
     #
     #   @param  Type    PCD type
     #
@@ -3072,7 +3503,6 @@ class DscBuildData(PlatformBuildClassObject):
     #
     def _GetDynamicPcd(self, Type):
 
-
         Pcds = OrderedDict()
         #
         # tdict is a special dict kind of type, used for selecting correct
@@ -3084,13 +3514,12 @@ class DscBuildData(PlatformBuildClassObject):
         RecordList = self._RawData[Type, self._Arch]
         AvailableSkuIdSet = copy.copy(self.SkuIds)
 
-
         for TokenSpaceGuid, PcdCName, Setting, Arch, SkuName, Dummy3, Dummy4, Dummy5 in RecordList:
             SkuName = SkuName.upper()
             SkuName = TAB_DEFAULT if SkuName == TAB_COMMON else SkuName
             if SkuName not in AvailableSkuIdSet:
                 EdkLogger.error('build', PARAMETER_INVALID, 'Sku %s is not defined in [SkuIds] section' % SkuName,
-                                            File=self.MetaFile, Line=Dummy5)
+                                File=self.MetaFile, Line=Dummy5)
             if "." not in TokenSpaceGuid and "[" not in PcdCName and (PcdCName, TokenSpaceGuid, SkuName, Dummy5) not in PcdList:
                 PcdList.append((PcdCName, TokenSpaceGuid, SkuName, Dummy5))
             PcdDict[Arch, SkuName, PcdCName, TokenSpaceGuid] = Setting
@@ -3102,7 +3531,8 @@ class DscBuildData(PlatformBuildClassObject):
             if Setting is None:
                 continue
 
-            PcdValue, DatumType, MaxDatumSize = self._ValidatePcd(PcdCName, TokenSpaceGuid, Setting, Type, Dummy4)
+            PcdValue, DatumType, MaxDatumSize = self._ValidatePcd(
+                PcdCName, TokenSpaceGuid, Setting, Type, Dummy4)
             if MaxDatumSize:
                 if int(MaxDatumSize, 0) > 0xFFFF:
                     EdkLogger.error('build', FORMAT_INVALID, "The size value must not exceed the maximum value of 0xFFFF (UINT16) for %s." % ".".join((TokenSpaceGuid, PcdCName)),
@@ -3110,7 +3540,8 @@ class DscBuildData(PlatformBuildClassObject):
                 if int(MaxDatumSize, 0) < 0:
                     EdkLogger.error('build', FORMAT_INVALID, "The size value can't be set to negative value for %s." % ".".join((TokenSpaceGuid, PcdCName)),
                                     File=self.MetaFile, Line=Dummy4)
-            SkuInfo = SkuInfoClass(SkuName, self.SkuIds[SkuName][0], '', '', '', '', '', PcdValue)
+            SkuInfo = SkuInfoClass(
+                SkuName, self.SkuIds[SkuName][0], '', '', '', '', '', PcdValue)
             if (PcdCName, TokenSpaceGuid) in Pcds:
                 pcdObject = Pcds[PcdCName, TokenSpaceGuid]
                 pcdObject.SkuInfoList[SkuName] = SkuInfo
@@ -3126,33 +3557,36 @@ class DscBuildData(PlatformBuildClassObject):
                     pcdObject.MaxDatumSize = str(CurrentMaxSize)
             else:
                 Pcds[PcdCName, TokenSpaceGuid] = PcdClassObject(
-                                                    PcdCName,
-                                                    TokenSpaceGuid,
-                                                    self._PCD_TYPE_STRING_[Type],
-                                                    DatumType,
-                                                    PcdValue,
-                                                    '',
-                                                    MaxDatumSize,
-                                                    OrderedDict({SkuName : SkuInfo}),
-                                                    False,
-                                                    None,
-                                                    IsDsc=True)
+                    PcdCName,
+                    TokenSpaceGuid,
+                    self._PCD_TYPE_STRING_[Type],
+                    DatumType,
+                    PcdValue,
+                    '',
+                    MaxDatumSize,
+                    OrderedDict({SkuName: SkuInfo}),
+                    False,
+                    None,
+                    IsDsc=True)
 
             if SkuName not in Pcds[PcdCName, TokenSpaceGuid].DscRawValue:
                 Pcds[PcdCName, TokenSpaceGuid].DscRawValue[SkuName] = {}
                 Pcds[PcdCName, TokenSpaceGuid].DscRawValueInfo[SkuName] = {}
             Pcds[PcdCName, TokenSpaceGuid].DscRawValue[SkuName][TAB_DEFAULT_STORES_DEFAULT] = PcdValue
-            Pcds[PcdCName, TokenSpaceGuid].DscRawValueInfo[SkuName][TAB_DEFAULT_STORES_DEFAULT] = (self.MetaFile.File,Dummy4)
+            Pcds[PcdCName, TokenSpaceGuid].DscRawValueInfo[SkuName][TAB_DEFAULT_STORES_DEFAULT] = (
+                self.MetaFile.File, Dummy4)
 
         for pcd in Pcds.values():
-            pcdDecObject = self._DecPcds[pcd.TokenCName, pcd.TokenSpaceGuidCName]
+            pcdDecObject = self._DecPcds[pcd.TokenCName,
+                                         pcd.TokenSpaceGuidCName]
             # Only fix the value while no value provided in DSC file.
             for sku in pcd.SkuInfoList.values():
                 if not sku.DefaultValue:
                     sku.DefaultValue = pcdDecObject.DefaultValue
             if TAB_DEFAULT not in pcd.SkuInfoList and TAB_COMMON not in pcd.SkuInfoList:
                 valuefromDec = pcdDecObject.DefaultValue
-                SkuInfo = SkuInfoClass(TAB_DEFAULT, '0', '', '', '', '', '', valuefromDec)
+                SkuInfo = SkuInfoClass(
+                    TAB_DEFAULT, '0', '', '', '', '', '', valuefromDec)
                 pcd.SkuInfoList[TAB_DEFAULT] = SkuInfo
             elif TAB_DEFAULT not in pcd.SkuInfoList and TAB_COMMON in pcd.SkuInfoList:
                 pcd.SkuInfoList[TAB_DEFAULT] = pcd.SkuInfoList[TAB_COMMON]
@@ -3169,12 +3603,13 @@ class DscBuildData(PlatformBuildClassObject):
         if self.SkuIdMgr.SkuUsageType == self.SkuIdMgr.SINGLE:
             if TAB_DEFAULT in PcdObj.SkuInfoList and self.SkuIdMgr.SystemSkuId not in PcdObj.SkuInfoList:
                 PcdObj.SkuInfoList[self.SkuIdMgr.SystemSkuId] = PcdObj.SkuInfoList[TAB_DEFAULT]
-            PcdObj.SkuInfoList = {TAB_DEFAULT:PcdObj.SkuInfoList[self.SkuIdMgr.SystemSkuId]}
+            PcdObj.SkuInfoList = {
+                TAB_DEFAULT: PcdObj.SkuInfoList[self.SkuIdMgr.SystemSkuId]}
             PcdObj.SkuInfoList[TAB_DEFAULT].SkuIdName = TAB_DEFAULT
             PcdObj.SkuInfoList[TAB_DEFAULT].SkuId = '0'
 
         elif self.SkuIdMgr.SkuUsageType == self.SkuIdMgr.DEFAULT:
-            PcdObj.SkuInfoList = {TAB_DEFAULT:PcdObj.SkuInfoList[TAB_DEFAULT]}
+            PcdObj.SkuInfoList = {TAB_DEFAULT: PcdObj.SkuInfoList[TAB_DEFAULT]}
 
         return PcdObj
 
@@ -3194,24 +3629,31 @@ class DscBuildData(PlatformBuildClassObject):
     def CompletePcdValues(self, PcdSet):
         Pcds = OrderedDict()
         DefaultStoreObj = DefaultStore(self._GetDefaultStores())
-        SkuIds = {skuname:skuid for skuname, skuid in self.SkuIdMgr.AvailableSkuIdSet.items() if skuname != TAB_COMMON}
-        DefaultStores = set(storename for pcdobj in PcdSet.values() for skuobj in pcdobj.SkuInfoList.values() for storename in skuobj.DefaultStoreDict)
+        SkuIds = {skuname: skuid for skuname,
+                  skuid in self.SkuIdMgr.AvailableSkuIdSet.items() if skuname != TAB_COMMON}
+        DefaultStores = set(storename for pcdobj in PcdSet.values(
+        ) for skuobj in pcdobj.SkuInfoList.values() for storename in skuobj.DefaultStoreDict)
         for PcdCName, TokenSpaceGuid in PcdSet:
             PcdObj = PcdSet[(PcdCName, TokenSpaceGuid)]
 
             if PcdObj.Type not in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_DEFAULT],
-                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII],
-                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_VPD],
-                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_DEFAULT],
-                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII],
-                        self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_VPD]]:
-                Pcds[PcdCName, TokenSpaceGuid]= PcdObj
+                                   self._PCD_TYPE_STRING_[
+                                       MODEL_PCD_DYNAMIC_HII],
+                                   self._PCD_TYPE_STRING_[
+                                       MODEL_PCD_DYNAMIC_VPD],
+                                   self._PCD_TYPE_STRING_[
+                                       MODEL_PCD_DYNAMIC_EX_DEFAULT],
+                                   self._PCD_TYPE_STRING_[
+                                       MODEL_PCD_DYNAMIC_EX_HII],
+                                   self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_VPD]]:
+                Pcds[PcdCName, TokenSpaceGuid] = PcdObj
                 continue
             PcdType = PcdObj.Type
             if PcdType in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
                 for skuid in PcdObj.SkuInfoList:
                     skuobj = PcdObj.SkuInfoList[skuid]
-                    mindefaultstorename = DefaultStoreObj.GetMin(set(defaultstorename for defaultstorename in skuobj.DefaultStoreDict))
+                    mindefaultstorename = DefaultStoreObj.GetMin(
+                        set(defaultstorename for defaultstorename in skuobj.DefaultStoreDict))
                     for defaultstorename in DefaultStores:
                         if defaultstorename not in skuobj.DefaultStoreDict:
                             skuobj.DefaultStoreDict[defaultstorename] = skuobj.DefaultStoreDict[mindefaultstorename]
@@ -3221,19 +3663,22 @@ class DscBuildData(PlatformBuildClassObject):
                     nextskuid = self.SkuIdMgr.GetNextSkuId(skuname)
                     while nextskuid not in PcdObj.SkuInfoList:
                         nextskuid = self.SkuIdMgr.GetNextSkuId(nextskuid)
-                    PcdObj.SkuInfoList[skuname] = copy.deepcopy(PcdObj.SkuInfoList[nextskuid])
+                    PcdObj.SkuInfoList[skuname] = copy.deepcopy(
+                        PcdObj.SkuInfoList[nextskuid])
                     PcdObj.SkuInfoList[skuname].SkuId = skuid
                     PcdObj.SkuInfoList[skuname].SkuIdName = skuname
             if PcdType in [self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_HII], self._PCD_TYPE_STRING_[MODEL_PCD_DYNAMIC_EX_HII]]:
-                PcdObj.DefaultValue = list(PcdObj.SkuInfoList.values())[0].HiiDefaultValue if self.SkuIdMgr.SkuUsageType == self.SkuIdMgr.SINGLE else PcdObj.SkuInfoList[TAB_DEFAULT].HiiDefaultValue
-            Pcds[PcdCName, TokenSpaceGuid]= PcdObj
+                PcdObj.DefaultValue = list(PcdObj.SkuInfoList.values())[
+                    0].HiiDefaultValue if self.SkuIdMgr.SkuUsageType == self.SkuIdMgr.SINGLE else PcdObj.SkuInfoList[TAB_DEFAULT].HiiDefaultValue
+            Pcds[PcdCName, TokenSpaceGuid] = PcdObj
         return Pcds
-    ## Retrieve dynamic HII PCD settings
+    # Retrieve dynamic HII PCD settings
     #
     #   @param  Type    PCD type
     #
     #   @retval a dict object contains settings of given PCD type
     #
+
     def _GetDynamicHiiPcd(self, Type):
 
         VariableAttrs = {}
@@ -3258,31 +3703,34 @@ class DscBuildData(PlatformBuildClassObject):
             if DefaultStore == TAB_COMMON:
                 DefaultStore = TAB_DEFAULT_STORES_DEFAULT
             else:
-                #The end user define [DefaultStores] and [SKUID_IDENTIFIER.Menufacturing] in DSC
+                # The end user define [DefaultStores] and [SKUID_IDENTIFIER.Menufacturing] in DSC
                 UserDefinedDefaultStores.append((PcdCName, TokenSpaceGuid))
             if SkuName not in AvailableSkuIdSet:
                 EdkLogger.error('build', PARAMETER_INVALID, 'Sku %s is not defined in [SkuIds] section' % SkuName,
-                                            File=self.MetaFile, Line=Dummy5)
+                                File=self.MetaFile, Line=Dummy5)
             if DefaultStore not in DefaultStoresDefine:
                 EdkLogger.error('build', PARAMETER_INVALID, 'DefaultStores %s is not defined in [DefaultStores] section' % DefaultStore,
-                                            File=self.MetaFile, Line=Dummy5)
+                                File=self.MetaFile, Line=Dummy5)
             if "." not in TokenSpaceGuid and "[" not in PcdCName and (PcdCName, TokenSpaceGuid, SkuName, DefaultStore, Dummy5) not in PcdList:
-                PcdList.append((PcdCName, TokenSpaceGuid, SkuName, DefaultStore, Dummy5))
-            PcdDict[Arch, SkuName, PcdCName, TokenSpaceGuid, DefaultStore] = Setting
-
+                PcdList.append((PcdCName, TokenSpaceGuid,
+                               SkuName, DefaultStore, Dummy5))
+            PcdDict[Arch, SkuName, PcdCName,
+                    TokenSpaceGuid, DefaultStore] = Setting
 
         # Remove redundant PCD candidates, per the ARCH and SKU
-        for index,(PcdCName, TokenSpaceGuid, SkuName, DefaultStore, Dummy4) in enumerate(PcdList):
+        for index, (PcdCName, TokenSpaceGuid, SkuName, DefaultStore, Dummy4) in enumerate(PcdList):
 
-            Setting = PcdDict[self._Arch, SkuName, PcdCName, TokenSpaceGuid, DefaultStore]
+            Setting = PcdDict[self._Arch, SkuName,
+                              PcdCName, TokenSpaceGuid, DefaultStore]
             if Setting is None:
                 continue
-            VariableName, VariableGuid, VariableOffset, DefaultValue, VarAttribute = self._ValidatePcd(PcdCName, TokenSpaceGuid, Setting, Type, Dummy4)
+            VariableName, VariableGuid, VariableOffset, DefaultValue, VarAttribute = self._ValidatePcd(
+                PcdCName, TokenSpaceGuid, Setting, Type, Dummy4)
 
             rt, Msg = VariableAttributes.ValidateVarAttributes(VarAttribute)
             if not rt:
                 EdkLogger.error("build", PCD_VARIABLE_ATTRIBUTES_ERROR, "Variable attributes settings for %s is incorrect.\n %s" % (".".join((TokenSpaceGuid, PcdCName)), Msg),
-                        ExtraData="[%s]" % VarAttribute)
+                                ExtraData="[%s]" % VarAttribute)
             ExceedMax = False
             FormatCorrect = True
             if VariableOffset.isdigit():
@@ -3301,42 +3749,49 @@ class DscBuildData(PlatformBuildClassObject):
             else:
                 FormatCorrect = False
             if not FormatCorrect:
-                EdkLogger.error('Build', FORMAT_INVALID, "Invalid syntax or format of the variable offset value is incorrect for %s." % ".".join((TokenSpaceGuid, PcdCName)))
+                EdkLogger.error('Build', FORMAT_INVALID, "Invalid syntax or format of the variable offset value is incorrect for %s." % ".".join(
+                    (TokenSpaceGuid, PcdCName)))
 
             if ExceedMax:
-                EdkLogger.error('Build', OPTION_VALUE_INVALID, "The variable offset value must not exceed the maximum value of 0xFFFF (UINT16) for %s." % ".".join((TokenSpaceGuid, PcdCName)))
+                EdkLogger.error('Build', OPTION_VALUE_INVALID, "The variable offset value must not exceed the maximum value of 0xFFFF (UINT16) for %s." % ".".join(
+                    (TokenSpaceGuid, PcdCName)))
             if (VariableName, VariableGuid) not in VariableAttrs:
                 VariableAttrs[(VariableName, VariableGuid)] = VarAttribute
             else:
                 if not DscBuildData.CompareVarAttr(VariableAttrs[(VariableName, VariableGuid)], VarAttribute):
-                    EdkLogger.error('Build', PCD_VARIABLE_ATTRIBUTES_CONFLICT_ERROR, "The variable %s.%s for DynamicHii PCDs has conflicting attributes [%s] and [%s] " % (VariableGuid, VariableName, VarAttribute, VariableAttrs[(VariableName, VariableGuid)]))
+                    EdkLogger.error('Build', PCD_VARIABLE_ATTRIBUTES_CONFLICT_ERROR, "The variable %s.%s for DynamicHii PCDs has conflicting attributes [%s] and [%s] " % (
+                        VariableGuid, VariableName, VarAttribute, VariableAttrs[(VariableName, VariableGuid)]))
 
             pcdDecObject = self._DecPcds[PcdCName, TokenSpaceGuid]
             if (PcdCName, TokenSpaceGuid) in Pcds:
                 pcdObject = Pcds[PcdCName, TokenSpaceGuid]
                 if SkuName in pcdObject.SkuInfoList:
                     Skuitem = pcdObject.SkuInfoList[SkuName]
-                    Skuitem.DefaultStoreDict.update({DefaultStore:DefaultValue})
+                    Skuitem.DefaultStoreDict.update(
+                        {DefaultStore: DefaultValue})
                 else:
-                    SkuInfo = SkuInfoClass(SkuName, self.SkuIds[SkuName][0], VariableName, VariableGuid, VariableOffset, DefaultValue, VariableAttribute=VarAttribute, DefaultStore={DefaultStore:DefaultValue})
+                    SkuInfo = SkuInfoClass(SkuName, self.SkuIds[SkuName][0], VariableName, VariableGuid, VariableOffset,
+                                           DefaultValue, VariableAttribute=VarAttribute, DefaultStore={DefaultStore: DefaultValue})
                     pcdObject.SkuInfoList[SkuName] = SkuInfo
             else:
-                SkuInfo = SkuInfoClass(SkuName, self.SkuIds[SkuName][0], VariableName, VariableGuid, VariableOffset, DefaultValue, VariableAttribute=VarAttribute, DefaultStore={DefaultStore:DefaultValue})
+                SkuInfo = SkuInfoClass(SkuName, self.SkuIds[SkuName][0], VariableName, VariableGuid, VariableOffset,
+                                       DefaultValue, VariableAttribute=VarAttribute, DefaultStore={DefaultStore: DefaultValue})
                 PcdClassObj = PcdClassObject(
-                                                PcdCName,
-                                                TokenSpaceGuid,
-                                                self._PCD_TYPE_STRING_[Type],
-                                                '',
-                                                DefaultValue,
-                                                '',
-                                                '',
-                                                OrderedDict({SkuName : SkuInfo}),
-                                                False,
-                                                None,
-                                                pcdDecObject.validateranges,
-                                                pcdDecObject.validlists,
-                                                pcdDecObject.expressions,
-                                                IsDsc=True)
+                    PcdCName,
+                    TokenSpaceGuid,
+                    self._PCD_TYPE_STRING_[Type],
+                    '',
+                    DefaultValue,
+                    '',
+                    '',
+                    OrderedDict(
+                        {SkuName: SkuInfo}),
+                    False,
+                    None,
+                    pcdDecObject.validateranges,
+                    pcdDecObject.validlists,
+                    pcdDecObject.expressions,
+                    IsDsc=True)
                 if (PcdCName, TokenSpaceGuid) in UserDefinedDefaultStores:
                     PcdClassObj.UserDefinedDefaultStoresFlag = True
                 Pcds[PcdCName, TokenSpaceGuid] = PcdClassObj
@@ -3346,21 +3801,24 @@ class DscBuildData(PlatformBuildClassObject):
                 Pcds[PcdCName, TokenSpaceGuid].DscRawValue[SkuName] = {}
                 Pcds[PcdCName, TokenSpaceGuid].DscRawValueInfo[SkuName] = {}
             Pcds[PcdCName, TokenSpaceGuid].DscRawValue[SkuName][DefaultStore] = DefaultValue
-            Pcds[PcdCName, TokenSpaceGuid].DscRawValueInfo[SkuName][DefaultStore] = (self.MetaFile.File,Dummy4)
+            Pcds[PcdCName, TokenSpaceGuid].DscRawValueInfo[SkuName][DefaultStore] = (
+                self.MetaFile.File, Dummy4)
         for pcd in Pcds.values():
-            pcdDecObject = self._DecPcds[pcd.TokenCName, pcd.TokenSpaceGuidCName]
+            pcdDecObject = self._DecPcds[pcd.TokenCName,
+                                         pcd.TokenSpaceGuidCName]
             pcd.DatumType = pcdDecObject.DatumType
             # Only fix the value while no value provided in DSC file.
             for sku in pcd.SkuInfoList.values():
                 if (sku.HiiDefaultValue == "" or sku.HiiDefaultValue is None):
                     sku.HiiDefaultValue = pcdDecObject.DefaultValue
                     for default_store in sku.DefaultStoreDict:
-                        sku.DefaultStoreDict[default_store]=pcdDecObject.DefaultValue
+                        sku.DefaultStoreDict[default_store] = pcdDecObject.DefaultValue
                     pcd.DefaultValue = pcdDecObject.DefaultValue
             if TAB_DEFAULT not in pcd.SkuInfoList and TAB_COMMON not in pcd.SkuInfoList:
                 SkuInfoObj = list(pcd.SkuInfoList.values())[0]
                 valuefromDec = pcdDecObject.DefaultValue
-                SkuInfo = SkuInfoClass(TAB_DEFAULT, '0', SkuInfoObj.VariableName, SkuInfoObj.VariableGuid, SkuInfoObj.VariableOffset, valuefromDec, VariableAttribute=SkuInfoObj.VariableAttribute, DefaultStore={DefaultStore:valuefromDec})
+                SkuInfo = SkuInfoClass(TAB_DEFAULT, '0', SkuInfoObj.VariableName, SkuInfoObj.VariableGuid, SkuInfoObj.VariableOffset,
+                                       valuefromDec, VariableAttribute=SkuInfoObj.VariableAttribute, DefaultStore={DefaultStore: valuefromDec})
                 pcd.SkuInfoList[TAB_DEFAULT] = SkuInfo
             elif TAB_DEFAULT not in pcd.SkuInfoList and TAB_COMMON in pcd.SkuInfoList:
                 pcd.SkuInfoList[TAB_DEFAULT] = pcd.SkuInfoList[TAB_COMMON]
@@ -3375,18 +3833,21 @@ class DscBuildData(PlatformBuildClassObject):
             if pcd.DatumType not in TAB_PCD_NUMERIC_TYPES:
                 for (_, skuobj) in pcd.SkuInfoList.items():
                     datalen = 0
-                    skuobj.HiiDefaultValue = StringToArray(skuobj.HiiDefaultValue)
+                    skuobj.HiiDefaultValue = StringToArray(
+                        skuobj.HiiDefaultValue)
                     datalen = len(skuobj.HiiDefaultValue.split(","))
                     if datalen > MaxSize:
                         MaxSize = datalen
                     for defaultst in skuobj.DefaultStoreDict:
-                        skuobj.DefaultStoreDict[defaultst] = StringToArray(skuobj.DefaultStoreDict[defaultst])
+                        skuobj.DefaultStoreDict[defaultst] = StringToArray(
+                            skuobj.DefaultStoreDict[defaultst])
                 pcd.DefaultValue = StringToArray(pcd.DefaultValue)
                 pcd.MaxDatumSize = str(MaxSize)
         rt, invalidhii = DscBuildData.CheckVariableNameAssignment(Pcds)
         if not rt:
             invalidpcd = ",".join(invalidhii)
-            EdkLogger.error('build', PCD_VARIABLE_INFO_ERROR, Message='The same HII PCD must map to the same EFI variable for all SKUs', File=self.MetaFile, ExtraData=invalidpcd)
+            EdkLogger.error('build', PCD_VARIABLE_INFO_ERROR,
+                            Message='The same HII PCD must map to the same EFI variable for all SKUs', File=self.MetaFile, ExtraData=invalidpcd)
 
         list(map(self.FilterSkuSettings, Pcds.values()))
 
@@ -3397,22 +3858,23 @@ class DscBuildData(PlatformBuildClassObject):
         invalidhii = []
         for pcdname in Pcds:
             pcd = Pcds[pcdname]
-            varnameset = set(sku.VariableName for (skuid, sku) in pcd.SkuInfoList.items())
+            varnameset = set(sku.VariableName for (
+                skuid, sku) in pcd.SkuInfoList.items())
             if len(varnameset) > 1:
                 invalidhii.append(".".join((pcdname[1], pcdname[0])))
         if len(invalidhii):
             return False, invalidhii
         else:
             return True, []
-    ## Retrieve dynamic VPD PCD settings
+    # Retrieve dynamic VPD PCD settings
     #
     #   @param  Type    PCD type
     #
     #   @retval a dict object contains settings of given PCD type
     #
+
     def _GetDynamicVpdPcd(self, Type):
 
-
         Pcds = OrderedDict()
         #
         # tdict is a special dict kind of type, used for selecting correct
@@ -3430,7 +3892,7 @@ class DscBuildData(PlatformBuildClassObject):
             SkuName = TAB_DEFAULT if SkuName == TAB_COMMON else SkuName
             if SkuName not in AvailableSkuIdSet:
                 EdkLogger.error('build', PARAMETER_INVALID, 'Sku %s is not defined in [SkuIds] section' % SkuName,
-                                            File=self.MetaFile, Line=Dummy5)
+                                File=self.MetaFile, Line=Dummy5)
             if "." not in TokenSpaceGuid and "[" not in PcdCName and (PcdCName, TokenSpaceGuid, SkuName, Dummy5) not in PcdList:
                 PcdList.append((PcdCName, TokenSpaceGuid, SkuName, Dummy5))
             PcdDict[Arch, SkuName, PcdCName, TokenSpaceGuid] = Setting
@@ -3446,7 +3908,8 @@ class DscBuildData(PlatformBuildClassObject):
             # At this point, we put all the data into the PcdClssObject for we don't know the PCD's datumtype
             # until the DEC parser has been called.
             #
-            VpdOffset, MaxDatumSize, InitialValue = self._ValidatePcd(PcdCName, TokenSpaceGuid, Setting, Type, Dummy4)
+            VpdOffset, MaxDatumSize, InitialValue = self._ValidatePcd(
+                PcdCName, TokenSpaceGuid, Setting, Type, Dummy4)
             if MaxDatumSize:
                 if int(MaxDatumSize, 0) > 0xFFFF:
                     EdkLogger.error('build', FORMAT_INVALID, "The size value must not exceed the maximum value of 0xFFFF (UINT16) for %s." % ".".join((TokenSpaceGuid, PcdCName)),
@@ -3454,7 +3917,8 @@ class DscBuildData(PlatformBuildClassObject):
                 if int(MaxDatumSize, 0) < 0:
                     EdkLogger.error('build', FORMAT_INVALID, "The size value can't be set to negative value for %s." % ".".join((TokenSpaceGuid, PcdCName)),
                                     File=self.MetaFile, Line=Dummy4)
-            SkuInfo = SkuInfoClass(SkuName, self.SkuIds[SkuName][0], '', '', '', '', VpdOffset, InitialValue)
+            SkuInfo = SkuInfoClass(
+                SkuName, self.SkuIds[SkuName][0], '', '', '', '', VpdOffset, InitialValue)
             if (PcdCName, TokenSpaceGuid) in Pcds:
                 pcdObject = Pcds[PcdCName, TokenSpaceGuid]
                 pcdObject.SkuInfoList[SkuName] = SkuInfo
@@ -3470,25 +3934,28 @@ class DscBuildData(PlatformBuildClassObject):
                     pcdObject.MaxDatumSize = str(CurrentMaxSize)
             else:
                 Pcds[PcdCName, TokenSpaceGuid] = PcdClassObject(
-                                                PcdCName,
-                                                TokenSpaceGuid,
-                                                self._PCD_TYPE_STRING_[Type],
-                                                '',
-                                                InitialValue,
-                                                '',
-                                                MaxDatumSize,
-                                                OrderedDict({SkuName : SkuInfo}),
-                                                False,
-                                                None,
-                                                IsDsc=True)
+                    PcdCName,
+                    TokenSpaceGuid,
+                    self._PCD_TYPE_STRING_[Type],
+                    '',
+                    InitialValue,
+                    '',
+                    MaxDatumSize,
+                    OrderedDict(
+                        {SkuName: SkuInfo}),
+                    False,
+                    None,
+                    IsDsc=True)
 
             if SkuName not in Pcds[PcdCName, TokenSpaceGuid].DscRawValue:
                 Pcds[PcdCName, TokenSpaceGuid].DscRawValue[SkuName] = {}
                 Pcds[PcdCName, TokenSpaceGuid].DscRawValueInfo[SkuName] = {}
             Pcds[PcdCName, TokenSpaceGuid].DscRawValue[SkuName][TAB_DEFAULT_STORES_DEFAULT] = InitialValue
-            Pcds[PcdCName, TokenSpaceGuid].DscRawValueInfo[SkuName][TAB_DEFAULT_STORES_DEFAULT] = (self.MetaFile.File,Dummy4)
+            Pcds[PcdCName, TokenSpaceGuid].DscRawValueInfo[SkuName][TAB_DEFAULT_STORES_DEFAULT] = (
+                self.MetaFile.File, Dummy4)
         for pcd in Pcds.values():
-            pcdDecObject = self._DecPcds[pcd.TokenCName, pcd.TokenSpaceGuidCName]
+            pcdDecObject = self._DecPcds[pcd.TokenCName,
+                                         pcd.TokenSpaceGuidCName]
             pcd.DatumType = pcdDecObject.DatumType
             # Only fix the value while no value provided in DSC file.
             for sku in pcd.SkuInfoList.values():
@@ -3497,7 +3964,8 @@ class DscBuildData(PlatformBuildClassObject):
             if TAB_DEFAULT not in pcd.SkuInfoList and TAB_COMMON not in pcd.SkuInfoList:
                 SkuInfoObj = list(pcd.SkuInfoList.values())[0]
                 valuefromDec = pcdDecObject.DefaultValue
-                SkuInfo = SkuInfoClass(TAB_DEFAULT, '0', '', '', '', '', SkuInfoObj.VpdOffset, valuefromDec)
+                SkuInfo = SkuInfoClass(
+                    TAB_DEFAULT, '0', '', '', '', '', SkuInfoObj.VpdOffset, valuefromDec)
                 pcd.SkuInfoList[TAB_DEFAULT] = SkuInfo
             elif TAB_DEFAULT not in pcd.SkuInfoList and TAB_COMMON in pcd.SkuInfoList:
                 pcd.SkuInfoList[TAB_DEFAULT] = pcd.SkuInfoList[TAB_COMMON]
@@ -3505,20 +3973,22 @@ class DscBuildData(PlatformBuildClassObject):
             elif TAB_DEFAULT in pcd.SkuInfoList and TAB_COMMON in pcd.SkuInfoList:
                 del pcd.SkuInfoList[TAB_COMMON]
 
-        #For the same one VOID* pcd, if the default value type of one SKU is "Unicode string",
-        #the other SKUs are "OtherVOID*"(ASCII string or byte array),Then convert "Unicode string" to "byte array".
+        # For the same one VOID* pcd, if the default value type of one SKU is "Unicode string",
+        # the other SKUs are "OtherVOID*"(ASCII string or byte array),Then convert "Unicode string" to "byte array".
         for pcd in Pcds.values():
             PcdValueTypeSet = set()
             for sku in pcd.SkuInfoList.values():
-                PcdValueTypeSet.add("UnicodeString" if sku.DefaultValue.startswith(('L"',"L'")) else "OtherVOID*")
+                PcdValueTypeSet.add("UnicodeString" if sku.DefaultValue.startswith(
+                    ('L"', "L'")) else "OtherVOID*")
             if len(PcdValueTypeSet) > 1:
                 for sku in pcd.SkuInfoList.values():
-                    sku.DefaultValue = StringToArray(sku.DefaultValue) if sku.DefaultValue.startswith(('L"',"L'")) else sku.DefaultValue
+                    sku.DefaultValue = StringToArray(sku.DefaultValue) if sku.DefaultValue.startswith(
+                        ('L"', "L'")) else sku.DefaultValue
 
         list(map(self.FilterSkuSettings, Pcds.values()))
         return Pcds
 
-    ## Add external modules
+    # Add external modules
     #
     #   The external modules are mostly those listed in FDF file, which don't
     # need "build".
@@ -3537,13 +4007,16 @@ class DscBuildData(PlatformBuildClassObject):
         self._ToolChainFamily = TAB_COMPILER_MSFT
         TargetObj = TargetTxtDict()
         TargetTxt = TargetObj.Target
-        BuildConfigurationFile = os.path.normpath(os.path.join(GlobalData.gConfDirectory, gDefaultTargetTxtFile))
+        BuildConfigurationFile = os.path.normpath(os.path.join(
+            GlobalData.gConfDirectory, gDefaultTargetTxtFile))
         if os.path.isfile(BuildConfigurationFile) == True:
             ToolDefinitionFile = TargetTxt.TargetTxtDictionary[DataType.TAB_TAT_DEFINES_TOOL_CHAIN_CONF]
             if ToolDefinitionFile == '':
-                ToolDefinitionFile = os.path.normpath(mws.join(self.WorkspaceDir, 'Conf', gDefaultToolsDefFile))
+                ToolDefinitionFile = os.path.normpath(
+                    mws.join(self.WorkspaceDir, 'Conf', gDefaultToolsDefFile))
             if os.path.isfile(ToolDefinitionFile) == True:
-                ToolDefObj = ToolDefDict((os.path.join(os.getenv("WORKSPACE"), "Conf")))
+                ToolDefObj = ToolDefDict(
+                    (os.path.join(os.getenv("WORKSPACE"), "Conf")))
                 ToolDefinition = ToolDefObj.ToolDef.ToolsDefTxtDatabase
                 if TAB_TOD_DEFINES_FAMILY not in ToolDefinition \
                    or self._Toolchain not in ToolDefinition[TAB_TOD_DEFINES_FAMILY] \
@@ -3553,7 +4026,7 @@ class DscBuildData(PlatformBuildClassObject):
                     self._ToolChainFamily = ToolDefinition[TAB_TOD_DEFINES_FAMILY][self._Toolchain]
         return self._ToolChainFamily
 
-    ## Add external PCDs
+    # Add external PCDs
     #
     #   The external PCDs are mostly those listed in FDF file to specify address
     # or offset information.
@@ -3564,7 +4037,8 @@ class DscBuildData(PlatformBuildClassObject):
     #
     def AddPcd(self, Name, Guid, Value):
         if (Name, Guid) not in self.Pcds:
-            self.Pcds[Name, Guid] = PcdClassObject(Name, Guid, '', '', '', '', '', {}, False, None)
+            self.Pcds[Name, Guid] = PcdClassObject(
+                Name, Guid, '', '', '', '', '', {}, False, None)
         self.Pcds[Name, Guid].DefaultValue = Value
 
     @property
@@ -3575,13 +4049,16 @@ class DscBuildData(PlatformBuildClassObject):
                 FdfInfList = GlobalData.gFdfParser.Profile.InfList
             PkgSet = set()
             for Inf in FdfInfList:
-                ModuleFile = PathClass(NormPath(Inf), GlobalData.gWorkspace, Arch=self._Arch)
+                ModuleFile = PathClass(
+                    NormPath(Inf), GlobalData.gWorkspace, Arch=self._Arch)
                 if ModuleFile in self._Modules:
                     continue
-                ModuleData = self._Bdb[ModuleFile, self._Arch, self._Target, self._Toolchain]
+                ModuleData = self._Bdb[ModuleFile,
+                                       self._Arch, self._Target, self._Toolchain]
                 PkgSet.update(ModuleData.Packages)
             if self.Packages:
                 PkgSet.update(self.Packages)
-            self._DecPcds, self._GuidDict = GetDeclaredPcd(self, self._Bdb, self._Arch, self._Target, self._Toolchain, PkgSet)
+            self._DecPcds, self._GuidDict = GetDeclaredPcd(
+                self, self._Bdb, self._Arch, self._Target, self._Toolchain, PkgSet)
             self._GuidDict.update(GlobalData.gPlatformPcds)
         return self._DecPcds
diff --git a/BaseTools/Source/Python/Workspace/InfBuildData.py b/BaseTools/Source/Python/Workspace/InfBuildData.py
index e4ff1c668666..4a28c0991c3a 100644
--- a/BaseTools/Source/Python/Workspace/InfBuildData.py
+++ b/BaseTools/Source/Python/Workspace/InfBuildData.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create a database used by build tool
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -16,7 +16,7 @@ from collections import OrderedDict
 from Workspace.BuildClassObject import ModuleBuildClassObject, LibraryClassObject, PcdClassObject
 from Common.Expression import ValueExpressionEx, PcdPattern
 
-## Get Protocol value from given packages
+# Get Protocol value from given packages
 #
 #   @param      CName           The CName of the GUID
 #   @param      PackageList     List of packages looking-up in
@@ -25,17 +25,20 @@ from Common.Expression import ValueExpressionEx, PcdPattern
 #   @retval     GuidValue   if the CName is found in any given package
 #   @retval     None        if the CName is not found in all given packages
 #
-def _ProtocolValue(CName, PackageList, Inffile = None):
+
+
+def _ProtocolValue(CName, PackageList, Inffile=None):
     for P in PackageList:
         ProtocolKeys = list(P.Protocols.keys())
         if Inffile and P._PrivateProtocols:
             if not Inffile.startswith(P.MetaFile.Dir):
-                ProtocolKeys = [x for x in P.Protocols if x not in P._PrivateProtocols]
+                ProtocolKeys = [
+                    x for x in P.Protocols if x not in P._PrivateProtocols]
         if CName in ProtocolKeys:
             return P.Protocols[CName]
     return None
 
-## Get PPI value from given packages
+# Get PPI value from given packages
 #
 #   @param      CName           The CName of the GUID
 #   @param      PackageList     List of packages looking-up in
@@ -44,7 +47,9 @@ def _ProtocolValue(CName, PackageList, Inffile = None):
 #   @retval     GuidValue   if the CName is found in any given package
 #   @retval     None        if the CName is not found in all given packages
 #
-def _PpiValue(CName, PackageList, Inffile = None):
+
+
+def _PpiValue(CName, PackageList, Inffile=None):
     for P in PackageList:
         PpiKeys = list(P.Ppis.keys())
         if Inffile and P._PrivatePpis:
@@ -54,11 +59,13 @@ def _PpiValue(CName, PackageList, Inffile = None):
             return P.Ppis[CName]
     return None
 
-## Module build information from INF file
+# Module build information from INF file
 #
 #  This class is used to retrieve information stored in database and convert them
 # into ModuleBuildClassObject form for easier use for AutoGen.
 #
+
+
 class InfBuildData(ModuleBuildClassObject):
 
     # dict used to convert part of [Defines] to members of InfBuildData directly
@@ -66,35 +73,35 @@ class InfBuildData(ModuleBuildClassObject):
         #
         # Required Fields
         #
-        TAB_INF_DEFINES_BASE_NAME                   : "_BaseName",
-        TAB_INF_DEFINES_FILE_GUID                   : "_Guid",
-        TAB_INF_DEFINES_MODULE_TYPE                 : "_ModuleType",
+        TAB_INF_DEFINES_BASE_NAME: "_BaseName",
+        TAB_INF_DEFINES_FILE_GUID: "_Guid",
+        TAB_INF_DEFINES_MODULE_TYPE: "_ModuleType",
         #
         # Optional Fields
         #
         # TAB_INF_DEFINES_INF_VERSION                 : "_AutoGenVersion",
-        TAB_INF_DEFINES_COMPONENT_TYPE              : "_ComponentType",
-        TAB_INF_DEFINES_MAKEFILE_NAME               : "_MakefileName",
+        TAB_INF_DEFINES_COMPONENT_TYPE: "_ComponentType",
+        TAB_INF_DEFINES_MAKEFILE_NAME: "_MakefileName",
         # TAB_INF_DEFINES_CUSTOM_MAKEFILE             : "_CustomMakefile",
-        TAB_INF_DEFINES_DPX_SOURCE                  :"_DxsFile",
-        TAB_INF_DEFINES_VERSION_NUMBER              : "_Version",
-        TAB_INF_DEFINES_VERSION_STRING              : "_Version",
-        TAB_INF_DEFINES_VERSION                     : "_Version",
-        TAB_INF_DEFINES_PCD_IS_DRIVER               : "_PcdIsDriver",
-        TAB_INF_DEFINES_SHADOW                      : "_Shadow"
+        TAB_INF_DEFINES_DPX_SOURCE: "_DxsFile",
+        TAB_INF_DEFINES_VERSION_NUMBER: "_Version",
+        TAB_INF_DEFINES_VERSION_STRING: "_Version",
+        TAB_INF_DEFINES_VERSION: "_Version",
+        TAB_INF_DEFINES_PCD_IS_DRIVER: "_PcdIsDriver",
+        TAB_INF_DEFINES_SHADOW: "_Shadow"
     }
 
     # regular expression for converting XXX_FLAGS in [nmake] section to new type
-    _NMAKE_FLAG_PATTERN_ = re.compile("(?:EBC_)?([A-Z]+)_(?:STD_|PROJ_|ARCH_)?FLAGS(?:_DLL|_ASL|_EXE)?", re.UNICODE)
+    _NMAKE_FLAG_PATTERN_ = re.compile(
+        "(?:EBC_)?([A-Z]+)_(?:STD_|PROJ_|ARCH_)?FLAGS(?:_DLL|_ASL|_EXE)?", re.UNICODE)
     # dict used to convert old tool name used in [nmake] section to new ones
     _TOOL_CODE_ = {
-        "C"         :   "CC",
-        BINARY_FILE_TYPE_LIB       :   "SLINK",
-        "LINK"      :   "DLINK",
+        "C":   "CC",
+        BINARY_FILE_TYPE_LIB:   "SLINK",
+        "LINK":   "DLINK",
     }
 
-
-    ## Constructor of InfBuildData
+    # Constructor of InfBuildData
     #
     #  Initialize object of InfBuildData
     #
@@ -105,6 +112,7 @@ class InfBuildData(ModuleBuildClassObject):
     #   @param      Platform        The name of platform employing this module
     #   @param      Macros          Macros used for replacement in DSC file
     #
+
     def __init__(self, FilePath, RawData, BuildDatabase, Arch=TAB_ARCH_COMMON, Target=None, Toolchain=None):
         self.MetaFile = FilePath
         self._ModuleDir = FilePath.Dir
@@ -145,7 +153,7 @@ class InfBuildData(ModuleBuildClassObject):
         self.LibInstances = []
         self.ReferenceModules = set()
 
-    def SetReferenceModule(self,Module):
+    def SetReferenceModule(self, Module):
         self.ReferenceModules.add(Module)
         return self
 
@@ -157,22 +165,22 @@ class InfBuildData(ModuleBuildClassObject):
     def __getitem__(self, key):
         return self.__dict__[self._PROPERTY_[key]]
 
-    ## "in" test support
+    # "in" test support
     def __contains__(self, key):
         return key in self._PROPERTY_
 
-    ## Get current effective macros
+    # Get current effective macros
     @cached_property
     def _Macros(self):
         RetVal = {}
         return RetVal
 
-    ## Get architecture
+    # Get architecture
     @cached_property
     def Arch(self):
         return self._Arch
 
-    ## Return the name of platform employing this module
+    # Return the name of platform employing this module
     @cached_property
     def Platform(self):
         return self._Platform
@@ -185,15 +193,17 @@ class InfBuildData(ModuleBuildClassObject):
     def TailComments(self):
         return [a[0] for a in self._RawData[MODEL_META_DATA_TAIL_COMMENT]]
 
-    ## Retrieve all information in [Defines] section
+    # Retrieve all information in [Defines] section
     #
     #   (Retrieving all [Defines] information in one-shot is just to save time.)
     #
     @cached_class_function
     def _GetHeaderInfo(self):
-        RecordList = self._RawData[MODEL_META_DATA_HEADER, self._Arch, self._Platform]
+        RecordList = self._RawData[MODEL_META_DATA_HEADER,
+                                   self._Arch, self._Platform]
         for Record in RecordList:
-            Name, Value = Record[1], ReplaceMacro(Record[2], self._Macros, False)
+            Name, Value = Record[1], ReplaceMacro(
+                Record[2], self._Macros, False)
             # items defined _PROPERTY_ don't need additional processing
             if Name in self:
                 self[Name] = Value
@@ -208,7 +218,8 @@ class InfBuildData(ModuleBuildClassObject):
                 self._Specification[Name] = GetHexVerValue(Value)
                 if self._Specification[Name] is None:
                     EdkLogger.error("build", FORMAT_NOT_SUPPORTED,
-                                    "'%s' format is not supported for %s" % (Value, Name),
+                                    "'%s' format is not supported for %s" % (
+                                        Value, Name),
                                     File=self.MetaFile, Line=Record[-1])
             elif Name == 'LIBRARY_CLASS':
                 if self._LibraryClass is None:
@@ -219,7 +230,8 @@ class InfBuildData(ModuleBuildClassObject):
                     SupModuleList = GetSplitValueList(ValueList[1], ' ')
                 else:
                     SupModuleList = SUP_MODULE_LIST
-                self._LibraryClass.append(LibraryClassObject(LibraryClass, SupModuleList))
+                self._LibraryClass.append(
+                    LibraryClassObject(LibraryClass, SupModuleList))
             elif Name == 'ENTRY_POINT':
                 if self._ModuleEntryPointList is None:
                     self._ModuleEntryPointList = []
@@ -266,38 +278,45 @@ class InfBuildData(ModuleBuildClassObject):
             EdkLogger.error("build", ATTRIBUTE_NOT_AVAILABLE,
                             "MODULE_TYPE is not given", File=self.MetaFile)
         if self._ModuleType not in SUP_MODULE_LIST:
-            RecordList = self._RawData[MODEL_META_DATA_HEADER, self._Arch, self._Platform]
+            RecordList = self._RawData[MODEL_META_DATA_HEADER,
+                                       self._Arch, self._Platform]
             for Record in RecordList:
                 Name = Record[1]
                 if Name == "MODULE_TYPE":
                     LineNo = Record[6]
                     break
             EdkLogger.error("build", FORMAT_NOT_SUPPORTED,
-                            "MODULE_TYPE %s is not supported for EDK II, valid values are:\n %s" % (self._ModuleType, ' '.join(l for l in SUP_MODULE_LIST)),
+                            "MODULE_TYPE %s is not supported for EDK II, valid values are:\n %s" % (
+                                self._ModuleType, ' '.join(l for l in SUP_MODULE_LIST)),
                             File=self.MetaFile, Line=LineNo)
         if (self._Specification is None) or (not 'PI_SPECIFICATION_VERSION' in self._Specification) or (int(self._Specification['PI_SPECIFICATION_VERSION'], 16) < 0x0001000A):
             if self._ModuleType == SUP_MODULE_SMM_CORE:
-                EdkLogger.error("build", FORMAT_NOT_SUPPORTED, "SMM_CORE module type can't be used in the module with PI_SPECIFICATION_VERSION less than 0x0001000A", File=self.MetaFile)
+                EdkLogger.error("build", FORMAT_NOT_SUPPORTED,
+                                "SMM_CORE module type can't be used in the module with PI_SPECIFICATION_VERSION less than 0x0001000A", File=self.MetaFile)
         if (self._Specification is None) or (not 'PI_SPECIFICATION_VERSION' in self._Specification) or (int(self._Specification['PI_SPECIFICATION_VERSION'], 16) < 0x00010032):
             if self._ModuleType == SUP_MODULE_MM_CORE_STANDALONE:
-                EdkLogger.error("build", FORMAT_NOT_SUPPORTED, "MM_CORE_STANDALONE module type can't be used in the module with PI_SPECIFICATION_VERSION less than 0x00010032", File=self.MetaFile)
+                EdkLogger.error("build", FORMAT_NOT_SUPPORTED,
+                                "MM_CORE_STANDALONE module type can't be used in the module with PI_SPECIFICATION_VERSION less than 0x00010032", File=self.MetaFile)
             if self._ModuleType == SUP_MODULE_MM_STANDALONE:
-                EdkLogger.error("build", FORMAT_NOT_SUPPORTED, "MM_STANDALONE module type can't be used in the module with PI_SPECIFICATION_VERSION less than 0x00010032", File=self.MetaFile)
+                EdkLogger.error("build", FORMAT_NOT_SUPPORTED,
+                                "MM_STANDALONE module type can't be used in the module with PI_SPECIFICATION_VERSION less than 0x00010032", File=self.MetaFile)
         if 'PCI_DEVICE_ID' in self._Defs and 'PCI_VENDOR_ID' in self._Defs \
            and 'PCI_CLASS_CODE' in self._Defs and 'PCI_REVISION' in self._Defs:
             self._BuildType = 'UEFI_OPTIONROM'
             if 'PCI_COMPRESS' in self._Defs:
                 if self._Defs['PCI_COMPRESS'] not in ('TRUE', 'FALSE'):
-                    EdkLogger.error("build", FORMAT_INVALID, "Expected TRUE/FALSE for PCI_COMPRESS: %s" % self.MetaFile)
+                    EdkLogger.error(
+                        "build", FORMAT_INVALID, "Expected TRUE/FALSE for PCI_COMPRESS: %s" % self.MetaFile)
 
         elif 'UEFI_HII_RESOURCE_SECTION' in self._Defs \
-           and self._Defs['UEFI_HII_RESOURCE_SECTION'] == 'TRUE':
+                and self._Defs['UEFI_HII_RESOURCE_SECTION'] == 'TRUE':
             self._BuildType = 'UEFI_HII'
         else:
             self._BuildType = self._ModuleType.upper()
 
         if self._DxsFile:
-            File = PathClass(NormPath(self._DxsFile), self._ModuleDir, Arch=self._Arch)
+            File = PathClass(NormPath(self._DxsFile),
+                             self._ModuleDir, Arch=self._Arch)
             # check the file validation
             ErrorCode, ErrorInfo = File.Validate(".dxs", CaseSensitive=False)
             if ErrorCode != 0:
@@ -307,11 +326,12 @@ class InfBuildData(ModuleBuildClassObject):
                 self._DependencyFileList = []
             self._DependencyFileList.append(File)
 
-    ## Retrieve file version
+    # Retrieve file version
     @cached_property
     def AutoGenVersion(self):
         RetVal = 0x00010000
-        RecordList = self._RawData[MODEL_META_DATA_HEADER, self._Arch, self._Platform]
+        RecordList = self._RawData[MODEL_META_DATA_HEADER,
+                                   self._Arch, self._Platform]
         for Record in RecordList:
             if Record[1] == TAB_INF_DEFINES_INF_VERSION:
                 if '.' in Record[2]:
@@ -324,16 +344,17 @@ class InfBuildData(ModuleBuildClassObject):
                 break
         return RetVal
 
-    ## Retrieve BASE_NAME
+    # Retrieve BASE_NAME
     @cached_property
     def BaseName(self):
         if self._BaseName is None:
             self._GetHeaderInfo()
             if self._BaseName is None:
-                EdkLogger.error('build', ATTRIBUTE_NOT_AVAILABLE, "No BASE_NAME name", File=self.MetaFile)
+                EdkLogger.error('build', ATTRIBUTE_NOT_AVAILABLE,
+                                "No BASE_NAME name", File=self.MetaFile)
         return self._BaseName
 
-    ## Retrieve DxsFile
+    # Retrieve DxsFile
     @cached_property
     def DxsFile(self):
         if self._DxsFile is None:
@@ -342,7 +363,7 @@ class InfBuildData(ModuleBuildClassObject):
                 self._DxsFile = ''
         return self._DxsFile
 
-    ## Retrieve MODULE_TYPE
+    # Retrieve MODULE_TYPE
     @cached_property
     def ModuleType(self):
         if self._ModuleType is None:
@@ -353,7 +374,7 @@ class InfBuildData(ModuleBuildClassObject):
                 self._ModuleType = SUP_MODULE_USER_DEFINED
         return self._ModuleType
 
-    ## Retrieve COMPONENT_TYPE
+    # Retrieve COMPONENT_TYPE
     @cached_property
     def ComponentType(self):
         if self._ComponentType is None:
@@ -362,7 +383,7 @@ class InfBuildData(ModuleBuildClassObject):
                 self._ComponentType = SUP_MODULE_USER_DEFINED
         return self._ComponentType
 
-    ## Retrieve "BUILD_TYPE"
+    # Retrieve "BUILD_TYPE"
     @cached_property
     def BuildType(self):
         if self._BuildType is None:
@@ -371,7 +392,7 @@ class InfBuildData(ModuleBuildClassObject):
                 self._BuildType = SUP_MODULE_BASE
         return self._BuildType
 
-    ## Retrieve file guid
+    # Retrieve file guid
     @cached_property
     def Guid(self):
         if self._Guid is None:
@@ -380,7 +401,7 @@ class InfBuildData(ModuleBuildClassObject):
                 self._Guid = '00000000-0000-0000-0000-000000000000'
         return self._Guid
 
-    ## Retrieve module version
+    # Retrieve module version
     @cached_property
     def Version(self):
         if self._Version is None:
@@ -389,7 +410,7 @@ class InfBuildData(ModuleBuildClassObject):
                 self._Version = '0.0'
         return self._Version
 
-    ## Retrieve PCD_IS_DRIVER
+    # Retrieve PCD_IS_DRIVER
     @cached_property
     def PcdIsDriver(self):
         if self._PcdIsDriver is None:
@@ -398,7 +419,7 @@ class InfBuildData(ModuleBuildClassObject):
                 self._PcdIsDriver = ''
         return self._PcdIsDriver
 
-    ## Retrieve SHADOW
+    # Retrieve SHADOW
     @cached_property
     def Shadow(self):
         if self._Shadow is None:
@@ -409,7 +430,7 @@ class InfBuildData(ModuleBuildClassObject):
                 self._Shadow = False
         return self._Shadow
 
-    ## Retrieve CUSTOM_MAKEFILE
+    # Retrieve CUSTOM_MAKEFILE
     @cached_property
     def CustomMakefile(self):
         if self._CustomMakefile is None:
@@ -418,7 +439,7 @@ class InfBuildData(ModuleBuildClassObject):
                 self._CustomMakefile = {}
         return self._CustomMakefile
 
-    ## Retrieve EFI_SPECIFICATION_VERSION
+    # Retrieve EFI_SPECIFICATION_VERSION
     @cached_property
     def Specification(self):
         if self._Specification is None:
@@ -427,7 +448,7 @@ class InfBuildData(ModuleBuildClassObject):
                 self._Specification = {}
         return self._Specification
 
-    ## Retrieve LIBRARY_CLASS
+    # Retrieve LIBRARY_CLASS
     @cached_property
     def LibraryClass(self):
         if self._LibraryClass is None:
@@ -436,7 +457,7 @@ class InfBuildData(ModuleBuildClassObject):
                 self._LibraryClass = []
         return self._LibraryClass
 
-    ## Retrieve ENTRY_POINT
+    # Retrieve ENTRY_POINT
     @cached_property
     def ModuleEntryPointList(self):
         if self._ModuleEntryPointList is None:
@@ -445,7 +466,7 @@ class InfBuildData(ModuleBuildClassObject):
                 self._ModuleEntryPointList = []
         return self._ModuleEntryPointList
 
-    ## Retrieve UNLOAD_IMAGE
+    # Retrieve UNLOAD_IMAGE
     @cached_property
     def ModuleUnloadImageList(self):
         if self._ModuleUnloadImageList is None:
@@ -454,7 +475,7 @@ class InfBuildData(ModuleBuildClassObject):
                 self._ModuleUnloadImageList = []
         return self._ModuleUnloadImageList
 
-    ## Retrieve CONSTRUCTOR
+    # Retrieve CONSTRUCTOR
     @cached_property
     def ConstructorList(self):
         if self._ConstructorList is None:
@@ -463,7 +484,7 @@ class InfBuildData(ModuleBuildClassObject):
                 self._ConstructorList = []
         return self._ConstructorList
 
-    ## Retrieve DESTRUCTOR
+    # Retrieve DESTRUCTOR
     @cached_property
     def DestructorList(self):
         if self._DestructorList is None:
@@ -472,17 +493,18 @@ class InfBuildData(ModuleBuildClassObject):
                 self._DestructorList = []
         return self._DestructorList
 
-    ## Retrieve definies other than above ones
+    # Retrieve definies other than above ones
     @cached_property
     def Defines(self):
         self._GetHeaderInfo()
         return self._Defs
 
-    ## Retrieve binary files
+    # Retrieve binary files
     @cached_class_function
     def _GetBinaries(self):
         RetVal = []
-        RecordList = self._RawData[MODEL_EFI_BINARY_FILE, self._Arch, self._Platform]
+        RecordList = self._RawData[MODEL_EFI_BINARY_FILE,
+                                   self._Arch, self._Platform]
         Macros = self._Macros
         Macros['PROCESSOR'] = self._Arch
         for Record in RecordList:
@@ -497,25 +519,28 @@ class InfBuildData(ModuleBuildClassObject):
                 if len(TokenList) > 1:
                     FeatureFlag = Record[1:]
 
-            File = PathClass(NormPath(Record[1], Macros), self._ModuleDir, '', FileType, True, self._Arch, '', Target)
+            File = PathClass(NormPath(
+                Record[1], Macros), self._ModuleDir, '', FileType, True, self._Arch, '', Target)
             # check the file validation
             ErrorCode, ErrorInfo = File.Validate()
             if ErrorCode != 0:
-                EdkLogger.error('build', ErrorCode, ExtraData=ErrorInfo, File=self.MetaFile, Line=LineNo)
+                EdkLogger.error(
+                    'build', ErrorCode, ExtraData=ErrorInfo, File=self.MetaFile, Line=LineNo)
             RetVal.append(File)
         return RetVal
 
-    ## Retrieve binary files with error check.
+    # Retrieve binary files with error check.
     @cached_property
     def Binaries(self):
         RetVal = self._GetBinaries()
         if GlobalData.gIgnoreSource and not RetVal:
             ErrorInfo = "The INF file does not contain any RetVal to use in creating the image\n"
-            EdkLogger.error('build', RESOURCE_NOT_AVAILABLE, ExtraData=ErrorInfo, File=self.MetaFile)
+            EdkLogger.error('build', RESOURCE_NOT_AVAILABLE,
+                            ExtraData=ErrorInfo, File=self.MetaFile)
 
         return RetVal
 
-    ## Retrieve source files
+    # Retrieve source files
     @cached_property
     def Sources(self):
         self._GetHeaderInfo()
@@ -524,7 +549,8 @@ class InfBuildData(ModuleBuildClassObject):
             return []
 
         RetVal = []
-        RecordList = self._RawData[MODEL_EFI_SOURCE_FILE, self._Arch, self._Platform]
+        RecordList = self._RawData[MODEL_EFI_SOURCE_FILE,
+                                   self._Arch, self._Platform]
         Macros = self._Macros
         for Record in RecordList:
             LineNo = Record[-1]
@@ -543,7 +569,8 @@ class InfBuildData(ModuleBuildClassObject):
             # check the file validation
             ErrorCode, ErrorInfo = File.Validate()
             if ErrorCode != 0:
-                EdkLogger.error('build', ErrorCode, ExtraData=ErrorInfo, File=self.MetaFile, Line=LineNo)
+                EdkLogger.error(
+                    'build', ErrorCode, ExtraData=ErrorInfo, File=self.MetaFile, Line=LineNo)
 
             RetVal.append(File)
         # add any previously found dependency files to the source list
@@ -551,11 +578,12 @@ class InfBuildData(ModuleBuildClassObject):
             RetVal.extend(self._DependencyFileList)
         return RetVal
 
-    ## Retrieve library classes employed by this module
+    # Retrieve library classes employed by this module
     @cached_property
     def LibraryClasses(self):
         RetVal = OrderedDict()
-        RecordList = self._RawData[MODEL_EFI_LIBRARY_CLASS, self._Arch, self._Platform]
+        RecordList = self._RawData[MODEL_EFI_LIBRARY_CLASS,
+                                   self._Arch, self._Platform]
         for Record in RecordList:
             Lib = Record[0]
             Instance = Record[1]
@@ -566,11 +594,12 @@ class InfBuildData(ModuleBuildClassObject):
                 RetVal[Lib] = None
         return RetVal
 
-    ## Retrieve library names (for Edk.x style of modules)
+    # Retrieve library names (for Edk.x style of modules)
     @cached_property
     def Libraries(self):
         RetVal = []
-        RecordList = self._RawData[MODEL_EFI_LIBRARY_INSTANCE, self._Arch, self._Platform]
+        RecordList = self._RawData[MODEL_EFI_LIBRARY_INSTANCE,
+                                   self._Arch, self._Platform]
         for Record in RecordList:
             LibraryName = ReplaceMacro(Record[0], self._Macros, False)
             # in case of name with '.lib' extension, which is unusual in Edk.x inf
@@ -584,12 +613,13 @@ class InfBuildData(ModuleBuildClassObject):
         self.Protocols
         return self._ProtocolComments
 
-    ## Retrieve protocols consumed/produced by this module
+    # Retrieve protocols consumed/produced by this module
     @cached_property
     def Protocols(self):
         RetVal = OrderedDict()
         self._ProtocolComments = OrderedDict()
-        RecordList = self._RawData[MODEL_EFI_PROTOCOL, self._Arch, self._Platform]
+        RecordList = self._RawData[MODEL_EFI_PROTOCOL,
+                                   self._Arch, self._Platform]
         for Record in RecordList:
             CName = Record[0]
             Value = _ProtocolValue(CName, self.Packages, self.MetaFile.Path)
@@ -599,7 +629,8 @@ class InfBuildData(ModuleBuildClassObject):
                                 "Value of Protocol [%s] is not found under [Protocols] section in" % CName,
                                 ExtraData=PackageList, File=self.MetaFile, Line=Record[-1])
             RetVal[CName] = Value
-            CommentRecords = self._RawData[MODEL_META_DATA_COMMENT, self._Arch, self._Platform, Record[5]]
+            CommentRecords = self._RawData[MODEL_META_DATA_COMMENT,
+                                           self._Arch, self._Platform, Record[5]]
             self._ProtocolComments[CName] = [a[0] for a in CommentRecords]
         return RetVal
 
@@ -608,7 +639,7 @@ class InfBuildData(ModuleBuildClassObject):
         self.Ppis
         return self._PpiComments
 
-    ## Retrieve PPIs consumed/produced by this module
+    # Retrieve PPIs consumed/produced by this module
     @cached_property
     def Ppis(self):
         RetVal = OrderedDict()
@@ -623,7 +654,8 @@ class InfBuildData(ModuleBuildClassObject):
                                 "Value of PPI [%s] is not found under [Ppis] section in " % CName,
                                 ExtraData=PackageList, File=self.MetaFile, Line=Record[-1])
             RetVal[CName] = Value
-            CommentRecords = self._RawData[MODEL_META_DATA_COMMENT, self._Arch, self._Platform, Record[5]]
+            CommentRecords = self._RawData[MODEL_META_DATA_COMMENT,
+                                           self._Arch, self._Platform, Record[5]]
             self._PpiComments[CName] = [a[0] for a in CommentRecords]
         return RetVal
 
@@ -632,7 +664,7 @@ class InfBuildData(ModuleBuildClassObject):
         self.Guids
         return self._GuidComments
 
-    ## Retrieve GUIDs consumed/produced by this module
+    # Retrieve GUIDs consumed/produced by this module
     @cached_property
     def Guids(self):
         RetVal = OrderedDict()
@@ -647,17 +679,20 @@ class InfBuildData(ModuleBuildClassObject):
                                 "Value of Guid [%s] is not found under [Guids] section in" % CName,
                                 ExtraData=PackageList, File=self.MetaFile, Line=Record[-1])
             RetVal[CName] = Value
-            CommentRecords = self._RawData[MODEL_META_DATA_COMMENT, self._Arch, self._Platform, Record[5]]
+            CommentRecords = self._RawData[MODEL_META_DATA_COMMENT,
+                                           self._Arch, self._Platform, Record[5]]
             self._GuidComments[CName] = [a[0] for a in CommentRecords]
 
-        for Type in [MODEL_PCD_FIXED_AT_BUILD,MODEL_PCD_PATCHABLE_IN_MODULE,MODEL_PCD_FEATURE_FLAG,MODEL_PCD_DYNAMIC,MODEL_PCD_DYNAMIC_EX]:
+        for Type in [MODEL_PCD_FIXED_AT_BUILD, MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_FEATURE_FLAG, MODEL_PCD_DYNAMIC, MODEL_PCD_DYNAMIC_EX]:
             RecordList = self._RawData[Type, self._Arch, self._Platform]
             for TokenSpaceGuid, _, _, _, _, _, LineNo in RecordList:
                 # get the guid value
                 if TokenSpaceGuid not in RetVal:
-                    Value = GuidValue(TokenSpaceGuid, self.Packages, self.MetaFile.Path)
+                    Value = GuidValue(
+                        TokenSpaceGuid, self.Packages, self.MetaFile.Path)
                     if Value is None:
-                        PackageList = "\n\t".join(str(P) for P in self.Packages)
+                        PackageList = "\n\t".join(str(P)
+                                                  for P in self.Packages)
                         EdkLogger.error('build', RESOURCE_NOT_AVAILABLE,
                                         "Value of Guid [%s] is not found under [Guids] section in" % TokenSpaceGuid,
                                         ExtraData=PackageList, File=self.MetaFile, Line=LineNo)
@@ -665,13 +700,15 @@ class InfBuildData(ModuleBuildClassObject):
                     self._GuidsUsedByPcd[TokenSpaceGuid] = Value
         return RetVal
 
-    ## Retrieve include paths necessary for this module (for Edk.x style of modules)
+    # Retrieve include paths necessary for this module (for Edk.x style of modules)
     @cached_property
     def Includes(self):
         RetVal = []
         Macros = self._Macros
-        Macros['PROCESSOR'] = GlobalData.gEdkGlobal.get('PROCESSOR', self._Arch)
-        RecordList = self._RawData[MODEL_EFI_INCLUDE, self._Arch, self._Platform]
+        Macros['PROCESSOR'] = GlobalData.gEdkGlobal.get(
+            'PROCESSOR', self._Arch)
+        RecordList = self._RawData[MODEL_EFI_INCLUDE,
+                                   self._Arch, self._Platform]
         for Record in RecordList:
             File = NormPath(Record[0], Macros)
             if File[0] == '.':
@@ -683,30 +720,34 @@ class InfBuildData(ModuleBuildClassObject):
                 RetVal.append(File)
         return RetVal
 
-    ## Retrieve packages this module depends on
+    # Retrieve packages this module depends on
     @cached_property
     def Packages(self):
         RetVal = []
-        RecordList = self._RawData[MODEL_META_DATA_PACKAGE, self._Arch, self._Platform]
+        RecordList = self._RawData[MODEL_META_DATA_PACKAGE,
+                                   self._Arch, self._Platform]
         Macros = self._Macros
         for Record in RecordList:
-            File = PathClass(NormPath(Record[0], Macros), GlobalData.gWorkspace, Arch=self._Arch)
+            File = PathClass(
+                NormPath(Record[0], Macros), GlobalData.gWorkspace, Arch=self._Arch)
             # check the file validation
             ErrorCode, ErrorInfo = File.Validate('.dec')
             if ErrorCode != 0:
                 LineNo = Record[-1]
-                EdkLogger.error('build', ErrorCode, ExtraData=ErrorInfo, File=self.MetaFile, Line=LineNo)
+                EdkLogger.error(
+                    'build', ErrorCode, ExtraData=ErrorInfo, File=self.MetaFile, Line=LineNo)
             # parse this package now. we need it to get protocol/ppi/guid value
-            RetVal.append(self._Bdb[File, self._Arch, self._Target, self._Toolchain])
+            RetVal.append(self._Bdb[File, self._Arch,
+                          self._Target, self._Toolchain])
         return RetVal
 
-    ## Retrieve PCD comments
+    # Retrieve PCD comments
     @cached_property
     def PcdComments(self):
         self.Pcds
         return self._PcdComments
 
-    ## Retrieve PCDs used in this module
+    # Retrieve PCDs used in this module
     @cached_property
     def Pcds(self):
         self._PcdComments = OrderedDict()
@@ -722,6 +763,7 @@ class InfBuildData(ModuleBuildClassObject):
     def ModulePcdList(self):
         RetVal = self.Pcds
         return RetVal
+
     @cached_property
     def LibraryPcdList(self):
         if bool(self.LibraryClass):
@@ -737,21 +779,23 @@ class InfBuildData(ModuleBuildClassObject):
                 PcdsInLibrary[Key] = copy.copy(Library.Pcds[Key])
             RetVal[Library] = PcdsInLibrary
         return RetVal
+
     @cached_property
     def PcdsName(self):
         PcdsName = set()
-        for Type in (MODEL_PCD_FIXED_AT_BUILD,MODEL_PCD_PATCHABLE_IN_MODULE,MODEL_PCD_FEATURE_FLAG,MODEL_PCD_DYNAMIC,MODEL_PCD_DYNAMIC_EX):
+        for Type in (MODEL_PCD_FIXED_AT_BUILD, MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_FEATURE_FLAG, MODEL_PCD_DYNAMIC, MODEL_PCD_DYNAMIC_EX):
             RecordList = self._RawData[Type, self._Arch, self._Platform]
             for TokenSpaceGuid, PcdCName, _, _, _, _, _ in RecordList:
                 PcdsName.add((PcdCName, TokenSpaceGuid))
         return PcdsName
 
-    ## Retrieve build options specific to this module
+    # Retrieve build options specific to this module
     @cached_property
     def BuildOptions(self):
         if self._BuildOptions is None:
             self._BuildOptions = OrderedDict()
-            RecordList = self._RawData[MODEL_META_DATA_BUILD_OPTION, self._Arch, self._Platform]
+            RecordList = self._RawData[MODEL_META_DATA_BUILD_OPTION,
+                                       self._Arch, self._Platform]
             for Record in RecordList:
                 ToolChainFamily = Record[0]
                 ToolChain = Record[1]
@@ -761,10 +805,11 @@ class InfBuildData(ModuleBuildClassObject):
                 else:
                     # concatenate the option string if they're for the same tool
                     OptionString = self._BuildOptions[ToolChainFamily, ToolChain]
-                    self._BuildOptions[ToolChainFamily, ToolChain] = OptionString + " " + Option
+                    self._BuildOptions[ToolChainFamily,
+                                       ToolChain] = OptionString + " " + Option
         return self._BuildOptions
 
-    ## Retrieve dependency expression
+    # Retrieve dependency expression
     @cached_property
     def Depex(self):
         RetVal = tdict(False, 2)
@@ -777,8 +822,8 @@ class InfBuildData(ModuleBuildClassObject):
         # PEIM and DXE drivers must have a valid [Depex] section
         if len(self.LibraryClass) == 0 and len(RecordList) == 0:
             if self.ModuleType == SUP_MODULE_DXE_DRIVER or self.ModuleType == SUP_MODULE_PEIM or self.ModuleType == SUP_MODULE_DXE_SMM_DRIVER or \
-                self.ModuleType == SUP_MODULE_DXE_SAL_DRIVER or self.ModuleType == SUP_MODULE_DXE_RUNTIME_DRIVER:
-                EdkLogger.error('build', RESOURCE_NOT_AVAILABLE, "No [Depex] section or no valid expression in [Depex] section for [%s] module" \
+                    self.ModuleType == SUP_MODULE_DXE_SAL_DRIVER or self.ModuleType == SUP_MODULE_DXE_RUNTIME_DRIVER:
+                EdkLogger.error('build', RESOURCE_NOT_AVAILABLE, "No [Depex] section or no valid expression in [Depex] section for [%s] module"
                                 % self.ModuleType, File=self.MetaFile)
 
         if len(RecordList) != 0 and (self.ModuleType == SUP_MODULE_USER_DEFINED or self.ModuleType == SUP_MODULE_HOST_APPLICATION):
@@ -811,21 +856,27 @@ class InfBuildData(ModuleBuildClassObject):
                     # it use the Fixed PCD format
                     if '.' in Token:
                         if tuple(Token.split('.')[::-1]) not in self.Pcds:
-                            EdkLogger.error('build', RESOURCE_NOT_AVAILABLE, "PCD [{}] used in [Depex] section should be listed in module PCD section".format(Token), File=self.MetaFile, Line=Record[-1])
+                            EdkLogger.error('build', RESOURCE_NOT_AVAILABLE, "PCD [{}] used in [Depex] section should be listed in module PCD section".format(
+                                Token), File=self.MetaFile, Line=Record[-1])
                         else:
                             if self.Pcds[tuple(Token.split('.')[::-1])].DatumType != TAB_VOID:
-                                EdkLogger.error('build', FORMAT_INVALID, "PCD [{}] used in [Depex] section should be VOID* datum type".format(Token), File=self.MetaFile, Line=Record[-1])
+                                EdkLogger.error('build', FORMAT_INVALID, "PCD [{}] used in [Depex] section should be VOID* datum type".format(
+                                    Token), File=self.MetaFile, Line=Record[-1])
                         Value = Token
                     else:
                         # get the GUID value now
-                        Value = _ProtocolValue(Token, self.Packages, self.MetaFile.Path)
+                        Value = _ProtocolValue(
+                            Token, self.Packages, self.MetaFile.Path)
                         if Value is None:
-                            Value = _PpiValue(Token, self.Packages, self.MetaFile.Path)
+                            Value = _PpiValue(
+                                Token, self.Packages, self.MetaFile.Path)
                             if Value is None:
-                                Value = GuidValue(Token, self.Packages, self.MetaFile.Path)
+                                Value = GuidValue(
+                                    Token, self.Packages, self.MetaFile.Path)
 
                     if Value is None:
-                        PackageList = "\n\t".join(str(P) for P in self.Packages)
+                        PackageList = "\n\t".join(str(P)
+                                                  for P in self.Packages)
                         EdkLogger.error('build', RESOURCE_NOT_AVAILABLE,
                                         "Value of [%s] is not found in" % Token,
                                         ExtraData=PackageList, File=self.MetaFile, Line=Record[-1])
@@ -834,7 +885,7 @@ class InfBuildData(ModuleBuildClassObject):
             RetVal[Arch, ModuleType] = TemporaryDictionary[Arch, ModuleType]
         return RetVal
 
-    ## Retrieve dependency expression
+    # Retrieve dependency expression
     @cached_property
     def DepexExpression(self):
         RetVal = tdict(False, 2)
@@ -848,36 +899,41 @@ class InfBuildData(ModuleBuildClassObject):
             if (Arch, ModuleType) not in TemporaryDictionary:
                 TemporaryDictionary[Arch, ModuleType] = ''
             for Token in TokenList:
-                TemporaryDictionary[Arch, ModuleType] = TemporaryDictionary[Arch, ModuleType] + Token.strip() + ' '
+                TemporaryDictionary[Arch, ModuleType] = TemporaryDictionary[Arch,
+                                                                            ModuleType] + Token.strip() + ' '
         for Arch, ModuleType in TemporaryDictionary:
             RetVal[Arch, ModuleType] = TemporaryDictionary[Arch, ModuleType]
         return RetVal
+
     def LocalPkg(self):
         module_path = self.MetaFile.File
         subdir = os.path.split(module_path)[0]
         TopDir = ""
         while subdir:
-            subdir,TopDir = os.path.split(subdir)
+            subdir, TopDir = os.path.split(subdir)
 
-        for file_name in os.listdir(os.path.join(self.MetaFile.Root,TopDir)):
+        for file_name in os.listdir(os.path.join(self.MetaFile.Root, TopDir)):
             if file_name.upper().endswith("DEC"):
-                pkg = os.path.join(TopDir,file_name)
+                pkg = os.path.join(TopDir, file_name)
         return pkg
+
     @cached_class_function
     def GetGuidsUsedByPcd(self):
         self.Guid
         return self._GuidsUsedByPcd
 
-    ## Retrieve PCD for given type
+    # Retrieve PCD for given type
     def _GetPcd(self, Type):
         Pcds = OrderedDict()
         PcdDict = tdict(True, 4)
         PcdList = []
         RecordList = self._RawData[Type, self._Arch, self._Platform]
         for TokenSpaceGuid, PcdCName, Setting, Arch, Platform, Id, LineNo in RecordList:
-            PcdDict[Arch, Platform, PcdCName, TokenSpaceGuid] = (Setting, LineNo)
+            PcdDict[Arch, Platform, PcdCName,
+                    TokenSpaceGuid] = (Setting, LineNo)
             PcdList.append((PcdCName, TokenSpaceGuid))
-            CommentRecords = self._RawData[MODEL_META_DATA_COMMENT, self._Arch, self._Platform, Id]
+            CommentRecords = self._RawData[MODEL_META_DATA_COMMENT,
+                                           self._Arch, self._Platform, Id]
             Comments = []
             for CmtRec in CommentRecords:
                 Comments.append(CmtRec[0])
@@ -887,23 +943,24 @@ class InfBuildData(ModuleBuildClassObject):
         _GuidDict = self.Guids.copy()
         for PcdCName, TokenSpaceGuid in PcdList:
             PcdRealName = PcdCName
-            Setting, LineNo = PcdDict[self._Arch, self.Platform, PcdCName, TokenSpaceGuid]
+            Setting, LineNo = PcdDict[self._Arch,
+                                      self.Platform, PcdCName, TokenSpaceGuid]
             if Setting is None:
                 continue
             ValueList = AnalyzePcdData(Setting)
             DefaultValue = ValueList[0]
             Pcd = PcdClassObject(
-                    PcdCName,
-                    TokenSpaceGuid,
-                    '',
-                    '',
-                    DefaultValue,
-                    '',
-                    '',
-                    {},
-                    False,
-                    self.Guids[TokenSpaceGuid]
-                    )
+                PcdCName,
+                TokenSpaceGuid,
+                '',
+                '',
+                DefaultValue,
+                '',
+                '',
+                {},
+                False,
+                self.Guids[TokenSpaceGuid]
+            )
             if Type == MODEL_PCD_PATCHABLE_IN_MODULE and ValueList[1]:
                 # Patch PCD: TokenSpace.PcdCName|Value|Offset
                 Pcd.Offset = ValueList[1]
@@ -916,11 +973,13 @@ class InfBuildData(ModuleBuildClassObject):
                                 Pcd_Type = item[0].split('_')[-1]
                                 if Pcd_Type == Package.Pcds[key].Type:
                                     Value = Package.Pcds[key]
-                                    Value.TokenCName = Package.Pcds[key].TokenCName + '_' + Pcd_Type
+                                    Value.TokenCName = Package.Pcds[key].TokenCName + \
+                                        '_' + Pcd_Type
                                     if len(key) == 2:
                                         newkey = (Value.TokenCName, key[1])
                                     elif len(key) == 3:
-                                        newkey = (Value.TokenCName, key[1], key[2])
+                                        newkey = (Value.TokenCName,
+                                                  key[1], key[2])
                                     del Package.Pcds[key]
                                     Package.Pcds[newkey] = Value
                                     break
@@ -971,7 +1030,8 @@ class InfBuildData(ModuleBuildClassObject):
                         pass
 
                 if (PcdCName, TokenSpaceGuid, PcdType) in Package.Pcds:
-                    PcdInPackage = Package.Pcds[PcdCName, TokenSpaceGuid, PcdType]
+                    PcdInPackage = Package.Pcds[PcdCName,
+                                                TokenSpaceGuid, PcdType]
                     Pcd.Type = PcdType
                     Pcd.TokenValue = PcdInPackage.TokenValue
 
@@ -980,48 +1040,53 @@ class InfBuildData(ModuleBuildClassObject):
                     #
                     if Pcd.TokenValue is None or Pcd.TokenValue == "":
                         EdkLogger.error(
-                                'build',
-                                FORMAT_INVALID,
-                                "No TokenValue for PCD [%s.%s] in [%s]!" % (TokenSpaceGuid, PcdRealName, str(Package)),
-                                File=self.MetaFile, Line=LineNo,
-                                ExtraData=None
-                                )
+                            'build',
+                            FORMAT_INVALID,
+                            "No TokenValue for PCD [%s.%s] in [%s]!" % (
+                                TokenSpaceGuid, PcdRealName, str(Package)),
+                            File=self.MetaFile, Line=LineNo,
+                            ExtraData=None
+                        )
                     #
                     # Check hexadecimal token value length and format.
                     #
-                    ReIsValidPcdTokenValue = re.compile(r"^[0][x|X][0]*[0-9a-fA-F]{1,8}$", re.DOTALL)
+                    ReIsValidPcdTokenValue = re.compile(
+                        r"^[0][x|X][0]*[0-9a-fA-F]{1,8}$", re.DOTALL)
                     if Pcd.TokenValue.startswith("0x") or Pcd.TokenValue.startswith("0X"):
                         if ReIsValidPcdTokenValue.match(Pcd.TokenValue) is None:
                             EdkLogger.error(
+                                'build',
+                                FORMAT_INVALID,
+                                "The format of TokenValue [%s] of PCD [%s.%s] in [%s] is invalid:" % (
+                                    Pcd.TokenValue, TokenSpaceGuid, PcdRealName, str(Package)),
+                                File=self.MetaFile, Line=LineNo,
+                                ExtraData=None
+                            )
+
+                    #
+                    # Check decimal token value length and format.
+                    #
+                    else:
+                        try:
+                            TokenValueInt = int(Pcd.TokenValue, 10)
+                            if (TokenValueInt < 0 or TokenValueInt > 4294967295):
+                                EdkLogger.error(
                                     'build',
                                     FORMAT_INVALID,
-                                    "The format of TokenValue [%s] of PCD [%s.%s] in [%s] is invalid:" % (Pcd.TokenValue, TokenSpaceGuid, PcdRealName, str(Package)),
+                                    "The format of TokenValue [%s] of PCD [%s.%s] in [%s] is invalid, as a decimal it should between: 0 - 4294967295!" % (
+                                        Pcd.TokenValue, TokenSpaceGuid, PcdRealName, str(Package)),
                                     File=self.MetaFile, Line=LineNo,
                                     ExtraData=None
-                                    )
-
-                    #
-                    # Check decimal token value length and format.
-                    #
-                    else:
-                        try:
-                            TokenValueInt = int (Pcd.TokenValue, 10)
-                            if (TokenValueInt < 0 or TokenValueInt > 4294967295):
-                                EdkLogger.error(
-                                            'build',
-                                            FORMAT_INVALID,
-                                            "The format of TokenValue [%s] of PCD [%s.%s] in [%s] is invalid, as a decimal it should between: 0 - 4294967295!" % (Pcd.TokenValue, TokenSpaceGuid, PcdRealName, str(Package)),
-                                            File=self.MetaFile, Line=LineNo,
-                                            ExtraData=None
-                                            )
+                                )
                         except:
                             EdkLogger.error(
-                                        'build',
-                                        FORMAT_INVALID,
-                                        "The format of TokenValue [%s] of PCD [%s.%s] in [%s] is invalid, it should be hexadecimal or decimal!" % (Pcd.TokenValue, TokenSpaceGuid, PcdRealName, str(Package)),
-                                        File=self.MetaFile, Line=LineNo,
-                                        ExtraData=None
-                                        )
+                                'build',
+                                FORMAT_INVALID,
+                                "The format of TokenValue [%s] of PCD [%s.%s] in [%s] is invalid, it should be hexadecimal or decimal!" % (
+                                    Pcd.TokenValue, TokenSpaceGuid, PcdRealName, str(Package)),
+                                File=self.MetaFile, Line=LineNo,
+                                ExtraData=None
+                            )
 
                     Pcd.DatumType = PcdInPackage.DatumType
                     Pcd.MaxDatumSize = PcdInPackage.MaxDatumSize
@@ -1030,30 +1095,34 @@ class InfBuildData(ModuleBuildClassObject):
                         Pcd.DefaultValue = PcdInPackage.DefaultValue
                     else:
                         try:
-                            Pcd.DefaultValue = ValueExpressionEx(Pcd.DefaultValue, Pcd.DatumType, _GuidDict)(True)
+                            Pcd.DefaultValue = ValueExpressionEx(
+                                Pcd.DefaultValue, Pcd.DatumType, _GuidDict)(True)
                         except BadExpression as Value:
-                            EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s", %s' %(TokenSpaceGuid, PcdRealName, Pcd.DefaultValue, Value),
+                            EdkLogger.error('Parser', FORMAT_INVALID, 'PCD [%s.%s] Value "%s", %s' % (TokenSpaceGuid, PcdRealName, Pcd.DefaultValue, Value),
                                             File=self.MetaFile, Line=LineNo)
                     break
             else:
                 EdkLogger.error(
-                            'build',
-                            FORMAT_INVALID,
-                            "PCD [%s.%s] in [%s] is not found in dependent packages:" % (TokenSpaceGuid, PcdRealName, self.MetaFile),
-                            File=self.MetaFile, Line=LineNo,
-                            ExtraData="\t%s" % '\n\t'.join(str(P) for P in self.Packages)
-                            )
+                    'build',
+                    FORMAT_INVALID,
+                    "PCD [%s.%s] in [%s] is not found in dependent packages:" % (
+                        TokenSpaceGuid, PcdRealName, self.MetaFile),
+                    File=self.MetaFile, Line=LineNo,
+                    ExtraData="\t%s" % '\n\t'.join(
+                        str(P) for P in self.Packages)
+                )
             Pcds[PcdCName, TokenSpaceGuid] = Pcd
 
         return Pcds
 
-    ## check whether current module is binary module
+    # check whether current module is binary module
     @property
     def IsBinaryModule(self):
         if (self.Binaries and not self.Sources) or GlobalData.gIgnoreSource:
             return True
         return False
-    def CheckFeatureFlagPcd(self,Instance):
+
+    def CheckFeatureFlagPcd(self, Instance):
         Pcds = GlobalData.gPlatformFinalPcds.copy()
         if PcdPattern.search(Instance):
             PcdTuple = tuple(Instance.split('.')[::-1])
@@ -1064,7 +1133,7 @@ class InfBuildData(ModuleBuildClassObject):
                                     File=str(self), ExtraData=Instance)
                 if not Instance in Pcds:
                     Pcds[Instance] = self.Pcds[PcdTuple].DefaultValue
-            else: #if PcdTuple not in self.Pcds:
+            else:  # if PcdTuple not in self.Pcds:
                 EdkLogger.error('build', FORMAT_INVALID,
                                 "\nFeatureFlagPcd must be defined in [FeaturePcd] or [FixedPcd] of Inf file",
                                 File=str(self), ExtraData=Instance)
@@ -1079,22 +1148,27 @@ class InfBuildData(ModuleBuildClassObject):
                     return True
                 return False
             except:
-                EdkLogger.warn('build', FORMAT_INVALID,"The FeatureFlagExpression cannot be evaluated", File=str(self), ExtraData=Instance)
+                EdkLogger.warn('build', FORMAT_INVALID, "The FeatureFlagExpression cannot be evaluated", File=str(
+                    self), ExtraData=Instance)
                 return False
         else:
             for Name, Guid in self.Pcds:
                 if self.Pcds[(Name, Guid)].Type == 'FeatureFlag' or self.Pcds[(Name, Guid)].Type == 'FixedAtBuild':
-                    PcdFullName = '%s.%s' % (Guid, Name);
+                    PcdFullName = '%s.%s' % (Guid, Name)
                     if not PcdFullName in Pcds:
-                        Pcds[PcdFullName] = self.Pcds[(Name, Guid)].DefaultValue
+                        Pcds[PcdFullName] = self.Pcds[(
+                            Name, Guid)].DefaultValue
             try:
                 Value = ValueExpression(Instance, Pcds)()
                 if Value == True:
                     return True
                 return False
             except:
-                EdkLogger.warn('build', FORMAT_INVALID, "The FeatureFlagExpression cannot be evaluated", File=str(self), ExtraData=Instance)
+                EdkLogger.warn('build', FORMAT_INVALID, "The FeatureFlagExpression cannot be evaluated", File=str(
+                    self), ExtraData=Instance)
                 return False
+
+
 def ExtendCopyDictionaryLists(CopyToDict, CopyFromDict):
     for Key in CopyFromDict:
         CopyToDict[Key].extend(CopyFromDict[Key])
diff --git a/BaseTools/Source/Python/Workspace/MetaDataTable.py b/BaseTools/Source/Python/Workspace/MetaDataTable.py
index a20bd147846b..79d21c4140cd 100644
--- a/BaseTools/Source/Python/Workspace/MetaDataTable.py
+++ b/BaseTools/Source/Python/Workspace/MetaDataTable.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create/update/query/erase table for files
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -14,11 +14,13 @@ import Common.EdkLogger as EdkLogger
 from CommonDataClass import DataClass
 from CommonDataClass.DataClass import FileClass
 
-## Convert to SQL required string format
+# Convert to SQL required string format
+
+
 def ConvertToSqlString(StringList):
     return list(map(lambda s: "'" + s.replace("'", "''") + "'", StringList))
 
-## TableFile
+# TableFile
 #
 # This class defined a common table
 #
@@ -27,6 +29,8 @@ def ConvertToSqlString(StringList):
 # @param Cursor:     Cursor of the database
 # @param TableName:  Name of the table
 #
+
+
 class Table(object):
     _COLUMN_ = ''
     _ID_STEP_ = 1
@@ -44,7 +48,7 @@ class Table(object):
     def __str__(self):
         return self.Table
 
-    ## Create table
+    # Create table
     #
     # Create a table
     #
@@ -52,7 +56,7 @@ class Table(object):
         self.Db.CreateEmptyTable(self.Table)
         self.ID = self.GetId()
 
-    ## Insert table
+    # Insert table
     #
     # Insert a record into a table
     #
@@ -66,18 +70,17 @@ class Table(object):
 
         return self.ID
 
-
-    ## Get count
+    # Get count
     #
     # Get a count of all records of the table
     #
     # @retval Count:  Total count of all records
     #
+
     def GetCount(self):
         tab = self.Db.GetTable(self.Table)
         return len(tab)
 
-
     def GetId(self):
         tab = self.Db.GetTable(self.Table)
         Id = max([int(item[0]) for item in tab])
@@ -85,14 +88,14 @@ class Table(object):
             Id = self.IdBase
         return Id
 
-    ## Init the ID of the table
+    # Init the ID of the table
     #
     # Init the ID of the table
     #
     def InitID(self):
         self.ID = self.GetId()
 
-    ## Exec
+    # Exec
     #
     # Exec Sql Command, return result
     #
@@ -110,7 +113,6 @@ class Table(object):
         Tab = self.Db.GetTable(self.Table)
         Tab.append(self._DUMMY_)
 
-
     def IsIntegral(self):
         tab = self.Db.GetTable(self.Table)
         Id = min([int(item[0]) for item in tab])
@@ -123,7 +125,7 @@ class Table(object):
         return tab
 
 
-## TableFile
+# TableFile
 #
 # This class defined a table used for file
 #
@@ -140,10 +142,11 @@ class TableFile(Table):
         TimeStamp SINGLE NOT NULL,
         FromItem REAL NOT NULL
         '''
+
     def __init__(self, Cursor):
         Table.__init__(self, Cursor, 'File')
 
-    ## Insert table
+    # Insert table
     #
     # Insert a record into table File
     #
@@ -155,7 +158,8 @@ class TableFile(Table):
     # @param TimeStamp: TimeStamp of a File
     #
     def Insert(self, Name, ExtName, Path, FullPath, Model, TimeStamp, FromItem=0):
-        (Name, ExtName, Path, FullPath) = ConvertToSqlString((Name, ExtName, Path, FullPath))
+        (Name, ExtName, Path, FullPath) = ConvertToSqlString(
+            (Name, ExtName, Path, FullPath))
         return Table.Insert(
             self,
             Name,
@@ -165,9 +169,9 @@ class TableFile(Table):
             Model,
             TimeStamp,
             FromItem
-            )
+        )
 
-    ## InsertFile
+    # InsertFile
     #
     # Insert one file to table
     #
@@ -179,76 +183,82 @@ class TableFile(Table):
     def InsertFile(self, File, Model, FromItem=''):
         if FromItem:
             return self.Insert(
-                        File.Name,
-                        File.Ext,
-                        File.Dir,
-                        File.Path,
-                        Model,
-                        File.TimeStamp,
-                        FromItem
-                        )
+                File.Name,
+                File.Ext,
+                File.Dir,
+                File.Path,
+                Model,
+                File.TimeStamp,
+                FromItem
+            )
         return self.Insert(
-                        File.Name,
-                        File.Ext,
-                        File.Dir,
-                        File.Path,
-                        Model,
-                        File.TimeStamp
-                        )
+            File.Name,
+            File.Ext,
+            File.Dir,
+            File.Path,
+            Model,
+            File.TimeStamp
+        )
 
-    ## Get type of a given file
+    # Get type of a given file
     #
     #   @param  FileId      ID of a file
     #
     #   @retval file_type   Model value of given file in the table
     #
     def GetFileType(self, FileId):
-        QueryScript = "select Model from %s where ID = '%s'" % (self.Table, FileId)
+        QueryScript = "select Model from %s where ID = '%s'" % (
+            self.Table, FileId)
         RecordList = self.Exec(QueryScript)
         if len(RecordList) == 0:
             return None
         return RecordList[0][0]
 
-    ## Get file timestamp of a given file
+    # Get file timestamp of a given file
     #
     #   @param  FileId      ID of file
     #
     #   @retval timestamp   TimeStamp value of given file in the table
     #
     def GetFileTimeStamp(self, FileId):
-        QueryScript = "select TimeStamp from %s where ID = '%s'" % (self.Table, FileId)
+        QueryScript = "select TimeStamp from %s where ID = '%s'" % (
+            self.Table, FileId)
         RecordList = self.Exec(QueryScript)
         if len(RecordList) == 0:
             return None
         return RecordList[0][0]
 
-    ## Update the timestamp of a given file
+    # Update the timestamp of a given file
     #
     #   @param  FileId      ID of file
     #   @param  TimeStamp   Time stamp of file
     #
     def SetFileTimeStamp(self, FileId, TimeStamp):
-        self.Exec("update %s set TimeStamp=%s where ID='%s'" % (self.Table, TimeStamp, FileId))
+        self.Exec("update %s set TimeStamp=%s where ID='%s'" %
+                  (self.Table, TimeStamp, FileId))
 
-    ## Get list of file with given type
+    # Get list of file with given type
     #
     #   @param  FileType    Type value of file
     #
     #   @retval file_list   List of files with the given type
     #
     def GetFileList(self, FileType):
-        RecordList = self.Exec("select FullPath from %s where Model=%s" % (self.Table, FileType))
+        RecordList = self.Exec(
+            "select FullPath from %s where Model=%s" % (self.Table, FileType))
         if len(RecordList) == 0:
             return []
         return [R[0] for R in RecordList]
 
-## TableDataModel
+# TableDataModel
 #
 # This class defined a table used for data model
 #
 # @param object:       Inherited from object class
 #
 #
+
+
 class TableDataModel(Table):
     _COLUMN_ = """
         ID INTEGER PRIMARY KEY,
@@ -256,10 +266,11 @@ class TableDataModel(Table):
         Name VARCHAR NOT NULL,
         Description VARCHAR
         """
+
     def __init__(self, Cursor):
         Table.__init__(self, Cursor, 'DataModel')
 
-    ## Insert table
+    # Insert table
     #
     # Insert a record into table DataModel
     #
@@ -272,7 +283,7 @@ class TableDataModel(Table):
         (Name, Description) = ConvertToSqlString((Name, Description))
         return Table.Insert(self, CrossIndex, Name, Description)
 
-    ## Init table
+    # Init table
     #
     # Create all default records of table DataModel
     #
@@ -288,7 +299,7 @@ class TableDataModel(Table):
             self.Insert(CrossIndex, Name, Description)
         EdkLogger.verbose("Initialize table DataModel ... DONE!")
 
-    ## Get CrossIndex
+    # Get CrossIndex
     #
     # Get a model's cross index from its name
     #
@@ -303,4 +314,3 @@ class TableDataModel(Table):
             CrossIndex = Item[0]
 
         return CrossIndex
-
diff --git a/BaseTools/Source/Python/Workspace/MetaFileCommentParser.py b/BaseTools/Source/Python/Workspace/MetaFileCommentParser.py
index 3737ae3511c2..01363c10a96b 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileCommentParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileCommentParser.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to check format of comments
 #
 # Copyright (c) 2012, Intel Corporation. All rights reserved.<BR>
@@ -20,19 +20,24 @@ UsageList = ("PRODUCES", "PRODUCED", "ALWAYS_PRODUCES", "ALWAYS_PRODUCED", "SOME
              "SOMETIMES_PRODUCED", "CONSUMES", "CONSUMED", "ALWAYS_CONSUMES", "ALWAYS_CONSUMED",
              "SOMETIMES_CONSUMES", "SOMETIMES_CONSUMED", "SOMETIME_CONSUMES")
 ErrorMsgMap = {
-    MODEL_EFI_GUID      : "The usage for this GUID is not listed in this INF: %s[%d]:%s",
-    MODEL_EFI_PPI       : "The usage for this PPI is not listed in this INF: %s[%d]:%s.",
-    MODEL_EFI_PROTOCOL  : "The usage for this Protocol is not listed in this INF: %s[%d]:%s.",
-    MODEL_PCD_DYNAMIC   : "The usage for this PCD is not listed in this INF: %s[%d]:%s."
+    MODEL_EFI_GUID: "The usage for this GUID is not listed in this INF: %s[%d]:%s",
+    MODEL_EFI_PPI: "The usage for this PPI is not listed in this INF: %s[%d]:%s.",
+    MODEL_EFI_PROTOCOL: "The usage for this Protocol is not listed in this INF: %s[%d]:%s.",
+    MODEL_PCD_DYNAMIC: "The usage for this PCD is not listed in this INF: %s[%d]:%s."
 }
 
+
 def CheckInfComment(SectionType, Comments, InfFile, LineNo, ValueList):
     if SectionType in [MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_DYNAMIC_EX, MODEL_PCD_DYNAMIC]:
-        CheckUsage(Comments, UsageList, InfFile, LineNo, ValueList[0]+'.'+ValueList[1], ErrorMsgMap[MODEL_PCD_DYNAMIC])
+        CheckUsage(Comments, UsageList, InfFile, LineNo,
+                   ValueList[0]+'.'+ValueList[1], ErrorMsgMap[MODEL_PCD_DYNAMIC])
     elif SectionType in [MODEL_EFI_GUID, MODEL_EFI_PPI]:
-        CheckUsage(Comments, UsageList, InfFile, LineNo, ValueList[0], ErrorMsgMap[SectionType])
+        CheckUsage(Comments, UsageList, InfFile, LineNo,
+                   ValueList[0], ErrorMsgMap[SectionType])
     elif SectionType == MODEL_EFI_PROTOCOL:
-        CheckUsage(Comments, UsageList + ("TO_START", "BY_START"), InfFile, LineNo, ValueList[0], ErrorMsgMap[SectionType])
+        CheckUsage(Comments, UsageList + ("TO_START", "BY_START"),
+                   InfFile, LineNo, ValueList[0], ErrorMsgMap[SectionType])
+
 
 def CheckUsage(Comments, Usages, InfFile, LineNo, Value, ErrorMsg):
     for Comment in Comments:
diff --git a/BaseTools/Source/Python/Workspace/MetaFileParser.py b/BaseTools/Source/Python/Workspace/MetaFileParser.py
index 3508591b281e..ea40cc5fcd41 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileParser.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileParser.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to parse meta files
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -32,12 +32,14 @@ from .MetaFileTable import MetaFileStorage
 from .MetaFileCommentParser import CheckInfComment
 from Common.DataType import TAB_COMMENT_EDK_START, TAB_COMMENT_EDK_END
 
-## RegEx for finding file versions
+# RegEx for finding file versions
 hexVersionPattern = re.compile(r'0[xX][\da-f-A-F]{5,8}')
 decVersionPattern = re.compile(r'\d+\.\d+')
 CODEPattern = re.compile(r"{CODE\([a-fA-F0-9Xx\{\},\s]*\)}")
 
-## A decorator used to parse macro definition
+# A decorator used to parse macro definition
+
+
 def ParseMacro(Parser):
     def MacroParser(self):
         Match = GlobalData.gMacroDefPattern.match(self._CurrentLine)
@@ -46,7 +48,8 @@ def ParseMacro(Parser):
             Parser(self)
             return
 
-        TokenList = GetSplitValueList(self._CurrentLine[Match.end(1):], TAB_EQUAL_SPLIT, 1)
+        TokenList = GetSplitValueList(
+            self._CurrentLine[Match.end(1):], TAB_EQUAL_SPLIT, 1)
         # Syntax check
         if not TokenList[0]:
             EdkLogger.error('Parser', FORMAT_INVALID, "No macro name given",
@@ -103,7 +106,7 @@ def ParseMacro(Parser):
 
     return MacroParser
 
-## Base class of parser
+# Base class of parser
 #
 #  This class is used for derivation purpose. The specific parser for one kind
 # type file must derive this class and implement some public interfaces.
@@ -115,6 +118,8 @@ def ParseMacro(Parser):
 #   @param      Owner           Owner ID (for sub-section parsing)
 #   @param      From            ID from which the data comes (for !INCLUDE directive)
 #
+
+
 class MetaFileParser(object):
     # data type (file content) for specific file type
     DataType = {}
@@ -122,7 +127,7 @@ class MetaFileParser(object):
     # Parser objects used to implement singleton
     MetaFiles = {}
 
-    ## Factory method
+    # Factory method
     #
     # One file, one parser object. This factory method makes sure that there's
     # only one object constructed for one meta file.
@@ -141,7 +146,7 @@ class MetaFileParser(object):
             Class.MetaFiles[FilePath] = ParserObject
             return ParserObject
 
-    ## Constructor of MetaFileParser
+    # Constructor of MetaFileParser
     #
     #  Initialize object of MetaFileParser
     #
@@ -152,7 +157,7 @@ class MetaFileParser(object):
     #   @param      Owner           Owner ID (for sub-section parsing)
     #   @param      From            ID from which the data comes (for !INCLUDE directive)
     #
-    def __init__(self, FilePath, FileType, Arch, Table, Owner= -1, From= -1):
+    def __init__(self, FilePath, FileType, Arch, Table, Owner=-1, From=-1):
         self._Table = Table
         self._RawTable = Table
         self._Arch = Arch
@@ -191,19 +196,19 @@ class MetaFileParser(object):
         self._PcdDataTypeCODE = False
         self._CurrentPcdName = ""
 
-    ## Store the parsed data in table
+    # Store the parsed data in table
     def _Store(self, *Args):
         return self._Table.Insert(*Args)
 
-    ## Virtual method for starting parse
+    # Virtual method for starting parse
     def Start(self):
         raise NotImplementedError
 
-    ## Notify a post-process is needed
+    # Notify a post-process is needed
     def DoPostProcess(self):
         self._PostProcessed = False
 
-    ## Set parsing complete flag in both class and table
+    # Set parsing complete flag in both class and table
     def _Done(self):
         self._Finished = True
         self._Table.SetEndFlag()
@@ -211,17 +216,17 @@ class MetaFileParser(object):
     def _PostProcess(self):
         self._PostProcessed = True
 
-    ## Get the parse complete flag
+    # Get the parse complete flag
     @property
     def Finished(self):
         return self._Finished
 
-    ## Set the complete flag
+    # Set the complete flag
     @Finished.setter
     def Finished(self, Value):
         self._Finished = Value
 
-    ## Remove records that do not match given Filter Arch
+    # Remove records that do not match given Filter Arch
     def _FilterRecordList(self, RecordList, FilterArch):
         NewRecordList = []
         for Record in RecordList:
@@ -230,7 +235,7 @@ class MetaFileParser(object):
                 NewRecordList.append(Record)
         return NewRecordList
 
-    ## Use [] style to query data in table, just for readability
+    # Use [] style to query data in table, just for readability
     #
     #   DataInfo = [data_type, scope1(arch), scope2(platform/moduletype)]
     #
@@ -259,18 +264,19 @@ class MetaFileParser(object):
                 self._Table = self._RawTable
                 self._PostProcessed = False
                 self.Start()
-    ## Data parser for the common format in different type of file
+    # Data parser for the common format in different type of file
     #
     #   The common format in the meatfile is like
     #
     #       xxx1 | xxx2 | xxx3
     #
+
     @ParseMacro
     def _CommonParser(self):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
         self._ValueList[0:len(TokenList)] = TokenList
 
-    ## Data parser for the format in which there's path
+    # Data parser for the format in which there's path
     #
     #   Only path can have macro used. So we need to replace them before use.
     #
@@ -281,19 +287,20 @@ class MetaFileParser(object):
         # Don't do macro replacement for dsc file at this point
         if not isinstance(self, DscParser):
             Macros = self._Macros
-            self._ValueList = [ReplaceMacro(Value, Macros) for Value in self._ValueList]
+            self._ValueList = [ReplaceMacro(Value, Macros)
+                               for Value in self._ValueList]
 
-    ## Skip unsupported data
+    # Skip unsupported data
     def _Skip(self):
         EdkLogger.warn("Parser", "Unrecognized content", File=self.MetaFile,
-                        Line=self._LineIndex + 1, ExtraData=self._CurrentLine);
+                       Line=self._LineIndex + 1, ExtraData=self._CurrentLine)
         self._ValueList[0:1] = [self._CurrentLine]
 
-    ## Skip unsupported data for UserExtension Section
+    # Skip unsupported data for UserExtension Section
     def _SkipUserExtension(self):
         self._ValueList[0:1] = [self._CurrentLine]
 
-    ## Section header parser
+    # Section header parser
     #
     #   The section header is always in following format:
     #
@@ -353,14 +360,14 @@ class MetaFileParser(object):
         # If the section information is needed later, it should be stored in database
         self._ValueList[0] = self._SectionName
 
-    ## [packages] section parser
+    # [packages] section parser
     @ParseMacro
     def _PackageParser(self):
         self._CurrentLine = CleanString(self._CurrentLine)
         self._Packages.append(self._CurrentLine)
         self._ValueList[0] = self._CurrentLine
 
-    ## [defines] section parser
+    # [defines] section parser
     @ParseMacro
     def _DefineParser(self):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_EQUAL_SPLIT, 1)
@@ -372,15 +379,18 @@ class MetaFileParser(object):
             EdkLogger.error('Parser', FORMAT_INVALID, "No value specified",
                             ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
 
-        self._ValueList = [ReplaceMacro(Value, self._Macros) for Value in self._ValueList]
+        self._ValueList = [ReplaceMacro(Value, self._Macros)
+                           for Value in self._ValueList]
         Name, Value = self._ValueList[1], self._ValueList[2]
         MacroUsed = GlobalData.gMacroRefPattern.findall(Value)
         if len(MacroUsed) != 0:
             for Macro in MacroUsed:
                 if Macro in GlobalData.gGlobalDefines:
-                    EdkLogger.error("Parser", FORMAT_INVALID, "Global macro %s is not permitted." % (Macro), ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
+                    EdkLogger.error("Parser", FORMAT_INVALID, "Global macro %s is not permitted." % (
+                        Macro), ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
             else:
-                EdkLogger.error("Parser", FORMAT_INVALID, "%s not defined" % (Macro), ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
+                EdkLogger.error("Parser", FORMAT_INVALID, "%s not defined" % (
+                    Macro), ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
         # Sometimes, we need to make differences between EDK and EDK2 modules
         if Name == 'INF_VERSION':
             if hexVersionPattern.match(Value):
@@ -402,7 +412,7 @@ class MetaFileParser(object):
             self._FileLocalMacros[Name] = Value
         self._Defines[Name] = Value
 
-    ## [BuildOptions] section parser
+    # [BuildOptions] section parser
     @ParseMacro
     def _BuildOptionParser(self):
         self._CurrentLine = CleanString(self._CurrentLine, BuildOption=True)
@@ -413,18 +423,20 @@ class MetaFileParser(object):
             self._ValueList[1] = TokenList2[1]              # keys
         else:
             self._ValueList[1] = TokenList[0]
-        if len(TokenList) == 2 and not isinstance(self, DscParser): # value
+        if len(TokenList) == 2 and not isinstance(self, DscParser):  # value
             self._ValueList[2] = ReplaceMacro(TokenList[1], self._Macros)
 
         if self._ValueList[1].count('_') != 4:
             EdkLogger.error(
                 'Parser',
                 FORMAT_INVALID,
-                "'%s' must be in format of <TARGET>_<TOOLCHAIN>_<ARCH>_<TOOL>_FLAGS" % self._ValueList[1],
+                "'%s' must be in format of <TARGET>_<TOOLCHAIN>_<ARCH>_<TOOL>_FLAGS" % self._ValueList[
+                    1],
                 ExtraData=self._CurrentLine,
                 File=self.MetaFile,
                 Line=self._LineIndex + 1
-                )
+            )
+
     def GetValidExpression(self, TokenSpaceGuid, PcdCName):
         return self._Table.GetValidExpression(TokenSpaceGuid, PcdCName)
 
@@ -435,7 +447,7 @@ class MetaFileParser(object):
         Macros.update(self._GetApplicableSectionMacro())
         return Macros
 
-    ## Construct section Macro dict
+    # Construct section Macro dict
     def _ConstructSectionMacroDict(self, Name, Value):
         ScopeKey = [(Scope[0], Scope[1], Scope[2]) for Scope in self._Scope]
         ScopeKey = tuple(ScopeKey)
@@ -450,8 +462,8 @@ class MetaFileParser(object):
 
         self._SectionsMacroDict[SectionDictKey][Name] = Value
 
-    ## Get section Macros that are applicable to current line, which may come from other sections
-    ## that share the same name while scope is wider
+    # Get section Macros that are applicable to current line, which may come from other sections
+    # that share the same name while scope is wider
     def _GetApplicableSectionMacro(self):
         Macros = {}
 
@@ -468,21 +480,24 @@ class MetaFileParser(object):
                 continue
 
             for ActiveScope in self._Scope:
-                Scope0, Scope1, Scope2= ActiveScope[0], ActiveScope[1], ActiveScope[2]
+                Scope0, Scope1, Scope2 = ActiveScope[0], ActiveScope[1], ActiveScope[2]
                 if(Scope0, Scope1, Scope2) not in Scope:
                     break
             else:
-                SpeSpeMacroDict.update(self._SectionsMacroDict[(SectionType, Scope)])
+                SpeSpeMacroDict.update(
+                    self._SectionsMacroDict[(SectionType, Scope)])
 
             for ActiveScope in self._Scope:
                 Scope0, Scope1, Scope2 = ActiveScope[0], ActiveScope[1], ActiveScope[2]
                 if(Scope0, Scope1, Scope2) not in Scope and (Scope0, TAB_COMMON, TAB_COMMON) not in Scope and (TAB_COMMON, Scope1, TAB_COMMON) not in Scope:
                     break
             else:
-                ComSpeMacroDict.update(self._SectionsMacroDict[(SectionType, Scope)])
+                ComSpeMacroDict.update(
+                    self._SectionsMacroDict[(SectionType, Scope)])
 
             if (TAB_COMMON, TAB_COMMON, TAB_COMMON) in Scope:
-                ComComMacroDict.update(self._SectionsMacroDict[(SectionType, Scope)])
+                ComComMacroDict.update(
+                    self._SectionsMacroDict[(SectionType, Scope)])
 
         Macros.update(ComComMacroDict)
         Macros.update(ComSpeMacroDict)
@@ -490,7 +505,7 @@ class MetaFileParser(object):
 
         return Macros
 
-    def ProcessMultipleLineCODEValue(self,Content):
+    def ProcessMultipleLineCODEValue(self, Content):
         CODEBegin = False
         CODELine = ""
         continuelinecount = 0
@@ -499,7 +514,7 @@ class MetaFileParser(object):
             Line = Content[Index]
             if CODEBegin:
                 CODELine = CODELine + Line
-                continuelinecount +=1
+                continuelinecount += 1
                 if ")}" in Line:
                     newContent.append(CODELine)
                     for _ in range(continuelinecount):
@@ -525,40 +540,42 @@ class MetaFileParser(object):
 
     _SectionParser = {}
 
-## INF file parser class
+# INF file parser class
 #
 #   @param      FilePath        The path of platform description file
 #   @param      FileType        The raw data of DSC file
 #   @param      Table           Database used to retrieve module/package information
 #   @param      Macros          Macros used for replacement in file
 #
+
+
 class InfParser(MetaFileParser):
     # INF file supported data types (one type per section)
     DataType = {
-        TAB_UNKNOWN.upper() : MODEL_UNKNOWN,
-        TAB_INF_DEFINES.upper() : MODEL_META_DATA_HEADER,
-        TAB_DSC_DEFINES_DEFINE : MODEL_META_DATA_DEFINE,
-        TAB_BUILD_OPTIONS.upper() : MODEL_META_DATA_BUILD_OPTION,
-        TAB_INCLUDES.upper() : MODEL_EFI_INCLUDE,
-        TAB_LIBRARIES.upper() : MODEL_EFI_LIBRARY_INSTANCE,
-        TAB_LIBRARY_CLASSES.upper() : MODEL_EFI_LIBRARY_CLASS,
-        TAB_PACKAGES.upper() : MODEL_META_DATA_PACKAGE,
-        TAB_NMAKE.upper() : MODEL_META_DATA_NMAKE,
-        TAB_INF_FIXED_PCD.upper() : MODEL_PCD_FIXED_AT_BUILD,
-        TAB_INF_PATCH_PCD.upper() : MODEL_PCD_PATCHABLE_IN_MODULE,
-        TAB_INF_FEATURE_PCD.upper() : MODEL_PCD_FEATURE_FLAG,
-        TAB_INF_PCD_EX.upper() : MODEL_PCD_DYNAMIC_EX,
-        TAB_INF_PCD.upper() : MODEL_PCD_DYNAMIC,
-        TAB_SOURCES.upper() : MODEL_EFI_SOURCE_FILE,
-        TAB_GUIDS.upper() : MODEL_EFI_GUID,
-        TAB_PROTOCOLS.upper() : MODEL_EFI_PROTOCOL,
-        TAB_PPIS.upper() : MODEL_EFI_PPI,
-        TAB_DEPEX.upper() : MODEL_EFI_DEPEX,
-        TAB_BINARIES.upper() : MODEL_EFI_BINARY_FILE,
-        TAB_USER_EXTENSIONS.upper() : MODEL_META_DATA_USER_EXTENSION
+        TAB_UNKNOWN.upper(): MODEL_UNKNOWN,
+        TAB_INF_DEFINES.upper(): MODEL_META_DATA_HEADER,
+        TAB_DSC_DEFINES_DEFINE: MODEL_META_DATA_DEFINE,
+        TAB_BUILD_OPTIONS.upper(): MODEL_META_DATA_BUILD_OPTION,
+        TAB_INCLUDES.upper(): MODEL_EFI_INCLUDE,
+        TAB_LIBRARIES.upper(): MODEL_EFI_LIBRARY_INSTANCE,
+        TAB_LIBRARY_CLASSES.upper(): MODEL_EFI_LIBRARY_CLASS,
+        TAB_PACKAGES.upper(): MODEL_META_DATA_PACKAGE,
+        TAB_NMAKE.upper(): MODEL_META_DATA_NMAKE,
+        TAB_INF_FIXED_PCD.upper(): MODEL_PCD_FIXED_AT_BUILD,
+        TAB_INF_PATCH_PCD.upper(): MODEL_PCD_PATCHABLE_IN_MODULE,
+        TAB_INF_FEATURE_PCD.upper(): MODEL_PCD_FEATURE_FLAG,
+        TAB_INF_PCD_EX.upper(): MODEL_PCD_DYNAMIC_EX,
+        TAB_INF_PCD.upper(): MODEL_PCD_DYNAMIC,
+        TAB_SOURCES.upper(): MODEL_EFI_SOURCE_FILE,
+        TAB_GUIDS.upper(): MODEL_EFI_GUID,
+        TAB_PROTOCOLS.upper(): MODEL_EFI_PROTOCOL,
+        TAB_PPIS.upper(): MODEL_EFI_PPI,
+        TAB_DEPEX.upper(): MODEL_EFI_DEPEX,
+        TAB_BINARIES.upper(): MODEL_EFI_BINARY_FILE,
+        TAB_USER_EXTENSIONS.upper(): MODEL_META_DATA_USER_EXTENSION
     }
 
-    ## Constructor of InfParser
+    # Constructor of InfParser
     #
     #  Initialize object of InfParser
     #
@@ -574,7 +591,7 @@ class InfParser(MetaFileParser):
         MetaFileParser.__init__(self, FilePath, FileType, Arch, Table)
         self.PcdsDict = {}
 
-    ## Parser starter
+    # Parser starter
     def Start(self):
         NmakeLine = ''
         Content = ''
@@ -582,7 +599,8 @@ class InfParser(MetaFileParser):
             with open(str(self.MetaFile), 'r') as File:
                 Content = File.readlines()
         except:
-            EdkLogger.error("Parser", FILE_READ_FAILURE, ExtraData=self.MetaFile)
+            EdkLogger.error("Parser", FILE_READ_FAILURE,
+                            ExtraData=self.MetaFile)
 
         # parse the file line by line
         IsFindBlockComment = False
@@ -593,7 +611,8 @@ class InfParser(MetaFileParser):
 
         for Index in range(0, len(Content)):
             # skip empty, commented, block commented lines
-            Line, Comment = CleanString2(Content[Index], AllowCppStyleComment=True)
+            Line, Comment = CleanString2(
+                Content[Index], AllowCppStyleComment=True)
             NextLine = ''
             if Index + 1 < len(Content):
                 NextLine, NextComment = CleanString2(Content[Index + 1])
@@ -642,13 +661,15 @@ class InfParser(MetaFileParser):
                                              MODEL_EFI_PPI,
                                              MODEL_META_DATA_USER_EXTENSION]:
                         EdkLogger.error('Parser', FORMAT_INVALID,
-                                        "Section [%s] is not allowed in inf file without version" % (self._SectionName),
+                                        "Section [%s] is not allowed in inf file without version" % (
+                                            self._SectionName),
                                         ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
                 elif self._SectionType in [MODEL_EFI_INCLUDE,
                                            MODEL_EFI_LIBRARY_INSTANCE,
                                            MODEL_META_DATA_NMAKE]:
                     EdkLogger.error('Parser', FORMAT_INVALID,
-                                    "Section [%s] is not allowed in inf file with version 0x%08x" % (self._SectionName, self._Version),
+                                    "Section [%s] is not allowed in inf file with version 0x%08x" % (
+                                        self._SectionName, self._Version),
                                     ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
                 continue
             # merge two lines specified by '\' in section NMAKE
@@ -679,25 +700,26 @@ class InfParser(MetaFileParser):
             if Comment:
                 Comments.append((Comment, Index + 1))
             if GlobalData.gOptions and GlobalData.gOptions.CheckUsage:
-                CheckInfComment(self._SectionType, Comments, str(self.MetaFile), Index + 1, self._ValueList)
+                CheckInfComment(self._SectionType, Comments, str(
+                    self.MetaFile), Index + 1, self._ValueList)
             #
             # Model, Value1, Value2, Value3, Arch, Platform, BelongsToItem=-1,
             # LineBegin=-1, ColumnBegin=-1, LineEnd=-1, ColumnEnd=-1, Enabled=-1
             #
             for Arch, Platform, _ in self._Scope:
                 LastItem = self._Store(self._SectionType,
-                            self._ValueList[0],
-                            self._ValueList[1],
-                            self._ValueList[2],
-                            Arch,
-                            Platform,
-                            self._Owner[-1],
-                            self._LineIndex + 1,
-                            - 1,
-                            self._LineIndex + 1,
-                            - 1,
-                            0
-                            )
+                                       self._ValueList[0],
+                                       self._ValueList[1],
+                                       self._ValueList[2],
+                                       Arch,
+                                       Platform,
+                                       self._Owner[-1],
+                                       self._LineIndex + 1,
+                                       - 1,
+                                       self._LineIndex + 1,
+                                       - 1,
+                                       0
+                                       )
                 for Comment, LineNo in Comments:
                     self._Store(MODEL_META_DATA_COMMENT, Comment, '', '', Arch, Platform,
                                 LastItem, LineNo, -1, LineNo, -1, 0)
@@ -711,10 +733,10 @@ class InfParser(MetaFileParser):
         # If there are tail comments in INF file, save to database whatever the comments are
         for Comment in TailComments:
             self._Store(MODEL_META_DATA_TAIL_COMMENT, Comment[0], '', '', TAB_COMMON,
-                                TAB_COMMON, self._Owner[-1], -1, -1, -1, -1, 0)
+                        TAB_COMMON, self._Owner[-1], -1, -1, -1, -1, 0)
         self._Done()
 
-    ## Data parser for the format in which there's path
+    # Data parser for the format in which there's path
     #
     #   Only path can have macro used. So we need to replace them before use.
     #
@@ -729,7 +751,7 @@ class InfParser(MetaFileParser):
                     continue
                 self._ValueList[Index] = ReplaceMacro(Value, Macros)
 
-    ## Parse [Sources] section
+    # Parse [Sources] section
     #
     #   Only path can have macro used. So we need to replace them before use.
     #
@@ -745,12 +767,14 @@ class InfParser(MetaFileParser):
         # For Acpi tables, remove macro like ' TABLE_NAME=Sata1'
         if 'COMPONENT_TYPE' in Macros:
             if self._Defines['COMPONENT_TYPE'].upper() == 'ACPITABLE':
-                self._ValueList[0] = GetSplitValueList(self._ValueList[0], ' ', 1)[0]
+                self._ValueList[0] = GetSplitValueList(
+                    self._ValueList[0], ' ', 1)[0]
         if self._Defines['BASE_NAME'] == 'Microcode':
             pass
-        self._ValueList = [ReplaceMacro(Value, Macros) for Value in self._ValueList]
+        self._ValueList = [ReplaceMacro(Value, Macros)
+                           for Value in self._ValueList]
 
-    ## Parse [Binaries] section
+    # Parse [Binaries] section
     #
     #   Only path can have macro used. So we need to replace them before use.
     #
@@ -759,20 +783,23 @@ class InfParser(MetaFileParser):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT, 2)
         if len(TokenList) < 2:
             EdkLogger.error('Parser', FORMAT_INVALID, "No file type or path specified",
-                            ExtraData=self._CurrentLine + " (<FileType> | <FilePath> [| <Target>])",
+                            ExtraData=self._CurrentLine +
+                            " (<FileType> | <FilePath> [| <Target>])",
                             File=self.MetaFile, Line=self._LineIndex + 1)
         if not TokenList[0]:
             EdkLogger.error('Parser', FORMAT_INVALID, "No file type specified",
-                            ExtraData=self._CurrentLine + " (<FileType> | <FilePath> [| <Target>])",
+                            ExtraData=self._CurrentLine +
+                            " (<FileType> | <FilePath> [| <Target>])",
                             File=self.MetaFile, Line=self._LineIndex + 1)
         if not TokenList[1]:
             EdkLogger.error('Parser', FORMAT_INVALID, "No file path specified",
-                            ExtraData=self._CurrentLine + " (<FileType> | <FilePath> [| <Target>])",
+                            ExtraData=self._CurrentLine +
+                            " (<FileType> | <FilePath> [| <Target>])",
                             File=self.MetaFile, Line=self._LineIndex + 1)
         self._ValueList[0:len(TokenList)] = TokenList
         self._ValueList[1] = ReplaceMacro(self._ValueList[1], self._Macros)
 
-    ## [nmake] section parser (Edk.x style only)
+    # [nmake] section parser (Edk.x style only)
     def _NmakeParser(self):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_EQUAL_SPLIT, 1)
         self._ValueList[0:len(TokenList)] = TokenList
@@ -781,70 +808,77 @@ class InfParser(MetaFileParser):
         # remove self-reference in macro setting
         #self._ValueList[1] = ReplaceMacro(self._ValueList[1], {self._ValueList[0]:''})
 
-    ## [FixedPcd], [FeaturePcd], [PatchPcd], [Pcd] and [PcdEx] sections parser
+    # [FixedPcd], [FeaturePcd], [PatchPcd], [Pcd] and [PcdEx] sections parser
     @ParseMacro
     def _PcdParser(self):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT, 1)
         ValueList = GetSplitValueList(TokenList[0], TAB_SPLIT)
         if len(ValueList) != 2:
             EdkLogger.error('Parser', FORMAT_INVALID, "Illegal token space GUID and PCD name format",
-                            ExtraData=self._CurrentLine + " (<TokenSpaceGuidCName>.<PcdCName>)",
+                            ExtraData=self._CurrentLine +
+                            " (<TokenSpaceGuidCName>.<PcdCName>)",
                             File=self.MetaFile, Line=self._LineIndex + 1)
         self._ValueList[0:1] = ValueList
         if len(TokenList) > 1:
             self._ValueList[2] = TokenList[1]
         if self._ValueList[0] == '' or self._ValueList[1] == '':
             EdkLogger.error('Parser', FORMAT_INVALID, "No token space GUID or PCD name specified",
-                            ExtraData=self._CurrentLine + " (<TokenSpaceGuidCName>.<PcdCName>)",
+                            ExtraData=self._CurrentLine +
+                            " (<TokenSpaceGuidCName>.<PcdCName>)",
                             File=self.MetaFile, Line=self._LineIndex + 1)
 
         # if value are 'True', 'true', 'TRUE' or 'False', 'false', 'FALSE', replace with integer 1 or 0.
         if self._ValueList[2] != '':
-            InfPcdValueList = GetSplitValueList(TokenList[1], TAB_VALUE_SPLIT, 1)
+            InfPcdValueList = GetSplitValueList(
+                TokenList[1], TAB_VALUE_SPLIT, 1)
             if InfPcdValueList[0] in ['True', 'true', 'TRUE']:
-                self._ValueList[2] = TokenList[1].replace(InfPcdValueList[0], '1', 1)
+                self._ValueList[2] = TokenList[1].replace(
+                    InfPcdValueList[0], '1', 1)
             elif InfPcdValueList[0] in ['False', 'false', 'FALSE']:
-                self._ValueList[2] = TokenList[1].replace(InfPcdValueList[0], '0', 1)
+                self._ValueList[2] = TokenList[1].replace(
+                    InfPcdValueList[0], '0', 1)
             elif isinstance(InfPcdValueList[0], str) and InfPcdValueList[0].find('$(') >= 0:
-                Value = ReplaceExprMacro(InfPcdValueList[0],self._Macros)
+                Value = ReplaceExprMacro(InfPcdValueList[0], self._Macros)
                 if Value != '0':
                     self._ValueList[2] = Value
         if (self._ValueList[0], self._ValueList[1]) not in self.PcdsDict:
-            self.PcdsDict[self._ValueList[0], self._ValueList[1]] = self._SectionType
+            self.PcdsDict[self._ValueList[0],
+                          self._ValueList[1]] = self._SectionType
         elif self.PcdsDict[self._ValueList[0], self._ValueList[1]] != self._SectionType:
             EdkLogger.error('Parser', FORMAT_INVALID, "It is not permissible to list a specified PCD in different PCD type sections.",
-                            ExtraData=self._CurrentLine + " (<TokenSpaceGuidCName>.<PcdCName>)",
+                            ExtraData=self._CurrentLine +
+                            " (<TokenSpaceGuidCName>.<PcdCName>)",
                             File=self.MetaFile, Line=self._LineIndex + 1)
 
-    ## [depex] section parser
+    # [depex] section parser
     @ParseMacro
     def _DepexParser(self):
         self._ValueList[0:1] = [self._CurrentLine]
 
     _SectionParser = {
-        MODEL_UNKNOWN                   :   MetaFileParser._Skip,
-        MODEL_META_DATA_HEADER          :   MetaFileParser._DefineParser,
-        MODEL_META_DATA_BUILD_OPTION    :   MetaFileParser._BuildOptionParser,
-        MODEL_EFI_INCLUDE               :   _IncludeParser, # for Edk.x modules
-        MODEL_EFI_LIBRARY_INSTANCE      :   MetaFileParser._CommonParser, # for Edk.x modules
-        MODEL_EFI_LIBRARY_CLASS         :   MetaFileParser._PathParser,
-        MODEL_META_DATA_PACKAGE         :   MetaFileParser._PathParser,
-        MODEL_META_DATA_NMAKE           :   _NmakeParser, # for Edk.x modules
-        MODEL_PCD_FIXED_AT_BUILD        :   _PcdParser,
-        MODEL_PCD_PATCHABLE_IN_MODULE   :   _PcdParser,
-        MODEL_PCD_FEATURE_FLAG          :   _PcdParser,
-        MODEL_PCD_DYNAMIC_EX            :   _PcdParser,
-        MODEL_PCD_DYNAMIC               :   _PcdParser,
-        MODEL_EFI_SOURCE_FILE           :   _SourceFileParser,
-        MODEL_EFI_GUID                  :   MetaFileParser._CommonParser,
-        MODEL_EFI_PROTOCOL              :   MetaFileParser._CommonParser,
-        MODEL_EFI_PPI                   :   MetaFileParser._CommonParser,
-        MODEL_EFI_DEPEX                 :   _DepexParser,
-        MODEL_EFI_BINARY_FILE           :   _BinaryFileParser,
-        MODEL_META_DATA_USER_EXTENSION  :   MetaFileParser._SkipUserExtension,
+        MODEL_UNKNOWN:   MetaFileParser._Skip,
+        MODEL_META_DATA_HEADER:   MetaFileParser._DefineParser,
+        MODEL_META_DATA_BUILD_OPTION:   MetaFileParser._BuildOptionParser,
+        MODEL_EFI_INCLUDE:   _IncludeParser,  # for Edk.x modules
+        MODEL_EFI_LIBRARY_INSTANCE:   MetaFileParser._CommonParser,  # for Edk.x modules
+        MODEL_EFI_LIBRARY_CLASS:   MetaFileParser._PathParser,
+        MODEL_META_DATA_PACKAGE:   MetaFileParser._PathParser,
+        MODEL_META_DATA_NMAKE:   _NmakeParser,  # for Edk.x modules
+        MODEL_PCD_FIXED_AT_BUILD:   _PcdParser,
+        MODEL_PCD_PATCHABLE_IN_MODULE:   _PcdParser,
+        MODEL_PCD_FEATURE_FLAG:   _PcdParser,
+        MODEL_PCD_DYNAMIC_EX:   _PcdParser,
+        MODEL_PCD_DYNAMIC:   _PcdParser,
+        MODEL_EFI_SOURCE_FILE:   _SourceFileParser,
+        MODEL_EFI_GUID:   MetaFileParser._CommonParser,
+        MODEL_EFI_PROTOCOL:   MetaFileParser._CommonParser,
+        MODEL_EFI_PPI:   MetaFileParser._CommonParser,
+        MODEL_EFI_DEPEX:   _DepexParser,
+        MODEL_EFI_BINARY_FILE:   _BinaryFileParser,
+        MODEL_META_DATA_USER_EXTENSION:   MetaFileParser._SkipUserExtension,
     }
 
-## DSC file parser class
+# DSC file parser class
 #
 #   @param      FilePath        The path of platform description file
 #   @param      FileType        The raw data of DSC file
@@ -853,37 +887,39 @@ class InfParser(MetaFileParser):
 #   @param      Owner           Owner ID (for sub-section parsing)
 #   @param      From            ID from which the data comes (for !INCLUDE directive)
 #
+
+
 class DscParser(MetaFileParser):
     # DSC file supported data types (one type per section)
     DataType = {
-        TAB_SKUIDS.upper()                          :   MODEL_EFI_SKU_ID,
-        TAB_DEFAULT_STORES.upper()                  :   MODEL_EFI_DEFAULT_STORES,
-        TAB_LIBRARIES.upper()                       :   MODEL_EFI_LIBRARY_INSTANCE,
-        TAB_LIBRARY_CLASSES.upper()                 :   MODEL_EFI_LIBRARY_CLASS,
-        TAB_BUILD_OPTIONS.upper()                   :   MODEL_META_DATA_BUILD_OPTION,
-        TAB_PACKAGES.upper()                        :   MODEL_META_DATA_PACKAGE,
-        TAB_PCDS_FIXED_AT_BUILD_NULL.upper()        :   MODEL_PCD_FIXED_AT_BUILD,
-        TAB_PCDS_PATCHABLE_IN_MODULE_NULL.upper()   :   MODEL_PCD_PATCHABLE_IN_MODULE,
-        TAB_PCDS_FEATURE_FLAG_NULL.upper()          :   MODEL_PCD_FEATURE_FLAG,
-        TAB_PCDS_DYNAMIC_DEFAULT_NULL.upper()       :   MODEL_PCD_DYNAMIC_DEFAULT,
-        TAB_PCDS_DYNAMIC_HII_NULL.upper()           :   MODEL_PCD_DYNAMIC_HII,
-        TAB_PCDS_DYNAMIC_VPD_NULL.upper()           :   MODEL_PCD_DYNAMIC_VPD,
-        TAB_PCDS_DYNAMIC_EX_DEFAULT_NULL.upper()    :   MODEL_PCD_DYNAMIC_EX_DEFAULT,
-        TAB_PCDS_DYNAMIC_EX_HII_NULL.upper()        :   MODEL_PCD_DYNAMIC_EX_HII,
-        TAB_PCDS_DYNAMIC_EX_VPD_NULL.upper()        :   MODEL_PCD_DYNAMIC_EX_VPD,
-        TAB_COMPONENTS.upper()                      :   MODEL_META_DATA_COMPONENT,
-        TAB_DSC_DEFINES.upper()                     :   MODEL_META_DATA_HEADER,
-        TAB_DSC_DEFINES_DEFINE                      :   MODEL_META_DATA_DEFINE,
-        TAB_DSC_DEFINES_EDKGLOBAL                   :   MODEL_META_DATA_GLOBAL_DEFINE,
-        TAB_INCLUDE.upper()                         :   MODEL_META_DATA_INCLUDE,
-        TAB_IF.upper()                              :   MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
-        TAB_IF_DEF.upper()                          :   MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF,
-        TAB_IF_N_DEF.upper()                        :   MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF,
-        TAB_ELSE_IF.upper()                         :   MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSEIF,
-        TAB_ELSE.upper()                            :   MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSE,
-        TAB_END_IF.upper()                          :   MODEL_META_DATA_CONDITIONAL_STATEMENT_ENDIF,
-        TAB_USER_EXTENSIONS.upper()                 :   MODEL_META_DATA_USER_EXTENSION,
-        TAB_ERROR.upper()                           :   MODEL_META_DATA_CONDITIONAL_STATEMENT_ERROR,
+        TAB_SKUIDS.upper():   MODEL_EFI_SKU_ID,
+        TAB_DEFAULT_STORES.upper():   MODEL_EFI_DEFAULT_STORES,
+        TAB_LIBRARIES.upper():   MODEL_EFI_LIBRARY_INSTANCE,
+        TAB_LIBRARY_CLASSES.upper():   MODEL_EFI_LIBRARY_CLASS,
+        TAB_BUILD_OPTIONS.upper():   MODEL_META_DATA_BUILD_OPTION,
+        TAB_PACKAGES.upper():   MODEL_META_DATA_PACKAGE,
+        TAB_PCDS_FIXED_AT_BUILD_NULL.upper():   MODEL_PCD_FIXED_AT_BUILD,
+        TAB_PCDS_PATCHABLE_IN_MODULE_NULL.upper():   MODEL_PCD_PATCHABLE_IN_MODULE,
+        TAB_PCDS_FEATURE_FLAG_NULL.upper():   MODEL_PCD_FEATURE_FLAG,
+        TAB_PCDS_DYNAMIC_DEFAULT_NULL.upper():   MODEL_PCD_DYNAMIC_DEFAULT,
+        TAB_PCDS_DYNAMIC_HII_NULL.upper():   MODEL_PCD_DYNAMIC_HII,
+        TAB_PCDS_DYNAMIC_VPD_NULL.upper():   MODEL_PCD_DYNAMIC_VPD,
+        TAB_PCDS_DYNAMIC_EX_DEFAULT_NULL.upper():   MODEL_PCD_DYNAMIC_EX_DEFAULT,
+        TAB_PCDS_DYNAMIC_EX_HII_NULL.upper():   MODEL_PCD_DYNAMIC_EX_HII,
+        TAB_PCDS_DYNAMIC_EX_VPD_NULL.upper():   MODEL_PCD_DYNAMIC_EX_VPD,
+        TAB_COMPONENTS.upper():   MODEL_META_DATA_COMPONENT,
+        TAB_DSC_DEFINES.upper():   MODEL_META_DATA_HEADER,
+        TAB_DSC_DEFINES_DEFINE:   MODEL_META_DATA_DEFINE,
+        TAB_DSC_DEFINES_EDKGLOBAL:   MODEL_META_DATA_GLOBAL_DEFINE,
+        TAB_INCLUDE.upper():   MODEL_META_DATA_INCLUDE,
+        TAB_IF.upper():   MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
+        TAB_IF_DEF.upper():   MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF,
+        TAB_IF_N_DEF.upper():   MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF,
+        TAB_ELSE_IF.upper():   MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSEIF,
+        TAB_ELSE.upper():   MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSE,
+        TAB_END_IF.upper():   MODEL_META_DATA_CONDITIONAL_STATEMENT_ENDIF,
+        TAB_USER_EXTENSIONS.upper():   MODEL_META_DATA_USER_EXTENSION,
+        TAB_ERROR.upper():   MODEL_META_DATA_CONDITIONAL_STATEMENT_ERROR,
     }
 
     # Valid names in define section
@@ -917,7 +953,7 @@ class DscParser(MetaFileParser):
 
     IncludedFiles = set()
 
-    ## Constructor of DscParser
+    # Constructor of DscParser
     #
     #  Initialize object of DscParser
     #
@@ -928,11 +964,12 @@ class DscParser(MetaFileParser):
     #   @param      Owner           Owner ID (for sub-section parsing)
     #   @param      From            ID from which the data comes (for !INCLUDE directive)
     #
-    def __init__(self, FilePath, FileType, Arch, Table, Owner= -1, From= -1):
+    def __init__(self, FilePath, FileType, Arch, Table, Owner=-1, From=-1):
         # prevent re-initialization
         if hasattr(self, "_Table") and self._Table is Table:
             return
-        MetaFileParser.__init__(self, FilePath, FileType, Arch, Table, Owner, From)
+        MetaFileParser.__init__(self, FilePath, FileType,
+                                Arch, Table, Owner, From)
         self._Version = 0x00010005  # Only EDK2 dsc file is supported
         # to store conditional directive evaluation result
         self._DirectiveStack = []
@@ -950,18 +987,19 @@ class DscParser(MetaFileParser):
         #  Map the ID between the original table and new table to track
         #  the owner item
         #
-        self._IdMapping = {-1:-1}
+        self._IdMapping = {-1: -1}
 
         self._Content = None
 
-    ## Parser starter
+    # Parser starter
     def Start(self):
         Content = ''
         try:
             with open(str(self.MetaFile), 'r') as File:
                 Content = File.readlines()
         except:
-            EdkLogger.error("Parser", FILE_READ_FAILURE, ExtraData=self.MetaFile)
+            EdkLogger.error("Parser", FILE_READ_FAILURE,
+                            ExtraData=self.MetaFile)
 
         OwnerId = {}
 
@@ -1004,7 +1042,8 @@ class DscParser(MetaFileParser):
                     self._DirectiveParser()
                 continue
             if Line[0] == TAB_OPTION_START and not self._InSubsection:
-                EdkLogger.error("Parser", FILE_READ_FAILURE, "Missing the '{' before %s in Line %s" % (Line, Index+1), ExtraData=self.MetaFile)
+                EdkLogger.error("Parser", FILE_READ_FAILURE, "Missing the '{' before %s in Line %s" % (
+                    Line, Index+1), ExtraData=self.MetaFile)
 
             if self._InSubsection:
                 SectionType = self._SubsectionType
@@ -1030,21 +1069,21 @@ class DscParser(MetaFileParser):
                 if self._SubsectionType != MODEL_UNKNOWN and Arch in OwnerId:
                     Owner = OwnerId[Arch]
                 self._LastItem = self._Store(
-                                        self._ItemType,
-                                        self._ValueList[0],
-                                        self._ValueList[1],
-                                        self._ValueList[2],
-                                        Arch,
-                                        ModuleType,
-                                        DefaultStore,
-                                        Owner,
-                                        self._From,
-                                        self._LineIndex + 1,
-                                        - 1,
-                                        self._LineIndex + 1,
-                                        - 1,
-                                        self._Enabled
-                                        )
+                    self._ItemType,
+                    self._ValueList[0],
+                    self._ValueList[1],
+                    self._ValueList[2],
+                    Arch,
+                    ModuleType,
+                    DefaultStore,
+                    Owner,
+                    self._From,
+                    self._LineIndex + 1,
+                    - 1,
+                    self._LineIndex + 1,
+                    - 1,
+                    self._Enabled
+                )
                 if self._SubsectionType == MODEL_UNKNOWN and self._InSubsection:
                     OwnerId[Arch] = self._LastItem
 
@@ -1054,7 +1093,7 @@ class DscParser(MetaFileParser):
                             ExtraData=Text, File=self.MetaFile, Line=Line)
         self._Done()
 
-    ## <subsection_header> parser
+    # <subsection_header> parser
     def _SubsectionHeaderParser(self):
         self._SubsectionName = self._CurrentLine[1:-1].upper()
         if self._SubsectionName in self.DataType:
@@ -1065,7 +1104,7 @@ class DscParser(MetaFileParser):
                            Line=self._LineIndex + 1, ExtraData=self._CurrentLine)
         self._ValueList[0] = self._SubsectionName
 
-    ## Directive statement parser
+    # Directive statement parser
     def _DirectiveParser(self):
         self._ValueList = ['', '', '']
         TokenList = GetSplitValueList(self._CurrentLine, ' ', 1)
@@ -1115,7 +1154,8 @@ class DscParser(MetaFileParser):
                 EdkLogger.error("Parser", FORMAT_INVALID, "'!elseif' after '!else'",
                                 File=self.MetaFile, Line=self._LineIndex + 1,
                                 ExtraData=self._CurrentLine)
-            self._DirectiveStack.append((ItemType, self._LineIndex + 1, self._CurrentLine))
+            self._DirectiveStack.append(
+                (ItemType, self._LineIndex + 1, self._CurrentLine))
 
         #
         # Model, Value1, Value2, Value3, Arch, ModuleType, BelongsToItem=-1, BelongsToFile=-1,
@@ -1123,23 +1163,23 @@ class DscParser(MetaFileParser):
         #
         for Arch, ModuleType, DefaultStore in Scope:
             self._LastItem = self._Store(
-                                    ItemType,
-                                    self._ValueList[0],
-                                    self._ValueList[1],
-                                    self._ValueList[2],
-                                    Arch,
-                                    ModuleType,
-                                    DefaultStore,
-                                    self._Owner[-1],
-                                    self._From,
-                                    self._LineIndex + 1,
-                                    - 1,
-                                    self._LineIndex + 1,
-                                    - 1,
-                                    0
-                                    )
+                ItemType,
+                self._ValueList[0],
+                self._ValueList[1],
+                self._ValueList[2],
+                Arch,
+                ModuleType,
+                DefaultStore,
+                self._Owner[-1],
+                self._From,
+                self._LineIndex + 1,
+                - 1,
+                self._LineIndex + 1,
+                - 1,
+                0
+            )
 
-    ## [defines] section parser
+    # [defines] section parser
     @ParseMacro
     def _DefineParser(self):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_EQUAL_SPLIT, 1)
@@ -1153,7 +1193,7 @@ class DscParser(MetaFileParser):
             EdkLogger.error('Parser', FORMAT_INVALID, "No value specified",
                             ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
         if (not self._ValueList[1] in self.DefineKeywords and
-            (self._InSubsection and self._ValueList[1] not in self.SubSectionDefineKeywords)):
+                (self._InSubsection and self._ValueList[1] not in self.SubSectionDefineKeywords)):
             EdkLogger.error('Parser', FORMAT_INVALID,
                             "Unknown keyword found: %s. "
                             "If this is a macro you must "
@@ -1170,6 +1210,7 @@ class DscParser(MetaFileParser):
             EdkLogger.error('Parser', FORMAT_INVALID, "Correct format is '<Number>|<UiName>[|<UiName>]'",
                             ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
         self._ValueList[0:len(TokenList)] = TokenList
+
     @ParseMacro
     def _DefaultStoresParser(self):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
@@ -1178,15 +1219,14 @@ class DscParser(MetaFileParser):
                             ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
         self._ValueList[0:len(TokenList)] = TokenList
 
-    ## Parse Edk style of library modules
+    # Parse Edk style of library modules
     @ParseMacro
     def _LibraryInstanceParser(self):
         self._ValueList[0] = self._CurrentLine
 
-
     def _DecodeCODEData(self):
         pass
-    ## PCD sections parser
+    # PCD sections parser
     #
     #   [PcdsFixedAtBuild]
     #   [PcdsPatchableInModule]
@@ -1200,12 +1240,14 @@ class DscParser(MetaFileParser):
     #   [PcdsDynamicVpd]
     #   [PcdsDynamicHii]
     #
+
     @ParseMacro
     def _PcdParser(self):
         if self._PcdDataTypeCODE:
             self._PcdCodeValue = self._PcdCodeValue + "\n " + self._CurrentLine
             if self._CurrentLine.endswith(")}"):
-                self._CurrentLine = "|".join((self._CurrentPcdName, self._PcdCodeValue))
+                self._CurrentLine = "|".join(
+                    (self._CurrentPcdName, self._PcdCodeValue))
                 self._PcdDataTypeCODE = False
                 self._PcdCodeValue = ""
             else:
@@ -1229,14 +1271,17 @@ class DscParser(MetaFileParser):
         if len(PcdNameTockens) == 2:
             self._ValueList[0], self._ValueList[1] = PcdNameTockens[0], PcdNameTockens[1]
         elif len(PcdNameTockens) == 3:
-            self._ValueList[0], self._ValueList[1] = ".".join((PcdNameTockens[0], PcdNameTockens[1])), PcdNameTockens[2]
+            self._ValueList[0], self._ValueList[1] = ".".join(
+                (PcdNameTockens[0], PcdNameTockens[1])), PcdNameTockens[2]
         elif len(PcdNameTockens) > 3:
-            self._ValueList[0], self._ValueList[1] = ".".join((PcdNameTockens[0], PcdNameTockens[1])), ".".join(PcdNameTockens[2:])
+            self._ValueList[0], self._ValueList[1] = ".".join(
+                (PcdNameTockens[0], PcdNameTockens[1])), ".".join(PcdNameTockens[2:])
         if len(TokenList) == 2:
             self._ValueList[2] = TokenList[1]
         if self._ValueList[0] == '' or self._ValueList[1] == '':
             EdkLogger.error('Parser', FORMAT_INVALID, "No token space GUID or PCD name specified",
-                            ExtraData=self._CurrentLine + " (<TokenSpaceGuidCName>.<TokenCName>|<PcdValue>)",
+                            ExtraData=self._CurrentLine +
+                            " (<TokenSpaceGuidCName>.<TokenCName>|<PcdValue>)",
                             File=self.MetaFile, Line=self._LineIndex + 1)
         if self._ValueList[2] == '':
             #
@@ -1245,32 +1290,36 @@ class DscParser(MetaFileParser):
             if self._SectionType in (MODEL_PCD_FIXED_AT_BUILD, MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT):
                 return
             EdkLogger.error('Parser', FORMAT_INVALID, "No PCD value given",
-                            ExtraData=self._CurrentLine + " (<TokenSpaceGuidCName>.<TokenCName>|<PcdValue>)",
+                            ExtraData=self._CurrentLine +
+                            " (<TokenSpaceGuidCName>.<TokenCName>|<PcdValue>)",
                             File=self.MetaFile, Line=self._LineIndex + 1)
 
         # Validate the datum type of Dynamic Defaul PCD and DynamicEx Default PCD
         ValueList = GetSplitValueList(self._ValueList[2])
         if len(ValueList) > 1 and ValueList[1] in [TAB_UINT8, TAB_UINT16, TAB_UINT32, TAB_UINT64] \
-                              and self._ItemType in [MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT]:
+                and self._ItemType in [MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT]:
             EdkLogger.error('Parser', FORMAT_INVALID, "The datum type '%s' of PCD is wrong" % ValueList[1],
                             ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
 
         # Validate the VariableName of DynamicHii and DynamicExHii for PCD Entry must not be an empty string
         if self._ItemType in [MODEL_PCD_DYNAMIC_HII, MODEL_PCD_DYNAMIC_EX_HII]:
-            DscPcdValueList = GetSplitValueList(TokenList[1], TAB_VALUE_SPLIT, 1)
+            DscPcdValueList = GetSplitValueList(
+                TokenList[1], TAB_VALUE_SPLIT, 1)
             if len(DscPcdValueList[0].replace('L', '').replace('"', '').strip()) == 0:
                 EdkLogger.error('Parser', FORMAT_INVALID, "The VariableName field in the HII format PCD entry must not be an empty string",
-                            ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
+                                ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
 
         # if value are 'True', 'true', 'TRUE' or 'False', 'false', 'FALSE', replace with integer 1 or 0.
         DscPcdValueList = GetSplitValueList(TokenList[1], TAB_VALUE_SPLIT, 1)
         if DscPcdValueList[0] in ['True', 'true', 'TRUE']:
-            self._ValueList[2] = TokenList[1].replace(DscPcdValueList[0], '1', 1);
+            self._ValueList[2] = TokenList[1].replace(
+                DscPcdValueList[0], '1', 1)
         elif DscPcdValueList[0] in ['False', 'false', 'FALSE']:
-            self._ValueList[2] = TokenList[1].replace(DscPcdValueList[0], '0', 1);
+            self._ValueList[2] = TokenList[1].replace(
+                DscPcdValueList[0], '0', 1)
 
+    # [components] section parser
 
-    ## [components] section parser
     @ParseMacro
     def _ComponentParser(self):
         if self._CurrentLine[-1] == '{':
@@ -1280,27 +1329,30 @@ class DscParser(MetaFileParser):
         else:
             self._ValueList[0] = self._CurrentLine
 
-    ## [LibraryClasses] section
+    # [LibraryClasses] section
     @ParseMacro
     def _LibraryClassParser(self):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT)
         if len(TokenList) < 2:
             EdkLogger.error('Parser', FORMAT_INVALID, "No library class or instance specified",
-                            ExtraData=self._CurrentLine + " (<LibraryClassName>|<LibraryInstancePath>)",
+                            ExtraData=self._CurrentLine +
+                            " (<LibraryClassName>|<LibraryInstancePath>)",
                             File=self.MetaFile, Line=self._LineIndex + 1)
         if TokenList[0] == '':
             EdkLogger.error('Parser', FORMAT_INVALID, "No library class specified",
-                            ExtraData=self._CurrentLine + " (<LibraryClassName>|<LibraryInstancePath>)",
+                            ExtraData=self._CurrentLine +
+                            " (<LibraryClassName>|<LibraryInstancePath>)",
                             File=self.MetaFile, Line=self._LineIndex + 1)
         if TokenList[1] == '':
             EdkLogger.error('Parser', FORMAT_INVALID, "No library instance specified",
-                            ExtraData=self._CurrentLine + " (<LibraryClassName>|<LibraryInstancePath>)",
+                            ExtraData=self._CurrentLine +
+                            " (<LibraryClassName>|<LibraryInstancePath>)",
                             File=self.MetaFile, Line=self._LineIndex + 1)
 
         self._ValueList[0:len(TokenList)] = TokenList
 
+    # [BuildOptions] section parser
 
-    ## [BuildOptions] section parser
     @ParseMacro
     def _BuildOptionParser(self):
         self._CurrentLine = CleanString(self._CurrentLine, BuildOption=True)
@@ -1318,13 +1370,14 @@ class DscParser(MetaFileParser):
             EdkLogger.error(
                 'Parser',
                 FORMAT_INVALID,
-                "'%s' must be in format of <TARGET>_<TOOLCHAIN>_<ARCH>_<TOOL>_FLAGS" % self._ValueList[1],
+                "'%s' must be in format of <TARGET>_<TOOLCHAIN>_<ARCH>_<TOOL>_FLAGS" % self._ValueList[
+                    1],
                 ExtraData=self._CurrentLine,
                 File=self.MetaFile,
                 Line=self._LineIndex + 1
-                )
+            )
 
-    ## Override parent's method since we'll do all macro replacements in parser
+    # Override parent's method since we'll do all macro replacements in parser
     @property
     def _Macros(self):
         Macros = {}
@@ -1347,40 +1400,41 @@ class DscParser(MetaFileParser):
 
     def _PostProcess(self):
         Processer = {
-            MODEL_META_DATA_SECTION_HEADER                  :   self.__ProcessSectionHeader,
-            MODEL_META_DATA_SUBSECTION_HEADER               :   self.__ProcessSubsectionHeader,
-            MODEL_META_DATA_HEADER                          :   self.__ProcessDefine,
-            MODEL_META_DATA_DEFINE                          :   self.__ProcessDefine,
-            MODEL_META_DATA_GLOBAL_DEFINE                   :   self.__ProcessDefine,
-            MODEL_META_DATA_INCLUDE                         :   self.__ProcessDirective,
-            MODEL_META_DATA_PACKAGE                         :   self.__ProcessPackages,
-            MODEL_META_DATA_CONDITIONAL_STATEMENT_IF        :   self.__ProcessDirective,
-            MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSE      :   self.__ProcessDirective,
-            MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF     :   self.__ProcessDirective,
-            MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF    :   self.__ProcessDirective,
-            MODEL_META_DATA_CONDITIONAL_STATEMENT_ENDIF     :   self.__ProcessDirective,
-            MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSEIF    :   self.__ProcessDirective,
-            MODEL_EFI_SKU_ID                                :   self.__ProcessSkuId,
-            MODEL_EFI_DEFAULT_STORES                        :   self.__ProcessDefaultStores,
-            MODEL_EFI_LIBRARY_INSTANCE                      :   self.__ProcessLibraryInstance,
-            MODEL_EFI_LIBRARY_CLASS                         :   self.__ProcessLibraryClass,
-            MODEL_PCD_FIXED_AT_BUILD                        :   self.__ProcessPcd,
-            MODEL_PCD_PATCHABLE_IN_MODULE                   :   self.__ProcessPcd,
-            MODEL_PCD_FEATURE_FLAG                          :   self.__ProcessPcd,
-            MODEL_PCD_DYNAMIC_DEFAULT                       :   self.__ProcessPcd,
-            MODEL_PCD_DYNAMIC_HII                           :   self.__ProcessPcd,
-            MODEL_PCD_DYNAMIC_VPD                           :   self.__ProcessPcd,
-            MODEL_PCD_DYNAMIC_EX_DEFAULT                    :   self.__ProcessPcd,
-            MODEL_PCD_DYNAMIC_EX_HII                        :   self.__ProcessPcd,
-            MODEL_PCD_DYNAMIC_EX_VPD                        :   self.__ProcessPcd,
-            MODEL_META_DATA_COMPONENT                       :   self.__ProcessComponent,
-            MODEL_META_DATA_BUILD_OPTION                    :   self.__ProcessBuildOption,
-            MODEL_UNKNOWN                                   :   self._Skip,
-            MODEL_META_DATA_USER_EXTENSION                  :   self._SkipUserExtension,
-            MODEL_META_DATA_CONDITIONAL_STATEMENT_ERROR     :   self._ProcessError,
+            MODEL_META_DATA_SECTION_HEADER:   self.__ProcessSectionHeader,
+            MODEL_META_DATA_SUBSECTION_HEADER:   self.__ProcessSubsectionHeader,
+            MODEL_META_DATA_HEADER:   self.__ProcessDefine,
+            MODEL_META_DATA_DEFINE:   self.__ProcessDefine,
+            MODEL_META_DATA_GLOBAL_DEFINE:   self.__ProcessDefine,
+            MODEL_META_DATA_INCLUDE:   self.__ProcessDirective,
+            MODEL_META_DATA_PACKAGE:   self.__ProcessPackages,
+            MODEL_META_DATA_CONDITIONAL_STATEMENT_IF:   self.__ProcessDirective,
+            MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSE:   self.__ProcessDirective,
+            MODEL_META_DATA_CONDITIONAL_STATEMENT_IFDEF:   self.__ProcessDirective,
+            MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF:   self.__ProcessDirective,
+            MODEL_META_DATA_CONDITIONAL_STATEMENT_ENDIF:   self.__ProcessDirective,
+            MODEL_META_DATA_CONDITIONAL_STATEMENT_ELSEIF:   self.__ProcessDirective,
+            MODEL_EFI_SKU_ID:   self.__ProcessSkuId,
+            MODEL_EFI_DEFAULT_STORES:   self.__ProcessDefaultStores,
+            MODEL_EFI_LIBRARY_INSTANCE:   self.__ProcessLibraryInstance,
+            MODEL_EFI_LIBRARY_CLASS:   self.__ProcessLibraryClass,
+            MODEL_PCD_FIXED_AT_BUILD:   self.__ProcessPcd,
+            MODEL_PCD_PATCHABLE_IN_MODULE:   self.__ProcessPcd,
+            MODEL_PCD_FEATURE_FLAG:   self.__ProcessPcd,
+            MODEL_PCD_DYNAMIC_DEFAULT:   self.__ProcessPcd,
+            MODEL_PCD_DYNAMIC_HII:   self.__ProcessPcd,
+            MODEL_PCD_DYNAMIC_VPD:   self.__ProcessPcd,
+            MODEL_PCD_DYNAMIC_EX_DEFAULT:   self.__ProcessPcd,
+            MODEL_PCD_DYNAMIC_EX_HII:   self.__ProcessPcd,
+            MODEL_PCD_DYNAMIC_EX_VPD:   self.__ProcessPcd,
+            MODEL_META_DATA_COMPONENT:   self.__ProcessComponent,
+            MODEL_META_DATA_BUILD_OPTION:   self.__ProcessBuildOption,
+            MODEL_UNKNOWN:   self._Skip,
+            MODEL_META_DATA_USER_EXTENSION:   self._SkipUserExtension,
+            MODEL_META_DATA_CONDITIONAL_STATEMENT_ERROR:   self._ProcessError,
         }
 
-        self._Table = MetaFileStorage(self._RawTable.DB, self.MetaFile, MODEL_FILE_DSC, True)
+        self._Table = MetaFileStorage(
+            self._RawTable.DB, self.MetaFile, MODEL_FILE_DSC, True)
         self._DirectiveStack = []
         self._DirectiveEvalStack = []
         self._FileWithError = self.MetaFile
@@ -1393,7 +1447,7 @@ class DscParser(MetaFileParser):
         self._Content = self._RawTable.GetAll()
         self._ContentIndex = 0
         self._InSubsection = False
-        while self._ContentIndex < len(self._Content) :
+        while self._ContentIndex < len(self._Content):
             Id, self._ItemType, V1, V2, V3, S1, S2, S3, Owner, self._From, \
                 LineStart, ColStart, LineEnd, ColEnd, Enabled = self._Content[self._ContentIndex]
 
@@ -1438,52 +1492,60 @@ class DscParser(MetaFileParser):
                         EdkLogger.error('Parser', FORMAT_INVALID, "Cannot use this PCD (%s) in an expression as"
                                         " it must be defined in a [PcdsFixedAtBuild] or [PcdsFeatureFlag] section"
                                         " of the DSC file, and it is currently defined in this section:"
-                                        " %s, line #: %d." % (Excpt.Pcd, Info[0], Info[1]),
-                                    File=self._FileWithError, ExtraData=' '.join(self._ValueList),
-                                    Line=self._LineIndex + 1)
+                                        " %s, line #: %d." % (
+                                            Excpt.Pcd, Info[0], Info[1]),
+                                        File=self._FileWithError, ExtraData=' '.join(
+                                            self._ValueList),
+                                        Line=self._LineIndex + 1)
                     else:
                         EdkLogger.error('Parser', FORMAT_INVALID, "PCD (%s) is not defined in DSC file" % Excpt.Pcd,
-                                    File=self._FileWithError, ExtraData=' '.join(self._ValueList),
-                                    Line=self._LineIndex + 1)
+                                        File=self._FileWithError, ExtraData=' '.join(
+                                            self._ValueList),
+                                        Line=self._LineIndex + 1)
                 else:
                     EdkLogger.error('Parser', FORMAT_INVALID, "Invalid expression: %s" % str(Excpt),
-                                    File=self._FileWithError, ExtraData=' '.join(self._ValueList),
+                                    File=self._FileWithError, ExtraData=' '.join(
+                                        self._ValueList),
                                     Line=self._LineIndex + 1)
             except MacroException as Excpt:
                 EdkLogger.error('Parser', FORMAT_INVALID, str(Excpt),
-                                File=self._FileWithError, ExtraData=' '.join(self._ValueList),
+                                File=self._FileWithError, ExtraData=' '.join(
+                                    self._ValueList),
                                 Line=self._LineIndex + 1)
 
             if self._ValueList is None:
                 continue
 
             NewOwner = self._IdMapping.get(Owner, -1)
-            self._Enabled = int((not self._DirectiveEvalStack) or (False not in self._DirectiveEvalStack))
+            self._Enabled = int((not self._DirectiveEvalStack) or (
+                False not in self._DirectiveEvalStack))
             self._LastItem = self._Store(
-                                self._ItemType,
-                                self._ValueList[0],
-                                self._ValueList[1],
-                                self._ValueList[2],
-                                S1,
-                                S2,
-                                S3,
-                                NewOwner,
-                                self._From,
-                                self._LineIndex + 1,
-                                - 1,
-                                self._LineIndex + 1,
-                                - 1,
-                                self._Enabled
-                                )
+                self._ItemType,
+                self._ValueList[0],
+                self._ValueList[1],
+                self._ValueList[2],
+                S1,
+                S2,
+                S3,
+                NewOwner,
+                self._From,
+                self._LineIndex + 1,
+                - 1,
+                self._LineIndex + 1,
+                - 1,
+                self._Enabled
+            )
             self._IdMapping[Id] = self._LastItem
 
         GlobalData.gPlatformDefines.update(self._FileLocalMacros)
         self._PostProcessed = True
         self._Content = None
+
     def _ProcessError(self):
         if not self._Enabled:
             return
-        EdkLogger.error('Parser', ERROR_STATEMENT, self._ValueList[1], File=self.MetaFile, Line=self._LineIndex + 1)
+        EdkLogger.error('Parser', ERROR_STATEMENT,
+                        self._ValueList[1], File=self.MetaFile, Line=self._LineIndex + 1)
 
     def __ProcessSectionHeader(self):
         self._SectionName = self._ValueList[0]
@@ -1504,20 +1566,22 @@ class DscParser(MetaFileParser):
             with open(str(self.MetaFile), 'r') as File:
                 Content = File.readlines()
         except:
-            EdkLogger.error("Parser", FILE_READ_FAILURE, ExtraData=self.MetaFile)
+            EdkLogger.error("Parser", FILE_READ_FAILURE,
+                            ExtraData=self.MetaFile)
 
         GlobalData.gPlatformOtherPcds['DSCFILE'] = str(self.MetaFile)
         for PcdType in (MODEL_PCD_PATCHABLE_IN_MODULE, MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_HII,
                         MODEL_PCD_DYNAMIC_VPD, MODEL_PCD_DYNAMIC_EX_DEFAULT, MODEL_PCD_DYNAMIC_EX_HII,
                         MODEL_PCD_DYNAMIC_EX_VPD):
-            Records = self._RawTable.Query(PcdType, BelongsToItem= -1.0)
+            Records = self._RawTable.Query(PcdType, BelongsToItem=-1.0)
             for TokenSpaceGuid, PcdName, Value, Dummy2, Dummy3, Dummy4, ID, Line in Records:
                 Name = TokenSpaceGuid + '.' + PcdName
                 if Name not in GlobalData.gPlatformOtherPcds:
                     PcdLine = Line
                     while not Content[Line - 1].lstrip().startswith(TAB_SECTION_START):
                         Line -= 1
-                    GlobalData.gPlatformOtherPcds[Name] = (CleanString(Content[Line - 1]), PcdLine, PcdType)
+                    GlobalData.gPlatformOtherPcds[Name] = (
+                        CleanString(Content[Line - 1]), PcdLine, PcdType)
 
     def __ProcessDefine(self):
         if not self._Enabled:
@@ -1557,7 +1621,8 @@ class DscParser(MetaFileParser):
             try:
                 Result = ValueExpression(self._ValueList[1], Macros)()
             except SymbolNotFound as Exc:
-                EdkLogger.debug(EdkLogger.DEBUG_5, str(Exc), self._ValueList[1])
+                EdkLogger.debug(EdkLogger.DEBUG_5, str(Exc),
+                                self._ValueList[1])
                 Result = False
             except WrnExpression as Excpt:
                 #
@@ -1565,8 +1630,9 @@ class DscParser(MetaFileParser):
                 # the precise number of line and return the evaluation result
                 #
                 EdkLogger.warn('Parser', "Suspicious expression: %s" % str(Excpt),
-                                File=self._FileWithError, ExtraData=' '.join(self._ValueList),
-                                Line=self._LineIndex + 1)
+                               File=self._FileWithError, ExtraData=' '.join(
+                                   self._ValueList),
+                               Line=self._LineIndex + 1)
                 Result = Excpt.result
 
         if self._ItemType in [MODEL_META_DATA_CONDITIONAL_STATEMENT_IF,
@@ -1577,7 +1643,8 @@ class DscParser(MetaFileParser):
                 Result = bool(Result)
             else:
                 Macro = self._ValueList[1]
-                Macro = Macro[2:-1] if (Macro.startswith("$(") and Macro.endswith(")")) else Macro
+                Macro = Macro[2:-1] if (Macro.startswith("$(")
+                                        and Macro.endswith(")")) else Macro
                 Result = Macro in self._Macros
                 if self._ItemType == MODEL_META_DATA_CONDITIONAL_STATEMENT_IFNDEF:
                     Result = not Result
@@ -1611,7 +1678,8 @@ class DscParser(MetaFileParser):
             #
             __IncludeMacros.update(self._Macros)
 
-            IncludedFile = NormPath(ReplaceMacro(self._ValueList[1], __IncludeMacros, RaiseError=True))
+            IncludedFile = NormPath(ReplaceMacro(
+                self._ValueList[1], __IncludeMacros, RaiseError=True))
             #
             # First search the include file under the same directory as DSC file
             #
@@ -1622,7 +1690,8 @@ class DscParser(MetaFileParser):
                     #
                     # Also search file under the WORKSPACE directory
                     #
-                    IncludedFile1 = PathClass(IncludedFile, GlobalData.gWorkspace)
+                    IncludedFile1 = PathClass(
+                        IncludedFile, GlobalData.gWorkspace)
                     ErrorCode, ErrorInfo2 = IncludedFile1.Validate()
                     if ErrorCode != 0:
                         EdkLogger.error('parser', ErrorCode, File=self._FileWithError,
@@ -1635,11 +1704,12 @@ class DscParser(MetaFileParser):
                     Owner = self._Content[self._ContentIndex - 1][8]
                 else:
                     Owner = self._Content[self._ContentIndex - 1][0]
-                IncludedFileTable = MetaFileStorage(self._RawTable.DB, IncludedFile1, MODEL_FILE_DSC, False, FromItem=FromItem)
+                IncludedFileTable = MetaFileStorage(
+                    self._RawTable.DB, IncludedFile1, MODEL_FILE_DSC, False, FromItem=FromItem)
                 Parser = DscParser(IncludedFile1, self._FileType, self._Arch, IncludedFileTable,
                                    Owner=Owner, From=FromItem)
 
-                self.IncludedFiles.add (IncludedFile1)
+                self.IncludedFiles.add(IncludedFile1)
 
                 # set the parser status with current status
                 Parser._SectionName = self._SectionName
@@ -1664,22 +1734,27 @@ class DscParser(MetaFileParser):
     def __ProcessSkuId(self):
         self._ValueList = [ReplaceMacro(Value, self._Macros, RaiseError=True)
                            for Value in self._ValueList]
+
     def __ProcessDefaultStores(self):
         self._ValueList = [ReplaceMacro(Value, self._Macros, RaiseError=True)
                            for Value in self._ValueList]
 
     def __ProcessLibraryInstance(self):
-        self._ValueList = [ReplaceMacro(Value, self._Macros) for Value in self._ValueList]
+        self._ValueList = [ReplaceMacro(Value, self._Macros)
+                           for Value in self._ValueList]
 
     def __ProcessLibraryClass(self):
-        self._ValueList[1] = ReplaceMacro(self._ValueList[1], self._Macros, RaiseError=True)
+        self._ValueList[1] = ReplaceMacro(
+            self._ValueList[1], self._Macros, RaiseError=True)
 
     def __ProcessPcd(self):
         if self._ItemType not in [MODEL_PCD_FEATURE_FLAG, MODEL_PCD_FIXED_AT_BUILD]:
-            self._ValueList[2] = ReplaceMacro(self._ValueList[2], self._Macros, RaiseError=True)
+            self._ValueList[2] = ReplaceMacro(
+                self._ValueList[2], self._Macros, RaiseError=True)
             return
 
-        ValList, Valid, Index = AnalyzeDscPcd(self._ValueList[2], self._ItemType)
+        ValList, Valid, Index = AnalyzeDscPcd(
+            self._ValueList[2], self._ItemType)
         if not Valid:
             if self._ItemType in (MODEL_PCD_DYNAMIC_DEFAULT, MODEL_PCD_DYNAMIC_EX_DEFAULT, MODEL_PCD_FIXED_AT_BUILD, MODEL_PCD_PATCHABLE_IN_MODULE):
                 if ValList[1] != TAB_VOID and StructPattern.match(ValList[1]) is None and ValList[2]:
@@ -1702,7 +1777,8 @@ class DscParser(MetaFileParser):
             ValList[Index] = '0'
 
         if (not self._DirectiveEvalStack) or (False not in self._DirectiveEvalStack):
-            GlobalData.gPlatformPcds[TAB_SPLIT.join(self._ValueList[0:2])] = PcdValue
+            GlobalData.gPlatformPcds[TAB_SPLIT.join(
+                self._ValueList[0:2])] = PcdValue
             self._Symbols[TAB_SPLIT.join(self._ValueList[0:2])] = PcdValue
         try:
             self._ValueList[2] = '|'.join(ValList)
@@ -1716,61 +1792,63 @@ class DscParser(MetaFileParser):
         self._ValueList = [ReplaceMacro(Value, self._Macros, RaiseError=False)
                            for Value in self._ValueList]
 
-    def DisableOverrideComponent(self,module_id):
+    def DisableOverrideComponent(self, module_id):
         for ori_id in self._IdMapping:
             if self._IdMapping[ori_id] == module_id:
                 self._RawTable.DisableComponent(ori_id)
 
     _SectionParser = {
-        MODEL_META_DATA_HEADER                          :   _DefineParser,
-        MODEL_EFI_SKU_ID                                :   _SkuIdParser,
-        MODEL_EFI_DEFAULT_STORES                        :   _DefaultStoresParser,
-        MODEL_EFI_LIBRARY_INSTANCE                      :   _LibraryInstanceParser,
-        MODEL_EFI_LIBRARY_CLASS                         :   _LibraryClassParser,
-        MODEL_PCD_FIXED_AT_BUILD                        :   _PcdParser,
-        MODEL_PCD_PATCHABLE_IN_MODULE                   :   _PcdParser,
-        MODEL_PCD_FEATURE_FLAG                          :   _PcdParser,
-        MODEL_PCD_DYNAMIC_DEFAULT                       :   _PcdParser,
-        MODEL_PCD_DYNAMIC_HII                           :   _PcdParser,
-        MODEL_PCD_DYNAMIC_VPD                           :   _PcdParser,
-        MODEL_PCD_DYNAMIC_EX_DEFAULT                    :   _PcdParser,
-        MODEL_PCD_DYNAMIC_EX_HII                        :   _PcdParser,
-        MODEL_PCD_DYNAMIC_EX_VPD                        :   _PcdParser,
-        MODEL_META_DATA_COMPONENT                       :   _ComponentParser,
-        MODEL_META_DATA_BUILD_OPTION                    :   _BuildOptionParser,
-        MODEL_UNKNOWN                                   :   MetaFileParser._Skip,
-        MODEL_META_DATA_PACKAGE                         :   MetaFileParser._PackageParser,
-        MODEL_META_DATA_USER_EXTENSION                  :   MetaFileParser._SkipUserExtension,
-        MODEL_META_DATA_SECTION_HEADER                  :   MetaFileParser._SectionHeaderParser,
-        MODEL_META_DATA_SUBSECTION_HEADER               :   _SubsectionHeaderParser,
+        MODEL_META_DATA_HEADER:   _DefineParser,
+        MODEL_EFI_SKU_ID:   _SkuIdParser,
+        MODEL_EFI_DEFAULT_STORES:   _DefaultStoresParser,
+        MODEL_EFI_LIBRARY_INSTANCE:   _LibraryInstanceParser,
+        MODEL_EFI_LIBRARY_CLASS:   _LibraryClassParser,
+        MODEL_PCD_FIXED_AT_BUILD:   _PcdParser,
+        MODEL_PCD_PATCHABLE_IN_MODULE:   _PcdParser,
+        MODEL_PCD_FEATURE_FLAG:   _PcdParser,
+        MODEL_PCD_DYNAMIC_DEFAULT:   _PcdParser,
+        MODEL_PCD_DYNAMIC_HII:   _PcdParser,
+        MODEL_PCD_DYNAMIC_VPD:   _PcdParser,
+        MODEL_PCD_DYNAMIC_EX_DEFAULT:   _PcdParser,
+        MODEL_PCD_DYNAMIC_EX_HII:   _PcdParser,
+        MODEL_PCD_DYNAMIC_EX_VPD:   _PcdParser,
+        MODEL_META_DATA_COMPONENT:   _ComponentParser,
+        MODEL_META_DATA_BUILD_OPTION:   _BuildOptionParser,
+        MODEL_UNKNOWN:   MetaFileParser._Skip,
+        MODEL_META_DATA_PACKAGE:   MetaFileParser._PackageParser,
+        MODEL_META_DATA_USER_EXTENSION:   MetaFileParser._SkipUserExtension,
+        MODEL_META_DATA_SECTION_HEADER:   MetaFileParser._SectionHeaderParser,
+        MODEL_META_DATA_SUBSECTION_HEADER:   _SubsectionHeaderParser,
     }
 
-## DEC file parser class
+# DEC file parser class
 #
 #   @param      FilePath        The path of platform description file
 #   @param      FileType        The raw data of DSC file
 #   @param      Table           Database used to retrieve module/package information
 #   @param      Macros          Macros used for replacement in file
 #
+
+
 class DecParser(MetaFileParser):
     # DEC file supported data types (one type per section)
     DataType = {
-        TAB_DEC_DEFINES.upper()                     :   MODEL_META_DATA_HEADER,
-        TAB_DSC_DEFINES_DEFINE                      :   MODEL_META_DATA_DEFINE,
-        TAB_INCLUDES.upper()                        :   MODEL_EFI_INCLUDE,
-        TAB_LIBRARY_CLASSES.upper()                 :   MODEL_EFI_LIBRARY_CLASS,
-        TAB_GUIDS.upper()                           :   MODEL_EFI_GUID,
-        TAB_PPIS.upper()                            :   MODEL_EFI_PPI,
-        TAB_PROTOCOLS.upper()                       :   MODEL_EFI_PROTOCOL,
-        TAB_PCDS_FIXED_AT_BUILD_NULL.upper()        :   MODEL_PCD_FIXED_AT_BUILD,
-        TAB_PCDS_PATCHABLE_IN_MODULE_NULL.upper()   :   MODEL_PCD_PATCHABLE_IN_MODULE,
-        TAB_PCDS_FEATURE_FLAG_NULL.upper()          :   MODEL_PCD_FEATURE_FLAG,
-        TAB_PCDS_DYNAMIC_NULL.upper()               :   MODEL_PCD_DYNAMIC,
-        TAB_PCDS_DYNAMIC_EX_NULL.upper()            :   MODEL_PCD_DYNAMIC_EX,
-        TAB_USER_EXTENSIONS.upper()                 :   MODEL_META_DATA_USER_EXTENSION,
+        TAB_DEC_DEFINES.upper():   MODEL_META_DATA_HEADER,
+        TAB_DSC_DEFINES_DEFINE:   MODEL_META_DATA_DEFINE,
+        TAB_INCLUDES.upper():   MODEL_EFI_INCLUDE,
+        TAB_LIBRARY_CLASSES.upper():   MODEL_EFI_LIBRARY_CLASS,
+        TAB_GUIDS.upper():   MODEL_EFI_GUID,
+        TAB_PPIS.upper():   MODEL_EFI_PPI,
+        TAB_PROTOCOLS.upper():   MODEL_EFI_PROTOCOL,
+        TAB_PCDS_FIXED_AT_BUILD_NULL.upper():   MODEL_PCD_FIXED_AT_BUILD,
+        TAB_PCDS_PATCHABLE_IN_MODULE_NULL.upper():   MODEL_PCD_PATCHABLE_IN_MODULE,
+        TAB_PCDS_FEATURE_FLAG_NULL.upper():   MODEL_PCD_FEATURE_FLAG,
+        TAB_PCDS_DYNAMIC_NULL.upper():   MODEL_PCD_DYNAMIC,
+        TAB_PCDS_DYNAMIC_EX_NULL.upper():   MODEL_PCD_DYNAMIC_EX,
+        TAB_USER_EXTENSIONS.upper():   MODEL_META_DATA_USER_EXTENSION,
     }
 
-    ## Constructor of DecParser
+    # Constructor of DecParser
     #
     #  Initialize object of DecParser
     #
@@ -1786,7 +1864,7 @@ class DecParser(MetaFileParser):
         MetaFileParser.__init__(self, FilePath, FileType, Arch, Table, -1)
         self._Comments = []
         self._Version = 0x00010005  # Only EDK2 dec file is supported
-        self._AllPCDs = [] # Only for check duplicate PCD
+        self._AllPCDs = []  # Only for check duplicate PCD
         self._AllPcdDict = {}
 
         self._CurrentStructurePcdName = ""
@@ -1795,14 +1873,15 @@ class DecParser(MetaFileParser):
 
         self._RestofValue = ""
 
-    ## Parser starter
+    # Parser starter
     def Start(self):
         Content = ''
         try:
             with open(str(self.MetaFile), 'r') as File:
                 Content = File.readlines()
         except:
-            EdkLogger.error("Parser", FILE_READ_FAILURE, ExtraData=self.MetaFile)
+            EdkLogger.error("Parser", FILE_READ_FAILURE,
+                            ExtraData=self.MetaFile)
 
         Content = self.ProcessMultipleLineCODEValue(Content)
 
@@ -1829,7 +1908,7 @@ class DecParser(MetaFileParser):
             if self._SectionType == MODEL_UNKNOWN:
                 EdkLogger.error("Parser", FORMAT_INVALID,
                                 ""
-                                "Not able to determine \"%s\" in which section."%self._CurrentLine,
+                                "Not able to determine \"%s\" in which section." % self._CurrentLine,
                                 self.MetaFile, self._LineIndex + 1)
             elif len(self._SectionType) == 0:
                 self._Comments = []
@@ -1861,7 +1940,7 @@ class DecParser(MetaFileParser):
                     self._LineIndex + 1,
                     - 1,
                     0
-                    )
+                )
                 for Comment, LineNo in self._Comments:
                     self._Store(
                         MODEL_META_DATA_COMMENT,
@@ -1876,21 +1955,23 @@ class DecParser(MetaFileParser):
                         LineNo,
                         - 1,
                         0
-                        )
+                    )
             self._Comments = []
         if self._DefinesCount > 1:
-            EdkLogger.error('Parser', FORMAT_INVALID, 'Multiple [Defines] section is exist.', self.MetaFile )
+            EdkLogger.error('Parser', FORMAT_INVALID,
+                            'Multiple [Defines] section is exist.', self.MetaFile)
         if self._DefinesCount == 0:
-            EdkLogger.error('Parser', FORMAT_INVALID, 'No [Defines] section exist.', self.MetaFile)
+            EdkLogger.error('Parser', FORMAT_INVALID,
+                            'No [Defines] section exist.', self.MetaFile)
         self._Done()
 
-
-    ## Section header parser
+    # Section header parser
     #
     #   The section header is always in following format:
     #
     #       [section_name.arch<.platform|module_type>]
     #
+
     def _SectionHeaderParser(self):
         self._Scope = []
         self._SectionName = ''
@@ -1919,13 +2000,13 @@ class DecParser(MetaFileParser):
 
             if MODEL_PCD_FEATURE_FLAG in self._SectionType and len(self._SectionType) > 1:
                 EdkLogger.error(
-                            'Parser',
-                            FORMAT_INVALID,
-                            "%s must not be in the same section of other types of PCD" % TAB_PCDS_FEATURE_FLAG_NULL,
-                            File=self.MetaFile,
-                            Line=self._LineIndex + 1,
-                            ExtraData=self._CurrentLine
-                            )
+                    'Parser',
+                    FORMAT_INVALID,
+                    "%s must not be in the same section of other types of PCD" % TAB_PCDS_FEATURE_FLAG_NULL,
+                    File=self.MetaFile,
+                    Line=self._LineIndex + 1,
+                    ExtraData=self._CurrentLine
+                )
             # S1 is always Arch
             if len(ItemList) > 1:
                 S1 = ItemList[1].upper()
@@ -1956,33 +2037,36 @@ class DecParser(MetaFileParser):
             EdkLogger.error('Parser', FORMAT_INVALID, "Can't mix section tags without the Private attribute with section tags with the Private attribute",
                             File=self.MetaFile, Line=self._LineIndex + 1, ExtraData=self._CurrentLine)
 
-    ## [guids], [ppis] and [protocols] section parser
+    # [guids], [ppis] and [protocols] section parser
     @ParseMacro
     def _GuidParser(self):
         TokenList = GetSplitValueList(self._CurrentLine, TAB_EQUAL_SPLIT, 1)
         if len(TokenList) < 2:
             EdkLogger.error('Parser', FORMAT_INVALID, "No GUID name or value specified",
-                            ExtraData=self._CurrentLine + " (<CName> = <GuidValueInCFormat>)",
+                            ExtraData=self._CurrentLine +
+                            " (<CName> = <GuidValueInCFormat>)",
                             File=self.MetaFile, Line=self._LineIndex + 1)
         if TokenList[0] == '':
             EdkLogger.error('Parser', FORMAT_INVALID, "No GUID name specified",
-                            ExtraData=self._CurrentLine + " (<CName> = <GuidValueInCFormat>)",
+                            ExtraData=self._CurrentLine +
+                            " (<CName> = <GuidValueInCFormat>)",
                             File=self.MetaFile, Line=self._LineIndex + 1)
         if TokenList[1] == '':
             EdkLogger.error('Parser', FORMAT_INVALID, "No GUID value specified",
-                            ExtraData=self._CurrentLine + " (<CName> = <GuidValueInCFormat>)",
+                            ExtraData=self._CurrentLine +
+                            " (<CName> = <GuidValueInCFormat>)",
                             File=self.MetaFile, Line=self._LineIndex + 1)
         if TokenList[1][0] != '{' or TokenList[1][-1] != '}' or GuidStructureStringToGuidString(TokenList[1]) == '':
             EdkLogger.error('Parser', FORMAT_INVALID, "Invalid GUID value format",
-                            ExtraData=self._CurrentLine + \
-                                      " (<CName> = <GuidValueInCFormat:{8,4,4,{2,2,2,2,2,2,2,2}}>)",
+                            ExtraData=self._CurrentLine +
+                            " (<CName> = <GuidValueInCFormat:{8,4,4,{2,2,2,2,2,2,2,2}}>)",
                             File=self.MetaFile, Line=self._LineIndex + 1)
         self._ValueList[0] = TokenList[0]
         self._ValueList[1] = TokenList[1]
         if self._ValueList[0] not in self._GuidDict:
             self._GuidDict[self._ValueList[0]] = self._ValueList[1]
 
-    def ParsePcdName(self,namelist):
+    def ParsePcdName(self, namelist):
         if "[" in namelist[1]:
             pcdname = namelist[1][:namelist[1].index("[")]
             arrayindex = namelist[1][namelist[1].index("["):]
@@ -1990,10 +2074,10 @@ class DecParser(MetaFileParser):
             if len(namelist) == 2:
                 namelist.append(arrayindex)
             else:
-                namelist[2] = ".".join((arrayindex,namelist[2]))
+                namelist[2] = ".".join((arrayindex, namelist[2]))
         return namelist
 
-    ## PCD sections parser
+    # PCD sections parser
     #
     #   [PcdsFixedAtBuild]
     #   [PcdsPatchableInModule]
@@ -2020,10 +2104,12 @@ class DecParser(MetaFileParser):
                     return
 
                 if self._include_flag:
-                    self._ValueList[1] = "<HeaderFiles>_" + md5(self._CurrentLine.encode('utf-8')).hexdigest()
+                    self._ValueList[1] = "<HeaderFiles>_" + \
+                        md5(self._CurrentLine.encode('utf-8')).hexdigest()
                     self._ValueList[2] = self._CurrentLine
                 if self._package_flag and "}" != self._CurrentLine:
-                    self._ValueList[1] = "<Packages>_" + md5(self._CurrentLine.encode('utf-8')).hexdigest()
+                    self._ValueList[1] = "<Packages>_" + \
+                        md5(self._CurrentLine.encode('utf-8')).hexdigest()
                     self._ValueList[2] = self._CurrentLine
                 if self._CurrentLine == "}":
                     self._package_flag = False
@@ -2037,7 +2123,8 @@ class DecParser(MetaFileParser):
                     if PcdNames[1].strip().endswith("]"):
                         PcdName = PcdNames[1][:PcdNames[1].index('[')]
                         Index = PcdNames[1][PcdNames[1].index('['):]
-                        self._ValueList[0] = TAB_SPLIT.join((PcdNames[0],PcdName))
+                        self._ValueList[0] = TAB_SPLIT.join(
+                            (PcdNames[0], PcdName))
                         self._ValueList[1] = Index
                         self._ValueList[2] = PcdTockens[1]
                     else:
@@ -2045,22 +2132,24 @@ class DecParser(MetaFileParser):
                 else:
                     if self._CurrentStructurePcdName != TAB_SPLIT.join(PcdNames[:2]):
                         EdkLogger.error('Parser', FORMAT_INVALID, "Pcd Name does not match: %s and %s " % (self._CurrentStructurePcdName, TAB_SPLIT.join(PcdNames[:2])),
-                                File=self.MetaFile, Line=self._LineIndex + 1)
+                                        File=self.MetaFile, Line=self._LineIndex + 1)
                     self._ValueList[1] = TAB_SPLIT.join(PcdNames[2:])
                     self._ValueList[2] = PcdTockens[1]
         if not self._CurrentStructurePcdName:
             if self._PcdDataTypeCODE:
                 if ")}" in self._CurrentLine:
-                    ValuePart,RestofValue = self._CurrentLine.split(")}")
+                    ValuePart, RestofValue = self._CurrentLine.split(")}")
                     self._PcdCodeValue = self._PcdCodeValue + "\n " + ValuePart
-                    self._CurrentLine = "|".join((self._CurrentPcdName, self._PcdCodeValue,RestofValue))
+                    self._CurrentLine = "|".join(
+                        (self._CurrentPcdName, self._PcdCodeValue, RestofValue))
                     self._PcdDataTypeCODE = False
                     self._PcdCodeValue = ""
                 else:
                     self._PcdCodeValue = self._PcdCodeValue + "\n " + self._CurrentLine
                     self._ValueList = None
                     return
-            TokenList = GetSplitValueList(self._CurrentLine, TAB_VALUE_SPLIT, 1)
+            TokenList = GetSplitValueList(
+                self._CurrentLine, TAB_VALUE_SPLIT, 1)
             self._CurrentPcdName = TokenList[0]
             if len(TokenList) == 2 and TokenList[1].strip().startswith("{CODE"):
                 if ")}" in self._CurrentLine:
@@ -2077,29 +2166,28 @@ class DecParser(MetaFileParser):
             # check PCD information
             if self._ValueList[0] == '' or self._ValueList[1] == '':
                 EdkLogger.error('Parser', FORMAT_INVALID, "No token space GUID or PCD name specified",
-                                ExtraData=self._CurrentLine + \
-                                          " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
+                                ExtraData=self._CurrentLine +
+                                " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
                                 File=self.MetaFile, Line=self._LineIndex + 1)
             # check format of token space GUID CName
             if not ValueRe.match(self._ValueList[0]):
                 EdkLogger.error('Parser', FORMAT_INVALID, "The format of the token space GUID CName is invalid. The correct format is '(a-zA-Z_)[a-zA-Z0-9_]*'",
-                                ExtraData=self._CurrentLine + \
-                                          " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
+                                ExtraData=self._CurrentLine +
+                                " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
                                 File=self.MetaFile, Line=self._LineIndex + 1)
             # check format of PCD CName
             if not ValueRe.match(self._ValueList[1]):
                 EdkLogger.error('Parser', FORMAT_INVALID, "The format of the PCD CName is invalid. The correct format is '(a-zA-Z_)[a-zA-Z0-9_]*'",
-                                ExtraData=self._CurrentLine + \
-                                          " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
+                                ExtraData=self._CurrentLine +
+                                " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
                                 File=self.MetaFile, Line=self._LineIndex + 1)
             # check PCD datum information
             if len(TokenList) < 2 or TokenList[1] == '':
                 EdkLogger.error('Parser', FORMAT_INVALID, "No PCD Datum information given",
-                                ExtraData=self._CurrentLine + \
-                                          " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
+                                ExtraData=self._CurrentLine +
+                                " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
                                 File=self.MetaFile, Line=self._LineIndex + 1)
 
-
             ValueRe = re.compile(r'^\s*L?\".*\|.*\"')
             PtrValue = ValueRe.findall(TokenList[1])
 
@@ -2111,39 +2199,40 @@ class DecParser(MetaFileParser):
             else:
                 ValueList = AnalyzePcdExpression(TokenList[1])
 
-
             # check if there's enough datum information given
             if len(ValueList) != 3:
                 EdkLogger.error('Parser', FORMAT_INVALID, "Invalid PCD Datum information given",
-                                ExtraData=self._CurrentLine + \
-                                          " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
+                                ExtraData=self._CurrentLine +
+                                " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
                                 File=self.MetaFile, Line=self._LineIndex + 1)
             # check default value
             if ValueList[0] == '':
                 EdkLogger.error('Parser', FORMAT_INVALID, "Missing DefaultValue in PCD Datum information",
-                                ExtraData=self._CurrentLine + \
-                                          " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
+                                ExtraData=self._CurrentLine +
+                                " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
                                 File=self.MetaFile, Line=self._LineIndex + 1)
             # check datum type
             if ValueList[1] == '':
                 EdkLogger.error('Parser', FORMAT_INVALID, "Missing DatumType in PCD Datum information",
-                                ExtraData=self._CurrentLine + \
-                                          " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
+                                ExtraData=self._CurrentLine +
+                                " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
                                 File=self.MetaFile, Line=self._LineIndex + 1)
             # check token of the PCD
             if ValueList[2] == '':
                 EdkLogger.error('Parser', FORMAT_INVALID, "Missing Token in PCD Datum information",
-                                ExtraData=self._CurrentLine + \
-                                          " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
+                                ExtraData=self._CurrentLine +
+                                " (<TokenSpaceGuidCName>.<PcdCName>|<DefaultValue>|<DatumType>|<Token>)",
                                 File=self.MetaFile, Line=self._LineIndex + 1)
 
             PcdValue = ValueList[0]
             if PcdValue:
                 try:
                     self._GuidDict.update(self._AllPcdDict)
-                    ValueList[0] = ValueExpressionEx(ValueList[0], ValueList[1], self._GuidDict)(True)
+                    ValueList[0] = ValueExpressionEx(
+                        ValueList[0], ValueList[1], self._GuidDict)(True)
                 except BadExpression as Value:
-                    EdkLogger.error('Parser', FORMAT_INVALID, Value, ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
+                    EdkLogger.error('Parser', FORMAT_INVALID, Value, ExtraData=self._CurrentLine,
+                                    File=self.MetaFile, Line=self._LineIndex + 1)
             # check format of default value against the datum type
             IsValid, Cause = CheckPcdDatum(ValueList[1], ValueList[0])
             if not IsValid:
@@ -2151,7 +2240,8 @@ class DecParser(MetaFileParser):
                                 File=self.MetaFile, Line=self._LineIndex + 1)
 
             if Cause == "StructurePcd":
-                self._CurrentStructurePcdName = TAB_SPLIT.join(self._ValueList[0:2])
+                self._CurrentStructurePcdName = TAB_SPLIT.join(
+                    self._ValueList[0:2])
                 self._ValueList[0] = self._CurrentStructurePcdName
                 self._ValueList[1] = ValueList[1].strip()
 
@@ -2166,27 +2256,31 @@ class DecParser(MetaFileParser):
                                 "The same PCD name and GUID have been already defined",
                                 ExtraData=self._CurrentLine, File=self.MetaFile, Line=self._LineIndex + 1)
             else:
-                self._AllPCDs.append((self._Scope[0], self._ValueList[0], self._ValueList[1]))
-                self._AllPcdDict[TAB_SPLIT.join(self._ValueList[0:2])] = ValueList[0]
+                self._AllPCDs.append(
+                    (self._Scope[0], self._ValueList[0], self._ValueList[1]))
+                self._AllPcdDict[TAB_SPLIT.join(
+                    self._ValueList[0:2])] = ValueList[0]
 
-            self._ValueList[2] = ValueList[0].strip() + '|' + ValueList[1].strip() + '|' + ValueList[2].strip()
+            self._ValueList[2] = ValueList[0].strip(
+            ) + '|' + ValueList[1].strip() + '|' + ValueList[2].strip()
 
     _SectionParser = {
-        MODEL_META_DATA_HEADER          :   MetaFileParser._DefineParser,
-        MODEL_EFI_INCLUDE               :   MetaFileParser._PathParser,
-        MODEL_EFI_LIBRARY_CLASS         :   MetaFileParser._PathParser,
-        MODEL_EFI_GUID                  :   _GuidParser,
-        MODEL_EFI_PPI                   :   _GuidParser,
-        MODEL_EFI_PROTOCOL              :   _GuidParser,
-        MODEL_PCD_FIXED_AT_BUILD        :   _PcdParser,
-        MODEL_PCD_PATCHABLE_IN_MODULE   :   _PcdParser,
-        MODEL_PCD_FEATURE_FLAG          :   _PcdParser,
-        MODEL_PCD_DYNAMIC               :   _PcdParser,
-        MODEL_PCD_DYNAMIC_EX            :   _PcdParser,
-        MODEL_UNKNOWN                   :   MetaFileParser._Skip,
-        MODEL_META_DATA_USER_EXTENSION  :   MetaFileParser._SkipUserExtension,
+        MODEL_META_DATA_HEADER:   MetaFileParser._DefineParser,
+        MODEL_EFI_INCLUDE:   MetaFileParser._PathParser,
+        MODEL_EFI_LIBRARY_CLASS:   MetaFileParser._PathParser,
+        MODEL_EFI_GUID:   _GuidParser,
+        MODEL_EFI_PPI:   _GuidParser,
+        MODEL_EFI_PROTOCOL:   _GuidParser,
+        MODEL_PCD_FIXED_AT_BUILD:   _PcdParser,
+        MODEL_PCD_PATCHABLE_IN_MODULE:   _PcdParser,
+        MODEL_PCD_FEATURE_FLAG:   _PcdParser,
+        MODEL_PCD_DYNAMIC:   _PcdParser,
+        MODEL_PCD_DYNAMIC_EX:   _PcdParser,
+        MODEL_UNKNOWN:   MetaFileParser._Skip,
+        MODEL_META_DATA_USER_EXTENSION:   MetaFileParser._SkipUserExtension,
     }
 
+
 ##
 #
 # This acts like the main() function for the script, unless it is 'import'ed into another
@@ -2194,4 +2288,3 @@ class DecParser(MetaFileParser):
 #
 if __name__ == '__main__':
     pass
-
diff --git a/BaseTools/Source/Python/Workspace/MetaFileTable.py b/BaseTools/Source/Python/Workspace/MetaFileTable.py
index bebf9062e8e5..307a709fb57d 100644
--- a/BaseTools/Source/Python/Workspace/MetaFileTable.py
+++ b/BaseTools/Source/Python/Workspace/MetaFileTable.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create/update/query/erase a meta file table
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -15,15 +15,16 @@ import Common.EdkLogger as EdkLogger
 from Common.BuildToolError import FORMAT_INVALID
 
 from CommonDataClass.DataClass import MODEL_FILE_DSC, MODEL_FILE_DEC, MODEL_FILE_INF, \
-                                      MODEL_FILE_OTHERS
+    MODEL_FILE_OTHERS
 from Common.DataType import *
 
+
 class MetaFileTable():
     # TRICK: use file ID as the part before '.'
     _ID_STEP_ = 1
     _ID_MAX_ = 99999999
 
-    ## Constructor
+    # Constructor
     def __init__(self, DB, MetaFile, FileType, Temporary, FromItem=None):
         self.MetaFile = MetaFile
         self.TableName = ""
@@ -32,16 +33,17 @@ class MetaFileTable():
 
         self.CurrentContent = []
         DB.TblFile.append([MetaFile.Name,
-                        MetaFile.Ext,
-                        MetaFile.Dir,
-                        MetaFile.Path,
-                        FileType,
-                        MetaFile.TimeStamp,
-                        FromItem])
+                           MetaFile.Ext,
+                           MetaFile.Dir,
+                           MetaFile.Path,
+                           FileType,
+                           MetaFile.TimeStamp,
+                           FromItem])
         self.FileId = len(DB.TblFile)
         self.ID = self.FileId * 10**8
         if Temporary:
-            self.TableName = "_%s_%s_%s" % (FileType, len(DB.TblFile), uuid.uuid4().hex)
+            self.TableName = "_%s_%s_%s" % (
+                FileType, len(DB.TblFile), uuid.uuid4().hex)
         else:
             self.TableName = "_%s_%s" % (FileType, len(DB.TblFile))
 
@@ -62,9 +64,11 @@ class MetaFileTable():
         self.CurrentContent.append(self._DUMMY_)
 
     def GetAll(self):
-        return [item for item in self.CurrentContent if item[0] >= 0 and item[-1]>=0]
+        return [item for item in self.CurrentContent if item[0] >= 0 and item[-1] >= 0]
+
+# Python class representation of table storing module data
+
 
-## Python class representation of table storing module data
 class ModuleTable(MetaFileTable):
     _COLUMN_ = '''
         ID REAL PRIMARY KEY,
@@ -82,13 +86,14 @@ class ModuleTable(MetaFileTable):
         Enabled INTEGER DEFAULT 0
         '''
     # used as table end flag, in case the changes to database is not committed to db file
-    _DUMMY_ = [-1, -1, '====', '====', '====', '====', '====', -1, -1, -1, -1, -1, -1]
+    _DUMMY_ = [-1, -1, '====', '====', '====',
+               '====', '====', -1, -1, -1, -1, -1, -1]
 
-    ## Constructor
+    # Constructor
     def __init__(self, Db, MetaFile, Temporary):
         MetaFileTable.__init__(self, Db, MetaFile, MODEL_FILE_INF, Temporary)
 
-    ## Insert a record into table Inf
+    # Insert a record into table Inf
     #
     # @param Model:          Model of a Inf item
     # @param Value1:         Value1 of a Inf item
@@ -106,29 +111,30 @@ class ModuleTable(MetaFileTable):
     def Insert(self, Model, Value1, Value2, Value3, Scope1=TAB_ARCH_COMMON, Scope2=TAB_COMMON,
                BelongsToItem=-1, StartLine=-1, StartColumn=-1, EndLine=-1, EndColumn=-1, Enabled=0):
 
-        (Value1, Value2, Value3, Scope1, Scope2) = (Value1.strip(), Value2.strip(), Value3.strip(), Scope1.strip(), Scope2.strip())
+        (Value1, Value2, Value3, Scope1, Scope2) = (Value1.strip(),
+                                                    Value2.strip(), Value3.strip(), Scope1.strip(), Scope2.strip())
         self.ID = self.ID + self._ID_STEP_
         if self.ID >= (MODEL_FILE_INF + self._ID_MAX_):
             self.ID = MODEL_FILE_INF + self._ID_STEP_
 
-        row = [ self.ID,
-                Model,
-                Value1,
-                Value2,
-                Value3,
-                Scope1,
-                Scope2,
-                BelongsToItem,
-                StartLine,
-                StartColumn,
-                EndLine,
-                EndColumn,
-                Enabled
-            ]
+        row = [self.ID,
+               Model,
+               Value1,
+               Value2,
+               Value3,
+               Scope1,
+               Scope2,
+               BelongsToItem,
+               StartLine,
+               StartColumn,
+               EndLine,
+               EndColumn,
+               Enabled
+               ]
         self.CurrentContent.append(row)
         return self.ID
 
-    ## Query table
+    # Query table
     #
     # @param    Model:      The Model of Record
     # @param    Arch:       The Arch attribute of Record
@@ -139,7 +145,8 @@ class ModuleTable(MetaFileTable):
     def Query(self, Model, Arch=None, Platform=None, BelongsToItem=None):
 
         QueryTab = self.CurrentContent
-        result = [item for item in QueryTab if item[1] == Model and item[-1]>=0 ]
+        result = [item for item in QueryTab if item[1]
+                  == Model and item[-1] >= 0]
 
         if Arch is not None and Arch != TAB_ARCH_COMMON:
             ArchList = set(['COMMON'])
@@ -147,17 +154,19 @@ class ModuleTable(MetaFileTable):
             result = [item for item in result if item[5] in ArchList]
 
         if Platform is not None and Platform != TAB_COMMON:
-            Platformlist = set( ['COMMON','DEFAULT'])
+            Platformlist = set(['COMMON', 'DEFAULT'])
             Platformlist.add(Platform)
             result = [item for item in result if item[6] in Platformlist]
 
         if BelongsToItem is not None:
             result = [item for item in result if item[7] == BelongsToItem]
 
-        result = [ [r[2],r[3],r[4],r[5],r[6],r[0],r[8]] for r in result ]
+        result = [[r[2], r[3], r[4], r[5], r[6], r[0], r[8]] for r in result]
         return result
 
-## Python class representation of table storing package data
+# Python class representation of table storing package data
+
+
 class PackageTable(MetaFileTable):
     _COLUMN_ = '''
         ID REAL PRIMARY KEY,
@@ -175,13 +184,15 @@ class PackageTable(MetaFileTable):
         Enabled INTEGER DEFAULT 0
         '''
     # used as table end flag, in case the changes to database is not committed to db file
-    _DUMMY_ = [-1, -1, '====', '====', '====', '====', '====', -1, -1, -1, -1, -1, -1]
+    _DUMMY_ = [-1, -1, '====', '====', '====',
+               '====', '====', -1, -1, -1, -1, -1, -1]
 
-    ## Constructor
+    # Constructor
     def __init__(self, Cursor, MetaFile, Temporary):
-        MetaFileTable.__init__(self, Cursor, MetaFile, MODEL_FILE_DEC, Temporary)
+        MetaFileTable.__init__(self, Cursor, MetaFile,
+                               MODEL_FILE_DEC, Temporary)
 
-    ## Insert table
+    # Insert table
     #
     # Insert a record into table Dec
     #
@@ -200,27 +211,28 @@ class PackageTable(MetaFileTable):
     #
     def Insert(self, Model, Value1, Value2, Value3, Scope1=TAB_ARCH_COMMON, Scope2=TAB_COMMON,
                BelongsToItem=-1, StartLine=-1, StartColumn=-1, EndLine=-1, EndColumn=-1, Enabled=0):
-        (Value1, Value2, Value3, Scope1, Scope2) = (Value1.strip(), Value2.strip(), Value3.strip(), Scope1.strip(), Scope2.strip())
+        (Value1, Value2, Value3, Scope1, Scope2) = (Value1.strip(),
+                                                    Value2.strip(), Value3.strip(), Scope1.strip(), Scope2.strip())
         self.ID = self.ID + self._ID_STEP_
 
-        row = [ self.ID,
-                Model,
-                Value1,
-                Value2,
-                Value3,
-                Scope1,
-                Scope2,
-                BelongsToItem,
-                StartLine,
-                StartColumn,
-                EndLine,
-                EndColumn,
-                Enabled
-            ]
+        row = [self.ID,
+               Model,
+               Value1,
+               Value2,
+               Value3,
+               Scope1,
+               Scope2,
+               BelongsToItem,
+               StartLine,
+               StartColumn,
+               EndLine,
+               EndColumn,
+               Enabled
+               ]
         self.CurrentContent.append(row)
         return self.ID
 
-    ## Query table
+    # Query table
     #
     # @param    Model:  The Model of Record
     # @param    Arch:   The Arch attribute of Record
@@ -230,7 +242,8 @@ class PackageTable(MetaFileTable):
     def Query(self, Model, Arch=None):
 
         QueryTab = self.CurrentContent
-        result = [item for item in QueryTab if item[1] == Model and item[-1]>=0 ]
+        result = [item for item in QueryTab if item[1]
+                  == Model and item[-1] >= 0]
 
         if Arch is not None and Arch != TAB_ARCH_COMMON:
             ArchList = set(['COMMON'])
@@ -242,7 +255,8 @@ class PackageTable(MetaFileTable):
     def GetValidExpression(self, TokenSpaceGuid, PcdCName):
 
         QueryTab = self.CurrentContent
-        result = [[item[2], item[8]] for item in QueryTab if item[3] == TokenSpaceGuid and item[4] == PcdCName]
+        result = [[item[2], item[8]] for item in QueryTab if item[3]
+                  == TokenSpaceGuid and item[4] == PcdCName]
         validateranges = []
         validlists = []
         expressions = []
@@ -276,7 +290,9 @@ class PackageTable(MetaFileTable):
             return set(), set(), set()
         return set(validateranges), set(validlists), set(expressions)
 
-## Python class representation of table storing platform data
+# Python class representation of table storing platform data
+
+
 class PlatformTable(MetaFileTable):
     _COLUMN_ = '''
         ID REAL PRIMARY KEY,
@@ -296,13 +312,15 @@ class PlatformTable(MetaFileTable):
         Enabled INTEGER DEFAULT 0
         '''
     # used as table end flag, in case the changes to database is not committed to db file
-    _DUMMY_ = [-1, -1, '====', '====', '====', '====', '====','====', -1, -1, -1, -1, -1, -1, -1]
+    _DUMMY_ = [-1, -1, '====', '====', '====', '====',
+               '====', '====', -1, -1, -1, -1, -1, -1, -1]
 
-    ## Constructor
+    # Constructor
     def __init__(self, Cursor, MetaFile, Temporary, FromItem=0):
-        MetaFileTable.__init__(self, Cursor, MetaFile, MODEL_FILE_DSC, Temporary, FromItem)
+        MetaFileTable.__init__(self, Cursor, MetaFile,
+                               MODEL_FILE_DSC, Temporary, FromItem)
 
-    ## Insert table
+    # Insert table
     #
     # Insert a record into table Dsc
     #
@@ -320,32 +338,32 @@ class PlatformTable(MetaFileTable):
     # @param EndColumn:      EndColumn of a Dsc item
     # @param Enabled:        If this item enabled
     #
-    def Insert(self, Model, Value1, Value2, Value3, Scope1=TAB_ARCH_COMMON, Scope2=TAB_COMMON, Scope3=TAB_DEFAULT_STORES_DEFAULT,BelongsToItem=-1,
+    def Insert(self, Model, Value1, Value2, Value3, Scope1=TAB_ARCH_COMMON, Scope2=TAB_COMMON, Scope3=TAB_DEFAULT_STORES_DEFAULT, BelongsToItem=-1,
                FromItem=-1, StartLine=-1, StartColumn=-1, EndLine=-1, EndColumn=-1, Enabled=1):
-        (Value1, Value2, Value3, Scope1, Scope2, Scope3) = (Value1.strip(), Value2.strip(), Value3.strip(), Scope1.strip(), Scope2.strip(), Scope3.strip())
+        (Value1, Value2, Value3, Scope1, Scope2, Scope3) = (Value1.strip(), Value2.strip(
+        ), Value3.strip(), Scope1.strip(), Scope2.strip(), Scope3.strip())
         self.ID = self.ID + self._ID_STEP_
 
-        row = [ self.ID,
-                Model,
-                Value1,
-                Value2,
-                Value3,
-                Scope1,
-                Scope2,
-                Scope3,
-                BelongsToItem,
-                FromItem,
-                StartLine,
-                StartColumn,
-                EndLine,
-                EndColumn,
-                Enabled
-            ]
+        row = [self.ID,
+               Model,
+               Value1,
+               Value2,
+               Value3,
+               Scope1,
+               Scope2,
+               Scope3,
+               BelongsToItem,
+               FromItem,
+               StartLine,
+               StartColumn,
+               EndLine,
+               EndColumn,
+               Enabled
+               ]
         self.CurrentContent.append(row)
         return self.ID
 
-
-    ## Query table
+    # Query table
     #
     # @param Model:          The Model of Record
     # @param Scope1:         Arch of a Dsc item
@@ -355,15 +373,17 @@ class PlatformTable(MetaFileTable):
     #
     # @retval:       A recordSet of all found records
     #
+
     def Query(self, Model, Scope1=None, Scope2=None, BelongsToItem=None, FromItem=None):
 
         QueryTab = self.CurrentContent
-        result = [item for item in QueryTab if item[1] == Model and item[-1]>0 ]
+        result = [item for item in QueryTab if item[1]
+                  == Model and item[-1] > 0]
         if Scope1 is not None and Scope1 != TAB_ARCH_COMMON:
             Sc1 = set(['COMMON'])
             Sc1.add(Scope1)
             result = [item for item in result if item[5] in Sc1]
-        Sc2 = set( ['COMMON','DEFAULT'])
+        Sc2 = set(['COMMON', 'DEFAULT'])
         if Scope2 and Scope2 != TAB_COMMON:
             if '.' in Scope2:
                 Index = Scope2.index('.')
@@ -379,33 +399,37 @@ class PlatformTable(MetaFileTable):
         if FromItem is not None:
             result = [item for item in result if item[9] == FromItem]
 
-        result = [ [r[2],r[3],r[4],r[5],r[6],r[7],r[0],r[10]] for r in result ]
+        result = [[r[2], r[3], r[4], r[5], r[6], r[7], r[0], r[10]]
+                  for r in result]
         return result
 
-    def DisableComponent(self,comp_id):
+    def DisableComponent(self, comp_id):
         for item in self.CurrentContent:
             if item[0] == comp_id or item[8] == comp_id:
                 item[-1] = -1
 
-## Factory class to produce different storage for different type of meta-file
+# Factory class to produce different storage for different type of meta-file
+
+
 class MetaFileStorage(object):
     _FILE_TABLE_ = {
-        MODEL_FILE_INF      :   ModuleTable,
-        MODEL_FILE_DEC      :   PackageTable,
-        MODEL_FILE_DSC      :   PlatformTable,
-        MODEL_FILE_OTHERS   :   MetaFileTable,
+        MODEL_FILE_INF:   ModuleTable,
+        MODEL_FILE_DEC:   PackageTable,
+        MODEL_FILE_DSC:   PlatformTable,
+        MODEL_FILE_OTHERS:   MetaFileTable,
     }
 
     _FILE_TYPE_ = {
-        ".inf"  : MODEL_FILE_INF,
-        ".dec"  : MODEL_FILE_DEC,
-        ".dsc"  : MODEL_FILE_DSC,
+        ".inf": MODEL_FILE_INF,
+        ".dec": MODEL_FILE_DEC,
+        ".dsc": MODEL_FILE_DSC,
     }
     _ObjectCache = {}
-    ## Constructor
+    # Constructor
+
     def __new__(Class, Cursor, MetaFile, FileType=None, Temporary=False, FromItem=None):
         # no type given, try to find one
-        key = (MetaFile.Path, FileType,Temporary,FromItem)
+        key = (MetaFile.Path, FileType, Temporary, FromItem)
         if key in Class._ObjectCache:
             return Class._ObjectCache[key]
         if not FileType:
@@ -427,4 +451,3 @@ class MetaFileStorage(object):
         if not Temporary:
             Class._ObjectCache[key] = reval
         return reval
-
diff --git a/BaseTools/Source/Python/Workspace/WorkspaceCommon.py b/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
index 9e506fc646b1..c14bf28d812e 100644
--- a/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
+++ b/BaseTools/Source/Python/Workspace/WorkspaceCommon.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Common routines used by workspace
 #
 # Copyright (c) 2012 - 2020, Intel Corporation. All rights reserved.<BR>
@@ -17,6 +17,7 @@ from Common.BuildToolError import OPTION_MISSING
 from Common.BuildToolError import BUILD_ERROR
 import Common.EdkLogger as EdkLogger
 
+
 class OrderedListDict(OrderedDict):
     def __init__(self, *args, **kwargs):
         super(OrderedListDict, self).__init__(*args, **kwargs)
@@ -26,7 +27,7 @@ class OrderedListDict(OrderedDict):
         self[key] = Value = self.default_factory()
         return Value
 
-## Get all packages from platform for specified arch, target and toolchain
+# Get all packages from platform for specified arch, target and toolchain
 #
 #  @param Platform: DscBuildData instance
 #  @param BuildDatabase: The database saves all data for all metafiles
@@ -35,6 +36,8 @@ class OrderedListDict(OrderedDict):
 #  @param Toolchain: Current toolchain
 #  @retval: List of packages which are DecBuildData instances
 #
+
+
 def GetPackageList(Platform, BuildDatabase, Arch, Target, Toolchain):
     PkgSet = set()
     if Platform.Packages:
@@ -46,7 +49,7 @@ def GetPackageList(Platform, BuildDatabase, Arch, Target, Toolchain):
             PkgSet.update(Lib.Packages)
     return list(PkgSet)
 
-## Get all declared PCD from platform for specified arch, target and toolchain
+# Get all declared PCD from platform for specified arch, target and toolchain
 #
 #  @param Platform: DscBuildData instance
 #  @param BuildDatabase: The database saves all data for all metafiles
@@ -56,6 +59,8 @@ def GetPackageList(Platform, BuildDatabase, Arch, Target, Toolchain):
 #  @retval: A dictionary contains instances of PcdClassObject with key (PcdCName, TokenSpaceGuid)
 #  @retval: A dictionary contains real GUIDs of TokenSpaceGuid
 #
+
+
 def GetDeclaredPcd(Platform, BuildDatabase, Arch, Target, Toolchain, additionalPkgs):
     PkgList = GetPackageList(Platform, BuildDatabase, Arch, Target, Toolchain)
     PkgList = set(PkgList)
@@ -77,7 +82,7 @@ def GetDeclaredPcd(Platform, BuildDatabase, Arch, Target, Toolchain, additionalP
                 DecPcds[PcdCName, PcdTokenName] = Pkg.Pcds[Pcd]
     return DecPcds, GuidDict
 
-## Get all dependent libraries for a module
+# Get all dependent libraries for a module
 #
 #  @param Module: InfBuildData instance
 #  @param Platform: DscBuildData instance
@@ -87,10 +92,13 @@ def GetDeclaredPcd(Platform, BuildDatabase, Arch, Target, Toolchain, additionalP
 #  @param Toolchain: Current toolchain
 #  @retval: List of dependent libraries which are InfBuildData instances
 #
+
+
 def GetLiabraryInstances(Module, Platform, BuildDatabase, Arch, Target, Toolchain):
-    return GetModuleLibInstances(Module, Platform, BuildDatabase, Arch, Target, Toolchain,Platform.MetaFile,EdkLogger)
+    return GetModuleLibInstances(Module, Platform, BuildDatabase, Arch, Target, Toolchain, Platform.MetaFile, EdkLogger)
 
-def GetModuleLibInstances(Module, Platform, BuildDatabase, Arch, Target, Toolchain, FileName = '', EdkLogger = None):
+
+def GetModuleLibInstances(Module, Platform, BuildDatabase, Arch, Target, Toolchain, FileName='', EdkLogger=None):
     if Module.LibInstances:
         return Module.LibInstances
     ModuleType = Module.ModuleType
@@ -103,12 +111,14 @@ def GetModuleLibInstances(Module, Platform, BuildDatabase, Arch, Target, Toolcha
     if Module.ModuleType != SUP_MODULE_USER_DEFINED:
         for LibraryClass in Platform.LibraryClasses.GetKeys():
             if LibraryClass.startswith("NULL") and Platform.LibraryClasses[LibraryClass, Module.ModuleType]:
-                Module.LibraryClasses[LibraryClass] = Platform.LibraryClasses[LibraryClass, Module.ModuleType]
+                Module.LibraryClasses[LibraryClass] = Platform.LibraryClasses[LibraryClass,
+                                                                              Module.ModuleType]
 
     # add forced library instances (specified in module overrides)
     for LibraryClass in Platform.Modules[str(Module)].LibraryClasses:
         if LibraryClass.startswith("NULL"):
-            Module.LibraryClasses[LibraryClass] = Platform.Modules[str(Module)].LibraryClasses[LibraryClass]
+            Module.LibraryClasses[LibraryClass] = Platform.Modules[str(
+                Module)].LibraryClasses[LibraryClass]
 
     # EdkII module
     LibraryConsumerList = [Module]
@@ -118,14 +128,16 @@ def GetModuleLibInstances(Module, Platform, BuildDatabase, Arch, Target, Toolcha
 
     if not Module.LibraryClass:
         EdkLogger.verbose("")
-        EdkLogger.verbose("Library instances of module [%s] [%s]:" % (str(Module), Arch))
+        EdkLogger.verbose(
+            "Library instances of module [%s] [%s]:" % (str(Module), Arch))
 
     while len(LibraryConsumerList) > 0:
         M = LibraryConsumerList.pop()
         for LibraryClassName in M.LibraryClasses:
             if LibraryClassName not in LibraryInstance:
                 # override library instance for this module
-                LibraryPath = Platform.Modules[str(Module)].LibraryClasses.get(LibraryClassName,Platform.LibraryClasses[LibraryClassName, ModuleType])
+                LibraryPath = Platform.Modules[str(Module)].LibraryClasses.get(
+                    LibraryClassName, Platform.LibraryClasses[LibraryClassName, ModuleType])
                 if LibraryPath is None:
                     LibraryPath = M.LibraryClasses.get(LibraryClassName)
                     if LibraryPath is None:
@@ -137,20 +149,22 @@ def GetModuleLibInstances(Module, Platform, BuildDatabase, Arch, Target, Toolcha
                         else:
                             return []
 
-                LibraryModule = BuildDatabase[LibraryPath, Arch, Target, Toolchain]
+                LibraryModule = BuildDatabase[LibraryPath,
+                                              Arch, Target, Toolchain]
                 # for those forced library instance (NULL library), add a fake library class
                 if LibraryClassName.startswith("NULL"):
-                    LibraryModule.LibraryClass.append(LibraryClassObject(LibraryClassName, [ModuleType]))
+                    LibraryModule.LibraryClass.append(
+                        LibraryClassObject(LibraryClassName, [ModuleType]))
                 elif LibraryModule.LibraryClass is None \
-                     or len(LibraryModule.LibraryClass) == 0 \
-                     or (ModuleType != SUP_MODULE_USER_DEFINED and ModuleType != SUP_MODULE_HOST_APPLICATION
-                         and ModuleType not in LibraryModule.LibraryClass[0].SupModList):
+                    or len(LibraryModule.LibraryClass) == 0 \
+                    or (ModuleType != SUP_MODULE_USER_DEFINED and ModuleType != SUP_MODULE_HOST_APPLICATION
+                        and ModuleType not in LibraryModule.LibraryClass[0].SupModList):
                     # only USER_DEFINED can link against any library instance despite of its SupModList
                     if not Module.LibraryClass:
                         EdkLogger.error("build", OPTION_MISSING,
-                                        "Module type [%s] is not supported by library instance [%s]" \
+                                        "Module type [%s] is not supported by library instance [%s]"
                                         % (ModuleType, LibraryPath), File=FileName,
-                                        ExtraData="consumed by library instance [%s] which is consumed by module [%s]" \
+                                        ExtraData="consumed by library instance [%s] which is consumed by module [%s]"
                                         % (str(M), str(Module))
                                         )
                     else:
@@ -159,7 +173,8 @@ def GetModuleLibInstances(Module, Platform, BuildDatabase, Arch, Target, Toolcha
                 LibraryInstance[LibraryClassName] = LibraryModule
                 LibraryConsumerList.append(LibraryModule)
                 if not Module.LibraryClass:
-                    EdkLogger.verbose("\t" + str(LibraryClassName) + " : " + str(LibraryModule))
+                    EdkLogger.verbose(
+                        "\t" + str(LibraryClassName) + " : " + str(LibraryModule))
             else:
                 LibraryModule = LibraryInstance[LibraryClassName]
 
@@ -181,7 +196,7 @@ def GetModuleLibInstances(Module, Platform, BuildDatabase, Arch, Target, Toolcha
     #
     # Q <- Set of all nodes with no incoming edges
     #
-    LibraryList = [] #LibraryInstance.values()
+    LibraryList = []  # LibraryInstance.values()
     Q = []
     for LibraryClassName in LibraryInstance:
         M = LibraryInstance[LibraryClassName]
@@ -240,7 +255,9 @@ def GetModuleLibInstances(Module, Platform, BuildDatabase, Arch, Target, Toolcha
     for Item in LibraryList:
         if ConsumedByList[Item] and Item in Constructor and len(Constructor) > 1:
             if not Module.LibraryClass:
-                ErrorMessage = "\tconsumed by " + "\n\tconsumed by ".join(str(L) for L in ConsumedByList[Item])
+                ErrorMessage = "\tconsumed by " + \
+                    "\n\tconsumed by ".join(str(L)
+                                            for L in ConsumedByList[Item])
                 EdkLogger.error("build", BUILD_ERROR, 'Library [%s] with constructors has a cycle' % str(Item),
                                 ExtraData=ErrorMessage, File=FileName)
             else:
@@ -254,5 +271,6 @@ def GetModuleLibInstances(Module, Platform, BuildDatabase, Arch, Target, Toolcha
     #
     SortedLibraryList.reverse()
     Module.LibInstances = SortedLibraryList
-    SortedLibraryList = [lib.SetReferenceModule(Module) for lib in SortedLibraryList]
+    SortedLibraryList = [lib.SetReferenceModule(
+        Module) for lib in SortedLibraryList]
     return SortedLibraryList
diff --git a/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py b/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py
index d955c78b258f..09e235b23bc5 100644
--- a/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py
+++ b/BaseTools/Source/Python/Workspace/WorkspaceDatabase.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # This file is used to create a database used by build tool
 #
 # Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -23,7 +23,7 @@ from Workspace.DecBuildData import DecBuildData
 from Workspace.DscBuildData import DscBuildData
 from Workspace.InfBuildData import InfBuildData
 
-## Database
+# Database
 #
 #   This class defined the build database for all modules, packages and platform.
 # It will call corresponding parser for the given file if it cannot find it in
@@ -33,6 +33,8 @@ from Workspace.InfBuildData import InfBuildData
 # @param GlobalMacros       Global macros used for replacement during file parsing
 # @param RenewDb=False      Create new database file if it's already there
 #
+
+
 class WorkspaceDatabase(object):
 
     #
@@ -42,26 +44,27 @@ class WorkspaceDatabase(object):
     class BuildObjectFactory(object):
 
         _FILE_TYPE_ = {
-            ".inf"  : MODEL_FILE_INF,
-            ".dec"  : MODEL_FILE_DEC,
-            ".dsc"  : MODEL_FILE_DSC,
+            ".inf": MODEL_FILE_INF,
+            ".dec": MODEL_FILE_DEC,
+            ".dsc": MODEL_FILE_DSC,
         }
 
         # file parser
         _FILE_PARSER_ = {
-            MODEL_FILE_INF  :   InfParser,
-            MODEL_FILE_DEC  :   DecParser,
-            MODEL_FILE_DSC  :   DscParser,
+            MODEL_FILE_INF:   InfParser,
+            MODEL_FILE_DEC:   DecParser,
+            MODEL_FILE_DSC:   DscParser,
         }
 
         # convert to xxxBuildData object
         _GENERATOR_ = {
-            MODEL_FILE_INF  :   InfBuildData,
-            MODEL_FILE_DEC  :   DecBuildData,
-            MODEL_FILE_DSC  :   DscBuildData,
+            MODEL_FILE_INF:   InfBuildData,
+            MODEL_FILE_DEC:   DecBuildData,
+            MODEL_FILE_DSC:   DscBuildData,
         }
 
         _CACHE_ = {}    # (FilePath, Arch)  : <object>
+
         def GetCache(self):
             return self._CACHE_
 
@@ -101,10 +104,12 @@ class WorkspaceDatabase(object):
                 return self._CACHE_[Key]
 
             # check file type
-            BuildObject = self.CreateBuildObject(FilePath, Arch, Target, Toolchain)
+            BuildObject = self.CreateBuildObject(
+                FilePath, Arch, Target, Toolchain)
             self._CACHE_[Key] = BuildObject
             return BuildObject
-        def CreateBuildObject(self,FilePath, Arch, Target, Toolchain):
+
+        def CreateBuildObject(self, FilePath, Arch, Target, Toolchain):
             Ext = FilePath.Type
             if Ext not in self._FILE_TYPE_:
                 return None
@@ -114,22 +119,22 @@ class WorkspaceDatabase(object):
 
             # get the parser ready for this file
             MetaFile = self._FILE_PARSER_[FileType](
-                                FilePath,
-                                FileType,
-                                Arch,
-                                MetaFileStorage(self.WorkspaceDb, FilePath, FileType)
-                                )
+                FilePath,
+                FileType,
+                Arch,
+                MetaFileStorage(self.WorkspaceDb, FilePath, FileType)
+            )
             # always do post-process, in case of macros change
             MetaFile.DoPostProcess()
             # object the build is based on
             BuildObject = self._GENERATOR_[FileType](
-                                    FilePath,
-                                    MetaFile,
-                                    self,
-                                    Arch,
-                                    Target,
-                                    Toolchain
-                                    )
+                FilePath,
+                MetaFile,
+                self,
+                Arch,
+                Target,
+                Toolchain
+            )
             return BuildObject
 
     # placeholder for file format conversion
@@ -141,7 +146,7 @@ class WorkspaceDatabase(object):
         def __getitem__(self, Key):
             pass
 
-    ## Constructor of WorkspaceDatabase
+    # Constructor of WorkspaceDatabase
     #
     # @param DbPath             Path of database file
     # @param GlobalMacros       Global macros used for replacement during file parsing
@@ -158,8 +163,8 @@ class WorkspaceDatabase(object):
         self.BuildObject = WorkspaceDatabase.BuildObjectFactory(self)
         self.TransformObject = WorkspaceDatabase.TransformObjectFactory(self)
 
+    # Summarize all packages in the database
 
-    ## Summarize all packages in the database
     def GetPackageList(self, Platform, Arch, TargetName, ToolChainTag):
         self.Platform = Platform
         PackageList = []
@@ -168,7 +173,8 @@ class WorkspaceDatabase(object):
         # Get Package related to Modules
         #
         for Module in Pa.Modules:
-            ModuleObj = self.BuildObject[Module, Arch, TargetName, ToolChainTag]
+            ModuleObj = self.BuildObject[Module,
+                                         Arch, TargetName, ToolChainTag]
             for Package in ModuleObj.Packages:
                 if Package not in PackageList:
                     PackageList.append(Package)
@@ -190,9 +196,11 @@ class WorkspaceDatabase(object):
     def MapPlatform(self, Dscfile):
         Platform = self.BuildObject[PathClass(Dscfile), TAB_COMMON]
         if Platform is None:
-            EdkLogger.error('build', PARSER_ERROR, "Failed to parser DSC file: %s" % Dscfile)
+            EdkLogger.error('build', PARSER_ERROR,
+                            "Failed to parser DSC file: %s" % Dscfile)
         return Platform
 
+
 BuildDB = WorkspaceDatabase()
 ##
 #
@@ -201,4 +209,3 @@ BuildDB = WorkspaceDatabase()
 #
 if __name__ == '__main__':
     pass
-
diff --git a/BaseTools/Source/Python/Workspace/__init__.py b/BaseTools/Source/Python/Workspace/__init__.py
index 85ae9937c43f..400adb76a0aa 100644
--- a/BaseTools/Source/Python/Workspace/__init__.py
+++ b/BaseTools/Source/Python/Workspace/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'Workspace' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/build/BuildReport.py b/BaseTools/Source/Python/build/BuildReport.py
index 468772930ca1..673ef7528150 100644
--- a/BaseTools/Source/Python/build/BuildReport.py
+++ b/BaseTools/Source/Python/build/BuildReport.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Routines for generating build report.
 #
 # This module contains the functionality to generate build report after
@@ -8,7 +8,7 @@
 # SPDX-License-Identifier: BSD-2-Clause-Patent
 #
 
-## Import Modules
+# Import Modules
 #
 import Common.LongFilePathOs as os
 import re
@@ -42,86 +42,91 @@ import collections
 from Common.Expression import *
 from GenFds.AprioriSection import DXE_APRIORI_GUID, PEI_APRIORI_GUID
 
-## Pattern to extract contents in EDK DXS files
-gDxsDependencyPattern = re.compile(r"DEPENDENCY_START(.+)DEPENDENCY_END", re.DOTALL)
+# Pattern to extract contents in EDK DXS files
+gDxsDependencyPattern = re.compile(
+    r"DEPENDENCY_START(.+)DEPENDENCY_END", re.DOTALL)
 
-## Pattern to find total FV total size, occupied size in flash report intermediate file
+# Pattern to find total FV total size, occupied size in flash report intermediate file
 gFvTotalSizePattern = re.compile(r"EFI_FV_TOTAL_SIZE = (0x[0-9a-fA-F]+)")
 gFvTakenSizePattern = re.compile(r"EFI_FV_TAKEN_SIZE = (0x[0-9a-fA-F]+)")
 
-## Pattern to find module size and time stamp in module summary report intermediate file
+# Pattern to find module size and time stamp in module summary report intermediate file
 gModuleSizePattern = re.compile(r"MODULE_SIZE = (\d+)")
-gTimeStampPattern  = re.compile(r"TIME_STAMP = (\d+)")
+gTimeStampPattern = re.compile(r"TIME_STAMP = (\d+)")
 
-## Pattern to find GUID value in flash description files
+# Pattern to find GUID value in flash description files
 gPcdGuidPattern = re.compile(r"PCD\((\w+)[.](\w+)\)")
 
-## Pattern to collect offset, GUID value pair in the flash report intermediate file
+# Pattern to collect offset, GUID value pair in the flash report intermediate file
 gOffsetGuidPattern = re.compile(r"(0x[0-9A-Fa-f]+) ([-A-Fa-f0-9]+)")
 
-## Pattern to find module base address and entry point in fixed flash map file
+# Pattern to find module base address and entry point in fixed flash map file
 gModulePattern = r"\n[-\w]+\s*\(([^,]+),\s*BaseAddress=%(Address)s,\s*EntryPoint=%(Address)s,\s*Type=\w+\)\s*\(GUID=([-0-9A-Fa-f]+)[^)]*\)"
-gMapFileItemPattern = re.compile(gModulePattern % {"Address" : "(-?0[xX][0-9A-Fa-f]+)"})
+gMapFileItemPattern = re.compile(
+    gModulePattern % {"Address": "(-?0[xX][0-9A-Fa-f]+)"})
 
-## Pattern to find all module referenced header files in source files
-gIncludePattern  = re.compile(r'#include\s*["<]([^">]+)[">]')
+# Pattern to find all module referenced header files in source files
+gIncludePattern = re.compile(r'#include\s*["<]([^">]+)[">]')
 gIncludePattern2 = re.compile(r"#include\s+EFI_([A-Z_]+)\s*[(]\s*(\w+)\s*[)]")
 
-## Pattern to find the entry point for EDK module using EDKII Glue library
-gGlueLibEntryPoint = re.compile(r"__EDKII_GLUE_MODULE_ENTRY_POINT__\s*=\s*(\w+)")
+# Pattern to find the entry point for EDK module using EDKII Glue library
+gGlueLibEntryPoint = re.compile(
+    r"__EDKII_GLUE_MODULE_ENTRY_POINT__\s*=\s*(\w+)")
 
-## Tags for MaxLength of line in report
+# Tags for MaxLength of line in report
 gLineMaxLength = 120
 
-## Tags for end of line in report
+# Tags for end of line in report
 gEndOfLine = "\r\n"
 
-## Tags for section start, end and separator
+# Tags for section start, end and separator
 gSectionStart = ">" + "=" * (gLineMaxLength - 2) + "<"
 gSectionEnd = "<" + "=" * (gLineMaxLength - 2) + ">" + "\n"
 gSectionSep = "=" * gLineMaxLength
 
-## Tags for subsection start, end and separator
+# Tags for subsection start, end and separator
 gSubSectionStart = ">" + "-" * (gLineMaxLength - 2) + "<"
 gSubSectionEnd = "<" + "-" * (gLineMaxLength - 2) + ">"
 gSubSectionSep = "-" * gLineMaxLength
 
 
-## The look up table to map PCD type to pair of report display type and DEC type
+# The look up table to map PCD type to pair of report display type and DEC type
 gPcdTypeMap = {
-  TAB_PCDS_FIXED_AT_BUILD     : ('FIXED',  TAB_PCDS_FIXED_AT_BUILD),
-  TAB_PCDS_PATCHABLE_IN_MODULE: ('PATCH',  TAB_PCDS_PATCHABLE_IN_MODULE),
-  TAB_PCDS_FEATURE_FLAG       : ('FLAG',   TAB_PCDS_FEATURE_FLAG),
-  TAB_PCDS_DYNAMIC            : ('DYN',    TAB_PCDS_DYNAMIC),
-  TAB_PCDS_DYNAMIC_HII        : ('DYNHII', TAB_PCDS_DYNAMIC),
-  TAB_PCDS_DYNAMIC_VPD        : ('DYNVPD', TAB_PCDS_DYNAMIC),
-  TAB_PCDS_DYNAMIC_EX         : ('DEX',    TAB_PCDS_DYNAMIC_EX),
-  TAB_PCDS_DYNAMIC_EX_HII     : ('DEXHII', TAB_PCDS_DYNAMIC_EX),
-  TAB_PCDS_DYNAMIC_EX_VPD     : ('DEXVPD', TAB_PCDS_DYNAMIC_EX),
-  }
+    TAB_PCDS_FIXED_AT_BUILD: ('FIXED',  TAB_PCDS_FIXED_AT_BUILD),
+    TAB_PCDS_PATCHABLE_IN_MODULE: ('PATCH',  TAB_PCDS_PATCHABLE_IN_MODULE),
+    TAB_PCDS_FEATURE_FLAG: ('FLAG',   TAB_PCDS_FEATURE_FLAG),
+    TAB_PCDS_DYNAMIC: ('DYN',    TAB_PCDS_DYNAMIC),
+    TAB_PCDS_DYNAMIC_HII: ('DYNHII', TAB_PCDS_DYNAMIC),
+    TAB_PCDS_DYNAMIC_VPD: ('DYNVPD', TAB_PCDS_DYNAMIC),
+    TAB_PCDS_DYNAMIC_EX: ('DEX',    TAB_PCDS_DYNAMIC_EX),
+    TAB_PCDS_DYNAMIC_EX_HII: ('DEXHII', TAB_PCDS_DYNAMIC_EX),
+    TAB_PCDS_DYNAMIC_EX_VPD: ('DEXVPD', TAB_PCDS_DYNAMIC_EX),
+}
 
-## The look up table to map module type to driver type
+# The look up table to map module type to driver type
 gDriverTypeMap = {
-  SUP_MODULE_SEC               : '0x3 (SECURITY_CORE)',
-  SUP_MODULE_PEI_CORE          : '0x4 (PEI_CORE)',
-  SUP_MODULE_PEIM              : '0x6 (PEIM)',
-  SUP_MODULE_DXE_CORE          : '0x5 (DXE_CORE)',
-  SUP_MODULE_DXE_DRIVER        : '0x7 (DRIVER)',
-  SUP_MODULE_DXE_SAL_DRIVER    : '0x7 (DRIVER)',
-  SUP_MODULE_DXE_SMM_DRIVER    : '0x7 (DRIVER)',
-  SUP_MODULE_DXE_RUNTIME_DRIVER: '0x7 (DRIVER)',
-  SUP_MODULE_UEFI_DRIVER       : '0x7 (DRIVER)',
-  SUP_MODULE_UEFI_APPLICATION  : '0x9 (APPLICATION)',
-  SUP_MODULE_SMM_CORE          : '0xD (SMM_CORE)',
-  'SMM_DRIVER'        : '0xA (SMM)', # Extension of module type to support PI 1.1 SMM drivers
-  SUP_MODULE_MM_STANDALONE     : '0xE (MM_STANDALONE)',
-  SUP_MODULE_MM_CORE_STANDALONE : '0xF (MM_CORE_STANDALONE)'
-  }
+    SUP_MODULE_SEC: '0x3 (SECURITY_CORE)',
+    SUP_MODULE_PEI_CORE: '0x4 (PEI_CORE)',
+    SUP_MODULE_PEIM: '0x6 (PEIM)',
+    SUP_MODULE_DXE_CORE: '0x5 (DXE_CORE)',
+    SUP_MODULE_DXE_DRIVER: '0x7 (DRIVER)',
+    SUP_MODULE_DXE_SAL_DRIVER: '0x7 (DRIVER)',
+    SUP_MODULE_DXE_SMM_DRIVER: '0x7 (DRIVER)',
+    SUP_MODULE_DXE_RUNTIME_DRIVER: '0x7 (DRIVER)',
+    SUP_MODULE_UEFI_DRIVER: '0x7 (DRIVER)',
+    SUP_MODULE_UEFI_APPLICATION: '0x9 (APPLICATION)',
+    SUP_MODULE_SMM_CORE: '0xD (SMM_CORE)',
+    # Extension of module type to support PI 1.1 SMM drivers
+    'SMM_DRIVER': '0xA (SMM)',
+    SUP_MODULE_MM_STANDALONE: '0xE (MM_STANDALONE)',
+    SUP_MODULE_MM_CORE_STANDALONE: '0xF (MM_CORE_STANDALONE)'
+}
 
-## The look up table of the supported opcode in the dependency expression binaries
-gOpCodeList = ["BEFORE", "AFTER", "PUSH", "AND", "OR", "NOT", "TRUE", "FALSE", "END", "SOR"]
+# The look up table of the supported opcode in the dependency expression binaries
+gOpCodeList = ["BEFORE", "AFTER", "PUSH", "AND",
+               "OR", "NOT", "TRUE", "FALSE", "END", "SOR"]
 
-## Save VPD Pcd
+# Save VPD Pcd
 VPDPcdList = []
 
 ##
@@ -134,11 +139,14 @@ VPDPcdList = []
 # @String                    The string to be written to the file
 # @Wrapper                   Indicates whether to wrap the string
 #
+
+
 def FileWrite(File, String, Wrapper=False):
     if Wrapper:
         String = textwrap.fill(String, 120)
     File.append(String + gEndOfLine)
 
+
 def ByteArrayForamt(Value):
     IsByteArray = False
     SplitNum = 16
@@ -157,7 +165,7 @@ def ByteArrayForamt(Value):
             Id = 0
             while (Id <= Len):
                 End = min(SplitNum*(Id+1), len(ValueList))
-                Str = ','.join(ValueList[SplitNum*Id : End])
+                Str = ','.join(ValueList[SplitNum*Id: End])
                 if End == len(ValueList):
                     Str += '}'
                     ArrayList.append(Str)
@@ -181,6 +189,8 @@ def ByteArrayForamt(Value):
 # @IncludePathList           The list of include path to find the source file.
 # @IncludeFiles              The dictionary of current found include files.
 #
+
+
 def FindIncludeFiles(Source, IncludePathList, IncludeFiles):
     FileContents = open(Source).read()
     #
@@ -191,7 +201,8 @@ def FindIncludeFiles(Source, IncludePathList, IncludeFiles):
         for Dir in [os.path.dirname(Source)] + IncludePathList:
             FullFileName = os.path.normpath(os.path.join(Dir, FileName))
             if os.path.exists(FullFileName):
-                IncludeFiles[FullFileName.lower().replace("\\", "/")] = FullFileName
+                IncludeFiles[FullFileName.lower().replace(
+                    "\\", "/")] = FullFileName
                 break
 
     #
@@ -201,22 +212,23 @@ def FindIncludeFiles(Source, IncludePathList, IncludeFiles):
         Key = Match.group(2)
         Type = Match.group(1)
         if "ARCH_PROTOCOL" in Type:
-            FileName = "ArchProtocol/%(Key)s/%(Key)s.h" % {"Key" : Key}
+            FileName = "ArchProtocol/%(Key)s/%(Key)s.h" % {"Key": Key}
         elif "PROTOCOL" in Type:
-            FileName = "Protocol/%(Key)s/%(Key)s.h" % {"Key" : Key}
+            FileName = "Protocol/%(Key)s/%(Key)s.h" % {"Key": Key}
         elif "PPI" in Type:
-            FileName = "Ppi/%(Key)s/%(Key)s.h" % {"Key" : Key}
+            FileName = "Ppi/%(Key)s/%(Key)s.h" % {"Key": Key}
         elif TAB_GUID in Type:
-            FileName = "Guid/%(Key)s/%(Key)s.h" % {"Key" : Key}
+            FileName = "Guid/%(Key)s/%(Key)s.h" % {"Key": Key}
         else:
             continue
         for Dir in IncludePathList:
             FullFileName = os.path.normpath(os.path.join(Dir, FileName))
             if os.path.exists(FullFileName):
-                IncludeFiles[FullFileName.lower().replace("\\", "/")] = FullFileName
+                IncludeFiles[FullFileName.lower().replace(
+                    "\\", "/")] = FullFileName
                 break
 
-## Split each lines in file
+# Split each lines in file
 #
 #  This method is used to split the lines in file to make the length of each line
 #  less than MaxLength.
@@ -224,6 +236,8 @@ def FindIncludeFiles(Source, IncludePathList, IncludeFiles):
 #  @param      Content           The content of file
 #  @param      MaxLength         The Max Length of the line
 #
+
+
 def FileLinesSplit(Content=None, MaxLength=None):
     ContentList = Content.split(TAB_LINE_BREAK)
     NewContent = ''
@@ -234,7 +248,8 @@ def FileLinesSplit(Content=None, MaxLength=None):
             LineSlashIndex = Line.rfind(TAB_SLASH, 0, MaxLength)
             LineBackSlashIndex = Line.rfind(TAB_BACK_SLASH, 0, MaxLength)
             if max(LineSpaceIndex, LineSlashIndex, LineBackSlashIndex) > 0:
-                LineBreakIndex = max(LineSpaceIndex, LineSlashIndex, LineBackSlashIndex)
+                LineBreakIndex = max(
+                    LineSpaceIndex, LineSlashIndex, LineBackSlashIndex)
             else:
                 LineBreakIndex = MaxLength
             NewContentList.append(Line[:LineBreakIndex])
@@ -244,11 +259,11 @@ def FileLinesSplit(Content=None, MaxLength=None):
     for NewLine in NewContentList:
         NewContent += NewLine + TAB_LINE_BREAK
 
-    NewContent = NewContent.replace(gEndOfLine, TAB_LINE_BREAK).replace('\r\r\n', gEndOfLine)
+    NewContent = NewContent.replace(
+        gEndOfLine, TAB_LINE_BREAK).replace('\r\r\n', gEndOfLine)
     return NewContent
 
 
-
 ##
 # Parse binary dependency expression section
 #
@@ -270,19 +285,23 @@ class DepexParser(object):
         for Pa in Wa.AutoGenObjectList:
             for Package in Pa.PackageList:
                 for Protocol in Package.Protocols:
-                    GuidValue = GuidStructureStringToGuidString(Package.Protocols[Protocol])
+                    GuidValue = GuidStructureStringToGuidString(
+                        Package.Protocols[Protocol])
                     self._GuidDb[GuidValue.upper()] = Protocol
                 for Ppi in Package.Ppis:
-                    GuidValue = GuidStructureStringToGuidString(Package.Ppis[Ppi])
+                    GuidValue = GuidStructureStringToGuidString(
+                        Package.Ppis[Ppi])
                     self._GuidDb[GuidValue.upper()] = Ppi
                 for Guid in Package.Guids:
-                    GuidValue = GuidStructureStringToGuidString(Package.Guids[Guid])
+                    GuidValue = GuidStructureStringToGuidString(
+                        Package.Guids[Guid])
                     self._GuidDb[GuidValue.upper()] = Guid
             for Ma in Pa.ModuleAutoGenList:
                 for Pcd in Ma.FixedVoidTypePcds:
                     PcdValue = Ma.FixedVoidTypePcds[Pcd]
                     if len(PcdValue.split(',')) == 16:
-                        GuidValue = GuidStructureByteArrayToGuidString(PcdValue)
+                        GuidValue = GuidStructureByteArrayToGuidString(
+                            PcdValue)
                         self._GuidDb[GuidValue.upper()] = Pcd
     ##
     # Parse the binary dependency expression files.
@@ -293,6 +312,7 @@ class DepexParser(object):
     # @param self            The object pointer
     # @param DepexFileName   The file name of binary dependency expression file.
     #
+
     def ParseDepexFile(self, DepexFileName):
         DepexFile = open(DepexFileName, "rb")
         DepexStatement = []
@@ -301,7 +321,8 @@ class DepexParser(object):
             Statement = gOpCodeList[struct.unpack("B", OpCode)[0]]
             if Statement in ["BEFORE", "AFTER", "PUSH"]:
                 GuidValue = "%08X-%04X-%04X-%02X%02X-%02X%02X%02X%02X%02X%02X" % \
-                            struct.unpack(PACK_PATTERN_GUID, DepexFile.read(16))
+                            struct.unpack(PACK_PATTERN_GUID,
+                                          DepexFile.read(16))
                 GuidString = self._GuidDb.get(GuidValue, GuidValue)
                 Statement = "%s %s" % (Statement, GuidString)
             DepexStatement.append(Statement)
@@ -314,6 +335,8 @@ class DepexParser(object):
 #
 # This class reports the module library subsection in the build report file.
 #
+
+
 class LibraryReport(object):
     ##
     # Constructor function for class LibraryReport
@@ -337,7 +360,8 @@ class LibraryReport(object):
                 if LibInfPath == LibAutoGen.MetaFile.Path:
                     LibTime = LibAutoGen.BuildTime
                     break
-            self.LibraryList.append((LibInfPath, LibClassList, LibConstructorList, LibDesstructorList, LibDepexList, LibTime))
+            self.LibraryList.append(
+                (LibInfPath, LibClassList, LibConstructorList, LibDesstructorList, LibDepexList, LibTime))
 
     ##
     # Generate report for module library information
@@ -383,6 +407,8 @@ class LibraryReport(object):
 #
 # This class reports the module dependency expression subsection in the build report file.
 #
+
+
 class DepexReport(object):
     ##
     # Constructor function for class DepexReport
@@ -398,7 +424,8 @@ class DepexReport(object):
     #
     def __init__(self, M):
         self.Depex = ""
-        self._DepexFileName = os.path.join(M.BuildDir, "OUTPUT", M.Module.BaseName + ".depex")
+        self._DepexFileName = os.path.join(
+            M.BuildDir, "OUTPUT", M.Module.BaseName + ".depex")
         ModuleType = M.ModuleType
         if not ModuleType:
             ModuleType = COMPONENT_TO_MODULE_MAP_DICT.get(M.ComponentType, "")
@@ -415,13 +442,15 @@ class DepexReport(object):
                     break
         else:
             self.Depex = M.DepexExpressionDict.get(M.ModuleType, "")
-            self.ModuleDepex = " ".join(M.Module.DepexExpression[M.Arch, M.ModuleType])
+            self.ModuleDepex = " ".join(
+                M.Module.DepexExpression[M.Arch, M.ModuleType])
             if not self.ModuleDepex:
                 self.ModuleDepex = "(None)"
 
             LibDepexList = []
             for Lib in M.DependentLibraryList:
-                LibDepex = " ".join(Lib.DepexExpression[M.Arch, M.ModuleType]).strip()
+                LibDepex = " ".join(
+                    Lib.DepexExpression[M.Arch, M.ModuleType]).strip()
                 if LibDepex != "":
                     LibDepexList.append("(" + LibDepex + ")")
             self.LibraryDepex = " AND ".join(LibDepexList)
@@ -444,13 +473,16 @@ class DepexReport(object):
         FileWrite(File, gSubSectionStart)
         if os.path.isfile(self._DepexFileName):
             try:
-                DepexStatements = GlobalDepexParser.ParseDepexFile(self._DepexFileName)
-                FileWrite(File, "Final Dependency Expression (DEPEX) Instructions")
+                DepexStatements = GlobalDepexParser.ParseDepexFile(
+                    self._DepexFileName)
+                FileWrite(
+                    File, "Final Dependency Expression (DEPEX) Instructions")
                 for DepexStatement in DepexStatements:
                     FileWrite(File, "  %s" % DepexStatement)
                 FileWrite(File, gSubSectionSep)
             except:
-                EdkLogger.warn(None, "Dependency expression file is corrupted", self._DepexFileName)
+                EdkLogger.warn(
+                    None, "Dependency expression file is corrupted", self._DepexFileName)
 
         FileWrite(File, "Dependency Expression (DEPEX) from %s" % self.Source)
 
@@ -468,6 +500,8 @@ class DepexReport(object):
 #
 # This class reports the module build flags subsection in the build report file.
 #
+
+
 class BuildFlagsReport(object):
     ##
     # Constructor function for class BuildFlagsReport
@@ -516,7 +550,8 @@ class BuildFlagsReport(object):
         self.ToolChainTag = M.ToolChain
         self.BuildFlags = {}
         for Tool in BuildOptions:
-            self.BuildFlags[Tool + "_FLAGS"] = M.BuildOption.get(Tool, {}).get("FLAGS", "")
+            self.BuildFlags[Tool +
+                            "_FLAGS"] = M.BuildOption.get(Tool, {}).get("FLAGS", "")
 
     ##
     # Generate report for module build flags information
@@ -567,17 +602,21 @@ class ModuleReport(object):
         if not M.IsLibrary:
             ModuleType = M.ModuleType
             if not ModuleType:
-                ModuleType = COMPONENT_TO_MODULE_MAP_DICT.get(M.ComponentType, "")
+                ModuleType = COMPONENT_TO_MODULE_MAP_DICT.get(
+                    M.ComponentType, "")
             #
             # If a module complies to PI 1.1, promote Module type to "SMM_DRIVER"
             #
             if ModuleType == SUP_MODULE_DXE_SMM_DRIVER:
-                PiSpec = M.Module.Specification.get("PI_SPECIFICATION_VERSION", "0x00010000")
+                PiSpec = M.Module.Specification.get(
+                    "PI_SPECIFICATION_VERSION", "0x00010000")
                 if int(PiSpec, 0) >= 0x0001000A:
                     ModuleType = "SMM_DRIVER"
             self.DriverType = gDriverTypeMap.get(ModuleType, "0x2 (FREE_FORM)")
-        self.UefiSpecVersion = M.Module.Specification.get("UEFI_SPECIFICATION_VERSION", "")
-        self.PiSpecVersion = M.Module.Specification.get("PI_SPECIFICATION_VERSION", "")
+        self.UefiSpecVersion = M.Module.Specification.get(
+            "UEFI_SPECIFICATION_VERSION", "")
+        self.PiSpecVersion = M.Module.Specification.get(
+            "PI_SPECIFICATION_VERSION", "")
         self.PciDeviceId = M.Module.Defines.get("PCI_DEVICE_ID", "")
         self.PciVendorId = M.Module.Defines.get("PCI_VENDOR_ID", "")
         self.PciClassCode = M.Module.Defines.get("PCI_CLASS_CODE", "")
@@ -591,7 +630,8 @@ class ModuleReport(object):
             # It also saves module INF default values of them in case they exist.
             #
             for Pcd in M.ModulePcdList + M.LibraryPcdList:
-                self.ModulePcdSet.setdefault((Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Type), (Pcd.InfDefaultValue, Pcd.DefaultValue))
+                self.ModulePcdSet.setdefault(
+                    (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Pcd.Type), (Pcd.InfDefaultValue, Pcd.DefaultValue))
 
         self.LibraryReport = None
         if "LIBRARY" in ReportType:
@@ -604,7 +644,6 @@ class ModuleReport(object):
         if "BUILD_FLAGS" in ReportType:
             self.BuildFlagsReport = BuildFlagsReport(M)
 
-
     ##
     # Generate report for module information
     #
@@ -618,10 +657,12 @@ class ModuleReport(object):
     # @param GlobalDepexParser      The platform global Dependency expression parser object
     # @param ReportType             The kind of report items in the final report file
     #
+
     def GenerateReport(self, File, GlobalPcdReport, GlobalPredictionReport, GlobalDepexParser, ReportType):
         FileWrite(File, gSectionStart)
 
-        FwReportFileName = os.path.join(self._BuildDir, "OUTPUT", self.ModuleName + ".txt")
+        FwReportFileName = os.path.join(
+            self._BuildDir, "OUTPUT", self.ModuleName + ".txt")
         if os.path.isfile(FwReportFileName):
             try:
                 FileContents = open(FwReportFileName).read()
@@ -631,25 +672,32 @@ class ModuleReport(object):
 
                 Match = gTimeStampPattern.search(FileContents)
                 if Match:
-                    self.BuildTimeStamp = datetime.utcfromtimestamp(int(Match.group(1)))
+                    self.BuildTimeStamp = datetime.utcfromtimestamp(
+                        int(Match.group(1)))
             except IOError:
-                EdkLogger.warn(None, "Fail to read report file", FwReportFileName)
+                EdkLogger.warn(None, "Fail to read report file",
+                               FwReportFileName)
 
         if "HASH" in ReportType:
             OutputDir = os.path.join(self._BuildDir, "OUTPUT")
             DefaultEFIfile = os.path.join(OutputDir, self.ModuleName + ".efi")
             if os.path.isfile(DefaultEFIfile):
-                Tempfile = os.path.join(OutputDir, self.ModuleName + "_hash.tmp")
+                Tempfile = os.path.join(
+                    OutputDir, self.ModuleName + "_hash.tmp")
                 # rebase the efi image since its base address may not zero
-                cmd = ["GenFw", "--rebase", str(0), "-o", Tempfile, DefaultEFIfile]
+                cmd = ["GenFw", "--rebase",
+                       str(0), "-o", Tempfile, DefaultEFIfile]
                 try:
-                    PopenObject = subprocess.Popen(' '.join(cmd), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
+                    PopenObject = subprocess.Popen(
+                        ' '.join(cmd), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
                 except Exception as X:
-                    EdkLogger.error("GenFw", COMMAND_FAILURE, ExtraData="%s: %s" % (str(X), cmd[0]))
+                    EdkLogger.error("GenFw", COMMAND_FAILURE,
+                                    ExtraData="%s: %s" % (str(X), cmd[0]))
                 EndOfProcedure = threading.Event()
                 EndOfProcedure.clear()
                 if PopenObject.stderr:
-                    StdErrThread = threading.Thread(target=ReadMessage, args=(PopenObject.stderr, EdkLogger.quiet, EndOfProcedure))
+                    StdErrThread = threading.Thread(target=ReadMessage, args=(
+                        PopenObject.stderr, EdkLogger.quiet, EndOfProcedure))
                     StdErrThread.setName("STDERR-Redirector")
                     StdErrThread.setDaemon(False)
                     StdErrThread.start()
@@ -658,7 +706,8 @@ class ModuleReport(object):
                 if PopenObject.stderr:
                     StdErrThread.join()
                 if PopenObject.returncode != 0:
-                    EdkLogger.error("GenFw", COMMAND_FAILURE, "Failed to generate firmware hash image for %s" % (DefaultEFIfile))
+                    EdkLogger.error(
+                        "GenFw", COMMAND_FAILURE, "Failed to generate firmware hash image for %s" % (DefaultEFIfile))
                 if os.path.isfile(Tempfile):
                     self.Hash = hashlib.sha1()
                     buf = open(Tempfile, 'rb').read()
@@ -673,9 +722,11 @@ class ModuleReport(object):
         FileWrite(File, "Module INF Path:      %s" % self.ModuleInfPath)
         FileWrite(File, "File GUID:            %s" % self.FileGuid)
         if self.Size:
-            FileWrite(File, "Size:                 0x%X (%.2fK)" % (self.Size, self.Size / 1024.0))
+            FileWrite(File, "Size:                 0x%X (%.2fK)" %
+                      (self.Size, self.Size / 1024.0))
         if self.Hash:
-            FileWrite(File, "SHA1 HASH:            %s *%s" % (self.Hash, self.ModuleName + ".efi"))
+            FileWrite(File, "SHA1 HASH:            %s *%s" %
+                      (self.Hash, self.ModuleName + ".efi"))
         if self.BuildTimeStamp:
             FileWrite(File, "Build Time Stamp:     %s" % self.BuildTimeStamp)
         if self.BuildTime:
@@ -696,7 +747,8 @@ class ModuleReport(object):
         FileWrite(File, gSectionSep)
 
         if "PCD" in ReportType:
-            GlobalPcdReport.GenerateReport(File, self.ModulePcdSet,self.FileGuid)
+            GlobalPcdReport.GenerateReport(
+                File, self.ModulePcdSet, self.FileGuid)
 
         if "LIBRARY" in ReportType:
             self.LibraryReport.GenerateReport(File)
@@ -712,6 +764,7 @@ class ModuleReport(object):
 
         FileWrite(File, gSectionEnd)
 
+
 def ReadMessage(From, To, ExitFlag):
     while True:
         # read one line a time
@@ -730,6 +783,8 @@ def ReadMessage(From, To, ExitFlag):
 # This class reports the platform PCD section and module PCD subsection
 # in the build report file.
 #
+
+
 class PcdReport(object):
     ##
     # Constructor function for class PcdReport
@@ -767,7 +822,8 @@ class PcdReport(object):
             # GUID C Names
             #
             for Pcd in Pa.AllPcdList:
-                PcdList = self.AllPcds.setdefault(Pcd.TokenSpaceGuidCName, {}).setdefault(Pcd.Type, [])
+                PcdList = self.AllPcds.setdefault(
+                    Pcd.TokenSpaceGuidCName, {}).setdefault(Pcd.Type, [])
                 if Pcd not in PcdList:
                     PcdList.append(Pcd)
                 if len(Pcd.TokenCName) > self.MaxLen:
@@ -776,11 +832,13 @@ class PcdReport(object):
             # Collect the PCD defined in DSC/FDF file, but not used in module
             #
             UnusedPcdFullList = []
-            StructPcdDict = GlobalData.gStructurePcd.get(self.Arch, collections.OrderedDict())
+            StructPcdDict = GlobalData.gStructurePcd.get(
+                self.Arch, collections.OrderedDict())
             for Name, Guid in StructPcdDict:
                 if (Name, Guid) not in Pa.Platform.Pcds:
                     Pcd = StructPcdDict[(Name, Guid)]
-                    PcdList = self.AllPcds.setdefault(Guid, {}).setdefault(Pcd.Type, [])
+                    PcdList = self.AllPcds.setdefault(
+                        Guid, {}).setdefault(Pcd.Type, [])
                     if Pcd not in PcdList and Pcd not in UnusedPcdFullList:
                         UnusedPcdFullList.append(Pcd)
             for item in Pa.Platform.Pcds:
@@ -788,7 +846,8 @@ class PcdReport(object):
                 if not Pcd.Type:
                     # check the Pcd in FDF file, whether it is used in module first
                     for T in PCD_TYPE_LIST:
-                        PcdList = self.AllPcds.setdefault(Pcd.TokenSpaceGuidCName, {}).setdefault(T, [])
+                        PcdList = self.AllPcds.setdefault(
+                            Pcd.TokenSpaceGuidCName, {}).setdefault(T, [])
                         if Pcd in PcdList:
                             Pcd.Type = T
                             break
@@ -800,7 +859,8 @@ class PcdReport(object):
                                 Pcd.Type = T
                                 PcdTypeFlag = True
                                 if not Pcd.DatumType:
-                                    Pcd.DatumType = package.Pcds[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName, T)].DatumType
+                                    Pcd.DatumType = package.Pcds[(
+                                        Pcd.TokenCName, Pcd.TokenSpaceGuidCName, T)].DatumType
                                 break
                         if PcdTypeFlag:
                             break
@@ -813,11 +873,14 @@ class PcdReport(object):
                         PcdType = TAB_PCDS_DYNAMIC
                     for package in Pa.PackageList:
                         if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, PcdType) in package.Pcds:
-                            Pcd.DatumType = package.Pcds[(Pcd.TokenCName, Pcd.TokenSpaceGuidCName, PcdType)].DatumType
+                            Pcd.DatumType = package.Pcds[(
+                                Pcd.TokenCName, Pcd.TokenSpaceGuidCName, PcdType)].DatumType
                             break
 
-                PcdList = self.AllPcds.setdefault(Pcd.TokenSpaceGuidCName, {}).setdefault(Pcd.Type, [])
-                UnusedPcdList = self.UnusedPcds.setdefault(Pcd.TokenSpaceGuidCName, {}).setdefault(Pcd.Type, [])
+                PcdList = self.AllPcds.setdefault(
+                    Pcd.TokenSpaceGuidCName, {}).setdefault(Pcd.Type, [])
+                UnusedPcdList = self.UnusedPcds.setdefault(
+                    Pcd.TokenSpaceGuidCName, {}).setdefault(Pcd.Type, [])
                 if Pcd in UnusedPcdList:
                     UnusedPcdList.remove(Pcd)
                 if Pcd not in PcdList and Pcd not in UnusedPcdFullList:
@@ -830,8 +893,10 @@ class PcdReport(object):
                     if '.' in PcdItem:
                         (TokenSpaceGuidCName, TokenCName) = PcdItem.split('.')
                         if (TokenCName, TokenSpaceGuidCName) in Pa.Platform.Pcds:
-                            Pcd = Pa.Platform.Pcds[(TokenCName, TokenSpaceGuidCName)]
-                            PcdList = self.ConditionalPcds.setdefault(Pcd.TokenSpaceGuidCName, {}).setdefault(Pcd.Type, [])
+                            Pcd = Pa.Platform.Pcds[(
+                                TokenCName, TokenSpaceGuidCName)]
+                            PcdList = self.ConditionalPcds.setdefault(
+                                Pcd.TokenSpaceGuidCName, {}).setdefault(Pcd.Type, [])
                             if Pcd not in PcdList:
                                 PcdList.append(Pcd)
 
@@ -843,7 +908,8 @@ class PcdReport(object):
                     UnusedPcdList.append(Pcd)
 
             for Pcd in UnusedPcdList:
-                PcdList = self.UnusedPcds.setdefault(Pcd.TokenSpaceGuidCName, {}).setdefault(Pcd.Type, [])
+                PcdList = self.UnusedPcds.setdefault(
+                    Pcd.TokenSpaceGuidCName, {}).setdefault(Pcd.Type, [])
                 if Pcd not in PcdList:
                     PcdList.append(Pcd)
 
@@ -856,8 +922,8 @@ class PcdReport(object):
                     TokenSpaceGuid = ModulePcd.TokenSpaceGuidCName
                     ModuleDefault = ModulePcd.DefaultValue
                     ModulePath = os.path.basename(Module.M.MetaFile.File)
-                    self.ModulePcdOverride.setdefault((TokenCName, TokenSpaceGuid), {})[ModulePath] = ModuleDefault
-
+                    self.ModulePcdOverride.setdefault((TokenCName, TokenSpaceGuid), {})[
+                        ModulePath] = ModuleDefault
 
         #
         # Collect PCD DEC default value.
@@ -869,19 +935,23 @@ class PcdReport(object):
                 Guids = Package.Guids
                 self._GuidDict.update(Guids)
                 for (TokenCName, TokenSpaceGuidCName, DecType) in Package.Pcds:
-                    DecDefaultValue = Package.Pcds[TokenCName, TokenSpaceGuidCName, DecType].DefaultValue
-                    self.DecPcdDefault.setdefault((TokenCName, TokenSpaceGuidCName, DecType), DecDefaultValue)
+                    DecDefaultValue = Package.Pcds[TokenCName,
+                                                   TokenSpaceGuidCName, DecType].DefaultValue
+                    self.DecPcdDefault.setdefault(
+                        (TokenCName, TokenSpaceGuidCName, DecType), DecDefaultValue)
         #
         # Collect PCDs defined in DSC common section
         #
         self.DscPcdDefault = {}
         for Pa in Wa.AutoGenObjectList:
             for (TokenCName, TokenSpaceGuidCName) in Pa.Platform.Pcds:
-                DscDefaultValue = Pa.Platform.Pcds[(TokenCName, TokenSpaceGuidCName)].DscDefaultValue
+                DscDefaultValue = Pa.Platform.Pcds[(
+                    TokenCName, TokenSpaceGuidCName)].DscDefaultValue
                 if DscDefaultValue:
-                    self.DscPcdDefault[(TokenCName, TokenSpaceGuidCName)] = DscDefaultValue
+                    self.DscPcdDefault[(
+                        TokenCName, TokenSpaceGuidCName)] = DscDefaultValue
 
-    def GenerateReport(self, File, ModulePcdSet,ModuleGuid=None):
+    def GenerateReport(self, File, ModulePcdSet, ModuleGuid=None):
         if not ModulePcdSet:
             if self.ConditionalPcds:
                 self.GenerateReportDetail(File, ModulePcdSet, 1)
@@ -897,7 +967,7 @@ class PcdReport(object):
                         break
                 if not IsEmpty:
                     self.GenerateReportDetail(File, ModulePcdSet, 2)
-        self.GenerateReportDetail(File, ModulePcdSet,ModuleGuid = ModuleGuid)
+        self.GenerateReportDetail(File, ModulePcdSet, ModuleGuid=ModuleGuid)
 
     ##
     # Generate report for PCD information
@@ -913,7 +983,7 @@ class PcdReport(object):
     #                        directives section report, 2 means Unused Pcds section report
     # @param DscOverridePcds Module DSC override PCDs set
     #
-    def GenerateReportDetail(self, File, ModulePcdSet, ReportSubType = 0,ModuleGuid=None):
+    def GenerateReportDetail(self, File, ModulePcdSet, ReportSubType=0, ModuleGuid=None):
         PcdDict = self.AllPcds
         if ReportSubType == 1:
             PcdDict = self.ConditionalPcds
@@ -923,9 +993,11 @@ class PcdReport(object):
         if not ModulePcdSet:
             FileWrite(File, gSectionStart)
             if ReportSubType == 1:
-                FileWrite(File, "Conditional Directives used by the build system")
+                FileWrite(
+                    File, "Conditional Directives used by the build system")
             elif ReportSubType == 2:
-                FileWrite(File, "PCDs not used by modules or in conditional directives")
+                FileWrite(
+                    File, "PCDs not used by modules or in conditional directives")
             else:
                 FileWrite(File, "Platform Configuration Database Report")
 
@@ -971,8 +1043,10 @@ class PcdReport(object):
                 #
                 # Get PCD default value and their override relationship
                 #
-                DecDefaultValue = self.DecPcdDefault.get((Pcd.TokenCName, Pcd.TokenSpaceGuidCName, DecType))
-                DscDefaultValue = self.DscPcdDefault.get((Pcd.TokenCName, Pcd.TokenSpaceGuidCName))
+                DecDefaultValue = self.DecPcdDefault.get(
+                    (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, DecType))
+                DscDefaultValue = self.DscPcdDefault.get(
+                    (Pcd.TokenCName, Pcd.TokenSpaceGuidCName))
                 DscDefaultValBak = DscDefaultValue
                 Field = ''
                 for (CName, Guid, Field) in self.FdfPcdSet:
@@ -981,32 +1055,37 @@ class PcdReport(object):
                         break
                 if DscDefaultValue != DscDefaultValBak:
                     try:
-                        DscDefaultValue = ValueExpressionEx(DscDefaultValue, Pcd.DatumType, self._GuidDict)(True)
+                        DscDefaultValue = ValueExpressionEx(
+                            DscDefaultValue, Pcd.DatumType, self._GuidDict)(True)
                     except BadExpression as DscDefaultValue:
-                        EdkLogger.error('BuildReport', FORMAT_INVALID, "PCD Value: %s, Type: %s" %(DscDefaultValue, Pcd.DatumType))
+                        EdkLogger.error('BuildReport', FORMAT_INVALID, "PCD Value: %s, Type: %s" % (
+                            DscDefaultValue, Pcd.DatumType))
 
                 InfDefaultValue = None
 
                 PcdValue = DecDefaultValue
                 if DscDefaultValue:
                     PcdValue = DscDefaultValue
-                #The DefaultValue of StructurePcd already be the latest, no need to update.
+                # The DefaultValue of StructurePcd already be the latest, no need to update.
                 if not self.IsStructurePcd(Pcd.TokenCName, Pcd.TokenSpaceGuidCName):
                     Pcd.DefaultValue = PcdValue
                 PcdComponentValue = None
                 if ModulePcdSet is not None:
                     if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Type) not in ModulePcdSet:
                         continue
-                    InfDefaultValue, PcdComponentValue = ModulePcdSet[Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Type]
+                    InfDefaultValue, PcdComponentValue = ModulePcdSet[Pcd.TokenCName,
+                                                                      Pcd.TokenSpaceGuidCName, Type]
                     PcdValue = PcdComponentValue
-                    #The DefaultValue of StructurePcd already be the latest, no need to update.
+                    # The DefaultValue of StructurePcd already be the latest, no need to update.
                     if not self.IsStructurePcd(Pcd.TokenCName, Pcd.TokenSpaceGuidCName):
                         Pcd.DefaultValue = PcdValue
                     if InfDefaultValue:
                         try:
-                            InfDefaultValue = ValueExpressionEx(InfDefaultValue, Pcd.DatumType, self._GuidDict)(True)
+                            InfDefaultValue = ValueExpressionEx(
+                                InfDefaultValue, Pcd.DatumType, self._GuidDict)(True)
                         except BadExpression as InfDefaultValue:
-                            EdkLogger.error('BuildReport', FORMAT_INVALID, "PCD Value: %s, Type: %s" % (InfDefaultValue, Pcd.DatumType))
+                            EdkLogger.error('BuildReport', FORMAT_INVALID, "PCD Value: %s, Type: %s" % (
+                                InfDefaultValue, Pcd.DatumType))
                     if InfDefaultValue == "":
                         InfDefaultValue = None
 
@@ -1017,7 +1096,7 @@ class PcdReport(object):
                             if pcd[2]:
                                 continue
                             PcdValue = pcd[3]
-                            #The DefaultValue of StructurePcd already be the latest, no need to update.
+                            # The DefaultValue of StructurePcd already be the latest, no need to update.
                             if not self.IsStructurePcd(Pcd.TokenCName, Pcd.TokenSpaceGuidCName):
                                 Pcd.DefaultValue = PcdValue
                             BuildOptionMatch = True
@@ -1029,7 +1108,6 @@ class PcdReport(object):
                     FileWrite(File, Key)
                     First = False
 
-
                 if Pcd.DatumType in TAB_PCD_NUMERIC_TYPES:
                     if PcdValue.startswith('0') and not PcdValue.lower().startswith('0x') and \
                             len(PcdValue) > 1 and PcdValue.lstrip('0'):
@@ -1065,26 +1143,31 @@ class PcdReport(object):
                     if DecDefaultValue is None:
                         DecMatch = True
                     else:
-                        DecMatch = (DecDefaultValue.strip() == PcdValue.strip())
+                        DecMatch = (DecDefaultValue.strip()
+                                    == PcdValue.strip())
 
                     if InfDefaultValue is None:
                         InfMatch = True
                     else:
-                        InfMatch = (InfDefaultValue.strip() == PcdValue.strip())
+                        InfMatch = (InfDefaultValue.strip()
+                                    == PcdValue.strip())
 
                     if DscDefaultValue is None:
                         DscMatch = True
                     else:
-                        DscMatch = (DscDefaultValue.strip() == PcdValue.strip())
+                        DscMatch = (DscDefaultValue.strip()
+                                    == PcdValue.strip())
 
                 IsStructure = False
                 if self.IsStructurePcd(Pcd.TokenCName, Pcd.TokenSpaceGuidCName):
                     IsStructure = True
                     if TypeName in ('DYNVPD', 'DEXVPD'):
                         SkuInfoList = Pcd.SkuInfoList
-                    Pcd = GlobalData.gStructurePcd[self.Arch][(Pcd.TokenCName, Pcd.TokenSpaceGuidCName)]
+                    Pcd = GlobalData.gStructurePcd[self.Arch][(
+                        Pcd.TokenCName, Pcd.TokenSpaceGuidCName)]
                     if ModulePcdSet and ModulePcdSet.get((Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Type)):
-                        InfDefaultValue, PcdComponentValue = ModulePcdSet[Pcd.TokenCName, Pcd.TokenSpaceGuidCName, Type]
+                        InfDefaultValue, PcdComponentValue = ModulePcdSet[Pcd.TokenCName,
+                                                                          Pcd.TokenSpaceGuidCName, Type]
                         DscDefaultValBak = Pcd.DefaultValue
                         Pcd.DefaultValue = PcdComponentValue
 
@@ -1116,18 +1199,23 @@ class PcdReport(object):
                                             for Data in OverrideValues.values():
                                                 Struct = list(Data.values())
                                                 if Struct:
-                                                    DscOverride = self.ParseStruct(Struct[0])
+                                                    DscOverride = self.ParseStruct(
+                                                        Struct[0])
                                                     break
                                     else:
-                                        SkuList = sorted(Pcd.SkuInfoList.keys())
+                                        SkuList = sorted(
+                                            Pcd.SkuInfoList.keys())
                                         for Sku in SkuList:
                                             SkuInfo = Pcd.SkuInfoList[Sku]
                                             if SkuInfo.DefaultStoreDict:
-                                                DefaultStoreList = sorted(SkuInfo.DefaultStoreDict.keys())
+                                                DefaultStoreList = sorted(
+                                                    SkuInfo.DefaultStoreDict.keys())
                                                 for DefaultStore in DefaultStoreList:
-                                                    OverrideValues = Pcd.SkuOverrideValues.get(Sku)
+                                                    OverrideValues = Pcd.SkuOverrideValues.get(
+                                                        Sku)
                                                     if OverrideValues:
-                                                        DscOverride = self.ParseStruct(OverrideValues[DefaultStore])
+                                                        DscOverride = self.ParseStruct(
+                                                            OverrideValues[DefaultStore])
                                                         if DscOverride:
                                                             break
                                             if DscOverride:
@@ -1139,7 +1227,7 @@ class PcdReport(object):
                             else:
                                 DecMatch = True
                         else:
-                            if Pcd.DscRawValue or (ModuleGuid and ModuleGuid.replace("-","S") in Pcd.PcdValueFromComponents):
+                            if Pcd.DscRawValue or (ModuleGuid and ModuleGuid.replace("-", "S") in Pcd.PcdValueFromComponents):
                                 DscDefaultValue = True
                                 DscMatch = True
                                 DecMatch = False
@@ -1160,20 +1248,25 @@ class PcdReport(object):
                     if Pcd.DefaultValue:
                         Pcd.DefaultValue = str(int(Pcd.DefaultValue, 0))
                 if DecMatch:
-                    self.PrintPcdValue(File, Pcd, PcdTokenCName, TypeName, IsStructure, DscMatch, DscDefaultValBak, InfMatch, InfDefaultValue, DecMatch, DecDefaultValue, '  ')
+                    self.PrintPcdValue(File, Pcd, PcdTokenCName, TypeName, IsStructure, DscMatch,
+                                       DscDefaultValBak, InfMatch, InfDefaultValue, DecMatch, DecDefaultValue, '  ')
                 elif InfDefaultValue and InfMatch:
-                    self.PrintPcdValue(File, Pcd, PcdTokenCName, TypeName, IsStructure, DscMatch, DscDefaultValBak, InfMatch, InfDefaultValue, DecMatch, DecDefaultValue, '*M')
+                    self.PrintPcdValue(File, Pcd, PcdTokenCName, TypeName, IsStructure, DscMatch,
+                                       DscDefaultValBak, InfMatch, InfDefaultValue, DecMatch, DecDefaultValue, '*M')
                 elif BuildOptionMatch:
-                    self.PrintPcdValue(File, Pcd, PcdTokenCName, TypeName, IsStructure, DscMatch, DscDefaultValBak, InfMatch, InfDefaultValue, DecMatch, DecDefaultValue, '*B')
+                    self.PrintPcdValue(File, Pcd, PcdTokenCName, TypeName, IsStructure, DscMatch,
+                                       DscDefaultValBak, InfMatch, InfDefaultValue, DecMatch, DecDefaultValue, '*B')
                 else:
                     if PcdComponentValue:
-                        self.PrintPcdValue(File, Pcd, PcdTokenCName, TypeName, IsStructure, DscMatch, DscDefaultValBak, InfMatch, PcdComponentValue, DecMatch, DecDefaultValue, '*M', ModuleGuid)
+                        self.PrintPcdValue(File, Pcd, PcdTokenCName, TypeName, IsStructure, DscMatch, DscDefaultValBak,
+                                           InfMatch, PcdComponentValue, DecMatch, DecDefaultValue, '*M', ModuleGuid)
                     elif DscDefaultValue and DscMatch:
                         if (Pcd.TokenCName, Key, Field) in self.FdfPcdSet:
-                            self.PrintPcdValue(File, Pcd, PcdTokenCName, TypeName, IsStructure, DscMatch, DscDefaultValBak, InfMatch, InfDefaultValue, DecMatch, DecDefaultValue, '*F')
+                            self.PrintPcdValue(File, Pcd, PcdTokenCName, TypeName, IsStructure, DscMatch,
+                                               DscDefaultValBak, InfMatch, InfDefaultValue, DecMatch, DecDefaultValue, '*F')
                         else:
-                            self.PrintPcdValue(File, Pcd, PcdTokenCName, TypeName, IsStructure, DscMatch, DscDefaultValBak, InfMatch, InfDefaultValue, DecMatch, DecDefaultValue, '*P')
-
+                            self.PrintPcdValue(File, Pcd, PcdTokenCName, TypeName, IsStructure, DscMatch,
+                                               DscDefaultValBak, InfMatch, InfDefaultValue, DecMatch, DecDefaultValue, '*P')
 
                 if ModulePcdSet is None:
                     if IsStructure:
@@ -1181,34 +1274,44 @@ class PcdReport(object):
                     if not TypeName in ('PATCH', 'FLAG', 'FIXED'):
                         continue
                     if not BuildOptionMatch:
-                        ModuleOverride = self.ModulePcdOverride.get((Pcd.TokenCName, Pcd.TokenSpaceGuidCName), {})
+                        ModuleOverride = self.ModulePcdOverride.get(
+                            (Pcd.TokenCName, Pcd.TokenSpaceGuidCName), {})
                         for ModulePath in ModuleOverride:
                             ModuleDefault = ModuleOverride[ModulePath]
                             if Pcd.DatumType in TAB_PCD_NUMERIC_TYPES:
                                 if ModuleDefault.startswith('0') and not ModuleDefault.lower().startswith('0x') and \
                                         len(ModuleDefault) > 1 and ModuleDefault.lstrip('0'):
                                     ModuleDefault = ModuleDefault.lstrip('0')
-                                ModulePcdDefaultValueNumber = int(ModuleDefault.strip(), 0)
-                                Match = (ModulePcdDefaultValueNumber == PcdValueNumber)
+                                ModulePcdDefaultValueNumber = int(
+                                    ModuleDefault.strip(), 0)
+                                Match = (ModulePcdDefaultValueNumber ==
+                                         PcdValueNumber)
                                 if Pcd.DatumType == 'BOOLEAN':
-                                    ModuleDefault = str(ModulePcdDefaultValueNumber)
+                                    ModuleDefault = str(
+                                        ModulePcdDefaultValueNumber)
                             else:
-                                Match = (ModuleDefault.strip() == PcdValue.strip())
+                                Match = (ModuleDefault.strip()
+                                         == PcdValue.strip())
                             if Match:
                                 continue
-                            IsByteArray, ArrayList = ByteArrayForamt(ModuleDefault.strip())
+                            IsByteArray, ArrayList = ByteArrayForamt(
+                                ModuleDefault.strip())
                             if IsByteArray:
-                                FileWrite(File, ' *M     %-*s = %s' % (self.MaxLen + 15, ModulePath, '{'))
+                                FileWrite(File, ' *M     %-*s = %s' %
+                                          (self.MaxLen + 15, ModulePath, '{'))
                                 for Array in ArrayList:
                                     FileWrite(File, Array)
                             else:
-                                Value =  ModuleDefault.strip()
+                                Value = ModuleDefault.strip()
                                 if Pcd.DatumType in TAB_PCD_CLEAN_NUMERIC_TYPES:
                                     if Value.startswith(('0x', '0X')):
-                                        Value = '{} ({:d})'.format(Value, int(Value, 0))
+                                        Value = '{} ({:d})'.format(
+                                            Value, int(Value, 0))
                                     else:
-                                        Value = "0x{:X} ({})".format(int(Value, 0), Value)
-                                FileWrite(File, ' *M     %-*s = %s' % (self.MaxLen + 15, ModulePath, Value))
+                                        Value = "0x{:X} ({})".format(
+                                            int(Value, 0), Value)
+                                FileWrite(File, ' *M     %-*s = %s' %
+                                          (self.MaxLen + 15, ModulePath, Value))
 
         if ModulePcdSet is None:
             FileWrite(File, gSectionEnd)
@@ -1233,7 +1336,8 @@ class PcdReport(object):
             Value = DscDefaultValue.strip()
             IsByteArray, ArrayList = ByteArrayForamt(Value)
             if IsByteArray:
-                FileWrite(File, '    %*s = %s' % (self.MaxLen + 19, 'DSC DEFAULT', "{"))
+                FileWrite(File, '    %*s = %s' %
+                          (self.MaxLen + 19, 'DSC DEFAULT', "{"))
                 for Array in ArrayList:
                     FileWrite(File, Array)
             else:
@@ -1242,12 +1346,14 @@ class PcdReport(object):
                         Value = '{} ({:d})'.format(Value, int(Value, 0))
                     else:
                         Value = "0x{:X} ({})".format(int(Value, 0), Value)
-                FileWrite(File, '    %*s = %s' % (self.MaxLen + 19, 'DSC DEFAULT', Value))
+                FileWrite(File, '    %*s = %s' %
+                          (self.MaxLen + 19, 'DSC DEFAULT', Value))
         if not InfMatch and InfDefaultValue is not None:
             Value = InfDefaultValue.strip()
             IsByteArray, ArrayList = ByteArrayForamt(Value)
             if IsByteArray:
-                FileWrite(File, '    %*s = %s' % (self.MaxLen + 19, 'INF DEFAULT', "{"))
+                FileWrite(File, '    %*s = %s' %
+                          (self.MaxLen + 19, 'INF DEFAULT', "{"))
                 for Array in ArrayList:
                     FileWrite(File, Array)
             else:
@@ -1256,13 +1362,15 @@ class PcdReport(object):
                         Value = '{} ({:d})'.format(Value, int(Value, 0))
                     else:
                         Value = "0x{:X} ({})".format(int(Value, 0), Value)
-                FileWrite(File, '    %*s = %s' % (self.MaxLen + 19, 'INF DEFAULT', Value))
+                FileWrite(File, '    %*s = %s' %
+                          (self.MaxLen + 19, 'INF DEFAULT', Value))
 
         if not DecMatch and DecDefaultValue is not None:
             Value = DecDefaultValue.strip()
             IsByteArray, ArrayList = ByteArrayForamt(Value)
             if IsByteArray:
-                FileWrite(File, '    %*s = %s' % (self.MaxLen + 19, 'DEC DEFAULT', "{"))
+                FileWrite(File, '    %*s = %s' %
+                          (self.MaxLen + 19, 'DEC DEFAULT', "{"))
                 for Array in ArrayList:
                     FileWrite(File, Array)
             else:
@@ -1271,7 +1379,8 @@ class PcdReport(object):
                         Value = '{} ({:d})'.format(Value, int(Value, 0))
                     else:
                         Value = "0x{:X} ({})".format(int(Value, 0), Value)
-                FileWrite(File, '    %*s = %s' % (self.MaxLen + 19, 'DEC DEFAULT', Value))
+                FileWrite(File, '    %*s = %s' %
+                          (self.MaxLen + 19, 'DEC DEFAULT', Value))
             if IsStructure:
                 for filedvalues in Pcd.DefaultValues.values():
                     self.PrintStructureInfo(File, filedvalues)
@@ -1279,12 +1388,13 @@ class PcdReport(object):
             for filedvalues in Pcd.DefaultValues.values():
                 self.PrintStructureInfo(File, filedvalues)
 
-    def PrintPcdValue(self, File, Pcd, PcdTokenCName, TypeName, IsStructure, DscMatch, DscDefaultValue, InfMatch, InfDefaultValue, DecMatch, DecDefaultValue, Flag = '  ',ModuleGuid=None):
+    def PrintPcdValue(self, File, Pcd, PcdTokenCName, TypeName, IsStructure, DscMatch, DscDefaultValue, InfMatch, InfDefaultValue, DecMatch, DecDefaultValue, Flag='  ', ModuleGuid=None):
         if not Pcd.SkuInfoList:
             Value = Pcd.DefaultValue
             IsByteArray, ArrayList = ByteArrayForamt(Value)
             if IsByteArray:
-                FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, Flag + ' ' + PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', '{'))
+                FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, Flag +
+                          ' ' + PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', '{'))
                 for Array in ArrayList:
                     FileWrite(File, Array)
             else:
@@ -1295,11 +1405,13 @@ class PcdReport(object):
                         Value = '{} ({:d})'.format(Value, int(Value, 0))
                     else:
                         Value = "0x{:X} ({})".format(int(Value, 0), Value)
-                FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, Flag + ' ' + PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', Value))
+                FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, Flag +
+                          ' ' + PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', Value))
             if IsStructure:
                 FiledOverrideFlag = False
-                if (Pcd.TokenCName,Pcd.TokenSpaceGuidCName) in GlobalData.gPcdSkuOverrides:
-                    OverrideValues = GlobalData.gPcdSkuOverrides[(Pcd.TokenCName,Pcd.TokenSpaceGuidCName)]
+                if (Pcd.TokenCName, Pcd.TokenSpaceGuidCName) in GlobalData.gPcdSkuOverrides:
+                    OverrideValues = GlobalData.gPcdSkuOverrides[(
+                        Pcd.TokenCName, Pcd.TokenSpaceGuidCName)]
                 else:
                     OverrideValues = Pcd.SkuOverrideValues
                 FieldOverrideValues = None
@@ -1310,16 +1422,19 @@ class PcdReport(object):
                             FieldOverrideValues = Struct[0]
                             FiledOverrideFlag = True
                             break
-                if Pcd.PcdFiledValueFromDscComponent and ModuleGuid and ModuleGuid.replace("-","S") in Pcd.PcdFiledValueFromDscComponent:
-                    FieldOverrideValues = Pcd.PcdFiledValueFromDscComponent[ModuleGuid.replace("-","S")]
+                if Pcd.PcdFiledValueFromDscComponent and ModuleGuid and ModuleGuid.replace("-", "S") in Pcd.PcdFiledValueFromDscComponent:
+                    FieldOverrideValues = Pcd.PcdFiledValueFromDscComponent[ModuleGuid.replace(
+                        "-", "S")]
                 if FieldOverrideValues:
-                    OverrideFieldStruct = self.OverrideFieldValue(Pcd, FieldOverrideValues)
+                    OverrideFieldStruct = self.OverrideFieldValue(
+                        Pcd, FieldOverrideValues)
                     self.PrintStructureInfo(File, OverrideFieldStruct)
 
                 if not FiledOverrideFlag and (Pcd.PcdFieldValueFromComm or Pcd.PcdFieldValueFromFdf):
                     OverrideFieldStruct = self.OverrideFieldValue(Pcd, {})
                     self.PrintStructureInfo(File, OverrideFieldStruct)
-            self.PrintPcdDefault(File, Pcd, IsStructure, DscMatch, DscDefaultValue, InfMatch, InfDefaultValue, DecMatch, DecDefaultValue)
+            self.PrintPcdDefault(File, Pcd, IsStructure, DscMatch, DscDefaultValue,
+                                 InfMatch, InfDefaultValue, DecMatch, DecDefaultValue)
         else:
             FirstPrint = True
             SkuList = sorted(Pcd.SkuInfoList.keys())
@@ -1328,7 +1443,8 @@ class PcdReport(object):
                 SkuIdName = SkuInfo.SkuIdName
                 if TypeName in ('DYNHII', 'DEXHII'):
                     if SkuInfo.DefaultStoreDict:
-                        DefaultStoreList = sorted(SkuInfo.DefaultStoreDict.keys())
+                        DefaultStoreList = sorted(
+                            SkuInfo.DefaultStoreDict.keys())
                         for DefaultStore in DefaultStoreList:
                             Value = SkuInfo.DefaultStoreDict[DefaultStore]
                             IsByteArray, ArrayList = ByteArrayForamt(Value)
@@ -1338,62 +1454,86 @@ class PcdReport(object):
                                 FirstPrint = False
                                 if IsByteArray:
                                     if self.DefaultStoreSingle and self.SkuSingle:
-                                        FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, Flag + ' ' + PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', '{'))
+                                        FileWrite(File, ' %-*s   : %6s %10s = %s' % (
+                                            self.MaxLen, Flag + ' ' + PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', '{'))
                                     elif self.DefaultStoreSingle and not self.SkuSingle:
-                                        FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, Flag + ' ' + PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', '{'))
+                                        FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, Flag + ' ' +
+                                                  PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', '{'))
                                     elif not self.DefaultStoreSingle and self.SkuSingle:
-                                        FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, Flag + ' ' + PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', '(' + DefaultStore + ')', '{'))
+                                        FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, Flag + ' ' +
+                                                  PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', '(' + DefaultStore + ')', '{'))
                                     else:
-                                        FileWrite(File, ' %-*s   : %6s %10s %10s %10s = %s' % (self.MaxLen, Flag + ' ' + PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', '(' + DefaultStore + ')', '{'))
+                                        FileWrite(File, ' %-*s   : %6s %10s %10s %10s = %s' % (self.MaxLen, Flag + ' ' + PcdTokenCName,
+                                                  TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', '(' + DefaultStore + ')', '{'))
                                     for Array in ArrayList:
                                         FileWrite(File, Array)
                                 else:
                                     if Pcd.DatumType in TAB_PCD_CLEAN_NUMERIC_TYPES:
                                         if Value.startswith(('0x', '0X')):
-                                            Value = '{} ({:d})'.format(Value, int(Value, 0))
+                                            Value = '{} ({:d})'.format(
+                                                Value, int(Value, 0))
                                         else:
-                                            Value = "0x{:X} ({})".format(int(Value, 0), Value)
+                                            Value = "0x{:X} ({})".format(
+                                                int(Value, 0), Value)
                                     if self.DefaultStoreSingle and self.SkuSingle:
-                                        FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, Flag + ' ' + PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', Value))
+                                        FileWrite(File, ' %-*s   : %6s %10s = %s' % (
+                                            self.MaxLen, Flag + ' ' + PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', Value))
                                     elif self.DefaultStoreSingle and not self.SkuSingle:
-                                        FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, Flag + ' ' + PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', Value))
+                                        FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, Flag + ' ' +
+                                                  PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', Value))
                                     elif not self.DefaultStoreSingle and self.SkuSingle:
-                                        FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, Flag + ' ' + PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', '(' + DefaultStore + ')', Value))
+                                        FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, Flag + ' ' +
+                                                  PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', '(' + DefaultStore + ')', Value))
                                     else:
-                                        FileWrite(File, ' %-*s   : %6s %10s %10s %10s = %s' % (self.MaxLen, Flag + ' ' + PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', '(' + DefaultStore + ')', Value))
+                                        FileWrite(File, ' %-*s   : %6s %10s %10s %10s = %s' % (self.MaxLen, Flag + ' ' + PcdTokenCName,
+                                                  TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', '(' + DefaultStore + ')', Value))
                             else:
                                 if IsByteArray:
                                     if self.DefaultStoreSingle and self.SkuSingle:
-                                        FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', '{'))
+                                        FileWrite(File, ' %-*s   : %6s %10s = %s' % (
+                                            self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', '{'))
                                     elif self.DefaultStoreSingle and not self.SkuSingle:
-                                        FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', '{'))
+                                        FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (
+                                            self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', '{'))
                                     elif not self.DefaultStoreSingle and self.SkuSingle:
-                                        FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', '(' + DefaultStore + ')', '{'))
+                                        FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (
+                                            self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', '(' + DefaultStore + ')', '{'))
                                     else:
-                                        FileWrite(File, ' %-*s   : %6s %10s %10s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', '(' + DefaultStore + ')', '{'))
+                                        FileWrite(File, ' %-*s   : %6s %10s %10s %10s = %s' % (self.MaxLen, ' ', TypeName,
+                                                  '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', '(' + DefaultStore + ')', '{'))
                                     for Array in ArrayList:
                                         FileWrite(File, Array)
                                 else:
                                     if Pcd.DatumType in TAB_PCD_CLEAN_NUMERIC_TYPES:
                                         if Value.startswith(('0x', '0X')):
-                                            Value = '{} ({:d})'.format(Value, int(Value, 0))
+                                            Value = '{} ({:d})'.format(
+                                                Value, int(Value, 0))
                                         else:
-                                            Value = "0x{:X} ({})".format(int(Value, 0), Value)
+                                            Value = "0x{:X} ({})".format(
+                                                int(Value, 0), Value)
                                     if self.DefaultStoreSingle and self.SkuSingle:
-                                        FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')',  Value))
+                                        FileWrite(File, ' %-*s   : %6s %10s = %s' % (
+                                            self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')',  Value))
                                     elif self.DefaultStoreSingle and not self.SkuSingle:
-                                        FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', Value))
+                                        FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (
+                                            self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', Value))
                                     elif not self.DefaultStoreSingle and self.SkuSingle:
-                                        FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', '(' + DefaultStore + ')', Value))
+                                        FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (
+                                            self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', '(' + DefaultStore + ')', Value))
                                     else:
-                                        FileWrite(File, ' %-*s   : %6s %10s %10s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', '(' + DefaultStore + ')', Value))
-                            FileWrite(File, '%*s: %s: %s' % (self.MaxLen + 4, SkuInfo.VariableGuid, SkuInfo.VariableName, SkuInfo.VariableOffset))
+                                        FileWrite(File, ' %-*s   : %6s %10s %10s %10s = %s' % (self.MaxLen, ' ', TypeName,
+                                                  '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', '(' + DefaultStore + ')', Value))
+                            FileWrite(File, '%*s: %s: %s' % (self.MaxLen + 4, SkuInfo.VariableGuid,
+                                      SkuInfo.VariableName, SkuInfo.VariableOffset))
                             if IsStructure:
                                 OverrideValues = Pcd.SkuOverrideValues.get(Sku)
                                 if OverrideValues:
-                                    OverrideFieldStruct = self.OverrideFieldValue(Pcd, OverrideValues[DefaultStore])
-                                    self.PrintStructureInfo(File, OverrideFieldStruct)
-                            self.PrintPcdDefault(File, Pcd, IsStructure, DscMatch, DscDefaultValue, InfMatch, InfDefaultValue, DecMatch, DecDefaultValue)
+                                    OverrideFieldStruct = self.OverrideFieldValue(
+                                        Pcd, OverrideValues[DefaultStore])
+                                    self.PrintStructureInfo(
+                                        File, OverrideFieldStruct)
+                            self.PrintPcdDefault(
+                                File, Pcd, IsStructure, DscMatch, DscDefaultValue, InfMatch, InfDefaultValue, DecMatch, DecDefaultValue)
                 else:
                     Value = SkuInfo.DefaultValue
                     IsByteArray, ArrayList = ByteArrayForamt(Value)
@@ -1403,44 +1543,59 @@ class PcdReport(object):
                         FirstPrint = False
                         if IsByteArray:
                             if self.SkuSingle:
-                                FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, Flag + ' ' + PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', "{"))
+                                FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, Flag +
+                                          ' ' + PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', "{"))
                             else:
-                                FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, Flag + ' ' + PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', "{"))
+                                FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, Flag + ' ' +
+                                          PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', "{"))
                             for Array in ArrayList:
                                 FileWrite(File, Array)
                         else:
                             if Pcd.DatumType in TAB_PCD_CLEAN_NUMERIC_TYPES:
                                 if Value.startswith(('0x', '0X')):
-                                    Value = '{} ({:d})'.format(Value, int(Value, 0))
+                                    Value = '{} ({:d})'.format(
+                                        Value, int(Value, 0))
                                 else:
-                                    Value = "0x{:X} ({})".format(int(Value, 0), Value)
+                                    Value = "0x{:X} ({})".format(
+                                        int(Value, 0), Value)
                             if self.SkuSingle:
-                                FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, Flag + ' ' + PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', Value))
+                                FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, Flag +
+                                          ' ' + PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', Value))
                             else:
-                                FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, Flag + ' ' + PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', Value))
+                                FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, Flag + ' ' +
+                                          PcdTokenCName, TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', Value))
                     else:
                         if IsByteArray:
                             if self.SkuSingle:
-                                FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', "{"))
+                                FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen,
+                                          ' ', TypeName, '(' + Pcd.DatumType + ')', "{"))
                             else:
-                                FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', "{"))
+                                FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, ' ',
+                                          TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', "{"))
                             for Array in ArrayList:
                                 FileWrite(File, Array)
                         else:
                             if Pcd.DatumType in TAB_PCD_CLEAN_NUMERIC_TYPES:
                                 if Value.startswith(('0x', '0X')):
-                                    Value = '{} ({:d})'.format(Value, int(Value, 0))
+                                    Value = '{} ({:d})'.format(
+                                        Value, int(Value, 0))
                                 else:
-                                    Value = "0x{:X} ({})".format(int(Value, 0), Value)
+                                    Value = "0x{:X} ({})".format(
+                                        int(Value, 0), Value)
                             if self.SkuSingle:
-                                FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', Value))
+                                FileWrite(File, ' %-*s   : %6s %10s = %s' % (self.MaxLen,
+                                          ' ', TypeName, '(' + Pcd.DatumType + ')', Value))
                             else:
-                                FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, ' ', TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', Value))
+                                FileWrite(File, ' %-*s   : %6s %10s %10s = %s' % (self.MaxLen, ' ',
+                                          TypeName, '(' + Pcd.DatumType + ')', '(' + SkuIdName + ')', Value))
                     if TypeName in ('DYNVPD', 'DEXVPD'):
-                        FileWrite(File, '%*s' % (self.MaxLen + 4, SkuInfo.VpdOffset))
-                        VPDPcdItem = (Pcd.TokenSpaceGuidCName + '.' + PcdTokenCName, SkuIdName, SkuInfo.VpdOffset, Pcd.MaxDatumSize, SkuInfo.DefaultValue)
+                        FileWrite(File, '%*s' %
+                                  (self.MaxLen + 4, SkuInfo.VpdOffset))
+                        VPDPcdItem = (Pcd.TokenSpaceGuidCName + '.' + PcdTokenCName, SkuIdName,
+                                      SkuInfo.VpdOffset, Pcd.MaxDatumSize, SkuInfo.DefaultValue)
                         if VPDPcdItem not in VPDPcdList:
-                            PcdGuidList = self.UnusedPcds.get(Pcd.TokenSpaceGuidCName)
+                            PcdGuidList = self.UnusedPcds.get(
+                                Pcd.TokenSpaceGuidCName)
                             if PcdGuidList:
                                 PcdList = PcdGuidList.get(Pcd.Type)
                                 if not PcdList:
@@ -1455,19 +1610,22 @@ class PcdReport(object):
                         OverrideValues = Pcd.SkuOverrideValues.get(Sku)
                         if OverrideValues:
                             Keys = list(OverrideValues.keys())
-                            OverrideFieldStruct = self.OverrideFieldValue(Pcd, OverrideValues[Keys[0]])
+                            OverrideFieldStruct = self.OverrideFieldValue(
+                                Pcd, OverrideValues[Keys[0]])
                             self.PrintStructureInfo(File, OverrideFieldStruct)
                             FiledOverrideFlag = True
                         if not FiledOverrideFlag and (Pcd.PcdFieldValueFromComm or Pcd.PcdFieldValueFromFdf):
-                            OverrideFieldStruct = self.OverrideFieldValue(Pcd, {})
+                            OverrideFieldStruct = self.OverrideFieldValue(
+                                Pcd, {})
                             self.PrintStructureInfo(File, OverrideFieldStruct)
-                    self.PrintPcdDefault(File, Pcd, IsStructure, DscMatch, DscDefaultValue, InfMatch, InfDefaultValue, DecMatch, DecDefaultValue)
+                    self.PrintPcdDefault(File, Pcd, IsStructure, DscMatch, DscDefaultValue,
+                                         InfMatch, InfDefaultValue, DecMatch, DecDefaultValue)
 
     def OverrideFieldValue(self, Pcd, OverrideStruct):
         OverrideFieldStruct = collections.OrderedDict()
         if OverrideStruct:
             for _, Values in OverrideStruct.items():
-                for Key,value in Values.items():
+                for Key, value in Values.items():
                     if value[1] and value[1].endswith('.dsc'):
                         OverrideFieldStruct[Key] = value
         if Pcd.PcdFieldValueFromFdf:
@@ -1485,11 +1643,14 @@ class PcdReport(object):
     def PrintStructureInfo(self, File, Struct):
         for Key, Value in sorted(Struct.items(), key=lambda x: x[0]):
             if Value[1] and 'build command options' in Value[1]:
-                FileWrite(File, '    *B  %-*s = %s' % (self.MaxLen + 4, '.' + Key, Value[0]))
+                FileWrite(File, '    *B  %-*s = %s' %
+                          (self.MaxLen + 4, '.' + Key, Value[0]))
             elif Value[1] and Value[1].endswith('.fdf'):
-                FileWrite(File, '    *F  %-*s = %s' % (self.MaxLen + 4, '.' + Key, Value[0]))
+                FileWrite(File, '    *F  %-*s = %s' %
+                          (self.MaxLen + 4, '.' + Key, Value[0]))
             else:
-                FileWrite(File, '        %-*s = %s' % (self.MaxLen + 4, '.' + Key, Value[0]))
+                FileWrite(File, '        %-*s = %s' %
+                          (self.MaxLen + 4, '.' + Key, Value[0]))
 
     def StrtoHex(self, value):
         try:
@@ -1533,6 +1694,8 @@ class PcdReport(object):
 # This class reports the platform execution order prediction section and
 # module load fixed address prediction subsection in the build report file.
 #
+
+
 class PredictionReport(object):
     ##
     # Constructor function for class PredictionReport
@@ -1575,24 +1738,28 @@ class PredictionReport(object):
                 for Source in Module.SourceFileList:
                     if os.path.splitext(str(Source))[1].lower() == ".c":
                         self._SourceList.append("  " + str(Source))
-                        FindIncludeFiles(Source.Path, Module.IncludePathList, IncludeList)
+                        FindIncludeFiles(
+                            Source.Path, Module.IncludePathList, IncludeList)
                 for IncludeFile in IncludeList.values():
                     self._SourceList.append("  " + IncludeFile)
 
                 for Guid in Module.PpiList:
-                    self._GuidMap[Guid] = GuidStructureStringToGuidString(Module.PpiList[Guid])
+                    self._GuidMap[Guid] = GuidStructureStringToGuidString(
+                        Module.PpiList[Guid])
                 for Guid in Module.ProtocolList:
-                    self._GuidMap[Guid] = GuidStructureStringToGuidString(Module.ProtocolList[Guid])
+                    self._GuidMap[Guid] = GuidStructureStringToGuidString(
+                        Module.ProtocolList[Guid])
                 for Guid in Module.GuidList:
-                    self._GuidMap[Guid] = GuidStructureStringToGuidString(Module.GuidList[Guid])
+                    self._GuidMap[Guid] = GuidStructureStringToGuidString(
+                        Module.GuidList[Guid])
 
                 if Module.Guid and not Module.IsLibrary:
                     EntryPoint = " ".join(Module.Module.ModuleEntryPointList)
 
                     RealEntryPoint = "_ModuleEntryPoint"
 
-                    self._FfsEntryPoint[Module.Guid.upper()] = (EntryPoint, RealEntryPoint)
-
+                    self._FfsEntryPoint[Module.Guid.upper()] = (
+                        EntryPoint, RealEntryPoint)
 
         #
         # Collect platform firmware volume list as the input of EOT.
@@ -1617,7 +1784,6 @@ class PredictionReport(object):
                                 except AttributeError:
                                     pass
 
-
     ##
     # Parse platform fixed address map files
     #
@@ -1627,6 +1793,7 @@ class PredictionReport(object):
     #
     # @param self:           The object pointer
     #
+
     def _ParseMapFile(self):
         if self._MapFileParsed:
             return
@@ -1643,7 +1810,8 @@ class PredictionReport(object):
                     List.append((AddressType, BaseAddress, "*I"))
                     List.append((AddressType, EntryPoint, "*E"))
             except:
-                EdkLogger.warn(None, "Cannot open file to read", self._MapFileName)
+                EdkLogger.warn(None, "Cannot open file to read",
+                               self._MapFileName)
 
     ##
     # Invokes EOT tool to get the predicted the execution order.
@@ -1693,7 +1861,8 @@ class PredictionReport(object):
             Eot(CommandLineOption=False, SourceFileList=SourceList, GuidList=GuidList,
                 FvFileList=' '.join(FvFileList), Dispatch=DispatchList, IsInit=True)
             EotEndTime = time.time()
-            EotDuration = time.strftime("%H:%M:%S", time.gmtime(int(round(EotEndTime - EotStartTime))))
+            EotDuration = time.strftime("%H:%M:%S", time.gmtime(
+                int(round(EotEndTime - EotStartTime))))
             EdkLogger.quiet("EOT run time: %s\n" % EotDuration)
 
             #
@@ -1708,9 +1877,10 @@ class PredictionReport(object):
                     self.MaxLen = len(Symbol)
                 self.ItemList.append((Phase, Symbol, FilePath))
         except:
-            EdkLogger.quiet("(Python %s on %s\n%s)" % (platform.python_version(), sys.platform, traceback.format_exc()))
-            EdkLogger.warn(None, "Failed to generate execution order prediction report, for some error occurred in executing EOT.")
-
+            EdkLogger.quiet("(Python %s on %s\n%s)" % (
+                platform.python_version(), sys.platform, traceback.format_exc()))
+            EdkLogger.warn(
+                None, "Failed to generate execution order prediction report, for some error occurred in executing EOT.")
 
     ##
     # Generate platform execution order report
@@ -1720,6 +1890,7 @@ class PredictionReport(object):
     # @param self            The object pointer
     # @param File            The file object for report
     #
+
     def _GenerateExecutionOrderReport(self, File):
         self._InvokeEotTool()
         if len(self.ItemList) == 0:
@@ -1731,10 +1902,12 @@ class PredictionReport(object):
         FileWrite(File, "*E Module INF entry point name")
         FileWrite(File, "*N Module notification function name")
 
-        FileWrite(File, "Type %-*s %s" % (self.MaxLen, "Symbol", "Module INF Path"))
+        FileWrite(File, "Type %-*s %s" %
+                  (self.MaxLen, "Symbol", "Module INF Path"))
         FileWrite(File, gSectionSep)
         for Item in self.ItemList:
-            FileWrite(File, "*%sE  %-*s %s" % (Item[0], self.MaxLen, Item[1], Item[2]))
+            FileWrite(File, "*%sE  %-*s %s" %
+                      (Item[0], self.MaxLen, Item[1], Item[2]))
 
         FileWrite(File, gSectionStart)
 
@@ -1774,7 +1947,8 @@ class PredictionReport(object):
             if Symbol == "*I":
                 Name = "(Image Base)"
             elif Symbol == "*E":
-                Name = self._FfsEntryPoint.get(Guid, ["", "_ModuleEntryPoint"])[1]
+                Name = self._FfsEntryPoint.get(
+                    Guid, ["", "_ModuleEntryPoint"])[1]
             elif Symbol in NotifyList:
                 Name = Symbol
                 Symbol = "*N"
@@ -1823,6 +1997,8 @@ class PredictionReport(object):
 # If there are nesting FVs, the nested FVs will list immediate after
 # this FD region subsection
 #
+
+
 class FdRegionReport(object):
     ##
     # Discover all the nested FV name list.
@@ -1836,7 +2012,7 @@ class FdRegionReport(object):
     # @param Wa              Workspace context information
     #
     def _DiscoverNestedFvList(self, FvName, Wa):
-        FvDictKey=FvName.upper()
+        FvDictKey = FvName.upper()
         if FvDictKey in Wa.FdfProfile.FvDict:
             for Ffs in Wa.FdfProfile.FvDict[FvName.upper()].FfsList:
                 for Section in Ffs.SectionList:
@@ -1844,7 +2020,8 @@ class FdRegionReport(object):
                         for FvSection in Section.SectionList:
                             if FvSection.FvName in self.FvList:
                                 continue
-                            self._GuidsDb[Ffs.NameGuid.upper()] = FvSection.FvName
+                            self._GuidsDb[Ffs.NameGuid.upper()
+                                          ] = FvSection.FvName
                             self.FvList.append(FvSection.FvName)
                             self.FvInfo[FvSection.FvName] = ("Nested FV", 0, 0)
                             self._DiscoverNestedFvList(FvSection.FvName, Wa)
@@ -1897,15 +2074,19 @@ class FdRegionReport(object):
         for Pa in Wa.AutoGenObjectList:
             for Package in Pa.PackageList:
                 for (TokenCName, TokenSpaceGuidCName, DecType) in Package.Pcds:
-                    DecDefaultValue = Package.Pcds[TokenCName, TokenSpaceGuidCName, DecType].DefaultValue
-                    PlatformPcds[(TokenCName, TokenSpaceGuidCName)] = DecDefaultValue
+                    DecDefaultValue = Package.Pcds[TokenCName,
+                                                   TokenSpaceGuidCName, DecType].DefaultValue
+                    PlatformPcds[(TokenCName, TokenSpaceGuidCName)
+                                 ] = DecDefaultValue
         #
         # Collect PCDs defined in DSC file
         #
         for Pa in Wa.AutoGenObjectList:
             for (TokenCName, TokenSpaceGuidCName) in Pa.Platform.Pcds:
-                DscDefaultValue = Pa.Platform.Pcds[(TokenCName, TokenSpaceGuidCName)].DefaultValue
-                PlatformPcds[(TokenCName, TokenSpaceGuidCName)] = DscDefaultValue
+                DscDefaultValue = Pa.Platform.Pcds[(
+                    TokenCName, TokenSpaceGuidCName)].DefaultValue
+                PlatformPcds[(TokenCName, TokenSpaceGuidCName)
+                             ] = DscDefaultValue
 
         #
         # Add PEI and DXE a priori files GUIDs defined in PI specification.
@@ -1921,13 +2102,14 @@ class FdRegionReport(object):
             for ModuleKey in Pa.Platform.Modules:
                 M = Pa.Platform.Modules[ModuleKey].M
                 InfPath = mws.join(Wa.WorkspaceDir, M.MetaFile.File)
-                self._GuidsDb[M.Guid.upper()] = "%s (%s)" % (M.Module.BaseName, InfPath)
+                self._GuidsDb[M.Guid.upper()] = "%s (%s)" % (
+                    M.Module.BaseName, InfPath)
 
         #
         # Collect the GUID map in the FV firmware volume
         #
         for FvName in self.FvList:
-            FvDictKey=FvName.upper()
+            FvDictKey = FvName.upper()
             if FvDictKey in Wa.FdfProfile.FvDict:
                 for Ffs in Wa.FdfProfile.FvDict[FvName.upper()].FfsList:
                     try:
@@ -1940,18 +2122,20 @@ class FdRegionReport(object):
                             PcdTokenspace = Match.group(1)
                             PcdToken = Match.group(2)
                             if (PcdToken, PcdTokenspace) in PlatformPcds:
-                                GuidValue = PlatformPcds[(PcdToken, PcdTokenspace)]
-                                Guid = GuidStructureByteArrayToGuidString(GuidValue).upper()
+                                GuidValue = PlatformPcds[(
+                                    PcdToken, PcdTokenspace)]
+                                Guid = GuidStructureByteArrayToGuidString(
+                                    GuidValue).upper()
                         for Section in Ffs.SectionList:
                             try:
-                                ModuleSectFile = mws.join(Wa.WorkspaceDir, Section.SectFileName)
+                                ModuleSectFile = mws.join(
+                                    Wa.WorkspaceDir, Section.SectFileName)
                                 self._GuidsDb[Guid] = ModuleSectFile
                             except AttributeError:
                                 pass
                     except AttributeError:
                         pass
 
-
     ##
     # Internal worker function to generate report for the FD region
     #
@@ -1965,6 +2149,7 @@ class FdRegionReport(object):
     # @param Size            The size of the FD region
     # @param FvName          The FV name if the FD region is a firmware volume
     #
+
     def _GenerateReport(self, File, Title, Type, BaseAddress, Size=0, FvName=None):
         FileWrite(File, gSubSectionStart)
         FileWrite(File, Title)
@@ -1974,7 +2159,7 @@ class FdRegionReport(object):
         if self.Type == BINARY_FILE_TYPE_FV:
             FvTotalSize = 0
             FvTakenSize = 0
-            FvFreeSize  = 0
+            FvFreeSize = 0
             if FvName.upper().endswith('.FV'):
                 FileExt = FvName + ".txt"
             else:
@@ -1999,10 +2184,14 @@ class FdRegionReport(object):
                 #
                 # Write size information to the report file.
                 #
-                FileWrite(File, "Size:               0x%X (%.0fK)" % (FvTotalSize, FvTotalSize / 1024.0))
-                FileWrite(File, "Fv Name:            %s (%.1f%% Full)" % (FvName, FvTakenSize * 100.0 / FvTotalSize))
-                FileWrite(File, "Occupied Size:      0x%X (%.0fK)" % (FvTakenSize, FvTakenSize / 1024.0))
-                FileWrite(File, "Free Size:          0x%X (%.0fK)" % (FvFreeSize, FvFreeSize / 1024.0))
+                FileWrite(File, "Size:               0x%X (%.0fK)" %
+                          (FvTotalSize, FvTotalSize / 1024.0))
+                FileWrite(File, "Fv Name:            %s (%.1f%% Full)" %
+                          (FvName, FvTakenSize * 100.0 / FvTotalSize))
+                FileWrite(File, "Occupied Size:      0x%X (%.0fK)" %
+                          (FvTakenSize, FvTakenSize / 1024.0))
+                FileWrite(File, "Free Size:          0x%X (%.0fK)" %
+                          (FvFreeSize, FvFreeSize / 1024.0))
                 FileWrite(File, "Offset     Module")
                 FileWrite(File, gSubSectionSep)
                 #
@@ -2014,11 +2203,13 @@ class FdRegionReport(object):
                     OffsetInfo[Match.group(1)] = self._GuidsDb.get(Guid, Guid)
                 OffsetList = sorted(OffsetInfo.keys())
                 for Offset in OffsetList:
-                    FileWrite (File, "%s %s" % (Offset, OffsetInfo[Offset]))
+                    FileWrite(File, "%s %s" % (Offset, OffsetInfo[Offset]))
             except IOError:
-                EdkLogger.warn(None, "Fail to read report file", FvReportFileName)
+                EdkLogger.warn(None, "Fail to read report file",
+                               FvReportFileName)
         else:
-            FileWrite(File, "Size:               0x%X (%.0fK)" % (Size, Size / 1024.0))
+            FileWrite(File, "Size:               0x%X (%.0fK)" %
+                      (Size, Size / 1024.0))
         FileWrite(File, gSubSectionEnd)
 
     ##
@@ -2033,9 +2224,11 @@ class FdRegionReport(object):
         if (len(self.FvList) > 0):
             for FvItem in self.FvList:
                 Info = self.FvInfo[FvItem]
-                self._GenerateReport(File, Info[0], TAB_FV_DIRECTORY, Info[1], Info[2], FvItem)
+                self._GenerateReport(
+                    File, Info[0], TAB_FV_DIRECTORY, Info[1], Info[2], FvItem)
         else:
-            self._GenerateReport(File, "FD Region", self.Type, self.BaseAddress, self.Size)
+            self._GenerateReport(File, "FD Region", self.Type,
+                                 self.BaseAddress, self.Size)
 
 ##
 # Reports FD information
@@ -2043,6 +2236,8 @@ class FdRegionReport(object):
 # This class reports the FD section in the build report file.
 # It collects flash device information for a platform.
 #
+
+
 class FdReport(object):
     ##
     # Constructor function for class FdReport
@@ -2058,7 +2253,8 @@ class FdReport(object):
         self.FdName = Fd.FdUiName
         self.BaseAddress = Fd.BaseAddress
         self.Size = Fd.Size
-        self.FdRegionList = [FdRegionReport(FdRegion, Wa) for FdRegion in Fd.RegionList]
+        self.FdRegionList = [FdRegionReport(
+            FdRegion, Wa) for FdRegion in Fd.RegionList]
         self.FvPath = os.path.join(Wa.BuildDir, TAB_FV_DIRECTORY)
         self.VPDBaseAddress = 0
         self.VPDSize = 0
@@ -2081,7 +2277,8 @@ class FdReport(object):
         FileWrite(File, "Firmware Device (FD)")
         FileWrite(File, "FD Name:            %s" % self.FdName)
         FileWrite(File, "Base Address:       %s" % self.BaseAddress)
-        FileWrite(File, "Size:               0x%X (%.0fK)" % (self.Size, self.Size / 1024.0))
+        FileWrite(File, "Size:               0x%X (%.0fK)" %
+                  (self.Size, self.Size / 1024.0))
         if len(self.FdRegionList) > 0:
             FileWrite(File, gSectionSep)
             for FdRegionItem in self.FdRegionList:
@@ -2092,26 +2289,28 @@ class FdReport(object):
             FileWrite(File, gSubSectionStart)
             FileWrite(File, "FD VPD Region")
             FileWrite(File, "Base Address:       0x%X" % self.VPDBaseAddress)
-            FileWrite(File, "Size:               0x%X (%.0fK)" % (self.VPDSize, self.VPDSize / 1024.0))
+            FileWrite(File, "Size:               0x%X (%.0fK)" %
+                      (self.VPDSize, self.VPDSize / 1024.0))
             FileWrite(File, gSubSectionSep)
             for item in VPDPcdList:
                 # Add BaseAddress for offset
                 Offset = '0x%08X' % (int(item[2], 16) + self.VPDBaseAddress)
                 IsByteArray, ArrayList = ByteArrayForamt(item[-1])
                 Skuinfo = item[1]
-                if len(GlobalData.gSkuids) == 1 :
+                if len(GlobalData.gSkuids) == 1:
                     Skuinfo = GlobalData.gSkuids[0]
                 if IsByteArray:
-                    FileWrite(File, "%s | %s | %s | %s | %s" % (item[0], Skuinfo, Offset, item[3], '{'))
+                    FileWrite(File, "%s | %s | %s | %s | %s" %
+                              (item[0], Skuinfo, Offset, item[3], '{'))
                     for Array in ArrayList:
                         FileWrite(File, Array)
                 else:
-                    FileWrite(File, "%s | %s | %s | %s | %s" % (item[0], Skuinfo, Offset, item[3], item[-1]))
+                    FileWrite(File, "%s | %s | %s | %s | %s" %
+                              (item[0], Skuinfo, Offset, item[3], item[-1]))
             FileWrite(File, gSubSectionEnd)
         FileWrite(File, gSectionEnd)
 
 
-
 ##
 # Reports platform information
 #
@@ -2146,7 +2345,8 @@ class PlatformReport(object):
         self.FdReportList = []
         if "FLASH" in ReportType and Wa.FdfProfile and MaList is None:
             for Fd in Wa.FdfProfile.FdDict:
-                self.FdReportList.append(FdReport(Wa.FdfProfile.FdDict[Fd], Wa))
+                self.FdReportList.append(
+                    FdReport(Wa.FdfProfile.FdDict[Fd], Wa))
 
         self.PredictionReport = None
         if "FIXED_ADDRESS" in ReportType or "EXECUTION_ORDER" in ReportType:
@@ -2171,16 +2371,17 @@ class PlatformReport(object):
                     if Pa.Arch in GlobalData.gFdfParser.Profile.InfDict:
                         INFList = GlobalData.gFdfParser.Profile.InfDict[Pa.Arch]
                         for InfName in INFList:
-                            InfClass = PathClass(NormPath(InfName), Wa.WorkspaceDir, Pa.Arch)
-                            Ma = ModuleAutoGen(Wa, InfClass, Pa.BuildTarget, Pa.ToolChain, Pa.Arch, Wa.MetaFile, Pa.DataPipe)
+                            InfClass = PathClass(
+                                NormPath(InfName), Wa.WorkspaceDir, Pa.Arch)
+                            Ma = ModuleAutoGen(
+                                Wa, InfClass, Pa.BuildTarget, Pa.ToolChain, Pa.Arch, Wa.MetaFile, Pa.DataPipe)
                             if Ma is None:
                                 continue
                             if Ma not in ModuleAutoGenList:
                                 ModuleAutoGenList.append(Ma)
                 for MGen in ModuleAutoGenList:
-                    self.ModuleReportList.append(ModuleReport(MGen, ReportType))
-
-
+                    self.ModuleReportList.append(
+                        ModuleReport(MGen, ReportType))
 
     ##
     # Generate report for the whole platform.
@@ -2197,6 +2398,7 @@ class PlatformReport(object):
     # @param GenFdsTime      The total time of GenFds Phase
     # @param ReportType      The kind of report items in the final report file
     #
+
     def GenerateReport(self, File, BuildDuration, AutoGenTime, MakeTime, GenFdsTime, ReportType):
         FileWrite(File, "Platform Summary")
         FileWrite(File, "Platform Name:        %s" % self.PlatformName)
@@ -2205,9 +2407,11 @@ class PlatformReport(object):
         FileWrite(File, "Tool Chain:           %s" % self.ToolChain)
         FileWrite(File, "Target:               %s" % self.Target)
         if GlobalData.gSkuids:
-            FileWrite(File, "SKUID:                %s" % " ".join(GlobalData.gSkuids))
+            FileWrite(File, "SKUID:                %s" %
+                      " ".join(GlobalData.gSkuids))
         if GlobalData.gDefaultStores:
-            FileWrite(File, "DefaultStore:         %s" % " ".join(GlobalData.gDefaultStores))
+            FileWrite(File, "DefaultStore:         %s" %
+                      " ".join(GlobalData.gDefaultStores))
         FileWrite(File, "Output Path:          %s" % self.OutputPath)
         FileWrite(File, "Build Environment:    %s" % self.BuildEnvironment)
         FileWrite(File, "Build Duration:       %s" % BuildDuration)
@@ -2236,17 +2440,20 @@ class PlatformReport(object):
                     FdReportListItem.GenerateReport(File)
 
         for ModuleReportItem in self.ModuleReportList:
-            ModuleReportItem.GenerateReport(File, self.PcdReport, self.PredictionReport, self.DepexParser, ReportType)
+            ModuleReportItem.GenerateReport(
+                File, self.PcdReport, self.PredictionReport, self.DepexParser, ReportType)
 
         if not self._IsModuleBuild:
             if "EXECUTION_ORDER" in ReportType:
                 self.PredictionReport.GenerateReport(File, None)
 
-## BuildReport class
+# BuildReport class
 #
 #  This base class contain the routines to collect data and then
 #  applies certain format to the output report
 #
+
+
 class BuildReport(object):
     ##
     # Constructor function for class BuildReport
@@ -2269,7 +2476,8 @@ class BuildReport(object):
                     if ReportTypeItem not in self.ReportType:
                         self.ReportType.append(ReportTypeItem)
             else:
-                self.ReportType = ["PCD", "LIBRARY", "BUILD_FLAGS", "DEPEX", "HASH", "FLASH", "FIXED_ADDRESS"]
+                self.ReportType = ["PCD", "LIBRARY", "BUILD_FLAGS",
+                                   "DEPEX", "HASH", "FLASH", "FIXED_ADDRESS"]
     ##
     # Adds platform report to the list
     #
@@ -2279,6 +2487,7 @@ class BuildReport(object):
     # @param Wa              Workspace context information
     # @param MaList          The list of modules in the platform build
     #
+
     def AddPlatformReport(self, Wa, MaList=None):
         if self.ReportFile:
             self.ReportList.append((Wa, MaList))
@@ -2300,17 +2509,22 @@ class BuildReport(object):
             try:
                 File = []
                 for (Wa, MaList) in self.ReportList:
-                    PlatformReport(Wa, MaList, self.ReportType).GenerateReport(File, BuildDuration, AutoGenTime, MakeTime, GenFdsTime, self.ReportType)
+                    PlatformReport(Wa, MaList, self.ReportType).GenerateReport(
+                        File, BuildDuration, AutoGenTime, MakeTime, GenFdsTime, self.ReportType)
                 Content = FileLinesSplit(''.join(File), gLineMaxLength)
                 SaveFileOnChange(self.ReportFile, Content, False)
-                EdkLogger.quiet("Build report can be found at %s" % os.path.abspath(self.ReportFile))
+                EdkLogger.quiet("Build report can be found at %s" %
+                                os.path.abspath(self.ReportFile))
             except IOError:
-                EdkLogger.error(None, FILE_WRITE_FAILURE, ExtraData=self.ReportFile)
+                EdkLogger.error(None, FILE_WRITE_FAILURE,
+                                ExtraData=self.ReportFile)
             except:
-                EdkLogger.error("BuildReport", CODE_ERROR, "Unknown fatal error when generating build report", ExtraData=self.ReportFile, RaiseError=False)
-                EdkLogger.quiet("(Python %s on %s\n%s)" % (platform.python_version(), sys.platform, traceback.format_exc()))
+                EdkLogger.error("BuildReport", CODE_ERROR, "Unknown fatal error when generating build report",
+                                ExtraData=self.ReportFile, RaiseError=False)
+                EdkLogger.quiet("(Python %s on %s\n%s)" % (
+                    platform.python_version(), sys.platform, traceback.format_exc()))
+
 
 # This acts like the main() function for the script, unless it is 'import'ed into another script.
 if __name__ == '__main__':
     pass
-
diff --git a/BaseTools/Source/Python/build/__init__.py b/BaseTools/Source/Python/build/__init__.py
index 41a3808ae9ee..e2c52cf372a2 100644
--- a/BaseTools/Source/Python/build/__init__.py
+++ b/BaseTools/Source/Python/build/__init__.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Python 'build' package initialization file.
 #
 # This file is required to make Python interpreter treat the directory
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 07187c03618a..00f8bf490dd3 100755
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # build a platform or a module
 #
 #  Copyright (c) 2014, Hewlett-Packard Development Company, L.P.<BR>
@@ -23,16 +23,16 @@ import time
 import platform
 import traceback
 import multiprocessing
-from threading import Thread,Event,BoundedSemaphore
+from threading import Thread, Event, BoundedSemaphore
 import threading
 from linecache import getlines
-from subprocess import Popen,PIPE, STDOUT
+from subprocess import Popen, PIPE, STDOUT
 from collections import OrderedDict, defaultdict
 
 from AutoGen.PlatformAutoGen import PlatformAutoGen
 from AutoGen.ModuleAutoGen import ModuleAutoGen
 from AutoGen.WorkspaceAutoGen import WorkspaceAutoGen
-from AutoGen.AutoGenWorker import AutoGenWorkerInProcess,AutoGenManager,\
+from AutoGen.AutoGenWorker import AutoGenWorkerInProcess, AutoGenManager,\
     LogAgent
 from AutoGen import GenMake
 from Common import Misc as Utils
@@ -40,7 +40,7 @@ from Common import Misc as Utils
 from Common.TargetTxtClassObject import TargetTxtDict
 from Common.ToolDefClassObject import ToolDefDict
 from buildoptions import MyOptionParser
-from Common.Misc import PathClass,SaveFileOnChange,RemoveDirectory
+from Common.Misc import PathClass, SaveFileOnChange, RemoveDirectory
 from Common.StringUtils import NormPath
 from Common.MultipleWorkspace import MultipleWorkspace as mws
 from Common.BuildToolError import *
@@ -50,7 +50,7 @@ import Common.EdkLogger as EdkLogger
 from Workspace.WorkspaceDatabase import BuildDB
 
 from BuildReport import BuildReport
-from GenPatchPcdTable.GenPatchPcdTable import PeImageClass,parsePcdInfoFromMapFile
+from GenPatchPcdTable.GenPatchPcdTable import PeImageClass, parsePcdInfoFromMapFile
 from PatchPcdValue.PatchPcdValue import PatchBinaryFile
 
 import Common.GlobalData as GlobalData
@@ -64,17 +64,20 @@ from AutoGen.IncludesAutoGen import IncludesAutoGen
 from GenFds.GenFds import resetFdsGlobalVariable
 from AutoGen.AutoGen import CalculatePriorityValue
 
-## standard targets of build command
-gSupportedTarget = ['all', 'genc', 'genmake', 'modules', 'libraries', 'fds', 'clean', 'cleanall', 'cleanlib', 'run']
+# standard targets of build command
+gSupportedTarget = ['all', 'genc', 'genmake', 'modules',
+                    'libraries', 'fds', 'clean', 'cleanall', 'cleanlib', 'run']
 
 TemporaryTablePattern = re.compile(r'^_\d+_\d+_[a-fA-F0-9]+$')
 TmpTableDict = {}
 
-## Check environment PATH variable to make sure the specified tool is found
+# Check environment PATH variable to make sure the specified tool is found
 #
 #   If the tool is found in the PATH, then True is returned
 #   Otherwise, False is returned
 #
+
+
 def IsToolInPath(tool):
     if 'PATHEXT' in os.environ:
         extns = os.environ['PATHEXT'].split(os.path.pathsep)
@@ -86,7 +89,7 @@ def IsToolInPath(tool):
                 return True
     return False
 
-## Check environment variables
+# Check environment variables
 #
 #  Check environment variables that must be set for build. Currently they are
 #
@@ -97,6 +100,8 @@ def IsToolInPath(tool):
 #   If any of above environment variable is not set or has error, the build
 #   will be broken.
 #
+
+
 def CheckEnvVariable():
     # check WORKSPACE
     if "WORKSPACE" not in os.environ:
@@ -105,7 +110,8 @@ def CheckEnvVariable():
 
     WorkspaceDir = os.path.normcase(os.path.normpath(os.environ["WORKSPACE"]))
     if not os.path.exists(WorkspaceDir):
-        EdkLogger.error("build", FILE_NOT_FOUND, "WORKSPACE doesn't exist", ExtraData=WorkspaceDir)
+        EdkLogger.error("build", FILE_NOT_FOUND,
+                        "WORKSPACE doesn't exist", ExtraData=WorkspaceDir)
     elif ' ' in WorkspaceDir:
         EdkLogger.error("build", FORMAT_NOT_SUPPORTED, "No space is allowed in WORKSPACE path",
                         ExtraData=WorkspaceDir)
@@ -117,12 +123,14 @@ def CheckEnvVariable():
     if mws.PACKAGES_PATH:
         for Path in mws.PACKAGES_PATH:
             if not os.path.exists(Path):
-                EdkLogger.error("build", FILE_NOT_FOUND, "One Path in PACKAGES_PATH doesn't exist", ExtraData=Path)
+                EdkLogger.error(
+                    "build", FILE_NOT_FOUND, "One Path in PACKAGES_PATH doesn't exist", ExtraData=Path)
             elif ' ' in Path:
-                EdkLogger.error("build", FORMAT_NOT_SUPPORTED, "No space is allowed in PACKAGES_PATH", ExtraData=Path)
+                EdkLogger.error("build", FORMAT_NOT_SUPPORTED,
+                                "No space is allowed in PACKAGES_PATH", ExtraData=Path)
 
-
-    os.environ["EDK_TOOLS_PATH"] = os.path.normcase(os.environ["EDK_TOOLS_PATH"])
+    os.environ["EDK_TOOLS_PATH"] = os.path.normcase(
+        os.environ["EDK_TOOLS_PATH"])
 
     # check EDK_TOOLS_PATH
     if "EDK_TOOLS_PATH" not in os.environ:
@@ -136,10 +144,10 @@ def CheckEnvVariable():
 
     GlobalData.gWorkspace = WorkspaceDir
 
-    GlobalData.gGlobalDefines["WORKSPACE"]  = WorkspaceDir
+    GlobalData.gGlobalDefines["WORKSPACE"] = WorkspaceDir
     GlobalData.gGlobalDefines["EDK_TOOLS_PATH"] = os.environ["EDK_TOOLS_PATH"]
 
-## Get normalized file path
+# Get normalized file path
 #
 # Convert the path to be local format, and remove the WORKSPACE path at the
 # beginning if the file path is given in full path.
@@ -149,6 +157,8 @@ def CheckEnvVariable():
 #
 # @retval string        The normalized file path
 #
+
+
 def NormFile(FilePath, Workspace):
     # check if the path is absolute or relative
     if os.path.isabs(FilePath):
@@ -159,7 +169,8 @@ def NormFile(FilePath, Workspace):
 
     # check if the file path exists or not
     if not os.path.isfile(FileFullPath):
-        EdkLogger.error("build", FILE_NOT_FOUND, ExtraData="\t%s (Please give file in absolute path or relative to WORKSPACE)" % FileFullPath)
+        EdkLogger.error("build", FILE_NOT_FOUND,
+                        ExtraData="\t%s (Please give file in absolute path or relative to WORKSPACE)" % FileFullPath)
 
     # remove workspace directory from the beginning part of the file path
     if Workspace[-1] in ["\\", "/"]:
@@ -167,7 +178,7 @@ def NormFile(FilePath, Workspace):
     else:
         return FileFullPath[(len(Workspace) + 1):]
 
-## Get the output of an external program
+# Get the output of an external program
 #
 # This is the entrance method of thread reading output of an external program and
 # putting them in STDOUT/STDERR of current program.
@@ -176,7 +187,9 @@ def NormFile(FilePath, Workspace):
 # @param  To        The stream message put on
 # @param  ExitFlag  The flag used to indicate stopping reading
 #
-def ReadMessage(From, To, ExitFlag,MemTo=None):
+
+
+def ReadMessage(From, To, ExitFlag, MemTo=None):
     while True:
         # read one line a time
         Line = From.readline()
@@ -184,7 +197,7 @@ def ReadMessage(From, To, ExitFlag,MemTo=None):
         if Line is not None and Line != b"":
             LineStr = Line.rstrip().decode(encoding='utf-8', errors='ignore')
             if MemTo is not None:
-                if "Note: including file:" ==  LineStr.lstrip()[:21]:
+                if "Note: including file:" == LineStr.lstrip()[:21]:
                     MemTo.append(LineStr)
                 else:
                     To(LineStr)
@@ -196,12 +209,13 @@ def ReadMessage(From, To, ExitFlag,MemTo=None):
         if ExitFlag.is_set():
             break
 
+
 class MakeSubProc(Popen):
-    def __init__(self,*args, **argv):
-        super(MakeSubProc,self).__init__(*args, **argv)
+    def __init__(self, *args, **argv):
+        super(MakeSubProc, self).__init__(*args, **argv)
         self.ProcOut = []
 
-## Launch an external program
+# Launch an external program
 #
 # This method will call subprocess.Popen to execute an external program with
 # given options in specified directory. Because of the dead-lock issue during
@@ -211,7 +225,9 @@ class MakeSubProc(Popen):
 # @param  Command               A list or string containing the call of the program
 # @param  WorkingDir            The directory in which the program will be running
 #
-def LaunchCommand(Command, WorkingDir,ModuleAuto = None):
+
+
+def LaunchCommand(Command, WorkingDir, ModuleAuto=None):
     BeginTime = time.time()
     # if working directory doesn't exist, Popen() will raise an exception
     if not os.path.isdir(WorkingDir):
@@ -230,29 +246,32 @@ def LaunchCommand(Command, WorkingDir,ModuleAuto = None):
     EndOfProcedure = None
     try:
         # launch the command
-        Proc = MakeSubProc(Command, stdout=PIPE, stderr=STDOUT, env=os.environ, cwd=WorkingDir, bufsize=-1, shell=True)
+        Proc = MakeSubProc(Command, stdout=PIPE, stderr=STDOUT,
+                           env=os.environ, cwd=WorkingDir, bufsize=-1, shell=True)
 
         # launch two threads to read the STDOUT and STDERR
         EndOfProcedure = Event()
         EndOfProcedure.clear()
         if Proc.stdout:
-            StdOutThread = Thread(target=ReadMessage, args=(Proc.stdout, EdkLogger.info, EndOfProcedure,Proc.ProcOut))
+            StdOutThread = Thread(target=ReadMessage, args=(
+                Proc.stdout, EdkLogger.info, EndOfProcedure, Proc.ProcOut))
             StdOutThread.name = "STDOUT-Redirector"
             StdOutThread.daemon = False
             StdOutThread.start()
 
-
         # waiting for program exit
         Proc.wait()
-    except: # in case of aborting
+    except:  # in case of aborting
         # terminate the threads redirecting the program output
-        EdkLogger.quiet("(Python %s on %s) " % (platform.python_version(), sys.platform) + traceback.format_exc())
+        EdkLogger.quiet("(Python %s on %s) " % (
+            platform.python_version(), sys.platform) + traceback.format_exc())
         if EndOfProcedure is not None:
             EndOfProcedure.set()
         if Proc is None:
             if not isinstance(Command, type("")):
                 Command = " ".join(Command)
-            EdkLogger.error("build", COMMAND_FAILURE, "Failed to start command", ExtraData="%s [%s]" % (Command, WorkingDir))
+            EdkLogger.error("build", COMMAND_FAILURE, "Failed to start command",
+                            ExtraData="%s [%s]" % (Command, WorkingDir))
 
     if Proc.stdout:
         StdOutThread.join()
@@ -269,9 +288,10 @@ def LaunchCommand(Command, WorkingDir,ModuleAuto = None):
             f.close()
             EdkLogger.info(RespContent)
 
-        EdkLogger.error("build", COMMAND_FAILURE, ExtraData="%s [%s]" % (Command, WorkingDir))
+        EdkLogger.error("build", COMMAND_FAILURE,
+                        ExtraData="%s [%s]" % (Command, WorkingDir))
     if ModuleAuto:
-        iau = IncludesAutoGen(WorkingDir,ModuleAuto)
+        iau = IncludesAutoGen(WorkingDir, ModuleAuto)
         if ModuleAuto.ToolChainFamily == TAB_COMPILER_MSFT:
             iau.CreateDepsFileForMsvc(Proc.ProcOut)
         else:
@@ -282,7 +302,7 @@ def LaunchCommand(Command, WorkingDir,ModuleAuto = None):
         iau.CreateDepsTarget()
     return "%dms" % (int(round((time.time() - BeginTime) * 1000)))
 
-## The smallest unit that can be built in multi-thread build mode
+# The smallest unit that can be built in multi-thread build mode
 #
 # This is the base class of build unit. The "Obj" parameter must provide
 # __str__(), __eq__() and __hash__() methods. Otherwise there could be build units
@@ -290,8 +310,10 @@ def LaunchCommand(Command, WorkingDir,ModuleAuto = None):
 #
 # Currently the "Obj" should be only ModuleAutoGen or PlatformAutoGen objects.
 #
+
+
 class BuildUnit:
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Obj         The object the build is working on
@@ -309,20 +331,20 @@ class BuildUnit:
             EdkLogger.error("build", OPTION_MISSING,
                             "No build command found for this module. "
                             "Please check your setting of %s_%s_%s_MAKE_PATH in Conf/tools_def.txt file." %
-                                (Obj.BuildTarget, Obj.ToolChain, Obj.Arch),
+                            (Obj.BuildTarget, Obj.ToolChain, Obj.Arch),
                             ExtraData=str(Obj))
 
-
-    ## str() method
+    # str() method
     #
     #   It just returns the string representation of self.BuildObject
     #
     #   @param  self        The object pointer
     #
+
     def __str__(self):
         return str(self.BuildObject)
 
-    ## "==" operator method
+    # "==" operator method
     #
     #   It just compares self.BuildObject with "Other". So self.BuildObject must
     #   provide its own __eq__() method.
@@ -332,10 +354,10 @@ class BuildUnit:
     #
     def __eq__(self, Other):
         return Other and self.BuildObject == Other.BuildObject \
-                and Other.BuildObject \
-                and self.BuildObject.Arch == Other.BuildObject.Arch
+            and Other.BuildObject \
+            and self.BuildObject.Arch == Other.BuildObject.Arch
 
-    ## hash() method
+    # hash() method
     #
     #   It just returns the hash value of self.BuildObject which must be hashable.
     #
@@ -347,7 +369,7 @@ class BuildUnit:
     def __repr__(self):
         return repr(self.BuildObject)
 
-## The smallest module unit that can be built by nmake/make command in multi-thread build mode
+# The smallest module unit that can be built by nmake/make command in multi-thread build mode
 #
 # This class is for module build by nmake/make build system. The "Obj" parameter
 # must provide __str__(), __eq__() and __hash__() methods. Otherwise there could
@@ -355,20 +377,24 @@ class BuildUnit:
 #
 # Currently the "Obj" should be only ModuleAutoGen object.
 #
+
+
 class ModuleMakeUnit(BuildUnit):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Obj         The ModuleAutoGen object the build is working on
     #   @param  Target      The build target name, one of gSupportedTarget
     #
-    def __init__(self, Obj, BuildCommand,Target):
-        Dependency = [ModuleMakeUnit(La, BuildCommand,Target) for La in Obj.LibraryAutoGenList]
-        BuildUnit.__init__(self, Obj, BuildCommand, Target, Dependency, Obj.MakeFileDir)
+    def __init__(self, Obj, BuildCommand, Target):
+        Dependency = [ModuleMakeUnit(La, BuildCommand, Target)
+                      for La in Obj.LibraryAutoGenList]
+        BuildUnit.__init__(self, Obj, BuildCommand, Target,
+                           Dependency, Obj.MakeFileDir)
         if Target in [None, "", "all"]:
             self.Target = "tbuild"
 
-## The smallest platform unit that can be built by nmake/make command in multi-thread build mode
+# The smallest platform unit that can be built by nmake/make command in multi-thread build mode
 #
 # This class is for platform build by nmake/make build system. The "Obj" parameter
 # must provide __str__(), __eq__() and __hash__() methods. Otherwise there could
@@ -376,23 +402,30 @@ class ModuleMakeUnit(BuildUnit):
 #
 # Currently the "Obj" should be only PlatformAutoGen object.
 #
+
+
 class PlatformMakeUnit(BuildUnit):
-    ## The constructor
+    # The constructor
     #
     #   @param  self        The object pointer
     #   @param  Obj         The PlatformAutoGen object the build is working on
     #   @param  Target      The build target name, one of gSupportedTarget
     #
     def __init__(self, Obj, BuildCommand, Target):
-        Dependency = [ModuleMakeUnit(Lib, BuildCommand, Target) for Lib in self.BuildObject.LibraryAutoGenList]
-        Dependency.extend([ModuleMakeUnit(Mod, BuildCommand,Target) for Mod in self.BuildObject.ModuleAutoGenList])
-        BuildUnit.__init__(self, Obj, BuildCommand, Target, Dependency, Obj.MakeFileDir)
+        Dependency = [ModuleMakeUnit(Lib, BuildCommand, Target)
+                      for Lib in self.BuildObject.LibraryAutoGenList]
+        Dependency.extend([ModuleMakeUnit(Mod, BuildCommand, Target)
+                          for Mod in self.BuildObject.ModuleAutoGenList])
+        BuildUnit.__init__(self, Obj, BuildCommand, Target,
+                           Dependency, Obj.MakeFileDir)
 
-## The class representing the task of a module build or platform build
+# The class representing the task of a module build or platform build
 #
 # This class manages the build tasks in multi-thread build mode. Its jobs include
 # scheduling thread running, catching thread error, monitor the thread status, etc.
 #
+
+
 class BuildTask:
     # queue for tasks waiting for schedule
     _PendingQueue = OrderedDict()
@@ -421,14 +454,15 @@ class BuildTask:
     _SchedulerStopped = threading.Event()
     _SchedulerStopped.set()
 
-    ## Start the task scheduler thread
+    # Start the task scheduler thread
     #
     #   @param  MaxThreadNumber     The maximum thread number
     #   @param  ExitFlag            Flag used to end the scheduler
     #
     @staticmethod
     def StartScheduler(MaxThreadNumber, ExitFlag):
-        SchedulerThread = Thread(target=BuildTask.Scheduler, args=(MaxThreadNumber, ExitFlag))
+        SchedulerThread = Thread(
+            target=BuildTask.Scheduler, args=(MaxThreadNumber, ExitFlag))
         SchedulerThread.name = "Build-Task-Scheduler"
         SchedulerThread.daemon = False
         SchedulerThread.start()
@@ -436,7 +470,7 @@ class BuildTask:
         while not BuildTask.IsOnGoing():
             time.sleep(0.01)
 
-    ## Scheduler method
+    # Scheduler method
     #
     #   @param  MaxThreadNumber     The maximum thread number
     #   @param  ExitFlag            Flag used to end the scheduler
@@ -451,7 +485,7 @@ class BuildTask:
             # scheduling loop, which will exits when no pending/ready task and
             # indicated to do so, or there's error in running thread
             #
-            while (len(BuildTask._PendingQueue) > 0 or len(BuildTask._ReadyQueue) > 0 \
+            while (len(BuildTask._PendingQueue) > 0 or len(BuildTask._ReadyQueue) > 0
                    or not ExitFlag.is_set()) and not BuildTask._ErrorFlag.is_set():
                 EdkLogger.debug(EdkLogger.DEBUG_8, "Pending Queue (%d), Ready Queue (%d)"
                                 % (len(BuildTask._PendingQueue), len(BuildTask._ReadyQueue)))
@@ -466,7 +500,8 @@ class BuildTask:
                 for BuildObject in BuildObjectList:
                     Bt = BuildTask._PendingQueue[BuildObject]
                     if Bt.IsReady():
-                        BuildTask._ReadyQueue[BuildObject] = BuildTask._PendingQueue.pop(BuildObject)
+                        BuildTask._ReadyQueue[BuildObject] = BuildTask._PendingQueue.pop(
+                            BuildObject)
                 BuildTask._PendingQueueLock.release()
 
                 # launch build thread until the maximum number of threads is reached
@@ -498,8 +533,10 @@ class BuildTask:
                 EdkLogger.quiet("\nWaiting for all build threads exit...")
             # while not BuildTask._ErrorFlag.is_set() and \
             while len(BuildTask._RunningQueue) > 0:
-                EdkLogger.verbose("Waiting for thread ending...(%d)" % len(BuildTask._RunningQueue))
-                EdkLogger.debug(EdkLogger.DEBUG_8, "Threads [%s]" % ", ".join(Th.name for Th in threading.enumerate()))
+                EdkLogger.verbose("Waiting for thread ending...(%d)" %
+                                  len(BuildTask._RunningQueue))
+                EdkLogger.debug(EdkLogger.DEBUG_8, "Threads [%s]" % ", ".join(
+                    Th.name for Th in threading.enumerate()))
                 # avoid tense loop
                 time.sleep(0.1)
         except BaseException as X:
@@ -509,7 +546,8 @@ class BuildTask:
             #
             EdkLogger.SetLevel(EdkLogger.ERROR)
             BuildTask._ErrorFlag.set()
-            BuildTask._ErrorMessage = "build thread scheduler error\n\t%s" % str(X)
+            BuildTask._ErrorMessage = "build thread scheduler error\n\t%s" % str(
+                X)
 
         BuildTask._PendingQueue.clear()
         BuildTask._ReadyQueue.clear()
@@ -517,26 +555,26 @@ class BuildTask:
         BuildTask._TaskQueue.clear()
         BuildTask._SchedulerStopped.set()
 
-    ## Wait for all running method exit
+    # Wait for all running method exit
     #
     @staticmethod
     def WaitForComplete():
         BuildTask._SchedulerStopped.wait()
 
-    ## Check if the scheduler is running or not
+    # Check if the scheduler is running or not
     #
     @staticmethod
     def IsOnGoing():
         return not BuildTask._SchedulerStopped.is_set()
 
-    ## Abort the build
+    # Abort the build
     @staticmethod
     def Abort():
         if BuildTask.IsOnGoing():
             BuildTask._ErrorFlag.set()
             BuildTask.WaitForComplete()
 
-    ## Check if there's error in running thread
+    # Check if there's error in running thread
     #
     #   Since the main thread cannot catch exceptions in other thread, we have to
     #   use threading.Event to communicate this formation to main thread.
@@ -545,7 +583,7 @@ class BuildTask:
     def HasError():
         return BuildTask._ErrorFlag.is_set()
 
-    ## Get error message in running thread
+    # Get error message in running thread
     #
     #   Since the main thread cannot catch exceptions in other thread, we have to
     #   use a static variable to communicate this message to main thread.
@@ -554,7 +592,7 @@ class BuildTask:
     def GetErrorMessage():
         return BuildTask._ErrorMessage
 
-    ## Factory method to create a BuildTask object
+    # Factory method to create a BuildTask object
     #
     #   This method will check if a module is building or has been built. And if
     #   true, just return the associated BuildTask object in the _TaskQueue. If
@@ -580,7 +618,7 @@ class BuildTask:
 
         return Bt
 
-    ## The real constructor of BuildTask
+    # The real constructor of BuildTask
     #
     #   @param  BuildItem       A BuildUnit object representing a build object
     #   @param  Dependency      The dependent build object of BuildItem
@@ -597,7 +635,7 @@ class BuildTask:
         # flag indicating build completes, used to avoid unnecessary re-build
         self.CompleteFlag = False
 
-    ## Check if all dependent build tasks are completed or not
+    # Check if all dependent build tasks are completed or not
     #
     def IsReady(self):
         ReadyFlag = True
@@ -609,23 +647,25 @@ class BuildTask:
 
         return ReadyFlag
 
-    ## Add dependent build task
+    # Add dependent build task
     #
     #   @param  Dependency      The list of dependent build objects
     #
     def AddDependency(self, Dependency):
         for Dep in Dependency:
             if not Dep.BuildObject.IsBinaryModule and not Dep.BuildObject.CanSkipbyCache(GlobalData.gModuleCacheHit):
-                self.DependencyList.append(BuildTask.New(Dep))    # BuildTask list
+                self.DependencyList.append(
+                    BuildTask.New(Dep))    # BuildTask list
 
-    ## The thread wrapper of LaunchCommand function
+    # The thread wrapper of LaunchCommand function
     #
     # @param  Command               A list or string contains the call of the command
     # @param  WorkingDir            The directory in which the program will be running
     #
     def _CommandThread(self, Command, WorkingDir):
         try:
-            self.BuildItem.BuildObject.BuildTime = LaunchCommand(Command, WorkingDir,self.BuildItem.BuildObject)
+            self.BuildItem.BuildObject.BuildTime = LaunchCommand(
+                Command, WorkingDir, self.BuildItem.BuildObject)
             self.CompleteFlag = True
 
             # Run hash operation post dependency to account for libs
@@ -645,11 +685,12 @@ class BuildTask:
                                                                   self.BuildItem.BuildObject.Arch,
                                                                   self.BuildItem.BuildObject.ToolChain,
                                                                   self.BuildItem.BuildObject.BuildTarget
-                                                                 )
+                                                                  )
             EdkLogger.SetLevel(EdkLogger.ERROR)
             BuildTask._ErrorFlag.set()
             BuildTask._ErrorMessage = "%s broken\n    %s [%s]" % \
-                                      (threading.current_thread().name, Command, WorkingDir)
+                                      (threading.current_thread().name,
+                                       Command, WorkingDir)
 
         # indicate there's a thread is available for another build task
         BuildTask._RunningQueueLock.acquire()
@@ -657,20 +698,23 @@ class BuildTask:
         BuildTask._RunningQueueLock.release()
         BuildTask._Thread.release()
 
-    ## Start build task thread
+    # Start build task thread
     #
     def Start(self):
         EdkLogger.quiet("Building ... %s" % repr(self.BuildItem))
         Command = self.BuildItem.BuildCommand + [self.BuildItem.Target]
-        self.BuildTread = Thread(target=self._CommandThread, args=(Command, self.BuildItem.WorkingDir))
+        self.BuildTread = Thread(target=self._CommandThread, args=(
+            Command, self.BuildItem.WorkingDir))
         self.BuildTread.name = "build thread"
         self.BuildTread.daemon = False
         self.BuildTread.start()
 
-## The class contains the information related to EFI image
+# The class contains the information related to EFI image
 #
+
+
 class PeImageInfo():
-    ## Constructor
+    # Constructor
     #
     # Constructor will load all required image information.
     #
@@ -682,15 +726,15 @@ class PeImageInfo():
     #   @param  ImageClass        PeImage Information
     #
     def __init__(self, BaseName, Guid, Arch, OutputDir, DebugDir, ImageClass):
-        self.BaseName         = BaseName
-        self.Guid             = Guid
-        self.Arch             = Arch
-        self.OutputDir        = OutputDir
-        self.DebugDir         = DebugDir
-        self.Image            = ImageClass
-        self.Image.Size       = (self.Image.Size // 0x1000 + 1) * 0x1000
+        self.BaseName = BaseName
+        self.Guid = Guid
+        self.Arch = Arch
+        self.OutputDir = OutputDir
+        self.DebugDir = DebugDir
+        self.Image = ImageClass
+        self.Image.Size = (self.Image.Size // 0x1000 + 1) * 0x1000
 
-## The class implementing the EDK2 build process
+# The class implementing the EDK2 build process
 #
 #   The build process includes:
 #       1. Load configuration from target.txt and tools_def.txt in $(WORKSPACE)/Conf
@@ -700,8 +744,10 @@ class PeImageInfo():
 #       5. Create AutoGen files (C code file, depex file, makefile) if necessary
 #       6. Call build command
 #
+
+
 class Build():
-    ## Constructor
+    # Constructor
     #
     # Constructor will load all necessary configurations, parse platform, modules
     # and packages and the establish a database for AutoGen.
@@ -710,53 +756,58 @@ class Build():
     #   @param  WorkspaceDir        The directory of workspace
     #   @param  BuildOptions        Build options passed from command line
     #
-    def __init__(self, Target, WorkspaceDir, BuildOptions,log_q):
-        self.WorkspaceDir   = WorkspaceDir
-        self.Target         = Target
-        self.PlatformFile   = BuildOptions.PlatformFile
-        self.ModuleFile     = BuildOptions.ModuleFile
-        self.ArchList       = BuildOptions.TargetArch
-        self.ToolChainList  = BuildOptions.ToolChain
-        self.BuildTargetList= BuildOptions.BuildTarget
-        self.Fdf            = BuildOptions.FdfFile
-        self.FdList         = BuildOptions.RomImage
-        self.FvList         = BuildOptions.FvImage
-        self.CapList        = BuildOptions.CapName
-        self.SilentMode     = BuildOptions.SilentMode
-        self.ThreadNumber   = 1
-        self.SkipAutoGen    = BuildOptions.SkipAutoGen
-        self.Reparse        = BuildOptions.Reparse
-        self.SkuId          = BuildOptions.SkuId
+    def __init__(self, Target, WorkspaceDir, BuildOptions, log_q):
+        self.WorkspaceDir = WorkspaceDir
+        self.Target = Target
+        self.PlatformFile = BuildOptions.PlatformFile
+        self.ModuleFile = BuildOptions.ModuleFile
+        self.ArchList = BuildOptions.TargetArch
+        self.ToolChainList = BuildOptions.ToolChain
+        self.BuildTargetList = BuildOptions.BuildTarget
+        self.Fdf = BuildOptions.FdfFile
+        self.FdList = BuildOptions.RomImage
+        self.FvList = BuildOptions.FvImage
+        self.CapList = BuildOptions.CapName
+        self.SilentMode = BuildOptions.SilentMode
+        self.ThreadNumber = 1
+        self.SkipAutoGen = BuildOptions.SkipAutoGen
+        self.Reparse = BuildOptions.Reparse
+        self.SkuId = BuildOptions.SkuId
         if self.SkuId:
             GlobalData.gSKUID_CMD = self.SkuId
         self.ConfDirectory = BuildOptions.ConfDirectory
-        self.SpawnMode      = True
-        self.BuildReport    = BuildReport(BuildOptions.ReportFile, BuildOptions.ReportType)
-        self.AutoGenTime    = 0
-        self.MakeTime       = 0
-        self.GenFdsTime     = 0
-        self.MakeFileName   = ""
+        self.SpawnMode = True
+        self.BuildReport = BuildReport(
+            BuildOptions.ReportFile, BuildOptions.ReportType)
+        self.AutoGenTime = 0
+        self.MakeTime = 0
+        self.GenFdsTime = 0
+        self.MakeFileName = ""
         TargetObj = TargetTxtDict()
-        ToolDefObj = ToolDefDict((os.path.join(os.getenv("WORKSPACE"),"Conf")))
+        ToolDefObj = ToolDefDict(
+            (os.path.join(os.getenv("WORKSPACE"), "Conf")))
         self.TargetTxt = TargetObj.Target
         self.ToolDef = ToolDefObj.ToolDef
-        GlobalData.BuildOptionPcd     = BuildOptions.OptionPcd if BuildOptions.OptionPcd else []
-        #Set global flag for build mode
+        GlobalData.BuildOptionPcd = BuildOptions.OptionPcd if BuildOptions.OptionPcd else []
+        # Set global flag for build mode
         GlobalData.gIgnoreSource = BuildOptions.IgnoreSources
         GlobalData.gUseHashCache = BuildOptions.UseHashCache
-        GlobalData.gBinCacheDest   = BuildOptions.BinCacheDest
+        GlobalData.gBinCacheDest = BuildOptions.BinCacheDest
         GlobalData.gBinCacheSource = BuildOptions.BinCacheSource
         GlobalData.gEnableGenfdsMultiThread = not BuildOptions.NoGenfdsMultiThread
         GlobalData.gDisableIncludePathCheck = BuildOptions.DisableIncludePathCheck
 
         if GlobalData.gBinCacheDest and not GlobalData.gUseHashCache:
-            EdkLogger.error("build", OPTION_NOT_SUPPORTED, ExtraData="--binary-destination must be used together with --hash.")
+            EdkLogger.error("build", OPTION_NOT_SUPPORTED,
+                            ExtraData="--binary-destination must be used together with --hash.")
 
         if GlobalData.gBinCacheSource and not GlobalData.gUseHashCache:
-            EdkLogger.error("build", OPTION_NOT_SUPPORTED, ExtraData="--binary-source must be used together with --hash.")
+            EdkLogger.error("build", OPTION_NOT_SUPPORTED,
+                            ExtraData="--binary-source must be used together with --hash.")
 
         if GlobalData.gBinCacheDest and GlobalData.gBinCacheSource:
-            EdkLogger.error("build", OPTION_NOT_SUPPORTED, ExtraData="--binary-destination can not be used together with --binary-source.")
+            EdkLogger.error("build", OPTION_NOT_SUPPORTED,
+                            ExtraData="--binary-destination can not be used together with --binary-source.")
 
         if GlobalData.gBinCacheSource:
             BinCacheSource = os.path.normpath(GlobalData.gBinCacheSource)
@@ -765,7 +816,8 @@ class Build():
             GlobalData.gBinCacheSource = BinCacheSource
         else:
             if GlobalData.gBinCacheSource is not None:
-                EdkLogger.error("build", OPTION_VALUE_INVALID, ExtraData="Invalid value of option --binary-source.")
+                EdkLogger.error("build", OPTION_VALUE_INVALID,
+                                ExtraData="Invalid value of option --binary-source.")
 
         if GlobalData.gBinCacheDest:
             BinCacheDest = os.path.normpath(GlobalData.gBinCacheDest)
@@ -774,9 +826,11 @@ class Build():
             GlobalData.gBinCacheDest = BinCacheDest
         else:
             if GlobalData.gBinCacheDest is not None:
-                EdkLogger.error("build", OPTION_VALUE_INVALID, ExtraData="Invalid value of option --binary-destination.")
+                EdkLogger.error("build", OPTION_VALUE_INVALID,
+                                ExtraData="Invalid value of option --binary-destination.")
 
-        GlobalData.gDatabasePath = os.path.normpath(os.path.join(GlobalData.gConfDirectory, GlobalData.gDatabasePath))
+        GlobalData.gDatabasePath = os.path.normpath(os.path.join(
+            GlobalData.gConfDirectory, GlobalData.gDatabasePath))
         if not os.path.exists(os.path.join(GlobalData.gConfDirectory, '.cache')):
             os.makedirs(os.path.join(GlobalData.gConfDirectory, '.cache'))
         self.Db = BuildDB
@@ -784,12 +838,13 @@ class Build():
         self.Platform = None
         self.ToolChainFamily = None
         self.LoadFixAddress = 0
-        self.UniFlag        = BuildOptions.Flag
+        self.UniFlag = BuildOptions.Flag
         self.BuildModules = []
         self.HashSkipModules = []
         self.Db_Flag = False
         self.LaunchPrebuildFlag = False
-        self.PlatformBuildPath = os.path.join(GlobalData.gConfDirectory, '.cache', '.PlatformBuild')
+        self.PlatformBuildPath = os.path.join(
+            GlobalData.gConfDirectory, '.cache', '.PlatformBuild')
         if BuildOptions.CommandLength:
             GlobalData.gCommandMaxLength = BuildOptions.CommandLength
 
@@ -799,19 +854,24 @@ class Build():
         EdkLogger.quiet("%-16s = %s" % ("WORKSPACE", os.environ["WORKSPACE"]))
         if "PACKAGES_PATH" in os.environ:
             # WORKSPACE env has been converted before. Print the same path style with WORKSPACE env.
-            EdkLogger.quiet("%-16s = %s" % ("PACKAGES_PATH", os.path.normcase(os.path.normpath(os.environ["PACKAGES_PATH"]))))
-        EdkLogger.quiet("%-16s = %s" % ("EDK_TOOLS_PATH", os.environ["EDK_TOOLS_PATH"]))
+            EdkLogger.quiet("%-16s = %s" % ("PACKAGES_PATH",
+                            os.path.normcase(os.path.normpath(os.environ["PACKAGES_PATH"]))))
+        EdkLogger.quiet("%-16s = %s" %
+                        ("EDK_TOOLS_PATH", os.environ["EDK_TOOLS_PATH"]))
         if "EDK_TOOLS_BIN" in os.environ:
             # Print the same path style with WORKSPACE env.
-            EdkLogger.quiet("%-16s = %s" % ("EDK_TOOLS_BIN", os.path.normcase(os.path.normpath(os.environ["EDK_TOOLS_BIN"]))))
-        EdkLogger.quiet("%-16s = %s" % ("CONF_PATH", GlobalData.gConfDirectory))
+            EdkLogger.quiet("%-16s = %s" % ("EDK_TOOLS_BIN",
+                            os.path.normcase(os.path.normpath(os.environ["EDK_TOOLS_BIN"]))))
+        EdkLogger.quiet("%-16s = %s" %
+                        ("CONF_PATH", GlobalData.gConfDirectory))
         if "PYTHON3_ENABLE" in os.environ:
             PYTHON3_ENABLE = os.environ["PYTHON3_ENABLE"]
             if PYTHON3_ENABLE != "TRUE":
                 PYTHON3_ENABLE = "FALSE"
             EdkLogger.quiet("%-16s = %s" % ("PYTHON3_ENABLE", PYTHON3_ENABLE))
         if "PYTHON_COMMAND" in os.environ:
-            EdkLogger.quiet("%-16s = %s" % ("PYTHON_COMMAND", os.environ["PYTHON_COMMAND"]))
+            EdkLogger.quiet("%-16s = %s" %
+                            ("PYTHON_COMMAND", os.environ["PYTHON_COMMAND"]))
         self.InitPreBuild()
         self.InitPostBuild()
         if self.Prebuild:
@@ -821,7 +881,8 @@ class Build():
         if self.Prebuild:
             self.LaunchPrebuild()
             TargetObj = TargetTxtDict()
-            ToolDefObj = ToolDefDict((os.path.join(os.getenv("WORKSPACE"), "Conf")))
+            ToolDefObj = ToolDefDict(
+                (os.path.join(os.getenv("WORKSPACE"), "Conf")))
             self.TargetTxt = TargetObj.Target
             self.ToolDef = ToolDefObj.ToolDef
         if not (self.LaunchPrebuildFlag and os.path.exists(self.PlatformBuildPath)):
@@ -831,7 +892,7 @@ class Build():
         EdkLogger.info("")
         os.chdir(self.WorkspaceDir)
         self.log_q = log_q
-        GlobalData.file_lock =  mp.Lock()
+        GlobalData.file_lock = mp.Lock()
         # Init cache data for local only
         GlobalData.gPackageHashFile = dict()
         GlobalData.gModulePreMakeCacheStatus = dict()
@@ -843,18 +904,20 @@ class Build():
         GlobalData.gModuleAllCacheStatus = set()
         GlobalData.gModuleCacheHit = set()
 
-    def StartAutoGen(self,mqueue, DataPipe,SkipAutoGen,PcdMaList,cqueue):
+    def StartAutoGen(self, mqueue, DataPipe, SkipAutoGen, PcdMaList, cqueue):
         try:
             if SkipAutoGen:
-                return True,0
+                return True, 0
             feedback_q = mp.Queue()
             error_event = mp.Event()
             FfsCmd = DataPipe.Get("FfsCommand")
             if FfsCmd is None:
                 FfsCmd = {}
             GlobalData.FfsCmd = FfsCmd
-            auto_workers = [AutoGenWorkerInProcess(mqueue,DataPipe.dump_file,feedback_q,GlobalData.file_lock,cqueue,self.log_q,error_event) for _ in range(self.ThreadNumber)]
-            self.AutoGenMgr = AutoGenManager(auto_workers,feedback_q,error_event)
+            auto_workers = [AutoGenWorkerInProcess(
+                mqueue, DataPipe.dump_file, feedback_q, GlobalData.file_lock, cqueue, self.log_q, error_event) for _ in range(self.ThreadNumber)]
+            self.AutoGenMgr = AutoGenManager(
+                auto_workers, feedback_q, error_event)
             self.AutoGenMgr.start()
             for w in auto_workers:
                 w.start()
@@ -866,14 +929,17 @@ class Build():
                     RetVal = PcdMa.SourceFileList
                     # Force cache miss for PCD driver
                     if GlobalData.gUseHashCache and not GlobalData.gBinCacheDest and self.Target in [None, "", "all"]:
-                        cqueue.put((PcdMa.MetaFile.Path, PcdMa.Arch, "PreMakeCache", False))
+                        cqueue.put(
+                            (PcdMa.MetaFile.Path, PcdMa.Arch, "PreMakeCache", False))
 
                     PcdMa.CreateCodeFile(False)
-                    PcdMa.CreateMakeFile(False,GenFfsList = DataPipe.Get("FfsCommand").get((PcdMa.MetaFile.Path, PcdMa.Arch),[]))
+                    PcdMa.CreateMakeFile(False, GenFfsList=DataPipe.Get(
+                        "FfsCommand").get((PcdMa.MetaFile.Path, PcdMa.Arch), []))
                     PcdMa.CreateAsBuiltInf()
                     # Force cache miss for PCD driver
                     if GlobalData.gBinCacheSource and self.Target in [None, "", "all"]:
-                        cqueue.put((PcdMa.MetaFile.Path, PcdMa.Arch, "MakeCache", False))
+                        cqueue.put(
+                            (PcdMa.MetaFile.Path, PcdMa.Arch, "MakeCache", False))
 
             self.AutoGenMgr.join()
             rt = self.AutoGenMgr.Status
@@ -886,50 +952,56 @@ class Build():
         except:
             return False, UNKNOWN_ERROR
 
-    ## Add TOOLCHAIN and FAMILY declared in DSC [BuildOptions] to ToolsDefTxtDatabase.
+    # Add TOOLCHAIN and FAMILY declared in DSC [BuildOptions] to ToolsDefTxtDatabase.
     #
     # Loop through the set of build targets, tool chains, and archs provided on either
     # the command line or in target.txt to discover FAMILY and TOOLCHAIN declarations
     # in [BuildOptions] sections that may be within !if expressions that may use
     # $(TARGET), $(TOOLCHAIN), $(TOOLCHAIN_TAG), or $(ARCH) operands.
     #
-    def GetToolChainAndFamilyFromDsc (self, File):
+    def GetToolChainAndFamilyFromDsc(self, File):
         SavedGlobalDefines = GlobalData.gGlobalDefines.copy()
         for BuildTarget in self.BuildTargetList:
             GlobalData.gGlobalDefines['TARGET'] = BuildTarget
             for BuildToolChain in self.ToolChainList:
-                GlobalData.gGlobalDefines['TOOLCHAIN']      = BuildToolChain
+                GlobalData.gGlobalDefines['TOOLCHAIN'] = BuildToolChain
                 GlobalData.gGlobalDefines['TOOL_CHAIN_TAG'] = BuildToolChain
                 for BuildArch in self.ArchList:
                     GlobalData.gGlobalDefines['ARCH'] = BuildArch
                     dscobj = self.BuildDatabase[File, BuildArch]
                     for KeyFamily, Key, KeyCodeBase in dscobj.BuildOptions:
                         try:
-                            Target, ToolChain, Arch, Tool, Attr = Key.split('_')
+                            Target, ToolChain, Arch, Tool, Attr = Key.split(
+                                '_')
                         except:
                             continue
                         if ToolChain == TAB_STAR or Attr != TAB_TOD_DEFINES_FAMILY:
                             continue
                         try:
-                            Family = dscobj.BuildOptions[(KeyFamily, Key, KeyCodeBase)]
+                            Family = dscobj.BuildOptions[(
+                                KeyFamily, Key, KeyCodeBase)]
                             Family = Family.strip().strip('=').strip()
                         except:
                             continue
                         if TAB_TOD_DEFINES_FAMILY not in self.ToolDef.ToolsDefTxtDatabase:
-                            self.ToolDef.ToolsDefTxtDatabase[TAB_TOD_DEFINES_FAMILY] = {}
+                            self.ToolDef.ToolsDefTxtDatabase[TAB_TOD_DEFINES_FAMILY] = {
+                            }
                         if ToolChain not in self.ToolDef.ToolsDefTxtDatabase[TAB_TOD_DEFINES_FAMILY]:
                             self.ToolDef.ToolsDefTxtDatabase[TAB_TOD_DEFINES_FAMILY][ToolChain] = Family
                         if TAB_TOD_DEFINES_BUILDRULEFAMILY not in self.ToolDef.ToolsDefTxtDatabase:
-                            self.ToolDef.ToolsDefTxtDatabase[TAB_TOD_DEFINES_BUILDRULEFAMILY] = {}
+                            self.ToolDef.ToolsDefTxtDatabase[TAB_TOD_DEFINES_BUILDRULEFAMILY] = {
+                            }
                         if ToolChain not in self.ToolDef.ToolsDefTxtDatabase[TAB_TOD_DEFINES_BUILDRULEFAMILY]:
                             self.ToolDef.ToolsDefTxtDatabase[TAB_TOD_DEFINES_BUILDRULEFAMILY][ToolChain] = Family
                         if TAB_TOD_DEFINES_TOOL_CHAIN_TAG not in self.ToolDef.ToolsDefTxtDatabase:
-                            self.ToolDef.ToolsDefTxtDatabase[TAB_TOD_DEFINES_TOOL_CHAIN_TAG] = []
+                            self.ToolDef.ToolsDefTxtDatabase[TAB_TOD_DEFINES_TOOL_CHAIN_TAG] = [
+                            ]
                         if ToolChain not in self.ToolDef.ToolsDefTxtDatabase[TAB_TOD_DEFINES_TOOL_CHAIN_TAG]:
-                            self.ToolDef.ToolsDefTxtDatabase[TAB_TOD_DEFINES_TOOL_CHAIN_TAG].append(ToolChain)
+                            self.ToolDef.ToolsDefTxtDatabase[TAB_TOD_DEFINES_TOOL_CHAIN_TAG].append(
+                                ToolChain)
         GlobalData.gGlobalDefines = SavedGlobalDefines
 
-    ## Load configuration
+    # Load configuration
     #
     #   This method will parse target.txt and get the build configurations.
     #
@@ -948,14 +1020,16 @@ class Build():
         if not self.ToolChainList:
             self.ToolChainList = self.TargetTxt.TargetTxtDictionary[TAB_TAT_DEFINES_TOOL_CHAIN_TAG]
             if self.ToolChainList is None or len(self.ToolChainList) == 0:
-                EdkLogger.error("build", RESOURCE_NOT_AVAILABLE, ExtraData="No toolchain given. Don't know how to build.\n")
+                EdkLogger.error("build", RESOURCE_NOT_AVAILABLE,
+                                ExtraData="No toolchain given. Don't know how to build.\n")
 
         if not self.PlatformFile:
             PlatformFile = self.TargetTxt.TargetTxtDictionary[TAB_TAT_DEFINES_ACTIVE_PLATFORM]
             if not PlatformFile:
                 # Try to find one in current directory
                 WorkingDirectory = os.getcwd()
-                FileList = glob.glob(os.path.normpath(os.path.join(WorkingDirectory, '*.dsc')))
+                FileList = glob.glob(os.path.normpath(
+                    os.path.join(WorkingDirectory, '*.dsc')))
                 FileNum = len(FileList)
                 if FileNum >= 2:
                     EdkLogger.error("build", OPTION_MISSING,
@@ -966,15 +1040,17 @@ class Build():
                     EdkLogger.error("build", RESOURCE_NOT_AVAILABLE,
                                     ExtraData="No active platform specified in target.txt or command line! Nothing can be built.\n")
 
-            self.PlatformFile = PathClass(NormFile(PlatformFile, self.WorkspaceDir), self.WorkspaceDir)
+            self.PlatformFile = PathClass(
+                NormFile(PlatformFile, self.WorkspaceDir), self.WorkspaceDir)
 
-        self.GetToolChainAndFamilyFromDsc (self.PlatformFile)
+        self.GetToolChainAndFamilyFromDsc(self.PlatformFile)
 
         # check if the tool chains are defined or not
         NewToolChainList = []
         for ToolChain in self.ToolChainList:
             if ToolChain not in self.ToolDef.ToolsDefTxtDatabase[TAB_TOD_DEFINES_TOOL_CHAIN_TAG]:
-                EdkLogger.warn("build", "Tool chain [%s] is not defined" % ToolChain)
+                EdkLogger.warn(
+                    "build", "Tool chain [%s] is not defined" % ToolChain)
             else:
                 NewToolChainList.append(ToolChain)
         # if no tool chain available, break the build
@@ -989,18 +1065,21 @@ class Build():
         for Tool in self.ToolChainList:
             if TAB_TOD_DEFINES_FAMILY not in ToolDefinition or Tool not in ToolDefinition[TAB_TOD_DEFINES_FAMILY] \
                or not ToolDefinition[TAB_TOD_DEFINES_FAMILY][Tool]:
-                EdkLogger.warn("build", "No tool chain family found in configuration for %s. Default to MSFT." % Tool)
+                EdkLogger.warn(
+                    "build", "No tool chain family found in configuration for %s. Default to MSFT." % Tool)
                 ToolChainFamily.append(TAB_COMPILER_MSFT)
             else:
-                ToolChainFamily.append(ToolDefinition[TAB_TOD_DEFINES_FAMILY][Tool])
+                ToolChainFamily.append(
+                    ToolDefinition[TAB_TOD_DEFINES_FAMILY][Tool])
         self.ToolChainFamily = ToolChainFamily
 
-        self.ThreadNumber   = ThreadNum()
-    ## Initialize build configuration
+        self.ThreadNumber = ThreadNum()
+    # Initialize build configuration
     #
     #   This method will parse DSC file and merge the configurations from
     #   command line and target.txt, then get the final build configurations.
     #
+
     def InitBuild(self):
         # parse target.txt, tools_def.txt, and platform file
         self.LoadConfiguration()
@@ -1010,7 +1089,6 @@ class Build():
         if ErrorCode != 0:
             EdkLogger.error("build", ErrorCode, ExtraData=ErrorInfo)
 
-
     def InitPreBuild(self):
         self.LoadConfiguration()
         ErrorCode, ErrorInfo = self.PlatformFile.Validate(".dsc", False)
@@ -1026,7 +1104,7 @@ class Build():
         if self.ToolChainFamily:
             GlobalData.gGlobalDefines['FAMILY'] = self.ToolChainFamily[0]
         if 'PREBUILD' in GlobalData.gCommandLineDefines:
-            self.Prebuild   = GlobalData.gCommandLineDefines.get('PREBUILD')
+            self.Prebuild = GlobalData.gCommandLineDefines.get('PREBUILD')
         else:
             self.Db_Flag = True
             Platform = self.Db.MapPlatform(str(self.PlatformFile))
@@ -1043,7 +1121,7 @@ class Build():
                 #
                 # Do not modify Arg if it looks like a flag or an absolute file path
                 #
-                if Arg.startswith('-')  or os.path.isabs(Arg):
+                if Arg.startswith('-') or os.path.isabs(Arg):
                     PrebuildList.append(Arg)
                     continue
                 #
@@ -1061,8 +1139,9 @@ class Build():
                 if os.path.isfile(Temp):
                     Arg = Temp
                 PrebuildList.append(Arg)
-            self.Prebuild       = ' '.join(PrebuildList)
-            self.Prebuild += self.PassCommandOption(self.BuildTargetList, self.ArchList, self.ToolChainList, self.PlatformFile, self.Target)
+            self.Prebuild = ' '.join(PrebuildList)
+            self.Prebuild += self.PassCommandOption(
+                self.BuildTargetList, self.ArchList, self.ToolChainList, self.PlatformFile, self.Target)
 
     def InitPostBuild(self):
         if 'POSTBUILD' in GlobalData.gCommandLineDefines:
@@ -1082,7 +1161,7 @@ class Build():
                 #
                 # Do not modify Arg if it looks like a flag or an absolute file path
                 #
-                if Arg.startswith('-')  or os.path.isabs(Arg):
+                if Arg.startswith('-') or os.path.isabs(Arg):
                     PostbuildList.append(Arg)
                     continue
                 #
@@ -1100,8 +1179,9 @@ class Build():
                 if os.path.isfile(Temp):
                     Arg = Temp
                 PostbuildList.append(Arg)
-            self.Postbuild       = ' '.join(PostbuildList)
-            self.Postbuild += self.PassCommandOption(self.BuildTargetList, self.ArchList, self.ToolChainList, self.PlatformFile, self.Target)
+            self.Postbuild = ' '.join(PostbuildList)
+            self.Postbuild += self.PassCommandOption(
+                self.BuildTargetList, self.ArchList, self.ToolChainList, self.PlatformFile, self.Target)
 
     def PassCommandOption(self, BuildTarget, TargetArch, ToolChain, PlatformFile, Target):
         BuildStr = ''
@@ -1156,7 +1236,8 @@ class Build():
             # and preserve them for the rest of the main build step, because the child process environment will
             # evaporate as soon as it exits, so we cannot get it in the build step.
             #
-            PrebuildEnvFile = os.path.join(GlobalData.gConfDirectory, '.cache', '.PrebuildEnv')
+            PrebuildEnvFile = os.path.join(
+                GlobalData.gConfDirectory, '.cache', '.PrebuildEnv')
             if os.path.isfile(PrebuildEnvFile):
                 os.remove(PrebuildEnvFile)
             if os.path.isfile(self.PlatformBuildPath):
@@ -1172,13 +1253,15 @@ class Build():
             EndOfProcedure = Event()
             EndOfProcedure.clear()
             if Process.stdout:
-                StdOutThread = Thread(target=ReadMessage, args=(Process.stdout, EdkLogger.info, EndOfProcedure))
+                StdOutThread = Thread(target=ReadMessage, args=(
+                    Process.stdout, EdkLogger.info, EndOfProcedure))
                 StdOutThread.name = "STDOUT-Redirector"
                 StdOutThread.daemon = False
                 StdOutThread.start()
 
             if Process.stderr:
-                StdErrThread = Thread(target=ReadMessage, args=(Process.stderr, EdkLogger.quiet, EndOfProcedure))
+                StdErrThread = Thread(target=ReadMessage, args=(
+                    Process.stderr, EdkLogger.quiet, EndOfProcedure))
                 StdErrThread.name = "STDERR-Redirector"
                 StdErrThread.daemon = False
                 StdErrThread.start()
@@ -1189,15 +1272,17 @@ class Build():
                 StdOutThread.join()
             if Process.stderr:
                 StdErrThread.join()
-            if Process.returncode != 0 :
-                EdkLogger.error("Prebuild", PREBUILD_ERROR, 'Prebuild process is not success!')
+            if Process.returncode != 0:
+                EdkLogger.error("Prebuild", PREBUILD_ERROR,
+                                'Prebuild process is not success!')
 
             if os.path.exists(PrebuildEnvFile):
                 f = open(PrebuildEnvFile)
                 envs = f.readlines()
                 f.close()
-                envs = [l.split("=", 1) for l in envs ]
-                envs = [[I.strip() for I in item] for item in envs if len(item) == 2]
+                envs = [l.split("=", 1) for l in envs]
+                envs = [[I.strip() for I in item]
+                        for item in envs if len(item) == 2]
                 os.environ.update(dict(envs))
             EdkLogger.info("\n- Prebuild Done -\n")
 
@@ -1205,20 +1290,24 @@ class Build():
         if self.Postbuild:
             EdkLogger.info("\n- Postbuild Start -\n")
             if sys.platform == "win32":
-                Process = Popen(self.Postbuild, stdout=PIPE, stderr=PIPE, shell=True)
+                Process = Popen(self.Postbuild, stdout=PIPE,
+                                stderr=PIPE, shell=True)
             else:
-                Process = Popen(self.Postbuild, stdout=PIPE, stderr=PIPE, shell=True)
+                Process = Popen(self.Postbuild, stdout=PIPE,
+                                stderr=PIPE, shell=True)
             # launch two threads to read the STDOUT and STDERR
             EndOfProcedure = Event()
             EndOfProcedure.clear()
             if Process.stdout:
-                StdOutThread = Thread(target=ReadMessage, args=(Process.stdout, EdkLogger.info, EndOfProcedure))
+                StdOutThread = Thread(target=ReadMessage, args=(
+                    Process.stdout, EdkLogger.info, EndOfProcedure))
                 StdOutThread.name = "STDOUT-Redirector"
                 StdOutThread.daemon = False
                 StdOutThread.start()
 
             if Process.stderr:
-                StdErrThread = Thread(target=ReadMessage, args=(Process.stderr, EdkLogger.quiet, EndOfProcedure))
+                StdErrThread = Thread(target=ReadMessage, args=(
+                    Process.stderr, EdkLogger.quiet, EndOfProcedure))
                 StdErrThread.name = "STDERR-Redirector"
                 StdErrThread.daemon = False
                 StdErrThread.start()
@@ -1229,11 +1318,12 @@ class Build():
                 StdOutThread.join()
             if Process.stderr:
                 StdErrThread.join()
-            if Process.returncode != 0 :
-                EdkLogger.error("Postbuild", POSTBUILD_ERROR, 'Postbuild process is not success!')
+            if Process.returncode != 0:
+                EdkLogger.error("Postbuild", POSTBUILD_ERROR,
+                                'Postbuild process is not success!')
             EdkLogger.info("\n- Postbuild Done -\n")
 
-    ## Build a module or platform
+    # Build a module or platform
     #
     # Create autogen code and makefile for a module or platform, and then launch the
     # "make" command to build it
@@ -1260,21 +1350,30 @@ class Build():
             mqueue = mp.Queue()
             for m in AutoGenObject.GetAllModuleInfo:
                 mqueue.put(m)
-            mqueue.put((None,None,None,None,None,None,None))
-            AutoGenObject.DataPipe.DataContainer = {"CommandTarget": self.Target}
-            AutoGenObject.DataPipe.DataContainer = {"Workspace_timestamp": AutoGenObject.Workspace._SrcTimeStamp}
+            mqueue.put((None, None, None, None, None, None, None))
+            AutoGenObject.DataPipe.DataContainer = {
+                "CommandTarget": self.Target}
+            AutoGenObject.DataPipe.DataContainer = {
+                "Workspace_timestamp": AutoGenObject.Workspace._SrcTimeStamp}
             AutoGenObject.CreateLibModuelDirs()
-            AutoGenObject.DataPipe.DataContainer = {"LibraryBuildDirectoryList":AutoGenObject.LibraryBuildDirectoryList}
-            AutoGenObject.DataPipe.DataContainer = {"ModuleBuildDirectoryList":AutoGenObject.ModuleBuildDirectoryList}
-            AutoGenObject.DataPipe.DataContainer = {"FdsCommandDict": AutoGenObject.Workspace.GenFdsCommandDict}
+            AutoGenObject.DataPipe.DataContainer = {
+                "LibraryBuildDirectoryList": AutoGenObject.LibraryBuildDirectoryList}
+            AutoGenObject.DataPipe.DataContainer = {
+                "ModuleBuildDirectoryList": AutoGenObject.ModuleBuildDirectoryList}
+            AutoGenObject.DataPipe.DataContainer = {
+                "FdsCommandDict": AutoGenObject.Workspace.GenFdsCommandDict}
             self.Progress.Start("Generating makefile and code")
-            data_pipe_file = os.path.join(AutoGenObject.BuildDir, "GlobalVar_%s_%s.bin" % (str(AutoGenObject.Guid),AutoGenObject.Arch))
+            data_pipe_file = os.path.join(AutoGenObject.BuildDir, "GlobalVar_%s_%s.bin" % (
+                str(AutoGenObject.Guid), AutoGenObject.Arch))
             AutoGenObject.DataPipe.dump(data_pipe_file)
             cqueue = mp.Queue()
-            autogen_rt,errorcode = self.StartAutoGen(mqueue, AutoGenObject.DataPipe, self.SkipAutoGen, PcdMaList, cqueue)
-            AutoGenIdFile = os.path.join(GlobalData.gConfDirectory,".AutoGenIdFile.txt")
-            with open(AutoGenIdFile,"w") as fw:
-                fw.write("Arch=%s\n" % "|".join((AutoGenObject.Workspace.ArchList)))
+            autogen_rt, errorcode = self.StartAutoGen(
+                mqueue, AutoGenObject.DataPipe, self.SkipAutoGen, PcdMaList, cqueue)
+            AutoGenIdFile = os.path.join(
+                GlobalData.gConfDirectory, ".AutoGenIdFile.txt")
+            with open(AutoGenIdFile, "w") as fw:
+                fw.write("Arch=%s\n" % "|".join(
+                    (AutoGenObject.Workspace.ArchList)))
                 fw.write("BuildDir=%s\n" % AutoGenObject.Workspace.BuildDir)
                 fw.write("PlatformGuid=%s\n" % str(AutoGenObject.Guid))
             self.Progress.Stop("done!")
@@ -1297,7 +1396,8 @@ class Build():
             EdkLogger.error("build", OPTION_MISSING,
                             "No build command found for this module. "
                             "Please check your setting of %s_%s_%s_MAKE_PATH in Conf/tools_def.txt file." %
-                                (AutoGenObject.BuildTarget, AutoGenObject.ToolChain, AutoGenObject.Arch),
+                            (AutoGenObject.BuildTarget,
+                             AutoGenObject.ToolChain, AutoGenObject.Arch),
                             ExtraData=str(AutoGenObject))
 
         # run
@@ -1325,10 +1425,14 @@ class Build():
             DirList = []
             for Lib in AutoGenObject.LibraryAutoGenList:
                 if not Lib.IsBinaryModule:
-                    DirList.append((os.path.join(AutoGenObject.BuildDir, Lib.BuildDir),Lib))
+                    DirList.append(
+                        (os.path.join(AutoGenObject.BuildDir, Lib.BuildDir), Lib))
             for Lib, LibAutoGen in DirList:
-                NewBuildCommand = BuildCommand + ['-f', os.path.normpath(os.path.join(Lib, self.MakeFileName)), 'pbuild']
-                LaunchCommand(NewBuildCommand, AutoGenObject.MakeFileDir,LibAutoGen)
+                NewBuildCommand = BuildCommand + \
+                    ['-f', os.path.normpath(os.path.join(Lib,
+                                            self.MakeFileName)), 'pbuild']
+                LaunchCommand(NewBuildCommand,
+                              AutoGenObject.MakeFileDir, LibAutoGen)
             return True
 
         # build module
@@ -1336,18 +1440,26 @@ class Build():
             DirList = []
             for Lib in AutoGenObject.LibraryAutoGenList:
                 if not Lib.IsBinaryModule:
-                    DirList.append((os.path.join(AutoGenObject.BuildDir, Lib.BuildDir),Lib))
+                    DirList.append(
+                        (os.path.join(AutoGenObject.BuildDir, Lib.BuildDir), Lib))
             for Lib, LibAutoGen in DirList:
-                NewBuildCommand = BuildCommand + ['-f', os.path.normpath(os.path.join(Lib, self.MakeFileName)), 'pbuild']
-                LaunchCommand(NewBuildCommand, AutoGenObject.MakeFileDir,LibAutoGen)
+                NewBuildCommand = BuildCommand + \
+                    ['-f', os.path.normpath(os.path.join(Lib,
+                                            self.MakeFileName)), 'pbuild']
+                LaunchCommand(NewBuildCommand,
+                              AutoGenObject.MakeFileDir, LibAutoGen)
 
             DirList = []
             for ModuleAutoGen in AutoGenObject.ModuleAutoGenList:
                 if not ModuleAutoGen.IsBinaryModule:
-                    DirList.append((os.path.join(AutoGenObject.BuildDir, ModuleAutoGen.BuildDir),ModuleAutoGen))
-            for Mod,ModAutoGen in DirList:
-                NewBuildCommand = BuildCommand + ['-f', os.path.normpath(os.path.join(Mod, self.MakeFileName)), 'pbuild']
-                LaunchCommand(NewBuildCommand, AutoGenObject.MakeFileDir,ModAutoGen)
+                    DirList.append(
+                        (os.path.join(AutoGenObject.BuildDir, ModuleAutoGen.BuildDir), ModuleAutoGen))
+            for Mod, ModAutoGen in DirList:
+                NewBuildCommand = BuildCommand + \
+                    ['-f', os.path.normpath(os.path.join(Mod,
+                                            self.MakeFileName)), 'pbuild']
+                LaunchCommand(NewBuildCommand,
+                              AutoGenObject.MakeFileDir, ModAutoGen)
             self.CreateAsBuiltInf()
             if GlobalData.gBinCacheDest:
                 self.GenDestCache()
@@ -1361,36 +1473,42 @@ class Build():
         # cleanlib
         if Target == 'cleanlib':
             for Lib in AutoGenObject.LibraryBuildDirectoryList:
-                LibMakefile = os.path.normpath(os.path.join(Lib, self.MakeFileName))
+                LibMakefile = os.path.normpath(
+                    os.path.join(Lib, self.MakeFileName))
                 if os.path.exists(LibMakefile):
-                    NewBuildCommand = BuildCommand + ['-f', LibMakefile, 'cleanall']
+                    NewBuildCommand = BuildCommand + \
+                        ['-f', LibMakefile, 'cleanall']
                     LaunchCommand(NewBuildCommand, AutoGenObject.MakeFileDir)
             return True
 
         # clean
         if Target == 'clean':
             for Mod in AutoGenObject.ModuleBuildDirectoryList:
-                ModMakefile = os.path.normpath(os.path.join(Mod, self.MakeFileName))
+                ModMakefile = os.path.normpath(
+                    os.path.join(Mod, self.MakeFileName))
                 if os.path.exists(ModMakefile):
-                    NewBuildCommand = BuildCommand + ['-f', ModMakefile, 'cleanall']
+                    NewBuildCommand = BuildCommand + \
+                        ['-f', ModMakefile, 'cleanall']
                     LaunchCommand(NewBuildCommand, AutoGenObject.MakeFileDir)
             for Lib in AutoGenObject.LibraryBuildDirectoryList:
-                LibMakefile = os.path.normpath(os.path.join(Lib, self.MakeFileName))
+                LibMakefile = os.path.normpath(
+                    os.path.join(Lib, self.MakeFileName))
                 if os.path.exists(LibMakefile):
-                    NewBuildCommand = BuildCommand + ['-f', LibMakefile, 'cleanall']
+                    NewBuildCommand = BuildCommand + \
+                        ['-f', LibMakefile, 'cleanall']
                     LaunchCommand(NewBuildCommand, AutoGenObject.MakeFileDir)
             return True
 
         # cleanall
         if Target == 'cleanall':
             try:
-                #os.rmdir(AutoGenObject.BuildDir)
+                # os.rmdir(AutoGenObject.BuildDir)
                 RemoveDirectory(AutoGenObject.BuildDir, True)
             except WindowsError as X:
                 EdkLogger.error("build", FILE_DELETE_FAILURE, ExtraData=str(X))
         return True
 
-    ## Build a module or platform
+    # Build a module or platform
     #
     # Create autogen code and makefile for a module or platform, and then launch the
     # "make" command to build it
@@ -1423,7 +1541,7 @@ class Build():
             if not self.SkipAutoGen or Target == 'genmake':
                 self.Progress.Start("Generating makefile")
                 AutoGenObject.CreateMakeFile(CreateDepsMakeFile)
-                #AutoGenObject.CreateAsBuiltInf()
+                # AutoGenObject.CreateAsBuiltInf()
                 self.Progress.Stop("done!")
             if Target == "genmake":
                 return True
@@ -1440,14 +1558,16 @@ class Build():
             EdkLogger.error("build", OPTION_MISSING,
                             "No build command found for this module. "
                             "Please check your setting of %s_%s_%s_MAKE_PATH in Conf/tools_def.txt file." %
-                                (AutoGenObject.BuildTarget, AutoGenObject.ToolChain, AutoGenObject.Arch),
+                            (AutoGenObject.BuildTarget,
+                             AutoGenObject.ToolChain, AutoGenObject.Arch),
                             ExtraData=str(AutoGenObject))
 
         # build modules
         if BuildModule:
             if Target != 'fds':
                 BuildCommand = BuildCommand + [Target]
-            AutoGenObject.BuildTime = LaunchCommand(BuildCommand, AutoGenObject.MakeFileDir)
+            AutoGenObject.BuildTime = LaunchCommand(
+                BuildCommand, AutoGenObject.MakeFileDir)
             self.CreateAsBuiltInf()
             if GlobalData.gBinCacheDest:
                 self.GenDestCache()
@@ -1477,42 +1597,46 @@ class Build():
 
         # not build modules
 
-
         # cleanall
         if Target == 'cleanall':
             try:
-                #os.rmdir(AutoGenObject.BuildDir)
+                # os.rmdir(AutoGenObject.BuildDir)
                 RemoveDirectory(AutoGenObject.BuildDir, True)
             except WindowsError as X:
                 EdkLogger.error("build", FILE_DELETE_FAILURE, ExtraData=str(X))
         return True
 
-    ## Rebase module image and Get function address for the input module list.
+    # Rebase module image and Get function address for the input module list.
     #
-    def _RebaseModule (self, MapBuffer, BaseAddress, ModuleList, AddrIsOffset = True, ModeIsSmm = False):
+    def _RebaseModule(self, MapBuffer, BaseAddress, ModuleList, AddrIsOffset=True, ModeIsSmm=False):
         if ModeIsSmm:
             AddrIsOffset = False
         for InfFile in ModuleList:
-            sys.stdout.write (".")
+            sys.stdout.write(".")
             sys.stdout.flush()
             ModuleInfo = ModuleList[InfFile]
             ModuleName = ModuleInfo.BaseName
             ModuleOutputImage = ModuleInfo.Image.FileName
-            ModuleDebugImage  = os.path.join(ModuleInfo.DebugDir, ModuleInfo.BaseName + '.efi')
-            ## for SMM module in SMRAM, the SMRAM will be allocated from base to top.
+            ModuleDebugImage = os.path.join(
+                ModuleInfo.DebugDir, ModuleInfo.BaseName + '.efi')
+            # for SMM module in SMRAM, the SMRAM will be allocated from base to top.
             if not ModeIsSmm:
                 BaseAddress = BaseAddress - ModuleInfo.Image.Size
                 #
                 # Update Image to new BaseAddress by GenFw tool
                 #
-                LaunchCommand(["GenFw", "--rebase", str(BaseAddress), "-r", ModuleOutputImage], ModuleInfo.OutputDir)
-                LaunchCommand(["GenFw", "--rebase", str(BaseAddress), "-r", ModuleDebugImage], ModuleInfo.DebugDir)
+                LaunchCommand(["GenFw", "--rebase", str(BaseAddress),
+                              "-r", ModuleOutputImage], ModuleInfo.OutputDir)
+                LaunchCommand(["GenFw", "--rebase", str(BaseAddress),
+                              "-r", ModuleDebugImage], ModuleInfo.DebugDir)
             else:
                 #
                 # Set new address to the section header only for SMM driver.
                 #
-                LaunchCommand(["GenFw", "--address", str(BaseAddress), "-r", ModuleOutputImage], ModuleInfo.OutputDir)
-                LaunchCommand(["GenFw", "--address", str(BaseAddress), "-r", ModuleDebugImage], ModuleInfo.DebugDir)
+                LaunchCommand(["GenFw", "--address", str(BaseAddress),
+                              "-r", ModuleOutputImage], ModuleInfo.OutputDir)
+                LaunchCommand(["GenFw", "--address", str(BaseAddress),
+                              "-r", ModuleDebugImage], ModuleInfo.DebugDir)
             #
             # Collect function address from Map file
             #
@@ -1522,32 +1646,37 @@ class Build():
                 OrigImageBaseAddress = 0
                 ImageMap = open(ImageMapTable, 'r')
                 for LinStr in ImageMap:
-                    if len (LinStr.strip()) == 0:
+                    if len(LinStr.strip()) == 0:
                         continue
                     #
                     # Get the preferred address set at link time.
                     #
-                    if LinStr.find ('Preferred load address is') != -1:
+                    if LinStr.find('Preferred load address is') != -1:
                         StrList = LinStr.split()
-                        OrigImageBaseAddress = int (StrList[len(StrList) - 1], 16)
+                        OrigImageBaseAddress = int(
+                            StrList[len(StrList) - 1], 16)
 
                     StrList = LinStr.split()
-                    if len (StrList) > 4:
+                    if len(StrList) > 4:
                         if StrList[3] == 'f' or StrList[3] == 'F':
                             Name = StrList[1]
-                            RelativeAddress = int (StrList[2], 16) - OrigImageBaseAddress
-                            FunctionList.append ((Name, RelativeAddress))
+                            RelativeAddress = int(
+                                StrList[2], 16) - OrigImageBaseAddress
+                            FunctionList.append((Name, RelativeAddress))
 
                 ImageMap.close()
             #
             # Add general information.
             #
             if ModeIsSmm:
-                MapBuffer.append('\n\n%s (Fixed SMRAM Offset,   BaseAddress=0x%010X,  EntryPoint=0x%010X)\n' % (ModuleName, BaseAddress, BaseAddress + ModuleInfo.Image.EntryPoint))
+                MapBuffer.append('\n\n%s (Fixed SMRAM Offset,   BaseAddress=0x%010X,  EntryPoint=0x%010X)\n' % (
+                    ModuleName, BaseAddress, BaseAddress + ModuleInfo.Image.EntryPoint))
             elif AddrIsOffset:
-                MapBuffer.append('\n\n%s (Fixed Memory Offset,  BaseAddress=-0x%010X, EntryPoint=-0x%010X)\n' % (ModuleName, 0 - BaseAddress, 0 - (BaseAddress + ModuleInfo.Image.EntryPoint)))
+                MapBuffer.append('\n\n%s (Fixed Memory Offset,  BaseAddress=-0x%010X, EntryPoint=-0x%010X)\n' %
+                                 (ModuleName, 0 - BaseAddress, 0 - (BaseAddress + ModuleInfo.Image.EntryPoint)))
             else:
-                MapBuffer.append('\n\n%s (Fixed Memory Address, BaseAddress=0x%010X,  EntryPoint=0x%010X)\n' % (ModuleName, BaseAddress, BaseAddress + ModuleInfo.Image.EntryPoint))
+                MapBuffer.append('\n\n%s (Fixed Memory Address, BaseAddress=0x%010X,  EntryPoint=0x%010X)\n' % (
+                    ModuleName, BaseAddress, BaseAddress + ModuleInfo.Image.EntryPoint))
             #
             # Add guid and general section information.
             #
@@ -1559,9 +1688,11 @@ class Build():
                 elif SectionHeader[0] in ['.data', '.sdata']:
                     DataSectionAddress = SectionHeader[1]
             if AddrIsOffset:
-                MapBuffer.append('(GUID=%s, .textbaseaddress=-0x%010X, .databaseaddress=-0x%010X)\n' % (ModuleInfo.Guid, 0 - (BaseAddress + TextSectionAddress), 0 - (BaseAddress + DataSectionAddress)))
+                MapBuffer.append('(GUID=%s, .textbaseaddress=-0x%010X, .databaseaddress=-0x%010X)\n' % (
+                    ModuleInfo.Guid, 0 - (BaseAddress + TextSectionAddress), 0 - (BaseAddress + DataSectionAddress)))
             else:
-                MapBuffer.append('(GUID=%s, .textbaseaddress=0x%010X, .databaseaddress=0x%010X)\n' % (ModuleInfo.Guid, BaseAddress + TextSectionAddress, BaseAddress + DataSectionAddress))
+                MapBuffer.append('(GUID=%s, .textbaseaddress=0x%010X, .databaseaddress=0x%010X)\n' % (
+                    ModuleInfo.Guid, BaseAddress + TextSectionAddress, BaseAddress + DataSectionAddress))
             #
             # Add debug image full path.
             #
@@ -1571,9 +1702,11 @@ class Build():
             #
             for Function in FunctionList:
                 if AddrIsOffset:
-                    MapBuffer.append('  -0x%010X    %s\n' % (0 - (BaseAddress + Function[1]), Function[0]))
+                    MapBuffer.append('  -0x%010X    %s\n' %
+                                     (0 - (BaseAddress + Function[1]), Function[0]))
                 else:
-                    MapBuffer.append('  0x%010X    %s\n' % (BaseAddress + Function[1], Function[0]))
+                    MapBuffer.append('  0x%010X    %s\n' %
+                                     (BaseAddress + Function[1], Function[0]))
             ImageMap.close()
 
             #
@@ -1582,9 +1715,9 @@ class Build():
             if ModeIsSmm:
                 BaseAddress = BaseAddress + ModuleInfo.Image.Size
 
-    ## Collect MAP information of all FVs
+    # Collect MAP information of all FVs
     #
-    def _CollectFvMapBuffer (self, MapBuffer, Wa, ModuleList):
+    def _CollectFvMapBuffer(self, MapBuffer, Wa, ModuleList):
         if self.Fdf:
             # First get the XIP base address for FV map file.
             GuidPattern = re.compile("[-a-fA-F0-9]+")
@@ -1594,7 +1727,7 @@ class Build():
                 if not os.path.exists(FvMapBuffer):
                     continue
                 FvMap = open(FvMapBuffer, 'r')
-                #skip FV size information
+                # skip FV size information
                 FvMap.readline()
                 FvMap.readline()
                 FvMap.readline()
@@ -1607,7 +1740,8 @@ class Build():
                         #
                         GuidString = MatchGuid.group()
                         if GuidString.upper() in ModuleList:
-                            Line = Line.replace(GuidString, ModuleList[GuidString.upper()].Name)
+                            Line = Line.replace(
+                                GuidString, ModuleList[GuidString.upper()].Name)
                     MapBuffer.append(Line)
                     #
                     # Add the debug image full path.
@@ -1616,28 +1750,30 @@ class Build():
                     if MatchGuid is not None:
                         GuidString = MatchGuid.group().split("=")[1]
                         if GuidString.upper() in ModuleList:
-                            MapBuffer.append('(IMAGE=%s)\n' % (os.path.join(ModuleList[GuidString.upper()].DebugDir, ModuleList[GuidString.upper()].Name + '.efi')))
+                            MapBuffer.append('(IMAGE=%s)\n' % (os.path.join(ModuleList[GuidString.upper(
+                            )].DebugDir, ModuleList[GuidString.upper()].Name + '.efi')))
 
                 FvMap.close()
 
-    ## Collect MAP information of all modules
+    # Collect MAP information of all modules
     #
-    def _CollectModuleMapBuffer (self, MapBuffer, ModuleList):
-        sys.stdout.write ("Generate Load Module At Fix Address Map")
+    def _CollectModuleMapBuffer(self, MapBuffer, ModuleList):
+        sys.stdout.write("Generate Load Module At Fix Address Map")
         sys.stdout.flush()
         PatchEfiImageList = []
-        PeiModuleList  = {}
-        BtModuleList   = {}
-        RtModuleList   = {}
-        SmmModuleList  = {}
+        PeiModuleList = {}
+        BtModuleList = {}
+        RtModuleList = {}
+        SmmModuleList = {}
         PeiSize = 0
-        BtSize  = 0
-        RtSize  = 0
+        BtSize = 0
+        RtSize = 0
         # reserve 4K size in SMRAM so that SMM module addresses do not start from 0.
         SmmSize = 0x1000
         for ModuleGuid in ModuleList:
             Module = ModuleList[ModuleGuid]
-            GlobalData.gProcessingFile = "%s [%s, %s, %s]" % (Module.MetaFile, Module.Arch, Module.ToolChain, Module.BuildTarget)
+            GlobalData.gProcessingFile = "%s [%s, %s, %s]" % (
+                Module.MetaFile, Module.Arch, Module.ToolChain, Module.BuildTarget)
 
             OutputImageFile = ''
             for ResultFile in Module.CodaTargetList:
@@ -1645,11 +1781,14 @@ class Build():
                     #
                     # module list for PEI, DXE, RUNTIME and SMM
                     #
-                    OutputImageFile = os.path.join(Module.OutputDir, Module.Name + '.efi')
-                    ImageClass = PeImageClass (OutputImageFile)
+                    OutputImageFile = os.path.join(
+                        Module.OutputDir, Module.Name + '.efi')
+                    ImageClass = PeImageClass(OutputImageFile)
                     if not ImageClass.IsValid:
-                        EdkLogger.error("build", FILE_PARSE_FAILURE, ExtraData=ImageClass.ErrorInfo)
-                    ImageInfo = PeImageInfo(Module.Name, Module.Guid, Module.Arch, Module.OutputDir, Module.DebugDir, ImageClass)
+                        EdkLogger.error("build", FILE_PARSE_FAILURE,
+                                        ExtraData=ImageClass.ErrorInfo)
+                    ImageInfo = PeImageInfo(
+                        Module.Name, Module.Guid, Module.Arch, Module.OutputDir, Module.DebugDir, ImageClass)
                     if Module.ModuleType in [SUP_MODULE_PEI_CORE, SUP_MODULE_PEIM, EDK_COMPONENT_TYPE_COMBINED_PEIM_DRIVER, EDK_COMPONENT_TYPE_PIC_PEIM, EDK_COMPONENT_TYPE_RELOCATABLE_PEIM, SUP_MODULE_DXE_CORE]:
                         PeiModuleList[Module.MetaFile] = ImageInfo
                         PeiSize += ImageInfo.Image.Size
@@ -1663,7 +1802,8 @@ class Build():
                         SmmModuleList[Module.MetaFile] = ImageInfo
                         SmmSize += ImageInfo.Image.Size
                         if Module.ModuleType == SUP_MODULE_DXE_SMM_DRIVER:
-                            PiSpecVersion = Module.Module.Specification.get('PI_SPECIFICATION_VERSION', '0x00000000')
+                            PiSpecVersion = Module.Module.Specification.get(
+                                'PI_SPECIFICATION_VERSION', '0x00000000')
                             # for PI specification < PI1.1, DXE_SMM_DRIVER also runs as BOOT time driver.
                             if int(PiSpecVersion, 16) < 0x0001000A:
                                 BtModuleList[Module.MetaFile] = ImageInfo
@@ -1691,7 +1831,7 @@ class Build():
                 # Module includes the patchable load fix address PCDs.
                 # It will be fixed up later.
                 #
-                PatchEfiImageList.append (OutputImageFile)
+                PatchEfiImageList.append(OutputImageFile)
 
         #
         # Get Top Memory address
@@ -1703,7 +1843,8 @@ class Build():
         else:
             TopMemoryAddress = self.LoadFixAddress
             if TopMemoryAddress < RtSize + BtSize + PeiSize:
-                EdkLogger.error("build", PARAMETER_INVALID, "FIX_LOAD_TOP_MEMORY_ADDRESS is too low to load driver")
+                EdkLogger.error("build", PARAMETER_INVALID,
+                                "FIX_LOAD_TOP_MEMORY_ADDRESS is too low to load driver")
 
         #
         # Patch FixAddress related PCDs into EFI image
@@ -1722,37 +1863,50 @@ class Build():
             for PcdInfo in PcdTable:
                 ReturnValue = 0
                 if PcdInfo[0] == TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_PEI_PAGE_SIZE:
-                    ReturnValue, ErrorInfo = PatchBinaryFile (EfiImage, PcdInfo[1], TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_PEI_PAGE_SIZE_DATA_TYPE, str (PeiSize // 0x1000))
+                    ReturnValue, ErrorInfo = PatchBinaryFile(
+                        EfiImage, PcdInfo[1], TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_PEI_PAGE_SIZE_DATA_TYPE, str(PeiSize // 0x1000))
                 elif PcdInfo[0] == TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_DXE_PAGE_SIZE:
-                    ReturnValue, ErrorInfo = PatchBinaryFile (EfiImage, PcdInfo[1], TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_DXE_PAGE_SIZE_DATA_TYPE, str (BtSize // 0x1000))
+                    ReturnValue, ErrorInfo = PatchBinaryFile(
+                        EfiImage, PcdInfo[1], TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_DXE_PAGE_SIZE_DATA_TYPE, str(BtSize // 0x1000))
                 elif PcdInfo[0] == TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_RUNTIME_PAGE_SIZE:
-                    ReturnValue, ErrorInfo = PatchBinaryFile (EfiImage, PcdInfo[1], TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_RUNTIME_PAGE_SIZE_DATA_TYPE, str (RtSize // 0x1000))
-                elif PcdInfo[0] == TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_SMM_PAGE_SIZE and len (SmmModuleList) > 0:
-                    ReturnValue, ErrorInfo = PatchBinaryFile (EfiImage, PcdInfo[1], TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_SMM_PAGE_SIZE_DATA_TYPE, str (SmmSize // 0x1000))
+                    ReturnValue, ErrorInfo = PatchBinaryFile(
+                        EfiImage, PcdInfo[1], TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_RUNTIME_PAGE_SIZE_DATA_TYPE, str(RtSize // 0x1000))
+                elif PcdInfo[0] == TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_SMM_PAGE_SIZE and len(SmmModuleList) > 0:
+                    ReturnValue, ErrorInfo = PatchBinaryFile(
+                        EfiImage, PcdInfo[1], TAB_PCDS_PATCHABLE_LOAD_FIX_ADDRESS_SMM_PAGE_SIZE_DATA_TYPE, str(SmmSize // 0x1000))
                 if ReturnValue != 0:
-                    EdkLogger.error("build", PARAMETER_INVALID, "Patch PCD value failed", ExtraData=ErrorInfo)
+                    EdkLogger.error("build", PARAMETER_INVALID,
+                                    "Patch PCD value failed", ExtraData=ErrorInfo)
 
-        MapBuffer.append('PEI_CODE_PAGE_NUMBER      = 0x%x\n' % (PeiSize // 0x1000))
-        MapBuffer.append('BOOT_CODE_PAGE_NUMBER     = 0x%x\n' % (BtSize // 0x1000))
-        MapBuffer.append('RUNTIME_CODE_PAGE_NUMBER  = 0x%x\n' % (RtSize // 0x1000))
-        if len (SmmModuleList) > 0:
-            MapBuffer.append('SMM_CODE_PAGE_NUMBER      = 0x%x\n' % (SmmSize // 0x1000))
+        MapBuffer.append('PEI_CODE_PAGE_NUMBER      = 0x%x\n' %
+                         (PeiSize // 0x1000))
+        MapBuffer.append('BOOT_CODE_PAGE_NUMBER     = 0x%x\n' %
+                         (BtSize // 0x1000))
+        MapBuffer.append('RUNTIME_CODE_PAGE_NUMBER  = 0x%x\n' %
+                         (RtSize // 0x1000))
+        if len(SmmModuleList) > 0:
+            MapBuffer.append('SMM_CODE_PAGE_NUMBER      = 0x%x\n' %
+                             (SmmSize // 0x1000))
 
         PeiBaseAddr = TopMemoryAddress - RtSize - BtSize
-        BtBaseAddr  = TopMemoryAddress - RtSize
-        RtBaseAddr  = TopMemoryAddress - ReservedRuntimeMemorySize
+        BtBaseAddr = TopMemoryAddress - RtSize
+        RtBaseAddr = TopMemoryAddress - ReservedRuntimeMemorySize
 
-        self._RebaseModule (MapBuffer, PeiBaseAddr, PeiModuleList, TopMemoryAddress == 0)
-        self._RebaseModule (MapBuffer, BtBaseAddr, BtModuleList, TopMemoryAddress == 0)
-        self._RebaseModule (MapBuffer, RtBaseAddr, RtModuleList, TopMemoryAddress == 0)
-        self._RebaseModule (MapBuffer, 0x1000, SmmModuleList, AddrIsOffset=False, ModeIsSmm=True)
+        self._RebaseModule(MapBuffer, PeiBaseAddr,
+                           PeiModuleList, TopMemoryAddress == 0)
+        self._RebaseModule(MapBuffer, BtBaseAddr,
+                           BtModuleList, TopMemoryAddress == 0)
+        self._RebaseModule(MapBuffer, RtBaseAddr,
+                           RtModuleList, TopMemoryAddress == 0)
+        self._RebaseModule(MapBuffer, 0x1000, SmmModuleList,
+                           AddrIsOffset=False, ModeIsSmm=True)
         MapBuffer.append('\n\n')
-        sys.stdout.write ("\n")
+        sys.stdout.write("\n")
         sys.stdout.flush()
 
-    ## Save platform Map file
+    # Save platform Map file
     #
-    def _SaveMapFile (self, MapBuffer, Wa):
+    def _SaveMapFile(self, MapBuffer, Wa):
         #
         # Get the Map file path.
         #
@@ -1762,13 +1916,15 @@ class Build():
         #
         SaveFileOnChange(MapFilePath, ''.join(MapBuffer), False)
         if self.LoadFixAddress != 0:
-            sys.stdout.write ("\nLoad Module At Fix Address Map file can be found at %s\n" % (MapFilePath))
+            sys.stdout.write(
+                "\nLoad Module At Fix Address Map file can be found at %s\n" % (MapFilePath))
         sys.stdout.flush()
 
-    ## Build active platform for different build targets and different tool chains
+    # Build active platform for different build targets and different tool chains
     #
     def _BuildPlatform(self):
-        SaveFileOnChange(self.PlatformBuildPath, '# DO NOT EDIT \n# FILE auto-generated\n', False)
+        SaveFileOnChange(self.PlatformBuildPath,
+                         '# DO NOT EDIT \n# FILE auto-generated\n', False)
         for BuildTarget in self.BuildTargetList:
             GlobalData.gGlobalDefines['TARGET'] = BuildTarget
             index = 0
@@ -1778,22 +1934,22 @@ class Build():
                 GlobalData.gGlobalDefines['FAMILY'] = self.ToolChainFamily[index]
                 index += 1
                 Wa = WorkspaceAutoGen(
-                        self.WorkspaceDir,
-                        self.PlatformFile,
-                        BuildTarget,
-                        ToolChain,
-                        self.ArchList,
-                        self.BuildDatabase,
-                        self.TargetTxt,
-                        self.ToolDef,
-                        self.Fdf,
-                        self.FdList,
-                        self.FvList,
-                        self.CapList,
-                        self.SkuId,
-                        self.UniFlag,
-                        self.Progress
-                        )
+                    self.WorkspaceDir,
+                    self.PlatformFile,
+                    BuildTarget,
+                    ToolChain,
+                    self.ArchList,
+                    self.BuildDatabase,
+                    self.TargetTxt,
+                    self.ToolDef,
+                    self.Fdf,
+                    self.FdList,
+                    self.FvList,
+                    self.CapList,
+                    self.SkuId,
+                    self.UniFlag,
+                    self.Progress
+                )
                 self.Fdf = Wa.FdfFile
                 self.LoadFixAddress = Wa.Platform.LoadFixAddress
                 self.BuildReport.AddPlatformReport(Wa)
@@ -1805,12 +1961,14 @@ class Build():
                     CmdListDict = self._GenFfsCmd(Wa.ArchList)
 
                 for Arch in Wa.ArchList:
-                    PcdMaList    = []
+                    PcdMaList = []
                     GlobalData.gGlobalDefines['ARCH'] = Arch
-                    Pa = PlatformAutoGen(Wa, self.PlatformFile, BuildTarget, ToolChain, Arch)
+                    Pa = PlatformAutoGen(
+                        Wa, self.PlatformFile, BuildTarget, ToolChain, Arch)
                     for Module in Pa.Platform.Modules:
                         # Get ModuleAutoGen object to generate C code file and makefile
-                        Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile,Pa.DataPipe)
+                        Ma = ModuleAutoGen(
+                            Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile, Pa.DataPipe)
                         if Ma is None:
                             continue
                         if Ma.PcdIsDriver:
@@ -1818,9 +1976,11 @@ class Build():
                             Ma.Workspace = Wa
                             PcdMaList.append(Ma)
                         self.BuildModules.append(Ma)
-                    Pa.DataPipe.DataContainer = {"FfsCommand":CmdListDict}
-                    Pa.DataPipe.DataContainer = {"Workspace_timestamp": Wa._SrcTimeStamp}
-                    self._BuildPa(self.Target, Pa, FfsCommand=CmdListDict,PcdMaList=PcdMaList)
+                    Pa.DataPipe.DataContainer = {"FfsCommand": CmdListDict}
+                    Pa.DataPipe.DataContainer = {
+                        "Workspace_timestamp": Wa._SrcTimeStamp}
+                    self._BuildPa(self.Target, Pa,
+                                  FfsCommand=CmdListDict, PcdMaList=PcdMaList)
 
                 # Create MAP file when Load Fix Address is enabled.
                 if self.Target in ["", "all", "fds"]:
@@ -1830,7 +1990,8 @@ class Build():
                         # Check whether the set fix address is above 4G for 32bit image.
                         #
                         if (Arch == 'IA32' or Arch == 'ARM') and self.LoadFixAddress != 0xFFFFFFFFFFFFFFFF and self.LoadFixAddress >= 0x100000000:
-                            EdkLogger.error("build", PARAMETER_INVALID, "FIX_LOAD_TOP_MEMORY_ADDRESS can't be set to larger than or equal to 4G for the platform with IA32 or ARM arch modules")
+                            EdkLogger.error(
+                                "build", PARAMETER_INVALID, "FIX_LOAD_TOP_MEMORY_ADDRESS can't be set to larger than or equal to 4G for the platform with IA32 or ARM arch modules")
                     #
                     # Get Module List
                     #
@@ -1860,10 +2021,10 @@ class Build():
                     #
                     # Save MAP buffer into MAP file.
                     #
-                    self._SaveMapFile (MapBuffer, Wa)
+                    self._SaveMapFile(MapBuffer, Wa)
                 self.CreateGuidedSectionToolsFile(Wa)
 
-    ## Build active module for different build targets, different tool chains and different archs
+    # Build active module for different build targets, different tool chains and different archs
     #
     def _BuildModule(self):
         for BuildTarget in self.BuildTargetList:
@@ -1880,23 +2041,23 @@ class Build():
                 # AutoGen first
                 #
                 Wa = WorkspaceAutoGen(
-                        self.WorkspaceDir,
-                        self.PlatformFile,
-                        BuildTarget,
-                        ToolChain,
-                        self.ArchList,
-                        self.BuildDatabase,
-                        self.TargetTxt,
-                        self.ToolDef,
-                        self.Fdf,
-                        self.FdList,
-                        self.FvList,
-                        self.CapList,
-                        self.SkuId,
-                        self.UniFlag,
-                        self.Progress,
-                        self.ModuleFile
-                        )
+                    self.WorkspaceDir,
+                    self.PlatformFile,
+                    BuildTarget,
+                    ToolChain,
+                    self.ArchList,
+                    self.BuildDatabase,
+                    self.TargetTxt,
+                    self.ToolDef,
+                    self.Fdf,
+                    self.FdList,
+                    self.FvList,
+                    self.CapList,
+                    self.SkuId,
+                    self.UniFlag,
+                    self.Progress,
+                    self.ModuleFile
+                )
                 self.Fdf = Wa.FdfFile
                 self.LoadFixAddress = Wa.Platform.LoadFixAddress
                 Wa.CreateMakeFile(False)
@@ -1912,14 +2073,17 @@ class Build():
                 MaList = []
                 ExitFlag = threading.Event()
                 ExitFlag.clear()
-                self.AutoGenTime += int(round((time.time() - WorkspaceAutoGenTime)))
+                self.AutoGenTime += int(round((time.time() -
+                                        WorkspaceAutoGenTime)))
                 for Arch in Wa.ArchList:
                     AutoGenStart = time.time()
                     GlobalData.gGlobalDefines['ARCH'] = Arch
-                    Pa = PlatformAutoGen(Wa, self.PlatformFile, BuildTarget, ToolChain, Arch)
+                    Pa = PlatformAutoGen(
+                        Wa, self.PlatformFile, BuildTarget, ToolChain, Arch)
                     for Module in Pa.Platform.Modules:
                         if self.ModuleFile.Dir == Module.Dir and self.ModuleFile.Name == Module.Name:
-                            Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile,Pa.DataPipe)
+                            Ma = ModuleAutoGen(
+                                Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile, Pa.DataPipe)
                             if Ma is None:
                                 continue
                             if Ma.PcdIsDriver:
@@ -1945,7 +2109,8 @@ class Build():
                                 if not self.SkipAutoGen or self.Target == 'genmake':
                                     self.Progress.Start("Generating makefile")
                                     if CmdListDict and self.Fdf and (Module.Path, Arch) in CmdListDict:
-                                        Ma.CreateMakeFile(True, CmdListDict[Module.Path, Arch])
+                                        Ma.CreateMakeFile(
+                                            True, CmdListDict[Module.Path, Arch])
                                         del CmdListDict[Module.Path, Arch]
                                     else:
                                         Ma.CreateMakeFile(True)
@@ -1964,22 +2129,26 @@ class Build():
                     MakeStart = time.time()
                     for Ma in self.BuildModules:
                         if not Ma.IsBinaryModule:
-                            Bt = BuildTask.New(ModuleMakeUnit(Ma, Pa.BuildCommand,self.Target))
+                            Bt = BuildTask.New(ModuleMakeUnit(
+                                Ma, Pa.BuildCommand, self.Target))
                         # Break build if any build thread has error
                         if BuildTask.HasError():
                             # we need a full version of makefile for platform
                             ExitFlag.set()
                             BuildTask.WaitForComplete()
                             Pa.CreateMakeFile(False)
-                            EdkLogger.error("build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
+                            EdkLogger.error(
+                                "build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
                         # Start task scheduler
                         if not BuildTask.IsOnGoing():
-                            BuildTask.StartScheduler(self.ThreadNumber, ExitFlag)
+                            BuildTask.StartScheduler(
+                                self.ThreadNumber, ExitFlag)
 
                     # in case there's an interruption. we need a full version of makefile for platform
                     Pa.CreateMakeFile(False)
                     if BuildTask.HasError():
-                        EdkLogger.error("build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
+                        EdkLogger.error(
+                            "build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
                     self.MakeTime += int(round((time.time() - MakeStart)))
 
                 MakeContiue = time.time()
@@ -1995,19 +2164,20 @@ class Build():
                 self.BuildModules = []
                 self.MakeTime += int(round((time.time() - MakeContiue)))
                 if BuildTask.HasError():
-                    EdkLogger.error("build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
+                    EdkLogger.error(
+                        "build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
 
                 self.BuildReport.AddPlatformReport(Wa, MaList)
                 if MaList == []:
                     EdkLogger.error(
-                                'build',
-                                BUILD_ERROR,
-                                "Module for [%s] is not a component of active platform."\
-                                " Please make sure that the ARCH and inf file path are"\
-                                " given in the same as in [%s]" % \
-                                    (', '.join(Wa.ArchList), self.PlatformFile),
-                                ExtraData=self.ModuleFile
-                                )
+                        'build',
+                        BUILD_ERROR,
+                        "Module for [%s] is not a component of active platform."
+                        " Please make sure that the ARCH and inf file path are"
+                        " given in the same as in [%s]" %
+                        (', '.join(Wa.ArchList), self.PlatformFile),
+                        ExtraData=self.ModuleFile
+                    )
                 # Create MAP file when Load Fix Address is enabled.
                 if self.Target == "fds" and self.Fdf:
                     for Arch in Wa.ArchList:
@@ -2015,7 +2185,8 @@ class Build():
                         # Check whether the set fix address is above 4G for 32bit image.
                         #
                         if (Arch == 'IA32' or Arch == 'ARM') and self.LoadFixAddress != 0xFFFFFFFFFFFFFFFF and self.LoadFixAddress >= 0x100000000:
-                            EdkLogger.error("build", PARAMETER_INVALID, "FIX_LOAD_TOP_MEMORY_ADDRESS can't be set to larger than or equal to 4G for the platorm with IA32 or ARM arch modules")
+                            EdkLogger.error(
+                                "build", PARAMETER_INVALID, "FIX_LOAD_TOP_MEMORY_ADDRESS can't be set to larger than or equal to 4G for the platorm with IA32 or ARM arch modules")
                     #
                     # Get Module List
                     #
@@ -2046,19 +2217,22 @@ class Build():
                     #
                     # Save MAP buffer into MAP file.
                     #
-                    self._SaveMapFile (MapBuffer, Wa)
+                    self._SaveMapFile(MapBuffer, Wa)
 
-    def _GenFfsCmd(self,ArchList):
+    def _GenFfsCmd(self, ArchList):
         # convert dictionary of Cmd:(Inf,Arch)
         # to a new dictionary of (Inf,Arch):Cmd,Cmd,Cmd...
         CmdSetDict = defaultdict(set)
-        GenFfsDict = GenFds.GenFfsMakefile('', GlobalData.gFdfParser, self, ArchList, GlobalData)
+        GenFfsDict = GenFds.GenFfsMakefile(
+            '', GlobalData.gFdfParser, self, ArchList, GlobalData)
         for Cmd in GenFfsDict:
             tmpInf, tmpArch = GenFfsDict[Cmd]
             CmdSetDict[tmpInf, tmpArch].add(Cmd)
         return CmdSetDict
+
     def VerifyAutoGenFiles(self):
-        AutoGenIdFile = os.path.join(GlobalData.gConfDirectory,".AutoGenIdFile.txt")
+        AutoGenIdFile = os.path.join(
+            GlobalData.gConfDirectory, ".AutoGenIdFile.txt")
         try:
             with open(AutoGenIdFile) as fd:
                 lines = fd.readlines()
@@ -2073,7 +2247,8 @@ class Build():
                 PlatformGuid = line.split("=")[1].strip()
         GlobalVarList = []
         for arch in ArchList:
-            global_var = os.path.join(BuildDir, "GlobalVar_%s_%s.bin" % (str(PlatformGuid),arch))
+            global_var = os.path.join(
+                BuildDir, "GlobalVar_%s_%s.bin" % (str(PlatformGuid), arch))
             if not os.path.exists(global_var):
                 return None
             GlobalVarList.append(global_var)
@@ -2088,26 +2263,30 @@ class Build():
             workspacedir = data_pipe.Get("P_Info").get("WorkspaceDir")
             PackagesPath = os.getenv("PACKAGES_PATH")
             mws.setWs(workspacedir, PackagesPath)
-            LibraryBuildDirectoryList = data_pipe.Get("LibraryBuildDirectoryList")
-            ModuleBuildDirectoryList = data_pipe.Get("ModuleBuildDirectoryList")
+            LibraryBuildDirectoryList = data_pipe.Get(
+                "LibraryBuildDirectoryList")
+            ModuleBuildDirectoryList = data_pipe.Get(
+                "ModuleBuildDirectoryList")
 
             for m_build_dir in LibraryBuildDirectoryList:
-                if not os.path.exists(os.path.join(m_build_dir,self.MakeFileName)):
+                if not os.path.exists(os.path.join(m_build_dir, self.MakeFileName)):
                     return None
             for m_build_dir in ModuleBuildDirectoryList:
-                if not os.path.exists(os.path.join(m_build_dir,self.MakeFileName)):
+                if not os.path.exists(os.path.join(m_build_dir, self.MakeFileName)):
                     return None
             Wa = WorkSpaceInfo(
-                workspacedir,active_p,target,toolchain,archlist
-                )
-            Pa = PlatformInfo(Wa, active_p, target, toolchain, Arch,data_pipe)
+                workspacedir, active_p, target, toolchain, archlist
+            )
+            Pa = PlatformInfo(Wa, active_p, target, toolchain, Arch, data_pipe)
             Wa.AutoGenObjectList.append(Pa)
         return Wa
-    def SetupMakeSetting(self,Wa):
+
+    def SetupMakeSetting(self, Wa):
         BuildModules = []
         for Pa in Wa.AutoGenObjectList:
             for m in Pa._MbList:
-                ma = ModuleAutoGen(Wa,m.MetaFile, Pa.BuildTarget, Wa.ToolChain, Pa.Arch, Pa.MetaFile,Pa.DataPipe)
+                ma = ModuleAutoGen(Wa, m.MetaFile, Pa.BuildTarget,
+                                   Wa.ToolChain, Pa.Arch, Pa.MetaFile, Pa.DataPipe)
                 BuildModules.append(ma)
         fdf_file = Wa.FlashDefinition
         if fdf_file:
@@ -2119,34 +2298,35 @@ class Build():
                 for FdRegion in FdDict.RegionList:
                     if str(FdRegion.RegionType) == 'FILE' and self.Platform.VpdToolGuid in str(FdRegion.RegionDataList):
                         if int(FdRegion.Offset) % 8 != 0:
-                            EdkLogger.error("build", FORMAT_INVALID, 'The VPD Base Address %s must be 8-byte aligned.' % (FdRegion.Offset))
+                            EdkLogger.error(
+                                "build", FORMAT_INVALID, 'The VPD Base Address %s must be 8-byte aligned.' % (FdRegion.Offset))
             Wa.FdfProfile = Fdf.Profile
             self.Fdf = Fdf
         else:
             self.Fdf = None
         return BuildModules
 
-    ## Build a platform in multi-thread mode
+    # Build a platform in multi-thread mode
     #
-    def PerformAutoGen(self,BuildTarget,ToolChain):
+    def PerformAutoGen(self, BuildTarget, ToolChain):
         WorkspaceAutoGenTime = time.time()
         Wa = WorkspaceAutoGen(
-                self.WorkspaceDir,
-                self.PlatformFile,
-                BuildTarget,
-                ToolChain,
-                self.ArchList,
-                self.BuildDatabase,
-                self.TargetTxt,
-                self.ToolDef,
-                self.Fdf,
-                self.FdList,
-                self.FvList,
-                self.CapList,
-                self.SkuId,
-                self.UniFlag,
-                self.Progress
-                )
+            self.WorkspaceDir,
+            self.PlatformFile,
+            BuildTarget,
+            ToolChain,
+            self.ArchList,
+            self.BuildDatabase,
+            self.TargetTxt,
+            self.ToolDef,
+            self.Fdf,
+            self.FdList,
+            self.FvList,
+            self.CapList,
+            self.SkuId,
+            self.UniFlag,
+            self.Progress
+        )
         self.Fdf = Wa.FdfFile
         self.LoadFixAddress = Wa.Platform.LoadFixAddress
         self.BuildReport.AddPlatformReport(Wa)
@@ -2160,10 +2340,11 @@ class Build():
         self.AutoGenTime += int(round((time.time() - WorkspaceAutoGenTime)))
         BuildModules = []
         for Arch in Wa.ArchList:
-            PcdMaList    = []
+            PcdMaList = []
             AutoGenStart = time.time()
             GlobalData.gGlobalDefines['ARCH'] = Arch
-            Pa = PlatformAutoGen(Wa, self.PlatformFile, BuildTarget, ToolChain, Arch)
+            Pa = PlatformAutoGen(Wa, self.PlatformFile,
+                                 BuildTarget, ToolChain, Arch)
             if Pa is None:
                 continue
             ModuleList = []
@@ -2176,26 +2357,33 @@ class Build():
                     if Inf in Pa.Platform.Modules:
                         continue
                     ModuleList.append(Inf)
-            Pa.DataPipe.DataContainer = {"FfsCommand":CmdListDict}
-            Pa.DataPipe.DataContainer = {"Workspace_timestamp": Wa._SrcTimeStamp}
+            Pa.DataPipe.DataContainer = {"FfsCommand": CmdListDict}
+            Pa.DataPipe.DataContainer = {
+                "Workspace_timestamp": Wa._SrcTimeStamp}
             Pa.DataPipe.DataContainer = {"CommandTarget": self.Target}
             Pa.CreateLibModuelDirs()
             # Fetch the MakeFileName.
             self.MakeFileName = Pa.MakeFileName
 
-            Pa.DataPipe.DataContainer = {"LibraryBuildDirectoryList":Pa.LibraryBuildDirectoryList}
-            Pa.DataPipe.DataContainer = {"ModuleBuildDirectoryList":Pa.ModuleBuildDirectoryList}
-            Pa.DataPipe.DataContainer = {"FdsCommandDict": Wa.GenFdsCommandDict}
+            Pa.DataPipe.DataContainer = {
+                "LibraryBuildDirectoryList": Pa.LibraryBuildDirectoryList}
+            Pa.DataPipe.DataContainer = {
+                "ModuleBuildDirectoryList": Pa.ModuleBuildDirectoryList}
+            Pa.DataPipe.DataContainer = {
+                "FdsCommandDict": Wa.GenFdsCommandDict}
             # Prepare the cache share data for multiprocessing
-            Pa.DataPipe.DataContainer = {"gPlatformHashFile":GlobalData.gPlatformHashFile}
+            Pa.DataPipe.DataContainer = {
+                "gPlatformHashFile": GlobalData.gPlatformHashFile}
             ModuleCodaFile = {}
             for ma in Pa.ModuleAutoGenList:
-                ModuleCodaFile[(ma.MetaFile.File,ma.MetaFile.Root,ma.Arch,ma.MetaFile.Path)] = [item.Target for item in ma.CodaTargetList]
-            Pa.DataPipe.DataContainer = {"ModuleCodaFile":ModuleCodaFile}
+                ModuleCodaFile[(ma.MetaFile.File, ma.MetaFile.Root, ma.Arch, ma.MetaFile.Path)] = [
+                    item.Target for item in ma.CodaTargetList]
+            Pa.DataPipe.DataContainer = {"ModuleCodaFile": ModuleCodaFile}
             # ModuleList contains all driver modules only
             for Module in ModuleList:
                 # Get ModuleAutoGen object to generate C code file and makefile
-                Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile,Pa.DataPipe)
+                Ma = ModuleAutoGen(
+                    Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile, Pa.DataPipe)
                 if Ma is None:
                     continue
                 if Ma.PcdIsDriver:
@@ -2209,16 +2397,18 @@ class Build():
             cqueue = mp.Queue()
             for m in Pa.GetAllModuleInfo:
                 mqueue.put(m)
-                module_file,module_root,module_path,module_basename,\
-                    module_originalpath,module_arch,IsLib = m
-                Ma = ModuleAutoGen(Wa, PathClass(module_path, Wa), BuildTarget,\
-                                  ToolChain, Arch, self.PlatformFile,Pa.DataPipe)
+                module_file, module_root, module_path, module_basename,\
+                    module_originalpath, module_arch, IsLib = m
+                Ma = ModuleAutoGen(Wa, PathClass(module_path, Wa), BuildTarget,
+                                   ToolChain, Arch, self.PlatformFile, Pa.DataPipe)
                 self.AllModules.add(Ma)
-            data_pipe_file = os.path.join(Pa.BuildDir, "GlobalVar_%s_%s.bin" % (str(Pa.Guid),Pa.Arch))
+            data_pipe_file = os.path.join(
+                Pa.BuildDir, "GlobalVar_%s_%s.bin" % (str(Pa.Guid), Pa.Arch))
             Pa.DataPipe.dump(data_pipe_file)
 
-            mqueue.put((None,None,None,None,None,None,None))
-            autogen_rt, errorcode = self.StartAutoGen(mqueue, Pa.DataPipe, self.SkipAutoGen, PcdMaList, cqueue)
+            mqueue.put((None, None, None, None, None, None, None))
+            autogen_rt, errorcode = self.StartAutoGen(
+                mqueue, Pa.DataPipe, self.SkipAutoGen, PcdMaList, cqueue)
 
             if not autogen_rt:
                 self.AutoGenMgr.TerminateWorkers()
@@ -2228,8 +2418,8 @@ class Build():
             if GlobalData.gUseHashCache:
                 for item in GlobalData.gModuleAllCacheStatus:
                     (MetaFilePath, Arch, CacheStr, Status) = item
-                    Ma = ModuleAutoGen(Wa, PathClass(MetaFilePath, Wa), BuildTarget,\
-                                      ToolChain, Arch, self.PlatformFile,Pa.DataPipe)
+                    Ma = ModuleAutoGen(Wa, PathClass(MetaFilePath, Wa), BuildTarget,
+                                       ToolChain, Arch, self.PlatformFile, Pa.DataPipe)
                     if CacheStr == "PreMakeCache" and Status == False:
                         self.PreMakeCacheMiss.add(Ma)
                     if CacheStr == "PreMakeCache" and Status == True:
@@ -2241,8 +2431,9 @@ class Build():
                         self.MakeCacheHit.add(Ma)
                         GlobalData.gModuleCacheHit.add(Ma)
             self.AutoGenTime += int(round((time.time() - AutoGenStart)))
-        AutoGenIdFile = os.path.join(GlobalData.gConfDirectory,".AutoGenIdFile.txt")
-        with open(AutoGenIdFile,"w") as fw:
+        AutoGenIdFile = os.path.join(
+            GlobalData.gConfDirectory, ".AutoGenIdFile.txt")
+        with open(AutoGenIdFile, "w") as fw:
             fw.write("Arch=%s\n" % "|".join((Wa.ArchList)))
             fw.write("BuildDir=%s\n" % Wa.BuildDir)
             fw.write("PlatformGuid=%s\n" % str(Wa.AutoGenObjectList[0].Guid))
@@ -2258,7 +2449,8 @@ class Build():
         return Wa, BuildModules
 
     def _MultiThreadBuildPlatform(self):
-        SaveFileOnChange(self.PlatformBuildPath, '# DO NOT EDIT \n# FILE auto-generated\n', False)
+        SaveFileOnChange(self.PlatformBuildPath,
+                         '# DO NOT EDIT \n# FILE auto-generated\n', False)
         for BuildTarget in self.BuildTargetList:
             GlobalData.gGlobalDefines['TARGET'] = BuildTarget
             index = 0
@@ -2274,41 +2466,50 @@ class Build():
                     Wa = self.VerifyAutoGenFiles()
                     if Wa is None:
                         self.SkipAutoGen = False
-                        Wa, self.BuildModules = self.PerformAutoGen(BuildTarget,ToolChain)
+                        Wa, self.BuildModules = self.PerformAutoGen(
+                            BuildTarget, ToolChain)
                     else:
                         GlobalData.gAutoGenPhase = True
                         self.BuildModules = self.SetupMakeSetting(Wa)
                 else:
-                    Wa, self.BuildModules = self.PerformAutoGen(BuildTarget,ToolChain)
+                    Wa, self.BuildModules = self.PerformAutoGen(
+                        BuildTarget, ToolChain)
                 Pa = Wa.AutoGenObjectList[0]
                 GlobalData.gAutoGenPhase = False
 
                 if GlobalData.gBinCacheSource:
-                    EdkLogger.quiet("[cache Summary]: Total module num: %s" % len(self.AllModules))
-                    EdkLogger.quiet("[cache Summary]: PreMakecache miss num: %s " % len(self.PreMakeCacheMiss))
-                    EdkLogger.quiet("[cache Summary]: Makecache miss num: %s " % len(self.MakeCacheMiss))
+                    EdkLogger.quiet(
+                        "[cache Summary]: Total module num: %s" % len(self.AllModules))
+                    EdkLogger.quiet("[cache Summary]: PreMakecache miss num: %s " % len(
+                        self.PreMakeCacheMiss))
+                    EdkLogger.quiet(
+                        "[cache Summary]: Makecache miss num: %s " % len(self.MakeCacheMiss))
 
                 for Arch in Wa.ArchList:
                     MakeStart = time.time()
                     for Ma in set(self.BuildModules):
                         # Generate build task for the module
                         if not Ma.IsBinaryModule:
-                            Bt = BuildTask.New(ModuleMakeUnit(Ma, Pa.BuildCommand,self.Target))
+                            Bt = BuildTask.New(ModuleMakeUnit(
+                                Ma, Pa.BuildCommand, self.Target))
                         # Break build if any build thread has error
                         if BuildTask.HasError():
                             # we need a full version of makefile for platform
                             ExitFlag.set()
                             BuildTask.WaitForComplete()
                             Pa.CreateMakeFile(False)
-                            EdkLogger.error("build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
+                            EdkLogger.error(
+                                "build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
                         # Start task scheduler
                         if not BuildTask.IsOnGoing():
-                            BuildTask.StartScheduler(self.ThreadNumber, ExitFlag)
+                            BuildTask.StartScheduler(
+                                self.ThreadNumber, ExitFlag)
 
                     # in case there's an interruption. we need a full version of makefile for platform
 
                     if BuildTask.HasError():
-                        EdkLogger.error("build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
+                        EdkLogger.error(
+                            "build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
                     self.MakeTime += int(round((time.time() - MakeStart)))
 
                 MakeContiue = time.time()
@@ -2336,7 +2537,8 @@ class Build():
                 # has been signaled.
                 #
                 if BuildTask.HasError():
-                    EdkLogger.error("build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
+                    EdkLogger.error(
+                        "build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
 
                 # Create MAP file when Load Fix Address is enabled.
                 if self.Target in ["", "all", "fds"]:
@@ -2345,7 +2547,8 @@ class Build():
                         # Check whether the set fix address is above 4G for 32bit image.
                         #
                         if (Arch == 'IA32' or Arch == 'ARM') and self.LoadFixAddress != 0xFFFFFFFFFFFFFFFF and self.LoadFixAddress >= 0x100000000:
-                            EdkLogger.error("build", PARAMETER_INVALID, "FIX_LOAD_TOP_MEMORY_ADDRESS can't be set to larger than or equal to 4G for the platorm with IA32 or ARM arch modules")
+                            EdkLogger.error(
+                                "build", PARAMETER_INVALID, "FIX_LOAD_TOP_MEMORY_ADDRESS can't be set to larger than or equal to 4G for the platorm with IA32 or ARM arch modules")
 
                     #
                     # Rebase module to the preferred memory address before GenFds
@@ -2376,13 +2579,14 @@ class Build():
                     self._SaveMapFile(MapBuffer, Wa)
                 self.CreateGuidedSectionToolsFile(Wa)
 
-    ## GetFreeSizeThreshold()
+    # GetFreeSizeThreshold()
     #
     #   @retval int             Threshold value
     #
     def GetFreeSizeThreshold(self):
         Threshold = None
-        Threshold_Str = GlobalData.gCommandLineDefines.get('FV_SPARE_SPACE_THRESHOLD')
+        Threshold_Str = GlobalData.gCommandLineDefines.get(
+            'FV_SPARE_SPACE_THRESHOLD')
         if Threshold_Str:
             try:
                 if Threshold_Str.lower().startswith('0x'):
@@ -2390,7 +2594,8 @@ class Build():
                 else:
                     Threshold = int(Threshold_Str)
             except:
-                EdkLogger.warn("build", 'incorrect value for FV_SPARE_SPACE_THRESHOLD %s.Only decimal or hex format is allowed.' % Threshold_Str)
+                EdkLogger.warn(
+                    "build", 'incorrect value for FV_SPARE_SPACE_THRESHOLD %s.Only decimal or hex format is allowed.' % Threshold_Str)
         return Threshold
 
     def CheckFreeSizeThreshold(self, Threshold=None, FvDir=None):
@@ -2399,10 +2604,12 @@ class Build():
         if not isinstance(FvDir, str) or not FvDir:
             return
         FdfParserObject = GlobalData.gFdfParser
-        FvRegionNameList = [FvName for FvName in FdfParserObject.Profile.FvDict if FdfParserObject.Profile.FvDict[FvName].FvRegionInFD]
+        FvRegionNameList = [
+            FvName for FvName in FdfParserObject.Profile.FvDict if FdfParserObject.Profile.FvDict[FvName].FvRegionInFD]
         for FvName in FdfParserObject.Profile.FvDict:
             if FvName in FvRegionNameList:
-                FvSpaceInfoFileName = os.path.join(FvDir, FvName.upper() + '.Fv.map')
+                FvSpaceInfoFileName = os.path.join(
+                    FvDir, FvName.upper() + '.Fv.map')
                 if os.path.exists(FvSpaceInfoFileName):
                     FileLinesList = getlines(FvSpaceInfoFileName)
                     for Line in FileLinesList:
@@ -2415,9 +2622,9 @@ class Build():
                                                     FvName, FreeSizeValue, Threshold))
                             break
 
-    ## Generate GuidedSectionTools.txt in the FV directories.
+    # Generate GuidedSectionTools.txt in the FV directories.
     #
-    def CreateGuidedSectionToolsFile(self,Wa):
+    def CreateGuidedSectionToolsFile(self, Wa):
         for BuildTarget in self.BuildTargetList:
             for ToolChain in self.ToolChainList:
                 FvDir = Wa.FvDir
@@ -2434,13 +2641,14 @@ class Build():
                             continue
                         if Platform.Arch != Arch:
                             continue
-                        if hasattr (Platform, 'BuildOption'):
+                        if hasattr(Platform, 'BuildOption'):
                             for Tool in Platform.BuildOption:
                                 if 'GUID' in Platform.BuildOption[Tool]:
                                     if 'PATH' in Platform.BuildOption[Tool]:
                                         value = Platform.BuildOption[Tool]['GUID']
                                         if value in guidList:
-                                            EdkLogger.error("build", FORMAT_INVALID, "Duplicate GUID value %s used with Tool %s in DSC [BuildOptions]." % (value, Tool))
+                                            EdkLogger.error(
+                                                "build", FORMAT_INVALID, "Duplicate GUID value %s used with Tool %s in DSC [BuildOptions]." % (value, Tool))
                                         path = Platform.BuildOption[Tool]['PATH']
                                         guidList.append(value)
                                         guidAttribs.append((value, Tool, path))
@@ -2449,7 +2657,8 @@ class Build():
                                 if 'PATH' in Platform.ToolDefinition[Tool]:
                                     value = Platform.ToolDefinition[Tool]['GUID']
                                     if value in tooldefguidList:
-                                        EdkLogger.error("build", FORMAT_INVALID, "Duplicate GUID value %s used with Tool %s in tools_def.txt." % (value, Tool))
+                                        EdkLogger.error(
+                                            "build", FORMAT_INVALID, "Duplicate GUID value %s used with Tool %s in tools_def.txt." % (value, Tool))
                                     tooldefguidList.append(value)
                                     if value in guidList:
                                         # Already added by platform
@@ -2458,7 +2667,7 @@ class Build():
                                     guidList.append(value)
                                     guidAttribs.append((value, Tool, path))
                     # Sort by GuidTool name
-                    guidAttribs = sorted (guidAttribs, key=lambda x: x[1])
+                    guidAttribs = sorted(guidAttribs, key=lambda x: x[1])
                     # Write out GuidedSecTools.txt
                     toolsFile = os.path.join(FvDir, 'GuidedSectionTools.txt')
                     toolsFile = open(toolsFile, 'wt')
@@ -2466,14 +2675,14 @@ class Build():
                         print(' '.join(guidedSectionTool), file=toolsFile)
                     toolsFile.close()
 
-    ## Returns the real path of the tool.
+    # Returns the real path of the tool.
     #
-    def GetRealPathOfTool (self, tool):
+    def GetRealPathOfTool(self, tool):
         if os.path.exists(tool):
             return os.path.realpath(tool)
         return tool
 
-    ## Launch the module or platform build
+    # Launch the module or platform build
     #
     def Launch(self):
         self.AllDrivers = set()
@@ -2509,7 +2718,7 @@ class Build():
         for Module in self.PreMakeCacheMiss:
             Module.GenPreMakefileHashList()
 
-    ## Do some clean-up works when error occurred
+    # Do some clean-up works when error occurred
     def Relinquish(self):
         OldLogLevel = EdkLogger.GetLevel()
         EdkLogger.SetLevel(EdkLogger.ERROR)
@@ -2518,6 +2727,7 @@ class Build():
             BuildTask.Abort()
         EdkLogger.SetLevel(OldLogLevel)
 
+
 def ParseDefines(DefineList=[]):
     DefineDict = {}
     if DefineList is not None:
@@ -2535,18 +2745,20 @@ def ParseDefines(DefineList=[]):
     return DefineDict
 
 
-
 def LogBuildTime(Time):
     if Time:
         TimeDurStr = ''
         TimeDur = time.gmtime(Time)
         if TimeDur.tm_yday > 1:
-            TimeDurStr = time.strftime("%H:%M:%S", TimeDur) + ", %d day(s)" % (TimeDur.tm_yday - 1)
+            TimeDurStr = time.strftime(
+                "%H:%M:%S", TimeDur) + ", %d day(s)" % (TimeDur.tm_yday - 1)
         else:
             TimeDurStr = time.strftime("%H:%M:%S", TimeDur)
         return TimeDurStr
     else:
         return None
+
+
 def ThreadNum():
     OptionParser = MyOptionParser()
     if not OptionParser.BuildOption and not OptionParser.BuildTarget:
@@ -2556,7 +2768,8 @@ def ThreadNum():
     GlobalData.gCmdConfDir = BuildOption.ConfDirectory
     if ThreadNumber is None:
         TargetObj = TargetTxtDict()
-        ThreadNumber = TargetObj.Target.TargetTxtDictionary[TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER]
+        ThreadNumber = TargetObj.Target.TargetTxtDictionary[
+            TAB_TAT_DEFINES_MAX_CONCURRENT_THREAD_NUMBER]
         if ThreadNumber == '':
             ThreadNumber = 0
         else:
@@ -2568,7 +2781,9 @@ def ThreadNum():
         except (ImportError, NotImplementedError):
             ThreadNumber = 1
     return ThreadNumber
-## Tool entrance method
+
+
+# Tool entrance method
 #
 # This method mainly dispatch specific methods per the command line options.
 # If no error found, return zero value so the caller of this tool can know
@@ -2578,6 +2793,8 @@ def ThreadNum():
 #   @retval 1     Tool failed
 #
 LogQMaxSize = ThreadNum() * 10
+
+
 def Main():
     StartTime = time.time()
 
@@ -2614,7 +2831,7 @@ def Main():
 
     if Option.WarningAsError == True:
         EdkLogger.SetWarningAsError()
-    Log_Agent = LogAgent(LogQ,LogLevel,Option.LogFile)
+    Log_Agent = LogAgent(LogQ, LogLevel, Option.LogFile)
     Log_Agent.start()
 
     if platform.platform().find("Windows") >= 0:
@@ -2623,7 +2840,8 @@ def Main():
         GlobalData.gIsWindows = False
 
     EdkLogger.quiet("Build environment: %s" % platform.platform())
-    EdkLogger.quiet(time.strftime("Build start time: %H:%M:%S, %b.%d %Y\n", time.localtime()));
+    EdkLogger.quiet(time.strftime(
+        "Build start time: %H:%M:%S, %b.%d %Y\n", time.localtime()))
     ReturnCode = 0
     MyBuild = None
     BuildError = True
@@ -2654,7 +2872,8 @@ def Main():
 
         WorkingDirectory = os.getcwd()
         if not Option.ModuleFile:
-            FileList = glob.glob(os.path.normpath(os.path.join(WorkingDirectory, '*.inf')))
+            FileList = glob.glob(os.path.normpath(
+                os.path.join(WorkingDirectory, '*.inf')))
             FileNum = len(FileList)
             if FileNum >= 2:
                 EdkLogger.error("build", OPTION_NOT_SUPPORTED, "There are %d INF files in %s." % (FileNum, WorkingDirectory),
@@ -2663,33 +2882,37 @@ def Main():
                 Option.ModuleFile = NormFile(FileList[0], Workspace)
 
         if Option.ModuleFile:
-            if os.path.isabs (Option.ModuleFile):
-                if os.path.normcase (os.path.normpath(Option.ModuleFile)).find (Workspace) == 0:
-                    Option.ModuleFile = NormFile(os.path.normpath(Option.ModuleFile), Workspace)
+            if os.path.isabs(Option.ModuleFile):
+                if os.path.normcase(os.path.normpath(Option.ModuleFile)).find(Workspace) == 0:
+                    Option.ModuleFile = NormFile(
+                        os.path.normpath(Option.ModuleFile), Workspace)
             Option.ModuleFile = PathClass(Option.ModuleFile, Workspace)
             ErrorCode, ErrorInfo = Option.ModuleFile.Validate(".inf", False)
             if ErrorCode != 0:
                 EdkLogger.error("build", ErrorCode, ExtraData=ErrorInfo)
 
         if Option.PlatformFile is not None:
-            if os.path.isabs (Option.PlatformFile):
-                if os.path.normcase (os.path.normpath(Option.PlatformFile)).find (Workspace) == 0:
-                    Option.PlatformFile = NormFile(os.path.normpath(Option.PlatformFile), Workspace)
+            if os.path.isabs(Option.PlatformFile):
+                if os.path.normcase(os.path.normpath(Option.PlatformFile)).find(Workspace) == 0:
+                    Option.PlatformFile = NormFile(
+                        os.path.normpath(Option.PlatformFile), Workspace)
             Option.PlatformFile = PathClass(Option.PlatformFile, Workspace)
 
         if Option.FdfFile is not None:
-            if os.path.isabs (Option.FdfFile):
-                if os.path.normcase (os.path.normpath(Option.FdfFile)).find (Workspace) == 0:
-                    Option.FdfFile = NormFile(os.path.normpath(Option.FdfFile), Workspace)
+            if os.path.isabs(Option.FdfFile):
+                if os.path.normcase(os.path.normpath(Option.FdfFile)).find(Workspace) == 0:
+                    Option.FdfFile = NormFile(
+                        os.path.normpath(Option.FdfFile), Workspace)
             Option.FdfFile = PathClass(Option.FdfFile, Workspace)
             ErrorCode, ErrorInfo = Option.FdfFile.Validate(".fdf", False)
             if ErrorCode != 0:
                 EdkLogger.error("build", ErrorCode, ExtraData=ErrorInfo)
 
         if Option.Flag is not None and Option.Flag not in ['-c', '-s']:
-            EdkLogger.error("build", OPTION_VALUE_INVALID, "UNI flag must be one of -c or -s")
+            EdkLogger.error("build", OPTION_VALUE_INVALID,
+                            "UNI flag must be one of -c or -s")
 
-        MyBuild = Build(Target, Workspace, Option,LogQ)
+        MyBuild = Build(Target, Workspace, Option, LogQ)
         GlobalData.gCommandLineDefines['ARCH'] = ' '.join(MyBuild.ArchList)
         if not (MyBuild.LaunchPrebuildFlag and os.path.exists(MyBuild.PlatformBuildPath)):
             MyBuild.Launch()
@@ -2703,7 +2926,8 @@ def Main():
             # for multi-thread build exits safely
             MyBuild.Relinquish()
         if Option is not None and Option.debug is not None:
-            EdkLogger.quiet("(Python %s on %s) " % (platform.python_version(), sys.platform) + traceback.format_exc())
+            EdkLogger.quiet("(Python %s on %s) " % (
+                platform.python_version(), sys.platform) + traceback.format_exc())
         ReturnCode = X.args[0]
     except Warning as X:
         # error from Fdf parser
@@ -2711,9 +2935,11 @@ def Main():
             # for multi-thread build exits safely
             MyBuild.Relinquish()
         if Option is not None and Option.debug is not None:
-            EdkLogger.quiet("(Python %s on %s) " % (platform.python_version(), sys.platform) + traceback.format_exc())
+            EdkLogger.quiet("(Python %s on %s) " % (
+                platform.python_version(), sys.platform) + traceback.format_exc())
         else:
-            EdkLogger.error(X.ToolName, FORMAT_INVALID, File=X.FileName, Line=X.LineNumber, ExtraData=X.Message, RaiseError=False)
+            EdkLogger.error(X.ToolName, FORMAT_INVALID, File=X.FileName,
+                            Line=X.LineNumber, ExtraData=X.Message, RaiseError=False)
         ReturnCode = FORMAT_INVALID
     except KeyboardInterrupt:
         if MyBuild is not None:
@@ -2722,7 +2948,8 @@ def Main():
             MyBuild.Relinquish()
         ReturnCode = ABORT_ERROR
         if Option is not None and Option.debug is not None:
-            EdkLogger.quiet("(Python %s on %s) " % (platform.python_version(), sys.platform) + traceback.format_exc())
+            EdkLogger.quiet("(Python %s on %s) " % (
+                platform.python_version(), sys.platform) + traceback.format_exc())
     except:
         if MyBuild is not None:
             # for multi-thread build exits safely
@@ -2736,13 +2963,14 @@ def Main():
                 MetaFile = Tb.tb_frame.f_locals['self'].MetaFile
             Tb = Tb.tb_next
         EdkLogger.error(
-                    "\nbuild",
-                    CODE_ERROR,
-                    "Unknown fatal error when processing [%s]" % MetaFile,
-                    ExtraData="\n(Please send email to %s for help, attaching following call stack trace!)\n" % MSG_EDKII_MAIL_ADDR,
-                    RaiseError=False
-                    )
-        EdkLogger.quiet("(Python %s on %s) " % (platform.python_version(), sys.platform) + traceback.format_exc())
+            "\nbuild",
+            CODE_ERROR,
+            "Unknown fatal error when processing [%s]" % MetaFile,
+            ExtraData="\n(Please send email to %s for help, attaching following call stack trace!)\n" % MSG_EDKII_MAIL_ADDR,
+            RaiseError=False
+        )
+        EdkLogger.quiet("(Python %s on %s) " % (
+            platform.python_version(), sys.platform) + traceback.format_exc())
         ReturnCode = CODE_ERROR
     finally:
         Utils.Progressor.Abort()
@@ -2763,27 +2991,32 @@ def Main():
     BuildDuration = time.gmtime(int(round(FinishTime - StartTime)))
     BuildDurationStr = ""
     if BuildDuration.tm_yday > 1:
-        BuildDurationStr = time.strftime("%H:%M:%S", BuildDuration) + ", %d day(s)" % (BuildDuration.tm_yday - 1)
+        BuildDurationStr = time.strftime(
+            "%H:%M:%S", BuildDuration) + ", %d day(s)" % (BuildDuration.tm_yday - 1)
     else:
         BuildDurationStr = time.strftime("%H:%M:%S", BuildDuration)
     if MyBuild is not None:
         if not BuildError:
-            MyBuild.BuildReport.GenerateReport(BuildDurationStr, LogBuildTime(MyBuild.AutoGenTime), LogBuildTime(MyBuild.MakeTime), LogBuildTime(MyBuild.GenFdsTime))
+            MyBuild.BuildReport.GenerateReport(BuildDurationStr, LogBuildTime(
+                MyBuild.AutoGenTime), LogBuildTime(MyBuild.MakeTime), LogBuildTime(MyBuild.GenFdsTime))
 
     EdkLogger.SetLevel(EdkLogger.QUIET)
     EdkLogger.quiet("\n- %s -" % Conclusion)
-    EdkLogger.quiet(time.strftime("Build end time: %H:%M:%S, %b.%d %Y", time.localtime()))
+    EdkLogger.quiet(time.strftime(
+        "Build end time: %H:%M:%S, %b.%d %Y", time.localtime()))
     EdkLogger.quiet("Build total time: %s\n" % BuildDurationStr)
     Log_Agent.kill()
     Log_Agent.join()
     return ReturnCode
 
+
 if __name__ == '__main__':
     try:
         mp.set_start_method('spawn')
     except:
         pass
     r = Main()
-    ## 0-127 is a safe return range, and 1 is a standard default error
-    if r < 0 or r > 127: r = 1
+    # 0-127 is a safe return range, and 1 is a standard default error
+    if r < 0 or r > 127:
+        r = 1
     sys.exit(r)
diff --git a/BaseTools/Source/Python/build/buildoptions.py b/BaseTools/Source/Python/build/buildoptions.py
index 39d92cff209d..c662b833c049 100644
--- a/BaseTools/Source/Python/build/buildoptions.py
+++ b/BaseTools/Source/Python/build/buildoptions.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # build a platform or a module
 #
 #  Copyright (c) 2014, Hewlett-Packard Development Company, L.P.<BR>
@@ -16,12 +16,15 @@ __version__ = "%prog Version " + VersionNumber
 __copyright__ = "Copyright (c) 2007 - 2018, Intel Corporation  All rights reserved."
 
 gParamCheck = []
+
+
 def SingleCheckCallback(option, opt_str, value, parser):
     if option not in gParamCheck:
         setattr(parser.values, option.dest, value)
         gParamCheck.append(option)
     else:
-        parser.error("Option %s only allows one instance in command line!" % option)
+        parser.error(
+            "Option %s only allows one instance in command line!" % option)
 
 
 class MyOptionParser():
@@ -39,67 +42,89 @@ class MyOptionParser():
             self.BuildTarget = None
 
     def GetOption(self):
-        Parser = OptionParser(description=__copyright__, version=__version__, prog="build.exe", usage="%prog [options] [all|fds|genc|genmake|clean|cleanall|cleanlib|modules|libraries|run]")
+        Parser = OptionParser(description=__copyright__, version=__version__, prog="build.exe",
+                              usage="%prog [options] [all|fds|genc|genmake|clean|cleanall|cleanlib|modules|libraries|run]")
         Parser.add_option("-a", "--arch", action="append", dest="TargetArch",
-            help="ARCHS is one of list: IA32, X64, ARM, AARCH64, RISCV64 or EBC, which overrides target.txt's TARGET_ARCH definition. To specify more archs, please repeat this option.")
+                          help="ARCHS is one of list: IA32, X64, ARM, AARCH64, RISCV64 or EBC, which overrides target.txt's TARGET_ARCH definition. To specify more archs, please repeat this option.")
         Parser.add_option("-p", "--platform", action="callback", type="string", dest="PlatformFile", callback=SingleCheckCallback,
-            help="Build the platform specified by the DSC file name argument, overriding target.txt's ACTIVE_PLATFORM definition.")
+                          help="Build the platform specified by the DSC file name argument, overriding target.txt's ACTIVE_PLATFORM definition.")
         Parser.add_option("-m", "--module", action="callback", type="string", dest="ModuleFile", callback=SingleCheckCallback,
-            help="Build the module specified by the INF file name argument.")
+                          help="Build the module specified by the INF file name argument.")
         Parser.add_option("-b", "--buildtarget", type="string", dest="BuildTarget", help="Using the TARGET to build the platform, overriding target.txt's TARGET definition.",
                           action="append")
         Parser.add_option("-t", "--tagname", action="append", type="string", dest="ToolChain",
-            help="Using the Tool Chain Tagname to build the platform, overriding target.txt's TOOL_CHAIN_TAG definition.")
+                          help="Using the Tool Chain Tagname to build the platform, overriding target.txt's TOOL_CHAIN_TAG definition.")
         Parser.add_option("-x", "--sku-id", action="callback", type="string", dest="SkuId", callback=SingleCheckCallback,
-            help="Using this name of SKU ID to build the platform, overriding SKUID_IDENTIFIER in DSC file.")
+                          help="Using this name of SKU ID to build the platform, overriding SKUID_IDENTIFIER in DSC file.")
 
         Parser.add_option("-n", action="callback", type="int", dest="ThreadNumber", callback=SingleCheckCallback,
-            help="Build the platform using multi-threaded compiler. The value overrides target.txt's MAX_CONCURRENT_THREAD_NUMBER. When value is set to 0, tool automatically detect number of "\
-                 "processor threads, set value to 1 means disable multi-thread build, and set value to more than 1 means user specify the threads number to build.")
+                          help="Build the platform using multi-threaded compiler. The value overrides target.txt's MAX_CONCURRENT_THREAD_NUMBER. When value is set to 0, tool automatically detect number of "
+                          "processor threads, set value to 1 means disable multi-thread build, and set value to more than 1 means user specify the threads number to build.")
 
         Parser.add_option("-f", "--fdf", action="callback", type="string", dest="FdfFile", callback=SingleCheckCallback,
-            help="The name of the FDF file to use, which overrides the setting in the DSC file.")
+                          help="The name of the FDF file to use, which overrides the setting in the DSC file.")
         Parser.add_option("-r", "--rom-image", action="append", type="string", dest="RomImage", default=[],
-            help="The name of FD to be generated. The name must be from [FD] section in FDF file.")
+                          help="The name of FD to be generated. The name must be from [FD] section in FDF file.")
         Parser.add_option("-i", "--fv-image", action="append", type="string", dest="FvImage", default=[],
-            help="The name of FV to be generated. The name must be from [FV] section in FDF file.")
+                          help="The name of FV to be generated. The name must be from [FV] section in FDF file.")
         Parser.add_option("-C", "--capsule-image", action="append", type="string", dest="CapName", default=[],
-            help="The name of Capsule to be generated. The name must be from [Capsule] section in FDF file.")
-        Parser.add_option("-u", "--skip-autogen", action="store_true", dest="SkipAutoGen", help="Skip AutoGen step.")
-        Parser.add_option("-e", "--re-parse", action="store_true", dest="Reparse", help="Re-parse all meta-data files.")
+                          help="The name of Capsule to be generated. The name must be from [Capsule] section in FDF file.")
+        Parser.add_option("-u", "--skip-autogen", action="store_true",
+                          dest="SkipAutoGen", help="Skip AutoGen step.")
+        Parser.add_option("-e", "--re-parse", action="store_true",
+                          dest="Reparse", help="Re-parse all meta-data files.")
 
-        Parser.add_option("-c", "--case-insensitive", action="store_true", dest="CaseInsensitive", default=False, help="Don't check case of file name.")
+        Parser.add_option("-c", "--case-insensitive", action="store_true",
+                          dest="CaseInsensitive", default=False, help="Don't check case of file name.")
 
-        Parser.add_option("-w", "--warning-as-error", action="store_true", dest="WarningAsError", help="Treat warning in tools as error.")
-        Parser.add_option("-j", "--log", action="store", dest="LogFile", help="Put log in specified file as well as on console.")
+        Parser.add_option("-w", "--warning-as-error", action="store_true",
+                          dest="WarningAsError", help="Treat warning in tools as error.")
+        Parser.add_option("-j", "--log", action="store", dest="LogFile",
+                          help="Put log in specified file as well as on console.")
 
         Parser.add_option("-s", "--silent", action="store_true", type=None, dest="SilentMode",
-            help="Make use of silent mode of (n)make.")
-        Parser.add_option("-q", "--quiet", action="store_true", type=None, help="Disable all messages except FATAL ERRORS.")
-        Parser.add_option("-v", "--verbose", action="store_true", type=None, help="Turn on verbose output with informational messages printed, "\
-                                                                                   "including library instances selected, final dependency expression, "\
-                                                                                   "and warning messages, etc.")
-        Parser.add_option("-d", "--debug", action="store", type="int", help="Enable debug messages at specified level.")
-        Parser.add_option("-D", "--define", action="append", type="string", dest="Macros", help="Macro: \"Name [= Value]\".")
+                          help="Make use of silent mode of (n)make.")
+        Parser.add_option("-q", "--quiet", action="store_true",
+                          type=None, help="Disable all messages except FATAL ERRORS.")
+        Parser.add_option("-v", "--verbose", action="store_true", type=None, help="Turn on verbose output with informational messages printed, "
+                          "including library instances selected, final dependency expression, "
+                          "and warning messages, etc.")
+        Parser.add_option("-d", "--debug", action="store", type="int",
+                          help="Enable debug messages at specified level.")
+        Parser.add_option("-D", "--define", action="append", type="string",
+                          dest="Macros", help="Macro: \"Name [= Value]\".")
 
-        Parser.add_option("-y", "--report-file", action="store", dest="ReportFile", help="Create/overwrite the report to the specified filename.")
+        Parser.add_option("-y", "--report-file", action="store", dest="ReportFile",
+                          help="Create/overwrite the report to the specified filename.")
         Parser.add_option("-Y", "--report-type", action="append", type="choice", choices=['PCD', 'LIBRARY', 'FLASH', 'DEPEX', 'BUILD_FLAGS', 'FIXED_ADDRESS', 'HASH', 'EXECUTION_ORDER'], dest="ReportType", default=[],
-            help="Flags that control the type of build report to generate.  Must be one of: [PCD, LIBRARY, FLASH, DEPEX, BUILD_FLAGS, FIXED_ADDRESS, HASH, EXECUTION_ORDER].  "\
-                 "To specify more than one flag, repeat this option on the command line and the default flag set is [PCD, LIBRARY, FLASH, DEPEX, HASH, BUILD_FLAGS, FIXED_ADDRESS]")
+                          help="Flags that control the type of build report to generate.  Must be one of: [PCD, LIBRARY, FLASH, DEPEX, BUILD_FLAGS, FIXED_ADDRESS, HASH, EXECUTION_ORDER].  "
+                          "To specify more than one flag, repeat this option on the command line and the default flag set is [PCD, LIBRARY, FLASH, DEPEX, HASH, BUILD_FLAGS, FIXED_ADDRESS]")
         Parser.add_option("-F", "--flag", action="store", type="string", dest="Flag",
-            help="Specify the specific option to parse EDK UNI file. Must be one of: [-c, -s]. -c is for EDK framework UNI file, and -s is for EDK UEFI UNI file. "\
-                 "This option can also be specified by setting *_*_*_BUILD_FLAGS in [BuildOptions] section of platform DSC. If they are both specified, this value "\
-                 "will override the setting in [BuildOptions] section of platform DSC.")
-        Parser.add_option("-N", "--no-cache", action="store_true", dest="DisableCache", default=False, help="Disable build cache mechanism")
-        Parser.add_option("--conf", action="store", type="string", dest="ConfDirectory", help="Specify the customized Conf directory.")
-        Parser.add_option("--check-usage", action="store_true", dest="CheckUsage", default=False, help="Check usage content of entries listed in INF file.")
-        Parser.add_option("--ignore-sources", action="store_true", dest="IgnoreSources", default=False, help="Focus to a binary build and ignore all source files")
-        Parser.add_option("--pcd", action="append", dest="OptionPcd", help="Set PCD value by command line. Format: \"PcdName=Value\" ")
-        Parser.add_option("-l", "--cmd-len", action="store", type="int", dest="CommandLength", help="Specify the maximum line length of build command. Default is 4096.")
-        Parser.add_option("--hash", action="store_true", dest="UseHashCache", default=False, help="Enable hash-based caching during build process.")
-        Parser.add_option("--binary-destination", action="store", type="string", dest="BinCacheDest", help="Generate a cache of binary files in the specified directory.")
-        Parser.add_option("--binary-source", action="store", type="string", dest="BinCacheSource", help="Consume a cache of binary files from the specified directory.")
-        Parser.add_option("--genfds-multi-thread", action="store_true", dest="GenfdsMultiThread", default=True, help="Enable GenFds multi thread to generate ffs file.")
-        Parser.add_option("--no-genfds-multi-thread", action="store_true", dest="NoGenfdsMultiThread", default=False, help="Disable GenFds multi thread to generate ffs file.")
-        Parser.add_option("--disable-include-path-check", action="store_true", dest="DisableIncludePathCheck", default=False, help="Disable the include path check for outside of package.")
+                          help="Specify the specific option to parse EDK UNI file. Must be one of: [-c, -s]. -c is for EDK framework UNI file, and -s is for EDK UEFI UNI file. "
+                          "This option can also be specified by setting *_*_*_BUILD_FLAGS in [BuildOptions] section of platform DSC. If they are both specified, this value "
+                          "will override the setting in [BuildOptions] section of platform DSC.")
+        Parser.add_option("-N", "--no-cache", action="store_true",
+                          dest="DisableCache", default=False, help="Disable build cache mechanism")
+        Parser.add_option("--conf", action="store", type="string",
+                          dest="ConfDirectory", help="Specify the customized Conf directory.")
+        Parser.add_option("--check-usage", action="store_true", dest="CheckUsage",
+                          default=False, help="Check usage content of entries listed in INF file.")
+        Parser.add_option("--ignore-sources", action="store_true", dest="IgnoreSources",
+                          default=False, help="Focus to a binary build and ignore all source files")
+        Parser.add_option("--pcd", action="append", dest="OptionPcd",
+                          help="Set PCD value by command line. Format: \"PcdName=Value\" ")
+        Parser.add_option("-l", "--cmd-len", action="store", type="int", dest="CommandLength",
+                          help="Specify the maximum line length of build command. Default is 4096.")
+        Parser.add_option("--hash", action="store_true", dest="UseHashCache",
+                          default=False, help="Enable hash-based caching during build process.")
+        Parser.add_option("--binary-destination", action="store", type="string", dest="BinCacheDest",
+                          help="Generate a cache of binary files in the specified directory.")
+        Parser.add_option("--binary-source", action="store", type="string", dest="BinCacheSource",
+                          help="Consume a cache of binary files from the specified directory.")
+        Parser.add_option("--genfds-multi-thread", action="store_true", dest="GenfdsMultiThread",
+                          default=True, help="Enable GenFds multi thread to generate ffs file.")
+        Parser.add_option("--no-genfds-multi-thread", action="store_true", dest="NoGenfdsMultiThread",
+                          default=False, help="Disable GenFds multi thread to generate ffs file.")
+        Parser.add_option("--disable-include-path-check", action="store_true", dest="DisableIncludePathCheck",
+                          default=False, help="Disable the include path check for outside of package.")
         self.BuildOption, self.BuildTarget = Parser.parse_args()
diff --git a/BaseTools/Source/Python/sitecustomize.py b/BaseTools/Source/Python/sitecustomize.py
index 50783e1b3af0..a0bc2546bc92 100644
--- a/BaseTools/Source/Python/sitecustomize.py
+++ b/BaseTools/Source/Python/sitecustomize.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #
 #
 # Copyright (c) 2009 - 2014, Apple Inc. All rights reserved.<BR>
@@ -8,8 +8,7 @@ import sys
 import locale
 
 if sys.platform == "darwin" and sys.version_info[0] < 3:
-  DefaultLocal = locale.getdefaultlocale()[1]
-  if DefaultLocal is None:
-    DefaultLocal = 'UTF8'
-  sys.setdefaultencoding(DefaultLocal)
-
+    DefaultLocal = locale.getdefaultlocale()[1]
+    if DefaultLocal is None:
+        DefaultLocal = 'UTF8'
+    sys.setdefaultencoding(DefaultLocal)
diff --git a/BaseTools/Source/Python/tests/Split/test_split.py b/BaseTools/Source/Python/tests/Split/test_split.py
index e4866be390b3..4a7a23ba6464 100644
--- a/BaseTools/Source/Python/tests/Split/test_split.py
+++ b/BaseTools/Source/Python/tests/Split/test_split.py
@@ -59,15 +59,15 @@ class TestSplit(unittest.TestCase):
             "Binary.bin",
             "Binary1.bin",
             r"output/Binary1.bin",
-            os.path.abspath( r"output/Binary1.bin")
-            ]
+            os.path.abspath(r"output/Binary1.bin")
+        ]
         expected_output = [
-            os.path.join(os.path.dirname(self.binary_file),"Binary.bin1" ),
-            os.path.join(os.getcwd(),"Binary.bin"),
-            os.path.join(os.getcwd(),"Binary1.bin"),
-            os.path.join(os.getcwd(),r"output/Binary1.bin"),
-            os.path.join(os.path.abspath( r"output/Binary1.bin"))
-            ]
+            os.path.join(os.path.dirname(self.binary_file), "Binary.bin1"),
+            os.path.join(os.getcwd(), "Binary.bin"),
+            os.path.join(os.getcwd(), "Binary1.bin"),
+            os.path.join(os.getcwd(), r"output/Binary1.bin"),
+            os.path.join(os.path.abspath(r"output/Binary1.bin"))
+        ]
         for index, o in enumerate(output):
             try:
                 sp.splitFile(self.binary_file, 123, outputfile1=o)
@@ -84,26 +84,27 @@ class TestSplit(unittest.TestCase):
             r"output1/output2",
             os.path.abspath("output"),
             "output"
-            ]
+        ]
         output = [
             None,
             None,
             "Binary1.bin",
             r"output/Binary1.bin",
-            os.path.abspath( r"output_1/Binary1.bin")
-            ]
+            os.path.abspath(r"output_1/Binary1.bin")
+        ]
 
         expected_output = [
-            os.path.join(os.path.dirname(self.binary_file),"Binary.bin1" ),
-            os.path.join(os.getcwd(),"output", "Binary.bin1"),
-            os.path.join(os.getcwd(), r"output1/output2" , "Binary1.bin"),
-            os.path.join(os.getcwd(),r"output", "output/Binary1.bin"),
-            os.path.join(os.path.abspath( r"output/Binary1.bin"))
-            ]
+            os.path.join(os.path.dirname(self.binary_file), "Binary.bin1"),
+            os.path.join(os.getcwd(), "output", "Binary.bin1"),
+            os.path.join(os.getcwd(), r"output1/output2", "Binary1.bin"),
+            os.path.join(os.getcwd(), r"output", "output/Binary1.bin"),
+            os.path.join(os.path.abspath(r"output/Binary1.bin"))
+        ]
 
         for index, o in enumerate(outputfolder):
             try:
-                sp.splitFile(self.binary_file, 123, outputdir=o,outputfile1=output[index])
+                sp.splitFile(self.binary_file, 123, outputdir=o,
+                             outputfile1=output[index])
             except Exception as e:
                 self.assertTrue(False, msg="splitFile function error")
 
diff --git a/BaseTools/Tests/CToolsTests.py b/BaseTools/Tests/CToolsTests.py
index 2bc1b62d40ba..b487dc4f7f05 100644
--- a/BaseTools/Tests/CToolsTests.py
+++ b/BaseTools/Tests/CToolsTests.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Unit tests for C based BaseTools
 #
 #  Copyright (c) 2008, Intel Corporation. All rights reserved.<BR>
@@ -16,14 +16,14 @@ import unittest
 import TianoCompress
 modules = (
     TianoCompress,
-    )
+)
 
 
 def TheTestSuite():
     suites = list(map(lambda module: module.TheTestSuite(), modules))
     return unittest.TestSuite(suites)
 
+
 if __name__ == '__main__':
     allTests = TheTestSuite()
     unittest.TextTestRunner().run(allTests)
-
diff --git a/BaseTools/Tests/CheckPythonSyntax.py b/BaseTools/Tests/CheckPythonSyntax.py
index 099920721f5a..cf37e452d8f3 100644
--- a/BaseTools/Tests/CheckPythonSyntax.py
+++ b/BaseTools/Tests/CheckPythonSyntax.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #  Unit tests for checking syntax of Python source code
 #
 #  Copyright (c) 2009 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -15,6 +15,7 @@ import py_compile
 
 import TestTools
 
+
 class Tests(TestTools.BaseToolsTest):
 
     def setUp(self):
@@ -26,6 +27,7 @@ class Tests(TestTools.BaseToolsTest):
         except Exception as e:
             self.fail('syntax error: %s, Error is %s' % (filename, str(e)))
 
+
 def MakePythonSyntaxCheckTests():
     def GetAllPythonSourceFiles():
         pythonSourceFiles = []
@@ -33,8 +35,8 @@ def MakePythonSyntaxCheckTests():
             for filename in files:
                 if filename.lower().endswith('.py'):
                     pythonSourceFiles.append(
-                            os.path.join(root, filename)
-                        )
+                        os.path.join(root, filename)
+                    )
         return pythonSourceFiles
 
     def MakeTestName(filename):
@@ -46,16 +48,17 @@ def MakePythonSyntaxCheckTests():
 
     def MakeNewTest(filename):
         test = MakeTestName(filename)
-        newmethod = lambda self: self.SingleFileTest(filename)
+        def newmethod(self): return self.SingleFileTest(filename)
         setattr(
             Tests,
             test,
             newmethod
-            )
+        )
 
     for filename in GetAllPythonSourceFiles():
         MakeNewTest(filename)
 
+
 MakePythonSyntaxCheckTests()
 del MakePythonSyntaxCheckTests
 
@@ -64,5 +67,3 @@ TheTestSuite = TestTools.MakeTheTestSuite(locals())
 if __name__ == '__main__':
     allTests = TheTestSuite()
     unittest.TextTestRunner().run(allTests)
-
-
diff --git a/BaseTools/Tests/CheckUnicodeSourceFiles.py b/BaseTools/Tests/CheckUnicodeSourceFiles.py
index 1502402619e1..ede49af0e2b1 100644
--- a/BaseTools/Tests/CheckUnicodeSourceFiles.py
+++ b/BaseTools/Tests/CheckUnicodeSourceFiles.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 #  Unit tests for AutoGen.UniClassObject
 #
 #  Copyright (c) 2015, Intel Corporation. All rights reserved.<BR>
@@ -22,6 +22,7 @@ import AutoGen.UniClassObject as BtUni
 from Common import EdkLogger
 EdkLogger.InitializeForUnitTest()
 
+
 class Tests(TestTools.BaseToolsTest):
 
     SampleData = u'''
@@ -168,6 +169,7 @@ class Tests(TestTools.BaseToolsTest):
 
         self.CheckFile(encoding=None, shouldPass=False, string=data)
 
+
 TheTestSuite = TestTools.MakeTheTestSuite(locals())
 
 if __name__ == '__main__':
diff --git a/BaseTools/Tests/PythonTest.py b/BaseTools/Tests/PythonTest.py
index ec44c7947086..f5480eb6b611 100644
--- a/BaseTools/Tests/PythonTest.py
+++ b/BaseTools/Tests/PythonTest.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Test whether PYTHON_COMMAND is available
 #
 # Copyright (c) 2013 - 2018, Intel Corporation. All rights reserved.<BR>
diff --git a/BaseTools/Tests/PythonToolsTests.py b/BaseTools/Tests/PythonToolsTests.py
index 05b27ab03335..3258c8aa4d27 100644
--- a/BaseTools/Tests/PythonToolsTests.py
+++ b/BaseTools/Tests/PythonToolsTests.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Unit tests for Python based BaseTools
 #
 #  Copyright (c) 2008 - 2015, Intel Corporation. All rights reserved.<BR>
@@ -22,7 +22,7 @@ def TheTestSuite():
     suites.append(CheckUnicodeSourceFiles.TheTestSuite())
     return unittest.TestSuite(suites)
 
+
 if __name__ == '__main__':
     allTests = TheTestSuite()
     unittest.TextTestRunner().run(allTests)
-
diff --git a/BaseTools/Tests/RunTests.py b/BaseTools/Tests/RunTests.py
index 934683a44654..ae16599bb45f 100644
--- a/BaseTools/Tests/RunTests.py
+++ b/BaseTools/Tests/RunTests.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Unit tests for BaseTools utilities
 #
 #  Copyright (c) 2008, Intel Corporation. All rights reserved.<BR>
@@ -15,18 +15,21 @@ import unittest
 
 import TestTools
 
+
 def GetCTestSuite():
     import CToolsTests
     return CToolsTests.TheTestSuite()
 
+
 def GetPythonTestSuite():
     import PythonToolsTests
     return PythonToolsTests.TheTestSuite()
 
+
 def GetAllTestsSuite():
     return unittest.TestSuite([GetCTestSuite(), GetPythonTestSuite()])
 
+
 if __name__ == '__main__':
     allTests = GetAllTestsSuite()
     unittest.TextTestRunner(verbosity=2).run(allTests)
-
diff --git a/BaseTools/Tests/TestRegularExpression.py b/BaseTools/Tests/TestRegularExpression.py
index 3e6c5f446383..3b6190978714 100644
--- a/BaseTools/Tests/TestRegularExpression.py
+++ b/BaseTools/Tests/TestRegularExpression.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Routines for generating Pcd Database
 #
 # Copyright (c) 2018, Intel Corporation. All rights reserved.<BR>
@@ -8,6 +8,7 @@ import unittest
 from Common.Misc import RemoveCComments
 from Workspace.BuildClassObject import ArrayIndex
 
+
 class TestRe(unittest.TestCase):
     def test_ccomments(self):
         TestStr1 = """ {0x01,0x02} """
@@ -42,7 +43,9 @@ class TestRe(unittest.TestCase):
         self.assertEquals(['[1]'], ArrayIndex.findall(TestStr1))
 
         TestStr2 = """[1][2][0x1][0x01][]"""
-        self.assertEquals(['[1]','[2]','[0x1]','[0x01]','[]'], ArrayIndex.findall(TestStr2))
+        self.assertEquals(['[1]', '[2]', '[0x1]', '[0x01]',
+                          '[]'], ArrayIndex.findall(TestStr2))
+
 
 if __name__ == '__main__':
     unittest.main()
diff --git a/BaseTools/Tests/TestTools.py b/BaseTools/Tests/TestTools.py
index 1099fd4eeaea..3ff294721aef 100644
--- a/BaseTools/Tests/TestTools.py
+++ b/BaseTools/Tests/TestTools.py
@@ -1,5 +1,5 @@
 from __future__ import print_function
-## @file
+# @file
 # Utility functions and classes for BaseTools unit tests
 #
 #  Copyright (c) 2008 - 2018, Intel Corporation. All rights reserved.<BR>
@@ -33,6 +33,7 @@ if PythonSourceDir not in sys.path:
     #
     sys.path.append(PythonSourceDir)
 
+
 def MakeTheTestSuite(localItems):
     tests = []
     for name, item in localItems.items():
@@ -43,26 +44,30 @@ def MakeTheTestSuite(localItems):
                 tests.append(item())
     return lambda: unittest.TestSuite(tests)
 
+
 def GetBaseToolsPaths():
     if sys.platform in ('win32', 'win64'):
-        return [ os.path.join(BaseToolsDir, 'Bin', sys.platform.title()) ]
+        return [os.path.join(BaseToolsDir, 'Bin', sys.platform.title())]
     else:
         uname = os.popen('uname -sm').read().strip()
         for char in (' ', '/'):
             uname = uname.replace(char, '-')
         return [
-                os.path.join(BaseToolsDir, 'Bin', uname),
-                os.path.join(BaseToolsDir, 'BinWrappers', uname),
-                os.path.join(BaseToolsDir, 'BinWrappers', 'PosixLike')
-            ]
+            os.path.join(BaseToolsDir, 'Bin', uname),
+            os.path.join(BaseToolsDir, 'BinWrappers', uname),
+            os.path.join(BaseToolsDir, 'BinWrappers', 'PosixLike')
+        ]
+
 
 BaseToolsBinPaths = GetBaseToolsPaths()
 
+
 class BaseToolsTest(unittest.TestCase):
 
     def cleanOutDir(self, dir):
         for dirItem in os.listdir(dir):
-            if dirItem in ('.', '..'): continue
+            if dirItem in ('.', '..'):
+                continue
             dirItem = os.path.join(dir, dirItem)
             self.RemoveFileOrDir(dirItem)
 
@@ -103,12 +108,17 @@ class BaseToolsTest(unittest.TestCase):
         return bin
 
     def RunTool(self, *args, **kwd):
-        if 'toolName' in kwd: toolName = kwd['toolName']
-        else: toolName = None
-        if 'logFile' in kwd: logFile = kwd['logFile']
-        else: logFile = None
+        if 'toolName' in kwd:
+            toolName = kwd['toolName']
+        else:
+            toolName = None
+        if 'logFile' in kwd:
+            logFile = kwd['logFile']
+        else:
+            logFile = None
 
-        if toolName is None: toolName = self.toolName
+        if toolName is None:
+            toolName = self.toolName
         bin = self.FindToolBin(toolName)
         if logFile is not None:
             logFile = open(os.path.join(self.testDir, logFile), 'w')
@@ -121,7 +131,7 @@ class BaseToolsTest(unittest.TestCase):
         Proc = subprocess.Popen(
             args, executable=bin,
             stdout=popenOut, stderr=subprocess.STDOUT
-            )
+        )
 
         if logFile is None:
             Proc.stdout.read()
@@ -131,7 +141,7 @@ class BaseToolsTest(unittest.TestCase):
     def GetTmpFilePath(self, fileName):
         return os.path.join(self.testDir, fileName)
 
-    def OpenTmpFile(self, fileName, mode = 'r'):
+    def OpenTmpFile(self, fileName, mode='r'):
         return open(os.path.join(self.testDir, fileName), mode)
 
     def ReadTmpFile(self, fileName):
@@ -148,19 +158,22 @@ class BaseToolsTest(unittest.TestCase):
             with codecs.open(self.GetTmpFilePath(fileName), 'w', encoding='utf-8') as f:
                 f.write(data)
 
-    def GenRandomFileData(self, fileName, minlen = None, maxlen = None):
-        if maxlen is None: maxlen = minlen
+    def GenRandomFileData(self, fileName, minlen=None, maxlen=None):
+        if maxlen is None:
+            maxlen = minlen
         f = self.OpenTmpFile(fileName, 'w')
         f.write(self.GetRandomString(minlen, maxlen))
         f.close()
 
-    def GetRandomString(self, minlen = None, maxlen = None):
-        if minlen is None: minlen = 1024
-        if maxlen is None: maxlen = minlen
+    def GetRandomString(self, minlen=None, maxlen=None):
+        if minlen is None:
+            minlen = 1024
+        if maxlen is None:
+            maxlen = minlen
         return ''.join(
             [chr(random.randint(0, 255))
              for x in range(random.randint(minlen, maxlen))
-            ])
+             ])
 
     def setUp(self):
         self.savedEnvPath = os.environ['PATH']
@@ -181,4 +194,3 @@ class BaseToolsTest(unittest.TestCase):
 
         os.environ['PATH'] = self.savedEnvPath
         sys.path = self.savedSysPath
-
diff --git a/BaseTools/Tests/TianoCompress.py b/BaseTools/Tests/TianoCompress.py
index 685968b18fb3..ffe0ae61a11f 100644
--- a/BaseTools/Tests/TianoCompress.py
+++ b/BaseTools/Tests/TianoCompress.py
@@ -1,4 +1,4 @@
-## @file
+# @file
 # Unit tests for TianoCompress utility
 #
 #  Copyright (c) 2008, Intel Corporation. All rights reserved.<BR>
@@ -17,6 +17,7 @@ import unittest
 
 import TestTools
 
+
 class Tests(TestTools.BaseToolsTest):
 
     def setUp(self):
@@ -25,7 +26,7 @@ class Tests(TestTools.BaseToolsTest):
 
     def testHelp(self):
         result = self.RunTool('--help', logFile='help')
-        #self.DisplayFile('help')
+        # self.DisplayFile('help')
         self.assertTrue(result == 0)
 
     def compressionTestCycle(self, data):
@@ -35,13 +36,13 @@ class Tests(TestTools.BaseToolsTest):
             '-e',
             '-o', self.GetTmpFilePath('output1'),
             self.GetTmpFilePath('input')
-            )
+        )
         self.assertTrue(result == 0)
         result = self.RunTool(
             '-d',
             '-o', self.GetTmpFilePath('output2'),
             self.GetTmpFilePath('output1')
-            )
+        )
         self.assertTrue(result == 0)
         start = self.ReadTmpFile('input')
         finish = self.ReadTmpFile('output2')
@@ -50,7 +51,8 @@ class Tests(TestTools.BaseToolsTest):
             print()
             print('Original data did not match decompress(compress(data))')
             self.DisplayBinaryData('original data', start)
-            self.DisplayBinaryData('after compression', self.ReadTmpFile('output1'))
+            self.DisplayBinaryData('after compression',
+                                   self.ReadTmpFile('output1'))
             self.DisplayBinaryData('after decompression', finish)
         self.assertTrue(startEqualsFinish)
 
@@ -60,10 +62,9 @@ class Tests(TestTools.BaseToolsTest):
             self.compressionTestCycle(data)
             self.CleanUpTmpDir()
 
+
 TheTestSuite = TestTools.MakeTheTestSuite(locals())
 
 if __name__ == '__main__':
     allTests = TheTestSuite()
     unittest.TextTestRunner().run(allTests)
-
-
-- 
2.37.3


^ permalink raw reply related	[flat|nested] 5+ messages in thread

* Re: [PATCH v1 0/1] BaseTools: Fix Python Formatting
  2022-10-10 20:05 [PATCH v1 0/1] BaseTools: Fix Python Formatting Ayush Singh
  2022-10-10 20:05 ` [PATCH v1 1/1] Format BaseTools python files using autopep8 Ayush Singh
@ 2022-10-12  4:56 ` Ayush Singh
  2022-10-12  5:12   ` Michael D Kinney
  1 sibling, 1 reply; 5+ messages in thread
From: Ayush Singh @ 2022-10-12  4:56 UTC (permalink / raw)
  To: edk2-devel-groups-io; +Cc: Kinney, Michael D, Nate DeSimone

I wanted to ask whether BaseTools has a maintainer or someone I should Cc
directly. I also think it would be great to have a list of people for the
different parts of edk2 (like the Linux kernel has) so that contributions to
each package reach the right reviewers.

On Tue, 11 Oct, 2022, 01:36 Ayush Singh, <ayushdevel1325@gmail.com> wrote:

> Fix formatting of Python files in BaseTools to conform to PEP8 using
> autopep8. This does not fix all the warnings/errors from flake8, but I
> wanted to get this patch checked out first to see if ignoring those
> warnings is deliberate or not.
>
> The complete code can be found:
> https://github.com/Ayush1325/edk2/tree/formatting
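
(Side note for anyone who wants to try the same kind of reformatting locally:
the exact autopep8 options used for this series are not stated in the thread,
so the sketch below simply runs autopep8's Python API with its default
settings over the tree. Treat it as an illustration, not a reproduction of
the patch.)

    # Sketch only: apply autopep8's default fixes in place to every .py file
    # under BaseTools.  The options used for the actual series are not stated
    # in this thread, so the defaults here are an assumption.
    import pathlib

    import autopep8

    for path in pathlib.Path('BaseTools').rglob('*.py'):
        source = path.read_text(encoding='utf-8')
        fixed = autopep8.fix_code(source)  # PEP 8 fixes with default settings
        if fixed != source:
            path.write_text(fixed, encoding='utf-8')

The roughly equivalent command line is "autopep8 --in-place --recursive
BaseTools/", again with default options.
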
>
> Ayush Singh (1):
>   Format BaseTools python files using autopep8
>
>  335 files changed, 35765 insertions(+), 32705 deletions(-)
>
> --
> 2.37.3
>
>

^ permalink raw reply	[flat|nested] 5+ messages in thread

* Re: [PATCH v1 0/1] BaseTools: Fix Python Formatting
  2022-10-12  4:56 ` [PATCH v1 0/1] BaseTools: Fix Python Formatting Ayush Singh
@ 2022-10-12  5:12   ` Michael D Kinney
  2022-10-15 13:09     ` [edk2-devel] " Ayush Singh
  0 siblings, 1 reply; 5+ messages in thread
From: Michael D Kinney @ 2022-10-12  5:12 UTC (permalink / raw)
  To: Ayush Singh, edk2-devel-groups-io, Kinney, Michael D
  Cc: Desimone, Nathaniel L

Hi Ayush,

All BaseTools Python changes are initially made in the edk2-basetools repo, which is used to generate pip modules, and are then ported to the BaseTools directory in the edk2 repo:

https://github.com/tianocore/edk2-basetools/
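
(As a purely hypothetical illustration of that porting step, the sketch below
compares the two trees; both directory paths are assumptions about where local
clones live and how they are laid out, not something stated here.)

    # Hypothetical porting check between local clones of the two repos.
    # Both paths below are assumptions; adjust them to your own checkouts.
    import filecmp
    import pathlib

    EDK2_PY = pathlib.Path('edk2/BaseTools/Source/Python')    # assumed layout
    PIP_PY = pathlib.Path('edk2-basetools/edk2basetools')     # assumed layout

    for src in PIP_PY.rglob('*.py'):
        rel = src.relative_to(PIP_PY)
        dst = EDK2_PY / rel
        if not dst.exists():
            print(f'only in edk2-basetools: {rel}')
        elif not filecmp.cmp(src, dst, shallow=False):
            print(f'differs between the trees: {rel}')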

It is maintained by the EDK II Tools Maintainers team:

https://github.com/orgs/tianocore/teams/edk-ii-tool-maintainers

The edk2 Maintainers.txt file lists the following maintainers and reviewers for the BaseTools directory in the edk2 repo:
https://github.com/tianocore/edk2/blob/master/Maintainers.txt

BaseTools
F: BaseTools/
W: https://github.com/tianocore/tianocore.github.io/wiki/BaseTools
M: Bob Feng <bob.c.feng@intel.com> [BobCF]
M: Liming Gao <gaoliming@byosoft.com.cn> [lgao4]
R: Yuwei Chen <yuwei.chen@intel.com> [YuweiChen1110]
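
(If you would rather script that lookup than search Maintainers.txt by hand,
note that edk2 already ships BaseTools/Scripts/GetMaintainer.py for this
purpose. Purely as an illustration of the entry format above, here is a
minimal sketch; the script name find_people.py and the simplified matching
rules are assumptions, not the official tool's behaviour.)

    # find_people.py (hypothetical name): print the M:/R: entries of every
    # Maintainers.txt section whose F: pattern covers the path given on the
    # command line.  Simplified matching; run from the root of an edk2 tree.
    import fnmatch
    import sys


    def parse_sections(text):
        """Split Maintainers.txt into blank-line-separated groups of lines."""
        sections, current = [], []
        for line in text.splitlines():
            if line.strip():
                current.append(line)
            elif current:
                sections.append(current)
                current = []
        if current:
            sections.append(current)
        return sections


    def people_for(path, text):
        """Return M:/R: lines of sections whose F: patterns match the path."""
        people = []
        for section in parse_sections(text):
            patterns = [l[2:].strip() for l in section if l.startswith('F:')]
            if any(path.startswith(p) or fnmatch.fnmatch(path, p + '*')
                   for p in patterns):
                people += [l for l in section if l[:2] in ('M:', 'R:')]
        return people


    if __name__ == '__main__':
        with open('Maintainers.txt') as f:
            for entry in people_for(sys.argv[1], f.read()):
                print(entry)

Run as "python3 find_people.py BaseTools/Source/Python/build/build.py" it
should print the two M: lines and the R: line from the BaseTools section
above (plus any catch-all sections whose F: pattern also matches the path).
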

There is also a Tools and CI meeting on Mondays, listed in the groups.io calendar, where these types of topics are discussed.

https://github.com/tianocore/edk2/discussions/2614

Best regards,

Mike

From: Ayush Singh <ayushdevel1325@gmail.com>
Sent: Tuesday, October 11, 2022 9:57 PM
To: edk2-devel-groups-io <devel@edk2.groups.io>
Cc: Kinney, Michael D <michael.d.kinney@intel.com>; Desimone, Nathaniel L <nathaniel.l.desimone@intel.com>
Subject: Re: [PATCH v1 0/1] BaseTools: Fix Python Formatting

I wanted to ask whether BaseTools has a maintainer or someone I should Cc directly. I also think it would be great to have a list of people for the different parts of edk2 (like the Linux kernel has) so that contributions to each package reach the right reviewers.

On Tue, 11 Oct, 2022, 01:36 Ayush Singh, <ayushdevel1325@gmail.com> wrote:
Fix formatting of Python files in BaseTools to conform to PEP8 using
autopep8. This does not fix all the warnings/errors from flake8, but I
wanted to get this patch checked out first to see if ignoring those
warnings is deliberate or not.

The complete code can be found: https://github.com/Ayush1325/edk2/tree/formatting

Ayush Singh (1):
  Format BaseTools python files using autopep8

 BaseTools/Bin/CYGWIN_NT-5.1-i686/armcc_wrapper.py                                       |   74 +-
 BaseTools/Edk2ToolsBuild.py                                                             |    4 +-
 BaseTools/Plugin/BuildToolsReport/BuildToolsReportGenerator.py                          |   15 +-
 BaseTools/Plugin/LinuxGcc5ToolChain/LinuxGcc5ToolChain.py                               |    6 +-
 BaseTools/Plugin/WindowsResourceCompiler/WinRcPath.py                                   |    8 +-
 BaseTools/Scripts/BinToPcd.py                                                           |  185 +-
 BaseTools/Scripts/ConvertFceToStructurePcd.py                                           | 1312 ++--
 BaseTools/Scripts/ConvertMasmToNasm.py                                                  |    7 +-
 BaseTools/Scripts/ConvertUni.py                                                         |   14 +-
 BaseTools/Scripts/DetectNotUsedItem.py                                                  |   23 +-
 BaseTools/Scripts/FormatDosFiles.py                                                     |   25 +-
 BaseTools/Scripts/GetMaintainer.py                                                      |   19 +-
 BaseTools/Scripts/GetUtcDateTime.py                                                     |   18 +-
 BaseTools/Scripts/MemoryProfileSymbolGen.py                                             |  162 +-
 BaseTools/Scripts/PackageDocumentTools/__init__.py                                      |    2 +-
 BaseTools/Scripts/PackageDocumentTools/packagedoc_cli.py                                |  138 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/__init__.py                   |    2 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/__init__.py         |    2 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/doxygen.py          |   79 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/efibinary.py        |   96 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/ini.py              |   92 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/inidocview.py       |    3 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/basemodel/message.py          |   12 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/__init__.py              |    2 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/__init__.py        |    2 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/baseobject.py      |  165 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/dec.py             |   41 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen.py      |  374 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/doxygengen_spec.py |  372 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/dsc.py             |   25 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/EdkPlugins/edk2/model/inf.py             |   59 +-
 BaseTools/Scripts/PackageDocumentTools/plugins/__init__.py                              |    2 +-
 BaseTools/Scripts/PatchCheck.py                                                         |   90 +-
 BaseTools/Scripts/RunMakefile.py                                                        |  258 +-
 BaseTools/Scripts/SetupGit.py                                                           |   23 +-
 BaseTools/Scripts/SmiHandlerProfileSymbolGen.py                                         |  165 +-
 BaseTools/Scripts/UpdateBuildVersions.py                                                |   64 +-
 BaseTools/Scripts/efi_debugging.py                                                      |    4 +-
 BaseTools/Scripts/efi_gdb.py                                                            |    1 +
 BaseTools/Source/C/Makefiles/NmakeSubdirs.py                                            |   43 +-
 BaseTools/Source/C/PyEfiCompressor/setup.py                                             |   16 +-
 BaseTools/Source/Python/AmlToC/AmlToC.py                                                |   29 +-
 BaseTools/Source/Python/AutoGen/AutoGen.py                                              |   58 +-
 BaseTools/Source/Python/AutoGen/AutoGenWorker.py                                        |  145 +-
 BaseTools/Source/Python/AutoGen/BuildEngine.py                                          |  158 +-
 BaseTools/Source/Python/AutoGen/DataPipe.py                                             |  152 +-
 BaseTools/Source/Python/AutoGen/GenC.py                                                 |  944 ++-
 BaseTools/Source/Python/AutoGen/GenDepex.py                                             |  211 +-
 BaseTools/Source/Python/AutoGen/GenMake.py                                              |  738 +-
 BaseTools/Source/Python/AutoGen/GenPcdDb.py                                             |  533 +-
 BaseTools/Source/Python/AutoGen/GenVar.py                                               |  182 +-
 BaseTools/Source/Python/AutoGen/IdfClassObject.py                                       |   97 +-
 BaseTools/Source/Python/AutoGen/IncludesAutoGen.py                                      |  109 +-
 BaseTools/Source/Python/AutoGen/InfSectionParser.py                                     |   47 +-
 BaseTools/Source/Python/AutoGen/ModuleAutoGen.py                                        |  945 ++-
 BaseTools/Source/Python/AutoGen/ModuleAutoGenHelper.py                                  |  255 +-
 BaseTools/Source/Python/AutoGen/PlatformAutoGen.py                                      |  554 +-
 BaseTools/Source/Python/AutoGen/StrGather.py                                            |  221 +-
 BaseTools/Source/Python/AutoGen/UniClassObject.py                                       |  261 +-
 BaseTools/Source/Python/AutoGen/ValidCheckingInfoObject.py                              |   23 +-
 BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py                                     |  405 +-
 BaseTools/Source/Python/AutoGen/__init__.py                                             |    2 +-
 BaseTools/Source/Python/BPDG/BPDG.py                                                    |   37 +-
 BaseTools/Source/Python/BPDG/GenVpd.py                                                  |  348 +-
 BaseTools/Source/Python/BPDG/StringTable.py                                             |   47 +-
 BaseTools/Source/Python/BPDG/__init__.py                                                |    2 +-
 BaseTools/Source/Python/Capsule/GenerateCapsule.py                                      | 1329 ++--
 BaseTools/Source/Python/Capsule/GenerateWindowsDriver.py                                |  119 +-
 BaseTools/Source/Python/Capsule/WindowsCapsuleSupportHelper.py                          |   83 +-
 BaseTools/Source/Python/Common/BuildToolError.py                                        |  109 +-
 BaseTools/Source/Python/Common/BuildVersion.py                                          |    2 +-
 BaseTools/Source/Python/Common/DataType.py                                              |  187 +-
 BaseTools/Source/Python/Common/Edk2/Capsule/FmpPayloadHeader.py                         |   84 +-
 BaseTools/Source/Python/Common/Edk2/Capsule/__init__.py                                 |    2 +-
 BaseTools/Source/Python/Common/Edk2/__init__.py                                         |    2 +-
 BaseTools/Source/Python/Common/EdkLogger.py                                             |  112 +-
 BaseTools/Source/Python/Common/Expression.py                                            |  243 +-
 BaseTools/Source/Python/Common/GlobalData.py                                            |   24 +-
 BaseTools/Source/Python/Common/LongFilePathOs.py                                        |   31 +-
 BaseTools/Source/Python/Common/LongFilePathOsPath.py                                    |   10 +-
 BaseTools/Source/Python/Common/LongFilePathSupport.py                                   |   11 +-
 BaseTools/Source/Python/Common/Misc.py                                                  |  479 +-
 BaseTools/Source/Python/Common/MultipleWorkspace.py                                     |   32 +-
 BaseTools/Source/Python/Common/Parsing.py                                               |  332 +-
 BaseTools/Source/Python/Common/RangeExpression.py                                       |   89 +-
 BaseTools/Source/Python/Common/StringUtils.py                                           |  194 +-
 BaseTools/Source/Python/Common/TargetTxtClassObject.py                                  |   73 +-
 BaseTools/Source/Python/Common/ToolDefClassObject.py                                    |   89 +-
 BaseTools/Source/Python/Common/Uefi/Capsule/CapsuleDependency.py                        |  394 +-
 BaseTools/Source/Python/Common/Uefi/Capsule/FmpAuthHeader.py                            |  117 +-
 BaseTools/Source/Python/Common/Uefi/Capsule/FmpCapsuleHeader.py                         |  286 +-
 BaseTools/Source/Python/Common/Uefi/Capsule/UefiCapsuleHeader.py                        |  110 +-
 BaseTools/Source/Python/Common/Uefi/Capsule/__init__.py                                 |    2 +-
 BaseTools/Source/Python/Common/Uefi/__init__.py                                         |    2 +-
 BaseTools/Source/Python/Common/VariableAttributes.py                                    |   14 +-
 BaseTools/Source/Python/Common/VpdInfoFile.py                                           |   96 +-
 BaseTools/Source/Python/Common/__init__.py                                              |    2 +-
 BaseTools/Source/Python/Common/caching.py                                               |   29 +-
 BaseTools/Source/Python/CommonDataClass/CommonClass.py                                  |   29 +-
 BaseTools/Source/Python/CommonDataClass/DataClass.py                                    |   88 +-
 BaseTools/Source/Python/CommonDataClass/Exceptions.py                                   |   12 +-
 BaseTools/Source/Python/CommonDataClass/FdfClass.py                                     |  129 +-
 BaseTools/Source/Python/CommonDataClass/__init__.py                                     |    2 +-
 BaseTools/Source/Python/Ecc/CParser3/CLexer.py                                          | 1908 ++---
 BaseTools/Source/Python/Ecc/CParser3/CParser.py                                         | 7876 +++++++++-----------
 BaseTools/Source/Python/Ecc/CParser4/CLexer.py                                          |  140 +-
 BaseTools/Source/Python/Ecc/CParser4/CListener.py                                       |  359 +-
 BaseTools/Source/Python/Ecc/CParser4/CParser.py                                         | 2451 +++---
 BaseTools/Source/Python/Ecc/Check.py                                                    |  404 +-
 BaseTools/Source/Python/Ecc/CodeFragment.py                                             |   68 +-
 BaseTools/Source/Python/Ecc/CodeFragmentCollector.py                                    |  151 +-
 BaseTools/Source/Python/Ecc/Configuration.py                                            |  245 +-
 BaseTools/Source/Python/Ecc/Database.py                                                 |  111 +-
 BaseTools/Source/Python/Ecc/EccGlobalData.py                                            |    2 +-
 BaseTools/Source/Python/Ecc/EccMain.py                                                  |  144 +-
 BaseTools/Source/Python/Ecc/EccToolError.py                                             |  177 +-
 BaseTools/Source/Python/Ecc/Exception.py                                                |   16 +-
 BaseTools/Source/Python/Ecc/FileProfile.py                                              |   10 +-
 BaseTools/Source/Python/Ecc/MetaDataParser.py                                           |   62 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaDataTable.py                          |   43 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileParser.py                         |  827 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/MetaFileTable.py                          |  181 +-
 BaseTools/Source/Python/Ecc/MetaFileWorkspace/__init__.py                               |    2 +-
 BaseTools/Source/Python/Ecc/ParserWarning.py                                            |    8 +-
 BaseTools/Source/Python/Ecc/Xml/XmlRoutines.py                                          |   39 +-
 BaseTools/Source/Python/Ecc/Xml/__init__.py                                             |    2 +-
 BaseTools/Source/Python/Ecc/__init__.py                                                 |    2 +-
 BaseTools/Source/Python/Ecc/c.py                                                        |  512 +-
 BaseTools/Source/Python/Eot/CParser3/CLexer.py                                          | 1908 ++---
 BaseTools/Source/Python/Eot/CParser3/CParser.py                                         | 7876 +++++++++-----------
 BaseTools/Source/Python/Eot/CParser4/CLexer.py                                          |  139 +-
 BaseTools/Source/Python/Eot/CParser4/CListener.py                                       |  358 +-
 BaseTools/Source/Python/Eot/CParser4/CParser.py                                         | 2451 +++---
 BaseTools/Source/Python/Eot/CodeFragment.py                                             |   78 +-
 BaseTools/Source/Python/Eot/CodeFragmentCollector.py                                    |  119 +-
 BaseTools/Source/Python/Eot/Database.py                                                 |   77 +-
 BaseTools/Source/Python/Eot/EotGlobalData.py                                            |    5 +-
 BaseTools/Source/Python/Eot/EotMain.py                                                  |  544 +-
 BaseTools/Source/Python/Eot/EotToolError.py                                             |    7 +-
 BaseTools/Source/Python/Eot/FileProfile.py                                              |   10 +-
 BaseTools/Source/Python/Eot/Identification.py                                           |   11 +-
 BaseTools/Source/Python/Eot/InfParserLite.py                                            |   52 +-
 BaseTools/Source/Python/Eot/Parser.py                                                   |  244 +-
 BaseTools/Source/Python/Eot/ParserWarning.py                                            |    6 +-
 BaseTools/Source/Python/Eot/Report.py                                                   |   63 +-
 BaseTools/Source/Python/Eot/__init__.py                                                 |    2 +-
 BaseTools/Source/Python/Eot/c.py                                                        |  100 +-
 BaseTools/Source/Python/FMMT/FMMT.py                                                    |   57 +-
 BaseTools/Source/Python/FMMT/__init__.py                                                |    4 +-
 BaseTools/Source/Python/FMMT/core/BinaryFactoryProduct.py                               |  151 +-
 BaseTools/Source/Python/FMMT/core/BiosTree.py                                           |   47 +-
 BaseTools/Source/Python/FMMT/core/BiosTreeNode.py                                       |   77 +-
 BaseTools/Source/Python/FMMT/core/FMMTOperation.py                                      |   40 +-
 BaseTools/Source/Python/FMMT/core/FMMTParser.py                                         |   30 +-
 BaseTools/Source/Python/FMMT/core/FvHandler.py                                          |  201 +-
 BaseTools/Source/Python/FMMT/core/GuidTools.py                                          |   47 +-
 BaseTools/Source/Python/FMMT/utils/FmmtLogger.py                                        |   10 +-
 BaseTools/Source/Python/FMMT/utils/FvLayoutPrint.py                                     |   29 +-
 BaseTools/Source/Python/FirmwareStorageFormat/Common.py                                 |   20 +-
 BaseTools/Source/Python/FirmwareStorageFormat/FfsFileHeader.py                          |    5 +-
 BaseTools/Source/Python/FirmwareStorageFormat/FvHeader.py                               |   31 +-
 BaseTools/Source/Python/FirmwareStorageFormat/SectionHeader.py                          |    9 +-
 BaseTools/Source/Python/FirmwareStorageFormat/__init__.py                               |    4 +-
 BaseTools/Source/Python/GenFds/AprioriSection.py                                        |   48 +-
 BaseTools/Source/Python/GenFds/Capsule.py                                               |  103 +-
 BaseTools/Source/Python/GenFds/CapsuleData.py                                           |  111 +-
 BaseTools/Source/Python/GenFds/CompressSection.py                                       |   42 +-
 BaseTools/Source/Python/GenFds/DataSection.py                                           |   63 +-
 BaseTools/Source/Python/GenFds/DepexSection.py                                          |   40 +-
 BaseTools/Source/Python/GenFds/EfiSection.py                                            |  167 +-
 BaseTools/Source/Python/GenFds/Fd.py                                                    |   80 +-
 BaseTools/Source/Python/GenFds/FdfParser.py                                             | 1680 +++--
 BaseTools/Source/Python/GenFds/Ffs.py                                                   |   58 +-
 BaseTools/Source/Python/GenFds/FfsFileStatement.py                                      |   67 +-
 BaseTools/Source/Python/GenFds/FfsInfStatement.py                                       |  572 +-
 BaseTools/Source/Python/GenFds/Fv.py                                                    |  240 +-
 BaseTools/Source/Python/GenFds/FvImageSection.py                                        |   77 +-
 BaseTools/Source/Python/GenFds/GenFds.py                                                |  353 +-
 BaseTools/Source/Python/GenFds/GenFdsGlobalVariable.py                                  |  355 +-
 BaseTools/Source/Python/GenFds/GuidSection.py                                           |   90 +-
 BaseTools/Source/Python/GenFds/OptRomFileStatement.py                                   |   16 +-
 BaseTools/Source/Python/GenFds/OptRomInfStatement.py                                    |   56 +-
 BaseTools/Source/Python/GenFds/OptionRom.py                                             |   46 +-
 BaseTools/Source/Python/GenFds/Region.py                                                |  113 +-
 BaseTools/Source/Python/GenFds/Rule.py                                                  |    8 +-
 BaseTools/Source/Python/GenFds/RuleComplexFile.py                                       |   12 +-
 BaseTools/Source/Python/GenFds/RuleSimpleFile.py                                        |   10 +-
 BaseTools/Source/Python/GenFds/Section.py                                               |  121 +-
 BaseTools/Source/Python/GenFds/UiSection.py                                             |   23 +-
 BaseTools/Source/Python/GenFds/VerSection.py                                            |   15 +-
 BaseTools/Source/Python/GenFds/__init__.py                                              |    2 +-
 BaseTools/Source/Python/GenPatchPcdTable/GenPatchPcdTable.py                            |   61 +-
 BaseTools/Source/Python/GenPatchPcdTable/__init__.py                                    |    2 +-
 BaseTools/Source/Python/PatchPcdValue/PatchPcdValue.py                                  |   49 +-
 BaseTools/Source/Python/PatchPcdValue/__init__.py                                       |    2 +-
 BaseTools/Source/Python/Pkcs7Sign/Pkcs7Sign.py                                          |  408 +-
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256GenerateKeys.py                  |  259 +-
 BaseTools/Source/Python/Rsa2048Sha256Sign/Rsa2048Sha256Sign.py                          |  350 +-
 BaseTools/Source/Python/Split/Split.py                                                  |   17 +-
 BaseTools/Source/Python/Table/Table.py                                                  |   25 +-
 BaseTools/Source/Python/Table/TableDataModel.py                                         |   17 +-
 BaseTools/Source/Python/Table/TableDec.py                                               |   15 +-
 BaseTools/Source/Python/Table/TableDsc.py                                               |   15 +-
 BaseTools/Source/Python/Table/TableEotReport.py                                         |   19 +-
 BaseTools/Source/Python/Table/TableFdf.py                                               |   15 +-
 BaseTools/Source/Python/Table/TableFile.py                                              |   26 +-
 BaseTools/Source/Python/Table/TableFunction.py                                          |   15 +-
 BaseTools/Source/Python/Table/TableIdentifier.py                                        |   15 +-
 BaseTools/Source/Python/Table/TableInf.py                                               |   15 +-
 BaseTools/Source/Python/Table/TablePcd.py                                               |   15 +-
 BaseTools/Source/Python/Table/TableQuery.py                                             |   12 +-
 BaseTools/Source/Python/Table/TableReport.py                                            |   37 +-
 BaseTools/Source/Python/Table/__init__.py                                               |    2 +-
 BaseTools/Source/Python/TargetTool/TargetTool.py                                        |  107 +-
 BaseTools/Source/Python/TargetTool/__init__.py                                          |    2 +-
 BaseTools/Source/Python/Trim/Trim.py                                                    |  224 +-
 BaseTools/Source/Python/UPT/BuildVersion.py                                             |    2 +-
 BaseTools/Source/Python/UPT/Core/DependencyRules.py                                     |   82 +-
 BaseTools/Source/Python/UPT/Core/DistributionPackageClass.py                            |   78 +-
 BaseTools/Source/Python/UPT/Core/FileHook.py                                            |   50 +-
 BaseTools/Source/Python/UPT/Core/IpiDb.py                                               |  230 +-
 BaseTools/Source/Python/UPT/Core/PackageFile.py                                         |   61 +-
 BaseTools/Source/Python/UPT/Core/__init__.py                                            |    2 +-
 BaseTools/Source/Python/UPT/GenMetaFile/GenDecFile.py                                   |  149 +-
 BaseTools/Source/Python/UPT/GenMetaFile/GenInfFile.py                                   |  181 +-
 BaseTools/Source/Python/UPT/GenMetaFile/GenMetaFileMisc.py                              |   40 +-
 BaseTools/Source/Python/UPT/GenMetaFile/GenXmlFile.py                                   |    2 +-
 BaseTools/Source/Python/UPT/GenMetaFile/__init__.py                                     |    2 +-
 BaseTools/Source/Python/UPT/InstallPkg.py                                               |  295 +-
 BaseTools/Source/Python/UPT/InventoryWs.py                                              |   45 +-
 BaseTools/Source/Python/UPT/Library/CommentGenerating.py                                |   88 +-
 BaseTools/Source/Python/UPT/Library/CommentParsing.py                                   |  177 +-
 BaseTools/Source/Python/UPT/Library/DataType.py                                         |  379 +-
 BaseTools/Source/Python/UPT/Library/ExpressionValidate.py                               |  147 +-
 BaseTools/Source/Python/UPT/Library/GlobalData.py                                       |    2 +-
 BaseTools/Source/Python/UPT/Library/Misc.py                                             |  227 +-
 BaseTools/Source/Python/UPT/Library/ParserValidate.py                                   |  130 +-
 BaseTools/Source/Python/UPT/Library/Parsing.py                                          |  363 +-
 BaseTools/Source/Python/UPT/Library/StringUtils.py                                      |  203 +-
 BaseTools/Source/Python/UPT/Library/UniClassObject.py                                   |  447 +-
 BaseTools/Source/Python/UPT/Library/Xml/XmlRoutines.py                                  |   37 +-
 BaseTools/Source/Python/UPT/Library/Xml/__init__.py                                     |    2 +-
 BaseTools/Source/Python/UPT/Library/__init__.py                                         |    2 +-
 BaseTools/Source/Python/UPT/Logger/Log.py                                               |   97 +-
 BaseTools/Source/Python/UPT/Logger/StringTable.py                                       |  933 +--
 BaseTools/Source/Python/UPT/Logger/ToolError.py                                         |  117 +-
 BaseTools/Source/Python/UPT/Logger/__init__.py                                          |    2 +-
 BaseTools/Source/Python/UPT/MkPkg.py                                                    |   73 +-
 BaseTools/Source/Python/UPT/Object/POM/CommonObject.py                                  |   93 +-
 BaseTools/Source/Python/UPT/Object/POM/ModuleObject.py                                  |   35 +-
 BaseTools/Source/Python/UPT/Object/POM/PackageObject.py                                 |   13 +-
 BaseTools/Source/Python/UPT/Object/POM/__init__.py                                      |    2 +-
 BaseTools/Source/Python/UPT/Object/Parser/DecObject.py                                  |  186 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfBinaryObject.py                            |  138 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfBuildOptionObject.py                       |   15 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfCommonObject.py                            |   46 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfDefineCommonObject.py                      |   36 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfDefineObject.py                            |  334 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfDepexObject.py                             |   21 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfGuidObject.py                              |   51 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfHeaderObject.py                            |   34 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfLibraryClassesObject.py                    |   35 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfMisc.py                                    |   23 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfPackagesObject.py                          |   35 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfPcdObject.py                               |  116 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfPpiObject.py                               |   46 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfProtocolObject.py                          |   42 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfSoucesObject.py                            |   66 +-
 BaseTools/Source/Python/UPT/Object/Parser/InfUserExtensionObject.py                     |   24 +-
 BaseTools/Source/Python/UPT/Object/Parser/__init__.py                                   |    2 +-
 BaseTools/Source/Python/UPT/Object/__init__.py                                          |    2 +-
 BaseTools/Source/Python/UPT/Parser/DecParser.py                                         |  280 +-
 BaseTools/Source/Python/UPT/Parser/DecParserMisc.py                                     |   56 +-
 BaseTools/Source/Python/UPT/Parser/InfAsBuiltProcess.py                                 |   47 +-
 BaseTools/Source/Python/UPT/Parser/InfBinarySectionParser.py                            |   44 +-
 BaseTools/Source/Python/UPT/Parser/InfBuildOptionSectionParser.py                       |   53 +-
 BaseTools/Source/Python/UPT/Parser/InfDefineSectionParser.py                            |   45 +-
 BaseTools/Source/Python/UPT/Parser/InfDepexSectionParser.py                             |   15 +-
 BaseTools/Source/Python/UPT/Parser/InfGuidPpiProtocolSectionParser.py                   |   73 +-
 BaseTools/Source/Python/UPT/Parser/InfLibrarySectionParser.py                           |   38 +-
 BaseTools/Source/Python/UPT/Parser/InfPackageSectionParser.py                           |   34 +-
 BaseTools/Source/Python/UPT/Parser/InfParser.py                                         |  164 +-
 BaseTools/Source/Python/UPT/Parser/InfParserMisc.py                                     |  122 +-
 BaseTools/Source/Python/UPT/Parser/InfPcdSectionParser.py                               |   45 +-
 BaseTools/Source/Python/UPT/Parser/InfSectionParser.py                                  |   94 +-
 BaseTools/Source/Python/UPT/Parser/InfSourceSectionParser.py                            |   39 +-
 BaseTools/Source/Python/UPT/Parser/__init__.py                                          |    2 +-
 BaseTools/Source/Python/UPT/PomAdapter/DecPomAlignment.py                               |  236 +-
 BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignment.py                               |  183 +-
 BaseTools/Source/Python/UPT/PomAdapter/InfPomAlignmentMisc.py                           |   35 +-
 BaseTools/Source/Python/UPT/PomAdapter/__init__.py                                      |    2 +-
 BaseTools/Source/Python/UPT/ReplacePkg.py                                               |   60 +-
 BaseTools/Source/Python/UPT/RmPkg.py                                                    |   66 +-
 BaseTools/Source/Python/UPT/TestInstall.py                                              |   27 +-
 BaseTools/Source/Python/UPT/UPT.py                                                      |  150 +-
 BaseTools/Source/Python/UPT/UnitTest/CommentGeneratingUnitTest.py                       |   86 +-
 BaseTools/Source/Python/UPT/UnitTest/CommentParsingUnitTest.py                          |   91 +-
 BaseTools/Source/Python/UPT/UnitTest/DecParserTest.py                                   |   23 +-
 BaseTools/Source/Python/UPT/UnitTest/DecParserUnitTest.py                               |  118 +-
 BaseTools/Source/Python/UPT/UnitTest/InfBinarySectionTest.py                            |  131 +-
 BaseTools/Source/Python/UPT/Xml/CommonXml.py                                            |  265 +-
 BaseTools/Source/Python/UPT/Xml/GuidProtocolPpiXml.py                                   |  139 +-
 BaseTools/Source/Python/UPT/Xml/IniToXml.py                                             |  152 +-
 BaseTools/Source/Python/UPT/Xml/ModuleSurfaceAreaXml.py                                 |  143 +-
 BaseTools/Source/Python/UPT/Xml/PackageSurfaceAreaXml.py                                |   87 +-
 BaseTools/Source/Python/UPT/Xml/PcdXml.py                                               |  141 +-
 BaseTools/Source/Python/UPT/Xml/XmlParser.py                                            |  363 +-
 BaseTools/Source/Python/UPT/Xml/XmlParserMisc.py                                        |   22 +-
 BaseTools/Source/Python/UPT/Xml/__init__.py                                             |    2 +-
 BaseTools/Source/Python/Workspace/BuildClassObject.py                                   |  303 +-
 BaseTools/Source/Python/Workspace/DecBuildData.py                                       |  184 +-
 BaseTools/Source/Python/Workspace/DscBuildData.py                                       | 2007 +++--
 BaseTools/Source/Python/Workspace/InfBuildData.py                                       |  426 +-
 BaseTools/Source/Python/Workspace/MetaDataTable.py                                      |   98 +-
 BaseTools/Source/Python/Workspace/MetaFileCommentParser.py                              |   21 +-
 BaseTools/Source/Python/Workspace/MetaFileParser.py                                     |  885 ++-
 BaseTools/Source/Python/Workspace/MetaFileTable.py                                      |  217 +-
 BaseTools/Source/Python/Workspace/WorkspaceCommon.py                                    |   60 +-
 BaseTools/Source/Python/Workspace/WorkspaceDatabase.py                                  |   67 +-
 BaseTools/Source/Python/Workspace/__init__.py                                           |    2 +-
 BaseTools/Source/Python/build/BuildReport.py                                            |  766 +-
 BaseTools/Source/Python/build/__init__.py                                               |    2 +-
 BaseTools/Source/Python/build/build.py                                                  | 1145 +--
 BaseTools/Source/Python/build/buildoptions.py                                           |  113 +-
 BaseTools/Source/Python/sitecustomize.py                                                |   11 +-
 BaseTools/Source/Python/tests/Split/test_split.py                                       |   37 +-
 BaseTools/Tests/CToolsTests.py                                                          |    6 +-
 BaseTools/Tests/CheckPythonSyntax.py                                                    |   15 +-
 BaseTools/Tests/CheckUnicodeSourceFiles.py                                              |    4 +-
 BaseTools/Tests/PythonTest.py                                                           |    2 +-
 BaseTools/Tests/PythonToolsTests.py                                                     |    4 +-
 BaseTools/Tests/RunTests.py                                                             |    7 +-
 BaseTools/Tests/TestRegularExpression.py                                                |    7 +-
 BaseTools/Tests/TestTools.py                                                            |   54 +-
 BaseTools/Tests/TianoCompress.py                                                        |   15 +-
 335 files changed, 35765 insertions(+), 32705 deletions(-)
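
For reviewers who want to spot-check the result locally, something along
these lines reproduces a plain autopep8 pass over the tree. This is only
an illustrative sketch: it assumes autopep8 is installed via pip, assumes
UTF-8 sources, and uses autopep8's default options rather than any exact
settings behind this series. The equivalent command-line form is
"autopep8 --in-place --recursive BaseTools/".

    # Illustrative only: rewrite every Python file under BaseTools/ with
    # autopep8's default PEP 8 formatting fixes, reporting changed files.
    from pathlib import Path

    import autopep8

    for py_file in Path("BaseTools").rglob("*.py"):
        original = py_file.read_text(encoding="utf-8")
        # options={"aggressive": 1} would enable autopep8's more aggressive fixes
        formatted = autopep8.fix_code(original)
        if formatted != original:
            py_file.write_text(formatted, encoding="utf-8")
            print(f"reformatted {py_file}")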

--
2.37.3


^ permalink raw reply	[flat|nested] 5+ messages in thread

* Re: [edk2-devel] [PATCH v1 0/1] BaseTools: Fix Python Formatting
  2022-10-12  5:12   ` Michael D Kinney
@ 2022-10-15 13:09     ` Ayush Singh
  0 siblings, 0 replies; 5+ messages in thread
From: Ayush Singh @ 2022-10-15 13:09 UTC (permalink / raw)
  To: edk2-devel-groups-io, Kinney, Michael D; +Cc: Desimone, Nathaniel L


Thanks, Michael. I completely missed Maintainers.txt. I will try to improve
the Getting Started section of the edk2 wiki.
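
As a side note for anyone else who misses it: the Maintainers.txt entries
can be matched against a path with a few lines of Python. The sketch below
is only an illustration with simplified prefix matching (real lookups
should use BaseTools/Scripts/GetMaintainer.py, which handles the full
format); the example path and the expected contacts are just the ones
quoted further down in this thread.

    # Illustration only: collect M:/R: contacts whose F: patterns cover a path.
    def contacts_for(path, maintainers_text):
        contacts, files, people = [], [], []

        def flush():
            # Simplified matching: exact file or directory-prefix only.
            if any(path == f or path.startswith(f.rstrip("/") + "/") for f in files):
                contacts.extend(people)
            files.clear()
            people.clear()

        for line in maintainers_text.splitlines():
            if not line.strip():
                flush()  # a blank line ends a section block
            elif line.startswith("F:"):
                files.append(line[2:].strip())
            elif line.startswith(("M:", "R:")):
                people.append(line[2:].strip())
        flush()
        return contacts

    with open("Maintainers.txt", encoding="utf-8") as f:
        print(contacts_for("BaseTools/Scripts/PatchCheck.py", f.read()))

Run from the edk2 root, this should print the Bob Feng, Liming Gao and
Yuwei Chen entries quoted below.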


I have opened an issue (
https://github.com/tianocore/edk2-basetools/issues/57) and a PR (
https://github.com/tianocore/edk2-basetools/pull/58) in the edk2-basetools
repo to discuss this further.

Yours Sincerely,
Ayush Singh

On Wed, 12 Oct, 2022, 10:42 Michael D Kinney, <michael.d.kinney@intel.com>
wrote:

> Hi Ayush,
>
> All BaseTools Python changes are initially made in the edk2-basetools
> repo, which is used to generate the pip modules, and are then ported to
> the BaseTools directory of the edk2 repo:
>
> https://github.com/tianocore/edk2-basetools/
>
> That repo is maintained by the EDK II Tools Maintainers:
>
> https://github.com/orgs/tianocore/teams/edk-ii-tool-maintainers
>
> The edk2 Maintainers.txt lists the following Maintainers and Reviewers
> for the BaseTools directory in the edk2 repo:
>
> https://github.com/tianocore/edk2/blob/master/Maintainers.txt
>
> BaseTools
> F: BaseTools/
> W: https://github.com/tianocore/tianocore.github.io/wiki/BaseTools
> M: Bob Feng <bob.c.feng@intel.com> [BobCF]
> M: Liming Gao <gaoliming@byosoft.com.cn> [lgao4]
> R: Yuwei Chen <yuwei.chen@intel.com> [YuweiChen1110]
>
> There is also a Tools and CI meeting on Mondays, listed in the groups.io
> calendar, where these types of topics are discussed:
>
> https://github.com/tianocore/edk2/discussions/2614
>
> Best regards,
>
> Mike
>
> From: Ayush Singh <ayushdevel1325@gmail.com>
> Sent: Tuesday, October 11, 2022 9:57 PM
> To: edk2-devel-groups-io <devel@edk2.groups.io>
> Cc: Kinney, Michael D <michael.d.kinney@intel.com>;
>     Desimone, Nathaniel L <nathaniel.l.desimone@intel.com>
> Subject: Re: [PATCH v1 0/1] BaseTools: Fix Python Formatting
>
> I wanted to ask whether BaseTools has a maintainer or someone I should Cc
> directly. I also think it would be great to have a list of people for the
> different parts of edk2 (like the Linux kernel has) for contributions to
> each package.
>
>
>
> On Tue, 11 Oct, 2022, 01:36 Ayush Singh, <ayushdevel1325@gmail.com> wrote:
>
> Fix formatting of Python files in BaseTools to conform to PEP8 using
> autopep8. This does not fix all the warnings/errors from flake8, but I
> wanted to get this patch checked out first to see if ignoring those
> warnings is deliberate or not.
>
> The complete code can be found:
> https://github.com/Ayush1325/edk2/tree/formatting
>
> Ayush Singh (1):
>   Format BaseTools python files using autopep8
>
>  335 files changed, 35765 insertions(+), 32705 deletions(-)
>
> --
> 2.37.3
>
> 
>


^ permalink raw reply	[flat|nested] 5+ messages in thread

end of thread, other threads:[~2022-10-15 13:09 UTC | newest]

Thread overview: 5+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2022-10-10 20:05 [PATCH v1 0/1] BaseTools: Fix Python Formatting Ayush Singh
2022-10-10 20:05 ` [PATCH v1 1/1] Format BaseTools python files using autopep8 Ayush Singh
2022-10-12  4:56 ` [PATCH v1 0/1] BaseTools: Fix Python Formatting Ayush Singh
2022-10-12  5:12   ` Michael D Kinney
2022-10-15 13:09     ` [edk2-devel] " Ayush Singh

This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox